Large Language Models

Related keywords:


  • compliance automation
  • crypto-asset markets
  • Explainable AI
  • federated learning
  • fraud detection
  • gender-based violence
  • Hybrid AI
  • Knowledge Injection
  • Markets in Crypto-Assets Regulation (MiCAR)
  • Medical Visual Question Answering
  • Multi-Agent Systems
  • Multimodal Learning
  • Natural Language Processing
  • off-chain due diligence
  • Personalized Recommender Systems
  • regulatory technology (RegTech)
  • Social bias in media representation
  • Web text mining

Publications for keyword "Large Language Models"
2026
Mario Trerotola, Mimmo Parente and Davide Calvaresi, A Hybrid Multi-Agent System for Early Scam Detection in Crypto-Assets, in: Applied Sciences, MDPI, 2026
2025
Zhan Liu and Nicole Glassey Balet, A Hybrid AI System for Evaluating Media Representation of Violence and Inequality, in: 26th International Conference on Web Information Systems Engineering, Marrakech, Morocco, Springer Nature, 2025
Ege Soyarar, Reyhan Aydoğan, Berk Buzcu and Davide Calvaresi, Explaining Federated Learning-based Movie Recommendations, in: IEEE MetroXRAINE 2025, Ancona, IEEE, 2025
Henning Müller, Ao Ma, Zhiyuan Li, Zhuonan Liang, Tiancheng Gu, Jianan Fan, Jieting Long and Weidong Cai, LLM-Enhanced Information Mining for Medical Visual Question Answering, in: WWW '25: The ACM Web Conference 2025, Workshop on Large Language Model Using Multi-modal Data for User Modeling, Sydney, Australia, 2025, pp. 2297-2305