
Decoder-based Sense Knowledge Distillation

#large language models #knowledge distillation #decoder models #lexical knowledge #word senses #DSKD framework #arXiv paper

📌 Key Takeaways

  • Researchers developed DSKD framework to integrate lexical knowledge into decoder LLMs
  • Framework enhances knowledge distillation without requiring dictionary lookup at inference
  • Extensive experiments show significant gains in knowledge distillation performance for decoders
  • Method enables generative models to inherit structured semantics efficiently

📖 Full Retelling

Researchers Qitong Wang, Mohammed J. Zaki, Georgios Kollias, and Vasileios Kalantzis introduced Decoder-based Sense Knowledge Distillation (DSKD) in a paper submitted to arXiv on February 25, 2026. The framework addresses the challenge of incorporating structured lexical knowledge into decoder-style large language models, which have traditionally overlooked word senses and their relationships.

The paper tackles a persistent limitation of current systems: large language models learn contextual embeddings that capture rich semantic information, yet they often fail to incorporate structured lexical knowledge such as word senses and sense relationships. Prior work has shown that sense dictionaries can improve knowledge distillation for encoder models, but decoder-style generative models have proven harder to enhance with sense-based knowledge.

DSKD approaches the problem by integrating lexical resources into the training process of decoder LLMs without requiring dictionary lookup at inference time. This distinction matters: it preserves the inference-time efficiency that makes generative models practical for real-world applications.

The researchers validated the approach through extensive experiments on diverse benchmarks, demonstrating that DSKD significantly enhances knowledge distillation performance for decoders. The result is that generative models can inherit structured semantics while keeping training efficient, potentially yielding language generation that is more sensitive to subtle differences in word meanings and relationships.
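The article describes DSKD only at a high level, so the snippet below is a minimal, hypothetical PyTorch sketch of the general idea: an auxiliary sense-alignment term added to the usual causal-LM loss, with the lexical resource consulted only during training. All names here (`dskd_loss`, `sense_embeddings`, `sense_ids`, `alpha`) are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: the paper's exact DSKD objective is not given in
# this article. This is a hypothetical reconstruction of "training-time sense
# distillation with no dictionary lookup at inference".
import torch
import torch.nn.functional as F

def dskd_loss(student_logits, labels, student_hidden,
              sense_embeddings, sense_ids, alpha=0.5):
    """Next-token LM loss plus a sense-alignment distillation term.

    student_logits:   (batch, seq, vocab) decoder output logits
    labels:           (batch, seq) next-token targets (-100 = ignore)
    student_hidden:   (batch, seq, dim) final decoder hidden states
    sense_embeddings: (num_senses, dim) vectors from a lexical resource
                      (used during training only)
    sense_ids:        (batch, seq) sense index per token, -1 where unannotated
    alpha:            weight of the sense term (assumed hyperparameter)
    """
    # Standard causal-LM cross-entropy over the vocabulary.
    lm = F.cross_entropy(student_logits.reshape(-1, student_logits.size(-1)),
                         labels.reshape(-1), ignore_index=-100)

    # Pull hidden states toward the embedding of each token's annotated sense;
    # the dictionary is consulted only here, at training time.
    mask = sense_ids >= 0
    if mask.any():
        target = sense_embeddings[sense_ids[mask]]              # (n, dim)
        sense = 1.0 - F.cosine_similarity(student_hidden[mask],
                                          target, dim=-1).mean()
    else:
        sense = torch.tensor(0.0, device=student_logits.device)

    return lm + alpha * sense
```

At inference the decoder generates as usual; nothing in the forward pass references the dictionary, which matches the article's claim that no lookup is needed once training is complete.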

🏷️ Themes

AI Research, Natural Language Processing, Knowledge Distillation

Original Source
Computer Science > Computation and Language
arXiv:2602.22351 [cs.CL] (submitted Wed, 25 Feb 2026 19:15:04 UTC, 1,823 KB)

Title: Decoder-based Sense Knowledge Distillation
Authors: Qitong Wang, Mohammed J. Zaki, Georgios Kollias, Vasileios Kalantzis

Abstract: Large language models learn contextual embeddings that capture rich semantic information, yet they often overlook structured lexical knowledge such as word senses and relationships. Prior work has shown that incorporating sense dictionaries can improve knowledge distillation for encoder models, but their application to decoders as generative models remains challenging. In this paper, we introduce Decoder-based Sense Knowledge Distillation (DSKD), a framework that integrates lexical resources into the training of decoder-style LLMs without requiring dictionary lookup at inference time. Extensive experiments on diverse benchmarks demonstrate that DSKD significantly enhances knowledge distillation performance for decoders, enabling generative models to inherit structured semantics while maintaining efficient training.

Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI)
DOI: https://doi.org/10.48550/arXiv.2602.22351 (arXiv-issued DOI via DataCite, registration pending)

Source

arxiv.org
