
MIDAS: Multi-level Intent, Domain, And Slot Knowledge Distillation for Multi-turn NLU

Yan Li, So-Eon Kim, Seong-Bae Park, Soyeon Caren Han

Abstract

Although Large Language Models (LLMs) can generate coherent text, they often struggle to recognise user intent behind queries. In contrast, Natural Language Understanding (NLU) models interpret the purpose and key information of user input for responsive interactions. Existing NLU models typically map utterances to a dual-level semantic frame, involving sentence-level intent (SI) and word-level slot (WS) labels. However, real-life conversations primarily consist of multi-turn dialogues, requiring the interpretation of complex and extended exchanges. Researchers encounter challenges in addressing all facets of multi-turn dialogue using a unified NLU model. This paper introduces MIDAS, a novel approach leveraging multi-level intent, domain, and slot knowledge distillation for multi-turn NLU. We construct distinct teachers for SI detection, WS filling, and conversation-level domain (CD) classification, each fine-tuned for specific knowledge. A multi-teacher loss is proposed to facilitate the integration of these teachers, guiding a student model in multi-turn dialogue tasks. Results demonstrate the efficacy of our model in improving multi-turn conversation understanding, showcasing the potential for advancements in NLU through multi-level dialogue knowledge distillation. Our implementation is open-sourced on https://github.com/adlnlp/Midas.
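As a rough illustration of the multi-teacher idea described above (a sketch only, not the authors' actual implementation, which is in the linked repository), a multi-teacher distillation loss can average a KL-divergence term between each teacher's softened output distribution and the student's, then blend it with the student's supervised loss. The temperature `T`, the blending weight `alpha`, and the equal weighting of teachers here are all assumptions for illustration:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q):
    # KL(p || q) for two discrete distributions over the same labels.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def multi_teacher_loss(student_logits, teacher_logits_list, hard_loss,
                       T=2.0, alpha=0.5):
    """Blend the average teacher-to-student KL term with the supervised
    (hard-label) loss. Teachers are weighted equally in this sketch."""
    student_p = softmax(student_logits, T)
    distill = sum(kl_div(softmax(t, T), student_p)
                  for t in teacher_logits_list) / len(teacher_logits_list)
    return alpha * distill + (1 - alpha) * hard_loss
```

In MIDAS, the three teachers would correspond to the SI, WS, and CD models; here they are abstracted as lists of logits, and when a teacher's distribution matches the student's, its KL contribution vanishes and only the supervised term remains.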
