HyperAI

Attention and Lexicon Regularized LSTM for Aspect-based Sentiment Analysis

Lingxian Bao, Patrik Lambert, Toni Badia

Abstract

Attention-based deep learning systems have been demonstrated to be the state-of-the-art approach for aspect-level sentiment analysis. However, end-to-end deep neural networks lack flexibility, as one cannot easily adjust the network to fix an obvious problem, especially when more training data is not available: e.g., when the model always predicts "positive" upon seeing the word "disappointed". Meanwhile, it is less often noted that the attention mechanism is likely to "over-focus" on particular parts of a sentence while ignoring positions that provide key information for judging the polarity. In this paper, we describe a simple yet effective approach to leverage lexicon information so that the model becomes more flexible and robust. We also explore the effect of regularizing attention vectors to allow the network to have a broader "focus" on different parts of the sentence. The experimental results demonstrate the effectiveness of our approach.
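The abstract does not reprint the paper's loss terms, but the attention-regularization idea it describes can be sketched as an entropy penalty on the attention weights: a common formulation in which low-entropy (over-focused) distributions are penalized more than broad ones. The function name and exact form below are illustrative assumptions, not the authors' equations.

```python
import numpy as np

def attention_entropy_penalty(attn, eps=1e-12):
    # Negative Shannon entropy of an attention distribution.
    # Adding lambda * penalty to the training loss pushes the
    # optimizer toward higher-entropy, i.e. broader, attention.
    # (Illustrative sketch; not the paper's exact regularizer.)
    attn = np.asarray(attn, dtype=float)
    entropy = -np.sum(attn * np.log(attn + eps), axis=-1)
    return -entropy

uniform = [0.25, 0.25, 0.25, 0.25]  # broad focus over 4 tokens
peaked = [0.97, 0.01, 0.01, 0.01]   # "over-focused" on one token
# The uniform distribution incurs a lower (more negative) penalty
# than the peaked one, so minimizing the penalty broadens focus.
print(attention_entropy_penalty(uniform), attention_entropy_penalty(peaked))
```

In practice such a term would be weighted by a hyperparameter and added to the classification loss; a lexicon signal could analogously be injected as an extra loss term tying predictions to word-level polarity scores.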

