Paying Attention to Product Reviews: Sentiment Analysis with Additive, Multiplicative, and Local Attention Mechanisms
eScholarship: Open Access Publications from the University of California
UCLA Electronic Theses and Dissertations

  • Author(s): Eastman, Greg
  • Advisor(s): Wu, Yingnian
Abstract

Every business needs to understand its consumers’ experiences to succeed, but it is impossible for a large company to read every review. By having a computer read customer responses and relay their preferences, businesses can respond to consumers with a more informed approach. We begin this work by reviewing text-preprocessing strategies as well as previous solutions to sentiment tasks. We then use Amazon tool-review data to introduce the addition of an attention mechanism to a BiLSTM encoder-decoder architecture. Three attention strategies are tested: additive, multiplicative, and local. To investigate their performance experimentally, we compare the attention models to each other as well as to two controls.
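The additive and multiplicative attention strategies named above differ only in how they score each encoder hidden state against the decoder state. As a minimal illustrative sketch (not the thesis's actual implementation; all dimensions and weight matrices below are hypothetical placeholders), the two scoring functions can be written with NumPy as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the thesis's actual model sizes are not given here).
seq_len, hidden = 5, 8

H = rng.standard_normal((seq_len, hidden))  # encoder hidden states, one row per token
s = rng.standard_normal(hidden)             # current decoder state (the "query")

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Multiplicative (Luong-style "general") score for each position i: s^T W H_i
W = rng.standard_normal((hidden, hidden))
mult_scores = H @ (W @ s)                   # shape (seq_len,)

# Additive (Bahdanau-style) score for each position i: v^T tanh(W1 s + W2 H_i)
W1 = rng.standard_normal((hidden, hidden))
W2 = rng.standard_normal((hidden, hidden))
v = rng.standard_normal(hidden)
add_scores = np.tanh(s @ W1.T + H @ W2.T) @ v  # shape (seq_len,)

# Either score vector is normalized into attention weights that sum to 1,
# and the context vector is the corresponding weighted sum of hidden states.
alpha = softmax(add_scores)
context = alpha @ H                          # shape (hidden,)
```

Local attention follows the same pattern but restricts the scored positions to a window around a predicted alignment point rather than attending over the whole sequence.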
