eScholarship
Open Access Publications from the University of California

Dynamic Word Embedding for News Analysis

  • Author(s): Zhang, Haoxiang
  • Advisor(s): Chang, Kai-Wei
Abstract

Dynamic word embeddings are obtained by dividing a corpus into time slices and learning a word vector representation for each slice. With dynamic word embeddings, we can analyze how word trends evolve over time. Recently, a number of efforts have applied dynamic word embeddings to understand how the meanings of words change over years or decades. However, few efforts have focused on monthly changes, which matter for news analysis, for example tracking how a word trend shifts from month to month. This thesis shows how to apply dynamic word embeddings to news analysis and presents meaningful evaluations of word trends. The approach extends to news analysis at any time interval, given enough data within each time slice.
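The first step of the approach, partitioning a corpus into time slices, can be sketched as follows. This is a minimal illustration, assuming documents arrive as (date, text) pairs; the function name `slice_by_month` and the sample documents are hypothetical, and the per-slice embedding training itself would be handled by a standard word2vec-style trainer, which is omitted here.

```python
from collections import defaultdict
from datetime import date

def slice_by_month(docs):
    """Group (date, text) pairs into monthly time slices.

    Returns a dict mapping (year, month) -> list of documents;
    each slice's documents would then be fed to an embedding
    trainer to obtain per-month word vectors.
    """
    slices = defaultdict(list)
    for d, text in docs:
        slices[(d.year, d.month)].append(text)
    return dict(slices)

# hypothetical sample of timestamped news documents
docs = [
    (date(2016, 1, 5), "markets rally on jobs report"),
    (date(2016, 1, 20), "senate debates budget bill"),
    (date(2016, 2, 2), "primary season opens in iowa"),
]

monthly = slice_by_month(docs)
# two slices: (2016, 1) with two documents, (2016, 2) with one
```

The same grouping key generalizes to other intervals (e.g. `(d.year,)` for yearly slices), which is how the approach extends to arbitrary time granularities given enough data per slice.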
