
UC Santa Barbara Electronic Theses and Dissertations

Knowledge-Grounded Natural Language Processing

Abstract

As the primary means of human communication, natural language bridges abstract forms of knowledge and their realization in all kinds of human use. In an artificial intelligence assistant, the natural language interface is grounded in various knowledge sources and is designed to convey, navigate, and reason over those sources. Based on how they interact with the knowledge grounding, natural language processing tasks can be broadly categorized into 1) natural language understanding (NLU): understanding the semantics of the user's natural language in order to navigate or reason over the knowledge sources, and 2) natural language generation (NLG): generating natural language responses to the user grounded in the knowledge sources. In this thesis, I elaborate on my work in both the NLU and NLG divisions in order to push the edge of current research toward building an advanced, human-like AI assistant.

For the NLU division, first, current large pre-trained language models achieve very strong performance on many NLP tasks but still struggle with tasks that require complex reasoning. In this thesis, I discuss my work on complex numerical reasoning over structured and unstructured knowledge. Second, current dialogue assistants follow a fixed ontology to represent user preferences, which makes them essentially agent-centric and unable to represent fine-grained, realistic user preferences. In this thesis, I discuss our first step toward building a user-centric dialogue system by proposing a new formulation for user preference representation.

For the NLG division, first, current models obtain strong performance with large-scale training data, but such data is rarely available in real-world applications. In this thesis, I discuss my work on few-shot natural language generation from tabular data. Second, NLG models often suffer from factual errors when generating descriptions that involve logical inference. This thesis proposes to study high-fidelity natural language generation with logical inference. Finally, I discuss my work on NLG in the important application domain of dialogue systems. Current research on dialogue systems typically studies each type of dialogue individually; however, the ultimate goal of conversational AI is to build a unified assistant capable of handling all kinds of dialogues and switching among them seamlessly. Toward this goal, this thesis studies enriching task-oriented dialogue with knowledge-grounded chitchat.

In the above work, we define novel tasks, propose new datasets, and design approaches to tackle the new challenges. Finally, we conclude and discuss future plans and directions.
