
Structural and Processing Equivalences Between Graphical and Vector-based Models of Knowledge Representation

Abstract

Networks (graphs) and vector spaces are two representational forms commonly used to model knowledge structures and processes, but their relationship has not been extensively explored. In this paper, we conceptualize and formally show that these two types of models can be related in terms of both structure and process. In particular, traditional and `higher-order' cosine similarity in a vector space is mathematically equivalent to intersecting activation spread on a network after traversing direct and indirect paths. Inspired by this equivalence, we transfer graphical techniques to vector space modeling and demonstrate that the `higher-order' information embedded in the vector space can be used to create more powerful representations and to accelerate learning in neural networks. Our results may have profound implications for both cognitive representational theory and machine learning practice.
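To make the stated equivalence concrete, the following is a minimal sketch (not the authors' implementation) of how cosine similarity between nodes' connectivity vectors corresponds to the overlap of activation spread over direct and indirect paths. The toy adjacency matrix, the `spread_activation` and `higher_order_cosine` helpers, and the `order` parameter are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def spread_activation(A, source, steps):
    """Activation reaching each node after `steps` hops from `source`."""
    activation = np.zeros(A.shape[0])
    activation[source] = 1.0
    for _ in range(steps):
        activation = activation @ A  # propagate one hop along edges
    return activation

def higher_order_cosine(A, i, j, order=1):
    """Cosine similarity between nodes i and j over paths of up to `order` hops."""
    # Accumulate connectivity over direct and indirect paths: A + A^2 + ... + A^order
    reach = sum(np.linalg.matrix_power(A, k) for k in range(1, order + 1))
    vi, vj = reach[i], reach[j]
    return (vi @ vj) / (np.linalg.norm(vi) * np.linalg.norm(vj))

# Toy undirected network with edges 0-1, 0-2, 1-2, 2-3
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])

# The dot product of two nodes' one-step activation spreads counts their
# shared neighbors (two-step paths); normalizing it gives the ordinary
# cosine similarity between their connectivity vectors.
overlap = spread_activation(A, 0, 1) @ spread_activation(A, 1, 1)
cosine = higher_order_cosine(A, 0, 1, order=1)
print(overlap, cosine)

# Increasing `order` folds in longer, indirect paths -- the `higher-order'
# similarity referred to in the abstract.
print(higher_order_cosine(A, 0, 1, order=2))
```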
