A Semantic Loss Function for Deep Learning with Symbolic Knowledge
- Author(s): Xu, Jingyi
- Advisor(s): Van den Broeck, Guy
This thesis develops a novel methodology for incorporating symbolic knowledge into deep learning.
We construct a semantic loss function that captures how well a neural network's output vectors satisfy a given logical constraint, treating the outputs as probabilities over the constraint's variables.
The semantic loss thereby captures symbolic knowledge and restores to neural networks information that would otherwise be lost.
An experimental evaluation shows that it leads the neural network to achieve (near-)state-of-the-art performance on semi-supervised multi-class classification.
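For the semi-supervised classification setting, the relevant constraint is that exactly one class variable is true. As a minimal sketch (not the thesis's implementation), the semantic loss for this exactly-one constraint can be computed by summing, over each satisfying one-hot assignment, the probability the network assigns to it, and taking the negative logarithm; the function name below is hypothetical:

```python
import math

def semantic_loss_exactly_one(probs):
    """Semantic loss for the exactly-one (one-hot) constraint.

    probs: per-class probabilities from the network's output layer,
    treated as independent Bernoulli variables. The loss is the
    negative log of the total probability mass on assignments in
    which exactly one variable is true.
    """
    total = 0.0
    for i in range(len(probs)):
        # Probability of the assignment where only class i is true.
        term = probs[i]
        for j, p in enumerate(probs):
            if j != i:
                term *= (1.0 - p)
        total += term
    return -math.log(total)
```

A confident one-hot output such as `[1.0, 0.0, 0.0]` incurs zero loss, while outputs that spread mass across classes are penalized, which is how the loss pushes unlabeled examples toward a definite class.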
Moreover, it is applicable to more complex constraints.
At the end of the thesis, we present a procedure for computing the semantic loss and its derivatives for complex constraints, demonstrating the method's generality.
The work presented in this thesis was published at the International Conference on Machine Learning.