Department of Computer Science: MSc Thesis Presentation
Date: Friday, 20 May 2022
Symbolic Computation in Deep Neural Networks
Author: Yujia Guo
Advisor: Dr. Tommi Gröndahl
Supervisor: Prof. N. Asokan
Studying symbolic computation in deep neural networks (DNNs) is essential for improving their explainability and generalizability. Whether DNNs can conduct symbolic computation is a topic of long-standing controversy. One complaint is that DNNs, as connectionist models, only process associative relations and are unlikely to have higher-level cognitive abilities. However, their success on complex tasks such as language modeling has raised interest in whether symbolic computation is involved. We investigate the presence of symbolic computation in DNNs by testing the performance of the state-of-the-art Transformer networks BERT and T5 on reasoning and vocabulary generalization tasks. Our results show that the models perform well on systematic and zero-shot vocabulary generalization but are easily disturbed by task-specific vocabularies. We propose the interpretation that Transformers accomplish the tasks by two main associative, and hence not genuinely symbolic, mechanisms: (1) embedding similarity between tokens, and (2) (in)sensitivity to positional features irrespective of vocabulary.
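As an illustration of the first mechanism, the sketch below shows how similarity between token embeddings is typically measured with cosine similarity. The vectors here are randomly generated stand-ins, not embeddings from BERT or T5, and the token names are purely hypothetical; in practice the vectors would come from a Transformer's embedding layer.

```python
import numpy as np

# Hypothetical token embeddings (stand-ins for a Transformer's learned
# embedding vectors; the tokens and values are illustrative only).
rng = np.random.default_rng(0)
dim = 8
emb = {
    "cat": rng.normal(size=dim),
    "dog": rng.normal(size=dim),
}
# A token whose vector is a slight perturbation of "cat", standing in for
# a near-synonym that an associative model could treat interchangeably.
emb["feline"] = emb["cat"] + 0.05 * rng.normal(size=dim)

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# A close neighbour in embedding space scores near 1; an unrelated
# random vector does not.
print(cosine(emb["cat"], emb["feline"]))
print(cosine(emb["cat"], emb["dog"]))
```

Under this associative picture, a model that substitutes one token for another with a nearby embedding can appear to generalize over vocabulary without performing genuinely symbolic variable binding.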