New York: Springer, 2018. — 99 p.
Acronyms
Deep Learning Background
Incorporating Structural Information into Neural Architectures
Proposed Tree-based Convolutional Neural Networks
Structure of the Book
References
Neuron & Multilayer Network
Training Objectives
Learning Neural Parameters
Pretraining Neural Networks
Neural Networks for NLP
Neural Language Models
Word Embeddings
Convolutional Neural Network
Recurrent Neural Network
Recursive Neural Network
References
General Idea & Formula of TBCNN
Difficulties in Designing TBCNN
Overview
Representation Learning for AST Nodes
Tree-based Convolutional Layer
Dynamic Pooling
Continuous Binary Tree Model
Unsupervised Program Vector Representations
Classifying Programs by Functionalities
Detecting Bubble Sort
Model Analysis
Summary & Discussion
References
Sentence Modeling & Constituency Trees
Recursively Representing Intermediate Nodes
Constituency Tree-based Convolutional Layer
Experiments
Sentiment Analysis
Question Classification
Summary & Discussion
References
Dependency Trees
Dependency Trees as Input
Convolutional Layer
Dynamic Pooling Layer
Applying d-TBCNN to Sentence Matching
Experiments
Discriminative Sentence Modeling
Sentence Matching
Model Analysis
Visualization
Conclusion & Discussion
References
Conclusion & Future Work