Graph-to-Sequence Learning in Natural Language Processing

Dr. Lingfei Wu
Research Staff Member
IBM Research AI
IBM T. J. Watson Research Center

Thursday, Apr 18, 2019
10 AM - 11 AM
EB 3105 Engineering Building

Abstract:
The celebrated Seq2Seq technique and its numerous variants achieve excellent performance on many tasks such as neural machine translation, natural language generation, speech recognition, and drug discovery. Despite their flexibility and expressive power, a significant limitation of Seq2Seq models is that they can only be applied to problems whose inputs are represented as sequences. However, sequences are arguably the simplest form of structured data, and many important problems are best expressed with a more complex structure such as a graph. On the one hand, graph-structured data can encode complicated pairwise relationships for learning more informative representations; on the other hand, the structural and semantic information in sequence data can be exploited to augment the original sequences by incorporating domain-specific knowledge. To cope with complex graph inputs, we propose Graph2Seq, a novel attention-based neural network architecture for graph-to-sequence learning. Graph2Seq can be viewed as a generalization of the Seq2Seq model to graph inputs: a general end-to-end neural encoder-decoder architecture that encodes an input graph and decodes the target sequence. In this talk, I will first introduce our Graph2Seq model and then discuss how to apply it to different NLP tasks. In particular, I will illustrate the advantages of Graph2Seq over various Seq2Seq and Tree2Seq models in two of our recent works: "Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model" (EMNLP 2018) and "SQL-to-Text Generation with Graph-to-Sequence Model" (EMNLP 2018).
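
To make the graph-encoder-plus-sequence-decoder idea concrete, below is a minimal, illustrative sketch in PyTorch. The mean-neighbor aggregation, dot-product attention, and all module names and sizes here are assumptions chosen for exposition; they are not the Graph2Seq implementation presented in the talk.

# Illustrative sketch only: a minimal graph-to-sequence encoder-decoder in
# PyTorch. Mean-neighbor aggregation, dot-product attention, and all module
# names/sizes are assumptions for exposition, not the talk's Graph2Seq model.
import torch
import torch.nn as nn

class GraphEncoder(nn.Module):
    # Produces node embeddings by repeatedly averaging neighbor features
    # (via a row-normalized adjacency matrix) and applying a linear map.
    def __init__(self, in_dim, hid_dim, num_layers=2):
        super().__init__()
        dims = [in_dim] + [hid_dim] * num_layers
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(num_layers)
        )

    def forward(self, x, adj):
        # x: (num_nodes, in_dim); adj: (num_nodes, num_nodes), row-normalized
        h = x
        for layer in self.layers:
            h = torch.relu(layer(adj @ h))  # aggregate neighbors, transform
        return h                            # (num_nodes, hid_dim)

class AttnDecoder(nn.Module):
    # A GRU decoder that attends over the node embeddings at every step.
    def __init__(self, hid_dim, vocab_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hid_dim)
        self.gru = nn.GRUCell(2 * hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)

    def step(self, token, state, node_h):
        scores = node_h @ state                        # (num_nodes,)
        attn = torch.softmax(scores, dim=0)
        context = attn @ node_h                        # weighted sum of nodes
        inp = torch.cat([self.embed(token), context]).unsqueeze(0)
        state = self.gru(inp, state.unsqueeze(0)).squeeze(0)
        return self.out(state), state                  # next-token logits

# Toy usage: a 4-node graph, greedy decoding for 5 steps.
enc, dec = GraphEncoder(in_dim=8, hid_dim=16), AttnDecoder(16, vocab_size=100)
x, adj = torch.randn(4, 8), torch.full((4, 4), 0.25)
node_h = enc(x, adj)
state = node_h.mean(dim=0)    # graph embedding initializes the decoder state
token = torch.tensor(1)       # assumed start-of-sequence token id
for _ in range(5):
    logits, state = dec.step(token, state, node_h)
    token = logits.argmax()

In this toy usage, the mean of the node embeddings initializes the decoder state and decoding proceeds greedily; the published Graph2Seq model may compute graph embeddings, aggregation, and attention differently.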

Biography:
Lingfei Wu is a Research Staff Member in the Reasoning group of the IBM AI Foundations Labs at the IBM T. J. Watson Research Center. He earned his Ph.D. in computer science from the College of William and Mary in August 2016, under the supervision of Prof. Andreas Stathopoulos. His research interests lie at the intersection of Machine Learning (Deep Learning), Representation Learning, and Natural Language Processing, with a particular emphasis on the fast-growing subject of Graph Neural Networks and their extensions to new application domains and tasks. Lingfei has published more than 30 papers in top-ranked conferences and journals, including SysML, NIPS, KDD, ICDM, AISTATS, NAACL, EMNLP, AAAI, ICASSP, SC, the SIAM Journal on Scientific Computing, IEEE Transactions on Big Data, and the Journal of Computational Physics. He is also a co-inventor on more than 13 filed US patents. Lingfei is serving as a Tutorial Chair of IEEE BigData'18. In addition, he has regularly served as a TPC member of many major AI/ML/DL/DM/NLP conferences, including IJCAI, AAAI, NIPS, ICML, ICLR, KDD, and ACL.

Host:
Prof. Jiliang Tang