About me
I am a postdoc at MIT, working with Yoon Kim. I am currently interested in developing more controllable and efficient sequence models. Feel free to reach out if you’d like to chat!
I finished my PhD at the University of Edinburgh, advised by Ivan Titov and Mirella Lapata. Prior to my PhD, I worked on structured prediction with Wei Lu. During my PhD, I primarily focused on generalization challenges that arise in executable semantic parsing (e.g., text-to-SQL parsing), namely domain generalization, learning from weak supervision, and systematic generalization, drawing on latent discrete structure learning (e.g., separable permutations) and specialized learning objectives (e.g., meta-learning).
Publications
* denotes equal contribution.
Gated Linear Attention Transformers with Hardware-Efficient Training
Songlin Yang*, Bailin Wang*, Yikang Shen, Rameswar Panda, Yoon Kim
arXiv, [code]
Grammar Prompting for Domain-Specific Language Generation with Large Language Models
Bailin Wang, Zi Wang, Xuezhi Wang, Yuan Cao, Rif A. Saurous and Yoon Kim
In NeurIPS 2023, [presentation] [code]
Hierarchical Phrase-based Sequence-to-Sequence Learning
Bailin Wang, Ivan Titov, Jacob Andreas and Yoon Kim
In EMNLP 2022, [poster] [code]
Structured Reordering for Modeling Latent Alignments in Sequence Transduction
Bailin Wang, Mirella Lapata and Ivan Titov
In NeurIPS 2021, [code] [slides] [video]
Meta-Learning to Compositionally Generalize
Henry Conklin*, Bailin Wang*, Kenny Smith and Ivan Titov
In ACL 2021, [code] [slides] [video (via Underline)]
Learning from Executions for Semantic Parsing
Bailin Wang, Mirella Lapata and Ivan Titov
In NAACL 2021, [code] [slides] [video (via Underline)]
Meta-Learning for Domain Generalization in Semantic Parsing
Bailin Wang, Mirella Lapata and Ivan Titov
In NAACL 2021, [code] [slides] [video (via Underline)]
Learning to Synthesize Data for Semantic Parsing
Bailin Wang, Wenpeng Yin, Victoria Lin and Caiming Xiong
In NAACL 2021, [code] [video (via Underline)]
GraPPa: Grammar-Augmented Pre-Training for Table Semantic Parsing
Tao Yu, Chien-Sheng Wu, Xi Victoria Lin, Bailin Wang, Yi Chern Tan, Xinyi Yang, Dragomir Radev, Richard Socher and Caiming Xiong
In ICLR 2021, [video]
RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers
Bailin Wang*, Richard Shin*, Xiaodong Liu, Oleksandr Polozov and Matthew Richardson
In ACL 2020, [code] [slides] [video]
Learning Semantic Parsers from Denotations with Latent Structured Alignments and Abstract Programs
Bailin Wang, Mirella Lapata and Ivan Titov
In EMNLP 2019, [code] [slides] [video]
Combining Spans into Entities: A Neural Two-Stage Approach for Recognizing Discontiguous Entities
Bailin Wang and Wei Lu
In EMNLP 2019, [code]
A Neural Transition-based Model for Nested Mention Recognition
Bailin Wang, Wei Lu, Yu Wang and Hongxia Jin
In EMNLP 2018, [code]
Neural Segmental Hypergraphs for Overlapping Mention Recognition
Bailin Wang and Wei Lu
In EMNLP 2018, [code] [slides] [video]
Learning Latent Opinions for Aspect-Level Sentiment Classification
Bailin Wang and Wei Lu
In AAAI 2018, [code] [slides]
You can also find them on my Google Scholar profile.
Log of Parsing Papers
Epoch 7: Late 2022, bow down to LLMs…
Epoch 6: Early 2022, let’s focus on how to make discrete latent structures/variables work!
Epoch 5: In 2021, I’m convinced that Transformers are indeed powerful, but we also need specialized objectives to regularize their training.
Epoch 4: In 2020, Transformers are everywhere; I wonder how latent structures can still be useful.
Epoch 3: During 2018-2019, maybe structured prediction is not required as we already have good end-to-end systems? But latent structures can help!
Epoch 2: During 2017-2018, structured prediction is interesting; I can play with DL and fancy structures!
Epoch 1: In 2017, it seems that everyone is doing DL for NLP, so I should follow, though I do not understand why these models work so well.
Epoch 0: During 2016-2017, I was intrigued by rule/grammar-based parsing systems (and their usage in SMT), and I wished I could do something related.
Work Experience
- Research Intern at Salesforce Research, Summer 2020
- Research Intern at Microsoft Research Redmond, Summer 2019
- Research Intern at Samsung Research America, Summer 2018
- Research Intern at StatNLP Singapore, Summer 2017
- NLP Intern/Engineer at Mobvoi, Beijing, 2016