Bailin Wang


Researcher at Apple AI/ML, working on foundation models.

Previously, I was a postdoc at MIT and obtained my PhD from the University of Edinburgh, where I worked on semantic parsing with latent-variable models before LMs took over NLP.

I’m currently interested in algorithmic approaches to improving the efficiency of sequence models, in order to enable capabilities such as:

  • long-context reasoning
  • multimodal learning

latest posts