Bailin Wang

Researcher at Apple AI/ML, working on the pretraining of foundation models.
Previously, I was a postdoc at MIT, and before that I obtained my PhD from the University of Edinburgh, where I worked on semantic parsing and machine translation.
I’m currently interested in algorithmically improving the efficiency of sequence models to enable capabilities such as:
- long-context reasoning
- multimodal learning
news
latest posts
- Jun 14, 2024: The End of Training Log