Songlin Yang

Songlin (松琳) is currently a second-year PhD student at MIT CSAIL, advised by Prof. Yoon Kim.
Previously, she obtained her bachelor’s degree from SUSTech in 2020 and her master’s degree from ShanghaiTech in 2023, where she was advised by Prof. Kewei Tu.
Her research centers on the intersection of machine learning systems and large language models, with a specific focus on hardware-aware algorithm design for efficient sequence modeling.
news
Jan 22, 2025 | 📣 Gated Delta Networks: Improving Mamba2 with Delta Rule and Stick-Breaking Attention have been accepted to ICLR 2025! |
---|---|
Jan 18, 2025 | Gave a remote talk at Hao AI Lab @ UCSD, “What’s Next for Mamba? Towards More Expressive Recurrent Update Rules” |
Aug 20, 2024 | Gave a talk at HazyResearch @ Stanford, “Linear Transformers for Efficient Sequence Modeling” |
Apr 25, 2024 | Gave a talk at Cornell Tech, “Gated Linear Recurrence for Efficient Sequence Modeling” |
Jan 1, 2024 | Introducing our open-source project flash-linear-attention |