Songlin Yang
Songlin (松琳) is currently a second-year PhD student at MIT CSAIL, advised by Prof. Yoon Kim.
Previously, she obtained her bachelor’s degree from SUSTech in 2020 and her master’s degree from ShanghaiTech in 2023, where she was advised by Prof. Kewei Tu.
Her research centers on the intersection of machine learning systems and large language models, with a specific focus on hardware-aware algorithm design for efficient sequence modeling.
news
| Date | News |
| --- | --- |
| Jan 18, 2025 | Gave a remote talk at Hao AI Lab @ UCSD, “What’s Next for Mamba? Towards More Expressive Recurrent Update Rules” |
| Dec 9, 2024 | 📣 New arXiv paper released: Gated Delta Networks: Improving Mamba2 with Delta Rule |
| Dec 8, 2024 | Check out the blog post accompanying our NeurIPS ‘24 paper, “Parallelizing Linear Transformers with the Delta Rule over Sequence Length,” here. |
| Aug 20, 2024 | Gave a talk at HazyResearch @ Stanford, “Linear Transformers for Efficient Sequence Modeling” |
| Apr 25, 2024 | Gave a talk at Cornell Tech, “Gated Linear Recurrence for Efficient Sequence Modeling” |
| Jan 1, 2024 | Introducing our open-source project flash-linear-attention. Join Discord if you are interested in linear attention/RNNs! |