Songlin Yang
Songlin is currently a first-year PhD student at MIT CSAIL, advised by Prof. Yoon Kim.
Previously, she obtained her bachelor’s degree from SUSTech in 2020 and her master’s degree from ShanghaiTech in 2023, where she was advised by Prof. Kewei Tu.
Her research centers on the intersection of machine learning systems and large language models, with a specific focus on hardware-aware algorithm design for efficient sequence modeling.
news
| Date | News |
|---|---|
| May 2, 2024 | Gated Linear Attention Transformers (GLA) was accepted to ICML 2024. Code is available here. |
| Apr 25, 2024 | Gave a talk at Cornell Tech: “Gated Linear Recurrence for Efficient Sequence Modeling”. |
| Apr 12, 2024 | Introducing HGRN2, a minimalist linear attention model with strong performance. Code is available here. |
| Jan 1, 2024 | Introducing our open-source project flash-linear-attention. Join the Discord if you are interested in linear attention/RNNs! |
| Dec 14, 2023 | Presented our linear RNN paper (HGRN) at NeurIPS. Code is available here. |