news

Jan 22, 2025 :loudspeaker: "Gated Delta Networks: Improving Mamba2 with Delta Rule" and "Stick-Breaking Attention" have been accepted to ICLR 2025!
Jan 18, 2025 Gave a remote talk at Hao AI Lab @ UCSD, "What's Next for Mamba? Towards More Expressive Recurrent Update Rules."
Dec 8, 2024 :loudspeaker: Check out the blog post accompanying our NeurIPS ‘24 paper, “Parallelizing Linear Transformers with the Delta Rule over Sequence Length,” here.
Aug 20, 2024 Gave a talk at HazyResearch @ Stanford, "Linear Transformers for Efficient Sequence Modeling."
Apr 25, 2024 Gave a talk at Cornell Tech, "Gated Linear Recurrence for Efficient Sequence Modeling."
Jan 1, 2024 Introducing our open-source project flash-linear-attention :rocket: :rocket: :rocket:. Join our Discord if you are interested in linear attention/RNNs!