news

May 2, 2024 Gated Linear Attention Transformers (GLA) has been accepted to ICML 2024 :smile: Code is available here.
Apr 25, 2024 Gave a talk at Cornell Tech, “Gated Linear Recurrence for Efficient Sequence Modeling”.
Apr 12, 2024 Introducing HGRN2 :sparkles: :sparkles:, a minimalist linear attention model with strong performance. Code is available here.
Jan 1, 2024 Introducing our open-source project flash-linear-attention :rocket: :rocket: :rocket:. Join our Discord if you are interested in linear attention/RNNs!
Dec 14, 2023 Presenting our linear RNN paper (HGRN) at NeurIPS :sparkles: :sparkles:. Code is available here.