AI / ML / LLM / Transformer Models Timeline Details

Viktor Garske @vemgar, Last update: Sun Jul 30 16:20:21 2023

FlashAttention

Timeline graph: FlashAttention (05/2022) → StarCoderBase (05/2023), MPT-7B Base (05/2023)
Type: Method
Paper name: FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness
Paper authors: Dao et al.
Paper link: https://arxiv.org/abs/2205.14135
Publish date: 2022-05-27
Repository link: https://github.com/HazyResearch/flash-attention
Affiliation: Stanford, University at Buffalo
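The core idea behind FlashAttention is to compute exact attention in tiles, using an online softmax so the full N×N score matrix is never materialized in slow memory. The sketch below is a minimal NumPy illustration of that tiling scheme, not the paper's CUDA kernel; the function names and the `block` parameter are my own for illustration.

```python
import numpy as np

def naive_attention(Q, K, V):
    # Standard attention: softmax(Q K^T / sqrt(d)) V,
    # materializing the full N x N score matrix.
    d = Q.shape[-1]
    S = Q @ K.T / np.sqrt(d)
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def tiled_attention(Q, K, V, block=4):
    # Block-wise exact attention with an online softmax:
    # K and V are processed in tiles, and running statistics
    # (row-wise max m and normalizer l) are rescaled as new
    # tiles arrive, so only one tile of scores exists at a time.
    N, d = Q.shape
    O = np.zeros_like(Q)
    m = np.full(N, -np.inf)   # running row-wise max of the scores
    l = np.zeros(N)           # running softmax normalizer
    for j in range(0, N, block):
        Kj, Vj = K[j:j+block], V[j:j+block]
        S = Q @ Kj.T / np.sqrt(d)              # scores for this tile only
        m_new = np.maximum(m, S.max(axis=-1))  # updated running max
        alpha = np.exp(m - m_new)              # rescale factor for old stats
        P = np.exp(S - m_new[:, None])
        l = alpha * l + P.sum(axis=-1)
        O = alpha[:, None] * O + P @ Vj
        m = m_new
    return O / l[:, None]
```

Because the softmax renormalization is exact, the tiled version matches the naive one to floating-point precision; in the actual kernel the tiles are sized to fit in on-chip SRAM, which is the "IO-awareness" in the paper's title.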