AI / ML / LLM / Transformer Models Timeline Details
Viktor Garske (@vemgar), last update: Tue Dec 26 15:23:35 2023
FlashAttention
[Timeline graph: FlashAttention (05/2022) → StarCoderBase (05/2023), MPT-7B Base (05/2023), Mistral 7B (09/2023)]
Type: Method
Paper name: FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness
Paper authors: Dao et al.
Paper link: https://arxiv.org/abs/2205.14135
Publish date: 2022-05-27
Repository link: https://github.com/HazyResearch/flash-attention
Affiliation: Stanford, University at Buffalo
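The paper's title captures the core idea: attention is computed exactly, but in an IO-aware way, streaming over key/value blocks with an online softmax so the full score matrix is never materialized. Below is a minimal sketch of that rescaling trick for a single query, in pure Python. This is only an illustration of the online-softmax idea, not the paper's fused CUDA kernels; the function names and block size are ours, not from the FlashAttention repository.

```python
import math

def naive_attention(q, ks, vs):
    """Reference: full softmax(q·K^T / sqrt(d)) · V for one query vector."""
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in ks]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    out = [0.0] * len(vs[0])
    for w, v in zip(exps, vs):
        for j, vj in enumerate(v):
            out[j] += (w / z) * vj
    return out

def streaming_attention(q, ks, vs, block=2):
    """Same result, but K/V are visited in blocks (online softmax).

    Keeps a running max m, running normalizer l, and an un-normalized
    accumulator acc; both are rescaled whenever a new block raises the max.
    """
    d = len(q)
    m = float("-inf")
    l = 0.0
    acc = [0.0] * len(vs[0])
    for start in range(0, len(ks), block):
        kb, vb = ks[start:start + block], vs[start:start + block]
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in kb]
        m_new = max(m, max(scores))
        scale = math.exp(m - m_new)      # exp(-inf) == 0.0 on the first block
        l *= scale
        acc = [a * scale for a in acc]
        for s, v in zip(scores, vb):
            w = math.exp(s - m_new)
            l += w
            for j, vj in enumerate(v):
                acc[j] += w * vj
        m = m_new
    return [a / l for a in acc]          # normalize once at the end
```

Because the accumulator is rescaled rather than recomputed, the blocked pass returns the same output as the reference, which is what lets FlashAttention keep attention exact while working tile by tile in fast on-chip memory.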