[Paper Summary] Linear Attention
Paper list
Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
2020.08 Angelos Katharopoulos (see the sketch below the list)
SANA 1.5: Efficient Scaling of Training-Time and Inference-Time Compute in Linear Diffusion Transformer
2025.01 Enze Xie
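
The first paper in the list replaces the softmax similarity exp(q_i · k_j / sqrt(d)) with a kernel feature map phi(q_i)^T phi(k_j), using phi(x) = elu(x) + 1, so attention can be regrouped as phi(Q)(phi(K)^T V) and computed in time linear in sequence length. Below is a minimal non-causal NumPy sketch of that formulation; the function names and shapes are illustrative and not taken from the paper's released code.

```python
import numpy as np

def elu_plus_one(x):
    # Feature map phi(x) = elu(x) + 1 from Katharopoulos et al. (2020):
    # x + 1 for x > 0, exp(x) for x <= 0 (since elu(x) = exp(x) - 1 there).
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V, eps=1e-6):
    # Non-causal linear attention for a single head.
    # Q, K: (N, d_k); V: (N, d_v). Cost is O(N * d_k * d_v) instead of O(N^2).
    Qp = elu_plus_one(Q)            # (N, d_k)
    Kp = elu_plus_one(K)            # (N, d_k)
    KV = Kp.T @ V                   # (d_k, d_v), shared across all queries
    Z = Qp @ Kp.sum(axis=0)         # (N,) per-query normalizer
    return (Qp @ KV) / (Z[:, None] + eps)

# Toy usage with random inputs
rng = np.random.default_rng(0)
Q = rng.normal(size=(8, 4))
K = rng.normal(size=(8, 4))
V = rng.normal(size=(8, 4))
print(linear_attention(Q, K, V).shape)  # (8, 4)
```

Because phi(K)^T V is computed once and reused for every query, the same regrouping also yields a constant-size recurrent state for autoregressive decoding, which is the "Transformers are RNNs" observation in the title.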