r/LocalLLaMA
Posted by u/Fear_ltself
Google Research announces Sequential Attention: Making AI models leaner and faster without sacrificing accuracy
Tools 600 points
45 comments
1 month ago
External link:
https://research.google/blog/sequential-attention-making-ai-models-leaner-and-faster-without-sacrificing-accuracy/