r/LocalLLaMA
Posted by u/obvithrowaway34434
People are getting it wrong: Anthropic doesn't care about the distillation; they just want to counter the narrative that Chinese open-source models are catching up with closed-source frontier models
Tools · 648 points · 115 comments · 3 weeks ago
Why would they care about distillation when they have probably done the same with OpenAI's models, and the Chinese labs are paying for the tokens anyway? This is just their attempt to convince investors and the US government that cheap Chinese models will never be as good as their models without distillation or stolen model weights, and that more restrictions on China are needed to prevent the technology transfer.
External link: https://i.redd.it/1ulaheylwclg1.png

More from r/LocalLLaMA
r/LocalLLaMA · u/KvAk_AKPlaysYT
Anthropic: "We’ve identified industrial-scale distillation attacks on our models by DeepSeek, Moonshot AI, and MiniMax." 🚨
Tools · 3.1K points · 674 comments · 3 weeks ago

r/LocalLLaMA · u/HeadAcanthisitt...
I feel personally attacked
Tools · 3.0K points · 151 comments · 1 week ago

r/LocalLLaMA · u/Xhehab_
Distillation when you do it. Training when we do it.
Tools · 2.6K points · 156 comments · 3 weeks ago