r/LocalLLaMA
Posted by u/gggghhhhiiiijklmnop
Best "End of world" model that will run on 24gb VRAM
Tools 331 points
176 comments
3 days ago
Hey peeps, I'm feeling in a bit of an "omg the world is ending" mood and have been amusing myself by downloading and hoarding a bunch of data - think Wikipedia, Wiktionary, Wikiversity, Khan Academy, etc. What's your take on the smartest / best model(s) to download and store? They need to fit and run on my 24GB VRAM / 64GB RAM PC.
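(For anyone sizing up candidates: a rough back-of-the-envelope check is weight size ≈ parameter count × bits per weight ÷ 8, plus some headroom for KV cache and runtime. The sketch below is a simplification with an assumed ~2 GB fixed overhead - actual usage depends on context length, quant format, and backend.)

```python
# Rough VRAM fit check for quantized LLM weights.
# Assumption (not exact): ~2 GB fixed overhead for KV cache,
# activations, and runtime; real usage varies with context length.

def fits_in_vram(params_billions: float, bits_per_weight: float,
                 vram_gb: float = 24.0, overhead_gb: float = 2.0) -> bool:
    """Estimate whether quantized weights plus overhead fit in VRAM."""
    weight_gb = params_billions * bits_per_weight / 8  # e.g. 32B @ 4-bit = 16 GB
    return weight_gb + overhead_gb <= vram_gb

# A 32B model at 4-bit (~16 GB of weights) squeezes into 24 GB:
print(fits_in_vram(32, 4))   # True
# A 70B model at 4-bit (~35 GB of weights) does not:
print(fits_in_vram(70, 4))   # False
```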