Posted by u/N8Karma
I trained a model to 'unslop' AI prose
I ran passages from Project Gutenberg through GPT-4o-mini 10 times over, each time telling it to "make it read far better, adding superior prose, etc.". This led to classic literary passages being enslopped. I then reversed this pipeline and trained a model to go from [slop] -> [original] (rough sketches of the data loop and of local inference are at the end of this post).

The resulting model is capable enough to fool Pangram (a fairly robust AI detector - I take this as a metric of how 'human-sounding' the output is) at very little overall quality cost: while quality decreases slightly, humanness jumps from 0 to 0.481. The unslopped version stays firmly above Mistral Large 3 and close to the original GPT-5.2 baseline.

Of course, the model is OSS: https://huggingface.co/N8Programs/Unslopper-30B-A3B-bf16

And there are now GGUFs: https://huggingface.co/N8Programs/Unslopper-GGUF

And there is a twitter thread (complete with 'thread below 👇', ironic in a thread about slop): https://x.com/N8Programs/status/2011591738591494625

The goal here is **not** to fool Pangram, deceive/cheat, etc. I only use Pangram as a proxy for the prose being more readable - ideally, you'd use this model to make your own AI-generated passages more palatable to read, or as part of a larger pipeline (automated writing generation for training, for instance).

Example (GPT-5.2):

The wind rose all at once, as if the mountain itself had inhaled. She stood at the edge of the pass, cloak snapping like a torn banner behind her, staring down at the valley where the last lights were going out one by one. Each flame extinguished felt deliberate—an answer to a question she had been too afraid to ask. Somewhere below, bells began to ring, not in warning, but in mourning. They already knew.

“Don’t,” he said, breathless, reaching for her arm. His fingers brushed fabric, then air. “If you cross that line, there’s no coming back.”

She didn’t turn. The cold burned her lungs, sharpened her thoughts. “There was never a way back,” she replied softly. “Only a way forward you were willing to pretend didn’t exist.”

Thunder rolled, low and ancient. For a moment, the sky fractured into white fire, and in that instant he saw her face—resolved, terrified, incandescent with purpose. Not the girl he had sworn to protect, but the woman the world had demanded she become.

“You’ll die,” he said.

She smiled then, finally facing him, eyes bright with unshed tears. “So will everyone,” she answered. “The difference is what we leave burning when we’re gone.”

She stepped forward.

The mountain exhaled.

**Pangram Verdict:** High Confidence AI Generated

Unslopped (Temp=0.8, Rep Penalty=1.1, 6-bit MLX quant):

And then it just happened. Like the whole of reality had been held breathless for a moment, and suddenly sprang into focus. She stood at the pass's edge. Her cloak flapped behind her like a banner carried to its last stop. She was looking down into the valley, at lights being snuffed out one by one. Each flame extinguished was a point of answer to questions she had dared not ask. And now bells were ringing, not a signal but a dirge, because they already knew.

"Please," he said. He reached for her; his hand closed on the snaking line of fabric that trailed behind her, then nothing. "You can't cross over."

She didn't turn to him. It was cold enough to bite at her throat; she breathed it in, sharp and full of poison. "There was never a way out," she said softly. "Only the road in that you did not want to remember."

A low rumble of thunder, older than mountains, filled the air. A white flash lit up the sky for a moment, a vivid static song of fire that illuminated her face in an instant. It was no longer the girl he'd sworn to protect, but the woman she had become.

"You'll die."

She smiled at him and turned toward him. There were tears on her face, but they did not fall. "We all do," she said. "The difference is what we leave behind us."

She started walking toward the edge. And it all happened at once. The mountain exhaled itself, and took her with it.

**Pangram Verdict:** High Confidence Human Written

Note that there are some local coherence issues w/ the Unslopper - that's why I'd recommend integrating it into a larger pipeline or editing its output yourself. It's definitely not production-ready.

---------

As a bonus, the training of this model was entirely local! Done on one M3 Max w/ mlx-lm. Took 12 hours.
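For anyone who wants to reproduce the data side, here's a minimal sketch of the enslopping loop described above, assuming the OpenAI Python SDK. The prompt wording, the `rounds` count, and the JSONL field names are illustrative guesses, not my exact setup:

```python
import json
from openai import OpenAI

client = OpenAI()
REWRITE_PROMPT = "Make it read far better, adding superior prose, etc."

def ensloppify(passage: str, rounds: int = 10) -> str:
    """Rewrite a passage repeatedly so it drifts toward slop."""
    text = passage
    for _ in range(rounds):
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system", "content": REWRITE_PROMPT},
                {"role": "user", "content": text},
            ],
        )
        text = resp.choices[0].message.content
    return text

# Reversed pipeline: the slopped text is the input, the Gutenberg original the target.
gutenberg_passages = ["..."]  # load your own Project Gutenberg excerpts here
with open("unslop_train.jsonl", "w") as f:
    for original in gutenberg_passages:
        pair = {"prompt": ensloppify(original), "completion": original}
        f.write(json.dumps(pair) + "\n")
```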
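And a sketch of running the released model locally with mlx-lm, using the sampling settings from the example above (Temp=0.8, Rep Penalty=1.1). This assumes a recent mlx-lm and the default chat template - check the model card for the exact prompt format the model expects, and swap in a quantized repo if the bf16 weights don't fit in memory:

```python
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_logits_processors, make_sampler

model, tokenizer = load("N8Programs/Unslopper-30B-A3B-bf16")

slop = "The wind rose all at once, as if the mountain itself had inhaled. ..."
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": slop}],
    add_generation_prompt=True,
)

unslopped = generate(
    model,
    tokenizer,
    prompt=prompt,
    max_tokens=1024,
    sampler=make_sampler(temp=0.8),  # Temp=0.8
    logits_processors=make_logits_processors(repetition_penalty=1.1),  # Rep Penalty=1.1
)
print(unslopped)
```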