SmolLM-Smashed is a collection of optimized language models. Each model is quantized and compiled for maximum efficiency while preserving performance.
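The blurb above mentions quantization. As a minimal illustration of what post-training quantization does to a weight matrix, here is a symmetric per-tensor int8 sketch; this is a simplification for intuition, not the collection's actual optimization pipeline:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    # Symmetric per-tensor int8 quantization: map floats to [-127, 127]
    # using a single scale derived from the largest absolute weight.
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover approximate float weights from the int8 codes.
    return q.astype(np.float32) * scale

w = np.array([[0.5, -1.0], [0.25, 0.75]], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# Per-element reconstruction error is at most half a quantization step.
assert np.max(np.abs(w - w_hat)) <= s / 2 + 1e-6
```

The storage cost drops from 32 bits to 8 bits per weight, at the price of a bounded rounding error; real pipelines typically quantize per-channel or per-group to shrink that error further.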
Parag Ekbote
AINovice2005
AI & ML interests
ML engineer passionate about taking models from research to production. One year of experience supporting tech startups. Active open-source contributor.
Recent Activity
reacted to danielhanchen's post with 🔥 about 21 hours ago
We collaborated with Hugging Face to enable you to train MoE models 12× faster with 35% less VRAM via our new Triton kernels (no accuracy loss). 🤗
Train gpt-oss locally on 12.8GB VRAM with our free notebooks: https://unsloth.ai/docs/new/faster-moe
upvoted an article 8 days ago
Training Design for Text-to-Image Models: Lessons from Ablations
upvoted an article 15 days ago
Unlocking Agentic RL Training for GPT-OSS: A Practical Retrospective