openfree
1,114 followers · 302 following
https://discord.gg/openfreeai
AI & ML interests
None yet
Recent Activity

updated a Space about 18 hours ago: VIDraft/AI

reacted to SeaWolf-AI's post with 🧠 about 22 hours ago:
🧬 Darwin Family: Zero Gradient Steps, GPQA Diamond 88.89%

How far can we push LLM reasoning *without* training? Our team at VIDRAFT submitted this paper to Daily Papers yesterday, and it's currently #3. Huge thanks to everyone who upvoted — sharing the core ideas below.

🔗 Paper: https://huggingface.co/papers/2605.14386
🔗 arXiv: https://arxiv.org/abs/2605.14386
🔗 Model: https://huggingface.co/FINAL-Bench/Darwin-28B-REASON
🔗 Model: https://huggingface.co/FINAL-Bench/Darwin-28B-Opus

---

TL;DR

Darwin Family is a training-free evolutionary merging framework. By recombining the weight spaces of existing LLM checkpoints — with zero gradient-based training — it reaches frontier-level reasoning.

- 🏆 Darwin-28B-Opus: GPQA Diamond 88.89%
- 💸 Zero gradient steps — not a single B200 or H200 hour needed
- 🧬 Consistent gains across 4B → 35B scale
- 🔀 Cross-architecture breeding between Transformer and Mamba families
- 🔁 Stable recursive multi-generation evolution

# Three Core Mechanisms

① 14-dim Adaptive Merge Genome — fine-grained recombination at both component level (Attention / FFN / MLP / LayerNorm / Embedding) and block level, expanding the prior evolutionary-merge search space.

② MRI-Trust Fusion — we diagnose each layer's reasoning contribution via an **MRI (Model Reasoning Importance)** signal and fuse it with evolutionary search through a **learnable trust parameter**. Trust the diagnostic too much and search collapses; ignore it and search becomes inefficient — Darwin learns the balance from data.

③ Architecture Mapper — weight-space breeding across heterogeneous families. Attention × SSM crossover actually works.

Why It Matters

> Diagnose latent capabilities already encoded in open checkpoints,
> and recombine them — no gradients required.

Replies and critiques welcome 🙌
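The component-level recombination described in ① can be sketched, in heavily simplified form, as a per-component linear interpolation of two parent checkpoints. Everything here is illustrative: the names (`COMPONENTS`, `component_of`, `merge`), the genome layout, and the toy weights are assumptions, and the paper's actual 14-dim genome, MRI diagnostics, and trust fusion are not reproduced.

```python
import numpy as np

# Hypothetical component buckets; the post names Attention / FFN /
# MLP / LayerNorm / Embedding as merge granularities.
COMPONENTS = ["attn", "ffn", "mlp", "norm", "embed"]

def component_of(param_name):
    """Map a parameter name to a component bucket.

    Assumption: parameter names contain one of the component
    keywords (e.g. "attn.q", "ffn.w").
    """
    for c in COMPONENTS:
        if c in param_name:
            return c
    return "other"

def merge(parent_a, parent_b, genome):
    """Interpolate two checkpoints component-by-component.

    genome maps component -> mixing coefficient in [0, 1], loosely
    analogous to one slice of an evolutionary merge genome; an
    evolutionary search would mutate and select over such genomes.
    """
    child = {}
    for name, wa in parent_a.items():
        alpha = genome.get(component_of(name), 0.5)
        child[name] = alpha * wa + (1 - alpha) * parent_b[name]
    return child

# Two toy "checkpoints" with matching parameter shapes.
a = {"attn.q": np.ones((2, 2)), "ffn.w": np.zeros((2, 2))}
b = {"attn.q": np.zeros((2, 2)), "ffn.w": np.ones((2, 2))}

# A genome favoring parent A's attention and parent B's FFN.
child = merge(a, b, {"attn": 0.75, "ffn": 0.25})
```

In a real setup the parents would be full model state dicts with identical architectures (the post's Architecture Mapper is what relaxes that constraint), and fitness would be measured by benchmark scores rather than inspected by hand.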
liked a Space about 22 hours ago: FINAL-Bench/Darwin-9B-NEG
Organizations

openfree's datasets (5, most recently updated first)
openfree/novel-themes-library • Viewer • Updated Sep 10, 2025 • 329 • 30 • 4
openfree/hf-trending-summaries • Viewer • Updated Sep 8, 2025 • 1.2k • 14
openfree/hle-test • Viewer • Updated Feb 24, 2025 • 2.7k • 197
openfree/test-parquet • Viewer • Updated Feb 17, 2025 • 203 • 26
openfree/autotrain-flx-fash-test • Viewer • Updated Oct 6, 2024 • 20 • 24