Annnnnd... here it is: https://huggingface.co/deca-ai/3-alpha-ultra — the largest AI model in the world, from deca-ai, clocking in at a whopping 4.6T parameters. Apologies for the delay, but we're stoked to finally drop this, even in its alpha stage. Before you dive in, here are a few things to keep in mind:
1. **No commercial use yet**: We're still working on Deca 2.5 (proprietary), and releasing Deca 3 for commercial use right now would undercut it. Once Deca 3.5 lands in early '26, we'll open things up with a more permissive license.
2. **Built on existing models**: Deca 3 isn't a ground-up creation; it's a huge step forward built on what's already out there.
3. **It's experimental**: As hyped as we are about its scale, it's still in testing.
4. **DynaMoE architecture**: Run a (very) small part of the model with 64GB of RAM/VRAM (when quantized; quants coming soon), or the whole thing with 1TB. It's that scalable.
5. **Not widely supported yet**: Frameworks like vLLM and Transformers aren't compatible with Deca 3 at the moment, so until we ship the DynaMoE software (beta coming soon), it's mostly just a concept.
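To give a feel for what "run a small part or the whole thing" could mean in practice, here's a minimal sketch of the budget math behind a mixture-of-experts setup that only loads as many experts as your RAM/VRAM allows. The function name and the per-expert size are purely illustrative assumptions, not the actual DynaMoE software or Deca 3's real expert sizes:

```python
# Hypothetical sketch: scale a MoE model to a memory budget by loading
# only the experts that fit. All names and numbers are illustrative.

def experts_that_fit(budget_gb: float, expert_size_gb: float, min_experts: int = 1) -> int:
    """Return how many experts a given RAM/VRAM budget can hold."""
    n = int(budget_gb // expert_size_gb)
    return max(n, min_experts)

# With hypothetical 4 GB (quantized) experts:
print(experts_that_fit(64, 4))    # 16  -> a small slice of the model
print(experts_that_fit(1024, 4))  # 256 -> far more of it
```

The idea is just that a dynamic-MoE runtime can trade model coverage for memory, which is why the same checkpoint could be useful on a 64GB workstation and a 1TB server.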
We’re super excited to see what you do with it once the full setup’s ready. Hang tight, and stay tuned!
🚀 Deca 3 Ultra Alpha is coming in the next 72 hours! 🚀
We're on the verge of something monumental. Right now, we're in the final stages of testing, and we're about to drop a game-changing milestone in the open-source AI community. 🎉
In just two weeks, we've built a model almost 4x the size of what was then the largest open-source LLM (and it's still 2.6x larger than the current largest). This is unprecedented and a testament to the power of collaboration, innovation, and the relentless pursuit of pushing AI to its limits.
The future of open-source AI is now. Stay tuned for the release – we’re just getting started.
- Model testing finishes: 24hrs from now
- Model gets uploaded: 30hrs from now
- Related code/inference stack gets published: 70-90hrs from now