After three years of stealth development and securing $626 million in funding, AI coding specialist Poolside has officially released its first programming models to the public. Founded in 2023 with a valuation of $3 billion and backed by heavyweights like Bain Capital Ventures, NVIDIA, and eBay Ventures, the company previously restricted its services to government and public sector clients. This launch marks a significant pivot, bringing their technology to the broader developer community.

🚀 The Laguna Model Family
Laguna M.1 (Flagship): A massive 225B parameter Mixture-of-Experts (MoE) model with 23B active parameters. Trained from scratch on 30 trillion tokens using 6,144 NVIDIA Hopper GPUs, it achieves a 46.9% score on SWE-bench Pro. This model is available exclusively via API.
Laguna XS.2 (Efficient): A lightweight 33B parameter MoE model (3B active). It delivers impressive performance with a 44.5% score on SWE-bench Pro and 68.2% on SWE-bench Verified. Crucially, Poolside has open-sourced the weights under the Apache 2.0 license. Optimized for efficiency, it can run locally on Macs with 36GB of RAM and is already available on Hugging Face, Ollama, and OpenRouter.
Both models are currently available for free for a limited time.
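Since Laguna XS.2 is listed on OpenRouter, which exposes an OpenAI-compatible chat-completions endpoint, calling it might look like the sketch below. Note the model slug (`poolside/laguna-xs-2`) and prompt are assumptions for illustration, not confirmed by the announcement; check OpenRouter's model catalog for the actual identifier before use.

```python
import json
import urllib.request

# Hypothetical model slug -- verify against OpenRouter's catalog.
MODEL_ID = "poolside/laguna-xs-2"
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completions payload."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }

def send_request(payload: dict, api_key: str) -> bytes:
    """Send the payload to OpenRouter (requires a valid API key)."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

Because the weights are Apache 2.0 licensed, the same model could alternatively be pulled and run fully offline through Ollama on a sufficiently large machine, avoiding the hosted API entirely.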
Beyond the models, Poolside is releasing its internal agent runtime, pool, as open source. This toolkit allows developers to use the same environment Poolside uses for training and evaluating agents. With a research team of approximately 60 people, the company emphasized in its announcement that "the West needs powerful open-source models, and we want to contribute." This move signals a commitment to transparency and aims to foster a robust ecosystem around autonomous coding agents.