Ex-Google, Apple engineers launch unconditionally open-source Oumi AI platform that could help build the next DeepSeek


If it wasn't clear before, it certainly is now: open source matters for AI. The success of DeepSeek-R1 has proven the need and demand for open-source AI.

But what exactly does open-source AI mean? For Meta and its Llama models, it means free access to use the model, subject to certain conditions. DeepSeek is available under a permissive open-source license that provides significant access to its architecture and capabilities. However, the specific training code and detailed methods, particularly the reinforcement learning (RL) techniques involved, such as Group Relative Policy Optimization (GRPO), are not publicly disclosed. This omission limits the community's ability to fully understand and replicate the model's training process.

What neither DeepSeek nor Llama provides, however, is access to everything: all of the model code, the weights and the training data. Without that information, developers can still work with an open model, but they lack the tools and insights needed to understand how it really works and, more importantly, how to build a new model of their own. That is the challenge a new startup led by former Google and Apple AI veterans aims to solve.

Launching today, Oumi is backed by an alliance of 13 leading research universities, including Princeton, Stanford, MIT, UC Berkeley, the University of Oxford, the University of Cambridge, the University of Waterloo and Carnegie Mellon. Oumi's founders raised $10 million, a modest seed round they say meets their needs. While major players like OpenAI are pursuing $500 billion investments in massive data centers through projects such as Stargate, Oumi is taking a radically different approach. The platform gives researchers and developers a complete toolkit for developing, evaluating and deploying foundation models.

“Even the biggest companies can't do it on their own,” Oussama Elachqar, Oumi's cofounder and previously a machine learning engineer at Apple, told VentureBeat. “We were effectively working in silos within Apple, and there are many other silos across the industry. There needs to be a better way to build these models together.”

What open models like DeepSeek and Llama are missing

Oumi CEO Manos Koukoumidis, a former senior engineering manager at Google Cloud AI, told VentureBeat that researchers constantly tell him how complicated it has become to experiment with AI.

While today's open models are a step forward, they are not enough. Koukoumidis explained that with current “open” AI models such as DeepSeek-R1 and Llama, an organization can take the model and deploy it itself. What's missing is that anyone who wants to build on the model doesn't know how it was built.

Oumi's founders believe that lack of transparency is a major obstacle to collaborative AI research and development. Even a project like Llama requires significant effort from researchers just to figure out how to reproduce and build on the work.

How Oumi works to open up AI for enterprise users, researchers and everyone else

The Oumi platform works by providing an all-in-one environment that streamlines complex workflows involved in developing AI models.

Koukoumidis explained that building a foundation model typically involves 10 or more steps, often run in parallel. Oumi incorporates all the necessary tools and workflows into a unified environment, eliminating the need for researchers to piece together and configure different open-source components themselves.

The key technical features include:

  • Support for models from 10M to 405B parameters
  • Implementation of advanced training techniques, including SFT, LoRA, QLoRA and DPO (see the sketch after this list)
  • Capabilities for both text and multimodal models
  • Built-in tools for training data synthesis and curation using LLM judges
  • Deployment options through modern inference engines such as vLLM and SGLang
  • Comprehensive model evaluation on standard industry benchmarks
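
As a rough illustration of what one of those steps looks like in practice, here is a minimal sketch of parameter-efficient supervised fine-tuning with LoRA. It uses the widely available Hugging Face transformers, peft and datasets libraries rather than Oumi's own API, and the base model (gpt2) and dataset are placeholders chosen only so the example can run on a laptop.

    # Minimal sketch of parameter-efficient SFT with LoRA.
    # Not Oumi's API: the model and dataset below are stand-ins for illustration.
    from datasets import load_dataset
    from peft import LoraConfig, get_peft_model
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    base = "gpt2"  # small placeholder base model so this runs on a laptop
    tokenizer = AutoTokenizer.from_pretrained(base)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(base)

    # Wrap the base model with low-rank adapters; only a tiny fraction of
    # parameters is trained, which is what makes LoRA cheap.
    model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16,
                                             lora_dropout=0.05,
                                             task_type="CAUSAL_LM"))
    model.print_trainable_parameters()

    # Placeholder text dataset; any corpus with a text column works the same way.
    data = load_dataset("Abirate/english_quotes", split="train[:200]")
    data = data.map(lambda x: tokenizer(x["quote"], truncation=True,
                                        max_length=128), batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="lora-demo", num_train_epochs=1,
                               per_device_train_batch_size=4, logging_steps=10),
        train_dataset=data,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    model.save_pretrained("lora-demo/adapter")  # adapter weights stay small

Swapping LoRA for QLoRA or adding a DPO stage changes the configuration rather than the overall workflow, which is the kind of consolidation Oumi says its platform provides.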

“You don't have to deal with the open-source development hell of figuring out what you can combine and what works well together,” Koukoumidis explained.

The platform lets users start small, using their own laptops for initial experiments and model training. As they progress, they can scale up to larger compute resources, such as university clusters or cloud providers, all within the same Oumi environment.
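
The snippet below is a purely illustrative sketch of that "same workflow, different compute" idea: one entry point that either runs a training script locally or submits the identical script to a cluster scheduler. The backend labels and dispatch logic are hypothetical and are not Oumi's actual launcher.

    # Hypothetical sketch of config-driven scaling: run locally for quick
    # experiments, or hand the same script to a Slurm cluster. Not Oumi's API.
    import subprocess
    from dataclasses import dataclass

    @dataclass
    class JobConfig:
        script: str        # training script, e.g. the LoRA sketch above
        backend: str       # "local" or "slurm" (illustrative labels)
        gpus: int = 0

    def launch(cfg: JobConfig) -> None:
        if cfg.backend == "local":
            # Small experiment: run directly on the laptop.
            subprocess.run(["python", cfg.script], check=True)
        elif cfg.backend == "slurm":
            # University cluster: submit the same script via the scheduler.
            subprocess.run(["sbatch", f"--gres=gpu:{cfg.gpus}", "--wrap",
                            f"python {cfg.script}"], check=True)
        else:
            raise ValueError(f"unknown backend: {cfg.backend!r}")

    if __name__ == "__main__":
        launch(JobConfig(script="train_lora.py", backend="local"))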

You do not need massive training infrastructure to produce an open model

One of the big surprises about DeepSeek-R1 is that it appears to have been built with a fraction of the resources that Meta or OpenAI used to train their models.

While OpenAI and others have invested billions in centralized infrastructure, Oumi is betting on a shared, distributed approach that can reduce costs.

“The idea that you need billions upon billions [of dollars] for AI infrastructure is fundamentally flawed,” Koukoumidis said. “By sharing compute with universities and research institutions, we can achieve similar or better results at a fraction of the cost.”

Oumi's initial focus is on developing an open ecosystem of users and developers. But that's not all the company has planned: Oumi intends to build enterprise offerings that help businesses deploy these models in production environments.

