DeepSeek is making waves in the AI community by planning to release the underlying code behind its recently launched simulated reasoning model. This follows last month's release of the model itself, which was free to download and use under an MIT license. Starting next week, the company will publish five open-source repositories during its "Open Source Week."

According to a social media post, these daily releases aim to provide insight into the foundational components of its online service, which the company says have been tested and deployed in production. DeepSeek emphasized its belief in the open-source community, stating that every shared line of code accelerates collective progress. While specifics about the code remain vague, the accompanying GitHub page for "DeepSeek Open Infra" promises transparency and insight into the model's development.

This move contrasts sharply with OpenAI, whose ChatGPT models remain proprietary. DeepSeek's initial release included "open weights" access, allowing users to fine-tune the model's parameters. However, it is unclear whether the upcoming release will include training code, which is necessary to meet the Open Source Initiative's definition of "Open Source AI." A fully open-source release could deepen researchers' understanding of the model's core functionality, potentially revealing inherent biases or limitations.

DeepSeek is not alone in this trend: Elon Musk's xAI also recently promised an open-source version of Grok 2, while Hugging Face released an open-source clone of OpenAI's "Deep Research" feature. — news from Ars Technica
