Over the past few days, DeepSeek published eight open-source projects on GitHub, the world's largest open-source community. It was the first time the firm revealed in detail how it extracts maximum performance from its chips across compute, communication and storage, the key pillars of model training.
DeepSeek’s team of young scientists said they disclosed the company’s “battle-tested building blocks” to share “our small-but-sincere progress with full transparency”.
Global developers have cheered DeepSeek for revealing the techniques behind its low-cost, high-performance AI models. Some, including the founder of the AI development platform Hyperbolic, have called the Chinese company "the real OpenAI".