DeepSeek could be a force multiplier for small AI chip companies


DeepSeek rattled the U.S.-led AI ecosystem with its latest models, knocking back Nvidia’s market capitalization. While sector leaders grapple with the fallout, smaller AI companies see an opportunity to scale alongside the Chinese startup.

Several AI-related companies told CNBC that the emergence of DeepSeek is a “big” opportunity for them, rather than a threat.

“Developers want to replace OpenAI’s expensive and closed models with open-source models like DeepSeek R1…” said Andrew Feldman, CEO of artificial intelligence chip startup Cerebras Systems.

The company competes with Nvidia’s graphics processing units and offers cloud-based services through its own computing clusters. Feldman said the release of the R1 model produced one of the largest spikes in demand for its service that the company has ever seen.

“R1 shows that (AI market) growth will not be dominated by a single company. A hardware and software moat does not exist for open-source models,” Feldman added.

Open source refers to software whose source code is made freely available on the web for anyone to modify and redistribute. Unlike competitors such as OpenAI, DeepSeek’s models are open source.

DeepSeek also claims its R1 reasoning model rivals the best of American technology, despite running at lower cost and being trained without cutting-edge graphics processing units, though industry watchers and competitors have questioned those claims.

“Lower prices can help drive global adoption, much like the PC and internet markets. The AI market is on a similar secular growth path,” Feldman said.

Inference chips

Many AI chip startups told CNBC that demand for inference chips and computing is growing as clients adopt and build on DeepSeek’s open source model.

“(DeepSeek) demonstrated that smaller open models can be trained to be as capable or more capable than larger proprietary models, and this can be done at a fraction of the cost,” said the CEO of d-Matrix, an AI chip startup.

“With small capable models now widely available, the era of inference has been catalyzed,” he told CNBC, adding that the company has recently seen a surge in interest from global customers looking to accelerate their inference plans.

Robert Wachen, co-founder and COO of AI chipmaker Etched, said dozens of companies have reached out to the startup since DeepSeek released its reasoning models.

“Companies are (now) shifting their spend from training clusters to inference clusters,” he said.

“DeepSeek-R1 has proven that inference-time compute is now the (state-of-the-art) approach for every major model vendor, and scaling this kind of thinking to millions of users will require far more compute.”

Jevons paradox

This pattern reflects Jevons paradox, the theory that cost reductions in a new technology drive an overall increase in demand for it.

Financial services and investment firm Wedbush said in a research note last week that it continues to expect AI use across businesses and retail consumers around the world to drive demand.

Speaking on CNBC’s “Fast Money” last week, Sunny Madra, COO of Groq, which develops chips for AI inference, suggested that as overall demand for AI increases, smaller players will have more room to grow.

“The world needs more tokens (units of data processed by AI models), and Nvidia can’t supply enough chips to everyone, so that gives us an opportunity to sell into the market even more aggressively,” Madra said.
