Sell-side analyst notes on NVDA this morning

BofA: 'DeepSeek Concerns Overblown in AI Semis; but Foundational Models Keep Demand High'

"On January 20, China-based AI lab DeepSeek created a mini-flurry in the AI semiconductor space by releasing a free, open-source R1 model that reportedly outperforms leading Western AI models, such as OpenAI's GPT-4. The model's creators claimed it took only two months and less than $6 million to build using older-generation NVIDIA H800 chips.

If accurate, this advancement suggests model usefulness and accuracy might not scale directly with compute/memory/networking, potentially reducing the demand for expensive AI chips. However, we believe this concern about slowing AI scaling is overstated. Based on available data, DeepSeek's model appears to be a 'distilled' model relying on larger foundation models like Meta's open-source Llama.

It is these foundational LLMs where significant and rising infrastructure costs are incurred, as evidenced by Meta's plan to raise CY25E capex by over 56% year-over-year to $60-$65 billion. In our view, we will continue to see increasing compute demand driven by a mix of large foundational models (both proprietary and open-source), derivative models (leveraging techniques such as knowledge distillation, sparse attention, and low-rank factorization), and inference at scale across diverse cloud, enterprise, and sovereign AI customers.

We maintain our Buy ratings on NVIDIA (NVDA), Broadcom (AVGO), and Marvell Technology (MRVL)."
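For context on the "distillation" BofA points to: a smaller student model is trained to imitate the output distribution of a larger teacher, so the teacher's expensive pretraining remains a prerequisite even when the derivative model itself is cheap to produce. The sketch below shows the standard recipe (Hinton-style knowledge distillation) in PyTorch; the model sizes, temperature, and loss weighting are illustrative assumptions, not details of DeepSeek's actual pipeline.

    # Minimal sketch of knowledge distillation. All sizes and hyperparameters
    # here are illustrative assumptions, not DeepSeek's actual training setup.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    temperature = 2.0  # softens the teacher's output distribution (assumed value)
    alpha = 0.5        # mix between distillation loss and plain cross-entropy (assumed)

    # Stand-ins: a large "foundation" teacher and a small student (hypothetical sizes).
    teacher = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 1000))
    student = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 1000))

    def distillation_loss(student_logits, teacher_logits, labels):
        # Student matches the teacher's softened distribution plus the ground-truth labels.
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1 - alpha) * ce

    # One illustrative training step on random data.
    x = torch.randn(8, 512)
    labels = torch.randint(0, 1000, (8,))
    with torch.no_grad():
        teacher_logits = teacher(x)  # teacher runs inference only; its costly training already happened
    student_logits = student(x)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()

The point the BofA note is making follows directly: the student is only as good as the teacher it imitates, so demand for the compute that trains large foundation models does not disappear just because derivative models are cheap.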

JPMorgan analyst Sandeep Deshpande: "Investors are concerned that rather than impede China's progress in AI, the US restrictions have engendered innovation that has enabled the development of a model that prioritises efficiency. ... The news over the past few months has been about the huge capex announcements of Microsoft, which is spending $80bn in '25, while Meta recently announced investments between $60bn and $65bn. OpenAI also announced that the Stargate project intends to invest $500bn over the next four years building new AI infrastructure in the US. Thus, with these considerable sums flowing into AI investments in the US, the fact that DeepSeek's highly efficient, less resource-intensive AI model has shown such significant innovation and success is prompting investors to question whether the AI investment cycle is over-hyped and a more efficient future is possible."

Jefferies analyst Edison Lee: "Re-evaluating computing power needs could cause 2026 AI capex to fall (or not grow) ... We believe DS's success could drive two possible industry strategies: 1) still pursue more computing power to drive even faster model improvements, and 2) refocus on efficiency and ROI, meaning lower demand for computing power as of 2026."

Bernstein analyst Stacy Rasgon: "Is DeepSeek doomsday for AI buildouts? We don't think so... we believe that 1) DeepSeek DID NOT 'build OpenAI for $5M'; 2) the models look fantastic but we don't think they are miracles; and 3) the resulting Twitterverse panic over the weekend seems overblown." Rasgon acknowledged that DeepSeek's models are good, but he kept his Outperform ratings on Nvidia and Broadcom and advised clients not to buy into the doomsday scenarios.

Citi analyst Atif Malik: "While the dominance of US companies in the most advanced AI models could potentially be challenged, we estimate that in an inevitably more restrictive environment, the US's access to more advanced chips is an advantage. Thus, we don't expect leading AI companies to move away from more advanced GPUs." Malik maintained a buy rating on Nvidia.

Raymond James' semiconductor analyst Srini Pajjuri: "If DeepSeek's innovations are adopted broadly, an argument can be made that model training costs could come down significantly even at U.S. hyperscalers, potentially raising questions about the need for 1-million XPU/GPU clusters as projected by some... A more logical implication is that DeepSeek will drive even more urgency among U.S. hyperscalers to leverage their key advantage (access to GPUs) to distance themselves from cheaper alternatives." Pajjuri reiterated buy ratings on Nvidia and ASML.

Cantor analyst C.J. Muse: "Following release of DeepSeek's V3 LLM, there has been great angst as to the impact for compute demand, and therefore, fears of peak spending on GPUs. We think this view is farthest from the truth and that the announcement is actually very bullish with AGI seemingly closer to reality and Jevons Paradox almost certainly leading to the AI industry wanting more compute, not less." Muse said to buy Nvidia on any weakness.
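To make Muse's Jevons Paradox point concrete: if efficiency gains cut the compute needed per query while cheaper inference expands usage even faster, total compute demand rises rather than falls. The arithmetic below is a toy illustration with hypothetical numbers, not a forecast.

    # Toy Jevons Paradox arithmetic. Every number here is a hypothetical assumption.
    baseline_queries = 1.0e9           # queries per day (hypothetical)
    baseline_flops_per_query = 1.0e12  # compute per query (hypothetical)

    efficiency_gain = 10.0  # assume new methods make each query 10x cheaper
    usage_growth = 25.0     # assume cheaper inference unlocks 25x more usage

    old_total = baseline_queries * baseline_flops_per_query
    new_total = (baseline_queries * usage_growth) * (baseline_flops_per_query / efficiency_gain)

    print(f"Total compute changes by {new_total / old_total:.1f}x")  # -> 2.5x more compute

Under these assumed numbers, a 10x efficiency gain paired with 25x usage growth still leaves aggregate compute demand 2.5x higher, which is the mechanism behind the "more compute, not less" conclusion.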