Bernstein’s outlook for Nvidia appears more positive, bolstered by the latest news from major cloud-computing vendors regarding their AI spending plans.

The technology analysts at Bernstein downplayed the short- and medium-term risk that an oversupply of AI infrastructure could trigger a boom-and-bust cycle for Nvidia’s semiconductors. Instead, they see robust current customer demand for Nvidia’s AI chips.

Bernstein rates Nvidia stock “Outperform” with a price target of $475. In early trading Monday, the stock was roughly flat at $468.

The analysts combined their estimates of increased AI-chip spending from Microsoft, Meta Platforms, ByteDance, and Google, and found that these four companies alone could be enough to meet Nvidia’s guidance for the current quarter. They also anticipate that other major tech companies and AI startups will invest aggressively in Nvidia’s AI semiconductors, further boosting demand.

According to the analysts, customer feedback consistently indicates that demand significantly outpaces supply. Additionally, their analysis of expected wafer starts suggests even higher growth potential.

Bernstein identifies advanced chip-packaging capacity at semiconductor foundries as the current constraint on Nvidia’s revenue. With packaging capacity expected to expand next year, however, the analysts believe Nvidia could generate $75 billion to $90 billion in data-center AI revenue in 2024, well above the Wall Street consensus of $42 billion.

Given its dominant position in the AI chip market, Nvidia stands to benefit from generative AI, such as the technology behind OpenAI’s ChatGPT chatbot. These systems, which can process text, images, and video to create content, have generated considerable excitement since their release last year.

Looking ahead, the Bernstein team remains optimistic about Nvidia’s prospects, foreseeing sustained, and potentially growing, demand for at least the next 12 to 18 months.