Cerebras Shares Surge 89% on Nasdaq Debut, Hitting $100 Billion Valuation — AI Chipmaker's IPO Largest Since Uber
Cerebras Systems IPO opens at $350, nearly double its $185 price, valuing the AI chipmaker at over $100 billion. Largest US tech IPO since Uber.
IPO Shatters Records as Investor Frenzy Drives Stock to $350
Shares of Cerebras Systems nearly doubled in their first hours of trading on Wednesday, opening at $350 per share — a staggering 89% premium over the $185 initial public offering price — and catapulting the Silicon Valley chipmaker past a $100 billion market capitalization. The debut instantly ranks Cerebras among the world's most valuable semiconductor firms and confirms Wall Street's hunger for companies powering the artificial intelligence boom.

The company sold 30 million shares at $185 each, raising $5.55 billion in what Bloomberg reports as the largest U.S. tech IPO since Uber went public in 2019. Demand far exceeded expectations: Cerebras initially marketed shares between $115 and $125, then raised the range to $150–$160 as orders poured in, before ultimately pricing above even that elevated band.
'A New Beginning' for Cerebras
"This is a new beginning," Julie Choi, Senior Vice President and Chief Marketing Officer at Cerebras, told VentureBeat in an exclusive interview on the morning of the IPO. She emphasized the company plans to deploy its fresh capital into expanding cloud infrastructure. "With this new capital, we're going to fill more data halls with Cerebras systems to power the world's fastest inference."
The IPO proceeds will accelerate the buildout of Cerebras' cloud inference service, which has become the centerpiece of its growth strategy. The company had previously relied heavily on a single customer in the United Arab Emirates but has since diversified its revenue base through partnerships with OpenAI and Amazon Web Services.
Background: From Withdrawn Filing to Nasdaq Frenzy
The IPO caps one of the most dramatic corporate turnarounds in recent tech history. Cerebras first filed to go public in September 2024 but withdrew the effort more than a year later amid intense scrutiny over its near-total revenue dependence on a single UAE customer. The company refiled in April 2026 with a radically different business profile: new partnerships with OpenAI and AWS, a fast-growing cloud inference service, and revenue that had climbed 76% to $510 million in 2025.
Investors who were initially skeptical now view Cerebras as a legitimate challenger to Nvidia in the AI chip market. The company's unique architecture — a single chip the size of a dinner plate — promises to deliver unmatched performance for running large language models, a capability that has drawn both strategic partners and deep-pocketed buyers.
The Dinner-Plate-Sized Chip Behind the $100 Billion Valuation
To understand the frenzy, you have to understand the silicon. Cerebras builds the Wafer-Scale Engine (WSE), a single processor that occupies an entire silicon wafer — the dinner-plate-sized disc from which ordinary chips are cut. The third-generation WSE-3 contains 4 trillion transistors, 900,000 compute cores, and 44 gigabytes of on-chip memory.
According to the company's S-1 filing with the SEC, the WSE is 58 times larger than Nvidia's B200 "Blackwell" chip and delivers 2,625 times more memory bandwidth. That bandwidth advantage matters enormously for AI inference — the process of running a trained model to generate answers. When a large language model produces text, it predicts one token at a time, and generating each token requires streaming the model's entire set of weights from memory to the compute units. Because each token depends on the one before it, this work cannot be parallelized across tokens the way training can, making memory bandwidth, rather than raw compute, the binding constraint on generation speed.
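The bandwidth argument can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: the 8 TB/s figure is a rough HBM-class GPU bandwidth assumption, the wafer-scale figure is simply 2,625 times that (matching the S-1 ratio cited above), and the 70-billion-parameter model at 16-bit precision is a hypothetical workload.

```python
def max_tokens_per_second(bandwidth_gb_s: float, params_billions: float,
                          bytes_per_param: int = 2) -> float:
    """Rough ceiling on tokens/sec for single-stream inference.

    Each generated token requires streaming all model weights from
    memory, so the ceiling is bandwidth divided by weight size.
    """
    weight_bytes = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / weight_bytes

# Hypothetical workload: 70B parameters at 16-bit (2 bytes/param).
gpu_ceiling = max_tokens_per_second(8_000, 70)          # ~8 TB/s HBM-class assumption
wafer_ceiling = max_tokens_per_second(8_000 * 2625, 70)  # 2,625x the bandwidth

print(f"GPU-class ceiling:   {gpu_ceiling:,.0f} tokens/s")
print(f"Wafer-scale ceiling: {wafer_ceiling:,.0f} tokens/s")
```

Under these assumptions the GPU-class ceiling works out to roughly 57 tokens per second per stream, while the wafer-scale figure is thousands of times higher, which is the intuition behind the company's "fastest inference" pitch. Real systems batch requests and use other tricks, so actual throughput differs, but the scaling with bandwidth holds.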
Cerebras claims its chip can execute inference tasks far faster than competing architectures, a critical advantage as AI deployment accelerates across industries. The company's cloud service now offers customers instant access to this performance without the need to purchase hardware outright.
What This Means for AI Infrastructure
Cerebras' $100 billion valuation signals that the market sees a future where AI workloads demand specialized hardware beyond traditional GPUs. Memory bandwidth, not just raw compute, is becoming the bottleneck for inference — and Cerebras' wafer-scale approach directly addresses that constraint. The IPO also validates the thesis that selling cloud-based inference as a service, rather than merely selling chips, can generate enormous revenue and growth.
However, the company still faces significant risks. Nvidia remains the dominant player, and Cerebras must prove it can sustain its revenue momentum while managing the complexity of manufacturing wafer-scale chips. The $5.55 billion raised will help, but investors will watch closely for signs that Cerebras can expand beyond its current customer base and maintain its technological lead.
For now, the market is betting big on a paradigm shift in AI computing. Whether Cerebras can deliver on that promise — and avoid the pitfalls of single-customer dependence — will determine if its stock can hold these stratospheric levels.