Cerebras
An American company that produces specialized wafer-scale chips and systems for AI inference. It recently secured a major compute deal with OpenAI valued at $10 billion.
First Mentioned
1/17/2026, 5:57:38 AM
Last Updated
1/17/2026, 5:58:56 AM
Research Retrieved
1/17/2026, 5:58:56 AM
Summary
Cerebras Systems Inc. is an American semiconductor and artificial intelligence company known for developing the Wafer Scale Engine (WSE), the world's largest AI processor. Founded in 2015 by a team of former SeaMicro executives including CEO Andrew Feldman, the company builds high-performance computer systems such as the CS-3 for complex deep learning training and inference. Headquartered in Sunnyvale, California, with additional offices in Toronto, Bangalore, and San Diego, Cerebras has positioned itself as a major challenger to conventional GPU hardware. The company recently entered a landmark $10 billion partnership with OpenAI to provide specialized inference chips as it prepares for an initial public offering, and it is separately reported to be in discussions to raise roughly $1 billion at a valuation of about $22 billion. Cerebras also maintains strategic collaborations with the U.S. Department of Energy and G42, focusing on high-speed AI applications across science, national security, and enterprise sectors.
Referenced in 1 Document
Research Data
Extracted Attributes
CEO
Andrew Feldman
Founded
2015
Founders
Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie, Jean-Philippe Fricker
Industry
Semiconductors and Artificial Intelligence
Headquarters
Sunnyvale, California, United States
Employee Count
401 (2024 estimate)
Revenue (2023)
$78.7 million
Primary Products
Wafer Scale Engine (WSE), CS-1, CS-3 AI Supercomputers
Net Income (2023)
-$127 million
Projected Valuation
$22 billion
Timeline
- 2015-01-01: Cerebras Systems Inc. is founded by former SeaMicro executives. (Source: Wikipedia)
- 2016-05-01: Cerebras secures $27 million in Series A funding led by Benchmark, Foundation Capital, and Eclipse Ventures. (Source: Wikipedia)
- 2016-12-01: Series B funding round led by Coatue Management. (Source: Wikipedia)
- 2017-01-01: Series C funding round led by VY Capital. (Source: Wikipedia)
- 2025-10-30: Cerebras raises $1.1 billion in a Series G funding round. (Source: LinkedIn)
- 2026-01-15: Cerebras announces a $10 billion deal with OpenAI for specialized AI inference chips. (Source: All-In Podcast; Forbes)
Wikipedia
View on Wikipedia
Cerebras
Cerebras Systems Inc. is an American artificial intelligence (AI) company with offices in Sunnyvale, San Diego, Toronto, and Bangalore, India. Cerebras builds computer systems for complex AI deep learning applications.
Web Search Results
- Cerebras - Wikipedia
Cerebras Systems Inc. is an American artificial intelligence (AI) company with offices in Sunnyvale, San Diego, Toronto, and Bangalore, India. Cerebras builds computer systems for complex AI deep learning applications. History: Cerebras was founded in 2015 by Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie and Jean-Philippe Fricker. These five founders worked together at SeaMicro, which was started in 2007 by Feldman and Lauterbach and was later sold to AMD in 2012 for $334 million. In May 2016, Cerebras secured $27 million in Series A funding led by Benchmark, Foundation Capital and Eclipse Ventures. In December 2016, Series B funding was led by Coatue Management, followed in January 2017 by Series C funding led by VY Capital. Technology: The Cerebras Wafer Scale Engine (WSE) is a single, wafer-scale integrated processor that includes compute, memory and interconnect fabric. The WSE-1 powers the Cerebras CS-1, Cerebras' first-generation AI computer: a 19-inch rack-mounted appliance designed for AI training and inference workloads in a datacenter. The CS-1 includes a single WSE primary processor with 400,000 processing cores, as well as twelve 100 Gigabit Ethernet connections to move data in and out. The WSE-1 has 1.2 trillion transistors, 400,000 compute cores and 18 gigabytes of memory. (Key facts: private company; founded 2015; headquarters Sunnyvale, California, US; CEO Andrew Feldman; revenue $78.7 million and net income of -$127 million in 2023; 401 employees in 2024; website cerebras.ai.)
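The WSE-1 figures above lend themselves to a quick sanity check. A minimal sketch of the arithmetic follows; the derived per-core memory and aggregate I/O numbers are illustrative calculations from the quoted specs, not official Cerebras figures.

```python
# Quick sanity check on the WSE-1 / CS-1 figures quoted above.
# The per-core memory and aggregate I/O numbers are derived for illustration;
# they are not official Cerebras specifications.

cores = 400_000            # WSE-1 compute cores
on_chip_memory_gb = 18     # on-wafer memory, in gigabytes
ethernet_links = 12        # 100 Gigabit Ethernet connections on the CS-1
link_speed_gbps = 100      # per-link speed

memory_per_core_kb = on_chip_memory_gb * 1024 * 1024 / cores   # GB -> KB, spread across cores
aggregate_io_tbps = ethernet_links * link_speed_gbps / 1000    # Gb/s -> Tb/s

print(f"Memory per core: ~{memory_per_core_kb:.0f} KB")         # ~47 KB
print(f"Aggregate external I/O: {aggregate_io_tbps:.1f} Tb/s")  # 1.2 Tb/s
```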
- OpenAI Partners with Cerebras to Bring High-Speed Inference to the ...
Cerebras is the high-speed solution for AI. Whether running coding agents or voice chat, large language models on Cerebras deliver responses up to 15× faster than GPU-based systems. For consumers, this translates into greater engagement and novel applications. For the broader economy, where AI agents are expected to be a key growth driver over the coming decade, speed directly fuels productivity growth. “OpenAI’s compute strategy is to build a resilient portfolio that matches the right systems to the right workloads. Cerebras adds a dedicated low-latency inference solution to our platform. That means faster responses, more natural interactions, and a stronger foundation to scale real-time AI to many more people,” said Sachin Katti of OpenAI. [...] This partnership was a decade in the making. OpenAI and Cerebras were both founded around the same time with radically ambitious visions for the future of AI: OpenAI set out to create the software that powers AGI while Cerebras upended conventional wisdom about chip making to build a wafer scale AI processor that defied Moore’s Law. Our teams have met frequently since 2017, sharing research, early work, and a common belief that there would come a moment when model scale and hardware architecture would have to converge. That moment has arrived.
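For developers, the low-latency inference described here is consumed through an API. Below is a minimal sketch of what such a call might look like, assuming Cerebras exposes an OpenAI-compatible chat-completions endpoint; the base URL, model name, and CEREBRAS_API_KEY environment variable are assumptions for illustration, not details taken from the sources above.

```python
# Minimal sketch of calling a Cerebras-hosted model through an OpenAI-compatible
# client. The base URL, model name, and CEREBRAS_API_KEY environment variable are
# assumptions for illustration; consult Cerebras' own documentation for real values.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",    # assumed OpenAI-compatible endpoint
    api_key=os.environ["CEREBRAS_API_KEY"],   # assumed environment variable
)

response = client.chat.completions.create(
    model="llama3.1-8b",                      # assumed model identifier
    messages=[
        {"role": "user", "content": "Explain wafer-scale inference in one sentence."}
    ],
)
print(response.choices[0].message.content)
```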
- Cerebras | LinkedIn
Cerebras Systems describes itself as the world's fastest AI inference provider, powering the future of generative AI: a team of pioneering computer architects, deep learning researchers, and engineers building a new class of AI supercomputers from the ground up. Its flagship system, the Cerebras CS-3, is powered by the Wafer Scale Engine 3, the world's largest and fastest AI processor. CS-3s are effortlessly clustered to create the largest AI supercomputers on Earth, while abstracting away the complexity of traditional distributed computing. From sub-second inference speeds to breakthrough training performance, Cerebras makes it easier to build and deploy state-of-the-art AI, from proprietary enterprise models to open-source projects. [...] Cerebras has signed a Memorandum of Understanding (MOU) with the U.S. Department of Energy (DOE) to explore further collaboration on next-generation AI and high-performance computing (HPC) technologies, accelerating AI+HPC for science and national security in support of the White House's Genesis Mission. Industry: Semiconductor Manufacturing. Company size: 501-1,000 employees (762 LinkedIn members list Cerebras as their current workplace). Funding: 9 rounds; last round Series G on 2025-10-30, raising US$1.1 billion. Investors: Atreides Management, Fidelity, and 6 others. Specialties: artificial intelligence, deep learning, natural language processing, inference, machine learning, LLM, enterprise AI, and fast inference. Locations: 1237 E Arques Ave, Sunnyvale, California 94085, US; 150 King St W, Toronto, Ontario M5H 1J9, CA; Tokyo, JP; Bangalore, IN.
- Cerebras AI Lands A Whale As It Prepares To Go Public - Forbes
Cerebras builds purpose-built AI systems designed for speed, combining massive compute, memory, and bandwidth on a single wafer-scale chip (WSE-3) to eliminate the bottlenecks that slow inference on conventional GPU hardware. This approach results in dramatically faster responses and does so at scale. The wafer's chips communicate with each other over a network that spans all the chips on the wafer, eliminating the bottlenecks and costs that server-to-server or GPU-to-GPU AI must incur to effectively reattach the chips that were cut up from the original wafer. According to Greg Brockman, OpenAI's co-founder and president, the partnership will make ChatGPT not just the most capable, but the fastest AI platform in the world. [...] Cerebras, famous for being the only AI company with a full wafer-scale chip, has landed OpenAI, its first major US-based hyperscaler. Prior to this deal, Cerebras had been successful in securing investments and system commitments from a relatively small number of customers, notably G42, the Abu Dhabi, UAE, AI company. Sam Altman, CEO of OpenAI, was an early investor in Cerebras. Cerebras has been touting its leadership in AI inference performance for the last couple of years, and it looks like this approach is working. (Like most AI semiconductor companies, Cerebras is a client of Cambrian-AI Research, LLC.) [...] The timing couldn't be better for CEO and founder Andrew Feldman and his team. Cerebras Systems is in discussions to raise another $1 billion, which would value the AI chip maker at $22 billion, according to The Information, and the company is planning an IPO soon. The OpenAI deal could significantly increase that valuation.
- Cerebras
Customer quotes from the Cerebras site: "Our clinicians will be able to make more informed decisions based on genomic data, significantly reducing the time it takes to find the right treatment and, more importantly, reducing the physical toll on patients." "For Notion, productivity is everything. Cerebras gives us the instant, intelligent AI needed to power real-time features like enterprise search, and enables a faster, more seamless user experience." "Combining Cerebras' best-in-class compute with LiveKit's global edge network has allowed us to create AI experiences that feel more human, thanks to the system's ultra-low latency." [...] "By partnering with Cerebras, we are integrating cutting-edge AI infrastructure [...] that allows us to deliver unprecedented speed and the most accurate and relevant insights available, helping our customers make smarter decisions with confidence." "By delivering over 2,000 tokens per second for Scout, more than 30 times faster than closed models like ChatGPT or Anthropic, Cerebras is helping developers everywhere to move faster, go deeper, and build better than ever before." "With Cerebras' inference speed, GSK is developing innovative AI applications, such as intelligent research agents, that will fundamentally improve the productivity of our researchers and drug discovery process." [...] "We have a cancer-drug response prediction model that's running many hundreds of times faster on that chip (Cerebras) than it runs on a conventional GPU. We are doing in a few months what would normally take a drug development process years." "With Cerebras [...] developers using Cline are getting a glimpse of the future, as Cline reasons through problems, reads codebases, and writes code in near real-time. Everything happens so fast that developers stay in flow, iterating at the speed of thought."
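The throughput claim in the quotes above (over 2,000 tokens per second, described as more than 30 times faster than closed models) translates directly into response latency. A rough sketch of that arithmetic follows; the 500-token response length and the exactly-30x-slower baseline are illustrative assumptions.

```python
# Rough latency implied by the throughput claim quoted above.
# The 2,000 tokens/s figure comes from the customer quote; the 500-token response
# length and the 30x-slower baseline are illustrative assumptions.

cerebras_tps = 2_000                 # claimed tokens per second for the Scout model
baseline_tps = cerebras_tps / 30     # "more than 30 times faster" than closed models
response_tokens = 500                # assumed typical response length

print(f"Cerebras: {response_tokens / cerebras_tps:.2f} s per response")  # 0.25 s
print(f"Baseline: {response_tokens / baseline_tps:.1f} s per response")  # 7.5 s
```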
Wikidata
View on Wikidata
Instance Of
Headquarters
Inception Date
1/1/2016
DBPedia
View on DBPedia
Location Data
Cerebras, 199, Bay Street, Financial District, Commerce Court, Toronto, Golden Horseshoe, Ontario, M5L 1G9, Canada
Coordinates: 43.6481111, -79.3796189
Open Map