Chip developer Cerebras bolsters AI-powered workload capabilities with $250M

Cerebras Systems, the California-based company that has built a “brain-scale” chip to power AI models with 120 trillion parameters, said today it has raised $250 million in funding at a valuation of over $4 billion. Cerebras claims its technology significantly accelerates today’s AI workloads at a fraction of the power and space, and that its innovations will support the multi-trillion-parameter AI models of the future.

In a press release, the company stated that this additional capital will enable it to further expand its business globally and deploy its industry-leading CS-2 system to new customers, while continuing to bolster its leadership in AI compute.

Cerebras’ cofounder and CEO Andrew Feldman noted that the new funding will allow Cerebras to extend its leadership to new regions. Feldman believes this will aid the company’s mission to democratize AI and usher in what it calls “the next era of high-performance AI compute” — an era where the company claims its technology will help to solve today’s most urgent societal challenges across drug discovery, climate change, and much more.

Redefining AI-powered possibilities

“Cerebras Systems is redefining what is possible with AI and has demonstrated best in class performance in accelerating the pace of innovation across pharma and life sciences, scientific research, and several other fields,” said Rick Gerson, cofounder, chairman, and chief investment officer at Falcon Edge Capital and Alpha Wave.

“We are proud to partner with Andrew and the Cerebras team to support their mission of bringing high-performance AI compute to new markets and regions around the world,” he added.

Cerebras’ CS-2 system, powered by the Wafer Scale Engine (WSE-2) — the largest chip ever made and the fastest AI processor to date — is purpose-built for AI work. Feldman told VentureBeat in an interview that in April of this year, the company more than doubled the capacity of the chip, bringing it up to 2.6 trillion transistors, 850,000 AI-optimized cores, 40GB of on-chip memory, 20PB/s of memory bandwidth, and 220Pb/s of fabric bandwidth. He noted that for AI work, big chips process information more quickly and produce answers in less time.

With only 54 billion transistors, the largest graphics processing unit pales in comparison to the WSE-2, which has 2.55 trillion more. At 56 times the chip size, with 123 times more AI-optimized cores, 1,000 times more high-performance on-chip memory, 12,733 times more memory bandwidth, and 45,833 times more fabric bandwidth than competing graphics processing units, the WSE-2 makes the CS-2 the fastest AI system in the industry. The company says its software is easy to deploy and enables customers to use existing models, tools, and flows without modification, as well as to write new ML models in standard open source frameworks.
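The headline ratios above follow directly from the raw specs. A minimal sketch, assuming the article's WSE-2 figures and approximate public figures for a contemporary flagship data-center GPU (the GPU core count here is an assumption, not from the article):

```python
# WSE-2 specs as stated in the article.
wse2_transistors = 2.6e12   # 2.6 trillion
wse2_cores = 850_000        # AI-optimized cores

# Approximate figures for a contemporary flagship data-center GPU
# (assumed for illustration; the article gives only the transistor count).
gpu_transistors = 54e9      # 54 billion
gpu_cores = 6_912           # approximate core count

# Absolute transistor gap, in trillions.
extra_transistors = (wse2_transistors - gpu_transistors) / 1e12
print(f"{extra_transistors:.2f} trillion more transistors")  # → 2.55

# Core-count ratio.
print(f"{wse2_cores / gpu_cores:.0f}x more cores")           # → 123x
```

The same arithmetic reproduces the other multipliers the company quotes when applied to the corresponding bandwidth and memory figures.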

New customers

Cerebras says its CS-2 system is delivering a massive leap forward for customers across pharma and life sciences, oil and gas, defense, supercomputing centers, national labs, and other industries. The company announced new customers including Argonne National Laboratory; Lawrence Livermore National Laboratory; Pittsburgh Supercomputing Center (PSC), for its groundbreaking Neocortex AI supercomputer; EPCC, the supercomputing center at the University of Edinburgh; Tokyo Electron Devices; GlaxoSmithKline; and AstraZeneca.


The Series F investment round was led by Alpha Wave Ventures, a global growth-stage partnership between Falcon Edge and Chimera, along with Abu Dhabi Growth (ADG).

Alpha Wave Ventures and ADG join a group of world-class strategic investors including Altimeter Capital, Benchmark Capital, Coatue Management, Eclipse Ventures, Moore Strategic Ventures, and VY Capital. Cerebras has now expanded beyond the U.S., with new offices in Tokyo, Japan, and Toronto, Canada. On the back of this funding, the company says it will continue its engineering work, expand its engineering team, and recruit talent worldwide going into 2022.

Originally appeared on: TheSpuzz