This AI chip is not just big; it's colossal, with a mind-blowing 4 trillion transistors and 900,000 AI cores.
The WSE-3 isn't just for show: it's the powerhouse behind the CS-3 supercomputer, a behemoth that can train AI models with up to 24 trillion parameters. With up to 1.2 PB of external memory, this machine has room to spare for even the largest models.
Need to fine-tune a 70-billion-parameter model in a day? The CS-3 can handle it, sprinting through data like the Usain Bolt of supercomputers. And with support for PyTorch 2.0, it's as developer-friendly as it is fast.
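To give a feel for what that workflow looks like in code, here's a minimal, hardware-agnostic PyTorch 2.0 fine-tuning sketch. The model, data, and hyperparameters below are placeholders, and Cerebras's own software stack handles the CS-3-specific compilation and distribution; this just shows the standard PyTorch 2.0 shape of the job.

```python
import torch
import torch.nn as nn

# Tiny placeholder model standing in for a large transformer; on a CS-3,
# Cerebras's software stack would compile and shard the real thing.
model = nn.Sequential(nn.Linear(4096, 4096), nn.GELU(), nn.Linear(4096, 4096))

# torch.compile is the headline feature of PyTorch 2.0: it JIT-compiles
# the model into optimized kernels for whatever backend it runs on.
model = torch.compile(model)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
loss_fn = nn.MSELoss()

# Dummy fine-tuning loop over random tensors (stand-in for a real dataset).
for step in range(10):
    inputs = torch.randn(8, 4096)
    targets = torch.randn(8, 4096)
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.4f}")
```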
According to Tom's Hardware, Cerebras's CS-3 doubles the performance of its predecessor without drawing any more power.
Meanwhile, the strategic partnership between Cerebras and G42 is set to expand with the construction of Condor Galaxy 3, an AI supercomputer featuring 64 CS-3 systems (packing a whopping 57,600,000 cores).
Together, the two companies have already created two of the biggest AI supercomputers in the world: Condor Galaxy 1 (CG-1) and Condor Galaxy 2 (CG-2), both based in California, with a combined performance of 8 exaFLOPs. The partnership aims to deliver tens of exaFLOPs of AI compute globally.
G42 CTO Kiril Evtimov said the Cerebras partnership has been instrumental in propelling innovation at G42 and will help accelerate the AI revolution on a global scale.
"Condor Galaxy 3, our next AI supercomputer boasting 8 exaFLOPs, is currently under construction and will soon bring our system’s total production of AI compute to 16 exaFLOPs."