May 17, 2024
Cerebras Breaks Exascale Record for Molecular Dynamics Simulations
Cerebras has set a new record for molecular dynamics…
May 1, 2024
Supercharge your HPC Research with the Cerebras SDK
Cerebras SDK 1.1.0, our second publicly available release,…
April 12, 2024
Cerebras CS-3 vs. Nvidia B200: 2024 AI Accelerators Compared
In the fast-paced world of AI hardware, the Cerebras CS-3…
March 12, 2024
Cerebras CS-3: the world’s fastest and most scalable AI accelerator
Today Cerebras is introducing the CS-3, our…
February 5, 2024
Sparsity Made Easy – Introducing the Cerebras PyTorch Sparsity Library
We release our PyTorch-based sparsity library allowing ML…
December 29, 2023
Introducing gigaGPT: GPT-3 sized models in 565 lines of code
GigaGPT is Cerebras’ implementation of Andrej Karpathy’s…
December 5, 2023
Cerebras Pioneers Ethical AI Development through Collaborative AI Initiatives
Today, Cerebras proudly reveals our pivotal role as a…
November 10, 2023
Cerebras Software Release 2.0: 50% Faster Training, PyTorch 2.0 Support, Diffusion Transformers, and More
Today we are excited to announce Cerebras software release…
October 12, 2023
How we fine-tuned Llama2-70B to pass the US Medical License Exam in a week
New open-access model by M42 outperforms GPT-3.5 in…
September 5, 2023
Jais: a New Pinnacle in Open Arabic NLP
Introducing a new state-of-the-art bilingual…
July 24, 2023
BTLM-3B-8K: 7B Performance in a 3 Billion Parameter Model
Cerebras and Opentensor introduce a new standard for…
July 20, 2023
Introducing Condor Galaxy 1: a 4 exaFLOPS Supercomputer for Generative AI
Cerebras, in partnership with G42, unveils CG-1, a 4…
June 9, 2023
SlimPajama: A 627B token, cleaned and deduplicated version of RedPajama
Today we are releasing SlimPajama – the largest…
May 22, 2023
Cerebras Architecture Deep Dive: First Look Inside the HW/SW Co-Design for Deep Learning [Updated]
Our ML-optimized architecture enables the largest models to…
March 28, 2023
Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models
Cerebras open sources seven GPT-3 models from 111 million…
November 28, 2022
Harnessing the Power of Sparsity for Large GPT AI Models
Enabling innovation of novel sparse ML techniques to…
August 15, 2022
Context is Everything: Why Maximum Sequence Length Matters
GPU-Impossible™ sequence lengths on Cerebras systems may…