PyTorch

Our PyTorch interface library is a lightweight wrapper around a PyTorch program, exposed through API calls, and can be added to an existing PyTorch implementation with only a few extra lines of code.

Get started
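For illustration, here is a minimal sketch of what those "few extra lines" might look like around an existing training step. The cerebras.pytorch package name and the backend, compile, and trace calls are assumptions made for the sake of the sketch, not a verbatim copy of the interface; consult the official documentation for the exact API.

    import torch
    import cerebras.pytorch as cstorch  # assumed package name

    # Any existing torch.nn.Module works unchanged.
    model = torch.nn.Linear(784, 10)

    # Assumed: select the Cerebras hardware backend and wrap the model for it.
    backend = cstorch.backend("CSX")
    compiled_model = cstorch.compile(model, backend)

    # Assumed: an optimizer API mirroring torch.optim.
    optimizer = cstorch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.CrossEntropyLoss()

    # Assumed: a decorator that captures the training step for execution
    # on the wafer-scale engine.
    @cstorch.trace
    def training_step(batch, labels):
        optimizer.zero_grad()
        loss = loss_fn(compiled_model(batch), labels)
        loss.backward()
        optimizer.step()
        return loss

Note that the model definition, loss, and optimizer logic stay plain PyTorch; only the backend selection and the wrapping calls are Cerebras-specific.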

SDK

The Cerebras SDK allows researchers to extend the platform and develop custom kernels – empowering them to push the limits of AI and HPC innovation.

Request access

Cerebras Model Zoo

This repository contains examples of common deep learning models that demonstrate best practices for writing code for Cerebras hardware.

Repo

Developer blogs

Cerebras Breaks Exascale Record for Molecular Dynamics Simulations

Cerebras has set a new record for molecular dynamics simulation speed that goes far beyond the exascale level. While this breakthrough has wide-ranging impacts…

Supercharge your HPC Research with the Cerebras SDK

Cerebras SDK 1.1.0, our second publicly available release, includes initial support for the WSE-3. Check out what researchers have been doing with the SDK, and…

Accelerating Large Language Model Training with Variable Sparse Pre-training and Dense Fine-tuning

We reduced pre-training FLOPs by 64% using sparsity. To the best of our knowledge, this is the largest GPT model trained with unstructured weight sparsity…

Variable Sequence Length Training for Long-Context Large Language Models

We show it is possible to accelerate training of large language models with long-context capabilities using a simple staged training method. Faster to…

FAQ

Don’t see your question?

Send us an email at developer@cerebras.net

You can find our example reference model implementations here: https://github.com/Cerebras/cerebras_reference_implementations. For access to our full list of models, please contact us at developer@cerebras.net

Please sign up for our newsletter!