Making the world’s biomedical knowledge computable

nference uses transformer models trained with self-supervised learning on large volumes of unstructured data, translating vast amounts of health data into information that researchers can use to discover insights and drive research. Training large models is complex, computationally intensive, and time-consuming on GPUs. With a Cerebras CS-2 system, researchers can train on longer sequence lengths than is practical on smaller, conventional processors.
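The self-supervision described above typically works by hiding part of the input and training the model to reconstruct it, so no hand-labeled data is needed. The sketch below illustrates the masking step of a BERT-style masked-language-model objective; the function name, mask token, and 15% masking rate are illustrative assumptions, not nference's actual pipeline.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """BERT-style masking sketch: hide a fraction of tokens; the model
    must reconstruct them, which is what makes training self-supervised."""
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(mask_token)
            labels.append(tok)    # target the model must predict
        else:
            inputs.append(tok)
            labels.append(None)   # no loss computed on unmasked positions
    return inputs, labels

# Example on a (hypothetical) clinical-note fragment
tokens = "the patient presented with acute chest pain".split()
inputs, labels = mask_tokens(tokens, seed=42)
```

Longer sequence lengths matter here because self-attention cost grows quadratically with sequence length, which is what makes long clinical documents expensive to train on with conventional processors.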


"With Cerebras’s powerful CS-2 system, we can train transformer models with much longer sequence lengths than we could before, enabling us to iterate more rapidly and build better, more insightful models."

Ajit Rajasekharan

CTO @ nference

In the news:

- Pittsburgh Supercomputing Center and Cerebras
- nference Accelerates Self-Supervised Language Model Training with Cerebras CS-2 System
- New Cerebras Systems technology will double capacity, allow larger deep-learning models and data
- Cerebras Brings Wafer-Size AI Chips to Medical Data Analysis