BTLM-3B-8K: 7B Performance in a 3 Billion Parameter Model

Cerebras and Opentensor introduce a new standard for compact large language models

Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models

Cerebras open-sources seven GPT-3 models from 111 million to 13 billion parameters. Trained using the Chinchilla…