September 27, 2019
In Business, Chip, Culture, Jobs, Machine Learning, Blog

A Fantastic Month for Cerebras


This month, we at Cerebras Systems continued to build on the momentum from our first public reveal of the world’s largest chip, the Wafer Scale Engine (WSE), at Hot Chips 2019.

I am proud to announce our first customers: Argonne National Laboratory and Lawrence Livermore National Laboratory. We have embarked on multi-year partnerships with these U.S. Department of Energy national laboratories to advance deep learning for basic and applied science and for medicine.

“The opportunity to incorporate the largest and fastest AI chip ever—the Cerebras WSE—into our advanced computing infrastructure will enable us to dramatically accelerate our deep learning research in science, engineering and health. It will allow us to invent and test more algorithms, to more rapidly explore ideas, and to more quickly identify opportunities for scientific progress.”

– Rick Stevens, head of computing at Argonne

Unlocking the vast potential of AI and reducing the cost of curiosity for deep learning researchers is deeply ingrained in our mission here at Cerebras. We couldn’t be more proud that our first customers are actively using our system to tackle some of today’s biggest challenges: cancer research, improving outcomes for traumatic brain injuries, physics simulations, astronomy, and materials science, to name but a few.

I have also had the pleasure of speaking at several events this month to share more about our history, our motivations, and where we are headed as a company and as an industry. Thanks to the organizers of the O’Reilly Artificial Intelligence Conference, the MIT Club of Northern California AI Conference, and the AI Hardware Summit for giving me the opportunity to speak to our community. I also enjoyed my conversations with James Wang of ARK Invest on the For Your Innovation podcast, and with Sherry Ahn and Amanda Lang on Bloomberg TV.

Here’s a link to my keynote at the O’Reilly AI Conference, along with the slides I presented throughout the month.

I remain grateful to all our employees for their hard work, and to our customers for validating our mission as a company. We are excited and energized by what’s ahead, and we look forward to sharing more updates about our work with the world!

We are always looking for extraordinary team members to join us on our journey to change compute forever. Take a look at our careers page for more details, and follow us on Medium and our blog for future updates.


Ugnius