Cerebras Systems develops computing systems to accelerate deep learning in the data center. Our first product, the CS-1, is the first computer ever made to host a chip as large as a whole silicon wafer: the Wafer Scale Engine. Built from the ground up, the CS-1 is programmable with the Cerebras software platform and is powered by a radically innovative system that fits directly into existing data center infrastructure.
You will be building the data intelligence infrastructure that serves the growing needs of a rapidly expanding, data-driven hardware team. You will work with industry leaders, alongside system, electrical, and ASIC engineers, as well as other software and data engineers and scientists. This role is highly cross-functional and collaborative, and it has a direct impact on the effective development and manufacturing of Cerebras’ wafer-scale computing systems.
Responsibilities
- Consolidate and integrate heterogeneous data pipelines, implementing ETL (extract, transform, load) processes and data warehouse best practices.
- Design, build and maintain web applications for hardware data analytics, comprising tools for interactive visualization of hardware test and manufacturing data.
- Improve approaches to efficiently handle ever-increasing volumes of data.
- Interface regularly with multiple hardware/software development groups to enhance the performance, functional coverage, and usability of the analytics tools.
- Support existing processes running in production.
- Design and build software to operate custom hardware-test setups, interfacing with a variety of devices, sensors, and microcontrollers over common communication buses such as USB.
- Partner with hardware engineers, manufacturing engineers, and program management to translate data insights into decisions and actions.
- Support the engineering team with ad-hoc data analysis needs, identifying opportunities for automation.
- Continuously evaluate the team’s processes to maintain a positive and efficient engineering culture.
Skills and Qualifications
- Excellent Python programming skills.
- Proficiency with SQL, relational databases, time-series databases, and data warehouses.
- Experience designing, implementing, and maintaining custom ETL pipelines.
- Experience with distributed systems, such as gRPC or any other IDL-defined RPC stack.
- Knowledge of a front-end web framework (such as React, Vue or Angular).
- Experience with a variety of technologies and ability to pick the right tool for the job.
- Experience with Agile development methodologies and industry standard software development lifecycle processes (Jira, Git, code review, design documentation).
- Ability to bring thoughtful perspectives, creativity, and a positive attitude to solve problems at scale.
Nice to have:
- Experience using cloud storage and computing technologies such as AWS Redshift, S3, Hadoop, etc.
- Familiarity with hardware testing and interfacing with devices and microcontrollers via USB for data generation and collection.
- Experience building UIs.
Location: Headquarters / Los Altos office