
Petuum Launches Enterprise MLOps Platform in Private Beta


Machine learning startup Petuum Inc. has announced the availability of its MLOps platform in private beta for artificial intelligence (AI) and machine learning (ML) teams. The announcement was made by Petuum CEO Aurick Qiao and Director of Engineering Tong Wen.

Aurick Qiao, CEO of Petuum, stated, “We have spent the last five years working with customers on the hard problems in MLOps and have learned how to multiply AI team productivity through extensive research. The Petuum Platform helps AI teams do more with less.”

With the world’s first composable platform for MLOps, Petuum enables enterprise AI/ML teams to operationalize and scale their machine learning pipelines to production. The company unveiled a restricted release of its platform through an exclusive private beta for select clients, following years of development at CMU, Berkeley, and Stanford, as well as hundreds of customer engagements in banking, healthcare, energy, and heavy industry.

Petuum’s enterprise MLOps platform is built on principles of reusability, openness, and unlimited extensibility. Universal standards for data, pipelines, and infrastructure allow AI applications to be assembled from reusable building blocks and maintained as part of a repeatable, assembly-line process. Petuum users can focus on new project implementations with less time and fewer resources, without worrying about infrastructure, DevOps expertise, glue code, or tuning.

Tong Wen, Director of Engineering at Petuum, said, “In training alone, we have seen 3 to 8 times faster time to value. The infrastructure orchestration and Pythonic deployment system are easy enough for a data scientist to use.”

The end-to-end platform includes an AI OS and low/no-code Kubernetes designed for AI. A universal pipeline lets users create and run DAGs with any kind of data pack without requiring deep expertise. The low/no-code Deployment Manager can update, reuse, and reconfigure pipelines in production, with observability and user control. The platform also includes a new experiment manager for amortized autotuning and for optimizing model and system pipelines.
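Petuum has not published the platform’s API, but the composable, DAG-based pipeline concept described above can be illustrated with a minimal, self-contained Python sketch. The Step and Pipeline names below are assumptions for illustration only, not Petuum’s actual interfaces: reusable steps declare their upstream dependencies, and a small runner executes them in dependency order.

```python
# Hypothetical sketch of a composable DAG pipeline; NOT Petuum's API.
# "Step" and "Pipeline" are illustrative names assumed for this example.
from typing import Any, Callable, Dict, List


class Step:
    """A reusable building block: a named function plus its upstream dependencies."""

    def __init__(self, name: str, fn: Callable[..., Any], deps: List[str] = ()):
        self.name = name
        self.fn = fn
        self.deps = list(deps)


class Pipeline:
    """A minimal DAG runner: executes each step once all of its dependencies have run."""

    def __init__(self, steps: List[Step]):
        self.steps = {s.name: s for s in steps}

    def run(self) -> Dict[str, Any]:
        results: Dict[str, Any] = {}
        pending = dict(self.steps)
        while pending:
            ready = [s for s in pending.values() if all(d in results for d in s.deps)]
            if not ready:
                raise ValueError("Cycle or missing dependency in pipeline DAG")
            for step in ready:
                # Each step receives the outputs of its upstream steps as arguments.
                results[step.name] = step.fn(*(results[d] for d in step.deps))
                del pending[step.name]
        return results


# Example: a toy load -> preprocess -> train -> evaluate pipeline
# assembled from reusable steps.
pipeline = Pipeline([
    Step("load", lambda: [1, 2, 3, 4]),
    Step("preprocess", lambda data: [x * 2 for x in data], deps=["load"]),
    Step("train", lambda data: sum(data) / len(data), deps=["preprocess"]),
    Step("evaluate", lambda model: {"score": model}, deps=["train"]),
])
print(pipeline.run()["evaluate"])
```

In a production MLOps platform, each step would typically be versioned, reusable across projects, and scheduled on infrastructure such as Kubernetes rather than run in-process as in this toy example.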
