
Redis Joins Hands with Tecton to Support Machine Learning Applications

Real-time data platform Redis and enterprise feature store company Tecton have entered into a partnership that includes a product integration. The collaboration allows machine learning (ML) applications to deliver services with lower latency, greater scalability, and at lower cost.

Kevin Petrie, Vice President of Research at Eckerson Group, stated, “Tecton and Redis are partnering in order to reduce the time to action for enterprises. Many machine learning use cases require the ability to transform streaming data, serve features to the machine learning model and calculate feature values, all on a real-time basis. Tecton helps transform incoming data and calculate feature values, and Redis helps retrieve feature values at ultra-low latency for model serving.”

Data teams define ML features in Python and SQL, and those features are stored in the Tecton feature store. Tecton automates the ML data pipelines, generates accurate training data sets, and makes the features available for online inference. Using DevOps engineering best practices, features are built once and shared across models and use cases.
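
As a rough illustration of the idea only (this is not Tecton's actual API; the function, DataFrame schema, and column names are hypothetical), the sketch below shows what a Python-defined feature transformation might look like: a function that turns raw transaction events into a per-user aggregate feature that a feature store could then materialize for both training sets and online serving.

```python
# Hypothetical sketch of a Python-defined ML feature; the schema and
# column names are illustrative, not Tecton's actual API.
import pandas as pd


def user_7d_spend(transactions: pd.DataFrame, as_of: pd.Timestamp) -> pd.DataFrame:
    """Aggregate each user's total spend over the 7 days ending at `as_of`.

    A feature store would run a transformation like this on a schedule, write
    the results to offline storage for training sets, and push the latest
    values to an online key-value store for low-latency serving.
    """
    window_start = as_of - pd.Timedelta(days=7)
    recent = transactions[
        (transactions["timestamp"] >= window_start)
        & (transactions["timestamp"] < as_of)
    ]
    return (
        recent.groupby("user_id", as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "user_7d_spend"})
    )


# Example usage with a toy event log.
events = pd.DataFrame(
    {
        "user_id": ["u1", "u1", "u2"],
        "amount": [20.0, 15.5, 9.99],
        "timestamp": pd.to_datetime(["2021-10-25", "2021-10-28", "2021-10-27"]),
    }
)
print(user_7d_spend(events, pd.Timestamp("2021-10-29")))
```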

“The Tecton feature store is designed to support a broad range of ML use cases, each with unique serving latency and volume requirements. Customers with latency-sensitive and high-volume use cases have been asking for the option to use Redis Enterprise Cloud for their online store. With today’s announcement, we’re happy to be providing that option and continuing to make the Tecton feature store more flexible and modular,” stated Mike Del Balso, co-founder and CEO of Tecton.

ML data access patterns supported by feature stores include obtaining millions of rows of historical data for model training and accessing individual rows online at millisecond latencies for real-time predictions. For low-latency serving, feature stores often employ key-value databases as online storage.
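
To make the online access pattern concrete, the minimal sketch below uses the open-source redis-py client against a plain Redis instance (assumed to be running on localhost; the key names and feature values are made up) to write one entity's feature values as a hash and read them back with a single key lookup, the kind of low-latency fetch performed at prediction time.

```python
# Minimal sketch of online feature serving against a key-value store.
# Assumes a Redis server on localhost:6379; keys and values are illustrative.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Materialization: write the latest feature values for one entity as a hash.
r.hset(
    "features:user:u1",
    mapping={"user_7d_spend": "35.5", "user_txn_count": "2"},
)

# Serving: at prediction time, fetch the whole feature row with one lookup.
feature_row = r.hgetall("features:user:u1")
features = {name: float(value) for name, value in feature_row.items()}
print(features)  # {'user_7d_spend': 35.5, 'user_txn_count': 2.0}
```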

Redis Enterprise Cloud is a cost-effective, fully managed Database-as-a-Service (DBaaS) offering that can be used in hybrid and multi-cloud environments. It automates database provisioning on AWS, Microsoft Azure, and Google Cloud, and is built on a serverless paradigm. Redis is a distributed storage system that delivers sub-millisecond speed at nearly limitless scale, making it well suited to modern distributed applications.

Chief Business Development Officer at Redis, Taimur Rashid added, “As more organizations operationalize machine learning for real-time, performance becomes especially important for customer-facing applications and experiences. Feature stores are at the center of modern data architecture, and there is increasing adoption of Redis to store online features for low latency serving. With Tecton’s capabilities for data orchestration combined with Redis Enterprise Cloud’s low operational overhead and submillisecond performance, organizations can deliver online predictions and perform complex operations in milliseconds.”

The integration enables Tecton customers to use Redis Enterprise Cloud to store ML features, with up to 3x faster serving latency and up to a 14x reduction in transaction costs. This allows organizations to support a broader range of ML use cases.
