New market research identifies demand for real-time model training and inferencing; highlights major challenges with accuracy, latency, and reliability in current architectures
Mountain View, June 29, 2021—As companies look to expand their use of artificial intelligence (AI) and machine learning (ML) to keep up with the demands of their customers, they are facing hurdles getting these projects to production and ultimately delivering the desired results to their bottom line. In fact, 88% of AI/ML decision-makers expect the use cases that require these technologies to increase in the next one to two years, according to a commissioned study conducted by Forrester Consulting on behalf of Redis Labs. The research looked at the challenges keeping decision-makers from their desired transformation when deploying ML to create AI applications.
The study revealed that companies are developing an increasing number of models based on real-time data. Still, more than 40% of respondents believe their current data architectures won't meet their future model inferencing requirements. Most decision-makers (64%) say their firms are developing between 20% and 39% of their models on real-time data from data streams and connected devices. As teams develop more models on real-time data, the need for accuracy and scalability becomes increasingly critical. Notably, 38% of leaders are developing roughly a third of their models on real-time data.
Other key findings include:
As Forrester Consulting concludes, “AI powered by ML models mustn’t slow down applications by necessitating a network hop to a service and/or microservice for an application to use an ML model and/or get reference data. Most applications, especially transactional applications, can’t afford those precious milliseconds while meeting service-level agreements (SLAs).”
“Companies are embracing AI/ML to deliver more value for their mission-critical applications, yet need a modern AI/ML infrastructure to support real-time serving and continuous training. There are still gaps that impede companies from making existing applications smarter and delivering new applications,” said Taimur Rashid, Chief Business Development Officer at Redis Labs. “Customers realize this, and the simplicity and versatility of Redis as an in-memory database is enabling them to implement Redis as an online feature store and inferencing engine for low-latency and real-time serving.”
“Fabric was established to help brands migrate from legacy to modern, digital commerce systems,” said Umer Sadiq, CTO of Fabric. “In order to offer businesses the best-in-class technologies that enhance and improve customer experiences, we have crafted and continue to deliver applications that rely on Redis’ real-time data platform hosted on AWS to ensure real-time feature serving to customers, thus maintaining exceptional user satisfaction. Additionally, by combining the power of Amazon SageMaker and Redis Enterprise to bolster the efficiency of our market-leading recommender systems, we guarantee low-latency and high reliability for each individual customer interaction.”
“The Room’s mission is to connect top talent from around the world to meaningful opportunities, and at the core of the technology challenge is a mathematically difficult entity-matching problem,” said Peter Swaniker, CTO of The Room. “To address this complexity, we have architected a joint solution using Scribble Data’s Enrich Feature Store and Redis’ real-time data platform to provide the overall framework for The Room’s Intelligence Platform, which is responsible for entity matching. Using Redis’ high-performance key retrieval based on nearest neighbor vector lookup, the team was able to achieve a 15x+ improvement in the core similarity computation loop without any memory overhead.”
“The use of machine learning (ML) algorithms in simulations continues to grow to improve scientific research with efficiency and accuracy,” said Benjamin Robbins, Director of AI & Advanced Productivity at Hewlett Packard Enterprise. “By leveraging Redis and RedisAI in SmartSim, our new open source AI framework which advances simulations that run on supercomputers, users can exchange data between existing simulations and an in-memory database, while the simulation is running. The ease of data exchange helps unlock new machine learning opportunities, such as online inference, online learning, online analysis, reinforcement learning, computational steering, and interactive visualization that can further improve accuracy in simulations and accelerate scientific discovery.”
For additional insights, read the Forrester Consulting opportunity snapshot or learn more about how Redis supports AI/ML use cases.
Source: “Deploy ML Models To In-Memory Databases For Blazing Fast Performance,” a Forrester Consulting opportunity snapshot commissioned by Redis, June 2021
This Opportunity Snapshot was commissioned by Redis Labs. To create this profile, Forrester Consulting surveyed 106 IT managers and decision-makers in North America responsible for ML/AI operations strategy. Respondents included individuals in the following roles: C-level executives (29%), Vice Presidents (15%), Directors (42%), and Managers (13%). The custom survey was fielded and completed in December 2020.
Watch the following presentations:
Data is the lifeline of every business, and Redis helps organizations reimagine how fast they can process, analyze, make predictions, and take action on the data they generate. Redis provides a competitive edge to any business by delivering open source and enterprise-grade data platforms to power applications that drive real-time experiences at any scale. Developers rely on Redis to build performance, scalability, reliability, and security into their applications.
Born in the cloud-native era, Redis uniquely enables users to unify data across multi-cloud, hybrid, and global applications to maximize business potential. Learn how Redis can give you this edge at redis.com.