Redis announces major AI advancements and intent to acquire Decodable at Redis Released 2025

Redis outlines an expansive AI strategy featuring significant acquisitions and innovative services, propelling its growth as a key player in AI infrastructure.


Redis, the self-described world's fastest data platform, recently unveiled a major expansion of its AI strategy during the Redis Released 2025 event. The keynote address by CEO Rowan Trollope highlighted several key initiatives, including the acquisition of Decodable, the introduction of LangCache, and numerous advancements to bolster Redis' position as a critical infrastructure layer for AI applications.

"As AI enters its next phase, the challenge isn't proving what language models can do; it's giving them the context and memory to act with relevance and reliability," Trollope noted. He emphasised how Redis' strategic acquisition of Decodable will streamline data pipeline developments, enabling data conversion into actionable context swiftly and efficiently within Redis.

Decodable, established by Eric Sammer, offers a serverless platform that simplifies the ingestion, transformation, and delivery of real-time data. By joining forces with Redis, Decodable aims to enhance AI capabilities and seamlessly connect developers with real-time data sources.

Redis also premiered LangCache, a fully managed semantic caching service that cuts latency and token usage by up to 70% in LLM-reliant applications. The caching solution optimises performance and reduces costs significantly, supporting Redis' mission to bolster AI agent efficiency.

The key advantages of LangCache include:

  • Up to 70% reduction in LLM API costs in high-traffic scenarios
  • 15x faster response times for cache hits compared to live LLM inference
  • Enhanced user experiences with lower latency and consistent outputs
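The idea behind semantic caching is to return a stored LLM answer whenever a new prompt is close enough in meaning to one already answered, rather than matching prompts character-for-character. Below is a minimal sketch of that pattern; the class, method names, and toy bag-of-words embedding are illustrative assumptions, not LangCache's actual API (a real deployment would use a sentence-embedding model and Redis vector search):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding" so the sketch is self-contained;
    # a real system would call an embedding model here (assumption).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    """Return a cached answer when a new prompt is similar enough."""

    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.entries = []  # list of (embedding, answer) pairs

    def get(self, prompt: str):
        query = embed(prompt)
        best, best_score = None, 0.0
        for vec, answer in self.entries:
            score = cosine(query, vec)
            if score > best_score:
                best, best_score = answer, score
        return best if best_score >= self.threshold else None

    def put(self, prompt: str, answer: str) -> None:
        self.entries.append((embed(prompt), answer))

cache = SemanticCache()
cache.put("what is the capital of France", "Paris")
hit = cache.get("What is the capital of France?")   # semantically close: cache hit
miss = cache.get("how do I bake bread")             # unrelated: falls through to the LLM
```

A cache hit skips the LLM call entirely, which is where the latency and token savings cited above come from; the similarity threshold trades hit rate against the risk of serving a stale or mismatched answer.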

Redis continuously adapts to the swift advancements in AI. Recent integrations make it easier for developers to leverage existing AI frameworks and tools. New integrations with AutoGen and Cognee, along with LangGraph enhancements, provide scalable memory solutions for agents and chatbots.

Developers can now:

  • Utilise AutoGen for a fast-data memory layer
  • Leverage Cognee to manage memory through summarisation and reasoning
  • Implement LangGraph enhancements for reliability
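At its simplest, an agent memory layer is a per-session rolling history that gets prepended to each LLM prompt. The sketch below is an in-process stand-in for that pattern (roughly what a Redis-backed store does with LPUSH and LTRIM on a per-session list); the class and method names are illustrative assumptions, not the AutoGen, Cognee, or LangGraph APIs:

```python
from collections import deque

class ChatMemory:
    """Per-session chat history, capped at max_turns.

    In-process stand-in for a Redis-backed memory layer: appending a
    turn and trimming to the newest N mirrors LPUSH + LTRIM on a
    per-session Redis list (assumption, for illustration only).
    """

    def __init__(self, max_turns: int = 10):
        self.max_turns = max_turns
        self.sessions = {}  # session_id -> deque of turns

    def append(self, session_id: str, role: str, content: str) -> None:
        history = self.sessions.setdefault(
            session_id, deque(maxlen=self.max_turns)
        )
        history.append({"role": role, "content": content})

    def context(self, session_id: str) -> list:
        # Oldest-first, ready to prepend to an LLM prompt.
        return list(self.sessions.get(session_id, []))

memory = ChatMemory(max_turns=3)
memory.append("u1", "user", "Hi")
memory.append("u1", "assistant", "Hello!")
memory.append("u1", "user", "What's the weather?")
memory.append("u1", "assistant", "Sunny.")  # oldest turn is evicted
turns = memory.context("u1")
```

Capping the history keeps prompt size bounded; frameworks like Cognee go further by summarising evicted turns instead of discarding them.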

Additional Redis for AI Enhancements

Redis' evolution continues with key improvements to hybrid search and data compression within AI applications. Upgrades introduced include:

  • Hybrid search improvements using Reciprocal Rank Fusion, integrating text and vector rankings
  • Support for int8 quantised embeddings, yielding 75% memory savings and 30% faster search speeds
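Reciprocal Rank Fusion merges the text and vector result lists by scoring each document as the sum of 1/(k + rank) across the lists it appears in, so documents ranked well by both retrievers rise to the top. A minimal sketch (not Redis' implementation; k = 60 is the constant from the original RRF paper):

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse ranked lists of document IDs (best first).

    score(d) = sum over lists containing d of 1 / (k + rank_of_d)
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

text_hits = ["doc3", "doc1", "doc7"]    # e.g. full-text (BM25) ranking
vector_hits = ["doc1", "doc9", "doc3"]  # e.g. vector-similarity ranking
fused = reciprocal_rank_fusion([text_hits, vector_hits])
```

Here "doc1" wins because it places highly in both rankings, even though neither ranking put it first; that robustness to disagreement between retrievers is RRF's main appeal.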

These latest updates ensure that Redis remains a pivotal platform for developing high-quality, reliable AI agents and frameworks.
