Redis Expands With Decodable And LangCache

Redis

Redis announced the acquisition of Decodable, the launch of LangCache to cut LLM costs by up to 70%, and new agent integrations—moves that underscore India’s growing role as both a testing ground and growth hub for its AI infrastructure strategy.

Rowan Trollope, Chief Executive Officer, Redis

Redis is deepening its AI ambitions with a three-pronged move: the acquisition of real-time data platform Decodable, the debut of its LangCache service, and new integrations for AI agents—all revealed at Redis Released 2025 in Bangalore. The announcements mark CEO Rowan Trollope’s first visit to India and highlight the country’s centrality to Redis’ global innovation strategy.

Sameer Dhamane, Business Leader, BFSI India, Redis

At the core of the news is LangCache, a managed semantic caching system designed to cut costs for large language model (LLM) applications by up to 70%. By storing responses and reusing them for semantically similar queries, LangCache promises responses up to 15x faster on cache hits compared with direct LLM inference. For startups and enterprises racing to scale AI chatbots and agents, the cost savings and lower latency could prove critical.
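The idea behind semantic caching can be illustrated with a minimal sketch. This is not the LangCache API; it is a toy illustration in which a bag-of-words vector stands in for a real embedding model, and the cache, threshold, and class names are all hypothetical:

```python
# Toy semantic cache sketch (hypothetical; not the LangCache API).
# A real system would use learned embeddings and a vector index such as
# Redis vector search; a word-count vector stands in for an embedding here.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Stand-in "embedding": lowercase word counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.entries = []  # list of (embedding, cached response)

    def get(self, prompt: str):
        # Return the cached response for the most similar stored prompt,
        # if its similarity clears the threshold; otherwise miss.
        q = embed(prompt)
        best = max(self.entries, key=lambda e: cosine(q, e[0]), default=None)
        if best and cosine(q, best[0]) >= self.threshold:
            return best[1]
        return None

    def put(self, prompt: str, response: str):
        self.entries.append((embed(prompt), response))

cache = SemanticCache()
cache.put("what is the capital of France", "Paris")
# A near-duplicate query hits the cache instead of triggering a new LLM call.
hit = cache.get("what is the capital of France?")
```

The key design point, which this sketch preserves, is that lookups match on meaning rather than on exact strings, so rephrased or lightly edited prompts can still be served from the cache instead of paying for a fresh inference.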

Trollope emphasized that the bottleneck in AI adoption is shifting from model capability to context and memory. Redis’ strategy positions its platform not just as an in-memory data store but as a persistent memory layer for AI agents, delivering relevance and reliability at scale. Decodable’s data streaming expertise, folded into Redis, is expected to bolster this vision by simplifying how developers pipe live data into AI-ready contexts.

India played a starring role in Trollope’s remarks. With 17 million developers and the world’s third-largest startup ecosystem, the country offers both a vast customer base and a deep engineering pool. Redis aims to ride India’s wave of AI adoption, where affordability and performance are non-negotiable.

Beyond LangCache, Redis also rolled out tighter agent integrations. Developers can now use Redis with frameworks like AutoGen and Cognee without custom coding, simplifying memory management and boosting agent reliability. Enhancements to LangGraph further extend Redis’ use in building agents with persistent, scalable memory.

For Redis, the announcements collectively signal a shift from speed-centric infrastructure to intelligence-first architecture—supporting AI that is cheaper, faster, and smarter. With India as both a proving ground and growth hub, the company is betting its future on powering the memory backbone of next-gen AI applications.
