Agent Daily
tool · intermediate

Show HN: Agent-cache – Multi-tier LLM/tool/session caching for Valkey and Redis

By kaliadesh
View original on hackernews

Agent-cache is a multi-tier caching solution for AI agents that unifies LLM response, tool result, and session state caching across Valkey and Redis. It provides framework adapters for LangChain, LangGraph, and the Vercel AI SDK, with built-in OpenTelemetry and Prometheus monitoring. It addresses the fragmentation of existing tools, which typically cover only a single caching tier or framework, and it runs on vanilla Valkey 7+ and Redis 6.2+ with no additional modules.
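To make the "multiple tiers over a single connection" idea concrete, here is a minimal sketch of tier namespacing. This is an illustration of the concept only, not the agent-cache API: an in-memory `Map` stands in for the single Valkey/Redis connection, and key prefixes (`llm:`, `tool:`, `session:`) keep the three tiers in one shared keyspace. All names are assumed for illustration.

```typescript
// Concept sketch: one keyspace, three cache tiers separated by key prefix.
// A Map stands in for a single Valkey/Redis connection; names are illustrative.
type Tier = "llm" | "tool" | "session";

class TieredCache {
  private store = new Map<string, string>();

  // Namespace keys per tier so all tiers share one connection/keyspace.
  private key(tier: Tier, id: string): string {
    return `${tier}:${id}`;
  }

  set(tier: Tier, id: string, value: string): void {
    this.store.set(this.key(tier, id), value);
  }

  get(tier: Tier, id: string): string | undefined {
    return this.store.get(this.key(tier, id));
  }
}

const cache = new TieredCache();
cache.set("llm", "prompt-123", '{"answer":"42"}');
cache.set("tool", "weather?city=Oslo", '{"temp":-3}');
cache.set("session", "user-7", '{"turns":4}');
```

With a real backend, the prefixes map naturally onto per-tier TTL and eviction policies while still using one client connection.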

Key Points

  • Multi-tier exact-match caching for LLM responses, tool results, and session state in a single connection
  • Framework adapters support LangChain, LangGraph, and Vercel AI SDK with unified interface
  • Works on vanilla Valkey 7+ and Redis 6.2+ without additional module dependencies
  • Built-in OpenTelemetry and Prometheus observability for monitoring cache performance
  • Cluster mode support shipped in v0.2.0 for distributed deployments
  • Avoids framework lock-in: existing tools often support only a single caching tier or framework
  • Streaming support planned as next feature release
  • Exact-match caching strategy optimizes for deterministic agent queries and responses
  • Single connection model reduces infrastructure complexity for agent applications
  • Rapid iteration with v0.1.0 and v0.2.0 releases within 24 hours
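The exact-match strategy noted above can be sketched as a deterministic cache key: hash the model, parameters, and prompt together, so only byte-identical requests hit the cache. The key layout below is an assumption for illustration, not agent-cache's internal format.

```typescript
import { createHash } from "node:crypto";

// Sketch of an exact-match cache key (assumed design, not agent-cache
// internals): identical model + parameters + prompt always produce the
// same key; any change, even letter case, misses the cache.
function exactMatchKey(model: string, temperature: number, prompt: string): string {
  const payload = JSON.stringify({ model, temperature, prompt });
  return "llm:" + createHash("sha256").update(payload).digest("hex");
}

const a = exactMatchKey("gpt-4o", 0, "What is Valkey?");
const b = exactMatchKey("gpt-4o", 0, "What is Valkey?"); // same key as `a`
const c = exactMatchKey("gpt-4o", 0, "what is valkey?"); // different key: exact match only
```

This is why exact-match caching suits deterministic agent queries (repeated tool-driven prompts) but does nothing for paraphrased questions, which would require semantic caching.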




Artifacts (2)

agent-cache npm package
npm install @betterdb/agent-cache
agent-cache documentation
Multi-tier caching for AI agents with framework adapters:
- LangChain integration for LLM caching
- LangGraph integration for state caching
- Vercel AI SDK integration
- OpenTelemetry and Prometheus monitoring
- Cluster mode support
- Streaming support (upcoming)