Utkarsh Rastogi for AWS Community Builders


📅 Day 3: Understanding AI Memory in LangChain – A Shimla Travel Analogy 🇮🇳

From short-term recall to deep personalization – let's explore how AI remembers like humans!

Imagine you're chatting with your AI travel assistant Lexi and planning a trip to Shimla. Wouldn't it be amazing if she remembered your favorite hotel, travel dates, and even your preference for toy trains, just like a human would?

That's exactly what AI Memory in LangChain is all about.


🧠 What is Memory in AI?

In human terms, memory helps us recall:

  • What was said
  • What we like
  • What we did
  • What we've learned

For AI agents, memory helps them act smarter, carry over context, and improve over time. LangChain and LangGraph offer robust ways to manage both short-term and long-term memory, just like a human brain.


๐Ÿ” Two Types of Memory in LangChain

1. ✨ Short-Term Memory (Thread-Scoped)

This memory lives within a single conversation.

🧳 Example: You told Lexi, "Book a trip to Shimla in December."

Lexi remembers:

  • Destination: Shimla
  • Timing: December

And that memory stays as long as you're in the same thread, thanks to LangGraph's checkpointer.

Key Highlights:

  • Thread-specific
  • Stored in the agent's state
  • Loaded at every step
  • Temporary

💡 Like sticky notes on Lexi's desk – perfect for the current chat.
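
Here's a minimal sketch of that sticky-note behavior, assuming a recent langgraph release (with langchain installed and an OpenAI API key); the model string, empty tools list, and thread id are illustrative choices, not requirements:

```python
# Thread-scoped memory: a checkpointer persists state per thread_id, so the
# second turn still "sees" the first. MemorySaver keeps everything in memory.
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent

agent = create_react_agent(
    model="openai:gpt-4o-mini",  # illustrative; any chat model works
    tools=[],
    checkpointer=MemorySaver(),
)

config = {"configurable": {"thread_id": "shimla-trip"}}  # one conversation

# Turn 1: Lexi learns the destination and month.
agent.invoke(
    {"messages": [("user", "Book a trip to Shimla in December.")]}, config
)

# Turn 2: same thread_id, so the checkpointer reloads the earlier messages.
reply = agent.invoke(
    {"messages": [("user", "Which month did I pick?")]}, config
)
print(reply["messages"][-1].content)  # should mention December
```

Start a new thread_id and those sticky notes are gone – that's the temporary, thread-specific nature in action.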


2. 🧠 Long-Term Memory (Cross-Thread)

This memory survives across sessions.

"Plan something like last time."

Lexi remembers you meant Shimla in December. Why? Because that was stored in long-term memory, scoped to your user profile.

Key Highlights:

  • Works across conversations
  • Persisted using vector stores or DBs
  • Supports personalization

📓 Like Lexi's personal diary – useful for lifelong relationships.
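
A minimal sketch of that diary using LangGraph's store API; the namespace and key names are invented for illustration:

```python
# Cross-thread memory: entries are namespaced to the user, not the thread,
# so any future conversation can read them back.
from langgraph.store.memory import InMemoryStore

store = InMemoryStore()  # swap for a DB-backed store in production
namespace = ("memories", "user-utkarsh")  # illustrative user scope

# Written during one conversation...
store.put(namespace, "last-trip", {"destination": "Shimla", "month": "December"})

# ...and read back in a completely different thread, days later.
item = store.get(namespace, "last-trip")
print(item.value)  # {'destination': 'Shimla', 'month': 'December'}
```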


🧬 AI Memory Types: Inspired by the Human Brain

LangChain's memory also resembles human memory types:

1. 📖 Episodic Memory

Stores specific events, like your Dec 10 hotel booking in Shimla.

  • Chat logs and user actions
  • Enables time-stamped recall

"Book a toy train to Shimla on the 14th" → remembered exactly as said.


2. 📚 Semantic Memory

Stores general knowledge, like facts about Shimla.

  • Snowfall in Kufri
  • Best time to visit
  • Toy train info

Even if you don't say "Shimla", Lexi might recommend it if you say "snowy hills in North India."
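
That "snowy hills" leap is just similarity search over embedded facts. A hedged sketch, assuming an OpenAI API key; the embedding model and fact strings are illustrative:

```python
# Semantic memory as a vector store: facts about Shimla are embedded once,
# and a query that never says "Shimla" still retrieves them by meaning.
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import OpenAIEmbeddings

facts = [
    "Shimla is a Himalayan hill station with winter snowfall.",
    "Kufri, near Shimla, gets heavy snow from December to February.",
    "The Kalka-Shimla toy train is a UNESCO-listed mountain railway.",
]

vectorstore = InMemoryVectorStore.from_texts(facts, embedding=OpenAIEmbeddings())

# No mention of "Shimla" in the query, yet the Shimla facts come back.
for doc in vectorstore.similarity_search("snowy hills in North India", k=2):
    print(doc.page_content)
```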


3. ⚙️ Procedural Memory

Learns routines or behaviors, like always booking a hotel after a train.

  • Learns booking patterns
  • Automates tasks

Lexi starts suggesting your travel steps without being told, like muscle memory.
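
One simple way to model that muscle memory is to keep the routine itself in the store, where the agent can read (and later refine) it. A framework-light sketch; the key names and routine text are invented:

```python
# Procedural memory: "how I usually do things" lives in the store as data
# the agent injects into its own instructions on every run.
from langgraph.store.memory import InMemoryStore

store = InMemoryStore()
ns = ("agent", "lexi")

# A routine learned from past sessions (train first, then hotel, then cafes).
store.put(ns, "routine", {"steps": "book train -> book hotel -> suggest cafes"})

def plan_trip(request: str) -> str:
    routine = store.get(ns, "routine").value["steps"]
    # In a real agent this string would go into the system prompt.
    return f"Using my usual routine ({routine}) for: {request}"

print(plan_trip("Shimla in December"))
```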


🧠 When Should AI Create Memories?

Unlike humans, AI doesn't sleep. So when does it store new memories?

LangChain offers two approaches:

🔥 1. Hot Path (Real-Time Writing)

  • Happens during the conversation
  • Fast recall, but slower response time

Lexi notes: "User prefers mountain-facing rooms" while chatting.


🌙 2. Background (Post-Task Writing)

  • Happens after the task
  • Batched or summarized memory

After your session, Lexi reflects: "User loves snowy cafes in Himachal."


🧠 Pro Strategy: Combine Both

  • Use hot path for bookings/preferences
  • Use background for session summarization (see the combined sketch below)
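
A combined sketch of the two write paths against the same long-term store; save_preference and summarize_session are hypothetical helpers, not LangChain APIs:

```python
# Hot path vs. background writes to the same cross-thread store.
from langgraph.store.memory import InMemoryStore

store = InMemoryStore()
ns = ("memories", "user-utkarsh")  # illustrative user scope

def save_preference(pref: str) -> None:
    """Hot path: called as a tool mid-conversation (adds a little latency)."""
    store.put(ns, f"pref:{pref}", {"preference": pref})

def summarize_session(transcript: list[str]) -> None:
    """Background: runs after the session; joining stands in for an LLM summary."""
    store.put(ns, "session-summary", {"summary": " ".join(transcript)})

# Mid-chat, the moment the preference appears:
save_preference("mountain-facing rooms")

# After the session ends, batched and reflective:
summarize_session(["User loves snowy cafes in Himachal.", "Booked the toy train."])
```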

๐Ÿ—‚๏ธ Tagging Makes Memory Smarter

To make memory usable, tag it by:

  • Thread ID
  • Location (e.g., Shimla)
  • User ID

Right memory, right moment – just like a thoughtful friend.
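
In LangGraph's store, those tags map naturally onto namespace components. A small sketch with invented ids:

```python
# Namespaces double as tags: (kind, user id, location) scopes every entry.
from langgraph.store.memory import InMemoryStore

store = InMemoryStore()

store.put(("memories", "user-42", "shimla"), "hotel", {"pref": "mountain view"})
store.put(("memories", "user-42", "manali"), "hotel", {"pref": "river view"})

# Retrieve only the Shimla-tagged memories for this user.
for item in store.search(("memories", "user-42", "shimla")):
    print(item.key, item.value)  # hotel {'pref': 'mountain view'}
```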


๐Ÿ› ๏ธ Memory Management in LangGraph

Many AI applications need memory to share context across multiple interactions. LangGraph provides built-in support for managing memory effectively, enabling agents to stay within the LLM's context window while remaining aware of the conversation history.

LangGraph supports two main types of memory:

๐Ÿ” Short-Term Memory

  • Tracks the ongoing conversation within a session
  • Maintains message history during the current flow
  • Critical for contextual follow-ups

🧠 Long-Term Memory

  • Stores user-specific or app-level data across sessions
  • Used for persistent personalization and historic recall

๐Ÿ“ Handling Context Window Limits

With short-term memory enabled, long conversations can exceed the LLMโ€™s token limit. LangGraph offers the following strategies:

โœ‚๏ธ Trimming

  • Remove the first or last N messages
  • Keeps the most relevant and recent messages
  • Ensures the LLM receives a manageable context (example below)
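
A minimal trimming example using langchain-core's trim_messages helper; counting each message as one "token" keeps the sketch dependency-free, and the two-message budget is arbitrary:

```python
# Trimming: keep only the most recent messages that fit a budget.
from langchain_core.messages import AIMessage, HumanMessage, trim_messages

history = [
    HumanMessage("Book a trip to Shimla in December."),
    AIMessage("Done! Anything else?"),
    HumanMessage("Add the toy train on the 14th."),
    AIMessage("Booked the Kalka-Shimla toy train."),
]

trimmed = trim_messages(
    history,
    strategy="last",     # keep the tail of the conversation
    token_counter=len,   # each message counts as 1 "token" in this sketch
    max_tokens=2,        # so: keep the last two messages
    start_on="human",    # never open the window on an AI reply
)
print([m.content for m in trimmed])  # the toy-train exchange only
```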

๐Ÿ“ Summarization

  • Earlier parts of the conversation are summarized
  • Summaries replace full message logs
  • Helps maintain continuity while reducing tokens (sketch below)
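
A framework-free sketch of the idea; summarize() stands in for the LLM call that would produce the rolling summary:

```python
# Summarization: older turns collapse into one line, recent turns stay verbatim.
def summarize(messages: list[str]) -> str:
    return "User is planning a December trip to Shimla."  # LLM call in practice

def compact_history(messages: list[str], keep_recent: int = 2) -> list[str]:
    if len(messages) <= keep_recent:
        return messages
    summary = summarize(messages[:-keep_recent])
    return [f"[summary] {summary}"] + messages[-keep_recent:]

history = [
    "user: Book a trip to Shimla in December.",
    "ai: Done! Anything else?",
    "user: Add the toy train on the 14th.",
    "ai: Booked the Kalka-Shimla toy train.",
]
print(compact_history(history))  # summary line + the last two turns
```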

๐Ÿ—‘๏ธ Deletion

  • Permanently remove messages from LangGraph state
  • Useful for stateless workflows (see the node sketch below)
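
In a LangGraph graph, deletion is done by returning RemoveMessage instances from a node; RemoveMessage is a real langchain-core primitive, while the two-message cutoff here is an arbitrary choice:

```python
# Deletion: permanently drop messages from LangGraph state.
from langchain_core.messages import RemoveMessage

def delete_old_messages(state):
    """A graph node that removes all but the two most recent messages."""
    return {"messages": [RemoveMessage(id=m.id) for m in state["messages"][:-2]]}
```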

๐Ÿ› ๏ธ Custom Strategies

  • Filter messages based on importance
  • Retain specific types (e.g., user queries only)
  • Fully customizable to fit app needs (filter example below)
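
For instance, a filter that retains user queries only, as the list above suggests; the function name is invented:

```python
# Custom strategy: keep only the human turns before calling the LLM.
from langchain_core.messages import BaseMessage, HumanMessage

def keep_user_queries(messages: list[BaseMessage]) -> list[BaseMessage]:
    return [m for m in messages if isinstance(m, HumanMessage)]
```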

🎯 Why It Matters

These memory management strategies allow your AI agent to:

  • Operate within LLM limits
  • Stay context-aware
  • Provide coherent responses
  • Enhance long-form conversations

🧾 Summary Table

| Memory Layer | Type | Scope | Shimla Example | Management Strategy |
|---|---|---|---|---|
| Short-Term | Episodic | One conversation | "Shimla trip in December" | Trimming, summarization, deletion |
| Long-Term | Episodic/Semantic | Multiple chats | Remembers previous trip to Shimla | Stored in DB or vector store |
| Semantic | Knowledge-based | General facts | Knows Shimla is snowy in winter | Stored as knowledge base |
| Procedural | Habitual recall | Behavior patterns | Always books train → hotel → cafe | Pattern learning over time |
| Hot Path | Real-time save | Immediate | Saves hotel preference mid-convo | Stored instantly |
| Background | Post-processing | Deferred | Summarizes entire trip memory | Summarized after conversation |

🧭 Why This Matters for AI Agents

Without memory:

  • AI feels robotic, forgetful, and cold

With memory:

  • AI becomes personal, smart, and useful

Next time you plan a winter trip, Lexi might say:

"Shall I book that toy train and hillside hotel you liked last December?"

That's the power of AI memory. 🧠✨


๐Ÿ™ Credits

This article is inspired by and references the official LangChain documentation. Special thanks to the LangChain team for making advanced memory handling so intuitive.


๐Ÿ‘จโ€๐Ÿ’ป About Me

Hi, Iโ€™m Utkarsh Rastogi, an AWS Community Builder passionate about:

  • ๐ŸŒฉ๏ธ Cloud-native apps on AWS
  • ๐Ÿค– Building intelligent AI assistants
  • ๐Ÿงฑ Infrastructure-as-Code with Terraform & CloudFormation
  • ๐Ÿ“ Blogging real-world AWS & AI projects on awslearner.hashnode.dev

Let's connect on LinkedIn!

Happy building! 🚀


