Beginner · 20 min

Building a Chatbot with Memory

Create a chatbot that remembers user preferences and past conversations.

Prerequisites

  • Completed the Getting Started tutorial
  • OpenAI API key
  • Python 3.8+

In this tutorial, we’ll build a complete chatbot that remembers what users tell it across conversations.

What We’re Building

A chatbot that:

  • Remembers user preferences and facts
  • Retrieves relevant context for each response
  • Maintains conversation history
  • Gets smarter over time

Project Setup

Create a new directory and set up the environment:

mkdir memoid-chatbot
cd memoid-chatbot
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install memoid openai flask

The Core Memory System

Let’s start by creating a memory-aware chat function:

# memory_chat.py
from memoid import MemoryClient
from openai import OpenAI

class MemoryChat:
    def __init__(self, memoid_key: str, openai_key: str):
        self.memory = MemoryClient(memoid_key)
        self.openai = OpenAI(api_key=openai_key)

    def get_context(self, user_id: str, message: str, limit: int = 5) -> str:
        """Retrieve relevant memories for context."""
        memories = self.memory.search(
            query=message,
            user_id=user_id,
            limit=limit
        )

        if not memories:
            return "No previous context available."

        context_lines = [f"- {m.memory}" for m in memories]
        return "What you know about this user:\n" + "\n".join(context_lines)

    def chat(self, user_id: str, message: str) -> str:
        """Generate a response with memory context."""
        # Get relevant memories
        context = self.get_context(user_id, message)

        # Build the prompt
        system_prompt = f"""You are a friendly, helpful assistant with memory.
You remember things users tell you and use that information to personalize responses.

{context}

Guidelines:
- Reference relevant memories naturally in conversation
- Don't repeat everything you know, just what's relevant
- If you learn something new, acknowledge it
- Be conversational and warm"""

        # Generate response
        response = self.openai.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": message}
            ],
            temperature=0.7
        )

        assistant_message = response.choices[0].message.content

        # Store the conversation so future chats can recall it
        self.memory.add(
            messages=[
                {"role": "user", "content": message},
                {"role": "assistant", "content": assistant_message}
            ],
            user_id=user_id
        )

        return assistant_message
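Before wiring this up to real API keys, it helps to see exactly what string `get_context` hands to the model. Here's a standalone sketch of that formatting using hand-written stand-in memory objects (`FakeMemory` is our own placeholder, not part of the Memoid SDK), so no API calls are needed:

```python
# Sketch of the context block get_context builds, using stand-in
# memory objects instead of real Memoid search results.
from dataclasses import dataclass

@dataclass
class FakeMemory:
    id: str
    memory: str

def build_context(memories):
    """Format a list of memories the same way get_context does."""
    if not memories:
        return "No previous context available."
    context_lines = [f"- {m.memory}" for m in memories]
    return "What you know about this user:\n" + "\n".join(context_lines)

memories = [
    FakeMemory("m1", "Name is Sarah"),
    FakeMemory("m2", "Works as a data scientist"),
]
print(build_context(memories))
# What you know about this user:
# - Name is Sarah
# - Works as a data scientist
```

The empty-list fallback matters: without it, the system prompt would contain a bare header with nothing under it, which tends to make models invent memories.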

Adding a Web Interface

Let’s create a simple Flask API:

# app.py
from flask import Flask, request, jsonify
from memory_chat import MemoryChat
import os

app = Flask(__name__)

chat = MemoryChat(
    memoid_key=os.environ["MEMOID_API_KEY"],
    openai_key=os.environ["OPENAI_API_KEY"]
)

@app.route("/chat", methods=["POST"])
def handle_chat():
    data = request.json
    user_id = data.get("user_id")
    message = data.get("message")

    if not user_id or not message:
        return jsonify({"error": "Missing user_id or message"}), 400

    response = chat.chat(user_id, message)
    return jsonify({"response": response})

@app.route("/memories", methods=["GET"])
def get_memories():
    user_id = request.args.get("user_id")

    if not user_id:
        return jsonify({"error": "Missing user_id"}), 400

    memories = chat.memory.list(user_id=user_id, limit=50)
    return jsonify({
        "memories": [{"id": m.id, "memory": m.memory} for m in memories]
    })

if __name__ == "__main__":
    app.run(debug=True, port=5000)
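The payload check in `handle_chat` is easy to get subtly wrong (an empty string should fail just like a missing key). Factoring it into a pure function makes it testable without running Flask; `validate_payload` below is a hypothetical helper for illustration, not part of the tutorial code:

```python
# Hypothetical helper mirroring the validation in handle_chat.
# Returns (error_message, http_status); (None, 200) means the payload is valid.
def validate_payload(data):
    if not isinstance(data, dict):
        return "Invalid JSON body", 400
    if not data.get("user_id") or not data.get("message"):
        return "Missing user_id or message", 400
    return None, 200

# Empty strings and missing keys are both rejected, matching the
# truthiness check (`if not user_id or not message`) in handle_chat.
print(validate_payload({"user_id": "user_1", "message": "Hi!"}))  # → (None, 200)
print(validate_payload({"user_id": "", "message": "Hi!"}))        # → ('Missing user_id or message', 400)
```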

Running the Chatbot

Set your environment variables and run:

export MEMOID_API_KEY="your-memoid-key"
export OPENAI_API_KEY="your-openai-key"
python app.py

Testing It Out

Use curl or any HTTP client:

# First conversation
curl -X POST http://localhost:5000/chat \
  -H "Content-Type: application/json" \
  -d '{"user_id": "user_1", "message": "Hi! My name is Sarah and I work as a data scientist."}'

# Response: "Nice to meet you, Sarah! Data science is such an exciting field..."

# Later conversation (could be days later)
curl -X POST http://localhost:5000/chat \
  -H "Content-Type: application/json" \
  -d '{"user_id": "user_1", "message": "Can you recommend some books for me?"}'

# Response: "Of course, Sarah! Given your work in data science, you might enjoy..."

The chatbot remembers Sarah’s name and profession, and uses that to personalize recommendations.

Enhancing Memory Retrieval

For better context, we can combine recent memories with high-relevance semantic search results:

def get_context(self, user_id: str, message: str) -> str:
    # Get recent memories for general context
    recent = self.memory.list(user_id=user_id, limit=5)

    # Get semantically relevant memories
    relevant = self.memory.search(
        query=message,
        user_id=user_id,
        limit=5,
        threshold=0.7  # Only high-relevance results
    )

    # Combine and deduplicate
    all_memories = {m.id: m for m in recent + relevant}

    context_lines = [f"- {m.memory}" for m in all_memories.values()]
    return "What you know about this user:\n" + "\n".join(context_lines)
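The dict comprehension above deduplicates by memory ID while preserving insertion order, so a memory that shows up in both the recent and relevant lists appears only once. A standalone sketch of that merge behavior (again using a stand-in object rather than real Memoid results):

```python
# How the dict-based merge deduplicates overlapping recent/relevant results.
from dataclasses import dataclass

@dataclass
class FakeMemory:
    id: str
    memory: str

recent = [FakeMemory("m1", "Name is Sarah"), FakeMemory("m2", "Likes sci-fi")]
relevant = [FakeMemory("m2", "Likes sci-fi"), FakeMemory("m3", "Data scientist")]

# Later entries with the same id overwrite earlier ones, and dicts
# preserve insertion order in Python 3.7+.
all_memories = {m.id: m for m in recent + relevant}
print([m.id for m in all_memories.values()])  # → ['m1', 'm2', 'm3']
```

Keying on `m.id` rather than the memory text also handles the case where the same fact is stored with slightly different wording under one ID.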

Next Steps

You now have a working chatbot with persistent memory. From here, you can experiment with the combined retrieval shown above, tune the `limit` and `threshold` parameters for your use case, or put the Flask app behind a production WSGI server.

Complete Code

The full code for this tutorial is available on GitHub.