Build a Customer Support Agent with OpenAI
Build an AI-powered customer support agent with OpenAI and Memoid that remembers customer history and provides personalized assistance.
What You’ll Build
A customer support AI agent that:
- Remembers customer interactions and preferences
- Provides context-aware responses based on history
- Tracks issues and resolutions per customer
- Streams responses for better UX
Prerequisites
- Python 3.8+
- Memoid API key (sign up free, get key from dashboard)
- OpenAI API key (for generating chat responses)
Note: Memoid handles all memory operations server-side. Your OpenAI key is only used for generating the agent’s responses, not for memory extraction or search.
Setup
```shell
pip install openai requests
```
Full Implementation
```python
import os
import requests
from openai import OpenAI

# Configuration
MEMOID_API_KEY = os.environ["MEMOID_API_KEY"]
OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]
MEMOID_BASE_URL = "https://api.memoid.dev/v1"


class CustomerSupportAgent:
    def __init__(self):
        self.openai = OpenAI(api_key=OPENAI_API_KEY)
        self.headers = {
            "Authorization": f"Bearer {MEMOID_API_KEY}",
            "Content-Type": "application/json",
        }

    def get_customer_context(self, customer_id: str, query: str) -> str:
        """Retrieve relevant memories for the customer."""
        response = requests.post(
            f"{MEMOID_BASE_URL}/search",
            headers=self.headers,
            json={
                "query": query,
                "user_id": customer_id,
                "limit": 5,
            },
            timeout=10,
        )
        if response.status_code == 200:
            memories = response.json().get("results", [])
            if memories:
                context = "\n".join(f"- {m['memory']}" for m in memories)
                return f"Customer History:\n{context}"
        return ""

    def store_interaction(self, customer_id: str, query: str, response: str):
        """Store the interaction in Memoid."""
        requests.post(
            f"{MEMOID_BASE_URL}/memories",
            headers=self.headers,
            json={
                "messages": [
                    {"role": "user", "content": query},
                    {"role": "assistant", "content": response},
                ],
                "user_id": customer_id,
                "metadata": {"type": "support_interaction"},
            },
            timeout=10,
        )

    def handle_query(self, customer_id: str, query: str) -> str:
        """Handle a customer support query."""
        context = self.get_customer_context(customer_id, query)
        system_prompt = (
            "You are a helpful customer support agent. "
            "Use the customer's history to provide personalized assistance. "
            "Be empathetic, professional, and solution-oriented."
        )
        messages = [{"role": "system", "content": system_prompt}]
        if context:
            messages.append({"role": "system", "content": context})
        messages.append({"role": "user", "content": query})

        response_text = ""
        stream = self.openai.chat.completions.create(
            model="gpt-4o-mini",
            messages=messages,
            stream=True,
        )
        print("Agent: ", end="", flush=True)
        for chunk in stream:
            # Some chunks (e.g. the final one) may carry no choices or content.
            if chunk.choices and chunk.choices[0].delta.content:
                content = chunk.choices[0].delta.content
                print(content, end="", flush=True)
                response_text += content
        print()

        self.store_interaction(customer_id, query, response_text)
        return response_text

    def get_customer_history(self, customer_id: str):
        """Get all memories for a customer."""
        response = requests.get(
            f"{MEMOID_BASE_URL}/memories",
            headers=self.headers,
            params={"user_id": customer_id},
            timeout=10,
        )
        if response.status_code == 200:
            return response.json().get("results", [])
        return []


def main():
    agent = CustomerSupportAgent()
    customer_id = "customer_jane_doe"

    print("Customer Support Agent")
    print("=" * 40)
    print("Type 'quit' to exit, 'history' to see memories\n")

    while True:
        query = input("Customer: ").strip()
        if query.lower() == "quit":
            break
        if query.lower() == "history":
            memories = agent.get_customer_history(customer_id)
            print("\nCustomer Memories:")
            for m in memories:
                print(f"  - {m['memory']}")
            print()
            continue
        if query:
            agent.handle_query(customer_id, query)
            print()


if __name__ == "__main__":
    main()
```
Running
```shell
export MEMOID_API_KEY="your-memoid-key"
export OPENAI_API_KEY="your-openai-key"

python support_agent.py
```
Example Conversation
```
Customer Support Agent
========================================
Type 'quit' to exit, 'history' to see memories

Customer: Hi, I ordered a laptop last week but it hasn't arrived yet.
Agent: I'm sorry to hear your laptop hasn't arrived yet. Could you
please provide your order number so I can look into this?

Customer: It's ORDER-12345. I'm really frustrated because I needed it for work.
Agent: I completely understand your frustration. Let me check on ORDER-12345...

Customer: history

Customer Memories:
  - Customer ordered a laptop last week that hasn't arrived
  - Order number is ORDER-12345
  - Customer needs the laptop for work and is frustrated
```
Key Concepts
Memory-Augmented Responses
The agent retrieves relevant customer history before generating responses. Memoid handles the semantic search — you just send the query and get back relevant facts.
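If you want tighter control over how retrieved facts enter the prompt, you can pull the context formatting out of the agent into a small pure helper. This is a sketch of one way to do it; `format_context` is a hypothetical helper, not part of Memoid, and it assumes the `results` list from `POST /v1/search` where each item carries a `memory` string:

```python
def format_context(memories, max_items=5):
    """Turn Memoid search results into a compact system-prompt block.

    `memories` is the `results` list from POST /v1/search; each item
    is expected to have a `memory` string. Truncating to `max_items`
    keeps the prompt size predictable.
    """
    if not memories:
        return ""
    lines = [f"- {m['memory']}" for m in memories[:max_items]]
    return "Customer History:\n" + "\n".join(lines)
```

Returning an empty string for no results lets the caller skip the extra system message entirely, as `handle_query` does above.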
Automatic Extraction
Every interaction is stored via POST /v1/memories. Memoid automatically extracts key facts from the conversation server-side — no LLM calls on your end.
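Because extraction happens server-side, the client only has to assemble the request body. If you end up storing several interaction types, a small payload builder (a hypothetical helper, not part of any SDK) keeps that shape in one place:

```python
def build_memory_payload(customer_id, query, response, extra_metadata=None):
    """Assemble the JSON body for POST /v1/memories.

    Memoid extracts facts from `messages` server-side, so the client
    just sends the raw conversation turns plus any metadata worth
    filtering on later (channel, ticket id, etc.).
    """
    metadata = {"type": "support_interaction"}
    if extra_metadata:
        metadata.update(extra_metadata)
    return {
        "messages": [
            {"role": "user", "content": query},
            {"role": "assistant", "content": response},
        ],
        "user_id": customer_id,
        "metadata": metadata,
    }
```

`store_interaction` above could then be reduced to a single `requests.post` call with this payload.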
Ticket Tracking with Metadata
```python
import time

def create_ticket(self, customer_id: str, issue: str):
    """Open a ticket and record it as a memory with ticket metadata."""
    ticket_id = f"TICKET-{int(time.time())}"
    requests.post(
        f"{MEMOID_BASE_URL}/memories",
        headers=self.headers,
        json={
            "messages": [{"role": "user", "content": f"Opened ticket {ticket_id}: {issue}"}],
            "user_id": customer_id,
            "metadata": {"type": "ticket", "ticket_id": ticket_id, "status": "open"},
        },
        timeout=10,
    )
    return ticket_id
```
Related Tutorials
- Customer Support Agent with Gemini — Same tutorial using Google Gemini
- Getting Started with Memoid
- Chatbot with Memory