Knowledge Graphs for AI

Extract entities and relationships from conversations to build a knowledge graph that powers smarter AI interactions

What You’ll Build

A knowledge-graph-enhanced AI system that:

  • Extracts entities (people, places, organizations) from conversations
  • Identifies relationships between entities
  • Queries the graph for connected information
  • Combines vector search with graph traversal

Why Knowledge Graphs?

Vector search finds semantically similar content, but knowledge graphs capture structured relationships:

Approach      | Query                        | Result
Vector Search | “Who does John work with?”   | Finds memories mentioning “John” and “work”
Graph Query   | “Who does John work with?”   | Traverses relationships to find colleagues
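
To make the difference concrete, here is a sketch of the structured result a graph query could return for the question above. The exact payload shape is an assumption, modeled on the fields the implementation below reads (entities with name/type, relationships with source/relation/target):

# Hypothetical /graph/query result for “Who does John work with?”
# Field names mirror those used by the implementation below.
{
    "entities": [
        {"name": "John", "type": "person"},
        {"name": "Sarah", "type": "person"},
        {"name": "Acme", "type": "organization"}
    ],
    "relationships": [
        {"source": "John", "relation": "works_at", "target": "Acme"},
        {"source": "Sarah", "relation": "works_at", "target": "Acme"}
    ]
}

From a result like this, the assistant can conclude that John and Sarah are colleagues at Acme even if no single memory states it directly.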

Prerequisites

  • Python 3.8+
  • OpenAI API key
  • Memoid API key

Setup

pip install openai requests
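
The implementation reads both API keys from environment variables, so set them before running (shell commands shown as an example; adjust for your platform):

export MEMOID_API_KEY="your-memoid-api-key"
export OPENAI_API_KEY="your-openai-api-key"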

Implementation

import os
import requests
from openai import OpenAI

MEMOID_API_KEY = os.environ.get("MEMOID_API_KEY")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")
MEMOID_BASE_URL = "https://api.memoid.dev/v1"

class KnowledgeGraphAssistant:
    def __init__(self):
        self.openai = OpenAI(api_key=OPENAI_API_KEY)
        self.headers = {
            "Authorization": f"Bearer {MEMOID_API_KEY}",
            "Content-Type": "application/json"
        }

    def extract_knowledge(self, text, store=True):
        """Extract entities and relationships from text."""
        response = requests.post(
            f"{MEMOID_BASE_URL}/graph/extract",
            headers=self.headers,
            json={"text": text, "store": store}
        )
        return response.json()

    def query_graph(self, entity, depth=2):
        """Query the knowledge graph starting from an entity."""
        response = requests.post(
            f"{MEMOID_BASE_URL}/graph/query",
            headers=self.headers,
            json={"entity": entity, "depth": depth}
        )
        return response.json()

    def search_memories(self, query, user_id, limit=5):
        """Search vector memories."""
        response = requests.post(
            f"{MEMOID_BASE_URL}/memories/search",
            headers=self.headers,
            json={"query": query, "user_id": user_id, "limit": limit}
        )
        return response.json().get("results", [])

    def hybrid_query(self, query, user_id):
        """Combine vector search with graph traversal."""
        memories = self.search_memories(query, user_id)
        extraction = self.extract_knowledge(query, store=False)
        entities = extraction.get("entities", [])

        graph_context = []
        for entity in entities:
            result = self.query_graph(entity["name"], depth=2)
            if result.get("entities") or result.get("relationships"):
                graph_context.append(result)

        return {
            "memories": memories,
            "graph_context": graph_context
        }

    def chat(self, user_id, message):
        """Chat with hybrid memory and graph context."""
        context = self.hybrid_query(message, user_id)

        memory_text = "\n".join(
            f"- {m['memory']}" for m in context["memories"]
        ) or "No relevant memories."

        graph_text = ""
        for g in context["graph_context"]:
            for e in g.get("entities", []):
                graph_text += f"Entity: {e['name']} ({e['type']})\n"
            for r in g.get("relationships", []):
                graph_text += f"  {r['source']} --{r['relation']}--> {r['target']}\n"

        system_prompt = f"""You are an assistant with memories and a knowledge graph.

Memories:
{memory_text}

Knowledge Graph:
{graph_text or "No graph connections."}

Use both sources for comprehensive answers."""

        response = self.openai.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": message}
            ]
        )

        answer = response.choices[0].message.content

        # Store and extract knowledge
        requests.post(
            f"{MEMOID_BASE_URL}/memories",
            headers=self.headers,
            json={
                "messages": [
                    {"role": "user", "content": message},
                    {"role": "assistant", "content": answer}
                ],
                "user_id": user_id,
                "extract_graph": True
            }
        )

        return answer


def main():
    assistant = KnowledgeGraphAssistant()
    user_id = "graph_user"

    print("Knowledge Graph Assistant")
    print("Type 'quit' to exit\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == "quit":
            break
        if user_input:
            response = assistant.chat(user_id, user_input)
            print(f"Assistant: {response}\n")


if __name__ == "__main__":
    main()

Key Concepts

Entity Extraction

The system automatically identifies entities from conversations:

  • People: Names, roles, titles
  • Organizations: Companies, teams, departments
  • Locations: Cities, countries, addresses
  • Concepts: Products, technologies, topics
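
For example, calling the extract_knowledge method from the Implementation section on a single sentence might look like this; the printed output is illustrative, and the response shape is an assumption based on the fields the rest of the tutorial consumes:

assistant = KnowledgeGraphAssistant()

# Extract without persisting, just to inspect what the API finds
extraction = assistant.extract_knowledge(
    "John is a product manager at Acme in New York",
    store=False
)

for entity in extraction.get("entities", []):
    print(f"{entity['name']} ({entity['type']})")

# Possible output (illustrative):
# John (person)
# Acme (organization)
# New York (location)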

Relationship Detection

Common relationship types detected:

Relationship | Example
works_at     | “John works at Acme”
reports_to   | “John reports to Sarah”
located_in   | “Acme is in New York”
knows        | “John knows Sarah”
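
Once stored, these relationships can be read back with query_graph and printed as source --relation--> target triples, the same formatting the chat method uses when building its prompt. A minimal sketch, continuing with the assistant instance from the extraction example above (output is illustrative):

result = assistant.query_graph("John", depth=2)

for r in result.get("relationships", []):
    print(f"{r['source']} --{r['relation']}--> {r['target']}")

# John --works_at--> Acme
# John --reports_to--> Sarah
# Acme --located_in--> New York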

Hybrid Queries

Combining vector search with graph traversal provides:

  1. Semantic context from similar memories
  2. Structured relationships from the knowledge graph
  3. Multi-hop reasoning through relationship chains
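
In code, hybrid_query returns both sources in one dictionary, so you can inspect what each contributed before it is folded into the prompt. A minimal sketch, again reusing the assistant instance from the earlier example (the user_id is just an example value):

context = assistant.hybrid_query("Who does John work with?", user_id="graph_user")

print(f"{len(context['memories'])} memories from vector search")
for g in context["graph_context"]:
    for r in g.get("relationships", []):
        print(f"{r['source']} --{r['relation']}--> {r['target']}")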

Use Cases

  • Enterprise assistants: Track org structure and projects
  • Research tools: Map connections between concepts
  • CRM systems: Understand customer relationships
  • Personal assistants: Remember people and connections

Next Steps

  • Add custom entity types for your domain
  • Implement graph visualization
  • Build relationship inference rules
  • Add temporal reasoning for changing relationships
