Prerequisites

  • Python 3.10+ (this guide's examples use the Python SDK) or Node.js 18+
  • An Observee account at observee.ai
  • API key from your LLM provider (Anthropic, OpenAI, or Google)

Install the SDK

pip install mcp-agents
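
To confirm the install worked, try importing the package from the command line. Note that the PyPI package is mcp-agents, while the import name used throughout this guide is observee_agents:

python -c "import observee_agents"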

Get Your API Keys

  1. Observee API Key: Sign up at observee.ai → API Keys → Create new key
  2. LLM API Key: Get from Anthropic, OpenAI, or Google

Environment Setup

Create a .env file:
# Required
OBSERVEE_API_KEY=obs_your_api_key_here

# Choose one LLM provider
ANTHROPIC_API_KEY=your_key_here
OPENAI_API_KEY=your_key_here
GOOGLE_API_KEY=your_key_here
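
Later examples omit the keys and rely on the environment. If you run them as standalone scripts, one option is to load the .env file with python-dotenv (an optional helper, not part of the SDK; install it with pip install python-dotenv):

# Load variables from .env into the process environment
from dotenv import load_dotenv

load_dotenv()  # looks for a .env file in the current working directory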

Your First Tool-Powered Conversation

from observee_agents import chat_with_tools

# Simple conversation with tool access
result = chat_with_tools(
    message="Search for recent news about AI developments",
    provider="anthropic",
    observee_api_key="obs_your_key_here"
)

print("🤖 AI Response:", result["content"])
print("🔧 Tools Used:", len(result["tool_calls"]))

Streaming Responses

import asyncio
from observee_agents import chat_with_tools_stream

async def stream_example():
    async for chunk in chat_with_tools_stream(
        message="Check my emails and summarize them",
        provider="anthropic"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)

asyncio.run(stream_example())
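
If you also need the full text after the stream finishes, you can accumulate the content chunks as they arrive. This sketch builds only on the chunk format shown above:

import asyncio
from observee_agents import chat_with_tools_stream

async def stream_and_collect():
    parts = []
    async for chunk in chat_with_tools_stream(
        message="Check my emails and summarize them",
        provider="anthropic"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)
            parts.append(chunk["content"])
    return "".join(parts)  # complete response text

full_text = asyncio.run(stream_and_collect())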

Conversation with Memory

from observee_agents import chat_with_tools

# First message
result1 = chat_with_tools(
    message="Search for emails about the product launch",
    session_id="assistant_001"
)

# Follow-up (remembers context)
result2 = chat_with_tools(
    message="What date was mentioned in those emails?",
    session_id="assistant_001"  # Same session = memory
)
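
Conversely, passing a different session_id starts a fresh conversation with no shared context. Session IDs are arbitrary strings you choose; "assistant_002" below is just an example:

# New session = no memory of the earlier messages
result3 = chat_with_tools(
    message="What date was mentioned in those emails?",
    session_id="assistant_002"  # different session, fresh context
)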

🎉 Congratulations! You just:
  1. Connected to 1000+ tools - Gmail, YouTube, Linear, Slack, and more
  2. Used the LLM provider of your choice - Anthropic, OpenAI, or Google
  3. Created conversations with memory - Context preserved across messages

Next Steps

Need Help?