Step 1: Get your API Keys and Set up Marketplace

Go to https://app.observee.ai and create an API key for your profile. Then click Connect and enable the servers you want to use.

Get Your API Keys

  1. Observee API Key: Sign up at observee.ai → API Keys → Create new key
  2. LLM API Key: Get from Anthropic, OpenAI, or Google

Environment Setup

Create a .env file:
# Required
OBSERVEE_API_KEY=obs_your_api_key_here

# Choose one LLM provider
ANTHROPIC_API_KEY=your_key_here
OPENAI_API_KEY=your_key_here
GOOGLE_API_KEY=your_key_here
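
If you would rather load these values from Python than export them in your shell, here is a minimal sketch assuming you install the python-dotenv package (pip install python-dotenv); it is not part of the Observee SDK itself.

import os
from dotenv import load_dotenv

# Read key=value pairs from .env in the current directory into os.environ
load_dotenv()

observee_key = os.environ["OBSERVEE_API_KEY"]
llm_key = os.getenv("ANTHROPIC_API_KEY")  # or OPENAI_API_KEY / GOOGLE_API_KEY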

Step 2: Authenticate & Get Your Client ID

This step is optional if you are not setting up an OAuth server.
If you are using an OAuth server such as Gmail, you can authenticate directly with our Auth SDK to get a client ID for accessing its tools. Learn more in the Auth SDK Overview.

2.1 Install the Auth SDK
pip install mcp-agents
2.2 Start the auth flow for whichever OAuth server you have enabled
from observee_agents import call_mcpauth_login

# Start authentication flow (optionally provide your own client_id)
response = call_mcpauth_login(
    auth_server="gmail",
    client_id="<client_id (uuid)>"  # Optional: use your own UUID
)
print(f"Visit: {response['url']}")

# After you visit the URL and authenticate, you'll receive:
# {"success":true,"client_id":"<client_id (uuid)>"}
This will open a browser window for OAuth authentication. After a successful login, you’ll receive a client_id to use in all subsequent API calls. You can optionally provide your own UUID as the client_id.

Tip: You can reuse the same client_id to authenticate multiple services. Each service you authenticate adds its tools to that client_id’s available toolset.
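
For example, to add a second service’s tools to the same client_id, run the login flow again with that service as the auth_server (the "slack" value below is only illustrative; use any OAuth server you have enabled in the marketplace):

from observee_agents import call_mcpauth_login

# Authenticate another enabled service against the same client_id so its
# tools are added to the same toolset.
response = call_mcpauth_login(
    auth_server="slack",  # Illustrative; use a server you have enabled
    client_id="<client_id (uuid)>"  # Same client_id as before
)
print(f"Visit: {response['url']}")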

Step 3: Install the Agents SDK

pip install mcp-agents

Your First Tool-Powered Conversation

from observee_agents import chat_with_tools

# Use the client_id from authentication
result = chat_with_tools(
    message="Search for recent news about AI developments",
    provider="anthropic",
    client_id="<client_id (uuid)>"  # Use the client_id from Step 2
)

print("🤖 AI Response:", result["content"])
print("🔧 Tools Used:", len(result["tool_calls"]))

Streaming Responses

import asyncio
from observee_agents import chat_with_tools_stream

async def stream_example():
    async for chunk in chat_with_tools_stream(
        message="Check my emails and summarize them",
        provider="anthropic",
        client_id="<client_id (uuid)>"  # Use the client_id from Step 2
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)

asyncio.run(stream_example())
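
If you also want the complete reply after the stream finishes, you can accumulate the content chunks as they arrive. This sketch uses only the chunk fields shown above:

import asyncio
from observee_agents import chat_with_tools_stream

async def stream_and_collect():
    # Print each content chunk as it arrives and keep a copy so the full
    # reply is available as one string after the stream ends.
    parts = []
    async for chunk in chat_with_tools_stream(
        message="Check my emails and summarize them",
        provider="anthropic",
        client_id="<client_id (uuid)>"  # client_id from Step 2
    ):
        if chunk["type"] == "content":
            parts.append(chunk["content"])
            print(chunk["content"], end="", flush=True)
    return "".join(parts)

full_reply = asyncio.run(stream_and_collect())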

Conversation with Memory

from observee_agents import chat_with_tools

# First message
result1 = chat_with_tools(
    message="Search for emails about the product launch",
    client_id="<client_id (uuid)>",
    session_id="assistant_001"
)

# Follow-up (remembers context)
result2 = chat_with_tools(
    message="What date was mentioned in those emails?",
    client_id="<client_id (uuid)>",
    session_id="assistant_001"  # Same session = memory
)
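
Conversely, a new session_id starts with a clean slate, so the earlier email context is not carried over (sketch):

# Different session = no shared memory; this question has no email context
result3 = chat_with_tools(
    message="What date was mentioned in those emails?",
    client_id="<client_id (uuid)>",
    session_id="assistant_002"  # New session, fresh context
)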

🎉 Congratulations! You can now:
  1. Connect to 1000+ tools - Gmail, YouTube, Linear, Slack, and more
  2. Use multiple LLM providers - Anthropic, OpenAI, Gemini, and Groq
  3. Create conversations with memory - Context preserved across messages
