The Vercel AI SDK integrates seamlessly with Observee, giving your applications access to Observee's tool ecosystem and custom MCP (Model Context Protocol) servers. This lets you combine advanced LLMs, streaming, and tool use in a unified developer experience.
Example: Connecting to an MCP Server with the AI SDK
Below is a complete example of how to connect to an MCP server using the Vercel AI SDK:
```typescript
import {
  experimental_createMCPClient as createMCPClient,
  streamText,
} from 'ai';
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

async function main() {
  try {
    // Define your MCP server endpoint.
    // Replace {customer_id} with your actual customer ID from the Observee dashboard.
    const url = new URL('https://mcp.observee.ai/customer/{customer_id}');

    // Create the MCP client with session support.
    const mcpClient = await createMCPClient({
      transport: new StreamableHTTPClientTransport(url),
    });

    // Discover available tools from the MCP server.
    const tools = await mcpClient.tools();
    console.log(`Connected to MCP server with ${Object.keys(tools).length} tools available`);

    // Use the tools in a streaming LLM call.
    const result = await streamText({
      model: openai('gpt-4-turbo'),
      tools, // pass the discovered tools to streamText
      prompt: 'Help me analyze the latest system metrics and provide insights',
      maxTokens: 1000,
    });

    // Handle the streaming response.
    for await (const textPart of result.textStream) {
      process.stdout.write(textPart);
    }

    // Always close the client when done.
    await mcpClient.close();
  } catch (error) {
    console.error('MCP integration failed:', error);

    // Common error scenarios:
    const message = error instanceof Error ? error.message : String(error);
    if (message.includes('404')) {
      console.log('Check your customer ID and ensure the MCP server is deployed');
    } else if (message.includes('authentication')) {
      console.log('Verify your authentication configuration');
    }
  }
}

main().catch(console.error);
```
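The example counts discovered tools with `Object.keys(tools)` because `mcpClient.tools()` resolves to an object keyed by tool name. A minimal sketch of that shape, with a made-up tool name and schema purely for illustration (the actual tools and their inputs come from your Observee MCP server):

```typescript
// Hypothetical sketch of the object returned by mcpClient.tools():
// each key is a tool name; each value carries a description and an
// execute function the model can invoke with validated arguments.
const tools = {
  get_metrics: {
    description: 'Fetch the latest system metrics (illustrative only)',
    execute: async (args: { window: string }) => ({ cpu: 0.42, window: args.window }),
  },
};

// Same counting pattern as in the main example.
console.log(`Connected with ${Object.keys(tools).length} tool(s) available`);
```

Passing this object straight to `streamText({ tools, ... })` is what lets the model decide when to call each tool during generation.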