# MCP Chat Widget Architecture

## Overview

This document describes the architecture for a TypeScript-based chat widget that integrates with MCP (Model Context Protocol) servers to provide rich UI experiences while optimizing LLM token usage.

### System Components

```
┌─────────────┐     ┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│    User     │────▶│ Chat Widget │────▶│    Agent    │────▶│ MCP Server  │
└─────────────┘     └─────────────┘     └─────────────┘     └─────────────┘
                           │                   │                   │
                           ▼                   ▼                   ▼
                    ┌─────────────┐     ┌─────────────┐     ┌─────────────┐
                    │  Component  │     │   Router    │     │  Response   │
                    │  Registry   │     │   Logic     │     │  Generator  │
                    └─────────────┘     └─────────────┘     └─────────────┘
```

## Core Principles

### 1. Three-Tier Data Separation

Inspired by OpenAI's Chat Apps SDK, we separate MCP responses into three parts:

- **`structuredContent`**: Minimal, token-efficient data for the LLM
- **`content`**: Optional text/markdown for model narration
- **`_meta`**: Rich data for UI rendering (never sent to LLM)

This separation can reduce LLM costs by 80-90% while maintaining rich UI experiences.

### 2. Intelligent Routing

Not all responses need LLM processing. The system intelligently routes:

- **Direct to Client**: Simple displays, forms, product grids
- **Through LLM**: Complex reasoning, comparisons, analysis
- **Hybrid**: LLM provides narrative, UI renders from `_meta`

### 3. Component-Based UI Rendering

A registry system maps response types to TypeScript components:

```typescript
interface ComponentRegistry {
  register(type: string, component: UIComponent): void;
  render(response: MCPResponse): HTMLElement;
}
```

### 4. TypeScript-First Implementation

- Type safety throughout the system
- Interfaces for all data structures
- No React dependency (pure TypeScript/DOM)
- Minimal bundle size for Shopify marketplace

## Component Interaction

### Chat Widget ↔ Agent Communication

```typescript
// Widget sends user input or form submission
await agent.callTool('process_input', {
  type: 'form_submission' | 'user_message' | 'ui_action',
  data: payload
});

// Agent responds with MCPResponse
{
  uiType: 'form-lead' | 'shopify-products' | 'generic',
  structuredContent: { /* minimal */ },
  _meta: { /* rich */ }
}
```

### MCP Server Protocol

The MCP server:

1. Receives tool calls from the agent
2. Processes business logic
3. Returns structured responses with UI hints
4. Maintains stateless operation

### Response Handling Strategies

```typescript
class ResponseHandler {
  handle(response: MCPResponse): void {
    if (this.shouldRenderDirectly(response)) {
      // Bypass LLM, render immediately
      this.renderComponent(response._meta);
    } else if (this.needsLLMProcessing(response)) {
      // Send structuredContent to LLM
      this.sendToLLM(response.structuredContent);
    } else {
      // Hybrid approach
      this.hybridRender(response);
    }
  }
}
```

## Decision Points

### When to Route Through LLM vs Direct to Client

**Direct to Client:**

- Form displays
- Product grids (10 items)
- Simple confirmations
- Status displays
- Static content

**Through LLM:**

- Complex queries requiring reasoning
- Comparisons between items
- Explanations needed
- Multi-step workflows
- Natural language generation
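The routing decision itself can be expressed as a small pure function. The sketch below is illustrative only: the `uiType` values `'form-lead'` and `'shopify-products'` come from the widget/agent example earlier in this document, while `'status'`, `'confirmation'`, and the `routeResponse` name are assumptions rather than part of the shipped widget.

```typescript
// A minimal sketch of the routing predicates behind ResponseHandler.
// The uiType whitelist below is an illustrative assumption, not the
// widget's actual routing table.

type RouteTarget = 'direct' | 'llm' | 'hybrid';

interface MCPResponse {
  uiType: string;
  structuredContent: Record<string, unknown>;
  content?: string;
  _meta?: Record<string, unknown>;
}

// uiTypes that are safe to render without involving the LLM
const DIRECT_RENDER_TYPES = new Set(['form-lead', 'shopify-products', 'status', 'confirmation']);

function routeResponse(response: MCPResponse): RouteTarget {
  // Simple displays, forms, and product grids bypass the LLM entirely
  if (DIRECT_RENDER_TYPES.has(response.uiType) && response._meta) {
    return 'direct';
  }
  // Responses carrying both narration text and rich _meta use the hybrid path:
  // the LLM narrates from structuredContent, the UI renders from _meta
  if (response.content && response._meta) {
    return 'hybrid';
  }
  // Everything else (comparisons, explanations, multi-step reasoning)
  // goes through the LLM with only the token-efficient structuredContent
  return 'llm';
}
```

Keeping the decision in one predicate like this leaves the `ResponseHandler` shown earlier free of per-type branching, so adding a new routing rule stays a one-line change.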
### Form Submission Handling

1. **Client-side validation** first
2. **Tokenization** for sensitive data (credit cards)
3. **Direct submission** to MCP server
4. **Confirmation** through agent
5. **Context update** for conversation continuity

### State Management Approach

```typescript
class StateManager {
  // Conversation context (persisted)
  private conversationState: ConversationState;

  // Form sessions (temporary)
  private formSessions: Map<string, FormSession>;

  // MCP response cache (performance)
  private responseCache: Map<string, CachedResponse>;

  // Component state (UI specific)
  private componentStates: Map<string, ComponentState>;
}
```

## Performance Optimizations

1. **Response Caching**: Cache MCP responses for back navigation
2. **Lazy Component Loading**: Load components on demand
3. **Progressive Rendering**: Stream large datasets
4. **Token Optimization**: Send minimal data to LLM
5. **Bundle Size**: Keep under 50KB for Shopify compliance

## Security Considerations

1. **Sensitive Data Handling**: Never send PII/payment data to LLM
2. **Input Sanitization**: Validate all user inputs
3. **XSS Prevention**: Sanitize rendered content
4. **CORS Configuration**: Proper API endpoint configuration
5. **Token Management**: Secure storage of API tokens

## Extensibility

The architecture supports:

- Custom component types via registry
- New MCP response types
- Custom routing rules
- Theme customization
- Plugin system for field types

## Error Handling

```typescript
class ErrorBoundary {
  handleComponentError(error: Error, component: string): void {
    // Fallback to simple text display
    this.renderFallback(error.message);

    // Log to monitoring service
    this.logError(error, component);
  }

  handleMCPError(error: MCPError): void {
    // Show user-friendly message
    // Retry with exponential backoff
    // Fallback to cached data if available
  }
}
```

## Next Steps

1. Review `MCP_RESPONSE_STRUCTURE.md` for detailed response schemas
2. See `CHAT_WIDGET_IMPLEMENTATION.md` for implementation details
3. Check `COMPONENT_REGISTRY.md` for component system
4. Read `ROUTING_LOGIC.md` for routing implementation
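For orientation before reading those documents, the sketch below shows what the plugin-style extensibility described above could look like in practice. It is a sketch only: `FormLeadComponent`, the `UIComponent.render` signature, and the `_meta.fields` shape are assumptions for illustration, not the published API.

```typescript
// Illustrative sketch: registering a custom component type with the registry.
// FormLeadComponent and the _meta.fields shape are assumptions.

interface MCPResponse {
  uiType: string;
  structuredContent: Record<string, unknown>;
  _meta?: { fields?: Array<{ name: string; label: string }> };
}

interface UIComponent {
  render(response: MCPResponse): HTMLElement;
}

class FormLeadComponent implements UIComponent {
  render(response: MCPResponse): HTMLElement {
    const form = document.createElement('form');
    // Build the UI from the rich _meta payload; this data never reaches the LLM
    for (const field of response._meta?.fields ?? []) {
      const label = document.createElement('label');
      label.textContent = field.label;
      const input = document.createElement('input');
      input.name = field.name;
      label.appendChild(input);
      form.appendChild(label);
    }
    return form;
  }
}

// A plugin would register the component once at startup, e.g.:
// registry.register('form-lead', new FormLeadComponent());
```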