c9ai

Universal AI assistant with vibe-based workflows, hybrid cloud+local AI, and comprehensive tool integration

# Changelog

All notable changes to C9 AI will be documented in this file.

## [2.2.5] - 2025-09-06

### UI/Slides
- Added Slides Library UI and API (`/api/slides`) to list and open generated decks.
- Added workflow template "Markdown → Web Slideshow" and generator script `scripts/md-to-slides.js`.

### Packaging
- Included `public/` in npm `files` so the updated UI and slides assets ship with the package.

## [2.2.4] - 2025-09-04

### Fixes
- JIT calculator: fixed method name collision causing `toLowerCase` error in `@calc` expressions.

### UI/UX
- RSS headlines: titles as H2 (no blue link), date + publisher line, pill-style Read link, full-width thumbnails, and tighter spacing.

### Docs
- Added `public/demo-lineup.html` printable demo handout.
- Added Gamma-compatible deck at `docs/slides/c9ai-gamma-deck.md`.

### Packaging
- No change to package contents (uses `files` whitelist). Verified with `npm pack`.

## [2.2.3] - 2025-09-04

### Packaging
- Added `files` whitelist in `package.json` to include only runtime code (`src/`, `bin/`, `scripts/`, `server/`, `README.md`, `CHANGELOG.md`).
- Added metadata: `repository`, `bugs`, `homepage`, `keywords`, `engines`, and `license`.
- Verified package with `npm pack`; reduced published tarball size by excluding dev artifacts and tests.
- Published as `c9ai@2.2.3`. Previous versions remain available (no breaking changes).

### Notes
- No functional code changes in this release; safe upgrade for new installs.
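The `files` whitelist introduced in 2.2.3 (and extended in 2.2.5 to include `public/`) can be sketched as a `package.json` fragment. The directory list comes from this changelog; the surrounding fields are illustrative only:

```json
{
  "name": "c9ai",
  "files": [
    "src/",
    "bin/",
    "scripts/",
    "server/",
    "public/",
    "README.md",
    "CHANGELOG.md"
  ]
}
```

A whitelist like this is typically verified with `npm pack --dry-run`, which lists exactly which files would be included in the published tarball.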
## [2.0.0] - 2025-08-04

### 🚀 Major Features Added

#### 🧠 Local AI Integration
- **Local LLM Support**: Full integration with Microsoft Phi-3, TinyLLaMA, and LLaMA models
- **Privacy-First Processing**: All AI reasoning can run locally with zero external API calls
- **Model Management System**: Download, install, remove, and manage local AI models
- **Smart Model Loading**: Automatic model initialization and caching

#### 🗣️ Natural Language Interface
- **Conversational Commands**: Talk naturally to c9ai - "compile my research paper"
- **System Command Understanding**: "list documents in directory" → `ls -la /path`
- **Intelligent Intent Recognition**: Natural language → structured @action format
- **Context-Aware Processing**: AI understands file paths, common tasks, and user patterns

#### ⚡ Enhanced Todo Management
- **Intelligent Todo Processing**: Plain-English todos automatically converted to executable actions
- **Multi-Mode Support**: Manual @action, natural language, and hybrid approaches
- **Smart Action Generation**: AI suggests appropriate commands based on the task description
- **Seamless Execution**: Natural-language todos can be executed directly

#### 🔄 Advanced Model Switching
- **Local ↔ Cloud Routing**: Intelligent switching between local and cloud AI
- **Graceful Fallbacks**: Local → Cloud → Manual command progression
- **Real-time Switching**: Change models mid-conversation without losing context
- **Performance Optimization**: Local models for speed, cloud for complexity

### 🛠️ New Commands

#### Model Management
```bash
models list              # Show available and installed models
models install <model>   # Download and install local AI models
models remove <model>    # Remove installed models to free space
models status            # Show disk usage and model information
```

#### Enhanced Todo Commands
```bash
todos add "natural language"   # AI converts to structured actions
todos execute                  # Interactive execution of actionable todos
todos actions                  # Show all actionable todos with intents
```

#### Interactive Natural Language
```bash
c9ai> compile my research paper           # Converts to: @action: compile research.tex
c9ai> list documents in /path/directory   # Executes: ls -la "/path/directory"
c9ai> open my budget spreadsheet          # Converts to: @action: open budget.xlsx
```

### 🔧 Technical Improvements

#### Architecture Enhancements
- **Modular AI Backend**: Support for multiple local LLM providers
- **Async Model Loading**: Non-blocking model initialization
- **Configuration Persistence**: Model preferences saved across sessions
- **Enhanced Error Handling**: Graceful fallbacks for all failure scenarios

#### Performance Optimizations
- **Lazy Loading**: Models loaded only when needed
- **Caching System**: Intelligent model and response caching
- **Pattern Recognition**: Fast local pattern matching before AI processing
- **Resource Management**: Efficient memory usage for large models

#### Developer Experience
- **Enhanced Logging**: Detailed interaction logs in `~/.c9ai/logs/`
- **Debug Support**: Optional debug output for troubleshooting
- **Extension Points**: Easy integration of new AI models and actions
- **Configuration Flexibility**: Customizable model selection and routing

### 📊 Privacy & Security

#### Local Processing
- **Zero External Dependencies**: Complete local AI processing option
- **Data Privacy**: User data never leaves the local machine in local mode
- **Transparent Processing**: Always shows which model (local or cloud) is being used
- **User Control**: Full control over when to use local vs. cloud processing

#### Intelligent Routing
- **Privacy-First Defaults**: Attempts local processing first when available
- **Explicit Consent**: Clear indication when routing to cloud APIs
- **Fallback Transparency**: The user always knows where processing happens
- **Opt-in Cloud**: Cloud processing only with explicit model selection

### 🎯 Use Case Expansions

#### Software Development
- Natural-language compilation commands
- Intelligent project file navigation
- Context-aware development task processing

#### Document Management
- Smart document operations via natural language
- Automatic file type detection and appropriate actions
- Intelligent search and organization commands

#### System Administration
- Natural-language system commands
- Intelligent resource monitoring
- Context-aware file and process management

### 🔄 Breaking Changes
- **Version Bump**: Major version increase to 2.0.0
- **New Dependencies**: Added `node-llama-cpp` for local AI support
- **Configuration Changes**: New model configuration in `~/.c9ai/config.json`
- **Command Additions**: New `models` command namespace

### 🐛 Bug Fixes
- **Initialization Race Conditions**: Fixed async initialization in the constructor
- **Configuration Loading**: Config is now loaded properly before command execution
- **Error Handling**: Improved error messages and recovery
- **Path Resolution**: Better handling of file paths with spaces

### 📦 Dependencies
- **Added**: `node-llama-cpp@^3.11.0` - local LLM integration
- **Updated**: Enhanced error handling across all existing dependencies
- **Maintained**: Full backward compatibility with existing tool integrations

---

## [1.0.1] - Previous Release

### Features
- Basic Claude and Gemini AI integration
- Todo management with GitHub Issues
- Simple @action execution system
- Interactive mode with AI switching

---

**Migration Guide**: Existing users can upgrade seamlessly. All existing functionality remains unchanged; new features are additive and optional.

**Next Release Preview**: Phase 2 will add a learning system, an analytics dashboard, and advanced context awareness.
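The 2.0.0 Breaking Changes mention a new model configuration in `~/.c9ai/config.json`. As a rough illustration of what such a file could contain, here is a hypothetical sketch; every field name and value below is an assumption for illustration, not confirmed by this changelog:

```json
{
  "defaultModel": "phi-3",
  "routing": {
    "preferLocal": true,
    "fallbackOrder": ["local", "cloud", "manual"]
  },
  "models": {
    "phi-3": { "type": "local", "path": "~/.c9ai/models/phi-3.gguf" },
    "claude": { "type": "cloud" }
  }
}
```

The shape mirrors the behavior described above: privacy-first defaults (`preferLocal`) and the Local → Cloud → Manual fallback progression. Consult the actual file written by c9ai for the real schema.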