# AI Workflow Utils
<div align="center">






**The Ultimate AI-Powered Development Workflow Automation Platform**
*Streamline your development process with intelligent Jira ticket creation, AI-powered code reviews & pull request creation with custom template support*
<img width="1920" height="1188" alt="AI-Workflow-Utils-08-01-2025_02_33_PM" src="https://github.com/user-attachments/assets/a96cff4b-caac-4e8c-af54-194a83f80f9b" />
</div>
## **NEW IN v1.x.x - Game-Changing Features!**
### **Feature #1: AI-Powered Jira Ticket Creation**
Create professional Jira tickets (Tasks, Bugs, Stories) using AI with multiple provider support:
- **OpenAI Compatible APIs**: GPT-4, Claude, and other cloud providers
- **Local AI with Ollama LLaVA**: Complete privacy with local image analysis
- **Smart Image Analysis**: Upload screenshots and get detailed issue descriptions
- **Real-time Streaming**: Watch AI generate content live
- **Professional Templates**: Auto-formatted with proper sections and acceptance criteria
- **Direct Jira Integration**: Creates tickets instantly with your access token
### **Feature #2: AI-Powered Pull Request Creation**
Revolutionary AI-powered pull request creation for Atlassian Bitbucket:
- **Intelligent PR Generation**: AI analyzes commit messages to create professional PR titles and descriptions
- **Smart Commit Analysis**: Automatically determines PR type (feat/fix/chore) based on commit patterns
- **Real-time Streaming**: Watch AI generate PR content live with streaming updates
- **Multi-Model Support**: Uses Ollama for local AI processing with privacy
- **Editable Previews**: Review and edit AI-generated content before creating the PR
- **Smart Persistence**: Remembers project and repository settings for faster workflow
### **Feature #3: AI-Powered Code Review**
Revolutionary AI-powered pull request reviews for Atlassian Bitbucket:
- **Intelligent Code Analysis**: AI reviews your code changes
- **Smart Suggestions**: Get actionable improvement recommendations
- **Multi-Model Support**: OpenAI Compatible APIs + Ollama for flexibility
- **Coming Soon**: Direct comment integration - AI adds review comments directly to your PRs
### **Feature #4: Real-time Logs & Monitoring**
Comprehensive logging and monitoring system for troubleshooting and system insights:
- **Real-time Log Streaming**: Live view of application logs with automatic updates
- **Advanced Filtering**: Filter logs by level (Error, Warn, Info, Debug) and search by content
- **Log History**: Access historical logs with pagination and date filtering
- **Syntax Highlighting**: Color-coded log levels for easy identification
- **Log Management**: Automatic log rotation and size management
- **Debug Mode**: Enable detailed debug logging for troubleshooting
- **Responsive Design**: Access logs from any device with mobile-friendly interface
## Quick Start Guide
### **Step 1 (Recommended): Ollama Setup for Local AI**
```bash
# Install Ollama (if you want local AI processing)
# macOS
brew install ollama
# Linux
curl -fsSL https://ollama.com/install.sh | sh
# Windows - Download from https://ollama.com/
# Download the LLaVA model for image analysis
ollama pull llava
# Start Ollama service
ollama serve
```
Then configure Ollama as your AI provider in the web interface.
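Before doing so, you can optionally confirm that Ollama is reachable by querying its local API (port 11434 is Ollama's default):
```bash
# Verify the Ollama service is running and the LLaVA model is installed
curl http://localhost:11434/api/tags
# The JSON response should list "llava" among the available models
```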
### **Step 2: Installation**
```bash
npm install -g ai-workflow-utils
```
### **Step 3: Launch the Application**
```bash
# Start the application directly
ai-workflow-utils
```
The application starts immediately and is available at `http://localhost:3000`.
### **Step 4: (Optional) Install as Startup Service**
For production use or to run automatically on system boot:
```bash
ai-workflow-utils startup install
```
The service will now start automatically on boot and can be accessed at `http://localhost:3000`.
**Startup Service Management:**
```bash
ai-workflow-utils startup status # Check service status
ai-workflow-utils startup start # Start the service
ai-workflow-utils startup stop # Stop the service
ai-workflow-utils startup uninstall # Remove startup service
```
**Supported Platforms:**
- **macOS**: Uses LaunchAgents (user-level service)
- **Windows**: Uses Windows Service Manager
- **Linux**: Uses systemd
For detailed startup service documentation, see [STARTUP.md](STARTUP.md)
### **Step 5: Configure Using the Settings Page**
All configuration is managed through the web-based settings page:
- Visit [`http://localhost:3000/settings/environment`](http://localhost:3000/settings/environment)
- Configure your AI provider (Anthropic Claude, OpenAI GPT, Google Gemini, Ollama)
- Set up Jira integration (URL, API token)
- Configure repository provider (Bitbucket)
- Set up issue tracking (Jira, etc.)
All changes are saved to `~/.ai-workflow-utils/environment.json` and persist across upgrades.
**No manual .env setup required!**
### **Step 6: (Optional) Install as a Progressive Web App (PWA)**
**AI Workflow Utils is a fully-featured PWA!** Install it as a native app for the best experience:
**Desktop Installation:**
1. Open `http://localhost:3000` in Chrome or Edge
2. Look for the "Install" button in the address bar
3. Click "Install" to add AI Workflow Utils to your desktop
4. Launch directly from your desktop/dock - no browser needed!
**PWA Benefits:**
- **Faster Loading**: Cached resources for instant startup
- **Native Feel**: Works like a desktop app
- **Auto Updates**: Always get the latest features
- **Offline Ready**: Basic functionality works without internet
- **Focused Experience**: No browser distractions
<details>
<summary><strong>Feature Deep Dive</strong></summary>
### **AI Jira Ticket Creation**
**What makes it special:**
- **Dual AI System**: Primary cloud AI with local Ollama fallback
- **Image Intelligence**: Upload screenshots and get detailed bug reports
- **Smart Templates**: Automatically formats content based on issue type
- **Real-time Generation**: Watch AI create your tickets live
**Example Usage:**
1. Navigate to "Create Jira"
2. Describe your issue: *"Login button doesn't work on mobile"*
3. Upload a screenshot (optional)
4. Select issue type (Bug/Task/Story)
5. Watch AI generate professional content
6. Review and create ticket directly in Jira
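
If you prefer to script this flow instead of using the UI, the same preview can be requested against the REST endpoint documented in the API section below (a minimal sketch; adjust host and payload to your setup):
```bash
# Generate an AI ticket preview for a bug report (see API Documentation below)
curl -X POST http://localhost:3000/api/jira/preview \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Login button does not work on mobile",
    "images": [],
    "issueType": "Bug"
  }'
```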
**AI Providers Supported:**
- **OpenAI GPT-4** (with vision)
- **Anthropic Claude** (with vision)
- **Any OpenAI-compatible API**
- **Ollama LLaVA** (local, private)
### **AI-Powered Pull Request Creation**
**Revolutionary PR Generation:**
- **Smart Commit Analysis**: AI analyzes your commit messages to understand the changes
- **Automatic Type Detection**: Determines if changes are features, fixes, or chores
- **Professional Formatting**: Generates conventional commit-style titles (feat/fix/chore)
- **Streaming Generation**: Watch AI create content in real-time with live updates
- **Local AI Processing**: Uses Ollama for complete privacy and offline capability
**How it works:**
1. Navigate to "Create PR"
2. Enter project key, repository slug, ticket number, and branch name
3. Click "Preview" to start AI generation
4. Watch AI analyze commits and generate title/description in real-time
5. Edit the generated content if needed
6. Click "Create Pull Request" to submit to Bitbucket
**AI Features:**
- **Commit Message Analysis**: Extracts meaningful information from commit history
- **Smart Categorization**: Automatically prefixes with feat/fix/chore based on content
- **Ticket Integration**: Includes ticket numbers in standardized format
- **Editable Previews**: Full control over final content before submission
- **Persistent Settings**: Remembers project and repo settings for faster workflow
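
For automation, the same preview can be triggered from the command line against the streaming endpoint documented in the API section below (a sketch; the field values are examples):
```bash
# Stream an AI-generated PR preview as Server-Sent Events (-N disables curl buffering)
curl -N -X POST http://localhost:3000/api/pr/stream-preview \
  -H "Content-Type: application/json" \
  -d '{
    "projectKey": "PROJ",
    "repoSlug": "my-repo",
    "ticketNumber": "PROJ-123",
    "branchName": "feature/my-branch"
  }'
```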
### **AI Code Review**
**Revolutionary Code Review:**
- **Context-Aware Analysis**: AI understands your codebase
- **Security Scanning**: Identifies potential vulnerabilities
- **Performance Optimization**: Suggests efficiency improvements
- **Best Practices**: Enforces coding standards
**How it works:**
1. Open a pull request in Bitbucket
2. Navigate to "GitStash Review"
3. Enter PR details
4. AI analyzes code changes
5. Get detailed review with suggestions
6. *Coming Soon*: Direct comment integration
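
The review step can also be invoked directly via the review endpoint documented in the API section below (a sketch; the `reviewType` value is one example):
```bash
# Request an AI review of an existing Bitbucket pull request
curl -X POST http://localhost:3000/api/pr/review \
  -H "Content-Type: application/json" \
  -d '{
    "repoUrl": "https://bitbucket.company.com/projects/PROJ/repos/repo",
    "pullRequestId": "123",
    "reviewType": "security"
  }'
```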
### **Real-time Logs & Monitoring**
**Comprehensive System Monitoring:**
- **Live Log Streaming**: Real-time log updates without page refresh
- **Multi-level Filtering**: Filter by Error, Warn, Info, Debug levels
- **Smart Search**: Full-text search across all log entries
- **Historical Access**: Browse past logs with pagination
- **Performance Insights**: Monitor API calls, response times, and system health
**How it works:**
1. Navigate to "Logs" in the web interface
2. Select log level filters (All, Error, Warn, Info, Debug)
3. Use search to find specific entries or error messages
4. View real-time updates as the system operates
5. Access historical logs for troubleshooting past issues
**Monitoring Features:**
- **Error Tracking**: Immediate visibility into system errors
- **API Monitoring**: Track AI provider calls and response times
- **User Activity**: Monitor feature usage and workflow patterns
- **System Health**: Resource usage and performance metrics
- **Debug Support**: Detailed logging for development and troubleshooting
</details>
<!-- Manual environment setup is deprecated. All configuration should be done via the web-based settings page. -->
<details>
<summary><strong>CLI Commands (For Advanced Users)</strong></summary>
### **Setup and Configuration**
```bash
# Interactive setup wizard
ai-workflow-setup
# Check configuration
ai-workflow-utils --config
# Test connections
ai-workflow-utils --test
```
### **Development Commands**
```bash
# Start in development mode
ai-workflow-utils --dev
# Enable debug logging
ai-workflow-utils --debug
# Specify custom port
ai-workflow-utils --port 8080
# View logs in real-time
ai-workflow-utils --logs
# Clear log files
ai-workflow-utils --clear-logs
```
### **Ollama Management**
```bash
# Check Ollama status
ai-workflow-utils --ollama-status
# Download recommended models
ai-workflow-utils --setup-ollama
# List available models
ollama list
```
</details>
<details>
<summary><strong>Advanced Configuration (For Developers)</strong></summary>
### **AI Provider Fallback System**
The automatic fallback order is:
1. OpenAI Compatible API (primary)
2. Ollama LLaVA (local fallback)
3. Error handling with user notification (as sketched below)
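
Conceptually, the fallback behaves like the sketch below. This is illustrative only, not the project's actual implementation; the provider objects are hypothetical:
```javascript
// Illustrative sketch: try each configured provider in order, then report failure.
// "providers" is a hypothetical array of objects with a name and a generate() method.
async function generateWithFallback(providers, prompt) {
  const errors = [];
  for (const provider of providers) {
    try {
      return await provider.generate(prompt); // first provider that succeeds wins
    } catch (err) {
      errors.push(`${provider.name}: ${err.message}`); // remember why it failed
    }
  }
  // Every provider failed: surface a combined error so the UI can notify the user
  throw new Error(`All AI providers failed - ${errors.join('; ')}`);
}
```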
### **Custom Model Configuration**
```env
# For different OpenAI-compatible providers:
OPENAI_COMPATIBLE_MODEL=gpt-4-vision-preview # OpenAI
OPENAI_COMPATIBLE_MODEL=claude-3-sonnet-20240229 # Anthropic
OPENAI_COMPATIBLE_MODEL=llama-2-70b-chat # Custom API
# For Ollama local models:
OLLAMA_MODEL=llava:13b # Larger model for better quality
OLLAMA_MODEL=llava:7b # Faster, smaller model
OLLAMA_MODEL=codellama:7b # Code-focused model
```
### **Performance Tuning**
```env
# Streaming configuration
STREAM_CHUNK_SIZE=1024
STREAM_TIMEOUT=60000
# Rate limiting
API_RATE_LIMIT=100
API_RATE_WINDOW=900000
# File upload limits
MAX_FILE_SIZE=50MB
ALLOWED_FILE_TYPES=jpg,jpeg,png,gif,mp4,mov,pdf,doc,docx
# Logging configuration
LOG_LEVEL=info # error, warn, info, debug
LOG_MAX_SIZE=10MB # Maximum log file size
LOG_MAX_FILES=5 # Number of rotated log files
LOG_RETENTION_DAYS=30 # Days to keep log files
ENABLE_REQUEST_LOGGING=true # Log all HTTP requests
```
</details>
<details>
<summary><strong>Production Deployment (For DevOps)</strong></summary>
### **Docker Deployment**
```bash
# Build the application
npm run build
# Create Docker image
docker build -t ai-workflow-utils .
# Run container
docker run -p 3000:3000 --env-file .env ai-workflow-utils
```
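
If the repository does not ship a Dockerfile of its own, a minimal one for a Node.js app like this could look roughly as follows (a sketch, assuming the standard `npm run build` step and port 3000; adapt the entrypoint to the real project layout):
```dockerfile
# Illustrative example Dockerfile - adjust paths and entrypoint to the actual project
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci                # install all dependencies (the build needs devDependencies)
COPY . .
RUN npm run build         # same build step as shown above
EXPOSE 3000
CMD ["npm", "start"]      # assumes a "start" script in package.json
```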
### **PM2 Process Management**
```bash
# Install PM2
npm install -g pm2
# Start application
pm2 start ecosystem.config.js
# Monitor
pm2 monit
# View logs
pm2 logs ai-workflow-utils
```
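
The `ecosystem.config.js` referenced above is not shown in this README; a minimal example (field values are assumptions, adjust to your environment) might look like:
```javascript
// ecosystem.config.js - illustrative example, not the file shipped with the project
module.exports = {
  apps: [
    {
      name: 'ai-workflow-utils',
      script: 'ai-workflow-utils', // globally installed CLI entrypoint (assumption)
      env: {
        NODE_ENV: 'production',
        PORT: 3000,                // assumed default port
      },
      autorestart: true,
      max_restarts: 10,
    },
  ],
};
```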
### **Nginx Reverse Proxy**
```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_cache_bypass $http_upgrade;
    }
}
```
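
Because the PR preview endpoint streams Server-Sent Events, you may also want to disable proxy buffering so events reach the browser immediately. An optional location block you could add inside the server block above (a sketch):
```nginx
    # Optional: avoid buffering for streaming (SSE) responses
    location /api/pr/stream-preview {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Connection '';
        proxy_buffering off;
    }
```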
</details>
<details>
<summary><strong>Security & Privacy</strong></summary>
### **Data Privacy**
- **Local AI Processing**: Use Ollama for complete data privacy
- **No Data Storage**: AI conversations are not stored
- **Secure Tokens**: Environment-based credential management
- **HTTPS Support**: SSL/TLS encryption for production
### **Security Features**
- **Rate Limiting**: Prevents API abuse
- **Input Validation**: Sanitizes all user inputs
- **Error Handling**: No sensitive data in error messages
- **Access Control**: Token-based authentication
</details>
<details>
<summary><strong>Monitoring & Analytics (For DevOps)</strong></summary>
### **Built-in Monitoring**
- **Health Checks**: `/health` endpoint for monitoring
- **Performance Metrics**: Response times and success rates
- **Error Tracking**: Comprehensive error logging
- **Usage Statistics**: AI provider usage analytics
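
For external monitoring, the health endpoint mentioned above can be polled directly (the exact response format depends on the server version):
```bash
# Poll the built-in health check endpoint
curl http://localhost:3000/health
```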
### **Logging Configuration**
```env
# Logging levels: error, warn, info, debug
LOG_LEVEL=info
# Log file rotation
LOG_MAX_SIZE=10MB
LOG_MAX_FILES=5
# Enable request logging
LOG_REQUESTS=true
```
</details>
<details>
<summary><strong>Contributing (For Developers)</strong></summary>
We welcome contributions! Here's how to get started:
### **Development Setup**
```bash
# Clone the repository
git clone https://github.com/anuragarwalkar/ai-workflow-utils.git
cd ai-workflow-utils
# Install dependencies
npm install
# Set up environment
cp .env.example .env
cp ui/.env.example ui/.env
# Start development server
npm run dev
```
### **Project Structure**
```
ai-workflow-utils/
├── bin/        # CLI scripts
├── server/     # Backend (Node.js + Express)
├── ui/         # Frontend (React + Redux)
├── dist/       # Built files
└── docs/       # Documentation
```
### **Contribution Guidelines**
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
</details>
<details>
<summary><strong>API Documentation (For Developers)</strong></summary>
### **Core Endpoints**
**Jira Ticket Creation:**
```bash
POST /api/jira/preview
Content-Type: application/json
{
  "prompt": "Login button not working",
  "images": ["base64-encoded-image"],
  "issueType": "Bug"
}
```
**Create Pull Request Preview (Streaming):**
```bash
POST /api/pr/stream-preview
Content-Type: application/json
{
  "projectKey": "PROJ",
  "repoSlug": "my-repo",
  "ticketNumber": "PROJ-123",
  "branchName": "feature/my-branch"
}
# Returns Server-Sent Events stream with:
# - status updates
# - title_chunk events (streaming title generation)
# - title_complete event (final title)
# - description_chunk events (streaming description generation)
# - description_complete event (final description)
# - complete event (final preview data)
```
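
A rough client-side sketch of consuming this stream (illustrative only, not part of the shipped UI; the event names follow the list above):
```javascript
// Read the SSE response of the streaming preview endpoint with fetch
async function streamPrPreview() {
  const response = await fetch('http://localhost:3000/api/pr/stream-preview', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      projectKey: 'PROJ',
      repoSlug: 'my-repo',
      ticketNumber: 'PROJ-123',
      branchName: 'feature/my-branch',
    }),
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Each chunk carries one or more "event:/data:" blocks
    // (status, title_chunk, title_complete, description_chunk, description_complete, complete)
    console.log(decoder.decode(value, { stream: true }));
  }
}
```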
**Create Pull Request:**
```bash
POST /api/pr/create
Content-Type: application/json
{
  "projectKey": "PROJ",
  "repoSlug": "my-repo",
  "ticketNumber": "PROJ-123",
  "branchName": "feature/my-branch",
  "customTitle": "feat(PROJ-123): Add user authentication",
  "customDescription": "## Summary\nAdded user authentication feature\n\n## Changes Made\n- Added login component\n- Implemented JWT tokens"
}
```
**GitStash PR Review:**
```bash
POST /api/pr/review
Content-Type: application/json
{
  "repoUrl": "https://bitbucket.company.com/projects/PROJ/repos/repo",
  "pullRequestId": "123",
  "reviewType": "security"
}
```
**File Upload:**
```bash
POST /api/jira/upload
Content-Type: multipart/form-data
file: [binary-data]
issueKey: "PROJ-123"
```
</details>
<details>
<summary><strong>Troubleshooting (For Support)</strong></summary>
### **Common Issues**
**Ollama Connection Failed:**
```bash
# Check if Ollama is running
ollama list
# Start Ollama service
ollama serve
# Pull required model
ollama pull llava
```
**Jira Authentication Error:**
```bash
# Test Jira connection
curl -H "Authorization: Bearer YOUR_TOKEN" \
https://your-company.atlassian.net/rest/api/2/myself
```
**Port Already in Use:**
```bash
# Use different port
ai-workflow-utils --port 8080
# Or kill existing process
lsof -ti:3000 | xargs kill -9
```
### **Debug Mode**
```bash
# Enable detailed logging
ai-workflow-utils --debug
# Check logs
tail -f logs/app.log
```
</details>
## **Support**
### **Getting Help**
- **Documentation**: [GitHub Wiki](https://github.com/anuragarwalkar/ai-workflow-utils/wiki)
- **Bug Reports**: [GitHub Issues](https://github.com/anuragarwalkar/ai-workflow-utils/issues)
- **Discussions**: [GitHub Discussions](https://github.com/anuragarwalkar/ai-workflow-utils/discussions)
- **Email**: anurag.arwalkar@gmail.com
### **Community**
- **Star us on GitHub**: Show your support!
- **Share**: Help others discover this tool
- **Contribute**: Join our growing community
## **Roadmap**
### **Coming Soon**
- **Direct PR Comments**: AI comments directly in Bitbucket
- **Workflow Automation**: Custom automation workflows
- **Analytics Dashboard**: Usage insights and metrics
- **Plugin System**: Extensible architecture
- **Multi-language Support**: Internationalization
### **Future Features**
- **Advanced AI Agents**: Specialized AI for different tasks
- **More Integrations**: GitHub, GitLab, Azure DevOps
- **Mobile App**: Native mobile applications
- **Smart Routing**: Intelligent task assignment
## **License**
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## **Acknowledgments**
Special thanks to the amazing open-source community and the following technologies that make this project possible:
- **OpenAI & Anthropic**: For providing excellent AI APIs
- **Ollama**: For enabling local AI processing with privacy
- **Atlassian**: For robust Jira and Bitbucket APIs
- **React & Redux**: For building beautiful, responsive UIs
- **Node.js & Express**: For reliable backend infrastructure
- **Material-UI**: For professional design components
## **Why Choose AI Workflow Utils?**
### **Productivity Boost**
- **10x Faster**: Create professional Jira tickets in seconds
- **AI-Powered**: Let AI handle the heavy lifting
- **Streamlined**: One tool for all your workflow needs
### **Privacy First**
- **Local Processing**: Use Ollama for complete data privacy
- **No Vendor Lock-in**: Multiple AI provider support
- **Your Data**: Stays on your infrastructure
### **Developer Friendly**
- **Easy Setup**: Get started in minutes
- **CLI Tools**: Powerful command-line interface
- **Extensible**: Open architecture for customization
### **Enterprise Ready**
- **Scalable**: Handles teams of any size
- **Secure**: Enterprise-grade security features
- **Reliable**: Battle-tested in production environments
<div align="center">
## **Ready to Transform Your Workflow?**
### **Get Started Today!**
```bash
npm install -g ai-workflow-utils
```
**Star us on GitHub if this tool helps you!**
**Share with your team and boost everyone's productivity!**
**Made with ❤️ by [Anurag Arwalkar](https://github.com/anuragarwalkar)**
*Empowering developers worldwide with AI-powered workflow automation*
</div>