@clduab11/gemini-flow

Revolutionary AI agent swarm coordination platform with Google Services integration, multimedia processing, and production-ready monitoring. Features 8 Google AI services, quantum computing capabilities, and enterprise-grade security.
# Gemini Flow Command Bible 🚀

## The One-Sheet That Turns Developers Into AI Orchestration Ninjas

*Print this. Pin it. Ship faster than your competition.*

---

## Core Commands - Your New Superpowers ⚡

### 🐝 `hive-mind spawn` - When Shit Gets Complex

```bash
gemini-flow hive-mind spawn "objective" [options]
```

**Essential Flags:**

- `--workers <n>` - Agent count (default: 4, max: unlimited*)
- `--queen-type <type>` - Leadership style: `strategic` | `adaptive` | `hierarchical`
- `--consensus <algo>` - Decision making: `majority` | `weighted` | `byzantine`
- `--memory-size <mb>` - Shared brain size (default: 1024)
- `--auto-scale` - Spawn agents as needed
- `--gemini` - **NEW v1.0.4**: Enable enhanced AI coordination with GEMINI.md context

**🔥 Killer Combo #1: The Startup Special**

```bash
gemini-flow hive-mind spawn "Build MVP in 48 hours" \
  --workers 12 \
  --queen-type adaptive \
  --consensus byzantine \
  --auto-scale \
  --memory-size 4096 \
  --gemini
```

*Why it works: Byzantine consensus prevents analysis paralysis, the adaptive queen learns from rapid iterations, and Gemini integration provides awareness of all 66 agent types.*

**🚀 NEW: Enhanced with the --gemini Flag**

```bash
# Standard spawn (basic functionality)
gemini-flow hive-mind spawn "Build authentication system"

# Enhanced spawn (full 66-agent awareness + GEMINI.md context)
gemini-flow hive-mind spawn "Build authentication system" --gemini
```

*The --gemini flag automatically loads your project's GEMINI.md context file, enabling AI agents to understand your complete system architecture, available commands, and 66 specialized agent types.*

### 🚀 `swarm` - Quick Strike Force

```bash
gemini-flow swarm "task" [options]
```

**Power Flags:**

- `--strategy <type>` - `development` | `analysis` | `refactor` | `test`
- `--parallel` - Unleash parallel execution
- `--depth <level>` - `shallow` | `medium` | `deep`
- `--monitor` - Real-time swarm dashboard
- `--gemini` - **NEW v1.0.4**: Context-aware swarm coordination

**🔥 Killer Combo #2: The Refactor Nuke**

```bash
gemini-flow swarm "Refactor legacy auth system" \
  --strategy refactor \
  --parallel \
  --context ./legacy \
  --model gemini-1.5-pro \
  --gemini
```

*Why it works: parallel refactoring with full codebase context plus 66 agent types = 10x dev speed.*

**🎯 Enhanced Swarm Examples with --gemini**

```bash
# Context-aware development swarm
gemini-flow swarm "Implement microservices architecture" \
  --strategy development \
  --parallel \
  --gemini

# Deep analysis with full system understanding
gemini-flow swarm "Analyze performance bottlenecks" \
  --strategy analysis \
  --depth deep \
  --gemini

# Intelligent test strategy based on project context
gemini-flow swarm "Create comprehensive test suite" \
  --strategy test \
  --gemini
```

### 🔍 `query` - Your Research Army

```bash
gemini-flow query "question" [options]
```

**Core Query Features:**

- Mini-swarm deployment (web-researcher, gemini-analyst, result-synthesizer)
- MCP web research tool integration
- Gemini Flash optimization for cost-effective queries
- Parallel execution of research agents
- Comprehensive result synthesis
- `--gemini` - **NEW v1.0.4**: Project-context-aware research

**🔥 Killer Combo #3: The Knowledge Vacuum**

```bash
# Simple but powerful - spawns a mini-swarm automatically
gemini-flow query "Kubernetes multi-region failover strategies" --gemini

# Or with explicit depth control + project context
gemini-flow query "Latest React 19 features for our architecture" \
  --depth deep \
  --sources 10 \
  --gemini
```

*Why it works: automatic mini-swarm coordination + MCP web tools + Gemini Flash + project context = targeted research.*

**🧠 Context-Aware Research Examples**

```bash
# Research with full project understanding
gemini-flow query "Best database for our microservices setup" --gemini

# Architecture-specific recommendations
gemini-flow query "Security best practices for our tech stack" --gemini

# Performance optimization suggestions
gemini-flow query "Scaling strategies for our current infrastructure" --gemini
```

### 🏗️ `sparc` - Systematic Development Pipeline

```bash
gemini-flow sparc <mode> "objective" [options]
```

**Core Modes:**

- `run <phase>` - Execute a specific phase
- `pipeline` - Full SPARC execution
- `tdd` - Test-driven development flow
- `modes` - List all available modes

**Batch Processing (Optimized):**

- `batch <modes> "<task>"` - Execute multiple SPARC modes in parallel
- `concurrent <mode> "<tasks-file>"` - Process multiple tasks concurrently

**Phase Options:**

- `spec-pseudocode` - Requirements + logic
- `architect` - System design
- `code` - Implementation
- `refine` - Optimization
- `complete` - Final polish
- `integration` - System integration

**🔥 Killer Combo #4: The TDD Weapon**

```bash
# Standard TDD flow
gemini-flow sparc tdd "Payment processing system"

# Batch multiple phases for speed
gemini-flow sparc batch "spec-pseudocode architect code" "Payment system"

# Concurrent processing from a file
gemini-flow sparc concurrent code "tasks.txt"
```

*Why it works: SPARC methodology + parallel processing = systematic yet rapid development.*

### 💾 `memory` - Persistent Context

```bash
gemini-flow memory <action> [key] [value]
```

**Memory Operations:**

- `store <key> <value>` - Save context
- `recall <pattern>` - Retrieve context
- `search <query>` - Semantic search
- `export` - Dump memory state
- `sync` - Team synchronization

**🔥 Killer Combo #5: The Context Maximizer**

```bash
# Morning standup
gemini-flow memory store sprint-context "Building real-time features"
gemini-flow memory store tech-debt "Auth system needs refactor"

# Later that day
gemini-flow swarm "Continue sprint work" --memory-context sprint-*
```

*Why it works: persistent context = no repeated explanations.*

### ⚡ `execute` - Code That Runs Itself

```bash
gemini-flow execute "task" [options]
```

**Execution Flags:**

- `--live` - Real-time execution
- `--framework <name>` - Target framework
- `--test-framework <name>` - Testing framework
- `--coverage-target <n>` - Min coverage %
- `--optimize` - Performance optimization pass

**🔥 Killer Combo #6: The Auto-Pilot**

```bash
gemini-flow execute "WebSocket chat server" \
  --framework fastapi \
  --test-framework pytest \
  --coverage-target 95 \
  --optimize \
  --deploy-ready
```

*Why it works: Gemini writes it, tests it, optimizes it, ships it.*

---

## 🎯 Agent Types - Choose Your Fighters

### Core Squad

- `coder` - Implementation specialist
- `architect` - System design expert
- `tester` - Quality assurance ninja
- `researcher` - Information gatherer
- `reviewer` - Code quality enforcer

### Specialist Forces

- `production-validator` - Prod readiness checker
- `performance-benchmarker` - Speed demon
- `security-manager` - Vulnerability hunter
- `api-docs` - Documentation generator
- `mobile-dev` - Cross-platform specialist

### Advanced Operators

- `byzantine-coordinator` - Distributed consensus
- `adaptive-coordinator` - Self-improving orchestrator
- `swarm-memory-manager` - Collective intelligence
- `collective-intelligence-coordinator` - Hive mind optimizer

---

## 🔥 Legendary Combos - Copy, Paste, Ship

### The "Fuck It, Ship It" Special

```bash
gemini-flow hive-mind spawn "Production hotfix for memory leak" \
  --workers 6 \
  --worker-types "coder,tester,production-validator,performance-benchmarker" \
  --queen-type strategic \
  --consensus weighted \
  --priority critical \
  --deploy-on-success
```

### The "Startup Weekend" Destroyer

```bash
# Friday 6pm
gemini-flow orchestrate "Build complete SaaS MVP" \
  --marathon-mode \
  --workers 16 \
  --auto-scale \
  --context ./idea.md \
  --checkpoints hourly \
  --slack-updates "#progress"
```

### The "Legacy Killer" Pattern

```bash
# Step 1: Understand the beast
gemini-flow analyze --repo ./legacy \
  --include-history \
  --tech-debt-report \
  --output analysis.md

# Step 2: Plan the attack
gemini-flow sparc run architect "Modernization strategy" \
  --context analysis.md \
  --constraints "zero downtime"

# Step 3: Execute with prejudice
gemini-flow swarm "Implement modernization" \
  --strategy refactor \
  --parallel \
  --test-coverage 95 \
  --canary-deploy
```

### The "Documentation Blitz"

```bash
gemini-flow agent spawn api-docs \
  --scan ./src \
  --format openapi \
  --examples real \
  --readme-update \
  --changelog-generate
```

### The "Performance Hawk"

```bash
# Spawn performance optimization swarm
gemini-flow hive-mind spawn "Optimize database queries" \
  --workers 4 \
  --worker-types "performance-benchmarker,code-analyzer,coder,tester" \
  --queen-type adaptive \
  --consensus weighted
```

### 🌌 **Quantum Supremacy Sprint** - The Future of Computation

```bash
# Deploy quantum computing swarm
gemini-flow hive-mind spawn "Quantum algorithm optimization" \
  --workers 8 \
  --worker-types "quantum-researcher,vqe-specialist,qaoa-optimizer" \
  --queen-type adaptive \
  --consensus byzantine \
  --quantum-backend qiskit \
  --memory-size 8192
```

#### 🚀 Quantum Specialist Agents

**`quantum-researcher`** - Quantum algorithm theorist

- Analyzes quantum advantage opportunities
- Researches the latest quantum computing papers
- Designs quantum circuit architectures
- Evaluates quantum vs. classical trade-offs

**`vqe-specialist`** - Variational Quantum Eigensolver expert

- Optimizes molecular Hamiltonians
- Implements quantum chemistry workflows
- Designs parameterized quantum circuits
- Handles noise mitigation strategies

**`qaoa-optimizer`** - Quantum Approximate Optimization Algorithm specialist

- Solves combinatorial optimization problems
- Implements graph-based algorithms
- Optimizes quantum gate parameters
- Bridges classical-quantum hybrid workflows

#### 🧬 VQE Molecular Simulation Example (Qiskit)

```bash
# Deploy molecular simulation swarm
gemini-flow swarm "VQE molecular ground state calculation" \
  --strategy quantum \
  --parallel \
  --framework qiskit \
  --molecule H2 \
  --basis-set sto-3g
```

**Generated Implementation** (targets the qiskit-nature 0.3.x API):

```python
from qiskit import Aer
from qiskit.algorithms import VQE
from qiskit.algorithms.optimizers import COBYLA
from qiskit.circuit.library import TwoLocal
from qiskit_nature.drivers import Molecule
from qiskit_nature.drivers.second_quantization import (
    ElectronicStructureMoleculeDriver, ElectronicStructureDriverType)
from qiskit_nature.problems.second_quantization import ElectronicStructureProblem
from qiskit_nature.converters.second_quantization import QubitConverter
from qiskit_nature.mappers.second_quantization import JordanWignerMapper

# Swarm-generated molecular Hamiltonian
def create_h2_molecule():
    molecule = Molecule(geometry=[['H', [0., 0., 0.]],
                                  ['H', [0., 0., 0.735]]],
                        charge=0, multiplicity=1)
    driver = ElectronicStructureMoleculeDriver(
        molecule, basis='sto3g',
        driver_type=ElectronicStructureDriverType.PYSCF)
    return ElectronicStructureProblem(driver)

# Map the electronic Hamiltonian onto qubits
problem = create_h2_molecule()
qubit_op = QubitConverter(JordanWignerMapper()).convert(problem.second_q_ops()[0])

# Hardware-efficient ansatz
ansatz = TwoLocal(rotation_blocks='ry', entanglement_blocks='cz')

# VQE algorithm with swarm optimization
vqe = VQE(ansatz, optimizer=COBYLA(maxiter=100),
          quantum_instance=Aer.get_backend('aer_simulator_statevector'))
result = vqe.compute_minimum_eigenvalue(qubit_op)
```

#### 🎯 QAOA Optimization Example (PennyLane)

```bash
# Deploy combinatorial optimization swarm
gemini-flow execute "QAOA graph max-cut problem" \
  --framework pennylane \
  --optimizer adam \
  --layers 3 \
  --graph-size 8
```

**Generated Implementation:**

```python
import pennylane as qml
from pennylane import numpy as np  # trainable autograd arrays

# Swarm-optimized QAOA circuit for a 4-node ring graph
graph_edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
dev = qml.device('default.qubit', wires=4)

@qml.qnode(dev)
def qaoa_circuit(params, edges):
    # Initialize uniform superposition
    for wire in range(4):
        qml.Hadamard(wires=wire)
    # QAOA layers (one cost + one mixer angle per layer)
    for layer in range(len(params) // 2):
        # Cost unitary
        for edge in edges:
            qml.CNOT(wires=edge)
            qml.RZ(params[2 * layer], wires=edge[1])
            qml.CNOT(wires=edge)
        # Mixer unitary
        for wire in range(4):
            qml.RX(params[2 * layer + 1], wires=wire)
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

# Swarm optimization loop
optimizer = qml.AdamOptimizer(stepsize=0.1)
params = np.random.random(6, requires_grad=True)
for step in range(100):
    params = optimizer.step(lambda p: qaoa_circuit(p, graph_edges), params)
```

#### 🤖 Quantum ML Pipeline Example (Cirq)

```bash
# Deploy quantum machine learning swarm
gemini-flow sparc run quantum-ml "Quantum neural network classifier" \
  --backend cirq \
  --dataset iris \
  --qubits 4 \
  --depth 6
```

**Generated Implementation:**

```python
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

# Swarm-designed quantum data encoding
def create_quantum_data_circuit(qubits, data):
    circuit = cirq.Circuit()
    for i, qubit in enumerate(qubits):
        circuit.append(cirq.ry(data[i % len(data)])(qubit))
        if i > 0:
            circuit.append(cirq.CNOT(qubits[i - 1], qubit))
    return circuit

# Trainable ansatz read by the PQC layer (one RY per qubit)
def create_variational_circuit(qubits):
    params = sympy.symbols('theta0:4')
    circuit = cirq.Circuit()
    for param, qubit in zip(params, qubits):
        circuit.append(cirq.ry(param)(qubit))
    return circuit

# Quantum neural network model
def build_qnn_model(qubits):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(), dtype=tf.string),
        tfq.layers.PQC(create_variational_circuit(qubits),
                       operators=[cirq.Z(qubits[0])]),
        tf.keras.layers.Dense(2, activation='softmax')
    ])
    return model

# Swarm-optimized training (quantum_data: encoded circuits serialized via
# tfq.convert_to_tensor; labels: the matching class ids)
model = build_qnn_model(cirq.GridQubit.rect(2, 2))
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
model.fit(quantum_data, labels, epochs=50)
```

#### 🚀 Why Quantum Supremacy Matters

**Exponential Speedup Domains:**

- **Cryptography**: RSA breaking, post-quantum security
- **Drug Discovery**: Molecular simulation, protein folding
- **Finance**: Portfolio optimization, risk analysis
- **AI/ML**: Feature maps, kernel methods
- **Logistics**: Route optimization, scheduling

**The Quantum Advantage Pipeline:**

```bash
# Full quantum development lifecycle
gemini-flow hive-mind spawn "Quantum advantage assessment" \
  --phases "theory,simulation,implementation,benchmarking" \
  --compare-classical \
  --noise-analysis \
  --hardware-constraints ibm-quantum
```

**Real-World Impact:**

- **2024**: Google's quantum chip achieves computational supremacy
- **2025**: IBM's 1000+ qubit processors go mainstream
- **2026**: Quantum cloud computing becomes enterprise-ready
- **2027**: First quantum-accelerated drug discoveries
- **2030**: Quantum advantage in financial modeling

**The Quantum Swarm Advantage:**

- Parallel quantum algorithm exploration
- Hybrid classical-quantum optimization
- Noise mitigation strategy development
- Hardware-aware circuit compilation
- Quantum error correction implementation

---

## 🎮 Pro Tips - Level Up Your Game

### Context Window Hack

```bash
# Load your ENTIRE monorepo
export GEMINI_CONTEXT_MAX=2000000
gemini-flow swarm "Refactor for microservices" --context . --max-tokens
```

### Multi-Model Router Abuse

```bash
# Let Gemini decide which model to use
gemini-flow orchestrate "Build feature" \
  --auto-route \
  --cost-optimize \
  --speed-priority
```

### The Swarm-in-Swarm Pattern

```bash
# Meta AF but it works
gemini-flow hive-mind spawn "Optimize our AI workflows" \
  --analyze-self \
  --meta-optimization \
  --recursive-depth 3
```

### Continuous Learning Mode

```bash
# Let it learn your style
gemini-flow learn --from ./src --days 30
gemini-flow generate "New feature" --style learned
```

---

## 🚨 Emergency Commands - When Prod Is On Fire

```bash
# The "Oh Shit" Button
gemini-flow swarm "CRITICAL: Fix production outage" \
  --emergency \
  --workers 20 \
  --all-hands \
  --skip-review

# The "Rollback Navigator"
gemini-flow analyze --git-history 24h \
  --find-breaking-change \
  --suggest-fix \
  --test-fix

# The "Performance CPR"
gemini-flow execute "Emergency optimization" \
  --profile-production \
  --quick-wins \
  --no-breaking-changes
```

---

## 📊 Benchmark Your Powers

```bash
# See how fast you're shipping
gemini-flow stats --period 30d --team-compare

# Measure swarm efficiency
gemini-flow benchmark --operation "Full feature cycle"

# Cost analysis (because the CFO asks)
gemini-flow cost-report --breakdown-by-tier
```

---

## 🎯 The Bottom Line

Every command here is battle-tested. Every flag serves a purpose. Every combo ships features.
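The `--consensus` options covered above trade speed for fault tolerance: a simple majority needs just over half the votes, while Byzantine agreement tolerates up to f faulty agents only when the swarm has at least 3f + 1 members. A minimal Python sketch of the two quorum rules (illustrative only — the helper names are hypothetical, not part of gemini-flow):

```python
def majority_quorum(workers: int) -> int:
    """Votes needed for a simple majority decision."""
    return workers // 2 + 1

def byzantine_quorum(workers: int) -> int:
    """Votes needed to out-vote up to f faulty agents:
    a swarm of n members tolerates f = (n - 1) // 3 faults
    and must gather n - f agreeing votes."""
    faulty = (workers - 1) // 3
    return workers - faulty

# A 12-worker hive (as in the Startup Special combo):
print(majority_quorum(12))   # 7
print(byzantine_quorum(12))  # 9 (tolerates 3 Byzantine agents)
```

This is why `byzantine` feels slower but safer: it demands a larger quorum, so a few stuck or misbehaving agents can't derail the decision.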
**Remember**:

- Parallel > Sequential
- Byzantine > Majority (when stakes are high)
- Context is king (2M tokens = entire codebases)
- Grounding prevents hallucination
- Swarms scale, single agents don't

**The Meta-Command** (use sparingly):

```bash
gemini-flow hive-mind spawn "Make me a 10x developer" \
  --workers unlimited \
  --queen-type adaptive \
  --consensus byzantine \
  --learn-from-me \
  --ship-everything
```

---

*Built for developers who ship, not developers who talk about shipping.*

**Now stop reading and start orchestrating.** 🚀

## 📝 Reality Check - What's Actually Available

**Core Commands That Exist:**

- `hive-mind spawn` - Full swarm orchestration
- `swarm` - Quick deployment
- `sparc` - SPARC methodology (run, pipeline, tdd, batch, concurrent)
- `query` - Mini-swarm with MCP web research
- `memory` - Persistent context storage
- `agent spawn` - Individual agent creation
- `orchestrate` - Complex task coordination

**What Gemini CLI Actually Does:**

- All file operations (Read, Write, Edit)
- All code generation
- All bash commands
- All actual implementation

**What Gemini-Flow MCP Tools Do:**

- Coordination and planning
- Memory management
- Swarm orchestration
- Performance tracking
- GitHub integration

Remember: MCP tools coordinate, Gemini CLI executes!
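At its core, the `memory store` / `recall sprint-*` workflow above is a namespaced key-value store with glob-pattern recall. A minimal Python sketch of that idea (illustrative only — gemini-flow's real store adds semantic search, export, and team sync on top):

```python
from fnmatch import fnmatch

class SwarmMemory:
    """Toy stand-in for `gemini-flow memory` (store/recall only)."""

    def __init__(self):
        self._store = {}

    def store(self, key: str, value: str) -> None:
        """Save context under a key."""
        self._store[key] = value

    def recall(self, pattern: str) -> dict:
        """Return every entry whose key matches a glob pattern."""
        return {k: v for k, v in self._store.items() if fnmatch(k, pattern)}

# Mirrors the Context Maximizer combo:
memory = SwarmMemory()
memory.store("sprint-context", "Building real-time features")
memory.store("tech-debt", "Auth system needs refactor")
print(memory.recall("sprint-*"))  # {'sprint-context': 'Building real-time features'}
```

The glob pattern is what makes `--memory-context sprint-*` work: one flag pulls in every key saved under that sprint's namespace.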