UNPKG

agentic-data-stack-community

AI Agentic Data Stack Framework - Community Edition. Open source data engineering framework with 4 core agents, essential templates, and 3-dimensional quality validation.

workflow:
  id: community-analytics
  name: Community Analytics Workflow
  description: >-
    Streamlined analytics workflow for community users with basic agent
    collaboration and standard validation. Supports exploratory and business
    analysis projects.
  type: greenfield
  project_types:
    - exploratory-analysis
    - business-intelligence
    - performance-analytics
    - customer-insights

  sequence:
    - step: requirements_gathering
      agent: data-product-manager
      action: gather-requirements
      creates: analysis-requirements.md
      duration: 0.5 days
      notes: |
        Gather business requirements and define analysis objectives:
        - Business context and objectives
        - Key stakeholders and success criteria
        - Data sources and availability assessment
        - Timeline and resource constraints
        SAVE OUTPUT: Copy final analysis-requirements.md to your project's docs/ folder.

    - step: data_profiling
      agent: data-analyst
      action: profile-data
      creates: data-profile-report.md
      requires: analysis-requirements.md
      duration: 0.5 days
      notes: |
        Conduct data profiling and discovery:
        - Data quality assessment
        - Schema understanding and relationship mapping
        - Missing data patterns and anomaly detection
        - Statistical summaries and distribution analysis
        - Data readiness evaluation for analysis goals

    - step: analysis_execution
      agent: data-analyst
      action: analyze-data
      creates: analysis-results.md
      requires: [analysis-requirements.md, data-profile-report.md]
      duration: 1-2 days
      notes: |
        Execute data analysis:
        - Statistical analysis and hypothesis testing
        - Pattern discovery and trend analysis
        - Business insights and recommendations
        - Data visualization and charts

    - step: dashboard_creation
      agent: data-analyst
      action: create-dashboard
      creates: analytical-dashboard
      requires: analysis-results.md
      duration: 1 day
      notes: |
        Create dashboard and visualizations:
        - Interactive charts and visualizations
        - Key metrics and KPI displays
        - User-friendly navigation and filters
        - Export and sharing capabilities

    - step: quality_validation
      agent: data-quality-engineer
      action: validate-data-quality
      validates: [data-profile-report.md, analysis-results.md, analytical-dashboard]
      duration: 0.5 days
      notes: |
        Quality validation of analysis outputs:
        - Data accuracy validation in analytical results
        - Calculation verification and testing
        - Dashboard functionality testing
        - Cross-validation with known business metrics

    - step: stakeholder_review
      agent: data-product-manager
      action: define-metrics
      requires: [analysis-results.md, analytical-dashboard]
      duration: 0.5 days
      notes: |
        Stakeholder validation and sign-off:
        - Present findings to business stakeholders
        - Validate business impact and actionability
        - Document success metrics and KPIs
        - Plan implementation and next steps

  validation_gates:
    - gate: requirements_validation
      criteria:
        - Business objectives clearly defined and measurable
        - Data sources identified and accessible
        - Success criteria established with stakeholders
        - Resource requirements validated and approved
    - gate: analysis_validation
      criteria:
        - Statistical methods appropriate for data and objectives
        - Quality validation passes framework standards
        - Insights are actionable and business-relevant
        - Visualizations follow best practices
    - gate: delivery_validation
      criteria:
        - All deliverables pass quality checks
        - Stakeholder approval documented
        - Success metrics established
        - Implementation plan documented

  success_criteria:
    business:
      - Key stakeholders can articulate value and next steps
      - Insights directly address business objectives
      - Recommendations are actionable and prioritized
      - Success metrics are established and trackable
    technical:
      - Dashboard performance meets requirements
      - Data accuracy meets business quality standards
      - Analysis is reproducible and documented
      - User adoption targets are achievable
    analytical:
      - Statistical analysis is methodologically sound
      - Business insights are supported by evidence
      - Visualizations effectively communicate findings
      - Documentation enables reproducibility

  workflow_variants:
    exploratory_analysis:
      focus: Open-ended data exploration and hypothesis generation
      duration: 2-3 days
      additional_emphasis:
        - Pattern discovery and trend identification
        - Hypothesis generation and validation
        - Exploratory data visualization
    business_intelligence:
      focus: KPI monitoring and business performance analysis
      duration: 3-4 days
      additional_emphasis:
        - KPI definition and calculation
        - Performance dashboards
        - Business reporting automation
    customer_insights:
      focus: Customer behavior analysis and segmentation
      duration: 3-5 days
      additional_emphasis:
        - Customer segmentation analysis
        - Behavior pattern identification
        - Personalization opportunities

  escalation_procedures:
    - condition: Data quality issues preventing reliable analysis
      action: Escalate to Data Quality Engineer and Data Engineer
      timeline: Within 2 hours
    - condition: Technical analysis beyond analyst capability
      action: Consider external expertise or simplified approach
      timeline: Within 4 hours
    - condition: Stakeholder requirement conflicts or scope changes
      action: Escalate to Data Product Manager and Business Sponsors
      timeline: Within 1 business day

  post_completion_activities:
    - activity: Dashboard usage monitoring
      frequency: Weekly for first month, then monthly
      responsible: Data Analyst
    - activity: Business impact tracking
      frequency: Monthly for first quarter
      responsible: Data Product Manager
    - activity: Analysis quality review
      frequency: Quarterly
      responsible: Data Quality Engineer
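
For orientation, the sketch below shows one way a Node.js project could read this workflow definition and list its agent sequence using js-yaml. The file path, the TypeScript interfaces, and the idea of loading the YAML directly are illustrative assumptions for this sketch only; the Community Edition framework may ship its own loader and schema.

// Minimal sketch (not part of the package): parse the workflow YAML above
// and print each step with its agent, action, and expected duration.
import { readFileSync } from "fs";
import * as yaml from "js-yaml";

// Hypothetical shapes matching the fields used in the file above.
interface WorkflowStep {
  step: string;
  agent: string;
  action: string;
  creates?: string;
  requires?: string | string[];
  validates?: string[];
  duration: string;
  notes?: string;
}

interface WorkflowFile {
  workflow: {
    id: string;
    name: string;
    type: string;
    project_types: string[];
    sequence: WorkflowStep[];
  };
}

// Assumed local path to the YAML shown above.
const raw = readFileSync("workflows/community-analytics.yaml", "utf8");
const { workflow } = yaml.load(raw) as WorkflowFile;

console.log(`${workflow.name} (${workflow.id})`);
for (const step of workflow.sequence) {
  console.log(`- ${step.step}: ${step.agent} -> ${step.action} [${step.duration}]`);
}

Running this against the file prints the six steps in order, which is a quick way to confirm the sequence and handoffs between the data-product-manager, data-analyst, and data-quality-engineer agents before starting a project.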