# Performance Optimization Phase - BC-Specific Enhancement Methodology
## Overview
**Objective**: Apply systematic Business Central performance optimizations based on analysis phase findings, achieving measurable improvements through domain-specific knowledge.
**Key Principle**: When optimization opportunities are identified, applying BC-specific patterns and techniques produces dramatically better results than generic optimization approaches.
---
## Phase Execution Process
### Step 1: Optimization Priority Framework
**CRITICAL**: Apply optimizations in priority order for maximum cumulative impact.
#### 🟢 Priority 1: Easy Win Optimizations (Apply First)
**High ROI + Low Risk** - These should be implemented immediately:
##### Database-First Performance Hierarchy
**CRITICAL PRINCIPLE**: Always prefer database-native operations over application-layer workarounds.
**Tier 1: Database-Native Operations (HIGHEST PRIORITY)**
- Direct table CalcSums, Count(), filtering
- Single-table aggregations with SIFT keys
- Database-level sorting and grouping
```al
// ✅ PREFERRED: Direct database aggregation
MyRecord.CalcSums(Amount);
Total := MyRecord.Amount;
```
**Tier 2: Platform-Optimized Operations (MEDIUM PRIORITY)**
- SetLoadFields, efficient key usage
- Only after database-native options exhausted
```al
// ✅ SECONDARY: Load only the required fields before reading records
MyRecord.SetLoadFields(Amount, Date);
MyRecord.FindSet();
```
**Tier 3: Application-Level Operations (LOWER PRIORITY)**
- FlowFields, multi-table relationships
- Only when database-native insufficient
```al
// ⚠️ LAST RESORT: Application-layer processing
UnitStats.CalcFields("Total Revenue"); // FlowField calculation
```
##### Database-Level Operations vs Manual Processing
**Pattern**: Replace manual loops with database aggregation operations
```al
// ❌ BEFORE: Manual processing (inefficient)
Total := 0;
if MyRecord.FindSet() then
repeat
Total += MyRecord.Amount;
until MyRecord.Next() = 0;
// ✅ AFTER: Database aggregation (efficient)
MyRecord.CalcSums(Amount);
Total := MyRecord.Amount;
```
##### Field Loading Optimization
**Pattern**: Use `SetLoadFields()` to load only required fields
```al
// ❌ BEFORE: Loading all fields (wasteful)
MyRecord.FindSet();
// ✅ AFTER: Selective field loading (efficient)
MyRecord.SetLoadFields(Name, Amount, Date);
MyRecord.FindSet();
```
#### 🟡 Priority 2: Medium Impact Optimizations
**Moderate ROI + Medium Risk** - Implement after Priority 1:
##### Infrastructure Optimization Validation
**CRITICAL**: For optimizations that depend on underlying data structures (tables, keys, indexes), ensure the related infrastructure is updated or extended to support the change (see the sketch after this list).
- **Examine referenced objects**: If code references tables or other objects, analyze those for optimization opportunities
- **Validate infrastructure support**: Confirm that database structures (keys, indexes, fields) support the optimization
- **Update dependencies**: Modify tables, keys, or other infrastructure to enable optimization effectiveness
- **Cross-reference infrastructure**: Check if shared objects provide optimization opportunities for multiple modules
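As an illustration of the infrastructure side, the sketch below defines a hypothetical table (the object ID, name, and fields are invented for this example) whose SIFT key lets a per-customer `CalcSums(Amount)` be answered from maintained totals instead of an ad-hoc scan:
```al
// Hypothetical table, shown only to illustrate SIFT infrastructure for CalcSums
table 50100 "My Sales Stat"
{
    fields
    {
        field(1; "Entry No."; Integer) { }
        field(2; "Customer No."; Code[20]) { }
        field(3; Amount; Decimal) { }
    }

    keys
    {
        key(PK; "Entry No.") { Clustered = true; }
        // SIFT key: the platform maintains Amount totals per customer,
        // so CalcSums(Amount) with a "Customer No." filter is served from the index
        key(CustomerAmount; "Customer No.")
        {
            SumIndexFields = Amount;
        }
    }
}
```
Without such a key, `CalcSums` still executes, but as an ad-hoc aggregation rather than a read of maintained SIFT totals, which is exactly the kind of gap this validation step is meant to catch.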
##### Platform-Specific Optimization Application
- **Use knowledge base**: Apply `get_topic_content` for specific optimization techniques identified in analysis phase
##### Advanced Platform Feature Application
- **Apply knowledge systematically**: Use `get_topic_content` for detailed implementation of optimization techniques
- **Follow best practices**: Implement optimizations according to platform-specific guidance from knowledge base
##### Database Operation Optimization
- **Apply research findings**: Use optimization techniques discovered through MCP knowledge base analysis
- **Implement systematically**: Follow specific guidance from `get_topic_content` for identified optimization patterns
#### 🔴 Priority 3: Complex Optimizations
**High Impact + Higher Risk** - Requires careful analysis:
##### Architectural Changes
- Restructure algorithms for better performance characteristics
- Implement caching strategies for frequently accessed data (see the sketch after this list)
- Consider background processing for heavy operations
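One caching illustration: a single-instance codeunit can memoize lookups that would otherwise hit the database repeatedly within a session. The object ID, name, and scenario below are hypothetical:
```al
// Hypothetical single-instance codeunit that caches customer names for the session
codeunit 50101 "Customer Name Cache"
{
    SingleInstance = true;

    var
        NameCache: Dictionary of [Code[20], Text];

    procedure GetName(CustomerNo: Code[20]): Text
    var
        Customer: Record Customer;
        CustomerName: Text;
    begin
        // Serve repeat requests from memory instead of issuing another Get
        if NameCache.Get(CustomerNo, CustomerName) then
            exit(CustomerName);

        if Customer.Get(CustomerNo) then
            CustomerName := Customer.Name;
        NameCache.Add(CustomerNo, CustomerName);
        exit(CustomerName);
    end;
}
```
Cache invalidation and memory footprint deserve the same risk assessment as any other Priority 3 change.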
##### Advanced BC Features
- Implement table extensions for performance gains
- Utilize Business Central APIs for external integrations
- Apply advanced filtering and search techniques (see the fragment below)
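For the filtering point, the fragment below pushes the filter and field selection to the database instead of testing each record inside the loop; it uses the standard Sales Line table, but the scenario itself is only illustrative:
```al
// Illustrative fragment: filter server-side, load only needed fields, then iterate matching rows
SalesLine.SetCurrentKey("Document Type", "Document No.");
SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
SalesLine.SetFilter(Quantity, '>%1', 0);
SalesLine.SetLoadFields(Quantity, "Line Amount");
if SalesLine.FindSet() then
    repeat
        TotalQuantity += SalesLine.Quantity; // only pre-filtered rows reach the loop
    until SalesLine.Next() = 0;
```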
### Step 2: Implementation Guidelines
#### ⚡ Performance Best Practices
- [ ] **Measure Before Optimizing**: Establish baseline performance metrics
- [ ] **Single Responsibility**: Each optimization should address one specific bottleneck
- [ ] **Preserve Functionality**: Ensure business logic remains intact
- [ ] **Test Thoroughly**: Validate both performance improvement and correctness
#### 🛠️ Technical Implementation Standards
- [ ] **Use BC-Specific Patterns**: Prefer Business Central optimizations over generic approaches
- [ ] **Follow AL Conventions**: Maintain code style and naming conventions
- [ ] **Document Changes**: Comment optimization reasoning for future maintenance
- [ ] **Consider Scalability**: Ensure optimizations work with larger datasets
### Step 3: Validation Framework
#### 📊 Performance Measurement
For each optimization applied:
- [ ] **Baseline Measurement**: Record original execution time/resource usage (see the timing sketch after this checklist)
- [ ] **Implementation**: Apply the optimization technique
- [ ] **Performance Testing**: Measure improved performance metrics
- [ ] **Impact Calculation**: Document percentage improvement achieved
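A minimal way to capture the baseline and post-optimization numbers is to time the operation in a test or throwaway procedure. In the fragment below, `StartTime` is assumed to be a `DateTime` variable and `Elapsed` a `Duration`:
```al
// Illustrative timing fragment: capture elapsed time around the operation under test
StartTime := CurrentDateTime();
MyRecord.CalcSums(Amount); // operation being measured
Elapsed := CurrentDateTime() - StartTime; // subtracting two DateTime values yields a Duration
Message('Elapsed: %1', Elapsed);
```
Take the same measurement on representative data volumes before and after the change so the percentage improvement in the optimization log is backed by numbers.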
#### 🎯 Success Metrics
- **Execution Time**: Target reduction in processing duration
- **Memory Usage**: Minimize memory consumption during operations
- **Database Efficiency**: Reduce number of database calls
- **Resource Utilization**: Optimize CPU and I/O usage
### Step 4: Risk Management
#### ⚠️ Risk Assessment Categories
- [ ] **Low Risk**: Simple field access optimizations, basic query improvements
- [ ] **Medium Risk**: Algorithm changes, new BC feature usage
- [ ] **High Risk**: Architectural modifications, complex business logic changes
#### 🛡️ Mitigation Strategies
- **Code Reviews**: Peer validation of optimization approaches
- **Incremental Testing**: Apply optimizations incrementally with testing
- **Rollback Planning**: Maintain ability to revert changes if needed
- **Documentation**: Clear documentation of changes for troubleshooting
---
## 🔒 MANDATORY COMPLETION GATES
**CRITICAL**: Performance phase CANNOT be marked complete until ALL gates pass.
### Gate 1: Optimization Coverage Validation
- [ ] **Cross-reference file inventory**: Every file in analysis scope must be addressed
- [ ] **Pattern application verification**: Core patterns (CalcSums, SetLoadFields) applied where identified
- [ ] **BLOCKING CONDITION**: ANY file in scope missing optimization analysis = INCOMPLETE
### Gate 2: Infrastructure Validation
- [ ] **Table structure verification**: Any SIFT keys or infrastructure changes properly implemented
- [ ] **Dependency validation**: Related objects updated to support optimizations
- [ ] **BLOCKING CONDITION**: Code optimizations without supporting infrastructure = INVALID
### Gate 3: Documentation Completeness
- [ ] **Optimization log accuracy**: Every claimed optimization must reference actual file/line changes
- [ ] **No phantom optimizations**: Cannot claim optimizations for non-existent files
- [ ] **BLOCKING CONDITION**: Inaccurate optimization claims = VERIFICATION REQUIRED
## 🚨 EXECUTION VALIDATION CHECKPOINTS - MANDATORY
**CRITICAL**: Run execution validation frequently during performance work to prevent fraud.
### Trigger Points (MUST BE EXECUTED)
1. **After optimizing 2-3 files**: `load_methodology execution-validation`
2. **Before claiming module complete**: BLOCK until execution validation passes
3. **When requesting permission to continue**: Validate all claimed work first
4. **After any substantial optimization claims** (CalcSums, SetLoadFields, etc.)
### Execution Validation Protocol
```markdown
## EXECUTION VALIDATION CHECKPOINT
**MANDATORY**: Run real-time validation of optimization claims:
- Load execution-validation methodology: `load_methodology execution-validation`
- Validate file modifications and optimization existence
- BLOCKING: Cannot proceed until all validation checks pass
- Document any discrepancies and correct immediately
```
### Integration with Performance Work
- **During optimization**: Run micro-validation every 2-3 files
- **Before completion claims**: BLOCK until execution validation complete
- **Real-time fraud detection**: Immediately catch false optimization claims
## Success Criteria
- ✅ **Measurable Improvement**: Documented performance gains from optimizations
- ✅ **Functional Preservation**: All business functionality remains intact
- ✅ **Best Practice Compliance**: Solutions follow Business Central optimization patterns
- ✅ **Scalability Validation**: Optimizations work effectively with production data volumes
- ✅ **MANDATORY**: ALL completion gates passed before proceeding to verification
## Quality Validation
- **Performance Verification**: Are claimed improvements actually measurable?
- **Code Quality**: Does the optimized code maintain readability and maintainability?
- **Business Logic Integrity**: Have you preserved all original business requirements?
- **BC Pattern Compliance**: Are you using Business Central features appropriately?
---
## Next Phase
Upon completion, proceed to **Testing & Validation Phase** to confirm optimization effectiveness in production-like scenarios.