# Data Mapping Checklist
This checklist covers the source analysis, field-level mapping, validation, and documentation steps needed for accurate and complete data mapping in Salesforce ETL processes and data migrations. Illustrative Python sketches follow several subsections to show how individual checks and transformations can be scripted.
## Source System Analysis
### Level 1: System Documentation
- [ ] Source system identified and documented
- [ ] Data dictionary obtained or created
- [ ] Entity relationship diagrams reviewed
- [ ] Business process documentation collected
- [ ] System access credentials secured
### Level 2: Data Profiling
- [ ] Record counts per object documented
- [ ] Field usage statistics analyzed
- [ ] Data quality issues identified
- [ ] Null/empty field patterns documented
- [ ] Data volume growth trends assessed
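
A lightweight profiling pass over a source extract can produce most of the statistics above. The sketch below assumes the source data has been exported to a CSV (the file name `accounts_extract.csv` is a placeholder) and uses pandas to report record counts, per-field null percentages, distinct-value counts, and inferred types.

```python
import pandas as pd

# Load a source extract (placeholder file name) for profiling.
df = pd.read_csv("accounts_extract.csv")

print(f"Record count: {len(df)}")

# Per-field usage statistics: null percentage, distinct values, inferred type.
profile = pd.DataFrame({
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct_values": df.nunique(),
    "dtype": df.dtypes.astype(str),
})
print(profile.sort_values("null_pct", ascending=False))
```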
### Level 3: Technical Specifications
- [ ] Data types for each field documented
- [ ] Field lengths and constraints captured
- [ ] Unique identifiers identified
- [ ] Relationships and dependencies mapped
- [ ] Calculated fields and formulas documented
## Target System Analysis
### Level 1: Salesforce Schema Review
- [ ] Target objects identified
- [ ] Standard vs custom objects determined
- [ ] Field API names documented
- [ ] Required fields identified
- [ ] Field-level security requirements noted
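
Field API names, data types, lengths, and required-field flags can be pulled directly from the org with the sObject Describe REST resource rather than transcribed by hand. A minimal sketch follows, assuming an instance URL and access token from an existing OAuth flow; the API version and target object are illustrative.

```python
import requests

# Assumptions: INSTANCE_URL and ACCESS_TOKEN come from an existing OAuth flow;
# the API version and target object are illustrative placeholders.
INSTANCE_URL = "https://yourorg.my.salesforce.com"
ACCESS_TOKEN = "00D...session_token"
API_VERSION = "v60.0"
SOBJECT = "Account"

resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/sobjects/{SOBJECT}/describe",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

for field in resp.json()["fields"]:
    # A field is effectively required on insert when it is createable,
    # not nillable, and has no default supplied on create.
    required = field["createable"] and not field["nillable"] and not field["defaultedOnCreate"]
    print(f'{field["name"]:<40} {field["type"]:<12} len={field["length"]:<6} required={required}')
```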
### Level 2: Data Model Validation
- [ ] Object relationships mapped
- [ ] Master-detail relationships identified
- [ ] Lookup relationships documented
- [ ] Junction objects identified
- [ ] Record type mappings defined
### Level 3: Constraints and Limits
- [ ] Field character limits verified
- [ ] Picklist values mapped
- [ ] Validation rules documented
- [ ] Duplicate rules identified
- [ ] Governor limits considered
## Field-Level Mapping
### Level 1: Direct Mappings
- [ ] One-to-one field mappings documented
- [ ] Data type compatibility verified
- [ ] Field length compatibility confirmed
- [ ] Required field mappings validated
- [ ] Default values defined for unmapped required fields
### Level 2: Transformation Mappings
- [ ] Data type conversions identified
- [ ] Format transformations documented (dates, phones, etc.)
- [ ] Value translations defined (picklist mappings)
- [ ] Concatenation rules specified
- [ ] Split field logic documented
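
Transformation rules are easiest to review when expressed as a single function per object. The sketch below shows a date format conversion, a picklist value translation with a default, and a concatenation rule; all field names and the `STATUS_MAP` values are hypothetical.

```python
from datetime import datetime

# Hypothetical picklist translation: legacy status codes to Salesforce values.
STATUS_MAP = {"A": "Active", "I": "Inactive", "P": "Prospect"}

def transform_row(src: dict) -> dict:
    """Apply field-level transformations to one source record."""
    return {
        # Format transformation: source dates arrive as MM/DD/YYYY,
        # Salesforce date fields expect ISO 8601 (YYYY-MM-DD).
        "Close_Date__c": datetime.strptime(src["close_date"], "%m/%d/%Y").strftime("%Y-%m-%d"),
        # Value translation: map the legacy code, defaulting to "Inactive".
        "Status__c": STATUS_MAP.get(src["status_code"], "Inactive"),
        # Concatenation rule: combine separate name parts into one field.
        "Contact_Name__c": f'{src["first_name"].strip()} {src["last_name"].strip()}',
    }

print(transform_row({"close_date": "03/15/2024", "status_code": "A",
                     "first_name": " Ada ", "last_name": "Lovelace"}))
```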
### Level 3: Complex Mappings
- [ ] Calculated field logic defined
- [ ] Conditional mapping rules documented
- [ ] Cross-object lookups identified
- [ ] Aggregation logic specified
- [ ] Business rule transformations documented
### Level 4: Special Considerations
- [ ] Multi-currency handling defined
- [ ] Multi-language support mapped
- [ ] Time zone conversions specified
- [ ] External ID mappings confirmed
- [ ] Record ownership assignments planned
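
Time zone conversion is a common source of silent data drift because Salesforce stores datetime values in UTC. A minimal sketch, assuming the source system stores naive timestamps in US Central time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumption: the source system stores naive local timestamps in US Central time;
# Salesforce datetime fields are stored and loaded in UTC.
SOURCE_TZ = ZoneInfo("America/Chicago")

def to_salesforce_utc(naive_ts: str) -> str:
    """Convert a naive source timestamp to the UTC ISO format accepted by the API."""
    local = datetime.strptime(naive_ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=SOURCE_TZ)
    return local.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%dT%H:%M:%S.000+0000")

print(to_salesforce_utc("2024-03-15 14:30:00"))  # -> 2024-03-15T19:30:00.000+0000
```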
## Relationship Mapping
### Level 1: Parent-Child Relationships
- [ ] Master-detail relationships mapped
- [ ] Lookup relationships identified
- [ ] Parent record creation order defined
- [ ] Orphaned record handling planned
- [ ] Circular reference resolution documented
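
Parent-first load order usually means loading the parent object, capturing the new Salesforce Ids keyed by the legacy identifier, and stamping those Ids onto child records before they are loaded. The sketch below illustrates that ordering and an orphan-handling branch; `load_records()` is a hypothetical stub standing in for whatever Data Loader or Bulk API wrapper the project uses.

```python
def load_records(sobject: str, records: list[dict]) -> list[str]:
    """Hypothetical loader stub: in practice this wraps the Bulk API or Data Loader.
    Here it fabricates placeholder Ids so the ordering logic can be exercised."""
    return [f"{sobject}-FAKEID-{i}" for i, _ in enumerate(records)]

source_accounts = [{"legacy_id": "A-100", "Name": "Acme"}]
source_contacts = [{"legacy_account_id": "A-100", "LastName": "Lovelace"},
                   {"legacy_account_id": "A-999", "LastName": "Orphan"}]

# Step 1: parents first, keeping a legacy-key -> Salesforce Id map.
account_ids = load_records("Account", [{"Name": a["Name"]} for a in source_accounts])
legacy_to_sf = {a["legacy_id"]: sf_id for a, sf_id in zip(source_accounts, account_ids)}

# Step 2: children, resolving the lookup; unresolved parents follow the orphan plan.
children, orphans = [], []
for c in source_contacts:
    parent_id = legacy_to_sf.get(c["legacy_account_id"])
    if parent_id is None:
        orphans.append(c)  # handle per the orphaned-record plan
    else:
        children.append({"LastName": c["LastName"], "AccountId": parent_id})

load_records("Contact", children)
print(f"Loaded {len(children)} children, {len(orphans)} orphans for review")
```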
### Level 2: Many-to-Many Relationships
- [ ] Junction objects identified
- [ ] Relationship data mapping defined
- [ ] Association rules documented
- [ ] Duplicate relationship handling planned
- [ ] Data integrity rules specified
### Level 3: Hierarchical Data
- [ ] Self-referencing relationships mapped
- [ ] Hierarchy depth limitations identified
- [ ] Top-down vs bottom-up loading planned
- [ ] Recursive relationship handling defined
- [ ] Ultimate parent identification logic defined
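
Top-down loading of a self-referencing hierarchy (for example, Account `ParentId`) can be planned by grouping records into passes by hierarchy depth, so each pass only loads rows whose parent already exists. A sketch under that assumption, with illustrative field names and no cycle detection:

```python
from collections import defaultdict

rows = [
    {"legacy_id": "1", "parent": None, "Name": "Ultimate Parent"},
    {"legacy_id": "2", "parent": "1", "Name": "Region"},
    {"legacy_id": "3", "parent": "2", "Name": "Branch"},
]
by_id = {r["legacy_id"]: r for r in rows}

def depth(row: dict) -> int:
    """Walk up the legacy parent chain to find hierarchy depth (0 = ultimate parent)."""
    d, current = 0, row
    while current["parent"] is not None:
        current = by_id[current["parent"]]
        d += 1
    return d

passes = defaultdict(list)
for r in rows:
    passes[depth(r)].append(r)

for level in sorted(passes):
    print(f"Load pass {level}: {[r['Name'] for r in passes[level]]}")
```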
## Data Quality Mapping
### Level 1: Cleansing Rules
- [ ] Data standardization rules defined
- [ ] Invalid character handling specified
- [ ] Trim and spacing rules documented
- [ ] Case standardization defined
- [ ] Special character handling planned
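
Cleansing rules are worth expressing as one reusable function so the same standardization applies to every mapped field. A minimal sketch; the title-casing step is just one possible case rule and should follow whatever the project's standard is.

```python
import re
import unicodedata

def cleanse(value: str) -> str:
    """Standardize a source value: normalize unicode, drop control characters,
    trim and collapse whitespace, then apply a case rule."""
    value = unicodedata.normalize("NFKC", value)    # normalize unicode forms
    value = re.sub(r"[\x00-\x1f\x7f]", "", value)   # drop control characters
    value = re.sub(r"\s+", " ", value).strip()      # trim and collapse spacing
    return value.title()                            # case standardization (illustrative)

print(cleanse("  aCME   corporation\t"))  # -> "Acme Corporation"
```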
### Level 2: Validation Rules
- [ ] Email format validation mapped
- [ ] Phone number standardization defined
- [ ] Address formatting rules specified
- [ ] Date format conversions documented
- [ ] Numeric format standardization planned
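
Validation and standardization rules for common field types can be scripted and run against the full extract before load. The sketch below shows a deliberately permissive email check and a US-style 10-digit phone standardizer; both patterns are assumptions to adjust per the target org's requirements.

```python
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # intentionally permissive

def valid_email(value: str) -> bool:
    return bool(EMAIL_RE.match(value.strip()))

def standardize_phone(value: str) -> str | None:
    """Normalize US-style numbers to (XXX) XXX-XXXX; return None when the digits
    do not fit the assumed 10-digit pattern, so the row can go to an exception report."""
    digits = re.sub(r"\D", "", value)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    if len(digits) != 10:
        return None
    return f"({digits[0:3]}) {digits[3:6]}-{digits[6:]}"

print(valid_email("ada@example.com"), standardize_phone("1-415-555-0100"))
```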
### Level 3: Deduplication Logic
- [ ] Duplicate identification criteria defined
- [ ] Match rules documented
- [ ] Merge logic specified
- [ ] Survivor record rules defined
- [ ] Duplicate prevention strategy planned
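
Match rules and survivor logic can be prototyped as a grouping pass before the org's duplicate rules ever see the data. The sketch below groups contacts on a normalized email + last-name key and keeps the most recently modified record as the survivor; the field names and survivor rule are illustrative.

```python
from collections import defaultdict

records = [
    {"Email": "Ada@Example.com ", "LastName": "Lovelace", "LastModifiedDate": "2024-01-02"},
    {"Email": "ada@example.com", "LastName": "lovelace", "LastModifiedDate": "2024-03-10"},
]

def match_key(rec: dict) -> tuple:
    """Duplicate identification criteria: normalized email + last name."""
    return (rec["Email"].strip().lower(), rec["LastName"].strip().lower())

groups = defaultdict(list)
for rec in records:
    groups[match_key(rec)].append(rec)

# Survivor rule: keep the most recently modified record in each match group.
survivors = [max(group, key=lambda r: r["LastModifiedDate"]) for group in groups.values()]
print(survivors)
```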
## Code Value Mapping
### Level 1: Picklist Mappings
- [ ] Source to target picklist values mapped
- [ ] New picklist values identified
- [ ] Inactive value handling defined
- [ ] Default value assignments specified
- [ ] Multi-select picklist logic documented
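
Picklist mappings are typically captured as a simple lookup table with an explicit default for unrecognized source values, plus a rule for building the semicolon-delimited string that multi-select picklists store. A sketch with hypothetical industry codes:

```python
# Illustrative mapping of legacy industry codes to Salesforce picklist values.
INDUSTRY_MAP = {"MFG": "Manufacturing", "FIN": "Banking", "TEC": "Technology"}
DEFAULT_INDUSTRY = "Other"  # default assignment for unmapped codes

def map_industry(code: str) -> str:
    return INDUSTRY_MAP.get(code.strip().upper(), DEFAULT_INDUSTRY)

def map_multiselect(codes: list[str]) -> str:
    # Salesforce stores multi-select picklist values as a semicolon-delimited string.
    return ";".join(sorted({map_industry(c) for c in codes}))

print(map_industry("fin"))                   # -> Banking
print(map_multiselect(["MFG", "TEC", "X"]))  # -> Manufacturing;Other;Technology
```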
### Level 2: Record Type Mappings
- [ ] Source categorization to record types mapped
- [ ] Page layout implications considered
- [ ] Business process assignments defined
- [ ] Default record type specified
- [ ] Profile-based assignments planned
### Level 3: Status and Stage Mappings
- [ ] Status value translations defined
- [ ] Stage progression logic mapped
- [ ] Closed vs open status mappings defined
- [ ] Historical status preservation planned
- [ ] Business process alignment verified
## Integration Mapping
### Level 1: External ID Management
- [ ] External ID fields identified
- [ ] External ID value uniqueness guaranteed
- [ ] Case sensitivity defined
- [ ] Null handling specified
- [ ] Update vs insert logic defined
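
Upsert-by-external-Id is the usual way to make update-vs-insert decisions without tracking Salesforce Ids in the source: a REST PATCH to `/sobjects/<Object>/<ExternalIdField>/<value>` inserts when no match exists and updates when one does. A sketch with placeholder credentials and a hypothetical `Legacy_Id__c` external Id field:

```python
import requests

INSTANCE_URL = "https://yourorg.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...session_token"                  # placeholder
API_VERSION = "v60.0"

def upsert_account(legacy_id: str, fields: dict) -> int:
    """Upsert an Account keyed on the hypothetical Legacy_Id__c external Id field."""
    resp = requests.patch(
        f"{INSTANCE_URL}/services/data/{API_VERSION}"
        f"/sobjects/Account/Legacy_Id__c/{legacy_id}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        json=fields,
    )
    resp.raise_for_status()
    return resp.status_code  # 201 = inserted, 204 = updated

print(upsert_account("A-100", {"Name": "Acme Corporation"}))
```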
### Level 2: API Considerations
- [ ] API field names verified
- [ ] Field accessibility confirmed
- [ ] Batch size optimization planned
- [ ] API version compatibility checked
- [ ] Rate limit considerations documented
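
Batch size planning usually comes down to chunking the prepared records to match the target API: 200 records per request is the limit for REST sObject Collections, while the Bulk API accepts far larger batches, so treat the size as a tuning parameter. A minimal chunking sketch:

```python
from typing import Iterator

def chunk(records: list[dict], size: int = 200) -> Iterator[list[dict]]:
    """Split prepared records into API-sized batches."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

rows = [{"Name": f"Account {i}"} for i in range(450)]
for i, batch in enumerate(chunk(rows), start=1):
    print(f"Batch {i}: {len(batch)} records")  # 200, 200, 50
```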
### Level 3: Real-time vs Batch
- [ ] Real-time integration fields identified
- [ ] Batch processing fields defined
- [ ] Sync frequency requirements mapped
- [ ] Delta identification logic specified
- [ ] Error handling approach defined
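
Delta identification is commonly implemented with a high-water mark: each sync pulls only rows modified since the last successful run. A sketch that builds the SOQL for that filter; the object and field names are illustrative, and the watermark would normally be persisted between runs rather than hard-coded.

```python
from datetime import datetime, timezone

last_sync = datetime(2024, 3, 1, tzinfo=timezone.utc)  # normally read from sync metadata

def delta_query(watermark: datetime) -> str:
    """Build a SOQL query that selects only records changed since the watermark."""
    stamp = watermark.strftime("%Y-%m-%dT%H:%M:%SZ")
    return ("SELECT Id, Name, LastModifiedDate "
            "FROM Account "
            f"WHERE LastModifiedDate > {stamp}")

print(delta_query(last_sync))
```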
## Validation and Testing
### Level 1: Mapping Validation
- [ ] Sample data mapping tested
- [ ] Edge cases identified and tested
- [ ] Transformation accuracy verified
- [ ] Relationship integrity confirmed
- [ ] Required field coverage validated
### Level 2: Data Integrity Checks
- [ ] Row count reconciliation planned
- [ ] Sum total validations defined
- [ ] Relationship counts verified
- [ ] Unique value preservation confirmed
- [ ] Data truncation checks implemented
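
Row count and control-total reconciliation can be automated by comparing the source extract with a post-load export of the target object. A sketch assuming hypothetical file and column names:

```python
import pandas as pd

# Placeholder file names: the source extract and a post-load export from Salesforce.
source = pd.read_csv("source_opportunities.csv")
target = pd.read_csv("loaded_opportunities.csv")

checks = {
    "row_count_match": len(source) == len(target),
    "amount_total_match": abs(source["amount"].sum() - target["Amount"].sum()) < 0.01,
    "unique_ids_preserved": source["legacy_id"].nunique() == target["Legacy_Id__c"].nunique(),
}
for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```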
### Level 3: Business Validation
- [ ] Business rule compliance verified
- [ ] Process flow continuity confirmed
- [ ] Reporting requirements validated
- [ ] User acceptance criteria defined
- [ ] Historical data integrity maintained
## Documentation Requirements
### Level 1: Mapping Documentation
- [ ] Field mapping spreadsheet completed
- [ ] Transformation rules documented
- [ ] Business logic explanations provided
- [ ] Assumptions clearly stated
- [ ] Dependencies identified
### Level 2: Technical Documentation
- [ ] ETL script annotations complete
- [ ] Data flow diagrams created
- [ ] Error handling logic documented
- [ ] Performance considerations noted
- [ ] Rollback procedures defined
### Level 3: Business Documentation
- [ ] Business impact analysis completed
- [ ] User training materials updated
- [ ] Data dictionary updated
- [ ] Process documentation revised
- [ ] Change management plan created
## Performance Optimization
### Level 1: Batch Processing
- [ ] Optimal batch sizes determined
- [ ] Parallel processing opportunities identified
- [ ] Memory usage optimization planned
- [ ] Network bandwidth considerations documented
- [ ] Processing window requirements met
### Level 2: Query Optimization
- [ ] Selective queries designed
- [ ] Index usage optimized
- [ ] SOQL query limits considered
- [ ] Bulk API usage planned
- [ ] API call minimization strategies defined
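
Selective queries filter on an indexed field (Id, an external Id, or a custom-indexed field) and request only the columns the mapping needs; broad filters on unindexed fields force full scans and can hit query limits on large objects. Illustrative examples, with hypothetical field names:

```python
# Selective: filters on an indexed external Id and returns only needed columns.
SELECTIVE_QUERY = (
    "SELECT Id, Name, Legacy_Id__c "
    "FROM Account "
    "WHERE Legacy_Id__c IN ('A-100', 'A-101', 'A-102')"
)

# Non-selective: an unindexed, negative-style filter that forces a full scan.
NON_SELECTIVE_QUERY = "SELECT Id, Name FROM Account WHERE Description != null"

print(SELECTIVE_QUERY)
```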
## Post-Mapping Activities
### Level 1: Review and Approval
- [ ] Technical review completed
- [ ] Business stakeholder approval obtained
- [ ] Data steward sign-off received
- [ ] Security review passed
- [ ] Architecture approval confirmed
### Level 2: Implementation Readiness
- [ ] Mapping freeze date established
- [ ] Change control process defined
- [ ] Version control implemented
- [ ] Rollback plan created
- [ ] Go-live criteria defined