# Data Flow Architecture
## Overview
This document details how data flows through the SF-Agent Framework, from user input to final output. Understanding these flows is crucial for debugging, optimization, and extension development.
## Primary Data Flows
### 1. User Input to Agent Response Flow
```mermaid
graph TD
A[User Input] --> B{Input Type}
B -->|CLI| C[Command Parser]
B -->|Web UI| D[Slash Command Processor]
B -->|API| E[API Gateway]
C --> F[Orchestrator]
D --> F
E --> F
F --> G{Phase Detection}
G -->|Planning| H[Rich Context Loader<br/>128k tokens]
G -->|Development| I[Lean Context Loader<br/>32k tokens]
H --> J[Planning Agent]
I --> K[Development Agent]
J --> L[Template Engine]
K --> M[Code Generator]
L --> N[Document Store]
M --> O[Artifact Registry]
N --> P[Response Formatter]
O --> P
P --> Q[User Output]
style A fill:#e3f2fd
style Q fill:#c8e6c9
style H fill:#fff9c4
style I fill:#ffebee
```
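The phase-detection step is what drives the framework's context reduction: planning loads a rich 128k-token context, while development loads a lean 32k-token slice. A minimal sketch of that routing, using illustrative names rather than the framework's actual API:

```javascript
// Illustrative phase routing; constant and function names are assumptions.
const PHASE_LIMITS = {
  planning: 128000, // rich context for design work
  development: 32000, // lean context for implementation
};

function selectContextLoader(phase) {
  const tokenLimit = PHASE_LIMITS[phase];
  if (!tokenLimit) {
    throw new Error(`Unknown phase: ${phase}`);
  }
  return {
    loader: phase === 'planning' ? 'rich' : 'lean',
    tokenLimit,
  };
}
```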
### 2. Document Processing Flow
```mermaid
graph LR
A[Source Document] --> B[Document Parser]
B --> C{Document Type}
C -->|Requirements| D[Requirement Processor]
C -->|Architecture| E[Architecture Processor]
C -->|Technical| F[Technical Processor]
D --> G[Sharding Engine]
E --> G
F --> G
G --> H[Story Generator]
H --> I[Story Queue]
I --> J[Context Enrichment]
J --> K[Story Files]
K --> L[Development Agent]
L --> M[Implementation]
```
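A rough sketch of the type dispatch that feeds the sharding engine; the processor implementations below are stand-ins, not the framework's real parsers:

```javascript
// Stand-in processors: each splits a parsed document into shardable sections.
const processors = {
  requirements: (doc) => doc.split(/\n(?=## )/),
  architecture: (doc) => doc.split(/\n(?=## )/),
  technical: (doc) => doc.split(/\n(?=## )/),
};

function shardDocument(type, doc) {
  const process = processors[type];
  if (!process) throw new Error(`Unsupported document type: ${type}`);
  // Each section becomes a candidate shard for the story generator.
  return process(doc).map((section, index) => ({ type, index, section }));
}
```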
### 3. Agent Handoff Data Flow
```mermaid
graph TD
A[Source Agent] --> B[Create Handoff]
B --> C[Package Artifacts]
C --> D[Generate Manifest]
D --> E[Handoff Registry]
E --> F[Notification System]
F --> G[Target Agent]
G --> H{Accept?}
H -->|Yes| I[Load Package]
H -->|No| J[Queue]
I --> K[Process Artifacts]
K --> L[Create Output]
L --> M[Complete Handoff]
M --> N[Update Registry]
N --> O[Next Handoff]
```
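The packaging and manifest steps are easiest to see as data. A hypothetical manifest shape consistent with the flow above (the exact fields are framework-defined):

```javascript
// Hypothetical handoff manifest; field names are illustrative assumptions.
const handoffManifest = {
  id: 'handoff-001',
  from: 'sf-architect',
  to: 'sf-developer',
  created: new Date().toISOString(),
  artifacts: [{ path: 'docs/architecture.md', sizeBytes: 15872 }],
  status: 'pending', // pending -> accepted (or queued) -> completed
};
```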
## Data Transformation Pipeline
### 1. Input Transformation
```javascript
// Raw Input
const raw = 'Create a customer portal with authentication';

// Tokenized
const tokenized = {
  intent: 'create',
  target: 'customer portal',
  features: ['authentication'],
};

// Contextualized
const contextualized = {
  ...tokenized,
  context: {
    project_type: 'experience_cloud',
    complexity: 'medium',
    phase: 'planning',
  },
};

// Agent-Ready
const agentTask = {
  agent: 'sf-architect',
  task: 'solution-design',
  params: {
    type: 'customer portal',
    requirements: ['authentication'],
    template: 'experience-cloud-architecture',
  },
};
```
### 2. Context Assembly Flow
```javascript
// Step 1: Identify Requirements
const requirements = {
agent: 'sf-developer',
task: 'implement-authentication',
phase: 'development',
};
// Step 2: Calculate Context Budget
const contextBudget = {
max: 32000, // Development phase
reserved: 2000, // System prompts
available: 30000,
};
// Step 3: Priority Loading
const contextPriority = [
{ file: 'current-story.md', tokens: 5000, priority: 1 },
{ file: 'coding-standards.md', tokens: 3000, priority: 2 },
{ file: 'architecture-snippet.md', tokens: 2000, priority: 3 },
{ file: 'api-reference.md', tokens: 8000, priority: 4 },
];
// Step 4: Greedy-load by priority; items that don't fit are skipped,
// so lower-priority items may still use the remaining budget
const loadedContext = [];
let tokensUsed = 0;
for (const item of contextPriority) {
if (tokensUsed + item.tokens <= contextBudget.available) {
loadedContext.push(item);
tokensUsed += item.tokens;
}
}
// Step 5: Assembly
const finalContext = {
system: 'You are sf-developer agent...',
loaded: loadedContext,
tokens: tokensUsed,
remaining: contextBudget.available - tokensUsed,
};
```
### 3. Output Generation Flow
```javascript
// Agent Output
const agentOutput = {
content: 'Generated Apex class...',
artifacts: ['CustomerPortalController.cls'],
metadata: {
agent: 'sf-developer',
timestamp: '2025-08-11T10:00:00Z',
tokens_used: 15000,
},
};
// Post-Processing
const processed = {
content: formatCode(agentOutput.content),
artifacts: validateArtifacts(agentOutput.artifacts),
validation: runValidation(agentOutput.content),
};
// Persistence
await saveToArtifactRegistry(processed);
await updateMetrics(agentOutput.metadata);
await createHandoffIfNeeded(processed);
// Format for Output Channel
const formatted = {
cli: formatForCLI(processed),
web: formatForWeb(processed),
api: formatForAPI(processed),
};
```
## State Synchronization Flow
### 1. Session State Management
```javascript
// Session Creation
const session = {
id: generateSessionId(),
user: getCurrentUser(),
started: new Date().toISOString(),
state: {
phase: 'planning',
workflow: null,
artifacts: [],
context: {
loaded: [],
used: 0,
limit: 128000,
},
},
};
// State Updates
function updateState(action) {
switch (action.type) {
case 'PHASE_CHANGE':
session.state.phase = action.payload.phase;
session.state.context.limit = getPhaseLimit(action.payload.phase);
break;
case 'ARTIFACT_CREATED':
session.state.artifacts.push(action.payload.artifact);
break;
case 'CONTEXT_LOADED':
session.state.context.loaded.push(action.payload.file);
session.state.context.used += action.payload.tokens;
break;
}
// Persist State
saveSession(session);
// Broadcast Updates
notifySubscribers(session);
}
```
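A quick usage sketch of the reducer above, assuming the `getPhaseLimit` and `saveSession` helpers it calls are in scope and that `getPhaseLimit('development')` returns the 32k development budget:

```javascript
// Switch to the development phase, then record a loaded context file.
updateState({ type: 'PHASE_CHANGE', payload: { phase: 'development' } });
updateState({ type: 'CONTEXT_LOADED', payload: { file: 'current-story.md', tokens: 5000 } });
// session.state.context.limit is now 32000 and context.used is 5000.
```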
### 2. Multi-Agent Coordination
```javascript
// Agent Communication Bus
class AgentBus {
constructor() {
this.agents = new Map();
this.messages = [];
}
// Register Agent
register(agent) {
this.agents.set(agent.id, {
agent,
state: 'idle',
queue: [],
});
}
// Send Message
send(from, to, message) {
const msg = {
id: generateId(),
from,
to,
message,
timestamp: Date.now(),
};
if (to === 'broadcast') {
// Broadcast to all agents
this.agents.forEach((agent, id) => {
if (id !== from) {
agent.queue.push(msg);
}
});
} else {
// Direct message
const target = this.agents.get(to);
if (target) {
target.queue.push(msg);
}
}
this.messages.push(msg);
this.processQueues();
}
  // Process Message Queues
  async processQueues() {
    for (const [, data] of this.agents) {
      // Drain each idle agent's queue; a plain `if` would handle only one
      // message per agent per send() call.
      while (data.state === 'idle' && data.queue.length > 0) {
        data.state = 'processing';
        const message = data.queue.shift();
        try {
          await data.agent.processMessage(message);
        } finally {
          data.state = 'idle';
        }
      }
    }
  }
}
```
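A brief usage sketch; the agents here are stubs with a `processMessage` handler, not real framework agents:

```javascript
// Stub agents for illustration only.
const bus = new AgentBus();
bus.register({
  id: 'sf-developer',
  processMessage: async (msg) => console.log('developer received', msg.message),
});
bus.register({
  id: 'sf-qa',
  processMessage: async (msg) => console.log('qa received', msg.message),
});

bus.send('sf-architect', 'sf-developer', { type: 'handoff-ready' }); // direct
bus.send('sf-architect', 'broadcast', { type: 'phase-change' }); // fan-out
```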
## Workflow Execution Flow
### 1. Interactive Workflow Data Flow
```javascript
// Workflow Definition
const workflow = {
id: 'salesforce-implementation',
phases: [
{
name: 'requirements',
interactive: true,
choices: [
{
id: 'approach',
question: 'Select approach',
options: ['comprehensive', 'rapid', 'iterative'],
},
],
},
],
};
// Execution Flow
async function executeWorkflow(workflow) {
const execution = {
workflow: workflow.id,
started: Date.now(),
state: {},
decisions: [],
artifacts: [],
};
for (const phase of workflow.phases) {
// Process Interactive Choices
if (phase.interactive) {
for (const choice of phase.choices) {
const answer = await promptUser(choice);
execution.decisions.push({
choice: choice.id,
answer,
timestamp: Date.now(),
});
          // Branch based on the decision (a phase may define no branches)
          const branch = phase.branches?.[answer];
if (branch) {
await executeBranch(branch, execution);
}
}
}
// Process Sequential Steps
if (phase.steps) {
for (const step of phase.steps) {
const result = await executeStep(step, execution);
execution.artifacts.push(...result.artifacts);
}
}
// Validation Gate
if (phase.validation) {
const valid = await validatePhase(phase, execution);
if (!valid) {
throw new Error(`Validation failed for phase: ${phase.name}`);
}
}
}
return execution;
}
```
### 2. Parallel Processing Flow
```javascript
// Parallel Track Execution
async function executeParallelTracks(tracks) {
const results = new Map();
// Start All Tracks
const promises = tracks.map(async (track) => {
const trackResult = {
id: track.id,
started: Date.now(),
artifacts: [],
status: 'running',
};
try {
// Execute Track Steps
for (const step of track.steps) {
const stepResult = await executeStep(step);
trackResult.artifacts.push(...stepResult.artifacts);
}
trackResult.status = 'completed';
trackResult.completed = Date.now();
} catch (error) {
trackResult.status = 'failed';
trackResult.error = error.message;
}
return { id: track.id, result: trackResult };
});
  // Wait for all tracks; per-track errors are caught above, so Promise.all will not reject
const trackResults = await Promise.all(promises);
// Aggregate Results
trackResults.forEach(({ id, result }) => {
results.set(id, result);
});
return results;
}
```
## Error Handling Flow
### 1. Error Propagation
```javascript
// Error Capture and Enhancement
class ErrorHandler {
handle(error, context) {
    // Enhance Error with Context (copy message/stack explicitly: spreading an
    // Error misses its non-enumerable properties)
    const enhanced = {
      message: error.message,
      stack: error.stack,
      context: {
        agent: context.agent,
        task: context.task,
        phase: context.phase,
        timestamp: Date.now(),
      },
      recovery: this.getRecoveryOptions(error),
    };
// Log Error
this.logError(enhanced);
// Determine Response
if (this.isRecoverable(error)) {
return this.attemptRecovery(enhanced);
} else {
return this.failGracefully(enhanced);
}
}
attemptRecovery(error) {
    const strategies = [
      () => this.retry(error),
      () => this.fallback(error),
      () => this.partial(error),
    ];
for (const strategy of strategies) {
try {
return strategy();
} catch (e) {
continue;
}
}
throw error;
}
}
```
### 2. Transaction Rollback Flow
```javascript
// Transactional Operations
class Transaction {
constructor() {
this.operations = [];
this.checkpoints = [];
}
async execute(operations) {
for (const op of operations) {
// Create Checkpoint
const checkpoint = await this.createCheckpoint();
this.checkpoints.push(checkpoint);
try {
// Execute Operation
const result = await op.execute();
this.operations.push({
operation: op,
result,
checkpoint,
});
} catch (error) {
// Rollback on Error
await this.rollback(checkpoint);
throw error;
}
}
// Commit if All Successful
await this.commit();
}
async rollback(toCheckpoint) {
// Reverse Operations
    const rollbackOps = this.operations
      .filter((op) => op.checkpoint.id > toCheckpoint.id)
      .reverse();
for (const op of rollbackOps) {
await op.operation.undo();
}
// Restore State
await this.restoreCheckpoint(toCheckpoint);
}
}
```
## Performance Optimization Flows
### 1. Caching Flow
```javascript
// Multi-Level Cache
class CacheManager {
constructor() {
    this.l1 = new Map(); // Memory (in-process)
    this.l2 = new FileCache(); // Disk (assumed adapter with async get/set)
    this.l3 = new RedisCache(); // Remote (assumed adapter with async get/set)
}
async get(key) {
// Check L1
if (this.l1.has(key)) {
return this.l1.get(key);
}
// Check L2
const l2Value = await this.l2.get(key);
if (l2Value) {
this.l1.set(key, l2Value);
return l2Value;
}
// Check L3
const l3Value = await this.l3.get(key);
if (l3Value) {
await this.l2.set(key, l3Value);
this.l1.set(key, l3Value);
return l3Value;
}
return null;
}
async set(key, value, ttl) {
// Write Through
this.l1.set(key, value);
await this.l2.set(key, value, ttl);
await this.l3.set(key, value, ttl);
}
}
```
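Usage sketch, assuming the `FileCache` and `RedisCache` adapters referenced above are available:

```javascript
const cache = new CacheManager();
await cache.set('story:42', { title: 'Implement auth' }, 3600); // ttl in seconds
const story = await cache.get('story:42'); // repeat reads are served from L1
```

Reads promote values upward (L3 to L2 to L1), so hot keys migrate toward the fastest tier.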
### 2. Batch Processing Flow
```javascript
// Batch Aggregation
class BatchProcessor {
constructor(options) {
this.batchSize = options.batchSize || 100;
this.timeout = options.timeout || 5000;
this.queue = [];
this.timer = null;
}
add(item) {
this.queue.push(item);
if (this.queue.length >= this.batchSize) {
this.flush();
} else if (!this.timer) {
this.timer = setTimeout(() => this.flush(), this.timeout);
}
}
async flush() {
if (this.timer) {
clearTimeout(this.timer);
this.timer = null;
}
if (this.queue.length === 0) return;
const batch = this.queue.splice(0, this.batchSize);
await this.processBatch(batch);
    if (this.queue.length > 0) {
      // Await the recursive flush so batches complete in order
      await this.flush();
    }
}
async processBatch(items) {
// Process items in parallel
const results = await Promise.all(items.map((item) => this.processItem(item)));
return results;
}
}
```
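Usage sketch; `processItem` is not defined by the class above, so this example assigns a stand-in directly:

```javascript
// Stand-in processItem for illustration.
const processor = new BatchProcessor({ batchSize: 3, timeout: 1000 });
processor.processItem = async (item) => ({ ...item, processed: true });

['a', 'b', 'c', 'd'].forEach((id) => processor.add({ id }));
// 'a'-'c' flush immediately (the batch is full); 'd' flushes after the 1s timeout.
```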
## Security Data Flow
### 1. Authentication Flow
```mermaid
graph TD
A[User Credentials] --> B[Auth Gateway]
B --> C{Auth Type}
C -->|Token| D[Token Validator]
C -->|OAuth| E[OAuth Provider]
C -->|SSO| F[SSO Provider]
D --> G[Session Creation]
E --> G
F --> G
G --> H[Permission Loading]
H --> I[Context Filtering]
I --> J[Authorized Access]
```
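A condensed sketch of the auth-type dispatch at the top of the diagram; the validators below are stand-ins rather than real token, OAuth, or SSO providers:

```javascript
// Stand-in validators; real verification is provider-specific.
const validators = {
  token: async (creds) => ({ user: creds.user, via: 'token' }),
  oauth: async (creds) => ({ user: creds.user, via: 'oauth' }),
  sso: async (creds) => ({ user: creds.user, via: 'sso' }),
};

async function authenticate(creds) {
  const validate = validators[creds.type];
  if (!validate) throw new Error(`Unsupported auth type: ${creds.type}`);
  const identity = await validate(creds);
  // Per the diagram, session creation, permission loading, and context
  // filtering follow before access is authorized.
  return { sessionId: `sess-${Date.now()}`, identity };
}
```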
### 2. Data Sanitization Flow
```javascript
// PII Detection and Removal
class DataSanitizer {
sanitize(data) {
const patterns = {
email: /[\w._%+-]+@[\w.-]+\.[A-Za-z]{2,}/g,
phone: /\b\d{3}[-.]?\d{3}[-.]?\d{4}\b/g,
ssn: /\b\d{3}-\d{2}-\d{4}\b/g,
creditCard: /\b\d{4}[\s-]?\d{4}[\s-]?\d{4}[\s-]?\d{4}\b/g,
};
let sanitized = data;
for (const [type, pattern] of Object.entries(patterns)) {
sanitized = sanitized.replace(pattern, `[${type.toUpperCase()}_REDACTED]`);
}
return sanitized;
}
}
```
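A quick example with the patterns above:

```javascript
const sanitizer = new DataSanitizer();
sanitizer.sanitize('Contact john@example.com or 555-123-4567');
// -> 'Contact [EMAIL_REDACTED] or [PHONE_REDACTED]'
```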
## Monitoring Data Flow
### 1. Metrics Collection Flow
```javascript
// Metrics Pipeline
class MetricsCollector {
constructor() {
this.metrics = [];
this.aggregator = new Aggregator();
}
collect(metric) {
const enhanced = {
...metric,
timestamp: Date.now(),
session: getCurrentSession(),
environment: getEnvironment(),
};
this.metrics.push(enhanced);
// Real-time aggregation
this.aggregator.add(enhanced);
// Batch send
if (this.metrics.length >= 100) {
this.flush();
}
}
async flush() {
const batch = this.metrics.splice(0, 100);
await this.send(batch);
}
async send(metrics) {
// Send to monitoring service
    await fetch('/metrics', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(metrics),
    });
}
}
```
## Summary
The SF-Agent Framework's data flow architecture is designed for:
1. **Efficiency**: Minimal data movement and transformation
2. **Scalability**: Parallel processing to handle increasing load
3. **Reliability**: Error handling and recovery at every level
4. **Security**: Data protection throughout the pipeline
5. **Observability**: Comprehensive monitoring and tracing
Understanding these flows enables developers to:
- Debug issues effectively
- Optimize performance
- Extend functionality
- Maintain data integrity
---
_Last Updated: 2025-08-11_
_Version: 4.0.0_