# Salesforce Governor Limits and Platform Limits

## Overview

Salesforce enforces various limits to ensure fair resource allocation and optimal platform performance. Understanding these limits is crucial for building scalable applications and avoiding runtime exceptions.

## Governor Limits (Per Transaction)

### SOQL Query Limits

| Limit                                         | Synchronous | Asynchronous |
| --------------------------------------------- | ----------- | ------------ |
| Total SOQL queries                            | 100         | 200          |
| Records retrieved by SOQL                     | 50,000      | 50,000       |
| Records retrieved by Database.getQueryLocator | 10,000      | 10,000       |
| SOSL queries                                  | 20          | 20           |
| Records retrieved by SOSL                     | 2,000       | 2,000        |

### DML Statement Limits

| Limit                    | Synchronous | Asynchronous |
| ------------------------ | ----------- | ------------ |
| Total DML statements     | 150         | 150          |
| Records processed by DML | 10,000      | 10,000       |

### Apex Execution Limits

| Limit             | Synchronous | Asynchronous |
| ----------------- | ----------- | ------------ |
| CPU time          | 10,000 ms   | 60,000 ms    |
| Heap size         | 6 MB        | 12 MB        |
| Callouts          | 100         | 100          |
| Callout time      | 120 seconds | 120 seconds  |
| Email invocations | 10          | 10           |

### Code Coverage Requirements

- Production deployment: 75% overall org coverage
- Individual triggers: every trigger must have some coverage (at least 1%)
- No coverage required for test classes

## Per-Transaction Apex Limits

### Collection Limits

```apex
// Apex collections have no fixed element limit; they are bounded only by the heap size.
// (The familiar 1,000-item cap applies to collections bound to Visualforce iteration
// components, not to Apex lists themselves.)
List<Account> accounts = new List<Account>();

// Strings are likewise bounded by the heap size limit (6 MB synchronous, 12 MB asynchronous);
// a string this large would approach or exceed the synchronous limit
String longString = 'a'.repeat(6000000);

// Maximum stack depth for recursive methods: 1,000
```

### Describe Limits

- Describe calls: 100
- Fields describes: 100
- Record type describes: 100
- Child relationship describes: 100

Note: per-transaction describe limits have since been removed from the platform; the corresponding `Limits` describe methods are deprecated.

### Future Method Limits

```apex
// Per 24 hours: 250,000 asynchronous executions or (user licenses x 200), whichever is greater
Integer dailyLimit = Math.max(250000, licenses * 200); // 'licenses' = number of user licenses

// Future method calls per transaction
Integer perTransaction = 50;

// Example implementation: split work into chunks of 200
@future
public static void processFuture(Set<Id> recordIds) {
    Set<Id> batch = new Set<Id>();
    for (Id recordId : recordIds) {
        batch.add(recordId);
        if (batch.size() == 200) {
            processBatch(batch);
            batch = new Set<Id>();
        }
    }
    // Handle the final partial chunk
    if (!batch.isEmpty()) {
        processBatch(batch);
    }
}
```

### Queueable Apex Limits

```apex
// Chain depth: 5 in Developer Edition and Trial orgs; other editions have no
// chain-depth limit, but an executing job may enqueue only one child job
Integer maxChainDepthDev = 5;

// Jobs added with System.enqueueJob per transaction
Integer enqueueLimit = 50;

// Example with chaining
public class ChainedQueueable implements Queueable {
    private Integer chainDepth;

    public ChainedQueueable(Integer depth) {
        this.chainDepth = depth;
    }

    public void execute(QueueableContext context) {
        // Process logic
        // hasMoreWork() is a placeholder for an application-specific completion check
        if (chainDepth < 5 && hasMoreWork()) {
            System.enqueueJob(new ChainedQueueable(chainDepth + 1));
        }
    }
}
```

## Batch Apex Limits

### Batch Size Limits

```apex
// Default batch size: 200
// Maximum batch size: 2,000
// Minimum batch size: 1
global class MyBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can return up to 50 million records
        return Database.getQueryLocator([
            SELECT Id FROM Account
        ]);
    }

    global void execute(Database.BatchableContext bc, List<Account> scope) {
        // Process up to 2,000 records per batch
        // Subject to normal governor limits
    }

    global void finish(Database.BatchableContext bc) {
        // Required by Database.Batchable; post-processing goes here
    }
}

// Execute with a specific batch size
Database.executeBatch(new MyBatch(), 200);
```

### Concurrent Batch Jobs

- Maximum concurrent batches: 5 (queued or active; see the capacity check sketched below)
- Flex Queue can hold: 100 jobs
- Batch jobs in 24 hours: 250,000
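Whether another batch can safely be submitted against these caps can be checked at runtime by counting `AsyncApexJob` records. A minimal sketch, assuming the caller simply skips submission when no slot is free (the class name and the combined 105-slot threshold are illustrative):

```apex
public with sharing class BatchCapacityCheck {
    // 'Holding' = waiting in the Apex Flex Queue; the other statuses are jobs that
    // occupy or are about to occupy one of the 5 concurrent batch slots
    private static final Set<String> PENDING_STATUSES =
        new Set<String>{ 'Holding', 'Queued', 'Preparing', 'Processing' };

    // Returns true when submitting one more batch should not overflow
    // the 5 concurrent slots plus the 100-job Flex Queue
    public static Boolean canSubmitBatch() {
        Integer pending = [
            SELECT COUNT()
            FROM AsyncApexJob
            WHERE JobType = 'BatchApex'
              AND Status IN :PENDING_STATUSES
        ];
        return pending < 105; // 5 concurrent + 100 Flex Queue slots
    }
}

// Usage (illustrative): only enqueue when capacity is available
if (BatchCapacityCheck.canSubmitBatch()) {
    Database.executeBatch(new MyBatch(), 200);
}
```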
## Platform Events Limits

### Publishing Limits

| Limit                | Standard Volume | High Volume |
| -------------------- | --------------- | ----------- |
| Events per batch     | 1,000           | 1,000       |
| Event message size   | 1 MB            | 1 MB        |
| Publishing per hour  | 10,000          | 250,000     |
| Delivered events/24h | 10,000          | 1,000,000   |

### Subscription Limits

```apex
// Maximum CometD clients per org
Integer maxClients = 1000; // 2,000 for Performance Edition

// Events per channel
Integer eventsPerChannel = 100000; // Per 24 hours

// Retention period
Integer retentionHours = 24; // 72 for high volume
```

## Storage Limits

### Data Storage

| Edition      | Base Storage | Per User |
| ------------ | ------------ | -------- |
| Developer    | 5 MB         | N/A      |
| Professional | 1 GB         | 20 MB    |
| Enterprise   | 1 GB         | 20 MB    |
| Unlimited    | 1 GB         | 120 MB   |
| Performance  | 1 GB         | 120 MB   |

### File Storage

| Edition      | Base Storage | Per User |
| ------------ | ------------ | -------- |
| All Editions | 10 GB        | 612 MB   |

### Big Objects Storage

- Maximum records: 1 billion
- No impact on data storage limits
- Query limitations apply

## API Request Limits

### Daily API Calls

| Edition      | Base   | Per License |
| ------------ | ------ | ----------- |
| Developer    | 15,000 | N/A         |
| Professional | 1,000  | 200         |
| Enterprise   | 15,000 | 1,000       |
| Unlimited    | 15,000 | 5,000       |
| Performance  | 15,000 | 10,000      |

### Concurrent API Request Limits

```apex
// Long-running requests (>20 seconds)
Integer longRunningLimit = 25;

// Total concurrent requests
Integer totalConcurrent = 100;

// Monitoring API usage
Map<String, System.OrgLimit> limitsMap = OrgLimits.getMap();
System.OrgLimit apiLimit = limitsMap.get('DailyApiRequests');
Integer remaining = apiLimit.getLimit() - apiLimit.getValue();
```

## Email Limits

### Daily Limits

- Single emails: 5,000
- Mass emails: 5,000
- Total recipients: 10,000 external emails

### Per Transaction

```apex
// Maximum sendEmail calls per transaction
Integer maxCalls = 10;

// Example with batching
// Note: each Messaging.sendEmail call counts toward the 10-invocation limit,
// so this loop can run at most 10 times in a single transaction
public static void sendEmails(List<Messaging.SingleEmailMessage> emails) {
    while (!emails.isEmpty()) {
        List<Messaging.SingleEmailMessage> batch = new List<Messaging.SingleEmailMessage>();
        for (Integer i = 0; i < 10 && !emails.isEmpty(); i++) {
            batch.add(emails.remove(0));
        }
        Messaging.sendEmail(batch);
    }
}
```

## Sharing Limits

### Ownership Skew

- Maximum records owned by a single user: 10,000
- Account data skew: 10,000 child records per parent account

### Sharing Rules

- Maximum sharing rules: 300 per object
- Maximum criteria-based sharing rules: 50 per object
- Manual shares: 30,000,000 per org

## Metadata Limits

### Custom Objects and Fields

| Item                        | Limit                                 |
| --------------------------- | ------------------------------------- |
| Custom objects              | 2,000 (Enterprise), 2,500 (Unlimited) |
| Custom fields per object    | 500 (varies by edition and object)    |
| Total custom fields         | 900,000                               |
| Lookup relationships        | 40 per object                         |
| Master-detail relationships | 2 per object                          |

### Automation Limits

| Type                      | Limit                                   |
| ------------------------- | --------------------------------------- |
| Active workflow rules     | 50 per object                           |
| Total workflow rules      | 500 per object                          |
| Active triggers           | No limit per object (best practice: 1)  |
| Process Builder processes | 50 active                               |
| Flow interviews per hour  | 1,000,000 (org-wide)                    |

## Visualforce Limits

### Page Limits

- View state size: 170 KB (see the sketch below)
- Maximum response size: 15 MB
- Collection items in iteration components: 1,000
- SOQL queries: 100

### Component Limits

```xml
<!-- Maximum components per page -->
<apex:page>
    <!-- Up to 1,000 components -->
    <apex:repeat value="{!items}" var="item">
        <!-- Collection limited to 1,000 items -->
    </apex:repeat>
</apex:page>
```
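The 170 KB view state cap is usually managed by keeping large or recomputable data out of the serialized controller state. A minimal sketch, assuming a custom controller (the class, field, and query are illustrative); fields declared `transient`, like static and final fields, are not saved in view state:

```apex
// Hypothetical controller illustrating view state reduction
public with sharing class AccountListController {
    // Kept in view state: needed across postbacks
    public List<Account> selectedAccounts { get; set; }

    // Excluded from view state: recomputed on each request
    public transient List<Account> searchResults { get; set; }

    public AccountListController() {
        selectedAccounts = new List<Account>();
    }

    public void runSearch() {
        // Re-query instead of carrying a large result set in view state
        searchResults = [SELECT Id, Name FROM Account LIMIT 1000];
    }
}
```

On a postback, `searchResults` comes back as null and must be rebuilt; that recomputation is the trade-off for the smaller view state.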
## Lightning Component Limits

### Performance Limits

- Component bundle size: 4 MB
- Maximum components per app: 10,000
- JavaScript controller size: 1 MB
- Server calls per transaction: 100

### Storable Actions

```javascript
// Cache duration
action.setStorable({
  ignoreExisting: false,
  refresh: 900, // 15 minutes
});

// Maximum cache size: 32 KB per action
```

## Report and Dashboard Limits

### Report Limits

- Custom report types: 2,000
- Reports per folder: 2,000
- Dashboard components: 20
- Dashboard filters: 10
- Scheduled reports: 200 per hour

### Data Limits

- Rows returned: 2,000 (can be increased to 50,000)
- Report runtime: 10 minutes
- Export rows: 1 million

## Best Practices for Limit Management

### Monitoring Limits

```apex
// Real-time limit monitoring
public class LimitMonitor {
    public static void checkLimits() {
        System.debug('SOQL Queries: ' + Limits.getQueries() + '/' + Limits.getLimitQueries());
        System.debug('DML Rows: ' + Limits.getDmlRows() + '/' + Limits.getLimitDmlRows());
        System.debug('CPU Time: ' + Limits.getCpuTime() + '/' + Limits.getLimitCpuTime());
        System.debug('Heap Size: ' + Limits.getHeapSize() + '/' + Limits.getLimitHeapSize());
    }

    public static Boolean isNearLimit(String limitName, Decimal threshold) {
        Map<String, System.OrgLimit> limitsMap = OrgLimits.getMap();
        System.OrgLimit orgLimit = limitsMap.get(limitName);
        Decimal usage = (Decimal) orgLimit.getValue() / orgLimit.getLimit();
        return usage >= threshold;
    }
}
```

### Bulkification Patterns

```apex
// Bad: SOQL in loop
for (Account acc : accounts) {
    Contact c = [SELECT Id FROM Contact WHERE AccountId = :acc.Id];
}

// Good: Bulk query
Set<Id> accountIds = new Set<Id>();
for (Account acc : accounts) {
    accountIds.add(acc.Id);
}
List<Contact> contacts = [SELECT Id, AccountId FROM Contact WHERE AccountId IN :accountIds];
```

### Limit Exception Handling

Note that `System.LimitException` cannot be caught, so operations cannot be wrapped in try/catch as a safety net. Instead, check remaining headroom with the `Limits` class before the operation and defer work asynchronously when a limit is close.

```apex
// Governor limit violations throw System.LimitException, which cannot be caught.
// Check remaining capacity before the operation instead.
if (Limits.getDmlStatements() < Limits.getLimitDmlStatements() &&
    Limits.getDmlRows() + remainingRecords.size() <= Limits.getLimitDmlRows()) {
    performDMLOperations();
} else {
    // Defer the remaining work to asynchronous processing
    System.enqueueJob(new AsyncProcessor(remainingRecords));
}
```

## Common Limit Scenarios

### Scenario 1: Large Data Volume

```apex
// Use Batch Apex for processing large datasets
public class LargeDataProcessor implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator([
            SELECT Id FROM Account WHERE NeedsProcessing__c = true
        ]);
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // Process in chunks of 200
    }

    public void finish(Database.BatchableContext bc) {
        // Required by Database.Batchable; post-processing goes here
    }
}
```

### Scenario 2: Complex Calculations

```apex
// Use Queueable Apex for CPU-intensive operations (asynchronous CPU limit: 60,000 ms)
public class ComplexCalculator implements Queueable {
    public void execute(QueueableContext context) {
        // Perform complex calculations under the higher asynchronous CPU limit
    }
}
```

### Scenario 3: Integration Callouts

```apex
// Use Continuation for long-running callouts
public class ContinuationController {
    public Object startRequest() {
        Continuation con = new Continuation(120); // timeout in seconds (maximum 120)
        con.continuationMethod = 'processResponse';

        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://api.example.com/data');
        req.setMethod('GET');
        con.addHttpRequest(req);

        return con;
    }

    // Callback named in continuationMethod; invoked when the response arrives
    public Object processResponse() {
        return null;
    }
}
```

## Limit Increase Options

### Purchasable Increases

- API calls: Additional packages available
- Storage: Additional data and file storage
- Sandboxes: Additional sandbox licenses
- Platform Events: High-volume add-on

### Performance Solutions

- Platform Cache: Reduce SOQL queries (see the sketch below)
- Big Objects: Archive historical data
- External Objects: Store data externally
- Skinny Tables: Improve query performance
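As an example of the Platform Cache option above, a query result can be stored in an org cache partition so repeated requests do not each spend a SOQL query. A minimal sketch, assuming an org cache partition named `local.LimitsDemo` has been created in Setup and that `IsActive__c` is an illustrative custom field:

```apex
public with sharing class CachedAccountStats {
    // Assumed partition; must exist under Setup > Platform Cache
    private static final String PARTITION = 'local.LimitsDemo';
    private static final String KEY = 'activeAccountCount';

    public static Integer getActiveAccountCount() {
        Cache.OrgPartition part = Cache.Org.getPartition(PARTITION);
        Integer cached = (Integer) part.get(KEY);
        if (cached == null) {
            // Cache miss: spend one SOQL query, then cache the result for 5 minutes
            cached = [SELECT COUNT() FROM Account WHERE IsActive__c = true];
            part.put(KEY, cached, 300);
        }
        return cached;
    }
}
```

Callers should still tolerate cache misses and evictions; the cache only reduces how often the query runs, it does not remove the need for it.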
## Additional Resources

- [Salesforce Developer Limits Quick Reference](https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/)
- [Execution Governors and Limits](https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_gov_limits.htm)
- [API Limits](https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_api.htm)
- [Bulk API Limits](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_limits.htm)