# Salesforce Governor Limits Reference
## Overview
Governor limits are runtime limits enforced by Salesforce to ensure efficient
resource usage in its multi-tenant environment. Understanding and designing
around these limits is crucial for building scalable applications.
## Per-Transaction Apex Limits
### SOQL Queries
**Limit**: 100 SOQL queries (200 in async)
**Common Violations**:
```apex
// BAD: query inside a loop (one SOQL query per iteration)
for(Account acc : accounts) {
    List<Contact> contacts = [SELECT Id FROM Contact WHERE AccountId = :acc.Id];
}

// GOOD: single query with a parent-to-child relationship subquery
List<Account> accounts = [SELECT Id, (SELECT Id FROM Contacts) FROM Account];
```
**Best Practices**:
- Use relationship queries
- Collect IDs and query once
- Cache query results
- Use selective filters
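The "cache query results" practice can be sketched as a static map populated on first use; `RecordTypeCache` and its hard-coded `SObjectType` filter below are illustrative, not a standard utility:

```apex
// Sketch: a static cache makes repeated lookups within one
// transaction cost a single SOQL query.
public class RecordTypeCache {
    private static Map<String, Id> cache;

    public static Id getRecordTypeId(String developerName) {
        if(cache == null) {
            cache = new Map<String, Id>();
            for(RecordType rt : [SELECT Id, DeveloperName FROM RecordType
                                 WHERE SObjectType = 'Account']) {
                cache.put(rt.DeveloperName, rt.Id);
            }
        }
        return cache.get(developerName);
    }
}
```

Because the cache is static, it lives only for the current transaction; the first call pays the query cost and every later call is a map lookup.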
### DML Statements
**Limit**: 150 DML statements
**Common Violations**:
```apex
// BAD: DML in loop
for(Contact con : contacts) {
    con.LastName = 'Updated';
    update con; // one DML statement per iteration!
}

// GOOD: bulk DML
List<Contact> contactsToUpdate = new List<Contact>();
for(Contact con : contacts) {
    con.LastName = 'Updated';
    contactsToUpdate.add(con);
}
update contactsToUpdate; // single DML statement
```
### DML Rows
**Limit**: 10,000 rows
**Handling Large Data Volumes**:
```apex
public class BatchProcessor implements Database.Batchable<sObject> {
    public Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator('SELECT Id FROM Account WHERE NeedsProcessing__c = true');
    }
    public void execute(Database.BatchableContext BC, List<Account> scope) {
        // Process up to 200 records at a time
        for(Account acc : scope) {
            acc.Processed__c = true;
        }
        update scope;
    }
    public void finish(Database.BatchableContext BC) {
        // Send notification
    }
}
```
### CPU Time
**Limit**: 10,000 ms (10 seconds) synchronous, 60,000 ms async
**CPU Optimization**:
```apex
// BAD: inefficient string concatenation
String result = '';
for(Integer i = 0; i < 10000; i++) {
    result += 'Item ' + i + ', '; // each += copies the whole string
}

// GOOD: collect parts, then join once
List<String> parts = new List<String>();
for(Integer i = 0; i < 10000; i++) {
    parts.add('Item ' + i);
}
String result = String.join(parts, ', ');
```
### Heap Size
**Limit**: 6 MB synchronous, 12 MB async
**Memory Management**:
```apex
// BAD: loading every attachment body into memory at once
List<Attachment> allAttachments = [SELECT Id, Body FROM Attachment];

// GOOD: a SOQL for loop fetches records in chunks of 200, so only
// one chunk sits on the heap at a time (processBatch is a placeholder)
public void processAttachments() {
    for(List<Attachment> batch : [SELECT Id, Body FROM Attachment]) {
        processBatch(batch);
    }
}
// Avoid OFFSET-based pagination for this: SOQL OFFSET is capped at
// 2,000 rows, and each page costs an additional SOQL query.
```
## Query Limits
### SOQL Query Rows
**Limit**: 50,000 rows
**Handling Large Queries**:
```apex
// Use LIMIT and selective filters
List<Account> accounts = [SELECT Id, Name FROM Account
                          WHERE CreatedDate = TODAY
                          AND Industry = 'Technology'
                          LIMIT 10000];

// For larger datasets, use Batch Apex
Database.executeBatch(new LargeDataProcessor(), 200);
```
### SOSL Queries
**Limit**: 20 SOSL queries
**Efficient Searching**:
```apex
// Combine searches when possible
List<List<SObject>> searchResults = [FIND :searchTerm IN ALL FIELDS
                                     RETURNING
                                         Account(Id, Name),
                                         Contact(Id, FirstName, LastName),
                                         Opportunity(Id, Name)];
```
## Email Limits
### Single Email Messages
**Limit**: 5,000 single emails to external addresses per org per day
**Email Management**:
```apex
public class EmailManager {
    // NOTE: a static counter only lives for the current transaction;
    // tracking the true daily total would require persistent storage
    private static Integer emailsSent = 0;
    private static final Integer DAILY_LIMIT = 5000;
    private static final Integer BATCH_SIZE = 100;

    public static void sendEmails(List<Messaging.SingleEmailMessage> emails) {
        if(emailsSent + emails.size() > DAILY_LIMIT) {
            // Queue for tomorrow or use an alternative channel
            // (queueEmailsForLater is a placeholder for that logic)
            queueEmailsForLater(emails);
            return;
        }
        // Send in batches; Messaging.sendEmail itself may only be
        // invoked 10 times per transaction
        for(Integer i = 0; i < emails.size(); i += BATCH_SIZE) {
            Integer endIndex = Math.min(i + BATCH_SIZE, emails.size());
            List<Messaging.SingleEmailMessage> batch = new List<Messaging.SingleEmailMessage>();
            for(Integer j = i; j < endIndex; j++) {
                batch.add(emails[j]);
            }
            Messaging.sendEmail(batch);
            emailsSent += batch.size();
        }
    }
}
```
## Callout Limits
### HTTP Callouts
**Limit**: 100 callouts
**Callout Optimization**:
```apex
public class CalloutManager {
    // Batch multiple operations into a single callout.
    // BatchRequest, AccountSync, and processBatchResponse are placeholder
    // types/methods representing the remote API's bulk contract.
    public static void syncAccounts(List<Account> accounts) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://api.example.com/bulk-sync');
        req.setMethod('POST');
        // Send all accounts in one callout
        BatchRequest batchReq = new BatchRequest();
        for(Account acc : accounts) {
            batchReq.addOperation(new AccountSync(acc));
        }
        req.setBody(JSON.serialize(batchReq));
        Http http = new Http();
        HttpResponse res = http.send(req);
        // Process the batch response
        processBatchResponse(res.getBody());
    }
}
```
### Callout Time
**Limit**: 120 seconds total, 120 seconds per callout
**Timeout Handling**:
```apex
public class ResilientCallout {
    public static HttpResponse makeCallout(String endpoint, String body) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint(endpoint);
        req.setMethod('POST');
        req.setBody(body);
        req.setTimeout(30000); // 30-second timeout (maximum is 120,000 ms)
        Http http = new Http();
        try {
            return http.send(req);
        } catch(CalloutException e) {
            if(e.getMessage().contains('timeout')) {
                // Handle timeout, e.g. retry with a smaller payload
                // (makeCalloutWithRetry is a placeholder for that logic)
                return makeCalloutWithRetry(endpoint, body);
            }
            throw e;
        }
    }
}
```
## Async Apex Limits
### Future Methods
**Limit**: 50 per transaction, 250,000 per 24 hours
**Future Method Usage**:
```apex
public class AsyncProcessor {
    // Check remaining future-call headroom before calling
    public static void processAsync(Set<Id> recordIds) {
        Integer futureJobs = Limits.getFutureCalls();
        Integer futureLimit = Limits.getLimitFutureCalls();
        if(futureJobs < futureLimit) {
            processFuture(recordIds);
        } else {
            // Fall back to a Queueable (ProcessQueueable is a placeholder)
            System.enqueueJob(new ProcessQueueable(recordIds));
        }
    }

    @future
    private static void processFuture(Set<Id> recordIds) {
        // Async processing
    }
}
```
### Queueable Jobs
**Limit**: 50 jobs enqueued per transaction; a running Queueable may chain only 1 job
**Chaining Queueables**:
```apex
public class ChainedQueueable implements Queueable {
    private List<Id> recordIds;
    private Integer batchNumber;

    public ChainedQueueable(List<Id> recordIds, Integer batchNumber) {
        this.recordIds = recordIds;
        this.batchNumber = batchNumber;
    }

    public void execute(QueueableContext context) {
        // Process one batch (processRecords is a placeholder)
        List<Id> currentBatch = new List<Id>();
        Integer batchSize = 200;
        for(Integer i = 0; i < batchSize && !recordIds.isEmpty(); i++) {
            currentBatch.add(recordIds.remove(0));
        }
        processRecords(currentBatch);
        // Chain the next job if records remain
        if(!recordIds.isEmpty() && !Test.isRunningTest()) {
            System.enqueueJob(new ChainedQueueable(recordIds, batchNumber + 1));
        }
    }
}
```
### Batch Apex
**Limit**: 5 batch jobs queued or active concurrently; 250,000 asynchronous Apex executions per rolling 24 hours (shared with future, Queueable, and scheduled Apex)
**Batch Design**:
```apex
public class OptimizedBatch implements Database.Batchable<sObject>, Database.Stateful {
    private Integer recordsProcessed = 0;
    private List<String> errors = new List<String>();

    public Database.QueryLocator start(Database.BatchableContext BC) {
        // A QueryLocator can cover up to 50 million records
        return Database.getQueryLocator([
            SELECT Id, Name FROM Account
            WHERE LastModifiedDate = LAST_N_DAYS:7
        ]);
    }

    public void execute(Database.BatchableContext BC, List<Account> scope) {
        List<Account> toUpdate = new List<Account>();
        for(Account acc : scope) {
            if(processAccount(acc)) { // processAccount is a placeholder
                toUpdate.add(acc);
            }
        }
        // allOrNone = false allows partial success
        Database.SaveResult[] results = Database.update(toUpdate, false);
        for(Integer i = 0; i < results.size(); i++) {
            if(!results[i].isSuccess()) {
                errors.add('Failed to update ' + toUpdate[i].Id + ': ' +
                           results[i].getErrors()[0].getMessage());
            } else {
                recordsProcessed++;
            }
        }
    }

    public void finish(Database.BatchableContext BC) {
        // Send a summary email
        AsyncApexJob job = [SELECT Status, NumberOfErrors FROM AsyncApexJob
                            WHERE Id = :BC.getJobId()];
        String body = 'Batch completed. Processed: ' + recordsProcessed + '\n';
        if(!errors.isEmpty()) {
            body += 'Errors:\n' + String.join(errors, '\n');
        }
        // Send notification
    }
}
```
## Platform Events
### Event Publishing
**Limit**: 250,000 events per hour
**Efficient Publishing**:
```apex
public class EventPublisher {
    private static List<Order_Event__e> eventBuffer = new List<Order_Event__e>();
    private static final Integer BUFFER_SIZE = 200;

    public static void publishOrderEvent(Order__c order) {
        Order_Event__e event = new Order_Event__e(
            Order_Id__c = order.Id,
            Status__c = order.Status__c,
            Amount__c = order.Total_Amount__c
        );
        eventBuffer.add(event);
        if(eventBuffer.size() >= BUFFER_SIZE) {
            flushEvents();
        }
    }

    public static void flushEvents() {
        if(!eventBuffer.isEmpty()) {
            List<Database.SaveResult> results = EventBus.publish(eventBuffer);
            // Handle any failures (logError is a placeholder)
            for(Integer i = 0; i < results.size(); i++) {
                if(!results[i].isSuccess()) {
                    logError('Event publish failed: ' + results[i].getErrors()[0]);
                }
            }
            eventBuffer.clear();
        }
    }
}
```
## Describe Limits
### Describe Calls
**Limit**: Describe-call limits (formerly 100 per transaction) were removed in the Spring '15 release, but describe calls, especially `Schema.getGlobalDescribe()`, remain CPU-intensive, so caching is still worthwhile
**Caching Describe Results**:
```apex
public class SchemaCache {
    private static Map<String, Schema.DescribeSObjectResult> objectDescribeMap =
        new Map<String, Schema.DescribeSObjectResult>();
    private static Map<String, Map<String, Schema.DescribeFieldResult>> fieldDescribeMap =
        new Map<String, Map<String, Schema.DescribeFieldResult>>();

    public static Schema.DescribeSObjectResult getObjectDescribe(String objectName) {
        if(!objectDescribeMap.containsKey(objectName)) {
            objectDescribeMap.put(
                objectName,
                Schema.getGlobalDescribe().get(objectName).getDescribe()
            );
        }
        return objectDescribeMap.get(objectName);
    }

    public static Schema.DescribeFieldResult getFieldDescribe(String objectName, String fieldName) {
        if(!fieldDescribeMap.containsKey(objectName)) {
            fieldDescribeMap.put(objectName, new Map<String, Schema.DescribeFieldResult>());
        }
        Map<String, Schema.DescribeFieldResult> fieldMap = fieldDescribeMap.get(objectName);
        if(!fieldMap.containsKey(fieldName)) {
            Schema.DescribeSObjectResult objDescribe = getObjectDescribe(objectName);
            fieldMap.put(
                fieldName,
                objDescribe.fields.getMap().get(fieldName).getDescribe()
            );
        }
        return fieldMap.get(fieldName);
    }
}
```
## Best Practices for Governor Limits
### Bulkification
```apex
public class BulkificationExample {
    public static void updateAccountContacts(List<Account> accounts) {
        // Collect all account IDs
        Set<Id> accountIds = new Set<Id>();
        for(Account acc : accounts) {
            accountIds.add(acc.Id);
        }
        // Single query for all contacts
        Map<Id, List<Contact>> accountContactMap = new Map<Id, List<Contact>>();
        for(Contact con : [SELECT Id, AccountId, Email FROM Contact
                           WHERE AccountId IN :accountIds]) {
            if(!accountContactMap.containsKey(con.AccountId)) {
                accountContactMap.put(con.AccountId, new List<Contact>());
            }
            accountContactMap.get(con.AccountId).add(con);
        }
        // Process all contacts
        List<Contact> contactsToUpdate = new List<Contact>();
        for(Account acc : accounts) {
            List<Contact> contacts = accountContactMap.get(acc.Id);
            if(contacts != null) {
                for(Contact con : contacts) {
                    con.Email = acc.Email__c;
                    contactsToUpdate.add(con);
                }
            }
        }
        // Single DML
        if(!contactsToUpdate.isEmpty()) {
            update contactsToUpdate;
        }
    }
}
```
### Limit Monitoring
```apex
public class LimitMonitor {
    public static void checkLimits() {
        System.debug('SOQL Queries: ' + Limits.getQueries() + ' of ' + Limits.getLimitQueries());
        System.debug('DML Statements: ' + Limits.getDmlStatements() + ' of ' + Limits.getLimitDmlStatements());
        System.debug('CPU Time: ' + Limits.getCpuTime() + ' of ' + Limits.getLimitCpuTime());
        System.debug('Heap Size: ' + Limits.getHeapSize() + ' of ' + Limits.getLimitHeapSize());
        // Warn if approaching limits
        if(Limits.getQueries() > Limits.getLimitQueries() * 0.8) {
            System.debug(LoggingLevel.WARN, 'Approaching SOQL query limit!');
        }
    }

    public static Boolean canMakeCallout() {
        return Limits.getCallouts() < Limits.getLimitCallouts();
    }

    public static Boolean canSendEmail(Integer count) {
        return Limits.getEmailInvocations() + count <= Limits.getLimitEmailInvocations();
    }
}
```
## Common Limit Violations and Solutions
### Too Many SOQL Queries
**Problem**: A query runs for each record inside a trigger loop.
**Solution**: Move queries outside loops; collect IDs and query once.
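In a trigger context, the fix looks like this minimal sketch (the trigger name and the Contact processing are illustrative):

```apex
trigger AccountTrigger on Account (after update) {
    // One query covers every record in the trigger batch,
    // instead of one query per record
    List<Contact> contacts = [SELECT Id, AccountId FROM Contact
                              WHERE AccountId IN :Trigger.newMap.keySet()];
    // ... process contacts in bulk ...
}
```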
### CPU Time Limit Exceeded
**Problem**: Complex calculations performed inside loops.
**Solution**: Optimize algorithms; use a Map for lookups instead of nested loops.
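The Map-lookup technique replaces a nested-loop match; this sketch assumes `accounts` and `contacts` lists are already in scope:

```apex
// BAD: nested loops are O(n * m) and CPU-intensive
for(Contact con : contacts) {
    for(Account acc : accounts) {
        if(con.AccountId == acc.Id) {
            con.Phone = acc.Phone;
        }
    }
}

// GOOD: build a Map once, then each lookup is O(1)
Map<Id, Account> accountById = new Map<Id, Account>(accounts);
for(Contact con : contacts) {
    Account acc = accountById.get(con.AccountId);
    if(acc != null) {
        con.Phone = acc.Phone;
    }
}
```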
### Heap Size Exceeded
**Problem**: Large data sets held in memory.
**Solution**: Process in chunks and clear collections when done.
### Too Many DML Statements
**Problem**: DML statements inside loops.
**Solution**: Collect records and perform a single bulk DML operation.
### View State Exceeded (Visualforce)
**Problem**: Large collections stored in view state.
**Solution**: Use transient variables and pagination.
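A sketch of both techniques in a Visualforce controller (the class name and query are illustrative):

```apex
public class AccountListController {
    // transient: excluded from view state serialization,
    // so this list must be rebuilt on each request
    public transient List<Account> searchResults { get; set; }

    // StandardSetController keeps only one page of records in view state
    public ApexPages.StandardSetController setCon {
        get {
            if(setCon == null) {
                setCon = new ApexPages.StandardSetController(
                    Database.getQueryLocator([SELECT Id, Name FROM Account]));
                setCon.setPageSize(25);
            }
            return setCon;
        }
        set;
    }

    public List<Account> getAccounts() {
        return (List<Account>) setCon.getRecords();
    }
}
```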