# Salesforce API Limits
## Overview
Salesforce enforces various API limits to ensure platform stability and fair
resource allocation across all customers. Understanding these limits is crucial
for designing scalable applications and integrations.
## API Types and Daily Limits
### REST API Limits
| Edition | Daily API Calls | Per License |
| ------------ | --------------- | ----------------------- |
| Developer | 15,000 | N/A |
| Professional | 1,000 | + (# licenses × 200) |
| Enterprise | 15,000 | + (# licenses × 1,000) |
| Unlimited | 15,000 | + (# licenses × 5,000) |
| Performance | 15,000 | + (# licenses × 10,000) |
### SOAP API Limits
- Same limits as REST API
- Counted together with REST API calls
- Legacy but still widely used
### Bulk API Limits
| Edition | Daily Batches | Records/Day |
| --------------------- | ------------- | ----------- |
| Developer | 15,000 | 5,000,000 |
| Professional | 1,000 | 1,000,000 |
| Enterprise | 10,000 | 10,000,000 |
| Unlimited/Performance | 15,000 | 15,000,000 |
### Streaming API Limits
| Limit Type | Developer | Enterprise | Unlimited |
| ------------------- | --------- | ---------- | --------- |
| Daily Events (24h) | 10,000 | 100,000 | 200,000 |
| Concurrent Clients | 20 | 1,000 | 2,000 |
| Maximum Events/Hour | 3,600 | 36,000 | 72,000 |
## Concurrent Request Limits
### Synchronous Limits
- **Long-Running Requests**: 25 concurrent requests lasting > 20 seconds
- **Short Requests**: 100 total concurrent requests
- **SOAP API Queries**: 5 concurrent queries
### Asynchronous Limits
- **Batch Apex**: 5 concurrent jobs (queued or active)
- **Future Methods**: 50 calls per transaction
- **Queueable Jobs**: 50 enqueued per transaction
- **Platform Events**: 100,000 events/hour
## API-Specific Limits
### Metadata API
- **Deploy/Retrieve**: 10,000 files per deployment
- **Concurrent Deployments**: 50
- **Single File Size**: 39 MB (uncompressed)
- **ZIP File Size**: 39 MB (compressed)
### Connect API (Chatter REST API)
- **Requests per Hour**: 5,000
- **Rate Limit Window**: 1 hour rolling
- **Feed Items per Day**: 1,000 per user
### Analytics API
- **Synchronous Reports**: 500 per hour
- **Asynchronous Reports**: 1,200 per hour
- **Dashboard Refreshes**: 200 per hour
- **Report Instances**: 2,000 per org
## Rate Limiting
### API Request Metering
```python
# Example: Checking API limits in response headers
import requests

# instance_url and access_token obtained via OAuth beforehand
response = requests.get(
    f'{instance_url}/services/data/v58.0/sobjects/Account',
    headers={'Authorization': f'Bearer {access_token}'}
)

# Salesforce reports usage in the Sforce-Limit-Info header,
# e.g. "api-usage=1998/15000" (calls used / daily maximum)
api_usage = response.headers.get('Sforce-Limit-Info')
used, maximum = api_usage.split('=')[1].split('/')
remaining_calls = int(maximum) - int(used)
print(f"Remaining API calls: {remaining_calls}")
```
### Composite Resource Limits
- **Composite**: 25 subrequests
- **Composite Batch**: 25 subrequests
- **SObject Collections**: 200 records
- **SObject Tree**: 200 records total, 5 levels deep
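To make the subrequest budget concrete, here is a minimal sketch of a Composite call that bundles two DML operations into one request (payload shape per the Composite REST API; `instance_url` and `access_token` are assumed from the earlier examples):
```python
# Sketch: one Composite call creates an Account and a related
# Contact, replacing two separate API round trips
composite_body = {
    "allOrNone": True,
    "compositeRequest": [
        {
            "method": "POST",
            "url": "/services/data/v58.0/sobjects/Account",
            "referenceId": "newAccount",
            "body": {"Name": "Acme"}
        },
        {
            "method": "POST",
            "url": "/services/data/v58.0/sobjects/Contact",
            "referenceId": "newContact",
            # Reference the Account created by the first subrequest
            "body": {"LastName": "Smith", "AccountId": "@{newAccount.id}"}
        }
    ]
}

response = requests.post(
    f'{instance_url}/services/data/v58.0/composite',
    headers={'Authorization': f'Bearer {access_token}'},
    json=composite_body
)
```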
## Platform Events Limits
### Publishing Limits
| Limit | Value |
| ---------------- | --------------------------------- |
| Events per Batch | 1,000 |
| Event Size | 1 MB |
| Publishing Rate | 250,000 events/hour (High Volume) |
| Standard Volume | 10,000 events/hour |
### Subscription Limits
- **CometD Clients**: 1,000 per org
- **Subscription Channels**: 100 per client
- **Event Retention**: 24 hours (standard), 72 hours (high volume)
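Each subscriber counts against the concurrent-client cap from its Bayeux handshake onward. A minimal handshake sketch, assuming the standard Streaming API endpoint path and the OAuth variables from earlier examples:
```python
# Sketch: Bayeux handshake against the Streaming API endpoint;
# each handshaken client counts toward the concurrent-client cap
handshake = requests.post(
    f'{instance_url}/cometd/58.0',
    headers={'Authorization': f'Bearer {access_token}',
             'Content-Type': 'application/json'},
    json=[{
        'channel': '/meta/handshake',
        'version': '1.0',
        'supportedConnectionTypes': ['long-polling']
    }]
)
client_id = handshake.json()[0]['clientId']
```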
## Monitoring API Usage
### Using REST API
```python
# Get organization limits
limits_response = requests.get(
    f'{instance_url}/services/data/v58.0/limits',
    headers={'Authorization': f'Bearer {access_token}'}
)

limits = limits_response.json()
for limit_name, limit_info in limits.items():
    print(f"{limit_name}: {limit_info['Remaining']}/{limit_info['Max']}")
```
### Using SOAP API
The SOAP API reports current usage in the `LimitInfoHeader` returned with every response:
```xml
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:urn="urn:enterprise.soap.sforce.com">
  <soapenv:Header>
    <urn:LimitInfoHeader>
      <urn:limitInfo>
        <urn:current>1998</urn:current>
        <urn:limit>15000</urn:limit>
        <urn:type>API REQUESTS</urn:type>
      </urn:limitInfo>
    </urn:LimitInfoHeader>
  </soapenv:Header>
  <soapenv:Body>
    <!-- query/create/update response payload -->
  </soapenv:Body>
</soapenv:Envelope>
```
## Best Practices for Managing Limits
### 1. **Implement Efficient Querying**
```python
# Bad: one API call per record
for account_id in account_ids:
    account = sf.Account.get(account_id)  # API call for each record

# Good: a single query (build the IN list explicitly; Python's
# tuple repr emits a trailing comma for one-element lists,
# which is invalid SOQL)
id_list = ",".join(f"'{i}'" for i in account_ids)
accounts = sf.query(
    f"SELECT Id, Name FROM Account WHERE Id IN ({id_list})"
)
```
### 2. **Use Bulk Operations**
```python
# Bad: individual updates
for record in records:
    sf.Account.update(record['Id'], record)  # API call for each

# Good: bulk update -- one Bulk API job instead of N REST calls
sf.bulk.Account.update(records)
```
### 3. **Implement Caching**
```python
import redis
import json

cache = redis.Redis()

def get_account_with_cache(account_id):
    # Check cache first
    cached = cache.get(f"account:{account_id}")
    if cached:
        return json.loads(cached)

    # Fetch from Salesforce
    account = sf.Account.get(account_id)

    # Cache for 1 hour
    cache.setex(f"account:{account_id}", 3600, json.dumps(account))
    return account
```
### 4. **Implement Rate Limiting**
```python
from time import sleep
from datetime import datetime, timedelta

class RateLimiter:
    def __init__(self, max_calls, time_window):
        self.max_calls = max_calls
        self.time_window = time_window
        self.calls = []

    def wait_if_needed(self):
        now = datetime.now()
        # Drop calls that have aged out of the window
        self.calls = [c for c in self.calls if now - c < self.time_window]

        if len(self.calls) >= self.max_calls:
            sleep_time = (self.calls[0] + self.time_window - now).total_seconds()
            if sleep_time > 0:
                sleep(sleep_time)

        # Record this call (re-read the clock in case we slept)
        self.calls.append(datetime.now())

# Usage
limiter = RateLimiter(100, timedelta(minutes=1))  # 100 calls per minute

for item in items:
    limiter.wait_if_needed()
    process_item(item)
```
## Handling Limit Exceptions
### Error Responses
The REST API returns errors as a JSON array:
```json
[
  {
    "message": "TotalRequests Limit exceeded.",
    "errorCode": "REQUEST_LIMIT_EXCEEDED",
    "fields": []
  }
]
```
### Retry Strategy
```python
import time

def call_with_retry(func, *args, max_retries=3, **kwargs):
    for attempt in range(max_retries):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            if 'REQUEST_LIMIT_EXCEEDED' in str(e):
                if attempt < max_retries - 1:
                    # Exponential backoff: 1, 2, 4 minutes
                    wait_time = (2 ** attempt) * 60
                    time.sleep(wait_time)
                else:
                    raise
            else:
                raise
```
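For example, wrapping a SOQL query with the simple_salesforce-style client used throughout (a hypothetical usage sketch):
```python
# Retry a SOQL query if the daily request limit is hit
result = call_with_retry(sf.query, "SELECT Id, Name FROM Account LIMIT 10")
```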
## Optimization Strategies
### 1. **Query Optimization**
- Use selective queries with indexed fields
- Limit fields returned with SELECT
- Use relationship queries to reduce API calls (see the sketch after this list)
- Implement query result caching
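As a sketch, a parent-to-child subquery fetches related records in one call (the `sf.query` client and standard Account/Contact fields are as in earlier examples):
```python
# One API call returns each Account together with its Contacts,
# instead of 1 + N separate queries
result = sf.query(
    "SELECT Id, Name, (SELECT Id, LastName, Email FROM Contacts) "
    "FROM Account WHERE Industry = 'Technology' LIMIT 100"
)

for account in result['records']:
    # The Contacts subquery result is None when an Account has no children
    contacts = (account.get('Contacts') or {}).get('records', [])
    print(account['Name'], len(contacts))
```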
### 2. **Batch Processing**
- Group operations into batches (see the chunking sketch after this list)
- Use Bulk API for large data volumes
- Implement asynchronous processing
- Schedule heavy operations during off-peak hours
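A minimal chunking sketch, using the 200-record sObject Collections ceiling listed earlier (the `update_batch` helper name is illustrative):
```python
# Sketch: split a large record list into 200-record chunks,
# the sObject Collections maximum noted above
def chunked(records, size=200):
    for i in range(0, len(records), size):
        yield records[i:i + size]

for batch in chunked(all_records):
    update_batch(batch)  # e.g. a collections or composite call per chunk
```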
### 3. **Connection Pooling**
```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(
    total=3,
    read=3,
    connect=3,
    backoff_factor=0.3
)
adapter = HTTPAdapter(max_retries=retry, pool_connections=10, pool_maxsize=10)
session.mount('http://', adapter)
session.mount('https://', adapter)
```
## Special Considerations
### High-Volume Platform Events
- Request from Salesforce Support
- 1 million events/hour publishing rate
- 3-day retention period
- Additional costs apply
### API Limit Increases
- Available for purchase
- Contact Salesforce Account Executive
- Temporary increases for migrations
- Requires business justification
### Reserved Capacity
- Guarantees API availability
- Separate from standard limits
- Enterprise agreement required
- Custom pricing
## Monitoring Tools
### Native Salesforce
- Setup → System Overview
- Event Monitoring
- API Usage Notifications
### Third-Party Tools
- New Relic Salesforce Integration
- Splunk Salesforce Add-on
- DataDog Salesforce Integration
- Custom monitoring solutions, e.g. polling the `/limits` endpoint (sketch below)
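A custom monitor can be a simple poll of the `/limits` endpoint shown earlier (the `DailyApiRequests` key is part of that response; the 80% threshold here is an arbitrary example):
```python
# Sketch: warn when daily API consumption crosses a threshold
def check_api_headroom(threshold=0.8):
    resp = requests.get(
        f'{instance_url}/services/data/v58.0/limits',
        headers={'Authorization': f'Bearer {access_token}'}
    )
    info = resp.json()['DailyApiRequests']
    used = info['Max'] - info['Remaining']
    if used / info['Max'] >= threshold:
        print(f"WARNING: {used}/{info['Max']} daily API calls consumed")
```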
## Common Limit Scenarios
### Scenario 1: Daily Import
```python
import math

# Calculate API calls needed
records_to_import = 50000
batch_size = 200
api_calls_needed = math.ceil(records_to_import / batch_size)  # 250 calls

# Compare against the remaining daily limit (see the /limits
# endpoint above) and fall back to the Bulk API if necessary
if api_calls_needed > remaining_daily_limit:
    use_bulk_api()  # switch to Bulk API
```
### Scenario 2: Real-time Sync
```python
# Use Platform Events for real-time sync
def publish_change_event(record):
    event = {
        'Record_Id__c': record['Id'],
        'Changed_Fields__c': json.dumps(record),
        'Timestamp__c': datetime.now().isoformat()
    }
    sf.Custom_Change_Event__e.create(event)
```
## Additional Resources
- [Salesforce API Limits Documentation](https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_api.htm)
- [API Limits Quick Reference](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_limits.htm)
- [Monitoring Your API Usage](https://help.salesforce.com/s/articleView?id=sf.monitoring_api_usage.htm)
- [Best Practices for API Integration](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/best_practices.htm)