# AI Platform Converter
[npm package](https://www.npmjs.com/package/ai-platform-converter) · [License: MIT](https://opensource.org/licenses/MIT)
**Lossless API parameter conversion between multiple AI platforms.**
Convert API requests and responses seamlessly between different AI platforms including OpenAI, Anthropic (Claude), Google Gemini, DeepSeek, Wenwen, Vertex AI, Huawei, and BigModel (Zhipu).
## Features
- **Lossless Conversion** - all parameters are preserved during conversion; no data loss
- **Bidirectional** - convert both requests and responses
- **Type-Safe** - full TypeScript support with comprehensive type definitions
- **8 Platforms** - support for major AI platforms
- **Zero Dependencies** - lightweight and fast
- **Well Tested** - comprehensive test coverage
## Supported Platforms
### OpenAI-Compatible Platforms
- **OpenAI** - GPT-4, GPT-3.5, etc.
- **DeepSeek** - DeepSeek Chat
- **Wenwen (问问)** - Tencent's AI platform
- **Vertex AI** - Google Cloud's AI platform
- **Huawei** - Huawei Pangu
- **BigModel (Zhipu)** - GLM models
### Special Platforms
- **Anthropic** - Claude models (unique API structure)
- **Gemini** - Google Gemini (unique API structure)
## Installation
```bash
npm install ai-platform-converter
```
```bash
yarn add ai-platform-converter
```
```bash
pnpm add ai-platform-converter
```
## Quick Start
```typescript
import { convertRequest, convertResponse, Platform } from 'ai-platform-converter';
// Convert OpenAI request to Anthropic format
const openaiRequest = {
  model: 'gpt-4',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ],
  temperature: 0.7,
  max_tokens: 100
};

const anthropicRequest = convertRequest(
  Platform.OpenAI,
  Platform.Anthropic,
  openaiRequest
);

// Convert Anthropic response back to OpenAI format
const anthropicResponse = {
  id: 'msg_123',
  type: 'message',
  role: 'assistant',
  content: [{ type: 'text', text: 'Hello! How can I help you?' }],
  model: 'claude-3-opus-20240229',
  stop_reason: 'end_turn',
  usage: { input_tokens: 10, output_tokens: 20 }
};

const openaiResponse = convertResponse(
  Platform.Anthropic,
  Platform.OpenAI,
  anthropicResponse
);
```
## API Reference
### `convertRequest(from, to, params, options?)`
Convert request parameters from one platform to another.
**Parameters:**
- `from: Platform` - Source platform
- `to: Platform` - Target platform
- `params: any` - Request parameters
- `options?: ConvertOptions` - Conversion options
**Returns:** Converted request parameters
**Example:**
```typescript
const result = convertRequest(
  Platform.OpenAI,
  Platform.Gemini,
  openaiRequest,
  { preserveExtensions: true }
);
```
### `convertResponse(from, to, response, options?)`
Convert response from one platform to another.
**Parameters:**
- `from: Platform` - Source platform
- `to: Platform` - Target platform
- `response: any` - Response data
- `options?: ConvertOptions` - Conversion options
**Returns:** Converted response
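**Example** (a minimal sketch reusing the `anthropicResponse` object from the Quick Start):
```typescript
// Map an Anthropic message response onto OpenAI's chat-completion shape,
// keeping Anthropic-only fields via preserveExtensions.
const openaiStyleResponse = convertResponse(
  Platform.Anthropic,
  Platform.OpenAI,
  anthropicResponse,
  { preserveExtensions: true }
);
```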
### `Platform` Enum
```typescript
enum Platform {
  OpenAI = 'openai',
  DeepSeek = 'deepseek',
  Wenwen = 'wenwen',
  VertexAI = 'vertex-ai',
  Huawei = 'huawei',
  BigModel = 'bigmodel',
  Anthropic = 'anthropic',
  Gemini = 'gemini'
}
```
### `ConvertOptions`
```typescript
interface ConvertOptions {
  strict?: boolean;              // Strict mode - throw error on unknown parameters
  preserveExtensions?: boolean;  // Preserve platform-specific extension fields
  debug?: boolean;               // Enable debug logging
}
```
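A brief sketch of how these options might be combined. It assumes that in strict mode an unrecognized parameter makes the converter throw the `ConversionError` described under Error Handling below:

```typescript
// Strict conversion: unknown parameters raise an error instead of being
// dropped or tucked away in _extensions (assumed strict-mode behavior).
try {
  const converted = convertRequest(
    Platform.OpenAI,
    Platform.Anthropic,
    openaiRequest,
    { strict: true, debug: true }
  );
  console.log(converted);
} catch (err) {
  console.error('Strict conversion failed:', (err as Error).message);
}
```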
## Usage Examples
### Basic Chat Conversion
```typescript
import { convertRequest, Platform } from 'ai-platform-converter';
// OpenAI to Anthropic
const openaiRequest = {
  model: 'gpt-4',
  messages: [
    { role: 'system', content: 'You are helpful.' },
    { role: 'user', content: 'What is AI?' }
  ],
  temperature: 0.7,
  max_tokens: 500
};

const anthropicRequest = convertRequest(
  Platform.OpenAI,
  Platform.Anthropic,
  openaiRequest
);

console.log(anthropicRequest);
// {
//   model: 'gpt-4',
//   system: 'You are helpful.',
//   messages: [{ role: 'user', content: 'What is AI?' }],
//   temperature: 0.7,
//   max_tokens: 500
// }
```
### Function/Tool Calling
```typescript
// OpenAI request with tools
const openaiRequest = {
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'What is the weather in Tokyo?' }],
  max_tokens: 100,
  tools: [{
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get current weather',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'City name' }
        },
        required: ['location']
      }
    }
  }],
  tool_choice: 'auto'
};

// Convert to Anthropic
const anthropicRequest = convertRequest(
  Platform.OpenAI,
  Platform.Anthropic,
  openaiRequest
);

// Convert to Gemini
const geminiRequest = convertRequest(
  Platform.OpenAI,
  Platform.Gemini,
  openaiRequest
);
```
### Multimodal (Vision) Support
```typescript
const openaiRequest = {
  model: 'gpt-4-vision-preview',
  messages: [{
    role: 'user',
    content: [
      { type: 'text', text: 'What is in this image?' },
      {
        type: 'image_url',
        image_url: {
          url: 'data:image/jpeg;base64,/9j/4AAQSkZJRg...'
        }
      }
    ]
  }],
  max_tokens: 300
};

// Convert to Anthropic (supports vision)
const anthropicRequest = convertRequest(
  Platform.OpenAI,
  Platform.Anthropic,
  openaiRequest
);

// Convert to Gemini (supports vision)
const geminiRequest = convertRequest(
  Platform.OpenAI,
  Platform.Gemini,
  openaiRequest
);
```
### Preserving Platform-Specific Parameters
```typescript
const anthropicRequest = {
  model: 'claude-3-opus-20240229',
  messages: [{ role: 'user', content: 'Hello!' }],
  max_tokens: 100,
  top_k: 50,                        // Anthropic-specific parameter
  metadata: { user_id: 'user123' }  // Anthropic-specific
};

// Convert with extension preservation
const openaiRequest = convertRequest(
  Platform.Anthropic,
  Platform.OpenAI,
  anthropicRequest,
  { preserveExtensions: true }
);

// Platform-specific params are preserved in _extensions
console.log(openaiRequest._extensions.anthropic.top_k); // 50

// Convert back - original params are restored
const backToAnthropic = convertRequest(
  Platform.OpenAI,
  Platform.Anthropic,
  openaiRequest,
  { preserveExtensions: true }
);

console.log(backToAnthropic.top_k); // 50
```
### Batch Conversion
```typescript
import { convertRequestBatch, convertResponseBatch } from 'ai-platform-converter';
const requests = [request1, request2, request3];

const converted = convertRequestBatch(
  Platform.OpenAI,
  Platform.Anthropic,
  requests
);
```
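`convertResponseBatch` is imported above as well; assuming it mirrors the request-batch signature, converting a list of responses looks the same:

```typescript
// Convert a batch of Anthropic responses back to OpenAI format
// (assumes convertResponseBatch takes (from, to, responses) like convertRequestBatch).
const responses = [response1, response2, response3];
const convertedResponses = convertResponseBatch(
  Platform.Anthropic,
  Platform.OpenAI,
  responses
);
```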
## Architecture
The converter uses **OpenAI format as the intermediate representation**:
```
Platform A → OpenAI Format → Platform B
```
This approach requires only `N × 2` converters instead of `N × (N - 1)`: with 8 supported platforms, that is 16 converters rather than 56.
### Conversion Flow
1. **Source → OpenAI**: Convert from the source platform to OpenAI format
2. **OpenAI → Target**: Convert from OpenAI format to the target platform (single-call example below)
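Because every pair of platforms is bridged through the same intermediate format, callers still make a single call even when neither side is OpenAI. A minimal sketch of an Anthropic-to-Gemini conversion:

```typescript
// Anthropic → Gemini: routed through the OpenAI intermediate format internally,
// but exposed to callers as a single convertRequest call.
const anthropicRequest = {
  model: 'claude-3-opus-20240229',
  messages: [{ role: 'user', content: 'Hello!' }],
  max_tokens: 100
};

const geminiRequest = convertRequest(
  Platform.Anthropic,
  Platform.Gemini,
  anthropicRequest
);
```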
### Extension Preservation
Platform-specific parameters that don't have equivalents in other platforms are preserved in the `_extensions` field:
```typescript
{
  ...standardParams,
  _extensions: {
    platform: 'anthropic',
    originalParams: {...},  // Full original request (if preserveExtensions: true)
    anthropic: {
      top_k: 50,
      metadata: {...}
    }
  }
}
```
## Type Definitions
The package includes comprehensive TypeScript type definitions for all platforms:
```typescript
import type {
  OpenAIRequest,
  OpenAIResponse,
  AnthropicRequest,
  AnthropicResponse,
  GeminiRequest,
  GeminiResponse,
  // ... and more
} from 'ai-platform-converter';
```
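These types can annotate inputs and results. A small sketch, assuming `OpenAIRequest` accepts the standard chat-completion fields used throughout this README:

```typescript
// Typed request in, typed result out (cast because convertRequest accepts `any`).
const typedRequest: OpenAIRequest = {
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }],
  max_tokens: 50
};

const typedResult = convertRequest(
  Platform.OpenAI,
  Platform.Anthropic,
  typedRequest
) as AnthropicRequest;
```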
## Error Handling
```typescript
import { ConversionError } from 'ai-platform-converter';
try {
  const result = convertRequest(Platform.OpenAI, Platform.Anthropic, request);
} catch (error) {
  if (error instanceof ConversionError) {
    console.error(`Conversion failed: ${error.message}`);
    console.error(`From: ${error.fromPlatform}, To: ${error.toPlatform}`);
  }
}
```
## Testing
```bash
npm test
```
```bash
npm run test:watch
```
## Building
```bash
npm run build
```
## License
MIT
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## Support
If you encounter any issues or have questions, please file an issue on GitHub.