llm-info
Information on LLM models: context window token limits, output token limits, pricing, and more.
| Feature | Claude 3.7 Sonnet | Claude 3.5 Sonnet | Claude 3.5 Haiku | Claude 3 Opus | Claude 3 Haiku |
| --- | --- | --- | --- | --- | --- |
| Description | Our most intelligent model | Our previous most intelligent model | Our fastest model | Powerful model for complex tasks | Fastest and most compact model for near-instant responsiveness |
| Strengths | Highest level of intelligence and capability with toggleable extended thinking | High level of intelligence and capability | Intelligence at blazing speeds | Top-level intelligence, fluency, and understanding | Quick and accurate targeted performance |
| Multilingual | Yes | Yes | Yes | Yes | Yes |
| Vision | Yes | Yes | Yes | Yes | Yes |
| Extended thinking | Yes | No | No | No | No |
| API model name | claude-3-7-sonnet-20250219 | Upgraded version: claude-3-5-sonnet-20241022<br>Previous version: claude-3-5-sonnet-20240620 | claude-3-5-haiku-20241022 | claude-3-opus-20240229 | claude-3-haiku-20240307 |
| Comparative latency | Fast | Fast | Fastest | Moderately fast | Fastest |
| Context window | 200K tokens | 200K tokens | 200K tokens | 200K tokens | 200K tokens |
| Max output | 64,000 tokens | 8,192 tokens | 8,192 tokens | 4,096 tokens | 4,096 tokens |
| Training data cut-off | Nov 2024 | Apr 2024 | Jul 2024 | Aug 2023 | Aug 2023 |
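
The table above maps each model to its API model name, context window, and output token limit. As a minimal sketch of how these figures could be encoded and used programmatically (the `ModelInfo` interface and `capMaxTokens` helper below are illustrative assumptions, not this package's actual API):

```typescript
// Illustrative record type for the model data tabulated above.
interface ModelInfo {
  apiName: string;          // API model name
  contextWindow: number;    // context window, in tokens
  maxOutput: number;        // maximum output, in tokens
  extendedThinking: boolean; // supports extended thinking
}

const CLAUDE_MODELS: ModelInfo[] = [
  { apiName: 'claude-3-7-sonnet-20250219', contextWindow: 200_000, maxOutput: 64_000, extendedThinking: true },
  { apiName: 'claude-3-5-sonnet-20241022', contextWindow: 200_000, maxOutput: 8_192, extendedThinking: false },
  { apiName: 'claude-3-5-haiku-20241022', contextWindow: 200_000, maxOutput: 8_192, extendedThinking: false },
  { apiName: 'claude-3-opus-20240229', contextWindow: 200_000, maxOutput: 4_096, extendedThinking: false },
  { apiName: 'claude-3-haiku-20240307', contextWindow: 200_000, maxOutput: 4_096, extendedThinking: false },
];

// Example use: clamp a requested max_tokens value to the model's documented output limit.
function capMaxTokens(apiName: string, requested: number): number {
  const model = CLAUDE_MODELS.find((m) => m.apiName === apiName);
  if (!model) throw new Error(`Unknown model: ${apiName}`);
  return Math.min(requested, model.maxOutput);
}
```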