inference-server
Version:
Libraries and a server for building AI applications. Adapters for various native bindings enable local inference. Integrate it with your application, or run it as a microservice.
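As a rough illustration of the microservice mode mentioned above, the sketch below calls a locally running instance over HTTP using Node's built-in `fetch`. The port, route, and request/response shapes are assumptions for illustration only and are not taken from this package's documentation.

```typescript
// Hypothetical client for a locally running inference-server instance.
// The base URL, route, and payload shape are illustrative assumptions,
// not the package's documented API.
const BASE_URL = "http://localhost:3000";

interface CompletionRequest {
  model: string;       // which locally loaded model to use (assumed field)
  prompt: string;      // text to complete (assumed field)
  maxTokens?: number;  // generation limit (assumed field)
}

interface CompletionResponse {
  text: string;        // generated continuation (assumed field)
}

async function complete(req: CompletionRequest): Promise<CompletionResponse> {
  // POST the request as JSON and parse the JSON reply.
  const res = await fetch(`${BASE_URL}/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) {
    throw new Error(`Inference request failed: ${res.status} ${res.statusText}`);
  }
  return (await res.json()) as CompletionResponse;
}

// Example usage: ask a locally served model for a short completion.
complete({ model: "local-model", prompt: "Hello,", maxTokens: 32 })
  .then((r) => console.log(r.text))
  .catch((e) => console.error(e));
```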
| Filename | Content Type | Size |
|---|---|---|
| | | 29.3 kB |
| | | 65.2 kB |
| | | 33.9 kB |
| | | 119 kB |
| | | 799 B |
| | | 530 B |
| | | 614 B |