inference-server
Libraries and a server for building AI applications. Adapters to various native bindings allow local inference. Integrate it with your application, or run it as a microservice.
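
As a rough illustration of those two integration modes (in-process library vs. standalone microservice), here is a minimal sketch. All names below — `startModelServer`, the `engine` adapter string, `listen`, and the `chat` method — are hypothetical placeholders, not this package's actual API.

```ts
// Hypothetical usage sketch: the imported function and option names are
// illustrative assumptions, not the real inference-server interface.
import { startModelServer } from 'inference-server'

async function main() {
  // Embedded mode: load a local model through a native binding adapter.
  const server = await startModelServer({
    models: {
      'chat-model': {
        engine: 'node-llama-cpp',     // assumed adapter name
        file: './models/model.gguf',  // path to a local model file
      },
    },
    // Microservice mode: additionally expose an HTTP endpoint so other
    // services can call the same models over the network.
    listen: { port: 3000 },
  })

  // In-process inference call (hypothetical method).
  const reply = await server.chat('chat-model', {
    messages: [{ role: 'user', content: 'Hello!' }],
  })
  console.log(reply.text)

  await server.stop()
}

main()
```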