inference-server
Libraries and a server for building AI applications. Adapters to various native bindings enable local inference. Integrate it into your application, or run it as a microservice.
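A minimal sketch of the embedded (in-process) usage described above. The entry point, option names, adapter identifier, and method names below are illustrative assumptions, not the package's documented API; consult the package documentation for the actual interface.

```typescript
// Hypothetical sketch: `createInferenceServer`, the `engine` value, and the
// `chatCompletion` call are assumed names for illustration only.
import { createInferenceServer } from 'inference-server'

async function main() {
  // Load a local model through one of the native-binding adapters
  // (e.g. a llama.cpp-based binding; adapter name is an assumption).
  const server = await createInferenceServer({
    models: {
      'local-chat-model': {
        engine: 'node-llama-cpp',
        file: './models/model.gguf',
      },
    },
  })

  // Run inference in-process, without a separate service.
  const result = await server.chatCompletion({
    model: 'local-chat-model',
    messages: [{ role: 'user', content: 'Hello!' }],
  })
  console.log(result.message.content)

  await server.stop()
}

main()
```

For the microservice use case, the same library would instead expose an HTTP endpoint that clients call over the network rather than in-process; the exact server setup and routes depend on the package's actual API.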