inference-server
Libraries and a server for building AI applications, with adapters to various native bindings for local inference. Integrate it into your application, or run it as a standalone microservice.
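To make the two integration modes concrete, here is a minimal TypeScript sketch. Everything in it, including the `LocalInference` class, its `complete` and `serve` methods, and the engine/file options, is a hypothetical stand-in for illustration, not this package's documented API; check the actual exports before relying on any of these names.

```ts
// Hypothetical sketch: these names are illustrative, not the real API
// of the inference-server package.
import { createServer, type Server } from 'node:http'

type ModelConfig = { engine: string; file: string }

class LocalInference {
  constructor(private models: Record<string, ModelConfig>) {}

  async complete(model: string, prompt: string): Promise<string> {
    const cfg = this.models[model]
    if (!cfg) throw new Error(`unknown model: ${model}`)
    // A real adapter would call into a native binding for cfg.engine here;
    // this stub just echoes so the sketch stays self-contained.
    return `[${cfg.engine}:${cfg.file}] ${prompt}`
  }

  serve(port: number): Server {
    return createServer(async (req, res) => {
      // Read the JSON request body and answer with a completion.
      const chunks: Buffer[] = []
      for await (const chunk of req) chunks.push(chunk as Buffer)
      const { model, prompt } = JSON.parse(Buffer.concat(chunks).toString())
      res.setHeader('content-type', 'application/json')
      res.end(JSON.stringify({ text: await this.complete(model, prompt) }))
    }).listen(port)
  }
}

const inference = new LocalInference({
  chat: { engine: 'llama.cpp', file: 'models/chat.gguf' },
})

// "Integrate it into your application": call the library in-process...
inference.complete('chat', 'Hello').then(console.log)

// ...or "use as a microservice": expose the same instance over HTTP.
inference.serve(3000)
```

The point is only the shape: the same inference object can be called directly in-process or fronted by a thin HTTP layer when deployed as its own service.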