inference-server
Libraries and a server for building AI applications, with adapters to various native bindings for local inference. Integrate it into your application, or run it as a microservice.
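As an illustration of the microservice usage mode, the sketch below sends a completion request to a locally running instance over plain HTTP. The port, route, and request/response shapes are assumptions for illustration only, not this package's documented API.

```typescript
// Minimal sketch of calling a locally running inference server as a microservice.
// The port, route, and payload shape below are illustrative assumptions,
// not the documented API of this package.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

async function complete(messages: ChatMessage[]): Promise<string> {
  // Assumed: the server listens locally and accepts a JSON chat request.
  const res = await fetch('http://localhost:3000/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'my-local-model', messages }),
  })
  if (!res.ok) throw new Error(`Inference request failed: ${res.status}`)
  // Assumed response shape: { text: string }
  const data = (await res.json()) as { text: string }
  return data.text
}

complete([{ role: 'user', content: 'Hello!' }]).then(console.log)
```

The same workload could instead be handled in-process by importing the library directly; the microservice form simply moves inference behind a local HTTP boundary.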