inference-server
Libraries and a server for building AI applications, with adapters to various native bindings for local inference. Integrate it into your application, or run it as a microservice.
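As one way to picture the microservice use case, the sketch below sends a completion request over HTTP to a locally running server. The port, endpoint path, model name, and request/response shapes are assumptions for illustration only, not this package's documented API; consult the project documentation for the real interface.

```ts
// Minimal sketch: calling a locally running inference server as a microservice.
// The endpoint, port, and payload shapes below are hypothetical placeholders.

interface CompletionRequest {
  model: string
  prompt: string
}

interface CompletionResponse {
  text: string
}

async function complete(prompt: string): Promise<string> {
  // Hypothetical local endpoint; adjust host, port, and path to your deployment.
  const res = await fetch('http://localhost:3000/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'my-local-model', prompt } satisfies CompletionRequest),
  })
  if (!res.ok) throw new Error(`Inference request failed: ${res.status}`)
  const data = (await res.json()) as CompletionResponse
  return data.text
}

complete('Hello, world!').then(console.log).catch(console.error)
```

Running the server as a separate process like this keeps model loading and native bindings out of your application's address space; embedding it as a library instead avoids the HTTP hop when both run in the same Node process.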