inference-server
Libraries and a server for building AI applications, with adapters to various native bindings for local inference. Integrate it with your application, or run it as a microservice.
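
The description covers two deployment modes: embedding the server as a library inside your own application, or running it as a separate microservice that clients reach over HTTP. The sketch below illustrates both under stated assumptions; the `InferenceServer` class, its options, the `node-llama-cpp` adapter id, and the endpoint path are hypothetical placeholders, since the package's actual API is not shown on this page.

```ts
// A minimal sketch of the two usage modes described above. The module name
// matches the package, but the class, options, and HTTP route are assumptions
// for illustration, not the documented API.
import { InferenceServer } from 'inference-server'

async function main() {
  // Embedded ("library") mode: host the server inside your application and
  // let an adapter to a native binding (assumed name) run the model locally.
  const server = new InferenceServer({
    models: {
      'chat-model': {
        engine: 'node-llama-cpp',               // assumed adapter id
        file: '/models/chat-model.Q4_K_M.gguf', // assumed local model path
      },
    },
  })
  await server.start({ port: 3000 })

  // Microservice mode: once the server is running, any HTTP client can call
  // it. The endpoint path and payload shape are likewise assumptions.
  const res = await fetch('http://localhost:3000/v1/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'chat-model', prompt: 'Hello from the client' }),
  })
  console.log(await res.json())

  await server.stop()
}

main().catch(console.error)
```

Under these assumptions, the request shape stays the same whether the server is embedded or deployed standalone, so an application could switch between the two modes without changing its calling code.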