Integrate FastAPI for model serving #12

Closed
davidmezzetti opened this issue Sep 10, 2020 · 0 comments

davidmezzetti commented Sep 10, 2020

Similar to neuml/txtai#12 - allow serving codequestion models via FastAPI. This will help speed up calls from the command line (#4) and open up the possibility of remote service calls.
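
A minimal sketch of what this could look like, assuming a hypothetical `Search` wrapper around the codequestion model (the class name, its `search()` signature, and the `/search` route are placeholders, not codequestion's actual API):

```python
from fastapi import FastAPI

app = FastAPI()


class Search:
    """Hypothetical wrapper that loads the codequestion model once at startup."""

    def search(self, query, limit=10):
        # Placeholder: a real implementation would query the embeddings index.
        return [{"id": 0, "score": 1.0, "text": query}][:limit]


# Model is loaded once and kept in memory for the lifetime of the process.
search = Search()


@app.get("/search")
def search_endpoint(query: str, limit: int = 10):
    # Serving from a long-running process avoids reloading the model on each
    # CLI invocation, which is the speed-up this issue is after.
    return search.search(query, limit)


# Run with: uvicorn app:app --host 0.0.0.0 --port 8000
```

The CLI (#4) could then issue HTTP requests against this endpoint instead of loading the model itself, and the same endpoint would be reachable from remote clients.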
