Feature Request: Ollama support #55
Comments
Hello @tysonchamp, it's on our roadmap. I'll let you know when we add Ollama support. Thanks.
+1
+1
Anything done on this yet? This is a REALLY important, high-priority requirement for business use.
Hi folks! We’re exploring Ollama support for Gurubase. Could you share your specific use case? For example, is it for offline usage, privacy concerns, or to try other open-source models as the base LLM? Your input may help us prioritize this feature request.
For me it is mostly a question of privacy: supporting Ollama or vLLM. I wouldn't be looking to self-host a RAG system otherwise; it feels like a core tenet of the offering as a whole. Neither needs to be fully integrated, but if the now-standard OpenAI API were supported and I had the option to set the API URL and perhaps the model, I would already be happy as a starting point.
+1
In my case, the company I work for has internal process documents that they do not allow to be sent to OpenAI for privacy reasons. So we run Ollama on a local server.
@fatihbaltaci Go to src/gurubase-backend/backend/core/models.py L1178 and modify to:
Source: https://ollama.com/blog/openai-compatibility. DONE!
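For context, the blog post linked above describes Ollama's OpenAI-compatible HTTP endpoint. The sketch below shows what such a request looks like using only the Python standard library; it is not Gurubase code, and the model name, host, and port are assumptions (Ollama's documented defaults):

```python
import json
from urllib.request import Request

# Assumption: Ollama is running locally on its default port (11434) and a
# model such as "llama3" has been pulled. Per Ollama's OpenAI-compatibility
# blog post, chat requests go to the /v1/chat/completions path.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(messages, model="llama3", base_url=OLLAMA_BASE_URL):
    """Build (but do not send) an OpenAI-style chat completion request."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: the request shape any OpenAI-compatible client would send.
req = build_chat_request([{"role": "user", "content": "Hello"}])
# To actually send it (with Ollama running): urllib.request.urlopen(req)
```

Because the request format matches OpenAI's, any client that lets you override the base URL can be pointed at this endpoint instead.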
@joaopalma5 we're already aware of the OpenAI-compatible endpoint Ollama exposes. We use multiple models behind the scenes (OpenAI for the base LLM, gte-large for embeddings, bge-reranker for reranking, Gemini for summarization, etc.), so plugging Ollama in isn't just a one-line change. We're working on proper Ollama support to ensure the self-hosted experience matches the performance of Gurubase.io. We'll share updates as soon as it's ready.
Please add Ollama support.