The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.
Harness LLMs with Multi-Agent Programming
Local Deep Research is an AI-powered assistant that transforms complex questions into comprehensive, cited reports by conducting iterative analysis using any LLM across diverse knowledge sources including academic databases, scientific repositories, web content, and private document collections.
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
A simple "Be My Eyes" web app with a llama.cpp/llava backend
Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐
Code with AI in VS Code; bring your own AI.
Your fully proficient, AI-powered, local chatbot assistant 🤖
A Python package for developing AI applications with local LLMs.
LLM story writer focused on high-quality long-form output based on a user-provided prompt.
OpenAI-style, fast and lightweight local language model inference with documents
Custom TTS component for Home Assistant. Utilizes the OpenAI speech engine or any compatible endpoint to deliver high-quality speech. Optionally offers chime and audio normalization features.
Run Open Source/Open Weight LLMs locally with OpenAI compatible APIs
Run local LLMs from Hugging Face in React Native or Expo using onnxruntime.
Recipes for on-device voice AI and local LLM
Python library for instructing LLMs and reliably validating their structured (JSON) outputs using Ollama and Pydantic, enabling deterministic work with LLMs.
PalmHill.BlazorChat is a chat application and API built with Blazor WebAssembly, SignalR, and WebAPI, featuring real-time LLM conversations, markdown support, customizable settings, and a responsive design. This project supports Llama2 models and was tested with Orca2.