Replies: 1 comment
- This is also a requirement for OpenAI Codex.
-
5ire, the most widely recommended open-source MCP client, requires both streaming and tool use.
However, llama-server does not allow this combination; the check is here:
llama.cpp/examples/server/utils.hpp
Line 565 in f17a3bb
It would be great to be able to use this tool with llama.cpp directly.
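For context, here is a minimal sketch of the kind of OpenAI-compatible chat request an MCP client like 5ire sends, combining `stream: true` with a `tools` array; per the check referenced above, llama-server currently rejects this combination rather than streaming the response. The host, port, model name, and tool definition below are assumptions for illustration only.

```python
# Sketch of an OpenAI-compatible request that combines streaming with tool use,
# the combination the linked check in utils.hpp disallows.
# Host, port, model name, and tool schema are placeholders, not llama.cpp defaults.
import requests

payload = {
    "model": "local-model",  # placeholder; llama-server serves whichever model it loaded
    "stream": True,          # 5ire expects streamed (SSE) responses
    "messages": [
        {"role": "user", "content": "What is the weather in Berlin?"}
    ],
    "tools": [               # tool definition in the OpenAI function-calling format
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# Either streamed SSE chunks come back, or the server returns an error
# if it refuses the stream + tools combination.
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json=payload,
    stream=True,
    timeout=60,
)
if resp.status_code != 200:
    print("server rejected request:", resp.text)
else:
    for line in resp.iter_lines():
        if line:
            print(line.decode("utf-8"))
```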