llmcore

mcp
Security Audit
Passed
Health Passed
  • License — MIT
  • Description — Repository has a description
  • Active repo — Last push today
  • Community trust — 11 GitHub stars
Code Passed
  • Code scan — Scanned 1 file during light audit; no dangerous patterns found
Permissions Passed
  • Permissions — No dangerous permissions requested
Purpose
This Qt/C++ library provides developers with a unified API for integrating various LLM providers into their applications. It implements the Model Context Protocol (MCP) to allow applications to act as an MCP server or client, handling streaming chat, tool execution, and resource management.

Security Assessment
The library acts as a framework to build AI integrations, meaning its security heavily depends on how developers configure it. The light audit found no hardcoded secrets or dangerous patterns in the scanned files. However, by design, it handles sensitive data like API keys and makes network requests to LLM providers. Additionally, its MCP client capabilities can explicitly execute shell commands (such as running `npx`) and load configurations from JSON, which could introduce vulnerabilities if the application processes untrusted input. Overall risk is rated as Medium, primarily due to the inherent risks of executing external commands and managing API keys, which require secure implementation by the user.

Quality Assessment
The project is in active development, with its last push occurring today. It uses the permissive MIT license, making it highly accessible for most projects. Community trust is currently minimal but positive, reflected by 11 GitHub stars. The repository features a clear description, comprehensive documentation, and automated testing workflows. The main limitation is that only 1 file was scanned during the automated audit, so a manual code review is recommended.

Verdict
Use with caution — the library itself appears well-structured and safe, but developers must carefully secure their API keys and strictly avoid passing untrusted JSON to the command execution functions.
SUMMARY

Qt C++ library for working with AI/LLM Providers and MCP

README.md

LLMCore


Qt/C++ library for working with LLM providers and MCP servers. Streaming chat, tool calling, and a full MCP 2025-11-25 client/server — all in one library.

LLM clients — unified streaming API across six providers:

auto *client = new LLMCore::ClaudeClient(url, apiKey, model, this);
client->ask("What is Qt?", cb);

MCP server — expose tools, resources and prompts over stdio or HTTP:

// stdio (stdin/stdout, e.g. for Claude Desktop)
auto *transport = new LLMCore::McpStdioServerTransport(&app);

// or Streamable HTTP
auto *transport = new LLMCore::McpHttpServerTransport({.port = 8080, .path = "/mcp"}, &app);

auto *server = new LLMCore::McpServer(transport, cfg, &app);
server->addTool(new MyTool(server));
server->start();

MCP client — connect to MCP servers and bind their tools into LLM clients:

// Add servers one by one
client->tools()->addMcpServer({.name = "filesystem", .command = "npx",
    .arguments = {"-y", "@modelcontextprotocol/server-filesystem", "/home/user"}});

// Or load from a JSON config
client->tools()->loadMcpServers(QJsonDocument::fromJson(configData).object());

loadMcpServers accepts:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
    }
  }
}

See Quick Start for complete examples.

Supported Providers

Provider — Client class:

  • Anthropic Claude — ClaudeClient
  • OpenAI (Chat Completions) — OpenAIClient
  • OpenAI (Responses API) — OpenAIResponsesClient
  • Ollama — OllamaClient
  • Google AI — GoogleAIClient
  • llama.cpp — LlamaCppClient

Per-provider streaming, tool, and thinking support is detailed in the repository's provider table.

MCP (Model Context Protocol)

Client and server implementation of the MCP 2025-11-25 spec:

  • Transports: stdio, Streamable HTTP
  • Server: tools, resources, resource templates, prompts, completions, sampling, elicitation
  • Client: tools, resources, prompts, completions, sampling, elicitation, roots

See MCP Protocol Coverage for the full spec-conformance matrix.

Requirements

  • C++20
  • Qt 6.5+
  • CMake 3.21+
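Given those requirements, a consuming project could wire the library in with CMake. A hypothetical sketch, assuming a local checkout and a `LLMCore` link target (the target name and required Qt components are assumptions; check the repository's CMakeLists for the actual names):

```cmake
cmake_minimum_required(VERSION 3.21)
project(my_app LANGUAGES CXX)

set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

find_package(Qt6 6.5 REQUIRED COMPONENTS Core Network)
add_subdirectory(llmcore)   # path to a checkout of the library (assumed layout)

add_executable(my_app main.cpp)
target_link_libraries(my_app PRIVATE Qt6::Core Qt6::Network LLMCore)  # target name assumed
```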

Documentation

Support

  • Report Issues: open an issue on GitHub
  • Contribute: pull requests with bug fixes or new features are welcome
  • Spread the Word: star the repository and share with fellow developers
  • Financial Support:
    • Bitcoin (BTC): bc1qndq7f0mpnlya48vk7kugvyqj5w89xrg4wzg68t
    • Ethereum (ETH): 0xA5e8c37c94b24e25F9f1f292a01AF55F03099D8D
    • Litecoin (LTC): ltc1qlrxnk30s2pcjchzx4qrxvdjt5gzuervy5mv0vy
    • USDT (TRC20): THdZrE7d6epW6ry98GA3MLXRjha1DjKtUx

License

MIT — see LICENSE.
