nosia
Health Pass
- License — MIT
- Description — Repository has a description
- Active repo — Last push today
- Community trust — 205 GitHub stars
Code Pass
- Code scan — Scanned 12 files during light audit, no dangerous patterns found
Permissions Pass
- Permissions — No dangerous permissions requested
This tool provides a self-hosted platform for Retrieval-Augmented Generation (RAG) and Model Context Protocol (MCP) integration. It allows developers to run AI models privately on their own infrastructure using an OpenAI-compatible API, enabling AI models to interact with external tools and private documents.
Security Assessment
Overall Risk: Medium. The platform processes and stores your private documents and data locally. While the light code scan of 12 files found no dangerous patterns, hardcoded secrets, or excessive permissions, the default installation method relies on piping a remote shell script directly to the system (`curl | sh`). This requires the host machine to execute arbitrary shell commands, which can be a significant security risk if not reviewed first. Additionally, the tool heavily relies on Docker containers, introducing standard containerization attack vectors. Network requests are made to facilitate AI model APIs, external tool connectivity, and server-sent events for streaming.
Quality Assessment
The project appears to be actively maintained, with repository activity as recent as today. It has earned a solid 205 GitHub stars, indicating a healthy level of community trust and adoption. The code is licensed under the standard MIT license, and the repository features comprehensive documentation covering architecture, deployment, and user guides.
Verdict
Use with caution—while the code itself appears clean and active, the shell script installation method and inherent Docker attack surface mean you should audit the setup script before deploying it in sensitive environments.
Nosia
Self-hosted AI RAG + MCP Platform
Nosia is a platform that allows you to run AI models on your own data with complete privacy and control. Beyond traditional RAG capabilities, Nosia integrates the Model Context Protocol (MCP) to connect AI models with external tools, services, and data sources. It is designed to be easy to install and use, providing OpenAI-compatible APIs that work seamlessly with existing AI applications.
Features
- 🔒 Private & Secure - Your data stays on your infrastructure
- 🤖 OpenAI-Compatible API - Drop-in replacement for OpenAI clients
- 📚 RAG-Powered - Augment AI responses with your documents
- 🔌 MCP Integration - Connect AI to external tools and services via Model Context Protocol
- 🔄 Real-time Streaming - Server-sent events for live responses
- 📄 Multi-format Support - PDFs, text files, websites, and Q&A pairs
- 🎯 Semantic Search - Vector similarity search with pgvector
- 🐳 Easy Deployment - Docker Compose with one-command setup
- 🔑 Multi-tenancy - Account-based isolation for secure data separation
Preview
Install
curl -fsSL https://get.nosia.ai | sh
Start and First Run
docker compose up -d
Then open https://nosia.localhost in your browser.
Quick Links
- 📖 Nosia Guides - Step-by-step tutorials
- 🏗️ Architecture Documentation - Technical deep dive
- 💬 Community Support - Get help
Documentation
- 📐 Architecture - Detailed system design and implementation
- 📊 System Diagrams - Visual representations of system components
- 🚀 Deployment Guide - Production deployment strategies and best practices
- 📋 Documentation Index - Complete documentation overview
- 🤝 Code of Conduct - Community guidelines
Table of Contents
- Quickstart
- Configuration
- Using Nosia
- Managing Your Installation
- Troubleshooting
- Contributing
- License
Quickstart
One Command Installation
Get Nosia up and running in minutes on macOS, Debian, or Ubuntu.
Prerequisites
- macOS, Debian, or Ubuntu operating system
- Internet connection
- sudo/root access (for Docker installation if needed)
Installation
The installation script will:
- Install Docker and Docker Compose if not already present
- Download Nosia configuration files
- Generate a secure .env file
- Pull all required Docker images
curl -fsSL https://get.nosia.ai | sh
You should see the following output:
Setting up prerequisites
Setting up Nosia
Generating .env file
Pulling latest Nosia
[+] Pulling 6/6
✔ llm Pulled
✔ embedding Pulled
✔ web Pulled
✔ reverse-proxy Pulled
✔ postgres-db Pulled
✔ solidq Pulled
Starting Nosia
Start all services with:
docker compose up
# OR run in the background
docker compose up -d
Accessing Nosia
Once started, access Nosia at:
- Web Interface: https://nosia.localhost
- API Endpoint: https://nosia.localhost/v1
Note: The default installation uses a self-signed SSL certificate. Your browser will show a security warning on first access. For production deployments, see the Deployment Guide for proper SSL certificate configuration.
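The same warning applies to scripts: a Python client will reject the self-signed certificate unless told otherwise. As a sketch for local testing only (the https://nosia.localhost/up health URL is the default install's endpoint), the standard library can build an unverified TLS context; never disable verification against a production deployment:

```python
import ssl
import urllib.request

# Build a TLS context that accepts the default self-signed certificate.
# Local testing only; production should use a real certificate and the
# default (verifying) context.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# Example health probe (uncomment on a machine running Nosia):
# with urllib.request.urlopen("https://nosia.localhost/up", context=ctx) as r:
#     print(r.status)
```

Once a proper certificate is configured per the Deployment Guide, drop the custom context and let verification run normally.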
Custom Installation
Default Models
By default, Nosia uses:
- Completion model: ai/granite-4.0-h-tiny
- Embeddings model: ai/granite-embedding-multilingual
Using a Custom Completion Model
You can use any completion model available on Docker Hub AI by setting the LLM_MODEL environment variable during installation.
Example with Granite 4.0 32B:
curl -fsSL https://get.nosia.ai | LLM_MODEL=ai/granite-4.0-h-small sh
Model options:
- ai/granite-4.0-h-micro - 3B long-context instruct model by IBM
- ai/granite-4.0-h-tiny - 7B long-context instruct model by IBM (default)
- ai/granite-4.0-h-small - 32B long-context instruct model by IBM
- ai/mistral - Efficient open model (7B) with top-tier performance and fast inference by Mistral AI
- ai/magistral-small-3.2 - 24B multimodal instruction model by Mistral AI
- ai/devstral-small - Agentic coding LLM (24B) fine-tuned from Mistral-Small 3.1 by Mistral AI
- ai/llama3.3 - Meta's Llama 3.3 model
- ai/gemma3 - Google's Gemma 3 model
- ai/qwen3 - Alibaba's Qwen 3 model
- ai/deepseek-r1-distill-llama - DeepSeek's distilled Llama model
- Browse more at Docker Hub AI
Using a Custom Embeddings Model
By default, Nosia uses ai/granite-embedding-multilingual for generating document embeddings.
To change the embeddings model:
1. Update the environment variables in your .env file:
EMBEDDING_MODEL=your-preferred-embedding-model
EMBEDDING_DIMENSIONS=768 # Adjust based on your model's output dimensions
2. Restart Nosia to apply changes:
docker compose down
docker compose up -d
3. Update existing embeddings (if you have documents already indexed):
docker compose run web bin/rails embeddings:change_dimensions
Important: Different embedding models produce vectors of different dimensions. Ensure EMBEDDING_DIMENSIONS matches your model's output size, or vector search will fail.
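To see why the two must agree, a defensive check like the hypothetical helper below (not part of Nosia) fails loudly before a mismatched vector ever reaches the database:

```python
import os

def check_embedding_dimensions(vector):
    """Raise early if a returned embedding doesn't match the configured size."""
    expected = int(os.environ.get("EMBEDDING_DIMENSIONS", "768"))
    if len(vector) != expected:
        raise ValueError(
            f"Embedding has {len(vector)} dimensions but EMBEDDING_DIMENSIONS={expected}; "
            "re-run embeddings:change_dimensions after changing models."
        )
    return True

# A correctly sized dummy vector passes:
os.environ["EMBEDDING_DIMENSIONS"] = "768"
print(check_embedding_dimensions([0.0] * 768))  # prints True
```

A vector of the wrong length (say, 384 values from a smaller model) raises immediately, which is far easier to debug than silent pgvector query failures later.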
Advanced Installation
With Docling Document Processing
Docling provides enhanced document processing capabilities for complex PDFs and documents.
To enable Docling:
Start Nosia with the Docling serve compose file:
# For NVIDIA GPUs docker compose -f docker-compose-docling-serve-nvidia.yml up -d # OR for AMD GPUs docker compose -f docker-compose-docling-serve-amd.yml up -d # OR for CPU only docker compose -f docker-compose-docling-serve-cpu.yml up -dConfigure the Docling URL in your
.envfile:DOCLING_SERVE_BASE_URL=http://localhost:5001
This starts a Docling serve instance on port 5001 that Nosia will use for advanced document parsing.
With Augmented Context (RAG)
Enable Retrieval Augmented Generation to enhance AI responses with relevant context from your documents.
To enable RAG:
Add to your .env file:
AUGMENTED_CONTEXT=true
When enabled, Nosia will:
- Search your document knowledge base for relevant chunks
- Include the most relevant context in the AI prompt
- Generate responses grounded in your specific data
Additional RAG configuration:
RETRIEVAL_FETCH_K=3 # Number of document chunks to retrieve
LLM_TEMPERATURE=0.1 # Lower temperature for more factual responses
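The retrieval step those settings control can be sketched in plain Python. This is an illustrative top-k similarity search over toy 3-dimensional vectors with invented sample text, not Nosia's actual pgvector query; RETRIEVAL_FETCH_K corresponds to the k parameter:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k_chunks(query_vec, chunks, k=3):
    """Rank stored chunks by similarity to the query embedding, keep the best k."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c["embedding"]), reverse=True)
    return ranked[:k]

# Toy corpus; real embeddings would come from the configured embedding model.
chunks = [
    {"text": "Invoices are due in 30 days.", "embedding": [0.9, 0.1, 0.0]},
    {"text": "The office closes at 6pm.",    "embedding": [0.0, 0.8, 0.2]},
    {"text": "Payment terms are net-30.",    "embedding": [0.8, 0.2, 0.1]},
]
query = [1.0, 0.0, 0.0]  # pretend this embeds "when are invoices due?"

# The retrieved chunks are prepended to the prompt so the model answers
# from your data rather than from memory.
context = "\n".join(c["text"] for c in top_k_chunks(query, chunks, k=2))
prompt = f"Answer using this context:\n{context}\n\nQuestion: when are invoices due?"
```

Raising RETRIEVAL_FETCH_K widens the context at the cost of a longer prompt; the low default LLM_TEMPERATURE keeps the model close to that retrieved text.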
Configuration
Environment Variables
Nosia validates required environment variables at startup to prevent runtime failures. If any required variables are missing or invalid, the application will fail to start with a clear error message.
Required Variables
| Variable | Description | Example |
|---|---|---|
| SECRET_KEY_BASE | Rails secret key for session encryption | Generate with bin/rails secret |
| AI_BASE_URL | Base URL for OpenAI-compatible API | http://model-runner.docker.internal/engines/llama.cpp/v1 |
| LLM_MODEL | Language model identifier | ai/mistral, ai/granite-4.0-h-tiny |
| EMBEDDING_MODEL | Embedding model identifier | ai/granite-embedding-multilingual |
| EMBEDDING_DIMENSIONS | Embedding vector dimensions | 768, 384, 1536 |
Optional Variables with Defaults
| Variable | Description | Default | Range/Options |
|---|---|---|---|
| AI_API_KEY | API key for the AI service | empty | Any string |
| LLM_TEMPERATURE | Model creativity (lower = more factual) | 0.1 | 0.0 - 2.0 |
| LLM_TOP_K | Top K sampling parameter | 40 | 1 - 100 |
| LLM_TOP_P | Top P (nucleus) sampling | 0.9 | 0.0 - 1.0 |
| RETRIEVAL_FETCH_K | Number of document chunks to retrieve for RAG | 3 | 1 - 10 |
| AUGMENTED_CONTEXT | Enable RAG for chat completions | false | true, false |
| DOCLING_SERVE_BASE_URL | Docling document processing service URL | empty | http://localhost:5001 |
See .env.example for a complete list of configuration options.
Setting Up Your Environment
For Docker Compose (Recommended)
The installation script automatically generates a .env file. To customize:
1. Edit the .env file in your installation directory:
nano .env
2. Update values as needed and restart:
docker compose down
docker compose up -d
For Manual/Development Setup
1. Copy the example environment file:
cp .env.example .env
2. Generate a secure secret key:
SECRET_KEY_BASE=$(bin/rails secret)
echo "SECRET_KEY_BASE=$SECRET_KEY_BASE" >> .env
3. Update other required values in .env:
AI_BASE_URL=http://your-ai-service:11434/v1
LLM_MODEL=ai/mistral
EMBEDDING_MODEL=ai/granite-embedding-multilingual
EMBEDDING_DIMENSIONS=768
4. Test your configuration:
bin/rails runner "puts 'Configuration valid!'"
If validation fails, you'll see a detailed error message indicating which variables are missing or invalid.
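Nosia's startup check lives in the Rails app itself, but the shape of a fail-fast validation is easy to sketch. The Python helper below is a hypothetical stand-in, not Nosia's code; it checks the same five required variables listed above:

```python
import os

REQUIRED = ["SECRET_KEY_BASE", "AI_BASE_URL", "LLM_MODEL",
            "EMBEDDING_MODEL", "EMBEDDING_DIMENSIONS"]

def validate_env(env=None):
    """Return silently if the configuration looks sane, else raise with details."""
    env = os.environ if env is None else env
    errors = [k for k in REQUIRED if not env.get(k)]
    dims = env.get("EMBEDDING_DIMENSIONS", "")
    if dims and not dims.isdigit():
        errors.append("EMBEDDING_DIMENSIONS must be an integer")
    if errors:
        raise RuntimeError("Invalid configuration: " + ", ".join(errors))

# A complete configuration passes without raising:
validate_env({
    "SECRET_KEY_BASE": "s3cret",
    "AI_BASE_URL": "http://your-ai-service:11434/v1",
    "LLM_MODEL": "ai/mistral",
    "EMBEDDING_MODEL": "ai/granite-embedding-multilingual",
    "EMBEDDING_DIMENSIONS": "768",
})
```

Failing at startup with a named list of missing variables is what turns a confusing runtime crash into a one-line fix in .env.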
Using Nosia
Web Interface
After starting Nosia, access the web interface at https://nosia.localhost:
- Create an account or log in
- Upload documents - PDFs, text files, or add website URLs
- Create Q&A pairs - Add domain-specific knowledge
- Start chatting - Ask questions about your documents
API Access
Nosia provides an OpenAI-compatible API that works with existing OpenAI client libraries.
Getting an API Token
- Log in to Nosia web interface
- Navigate to https://nosia.localhost/api_tokens
- Click "Generate Token" and copy your API key
- Store it securely - it won't be shown again
Using the API
Configure your OpenAI client to use Nosia:
Python Example:
from openai import OpenAI
client = OpenAI(
base_url="https://nosia.localhost/v1",
api_key="your-nosia-api-token"
)
response = client.chat.completions.create(
model="default", # Nosia uses your configured model
messages=[
{"role": "user", "content": "What is in my documents about AI?"}
],
stream=True
)
for chunk in response:
if chunk.choices[0].delta.content:
print(chunk.choices[0].delta.content, end="")
cURL Example:
curl https://nosia.localhost/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer your-nosia-api-token" \
-d '{
"model": "default",
"messages": [
{"role": "user", "content": "Summarize my documents"}
]
}'
Node.js Example:
import OpenAI from 'openai';
const client = new OpenAI({
baseURL: 'https://nosia.localhost/v1',
apiKey: 'your-nosia-api-token'
});
const response = await client.chat.completions.create({
model: 'default',
messages: [
{ role: 'user', content: 'What information do you have about my project?' }
]
});
console.log(response.choices[0].message.content);
For more API examples and details, see the API Guide.
MCP Integration
Nosia supports the Model Context Protocol (MCP), allowing AI models to interact with external tools, services, and data sources. MCP servers can provide tools, prompts, and resources that extend the AI's capabilities beyond text generation.
What is MCP?
The Model Context Protocol is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). MCP enables AI to:
- Execute Tools - Perform actions in external systems (calendars, file storage, databases)
- Access Resources - Read from various data sources in real-time
- Use Prompts - Leverage pre-configured prompt templates
- Extend Capabilities - Add custom functionality without modifying core code
Using MCP Servers
Navigate to MCP Settings in the web interface
Browse the MCP Catalog - Pre-configured servers for popular services:
- Productivity: Infomaniak Calendar, kDrive file storage
- Communication: kChat messaging
- And more - Extensible catalog of integrations
Activate an MCP Server:
- Click on a server from the catalog
- Provide required configuration (API keys, tokens)
- Test the connection
- Enable it for your chats
Add MCP to Chat:
- Open or create a chat session
- Select which MCP servers to use
- The AI can now use tools from connected servers
Example: Using Calendar MCP
from openai import OpenAI
client = OpenAI(
base_url="https://nosia.localhost/v1",
api_key="your-nosia-api-token"
)
# The AI can now use calendar tools if enabled in the chat
response = client.chat.completions.create(
model="default",
messages=[
{"role": "user", "content": "Schedule a meeting tomorrow at 2pm"}
]
)
print(response.choices[0].message.content)
When MCP servers are enabled, the AI can:
- Search your calendar for availability
- Create new events
- Access file storage
- Post messages to chat systems
- And execute any tools provided by connected MCP servers
Custom MCP Servers
Beyond the catalog, you can add custom MCP servers:
Navigate to MCP Settings → Custom Servers
Choose transport type:
- stdio - Local processes (NPX, Python scripts)
- SSE - Server-sent events over HTTP
- HTTP - Standard HTTP endpoints
Configure connection:
- Provide endpoint or command
- Add authentication credentials
- Test connection
Use in chats - Enable the custom server for your conversations
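Under the stdio transport, an MCP server is just a process exchanging JSON-RPC 2.0 messages over stdin/stdout. The sketch below shows only that message framing with a ping handler; a real server must also implement the protocol's initialize and tools/* methods (see the MCP specification), which are omitted here:

```python
import json

def handle(request: dict) -> dict:
    """Answer a single JSON-RPC 2.0 request (framing sketch only)."""
    if request.get("method") == "ping":
        return {"jsonrpc": "2.0", "id": request["id"], "result": {}}
    # JSON-RPC's standard "method not found" error code is -32601.
    return {"jsonrpc": "2.0", "id": request.get("id"),
            "error": {"code": -32601, "message": "Method not found"}}

# Round-trip one message the way a stdio transport would:
raw = '{"jsonrpc": "2.0", "id": 1, "method": "ping"}'
print(json.dumps(handle(json.loads(raw))))
```

A real stdio server would loop over sys.stdin, handling one message per line and flushing each response, which is exactly what Nosia launches when you configure a stdio-type custom server.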
For more details on MCP integration, see the MCP Documentation.
Managing Your Installation
Start
Start all Nosia services:
# Start in foreground (see logs in real-time)
docker compose up
# Start in background (detached mode)
docker compose up -d
Check that all services are running:
docker compose ps
Stop
Stop all running services:
# Stop services (keeps data)
docker compose down
# Stop and remove all data (⚠️ destructive)
docker compose down -v
Upgrade
Keep Nosia up to date with the latest features and security fixes:
# Pull latest images
docker compose pull
# Restart services with new images
docker compose up -d
# View logs to ensure successful upgrade
docker compose logs -f web
Upgrade checklist:
- Backup your data before upgrading (see Deployment Guide)
- Review release notes for breaking changes
- Pull latest images
- Restart services
- Verify functionality
Logs
View logs for troubleshooting:
# All services
docker compose logs -f
# Specific service
docker compose logs -f web
docker compose logs -f postgres-db
docker compose logs -f llm
# Last 100 lines
docker compose logs --tail=100 web
Health Check
Verify Nosia is running correctly:
# Check service status
docker compose ps
# Check web application health
curl -k https://nosia.localhost/up
# Check background jobs
docker compose exec web bin/rails runner "puts SolidQueue::Job.count"
Troubleshooting
Common Issues
Installation Problems
Docker not found:
# Verify Docker is installed
docker --version
# Install Docker if needed (Ubuntu/Debian)
curl -fsSL https://get.docker.com | sh
Permission denied:
# Add your user to docker group
sudo usermod -aG docker $USER
# Log out and back in, then try again
Runtime Issues
Services won't start:
# Check logs for errors
docker compose logs
# Verify .env file exists and has required variables
cat .env | grep -E 'SECRET_KEY_BASE|AI_BASE_URL|LLM_MODEL'
# Restart services
docker compose down && docker compose up -d
Slow AI responses:
- Check background jobs: https://nosia.localhost/jobs
- View job logs: docker compose logs -f solidq
- Ensure your hardware meets minimum requirements (see Deployment Guide)
Can't access web interface:
# Check if services are running
docker compose ps
# Verify reverse-proxy is healthy
docker compose logs reverse-proxy
# Test connectivity
curl -k https://nosia.localhost/up
Database connection errors:
# Check PostgreSQL is running
docker compose ps postgres-db
# View database logs
docker compose logs postgres-db
# Test database connection
docker compose exec web bin/rails runner "ActiveRecord::Base.connection.execute('SELECT 1')"
Document Processing Issues
Documents not processing:
- Check background jobs: https://nosia.localhost/jobs
- View processing logs: docker compose logs -f web
- Verify embedding service is running: docker compose ps embedding
Embedding errors:
# Verify EMBEDDING_DIMENSIONS matches your model
docker compose exec web bin/rails runner "puts ENV['EMBEDDING_DIMENSIONS']"
# Rebuild embeddings if dimensions changed
docker compose run web bin/rails embeddings:change_dimensions
Log Locations
| Issue Type | Log Location | Command |
|---|---|---|
| Installation | ./log/production.log | tail -f log/production.log |
| Runtime errors | Docker logs | docker compose logs -f web |
| Background jobs | Jobs dashboard | Visit https://nosia.localhost/jobs |
| Database | PostgreSQL logs | docker compose logs postgres-db |
| AI model | LLM container logs | docker compose logs llm |
Getting Help
If you need further assistance:
Check Documentation:
- Architecture Guide - Understand how Nosia works
- Deployment Guide - Advanced configuration
Search Existing Issues:
- GitHub Issues
- Someone may have encountered the same problem
Open a New Issue:
- Include your Nosia version: docker compose images | grep web
- Describe the problem with steps to reproduce
- Include relevant logs (remove sensitive information)
- Specify your OS and Docker version
Community Support:
- GitHub Discussions
- Share your use case and get advice from the community
Contributing
We welcome contributions! Here's how you can help:
- Report bugs - Open an issue with details and reproduction steps
- Suggest features - Share your ideas in GitHub Discussions
- Improve documentation - Submit PRs for clarity and accuracy
- Write code - Fix bugs or implement new features
- Share your experience - Write blog posts or tutorials
See CONTRIBUTING.md if available, or start by opening an issue to discuss your ideas.
License
Nosia is open source software. See LICENSE for details.
Additional Resources
- Website: nosia.ai
- Documentation: guides.nosia.ai
- Source Code: github.com/nosia-ai/nosia
- Docker Hub: hub.docker.com/u/ai
Built with ❤️ by the Nosia community