mcp-sqlite-tools
Health: Pass
- License — MIT
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Community trust — 10 GitHub stars
Code: Fail
- rm -rf — Recursive force deletion command in package.json
- fs module — File system access in package.json
- exec() — Shell command execution in src/clients/query-executor.ts
- exec() — Shell command execution in src/clients/transaction-manager.ts
Permissions: Pass
- Permissions — No dangerous permissions requested
This MCP server enables LLMs to safely interact with local SQLite databases, providing comprehensive query execution, transaction management, and schema operations. It is designed to give AI assistants controlled access to database files.
Security Assessment
While the project claims to have built-in security features and query classification, static analysis flagged several significant concerns. The codebase contains shell command execution within its core query and transaction managers. Although this is typical for database client tools to handle local file operations, it represents a potential vulnerability if an LLM is manipulated into executing malicious payloads. Additionally, the project requests file system access and includes a recursive force deletion (`rm -rf`) command in its package configuration. No hardcoded secrets were found, no dangerous OAuth permissions are requested, and there are no indications of unauthorized network requests.
Overall Risk Rating: Medium
Quality Assessment
The project is actively maintained, with its most recent push occurring today. It uses the permissive MIT license and has a basic level of community validation with 10 GitHub stars. The code is written in TypeScript, which is standard for MCP servers, though the project remains relatively small and early in its adoption lifecycle.
Verdict
Use with caution — the inherent file system and shell execution risks require strict environment isolation and careful oversight of LLM instructions.
mcp-sqlite-tools
A Model Context Protocol (MCP) server that provides comprehensive
SQLite database operations for LLMs. This server enables AI assistants
to interact with local SQLite databases safely and efficiently, with
built-in security features, advanced transaction support, and clear
separation between read-only and destructive operations.
Features
🗄️ Database Management
- Open/Create Database: Open existing databases or create new ones
- Close Database: Properly close database connections
- List Databases: Discover database files in directories
- Database Info: Get comprehensive database metadata and
statistics
📊 Table Operations
- List Tables: View all tables and views in a database
- Describe Table: Get detailed schema information for tables
- Create Table: Create new tables with custom column definitions
- Drop Table: Remove tables (with safety warnings)
🔍 Query Operations
- Execute Read Query: Safe SELECT, PRAGMA, and EXPLAIN queries
- Execute Write Query: INSERT, UPDATE, DELETE operations
- Execute Schema Query: DDL operations (CREATE, ALTER, DROP)
- Bulk Insert: Efficient batch insertion of multiple records
💾 Transaction Management
- Begin Transaction: Start database transactions with savepoint support
- Commit Transaction: Commit changes with nested transaction handling
- Rollback Transaction: Safely rollback changes and nested savepoints
- Auto-cleanup: Automatic cleanup of stale transactions
📋 Schema Operations
- Export Schema: Export database schema to SQL or JSON format
- Import Schema: Import and execute schema from SQL or JSON
- Selective Export: Export specific tables or entire database
structure
🛠️ Database Maintenance
- Backup Database: Create database backups with timestamps
- Vacuum Database: Optimize database storage and performance
- Connection Pooling: Advanced connection management with health
monitoring
⚠️ Security Features
This server implements multiple layers of security:
- Query Classification: Automatic separation of read-only, write, schema, and transaction operations
- Path Validation: Prevents directory traversal attacks
- Configurable Path Restrictions: Control access to absolute paths
- Input Validation: Comprehensive parameter validation using Valibot
- Advanced Connection Pooling: Connection limits, health monitoring, and idle timeout
- Transaction Safety: Automatic stale transaction cleanup and nested savepoint support
- Resource Cleanup: Graceful cleanup on server shutdown with maintenance scheduling
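The query-classification layer described above can be pictured as a small first-keyword dispatcher. The sketch below is illustrative only — the function name and category labels are assumptions, and the server's real classifier is likely more thorough (e.g. handling leading comments and CTEs):

```typescript
// Illustrative sketch of first-keyword query classification.
// Not the server's actual implementation.
type QueryKind = "read" | "write" | "schema" | "transaction" | "unknown";

function classifyQuery(sql: string): QueryKind {
  // Dispatch on the first SQL keyword, ignoring leading whitespace.
  const firstWord = sql.trim().split(/\s+/)[0]?.toUpperCase() ?? "";
  if (["SELECT", "PRAGMA", "EXPLAIN"].includes(firstWord)) return "read";
  if (["INSERT", "UPDATE", "DELETE"].includes(firstWord)) return "write";
  if (["CREATE", "ALTER", "DROP"].includes(firstWord)) return "schema";
  if (["BEGIN", "COMMIT", "ROLLBACK", "SAVEPOINT", "RELEASE"].includes(firstWord)) {
    return "transaction";
  }
  return "unknown";
}
```

A classifier like this is what lets the server route a query to the matching tool category and reject, say, a DELETE submitted through execute_read_query.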
Tool Separation for Hook-Based Safety
The tools are intentionally separated into distinct categories to
enable fine-grained approval control in MCP clients like Claude Code:
✓ SAFE Tools (Read-only operations):
- execute_read_query - SELECT, PRAGMA, EXPLAIN queries
- list_tables, describe_table, database_info
- export_schema, backup_database
These tools can be auto-approved or approved once, allowing the AI to
freely explore your database structure and read data.
⚠️ DESTRUCTIVE Tools (Data modification):
- execute_write_query - INSERT, UPDATE, DELETE
- bulk_insert - Batch insertions
- drop_table - Permanent table deletion
These tools should require individual approval for each operation,
giving you visibility into what data will be modified before it
happens.
⚠️ SCHEMA CHANGE Tools (Structure modification):
- execute_schema_query - CREATE, ALTER, DROP statements
- create_table - Table creation
- import_schema - Schema import
These tools modify database structure and should require individual
approval to prevent unintended schema changes.
🔒 TRANSACTION Tools:
begin_transaction, commit_transaction, rollback_transaction
Can be configured based on your workflow needs.
Example Claude Code Hook Configuration:
// In your Claude Code hooks
export function toolApproval(tool) {
// Auto-approve safe read operations
if (
tool.name.includes('read') ||
tool.name.includes('list') ||
tool.name.includes('describe') ||
tool.name.includes('export') ||
tool.name.includes('backup') ||
tool.name.includes('info')
) {
return 'auto-approve';
}
// Require approval for destructive operations
if (
tool.name.includes('write') ||
tool.name.includes('delete') ||
tool.name.includes('drop') ||
tool.name.includes('insert') ||
tool.name.includes('schema')
) {
return 'require-approval';
}
return 'require-approval'; // Safe default: require explicit approval
}
This separation ensures you maintain control over destructive
operations while allowing the AI to work efficiently with read-only
queries.
Installation
From npm (when published)
npm install -g mcp-sqlite-tools
From source
git clone <repository-url>
cd mcp-sqlite-tools
pnpm install
pnpm run build
Configuration
Environment Variables
The server can be configured using environment variables:
# Default directory for SQLite databases (relative to project root)
SQLITE_DEFAULT_PATH=.
# Allow absolute paths for database files (security setting)
SQLITE_ALLOW_ABSOLUTE_PATHS=true
# Maximum query execution time in milliseconds
SQLITE_MAX_QUERY_TIME=30000
# Default backup directory for database backups
SQLITE_BACKUP_PATH=./backups
# Enable debug logging
DEBUG=false
MCP Client Configuration
Option 1: Global User Configuration (Recommended)
Configure once in your VS Code user settings to work across all
workspaces. Add this to your global mcp.json file
(~/.config/Code/User/mcp.json on Linux/macOS,
%APPDATA%\Code\User\mcp.json on Windows):
{
"servers": {
"sqlite-tools": {
"command": "npx",
"args": ["-y", "mcp-sqlite-tools"]
}
}
}
For WSL users, use this format in your global config:
{
"servers": {
"sqlite-tools": {
"command": "wsl.exe",
"args": ["bash", "-c", "npx -y mcp-sqlite-tools"]
}
}
}
Benefits:
- ✅ One configuration works everywhere - no per-project setup needed
- 📁 Automatically uses current workspace - databases created in whatever project you have open
- 🔄 Always up to date - uses latest published version via npx
Option 2: Workspace-Specific Configuration
For teams that want to share database configuration via version
control, create a .vscode/mcp.json file in your workspace:
{
"servers": {
"sqlite-tools": {
"command": "npx",
"args": ["-y", "mcp-sqlite-tools"],
"env": {
"SQLITE_DEFAULT_PATH": "${workspaceFolder}/databases",
"SQLITE_ALLOW_ABSOLUTE_PATHS": "true",
"SQLITE_BACKUP_PATH": "${workspaceFolder}/backups"
}
}
}
}
Benefits:
- Team sharing - configuration committed to version control
- 📂 Organized structure - databases in a dedicated /databases folder
- Project isolation - each project has its own database configuration
Claude Desktop / Cline Configuration
Add this to your MCP client configuration:
{
"mcpServers": {
"mcp-sqlite-tools": {
"command": "npx",
"args": ["-y", "mcp-sqlite-tools"],
"env": {
"SQLITE_DEFAULT_PATH": ".",
"SQLITE_ALLOW_ABSOLUTE_PATHS": "true",
"SQLITE_MAX_QUERY_TIME": "30000",
"SQLITE_BACKUP_PATH": "./backups"
}
}
}
}
Environment Variables
The following environment variables can be used to configure the MCP
server:
| Variable | Description | Default | Example |
|---|---|---|---|
| SQLITE_DEFAULT_PATH | Default directory for database files | . | ${workspaceFolder}/databases |
| SQLITE_ALLOW_ABSOLUTE_PATHS | Allow absolute paths in database operations | true | false |
| SQLITE_BACKUP_PATH | Default directory for database backups | Same as SQLITE_DEFAULT_PATH | ./backups |
| SQLITE_MAX_QUERY_TIME | Maximum query execution time (ms) | 30000 | 60000 |
Path Resolution:
- Relative paths are resolved from the default path
- Use ${workspaceFolder} in VS Code for workspace-relative paths
- Set SQLITE_ALLOW_ABSOLUTE_PATHS=true to enable absolute path operations
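The resolution rules above can be sketched roughly as follows. This is a hypothetical illustration assuming POSIX-style paths — resolveDatabasePath and its config object are invented names, not the server's API:

```typescript
import * as path from "node:path";

// Hypothetical sketch of the path-resolution rules described above.
interface PathConfig {
  defaultPath: string;         // SQLITE_DEFAULT_PATH
  allowAbsolutePaths: boolean; // SQLITE_ALLOW_ABSOLUTE_PATHS
}

function resolveDatabasePath(requested: string, config: PathConfig): string {
  if (path.isAbsolute(requested)) {
    if (!config.allowAbsolutePaths) {
      throw new Error(`Absolute paths are disabled: ${requested}`);
    }
    return path.normalize(requested);
  }
  const base = path.resolve(config.defaultPath);
  const resolved = path.resolve(base, requested);
  // Directory-traversal guard: reject anything that escapes the base directory.
  if (resolved !== base && !resolved.startsWith(base + path.sep)) {
    throw new Error(`Path escapes the default directory: ${requested}`);
  }
  return resolved;
}
```

Under this scheme a request like "../../etc/passwd" resolves outside the base directory and is rejected, which is the directory-traversal protection the Security Features section refers to.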
Development Configuration
For development with the MCP inspector:
pnpm run build
pnpm run dev
API Reference
Database Management Tools
open_database
Opens or creates a SQLite database file.
Parameters:
- path (string, required): Path to the database file
- create (boolean, optional): Create if it doesn't exist (default: true)
Example:
{
"path": "my-app.db",
"create": true
}
close_database
Closes a database connection.
Parameters:
- database (string, optional): Database path to close
list_databases
Lists available database files in a directory.
Parameters:
- directory (string, optional): Directory to search
database_info
Gets comprehensive information about a database.
Parameters:
- database (string, optional): Database path
Table Operations
list_tables
Lists all tables and views in a database.
Parameters:
- database (string, optional): Database path
describe_table
Gets schema information for a table.
Parameters:
- table (string, required): Table name
- database (string, optional): Database path
- verbosity (string, optional): 'summary' or 'detailed' (default: 'detailed')
Example Request:
{
"table": "users",
"verbosity": "detailed"
}
Example Response:
{
"database": "/tmp/demo.db",
"table": "users",
"columns": [
{
"name": "id",
"type": "INTEGER",
"nullable": true,
"default_value": null,
"primary_key": true
},
{
"name": "name",
"type": "TEXT",
"nullable": false,
"default_value": null,
"primary_key": false
},
{
"name": "email",
"type": "TEXT",
"nullable": true,
"default_value": null,
"primary_key": false
},
{
"name": "created_at",
"type": "TIMESTAMP",
"nullable": true,
"default_value": "CURRENT_TIMESTAMP",
"primary_key": false
}
],
"verbosity": "detailed",
"column_count": 4
}
create_table
Creates a new table with specified columns.
Parameters:
- name (string, required): Table name
- columns (array, required): Column definitions
- database (string, optional): Database path
Column Definition:
{
"name": "column_name",
"type": "TEXT|INTEGER|REAL|BLOB",
"nullable": true,
"primary_key": false,
"default_value": null
}
Example:
{
"name": "users",
"columns": [
{
"name": "id",
"type": "INTEGER",
"primary_key": true,
"nullable": false
},
{
"name": "name",
"type": "TEXT",
"nullable": false
},
{
"name": "email",
"type": "TEXT",
"nullable": true
}
]
}
drop_table
Permanently deletes a table and all its data.
Parameters:
- table (string, required): Table name to delete
- database (string, optional): Database path
Query Operations
execute_read_query
Executes read-only SQL queries (SELECT, PRAGMA, EXPLAIN).
Parameters:
- query (string, required): SQL query
- params (object, optional): Query parameters
- database (string, optional): Database path
- limit (number, optional): Maximum rows to return (default: 10000)
- offset (number, optional): Number of rows to skip (default: 0)
- verbosity (string, optional): 'summary' or 'detailed' (default: 'detailed')
Example Request:
{
"query": "SELECT * FROM users ORDER BY id",
"verbosity": "detailed"
}
Example Response:
{
"database": "/tmp/demo.db",
"query": "SELECT * FROM users ORDER BY id LIMIT 10000",
"result": {
"rows": [
{
"id": 1,
"name": "Alice Johnson",
"email": "[email protected]",
"created_at": "2025-10-03 09:42:04"
},
{
"id": 3,
"name": "Carol White",
"email": "[email protected]",
"created_at": "2025-10-03 09:42:10"
}
],
"changes": 0,
"lastInsertRowid": 0
},
"row_count": 2,
"pagination": {
"limit": 10000,
"offset": 0,
"returned_count": 2,
"has_more": false
},
"verbosity": "detailed"
}
execute_write_query
Executes SQL that modifies data (INSERT, UPDATE, DELETE).
Parameters:
- query (string, required): SQL query
- params (object, optional): Query parameters
- database (string, optional): Database path
Example Request:
{
"query": "INSERT INTO users (name, email) VALUES ('Alice Smith', '[email protected]')"
}
Example Response:
{
"database": "/tmp/demo.db",
"query": "INSERT INTO users (name, email) VALUES ('Alice Smith', '[email protected]')",
"result": {
"rows": [],
"changes": 1,
"lastInsertRowid": 1
},
"message": "⚠️ DESTRUCTIVE OPERATION COMPLETED: Data modified in database '/tmp/demo.db'. Rows affected: 1"
}
execute_schema_query
Executes DDL queries (CREATE, ALTER, DROP).
Parameters:
- query (string, required): DDL SQL query
- params (object, optional): Query parameters
- database (string, optional): Database path
Example Request:
{
"query": "CREATE TABLE users (\n id INTEGER PRIMARY KEY AUTOINCREMENT,\n name TEXT NOT NULL,\n email TEXT UNIQUE,\n created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n)"
}
Example Response:
{
"database": "/tmp/demo.db",
"query": "CREATE TABLE users (\n id INTEGER PRIMARY KEY AUTOINCREMENT,\n name TEXT NOT NULL,\n email TEXT UNIQUE,\n created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n)",
"result": {
"rows": [],
"changes": 0,
"lastInsertRowid": 0
},
"message": "⚠️ SCHEMA CHANGE COMPLETED: Database structure modified in '/tmp/demo.db'. Changes: 0"
}
bulk_insert
Insert multiple records in batches.
Parameters:
- table (string, required): Target table name
- data (array, required): Array of objects to insert
- batch_size (number, optional): Records per batch (default: 1000)
- database (string, optional): Database path
Example Request:
{
"table": "users",
"data": [
{ "name": "David Lee", "email": "[email protected]" },
{ "name": "Emma Davis", "email": "[email protected]" },
{ "name": "Frank Miller", "email": "[email protected]" }
]
}
Example Response:
{
"success": true,
"database": "/tmp/demo.db",
"table": "users",
"inserted": 3,
"batches": 1,
"total_time": 0,
"message": "⚠️ DESTRUCTIVE OPERATION COMPLETED: 3 records inserted into table 'users' in database '/tmp/demo.db'"
}
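Internally, a bulk insert like the one above amounts to slicing the data array into batch_size chunks and inserting each chunk. A minimal sketch of that slicing — the helper name is illustrative, not part of the server's internals:

```typescript
// Illustrative batch-slicing helper, sketching the batching behavior
// described for bulk_insert; not the server's actual code.
function toBatches<T>(rows: T[], batchSize = 1000): T[][] {
  if (batchSize < 1) throw new Error("batchSize must be >= 1");
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    // slice() is end-exclusive and clamps at the array length,
    // so the final batch may be shorter than batchSize.
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```

With the three-record request above and the default batch_size of 1000, this yields a single batch, matching the "batches": 1 in the example response.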
Transaction Management
begin_transaction
Start a database transaction with optional savepoint support.
Parameters:
- database (string, optional): Database path
Returns: Transaction ID for tracking
commit_transaction
Commit the current transaction or release a savepoint.
Parameters:
- database (string, optional): Database path
rollback_transaction
Rollback the current transaction or revert to a savepoint.
Parameters:
- database (string, optional): Database path
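One way to picture the nested savepoint handling these three tools describe: the first begin opens a real transaction, deeper begins create savepoints, and each commit or rollback releases or reverts the innermost level. The class below is a hypothetical sketch of that bookkeeping (the SQL strings follow standard SQLite syntax; the class and savepoint naming are invented, not the server's transaction-manager code):

```typescript
// Hypothetical sketch of nested-transaction bookkeeping with savepoints.
class TransactionTracker {
  private depth = 0;

  begin(): string {
    // First call opens a real transaction; deeper calls create savepoints.
    const stmt = this.depth === 0 ? "BEGIN" : `SAVEPOINT sp_${this.depth}`;
    this.depth++;
    return stmt;
  }

  commit(): string {
    if (this.depth === 0) throw new Error("No active transaction");
    this.depth--;
    return this.depth === 0 ? "COMMIT" : `RELEASE SAVEPOINT sp_${this.depth}`;
  }

  rollback(): string {
    if (this.depth === 0) throw new Error("No active transaction");
    this.depth--;
    return this.depth === 0 ? "ROLLBACK" : `ROLLBACK TO SAVEPOINT sp_${this.depth}`;
  }
}
```

This is why a rollback inside a nested begin only reverts the inner savepoint, leaving the outer transaction open until its own commit or rollback.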
Schema Operations
export_schema
Export database schema to SQL or JSON format.
Parameters:
- database (string, optional): Database path
- format (string, optional): Output format - "sql" or "json" (default: "sql")
- tables (array, optional): Specific tables to export
Example:
{
"format": "json",
"tables": ["users", "orders"]
}
import_schema
Import and execute schema from SQL or JSON.
Parameters:
- database (string, optional): Database path
- schema (string, required): Schema content to import
- format (string, optional): Input format - "sql" or "json" (default: "sql")
Database Maintenance
backup_database
Creates a backup copy of a database.
Parameters:
- source_database (string, optional): Source database path
- backup_path (string, optional): Backup file path (auto-generated if not provided)
vacuum_database
Optimizes database storage by reclaiming unused space.
Parameters:
- database (string, optional): Database path
Safety Guidelines
Tool Classification
The server automatically classifies tools into safety categories:
- ✓ SAFE: Read-only operations (SELECT, PRAGMA, EXPLAIN, database info, backups)
- ⚠️ DESTRUCTIVE: Data modification (INSERT, UPDATE, DELETE, bulk insert)
- ⚠️ SCHEMA CHANGE: Structure modification (CREATE, ALTER, DROP, schema import)
- ⚠️ TRANSACTION: Transaction control (BEGIN, COMMIT, ROLLBACK)
- ✓ MAINTENANCE: Optimization operations (VACUUM, connection management)
Best Practices
- Always use parameterized queries to prevent SQL injection
- Use transactions for multi-step operations to ensure data consistency
- Review destructive operations before execution
- Create backups before major schema changes
- Use bulk_insert for inserting large datasets efficiently
- Export schemas before major structural changes
- Use appropriate tools for different operation types
- Monitor connection pool usage in high-traffic scenarios
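The first practice — parameterized queries — maps onto the params field that the query tools accept. A hypothetical execute_write_query request using named parameters (the :name placeholder style and the values shown are illustrative; consult the underlying better-sqlite3 documentation for the exact placeholder forms supported):

```json
{
  "query": "INSERT INTO users (name, email) VALUES (:name, :email)",
  "params": { "name": "Grace Hopper", "email": "grace@example.com" }
}
```

Binding values through params instead of interpolating them into the SQL string keeps user-supplied data out of the query text, which is what prevents SQL injection.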
Development
Building
pnpm run build
Development Mode
pnpm run dev
Cleaning
pnpm run clean
Architecture
The server is built with a modular architecture:
Core Modules
- src/index.ts: Main server entry point
- src/config.ts: Configuration management with Valibot validation
Database Clients
- src/clients/connection-manager.ts: Advanced connection pooling with health monitoring
- src/clients/query-executor.ts: SQL execution, bulk operations, and query utilities
- src/clients/transaction-manager.ts: ACID transaction management with savepoints
- src/clients/schema-manager.ts: Schema export/import functionality
- src/clients/sqlite.ts: Main SQLite client interface and utilities
Tool Handlers
- src/tools/handler.ts: Tool registration orchestrator
- src/tools/admin-tools.ts: Database and table management tools
- src/tools/query-tools.ts: Query execution and bulk operation tools
- src/tools/transaction-tools.ts: Transaction management tools
- src/tools/schema-tools.ts: Schema export/import tools
- src/tools/context.ts: Database context management
Common Utilities
- src/common/types.ts: TypeScript type definitions
- src/common/errors.ts: Error handling utilities
This modular design provides:
- Separation of Concerns: Each module has a single responsibility
- Maintainability: Easy to test, debug, and extend individual components
- Scalability: New features can be added without affecting existing code
- Type Safety: Comprehensive TypeScript coverage throughout
Dependencies
- tmcp: Modern TypeScript MCP framework
- better-sqlite3: High-performance SQLite driver
- valibot: Lightweight validation library for type-safe inputs
- csv-parser: CSV parsing capabilities
- csv-writer: CSV export functionality
Key Features Provided by Dependencies
- tmcp: Streamlined MCP server development with excellent TypeScript support
- better-sqlite3: Synchronous SQLite operations with superior performance
- valibot: Runtime type validation for all tool parameters
- csv-*: Future-ready for CSV import/export capabilities
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT License - see the LICENSE file for details.
Acknowledgments
- Built on the Model Context Protocol
- Inspired by mcp-turso-cloud
- Uses better-sqlite3 for high-performance SQLite operations