nika
Health: Warn
- License — AGPL-3.0
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Low visibility — Only 5 GitHub stars
Code: Fail
- rm -rf — Recursive force deletion command in .github/workflows/ci.yml
- process.env — Environment variable access in .github/workflows/ci.yml
- rm -rf — Recursive force deletion command in .github/workflows/release.yml
- process.env — Environment variable access in .github/workflows/release.yml
- fs module — File system access in .github/workflows/release.yml
Permissions: Pass
- Permissions — No dangerous permissions requested
This tool is an open-source workflow engine that orchestrates AI tasks using simple YAML files. It allows developers to define execution steps that route prompts to various LLM providers and handle the returned data.
Security Assessment
Overall risk: Medium. The application is designed to make external network requests (fetching URLs, routing to AI providers) and execute shell commands from user-defined YAML workflows, which is standard for its intended purpose but requires careful configuration. No hardcoded secrets or dangerous permission requests were found. However, the rule-based scan failed because the GitHub Actions CI/CD workflows use recursive force deletion commands (`rm -rf`) and read environment variables. While `rm -rf` is typically used for temporary workspace cleanup in automated pipelines, it is a dangerous pattern if improperly handled.
Quality Assessment
The project is actively maintained, with its most recent code push happening today. It is properly licensed under AGPL-3.0, which is important to note as it imposes strict copyleft requirements for derivative works. However, community trust and visibility are currently very low. With only 5 stars on GitHub, the tool is in its infancy. This means it has not yet undergone broad peer review or real-world battle-testing by the developer community.
Verdict
Use with caution—the core engine is functional and recently updated, but its low community adoption and strict AGPL licensing mean it may not be suitable for all commercial projects.
Inference as Code — 5 YAML verbs to orchestrate any AI. Rust engine, 9 providers, single binary. AGPL-3.0. 🦋
Nika
One file. Any AI.
The open source Inference as Code engine.
Write AI workflows in YAML. Run them anywhere.
Quick Start · 5 Verbs · Examples · Benchmarks · Course · Install
# news.nika.yaml — Scrape Hacker News and summarize the top stories
schema: "nika/[email protected]"
provider: claude # or: openai, mistral, groq, gemini, deepseek, xai, local
tasks:
- id: scrape
fetch: { url: "https://news.ycombinator.com", extract: article }
- id: summarize
with: { page: $scrape }
infer: "3-bullet summary of today's top stories: {{with.page}}"
nika run news.nika.yaml
What is Nika?
Nika is a workflow engine where each step is a YAML task with exactly one verb -- infer, exec, fetch, invoke, or agent. Write your steps in a .nika.yaml file, run nika run, and Nika handles the rest: parallel execution, data flow between tasks, retries, structured output, and multi-provider LLM routing.
Inference as Code. The same shift that Terraform brought to infrastructure -- describe your intent in a file, let a runtime handle execution. Your workflow is a YAML file that you commit, review in a PR, diff, and version. Five verbs describe any automation, from a 3-step summary to a 50-task parallel pipeline across multiple AI providers.
| | Without Nika | With Nika |
|---|---|---|
| Workflow | Copy-paste between ChatGPT tabs | Write steps once, run forever |
| Scale | One thing at a time | 50 items in parallel with for_each |
| Providers | Locked into one vendor at $20/mo | 9 providers, switch in one line |
| Output | Pray the LLM returns valid JSON | 5-layer schema validation with auto-repair |
| Reproducibility | "It worked last time" | Deterministic DAG, NDJSON traces, event replay |
| Deployment | Docker + Python + venv + pip | Single binary, zero dependencies |
Quick Start
# Install (pick one)
brew install supernovae-st/tap/nika # macOS / Linux
cargo install nika # from crates.io
npx @supernovae-st/nika # run without installing
# Set up your API key
nika setup
# Run your first workflow
nika run hello.nika.yaml
hello.nika.yaml
schema: "nika/[email protected]"
provider: claude
inputs:
topic: "butterflies"
tasks:
- id: haiku
infer: "Write a haiku about {{inputs.topic}}"
Want more? Scaffold a full project or start the interactive course:
nika init # 5 starter workflows (one per verb)
nika init --course # 44 hands-on exercises across 12 levels
nika doctor # verify your setup
The 5 Verbs
Every task uses exactly one verb. That is the entire API surface.
| Verb | What it does | Example |
|---|---|---|
| infer: | Call any LLM | infer: "Summarize this: {{with.text}}" |
| exec: | Run a shell command | exec: "git log --oneline -5" |
| fetch: | HTTP request + extraction | fetch: { url: "https://...", extract: markdown } |
| invoke: | Call MCP or builtin tools | invoke: { tool: nika:thumbnail, params: { width: 800 } } |
| agent: | Multi-turn autonomous loop | agent: { prompt: "Research...", max_turns: 15 } |
flowchart LR
classDef verb fill:#6366f1,stroke:#4f46e5,stroke-width:2px,color:#fff
classDef target fill:#06b6d4,stroke:#0891b2,stroke-width:2px,color:#fff
INFER[infer]:::verb --> LLM["9 Providers"]:::target
EXEC[exec]:::verb --> SHELL[Shell]:::target
FETCH[fetch]:::verb --> HTTP["HTTP + 9 Extract Modes"]:::target
INVOKE[invoke]:::verb --> TOOLS["62 Tools + MCP"]:::target
AGENT[agent]:::verb --> LOOP["Agentic Loop + Guardrails"]:::target
Five words. Not fifty abstractions. If you've used Terraform, GitHub Actions, or Docker Compose, this will feel familiar -- because the pattern is the same. Declare what you want, let the engine figure out how.
Examples
Scrape, summarize, translate -- in parallel
schema: "nika/[email protected]"
provider: claude
tasks:
- id: scrape
fetch: { url: "https://example.com/blog", extract: markdown }
- id: summarize
with: { content: $scrape }
infer: "Summarize in 3 bullets: {{with.content}}"
- id: translate
for_each: ["French", "Spanish", "Japanese", "German", "Portuguese"]
as: lang
concurrency: 5
with: { summary: $summarize }
infer: "Translate to {{with.lang}}: {{with.summary}}"
Multi-provider fan-out -- same question, three perspectives
schema: "nika/[email protected]"
tasks:
- id: claude_take
provider: anthropic
infer: "Analyze this trend: {{inputs.topic}}"
- id: gpt_take
provider: openai
model: gpt-4o
infer: "Analyze this trend: {{inputs.topic}}"
- id: gemini_take
provider: gemini
model: gemini-2.5-flash
infer: "Analyze this trend: {{inputs.topic}}"
- id: synthesize
depends_on: [claude_take, gpt_take, gemini_take]
with:
claude: $claude_take
gpt: $gpt_take
gemini: $gemini_take
infer: "Synthesize these 3 perspectives: {{with.claude}} / {{with.gpt}} / {{with.gemini}}"
Structured data extraction -- guaranteed valid JSON
schema: "nika/[email protected]"
provider: claude
tasks:
- id: extract
infer: "Tell me about Alice, 30, Rust and Python developer"
structured:
schema:
type: object
required: [name, age, skills]
properties:
name: { type: string }
age: { type: number, minimum: 0 }
skills: { type: array, items: { type: string }, minItems: 1 }
enable_repair: true
max_retries: 3
The prompt is natural language -- never mention JSON. The 5-layer defense handles extraction, validation, retry, and LLM repair automatically. Same result on all 9 providers.
AI agent with guardrails and cost limits
schema: "nika/[email protected]"
provider: claude
tasks:
- id: research
agent:
prompt: "Research the top 5 competitors for our product"
tools: [nika:read, nika:write, nika:glob]
max_turns: 15
guardrails:
- type: length
max_words: 2000
limits:
max_cost_usd: 1.00
completion:
mode: explicit
Image processing pipeline
schema: "nika/[email protected]"
provider: claude
tasks:
- id: import
invoke: { tool: nika:import, params: { path: "./photo.jpg" } }
- id: thumbnail
with: { img: $import }
invoke:
tool: nika:pipeline
params:
hash: "{{with.img.hash}}"
ops:
- { op: thumbnail, width: 800 }
- { op: optimize }
- { op: convert, format: webp }
- id: describe
with: { img: $import }
infer:
content:
- type: image
source: "{{with.img.hash}}"
- type: text
text: "Write an alt-text description for this image"
115 more examples available via nika showcase list and nika showcase extract <name>.
Key Features
Providers -- 9 backends, zero lock-in
Switch providers in one line. Same workflow, any AI.
| Provider | Models | Env Var |
|---|---|---|
| Anthropic | claude-opus-4, claude-sonnet-4, claude-haiku-4.5 | ANTHROPIC_API_KEY |
| OpenAI | gpt-4o, gpt-4.1, o3, o4-mini | OPENAI_API_KEY |
| Gemini | gemini-2.5-pro, gemini-2.5-flash | GEMINI_API_KEY |
| Mistral | mistral-large-latest, mistral-small-latest | MISTRAL_API_KEY |
| Groq | llama-3.3-70b-versatile, mixtral-8x7b | GROQ_API_KEY |
| DeepSeek | deepseek-chat, deepseek-reasoner | DEEPSEEK_API_KEY |
| xAI | grok-3 | XAI_API_KEY |
| Native | Any GGUF model locally via mistral.rs | -- |
| Mock | Deterministic test responses -- no API calls, no keys | -- |
Connect to any OpenAI-compatible endpoint (vLLM, Ollama, LiteLLM, SGLang) via base_url:.
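As a concrete illustration, a workflow pointed at a local Ollama server might look like the sketch below. This is an assumption-laden example: the README names the base_url field but not its exact placement, so the layout shown here is a guess, not documented syntax.

```yaml
# Hypothetical sketch -- the README only states that OpenAI-compatible
# endpoints (vLLM, Ollama, LiteLLM, SGLang) are configured via base_url;
# its exact placement in the file is an assumption.
provider: openai
base_url: "http://localhost:11434/v1"  # e.g. a local Ollama server
tasks:
  - id: hello
    infer: "Say hello from the locally hosted model"
```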
Structured Output -- 5-layer defense
Get guaranteed schema-valid JSON from any provider. No prompt hacking required.
| Layer | Strategy |
|---|---|
| L0 | Provider-native tool/schema enforcement |
| L2 | Extract + validate JSON from response |
| L3 | Retry with error feedback |
| L4 | LLM repair call (last resort) |
Same result on all 9 providers. No exceptions.
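The layered design amounts to a chain of increasingly expensive strategies, each one only running if the previous layer failed. A minimal, dependency-free sketch of that control flow (function and type names here are illustrative, not Nika's internal API):

```rust
// Illustrative sketch of a layered fallback chain. Names are hypothetical,
// not Nika's internals: each layer tries to produce valid output and hands
// control to the next layer on failure.
type Layer = fn(&str) -> Result<String, String>;

fn extract_json(raw: &str) -> Result<String, String> {
    // Example layer: pull the first {...} block out of a chatty LLM response.
    let start = raw.find('{').ok_or("no JSON object found")?;
    let end = raw.rfind('}').ok_or("unterminated JSON object")?;
    Ok(raw[start..=end].to_string())
}

fn run_layers(raw: &str, layers: &[Layer]) -> Result<String, String> {
    let mut last_err = String::from("no layers configured");
    for layer in layers {
        match layer(raw) {
            Ok(v) => return Ok(v),
            Err(e) => last_err = e, // fall through to the next layer
        }
    }
    Err(last_err)
}

fn main() {
    let layers: [Layer; 1] = [extract_json];
    let raw = "Sure! Here is the data: {\"name\": \"Alice\", \"age\": 30}";
    let out = run_layers(raw, &layers).unwrap();
    assert_eq!(out, "{\"name\": \"Alice\", \"age\": 30}");
    println!("{out}");
}
```

In the real engine, later layers would carry the accumulated error back to the model as feedback (retry) or as a repair prompt (last resort), rather than just passing the raw input along.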
Data Flow -- 63 transforms, bindings, parallel loops
tasks:
- id: fetch_data
fetch: { url: "https://api.example.com/users" }
- id: process
with:
users: $fetch_data # bind upstream output
name: $fetch_data.data[0].name # JSONPath access
safe: $fetch_data.name ?? "Unknown" # default fallback
infer: "First user: {{with.name | upper | trim}}"
63 pipe transforms: upper, lower, trim, join(","), split(","), sort, unique, flatten, first, last, length, to_json, parse_json, default("x"), pluck(field), where(field, val), sort_by(field), pick(f1,f2), omit(f1,f2), jq(expr), regex(pattern), and 40+ more.
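The pipe syntax reads as left-to-right function composition: each transform consumes the previous value. A toy interpreter for a few of the argument-free string transforms makes the evaluation order concrete (a sketch only; Nika's real engine supports 63 transforms, many of which take arguments):

```rust
// Toy interpreter for a few pipe transforms (upper, lower, trim, length).
// Illustrative only -- not Nika's transform engine.
fn apply_transform(value: String, name: &str) -> String {
    match name {
        "upper" => value.to_uppercase(),
        "lower" => value.to_lowercase(),
        "trim" => value.trim().to_string(),
        "length" => value.chars().count().to_string(),
        _ => value, // unknown transforms pass the value through unchanged
    }
}

fn apply_pipeline(input: &str, pipeline: &str) -> String {
    // "trim | upper" style: transforms run left to right over the value.
    pipeline
        .split('|')
        .map(str::trim)
        .fold(input.to_string(), apply_transform)
}

fn main() {
    assert_eq!(apply_pipeline("  alice  ", "trim | upper"), "ALICE");
    assert_eq!(apply_pipeline("hello", "length"), "5");
}
```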
Parallel loops with for_each + concurrency:
- id: translate
for_each: ["en", "fr", "ja", "de", "ko"]
as: locale
concurrency: 5
infer: "Translate to {{with.locale}}: {{with.text}}"
62 Builtin Tools
All accessible via invoke: nika:* -- no external dependencies.
| Tool | Purpose |
|---|---|
| nika:import | Import any file into CAS |
| nika:decode | Base64 string to CAS store |
| nika:thumbnail | SIMD-accelerated resize (Lanczos3) |
| nika:convert | Format conversion (PNG/JPEG/WebP) |
| nika:optimize | Lossless PNG optimization (oxipng) |
| nika:pipeline | Chain operations in-memory |
| nika:metadata | Universal EXIF/audio/video metadata |
| nika:dimensions | Image dimensions (~0.1ms) |
| nika:thumbhash | 25-byte compact placeholder |
| nika:dominant_color | Color palette extraction |
| nika:strip | Remove EXIF metadata |
| nika:svg_render | SVG to PNG (resvg) |
| nika:phash | Perceptual image hashing |
| nika:compare | Visual similarity comparison |
| nika:pdf_extract | PDF text extraction |
| nika:chart | Bar/line/pie charts from JSON |
| nika:provenance | C2PA content credentials |
| nika:verify | C2PA verification + EU AI Act |
| nika:qr_validate | QR decode + quality score |
| nika:quality | Image quality (DSSIM/SSIM) |
| Tool | Purpose |
|---|---|
| nika:jq | Full jq stdlib (100+ functions via jaq-core) |
| nika:json_merge | Deep merge JSON objects |
| nika:map | Transform array elements |
| nika:filter | Filter array by condition |
| nika:group_by | Group array into object by field |
| nika:chunk | Split array into N-sized chunks |
| nika:aggregate | Sum, avg, min, max over arrays |
| nika:json_flatten | Flatten nested JSON |
| nika:json_unflatten | Unflatten dotted keys |
| nika:set_diff | Set difference between arrays |
| nika:zip | Zip two arrays together |
| nika:token_count | Count tokens for a model |
| Tool | Purpose |
|---|---|
| nika:html_to_md | HTML to clean Markdown |
| nika:css_select | CSS selector extraction |
| nika:extract_metadata | OG, Twitter Cards, JSON-LD |
| nika:extract_links | Rich link classification |
| nika:readability | Article content extraction |
| Tool | Purpose |
|---|---|
| nika:read | Read file contents |
| nika:write | Write file (with overwrite mode) |
| nika:edit | Edit file in place |
| nika:glob | Pattern-match files |
| nika:grep | Search file contents |
| nika:sleep | Delay execution |
| nika:log | Emit log messages |
| nika:emit | Emit custom events |
| nika:assert | Runtime assertions |
| nika:run | Run sub-workflows |
| nika:complete | Signal agent completion |
| nika:inject | Template marker replacement |
MCP Integration
Nika is an MCP-native client. Connect to any Model Context Protocol server.
mcp:
web_search:
command: npx
args: ["-y", "@anthropic/mcp-web-search"]
tasks:
- id: search
invoke: { mcp: web_search, tool: search, params: { query: "..." } }
- id: agent_task
agent:
prompt: "Research this topic thoroughly"
mcp: [web_search]
max_turns: 10
nika serve -- workflows as HTTP endpoints
Expose any workflow as a REST API. SDKs for Rust, Node.js, and Python.
nika serve --port 3000
curl -X POST http://localhost:3000/v1/jobs \
-H "Content-Type: application/json" \
-d '{"workflow": "news.nika.yaml", "inputs": {"topic": "AI"}}'
SSE streaming, job queues, concurrent execution, and per-job isolation built in.
Terminal UI
Three views: Studio (editor + DAG), Command (chat + execution), Control (settings).
+-----------------------------------------------------------------------+
| Nika Studio v0.71.0 |
|-----------------------------------------------------------------------|
| +- Files --------+ +- Editor ------------------------------------+ |
| | > workflows/ | | 1 | schema: "nika/[email protected]" | |
| | deploy.nika | | 2 | provider: claude | |
| | review.nika | | 3 | tasks: | |
| +- DAG ----------+ | 4 | - id: research | |
| | [research]--+ | | 5 | agent: | |
| | | | | | 6 | prompt: "Find AI papers" | |
| | [analyze] [e] | +--------------------------------------------+ |
| | | | | |
| | [ report ] | Tree-sitter highlighting | LSP | Git gutter |
| +----------------+ Vi/Emacs modes | Fuzzy search | Undo/redo |
+-----------------------------------------------------------------------+
| [1/s] Studio [2/c] Command [3/x] Control |
+-----------------------------------------------------------------------+
Language Server
Full LSP with 16 capabilities: completion, hover, go-to-definition, diagnostics, semantic tokens, code actions, inlay hints, CodeLens, rename, formatting, and more.
cargo install nika-lsp # standalone
code --install-extension supernovae.nika-lang # VS Code
Interactive Course
12 levels. 44 exercises. From shell commands to full AI orchestration.
nika init --course
nika course next
nika course hint
| Level | Name | What You Learn |
|---|---|---|
| 01 | Jailbreak | exec, fetch, infer -- the 3 core verbs |
| 02 | Hot Wire | Data bindings, transforms, templates |
| 03 | Fork Bomb | DAG patterns, parallel execution |
| 04 | Root Access | Context files, imports, inputs |
| 05 | Shapeshifter | Structured output, JSON Schema |
| 06 | Pay-Per-Dream | Multi-provider, native models, cost control |
| 07 | Swiss Knife | Builtin tools, file operations |
| 08 | Gone Rogue | Autonomous agents, skills, guardrails |
| 09 | Data Heist | Web scraping, 9 extraction modes |
| 10 | Open Protocol | MCP integration |
| 11 | Pixel Pirate | Media pipeline, vision |
| 12 | SuperNovae | Boss battle -- everything combined |
Benchmarks
Real benchmarks. Real tasks. No cherry-picking.
RAM usage -- "Summarize 10 web pages" task
| Tool | Peak RAM | Cold start | Lines of config |
|---|---|---|---|
| Nika | ~45 MB | 4 ms | 12 |
| LangChain (Python) | ~230 MB | 1.2 s | 48 |
| LangGraph (Python) | ~210 MB | 1.1 s | 62 |
| CrewAI (Python) | ~280 MB | 1.4 s | 55 |
Nika uses 5x less RAM than LangChain for the same task.
Nika vs. Python -- the deployment story
| Metric | Nika | Python equivalent |
|---|---|---|
| Cold start | 4 ms | 800+ ms |
| RAM (idle) | 12 MB | 60+ MB |
| Binary size | ~25 MB | 200+ MB (with venv) |
| Dependencies | 0 (single binary) | pip install, venv, Docker... |
| Install | Download and run | pip install, venv, requirements.txt |
A Raspberry Pi can run Nika. A GitHub Action can run Nika. A $5/month VPS can run Nika.
Agent reliability
| Tool | Execution model | Guardrails | Retry |
|---|---|---|---|
| Nika | Deterministic DAG | Yes (4 types) | Yes (exponential backoff) |
| CrewAI | Agent negotiation | No | Manual |
| LangGraph | State machine | Partial | Manual |
| AutoGPT | Open-ended loop | No | No |
Architecture
flowchart TD
classDef phase fill:#6366f1,stroke:#4f46e5,stroke-width:2px,color:#fff
classDef verb fill:#06b6d4,stroke:#0891b2,stroke-width:2px,color:#fff
classDef backend fill:#10b981,stroke:#059669,stroke-width:2px,color:#fff
YAML[".nika.yaml"]:::phase
RAW["Parse (source spans)"]:::phase
ANA["Analyze (validate + resolve)"]:::phase
LOW["Lower (runtime types)"]:::phase
DAG["DAG Engine"]:::phase
YAML --> RAW --> ANA --> LOW --> DAG
subgraph Verbs
INF[infer]:::verb
EXC[exec]:::verb
FET[fetch]:::verb
INV[invoke]:::verb
AGT[agent]:::verb
end
DAG --> INF & EXC & FET & INV & AGT
subgraph Backends
PROV["9 Providers"]:::backend
MCPS["MCP Servers"]:::backend
BUILT["62 Builtin Tools"]:::backend
CAS["CAS Media Store"]:::backend
end
INF & AGT --> PROV
INV & AGT --> MCPS
INV --> BUILT
BUILT --> CAS
Three-phase AST (inspired by rustc): Raw (parse with source spans) --> Analyzed (validate, resolve bindings) --> Lowered (concrete runtime types). The immutable DAG is built from petgraph for safe concurrent execution.
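The scheduling idea behind a deterministic DAG engine can be sketched with Kahn's algorithm: tasks whose dependencies are all complete become "ready", and everything ready at the same time can run in parallel. A dependency-free sketch follows (the real engine builds its graph with petgraph and executes ready tasks concurrently; this toy version only produces one valid serial order):

```rust
use std::collections::{HashMap, VecDeque};

// Kahn's algorithm over a task dependency map: each task lists the tasks
// it depends on. Returns None when the declared dependencies form a cycle.
fn schedule(deps: &HashMap<&str, Vec<&str>>) -> Option<Vec<String>> {
    // Remaining unmet dependencies per task.
    let mut indegree: HashMap<&str, usize> =
        deps.keys().map(|&t| (t, deps[t].len())).collect();
    // Reverse edges: which tasks are waiting on each task.
    let mut dependents: HashMap<&str, Vec<&str>> = HashMap::new();
    for (&task, ds) in deps {
        for &d in ds {
            dependents.entry(d).or_default().push(task);
        }
    }
    // Tasks with no dependencies are runnable immediately.
    let mut ready: VecDeque<&str> = deps
        .iter()
        .filter(|(_, ds)| ds.is_empty())
        .map(|(&t, _)| t)
        .collect();
    let mut order = Vec::new();
    while let Some(task) = ready.pop_front() {
        order.push(task.to_string());
        for &next in dependents.get(task).into_iter().flatten() {
            let n = indegree.get_mut(next).unwrap();
            *n -= 1;
            if *n == 0 {
                ready.push_back(next); // all dependencies now satisfied
            }
        }
    }
    // Fewer scheduled tasks than declared tasks means a dependency cycle.
    (order.len() == deps.len()).then_some(order)
}

fn main() {
    let mut deps = HashMap::new();
    deps.insert("scrape", vec![]);
    deps.insert("summarize", vec!["scrape"]);
    deps.insert("translate", vec!["summarize"]);
    let order = schedule(&deps).unwrap();
    assert_eq!(order, vec!["scrape", "summarize", "translate"]);
}
```

The "ready queue" is also what makes failed runs resumable: completed tasks simply start with their in-degree already satisfied.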
17 workspace crates:
tools/
nika/ CLI entry point cargo install nika
nika-engine/ Embeddable runtime cargo add nika-engine
nika-core/ AST, types, catalogs zero I/O
nika-event/ EventLog, TraceWriter
nika-mcp/ MCP client (rmcp)
nika-media/ CAS store, media processor
nika-storage/ Storage abstraction
nika-daemon/ Background daemon + secrets
nika-init/ Project scaffolding + course
nika-cli/ CLI subcommands
nika-tui/ Terminal UI (ratatui)
nika-display/ Render engine
nika-lsp-core/ Protocol-agnostic LSP
nika-lsp/ Standalone LSP binary
nika-serve/ HTTP server
nika-sdk/ Rust SDK
nika-vault/ Encrypted credential store
Install
| Method | Command |
|---|---|
| Homebrew | brew install supernovae-st/tap/nika |
| Cargo | cargo install nika |
| npm | npm install -g @supernovae-st/nika |
| npx | npx @supernovae-st/nika |
| Docker | docker run --rm -v "$(pwd)":/work supernovae/nika run /work/flow.nika.yaml |
| Source | git clone https://github.com/supernovae-st/nika && cargo install --path nika/tools/nika |
nika --version # nika 0.71.0
nika doctor # full system health check
Feature flags
| Feature | Default | Description |
|---|---|---|
| tui | yes | Terminal UI (ratatui, tree-sitter, git2) |
| native-inference | yes | Local GGUF models via mistral.rs |
| media-core | yes | Tier 2 media tools |
| media-phash | yes | Perceptual hashing |
| media-pdf | yes | PDF text extraction |
| media-chart | yes | Chart generation |
| media-qr | yes | QR code validation |
| media-iqa | yes | Image quality assessment |
| media-provenance | no | C2PA signing + verification |
| fetch-extract | yes | HTML extraction |
| fetch-markdown | yes | HTML to Markdown |
| fetch-article | yes | Article extraction |
| fetch-feed | yes | RSS/Atom/JSON Feed |
| lsp | no | Standalone LSP binary |
# Minimal build
cargo install --path tools/nika --no-default-features
# Custom features
cargo install --path tools/nika --features "tui,native-inference,media-core"
Documentation
| Resource | Description |
|---|---|
| User Guide | Getting started, verbs, data flow, providers |
| Interactive Course | 12 levels, 44 exercises |
| Showcase | 8 guided examples + 115 browseable workflows |
| Manifesto | Why Inference as Code matters |
| Contributing | Build, test, conventions |
| Citation | Academic citation (Zenodo DOI) |
CLI at a glance
nika run flow.nika.yaml # execute workflow
nika run flow.nika.yaml --resume # re-run, skip completed tasks
nika check flow.nika.yaml # validate without executing
nika test flow.nika.yaml # test with mock provider
nika lint flow.nika.yaml # best-practice linting
nika explain flow.nika.yaml # human-readable summary
nika graph flow.nika.yaml # visualize DAG
nika ui # TUI
nika chat # direct chat mode
nika serve --port 3000 # HTTP API
nika init # scaffold project
nika init --course # interactive course
nika course next # next exercise
nika provider list # API key status
nika model list # available models
nika mcp list # MCP servers
nika doctor # system health
nika showcase list # browse 115 example workflows
Contributing
git clone https://github.com/supernovae-st/nika.git
cd nika
cargo build # build all 17 crates
cargo test --workspace --lib # 10,000+ tests (safe, no keychain popups)
cargo clippy -- -D warnings # zero warnings policy
Note: cargo test without --lib runs contract tests that trigger macOS Keychain popups. Always use --lib.
See CONTRIBUTING.md for full guidelines.
License
AGPL-3.0-or-later -- Nika is free software. Use it, study it, share it, improve it.
The AGPL protects the commons: if you modify Nika and offer it as a hosted service, you share your changes back. For CLI usage, there are zero restrictions. Commercial use is welcome.
Read the Manifesto to understand why.
Nika v0.71.0 · Schema nika/[email protected] · Rust 1.86+ · 17 crates · 10,000+ tests
SuperNovae Studio · QR Code AI · GitHub
Built in Paris. Open source. Forever.
Liberate your AI. 🦋