hyperframes

mcp
Security Audit
Failed
Health Warning
  • License — MIT
  • Description — Repository has a description
  • Active repo — Last push 0 days ago
  • Low visibility — Only 5 GitHub stars
Code Failed
  • child_process — Shell command execution capability in .claude/settings.json
  • execSync — Synchronous shell command execution in .claude/settings.json
  • process.env — Environment variable access in .claude/settings.json
  • fs module — File system access in .github/workflows/publish.yml
Permissions Passed
  • Permissions — No dangerous permissions requested
Purpose
This is an open-source video rendering framework and MCP server that allows AI agents to create and render video compositions using HTML. It processes these compositions locally via Puppeteer and FFmpeg to produce final MP4 files.

Security Assessment
Overall Risk: Medium. The tool raises notable security concerns, primarily around system execution. The automated audit flagged shell command execution (`child_process` and `execSync`) inside the AI agent configuration, meaning the tool is explicitly designed to run terminal commands autonomously. It also accesses environment variables and performs file system operations within its continuous integration workflows. While no hardcoded secrets or dangerous permission scopes were detected, allowing an AI agent to execute shell commands inherently increases the risk profile.
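The audit's own tooling is not public, but the kind of static check it describes is easy to picture. A minimal sketch (the pattern list and function name are illustrative, not the audit's actual rule set):

```typescript
// Hypothetical sketch of a static check like the audit's "Code" findings;
// the pattern list below is illustrative, not the audit's actual rule set.
const RISKY_PATTERNS: RegExp[] = [
  /child_process/, // shell command execution capability
  /execSync/,      // synchronous shell execution
  /process\.env/,  // environment variable access
];

// Returns the source strings of patterns a config/source file matches.
function flagRiskyCalls(source: string): string[] {
  return RISKY_PATTERNS.filter((p) => p.test(source)).map((p) => p.source);
}

const hook = 'const { execSync } = require("child_process");';
console.log(flagRiskyCalls(hook)); // logs the two matched patterns
```

A real scanner would walk the repository tree and report file and line numbers; the point is that the flagged findings are plain substring/regex matches, so they indicate capability rather than proven misuse.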

Quality Assessment
The project is very new but actively maintained, with repository activity as recent as today. It is properly licensed under the standard MIT license, which is excellent for open-source adoption. However, community trust and visibility are currently very low. With only 5 GitHub stars, the tool has not yet undergone broad public testing or community-driven security review. It is published by HeyGen, providing a baseline of organizational accountability.

Verdict
Use with caution. While the code is openly licensed and actively developed, the tool's ability to autonomously execute shell commands and its low community visibility warrant careful sandboxing and testing before integrating it into production environments.
SUMMARY

Write HTML. Render video. Built for agents.

README.md

Hyperframes


Write HTML. Render video. Built for agents.

Hyperframes is an open-source video rendering framework that lets you create, preview, and render HTML-based video compositions — with first-class support for AI agents via MCP.

Why Hyperframes?

  • HTML-native — AI agents already speak HTML. No React required.
  • Frame Adapter pattern — bring your own animation runtime (GSAP, Lottie, CSS, Three.js).
  • Deterministic rendering — same input = identical output. Built for automated pipelines.
  • AI-first design — not a bolted-on afterthought.

Quick Start

npx hyperframes init my-video
cd my-video

Then open the project with your AI coding agent (Claude Code, Cursor, etc.) — it has HyperFrames skills installed and knows how to create and edit compositions.

npx hyperframes preview   # preview in browser (live reload)
npx hyperframes render    # render to MP4

Requirements: Node.js >= 22, FFmpeg

Documentation

Full documentation at hyperframes.heygen.com — start with the Quickstart, then explore guides, concepts, API reference, and package docs.

How It Works

Define your video as HTML with data attributes:

<div id="stage" data-composition-id="my-video" data-start="0" data-width="1920" data-height="1080">
  <video
    id="clip-1"
    data-start="0"
    data-duration="5"
    data-track="0"
    src="intro.mp4"
    muted
    playsinline
  ></video>
  <img id="overlay" data-start="2" data-duration="3" data-track="1" src="logo.png" />
  <audio
    id="bg-music"
    data-start="0"
    data-duration="9"
    data-track="2"
    data-volume="0.5"
    src="music.wav"
  ></audio>
</div>

Preview instantly in the browser. Render to MP4 locally. Let AI agents compose videos using tools they already understand.
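The data-* attributes above map directly onto a simple timeline model. A minimal sketch of that mapping (the Clip type and helper are assumptions for illustration, not the @hyperframes/core types):

```typescript
// Hypothetical timeline model for the data-* attributes shown above;
// not the actual @hyperframes/core types.
interface Clip {
  id: string;
  start: number;    // seconds, from data-start
  duration: number; // seconds, from data-duration
  track: number;    // layer index, from data-track
}

// Composition length is the latest clip end time across all tracks.
function compositionDuration(clips: Clip[]): number {
  return clips.reduce((end, c) => Math.max(end, c.start + c.duration), 0);
}

// The three elements from the HTML example above:
const clips: Clip[] = [
  { id: "clip-1", start: 0, duration: 5, track: 0 },
  { id: "overlay", start: 2, duration: 3, track: 1 },
  { id: "bg-music", start: 0, duration: 9, track: 2 },
];
console.log(compositionDuration(clips)); // 9
```

Everything a renderer needs (when each element appears, how long it lasts, which layer it sits on) is recoverable from attributes an AI agent can already read and write as ordinary HTML.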

Packages

  • hyperframes — CLI: create, preview, lint, and render compositions
  • @hyperframes/core — types, parsers, generators, linter, runtime, frame adapters
  • @hyperframes/engine — seekable page-to-video capture engine (Puppeteer + FFmpeg)
  • @hyperframes/producer — full rendering pipeline (capture + encode + audio mix)
  • @hyperframes/studio — browser-based composition editor UI

AI Agent Skills

HyperFrames ships skills that teach AI coding agents (Claude Code, Gemini CLI, Codex, Cursor) how to write correct compositions and GSAP animations. Use these instead of writing from scratch — they encode framework-specific patterns that generic docs don't cover.

Install via CLI (recommended)

# Install all skills (HyperFrames + GSAP) — runs automatically during `hyperframes init`
npx hyperframes skills

# Or install to a specific agent
npx hyperframes skills --claude
npx hyperframes skills --cursor

Or via npx skills add

# HyperFrames skills (hyperframes-compose, hyperframes-captions)
npx skills add heygen-com/hyperframes

# GSAP skills (gsap-core, gsap-timeline, gsap-scrolltrigger, gsap-plugins, gsap-performance, gsap-utils, gsap-react, gsap-frameworks)
npx skills add greensock/gsap-skills

Installed Skills

  • HyperFrames — hyperframes-compose, hyperframes-captions: HTML composition structure, class="clip" rules, data-* attributes, timeline registration, rendering constraints
  • GSAP — gsap-core, gsap-timeline, gsap-performance, gsap-plugins, gsap-scrolltrigger, gsap-utils, gsap-react, gsap-frameworks: core API, timeline sequencing, ScrollTrigger, plugin usage, performance best practices

In Claude Code, invoke with /hyperframes-compose, /hyperframes-captions, /gsap-core, etc.

Contributing

See CONTRIBUTING.md for guidelines on how to contribute.

License

See LICENSE for details.
