wall-e
Health: Warn
- No license — Repository has no license file
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Low visibility — Only 6 GitHub stars
Code: Warn
- crypto private key — Private key handling in src/github.ts
- network request — Outbound network request in src/index.ts
- crypto private key — Private key handling in src/index.ts
Permissions: Pass
- Permissions — No dangerous permissions requested
This tool is a GitHub bot that automates the generation of Cloudflare Workers. It uses spec-driven development by reading integration test files from pull requests and generating the corresponding worker code.
Security Assessment
Overall risk: Medium. The tool requests no dangerous system permissions, but standard bot operations require handling private keys (`src/github.ts`, `src/index.ts`) to authenticate with GitHub webhooks. It also makes outbound network requests to communicate with external APIs (likely GitHub and an AI provider). No hardcoded secrets were detected. While standard for this type of application, the required access to pull requests and code generation capabilities warrant careful permission scoping.
Quality Assessment
The project is highly active, with its most recent code push happening today. However, it suffers from extremely low community visibility (only 6 stars) and lacks a license file. The absence of a license is a critical oversight for an open-source tool: without one, default copyright law restricts how others may use, modify, or distribute the code.
Verdict
Use with caution: It is actively maintained and secure against dangerous permissions, but the missing license and low community adoption pose compliance and reliability risks.
A GitHub bot that supercharges spec-driven development through automated generation of Cloudflare Workers.
Worker Assembly Large Language Engine
What's WALL-E?
WALL-E is a GitHub bot that supercharges spec-driven development through automated generation of Cloudflare Workers. Based on worker functional requirements and integration tests (Spec File), WALL-E creates corresponding worker code, streamlining the development process.
Usage
Installing
Install the bot by visiting the GitHub App installation page and following these steps:
- On the app page, click "Install" in the top-right corner.
- Select your organization or personal account where you'd like to install the app.
- Choose "All repositories" or select specific ones where the bot should be active.
- After selecting repositories, click "Install" again to finish.
Once installed, the bot will automatically start working based on your repository configuration.
Prerequisites
1. Set up your project
For a new project:
Create a Cloudflare Worker project by running:
npm create cloudflare@latest -- your-worker-name
Replace your-worker-name with the desired name of your worker. This command initializes a new project in a directory named after your worker.
For an existing project:
Ensure your project includes the required test/index.spec.ts file (details below).
2. Create or update a Pull Request
Open a pull request that includes your test/index.spec.ts file.
3. Prepare the test/index.spec.ts file
Follow spec file best practices for creating your test/index.spec.ts file.
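A real spec file expresses its tests with Vitest against your worker. As a self-contained sketch of the two things a spec file should carry (requirement comments and behavioral tests), here is a hypothetical `/reverse` endpoint checked with plain assertions; the endpoint, names, and checks are illustrative only and not part of WALL-E:

```typescript
// Functional requirement (the spec file's first section, as comments):
// GET /reverse?text=<value> returns {"result": "<value reversed>"} as JSON;
// every other route returns 404.
const worker = {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (request.method === "GET" && url.pathname === "/reverse") {
      const text = url.searchParams.get("text") ?? "";
      return new Response(
        JSON.stringify({ result: text.split("").reverse().join("") }),
        { headers: { "content-type": "application/json" } },
      );
    }
    return new Response("Not found", { status: 404 });
  },
};

// Integration-style checks (the spec file's second section; a real
// test/index.spec.ts would write these as Vitest `it(...)` blocks):
async function runChecks(): Promise<void> {
  const ok = await worker.fetch(new Request("http://worker/reverse?text=spec"));
  const body = (await ok.json()) as { result: string };
  if (body.result !== "ceps") throw new Error("reverse requirement failed");

  const missing = await worker.fetch(new Request("http://worker/nope"));
  if (missing.status !== 404) throw new Error("404 requirement failed");
}
```

The more precisely the comments and tests pin down inputs, outputs, and edge cases, the less room the generator has to guess.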
Basic Usage
Activate WALL-E in a pull request by commenting:
/wall-e generate
Advanced Usage
For more control, use optional parameters:
/wall-e generate path:workers/generate-embeddings provider:openai temperature:0.8
| Parameter | Aliases | Description | Default |
|---|---|---|---|
| path | | custom path to a worker dir | repository root |
| provider | | provider for code generation | anthropic |
| model | | model name from the provider | claude-sonnet-4-5-20250929 |
| temperature | temp | model temperature setting (0-1) | 0.5 |
| fallback | | whether to use fallback models | true |
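To make the `key:value` syntax concrete, here is an illustrative sketch of how such a comment could be parsed into options; the names and defaults mirror the parameters above, but this is not WALL-E's actual implementation:

```typescript
interface GenerateOptions {
  path: string;        // worker directory, defaults to repository root
  provider: string;    // anthropic | openai | googleai
  model: string;
  temperature: number; // 0-1
  fallback: boolean;
}

// Alias map: `temp:` is shorthand for `temperature:`.
const ALIASES: Record<string, string> = { temp: "temperature" };

function parseCommand(comment: string): GenerateOptions {
  const options: GenerateOptions = {
    path: ".",
    provider: "anthropic",
    model: "claude-sonnet-4-5-20250929",
    temperature: 0.5,
    fallback: true,
  };
  // Skip "/wall-e" and the subcommand; remaining tokens look like key:value.
  for (const token of comment.trim().split(/\s+/).slice(2)) {
    const [rawKey, value] = token.split(":");
    if (!value) continue;
    const key = ALIASES[rawKey] ?? rawKey;
    if (key === "temperature") options.temperature = Number(value);
    else if (key === "fallback") options.fallback = value !== "false";
    else if (key === "path" || key === "provider" || key === "model") {
      options[key] = value;
    }
  }
  return options;
}
```

For example, `parseCommand("/wall-e generate temp:0.8 provider:openai")` keeps the default model and path while overriding temperature and provider.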
Available Providers
- anthropic
- openai
- googleai
Available Models
- claude-sonnet-4-5-20250929
- claude-sonnet-4-5-20250929-thinking
- claude-opus-4-5-20251101
- claude-opus-4-5-20251101-thinking
- gpt-4.1
- o4-mini-2025-04-16
- o3-pro-2025-06-10
- gemini-2.5-pro
- gemini-2.5-flash
Improve Feature
The Improve Feature uses the existing code and spec file to generate optimized code based on the provided feedback. Example:
/wall-e improve path:workers/deduplicated-insert provider:googleai
---
- No need to import "Ai" from `cloudflare:ai` package
- Update "AI" binding to use "Ai" as type
Use this feature when you need to improve generated code with aspects not covered in spec files, such as:
- Fixing imports
- Adjusting types
- Correcting API usage
- Correcting typos
How It Works
Prompt
The prompt sent to the LLM consists of two sections: the instructions and the spec file.
Instructions
The instructions section of the prompt explains the task and the general environment. It's relatively static and shouldn't change too often.
test/index.spec.ts
The spec file is copied from the head branch and should contain two important sections: comments covering all functional requirements, and Vitest integration tests covering all input/output interfaces as well as any business-logic edge cases.
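The assembly of these two sections might look like the following sketch; the wording, section labels, and function name are illustrative assumptions, not WALL-E's actual prompt:

```typescript
// Static instructions section: explains the task and environment,
// and changes rarely. Wording here is hypothetical.
const INSTRUCTIONS =
  "You are generating a Cloudflare Worker. " +
  "Write src/index.ts so that every test in the spec file below passes.";

// Combine the static instructions with the spec file copied from the
// PR's head branch into a single prompt string.
function buildPrompt(specFileContents: string): string {
  return [
    "## Instructions",
    INSTRUCTIONS,
    "## Spec file (test/index.spec.ts)",
    specFileContents,
  ].join("\n\n");
}
```

Because only the spec section varies between runs, the quality of the generated worker hinges almost entirely on the spec file you write.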
Please adhere to our best practices when writing your spec files!
Showcase
Open-source workers generated by WALL-E running in production:
Code quality
Human vs machine
You may well be convinced that your homemade ravioli are superior to those made by soulless machines, but it is getting hard to compete with the latest-generation LLMs on code quality and efficiency for smaller workers. For more complex projects, it is probably a good idea to split them into smaller components anyway.