
🤖 A self-hosted Discord bot powered by an LLM with tool calling & per-channel context

🧹 Sweep

Sweep is not a traditional Discord bot: there are no hardcoded commands. It understands complex requests and acts directly on the Discord API through tool calls.

Demo GIF

✨ Features

  • Tool calling to interact directly with the Discord API
  • Per-channel context, so conversations in different channels don't get mixed up
  • Works with any OpenAI-compatible endpoint, so both self-hosted and hosted backends are supported
  • Versatile use cases: you are not tied to any hardcoded logic
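
Conceptually, the per-channel context can be sketched as a map from channel id to message history. This is a minimal illustration with made-up types, not Sweep's actual internals:

```rust
use std::collections::HashMap;

/// Minimal sketch of per-channel conversation state: each Discord
/// channel id maps to its own message history, so prompts built for
/// one channel never include messages from another.
#[derive(Default)]
struct ChannelContexts {
    histories: HashMap<u64, Vec<String>>,
}

impl ChannelContexts {
    /// Record a message in the history of a single channel.
    fn push(&mut self, channel_id: u64, message: &str) {
        self.histories
            .entry(channel_id)
            .or_default()
            .push(message.to_string());
    }

    /// The messages that would be sent to the LLM for this channel.
    fn history(&self, channel_id: u64) -> &[String] {
        self.histories
            .get(&channel_id)
            .map(|msgs| msgs.as_slice())
            .unwrap_or(&[])
    }
}
```

Keying the store by channel id is what keeps each channel's conversation isolated.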

🛡️ Safety

Sweep implements safety measures to prevent unprivileged users from manipulating it into performing unauthorized actions. Note that LLMs can still be tricked.

  • Approval system: Users must explicitly grant permission via an embed button before Sweep can act on their behalf.
    Demo GIF
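
The approval flow above can be sketched as a queue of pending actions keyed by interaction id. The types here are illustrative, not Sweep's real implementation:

```rust
use std::collections::HashMap;

/// Sketch of the approval gate: a privileged action is parked until
/// the requesting user confirms it via the embed button.
#[derive(Default)]
struct ApprovalQueue {
    /// Pending actions keyed by the interaction that must approve them.
    pending: HashMap<u64, String>,
}

impl ApprovalQueue {
    /// Park an action until the user presses the approval button.
    fn request(&mut self, interaction_id: u64, action: &str) {
        self.pending.insert(interaction_id, action.to_string());
    }

    /// Called on button press; yields the action to execute, or None
    /// if nothing was awaiting approval (e.g. already approved once).
    fn approve(&mut self, interaction_id: u64) -> Option<String> {
        self.pending.remove(&interaction_id)
    }
}
```

Removing the entry on approval means each grant is consumed exactly once, so a button press cannot authorize more than one action.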

🌐 OpenAI-Compatible Endpoint

Sweep supports any OpenAI-compatible endpoint, including those exposed by self-hosted inference tools.

⚓ Requirements

Rust requirements:

  • Sweep is always developed against the latest Rust version. Backwards compatibility with older toolchains is not guaranteed.

LLM requirements:

  • Tool calling support
  • There is no hard requirement on parameter count, but be aware that smaller models are more likely to produce malformed tool calls.
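
Since Sweep talks to an OpenAI-compatible endpoint, tool calling support means the model must accept tool definitions in the standard OpenAI function-calling schema. For example (a hypothetical `delete_message` tool for illustration, not necessarily one of Sweep's actual tools):

```json
{
  "type": "function",
  "function": {
    "name": "delete_message",
    "description": "Delete a message in the current channel",
    "parameters": {
      "type": "object",
      "properties": {
        "message_id": { "type": "string", "description": "ID of the message to delete" }
      },
      "required": ["message_id"]
    }
  }
}
```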

🔬 Tested With

We have a discussion category purely for model ratings. Check it out!

⚡ Quickstart

First, clone the repository:

git clone https://github.com/BaxoPlenty/sweep.git

Then, configure Sweep via your .env file (you may also configure the OpenAI endpoint):

DISCORD_TOKEN=your_discord_bot_token
MODEL=your_model

Lastly, run Sweep:

cargo run --release

⚙️ Configuration

💫 OpenAI Endpoint Configuration

Sweep uses async-openai for connecting to the OpenAI-compatible endpoint. You can configure which endpoint is used with environment variables such as OPENAI_BASE_URL.
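
For example, a .env could point Sweep at a locally hosted backend. OPENAI_BASE_URL is the variable named under Common Errors below; the key variable, URL, and port here are illustrative placeholders:

```
# Illustrative endpoint configuration: URL and key are placeholders
OPENAI_API_KEY=placeholder-key
OPENAI_BASE_URL=http://localhost:8080/v1
```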

🧹 Sweep

The binary expects the following environment variables to be present:

  • DISCORD_TOKEN: The Discord bot token used to log in as the bot user
  • MODEL: The model used for inference

You may use a .env file.

🗺️ Roadmap

You can check existing feature requests here. You can also submit new feature requests.

💣 Common Errors

  • Unexpected Endpoint: Make sure that the OPENAI_BASE_URL does not have a trailing /
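
One way to guard against this pitfall is to normalize the base URL before constructing the client. This is a sketch of the idea, not code from Sweep:

```rust
/// Strip any trailing '/' from the configured base URL so that paths
/// appended by the client don't produce a double slash (illustrative
/// helper, not part of Sweep).
fn normalize_base_url(url: &str) -> String {
    url.trim_end_matches('/').to_string()
}
```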

🤝 Contributing

Pull requests and issues are very welcome! This applies to bug fixes, bug reports, feature requests and basically everything!

📄 License

This project is licensed under the AGPL-3.0. This means that if you modify Sweep and offer it as a network service, you must make your modified source available under the same license.

See LICENSE for details.
