# 🧇 Waffle-bot

Waffle-bot provides an AI-powered chat service over Telegram by running a containerized OpenClaw gateway behind a Cloudflare Worker.

| | |
| --- | --- |
| **Runtime** | Cloudflare Workers + Containers |
| **AI Engine** | OpenClaw Gateway |
| **LLM Provider** | Z.ai (`zai/glm-4.7`) |
| **Channel** | Telegram Bot |
| **Storage** | R2 (media/files) · KV (config cache) |
| **URL** | https://waffle-bot.ytu-iremkurt.workers.dev |

## 📐 Architecture

```
┌──────────────────────────────────────────────────────────┐
│                     Cloudflare Edge                      │
│                                                          │
│  ┌──────────────┐      ┌────────────────────────────┐    │
│  │    Worker    │─────▶│  Durable Object (Container)│    │
│  │ (src/index)  │      │     OpenClawContainer      │    │
│  │              │      │                            │    │
│  │  • /ping     │      │  ┌──────────────────────┐  │    │
│  │  • /health   │      │  │   OpenClaw Gateway   │  │    │
│  │  • /health/  │      │  │ (Node 22, port 8080) │  │    │
│  │    startup   │      │  │                      │  │    │
│  │  • /* proxy  │      │  │ ◄── Telegram Bot ──  │  │    │
│  └──────────────┘      │  └──────────────────────┘  │    │
│                        └────────────────────────────┘    │
│                                                          │
│  ┌────────────────┐    ┌──────────────┐                  │
│  │   R2 Bucket    │    │ KV Namespace │                  │
│  │ waffle-storage │    │  WAFFLE_KV   │                  │
│  └────────────────┘    └──────────────┘                  │
└──────────────────────────────────────────────────────────┘
```

**Flow**

1. **Telegram** → a webhook message arrives at the Cloudflare Worker.
2. **Worker** → proxies the request to the container inside the Durable Object (`container.fetch(request)`).
3. **Container** → the OpenClaw gateway receives the message, sends it to the Z.ai LLM, and returns the response to Telegram.

## 📁 Project Structure

```
team/waffle/
├── src/
│   └── index.ts          # Worker + OpenClawContainer (Durable Object)
├── Dockerfile            # Container image definition (Node 22-slim)
├── entrypoint.sh         # Container startup script
├── wrangler.jsonc        # Cloudflare Workers configuration
├── package.json          # Project dependencies
├── tsconfig.json         # TypeScript configuration
└── README.md             # This file
```

## 🔧 Components

### Worker — `src/index.ts`

The Worker serves two main functions:

1. **Health check endpoints** — can run independently of the container.
2. **Proxy** — forwards all other incoming requests to the container.
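The routing decision can be sketched as a standalone helper (hypothetical; the real logic lives in `src/index.ts`):

```typescript
// Hypothetical sketch of the Worker's routing decision, pulled out as a
// pure function so it can be read on its own. Health endpoints are answered
// by the Worker; everything else goes to the container.
type Route = "ping" | "health" | "startup" | "proxy";

function route(pathname: string): Route {
  if (pathname === "/ping") return "ping";               // Worker-only liveness
  if (pathname === "/health") return "health";           // query container state
  if (pathname === "/health/startup") return "startup";  // start container, wait for ports
  return "proxy";                                        // forwarded via container.fetch(request)
}
```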

### OpenClawContainer (Durable Object)

| Property | Value | Description |
| --- | --- | --- |
| `defaultPort` | `8080` | Port the container listens on |
| `sleepAfter` | `"30m"` | Enters sleep mode after 30 minutes of inactivity |
| `enableInternet` | `true` | Container can access the external network |
| `pingEndpoint` | `"/health"` | Liveness check endpoint |

In its constructor, the container forwards Worker secrets to the container runtime as environment variables:

- `TELEGRAM_BOT_TOKEN`
- `ZAI_API_KEY`
- `OPENAI_BASE_URL`
- `MOLTBOT_GATEWAY_TOKEN`

Hooks:

- `onError` — Logs container errors.
- `onStop` — Logs the exit code and reason when the container stops.
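Put together, the class might look roughly like this. Note this is a sketch, not the actual `src/index.ts`: `ContainerBase` below is a stand-in for the `Container` base class from `@cloudflare/containers` (included only so the snippet is self-contained), and the constructor signature of the real class differs.

```typescript
// Stand-in for the real `Container` base class from @cloudflare/containers.
class ContainerBase {
  defaultPort = 8080;
  sleepAfter = "10m";
  enableInternet = true;
  envVars: Record<string, string> = {};
}

interface Env {
  TELEGRAM_BOT_TOKEN: string;
  ZAI_API_KEY: string;
  OPENAI_BASE_URL: string;
  MOLTBOT_GATEWAY_TOKEN: string;
}

class OpenClawContainer extends ContainerBase {
  defaultPort = 8080;    // gateway listens here
  sleepAfter = "30m";    // sleep after 30 minutes of inactivity
  enableInternet = true; // needs egress to Telegram and Z.ai

  constructor(env: Env) {
    super();
    // Forward Worker secrets into the container runtime.
    this.envVars = {
      TELEGRAM_BOT_TOKEN: env.TELEGRAM_BOT_TOKEN,
      ZAI_API_KEY: env.ZAI_API_KEY,
      OPENAI_BASE_URL: env.OPENAI_BASE_URL,
      MOLTBOT_GATEWAY_TOKEN: env.MOLTBOT_GATEWAY_TOKEN,
    };
  }

  onError(err: unknown): void {
    console.error("container error:", err);
  }

  onStop(params: { exitCode: number; reason: string }): void {
    console.log(`container stopped: code=${params.exitCode} reason=${params.reason}`);
  }
}
```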

### Container — `Dockerfile` + `entrypoint.sh`

- Base image: `node:22-slim`
- Includes Chromium libraries (for OpenClaw's internal dependencies).
- `openclaw@latest` is installed from npm.
- `entrypoint.sh` generates the `openclaw.json` config file at runtime and starts the gateway.
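A minimal sketch of what such an entrypoint could look like. This is illustrative only: paths, fallback values, and the final start command are assumptions, not the actual `entrypoint.sh` (which should fail fast on missing secrets rather than fall back to dummies).

```shell
#!/bin/sh
# Illustrative entrypoint sketch: render openclaw.json from environment
# variables, then hand off to the gateway. The dummy defaults exist only so
# this sketch runs standalone; the real script validates its secrets instead.
set -eu

TELEGRAM_BOT_TOKEN="${TELEGRAM_BOT_TOKEN:-dummy-bot-token}"
MOLTBOT_GATEWAY_TOKEN="${MOLTBOT_GATEWAY_TOKEN:-dummy-gateway-token}"
CONFIG="${CONFIG:-/tmp/openclaw.json}"   # real config path may differ

cat > "$CONFIG" <<EOF
{
  "channels": {
    "telegram": {
      "enabled": true,
      "allowFrom": ["*"],
      "accounts": {
        "waffle": { "name": "Waffle-bot", "enabled": true, "botToken": "$TELEGRAM_BOT_TOKEN" }
      },
      "dmPolicy": "open"
    }
  },
  "agents": { "defaults": { "model": { "primary": "zai/glm-4.7" } } },
  "gateway": { "mode": "local", "port": 8080, "auth": { "token": "$MOLTBOT_GATEWAY_TOKEN" } }
}
EOF

echo "wrote $CONFIG"
# exec openclaw gateway   # hand-off; the exact start command is an assumption
```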

**Runtime config** (`openclaw.json`, generated by the entrypoint):

```json
{
  "channels": {
    "telegram": {
      "enabled": true,
      "allowFrom": ["*"],
      "accounts": {
        "waffle": { "name": "Waffle-bot", "enabled": true, "botToken": "..." }
      },
      "dmPolicy": "open"
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "zai/glm-4.7" }
    }
  },
  "gateway": {
    "mode": "local",
    "port": 8080,
    "auth": { "token": "..." }
  }
}
```

### Storage

| Binding | Type | Usage |
| --- | --- | --- |
| `WAFFLE_STORAGE` | R2 Bucket | Media and file storage |
| `WAFFLE_KV` | KV Namespace | Configuration cache |
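As an illustration, helpers like the following show how the two bindings would typically be used from the Worker. The helper names are hypothetical, and `KVLike`/`R2Like` are minimal stand-ins for the Workers `KVNamespace` and `R2Bucket` binding types so the sketch is self-contained.

```typescript
// Minimal stand-ins for the Workers KVNamespace and R2Bucket binding types.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}
interface R2Like {
  put(key: string, value: string | ArrayBuffer): Promise<unknown>;
}

// Cache a config object in WAFFLE_KV.
async function cacheConfig(kv: KVLike, config: object): Promise<void> {
  await kv.put("openclaw:config", JSON.stringify(config));
}

// Read it back, or null when nothing is cached yet.
async function readConfig(kv: KVLike): Promise<object | null> {
  const raw = await kv.get("openclaw:config");
  return raw === null ? null : JSON.parse(raw);
}

// Store a Telegram media file in WAFFLE_STORAGE.
async function storeMedia(r2: R2Like, fileId: string, bytes: ArrayBuffer): Promise<void> {
  await r2.put(`media/${fileId}`, bytes);
}
```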

## 🚀 Setup & Deployment

### Prerequisites

- Node.js ≥ 18
- `wrangler` CLI (`npm install -g wrangler`, or use the project devDependency)
- Cloudflare account (Workers Paid plan — required for Containers)

### 1. Install Dependencies

```sh
npm install
```

### 2. Set Secrets

```sh
npx wrangler secret put TELEGRAM_BOT_TOKEN
npx wrangler secret put ZAI_API_KEY
npx wrangler secret put OPENAI_BASE_URL
npx wrangler secret put MOLTBOT_GATEWAY_TOKEN
```

> ⚠️ Secret values are never committed to version control. Set each secret individually via the Cloudflare dashboard or the CLI.

### 3. Deploy

```sh
npm run deploy
# or
npx wrangler deploy
```

Wrangler will build the container image via the Dockerfile, upload it to Cloudflare, and deploy the Worker.

### 4. Verification

The container will be asleep after the first deployment. To wake it up and check its health:

```sh
# Fast ping (without starting the container)
curl -sS https://waffle-bot.ytu-iremkurt.workers.dev/ping
# → waffle is live

# Start the container and check status
curl -sS https://waffle-bot.ytu-iremkurt.workers.dev/health/startup
# → {"ok":true,"container":"waffle-main-v2","state":{"status":"healthy",...},...}

# If already running, just query status
curl -sS https://waffle-bot.ytu-iremkurt.workers.dev/health
# → {"ok":true,"container":"waffle-main-v2","state":{"status":"healthy",...},...}
```

## 🩺 Health Check Endpoints

| Endpoint | Method | Container Required? | Description |
| --- | --- | --- | --- |
| `/ping` | GET | No | Worker liveness check. Returns fixed `waffle is live`. |
| `/health` | GET | No | Queries container state (does not start it). |
| `/health/startup` | GET | Yes | Starts the container, waits until ports are ready, then returns state. 45 s timeout. |
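The 45 s startup timeout can be pictured as a race between the container's startup promise and a deadline, roughly like this (a hypothetical helper, not the actual implementation):

```typescript
// Hypothetical timeout wrapper: resolve with the startup result, or reject
// with the same error message the /health/startup endpoint reports.
async function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error("Timeout waiting for container ports")),
      ms,
    );
  });
  try {
    return await Promise.race([p, deadline]);
  } finally {
    clearTimeout(timer); // avoid a dangling timer once the race settles
  }
}
```

The real handler would pass something like the container's start-and-wait promise with `ms` set to 45 000.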

**Successful `/health/startup` response**

```json
{
  "ok": true,
  "container": "waffle-main-v2",
  "state": {
    "status": "healthy",
    "lastChange": 1771785038347
  },
  "timestamp": "2026-02-22T18:30:38.414Z"
}
```

**On error**

```json
{
  "ok": false,
  "container": "waffle-main-v2",
  "error": "Timeout waiting for container ports",
  "timestamp": "..."
}
```

## 🛠️ Local Development

```sh
npm run dev
# Starts wrangler dev — emulates the container locally
```

> Note: `wrangler dev` container support is limited; some features can only be tested after deployment.


## ⚙️ Configuration Reference

### `wrangler.jsonc`

| Field | Value | Description |
| --- | --- | --- |
| `name` | `"waffle"` | Worker name |
| `main` | `"src/index.ts"` | Entry point |
| `compatibility_date` | `"2026-02-22"` | Cloudflare compatibility date |
| `containers[0].class_name` | `"OpenClawContainer"` | DO/Container class name |
| `containers[0].image` | `"./Dockerfile"` | Container image source |
| `containers[0].instance_type` | `"standard-1"` | Instance type |
| `containers[0].max_instances` | `2` | Max concurrent container instances |
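Assembled from the table above, the relevant part of `wrangler.jsonc` plausibly looks like the sketch below. The container fields and binding names come from this README; the `durable_objects` block, the `OPENCLAW_CONTAINER` binding name, and the placeholder KV namespace id are assumptions.

```jsonc
{
  "name": "waffle",
  "main": "src/index.ts",
  "compatibility_date": "2026-02-22",
  "containers": [
    {
      "class_name": "OpenClawContainer",
      "image": "./Dockerfile",
      "instance_type": "standard-1",
      "max_instances": 2
    }
  ],
  // Assumed boilerplate: the container is backed by a Durable Object class,
  // and the binding name here is hypothetical.
  "durable_objects": {
    "bindings": [{ "name": "OPENCLAW_CONTAINER", "class_name": "OpenClawContainer" }]
  },
  "r2_buckets": [{ "binding": "WAFFLE_STORAGE", "bucket_name": "waffle-storage" }],
  "kv_namespaces": [{ "binding": "WAFFLE_KV", "id": "<namespace-id>" }]
}
```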

### Required Secrets

| Secret | Description |
| --- | --- |
| `TELEGRAM_BOT_TOKEN` | Bot token obtained from Telegram's BotFather |
| `ZAI_API_KEY` | Z.ai API access key |
| `OPENAI_BASE_URL` | Z.ai/OpenAI-compatible API base URL |
| `MOLTBOT_GATEWAY_TOKEN` | OpenClaw gateway auth token |

## 🐛 Common Issues

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| `1101: The script will never generate a response` | Unhandled exception in the Worker (missing try/catch) | Check logs with `wrangler tail` |
| `The container is not running, consider calling start()` | Container is asleep or has crashed | Wake it with `/health/startup` |
| `unauthorized: gateway token missing` | `MOLTBOT_GATEWAY_TOKEN` secret is missing or incorrect | Re-set the secret: `wrangler secret put MOLTBOT_GATEWAY_TOKEN` |
| `No API key found for provider "anthropic"` | Model/provider mismatch | Ensure the Z.ai provider is used (`zai/glm-4.7`) |
| Container constantly restarts | Missing environment variable | Verify the mandatory-secret checks in `entrypoint.sh` |

## 📊 Operational Notes

- **Sleep mode:** The container automatically goes to sleep after 30 minutes of inactivity. The first request afterwards may incur the cold-start delay (~45 s).
- **Instance limit:** At most 2 container instances run concurrently.
- **Log monitoring:** Use `npx wrangler tail` for live logs.
- **Telegram group behavior:** To respond to group messages, check the OpenClaw `groupPolicy` configuration and BotFather privacy-mode settings. This feature is not yet stable.
- **DM policy:** Open to all direct messages (`"dmPolicy": "open"`).

## 📝 Version History

| Date | Version / Deploy | Notes |
| --- | --- | --- |
| 2026-02-22 | `waffle-main-v2` | Gateway token, Z.ai provider integration, health endpoints, error-handling improvements. Stable DM mode. |

## 📄 License

Private project — internal use.
