Run Guide

How the stack runs locally

A practical mental model for starting the stack locally.

Browser -> Frontend (:5050) -> Backend (:5001) -> Postgres (:5432)
Scheduler -> Redis / Celery -> Poller worker -> Crawler (:3000)

Redis acts as the Celery broker here, not a generic cache. The scheduler pushes `poll_feed.delay(feed_id)` jobs into Redis, and the poller worker consumes them from there.

The crawler is a separate service. If you see worker activity without crawler activity, the job is stalling before the crawler handoff.
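The two paragraphs above can be sketched with a stdlib stand-in. This is an illustration of the flow, not the real code: a deque stands in for the Redis-backed Celery queue, and `crawl` is a hypothetical placeholder for the worker's HTTP handoff to the crawler on port 3000.

```python
import json
from collections import deque

broker = deque()  # stands in for the Redis list Celery uses as its queue

def delay(task_name, *args):
    """Scheduler side: enqueue a serialized job, as poll_feed.delay(feed_id) does."""
    broker.append(json.dumps({"task": task_name, "args": args}))

def crawl(feed_id):
    """Hypothetical stand-in for the worker -> crawler HTTP handoff (:3000)."""
    return f"crawled feed {feed_id}"

def worker_step():
    """Worker side: pop one job and hand it off to the crawler.
    If jobs are consumed here but crawl() is never reached, the stall is
    before the crawler handoff -- the symptom described above."""
    job = json.loads(broker.popleft())
    if job["task"] == "poll_feed":
        return crawl(*job["args"])

delay("poll_feed", 42)   # scheduler pushes
print(worker_step())     # worker consumes; prints: crawled feed 42
```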

Useful local endpoints

Direct URLs and what they are for.

Telegram Bot

@feedbagelbot — alerts and monitoring.

The crawler sends periodic summaries and error alerts to Telegram via @feedbagelbot.
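The notification module itself is Node, but the underlying call is Telegram's standard Bot API `sendMessage` method. A minimal Python sketch of that call (token and message text are placeholders; `518234907` is the chat ID listed below):

```python
import json
import urllib.request

API = "https://api.telegram.org/bot{token}/sendMessage"

def build_request(token: str, chat_id: int, text: str) -> urllib.request.Request:
    """Build the sendMessage POST for Telegram's Bot API."""
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode()
    return urllib.request.Request(
        API.format(token=token),
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send(token: str, chat_id: int, text: str) -> dict:
    """Fire the request; Telegram replies with JSON like {"ok": true, ...}."""
    with urllib.request.urlopen(build_request(token, chat_id, text)) as resp:
        return json.load(resp)
```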

Where it lives

Notification module: web-crawler-node/libs/whatsappNotifications.js

Wired into server.js — starts on crawler boot, sends summaries every 4h.

Env vars (crawler .env)

TELEGRAM_BOT_TOKEN — from @BotFather

TELEGRAM_CHAT_ID — your Telegram user ID (518234907)
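Together in the crawler's `.env` this looks like (token value is a placeholder; get the real one from @BotFather):

```
TELEGRAM_BOT_TOKEN=123456789:AAExample-Token-Placeholder
TELEGRAM_CHAT_ID=518234907
```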

What it sends

Startup message when crawler boots (shows local vs production).

Every 4h: crawl success/fail counts, entries processed, errors, uptime.

Trigger manually: POST /notify/summary on the crawler.
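Assuming the crawler's local port 3000 from the diagram above, the manual trigger can be wrapped in a small Python helper (the endpoint path is from the text; everything else here is illustrative):

```python
import urllib.request

def trigger_summary(base_url: str = "http://localhost:3000") -> int:
    """POST to the crawler's /notify/summary endpoint; returns the HTTP status."""
    req = urllib.request.Request(f"{base_url}/notify/summary", method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```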

Production (Coolify)

Deployed on Hetzner via Coolify as "Web Crawler" at pongo.letter.so.

Also has a bot command handler (separate from the notification module) that responds to commands like /credits and /help.

Coolify API: ubuntu-4gb-nbg1-1.taildd7204.ts.net (via Tailscale).

Deployment

Where things run in production.

Web Crawler:  Coolify / Hetzner, pongo.letter.so (repo: GraemeFulton/web-crawler-node)
Backend API:  Coolify / Hetzner, api.feedbagel.com
Frontend:     Coolify / Hetzner, feedbagel.com
Poller:       Coolify / Hetzner, no public URL (repo: Prototypr/feedbagel-poller)

Coolify dashboard accessible via Tailscale at ubuntu-4gb-nbg1-1.taildd7204.ts.net or publicly at coolify.prototypr.io.