How the stack runs locally
Practical startup mental model.
Redis is acting as the Celery broker here, not a generic cache. The scheduler pushes `poll_feed.delay(feed_id)` jobs into Redis, and the worker consumes them from there.
The crawler is a separate service. If you see worker activity without crawler activity, the job is stalling before the crawler handoff.
Useful local endpoints
Direct URLs and what they are for.
Telegram Bot
@feedbagelbot — alerts and monitoring.
The crawler sends periodic summaries and error alerts to Telegram via @feedbagelbot.
Notification module: web-crawler-node/libs/whatsappNotifications.js
Wired into server.js — starts on crawler boot, sends summaries every 4h.
TELEGRAM_BOT_TOKEN — from @BotFather
TELEGRAM_CHAT_ID — your Telegram user ID (518234907)
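A sketch of how those two variables reach Telegram. The helper name is hypothetical, but the URL shape is Telegram's documented Bot API endpoint (`https://api.telegram.org/bot<token>/sendMessage`):

```python
import os
import urllib.parse

def send_message_url(token: str, chat_id: str, text: str) -> str:
    # Builds the standard Bot API sendMessage URL; chat_id selects the
    # recipient (here, the user ID from TELEGRAM_CHAT_ID).
    query = urllib.parse.urlencode({"chat_id": chat_id, "text": text})
    return f"https://api.telegram.org/bot{token}/sendMessage?{query}"

# Both values come from the environment, as configured above:
token = os.environ.get("TELEGRAM_BOT_TOKEN", "")
chat_id = os.environ.get("TELEGRAM_CHAT_ID", "")
```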
Startup message when crawler boots (shows local vs production).
Every 4h: crawl success/fail counts, entries processed, errors, uptime.
Trigger manually: POST /notify/summary on the crawler.
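The 4-hourly summary and its manual trigger can be sketched as follows. The listed metrics and the `POST /notify/summary` route come from these notes; the field names, helper names, and the localhost port are assumptions:

```python
import urllib.request

def format_summary(success: int, failed: int, entries: int,
                   errors: int, uptime_h: float) -> str:
    # Mirrors the reported fields: crawl success/fail counts,
    # entries processed, errors, uptime.
    return (f"Crawls: {success} ok / {failed} failed, "
            f"{entries} entries, {errors} errors, uptime {uptime_h}h")

def summary_request(base_url: str) -> urllib.request.Request:
    # Manual trigger: POST /notify/summary on the crawler.
    return urllib.request.Request(f"{base_url}/notify/summary",
                                  method="POST")

# To fire the trigger against a locally running crawler (port assumed):
# urllib.request.urlopen(summary_request("http://localhost:3000"))
```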
Deployed on Hetzner via Coolify as "Web Crawler" at pongo.letter.so.
Also has a bot command handler (separate from the notification module) that responds to commands such as /credits and /help.
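An illustrative dispatcher for that command side. Only /credits and /help appear in these notes; the reply strings and the function itself are made up for the sketch:

```python
def handle_command(text: str) -> str:
    # Maps the leading token of an incoming message to a canned reply;
    # unknown commands fall through to a help hint.
    commands = {
        "/credits": "Crawl credits remaining: (looked up per account)",
        "/help": "Commands: /credits, /help",
    }
    return commands.get(text.split()[0], "Unknown command, try /help")
```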
Coolify API: ubuntu-4gb-nbg1-1.taildd7204.ts.net (via Tailscale).
Deployment
Where things run in production.
Coolify dashboard accessible via Tailscale at ubuntu-4gb-nbg1-1.taildd7204.ts.net or publicly at coolify.prototypr.io.