An AI-powered intelligence pipeline that monitors your niche, scores every signal with Claude, and delivers a daily brief.
Tell it your industry, your competitors, and where you want the brief. It finds the sources and synthesizes a daily editorial, delivered to Gmail, Slack, or Beehiiv.
Personal mode — an intelligence brief for your own decision-making. Tight, dense, action-oriented. Delivered to your Gmail inbox or Slack channel.
Audience mode — an editorial newsletter for subscribers. Same collection pipeline, different voice and structure. Delivered as a Beehiiv draft for review before publish.
Set `mode: personal` or `mode: audience` in your config.
```
RSS Feeds ────┐
Google News ──┤
Reddit ───────┼── Python Script ── Claude AI Scoring ── Supabase
Competitor    │      (cron)               │                 │
Blogs ────────┘                           ▼                 ▼
                                        Daily Brief Generator
                                         │        │        │
                                         ▼        ▼        ▼
                                    Gmail/SMTP  Slack  Beehiiv Draft
```
- Collects signals from RSS feeds, Google News, Reddit, and competitor blogs
- Scores every signal with Claude on relevance, urgency, and content potential
- Synthesizes the top signals into an editorial brief with content ideas
- Delivers via Gmail/SMTP, Slack webhook, or Beehiiv draft
- Stores everything in Supabase for history and deduplication
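The deduplication step above can be sketched as hashing a normalized form of each signal's URL and using the digest as a unique key in Supabase. A minimal sketch — the function name and the normalization rules (lowercase host, drop the fragment and trailing slash) are illustrative assumptions, not the project's actual code:

```python
import hashlib
from urllib.parse import urlsplit, urlunsplit

def signal_key(url: str) -> str:
    """Hypothetical dedup key: SHA-256 of a normalized URL.

    The normalization rules here are assumptions for illustration,
    not the project's actual logic.
    """
    parts = urlsplit(url.strip())
    normalized = urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path.rstrip("/") or "/",
        parts.query,
        "",  # drop the #fragment
    ))
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()
```

Storing a key like this in a UNIQUE column lets an upsert silently skip signals that have already been collected.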
If you use Claude Code, the setup is fully automated:
```bash
# 1. Clone the repo
git clone https://github.com/andrew-shwetzer/daily-intel.git
cd daily-intel

# 2. Install the package
pip install -e .

# 3. Copy the skill to Claude Code
cp -r skill/SKILL.md ~/.claude/skills/daily-intel/SKILL.md

# 4. Run the skill
# In Claude Code, type: /daily-intel
```

The skill asks 5 questions, researches your niche, provisions your database, and sets up cron. You'll see a live preview of your brief before anything is deployed.
```bash
git clone https://github.com/andrew-shwetzer/daily-intel.git
cd daily-intel
pip install -e .
```

Set the required environment variables:

```bash
export ANTHROPIC_API_KEY=sk-ant-...           # Required
export SUPABASE_URL=https://xxx.supabase.co   # Required
export SUPABASE_SERVICE_KEY=eyJ...            # Required (from Supabase dashboard)
export GMAIL_APP_PASSWORD=xxxx-xxxx-xxxx      # If using Gmail/SMTP delivery
export SLACK_WEBHOOK_URL=https://hooks...     # If using Slack delivery
export BEEHIIV_API_KEY=...                    # If using Beehiiv delivery
export BEEHIIV_PUBLICATION_ID=...             # If using Beehiiv delivery
```

Apply the schema to your Supabase project:

```bash
# Paste migrations/001_initial_schema.sql into the Supabase SQL Editor
```

```bash
mkdir -p ~/.daily-intel/instances/my-niche
```

Create `~/.daily-intel/instances/my-niche/config.yaml`:
```yaml
niche: "Your Industry"
company: "Your Company"
description: "What you do in this space"
mode: personal  # personal or audience

sources:
  - name: "Industry Blog"
    url: "https://industry-blog.com/feed/"
    source_type: "rss"
    category: "Industry News"
    frequency: "4h"
  - name: "Google News Stream"
    url: "https://news.google.com/rss/search?q=%22your+keyword%22&hl=en-US&gl=US&ceid=US:en"
    source_type: "news_rss"
    category: "Industry News"
    frequency: "4h"
  - name: "Reddit Community"
    url: "https://www.reddit.com/r/yoursub/search/.rss?q=keyword&sort=new&restrict_sr=on"
    source_type: "reddit_rss"
    category: "Community"
    frequency: "12h"

competitors:
  - name: "Competitor A"
    url: "https://competitor-a.com"
    blog_rss: "https://competitor-a.com/blog/feed/"
  - name: "Competitor B"
    url: "https://competitor-b.com"

delivery:
  method: "gmail"  # gmail, slack, beehiiv, or all
  gmail_address: "you@gmail.com"
  smtp_host: "smtp.gmail.com"  # Supports any SMTP server
  smtp_port: 587
  slack_webhook_url: "$SLACK_WEBHOOK_URL"  # $ prefix = read from env var
  beehiiv_api_key: "$BEEHIIV_API_KEY"
  beehiiv_publication_id: "$BEEHIIV_PUBLICATION_ID"

brief_time: "06:00"
timezone: "America/New_York"
collect_interval_hours: 4

scoring:
  min_relevance: 6
  scoring_model: "claude-haiku-4-5-20251001"
  brief_model: "claude-sonnet-4-6"

database:
  supabase_url: "$SUPABASE_URL"
  supabase_key: "$SUPABASE_SERVICE_KEY"
```

```bash
# Preview a brief without storing anything
daily-intel -i my-niche preview

# Run a full cycle (collect + brief)
daily-intel -i my-niche run

# Check health
daily-intel -i my-niche health
```

Schedule it with cron:

```bash
crontab -e
```

Add:

```bash
# Collect signals every 4 hours
0 */4 * * * cd /path/to/daily-intel && daily-intel -i my-niche collect >> ~/.daily-intel/logs/my-niche.log 2>&1

# Generate and deliver brief at 6 AM
0 6 * * * cd /path/to/daily-intel && daily-intel -i my-niche brief >> ~/.daily-intel/logs/my-niche.log 2>&1
```
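The `$` prefix convention in the config (a value like `"$SLACK_WEBHOOK_URL"` is read from the environment rather than stored in the file) can be implemented roughly like this — a sketch, not the project's actual config loader:

```python
import os

def resolve_secret(value: str, env=None) -> str:
    """If a config value starts with '$', look the rest up in the environment.

    Sketch of the '$ prefix = read from env var' convention; the real
    loader in daily-intel may differ.
    """
    env = os.environ if env is None else env
    if value.startswith("$"):
        name = value[1:]
        if name not in env:
            raise KeyError(f"config references ${name}, but it is not set")
        return env[name]
    return value
```

Keeping secrets out of `config.yaml` this way means the file is safe to commit, while the cron environment supplies the actual keys at runtime.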
| Command | What it does |
|---|---|
| `daily-intel -i <slug> collect` | Fetch RSS feeds, score with Claude, store in Supabase |
| `daily-intel -i <slug> brief` | Generate editorial brief, deliver via configured method |
| `daily-intel -i <slug> run` | Collect + brief (full daily cycle) |
| `daily-intel -i <slug> preview` | Generate a brief without storing anything |
| `daily-intel -i <slug> health` | Check system status |
| `daily-intel list-instances` | List all configured instances |
Each signal is scored on 3 dimensions (1-5 each):
- Urgency: 5 = breaking now, 1 = low priority
- Relevance: 5 = core to your niche, 1 = tangential
- Content Potential: 5 = multiple content pieces, 1 = archive only
Composite = Urgency × Relevance × Content Potential (max 125)
| Score | Priority | Action |
|---|---|---|
| 75+ | P1 | Act today |
| 30-74 | P2 | This week |
| 10-29 | P3 | Backlog |
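The formula and thresholds above translate directly to code. A minimal sketch — the function names are illustrative, and treating scores below 10 as unprioritized is an assumption:

```python
def composite(urgency: int, relevance: int, content_potential: int) -> int:
    """Multiply the three 1-5 dimension scores; the maximum is 5 * 5 * 5 = 125."""
    for score in (urgency, relevance, content_potential):
        if not 1 <= score <= 5:
            raise ValueError(f"dimension score out of range: {score}")
    return urgency * relevance * content_potential

def priority(score: int):
    """Map a composite score to a priority bucket per the table above."""
    if score >= 75:
        return "P1"  # act today
    if score >= 30:
        return "P2"  # this week
    if score >= 10:
        return "P3"  # backlog
    return None  # below 10: assumed dropped from the brief
```

Because the dimensions multiply rather than add, one low score drags the composite down sharply: a signal rated 5/5/1 scores only 25 (P3), while 5/5/3 scores 75 (P1).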
The daily brief uses a dark HUD aesthetic with:
- Editorial synthesis (AI-generated headline + analysis connecting signals)
- Priority-grouped signals with content angles
- Competitor activity monitoring
- Content ideas derived from signals
- Data points for future content
In audience mode, the voice and structure shift to editorial newsletter format, delivered as a Beehiiv draft.
Customize the template in `daily_intel/templates/brief.html`.
Gmail delivery requires an App Password (not your regular password):
- Go to myaccount.google.com/apppasswords
- Select "Mail" and your device
- Copy the 16-character password
- Set it as an environment variable:

```bash
export GMAIL_APP_PASSWORD=xxxx-xxxx-xxxx-xxxx
```

SMTP delivery supports any compatible mail server via `smtp_host` and `smtp_port`.
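Under the hood, Gmail delivery with an App Password is plain SMTP with STARTTLS. A minimal sketch using Python's standard library — the subject line and helper names are illustrative, not the project's actual sender:

```python
import os
import smtplib
from email.message import EmailMessage

def build_brief_message(html: str, to_addr: str, from_addr: str) -> EmailMessage:
    """Build a multipart message: plain-text fallback plus the HTML brief."""
    msg = EmailMessage()
    msg["Subject"] = "Daily Intelligence Brief"  # illustrative subject
    msg["From"] = from_addr
    msg["To"] = to_addr
    msg.set_content("Your mail client does not render HTML.")
    msg.add_alternative(html, subtype="html")
    return msg

def send_brief(html: str, to_addr: str, from_addr: str,
               host: str = "smtp.gmail.com", port: int = 587) -> None:
    """Send over SMTP with STARTTLS, authenticating with the App Password."""
    with smtplib.SMTP(host, port) as server:
        server.starttls()
        server.login(from_addr, os.environ["GMAIL_APP_PASSWORD"])
        server.send_message(build_brief_message(html, to_addr, from_addr))
```

The same code path works against any SMTP server that accepts STARTTLS on the configured port, which is why `smtp_host` and `smtp_port` are the only delivery settings that change.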
- Python 3.10+
- Anthropic API key
- Supabase project (free tier works)
- Gmail/SMTP account, Slack workspace, or Beehiiv publication for delivery
MIT