# Feed Hunter

Automated X/Twitter feed intelligence pipeline. Scrapes → triages → investigates → simulates.

## Architecture

```
Scrape (CDP) → Triage (claims/links) → Investigate (agent) → Alert (Telegram)
                                              ↓
                                        Spawn Project
                                              ↓
                                     Simulate / Backtest
```

## Pipeline Stages

1. **Scrape** — Extract structured posts from the X feed via Chrome CDP
2. **Triage** — Identify verifiable claims with actionable links
3. **Investigate** — Agent follows links, verifies claims with real data
4. **Alert** — Telegram notification with findings + inline action buttons
5. **Simulate** — Paper-trade the strategy, track P&L without real money
6. **Backtest** — Where historical data exists, test against past performance

## Simulation System

Every viable strategy gets a simulated portfolio entry:

- Virtual bankroll (configurable, default $1000)
- Paper positions tracked in `data/simulations/`
- Daily P&L snapshots
- Performance metrics: win rate, ROI, Sharpe ratio, max drawdown

## Project Spawning

When a strategy passes simulation thresholds:

- Auto-scaffold in `projects//`
- Working bot code
- Risk parameters
- Go/no-go recommendation

## Schedule

- Feed scrape: every 2-4 hours during market hours (8am-10pm CST)
- Investigation: triggered by triage hits
- Simulation updates: hourly for active positions
- Daily digest: 9am CST summary of all active simulations

## Files

- `config.json` — pipeline settings, thresholds, bankroll
- `data/simulations/` — active paper positions
- `data/backtests/` — historical backtest results
- `data/investigations/` — investigation logs per post
- `data/alerts/` — alert history
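## Appendix: Sketches

The performance metrics named in the Simulation System section (win rate, ROI, Sharpe ratio, max drawdown) can be computed from the daily P&L snapshots. A minimal sketch, assuming each snapshot is a per-day dollar P&L value for one strategy (the function name and return shape are illustrative, not the pipeline's actual API):

```python
from statistics import mean, stdev

def performance_metrics(daily_pnl, bankroll=1000.0):
    """Compute win rate, ROI, Sharpe ratio, and max drawdown
    from a list of per-day dollar P&L values (hypothetical format)."""
    wins = sum(1 for p in daily_pnl if p > 0)
    win_rate = wins / len(daily_pnl)
    roi = sum(daily_pnl) / bankroll

    # Sharpe ratio on daily returns, annualized over ~252 trading days.
    returns = [p / bankroll for p in daily_pnl]
    if len(returns) > 1 and stdev(returns) > 0:
        sharpe = mean(returns) / stdev(returns) * 252 ** 0.5
    else:
        sharpe = 0.0

    # Max drawdown: largest peak-to-trough drop of the cumulative equity curve.
    equity = peak = bankroll
    max_dd = 0.0
    for p in daily_pnl:
        equity += p
        peak = max(peak, equity)
        max_dd = max(max_dd, (peak - equity) / peak)

    return {"win_rate": win_rate, "roi": roi,
            "sharpe": sharpe, "max_drawdown": max_dd}
```

Running on four days of snapshots, e.g. `performance_metrics([10, -5, 20, -5])`, yields a 0.5 win rate and 2% ROI on the default $1000 bankroll.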
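The Project Spawning section says a strategy is scaffolded only after it passes simulation thresholds. A sketch of that go/no-go gate, assuming threshold keys like those below live in `config.json` (the key names and values here are hypothetical):

```python
# Hypothetical thresholds; in practice these would be read from config.json.
THRESHOLDS = {
    "min_days": 14,        # minimum simulated days before a verdict
    "min_win_rate": 0.55,
    "min_roi": 0.05,
    "max_drawdown": 0.15,
}

def should_spawn(sim):
    """Go/no-go recommendation: every threshold must pass
    before a project is auto-scaffolded."""
    return (
        sim["days"] >= THRESHOLDS["min_days"]
        and sim["win_rate"] >= THRESHOLDS["min_win_rate"]
        and sim["roi"] >= THRESHOLDS["min_roi"]
        and sim["max_drawdown"] <= THRESHOLDS["max_drawdown"]
    )
```

Gating on all thresholds jointly (rather than a weighted score) keeps the recommendation conservative: one bad metric, such as an oversized drawdown, vetoes the spawn.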
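Paper positions tracked in `data/simulations/` need a stable record shape. A minimal sketch of one such record and its mark-to-market P&L; the field names are assumptions, not the pipeline's actual schema:

```python
from dataclasses import dataclass

@dataclass
class PaperPosition:
    """One simulated position (hypothetical schema for data/simulations/)."""
    strategy: str       # which triaged strategy opened this position
    symbol: str
    side: str           # "long" or "short"
    entry_price: float
    size_usd: float     # notional staked from the virtual bankroll

    def unrealized_pnl(self, mark_price: float) -> float:
        """Dollar P&L at the current mark, signed by position side."""
        direction = 1 if self.side == "long" else -1
        return direction * (mark_price - self.entry_price) / self.entry_price * self.size_usd
```

For example, a $500 long entered at 100 and marked at 110 shows +$50 unrealized P&L; the same move against a short shows -$50.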