# Feed Hunter

Automated X/Twitter feed intelligence pipeline: scrapes → triages → investigates → simulates.

## Recent

- Built deep-scraper skill (CDP-based X feed extraction)
- Three-stage pipeline: scrape → triage → investigate
- Paper trading simulator with position tracking
- First live investigation: verified the kch123 Polymarket profile ($9.3M P&L)
- Opened first paper position: Seahawks Super Bowl @ 68¢
- Telegram alerts with inline action buttons
- Portal build in progress (night shift)
## Architecture

    Scrape (CDP) → Triage (claims/links) → Investigate (agent) → Alert (Telegram)
                                ↓
                          Spawn Project
                                ↓
                        Simulate / Backtest
## Pipeline Stages
- Scrape — Extract structured posts from X feed via Chrome CDP
- Triage — Identify verifiable claims with actionable links
- Investigate — Agent follows links, verifies claims with real data
- Alert — Telegram notification with findings + inline action buttons
- Simulate — Paper trade the strategy, track P&L without real money
- Backtest — Where historical data exists, test against past performance
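The scrape → triage hand-off above can be sketched as a simple filter: keep only posts that both make a verifiable claim and carry a link an agent can follow. Everything here (the `Post` shape, the `CLAIM_HINTS` heuristics) is illustrative, not Feed Hunter's actual code.

```python
import re
from dataclasses import dataclass, field

LINK_RE = re.compile(r"https?://\S+")
CLAIM_HINTS = ("P&L", "ROI", "win rate", "%", "$")  # assumed heuristic keywords

@dataclass
class Post:
    author: str
    text: str
    links: list = field(default_factory=list)

def triage(posts):
    """Keep posts that make a verifiable claim AND have an actionable link."""
    hits = []
    for p in posts:
        p.links = LINK_RE.findall(p.text)
        if p.links and any(h in p.text for h in CLAIM_HINTS):
            hits.append(p)
    return hits

posts = [
    Post("kch123", "Up $9.3M lifetime P&L https://polymarket.com/profile/kch123"),
    Post("noise", "gm everyone"),
]
print([p.author for p in triage(posts)])  # → ['kch123']
```

Posts that pass this gate are what trigger the Investigate stage; everything else is dropped before an agent is spawned.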
## Simulation System
Every viable strategy gets a simulated portfolio entry:
- Virtual bankroll (configurable, default $1000)
- Paper positions tracked in `data/simulations/`
- Daily P&L snapshots
- Performance metrics: win rate, ROI, Sharpe ratio, max drawdown
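The performance metrics listed above are standard; a minimal sketch over a list of per-trade returns (as fractions of bankroll) might look like this. The function name and return shape are assumptions, not Feed Hunter's real API.

```python
import statistics

def metrics(returns, bankroll=1000.0):
    """Compute win rate, simple ROI, per-period Sharpe, and max drawdown."""
    win_rate = sum(1 for r in returns if r > 0) / len(returns)
    roi = sum(returns)  # simple (non-compounding) ROI on the starting bankroll
    # Per-period Sharpe ratio: mean return over stdev, risk-free rate assumed 0
    sharpe = statistics.mean(returns) / statistics.stdev(returns)
    # Max drawdown on the compounding equity curve
    equity, peak, max_dd = bankroll, bankroll, 0.0
    for r in returns:
        equity *= 1 + r
        peak = max(peak, equity)
        max_dd = max(max_dd, (peak - equity) / peak)
    return {"win_rate": win_rate, "roi": roi, "sharpe": sharpe, "max_dd": max_dd}

print(metrics([0.05, -0.02, 0.03, -0.01]))
```

Running the metrics hourly against the positions in `data/simulations/` is what feeds the daily P&L snapshots.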
## Project Spawning
When a strategy passes simulation thresholds:
- Auto-scaffold in `projects/<strategy-name>/`
- Working bot code
- Risk parameters
- Risk parameters
- Go/no-go recommendation
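A sketch of the simulation-to-spawn gate: a strategy's metrics are checked against thresholds and, on a pass, a project directory is scaffolded. The threshold values and scaffold layout here are assumptions for illustration.

```python
import json
from pathlib import Path

THRESHOLDS = {"min_win_rate": 0.55, "min_roi": 0.10, "max_dd": 0.25}  # assumed values

def passes(m):
    """Go/no-go check against the simulation thresholds."""
    return (m["win_rate"] >= THRESHOLDS["min_win_rate"]
            and m["roi"] >= THRESHOLDS["min_roi"]
            and m["max_dd"] <= THRESHOLDS["max_dd"])

def scaffold(name, m, root="projects"):
    """Create projects/<strategy-name>/ with risk params and a bot stub."""
    d = Path(root) / name
    d.mkdir(parents=True, exist_ok=True)
    (d / "risk.json").write_text(json.dumps(m, indent=2))
    (d / "bot.py").write_text("# TODO: working bot code\n")
    return d

m = {"win_rate": 0.60, "roi": 0.18, "max_dd": 0.12}
if passes(m):
    scaffold("example-strategy", m, root="/tmp/projects")
```

Strategies that fail the gate stay in simulation; only passing ones get a scaffolded project and a go recommendation.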
## Schedule
- Feed scrape: Every 2-4 hours during market hours (8am-10pm CST)
- Investigation: Triggered by triage hits
- Simulation updates: Hourly for active positions
- Daily digest: 9am CST summary of all active simulations
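The cadence above could be wired up with cron, assuming a hypothetical `feed-hunter` CLI and a host clock set to CST; the 3-hour scrape interval is one choice from the 2-4 hour range:

```cron
# Feed scrape every 3 hours during market hours (8am-10pm CST)
0 8-22/3 * * * feed-hunter scrape
# Hourly simulation updates for active positions
0 * * * * feed-hunter simulate --update
# Daily digest at 9am CST
0 9 * * * feed-hunter digest
```

Investigations have no cron entry because they are event-driven, fired by triage hits rather than the clock.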
## Files

- `config.json` — pipeline settings, thresholds, bankroll
- `data/simulations/` — active paper positions
- `data/backtests/` — historical backtest results
- `data/investigations/` — investigation logs per post
- `data/alerts/` — alert history
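A `config.json` shape consistent with the settings mentioned above might look like the following; every key name here is an assumption, not the file's actual schema:

```json
{
  "bankroll": 1000,
  "scrape_interval_hours": [2, 4],
  "market_hours_cst": { "start": "08:00", "end": "22:00" },
  "digest_time_cst": "09:00",
  "thresholds": {
    "min_win_rate": 0.55,
    "min_roi": 0.10,
    "max_drawdown": 0.25
  },
  "telegram": { "chat_id": "<your-chat-id>" }
}
```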