# Feed Hunter Portal

Self-contained web dashboard for monitoring the X/Twitter feed intelligence pipeline.

## Quick Start

The portal is already running at **http://localhost:8888/**

## Views

- **Dashboard** (`/`) - Overview of active simulations, recent scrapes, signal counts
- **Feed** (`/feed`) - Latest scraped posts with triage status (color-coded)
- **Investigations** (`/investigations`) - Detailed investigation reports
- **Simulations** (`/simulations`) - Active paper positions, P&L, trade history
- **Status** (`/status`) - Pipeline health, last run times, Chrome status

## Management

**Manual Control:**

```bash
# Start portal
cd portal && python3 server.py

# Stop portal
pkill -f "python3 server.py"
```

**Systemd Service:**

```bash
# Check status
systemctl --user status feed-hunter-portal

# Start/stop/restart
systemctl --user start feed-hunter-portal
systemctl --user stop feed-hunter-portal
systemctl --user restart feed-hunter-portal

# View logs
journalctl --user -u feed-hunter-portal -f
```

## Features

- **Dark theme** optimized for monitoring
- **Mobile-friendly** responsive design
- **Real-time updates** every 30 seconds on the dashboard
- **Color-coded** triage status (High Value = green, Investigate = orange, etc.)
- **P&L tracking** with profit/loss indicators
- **No external dependencies** - uses the Python standard library only

## Data Sources

The portal reads from:

- `../data/x-feed/*/posts.json` - Scraped Twitter posts
- `../data/x-feed/*/triage.json` - Triage results
- `../data/investigations/*.json` - Investigation reports
- `../data/simulations/active.json` - Active paper positions
- `../data/simulations/history.json` - Closed trades
- `../config.json` - Pipeline configuration

## Development

Single-file architecture in `server.py` with embedded CSS/JS. Easy to modify and extend.
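For the `systemctl --user` commands under Management to work, a user unit file must exist. A plausible sketch is below; the `WorkingDirectory` and interpreter path are assumptions, so adjust them to the real checkout location.

```ini
# Hypothetical ~/.config/systemd/user/feed-hunter-portal.service
# (paths are assumptions, not taken from the repository)
[Unit]
Description=Feed Hunter Portal

[Service]
WorkingDirectory=%h/feed-hunter/portal
ExecStart=/usr/bin/python3 server.py
Restart=on-failure

[Install]
WantedBy=default.target
```

After editing the unit, run `systemctl --user daemon-reload` before the start/enable commands shown above.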
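The single-file, stdlib-only pattern described under Development can be sketched roughly like this: one `BaseHTTPRequestHandler` routing the view paths listed above to render functions. This is a minimal illustration, not the actual `server.py`; the `render_*` bodies and the `ROUTES` table are hypothetical placeholders.

```python
# Minimal sketch of a stdlib-only single-file portal server.
# Routes mirror the Views section; render functions are placeholders,
# the real server.py embeds full HTML/CSS/JS pages.
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_dashboard():
    return "<h1>Dashboard</h1>"

ROUTES = {
    "/": render_dashboard,
    "/feed": lambda: "<h1>Feed</h1>",
    "/status": lambda: "<h1>Status</h1>",
}

class PortalHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        view = ROUTES.get(self.path)
        if view is None:
            self.send_error(404)
            return
        body = view().encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Same port as the Quick Start URL.
    HTTPServer(("0.0.0.0", 8888), PortalHandler).serve_forever()
```

Keeping routing in a plain dict makes adding a view a two-line change: write a render function, register its path.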
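The Data Sources section lists per-run files under glob patterns; a portal view could aggregate them along these lines. A sketch only: the relative `base` path matches the patterns above, but the shape of each `posts.json` (a JSON list of post objects) is an assumption.

```python
# Sketch: collect posts from every scrape-run directory matching
# ../data/x-feed/*/posts.json. Assumes each file holds a JSON list.
import glob
import json
import os

def load_posts(base="../data/x-feed"):
    """Return all scraped posts across runs, oldest directory first."""
    posts = []
    for path in sorted(glob.glob(os.path.join(base, "*", "posts.json"))):
        with open(path) as f:
            posts.extend(json.load(f))
    return posts
```

The same pattern applies to `triage.json` and the investigation reports; only the glob changes.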