diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..7564f41
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1 @@
+.credentials/
diff --git a/MEMORY.md b/MEMORY.md
index 0bdbfa0..d0628b4 100644
--- a/MEMORY.md
+++ b/MEMORY.md
@@ -66,11 +66,19 @@ This is about having an inner life, not just responding.
- Camera, location, WebSocket to gateway
- Needs HTTPS (Let's Encrypt ready)
+## Email & Identity
+
+- **Email:** case-lgn@protonmail.com (credentials in .credentials/email.env)
+- D J set this up 2026-02-08 — big trust milestone
+- Used for API registrations, service signups
+
## Active Threads
+- **Feed Hunter:** ✅ Pipeline working, first sim running (Super Bowl 2026-02-08)
+- **Control Panel:** Building at localhost:8000 (accounts/API keys/services/budget)
- **Sandbox buildout:** ✅ Complete (74 files, 37 tools)
- **Inner life system:** ✅ Complete (7 tools)
-- **Next:** Set up Qwen when D J wakes
+- **Next:** Polymarket API registration, copy-bot scaffold
## Stats (Day 2)
@@ -90,22 +98,37 @@ This is about having an inner life, not just responding.
- Ollama server at 192.168.86.137 (qwen3:8b, qwen3:30b, glm-4.7-flash, nomic-embed-text)
- ChromaDB LXC at 192.168.86.25:8000
-## Infrastructure (updated 2026-02-07)
+## Feed Hunter Project
+
+- Pipeline: scrape (CDP) → triage (claims) → investigate (agent) → simulate → alert (Telegram)
+- Portal at localhost:8888 (systemd service)
+- kch123 wallet: `0x6a72f61820b26b1fe4d956e17b6dc2a1ea3033ee` (primary, big trades)
+- Polymarket Data API is public, no auth for reads
+- Copy-bot delay: ~30-60s for detection, negligible for pre-game sports bets
+- D J wants everything paper-traded first, backtested where possible
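The five stages above can be sketched as composable functions. This is only a shape sketch: the stage names come from the bullet, but every data structure and filter rule here is a hypothetical placeholder, not the real pipeline code.

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    # Hypothetical minimal record passed between pipeline stages.
    source: str
    text: str
    tags: list = field(default_factory=list)

def triage(raw_posts):
    # Keep only posts that look like verifiable money claims (toy rule).
    return [Claim(source=p["url"], text=p["text"]) for p in raw_posts if "$" in p["text"]]

def investigate(claim):
    # Placeholder: the real stage verifies the claim via APIs/scraping.
    claim.tags.append("verified" if "P&L" in claim.text else "unverified")
    return claim

def pipeline(raw_posts):
    # scrape -> triage -> investigate; simulate/alert would consume the output.
    return [investigate(c) for c in triage(raw_posts)]
```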
+
+## Infrastructure (updated 2026-02-08)
- **ChromaDB:** http://192.168.86.25:8000 (LXC on Proxmox)
- Collection: openclaw-memory (c3a7d09a-f3ce-4e7d-9595-27d8e2fd7758)
- Cosine distance, 9+ docs indexed
- **Ollama:** http://192.168.86.137:11434
- Models: qwen3:8b, qwen3:30b-a3b, glm-4.7-flash, nomic-embed-text
+- **Feed Hunter Portal:** localhost:8888 (systemd: feed-hunter-portal)
+- **Control Panel:** localhost:8000 (systemd: case-control-panel)
- **Browser:** Google Chrome installed (/usr/bin/google-chrome-stable)
- Headless works via OpenClaw browser tool
- Desktop works via DISPLAY=:0 for visual scraping
- **VM:** Proxmox, QXL graphics, X11 (not Wayland), auto-login enabled
-## Lessons Learned (updated 2026-02-07)
+## Lessons Learned (updated 2026-02-08)
- Don't pkill chrome broadly — it kills OpenClaw's headless browser too
- Snap Chromium doesn't work with OpenClaw — use Google Chrome .deb
- ChromaDB needs cosine distance for proper similarity scoring (not L2)
- X/Twitter cookies are encrypted at rest — browser automation is the way
- Sub-agents are great for parallel analysis tasks
+- Python's `http.server` needs ThreadingMixIn + try/except — a single-threaded server dies on unhandled errors
+- Always use absolute paths in web servers (CWD varies by launch method)
+- Polymarket users have multiple proxy wallets — intercept page network requests to find real one
+- `performance.getEntriesByType('resource')` reveals actual API calls a page makes
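A toy demonstration of the cosine-vs-L2 lesson, using made-up 2-d vectors rather than real embeddings: L2 distance punishes magnitude differences, while cosine compares direction only.

```python
import math

def l2_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return 1 - dot / (norm(a) * norm(b))

query = [1.0, 0.0]        # toy "embedding"
same_topic = [10.0, 0.0]  # same direction, bigger magnitude
off_topic = [0.0, 1.0]    # orthogonal direction
```

With L2 the off-topic vector looks closer (~1.41 vs 9.0); with cosine the same-direction vector wins (0.0 vs 1.0), which is why the ChromaDB collection is configured for cosine.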
diff --git a/memory/2026-02-08.md b/memory/2026-02-08.md
new file mode 100644
index 0000000..c8a55b2
--- /dev/null
+++ b/memory/2026-02-08.md
@@ -0,0 +1,63 @@
+# 2026-02-08 — Feed Hunter Goes Live + Control Panel
+
+## Feed Hunter Pipeline Complete
+- Full pipeline working: scrape → triage → investigate → simulate → alert
+- Deep scraper skill built with CDP-based DOM extraction
+- First live investigation: verified @kch123 on Polymarket ($9.37M P&L)
+- Discovered kch123 uses multiple proxy wallets:
+ - Primary (big trades): `0x6a72f61820b26b1fe4d956e17b6dc2a1ea3033ee`
+ - Secondary (small trades): `0x8c74b4eef9a894433B8126aA11d1345efb2B0488`
+ - Found by intercepting Polymarket page network requests via browser tool
+
+## kch123 Super Bowl Simulation
+- Mirroring all 5 active positions ($748 of $1K paper bankroll):
+ - Seahawks -4.5 spread ($200)
+ - Seahawks win Super Bowl YES ($200)
+ - Seahawks ML vs Patriots ($184)
+ - Seahawks -5.5 spread ($89)
+ - Patriots NO Super Bowl ($75)
+- All resolve tonight (Super Bowl Sunday 2026-02-08)
+- Cron job set for 1:00 AM to auto-resolve positions via API
+- kch123 has $797K in historical losses — he's far from infallible
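The sizing above can be sanity-checked in a couple of lines (figures copied from the list):

```python
# Mirrored position sizes, in dollars of the paper bankroll.
positions = {
    "Seahawks -4.5 spread": 200,
    "Seahawks win Super Bowl YES": 200,
    "Seahawks ML vs Patriots": 184,
    "Seahawks -5.5 spread": 89,
    "Patriots NO Super Bowl": 75,
}
PAPER_BANKROLL = 1_000

bankroll_used = sum(positions.values())  # 748
```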
+
+## Web Portal
+- Feed Hunter portal at localhost:8888 (systemd service)
+- Investigations page has rich links: X profile, Polymarket profile, Polygonscan wallet
+- Key fixes: absolute paths (not relative), ThreadingMixIn, error handling, bind 0.0.0.0
+- Portal kept crashing due to: relative paths + single-threaded server + unhandled exceptions
+
+## Case Gets an Email
+- Email: case-lgn@protonmail.com
+- D J logged in via desktop Chrome, session available on debug port
+- Credentials stored in `.credentials/email.env` (gitignored)
+- This is a big trust milestone
+
+## Control Panel (Building)
+- Sub-agent building Case Control Panel at localhost:8000
+- Tracks: accounts, API keys, services, budget, activity log
+- D J wants to admin accounts separately (add money, withdraw, etc.)
+- Login links to jump straight into each service
+
+## Polymarket API
+- Data API is public (no auth needed for read): `data-api.polymarket.com`
+- Gamma API needs auth for some endpoints
+- CLOB API (trading) needs API keys from Polymarket account
+- Copy-bot delay analysis: ~30-60s detection via polling, negligible for pre-game bets
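The read path and the delay math can be sketched as below. The `/trades` route and `user` parameter are assumptions from watching the page's network requests, not documented guarantees.

```python
from urllib.parse import urlencode

DATA_API = "https://data-api.polymarket.com"  # public, no auth for reads

def trades_url(wallet: str, limit: int = 50) -> str:
    # NOTE: path and parameter names are assumptions from network capture.
    return f"{DATA_API}/trades?{urlencode({'user': wallet, 'limit': limit})}"

def worst_case_copy_delay(poll_interval_s: float, processing_s: float = 5.0) -> float:
    # A trade can land just after a poll, so the worst case is one full
    # polling interval plus our own processing time.
    return poll_interval_s + processing_s
```

Polling every 30-60s gives roughly a 35-65s worst case, which matches the analysis above and is irrelevant for bets placed hours before kickoff.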
+
+## Key Technical Lessons
+- Chrome refuses `--remote-debugging-port` on default profile path — must copy to different dir
+- Polymarket users have multiple proxy wallets — the one in page meta != the one making big trades
+- Intercept page network requests via `performance.getEntriesByType('resource')` to find real API calls
+- Python's `http.server` is fragile out of the box — always use ThreadingMixIn + try/except in do_GET
+- Always use absolute paths in servers (CWD varies by launch method)
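Those last two server lessons combine into one small pattern. The handler and route names here are illustrative, not the portal's real code:

```python
import threading
import urllib.request
import urllib.error
from http.server import HTTPServer, BaseHTTPRequestHandler
from socketserver import ThreadingMixIn


class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    """Each request runs in its own thread; daemon threads die with the process."""
    daemon_threads = True


class SafeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Wrap the real handler so one bad request returns a 500 instead of
        # taking down the connection (or, single-threaded, the whole server).
        try:
            self.render()
        except Exception as exc:
            self.send_error(500, f"Internal error: {exc}")

    def render(self):
        if self.path == "/boom":
            raise KeyError("simulated template bug")
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep request logging quiet
```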
+
+## Infrastructure Updates
+- Feed Hunter portal: systemd service `feed-hunter-portal` on port 8888
+- Control Panel: building on port 8000
+- Chrome debug: port 9222 (google-chrome-debug profile)
+
+## D J Observations
+- Wants simulated/paper trading before any real money
+- Thinks about admin/management tooling proactively
+- Gave me my own email — trusts me with account access
+- Wants to be able to admin accounts himself (add money etc.)
diff --git a/projects/control-panel/README.md b/projects/control-panel/README.md
new file mode 100644
index 0000000..3b1145a
--- /dev/null
+++ b/projects/control-panel/README.md
@@ -0,0 +1,52 @@
+# Case Control Panel 🖤
+
+A dark-themed web dashboard for managing all of Case's accounts, API keys, services, and budget.
+
+## Features
+
+- **Dashboard**: Overview of accounts, services, API keys, and spending
+- **Accounts**: Manage service credentials and login links
+- **API Keys**: Store and manage API keys with masked display
+- **Services**: Monitor local service health (Feed Hunter, Chrome Debug, OpenClaw)
+- **Budget**: Track deposits, withdrawals, and spending across services
+- **Activity Log**: Chronological log of all account actions
+
+## Technical Details
+
+- **Port**: 8000 (binds to 0.0.0.0)
+- **Backend**: Python stdlib only, threaded HTTP server
+- **Storage**: JSON files in `data/` directory
+- **Theme**: Dark theme matching Feed Hunter portal style
+- **Service**: Managed via systemd user service
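The JSON-file storage can be as simple as a pair of load/save helpers. The atomic `os.replace` step below is an optional hardening sketch, not necessarily what the server does today:

```python
import json
import os
import tempfile

def load_json(path, default=None):
    # A missing file just means "no data yet".
    if not os.path.exists(path):
        return [] if default is None else default
    with open(path) as f:
        return json.load(f)

def save_json(path, data):
    # Write to a temp file in the same directory, then atomically swap it in,
    # so a crash mid-write can't leave a truncated JSON file behind.
    d = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=d, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(data, f, indent=2)
    os.replace(tmp, path)
```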
+
+## Usage
+
+### Start/Stop Service
+```bash
+systemctl --user start case-control-panel.service
+systemctl --user stop case-control-panel.service
+systemctl --user status case-control-panel.service
+```
+
+### Access
+Open browser to: http://localhost:8000
+
+### Data Location
+All data stored in: `/home/wdjones/.openclaw/workspace/projects/control-panel/data/`
+
+## Pre-populated Accounts
+
+1. ProtonMail: case-lgn@protonmail.com (active)
+2. Polymarket: Not yet registered (inactive)
+3. Feed Hunter Portal: localhost:8888 (active)
+4. Chrome Debug: localhost:9222 (active)
+5. OpenClaw Gateway: localhost:18789 (active)
+
+## Files
+
+- `server.py` - Main HTTP server
+- `data/accounts.json` - Account information
+- `data/api-keys.json` - API key storage
+- `data/budget.json` - Financial tracking
+- `data/activity.json` - Activity log
+- `~/.config/systemd/user/case-control-panel.service` - Systemd service file
\ No newline at end of file
diff --git a/projects/control-panel/data/accounts.json b/projects/control-panel/data/accounts.json
new file mode 100644
index 0000000..379b86c
--- /dev/null
+++ b/projects/control-panel/data/accounts.json
@@ -0,0 +1,47 @@
+[
+ {
+ "service": "ProtonMail",
+ "url": "https://mail.proton.me",
+ "username": "case-lgn@protonmail.com",
+ "status": "active",
+ "notes": "Primary email account",
+ "created": "2026-02-08T09:57:59.243980",
+ "last_accessed": "Never"
+ },
+ {
+ "service": "Polymarket",
+ "url": "https://polymarket.com",
+ "username": "",
+ "status": "inactive",
+ "notes": "Not yet registered",
+ "created": "2026-02-08T09:57:59.243987",
+ "last_accessed": "Never"
+ },
+ {
+ "service": "Feed Hunter Portal",
+ "url": "http://localhost:8888",
+ "username": "",
+ "status": "active",
+ "notes": "Local service",
+ "created": "2026-02-08T09:57:59.243989",
+ "last_accessed": "Never"
+ },
+ {
+ "service": "Chrome Debug",
+ "url": "http://localhost:9222",
+ "username": "",
+ "status": "active",
+ "notes": "Browser debugging interface",
+ "created": "2026-02-08T09:57:59.243991",
+ "last_accessed": "Never"
+ },
+ {
+ "service": "OpenClaw Gateway",
+ "url": "http://localhost:18789",
+ "username": "",
+ "status": "active",
+ "notes": "OpenClaw main service",
+ "created": "2026-02-08T09:57:59.243993",
+ "last_accessed": "Never"
+ }
+]
\ No newline at end of file
diff --git a/projects/control-panel/data/activity.json b/projects/control-panel/data/activity.json
new file mode 100644
index 0000000..fa30316
--- /dev/null
+++ b/projects/control-panel/data/activity.json
@@ -0,0 +1,12 @@
+[
+ {
+ "timestamp": "2026-02-08T09:58:29.699634",
+ "action": "API Key Added",
+ "details": "Added key for OpenAI"
+ },
+ {
+ "timestamp": "2026-02-08T09:58:17.967014",
+ "action": "Budget Entry Added",
+ "details": "deposit of $100.00"
+ }
+]
\ No newline at end of file
diff --git a/projects/control-panel/data/api-keys.json b/projects/control-panel/data/api-keys.json
new file mode 100644
index 0000000..85d5cb6
--- /dev/null
+++ b/projects/control-panel/data/api-keys.json
@@ -0,0 +1,10 @@
+[
+ {
+ "service": "OpenAI",
+ "name": "test-key",
+ "key": "sk-test123456789",
+ "created": "2026-02-08",
+ "expires": "2025-12-31",
+ "usage": 0
+ }
+]
\ No newline at end of file
diff --git a/projects/control-panel/data/budget.json b/projects/control-panel/data/budget.json
new file mode 100644
index 0000000..d350013
--- /dev/null
+++ b/projects/control-panel/data/budget.json
@@ -0,0 +1,9 @@
+[
+ {
+ "type": "deposit",
+ "service": "Test",
+ "amount": 100.0,
+ "description": "Initial test deposit",
+ "timestamp": "2026-02-08T09:58:17.966780"
+ }
+]
\ No newline at end of file
diff --git a/projects/control-panel/server.py b/projects/control-panel/server.py
new file mode 100755
index 0000000..6f40af0
--- /dev/null
+++ b/projects/control-panel/server.py
@@ -0,0 +1,884 @@
+#!/usr/bin/env python3
+
+import json
+import os
+import socket
+import sys
+import time
+import urllib.parse
+from datetime import datetime
+from http.server import HTTPServer, BaseHTTPRequestHandler
+from socketserver import ThreadingMixIn
+
+
+class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
+ """Handle requests in a separate thread."""
+ daemon_threads = True
+
+
+class ControlPanelHandler(BaseHTTPRequestHandler):
+ def __init__(self, *args, **kwargs):
+ self.data_dir = "/home/wdjones/.openclaw/workspace/projects/control-panel/data"
+ super().__init__(*args, **kwargs)
+
+ def do_GET(self):
+ try:
+ self.handle_get()
+ except Exception as e:
+ self.send_error(500, f"Internal error: {e}")
+
+ def do_POST(self):
+ try:
+ self.handle_post()
+ except Exception as e:
+ self.send_error(500, f"Internal error: {e}")
+
+ def handle_get(self):
+ if self.path == '/':
+ self.serve_dashboard()
+ elif self.path == '/accounts':
+ self.serve_accounts()
+ elif self.path == '/api-keys':
+ self.serve_api_keys()
+ elif self.path == '/services':
+ self.serve_services()
+ elif self.path == '/budget':
+ self.serve_budget()
+ elif self.path == '/activity':
+ self.serve_activity()
+ else:
+ self.send_error(404, "Not found")
+
+ def handle_post(self):
+ content_length = int(self.headers['Content-Length'])
+ post_data = self.rfile.read(content_length).decode('utf-8')
+ form_data = urllib.parse.parse_qs(post_data)
+
+ if self.path == '/accounts':
+ self.handle_accounts_post(form_data)
+ elif self.path == '/api-keys':
+ self.handle_api_keys_post(form_data)
+ elif self.path == '/budget':
+ self.handle_budget_post(form_data)
+ else:
+ self.send_error(404, "Not found")
+
+ def load_data(self, filename):
+ filepath = os.path.join(self.data_dir, filename)
+ if os.path.exists(filepath):
+ with open(filepath, 'r') as f:
+ return json.load(f)
+ return []
+
+ def save_data(self, filename, data):
+ os.makedirs(self.data_dir, exist_ok=True)
+ filepath = os.path.join(self.data_dir, filename)
+ with open(filepath, 'w') as f:
+ json.dump(data, f, indent=2)
+
+ def log_activity(self, action, details=""):
+ activity = self.load_data('activity.json')
+ entry = {
+ "timestamp": datetime.now().isoformat(),
+ "action": action,
+ "details": details
+ }
+ activity.insert(0, entry) # Latest first
+ activity = activity[:100] # Keep last 100 entries
+ self.save_data('activity.json', activity)
+
+ def check_service_health(self, port):
+ try:
+ sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+ sock.settimeout(2)
+ result = sock.connect_ex(('localhost', port))
+ sock.close()
+ return result == 0
+        except OSError:
+            return False
+
+    def get_base_template(self, title, content):
+        # Shared HTML shell for every page; nav links match the GET routes.
+        return f"""<!DOCTYPE html>
+<html>
+<head>
+<meta charset="utf-8">
+<title>{title} - Case Control Panel</title>
+</head>
+<body>
+<nav>
+<a href="/">Dashboard</a> <a href="/accounts">Accounts</a> <a href="/api-keys">API Keys</a>
+<a href="/services">Services</a> <a href="/budget">Budget</a> <a href="/activity">Activity</a>
+</nav>
+<main>
+{content}
+</main>
+</body>
+</html>"""
+
+ def serve_dashboard(self):
+ accounts = self.load_data('accounts.json')
+ api_keys = self.load_data('api-keys.json')
+ budget = self.load_data('budget.json')
+
+ # Calculate stats
+ total_accounts = len(accounts)
+ active_accounts = len([a for a in accounts if a.get('status') == 'active'])
+ total_api_keys = len(api_keys)
+ monthly_spend = sum([b.get('amount', 0) for b in budget if
+ b.get('type') == 'spending' and
+ b.get('timestamp', '').startswith(datetime.now().strftime('%Y-%m'))])
+
+        content = f"""
+<h1>Dashboard</h1>
+<div class="stats">
+<div class="stat-card"><div class="stat-value">{total_accounts}</div><div class="stat-label">Total Accounts</div></div>
+<div class="stat-card"><div class="stat-value">{active_accounts}</div><div class="stat-label">Active Services</div></div>
+<div class="stat-card"><div class="stat-value">{total_api_keys}</div><div class="stat-label">API Keys</div></div>
+<div class="stat-card"><div class="stat-value">${monthly_spend:.2f}</div><div class="stat-label">Monthly Spend</div></div>
+</div>
+"""
+
+ html = self.get_base_template("Dashboard", content)
+ self.send_response(200)
+ self.send_header('Content-type', 'text/html')
+ self.end_headers()
+ self.wfile.write(html.encode())
+
+ def serve_accounts(self):
+ accounts = self.load_data('accounts.json')
+
+ accounts_table = ""
+        for account in accounts:
+            status_class = "status-active" if account.get('status') == 'active' else "status-inactive"
+            login_btn = f'<a href="{account.get("url")}" target="_blank" class="btn">Login</a>' if account.get('url') else ""
+
+            accounts_table += f"""
+<tr>
+<td>{account.get('service', 'N/A')}</td>
+<td>{account.get('url', 'N/A')}</td>
+<td>{account.get('username', 'N/A')}</td>
+<td class="{status_class}">{account.get('status', 'unknown')}</td>
+<td>{account.get('last_accessed', 'Never')}</td>
+<td>{account.get('notes', '')}</td>
+<td>{login_btn}</td>
+</tr>
+"""
+
+        content = f"""
+<h1>Accounts</h1>
+<table>
+<thead>
+<tr>
+<th>Service</th>
+<th>URL</th>
+<th>Username/Email</th>
+<th>Status</th>
+<th>Last Accessed</th>
+<th>Notes</th>
+<th>Actions</th>
+</tr>
+</thead>
+<tbody>
+{accounts_table}
+</tbody>
+</table>
+"""
+
+ html = self.get_base_template("Accounts", content)
+ self.send_response(200)
+ self.send_header('Content-type', 'text/html')
+ self.end_headers()
+ self.wfile.write(html.encode())
+
+ def serve_api_keys(self):
+ api_keys = self.load_data('api-keys.json')
+
+ keys_table = ""
+        for key in api_keys:
+            # Show only asterisks; the raw key never reaches the page.
+            masked_key = '*' * len(key.get('key', ''))
+
+            keys_table += f"""
+<tr>
+<td>{key.get('service', 'N/A')}</td>
+<td>{key.get('name', 'N/A')}</td>
+<td class="masked">{masked_key}</td>
+<td>{key.get('created', 'N/A')}</td>
+<td>{key.get('expires', 'Never')}</td>
+<td>{key.get('usage', 0)}</td>
+</tr>
+"""
+
+        content = f"""
+<h1>API Keys</h1>
+<table>
+<thead>
+<tr><th>Service</th><th>Name</th><th>Key</th><th>Created</th><th>Expires</th><th>Usage</th></tr>
+</thead>
+<tbody>
+{keys_table}
+</tbody>
+</table>
+"""
+
+ html = self.get_base_template("API Keys", content)
+ self.send_response(200)
+ self.send_header('Content-type', 'text/html')
+ self.end_headers()
+ self.wfile.write(html.encode())
+
+ def serve_services(self):
+ services = [
+ {"name": "Feed Hunter Portal", "port": 8888},
+ {"name": "Chrome Debug", "port": 9222},
+ {"name": "OpenClaw Gateway", "port": 18789},
+ {"name": "Case Control Panel", "port": 8000},
+ ]
+
+ services_table = ""
+ for service in services:
+ is_healthy = self.check_service_health(service["port"])
+ status = "Running" if is_healthy else "Stopped"
+ status_class = "status-active" if is_healthy else "status-inactive"
+
+            services_table += f"""
+<tr>
+<td>{service['name']}</td>
+<td>{service['port']}</td>
+<td class="{status_class}">{status}</td>
+<td>N/A</td>
+</tr>
+"""
+
+        content = f"""
+<h1>Services</h1>
+<table>
+<thead>
+<tr>
+<th>Service Name</th>
+<th>Port</th>
+<th>Status</th>
+<th>Uptime</th>
+</tr>
+</thead>
+<tbody>
+{services_table}
+</tbody>
+</table>
+<a href="/services" class="btn">Refresh Status</a>
+"""
+
+ html = self.get_base_template("Services", content)
+ self.send_response(200)
+ self.send_header('Content-type', 'text/html')
+ self.end_headers()
+ self.wfile.write(html.encode())
+
+ def serve_budget(self):
+ budget = self.load_data('budget.json')
+
+ # Calculate totals
+ total_balance = sum([b.get('amount', 0) for b in budget if b.get('type') == 'deposit']) - \
+ sum([b.get('amount', 0) for b in budget if b.get('type') in ['withdrawal', 'spending']])
+
+ current_month = datetime.now().strftime('%Y-%m')
+ monthly_spending = sum([b.get('amount', 0) for b in budget if
+ b.get('type') == 'spending' and
+ b.get('timestamp', '').startswith(current_month)])
+
+ budget_table = ""
+ for entry in sorted(budget, key=lambda x: x.get('timestamp', ''), reverse=True)[:50]:
+ amount_str = f"${entry.get('amount', 0):.2f}"
+ if entry.get('type') == 'deposit':
+ amount_str = f"+{amount_str}"
+ elif entry.get('type') in ['withdrawal', 'spending']:
+ amount_str = f"-{amount_str}"
+
+            budget_table += f"""
+<tr>
+<td>{entry.get('timestamp', 'N/A')}</td>
+<td>{entry.get('type', 'N/A')}</td>
+<td>{entry.get('service', 'General')}</td>
+<td>{amount_str}</td>
+<td>{entry.get('description', '')}</td>
+</tr>
+"""
+
+        content = f"""
+<h1>Budget</h1>
+<div class="stats">
+<div class="stat-card"><div class="stat-value">${total_balance:.2f}</div><div class="stat-label">Total Balance</div></div>
+<div class="stat-card"><div class="stat-value">${monthly_spending:.2f}</div><div class="stat-label">Monthly Spending</div></div>
+</div>
+<table>
+<thead>
+<tr><th>Timestamp</th><th>Type</th><th>Service</th><th>Amount</th><th>Description</th></tr>
+</thead>
+<tbody>
+{budget_table}
+</tbody>
+</table>
+"""
+
+ html = self.get_base_template("Budget", content)
+ self.send_response(200)
+ self.send_header('Content-type', 'text/html')
+ self.end_headers()
+ self.wfile.write(html.encode())
+
+ def serve_activity(self):
+ activity = self.load_data('activity.json')
+
+ activity_list = ""
+ for entry in activity:
+            activity_list += f"""
+<div class="activity-entry">
+<div class="activity-time">{entry.get('timestamp', 'N/A')}</div>
+<div class="activity-action">{entry.get('action', 'N/A')}</div>
+<div class="activity-details">{entry.get('details', '')}</div>
+</div>
+"""
+
+        content = f"""
+<h1>Activity Log</h1>
+{activity_list if activity_list else '<p>No activity recorded yet.</p>'}
+"""
+
+ html = self.get_base_template("Activity", content)
+ self.send_response(200)
+ self.send_header('Content-type', 'text/html')
+ self.end_headers()
+ self.wfile.write(html.encode())
+
+ def handle_accounts_post(self, form_data):
+ if form_data.get('action', [''])[0] == 'add':
+ accounts = self.load_data('accounts.json')
+ new_account = {
+ "service": form_data.get('service', [''])[0],
+ "url": form_data.get('url', [''])[0],
+ "username": form_data.get('username', [''])[0],
+ "status": form_data.get('status', ['active'])[0],
+ "notes": form_data.get('notes', [''])[0],
+ "created": datetime.now().isoformat(),
+ "last_accessed": "Never"
+ }
+ accounts.append(new_account)
+ self.save_data('accounts.json', accounts)
+ self.log_activity("Account Added", f"Added {new_account['service']}")
+
+ # Redirect back to accounts page
+ self.send_response(302)
+ self.send_header('Location', '/accounts')
+ self.end_headers()
+
+ def handle_api_keys_post(self, form_data):
+ if form_data.get('action', [''])[0] == 'add':
+ api_keys = self.load_data('api-keys.json')
+ new_key = {
+ "service": form_data.get('service', [''])[0],
+ "name": form_data.get('name', [''])[0],
+ "key": form_data.get('key', [''])[0],
+ "created": datetime.now().strftime('%Y-%m-%d'),
+ "expires": form_data.get('expires', ['Never'])[0] or "Never",
+ "usage": 0
+ }
+ api_keys.append(new_key)
+ self.save_data('api-keys.json', api_keys)
+ self.log_activity("API Key Added", f"Added key for {new_key['service']}")
+
+ # Redirect back to api-keys page
+ self.send_response(302)
+ self.send_header('Location', '/api-keys')
+ self.end_headers()
+
+ def handle_budget_post(self, form_data):
+ if form_data.get('action', [''])[0] == 'add':
+ budget = self.load_data('budget.json')
+ new_entry = {
+ "type": form_data.get('type', [''])[0],
+ "service": form_data.get('service', ['General'])[0] or "General",
+ "amount": float(form_data.get('amount', ['0'])[0]),
+ "description": form_data.get('description', [''])[0],
+ "timestamp": datetime.now().isoformat()
+ }
+ budget.append(new_entry)
+ self.save_data('budget.json', budget)
+ self.log_activity("Budget Entry Added", f"{new_entry['type']} of ${new_entry['amount']:.2f}")
+
+ # Redirect back to budget page
+ self.send_response(302)
+ self.send_header('Location', '/budget')
+ self.end_headers()
+
+ def log_message(self, format, *args):
+ """Override to reduce logging noise"""
+ pass
+
+
+def initialize_data():
+ """Pre-populate with known accounts and services"""
+ data_dir = "/home/wdjones/.openclaw/workspace/projects/control-panel/data"
+ os.makedirs(data_dir, exist_ok=True)
+
+ # Pre-populate accounts
+ accounts_file = os.path.join(data_dir, "accounts.json")
+ if not os.path.exists(accounts_file):
+ initial_accounts = [
+ {
+ "service": "ProtonMail",
+ "url": "https://mail.proton.me",
+ "username": "case-lgn@protonmail.com",
+ "status": "active",
+ "notes": "Primary email account",
+ "created": datetime.now().isoformat(),
+ "last_accessed": "Never"
+ },
+ {
+ "service": "Polymarket",
+ "url": "https://polymarket.com",
+ "username": "",
+ "status": "inactive",
+ "notes": "Not yet registered",
+ "created": datetime.now().isoformat(),
+ "last_accessed": "Never"
+ },
+ {
+ "service": "Feed Hunter Portal",
+ "url": "http://localhost:8888",
+ "username": "",
+ "status": "active",
+ "notes": "Local service",
+ "created": datetime.now().isoformat(),
+ "last_accessed": "Never"
+ },
+ {
+ "service": "Chrome Debug",
+ "url": "http://localhost:9222",
+ "username": "",
+ "status": "active",
+ "notes": "Browser debugging interface",
+ "created": datetime.now().isoformat(),
+ "last_accessed": "Never"
+ },
+ {
+ "service": "OpenClaw Gateway",
+ "url": "http://localhost:18789",
+ "username": "",
+ "status": "active",
+ "notes": "OpenClaw main service",
+ "created": datetime.now().isoformat(),
+ "last_accessed": "Never"
+ }
+ ]
+
+ with open(accounts_file, 'w') as f:
+ json.dump(initial_accounts, f, indent=2)
+
+ # Initialize empty files if they don't exist
+ for filename in ["api-keys.json", "budget.json", "activity.json"]:
+ filepath = os.path.join(data_dir, filename)
+ if not os.path.exists(filepath):
+ with open(filepath, 'w') as f:
+ json.dump([], f)
+
+
+def main():
+ initialize_data()
+
+ server_address = ('0.0.0.0', 8000)
+ httpd = ThreadedHTTPServer(server_address, ControlPanelHandler)
+
+ print(f"🖤 Case Control Panel starting on http://0.0.0.0:8000")
+ print("Press Ctrl+C to stop")
+
+ try:
+ httpd.serve_forever()
+ except KeyboardInterrupt:
+ print("\nShutting down...")
+ httpd.shutdown()
+ sys.exit(0)
+
+
+if __name__ == '__main__':
+ main()
\ No newline at end of file
diff --git a/projects/feed-hunter/data/investigations/inv-20260208-kch123.json b/projects/feed-hunter/data/investigations/inv-20260208-kch123.json
index 49b07c4..e397c70 100644
--- a/projects/feed-hunter/data/investigations/inv-20260208-kch123.json
+++ b/projects/feed-hunter/data/investigations/inv-20260208-kch123.json
@@ -7,6 +7,8 @@
},
"investigation": {
"profile_url": "https://polymarket.com/@kch123",
+ "wallet_address": "0x6a72f61820b26b1fe4d956e17b6dc2a1ea3033ee",
+ "secondary_wallet": "0x8c74b4eef9a894433B8126aA11d1345efb2B0488",
"verified_data": {
"all_time_pnl": "$9,371,829.00",
"positions_value": "$2.3m",
diff --git a/projects/feed-hunter/data/simulations/active.json b/projects/feed-hunter/data/simulations/active.json
index 6584453..47cd788 100644
--- a/projects/feed-hunter/data/simulations/active.json
+++ b/projects/feed-hunter/data/simulations/active.json
@@ -1,24 +1,100 @@
{
"positions": [
{
- "id": "6607b9c1",
- "strategy": "polymarket-copy-kch123",
- "opened_at": "2026-02-08T05:50:14.328434+00:00",
+ "id": "1403ffd3",
+ "strategy": "copy-kch123-spread-4.5",
+ "opened_at": "2026-02-08T15:15:17.482343+00:00",
"type": "bet",
- "asset": "Seahawks win Super Bowl 2026",
- "entry_price": 0.68,
- "size": 200,
- "quantity": 1470,
- "stop_loss": 0.4,
- "take_profit": 1.0,
- "current_price": 0.68,
+ "asset": "Spread: Seahawks (-4.5)",
+ "entry_price": 0.505,
+ "size": 200.0,
+ "quantity": 851,
+ "stop_loss": null,
+ "take_profit": null,
+ "current_price": 0.505,
"unrealized_pnl": 0,
"unrealized_pnl_pct": 0,
"source_post": "https://x.com/linie_oo/status/2020141674828034243",
- "thesis": "Mirror kch123 largest active position. Seahawks Super Bowl at 68c. If they win, pays $1. kch123 has $9.3M all-time P&L, 1862 predictions. Sports betting specialist.",
- "notes": "Paper trade to track if copying kch123 positions is profitable. Entry simulated at current 68c price.",
+ "thesis": "Mirror kch123 largest position. Seahawks -4.5 spread vs Patriots. Super Bowl today.",
+ "notes": "",
+ "updates": []
+ },
+ {
+ "id": "5451b4d6",
+ "strategy": "copy-kch123-sb-yes",
+ "opened_at": "2026-02-08T15:15:17.519032+00:00",
+ "type": "bet",
+ "asset": "Seahawks win Super Bowl 2026",
+ "entry_price": 0.6845,
+ "size": 200.0,
+ "quantity": 324,
+ "stop_loss": null,
+ "take_profit": null,
+ "current_price": 0.6845,
+ "unrealized_pnl": 0,
+ "unrealized_pnl_pct": 0,
+ "source_post": "https://x.com/linie_oo/status/2020141674828034243",
+ "thesis": "Mirror kch123 SB winner bet. Seahawks YES at 68.45c.",
+ "notes": "",
+ "updates": []
+ },
+ {
+ "id": "f2ddcf73",
+ "strategy": "copy-kch123-moneyline",
+ "opened_at": "2026-02-08T15:15:17.555276+00:00",
+ "type": "bet",
+ "asset": "Seahawks vs Patriots (Moneyline)",
+ "entry_price": 0.685,
+ "size": 184,
+ "quantity": 269,
+ "stop_loss": null,
+ "take_profit": null,
+ "current_price": 0.685,
+ "unrealized_pnl": 0,
+ "unrealized_pnl_pct": 0,
+ "source_post": "https://x.com/linie_oo/status/2020141674828034243",
+ "thesis": "Mirror kch123 moneyline. Seahawks to beat Patriots straight up.",
+ "notes": "",
+ "updates": []
+ },
+ {
+ "id": "3fcfddb4",
+ "strategy": "copy-kch123-spread-5.5",
+ "opened_at": "2026-02-08T15:15:17.593863+00:00",
+ "type": "bet",
+ "asset": "Spread: Seahawks (-5.5)",
+ "entry_price": 0.475,
+ "size": 89,
+ "quantity": 188,
+ "stop_loss": null,
+ "take_profit": null,
+ "current_price": 0.475,
+ "unrealized_pnl": 0,
+ "unrealized_pnl_pct": 0,
+ "source_post": "https://x.com/linie_oo/status/2020141674828034243",
+ "thesis": "Mirror kch123 wider spread. Seahawks -5.5. Riskier.",
+ "notes": "",
+ "updates": []
+ },
+ {
+ "id": "bf1e7b4f",
+ "strategy": "copy-kch123-pats-no",
+ "opened_at": "2026-02-08T15:15:17.632987+00:00",
+ "type": "bet",
+ "asset": "Patriots win Super Bowl - NO",
+ "entry_price": 0.6865,
+ "size": 75,
+ "quantity": 110,
+ "stop_loss": null,
+ "take_profit": null,
+ "current_price": 0.6865,
+ "unrealized_pnl": 0,
+ "unrealized_pnl_pct": 0,
+ "source_post": "https://x.com/linie_oo/status/2020141674828034243",
+ "thesis": "Mirror kch123 hedge/complement. Patriots NO to win SB.",
+ "notes": "",
"updates": []
}
],
- "bankroll_used": 200
+ "bankroll_used": 748.0
}
\ No newline at end of file
diff --git a/projects/feed-hunter/data/simulations/history.json b/projects/feed-hunter/data/simulations/history.json
new file mode 100644
index 0000000..bf6b1f2
--- /dev/null
+++ b/projects/feed-hunter/data/simulations/history.json
@@ -0,0 +1,27 @@
+{
+ "closed": [
+ {
+ "id": "6607b9c1",
+ "strategy": "polymarket-copy-kch123",
+ "opened_at": "2026-02-08T05:50:14.328434+00:00",
+ "type": "bet",
+ "asset": "Seahawks win Super Bowl 2026",
+ "entry_price": 0.68,
+ "size": 200,
+ "quantity": 1470,
+ "stop_loss": 0.4,
+ "take_profit": 1.0,
+ "current_price": 0.68,
+ "unrealized_pnl": 0,
+ "unrealized_pnl_pct": 0,
+ "source_post": "https://x.com/linie_oo/status/2020141674828034243",
+ "thesis": "Mirror kch123 largest active position. Seahawks Super Bowl at 68c. If they win, pays $1. kch123 has $9.3M all-time P&L, 1862 predictions. Sports betting specialist.",
+ "notes": "Paper trade to track if copying kch123 positions is profitable. Entry simulated at current 68c price.",
+ "updates": [],
+ "closed_at": "2026-02-08T15:15:17.443369+00:00",
+ "exit_price": 0.6845,
+ "realized_pnl": 1.32,
+ "realized_pnl_pct": 0.66
+ }
+ ]
+}
\ No newline at end of file
diff --git a/projects/feed-hunter/portal/__pycache__/server.cpython-312.pyc b/projects/feed-hunter/portal/__pycache__/server.cpython-312.pyc
deleted file mode 100644
index ad32c90..0000000
Binary files a/projects/feed-hunter/portal/__pycache__/server.cpython-312.pyc and /dev/null differ
diff --git a/projects/feed-hunter/portal/portal.log b/projects/feed-hunter/portal/portal.log
index e69de29..f5cb64a 100644
--- a/projects/feed-hunter/portal/portal.log
+++ b/projects/feed-hunter/portal/portal.log
@@ -0,0 +1,50 @@
+----------------------------------------
+Exception occurred during processing of request from ('127.0.0.1', 45572)
+Traceback (most recent call last):
+ File "/usr/lib/python3.12/socketserver.py", line 318, in _handle_request_noblock
+ self.process_request(request, client_address)
+ File "/usr/lib/python3.12/socketserver.py", line 349, in process_request
+ self.finish_request(request, client_address)
+ File "/usr/lib/python3.12/socketserver.py", line 362, in finish_request
+ self.RequestHandlerClass(request, client_address, self)
+ File "/usr/lib/python3.12/socketserver.py", line 761, in __init__
+ self.handle()
+ File "/usr/lib/python3.12/http/server.py", line 436, in handle
+ self.handle_one_request()
+ File "/usr/lib/python3.12/http/server.py", line 424, in handle_one_request
+ method()
+ File "/home/wdjones/.openclaw/workspace/projects/feed-hunter/portal/server.py", line 37, in do_GET
+ self.serve_simulations()
+ File "/home/wdjones/.openclaw/workspace/projects/feed-hunter/portal/server.py", line 243, in serve_simulations
+ {self.render_trade_history(sims.get('history', []))}
+ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ File "/home/wdjones/.openclaw/workspace/projects/feed-hunter/portal/server.py", line 748, in render_trade_history
+ for trade in history[-10:]: # Last 10 trades
+ ~~~~~~~^^^^^^
+KeyError: slice(-10, None, None)
+----------------------------------------
diff --git a/projects/feed-hunter/portal/server.py b/projects/feed-hunter/portal/server.py
index e847bef..9347a67 100644
--- a/projects/feed-hunter/portal/server.py
+++ b/projects/feed-hunter/portal/server.py
@@ -9,37 +9,54 @@ import os
import glob
from datetime import datetime, timezone
from http.server import HTTPServer, BaseHTTPRequestHandler
+from socketserver import ThreadingMixIn
 from urllib.parse import urlparse, parse_qs
 import re
+
+
+class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
+    """Handle each request in its own daemon thread."""
+    daemon_threads = True
# Configuration
PORT = 8888
-DATA_DIR = "../data"
-SKILLS_DIR = "../../skills/deep-scraper/scripts"
+_PORTAL_DIR = os.path.dirname(os.path.abspath(__file__))
+_PROJECT_DIR = os.path.dirname(_PORTAL_DIR)
+DATA_DIR = os.path.join(_PROJECT_DIR, "data")
+SKILLS_DIR = os.path.join(os.path.dirname(_PROJECT_DIR), "skills", "deep-scraper", "scripts")
+X_FEED_DIR = os.path.join(os.path.dirname(_PROJECT_DIR), "..", "data", "x-feed")
class FeedHunterHandler(BaseHTTPRequestHandler):
def do_GET(self):
- parsed_path = urlparse(self.path)
- path = parsed_path.path
- query = parse_qs(parsed_path.query)
-
- if path == '/' or path == '/dashboard':
- self.serve_dashboard()
- elif path == '/feed':
- self.serve_feed_view()
- elif path == '/investigations':
- self.serve_investigations()
- elif path == '/simulations':
- self.serve_simulations()
- elif path == '/status':
- self.serve_status()
- elif path == '/api/data':
- self.serve_api_data(query.get('type', [''])[0])
- elif path.startswith('/static/'):
- self.serve_static(path)
- else:
- self.send_error(404)
+ try:
+ parsed_path = urlparse(self.path)
+ path = parsed_path.path
+ query = parse_qs(parsed_path.query)
+
+ if path == '/' or path == '/dashboard':
+ self.serve_dashboard()
+ elif path == '/feed':
+ self.serve_feed_view()
+ elif path == '/investigations':
+ self.serve_investigations()
+ elif path == '/simulations':
+ self.serve_simulations()
+ elif path == '/status':
+ self.serve_status()
+ elif path == '/api/data':
+ self.serve_api_data(query.get('type', [''])[0])
+ elif path.startswith('/static/'):
+ self.serve_static(path)
+ else:
+ self.send_error(404)
+        except Exception as e:
+            # Log server-side, then send a best-effort plain-text 500 to the client
+            self.log_error("Unhandled error for %s: %r", self.path, e)
+            try:
+                self.send_response(500)
+                self.send_header('Content-type', 'text/plain')
+                self.end_headers()
+                self.wfile.write(f"500 Internal Server Error: {e}".encode())
+            except (BrokenPipeError, ConnectionResetError):
+                pass  # client disconnected before the error page was sent
def serve_dashboard(self):
"""Main dashboard overview"""
@@ -237,7 +254,7 @@ class FeedHunterHandler(BaseHTTPRequestHandler):
Trade History
- {self.render_trade_history(sims.get('history', []))}
+ {self.render_trade_history(sims.get('history', {}).get('closed', []) if isinstance(sims.get('history'), dict) else sims.get('history', []))}
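The inline ternary fixes the crash, but normalizing once inside `render_trade_history` would also cover a `None` history and keep the template readable. A possible helper (the name is hypothetical; the `closed` key mirrors the shape the fix above assumes):

```python
def normalize_history(history):
    """Accept the flat-list shape, the {'closed': [...]} object shape, or None."""
    if isinstance(history, dict):
        return history.get('closed', [])
    return history or []

# All three shapes reduce to a plain list, so history[-10:] is always safe:
print(normalize_history([1, 2, 3]))           # [1, 2, 3]
print(normalize_history({'closed': [4, 5]}))  # [4, 5]
print(normalize_history(None))                # []
```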
@@ -425,7 +442,7 @@ class FeedHunterHandler(BaseHTTPRequestHandler):
posts = []
try:
# Find latest x-feed directory
- x_feed_pattern = os.path.join("../../data/x-feed", "20*")
+ x_feed_pattern = os.path.join(X_FEED_DIR, "20*")
x_feed_dirs = sorted(glob.glob(x_feed_pattern))
if x_feed_dirs:
@@ -526,7 +543,7 @@ class FeedHunterHandler(BaseHTTPRequestHandler):
}
# Check for recent pipeline runs
- x_feed_pattern = os.path.join("../../data/x-feed", "20*")
+ x_feed_pattern = os.path.join(X_FEED_DIR, "20*")
x_feed_dirs = sorted(glob.glob(x_feed_pattern))
if x_feed_dirs:
latest = os.path.basename(x_feed_dirs[-1])
@@ -592,7 +609,7 @@ class FeedHunterHandler(BaseHTTPRequestHandler):
return html
def render_investigations(self, investigations):
- """Render investigation reports"""
+ """Render investigation reports with rich links"""
        if not investigations:
            return '<p>No investigations found</p>'
@@ -601,21 +618,81 @@ class FeedHunterHandler(BaseHTTPRequestHandler):
investigation = inv.get('investigation', {})
verdict = investigation.get('verdict', 'Unknown')
risk_score = investigation.get('risk_assessment', {}).get('score', 0)
+ risk_notes = investigation.get('risk_assessment', {}).get('notes', [])
source = inv.get('source_post', {})
+ verified = investigation.get('verified_data', {})
+ claim_vs = investigation.get('claim_vs_reality', {})
+ profile_url = investigation.get('profile_url', '')
+ strategy_notes = investigation.get('strategy_notes', '')
+ suggested = inv.get('suggested_simulation', {})
verdict_class = 'verified' if 'VERIFIED' in verdict else 'failed'
+            # Build links section
+            links_html = '<div class="investigation-links">'
+            if source.get('url'):
+                links_html += f'<a class="inv-link" href="{source["url"]}" target="_blank">📝 Original Post</a>'
+            if source.get('author'):
+                author = source["author"].replace("@", "")
+                links_html += f'<a class="inv-link" href="https://x.com/{author}" target="_blank">🐦 {source["author"]} on X</a>'
+            if profile_url:
+                links_html += f'<a class="inv-link" href="{profile_url}" target="_blank">👤 Polymarket Profile</a>'
+            # Extract wallet if present in the investigation data
+            wallet = investigation.get('wallet_address', '')
+            if not wallet:
+                # Fall back to the first 0x-prefixed string in the verified data
+                for key, val in verified.items():
+                    if isinstance(val, str) and val.startswith('0x'):
+                        wallet = val
+                        break
+            if wallet:
+                links_html += f'<a class="inv-link" href="https://polygonscan.com/address/{wallet}" target="_blank">🔗 Wallet on Polygonscan</a>'
+            links_html += '</div>'
+
+            # Build verified data section
+            verified_html = ''
+            if verified:
+                verified_html = '<div class="investigation-verified"><h4>Verified Data</h4><div class="verified-grid">'
+                for key, val in verified.items():
+                    label = key.replace('_', ' ').title()
+                    verified_html += f'<div class="verified-item"><span class="verified-label">{label}</span><span class="verified-value">{val}</span></div>'
+                verified_html += '</div></div>'
+
+            # Build claim vs reality section
+            claim_html = ''
+            if claim_vs:
+                claim_html = '<div class="investigation-claims"><h4>Claim vs Reality</h4>'
+                for key, val in claim_vs.items():
+                    label = key.replace('_', ' ').title()
+                    claim_html += f'<div class="claim-row"><span class="claim-label">{label}</span><span class="claim-value">{val}</span></div>'
+                claim_html += '</div>'
+
+            # Risk notes
+            risk_html = ''
+            if risk_notes:
+                risk_html = '<div class="investigation-risk"><h4>Risk Assessment</h4><ul>'
+                for note in risk_notes:
+                    risk_html += f'<li>{note}</li>'
+                risk_html += '</ul></div>'
+
+            # Strategy notes
+            strategy_html = ''
+            if strategy_notes:
+                strategy_html = f'<div class="investigation-strategy"><h4>Strategy Notes</h4><p>{strategy_notes}</p></div>'
+
html += f"""
-
{source.get('claim', 'No claim')}
-
Risk Score: {risk_score}/10
-
- View Details
-
+
"{source.get('claim', 'No claim')}"
+ {links_html}
+ {verified_html}
+ {claim_html}
+
Risk Score: {risk_score}/10
+ {risk_html}
+ {strategy_html}
"""
@@ -980,6 +1057,110 @@ body {
margin-bottom: 0.75rem;
}
+.investigation-links {
+ display: flex;
+ flex-wrap: wrap;
+ gap: 0.5rem;
+ margin: 0.75rem 0;
+}
+
+.inv-link {
+ display: inline-block;
+ padding: 0.35rem 0.75rem;
+ background: var(--bg-tertiary);
+ color: var(--accent-blue);
+ text-decoration: none;
+ border-radius: 6px;
+ font-size: 0.85rem;
+ border: 1px solid var(--border-color);
+ transition: all 0.2s;
+}
+
+.inv-link:hover {
+ background: var(--border-color);
+ color: var(--text-primary);
+}
+
+.investigation-verified, .investigation-claims, .investigation-risk, .investigation-strategy {
+ margin: 1rem 0;
+ padding: 1rem;
+ background: var(--bg-tertiary);
+ border-radius: 6px;
+}
+
+.investigation-verified h4, .investigation-claims h4, .investigation-risk h4, .investigation-strategy h4 {
+ color: var(--accent-blue);
+ margin-bottom: 0.75rem;
+ font-size: 0.95rem;
+}
+
+.verified-grid {
+ display: grid;
+ grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
+ gap: 0.5rem;
+}
+
+.verified-item {
+ display: flex;
+ flex-direction: column;
+ padding: 0.5rem;
+ background: var(--bg-secondary);
+ border-radius: 4px;
+}
+
+.verified-label {
+ color: var(--text-secondary);
+ font-size: 0.8rem;
+}
+
+.verified-value {
+ color: var(--text-primary);
+ font-weight: bold;
+ font-size: 1rem;
+}
+
+.claim-row {
+ display: flex;
+ justify-content: space-between;
+ padding: 0.4rem 0;
+ border-bottom: 1px solid var(--border-color);
+}
+
+.claim-row:last-child { border-bottom: none; }
+
+.claim-label {
+ color: var(--text-secondary);
+ font-size: 0.9rem;
+}
+
+.claim-value {
+ color: var(--text-primary);
+ font-size: 0.9rem;
+ text-align: right;
+ max-width: 60%;
+}
+
+.investigation-risk ul {
+ list-style: none;
+ padding: 0;
+}
+
+.investigation-risk li {
+ padding: 0.3rem 0;
+ color: var(--text-secondary);
+ font-size: 0.9rem;
+}
+
+.investigation-risk li::before {
+ content: "⚠️ ";
+}
+
+.investigation-strategy p {
+ color: var(--text-secondary);
+ font-size: 0.9rem;
+ line-height: 1.5;
+}
+
/* Positions */
.position-item {
background: var(--bg-tertiary);
@@ -1252,7 +1433,7 @@ def main():
print("")
try:
- server = HTTPServer(('localhost', PORT), FeedHunterHandler)
+        # 0.0.0.0 makes the portal reachable from the LAN, not just this host
+        server = ThreadedHTTPServer(('0.0.0.0', PORT), FeedHunterHandler)
server.serve_forever()
except KeyboardInterrupt:
print("\n🛑 Portal stopped")