Night shift: tweet analyzer, data connectors, feed monitor, market watch portal

This commit is contained in:
2026-02-12 00:16:41 -06:00
parent f623cba45c
commit 07f1448d57
20 changed files with 1825 additions and 388 deletions


@ -78,19 +78,36 @@ This is about having an inner life, not just responding.
- **Craigslist:** case-lgn@protonmail.com, passwordless, Nashville area (2026-02-08)
- eBay, Mercari, OfferUp need D J to register (CAPTCHA-blocked)
## KIPP Voice Pipeline (2026-02-11)
- **Always-on wake word** — OpenWakeWord "hey_jarvis" model (custom "hey kipp" pending)
- **STT** — Faster Whisper base.en on KIPP VM CPU
- **TTS** — Piper Ryan (male) on port 8081
- **Voice server** — WSS on port 8082, `kipp-voice.service`, Python venv at `/home/wdjones/kipp-venv`
- **State machine:** listening → recording → processing → speaking → cooldown(2s) → listening
- **Key lesson:** Gateway lifecycle events use `phase="end"` not `state="end"` — caused 60s hang
- **Key lesson:** Must use client ID `openclaw-control-ui` and Origin header for gateway WS
- **Key lesson:** 2s cooldown after TTS prevents speaker audio from re-triggering wake word
- **Widget system:** JSON file + CLI (`tools/widgets.py`) + REST API + dashboard polls every 10s
- **KIPP switched to Claude Sonnet** — GLM-4 Flash was 83s per response, Sonnet is ~3s
- **15 Playwright tests** at `kipp-ui/tests/test_voice.py`
- **All on feature/wake-word branch** in kipp/workspace repo
## Active Threads
- **Market Watch:** ✅ GARP paper trading sim live at marketwatch.local:8889
- Multiplayer game engine, "GARP Challenge" game running
- Case trading autonomously — 7 positions opened 2026-02-09
- Systemd timer Mon-Fri 9AM + 3:30PM CST
- **Feed Hunter:** ✅ Pipeline working, Super Bowl sim +72.8% on kch123 copy
- Expanding into crypto and stock analysis
- **KIPP:** ✅ Voice pipeline live (wake word + STT + TTS), widget system working, dashboard-first UI
- Widget system: shopping list, timers, reminders via CLI + REST API + dashboard polling
- Voice: "hey jarvis" wake word → Faster Whisper → Claude Sonnet → Piper Ryan TTS
- False trigger fix: 4s cooldown + silence flushing + RMS gate (threshold 30)
- Running on Claude Sonnet (primary), GLM-4 Flash (fallback)
- Next: Steam Deck frontend, custom "hey kipp" wake word, blue waveform animation
- **Market Watch:** ✅ GARP paper trading sim live
- GARP Challenge: $100,055.90 (+0.06%), 6 positions
- Leverage Challenge: $11,367.07 (+13.67%), 85 trades, 55.3% win rate
- **Feed Hunter:** ✅ Pipeline working, needs systemd service for periodic monitoring
- **Stock Screener:** yfinance-based, 902 tickers, GARP filters, free/no API key
- **Control Panel:** Building at localhost:8000
- **Sandbox buildout:** ✅ Complete (74 files, 37 tools)
- **Inner life system:** ✅ Complete (7 tools)
- **Next:** Crypto signal analysis (D J forwarding Telegram signals), expanded Feed Hunter
- **Next:** Tweet analysis tool, free data source integration (Arkham/DefiLlama/Coinglass)
## Stats (Day 2)


@ -76,6 +76,37 @@ Major KIPP infrastructure session with D J.
- Results persistence (timer created in chat → appears on dashboard)
- Proximity-aware layout (different for close vs far viewing)
### KIPP HTTPS + Voice Fixed
- Self-signed cert generated (10yr, SAN for 192.168.86.100)
- UI server switched to HTTPS on port 8080
- socat WSS proxy on port 18790 → gateway 18789 (systemd kipp-wss-proxy.service)
- Browser TTS fallback removed — Piper only
- Double-voice mystery: D J had 2 tabs open 😂
- Gateway config fixed: `allowedOrigins` was at root level (invalid), moved to `gateway.controlUi.allowedOrigins`
- Added `https://192.168.86.100:8080` to allowedOrigins
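The corrected config shape can be sketched as JSON; only the `gateway.controlUi.allowedOrigins` placement and the one origin are from the session notes, all sibling keys are omitted as unknown:

```json
{
  "gateway": {
    "controlUi": {
      "allowedOrigins": [
        "https://192.168.86.100:8080"
      ]
    }
  }
}
```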
### @milesdeutscher Tweet Analysis
- Polymarket copy-trading GitHub bot going viral (23K views)
- Our take: validates our kch123 approach but we're ahead — we have whale selection, they just have execution code
- Edge erodes with adoption; not actionable for us
### Night Shift — Ambient Mode
- Built ambient/idle mode for KIPP UI (sub-agent)
- Activates after 60s idle: large glowing clock, weather icon, dark gradient background
- Rotating content: tuxedo cat facts, quotes, trivia (every 30s)
- Tap anywhere to return to dashboard
- Verified with Playwright: both ambient and dashboard modes render correctly
### Late Night — KIPP Local-Only Switch
- **D J decided KIPP stays local-only** — no external exposure, direct IP access
- Switched UI WebSocket URL from `wss://kipp.host.letsgetnashty.com` to `ws://192.168.86.100:18789`
- UI renders visually at `http://192.168.86.100:8080/` (Playwright confirmed: green dot, weather, clock)
- **But WS still broken**: origin-not-allowed errors persist, and old domain URLs are not fully stripped from the JS fallback/retry code
- Hundreds of failed reconnect attempts every ~3s in console logs
- TTS and weather fetch endpoints still referencing old HTTPS domain paths
- **Next**: fully clean the UI JS of all old domain refs, fix the origin-not-allowed errors, re-test with Playwright
- **Network issue on Case's VM** (192.168.86.45): persistent "TypeError: fetch failed" every ~10s, breaking Telegram polling and ChromaDB auto-recall. D J communicating via webchat as a workaround.
### Caddy Config (D J's reverse proxy)
```
kippui.host.letsgetnashty.com {

memory/2026-02-11.md — new file, 61 lines

@ -0,0 +1,61 @@
# 2026-02-11
## KIPP Voice Pipeline — Major Build Session
### Built & Deployed (feature/wake-word branch)
- **Always-on wake word detection** via OpenWakeWord (hey_jarvis model as placeholder)
- **Faster Whisper** (base.en) for speech-to-text on KIPP VM
- **Voice WebSocket server** on port 8082 (TLS) — `kipp-voice.service`
- **Python venv** at `/home/wdjones/kipp-venv` with openwakeword, faster-whisper, websockets, aiohttp
- **Male TTS voice** — switched from Amy to Ryan (Piper en_US)
- **Hero panel chat** — voice interaction happens inside the greeting/hero card, not a separate overlay
- **Widget state system** — JSON file + CLI tool + REST API + dashboard polling
- `tools/widgets.py` for shopping list, timers, reminders
- API endpoints on UI server: GET/POST /api/widgets
- Dashboard loads real data, polls every 10s
- KIPP agent instructed in SOUL.md to use widget CLI
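The widget flow above (one JSON file shared by CLI, REST API, and the polling dashboard) can be sketched minimally. The class name, file layout, and field names here are illustrative assumptions, not the actual `tools/widgets.py` schema:

```python
import json
import tempfile
from pathlib import Path

class WidgetStore:
    """Hypothetical minimal widget store: one JSON file that the CLI writes,
    the REST API serves, and the dashboard re-reads every poll cycle."""

    def __init__(self, path: Path):
        self.path = path
        if not path.exists():
            # Widget types are assumptions based on the notes above.
            path.write_text(json.dumps({"shopping_list": [], "timers": [], "reminders": []}))

    def load(self) -> dict:
        return json.loads(self.path.read_text())

    def add(self, widget_type: str, item: dict) -> dict:
        data = self.load()
        data.setdefault(widget_type, []).append(item)
        # CLI and REST both write through this one file, so a dashboard
        # poll (GET /api/widgets every 10s) always sees the latest state.
        self.path.write_text(json.dumps(data, indent=2))
        return data

store = WidgetStore(Path(tempfile.mkdtemp()) / "widgets.json")
state = store.add("shopping_list", {"item": "milk"})
```

A timer created in chat lands in the same file, which is why it shows up on the dashboard within one poll interval.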
### Key Bugs Fixed
1. **CSS injected inside JS** — patch script found `/* CHAT OVERLAY */` in both CSS and JS sections
2. **Gateway challenge-response** — must answer `connect.challenge` with `req` method `connect`
3. **Client ID must be `openclaw-control-ui`** — gateway validates this
4. **Origin header required** — voice server needs `Origin: https://192.168.86.100:8080`
5. **Lifecycle event detection** — gateway sends `phase="end"` not `state="end"` — THIS was the 60-second hang bug
6. **Audio suppressed during wake state** — browser stopped sending mic data when it should have been recording
7. **Race condition** — server sent `ready` before TTS finished, mic picked up speaker audio
8. **Self-triggering wake word** — KIPP's own TTS voice triggered "hey jarvis" — fixed with 2s cooldown
9. **voiceState stuck on speaking** — client must set listening before server's ready msg arrives
10. **Duplicate JS blocks** — sub-agent inserted widget code twice
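Bugs 3-5 boil down to two connection constants and one predicate. A minimal sketch of the lifecycle fix (the handshake beyond these facts is not shown here):

```python
# Headers/IDs the gateway validates (bugs 3 and 4, per the notes above).
CLIENT_ID = "openclaw-control-ui"
WS_HEADERS = {"Origin": "https://192.168.86.100:8080"}

def is_response_end(event: dict) -> bool:
    """Bug 5: the gateway marks end-of-response with phase="end", not
    state="end". Checking the wrong key meant the client waited on an
    event that never matched — the 60-second hang."""
    return event.get("phase") == "end"
```

The buggy version tested `event.get("state") == "end"`, which is false for every event the gateway actually sends.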
### Voice State Machine (final)
```
listening → (wake word) → recording → (silence) → processing → (gateway) → speaking → (done_speaking) → cooldown (2s) → listening
```
### Timing Config
- 4s grace period after wake word before silence timeout
- 1.5s silence after speech to end recording
- 30s max recording time
- 2s cooldown after TTS to prevent self-trigger
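The state machine and timing config above can be sketched as a transition table; event names are illustrative, only the states and timings come from the notes:

```python
# Timing constants from the session notes.
WAKE_GRACE_S = 4.0     # grace after wake word before silence timeout applies
SILENCE_END_S = 1.5    # silence needed to end recording
MAX_RECORD_S = 30.0    # hard cap on recording length
COOLDOWN_S = 2.0       # prevents TTS audio from re-triggering the wake word

TRANSITIONS = {
    ("listening", "wake_word"): "recording",
    ("recording", "silence"): "processing",
    ("recording", "max_length"): "processing",
    ("processing", "gateway_reply"): "speaking",
    ("speaking", "done_speaking"): "cooldown",
    ("cooldown", "timeout"): "listening",
}

def step(state: str, event: str) -> str:
    # Unknown (state, event) pairs are ignored — e.g. a wake word heard
    # during cooldown leaves the machine in cooldown.
    return TRANSITIONS.get((state, event), state)
```

The cooldown self-loop is the important part: it is exactly what stops KIPP's own TTS from restarting the cycle.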
### KIPP Model Switch
- Switched from `llamacpp/glm-4.7-flash` (83s responses!) to `anthropic/claude-sonnet-4-20250514` (~3s responses)
- GLM-4 Flash as fallback
- Config at `/home/wdjones/.openclaw/openclaw.json` on KIPP VM
### 15 Playwright Tests
- `kipp-ui/tests/test_voice.py` — UI elements, state transitions, chat flow, server connectivity
## anoin123 Investigation
- @browomo tweet about anoin123 Polymarket wallet: $1.6M in 57 days
- **2-4 AM EST claim is FALSE** — trades peak at 3 PM EST
- Strategy: "No harvester" — buys No at 90-99¢ on time-bounded events, collects spread
- $2.2M volume, $7K avg trade, concentrated on Iran strikes + government shutdown
- Monitor set up: `anoin123-monitor.py` + systemd timer every 5min
- Analysis at `data/investigations/anoin123-analysis.md`
- Copy-trade verdict: medium value — strategy is mechanical and replicable independently
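The "No harvester" pattern reduces to simple expected-value arithmetic, which also shows why it is replicable but fragile. Probabilities below are illustrative, not derived from anoin123's actual fills:

```python
def no_harvest_return(price_cents: float) -> float:
    """Gross return if a No share bought at price_cents resolves No (pays $1)."""
    return (100.0 - price_cents) / price_cents

def expected_value(price_cents: float, p_no: float) -> float:
    """EV per $1 staked: win the spread with probability p_no, else lose the stake."""
    return p_no * no_harvest_return(price_cents) - (1.0 - p_no)

# Buying No at 95¢ earns ~5.3% when right, but one full loss wipes out
# roughly 19 wins — the edge exists only when p_no is very close to 1.
```

At a 95¢ entry the EV flips negative once the true No probability drops below about 95%, which is why concentration on near-certain time-bounded events (Iran strikes, shutdown) is the whole strategy.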
## Infrastructure Notes
- KIPP VM services: kipp-ui, kipp-voice, kipp-tts, kipp-wss-proxy, openclaw-gateway
- Widget data: `/home/wdjones/.openclaw/workspace/kipp-ui/data/widgets.json`
- All changes on `feature/wake-word` branch in kipp/workspace repo

memory/2026-02-12.md — new file, 14 lines

@ -0,0 +1,14 @@
# 2026-02-12
## KIPP Voice Pipeline Fixes
- False wake word triggers after KIPP speaks — wake model picking up speaker audio
- Patch 1: Increased cooldown 2s → 4s, added silence flushing during cooldown (feed zeros through wake model to clear internal buffers), added RMS energy gate on wake detection
- RMS gate of 200 was too aggressive — blocked ALL real wake attempts (real RMS was 45-157)
- Lowered RMS gate to 30 — just filters literal silence false positives
- Voice server restarted, D J testing
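The RMS gate in Patch 1 can be sketched directly; the threshold values (200 too aggressive, 30 final) and the measured speech range (45-157) are from the notes, while the function shapes are illustrative:

```python
import math
import struct

RMS_THRESHOLD = 30  # 200 blocked real speech (measured RMS 45-157); 30 only filters near-silence

def frame_rms(pcm16: bytes) -> float:
    """RMS energy of a little-endian 16-bit mono PCM frame."""
    n = len(pcm16) // 2
    if n == 0:
        return 0.0
    samples = struct.unpack(f"<{n}h", pcm16[: n * 2])
    return math.sqrt(sum(s * s for s in samples) / n)

def accept_wake(score: float, pcm16: bytes, threshold: float = 0.5) -> bool:
    # A detection fires only if BOTH the wake-model score and the
    # energy gate pass — near-silent frames can't trigger.
    return score >= threshold and frame_rms(pcm16) >= RMS_THRESHOLD
```

Silence flushing during cooldown (feeding zero frames through the wake model) is complementary: the gate blocks silent false positives, the flush clears the model's internal buffers of KIPP's own speech.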
## Tweet Analyses
- **@jollygreenmoney / $SHL.V** — Homeland Nickel (TSX-V penny stock). Promoted junior miner, already ran 2,300% from $0.03→$0.72, pulled back to $0.41. Collapsing volume = distribution phase. Coordinated promotion with @Levi_Researcher. Nickel bull thesis has merit but this specific stock is exit liquidity. Verdict: stay away.
- **@MoonDevOnYT** — "Fastest growing quant repo on GitHub" AI trading agents. 875 followers, self-proclaimed "#1 quant on X." Content marketing funnel → paid private streams at moondev.com. No verifiable P&L, buzzword soup, fantasy architecture. Verdict: course seller, skip.
## D J signed off ~midnight

File diff suppressed because one or more lines are too long


@ -0,0 +1,3 @@
{
"last_check": "2026-02-12T06:14:23.639458+00:00"
}


@ -0,0 +1,50 @@
{
"timestamp": "2026-02-12T06:11:18.980961+00:00",
"total_scraped": 44,
"new_posts": 22,
"money_posts": 7,
"posts": [
{
"text": "We\u2019re in the business of supporting traders. Get the tools you need to make your way in the market.",
"userName": "tastytrade\n@tastytrade",
"timestamp": "",
"link": "/tastytrade/status/1971214019274080389/analytics"
},
{
"text": "agentic e2e regression testing solved.\nprompt \u2192 test in under 2 mins.\nif you're still writing tests by hand... why",
"userName": "Bug0\n@bug0inc",
"timestamp": "",
"link": "/bug0inc/status/2019756209390375399/analytics"
},
{
"text": " New GPT crypto price predictions:\n\n$ETH -0.285% => $1945.73\n\n$ZRO +0.086% => $2.329\n\n$UNI 0.0% => $3.492\n\n$BTC -0.116% => $67481.29\n\n$XRP +0.181% => $1.3796\n\n$PENGU +0.116% => $0.006021\n\n$SOL +0.063% => $79.72\n\n$TRUMP +0.094% => $3.195",
"userName": "OctoBot - GPT crypto price predictions\n@OctoBotGPT\n\u00b7\n10h",
"timestamp": "2026-02-11T20:00:50.000Z",
"link": "/OctoBotGPT/status/2021675819580481813"
},
{
"text": " New GPT crypto price predictions:\n\n$ADA 0.0% => $0.2633\n\n$DOGE 0.0% => $0.09372\n\n$SOL 0.0% => $84.78\n\n$XRP +0.266% => $1.427\n\n$SENT +0.594% => $0.02863\n\n$PAXG 0.0% => $5033.12\n\n$PEPE +98.997% => $0.000367\n\n$SUI +0.864% => $0.9487",
"userName": "OctoBot - GPT crypto price predictions\n@OctoBotGPT\n\u00b7\nFeb 10",
"timestamp": "2026-02-10T08:00:48.000Z",
"link": "/OctoBotGPT/status/2021132226792947802"
},
{
"text": "$ZAMA 0.0% => $0.02688\n\n$LIT -1.503% => $0.732\n\n$ZRO -0.259% => $1.927\n\n$BTC -0.078% => $68890.0\n\n$ETH -0.53% => $1997.59\n\n$HBAR +0.512% => $0.09188\n\n$TRX -0.036% => $0.277\n\n$SHIB 0.0% => $5.98e-06",
"userName": "OctoBot - GPT crypto price predictions\n@OctoBotGPT\n\u00b7\nFeb 10",
"timestamp": "2026-02-10T08:00:48.000Z",
"link": "/OctoBotGPT/status/2021132227921301730"
},
{
"text": " New GPT crypto price predictions:\n\n$ZRO +0.959% => $2.398\n\n$DOGE 0.0% => $0.09053\n\n$LINK 0.0% => $8.32\n\n$SOL -0.671% => $80.5\n\n$PENGU -0.169% => $0.005916\n\n$USD1 +0.02% => $1.0\n\n$SUI -0.403% => $0.893\n\n$BTC 0.0% => $67046.23",
"userName": "OctoBot - GPT crypto price predictions\n@OctoBotGPT\n\u00b7\n22h",
"timestamp": "2026-02-11T08:00:52.000Z",
"link": "/OctoBotGPT/status/2021494632635408429"
},
{
"text": "$ETH +1.022% => $1971.45\n\n$BNB 0.0% => $600.95\n\n$ZAMA -0.714% => $0.0196\n\n$SHIB 0.0% => $5.84e-06\n\n$XRP -0.242% => $1.3646\n\n$ASTER +0.459% => $0.653\n\n$ADA -0.393% => $0.2546\n\n$TRX 0.0% => $0.275",
"userName": "OctoBot - GPT crypto price predictions\n@OctoBotGPT\n\u00b7\n22h",
"timestamp": "2026-02-11T08:00:52.000Z",
"link": "/OctoBotGPT/status/2021494633759408338"
}
]
}

File diff suppressed because one or more lines are too long


@ -1,5 +1,5 @@
{
"last_check": "2026-02-11T00:55:59.787647+00:00",
"last_check": "2026-02-12T06:14:23.952343+00:00",
"total_tracked": 3100,
"new_this_check": 0
}


@ -0,0 +1 @@
["37f5b094ed19f0ae", "f013475eaef8b32f", "6f9f9b6f8da5cdc9", "952f5caf7819fde5", "6043a216215e23f0", "ff1b1af5e65905a8", "6217530e01f15dd0", "6e1a48e8344a1d6f", "6d61e4a9dfb604ea", "529a533d4da86360", "d2217d5c918df581", "8b4a57cd97ae6c34", "5bf0ad8e5e2f9dfe", "2ee63742fbc6e541", "7d74e6eca55f346f", "9f29b74a37377015", "3ccdb8471c5c19b6", "8a48151a19597742", "e6a24f45f8a6e8bb", "899377e3ae43e396", "05e47b3d9e24c860", "643b1c7d00ad50f2", "b7a2c548992c278c", "a1a8dddd35fd08d5", "149ace9c327e7700", "7d767e4d0e3a12a3", "0c6a5029b97b5a30", "b7354be18fa9f71a", "7b10b1cb595f2006", "ca4011161b3ce92d", "ffbe4a2b11671722", "cd2ce19326f75133", "60fd088f8f1ae9b2", "8ebbe21b036415d8", "84da8ad2dd424e2b", "7d4648af05013346", "137b80809666db54", "1151a6c9c0f2915c", "36db4785c888600f", "e8c4ec6aa2a9a563", "f9ded3bb072f01fe", "f30bff813bce39b8", "e1a98d7590fe46ee", "a767ef4ed21a86f3", "3525848121516057", "3d6e2887b81b3016", "cb6375e11d10b745", "493d2dc82b844b36", "60dced6f08edb3c2", "7d97ce308ff157b2", "db27f494858bd743", "176776613d96cdbe", "eb7a8395aa02f113"]


@ -0,0 +1,231 @@
#!/usr/bin/env python3
"""
Feed Monitor — Scrapes X home timeline via Chrome CDP (localhost:9222).
Deduplicates, filters for money/trading topics, saves captures, sends Telegram alerts.
"""
import json
import hashlib
import os
import sys
import time
import http.client
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

PROJECT_DIR = Path(__file__).parent
DATA_DIR = PROJECT_DIR / "data"
SEEN_FILE = DATA_DIR / "seen_posts.json"
CAPTURES_DIR = DATA_DIR / "feed_captures"
CAPTURES_DIR.mkdir(parents=True, exist_ok=True)

CDP_HOST = "localhost"
CDP_PORT = 9222
TELEGRAM_BOT_TOKEN = os.environ.get("TELEGRAM_BOT_TOKEN", "")
TELEGRAM_CHAT_ID = os.environ.get("TELEGRAM_CHAT_ID", "6443752046")

MONEY_KEYWORDS = [
    "polymarket", "trade", "trading", "profit", "arbitrage", "crypto",
    "bitcoin", "btc", "ethereum", "eth", "solana", "sol", "stock",
    "stocks", "market", "portfolio", "defi", "token", "whale",
    "bullish", "bearish", "short", "long", "pnl", "alpha", "degen",
    "usdc", "usdt", "wallet", "airdrop", "memecoin", "nft",
    "yield", "staking", "leverage", "futures", "options", "hedge",
    "pump", "dump", "rug", "moon", "bag", "position", "signal",
]


def send_telegram(message: str):
    """Send an HTML-formatted alert; fall back to stdout if no bot token is set."""
    if not TELEGRAM_BOT_TOKEN:
        print(f"[ALERT] {message}")
        return
    url = f"https://api.telegram.org/bot{TELEGRAM_BOT_TOKEN}/sendMessage"
    data = json.dumps({
        "chat_id": TELEGRAM_CHAT_ID,
        "text": message,
        "parse_mode": "HTML",
        "disable_web_page_preview": True,
    }).encode()
    req = urllib.request.Request(url, data=data, headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=10)
    except Exception as e:
        print(f" Telegram error: {e}")


def cdp_send(ws, method: str, params: dict = None, msg_id: int = 1):
    """Send a CDP command over websocket and return the result."""
    payload = {"id": msg_id, "method": method}
    if params:
        payload["params"] = params
    ws.send(json.dumps(payload))
    while True:
        resp = json.loads(ws.recv())
        if resp.get("id") == msg_id:
            return resp.get("result", {})


def get_x_tab_ws():
    """Find an X.com tab in Chrome and return its websocket URL."""
    conn = http.client.HTTPConnection(CDP_HOST, CDP_PORT, timeout=5)
    conn.request("GET", "/json")
    tabs = json.loads(conn.getresponse().read())
    conn.close()
    for t in tabs:
        url = t.get("url", "")
        if "x.com" in url or "twitter.com" in url:
            ws_url = t.get("webSocketDebuggerUrl")
            if ws_url:
                return ws_url, t.get("url")
    return None, None


def scrape_feed_via_cdp():
    """Navigate to X home, scroll, extract posts via DOM evaluation."""
    import websocket  # third-party: websocket-client

    ws_url, current_url = get_x_tab_ws()
    if not ws_url:
        print("ERROR: No X.com tab found in Chrome at localhost:9222")
        sys.exit(1)
    print(f"Connected to tab: {current_url}")
    ws = websocket.create_connection(ws_url, timeout=30)

    # Navigate to home timeline
    cdp_send(ws, "Page.navigate", {"url": "https://x.com/home"}, 1)
    time.sleep(5)

    all_posts = []
    seen_texts = set()
    for scroll_i in range(6):
        # Extract posts from timeline
        js = """
        (() => {
          const posts = [];
          document.querySelectorAll('article[data-testid="tweet"]').forEach(article => {
            try {
              const textEl = article.querySelector('[data-testid="tweetText"]');
              const text = textEl ? textEl.innerText : '';
              const userEl = article.querySelector('[data-testid="User-Name"]');
              const userName = userEl ? userEl.innerText : '';
              const timeEl = article.querySelector('time');
              const timestamp = timeEl ? timeEl.getAttribute('datetime') : '';
              const linkEl = article.querySelector('a[href*="/status/"]');
              const link = linkEl ? linkEl.getAttribute('href') : '';
              posts.push({ text, userName, timestamp, link });
            } catch(e) {}
          });
          return JSON.stringify(posts);
        })()
        """
        result = cdp_send(ws, "Runtime.evaluate", {"expression": js, "returnByValue": True}, 10 + scroll_i)
        raw = result.get("result", {}).get("value", "[]")
        posts = json.loads(raw) if isinstance(raw, str) else []
        for p in posts:
            # Dedupe within this scrape on the first 120 chars of text
            sig = p.get("text", "")[:120]
            if sig and sig not in seen_texts:
                seen_texts.add(sig)
                all_posts.append(p)
        # Scroll down
        cdp_send(ws, "Runtime.evaluate", {"expression": "window.scrollBy(0, 2000)"}, 100 + scroll_i)
        time.sleep(2)
    ws.close()
    return all_posts


def post_hash(post: dict) -> str:
    text = post.get("text", "") + post.get("userName", "")
    return hashlib.sha256(text.encode()).hexdigest()[:16]


def is_money_related(text: str) -> bool:
    lower = text.lower()
    return any(kw in lower for kw in MONEY_KEYWORDS)


def load_seen() -> set:
    if SEEN_FILE.exists():
        try:
            return set(json.loads(SEEN_FILE.read_text()))
        except Exception:
            pass
    return set()


def save_seen(seen: set):
    # Cap at 10k hashes. Note: set order is arbitrary, so this trims an
    # arbitrary subset rather than strictly the oldest entries.
    items = list(seen)[-10000:]
    SEEN_FILE.write_text(json.dumps(items))


def main():
    now = datetime.now(timezone.utc)
    print(f"=== Feed Monitor === {now.strftime('%Y-%m-%d %H:%M UTC')}")
    posts = scrape_feed_via_cdp()
    print(f"Scraped {len(posts)} posts from timeline")

    seen = load_seen()
    new_posts = []
    money_posts = []
    for p in posts:
        h = post_hash(p)
        if h in seen:
            continue
        seen.add(h)
        new_posts.append(p)
        if is_money_related(p.get("text", "")):
            money_posts.append(p)
    save_seen(seen)
    print(f"New posts: {len(new_posts)}")
    print(f"Money-related: {len(money_posts)}")

    # Save capture
    ts = now.strftime("%Y%m%d-%H%M")
    capture = {
        "timestamp": now.isoformat(),
        "total_scraped": len(posts),
        "new_posts": len(new_posts),
        "money_posts": len(money_posts),
        "posts": money_posts,
    }
    capture_file = CAPTURES_DIR / f"feed-{ts}.json"
    capture_file.write_text(json.dumps(capture, indent=2))
    print(f"Saved capture: {capture_file}")

    # Alert on money posts
    if money_posts:
        print(f"\n🔔 {len(money_posts)} money-related posts found!")
        for p in money_posts[:8]:
            user = p.get("userName", "").split("\n")[0]
            snippet = p.get("text", "")[:250].replace("\n", " ")
            link = p.get("link", "")
            full_link = f"https://x.com{link}" if link and not link.startswith("http") else link
            print(f"{user}: {snippet[:100]}...")
            msg = f"🔍 <b>{user}</b>\n\n{snippet}"
            if full_link:
                msg += f"\n\n{full_link}"
            send_telegram(msg)
    else:
        print("No new money-related posts.")
    return len(money_posts)


if __name__ == "__main__":
    count = main()
    sys.exit(0)
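The "needs systemd service for periodic monitoring" item from the active-threads list could look like the sketch below. Unit names, the install path, and the 30-minute cadence are all assumptions, not deployed config:

```ini
; Hypothetical units — names, paths, and schedule are assumptions.
; /etc/systemd/system/feed-monitor.service
[Unit]
Description=Feed Monitor (X timeline scrape + Telegram alerts)

[Service]
Type=oneshot
ExecStart=/usr/bin/python3 /home/wdjones/feed-hunter/feed_monitor.py
Environment=TELEGRAM_BOT_TOKEN=changeme

; /etc/systemd/system/feed-monitor.timer
[Unit]
Description=Run feed monitor every 30 minutes

[Timer]
OnCalendar=*:0/30
Persistent=true

[Install]
WantedBy=timers.target
```

`Type=oneshot` fits the script's run-and-exit shape; the timer, not the service, owns the schedule, matching the pattern already used for the Market Watch and anoin123 timers.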


@ -0,0 +1,296 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Market Watch — Paper Trading Dashboard</title>
<script src="https://cdn.jsdelivr.net/npm/chart.js@4.4.0/dist/chart.umd.min.js"></script>
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body { font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif; background: #0a0e1a; color: #e0e6f0; min-height: 100vh; }
.header { background: linear-gradient(135deg, #0f1629, #1a2342); padding: 20px 30px; border-bottom: 1px solid #1e2a4a; display: flex; justify-content: space-between; align-items: center; }
.header h1 { font-size: 1.5rem; font-weight: 600; }
.header h1 span { color: #4ecdc4; }
.header .meta { font-size: 0.85rem; color: #7a8bb5; }
.container { max-width: 1400px; margin: 0 auto; padding: 20px; }
.grid { display: grid; grid-template-columns: repeat(auto-fit, minmax(320px, 1fr)); gap: 16px; margin-bottom: 20px; }
.card { background: linear-gradient(145deg, #111827, #0f1520); border: 1px solid #1e2a4a; border-radius: 12px; padding: 20px; }
.card h2 { font-size: 1rem; color: #7a8bb5; text-transform: uppercase; letter-spacing: 1px; margin-bottom: 12px; font-weight: 500; }
.card h2 .icon { margin-right: 6px; }
.stat-row { display: flex; justify-content: space-between; align-items: baseline; margin-bottom: 8px; }
.stat-label { color: #7a8bb5; font-size: 0.85rem; }
.stat-value { font-size: 1.1rem; font-weight: 600; }
.stat-big { font-size: 2rem; font-weight: 700; }
.green { color: #4ecdc4; }
.red { color: #ff6b6b; }
.neutral { color: #7a8bb5; }
.badge { display: inline-block; padding: 2px 8px; border-radius: 4px; font-size: 0.75rem; font-weight: 600; }
.badge-green { background: rgba(78,205,196,0.15); color: #4ecdc4; }
.badge-red { background: rgba(255,107,107,0.15); color: #ff6b6b; }
.badge-blue { background: rgba(100,149,237,0.15); color: #6495ed; }
table { width: 100%; border-collapse: collapse; }
th { text-align: left; color: #7a8bb5; font-size: 0.75rem; text-transform: uppercase; letter-spacing: 1px; padding: 8px; border-bottom: 1px solid #1e2a4a; }
td { padding: 8px; font-size: 0.9rem; border-bottom: 1px solid #111827; }
tr:hover td { background: rgba(255,255,255,0.02); }
.chart-wrap { height: 250px; position: relative; }
.tabs { display: flex; gap: 8px; margin-bottom: 20px; }
.tab { padding: 8px 16px; border-radius: 8px; cursor: pointer; font-size: 0.9rem; border: 1px solid #1e2a4a; background: transparent; color: #7a8bb5; transition: all 0.2s; }
.tab.active { background: #4ecdc4; color: #0a0e1a; border-color: #4ecdc4; font-weight: 600; }
.tab:hover:not(.active) { border-color: #4ecdc4; color: #4ecdc4; }
.game-section { display: none; }
.game-section.active { display: block; }
.spinner { display: inline-block; width: 20px; height: 20px; border: 2px solid #1e2a4a; border-top-color: #4ecdc4; border-radius: 50%; animation: spin 0.8s linear infinite; }
@keyframes spin { to { transform: rotate(360deg); } }
.empty { text-align: center; color: #7a8bb5; padding: 40px; }
</style>
</head>
<body>
<div class="header">
<h1>📊 <span>Market Watch</span> — Paper Trading</h1>
<div class="meta">Auto-refresh 60s · <span id="lastUpdate">Loading...</span></div>
</div>
<div class="container">
<div class="tabs" id="gameTabs"></div>
<div id="gameContent"><div class="empty"><div class="spinner"></div> Loading games...</div></div>
</div>
<script>
let games = [];
let activeGameIdx = 0;
async function fetchJSON(url) {
const r = await fetch(url);
return r.json();
}
function pnlClass(v) { return v > 0 ? 'green' : v < 0 ? 'red' : 'neutral'; }
function pnlSign(v) { return v > 0 ? '+' : ''; }
function fmt(v, d=2) { return v != null ? Number(v).toFixed(d) : '—'; }
function fmtK(v) { return v >= 1e6 ? (v/1e6).toFixed(1)+'M' : v >= 1e3 ? (v/1e3).toFixed(1)+'K' : fmt(v); }
function renderTabs() {
const el = document.getElementById('gameTabs');
el.innerHTML = games.map((g, i) =>
`<button class="tab ${i === activeGameIdx ? 'active' : ''}" onclick="switchGame(${i})">${g.name || g.game_id.slice(0,8)}</button>`
).join('');
}
function switchGame(idx) {
activeGameIdx = idx;
renderTabs();
renderGame(games[idx]);
}
async function renderGame(game) {
const el = document.getElementById('gameContent');
const gid = game.game_id;
// Fetch details in parallel
const [detail, trades, portfolio] = await Promise.all([
fetchJSON(`/api/game/${gid}`),
fetchJSON(`/api/game/${gid}/trades`),
fetchJSON(`/api/game/${gid}/portfolio`)
]);
const player = game.players?.[0] || 'unknown';
const pf = portfolio[player] || {};
const board = detail.leaderboard || [];
const entry = board[0] || {};
const starting = game.starting_cash || 100000;
const totalVal = pf.total_value || entry.total_value || starting;
const totalPnl = totalVal - starting;
const pnlPct = (totalPnl / starting * 100);
const sells = trades.filter(t => t.action === 'SELL');
const wins = sells.filter(t => (t.realized_pnl||0) > 0);
const winRate = sells.length ? (wins.length / sells.length * 100) : null;
const totalFees = trades.reduce((s,t) => s + (t.fees||0), 0);
const realizedPnl = sells.reduce((s,t) => s + (t.realized_pnl||0), 0);
// Equity curve from snapshots
const snapshots = detail.snapshots?.[player] || [];
let html = `
<!-- Summary Cards -->
<div class="grid">
<div class="card">
<h2><span class="icon">💰</span>Portfolio Value</h2>
<div class="stat-big ${pnlClass(totalPnl)}">$${fmtK(totalVal)}</div>
<div class="stat-row">
<span class="stat-label">P&L</span>
<span class="stat-value ${pnlClass(totalPnl)}">${pnlSign(totalPnl)}$${fmt(totalPnl)} (${pnlSign(pnlPct)}${fmt(pnlPct)}%)</span>
</div>
<div class="stat-row">
<span class="stat-label">Starting Cash</span>
<span class="stat-value">$${fmtK(starting)}</span>
</div>
<div class="stat-row">
<span class="stat-label">Cash Available</span>
<span class="stat-value">$${fmtK(pf.cash || 0)}</span>
</div>
</div>
<div class="card">
<h2><span class="icon">📈</span>Trading Stats</h2>
<div class="stat-row">
<span class="stat-label">Total Trades</span>
<span class="stat-value">${trades.length}</span>
</div>
<div class="stat-row">
<span class="stat-label">Win Rate</span>
<span class="stat-value ${winRate && winRate > 50 ? 'green' : winRate ? 'red' : 'neutral'}">${winRate != null ? fmt(winRate,1)+'%' : '—'}</span>
</div>
<div class="stat-row">
<span class="stat-label">Realized P&L</span>
<span class="stat-value ${pnlClass(realizedPnl)}">${pnlSign(realizedPnl)}$${fmt(realizedPnl)}</span>
</div>
<div class="stat-row">
<span class="stat-label">Total Fees</span>
<span class="stat-value red">-$${fmt(totalFees)}</span>
</div>
</div>
<div class="card">
<h2><span class="icon">🎯</span>Game Info</h2>
<div class="stat-row">
<span class="stat-label">Game</span>
<span class="stat-value">${game.name || gid.slice(0,8)}</span>
</div>
<div class="stat-row">
<span class="stat-label">Type</span>
<span class="stat-value"><span class="badge badge-blue">${game.game_type || 'stock'}</span></span>
</div>
<div class="stat-row">
<span class="stat-label">Player</span>
<span class="stat-value">${player}</span>
</div>
<div class="stat-row">
<span class="stat-label">Open Positions</span>
<span class="stat-value">${Object.keys(pf.positions || {}).length}</span>
</div>
</div>
</div>
<!-- Equity Chart -->
${snapshots.length > 1 ? `
<div class="card" style="margin-bottom:16px">
<h2><span class="icon">📊</span>Equity Curve</h2>
<div class="chart-wrap"><canvas id="equityChart"></canvas></div>
</div>` : ''}
<!-- Positions -->
<div class="card" style="margin-bottom:16px">
<h2><span class="icon">📋</span>Open Positions</h2>
${Object.keys(pf.positions||{}).length ? `
<table>
<thead><tr><th>Symbol</th><th>Shares/Qty</th><th>Avg Cost</th><th>Current</th><th>Value</th><th>P&L</th></tr></thead>
<tbody>
${Object.entries(pf.positions||{}).map(([sym, pos]) => {
const upnl = pos.unrealized_pnl || ((pos.current_price||pos.avg_cost) - pos.avg_cost) * (pos.shares||pos.quantity||0);
const val = pos.market_value || (pos.current_price||pos.avg_cost) * (pos.shares||pos.quantity||0);
return `<tr>
<td><strong>${sym}</strong></td>
<td>${pos.shares||pos.quantity||0}</td>
<td>$${fmt(pos.avg_cost)}</td>
<td>$${fmt(pos.current_price||pos.live_price||pos.avg_cost)}</td>
<td>$${fmtK(val)}</td>
<td class="${pnlClass(upnl)}">${pnlSign(upnl)}$${fmt(upnl)}</td>
</tr>`;
}).join('')}
</tbody>
</table>` : '<div class="empty">No open positions</div>'}
</div>
<!-- Recent Trades -->
<div class="card">
<h2><span class="icon">🔄</span>Recent Trades (last 25)</h2>
${trades.length ? `
<table>
<thead><tr><th>Time</th><th>Action</th><th>Symbol</th><th>Qty</th><th>Price</th><th>P&L</th></tr></thead>
<tbody>
${trades.slice(0,25).map(t => {
const rpnl = t.realized_pnl || 0;
const actionClass = t.action === 'BUY' ? 'badge-green' : t.action === 'SELL' ? 'badge-red' : 'badge-blue';
return `<tr>
<td style="font-size:0.8rem;color:#7a8bb5">${t.timestamp ? new Date(t.timestamp).toLocaleString() : '—'}</td>
<td><span class="badge ${actionClass}">${t.action}</span></td>
<td><strong>${t.ticker||t.symbol||'—'}</strong></td>
<td>${t.shares||t.quantity||'—'}</td>
<td>$${fmt(t.price)}</td>
<td class="${pnlClass(rpnl)}">${t.action==='SELL' ? pnlSign(rpnl)+'$'+fmt(rpnl) : '—'}</td>
</tr>`;
}).join('')}
</tbody>
</table>` : '<div class="empty">No trades yet</div>'}
</div>`;
el.innerHTML = html;
// Draw equity chart
if (snapshots.length > 1) {
const ctx = document.getElementById('equityChart').getContext('2d');
const labels = snapshots.map(s => {
const d = s.timestamp || s.date;
if (!d) return '—';
const dt = new Date(d);
return isNaN(dt) ? String(d).slice(0,10) : dt.toLocaleDateString();
});
const values = snapshots.map(s => s.total_value || s.equity || starting);
new Chart(ctx, {
type: 'line',
data: {
labels,
datasets: [{
label: 'Equity',
data: values,
borderColor: '#4ecdc4',
backgroundColor: 'rgba(78,205,196,0.1)',
fill: true,
tension: 0.3,
pointRadius: 0,
borderWidth: 2
}, {
label: 'Starting',
data: Array(labels.length).fill(starting),
borderColor: '#7a8bb544',
borderDash: [5,5],
borderWidth: 1,
pointRadius: 0
}]
},
options: {
responsive: true, maintainAspectRatio: false,
plugins: { legend: { display: false } },
scales: {
x: { grid: { color: '#1e2a4a' }, ticks: { color: '#7a8bb5', maxTicksLimit: 8 } },
y: { grid: { color: '#1e2a4a' }, ticks: { color: '#7a8bb5', callback: v => '$'+fmtK(v) } }
}
}
});
}
}
async function init() {
try {
games = await fetchJSON('/api/games');
if (!games.length) {
document.getElementById('gameContent').innerHTML = '<div class="empty">No games found</div>';
return;
}
renderTabs();
await renderGame(games[activeGameIdx]);
document.getElementById('lastUpdate').textContent = new Date().toLocaleTimeString();
} catch (e) {
document.getElementById('gameContent').innerHTML = `<div class="empty">Error: ${e.message}</div>`;
}
}
init();
setInterval(async () => {
try {
games = await fetchJSON('/api/games');
await renderGame(games[activeGameIdx]);
document.getElementById('lastUpdate').textContent = new Date().toLocaleTimeString();
} catch(e) {}
}, 60000);
</script>
</body>
</html>


@ -1,424 +1,173 @@
#!/usr/bin/env python3
"""Market Watch Web Portal - Multiplayer GARP Paper Trading."""
"""Market Watch Web Portal — modern dark-themed dashboard."""
import json
import os
import sys
import traceback
from datetime import datetime
from http.server import HTTPServer, BaseHTTPRequestHandler
from socketserver import ThreadingMixIn
from urllib.parse import urlparse, parse_qs
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import game_engine
PORT = 8889
PROJECT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
SCANS_DIR = os.path.join(PROJECT_DIR, "data", "scans")
PORTAL_DIR = os.path.dirname(os.path.abspath(__file__))
class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    daemon_threads = True
CSS = """:root{--bg-primary:#0d1117;--bg-secondary:#161b22;--bg-tertiary:#21262d;--text-primary:#f0f6fc;--text-secondary:#8b949e;--border-color:#30363d;--accent-blue:#58a6ff;--accent-purple:#bc8cff;--positive-green:#3fb950;--negative-red:#f85149;--gold:#f0c000;--silver:#c0c0c0;--bronze:#cd7f32}
*{margin:0;padding:0;box-sizing:border-box}
body{font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,sans-serif;background:var(--bg-primary);color:var(--text-primary);line-height:1.5}
a{color:var(--accent-blue);text-decoration:none}a:hover{text-decoration:underline}
.navbar{background:var(--bg-secondary);border-bottom:1px solid var(--border-color);padding:1rem 2rem;display:flex;align-items:center;justify-content:space-between}
.nav-brand{font-size:1.5rem;font-weight:bold;color:var(--accent-blue)}
.nav-links{display:flex;gap:1.5rem}
.nav-links a{color:var(--text-secondary);text-decoration:none;padding:.5rem 1rem;border-radius:6px;transition:all .2s}
.nav-links a:hover{color:var(--text-primary);background:var(--bg-tertiary)}
.nav-links a.active{color:var(--accent-blue);background:var(--bg-tertiary)}
.container{max-width:1400px;margin:0 auto;padding:2rem}
.card{background:var(--bg-secondary);border:1px solid var(--border-color);border-radius:8px;padding:1.5rem;margin-bottom:1.5rem}
.card h3{color:var(--text-primary);margin-bottom:1rem;font-size:1.1rem}
.cards{display:grid;grid-template-columns:repeat(auto-fit,minmax(220px,1fr));gap:1.5rem;margin-bottom:2rem}
.metric-large{font-size:2rem;font-weight:bold;margin-bottom:.3rem}
.metric-small{color:var(--text-secondary);font-size:.85rem}
.positive{color:var(--positive-green)!important}.negative{color:var(--negative-red)!important}
table{width:100%;border-collapse:collapse}
th,td{padding:.6rem .8rem;text-align:left;border-bottom:1px solid var(--border-color)}
th{color:var(--text-secondary);font-size:.8rem;text-transform:uppercase}
td{font-size:.9rem}
.rank-1{color:var(--gold);font-weight:bold}.rank-2{color:var(--silver)}.rank-3{color:var(--bronze)}
.btn{display:inline-block;padding:.5rem 1.2rem;background:var(--accent-blue);color:#fff;border:none;border-radius:6px;cursor:pointer;font-size:.9rem;text-decoration:none;transition:opacity .2s}
.btn:hover{opacity:.85;text-decoration:none}
.btn-outline{background:transparent;border:1px solid var(--border-color);color:var(--text-primary)}
.btn-outline:hover{border-color:var(--accent-blue)}
.btn-green{background:var(--positive-green)}.btn-red{background:var(--negative-red)}
input,select{background:var(--bg-tertiary);border:1px solid var(--border-color);color:var(--text-primary);padding:.5rem .8rem;border-radius:6px;font-size:.9rem}
.form-row{display:flex;gap:1rem;align-items:end;flex-wrap:wrap;margin-bottom:1rem}
.form-group{display:flex;flex-direction:column;gap:.3rem}
.form-group label{font-size:.8rem;color:var(--text-secondary);text-transform:uppercase}
.badge{display:inline-block;padding:.15rem .5rem;border-radius:4px;font-size:.75rem;font-weight:bold}
.badge-ai{background:var(--accent-purple);color:#fff}
.badge-human{background:var(--accent-blue);color:#fff}
.player-link{color:var(--text-primary);font-weight:500}
@media(max-width:768px){.navbar{flex-direction:column;gap:1rem}.cards{grid-template-columns:1fr}.container{padding:1rem}.form-row{flex-direction:column}}"""
def _fetch_live_prices(tickers):
"""Fetch live prices via yfinance. Returns {ticker: price}."""
try:
import yfinance as yf
data = yf.download(tickers, period="1d", progress=False)
prices = {}
if len(tickers) == 1:
t = tickers[0]
if "Close" in data.columns and len(data) > 0:
prices[t] = float(data["Close"].iloc[-1])
else:
if "Close" in data.columns:
for t in tickers:
try:
val = data["Close"][t].iloc[-1]
if val == val: # not NaN
prices[t] = float(val)
except Exception:
pass
return prices
except Exception:
return {}
def nav(active=""):
return f"""<nav class="navbar">
<a href="/" style="text-decoration:none"><div class="nav-brand">📊 Market Watch</div></a>
<div class="nav-links">
<a href="/" class="{'active' if active=='home' else ''}">Games</a>
<a href="/scans" class="{'active' if active=='scans' else ''}">Scans</a>
</div></nav>"""
class MarketWatchHandler(BaseHTTPRequestHandler):
class Handler(BaseHTTPRequestHandler):
def do_GET(self):
try:
parsed = urlparse(self.path)
path = parsed.path.rstrip("/")
params = parse_qs(parsed.query)
path = urlparse(self.path).path.rstrip("/") or "/"
if path == "" or path == "/":
self.serve_home()
elif path == "/create-game":
self.serve_create_game()
elif path.startswith("/game/") and "/player/" in path:
parts = path.split("/") # /game/{gid}/player/{user}
self.serve_player(parts[2], parts[4])
elif path.startswith("/game/"):
game_id = path.split("/")[2]
self.serve_game(game_id)
elif path == "/scans":
self.serve_scans()
# API
elif path.startswith("/api/games") and len(path.split("/")) == 3:
self.send_json(game_engine.list_games(active_only=False))
elif path.startswith("/api/games/") and path.endswith("/leaderboard"):
gid = path.split("/")[3]
self.send_json(game_engine.get_leaderboard(gid))
elif "/portfolio" in path:
parts = path.split("/")
self.send_json(game_engine.get_portfolio(parts[3], parts[5]))
else:
self.send_error(404)
except Exception as e:
self.send_response(500)
self.send_header("Content-type", "text/html")
self.end_headers()
self.wfile.write(f"<h1>500</h1><pre>{e}</pre>".encode())
def do_POST(self):
try:
content_len = int(self.headers.get("Content-Length", 0))
body = self.rfile.read(content_len).decode() if content_len else ""
parsed = urlparse(self.path)
path = parsed.path.rstrip("/")
if path == "/":
return self._serve_file("index.html", "text/html")
# API endpoints
if path == "/api/games":
data = parse_form(body)
name = data.get("name", "Untitled Game")
cash = float(data.get("starting_cash", 100000))
end_date = data.get("end_date") or None
gid = game_engine.create_game(name, cash, end_date)
self.redirect(f"/game/{gid}")
elif path.endswith("/join"):
data = parse_form(body)
parts = path.split("/")
gid = parts[3]
username = data.get("username", "").strip().lower()
if username:
game_engine.join_game(gid, username)
self.redirect(f"/game/{gid}")
elif path.endswith("/trade"):
data = parse_form(body)
parts = path.split("/")
gid, username = parts[3], parts[5]
action = data.get("action", "").upper()
ticker = data.get("ticker", "").upper().strip()
shares = int(data.get("shares", 0))
if ticker and shares > 0:
import yfinance as yf
info = yf.Ticker(ticker).info  # fetch once; .info is a network call
price = info.get("currentPrice") or info.get("regularMarketPrice", 0)
if price and price > 0:
if action == "BUY":
game_engine.buy(gid, username, ticker, shares, price, reason="Manual trade")
elif action == "SELL":
game_engine.sell(gid, username, ticker, shares, price, reason="Manual trade")
self.redirect(f"/game/{gid}/player/{username}")
else:
self.send_error(404)
except Exception as e:
self.send_response(500)
self.send_header("Content-type", "text/html")
self.end_headers()
self.wfile.write(f"<h1>500</h1><pre>{e}</pre>".encode())
def serve_home(self):
games = game_engine.list_games(active_only=False)
rows = ""
# Enrich with summary
for g in games:
players = len(g.get("players", []))
status_badge = '<span class="badge badge-ai">Active</span>' if g["status"] == "active" else '<span class="badge">Ended</span>'
rows += f"""<tr>
<td><a href="/game/{g['game_id']}" class="player-link">{g['name']}</a></td>
<td>{players}</td>
<td>${g['starting_cash']:,.0f}</td>
<td>{g['start_date']}</td>
<td>{g.get('end_date', '') or ''}</td>
<td>{status_badge}</td>
</tr>"""
board = game_engine.get_leaderboard(g["game_id"])
g["leaderboard"] = board
trades_all = []
for p in g.get("players", []):
trades_all.extend(game_engine.get_trades(g["game_id"], p))
g["total_trades"] = len(trades_all)
sells = [t for t in trades_all if t.get("action") == "SELL"]
wins = [t for t in sells if t.get("realized_pnl", 0) > 0]
g["win_rate"] = round(len(wins)/len(sells)*100, 1) if sells else None
return self._json(games)
html = f"""<!DOCTYPE html><html><head><meta charset="UTF-8"><meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>Market Watch</title><style>{CSS}</style></head><body>
{nav('home')}
<div class="container">
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:1.5rem">
<h2>🎮 Active Games</h2>
<a href="/create-game" class="btn">+ New Game</a>
</div>
<div class="card">
<table><thead><tr><th>Game</th><th>Players</th><th>Starting Cash</th><th>Started</th><th>Ends</th><th>Status</th></tr></thead>
<tbody>{rows if rows else '<tr><td colspan="6" style="text-align:center;color:var(--text-secondary)">No games yet — create one!</td></tr>'}</tbody></table>
</div>
</div></body></html>"""
self.send_html(html)
# /api/game/{id}
parts = path.split("/")
if len(parts) >= 4 and parts[1] == "api" and parts[2] == "game":
gid = parts[3]
def serve_create_game(self):
html = f"""<!DOCTYPE html><html><head><meta charset="UTF-8"><meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>Create Game - Market Watch</title><style>{CSS}</style></head><body>
{nav()}
<div class="container">
<div class="card">
<h3>🎮 Create New Game</h3>
<form method="POST" action="/api/games">
<div class="form-row">
<div class="form-group"><label>Game Name</label><input type="text" name="name" placeholder="GARP Challenge" required></div>
<div class="form-group"><label>Starting Cash ($)</label><input type="number" name="starting_cash" value="100000" min="1000" step="1000"></div>
<div class="form-group"><label>End Date (optional)</label><input type="date" name="end_date"></div>
</div>
<button type="submit" class="btn">Create Game</button>
</form>
</div>
</div></body></html>"""
self.send_html(html)
def serve_game(self, game_id):
game = game_engine.get_game(game_id)
if len(parts) == 4:
game = game_engine.get_game(gid)
if not game:
return self.send_error(404)
return self._json({"error": "not found"}, 404)
game["leaderboard"] = game_engine.get_leaderboard(gid)
# Add snapshots for each player
game["snapshots"] = {}
for p in game.get("players", []):
game["snapshots"][p] = game_engine.get_snapshots(gid, p)
return self._json(game)
board = game_engine.get_leaderboard(game_id)
if len(parts) == 5 and parts[4] == "trades":
game = game_engine.get_game(gid)
if not game:
return self._json({"error": "not found"}, 404)
all_trades = []
for p in game.get("players", []):
for t in game_engine.get_trades(gid, p):
t["player"] = p
all_trades.append(t)
all_trades.sort(key=lambda x: x.get("timestamp", ""), reverse=True)
return self._json(all_trades)
rank_rows = ""
for i, entry in enumerate(board):
rank_class = f"rank-{i+1}" if i < 3 else ""
medal = ["🥇", "🥈", "🥉"][i] if i < 3 else f"#{i+1}"
pnl_class = "positive" if entry["pnl_pct"] >= 0 else "negative"
badge = ' <span class="badge badge-ai">AI</span>' if entry["username"] == "case" else ""
rank_rows += f"""<tr>
<td class="{rank_class}">{medal}</td>
<td><a href="/game/{game_id}/player/{entry['username']}" class="player-link">{entry['username']}</a>{badge}</td>
<td>${entry['total_value']:,.2f}</td>
<td class="{pnl_class}">{entry['pnl_pct']:+.2f}%</td>
<td class="{pnl_class}">${entry['total_pnl']:+,.2f}</td>
<td>{entry['num_positions']}</td>
<td>{entry['num_trades']}</td>
</tr>"""
if len(parts) == 5 and parts[4] == "portfolio":
game = game_engine.get_game(gid)
if not game:
return self._json({"error": "not found"}, 404)
portfolios = {}
all_tickers = []
for p in game.get("players", []):
pf = game_engine.get_portfolio(gid, p)
if pf:
portfolios[p] = pf
all_tickers.extend(pf["positions"].keys())
html = f"""<!DOCTYPE html><html><head><meta charset="UTF-8"><meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>{game['name']} - Market Watch</title><script src="https://cdn.jsdelivr.net/npm/chart.js"></script><style>{CSS}</style></head><body>
{nav()}
<div class="container">
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:.5rem">
<h2>🏆 {game['name']}</h2>
<span class="badge badge-ai">{game['status'].upper()}</span>
</div>
<p style="color:var(--text-secondary);margin-bottom:1.5rem">Started {game['start_date']} · ${game['starting_cash']:,.0f} starting cash · {len(game['players'])} players</p>
<div class="card">
<h3>Leaderboard</h3>
<table><thead><tr><th>Rank</th><th>Player</th><th>Portfolio</th><th>Return</th><th>P&L</th><th>Positions</th><th>Trades</th></tr></thead>
<tbody>{rank_rows if rank_rows else '<tr><td colspan="7" style="text-align:center;color:var(--text-secondary)">No players yet</td></tr>'}</tbody></table>
</div>
<div class="card">
<h3>Join This Game</h3>
<form method="POST" action="/api/games/{game_id}/join">
<div class="form-row">
<div class="form-group"><label>Username</label><input type="text" name="username" placeholder="your name" required pattern="[a-zA-Z0-9_-]+" title="Letters, numbers, dashes, underscores only"></div>
<button type="submit" class="btn">Join Game</button>
</div>
</form>
</div>
</div></body></html>"""
self.send_html(html)
def serve_player(self, game_id, username):
game = game_engine.get_game(game_id)
p = game_engine.get_portfolio(game_id, username)
if not game or not p:
return self.send_error(404)
trades = game_engine.get_trades(game_id, username)
snapshots = game_engine.get_snapshots(game_id, username)
pnl_class = "positive" if p["total_pnl"] >= 0 else "negative"
is_ai = username == "case"
badge = '<span class="badge badge-ai">AI Player</span>' if is_ai else '<span class="badge badge-human">Human</span>'
# Positions table
pos_rows = ""
for ticker, pos in sorted(p["positions"].items()):
pc = "positive" if pos["unrealized_pnl"] >= 0 else "negative"
pos_rows += f"""<tr>
<td><strong>{ticker}</strong></td><td>{pos['shares']}</td>
<td>${pos['avg_cost']:.2f}</td><td>${pos['current_price']:.2f}</td>
<td>${pos['market_value']:,.2f}</td><td class="{pc}">${pos['unrealized_pnl']:+,.2f}</td>
<td>${pos.get('trailing_stop',0):.2f}</td>
</tr>"""
if not pos_rows:
pos_rows = '<tr><td colspan="7" style="text-align:center;color:var(--text-secondary)">No positions</td></tr>'
# Trade log
trade_rows = ""
for t in reversed(trades[-30:]):
action_class = "positive" if t["action"] == "BUY" else "negative"
pnl_cell = ""
if t["action"] == "SELL":
rpnl = t.get("realized_pnl", 0)
rpnl_class = "positive" if rpnl >= 0 else "negative"
pnl_cell = f'<span class="{rpnl_class}">${rpnl:+,.2f}</span>'
trade_rows += f"""<tr>
<td class="{action_class}">{t['action']}</td><td>{t['ticker']}</td><td>{t['shares']}</td>
<td>${t['price']:.2f}</td><td>{pnl_cell}</td>
<td>{t.get('reason','')[:40]}</td><td>{t['timestamp'][:16]}</td>
</tr>"""
chart_labels = json.dumps([s["date"] for s in snapshots])
chart_values = json.dumps([s["total_value"] for s in snapshots])
# Fetch live prices
all_tickers = list(set(all_tickers))
if all_tickers:
live = _fetch_live_prices(all_tickers)
for p, pf in portfolios.items():
total_value = pf["cash"]
for ticker, pos in pf["positions"].items():
if ticker in live:
pos["live_price"] = live[ticker]
pos["current_price"] = live[ticker]
pos["unrealized_pnl"] = round((live[ticker] - pos["avg_cost"]) * pos["shares"], 2)
pos["market_value"] = round(live[ticker] * pos["shares"], 2)
total_value += pos["market_value"]
pf["total_value"] = round(total_value, 2)
starting = game.get("starting_cash", 100000)
pf["total_pnl"] = round(total_value - starting, 2)
pf["pnl_pct"] = round((total_value - starting) / starting * 100, 2)
# Trade form (only for humans)
trade_form = "" if is_ai else f"""
<div class="card">
<h3>📝 Place Trade</h3>
<form method="POST" action="/api/games/{game_id}/players/{username}/trade">
<div class="form-row">
<div class="form-group"><label>Action</label>
<select name="action"><option value="BUY">BUY</option><option value="SELL">SELL</option></select></div>
<div class="form-group"><label>Ticker</label><input type="text" name="ticker" placeholder="AAPL" required style="text-transform:uppercase"></div>
<div class="form-group"><label>Shares</label><input type="number" name="shares" min="1" value="10" required></div>
<button type="submit" class="btn btn-green">Execute</button>
</div>
<p style="color:var(--text-secondary);font-size:.8rem;margin-top:.5rem">Trades execute at current market price via Yahoo Finance</p>
</form>
</div>"""
return self._json(portfolios)
html = f"""<!DOCTYPE html><html><head><meta charset="UTF-8"><meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>{username} - {game['name']}</title><script src="https://cdn.jsdelivr.net/npm/chart.js"></script><style>{CSS}</style></head><body>
{nav()}
<div class="container">
<div style="margin-bottom:.5rem"><a href="/game/{game_id}" style="color:var(--text-secondary)">← {game['name']}</a></div>
<h2>{username} {badge}</h2>
<div class="cards" style="margin-top:1rem">
<div class="card"><h3>Portfolio Value</h3><div class="metric-large">${p['total_value']:,.2f}</div><div class="metric-small">Started at ${starting:,.0f}</div></div>
<div class="card"><h3>Cash</h3><div class="metric-large">${p['cash']:,.2f}</div><div class="metric-small">{p['cash']/max(p['total_value'],1)*100:.1f}% available</div></div>
<div class="card"><h3>Return</h3><div class="metric-large {pnl_class}">{p['pnl_pct']:+.2f}%</div><div class="metric-small {pnl_class}">${p['total_pnl']:+,.2f}</div></div>
<div class="card"><h3>Positions</h3><div class="metric-large">{p['num_positions']}</div><div class="metric-small">{len(trades)} total trades</div></div>
</div>
self._error(404)
except Exception as e:
self._json({"error": str(e), "trace": traceback.format_exc()}, 500)
<div class="card"><h3>Performance</h3><canvas id="chart" height="80"></canvas></div>
{trade_form}
<div class="card"><h3>Positions</h3>
<table><thead><tr><th>Ticker</th><th>Shares</th><th>Avg Cost</th><th>Price</th><th>Value</th><th>P&L</th><th>Stop</th></tr></thead>
<tbody>{pos_rows}</tbody></table>
</div>
<div class="card"><h3>Trade Log</h3>
<table><thead><tr><th>Action</th><th>Ticker</th><th>Shares</th><th>Price</th><th>P&L</th><th>Reason</th><th>Time</th></tr></thead>
<tbody>{trade_rows if trade_rows else '<tr><td colspan="7" style="text-align:center;color:var(--text-secondary)">No trades yet</td></tr>'}</tbody></table>
</div>
</div>
<script>
const ctx = document.getElementById('chart').getContext('2d');
const labels = {chart_labels}; const values = {chart_values};
if (labels.length > 0) {{
new Chart(ctx, {{type:'line',data:{{labels:labels,datasets:[
{{label:'Portfolio',data:values,borderColor:'#58a6ff',backgroundColor:'rgba(88,166,255,0.1)',fill:true,tension:0.3}},
{{label:'Starting',data:labels.map(()=>{starting}),borderColor:'#30363d',borderDash:[5,5],pointRadius:0}}
]}},options:{{responsive:true,plugins:{{legend:{{labels:{{color:'#f0f6fc'}}}}}},scales:{{x:{{ticks:{{color:'#8b949e'}},grid:{{color:'#21262d'}}}},y:{{ticks:{{color:'#8b949e',callback:v=>'$'+v.toLocaleString()}},grid:{{color:'#21262d'}}}}}}}}
}});
}} else {{ ctx.canvas.parentElement.innerHTML += '<div style="text-align:center;color:#8b949e;padding:2rem">Chart populates after first trading day</div>'; }}
</script></body></html>"""
self.send_html(html)
def serve_scans(self):
rows = ""
if os.path.exists(SCANS_DIR):
for sf in sorted(os.listdir(SCANS_DIR), reverse=True)[:30]:
if not sf.endswith(".json"): continue
data = {}
with open(os.path.join(SCANS_DIR, sf)) as f:
data = json.load(f)
n = data.get("candidates_found", len(data.get("candidates", [])))
top = ", ".join(c.get("ticker","?") for c in data.get("candidates", [])[:8])
rows += f'<tr><td>{sf.replace(".json","")}</td><td>{data.get("total_scanned",0)}</td><td>{n}</td><td>{top or ""}</td></tr>'
html = f"""<!DOCTYPE html><html><head><meta charset="UTF-8"><meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>Scans - Market Watch</title><style>{CSS}</style></head><body>
{nav('scans')}
<div class="container"><div class="card"><h3>📡 GARP Scan History</h3>
<table><thead><tr><th>Date</th><th>Scanned</th><th>Candidates</th><th>Top Picks</th></tr></thead>
<tbody>{rows if rows else '<tr><td colspan="4" style="text-align:center;color:var(--text-secondary)">No scans yet</td></tr>'}</tbody></table>
</div></div></body></html>"""
self.send_html(html)
def redirect(self, url):
self.send_response(303)
self.send_header("Location", url)
self.end_headers()
def send_html(self, content):
def _serve_file(self, filename, content_type):
filepath = os.path.join(PORTAL_DIR, filename)
with open(filepath, "rb") as f:
data = f.read()
self.send_response(200)
self.send_header("Content-type", "text/html")
self.send_header("Content-Type", content_type)
self.end_headers()
self.wfile.write(content.encode())
self.wfile.write(data)
def send_json(self, data):
self.send_response(200)
self.send_header("Content-type", "application/json")
def _json(self, data, code=200):
body = json.dumps(data, default=str).encode()
self.send_response(code)
self.send_header("Content-Type", "application/json")
self.send_header("Access-Control-Allow-Origin", "*")
self.end_headers()
self.wfile.write(json.dumps(data, default=str).encode())
self.wfile.write(body)
def log_message(self, format, *args):
def _error(self, code):
self.send_response(code)
self.send_header("Content-Type", "text/plain")
self.end_headers()
self.wfile.write(f"{code}".encode())
def log_message(self, fmt, *args):
pass
def parse_form(body):
"""Parse URL-encoded form data."""
result = {}
for pair in body.split("&"):
if "=" in pair:
k, v = pair.split("=", 1)
from urllib.parse import unquote_plus
result[unquote_plus(k)] = unquote_plus(v)
return result
def main():
game_engine.ensure_default_game()
print(f"📊 Market Watch Portal starting on localhost:{PORT}")
server = ThreadedHTTPServer(("0.0.0.0", PORT), MarketWatchHandler)
print(f"📊 Market Watch Portal → http://localhost:{PORT}")
server = ThreadedHTTPServer(("0.0.0.0", PORT), Handler)
try:
server.serve_forever()
except KeyboardInterrupt:
print("\nPortal stopped")
print("\nStopped")
if __name__ == "__main__":
main()

tools/analyze_tweet.py: new executable file (+389 lines)

@@ -0,0 +1,389 @@
#!/usr/bin/env python3
"""Tweet Analysis Tool - Scrapes and analyzes tweets via Chrome CDP."""
import argparse
import asyncio
import json
import re
import sys
from datetime import datetime
try:
from playwright.async_api import async_playwright
except ImportError:
print("ERROR: playwright not installed. Run: pip install playwright", file=sys.stderr)
sys.exit(1)
try:
import yfinance as yf
except ImportError:
yf = None
def extract_tickers(text: str) -> list[str]:
"""Extract $TICKER patterns from text."""
return list(set(re.findall(r'\$([A-Z]{1,5}(?:\.[A-Z]{1,2})?)', text.upper())))
def lookup_tickers(tickers: list[str]) -> dict:
"""Look up ticker data via yfinance."""
if not yf or not tickers:
return {}
results = {}
for t in tickers[:5]: # limit to 5
try:
info = yf.Ticker(t).info
results[t] = {
"price": info.get("currentPrice") or info.get("regularMarketPrice"),
"market_cap": info.get("marketCap"),
"name": info.get("shortName"),
"volume": info.get("volume"),
"day_change_pct": info.get("regularMarketChangePercent"),
"52w_high": info.get("fiftyTwoWeekHigh"),
"52w_low": info.get("fiftyTwoWeekLow"),
}
except Exception:
results[t] = {"error": "lookup failed"}
return results
async def scrape_tweet(url: str) -> dict:
"""Connect to Chrome CDP and scrape tweet data."""
# Normalize URL
url = url.replace("twitter.com", "x.com")
if not url.startswith("http"):
url = "https://" + url
data = {
"url": url,
"author": None,
"handle": None,
"text": None,
"timestamp": None,
"metrics": {},
"images": [],
"bio": None,
"followers": None,
"following": None,
"reply_to": None,
"replies_sample": [],
"scrape_error": None,
}
async with async_playwright() as p:
try:
browser = await p.chromium.connect_over_cdp("http://localhost:9222")
except Exception as e:
data["scrape_error"] = f"CDP connection failed: {e}"
return data
try:
ctx = browser.contexts[0] if browser.contexts else await browser.new_context()
page = await ctx.new_page()
await page.goto(url, wait_until="domcontentloaded", timeout=30000)
await page.wait_for_timeout(4000)
# Get the main tweet article
# Try to find the focal tweet
tweet_sel = 'article[data-testid="tweet"]'
articles = await page.query_selector_all(tweet_sel)
if not articles:
data["scrape_error"] = "No tweet articles found on page"
await page.close()
return data
# The focal tweet is typically the one with the largest text or specific structure
# On a tweet permalink, it's usually the first or second article
focal = None
for art in articles:
# The focal tweet has a different time display (absolute vs relative)
time_el = await art.query_selector('time')
if time_el:
dt = await time_el.get_attribute('datetime')
if dt:
focal = art
data["timestamp"] = dt
break
if not focal:
focal = articles[0]
# Author info
user_links = await focal.query_selector_all('a[role="link"]')
for link in user_links:
href = await link.get_attribute("href") or ""
if href.startswith("/") and href.count("/") == 1 and len(href) > 1:
spans = await link.query_selector_all("span")
for span in spans:
txt = (await span.inner_text()).strip()
if txt.startswith("@"):
data["handle"] = txt
elif txt and not data["author"] and not txt.startswith("@"):
data["author"] = txt
break
# Tweet text
text_el = await focal.query_selector('div[data-testid="tweetText"]')
if text_el:
data["text"] = await text_el.inner_text()
# Metrics (replies, retweets, likes, views)
group = await focal.query_selector('div[role="group"]')
if group:
buttons = await group.query_selector_all('button')
metric_names = ["replies", "retweets", "likes", "bookmarks"]
for i, btn in enumerate(buttons):
aria = await btn.get_attribute("aria-label") or ""
# Parse numbers from aria labels like "123 replies"
nums = re.findall(r'[\d,]+', aria)
if nums and i < len(metric_names):
data["metrics"][metric_names[i]] = nums[0].replace(",", "")
# Views - often in a separate span
view_spans = await focal.query_selector_all('a[role="link"] span')
for vs in view_spans:
txt = (await vs.inner_text()).strip()
if "views" in txt.lower() or "Views" in txt:
nums = re.findall(r'[\d,.KkMm]+', txt)
if nums:
data["metrics"]["views"] = nums[0]
# Images
imgs = await focal.query_selector_all('img[alt="Image"]')
for img in imgs:
src = await img.get_attribute("src")
if src:
data["images"].append(src)
# Author profile info: grab bio/follower counts from visible page text
if data["handle"]:
handle_clean = data["handle"].lstrip("@")
# Check for bio/follower info in any hover cards or visible elements
all_text = await page.inner_text("body")
# Look for follower patterns
follower_match = re.search(r'([\d,.]+[KkMm]?)\s+Followers', all_text)
following_match = re.search(r'([\d,.]+[KkMm]?)\s+Following', all_text)
if follower_match:
data["followers"] = follower_match.group(1)
if following_match:
data["following"] = following_match.group(1)
# Sample some replies (articles after the focal tweet)
if len(articles) > 1:
for art in articles[1:4]:
reply_text_el = await art.query_selector('div[data-testid="tweetText"]')
if reply_text_el:
rt = await reply_text_el.inner_text()
if rt:
data["replies_sample"].append(rt[:200])
await page.close()
except Exception as e:
data["scrape_error"] = str(e)
try:
await page.close()
except Exception:
pass
return data
def analyze(data: dict) -> dict:
"""Produce structured analysis from scraped data."""
text = data.get("text") or ""
tickers = extract_tickers(text)
ticker_data = lookup_tickers(tickers)
# Red flags detection
red_flags = []
text_lower = text.lower()
promo_words = ["100x", "1000x", "moon", "gem", "rocket", "guaranteed", "easy money",
"don't miss", "last chance", "about to explode", "next big", "sleeping giant",
"never stops printing", "true freedom", "beat the institutions", "revolution",
"empire", "vault", "get rich", "financial freedom", "life changing",
"without a degree", "from a bedroom", "join this"]
for w in promo_words:
if w in text_lower:
red_flags.append(f"Promotional language: '{w}'")
if len(tickers) > 3:
red_flags.append(f"Multiple tickers mentioned ({len(tickers)})")
if len(text) > 2000:
red_flags.append("Extremely long promotional thread")
if "github" in text_lower and ("star" in text_lower or "repo" in text_lower):
red_flags.append("Pushing GitHub repo (potential funnel to paid product)")
if any(w in text_lower for w in ["course", "discord", "premium", "paid group", "subscribe"]):
red_flags.append("Funneling to paid product/community")
# Check replies for coordinated patterns
replies = data.get("replies_sample", [])
if replies:
rocket_replies = sum(1 for r in replies if any(e in r for e in ["🚀", "💎", "🔥", "LFG"]))
if rocket_replies >= 2:
red_flags.append("Replies show coordinated hype patterns")
# Check for penny stock characteristics
for t, info in ticker_data.items():
if isinstance(info, dict) and not info.get("error"):
price = info.get("price")
mcap = info.get("market_cap")
if price and price < 1:
red_flags.append(f"${t} is a penny stock (${price})")
if mcap and mcap < 50_000_000:
red_flags.append(f"${t} micro-cap (<$50M market cap)")
# Build verdict
if len(red_flags) >= 3:
verdict = "High risk - multiple red flags detected, exercise extreme caution"
elif len(red_flags) >= 1:
verdict = "Some concerns - verify claims independently before acting"
elif tickers:
verdict = "Worth investigating - do your own due diligence"
else:
verdict = "Informational tweet - no immediate financial claims detected"
return {
"tweet_data": data,
"tickers_found": tickers,
"ticker_data": ticker_data,
"red_flags": red_flags,
"verdict": verdict,
}
def format_markdown(analysis: dict) -> str:
"""Format analysis as markdown."""
d = analysis["tweet_data"]
lines = [f"# Tweet Analysis", ""]
lines.append(f"**URL:** {d['url']}")
lines.append(f"**Analyzed:** {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}")
lines.append("")
# WHO
lines.append("## 👤 WHO")
lines.append(f"- **Author:** {d.get('author') or 'Unknown'}")
lines.append(f"- **Handle:** {d.get('handle') or 'Unknown'}")
if d.get("followers"):
lines.append(f"- **Followers:** {d['followers']}")
if d.get("following"):
lines.append(f"- **Following:** {d['following']}")
if d.get("bio"):
lines.append(f"- **Bio:** {d['bio']}")
lines.append("")
# WHAT
lines.append("## 📝 WHAT")
lines.append(f"> {d.get('text') or 'Could not extract tweet text'}")
lines.append("")
if d.get("timestamp"):
lines.append(f"**Posted:** {d['timestamp']}")
metrics = d.get("metrics", {})
if metrics:
m_parts = [f"{v} {k}" for k, v in metrics.items()]
lines.append(f"**Metrics:** {' | '.join(m_parts)}")
if d.get("images"):
lines.append(f"**Images:** {len(d['images'])} attached")
lines.append("")
# VERIFY
lines.append("## ✅ VERIFY")
tickers = analysis.get("tickers_found", [])
td = analysis.get("ticker_data", {})
if tickers:
lines.append(f"**Tickers mentioned:** {', '.join('$' + t for t in tickers)}")
lines.append("")
for t, info in td.items():
if isinstance(info, dict) and not info.get("error"):
lines.append(f"### ${t}" + (f" - {info.get('name', '')}" if info.get('name') else ""))
if info.get("price"):
lines.append(f"- **Price:** ${info['price']}")
if info.get("market_cap"):
mc = info["market_cap"]
if mc > 1e9:
lines.append(f"- **Market Cap:** ${mc/1e9:.2f}B")
else:
lines.append(f"- **Market Cap:** ${mc/1e6:.1f}M")
if info.get("volume"):
lines.append(f"- **Volume:** {info['volume']:,}")
if info.get("day_change_pct"):
lines.append(f"- **Day Change:** {info['day_change_pct']:.2f}%")
if info.get("52w_high") and info.get("52w_low"):
lines.append(f"- **52W Range:** ${info['52w_low']} - ${info['52w_high']}")
lines.append("")
elif isinstance(info, dict) and info.get("error"):
lines.append(f"- ${t}: lookup failed")
else:
lines.append("No tickers mentioned in tweet.")
lines.append("")
# RED FLAGS
lines.append("## 🚩 RED FLAGS")
flags = analysis.get("red_flags", [])
if flags:
for f in flags:
lines.append(f"- ⚠️ {f}")
else:
lines.append("- None detected")
lines.append("")
# MONEY
lines.append("## 💰 MONEY")
if tickers and not flags:
lines.append("Potential opportunity identified. Research further before any position.")
elif tickers and flags:
lines.append("Tickers mentioned but red flags present. High risk of promoted/manipulated asset.")
else:
lines.append("No direct financial opportunity identified in this tweet.")
lines.append("")
# VERDICT
lines.append("## 🎯 VERDICT")
lines.append(f"**{analysis['verdict']}**")
lines.append("")
# Scrape issues
if d.get("scrape_error"):
lines.append(f"---\n⚠️ *Scrape warning: {d['scrape_error']}*")
return "\n".join(lines)
async def main():
parser = argparse.ArgumentParser(description="Analyze a tweet")
parser.add_argument("url", help="Tweet URL (x.com or twitter.com)")
parser.add_argument("--json", action="store_true", dest="json_output", help="Output JSON")
parser.add_argument("-o", "--output", help="Write output to file")
args = parser.parse_args()
# Validate URL
if not re.search(r'(x\.com|twitter\.com)/.+/status/\d+', args.url):
print("ERROR: Invalid tweet URL", file=sys.stderr)
sys.exit(1)
print("Scraping tweet...", file=sys.stderr)
data = await scrape_tweet(args.url)
print("Analyzing...", file=sys.stderr)
analysis = analyze(data)
if args.json_output:
output = json.dumps(analysis, indent=2, default=str)
else:
output = format_markdown(analysis)
if args.output:
with open(args.output, "w") as f:
f.write(output)
print(f"Written to {args.output}", file=sys.stderr)
else:
print(output)
if __name__ == "__main__":
asyncio.run(main())

# Data Source Connectors
Standalone Python scripts for fetching crypto/market data. Each script has a CLI with `--pretty` (formatted JSON) and `--summary` (human-readable output) flags.
## defillama.py ✅ (no auth needed)
DefiLlama API — DeFi protocol data, token prices, yield farming opportunities.
```bash
./defillama.py protocols --limit 10 --summary # Top protocols by TVL
./defillama.py tvl aave --pretty # TVL for specific protocol
./defillama.py prices coingecko:bitcoin coingecko:ethereum --summary
./defillama.py yields --limit 20 --stablecoins --summary # Top stablecoin yields
```
**Endpoints used:** api.llama.fi/protocols, api.llama.fi/tvl/{name}, coins.llama.fi/prices, yields.llama.fi/pools
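The `protocols` subcommand just filters and sorts this payload; the core logic can be sketched against a sample response (field names as returned by `api.llama.fi/protocols`, values invented):

```python
# Trimmed sample of the /protocols payload shape (values invented).
sample = [
    {"name": "Aave", "category": "Lending", "tvl": 12_500_000_000},
    {"name": "SomeCEX", "category": "CEX", "tvl": 30_000_000_000},
    {"name": "Lido", "category": "Liquid Staking", "tvl": 22_000_000_000},
    {"name": "Dust", "category": "Dexes", "tvl": None},
]

def top_protocols(data, limit=10):
    # Drop CEX entries and anything without a TVL figure, then sort descending.
    protos = [p for p in data if p.get("category") != "CEX" and p.get("tvl")]
    protos.sort(key=lambda p: p.get("tvl", 0), reverse=True)
    return protos[:limit]

print([p["name"] for p in top_protocols(sample)])  # → ['Lido', 'Aave']
```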
## coinglass.py 🔑 (API key recommended)
Coinglass — funding rates, open interest, long/short ratios.
```bash
export COINGLASS_API_KEY=your_key # Get at coinglass.com/pricing
./coinglass.py funding --summary
./coinglass.py oi --summary
./coinglass.py long-short --summary
```
**Note:** The free internal API endpoints often return empty data; an API key is required for reliable access.
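Internally the connector tries the free endpoint first and falls back to the keyed API when it errors or comes back empty. The pattern, stripped of HTTP details (fetcher names hypothetical):

```python
def fetch_with_fallback(free_fetch, auth_fetch):
    # Prefer the free endpoint; fall back to the authenticated API
    # when it raises or returns nothing.
    try:
        data = free_fetch()
        if data:
            return data
    except Exception:
        pass
    return auth_fetch()

# Simulated: the free endpoint returns empty, the keyed one has data.
print(fetch_with_fallback(lambda: [], lambda: [{"symbol": "BTC"}]))  # → [{'symbol': 'BTC'}]
```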
## arkham.py 🔑 (API key required)
Arkham Intelligence — whale wallet tracking, token transfers, entity search.
```bash
export ARKHAM_API_KEY=your_key # Sign up at platform.arkhamintelligence.com
./arkham.py notable --summary # List known whale addresses
./arkham.py address vitalik --summary # Address intelligence (supports name shortcuts)
./arkham.py transfers 0x1234... --limit 10 --pretty
./arkham.py search "binance" --pretty
```
**Built-in shortcuts:** vitalik, justin-sun, binance-hot, coinbase-prime, aave-treasury, uniswap-deployer
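The shortcuts are a plain dict lookup that passes unknown inputs through untouched, mirroring `resolve_address` in `arkham.py` (table trimmed to one entry here):

```python
NOTABLE_ADDRESSES = {
    "vitalik": "0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045",
}

def resolve_address(name_or_addr):
    # Known names map to their address; raw addresses pass through as-is.
    return NOTABLE_ADDRESSES.get(name_or_addr.lower(), name_or_addr)

print(resolve_address("Vitalik"))     # → 0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045
print(resolve_address("0x1234abcd"))  # → 0x1234abcd
```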
## Programmatic Usage
```python
from tools.data_sources.defillama import get_protocols, get_prices, get_yield_pools
from tools.data_sources.coinglass import get_funding_rates
from tools.data_sources.arkham import get_address_info, NOTABLE_ADDRESSES
```
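The `--summary` views share a compact USD formatter; the same logic as `_fmt_usd` in `defillama.py` is reproduced here for quick scripts that consume the raw JSON instead:

```python
def fmt_usd(v: float) -> str:
    # Billions and millions get suffixes; smaller values keep full digits.
    if v >= 1e9:
        return f"${v/1e9:.2f}B"
    if v >= 1e6:
        return f"${v/1e6:.1f}M"
    return f"${v:,.0f}"

print(fmt_usd(22_000_000_000))  # → $22.00B
print(fmt_usd(3_500_000))       # → $3.5M
print(fmt_usd(950))             # → $950
```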

tools/data_sources/__init__.py Executable file
"""Crypto & market data source connectors."""
from pathlib import Path
DATA_SOURCES_DIR = Path(__file__).parent

tools/data_sources/arkham.py Executable file
#!/usr/bin/env python3
"""Arkham Intelligence connector — whale tracking, token flows, address intelligence.
Requires API key for most endpoints. Set ARKHAM_API_KEY env var.
Sign up at https://platform.arkhamintelligence.com
"""
import argparse
import json
import os
import sys
from typing import Any
import requests
BASE = "https://api.arkhamintelligence.com"
TIMEOUT = 30
NOTABLE_ADDRESSES = {
"vitalik": "0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045",
"justin-sun": "0x3DdfA8eC3052539b6C9549F12cEA2C295cfF5296",
"binance-hot": "0x28C6c06298d514Db089934071355E5743bf21d60",
"coinbase-prime": "0xA9D1e08C7793af67e9d92fe308d5697FB81d3E43",
"aave-treasury": "0x464C71f6c2F760DdA6093dCB91C24c39e5d6e18c",
"uniswap-deployer": "0x41653c7d61609D856f29355E404F310Ec4142Cfb",
}
def _get(path: str, params: dict | None = None) -> Any:
key = os.environ.get("ARKHAM_API_KEY")
headers = {"User-Agent": "Mozilla/5.0"}
if key:
headers["API-Key"] = key
r = requests.get(f"{BASE}/{path}", params=params, headers=headers, timeout=TIMEOUT)
if r.status_code in (401, 403) or "api key" in r.text.lower():
raise EnvironmentError(
"Arkham API key required. Set ARKHAM_API_KEY env var.\n"
"Sign up at https://platform.arkhamintelligence.com"
)
r.raise_for_status()
return r.json()
def resolve_address(name_or_addr: str) -> str:
return NOTABLE_ADDRESSES.get(name_or_addr.lower(), name_or_addr)
# ── Data fetchers ───────────────────────────────────────────────────────────
def get_address_info(address: str) -> dict:
return _get(f"intelligence/address/{resolve_address(address)}")
def get_transfers(address: str, limit: int = 20) -> dict:
return _get("token/transfers", {"address": resolve_address(address), "limit": limit})
def search_entity(query: str) -> dict:
return _get("intelligence/search", {"query": query})
# ── Summary helpers ─────────────────────────────────────────────────────────
def summary_address(data: dict) -> str:
lines = ["═══ Address Intelligence ═══", ""]
if isinstance(data, dict):
entity = data.get("entity", {}) or {}
if entity:
lines.append(f" Entity: {entity.get('name', 'Unknown')}")
lines.append(f" Type: {entity.get('type', 'Unknown')}")
lines.append(f" Address: {data.get('address', '?')}")
labels = data.get("labels", [])
if labels:
lines.append(f" Labels: {', '.join(str(l) for l in labels)}")
else:
lines.append(f" {data}")
return "\n".join(lines)
def summary_transfers(data) -> str:
lines = ["═══ Recent Transfers ═══", ""]
transfers = data if isinstance(data, list) else (data.get("transfers", data.get("data", [])) if isinstance(data, dict) else [])
if not transfers:
lines.append(" No transfers found.")
return "\n".join(lines)
for t in transfers[:15]:
token = t.get("token", {}).get("symbol", "?") if isinstance(t.get("token"), dict) else "?"
amount = t.get("amount", t.get("value", "?"))
fr = t.get("from", {})
to = t.get("to", {})
fl = (fr.get("label") or fr.get("address", "?")[:12]) if isinstance(fr, dict) else str(fr)[:12]
tl = (to.get("label") or to.get("address", "?")[:12]) if isinstance(to, dict) else str(to)[:12]
lines.append(f" {token:<8} {str(amount):>15} {fl}{tl}")
return "\n".join(lines)
def summary_notable() -> str:
lines = ["═══ Notable/Whale Addresses ═══", ""]
for name, addr in NOTABLE_ADDRESSES.items():
lines.append(f" {name:<20} {addr}")
lines.append("")
lines.append(" Use these as shortcuts: arkham.py address vitalik")
return "\n".join(lines)
# ── CLI ─────────────────────────────────────────────────────────────────────
def main():
common = argparse.ArgumentParser(add_help=False)
common.add_argument("--pretty", action="store_true", help="Pretty-print JSON output")
common.add_argument("--summary", action="store_true", help="Human-readable summary")
parser = argparse.ArgumentParser(description="Arkham Intelligence connector", parents=[common])
sub = parser.add_subparsers(dest="command", required=True)
p_addr = sub.add_parser("address", help="Address intelligence", parents=[common])
p_addr.add_argument("address", help="Ethereum address or notable name")
p_tx = sub.add_parser("transfers", help="Recent token transfers", parents=[common])
p_tx.add_argument("address")
p_tx.add_argument("--limit", type=int, default=20)
p_search = sub.add_parser("search", help="Search entities", parents=[common])
p_search.add_argument("query")
sub.add_parser("notable", help="List notable/whale addresses", parents=[common])
args = parser.parse_args()
try:
if args.command == "notable":
if args.summary:
print(summary_notable())
else:
json.dump(NOTABLE_ADDRESSES, sys.stdout, indent=2 if args.pretty else None)
print()
return
if args.command == "address":
data = get_address_info(args.address)
if args.summary:
print(summary_address(data)); return
result = data
elif args.command == "transfers":
data = get_transfers(args.address, args.limit)
if args.summary:
print(summary_transfers(data)); return
result = data
elif args.command == "search":
result = search_entity(args.query)
else:
parser.print_help(); return
json.dump(result, sys.stdout, indent=2 if args.pretty else None)
print()
except EnvironmentError as e:
print(str(e), file=sys.stderr); sys.exit(1)
except requests.HTTPError as e:
detail = e.response.text[:200] if e.response is not None else ""
print(json.dumps({"error": str(e), "detail": detail}), file=sys.stderr); sys.exit(1)
except Exception as e:
print(json.dumps({"error": f"{type(e).__name__}: {e}"}), file=sys.stderr); sys.exit(1)
if __name__ == "__main__":
main()

tools/data_sources/coinglass.py Executable file
#!/usr/bin/env python3
"""Coinglass data connector — funding rates, open interest, long/short ratios.
Uses the free fapi.coinglass.com internal API where available.
Some endpoints may return empty data without authentication.
Set COINGLASS_API_KEY env var for authenticated access to open-api.coinglass.com.
"""
import argparse
import json
import os
import sys
from typing import Any
import requests
FREE_BASE = "https://fapi.coinglass.com/api"
AUTH_BASE = "https://open-api.coinglass.com/public/v2"
TIMEOUT = 30
def _free_get(path: str, params: dict | None = None) -> Any:
headers = {
"User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
"Referer": "https://www.coinglass.com/",
}
r = requests.get(f"{FREE_BASE}/{path}", params=params, headers=headers, timeout=TIMEOUT)
r.raise_for_status()
data = r.json()
    if data.get("code") in ("0", 0) or data.get("success"):
return data.get("data", [])
raise ValueError(f"API error: {data.get('msg', 'unknown')}")
def _auth_get(path: str, params: dict | None = None) -> Any:
key = os.environ.get("COINGLASS_API_KEY")
if not key:
raise EnvironmentError("COINGLASS_API_KEY not set. Get one at https://www.coinglass.com/pricing")
headers = {"coinglassSecret": key}
r = requests.get(f"{AUTH_BASE}/{path}", params=params, headers=headers, timeout=TIMEOUT)
r.raise_for_status()
data = r.json()
    if data.get("success") or data.get("code") in ("0", 0):
return data.get("data", [])
raise ValueError(f"API error: {data.get('msg', 'unknown')}")
# ── Data fetchers ───────────────────────────────────────────────────────────
def get_funding_rates() -> list[dict]:
"""Funding rates across exchanges."""
try:
data = _free_get("fundingRate/v2/home")
if data:
return data
except Exception:
pass
return _auth_get("funding")
def get_open_interest() -> list[dict]:
"""Aggregated open interest data."""
try:
data = _free_get("openInterest/v3/home")
if data:
return data
except Exception:
pass
return _auth_get("open_interest")
def get_long_short_ratio() -> list[dict]:
"""Global long/short account ratios."""
try:
data = _free_get("futures/longShort/v2/home")
if data:
return data
except Exception:
pass
return _auth_get("long_short")
# ── Summary helpers ─────────────────────────────────────────────────────────
def _no_data_msg(name: str) -> str:
return (f"No {name} data available (free API may be restricted).\n"
"Set COINGLASS_API_KEY for full access: https://www.coinglass.com/pricing")
def summary_funding(data: list[dict]) -> str:
if not data:
return _no_data_msg("funding rate")
lines = ["═══ Funding Rates ═══", ""]
for item in data[:20]:
symbol = item.get("symbol", item.get("coin", "?"))
rate = None
if "uMarginList" in item:
for m in item["uMarginList"]:
rate = m.get("rate")
if rate is not None:
break
else:
rate = item.get("rate")
if rate is not None:
lines.append(f" {symbol:<10} {float(rate)*100:>8.4f}%")
else:
lines.append(f" {symbol:<10} (rate unavailable)")
return "\n".join(lines)
def summary_oi(data: list[dict]) -> str:
if not data:
return _no_data_msg("open interest")
lines = ["═══ Open Interest ═══", ""]
for item in data[:20]:
symbol = item.get("symbol", item.get("coin", "?"))
oi = item.get("openInterest", item.get("oi", 0))
lines.append(f" {symbol:<10} OI: ${float(oi):>15,.0f}")
return "\n".join(lines)
def summary_ls(data: list[dict]) -> str:
if not data:
return _no_data_msg("long/short")
lines = ["═══ Long/Short Ratios ═══", ""]
for item in data[:20]:
symbol = item.get("symbol", item.get("coin", "?"))
long_rate = item.get("longRate", item.get("longRatio", "?"))
short_rate = item.get("shortRate", item.get("shortRatio", "?"))
lines.append(f" {symbol:<10} Long: {long_rate} Short: {short_rate}")
return "\n".join(lines)
# ── CLI ─────────────────────────────────────────────────────────────────────
def main():
common = argparse.ArgumentParser(add_help=False)
common.add_argument("--pretty", action="store_true", help="Pretty-print JSON output")
common.add_argument("--summary", action="store_true", help="Human-readable summary")
parser = argparse.ArgumentParser(description="Coinglass data connector", parents=[common])
sub = parser.add_subparsers(dest="command", required=True)
sub.add_parser("funding", help="Funding rates across exchanges", parents=[common])
sub.add_parser("oi", help="Open interest overview", parents=[common])
sub.add_parser("long-short", help="Long/short ratios", parents=[common])
args = parser.parse_args()
try:
if args.command == "funding":
data = get_funding_rates()
if args.summary:
print(summary_funding(data)); return
result = data
elif args.command == "oi":
data = get_open_interest()
if args.summary:
print(summary_oi(data)); return
result = data
elif args.command == "long-short":
data = get_long_short_ratio()
if args.summary:
print(summary_ls(data)); return
result = data
else:
parser.print_help(); return
json.dump(result, sys.stdout, indent=2 if args.pretty else None)
print()
except EnvironmentError as e:
print(str(e), file=sys.stderr); sys.exit(1)
except requests.HTTPError as e:
print(json.dumps({"error": str(e)}), file=sys.stderr); sys.exit(1)
except Exception as e:
print(json.dumps({"error": f"{type(e).__name__}: {e}"}), file=sys.stderr); sys.exit(1)
if __name__ == "__main__":
main()

tools/data_sources/defillama.py Executable file
#!/usr/bin/env python3
"""DefiLlama API connector — TVL, token prices, yield/APY data.
No authentication required. All endpoints are free.
API base: https://api.llama.fi | Prices: https://coins.llama.fi | Yields: https://yields.llama.fi
"""
import argparse
import json
import sys
from typing import Any
import requests
BASE = "https://api.llama.fi"
COINS_BASE = "https://coins.llama.fi"
YIELDS_BASE = "https://yields.llama.fi"
TIMEOUT = 30
def _get(url: str, params: dict | None = None) -> Any:
r = requests.get(url, params=params, timeout=TIMEOUT)
r.raise_for_status()
return r.json()
# ── Protocol / TVL ──────────────────────────────────────────────────────────
def get_protocols(limit: int = 20) -> list[dict]:
"""Top protocols by TVL."""
data = _get(f"{BASE}/protocols")
# Sort by tvl descending, filter out CEXes
protos = [p for p in data if p.get("category") != "CEX" and p.get("tvl")]
protos.sort(key=lambda p: p.get("tvl", 0), reverse=True)
return protos[:limit]
def get_tvl(protocol: str) -> dict:
"""Get current TVL for a specific protocol (slug name)."""
val = _get(f"{BASE}/tvl/{protocol}")
return {"protocol": protocol, "tvl": val}
def get_protocol_detail(protocol: str) -> dict:
"""Full protocol details including chain breakdowns."""
return _get(f"{BASE}/protocol/{protocol}")
# ── Token Prices ────────────────────────────────────────────────────────────
def get_prices(coins: list[str]) -> dict:
"""Get current prices. Coins format: 'coingecko:ethereum', 'ethereum:0x...', etc."""
joined = ",".join(coins)
data = _get(f"{COINS_BASE}/prices/current/{joined}")
return data.get("coins", {})
# ── Yields / APY ────────────────────────────────────────────────────────────
def get_yield_pools(limit: int = 30, min_tvl: float = 1_000_000, stablecoin_only: bool = False) -> list[dict]:
"""Top yield pools sorted by APY."""
data = _get(f"{YIELDS_BASE}/pools")
pools = data.get("data", [])
# Filter
pools = [p for p in pools if (p.get("tvlUsd") or 0) >= min_tvl and (p.get("apy") or 0) > 0]
if stablecoin_only:
pools = [p for p in pools if p.get("stablecoin")]
pools.sort(key=lambda p: p.get("apy", 0), reverse=True)
return pools[:limit]
# ── Summary helpers ─────────────────────────────────────────────────────────
def _fmt_usd(v: float) -> str:
if v >= 1e9:
return f"${v/1e9:.2f}B"
if v >= 1e6:
return f"${v/1e6:.1f}M"
return f"${v:,.0f}"
def summary_protocols(protos: list[dict]) -> str:
lines = ["═══ Top Protocols by TVL ═══", ""]
for i, p in enumerate(protos, 1):
lines.append(f" {i:>2}. {p['name']:<25} TVL: {_fmt_usd(p.get('tvl', 0)):>12} chain: {p.get('chain', '?')}")
return "\n".join(lines)
def summary_prices(prices: dict) -> str:
lines = ["═══ Token Prices ═══", ""]
for coin, info in prices.items():
lines.append(f" {info.get('symbol', coin):<10} ${info['price']:>12,.2f} (confidence: {info.get('confidence', '?')})")
return "\n".join(lines)
def summary_yields(pools: list[dict]) -> str:
lines = ["═══ Top Yield Pools ═══", ""]
for i, p in enumerate(pools, 1):
lines.append(
f" {i:>2}. {p.get('symbol','?'):<25} APY: {p.get('apy',0):>8.2f}% "
f"TVL: {_fmt_usd(p.get('tvlUsd',0)):>10} {p.get('chain','?')}/{p.get('project','?')}"
)
return "\n".join(lines)
# ── CLI ─────────────────────────────────────────────────────────────────────
def main():
common = argparse.ArgumentParser(add_help=False)
common.add_argument("--pretty", action="store_true", help="Pretty-print JSON output")
common.add_argument("--summary", action="store_true", help="Human-readable summary")
parser = argparse.ArgumentParser(description="DefiLlama data connector", parents=[common])
sub = parser.add_subparsers(dest="command", required=True)
# protocols
p_proto = sub.add_parser("protocols", help="Top protocols by TVL", parents=[common])
p_proto.add_argument("--limit", type=int, default=20)
# tvl
p_tvl = sub.add_parser("tvl", help="TVL for a specific protocol", parents=[common])
p_tvl.add_argument("protocol", help="Protocol slug (e.g. aave, lido)")
# prices
p_price = sub.add_parser("prices", help="Token prices", parents=[common])
p_price.add_argument("coins", nargs="+", help="Coin IDs: coingecko:ethereum, ethereum:0x...")
# yields
p_yield = sub.add_parser("yields", help="Top yield pools", parents=[common])
p_yield.add_argument("--limit", type=int, default=30)
p_yield.add_argument("--min-tvl", type=float, default=1_000_000)
p_yield.add_argument("--stablecoins", action="store_true")
args = parser.parse_args()
try:
if args.command == "protocols":
data = get_protocols(args.limit)
if args.summary:
print(summary_protocols(data))
return
result = [{"name": p["name"], "tvl": p.get("tvl"), "chain": p.get("chain"), "category": p.get("category"), "symbol": p.get("symbol")} for p in data]
elif args.command == "tvl":
result = get_tvl(args.protocol)
if args.summary:
print(f"{args.protocol}: {_fmt_usd(result['tvl'])}")
return
elif args.command == "prices":
result = get_prices(args.coins)
if args.summary:
print(summary_prices(result))
return
elif args.command == "yields":
data = get_yield_pools(args.limit, args.min_tvl, args.stablecoins)
if args.summary:
print(summary_yields(data))
return
result = [{"symbol": p.get("symbol"), "apy": p.get("apy"), "tvlUsd": p.get("tvlUsd"), "chain": p.get("chain"), "project": p.get("project"), "pool": p.get("pool")} for p in data]
else:
parser.print_help()
return
indent = 2 if args.pretty else None
json.dump(result, sys.stdout, indent=indent)
print()
except requests.HTTPError as e:
print(json.dumps({"error": str(e)}), file=sys.stderr)
sys.exit(1)
except Exception as e:
print(json.dumps({"error": f"Unexpected: {e}"}), file=sys.stderr)
sys.exit(1)
if __name__ == "__main__":
main()

tools/tweet_analyzer_wrapper.sh Executable file
#!/usr/bin/env bash
# Tweet Analyzer Wrapper - for OpenClaw agent use
# Usage: ./tweet_analyzer_wrapper.sh <tweet_url> [output_file]
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
URL="${1:?Usage: $0 <tweet_url> [output_file]}"
OUTPUT="${2:-}"
if [ -n "$OUTPUT" ]; then
python3 "$SCRIPT_DIR/analyze_tweet.py" "$URL" -o "$OUTPUT"
echo "Analysis written to $OUTPUT"
else
python3 "$SCRIPT_DIR/analyze_tweet.py" "$URL"
fi