Compare commits


2 Commits

Author SHA1 Message Date
6592590dac Playwright X scraper, daily notes, feed analysis 2026-02-09 17:26:02 -06:00
be0315894e Add crypto signals pipeline + Polymarket arb scanner
- Signal parser for Telegram JSON exports
- Price fetcher using Binance US API
- Backtester with fee-aware simulation
- Polymarket 15-min arb scanner with orderbook checking
- Systemd timer every 2 min for arb alerts
- Paper trade tracking
- Investigation: polymarket-15min-arb.md
2026-02-09 14:31:51 -06:00
16 changed files with 38596 additions and 253 deletions

View File

@@ -0,0 +1,36 @@
# Polymarket 15-Min Crypto Arbitrage
**Source:** https://x.com/noisyb0y1/status/2020942208858456206
**Date investigated:** 2026-02-09
**Verdict:** Legitimate edge, inflated claims
## Strategy
- Buy BOTH sides (Up + Down) on 15-minute BTC/ETH/SOL/XRP markets
- When combined cost < $1.00, guaranteed profit regardless of outcome
- Edge exists because these markets are low liquidity / inefficient pricing
## Reference Wallet
- `0xE594336603F4fB5d3ba4125a67021ab3B4347052`
- Real PnL on 2026-02-09: ~$9K on $82K deployed (11% daily)
- Combined costs ranged from $0.70 (great arb) to $1.10 (not arb)
- Best arbs: ETH markets at $0.70-0.73 combined cost
## Why It Works
- 15-min markets have thin books — prices diverge from fair value
- Binary outcome means Up + Down must sum to $1.00 at resolution
- If you buy both for < $1.00 total, guaranteed profit
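The arithmetic above can be sketched directly. Prices here are hypothetical (chosen to land in the $0.70-0.73 combined range noted above), and taker fees, which shrink the net, are not modeled:

```python
def arb_profit(up_ask: float, down_ask: float, shares: float = 100) -> float:
    """Gross profit from buying `shares` of BOTH sides at the ask.

    Exactly one side resolves to $1, so payout = shares * $1 and
    cost = shares * (up_ask + down_ask). Profit is positive iff the
    combined cost is under $1.00. Fees (not modeled) reduce this.
    """
    return shares * 1.0 - shares * (up_ask + down_ask)

# Hypothetical ETH market: $0.72 combined -> $28 gross on 100 share pairs
print(arb_profit(0.35, 0.37))
# Combined $1.10 -> negative, not an arb
print(arb_profit(0.55, 0.55))
```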
## Challenges
- Needs significant capital ($50K+) to make meaningful returns
- Fill quality degrades at scale — slippage kills the edge
- Competition from other bots narrows the window
- Not all markets have arb — some combined costs > $1.00
## Revisit When
- [ ] We have capital to deploy
- [ ] Built a bot to scan for combined < $1.00 opportunities in real-time
- [ ] Polymarket adds more 15-min markets (more opportunities)
## Related
- Tweet author promoting "Clawdbots" — bot product shill
- "$99K in a day" / "$340K total" claims are inflated (real: $9K profit)

View File

@@ -1,62 +1,66 @@
# 2026-02-09
# 2026-02-09 — Monday
## Market Watch Launch Day
- GARP paper trading simulator went live
- 9 AM systemd timer fired but crashed (scanner returns list, code expected dict) — fixed
- Manual run at 10:49 AM — **7 positions opened**:
- DUOL (57 shares @ $116.35)
- ALLY (156 shares @ $42.65)
- JHG (138 shares @ $48.21)
- INCY (61 shares @ $108.69)
- PINS (332 shares @ $20.06)
- EXEL (152 shares @ $43.80)
- CART (187 shares @ $35.49)
- ~$46.5K deployed, ~$53.5K cash remaining
- Portal live at marketwatch.local:8889
- Multiplayer game engine working — "GARP Challenge" game created
## X Feed Analysis Day
Major session analyzing Polymarket/crypto tweets that D J forwarded from the X feed.
## Super Bowl Results (from last night)
- Seahawks 36, Patriots 13
- kch123 copy-trade sim: ALL 5 positions won, +$728 on $1K bankroll (+72.8%)
- kch123 himself probably cleared ~$2M profit on this game
- Two weeks straight of wins for kch123 (60-0 last week + Super Bowl sweep)
### Tweets Investigated
1. **@browomo weather edge** — Pilot METAR data for Polymarket weather bets
- Wallet `0x594edB9112f...`: Claimed +$27K, actual **-$13,103** (387 losses, 51 wins)
- Verdict: SCAM/engagement bait
## Craigslist Account (from yesterday)
- Registered: case-lgn@protonmail.com, Nashville area
- Password set, credentials saved to .credentials/craigslist.env
- User ID: 405642144
- D J not ready for listings yet (needs photos)
2. **@ArchiveExplorer planktonXD** — "Buy everything under 5 cents"
- Wallet `0x4ffe49ba2a4c...`: Claimed +$104K, actual **-$9,517** (3090 losses, 1368 wins, 37% win rate)
- Verdict: SCAM/engagement bait
## D J Interests
- Looking at queen Murphy beds on Craigslist
- Wants to get rid of an old mattress (options discussed: Metro Nashville bulky pickup, free CL listing, dump)
- Interested in Olympics men's hockey (USA in Group C, games start Feb 11)
- Expanding analysis beyond Polymarket into crypto and stocks
- Goal: find market gaps to offset AI service costs ($200/mo Claude + infra)
- Getting crypto signals on Telegram, wants to forward them for analysis
- Asked about Tiiny AI (kickstarter AI hardware, 80GB unified memory, NPU) as potential Claude replacement
3. **@krajekis BTC 15-min LONG** — "+700% monthly, 1 trade/day at 9AM"
- Backtested 25 days: 52% win rate (coin flip), strategy loses 76% of capital
- Verdict: FABRICATED results
## Stock Screener Built
- GARP filters automated via yfinance (free, no API key)
- S&P 500 + S&P 400 MidCap scan (~902 tickers)
- Initial S&P 500 scan found 4: BAC, CFG, FITB, INCY
- Expanded scan found 22 total candidates
- Top non-bank picks: PINS, FSLR, PGR, NEM, ALLY
- Deep dive sub-agent ran on all 4 original picks
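A GARP-style filter over fundamentals might look like the sketch below. The field names mirror what yfinance's `Ticker.info` dict returns; the thresholds are illustrative, not the screener's actual cutoffs:

```python
def passes_garp(info: dict) -> bool:
    """Growth-at-a-reasonable-price filter over a yfinance-style info dict.

    Thresholds are illustrative: moderate P/E, double-digit earnings
    growth, positive margins, and PEG under 1.5.
    """
    pe = info.get("trailingPE")
    growth = info.get("earningsGrowth")    # fractional, e.g. 0.20 = 20%
    margin = info.get("profitMargins")
    if not all(isinstance(v, (int, float)) for v in (pe, growth, margin)):
        return False
    peg = pe / (growth * 100) if growth > 0 else float("inf")
    return 0 < pe < 25 and growth > 0.10 and margin > 0.05 and peg < 1.5

# With live data this would be fed e.g. yf.Ticker("PINS").info
print(passes_garp({"trailingPE": 18.0, "earningsGrowth": 0.20, "profitMargins": 0.12}))
print(passes_garp({"trailingPE": 40.0, "earningsGrowth": 0.05, "profitMargins": 0.02}))
```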
4. **@noisyb0y1 15-min arb** — "$99K in a day"
- Wallet `0xE594336603F4...`: Real strategy (buying both sides), actual PnL ~$9K not $99K
- Combined costs $0.70-$0.95 on some markets = genuine arb edge
- Verdict: INFLATED numbers, but strategy has merit → bookmarked for scanner
## X Post Analysis: @Shelpid_WI3M / anoin123
- Wallet 0x96489abc... is real, +$1.59M PnL, 216 trades
- BUT: concentrated single-thesis bet (NO on Iran strikes), not diversified alpha
- Post is a shill for PolyCopyBot (Telegram copy-trading bot)
- Verdict: real wallet, misleading narrative, exists to sell bot subscriptions
5. **5 more wallets** — spawned sub-agent to research @IH2P, Super Bowl trader, NegRisk arb, Elon insider, $270→$244K bot
## Tiiny AI Analysis
- 80GB LPDDR5X, ARM + NPU (160 TOPS), 18-40 tok/s, 30W
- Kickstarter vaporware — doesn't exist yet
- Would blow away D J's current 22GB VRAM setup IF it ships
- Recommended waiting for real reviews, not pre-ordering
### Pattern Identified
- Fintwit Polymarket accounts follow identical template: big $ claim → wallet → product shill
- Products being shilled: Clawdbots, Moltbook, Bullpen (affiliate/paid promos)
- Real money is in selling engagement, not trading
- Wallets are cherry-picked; PnL claims conflate position value with profit
## Lessons
- Scanner run_scan() returns list, not dict — caused systemd crash on first real run
- Always test the full pipeline end-to-end before relying on timers
- yfinance is reliable and free for fundamental data, no API key needed
### Built Today
- **Crypto signals pipeline** (`projects/crypto-signals/`)
- Signal parser for Telegram JSON exports
- Price fetcher (Binance US API)
- Backtester with Polymarket fee awareness
- Ready for D J's Telegram group export
- **Polymarket 15-min arb scanner** — systemd timer every 2 min
- Scans active "Up or Down" markets
- Checks orderbooks for combined < $1.00
- Paper trades and Telegram alerts
- Finding: markets are tight at steady state, arb windows likely during volatility
- **Playwright installed** — replaced flaky CDP websocket approach
- `connect_over_cdp('http://localhost:9222')` works with existing Chrome session
- X feed scraper working via Playwright
- **PowerInfer multi-GPU cron** — weekly Monday check for feature support
### Day-of-Week Stock Strategy
- D J's friend suggested buy Friday/Monday, sell midweek
- Backtested 5 years of SPY: Monday is strongest day (+0.114%, 62% win rate)
- Buy Mon Open → Sell Wed Close: 56% win rate, +0.265% avg, $10K→$17.8K
- But buy & hold still wins: $10K→$19K
- Verdict: weak edge, needs additional filters to be useful
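A minimal version of that day-of-week backtest, written as a pure function over daily bars (weekday 0 = Monday); a real run would feed it SPY history rather than the synthetic week shown:

```python
def mon_to_wed_return(bars):
    """Compound the Monday-open -> Wednesday-close trade over a bar list.

    `bars` is a sequence of (weekday, open, close) tuples in date order.
    Buys at each Monday open, sells at the next Wednesday close, and
    returns the total growth multiple on $1 (1.0 = flat).
    """
    equity, entry = 1.0, None
    for weekday, o, c in bars:
        if weekday == 0 and entry is None:        # Monday: enter at the open
            entry = o
        elif weekday == 2 and entry is not None:  # Wednesday: exit at the close
            equity *= c / entry
            entry = None
    return equity

# One synthetic week: buy Monday at 100, sell Wednesday at 101 (+1%)
week = [(0, 100.0, 100.5), (1, 100.6, 100.8), (2, 100.9, 101.0)]
print(mon_to_wed_return(week))
```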
### Infrastructure
- Polymarket fee structure documented: 15-min markets have taker fees, max 1.56% at 50/50
- Fee formula: `shares × price × 0.25 × (price × (1-price))²`
- Binance international is geo-blocked; Binance US works
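The documented fee formula can be sanity-checked against the stated 1.56% maximum:

```python
def taker_fee(shares: float, price: float) -> float:
    """Polymarket 15-min taker fee per the formula documented above (USDC)."""
    return shares * price * 0.25 * (price * (1 - price)) ** 2

# Effective rate on notional (shares * price) peaks at price = 0.50
rate = taker_fee(100, 0.50) / (100 * 0.50)
print(f"{rate:.4%}")  # 1.5625%, matching the documented maximum at 50/50
```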
### Pending
- D J sending Telegram crypto signal group export (Option 2: JSON export)
- Signal provider uses VWAP-based strategy
- Sub-agent researching 5 more wallets

File diff suppressed because it is too large

View File

@@ -0,0 +1,259 @@
#!/usr/bin/env python3
"""
Crypto Signal Backtester
Simulates each signal against historical price data to determine outcomes.
"""
import json
import sys
import time
from datetime import datetime, timezone
from pathlib import Path
from price_fetcher import get_all_klines, get_current_price, normalize_symbol, datetime_to_ms
def simulate_signal(signal, klines):
"""
Simulate a signal against historical candle data.
Returns outcome dict with result, P&L, time to resolution, etc.
"""
direction = signal['direction']
entry = signal.get('entry')
stop_loss = signal.get('stop_loss')
targets = signal.get('targets', [])
leverage = signal.get('leverage', 1)
if not targets or not stop_loss:
return {'result': 'incomplete', 'reason': 'missing SL or targets'}
target = targets[0] # Primary target
# If entry is 'market', use first candle's open
if entry == 'market' or entry is None:
if not klines:
return {'result': 'no_data'}
entry = klines[0]['open']
signal['entry_resolved'] = entry
# Calculate risk/reward
if direction == 'short':
risk = abs(stop_loss - entry)
reward = abs(entry - target)
risk_pct = risk / entry * 100
reward_pct = reward / entry * 100
else: # long
risk = abs(entry - stop_loss)
reward = abs(target - entry)
risk_pct = risk / entry * 100
reward_pct = reward / entry * 100
rr_ratio = reward / risk if risk > 0 else 0
result = {
'entry_price': entry,
'stop_loss': stop_loss,
'target': target,
'direction': direction,
'leverage': leverage,
'risk_pct': round(risk_pct, 2),
'reward_pct': round(reward_pct, 2),
'rr_ratio': round(rr_ratio, 2),
}
# Walk through candles
for i, candle in enumerate(klines):
high = candle['high']
low = candle['low']
if direction == 'short':
# Check SL hit (price went above SL)
sl_hit = high >= stop_loss
# Check TP hit (price went below target)
tp_hit = low <= target
else: # long
# Check SL hit (price went below SL)
sl_hit = low <= stop_loss
# Check TP hit (price went above target)
tp_hit = high >= target
if sl_hit and tp_hit:
# Both hit in same candle — assume SL hit first (conservative)
result['result'] = 'stop_loss'
result['exit_price'] = stop_loss
result['candles_to_exit'] = i + 1
result['exit_time'] = candle['open_time']
break
elif tp_hit:
result['result'] = 'target_hit'
result['exit_price'] = target
result['candles_to_exit'] = i + 1
result['exit_time'] = candle['open_time']
break
elif sl_hit:
result['result'] = 'stop_loss'
result['exit_price'] = stop_loss
result['candles_to_exit'] = i + 1
result['exit_time'] = candle['open_time']
break
else:
# Never resolved — check current unrealized P&L
if klines:
last_price = klines[-1]['close']
if direction == 'short':
unrealized_pct = (entry - last_price) / entry * 100
else:
unrealized_pct = (last_price - entry) / entry * 100
result['result'] = 'open'
result['last_price'] = last_price
result['unrealized_pct'] = round(unrealized_pct, 2)
result['unrealized_pct_leveraged'] = round(unrealized_pct * leverage, 2)
else:
result['result'] = 'no_data'
# Calculate P&L
if result['result'] in ('target_hit', 'stop_loss'):
exit_price = result['exit_price']
if direction == 'short':
pnl_pct = (entry - exit_price) / entry * 100
else:
pnl_pct = (exit_price - entry) / entry * 100
result['pnl_pct'] = round(pnl_pct, 2)
result['pnl_pct_leveraged'] = round(pnl_pct * leverage, 2)
return result
def backtest_signals(signals, interval='5m', lookforward_hours=72):
"""Backtest a list of parsed signals."""
results = []
for i, signal in enumerate(signals):
ticker = signal['ticker']
symbol = normalize_symbol(ticker)
timestamp = signal.get('timestamp', '')
print(f"[{i+1}/{len(signals)}] {ticker} {signal['direction']} ...", end=' ', flush=True)
# Get start time
start_ms = datetime_to_ms(timestamp) if timestamp else int(time.time() * 1000)
end_ms = start_ms + (lookforward_hours * 60 * 60 * 1000)
# Cap at current time
now_ms = int(time.time() * 1000)
if end_ms > now_ms:
end_ms = now_ms
# Fetch candles
klines = get_all_klines(symbol, interval, start_ms, end_ms)
if not klines:
print(f"NO DATA")
results.append({**signal, 'backtest': {'result': 'no_data', 'reason': f'no klines for {symbol}'}})
continue
# Simulate
outcome = simulate_signal(signal, klines)
print(f"{outcome['result']} | PnL: {outcome.get('pnl_pct_leveraged', outcome.get('unrealized_pct_leveraged', '?'))}%")
results.append({**signal, 'backtest': outcome})
time.sleep(0.2) # Rate limit
return results
def generate_report(results):
"""Generate a summary report from backtest results."""
total = len(results)
wins = [r for r in results if r['backtest'].get('result') == 'target_hit']
losses = [r for r in results if r['backtest'].get('result') == 'stop_loss']
open_trades = [r for r in results if r['backtest'].get('result') == 'open']
no_data = [r for r in results if r['backtest'].get('result') in ('no_data', 'incomplete')]
resolved = wins + losses
win_rate = len(wins) / len(resolved) * 100 if resolved else 0
avg_win = sum(r['backtest']['pnl_pct_leveraged'] for r in wins) / len(wins) if wins else 0
avg_loss = sum(r['backtest']['pnl_pct_leveraged'] for r in losses) / len(losses) if losses else 0
total_pnl = sum(r['backtest'].get('pnl_pct_leveraged', 0) for r in resolved)
# Profit factor
gross_profit = sum(r['backtest']['pnl_pct_leveraged'] for r in wins)
gross_loss = abs(sum(r['backtest']['pnl_pct_leveraged'] for r in losses))
profit_factor = gross_profit / gross_loss if gross_loss > 0 else float('inf')
# Risk/reward stats
avg_rr = sum(r['backtest'].get('rr_ratio', 0) for r in resolved) / len(resolved) if resolved else 0
report = {
'summary': {
'total_signals': total,
'wins': len(wins),
'losses': len(losses),
'open': len(open_trades),
'no_data': len(no_data),
'win_rate': round(win_rate, 1),
'avg_win_pct': round(avg_win, 2),
'avg_loss_pct': round(avg_loss, 2),
'total_pnl_pct': round(total_pnl, 2),
'profit_factor': round(profit_factor, 2),
'avg_risk_reward': round(avg_rr, 2),
},
'trades': results,
}
return report
def print_report(report):
"""Pretty print the report."""
s = report['summary']
print("\n" + "=" * 60)
print("CRYPTO SIGNAL BACKTEST REPORT")
print("=" * 60)
print(f"Total Signals: {s['total_signals']}")
print(f"Wins: {s['wins']}")
print(f"Losses: {s['losses']}")
print(f"Open: {s['open']}")
print(f"No Data: {s['no_data']}")
print(f"Win Rate: {s['win_rate']}%")
print(f"Avg Win: +{s['avg_win_pct']}% (leveraged)")
print(f"Avg Loss: {s['avg_loss_pct']}% (leveraged)")
print(f"Total P&L: {s['total_pnl_pct']}% (sum of resolved)")
print(f"Profit Factor: {s['profit_factor']}")
print(f"Avg R:R: {s['avg_risk_reward']}")
print("=" * 60)
if __name__ == '__main__':
if len(sys.argv) < 2:
print("Usage: python3 backtester.py <signals.json> [--interval 5m] [--hours 72]")
print("\nRun signal_parser.py first to generate signals.json")
sys.exit(1)
signals_path = sys.argv[1]
interval = '5m'
hours = 72
for i, arg in enumerate(sys.argv):
if arg == '--interval' and i + 1 < len(sys.argv):
interval = sys.argv[i + 1]
if arg == '--hours' and i + 1 < len(sys.argv):
hours = int(sys.argv[i + 1])
with open(signals_path) as f:
signals = json.load(f)
print(f"Backtesting {len(signals)} signals (interval={interval}, lookforward={hours}h)\n")
results = backtest_signals(signals, interval=interval, lookforward_hours=hours)
report = generate_report(results)
print_report(report)
# Save full report
out_path = signals_path.replace('.json', '_backtest.json')
with open(out_path, 'w') as f:
json.dump(report, f, indent=2)
print(f"\nFull report saved to {out_path}")

View File

@@ -0,0 +1,311 @@
#!/usr/bin/env python3
"""
Polymarket 15-Min Crypto Arbitrage Scanner
Scans active 15-minute crypto markets for arbitrage opportunities.
Alerts via Telegram when combined Up+Down cost < $1.00 (after fees).
Zero AI tokens — runs as pure Python via systemd timer.
"""
import json
import os
import sys
import time
import urllib.request
from datetime import datetime, timezone, timedelta
from pathlib import Path
# Config
DATA_DIR = Path(__file__).parent.parent / "data" / "arb-scanner"
DATA_DIR.mkdir(parents=True, exist_ok=True)
LOG_FILE = DATA_DIR / "scan_log.json"
PAPER_TRADES_FILE = DATA_DIR / "paper_trades.json"
TELEGRAM_BOT_TOKEN = os.environ.get("TELEGRAM_BOT_TOKEN", "")
TELEGRAM_CHAT_ID = os.environ.get("TELEGRAM_CHAT_ID", "6443752046")
# Polymarket fee formula for 15-min markets
def calc_taker_fee(shares, price):
"""Calculate taker fee in USDC."""
if price <= 0 or price >= 1:
return 0
return shares * price * 0.25 * (price * (1 - price)) ** 2
def calc_fee_rate(price):
"""Effective fee rate at a given price."""
if price <= 0 or price >= 1:
return 0
return 0.25 * (price * (1 - price)) ** 2
def get_active_15min_markets():
"""Fetch active 15-minute crypto markets from Polymarket."""
markets = []
# 15-min markets are scattered across pagination — scan broadly
for offset in range(0, 3000, 200):
url = (
f"https://gamma-api.polymarket.com/markets?"
f"active=true&closed=false&limit=200&offset={offset}"
f"&order=volume&ascending=false"
)
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
try:
resp = urllib.request.urlopen(req, timeout=15)
batch = json.loads(resp.read())
for m in batch:
q = m.get("question", "").lower()
if "up or down" in q:
markets.append(m)
if len(batch) < 200:
break
except Exception as e:
print(f"Error fetching markets (offset={offset}): {e}")
break
time.sleep(0.1)
# Only keep markets ending within the next 24 hours (tradeable window)
now = datetime.now(timezone.utc)
tradeable = []
for m in markets:
end_str = m.get("endDate", "")
if not end_str:
continue
try:
end_dt = datetime.fromisoformat(end_str.replace("Z", "+00:00"))
hours_until = (end_dt - now).total_seconds() / 3600
if 0.25 < hours_until <= 24: # Skip markets < 15 min to expiry (too late to fill both sides)
m["_hours_until_end"] = round(hours_until, 2)
tradeable.append(m)
except (ValueError, TypeError):
pass
# Deduplicate
seen = set()
unique = []
for m in tradeable:
cid = m.get("conditionId", m.get("id", ""))
if cid not in seen:
seen.add(cid)
unique.append(m)
return unique
def get_orderbook_prices(token_id):
"""Get best bid/ask from the CLOB API."""
url = f"https://clob.polymarket.com/book?token_id={token_id}"
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
try:
resp = urllib.request.urlopen(req, timeout=10)
book = json.loads(resp.read())
bids = book.get("bids", [])
asks = book.get("asks", [])
best_bid = float(bids[0]["price"]) if bids else 0
best_ask = float(asks[0]["price"]) if asks else 1
bid_size = float(bids[0].get("size", 0)) if bids else 0
ask_size = float(asks[0].get("size", 0)) if asks else 0
return {
"best_bid": best_bid,
"best_ask": best_ask,
"bid_size": bid_size,
"ask_size": ask_size,
"spread": best_ask - best_bid
}
except Exception as e:
return None
def scan_for_arbs():
"""Scan all active 15-min markets for arbitrage opportunities."""
markets = get_active_15min_markets()
print(f"Found {len(markets)} active 15-min crypto markets")
opportunities = []
for market in markets:
question = market.get("question", market.get("title", ""))
hours_left = market.get("_hours_until_end", "?")
# Get token IDs for both outcomes
tokens = market.get("clobTokenIds", "")
if isinstance(tokens, str):
try:
tokens = json.loads(tokens) if tokens.startswith("[") else tokens.split(",")
except:
tokens = []
if len(tokens) < 2:
continue
# Get orderbook for both tokens (ask = price to buy)
book_up = get_orderbook_prices(tokens[0])
book_down = get_orderbook_prices(tokens[1])
time.sleep(0.15)
if not book_up or not book_down:
continue
# For arb: we BUY both sides at the ASK price
up_ask = book_up["best_ask"]
down_ask = book_down["best_ask"]
combined = up_ask + down_ask
# Calculate fees on 100 shares
fee_up = calc_taker_fee(100, up_ask)
fee_down = calc_taker_fee(100, down_ask)
total_cost_100 = (up_ask + down_ask) * 100 + fee_up + fee_down
net_profit_100 = 100 - total_cost_100
net_profit_pct = net_profit_100 / total_cost_100 * 100 if total_cost_100 > 0 else 0
# Fillable size (limited by smaller side)
fillable_size = min(book_up["ask_size"], book_down["ask_size"])
if fillable_size > 0:
fill_fee_up = calc_taker_fee(fillable_size, up_ask)
fill_fee_down = calc_taker_fee(fillable_size, down_ask)
fill_cost = (up_ask + down_ask) * fillable_size + fill_fee_up + fill_fee_down
fill_profit = fillable_size - fill_cost
else:
fill_profit = 0
opp = {
"question": question,
"hours_left": hours_left,
"up_ask": up_ask,
"down_ask": down_ask,
"up_ask_size": book_up["ask_size"],
"down_ask_size": book_down["ask_size"],
"combined": round(combined, 4),
"fee_up_per_100": round(fee_up, 4),
"fee_down_per_100": round(fee_down, 4),
"total_fees_per_100": round(fee_up + fee_down, 4),
"net_profit_per_100": round(net_profit_100, 2),
"net_profit_pct": round(net_profit_pct, 2),
"fillable_shares": fillable_size,
"fillable_profit": round(fill_profit, 2),
"is_arb": net_profit_100 > 0,
"timestamp": datetime.now(timezone.utc).isoformat(),
}
opportunities.append(opp)
return opportunities
def paper_trade(opp):
"""Record a paper trade for an arb opportunity."""
trades = []
if PAPER_TRADES_FILE.exists():
trades = json.loads(PAPER_TRADES_FILE.read_text())
trade = {
"id": len(trades) + 1,
"timestamp": opp["timestamp"],
"question": opp["question"],
"up_price": opp.get("up_ask", opp.get("up_price", 0)),
"down_price": opp.get("down_ask", opp.get("down_price", 0)),
"combined": opp["combined"],
"fees_per_100": opp["total_fees_per_100"],
"net_profit_per_100": opp["net_profit_per_100"],
"net_profit_pct": opp["net_profit_pct"],
"status": "open", # Will be "won" when market resolves (always wins if real arb)
"paper_size_usd": 50, # Paper trade $50 per arb
}
expected_profit = 50 * opp["net_profit_pct"] / 100
trade["expected_profit_usd"] = round(expected_profit, 2)
trades.append(trade)
PAPER_TRADES_FILE.write_text(json.dumps(trades, indent=2))
return trade
def send_telegram_alert(message):
"""Send alert via Telegram bot API (zero tokens)."""
if not TELEGRAM_BOT_TOKEN:
print(f"[ALERT] {message}")
return
url = f"https://api.telegram.org/bot{TELEGRAM_BOT_TOKEN}/sendMessage"
data = json.dumps({
"chat_id": TELEGRAM_CHAT_ID,
"text": message,
"parse_mode": "HTML"
}).encode()
req = urllib.request.Request(url, data=data, headers={
"Content-Type": "application/json",
"User-Agent": "Mozilla/5.0"
})
try:
urllib.request.urlopen(req, timeout=10)
except Exception as e:
print(f"Telegram alert failed: {e}")
def main():
print(f"=== Polymarket 15-Min Arb Scanner ===")
print(f"Time: {datetime.now(timezone.utc).strftime('%Y-%m-%d %H:%M UTC')}")
print()
opps = scan_for_arbs()
arbs = [o for o in opps if o["is_arb"]]
non_arbs = [o for o in opps if not o["is_arb"]]
print(f"\nResults: {len(opps)} markets scanned, {len(arbs)} arb opportunities\n")
for o in sorted(opps, key=lambda x: x.get("net_profit_pct", 0), reverse=True):
emoji = "✅" if o["is_arb"] else "❌"
print(f"{emoji} {o['question'][:65]}")
up = o.get('up_ask', o.get('up_price', '?'))
down = o.get('down_ask', o.get('down_price', '?'))
print(f" Up: ${up} | Down: ${down} | Combined: ${o['combined']}")
print(f" Fees/100: ${o['total_fees_per_100']} | Net profit/100: ${o['net_profit_per_100']} ({o['net_profit_pct']}%)")
if o.get('fillable_shares'):
print(f" Fillable: {o['fillable_shares']:.0f} shares | Fillable profit: ${o.get('fillable_profit', '?')}")
print()
# Paper trade any arbs found
for arb in arbs:
trade = paper_trade(arb)
print(f"📝 Paper trade #{trade['id']}: {trade['question'][:50]} | Expected: +${trade['expected_profit_usd']}")
# Send Telegram alert
msg = (
f"🔔 <b>Arb Found!</b>\n\n"
f"<b>{arb['question']}</b>\n"
f"Up: ${arb.get('up_ask', arb.get('up_price', '?'))} | "
f"Down: ${arb.get('down_ask', arb.get('down_price', '?'))}\n"
f"Combined: ${arb['combined']} (after fees)\n"
f"Net profit: {arb['net_profit_pct']}%\n\n"
f"📝 Paper traded $50 → expected +${trade['expected_profit_usd']}"
)
send_telegram_alert(msg)
# Save scan log
log = []
if LOG_FILE.exists():
try:
log = json.loads(LOG_FILE.read_text())
except:
pass
log.append({
"timestamp": datetime.now(timezone.utc).isoformat(),
"markets_scanned": len(opps),
"arbs_found": len(arbs),
"opportunities": opps,
})
# Keep last 1000 scans
log = log[-1000:]
LOG_FILE.write_text(json.dumps(log, indent=2))
# Summary of paper trades
if PAPER_TRADES_FILE.exists():
trades = json.loads(PAPER_TRADES_FILE.read_text())
total_expected = sum(t.get("expected_profit_usd", 0) for t in trades)
print(f"\n📊 Paper trade total: {len(trades)} trades, expected profit: ${total_expected:.2f}")
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,131 @@
#!/usr/bin/env python3
"""
Crypto Price Fetcher
Pulls historical OHLCV data from Binance public API (no key needed).
"""
import json
import time
import urllib.request
from datetime import datetime, timezone
# Binance intl is geo-blocked from US; use Binance US
BINANCE_KLINES = "https://api.binance.us/api/v3/klines"
BINANCE_TICKER = "https://api.binance.us/api/v3/ticker/price"
def get_price_at_time(symbol, timestamp_ms, interval='1m'):
"""Get the candle at a specific timestamp."""
url = f"{BINANCE_KLINES}?symbol={symbol}&interval={interval}&startTime={timestamp_ms}&limit=1"
req = urllib.request.Request(url, headers={'User-Agent': 'Mozilla/5.0'})
try:
resp = urllib.request.urlopen(req, timeout=10)
data = json.loads(resp.read())
if data:
return {
'open_time': data[0][0],
'open': float(data[0][1]),
'high': float(data[0][2]),
'low': float(data[0][3]),
'close': float(data[0][4]),
'volume': float(data[0][5]),
}
except Exception as e:
print(f"Error fetching {symbol}: {e}")
return None
def get_klines(symbol, interval='1h', start_time_ms=None, end_time_ms=None, limit=1000):
"""Get historical klines/candlestick data."""
params = f"symbol={symbol}&interval={interval}&limit={limit}"
if start_time_ms:
params += f"&startTime={start_time_ms}"
if end_time_ms:
params += f"&endTime={end_time_ms}"
url = f"{BINANCE_KLINES}?{params}"
req = urllib.request.Request(url, headers={'User-Agent': 'Mozilla/5.0'})
try:
resp = urllib.request.urlopen(req, timeout=15)
raw = json.loads(resp.read())
return [{
'open_time': k[0],
'open': float(k[1]),
'high': float(k[2]),
'low': float(k[3]),
'close': float(k[4]),
'volume': float(k[5]),
'close_time': k[6],
} for k in raw]
except Exception as e:
print(f"Error fetching klines for {symbol}: {e}")
return []
def get_all_klines(symbol, interval, start_time_ms, end_time_ms):
"""Paginate through all klines between two timestamps."""
all_klines = []
current_start = start_time_ms
while current_start < end_time_ms:
batch = get_klines(symbol, interval, current_start, end_time_ms)
if not batch:
break
all_klines.extend(batch)
current_start = batch[-1]['close_time'] + 1
time.sleep(0.1) # Rate limiting
return all_klines
def get_current_price(symbol):
"""Get current price."""
url = f"{BINANCE_TICKER}?symbol={symbol}"
req = urllib.request.Request(url, headers={'User-Agent': 'Mozilla/5.0'})
try:
resp = urllib.request.urlopen(req, timeout=10)
data = json.loads(resp.read())
return float(data['price'])
except Exception as e:
print(f"Error fetching current price for {symbol}: {e}")
return None
def normalize_symbol(ticker):
"""Convert signal ticker to Binance symbol format."""
# Strip a trailing USDT (suffix only) and any slash, then append USDT
ticker = ticker.upper().replace('/', '')
if ticker.endswith('USDT'):
ticker = ticker[:-4]
return f"{ticker}USDT"
def datetime_to_ms(dt_str):
"""Convert datetime string to milliseconds timestamp."""
# Handle various formats
for fmt in ['%Y-%m-%dT%H:%M:%S', '%Y-%m-%d %H:%M:%S', '%Y-%m-%d']:
try:
dt = datetime.strptime(dt_str, fmt).replace(tzinfo=timezone.utc)
return int(dt.timestamp() * 1000)
except ValueError:
continue
return None
if __name__ == '__main__':
# Test with current signals
for ticker in ['ASTERUSDT', 'HYPEUSDT']:
symbol = normalize_symbol(ticker)
price = get_current_price(symbol)
print(f"{symbol}: ${price}")
# Get last 24h of 1h candles
now_ms = int(time.time() * 1000)
day_ago = now_ms - (24 * 60 * 60 * 1000)
klines = get_klines(symbol, '1h', day_ago, now_ms)
if klines:
highs = [k['high'] for k in klines]
lows = [k['low'] for k in klines]
print(f" 24h range: ${min(lows):.4f} - ${max(highs):.4f}")
print(f" Candles: {len(klines)}")
print()

View File

@@ -0,0 +1,166 @@
#!/usr/bin/env python3
"""
Telegram Crypto Signal Parser
Parses exported Telegram JSON chat history and extracts structured trading signals.
"""
import json
import re
import sys
from datetime import datetime
from pathlib import Path
# Signal patterns - adapt as we see more formats
PATTERNS = {
# #TICKER direction entry SL target leverage balance%
'standard': re.compile(
r'#(\w+)\s+' # ticker
r'(Long|Short)\s+' # direction
r'(?:market\s+entry!?|entry[:\s]+([0-9.]+))\s*' # entry type/price
r'SL[;:\s]+([0-9.]+)\s*' # stop loss
r'(?:Targets?|TP)[;:\s]+([0-9.,\s]+)\s*' # targets (can be multiple)
r'(?:Lev(?:erage)?[:\s]*x?([0-9.]+))?\s*' # leverage (optional)
r'(?:([0-9.]+)%?\s*balance)?', # balance % (optional)
re.IGNORECASE
),
# Simpler: #TICKER Short/Long entry SL targets
'simple': re.compile(
r'#(\w+)\s+(Long|Short)',
re.IGNORECASE
),
}
def parse_signal_text(text):
"""Parse a single message text into structured signal(s)."""
signals = []
# Try to find all ticker mentions
ticker_blocks = re.split(r'(?=#\w+USDT)', text)
for block in ticker_blocks:
if not block.strip():
continue
signal = {}
# Extract ticker
ticker_match = re.search(r'#(\w+)', block)
if not ticker_match:
continue
signal['ticker'] = ticker_match.group(1).upper()
# Extract direction
dir_match = re.search(r'\b(Long|Short)\b', block, re.IGNORECASE)
if not dir_match:
continue
signal['direction'] = dir_match.group(1).lower()
# Extract entry price (or "market")
entry_match = re.search(r'(?:entry|enter)[:\s]*([0-9.]+)', block, re.IGNORECASE)
if entry_match:
signal['entry'] = float(entry_match.group(1))
else:
signal['entry'] = 'market'
# Extract stop loss
sl_match = re.search(r'SL[;:\s]+([0-9.]+)', block, re.IGNORECASE)
if sl_match:
signal['stop_loss'] = float(sl_match.group(1))
# Extract targets (can be multiple, comma or space separated)
tp_match = re.search(r'(?:Targets?|TP)[;:\s]+([0-9.,\s]+)', block, re.IGNORECASE)
if tp_match:
targets_str = tp_match.group(1)
targets = [float(t.strip()) for t in re.findall(r'[0-9.]+', targets_str)]
signal['targets'] = targets
# Extract leverage
        lev_match = re.search(r'Lev(?:erage)?[:\s]*x?([0-9.]+)', block, re.IGNORECASE)
        if lev_match:
            signal['leverage'] = float(lev_match.group(1))
        # Extract balance percentage
        bal_match = re.search(r'([0-9.]+)%?\s*balance', block, re.IGNORECASE)
        if bal_match:
            signal['balance_pct'] = float(bal_match.group(1))
        if signal.get('ticker') and signal.get('direction'):
            signals.append(signal)
    return signals


def parse_telegram_export(json_path):
    """Parse a Telegram JSON export file."""
    with open(json_path, 'r') as f:
        data = json.load(f)
    messages = data.get('messages', [])
    all_signals = []
    for msg in messages:
        if msg.get('type') != 'message':
            continue
        # Get text content (can be string or list of text entities)
        text_parts = msg.get('text', '')
        if isinstance(text_parts, list):
            text = ''.join(
                p if isinstance(p, str) else p.get('text', '')
                for p in text_parts
            )
        else:
            text = text_parts
        if not text or '#' not in text:
            continue
        # Check if it looks like a signal
        if not re.search(r'(Long|Short)', text, re.IGNORECASE):
            continue
        signals = parse_signal_text(text)
        for signal in signals:
            signal['timestamp'] = msg.get('date', '')
            signal['message_id'] = msg.get('id', '')
            signal['raw_text'] = text[:500]
            all_signals.append(signal)
    return all_signals


def parse_forwarded_messages(messages_text):
    """Parse signals from forwarded message text (copy-pasted or forwarded to bot)."""
    signals = parse_signal_text(messages_text)
    return signals


if __name__ == '__main__':
    if len(sys.argv) < 2:
        # Demo with the test signals
        test_text = """#ASTERUSDT Short market entry! SL: 0.6385 Targets: 0.51 Lev x15 1.3% balance
#HYPEUSDT Short market entry! SL; 33.5 Target 25 Lev x12 1.4% balance"""
        signals = parse_signal_text(test_text)
        print(f"Parsed {len(signals)} signals:\n")
        for s in signals:
            print(json.dumps(s, indent=2))
    else:
        json_path = sys.argv[1]
        signals = parse_telegram_export(json_path)
        print(f"Parsed {len(signals)} signals from export\n")
        # Save to output
        out_path = json_path.replace('.json', '_signals.json')
        with open(out_path, 'w') as f:
            json.dump(signals, f, indent=2)
        print(f"Saved to {out_path}")
        # Quick summary
        longs = sum(1 for s in signals if s['direction'] == 'long')
        shorts = sum(1 for s in signals if s['direction'] == 'short')
        print(f"Longs: {longs}, Shorts: {shorts}")
        tickers = set(s['ticker'] for s in signals)
        print(f"Unique tickers: {len(tickers)}")
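The export parser above flattens Telegram's mixed `text` field, which can be a plain string or a list mixing strings with entity dicts. A minimal standalone sketch of that flattening step (the helper name `flatten_text` is ours, not in the repo):

```python
def flatten_text(text_parts):
    """Join Telegram-export text, which is either a str or a list of
    str / {'type': ..., 'text': ...} entity dicts."""
    if isinstance(text_parts, list):
        return ''.join(
            p if isinstance(p, str) else p.get('text', '')
            for p in text_parts
        )
    return text_parts

# A message shaped like one from a Telegram JSON export
msg = {
    "type": "message",
    "text": ["#ASTERUSDT ", {"type": "bold", "text": "Short"}, " market entry!"],
}
print(flatten_text(msg["text"]))  # → #ASTERUSDT Short market entry!
```

Entity dicts carry formatting (bold, hashtag, link); only their `text` payload matters for signal parsing.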

File diff suppressed because one or more lines are too long

View File

@@ -1,5 +1,5 @@
{
"last_check": "2026-02-09T16:58:59.975254+00:00",
"last_check": "2026-02-09T23:25:59.773254+00:00",
"total_tracked": 3100,
"new_this_check": 0
}

View File

@@ -0,0 +1,155 @@
[
{
"name": "guest_id_marketing",
"value": "v1%3A177052493168164632",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "guest_id_ads",
"value": "v1%3A177052493168164632",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "guest_id",
"value": "v1%3A177052493168164632",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "personalization_id",
"value": "\"v1_6O8SSA4FCcIXzFzq4cql3A==\"",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "__cuid",
"value": "7ec0f8364ef9466bb4d5e5398de60a7a",
"domain": ".x.com",
"path": "/",
"secure": false,
"httpOnly": false,
"sameSite": "Lax"
},
{
"name": "guest_id_marketing",
"value": "v1%3A177052493360013497",
"domain": ".twitter.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "guest_id_ads",
"value": "v1%3A177052493360013497",
"domain": ".twitter.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "personalization_id",
"value": "\"v1_0RdWTpuTILka/W8MwiVsGQ==\"",
"domain": ".twitter.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "guest_id",
"value": "v1%3A177052493360013497",
"domain": ".twitter.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "g_state",
"value": "{\"i_l\":0,\"i_ll\":1770524933853,\"i_b\":\"/335bZxZT54Tkc2wThT5DEH5v8hDZyhbe/JOl6uvF+k\",\"i_e\":{\"enable_itp_optimization\":0}}",
"domain": "x.com",
"path": "/",
"secure": false,
"httpOnly": false,
"sameSite": "Lax"
},
{
"name": "kdt",
"value": "Y9jfWROysXsnZyHwlffVbs8jvBJabIN4RGlZYFHP",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": true,
"sameSite": "Lax"
},
{
"name": "auth_token",
"value": "219b71a535b96ef9f978612a48cf81a462643ee3",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": true,
"sameSite": "None"
},
{
"name": "ct0",
"value": "e2c61ad6ce7115f2d8acd2062dc5c9a377140d9b570f871d9b25847f2d7a36fe512a424a359775d73a11a5a0a5154b6623b0021992a2b7f1e094d5ac5ee65cfeaf8ac87de09b7dcfc48f28a5b6dd15dc",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "Lax"
},
{
"name": "twid",
"value": "u%3D741482516",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "None"
},
{
"name": "lang",
"value": "en",
"domain": "x.com",
"path": "/",
"secure": false,
"httpOnly": false,
"sameSite": "Lax"
},
{
"name": "external_referer",
"value": "vC8TI7P7q9UHtLBqrmGBr3bhFoPD7nVN|0|8e8t2xd8A2w%3D",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": false,
"sameSite": "Lax"
},
{
"name": "__cf_bm",
"value": "UjX5M.SqXScrW4zZ_GhiubhCXhv.8SI8uU7MkZCGT24-1770678794.1374662-1.0.1.1-4x.1srI8Lir7aTkBYJxMGMZQ2E3.EZKgF5S_gLeoAQzEUvIFZQTLQNxhFfiiVNNaXbfZ8HgKEPtSTvpaglXpnCo9COtawFeKPtaKmENpRj5V3mP0EOhtt4w_MpLhHekN",
"domain": ".x.com",
"path": "/",
"secure": true,
"httpOnly": true,
"sameSite": "Lax"
}
]
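This cookie dump is in the shape Playwright's `BrowserContext.add_cookies` accepts (`name`, `value`, `domain`, `path`, plus optional `secure`, `httpOnly`, and `sameSite` in `{"Strict", "Lax", "None"}`). A dependency-free sanity check for a dump of this shape (our own illustrative helper, not part of the repo):

```python
REQUIRED = {"name", "value", "domain", "path"}
SAME_SITE = {"Strict", "Lax", "None"}

def validate_cookies(cookies):
    """Return a list of problems found in a Playwright-style cookie dump."""
    problems = []
    for i, c in enumerate(cookies):
        missing = REQUIRED - c.keys()
        if missing:
            problems.append(f"cookie {i}: missing {sorted(missing)}")
        if c.get("sameSite") not in SAME_SITE:
            problems.append(f"cookie {i}: bad sameSite {c.get('sameSite')!r}")
    return problems

cookies = [{"name": "lang", "value": "en", "domain": "x.com",
            "path": "/", "secure": False, "httpOnly": False, "sameSite": "Lax"}]
print(validate_cookies(cookies))  # → []
```

Running this over the dump before calling `add_cookies` catches malformed entries early instead of at browser launch.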

View File

@@ -0,0 +1,249 @@
#!/usr/bin/env python3
"""
X/Twitter Feed Scraper using Playwright
Scrapes specific accounts for trading-related posts.
Uses saved Chrome session cookies for authentication.
"""
import json
import os
import sys
import time
from datetime import datetime, timezone
from pathlib import Path

ACCOUNTS = [
    "browomo", "ArchiveExplorer", "noisyb0y1", "krajekis",
    "Shelpid_WI3M", "polyaboretum", "0xashensoul",
]

TRADING_KEYWORDS = [
    "polymarket", "trade", "profit", "wallet", "arbitrage", "signal",
    "crypto", "bitcoin", "ethereum", "solana", "strategy", "edge",
    "bet", "position", "stock", "market", "pnl", "alpha",
    "$", "usdc", "defi", "token", "copy", "whale", "degen",
    "short", "long", "bullish", "bearish", "portfolio",
]

DATA_DIR = Path(__file__).parent.parent / "data" / "x-feed"
DATA_DIR.mkdir(parents=True, exist_ok=True)
COOKIE_FILE = Path(__file__).parent / "x_cookies.json"
TELEGRAM_BOT_TOKEN = os.environ.get("TELEGRAM_BOT_TOKEN", "")
TELEGRAM_CHAT_ID = os.environ.get("TELEGRAM_CHAT_ID", "6443752046")


def send_telegram(message):
    if not TELEGRAM_BOT_TOKEN:
        print(f"[ALERT] {message}")
        return
    import urllib.request
    url = f"https://api.telegram.org/bot{TELEGRAM_BOT_TOKEN}/sendMessage"
    data = json.dumps({"chat_id": TELEGRAM_CHAT_ID, "text": message, "parse_mode": "HTML"}).encode()
    req = urllib.request.Request(url, data=data, headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=10)
    except Exception as e:
        print(f"Telegram error: {e}")


def save_cookies(context):
    cookies = context.cookies()
    COOKIE_FILE.write_text(json.dumps(cookies, indent=2))
    print(f"Saved {len(cookies)} cookies")


def load_cookies(context):
    if COOKIE_FILE.exists():
        cookies = json.loads(COOKIE_FILE.read_text())
        context.add_cookies(cookies)
        print(f"Loaded {len(cookies)} cookies")
        return True
    return False


def export_cookies_from_chrome():
    """One-time: grab cookies from the running Chrome debug instance."""
    import http.client
    import websocket as ws_mod  # third-party: websocket-client package
    conn = http.client.HTTPConnection("localhost", 9222)
    conn.request("GET", "/json")
    tabs = json.loads(conn.getresponse().read())
    x_tab = None
    for t in tabs:
        if "x.com" in t.get("url", ""):
            x_tab = t
            break
    if not x_tab:
        print("No X tab found in Chrome debug")
        return []
    ws = ws_mod.create_connection(x_tab["webSocketDebuggerUrl"], timeout=10)
    ws.send(json.dumps({"id": 1, "method": "Network.getAllCookies"}))
    result = json.loads(ws.recv())
    all_cookies = result.get("result", {}).get("cookies", [])
    ws.close()
    # Filter for x.com cookies and convert to Playwright format
    x_cookies = []
    for c in all_cookies:
        if "x.com" in c.get("domain", "") or "twitter.com" in c.get("domain", ""):
            x_cookies.append({
                "name": c["name"],
                "value": c["value"],
                "domain": c["domain"],
                "path": c.get("path", "/"),
                "secure": c.get("secure", False),
                "httpOnly": c.get("httpOnly", False),
                "sameSite": c.get("sameSite", "Lax"),
            })
    COOKIE_FILE.write_text(json.dumps(x_cookies, indent=2))
    print(f"Exported {len(x_cookies)} X cookies from Chrome")
    return x_cookies


def scrape_account(page, account, max_scroll=5):
    """Scrape recent posts from a single account."""
    posts = []
    try:
        page.goto(f"https://x.com/{account}", wait_until="networkidle", timeout=15000)
    except Exception:
        # networkidle often times out on X; fall back to domcontentloaded
        try:
            page.goto(f"https://x.com/{account}", wait_until="domcontentloaded", timeout=10000)
            page.wait_for_timeout(3000)
        except Exception as e:
            print(f" Failed to load @{account}: {e}")
            return posts
    seen_texts = set()
    for scroll in range(max_scroll):
        articles = page.query_selector_all("article")
        for article in articles:
            try:
                text = article.inner_text()[:800]
                # Deduplicate on the first 100 characters
                sig = text[:100]
                if sig in seen_texts:
                    continue
                seen_texts.add(sig)
                # Extract links
                links = article.query_selector_all("a")
                urls = [l.get_attribute("href") for l in links if l.get_attribute("href")]
                posts.append({
                    "account": account,
                    "text": text,
                    "urls": urls[:5],
                    "scraped_at": datetime.now(timezone.utc).isoformat(),
                })
            except Exception:
                continue
        # Scroll down
        page.evaluate("window.scrollBy(0, 1500)")
        page.wait_for_timeout(1500)
    return posts


def is_trading_related(text):
    text_lower = text.lower()
    return any(kw in text_lower for kw in TRADING_KEYWORDS)


def main():
    from playwright.sync_api import sync_playwright
    print("=== X Feed Scraper (Playwright) ===")
    print(f"Time: {datetime.now(timezone.utc).strftime('%Y-%m-%d %H:%M UTC')}")
    # Export cookies from Chrome if we don't have them yet
    if not COOKIE_FILE.exists():
        print("No cookies found — exporting from Chrome debug session...")
        export_cookies_from_chrome()
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        context = browser.new_context(
            user_agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36",
            viewport={"width": 1280, "height": 900},
        )
        load_cookies(context)
        page = context.new_page()
        all_posts = []
        trading_posts = []
        for account in ACCOUNTS:
            print(f"\nScraping @{account}...", end=" ", flush=True)
            posts = scrape_account(page, account)
            print(f"{len(posts)} posts")
            for post in posts:
                all_posts.append(post)
                if is_trading_related(post["text"]):
                    trading_posts.append(post)
        browser.close()
    print(f"\n{'='*50}")
    print(f"Total posts: {len(all_posts)}")
    print(f"Trading-related: {len(trading_posts)}")
    # Save results
    timestamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M")
    out_file = DATA_DIR / f"scan-{timestamp}.json"
    out_file.write_text(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "total_posts": len(all_posts),
        "trading_posts": len(trading_posts),
        "posts": trading_posts,
    }, indent=2))
    print(f"Saved to {out_file}")
    # Check for new posts we haven't seen before
    seen_file = DATA_DIR / "seen_posts.json"
    seen = set()
    if seen_file.exists():
        try:
            seen = set(json.loads(seen_file.read_text()))
        except Exception:
            pass
    new_posts = []
    for post in trading_posts:
        sig = post["text"][:150]
        if sig not in seen:
            new_posts.append(post)
            seen.add(sig)
    seen_file.write_text(json.dumps(list(seen)[-5000:]))  # Keep last 5000
    if new_posts:
        print(f"\n🔔 {len(new_posts)} NEW trading posts!")
        for post in new_posts[:5]:
            author = f"@{post['account']}"
            snippet = post["text"][:200].replace("\n", " ")
            print(f"\n {author}: {snippet}")
            # Alert on Telegram
            msg = f"🔍 <b>New from {author}</b>\n\n{snippet[:300]}"
            if post.get("urls"):
                x_urls = [u for u in post["urls"] if "x.com" in u or "twitter.com" in u]
                if x_urls:
                    msg += f"\n\n{x_urls[0]}"
            send_telegram(msg)
    else:
        print("\nNo new trading posts since last scan.")


if __name__ == "__main__":
    main()
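The scraper's cross-run dedup keys each post by the first 150 characters of its text; any post whose prefix was already recorded is skipped. A small sketch of that idea in isolation (the function name `filter_new` is illustrative, not from the repo):

```python
def filter_new(posts, seen):
    """Return posts whose 150-char text prefix hasn't been seen; update seen in place."""
    new_posts = []
    for post in posts:
        sig = post["text"][:150]
        if sig not in seen:
            new_posts.append(post)
            seen.add(sig)
    return new_posts

seen = set()
batch1 = [{"text": "BTC arb at 0.71 combined"}, {"text": "ETH update"}]
batch2 = [{"text": "BTC arb at 0.71 combined"}, {"text": "SOL news"}]
print(len(filter_new(batch1, seen)))  # → 2
print(len(filter_new(batch2, seen)))  # → 1 (the BTC post repeats)
```

A prefix signature is cheap and stable across scrapes, at the cost of occasionally collapsing distinct posts that share their first 150 characters.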

View File

@@ -1,18 +1,10 @@
{
"cash": 53477.43000000002,
"cash": 60255.30020874026,
"positions": {
"DUOL": {
"shares": 57,
"avg_cost": 116.35,
"current_price": 116.35,
"entry_date": "2026-02-09T10:55:58.243598",
"entry_reason": "GARP signal: PE=14.65, FwdPE=14.71, RevGr=41.1%, EPSGr=1114.3%, RSI=23.44",
"trailing_stop": 104.715
},
"ALLY": {
"shares": 156,
"avg_cost": 42.65,
"current_price": 42.65,
"current_price": 42.040000915527344,
"entry_date": "2026-02-09T10:55:58.244488",
"entry_reason": "GARP signal: PE=18.0, FwdPE=6.76, RevGr=12.0%, EPSGr=265.4%, RSI=53.23",
"trailing_stop": 38.385
@@ -20,7 +12,7 @@
"JHG": {
"shares": 138,
"avg_cost": 48.21,
"current_price": 48.21,
"current_price": 48.20000076293945,
"entry_date": "2026-02-09T10:55:58.245351",
"entry_reason": "GARP signal: PE=9.22, FwdPE=9.96, RevGr=61.3%, EPSGr=243.6%, RSI=68.71",
"trailing_stop": 43.389
@@ -28,31 +20,31 @@
"INCY": {
"shares": 61,
"avg_cost": 108.69,
"current_price": 108.69,
"current_price": 109.02999877929688,
"entry_date": "2026-02-09T10:55:58.246289",
"entry_reason": "GARP signal: PE=18.42, FwdPE=13.76, RevGr=20.0%, EPSGr=290.7%, RSI=63.48",
"trailing_stop": 97.821
"trailing_stop": 98.12699890136719
},
"PINS": {
"shares": 332,
"avg_cost": 20.06,
"current_price": 20.06,
"current_price": 20.139999389648438,
"entry_date": "2026-02-09T10:55:58.247262",
"entry_reason": "GARP signal: PE=7.04, FwdPE=10.61, RevGr=16.8%, EPSGr=225.0%, RSI=19.14",
"trailing_stop": 18.054
"trailing_stop": 18.125999450683594
},
"EXEL": {
"shares": 152,
"avg_cost": 43.8,
"current_price": 43.8,
"current_price": 43.95000076293945,
"entry_date": "2026-02-09T10:55:58.252764",
"entry_reason": "GARP signal: PE=18.4, FwdPE=12.76, RevGr=10.8%, EPSGr=72.5%, RSI=50.12",
"trailing_stop": 39.42
"trailing_stop": 39.555000686645506
},
"CART": {
"shares": 187,
"avg_cost": 35.49,
"current_price": 35.49,
"current_price": 35.150001525878906,
"entry_date": "2026-02-09T10:55:58.254418",
"entry_reason": "GARP signal: PE=19.5, FwdPE=9.05, RevGr=10.2%, EPSGr=21.1%, RSI=37.75",
"trailing_stop": 31.941000000000003

View File

@@ -1,10 +1,10 @@
[
{
"date": "2026-02-09",
"total_value": 100000.0,
"total_pnl": 0.0,
"pnl_pct": 0.0,
"cash": 53477.43,
"num_positions": 7
"total_value": 100055.9,
"total_pnl": 55.9,
"pnl_pct": 0.06,
"cash": 60255.3,
"num_positions": 6
}
]

View File

@@ -61,5 +61,16 @@
"cost": 6636.63,
"reason": "GARP signal: PE=19.5, FwdPE=9.05, RevGr=10.2%, EPSGr=21.1%, RSI=37.75",
"timestamp": "2026-02-09T10:55:58.254721"
},
{
"action": "SELL",
"ticker": "DUOL",
"shares": 57,
"price": 118.91000366210938,
"proceeds": 6777.87,
"realized_pnl": 145.92,
"entry_price": 116.35,
"reason": "No longer passes GARP filter",
"timestamp": "2026-02-09T15:36:18.884898"
}
]

View File

@@ -194,5 +194,117 @@
"ticker": "WTFC",
"reason": "RSI too high (72.6 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:18.885237",
"action": "SELL",
"ticker": "DUOL",
"reason": "No longer passes GARP filter",
"details": {
"success": true,
"ticker": "DUOL",
"shares": 57,
"price": 118.91000366210938,
"proceeds": 6777.87,
"realized_pnl": 145.92
}
},
{
"timestamp": "2026-02-09T15:36:19.302964",
"action": "SKIP",
"ticker": "VLY",
"reason": "RSI too high (78.3 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.303492",
"action": "SKIP",
"ticker": "FHN",
"reason": "RSI too high (72.4 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.304721",
"action": "SKIP",
"ticker": "FNB",
"reason": "RSI too high (71.0 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.305710",
"action": "SKIP",
"ticker": "SSB",
"reason": "RSI too high (89.0 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.306687",
"action": "SKIP",
"ticker": "WBS",
"reason": "RSI too high (82.0 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.307754",
"action": "SKIP",
"ticker": "ONB",
"reason": "RSI too high (77.6 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.308706",
"action": "SKIP",
"ticker": "WAL",
"reason": "RSI too high (71.7 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.309624",
"action": "SKIP",
"ticker": "ZION",
"reason": "RSI too high (73.3 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.310657",
"action": "SKIP",
"ticker": "CFG",
"reason": "RSI too high (78.5 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.311641",
"action": "SKIP",
"ticker": "UBSI",
"reason": "RSI too high (77.5 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.312722",
"action": "SKIP",
"ticker": "EWBC",
"reason": "RSI too high (78.6 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.313689",
"action": "SKIP",
"ticker": "FITB",
"reason": "Too close to 52wk high (1.9% away)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.314689",
"action": "SKIP",
"ticker": "BAC",
"reason": "RSI too high (78.1 > 70)",
"details": {}
},
{
"timestamp": "2026-02-09T15:36:19.315714",
"action": "SKIP",
"ticker": "WTFC",
"reason": "RSI too high (70.2 > 70)",
"details": {}
}
]

View File

@@ -1,356 +1,338 @@
{
"date": "2026-02-09",
"timestamp": "2026-02-09T10:55:58.242176",
"timestamp": "2026-02-09T15:36:18.290971",
"total_scanned": 902,
"candidates_found": 21,
"candidates_found": 20,
"candidates": [
{
"ticker": "DUOL",
"price": 116.35,
"market_cap": 5378552832,
"market_cap_b": 5.4,
"trailing_pe": 14.65,
"forward_pe": 14.71,
"peg_ratio": null,
"revenue_growth": 41.1,
"earnings_growth": 1114.3,
"roe": 36.2,
"quick_ratio": 2.6,
"debt_to_equity": 7.4,
"rsi": 23.44,
"week52_high": 544.93,
"pct_from_52wk_high": 78.6,
"score": -100.83
},
{
"ticker": "ALLY",
"price": 42.65,
"market_cap": 13158768640,
"market_cap_b": 13.2,
"trailing_pe": 18.0,
"forward_pe": 6.76,
"price": 42.04,
"market_cap": 12969046016,
"market_cap_b": 13.0,
"trailing_pe": 17.74,
"forward_pe": 6.66,
"peg_ratio": null,
"revenue_growth": 12.0,
"earnings_growth": 265.4,
"roe": 5.8,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 53.23,
"rsi": 49.52,
"week52_high": 47.27,
"pct_from_52wk_high": 9.8,
"score": -20.98
"pct_from_52wk_high": 11.1,
"score": -21.08
},
{
"ticker": "JHG",
"price": 48.21,
"market_cap": 7447323136,
"price": 48.2,
"market_cap": 7445763584,
"market_cap_b": 7.4,
"trailing_pe": 9.22,
"forward_pe": 9.96,
"forward_pe": 9.95,
"peg_ratio": null,
"revenue_growth": 61.3,
"earnings_growth": 243.6,
"roe": 16.2,
"quick_ratio": 69.46,
"debt_to_equity": 6.5,
"rsi": 68.71,
"rsi": 68.18,
"week52_high": 49.42,
"pct_from_52wk_high": 2.4,
"score": -20.529999999999998
"pct_from_52wk_high": 2.5,
"score": -20.54
},
{
"ticker": "INCY",
"price": 108.69,
"market_cap": 21338314752,
"market_cap_b": 21.3,
"trailing_pe": 18.42,
"forward_pe": 13.76,
"price": 109.03,
"market_cap": 21405063168,
"market_cap_b": 21.4,
"trailing_pe": 18.48,
"forward_pe": 13.81,
"peg_ratio": null,
"revenue_growth": 20.0,
"earnings_growth": 290.7,
"roe": 30.4,
"quick_ratio": 2.86,
"debt_to_equity": 0.9,
"rsi": 63.48,
"rsi": 64.03,
"week52_high": 112.29,
"pct_from_52wk_high": 3.2,
"score": -17.310000000000002
"pct_from_52wk_high": 2.9,
"score": -17.259999999999998
},
{
"ticker": "PINS",
"price": 20.06,
"market_cap": 13635989504,
"market_cap_b": 13.6,
"trailing_pe": 7.04,
"forward_pe": 10.61,
"price": 20.14,
"market_cap": 13693783040,
"market_cap_b": 13.7,
"trailing_pe": 7.07,
"forward_pe": 10.66,
"peg_ratio": null,
"revenue_growth": 16.8,
"earnings_growth": 225.0,
"roe": 51.5,
"quick_ratio": 8.14,
"debt_to_equity": 4.3,
"rsi": 19.14,
"rsi": 19.93,
"week52_high": 40.38,
"pct_from_52wk_high": 50.3,
"score": -13.57
"pct_from_52wk_high": 50.1,
"score": -13.52
},
{
"ticker": "VLY",
"price": 13.72,
"market_cap": 7647972352,
"price": 13.7,
"market_cap": 7639607296,
"market_cap_b": 7.6,
"trailing_pe": 13.58,
"forward_pe": 9.2,
"trailing_pe": 13.56,
"forward_pe": 9.19,
"peg_ratio": null,
"revenue_growth": 38.3,
"earnings_growth": 66.3,
"roe": 7.8,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 78.6,
"rsi": 78.34,
"week52_high": 13.79,
"pct_from_52wk_high": 0.5,
"score": -1.2600000000000002
"pct_from_52wk_high": 0.7,
"score": -1.27
},
{
"ticker": "FHN",
"price": 26.33,
"market_cap": 12967198720,
"market_cap_b": 13.0,
"trailing_pe": 14.08,
"forward_pe": 11.23,
"price": 26.03,
"market_cap": 12817018880,
"market_cap_b": 12.8,
"trailing_pe": 13.92,
"forward_pe": 11.1,
"peg_ratio": null,
"revenue_growth": 23.7,
"earnings_growth": 74.9,
"roe": 10.9,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 76.1,
"rsi": 72.36,
"week52_high": 26.56,
"pct_from_52wk_high": 0.8,
"score": 1.37
"pct_from_52wk_high": 2.0,
"score": 1.2399999999999993
},
{
"ticker": "FNB",
"price": 19.05,
"market_cap": 6822501376,
"price": 18.92,
"market_cap": 6775944192,
"market_cap_b": 6.8,
"trailing_pe": 12.21,
"forward_pe": 9.73,
"trailing_pe": 12.13,
"forward_pe": 9.67,
"peg_ratio": null,
"revenue_growth": 26.4,
"earnings_growth": 56.5,
"roe": 8.7,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 71.92,
"rsi": 70.99,
"week52_high": 19.14,
"pct_from_52wk_high": 0.4,
"score": 1.4400000000000004
"pct_from_52wk_high": 1.1,
"score": 1.38
},
{
"ticker": "SSB",
"price": 107.67,
"market_cap": 10821986304,
"price": 107.18,
"market_cap": 10773236736,
"market_cap_b": 10.8,
"trailing_pe": 13.68,
"forward_pe": 10.18,
"trailing_pe": 13.62,
"forward_pe": 10.13,
"peg_ratio": null,
"revenue_growth": 53.2,
"earnings_growth": 30.9,
"roe": 10.7,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 92.25,
"rsi": 89.03,
"week52_high": 108.46,
"pct_from_52wk_high": 0.7,
"score": 1.7699999999999996
"pct_from_52wk_high": 1.2,
"score": 1.7200000000000006
},
{
"ticker": "WBS",
"price": 73.28,
"market_cap": 11818547200,
"price": 73.21,
"market_cap": 11808063488,
"market_cap_b": 11.8,
"trailing_pe": 12.42,
"forward_pe": 9.79,
"trailing_pe": 12.41,
"forward_pe": 9.78,
"peg_ratio": null,
"revenue_growth": 18.2,
"earnings_growth": 53.4,
"roe": 10.8,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 82.13,
"rsi": 82.05,
"week52_high": 73.5,
"pct_from_52wk_high": 0.3,
"score": 2.6299999999999994
},
{
"ticker": "WAL",
"price": 96.21,
"market_cap": 10588263424,
"market_cap_b": 10.6,
"trailing_pe": 11.02,
"forward_pe": 8.09,
"peg_ratio": null,
"revenue_growth": 16.6,
"earnings_growth": 32.9,
"roe": 13.5,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 71.81,
"week52_high": 96.94,
"pct_from_52wk_high": 0.8,
"score": 3.1399999999999997
"pct_from_52wk_high": 0.4,
"score": 2.6199999999999997
},
{
"ticker": "ONB",
"price": 25.93,
"market_cap": 10132744192,
"market_cap_b": 10.1,
"trailing_pe": 14.49,
"forward_pe": 9.04,
"price": 25.67,
"market_cap": 10031142912,
"market_cap_b": 10.0,
"trailing_pe": 14.34,
"forward_pe": 8.95,
"peg_ratio": null,
"revenue_growth": 41.4,
"earnings_growth": 17.2,
"roe": 9.0,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 81.24,
"rsi": 77.64,
"week52_high": 26.17,
"pct_from_52wk_high": 1.9,
"score": 3.09
},
{
"ticker": "WAL",
"price": 96.08,
"market_cap": 10573956096,
"market_cap_b": 10.6,
"trailing_pe": 11.01,
"forward_pe": 8.08,
"peg_ratio": null,
"revenue_growth": 16.6,
"earnings_growth": 32.9,
"roe": 13.5,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 71.66,
"week52_high": 96.99,
"pct_from_52wk_high": 0.9,
"score": 3.1799999999999997
"score": 3.13
},
{
"ticker": "EXEL",
"price": 43.8,
"market_cap": 11791070208,
"price": 43.95,
"market_cap": 11831451648,
"market_cap_b": 11.8,
"trailing_pe": 18.4,
"forward_pe": 12.76,
"trailing_pe": 18.47,
"forward_pe": 12.8,
"peg_ratio": null,
"revenue_growth": 10.8,
"earnings_growth": 72.5,
"roe": 30.6,
"quick_ratio": 3.5,
"debt_to_equity": 8.2,
"rsi": 50.12,
"rsi": 51.02,
"week52_high": 49.62,
"pct_from_52wk_high": 11.7,
"score": 4.43
"pct_from_52wk_high": 11.4,
"score": 4.470000000000001
},
{
"ticker": "ZION",
"price": 65.29,
"market_cap": 9640263680,
"price": 65.16,
"market_cap": 9621069824,
"market_cap_b": 9.6,
"trailing_pe": 10.86,
"forward_pe": 9.99,
"trailing_pe": 10.84,
"forward_pe": 9.97,
"peg_ratio": null,
"revenue_growth": 13.6,
"earnings_growth": 31.4,
"roe": 13.5,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 74.03,
"rsi": 73.29,
"week52_high": 66.18,
"pct_from_52wk_high": 1.3,
"score": 5.49
"pct_from_52wk_high": 1.5,
"score": 5.470000000000001
},
{
"ticker": "CART",
"price": 35.49,
"market_cap": 9349425152,
"price": 35.15,
"market_cap": 9259855872,
"market_cap_b": 9.3,
"trailing_pe": 19.5,
"forward_pe": 9.05,
"trailing_pe": 19.31,
"forward_pe": 8.97,
"peg_ratio": null,
"revenue_growth": 10.2,
"earnings_growth": 21.1,
"roe": 15.3,
"quick_ratio": 3.33,
"debt_to_equity": 1.0,
"rsi": 37.75,
"rsi": 36.01,
"week52_high": 53.5,
"pct_from_52wk_high": 33.7,
"score": 5.92
"pct_from_52wk_high": 34.3,
"score": 5.84
},
{
"ticker": "CFG",
"price": 68.17,
"market_cap": 29278072832,
"market_cap_b": 29.3,
"trailing_pe": 17.66,
"forward_pe": 10.82,
"price": 67.7,
"market_cap": 29076213760,
"market_cap_b": 29.1,
"trailing_pe": 17.54,
"forward_pe": 10.75,
"peg_ratio": null,
"revenue_growth": 10.7,
"earnings_growth": 35.9,
"roe": 7.2,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 80.86,
"rsi": 78.47,
"week52_high": 68.65,
"pct_from_52wk_high": 0.7,
"score": 6.16
"pct_from_52wk_high": 1.4,
"score": 6.09
},
{
"ticker": "UBSI",
"price": 45.32,
"market_cap": 6316939264,
"price": 45.06,
"market_cap": 6280699392,
"market_cap_b": 6.3,
"trailing_pe": 13.86,
"forward_pe": 12.03,
"trailing_pe": 13.78,
"forward_pe": 11.96,
"peg_ratio": null,
"revenue_growth": 22.1,
"earnings_growth": 32.1,
"roe": 8.9,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 80.0,
"rsi": 77.51,
"week52_high": 45.93,
"pct_from_52wk_high": 1.3,
"score": 6.61
"pct_from_52wk_high": 1.9,
"score": 6.54
},
{
"ticker": "EWBC",
"price": 123.19,
"market_cap": 16949170176,
"price": 122.56,
"market_cap": 16862490624,
"market_cap_b": 16.9,
"trailing_pe": 12.94,
"forward_pe": 11.24,
"trailing_pe": 12.87,
"forward_pe": 11.18,
"peg_ratio": null,
"revenue_growth": 21.6,
"earnings_growth": 21.3,
"roe": 15.9,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 79.27,
"rsi": 78.61,
"week52_high": 123.82,
"pct_from_52wk_high": 0.5,
"score": 6.949999999999999
"pct_from_52wk_high": 1.0,
"score": 6.890000000000001
},
{
"ticker": "FITB",
"price": 54.38,
"market_cap": 48944635904,
"price": 54.33,
"market_cap": 48899633152,
"market_cap_b": 48.9,
"trailing_pe": 15.41,
"forward_pe": 11.09,
"trailing_pe": 15.39,
"forward_pe": 11.08,
"peg_ratio": null,
"revenue_growth": 11.5,
"earnings_growth": 20.8,
"roe": 12.2,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 66.02,
"rsi": 65.77,
"week52_high": 55.36,
"pct_from_52wk_high": 1.8,
"score": 7.859999999999999
"pct_from_52wk_high": 1.9,
"score": 7.85
},
{
"ticker": "BAC",
"price": 56.43,
"market_cap": 412079849472,
"market_cap_b": 412.1,
"price": 56.41,
"market_cap": 411933769728,
"market_cap_b": 411.9,
"trailing_pe": 14.81,
"forward_pe": 11.38,
"peg_ratio": null,
@@ -359,28 +341,28 @@
"roe": 10.2,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 78.3,
"rsi": 78.1,
"week52_high": 57.55,
"pct_from_52wk_high": 1.9,
"pct_from_52wk_high": 2.0,
"score": 7.970000000000001
},
{
"ticker": "WTFC",
"price": 159.71,
"market_cap": 10696563712,
"market_cap_b": 10.7,
"trailing_pe": 14.0,
"forward_pe": 11.91,
"price": 158.57,
"market_cap": 10620212224,
"market_cap_b": 10.6,
"trailing_pe": 13.9,
"forward_pe": 11.82,
"peg_ratio": null,
"revenue_growth": 10.5,
"earnings_growth": 19.4,
"roe": 12.1,
"quick_ratio": null,
"debt_to_equity": null,
"rsi": 72.56,
"rsi": 70.23,
"week52_high": 162.96,
"pct_from_52wk_high": 2.0,
"score": 8.92
"pct_from_52wk_high": 2.7,
"score": 8.83
}
]
}