Full sync - all projects, memory, configs

data/tasks/market-data-service.md
# Task: Unified Market Data Service + Redis Message Log

**Priority:** HIGH
**Assigned:** Glitch (after trader-dashboard-controls completes)
**Status:** Queued
**Date:** 2026-03-01
**Depends on:** trader-dashboard-controls (in progress)
## Problem

Two services independently fetch the same OHLCV data from Binance.us:

1. **Dashboard scanner** (`lib/server/scanner.ts`) — 29 coins × 1h × every 30s
2. **TA Service** (`ta_service.py`) — 29 coins × 3 timeframes (5m/1h/4h) × own cycle

This doubles API calls, wastes bandwidth, and increases rate-limit risk.
## Solution

### New Service: `coinex-market-data`

A single long-running service that:

1. Fetches ALL OHLCV data from `api.binance.us` (29 coins × 3 timeframes)
2. Caches everything in Redis with TTLs
3. Publishes updates to Redis pub/sub channels on each refresh
4. Exposes a health endpoint

**No other service should hit Binance directly after this.**
### Redis Schema

```
# Cached OHLCV candle data
market:ohlcv:{symbol}:{timeframe} → JSON array of candles (TTL: varies by timeframe)
  - 5m: TTL 120s (refresh every 60s)
  - 1h: TTL 300s (refresh every 120s)
  - 4h: TTL 600s (refresh every 300s)

# Latest price/ticker for quick reads
market:price:{symbol} → JSON { price, change24h, volume, timestamp } (TTL: 60s)

# Pub/sub channels for real-time consumers
channel: market:update:{timeframe} → published after each timeframe refresh cycle
channel: market:update:all → published after a complete scan cycle

# Message log (see below)
redis:message_log → Redis Stream (XADD) of all messages passing through Redis
```
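The consumer-side read against this schema can be sketched as follows. This is a minimal sketch: the `BTCUSD` symbol, the candle array layout, and the helper names are illustrative assumptions, not part of the spec; in production `client` would be a `redis.Redis(decode_responses=True)` instance.

```python
import json

def ohlcv_key(symbol: str, timeframe: str) -> str:
    # Key layout from the schema above.
    return f"market:ohlcv:{symbol}:{timeframe}"

def get_candles(client, symbol: str, timeframe: str):
    # client needs only .get(key) -> str | None; redis.Redis with
    # decode_responses=True satisfies this. None signals a cache miss
    # (TTL expired) — consumers wait for the next update, never call Binance.
    raw = client.get(ohlcv_key(symbol, timeframe))
    return json.loads(raw) if raw is not None else None

# Illustration with a plain dict standing in for Redis:
cache = {ohlcv_key("BTCUSD", "1h"): json.dumps(
    [[1709280000000, 62000.0, 62100.0, 61900.0, 62050.0, 12.3]])}
candles = get_candles(cache, "BTCUSD", "1h")
```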
### Refresh Cycle

```
Every 60s:  Fetch 5m candles for all 29 coins → cache → publish market:update:5m
Every 120s: Fetch 1h candles for all 29 coins → cache → publish market:update:1h
Every 300s: Fetch 4h candles for all 29 coins → cache → publish market:update:4h
```
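A minimal asyncio sketch of these loops, using the refresh periods stated in the Redis schema. The `fetch`/`cache`/`publish` callables and the two-symbol `COINS` list are hypothetical placeholders for the real implementations.

```python
import asyncio

COINS = ["BTCUSD", "ETHUSD"]                  # placeholder; real list has 29 coins
INTERVALS = {"5m": 60, "1h": 120, "4h": 300}  # timeframe -> refresh period (s)

async def refresh_once(timeframe, fetch, cache, publish):
    # One pass of a cycle: fetch every coin, cache it, then notify consumers.
    for symbol in COINS:
        candles = await fetch(symbol, timeframe)
        await cache(symbol, timeframe, candles)
    await publish(f"market:update:{timeframe}")

async def refresh_loop(timeframe, interval, fetch, cache, publish):
    # Each timeframe runs its own independent loop.
    while True:
        await refresh_once(timeframe, fetch, cache, publish)
        await asyncio.sleep(interval)

# Service entry point would gather one loop per timeframe:
#   await asyncio.gather(*(refresh_loop(tf, s, fetch, cache, publish)
#                          for tf, s in INTERVALS.items()))
```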
### Consumer Changes

**Dashboard scanner (`scanner.ts`):**
- Remove ALL direct Binance API calls
- Read from `market:ohlcv:{symbol}:1h` Redis keys instead
- Subscribe to the `market:update:1h` channel to trigger recalculation
- Calculate RSI/VWAP/BB from cached candles (same logic, different data source)

**TA Service (`ta_service.py`):**
- Remove ALL direct Binance API calls
- Read from `market:ohlcv:{symbol}:{timeframe}` Redis keys
- Subscribe to `market:update:{timeframe}` channels to trigger indicator calculation
- Keep all indicator logic (EMA ribbons, TTM Squeeze, Stoch RSI) unchanged
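The subscribe-and-recompute trigger for the TA service might look like this. A sketch assuming redis-py's pubsub interface, where `listen()` yields dicts with `type` and `channel` keys; `recompute` is a hypothetical stand-in for the existing indicator pipeline.

```python
def timeframe_from_channel(channel: str) -> str:
    # "market:update:1h" -> "1h"
    return channel.rsplit(":", 1)[-1]

def run_trigger_loop(pubsub, recompute) -> None:
    # pubsub: e.g. redis.Redis(...).pubsub(), already subscribed to the
    # three market:update:{timeframe} channels.
    for message in pubsub.listen():
        if message.get("type") != "message":
            continue  # skip subscribe confirmations
        recompute(timeframe_from_channel(message["channel"]))
```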
### Redis Message Log

Use **Redis Streams** (`XADD`/`XRANGE`/`XLEN`) to log ALL messages passing through Redis:

```
# Every publish, cache write, and signal gets logged
Stream key: redis:message_log

Fields per entry:
- type: "publish" | "cache_write" | "cache_read" | "signal" | "trade" | "error"
- channel: the pub/sub channel or key involved
- source: "market-data" | "ta-service" | "dashboard" | "trader-bot"
- summary: human-readable one-liner
- payload_size: bytes
- timestamp: ISO string
```

The market data service wraps Redis operations to auto-log:

- Every `PUBLISH` → logged with channel + payload size
- Every `SETEX` (cache write) → logged with key + TTL
- Every signal published by the TA service → logged
- Every trade action by the trader bot → logged
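A sketch of that wrapper, assuming redis-py's `publish`/`setex`/`xadd` methods; the `LoggedRedis` name and the summary strings are illustrative, and the `MAXLEN ~ 10000` cap described below is applied on every write.

```python
from datetime import datetime, timezone

LOG_STREAM = "redis:message_log"

class LoggedRedis:
    """Mirror every publish/cache write into the message-log stream.
    `client` is a redis-py-style client exposing publish/setex/xadd."""

    def __init__(self, client, source: str):
        self.client = client
        self.source = source

    def _log(self, type_: str, channel: str, summary: str, payload_size: int):
        self.client.xadd(
            LOG_STREAM,
            {
                "type": type_,
                "channel": channel,
                "source": self.source,
                "summary": summary,
                "payload_size": payload_size,
                "timestamp": datetime.now(timezone.utc).isoformat(),
            },
            maxlen=10000,
            approximate=True,  # XADD ... MAXLEN ~ 10000
        )

    def publish(self, channel: str, payload: str):
        self.client.publish(channel, payload)
        self._log("publish", channel, f"publish to {channel}", len(payload))

    def setex(self, key: str, ttl: int, value: str):
        self.client.setex(key, ttl, value)
        self._log("cache_write", key, f"SETEX {key} ttl={ttl}", len(value))
```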
**Dashboard API:**
- `GET /api/redis/log?limit=100&type=signal&source=ta-service` — filterable log viewer
- Displayed in the Status page under a new "Redis Activity" section

Stream is capped at 10,000 entries (`XADD ... MAXLEN ~ 10000`) to prevent unbounded growth.
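The endpoint's filter logic can be sketched as follows. Assumes entries come from an `XREVRANGE` read in redis-py's shape — a list of `(id, {field: value})` tuples, newest first; `filter_log` and the sample data are illustrative.

```python
def filter_log(entries, limit=100, type_=None, source=None):
    # Keep newest-first order; apply optional type/source filters, then cap.
    out = []
    for entry_id, fields in entries:
        if type_ and fields.get("type") != type_:
            continue
        if source and fields.get("source") != source:
            continue
        out.append({"id": entry_id, **fields})
        if len(out) >= limit:
            break
    return out

# Production read would be: r.xrevrange("redis:message_log", count=500)
sample = [
    ("2-0", {"type": "signal", "source": "ta-service", "summary": "RSI cross"}),
    ("1-0", {"type": "publish", "source": "market-data", "summary": "1h update"}),
]
rows = filter_log(sample, type_="signal", source="ta-service")
```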
## Architecture

```
      ┌─────────────────────┐
      │   Binance.us API    │
      └─────────┬───────────┘
                │ (ONLY connection)
      ┌─────────▼───────────┐
      │ Market Data Service │ ← NEW (Python, systemd)
      │  Port 8895 /health  │
      └─────────┬───────────┘
                │ SETEX + PUBLISH + XADD
      ┌─────────▼───────────┐
      │        Redis        │
      │  OHLCV cache        │
      │  Pub/sub channels   │
      │  Message stream log │
      └──┬──────┬────────┬──┘
         │      │        │
┌────────▼┐ ┌───▼─────┐ ┌▼──────────┐
│Dashboard│ │TA Svc   │ │Trader Bot │
│Scanner  │ │(signals)│ │(trades)   │
│(scores) │ │         │ │           │
└─────────┘ └─────────┘ └───────────┘
```
## Tech Stack

- **Python** (asyncio + aiohttp + redis-py)
- **FastAPI** health endpoint on port 8895
- **Systemd** user service: `coinex-market-data.service`
- **Redis Streams** for message logging
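The health endpoint could be as small as this. The payload fields (`uptime_s`, `last_refresh`) are assumptions about what is useful to report, not part of the spec; the FastAPI wiring is shown in comments so the testable part stays pure.

```python
import time

START = time.monotonic()

def health_payload(last_refresh: dict) -> dict:
    # last_refresh: timeframe -> unix timestamp of the last successful cycle.
    return {
        "status": "ok",
        "uptime_s": round(time.monotonic() - START, 1),
        "last_refresh": last_refresh,
    }

# FastAPI wiring (run with: uvicorn service:app --port 8895):
#   from fastapi import FastAPI
#   app = FastAPI()
#
#   @app.get("/health")
#   def health():
#       return health_payload(state.last_refresh)
```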
## Files

- New service: `~/.openclaw/workspace/projects/coinex-market-data/`
- Dashboard scanner: `~/.openclaw/workspace/projects/coinex-dashboard/lib/server/scanner.ts`
- TA service: `~/.openclaw/workspace-glitch/projects/coinex-ta-service/ta_service.py`
- Redis: 127.0.0.1:6379
## Git Repository

- **Create private Gitea repo FIRST**: `https://git.letsgetnashty.com/case/coinex-market-data`
- Git credentials: `case:Gh0st%21nTh3Mach1n3` (URL-encoded in remote URL)
- Git config: user `Case`, email `case-lgn@protonmail.com`
- Push initial commit before starting development
- Commit frequently throughout build
## Constraints

- `api.binance.us` ONLY (not `api.binance.com` — 451 geo-block)
- Respect Binance rate limits (1200 requests/min)
- All existing indicator/scoring logic untouched
- Dashboard must work identically from the user's perspective
- Include unit tests
- Context7 mandatory for library docs
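As a sanity check against that cap, the request budget (one request per coin per timeframe per cycle, using the refresh periods from the Redis schema) works out to well under the limit:

```python
# Back-of-envelope check that the refresh cadence stays far below the
# 1200 requests/min Binance ceiling.
COINS = 29
CADENCE = {"5m": 60, "1h": 120, "4h": 300}  # refresh period in seconds

req_per_min = sum(COINS * (60 / period) for period in CADENCE.values())
print(round(req_per_min, 1))  # ≈ 49.3 requests/min
```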
## Definition of Done

- [ ] Market data service running, fetching all 29 coins × 3 timeframes
- [ ] All OHLCV data cached in Redis with appropriate TTLs
- [ ] Dashboard scanner reads from Redis (zero Binance calls)
- [ ] TA service reads from Redis (zero Binance calls)
- [ ] Redis message log capturing all pub/sub + cache activity
- [ ] `/api/redis/log` endpoint on dashboard with filtering
- [ ] Redis Activity section on Status page
- [ ] Health endpoint at `:8895/health`
- [ ] Systemd service created and running
- [ ] Gitea repo created and pushed
- [ ] Unit tests passing
- [ ] All 3 services confirmed working together end-to-end