Context
Situation — I wanted a playground to prove I could ship real-time collaboration surfaces without a game engine. The experiment: a blitz version of Connect-6 with timers, multi-move turns, and built-in anti-cheat. Task — Architect an authoritative server, a deterministic client, and a chaos-testing harness that hold up under packet loss and rage quits.
Threats
- Race conditions when both players emit moves simultaneously.
- Memory leaks from orphan sockets when players drop mid-match.
- Cheating attempts via tampered payloads or client replay scripts.
- Win detection on a 15×15 board could become CPU-heavy if implemented naively.
Approach
- Modeled every game event (create, join, move, timer, surrender) as explicit TypeScript contracts shared by client + server.
- Embedded an authoritative rules engine on the server that validates move order, move count, and per-turn time budgets before broadcasting state.
- Adopted Socket.IO rooms per match with Redis pub/sub backing so I could scale horizontally later.
- Added heartbeat + exponential backoff reconnection to recycle ghost games and keep metrics honest.
- Optimized win detection with precomputed direction vectors scanned outward from the last-placed stone; worst-case evaluation stays under 5 ms on the 225-cell board.
- Instrumented structured logs and Grafana dashboards to replay matches and detect anomalies.
- Deployed split frontend/backend (Vercel + Render) to decouple scaling planes.
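The shared event contracts in the first bullet can be sketched as a discriminated union with a runtime guard at the server boundary. This is a minimal illustration; names like `MoveEvent` and `isMoveEvent` are assumptions, not the project's actual identifiers.

```typescript
// Illustrative shapes for the create/join/move/timer/surrender events
// described above. Field names are hypothetical.
type CreateEvent = { kind: "create"; matchId: string; playerId: string };
type JoinEvent = { kind: "join"; matchId: string; playerId: string };
type MoveEvent = {
  kind: "move";
  matchId: string;
  playerId: string;
  // Connect-6 turns place up to two stones, so a move carries 1–2 cells.
  cells: Array<{ row: number; col: number }>;
  seq: number; // monotonically increasing, lets the server reject replays
};
type TimerEvent = { kind: "timer"; matchId: string; remainingMs: number };
type SurrenderEvent = { kind: "surrender"; matchId: string; playerId: string };

type GameEvent =
  | CreateEvent
  | JoinEvent
  | MoveEvent
  | TimerEvent
  | SurrenderEvent;

// Runtime guard so the server can reject tampered payloads at the boundary,
// rather than trusting the client's TypeScript types.
function isMoveEvent(e: GameEvent): e is MoveEvent {
  return (
    e.kind === "move" &&
    Array.isArray(e.cells) &&
    e.cells.length >= 1 &&
    e.cells.length <= 2
  );
}
```

Sharing the union between client and server means a payload the compiler rejects on the client is also rejected at runtime on the server.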
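The authoritative checks in the second bullet (move order, move count, per-turn time budget) reduce to a pure validation function the server runs before broadcasting. The real rules engine's shape isn't shown here, so `MatchState` and `validateMove` below are assumptions sketching the idea.

```typescript
// Hypothetical server-side match state; the actual engine's fields may differ.
interface MatchState {
  nextPlayer: "black" | "white";
  stonesThisTurn: number; // Connect-6: black's opening turn places 1, later turns 2
  turnDeadline: number;   // epoch ms when the current turn's time budget expires
  board: (null | "black" | "white")[][];
}

type Rejection = "out_of_turn" | "too_many_stones" | "cell_occupied" | "time_expired";

// Pure function: given current state, a proposed move, and the server clock,
// return "ok" or the first rule the move violates.
function validateMove(
  state: MatchState,
  player: "black" | "white",
  cells: Array<{ row: number; col: number }>,
  now: number
): Rejection | "ok" {
  if (player !== state.nextPlayer) return "out_of_turn";
  if (now > state.turnDeadline) return "time_expired";
  if (cells.length > state.stonesThisTurn) return "too_many_stones";
  for (const { row, col } of cells) {
    if (state.board[row][col] !== null) return "cell_occupied";
  }
  return "ok";
}
```

Because the function is pure and uses the server's clock, replayed or reordered client payloads fail deterministically regardless of what the client claims.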
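The exponential backoff from the heartbeat bullet boils down to a small delay schedule. The base and cap below are illustrative defaults, not the project's tuned values.

```typescript
// Delay before reconnection attempt N (0-indexed): exponential growth,
// clamped so a long outage never produces an unbounded wait.
// baseMs and capMs are assumed values for illustration.
function backoffDelayMs(attempt: number, baseMs = 250, capMs = 10_000): number {
  return Math.min(capMs, baseMs * Math.pow(2, attempt));
}
```

Production variants usually add random jitter to avoid reconnection stampedes; it is omitted here to keep the schedule deterministic.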
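The direction-vector win check from the optimization bullet can be sketched as follows: only lines through the last-placed stone can complete a win, so the server scans four precomputed vectors outward from that one cell instead of rescanning all 225. Function and constant names are illustrative.

```typescript
const SIZE = 15;
// Four canonical directions; the opposite ray is scanned from the same cell,
// so eight rays collapse into four vectors.
const DIRS: Array<[number, number]> = [[0, 1], [1, 0], [1, 1], [1, -1]];

type Stone = null | "black" | "white";

// Returns true if the stone at (row, col) completes six or more in a row.
function wins(board: Stone[][], row: number, col: number): boolean {
  const color = board[row][col];
  if (color === null) return false;
  for (const [dr, dc] of DIRS) {
    let count = 1; // the placed stone itself
    for (const sign of [1, -1]) {
      let r = row + sign * dr;
      let c = col + sign * dc;
      while (r >= 0 && r < SIZE && c >= 0 && c < SIZE && board[r][c] === color) {
        count++;
        r += sign * dr;
        c += sign * dc;
      }
    }
    if (count >= 6) return true;
  }
  return false;
}
```

Worst-case work per placement is four directions times at most ten cells each, which is why evaluation stays comfortably in the low-millisecond range.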
Outcome
The prototype consistently delivered sub-100 ms round trips even on mobile networks, survived induced packet loss without desyncing, and no cheating attempt bypassed server validation. The codebase now serves as my reference for concurrency audits on client projects.
Lessons Learned
Real-time UX fails when ownership of state is ambiguous. By giving the server final authority and treating sockets like unreliable narrators, I built empathy for distributed systems that directly translates to collaborative SaaS work.
