When 150ms Is Too Slow for Live Sports
A major North American sports league was losing the broadcast graphics race. Their cloud-first architecture meant on-screen stats lagged the action by 150 to 200 milliseconds, enough for viewers to notice. Fans at home saw the replay before the score updated.
- Client: Major North American Sports League
- Industry: Professional Sports
- Use Case: Broadcast Graphics & In-Stadium Analytics
- Timeline: First venue live in 6 weeks
- ROI: $1.2M annual cloud cost savings
The Challenge
The league's broadcast partner was threatening to build their own solution. Every game, the TV graphics showed outdated stats. Viewers on social media saw the goal before the score changed on screen. The cloud architecture that worked for post-game analytics was failing for live broadcast.
- 01 Broadcast graphics lagged 150-200ms behind live action
- 02 Cloud egress fees hitting $100K per month during playoffs
- 03 23 venues with different network setups and stadium IT policies
- 04 Integration with existing Hawk-Eye tracking and ChyronHego graphics
- 05 Season opener deadline - 8 weeks to prove the concept
- 06 IT staff at venues ranged from 'expert' to 'we have a guy'
The Solution
We put Expanso boxes in the broadcast truck at each venue. Raw tracking data stays local. Graphics systems get updates in single-digit milliseconds. The cloud only sees aggregated stats for the website and mobile app.
Broadcast Truck Deployment
A 1U server in each broadcast truck ingests Hawk-Eye feeds, runs the graphics calculation, and pushes to ChyronHego. No internet required for live graphics. Cloud sync happens during commercial breaks.
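The local-first flow described above can be sketched in a few lines. This is a minimal illustration, not Expanso's actual implementation: the class and method names are hypothetical stand-ins for the real Hawk-Eye ingest and ChyronHego push integrations, which the case study does not show.

```python
from collections import deque

class TruckPipeline:
    """Sketch: compute stats on the truck, push to graphics immediately,
    and batch cloud sync into commercial breaks. All names hypothetical."""

    def __init__(self, push_to_graphics, sync_to_cloud):
        self.push_to_graphics = push_to_graphics  # low-latency local hop
        self.sync_to_cloud = sync_to_cloud        # deferred, batched path
        self.pending = deque()                    # aggregates awaiting sync

    def on_tracking_frame(self, frame):
        # All stats computation happens locally: no internet round trip.
        stats = {
            "ball_speed_kmh": round(frame["ball_speed_ms"] * 3.6, 1),
            "possession": max(frame["touches"], key=frame["touches"].get),
        }
        self.push_to_graphics(stats)  # graphics see this in milliseconds
        self.pending.append(stats)    # the cloud sees it at the next break

    def on_commercial_break(self):
        # Drain the queue only when live graphics aren't using the link.
        batch, self.pending = list(self.pending), deque()
        if batch:
            self.sync_to_cloud(batch)

# Example: one tracked frame, then a commercial break.
to_graphics, to_cloud = [], []
pipeline = TruckPipeline(to_graphics.append, to_cloud.extend)
pipeline.on_tracking_frame(
    {"ball_speed_ms": 40.0, "touches": {"home": 12, "away": 7}}
)
pipeline.on_commercial_break()
```

The key design point is that the graphics path and the cloud path are separate: the first is synchronous and local, the second is queued and opportunistic.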
Works With Stadium IT
Some venues have fiber. Some have 'good enough' WiFi. The system works either way - local processing means we're not dependent on the stadium's network for live operations.
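One way to get that network independence is a bounded local buffer in front of the cloud sync: a failed upload stays queued, and nothing on the live graphics path ever waits for it. The sketch below assumes a generic `upload` callable; the real sync mechanism is not described in the case study.

```python
from collections import deque

class BufferedSync:
    """Sketch of network-tolerant cloud sync: records that fail to upload
    stay queued locally, so a stadium WiFi dropout never blocks live
    operations. `upload` is a hypothetical stand-in for the cloud call."""

    def __init__(self, upload, max_buffered=10_000):
        self.upload = upload
        # Bounded queue: a long outage drops the oldest aggregates
        # instead of exhausting memory on the truck.
        self.queue = deque(maxlen=max_buffered)

    def enqueue(self, record):
        self.queue.append(record)

    def flush(self):
        """Drain the queue; on the first failure, keep the rest for later."""
        sent = 0
        while self.queue:
            try:
                self.upload(self.queue[0])
            except ConnectionError:
                break  # link is down; records remain queued
            self.queue.popleft()
            sent += 1
        return sent

# Example: the first flush hits a dead link, the second succeeds.
uplink_ok = [False]
delivered = []

def upload(record):
    if not uplink_ok[0]:
        raise ConnectionError("stadium WiFi dropped")
    delivered.append(record)

sync = BufferedSync(upload)
sync.enqueue({"period": 1, "shots": 9})
first = sync.flush()   # nothing sent; record stays buffered
uplink_ok[0] = True
second = sync.flush()  # record delivered on the retry
```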
One Config, 23 Venues
Same pipeline template deploys everywhere. Venue-specific tweaks for camera angles and sensor placement. Updates push from HQ without touching broadcast equipment.
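A "one template, per-venue tweaks" scheme usually comes down to overlaying a small venue override on a shared base config. The sketch below shows one common way to do that with a recursive merge; the keys are illustrative, since the actual config schema is not shown in the case study.

```python
def merge_config(base, override):
    """Recursively overlay venue-specific tweaks on the shared template.
    Nested dicts merge key by key; everything else is replaced outright."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged

# Illustrative shared template (hypothetical keys).
TEMPLATE = {
    "ingest": {"source": "hawkeye", "rate_hz": 100},
    "graphics": {"target": "chyronhego"},
    "cameras": {"count": 8, "mount": "baseline"},
}

# Venue tweak: more cameras, different mount; everything else inherits.
venue_cfg = merge_config(TEMPLATE, {"cameras": {"count": 12, "mount": "catwalk"}})
```

Because the merge never mutates the base, HQ can regenerate any venue's config from the template alone, which is what makes pushing updates without touching broadcast equipment practical.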
The Results
The broadcast partner stopped threatening to leave. Graphics now update in 8ms, well under a single frame of broadcast video. The $1.2M in cloud savings came from not streaming raw tracking data to AWS every game.
- Broadcast graphics latency dropped from 150ms to 8ms
- Cloud egress costs cut by 68% - we stopped streaming raw data
- First venue proved concept in 6 weeks, remaining 22 deployed over the season
- Zero graphics outages in first full season
- Same system now powers in-stadium displays and mobile app
- Broadcast partner renewed contract with improved terms

Broadcast latency killing your fan experience?
We've deployed at stadiums, arenas, and race tracks. If your graphics are lagging behind the action, we should talk.
