I’ve been a Leafs fan my whole life. No, that’s not a brag.

Growing up, I always wanted one of those Budweiser goal lights — the ones that flash red and sound like a foghorn when a goal goes in. Every Canadian kid watching Hockey Night in Canada wanted one.

That was twenty-something years ago. I now have a house, disposable income, and enough programming knowledge to be dangerous.

Nobody can stop me anymore.

What I’m building

The idea is simple: when the Toronto Maple Leafs score a goal, the lights behind my TV should turn blue and flash. My own personal goal light.

The implementation is less simple.

Why not use an API

The NHL has a public stats API. So do several third-party sports data providers. They all return game state, including when goals are scored. You could poll one every few seconds and trigger lights when the score changes.

I looked into a few of them. They all had the same problems:

Latency. The API updates after the goal is officially recorded, not when the puck crosses the line. That’s anywhere from 5 to 15 seconds of delay. The crowd is already done celebrating by the time your lights fire. The whole point is reacting in the moment.

Rate limiting. Polling frequently enough to catch a goal in real time puts you up against rate limits fast, especially on free tiers. You’d need a paid plan just to be wrong by ten seconds.
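For reference, the polling approach I’m rejecting looks roughly like this. The endpoint URL and JSON field are hypothetical stand-ins, not the real NHL API schema:

```python
# Sketch of score-change polling. SCORE_URL and "toronto_goals" are
# made-up placeholders, not the actual NHL stats API.
import json
import time
import urllib.request

SCORE_URL = "https://example.com/nhl/score"  # hypothetical endpoint

def fetch_score() -> int:
    with urllib.request.urlopen(SCORE_URL) as resp:
        return json.load(resp)["toronto_goals"]

def goal_scored(prev: int, curr: int) -> bool:
    # Strict increase only: a disallowed goal can make the score drop.
    return curr > prev

def poll_loop(interval_s: float = 3.0) -> None:
    prev = fetch_score()
    while True:
        time.sleep(interval_s)  # shorten this and the rate limits bite
        curr = fetch_score()
        if goal_scored(prev, curr):
            print("GOAL (already 5-15 seconds late)")
        prev = curr
```

Even with a generous rate limit, everything downstream of `fetch_score` inherits the API’s recording delay. That delay is the dealbreaker, not the code.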

So: computer vision. Watch the broadcast, detect the goal moment from the video signal itself.

The detection target

When a goal is scored on an NHL broadcast, a consistent sequence of visual events follows:

  1. The goal animation overlays the screen
  2. The score graphic updates
  3. The replay sequence starts

Any of these could be a detection target. The goal animation is the most visually distinct and the earliest signal. That’s what I’m targeting.

The system design

Here’s how the full pipeline works:

  1. Capture — A Raspberry Pi with a capture card grabs frames from the HDMI output of a casting device
  2. Classify — Each frame runs through a trained binary classifier: goal or no goal
  3. Trigger — When the classifier fires with high confidence, the Pi sends a Bluetooth command, which triggers the lights
  4. Cooldown — 30-second lockout after a trigger to avoid double-firing on replays
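Steps 3 and 4 are just state and a timestamp. A minimal sketch, assuming a 0.9 confidence threshold (my number, not a tuned one) and an injectable clock so the lockout logic can be tested without waiting 30 real seconds:

```python
# Fire once on a confident detection, then lock out re-triggers for
# 30 seconds so replays of the same goal don't double-fire the lights.
import time

class GoalTrigger:
    def __init__(self, threshold: float = 0.9, cooldown_s: float = 30.0,
                 clock=time.monotonic):
        self.threshold = threshold
        self.cooldown_s = cooldown_s
        self.clock = clock
        self.last_fire = float("-inf")  # never fired yet

    def should_fire(self, confidence: float) -> bool:
        now = self.clock()
        if confidence >= self.threshold and now - self.last_fire >= self.cooldown_s:
            self.last_fire = now
            return True  # caller sends the Bluetooth command here
        return False
```

In the real pipeline, `should_fire` gets called once per classified frame; everything else on the Pi stays stateless.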

The classifier doesn’t need to be complex. Binary classification, ~224x224 input, probably MobileNet-based for speed on a Pi. The training data is the interesting problem.

What I need

Training data: labelled frames showing goal moments and non-goal moments from real broadcasts.

The class imbalance is significant. In a 2.5-hour broadcast, a goal happens maybe 5–6 times. Each goal event is visible for maybe 3–5 seconds. So I have maybe 15–25 seconds of positive footage in 150 minutes of broadcast.
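A back-of-envelope check of that imbalance, taking the middle of those ranges (5 goals, ~4 seconds each) and assuming a 1 fps capture rate:

```python
# Rough positive/negative frame ratio for one 2.5-hour broadcast.
fps = 1                    # assumed capture rate
broadcast_s = 150 * 60     # 150 minutes of broadcast
positive_s = 5 * 4         # ~5 goals, ~4 s of goal animation each
positive_frames = positive_s * fps
total_frames = broadcast_s * fps
ratio = positive_frames / total_frames
print(f"{positive_frames} positive of {total_frames} frames ({ratio:.2%})")
```

That works out to roughly one positive frame per 450, before any train/validation split. A naive classifier that always says “no goal” is already 99.8% accurate, which is why accuracy alone is the wrong metric here.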

I need to be strategic about what I capture and how I label it. That’s the next problem.


Next: building a training dataset from broadcast footage — what to capture, how many frames, and why the boring part of machine learning is actually the whole game.