Project · 2025

The Tide Clock

I wanted to build a tide clock. One thing led to another.

Vanilla JS · HTML Canvas · NOAA CO-OPS API · spatial-utils · Vercel
§ Live clock
Requests your location to select the nearest NOAA stations. Open full screen ↗

Mechanical tide clocks are genuinely beautiful objects. One gear, one hand, one rotation every 24 hours and 50 minutes - the average lunar day. Set it to a local high tide and it approximates the rhythm of the sea on your wall. There's real elegance in that.

But tides don't run on averages, and that's not a flaw - it's the interesting part. Local geography shapes everything: inlet width, basin depth, the back-and-forth of a tidal river. The difference between the ocean beach and the intracoastal waterway a quarter-mile away can be two hours and several feet. A gear makes a beautiful guess. I wanted to see what live data could do instead.

Not a problem to solve - more of a puzzle to sit with.

On load, the clock asks for your location and uses that to find the nearest NOAA tide prediction stations - about 3,400 of them across the US. The two closest within 30km get fetched and their predictions blended by distance, so if you're on the ICW you get ICW timing, and if you're on the ocean beach you get ocean timing. No configuration, no hardcoded offsets.
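The selection step can be sketched like this. The `stations` array shape and the haversine distance are stand-ins to keep the snippet self-contained; the actual clock ranks stations by ENU ground distance via spatial-utils, as described below.

```javascript
const EARTH_RADIUS_M = 6371000;

// Great-circle distance in metres (spherical stand-in for the real
// ellipsoidal ENU distance used by the clock).
function haversineM(aLat, aLon, bLat, bLon) {
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLon = toRad(bLon - aLon);
  const s =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(s));
}

// Rank stations by distance, keep the two closest within 30 km,
// and assign inverse-distance blend weights that sum to 1.
function pickStations(userLat, userLon, stations, maxDistM = 30000) {
  const ranked = stations
    .map((st) => ({ ...st, distM: haversineM(userLat, userLon, st.lat, st.lon) }))
    .filter((st) => st.distM <= maxDistM)
    .sort((a, b) => a.distM - b.distM)
    .slice(0, 2);
  // Clamp so a station at 0 m can't divide by zero.
  const invs = ranked.map((st) => 1 / Math.max(st.distM, 1));
  const total = invs.reduce((sum, v) => sum + v, 0);
  return ranked.map((st, i) => ({ ...st, weight: invs[i] / total }));
}
```

If you are standing on the ICW side of a barrier island, the ICW station sits closer and carries most of the weight, which is all the "configuration" the clock needs.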

The blending uses inverse-distance weighting with wrap-safe angle averaging to handle the 0°/360° boundary on the clock face. An exponential moving average keeps the hand from jumping as weights shift. The hand completes one rotation every 24 hours and 50 minutes - same as the gear, just driven by real data instead of an approximation.
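A minimal sketch of that math, assuming angles in degrees on the clock face. Averaging 350° and 10° naively gives 180°, the opposite side of the dial; routing through sin/cos gives 0°, and the EMA steps along the shortest arc so the hand never snaps across the boundary.

```javascript
// Weighted circular mean: convert each angle to a unit vector,
// average the vectors, convert back. Safe across 0°/360°.
function blendAngles(angles, weights) {
  let x = 0;
  let y = 0;
  for (let i = 0; i < angles.length; i++) {
    const rad = (angles[i] * Math.PI) / 180;
    x += weights[i] * Math.cos(rad);
    y += weights[i] * Math.sin(rad);
  }
  return ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360;
}

// Exponential moving average along the shortest signed arc, so the
// hand eases toward a new target as blend weights shift over time.
function smoothAngle(prev, target, alpha = 0.15) {
  const delta = ((target - prev + 540) % 360) - 180; // in (-180, 180]
  return (prev + alpha * delta + 360) % 360;
}
```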

The thing I didn't expect: NOAA's subordinate station network already encodes the hydraulic corrections for inlets, back bays, and tidal rivers. Inlet lag, basin attenuation, height ratio - all of it, for every named water body in the country. Ranking stations by physical distance rather than region just lets you tap into that existing knowledge. The clock didn't need to be smart about geography. NOAA already was.

The coordinate math lives in spatial-utils, an open-source JavaScript navigation library I built in parallel. It handles WGS-84 geodetic conversions - the same coordinate system GPS uses - and it's what makes the station ranking work without any region-specific logic.

spatial-utils · what it does in the clock
  • wgs84ToEcef Converts lat/lng into 3D Cartesian space anchored at Earth's center. You can't compare distances in degrees directly - a degree of longitude shrinks from 111km at the equator to zero at the poles. ECEF removes that distortion.
  • ecefToEnu Projects the vector from your position to each station into a local East-North-Up frame. The result is a real ground distance in metres, accounting for the WGS-84 ellipsoid rather than a sphere.
  • getHorizontalDistance Collapses the ENU vector to a single ground-plane distance, used to rank all ~3,400 stations and compute the inverse-distance blend weights.
  • ema / smoothAngle Exponential moving average and wrap-safe angle interpolation. Keeps the hand smooth across the 0°/360° boundary as blend weights shift over time.
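The first of those conversions is the standard WGS-84 geodetic-to-ECEF formula. This is a sketch of the textbook math a helper like `wgs84ToEcef` wraps; spatial-utils' actual signature and return shape may differ.

```javascript
const WGS84_A = 6378137.0;                // semi-major axis (m)
const WGS84_F = 1 / 298.257223563;        // flattening
const WGS84_E2 = WGS84_F * (2 - WGS84_F); // first eccentricity squared

// Geodetic lat/lng/height → Earth-Centered, Earth-Fixed Cartesian metres.
function wgs84ToEcef(latDeg, lonDeg, hM = 0) {
  const lat = (latDeg * Math.PI) / 180;
  const lon = (lonDeg * Math.PI) / 180;
  // Prime-vertical radius of curvature at this latitude.
  const N = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(lat) ** 2);
  return {
    x: (N + hM) * Math.cos(lat) * Math.cos(lon),
    y: (N + hM) * Math.cos(lat) * Math.sin(lon),
    z: (N * (1 - WGS84_E2) + hM) * Math.sin(lat),
  };
}
```

Once positions live in this frame, subtracting two ECEF vectors and rotating into ENU yields honest metres regardless of latitude, which is what makes a single distance ranking valid from Maine to Hawaii.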

spatial-utils is inlined into the clock rather than imported as a module - easier to open as a local file that way, no CORS to deal with. Its job is purely geometric: figure out which stations are relevant to where you are. NOAA handles everything else.

The clock face is a custom aerial photograph rendered as a tiny planet - a stereographic projection of a 360° equirectangular image, ocean at the center, horizon curling around the edge. Around the hand: current tide phase, countdown to the next high and low, water height in feet above MLLW, a fill arc on the rim, a spring/neap indicator, and a blend card showing which NOAA stations are active and how much weight each one carries.
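The tiny-planet mapping works per pixel: each point on the square face is mapped back to a source pixel in the equirectangular image (width = 2 × height, nadir on the bottom row). A sketch under those assumptions; the real renderer's conventions for rotation and edge handling may differ.

```javascript
// For an output pixel at (outX, outY) on a square face of outSize px,
// return the source pixel in a 360° equirectangular image of srcW × srcH.
function tinyPlanetLookup(outX, outY, outSize, srcW, srcH) {
  const half = outSize / 2;
  const dx = outX - half;
  const dy = outY - half;
  const r = Math.hypot(dx, dy) / half;         // 0 at center, 1 at the rim
  const theta = Math.atan2(dy, dx);            // heading around the face
  const phi = 2 * Math.atan(r);                // inverse stereographic: angle up from the nadir
  const u = (theta + Math.PI) / (2 * Math.PI); // 0..1 → source column
  const v = 1 - phi / Math.PI;                 // 1 = nadir row, 0 = zenith row
  return {
    sx: Math.min(srcW - 1, Math.floor(u * srcW)),
    sy: Math.min(srcH - 1, Math.floor(v * srcH)),
  };
}
```

With `phi = 2·atan(r)`, the center of the face lands on the nadir (the ocean directly below the camera) and the rim at r = 1 lands exactly on the horizon, which is what curls the scene into a planet.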

It started as a single widget to understand the mechanics. It grew into something location-aware through iteration: coordinate math, then API integration, then multi-station blending, then the ICW/ocean split, then the geolocation edge cases.

That last part was its own rabbit hole. The geolocation flow uses the Permissions API to check for blocked access before calling getCurrentPosition, retries with enableHighAccuracy: false when macOS Core Location fails quietly, and surfaces OS-level instructions when the browser thinks it has permission but the system is blocking it. None of that was planned. It just kept coming up.
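The retry half of that flow can be sketched with the geolocation object injected as a parameter, which also makes it testable without a browser. The fallback behavior is as described above; the option values and error handling here are illustrative, not the clock's exact code.

```javascript
// Try a high-accuracy fix first; on failure, retry with
// enableHighAccuracy: false, which often still returns a usable
// position when the high-accuracy path fails quietly (e.g. macOS
// Core Location). Pass navigator.geolocation in the browser.
function getPositionWithRetry(geolocation) {
  const attempt = (highAccuracy) =>
    new Promise((resolve, reject) =>
      geolocation.getCurrentPosition(resolve, reject, {
        enableHighAccuracy: highAccuracy,
        timeout: 10000,
      })
    );
  return attempt(true).catch(() => attempt(false));
}

// Check permission state before prompting, so a blocked user gets
// instructions instead of a doomed prompt. navigator.permissions is
// absent in some older browsers, hence the "prompt" fallback.
async function checkGeoPermission() {
  if (!("permissions" in navigator)) return "prompt";
  const status = await navigator.permissions.query({ name: "geolocation" });
  return status.state; // "granted" | "prompt" | "denied"
}
```

The awkward case is the one the prose mentions: `checkGeoPermission()` reports "granted" but the OS-level location service is off, so `getCurrentPosition` still fails and the only useful response is surfacing system settings instructions.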

Data refreshes automatically every 6 hours. The clock runs on Vercel.