Visualizing Pedestrian Trajectories on a Map with MapLibre


Source: DEV Community

Motivation

When I stood alone in protest at stations along the Yamanote Line, I watched people walk past me. Some glanced at my sign and looked away with a frown. Others gave a small nod. But I had no way to measure how many people actually reacted to my message, or how. That question is what started this project.

I wanted to capture not just whether people reacted, but how their movement changed: did they slow down, step aside, or adjust their path? To do that, I needed to track pedestrian trajectories from video and place them on a real map. I filmed at several stations in Tokyo using a smartphone and built the pipeline with open-source tools: YOLOX for detection, ByteTrack for tracking, and MapLibre for visualization.

Introduction

In the previous article, I extracted pedestrian trajectories from street video as structured JSON. The coordinates in that output are pixel positions in image space: they describe where someone appeared in the camera frame, but say nothing about where they were in the real world. In this