NodCursor Capstone
A Research Through Design capstone for a browser-based, privacy-first head-tracking cursor controller that maps head motion and facial gestures to mouse events using MediaPipe — no cloud, no installs.
NodCursor
Head-Driven Cursor Control Without Installs or Cloud
About
NodCursor is a browser-based accessibility tool that replaces the mouse with head movement and facial gestures. Powered by MediaPipe Tasks Vision, it tracks facial landmarks in real time and maps nose position and brow tilt to cursor coordinates. A Web Worker handles exponential smoothing and gesture signal derivation off the main thread, keeping the UI fluid. A calibration flow lets users define their natural head range and snap it to the viewport. Gesture controls — blink, double-blink, long-blink — trigger click and scroll actions without any hardware beyond a webcam. Built with React, TypeScript, and Vite, the entire stack runs locally in the browser with no accounts, no server, and no data transmission.
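The two signal-processing steps mentioned above — exponential smoothing of the raw landmark position and remapping the calibrated head range onto the viewport — can be sketched roughly as follows. This is an illustrative sketch, not the project's actual worker code; the `Point`, `Range`, and function names are assumptions for the example.

```typescript
type Point = { x: number; y: number };

// Exponential smoothing: blend each new sample with the previous estimate.
// alpha near 1 tracks quickly but jitters; alpha near 0 is smooth but laggy.
function makeSmoother(alpha: number) {
  let prev: Point | null = null;
  return (sample: Point): Point => {
    prev = prev
      ? {
          x: alpha * sample.x + (1 - alpha) * prev.x,
          y: alpha * sample.y + (1 - alpha) * prev.y,
        }
      : sample; // first sample passes through unsmoothed
    return prev;
  };
}

// Calibration: the user's comfortable head range (in normalized landmark
// coordinates) is linearly remapped, and clamped, to viewport pixels.
type Range = { min: Point; max: Point };

function mapToViewport(
  p: Point,
  range: Range,
  width: number,
  height: number
): Point {
  const clamp01 = (v: number) => Math.min(1, Math.max(0, v));
  const nx = clamp01((p.x - range.min.x) / (range.max.x - range.min.x));
  const ny = clamp01((p.y - range.min.y) / (range.max.y - range.min.y));
  return { x: nx * width, y: ny * height };
}
```

Running both stages per frame inside the Web Worker, as the description suggests, keeps this per-sample arithmetic off the main thread so React rendering stays responsive.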