Design-Based Research Capstone


A Research Through Design capstone for a browser-based, privacy-first head-tracking cursor controller that maps head motion and facial gestures to mouse events using MediaPipe: no cloud, no installs.

NodCursor - Browser-Based Head Tracking Cursor Controller
Status: In Development / Testing
Project Leads: Aadi Bhat, Pranav Santhosh


Head-Driven Cursor Control Without Installs or Cloud

Browser-native head tracking via MediaPipe FaceLandmarker
Facial gestures mapped to click, scroll, and drag events
Web Worker pipeline for real-time smoothing with minimal latency
Calibration system mapping head range to viewport coordinates
Exponential smoothing with deadzone and acceleration curve
Blink, double-blink, and long-blink gesture detection
React + TypeScript + Vite, entirely client-side
Privacy-first: no data leaves the browser
Tech stack: React + TypeScript + Vite · MediaPipe Tasks Vision (FaceLandmarker) · Web Worker (trackingWorker.ts) · exponential smoothing pipeline · Tailwind CSS · calibration and viewport mapping · gesture dispatch (useGestureControls) · useFaceTracking / useCursorMapping hooks
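The smoothing stage above (exponential smoothing with a deadzone and an acceleration curve) can be sketched roughly as follows. This is an illustrative TypeScript sketch, not NodCursor's actual code: the function name `smoothStep` and the tuning constants are assumptions.

```typescript
interface Point { x: number; y: number; }

// Illustrative tuning constants (not NodCursor's real values):
const ALPHA = 0.35;    // smoothing factor: higher = snappier, lower = steadier
const DEADZONE = 2;    // px of raw displacement ignored to suppress jitter
const ACCEL_EXP = 1.4; // >1 lets large head motions cover more screen per frame

/** One frame of exponential smoothing from the previous cursor position
 *  toward the raw tracked position. */
function smoothStep(prev: Point, raw: Point): Point {
  const dx = raw.x - prev.x;
  const dy = raw.y - prev.y;
  const dist = Math.hypot(dx, dy);

  // Deadzone: drop tiny movements (sensor noise, micro-tremor).
  if (dist < DEADZONE) return prev;

  // Acceleration curve: scale the step nonlinearly with displacement
  // magnitude, clamped so the cursor never overshoots the target.
  const gain = Math.pow(dist / DEADZONE, ACCEL_EXP - 1);
  const k = Math.min(1, ALPHA * gain);

  return { x: prev.x + k * dx, y: prev.y + k * dy };
}
```

Running a step like this per landmark frame inside the worker keeps jitter filtering off the main thread, which is what keeps the UI fluid.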

About

NodCursor is a browser-based accessibility tool that replaces the mouse with head movement and facial gestures. Powered by MediaPipe Tasks Vision, it tracks facial landmarks in real time and maps nose position and brow tilt to cursor coordinates. A Web Worker handles exponential smoothing and gesture signal derivation off the main thread, keeping the UI fluid. A calibration flow lets users define their natural head range and snap it to the viewport. Gesture controls — blink, double-blink, long-blink — trigger click and scroll actions without any hardware beyond a webcam. Built with React, TypeScript, and Vite, the entire stack runs locally in the browser with no accounts, no server, and no data transmission.
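The calibration step described above, mapping a user's natural head range onto the viewport, amounts to a clamped linear remap per axis. A minimal sketch, assuming a hypothetical `mapToViewport` helper (the real hook names are `useCursorMapping` etc.; this function is illustrative):

```typescript
interface Range { min: number; max: number; }

/** Linearly remap a raw normalized landmark coordinate (e.g. nose x in 0..1)
 *  from the user's calibrated head range onto [0, viewportSize], clamped. */
function mapToViewport(raw: number, calibrated: Range, viewportSize: number): number {
  const span = calibrated.max - calibrated.min;
  if (span === 0) return viewportSize / 2; // degenerate calibration: center

  const t = (raw - calibrated.min) / span; // 0..1 within the user's range
  return Math.min(viewportSize, Math.max(0, t * viewportSize));
}
```

Clamping at the edges means a user who turns slightly past their calibrated range simply pins the cursor to the screen edge instead of losing it off-screen.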

Impact

Fully browser-native — no app install required
Privacy-first: all inference runs locally on-device
Accessible cursor control for motor-impaired users
Gesture-based click and scroll without physical peripherals
Smooth tracking across varied lighting and camera setups
Deadzone and acceleration tuning for fatigue-free use
Calibration system adapts to each user's motion range
Foundation for gaze, voice, and switch-access extensions
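The blink / double-blink / long-blink distinction listed above is essentially a timing classifier over per-frame eye-open state. A hedged sketch, with hypothetical class and threshold names (NodCursor's `useGestureControls` may work differently; a production version would also debounce the single click until the double-blink window expires):

```typescript
type Gesture = "click" | "doubleClick" | "dragToggle" | null;

// Illustrative thresholds (assumed, not NodCursor's real values):
const LONG_MS = 600;   // eyes closed at least this long => long blink
const DOUBLE_MS = 350; // second blink within this window => double blink

class BlinkClassifier {
  private closedAt: number | null = null; // timestamp eyes closed, if closed
  private lastBlinkAt = -Infinity;        // timestamp of last short blink

  /** Feed one frame's eye state; returns a gesture when one completes. */
  update(eyesClosed: boolean, now: number): Gesture {
    if (eyesClosed) {
      if (this.closedAt === null) this.closedAt = now; // blink starts
      return null;
    }
    if (this.closedAt === null) return null; // eyes were already open

    const held = now - this.closedAt; // how long the eyes stayed shut
    this.closedAt = null;

    if (held >= LONG_MS) return "dragToggle"; // long blink
    if (now - this.lastBlinkAt <= DOUBLE_MS) {
      this.lastBlinkAt = -Infinity;
      return "doubleClick"; // second short blink inside the window
    }
    this.lastBlinkAt = now;
    return "click"; // isolated short blink
  }
}
```

Because classification depends only on timestamps and a boolean, it can run in the same worker as the smoothing pipeline and dispatch synthetic mouse events back to the page.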