Fabrication · Wearable IoT · 2024

Finger Drumbeat

Designed, built, and programmed Finger Drumbeat, a wearable gadget that lets users play virtual drums through finger gestures.

Finger swing force maps to sound intensity, gesture direction maps to drum type, and motion, audio, and visual feedback come together into one compact modular device.

A fully self-built project: I scoped the interaction concept, architected the embedded system, fabricated the PCB and enclosure, and wrote the gesture-mapping logic end-to-end.

Role

Product Builder
Interaction · System Arch · Full-Stack Dev

Context

UW GIX
Individual Creative Project

Type

Individual Creative Project
Embedded System · Wearable

Stack

ESP32 · MPU6050 · APDS9960
Arduino · Custom PCB

Live Demo · Turn on sound to hear different drums across swipes
View Finger Drumbeat on GitHub
Finger Drumbeat key components: custom yellow enclosure with speaker and OLED, annotated internal PCB with Seeed XIAO ESP32, SSD1306 OLED, DotStar LED matrix, stepper motor and Visaton speaker, finger sensor unit and wrist wearable with gesture sensor
Key Components · Custom enclosure, PCB, finger sensor, and wrist wearable

Music enthusiasts often crave a deeper connection with their favorite tunes, yet attending live shows is not always feasible. Finger Drumbeat offers a tangible way to engage with music anytime and anywhere, letting users tap out rhythms with their fingertips and change drum types with a swipe of the other hand.

I used it as a testbed for how minimal hardware (MPU6050, APDS9960, ESP32) can deliver real-time, expressive interaction through embedded UX design.

What I built

  1. Interaction concept: mapped fingertip taps to beats and hand swipes to drum types, so one pair of gestures drives a full virtual kit.
  2. Embedded system architecture: two XIAO ESP32-S3 boards communicating over Bluetooth, each managing its own set of sensor, audio, or display peripherals.
  3. Custom PCB, enclosure, and power system for wearable integration, packaged into a handheld display unit and a wrist sensor module.
  4. Modular firmware for gesture mapping, audio playback, and display synchronization, written so each subsystem can be tested and iterated on independently.
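Since the two boards split the work over Bluetooth, the wrist unit has to ship its gesture state to the display/audio unit in some wire format. A minimal sketch of what such a fixed-size packet could look like is below; the field names, sizes, and layout are assumptions for illustration, not the project's actual protocol.

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical packet sent from the wrist sensor board to the
// display/audio board after each detected gesture.
struct GesturePacket {
    uint8_t  drumType;   // drum currently selected by swipe
    uint8_t  volume;     // 0-255 tap intensity
    uint32_t timestamp;  // sender-side millis(), for ordering/debouncing
};

// Serialize into a 6-byte buffer for transmission (e.g. a Bluetooth
// characteristic write); deserialize on the receiving board.
void encode(const GesturePacket& p, uint8_t out[6]) {
    out[0] = p.drumType;
    out[1] = p.volume;
    std::memcpy(&out[2], &p.timestamp, sizeof(uint32_t));
}

GesturePacket decode(const uint8_t in[6]) {
    GesturePacket p;
    p.drumType = in[0];
    p.volume   = in[1];
    std::memcpy(&p.timestamp, &in[2], sizeof(uint32_t));
    return p;
}
```

A fixed-size, explicitly serialized packet like this keeps the link payload small and avoids struct-padding surprises between the two boards.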
System architecture keys: finger tap acceleration via MPU6050 maps to beat and volume; wrist swipe via APDS9960 maps to drum type, analogous to playing a real drum kit
Interaction Mapping · Finger taps to beats, wrist swipes to drum types
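The interaction mapping above (taps select intensity, swipes select drum type) can be sketched as two small pure functions. This is a minimal illustration, not the shipped firmware: the drum names, swipe enum, and the `TAP_MIN_G`/`TAP_MAX_G` calibration constants are assumed values.

```cpp
// Hypothetical drum-type cycling: each APDS9960 swipe steps through
// the virtual kit in one direction or the other.
enum Drum { SNARE, KICK, HIHAT, TOM, NUM_DRUMS };
enum Swipe { SWIPE_LEFT, SWIPE_RIGHT };

Drum nextDrum(Drum current, Swipe dir) {
    int i = static_cast<int>(current);
    i = (dir == SWIPE_RIGHT) ? (i + 1) % NUM_DRUMS
                             : (i + NUM_DRUMS - 1) % NUM_DRUMS;
    return static_cast<Drum>(i);
}

// Tap intensity: map peak Z-axis acceleration (in g, from the MPU6050)
// to an 8-bit volume. Thresholds are illustrative calibration constants.
const float TAP_MIN_G = 1.2f;  // below this, treat as noise
const float TAP_MAX_G = 4.0f;  // clip above this

int tapVolume(float peakG) {
    if (peakG < TAP_MIN_G) return 0;  // not a deliberate tap
    if (peakG > TAP_MAX_G) peakG = TAP_MAX_G;
    return static_cast<int>(255.0f * (peakG - TAP_MIN_G)
                                   / (TAP_MAX_G - TAP_MIN_G));
}
```

Keeping the mapping in pure functions like these is what makes the gesture logic testable off-device.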

Multimodal feedback loop

  1. Z-axis finger acceleration (MPU6050) drives sound volume, LED matrix intensity, and stepper-motor gauge rotation, so the harder you tap, the louder, brighter, and more energetic the response.
  2. Gesture direction from the IR sensor (APDS9960) triggers drum-type changes, with the current drum surfaced in real time on the OLED display.
  3. Dynamic alpha low-pass filtering on the accelerometer smooths minor movements while staying responsive to intentional taps, so the device feels like an instrument rather than a twitchy sensor.
System architecture communication diagram: input sensors (MPU6050, APDS9960) feed processed data over Bluetooth to dual ESP32-S3 processors, which drive output peripherals (OLED, speaker, LED matrix, stepper needle)
System Architecture · How input sensors, dual ESP32 processors, and output peripherals talk to each other
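The dynamic alpha low-pass filter described in the feedback loop can be sketched as follows: the smoothing factor grows with the magnitude of change, so slow drift is heavily smoothed while sharp, intentional taps pass through almost unfiltered. The constants here are illustrative assumptions, not the project's tuned values.

```cpp
#include <cmath>

// Low-pass filter whose alpha scales with |delta|: small movements are
// smoothed hard, large spikes (taps) get a near-raw response.
struct DynamicLowPass {
    float state    = 0.0f;
    float alphaMin = 0.05f;  // heavy smoothing for small movements
    float alphaMax = 0.9f;   // near-raw response for large spikes
    float scale    = 2.0f;   // how fast alpha ramps with |delta| (in g)

    float update(float raw) {
        float delta = std::fabs(raw - state);
        float alpha = alphaMin
                    + (alphaMax - alphaMin) * std::fmin(delta / scale, 1.0f);
        state += alpha * (raw - state);
        return state;
    }
};
```

With these assumed constants, a 0.1 g wobble is attenuated to roughly a tenth of its size, while a 3 g tap comes through at 90% of its raw value on the first sample.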

Project highlights

  1. Designed a complete embedded UX loop from sensor input to synchronized audio, light, and motion feedback.
  2. Built a custom PCB, 3D-printed enclosure, and portable power system for the wearable and handheld units.
  3. Developed modular firmware logic for gesture mapping, audio playback, and display sync, keeping each subsystem independently testable.
  4. Explored how low-cost, minimal hardware can deliver real-time multimodal interaction with instrument-grade responsiveness.
Original concept sketch: hand-drawn notes on virtual instrument, finger cuff with MPU6050, display device with LED array and OLED, and the wearable glove-like sensor unit
Original Concept · Hand-drawn sketch from the early scoping phase

