
Gesture Control for the Wearable AI and AIoT Era
Patented, edge-powered, finger and hand gesture recognition — no cloud, no external processors, no occlusions.
143 Available Gestures

124 Taps
5 x 1 Finger
10 x 2 Fingers
10 x 3 Fingers
5 x 4 Fingers
1 x 5 Fingers

6 Surface Swipes
Left
Right
Up
Down
Zoom in
Zoom out

6 Air Gestures
Flick Up
Flick Down
Flick Right
Flick Left
Bloom
Fist

4 Pinches
Thumb + Index
Thumb + Middle
Thumb + Ring
Thumb + Pinky

3 Thumb Swipes
Left
Right
Select & hold
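
The per-count tap breakdown above follows directly from choosing which of the five fingers strike together: C(5,k) for k = 1 to 5 gives the 5/10/10/5/1 grouping of finger combinations. The sketch below is illustrative only; the finger names, the bit order, and the idea that a tapcode can be treated as a 5-bit mask (one bit per finger) are conventions chosen for the example, not a documented Tap specification.

```python
from itertools import combinations

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

# Group every finger combination by how many fingers strike together.
# The group sizes reproduce the 5/10/10/5/1 breakdown listed above.
for k in range(1, len(FINGERS) + 1):
    combos = list(combinations(FINGERS, k))
    print(f"{k}-finger taps: {len(combos)}")

# Decode a tapcode represented as a 5-bit mask, one bit per finger.
# Treating the thumb as bit 0 is an assumption made for illustration.
def decode_tapcode(mask: int) -> list[str]:
    return [name for bit, name in enumerate(FINGERS) if mask & (1 << bit)]

print(decode_tapcode(0b00011))  # ['thumb', 'index']
```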
Tap vs. Competing Tech


Tap Is The Future of Natural Interaction
Intuitive, Simple, Limitless.

Self-Contained Intelligence
Tap is the world’s most advanced alternative input platform for AR and AI glasses, reimagined and optimized for how humans should interact with spatial computing.
Scroll.
Play.
Type.
Edge AI Processing
Runs entirely on-chip
No Host Required
Zero external computing dependency
Near-Instant Latency
~40ms end-to-end
6-DOF Tracking
High fidelity in any orientation
Lighting-Independent
Works in starlight-level darkness (0.1 lux)
Multi-Platform SDKs
C/C++, Android, iOS, Unity, Python
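
As a rough illustration of how an application might sit on top of one of these SDKs, here is a minimal callback-driven sketch in Python. Every name in it (TapClient, on_tap, on_air_gesture, run_demo) is a placeholder invented for the example, not the actual SDK API; the only point it makes is the shape of the integration: gestures are recognized on the device, and the host app simply registers handlers for small, already-classified events.

```python
# Hypothetical client: class and method names are placeholders, not the
# real Tap SDK. Recognition happens on the wearable; the app only
# receives compact, already-classified gesture events.

class TapClient:
    def __init__(self):
        self._tap_handlers = []
        self._air_handlers = []

    def on_tap(self, handler):
        """Register a callback for finger-tap events."""
        self._tap_handlers.append(handler)
        return handler

    def on_air_gesture(self, handler):
        """Register a callback for in-air gesture events."""
        self._air_handlers.append(handler)
        return handler

    def run_demo(self):
        """Stand-in for a real BLE/USB event loop: emit two fake events."""
        for handler in self._tap_handlers:
            handler(fingers=("thumb", "index"))
        for handler in self._air_handlers:
            handler(name="flick_up")


client = TapClient()

@client.on_tap
def handle_tap(fingers):
    print("tap:", "+".join(fingers))

@client.on_air_gesture
def handle_air(name):
    print("air gesture:", name)

client.run_demo()
```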

Modality Highlights
Proprietary AI algorithms optimized for ultra-low power and real-time processing on compact hardware.
Built for privacy — no cloud processing required.
Near-zero learning curve.
Interact with any digital device naturally & easily without relying on host processing.
Complement voice, EMG & eye tracking with a self-contained, AI-powered hand & finger gesture tracking model running locally on an edge processing chip.


Activate any button, command, or prompt with finger taps, pinches, and hand gestures.
Accuracy: 99% when tapping on hard surfaces, 97% when tapping on soft surfaces, 99% when pinching
Transmission latency: 40 milliseconds end-to-end
Minimum lighting requirement: starlight (0.1 lux)
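
One way to read the line above about activating buttons, commands, and prompts is as a simple mapping from recognized gestures to application actions. The sketch below is an assumption-level illustration: the gesture identifiers are derived from the gesture list earlier on this page, and the dispatch table itself is not part of any documented Tap interface.

```python
# Illustrative dispatch table: gesture names echo the gesture list above,
# but the identifiers and this mapping are assumptions made for the
# example, not a documented Tap interface.
COMMANDS = {
    "pinch_thumb_index": "select",
    "pinch_thumb_middle": "back",
    "air_flick_up": "scroll_up",
    "air_flick_down": "scroll_down",
    "air_bloom": "open_menu",
    "air_fist": "dismiss",
}

def trigger(gesture: str) -> None:
    """Translate a recognized gesture into an application command."""
    command = COMMANDS.get(gesture)
    if command is None:
        return  # unmapped gesture: ignore
    print(f"{gesture} -> {command}")

trigger("air_bloom")          # open_menu
trigger("pinch_thumb_index")  # select
```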
Tap Systems Patents

