
Gesture Recognition API

Kickstarter Video

GRAPI fundamentally improves your gesture recognition experience, introducing new and advanced "9 Degrees of Freedom" technology, a high-performance processor and ultra-long battery life. Not only is it an innovative and versatile piece of technology, it is also a fashionable accessory. It is light as a feather, easily portable and user friendly, and all of these top-of-the-line features are packed into a single wristband. GRAPI makes your everyday life motivating and entertaining.

Purpose

GRAPI (Gesture Recognition Application Programming Interface) is an API for a wearable tech wristband that recognises gestures made by the wearer.

The wristband is designed around an Arduino, a 9-DOF motion sensor, LEDs, Bluetooth, Wi-Fi and a battery. The API is a gesture recognition system which uses an artificial neural network to process motion data from a sensor worn on the user's wrist and calculate the probability that the user made a particular gesture.
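As a sketch of the idea only (the layer sizes are hypothetical placeholders and the real weights come from training, so they are not shown here), a single-hidden-layer feedforward pass from a window of motion samples to per-gesture probabilities might look like this:

```cpp
#include <cmath>

const int INPUT_SIZE   = 30;  // e.g. 5 samples x 6 axes (accel + gyro) -- assumed
const int HIDDEN_SIZE  = 16;  // hypothetical hidden-layer width
const int NUM_GESTURES = 3;   // push, screw in, screw out

// Weights and biases would be produced by offline training.
float w1[HIDDEN_SIZE][INPUT_SIZE];
float b1[HIDDEN_SIZE];
float w2[NUM_GESTURES][HIDDEN_SIZE];
float b2[NUM_GESTURES];

float sigmoidf(float x) { return 1.0f / (1.0f + std::exp(-x)); }

// Fills `probs` with one probability per gesture for the given input window.
void recognise(const float input[INPUT_SIZE], float probs[NUM_GESTURES]) {
    float hidden[HIDDEN_SIZE];
    for (int h = 0; h < HIDDEN_SIZE; ++h) {
        float sum = b1[h];
        for (int i = 0; i < INPUT_SIZE; ++i) sum += w1[h][i] * input[i];
        hidden[h] = sigmoidf(sum);
    }
    for (int g = 0; g < NUM_GESTURES; ++g) {
        float sum = b2[g];
        for (int h = 0; h < HIDDEN_SIZE; ++h) sum += w2[g][h] * hidden[h];
        probs[g] = sigmoidf(sum);  // independent probability per gesture
    }
}
```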

It is developed on an Adafruit Feather HUZZAH and a BNO055 motion sensor, which are well suited to this type of wearable technology: they offer a battery-powered, lightweight processor and on-chip Wi-Fi, which can be used to run client programs that use GRAPI and to communicate with other devices wirelessly. Thanks to its versatility, GRAPI can be integrated into any desired program or application.
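For illustration, a minimal Arduino sketch that reads fused orientation and linear acceleration from the BNO055, assuming the standard Adafruit_BNO055 and Adafruit_Sensor libraries; the 50 Hz sample rate is an assumed value, not a GRAPI specification:

```cpp
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);  // sensor ID, default I2C address

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected, check wiring");
    while (true) delay(10);
  }
  bno.setExtCrystalUse(true);  // use the board's external crystal for stability
}

void loop() {
  // Fused absolute orientation (degrees) and gravity-compensated acceleration.
  imu::Vector<3> euler = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
  imu::Vector<3> accel = bno.getVector(Adafruit_BNO055::VECTOR_LINEARACCEL);

  Serial.print(euler.x()); Serial.print(',');
  Serial.print(euler.y()); Serial.print(',');
  Serial.print(euler.z()); Serial.print(',');
  Serial.print(accel.x()); Serial.print(',');
  Serial.print(accel.y()); Serial.print(',');
  Serial.println(accel.z());

  delay(20);  // ~50 Hz sample rate -- an assumed value
}
```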

It is a one-system-fits-all API. GRAPI also includes a desktop calibration program which can be used to tune the API's sensitivity. It displays the outputs and thresholds of the gesture recognition system and lets users easily experiment with the API's settings and the available gestures. The program also acts as a demonstration of the API's use and basic capabilities.
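To show how thresholds gate the network's output, here is a hypothetical sketch: a gesture is only reported when its probability clears its tunable threshold. The threshold values are illustrative, not GRAPI's shipped defaults.

```cpp
const int NUM_GESTURES = 3;  // push, screw in, screw out
float thresholds[NUM_GESTURES] = {0.8f, 0.8f, 0.8f};  // raise to cut false positives

// Returns the index of the most probable gesture that clears its threshold,
// or -1 if nothing was confident enough.
int detect(const float probs[NUM_GESTURES]) {
    int best = -1;
    float bestProb = 0.0f;
    for (int g = 0; g < NUM_GESTURES; ++g) {
        if (probs[g] >= thresholds[g] && probs[g] > bestProb) {
            best = g;
            bestProb = probs[g];
        }
    }
    return best;
}
```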

Components

Adafruit Feather HUZZAH with ESP8266 WiFi
Adafruit 9-DOF Absolute Orientation IMU Fusion Breakout – BNO055

Our Progress

Our device uses an artificial neural network (ANN) to recognise gestures made by the user. The ANN is currently trained on a pool of data recorded from three gestures - push, screw in and screw out - and can easily be extended to recognise other gestures by recording further movement data from the device.
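One way extension could work (the label and column layout below are our assumptions, and "wave" is a hypothetical gesture name): while the wearer repeats a gesture, each motion sample is logged as a labelled CSV row that can be fed back into training.

```cpp
// Logs one labelled motion sample over Serial for offline retraining.
void logSample(const char* label, float ax, float ay, float az,
               float gx, float gy, float gz) {
  Serial.print(label); Serial.print(',');
  Serial.print(ax); Serial.print(','); Serial.print(ay); Serial.print(',');
  Serial.print(az); Serial.print(',');
  Serial.print(gx); Serial.print(','); Serial.print(gy); Serial.print(',');
  Serial.println(gz);
}

// e.g. logSample("wave", accel.x(), accel.y(), accel.z(),
//                gyro.x(), gyro.y(), gyro.z());
```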

The device can connect to standard WiFi networks, so it can be used without a cable connection: the wearer is free to use the device anywhere they have WiFi access, without worrying about their proximity to other systems.
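As a sketch of the wireless side using the standard ESP8266WiFi library, the device joins a network and streams readings to a listening host; the SSID, password, host address and port below are placeholders.

```cpp
#include <ESP8266WiFi.h>

const char* ssid     = "your-network";   // placeholder
const char* password = "your-password";  // placeholder
const char* host     = "192.168.1.10";   // hypothetical client machine
const uint16_t port  = 8000;             // hypothetical port

WiFiClient client;

void connectWiFi() {
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(250);  // wait for the join
}

void sendReading(const String& csvLine) {
  if (!client.connected() && !client.connect(host, port)) return;  // reconnect if needed
  client.println(csvLine);
}
```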

Our Future

Our top priority is to implement the remaining five gestures. We will record movement data for each gesture, then extend and retrain the neural network to recognise them. We will also create a user interface that displays the probability of the current gesture being recognised, the current position of the hand, and other underlying information such as the acceleration of the device.
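One possible shape for the values the interface will display, purely for illustration; the struct layout and field names are our assumptions, not a finalised format:

```cpp
// A status packet the wristband might send to the interface each update.
struct StatusUpdate {
  float probs[3];  // probability per currently trained gesture
  float euler[3];  // hand orientation: heading, roll, pitch (degrees)
  float accel[3];  // linear acceleration (m/s^2)
};
```

Streaming a small fixed-size record like this keeps the update cheap to send over WiFi while carrying everything the display needs.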

The interface will also allow the user to re-calibrate the device. If, for example, the API is too sensitive for a particular use and produces many false-positive gestures, the thresholds may need to be increased. The calibration interface will offer an easy way to experiment with GRAPI's calibration: users can tweak the settings and then see feedback from test gestures in real time.
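A hypothetical sketch of the device side of live re-calibration, assuming the interface sends "set <gesture> <value>" commands; the command format is our illustration, not a finalised protocol.

```cpp
extern float thresholds[3];  // shared with the detection sketch above

// Parses "set <gesture index> <threshold>" and applies it immediately,
// so the next test gesture reflects the new setting.
void handleCommand(const String& cmd) {
  if (cmd.startsWith("set ")) {
    int space = cmd.indexOf(' ', 4);
    if (space < 0) return;  // malformed command
    int gesture = cmd.substring(4, space).toInt();
    float value = cmd.substring(space + 1).toFloat();
    if (gesture >= 0 && gesture < 3 && value > 0.0f && value <= 1.0f) {
      thresholds[gesture] = value;  // e.g. raise to reduce false positives
    }
  }
}
```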

Meet The Team

We are Team Bandwagon, a team of six ambitious, enthusiastic, friendly undergraduate students at the University of Queensland. Our members, studying a range of degrees, bring individual specialties in design and programming, including HTML/CSS, Python, Java and C. We are proud and excited to introduce you to our state-of-the-art product. Feel free to leave us a comment if you have any questions or recommendations! Team Bandwagon, out.

Ashleigh Armstrong
Freya Rogers
Jessalyn Santoso
Leo Ng
Logan Caskie
Michael Du
