Elixir

AI smart band that can detect hand gestures and monitor the body for sudden changes in condition that may point to seizures or other medical events.

Project status: Under Development

Internet of Things, Artificial Intelligence

Groups
Internet of Things, DeepLearning

Intel Technologies
AI DevCloud / Xeon

Overview / Usage

The main goal of the project is to build a hand gesture recognition system that can be turned into a wearable device or incorporated into future smart watches. In addition, research is ongoing into how the same technology can be used to detect changes in body condition, such as a sudden spike or drop in blood sugar levels or the onset of a seizure.

Why?
Smart watches and wearable devices are rapidly being adopted by consumers, bringing them closer to powerful technologies.
At the same time, hand-based input remains the best mode of human-machine interface (HMI), and we envision a future where hand gestures are all we need to interact effectively with a machine. Leap Motion, Kinect and similar sensors can be seen as earlier implementations of this idea. Wearable versions of the same approach use EMG sensors (e.g., the Myo band) and/or pressure sensors to approximate various gestures.

Bridging the gap between these two, HMI and wearable low-power devices, is what Project Elixir is about: developing a hand gesture recognition system that can be incorporated into a smart watch strap or worn as a standalone bracelet, and that is low power and much more accurate than existing technologies in this field.

Methodology / Approach

We start by figuring out the best method to detect hand gestures. Most EMG-based implementations rely on only two muscles close to the surface and approximate the output from these readings plus data from other electrodes. Intra- and inter-muscle EMG noise also adds to the complexity of determining the correct output.
Pressure-sensor-based bands eliminate most of the issues of EMG-based sensing but come with their own problems, namely inaccuracies caused by band tightness and arm movement.

Based on our research and with reference to previous work in the field of tomography-based sensing, our approach is to use a combination of electrical impedance tomography (EIT), contour sensing and machine learning to figure out the changes that occur inside the hand while performing different gestures, detect the corresponding gesture and trigger an action. A rough sketch of the classification step is shown below.
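
For illustration only, here is a minimal Python sketch of the gesture-classification step, assuming frames of EIT boundary-voltage readings have already been collected and labelled. The electrode count, file name and the choice of a random-forest classifier are assumptions made for this example, not the project's actual pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

N_ELECTRODES = 8                                      # assumed electrode ring on the band
N_MEASUREMENTS = N_ELECTRODES * (N_ELECTRODES - 3)    # adjacent drive/measure pattern -> 40 readings per frame

# Each row: one frame of boundary-voltage readings; last column: integer gesture label.
data = np.loadtxt("eit_gesture_frames.csv", delimiter=",")   # hypothetical recording file
X, y = data[:, :-1], data[:, -1].astype(int)
assert X.shape[1] == N_MEASUREMENTS, "unexpected number of measurements per frame"

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("gesture accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In a setup like this, training could run on the AI DevCloud and the fitted model could be deployed for inference on the local server, though the exact split of work is a design choice rather than something fixed here.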

Technologies Used

Intel AI DevCloud (cloud ML implementation), Intel NUC (local ML server implementation and back projection for imaging), Intel Curie (Tiny Tile), Arduino, Python, Matlab (back projection calculations and rendering of the final tomography output), Eagle (circuit and PCB design)
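
As a rough illustration of the back-projection stage, the sketch below shows a naive linearized EIT reconstruction in Python/NumPy, even though the project performs this step in Matlab on the Intel NUC. The grid size, file names and precomputed sensitivity (Jacobian) matrix are assumptions made for the example.

import numpy as np

GRID = 32                                   # assumed reconstruction grid (32 x 32 pixels)
N_MEAS = 40                                 # assumed boundary-voltage measurements per frame

# Hypothetical precomputed inputs: a sensitivity (Jacobian) matrix relating pixel
# conductivity changes to boundary-voltage changes, plus two measurement frames.
J = np.load("sensitivity_matrix.npy")       # shape (N_MEAS, GRID * GRID)
v_ref = np.load("baseline_voltages.npy")    # relaxed-hand reference frame, shape (N_MEAS,)
v_now = np.load("gesture_voltages.npy")     # current frame, shape (N_MEAS,)
assert J.shape == (N_MEAS, GRID * GRID)

dv = v_now - v_ref
# Naive back projection: weight every pixel by how strongly it influences each measurement.
d_sigma = J.T @ dv
d_sigma /= np.abs(d_sigma).max() + 1e-12    # normalize for display
image = d_sigma.reshape(GRID, GRID)         # conductivity-change map of the wrist cross-section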
