Intel oneAPI Sign Language Gesture Translator using MediaPipe

The Intel oneAPI Sign Language Gesture Translator uses MediaPipe and Intel's oneAPI platform for real-time translation of sign language gestures. It bridges communication gaps by capturing and interpreting gestures, enabling effective communication between sign language users and non-sign language users.

Project status: Published/In Market

Artificial Intelligence, oneAPI

Groups
Student Developers for oneAPI, Intel Out of the Box Network Developers group, Student Developers for AI, Artificial Intelligence India, oneAPI Showcase, DeepLearning

Intel Technologies
oneAPI, DevCloud


Overview / Usage

The Intel oneAPI Sign Language Gesture Translator is a project that utilizes the power of Intel's oneAPI platform and the MediaPipe framework to create a real-time translator for sign language gestures. The aim of this project is to bridge the communication gap between individuals who use sign language and those who are not fluent in it.

The system is designed to capture video input from a camera and apply advanced computer vision techniques to recognize and interpret sign language gestures. It leverages MediaPipe, a cross-platform framework for building real-time multimedia processing pipelines, to perform keypoint detection and tracking of hand gestures.
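As a concrete illustration of the keypoint step: MediaPipe's Hands solution reports 21 (x, y, z) landmarks per detected hand, with x and y normalized to the frame dimensions. Before such landmarks can feed a classifier, they are typically made translation- and scale-invariant. The sketch below shows one common way to do that; the function name and normalization scheme are illustrative assumptions, not the project's actual code.

```python
def to_feature_vector(landmarks):
    """Flatten 21 (x, y, z) hand landmarks into a 63-element feature vector
    that is relative to the wrist and scaled to the hand's extent, so the
    same gesture looks alike regardless of where the hand is in the frame."""
    wx, wy, wz = landmarks[0]  # MediaPipe landmark 0 is the wrist
    # Translate every landmark so the wrist sits at the origin.
    rel = [(x - wx, y - wy, z - wz) for x, y, z in landmarks]
    # Scale by the largest absolute offset (guard against a degenerate zero).
    scale = max(max(abs(c) for c in pt) for pt in rel) or 1.0
    return [c / scale for pt in rel for c in pt]
```

A vector like this, computed per frame, is what a downstream gesture classifier would consume.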

Using the Intel oneAPI Base Toolkit, which provides a comprehensive set of tools for software development across CPUs, GPUs, and FPGAs, the project optimizes performance and takes advantage of hardware acceleration. This enables efficient and fast processing of video frames, allowing for real-time translation of sign gestures.
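One way stock TensorFlow engages oneAPI's hardware acceleration on Intel CPUs is through the oneDNN library, which is toggled by an environment variable. The snippet below is a minimal sketch of that switch; the project's exact build and acceleration configuration may differ.

```python
import os

# oneDNN (the oneAPI Deep Neural Network Library) optimizations in
# TensorFlow are controlled by this environment variable, which must be
# set before `import tensorflow` for it to take effect.
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "1"
```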

The Sign Language Gesture Translator takes advantage of machine learning models trained on large datasets to recognize and classify different sign gestures. By mapping these gestures to corresponding textual or spoken language output, the translator enables effective communication between sign language users and non-sign language users.
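The final mapping step described above (model output to text) can be sketched as follows. The label set, function name, and confidence threshold here are hypothetical stand-ins, assuming the classifier emits one probability per known gesture.

```python
# Illustrative label set; the real project's vocabulary may differ.
GESTURES = ["hello", "thanks", "yes", "no"]

def decode_prediction(probs, threshold=0.8):
    """Map a vector of per-gesture probabilities to output text.
    Returns the best gesture's text only when the model is confident,
    otherwise None, so uncertain frames produce no spurious output."""
    best = max(range(len(probs)), key=probs.__getitem__)
    return GESTURES[best] if probs[best] >= threshold else None
```

In a live pipeline this would run every few frames, with the returned text displayed on screen or passed to a text-to-speech engine.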

This project showcases the potential of combining Intel's oneAPI platform with the capabilities of MediaPipe to create innovative solutions for real-time gesture recognition and translation. It has the potential to empower individuals with hearing impairments and enhance inclusivity by facilitating communication between diverse groups of people.

Methodology / Approach

The project was built and run with the following stack:

  • Intel oneAPI Base Toolkit
  • JupyterLab with the Intel oneAPI 2023 kernel
  • MediaPipe
  • TensorFlow
  • Keras
  • OpenCV


Repository

https://github.com/Er-AI-GK/oneAPI-Sign-Language-Gesture-Translator.git
