

Logo

HGCv1

A Hand Gesture Control Module based on Google's Mediapipe and a custom model built on top of it for gesture classification.

View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Roadmap
  4. Contributing
  5. License
  6. Contact

About The Project

HGC is an easy-to-use module for hand gesture control in any application. The module facilitates creating a custom dataset, training a model on it, and using that model to control any application. It is based on Google's Mediapipe for hand detection and tracking, with a custom model trained on top of it for gesture classification.
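As a rough illustration of the idea (not the module's actual code), a gesture classifier built on top of Mediapipe typically flattens the 21 hand landmarks into a normalized feature vector before feeding it to a small model. The preprocessing below is a common approach and an assumption about HGCv1's internals, not a confirmed one:

```python
import numpy as np

def landmarks_to_features(landmarks):
    """Turn 21 (x, y) hand landmarks into a translation- and
    scale-invariant feature vector (a common preprocessing step;
    the actual HGCv1 pipeline may differ)."""
    pts = np.asarray(landmarks, dtype=np.float32)  # shape (21, 2)
    pts -= pts[0]                 # make coordinates relative to the wrist
    scale = np.abs(pts).max()     # normalize by the largest offset
    if scale > 0:
        pts /= scale
    return pts.flatten()          # shape (42,), ready for a classifier

# Example with a fake set of 21 landmarks:
fake = [(0.5 + 0.01 * i, 0.5 - 0.01 * i) for i in range(21)]
features = landmarks_to_features(fake)
print(features.shape)  # (42,)
```

A vector like this is what the gesture-classification model would consume, one prediction per frame.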

Demo

Mediapipe is an excellent base for such an application since it's incredibly fast and optimized for CPUs and edge devices. The demo below shows Mediapipe in action.

demo


This demo shows the module in action. The module is able to detect and track the hand, classify the gesture, and control the drone all in real time.

demo


Here is a graph of how the application for drone control is constructed.

demo


Built With

The project uses the following frameworks and packages, among others: Mediapipe (hand detection and tracking) and TensorFlow Lite (the gesture-classification model).

Getting Started

To get started, install the module's dependencies. If you plan to use the pre-trained model (8 gestures), you can skip the training part. To train your own custom model, you first need to collect a dataset; the module provides a simple way to collect one and train a model on it.

Prerequisites

  • Install Requirements:
    pip install -r requirements.txt

Please check the comments in the requirements file and remove what you don't need before installing to avoid unnecessary installations.

Module Usage

  1. Step 1: Clone the repo using
    git clone https://github.com/FaragSeif/HGCv1.git
    or simply click on "Download ZIP" from the repo page.
  2. Step 2: Run the dataset collection script if you want to collect your own custom gestures dataset.
    python create_dataset.py -n 3 -s 1000 -l label1 label2 label3 -p path/to/dataset
    where:
    • -n is the number of gestures you want to collect
    • -s is the number of samples to collect for each gesture
    • -l is the list of gesture labels
    • -p is the path to the dataset folder
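The flags above could be parsed with argparse roughly like this. This is a hypothetical sketch of the script's interface, not its actual source; check create_dataset.py for the exact option names and defaults:

```python
import argparse

# Hypothetical reconstruction of create_dataset.py's CLI
# (illustrative only; the real script may differ).
parser = argparse.ArgumentParser(description="Collect a hand-gesture dataset")
parser.add_argument("-n", "--num-gestures", type=int, required=True,
                    help="number of gestures to collect")
parser.add_argument("-s", "--samples", type=int, default=1000,
                    help="samples to record per gesture")
parser.add_argument("-l", "--labels", nargs="+", required=True,
                    help="one label per gesture")
parser.add_argument("-p", "--path", default="dataset",
                    help="output folder for the dataset")

args = parser.parse_args(
    "-n 3 -s 1000 -l label1 label2 label3 -p path/to/dataset".split()
)
assert args.num_gestures == len(args.labels)  # one label per gesture
print(args.labels)  # ['label1', 'label2', 'label3']
```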

  3. Step 3: Create your application using an HGClassifier object. Give it the path to the TFLite model you want to use (either custom or pre-trained), then specify your video stream source (0 is the default webcam on most laptops). Then use the HGClassifier object to classify the gesture and control your desired application.
    from HGCv1.models import HGClassifier
    classifier = HGClassifier(model_path='path/to/model', src=0)
    command, image = classifier.detect(draw_on_image=True)
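From there, an application loop can map each predicted command to an action. A minimal sketch follows; the gesture labels ("takeoff", "land") and the dispatcher helper are hypothetical examples, not part of the module's API — use whatever labels your model was trained with:

```python
# Sketch: route classifier output strings to application actions.
# The labels below are hypothetical; substitute your model's labels.
def make_dispatcher(actions, default=None):
    """Return a function that routes a predicted label to its handler."""
    def dispatch(command):
        handler = actions.get(command, default)
        return handler() if handler else None
    return dispatch

log = []
dispatch = make_dispatcher({
    "takeoff": lambda: log.append("drone: takeoff"),
    "land":    lambda: log.append("drone: land"),
})

# In a real app, `command` would come from classifier.detect() each frame.
for command in ["takeoff", "unknown", "land"]:
    dispatch(command)

print(log)  # ['drone: takeoff', 'drone: land']
```

Unrecognized commands fall through to the (optional) default handler, so a noisy misclassification in one frame does nothing rather than crashing the loop.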

Roadmap

In the future, we plan to:

  1. Add more application examples to the module.
  2. Publish the module on PyPI.
  3. Make it more user friendly.
  4. Add a GUI for the dataset collection and training process.

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Seif Farag - @FaragSeif - [email protected]
Nabila Adawy - @NabilaAdawy - [email protected]

Project Link: https://github.com/FaragSeif/HGCv1

(back to top)
