GUI for Machine Learning Application

The GUI is built with Python, using PyQt5 and Qt Designer.
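As a rough sketch of how a Qt Designer layout can be loaded into a PyQt5 application (the file name main_window.ui is only a placeholder, not the project's actual file):

```python
# Minimal sketch: load a Qt Designer .ui file and show it with PyQt5.
import sys

from PyQt5 import QtWidgets, uic


def main():
    app = QtWidgets.QApplication(sys.argv)
    window = uic.loadUi("main_window.ui")  # build the widget tree from the Designer file
    window.show()
    sys.exit(app.exec_())


if __name__ == "__main__":
    main()
```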

About the GUI

The GUI merges two independent ML applications:

  • Hand gesture recognizer
  • Activity tracker and analyzer

The ML side of the application uses the MediaPipe API to keep the application lightweight.
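A minimal sketch of how MediaPipe's hand-tracking solution can pull landmarks from a webcam feed (not the project's exact code):

```python
# Sketch: extract hand landmarks with MediaPipe from the webcam feed.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # 21 (x, y, z) landmarks, normalized to the frame size.
            landmarks = results.multi_hand_landmarks[0].landmark
            print(landmarks[0].x, landmarks[0].y)
cap.release()
```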

Dependencies

This project is not primarily intended to be used as a shared application. However, if you want to give it a try, you can set up your own Firebase project. You will also need to install the dependencies listed below with pip (see the example command after the list).

  • Pyrebase4
  • firebase_admin
  • mediapipe
  • opencv-python
  • pyqt5
  • matplotlib
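For example, everything can be installed in one command:

```
pip install Pyrebase4 firebase_admin mediapipe opencv-python pyqt5 matplotlib
```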

What's Inside

Login and Signup Page

Authentication is handled by Google Firebase, which lets the project focus on the application logic and styling.

[Screenshots: login and signup pages]
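A sketch of what the login/signup flow looks like with Pyrebase4 (the config values are placeholders for your own Firebase project, and the email/password values are illustrative):

```python
# Sketch: Firebase email/password authentication via Pyrebase4.
import pyrebase

config = {
    "apiKey": "YOUR_API_KEY",
    "authDomain": "YOUR_PROJECT.firebaseapp.com",
    "databaseURL": "https://YOUR_PROJECT.firebaseio.com",
    "storageBucket": "YOUR_PROJECT.appspot.com",
}

firebase = pyrebase.initialize_app(config)
auth = firebase.auth()

# Signup: create a new account.
auth.create_user_with_email_and_password("user@example.com", "password123")

# Login: returns a user dict with an idToken for later requests.
user = auth.sign_in_with_email_and_password("user@example.com", "password123")
```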

Home Page

This is the home page of the GUI after logging in.

[Screenshot: home page and user profile]

Hand Gesture Recognition

Hand gestures can be trained and then recognized. Recognition is based on the distance matrix of the hand landmark points.

[Screenshots: gesture training and gesture recognition]
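As an illustration of the distance-matrix idea (the threshold, normalization, and template format below are assumptions, not the project's actual values):

```python
# Sketch: compare the pairwise-distance matrix of the current hand landmarks
# against a stored gesture template.
import numpy as np


def distance_matrix(landmarks):
    """landmarks: (21, 3) array of (x, y, z) points -> (21, 21) pairwise distances."""
    pts = np.asarray(landmarks, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    return np.linalg.norm(diff, axis=-1)


def matches(landmarks, template, threshold=0.1):
    """Return True if the current hand matches a trained gesture template."""
    current = distance_matrix(landmarks)
    # Normalize by hand size so the comparison is scale-invariant.
    current = current / current.max()
    template = template / template.max()
    return np.mean(np.abs(current - template)) < threshold
```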

Activity Tracker

The activity tracker detects activities that occur inside a region of interest (RoI). Simple statistics are shown in the app beside the camera feed, and the user gets a graph summarizing the activities.

[Screenshots: activity tracker and activity summary graph]
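A small sketch of the RoI idea and the summary graph (the RoI coordinates and the per-activity counts are placeholder values, not produced by the project's detector):

```python
# Sketch: crop the camera frame to a region of interest and plot a summary graph.
import cv2
import matplotlib.pyplot as plt

x, y, w, h = 100, 80, 320, 240  # example region of interest

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    roi = frame[y:y + h, x:x + w]  # only this crop would be passed to the detector
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)  # draw the RoI

# Example summary graph of detected activities.
counts = {"walking": 12, "sitting": 30, "standing": 8}
plt.bar(list(counts.keys()), list(counts.values()))
plt.ylabel("detections")
plt.title("Activity summary")
plt.show()
```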

Acknowledgments

The GUI was made possible through collaboration with @wiryanatasunardi and Evelio (outside of GitHub).