The GUI is built with Python and PyQt5, with the windows laid out in Qt Designer.
The GUI combines two independent ML applications:
- Hand gesture recognizer
- Activity tracker and analyzer
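As a reference for how the Qt Designer layouts are wired up, below is a minimal sketch of loading a `.ui` file into a PyQt5 window; the filename `main_window.ui` is a placeholder, not necessarily this project's actual file.

```python
import sys
from PyQt5 import QtWidgets, uic

class MainWindow(QtWidgets.QMainWindow):
    def __init__(self):
        super().__init__()
        # Load the layout exported from Qt Designer; the .ui filename
        # here is a placeholder for the project's actual file.
        uic.loadUi("main_window.ui", self)

if __name__ == "__main__":
    app = QtWidgets.QApplication(sys.argv)
    window = MainWindow()
    window.show()
    sys.exit(app.exec_())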
The ML side of the application uses the MediaPipe API to keep the application lightweight.
This project is not really intended to be used as a shared application. However, if you want to give it a try, you can set up your own Firebase project.
You will also need to install the dependencies listed below with pip (a one-line install command follows the list):
- Pyrebase4
- firebase_admin
- mediapipe
- opencv-python
- pyqt5
- matplotlib
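For example, all of them can be installed in one go:

```sh
pip install Pyrebase4 firebase_admin mediapipe opencv-python pyqt5 matplotlib
```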
Authentication is handled by Google Firebase, so the project can focus on the application logic and styling.
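Below is a minimal sketch of what email/password sign-in looks like with Pyrebase4; the config values are placeholders you would copy from your own Firebase project settings, and the exact flow in this app may differ.

```python
import pyrebase

# Placeholder config: copy these values from your own Firebase project.
config = {
    "apiKey": "YOUR_API_KEY",
    "authDomain": "your-project.firebaseapp.com",
    "databaseURL": "https://your-project-default-rtdb.firebaseio.com",
    "storageBucket": "your-project.appspot.com",
}

firebase = pyrebase.initialize_app(config)
auth = firebase.auth()

# Sign an existing user in; Pyrebase raises an HTTPError on bad credentials.
user = auth.sign_in_with_email_and_password("user@example.com", "password123")
print("Signed in, idToken starts with:", user["idToken"][:16])
```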
This is the home page of the GUI after logging in.
Hand gestures can be trained and then recognized. Recognition is based on the matrix of distances between every pair of landmark points in the hand.
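A sketch of how such a pairwise distance matrix could be computed from MediaPipe hand landmarks is shown below. The template-matching step (nearest trained matrix by Frobenius norm) is an assumption for illustration, not necessarily the project's exact metric.

```python
import cv2
import mediapipe as mp
import numpy as np

def landmark_distance_matrix(hand_landmarks):
    # Stack the 21 MediaPipe landmarks into a (21, 3) array of
    # normalized coordinates, then take all pairwise Euclidean distances.
    pts = np.array([[lm.x, lm.y, lm.z] for lm in hand_landmarks.landmark])
    return np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

def closest_gesture(live, templates):
    # Hypothetical matching step: nearest trained gesture by Frobenius
    # norm between distance matrices; `templates` maps name -> matrix.
    return min(templates, key=lambda name: np.linalg.norm(live - templates[name]))

with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    frame = cv2.imread("hand.jpg")  # placeholder image of a hand
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        matrix = landmark_distance_matrix(results.multi_hand_landmarks[0])
```

Because the matrix only encodes relative distances between points, it is unaffected by where the hand sits in the frame or how it is rotated, which makes it a convenient gesture signature.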
The activity tracker detects activities caught inside a region of interest (RoI). Simple statistics are shown in the app beside the camera feed, and the user gets a graph that summarizes the activities.
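The README does not spell out the detection method, so the sketch below illustrates one common approach: OpenCV background subtraction inside a hard-coded RoI. The RoI coordinates and the activity threshold are made-up placeholders.

```python
import cv2

# Hypothetical RoI as (x, y, width, height) in pixels; adjust to your view.
ROI = (100, 80, 320, 240)

cap = cv2.VideoCapture(0)
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    x, y, w, h = ROI
    roi = frame[y:y + h, x:x + w]
    # Foreground mask inside the RoI; a large foreground area counts
    # as activity (5% of the RoI is an arbitrary example threshold).
    mask = subtractor.apply(roi)
    active = cv2.countNonZero(mask) > 0.05 * w * h
    color = (0, 255, 0) if active else (0, 0, 255)
    cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
    cv2.imshow("Activity tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Per-frame activity flags like `active` can then be accumulated over time and plotted with matplotlib to produce the summary graph mentioned above.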
The GUI is made possible by collaboration with @wiryanatasunardi and Evelio, outside of GitHub.