The Gesture Control System is a project designed to recognize and interpret hand gestures using a custom Convolutional Neural Network (CNN) model. The system builds on popular machine learning libraries: TensorFlow, Keras, OpenCV, and MediaPipe. Its main goal is to provide a gesture-based control interface, letting users interact with their devices through hand gestures.
- TensorFlow and Keras Integration: TensorFlow and Keras are used to build and train the custom CNN model responsible for hand gesture recognition (a minimal model sketch appears after this list).
- OpenCV and MediaPipe: OpenCV and MediaPipe are used for capturing and processing video input, providing a real-time feed for gesture recognition (see the capture-loop sketch below).
- Data Collection: The training data comes from a Kaggle dataset containing 10 different hand gestures.
- Custom CNN Model: A custom CNN model trained on this dataset achieves 97.4% accuracy in gesture recognition (a training sketch follows the list).
- Interface with PyAutoGUI: PyAutoGUI translates recognized gestures into mouse and keyboard actions, integrating the trained model with gesture-based device control (see the mapping sketch at the end of this section).
- Collaborative Development with Colab: The model was developed and trained in Google Colab, which provides a collaborative, cloud-based environment.
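
For reference, here is a minimal sketch of what the Keras side of the model could look like. The layer sizes, the 64x64 grayscale input, and most hyperparameters are illustrative assumptions based on the dataset description above, not the project's published architecture; only the 10-class output reflects the dataset.

```python
# Minimal sketch of a CNN for 10-class gesture recognition.
# Layer sizes and the 64x64 grayscale input are assumptions;
# the project's actual architecture may differ.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_gesture_cnn(input_shape=(64, 64, 1), num_classes=10):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```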
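The real-time feed mentioned above could be wired up roughly as follows. This sketch only detects and draws hand landmarks, and it assumes a single webcam at index 0; how the project crops and normalizes the hand region before feeding it to the CNN is not shown here.

```python
# Sketch of the real-time capture loop: OpenCV reads webcam frames,
# MediaPipe Hands locates the hand and overlays its landmarks.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)  # assumes the default webcam at index 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            mp.solutions.drawing_utils.draw_landmarks(
                frame, hand, mp.solutions.hands.HAND_CONNECTIONS)
    cv2.imshow("Gesture Control", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```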
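Training and evaluation might look like the sketch below, assuming the Kaggle images are arranged in per-class folders. The directory paths, batch size, and epoch count are placeholders; the 97.4% figure reported above comes from the project's own run, not from this sketch.

```python
# Sketch of training/evaluation, assuming per-class image folders
# under data/train and data/test (paths are placeholders).
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(64, 64), color_mode="grayscale",
    label_mode="categorical", batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/test", image_size=(64, 64), color_mode="grayscale",
    label_mode="categorical", batch_size=32)

model = build_gesture_cnn()  # from the model sketch above
model.fit(train_ds, validation_data=val_ds, epochs=15)
loss, acc = model.evaluate(val_ds)
print(f"validation accuracy: {acc:.1%}")
```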
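Finally, a sketch of how predicted labels could be mapped to desktop actions with PyAutoGUI. The gesture names and the chosen actions here are hypothetical, since the actual gesture-to-action mapping is not listed above.

```python
# Sketch of mapping predicted gesture labels to PyAutoGUI actions.
# Both the label names and the actions are hypothetical examples.
import pyautogui

GESTURE_ACTIONS = {
    "palm": lambda: pyautogui.press("space"),        # e.g. play/pause
    "fist": lambda: pyautogui.press("volumemute"),
    "thumbs_up": lambda: pyautogui.press("volumeup"),
    "thumbs_down": lambda: pyautogui.press("volumedown"),
    "ok": lambda: pyautogui.hotkey("alt", "tab"),    # switch window
}

def perform_action(label: str) -> None:
    """Trigger the desktop action mapped to a recognized gesture."""
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()
```

Keeping the mapping in a dictionary concentrates all gesture-to-action bindings in one place, so supporting a new gesture only requires adding an entry.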