VIRTUAL MOUSE USING HAND GESTURES
DOI: https://doi.org/10.6084/m9.figshare.26090662

Abstract
In this paper, we present a system for gesture-based control of a virtual mouse using computer vision and machine learning techniques. The system uses the MediaPipe library and OpenCV to detect and track hand gestures in real time through a standard webcam. The virtual mouse controller, implemented in Python, interprets these gestures to enable intuitive, natural interaction with the computer.
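A minimal sketch of such a real-time tracking loop is shown below; the confidence thresholds and the two-hand limit are illustrative choices, not values reported in this paper.

```python
# Real-time hand-landmark tracking with MediaPipe and OpenCV.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # standard webcam
with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV captures in BGR.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        results = hands.process(rgb)
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```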
Our approach detects and tracks multiple hand landmarks, allowing precise mapping of hand movements and finger positions. The system recognizes gestures such as right and left swipes, screenshots, left and right clicks, double-clicks, scrolling, zooming, and dragging, enhancing the user experience in virtual environments. These gestures are translated into corresponding mouse actions with the PyAutoGUI library, enabling seamless integration with existing applications.
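To illustrate how a landmark can drive a mouse action, the sketch below maps the index fingertip to the cursor and treats a thumb–index pinch as a left click. The `handle_landmarks` helper and the pinch threshold of 0.05 are our own assumptions for exposition, not the paper's exact gesture-classification logic.

```python
# Translating one hand's MediaPipe landmarks into PyAutoGUI mouse actions.
import math
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()
INDEX_TIP, THUMB_TIP = 8, 4  # MediaPipe hand-landmark indices

def handle_landmarks(landmarks):
    # Landmark coordinates are normalized to [0, 1]; scale to the screen.
    tip = landmarks[INDEX_TIP]
    pyautogui.moveTo(tip.x * SCREEN_W, tip.y * SCREEN_H)

    # Interpret a thumb-index pinch (illustrative threshold) as a left click.
    thumb = landmarks[THUMB_TIP]
    if math.hypot(tip.x - thumb.x, tip.y - thumb.y) < 0.05:
        pyautogui.click()
```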
The gesture control system adapts to diverse hand gestures and performs robustly under varying lighting conditions. It also introduces a dynamic and responsive cursor-movement algorithm, enhancing user control and precision.
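One plausible form of such a cursor-movement algorithm is exponential smoothing of the raw fingertip positions, which suppresses jitter while keeping the cursor responsive. The sketch below assumes this technique; the smoothing factor ALPHA is our own illustrative value.

```python
# Exponential smoothing of raw fingertip positions before moving the cursor.
import pyautogui

ALPHA = 0.3  # 0 = frozen cursor, 1 = raw (jittery) input
_smooth_x, _smooth_y = pyautogui.position()

def move_cursor(raw_x, raw_y):
    global _smooth_x, _smooth_y
    # Blend each new reading with the running estimate to damp hand tremor.
    _smooth_x += ALPHA * (raw_x - _smooth_x)
    _smooth_y += ALPHA * (raw_y - _smooth_y)
    pyautogui.moveTo(_smooth_x, _smooth_y)
```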
Our research contributes to the field of human-computer interaction by providing an innovative and practical solution for hands-free computer control. The gesture-controlled virtual mouse has potential applications in domains including accessibility technology, virtual reality, and interactive presentations. The paper concludes with a discussion of future enhancements and extensions, emphasizing the system's versatility and its role in the development of intuitive human-computer interfaces.