Flow Chart of AI Virtual Mouse

Figure: flow chart of the AI virtual mouse system.

Real-Time AI Virtual Mouse System Using Computer Vision

AI Virtual Mouse is software that allows users to give mouse inputs to the system without using an actual mouse. At a stretch it can also be considered a hardware solution, since it relies on an ordinary camera. A virtual mouse can usually be operated alongside multiple input devices, which may include an actual mouse or a computer keyboard. The virtual mouse uses a web camera together with different image processing techniques. Finger-detection methods give instant camera access, and a user-friendly interface makes the system easily accessible. The system implements a motion-tracking mouse in place of a physical mouse, which saves time and also reduces effort. The hand movements of a user are mapped into mouse inputs, and a web camera is set to capture images continuously. Most laptops today are equipped with webcams, which have recently been used in security applications utilizing face recognition. To harness the full potential of a webcam, it can be used for vision-based cursor control (CC), which would effectively eliminate the need for a computer mouse or mouse pad. The usefulness of a webcam can also be greatly extended to other HCI applications such as a sign language database or a motion controller.
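As a rough sketch of the capture stage, the snippet below uses OpenCV to read frames from the webcam continuously. The device index 0, the mirrored preview, the window name, and the 'q' quit key are assumptions for illustration; the hand-detection step would run on each captured frame.

```python
import cv2

# Open the default webcam (device index 0 is an assumption; adjust as needed).
cap = cv2.VideoCapture(0)

while True:
    success, frame = cap.read()            # grab one frame per loop iteration
    if not success:
        break
    frame = cv2.flip(frame, 1)             # mirror the image so movement feels natural
    # ...hand detection and gesture mapping would be applied to `frame` here...
    cv2.imshow("AI Virtual Mouse", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```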

Software Specification:

  • Python Libraries: Various Python libraries like OpenCV, NumPy, PyAutoGUI, and TensorFlow can be used for building the AI virtual mouse system.
  • OpenCV: This library is used for image and video processing, which supports hand detection and tracking.
  • NumPy: NumPy is used for numerical computations; here it is used to process the captured images.
  • PyAutoGUI: PyAutoGUI is used to control the mouse movements and clicks.
  • MediaPipe: A cross-platform framework for building multi-modal applied machine learning pipelines.
  • Comtypes: A Python module that provides access to Windows COM and .NET components.
  • Screen-Brightness-Control: A Python module for controlling the brightness of the screen on Windows, Linux, and macOS.

Detecting Which Finger Is Up and Performing the Particular Mouse Function

Here we detect which finger is up using the tip ID of the respective finger, found using MediaPipe, together with the coordinates of the fingers that are raised; based on this, the particular mouse function is performed.
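A minimal sketch of this step, assuming MediaPipe's 21-landmark hand model (fingertips at landmark indices 4, 8, 12, 16, and 20, so the index finger sits at position 1 of the tip-ID list): a finger is treated as "up" when its tip lies above its middle joint. The thumb test and the mirrored-right-hand assumption are simplifications.

```python
# Fingertip landmark indices in MediaPipe's 21-point hand model; position 1 in
# this list is the index finger (tip ID = 1 above), position 2 the middle finger.
TIP_IDS = [4, 8, 12, 16, 20]

def fingers_up(landmarks):
    """Return five 0/1 flags (thumb first) given one hand's MediaPipe landmarks."""
    fingers = []
    # The thumb folds sideways, so compare x-coordinates (assumes a mirrored right hand).
    fingers.append(1 if landmarks[TIP_IDS[0]].x < landmarks[TIP_IDS[0] - 1].x else 0)
    # Other fingers: the tip is above the middle joint (smaller y) when raised.
    for tip in TIP_IDS[1:]:
        fingers.append(1 if landmarks[tip].y < landmarks[tip - 2].y else 0)
    return fingers
```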

Mouse functions depend on the detected hand gestures and hand tips: computer vision is used to move the mouse cursor around the system window. If only the index finger (tip ID = 1) is up, or both the index finger (tip ID = 1) and the middle finger (tip ID = 2) are up, the mouse cursor is made to move around the window of the computer using the AutoPy package of Python.
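A hedged sketch of that cursor-moving step is shown below, assuming a 640x480 capture resolution and a 100-pixel usable-frame margin (both arbitrary choices here); NumPy's interp maps the fingertip from camera coordinates to screen coordinates before AutoPy moves the cursor.

```python
import autopy
import numpy as np

screen_w, screen_h = autopy.screen.size()
CAM_W, CAM_H = 640, 480   # assumed capture resolution
MARGIN = 100              # border so the screen edges stay reachable (assumption)

def move_cursor(index_tip):
    """Map the index fingertip (normalized MediaPipe coords) to the screen."""
    x_px, y_px = index_tip.x * CAM_W, index_tip.y * CAM_H
    # Interpolate from the usable camera region to the full screen area.
    sx = np.interp(x_px, (MARGIN, CAM_W - MARGIN), (0, screen_w))
    sy = np.interp(y_px, (MARGIN, CAM_H - MARGIN), (0, screen_h))
    # Clamp slightly inside the screen; autopy rejects out-of-bounds points.
    autopy.mouse.move(min(sx, screen_w - 1), min(sy, screen_h - 1))
```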

Coordinates or Landmarks in the Hand Using MediaPipe
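MediaPipe's hand model returns 21 landmarks per hand, indexed 0 (the wrist) through 20 (the little-finger tip), each with normalized x, y, and z coordinates. The sketch below, a rough illustration, prints the pixel coordinates of every landmark found in a single frame.

```python
import cv2
import mediapipe as mp

def print_landmarks(frame):
    """Print (id, x, y) in pixels for the 21 landmarks of the first hand found."""
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # MediaPipe expects RGB input
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        results = hands.process(rgb)
    if results.multi_hand_landmarks:
        h, w, _ = frame.shape
        for idx, lm in enumerate(results.multi_hand_landmarks[0].landmark):
            print(idx, int(lm.x * w), int(lm.y * h))  # tips are ids 4, 8, 12, 16, 20
```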


How to set up and run AI Virtual Mouse

Prerequisites: Check the Python version – Python 3.6 or 3.8.5 must be installed on your system. You also need to install the Anaconda distribution. Refer to these articles: How to Install Anaconda on Windows? How to Install Anaconda on Linux? How to Install Anaconda on MacOS?...
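As a quick sanity check before installing the dependencies, a sketch like the following can verify the interpreter version (the bounds mirror the versions recommended above):

```python
import sys

# The project targets Python 3.6 - 3.8.x; newer interpreters may lack
# prebuilt wheels for some of the packages listed earlier.
assert (3, 6) <= sys.version_info[:2] <= (3, 8), \
    "Please use Python 3.6-3.8 as recommended above."
print("Python", sys.version.split()[0], "OK")
```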

Importing Required Modules

First, we need to set up the necessary dependencies for a Python program that may involve computer vision, audio control, and screen brightness control....
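A plausible import block for such a program is sketched below. The pycaw import is an assumption on our part (it is the audio library commonly layered on the comtypes module listed in the software specification); the remaining imports follow the libraries named above.

```python
# Computer vision and hand-landmark tracking
import cv2
import mediapipe as mp
import numpy as np

# Simulated mouse input
import pyautogui

# System volume control on Windows (pycaw builds on comtypes)
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

# Screen brightness control
import screen_brightness_control as sbc
```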

Convert Mediapipe Landmarks to recognizable Gestures

...

Executes commands according to detected gestures

...

Entry point of Gesture Controller

The code provided here defines a class called “HandRecog”, which performs gesture recognition using hand tracking. It initializes variables such as the finger count, gesture type, frame count, and hand result. The class has methods to update the hand result; to calculate the signed distance, the distance, and the change in z-coordinate between hand landmarks; and to set the finger state based on the hand result while handling fluctuations due to noise....
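The skeleton below, with assumed internals, illustrates the shape of such a class; the debounce in get_gesture shows one way to "handle fluctuations due to noise" by requiring a gesture to persist for several frames before it is reported.

```python
import math

class HandRecog:
    """Sketch of the gesture-recognition helper described above (details assumed)."""

    def __init__(self):
        self.finger = 0          # encoded finger state
        self.ori_gesture = 0     # gesture currently being reported
        self.prev_gesture = 0
        self.frame_count = 0     # frames the candidate gesture has persisted
        self.hand_result = None

    def update_hand_result(self, hand_result):
        self.hand_result = hand_result   # latest MediaPipe landmark set

    def get_dist(self, points):
        """Euclidean distance between two landmark indices."""
        lm = self.hand_result.landmark
        return math.hypot(lm[points[0]].x - lm[points[1]].x,
                          lm[points[0]].y - lm[points[1]].y)

    def get_gesture(self, candidate):
        """Report `candidate` only after it persists a few frames (threshold
        assumed), smoothing out single-frame detection noise."""
        if candidate == self.prev_gesture:
            self.frame_count += 1
        else:
            self.frame_count = 0
        self.prev_gesture = candidate
        if self.frame_count > 4:
            self.ori_gesture = candidate
        return self.ori_gesture
```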

GUI Code

...

Conclusion

Finally, we define a Python class called “Controller” that provides methods for gesture recognition and for controlling various system functions such as screen brightness, system volume, and scrolling. The class keeps variables that track hand positions, gesture flags, and pinch thresholds. Its methods include “getpinchylv”, “getpinchxlv”, “changesystembrightness”, “changesystemvolume”, “scrollVertical”, “scrollHorizontal”, “get_position”, “pinch_control_init”, “pinch_control”, and “handle_controls”. These methods calculate distances and changes between hand landmarks, set finger states, determine the current gesture, and control system functions based on the recognized gestures....
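To make the description concrete, here is a hedged sketch of a few of those methods; the step sizes, the clamping, and the use of pycaw for the volume call are assumptions rather than the article's exact code.

```python
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume
import pyautogui
import screen_brightness_control as sbc

class Controller:
    """Sketch of the Controller described above; only a few methods are shown."""

    @staticmethod
    def changesystembrightness(delta):
        # Nudge screen brightness by a pinch-derived delta, clamped to 0-100 %.
        current = sbc.get_brightness(display=0)[0]
        sbc.set_brightness(max(0, min(100, current + delta)), display=0)

    @staticmethod
    def changesystemvolume(level):
        # Set master volume to a value in [0.0, 1.0] via pycaw/comtypes (Windows).
        interface = AudioUtilities.GetSpeakers().Activate(
            IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
        volume = cast(interface, POINTER(IAudioEndpointVolume))
        volume.SetMasterVolumeLevelScalar(max(0.0, min(1.0, level)), None)

    @staticmethod
    def scrollVertical(direction):
        pyautogui.scroll(120 if direction > 0 else -120)

    @staticmethod
    def scrollHorizontal(direction):
        pyautogui.hscroll(120 if direction > 0 else -120)
```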