Control your computer with hand gestures using computer vision
Virtual Mouse is a computer vision project that lets you control your mouse cursor and perform various actions using simple hand gestures. No need for a physical mouse - just use your webcam and hand movements!
- Precise cursor control with hand tracking
- Multiple gesture support (click, right-click, scroll, drag)
- Auto-calibration adapts to your hand size
- Real-time processing with smooth performance
- Error handling with graceful failure recovery
- Configurable settings for personalization
- Cross-platform support (Windows, macOS, Linux)
| Gesture | Action | How to Perform |
|---|---|---|
| Pinch | Left Click | Touch thumb and index finger together |
| Peace Sign | Right Click | Show index and middle fingers only |
| Three Fingers | Scroll Up | Extend index, middle, and ring fingers |
| Four Fingers | Scroll Down | Extend all fingers except thumb |
| Closed Fist | Drag | Make a fist to drag objects |
| Open Hand | Screenshot | Show all five fingers |
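With MediaPipe-style hand landmarks, the finger-count gestures above mostly reduce to comparing each fingertip to the joint below it. Here is an illustrative sketch, not necessarily this project's exact logic, assuming MediaPipe Hands' 21-landmark indexing and normalized image coordinates (smaller y means higher in the frame):

```python
# Illustrative: decide which non-thumb fingers are extended, given
# 21 (x, y) hand landmarks in MediaPipe Hands' indexing. In image
# coordinates an extended finger has its tip above its PIP joint.

FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def extended_fingers(landmarks):
    """Return the set of non-thumb fingers whose tip is above its PIP joint."""
    return {
        name
        for name, tip in FINGER_TIPS.items()
        if landmarks[tip][1] < landmarks[FINGER_PIPS[name]][1]
    }

def classify(landmarks):
    """Map an extended-finger set to one of the gestures in the table."""
    up = extended_fingers(landmarks)
    if up == {"index", "middle"}:
        return "right_click"   # peace sign
    if up == {"index", "middle", "ring"}:
        return "scroll_up"     # three fingers
    if up == {"index", "middle", "ring", "pinky"}:
        return "scroll_down"   # four fingers (thumb ignored here)
    if not up:
        return "drag"          # closed fist
    return None
```

The pinch and open-hand gestures need extra logic (thumb position, fingertip distances), but the same tip-versus-joint comparison is the core idea.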
- Python 3.7 or higher
- Webcam
- Good lighting conditions
1. Clone the repository:

   ```bash
   git clone https://github.com/AkshayS734/virtual-mouse.git
   cd virtual-mouse
   ```

2. Create a virtual environment (recommended):

   ```bash
   python -m venv venv
   # On Windows
   venv\Scripts\activate
   # On macOS/Linux
   source venv/bin/activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Run the application:

   ```bash
   python main.py
   ```
That's it! Your webcam should open and you can start controlling your mouse with hand gestures.
**Minimum:**
- CPU: Dual-core processor
- RAM: 4GB
- Camera: Any USB webcam or built-in camera
- OS: Windows 7+, macOS 10.12+, or Linux
**Recommended:**
- CPU: Quad-core processor
- RAM: 8GB
- Camera: HD webcam for better accuracy
- Lighting: Well-lit environment
- Position yourself: Sit 1-2 feet away from your camera
- Check lighting: Ensure your hand is well-lit
- Start the app: Run `python main.py`
- Calibrate: The app will automatically adjust to your hand size
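Auto-calibration can be approximated by scaling gesture thresholds to the observed hand size, so a pinch means the same thing for small and large hands or at different distances from the camera. A hypothetical sketch (landmark indices follow MediaPipe Hands; the 0.25 ratio is an assumed default, not the project's actual value):

```python
import math

# MediaPipe Hands landmark indices used below.
WRIST, THUMB_TIP, INDEX_TIP, MIDDLE_MCP = 0, 4, 8, 9

def hand_span(landmarks):
    """Reference hand size: distance from wrist to the middle-finger knuckle."""
    return math.dist(landmarks[WRIST], landmarks[MIDDLE_MCP])

def is_pinch(landmarks, ratio=0.25):
    """Click when thumb and index tips are closer than a fraction of hand size."""
    gap = math.dist(landmarks[THUMB_TIP], landmarks[INDEX_TIP])
    return gap < ratio * hand_span(landmarks)
```

Because the threshold is a ratio of hand span rather than a fixed pixel distance, no manual tuning is needed when a new user sits down.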
- Move cursor: Point with your index finger
- Click: Make a pinch gesture (thumb + index)
- Right-click: Show peace sign (index + middle)
- Scroll: Use three fingers (up) or four fingers (down)
- Drag: Make a fist and move your hand
- Screenshot: Show all five fingers
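Cursor movement amounts to mapping the normalized fingertip position to screen pixels, then smoothing it so the cursor doesn't jitter. A sketch under the smoothing convention used in `config.py` (0 = no smoothing, 1 = max); the helper names are hypothetical, while `pyautogui.moveTo` is the real call such a position would typically be fed to:

```python
def to_screen(nx, ny, screen_w, screen_h):
    """Map a normalized (0..1) fingertip position to pixel coordinates."""
    return nx * screen_w, ny * screen_h

def smooth(prev, new, factor=0.3):
    """Exponential smoothing: factor 0 follows the hand exactly,
    factor near 1 barely moves (matches CURSOR_SMOOTHING_FACTOR)."""
    px, py = prev
    nx, ny = new
    return (px * factor + nx * (1 - factor),
            py * factor + ny * (1 - factor))

# In the tracking loop one would then do, e.g.:
#   x, y = smooth(last_pos, to_screen(tip.x, tip.y, w, h))
#   pyautogui.moveTo(x, y)
```

The smoothing trades responsiveness for stability: a higher factor hides hand tremor but makes the cursor lag slightly behind fast movements.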
- Q: Quit the application
- R: Reset hand calibration
- Use good lighting
- Stay 1-2 feet from camera
- Make clear, distinct gestures
- Hold gestures for 1-2 seconds
- Press 'R' if gestures aren't recognized
You can customize the app by editing `config.py`:

```python
# Gesture sensitivity (0.1 to 1.0)
GESTURE_CONFIDENCE_THRESHOLD = 0.8

# Cursor smoothing (0 = no smoothing, 1 = max smoothing)
CURSOR_SMOOTHING_FACTOR = 0.3

# Camera settings
CAMERA_INDEX = 0        # Change if you have multiple cameras
FLIP_HORIZONTAL = True  # Mirror the video feed
```

Camera not working?
- Check if other apps are using the camera
- Try changing `CAMERA_INDEX` in `config.py` (0, 1, 2...)
- Restart the application
Gestures not detected?
- Ensure good lighting
- Move closer or farther from camera
- Press 'R' to reset calibration
- Make gestures more distinct
App running slowly?
- Close other heavy applications
- Lower the camera resolution in `config.py`
- Check that other apps aren't saturating your CPU or network in the background
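Lowering the capture resolution is the biggest performance lever. A hypothetical helper that computes a downscaled frame size with the aspect ratio preserved, which could then be passed to `cv2.resize` or used for the camera settings in `config.py`:

```python
def scaled_size(width, height, max_width=640):
    """Return a (w, h) no wider than max_width, preserving aspect ratio."""
    if width <= max_width:
        return width, height
    scale = max_width / width
    return max_width, round(height * scale)
```

Hand tracking usually stays accurate at 640px wide while cutting per-frame processing cost substantially compared to full HD.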
Permission errors on macOS?
- Go to System Preferences → Security & Privacy → Camera
- Enable camera access for Terminal/Python
- Do the same for Accessibility if needed
Enable detailed logging:
```python
# In config.py
LOG_LEVEL = 'DEBUG'
SHOW_DEBUG_INFO = True
```

```
virtual-mouse/
├── .gitignore                # Git ignore rules
├── main.py                   # Main application (start here!)
├── virtual_mouse_modular.py  # Alternative modular version
├── config.py                 # Configuration settings
├── requirements.txt          # Python dependencies
├── LICENSE                   # MIT License
├── README.md                 # You are here
├── gestures/
│   ├── __init__.py           # Package initialization
│   └── gesture_utils.py      # Gesture detection logic
├── screenshots/              # Auto-created for screenshots
└── tests/                    # Unit tests
```
```bash
# Run all tests
python -m pytest tests/

# Run specific test files
python tests/test_fixes.py

# Verify bug fixes
python tests/verify_fixes.py

# Check import compatibility
python tests/check_imports.py
```

We welcome contributions! Here's how to get started:
1. Fork the repository
2. Create a feature branch: `git checkout -b feature-name`
3. Make your changes
4. Test your changes: `python -m pytest`
5. Commit: `git commit -am 'Add new feature'`
6. Push: `git push origin feature-name`
7. Submit a Pull Request
```bash
# Install dependencies
pip install -r requirements.txt

# Optional: Install development tools
pip install pylint black flake8 mypy

# Run code formatting (if tools installed)
black .
flake8 .
```

| Metric | Value |
|---|---|
| Accuracy | ~85% gesture recognition |
| Latency | ~100ms response time |
| FPS | 30 FPS video processing |
| CPU Usage | ~15-25% on modern hardware |
- Multi-hand support for advanced gestures
- Gesture customization interface
- Voice command integration
- Mobile app for remote control
- AI-powered gesture learning
- Eye tracking integration
This project is licensed under the MIT License - see the LICENSE file for details.
- MediaPipe - Google's ML framework for hand tracking
- OpenCV - Computer vision library
- PyAutoGUI - Cross-platform GUI automation
If you find this project helpful, please consider:
- Starring the repository
- Reporting bugs via GitHub Issues
- Suggesting features via GitHub Discussions
- Contributing code or documentation
- GitHub Issues: Report bugs or request features
- Discussions: Ask questions or share ideas
Made with ❤️ for accessible computing