Sign Language to English Translator

Overview

The Sign Language to English Translator is a Python-based application designed to ease communication with people who are deaf or nonverbal. It uses computer vision to recognize hand gestures (sign language) and translates them into on-screen text and spoken English.

Features

  • Detects and interprets common hand gestures using a webcam.
  • Converts recognized gestures into text and speech.
  • Uses MediaPipe for hand landmark detection.
  • Implements real-time translation to bridge communication gaps.

Tech Stack

  • Programming Language: Python
  • Libraries:
    • OpenCV for image processing.
    • MediaPipe for hand gesture recognition.
    • pyttsx3 for text-to-speech conversion.
    • concurrent.futures from the standard library for concurrency, so speech output need not block the video loop (see the sketch below).
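
The README doesn't show how concurrent.futures is wired in; the sketch below is one plausible pattern, offloading the blocking pyttsx3 call to a worker thread so the camera loop keeps running. The helper names speak and speak_async are illustrative, not taken from the project.

    from concurrent.futures import ThreadPoolExecutor
    import pyttsx3

    executor = ThreadPoolExecutor(max_workers=1)  # one worker keeps phrases in order

    def speak(text):
        """Blocking call: pyttsx3 renders the phrase and returns when done."""
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()

    def speak_async(text):
        """Queue a phrase without stalling the frame-capture loop."""
        executor.submit(speak, text)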

Prerequisites

  • Python 3.7 or higher.
  • A functional webcam for gesture detection.
  • Install the required libraries:
    pip install mediapipe opencv-python pyttsx3

Setup and Installation

  1. Clone the repository or download the project files.
  2. Install the required Python packages (a sample requirements.txt is shown after these steps):
    pip install -r requirements.txt
  3. Connect a webcam to your computer.
  4. Run the script:
    python sign_language.py
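
If the repository does not ship a requirements.txt, a minimal one matching the Tech Stack above would be (versions left unpinned):

    mediapipe
    opencv-python
    pyttsx3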

How It Works

  1. Hand Detection: The application uses MediaPipe to detect 21 hand landmarks per frame of the video feed.
  2. Gesture Recognition: Hand-crafted conditions on the landmark positions identify gestures such as:
    • Victory: Hand forming a "V" shape.
    • Thumbs Up/Down: Thumb extended upward or downward.
    • Other gestures, such as "OK", "Call Me", and "Smile".
  3. Translation: Each recognized gesture is mapped to its corresponding English phrase.
  4. Speech Output: The text is converted to speech using pyttsx3 for auditory feedback (a sketch of the full loop follows).
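
A minimal sketch of this loop, assuming MediaPipe's standard 21-landmark hand model (index 4 is the thumb tip, 3 the joint below it, 8 and 6 the index fingertip and its middle joint); the thumbs-up rule here is an illustrative heuristic, not the project's exact conditions:

    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands

    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                lm = results.multi_hand_landmarks[0].landmark
                # Assumed heuristic: thumb tip above its joint, index finger
                # folded (tip below middle joint). Image y grows downward.
                if lm[4].y < lm[3].y and lm[8].y > lm[6].y:
                    cv2.putText(frame, "Thumbs Up", (30, 60),
                                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
            cv2.imshow("Sign Language Translator", frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):
                break
    cap.release()
    cv2.destroyAllWindows()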

Supported Gestures

Gesture       Output Text     Description
Victory       "Victory"       Hand forming a "V".
Thumbs Up     "Thumbs Up"     Thumb pointing upwards.
Thumbs Down   "Thumbs Down"   Thumb pointing downwards.
Smile         "Smile"         Smile gesture made with the hand.
Call Me       "Call Me"       Hand mimicking a phone shape.
Pain          "Pain"          Gesture indicating discomfort.
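
In code, this table is most naturally a lookup. The dictionary below is an assumed shape with illustrative key names, not identifiers taken from sign_language.py:

    GESTURE_PHRASES = {
        "victory": "Victory",
        "thumbs_up": "Thumbs Up",
        "thumbs_down": "Thumbs Down",
        "smile": "Smile",
        "call_me": "Call Me",
        "pain": "Pain",
    }

    def translate(gesture):
        """Return the English phrase for a recognized gesture key, or None."""
        return GESTURE_PHRASES.get(gesture)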

Usage Instructions

  1. Launch the application.
  2. Position your hand in front of the camera.
  3. Perform one of the supported gestures.
  4. The application will:
    • Display the corresponding text on the screen.
    • Announce the phrase using text-to-speech.

Future Enhancements

  • Add support for a broader range of gestures.
  • Improve gesture recognition accuracy.
  • Integrate NLP to allow users to customize responses.
  • Deploy the application as a web or mobile app for accessibility.

Contribution

Contributions are welcome! If you'd like to contribute:

  1. Fork the repository.
  2. Create a feature branch (git checkout -b feature-name).
  3. Commit your changes (git commit -m 'Add feature').
  4. Push to the branch (git push origin feature-name).
  5. Open a pull request.

License

This project is licensed under the MIT License.


We hope this project helps foster better communication and inclusivity. Feel free to reach out with any feedback or suggestions!

