Priyanjali Gupta: The Young Mind Behind India’s AI-Powered Sign Language Translator
At a time when AI is often linked to automation and big tech, Priyanjali Gupta, a 21-year-old computer science undergraduate at Vellore Institute of Technology (VIT) who specializes in data science, is showing that AI can also be a powerful tool for connecting people and making them feel welcome.
Her latest project? An AI-powered real-time sign language translator that converts American Sign Language (ASL) gestures into English text using nothing more than a webcam and open-source developer tools.
This simple but powerful project went viral on social media, drawing attention from both tech communities and accessibility advocates. But the "viral moment" hides a story of purpose, imagination, and the power of student-led innovation.
The Spark: A Mother's Challenge and a Student’s Curiosity
Priyanjali’s story began at home, not in a lab, when her mother challenged her, part in jest and part in earnest, to put her engineering degree to practical use. That challenge planted a seed. Her mother, a teacher, had a student who was hard of hearing, and learning this made Priyanjali realize how difficult it must be for someone who relies on sign language to communicate in a world that largely doesn’t understand it.
Growing more confident in machine learning and computer vision, she resolved to build something genuinely useful, something that would help the deaf and hard-of-hearing community.
How It Works: AI Meets Accessibility
Priyanjali built her model on the TensorFlow Object Detection API, using transfer learning with the pre-trained SSD-MobileNet model, an approach that lets a model trained on one task be adapted to a different one. She ran the project on a Raspberry Pi Model 3B.
- Dataset: She manually collected and labeled a dataset of roughly 250–300 images covering six common ASL gestures: Hello, Thank you, I love you, Yes, No, and Please.
- Training: Using transfer learning, she started from the pre-trained model and fine-tuned it to recognize her six gestures.
- Interface: The application runs off a standard webcam, processing frames in real time and displaying the translated English word on screen.
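To make the last step concrete, here is a minimal sketch of how per-frame detector output might be turned into a displayed word. The detector itself is stubbed out, and the label map, function names, and confidence threshold are illustrative assumptions, not taken from her repository; a real pipeline would obtain `scores` and `classes` from the fine-tuned SSD-MobileNet model for each webcam frame.

```python
# Illustrative post-processing: pick the gesture to display for one frame.
# The label map mirrors the six gestures in the dataset (IDs are arbitrary
# assumptions for this sketch).
LABEL_MAP = {
    1: "Hello",
    2: "Thank you",
    3: "I love you",
    4: "Yes",
    5: "No",
    6: "Please",
}

def top_gesture(scores, classes, threshold=0.8):
    """Return the highest-confidence gesture label above `threshold`,
    or None if no detection is confident enough."""
    best = None
    best_score = threshold
    for score, cls in zip(scores, classes):
        if score >= best_score and cls in LABEL_MAP:
            best, best_score = LABEL_MAP[cls], score
    return best

# Example: raw detector output for a single frame.
frame_scores = [0.07, 0.92, 0.31]
frame_classes = [4, 1, 6]
print(top_gesture(frame_scores, frame_classes))  # -> Hello
```

In the full application this function would sit inside a webcam loop, with the returned label drawn onto the frame before display.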
Her code and documentation, published as open source on GitHub, reveal a simple, well-modularized architecture that anyone with a basic programming background could modify or extend.

Limitations and Vision for Improvement
Priyanjali’s model, though promising, is still a proof of concept rather than a production-ready system. It translates static signs from single frames, but it does not account for facial expressions, dynamic movements, or sentence structure, all of which are essential parts of real-world ASL.
To advance her system, she plans to:
- Apply Long Short-Term Memory (LSTM) networks to video sequence analysis, so the system can understand continuous motion.
- Expand coverage beyond six gestures to the broader ASL vocabulary, including fingerspelling and sentence structure.
- Add facial expression recognition, which is essential for conveying both the meaning and the emotional tone of a sign.
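To illustrate why an LSTM suits the first of these goals, here is a minimal single-cell LSTM forward pass in plain NumPy, carrying state across a sequence of per-frame feature vectors. The sizes and random weights are placeholders for demonstration, not anything from her project; a trained system would learn these weights from labeled gesture videos.

```python
import numpy as np

# Minimal LSTM cell stepped over a sequence of per-frame features.
rng = np.random.default_rng(0)
feat_dim, hidden = 8, 4          # per-frame feature size, hidden state size

# One combined weight matrix for the four gates (input, forget, cell, output).
W = rng.standard_normal((4 * hidden, feat_dim + hidden)) * 0.1
b = np.zeros(4 * hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c):
    """Advance the LSTM one frame: combine input x with previous state (h, c)."""
    z = W @ np.concatenate([x, h]) + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # gate activations
    c = f * c + i * np.tanh(g)                     # update cell memory
    h = o * np.tanh(c)                             # new hidden state
    return h, c

# Run a short "video": 10 frames of feature vectors.
h, c = np.zeros(hidden), np.zeros(hidden)
for frame in rng.standard_normal((10, feat_dim)):
    h, c = lstm_step(frame, h, c)

print(h.shape)  # final hidden state summarizing the whole motion
```

Unlike the single-frame detector, the hidden state here accumulates information across frames, which is what lets a sequence model distinguish gestures defined by motion rather than by a single pose.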
As she puts it:
“I believe sign languages include facial expressions, shoulder movements … it requires a very well-trained deep neural network.”
Why It Matters
Priyanjali’s project is more than just a technical experiment. It offers a glimpse into what the future of inclusive AI could look like:
- Real-time availability: Her translator could be adapted into mobile or web applications, offering an inexpensive means of communication to the millions of people who use sign language.
- Community-driven development: Because her dataset and code are publicly available, other developers, researchers, and students can build on the project, improving and extending it.
- Youth leadership: In an industry dominated by large teams and large budgets, her single-handed innovation shows what young minds can achieve with ambition and skill.
A Role Model for Future Innovators
Priyanjali stands out not only for her engineering skill but for the empathy that guides her ideas. In an era when AI is sometimes criticized as cold or profit-driven, her work is a reminder that technology can also be compassionate.
After graduation, she plans to gain industry experience and then pursue a master's degree focused on AI for accessibility. She hopes to help build a world where AI removes barriers rather than recreating them.
Priyanjali Gupta has made not only her family and university proud but her country as well, showing the world how empathy-driven technology can bring about real, transformative change.