North Texas Teens Develop Real-Time AI Tool for Sign Language Translation


Key Takeaways

  • Two North Texas high‑school seniors, Shiven Velagapudi and Aadi Sanghvi, created an AI‑driven sign‑language translator called Hand Wave in a home‑office lab.
  • The project uses machine‑learning models to recognize American Sign Language (ASL) hand skeletons and delivers real‑time audio translations via smart glasses.
  • Personal connections—an ASL‑speaking uncle and a parent with sudden hearing loss—motivated the friends to solve a real‑world communication barrier.
  • Velagapudi emphasized that the technology began as a hobby: “This was completely unrelated from school. We kinda just picked this up for fun … because we noticed that this was a problem that we wanted to solve in the real world.”
  • Sanghvi described the broader social impact: “I think it fosters unity … it reminds us that we are more similar than we might expect.”
  • With additional time and funding, the duo aims to move Hand Wave from a prototype to a deployable tool for wider public use.
  • The story was originally reported by NBC DFW, later refined with AI‑assisted drafting and human editorial oversight.

Inspiration Rooted in Personal Experience
The genesis of Hand Wave traces back to everyday moments that highlighted a communication gap. Aadi Sanghvi frequently interacted with his uncle, who relies on American Sign Language, while Shiven Velagapudi witnessed his father’s sudden hearing loss in one ear. “Sign language is what connects those sorts of communities, and this is just a technology to help us better understand their community,” Velagapudi explained. These personal stakes gave the friends a clear purpose beyond academic curiosity, turning a casual interest into a mission to bridge deaf and hearing worlds.


From Hobby to Home‑Office Lab
What began as a fun side project quickly evolved into a serious endeavor. The pair repurposed a spare room in Velagapudi’s house into a makeshift AI laboratory, equipping it with a high‑end GPU, cameras, and a pair of smart glasses fitted with a miniature lens. “We kinda just picked this up for fun and mainly because we noticed that this was a problem that we wanted to solve in the real world,” Velagapudi told NBC DFW. The informal setting allowed them to iterate rapidly, testing algorithms late into the night without the constraints of a formal school lab.


Technical Foundations: Machine Learning Meets ASL
At the core of Hand Wave lies a convolutional neural network trained on thousands of labeled ASL hand‑pose images. The model detects key joint positions, rendering a simplified hand skeleton that the software then maps to corresponding letters or phrases. “I can look at you and behind the camera you can be signing and I can be hearing the translations live in my ear,” Sanghvi described, noting that the smart glasses stream video to the phone, where the inference runs and audio feedback is delivered via bone‑conduction speakers. This pipeline enables near‑instantaneous translation, a critical factor for natural conversation.
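The teens' actual model has not been published, but the skeleton-to-letter mapping step described above can be illustrated with a minimal sketch. Everything here is hypothetical: the 21-joint `(x, y)` skeleton format, the `templates` dictionary, and the use of a nearest-neighbor match stand in for the real convolutional network, purely to show how a normalized hand pose could be mapped to a letter.

```python
import math

# Hypothetical input: a hand skeleton as a list of (x, y) joint
# coordinates, with joint 0 as the wrist, in the style of common
# pose-estimation outputs. Letters and templates are illustrative.

def normalize(skeleton):
    """Translate the wrist to the origin and scale the skeleton into a
    unit box, so matching is invariant to hand position and size."""
    wx, wy = skeleton[0]
    shifted = [(x - wx, y - wy) for x, y in skeleton]
    scale = max(max(abs(x), abs(y)) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def distance(a, b):
    """Summed Euclidean distance between corresponding joints."""
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2
                         for (ax, ay), (bx, by) in zip(a, b)))

def classify(skeleton, templates):
    """Return the letter whose stored template pose is nearest to the
    observed skeleton (a nearest-neighbor stand-in for the CNN)."""
    norm = normalize(skeleton)
    return min(templates,
               key=lambda letter: distance(norm, normalize(templates[letter])))
```

In a real pipeline like the one the article describes, each classified letter or phrase would then be fed to a text-to-speech stage on the phone and played back through the glasses' bone-conduction speakers.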


Real‑World Testing and User Feedback
Early trials involved friends and family members who are fluent in ASL. The teens recorded signing sessions, compared the model’s output to ground‑truth translations, and refined the architecture to reduce latency and improve accuracy. Velagapudi recalled a moment when his uncle signed a simple greeting and heard the correct spoken response through the glasses: “It felt surreal—seeing the technology work exactly as we imagined.” Such validation reinforced their belief that the system could handle everyday interactions, not just laboratory demonstrations.


Vision of Unity Through Technology
Beyond the technical achievements, Sanghvi sees Hand Wave as a catalyst for social cohesion. In a world often marked by division, he argues that enabling seamless communication reminds people of their shared humanity. “I think it fosters unity. I think there are a lot of things happening in our world that makes it easy to feel divided, I guess, so I think it reminds us that we are more similar than we might expect,” he said. By allowing hearing individuals to “hear” sign language and vice versa, the tool creates a common conversational space where fears, strengths, dreams, and weaknesses can be exchanged openly.


Challenges Ahead: Scaling and Funding
While the prototype works in a controlled environment, transitioning to a robust, consumer‑ready product presents hurdles. The duo acknowledges the need for larger, more diverse datasets to accommodate regional signing variations and different lighting conditions. Additionally, hardware costs—particularly smart glasses with reliable cameras and battery life—must be addressed. With more time and funding, the friends believe they could have a prototype ready to use beyond their home office, the NBC DFW report noted, indicating that grants, angel investment, or partnerships with accessibility nonprofits could be pivotal next steps.


Ethical Considerations and Community Involvement
Velagapudi and Sanghvi are keen to avoid unilateral development; they actively seek input from the Deaf community to ensure the technology respects cultural nuances and does not inadvertently perpetuate stereotypes. They plan to collaborate with local Deaf advocacy groups for user‑testing sessions and to incorporate feedback on UI design, ensuring that the tool enhances rather than replaces authentic ASL expression. This participatory approach aims to build trust and foster a sense of ownership among those who stand to benefit most.


Broader Implications for Assistive Tech
Hand Wave exemplifies a growing trend where high‑school innovators leverage accessible AI tools to tackle real‑world accessibility challenges. By demonstrating that sophisticated machine‑learning models can be trained and deployed outside traditional research institutions, the project inspires other students to consider how technology can serve underserved populations. Moreover, it highlights the potential for low‑cost, open‑source solutions to complement existing commercial assistive devices, potentially lowering barriers to entry for users worldwide.


Conclusion: A Prototype with Promise
In just a few months, Shiven Velagapudi and Aadi Sanghvi have moved from a casual curiosity to a functional AI prototype that translates sign language in real time. Their journey—fueled by personal motivation, iterative experimentation, and a vision of unity—underscores the power of youthful ingenuity when paired with purposeful problem‑solving. As they continue to refine Hand Wave and seek the resources needed for broader deployment, the duo stands poised to make a meaningful contribution to the ongoing effort to make communication truly inclusive for all.

https://www.nbcdfw.com/news/local/north-texas-teens-build-ai-tool-translate-sign-language-real-time/4024875/
