STEM-In-Action Fall Follow Up: SigningSquad

Hello again! It's time to check in on our STEM-In-Action Grant-winning teams! Over the next few months we will see posts from each of our winning teams to find out what they are up to this fall. In case you're just now tuning in, the U.S. Army Educational Outreach Program (AEOP) STEM-In-Action Grant awards eCYBERMISSION teams up to $5,000 to develop their projects into mature and scalable solutions in their community. Typically, we award this grant to five teams, but this year ten teams took home the prestigious award. The next team we're catching up with is SigningSquad!

________________________________________________________________________________

Hello engineers! Welcome to SigningSquad’s first blog post since receiving the STEM-In-Action Grant. The team, based in Ashburn, VA, consists of high schoolers Jagadeepram Maddipatla and Rishi Athavale. Having participated in eCYBERMISSION since 6th grade, we were encouraged through AEOP to continue making meaningful contributions to our community. Because the past year was mostly virtual, our project focused on strengthening connections between different groups in the community. Through our research, we discovered that 8.8 to 10% of our community is deaf or hard of hearing. Further research revealed the many ways the communication gap between the deaf and hearing communities limits access to opportunities, such as jobs and a proper education. Using computer vision, a subset of artificial intelligence, we decided to create a communication bridge.


The result of our work is AISL (Artificial Intelligence for Sign Language), a translator between ASL and English. Unlike most existing translators, which typically only convert ASL signs to English, our product translates in both directions, from ASL to English and from English to ASL, using the computer vision techniques mentioned above. Our prototype was built with rudimentary algorithms; with the help of the grant, our primary goals are to increase the accuracy and scalability of our model and deliver it as an app.
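To give a sense of how a classical prototype pipeline like ours works, here is a simplified Haar cascade sketch using OpenCV (illustrative only, not our production code; the hand_cascade.xml file stands in for a hypothetical pre-trained hand cascade):

```python
# Simplified sketch of a classical Haar cascade detection step:
# locate a hand in one webcam frame and draw a box around it.
import cv2

cascade = cv2.CascadeClassifier("hand_cascade.xml")  # hypothetical pre-trained cascade

cap = cv2.VideoCapture(0)  # default webcam
ret, frame = cap.read()
if ret:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect candidate hand regions at multiple scales.
    hands = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in hands:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("detected.jpg", frame)
cap.release()
```

Cascade detectors like this are fast but brittle, which is part of why we are moving to a learned model, as we describe below.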


Since competing in eCYBERMISSION, we have continued to research the problem and possible AI solutions to it. Our main developments have been switching from a Haar Cascade model to a MobileNetV2 model for ASL-to-English translation and adding support for more ASL signs beyond just letters. We started with ASL letters because they simplified the problem and allowed us to create an effective initial solution. However, ASL features a wide range of signs, and limiting ASL users to only letters would not truly bridge the communication gap. To make an efficient and seamless translator, we need to incorporate a multitude of ASL signs in both directions of translation.

We plan on using our STEM-In-Action Grant to expand our reach in our community, from which we’ll gather labeled images of ASL signs to grow our dataset. Our first step toward making our product accessible is developing iOS and Android apps for our translator, which will be freely available and will include features for sharing labeled images of ASL signs and giving feedback. Our website, which will be up in the coming weeks, will also be available for the public to view.
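For readers curious about the model change mentioned above, here is a minimal transfer-learning sketch in Keras (illustrative only, not our actual pipeline; the dataset folder, class count, and training settings are placeholders):

```python
# Minimal sketch: fine-tune MobileNetV2 to classify images of ASL letter
# signs, assuming a labeled folder layout like asl_data/<letter>/<image>.jpg.
import tensorflow as tf

NUM_CLASSES = 26  # one class per ASL letter (placeholder)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "asl_data",              # hypothetical dataset directory
    image_size=(224, 224),   # MobileNetV2's expected input size
    batch_size=32,
)

# Load MobileNetV2 pretrained on ImageNet, without its classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze pretrained features for initial training

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.fit(train_ds, epochs=5)
```

Freezing the pretrained ImageNet features and training only a small classification head is a common first step; the base model can later be unfrozen for fine-tuning once the new head has converged.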

We need your help too! A major part of our project relies on crowdsourced data so that we can move beyond fingerspelling to full ASL word signs. Through our website (launching later this month), we are creating a portal for community submissions and an opportunity for individuals to join in on the cause!
