STEM-In-Action Spring Scoop: Signing Squad
Welcome back to the STEM-In-Action Spring Scoop, where we feature one of the STEM-In-Action Grant-winning teams from the eCYBERMISSION 2020-2021 year. STEM-In-Action Grants of up to $5,000 are awarded by the U.S. Army Educational Outreach Program (AEOP) to eCYBERMISSION teams that wish to further develop and implement their projects in their communities. Typically, only five teams receive a STEM-In-Action Grant, but this year ten teams took home the award. Today, we're chatting with Signing Squad to get an inside look at their project progress!
___________________________________________________________________________________
Hello again, engineers! This is Jagadeepram Maddipatla from the SigningSquad, writing from Ashburn, Virginia. In case you missed our last blog post, we are a team committed to bridging the communication gap with the deaf community through the use of American Sign Language (ASL). Through the eCYBERMISSION competition, we proposed a solution to do just that by developing a two-way communication pathway. The first pathway, which translates ASL into English text, uses computer vision and machine learning to detect hand motions from deaf individuals and convey them to people who may not understand the language. The second pathway converts English speech into ASL signs, allowing deaf individuals to readily communicate with hearing people who may not know the language. Together, these two pathways make up the A.I.S.L. translation system.
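For readers curious how the two directions of translation fit together, here is a simplified Python sketch rather than our actual implementation; the helper functions (classify_frame, transcribe, lookup_sign) are placeholders standing in for the trained vision and speech models.

```python
# Simplified sketch of the two A.I.S.L. pathways; the helpers below are
# placeholder stubs standing in for the real trained models.

def classify_frame(frame):
    """Stub for the computer vision model that recognizes one hand sign."""
    return "A"

def transcribe(audio):
    """Stub for a speech-to-text step."""
    return "hello world"

def lookup_sign(word):
    """Stub that maps an English word to an ASL sign clip."""
    return f"sign_clip_{word}.mp4"

def asl_to_text(video_frames):
    """Pathway 1: ASL to English text, one recognized sign per frame."""
    return "".join(classify_frame(frame) for frame in video_frames)

def speech_to_asl(audio):
    """Pathway 2: English speech to a sequence of ASL sign clips."""
    return [lookup_sign(word) for word in transcribe(audio).split()]

print(asl_to_text(["frame1", "frame2"]))  # "AA"
print(speech_to_asl("recording.wav"))     # ['sign_clip_hello.mp4', 'sign_clip_world.mp4']
```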
Through our research, we discovered that 8.8% to 10% of our local community is deaf or hard of hearing. Further research revealed the many ways in which the communication gap between the deaf and hearing communities limits access to opportunities such as employment and a quality education. The ultimate goal of A.I.S.L. is to increase access to these opportunities.
Since our last blog post, we've made tremendous progress with A.I.S.L. After delving into various online courses and certification programs, we identified two computer vision models that could be used to build the final program: MobileNet V2 and YOLO V4. After expanding our previous ASL image dataset, we trained both models on it and compared their accuracies to determine the better fit. While YOLO V4 was ultimately more powerful, we realized that the lighter-weight MobileNet V2 would be the best choice for an app-based translator. As a result, we are currently building a new, cloud-hosted version of the algorithm through Amazon Web Services; this will let anyone use our solution without worrying about running the model on their own mobile device, since a server performs the computations. We will also repurpose our MobileNet V2 model for use on our website. Throughout the process, we have been in close contact with our county's ASL supervisor and have received continuous input, from a professional perspective, on how A.I.S.L. should be improved.
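For those curious about the training side, the snippet below shows roughly what fine-tuning MobileNet V2 on an ASL image dataset looks like with TensorFlow/Keras transfer learning. It is a generic sketch rather than our exact training script: the folder name, image size, batch size, and epoch count are illustrative assumptions.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 26  # one class per ASL alphabet letter

# Assumed layout: asl_dataset/<letter>/<image>.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "asl_dataset", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "asl_dataset", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

# Start from ImageNet weights and freeze the backbone for transfer learning.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)

# The validation accuracy here is the kind of number we compared against YOLO V4.
model.evaluate(val_ds)
```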
So far, the process has by no means been obstacle-free. One of the first speed bumps we encountered was in training the A.I.S.L. algorithm. After training the computer vision model on all 26 letters of the ASL alphabet, we realized that the entire model would have to be retrained every time we wanted to add a new character or word. For almost a month, we were stuck. Retraining the model for every new set of words would mean training on hundreds of characters, each with thousands of images, in a single sitting, which takes an immense amount of computing resources. However, after more research and experimentation, we decided to approach the issue from a different angle. Rather than forcing the A.I.S.L. algorithm to choose among hundreds of words and letters at once, it makes much more sense to split the dataset into subsets of signs with similar hand movements. This way, the algorithm looks for patterns rather than hard-coded hand shapes. We are currently reengineering our algorithm to take this approach, which will hopefully increase the viability of our solution.
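To make the subset idea concrete, here is a toy two-stage sketch: a gating step picks the group of similar hand shapes, and a small per-group classifier picks the sign within it. The groupings and stand-in models below are hypothetical placeholders rather than our final design; the point is that adding a new sign only requires retraining one small subset model instead of everything.

```python
import numpy as np

# Hypothetical grouping of signs by similar hand shape; the real subsets
# come from analyzing the ASL alphabet itself.
SUBSETS = {
    "fist_like":  ["A", "E", "M", "N", "S", "T"],
    "flat_like":  ["B", "C", "O", "U", "V", "W"],
    "point_like": ["D", "G", "H", "I", "L", "X"],
}

class TwoStageClassifier:
    """First pick the subset of similar hand shapes, then the sign within it.

    Adding a new sign only requires retraining the one small subset model
    it belongs to, not the entire system.
    """

    def __init__(self, gate_model, subset_models):
        self.gate_model = gate_model        # image -> subset name
        self.subset_models = subset_models  # subset name -> sign classifier

    def predict(self, image):
        subset = self.gate_model(image)
        return self.subset_models[subset](image)

# Toy stand-in models so the sketch runs end to end.
gate = lambda img: "fist_like"
subset_models = {name: (lambda img, signs=signs: signs[0])
                 for name, signs in SUBSETS.items()}

clf = TwoStageClassifier(gate, subset_models)
print(clf.predict(np.zeros((224, 224, 3))))  # -> "A"
```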
We at the SigningSquad were fortunate enough to receive recognition from Virginia Senator Mark Warner for our work on A.I.S.L. Our school newspaper also interviewed us about our efforts to introduce A.I.S.L. into the education system. While neither piece of publicity is overly impressive on its own, we believe they reaffirm the mission of the SigningSquad and pave the way for the future of our technology in our communities!
Moving forward, our mission remains clear. Simply put, if we manage to better the livelihood of a single deaf individual through our solution, then our goal will be fulfilled. A.I.S.L. is a project that aims to directly address one of the largest communication gaps in modern society; if our application were recognized as a viable alternative to human translators, it would pave the way for increased access to opportunities for those with hearing disabilities. By the end of our grant tenure, we hope to work closely with Gallaudet University to build more diverse datasets for our algorithm, and to integrate our solution fully within our own county's education system.
None of the SigningSquad's vision would have been possible without support from eCYBERMISSION and AEOP as a whole. It was this opportunity that enabled us to take our project to the next level and directly make an impact on the communities around us. As the months progress, we hope to continue the mission that AEOP has instilled within us, and we will work to maximize the impact of our project.
Of course, you can help our mission too! While most of our current data has come from our own hand signs and open-source datasets, acquiring organic ASL images from members of the community would lead to a much more inclusive solution. We hope you can join us as we continue our mission to bridge the deaf communication gap by contributing your own datasets, and perhaps even testing our algorithms prior to official release; check out our website for details! This would mean the world to us and would help create better communication for all those who need it.
--
The Signing Squad team is an awesome example of how, with hard work and dedication, you can truly achieve anything! Their grit and perseverance, shown in how they kept improving their coding skills and finding new solutions when faced with problems, are qualities that are sure to help them succeed. Thank you, team, for dedicating your time, energy, and talents to improving access for the deaf community!
Faith Benner
AEOP Senior Communications and Marketing Specialist