STEM-In-Action Spring Scoop: Hawks

Another week, another STEM-In-Action Spring Scoop! If you're new to eCYBERMISSION, the U.S. Army Educational Outreach Program (AEOP) awards STEM-In-Action Grants of up to $5,000 to eCYBERMISSION teams that wish to further develop and implement their projects in their communities. Typically, only five teams receive a STEM-In-Action grant, but this year ten teams took home the award. This week, we're checking in with team Hawks, who are working to improve the lives of blind/visually impaired members of their community!

___________________________________________________________________________________

Hello again! We’re Team Hawks, a group of 10th graders trying to make the lives of those in our community a bit easier using STEM. Our team consists of three sophomores from wonderfully weird Portland, Oregon: Raina, Joseph, and Karthik.

The problem we set out to solve is the set of challenges faced by members of our community who are blind/visually impaired (BVI). Members of the BVI community face a variety of difficulties in their daily lives, and the most common device for addressing some of these problems, the white cane, does not always meet every need because of its simplicity.

However, most existing “smart canes” that aim to improve on the white cane are extremely expensive. This is bad news for most people who are BVI, who have a lower median monthly income due to the stigma surrounding blindness and visual impairment. Our goal is to create an affordable solution that attaches to a normal white cane, enhances the user’s ability to detect obstacles above and below the waist, and tells them when it is time to cross the street.

Since receiving the STEM-in-Action Grant, we have run a successful crosswalk detection program on our computer and are now working on integrating all of the parts of our solution. We have also explored various technologies that we are confident will reduce the number of parts and make our solution easier to integrate.

Right now, we are busy integrating our obstacle detection system (which already works on its own) with vibration motors, and integrating our crosswalk detection program (which also works on its own) with an OAK-D, a depth camera that can detect objects in real-time footage. Our goal is to finish integration and mount all of our hardware onto the cane. If we have time, we are also thinking about adding a navigation system to help with directions.
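To give a flavor of what the obstacle-to-vibration side of that integration involves, here is a minimal sketch in Python. It is not our actual code: it assumes a Raspberry Pi-style controller, an HC-SR04 ultrasonic sensor, and a vibration motor on a PWM pin, all driven through the gpiozero library, with the pin numbers chosen purely for illustration.

```python
# Minimal sketch (illustrative only): map obstacle distance to vibration intensity.
# Assumed hardware (not from the post): an HC-SR04 ultrasonic sensor and a
# vibration motor on a PWM pin, controlled from a Raspberry Pi-style board
# via the gpiozero library.
import time
from gpiozero import DistanceSensor, PWMOutputDevice

sensor = DistanceSensor(echo=24, trigger=23, max_distance=2.0)  # assumed wiring
motor = PWMOutputDevice(18)                                     # assumed PWM pin

def distance_to_intensity(distance_m, max_range_m=2.0):
    """Closer obstacles produce a stronger vibration (0.0 to 1.0)."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m

while True:
    motor.value = distance_to_intensity(sensor.distance)  # sensor.distance is in meters
    time.sleep(0.1)                                        # refresh the cue ~10x per second
```

The same idea carries over whatever the actual sensor is: read a distance, turn it into an intensity between 0 and 1, and write that value to the motor.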

Every journey contains roadblocks, so it was no surprise that our team hit several speed bumps along the way. One of them has been the OAK-D. As mentioned before, we are trying to integrate the OAK-D camera into our prototype because it would make the whole process easier and more efficient. We successfully got the OAK-D to launch; however, we ran into issues deploying an AI model to it. We tried to troubleshoot by deleting and redownloading the software, which did not help. Going back to basics and writing a “Hello World” program also failed. After searching for other ways to get the OAK-D working as intended, we found a tutorial that we are still working through. Compared to previous attempts, though, this tutorial actually seems to be working!
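For context, a bare-bones OAK-D “Hello World” looks roughly like the sketch below. This is an illustration rather than our exact code, and it assumes the depthai and opencv-python packages are installed: it simply streams the color camera's preview to a window, which is the first sign that the camera and the host computer are talking to each other.

```python
# Illustrative "Hello World" for an OAK-D using the DepthAI Python API:
# build a pipeline that streams the color camera's preview and display it
# with OpenCV. Assumes the depthai and opencv-python packages are installed.
import cv2
import depthai as dai

pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)           # small preview keeps the stream light
cam.setInterleaved(False)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("preview")
cam.preview.link(xout.input)           # send preview frames to the host

with dai.Device(pipeline) as device:   # this is where a missing camera shows up
    queue = device.getOutputQueue(name="preview", maxSize=4, blocking=False)
    while True:
        frame = queue.get().getCvFrame()
        cv2.imshow("OAK-D preview", frame)
        if cv2.waitKey(1) == ord("q"):
            break
```

Once a preview like this works, the next step is swapping in a pipeline that also runs a detection model on the device, which is the step we have been troubleshooting.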

Our biggest challenge to date has been getting the AI to work. Coming into this project, we didn’t have much coding experience, so on multiple occasions we hit a roadblock or a minor bug and didn’t know how to fix it. It’s always been difficult not to get discouraged when our entire program works except for one line and we have no idea what happened. We’ve contacted mentors and watched a lot of videos to figure out the problems, which has worked to some extent, although it’s still hard sometimes to keep at it.

But the reason we keep going is also the thing that excites us the most: the feeling when everything works. It’s going to be very exciting when we finish integration and all the separate parts of our solution work together, and the fact that we’re getting closer motivates us to keep moving forward.

Speaking of things that excite us, we recently entered our idea in a pitch competition called GoVertical, and we won 1st place! The competition was made up of startups from around the Pacific Northwest, and winning reassured us that our project really can be a useful solution. We expect more publicity once we complete integration testing and can get our product out to folks who are BVI for user testing.

Our ideal scenario is to hit as few speed bumps as possible the rest of the way, as we finish integration and make the solution compatible with a white cane. These speed bumps are the main reason we have had difficulty meeting our goal dates: a lot of time has gone into fixing bugs and debugging unplanned errors, which has slowed our progress. Hopefully they will become less frequent as we get closer to our goal.

If we have time at the end of our grant, we are also considering adding navigational support that would essentially provide Google Maps directions through the smart cane itself; however, our main priority is a product that can both detect obstacles and detect when a crosswalk signal is on. If all goes well, we will be able to send our product to members of our community who are BVI and get their feedback before our project submission in June!

--
It's often said that everything worth doing is difficult, but Team Hawks is showing that they're unstoppable! Their perseverance brings their goal of improving the lives of people who are blind/visually impaired ever closer, and they're becoming expert coders in the process. We can't wait to see how they continue to progress!



Faith Benner
AEOP Senior Communications and Marketing Specialist
