UC Davis student Alexa Adams wants to change people’s perspective — by literally changing what’s in their perspective — using the Google Glass technology.
Google’s Project Glass, first announced last year, is a set of futuristic glasses that provides what is essentially an augmented reality for its users. The heads-up display offers many of the same features as an Android smartphone, such as navigation, email and video conferencing.
Adams, a second-year undergraduate studying biotechnology with an emphasis in bioinformatics, aims to create a program for the wearable computer that would allow deaf users to be notified of sounds in their environment and, eventually, even to interact with people unfamiliar with sign language.
Adams’ application would make visual stimuli out of surrounding auditory cues — picked up from the device’s microphone — such as bicycle bells, screeching tires or crosswalk chirps. With further development, speech recognition and captioning may become another possibility.
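In broad strokes, an app like the one described would classify a sound picked up by the microphone and translate it into a short on-screen message. A minimal sketch of that mapping step, in Python; the sound labels and alert text here are purely illustrative assumptions, not taken from Adams’ actual design:

```python
# Hypothetical sketch: map classified environmental sounds to short
# visual alerts that a heads-up display could render. All labels and
# messages below are illustrative placeholders.

ALERTS = {
    "bicycle_bell": "Bicycle approaching",
    "tire_screech": "Vehicle braking nearby",
    "crosswalk_chirp": "Crosswalk signal active",
}

def visual_alert(sound_label):
    """Return display text for a recognized sound, or None if unknown."""
    return ALERTS.get(sound_label)

print(visual_alert("bicycle_bell"))  # Bicycle approaching
```

The real work, of course, lies in the classification itself; once a sound has a label, rendering the alert is the simple part.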
With lip reading hardly an exact science, and interpreters not always at the ready, Adams said the app could improve communication between the hearing and deaf worlds. It’s a passion that has deepened since her early exposure to the deaf community.
“The reason why I have an interest in the deaf is because I grew up with an aunt who interprets for the hearing-impaired at a Modesto city school,” she said. “She taught me sign language.
“I went to a summer camp every year where there were deaf students, and I got to see first-hand the difficulty there is in interacting anywhere outside a classroom, and even inside one it can be challenging.”
Adams also made many friends in the deaf community at camp. She recalled playing games outdoors with them, and struggling to get their attention when they weren’t looking in her direction.
Those early experiences became the impetus for her to get involved in the Google Glass Explorers program through a contest held on Twitter, in which participants devoted 50 words to what they believed could be done with the technology.
Her idea for the Glass was accepted, which gives her the opportunity to obtain a Google Glass prototype and develop a program for it. She’s committed to following through and seeing her concept in tangible form.
“I feel that if I can be using my skills to help other people, then why not,” Adams said. “Since I’m connected to the deaf community, and it’s something I’m passionate about, it’s a perfect niche to fall into.”
Her goal is not a profit-driven one. She intends for the app to be monetized — through advertisements — only if she needs to cover any charges that Google may levy for publishing an app.
Instead, Adams explained that her priority is assisting the deaf community. She’s eager to get their feedback, first and foremost, when working on the app.
“I talked to my aunt, who invited me to visit her school once I get the prototype, to get comments from her students,” she said. “I talked to a few of my friends who are deaf, and they’re really excited about it.”
“This app is not a replacement for sign language, it’s an aid for people who don’t know it to communicate with the deaf,” Adams added.
The one thing that holds her back is the cost of a prototype model of the device. Google will not waive the fee, and responded to her request to do so with a cold but logical answer — if they did it for one, they’d have to do it for all 8,000 developers.
Unfortunately, Adams isn’t a professional developer with full financial backing. She has raised only $125 of the $2,500 she needs thus far: $1,500 for the Glass and another $1,000 for the required tools.
The route she has chosen to raise the funds is a Piggybackr account, an online fundraising tool. Donations may be made at goo.gl/J5s4a.
“Every little bit helps, and will go to a good cause,” she said. “I’m not looking at profit, I’m looking at empowering individuals to interact in a world that’s not always the most welcoming.”
If she can raise the money, Adams said she will devote her entire summer to developing the app. She’s already started some of the work, though she’s not sure of the exact programming language the Glass will employ for development.
Because the voice recognition software is built in, as is the camera, she can work within the existing systems to program the app. It’s something she believes she can tackle on her own, and complete within a reasonable amount of time.
“Though I do have a roommate who reluctantly agreed to help me debug,” Adams said with a laugh.
— Reach Brett Johnson at email@example.com or 530-747-8052. Follow him on Twitter at @ReporterBrett