Wednesday, September 19, 2012

Who knew I was a "designer"?

I did my undergrad at CMU's Tepper School of Business. I'm currently working for a robotics research group at CMU called TechBridgeWorld. I'm also doing my master's part-time at Heinz College in Public Policy and Management. I used to party with kids at CMU's School of Design, but if I were to hand you my resume, nowhere would you find any design experience.

However, after reading "Design Thinking for Social Innovation," I realize that I am, in fact, a "designer" and have been one for several years now. As I mentioned earlier, I work for TechBridgeWorld, which develops and field tests sustainable technology solutions to meet development needs around the world (translation: we create technologies with a global heart). The article mentions that a designer's "solutions are relevant to a unique cultural context and will not necessarily work outside that specific situation." And this is exactly what we do.

NavPal: indoor navigation for the visually impaired
Take, for example, one of our projects, NavPal. Based on conversations with our partners in Pittsburgh, we learned of a great need for a navigation tool (one that's also socially acceptable) to help visually impaired and deafblind people get around safely in unfamiliar indoor environments. To that end, we developed a navigation app, NavPal, that runs on an Android phone. The prototype guides a user to his or her destination through audio and vibration feedback. The user swipes a gesture on the touch screen to tell the app that he or she is ready for the next instruction. If the user comes across an obstacle or a "landmark," the app lets the user make a note of it so that he or she can either avoid the obstacle or remember the landmark next time.
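To make that feedback loop concrete, here's a rough sketch (in Android-style Java, since the app runs on an Android phone) of how spoken and vibration feedback for each instruction could be wired up. This is purely illustrative and not the actual NavPal code; the class and method names are my own, and it assumes Android's standard TextToSpeech and Vibrator services.

    import android.content.Context;
    import android.os.Vibrator;
    import android.speech.tts.TextToSpeech;

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative helper: speaks or buzzes one routing instruction at a time.
    public class InstructionPlayer {

        private final TextToSpeech tts;        // spoken directions
        private final Vibrator vibrator;       // fallback for noisy environments
        private final List<String> instructions = new ArrayList<>();
        private int current = 0;
        private boolean vibrationOnly = false; // user-selectable quiet mode

        public InstructionPlayer(Context context) {
            // Initialization callback omitted for brevity; a real app would wait
            // for TextToSpeech.SUCCESS and set a language before speaking.
            tts = new TextToSpeech(context, status -> { /* no-op in this sketch */ });
            vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        }

        public void setRoute(List<String> steps) {
            instructions.clear();
            instructions.addAll(steps);
            current = 0;
        }

        public void setVibrationOnly(boolean enabled) {
            vibrationOnly = enabled;
        }

        // Called when the user swipes to say "I'm ready for the next instruction".
        public void playNext() {
            if (current >= instructions.size()) {
                // Distinct vibration pattern announcing arrival at the destination.
                vibrator.vibrate(new long[]{0, 100, 100, 100, 100, 100}, -1);
                return;
            }
            String step = instructions.get(current++);
            if (vibrationOnly) {
                vibrator.vibrate(400);  // short buzz: a new instruction is ready
            } else {
                tts.speak(step, TextToSpeech.QUEUE_FLUSH, null);
            }
        }
    }
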

The current prototype is the result of several months of "human-centered design," which I helped to lead. No one on our team was visually impaired or deafblind, and we did not personally know anyone who was. We conducted what we call a "needs assessment" to understand the needs and challenges of our end users. My needs assessment team then shared these findings with our tech developers so that the prototype they built would actually be relevant. Furthermore, we tested the prototype with visually impaired participants to ensure we were truly meeting their needs.

If you didn't think human-centered design was important for social innovation, allow me to prove you wrong:
  1. NavPal guides a user to his or her destination through audio and vibration feedback. OK, it makes sense that audio feedback would be important, since the user cannot see what is on the screen. However, when interviewing visually impaired people, we learned that they sometimes find themselves in noisy environments where listening to the app would be challenging. In those situations they can switch to vibration mode and still get to where they want to go.
  2. The user swipes a gesture on the touch screen to tell the app that he or she is ready for the next instruction. Here is where user testing is crucial. Our researchers came up with a set of "simple" gestures (< for the previous instruction, > for the next instruction, etc.) for navigating. When we tested them with visually impaired users, we found that people who have been blind since birth had an especially hard time drawing a "greater than" symbol. There was a communication gap between the tester and the participant, because we would try to explain the shape in a way that didn't match how they had learned to map things out spatially. Even after physically guiding their hands, it was difficult for them to draw that gesture in one fluid motion. Long story short, we simplified our gestures so that they were more relevant to our users (see the swipe sketch after this list). The customer is always right.
  3. NavPal lets the user make notes so that he or she can avoid an obstacle or remember a landmark. From our needs assessment, we learned that visually impaired people figure out where they are through sensory cues. Through "audible" landmarks, they know they are near a snack room when they hear a lot of chatter. Through their heightened sense of smell, they can tell they are in a particular hallway from its stale smell. They also use counting techniques to track their progress toward their destination, such as counting the number of doors they pass while trailing one side of the wall. All of these cues matter when you navigate without sight, and we designed our user interface to capture them (a note-log sketch also follows this list).
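
For the gesture lesson in point 2, here's a rough sketch of what "simplifying the gestures" could look like in code: instead of asking users to draw a "<" or ">" shape, a plain horizontal swipe anywhere on the screen moves between instructions. Again, this is my own illustrative Java built on Android's standard GestureDetector, not the actual NavPal implementation.

    import android.content.Context;
    import android.view.GestureDetector;
    import android.view.MotionEvent;
    import android.view.View;

    // Illustrative gesture handling: a single-finger horizontal swipe replaces
    // the drawn "<" / ">" symbols that congenitally blind users struggled with.
    public class SwipeNavigator {

        public interface Listener {
            void onNextInstruction();
            void onPreviousInstruction();
        }

        private static final float MIN_SWIPE_PX = 100f;  // ignore tiny movements
        private final GestureDetector detector;

        public SwipeNavigator(Context context, final Listener listener) {
            detector = new GestureDetector(context,
                    new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onFling(MotionEvent down, MotionEvent up,
                                       float velocityX, float velocityY) {
                    float dx = up.getX() - down.getX();
                    float dy = up.getY() - down.getY();
                    // Only treat clearly horizontal, sufficiently long flings as swipes.
                    if (Math.abs(dx) < MIN_SWIPE_PX || Math.abs(dx) < Math.abs(dy)) {
                        return false;
                    }
                    if (dx > 0) {
                        listener.onNextInstruction();      // swipe right: next step
                    } else {
                        listener.onPreviousInstruction();  // swipe left: previous step
                    }
                    return true;
                }
            });
        }

        // Attach to a full-screen view so a swipe anywhere on the screen works.
        public void attachTo(View view) {
            view.setOnTouchListener((v, event) -> detector.onTouchEvent(event));
        }
    }

And for the notes in point 3, a small sketch of how user-recorded obstacles and landmarks might be stored so they can be announced again on a later trip. The fields here (step index, obstacle flag, description) are assumptions for illustration, not NavPal's real data model.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative note log: each note is tied to the instruction step at which
    // the user recorded it, so it can be replayed on the next trip.
    public class LandmarkLog {

        public static class Note {
            final int stepIndex;       // which instruction the user was on
            final boolean isObstacle;  // something to avoid, versus a helpful landmark
            final String description;  // e.g. "chatter near the snack room"

            Note(int stepIndex, boolean isObstacle, String description) {
                this.stepIndex = stepIndex;
                this.isObstacle = isObstacle;
                this.description = description;
            }
        }

        private final List<Note> notes = new ArrayList<>();

        public void record(int stepIndex, boolean isObstacle, String description) {
            notes.add(new Note(stepIndex, isObstacle, description));
        }

        // Notes for the current step can be spoken before the next instruction,
        // so known obstacles are announced first.
        public List<Note> notesForStep(int stepIndex) {
            List<Note> result = new ArrayList<>();
            for (Note note : notes) {
                if (note.stepIndex == stepIndex) {
                    result.add(note);
                }
            }
            return result;
        }
    }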

Earlier this year I went to a Social Enterprise Conference at Harvard and saw examples of projects that wasted a lot of time and money simply because they didn't start with the end user. They thought they knew what the challenge was from an outsider's perspective, when in fact they didn't. Because we took the time to learn about our end users' needs and tested our solution with them, we developed a relevant prototype. And the design doesn't stop there: we'll continue to improve our solution and consider other research areas, such as indoor navigation in emergency situations.
