Live: InfoCamp 2008 – Keynote: Jacob O. Wobbrock

Nick Finck

September 27, 2008 at 10:00 AM

Rachel Elkington is giving the introduction for Jacob Wobbrock. Jacob is going to be talking on "Flipping the Burden: Making Computers Accessible with Everyday Input Devices."

Jacob takes the stage; after quickly plugging in his laptop, we're rolling. He hopes to do justice to InfoCamp's "power to the people" theme. He is involved with DUB (Design, Use, Build), a mix of people from various departments and schools at UW.

He starts off by talking about the AIM (Accessibility, Interaction and Mobility) group. He wants to talk about ability rather than accessibility. "I want to open you to the idea that accessibility is not just about people in wheelchairs." He shows a photo of a wheelchair, a bicycle, and a stroller using a curb cut. Now he's showing an automatic door with both a wheelchair user and someone with a shopping cart going through it. The person with the cart is situationally impaired. He uses the example of an adult child helping an elderly parent by making use of a unisex bathroom. "Accessibility is a form of usability for all." "It's what can you do, not what can't you do."

"Typical computer access programs assume a standard user interface. What about people with disabilities? We adapt them to the technology. We use specialized technologies. Assistive technologies. We want to change that, we want to flip the burden. How about a non-standard interface?" He's going to run through a series of projects that do this.

His dissertation was on trying to address the issue of small targets on screen; this was before the advent of iPhones and such. He shows the issues with using a stylus for someone who has cerebral palsy. Now a video. Explaining the concept of four corners in Graffiti. This is "EdgeWrite." Showing metrics for 5 subjects entering the alphabet 4 times using Graffiti. Now showing how it's used on trackballs, touchpads, game controllers, etc. Showing another video, now with text completion. "The 100 most common words in English cover 40% of the language; 'the' alone is 6% of English."
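That frequency skew is why text completion pays off so quickly. A minimal sketch of frequency-ranked completion (the toy word counts below are my own invention, not a real corpus):

```python
# Toy frequency table; real systems derive counts from a large corpus.
WORD_FREQ = {
    "the": 600, "they": 120, "then": 90, "there": 80,
    "of": 300, "offer": 20, "and": 280,
}

def complete(prefix, n=3):
    """Return the n most frequent known words starting with prefix."""
    candidates = [w for w in WORD_FREQ if w.startswith(prefix)]
    return sorted(candidates, key=lambda w: -WORD_FREQ[w])[:n]
```

After even two or three strokes, the top suggestion is very often the intended word, which matters a lot when each stroke is physically costly.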

"Now, what about acquiring targets? What about using the edges of the screen on a mobile device?" The edge of the screen improves your accuracy because you can't overshoot it. "Could I move along the edge of the device and make the target based on where I pick up the stylus rather than where I click?" Showing video of this. These are all videos of people with motor disabilities. Now we can start to think about edge interfaces. "But the stylus is kinda dead… so anyway… these projects take a long time… so we have to keep using them." Now he's talking about touch screens.
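The two ideas he describes — edges you can't overshoot, and selecting on lift-off rather than on press — can be sketched together. This is my own toy model, not Wobbrock's code; the screen width and target spans are assumptions:

```python
SCREEN_WIDTH = 320  # hypothetical device width in pixels

def edge_target_at_liftoff(lift_x, targets):
    """Pick a target along the bottom edge by where the stylus lifts.

    targets: list of (name, x_min, x_max) spans along the edge.
    Clamping to the screen is what makes the edge 'impossible to
    overshoot' -- any motion past it still lands on the edge target.
    """
    x = max(0, min(SCREEN_WIDTH - 1, lift_x))
    for name, x_min, x_max in targets:
        if x_min <= x <= x_max:
            return name
    return None
```

A user who slides hard past the end of the device still selects the last target, because the lift position is clamped before matching.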

Now he's talking about the creation of "Slide Rule." Showing an iPod touch and how it would work. The ability to read with your finger. How can I hit targets without seeing them? Talking about flick gestures. Another video. This one has an iPod touch with a screen reader; there is nothing displayed on the screen. The user is using flick gestures, doing things like "Call Woody Allen," "play Five Monday," etc. Audience claps… amazing! Seriously, this is impressive! Audience: "When is it going to be available?!" Jacob says, "Maybe you can help me with that?"
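Eyes-free interaction like this rests on recognizing a flick's dominant direction from its start and end points. A minimal sketch in that spirit (the threshold and direction names are my assumptions, not Slide Rule's actual recognizer):

```python
def classify_flick(start, end, min_dist=30):
    """Return 'left'/'right'/'up'/'down' for a flick, or None for a tap.

    start, end: (x, y) touch points in screen coordinates (y grows down).
    min_dist: minimum travel in pixels to count as a flick at all.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if max(abs(dx), abs(dy)) < min_dist:
        return None  # too short to be a flick; treat as a tap
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

Because only the gesture's direction matters, not where on the screen it lands, a blind user can perform it anywhere — which is exactly why nothing needs to be displayed.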

Now he goes into a brief explanation of how it works and how he got testers. "We did this study with blind people who were all very skeptical this would ever work," he says, then adds, "Well, we'll pay you… so they did do the testing." After the testing, once they had tried the interface, the participants themselves asked "when will this be available?" Now another video, this time of TrueKeys. How do you verify with a user that the previous word was corrected? Text correction isn't all that sexy in a video, but they ran all kinds of tests.

Now we're going to push the burden really onto the machine: "Supple++." Explaining the concept: can we have a person design a UI for each person with different needs? No — so the system does it automatically. Talking about the tasks issued to the user: it watches performance and uses the results to build an automatic layout. Issue a low-level task, capture that, model that, then adapt the interface to it. This is basically intelligent computing, where the UI learns based on performance. Now he's talking about how the UI can even adapt to vision… this is not just like changing the text size on a web page; the number of options and buttons actually adjusts to the user's needs. Results: in a study with 11 motor-impaired participants, users were consistently faster… by 65%! Showing some other stats.
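The "capture, model, adapt" loop can be illustrated with a standard pointing model. This is a toy version of the idea, assuming the performance model is Fitts' law (MT = a + b · log2(D/W + 1)); Supple++'s real models are richer, and all numbers here are invented:

```python
import math

def fit_fitts(trials):
    """Fit MT = a + b * ID to a user's pointing trials.

    trials: list of (distance, width, movement_time) from low-level tasks.
    ID (index of difficulty) = log2(distance / width + 1).
    Returns (a, b) via ordinary least squares on one predictor.
    """
    ids = [math.log2(d / w + 1) for d, w, _ in trials]
    mts = [t for _, _, t in trials]
    n = len(trials)
    mean_id = sum(ids) / n
    mean_mt = sum(mts) / n
    b = sum((i - mean_id) * (t - mean_mt) for i, t in zip(ids, mts)) \
        / sum((i - mean_id) ** 2 for i in ids)
    a = mean_mt - b * mean_id
    return a, b

def predicted_time(a, b, distance, width):
    """Predicted movement time for a candidate widget size/placement."""
    return a + b * math.log2(distance / width + 1)
```

With a per-user (a, b) in hand, a layout generator can score candidate interfaces by their total predicted movement time and pick the fastest one for that user — which is the sense in which the UI "learns based on performance."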

Now showing a user, Philip Chavez, a "voice artist." Showing how he paints with his voice… with Dragon NaturallySpeaking and Microsoft Paint… ouch! Showing a piece of art inspired by Jackson Pollock. Now showing the video of how it's done. He pauses the video: we're trying to perform continuous commands. Continues with the video; this is very painfully slow and prone to errors. Now talking about a vowel map with the Vocal Joystick. Showing a VoiceDraw video. The audio input is continuous, with different sounds. Showing Philip's art before and after VoiceDraw. Pretty impressive, much more "paint"-like.
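The key difference from discrete speech commands is that a sustained vowel maps to a continuous direction and the loudness modulates it. A rough sketch of that idea — the vowel-to-direction assignments and gain below are illustrative, not the Vocal Joystick's actual mapping:

```python
# Illustrative vowel map: each sustained vowel quality drives the cursor
# in a fixed 2D direction (y grows downward, as in screen coordinates).
VOWEL_DIRECTIONS = {
    "a": (0, 1),    # move down
    "i": (0, -1),   # move up
    "u": (-1, 0),   # move left
    "e": (1, 0),    # move right
}

def step_cursor(pos, vowel, loudness, gain=5.0):
    """Advance the cursor one frame: direction from the vowel, speed
    from loudness (0..1). Unrecognized sounds leave the cursor still."""
    dx, dy = VOWEL_DIRECTIONS.get(vowel, (0, 0))
    return (pos[0] + dx * loudness * gain, pos[1] + dy * loudness * gain)
```

Because the signal is continuous rather than a sequence of discrete commands, strokes come out smooth — which is why Philip's post-VoiceDraw work looks so much more "paint"-like.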

Now he's going into the Angle Mouse. Now talking about intention tremor. Showing a diagram of "spread" when moving the mouse at an angle, which shows deviation. Motor-space is the space of things as they exist in physical space, not on the screen. Talking about gravity wells when you near a target. Now talking about issues with other targets near one target… or distractor targets. Showing a deviation comparison. Results are 10.3% better throughput with this Angle Mouse idea for motor-impaired users than with the Windows default, with no significant difference for able-bodied users.
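The core trick is that angular spread, unlike position, distinguishes smooth travel from tremor. A toy sketch of that idea — the thresholds and gain values are my assumptions, and this naive angle statistic ignores wrap-around at ±π, which a real implementation would handle:

```python
import math

def angular_spread(deltas):
    """Standard deviation of movement angles (radians) over recent
    (dx, dy) mouse samples. Low spread = smooth, purposeful motion;
    high spread = erratic motion such as intention tremor."""
    angles = [math.atan2(dy, dx) for dx, dy in deltas if (dx, dy) != (0, 0)]
    mean = sum(angles) / len(angles)
    return math.sqrt(sum((a - mean) ** 2 for a in angles) / len(angles))

def gain_for(deltas, base_gain=1.0, min_gain=0.3, threshold=0.5):
    """Keep normal gain while motion is smooth; drop it when the angular
    spread rises, which enlarges targets in motor-space near the end of
    a movement."""
    return base_gain if angular_spread(deltas) < threshold else min_gain
```

Because the adjustment keys off the user's own movement rather than target locations, nearby distractor targets exert no pull — unlike gravity wells.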

Talking about light switches and how the interaction works, comparing it with a button. He proposes the concept of goal crossing instead of pointing. The people with motor impairments were much more accurate, and able-bodied people noticed no difference in performance. Showing a design app with no buttons, just goal areas. Describing the concept of occlusion. Showing some design schemes attempting to solve this issue. He is going to show an assignment he gave his class about "reels," where the UI is polar, from the inside out instead of the outside in.
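Goal crossing replaces "land inside a box and click" with "sweep across a line," so detecting activation reduces to a segment-intersection test between one step of the cursor path and the goal. A minimal sketch of that test (my own illustration, not code from the talk):

```python
def _ccw(a, b, c):
    """Signed area: >0 if a->b->c turns counter-clockwise, <0 clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def crosses(p1, p2, g1, g2):
    """True if the cursor step p1->p2 strictly crosses goal segment g1->g2.

    The step crosses the goal exactly when each segment's endpoints lie on
    opposite sides of the other segment's line.
    """
    return (_ccw(g1, g2, p1) * _ccw(g1, g2, p2) < 0 and
            _ccw(p1, p2, g1) * _ccw(p1, p2, g2) < 0)
```

Note there is no dwell and no click: merely passing through the goal triggers it, which is why it helps users who find stopping precisely on a target hard.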

In review: we can use these projects to understand different ways of thinking about the UI, even for able-bodied people, and push the envelope on what is possible. Special thanks to his wife, co-authors, PhD students, the school, etc. Applause. Now for audience questions.


