Multi-touch has been the interface buzzword of the last few years, inciting a geek frenzy after its big-screen debut in “Minority Report”. While that film’s gestural interface is still a little far-fetched for the consumer market, the multi-touch concept has made its way to consumers, notably via the iPhone and the trackpads of Apple’s notebook computers. Other companies are embracing multi-touch, but I think it’s fair to say that no company has nailed it like Apple. I’ve used many smartphones and computer screens with touch interfaces, and nothing comes close to the natural feel achieved by Apple’s engineers.
Now, full disclosure – I’ve been called an Apple fanboy – but in this case I have objective proof, and his name is Scott. My son Scott is six years old, and his first six years have been difficult. He was born at 24 weeks gestation, just over a pound and barely able to survive. He followed a path all too familiar to his preemie peer group, and came through the experience with several conditions that will challenge him in life. He is completely deaf, and has also lost much of his vision. The combination of hearing and vision loss makes it very difficult to learn communication skills, and to use the tools on which our society has become dependent. Using a mouse is hard, since he finds it difficult to relate the movements of the mouse to a screen that he can only partially see.

Now, Scott is a smart kid – he can figure out a lot of complex things by employing his curiosity, and he’s not afraid to try again and again. He loves looking at pictures of people (and trains/trucks/wheels, of course), and since we spend a lot of time in doctors’ offices, his Mom showed him the pictures on her iPhone one time. He soon figured out that moving his fingers on the screen “did stuff”. The moment he figured that out, he knew how to use an iPhone. It didn’t take him long to figure out how to switch apps, use the home button, “swipe to unlock” and make phone calls to random people. He even came within a button press of replying to an email from the CEO of my company. We now restrict him to an iPod Touch.
Apple’s touch interface removes much of the learning required to use a new high-tech device. One almost needs to forget some conditioned impulses, and regress a little, to use it. The iPad takes this a step further, more closely matching the form factor of the everyday objects we interact with. This removes a barrier for people with disabilities.
I’ve met a lot of people with challenges that have been overcome through the use of technology. A well-known example is Stephen Hawking, who speaks with the assistance of a computerized voice. He uses a speech synthesis system that runs on a laptop attached to his wheelchair. This is a fairly common setup for people with disabilities, whether their disability is purely physical or they need their device to help them form their thoughts as well as communicate. A setup like this can get really expensive. A touch-screen laptop like the TuffTalker Convertible costs close to $10,000. Simpler devices that only show a group of pictures and speak a phrase when a picture is touched can cost well over $1,000. An iPhone or an iPad, with some rudimentary software (examples of which are already showing up in the Apple App Store), can do all of this at a fraction of the cost.
People with disabilities will never be a market with huge buying power, but the trend toward accessible, simple products is making the world a more welcoming place, and the classroom more inclusive.