It’s a little embarrassing, but I get a lot of my insights from watching TED presentations. Blame it on the combination of my 2-hour commute, my iPod Nano, and TED providing video podcasts.
In a fascinating presentation, neurologist Vilayanur Ramachandran talks about how the brain works with sensory input. What stuck with me came towards the end of his talk:
Something very interesting is happening in the angular gyrus, because it is the crossroads between hearing, vision and touch, and it became enormous in humans. I think it is the basis of many uniquely human abilities, such as abstraction, metaphor and creativity.
With interfaces, it is important to get sensory feedback. For example, right now, I am typing on a keyboard. This action creates tactile feedback (the key depresses), auditory feedback (it clicks), and visual feedback (letters appear on the screen). Without realizing it, we feel satisfaction when this sensory feedback is properly provided. When typing on a keyboard does not produce letters on the screen, or the letters are somehow delayed, we have an emotional response – one of frustration.

With the iPhone there is no tactile or haptic feedback. (Some phones do have haptic feedback in the form of light vibrations.) To compensate for the fact that it is missing one of the three feedback channels necessary for a good interface, it provides strong feedback through the remaining two. When you use the dialer on the iPhone, it provides a strong color change (visual feedback) and a dial tone (auditory feedback) whenever you touch the keys. The same thing happens when you use the on-screen QWERTY keyboard. To compensate for the fact that there is no tactile key-pressing sensation, the iPhone provides visual feedback in the form of the keys popping up, and auditory feedback in the form of a tapping sound.
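The compensation principle above can be sketched in a few lines of code. This is a toy model, not any real phone API: the `TouchKey` class and its channel names are my own invention, purely to make the idea concrete. A key press fires feedback on every channel the device has, and when a channel (such as tactile) is missing, the remaining channels are strengthened.

```python
class TouchKey:
    """Toy model of a touch-UI key that compensates for missing
    feedback channels by strengthening the remaining ones."""

    CHANNELS = ("tactile", "visual", "auditory")

    def __init__(self, available_channels):
        self.available = set(available_channels)

    def press(self):
        """Return the (channel, strength) feedback events for one press."""
        missing = set(self.CHANNELS) - self.available
        events = []
        for channel in self.CHANNELS:
            if channel not in self.available:
                continue
            # Compensation: if any channel is missing, fire the
            # remaining channels with stronger feedback.
            strength = "strong" if missing else "normal"
            events.append((channel, strength))
        return events


# iPhone-style key: no tactile channel, so visual and auditory are strong.
iphone_key = TouchKey(["visual", "auditory"])
print(iphone_key.press())  # [('visual', 'strong'), ('auditory', 'strong')]

# A physical keyboard key has all three channels; normal strength suffices.
hardware_key = TouchKey(["tactile", "visual", "auditory"])
print(hardware_key.press())
```

Under this (simplified) model, a device like the Prada phone that fires weak, undifferentiated feedback on its remaining channels would map to `"normal"` strength where `"strong"` is called for.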
Compare the iPhone experience to the LG Prada phone experience. The LG Prada phone provides haptic feedback (you feel a slight vibration at your fingertips) and visual feedback; however, the color change in the interface is weak (trying to stay “cool” by using grey tones), and the auditory feedback is always the same no matter what you do (the same bell sound). The result is that the Prada phone has a less satisfying touch UI experience than the iPhone.
A large part of the satisfaction of using a touch UI comes from providing appropriate feedback. Another large part comes from which metaphor from everyday life you adopt and present to the users. Watching Ramachandran’s talk made me realize that there is a deeper neurological basis for what constitutes a satisfying touch UI experience: our brains are wired to take in sensory feedback and develop an emotional response to it (sometimes without us realizing it).