Tuesday 2 February 2010

Augmented Reality Interfaces

At the moment everyone's talking about augmented reality (see Layar.com) as a visual overlay on the real world. But I don't believe we're anywhere near permanently wearing glasses and contact lenses that constantly beam info to us (and yes, I have seen Avatar's revenue figures).

I agree augmented reality will be a big deal, but we'll have to get more information through other senses. We've been training for years to quickly take in tiny visual cues from our electronics. How good can we get with other senses?

How about:
  • Wearing a tiny earpiece all the time. Seen how fast a 13-year-old texts? How fast could next-gen teens learn to take in audio? What's the optimum mix between voice and other audio alerts?
  • All types of vibration - I can feel the difference between my phone ringing and getting a text. How many more things could I differentiate between?
  • Other kinaesthetics - How about a device that warms slightly when I pass a particularly good restaurant?
  • Smells and tastes - maybe one day? Aside from any other limitations, generating smells and tastes from an electronic device can't be that easy technically.
Obviously, once we've got an initial audio or kinaesthetic alert, we can pull out a screen to look at.
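
The vibration idea could be sketched as a simple mapping from event types to haptic patterns. A minimal sketch follows; the event names and on/off durations are purely illustrative assumptions, not any real device's API:

```python
# Hypothetical sketch: encoding notification types as distinct vibration
# patterns, expressed as alternating on/off durations in milliseconds
# (the general shape that phone haptic APIs tend to accept).
# All names and timings below are made up for illustration.

VIBRATION_PATTERNS = {
    "call": [800, 400, 800, 400],          # long, insistent pulses
    "text": [150, 100, 150],               # two short buzzes
    "email": [300],                        # one medium buzz
    "nearby_place": [50, 50, 50, 50, 50],  # rapid flutter
}

def pattern_for(event_type):
    """Return the on/off haptic pattern for a notification type."""
    return VIBRATION_PATTERNS.get(event_type, [200])  # default: one buzz

print(pattern_for("text"))  # → [150, 100, 150]
```

The interesting question is how many such patterns a person could reliably tell apart without conscious effort - the haptic equivalent of learning to touch-type.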
