Natural user interfaces, or NUIs, are characterized by three distinct input modalities: multi-touch, in-air (kinetic) gesturing, and voice commands. "Natural user interface" is a blanket term coined by Steve Mann. NUIs resemble mouse-driven graphical interfaces, but with another layer of abstraction removed, allowing users to interact with the system directly via fingers, motion, or speech.

Note: Let's not get caught up in vernacular. You can easily substitute Touch User Interface, Gestural User Interface, or Sensory Interface.

We've been designing for gestural interfaces for the past five years. Beyond typography, touch targets, and an aging set of human interface guidelines, multi-touch surfaces like tablets and smartphones create interesting opportunities for design that caters to the device itself. The experience feels natural because it establishes a positive emotional resonance with a physical object, rather than just a website behind glass.

Yet we've reached a kind of emotional stagnation with the screen. The initial delight of owning a touchable device is fading into the unremarkable. We're losing the magic, and design needs to keep moving forward to enhance the experience.

I've observed during usability studies that people tend to use their fingers or hands as a giant mouse, clicking their way to the desired goal. Rather than pointing and clicking, it's time we explore and embrace NUIs by expanding the gestural vocabulary, moving beyond the point-and-click convention that I think inhibits the full experience. Why do you choose to check Twitter over the million other things your smartphone can do while waiting in a grocery line? It's because simple, rhythmic, gestural microinteractions like pull-to-refresh keep delivering that sense of delight.
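The mechanics behind a microinteraction like pull-to-refresh are themselves quite simple. As a minimal sketch (the 80px threshold, resistance factor, and function names here are illustrative assumptions, not any platform's actual values), the core logic might look like:

```typescript
// Minimal pull-to-refresh sketch. THRESHOLD and RESISTANCE are assumed
// values for illustration; real implementations tune these per platform.

const THRESHOLD = 80;   // pull distance (px) required to trigger a refresh
const RESISTANCE = 0.5; // dampen raw finger travel so the pull feels elastic

// Map raw finger travel to the distance the content visually moves.
function pullOffset(startY: number, currentY: number): number {
  const travel = currentY - startY;
  return travel > 0 ? travel * RESISTANCE : 0; // only downward pulls count
}

// Decide whether lifting the finger should trigger a refresh.
function shouldRefresh(startY: number, releaseY: number): boolean {
  return pullOffset(startY, releaseY) >= THRESHOLD;
}

console.log(pullOffset(100, 300));    // 100: 200px of travel, halved by resistance
console.log(shouldRefresh(100, 300)); // true: 100px offset clears the 80px threshold
console.log(shouldRefresh(100, 200)); // false: 50px offset falls short
```

The resistance curve is what gives the gesture its rhythmic, elastic feel: the content lags behind the finger, so the user senses the interface pushing back before the refresh releases.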