Interface/Off (Part 1)

As the pace of technological development continues to increase, it becomes more and more interesting to me to try to predict the future – I’m impatient and enjoy getting closure on my guesses sooner rather than later. Despite the difficulty of accurately forecasting the future, it’s worth taking a shot because the upside is so high: if you can position yourself or your company as an expert on the next world-changing technology before it breaks big, you will be as popular as a PBR salesman in Dolores Park. Conversely, a poor understanding of where technology is going can be the death knell for even a dominant company – look at how quickly Nokia and RIM have fallen following their inability to anticipate and understand the smartphone revolution.

I would argue that we are in the midst of a pretty significant shift right now in the field of interface technology, specifically with regard to smart products. For a few decades after the advent of the personal computer, the keyboard and mouse were the standard interface technology, without much in the way of alternatives. However, the overlapping rises of mobile computing, smart products, and some new interface technologies have significantly opened up the playing field as we look forward (nowadays Minority Report seems a lot less futuristic and a lot more presentistic, which is definitely not a word). Just look at the last 5 years – touchscreens and then voice control have dramatically expanded the realm of possibility for how we interact with our phones. It is possible that touchscreens’ day in the sun will be shorter than one might think, though – new technologies such as gestural interfaces and brain-machine interfaces (BMIs) are showing significant promise and could become ubiquitous within a few years. So, the question is, how will you interact with the (man-made) world around you in 5, 10 or 25 years?

Because this topic deserves a little more space to breathe than I can give it in a single post, I decided to break up my thoughts into 4 follow-up posts, ordered roughly by increasing sophistication and system versatility:

  1. In the red corner are Touch-Based Systems: physical button systems such as keyboard and mouse, touch sensors, and touch screens
  2. In the blue corner are Touch-Free (Gestural) Systems: systems that are controlled physically but without contact, such as eye tracking, voice control and gestural interfaces
  3. In the yellow corner (it’s a triangular ring apparently) are Brain-Machine Systems: revisiting thought control, which I’ve touched on several times in previous posts
  4. Finally, the fight can’t end without a Judge’s Decision: what system will claim victory?
