OpCon 18.1: Searching for the Ultimate UI
Way back in the 70s, video games burst into our homes, and like many teenagers of that era I was hooked on both computers and games. The first games were very simple and could be controlled with just a few keys, but innovation was fast, and soon we were scrolling through worlds, jumping and running using different keys. Although I still love games, what interests me more now is the human-computer interface, and it’s a topic I have frequently explored. Learning about one of our new UI features had me both reminiscing and wanting more technologically advanced input methods.
Looking back, I simply wanted a better control mechanism, so I explored various keyboards and joysticks, but they never felt quite right. Full-sized machines in gaming arcades needed to be simpler and stronger (they often took a pounding from irate gamers), so they dispensed with the keyboards and used large buttons and sticks. These worked well, but there was one machine that fascinated me. It was called Marble Madness, and you controlled the marble with a large trackball. You could use both hands to make the trackball and marble roll in any direction. I’d played the same game by keyboard, and it wasn’t very interesting, but the trackball turned it into an exciting physical experience rather than a simple game.
Around the same time, Clint Eastwood’s movie Firefox came out; in it he steals an advanced plane that has weapons controlled by thought! We wondered: Would we ever be able to control games by thought? One of the most interesting stories around this 1982 film is that it cost $21M to make, and over 95% of that budget was for special effects!
A few years later one of the first VR systems was available to play in an arcade in London. It cost the princely sum of one British pound, which was a lot back then, but worth every penny. The graphics were poor, but the experience was priceless. When Sony’s PlayStation arrived in the mid-90s, there were stories that designing the hand controller cost $1M. That controller is largely the same one my son uses today, so it’s stood the test of time, although it has some way to go to surpass the keyboard (1874 and still going strong). I doubt we will be using the same console controls in 2134.
I waited patiently for VR gaming, but unfortunately it has only recently become available, and it requires expensive technology and somewhat claustrophobic headgear to make it a worthwhile experience. Steering wheels and pedals generated a lot of excitement for a while but never quite felt like the real thing; mostly it was the lack of feedback from the “car” that weakened the experience.
Then in 2002 the Tom Cruise film Minority Report came out. Wow. By simply waving his hands, he could control multiple screens – zooming in, flipping images and solving crime. I wanted that, and in remarkably short order science fiction became science fact: Microsoft’s Kinect for the Xbox arrived in 2010.
I don’t believe SMA has any firm plans to control OpCon with hand gestures, but goodness me, that would make a cool feature. We can control it with voice using Amazon’s Alexa (but that’s another story), and both ideas would be impractical in an office. Just imagine the potential for chaos!
A potentially less invasive control technology that really got me excited was eye-tracking. One company in this space, Tobii, builds solutions both for people overcoming disabilities and for the gaming industry. As an example of how this technology is used, imagine your character in a game is inside a building, with a door to the left and stairs to the right. Simply looking at the door causes your character to go through the door, and looking at the stairs moves your character up them.

The latest release of OpCon includes a cool progressive discovery feature. It is a technique that presents a large, complex workflow one section at a time: you control which direction you want to explore next, which areas you want to zoom in on, and so on. I love it, but imagine if you could simply look at a section to expand it and thereby walk through the workflow as you desired. Great, but perhaps not available for a while.
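To make the idea concrete, here is a minimal sketch of how a progressive discovery view might work, written in TypeScript with entirely hypothetical names (WorkflowNode, ProgressiveView, expand); it illustrates the general technique, not OpCon’s actual implementation:

```typescript
// A minimal sketch of progressive discovery over a workflow graph.
// All names here are illustrative assumptions, not OpCon's API.

interface WorkflowNode {
  id: string;
  label: string;
  children: WorkflowNode[]; // downstream sections of the workflow
}

// Tracks which sections the user has chosen to reveal so far.
class ProgressiveView {
  private expanded = new Set<string>();

  constructor(private root: WorkflowNode) {
    this.expanded.add(root.id); // start with only the entry point visible
  }

  // Reveal the section the user selected (by click, or one day, by gaze).
  expand(id: string): void {
    this.expanded.add(id);
  }

  // Return only the visible slice of the workflow: expanded nodes
  // plus the immediate children they expose as the next choices.
  visible(node: WorkflowNode = this.root, depth = 0): string[] {
    const lines = [`${"  ".repeat(depth)}${node.label}`];
    if (this.expanded.has(node.id)) {
      for (const child of node.children) {
        lines.push(...this.visible(child, depth + 1));
      }
    }
    return lines;
  }
}

// Example: a small nightly batch workflow.
const workflow: WorkflowNode = {
  id: "root", label: "Nightly Batch", children: [
    { id: "extract", label: "Extract", children: [
      { id: "db", label: "Dump DB", children: [] },
    ]},
    { id: "load", label: "Load", children: [] },
  ],
};

const view = new ProgressiveView(workflow);
console.log(view.visible().join("\n")); // only top-level sections shown
view.expand("extract");                 // user chooses to explore Extract
console.log(view.visible().join("\n")); // Extract's children now revealed
```

Wire expand() to a gaze event instead of a mouse click and you have exactly the eye-tracking walkthrough described above.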
I absolutely love that technology is being used to help those with physical challenges, and perhaps the most notable example is the recently departed, great physicist Stephen Hawking. Increasingly paralyzed from around 1985, he used a single cheek muscle to write books, deliver lectures and communicate with the world. Perhaps surprisingly, he wasn’t a fan of eye-tracking, noting, “I have experimented with eye tracking and brain controlled interfaces to communicate with my computer. However, I still find my cheek operated switch easier and less fatiguing to use.”
Don’t worry, we do not have plans to control OpCon in this way, but eye-tracking… That would be neat.