Years and years ago when I started using computers I had a keyboard, a big tape drive and eventually this new thing called a mouse. It doesn't exactly sound groundbreaking now, but at the time it was an interface we'd never had before, and it changed the way we used computers by making things so much simpler. Add joysticks to the mix and we had three main interfaces: keyboard, joystick and mouse. We also had trackballs, which were similar and got people experimenting with different interfaces and different ways of doing things on computers.
The Gaming Interface
Over the next few years, most of the new interfaces that came to computers were built around gaming. Consoles arrived, and control pads were designed to be ergonomic so that we could use them easily even though they were covered in buttons, paddles and triggers. The problem was that they were still designed very much with a male, geeky audience in mind, with a very masculine shape.
The thing that changed everything was the Wii. It came with a controller that was more feminine in its style, didn't need to be plugged into a wire, and made your body's movement the interface. It was the first thing since the mouse to change the way we use computers, and it made the Wii a massive success because it opened up gaming to a much wider demographic, like families. Spawning from that, we had the PlayStation Move controller and the Xbox Kinect, which were similar sorts of products.
The Touch Screen Revolution
Touch screens had been around for a while, but the interfaces were so clunky and unresponsive that they weren't beautiful or easy to use. Then Apple came out with a version that mimicked natural human gestures like swiping and pinching. All of a sudden we were using human gestures to control a digital device, and that was a massive thing, because you no longer had to know how to use a keyboard or a mouse. With touch you could do everything, and it opened things up to technophobes and people who wouldn't usually go near a clunky old computer, like young children and the elderly, because they could work out how to use a touch device in a very short time. It's now got to the point where some people use touch devices more than they use computers, and eventually touch devices will destroy the computer as we know it. If humans have taken so naturally to touch and gesture, surely perceptual computing can be used for so much more than Just Dance?
What Does This Mean For Matmi?
This year we will see the rise of perceptual computing in exhibitions, theme parks and shop windows. This technology uses gesture and voice recognition to make the person the interface, meaning we can interact just by being ourselves. We've been doing a lot of work with theme parks over the last couple of years, and one thing we know is that perceptual computing works very well in this environment, where kids can dance with monkeys and compare themselves with animals. We want to create things that are educational and fun by using technology in a different way. For example, we want to help shops attract customers by replacing 'SALE' signs in their windows with engaging technology that lets passers-by truly interact with the products on sale, learning from what we do online but using the human interface within a shop window so people can virtually try on clothes and share on social media.
From Matmi's point of view it's all about engagement, and if you don't have to do much or learn anything to engage, you can just be. It won't just be movement; it will be touch, voice and facial expressions, and we will end up communicating with computers just as if they were human beings. My television at home is voice and gesture controlled, and I believe it won't be long until we go totally sci-fi and have control panels helping to run our homes. But how much is too much? Where do we draw the line with this technology? And more importantly, how can we use it to make engaging and extraordinary digital solutions?
Below is an example of some of the ways we have been experimenting with the human interface using the Kinect.
Jeff Coghlan, CEO