by Ian Hooper
One of my passions is exploring the possibilities of human-computer interaction. Perhaps it is my background in industrial design, but there is something particularly interesting to me in the ways that we try to bridge the gap between our bodies and our machines. We are naturally high-bandwidth, multi-sensory creatures, yet we are trying to exchange information with a machine that is functionally deaf, dumb, and blind.
If some of these researchers have their way, the future will be full of much richer exchanges between ourselves and our machines. I recently attended the CHI 2012 conference, where I saw some great examples, such as the ZeroTouch interface. Multitouch on our phones and tablets is not so new anymore, but it is still a rare thing to see on large, desktop-sized displays. The Texas A&M Interface Ecology Lab showed an evolution of an old touchscreen technology that may change this situation. Using a ring of infrared emitters and detectors, the team has been able to create an optical sensing solution that does not suffer from the occlusion and precision problems of other bezel-based touch solutions. The ZeroTouch is fast, accurate, works at a large scale and, unlike many other multitouch devices, is relatively low cost. The ZeroTouch system was originally shown at CHI 2011, where the largest display it worked on was a 27” monitor. At this year’s event the team had improved the performance and scaled it up to work on a 55” plasma TV.
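The beam-blocking idea behind this kind of frame can be sketched as a toy model. To be clear, this is my own illustration, not the ZeroTouch implementation, and every number in it (frame resolution, fingertip radius) is an assumption: emitters and detectors face each other across a unit-square frame, a fingertip blocks any beam passing close to it, and intersecting the blocked horizontal and vertical beams recovers the touch point.

```python
# Toy sketch of optical frame sensing (my own illustration, not the
# ZeroTouch design). Emitter/detector pairs span a unit-square frame.

FINGER_RADIUS = 0.03  # assumed fingertip radius, in frame units


def point_segment_dist(p, a, b):
    """Distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))          # clamp to the segment
    cx, cy = ax + t * dx, ay + t * dy  # closest point on the segment
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5


def beams(n=21):
    """Horizontal and vertical beams crossing a unit-square frame."""
    step = 1.0 / (n - 1)
    for i in range(n):
        yield ((0.0, i * step), (1.0, i * step))  # left edge -> right edge
        yield ((i * step, 0.0), (i * step, 1.0))  # bottom edge -> top edge
    # A real frame would be far denser and also cross beams diagonally.


def locate(touch):
    """Find which beams a touch blocks, then intersect them."""
    blocked = [b for b in beams()
               if point_segment_dist(touch, *b) < FINGER_RADIUS]
    xs = [a[0] for a, b in blocked if a[0] == b[0]]  # vertical beams share x
    ys = [a[1] for a, b in blocked if a[1] == b[1]]  # horizontal beams share y
    return (sum(xs) / len(xs), sum(ys) / len(ys))


x, y = locate((0.40, 0.70))
print(round(x, 3), round(y, 3))  # recovers roughly (0.4, 0.7)
```

Because each beam reports only blocked-or-clear, a single camera-style depth value never enters the picture; multiple simultaneous touches are what make the real reconstruction problem (and the occlusion handling the team solved) genuinely hard.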
The ZeroTouch was also interesting to me because it is a frame, not a screen or surface. This is significant for a few reasons. Firstly, despite the growing number of multitouch displays coming to market, it will be years before any significant number of people have access to this new technology. Secondly, some people work in places that spend a lot of money on color-accurate displays. These people are not going to want to throw away that investment just to have a richer interaction modality. If the ZeroTouch were to come out as an after-market enhancement to your existing screen – and if the price were right – it might just bridge the gap between the installed base of dumb displays and the next generation of touch-enabled displays. Finally, being a frame is interesting because it does not necessarily need to be mated to the surface of a display. It can hang out front and allow for open-air gestures like you might see with a Kinect. While it cannot sense depth per se, it does open the possibility of going beyond what can be done with a surface sensor.
Hand and arm gestures are a great way of enriching our interaction with computers, but for some researchers it is our sense of taste that has been neglected for too long. Hiromi Nakamura was part of the team from Meiji University that showcased concept devices that introduce electric current into food as it is being eaten. At CHI 2012 they had a cup with a wire in the straw that would electrify lemonade as it was drunk, and a fork with a variable slider to adjust the electrification of food on the fork. By modulating the electric current passing through the food, different reactions from our taste buds can be elicited. In theory, qualities such as saltiness or sourness could be dynamically adjusted by changing the electrical waveform and intensity. In practice, I found that the natural sourness of the lemonade overpowered my perception of any change to the drink. The cheese on the electrified fork, on the other hand, really did taste different. While holding the cheese on my tongue with the electric fork, I could taste distinct changes in flavor as I slid the control from low to high. It was quite a remarkable experience to have flavors change in my mouth without anything being added or removed. Unfortunately, the alteration simply made the cheese taste more or less metallic – not exactly an appealing flavor.
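The slider-to-waveform mapping lends itself to a tiny sketch. Everything here is my own assumption – the square-wave shape, the amplitude and frequency ranges – and is purely illustrative, not the Meiji team's published design and certainly not a safety recommendation:

```python
# Hypothetical sketch of the electric fork's control idea: one slider
# position modulates both the amplitude and the frequency of a
# square-wave stimulation current. All parameter ranges are assumed.

def stimulation_waveform(slider, duration_s=0.01, sample_rate=10_000):
    """Square wave whose amplitude (uA) and frequency scale with slider in [0, 1]."""
    amplitude_ua = 20 + 160 * slider    # assumed micro-amp range, illustrative only
    frequency_hz = 50 + 450 * slider    # assumed frequency range
    samples = []
    for n in range(int(duration_s * sample_rate)):
        t = n / sample_rate
        phase = (t * frequency_hz) % 1.0        # position within one period
        samples.append(amplitude_ua if phase < 0.5 else -amplitude_ua)
    return samples


# Sliding the control from low to high changes both knobs at once,
# which matches the single-slider fork described above.
wave = stimulation_waveform(0.75)
print(max(wave), min(wave))  # 140.0 -140.0
```

A real device would, of course, need current limiting and biphasic charge balancing in hardware; the point of the sketch is only that a single user-facing control can drive several electrical parameters at once.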
The practicalities and value of allowing a computer to interface with our sense of taste may be questionable, but adding sensors and actuators to enrich our experience is a good idea. Already our mobile devices communicate across our visual, auditory and tactile senses. Maybe someday we’ll have dental implants that make us taste candy when our sweetheart calls.