The way we interact with machines is changing. Significant progress is being made in gesture-based control and in thought-based control – controlling machines with our minds. Knobs and switches are going to give way to hand movements and brain waves. Eye movement and vocal commands are also being harnessed as ways to manipulate machines. Advances in this area will be especially beneficial to the medical field, where cleanliness and sterility are constant concerns. Gesture-, mind-, and voice-based control will enable surgeons and operating suite technicians to control monitors and devices without having to touch them, minimizing the chance of bacterial transmission and infection. That advantage extends to all areas of the hospital and clinic environment.
From a design standpoint, this raises some interesting issues. How do we design control and feedback interfaces when we no longer need physical mechanisms to do so? How do we design them to be understandable, to be intuitive?
Graphical user interface design would seem a likely starting point, substituting images and iconography for physical buttons and the like – though it is still fundamentally a touch-based paradigm. But I think we'll move beyond the GUI to something that uses the capabilities of thought and gesture control in ways we haven't yet imagined.