Following up on his earlier column about the command line as the future of user interfaces, Donald Norman now turns to physicality in the latest issue of ACM interactions, describing "the return to physical controls and devices" as another important direction. As he puts it, "Physical devices, what a breakthrough! But wait a minute, isn't this where the machine age started, with mechanical devices and controls?" So it is a kind of throwback to earlier times, though one "with improvement".
"Physical devices have immediate design virtues, but they require new rules of engagement (...) Designers have to learn how to translate the mechanical actions and directness into control of the task. (...) As we switch to tangible objects and physical controls, new principles of interaction have to be learned, old ones discarded. With the Wii, developers discovered that former methods didn't always apply. Thus, in traditional game hardware, when one wants an action to take place, the player pushes a button. With the Wii, the action depends upon the situation. To release a bowling ball, for example, one releases the button push. It makes sense when I write it, but I suspect the bowling-game designers discovered this through trial and error, plus a flash of insight. Not all of the games for Wii have yet incorporated the new principles. This will provide fertile ground for researchers in HCI."
He also raises intriguing issues, such as how the movement toward physical interfaces would lead HCI to "move from computer science back to mechanical engineering (which is really where it started many years ago)". He therefore advocates an HCI that takes advantage of both mechatronics and UX: "If the future is a return to mechanical systems, mechatronics is one of the key technological underpinnings of their operation. Mechatronics taught with an understanding of how people will interact with the resulting devices"... I wonder where this would happen.
Why do I blog this? It's now well established that "new rules" need to be written: new games are being designed, new guidelines described, new approaches required (gestural language annotation, for example). But the final point, about the need to combine mechatronics with a user-centered approach, is less common in papers about tangible interfaces. It will be curious to see how things unfold in that direction (yes, I assume it's the right direction).