The day before yesterday, Professor Duval told us a little about the history of HCI. As always, IT history sounds too incredible to be true. Yet it is: a story of vision, of opportunities, and a glance at what might lie ahead.
As master's students, we've come across the history of the field before. Yet every time, I cannot help but marvel at how short a time span it took for computers to become omnipresent. All the more reason to reflect a little on how we interface with them since, as it stands, functionality seems to take precedence over usability most of the time.
Still, progress has been enormous, evolving from switches in the fifties and sixties through punched cards and CLIs to, finally, GUIs. Yet it doesn't stop there. As an example, a typical book on HCI (e.g. Designing Interactive Systems, D. Benyon, 2010) explicitly advocates 'user-centred design'. What use is functionality if hardly any target users know how to operate the system?
What is this 'design thing' about? More than simply an intuitive look and feel, it seems. The user has to be integrated into the design process as early as the first draft. A user-centred approach forces designers to rethink the design again and again. In fact, a shift of focus occurs: alongside the technical analysis and design, 'soft science' enters the process. Insights from cognitive psychology, the social sciences, etc. become ever more important.
This shouldn't come as a surprise. The task and influence of computer systems have also changed over the years. Originally, dumb machines were meant to perform repetitive, specialised tasks. Now, computers are all around, connecting people and effectively affecting nearly every aspect of our lives. Do you still have the patience to wait for 'snail mail' to arrive? Did you ever count the number of computers you're wearing, even as you are reading this?
This new scope asks for novel ways of interaction. The aforementioned book has sections on auditory interaction (speech, but also so-called 'earcons'), tangible interaction (using touch and haptic devices) and even smell. To communicate more directly still, brain-computer interfaces are being developed too. A useful toolkit or a potential source of manipulation?
I would say a trend is under way that stretches far into the future, in ways we might not be able to foretell. First of all, there is connectivity. As mentioned in class, there are currently some compatibility issues, but they will be resolved one day. When that happens, connectivity and reachability might venture beyond anything we know now. This raises an interesting question: do we want to be reachable all the time? And even if not, do we want others to know we actually don't want to be reachable (for them, or in general)? With an abundance of information available at all times, are we expected to know everything as soon as it happens?
A second trend, more closely related to HCI, is the notion of the invisible computer. Many things we wear or handle contain circuitry. To communicate effectively with all the gadgets we use (preferably simultaneously), novel ways of interaction might be needed. When you cut paper, you don't wonder about the mechanics of the scissors: you perform a cutting task instead. The same argument goes for computers: people don't want to think about how they work, they simply want to use them as they would use everyday objects.
An obvious concern: new ways of interaction generate new data streams. Who will have access? What about privacy? And how might this affect the balance of power in a world where information becomes more important than ever?
A last remark: what does it take for all this to happen? Vision, as Professor Duval suggested? Most certainly, but what does so broad a term signify? After all, visionaries are dreamers. What distinguishes the former from the latter is, oddly enough, a sense of reality. Although they keep possibilities in sight, they fully understand what is not possible at the moment. Yet they also realise that every tiny contribution they make could eventually open up the path: even the longest journey starts with a single step. With hindsight, the history of IT seems almost magical. At the time, it was probably more a matter of (albeit smart) people working on the ideas they had, just as people try to do nowadays. A very reassuring thought... or not?
Interesting post! And the scissors are a very striking example.
Actually, I don't fully agree with your last paragraph. To me, a visionary can be distinguished from a dreamer because he or she puts ideas (or dreams) into action. By that measure, Engelbart is also a visionary: he had an innovative idea, nobody back then thought it was possible (it was not yet reality), but he did the research and presented the world with a demo that is now considered the mother of all demos.
I understand what you mean. I think, however, that very often the distinction is only made once some initial targets have been reached, once ideas have proven to be realisable (as with the demo). That's why I stress the small steps. In my opinion, people with vision don't necessarily work on their great idea all the time, but restrict themselves to what needs to be done now, keeping alive what it might lead to without dwelling on it for too long. Yet they also take risks, choosing their targets just beyond the existing. That is their sense of reality: making a correct guess about the border between reality and fiction, given their abilities and the current technology. Only when you put the entire puzzle of history together can you conclude whether someone truly lived his (or her) vision, whether their small accomplishments can be chained together into one vision.
On the other hand, I know plenty of dreamers who (try to) put their dreams (or ideas) into action (and many more who don't - you have a point there). Only, most of them try to take on the whole world at once, or at least more than they can handle. That's what makes them dreamers in the eye of the beholder, and history often proves the beholder right...
So, for me, the main distinction is not whether one puts dreams or ideas into action, but whether one succeeds in doing so, diverging slightly from the initial idea when needed.