While launching the first iPhone in 2007, Steve Jobs asked, “Who wants a stylus? We are going to use the best pointing device in the world. We are born with it and we have 10 of them. We are going to use our fingers.” This is one of the most memorable illustrations of Human-Computer Interaction (HCI) in practice. An ever-evolving field of study, HCI is changing the way we interact with modern-day computers at an escalating rate.
What is HCI?
The Interaction Design Foundation defines it as “a multidisciplinary field of study focusing on the design of computer technology and, in particular, the interaction between humans (the users) and computers. While initially concerned with computers, HCI has since expanded to cover almost all forms of information technology design.” To expand on that definition: the basic human communication system involves hand gestures (fingers and arms), eye movements, facial expressions, and voice, and HCI is the bridge between these human communication channels and the computing device. Since its inception in the late 1970s, HCI has grown to incorporate multiple disciplines, including computer science, cognitive science, and human-factors engineering.
History of HCI
With the boom in personal computing in the 1980s, computers were no longer limited to academic experts and military personnel. Machines such as the Apple Macintosh, IBM PC 5150, and Commodore 64 brought computing to the common user, serving as word processors, accounting tools, or gaming units. Consequently, the need to make human-computer interaction easy and efficient for less experienced users became increasingly vital. After extensive research, academics came to believe that interaction between human and computer should resemble human-to-human communication: an open-ended dialogue.
John Carroll, Professor of Information Sciences and Technology at the Pennsylvania State University, says that the discipline of Human-Computer Interaction was born (or perhaps “emerged” is a better word) in 1980, as these separate disciplines began to realign around a single objective: making computing easier for the masses.
“…it no longer makes sense to regard HCI as a specialty of computer science; HCI has grown to be broader, larger and much more diverse than computer science itself. HCI expanded from its initial focus on individual and generic user behavior to include social and organizational computing, accessibility for the elderly, the cognitively and physically impaired, and for all people, and for the widest possible spectrum of human experiences and activities. It expanded from desktop office applications to include games, learning, and education, commerce, health, and medical applications, emergency planning and response, and systems to support collaboration and community. It expanded from early graphical user interfaces to include myriad interaction techniques and devices, multi-modal interactions, tool support for model-based user interface specification, and a host of emerging ubiquitous, handheld and context-aware interactions.”
— John M. Carroll, author and one of the founders of the field of human-computer interaction.