Chapter 3: HCI


Chapter 3

Human–Computer Interaction
By
Dr. Nilesh M. Patil
Syllabus

USER INTERFACES AND INTERACTION FOR WIDELY USED DEVICES; HIDDEN UI VIA BASIC SMART DEVICES (DURATION: 8 HOURS)
Introduction to HCI

 Human Computer Interaction (HCI) is characterized as a dialogue or interchange between the human and the computer, because the output of one serves as the input for the other in an exchange of actions and intentions.
 HCI is the study of the interaction between people (users) and computers.
 HCI is concerned with the design, evaluation and implementation of interactive computing systems for human use, and with the study of the major phenomena surrounding them.
 HCI is an interdisciplinary field in which computer scientists, engineers, psychologists, social scientists and design professionals play important roles.
Human Computer Interaction (HCI) is an Interdisciplinary Field
Human Computer Interaction (HCI)
 HCI tackles questions concerning how people interact with
computers
 Are computers intuitive or complicated?

 Are computers rewarding or frustrating?

 How can computers be made accessible to everybody (e.g. different physical abilities, different
languages, etc.)?

 To what level can computer interaction be standardized?

 Are computers “user-friendly”?

 What does it mean to be “user-friendly”?


Human Computer Interaction has Three Components
 Human

 Computer

 Interaction

The goal of HCI is to improve the interaction between users and computers by making computers more user-friendly and receptive to the user's needs.
Model Human Processor

 Card, Moran and Newell (1983) described the Model Human Processor (MHP).
 A simplified view of the human processing involved in interacting with a computer system.
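As an illustrative aside (not from the slides), the MHP is often summarised with approximate cycle times for its perceptual, cognitive and motor processors; a minimal sketch in Python, assuming the commonly cited textbook values of roughly 100 ms, 70 ms and 70 ms:

```python
# Minimal sketch (illustrative only): estimating a simple reaction time with the
# Model Human Processor, assuming commonly cited approximate processor cycle times.
PERCEPTUAL_MS = 100   # perceptual processor cycle (approximate textbook value)
COGNITIVE_MS = 70     # cognitive processor cycle (approximate textbook value)
MOTOR_MS = 70         # motor processor cycle (approximate textbook value)

def simple_reaction_time_ms() -> int:
    """See a stimulus, decide, and press a key: one cycle of each processor."""
    return PERCEPTUAL_MS + COGNITIVE_MS + MOTOR_MS

print(simple_reaction_time_ms(), "ms")   # roughly 240 ms
```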
Why do we need to understand Humans?
 Interacting with technology is cognitive.
 Human information processing is referred to as cognition.
 Human cognitive processes are involved when interacting with a system, like attention, perception and recognition, memory, learning, reasoning, problem solving and decision making.
 Need to take into account cognitive processes involved and cognitive
limitations of users.
 Provides knowledge about what users can and cannot be expected to do.
 Identifies and explains the nature and causes of problems users encounter.
 Supply theories, modelling tools, guidance and methods that can lead to the
design of better interactive products.
 Must consider what are users good and bad at?
Computer
In fact, the most sophisticated machines are worthless unless they can be used properly by people.
What is Interaction?

(Diagram: communication between the user and the system.)

 Interaction refers to a dialogue generated by the command and data input to the computer, the display output of the computer, the sensory/perceptual input to the human, and the motor response output of the human.

 There are a number of ways in which the user can communicate with the system, e.g. batch input, direct manipulation, etc.
What is Interface?
 Interface is made up of a set of hardware devices and
software tools from the computer side and a system
of sensory, motor and cognitive processes from the
human side.

Interaction takes place at the interface.


Donald Norman’s model
 Norman’s model concentrates on user’s view of the
interface

 Seven stages
◦ user establishes the goal
◦ formulates intention
◦ specifies actions at interface
◦ executes action
◦ perceives system state
◦ interprets system state
◦ evaluates system state with respect to goal
(Diagram: the execution/evaluation loop, in which the user's goal drives execution on the system and evaluation of the system state feeds back to the goal.)
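To make the seven stages concrete, here is a purely illustrative walk through one pass of the execution/evaluation loop for a hypothetical "raise the volume" goal; all names are invented for this sketch, not taken from any real system:

```python
# Illustrative sketch of Norman's execution/evaluation loop for a hypothetical
# volume-control task. All names are invented; this is not a real API.

def execution_evaluation_loop(goal_level: int, system_level: int) -> int:
    goal = goal_level                                  # 1. user establishes the goal
    while system_level != goal:                        # 7. evaluate state with respect to goal
        intention = "raise volume"                     # 2. formulates intention
        action = "press volume-up"                     # 3. specifies actions at the interface
        system_level += 1                              # 4. executes the action
        perceived = system_level                       # 5. perceives the system state
        print(f"{intention} -> {action}: volume is now {perceived}")  # 6. interprets the state
    return system_level

execution_evaluation_loop(goal_level=5, system_level=2)   # prints three steps of the loop
```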
User interface (UI)
User interface: user interfaces mediate the interaction (dialogue) between humans and computers.

 The user interface today is often one of the most critical factors regarding the success or failure of a computer system.


 [[

 Good UI design:
 Increases efficiency

 Improves productivity

 Reduces errors

 Reduces training

 Improves acceptance
© Worboys and Duckham (2004) GIS: A Computing Perspective, Second Edition, CRC Press
Basic user interface styles

• Five commonly encountered user interface paradigms, ordered from most expressive to most intuitive:

1. Command entry (expressive)
2. Menu
3. Forms
4. WIMP
5. Natural language (intuitive)
Command line interface
• Command entry: the human user issues commands directly to the computer.

• Many different options customize commands (expressive).

• Requires the user to learn large numbers of commands and options (not intuitive).
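As a small illustrative sketch of this trade-off (the tool name and all of its flags are invented, not a real program), a command-entry interface exposes many options in a single, expressive but hard-to-memorise command:

```python
# Illustrative sketch: an expressive command-entry interface built with argparse.
# The tool name "mapcalc" and all of its flags are invented for this example.
import argparse

parser = argparse.ArgumentParser(prog="mapcalc",
                                 description="Hypothetical command-entry tool")
parser.add_argument("input")                               # positional argument
parser.add_argument("-o", "--output", default="out.dat")   # many options = expressive
parser.add_argument("--scale", type=float, default=1.0)
parser.add_argument("--verbose", action="store_true")      # ...but all must be memorised

args = parser.parse_args(["terrain.dat", "-o", "result.dat", "--scale", "2.5"])
print(args.output, args.scale, args.verbose)               # result.dat 2.5 False
```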
Menu interface

• Menu interface: commands are organized into logical groups (more intuitive than command entry).

• A submenu can be used to present a further related list of sub-functions or options.

• The menu structure limits the range of options (less expressive than command entry).

• A restricted form of WIMP.


Form Interface

• Form interface: presents specific questions to which a user must respond in order to perform some task.

• Intuitive, since users are led step by step through the interaction.

• Not expressive, since the form allows access to only a few specialized commands.
WIMP

• WIMP stands for windows, icons, menus, pointers.

• WIMP interfaces are familiar as they are the basis of most desktop-computer operating systems.
Some other Interaction styles
 Question/answer and query dialogue
 Point and click
 Direct Manipulation
 Three–dimensional interfaces
 Gesture Recognition
 Gaze Detection
 Speech and Speaker Recognition
 Pen based Interaction
 Motion Tracking sensors and Digitizers
 Taste and smell sensors
Principles of User interface design
 Simple and natural dialogue
 Speak the user’s language
 Minimize user’s memory load
 Provide feedback
 Provide clearly marked exits
 Provide shortcuts
 Deal with errors in a positive manner
 Provide help
Example: Speak the users’ language

 Terminology based on users’ language for task


◦ e.g. withdrawing money from a bank machine

 Use meaningful mnemonics, icons & abbreviations
◦ e.g. File / Save
Example: Minimize user’s memory load

 Computers are good at remembering; people are not!
 Promote recognition over recall
◦ menus, icons, choice dialog boxes vs. commands, field formats
◦ relies on visibility of objects to the user (but less is more!)
Example : Provide feedback
 Continuously inform the user about
◦ what it is doing
◦ how it is interpreting the user’s input
◦ user should always be aware of what is going on

Multiple files being copied, but feedback is file by file.
Suggestions for Interface Designs
 HCI has traditionally been about designing efficient and effective systems.

 Well-designed interfaces can elicit good feelings in users.

 Expressive interfaces can provide comforting feedback.

 Badly designed interfaces make people angry and frustrated.

 Emotional interaction is concerned with how we feel and react when

interacting with technologies.

 Emotional interaction is concerned with how interactive systems make

people respond in emotional ways.

 Relaxed users will be more forgiving of shortcomings in design.

 Aesthetically pleasing and rewarding interfaces will increase positive affect.


Suggestions for Interface Designs cont..

 User interfaces should be designed to match the skills, experience and expectations of their anticipated users.

 System users often judge a system by its interface rather than its
functionality.

 A poorly designed interface can cause a user to make terrible errors.

 Poor user interface design is the reason why so many software systems
are never used.

 Designers should be aware of people’s physical and mental limitations


(e.g. limited short-term memory) and should recognise that people make
mistakes.
Basic Goal of HCI
USABILITY

One of the key concepts in HCI.


It is concerned with making systems easy to learn and use

A usable system is:
 easy to learn
 easy to remember how to use
 effective to use
 efficient to use
 enjoyable to use
 safe to use
In order to produce computer systems with good usability, developers must attempt to:

• Understand the factors that determine how people use technology.
• Develop tools and techniques to enable building suitable interaction systems.
• Achieve efficient, effective and safe interaction.
• Put people first: their needs, capabilities and preferences for conducting various tasks should direct developers in the way that they design systems. People should not have to change the way they use a system to fit in with it; instead, the system should match their requirements.

The long-term goal: to design systems that minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task.
Why is usability important?

 Poor usability results in


◦ anger and frustration
◦ decreased productivity in the workplace
◦ higher error rates
◦ physical and emotional injury
◦ equipment damage
◦ loss of customer loyalty
◦ increased costs
Role of HCI in Pervasive Computing

The technology should be invisible, hidden from sight. The aim is to develop information appliances that fit people's needs and lives. To do this, companies must change the way they develop products: they need to start with an understanding of people, putting user needs first and technology last (the opposite of how things are done now).

Now computers have become pervasive: they are embedded in everyday objects.

Users do not care about what is inside the box, as long as the box does what they need.
Explicit HCI
 User is always at the center of interaction.
 System control responds to, and is generated by, the human.
 The system is not driven internally but by the user.
 It is complex to coordinate the many inputs from different devices needed to perform concurrent activities.
HCI motivation

 To support more effective use.


 useful: accomplish a user task that the user requires to be done
 usable: do the task easily, naturally, safely (without danger of error)
 used: enrich the user experience by making it attractive, engaging, fun, etc.
 The success of a product depends largely on both the user’s experience with how
usable a product is and how useful it is in terms of the value it brings to them.
Heckel’s law and inverse law

 Heckel’s law states that the quality of the user interface of an appliance is
relatively unimportant in determining its adoption by users if the perceived
value of the appliance is high.
 Heckel’s inverse law states that the importance of the user interface design in
the adoption of an appliance is inversely proportional to the perceived
value of the appliance.
 Although the usability of the UI is important, the overriding concern is the
usefulness of the device itself.
Implicit HCI motivation

 Explicit HCI (eHCI) design supports direct human intervention.


 Pure explicit interaction is context free.
 Users must repeat and reconfigure the same application access every
session even if every session repeats itself.
 It is also more about H2C (Human to Computer) Interaction.
 Focus is on the human having a model of the system (a mental model)
rather than the system having a model of the individual user.
Implicit HCI (iHCI)

 Eg: Person entering dark room


 It is an action performed by the user that is not primarily aimed at interacting with a computerised system, but which such a system understands as input.
 Context aware
 C2H (Computer to Human) Interaction
 The computer has a certain understanding of the user’s behaviour in a given situation (additional input)
 More complex to design than eHCI.
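A purely illustrative sketch of the dark-room example above; the sensor readings, threshold and function name are all invented for this sketch, not from any real device API:

```python
# Illustrative sketch of implicit HCI: entering a dark room implicitly requests light.
# Sensor names and thresholds are hypothetical, invented for this example.

def on_motion_detected(ambient_light_lux: float, light_on: bool) -> bool:
    """Walking into the room is the user's action; switching the light on is the
    system's interpretation of that action as implicit input."""
    DARK_THRESHOLD_LUX = 50.0          # assumed value for "dark"
    if not light_on and ambient_light_lux < DARK_THRESHOLD_LUX:
        return True                    # turn the light on
    return light_on                    # otherwise leave the light as it is

print(on_motion_detected(ambient_light_lux=12.0, light_on=False))  # True: room is dark
```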
Complexity and challenges of iHCI
 To accurately and reliably determine the user context
 Systems may also require time to learn and build an accurate model of a
user.
 Determining the user context may intrude on and distract users’ attention
Ubiquitous audio video content access
Continued…
 Individual voice, video and audio services are often not aware of each
other and sometimes are not user configurable.
 Eg: when a voice call arrives, TV and radio are automatically paused or
muted.
 Voice calls can be recorded in answer phone devices but they cannot
easily be exported or reformatted.
 To support such dynamic service composition requires the use of a
pervasive network infrastructure, standard multimedia data exchange
formats and certain metadata.
Ubiquitous information access and E-books
 Personal digital calendar- can be accessed through different devices.
 Pull type interaction allows users to initiate the information exchange e.g.:
searching the web.
 Push type notification services are used for customers to be notified of
events, e.g., news.
 PC remained the dominant interactive information access device, but not in
all cases e.g.: kitchen, which motivated use of mobile devices.
 Electronic information access offers advantages compared to paper.
Universal local control of ICT system

(Figure: a universal local remote control UI with device tabs for Radio, TV and DVD-W, play/record channel selections, the current time, and a phone-style alphanumeric keypad.)
User-Awareness and Personal Spaces
 Personalization can make the system customized.
 Configuration of services can also be personalized.
 E.g.: coordinate and configure different home appliances
 Complex issue is to manage shared social spaces.
Diversity of ICT Device Interaction
 Usually, a PC is a device with a programmable chip, haptic i/o, and visual UI.
 Embedded systems are used in a device to perform specialized tasks and have
different i/o interfaces
 Devices are characterized based on
1. size: hand-sized, centimetre-sized, decimetre-sized versus micro-sized versus body-sized or larger;
2. haptic input: two-handed versus one-handed versus hands-free operation;
3. interaction modalities: single versus multiple;
4. single user versus shared interaction: in personal space, friends’ space or public space;
5. posture for human operator: lying, sitting, standing, walking, running, etc.;
6. distance of output display to input control: centimeters to meters;
7. position during operation: fixed versus mobile;
8. connectivity: stand-alone versus networked, wired versus wireless;
9. tasking: single-task devices versus multi-task devices;
10. multimedia content access: voice and text communication-oriented, alphanumeric data or text-oriented, AV content access.
The range of ICT device sizes in common use in the 2000s
UI and Interaction for Four Widely
Used Devices
 Personal computer,
 Hand-held mobile devices used for communication,
 Games consoles and
 Remote-controlled AV displays, players and recorders.
PC Interface
 Early interfaces were command based.
 The WIMP (window, icon, menu, pointer devices) interface became the standard PC interface during the 1980s and 1990s.
 With WIMP, not only commands but also interactive screen objects can be controlled.
 It is the most dominant interface and supports direct manipulation.
WIMPS interface
 WIMPS interface is associated with a desktop metaphor.
 Documents relate to Windowed areas of the screen.
 Windows can be arranged in stacks, created, discarded, moved, organized and modified on
the display screen using the pointer device(Direct manipulation)
 Advantages of the WIMPS UI over the command UI
 The order in which multiple commands are issued can be ad hoc.
 Users do not need to remember command names
Dialogues in WIMP
 Dialogues are mechanisms by which users are informed of pertinent information that they must acknowledge, or are asked for input to constrain a query.
 Typically, this interface is displayed as a pop-up window called a dialog
box.
 E.g. form-filling dialog interfaces are used by many applications for alphanumeric data input (such as information systems) or for data output (such as spreadsheets).
 These enable applications to receive data input in a structured way,
reducing the processing used by a computer.
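To illustrate how such a dialog delivers structured input to an application, here is a minimal sketch; the form, its fields and its validation rules are hypothetical, invented for this example:

```python
# Illustrative sketch: a form-filling dialog collects fields in a structured way,
# so the application receives validated, typed data rather than free text.
# Field names and validation rules are invented for this example.

from dataclasses import dataclass

@dataclass
class BookingForm:
    name: str
    seats: int

def submit(raw_name: str, raw_seats: str) -> BookingForm:
    if not raw_name.strip():
        raise ValueError("Name is required")          # the dialog would show an error prompt
    seats = int(raw_seats)                            # structured, typed input
    if not 1 <= seats <= 6:
        raise ValueError("Seats must be between 1 and 6")
    return BookingForm(name=raw_name.strip(), seats=seats)

print(submit("Asha", "2"))   # BookingForm(name='Asha', seats=2)
```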
Drawbacks of WIMP

 not necessarily an improvement for visually impaired users


 consumes screen space which is more critical in lower-resolution
displays;
 the meaning of visual representations may be unclear or misleading to
specific users;
 mouse pointer control and input require good hand-eye coordination
and can be slow.
Mobile handheld device interface
 PC style WIMPS is not effective on mobile devices because
1. Display area is smaller.
2. It is impractical to have several windows open at a time.
3. It can be difficult to locate windows and icons if they are deeply stacked one on
another
4. Difficult to resize windows.
5. Fingers used for screen navigation on a touchpad, or an external pointing device, may be too big and unwieldy for small devices.
6. In addition, the keyboard is smaller for user input and there is a greater variety of
input devices.
7. Instead of using the inbuilt device interface, the device can be attached to different kinds of external, PC-style input interfaces.

Handling limited key input
Different modes are used to cope with the limited number of keys and the minimum key size.
 The same interface interaction can lead to different actions.
 Multi-Tap, T9, Fastap, Soft keys and Soft Keyboard
1. Multi-Tap: 12 keys with letter combinations (explicit; see the decoding sketch below)
2. T9: enhances the Multi-Tap experience (implicit)
3. Fastap: two keypads, one with smaller keys raised at the corners above the other keypad’s keys.
 The upper one is used for alphabetic input, the lower one for number input.
 If several keys are hit at once, a technique called passive chording allows the system to work
out what the user intended to enter.
Continued..

4. Soft keys: two keys, left and right, at the top of the keypad whose functions are determined by information on the screen;
 Allows the same keys to be reused to support application and task specific choices.
 Instead of having two soft keys, a whole mini keyboard, a soft keyboard, could also
be displayed if there is sufficient screen space.
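As a minimal sketch of the Multi-Tap decoding mentioned in item 1 above (the key-to-letter mapping follows the standard 12-key phone keypad; the function name is invented):

```python
# Minimal sketch of Multi-Tap decoding on a standard 12-key phone keypad:
# pressing a key repeatedly cycles through the letters printed on it.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def multitap_decode(key: str, presses: int) -> str:
    """E.g. pressing '5' three times selects 'l'; extra presses wrap around."""
    letters = KEYPAD[key]
    return letters[(presses - 1) % len(letters)]

# "cab" would be keyed as 2-2-2, 2, 2-2
print(multitap_decode("2", 3), multitap_decode("2", 1), multitap_decode("2", 2))
```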

 Internal pointer devices like tracker pads, roller pads, mini joysticks and keyboard arrow keys can be used to move the pointer on the screen.
 Touch screens, whose areas can be activated using a physical stick-like pointer, a pen or a finger.
 Auditory interfaces used for voice commands
Handling Limited Output
 If output is too large, it can be cropped, content resolution can be reduced or a zooming interface
can be used.
 Zooming (in and out) coupled with scrolling (up and down) and panning (side to side) control
enables users to view contents in an interactive way.
 Marking which part of the whole view that is currently zoomed in is also useful for orientation.
 A peephole display uses sensors to determine the position of the device in relation to the user.
 Projectors or organic (foldable) displays can be used to present large content on small displays.
 Audible outputs can be used for vehicle navigation.
 Haptic outputs can also be used (e.g., the urgency of a call can be conveyed by the vibration pattern).
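A minimal sketch of the zoom-and-scroll idea above (all names and numbers are invented): the small screen shows only a viewport onto the full content, and clamping the pan keeps the viewport inside the content so users stay oriented:

```python
# Illustrative sketch: computing the visible region of large content on a small display
# under zooming and panning. All names and numbers are invented for this example.

def visible_region(content_w, content_h, screen_w, screen_h, zoom, pan_x, pan_y):
    """Return (x, y, w, h) of the content rectangle currently shown on screen.
    zoom > 1 magnifies; panning moves the viewport within the content."""
    view_w = min(content_w, screen_w / zoom)
    view_h = min(content_h, screen_h / zoom)
    x = max(0, min(pan_x, content_w - view_w))   # clamp so the viewport stays inside
    y = max(0, min(pan_y, content_h - view_h))
    return (x, y, view_w, view_h)

# 2x zoom on a 320x240 screen over a 1000x800 page, panned to (400, 100)
print(visible_region(1000, 800, 320, 240, zoom=2.0, pan_x=400, pan_y=100))
```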
Games console interface
 Seven different generations of game consoles are available based on the technologies they use.
 Current, seventh-generation game consoles include the Nintendo Wii.
 The Wii controller contains micro-sensors in the form of accelerometers located inside the controller, and an infrared detector to sense its position in 3D space.
 The scoring system is often tuned to the interface (as the game progresses, it becomes more difficult to score points).
 The Wii wand (a natural interface) makes it easy for the user to interact with the system and become immersed in the virtual game environment.
(Figure: examples of game consoles from the 1st to the 7th generation.)
Localised remote control
 To reduce the degree of manual interaction
 Design issue: overlapping features; devices need to be orchestrated with respect to a common feature.
 Eg: increasing the volume of a home entertainment system
 Solution: a universal localised remote control
Universal local remote control
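A minimal sketch of the orchestration idea behind such a universal localised remote (class and method names are invented): one user action on a common feature, volume, is fanned out to every device that exposes it:

```python
# Illustrative sketch: a universal localised remote orchestrating a common feature
# (volume) across several devices. Class and method names are invented.

class Device:
    def __init__(self, name: str, volume: int = 5):
        self.name, self.volume = name, volume

    def set_volume(self, level: int) -> None:
        self.volume = max(0, min(10, level))          # clamp to the device's range

class UniversalRemote:
    def __init__(self, devices):
        self.devices = devices

    def volume_up(self) -> None:
        """One user action is fanned out to every device sharing the feature."""
        for d in self.devices:
            d.set_volume(d.volume + 1)

remote = UniversalRemote([Device("TV"), Device("Radio", volume=3)])
remote.volume_up()
print([(d.name, d.volume) for d in remote.devices])   # [('TV', 6), ('Radio', 4)]
```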
Hidden UI Via Basic Smart Devices

 WIMP is more obtrusive: it needs users to think about the interface continuously.
 More natural interfaces are required: gestures, senses, speech, etc.
 Multimodal interfaces, gesture interfaces, natural language interfaces
Multimodal interface

 Modality: a mode of human interaction using one of the five human senses.
 Devices can capture these modalities using cameras, touch screens, microphones and chemical sensors.
 The majority of ICT systems have a single mode, but human interaction is multimodal.
 Eg: attentive interfaces rely on attention;
wearable interfaces are worn by the user;
vision-based human motion analysis systems
Gesture interface
 Meaningful and expressive body movements
 Can be sensed by

wearable device-gloves
magnetic trackers
body attachments- accelerometers, gyroscopes
computer vision techniques
 Two types of gestures
 Contact gestures, e.g. a handshake or use of a touchscreen
 Contactless gestures, e.g. waving at someone
 Eg: Sony’s EyeToy, and current devices containing gyroscopes
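As an illustrative sketch of sensing a simple contactless gesture, such as a shake, from a body-attached accelerometer (the threshold and peak count are invented values, not from any real device API):

```python
# Illustrative sketch: detecting a "shake" gesture from accelerometer magnitudes.
# The threshold and peak count are invented values for this example.
import math

def is_shake(samples, threshold=2.5, min_peaks=3):
    """samples: list of (ax, ay, az) in g. A shake is several strong peaks in a row."""
    peaks = sum(1 for ax, ay, az in samples
                if math.sqrt(ax * ax + ay * ay + az * az) > threshold)
    return peaks >= min_peaks

readings = [(0.1, 0.0, 1.0), (2.8, 0.3, 1.1), (0.2, 3.0, 0.9), (2.6, 0.1, 1.0)]
print(is_shake(readings))   # True: three samples exceed the threshold
```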
Reflective Versus Active Displays
 E-book readers are lightweight, thin, pocket-sized devices with long battery life and touch screens enabling pages to be turned by touch.
 They differ in the type of display used: reflective.
 No energy is required to hold an image; readable in sunlight; can be read from any direction.
 Based on the electrophoretic display (EPD).
 EPDs exploit the electrophoretic phenomenon of charged particles suspended in a solvent.
 Displayed text and images can be electrically written or erased repeatedly.
Combining I/O interfaces
 Resistive v/s capacitive touchscreen
 Tangible user interface (TUI) –augmenting real physical world by connecting
digital information to everyday physical objects and environments.
 Examples of TUIs are Ambient Wood and DataTiles
 Organic interfaces resemble natural human-physical and human-human interaction
 Eg: Organic Light Emitting Diode (OLED) display
Advantages of OLED
 Lower cost
 Lightweight and flexible
 Good resolution
 Wider viewing angles and improved brightness
 Power efficiency
 Eg: Samsung Galaxy Note Edge, LG G Flex
Auditory interface
 An auditory user interface (AUI) is an interface which relies primarily or exclusively on audio for interaction, including speech and sound (Weinschenk & Barker, 2000).
 Examples:
1. Natural language/speech user interfaces.
2. Hands-free automobile navigation systems.
3. Interactive voice response (IVR) systems, like an automated payment centre.
 Communicative connection between machine and user
 Replacement for keyboard text entry
 For visually impaired users
 Challenges: noise removal, ambiguity of commands
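A purely illustrative sketch of a tiny IVR-style menu (the wording, options and function names are invented); in a real system the prompts would be spoken via text-to-speech and the reply recognised from speech or keypresses:

```python
# Illustrative sketch of an IVR-style auditory menu. Plain text stands in for
# spoken prompts and recognised replies. All wording and names are invented.

MENU = {"1": "Check balance", "2": "Make a payment", "3": "Speak to an agent"}

def ivr_prompt() -> str:
    return "Welcome. " + " ".join(f"Press {k} to {v.lower()}." for k, v in MENU.items())

def ivr_handle(choice: str) -> str:
    return MENU.get(choice, "Sorry, I did not understand that. Please try again.")

print(ivr_prompt())
print(ivr_handle("2"))   # "Make a payment"
```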
