Cuteness as a ‘Dark Pattern’ in Home Robots

Dr Cherie Lacey
Media Studies
Victoria University of Wellington
Wellington, New Zealand
ORCID 0000-0002-6248-6793

Dr Catherine Caudwell
School of Design
Victoria University of Wellington
Wellington, New Zealand
ORCID 0000-0002-3496-337X

Abstract—Dark patterns are a recent phenomenon in the field of interaction design, where design patterns and behavioral psychology are deployed in ways that deceive the user. However, the current corpus of dark patterns literature focuses largely on screen-based digital interactions and should be expanded to include home robots. In this paper, we apply the concept of dark patterns to the ‘cute’ aesthetic of home robots and suggest that their design constitutes a dark pattern in HRI by (1) emphasizing short-term gains over long-term decisions; (2) depriving users of some degree of conscious agency at the site of interaction; and (3) creating an affective response in the user for the purpose of collecting emotional data. This exploratory paper expands the current library of dark patterns and their application to new technological interfaces into the domain of home robotics in order to establish the grounds for an ethical design practice in HRI.

Keywords—dark patterns; user experience design; interaction design; home robots; privacy; surveillance capitalism; data ethics; emotion; affect

978-1-5386-8555-6/19/$31.00 ©2019 IEEE

I. INTRODUCTION

Dark patterns are a recent phenomenon in the field of interaction design whereby design patterns and behavioural psychology are deployed in ways that deceive the user—for instance, a ‘play’ button on a website that opens another browser window, taking the user to a page they had not intended to visit. A powerful concept through which to categorise and identify deceitful tactics of interaction, dark patterns have attracted a significant amount of popular and academic interest in recent years, particularly in light of the Humane Technology Movement [1]. However, the current corpus of dark patterns literature focuses largely on screen-based digital interactions, and should be expanded to include home robots. As Pagallo argues, “what is new with domestic robots is that sensors, cameras, GPS, facial recognition apps, Wi-Fi, microphones and more, will be assembled into a single piece of high-tech” [2]. As a consequence, interactions with domestic robots are likely to become both more extemporized and more ubiquitous. Given the increasing availability of home robots on the commercial market, it is important to consider robotic technology alongside literature on dark patterns to help establish the grounds for ethical best practice in human-robot interaction design.

In this paper, we apply the concept of dark patterns to common aesthetic features of home robots, and suggest that their ‘cute’ design constitutes a dark pattern in HRI. Cuteness has emerged as an important design strategy for social robotics because it endears the robot to the user by producing a powerful affective bond [3],[4],[5]. However, research by Caudwell and Lacey [5] suggests that the cute design of home robots creates an ambivalent power relation between the user and the robot: cuteness, as an aestheticization of powerlessness, may disguise the robot’s powerful data-gathering capacities. This paper suggests that the cute aesthetic of home robots functions as a ‘dark pattern’ for three reasons: first, it emphasizes short-term gains over long-term decisions; second, it deprives users of some degree of conscious agency (or ‘sovereignty’) at the site of the interaction; and finally, it creates an emotional response likely to generate “data myopia” [6] in the user. More work therefore needs to be done to establish ethical design practices of social robots in the field of HRI, specifically within the context of information privacy and data ethics.

One contribution of this paper is an expansion of the current library of dark patterns and their application to new technological interfaces into the domain of social robotics. This exploratory paper considers an emergent interaction phenomenon in relation to current home robot design in order to establish the case for an ethical design practice in HRI. Specifically, we advocate for the following: (1) greater awareness of, and transparency around, how the design of home robots influences interaction; and (2) greater transparency regarding informed consent and reasonable expectations of privacy in home robotics. If, as Harris claims, current design practice urgently needs an ethical compass to evaluate the true costs of a life mediated by technology [7], then it is crucial that we understand the role of dark patterns in the field of robotics.

II. DARK PATTERNS

Cognitive science and behavioural psychology are increasingly common disciplinary partners for interaction and
user-experience designers ([8],[9],[10]). Psychological scholarship on human emotion and cognition is becoming incorporated into the design and deployment of digital technologies for a range of purposes, including sentiment analysis, which “uses algorithm processing to determine mood, affect, and emotion in the text of individual documents and that of the social web itself” [11]; the development of affect-capture software in voice- and facial-recognition systems; the interaction design of mood-tracking applications (e.g. MoodPanda, Smiling Mind) [11]; and the design of social media platforms. The psychological approach to interaction design has become known by a range of monikers, including ‘persuasive technology’ and ‘designing with intent’, and has the goal of influencing user behavior, emotion, and cognition through design.

However, as Stark argues, the social reach of persuasive design has risen alongside its ubiquity [11]. The psycho-computational complex that underpins interaction design has been the subject of increasing concern on account of its wide-reaching ethical implications. Recent controversies, such as the 2014 Facebook ‘emotional contagion’ study and the psychographic data profiling by Cambridge Analytica in the 2016 American election and the Brexit referendum, “signal watershed moments in which the intersecting trajectories of psychology and computer science have become matters of public concern” [11]. The combination of psychology and computer science with interaction design has also been blamed for widespread “tech addiction” [1],[12]. According to the Center for Humane Technology, persuasive design in technology has led to a situation in which impulse takes precedence over intention, and short-term rewards are favoured over longer-term decision-making practices [1].

The intertwining of behavioural and cognitive psychology with design practice is the landscape in which dark patterns of user experience have emerged. The term ‘dark pattern’ was coined by User Experience (UX) Designer Harry Brignull on the website darkpatterns.org, which catalogues instances where established design patterns and user behaviours are leveraged to manipulate or deceive users [13]. Dark patterns are derived from the concept of ‘design patterns’, where designers “capture an instance of a problem and a corresponding solution, abstract it from a specific use case, and shape it in a more generic way, so that it can be applied and reused in various matching scenarios” [14]. In instances where interactions are prescriptive—for example, entering an email address in a form—the design can make the user’s task easier by providing options that predict their response. A ‘dark pattern’ is the use of this approach to mislead a user for the benefit of another, and “tricks users into performing unintended and unwanted actions, based on a misleading interface design” [15].

Generally, ‘dark patterns’ refers to tactics used in the online space—for example, an advertisement on a webpage disguised as a function. However, as the word ‘pattern’ indicates, these techniques may be abstracted from their specific context and applied to a range of other technologies. Building on Brignull’s collection, Gray et al. [15] summarize dark patterns into five strategy groups that are consistent with potential design motivations:

(1) Nagging: redirection of expected functionality that persists beyond one or more interaction(s).

(2) Obstruction: making a process more difficult than it needs to be, with the intent of dissuading certain action(s).

(3) Sneaking: attempting to hide, disguise, or delay the divulging of information that is relevant to the user.

(4) Interface interference: manipulation of the user interface that privileges certain actions over others.

(5) Forced action: requiring the user to perform a certain action to access (or continue to access) certain functionality.

Taken together, we suggest that these techniques possess three common features:

1: Dark patterns produce the illusion of user sovereignty: dark patterns deceive the user into believing that they have ultimate sovereignty, mastery, or agency over their actions. However, techniques used in dark patterns subject the user to the laws of the interface, and enable the user to be mapped, tracked, profiled and surveilled for their underlying patterns of behaviour. As Chun writes [16]:

“‘[U]ser-friendly’ computer interfaces have been key to empowering and creating ‘productive individuals’ ... Computers embody a certain logic of governing or steering through the increasingly complex world around us … The dream is: the more that an individual knows, the better decisions he or she can make … [However], by individuating us and also integrating us into a totality, their interfaces offer us a form of mapping, of storing files central to our seemingly sovereign—empowered—subjectivity. By interacting with these interfaces, we are also mapped: data-driven machine learning algorithms process our collective data traces in order to discover underlying patterns.”

Psycho-computational interaction design gives rise to the production of a quantifiable psychological subject-position that is able to be translated, via data sets and algorithmic analysis, into a “model subject amenable to classification through digital media platforms” [11]. This user-subject equates feelings of mastery over technology with the impression of agency. This may distract the user from (i) being aware of the ways in which their subject-position is being shaped by the software, and (ii) how that subject-position works in the service of data capitalism.

2: Dark patterns emphasize short-term gains over long-term decisions or actions by encouraging addictive or compulsive technology use, and offering reward mechanisms (such as ‘likes’, or the pull-to-refresh mechanism) that activate dopamine pathways in the brain [12]. Studies have shown that these techniques dull longer-term decision-making faculties, and tend to have lasting detrimental effects on the user’s health and well-being. As James Williams describes in [17]:

“I felt that the attention-grabby techniques of technology design were playing a nontrivial role [in being constantly distracted]. I began to realize that my technologies were enabling habits in my life that led my actions over time to diverge from the identity and values by which I wanted to live. It wasn’t just that my life’s GPS was guiding me into the occasional wrong turn, but rather that it had programmed me a new destination in a far-off place that it did not behove me to
visit. It was a place that valued short-term over long-term rewards, simple over complex pleasures.”

3: Related to 2 (above), dark patterns produce, manipulate, manage, and exploit emotion in order to generate “data myopia” [6] in the user. The emotional or affective response prompted by interface design can blind the user to the ‘big picture’ of comprehensive data profiling. As Stark argues, against our emotional connections with our technologies, “[o]ur lack of felt connection to the aggregation of our everyday data, and our resulting data myopia, influence our broader attitudes toward information privacy and the appropriate flows of our personal information at a societal level” [18]. Each of the dark patterns set out by Brignull [13], as well as Gray et al. [15], contributes to the production of data myopia by diverting the user’s attention away from what they are relinquishing (i.e. personal data or access to social networks) in favor of what they might immediately receive.

III. ETHICS OF DARK PATTERNS

By using a familiar design-language of interaction against the user, dark patterns effectively disregard the ethical notions of transparency, informed consent, and reasonable expectations of privacy. Indeed, designer Flavio Lamenza, writing for the UX Collective, argues that the label ‘dark patterns’ is itself deceptive, masking the poor behavior of those who employ such methods: “[a]ll of this is not bad user experience design, not psychology, not ‘dark patterns’. This is being dishonest, deceitful, corrupt, and unethical” [19]. Lamenza calls into question the ethics and conduct of the designer, a sentiment echoed by Gray et al., who state that “[a] designer’s character is a reflection of who they are as a person—not just a performance of design actions merely because they are ‘appropriate’ or ‘accepted practice’” [15].

The phenomenon of dark patterns is often framed by popular and academic discourse as an ethical concern: a practice of co-opting human-centered values in the service of “deceptive or malicious aims” [15], with the intention of promoting the shareholders’ interests over the user’s. However, the ethical landscape of design patterns should be considered in more nuanced terms than this. Gray et al. signal the complex ethical territory of dark patterns when they acknowledge “the persuasive intent underlying all of design activity” [15]. Similarly, Nodder writes that “[d]esigners work hard to control the emotions and behaviors of users. Truly great websites—good or evil—use specific techniques to get users to perform the desired task time and time again” [20]. The ethics of any persuasive design are also complicated by Gray et al.’s query: “[w]hich user or stakeholder interests are or should be kept in mind?” Gray and colleagues further elaborate that

“the emergence and use of dark patterns as an ethical concern in HCI and UX design reveals that design is rarely a solitary endeavor; in contrast, the complex entanglement among designer responsibility, organizational pressures, and neoliberal values often politicizes and prioritizes the profitability of design above other social motivations” [15].

Gray et al. also highlight the problem of attributing ‘dark’ intentions to an interaction strategy, questioning: “[w]here along this trajectory does a pattern become dark, and with what level of intentionality? A design decision may have been made with good intentions for a specific audience, but resulted in manipulative outcomes when exposed to a broader audience” [15]. It is, therefore, important to bear in mind that the ethical landscape of dark patterns is complex, and the authors stress that it is not necessarily the case that home robot developers or designers have ‘dark’ intentions; rather, the myriad forms of interaction that are possible with home robots—alongside the unprecedented access they have to our homes, familial relationships, and data—require more knowledge of, and transparency around, dark patterns if we are to arrive at ethical best practice in the field of HRI.

We suggest that dark patterns and the ethical concerns they raise provide a valuable framework in which to explore the ways that social technologies are believed to act on people. However, the dark patterns concept is generally confined to, and exemplified by, screen-based technologies. One exception is the work of Greenberg et al. [21], who apply the dark patterns concept to ‘proxemic interactions’—i.e. interactions in which data are shared between devices in close proximity to each other. We would like to see the literature on dark patterns further expanded as technology becomes increasingly integrated and networked into an array of devices. Home robots are especially significant because they promise to serve as the intelligent hub of the home behind a cute and appealing facade. As Pagallo argues, what is new with domestic robots is that they combine a number of smart technologies and interfaces into a single object [2]. As a consequence, interaction with social robots is likely to become more extemporaneous and ubiquitous. Against the socio-political context of growing concerns regarding user privacy, it is more important than ever that we become literate about the ways in which design strategies influence human-robot interaction.

IV. THE REMARKABLE CUTENESS OF SURVEILLANCE CAPITALISM

One of the key design strategies for endearing home robots to the user is their cute appearance; however, it is precisely the cute appearance of home robots that may situate them in an ethical ‘grey-zone’ of interaction design. Discussions about the aesthetic of robots that are designed to interact with humans have long been dominated by two traditions: (1) Masahiro Mori’s notion of the uncanny valley [22], which proposed that a person’s affinity for a robot would turn to revulsion if the appearance of the robot became almost, but not quite, human; and (2) Konrad Lorenz’s kindchenschema, or ‘baby-scheme’, which established the aesthetic criteria for cuteness in both infant humans and animals, including: “a relatively large head, predominance of the brain capsule, large and low-lying eyes, bulging cheek region, short and thick extremities, a springy elastic consistency, and clumsy movements” [23]. Such characteristics would, Lorenz argued, elicit positive affective responses from parents and other caregivers, including bonding and attachment formation. Joel Ng has referred to the cute aesthetic as a “technology of the lovable”, arguing that it functions as “a stylistic device with an intrinsic vocabulary that fosters an affectionate subject-object relationship” [24].

Fig. 1 Social robot design: Anki’s Cozmo, Blue Frog’s Buddy, Jibo, Inc.’s Jibo, and Honda’s 3E-A18.

Also significant is Shibata’s taxonomy of robotic types: human type, familiar animal type, unfamiliar animal type, and a new character or imaginary animal type [25]. Of these, ‘unfamiliar animal type’ and ‘new character/imaginary animal type’ are argued to be the most successful at avoiding negative associations and negative affective responses from the caregiver. This is because both unfamiliar animals and imaginary animals do not come attached with preconceived ideas about their behavior. Most home robots currently, or soon to be, available on the commercial market combine aspects of the three aesthetic traditions mentioned here (Fig. 1). Home robots such as Jibo, Honda’s 3E robots, Blue Frog’s Buddy, and Anki’s Cozmo are designed to avoid the uncanny valley, tend to be either an ‘unfamiliar animal’ or ‘imaginary character’ type, and they are all extraordinarily cute. In line with theory on the cute aesthetic, the vast majority of the new wave of home robots appear stunted in some way. Jibo, Buddy, and Honda’s 3E-A18 all have missing ‘limbs’, outsized heads, and huge eyes. In Jibo’s design, the screen is essentially a giant head, which also functions as a large eye, or a “personality-led animated eyeball” [26]. The body-stand can swivel and tilt in a way that mimics the movements of a toddler. This kind of movement has been studied by Mara and Appel [27], who found that users perceived robots with a head tilt to be “significantly cuter” than those without. Jibo has no limbs and is incapable of locomotion. As Turkle argues, such companion robots “push our Darwinian buttons” [28] with this combination of design features, invoking in the user the desire to nurture, protect, and care for the robot.

A number of scholars in the emerging interdisciplinary field of Cute Studies have argued that cuteness has become one of the most pervasive commodity aesthetics in the digital age. Harris [29], Dale [30], and Gn [4],[24] suggest that cute commodity objects may help ease feelings of loneliness and isolation by provoking comforting, childlike affective states. Others [31] suggest that cute objects provide a buffer against our precarious political and ecological existence. Neurologists have confirmed that interactions with cute objects release dopamine into the brain and activate the same neurological pathways as gambling, cocaine, and sex [32]. This short-term, rewards-based ‘hit’ provided by cuteness has further been linked to indulgent consumption and impulse spending [33],[34],[35]. Cuteness may therefore be understood as an aesthetic of powerlessness, but one that is, at the same time, extraordinarily powerful: it produces a positive affective bond between the subject and the (cute) object, and establishes ideal conditions for the creation of social and emotional intimacy.

Most roboticists seem quite transparent about their reasons for making their robot appear cute. As Breazeal and Foerst have articulated: “for our purposes as robot designers, it seems reasonable [to] construct a robot with an infant-like appearance, which could encourage people [to] switch on their baby-scheme and treat it as a cute creature in need of protection and care” [36]. Jibo, Inc.’s website details the design processes underpinning his cute appearance, explaining how the designers drew on tools from Disney animation to create his look and personality [37]. Similarly, Alonso Martinez, who designed the Mira robot, explains: “round-shaped characters tend to remind us of new-born children and seem lovable, which tends to arouse tenderness in the beholder” [38]. Likewise, the ultimate goal for Cozmo’s designers was to create a cute character that would enable the user to “bond” with the robot as if it were a puppy [39].

Beyond the design of the robot, marketing copy reinforces the intimate familial bond that these robots are capable of establishing. An internet search for Jibo brings up the website heading: “Jibo Robot – He can’t wait to meet you.” Upon arrival at the site, Jibo is proclaimed as “The first social robot for the home who looks, listens and learns. Artificially intelligent, authentically charming” [40]. More overtly, the website for Blue Frog’s Buddy—the “emotional robot”—states:

“How not to resist to his cuteness and not want to adopt him? BUDDY is an endearing emotional robot that wins the heart of the whole family, including children and adults. And it is not his 60cm high that will stop him in his quest for bringing the family around a new emotional experience” [41].

In both of these examples, what is promised is not just a genuine and meaningful relationship, but a new family dynamic. By positioning these robots as family members, they are arguably more likely to be engrained in the habits and rhythms of everyday life, encouraging high levels of use and interaction.

Domestic robots, and the design and marketing materials that surround them, appear, then, to communicate an ideal of transparency. In the examples above, we see a degree of openness regarding the role of cuteness in establishing an affective transference between the user and robot (i.e. ‘protection and care’, ‘tenderness’, ‘bonding’). What is not being communicated, however, are other functions of cuteness in the design of home robots: the emphasis on short-term gains, the encouragement of ‘data myopia’ in the user, and its role in producing an illusion of user sovereignty—all of which suggests cuteness constitutes a dark pattern in home robots.

V. SHORT-TERM GAINS AND ‘DATA MYOPIA’

Given that emotion is a central channel of interaction for social robotics, the potential for manipulation and exploitation is high. Under the category of ‘interface interference’, Gray et al. list ‘toying with emotion’ as a dark pattern that “includes any use of language, style, color, or other similar elements to evoke an emotion in order to persuade the user into a particular action. This can manifest as cute or scary images, or as enticing or frightening language” [15]. Designer Dieter Rams has famously said that “good design is honest” [42]; Hoffman [43] has linked Rams’s principle of honesty to robot design, arguing that if the aesthetics overpromise what the robot can do, then it should be considered operationally dishonest. Although Hoffman was discussing humanoid robots in this instance, we argue that the psychological power of cuteness also constitutes a dishonest façade.

Cuteness in home robots is designed to induce in the user strong feelings of attachment, bonding, and protection, thereby setting up the ideal conditions for companionship and care. Although a number of studies have highlighted the positive impact of robotic companionship in healthcare settings [44], it is also worth highlighting the role of emotional attachment to technological devices within the wider socio-political context of data collection and analysis, and new regulations around data protection and privacy.

Psychological studies [45] have shown that cuteness dulls long-term decision-making behaviour by activating short-term, rewards-based neurological responses. Cuteness could therefore be understood as a tactic to deceive the user into acting in ways that are not in their best interests in the longer term, particularly regarding information privacy. The cute design of home robots provides a kind of ‘path of least resistance’, in which future-oriented decisions about privacy are exchanged for the short-term, positive affective hit that accompanies feelings of companionship or intimacy with a cute object. If cuteness is known to activate an instinctual or immediate response in the user [45], we question whether there is an incongruity between the unconscious or impulsive ways that interaction with robots is designed, and the responsibility required of users by the Terms of Service and Use.

The most obvious aspect to point out here is the aesthetic difference between the cute design of the robotic object and the website design for the Terms of Use and Privacy Policy. As Natasha Lomas writes,

“Where T&Cs are concerned, it really is shooting fish in a barrel. Because humans hate being bored or confused and there are countless ways to make decisions look off-puttingly boring or complex—be it presenting realms of impenetrable legalese in tiny greyscale lettering so no-one will bother reading it, combined with defaults set to opt in when people click ‘OK’; deploying intentionally confusing phrasing and/or confusing button/toggle design that makes it impossible for the user to be sure what’s on and what’s off (and thus what’s opt out and what’s an opt in) or even whether opting out might actually mean opting in to something you really don’t want” [46].

In the marketing materials of many home robots, the collection of personal data is often framed as necessary in providing and improving on the optimal experience of the robot [5]. These data transcend what is consciously entered by the user, such as contact details and age, and incorporate the environmental, sensory, affective, and atmospheric states of the home. Buried in Jibo’s Privacy Policy, users are informed that, “when using Jibo products, we may collect recordings of your voice or image, or other biometric identifiers” [47]. Similarly, Asus’s Zenbo robot “may collect your voice instruction as well as your video record, which may contain the image of your home environment … The above voice, video, and communication records may contain your personal data” [48]. Further, opting out of data-collecting services, updating privacy settings, and monitoring changes to Privacy Policies all require unreasonable, if not impossible, ongoing attention to websites and their third-party affiliates. This undermines the power of the user to manage their information privacy.

In their work on dark patterns and proxemic technology interactions, Greenberg et al. [21] raise another perspective on the incongruity of data collection and conscious choice within the context of social interaction:

“In day-to-day life, proximity is an ephemeral phenomenon. The proxemic relationship between parties dissolves as soon as they separate. In contrast, systems can tag any proxemic interactions as indicating a permanent, persistent (and undesirable) relationship that is never forgotten.”

Greenberg et al. [21] term this the ‘we-never-forget’ pattern, whereby what is usually an ephemeral experience is made concrete by the collecting and recording of data.

Design decisions that undermine the position of the user are of particular relevance to the field of social robots because, as Pagallo puts it, “domestic robots will know a lot of things about our private lives” [2]. Because they operate within the domestic environment, these robots will be ideally placed to capture new sets of data in new situations. As a specific design example, the exaggeratedly large and emotive eyes that typify cuteness are also, for a robot, the ‘front’ for their recognition and recording technologies. The cute appeal of the eyes works to draw the user in, while the interaction produces data for collection. Particularly important will be the deeply granular data about our subjective states. This is something that Kerr and McGill have called our “personal data emanations” [49], and it is of enormous interest to business in the digital economy, which has a great deal to gain from behavior tracking and the manipulation and exploitation of emotion. Together, the cute aesthetic’s emphasis on short-term gains and its privileging of impulse over intention may lead to data myopia in the user, undermining their long-term decision-making capacities.

VI. THE ILLUSION OF USER SOVEREIGNTY

If, as Breazeal and Foerst [36] argue, cute robots encourage us to ‘switch on our baby-scheme’, then the user is interpellated by the robot into the subject-position of ‘protector’ and ‘caregiver’. Taking on the role of caregiver or protector appears to situate the user in a position of power or mastery over the robot,
including ‘raising’ the robot within the context of the family.
Indeed, the marketing discourse that surrounds the introduction
of home robots onto the commercial market communicates this
as a unique selling point: the robot will grow alongside you and
your family, becoming its own, distinct personality. The user
appears, then, to be empowered to make decisions on behalf of
the robot, thereby eliciting feelings of user sovereignty over the
technology. Additionally, the consistent discursive emphasis on
‘family’ in the marketing materials of home robots encourages
a much more complex interaction framework than one between
singular subjects, which suggests that social home robots are an
example of what we call ‘Family-Robot Interaction’ (FRI).

As Pagallo argues, although it is the user that enables the
robot to ‘survive in the environment’ by fulfilling its needs, the
“social needs of the artificial agent are defined by the designer
and modelled by the internal control architecture of the
machine” [2]. Therefore, although the user is interpellated into
a position of power by the robot’s cute appearance, the robot’s
software makes only certain relationships possible. In this sense,
and following from Lessig’s axiom that “code is law” [50], it is
the robot’s software that fundamentally determines the
interaction, not the user. Within the context of human-robot
interaction, software is sovereign, and the user is subjected to its
laws. As Bratton writes:

“[The] agglomeration of computing machines into platform
systems not only reflects, manages, and enforces forms of
sovereignty; it also generates them in the first place. Just as for
Foucault’s technologies, its mechanics are not representative of
governance; they are governance. But unlike for Foucault’s
archaeology, its primary means and interests are not human
discourse and human bodies but, rather, the calculation of all the
world’s information and of the world itself as information. We,
the humans, while included in this mix, are not necessarily its
essential agents, and our well-being is not its primary goal” [51].

Bratton’s point is that the user is encouraged to experience
feelings of mastery over the technology in order to distract them
from the ways in which the technology is generating their
subject-position as ‘user’ in the first place. The legal implications
of the emergence of the user as a new subject-position are
significant, but well outside the scope of this paper. However, it
is worth bringing in Bratton’s questions here regarding user
sovereignty: “in the slippery redefinitions of citizenship and
sovereignty in a cloud computing era, what reference of last
resort can we rely on? Human rights? End-user agreements? Are
we obligated to every service embedded in every
software-enabled object or surface we might encounter?” [51].
If home robots are going to function as the cute hub of the
Internet-enabled home, then we need to critically evaluate the
flow of power in and around HRI: who has sovereignty in this
techno-social environment? And who has the most to gain from
encouraging the user to feel as though they have absolute
mastery over their technology?

VII. CONCLUSION

A number of converging trends in HCI design, behavioural
psychology, neurology, data science and advertising reveal that
the extraction, collection, and analysis of emotional data is of
extraordinary value to business, part of a broader ‘behavioural
turn’ in digital commerce [6],[52]. Some scholars have begun to
document the emergence of what Zuboff terms ‘surveillance
capitalism’ [53], and its reliance on social technologies for
behavioural tracking and manipulation. Within the context of
surveillance capitalism, home robots will likely play a crucial
role. As Pagallo notes, the security of personal data and privacy
is at stake in relationships with home robots because, “[c]ontrary
to traditional robots with on-board computers, recent
applications are increasingly connected to a networked
repository on the Internet that allows robots to share the
information required for object recognition, navigation and task
completion in the real world” [2]. In this case, home robots
require a steady stream of personal and emotional data in order
to function, if they are to provide companionship at all.

Design mediates relationships in HRI, and impacts the social
interactions and environments in which data are exchanged. The
cuteness of social technology plays a fundamental role in the
new data economy by promoting certain kinds of affective
relationships; these relationships are, in turn, well placed to
capture valuable emotional data about the user, as well as
contextual, sensory information about their environment. This
phenomenon is what we have called data’s ‘path of least
resistance’, and it relies on cross-cultural responses to cute
stimuli that emphasize impulse over intention, and short-term
rewards over long-term decisions. Currently, the predominant
design aesthetic of domestic robots promotes a conflict between
unconscious action and conscious thought, depriving users of
some level of agency at the site of the interaction while
simultaneously producing feelings of mastery over the
technology. Consequently, we believe that the cute aesthetic of
home robots constitutes a dark pattern in user experience design,
and work should be done to promote more conscious, rational,
and transparent interactions with domestic robots.

Although cuteness may make for quick and easy affections,
less considered is what would constitute a meaningful
relationship with a robotic companion. Dotson calls for the
design of technologies that encourage thoughtful action within
our “local physical and social ecologies”, suggesting that
“scholars could study how technologies could actually enable
more reflexive decisions and thus the ability to more easily put
one’s values into action. Second, technologies could be
discussed, designed and deployed in more open
acknowledgment of their implications for the good life” [54].
Following Dotson, relationships with home robots could have
implications for how the notion of a relationship itself is
understood, and it is therefore important that design be
conducted with an acknowledgement of systematic effects
beyond the human-robot interaction. Methods such as Value
Sensitive Design [55] are one way that designers can
systematically explore the far-reaching and indirect
repercussions of their work.

Information privacy and data protection must be considered
in the context of human dignity and property rights; therefore,
the design of domestic robots should come into alignment with
these human-centric values. As Pagallo argues, “attention
should be drawn to the overall idea that designers and
manufacturers of domestic robots should build them in such a
way that the latter can fulfil their tasks within limits that humans
can rationally accept and find comfortable or adequate” [2].
Similarly, Stark argues that the design of digital objects should
provoke a “reflective response, tied to the user’s visceral sense
of privacy” [6]. In line with this, we would like to encourage a
movement towards design that promotes intentional, rational,
and reflexive interaction. Alongside this, we believe that Privacy
Policies and Guidelines should also factor in the complex
networks of affects that influence and shape the management of
emotional data, and reflect a reasonable expectation of privacy.

Karen Barad argues that ethics is about what matters. For
her, ethics is about “taking account of the entangled
materializations of which we are a part, including new
configurations, new subjectivities, new possibilities” [56]. An
ethics of ‘what matters’ should take into account the ‘matter’ of
the human person in all its material, affective, social, and
cognitive entanglements. In an age that is increasingly mediated
by technology, Barad’s posthuman approach to ethics might also
encourage us to question our own values regarding what matters
to us most. User experience and interaction technology design
can—and should—play an important role in asking us to reflect
on the kind of life that matters to us, and empowering us to make
the decisions that will lead us there.

REFERENCES
[1] Center for Humane Technology [online]. Available: http://humanetech.com/ [Accessed 28 Nov. 2018].
[2] U. Pagallo, “The impact of domestic robots on privacy and data protection, and the troubles with legal regulation by design,” in Data Protection on the Move: Current Developments in ICT and Privacy/Data Protection, S. Gutwirth, R. Leenes, and P. De Hert, Eds. New York: Springer, 2016, pp. 387-410.
[3] C. Breazeal, Designing Sociable Robots. Cambridge, MA: MIT Press, 2002.
[4] J. Gn, “Designing affection: On the curious case of machine cuteness,” in The Aesthetics and Affects of Cuteness, J. P. Dale, J. Goggin, J. Leyda, A. P. McIntyre, and D. Negra, Eds. New York, NY: Routledge, 2017, pp. 185-203.
[5] C. Caudwell and C. Lacey, “What do home robots want? The ambivalent power of cuteness in robotic relationships,” Convergence, forthcoming 2019.
[6] L. Stark, “The emotional context of information privacy,” The Information Society, vol. 32, no. 1, 2016, pp. 14-27. Available: 10.1080/01972243.2015.1107167
[7] T. Harris, “How technology hijacks people’s minds - from a magician and Google’s design ethicist,” Tristan Harris, May 19, 2016 [online]. Available: http://www.tristanharris.com/essays/ [Accessed Nov 28 2018].
[8] B. Fogg, Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann Publishers, 2003.
[9] D. Norman, Emotional Design: Why We Love (or Hate) Everyday Things. New York: Basic Books, 2004.
[10] S. Wendel, Designing for Behaviour Change: Applying Psychology and Behavioural Economics. California: O’Reilly, 2014.
[11] L. Stark, “Algorithmic psychometrics and the scalable subject,” Social Studies of Science, vol. 48, no. 2, 2018, pp. 204-231.
[12] P. Lewis, “‘Our minds can be hijacked’: The tech insiders who fear a smartphone dystopia,” The Guardian, 6 October, 2017 [online]. Available: https://www.theguardian.com/technology/2017/oct/05/smartphone-addiction-silicon-valley-dystopia [Accessed Nov 28 2018].
[13] H. Brignull, Dark Patterns [online]. Available: https://darkpatterns.org/ [Accessed Nov 28 2018].
[14] C. Bosch, B. Erb, F. Kargl, H. Kopp, and S. Pfattheicher, “Tales from the dark side: Privacy dark strategies and privacy dark patterns,” in Proceedings on Privacy Enhancing Technologies: PETS 2016, Darmstadt, Germany, July 19-22, 2016, C. Diaz and A. Kapadia, Eds. Warsaw: De Gruyter, 2016, pp. 237-254.
[15] C. Gray, Y. Kou, B. Battles, J. Hoggatt, and A. L. Toombs, “The dark (patterns) side of UX design,” in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18), Montreal, QC, April 21-26, 2018. New York: ACM, Paper 534, 14 pages. DOI: https://doi.org/10.1145/3173574.3174108
[16] W. Chun, Programmed Visions: Software and Memory. Cambridge, MA: MIT Press, 2011.
[17] J. Williams, Stand Out of Our Light: Freedom and Resistance in the Attention Economy. Cambridge: Cambridge University Press, 2018.
[18] L. Stark, “That signal feeling: Emotion and interaction design from social media to the ‘anxious seat’,” Ph.D. dissertation, Department of Media, Culture, and Communication, New York University, 2016.
[19] F. Lamenza, “Stop calling these dark design patterns or dark UX - these are simply a**hole designs,” UX Collective, April 12, 2018 [online]. Available: https://uxdesign.cc/stop-calling-these-dark-design-patterns-or-dark-ux-these-are-simply-asshole-designs-bb02df378ba [Accessed Nov 28 2018].
[20] C. Nodder, Evil by Design: Interaction Design to Lead Us into Temptation. Indianapolis: Wiley, 2013.
[21] S. Greenberg, S. Boring, J. Vermeulen, and J. Dostal, “Dark patterns in proxemic interactions: A critical perspective,” in Proceedings of the 2014 Conference on Designing Interactive Systems. New York: ACM, 2014, pp. 523-532. DOI: https://doi.org/10.1145/2598510.2598541
[22] M. Mori, K. MacDorman, and N. Kageki, “The uncanny valley [from the field],” IEEE Robotics and Automation Magazine, vol. 19, no. 2, 2012, pp. 98-100.
[23] K. Lorenz, Studies in Animal and Human Behavior. Cambridge: Harvard University Press, 1971.
[24] J. Ng, “The technology of the cute body,” Eidos: Journal for Philosophy of Culture, vol. 4, no. 6, December 2018, pp. 14-26.
[25] T. Shibata, “An overview of human interactive robots for psychological enrichment,” Proceedings of the IEEE, vol. 92, no. 11, 2004, pp. 1749-1758.
[26] C. Davies, “Jibo delayed to 2017 as social robot hits more hurdles,” Slashgear, 20 November, 2016 [online]. Available: https://www.slashgear.com/jibo-delayed-to-2017-as-social-robot-hits-morehurdles-20464725/ [Accessed 8 January 2019].
[27] M. Mara and M. Appel, “Effects of lateral head tilt on user perceptions of humanoid and android robots,” Computers in Human Behavior, vol. 44, March 2015, pp. 326-334, p. 332.
[28] S. Turkle, “In good company? On the threshold of robotic companions,” in Close Engagements with Artificial Companions: Key Social, Psychological, Ethical and Design Issues, Y. Wilks, Ed. Amsterdam/Philadelphia: John Benjamins Publishing Company, 2010, pp. 3-10.
[29] D. Harris, Cute, Quaint, Hungry, and Romantic: The Aesthetics of Consumerism. New York: Basic Books, 2000.
[30] J. P. Dale, “The appeal of the cute object: Desire, domestication, and agency,” in The Aesthetics and Affects of Cuteness, J. P. Dale, J. Goggin, J. Leyda, A. P. McIntyre, and D. Negra, Eds. New York: Routledge, 2017, pp. 35-55.
[31] M. Pramaggiore, “‘I’ll be Dancin’’: American Soldiers, Cute YouTube Performances, and the Deployment of Soft Power in the War on Terror,” in The Aesthetics and Affects of Cuteness, J. P. Dale, J. Goggin, J. Leyda, A. P. McIntyre, and D. Negra, Eds. New York, NY: Routledge, 2017, pp. 95-111.
[32] S. Ngai, Our Aesthetic Categories: Zany, Cute, Interesting. Cambridge, MA: Harvard University Press, 2012.
[33] S. Chin, T. Wade, and K. French, “Race and facial attractiveness: Individual differences in perceived adoptability of children,” Journal of Cultural and Evolutionary Psychology, vol. 4, no. 3-4, 2006, pp. 215–229.
[34] A. Volk and V. Quinsey, “The influence of infant facial cues on adoption preferences,” Human Nature, vol. 13, no. 4, 2002, pp. 437–455.
[35] N. de Vries, “Under the yolk of consumption: Re-envisioning the cute as consumable,” in The Aesthetics and Affects of Cuteness, J. P. Dale, J. Goggin, J. Leyda, A. P. McIntyre, and D. Negra, Eds. New York, NY: Routledge, 2017, pp. 253-273.
[36] C. Breazeal and A. Foerst, “Schmoozing with robots: Exploring the original wireless network,” in Proceedings of Cognitive Technology (CT99), San Francisco, USA, Aug. 11-14, K. Cox, B. Gorayska, and J. Marsh, Eds. Michigan: M.I.N.D. Lab, 1999, pp. 375–390.
[37] C. Lacey and C. Caudwell, “The robotic archetype: Character animation and social robotics,” in Proceedings of the 10th International Conference on Social Robotics, Qingdao, China, Nov. 28-30, 2018, S. Ge, J. Cabibihan, M. Salichs, et al., Eds. Switzerland: Springer, 2018, pp. 24-34.
[38] P. Canteneur, “Disney remains an inspiration for designing robot-assistants,” L’atelier BNP Paribas, Feb. 2017 [online]. Available: https://atelier.bnpparibas/en/prospective/article/disney-remains-inspiration-designing-robot-assistants [Accessed 14 January 2019].
[39] Personal communication.
[40] Jibo Inc., Jibo [online]. Available: https://www.jibo.com/ [Accessed 8 January 2019].
[41] Bluefrog Robotics, Buddy: The Emotional Robot [online]. Available: https://buddytherobot.com/en/buddy-the-emotional-robot/ [Accessed 8 January 2019].
[42] D. Rams, “The power of good design: Dieter Rams’s ideology, engrained within Vitsœ,” Vitsœ [online]. Available: https://www.vitsoe.com/rw/about/good-design [Accessed 10 January 2019].
[43] G. Hoffman, cited in K. Schwab, “The case against anthropomorphizing robots,” Fast Company, 8 January, 2019 [online]. Available: https://www.fastcompany.com/90288928/the-case-against-anthropomorphizing-robots [Accessed 10 January 2019].
[44] L. Riek, “Healthcare robots,” Communications of the ACM, vol. 60, no. 11, 24 October 2017, pp. 68-78.
[45] O. Aragon et al., “Dimorphous expressions of positive emotion: Displays of both care and aggression in response to cute stimuli,” Psychological Science, vol. 26, no. 3, Jan. 2015, pp. 259-273.
[46] N. Lomas, “WTF is dark pattern design?” TechCrunch, July 1, 2018 [online]. Available: https://techcrunch.com/2018/07/01/wtf-is-dark-pattern-design/ [Accessed 14 January 2019].
[47] Jibo Inc., “Privacy,” Jibo [online]. Available: https://www.jibo.com/privacy/ [Accessed 28 November 2018].
[48] Asus, “Terms of use notice / privacy policy,” Asus [online]. Available: https://www.asus.com/Terms_of_Use_Notice_Privacy_Policy/Privacy_Policy [Accessed 28 November 2018].
[49] I. Kerr and J. McGill, “Emanations, snoop dogs and reasonable expectations of privacy,” Criminal Law Quarterly, vol. 52, no. 3, Nov. 2008, pp. 392–431.
[50] L. Lessig, Code: And Other Laws of Cyberspace. New York: Basic Books, 1999.
[51] B. Bratton, The Stack. Cambridge, MA: MIT Press, 2016.
[52] A. Nadler and L. McGuigan, “An impulse to exploit: The behavioural turn in data-driven marketing,” Critical Studies in Media Communication, vol. 35, no. 2, March 2017. Available: https://doi.org/10.1080/15295036.2017.1387279
[53] S. Zuboff, “Big other: Surveillance capitalism and the prospects of an information civilization,” Journal of Information Technology, vol. 30, no. 1, April 2015, pp. 75-89. Available: 10.1057/jit.2015.5
[54] T. Dotson, “Technology, choice and the good life: Questioning technological liberalism,” Technology in Society, vol. 34, Nov. 2012, pp. 326-336. Available: http://dx.doi.org/10.1016/j.techsoc.2012.10.004
[55] N. Doorn et al., Eds., Early Engagement and New Technologies: Opening Up the Laboratory. Dordrecht: Springer, 2013.
[56] K. Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press, 2007.