
How social media filter bubbles and algorithms influence the election
With Facebook becoming a key electoral battleground, researchers are studying how automated accounts are
used to alter political debate online

Alex Hern
Monday 22 May 2017 14.14 BST

One of the most powerful players in the British election is also one of the most opaque. With just
over two weeks to go until voters go to the polls, there are two things every election expert agrees
on: what happens on social media, and Facebook in particular, will have an enormous effect on
how the country votes; and no one has any clue how to measure what's actually happening there.

"Many of us wish we could study Facebook," said Prof Philip Howard, of the University of
Oxford's Internet Institute, "but we can't, because they really don't share anything." Howard is
leading a team of researchers studying computational propaganda at the university, attempting
to shine a light on the ways automated accounts are used to alter debate online.

"I think that there have been several democratic exercises in the last year that have gone off the
rails because of large amounts of misinformation in the public sphere," Howard said. "Brexit and
its outcome, and the Trump election and its outcome, are what I think of as mistakes, in that
there were such significant amounts of misinformation out in the public sphere."

"Not all of that comes from automation. It also comes from the news culture, bubbles of
education, and people's ability to do critical thinking when they read the news. But the proximate
cause of misinformation is Facebook serving junk news to large numbers of users."

Emily Taylor, chief executive of Oxford Information Labs and editor of the Journal of Cyber Policy,
agreed, calling Facebook's effect on democratic society "insidious". Taylor expressed similar
reservations about "fake news" being spread on social media (a term Howard eschews because of its
political connotations, preferring to describe such sources as "false", "junk" or simply "bad"), but
she added there was a deeper, scarier, more insidious problem: "We now exist in these curated
environments, where we never see anything outside our own bubble and we don't realise how
curated they are."

A 2015 study suggested that more than 60% of Facebook users are entirely unaware of any
curation on Facebook at all, believing instead that every single story from their friends and
followed pages appeared in their news feed.

In reality, the vast majority of content any given user subscribes to will never appear in front of
them. Instead, Facebook shows an algorithmic selection, based on a number of factors: most
importantly whether anyone has paid Facebook to promote the post, but also how you have
interacted with similar posts in the past (by liking, commenting or sharing them) and how much
other people have done the same.
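
To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of how a
ranking along those lines might weigh the factors the article describes. The weights, field names
and scoring function are all illustrative assumptions; Facebook's actual algorithm is unpublished.

    from dataclasses import dataclass

    @dataclass
    class Post:
        sponsored: bool         # has anyone paid to promote this post?
        own_interactions: int   # your past likes/comments/shares on similar posts
        peer_interactions: int  # how much other people have done the same

    def feed_score(post: Post) -> float:
        """Illustrative ranking heuristic: paid promotion dominates, then your
        own engagement history, then overall popularity. Weights are invented
        for demonstration only, not Facebook's real values."""
        score = 0.0
        if post.sponsored:
            score += 100.0                      # promoted posts jump the queue
        score += 10.0 * post.own_interactions   # your own engagement signals
        score += 1.0 * post.peer_interactions   # everyone else's engagement
        return score

    # Rank a candidate pool: only the top of this list ever reaches the feed,
    # which is why most subscribed content is never seen.
    candidates = [
        Post(sponsored=True,  own_interactions=0, peer_interactions=5),
        Post(sponsored=False, own_interactions=8, peer_interactions=200),
        Post(sponsored=False, own_interactions=0, peer_interactions=3),
    ]
    feed = sorted(candidates, key=feed_score, reverse=True)

Even a toy model like this shows the property Taylor objects to: the user sees only the output
of the sort, never the scoring that produced it.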

It is that last point that has Taylor worried about automation on social media sites. Advertising is
a black hole of its own, but at least it has to be vaguely open: all social media sites mark sponsored
posts as such, and political parties are required to report advertising spend at a national and local
level.

No such regulation applies to automation. "You see a post with 25,000 retweets or shares that
comes into your timeline," Taylor said, "and you don't know how many of them are human." She
sees the automation as part of a broad spectrum of social media optimisation techniques, which
parties use to ensure that their message rides the wave of algorithmic curation on to as many
timelines as possible. It is similar, though much younger and less documented, to search engine
optimisation, the art of ensuring a particular web page shows up high on Google's results pages.

Academics such as Taylor and Howard are trying to study how such techniques are applied, and
whether they really can swing elections. But their efforts are hampered by the fact that the largest
social media network in the world, Facebook, is almost totally opaque to outsiders.

If Howard's group were examining Facebook rather than Twitter, they would only be able to
crawl the public pages, he said. That would miss the vast majority of activity on the social
network: private timelines, closed groups, and the effect of algorithmic curation on individual
feeds. Even so, he says, those public pages can be relevant. "In some of our other countries
studied, we think we've found fake Facebook groups. So there are fake users, but the way we
think they were used with Trump in particular is that they were created, hired, rented, to join
fake fan groups that were full of not-real people."

Those fake groups may have eventually attracted real fans, he said, who were emboldened to
declare their support for the candidate by the artificially created perception of a swell in support
for him. "There's all these Trump fans in your neighbourhood that you didn't really know, so
we think that's the mechanism. And then we think some of those public pages got shut down,
went private, or just became so full of real people that the fake problem went away. We don't
know; this is the theory."

Facebook does allow some researchers access to information that would answer Howard's
questions; it just employs them first. The company publishes a moderate stream of research
carried out by its own data scientists, occasionally in conjunction with partner institutions. By
and large, such research paints a rosy view of the organisation, though occasionally the company
badly misjudges how a particular study will be received by the public.

In 2014, for instance, the social network published research showing that two years earlier it had
deliberately increased the amount of negative content on the timelines of 150,000 people, to
see if it would make them sad. The study into emotional contagion sparked outrage, and may
have cooled Facebook's views on publishing research full stop.

As a result of their lack of access to Facebook, Howard and his team have turned to Twitter. Even
there, the company's limits hit hard: they can see just 1% of posts on the site each day, meaning
they have to carefully select what terms they monitor to avoid being too broad. In the US election,
they hit the cap a few times, missing crucial hours of data as conversation hit fever pitch.

Similar limitations exist throughout the study. The team had to use a broad definition of
automated posting (they count any account that makes more than 50 tweets a day with political
hashtags), because Twitter would not share its own definition. And they had to limit their
examination of political postings to tweets containing one of about 50 hashtags, such as #ge17,
not only to avoid hitting the 1% limit but also to scoop up only tweets actively engaged in
political debate.
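
As a rough illustration, the two criteria the researchers describe could be sketched in Python as
follows. The tweet structure and the hashtag list are assumptions for demonstration, but the
50-tweets-a-day threshold and the hashtag filter are the rules set out above.

    from collections import Counter

    # A small stand-in for the roughly 50 election hashtags the team tracks.
    POLITICAL_HASHTAGS = {"#ge17", "#generalelection", "#voteuk"}  # illustrative subset

    def is_political(tweet: dict) -> bool:
        """Keep only tweets carrying one of the tracked election hashtags."""
        return any(tag.lower() in POLITICAL_HASHTAGS for tag in tweet["hashtags"])

    def flag_automated(tweets: list[dict]) -> set[str]:
        """Flag accounts that post more than 50 political-hashtag tweets in a
        single day: the study's working definition of automated posting."""
        daily_counts = Counter(
            (t["user"], t["date"]) for t in tweets if is_political(t)
        )
        return {user for (user, day), n in daily_counts.items() if n > 50}

Each tweet here is assumed to be a dict such as {"user": "example", "date": "2017-05-22",
"hashtags": ["#ge17"]}. The crudeness is the point: without Twitter's own definition, a simple
volume threshold is all the researchers have to work with.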

The result, Howard said, was vaguely analogous to the conversation on Facebook: while it may
not be the same, it is likely that the debates most automated on Twitter are also the most
automated on Facebook, and in largely the same direction.

But the limitations have one huge advantage: unlike nearly every other academic in the world,
and the vast majority of civil institutions responsible for regulating the fairness of elections, the
Oxford Internet Institute aims to publish its findings before the election, updated on a regular
basis. When I met Howard, along with two of his DPhil students and fellow researchers, John
Gallacher and Monica Kaminska, they were just beginning to make sense of the Twitter data. The
graduate students face a heavy couple of weeks coding the data, manually marking web domains
and links into categories such as "news", "hoax" or "video", but the result should be a rare glimpse
into a style of campaigning that many of its practitioners wish were hidden.

It is too early to say what results they will get, though Howard has pulled one finding from the
preliminary data: judging by their own metrics, there do not seem to be significant numbers of
bots posting Russian content, such as links to Sputnik or Russia Today.

Even there, though, he wishes for a small amount of extra cooperation. "We've stopped working
with geolocation," he said, referring to the process of trying to work out where a particular
tweet was sent from, "but Twitter has the IP addresses of every user." Sharing that, even in
aggregate, anonymised form, could shine a tiny light on a side of democratic politics shrouded
in darkness. These days, on the internet, no one knows you're a bot.
