Social Media Algorithms and Data Management
Ivana STAMENKOVIĆ
Dušan ALEKSIĆ
Abstract
Although the audience in the digital media space has more power
than in the traditional media environment, as indicated by their
ability to create, reshape and share content, media users’ behavior
is shaped by the use of algorithms and big data management.
Taking into consideration the fact that students use the internet
and social media platforms daily, this paper aims to examine their
perceptions and viewpoints on the operation of algorithms and
data management on the Internet. According to a survey of 200 respondents conducted by the authors, two-thirds of the students notice the effects of algorithmic personalization, the filtered selection of content and news, and the customized display of content on social media. Even though 70% of them realize that user activities are continually monitored and that control over personal data online has been taken over by large companies and/or third parties, most respondents (82%) express only moderate concern for their data online, which is consistent with the finding that only a small percentage of students (18%) almost always read the terms of use of a website, application, or internet service.
Introduction
universe users. On the other hand, the level of criticism towards the algorithmic programming of social reality varies among media users, depending on numerous factors such as user behavior, the degree of online participation, the level of education, and knowledge of how modern digital communication, based on the principles of the algorithmic organization of the information space, functions. Moreover, the unpredictability of content that differs from media users' preferences and interests may cause skepticism among young people towards a social media platform as a provider of information, as may the awareness of the reciprocal relationship between user behavior and algorithm-dependent decisions (Swart, 2021). When users analytically and critically detect the latent forms of manipulation carried out on the internet by organizations, institutions, groups, and individuals, the belief that the benefits of the digital world are easily accessible, with plenty of knowledge available and social barriers removed, is revealed as a delusion.
In a world free from the hierarchical forms of influence of traditional and official news sources, social media platforms and search engines, together with the companies that own them, have become the dominant distributors of information and programmed consciousness. These groups are taking over the role of information gatekeepers and editors of the social agenda, thus displacing the former masters of information. The difference, however, is that in the domain of social media and search engines the editorial function is performed not by a human but by an algorithm, shaped by the criteria of those who created and implemented it (Cetina Presuel & Martínez Sierra, 2019).
Taking into account that students use the Internet and social media platforms
daily, this paper aims to examine their perception of and attitudes toward the
operation of algorithms and data management on the Internet.
1 All quotations from non-English authors were translated by the authors of the article.
of the current situation; its use is always influenced by social, political, and
economic factors.
Commercial imperatives dictate the further direction and cannot be equated with, or justified by, technological inevitability. For this reason, according to Shoshana Zuboff, we should look for the puppeteers "who secretly, behind the scenes, control the machines and tell them what to do" (Zuboff, 2020, p. 27). Zuboff also described a modern form of capitalism, so-called surveillance capitalism, which claims unilateral ownership of human experience and behavioral data. While one part of these data is used to improve services and products, the other part, a private behavioral surplus, is used to predict future behavior with the help of machine intelligence. These data are then transformed into prognostic products and become key items on the market of ideas (Zuboff, 2020, p. 18). Insight into these data enables the prediction of future behavior and steers millions of people around the world toward choices deemed acceptable by surveillance capitalists. Behavior modification becomes the goal, and the way it is achieved grows ever more hidden, perfidious, and complex. From simple decisions such as purchasing a product, watching a certain movie, or visiting a restaurant, to the formation of a lifestyle or a political or ideological orientation, a new market project is imposed, managed from the shadows through the most intimate parts of our personality, and it cannot be avoided.
Data mining has become a lucrative strategy that ensures a long-term impact on users and their behavior. It is easy to influence people's future activities with filtered content once their wants, personal and professional interests, plans, and habits are known. Based on the extracted data, one can therefore infer the structure of an individual's personality, their character and affinities, what they want to have, buy, or do, and what they want to be. It has become generally accepted that the personal data users leave online are used, classified, and grouped, and that this process is carried out with the help of search tools, the most popular of which is Google. These search tools serve users by providing information considered relevant on the basis of each user's previous online activities, while at the same time collecting, sorting, storing, and selling data to advertisers and other third parties. Only a small number of users are aware that their privacy has been violated, partly because the terms of use go unread: they are complex, incomprehensible, and changeable. A further problem is users' insufficient critical capacity to understand how they can protect themselves and share as little personal data as possible.
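As a rough illustration of the profiling logic described above, the short Python sketch below shows how traces of online activity can be aggregated into an interest profile and then used to filter the content a user is shown. All data, topic names, and weights are hypothetical; this is not the system of any real company.

```python
# A minimal, illustrative sketch (hypothetical data, not any platform's actual system):
# behavioural traces left online are aggregated into an interest profile and then
# used to rank and filter the content a user sees.
from collections import Counter

# Hypothetical behavioural traces: topics attached to pages a user visited,
# liked, or searched for.
activity_log = ["fitness", "travel", "fitness", "vegan recipes", "travel", "fitness"]

# Step 1: aggregate traces into a simple interest profile (topic -> weight).
profile = Counter(activity_log)

# Step 2: rank candidate items by how well they match the inferred interests.
candidate_items = [
    {"title": "10-minute home workout", "topics": ["fitness"]},
    {"title": "City-break guide to Lisbon", "topics": ["travel"]},
    {"title": "Local council budget report", "topics": ["politics"]},
]

def match_score(item):
    # Sum of the profile weights of the item's topics (unseen topics count as 0).
    return sum(profile[t] for t in item["topics"])

ranked = sorted(candidate_items, key=match_score, reverse=True)
for item in ranked:
    print(item["title"], match_score(item))
# Items outside the inferred interests (e.g. the budget report) score 0 and are
# pushed to the bottom -- the filtering effect described in the paragraph above.
```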
Taking into consideration that surveillance is a constant phenomenon in the world (Zuboff, 2020) and that its tentacles are expanding and multiplying under the conditions of digital communication, Mary Chayko distinguishes between vertical (asymmetric) and horizontal (social) surveillance (Chayko, 2019, pp. 100-109). Online surveillance relies on the use of the internet to track or observe someone's behavior. In the case of a "solid hierarchical structure of power", surveillance carried out for commercial, political, or legal purposes is called vertical. It involves the government and its institutions, as well as various corporations, which possess the power to influence, direct, shape, and
Facebook's terms-and-conditions discourse is dominated by the attitude that the user bears the responsibility and that privacy is a personal matter. According to Siva Vaidhyanathan, "privacy protection is a problem of the environment that Facebook approaches as a matter of personal responsibility; for Facebook, privacy is a structural problem that can be only solved by the work and insight of users" (Vaidhyanathan, 2018, p. 83). In the desire to capture users' attention and retain it, the issues of privacy and dignity become inconvenient and redundant (Vaidhyanathan, 2018).
We depend on what the algorithm determines based on the data we have consciously or unconsciously left in the digital space. "Data sets are in themselves meaningless. They must be made useful." Therefore, we are "made subject not to our data, but to the interpretations of that data" (Cheney-Lippold, 2017, p. 254). As the author adds, in the process of shaping knowledge about the world and ourselves, the central place belongs to algorithms, databases, and their logic (Cheney-Lippold, 2017). Cheney-Lippold also explains that a citizen, for example, is whoever currently produces data that the algorithmic logic defines as a citizen. Algorithmic identification begins as soon as an individual enters the online world, and its consequences extend into a future that is becoming increasingly uncontrollable.
The distribution of content on social media is closely related to what Riemer and Peter describe as the algorithmic audience. They use this term "to refer to the automatic and ad-hoc configuration of audiences for speech through algorithmic content distribution, as a by-product of profit maximization" (Riemer & Peter, 2021, p. 9). The consequences of algorithmic programming include the fragmentation and polarization of the audience, the decline in news quality, and the radicalization of public discourse (owing to the presence of misinformation and fake news), as well as incivility and hate speech (Stark & Stegmann, 2020, p. 6; Spohr, 2017). As has been argued many times: "This happens because more extreme, more outrageous and thus more polarising content is often found to be most engaging and thus amplified by the algorithm" (Riemer & Peter, 2021, p. 11; Marantz, 2020; Edelson et al., 2021; Vaidhyanathan, 2021). Many authors consider that these negative phenomena originate in the algorithmic logic of social media, the personalization of content for each user, and the so-called filter bubbles and echo chambers. Back in 2011, Eli Pariser, a writer, internet activist, and entrepreneur, drew attention to a phenomenon that had appeared on social media, driven by market imperatives and the desire for commercial benefit, and called it the "filter bubble" (Pariser, 2011; Vaidhyanathan, 2018). It consists of delivering content similar to what users have searched for and responded to by clicking, liking, sharing, and commenting. In this way, Facebook has become a machine with great power to organize our information space, our connections and relationships with others, and our perspectives on social reality. As noted by Cetina Presuel & Martínez Sierra: "Distributing any type of content that may interest their users means that those users will remain engaged and spend time on their services and that more data will be collected" (Cetina Presuel & Martínez Sierra, 2019, p. 265). Closely related to this is the phenomenon known as the echo chamber.
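To make the engagement-driven dynamic described above more concrete, the following toy simulation, written for this article and not taken from any platform's actual ranking code, shows how amplifying whatever a user engages with can narrow a feed around a single topic. All topics, probabilities, and weights are hypothetical assumptions.

```python
# A minimal sketch of the feedback loop behind a "filter bubble" -- purely
# illustrative, not the ranking used by any real platform. Engagement on a topic
# raises that topic's weight, which makes similar content more likely to be shown,
# which in turn invites further engagement on the same topic.
import random

random.seed(1)
topics = ["sports", "politics", "music"]
weights = {t: 1.0 for t in topics}   # start with no personalization
true_preference = "sports"           # hypothetical user taste

for step in range(50):
    # Show the topic with the highest current weight (ties broken by list order).
    shown = max(weights, key=weights.get)
    # The user is far more likely to engage with content matching their taste.
    engaged = random.random() < (0.8 if shown == true_preference else 0.2)
    if engaged:
        weights[shown] += 1.0        # engagement amplifies that topic

print(weights)
# After a few dozen steps the "sports" weight dominates, so the simulated feed
# shows mostly sports -- similar content is reinforced and other topics fade.
```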
Method
The responses of students from the two departments were compared to determine whether they differed in their understanding of the algorithmic reality of the platforms, as well as whether they were able to deal with the mechanisms by which algorithms shape information flows.
All respondents who participated in the research are social media users. Most of them (93%) use social media daily, while 6.5% access them several times a week. Only one respondent reported accessing them only several times a month. The students of the two departments do not differ in how frequently they access social media.
It should be pointed out that the sample is not representative, but it does give insight into students' perceptions and understanding of algorithms. The survey was conducted by the authors, and all questionnaires were administered face-to-face.
Results
Table 2: During your studies so far, have you dealt with the issue of privacy and algorithmic operation mechanisms online? (number of students; Empirical / Theoretical / Deviation)

Department                                       | Yes               | No                 | Not certain       | Total
Communication and Public Relations / Journalism | 36 / 31.5 / 0.64  | 82 / 90.75 / 0.84  | 32 / 27.75 / 0.65 | 150
Social Policy and Social Work                    | 6 / 10.5 / 1.93   | 39 / 30.25 / 2.53  | 5 / 9.25 / 1.95   | 50
Total                                            | 42                | 121                | 37                | 200

X2 (2, N = 200) = 8.54, p = .013914.
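For readers who wish to reproduce the test statistic, the short Python snippet below recomputes the chi-square test from the observed counts in Table 2. It assumes the numpy and scipy libraries are available; it is a verification sketch, not part of the original analysis.

```python
# Recompute the Pearson chi-square test for the 2 x 3 contingency table in Table 2.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [36, 82, 32],   # Communication and Public Relations / Journalism: Yes, No, Not certain
    [ 6, 39,  5],   # Social Policy and Social Work: Yes, No, Not certain
])

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi2({dof}, N = {observed.sum()}) = {chi2:.2f}, p = {p:.4f}")
# The expected (theoretical) counts reproduce the table: 31.5, 90.75, 27.75 and
# 10.5, 30.25, 9.25, and the statistic comes out at about 8.55 with p close to .014,
# consistent with the values reported above.
```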
Table 3: Do you think that by being presented with similar content, conditioned
by previous experiences and interests, you can be denied different opinions and
viewpoints?
Number of students
Most students (76.5%) agree that the way the algorithm functions can affect the amount of time they spend on social media, 16.5% are uncertain, while 7% state that this is not the case. There is no statistically significant difference between the two groups of students (X2 (2, N = 200) = 3.1237, p = .20975).
Only one respondent (0.5%) believes that algorithms do not allow social media platforms and large companies to manipulate users' personal data. On the contrary, 76% believe that such a practice exists, while 23.5% express uncertainty. In this case, the groups studied did not show a significant difference of opinion (X2 (2, N = 200) = 0.0054, p = .941425).
Furthermore, 68.5% believe that users' activities on any social media are constantly monitored, 23% express uncertainty, and 8.5% think there is no monitoring. In this case, too, the students' opinions do not differ by department (X2 (2, N = 200) = 5.7224, p = .057201).
According to 85.5% of respondents, appropriate education and information can help users better understand and control their data on social media. A further 12.5% express uncertainty, while only 2% believe this is not the case. Regarding this issue, there is no statistically significant difference among the students (X2 (2, N = 200) = 1.3639, p = .24287).
Across the analyzed categories, a statistically significant difference between the students of Communication and Public Relations / Journalism and the students of Social Policy and Social Work appears in only two categories. The overall impression is that the students are equally familiar with the functioning of social media, algorithms, and potential privacy threats. The reason for this may lie in a critical approach to social media among both the media-focused students and those who are not.
Discussion
algorithmic decisions (Swart, 2021). These findings lead to the observation that young people are, to a certain extent, aware of the impact of algorithms on the information they receive daily, but they have no control over the processes of selection, editing, and personalization of content. According to the research conducted by Head, Fister, and MacMillan, young people do not withdraw from social media or limit the amount of time they spend online and on various platforms, despite knowing that their activities are monitored and their data traded, because they find the application services useful (Head et al., 2020).
Conclusion
References
Aleksić, D., & Stamenković, I. (2018). The language of media in the digital age. Balkan Social Science Review, 12, 101-113.
Barbera, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531-1542. https://doi.org/10.1177/095679761559462
Barbera, P., & Rivero, G. (2015). Understanding the political representativeness of Twitter users. Social Science Computer Review, 33(6), 712-729. https://doi.org/10.1177/0894439314558836
Barbera, P. (2020). Social media, echo chambers, and political polarization. In N. Persily & J. A. Tucker (Eds.), Social Media and Democracy: The State of the Field (pp. 34-55). Cambridge University Press. https://doi.org/10.1017/9781108890960
Cetina Presuel, R., & Martínez Sierra, J. M. (2019). Algorithms and the news: Social media platforms as news publishers and distributors. Revista de Comunicación, 18(2), 261-285.
Cheney-Lippold, J. (2017). We are data. New York: New York University Press.