Is social media ruining our relationships? Or is it perhaps just a digital reflection of our individual preferences? One thing we know for sure: the algorithms that recommend content know us better than we know ourselves.
Over the past few years, we’ve seen a wave of studies and publications exposing the damaging impact of social media on our lives. One example is the Netflix documentary The Social Dilemma, which set off an explosion of negative sentiment towards social media. Its creators showcased the consequences of addiction to digital reality, basing their argument on tragic statistics: since 2011, depression and anxiety among young people have increased by 62 percent, and suicide by 75 percent; for preteen girls, the corresponding figures are 189 and 151 percent. These statistics refer to the United States, but similar data has emerged in other countries as well.
Max Fisher, author of the recently released book The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World, compares social-media platforms to the cigarette manufacturers of the 1960s, who lobbied for years to cover up damning research on their products. According to Fisher, a shift in attitudes towards social media is inevitable; in the extreme version of this scenario, excessive use will come to be viewed just as negatively as addiction to tobacco or alcohol. We can already see the first positive changes following the initial phase of intoxication with the allure of digital platforms. A 2019 study by Edison Research and Triton Digital found that overall social-media use among Americans aged 12 to 34 has stabilized or is declining across several platforms. The same study found that a growing number of people under 30 use Facebook solely for socializing with friends.
Since the very beginnings of traditional media, owners have sought to exert as much influence as possible over people’s daily lives, from their audiences’ consumer choices to their political beliefs. Nowadays, thanks to the tracking of our every move in cyberspace, the capabilities of social-media platforms go beyond anything that came before. YouTube monitors the behavior of two billion users, collecting what is surely the largest dataset of viewer preferences ever assembled. The goal of any digital platform is to hold our attention for as long as possible, whatever the social and psychological cost. Numerous studies of algorithm-based recommendation systems have shown that these mechanisms contribute significantly to shaping extreme opinions in viewers, because such content is rewarded by the algorithms for the high emotional investment it elicits from users. One 2021 report revealed that more than 70 percent of the extremist content found on YouTube had been recommended to users by an algorithm.
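It is worth pausing on the mechanics of that feedback loop. A deliberately crude simulation – every name and number in it is an assumption made for illustration, not anything drawn from the studies above – shows how a ranker that rewards engagement can end up promoting the most emotionally charged material:

```python
import random

random.seed(0)

# Toy catalog: "arousal" stands in for how emotionally charged an item is
# (0 = neutral, 1 = extreme). All values here are illustrative assumptions.
items = [{"id": i, "arousal": i / 9, "weight": 1.0} for i in range(10)]

for _ in range(200):
    for item in items:  # in this toy setup, every item gets shown each round
        # Assumption: the probability of a click grows with emotional arousal.
        if random.random() < 0.1 + 0.6 * item["arousal"]:
            item["weight"] *= 1.05  # engagement is rewarded with higher rank

ranking = sorted(items, key=lambda x: x["weight"], reverse=True)
print([(item["id"], round(item["arousal"], 2)) for item in ranking[:3]])
# Under these assumptions the most emotionally charged items float to the
# top: the ranker amplifies whatever the engagement signal rewards.
```

No one has to intend this outcome; optimizing for clicks alone is enough to produce it.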
Even Mark Zuckerberg has acknowledged the correlation between engagement and how close content gets to the limits of what the platform allows. In a 2018 post, the Facebook founder admitted that one of the biggest issues social networks face is that, left unchecked, people engage disproportionately with sensationalist and provocative content: “Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average.”
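The remedy Facebook announced at the time – described in the TechCrunch piece listed in the sources below – was to invert this curve by demoting content as it approaches the line. A minimal sketch of that idea, with a made-up scoring function and made-up numbers rather than Facebook’s actual formula:

```python
def feed_score(predicted_engagement: float, borderline: float) -> float:
    """Rank score for a post.

    predicted_engagement: expected interaction, from 0 to 1.
    borderline: proximity to the policy line, 0 (benign) to 1 (at the line).
    Both inputs and the penalty curve are illustrative assumptions.
    """
    # Left alone, near-the-line posts engage the most and would win the
    # ranking; a demotion term that grows near the line flips the incentive.
    demotion = borderline ** 2
    return predicted_engagement * (1.0 - demotion)

print(feed_score(0.9, 0.95))  # sensational, near the line -> heavily demoted
print(feed_score(0.5, 0.10))  # ordinary post -> score barely touched
```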
Facebook’s internal research thus confirmed what experts worried about the platform’s growing polarization had been trying to demonstrate for years. On the Internet, genuinely informed choice is hard to come by: what we click on sits somewhere between impulse, natural temptation, and the stimuli served up by social networks.
It is worth noting that the problem described by Zuckerberg doesn’t apply only to digital platforms; it extends to traditional media as well. Charlie LeDuff, an American journalist and gonzo-style reporter, has pointed out the disturbing symbiosis between the media and politicians with particularly extreme views. In his book Sh*tshow!: The Country’s Collapsing… and the Ratings Are Great, LeDuff wrote:
Just ten weeks after announcing his candidacy, Trump was in command not only of the polls, but the media. The guy had shocked the establishment and they despised his disregard for their rules, but he was ratings gold for a catatonic industry whose viewers were fleeing in droves. Twenty-four million people had tuned in to watch the Big Orange insult everybody during the previous debate. Seeing that, the network televising this contest was doubling down on the game show motif and increasing its ad buys by 40 percent. Cha-ching!
Peter Thiel, founder of PayPal and Palantir, revealed his distrust of society as early as 2009, stating that it couldn’t be left unattended online. As Fisher writes, Thiel and his Silicon Valley colleagues viewed society as “a set of engineering problems waiting to be solved” – and, as you might guess, the perfect tool for surveillance and problem-solving turned out to be the ill-fated algorithm.
Silicon Valley giants have fallen in love with recommendation systems, which use our data, processed by special algorithms, to present content tailored to our tastes. Facebook, Twitter, and Instagram have moved away from chronological feeds, which displayed content in the order in which it was published. What emerged instead were algorithmically sequenced feeds built around whatever the platform deemed most interesting to each user. These changes have made social networks less predictable and less transparent.
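The difference between the two orderings comes down to a single sort key. A minimal sketch, assuming a hypothetical Post record with a public timestamp and a private, model-predicted engagement score – neither field reflects any platform’s real schema:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    published_at: int            # e.g., a Unix timestamp
    predicted_engagement: float  # a model output, assumed given here

posts = [
    Post("alice", 1700000300, 0.21),
    Post("bob",   1700000100, 0.93),
    Post("carol", 1700000200, 0.47),
]

# Chronological feed: newest first, identical for everyone, easy to audit.
chronological = sorted(posts, key=lambda p: p.published_at, reverse=True)

# Algorithmic feed: whatever the platform predicts will hold our attention.
algorithmic = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])  # ['alice', 'carol', 'bob']
print([p.author for p in algorithmic])    # ['bob', 'carol', 'alice']
```

The timestamp is public and verifiable; the engagement prediction is a private model output that neither users nor regulators can inspect, which is precisely where the transparency is lost.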
This shift set off a constant, desperate war for attention. As Fisher explains, “the element that all social media rely on is attention.” The methods used by digital platforms have become persuasive like nothing before them. The recommendation algorithm, refined with findings from dopamine and addiction research and from behavioral psychology, has become the most powerful tool ever devised for grabbing our attention, changing our beliefs, and influencing important life decisions.
The negative impact of digital platforms can be seen best among the youngest, who have never known a world without the Internet. As research shows, digital natives get bored quickly, are impatient, and expect an immediate reward for work done. They are also better adapted to doing several things at once: the web makes it possible to talk to a friend while searching for information on a particular topic, listening to music, or making a bank transfer. Hence Linda Stone’s description of “continuous partial attention” – being interested in everything without focusing on anything – which is directly related to the popular FOMO (Fear of Missing Out) syndrome: “the pervasive fear that other people, at any given moment, are having highly rewarding experiences that I am not taking part in.”
Max Fisher’s proposal is to turn off digital platforms, or at least the recommendation algorithms that fuel them. This would result in a less engaging Internet, one whose architecture would no longer resemble a one-armed bandit on a casino floor. At present, social media preys on our weaknesses, our basest instincts, and our rougher days, when, lured by the temptation of cheap entertainment, we feed the algorithms with our choices and become passive consumers of low-quality content.
Given the lack of transparency, and the absence of requirements imposed on social media from the top down by governments, many Internet users have started putting together tutorials with simple ways of managing the algorithms. The idea is to train the recommendation system by giving a thumbs-down to YouTube videos or by quickly skipping unwanted content on TikTok. But such individual practices don’t always work; a much better solution is the European Parliament’s legislative proposal under which digital platforms, in order to continue operating in the EU, would be forced to explain to their users how their content-recommendation systems work, or to allow users to opt out of having their preferences tracked.
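Those do-it-yourself tutorials amount to hand-training a preference model with negative signals. A toy sketch of what such feedback might do inside a recommender – the signal names and multipliers are assumptions made for illustration, since real platforms don’t document theirs:

```python
# Toy per-user topic weights that a recommender might maintain.
user_topic_weights = {"outrage": 1.0, "cooking": 1.0, "music": 1.0}

def register_feedback(topic: str, signal: str) -> None:
    """Update a topic weight from user feedback.

    The multipliers are illustrative assumptions: a thumbs-down hits the
    topic hard, a quick skip nudges it down, a full watch nudges it up.
    """
    multiplier = {"thumbs_down": 0.5, "quick_skip": 0.8, "watched": 1.1}[signal]
    user_topic_weights[topic] *= multiplier

register_feedback("outrage", "thumbs_down")
register_feedback("outrage", "quick_skip")
register_feedback("cooking", "watched")

print(user_topic_weights)  # {'outrage': 0.4, 'cooking': 1.1, 'music': 1.0}
```

Whether any of this helps depends entirely on the platform actually honoring the signal, which is why the opt-out required by the EU proposal reaches further than any amount of disciplined clicking.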
Sources:
Josh Constine, “Facebook will change algorithm to demote ‘borderline content’ that almost violates policies,” TechCrunch (2018)
Max Fisher, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World (2023)
Tamsin Shaw, “How Social Media Influences Our Behavior, and Vice Versa,” The New York Times (2022)