Parents frequently worry about the effect social media is having on their families. They approach the topic as if it were some sort of villain: The Algorithm is coming for our kids, they might say. But what is “The Algorithm”? And how is it affecting all of us?
Simply put, social media algorithms are programs that curate your feed. Machine learning allows these powerful programs to analyze someone’s social media use and—often with astonishing accuracy—predict what sort of content that person might also enjoy. It’s a way for social media companies to increase user engagement: If you grow bored with what you’re seeing, or if you simply don’t like it, you’ll log off and start doing something else.
In adolescents, whose minds are still developing, this can foster social media addiction. Indeed, even adults can fall prey to these addictive features. It’s why Meta (the parent company of Facebook and Instagram) and Google (the parent company of YouTube) recently lost a landmark case regarding the design of their products: A California jury decided these companies purposefully and maliciously designed their products to be addictive, handing the plaintiff a $3 million verdict.
But there’s another risk involved with algorithms that you might not be aware of. In fact, it may have already influenced you in ways that aren’t so easy to recognize.
Everybody has a bias. Ask anyone who works in public relations, and they’ll tell you. But that doesn’t have to be a bad thing. For instance, if you read a lot of articles by Plugged In or Focus on the Family, you may be biased toward Christian content.
Unfortunately, social media algorithms can encourage us to form biased opinions that we may not otherwise have developed on our own.
In a study conducted by Ohio State University, researchers found that “even when you know nothing about a topic,” algorithm-driven information “can start building biases immediately and can lead to a distorted view of reality.”
Essentially, algorithmic recommendations typically show you only one viewpoint—the one that your previous choices have demonstrated you will most likely agree with and engage with. Because of that, “people miss information when they follow an algorithm,” a co-author on the study said, “but they think what they do know generalizes to other features and other parts of the environment that they’ve never experienced.”
The researchers described their findings using a simple scenario:
“A person who has never watched movies from a certain country decides to try some. An on-demand streaming service offers recommendations. The viewer selects an action-thriller because it appears at the top of the list. The algorithm then promotes more action-thrillers, which the viewer continues to choose.”
The authors of the study wrote that if the viewer’s goal was to get a better understanding of the movies offered by that particular country, the algorithmic recommendation would have seriously biased that understanding. “By only seeing one genre, the person may overlook strong films in other categories. They may also form inaccurate and overly broad assumptions about the culture or society represented in those movies.”
To further demonstrate the impact on bias, the researchers also applied their findings to algorithm-guided learning.
They conducted an experiment in which participants were given information about the physical attributes of fictional aliens and asked to classify them. In one group, participants were required to look at all of an alien’s features before identifying it. In another, participants could select which features they wanted to see, and a personalized algorithm would then provide recommendations based on those selections. Participants in this group could still manually select any feature they wanted to view, and they could skip anything they didn’t want to look at. But the algorithm usually recommended the same features over and over, rather than giving them the whole picture. As a result, when tested, participants in this group frequently classified the aliens incorrectly.
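For readers who want to see the mechanism rather than just read about it, the feedback loop the researchers describe can be sketched in a few lines of Python. Everything here is hypothetical: the catalog, the genre list and the `recommend` function are made-up stand-ins, not any platform’s actual system. But the dynamic is the one the study identifies: ranking by past choices quickly collapses the feed to a single genre.

```python
# Illustrative sketch only (not any real platform's recommender): a toy
# catalog of 40 movies, 10 per genre, and a naive "more of what you
# already watched" recommender, mirroring the streaming scenario above.
GENRES = ["action-thriller", "drama", "comedy", "documentary"]
catalog = [{"title": f"{genre} #{i}", "genre": genre}
           for genre in GENRES for i in range(10)]

def recommend(history, catalog, k=5):
    """Rank unseen movies by how often the viewer has already watched
    their genre; ties keep catalog order (Python's sort is stable)."""
    counts = {}
    for movie in history:
        counts[movie["genre"]] = counts.get(movie["genre"], 0) + 1
    unseen = [m for m in catalog if m not in history]
    return sorted(unseen, key=lambda m: -counts.get(m["genre"], 0))[:k]

# The viewer picks one action-thriller, then always clicks the top
# recommendation, as in the researchers' scenario.
history = [catalog[0]]  # "action-thriller #0"
for _ in range(8):
    history.append(recommend(history, catalog)[0])

print({m["genre"] for m in history})  # prints {'action-thriller'}
```

After nine viewings, the viewer has seen exactly one of the four genres on offer, even though nothing ever stopped them from choosing differently. That is the whole trick: the narrowing comes from the ranking, not from any restriction.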
We may not be looking to social media to identify extraterrestrial visitors, but this research illustrates how algorithms can shape, and distort, our own view of the world. If someone chooses to learn about a subject by watching YouTube videos or scrolling through TikTok feeds or Reels, they’re not going to get the whole picture. They might know a lot about one attribute of a topic, but they may not understand how that attribute fits into the broader subject. It would be like studying one verse or even one book of Scripture. You may have memorized it, know who authored it and even applied it to your daily life, but without the context of the whole Bible, you may not truly understand what it means—what God intended for it to mean.
If you’ve heard social media compared to an echo chamber, this is why. Algorithms aren’t programmed to broaden our horizons. They’re programmed to pinpoint the exact, precise, niche topics that we’ll most enjoy consuming. But “consuming similar content is often not aligned with learning,” the study authors cautioned.
Now, you can explain everything written in that study—everything I just summarized—to your teens. And maybe some of them will listen. But many teens actually like their personalized algorithms.
According to a qualitative interview study featured in Fast Company, teens enjoy the customized content provided by algorithms because it shows them things they agree with and actually want to see without having to search for it themselves. And because algorithms can be so accurate, they even see the content presented as a reflection of themselves.
Because of this, many teens are unaware of—or unconcerned with—algorithmic bias. Based on what they told the interviewers, they felt confident in their ability to ignore or scroll past content that didn’t align with their personal beliefs or self-image.
Unfortunately, further research says otherwise.
According to several studies, teens have proven themselves “highly vulnerable to self-image distortion and other mental health problems based on social media algorithms.” Researchers know that the developing teen brain is exceptionally malleable to what their peers say and believe—including feedback provided via social media. So “teens are wrong to believe that they can scroll past the self-identity risks of algorithms.”
So parents, that leaves you with a pretty heavy burden. How do you teach your child to resist the pull of their algorithms without challenging their very sense of self?
The post Your Algorithm Is Fooling You appeared first on Plugged In.