An Analysis of AI: How Algorithms Affect Teens

Aurna Mukherjee, Club Contributor

Artificial intelligence (AI), the simulation of human intelligence processes by machines, is used in many aspects of students’ lives, including how they communicate with friends and interact on social platforms. AI algorithms are heavily utilized on social media, and the way those algorithms filter material can change how students view themselves, their peers and world events. Several efforts are underway to increase awareness of the challenges that algorithms pose.

One main issue in designing algorithms is the threat of algorithmic bias, or systematic errors that produce unfair outcomes. Lily Yeazell, a LASA class of 2021 graduate majoring in Computer Science at MIT, said she has noticed that the problem stems from existing cultural and social trends in the data fed into an algorithm. Because an algorithm has no moral conscience, Yeazell said, its effects can be extreme, from widening the partisan divide to facilitating online hate speech.

“[Algorithms] might be correct for 95% of the time, but the other 5% of the time, it just gets ignored, and it can have bad effects,” Yeazell said. “In general, applying statistical calculations to people is never good because people aren’t numbers.” 
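Yeazell’s point can be made concrete with a toy example. The Python sketch below is purely illustrative (the groups, numbers and rule are invented, not drawn from any real platform): a prediction that matches the majority of its data scores 95% overall while failing everyone in a smaller group.

```python
# A toy illustration (not any real platform's code) of how a rule
# that is right for the majority can be systematically wrong for a
# smaller group, while overall accuracy hides the harm.

users = [{"group": "majority", "likes_sports": True}] * 95 \
      + [{"group": "minority", "likes_sports": False}] * 5

def predict(user):
    # The "algorithm" assumes everyone behaves like the majority,
    # because that is what dominates its training data.
    return True  # predicts every user likes sports content

correct = sum(predict(u) == u["likes_sports"] for u in users)
print(f"Overall accuracy: {correct / len(users):.0%}")  # 95%

minority = [u for u in users if u["group"] == "minority"]
correct_min = sum(predict(u) == u["likes_sports"] for u in minority)
print(f"Minority group accuracy: {correct_min / len(minority):.0%}")  # 0%
```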

To increase retention, Yeazell said, social media platforms rely on algorithms that create targeted media to influence users. The algorithms behind TikTok, Instagram, and Facebook are a prime example, according to Yeazell: they serve targeted advertisements designed to increase users’ click rates, which in turn maximizes profits for these companies.

“There’s an algorithm in TikTok that tries to find what you like and give you more of that,” Yeazell said. “It also gives you advertisements to try to buy more things. This is something that is very geared towards youth and generates this addiction to screens that we have.” 
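As a rough illustration of the pattern Yeazell describes, the sketch below ranks a feed by a single watched-time signal. The field names and scoring are assumptions made for this example; real recommendation systems are far more complex and are not public.

```python
# A minimal sketch of an engagement-driven feed. The data and the
# single "watch time" signal are invented for illustration only.

posts = [
    {"id": 1, "topic": "dance",  "is_ad": False},
    {"id": 2, "topic": "gaming", "is_ad": False},
    {"id": 3, "topic": "dance",  "is_ad": True},
]

# The platform's model of the user: seconds watched per topic.
watch_seconds = {"dance": 40.0, "gaming": 5.0}

def score(post):
    # Rank by predicted engagement: topics the user already lingers
    # on float to the top, and ads on those topics ride along.
    return watch_seconds.get(post["topic"], 0.0)

feed = sorted(posts, key=score, reverse=True)
for post in feed:
    label = "AD" if post["is_ad"] else "post"
    print(f'{label} #{post["id"]} ({post["topic"]})')

# Every second watched feeds back into the profile, so the feed
# narrows toward whatever already holds the user's attention.
```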

On average, teenagers engage in more than 6.5 hours of screen time every day, according to Common Sense Media. Sophomore Gideon Witchel said teenagers’ screen time usage is due in part to algorithms that are designed to be addictive. He believes that while AI can be useful, its impact on youth is particularly damaging.

“[Algorithms] care a lot about you interacting with them for as long as physically possible, and they don’t care about how that affects you, and they don’t really care about you being a good person or an ethical person, or even a healthy person,” Witchel said. “They just care about you interacting with their platform as much as possible. I think algorithms are making that much more efficient and easy to do, and that’s probably a bad thing.”

According to Witchel, increased interaction with algorithms designed to be engaging can have several negative effects. For example, Brigham Young University found a clear correlation between increased social media usage and teen self-harm, as users can fall into a cycle of comparing themselves with others that leads to depression. Witchel criticized social media companies’ lack of response to the issue.

“We are in a world where companies only care about making money,” Witchel said. “They will take as many steps as possible to achieve that goal, even if it means destroying the mental health of youth.”

Computer Science teacher James Shockey has followed how social media platforms such as Facebook have dealt with some of these issues. He is especially concerned about the private data that corporations can access about individual users.

“Facebook is trying to work through freedom of speech while eliminating hate speech, so there are some concerns there,” Shockey said. “I think that in general, I would be concerned by the collection of large sets of data by corporate entities, because we now have tools for data mining that never existed before.”

To address these issues, there have been calls for transparency from the large technology corporations that are financially incentivized to design such algorithms. Shockey said it is necessary to understand how the personal information provided to these corporations is used and how it affects users. Sophomore Emily Lucas is also concerned about the lack of transparency around the extent to which algorithms shape what students see each day.

“A lot of companies aren’t very open about what they are doing,” Lucas said, “so this might be concerning because [people] do not know what is going on.” 

According to Lucas, this issue is exactly why regulations are needed to control how algorithms are used. Efforts to increase regulation of algorithmic bias are already underway. For example, the proposed Filter Bubble Transparency Act would require social media companies to offer a feature that lets users turn off the data inputs that generate algorithmic recommendations, according to PetaPixel, a photography news site.
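In practice, the feature the bill describes could be as simple as a per-user switch between a personalized ranking and a plain chronological feed. The sketch below is a hypothetical illustration of that behavior, not language from the act; the names and fields are invented.

```python
# A sketch of the kind of opt-out feature the bill describes,
# assuming a single "personalization" flag per user.

from datetime import datetime

posts = [
    {"id": 1, "posted": datetime(2022, 3, 1), "predicted_engagement": 0.9},
    {"id": 2, "posted": datetime(2022, 3, 3), "predicted_engagement": 0.2},
]

def build_feed(posts, personalization_on: bool):
    if personalization_on:
        # Default: rank by what the algorithm predicts will hold attention.
        return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
    # Opted out: fall back to a plain reverse-chronological feed that
    # uses no data gathered about the user.
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

print([p["id"] for p in build_feed(posts, personalization_on=True)])   # [1, 2]
print([p["id"] for p in build_feed(posts, personalization_on=False)])  # [2, 1]
```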

In addition, President Joe Biden has proposed efforts to secure the privacy of social media users, prevent targeted advertising aimed specifically at children, and ban social media companies from collecting private data on children, according to The Hill, a Washington, D.C. newspaper. These efforts are meant to protect the mental health of children affected by these algorithms. Yeazell said these are promising steps toward greater control over algorithmic bias, provided more people become conscious of the problem.

“Algorithms are essentially just being used to make their product more addictive,” Yeazell said. “Creating more awareness on this is generally good.”