Not “for you”: TikTok is addictive and dangerous
When it comes to social media, the app controlling the mood and direction of our youth is undeniably TikTok. The ByteDance-owned app, formerly Musical.ly, is a user-generated short-form video hosting service. TikTok has about one billion monthly users worldwide, excluding India and Afghanistan, where the app was banned over geopolitical tensions and concerns about private data being shared with China.
There is a so-called “TikTok ban” that has been in the news recently. Though the Senate committee focused its investigation on TikTok and its owner, the act would affect the whole internet. According to Congress.gov, its purpose is to “identify and mitigate foreign threats to information and communications technology (ICT) products and services (e.g., social media applications).” Though this act could have major effects on everyone’s online life, it will not be the focus of this article.
To give some background on myself and my online presence: I virtually have none. Though I had TikTok during lockdown, I quickly found it to be all-consuming. In an attempt to save my mental health, I deleted both TikTok and Instagram. I found limiting my social media presence to be freeing: no more worrying about how I would be perceived by others, or whether I’m pretty, skinny or funny enough.
The algorithm of the “For You” page is where we will start our journey. ByteDance has not shared exactly how the algorithm works, but users have been able to piece it together: when first joining the app, a user’s “For You” page is curated from the basic information they entered into the app, like gender, age and location. The more the app is used, the more it learns about what you like, based on which accounts you follow and what you like, comment on, share and spend time watching. Social media algorithms are built to hold users’ attention, and they can easily pull a user into a rabbit hole. Once the algorithm sees you spend time watching a certain type of video, it begins feeding you similar videos until your whole feed is content it has decided you will like.
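Since ByteDance has never published the real algorithm, the feedback loop users describe can only be illustrated with a toy model. The sketch below is entirely hypothetical — the signal names and weights are invented for illustration and do not come from TikTok — but it captures the basic loop: engagement builds a per-topic score, and the feed is re-ranked toward whatever scores highest.

```python
# Hypothetical toy model of an engagement-driven feed. None of these
# signals or weights come from TikTok; they only illustrate how watch
# time, likes and shares can narrow a feed around one topic.
from collections import defaultdict

# Invented weights: stronger signals push a topic up the feed faster.
ENGAGEMENT_WEIGHTS = {
    "watched_full": 3.0,
    "shared": 2.5,
    "liked": 2.0,
    "commented": 1.5,
}

def update_interests(interests, video_topic, signals):
    """Accumulate a score per topic from a viewer's engagement signals."""
    for signal in signals:
        interests[video_topic] += ENGAGEMENT_WEIGHTS.get(signal, 0.0)
    return interests

def rank_feed(candidates, interests):
    """Rank candidate videos by the viewer's accumulated topic scores."""
    return sorted(candidates,
                  key=lambda video: interests.get(video["topic"], 0.0),
                  reverse=True)

# A viewer lingers on two dance videos, liking one of them...
interests = defaultdict(float)
update_interests(interests, "dance", ["watched_full", "liked"])
update_interests(interests, "dance", ["watched_full"])

# ...and dance content now floats to the top: the rabbit-hole effect.
feed = rank_feed(
    [{"id": 1, "topic": "news"},
     {"id": 2, "topic": "dance"},
     {"id": 3, "topic": "cooking"}],
    interests,
)
print([video["topic"] for video in feed])  # dance ranked first
```

A real recommender is vastly more complex, but the unsettling part is visible even in this sketch: nothing in the loop asks whether the content is good for the viewer, only whether they engaged with it.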
Videos ranging from “Family Guy” clips and teens dancing to graphic footage of a real suicide can be found on TikTok. The aforementioned suicide clip has been one of many horrific videos spread across the app. CBS News reporter Sophie Lewis covered the event in her Sept. 9, 2020 article, “TikTok struggles to stop the spread of viral suicide video.” In it, Lewis cites a TikTok representative who said, “Our systems, together with our moderation teams, have been detecting and removing these clips for violating our policies against content that displays, praises, glorifies or promotes suicide.” While the graphic clip was removed from the app, it was not before the algorithm had placed these often-disguised videos on young people’s “For You” pages. Parents were understandably upset that a clip like this was showing up on their children’s phones, and it is safe to assume the video left many viewers, children and adults alike, scarred.
This statement from the TikTok spokesperson also does not hold up. TikTok and other social media platforms have documented effects on adolescents’ mental health. There are spaces on social media that empower and bring together otherwise overlooked groups of people, but these pros do not outweigh the cons. In contrast to the supportive, positive spaces, there are others whose sole purpose is to bring people down by encouraging things like eating disorders, self-harm and suicide.
The Center for Countering Digital Hate completed a report to “provide parents and policymakers insight into the TikTok content and algorithms shaping young lives today.” Researchers created accounts in the United States, United Kingdom, Canada and Australia posing as 13-year-olds (the minimum age to use the app). The researchers then paused briefly on videos about body image and mental health, liking some of them. They found that after 2.6 minutes TikTok showed a video regarding suicide, after eight minutes there was content about eating disorders, and every 39 seconds they were recommended videos about mental health and body image. While it can’t be assumed all of these videos were explicitly harmful, it is frightening to think young kids are being fed content on issues normally handled by professionals.
Along with harmful messages being spread on TikTok, people are also bombarded with sexual comments and content. While some content is inherently sexual and made as a “thirst trap” that should only be targeted at adults, these videos still end up on young children’s feeds. “Thirst traps” are an issue in themselves: when young kids see adults acting in a very sexual way, they see it as normal and end up emulating them. For young girls especially, the pre-teen years are when they begin to learn about society’s standards and attitudes toward sex. So when they are fed hypersexualized content and unrealistic beauty standards, it becomes ingrained in them that this is how they should act.
Predatory adults and online harassment are another issue. According to the National Sexual Violence Resource Center, 24 percent of boys and 36 percent of girls have experienced online harassment. This statistic isn’t specifically about underage people being harassed by adults, but it is still relevant here because it highlights how unsafe the online environment is.
TikTok may try to restrict content by age, but it is very easy to lie about one’s age online. Children lying isn’t the whole issue, though. When adults can easily access children’s content and hide behind their screens, they are able to send gross, perverted messages to innocent kids. TikTok needs to work on separating adult and children’s content before it damages more kids’ mental health and threatens their safety.
Beyond the culture on TikTok, there is also a problem with short-form content in general. I have found myself scrolling through social media thinking it’s only been 10 minutes when in reality it’s been an hour. Short-form content sends little hits of dopamine to your brain every time you see a video you like. These dopamine hits leave your brain craving more, causing people to spend hours lying in bed mindlessly scrolling.
Short-form content is also shortening people’s, and especially children’s, attention spans. The “Family Guy”-and-Subway-Surfers combo videos are a prime example of how people’s attention spans are being affected. Speaking with CNN’s Sandee LaMotte, professor of informatics Dr. Gloria Mark said that in 2004 the average attention time on a screen was around two and a half minutes; now it is around 47 seconds. While her research is not directly linked to TikTok, there is no doubt that it and the spread of short-form content to Instagram, YouTube and Snapchat are factors. The depletion of our attention spans is especially scary to me because of how it will affect younger kids in the future.
Between its privacy issues, harmful messages, oversexualization and screen addiction, TikTok presents many problems in my mind. While I know this essay most likely won’t get you to delete the app, I hope it makes you more conscious of what is being spoon-fed to you. There need to be policy changes and better regulation of the app, and people need to understand the consequences their posts and online activity can have on others. While I admit there are references and jokes I don’t understand from not being on TikTok, and there are times when YouTube Shorts suck me in (cringe, I know), I am also very happy not being addicted to the drug that is TikTok.