A recent study has revealed that TikTok may expose teens to potentially dangerous content shortly after they create an account, a finding that adds to rising concern about the app’s effects on its youngest users.
The Center for Countering Digital Hate (CCDH), a non-profit organisation, reported in its latest study that it can take as little as three minutes after creating a TikTok account to encounter content related to suicide, and roughly five more minutes to find a community promoting eating disorder content.
The researchers said they created eight new accounts in the United States, the United Kingdom, Canada, and Australia, each registered at TikTok’s minimum user age of 13. These accounts briefly paused on and liked videos about mental health and body image. Within a 30-minute period, the CCDH reported, the app recommended videos about body image and mental health roughly every 39 seconds.
The research is released at a time when local, state, and federal officials are looking for methods to penalise TikTok for privacy and security violations, as well as deciding whether the app is suitable for teenagers. It also follows more than a year of tough questioning at congressional hearings, where legislators pressed social media executives, including TikTok’s, on how their platforms can steer younger users, particularly teenage girls, toward hazardous content that harms their mental health and body image.
TikTok promised to improve after those hearings, which followed revelations by Facebook whistleblower Frances Haugen about Instagram’s effect on teenagers. However, the latest CCDH findings indicate that there may still be work to be done.
Imran Ahmed, CEO of the CCDH, said:
“Every parent’s worst nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, as well as their physical and mental health.”
A TikTok spokesperson said the research misrepresents the viewing experience on the platform for several reasons, including the small sample size, the limited 30-minute testing window, and the way the accounts scrolled past a series of unrelated topics in search of other content.
TikTok stated that it is continually introducing new safety measures for its users, such as ways to filter out mature or potentially harmful videos. In July, it added a feature to help people decide how much time they want to spend watching TikTok videos, set regular screen-time breaks, and view a dashboard showing how many times they opened the app.
TikTok also began assigning a “maturity score” to videos identified as potentially containing mature or complex themes. Additionally, the platform offers a number of parental controls.
According to TikTok, anything that promotes, normalises, or glorifies behaviour that might result in suicide or self-harm is not permitted on the platform. From April to June of this year, the company said, it removed 93.4% of videos breaching its policy on suicide and self-harm content at zero views, 91.5% within 24 hours of being posted, and 97.1% before any reports were received.
However, according to the CCDH, more needs to be done to strengthen safety for young users and to ban certain content on TikTok.
The CCDH’s Ahmed stated that the research “underscores the critical need for reform of online spaces.” Without regulation, he argued, TikTok’s opaque platform will keep making money by delivering increasingly intense and upsetting content to its users, who can be as young as 13 years old, without checks, resources, or assistance.