TikTok logo (Photo: Pixabay)
- TikTok is facing lawsuits from 13 U.S. states, accusing the company of harming and failing to protect young users.
- The states allege that TikTok uses addictive software to keep children engaged and targets them with ads.
- TikTok has denied the allegations, highlighting its safety features like screen time limits and privacy defaults for minors.
- The lawsuits could have far-reaching implications for the social media industry and its regulation in the future.
In a significant development, TikTok, the popular social media platform, is facing lawsuits from 13 U.S. states and the District of Columbia. The lawsuits, filed separately in New York, California, the District of Columbia, and 11 other states, accuse the Chinese-owned company of harming and failing to protect young people. The states are seeking new financial penalties against the company, a major expansion of TikTok's legal battle with U.S. regulators.
The states allege that TikTok uses intentionally addictive software designed to keep children engaged as long and as often as possible. They also accuse the company of misrepresenting the effectiveness of its content moderation. "TikTok cultivates social media addiction to boost corporate profits," California Attorney General Rob Bonta said in a statement. "TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content."
The lawsuits also claim that TikTok seeks to maximize the amount of time users spend on the app in order to target them with ads. "Young people are struggling with their mental health because of addictive social media platforms like TikTok," said New York Attorney General Letitia James.
TikTok's Response and Safety Measures
In response to these allegations, TikTok said it strongly disagreed with the claims, many of which it believes to be inaccurate and misleading. The company expressed disappointment that the states chose to sue rather than work with it on constructive solutions to industry-wide challenges. TikTok also highlighted the safety features it provides, including default screen time limits and privacy defaults for minors under 16.
The lawsuits also accuse TikTok of facilitating the sexual exploitation of underage users. Washington, D.C. Attorney General Brian Schwalb alleged that TikTok operates an unlicensed money transmission business through its live streaming and virtual currency features. He described TikTok's platform as "dangerous by design," calling it "an intentionally addictive product" built to get young people hooked on their screens.
The lawsuits stem from a national investigation into TikTok, launched in March 2022 by a bipartisan coalition of attorneys general from many states, including New York, California, Kentucky, and New Jersey.
The Larger Implications and Historical Context
The lawsuits focus on TikTok's algorithm, which drives the app's "For You" feed by recommending content based on users' preferences. The lawsuits note TikTok design features that they say addict children to the platform, such as the ability to scroll endlessly through content, push notifications that come with built-in "buzzes" and face filters that create unattainable appearances for users.
The U.S. Justice Department sued TikTok in August for allegedly failing to protect children's privacy on the app; TikTok rejected those allegations in a court filing on Monday. Other states, including Utah and Texas, have previously sued TikTok for failing to protect children from harm. Meanwhile, TikTok's Chinese parent company, ByteDance, is battling a U.S. law that could ban the app in the United States.
This is not the first time a social media platform has faced legal action over its impact on young users. Other platforms, such as Facebook and Instagram, have also been scrutinized for potential harm to young people's mental health. The lawsuits against TikTok underscore this growing concern and the push for more robust regulations to protect users.