TikTok: Our Biggest Problem Is Dumb, Horny Teens
Photo: Lionel Bonaventure (Getty Images)
It’s hard not to feel bad for TikTok. Back in 2018, we were calling the short-form video monolith the only app that was still capable of wringing anything close to “unbridled joy” out of our jaded, miserable lives. Less than two years later, it’s facing a potential ban from our sitting president and an exile from its largest market. It’s weathered its fair share of class action suits. It’s being investigated by the Department of Justice on the grounds that the app might pose a risk to national security. What used to be the last safe haven for silliness has quickly morphed into something that feels, well, just as unsafe as every other platform, if not even riskier.
Naturally, none of this was brought up in the company’s third transparency report when it went live on Thursday. Instead, the company seemed to insist that one of the biggest threats to safety on the platform is horny teens.
“TikTok is built upon the foundation of creative expression,” the report reads. “Feeling safe helps people feel comfortable expressing themselves openly, which allows creativity to flourish. This is why our top priority is promoting a safe and positive experience for everyone on TikTok.”
Keeping the platform safe and secure, the report goes on, starts with “addressing problematic behavior and content,” which is, evidently, something TikTok has in droves. In the latter half of last year, the company reports taking down close to fifty million videos globally (49,247,689, to be precise) for violating the platform’s community guidelines or terms of service. TikTok’s content moderation algorithms reportedly caught roughly 98% of these clips before they were reported, and about 89% were reportedly taken down before anyone had the chance to view them. Put another way: just over five million videos squeaked past these algos and into our feeds before the company snapped into action and took them down.
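If you want to check the back-of-the-envelope math on that, it takes three lines. A quick sketch in Python, using nothing but the report’s own figures:

```python
# Figures from TikTok's H2 2019 transparency report.
total_removed = 49_247_689
removed_before_views = 0.89  # share pulled down before a single view

# Videos that racked up at least one view before removal.
seen_first = total_removed * (1 - removed_before_views)
print(f"{seen_first:,.0f}")  # 5,417,246 -- "just over five million"
```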
Before the Indian government made the call to ban TikTok and dozens of other Chinese apps over alleged security and privacy issues this past June, the country was TikTok’s biggest overseas market, which means a lot of the banned content happened on its turf. As the report points out, just under sixteen and a half million takedowns involved Indian TikTokers uploading something shady—a sizable jump over the second most problematic country, which is (naturally) the United States. US users racked up just over four and a half million takedowns during this period, beating out Pakistan, the UK, and Russia.
The crackdowns don’t reflect on TikTok so much as they reflect on the fact that the teens who use it are dumb, horny monsters—you know, the same way teens usually are. The report only breaks the takedowns down by category for December, but those figures paint a pretty good picture of the platform’s priorities:
During the month of December, 25.5% of the videos we took down fell under the category of adult nudity and sexual activities. Out of an abundance of caution for child safety, 24.8% of videos we removed violated our minor safety policies, which include content depicting harmful, dangerous, or illegal behavior by minors, like alcohol or drug use, as well as more serious content we take immediate action to remove, terminate accounts, and report to NCMEC and law enforcement as appropriate. Content containing illegal activities and regulated goods made up 21.5% of takedowns. In addition, 15.6% of videos removed violated our suicide, self-harm, and dangerous acts policy, which primarily reflects our removal of risky challenges.
So just to recap: sexting—consensual or otherwise—brought on just about a quarter of the crackdowns. Another quarter hit the teens dumb enough to brag about how edgy they are for vaping. “Illegal activities and regulated goods,” by TikTok’s definition, also covers drug use and drug buying, along with “pranks” and any how-to’s involving “illegal activities,” which all told made up about a fifth of the crackdowns. A little more than a tenth were cracked down over what TikTok calls “risky challenges,” and what we’d call setting yourself on fire, or fitting your entire head inside a condom for clicks.
I’m probably not giving TikTok enough credit for its troubles here, since the content being crushed isn’t only from teens but also from sexual predators and drug dealers, along with clips that might spur someone to suicide. But this report doesn’t make that distinction clear. Without it, and because TikTok is a platform where the lion’s share of users are under 18, it’s worth assuming that the vast majority of these cases—which add up to just about 87% of total TikTok crackdowns—hit teens the hardest, which is what just about all of us predicted would happen when TikTok gave its community guidelines an overhaul back in January.
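That 87% figure, incidentally, is just the four December categories from the report summed up. A quick sanity check:

```python
# December takedown shares by policy category, per the report (percent).
december_shares = {
    "adult nudity and sexual activities": 25.5,
    "minor safety": 24.8,
    "illegal activities and regulated goods": 21.5,
    "suicide, self-harm, and dangerous acts": 15.6,
}

print(round(sum(december_shares.values()), 1))  # 87.4 -- "just about 87%"
```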
It makes the rest of TikTok’s bans seem like barely anything at all:
Of the remaining videos removed, 8.6% violated our violent and graphic content policy; 3% fell under our harassment and bullying policy; and less than 1% contained content that violated our policies on hate speech, integrity and authenticity, and dangerous individuals and organizations.
TikTok, like just about every other social network, relies on imperfect AI backed by equally imperfect human reviewers to do this dirty work for it. And just like every other platform, it’s struggling with the fact that hate speech is both insanely pervasive and insanely hard to catch in any automated way. Meanwhile, researchers have spent years honing all sorts of algorithms into well-oiled machines that can pick up on the smallest hint of boob, or anything remotely boob-shaped, like these baby owls.
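To make that “imperfect AI plus human reviewers” setup concrete, here’s a minimal sketch of how a tiered moderation pipeline typically works. Everything in it is hypothetical (the names, thresholds, and routing are illustrative, since TikTok hasn’t published its internals):

```python
from dataclasses import dataclass

@dataclass
class Video:
    id: str
    violation_score: float  # hypothetical classifier confidence, 0.0 to 1.0

# Illustrative thresholds; real systems tune these per policy category.
AUTO_REMOVE_AT = 0.95   # confident enough to act without a human
HUMAN_REVIEW_AT = 0.60  # uncertain, so queue for a moderator

def triage(video: Video) -> str:
    """Route a clip based on the model's confidence. High-confidence hits
    get removed automatically; the murky middle, where hate speech tends
    to live, falls to human reviewers."""
    if video.violation_score >= AUTO_REMOVE_AT:
        return "remove"
    if video.violation_score >= HUMAN_REVIEW_AT:
        return "human_review"
    return "allow"

print(triage(Video("clip_1", 0.97)))  # remove
print(triage(Video("clip_2", 0.72)))  # human_review
print(triage(Video("clip_3", 0.10)))  # allow
```

The catch, as platforms keep rediscovering, is that hate speech clusters in that murky middle band, where automated confidence is low and humans have to make the call.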
For every six times this report preaches about “safety,” privacy gets mentioned just once, and only in a quick little paragraph describing the “wide range of privacy settings” available to teens on a platform that, it’s worth repeating, is under global scrutiny for the data it collects. TikTokers, the report points out, can turn their accounts private, or turn off their messaging function, or, you know, block a follower.
What’s omitted here is that those little steps—just like every other step we might take in an app—leave behind their own digital trail, one TikTok doesn’t even pretend to let us erase. Instead, it looks like the company’s motivation isn’t to keep its users safe so much as to keep them safe from everyone but TikTok.
Source: Gizmodo.com