TIKTOK: A PANDEMIC WITHIN A PANDEMIC
Why is TikTok becoming the pandemic of all pandemics? Is it true that this Chinese app is killing people, mainly teenagers? Should you immediately delete it? Why, all of a sudden, is everyone who’s anyone warning us about the dangers of TikTok? How come they’re now admitting that TikTok is more addictive than Instagram, Facebook, and Twitter combined? TikTok is accused of profiteering while endangering the lives of teenagers, some of whom are being lured into sharing sexually explicit photos and videos of themselves through the app.
Its algorithm is accused of showing dangerous content to minors, including sex- and drug-related videos, pornography, and even rape, murder, and suicide. As a result, some teenagers are suffering from mental health problems, and some are even said to have been encouraged to attempt suicide.
If all this is true, why does everyone seem to be obsessed with the app? How come three billion people worldwide have downloaded it to date? Is it all lies and fabrications about the Chinese app that has taken the world by storm and is on its way to dominating the world and becoming bigger than Facebook, Instagram, Twitter, and Snapchat combined? Or is there a dark secret behind the app that only the chosen few know about?
In this article, I will explore the mysterious yet fascinating world of TikTok. You can also watch it on YouTube by following this link; please subscribe to my channel and like my videos. I’m counting on your support. https://youtu.be/PUTL-ib70MQ
LURING CHILDREN INTO SEXUALLY EXPLICIT PHOTOS AND VIDEOS
Almost everyone who is anyone is warning us about the dangers of TikTok. They’re all criticizing the app, whose parent company ByteDance is based in Beijing. TikTok, which is popular even with pre-teens and children, is said to be more addictive than Instagram, Facebook, and Twitter combined. Just like Facebook and Instagram, TikTok is accused of profiteering from amplifying extreme, sensationalist, and highly emotive content to keep young teens online and engaged.
What’s worse…
Young victims are said to be lured into sharing sexually explicit photos and videos of themselves through the popular app. Its algorithm is viewed as dangerous, and the platform is now accused of serving harmful content to underage users, including sex- and drug-related videos. A recent Wall Street Journal report found that the app’s algorithm suggested hundreds of videos related to drug use, pornography, and rape to its younger users on its “For You” page.
While the platform restricts full access to users under 13, a NewsGuard study found younger children are easily able to create an account despite the measure, and that children as young as nine were being exposed to sexual content on the platform soon after signing up.
But what exactly is TikTok doing to our kids? And why is there so much talk about the harm it’s inflicting upon them?
CHILDREN ARE BEING ENCOURAGED TO ATTEMPT SUICIDE
With 3 billion downloads to date, children and teenagers are rushing to TikTok in astronomical numbers. More than one-third of its daily users are younger than 14. While TikTok’s popularity is undeniable, mental health concerns for its users are escalating. By all appearances, TikTok’s appeal can turn into significant and potentially deadly mental health problems in youths.
One huge problem is that TikTok provides a stream of user-uploaded videos and recommends additional clips based on which videos are being watched. These recommended clips are sometimes extreme, anxiety-producing, and flat-out toxic. For example, teens who are interested in hunting or military life can soon be faced with pictures of serial killers and descriptions of failed or successful murders.
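To make that feedback-loop concern concrete, here is a deliberately minimal sketch of a watch-based recommender. This is not TikTok’s actual algorithm; the video names, tags, and scoring rule are all invented for illustration. It simply shows how recommending more of whatever gets watched can steer a feed from hunting and military clips toward darker material.

```python
from collections import Counter

# A toy catalog: the videos and tags below are invented purely for illustration.
CATALOG = {
    "vid_hunting_tips": {"hunting", "weapons", "outdoors"},
    "vid_military_drill": {"military", "weapons", "fitness"},
    "vid_serial_killer_doc": {"crime", "weapons", "violence"},
    "vid_cat_compilation": {"animals", "comedy"},
}

def recommend(watch_history, catalog, top_n=2):
    """Score each unwatched video by how often the viewer has already
    watched its tags; the more you watch of a theme, the more of that
    theme you are shown next."""
    seen_tags = Counter()
    for vid in watch_history:
        seen_tags.update(catalog[vid])
    unseen = {v: tags for v, tags in catalog.items() if v not in watch_history}
    ranked = sorted(
        unseen.items(),
        key=lambda item: sum(seen_tags[tag] for tag in item[1]),
        reverse=True,
    )
    return [vid for vid, _ in ranked[:top_n]]

# A viewer who watches hunting and military clips gets the serial-killer
# documentary ranked ahead of the harmless cat compilation.
print(recommend(["vid_hunting_tips", "vid_military_drill"], CATALOG))
```

Even this crude scoring rule never pushes the viewer back toward lighter content; a real engagement-optimized system is far more sophisticated, but the narrowing tendency is the same one critics describe.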
TikTok, like other social media, is associated with increased anxiety and depression. And it is especially dangerous for children and teenagers already suffering from mental health problems, specifically depression and eating disorders. Some are even said to have been encouraged to attempt suicide.
TAKING CHILDREN DOWN A DARK RABBIT HOLE
More obvious problems have also been noted.
TikTok has been found to show sex and drug videos to minors. And TikTok “challenges” have led kids to engage in destructive and illegal acts. For example, the “devious lick” challenge resulted in theft and vandalism at schools across the country.
Today, you’d be hard-pressed to find a kid with a smartphone who doesn’t obsessively watch TikTok. A glimpse into the dark mind of the Chinese-owned video-sharing app exposes the way its powerful algorithms steer children toward drug abuse and sexual depravity. Now, TikTok is said to be causing teens to develop a movement disorder similar to Tourette syndrome.
From adorable animals to kinky sex, and from droll dances to drug dealers, TikTok is rapidly taking children down a dark rabbit hole that would shock the most jaded adults. Extreme videos describe how to tie knots for sex, how to recover from violent sex acts, and how to fantasize about rape.
A DEEP CHASM OF FILTH
How do I know all this?
Well, a few days ago, for the first time since the Chinese app was created in 2016, I created an account for myself for the purpose of researching this article. Not long after the account was created, I found that more than 90 percent of my video feed focused on bondage and sex.
Whether I wanted it to or not, the app took me into loops of sexual topics. My account was also bombarded with marketing for strip clubs, promoted paid pornography, and videos pushing me toward the notorious UK-based porn website OnlyFans.com, a platform favored by sex traffickers, complete with explicit sex and prostitution come-ons. I was also lured into a TikTok space called “KinkTok,” featuring torture devices, chains, whips, and the like.
From what I’ve experienced and from what I’ve seen, TikTok aims to engross and addict. Unlike the other platforms I’m on, such as Facebook and Instagram, a big part of TikTok’s business involves addicting children. I wouldn’t be surprised if one day TikTok became a tool for the spread of mass psychogenic illness.
To me, it initially looked clean and fun on the surface. Underneath, however, I found a deep chasm of filth that can deform impressionable minds forever, or, worse, lure a vulnerable child into the grasp of predators and pornographers.
WILL TIKTOK BECOME THE NEW FACEBOOK?
TikTok is mercilessly working to control the lives of our children. It decides what they can see, what they can share, and even who they are becoming. As TikTok grows in power, children are becoming more and more entranced by their shiny screens. They are being offered as sacrificial lambs to a corporate god who cares about nothing but clicks and cash.
TikTok is becoming a pandemic within a pandemic. And there’s absolutely nothing we can do about it.
Despite all the criticism TikTok faces for harming young teens and pushing them to the verge of hurting themselves and even committing suicide, the most famous app in the world apparently now has new global ambitions to challenge the Meta universe.
It wants to become the new Facebook!
In early November, TikTok staff were surprised to receive a direct message from ByteDance founder Zhang Yiming that hinted at an important restructuring of the business. The changes caught them off guard. They never saw them coming. Six new units were being established, which left the staff wary of the reasons behind the restructuring.
“But why now? And what has changed all of a sudden?” most of them wondered.
The company, as expected, is on its way to posting revenue growth of 60 percent this year, one of its highest ever. On top of that, TikTok recently crossed the one-billion-user mark outside China. By all appearances, the company continues to ride high. So, why the change?
Apparently, there’s one secret that only those who are steering the company’s new direction have the precise answer for.
Here’s that secret…
TikTok has developed the ambition to be known for more than video sharing: it wants to become the new Facebook.
Looking at the numbers, this could happen soon, sooner than anyone expects. TikTok is the first non-Facebook app to cross 3 billion downloads worldwide. More than one in four Brits and one in three Americans use it every single month.
But this is just one part of TikTok’s business.
Its grip on its users is only the beginning of what’s coming. The video app has eyes on capturing people’s attention at work after it got the attention of young teens at school. If it succeeds in doing just that, the app’s parent company will become a westernized Tencent or Alibaba. It will be perceived as a Western company, more than a Chinese one. It will dominate the world, just like Facebook once did before it rebranded to Meta.
To achieve its goals, TikTok is following in the footsteps of Facebook. Day after day, its competition with Facebook is becoming more and more apparent. But for it to succeed, more than 50 percent of its revenue has to come from outside China. What’s more, it needs to grow in emerging markets. That will be the augury of what’s to come.
But here’s the multibillion-dollar question for you: TikTok is already a monster hit, so how is it now going to meet the challenge of its much more difficult second act?
If it’s ever to succeed, it will definitely need the support of Western brands.
And this is already happening.
TikTok is strengthening its ties with Shopify and Spotify in the West. In May 2021, its parent company ByteDance quietly announced a login kit for TikTok, allowing people to access third-party apps through their TikTok profile, much in the same way people can sign up for Tinder or read news websites with their Facebook profile. That’s a pattern Facebook had previously followed to great success.
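For readers unfamiliar with how “log in with X” schemes work, here is a minimal sketch of a generic authorization-code flow of the kind Facebook popularized and the paragraph above describes. The endpoint URLs, client credentials, and scope name are placeholders I made up; nothing here reflects ByteDance’s actual Login Kit API.

```python
import secrets
import urllib.parse

import requests  # third-party HTTP client

# Placeholder values: these endpoints and credentials are illustrative
# stand-ins, not TikTok's real Login Kit URLs or parameters.
AUTHORIZE_URL = "https://auth.example-social.com/oauth/authorize"
TOKEN_URL = "https://auth.example-social.com/oauth/token"
CLIENT_ID = "your-app-client-id"
CLIENT_SECRET = "your-app-client-secret"
REDIRECT_URI = "https://your-app.example.com/callback"

def build_login_url():
    """Step 1: send the user to the identity provider to approve access."""
    state = secrets.token_urlsafe(16)  # anti-CSRF value, checked on return
    params = {
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",
        "scope": "basic_profile",
        "state": state,
    }
    return f"{AUTHORIZE_URL}?{urllib.parse.urlencode(params)}", state

def exchange_code_for_token(code: str) -> str:
    """Step 2: the provider redirects back with a one-time code, which the
    third-party app swaps server-side for an access token."""
    resp = requests.post(TOKEN_URL, data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
    })
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The strategic point is the same one Facebook proved: once thousands of third-party apps depend on your login button, your platform becomes very hard to leave.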
When TikTok’s parent company was first set up, it wanted to create an empire as borderless as Google and Facebook combined. Soon, it may be granted its wish.
TIKTOK’S VIRAL CHALLENGES
However, this may not be as easy as it sounds.
TikTok recently came under fire when it was accused of causing teens to develop strange behaviors, some of them resembling a disorder known as Tourette syndrome.
As a result, many TikTokers found themselves involuntarily cursing and slapping themselves online. What’s worse, videos of them communicating their suffering online have been watched more than five billion times. This is becoming increasingly dangerous despite the app’s efforts to rectify the problem.
No doubt, TikTok has become a “pandemic within a pandemic.” Because of it, teens are struggling with schoolwork. Because of it, they’re feeling more and more isolated, and as a result of it, they’re becoming more and more bruised.
According to psychologists, these abnormalities caused by TikTok have resulted in more anxiety, depression, and even traumatic stress among teens. Rebecca Lester, a professor of sociocultural anthropology at Washington University, described the consequences as debilitating.
But she was not the only one suggesting that the TikTok teens are in deep distress and that the online app is making their symptoms worse. German psychiatrist Kirsten Müller-Vahl also described the TikTok outbreak as a mass social media-induced illness.
What’s more…
On November 17, 2021, the online publication TechCrunch wrote that TikTok has developed a bad reputation for hosting dangerous viral challenges, which at their worst have led to serious injury or death. More recently, the app made headlines for challenges that encouraged students to hit their teachers and destroy school property.
Reportedly, a child died trying the blackout challenge, a self-asphyxiation dare circulating on TikTok’s platform. The investigation of that incident revealed that TikTok can be a breeding ground for harmful content like its viral challenges. These challenges appeal to teens’ desire for approval from friends and peers, as they result in more likes and views.
This is scary, especially when a recent study reports the chilling finding that more than 20 percent of teens worldwide have participated in these challenges, most of which are considered very risky and genuinely dangerous. They push children toward harmful activities that can escalate to self-harm and even suicide.
So far, TikTok has done little to address these issues.
Apparently, it is no longer the social media platform that lets users wind down while watching funny videos, viral sensations, and interesting challenges. Instead, it has become a tool that exposes its users to danger, said Screen Rant in its November 18 online issue.
While TikTok claims that those challenges are meant to be fun and engaging, children do not seem able to tell which ones are safe and which are not. Based on its own survey, TikTok found that 46 percent of teens need to understand the risks involved.
But unfortunately, they don’t. And TikTok is the one to blame for that.
DON’T TRY THIS AT HOME
Social media “challenges” are nothing new. Many of them are harmless fun and often raise money for a good cause. Who could forget classics such as the “ice bucket challenge” or the “mannequin challenge”?
However, some of them stray into dangerous territory, and this is where things get worrying. “Planking” was one of the early trendsetters, with people putting themselves in vulnerable positions, such as atop skyscrapers or on train tracks, just to grab an image for Instagram.
TikTok has taken the idea of dangerous challenges to new extremes. The “penny challenge”, in which you drop a penny between a wall socket and a plug, has started house fires, while the “skull breaker challenge”, which involves intentionally tripping people up, caused nasty injuries.
There is also the “devious licks” challenge, which encourages students to steal or vandalize school property. As a result of it, several students have been arrested, and schools have been forced to spend money on fixing broken property.
All of this makes TikTok entirely inappropriate for kids, and yet they keep using the app, non-stop.
TikTok teens are getting caught up in such frightening hoaxes on the app. New research conducted by TikTok found that less than a third of them recognize these hoaxes as clearly fake. The rest are becoming more and more distressed by the scary hoaxes they see on the app. Global News reported online on November 17 that nearly half of them seek help afterward.
A trend with a dark and disturbing history has once again reared its head on TikTok to terrify a new generation of parents and users.
The Blue Whale challenge has been around for years but tends to only appear in brief, viral flare-ups. Its return to TikTok is worrying users, thanks to the challenge’s dangerous history.
So, what’s the Blue Whale challenge?
It originally cropped up broadly in 2016 and has resurfaced several times over the years since. When it first appeared, TikTok was still years from becoming a worldwide phenomenon. The optics of this “game” are overtly negative, thanks to its links to self-harm and suicide, so it is often discussed in hushed tones or not at all.
The challenge appears to have originated from several Russian news stories following a teen’s suicide. The teen’s death was linked to a challenge that reportedly presents participants with a range of gradually more dangerous tasks. The initial tasks are often rather innocuous, as the BBC reports, with demands that participants “wake up in the middle of the night” or “watch a scary film.” As participants get deeper into the challenge, however, the tasks become far more menacing. Typically, 50 tasks are performed over the course of the challenge.
Tasks begin to incorporate self-harm and other damaging acts before ultimately culminating in the final act… suicide.
The trend has appeared on social media numerous times since its inception, typically stirring up panic — and very few legitimate cases — before fading into obscurity once again. In total, very few deaths have been definitively linked back to the Blue Whale challenge, but that hasn’t stopped it from inciting alarm around the nation.
So, what does the Blue Whale challenge look like on TikTok?
On TikTok, the challenge has taken on a slightly different shape. As described by one user, the profile picture depicts a person in a black wig with a large mouse nose secured over their face. The video warns people away from replying to commenters asking if users “wanna play a game,” which is how people begin playing.
The real fear that is spreading on TikTok relates to those who don’t participate. Videos warn that even the disinterested will be forced to participate if they engage with profiles involved in the challenge. One video claims that anyone who clicks a link sent via direct message or comment section will be forced to participate… “or they will kill your family and friends.”
So far, very few suicides have been linked back to the challenge. If teens choose to participate, however, the challenge has the power to become extremely deadly. But although it is causing panic at the moment, there’s a good chance it will fade back into obscurity soon.
TikTok’s hoaxes vary.
But a common one includes warnings about a wide-eyed, dark-haired woman known as “Momo”. She threatens users who don’t do the violent tasks she demands of them. Another is based on a 50-step challenge that starts innocuously but ramps up to the final task, the task that challenges users to commit the unthinkable — I’m talking suicide!
Another, the blackout challenge, requires participants to choke themselves to the point of passing out and then waking up a few minutes later. While choking is dangerous in itself, the death of a 12-year-old trying to imitate this challenge brought it into the limelight.
Despite all these dangerous challenges, TikTok is hardly doing anything to save the lives of those young teens who are unaware of the dangers it poses to their very existence. It is time to act. It is time to challenge the dominance of TikTok and its ambition to rule the virtual world — the Facebook way.
SHOULD YOU DELETE YOUR ACCOUNT ON TIKTOK?
With more than one billion monthly users, TikTok has taken the world by storm since its launch in September 2016. But like so many social media apps, it’s not all rosy. TikTok has a dark side that only a few know about. You might want to think twice about using the app, and seriously consider deleting it now, before it’s too late.
TikTok’s format of short videos has been linked to decreased attention spans when the app is used for more than 90 minutes a day. This problem has become so severe that TikTok was forced to hire influencers such as Gabe Erwin, Alan Chikin Chow, James Henry, and Cosette Rinab to ask users to take breaks. It even created pop-up warnings to encourage users, 60 percent of whom are under the age of 24, to stop scrolling.
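As a toy illustration of the kind of “take a break” nudge described above, here is a minimal sketch. The 90-minute figure comes from the paragraph; the function name, threshold handling, and message text are my own inventions, not TikTok’s actual implementation.

```python
from datetime import timedelta
from typing import Optional

# The 90-minute figure comes from the article; the message wording and
# this function are invented for illustration only.
BREAK_THRESHOLD = timedelta(minutes=90)

def maybe_show_break_prompt(session_length: timedelta) -> Optional[str]:
    """Return a 'take a break' reminder once a scrolling session passes
    the threshold linked to decreased attention spans."""
    if session_length >= BREAK_THRESHOLD:
        return "You have been scrolling for a while. Time to take a break?"
    return None

print(maybe_show_break_prompt(timedelta(minutes=95)))  # prompt shown
print(maybe_show_break_prompt(timedelta(minutes=20)))  # None, keep scrolling
```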
In March 2020, The Intercept got its hands on internal TikTok documents saying that moderators should suppress posts by users who were “too ugly, poor, or disabled”. This makes TikTok’s ambition to replace Facebook look worse, not better.
TIKTOK’S DYSTOPIAN DATA COLLECTION TECHNIQUES
All the apps on our phones track us in some way. And in some way, we’ve learned to live with that. But while social media has always been one of the worst culprits, TikTok’s data collection techniques are particularly dystopian.
Believe it or not, TikTok watches you all the time. It watches what you write in your messages to your friends, even if you never hit that send button. It also requests access to your phone’s model, screen resolution, current operating system, phone number, email address, location, keystroke patterns, and even your contacts list.
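To give a sense of the scope of that list, here is a rough sketch of a data record covering the categories named above. The class and field names are invented for illustration only; they are not taken from any real TikTok schema or SDK.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Field names mirror the categories listed in the paragraph above.
# The structure itself is hypothetical, not a real telemetry format.

@dataclass
class DeviceProfile:
    phone_model: str
    screen_resolution: str
    operating_system: str

@dataclass
class CollectedData:
    device: DeviceProfile
    phone_number: Optional[str]
    email_address: Optional[str]
    location: Optional[Tuple[float, float]]  # latitude, longitude
    keystroke_patterns: List[float]          # e.g. inter-key timing
    contacts: List[str]                      # entries from the address book
    draft_messages: List[str]                # text typed but never sent

example = CollectedData(
    device=DeviceProfile("Pixel 6", "1080x2400", "Android 12"),
    phone_number="+1-555-0100",
    email_address="user@example.com",
    location=(51.5072, -0.1276),
    keystroke_patterns=[0.12, 0.31, 0.09],
    contacts=["Alice", "Bob"],
    draft_messages=["never actually sent this"],
)
print(example.device.phone_model, len(example.contacts))
```

Seen laid out like this, it is clearly far more than a short-video app needs to show you 15-second clips.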
None of that seems important if you just want to watch 15-second clips. But it does become important when TikTok becomes a danger to your privacy. And that’s not where your problems with TikTok end. Security researchers have found multiple vulnerabilities in the TikTok app; hackers, for example, have been able to use SMS messages to gain unauthorized access to accounts.
And then there’s the big issue of your mental health.
The toll on your brain comes in many forms. You’ll find ample cases of the usual social media scourges, including harassment, abuse, and cyberbullying, among others.
But the problem with TikTok runs much deeper.
For example, many younger users have uploaded sexually provocative content, while there have also been cases of ex-partners attempting to ruin their previous partners’ lives by uploading videos and photos from their old relationships. This has real-world consequences for users. In Egypt, five women have been sentenced to two years in prison for “violating public morals” in their TikTok videos.
Then there’s the never-ending stream of anti-Semitism, racism, and xenophobia. There have even been cases of ISIS using the platform to promote their extremist propaganda. All these issues can lead you on a path you don’t want to go down.
SO, YOU STILL WANNA STAY ON TIKTOK?
Back in 2018, the #DeleteFacebook movement took hold as users protested some of the company’s ulterior motives and suspicious practices. But while Facebook is no angel and unquestionably deserves to be under the spotlight for the decisions it has taken in recent years, TikTok is a whole lot worse. It is a pandemic within a pandemic.
The bottom line is quite simple: You should not have an account on TikTok, you should not have the app on your phone, and you should not encourage other users to sign up.
End of story…
WHY ARE LARGE COMPANIES QUITTING TIKTOK?
On November 26, UK-based cosmetics retailer Lush said it’s quitting TikTok. It also said that it’s finally had enough of Facebook, Instagram, and Snapchat. The cosmetics company attributed its decision to ‘the latest information which clearly lays out the harms that young people are exposed to because of the current algorithms’.
This directly refers to Facebook, now known as Meta, and to TikTok, soon to possibly become known as Facebook.
The company likened social media to a ‘dark and dangerous alleyway’, calling such platforms ‘places no one should go’. ‘There is now overwhelming evidence we are being put at risk when using social media,’ the company said in a statement. It cited social media’s reliance on harassment, harm, and manipulation.
This is not the first time brands have boycotted social media platforms for hosting harmful content. Last year, companies including Adidas, Unilever, and US telecoms provider Verizon paused their advertising spending on Facebook for a month. Even global coffee chain Starbucks reportedly considered quitting Facebook over the constant struggle to moderate hateful comments on its pages.
But none of them has been able to stay away for long. And there’s a good reason for that: quitting social media means switching off a channel through which customers reach, and criticize, their brands.
THE CONCLUSION
It’s time to study the health impacts of TikTok, knowing that there’s hardly any research on the platform.
Despite the fact that TikTok now has over one billion users, public health researchers know very little about the health effects it might be having on the platform’s mostly young users. That’s concerning because researchers know social media platforms can be harmful, especially to the health of some groups of teens and adolescents.
“The platform is so massive, but there’s almost no scholarly investigation into it,” said Marco Zenone, a health policy researcher at the London School of Hygiene & Tropical Medicine. “The research is very behind, considering how much reach TikTok has,” Zenone added.
Regardless of what researchers like Zenone think or say, TikTok is always one step ahead of the competition. On December 2, 2021, TikTok announced it’s rolling out Creator Next, a single hub for all of the platform’s monetization tools. Alongside Creator Next, it’s also introducing its tipping feature, which lets users send money directly to their favorite creators.
This is part of TikTok’s larger efforts to retain creators and prevent them from hopping on other networks that pay creators, like Instagram, Facebook, YouTube, and Snapchat. When the platform was just a year old in 2019, users struggled to find ways to earn money natively through the app, as they had to secure deals and sponsorships outside of the app. To help remedy this, the platform later rolled out its $200 million Creator Fund in 2020, splitting up the cash to reward the platform’s most popular creators.
TikTok started testing Tips as another way for creators to make money last October, but now it appears that the feature is available to all creators who meet TikTok’s eligibility requirements. The qualifications include being over the age of 18, having an account in good standing, and having 100,000 followers or more. If they meet the criteria, creators will receive 100 percent of every tip.
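As a quick illustration of the eligibility rules and payout just described, here is a minimal sketch. The function and parameter names are my own; the thresholds (18 or older, an account in good standing, at least 100,000 followers, a 100 percent payout) come straight from the paragraph above, and TikTok’s real checks are certainly more involved.

```python
# Thresholds below are taken from the article's description of TikTok's
# tipping eligibility; the functions themselves are invented for illustration.

def is_tip_eligible(age: int, in_good_standing: bool, followers: int) -> bool:
    """Over 18, account in good standing, and 100,000 followers or more."""
    return age >= 18 and in_good_standing and followers >= 100_000

def creator_payout(tip_amount: float) -> float:
    """Per the article, creators keep 100 percent of every tip."""
    return tip_amount

print(is_tip_eligible(age=22, in_good_standing=True, followers=250_000))  # True
print(is_tip_eligible(age=17, in_good_standing=True, followers=250_000))  # False
print(creator_payout(5.00))  # 5.0
```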
TikTok is also expanding on its Live Gifts feature, which lets viewers gift diamonds — a virtual currency that’s redeemable for cash — during live streams. TikTok is now throwing Video Gifts into the mix, allowing users to donate diamonds during regular, non-live videos.
What this means is that everything nowadays is becoming TikTok. YouTube and Instagram are taking cues from TikTok’s interface. Virtually everywhere, we’re seeing the same content. Reels, shorts, stories, call them whatever you want, they’re all copies of TikTok. They’re all focusing on generating TikTok-styled content. And they’re all ruthlessly copying TikTok.
There’s no doubt that TikTok is today the app of the moment. But will it succeed in becoming the next Facebook?
Only time will tell…
Please watch this on my YouTube channel https://youtu.be/PUTL-ib70MQ and don’t forget to subscribe and like my videos.