TikTok’s Privacy Concerns Are Well-founded
Originally published on ValueWalk
You’ve probably heard of TikTok by now.
It’s been downloaded nearly 2 billion times, and its active user base is growing rapidly. And though many of these users are teenagers and young adults, the app’s appeal is quickly spreading beyond that demographic. This is probably due, in large part, to social distancing mandates enforced by governments around the world in light of the coronavirus pandemic. Doctors even use TikTok to share health news and updates, and families join together to create wholesome content while stuck in quarantine. Even older celebrities like Jack Black and Jane Fonda are on board, participating in viral coronavirus-themed dance challenges and sharing home workouts.
Sounds pretty innocent, right? So why has this seemingly harmless app come under so much scrutiny? What led Reddit co-founder and CEO Steve Huffman to call it “fundamentally parasitic” and characterize it as spyware? Why has the Pentagon advised all military personnel to delete the app from their smartphones? As it turns out, the app’s detractors have plenty of ammunition to work with — and much of it ties back to how closely the app is intertwined with its home government. Now, as rumblings of a TikTok IPO grow louder, privacy and security concerns are clouding expectations for how the stock would perform. With more than 800 million monthly active users as of late 2019, TikTok’s debut would otherwise likely face few obstacles. Instead, analysts are tempering their forecasts because future regulation could stifle the app’s growth.
The question is: Do users actually care? Despite recent transgressions ranging from egregious data breaches to enabling foreign interference in U.S. elections, only a fraction of users have actually “quit” similar social apps like Facebook and Twitter.
Tracing Questionable Origins
The platform now known as TikTok actually emerged around six years ago as Musical.ly, a lip-syncing app that was popular among children and teenagers.
Unlike other apps targeting this audience segment, Musical.ly didn’t require parental consent for users under 13. And until the last few years, it illegally collected these users’ personal information without giving parents a way to delete that data. Given the app’s popularity, these were violations of the Children’s Online Privacy Protection Act on an unprecedented scale, and they ultimately resulted in a $5.7 million FTC fine.
But that legal battle is just a small part of the app’s unsavory history. In 2017, Musical.ly was acquired by China’s ByteDance for between $800 million and $1 billion. As part of the deal’s terms, ByteDance promised to keep the lip-syncing app separate from its portfolio of Chinese apps, but it merged Musical.ly with a similar app — TikTok — less than a year later.
Since then, ByteDance has become one of the world’s most valuable startups, and TikTok has achieved global stardom. But the larger the app has grown, the more serious the allegations against it have become.
TikTok's Privacy Concerns Mount
Toward the end of last year, Israeli cybersecurity firm Check Point revived TikTok’s security concerns when it identified vulnerabilities in the platform that could have led to the exposure and manipulation of user data. Around the same time, a class-action lawsuit filed in the U.S. District Court for the Northern District of California alleged that TikTok has “vacuumed up and transferred to servers in China vast quantities of private and personally-identifiable user data.”
In China — where the lines between the private and public sectors are blurry at best — no company can flourish outside of the government’s good graces. The prospect of American consumers’ private information falling into the hands of China’s government has shaken politicians and regulators, and the app’s shady history undoubtedly provides cause for concern. But why would a foreign government be interested in data mined from American teenagers?
In short, TikTok relies on the same types of features that make social platforms such as YouTube, Instagram, and Facebook so addictive. Deploying these features requires it to gather a database of user preferences that’s parsed in real time to queue up relevant content.
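To make that mechanism concrete, here is a deliberately simplified, hypothetical sketch of preference-based ranking written in Python. The function names and data shapes are illustrative assumptions, not TikTok’s actual code, and real recommendation systems weigh far richer signals (watch time, rewatches, shares, follows) than this toy example does.

```python
# Hypothetical illustration of engagement-based feed ranking (not TikTok's real algorithm).
# Idea: tally the tags of videos a user has watched, then surface candidate videos
# whose tags best match that running tally of preferences.
from collections import Counter

def build_preferences(watch_history):
    """Count how often each tag appears in the videos a user has watched."""
    prefs = Counter()
    for video in watch_history:
        prefs.update(video["tags"])
    return prefs

def rank_feed(candidates, prefs):
    """Order candidate videos by how strongly their tags overlap with the user's preferences."""
    def score(video):
        return sum(prefs.get(tag, 0) for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)

# A user who keeps watching dance clips will see more dance clips queued next.
history = [{"tags": ["dance", "music"]}, {"tags": ["dance", "comedy"]}]
feed = [{"id": 1, "tags": ["news"]}, {"id": 2, "tags": ["dance"]}]
print(rank_feed(feed, build_preferences(history)))  # the dance clip (id 2) ranks first
```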
Unfortunately, the algorithms driving user behavior on these platforms are prone to misuse. Just take YouTube’s recommendation engine, which can lead users down a winding path of increasingly radical content and has been shown to amplify extremist material. Facebook’s sharing algorithm also helps fake news stories spread, a dynamic that foreign governments have exploited to meddle in elections.
Although issues with these and other platforms pose serious problems for society, TikTok's privacy concerns are heightened because China’s government isn’t afraid to use private data to enforce its will. In Suzhou — a Chinese city blanketed by cameras equipped with facial recognition technology — citizens have been publicly shamed for wearing pajamas in public. In other regions of China, data is used to identify, locate, and persecute Uighur Muslims, a minority group that the ruling Communist Party has labeled a terrorist threat.
There’s also evidence that TikTok has assisted censorship efforts targeting its Chinese users, including documents alleging that TikTok moderators banned livestreams that might be deemed harmful to “national honor.” (That harmful content includes everything from divisive political speech to beer bellies and crooked smiles.)
So how does this affect you?
Why Consumers Still Play a Role
There’s nothing inherently wrong with collecting data. Examples of organizations using data to add value abound: Transportation app Waze uses data to quell congestion and improve the travel experience across cities, and fitness startup Peloton sees data as a way to build upon users’ experiences and customize their workouts.
But we should absolutely be concerned about the possibility of data being used by a foreign government for surveillance purposes. It’s precisely this concern that has prompted some branches of the U.S. military to prohibit the use of TikTok on government-issued smartphones.
The reality is that without an international legal framework in place to ensure that all countries treat consumer data the same way, there’s little to prevent tyrannical governments from using it however they see fit.
That said, engaging with TikTok probably won’t hurt you — as long as you use it thoughtfully. Instead of viewing it as a harmless platform filled with positivity, we should see it for what it is: a giant vat of entertainment that might contain toxins. Drink responsibly!
Because TikTok studies your behavior and makes decisions on your behalf, it’s important you think critically about what you see on the platform. Ask yourself how you feel about the content that’s presented to you.
Is that content based on popularity or on your past behaviors? Does it appear to capitalize on the information you shared with another app or company? How safe do you feel knowing that TikTok (or any other platform) keeps track of your personal preferences? The answers to these questions will help you decide whether to continue using it.
Everyone has a role to play in creating and using social media platforms that make the world better — not worse. By nurturing certain instincts around social media use (asking, for instance, what the implications of the content you consume might be), you can play a more active role in guiding these platforms’ futures.
Modern consumers are quick to exchange personal information for free services and flashy features, and that probably won’t change anytime soon. But by holding the companies that collect our data accountable (and choosing to use platforms that value our privacy rather than treating our data carelessly), we can ensure that playful dance challenges and cat videos aren’t just a facade.
About the Author
Dan Conner is the general partner at Ascend Venture Capital, a micro-VC in St. Louis that provides financial and operational support to startup founders looking to scale. Conner specializes in data-centric technologies that enable the future states of industries. Before founding Ascend Venture Capital, Conner worked on the operations side of high-growth startups, leading teams to build scalable operational and financial infrastructure.