The whole situation in January left me with way more questions than answers
I’ve been staring at this blank document for months now, trying to figure out how to write about the TikTok ban without sounding like I’m taking a side in something I still don’t fully understand. Every time I think I’ve got a handle on it, another angle occurs to me that makes me question everything I thought I knew.
It all started when I was scrolling through news headlines in January — which, let’s be honest, was already a wild month for completely unrelated reasons — and suddenly TikTok was banned. Just like that. One day people were posting their usual dance videos and recipe hacks, the next day the app was gone from app stores and existing users couldn’t access it. Then, within hours, it was back. The whole thing lasted less than a day, but it felt like watching a soap opera where they kill off a major character and then reveal it was all a dream in the same episode.
The whiplash was real. My group chat exploded with people freaking out about losing their accounts, then celebrating when it came back, then arguing about whether any of it made sense. Some friends were relieved and others were angry that it happened at all. I just sat there feeling like I’d missed some crucial piece of information that would make it all click into place.
That’s when I realized I had no idea what I actually thought about any of this. Not about TikTok specifically, not about data privacy, not about how we handle tech companies from other countries. I’d been coasting on vague impressions and half-remembered news articles, but when push came to shove, I couldn’t even explain to myself whether I thought the ban was a good idea or not.
So here I am, months later, still trying to untangle it all. But this time I’m actually going to dig into what happened and what the arguments really are, instead of just wallowing in my own confusion.
What Actually Happened (The Timeline That Tells a Story)
The facts are pretty straightforward, even if the politics aren’t. The law named ByteDance Ltd. — TikTok’s Chinese parent company — and TikTok itself as “foreign adversary controlled,” with a divestment deadline of January 19, 2025. The Supreme Court upheld the ban, meaning that starting on Jan. 19, Apple and Google could no longer offer TikTok in their app stores, and web-hosting providers had to cut ties with the platform or face fines of $5,000 for each user who could still access the service.
TikTok officially went dark in the United States on January 19, 2025, as the federal ban took effect. But here’s where it gets interesting: Trump signed an executive order on his first day in office delaying the app’s ban by 75 days, effectively resurrecting it hours after the platform had gone dark.
The timing wasn’t coincidental — it was choreographed. The original deadline fell on Inauguration Day, a perfect setup for the incoming administration to look like saviors while the outgoing administration could claim it was tough on national security. That executive order pushed enforcement to April 5.
But the story didn’t end there. On April 4, 2025, just before that deadline, Trump delayed the ban by another 75 days, to June 19. And then, just a week ago on June 19, 2025, a third delay of 90 days pushed enforcement to mid-September.
Here’s what I find genuinely concerning: these repeated delays don’t solve anything. The Supreme Court ruling still stands. ByteDance still hasn’t sold TikTok to an American company. We’re basically in the same place we were before, just with the clock reset multiple times.
The Security Arguments That Actually Matter
I spent way too much time reading security analyses, and honestly, the concerns aren’t as vague as I thought they were. Security experts identify three main risks: TikTok serving as part of a Chinese government influence operation designed to sway U.S. politics, TikTok being used to collect personal data on Americans, and the app being used to inject malicious code onto devices.
The data collection part is where things get concrete and scary. In December 2022, TikTok admitted that employees had spied on reporters using location data, in an attempt to track down the source of leaked information. The employees involved were fired, but reporting also indicated TikTok had planned to surveil the locations of specific American citizens. That’s not theoretical — it actually happened. Intent aside, it’s safe to say this crossed a line — and the U.S. government was not impressed.
The FBI is concerned that the Chinese government could use TikTok to influence American users or control their devices. There’s also evidence that TikTok’s in-app browser can perform keylogging, which theoretically means it could collect passwords, credit card information or other sensitive data that users submit to websites when they visit them through TikTok.
But here’s where I get frustrated with the whole debate: there is no public evidence TikTok has actually used keylogging maliciously, and TikTok says the function is used for “debugging, troubleshooting, and performance monitoring,” as well as to detect bots. Many Silicon Valley companies also do this.
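For the curious, here’s roughly what that dual-use capability amounts to at the code level. This is a minimal TypeScript sketch of listener-based keystroke capture — the general technique any script injected into an in-app browser page can use — not anything from TikTok’s actual code; the types and function names are my own illustration.

```typescript
// Hypothetical sketch of keystroke capture via an injected script,
// assuming a minimal event-target interface like the browser's Document.
type KeyEvent = { key: string };
type KeyEventTarget = {
  addEventListener(type: "keydown", handler: (e: KeyEvent) => void): void;
};

const capturedKeys: string[] = [];

// One listener sees every keystroke on the page, including characters
// typed into password and card-number fields.
function attachKeyLogger(target: KeyEventTarget): void {
  target.addEventListener("keydown", (e) => capturedKeys.push(e.key));
}

// The benign uses TikTok cites ("debugging, troubleshooting, and
// performance monitoring") rely on the same hook: at the API level,
// counting keystrokes is indistinguishable from logging their contents.
```

The unsettling part is that last comment: legitimate telemetry and outright keylogging look identical from the outside, which is exactly why researchers flagged the capability without being able to prove malicious use.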
So we have real capabilities for surveillance and influence, documented instances of inappropriate data access, but no smoking gun proof of active malicious use against American users. That’s a genuinely difficult risk assessment, not the clear-cut threat some politicians make it out to be.
Why I Think the “Chinese Threat” Framing Is Both Right and Wrong
The national security concerns are legitimate, but I can’t shake the feeling that we’re being inconsistent about how we apply them. ByteDance is subject to Chinese laws that could compel cooperation with intelligence services — that’s a real structural problem that doesn’t exist with American tech companies, even though they collect just as much data.
The U.S. government’s main argument is that the Chinese Communist Party could potentially influence TikTok’s users by controlling user feeds, suppressing dissent, or spreading disinformation. This isn’t paranoia — it’s how authoritarian information control works.
But here’s what bothers me: we’re treating TikTok as uniquely dangerous when American social media platforms have been used for election interference, radicalizing extremists, and spreading misinformation for years. Facebook literally enabled genocide in Myanmar. Twitter was a primary vector for election conspiracy theories. YouTube’s algorithm has pushed people toward increasingly extreme content.
I’m not saying Chinese control isn’t worse than American corporate irresponsibility — I think it probably is. But the fact that we’re only now getting serious about social media as a national security threat, and only when it’s foreign-owned, suggests this is as much about geopolitics as it is about genuine security.
What TikTok Actually Means to People (And Why That Matters for Policy)
“As the January 19th deadline approaches, TikTok creators and users across the nation are understandably alarmed. They are uncertain about the future of the platform, their accounts, and the vibrant online communities they have cultivated.”
This quote from Senator Markey captures something important that gets lost in the security debates: TikTok isn’t just a data collection platform, it’s a creative and economic ecosystem. Small businesses built customer bases there. Artists found audiences. Social movements organized. People made actual livings.
When we talk about banning TikTok, we’re talking about destroying all of that overnight. That’s not necessarily wrong if the security risks are severe enough, but it’s a real cost that deserves to be weighed honestly against theoretical benefits.
What frustrates me is that the debate treats this as either “ban it completely” or “do nothing.” TikTok has taken steps to address security concerns, including creating TikTok U.S. Data Security Inc., a subsidiary designed to address national security concerns and maintain transparency and oversight. But it’s not clear whether these measures are sufficient or just security theater.
The Solutions We’re Not Seriously Considering
Instead of this all-or-nothing approach, why aren’t we talking about comprehensive data privacy regulation that would apply to all social media companies? The EU’s GDPR model isn’t perfect, but it creates consistent standards for data protection regardless of where companies are based.
We could require all social media platforms operating in the US to store American user data on American servers with American oversight. We could mandate algorithm transparency. We could create real penalties for companies that misuse data, whether they’re Chinese, American, or from anywhere else.
The U.S. government’s objections to TikTok’s data collection center on American users’ sensitive personal information, and TikTok has responded with security measures like its U.S. data-security subsidiary. But we haven’t seriously tested whether those measures could be sufficient with proper oversight and enforcement.
Where I Actually Stand (Finally)
After digging into all this, here’s what I think: the security concerns about TikTok are real and legitimate, but our response has been driven more by geopolitical theater than by consistent principles about data privacy and platform security.
I think ByteDance should be required to sell TikTok to remove the structural problem of Chinese government access. But I also think we should use this as an opportunity to create comprehensive social media regulation that addresses the privacy and influence problems across all platforms, not just the foreign-owned ones.
The January ban-and-reversal dance was political theater that helped no one. It scared creators and users without solving any security problems, and it let both political parties score points without doing the hard work of crafting actual policy solutions.
What we need is boring, technical regulation that treats data privacy and platform security as the serious, complex problems they are, rather than as opportunities for geopolitical grandstanding. But boring technical solutions don’t make for good political theater, so I’m not holding my breath.
At least now I know what I think, even if I’m not particularly optimistic about what we’ll actually do about it.
Author’s Note — This blog is exclusively dedicated to academic purposes within the school context and is committed to maintaining a neutral stance, avoiding any political influences or affiliations.
