Nazak Nikakhtar served in the Commerce Department during the Trump administration as assistant secretary for industry and analysis in the department’s International Trade Administration, and she also served as acting under secretary for the Bureau of Industry and Security. In those positions, she helped shape U.S. trade policy toward China and led the Commerce Department’s work with the Committee on Foreign Investment in the United States (CFIUS). Now a lawyer at the Washington, D.C. firm Wiley Rein, she chairs the firm’s national security practice, advising government agencies and private companies and monitoring U.S. efforts to regulate Chinese technology companies, including TikTok. Nikakhtar spoke to The Wire China about the legal paths that could result in a TikTok ban or sale. (TikTok has maintained that it does not store user data in China and is not influenced by the Chinese government, and that it is working with American authorities on a mitigation plan.)

Illustration by Lauren Crow
Q: Former President Donald Trump tried to ban TikTok from app stores on national security grounds through the International Emergency Economic Powers Act (IEEPA), but courts found he overstepped his authority. What can we learn from that?
A: The Trump administration was absolutely right about the threats that TikTok poses. We were ahead of the curve, because at that time a lot of people questioned it. Now everybody has come to the realization that TikTok is a problem, in that we don’t have visibility into the data they’re collecting and how they’re using it.
As a policymaker, you’re always faced with, what are my legal options? And what are the litigation risks with respect to certain options? I never believed that the notion of mandating that app stores prohibit users from downloading the TikTok app was a viable strategy because as a litigator, as an attorney, I know that the impacted party can ask the court to impose a judicial order that restrains the federal government’s actions. And that’s exactly how it happened.
| BIO AT A GLANCE | |
|---|---|
| FORMER POSITION | U.S. Department of Commerce: Assistant Secretary for Industry & Analysis at the International Trade Administration, 2018–2021 |
| CURRENT POSITION | Chair of the National Security Practice, Wiley Rein LLP |
I’ve proposed the legal solution of putting ByteDance, the parent company, on the U.S. government’s Entity List, which restricts U.S. exports of goods, software, and technology to the listed company. As a result, if American users are not able to upgrade their app with software updates, which involves exports of software, then the TikTok app would potentially degrade over time. TikTok would die slowly on the vine. When Huawei went on the Entity List, if you were in the United States and you had Huawei handsets or Huawei equipment, the upgrades of those would have been subject to Entity List regulations. The idea was software would degrade over time.
The debate at the time was, if you update an app, is that an actual export of software? After much debate, the realization was that you can actually regulate TikTok this way. By that time, the damage was done. The legal path taken by the U.S. government was ultimately challenged and then blocked by the courts. The U.S. government was correct to act, but the legal path was not the optimal one.
What role can Congress play?
We have the Information and Communications Technology and Services (ICTS) Supply Chain Executive Order that President Trump issued in May 2019. Broadly, the order gives the president the authority to block ICTS transactions with foreign adversary nations. President Biden has since endorsed it. The notable aspect here is that the order was issued under the International Emergency Economic Powers Act (IEEPA) authority. The Berman amendment [a Cold War-era amendment] limits IEEPA’s reach in that IEEPA cannot be used to regulate certain informational materials, such as artistic expression, protected by the First Amendment.

We fundamentally need to modernize the Berman amendment. If I were Congress, I would develop legislation that creates an exception under the Berman amendment for technology platforms offered by foreign countries of concern that pose surveillance risks. Platforms that enable artistic expression but can be misused by foreign countries of concern for dangerous surveillance could rationally be understood not to be a form of artistic expression that we would want to protect as free speech. A digital technology platform is not speech; it’s simply a platform.
Would there be a risk that U.S. social media companies could get caught up in a revision to the Berman amendment, since people from around the world post on their platforms, and most of what people post to platforms like Facebook and Instagram is public?
| MISCELLANEA | |
|---|---|
| FAVORITE BOOKS | Economics textbooks (I love textbooks). |
| MOST ADMIRED | My father and mother. Also, George Washington and Clara Barton. |
It’s not about what somebody outside the company could amass; it’s really about what a digital platform has access to. It’s not the information that you make public on its face; it’s the user account and data that’s not necessarily public. Even more so, the issue is the amassing of public information and the sorting and analyzing of it in a way that makes the analytic output and results nonpublic. That’s where I would want to see regulation extending to foreign countries of concern. Who are those? Russia, China, Cuba, North Korea, Venezuela, and Iran. Most of them are sanctioned; we aren’t doing business with them, right? Really, it’s going to be China.
If you have legislation that basically says that, for social media platforms owned or controlled by an entity governed by a country of concern, the Berman amendment exception wouldn’t apply, that would allow the ICTS executive order to be leveraged to ban or regulate TikTok. You can still use the Entity List, because that’s a separate legal authority.
How does a company get on the Entity List? Can a company challenge the designation?
There’s a group called the End-User Review Committee, composed of interagency [representatives from] Commerce, Defense, State, and the Department of Energy. Somebody within the government can nominate a company, and external parties can too, but external parties are not formally recognized as part of the process.

I’m going to read you the actual legal requirement. “Entities for which there is reasonable cause to believe based on specific and articulable facts, that the entities have been involved, are involved, or pose a significant risk of being or becoming involved in activities that are contrary to the national security or foreign policy interests of the United States.” [So] I don’t need [to have] uncontroverted facts; I need reasonable articulable facts. I don’t need to have concrete proof; I just need to have a reason to believe that [a company] may be acting in ways that are contrary to the foreign policy interests of the United States.
That’s a pretty flexible standard, right? If you are pretty convinced that an entity is harnessing data for surveillance capabilities, without knowing what the potential for surveillance capabilities is — it could be for AI, facial recognition, whatever — that’s certainly contrary to the foreign policy interests of the United States.
An Entity List designee rarely challenges the designation because the legal threshold is so low, and if the interagency has already worked to designate the entity, they are probably pretty much locked in on the position that you need to be on the Entity List. If you do try to take it to the courts, it’s a national security determination. The courts are going to defer to the government, because you’re using a national security tool to deal with national security risks. So it’s going to withstand any potential litigation.
I know folks on the Hill are looking at other legislative options. Now, is legislation much more enduring than an Entity List designation? Absolutely. But if you’re going to think about it in a smart way, how do you convince other lawmakers that legislation is merited? Put TikTok on the Entity List. That way you have taken a big bite at the problem. Once people get desensitized to the initial shock of ‘Oh, my God, we can’t use TikTok anymore,’ legislation becomes easier to pass. At that point, it’s one more blow once you’ve done the hard thing.
To me, it’s unfathomable that such an easy, straightforward solution exists but nobody has pursued it. My hunch is: who wants to get a bunch of teenagers mad at them? But the notion that I don’t want teenagers or celebrities mad at me is not a national security strategy.
In March, Senators Warner and Thune introduced the Restrict Act, which would give the Commerce Department authority to ban or restrict certain technologies from China, Russia, Iran, North Korea, Cuba or Venezuela on national security grounds. The bill has support from the White House and would revise part of the Berman amendment. What do you think of this bill, and its potential to withstand legal challenges?

The bill on its face is robust. Even though it is not immune to challenge — especially when used to regulate freedom-of-speech or -expression type activities — the legislation aims to address substantial risks to national security and so it is likely to withstand legal scrutiny. Ultimately, U.S. courts will likely find that digital platforms that enable freedom of speech or expression, while also enabling surveillance capabilities by adversaries, cannot be classified as protected speech. Fundamentally, a digital platform is not the same as the content. The latter is speech, the former is not.
As for the data that you could put on TikTok, a lot of it is public. But what people don’t understand is that when you amass literally millions of people’s public data and use algorithms to sort through and analyze it, you’ve taken public information and done something with it that’s not public. That’s high risk.
What do you think of the bill that was recently approved by Republicans on the House Foreign Affairs Committee, which would also give the president power to restrict foreign communications technology?

The key aspect of this bill is Congress’s effort to restrict data flows to foreign adversaries. This is important; the time to impose restrictions on data transfers to foreign adversaries is past due. It is disheartening to see this debate become a party-driven issue rather than a national security-oriented one. National security should always garner bipartisan support, especially given that the U.S. intelligence community has repeatedly determined TikTok to be a national security threat, and in light of the EU’s recent determinations that TikTok has been misusing children’s data.
Folks are waiting for Congress to pass legislation to address these risks, and the legislation has to be flexible and broad. If Congress leaves any gaps in the law and, consequently, limits the executive branch’s ability to act, our adversaries will exploit the gaps. If the Restrict Act made explicit that digital platforms that could potentially be used for surveillance by foreign adversaries are not protected speech, then that would make a big difference.
What specifically are the concerns about a Chinese or other foreign company being able to amass and utilize Americans’ data?
It’s unclear what the data may be used for. We know that they’re amassing millions of individuals’ data, aggregating them, and analyzing ways to weaponize the data against U.S. populations or sub-populations. Public data that is aggregated from millions of individuals and then analyzed using software results in output that is essentially non-public. When the results are non-public, they should be regulated. These apps by malign providers are also used as a method to exert soft power and propaganda. They’re very dangerous.
Groups such as the American Civil Liberties Union and the Electronic Frontier Foundation have criticized efforts to ban TikTok. Senator Rand Paul objected to the legislation on free speech grounds, and compared its scope to the Patriot Act. How should lawmakers balance the First Amendment and artistic expression with security concerns?
Free speech and artistic expression are individual rights. But when all of that gets amassed across an entire population, that’s when you cross the line into national security risk. Should everybody have the ability to post funny videos of themselves dancing and things like that? Absolutely. But that doesn’t mean that China, a foreign adversary, has a right to scoop up all of that from U.S. citizens, with us giving it to them on a silver platter.

First and foremost, we have to call out the risk for what it is. And we say: look, we are still committed to the ideals of free speech and artistic expression. But the reality is that China is a foreign adversary that is exploiting our open markets and our freedoms to collect, collect, collect all this data and potentially misuse it. We don’t have to articulate every single way that they’re going to misuse the data right now, because we don’t know what technology will be capable of tomorrow.
I don’t need to articulate how those things are going to be weaponized, as long as I can articulate that these are the items that have the potential to be weaponized. We don’t expect our citizens to know all the risks; they should be able to trust the policymakers in Washington who have been tasked with the responsibility to protect everybody.
If you’re not regulating the behavior, just the digital platform or app, it’s not really an artistic expression issue. You’re just saying that this is a company subject to the demands of a foreign government. So long as TikTok is controlled by these guys, we don’t want it here. Everybody can figure out somewhere else to post their videos; we just don’t want this foreign adversary to have access to millions of people’s data.
It’s not the people’s rights that are being infringed on; it’s who is exploiting those rights. This is where the Berman amendment is misread and misused in very dangerous ways. The goal is not regulating the activity; it’s regulating the digital platform and cutting off the recipient’s access to that activity. You’ve got to cut off access by the recipients, not the actual activity. That’s an important distinction that people need to make.
The Committee on Foreign Investment in the U.S. (CFIUS) has been investigating TikTok since 2019. What might a negotiated agreement look like?
I haven’t seen anything that’s been worked on right now, so I don’t have personal knowledge. But you would probably have a mitigation agreement between TikTok, ByteDance, Oracle or any third-party intermediary, and the government, one that goes through interagency review by the CFIUS members. You’re going to have a lot of debate among CFIUS members asking, how do we know if anybody’s going to adhere to it?
Visibility is always an issue. When the U.S. Government enters into a mitigation agreement with the foreign acquirer and the U.S. acquired entity, the [U.S.] government could mandate that an independent monitor come in and ensure that the parties are adhering to the terms of the agreement. You can periodically have audits, but you’re likely not going to have a forensic auditor who’s going to look at digital fingerprints of data flowing over to the Chinese. If TikTok were to transmit information, they’re not going to do it in a way that’s easily detectable. To be clear, the U.S. Government does not appear to have concluded a mitigation agreement with ByteDance and the impacted entities.
If you’re intent on undermining the rules to satisfy your parent company, in a country that is beholden to the government, and the government is demanding that you do scary things, you’re probably going to have a workaround.
The government’s putting itself in an unnecessarily tenuous position because if you take CFIUS’s most important tool and show in a very high level public instance that it’s utterly failed, it’s going to shake everybody’s confidence. If there’s a whistleblower in the company that decides to do the responsible thing and say, I see that things are not going the way that they’re supposed to and information is somehow getting back to the CCP, that could really undermine the entire process that CFIUS has instituted for many years, which is relying on these mitigation agreements to address national security risks.
During a Congressional hearing in March, TikTok CEO Shou Zi Chew emphasized that the company is working to store data from American TikTok users on Oracle servers in the U.S., and that it is having Oracle and other third parties audit its algorithms. How much does that change the threat or the legal options?
We don’t know if everything is being stored in the United States. Even if it were, upgrading the app is still an export. You can still use the Entity List. I don’t think most people have seen the terms and conditions under which Oracle promises to do X, Y, and Z. But understanding the CFIUS process and that all parties have to agree to what is being set in stone, in terms of oversight, supervision, and management of data, I am pretty sure that it’s not going to be as fulsome as one would want, because ByteDance would never have agreed to it. The fact that the data is still accessible by TikTok and ByteDance provides no assurance whatsoever that the dangerous information isn’t going to leave the United States.
It’s really hard to say what the parties are proposing. Having seen too many instances where the government and companies fully think that they’ve addressed national security risks, only to find out that things were circumventable, I don’t believe it’s adequate.
The Washington Post has reported that, under an agreement, the U.S. government would be able to veto appointments to the three-member board and to top executive positions, and also set some hiring standards for ByteDance in the U.S. Is there any legal precedent for how that would work?
That’s definitely within CFIUS’s purview. At the end of the day, so what? You get board members that you like, and you trust, and they might not know what they don’t know. Board members don’t have 100 percent visibility, even if they demand it. If a company is intent on behaving in a way that is against U.S. interests and that violates the privacy of millions of U.S. citizens, it’s not like they’re going to get board approval. This notion that the board is going to fix the problem is just so absurd, and so laughable, I just don’t even know what to say about it.
Unless somebody figures out a surefire way to detect transfers of data, and do perfect forensics that always capture the transfer of data from all platforms that are available, what are you doing? I’m not a data forensic specialist, but I know, having been in the national security space long enough, that there is no foolproof data forensic capability that exists. If you’re intent on undermining the rules to satisfy your parent company, in a country that is beholden to the government, and the government is demanding that you do scary things, you’re probably going to have a workaround.
Recently there’s been renewed talk of a forced sale of TikTok. [TikTok has been critical of such a plan, with a spokesperson saying the focus should be on restricting data flows for all social media companies.] Has there been a similar forced divestment for a company of this nature or a media company?
It’s happened before when companies don’t notify CFIUS of their U.S. transactions, and CFIUS subsequently discovers that a foreign investment has taken place posing national security risks that cannot be mitigated. In these instances, CFIUS can unwind the transaction if parties don’t voluntarily unwind. CFIUS can also block proposed transactions from going forward. The government can even acquire private property. But this is really unprecedented, with respect to an app with such widespread use and popularity, for CFIUS to require a forced divestment. And can that even be done with an app? How do you break apart the U.S. component from the global TikTok app? What would it be called, who would own it, and how would the U.S. purchaser acquire the software/algorithm that would make TikTok function as it had before? Would ByteDance hand over this information? Obviously, no, as we’ve seen to date.
If you can project forward a year from now, where do you think we’ll be as far as TikTok’s position in the U.S. and what kind of legal structures might be in place?

The Warner/Thune legislation [in the Senate] is likely to pass in some form, given that it is based on — and importantly strengthens — already existing legal authorities. However, the passage of the bill merely gives the executive branch authority to take action. Taking real action to prohibit high-risk transactions — especially those as large and significant as prohibiting the TikTok app — is a very different proposition.
Transactions that are high-risk necessarily involve digital platforms with millions of users — that’s the characteristic that makes them so dangerous: the fact that so many Americans’ data are exposed. At the same time, the popularity of these platforms is what makes so many politicians reluctant to interfere. Obviously, we can’t have it both ways, so I see the pendulum soon swinging in the direction of national security.
I do think that there’s going to be some legislation, but ultimately whatever legislation passes will be watered down. It’s going to delegate something to the executive branch, and there’s going to be an enormous reluctance by the executive branch to do anything that really packs a huge punch. This is what everybody does: they delegate responsibility to each other. It’s just this paralysis and fear across the U.S. government, this sense that we don’t want to do something that makes either the public or corporate leaders mad at us. Again, it’s not good governance, because if the risks are there, you need to respond to the risks and not worry about who is going to get mad at you.
Pursuing these small, narrow bites at the apple when China’s whole modus operandi is to circumvent is not serving us. I would actually go so far as to say we’re wasting resources. We have legislation that bans the entry into the United States of items made with forced labor [the Uyghur Forced Labor Prevention Act], yet in reality the nation hasn’t blocked the vast majority of those products from entering the U.S. market. There’s no reason for this lack of enforcement: the government has the data to detain high-risk imports for further investigation, potential supply chain audits, and ultimately import prohibition.
Alan F. Estevez, Under Secretary of Commerce for Industry and Security, discussing the semiconductor export control rules, October 27, 2022. Credit: CNAS
Also, with respect to the October 7 export control rules on semiconductor technology, the U.S. government gave a number of large semiconductor companies temporary waivers. We’re wasting really valuable resources on these small, inconsequential bites at the apple, when we need to be doing something that’s straightforward, clear, and effective.
Do you think it’s impossible for a Chinese tech company that collects any sort of personal data to operate in the U.S.?
No, absent U.S. government action on TikTok, those apps will continue to flow into the United States. Hundreds, if not thousands, of such apps provided by malign foreign entities are currently in widespread use in the United States and globally.
Is the level of trust between the two countries so low that neither country’s tech companies can ever have full access to each other’s markets? What, if anything, could change this situation?
China would have to fundamentally alter its practice and become a truly open, market-based economy for the U.S. to trust it going forward.

Jennifer Conrad is a writer in Brooklyn. She recently received her MA from Johns Hopkins School of Advanced International Studies, and has written for Time Out Beijing, Vogue, SupChina, Newsweek.com, and WIRED. @jenniferconrad