The UK has removed the app from government devices over concerns data can be monitored by the Chinese state, but the public remain vulnerable
TikTok is wildly popular, with more than 1 billion people watching its short videos around the world. But the app is less favoured by politicians in key markets such as the US and UK, where it has been banned from government-issued phones over security fears. We answer your questions about why TikTok has become a lightning rod for suspicion of Chinese state espionage – and whether nationwide bans are likely.
Why has TikTok been banned from UK government phones?
The main concerns with TikTok relate to data and the fact that it is owned by the Beijing-based ByteDance, a Chinese internet company. Could the Chinese state demand access to data generated by TikTok’s global user base and, for instance, create profiles of people it is interested in, such as government employees in other countries?
The UK government raised data concerns in its statement explaining why it was taking the “prudent” step of removing TikTok from government-issued devices. It said the app was able to access user data from devices, including contacts and geolocation. According to a report by the Australian-US cybersecurity firm Internet 2.0, TikTok’s app can access a user’s calendar, other running applications, wifi networks, and even the sim card serial number.
Referring to similar bans by the US, Canada and the European Commission, the UK government said: “The government, along with our international partners, is concerned about the way in which this data may be used.”
TikTok maintains that its data is stored outside China in Singapore and the US. It is also proposing to store American and European user data in third-party servers in the US, Ireland and Norway. This has not been enough to assuage the concerns of critics, who also fear that the Chinese state could manipulate TikTok’s recommendation algorithm, which curates what people see on the app’s For You feed.
Why is it not being banned from personal phones in the UK?
The government said the “proportionate step” did not extend to personal devices for civil servants, ministers or the general public. However, it added that individuals should be aware of “each social media platform’s data policies” before downloading them.
“The bottom line is that if there is a cybersecurity issue for the government users, the same applies to all of us,” says Alan Woodward, a professor of cybersecurity at Surrey University. “The only argument might be that boring users such as me are not of interest, but there are plenty of professionals outside government where confidentiality is very important. Journalism, legal, medical and so on. If there is a security issue that nobody has found – and many have looked – then please tell us all so we can all delete it.”
Why are there concerns about the Chinese state accessing TikTok data and its algorithm?
TikTok is owned by Beijing-based ByteDance, which has led to politicians in the US, the UK and elsewhere voicing fears that Chinese officials could demand access to TikTok’s user data and source code under domestic laws including the National Intelligence Law of 2017, which states that all organisations and citizens shall “support, assist and cooperate” with national intelligence efforts. TikTok says it has not received a request from the Chinese government for its data and that, if it did, it would refuse.
The US government does not accept TikTok's assurances, and this week confirmed that the Biden administration had asked TikTok's Chinese owners to sell their stakes in the company or face a complete ban in a key market where the app has more than 100 million users. TikTok says ByteDance is 60% owned by external investors including the US private equity firm KKR, 20% by its employees and 20% by its founders, Zhang Yiming and Liang Rubo, who carry stronger voting rights than the other shareholders.
Can TikTok user data be accessed within China, including by the Chinese state?
TikTok has long insisted that user data from western nations does not enter China, where ByteDance is based. But time and again, the company has admitted exceptions to this rule.
In 2022, a BuzzFeed investigation revealed a series of instances in which engineers in China had access to US user data, continuing at least until January 2022. The data was “stored” in the US, but with access controls that allowed staff elsewhere to reach it, according to the investigation.
The similar functionality of TikTok and its Chinese sister app Douyin suggests overlap between the engineering teams, but the extent to which resources are shared remains unclear. Analysis of both apps suggests that they may share parts of their source code and are developed from a common code base, according to Citizen Lab. TikTok’s credibility was also damaged last year when ByteDance admitted that employees had used the app to spy on reporters.
Can the recommendation algorithm be manipulated by a state actor?
The app’s much vaunted “For You” page has a complex recommendation algorithm that takes into account a huge range of signals, both explicit and implicit, to decide what content a given user should be shown. Despite efforts to introduce transparency to how the company operates, the “FYP” is, like many recommendation algorithms, a black box for users: the full set of signals, how they are applied, and how they result in any given piece of content appearing are ultimately impossible to see from the outside, and difficult even for TikTok itself to answer, given the nature of modern machine learning techniques.
That means it is also difficult for the company to counter fears that the recommendation algorithm could be silently tampered with by a state actor, either through pressure at a corporate level, or through corrupting a small number of employees directly. In 2019, the Guardian revealed that TikTok’s moderation guidelines helped promote Beijing’s view of foreign policy, banning the promotion of separatist agendas in Tibet and Northern Ireland; the company says it now writes its moderation guidelines locally, but some fear that a similar approach would be harder to uncover if applied through automatic selective promotion and demotion of videos.