December 5, 2020

TikTok invites UK lawmakers to review algorithm after China probe

A man holding a phone walks past a sign of Chinese company ByteDance’s app TikTok, known locally as Douyin, at the International Artificial Products Expo in Hangzhou, Zhejiang province, China October 18, 2019.
LONDON — TikTok has invited British lawmakers into its offices to review its algorithm after a parliamentary committee questioned the social media app’s links to China’s ruling Communist Party.  

Elizabeth Kanter, TikTok’s U.K. director of government relations and public policy, invited the committee to visit its “transparency center” and review its code, as well as how it moderates content.

The invite came after Kanter was questioned by Conservative Member of Parliament Nusrat Ghani on whether TikTok or its Chinese parent company ByteDance censors content that China doesn’t like, including content related to the persecution of Uighur Muslims in Xinjiang, where over a million people are being held in so-called “re-education” camps.

“We do not censor content,” Kanter told the committee Thursday on a video call that was broadcast online. “I would encourage you to open the app and search for Tiananmen Square, search for Tibet, you will find that content on TikTok.”

“We do not moderate content based on political sensitivities or affiliation,” Kanter said. “There’s no influence of the Chinese government on TikTok.”


Content moderation challenges

TikTok is the international version of Chinese app Douyin, which is also owned by ByteDance.

Kanter said that, because of its corporate structure, TikTok is under no obligation to share user data with the Chinese government, and that the company would refuse to hand over data if asked. TikTok user data is currently stored on servers in the U.S., with a backup in Singapore.

“None of our user data goes to China,” she said. “The Chinese government has never asked us for any user data and of course if they did, we would not give them any data.”

TikTok isn’t the only tech company being probed over its content moderation decisions. Facebook, Twitter, YouTube and other social media companies are all being scrutinized more than ever.

The platforms, which have billions of users between them, are constantly trying to remove inappropriate content, including depictions of terrorist acts, nudity, hate speech, drug abuse and misinformation.
