I Helped Build ByteDance's Censorship Machine

Posted: March 5, 2021 at 5:14 am

Originally published in Protocol, Feb 18, 2021.

This is the story of Li An, a pseudonymous former employee at ByteDance, as told to Protocol's Shen Lu.

I wasn't proud of it, and neither were my coworkers. But that's life in today's China.

It was the night Dr. Li Wenliang struggled for his last breath in the emergency room of Wuhan Central Hospital. I, like many Chinese web users, had stayed awake to refresh my Weibo feed constantly for updates on his condition. Dr. Li was an ophthalmologist who sounded the alarm early in the COVID-19 outbreak. He soon faced government intimidation and then contracted the virus. When he passed away in the early hours of Friday, Feb. 7, 2020, I was among many Chinese netizens who expressed grief and outrage at the events on Weibo, only to have my account deleted.

I felt guilt more than anger. At the time, I was a tech worker at ByteDance, where I helped develop tools and platforms for content moderation. In other words, I had helped build the system that censored accounts like mine. I was helping to bury myself in China's ever-expanding cyber grave.

I hadn't received explicit directives about Li Wenliang, but Weibo was certainly not the only Chinese tech company relentlessly deleting posts and accounts that night. I knew ByteDance's army of content moderators was using the tools and algorithms that I helped develop to delete content, change the narrative and alter memories of the suffering and trauma inflicted on Chinese people during the COVID-19 outbreak. I couldn't help but feel every day like I was a tiny cog in a vast, evil machine.

ByteDance is one of China's largest unicorns and the creator of short video-sharing app TikTok, its original Chinese version Douyin and news aggregator Toutiao. Last year, when ByteDance was at the center of U.S. controversy over data-sharing with Beijing, it cut its domestic engineers' access to products overseas, including TikTok. TikTok has plans to launch two physical Transparency Centers in Los Angeles and Washington, D.C., to showcase content moderation practices. But in China, content moderation is mostly kept in the shadows.

I was on a central technology team that supports the Trust and Safety team, which sits within ByteDance's core data department. The data department is mainly devoted to developing technologies for short-video platforms. As of early 2020, the technologies we created supported the entire company's content moderation in and outside China, including Douyin at home and its international equivalent, TikTok. About 50 staff worked on the product team and between 100 and 150 software engineers worked on the technical team. Additionally, ByteDance employed about 20,000 content moderators to monitor content in China. They worked at what are known internally as "bases" (基地) in Tianjin, Chengdu (in Sichuan), Jinan (in Shandong) and other cities. Some were ByteDance employees, others contractors.

To continue reading this article, click here.
