Google CEO Sundar Pichai speaks in Wuzhen, China, in 2017. (Du Yang/China News Service/Visual China Group via Getty Images)

YouTube says it has "rolled out a fix" for an "error in our enforcement systems" that had led to the automatic deletion of comments that included two phrases critical of China's government. But in an email exchange and phone call with Ars Technica, a company spokeswoman declined to provide real details about why YouTube's software was deleting the comments in the first place.

As I explained on Tuesday, "共匪" means "communist bandit." It was a derogatory term used by Nationalists during the Chinese Civil War that ended in 1949. It continues to be used by Chinese-speaking critics of the Beijing regime, including in Taiwan.

"五毛" means "50-cent party." It's a derogatory term for people who are paid by the Chinese government to participate in online discussions and promote official Communist Party positions. In the early years of China's censored Internet, such commenters were allegedly paid 50 cents (in China's currency, the yuan) per post.

Until Tuesday, YouTube was automatically deleting any comment that included either of these phrases. I confirmed the behavior myself on Tuesday morning. Comments containing either phrase would disappear in less than a minute, while other comments—including ones containing other Chinese phrases—stayed on the site.

Users have been reporting this behavior since late last year, with little response from YouTube. That changed on Tuesday when high-profile news sites—starting with The Verge—began covering the story. Within 24 hours of The Verge story appearing, YouTube had fixed the error.

And YouTube says that it was an error, not a deliberate policy decision. But not everyone is convinced.

"This purported 'error' follows a long, disturbing pattern of Google censoring content to try to gain favor with the Chinese Communist Party," Sen Josh Hawley (R-Mo.) wrote in a Wednesday letter to Google CEO Sundar Pichai. Hawley is one of many who suspect this was a deliberate policy decision—not just an innocent mistake.

The case for transparency

On Wednesday, I exchanged emails and talked on the phone with a YouTube spokeswoman. She seemed eager to help but wasn't able to offer me much detail. She said that YouTube relies on classifiers to decide which comments to delete and that YouTube's classifiers didn't take into account "the proper context." She said she wasn't able to provide more detail than that.
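A bare substring blocklist with no model of context would behave exactly as the article describes: any comment containing a flagged phrase vanishes, whether the phrase is used as abuse, quoted, or discussed critically. Here is a minimal hypothetical sketch of such a filter. The two phrases come from the article; everything else (function and variable names, the filter design itself) is illustrative, not YouTube's actual implementation.

```python
# Hypothetical sketch of a context-blind blocklist filter. The phrase
# list is from the article; the rest is illustrative only.

BLOCKED_PHRASES = ["共匪", "五毛"]  # "communist bandit", "50-cent party"

def should_delete(comment: str) -> bool:
    """Flag a comment if it contains any blocked phrase, regardless of
    how the phrase is used (insult, quotation, news report, criticism)."""
    return any(phrase in comment for phrase in BLOCKED_PHRASES)

# A comment merely mentioning the term is deleted the same as one
# using it as abuse -- the filter has no notion of "proper context."
print(should_delete("共匪 is a derogatory term dating to the civil war"))  # → True
print(should_delete("Great video!"))                                       # → False
```

A classifier trained on examples can fail the same way if the training data overwhelmingly associates a phrase with abuse: in effect it learns a soft blocklist, which may be what "didn't take into account the proper context" describes.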

I'm sure this wasn't her fault. In a big company like Google, decisions about wh…
