Executives from four of the biggest social media companies testified before the Senate Homeland Security Committee Wednesday, defending their platforms and their handling of safety, privacy and moderation failures in recent years.
Congress managed to drag in a relatively fresh set of product-focused executives this time around, including TikTok COO Vanessa Pappas who testified for the first time before lawmakers and longtime Meta executive Chris Cox. The hearing was convened to explore social media’s impact on national security broadly and touched on topics ranging from domestic extremism and misinformation to CSAM and China.
Committee Chair Sen. Gary Peters pressed each company to disclose the number of employees they have working full-time on trust and safety, and each company in turn declined to give a direct answer — even though they had received the question prior to the hearing. Twitter General Manager of Consumer and Revenue Jay Sullivan offered the only numerical response, noting that the company has 2,200 people working on trust and safety “across Twitter,” though it wasn’t clear if those employees also did other kinds of work.
It’s no secret that social media moderation is patchy, reactive and uneven, largely because these companies refuse to invest more deeply in the teams that protect people on their platforms. “We’ve been trying to get this information for a long time,” Peters said. “This is why we get so frustrated.”
Senator Alex Padilla (D-CA) steered the content moderation conversation in another important direction, questioning Meta Chief Product Officer Chris Cox about the safety efforts outside of the English language.
“[In] your testimony you state that you have over 40,000 people working on trust and safety issues. How many of those people focus on non-English language content and how many of them focus on non-U.S. users?” Padilla asked.
Cox didn’t provide an answer, nor did the three other companies when asked the same question. Though the executives pointed to the total number of workers who touch trust and safety, none made the meaningful distinction between external contract content moderators and employees working full-time on those issues.
Whistleblowers and industry insiders have repeatedly raised alarms about inadequate content moderation in other languages, an issue that receives too little attention due to a bias toward English-language concerns, both at the companies themselves and at U.S.-focused media outlets.
In a different hearing yesterday, Twitter’s former security lead turned whistleblower Peiter “Mudge” Zatko noted that half of the content flagged for review on the platform is in a language the company doesn’t support. Facebook whistleblower Frances Haugen has also repeatedly called attention to the same issue, observing that the company devotes 87% of its misinformation spending to English language moderation even though only 9% of the platform’s users speak English.
TikTok and China
In her first appearance before Congress with TikTok, Pappas immediately fell into step with her peers, evading straightforward questions, offering partial answers and even refusing at one point to admit TikTok’s well-documented connections to China. When Sen. Rob Portman (R-OH) pressed Pappas on where TikTok’s Chinese parent company ByteDance is based, she dodged the question awkwardly by claiming the company is distributed and doesn’t have a headquarters at all. Pappas, under oath, also categorically denied explosive reports from BuzzFeed that China-based ByteDance employees regularly accessed private data on U.S. TikTok users, even though that reporting was drawn from leaked audio.
The TikTok executive also declined to agree to Portman’s request that the company cut off the flow of user data to any employees based in China, including ByteDance employees. “Under no circumstances would we give user data to the Chinese government,” Pappas insisted, though she did not weigh in on behalf of TikTok’s parent company.
All told, this was another round of Congress getting stonewalled by top decision makers from some of the world’s largest, most powerful and culturally influential companies. For his part as chair, Peters was realistic about the situation, noting that short of regulatory changes to the incentives that drive social media companies, nothing is going to change — including in these sessions.
“I’ll be honest I’m frustrated that… all of you [who] have a prominent seat at the table when these business decisions are made were not more prepared to speak to specifics about your product development process, even when you are specifically asked if you would bring specific numbers to us today,” Peters said, concluding the hearing. “Your companies continue to avoid sharing some really very important information with us.”
Meta, TikTok, YouTube and Twitter dodge questions on social media and national security by Taylor Hatmaker originally published on TechCrunch