WASHINGTON (AP) — A congressional hearing on online hate turned into a vivid demonstration of the problem Tuesday when a YouTube livestream of the proceedings was bombarded with racist and anti-Semitic comments from internet users.
YouTube disabled the live chat section of the streaming video about 30 minutes into the hearing because of what it called “hateful comments.”
The incident came as executives from Google and Facebook appeared before the House Judiciary Committee to answer questions about the companies’ role in the spread of hate crimes and the rise of white nationalism in the U.S. They were joined by leaders of such human rights organizations as the Anti-Defamation League and the Equal Justice Society, along with conservative commentator Candace Owens.
Neil Potts, Facebook director of public policy, and Alexandria Walden, counsel for free expression and human rights at Google, defended policies at the two companies that prohibit material that incites violence or hate. Google owns YouTube.
“There is no place for terrorism or hate on Facebook,” Potts testified. “We remove any content that incites violence.”
The hearing devolved into partisan disagreement among lawmakers and some of the witnesses, with Republican members of Congress denouncing Democratic Rep. Ilhan Omar’s criticism of American supporters of Israel as hate speech.
As the bickering went on, committee chairman Rep. Jerrold Nadler, D-N.Y., was handed a news report that included the hateful comments about the hearing on YouTube. He read them aloud, along with the users’ screen names, as the room quieted.
“This just illustrates part of the problem we’re dealing with,” Nadler said.
The hearing came amid an increase in hate crimes and hate groups in the U.S.
There were 1,020 known hate groups in the country in 2018, the fourth straight year of growth, according to the Southern Poverty Law Center, which monitors extremism in the U.S. Hate crimes, meanwhile, rose 30 percent in the three-year period ending in 2017, the organization said, citing FBI figures.
Democratic Rep. David Cicilline of Rhode Island grilled the Facebook and Google executives about their companies’ responsibility for the spread of white supremacist views, pushing them to acknowledge they have played a role, even if it was unintentional. Potts and Walden conceded the companies have a duty to try to curb hate.
But the challenges became clear as Cicilline pushed Potts to answer why Facebook did not immediately remove far-right commentator Faith Goldy last week, after announcing a ban on white nationalism on the social network.
Goldy, who has asked her viewers to help “stop the white race from vanishing,” was not removed until Monday.
“What specific proactive steps is Facebook taking to identify other leaders like Faith Goldy and preemptively remove them from the platform?” Cicilline asked.
Potts reiterated that the company works to identify people with links to hate and violence and banishes them from Facebook.
The hearing was prompted by the mosque shootings last month in Christchurch, New Zealand, that left 50 people dead. The gunman livestreamed the attacks on Facebook and published a long post online that espoused white supremacist views.
But controversy over white nationalism and hate speech has dogged online platforms such as Facebook and Google’s YouTube for years.
In 2017, following the deadly violence in Charlottesville, Virginia, tech giants began banishing extremist groups and individuals espousing white supremacist views and support for violence. Last week, Facebook extended the ban to white nationalists.
Despite the ban, accounts such as one with the name Aryan Pride were still visible as of late Monday. The account read: “IF YOUR NOT WHITE friend ur own kind cause Im not ur friend.”
On Wednesday, a Senate subcommittee will hold a hearing on allegations that companies such as Facebook, Google and Twitter are biased against conservatives, an allegation leveled by political figures from President Donald Trump on down.
The companies have denied any such bias.