Mark Zuckerberg, the head of Facebook, could be in for a rough ride on Thursday when he testifies to Congress for the first time since the 6 January insurrection at the Capitol in Washington DC, amid growing questions over his platform's role in fuelling the violence.
The testimony will come after signs that the new administration of Joe Biden is preparing to take a tougher line on the tech industry’s power, especially when it comes to the social media platforms and their role in spreading misinformation and conspiracy theories.
Zuckerberg will be joined by Sundar Pichai and Jack Dorsey, the chief executives of Google and Twitter respectively, at a hearing pointedly entitled “Disinformation nation: social media’s role in promoting extremism and misinformation” by the House of Representatives’ energy and commerce committee.
The scrutiny comes after a report found that Facebook allowed groups linked to the QAnon, boogaloo and militia movements to glorify violence during the 2020 election and in the weeks leading up to the deadly mob violence at the US Capitol.
Avaaz, a non-profit advocacy group, says it identified 267 pages and groups on Facebook that spread “violence-glorifying content” in the heat of the 2020 election to a combined following of 32 million users. More than two-thirds of the groups and pages had names aligned with several domestic extremist movements.
The top 100 most popular false or misleading stories on Facebook related to the elections received an estimated 162m views, the report found. Avaaz called on the White House and Congress to open an investigation into Facebook’s failures and urgently pass legislation to protect American democracy.
Fadi Quran, its campaign director, said: “This report shows that American voters were pummeled with false and misleading information on Facebook every step of the 2020 election cycle. We have over a year’s worth of evidence that the platform helped drive billions of views to pages and content that confused voters, created division and chaos, and, in some instances, incited violence.
“But the most worrying finding in our analysis is that Facebook had the tools and capacity to better protect voters from being targets of this content, but the platform only used them at the very last moment, after significant harm was done.”