OTTAWA — Newly released documents reveal that federal public safety officials are worried about the tech industry's ability to manage the spread of extremist and terrorist content online, particularly following significant layoffs in the sector. The documents, made public under federal access-to-information laws, were prepared for 2023 meetings with Google, which owns YouTube, and with X, the platform formerly known as Twitter.
The documents highlight a surge in terrorist and extremist material on these platforms after the October 7 attacks on Israel. One briefing note stated, "The Israel-Hamas conflict has created an avalanche … with at times hundreds of thousands of graphic videos and images of mass shootings, kidnappings and other violence widely circulating on social media." Officials expressed disappointment with the industry's response, noting that while tech companies claim to remove large volumes of terrorist content, thousands of such items remain accessible.
Officials linked the proliferation of this content to layoffs at X and Google. They noted that a significant number of videos flagged immediately after the October attacks were not removed before they reached 1,000 views. A YouTube spokesperson countered, "Violent extremist content is strictly prohibited on YouTube, and we continue to invest in the teams and technologies that allow us to remove this material quickly. Any notion that we’d compromise the safety of our platform is categorically false."
In late 2023, the then deputy minister of Public Safety Canada, who has since retired, met with Google officials during the Halifax International Security Forum and with a senior representative from X at a G7 Interior Ministers’ meeting. Max Watson, a spokesperson for the department, indicated that while the discussions were private, they focused on security issues, including strategies to address online harms.
The internal documents suggest that the layoffs of thousands of content moderation staff may have hindered the platforms' ability to effectively remove extremist content. Earlier in 2023, Google and its parent company, Alphabet, announced layoffs affecting approximately 12,000 employees, with similar cuts occurring across other major tech companies. Reports indicated that about one-third of the staff at Jigsaw, a Google technology incubator focused on content moderation, were also laid off.
Officials proposed that Google consider rehiring some of the laid-off staff and inquired about the company's efforts to mitigate the harms associated with violent and terrorist content, particularly in light of the recent cuts in trust and safety teams across the industry.
YouTube reported that it had removed tens of thousands of videos for violating its guidelines on violent extremism and hate speech since the October attacks. The platform stated that over 90 percent of the content removed for violent extremism was taken down before reaching 1,000 views, a figure that rose to approximately 96 percent for videos removed between October and December 2023.
Despite these efforts, public safety officials expressed concern that even after the removal of such a large volume of content, "tens of thousands of videos and posts" were still "circulating, some getting millions of views." Officials also noted a 900 percent increase in antisemitic content on X, alongside a rise in content deemed Islamophobic.