
The writer appeared deeply familiar with the far-right internet, which he said influenced his thinking. At one point, the author mentioned carrying out the attack with guns to gain American attention and to sow political discord. He decorated those guns with neo-Nazi symbols and the names of past killers and crusaders.

White-supremacist terrorists sometimes release racist manifestos in concert with violent attacks to amplify their beliefs. The 8chan poster appeared to have created several social-media accounts in recent weeks to promote himself and the manifesto, some of which appear to be sarcastic. He used the Facebook account to stream a 17-minute video of him opening fire inside a Christchurch mosque. At one point, he left the building and returned to the car for a new gun, before going back to the mosque and resuming the massacre. 8chan users cheered the attack online, with one replying to the original post with a picture of a man saluting. As gruesome clips of the shooting circulated on social media, Facebook, Reddit, and Twitter deleted links to the video, and the poster's Twitter profile was deactivated.

Based on the postings, the shooting appears to have been motivated by anger over Muslim immigration. The manifesto is filled with anger toward Muslims and claims that Muslim immigration would lead to “white genocide.” In the 8chan post, Muslims are described as “invaders.” The alleged shooter referenced an ideological jumble of right-wing personalities in the manifesto. Some of those cultural references appeared to be ironic, or a ploy for attention.

Experts say social media companies could do more. Social media companies used to take a mostly hands-off approach to moderating content on their sites, but now more than ever they are trying to manage the societal problems their sites create, reports Allyn. Facebook, Twitter and other sites like them have teams of thousands working to moderate content and block violent media from reaching people. For example, Twitch, the site the Buffalo shooter livestreamed on, could make it harder for people to open accounts and instantly upload live videos. Other video-streaming sites like TikTok and YouTube require users to have a certain number of followers before they're able to stream live, reports Allyn. Listen to his discussion on Morning Edition.


"The social media platforms that profit from their existence need to be responsible for monitoring and having surveillance, knowing that they can be, in a sense, an accomplice to a crime like this, perhaps not legally but morally," Hochul said.Īllyn reports that social media companies usually are not held liable for what they don't police on their sites.
