We Study Mass Shooters. Something Terrifying Is Happening Online.
James Densley and Jillian Peterson The New York Times
Over the past several years, something has changed. We are witnessing the emergence of a different paradigm: a mass shooter no less despairing about life’s hardships but younger, highly connected to online social networks and seemingly convinced that in acting violently he or she is carrying out the only meaningful act possible in a world otherwise devoid of meaning. This shift is highly significant for our understanding of the online-fueled pathologies that afflict our society and for the policies that could help prevent such tragedies.
Consider a recent example. Last month in Tumbler Ridge, British Columbia, an 18-year-old killed her mother and half brother at home, then opened fire at a secondary school she had attended, killing five students and an educator. In the aftermath of the shooting, amid the expected evidence of the shooter’s despair, there emerged an alarming trail of online activity: On Roblox, a game platform, the shooter had created a game simulating a mass shooting; her TikTok account reportedly featured reposted videos of a mass shooter; she belonged to a gore forum where users can post uncensored videos of violence, which has been frequented by other mass killers; and she had visited the online profile of a 15-year-old girl who killed two people at the Abundant Life Christian School in Madison, Wis., in 2024.
The subculture to which this shooter belonged is known as the true crime community. It exists on platforms like Tumblr, Telegram, Discord, TikTok and Roblox, and it celebrates mass murderers. There, the Columbine killers are the subject of fan art, mass shooters earn “saint” status and attack footage is archived and analyzed frame by frame. The offending content routinely gets taken down by the platforms for violating their terms of service, but within hours it often reappears in new guises, often coded. (“Going E.R.” refers to incel violence, for example.) Boys often arrive in this community through gore forums; girls, through eating-disorder communities.
At least seven school shootings in the United States from 2024 to this past fall have been linked to the true crime community, according to researchers at the Institute for Strategic Dialogue. (Despite the attention mass shootings draw, they remain rare, so even a handful of such examples is notable.) What the true crime community has done, in effect, is take the despair that has always typified mass shootings and give it a performative script. The community turns private pain into a public narrative: Others have felt the way you feel, too, and look what they did. Look how everyone remembers them.
Last August, a 23-year-old fired through the windows of Annunciation Catholic Church in Minneapolis during Mass. She killed two children and wounded more than 20 others. The inscriptions on her weapons told the story of the online community of which she had been a part: There was a quotation attributed to the Columbine shooters and Cyrillic text apparently copied from the shirt of a school shooter in Crimea. She also kept an online journal, which she displayed on a YouTube channel in a video calling the attack her “masterpiece.”
This is characteristic of the performative turn in mass violence. The shooter becomes the main character in a story that the true crime community has been writing together for years, and the attack is the climax — both the culmination of nihilism (nothing matters) and, somehow, its imagined overcoming through violence (this matters). The violence is not a means to an end. It is the end. The shooters are not trying to change the world. They are trying to be seen in it, one last time, on terms they control.
There have long been copycat killers, but this is a whole other level — copycat killing fueled by the viral power of meme culture. The 15-year-old shooter in Madison in 2024, for example, quickly became a true crime community icon: A 17-year-old boy who committed a school shooting in Nashville in 2025 and who appears to have been an online associate of the Madison shooter referred to her online before his attack. Likewise, the Minneapolis shooter in 2025 wrote the name of the Madison shooter on her rifle.
The internet was once simply a place you visited to learn things. Now it learns you. If you’re a teenager in crisis, you don’t need to seek out dark material; the algorithms study what you linger on and serve you more content like it. A found-footage mockumentary about the Columbine shooting might lead to a related Reddit thread that might lead to a related Tumblr fan edit that might lead to a Telegram channel where a user posts blueprints of a local school (“just interesting architecture”). Everyone laughs. It’s ironic. Until it isn’t.
There isn’t just one policy solution to mass shootings. It’s a complex problem that requires better resources for school counselors and threat-assessment teams and better firearm-seizure practices during mental health crises. But online platforms also need to be more vigilant. Before the shooting in Tumbler Ridge, the shooter had conversations with ChatGPT that were flagged by OpenAI’s automated systems for describing scenarios involving gun violence. According to reports, about a dozen employees debated whether to alert law enforcement. They decided not to. The account was banned, but nobody called the police.
If companies like TikTok can identify a trending sound or a trending image in seconds, they can presumably build systems to better flag the glorification of violence, slow the resharing of attack footage and quash known violent content so it can’t resurface. They are already good at monitoring content in this way for potential copyright violations.
In an attention economy, what we look at and what we encourage others to look at are unavoidably consequential acts. Every time we draw attention to shooters, we help complete their performance. Somewhere, right now, a teenager is sitting alone, scrolling through a feed that has learned exactly what he or she is looking for. The algorithm knows. The question is whether the rest of us will act on what we know, too.