Media law expert explains social media platforms’ liability in mass shooting cases

Western Mass News spoke with a media law professor who said that there is a difference between a legal responsibility and a moral one, and tech companies have no legal obligation to report any posts to the police.
Published: May. 25, 2022 at 10:16 PM EDT|Updated: May. 25, 2022 at 11:21 PM EDT

SPRINGFIELD, Mass. (WGGB/WSHM) - We are learning more about the 18-year-old gunman in Tuesday’s deadly school shooting in Texas.

The gunman, Salvador Ramos, reportedly sent text messages and made social media posts minutes before the attack. Now, we are taking a look at the role social media plays in these types of incidents.

Western Mass News spoke with a media law professor who said that there is a difference between a legal responsibility and a moral one, and tech companies have no legal obligation to report any posts to the police.

Western Mass News has learned more about 18-year-old Salvador Ramos, who killed 21 people, most of whom were children, at an elementary school in Uvalde, Texas.

Minutes before the attack, Ramos allegedly sent several text messages to a German girl he met through a social media app called Yubo.

Some messages read, “I just shot my grandma in her head,” followed by, “ima go shoot up an elementary school [right now].”

Ramos’s Instagram account also showed a picture of two weapons, and Facebook confirmed that Ramos sent private messages through their platform to others, informing them about what he was going to do.

This came less than two weeks after the shooter behind the Buffalo supermarket massacre, Payton Gendron, allegedly posted a racist manifesto on social media and livestreamed the killing on Twitch.

We wanted to know: what is a social media company’s liability when it comes to these types of posts?

Steve Weisman, a professor of media law at Bentley University, told us that there is none.

“Legally, the various platforms are protected by Section 230, which is the Communications Decency Act, which holds them not responsible for posts that are put up by third parties,” Professor Weisman explained.

He said companies have algorithms in place to flag and even remove posts that they deem inappropriate, but that differs from platform to platform.

“It’s easy with 20-20 hindsight, but it’s very, very difficult going forward to know precisely what you’ve got to look for, precisely what you’re going to have algorithms pick up in order to allow free speech, and yet not incite or glorify violence,” Professor Weisman said. “Part of the problem is, there are people that rant and don’t do anything, so it’s very hard to determine. Is this someone who is imminently going to commit a crime, and therefore, take it down? Or is this just someone exercising their right to be a fool?”

Texas, where Tuesday’s shooting occurred, has its own laws that make it even harder for companies to moderate posts.

“The legislature down there put in a law in regards to limiting the ability of social media to censor,” Professor Weisman said, “and in fact, with that law, it could’ve been and could be interpreted as to allow people to actually post the kind of violent rhetoric that we do see.”

He added that he is worried about the role social media may play in desensitizing and encouraging these types of violent crimes.

“One, I’m concerned about people just not even being overly offended now by this kind of violence, accepting that this is what happens,” he told us. “The other thing is, there are violent people who want that publicity, and that’s the thing that encourages others. There was a killer in New Zealand. He said he was encouraged to do this by another killer who had also streamed, so you get support for these violent fringes of society.”

Ultimately, Professor Weisman encouraged people to err on the side of caution: if you see something alarming posted on social media, say something, because it could save lives.