If someone is injured or killed while using a piece of software, should the developer be held legally responsible? Traditionally, the law provides substantial protection for tech companies from civil lawsuits related to user actions. However, a recent ruling from a federal appeals court suggests changes may be on the horizon.
How the Snapchat Speed Filter Resulted in Three Deaths
In May 2017, three teenagers in Walworth County, Wisconsin, were involved in a fatal accident. Before losing control of the vehicle and crashing into a tree, the 17-year-old driver had reached speeds exceeding 120 miles per hour.
Data obtained later revealed that shortly before the crash, a passenger in the vehicle had opened Snapchat’s Speed Filter, a controversial feature that records and displays the speed of whatever vehicle the user is in. Posts showing higher speeds often receive more engagement.
Following the accident, the driver’s parents filed a lawsuit against Snap Inc., the parent company of Snapchat. The suit alleged that Snap Inc. knew the Speed Filter encouraged dangerous driving and that the company therefore shared some responsibility for the crash.
The Lawsuit’s Journey Through the Legal System
Initially, the lawsuit was dismissed, with the judge citing Section 230 of the Communications Decency Act. That provision grants tech companies, including social media platforms, broad immunity from libel and other civil suits arising from content their users create and post. The dismissal wasn’t surprising, as Section 230 is frequently used to shield tech companies from similar suits.
However, what happened next surprised many legal experts. A three-judge panel of the 9th US Circuit Court of Appeals reversed the decision, allowing the parents to proceed with their suit against Snap Inc. The court reasoned that Section 230 applies only to content created by users, whereas the Speed Filter was created by Snap itself.
The Future of Liability Law for Tech Companies
In 2019, victims’ rights lawyer Carrie Goldberg brought a similar suit against Grindr, claiming the company failed to exercise proper care in protecting the public from injury and harm. The suit was ultimately rejected by the 2nd US Circuit Court of Appeals, which cited Section 230. However, now that the 9th Circuit has interpreted the section differently, additional lawsuits against tech companies are likely.
“The recent ruling invites other plaintiffs to test how narrowly the courts define Section 230,” said Attorney Jason Schneider of Schneider Hammers. “Without 230 acting as a shield, it is conceivable that future suits will have an easier time getting in front of a jury, which is generally what tech companies hope to avoid if they are sued.”
The Snapchat case does have some unique and contradictory attributes. In one sense, the company’s status as a publisher is irrelevant: because Snap created the Speed Filter itself, Section 230 does not apply, leaving the company open to liability. Yet because the content was posted by a user on Snapchat, Section 230, at least in theory, still offers the company legal protection.
Currently, the 9th and 2nd Circuit Courts have issued differing interpretations of Section 230’s legal shielding. Many legal experts believe the Supreme Court will eventually step in, and Justice Clarence Thomas has already signaled a willingness to weigh in.
Who is legally responsible when someone posts irresponsible content? Is the user completely at fault, or do the actions of the publishing platform also play a role? Tech companies, content creators, and many others remain eager to learn if and how current laws will change.