For anyone following technology with any regularity, one thing has been apparent for quite some time: our capabilities in the digital world have accelerated far beyond the capacity of our legal structures to cope with and regulate them. Couple that with a legislature limited by its own lack of technological prowess and the influence of lobbyists, and there was never a question of whether present-day technology would face a legal reckoning, but simply when that reckoning would occur.
That day appears to be getting closer, as the legal structures social media companies use to shield themselves from liability when harm comes to their users are showing cracks. The latest challenge comes in the form of a U.S. federal appeals court ruling against the maker of Snapchat, holding that Section 230 of the U.S. Communications Decency Act did not apply to a photographic filter the company made available to its users. That filter was largely blamed for the deaths of three young Wisconsin residents.
Section 230 essentially states that a website or app like Facebook, Instagram, or Snapchat is merely a platform for other people to post their own content. The parent company simply provides the means for its users to create content, but it bears no responsibility for the nature of that content. This legal assumption went mostly unchallenged for a number of years. But social media companies now face increased scrutiny, not only for the ways their users utilize those platforms but also for the methods the companies use to keep users coming back and drive advertising revenue, and attitudes appear to be shifting. Cyberbullying has been linked to a notable increase in youth suicide, body image issues and their associated eating disorders have skyrocketed, and social media has found itself in the middle of cases ranging from stalking to lethal violence.
At the same time, Facebook and Instagram parent company Meta has raked in untold billions of dollars monetizing its platforms and driving this activity while actively working to addict its users, most of whom are young.
The implications for social media companies of a change in the way Section 230 is applied cannot be overstated, and they are vigorously fighting to keep that barrier in place. But if current tides prevail, they may find themselves having to deal with something they haven't had to since their inception: responsibility.