Apple and Google removed the Parler app from their app stores. Soon after, Amazon booted it from its web-hosting service, effectively de-platforming an entire social media site with millions of users, including some of the people who stormed the United States Capitol (though some of those people were also active on Facebook, Twitter, Reddit, and other platforms).
Why it’s concerning:
These tech oligarchs are deciding which products get removed, and we think that's dangerous overreach. What is worrisome is that tech companies can de-platform businesses operating entirely within the law.
Why it’s legal:
Section 230, a provision of the Communications Decency Act of 1996, gives website operators immunity from liability for third-party content: platforms aren't legally responsible for what their users publish. Apple, Google, and Amazon saw violent content on Parler and, as private companies enforcing their own terms of service, were free to cut it off. But hosting that content didn't make Parler's business illegal either, because Section 230 shields Parler from liability for its users' posts just the same.
What can be done:
Right now, changing the law would mean proposing that platforms be held legally responsible for content that promotes violence. Some people think platforms should bear more accountability for what their products do to society, which would encourage more moderation and stricter content guidelines.
But we're not there yet. Because the social media space is largely unregulated, we also need laws that protect legal sites from being booted without cause. We need a 21st-century legal framework for social media, and this is a conversation that has to continue so that together we can find solutions.