The Thing Missing From Critics' Censorship Arguments About Tech Platforms And Parler

Let's Start Talking About Solutions

Mark Hall
Jan 14, 2021

The first few weeks of the new year have been a whirlwind in the political world. Most of the major talking points about the events at the Capitol have already been made, so I’ll spare you my thoughts on that.

What I would like to discuss is the aftermath of what’s happened with the popular, conservative-leaning app Parler. As you may have heard, it was de-listed from the major app ecosystems. Amazon, through its web hosting business Amazon Web Services (AWS), also decided to stop hosting the app.

As a result, Parler is left with few options.


This comes after YouTube decided to suspend Trump’s account, and after Facebook and Twitter placed similar restrictions on him in the days prior.

Immediately after all of this, self-described advocates of free speech and critics of big tech companies became increasingly vocal about why the actions taken against Parler were inappropriate. Many warned that this could set a dangerous precedent.

This post isn’t about making a judgment call on what’s right or wrong, or who’s right and who’s wrong. It’s about what’s missing from the argument that critics are making.

What’s lacking in the current discourse is a clear view of where the free speech line is and what alternative actions, if any, should have been taken. People are criticizing the fallout without proposing other paths or clearly setting a line in the sand.

Let’s start with common ground. I think most (or at least many) of us can agree that we want the right and ability to speak freely on social platforms, so long as the message doesn’t violate the law (libel, slander, threats, etc.). I imagine we can also agree that when someone uses a medium to incite violence, there should be constraints or limitations placed on the message and/or the messenger.

Agree or disagree so far? If you disagree, let me know where and why. (Seriously.)

What’s troubling about what I’ve read so far, primarily on social media, is that the argument that platforms and companies have gone too far comes without more constructive ideas or suggested paths.

Here are a few examples (click through to read each):


Maybe there’s some validity to these views, maybe not - that debate is for a different time and place.

If (and that’s a strong if) people generally agree that there should be some limit on what people can say on these platforms, then what action do you (if you are a critic) think tech companies should take when a company crosses that line?

According to Amazon, the apparent violations by Parler seem very straightforward and leave little up for interpretation.

Does this not meet the standard that critics think should apply? If not, should no action or moderation be taken at all? What level of speech would call for action?

These aren’t rhetorical or facetious questions. These are questions to spur an important conversation. I sincerely think that any app or company that facilitates this level of discourse should be held responsible, if not by the hosting platform companies, then by the law.

By focusing the conversation here, maybe we can get one step closer to a consensus on a better path forward and on how to address issues like this in the future.

I may be shouting into the void here, but I’m striving to understand more about the possible solutions versus focusing solely on the problem. Critics, your turn to talk.

1 Comment
Lisa Huang-North (She/Her)
Jan 14, 2021 · Liked by Mark Hall

This is a timely post, Mark! It's an interesting dilemma... Public companies have a fiscal responsibility to create shareholder value. At their core, social platforms are private companies with business goals and established guidelines. When users violate guidelines explicitly set in the Terms & Conditions and Agreements (which the user AGREES to upon sign-up), the company has every right to remove the user!

However, as we've all seen, the line between social platforms as private entities and platforms enabling the exchange of ideas on a global scale (traditionally a role perceived as a "public service") is blurring. And that's why critics (as shown in your article) perceive the removal of a user (aka "censorship") as a violation of their fundamental rights. Should social platforms be regulated as a "public good" then? We are venturing into uncharted waters!
