
Re YouTube’s decision to re-allow disinformation about the 2020 election:

I’ve seen comments to the effect of “lies and propaganda aren’t free speech.” That’s exactly wrong: they are. We •want• them to be. And that’s the heart of the problem here.

Free speech is a social contract, and Google isn’t holding up their end of it.

Huh?! Let me explain my thinking.
1/

In the US constitution, “free speech” means that the government can’t criminalize political dissent. We •do not• want the government to have the power to decide what is and is not a lie, to decide what is and is not propaganda, and then to declare lies and propaganda illegal.

Because if the gov does have that power, I 100% guarantee that it’s not the Nazis they’ll use it against first.
2/

The First Amendment is one ingredient in why, for example, police in Atlanta have to go through all these legal gymnastics to strike back at activists they don’t like: unicornriot.ninja/2023/three-a. They’re spinning these trumped-up charges, trying to turn fundraising into fraud, and their charges just don’t have a legal leg to stand on.

But if the gov had the power to decide what’s “lies” and “propaganda,” and then make that illegal? Then the police could just go straight for the jugular.
3/

Unicorn Riot: “Three Atlanta Activists Arrested, Home Raided Over Bail Fund” — Atlanta, GA — Around 9 a.m. on Wednesday, three members of the Atlanta Solidarity Fund were arrested during a raid by the Georgia Bureau of Investigation (GBI) and the Atlanta Police Department and charged with money laundering and charity fraud.

Here’s the thing: we really do need lines around speech. Not everyone and everything deserves to be heard. Some speech should be unacceptable. Some speech should have consequences.

Dig deep enough with anyone who considers themselves a speech tolerance absolutist, bring them face to face with enough real-life situations, and you’ll find that there is always, always a line for them somewhere.

There needs to be a line. But we don’t want the gov drawing it (because authoritarianism). What do we do then?
4/

The answer is that the lines around acceptable speech can be, must be, socially negotiated.

The First Amendment says that we will keep the government’s power to regulate speech to an absolute minimum ••with the expectation that society will take on that responsibility.••

•That• is the social contract of free speech in this country.
5/

If you know the 1A, you know that it doesn’t give anybody the right to be heard, or the right to be free from moderation, or the right to say anything in somebody else’s newspaper, or the right not to be kicked off of social media, or the right not to experience negative consequences for speech.

The 1A says that the government will not make those determinations ••with the expectation that society will••.

The 1A only works if society holds up its end.
6/

Questions about where to draw the line around what speech is acceptable are really, really hard. And dangerous!

That’s why we want many news organizations, many web sites, many spaces that are varying degrees of small and large and public and private — and social forces acting on all of them — so that we can continuously hash out these questions.
7/

This means any actor who controls a space where speech occurs needs to be ready to engage with this messy social negotiation of speech. That’s why, for example, every Masto instance needs to have a moderation plan in place from the get-go.

The larger the actor, the greater the responsibility.

Which brings us to YouTube.
8/

We live now in a world where a handful of private entities — Facebook and Google the prime examples — have such massive domains of speech under their private control that their power over speech seems almost government-sized.

This creates an untenable situation. If they apply any sort of standards to speech, aren’t they almost like an oppressive government?
9/

This line of reasoning is of course a fool’s trap. Every time one of these massive privately controlled speech spaces tries to adopt a 1A-shaped moderation policy, they become the Nazi Bar.

The 1A works only because the government doesn’t control the whole space in which speech occurs — and these private entities do. They cannot escape the messy responsibilities of moderation.
10/

It’s paradoxical: when a privately owned space adopts a speech policy that •sounds• like the First Amendment, they are in fact •undermining• the fundamental premise of the First Amendment.

When YouTube says they’ll allow disinformation about the 2020 election, they’re dropping their end of the 1A contract.

And their mistake gets worse.
11/

YouTube has a recommender system. They aren’t just allowing Nazi shit to flow through their pipes. They have an armada of computers that (by design or by accident, doesn’t matter) are figuring out who’s most susceptible to Nazi shit, and then pumping the shit straight into their eyes and ears.

That’s a whole new category of problem.
12/

Erin Kissane (@kissane):

@inthehands I keep thinking that the terms of the bargain here should be much, much clearer

@inthehands:

@kissane Yeah. I truly do not know what exactly that means, but it’s very clear that our current systems were not designed to deal either with actors with this kind of consolidated global power, or with the cascading effects of computers being in the middle of everything.