Blocking Servers, Threads, and Whether What Happens Is Unusual or Important

The purpose of this post is to share a mental model of what it means to reduce harm on the Fediverse in general, and on Mastodon in particular. I also share some of my personal ideas about what the future could look like in the context of Threads on the Fediverse, to whatever extent that ever happens. Hopefully it's useful for people who like the idea of federating with big social media and are trying to figure out when, and if, they should migrate to a server that would do that.

In general, I think it's possible that all this talk is very premature, with a strong exception for those instances where community members seek assurances of continued relative safety online. It's always ok to ask for that.

Harm and Blocking

Moderation teams block harm. This can involve no discussion at all for "no brainer" harm, or a lot of it when things are more nuanced. It is subjective, but collective: a team will come to some kind of agreement in each case. It is imperfect. It's what has gotten us this far.

Crucially, and I believe this doesn't get mentioned enough in this context, moderation teams will often make these decisions in response to conversations with members of their community. If you see a mod making big pronouncements, it might not be soap-boxing; it might be that they want to loudly reassure members of their community that they're watching out for them.

Causing Harm

Harm, for the sake of this conversation, is caused when a party deliberately acts in a way which makes life difficult for others, where those others aren't causing such harm themselves. It may take the form of malice, negligence, or posing a threat. It may be something a party does, or something they allow to happen.

Let's define three tiers of harm which a party may cause, and which would lead to a mod team considering blocking another server or user.

Harm in the World

Any organisation may cause known harm in the world, such that a moderation team will block them from their server. This may be criminal activity, incitement to hatred, or acting in a way which is otherwise incompatible with the values a server is based on.

When such an organisation is active on the Fediverse, a moderation team needs to know they are active, and also know what they have done, in order to make a judgement. This is easier for organisations which make the news. One team will block an organisation outright; another may consider the organisation a net positive. In some cases a moderation team will disapprove of an organisation's conduct, but give them a chance to do better, and reserve judgement on whether a block is necessary.

In most cases, moderation based on harm in the world doesn't happen, because it takes real effort to know about it. Someone needs to make the moderation team aware, or the organisation needs to have a history already known to the team. It tends to only happen in exceptional cases.

Harm on the Network

When an organisation runs a server and causes harm on the network, meaning the Fediverse in this context, there is a better chance of a moderation team being made aware of it: via reports from users, by talking to other moderation teams, or by following hashtags or groups used for the purpose of moderation.

Some teams will look at behaviour coming from a server, gauge what moderation action is taken, if any, and decide early that they don't want to federate with that server. Others may take an approach where they wait and see if the server causes harm to their community, particularly if they deem any incidents to be borderline, meaning they're not fully convinced of harm.

In other cases a server admin may be known to be repurposing federated content in a way that goes beyond what users could reasonably expect to happen, and be deemed a danger not worth federating with for that reason.

Blocking a server for harm on the network is fairly common: if moderation teams see someone causing noticeable harm on the network, it's not worth waiting for it to happen to their community directly before blocking.

Harm to Us

Where "us" means the community members active on a server, including the moderation team, Harm To Us is the form of blocking which would have the least variance of approach among disparate moderation teams. Abuse of members, posting hateful content, posting graphic content without adequate warning, generally going around looking to stir shit up - these are all fairly typical reasons to block users, or the servers those users come from where it happens too often and without action.

Two other forms of harm, which can happen more or less by accident, are mod overload and server overload.

Mod overload is when a server, which is not necessarily a bad server, generates more moderation workload than it's worth. Processing mod reports is how we keep the place nice, but it is work. Sometimes a server is run in such a way that it's simply not worth federating with if the mods want to focus on looking after their own community as best they can. For some teams this means blocking a server like mastodon.social, because its size, combined with its policy of allowing signups without approval, means the cost of federating far outweighs the value.

Server overload is more common with self-run servers. It might mean that a viral post causes their server to fall over and go to sleep for a few hours. This is because every interaction with a post creates a job on the server, and when there are too many jobs, things slow down to hell. So if a cosy wee server with 40 users has a post appear on trending on mastodon.online, that server might have a difficult day or two. It would be typical to grin and bear it, but if it happened a lot, your options would be to start posting worse posts, or maybe to block a big server or two. The biggest thing you're spending money on when you're funding a server is processing that jobs queue; saving and serving posts isn't 10% of the effort.
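To make the jobs point concrete, here's a rough toy model in Python of how one post's activity turns into background work. The numbers and the jobs_for_post function are made up for illustration; this is not Mastodon's actual processing pipeline, just a sketch of the fan-out effect.

    # Rough toy model of post fan-out, not Mastodon's real pipeline.
    # Every remote delivery and every incoming interaction is treated as
    # one background job the server has to chew through.

    def jobs_for_post(remote_followers, boosts, favourites, replies):
        deliveries = remote_followers                 # pushing the post out to other servers
        interactions = boosts + favourites + replies  # each one arrives as work to process
        return deliveries + interactions

    # A normal day on a cosy wee 40-person server.
    quiet = jobs_for_post(remote_followers=200, boosts=3, favourites=10, replies=2)

    # The same post after it lands on a big instance's trending feed.
    viral = jobs_for_post(remote_followers=200, boosts=4_000, favourites=12_000, replies=800)

    print(f"quiet: ~{quiet} jobs, viral: ~{viral} jobs")
    # If your queue clears, say, 5 jobs a second, the viral backlog alone is
    # the best part of an hour of queue time, and that's just one post.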

How Threads Fits Into This

A while ago, it came to light that Facebook were talking about doing some kind of federation with their upcoming platform, Threads. There was a varied response to this. Some moderation teams announced early that they had no intention of ever federating with anything run by Facebook. This could be considered a Harm in the World motivation for blocking. Some people pinned their colours to the mast using the "Fedipact" hashtag.

Later on, Threads actually launched, and people could evaluate what kind of place it was going to be. Some people who were reserving judgement up to that point weren't impressed, and decided that they would be blocking Threads, in what I would call a Harm on the Network decision. Now, Threads isn't really on the network the way a typical, standard ActivityPub-integrated server application is, but people are assuming it could be some day.

I would argue that in each of these cases, this is normal Fedi behaviour. Some people feel a sense of loss when they hear about it, as they're excited by what could be. Others are reassured that the standards they've become used to won't be lowered in the name of growth - by the moderation team of their home servers at least.

Where We Are Now

Threads is not really on the Fediverse. It's true that you can see some posts from a small number (two or three, last time I looked) of accounts on servers that don't have it blocked.

Some moderation teams are still optimistic that it could be good, and are leaving the door open for now.

Much like after the first announcement, there was a flurry of debate, and that has generally calmed down now. Most people made their point before Threads even launched.

Possible Futures

Threads becomes a full member of the Fediverse and it's great

Servers that choose to federate with Threads are awash with content. Their members have engaging interactions with people on Threads and vice versa. Worth moving for, if you're into that kind of thing.

Threads becomes a full-ish member of the Fediverse but it's a moderation nightmare

Servers are hosed with shitty behaviour. Moderation teams burn out. They cut the cord and go back. Remember, Threads is about 100 times the size of the Fediverse, and about 400 times the size of the biggest Fedi instance, mastodon.social. And it has markedly lower standards of conduct than servers like .social, never mind those with higher standards again. The chance of a moderation shit show is quite real.

Threads is a server killer

If being on the trending posts feed on mastodon.online is bad for a server, going viral on Threads would be about 400 times worse. We haven't seen what a server that big would look like on Fedi in practice. Keeping shit running could be a case of getting used to things being very slow, or of having 75% of your user donations spent on effectively being a Threads client. Neither is hugely appealing.

Fuck all changes

Honestly, I'm not convinced Threads on Fedi is really going to happen. I have my theories on why they pretend it will, but that's a story for another day. But this whole thing could turn out to be fantasy football, no matter what side of the pitch you find yourself on.

Maybe the best-case scenario you're hoping for happens and then it's worth moving, but I wouldn't panic.

But wait, I think it should be up to me what servers I can and can't see!

It is, my wee darling, it is.
