It’s been a few weeks, and this project of mine is still moving along. Maybe not as fast as I would like, but I am making progress. Since the last post, I’ve spent much of my coding time working on what I consider the biggest feature of Themis: filtering. Here, I want to talk a little bit about what I mean, and why it’s so important.
My computer, my rules
Today, essentially every discussion platform is moderated. What that means depends on the place, but let’s boil it down to its essence. Moderation is censorship, plain and simple. Sometimes it’s necessary, sometimes it serves a purpose, but a moderated community is one that has decided, either by collective choice or external fiat, to disallow certain topics. More importantly, the administrators of the platform (or their anointed assistants) have the power to remove such content, often without debate or repercussion.
Removing the users that post the prohibited content is the next step. If online communities were physical, such suspensions would be the equivalent of banishment. But a much larger site like Facebook or Twitter, so integrated into the fabric of our society, should be held to a higher standard. When so much in our lives exists only in these walled-off places, banning is, in fact, more akin to a death sentence.
It is my strong belief that none of this is necessary. Except in the most extreme cases—automated spamming, hacking attempts, or illegal content that disrupts the infrastructure of the site—there really isn’t a reason to completely bar someone from a place simply because of what others might think. Themis is modeled on Usenet, and Usenet didn’t have bans. True, your account on a specific server could be locked, but you could always make a new one somewhere else, yet retain the ability to communicate with the same set of people.
This is where Facebook et al. fail by design. Facebook users can only talk to each other. You can’t post on Twitter timelines unless you have a Twitter account. The “fediverse” meta-platform of Mastodon, Pleroma, and the like, on the other hand, restores that ability to us. It’s not perfect, but it’s there, which is more than we can say for traditional social media.
Out of sight, out of mind
But, you may be thinking, isn’t that bad? If nobody wants to see, say, propaganda from white supremacists in their discussions, then how is discussion better served by allowing those who would post that content to do so?
The answer is simple: because some people might want to see that. And because what is socially acceptable today may become verboten tomorrow. Times change, but the public square is timeless. As the purpose of Themis is to create an online public space, a place where all discussion is welcome, it must adhere to the well-known standards of the square.
This is where filtering comes in. Rather than give the power of life and death over content to administrators and moderators, I seek to place it back where it belongs: in the hands of the users. Many sites already allow blocklists, muting, and other simple filters, but Themis aims to do more.
Again, I must bring up the analogy of Usenet. The NNTP protocol itself has no provisions for filtering. Servers can drop or remove messages if they like, but this happens behind the scenes. Instead, users shape their own individual experiences through robust filtering mechanisms. The killfile is the simplest: a poster goes in, and all his posts are hidden from view. Most newsreader software supports this most basic weapon in our arsenal.
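As a rough sketch, a killfile really is that simple: a set of senders whose posts never reach the screen. (The types and names below are illustrative, not Themis’s actual data model.)

```typescript
// A minimal killfile sketch. Posts from killfiled senders are
// simply dropped before display. Types here are hypothetical.

interface Post {
  sender: string;
  subject: string;
  body: string;
}

// The killfile is just a set of sender names to hide.
function applyKillfile(posts: Post[], killfile: Set<string>): Post[] {
  return posts.filter((p) => !killfile.has(p.sender));
}

const posts: Post[] = [
  { sender: "alice", subject: "Hello", body: "Hi all" },
  { sender: "troll", subject: "Spam", body: "Buy now" },
];

const shown = applyKillfile(posts, new Set(["troll"]));
```

Note that nothing is deleted from the server; the filter only shapes what one particular user sees, which is the whole point.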
Others go the extra mile. The newsreader slrn, for instance, offers a complex scoring system. Different qualities of a post (sender, subject text, and so on) can be assigned a value, with the post itself earning a score that is the sum of all the filters that affect it. The software can then be configured to show only those posts that meet a given threshold. In this way, everything a user doesn’t want to see is invisible, unless it has enough “good” in it to rise above the rest. Because there are diamonds in the rough.
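The scoring idea can be sketched in a few lines. This is only an illustration of the concept; slrn’s real scorefile syntax is far richer, and every name below is made up for the example.

```typescript
// Sketch of slrn-style scoring: each rule matches one field of a
// post and contributes a value (possibly negative); a post is
// shown only if its total score meets the threshold.

interface Post {
  sender: string;
  subject: string;
}

interface ScoreRule {
  field: keyof Post; // which header to test
  pattern: RegExp;   // what to match
  value: number;     // score to add on a match
}

function scorePost(post: Post, rules: ScoreRule[]): number {
  return rules
    .filter((r) => r.pattern.test(post[r.field]))
    .reduce((sum, r) => sum + r.value, 0);
}

function visiblePosts(posts: Post[], rules: ScoreRule[], threshold: number): Post[] {
  return posts.filter((p) => scorePost(p, rules) >= threshold);
}

const rules: ScoreRule[] = [
  { field: "sender", pattern: /spammer/, value: -100 },
  { field: "subject", pattern: /themis/i, value: 50 },
];
```

Because scores are summed, a post from a disliked sender can still surface if its subject earns enough positive points, which is exactly the “diamonds in the rough” effect.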
The score system works, but it’s pretty hard to get into. So, by default, Themis won’t have it. But that doesn’t mean you can’t use it. The platform I’m building will be extensible. It will allow alternative clients, not just the one I’m making. Thus, somebody out there (maybe even me, once I have time) can create something that rivals slrn and those other newsreaders with scoring features.
But the basics have to be there. At the moment, that means two things. First is an option to allow a user to “mute” groups and posters. This does about what you’d expect. On the main group list (the first step in reading on Themis), muted groups will not be shown. In the conversation panel, posts by muted users will not be shown, instead replaced by a marker that indicates their absence. In the future, you’ll have the option to show these despite the blocks.
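The display side of muting might look something like this. It’s a sketch with invented names, not Themis’s actual code, but it shows the key choice: muted posts are replaced by a marker rather than removed, so the shape of the conversation stays intact.

```typescript
// Sketch of the mute display logic. A muted author's post is not
// deleted; it becomes a placeholder marker in the thread view.

interface Post {
  author: string;
  text: string;
}

type Shown =
  | { kind: "post"; post: Post }
  | { kind: "muted"; author: string }; // the absence marker

function renderThread(posts: Post[], muted: Set<string>): Shown[] {
  return posts.map((p) =>
    muted.has(p.author)
      ? { kind: "muted", author: p.author }
      : { kind: "post", post: p }
  );
}

const thread: Post[] = [
  { author: "alice", text: "First post" },
  { author: "bob", text: "A reply" },
];

const view = renderThread(thread, new Set(["bob"]));
```

Keeping a marker also makes the promised future option, showing muted content on demand, a purely client-side toggle.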
Second is the stronger filtering system, which makes its first, rudimentary appearance in Alpha 4. Again, groups and users can be filtered (posts themselves will come a little later), and the criteria include names, servers, and profile information. As of right now, it’s mostly simple string filtering, plus a regex option for more advanced users. More will come in time, so stay tuned.
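In outline, a filter of this kind is just a list of rules, each naming a field, a pattern, and whether the pattern is a plain substring or a regex. The field names and types below are assumptions for the sake of the sketch, not the actual Alpha 4 schema.

```typescript
// Sketch of a simple string/regex filter over users and groups.
// A rule names a field, a pattern, and a matching mode.

interface FilterRule {
  field: "name" | "server" | "profile";
  pattern: string;
  isRegex: boolean;
}

interface Target {
  name: string;
  server: string;
  profile: string;
}

function ruleMatches(rule: FilterRule, target: Target): boolean {
  const value = target[rule.field];
  return rule.isRegex
    ? new RegExp(rule.pattern).test(value)
    : value.includes(rule.pattern); // plain substring match
}

// A target is filtered out if any rule matches it.
function isFiltered(target: Target, rules: FilterRule[]): boolean {
  return rules.some((r) => ruleMatches(r, target));
}

const myRules: FilterRule[] = [
  { field: "server", pattern: "badhost.example", isRegex: false },
  { field: "name", pattern: "^spam", isRegex: true },
];
```

Starting with substring matching and layering regex on top keeps the common case simple while leaving power users a path to more precise rules.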
This is why I started the project in the first place, and I hope you understand my reasoning. I do believe that open discussion is necessary, and that we can’t have that without, well, openness. By placing the bulk of the power back in the hands of the users, granting them the ability to create their own “filter bubbles” instead of imposing our own upon them, I think it’s possible. I think we can get past the idea that moderators, with all their foibles and imperfections, are an absolute necessity for an online forum. The result doesn’t have to be the anarchy of 4chan or Voat. We can have serious, civil conversations without being told how to have them. Hopefully, Themis will prove that.