We’re looking to put together some more detailed rules on what should and should not be submitted to the instance, including (but not limited to):

  • What types of message you would always like to see removed on sight
  • Whether there are any types of message which should be left up (borderline, with strong corrections from the community)
  • Where the line is drawn on political views (and how gray areas should be treated)

I’ll make no bones about it: moderating uk/ukpol has been a learning experience for me.
I’ve learned that there often isn’t much difference between “leaving a comment up because the community has done an excellent job highlighting flaws” and “I should have removed this hours ago, the community shouldn’t have to do this”.
As there isn’t a way to mod-tag a post, inaction on negative posts can reflect badly on the instance as a whole.

Having some clear guidelines/rules will hopefully simplify things.
And more admins should mean that if a report isn’t looked at, someone can review it as an escalation.

I’ve also enabled the slur filters, and we’ll keep an eye on whether anything needs adding or removing (the template had swearing blocked :| )

So…Answers on a postcard, I guess!

  • zzpza
    1 year ago

    I’ve been at the thick end of this fight for over a decade (on reddit). The vast majority of the work I and my co-mods do is never seen by anyone outside the mod team.

    > If it’s an agenda that goes against what people generally think, don’t they get downvoted?

    It depends. Very few are stupid enough to say anything that’s obviously bad. Most use dog whistles and innuendo. And when they do speak plainly, there’s an army of their likeminded friends to drown out any dissenting voices (brigading) and inflate their (and similar) comments.

    > That sounds like their own communities? Those are easy to spot and back away from. If it’s just a group of individuals whose agenda goes against the grain, those “discussions, posts, subjects” are either controversial to begin with or they aren’t of enough significance to make a difference anyway. Where it matters, wouldn’t they be overwhelmingly downvoted?

    Yes, bad actors can have their own community. No, you don’t want to go there. Did you ever see /r/MGTOW, or /r/PussyPassDenied? Misogyny is rife; homophobia, transphobia, xenophobia, racism, and the like are all more common than they were 5–10 years ago.

    > I’m not aware of having seen this when it hasn’t been dealt with by mods/admins, usually by locking the post or deleting the comment and with bans. Where you see damage being done through the post having quickly turned toxic, I see the moderation that follows as a red flag.

    If a mod is any good, and knows the group the bad actors are from, then the majority of the work is done behind closed doors (automod for dog whistles and known phrases, etc).

    Bans are seen as a “badge of honor” by most of these people, since it’s so easy to create a new account. They can also wait to appeal after 3 months, since that’s how far back the moderation log goes (on reddit), so unless you’ve kept notes and evidence it’s easy for them to play the fool and claim they’ve “turned over a new leaf”.

    This doesn’t even touch on suspected state actors / state-run bot accounts. Several UK regional subreddits saw a wave of anti-Ukrainian posts (false news reports about robbery, attacks, theft, etc. by refugees) in an attempt to destabilise the UK’s support for Ukraine shortly after the Russian invasion.

    Or the “bad news” accounts that just go from regional subreddit to subreddit posting (legitimate) news stories about bad things. Rape, murder, assault. Anything that can stir up some rage. They never comment, and post at all hours of the day, every day of the week.