Like many of you, I woke up this morning to discover that our instance, along with lemmy.world, had been unexpectedly added to the beehaw block list. Although this development initially caught me off guard, the administrators at beehaw made an announcement shedding light on their decision.

The primary concern raised was our instance’s policy of open registration. The fediverse is still navigating its early stages, and I believe that for it to mature, gain traction, and encourage adoption, instances must offer newcomers a simple, direct way to join and participate. This was one of the reasons I decided to launch this instance. I do acknowledge that this inclusive approach brings its own challenges, including the potential for toxicity and trolls, but I remain convinced that our collective strength as a community can overcome them.

After this happened, the beehaw admins and I had a good chat about their decision. While our stances on registration policies might diverge, we realized that our ultimate goals are aligned: we both strive to foster communities that thrive in an atmosphere of safety and respect, where users can passionately engage in discussions and feel a sense of belonging.

Although the chances of an immediate reversal are slim given the current circumstances, I believe we have managed to identify common ground. It’s evident that, even while defederated from one another, we can both contribute positively to the broader fediverse community.

In the coming weeks or months, we plan to collaborate with other Lemmy instance administrators to propose enhancements and modifications to the Lemmy project. Primarily, our proposals will concentrate on tools and features that empower us, as instance administrators, to moderate our platforms effectively.

In the meantime, while I understand this may not be ideal for everyone, users who wish to participate on the beehaw instance will need to register a separate account there.

Thank you all for continuing to make this community great!

  • JohnnyCanuck@sh.itjust.works · 1 year ago

    So I don’t know what solutions you have discussed with the other instance admins, and I actually know little about how it all works currently, but I had a thought about this for the fediverse as a whole: the admins/moderators of a user’s home instance should be moderating/responsible for that user’s engagement with communities on other instances.

    Right now, if Person A creates Instance A and a community on that instance becomes really popular fediverse-wide, Person A is left in the lurch, dealing with all of the engagement from everywhere else in the fediverse. Whether Instance A has 10 users or 100,000 users, they still have to deal with thousands of users from all over the fediverse. More than likely they’ll just want to defederate, especially if they are small. At the same time, if Person B creates Instance B that invites trolls (on purpose or not), it seems that they have little say in what their own users do in Instance A’s community. In fact, as you pointed out, Person B might not even know that a user from their instance is trolling Instance A.

    Instead, if mods on Instance A take any action against the user on their instance, mods on the user’s home instance (AKA Instance B) should immediately and automatically be notified. The moderators from Instance B can then deal with the user however they see fit. If they don’t see a problem, maybe they do nothing (e.g. the two instances have different philosophies). But if they do see an issue, they then have the opportunity to respond in whatever way makes sense. Then, between the two instances, if the actions taken on either side seem appropriate, the two instances can continue to get along (i.e. federate). If they disagree in some way (maybe Instance B thinks Instance A is too draconian, or maybe Instance A thinks Instance B is too lax), they can part ways (i.e. defederate).

    As an extension to this, it could help keep Instance B from being a source of brigading. If they suddenly see a bunch of reports coming in from Instance A, they would be able to take action on their own side to stop it, either by temporarily defederating or through some other mechanism.

    All in all, the purpose would be to give both instances the chance to deal with issues before defederating; hopefully taking some of the pressure off Instance A, and giving Instance B the opportunity to show whether it should be trusted (or not) in general.

    This could be taken a step further, and there could be trusted and untrusted federations. Trusted federations work like normal, while untrusted federations require mods from the user’s home instance to moderate all engagement before it actually posts to the remote instance. This puts a burden on the home instance, but that’s actually the point. If you’re willing to grow to large numbers and federate widely, then you need to be willing to moderate your users’ content, rather than imposing your users on everyone else (until they defederate). I’ve put a rough sketch of these ideas at the bottom of this comment.

    Edit to add: I should mention that I very much appreciate this instance and that I was able to easily create an account. Still, I was disappointed by the defederation, as it seems like the kind of thing that will keep Lemmy from scaling to something mainstream. I don’t think that’s what the creators of Lemmy want, anyway.
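
    To make those ideas concrete, here’s the rough sketch I mentioned. Nothing below corresponds to Lemmy’s actual code or API; the names, the TrustLevel concept, and the notify_instance hook are assumptions made up purely for illustration.

    ```python
    # Illustrative sketch only -- none of these names exist in Lemmy today.
    from dataclasses import dataclass
    from enum import Enum


    class TrustLevel(Enum):
        TRUSTED = "trusted"      # content from this federation posts immediately
        UNTRUSTED = "untrusted"  # content waits for home-instance review first


    @dataclass
    class ModAction:
        target_user: str  # e.g. "troll@instance-b.example"
        community: str    # e.g. "news@instance-a.example"
        action: str       # "remove_comment", "ban", ...
        reason: str


    def on_local_mod_action(action: ModAction, notify_instance) -> None:
        """Instance A moderated a remote user: automatically forward the report
        to the moderators of that user's home instance (Instance B)."""
        home_instance = action.target_user.split("@", 1)[1]
        notify_instance(home_instance, action)


    def should_publish_immediately(author_home: str,
                                   trust: dict[str, TrustLevel]) -> bool:
        """Trusted federations post right away; untrusted ones get queued for
        the author's home-instance mods to approve before federating out."""
        return trust.get(author_home, TrustLevel.UNTRUSTED) is TrustLevel.TRUSTED


    def looks_like_brigading(reports_from_remote: int, threshold: int = 20) -> bool:
        """If reports about your users from one remote instance spike, you can
        react (pause federation, review accounts) before being asked to."""
        return reports_from_remote >= threshold
    ```

    The point is just that the report forwarding, the trusted/untrusted gate, and the brigading check are all instance-level hooks, so the burden lands on the home instance rather than on whichever community its users happen to flood.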

    • flambonkscious@sh.itjust.works · 1 year ago

      I like your idea, but… who would really want to bother?

      It sounds like you’d potentially be parenting any/all of your users.

      • Difficult_Bit_1339@sh.itjust.works · 1 year ago

        The alternative is to become a platform where people go to register in order to be assholes. If the user population of your instance is too big for the moderation team, then close registrations until the workload is small enough.