One of the admins at lemmy.blahaj.zone asked us to purge a community and all of its users because they thought it was full of child sexual abuse material, aka CSAM, fka kiddy porn. We assured them that we had checked this comm thoroughly and we were satisfied that all of the models on it were of age.

The admin then demanded we purge the comm because they mistook it for CSAM, and claimed that the entire point of the community was to make people think it was CSAM. We vehemently disagreed that that was in fact the point of the community, but they decided to defederate from us anyway. That is of course their choice, but we will not purge our communities or users because someone else makes a mistake of fact, and then lays the responsibility for their mistake at our feet.

If someone made a community intended to fool people into thinking it was kiddy porn, that would be a real problem. If someone of age goes online and pretends – not roleplays, but pretends with intent to deceive – to be a child and makes porn, that is a real problem. Nobody here is doing that.

One of the reasons we run our instance the way that we do is that we want it to be inclusive. We don’t body shame, and we believe that all adults have a right to sexual expression. That means no adult on our instance is too thin, fat, bald, masculine, old, young, cis, gay, etc., to be sexy, and that includes adults that look younger than some people think they should. Everyone has a right to lust and to be lusted after. There’s no way to draw a line that says “you can’t like adult people that look like X” without crossing a line that we will not cross.

EDIT: OK, closing this post to new comments. Everything that needs saying has been said. Link to my convo with the blahaj admin here.

  • Mikey Mongol @lemmynsfw.com (OP, mod) · 1 year ago

    Jailbait is by definition people that are under legal age, or at least pretending to be. If someone is of legal age they by definition cannot be jailbait. The appeal there is the violation of the statutory taboo, the allure of the forbidden fruit. That is not OK and we won’t tolerate that.

    If a grownup has a round face and is wearing braces, which is the post that I suspect launched this whole kerfuffle, that’s just how they look. I’m not going to tell them that they can’t be sexy, or you that you can’t be into them being sexy, because of their face or their dental work. Now, you can see why someone might be concerned that someone that has braces might be underage, since many people that are underage have braces. But once we’ve confirmed that they aren’t underage, that should be the end of it.

      • Shit@sh.itjust.works · 1 year ago

      Was it just that one post that caused all this? Honestly, I was expecting something way worse, not a single post in a fairly active-looking community…

        • KairuByte@lemmy.world · 1 year ago

          They decided that, because they mistook it for CSAM, it should be taken down, and the entire community with it.

        Because they assumed one image was CSAM.

        It’s kinda nuts.

          • Anais Rim@mastodon.social · 1 year ago

          @KairuByte @Shit

            After reading Ada’s post and the comments there (those not removed; many were), it’s my opinion the admins there wanted to defederate lemmynsfw anyway and this was a convenient excuse.

          Regardless, it’s their server. Many users there support the decision. Their right, and if the userbase wants that they’ve chosen the right instance for them. Those who don’t want that outcome will move.

            • KairuByte@lemmy.world · 1 year ago

            I’m inclined to agree with you. Though I’ll argue that most users over there are agreeing based on a colorful interpretation of what happened, assuming that there is indeed a community based around legal porn meant to look like CSAM… which doesn’t appear to be the case at all. Look at the community in question (!adorableporn@lemmynsfw.com) and you’ll notice a lack of anything encouraging people to present as underage.

              • Shit@sh.itjust.works · 1 year ago

                I wish one side would just post the chat logs between them to clear the air on what really happened.

                  • Shit@sh.itjust.works · 1 year ago

                    I got no skin in this game but you totally should reach out and ask her. ¯\_(ツ)_/¯ Might clear the air on what’s up.

              • Anais Rim@mastodon.social · 1 year ago

              @KairuByte

                I’ve seen it. IMO the side panel explicitly says “childlike”, and I can see why some might have a problem with that; it suggests the purpose Ada objected to over there. I think the mods and admins might want to change that text so that it makes no reference to anyone underage in the context of porn.

              Clearly, all participants are over 18. Good.

          • GBU_28@lemm.ee · 1 year ago

            I understand the knee-jerk reaction… Doesn’t federation mean they are potentially possessing copies of that content, hosting it, by being federated?

            • KairuByte@lemmy.world · 1 year ago

            So, yes. Their instance would have copies of content viewed by their users. That said, they didn’t defederate because of CSAM, which would make perfect sense. They defederated because they made an incorrect assumption, and then wanted an entire community nuked because of that assumption… even after they were corrected.

            The moment things were made clear, they should have said “oh okay, our bad.” But instead they doubled down.

              • GBU_28@lemm.ee · 1 year ago

                But if you had the anxiety and fear in your heart that boots were about to kick in your door, and hell, that you are facilitating the consumption of CSAM, would a few DMs really put you at ease?

                Empathetically, assume you had already accepted that the worst was occurring; I believe it would be very hard to adjust course and sleep at night.

                • KairuByte@lemmy.world · 1 year ago

                Honestly, I don’t know how the law would handle this kind of situation. But in my mind, the only time you’re in legal hot water is when (a) there is actual CSAM involved, and (b) nothing is done to prevent that association.

                  In this case, (a) was proven to be false, so there’s no concern. But if it had been true, then defederation would make sense.

                Otherwise, there’s no reason to federate at all. Anyone can post CSAM on any instance at any time. There’s nothing in place to detect it, nothing in place to handle it other than manual moderation. That’s just a hard fact of lemmy instance hosting.
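
                  For context, “detecting it” in practice usually means perceptual-hash matching against databases of known material (PhotoDNA, PDQ, and the like), which instance admins would have to wire up themselves. Here is a minimal sketch of that idea, assuming Python with the third-party Pillow and imagehash packages; the blocklist file, threshold, and filenames are hypothetical stand-ins, not anything Lemmy actually ships:

                  ```python
                  # Minimal sketch of perceptual-hash screening for uploads.
                  # The blocklist file and threshold are hypothetical stand-ins
                  # for a real hash database such as PhotoDNA or PDQ.
                  from PIL import Image  # third-party package: Pillow
                  import imagehash       # third-party package: ImageHash

                  BLOCKLIST_PATH = "known_bad_hashes.txt"  # hypothetical: one hex hash per line
                  MAX_DISTANCE = 5  # Hamming-distance threshold for calling two images a match

                  def load_blocklist(path: str) -> list[imagehash.ImageHash]:
                      with open(path) as f:
                          return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

                  def is_flagged(image_path: str, blocklist: list[imagehash.ImageHash]) -> bool:
                      """True if the upload is perceptually close to any known-bad image."""
                      candidate = imagehash.phash(Image.open(image_path))
                      # Subtracting two ImageHash values yields their Hamming distance.
                      return any(candidate - bad <= MAX_DISTANCE for bad in blocklist)

                  if __name__ == "__main__":
                      blocklist = load_blocklist(BLOCKLIST_PATH)
                      if is_flagged("upload.jpg", blocklist):  # hypothetical upload
                          print("Hold for moderator review and report per policy.")
                  ```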

                • GBU_28@lemm.ee · 1 year ago

                  I enjoy reading and commenting here, but this is my back-of-mind fear for federated spaces like Lemmy.

                  Bad actors could spam suspicious or actual CSAM.

                  All it takes is one admin/host being “made an example of” to really shake the system.

                  I hope I’m wrong and ignorant of the realities of the law / prosecution.

                  • KairuByte@lemmy.world · 1 year ago

                    Note: I deleted my comment by mistake. X.x

                    So I think most of the time we would be in the clear, as long as actual CSAM is handled when it is found/reported.

                    Just like Reddit doesn’t get hauled to court when CSAM is posted. And mods don’t get arrested for viewing it while they are removing it.


                  • Mikey Mongol @lemmynsfw.com (OP, mod) · 1 year ago

                    We are extremely aware of this possibility and have taken many active steps against it, and we are scrupulously staying on the right side of US law when it comes to reporting potential CSAM. As stated in our FAQ, preventing CSAM on our instance is our highest priority.

          • assqrw@lemmynsfw.com · 1 year ago

            > Doesn’t federation mean they are potentially possessing copies of that content, hosting it, by being federated?

            Media isn’t replicated. Lemmy is a link aggregator: posts include links to content, and that is what is replicated. The fact that instances let you upload images directly does make that a bit confusing, but if you look at a post from one instance on another, the post’s link is still the image on the original instance and is fetched from there. The only local media from federated instances are the thumbnails, which are generated and stored locally. That’s still a problem in the case of some illegal content, but less so.
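
            To make that concrete, here’s a minimal sketch (Python; the hostnames and pictrs paths are hypothetical examples, not taken from this thread) of classifying where a federated post’s media actually lives:

            ```python
            # Minimal sketch: decide whether a media URL points at this
            # instance (a locally stored file or thumbnail) or at the
            # origin instance it federated from. Hostnames are hypothetical.
            from urllib.parse import urlparse

            LOCAL_INSTANCE = "sh.itjust.works"  # the instance doing the checking

            def media_location(media_url: str) -> str:
                host = urlparse(media_url).hostname or ""
                if host == LOCAL_INSTANCE:
                    return "stored locally"
                return f"linked; fetched from origin ({host})"

            # The full image stays on the origin instance; only a locally
            # generated thumbnail would live on this one.
            print(media_location("https://lemmynsfw.com/pictrs/image/abc123.webp"))
            print(media_location("https://sh.itjust.works/pictrs/image/thumb456.webp"))
            ```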