• LesbiansMadeMeGay@lemmy.blahaj.zone · 7 hours ago

    I think it’s pretty weird that you asked this question when it seems you aren’t interested in entertaining even one of the many arguments people are making (even pretty basic ones about literally illegal content, for some reason???)

  • NOT_RICK@lemmy.world · 11 hours ago

    Under-moderated and under-administered instances have ended up with child porn on them. If that shit gets federated out, it’s a real mess for everyone. Thankfully, screening tools seem more advanced now; it’s been a while since the last incident.

      • Leraje@lemmy.blahaj.zone · 10 hours ago

        That means the CSAM (it’s not ‘child porn’, it’s child abuse) remains on the server, which means the instance owner is legally liable. Don’t know about you, but if I were an instance owner I wouldn’t want the shame and legal consequences of leaving CSAM up on a server I control.

      • PhobosAnomaly · 11 hours ago

        Make a reliable way to automate that, and you’ll make a lot of money.

        Rely on doing it yourself, and… well, good luck with your mental health in a few years’ time.

        • FaceDeer@fedia.io · 10 hours ago

          AI would be able to do a good first pass on it. Except that an AI that was able to reliably recognize child porn would be a useful tool for creating child porn, so maybe don’t advertise that you’ve got one on the job.
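
          In practice, the “first pass” most large platforms run is hash matching against databases of known abuse imagery rather than a general-purpose classifier, which sidesteps the dual-use problem. A minimal sketch of that idea in Python, using the open-source imagehash library and a hypothetical local blocklist.txt of known-bad perceptual hashes:

          ```python
          # Sketch of hash-based first-pass screening. "blocklist.txt" is a
          # hypothetical file of hex-encoded perceptual hashes of known-bad
          # images; real systems query maintained databases (e.g. PhotoDNA).
          from PIL import Image
          import imagehash

          MAX_DISTANCE = 5  # Hamming-distance threshold; lower = stricter

          def load_blocklist(path):
              with open(path) as f:
                  return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

          def needs_review(image_path, blocklist):
              """True if the image is near a known-bad hash and should go to a human."""
              h = imagehash.phash(Image.open(image_path))
              return any(h - known <= MAX_DISTANCE for known in blocklist)
          ```

          Hash matching only catches material that is already known; novel content still needs a classifier plus human review, which is where the dual-use worry above comes in.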

        • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

          So that’s the indispensable service that admins provide: child porn filtering.

          I didn’t realize it was such a large job. So large that it justifies the presence of a cop in every conversation? I dunno.

          • PhobosAnomaly · 11 hours ago

            I’ve read through a few of your replies, and they generally contain a “so, …” followed by an inaccurate summary of what the conversation thread is about. I don’t know whether there’s a language barrier here or you’re being deliberately obtuse.

            It would appear that your desire for a community without moderators is so strong that a platform like Lemmy is not suitable for what you want; as such, you are likely not going to find the answer you want here and will spend your time arguing against the flow.

            Good luck finding what you’re looking for 👍

          • Zak@lemmy.world · 10 hours ago

            If your questions are concrete and in the context of Lemmy or the Fediverse more broadly, admins provide the service of paying for and operating the servers in addition to moderation.

            If it’s more abstract, i.e. “can people talk to each other over the internet without moderators?”, then my experience is that they usually can when the group is small, but things deteriorate as it grows larger. The threshold for where that happens is higher if the group has a purpose or if the people already know each other.

      • partial_accumen@lemmy.world · 11 hours ago

        Surely filtering out childporn is something that I can do for myself.

        Even if that were a viable solution (it isn’t), humans employed to filter out this disgusting content (and worse) are frequently psychologically damaged by the exposure. This includes people at online content moderation companies and those in law enforcement who have to deal with that material for evidentiary reasons.

        The reason it’s not a viable solution: if YOU block it out because YOU don’t want to see it but it’s still there, the place becomes a magnet for those who DO want to see it, because they know it’s allowed. The value of the remaining legitimate content goes down as more of your time is spent blocking the objectionable material yourself, until it’s too much for anyone who doesn’t want that stuff and they leave. Then the community dies.

        • Spiderwort@lemmy.dbzer0.com (OP) · 9 hours ago

          Personal cp filtering automation and a shared blacklist. That would take care of the problem. No moderator required.

          • xmunk@sh.itjust.works · 2 hours ago

            If you could write an automated filter that blocks CSAM, Apple, Meta, Alphabet and others would happily shovel billions at you. Blocking CSAM is a constant and highly expensive job… and when they fuck up, it’s a PR shit storm.

            • Spiderwort@lemmy.dbzer0.com (OP) · 1 hour ago

              Maybe keeping it off the network is a lost cause. If we each block it with personal filtering then that changes the face of the issue.

              • xmunk@sh.itjust.works · 41 minutes ago

                If Lemmy becomes a hub for those who want to trade CSAM, it will be taken down by the government. This isn’t something that can be allowed onto the system.

          • db0@lemmy.dbzer0.com · 3 hours ago

            Personal cp filtering automation and a shared blacklist

            Oh just those, eh?

            Just goes to show how little idea you have of how difficult this problem is.

      • NeoNachtwaechter@lemmy.world · 10 hours ago

        filtering out […] I can do for myself.

        It still means too much legal trouble for the admin if the offending data remains on the server.

      • BlameThePeacock@lemmy.ca · 11 hours ago

        The admin keeps the server running.

        As for moderation, it’s far more time efficient for a small group of people to handle this than it is to leave it up to individual users.

        If one person posts a spam message, it’s easier for a couple of people to report it and a moderator to remove the post/user than for a thousand people to each see it and decide whether to ignore or block it.
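
        A rough back-of-the-envelope makes the gap concrete (every number here is an invented assumption, not a measurement):

        ```python
        # Illustrative cost comparison: each reader filtering for themselves
        # vs. a few reports plus one moderator action. All numbers assumed.
        readers = 1000
        seconds_per_reader_decision = 5   # each reader judging "is this spam?"
        reports = 3                       # a few users file a report
        seconds_per_report = 10
        seconds_per_mod_removal = 60      # one moderator reviews and removes

        self_moderation = readers * seconds_per_reader_decision  # 5000 s
        central_moderation = reports * seconds_per_report + seconds_per_mod_removal  # 90 s
        print(self_moderation, central_moderation)
        ```

        Under these assumptions, the central route costs roughly a fiftieth of the total human time, and the gap widens as the audience grows.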

        • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

          Yes, it’s efficient. And the price is that you now have a cop telling us who can talk and what they can say. Maybe a good cop, maybe a bad one, probably limited in the ways that people generally are. But this is obvious.

          Ideally the conversation would be controlled by its participants and none other. That’s also obvious.

          • BlameThePeacock@lemmy.ca · 11 hours ago

            No, it’s not obvious that conversations would be controlled by their participants when there are hundreds or thousands of participants.

            It works fine for 5 people, or even 10, but not once it scales beyond a certain point.

            Just like having a voice call with 5 or 10 people can work, but with 1000 people you have to force mute everyone or it’s going to be a shit show.

            • Spiderwort@lemmy.dbzer0.com (OP) · 10 hours ago

              For each participant in the conversation, tools to navigate the complexities of the 1,000-person conversation. Why not? What’s so special about an overarching authority?

          • NeoNachtwaechter@lemmy.world · 10 hours ago

            And the price is you have a cop now, telling us who can talk and what they can say.

            Lemmy’s solution is that you can vote with your feet (i.e. choose another instance with an admin to your liking).

            Of course the solution is incomplete. Real bad admins remain a possibility.

            That is kind of like a free world: real bad humans are a possibility.

            But:

            Big fat BUT:

            If you really want a world where all bad people (according to your own definition) are excluded, then you have turned yourself into that cop that you despise so much.

            • Spiderwort@lemmy.dbzer0.com (OP) · 10 hours ago

              I think that being the cop in charge of my own perspective is quite acceptable. It’s putting other people in charge that I want to avoid.

      • NOT_RICK@lemmy.world · 11 hours ago

        They just pay for and maintain the server you use. Don’t forget to donate; DB0 is pretty cool.

        • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

          Paying for the server is one thing.

          Managing our conversations is quite another.

          The first gives them the power to do the second. Yes.

            • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

              Yes, it gives them that power.

              But do we need their management services for a well-functioning conversation? That’s my question.

              • NOT_RICK@lemmy.world · 11 hours ago

                I think that works OK for general-interest pages, but I don’t want to see Sonic porn on a page that’s supposed to be about video game news, and I don’t feel like sorting that out manually myself. People don’t always mark stuff correctly. Even 4chan keeps its boards on topic, and it doesn’t have heavy-handed moderation; at least it didn’t back when I used to go on there as a teen.

                • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

                  So if you had a tool that could filter out the Sonic porn for you, you would happily forgo having an admin?

  • schnurrito@discuss.tchncs.de · 10 hours ago

    Suggested reading:

    • Moderation Is Different From Censorship (if you want people to enjoy being on a platform, you need to make sure people see things they enjoy seeing even if other people have been posting other things too)
    • Hey Elon: Let Me Help You Speed Run The Content Moderation Learning Curve (there are also other reasons, including legal reasons, why you can’t have a “censorship-free system” for very long; someone else already raised the point that if you build a “censorship-free system”, the government is eventually going to shut you down for hosting child porn)
    • Spiderwort@lemmy.dbzer0.com (OP) · 10 hours ago

      Given child porn, we’d like to exclude it from the conversation.

      We could handle it personally.

      We could have a human offering censoring advice.

      We could have an AI doing that.

      We could have a shared list identifying it for us, compiled by any or all of the above.

      We could weight any of those by trustworthiness (see the sketch below).
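
      Read concretely, that last option could work like this: treat each filter source as a voter with a trust weight, and hide an item when the weighted vote crosses a threshold. A hypothetical sketch; the source names, weights, and threshold are all invented for illustration:

      ```python
      # Hypothetical trust-weighted filtering: each source "votes" on whether
      # an item should be hidden, weighted by how much you trust it.
      SOURCES = {
          "my_own_rules": 1.0,
          "shared_blacklist": 0.8,
          "ai_classifier": 0.5,
      }
      THRESHOLD = 0.7  # fraction of total trust that must flag the item

      def blocked(verdicts):
          """verdicts maps source name -> True if that source flags the item."""
          total = sum(SOURCES.values())
          score = sum(SOURCES[name] for name, flagged in verdicts.items() if flagged)
          return score / total >= THRESHOLD

      # The shared list and the AI flag an item; your own rules have not.
      print(blocked({"shared_blacklist": True, "ai_classifier": True,
                     "my_own_rules": False}))  # 1.3/2.3 ≈ 0.57 -> False
      ```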

      • Semjaza@lemmynsfw.com · 9 hours ago

        Moderators can be chosen by the people, and changed more easily.

        Machine algorithms are a black box of uncertainty, and the ineffectiveness and mess of things like the YouTube moderation algorithm are hardly an endorsement of them.

        What’s wrong with having moderators?

        • Spiderwort@lemmy.dbzer0.com (OP) · 1 hour ago

          I don’t like having a cop hovering over our conversation telling us who can talk and what we can say. That’s what is wrong with having moderators.

  • FelixCress@lemmy.world · 9 hours ago

    Elon Muskler and other far-right morons poisoned the idea of unmoderated discussion, but I still very much support it.

    Moderators’ role should mostly be limited to removing:

    • illegal content
    • obvious trolling
    • nazi propaganda including racism

    They should also ensure that individual communities are places where people can freely express their views without fear of being downvoted into oblivion by users brigading from other communities just to blanket-downvote.

    • haui@lemmy.giftedmc.com · 9 hours ago

      I think that’s reasonable. That’s why we have different instances and communities with different rules. Works great, imo.

  • tal@lemmy.today · 11 hours ago

    Well, it depends on what you’re aiming for.

    My experience has generally been that if you try to have a conversation in an unmoderated environment, there is a very small percentage of people who enjoy derailing other people’s conversations. Could be just posting giant images or whatever. And it doesn’t take a high percentage to derail conversations.

    There are more hands-off places that do have communities; 4chan, say. Not the same thing, but there are certainly people who like that.

    But, in any event, if you want to have a zero-admin, zero-moderator discussion, you can do it. Set up an mbin/lemmy/piefed instance. State that your instance rules are “anything goes”. Then start a community on it and say that you have no rules and give it a shot.

    I tend to favor a more hands-off policy than many, but even with that, I think there are typically going to be people who will just try to stop users from talking to each other.

    • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

      Yes trolls.

      But instead of having an admin manage them for us we could have tools for managing them ourselves. That’s an option.

      So the alternative to having an admin is not just chaos. Not at all.
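
      As a sketch of what such self-serve tools could look like at the client end: a personal blocklist plus muted phrases, applied before the feed is shown. Everything here is hypothetical and invented for illustration:

      ```python
      # Hypothetical client-side troll filter: drop posts from personally
      # blocked users or containing muted phrases before display.
      from dataclasses import dataclass

      @dataclass
      class Post:
          author: str
          body: str

      BLOCKED_USERS = {"troll@example.instance"}   # invented account name
      MUTED_PHRASES = ("derail",)                  # invented muted phrase

      def visible(post):
          if post.author in BLOCKED_USERS:
              return False
          return not any(p in post.body.lower() for p in MUTED_PHRASES)

      feed = [Post("troll@example.instance", "giant image spam"),
              Post("friend@example.instance", "Interesting point about federation.")]
      print([p.body for p in feed if visible(p)])  # only the friend's post remains
      ```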

  • haui@lemmy.giftedmc.com · 9 hours ago

    Mods and admins are mostly for technical and legal issues. You can choose your instance by its rules or make your own. Managing your convos is your job on the fedi.

  • Opinionhaver · 11 hours ago

    It’s always going to be a tradeoff. If it’s unmoderated, you’re going to encounter a lot more spam and off-topic content, which you have to deal with by yourself; with moderators, you’re going to get power-tripping. With a truly competent AI moderator you could possibly get around this, but not with humans.

    In my case, I’m already doing the majority of the managing of what I see on my feed by myself, so I doubt I’d see a huge difference were the moderators to disappear.

  • jet@hackertalks.com · 11 hours ago

    Yes, you do it all the time in real life when talking to people. If someone isn’t contributing to the conversation, you don’t invite them back, or you change the topic or leave. It’s normal human behavior.

  • venusaur@lemmy.world · 9 hours ago

    Only if you think you need censorship to protect your fragile ego. Something offended you? Move on, loser.

    • haui@lemmy.giftedmc.com · 9 hours ago

      Braindead take. You can want to exchange ideas without being harassed. There are rules in every instance and community. Mods enforce the community rules; admins take care of technical issues. Don’t like it? Make an unmoderated and (soon) defederated instance. If you just want to shit on people, just move to Threads.