• @spacedogroy
    9 months ago

    I think this is probably the highest risk. I honestly think Lemmy needs some automated solution for monitoring and scanning for CSAM content, because as a user I don’t want to be exposed to it, and as an admin I wouldn’t want to be in the position of being responsible for censoring it.

    I think lemmy.world has kind of made a good point here: we need an admin in place who’s reactive and willing to maintain the server regularly - in both a moderation and a technical sense - or we should consider migrating to a larger instance.

    This is no dig at @tom, as he’s done a phenomenal job here and has undoubtedly spent time and money creating this instance, but it would be good to get a sense from him of whether he really wants to continue with the project. If not, that should lead to a larger discussion of where to go from here, because I don’t think the status quo is sustainable.

    • @mackwinston
      9 months ago

      What’s CSAM? (I don’t want to google it, it sounds “risky” from the context).

      • @fakeman_pretendname
        9 months ago

        It basically means “pornography with children in it”, though I’m unsure of the specifics of the abbreviation and likewise don’t want to google it.