I’m talking about this sort of thing. Like clearly I wouldn’t want someone to see that on my phone in the office or when I’m sat on a bus.
However, there seems to be a lot of these that aren’t filtered out by NSFW settings, when a similar picture of a woman would be, so it seems this is a deliberate feature I might not be understanding.
Discuss.
I curate my feed. However, because of the lack of an algorithm I don’t get suggestions. This means that I (and likely others) scroll through the Everything feed repeatedly.
But I’ll do you one better. Lemmy doesn’t prevent users under the age of 18 from joining so long as they are 13 or older (just like other platforms). There’s a reason that most, if not all, websites curate for NSFW content, and it’s to make what the public can view with or without an account safe for the children who are likely visiting those sites. That’s the reason Facebook won’t let you post half-nude photos publicly. It’s the reason Reddit has NSFW tags.

You’re preaching to the choir as far as users curating their content. NSFW tagging is literally a tool to use to curate the content you see. If your argument is that posters have no responsibility for the content they post, that’s just logically wrong. No laws work that way. It’s literally why platforms aren’t being held liable for the misinformation spread by their users.

I’m not complaining about random anime lewds. I’m pointing out that they are not safe for work, so they should be labeled as such, since that’s the status quo for not-safe-for-work content. In the same way that a lot of content related to the war in Ukraine and the conflict in Palestine is labeled that way.
If you’re gonna stay stuck on what the user should be doing, rather than treating all of these items the same in terms of what they are, then you’re gonna have a bad time, because I’m not entertaining that. I curate my feed and I don’t scroll the Everything feed at work, but that doesn’t mean I think other users shouldn’t be able to.