• alyaza [they/she]@beehaw.orgM · 1 year ago

    “When you ban people from a website, they just move to another place; they are not stupid, and it’s pretty easy to create websites. It’s purely optical.”

    you are literally describing an event that induces the sort of entropy we’re talking about here. necessarily, when you ban a community of Nazis or something and they have to go somewhere else, not everybody moves to the next place, and those people diffuse back into the general population. that has a deradicalizing effect on them overall because they’re not just stewing in a cauldron of other people who reinforce their beliefs

    • jasory@programming.dev · 1 year ago

      “A deradicalising effect”

      I’m sorry what? The idea that smaller communities are somehow less radical is absurd.

      I think you are unaware (or, much more likely, willfully ignoring) that communities are primarily dominated by a few active users and simply viewed with varying degrees of support by non-engaging users.

      If they never valued the community enough to stay with it, then they never really cared about the cause to begin with. These aren’t the radicals you need to be concerned about.

      “And those people diffuse back into the general population”

      Because that doesn’t happen to a greater degree when they are exposed to the “general population” on the same website?

      • alyaza [they/she]@beehaw.orgM · 1 year ago

        “I’m sorry what? The idea that smaller communities are somehow less radical is absurd.”

        i’d like you to quote where i said this–and i’m just going to ignore everything else you say here until you do, because it’s not useful to have a discussion in which you completely misunderstand what i’m saying from the first sentence.

      • t3rmit3@beehaw.org · 1 year ago

        The deradicalizing effect occurs in the people who do not follow the fringe group to a new platform. Many people lurk on Reddit who will see extremist content there and be influenced by it, but who do not align with the group posting it directly, and will not seek them out after their subreddit is banned.

        • jasory@programming.dev · 1 year ago

          Sure, but what degree of influence is actually “radicalising”, or a point of concern?

          We like to pretend that by banning extreme communities we are saving civilisation from them. But the fact is that extreme groups are already rejected by society. If your ideas are not at least somewhat adjacent to already-held beliefs, you can’t just force people to accept them.

          I think a good example of this was the “fall” of Richard Spencer. All the leftist communities (in which I was semi-active at the time) credited his decline to the punch he received, apparently assuming that it was the act of punching itself that caused the decline, and used it to justify more violent actions. The reality is that Spencer just had a clique of friends that the left (and Spencer himself) interpreted as wide support; when he was punched, the greater public didn’t care, because they had never cared about him.

    • jarfil@beehaw.org · 1 year ago

      “deradicalizing effect on them overall because they’re not just stewing in a cauldron of other people who reinforce their beliefs”

      Whom are we talking about here: the ones who get kicked out and seek each other out in a more concentrated form, or the ones who are left behind without the radicalizing agents?

      I don’t want to have to deal with Nazis, or several other sects, but I don’t think forcing them into a smaller echo chamber is helping either.

      Ideally, I think a social platform should lure radicalizing agents, then expose them to de-radicalizing ones, without exposing everyone else. Might be a hard task to achieve, but worth it.

      • Zworf@beehaw.org · 1 year ago

        “Ideally, I think a social platform should lure radicalizing agents, then expose them to de-radicalizing ones, without exposing everyone else. Might be a hard task to achieve, but worth it.”

        You really think this works? I don’t. I just see them souring the atmosphere for everyone and attracting more mainstream users to their views.

        We’ve seen in Holland how this worked out. The nazi party leader (who chanted “Less Moroccans”) won the elections by a landslide a month ago. There is a real danger of disenchanted mainstreamers being attracted to nazi propaganda in droves. We’re stuck with them now for 4 years (unless they manage to collapse on their own, which I do hope).

        • jarfil@beehaw.org · 1 year ago

          No, that’s why I said “Ideally”, meaning it as a goal.

          I don’t think we have the means to do it yet, or at least I don’t know of any platform working like that, but I have some ideas of how some of it could be done.

          Back in the days of Digg, with some people, we spitballed some ideas for social networks: among them a movie-ranking one (which turned out to be a flop, because different people would categorize films differently), and a kind of PageRank for social networks, which back then was computationally impractical. But with modern LLMs running trillions of parameters, and further hardware advances, even O(n²) with n in the millions becomes feasible in real time, and in practice it wouldn’t need to do nearly that much work.

          With the right tuning, and dynamic message visibility, I think something like that could create the exact echo chambers that would attract X people, allow in de-X people, while keeping everyone else out and unbothered.
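          For anyone curious what “a kind of PageRank for social networks” means concretely, here is a toy power-iteration sketch (my own illustration, not anything we actually built; the graph and function names are made up). The point is that the naive dense version touches every pair of users, which is where the O(n²)-per-step cost comes from:

```python
# Toy PageRank via power iteration on a tiny "who follows whom" graph.
# Illustrative only: a real social graph is sparse, but a dense
# all-pairs influence matrix is what makes the naive approach O(n^2).

def pagerank(graph, damping=0.85, iters=50):
    """graph: {node: [nodes it links to]}; returns {node: score}."""
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # every node keeps a baseline (1 - damping) / n of rank
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                # split this node's rank evenly among its out-links
                share = damping * rank[v] / len(outs)
                for u in outs:
                    new[u] += share
            else:
                # dangling node: spread its rank evenly over everyone
                for u in nodes:
                    new[u] += damping * rank[v] / n
        rank = new
    return rank

scores = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
# "c" ends up ranked highest: it is linked to by both "a" and "b".
```

          Production systems exploit sparsity instead of the dense all-pairs form, which is part of why “it wouldn’t need to do nearly that much work” in practice.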

          Of course there is a dark side, in that a platform could use the same strategy to mold the opinion of any group… and I wouldn’t be surprised to learn that Meta had been doing exactly that.