The-podcast guy recently linked this essay. It's old, but I don't think it's significantly wrong (despite GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.

  • DefinitelyNotAPhone [he/him]@hexbear.net
    7 months ago

    Meh, this is basically just someone being Big Mad about the popular choice of metaphor for neurology. Like, yes, the human brain doesn’t have RAM or store bits in an array to represent numbers, but one could describe short-term memory with that metaphor and be largely correct.

    Biological cognition is poorly understood primarily because the medium it is expressed on is incomprehensibly complex. Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that’s just a static snapshot. But ultimately it is something that occurs in the material world under the same rules as everything else, and it does not have some metaphysical component that somehow makes it impossible to simulate in software, in much the same way we’d model a star’s life cycle or galaxy formation; it’s just unimaginable with current technology.
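    A back-of-envelope sketch of that scale (both constants are assumptions for illustration: the ~1.4 PB per mm³ figure is in line with the published 1 mm³ human cortex reconstruction, and the ~1.2 million mm³ whole-brain volume is a rough textbook approximation):

```python
# Rough scale estimate: extrapolating the 1 mm^3 snapshot to a whole brain.
# Both constants are approximations, not measurements from this thread.
PB_PER_MM3 = 1.4          # ~petabytes of imaging data per cubic millimeter
BRAIN_VOLUME_MM3 = 1.2e6  # ~1200 cm^3 of brain tissue, expressed in mm^3

total_pb = PB_PER_MM3 * BRAIN_VOLUME_MM3
total_zb = total_pb / 1e6  # 1 zettabyte = 1e6 petabytes

print(f"~{total_pb:,.0f} PB (~{total_zb:.2f} ZB) for one static snapshot")
```

    That is on the order of a couple of zettabytes for a single frozen wiring diagram, before any dynamics are simulated at all.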

    • Formerlyfarman [none/use name]@hexbear.net
      7 months ago

      The OP isn’t arguing it has a metaphysical component. It’s arguing the structure of the brain is different from the structure of your PC. The metaphor bit is important because all thinking is metaphor, with different levels of rigor and abstraction. A faulty metaphor forces you to think the wrong way.

      I do disagree with some things, though: what’s a metaphor if not a model? What’s reacting to stimuli if not processing information?

      • Frank [he/him, he/him]@hexbear.net
        7 months ago

        The OP isn’t arguing it has a metaphysical component.

        Yes, they are. They might scream in your face that they’re not, but the argument they’re making is based not on science and observation but rather on the chains of a Christian culture they do not feel and cannot see.

        A faulty metaphor forces you to think the wrong way.

        The Sapir-Whorf hypothesis, if it’s accurate at all, does not have a strong effect.

        What’s a metaphor if not a model?

        To quote the dictionary: “a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable.” Which seems to be the real problem here: psychologists and philosophers hear someone using a metaphor and think they must literally believe what the psychologist or philosopher believes about the symbol being used.

        • Formerlyfarman [none/use name]@hexbear.net
          7 months ago

          I think you are right. Our disagreement comes from whether the metaphor refers to structure or just to language. Take an atomic model where the electrons fly around a nucleus forming shells: it’s also not literally applicable, but we think of it as a useful metaphor because it’s close enough.

          The same should apply to the most sophisticated mathematical models. A useful metaphor should then be a more primitive form of the process, one that illustrates a mechanism. If the mechanism is different from the mechanism in the metaphor, then the metaphor should be wrong.

          If the metaphor is just there to provide names, then you are of course right that it should not change anything.

          Whether the metaphor of computers and brains is correct or not should also have no effect on whether we can simulate a brain in a computer. Computers can, after all, simulate many things that do not work like computers.

    • SerLava [he/him]@hexbear.net
      7 months ago

      Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that’s just a static snapshot

      I read long ago that replicating all the functions of a human brain is probably possible with computers around one order of magnitude less powerful than the brain, because the brain is kind of inefficient.

      • bumpusoot [any]@hexbear.net
        7 months ago

        There’s no way we can know that, currently. The brain does work in all sorts of ways we really don’t understand. Much like the history of understanding DNA, what gets written off as “random inefficiency” is almost certainly a fundamental part of how it works.

      • FunkyStuff [he/him]@hexbear.net
        7 months ago

        Resident dumb guy chipping in, but are these two facts mutually exclusive? Assuming both are true, it just means you’d need a computer that’s 1e12x as powerful as our supercomputers to simulate the brain, which is itself 1e13x as powerful as a supercomputer. So we’re still not getting there anytime soon.

        *With a very loose meaning of “powerful”, seeing as the way the brain works is completely different from a computer that calculates in discrete steps.
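        The arithmetic in that reconciliation, spelled out (both ratios are the thread’s hypothetical numbers, not measured values):

```python
# Hypothetical ratios taken from the comments above, not measurements.
brain_vs_supercomputer = 1e13  # assumed: brain ~1e13x a supercomputer's "power"
efficiency_discount = 10       # the "one order of magnitude less" claim

simulator_vs_supercomputer = brain_vs_supercomputer / efficiency_discount
print(f"Simulator needed: ~{simulator_vs_supercomputer:.0e}x a supercomputer")
# Both claims hold together: the 10x discount barely dents the 1e13x gap.
```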

    • plinky [he/him]@hexbear.netOP
      7 months ago

      I could describe it as a gold hunter with one of those sluice thingies, throwing the water out and keeping the gold. There, I’ve described short-term memory.

      shrug-outta-hecks

      I don’t disagree that it’s a material process. I just think we find the most complex analogy we have at the time and take it (as the author mentions), but then start taking the metaphor too far.

      • Frank [he/him, he/him]@hexbear.net
        7 months ago

        Yeah, but we, if “we” is people who have a basic understanding of neuroscience, aren’t taking it too far. The author is yelling at a straw man, or at lay people, which is equally pointless. Neuroscientists don’t think of the mind, or the brain it runs on, as being a literal digital computer. They have their own completely incomprehensible jargon for discussing the brain and the mind, and if this article is taken at face value, the author either doesn’t know that or is talking to someone other than people who do actual cognitive research.

        I’ma be honest, I think there might be some academic infighting here. Psychology is a field with little meaningful rigor and poor explanatory power, while neuroscience is on much firmer ground and has largely upended the theories arising from Epstein’s heyday. I think he might be feeling the icy hand of mortality in his chest and is upset that the world has moved past him and his ideas.

        Also, the gold miner isn’t a good metaphor. In that metaphor information only goes one way and is sifted out of chaos. There’s no place in the metaphor for a process of encoding, retrieving, or modifying information. It does not resemble the action of the mind and cannot be used as a rough and ready metaphor for discussing the mind.

        • Sidereal223 [he/him]@hexbear.net
          7 months ago

          I work in neuroscience, and I don’t agree that it is on much firmer ground than psychology. In fact, as some people in the community have noted, the neuroscience mainstream is probably still in the pre-paradigmatic stage (in Kuhn’s sense). And believe it or not, a lot of neuroscientists naively do believe that the brain is like a computer (maybe not exactly one, but very close).

        • plinky [he/him]@hexbear.netOP
          7 months ago

          Sure there is: encoding is taking sand from the river (turning noise from the world into comprehensible inputs), storage is keeping the gold, and modifying is throwing some bits out or taking them to the smith.

          From the bottom up (and in the middle, if we count partial electrical, ultrasound, or magnetic stimulation), neuroscience advances are significant but rather vague. We likely know how memory works on the molecular level, but that has jack shit to do with information pipelines; it comes from rigorous experiments, or, in the case of machine-human interfaces, more like skilled interpretation of what you see and knowing where to look for it (you can ascribe that to a top-down approach).

          Neuroscientists likely don’t, but I think you have a rather nicer opinion than I do of tech bros, and of the spread of their ideas among people.

          • Frank [he/him, he/him]@hexbear.net
            7 months ago

            My opinion of tech bros is that anyone deserving the label “tech bro” is a dangerous twit who should be under the full-time supervision of someone with humanities training, a gun, and orders to use it if the tech bro starts showing signs of independent thought. It’s a thoroughly pathological world view, a band of lethally competent illiterates who think they hold all human knowledge and wisdom. If this is all directed at tech bros, I likely didn’t realize it, because I consider trying to teach nuance to tech bros about as useful as trying to teach it to a dog, and I didn’t consider that someone in an academic field would want to address them.