the-podcast guy recently linked this essay; it's old, but I don't think it's significantly wrong (despite the GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.

  • plinky [he/him]@hexbear.netOP
    7 months ago

What is visual memory, then, in the informational analogy? Do tell me. Does it have a consistent or persistent size, shape, or anything resembling a BMP file?

    The difference is neural networks are bolted on structures, not information.

    • Tomorrow_Farewell [any, they/them]@hexbear.net
      7 months ago

What is visual memory, then, in the informational analogy? Do tell me.

It’s not considered a special type of memory in this context. Unless you have a case for the opposite, this point is irrelevant.

Does it have a consistent or persistent size, shape, or anything resembling a BMP file?

That depends on the particular analogy.
In any case, this question seems irrelevant and rather silly. Is the force of a gravitational pull in models of Newtonian physics constant? Does it have a shape? Is it a real number, a vector in R^2, a vector in R^3, a vector in R^4, or some other sort of tensor? Obviously, that depends on the relevant context regarding those models.
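To make the gravity example concrete (these are standard textbook formulas, not anything from the essay under discussion): the same Newtonian gravitational pull appears as different mathematical objects depending on how the model is set up.

```latex
% Magnitude only: a positive real number
F = \frac{G m_1 m_2}{r^2}

% Full three-dimensional model: a vector in R^3,
% directed along the unit vector from body 1 toward body 2
\vec{F}_{12} = -\frac{G m_1 m_2}{|\vec{r}_{12}|^2}\,\hat{r}_{12}
```

Neither representation is "the" force; each is adequate within its own model, which is the same point being made here about memory analogies.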

      Also, in what sense would a memory have a ‘shape’ in any relevant analogy?

      The difference is neural networks are bolted on structures, not information

Obviously, this sentence makes no sense if taken literally, so you will have to explain what you mean by it.