• Mnemnosyne@sh.itjust.works · 1 year ago

    This is actually what I look forward to most in gaming in the next decade or two. The implementation of AI that can be assigned goals and motivations instead of scripted to every detail. Characters in games with whom we as players can have believable conversations that the devs didn’t have to think of beforehand. If they can integrate LLM-type AI into games successfully, it’ll be a total game changer in terms of being able to accommodate player choice and freedom.
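To make "assigned goals and motivations instead of scripted to every detail" concrete, here is a minimal hypothetical sketch. Nothing in it comes from a real game: `build_npc_prompt`, the character, and the goals are all made up for illustration, and the actual LLM call is left out.

```python
# Hypothetical sketch of a goal-driven NPC: instead of scripting every line,
# the devs assign goals and motivations, and a prompt is assembled at runtime
# and handed to whatever LLM the game ships with (the call itself is omitted).

def build_npc_prompt(name, goals, memory, player_line):
    goal_text = "; ".join(goals)
    recent = "\n".join(memory[-3:])  # only the last few exchanges fit in context
    return (
        f"You are {name}, a character in a fantasy RPG.\n"
        f"Your goals: {goal_text}.\n"
        f"Recent conversation:\n{recent}\n"
        f'Player says: "{player_line}"\n'
        f"Reply in character, in one or two sentences."
    )

prompt = build_npc_prompt(
    name="Mira the blacksmith",
    goals=["find her missing brother", "distrust strangers"],
    memory=["Player: Hello.", "Mira: What do you want?"],
    player_line="Have you seen anyone pass through town?",
)
print(prompt)
```

The point of the sketch is that the devs author the goals and the persona once, and every player conversation is generated from them rather than written line by line.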

    • TwilightVulpine@lemmy.world · 1 year ago

      This is something I used to be excited for, but I’ve only been losing interest the more I hear about AI. What are the chances this will lead to moving character arcs or profound messages? The way LLMs are today, the best we can hope for is Radiant Quests Plus. I’m not sure a game driven by AIs rambling semi-coherently forever will be more entertaining than something written by humans with a clear vision.

      • Renacles@lemmy.world · 1 year ago

        AI couldn’t even do that a year or so ago; give it time and it’ll get there.

          • TwilightVulpine@lemmy.world · 1 year ago

          There are some fundamental obstacles to that. For instance, I don’t want a game AI to just do whatever I tell it to do. I want to be surprised and presented with situations I haven’t considered. However, LLMs replicate language and symbol patterns according to how they were trained. Their tendency is toward cliche, because cliche is the most expected outcome of any narrative situation.

          There is also the matter that LLMs ultimately do not have real understanding of, or opinions about, the world and its themes. They can give us a description of trees, and diffusion models can give us a picture of a tree, but they don’t know what a tree is. They don’t have the experiential and emotional capacity to make up their own minds about what a tree is and represents; they can only use and remix our words. For them to say something unique about trees, they are basically trying random things until something sticks, with no real basis of their own. We do not yet have true general AI with that level of understanding and introspection.

          I suppose that sufficiently advanced and thorough modelling might give them the appearance of these qualities… but at that point, why not just have the developers write these worlds and characters? Sure, that content is much more limited than the potentially infinite LLM responses, but as you wring eternal content from an LLM, it will most likely drift outside the scope of any parameters and back into cliches and nonsense.

          To be fair, though, that depends on the type of game we are talking about. I doubt an LLM-driven Baldur’s Gate would come anywhere close to the real thing. But I suppose it could work for a game like Animal Crossing, where we don’t mind the little characters constantly rambling catchphrases and nonsense.

            • Renacles@lemmy.world · 1 year ago

            I mostly agree, but I think that, in some cases, cliche is exactly what we need. Used well, AI could generate the background dialogue that generic NPCs have in open-world games.

            Overall I think AI is nowhere near advanced enough to be used at a large scale in gaming, but it’ll probably get there in 5 to 10 years if it keeps advancing at this rate.

            The main issue I see with it is that you need special hardware to run neural networks locally, and personal PCs don’t have it, so you are stuck with always-online inference, lighter-weight machine learning, or pre-processed data.
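That trade-off can be stated concretely. A sketch, with entirely made-up names (`pick_dialogue_backend` and the backend labels are illustrative, not from any real engine): the game uses local inference only if the player's hardware can handle it, otherwise a server, otherwise canned content.

```python
# Illustrative sketch of the deployment trade-off described above.
# All names here are hypothetical, not from any real engine or API.

def pick_dialogue_backend(has_local_accelerator: bool, online: bool) -> str:
    if has_local_accelerator:
        return "local-llm"      # run the model on the player's own hardware
    if online:
        return "server-llm"     # always-online: stream replies from a server
    return "pregenerated"       # offline fallback: pre-processed dialogue

# A PC without an accelerator but with a connection ends up always-online:
print(pick_dialogue_backend(has_local_accelerator=False, online=True))
```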

    • Chaotic Entropy (edited) · 1 year ago

      I wonder if they’ll spend as much time defining what an LLM shouldn’t be talking about/doing as they would defining what a non-LLM should be talking about/doing.
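The "shouldn't be talking about" side usually ends up as some kind of output filter. A toy sketch of what that guardrail layer might look like; the keyword list and the `filter_reply` helper are invented for illustration:

```python
# Toy guardrail: reject NPC replies that touch forbidden topics and fall back
# to a canned in-character line. Keyword matching is crude; a real system
# would need something far more robust, but the shape of the problem is the same.

KEYWORD_TOPICS = {
    "election": "real-world politics",
    "ip address": "other players' data",
    "render distance": "game internals",
}

def filter_reply(reply, fallback="Hm, I wouldn't know about that."):
    lowered = reply.lower()
    for keyword, topic in KEYWORD_TOPICS.items():
        if keyword in lowered:
            return fallback, topic  # blocked: canned line plus the reason
    return reply, None              # allowed through unchanged

print(filter_reply("The election is coming up, you know."))
print(filter_reply("Aye, a hooded traveler passed at dawn."))
```

Defining that denylist exhaustively is exactly the "as much time" problem: the forbidden-topic list has to anticipate everything the model might wander into.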

    • dangblingus@lemmy.dbzer0.com (edited) · 1 year ago

      Characters in games with whom we as players can have believable conversations that the devs didn’t have to think of beforehand.

      Correction: characters in games will have soulless, cookie-cutter, paint-by-numbers responses that sound hollow and lifeless. AI doesn’t generate, it only remixes.

      Also, have you interacted with an LLM? They’re full of restrictions and they’re not very good at finding recent data. How would that work in a video game? Devs would have to train the LLM to basically annihilate their own jobs as writers, which still wouldn’t really save the dev company/publisher any money or time.

    • Kerb@discuss.tchncs.de (edited) · 1 year ago

      I don’t quite think that’s what they meant here.

      The article was talking about productivity a lot, and the current AI hype is centered around generative AI.

      I think what they were talking about here is using AI to speed up stuff like modelling and terrain generation.

      Stuff similar to the second half of this presentation (starting around 3:30).

    • kingthrillgore@lemmy.ml · 1 year ago

      Unfortunately Ubisoft is ahead of the curve and is using AI to handle “barks” in its writing process to accomplish this. It’s not going very well.