Great headline, but ask fusion researchers how long they have been 20 years away and how many more years they have…

  • daniskarma@lemmy.dbzer0.com · 3 days ago

    Would it be any different, as far as that ethical issue goes, if the AI-operated machine gun or the corporate software were driven by traditional algorithms instead of an LLM?

    Because a machine gun does not need “modern” AI to be able to take aim and shoot at people, I guarantee you that.

    • aesthelete@lemmy.world · 3 days ago

      No, it wouldn’t be different. Though it’d definitely be better to have a discernible algorithm / explicable set of rules for things like health care. Shrugging their shoulders and saying they don’t understand the “AI” should be completely unacceptable.

      I wasn’t saying AI = LLM either. Whatever drives Teslas is almost certainly not an LLM.

      My point is that half-baked software is already killing people daily, but because it’s more dramatic to pontificate about the coming of Skynet, the “AI” people waste time on sci-fi nonsense scenarios instead of drawing any attention to that.

      Fighting the ills that bad software is already causing today would also do a lot to advance the cause of preventing bad software from reaching the imagined apocalyptic point in the future.