• roux [he/him, comrade/them]@hexbear.net · 25 points · 14 hours ago

    This right here is giving me flashbacks to working with the dumbest people in existence in college, back when I thought I was too dumb for CS and defected to Comp Info Systems.

  • keepcarrot [she/her]@hexbear.net · 25 points · 15 hours ago

    One of the things I’ve noticed is that there are people who earnestly take up CS as something they’re interested in, but every time tech booms there’s a sudden influx of people who would be B- marketing/business majors coming into computer science. Some of them even do ok, but holy shit do they say the most “I am trying to sell something and will make stuff up” things.

  • bdonvr@thelemmy.club · 38 points · 16 hours ago

    Can we make a simulation of a CPU by replacing each transistor with an LLM instance?

    Sure it’ll take the entire world’s energy output but it’ll be bazinga af

    • Imnecomrade [none/use name]@hexbear.net · 1 point · 45 seconds ago

      Unfortunately, I am experiencing the opposite effect. I am an IT contractor tasked with writing scripts, and I keep applying for dev jobs with my two-year degree, with no success, while I watch old dinosaur fucks at my job who don’t even know what functions are. They use ChatGPT to write their scripts without any modification for the specific clusterfuck of an environment they created, so the scripts essentially don’t work, and they run straight in production because we don’t even have a testing environment. Meanwhile I have to clean up their mess and get paid much less than them, with no PTO or benefits.

      It’s absolutely maddening to be moderately skilled in programming while witnessing some of the dumbest people on the planet get CS jobs through nepotism or by impressing HR, mostly the former. It seems like my workplace will hire someone only if they are incredibly incompetent. Can’t wait to get the fuck out of the IT field and pursue electrical engineering someday. I’m not wasting 10-20 years of my life just to get a single promotion in a field dominated by cop-worshipping, white supremacist libertarians.

  • WhyEssEff [she/her]@hexbear.net · 38 points · 18 hours ago

    let’s add full seconds of latency to malloc, with a non-deterministic result. this is a great amazing awesome idea, it’s not like we measure the processing speeds of computers in gigahertz or anything
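
    to make the latency point concrete, here’s a minimal C sketch of what an LLM-backed allocator along the lines of the mocked mallocPlusAI would have to do. query_llm_for_size, the two-second stall, and the 16-byte guess are all hypothetical stand-ins, not any real API:

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* hypothetical stand-in for a blocking network round-trip to an LLM.
     * a real HTTP call takes tens to thousands of milliseconds;
     * plain malloc takes on the order of nanoseconds. */
    static size_t query_llm_for_size(const char *prompt)
    {
        (void)prompt;
        sleep(2);      /* simulate seconds of network + inference latency */
        return 16;     /* the model "guesses" a size; nothing guarantees it's right */
    }

    /* the mocked "mallocPlusAI": every allocation now blocks on a remote service */
    void *malloc_plus_ai(const char *prompt)
    {
        return malloc(query_llm_for_size(prompt));
    }

    int main(void)
    {
        char *buf = malloc_plus_ai("enough bytes for a username");
        if (buf) {
            printf("allocated... eventually\n");
            free(buf);
        }
        return 0;
    }
    ```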

    • WhyEssEff [she/her]@hexbear.net · 25 points · 18 hours ago

      sorry, every element of this application is going to have to query a third-party server that might literally just undershoot it, and now we have an overflow issue. oops oops oops woops oh no oh fuck
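
      concretely, the undershoot failure mode looks like this; a minimal sketch, assuming the hypothetical model lowballs the answer at 16 bytes:

      ```c
      #include <stdlib.h>
      #include <string.h>

      /* hypothetical: the model undershoots and suggests 16 bytes */
      #define LLM_SUGGESTED_SIZE 16

      int main(void)
      {
          const char *name = "a_username_longer_than_sixteen_bytes";
          char *buf = malloc(LLM_SUGGESTED_SIZE);
          if (!buf)
              return 1;

          /* the string needs strlen(name) + 1 bytes, far more than 16:
           * this memcpy writes past the end of the allocation (heap overflow) */
          memcpy(buf, name, strlen(name) + 1);

          free(buf);
          return 0;
      }
      ```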

      • WhyEssEff [she/her]@hexbear.net · 21 points · 18 hours ago

        want to run an application? better have internet fucko, the idea guys have to burn down the amazon rainforest to puzzle out the answer to the question of the meaning of life, the universe, and everything: how many bits does a 32-bit integer need to have

        • WhyEssEff [she/her]@hexbear.net · 20 points · 18 hours ago

          new memory leak just dropped–the geepeetee says the persistent element ‘close button’ needs a terabyte of RAM to render, the linear algebra homunculus said so, so we’re crashing your computer, you fucking nerd

          • WhyEssEff [she/her]@hexbear.net · 21 points · 17 hours ago

            the way I kinda know this is the product of C-Suite and not a low-level software engineer is that the syntax is mallocPlusAI and not aimalloc or gptmalloc or llmalloc.

            • WhyEssEff [she/her]@hexbear.net · 20 points · 17 hours ago

              and it’s malloc, why are we doing this for things we’re ultimately just putting on the heap? overshoot a little; if you don’t know the size already, it’s not going to be perfect no matter what. if you’re going to be this annoying about memory (which is not a bad thing) learn rust, dipshit. they made a whole language about it

              • WhyEssEff [she/her]@hexbear.net · 9 points · 17 hours ago

                if they’re proposing it as a C stdlib-adjacent method (given they’re saying it should be an alternative to malloc [memory allocate]), it absolutely should be lowercase. plus is redundant, because you’d just concatenate the extra functionality onto the original name. mallocai [memory allocate ai] feels wrong, so ai should come first.

                if this method idea weren’t an abomination in and of itself, that’s how it would probably be named. it currently looks straight out of Java. and at that point, why are we abbreviating malloc? why not go the distance and say largeLanguageModelQueryingMemoryAllocator
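
                as C declarations, the naming argument looks something like this; every prototype here is hypothetical, just illustrating the conventions being argued about:

                ```c
                #include <stddef.h>

                /* C stdlib convention: lowercase, terse, modifier folded into the
                 * name (cf. calloc, realloc). hypothetical prototypes, not real ones. */
                void *aimalloc(size_t hint);
                void *llmalloc(size_t hint);

                /* what the actual proposal reads like: Java-style camelCase in C */
                void *mallocPlusAI(const char *prompt);

                /* the joke taken to its logical conclusion */
                void *largeLanguageModelQueryingMemoryAllocator(const char *prompt);
                ```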

  • FunkyStuff [he/him]@hexbear.net · 57 points · 19 hours ago

    This is simply revolutionary. I think once OpenAI adopts this in their own codebase and all queries to ChatGPT cause millions of recursive queries to ChatGPT, we will finally reach the singularity.

    • hexaflexagonbear [he/him]@hexbear.net · 24 points · 19 hours ago

      There was a paper about improving LLM arithmetic a while back (spoiler: its accuracy outside of the training set is… less than 100%) and I was giggling at the thought of AI getting worse for the unexpected reason that it uses an LLM for matrix multiplication.

      • FunkyStuff [he/him]@hexbear.net · 17 points · 19 hours ago

        Yeah lol, this is a weakness of LLMs that’s been very apparent since their inception. I have to wonder how different they’d be if they had the capacity to stop using the LLM as the output for a second, switch to a deterministic algorithm to handle anything logical or arithmetical, then feed the result back to the LLM.
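
        Something like this toy C sketch of the routing idea; the sscanf pattern-matching and the llm_generate fallback are hypothetical stand-ins for a real tool-calling setup, not how any production system actually detects math:

        ```c
        #include <stdio.h>

        /* hypothetical stand-in for handing the prompt to an LLM */
        static void llm_generate(const char *prompt)
        {
            printf("[llm] freeform answer to: %s\n", prompt);
        }

        /* route arithmetic to a deterministic evaluator instead of the LLM.
         * detection here is a toy sscanf pattern; a real system would use the
         * model's own function-calling output to decide when to delegate. */
        static void answer(const char *prompt)
        {
            double a, b, r;
            char op;
            if (sscanf(prompt, "what is %lf %c %lf", &a, &op, &b) == 3) {
                switch (op) {
                case '+': r = a + b; break;
                case '-': r = a - b; break;
                case '*': r = a * b; break;
                case '/': r = a / b; break;
                default:  llm_generate(prompt); return;
                }
                printf("[tool] %g\n", r);   /* exact, no hallucinated arithmetic */
                return;
            }
            llm_generate(prompt);           /* everything else stays with the model */
        }

        int main(void)
        {
            answer("what is 12.5 * 4");              /* -> [tool] 50 */
            answer("write a poem about malloc");     /* -> [llm] ... */
            return 0;
        }
        ```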

        • nightshade [they/them]@hexbear.net · 9 points · 17 hours ago

          I’m pretty sure some of the newer ChatGPT-like products (the consumer-facing interface, not the raw LLM) do in fact do this. They try to detect certain types of input (e.g. math problems or requests for the current weather) and convert them into an API request to some other service, returning that result instead of an LLM output. Frankly, it comes across to me as an attempt to make the “AI” seem smarter than it really is by covering up its weaknesses.

          • FunkyStuff [he/him]@hexbear.net · 3 points · 17 hours ago

            Yeah, Siri has been capable of that for a long time, but my actual hope would be that rather than just handing the user the API response, the LLM could keep operating on that response and do more with it, composing several API calls. But that’s probably prohibitively expensive to train, since you’d have to do it billions of times to get the plagiarism machine to learn how to delegate work to an API properly.