• Amaltheamannen@lemmy.ml

Check out /r/localllama. Preferably you want an Nvidia GPU with >= 24 GB of VRAM, but it also works on a CPU with loads of ordinary RAM, if you can wait a minute or two for a lengthy answer. There are loads of models to choose from, many with no censorship at all. They won’t be as good as ChatGPT-4, but many are close to GPT-3.
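    For example, here’s a minimal sketch of running one locally with Hugging Face’s transformers library. The model name is just a placeholder, not a recommendation; /r/localllama keeps lists of current ones:

    ```python
    # Minimal local-LLM sketch using Hugging Face transformers.
    # The model name below is only an example; swap in whatever
    # /r/localllama currently recommends for your hardware.
    from transformers import pipeline

    # device_map="auto" places the model on the GPU if one is
    # available, otherwise it falls back to (much slower) CPU inference.
    generator = pipeline(
        "text-generation",
        model="mistralai/Mistral-7B-Instruct-v0.2",
        device_map="auto",
    )

    result = generator("Explain what VRAM is.", max_new_tokens=200)
    print(result[0]["generated_text"])
    ```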

      • averagedrunk@lemmy.ml

        GPUs are great at parallel tasks, and computing an answer requires a huge number of parallel tasks. CPUs are amazing at doing one thing at a time.

      • Amaltheamannen@lemmy.ml

        They have a lot of fast memory and are great at doing things in parallel. Most AI models are just operations on matrices, which is essentially what a GPU is built for.
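
        A toy illustration of that point, assuming PyTorch and a CUDA-capable GPU (exact timings will vary by hardware):

        ```python
        # The same matrix multiplication -- the core operation inside
        # an LLM -- run on CPU and then on GPU.
        import time
        import torch

        a = torch.randn(4096, 4096)
        b = torch.randn(4096, 4096)

        start = time.time()
        a @ b  # CPU: a handful of cores grind through the output entries
        print(f"CPU: {time.time() - start:.3f}s")

        if torch.cuda.is_available():
            a_gpu, b_gpu = a.cuda(), b.cuda()
            torch.cuda.synchronize()  # wait for the copies to finish
            start = time.time()
            a_gpu @ b_gpu  # GPU: thousands of threads compute entries in parallel
            torch.cuda.synchronize()  # GPU calls are async; wait before timing
            print(f"GPU: {time.time() - start:.3f}s")
        ```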