So what is currently the best and easiest way to use an AMD GPU? For reference, I own an RX 6700 XT and want to run a 13B model, maybe SuperHOT, but I'm not sure if my VRAM is enough for that. Until now I've always stuck with llama.cpp since it's quite easy to set up. Does anyone have any suggestions?
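
For reference, this is roughly how I've been running llama.cpp so far (just a sketch: build flags as in llama.cpp's Makefile, and the model path, quantization and layer count are placeholders):

    # CLBlast (OpenCL) build, works on AMD cards without the ROCm stack
    make clean && make LLAMA_CLBLAST=1

    # alternative: ROCm/HIP build, if the ROCm stack is installed
    # make clean && make LLAMA_HIPBLAS=1
    # the RX 6700 XT (gfx1031) usually needs this override for ROCm builds:
    # export HSA_OVERRIDE_GFX_VERSION=10.3.0

    # run a quantized 13B with partial GPU offload (-ngl = layers kept on the GPU)
    ./main -m models/13b-q4_K_M.gguf -ngl 40 -c 2048 -p "Hello"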

  • Mixel@feddit.de (OP) · 1 year ago

    How do I use ooba with ROCm? I looked at the Python installer where you can select AMD, and it just says “AMD not supported” and exits. I guess it just doesn’t update webui.py when I update ooba? I also heard somewhere that llama.cpp with CLBlast wouldn’t work with ooba, or am I wrong? And is koboldcpp worth a shot? I’ve heard of some success with it.
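
    For reference, the kind of koboldcpp setup I mean would be something like this (flag names per koboldcpp's --help; the model path and the CLBlast platform/device indices are placeholders):

        # clone and build koboldcpp with CLBlast (OpenCL) support
        git clone https://github.com/LostRuins/koboldcpp
        cd koboldcpp && make LLAMA_CLBLAST=1

        # launch with OpenCL offload; "0 0" are the OpenCL platform and device index
        python koboldcpp.py --useclblast 0 0 --gpulayers 40 --contextsize 2048 models/13b-q4_K_M.gguf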

      • Mixel@feddit.de (OP) · edited · 1 year ago

        I will try that once I'm home! Thanks for the suggestions. Can I also use kobold in SillyTavern? IIRC there was an option for KoboldAI or something, is that koboldcpp, or what does that option do?

        EDIT: I got it working and it's wonderful, thank you for suggesting this :) I had some difficulties setting it up, especially with opencl-mesa, since I had to install opencl-amd and then find out the device ID and so on, but now that it's working it's great!
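
        In case anyone else hits the same thing, this is roughly what finding the device ID comes down to (I'm using clinfo here just as an example; the exact indices will differ per system):

            # opencl-mesa didn't work for me, the opencl-amd package did;
            # list OpenCL platforms and devices to find the right indices
            clinfo -l

            # then pass that platform/device pair to koboldcpp, e.g. platform 0, device 0
            python koboldcpp.py --useclblast 0 0 --gpulayers 40 models/13b-q4_K_M.gguf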