• Rogue · 2 days ago

    Thanks

    Any recommendations for communities to learn more?

    Frustratingly, their setup guide is terrible. I eventually managed to get it running. I downloaded a model, and only after the download finished did it inform me I didn’t have enough RAM to run it, something it could have checked before the slow download. Then I discovered my GPU isn’t supported, and running it on the CPU is painfully slow. I’m using an AMD 6700 XT, and the minimum listed is the 6800: https://github.com/ollama/ollama/blob/main/docs/gpu.md#amd-radeon
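
    For what it’s worth, a commonly reported (but unofficial, not guaranteed) workaround for RDNA2 cards just below the cutoff, like the 6700 XT, is to override the ROCm GFX version so the card is treated as a supported gfx1030 part:

    ```shell
    # Unofficial workaround: have ROCm treat the 6700 XT (gfx1031) as gfx1030.
    # Native install:
    HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve

    # Docker, using the ROCm image and passing the same override into the container
    # (port, volume name, and container name here are just examples):
    docker run -d \
      --device /dev/kfd --device /dev/dri \
      -e HSA_OVERRIDE_GFX_VERSION=10.3.0 \
      -v ollama:/root/.ollama \
      -p 11434:11434 \
      --name ollama ollama/ollama:rocm
    ```

    Whether this actually performs well on a given card is hit-or-miss; CPU fallback remains the safe default.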

      • Rogue · 1 day ago

        Thanks. I did get both set up with Docker; my frustration was that neither Ollama nor Open WebUI includes instructions on how to set them up together.

        In my opinion, setup instructions should guide you to a usable setup. It’s a missed opportunity not to include a docker-compose.yml connecting the two. Is anyone really using Ollama without a UI?
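
        As a rough sketch, a compose file along these lines wires the two together. The image names and the OLLAMA_BASE_URL variable follow the projects’ documentation; the host port and volume names are just examples:

        ```yaml
        # Sketch: Ollama + Open WebUI on one compose network.
        # Open WebUI reaches Ollama by service name over the internal network.
        services:
          ollama:
            image: ollama/ollama
            volumes:
              - ollama:/root/.ollama

          open-webui:
            image: ghcr.io/open-webui/open-webui:main
            ports:
              - "3000:8080"   # UI at http://localhost:3000
            environment:
              - OLLAMA_BASE_URL=http://ollama:11434
            volumes:
              - open-webui:/app/backend/data
            depends_on:
              - ollama

        volumes:
          ollama:
          open-webui:
        ```

        GPU passthrough (NVIDIA toolkit or ROCm device mounts) would need extra per-vendor settings on the ollama service.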