Things are still moving fast. It’s mid/late July now and I’ve spent some time outside, enjoying the summer. It’s been a few weeks since things exploded in May of this year. Have you people settled down in the meantime?

I’ve since moved away from Reddit, and I miss LocalLlama over there, which was/is buzzing with activity and AI news (and discussions) every day.

What are you people up to? Have you gotten tired of your AI waifus? Or finished indexing all of your data into some vector database? Have you discovered new applications for AI? Or still toying around and evaluating all the latest fine-tuned variations in constant pursuit of the best llama?

  • noneabove1182@sh.itjust.worksM

    Yeah, I’m using it with Home Assistant :)

    Basically I’m using oobabooga for inference and exposing an API endpoint as if it were OpenAI, then plugging that into Microsoft’s Guidance, which I give a tool. The tool takes the device and the desired state as input and calls my Home Assistant REST endpoint to execute the command!
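
    For anyone curious, here’s a minimal sketch of what such a tool can look like in Python, assuming a requests-based call and switch-type entities; the host, token and entity naming are placeholders rather than my actual setup, and the Guidance wiring is left out:

    ```python
    # Hypothetical tool function: take a device name and a desired state,
    # then call the Home Assistant REST API to execute it.
    # All values below are placeholders.
    import requests

    HA_URL = "http://homeassistant.local:8123"  # assumed Home Assistant address
    HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"   # created under your HA user profile

    def set_device_state(device: str, state: str) -> None:
        """Switch a device on or off via Home Assistant's service endpoint."""
        # Map the model's free-text device name to an entity id (assumed naming scheme).
        entity_id = "switch." + device.lower().replace(" ", "_")
        service = "turn_on" if state.lower() == "on" else "turn_off"
        resp = requests.post(
            f"{HA_URL}/api/services/switch/{service}",
            headers={"Authorization": f"Bearer {HA_TOKEN}"},
            json={"entity_id": entity_id},
            timeout=10,
        )
        resp.raise_for_status()

    # e.g. set_device_state("living room lamp", "on") posts to
    # /api/services/switch/turn_on with entity_id "switch.living_room_lamp".
    ```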

    • rufus@discuss.tchncs.deOP

      Thank you for pointing that out. I was completely unaware of Microsoft Guidance. Once they merge/implement llama.cpp support, I’m definitely going to try it too.

      • noneabove1182@sh.itjust.worksM

        That will certainly be amazing, but for now it’s actually not bad to use either the oobabooga web UI or koboldcpp to run the inference and provide a REST endpoint, because you can trick basically any program into treating it as if it’s OpenAI and use it the same way.
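
        For example, with the openai Python package (0.x API) you just point api_base at the local server. The URL/port below is a placeholder for whatever your backend exposes, and most local backends ignore the API key and model name:

        ```python
        # Minimal sketch: reuse an OpenAI-style client against a local backend
        # (oobabooga's OpenAI extension or koboldcpp). URL/port is an assumption.
        import openai

        openai.api_base = "http://localhost:5001/v1"  # local OpenAI-compatible endpoint (placeholder)
        openai.api_key = "sk-not-needed"              # the client requires a key; local servers ignore it

        resp = openai.ChatCompletion.create(
            model="local-model",  # usually ignored; whatever model is loaded answers
            messages=[{"role": "user", "content": "Turn the living room lamp on."}],
        )
        print(resp["choices"][0]["message"]["content"])
        ```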