I’ve just discovered OmniGPT, which seems to be a chat service where you can interact with different LLMs (Claude, GPT-4, Llama, Gemini, etc.) and costs $16/month (it was $7/month until a week ago 🤦‍♂️). I’ve read in a Reddit post that it just uses each provider’s API, which is apparently something that can be done for free with a personal account (since the API limits seem to be high). Do you know of something like OmniGPT that can be self-hosted and uses the user’s own API keys?
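
To be clear about what I mean by using the APIs: the frontend would just send my prompts straight to each provider with my own key, roughly like this Python sketch (the model name and the OPENAI_API_KEY variable are only examples; the endpoint is OpenAI’s public chat completions API):

    import os
    import requests

    # Rough sketch: call a provider's API directly with a personal key
    # instead of paying a middleman subscription.
    api_key = os.environ["OPENAI_API_KEY"]  # your own key

    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "gpt-4",  # any model your key has access to
            "messages": [{"role": "user", "content": "Hello!"}],
        },
        timeout=60,
    )
    print(resp.json()["choices"][0]["message"]["content"])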

      • Scrubbles@poptalk.scrubbles.tech · ↑12 · 7 months ago

        Oh yes, maybe I misunderstood what you were asking. This is the server that will host the models and the API; it also has a nice interface.

        So by “local” I mean local to the server: you can run it somewhere else and not put the models on your local computer, but yes, the server will need them.

        You can then use other apps to connect to it. That’s what I consider self-hosting: hosting the whole thing, soup to nuts.
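
        For example (just a rough sketch, and I’m assuming the server exposes an OpenAI-compatible endpoint, which most of these projects do; the hostname, port and model name are made up), another app could talk to it like this:

            # Point the standard OpenAI client at your own server instead of api.openai.com.
            from openai import OpenAI

            client = OpenAI(
                base_url="http://my-llm-server.local:8000/v1",  # your self-hosted API
                api_key="not-needed-locally",                   # many local servers ignore this
            )

            reply = client.chat.completions.create(
                model="llama-3-8b-instruct",  # whatever model the server has loaded
                messages=[{"role": "user", "content": "Hello from another app"}],
            )
            print(reply.choices[0].message.content)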

        • peregus@lemmy.worldOP · ↑4 ↓5 · 7 months ago

          What I’m looking for is a frontend that uses GPT-4, Gemini, and other AI engines with their respective API keys.
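
          Basically something that does this kind of routing for me (just a hypothetical sketch, not any particular project; the environment variable names are examples):

              import os

              # Sketch of what the frontend would do: pick the provider endpoint
              # and the matching personal API key based on the requested model.
              PROVIDERS = {
                  "gpt-":    ("https://api.openai.com/v1/chat/completions", "OPENAI_API_KEY"),
                  "claude-": ("https://api.anthropic.com/v1/messages", "ANTHROPIC_API_KEY"),
                  "gemini-": ("https://generativelanguage.googleapis.com", "GEMINI_API_KEY"),
              }

              def resolve(model: str) -> tuple[str, str]:
                  """Return (endpoint, api_key) for the provider that serves this model."""
                  for prefix, (endpoint, key_var) in PROVIDERS.items():
                      if model.startswith(prefix):
                          return endpoint, os.environ[key_var]
                  raise ValueError(f"no provider configured for {model!r}")

              endpoint, key = resolve("gpt-4")  # the frontend then calls this endpoint with my key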

            • peregus@lemmy.worldOP · ↑3 ↓14 · 7 months ago

              But I will…self-host this service! And besides the title, I’ve written a post with a description of what I’m looking for.

              • Nyfure@kbin.social · ↑8 ↓1 · 7 months ago

                You want a frontend, not the “service” itself.
                By “service” I usually understand the main logic part of something, in this case the LLM processing itself.
                That’s probably where the confusion is coming from here.

  • peanuts4life@lemmy.blahaj.zone · ↑6 · 7 months ago

    I believe LibreChat would achieve your goals, but you’d need a PC or server to host it on. It supports all the major APIs.

    Kobold Lite might work as well, and doesn’t need to be hosted locally, but I don’t think it supports Claude Haiku specifically, for unknown reasons.

    Additionally, the official Claude API workbench is pretty good on desktop, but it only supports Claude.