• HelloHotel@lemm.ee · 8 months ago (edited)

      I absolutely agree. Use something like ollama, but do keep in mind that running these models takes a fair amount of computing resources: about 5GB of RAM and roughly a 3GB download for the smaller uncensored ollama models.
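      A minimal sketch of what that looks like in practice, using ollama’s Python client (pip install ollama); llama2-uncensored here is just a placeholder model name, swap in whatever you’ve actually pulled:

      ```python
      # Minimal sketch: chat with a locally running ollama model.
      # Assumes the ollama server is running and you've already pulled a model,
      # e.g. `ollama pull llama2-uncensored` (placeholder; use whichever you like).
      import ollama

      response = ollama.chat(
          model="llama2-uncensored",
          messages=[{"role": "user", "content": "Why run an LLM locally?"}],
      )
      print(response["message"]["content"])
      ```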

      • LainTrain@lemmy.dbzer0.com · 10 months ago

        It’s not great, but an old GTX GPU can be had cheaply if you look around for refurbs; as long as there’s a warranty, you’re golden. Stick it into a ten-year-old Xeon workstation off eBay and you can easily have a machine with 8 cores, 32GB of RAM and a solid GPU for under $200.

        • HelloHotel@lemm.ee · 10 months ago (edited)

          It’s the RAM requirement that stings right now. I believe I’ve got the specs, but I was told (or am misremembering) a 64GB RAM requirement for one model.

          • LainTrain@lemmy.dbzer0.com · 10 months ago

            IDK what you’ve read, but I have 24GB and can use Dreambooth and fine-tune Mistral no problem. IIRC, RAM is only needed briefly to load the model before it’s handed off to VRAM, and VRAM is the main constraint: you need 8GB of VRAM as an absolute minimum, and even my 24GB is often not enough for some high-end stuff (rough numbers in the sketch below).

            Plus RAM is really cheap compared to a GPU. Remember it doesn’t have to be super fancy RAM either; DDR3 is fine if you’re not gaming on a Ryzen or something modern.
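            To put rough numbers on it, here’s a back-of-the-envelope sketch (the ~20% overhead for KV cache and runtime buffers is my own assumption, not anything official): weight memory is roughly parameter count × bytes per weight, which is why a 4-bit 7B model fits comfortably under 8GB of VRAM while a full 16-bit one doesn’t.

            ```python
            # Back-of-the-envelope model memory estimate: params * bytes per weight.
            # The 1.2 overhead factor is a rough assumption (KV cache, runtime buffers).
            def estimate_gib(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
                total_bytes = params_billion * 1e9 * (bits_per_weight / 8) * overhead
                return total_bytes / 1024**3

            for name, params, bits in [("7B @ 4-bit", 7, 4), ("13B @ 4-bit", 13, 4), ("7B @ 16-bit", 7, 16)]:
                print(f"{name}: ~{estimate_gib(params, bits):.1f} GiB")
            ```

            That lines up with the ~5GB figure mentioned upthread for the smaller models; you’re nowhere near 64GB unless you’re running something much larger or unquantized.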