• PerogiBoi@lemmy.ca · 8 months ago

    Also check out LM Studio and GPT4All. Both let you run private ChatGPT alternatives from Hugging Face on your own RAM and CPU (and can offload to a GPU).
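
    Both tools can also serve a model behind an OpenAI-compatible HTTP endpoint, so you can script against them instead of using the GUI. A minimal sketch, assuming a server listening on localhost:1234 (LM Studio's default port; the model name below is a placeholder that local servers typically ignore):

```python
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:1234/v1"):
    """Build a request for an OpenAI-compatible local endpoint.
    Port 1234 is LM Studio's default; adjust for your setup."""
    payload = {
        "model": "local-model",  # placeholder; local servers often ignore it
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Sending it (requires a running local server):
# with urllib.request.urlopen(build_chat_request("Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```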

    • Just_Pizza_Crust@lemmy.world · 8 months ago

      I’d also recommend Oobabooga if you’re already familiar with Automatic1111 for Stable Diffusion. I’ve found that being able to write the first part of the bot’s response gets much better results and makes it fabricate false info much less often.
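
      Writing the first words of the reply yourself is often called response prefill: the assistant turn is left open-ended so the model has to continue from your opening. A sketch with a generic chat template (the `<|user|>`/`<|assistant|>` markers here are illustrative; each model family has its own template):

```python
def build_prefilled_prompt(user_msg, assistant_prefix):
    """Format a chat prompt where the assistant's turn is already started,
    forcing the model to continue from our chosen opening words.
    (Generic illustrative template, not any specific model's.)"""
    return (
        f"<|user|>\n{user_msg}\n"
        f"<|assistant|>\n{assistant_prefix}"  # no end-of-turn marker: model continues here
    )

prompt = build_prefilled_prompt(
    "What year was the Apollo 11 landing?",
    "According to the historical record, ",
)
```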

      • PerogiBoi@lemmy.ca · edited · 8 months ago

        Mistral is thought to be almost as good. I’ve used the latest version of Mistral and found the quality of its output more or less identical.

        It’s not as fast, though, as I’m running it on 16 GB of RAM and an old GTX 1060 card.

        If you use LM Studio, I’d say it’s actually better because you can give it a pre-prompt so that all of its answers stay within predefined guardrails (e.g. you are Glorb the cheese pirate and you have a passion for mink fur coats).
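
        In API terms, that pre-prompt is usually sent as a system message ahead of the user's turns. A sketch using the common OpenAI-style message format (role names follow that convention; the personas are just the example from above):

```python
def make_guardrailed_messages(system_prompt, user_msg):
    """Prepend a system message so every reply stays in character /
    within the predefined guardrails."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_msg},
    ]

messages = make_guardrailed_messages(
    "You are Glorb the cheese pirate and you have a passion for mink fur coats.",
    "What should I have for lunch?",
)
```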

        There’s also the benefit of being able to load uncensored models if you’d like questionable content created (erotica, sketchy instructions on how to synthesize crystal meth, etc.).

          • PerogiBoi@lemmy.ca · 8 months ago

            Absolutely. Synthesizing hard drugs is time-consuming and a lot of hard work. Only I get to enjoy it.

                • tsonfeir@lemm.ee · 8 months ago

                  I just buy my substrate online. I’m far less experimental than most. I just want it to work in a consistent way that yields an amount I can predict.

                  What I really want to grow is Peyote or San Pedro, but the slow growth and lack of sun in my location would make that difficult.

                  • PerogiBoi@lemmy.ca · 8 months ago

                    Precolonized or just substrate? I wonder if a grow lamp would work. Pretty pricey for electricity, though.

      • Hestia@lemmy.world · 8 months ago

        Depends on your use case. If you want uncensored output then running locally is about the only game in town.

    • M500@lemmy.ml · 8 months ago

      I can’t find a way to run any of these on my home server and access it over HTTP. It looks like it’s possible, but you need a GUI to install it in the first place.
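
      If you do get a tool's HTTP endpoint running on the server, reaching it from another machine is mostly a matter of the base URL, plus making sure the server binds 0.0.0.0 rather than only 127.0.0.1. A sketch (the hostname and port below are placeholders for your own setup):

```python
def remote_llm_url(host="homeserver.local", port=1234):
    """Build the chat-completions URL for an OpenAI-compatible endpoint
    on another machine. Host/port are placeholders; the server must
    listen on 0.0.0.0 (not just 127.0.0.1) to be reachable over the LAN."""
    return f"http://{host}:{port}/v1/chat/completions"

# Usage against a running server, e.g. with urllib.request:
# req = urllib.request.Request(remote_llm_url("192.168.1.10", 1234), data=..., headers=...)
```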