• Avatar_of_Self@lemmy.world
    7 months ago

    It really depends on what model you want to run and how big it is (parameter count and quantization). You can run just about any model from disk if you have enough space for it, but a GPU with enough VRAM is preferred for ChatGPT-like response speed. Otherwise, running on an older CPU with system RAM is going to be noticeably slower, especially with larger models that have more parameters to crunch through.
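    As a rough illustration of why VRAM matters, you can ballpark a model's memory footprint from its parameter count and quantization level. This is just a back-of-the-envelope sketch (the ~20% overhead factor for the KV cache and buffers is my own assumption, and real usage varies by runtime and context length):

    ```python
    def estimate_model_memory_gb(params_billion, bits_per_weight, overhead=1.2):
        """Rough memory footprint: weight storage plus ~20% for KV cache/buffers.

        params_billion  -- model size in billions of parameters
        bits_per_weight -- quantization level (e.g. 4 for Q4, 16 for fp16)
        overhead        -- assumed fudge factor, not an exact figure
        """
        bytes_per_weight = bits_per_weight / 8
        total_bytes = params_billion * 1e9 * bytes_per_weight * overhead
        return total_bytes / 1024**3  # bytes -> GiB

    # A 7B model quantized to 4 bits fits in ~4 GB of VRAM...
    print(round(estimate_model_memory_gb(7, 4), 1))   # ~3.9
    # ...but the same model at full fp16 needs a much bigger card.
    print(round(estimate_model_memory_gb(7, 16), 1))  # ~15.6
    ```

    This is why the heavily quantized builds are what most people run on consumer hardware.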

    There are some pretty lightweight models out there, but their responses will be more barebones and will probably seem ‘less informed’.

    Give GPT4All a try for your first time. It makes install, configuration and usage point-and-click while staying fairly straightforward. For the featured models it shows a short summary and the recommended VRAM, and there are many, many other models available from inside the UI.