• Bandicoot_Academic@lemmy.one
    7 months ago

    Most people probably don’t have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at decent speed. A decent model also requires around 20GB of RAM, which most people don’t have.
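    A back-of-the-envelope check of the ~20GB figure (a rough sketch; `weight_memory_gb` is a hypothetical helper, the estimate covers weights only, and real inference adds KV-cache and activation overhead on top):

    ```python
    # Rough memory estimate for loading an LLM's weights:
    # memory ≈ parameter count × bytes per parameter.

    def weight_memory_gb(n_params_billion: float, bits_per_param: int) -> float:
        bytes_per_param = bits_per_param / 8
        return n_params_billion * 1e9 * bytes_per_param / 1024**3

    # A 13B-parameter model:
    print(round(weight_memory_gb(13, 16), 1))  # fp16: ~24 GB
    print(round(weight_memory_gb(13, 4), 1))   # 4-bit quantized: ~6 GB
    ```

    So an unquantized mid-size model is right in the 20GB+ range, and only aggressive quantization brings it within reach of typical consumer hardware.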

    • douglasg14b@lemmy.world
      7 months ago

      It doesn’t just require 20GB of RAM; it requires that much in VRAM, which is a much higher barrier to entry.