Michael Ten @lemmy.world to Technology@lemmy.world · English · 7 months ago
Opera is testing letting you download LLMs for local use, a first for a major browser (www.zdnet.com)
Bandicoot_Academic@lemmy.one · 7 months ago
Most people probably don't have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at decent speed. Also, a decent model requires around 20 GB of RAM, which most people don't have.
douglasg14b@lemmy.world · 7 months ago
It doesn't just require 20 GB of RAM; it requires that in VRAM, which is a much higher barrier to entry.
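As a rough illustration of where figures like 20 GB come from: model memory scales with parameter count times bytes per weight, plus some overhead for activations and the KV cache. This is a minimal back-of-the-envelope sketch, assuming a flat ~20% overhead factor (the exact overhead varies by runtime and context length):

```python
def model_memory_gb(n_params_billion: float, bytes_per_param: float,
                    overhead: float = 1.2) -> float:
    """Rough (V)RAM estimate in GB: weights * bytes per weight,
    scaled by an assumed ~20% overhead for activations/KV cache."""
    return n_params_billion * bytes_per_param * overhead

# A 13B model in fp16 (2 bytes/param) needs roughly:
print(round(model_memory_gb(13, 2), 1))    # ≈ 31.2 GB
# The same model quantized to 4-bit (0.5 bytes/param):
print(round(model_memory_gb(13, 0.5), 1))  # ≈ 7.8 GB
```

This is why quantized models are the usual route for consumer hardware: a 4-bit 13B model fits in a typical 8-12 GB GPU or in system RAM, while the fp16 version does not.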