• DarkThoughts@kbin.social
    9 months ago

    Yes, but what would a local model do for you in this case? Chatbots in browsers are typically used as an alternative, more contextualized search engine, and for that you need proper access to an index of search results. Most people also won’t have enough computing power to run a complex chatbot or handle larger context sizes.

    • kakes@sh.itjust.works
      9 months ago

      Pennomi wrote a whole list of potential ideas. And honestly, while I agree that local LLMs on typical hardware are underpowered for most tasks, it’s possible they’d offer the option for those who can run them.

      People are getting all upset over this announcement without even knowing what their plan actually is, like the word “AI” is making them foam at the mouth or something. I’m just saying we should reserve judgement until we have an idea of what’s actually happening.