• Virkkunen@fedia.io · +178 −2 · 6 months ago

    Don’t worry folks, if we all stop using plastic straws and take 30 second showers, we’ll be able to offset 5% of the carbon emissions this AI has!

  • mctoasterson@reddthat.com · +78 · 6 months ago

    The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn’t necessary and I’m not convinced it results in a “better search result” for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.

    • AlecSadler@sh.itjust.works · +16 · 6 months ago

      As a buzzword or whatever this is leagues worse than “agile”, which I already loathed the overuse/integration of.

      • xthexder@l.sw0.com · +7 · 6 months ago

        Before AI it was IoT. Nobody asked for an Internet connected toaster or fridge…

        • Balder@lemmy.world · +4 · 6 months ago (edited)

          I always felt like I was alone in thinking this. Anyone with a bit of a security mindset doesn’t want everything connected; besides, it makes devices more expensive and easier to break. It’s certainly very convenient for planned obsolescence.

          • afraid_of_zombies@lemmy.world · +2 · 6 months ago

            It definitely has to walk in the desert for a while. I know multiple people who like it for some stuff. Like cameras and managing air conditioning.

  • lone_faerie@lemmy.blahaj.zone · +59 −16 · 6 months ago

    AI is just what crypto bros moved onto after people realized that was a scam. It’s immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it’s being backed by major corporations because it means fewer employees they have to pay.

    • pycorax@lemmy.world · +14 −4 · 6 months ago

      There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren’t just a scam. However, most of these are not consumer facing and the average person won’t really hear about them.

      It’s unfortunate that what you said is very true on the consumer side of things…

    • afraid_of_zombies@lemmy.world · +1 −3 · 6 months ago

      energy for a solution in search of a problem,

      Except this time it’s being backed by major corporations because it means fewer employees they have to pay.

      Ah yes, the classic “it is useless, and here is a use for it” logic.

        • afraid_of_zombies@lemmy.world · +1 −2 · 6 months ago

          I have, and I don’t see the relevance. The argument is that it’s useless, and then it mentions a use case. If you want to say it’s crap, I won’t argue the point, but you can’t say X and ~X.

  • ben@lemmy.zip · +40 · 6 months ago

    I skimmed the article, but it seems to be assuming that Google’s LLM is using the same architecture as everyone else. I’m pretty sure Google uses their TPU chips instead of a regular GPU like everyone else. Those are generally pretty energy efficient.

    That and they don’t seem to be considering how much data is just being cached for questions that are the same. And a lot of Google searches are going to be identical just because of the search suggestions funneling people into the same form of a question.
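The caching point above can be sketched concretely. This is a hypothetical illustration, not Google’s actual system: the `normalize` step and the stub model are invented for the example. The idea is to collapse trivially different phrasings of the same question into one key, and only run the expensive model on a cache miss.

```python
# Hypothetical sketch of response caching for repeated queries.
# Not a real search engine's implementation; the "model" is stubbed out.

def normalize(query: str) -> str:
    """Collapse trivially different forms of the same question."""
    return " ".join(query.lower().split()).rstrip("?")

class CachedAnswerer:
    def __init__(self, model):
        self._model = model            # the expensive call (stubbed in the demo)
        self._cache: dict[str, str] = {}
        self.misses = 0

    def answer(self, query: str) -> str:
        key = normalize(query)
        if key not in self._cache:     # only pay for the model on a miss
            self.misses += 1
            self._cache[key] = self._model(key)
        return self._cache[key]

# Demo with a stub "model": the second query hits the cache after normalization.
bot = CachedAnswerer(lambda q: f"answer to: {q}")
bot.answer("How tall is Everest?")
bot.answer("how tall is  everest")
print(bot.misses)  # 1
```

Search suggestions funneling people toward identical phrasings, as described above, would make this kind of cache hit far more often than naive query diversity suggests.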

    • kromem@lemmy.world · +17 −1 · 6 months ago

      Exactly. The difference between a cached response and a live one, even for non-AI queries, is an order of magnitude.

      At this point, a lot of people just care about the ‘feel’ of anti-AI articles even if the substance is BS though.

      And then people just feed whatever gets clicks and shares.

    • AlecSadler@sh.itjust.works · +12 · 6 months ago

      I hadn’t really heard of the TPU chips until a couple weeks ago when my boss told me about how he uses USB versions for at-home ML processing of his closed network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy…but I looked the USB things up and they’re wildly efficient and he says they work just fine for his applications. I was impressed.

      • ben@lemmy.zip · +8 · 6 months ago

        Yeah they’re pretty impressive for some at home stuff and they’re not even that costly.

      • dan@upvote.au · +7 · 6 months ago

        The Coral is fantastic for use cases that don’t need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.

        It runs Tensorflow Lite, so you can also build your own models.

        Pretty good for a $25 device!

    • dan@upvote.au · +6 · 6 months ago (edited)

      I’m pretty sure Google uses their TPU chips

      The Coral ones? They don’t have nearly enough RAM to handle LLMs - they only have 8MB RAM and only support small Tensorflow Lite models.
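The RAM point is easy to sanity-check with back-of-envelope arithmetic (the 7B parameter count below is just an illustrative model size, not a claim about any specific model): even at one aggressively quantized byte per weight, the weights alone outstrip the Coral’s 8MB by nearly three orders of magnitude.

```python
# Back-of-envelope: can an 8 MB accelerator hold an LLM's weights?
params = 7_000_000_000               # e.g. a "7B" model (illustrative size)
bytes_per_weight = 1                 # aggressive int8 quantization
model_bytes = params * bytes_per_weight

coral_ram_bytes = 8 * 1024 * 1024    # the Coral's 8 MB of on-chip RAM

shortfall = model_bytes / coral_ram_bytes
print(f"model needs {model_bytes / 1e9:.0f} GB of weights; "
      f"that's ~{shortfall:.0f}x the Coral's RAM")
```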

      Google might have some custom-made non-public chips though - a lot of the big tech companies are working on that.

      instead of a regular GPU

      I wouldn’t call them regular GPUs… AI use cases often use products like the Nvidia H100, which are specifically designed for AI. They don’t have any video output ports.

  • jj4211@lemmy.world · +32 · 6 months ago

    The confounding part is that when I do get offered an “AI result”, it’s basically identical to the excerpt in the top “traditional search” result. It wasted a fair amount more time and energy to repeat what the top of the search said anyway. I’ve never seen the AI overview ever be more useful than the top snippet.

  • Facebones@reddthat.com · +32 −1 · 6 months ago (edited)

    It’s not even hidden; people just give zero fucks about how their magical rectangle works, and get mad if you try to tell them.

    • Halosheep@lemm.ee · +3 −3 · 6 months ago

      I use generative AI sometimes, and I find it useful for certain use cases.

      Are you just following the internet hate bandwagon, or do you really think it’s no good?

    • sunbeam60@lemmy.one · +10 −14 · 6 months ago

      I don’t know if lemmians (lemmings?) live in the same world I live in; I find AI a HUGE productivity boost, in search, writing, generation, research.

      Of course one has to check results carefully, it won’t remove the need for quality checks and doing stuff yourself. But as a sparring partner, an idea creator and an assistant, I am convinced people who make use of Claude/GPT etc will outperform people who don’t by a wide margin.

      • cows_are_underrated@feddit.org · +14 · 6 months ago

        AI definitely has its use cases and boosts productivity if used right. What Google did is just about the most bullshit I’ve ever seen: an example of useless AI, born of the “need” to use AI.

        • sunbeam60@lemmy.one · +5 · 6 months ago

          OK, on that we agree entirely. Google is worried that their investors will worry, because their investors are too dumb to understand that LLMs and search are two separate things, and one isn’t necessarily better because it uses the other.

            • sunbeam60@lemmy.one · +1 −2 · 6 months ago

              I’m not saying I can. I don’t use Google, haven’t for years, so I can’t make statements about the quality of its search. Intuitively, though, search isn’t benefitting from the use of AI.

              • sandbox@lemmy.world · +2 · 6 months ago

                You wrote:

                I find AI a HUGE productivity boost, in search

                Then when pushed, you walk it back:

                search isn’t benefitting from the use of AI

                Why make your initial comment of support if you just walk back on it? Got some money riding on it or something?

      • raspberriesareyummy@lemmy.world · +5 −4 · 6 months ago

        that’s gonna be mighty useful when we’ve destroyed the planet. Also, you are not working with AI. You are working with an LLM.

        • sunbeam60@lemmy.one · +2 −7 · 6 months ago (edited)

          Compared against bunker-oil shipping, coal power plants, diesel cars, cement, our consummate appetite for plastic, gas- and oil-based heating, air travel, cooling buildings to 19 °C when it’s 38 °C outside, ammonia-based fertiliser, fast fashion, maintaining perfect green lawns in desert environments, driving monster trucks with one passenger on 18-lane highways, dismantling public transport, building glass skyscrapers with no external shade, buying new TVs every second year, etc., I think AI’s reputation as a carbon emitter is overblown (especially considering most of Azure, CG and AWS runs on renewables) and is being used as a battering ram in a larger battle that comes from very real concerns about how AI is changing our society.

          • quick@thelemmy.club · +6 −1 · 6 months ago

            most of Azure, CG and AWS runs on renewables

            Yeah definitely believing that

          • raspberriesareyummy@lemmy.world · +4 −3 · 6 months ago

            1. If each of those contributors to wrecking our planet points to the other causes as justification not to limit their own damage, we’re fucked.
            2. It’s not AI, it’s large language models: glorified statistical text prediction without any originality. It just appears original to some naive humans because it regurgitates ideas other people have had that they simply hadn’t heard before.
            3. Despite the misnaming, I agree there’s a real concern that it makes the majority of users even more stupid than mankind on average already is :/
      • Balder@lemmy.world · +2 −2 · 6 months ago (edited)

        It’s more that there is a vocal minority against it. I’d guess most of us are mostly neutral about it, we see the problems and the benefits but don’t see the need to comment everywhere about our feelings towards it.

  • ArchRecord@lemm.ee · +27 −2 · 6 months ago

    If only they did what DuckDuckGo did: make it pop up only in very specific circumstances, draw primarily from current summarized information from Wikipedia in addition to its existing context, and let the user turn it off completely with one click of a settings toggle.

    I find it useful in DuckDuckGo because it’s out of the way, unobtrusive, and only pops up when necessary. I’ve tried using Google with its search AI enabled, and it was the most unusable search engine I’ve used in years.

      • ArchRecord@lemm.ee · +9 −3 · 6 months ago

        I haven’t had any problems myself.

        In fact, I regularly use their anonymized LLM Chat tab to help out with restructuring data, summarizing some more complex topics, or finding some info that doesn’t readily appear near the top of search. It’s made my search experience (again, specifically in my circumstance) much better than before.

  • just_another_person@lemmy.world · +19 −2 · 6 months ago (edited)

    To be fair, it was never “hidden”, since all of the top 5 decided that GPUs were the way to go with this monetization.

    Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed and a similar energy reduction, at a fraction of the cost and hassle for cloud providers.

  • repungnant_canary@lemmy.world · +16 −2 · 6 months ago

    I’m genuinely curious where their penny-pinching went. All these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won’t give them any more income. How do they justify that?

    • conciselyverbose@sh.itjust.works · +7 · 6 months ago (edited)

      It’s another untapped market they can monopolize. (Or just run at a loss because investors are happy with another imaginary pot of gold at the end of another rainbow.)

    • HappycamperNZ@lemmy.world · +4 · 6 months ago

      Perception. If a company isn’t on the leading edge, we don’t consider them the best.

      Regardless of whether you use them or not, if Google didn’t touch AI but Edge did, you would believe Edge is more advanced.

    • afraid_of_zombies@lemmy.world · +1 −1 · 6 months ago

      Because data is king, and sessions are going to be worth a lot more than searches. Consider the following:

      1. Talk to an LLM about what product to buy

      2. Search online for a product to buy

      Which one gives out more information about yourself?