Don’t worry folks, if we all stop using plastic straws and take 30 second showers, we’ll be able to offset 5% of the carbon emissions this AI has!
Sounds like not using Google search would be a way more effective way of reducing CO2
Removed by mod
…You joined 3 days ago. 🤣
Removed by mod
You were definitely a redditor, right?
That’s an insult around this here fora, boy
Sadly, yes.
The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn’t necessary and I’m not convinced it results in a “better search result” for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.
As a buzzword or whatever this is leagues worse than “agile”, which I already loathed the overuse/integration of.
Before AI it was IoT. Nobody asked for an Internet connected toaster or fridge…
I always felt like I was alone in this thinking. Anyone with a bit of a security mindset doesn’t want everything connected; besides, it makes devices more expensive and easier to break. It’s certainly very convenient for planned obsolescence.
It definitely has to wander in the desert for a while. I know multiple people who like it for some stuff, like cameras and managing air conditioning.
If only Google had a working search engine before AI
And yet it’s still garbage…like their search
AI is just what crypto bros moved onto after people realized that was a scam. It’s immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it’s being backed by major corporations because it means fewer employees they have to pay.
There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren’t just a scam. However, most of these are not consumer facing and the average person won’t really hear about them.
It’s unfortunate that what you said is very true on the consumer side of things…
energy for a solution in search of a problem,
Except this time it’s being backed by major corporations because it means fewer employees they have to pay.
Ah yes, the classic “it is useless, and here is a use for it” logic.
I take it you haven’t had to go through an AI chat bot for support before, huh?
I have and don’t see the relevance. The argument is that it is useless and then mentions a use case. If you want to say it’s crap I won’t argue the point but you can’t say X and ~X.
Crypto has been hitting all time highs this year; there’s just more bros than before.
There’s a sucker born every minute
Tell that to my wallet. I hold a little crypto and it’s down over 50%
Hey wallet stop buying shitcoins
I skimmed the article, but it seems to be assuming that Google’s LLM is using the same architecture as everyone else. I’m pretty sure Google uses their TPU chips instead of a regular GPU like everyone else. Those are generally pretty energy efficient.
That and they don’t seem to be considering how much data is just being cached for questions that are the same. And a lot of Google searches are going to be identical just because of the search suggestions funneling people into the same form of a question.
Exactly. The difference between a cached response and a live one, even for non-AI queries, is an order-of-magnitude difference.
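The caching point is easy to sketch: if queries are normalized before lookup, the many surface variants that search suggestions funnel people into collapse onto one cached answer, and the expensive backend only runs once. A toy Python illustration (the `normalize` rule and `expensive_answer` stand-in are invented for the example, not anything Google actually does):

```python
import re

cache = {}
calls = 0  # counts how often the "expensive" backend actually runs

def normalize(query: str) -> str:
    # Collapse case, whitespace, and trailing punctuation so that
    # near-identical queries share one cache entry.
    return re.sub(r"\s+", " ", query.strip().lower()).rstrip("?!. ")

def expensive_answer(query: str) -> str:
    # Stand-in for the costly LLM / live-search call.
    global calls
    calls += 1
    return f"answer to: {query}"

def cached_search(query: str) -> str:
    key = normalize(query)
    if key not in cache:
        cache[key] = expensive_answer(key)
    return cache[key]

# Three surface variants of the same question hit the backend once.
a = cached_search("How tall is the Eiffel Tower?")
b = cached_search("how tall is the eiffel tower")
c = cached_search("  How tall is the Eiffel   Tower? ")
```

Real systems would use a distributed cache with expiry rather than a dict, but the energy argument is the same: identical questions shouldn’t cost a fresh inference each time.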
At this point, a lot of people just care about the ‘feel’ of anti-AI articles even if the substance is BS though.
And then people just feed whatever gets clicks and shares.
Google’s TPUs can’t handle LLMs lol. What do you mean “exactly”?
In fact, Gemini was trained on, and is served, using TPUs.
Google said its TPUs allow Gemini to run “significantly faster” than earlier, less-capable models.
Did you think Google’s only TPUs are the ones in the Pixel phones, and didn’t know that they have server TPUs?
I hadn’t really heard of the TPU chips until a couple weeks ago when my boss told me about how he uses USB versions for at-home ML processing of his closed network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy…but I looked the USB things up and they’re wildly efficient and he says they work just fine for his applications. I was impressed.
Yeah they’re pretty impressive for some at home stuff and they’re not even that costly.
The Coral is fantastic for use cases that don’t need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.
It runs Tensorflow Lite, so you can also build your own models.
Pretty good for a $25 device!
I’m pretty sure Google uses their TPU chips
The Coral ones? They don’t have nearly enough RAM to handle LLMs - they only have 8MB RAM and only support small Tensorflow Lite models.
Google might have some custom-made non-public chips though - a lot of the big tech companies are working on that.
instead of a regular GPU
I wouldn’t call them regular GPUs… AI use cases often use products like the Nvidia H100, which are specifically designed for AI. They don’t have any video output ports.
deleted by creator
Just curious: are your summaries made by you, a human?
Used AI to summarize article about misuse of AI.
Genius.
Using AI the correct way 👌
The confounding part is that when I do get offered an “AI result”, it’s basically identical to the excerpt in the top “traditional search” result. It wasted a fair amount more time and energy to repeat what the top of the search said anyway. I’ve never seen the AI overview ever be more useful than the top snippet.
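That impression is easy to sanity-check: diff the overview text against the top snippet and see how much actually differs. A toy stdlib sketch (both strings are invented examples, not real Google output):

```python
from difflib import SequenceMatcher

# Hypothetical top "traditional search" snippet and AI overview text.
snippet = "The Eiffel Tower is 330 metres tall, roughly the height of an 81-storey building."
ai_overview = "The Eiffel Tower is 330 metres tall, about the height of an 81-storey building."

# Ratio of matching characters across both strings; 1.0 means identical text.
similarity = SequenceMatcher(None, snippet, ai_overview).ratio()
```

When the ratio sits near 1.0, the overview is essentially a restatement of the snippet, just generated at far higher compute cost.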
It’s not even hidden; people just give zero fucks about how their magical rectangle works and get mad if you try to tell them.
deleted by creator
I use generative AI sometimes, and I find it useful for certain use cases.
Are you just following the internet hate bandwagon, or do you really think it’s no good?
deleted by creator
And it’s only 10x more useless :)
The results used to be better too. AI just produces junk faster.
I don’t know if lemmians (lemmings?) live in the same world I live in; I find AI a HUGE productivity boost, in search, writing, generation, research.
Of course one has to check results carefully, it won’t remove the need for quality checks and doing stuff yourself. But as a sparring partner, an idea creator and an assistant, I am convinced people who make use of Claude/GPT etc will outperform people who don’t by a wide margin.
AI definitely has its use cases and boosts productivity if used right. The stuff Google did is just the most bullshit ever seen. It’s an example of useless AI born of the “need” to use AI.
Ok, on that we agree entirely. Google is worried that their investors will worry, because their investors are too dumb to understand that LLMs and search are two separate things, and one isn’t necessarily better because it uses the other.
How can you trust whatever you search for after the glue on pizza thing?
You’re confusing what I said as being solely about search.
No, I’m just addressing the search part. How can you trust it?
I’m not saying I can. I don’t use Google, haven’t for years, so can’t make statements about the quality of its search. Intuitively search isn’t benefitting from the use of AI.
You wrote:
I find AI a HUGE productivity boost, in search
Then when pushed, you walk it back:
search isn’t benefitting from the use of AI
Why make your initial comment of support if you just walk back on it? Got some money riding on it or something?
that’s gonna be mighty useful when we’ve destroyed the planet. Also, you are not working with AI. You are working with an LLM.
Against bunker-oil shipping, coal power plants, diesel cars, cement, our consummate appetite for plastic, gas- and oil-based heating, air travel, cooling buildings to 19 °C when it’s 38 °C outside, ammonia-based fertiliser, fast fashion, maintaining perfect green lawns in desert environments, driving monster trucks with one passenger on 18-lane highways, dismantling public transport, building glass skyscrapers with no external shade, buying new TVs every second year, etc., I think AI’s reputation as a carbon emitter (especially considering most of Azure, CG and AWS runs on renewables) is overblown and used as a battering ram in a larger battle that stems from very real concerns about how AI is changing our society.
most of Azure, CG and AWS runs on renewables
Yeah definitely believing that
- if each of those contributors to wrecking our planet points to the other causes as a justification to not limit their own damage, we’re fucked
- it’s not AI. It’s large language models. Glorified statistical text prediction without any originality. It just appears to some naive humans as original because it regurgitates ideas to them that other people have had but they just hadn’t heard before
- despite the misnaming, I agree that there’s a real concern that it makes the majority of users even more stupid than mankind on average already is :/
It’s more that there is a vocal minority against it. I’d guess most of us are mostly neutral about it, we see the problems and the benefits but don’t see the need to comment everywhere about our feelings towards it.
If only they did what DuckDuckGo did and made it so it only popped up in very specific circumstances, primarily only drawing from current summarized information from Wikipedia in addition to its existing context, and allowed the user to turn it off completely in one click of a setting toggle.
I find it useful in DuckDuckGo because it’s out of the way, unobtrusive, and only pops up when necessary. I’ve tried using Google with its search AI enabled, and it was the most unusable search engine I’ve used in years.
DDG has also gotten much worse since the introduction of AI features.
I haven’t had any problems myself.
In fact, I regularly use their anonymized LLM Chat tab to help out with restructuring data, summarizing some more complex topics, or finding some info that doesn’t readily appear near the top of search. It’s made my search experience (again, specifically in my circumstance) much better than before.
To be fair, it was never “hidden” since all the top 5 decided that GPU was the way to go with this monetization.
Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed with a similar energy reduction, at a massive fraction of the cost and hassle for cloud providers.
Google already has their own TPUs, under the name Coral
I’m genuinely curious where their penny-pinching went. All the tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won’t give them any more income. How do they justify it then?
It’s another untapped market they can monopolize. (Or just run at a loss because investors are happy with another imaginary pot of gold at the end of another rainbow.)
Perception. If a company isn’t on the leading edge we don’t consider them the best.
Regardless of whether you use them or not, if Google didn’t touch AI but Edge did, you would believe Edge is more advanced.
It doesn’t even need to appeal to you the user, but given the AI Gold Rush, they would have very unhappy investors if they did not.
Very good point
Because data is king, and sessions are going to be worth a lot more than searches. Go through the following:

- Talk to an LLM about what product to buy
- Search online for a product to buy

Which one gives out more information about yourself?