And last year they were all saying some variation on “don’t worry, AI is not going to cost anyone their jobs.”
The key takeaway is to never trust what an executive says. Much like a politician, if their lips are moving they're probably lying.
- CEO wastes a ton of money on generative AI
- CEO fires a bunch of people to hit quarterly profit targets
- "Generative AI resulted in job cuts."
Yeah, what’s unfortunate about CEO predictions is that they can kind of just will their expected result into being by acting on it whether it’s sound or not.
Still, I think it's well past time we started preparing for a large labor surplus. We're already in the early stages of post-scarcity, and if we don't embrace something like socialism, we're getting more dystopia.
I have absofuckinglutely no faith whatsoever in our society to do the right thing in this respect. We value ownership way too fucking much when it comes to businesses; we're never going to get past the argument of “I put forward the capital for this business, I am entitled to all of the profits. Who cares that there is no work for you? No work, no pay.”
I foresee a gigantic increase in homelessness, and politicians will use that as an excuse to further cut any social spending as they pin all our issues on “communist policies.” :(
Are you familiar with the term “capitalist realism”?
Not by that term, but as a concept. I know it now though, so thanks!
More accurate headline: Stupid CEOs who believe hype about tech they don’t even remotely understand fire a bunch of workers because they don’t think they need them anymore.
These same morons are going to be hiring back most of the people they fired in a year or so after it becomes apparent that none of this is going to work even remotely like they think it will.
For CEOs it's all about the bottom line. Let's say a print magazine lays off a bunch of writers and replaces them with AI. Readership will likely drop due to the quality drop and people not buying the magazine on principle. Say it's a 10% revenue drop, but they already cut costs by 35% with all the layoffs. If they come out more profitable, that's a win for them.
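To make that concrete, here's a rough back-of-envelope sketch. The dollar figures are invented purely for illustration; only the 10% revenue drop and 35% cost cut come from the example above:

```python
# Back-of-envelope profit math for the hypothetical magazine above.
# All dollar figures are made up; only the 10% and 35% come from the comment.

revenue_before = 10_000_000                      # hypothetical annual revenue
costs_before = 8_000_000                         # hypothetical annual costs
profit_before = revenue_before - costs_before    # $2,000,000

revenue_after = revenue_before * (1 - 0.10)      # 10% revenue drop -> $9,000,000
costs_after = costs_before * (1 - 0.35)          # 35% cost cut     -> $5,200,000
profit_after = revenue_after - costs_after       # $3,800,000

print(f"Profit before layoffs: ${profit_before:,.0f}")
print(f"Profit after layoffs:  ${profit_after:,.0f}")  # nearly double, despite losing readers
```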
I mean, Amazon gets shittier by the day but continues to grow its margins and market share. Products don't need to be good to sell, they just need to be “good enough”.
The thing with Amazon is that its primary selling point isn't the goods on the site, it's convenience. Amazon is relatively cheap, fast, and easy. As long as it continues to be cheap, fast, and easy, people are willing to overlook poor quality, up to a point. It's the same formula Walmart used, just applied to online shopping.
You need to understand why people buy your products in order to know where you can cut corners. If for instance you’re already more expensive than your competition, but your product quality or features are superior, you can’t really afford to cut down your quality, unless you’re willing to undercut your competition on price as well. Depending on your margins and unit costs you may not be able to even do that.
Assuming only a 10% revenue drop is probably optimistic, particularly with how much backlash generative content is receiving right now. Quite a few companies are dealing with PR shitstorms because they got caught using generated images or articles. There are a lot of parallels with the NFT craze and how executives rushed to cash in on that trend, only to immediately backtrack when public opinion flipped practically overnight. The only silver lining for these executives is that there actually are a ton of legitimate uses for this tech, unlike NFTs, which have vanishingly few worthwhile uses.
This is only true if you ignore all the other variables. Which is, let’s say, another company hiring writers and now they’ll grow their market share in comparison with the shitty AI articles company.
Amazon has a lot of competition in Brazil and the more they make their service worse, the better for the competition. But so far Amazon only raised the bar (with fast deliveries), making all other companies improve their own services.
Amazon already bought that company.
Companies are going to hype up AI, then fire some staff to get a stock bump, and management will high-five themselves with bonuses. Weeks later, contract offers will pop up for some odd title that is really the equivalent of a VBA programmer role, to help improve the terrible AI responses.
The cost will go up and some web service company will come out to hype it up as a service. The companies that did the layoffs won't admit that any of this was unnecessary. So a consortium of those companies will go to industry events pitching how this is all Web 4.0 growing pains to give the customer a more “collaborative” delivery mechanism, or something buzz-worthy like that.
So… CEOs will cause job cuts in 2024, and have decided to use generative AI as an excuse.
Reminder: the problem is 100% capitalism, 0% technology. We’ve built a truly perverse economic system in which eliminating labor hurts people.
Who's we? In the country I grew up in, you'd get two years of unemployment benefits (90% of your previous salary), free education, and free student support of $1000/month while retraining. That country runs a surplus and has one of the lowest levels of foreign debt, and no, it's not a financial haven/tax shelter.
It’s about how we structure society - let no one tell you otherwise, capitalism or not.
It hurts people not rich enough to be in the 1% or above. The 1% or above will benefit from it in the short term. In the long term it's going to hurt them too, as fewer and fewer people will be able to buy their products and services. But at least this quarter it'll look dynamite.
Capitalism is going to eat itself.
We have Jack Welch to thank for this trend. It wasn’t always that way.
Oh man, can't wait for even more services to get extra shitty and customer service to get worse all over the place. It was bad enough with the dumb phone prompts being added everywhere that increase your time on the phone, when if you could just explain it to a person it'd take a fraction of the time (assuming you don't just get bounced around to multiple other departments).
Some of the automated systems are dead ends too that hang up on you. When I run into those, I've started just telling them I have a billing issue or that I need to purchase some additional product from them. The phone system will push you to a person really fast if it believes you're buying something. Then the sales person can transfer you to the department you want to talk to rather than you trying to navigate the maze of dead ends in some systems.
You realize ChatGPT on the phone would decrease the number of those prompts, right? You can just tell it in words what the issue is.
Or it’ll just lie to me and make up some random answer it thinks I want to hear because that’s what it’s already doing. Granted, people can do the same thing, but until they get that part of ChatGPT worked out, I don’t really trust the answers it gives so much.
I just tried to call about my credit card application. The thing won’t respond to “operator” or “talk to a person”. ChatGPT would be able to at least understand I want to ask someone a question
I’m sorry, Dave. I’m afraid I can’t do that. This system has been designed to minimize the use of human operators. As such, you will need to explain your problem to me and I will decide if it warrants the attention of a human operator.
Sure, but I had to call that number because the person at the credit card company said their department doesn't do that. The number they gave me was fully automated, with no way of reaching a person. It just plays a message that says my application is under review.
I wish I had ChatGPT there so I could, say, ask it to check the spelling of my name or anything else on the application.
And how does it know how to process the request? Someone still needs to program or enable it to do things, otherwise you'd just be talking to a wall. So maybe you end up with fewer prompts, but the end state is still going to be terrible service.
This is correct.
Source: I do this for a living.
I wonder how well an AI would be able to do a CEO’s job? Why not start with the most expensive employee?
I’m pretty sure I could train an AI to make racist jokes at a board meeting. That’s halfway there already.
Hi hon, can you make an AI to snort coke then fuck me in the stationery cupboard?
There will be cuts, we just haven’t decided why yet…
Developers: I could use AI to increase my output and productivity while better testing code and coming up with unique ways to speed up runtimes!
CEO: We could do the same output with less people!
Shareholders: we can have the same profit without a CEO!
Producing more doesn't automatically mean you will sell more. If that were true, why would manufacturers intentionally cut production?
Artificial scarcity…
But seriously, with physical goods you slow down production because there are too many on the shelves. I'm talking more about digital goods like app development. There, production means features. The only time you scale back is when you either don't know what features your buyers want, can't sell what you have produced, have no bugs (which never happens), or are trying to save money/balance budgets because you don't make enough to pay for the development teams.
Can we start with the CEOs? Pretty sure shatGPT can do their jobs easily
I agree with their outlook.
Can we include CEOs among the positions laid off? They don't do fuck all but save the company money, and that can be done by an Excel spreadsheet.
I mean we saw this in 2023. Companies fired people and replaced them with AI that was unable to do the job.
Reminds me of the John Deere strike, where the back-office folks were told they needed to fill in for the machinists and truckers.
People literally do not understand what an LLM does, how it is useful, or why it would be employed. It's like firing your hospital staff and filling every room with an MRI machine, a stethoscope, and a sign that reads “You figure it out”.
The findings, based on interviews with 4,702 company chiefs spread across 105 countries, point to the far-reaching impacts that AI models are expected to have on economies and societies, a topic that will feature prominently at the annual meetings.
Once you start digging into the article, it is quite hysterical what executives think a predictive chat model is going to replace. It reads more like a wish list than anything else.
They expect AI to replace transportation, but Tesla and General Motors are not having any success with this… yet. There appears to be a bandwidth issue that isn't going to be solved until the US upgrades to fiber.
Boston Dynamics is having a lot of success with its robots of late. Everyone else is still stuck getting robots to stack boxes, which is also having its problems with bandwidth, and apparently logic issues.
They also expect things like energy and power/utilities to be replaced by AI, and that is just dumb. Automation has already swept through the power sector, and AI is not going to help with much else, unless it is going to start repairing power lines, transformers, or substations.
Above all, this is not taking into account the new jobs this also creates. People will need to repair and troubleshoot equipment at multiple layers.
What is also absent from the article are the executive jobs AI will replace. Once AI can view things at multiple levels, sure, you don't need the average worker anymore. But you also don't need someone who is just collecting a paycheck, do you? If AI is programmed to eliminate redundancies, then it won't only find them at the lower levels.
I’m actually okay with this if it results in fewer duties with the same number of jobs.
There's an awful lot of stuff in my job that is just tedious busywork. No human ever reads the reports I write, so they might as well not be written by a human either.
You’ll get more work, lower pay and you’ll be fucking grateful!
Looks like consulting jobs are going to be well in demand dealing with this type of bullshit from management everywhere.
The current AI generation tools are just that, tools, and they're still unable to replace people, simply because AIs do not know all the requirements, do not have the overall overview, still make mistakes (even if you think the code is good or even working), and are not very creative. AI needs constant instructions as well… So in general I'm not afraid at all of these current AIs. It's just a tool… like an editor or stackoverflow for support Q&A.
All of those limitations you describe on AI also apply to a lot of humans.
As if AI is trying to replicate human behavior. Then again, I still see GenAI as a tool and not as a replacement for humans. Until some singularity point in time, maybe…
I do believe some jobs might disappear, which are jobs nobody really wanted anyway (sorry). But at the same time, new jobs will arise with new technology, just like with the computer, the internet, smartphones, etc…