Is your job threatened by AI?

JoeBobbaLou2

Simple question: is your job in danger of AI rendering you and your skills obsolete?

Any personal stories here from those who have been laid off, or are soon to be?
 
Absolutely... I'm trying to put my tendrils into any area of this company I can to secure as much of a role as possible. Sucks, but that's life.

I will go back to electrical work if necessary, but man, I'm getting up there, and I do not want to at all.
 
There are mass layoffs everywhere. Entry-level white collar roles will either be done by AI or fought over by fresh grads competing for limited positions.

Some blue collar jobs are not safe either; see the other thread, Amazon is getting rid of 30,000 jobs in the coming months.
 
I don't think so, but who knows. Nothing seems truly safe.
 
There are mass layoffs everywhere. Entry-level white collar roles will either be done by AI or fought over by fresh grads competing for limited positions.

Some blue collar jobs are not safe either; see the other thread, Amazon is getting rid of 30,000 jobs in the coming months.
Many folks just don't want to see it. It sucks. The key is understanding that AI won't completely take over every job or anything that severe, but it will give a few people the tools to do several jobs at once, so companies will merge several roles into one or two. Best to use whatever resources you can to learn to do several roles at your company, if you possibly can, especially if you're white collar and the like. Or not, and hope for the best.
 
The big one will be when self-driving cars and trucks take over.

Think about how many people make a living just driving stuff around: truck drivers, UPS, Uber, DoorDash... That's going to be the breaking point.
 
There are mass layoffs everywhere. Entry-level white collar roles will either be done by AI or fought over by fresh grads competing for limited positions.

Some blue collar jobs are not safe either; see the other thread, Amazon is getting rid of 30,000 jobs in the coming months.
My good friend was one of them. :(
 
The big one will be when self-driving cars and trucks take over.

Think about how many people make a living just driving stuff around: truck drivers, UPS, Uber, DoorDash... That's going to be the breaking point.
Somewhere at the intersection of technology and economics there must be a theory about the upper limit where more tech is no longer profitable. For example, the Model T became huge because the average Joe could afford it. But if tech takes away too many jobs, rich people alone won't be enough to keep the technology profitable.
 
Pretty safe this way, unless AI can mop up wet floors at The Blue Oyster Bar.
 
20+ year programmer here; I commented in the last thread.

No, and the people who say it will are base-level web developers who will get replaced, not real programmers.
 
Programmer, and no. I've been following it closely, integrating it where I can into my workflows, trying to get all the efficiency gains possible, etc.
I'm also following the business and R&D sides of it.

I have no fear of being replaced; actually, quite the opposite. It seems to be severely hampering the development of new programmers, and there is already a shortage of good devs who really know what they're doing, which I believe is going to get worse.

I've mentioned it here before, but I've already been on, and heard about, quite a few projects that doubled their expenses having to redo a bunch of LLM sludge they thought they could get away with in prod.

*edit*

Forgot to say: it's also quite shit as it is, and I don't see it getting much better. I think we've already seen the curve start to flatten with respect to LLMs for code.
The latest GPT is actually worse than the previous iteration, which quite a few people predicted would be the case.

A lot of people who don't know comp sci / programming think that because they can spin up the front end of a website or some simple Python scripts, it has turned them into real programmers.

Real devs regularly work on problems that it simply can't solve, something I run into a lot.
It will proudly state the theory and method of execution but utterly fail to make it work, sometimes because it was never possible in the first place.

(I keep adding to this list as I think of things, but I'll leave it for now; there is much more.)
They're also horrible with context on large projects: recreating already existing functionality, coupling large classes/objects, disrespecting scope, adding mountains of auxiliary bloat that was never needed, leaving glaring security flaws, writing endless race conditions in certain types of projects (games, streaming data, etc.), and showing no spatial reasoning, which is an extreme problem for engineering applications.
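To make the race-condition point concrete, here is a minimal, entirely hypothetical sketch (not from any real project) of the classic check-then-act mistake that generated code tends to produce: two threads both pass the check before either one writes, so the shared state ends up invalid.

```python
# Hypothetical sketch: a check-then-act race on shared state.
import threading
import time

balance = 100  # shared state with no lock guarding it


def withdraw(amount):
    global balance
    if balance >= amount:      # check
        time.sleep(0.01)       # widen the window so the race shows up reliably
        balance -= amount      # act: the other thread may have already withdrawn


threads = [threading.Thread(target=withdraw, args=(80,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Both withdrawals can succeed, leaving -60 instead of refusing one of them.
print(balance)
```

The usual fix is to hold a threading.Lock() across both the check and the update, or to avoid shared mutable state entirely; that's exactly the kind of reasoning the generated code tends to skip.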

They're really, really bad, but it's hard to explain that to people who don't have the level of knowledge required.

It would take a really accurate metaphor to get across why it isn't good.
 