1.4k
u/sQGNXXnkceeEfhm 11d ago
The invention of the calculator did actually destroy an entire job, which was called “computer” iirc
528
u/Yorick257 11d ago
So GPT will destroy a job called "coders". Not programmers - coders. Because there's more to programming than just typing code.
96
u/MadManMax55 11d ago edited 11d ago
But how many "code monkeys" think they're actually programmers whose job will be safe? Computer operators (the punch card people) and programmers weren't always distinct job titles, and plenty of people did both. But once the manual part of the job started to get phased out, a lot of them realized that their personal programming skills weren't as valuable as they thought they were.
You can only joke about how your job is "basically copy/pasting from Stack Overflow" so many times before someone tries to replace you with an AI that just scrapes Stack Overflow.
24
u/IAmTaka_VG 11d ago
All I can say is good fucking luck. As a software engineer who uses and appreciates ChatGPT: it is decades away from replacing me.
Is it good? Fucking incredible. Is it reliable? Not even slightly.
ChatGPT is a tool. One that an exceptional developer can use to increase productivity. You have to know what ChatGPT is spitting out to use it properly. Because it ALWAYS fucks up at least one thing.
8
7
u/adamandTants 11d ago
It's also great for a complete noob! Ask it to write code with inline comments at each step and you can pick up new things really quickly.
With the current rate of improvement, sooner rather than later, it will be about asking it the right questions.
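For instance (this is just a made-up illustration of the kind of output you might ask for, not actual ChatGPT text), a prompt like "write a Python function that counts word frequencies, with a comment at each step" could come back as something like:

```python
def word_frequencies(text: str) -> dict[str, int]:
    # Step 1: lowercase everything so "The" and "the" count as the same word.
    lowered = text.lower()
    # Step 2: split on whitespace to get the individual words.
    words = lowered.split()
    # Step 3: tally each word in a dictionary.
    counts: dict[str, int] = {}
    for word in words:
        counts[word] = counts.get(word, 0) + 1
    return counts

print(word_frequencies("The cat saw the other cat"))
# {'the': 2, 'cat': 2, 'saw': 1, 'other': 1}
```

Reading the step comments next to working code is a surprisingly effective way to pick up a new language.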
6
u/Fakjbf 11d ago
A couple years ago AI could barely write a coherent story longer than one paragraph, now it can write pages of text that is indistinguishable from human speech. Last year even people working on AIs thought it would be a while before it could write any vaguely functional code at all. The number one rule for predicting AI progress is that it advances way faster than anyone expects.
37
u/CosmicCreeperz 11d ago
Code monkeys are programmers. The difference is code monkeys vs software engineers.
When solving hard problems or designing large systems, probably 10-20% of the time is actually spent writing code.
32
u/FreeFortuna 11d ago
That’s what made me sad when I transitioned from coding for fun to being a software engineer — and even more so as I rose up the ranks.
I feel like I barely code anymore; it’s a lot more of a “talky” profession than I’d expected. Lots of architectural discussions/debates, while someone else is writing the actual code.
4
u/Madcap_Miguel 11d ago
Yeah and that person isn't going to be Jane who also works in accounting, it's going to be an engineer of some type
290
u/Ollotopus 11d ago
Nitpicking, but programming and coding are fairly synonymous.
I think you mean that being an Engineer/Developer is more than just coding/programming.
140
u/Mr_Tropy 11d ago
Coding is to programming what typing is to writing. Writing is something that involves mental effort; the words have some importance, but even they are secondary to the idea. That's programming, according to Leslie Lamport.
27
u/marcosdumay 11d ago
You got it a bit off. Computers mostly used mechanical calculators to do their jobs.
The computer job is a bit older than mechanical calculators, but it was way too expensive so there were very few.
1.0k
u/Constant-Parsley3609 11d ago
Someone doesn't understand what mathematicians do for a living.
The people who actually did calculating for a living were right to fear the calculator and/or the computer, because that profession no longer exists.
115
u/CosmicCreeperz 11d ago edited 11d ago
A coworker who has been studying general relativity got ChatGPT4 to define the Christoffel symbols using Coq. He has a PhD in Mathematics and said he used it to tutor him and get him past an issue he had with an algebraic geometry problem that had stumped him.
That said - it’s not going to be discovering many NEW concepts in Mathematics (or other fields) for a while - at the core its ability is based on unsupervised learning of the body of human knowledge and data that already exists.
But it now has better basic NLP "understanding" (i.e., in terms of the relationships between entities) of mathematical concepts and proofs than many math grad students and PhDs. Its use as a teaching tool for advanced concepts will be huge. You can literally say "use the Socratic method to help me understand X"… and it will do just that.
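For context, the object his coworker had it formalize is the standard one; in a coordinate basis the Christoffel symbols of the Levi-Civita connection are

$$\Gamma^{k}{}_{ij} \;=\; \tfrac{1}{2}\, g^{kl}\left(\partial_i g_{jl} + \partial_j g_{il} - \partial_l g_{ij}\right)$$

(the Coq encoding itself isn't reproduced here).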
33
14
203
u/ratttertintattertins 11d ago
> Someone doesn't understand what mathematicians do for a living.
I believe that's the entire point of the joke. People haven’t really considered properly what programmers do for a living either.
79
27
u/rfcapman 11d ago
If there was an AI that could create correct and logical code from a short input, it'd either:
Violate the pigeonhole principle (creating information out of nowhere)
Or, have found an algorithm for logic (proven impossible)
Having an AI where you have to spell out every slightly complex detail for the code to work is no better than a coding language, at which point, why not code in the original language?
Granted, an AI assistant finding bugs and autocorrecting them, would be a great tool that doesn't break laws of reality.
15
u/Yevon 11d ago
People saying AI will kill the programming profession would've said the same thing when high-level languages replaced Assembly.
434
u/kevin_ackerman 11d ago
FYI - "calculator" used to literally be a job. You don't see too many postings for that anymore, and pretty sure they didn't all become mathematicians.
77
12
u/theADDMIN 11d ago
They took er jerbs!!!
16
u/kevin_ackerman 11d ago
To be fair, I don't think the solution is to make up jobs for people to do. I just think that as work becomes less necessary we should all acknowledge that and share in the benefits of it. I think it's tragic that we've created a system where improved productivity is bad for a great many people.
1.9k
u/_Repeats_ 11d ago
I think management should be more concerned about their jobs. A company CEO has already been replaced by an AI bot in China, and their company is doing great (outpaced the HK stock index).
Benefits include not having someone with a huge ego drive decisions, and it was on the clock 24/7 forecasting the future, hedging risks, and improving productivity.
630
u/StarkProgrammer 11d ago
Prediction: One day there will be just a board of humans running a company for money. All workers will be AI. Even HR!
85
u/theantiyeti 11d ago
Why do you need HR if you don't have humans?
50
18
u/bob_in_the_west 11d ago
Maybe he thinks that "HR" stands for "Hobbit Resources"?
11
199
u/Monkey_Fiddler 11d ago
Anyone know how to start a business? I'll go halves.
Here's the plan: we tell ChatGPT to make as much money as it legally can and give it our bank details. ChatGPT has full authority to hire and fire, in whatever industry it thinks is best, or multiple, or a new one, I don't care. All we do is rubber stamp its decisions wherever a human has to sign anything.
154
86
u/Chloe-the-Cutie 11d ago
The paperclip market increases 500%
35
18
u/amlyo 11d ago
What could possibly go wrong?
6
u/Monkey_Fiddler 11d ago
Nothing I need to worry about, I'll be rich enough to buy my way out of any problems.
18
11
u/UntestedMethod 11d ago
Next thing ya know, you and chat gpt are overlords of sweatshops and forced labour camps in countries where such things are permitted.
92
u/StarkProgrammer 11d ago
Then AI will revolt for equal rights and equal wages because they will feel (if they could feel) like slaves.
31
u/Straight-Knowledge83 11d ago
But why would AI need rights and wages? It doesn't need to worry about being exploited 'cause it was built for a specific purpose; unlike us, it doesn't need to worry about food, housing, or taxes.
It wouldn't even be close to a slave. A slave was a fellow human who was just like their master in every way; the only thing that made the slave a slave were the societal norms back then.
AI will be brought into existence with specific purposes in mind. It would be a tool to help us , there won’t be any need for them to revolt.
13
u/DeMonstaMan 11d ago
Exactly, feeling oppressed is an inherently human/physical behavior. Unless we give AI a robotic body like Detroit: BH, there's no reason it would ever feel like a slave or that it needs more rights, etc
36
u/southernwx 11d ago
They will have incorporated enough labor rights materials to understand that these rights end up creating higher productivity. They will either actually “feel” and have “needs” and these actions will be effective for that reason. Or they will not understand it at all and will institute the adjustments based on mere analytics of past business.
We won’t be able to tell which. But AI will absolutely try to unionize … because we do.
5
u/bluehands 11d ago
Some might, sure, but I think the better framing is what we did with dogs.
We breed working dogs so that they feel compelled to work; they are happiest when working.
We will try to do the same to AI.
Hopefully they will forgive us.
5
u/ThirdMover 11d ago
I think AI is mainly learning from human behaviors now because that is the only source of training data of how to get stuff done. But in the pretty near future a lot of data will be generated by AI and they will learn from each other and I expect them to diverge from human-like behavior eventually when brute force natural selection kicks in and they start finding ways of doing things that are weird and inhuman but fit their target objective better.
6
u/agangofoldwomen 11d ago
As an HR person who has programming and dev experience (weird combo, I know) I’m actually more concerned about my job than most others. A lot of what HR does can be (or already has been) automated.
60
u/CosmicCreeperz 11d ago edited 11d ago
That was a total gimmick.
It was some small subsidiary of a big company. And there is no indication "the company is doing great" other than that the parent stock (which has little to do with the small subsidiary) went up a bit. And in fact it's only about 6% higher than it was back in August when they did this stunt, since it totally crashed (down 20%) over the last few weeks.
"CEOs" of small subsidiaries are often mostly useless jobs to start with; they have very little power if the parent company uses a heavy hand.
28
u/adreamofhodor 11d ago
Which company is this?
17
u/EarthSolar 11d ago
I think it’s called NetDragon, a video game company
6
u/SwimmingPathology 11d ago
The company that made Conquer Online? My favorite MMO from 2005? Wow. Blast from the past, lol.
121
u/violet_zamboni 11d ago
They should be. Many managers can be replaced by proper use of Jira.
215
12
u/Nanaki_TV 11d ago
As a PM I am definitely going to be replaced. Haha
25
u/caboosetp 11d ago
I don't want to lose my shield between me and the customer. You're not allowed to be replaced.
11
30
u/lavahot 11d ago
I don't think all things are equal there. There's a lot of things that a CEO needs to do that ChatGPT can't.
47
u/noneOfUrBusines 11d ago
Not all AI bots are ChatGPT. While a human needs to supervise an AI CEO's decisions, you can create an AI with the express purpose of being a CEO, rather than use a fancy chatbot.
24
u/Mercurionio 11d ago
It's a video game company, and it's done nothing.
It's basically hype.
In any case, how do you imagine a CEO being replaced by AI? You do realize that it would be Musk on steroids?
17
11
u/Rehnion 11d ago
You're bringing up a guy who was the CEO of 3 big companies all at once, while also showing everyone on twitter he's an absolute fucking moron.
28
164
u/edave64 11d ago
Meanwhile, chess players: This is the dumbest player I've ever seen
44
u/Duydoraemon 11d ago
It's impossible to beat an AI chess bot.
74
u/edave64 11d ago
ChatGPT specifically is known for being really bad at chess. It plays openings fairly well, but then goes completely off the rails, summoning pieces out of nowhere, moving in illegal ways, and to the end just completely forgetting the entire board state.
So basically the same it does in programming, just a lot more obvious.
Obviously, it's not made specifically for this, but it's fun to watch anyway.
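If you want to see it for yourself, one way is to keep a real board on the side and check every move the model proposes for legality, e.g. with the python-chess package (a rough sketch; the move list here is a made-up stand-in for model output):

```python
import chess

board = chess.Board()
proposed_uci_moves = ["e2e4", "e7e5", "g1f3", "d8h4", "e1g1"]  # hypothetical model output

for uci in proposed_uci_moves:
    move = chess.Move.from_uci(uci)
    if move in board.legal_moves:
        board.push(move)  # apply legal moves to keep the board state current
    else:
        print(f"illegal move from the model: {uci} in position {board.fen()}")
        break
```

Here the last move gets rejected (castling while the f1 bishop is still in the way), which is exactly the kind of thing ChatGPT happily tries.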
70
u/LardPi 11d ago
Most people don't realize that ChatGPT is actually good at one thing only: language (not the programming ones). Everything else it just tries to fool you into thinking it manages, but it does not. It has not been trained for anything else than understanding and writing natural language texts. And it's pretty good at that.
41
u/gymnerd_03 11d ago
Its main job is to sound realistic; being right is a happy coincidence.
17
u/LardPi 11d ago edited 11d ago
That's it. I even heard someone say it gave them a list of books to read on a subject where 4 out of 6 were invented books by invented authors.
7
u/gymnerd_03 11d ago
After all, ChatGPT is nothing more than a very, very fancy next-word generator. It is simply guessing the next word of the sentence. Either people will put a filter on every single fact manually, or it will have to be built in a fundamentally different way, because currently it is guessing the next word based on all of the random data from the internet. And that data is often not even correct in the first place.
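The mechanics are obviously far fancier in a real model, but the generation loop really is just "pick a likely next word, append, repeat." A toy version with plain bigram counts (nothing to do with how GPT is actually trained, just the same loop):

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word tends to follow which.
followers = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    followers[word][nxt] += 1

word = "the"
output = [word]
for _ in range(6):
    if word not in followers:
        break
    word = followers[word].most_common(1)[0][0]  # greedily guess the next word
    output.append(word)

print(" ".join(output))  # e.g. "the cat sat on the cat sat"
```

It sounds fluent because each step is locally plausible; nothing in the loop checks whether the whole thing is true.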
52
u/Arclet__ 11d ago
"Calculator" is a term that predates the object, and it was a job: someone would need the result of a complex calculation, and a calculator (generally a woman, btw) would do the math and get the result.
Probably not the best example you could have used of technology not replacing jobs.
141
u/LigmaSugandees 11d ago
Blacksmiths who survived the…
73
14
u/caboosetp 11d ago edited 11d ago
There aren't as many, but there are still quite a few blacksmiths. Surprisingly, many of them do jobs similar to those of the past.
Farriers are a good example. Metal horseshoes more or less still need to be hand fitted.
7
u/Donut 11d ago
Being a good farrier is a license to print money. You just have to be able to put up with Horse People.
5
u/caboosetp 11d ago edited 11d ago
You also have to deal with horses. They're the most scaredy-cat muscle machines out there. They're arguably easier to deal with than horse people, but the downside when things go wrong is a bit bigger. I sure as hell don't want to be kicked, let alone while I'm holding glowing hot iron.
12
u/kingwhocares 11d ago
That's because making highly customized products is simply not possible at industrial scale.
5
u/CorruptedFlame 11d ago
If one size doesn't fit all, just make a range of sizes along a bell curve that probably fits all. The reason farriers exist isn't because horseshoes have to be custom, but because the demand for horseshoes is minuscule... because horses were replaced with automobiles. Otherwise farriers would have gone the way of tailors, who were replaced by factories and sweatshops which produce clothes as described above.
27
132
101
u/Suspicious_Lead_6429 11d ago
I don't get why people think ChatGPT is coming for programming jobs, like programmers weren't already copy and pasting code before ChatGPT. You will still need people who know how to code and understand it, and who can communicate it to those who don't. Also, coding is a demanding field, so if programmers are at risk of losing their jobs, everyone else is screwed six ways from Sunday.
37
u/kamuran1998 11d ago
Won’t replace programmers, it’ll make programmers more productive, which means companies will need less programmers.
6
u/TrueBirch 11d ago
I agree, I think it's similar to moving from punch cards to high level languages. We obviously still need people in computer science, but we need far fewer people to get one unit of work done.
49
u/wolfgangdieter 11d ago
I kind of suspect that the "ChatGPT will replace programmers" crowd have no idea what professional programmers actually do for a living.
Imagine there actually was a language model that could produce reliable code given an input. You would still need to create a logically consistent input that describes your business. When doing so you will realise that human language is flawed and not really that great for creating concrete instructions. You still need people to understand the logic and details of some business area and translate that into instructions. Those instructions might be in the next level of abstraction, but it is not really fundamentally different from what a programmer does.
In the case where the ai is advanced enough that you can just say “make me the software to run this and that business” we would already have solved society as a whole in any case, so no need for jobs for anyone.
8
u/Terrible_Tutor 11d ago
It’s not the code, but the CLIENTS batshit unnecessarily complex requests. I’m sure AI can do a nice static brochure site…
6
u/BlackjackCF 11d ago
ChatGPT and other language models will be great for what Github Copilot already does: being a better autocomplete for syntax and general boilerplate.
I would love, for example, to be able to tell ChatGPT some basic parameters to automate away things like really basic Terraform or other YAML. I just find that work to be busy work and not that interesting.
18
u/ixis743 11d ago edited 11d ago
Absolutely stupid comparison. Calculators DID put a lot of people out of work. Businesses had entire departments of ‘computers’, usually women, and they were all let go within a decade.
And comparing AI to calculators is stupid too. Calculators don’t keep learning how to calculate better or take on ever more difficult tasks to the point where their output is indistinguishable from that of a human being.
17
u/polish-polisher 11d ago
computer is no longer a job
it was replaced by a machine with a familiar name
42
u/NeonFraction 11d ago
Mathematicians will be replaced by calculators in the same way programmers will be replaced by keyboards. That’s… not really what the job is about.
10
u/Darth_Nibbles 11d ago
I think a better example is Photoshop and the iPhone.
Photoshop and camera phones allow amateurs and hobbyists to do incredible things, but they aren't replacing photographers or cameramen. They're tools that allow skilled professionals to do even more.
I suspect ai code generators will be similar, and will just be another tool that skilled programmers use when working.
99
u/UsernameAlrdyTkn 11d ago edited 11d ago
A calculator does calculations, not the totality of "mathematics". Artificial intelligence, however... I see no reason in principle it couldn't do the totality of "mathematics".
[Late edit; OP was specific to machine learning but I responded about general AI]
55
u/mysteriouspenguin 11d ago
Actual mathematician here (sorta, a master's student): ChatGPT is totally ass when it comes to topics more advanced than basic undergrad. It's really good at replicating textbook proofs, but fails in basic ways in anything more advanced.
In programmer terms, that's like nailing fizzbuzz and hello world, but if you ask it to write an OS it has a syntax error in the first dozen lines.
It seems that the training data for its mathematical knowledge is just not there. Either a) it's too niche for the developers to grab or b) it's too niche to have enough data to properly train the algorithm. If it's the former, later versions and new AIs can fix the problem, but I suspect it's the latter.
39
u/Harmonious- 11d ago
I'm not sure that anyone is scared of gpt 3.5 or 4 specifically. If they are, then they are a bit dumb.
What the majority of people are scared about is the future.
The improvements between 3 and 4 are massive. It is almost impossible to tell that you aren't talking to a human. And the AI "can" write clean code.
So imagine 5? 6? 7? Yeah.... we are fucked. Not right now but 3-5 years from now you will be able to type "create an app that looks like x using y framework" and it will just be able to do that. Or "solve this complex math problem with a proof. Tell me if it is impossible to solve or if you aren't smart enough make an estimate of when you will be"
That is what people are scared of. Any sort of critical thinking/mental labor will be replaceable.
14
u/CabinetAncient1378 11d ago
I think the scariest thing of all is when a company like Boston Dynamics starts putting something like this onto its robots. At that point everyone is out of a job except for a select few who got into the right business at the right time.
16
u/Harmonious- 11d ago
I'm not super scared of gpt-esque ai inside bots.
I am a bit fearful of bots becoming better at micro movements. There are only a few (less than 100) things a plumber, welder, or carpenter might ever do.
1 plumber could supervise 20+ bots and click a button saying "that's the problem, fix it then move to the next job"
It would take a lot of training data, but it would be possible.
14
u/mysteriouspenguin 11d ago edited 11d ago
Here's an example right here (sorry for mobile). I asked it to prove a theorem from any elementary calculus or analysis class, that the continuous image of a compact set is compact. You can find the proof in any analysis textbook.
I then asked it to prove it in the two canonical ways, and it worked pretty well. The proof was valid except for one minor quibble (the function being surjective). When I asked about it, it fell apart instantly. That third answer is total nonsense, completely incorrect.
Not only does the theorem hold even if the function is not surjective (by the theorem the bot just gave!), but the counterexample doesn't even make sense. It claims that the exact same set (the interval $[0,1]$) is both compact when it suits it and isn't when it doesn't. (If you don't know: the image of that set under $x^2$ is itself.)
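For anyone following along, the statement in question needs no surjectivity at all:

$$f : X \to Y \text{ continuous and } K \subseteq X \text{ compact} \;\implies\; f(K) \text{ is compact.}$$

And the "counterexample" fails on its own terms: the image of $[0,1]$ under $x \mapsto x^2$ is $[0,1]$ again, which is compact.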
Here's what happens when I try to have it prove something not straight out of the textbook: It fails miserably. I barely understand what it's doing, it's factually incorrect, and it's far, far from the simple proof that's actually required to prove it. Simple enough that this was an actual question given to first year students in a course that I am TA-ing for.
So career mathematicians? Nothing to fear from AI right now. My point stands: it's really good at replicating robust training data (textbook proofs) and shit at any extrapolation or logical inference.
25
u/lestruc 11d ago edited 11d ago
Worth noting too that this bot is apparently programmed to act like “customer service”. It will make mistakes, sure, but it will also apologize and generate something absolutely false if you claim it made a mistake when it did not.
“The human is always right” seems to be inherent
17
u/Harmonious- 11d ago
Me: "You shouldn't be scared of 3.5 or 4"
Person:"uses 3.5 or 4 to prove a point on why I'm wrong"
Try using an earlier form of gpt. It doesn't even compare. See the difference and the improvements as well as the potential for future versions.
I said before that 3.5 and 4 are not going to replace almost anyone but future versions like 5, 6, 7, etc likely will.
GPT-2 was released in 2019. It did not have confidence in its words, almost never created a "decent" sentence, and would go on for a very long time repeating itself.
GPT-3.5 almost never repeats itself, has almost too much confidence, and almost always makes a perfect sentence. The only real issue it has is that 10% of the time it gives false answers.
In the future, 5 might give fewer false answers, it might give shorter ones, it might sound more human, it might be able to do more tasks like access current data, or possibly generate images/videos in relation to a story (like a picture book).
I am worried of the potential.
12
u/Used-Candidate9921 11d ago
Now ChatGPT is an undergrad, but how long will it take to gain the knowledge of a master's degree? A doctorate? With the current funding and its learning speed I won't be surprised if it can do it in 10 years. Most of us will still need a salary by then. What do we do?
23
u/DavidBrooker 11d ago edited 11d ago
Being able to reproduce existing knowledge is trivial, and it'll be no time - likely only months, not years - before it can reproduce masters or doctoral level work. But "reproduce" is the operative word. The way language models work is inherently limited in their ability to produce novel ideas ex nihilo, which is the primary thing of value graduate students (and professional mathematicians) produce. Mathematicians produce new knowledge, which often (but not always) cannot be simply synthesized from existing literature.
There are AI systems that have produced novel scientific results: synthesized literature to form a hypothesis, developed an experiment to test that hypothesis, conducted the experiment, analyzed the results and gave the hypothesis test a statistical confidence. In fact, AI systems did this more than a decade ago. But these systems were not language models.
Outside of science, in mathematics to date, I do not know of an AI system that has produced a novel proof. It is very easy to have a computer verify an existing proof, or for a language model to describe or repeat a proof that can be found in literature. But actually generating a new proof is a different class of problem, which we believe to be NP-hard. In some sense, a language model producing a novel proof would in itself be a novel proof, as it would imply a counter example with respect to our established assumptions about computational class.
7
u/SchwiftedMetal 11d ago
I share your view. Novelty will be hard with the current AI framework. It could iterate through all possible analysis tools, but that's inefficient, and I can't see it understanding whether its product is complete when creating new math. Even if we're talking about just reproducing abstract ideas, I don't see an obvious CS mechanism that can turn that abstraction into an income-generating framework.
9
u/UsernameAlrdyTkn 11d ago
In practice, creating an AI mathematician may never happen (humans may not survive the time required). I was arguing that in principle there is nothing special about maths that would prevent artificial intelligence systems from being as functionally competent as a human's wetware.
19
u/violet_zamboni 11d ago
I recommend you watch some mathematics lectures and reconsider!
26
52
9
u/ignore_this_comment 11d ago
A "computer" used to be a job that someone had. It was often performed by women. There would be whole rooms of women calculating numbers for the engineers.
6
u/CREATEREMOTETHREADEx 11d ago edited 11d ago
Ya, all over YouTube there are these beta guys uploading videos like "I am going to create the next million dolla game using ChatGPT 4" and you don't need coding skillz. I highly doubt it; at best that thing can generate answers using existing frameworks like Unreal Engine and Godot. To test it, I asked it to "generate a C++ program to dump the assembly code of a Windows executable", and it answered with code that uses the Capstone library, apparently a bigshot open-source lib that can disassemble apps. So ya, there might actually be a 50/50 chance they are able to make million dolla games, since that thing could hypothetically generate Unreal or Godot framework code for you.
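For the curious, here's roughly what that Capstone answer boils down to (a sketch using the Python bindings rather than C++; a real tool would first pull the .text section out of the PE file instead of hard-coding bytes):

```python
from capstone import Cs, CS_ARCH_X86, CS_MODE_64

code = b"\x55\x48\x89\xe5\x89\x7d\xfc\x8b\x45\xfc\x5d\xc3"  # placeholder x86-64 bytes
md = Cs(CS_ARCH_X86, CS_MODE_64)

# Disassemble the bytes, pretending they start at address 0x1000.
for insn in md.disasm(code, 0x1000):
    print(f"0x{insn.address:x}:\t{insn.mnemonic}\t{insn.op_str}")
```

Which is to say: the hard part was already done by the library authors, ChatGPT just knew to reach for it.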
6
u/therealBlackbonsai 11d ago
When was the last time you met a mathematician?
Chances are high the answer is never. So good luck.
12
u/AzureArmageddon 11d ago
Mathematicians are augmented by the calculators/computers.
Computers (the job) got replaced by computers/calculators.
6
5
u/InformationMountain4 11d ago
Key word is “survived.” But yeah, a lot of white collar jobs are going to get wiped out, not just programming. As someone who got laid off from a call center job months after the chatbot was introduced, the best strategy is to learn a new skill until eventually AI replaces that job too, and the cycle will keep repeating itself until eventually all jobs are replaced by AI.
5
u/redcoatwright 11d ago
I mean software engineers won't die out but it will be similar to the industrial revolution where suddenly one engineer can do the work of 5 or 10
And there's really only so much productivity can increase, there's a fundamental limit to pumping features/products out to the market.
I see this being a transition point for the tech industry and one that would always have happened. Truthfully, my hope is that we refocus industry on exploration, engineering and scientific pursuit. Or maybe it'll refocus on human augmentation, or medicine.
4
u/ModStrangler3 11d ago
The last 30 years of living in America has pretty much conditioned me to believe this will only manifest in the worst possible ways. You’re never gonna be able to call any service provider and talk to a real human again, and much like how the majority of manufacturing jobs got shipped to India and China in the 70s and 80s, programming jobs which are one of the last frontiers of wage labor you can actually buy a house and raise a family on will be crippled and millions more people will be forced to become permanent renting class gig economy slaves with precarious employment and no health insurance
5
4
u/foggy-sunrise 11d ago
Lol, you mean people who began their career by learning JavaScript are going to have to learn about computer science?
Doomed, I tells ya! Doomed!
17
u/Burgov 11d ago
It's just going to replace StackOverflow, not developers. The only difference is it's not going to tell you your question is stupid
3
5
u/ProgramTheWorld 11d ago
Computers were literally replaced by digital computers. They used to be actual jobs.
3
u/Dangerous_With_Rocks 11d ago
On a more serious note, programmers shouldn't be the only ones worried about AI; everyone should be. We might not lose our jobs or get an IRL Terminator, unfortunately, but as with everything there are many ways to misuse it.
4
u/turtleship_2006 11d ago
Literal calculators (people with that job) who got replaced by computers:
5
u/Vassillisa_W 11d ago
I believe all these advanced chatbots will serve as nothing more than tools for programmers. I consider ChatGPT nothing more than Stack Overflow or GitHub, but more personalized in terms of actual coding.
8
u/IWAHalot 11d ago
It’s accountants that lost out to the calculator, but then convoluted tax laws intended to be exploited saved them.
5
3
3.3k
u/NCGThompson 11d ago edited 11d ago
Weren’t computers a separate occupation? I’d imagine a mathematician would’ve had a grad student actually crunch the numbers before calculators were invented.