Banks remain with COBOL because it's unsexy and stable. And then they say... let's just YOLO some vibe code into the next release sight unseen! Logic checks out.
If you have an app where the user must give their address when subscribing, what are the tests that can be done?
User doesn't exist, invalid character in a field, user exists, wrong street name for the zip, wrong state for zip, wrong house number in the street, age below threshold, age above threshold,...
Each of these examples must be tested manually at least once to prove that the logic is correct, and the tester must keep a report of it.
But for each of these basic tests, the data must be in a specific state (especially for the "user already exists" case), so between tests you usually have a data preparation phase.
When you have a lot of these tests because they span logic accumulated over decades, it takes time, especially when dealing with investments or insurance.
And usually for these tests, you hire people selected for correctness, not speed.
Now imagine what happens when you're at step 89 of your test and it fails.
The dev fixes the code, fixes the automated tests... and the tester restarts from step 1.
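The scenarios above can be sketched as a small parametrized suite. This is a minimal illustration only: `validate_address` is made up for this comment, not any real system's API, and real street/zip cross-checks would hit a postal database.

```python
from decimal import Decimal  # not needed here; kept out intentionally


# Hypothetical validator, purely for illustration.
def validate_address(user_exists, zip_code, street, state, age):
    errors = []
    if user_exists:
        errors.append("user already exists")
    # Toy rule: zip 10001 is New York. A real check would
    # query a postal database for street/state/house number.
    if zip_code == "10001" and state != "NY":
        errors.append("wrong state for zip")
    if not (18 <= age <= 120):
        errors.append("age out of range")
    return errors


# Each case needs its own data setup, mirroring the manual
# "data preparation phase" between steps.
cases = [
    ({"user_exists": True,  "zip_code": "10001", "street": "5th Ave", "state": "NY", "age": 30}, ["user already exists"]),
    ({"user_exists": False, "zip_code": "10001", "street": "5th Ave", "state": "CA", "age": 30}, ["wrong state for zip"]),
    ({"user_exists": False, "zip_code": "10001", "street": "5th Ave", "state": "NY", "age": 12}, ["age out of range"]),
    ({"user_exists": False, "zip_code": "10001", "street": "5th Ave", "state": "NY", "age": 30}, []),
]

for inputs, expected in cases:
    assert validate_address(**inputs) == expected
```

The point of the original comment survives the sketch: automation covers the mechanical assertions, but the regulatory requirement that a human execute and sign off each step is what makes step 89 failing so expensive.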
You couldn't score any higher on the risk factors. The training corpus for COBOL can't be all that large so the models won't understand it that well. Humans are largely out of the loop and the tooling guardrails are insufficient. Causing a billion dollar disaster with the help of a "shotgun surgeon"? Priceless.
Banks are slowly moving away from their old COBOL systems. It's about cost as much as it's about catching up with the neo-bank competition.
The main thing that makes this difficult is that in most cases the new system is supposed to be more capable. Transactional batch processing systems are replaced with event-based distributed systems. Much more difficult to get right.
They stick with COBOL because it runs well on the mainframe. The mainframe and sysplex architecture gives them an absurd level of stability and virtualization that I don't think the rest of the market has nearly caught up to yet. Plus having a powerful and rugged centralized controller for all of this is very useful in the banking business model.
This is the reason. IBM Mainframe business grew 60%. The modern mainframe is the best state of the art platform for computing, in both reliability and efficiency.
Also, IBM mainframes are wonderfully isolated from the physical hardware. They could change processors in the next model, and users would only notice a small delay as each binary was recompiled on the fly the first time it was used.
They surely could extract more performance from the hardware by shedding layers, but prioritized stability and compatibility.
I don't think learning how to write COBOL was ever a problem. Knowing that spaghetti codebase and how small changes in one place cause calamity all over the place is. Those 4 people's job is to avoid outages, not to write tons of code, or fix tons of bugs.
Honestly, there are companies that have lost the source code for some of their applications. Or they depend on components from vendors that have long since ceased to exist. I remember a lot of consternation around being able to compile and link against binary components that have just been around forever and could never be recompiled themselves. More people "learning COBOL" was never going to be a solution to that ball and chain. And yeah, LLMs are good in the reverse engineering space too, so maybe we'll finally see movement on that in the next decade.
You're probably right, no disagreement there. But in the context of my previous comment, about the people who write COBOL today: I don't think much of their work is reverse engineering native code back to COBOL because the source is lost. But you make a really good point: if AI can assist with lost code recovery, perhaps it will assist them in migrating away from it, or in getting rid of the workarounds and complexity implemented to match a previously opaque binary's behavior.
I would say, more significantly, 4 million people can read it. The changes required for any given quarter are probably minuscule, but the tricky part is getting up to speed on all those legacy patterns and architectural decisions.
A model being able to ingest the whole codebase (maybe even its VCS history!) and take you through it is almost certainly the most valuable part of all.
Not to mention the inevitable "now one-shot port that bad boy to rust" discussion.
In my experience, learning COBOL takes you a week at most, learning the "COBOLIC" (ha ha) way of your particular source base will take you a couple of months, but mastering it all including architecture will take you a year, half a year if you're really good.
One year from zero to senior doesn't sound that hard, does it? Try that with a Java codebase.
This seems to make the classic mistake of conflating two different things: programming, and business logic/knowledge (and I'd throw complex-systems knowledge in there too).
Often, understanding the code or modifying it is the easy part! I'm sure a decent amount of people on this website could master COBOL sufficiently to go through these systems to make changes to the code.
However, if my own career has taught me anything, knowing why those things are there, how it all fits together in the much broader (and vast) system, and the historical context behind all of that is the knowledge being lost, not the ability to literally write or understand COBOL.
I'm pretty sure they're talking about converting COBOL to Python or Go and that is the benefit. That doesn't require knowing the architecture and system design. I'm not familiar with COBOL and COBOL systems so I could be wrong... but Python programmers who can then study the system are easy to find.
This is fintech - I've not worked in banking specifically, but fintech (or fintech adjacent) most of my career, and from my POV these things can get insanely complicated in very unintuitive ways because the financial world is messy and complicated.
I've never worked on COBOL systems specifically, but just going from my experience working on fintech problems in dense legacy stacks of various languages (java is common), that are extremely hard to understand at times, the language itself is rarely if ever the problem.
"Just need to convert it to Go or Python" is kind of getting at the fallacy I am trying to describe. The language isn't the issue (IME). I do have my gripes about certain java frameworks, personally, but the system doesn't get any easier to understand from my POV as to simply rewrite it in another language.
Even if it were this simple in the case of COBOL: these are often extremely critical systems that cannot afford to fail or be wrong very often, or at all, and they have complex mechanisms around them to make it so. Even trying to migrate to a new system/language would inevitably require understanding the system and architecture.
That's true. COBOL is pretty easy to read so language is not the problem. The project then becomes a rewrite and that's almost never a good idea. Perhaps in the future when AI can convert the software and verify the logic.
> knowing why those things are there, how it all fits together in the much broader (and vast) system, and the historical context behind all of that, is what knowledge is being lost
How big is your context window? How big is Claude's context window? Which one is likely to get bigger?
Sure, yet admin vs engineering in terms of jobs... one is now on the decline, either slowly or quickly. It now requires 1/4 to 1/2 of the engineers once employed in the profession. I don't see how that's a good thing for any economy.
I think I've seen 2 initiatives to move off of AS/400 to something else in my lifetime, and neither one completed. One was at a bank, the other at an insurance company. Not to mention that a typical COBOL programmer is more interested in retiring than in learning to vibe code. At this point I think the software stocks have reached peak panic and hysteria. There is just no rhyme or reason for sharp declines like this.
This is the first thing that occurred to me. The people above suggesting a COBOL to Python or Go update confuse the heck out of me. Why not just convert to vanilla JavaScript at that point? Bizarre
the COBOL migration part is probably the least of IBM's moat tbh. what's sticky is that the actual risk of a migration is 5% technical and 95% organizational -- regulatory sign-offs, audit trails, test coverage for systems that haven't had tests written in 40 years. AI can generate the Rust/Java equivalent but it can't own the migration project. that's still IBM consulting's territory for a while.
LOL, anyone who thinks an LLM is smart enough to untangle 50+ years of cobol spaghetti has obviously never worked at a bank or insurance or railroad or.....
Off the shelf, sure. On the other hand, I wonder if domains that require the strictest rigor may have retained a high degree of good change documentation and tests that could be included in training.
What are you implying 5 years of experience as a Product technical delivery architect influencer and some basic web development skills don't transfer to writing critical software?
So curiously I wonder if it's not that Anthropic/Claude can do this magically. More like can individuals at IBM who are heavy hitters just leave and create their own company and effectively provide these services because AI gives them the productivity to do so?
Relevant to Colombia's payment infra fragility: Bancolombia outage blocks transfers to Nequi/other banks since Feb 22 (IBM machine failed in maintenance).
~70% of national txns (100M+ /6mo, 600K interbank/mo). Daily USD flows: $50M+ A la Mano + Nequi peaks $100M+.
Single-vendor risk in billion-scale retail payments? Details: https://www.bloomberglinea.com/latinoamerica/colombia/caidas...
Just at the time when the cohort of COBOL programmers who wrote the business logic and compliance s/w for almost all finance/fintech institutions which predate the modern era of software, and exist as inner core "DO NOT TOUCH IT" code, are dying off.
Which to me represents both an opportunity and a threat. The opportunity is to take the maintenance of the "DO NOT TOUCH IT" code out of the greybeards' hands before the lid goes on the coffin.
The threat is that nobody is going to be asking "we did it because we could do it, but nobody asked if we should do it" about almost any change coming down the line.
> The "inability to act" which, as Forrester points out, "provided the incentive" to augment or replace the low-internal-speed human organizations with computers, might in some other historical situation have been an incentive for modifying the task to be accomplished, perhaps doing away with it altogether, or for restructuring the human organizations whose inherent limitations were, after all, seen as the root of the trouble. [...]
> Yes, the computer did arrive "just in time." But in time for what? In time to save--and save very nearly intact, indeed, to entrench and stabilize--social and political structures that might have been either radically renovated or allowed to totter under the demands that were sure to be made on them.
- Joseph Weizenbaum, Computer Power and Human Reason (1976) pp 29-30
Haven't people done this before? Back in the 80's one of our "team" members (I use the term loosely) would reverse engineer the day's code every evening, rendering it unreadable to everyone else who created it.
Of course, nobody these days asks "why?"
If it ain't broke ...
Banks have tons of money (OPM!) and IMO could rewrite legacy code, but why?
Cobol is an extremely verbose programming language, and it was used in an era when the practice of programming was much less developed. Calls into libraries were often not used, and instead any re-used code was copied, essentially inlined by hand. (With all the obvious problems that caused.)
The combination of automating complex processes, requiring embarrassing amounts of code to do simple things, re-use by copy, and the fact that it was dominant in its field for such a long time (4 decades!) means the amount of COBOL code that exists out there is just staggering.
I know of one smaller COBOL company. They alone had 10 to 100 million lines, after a decade of trying to move to Java. Source files containing one routine were easily a few KLOC. The number in the article is probably too low.
The main problem with these code bases is that they predate modern coding practices, so the sheer size, incomprehensibility, and untestability will crush you. You can easily spend your whole career reading the source code and reach pension age before you finish. Also, the organizational difference between two code bases is much bigger than with modern code bases, as every company invented its own practices, and people rarely switched companies, so they didn't know what others were doing.
I’ve always thought the whole point of staying in COBOL is not to make unnecessary changes, and that many required changes are critical and need experts who know how to handle them exactly.
Which language would you convert the COBOL to that has a compiler that compiles to the Z-series' fixed and floating decimal type machine instructions for financial calculations?
The point is that if you convert away from COBOL to a more modern language, you can also move away from Z-series hardware to commodity x86 and ARM servers. That's why this announcement affected IBM's share price.
IEEE 754-2008 defines decimal floating point arithmetic that is compatible with COBOL and is usually implemented using the Intel Decimal Floating Point Math Library on commodity hardware.
For a typical core banking ledger application, the performance cost of a software implementation of DFP (vs. having DFP hardware instructions) is pretty low, and greatly outweighed by the benefits of being able to use commodity hardware and more maintainable languages.
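The software-DFP point can be seen in a few lines. This is a sketch using Python's standard `decimal` module standing in for a software decimal implementation (the Intel library mentioned above plays the analogous role for C/C++); the rate and balance figures are invented for illustration:

```python
from decimal import Decimal

# Binary floating point accumulates representation error:
assert 0.1 + 0.2 != 0.3

# A software decimal type keeps base-10 ledger arithmetic exact:
assert Decimal("0.10") + Decimal("0.20") == Decimal("0.30")

# COBOL-style fixed scale: round to the penny explicitly,
# rather than letting binary rounding decide.
rate = Decimal("0.0425")
balance = Decimal("1000.00")
interest = (balance * rate).quantize(Decimal("0.01"))
assert interest == Decimal("42.50")
```

The per-operation cost of doing this in software is real but small relative to the I/O and database work that dominates a ledger workload, which is the trade-off the comment above is describing.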
Are there ARM or Intel servers capable of the reliability and availability of the Z-Series in Parallel Sysplex operation where processing can continue uninterrupted even if one of a pair of data centers becomes unavailable?
If a change of platform is the real objective, why not compile the COBOL for the ARM or Intel server?
I have a close relative at one of the biggest COBOL shops in the US, and something tells me we're about to find out how we take the stability of our payments infrastructure for granted.
Their company has no problem grinding older developers into retirement for the sake of padding quarterly numbers; work-life balance is hell there. They refuse to compete with the modern developer market: senior-level pay tops out around $125k. Despite what you may have read about experienced COBOL developer pay, know that is not the average experience. The talent pool was not replenished because they did not want to pay, and overseas contracting firms also stopped training COBOL developers because their contractors could earn more building modern infra on AWS, so now they're between a rock and a hard place.
I have little doubt that we are going to see a massive payments infra failure as a result of this. Not because the AI is inherently bad, but because the promises of the tech combined with terrible management practices will create the perfect conditions for a catastrophe.
> how we take the stability of our payments infrastructure for granted.
I was about to comment we should all closely watch those bank statements and balances...
While I'm OK with the use of AI to understand the COBOL codebase, I understand it's a single prompt away from transformation and production. Just one executive approval away ha.
This makes no sense. If IBM supposedly gets a significant amount of revenue from COBOL (a dubious proposition) then wouldn't this actually help them as COBOL programmers are getting rarer and rarer?
Indian service companies can train some of their intake with COBOL, some obscure printer programming language, Clojure etc and give them anxiety about getting into a career dead end.
IBM makes a lot of money selling mainframes to companies with COBOL codebases dating back to the 70s or earlier. The main reason those companies still buy mainframes to run those COBOL programs is that it's too risky to try to port them to more standardized, cheaper platforms. THEORETICALLY, a COBOL-proficient Claude could make it feasible to port these old codebases to something more modern that runs on bog-standard x86 servers, and it's unlikely customers would buy those servers from IBM.
IF, and it's a big if, Claude makes it possible to migrate off of COBOL, this would be a massive blow to IBM.
Nothing is so difficult about COBOL. It’s just old-fashioned and everything surrounding it is legacy. Most people seem to think it has a negative value when placed on a resume. Maybe that’s true, I don’t know.
I have trouble getting people to even look at C code these days. I don’t understand why devs are so afraid of old things.
It's quite the contrary, the less interpretative the language, the better. And no, LLMs were not trained on English to begin with. And they don't perform best in English.
Please expand on the idea that LLMs were not trained on English to begin with. I'm not sure what you mean by this, as clearly many LLMs are trained on data that contains a lot of English. For instance, GPT-1 seems to have been trained on a purely English corpus.
That's not how it works. Being trained on a ton of human text doesn't mean you can complete the next token for a program that needs to be logically coherent.
Imagine all your data is Reddit threads and now I ask you what follows “goto”, how would Reddit help you?
The opposite is likely true: there isn't a ton of publicly available COBOL code compared to, e.g., React, so an LLM will degrade.
Since no one's mentioned it: this was tried once before, by a company called Micro Focus, which sold software to modernize "legacy" COBOL-based mainframe applications so they could run on x86. While somewhat successful, they failed to disrupt IBM's mainframe business, because these legacy systems traditionally relied as much on the idiosyncrasies of the mainframe hardware itself as on the COBOL language.
If all Anthropic is offering is some kind of smart language translation, I'm not sure they will have many takers. And whatever you think about mainframes, it is quite nice that our credit card networks work as efficiently and reliably as they do.
IBM plunges? Anthropic? COBOL? Is Deirdre Bosa somehow involved in this?
Consider that back in the day, we once had a situation where the shared runtime library supporting a programming language on Megabank's PROD cluster was updated from something like 4.1.1 to 4.1.2 after months on its TEST cluster, plus lots of formal meetings, planning, pages of signatures, contingencies, extra ops people onsite, etc. --and Megabank still proceeded to lose more than ten grand per minute after pressing [Return] on that update. At least a dozen people lost their jobs at the following Friday morning meeting. It turned out that a subtle change in the vendor's implementation of a floating point function was not caught because testing didn't consider enough digits of precision. Mind you NO CODE WAS CHANGED, only a dynamically linked runtime library SUPPLIED BY THE COMPUTER MANUFACTURER --not a third party. Point being (no pun intended), when you go monkeying around with stable production systems that are doing On Line Transaction Processing (OLTP), "bad things" happen, treasure and careers are in jeopardy --and COBOL Life is all about what goes on inside of those systems.
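The failure mode in that story, a change below the tested precision, is easy to reproduce. The perturbed function below is a stand-in invented for illustration, not the actual vendor library:

```python
import math

# "Vendor 4.1.1": the original runtime's function.
def fx_old(x):
    return math.sqrt(x)

# "Vendor 4.1.2": same function, subtly different
# implementation, simulated here with a tiny perturbation.
def fx_new(x):
    return math.sqrt(x) * (1 + 1e-12)

x = 2.0
# A test that only compares 6 digits of precision passes...
assert round(fx_old(x), 6) == round(fx_new(x), 6)
# ...yet at full precision the results differ, and in OLTP a
# discrepancy like this compounds across millions of transactions.
assert fx_old(x) != fx_new(x)
```

Which is why comparing results to the full representable precision (or to an exact decimal reference) matters when revalidating a runtime swap, even when "no code was changed."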
Anthropic is great at guerrilla marketing with all their PuRe BS and AI hype. Bottom line for the young peeps here:
1. There is really NO WAY IN HELL that any CIO at a credible Financial Institution will ever authorize a hallucinating chatbot to convert their core logic from COBOL to Python and Go.
2. The only way such institutions "escape" COBOL is through M&A (data --not code-- exported to the new entity's system).
3. As far as COBOL devs tapping out, that happened a very long time ago. (How many of you knew that COBOL was co-developed by the late mathematician US Navy Rear Admiral "Amazing" Grace Hopper, who also invented the first compiler?) COBOL is a simple computer language from a simpler time that enabled non-tech professionals from adjacent fields (like Accounting) to become application programmers (or "implementors" in the parlance of that era). Making minor changes to COBOL code such as PIC layouts and COMPUTE statements is no big deal requiring you to wake up a 95-year-old. The problem is strictly production change control (what version of the compiler and runtimes are you using to rebuild the production binary, etc.), and that isn't even specific to COBOL, except that it is better understood with more modern languages like C (but still remains an alien concept to almost an entire generation that entered the trade after 1997, once the dot-com crash forced most senior people outside a handful of tech companies to crack open their 401Ks and move on to other fields).
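For readers who've never seen one: a PIC clause like `PIC 9(5)V99` declares a numeric field of five digits, an implied (not stored) decimal point, and two decimal places. A rough Python sketch of what decoding such a field means, simplified, since real zoned/packed decimal encodings vary by platform:

```python
from decimal import Decimal

def read_pic_9_5_v99(raw: str) -> Decimal:
    """Decode a simplified PIC 9(5)V99 field: 7 stored digits,
    with an implied decimal point before the last two."""
    assert len(raw) == 7 and raw.isdigit()
    return Decimal(raw) / 100

# "0012345" decodes to 123.45; values like this are what
# COMPUTE statements then do fixed-point arithmetic on.
amount = read_pic_9_5_v99("0012345")
assert amount == Decimal("123.45")
```

That is the sense in which such edits are mechanically trivial: the hard part, as the comment says, is the change-control machinery around rebuilding and redeploying the production binary.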
----
Ken Lay died an innocent man --he had a heart attack before his sentencing. He was brought down by whistleblowers. Hang tough peeps. Fight. Back. Against. AI. Bullshitters. https://youtu.be/qJiALpiqpk8
"After all, if Dario Amodei had bought puts on IBM, and the dozens of companies that have plunged more than double digits in recent weeks, he would have made billions, certainly enough to fund his company for months if not years. "
Anthropic put out another blog post about modernizing/migrating away from COBOL several months ago IIRC, it is surprising that this was not priced in already
Yes, IBM licenses for the mainframe are expensive, but it never fails.
I worked on a migration project where only the tests would take a few thousand days.
Yes, they could be automated, but the regulations in place required that a human sign off that all the tests were executed at least once by a human.
Did running the test suite take 10 years? Like literally what exactly do you mean?
Now, 4 million people can write it.
I think this is a game changer for migrating secondary services like tools or batch jobs.
this has been going on all Feb.
And then win the contracts to do this and have sufficient bankroll that they can be successfully sued and recover damages if they screw up?
No.
Someone like accenture might eat their lunch though
Oracle is trying (and mostly failing) at frontier model training
Click it.
That number sounds enormous. If the same code runs on 10,000 ATMs, are they counting that 10,000 times?
There's hardly any room left for an LLM.
I’m porting my whole codebase to cobol!
I write SAAS suites for archeological sites.
Feb 13: IBM tripling entry-level jobs after finding the limits of AI adoption
https://news.ycombinator.com/item?id=47009327
Jan 28: IBM Mainframe Business Jumps 67%
https://news.ycombinator.com/item?id=46802376
Times are a changin'