Tuesday, February 24, 2026

Is Gen AI Building the Y2.5K Crisis?

This incident happened a few years ago while I was consulting for a company. A friend from the reporting team called me for help: a particular reporting table had not loaded that day, causing errors in the reports generated from it. The team had traced the problem to a mainframe-based application that produced a file feeding this reporting table, and that file had not arrived that day.

Identifying the source of the issue was simple; the real problem started there. No one on the team knew how to use a mainframe, or even how to log into one. Since I have some experience working on Mainframes, my friend asked me to help resolve the issue. I requested access to the Mainframe and investigated. The cause was simple: an incoming file had not arrived on time, so the JCL job had not been triggered. The fix was straightforward: ask the support team to rerun that Mainframe job.

A Symptom of a Larger Problem

It was a small fix, but the incident highlighted something bigger. On one hand, it proved again why Mainframes remain so dominant, even in this era of Cloud and AI. Programs written 40–50 years ago still run—and they just run. Yes, the screens are black and green, but they never failed to do what they were designed to do.

I might be wrong, but I believe the Y2K (Year 2000) crisis dealt the first significant blow to the Mainframe's image. The crisis wasn't a failure of the Mainframe itself, but a byproduct of a different era, one in which programmers dropped the '19' from the year and stored dates in the 'MM-DD-YY' format simply to save on the cost of memory. Yet Mainframes took all the bad rap. This was further compounded by the Mainframe being overshadowed by newer-looking, colorful, and "cooler" technologies that arrived with cooler names, too. We are now at a point where many major corporations on the planet use the mainframe in some form, yet there is hardly anyone left who understands the code.
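To make the two-digit shortcut concrete, here is a minimal illustrative sketch. It is in Python rather than the COBOL of the era, and the function name is hypothetical, but it shows the same trap: storing only 'YY' saves a couple of bytes per date, at the price of making the century ambiguous once the calendar rolls past 1999.

```python
def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Naive age calculation on two-digit years, as many old programs did."""
    return current_yy - birth_yy

# Works fine throughout the 20th century:
print(age_in_years(65, 99))   # born 1965, current year 1999 -> 34

# Breaks at the rollover: the year 2000 is stored as '00'.
print(age_in_years(65, 0))    # born 1965, current year 2000 -> -65, not 35
```

Any logic built on such arithmetic (interest calculations, expiry checks, sort orders) silently went wrong at the rollover, which is exactly what armies of programmers were hired to patch in the late 1990s.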

  1. The developers who built the code have now retired, and there is no one ready to take over from them to continue working on Mainframes.
  2. Working on the Mainframe is considered an “uncool” job. Everyone wants to work on newer tools where they can drag and drop to build an ETL pipeline or build colorful screens with images and videos.
  3. Colleges have almost stopped teaching COBOL, JCL, and other Mainframe tools. I have not seen any coaching institutes teaching Mainframe skills either.
  4. Mainframe-related knowledge on internet forums also seems to be scarce compared to newer programming languages and tools.
  5. Libraries carry tons of books related to Cloud and AI, but hardly any related to the Mainframe.

The AI Disconnect

Bottom line: companies are running on mainframe code, yet hardly anyone knows anything about the underlying programming language, the technology stack, or the business logic that is running behind the scenes.

CEOs and CTOs proudly talk about how AI is developing not just code snippets but entire applications in minutes or hours, work that once took weeks or months with a large development team. Companies boast of how productivity has improved with AI doing all the coding from just a few prompts, saving them tons of money. AI platforms can now detect and heal code bugs without any human assistance. Once you lay out a basic idea, AI-based platforms can build and implement a solution quickly with zero human intervention.

All this sounds fantastic, and it looks like the future is already here. But what does it really mean? I think it means we are now adding loads and loads of AI-generated code, all running with little to no human intervention. AI is getting better every day, and this process will only accelerate.

History Repeating Itself

Maybe 20–30 years down the line, we will see the "Mainframe scenario" repeat itself, where manual programming is considered uncool. Colleges will likely have stopped teaching programming as a subject, and libraries will be filled only with books related to prompt engineering, RAG, and other AI tool stacks.

My Prediction: The Y2.5K Crisis of 2050

Imagine you are in 2050 and an AI-generated system has stopped working. The 'Prompt Engineers' of that era will stare at a strange set of lines (the actual programming code), trying to decipher what they really mean. We will have traded the 'black and green' screens of the 1960s for the 'black boxes' of 2050.

Perhaps this will be known as the 'Y2.5K' crisis: a world relying on a foundation of code that works perfectly until it doesn’t, only to realize we have let the skills required to look under the hood go extinct.


~Narendra V Joshi
