Hacker News
COBOL underpins the entire financial system (wealthsimple.com)
195 points by chaosdesigner 7 days ago | 184 comments





Don't ever work on a COBOL modernization project. It is a career killer with little to no upside. COBOL systems and the people directly using them are Chernobyl.

Both government regulators and operations personnel have made careers out of trusting black boxes that no one fully understands anymore. Regulatory specifications are ambiguous and disorganized. Imagine working with someone who knows that you are trying to shed light on the technology that they were responsible for, yet do not fully understand. How cooperative do you think these people will be with your efforts? How do you think the managers responsible for this work survive the politics of the project's inevitable failure? Blame rolls downhill.

Modernization projects require strong leadership and collaboration across many teams. That's a huge problem, because such an environment very likely does not exist in the world today. No one understands how the algorithms work or even how they should work! Forget how challenging it is to work with COBOL, which someone could eventually figure out. Whatever you've written has to be tested, and you don't have a trusted source of business logic from which to verify how it ought to work. You only have the black box.

If you've been given this task, start interviewing at other companies. You were given a suicide mission by people who are well aware of that.

My experience with this was with regulatory margin trading systems supporting a multi-billion dollar market.


There’s a lot of that out there. I have avoided COBOL but I have seen stuff written in VB6/COM which was also in the same state. Eventually the one guy who knows anything about it leaves or dies. In the name of “progress” someone wraps it in something else and sticks a web API around it but everyone is afraid to touch that bit of VB code. Now a third party calls this and wonders why it takes 10 seconds for an HTTP API call to run and they can’t be issued in parallel or it throws a 500 error. Then the third party asks to send 20,000 of these a day and then finger pointing and stuff occurs at management level between both companies.

I have been on both sides of this and it sucks. I came up with Rug Driven Development as a title. All the risks and future problems are swept under the rug quickly while maintaining a happy smile and pretending everything is just fine and dandy.


> I came up with Rug Driven Development

I so need to steal this term. Not exactly for what you are describing, even if I come across this on a nearly weekly basis in so many different industries. Be it automotive, be it finance, be it whatever.

But - and this is where I will probably use it in the future - I come across it in agency work. You win a shiny new project proposal and need to quickly show something for your corporate "partners" to shine before their management. To make them look good for their yearly appraisals or whatnot.

You already know that you will never fix the underlying shit you built, because your contract will be a new pitch three years down the line and it isn't clear if you will win it again. So let the next people deal with the tech debt - but guess what: They will just do the same. So debt will pile on and on.

So "Rug Driven Development" will be my goto term going forward.


"Rug-Driven Development" is a combination Delegation of Responsibility, Explosives Engineering, and Musical Chairs. ^_^

Nailed it :)

> I have avoided COBOL but I have seen stuff written in VB6/COM which was also in the same state. Eventually the one guy who knows anything about it leaves or dies. In the name of “progress” someone wraps it in something else and sticks a web API around it but everyone is afraid to touch that bit of VB code.

I call such things "Ancient Wonders". As in artifacts from long ago that the company owns, but nobody knows how they work or how they were built, and nobody can build one today. There may, in fact, be only one left. Whoever did build them was privy to some Tribal Knowledge that has since been lost.

I worked at a shop whose Ancient Wonder was their custom templating engine with an embedded Python interpreter. The only guy who knew how to compile it had left years before I joined, and as a result everybody just copied the same Solaris .so to new development/test/prod servers as they came in.


> I call such things "Ancient Wonders"

Read this story, "Institutional Memory and Reverse Smuggling" [0] - once upon a time, a petrochemical factory was built. Decades later, the plant is still operating and being maintained. But at this point, nobody knows how the whole factory works, why it was built that way, which processes it runs or how it was constructed...

[0] https://lemming-articlestash.blogspot.com/2011/12/institutio...


Rug Driven Development causes the majority of technical debt that I see in most of my clients, and it happens at the small scale, too. Just a couple weeks ago, a junior engineer was faced with a key process that segfaulted about once a minute. No worries though, they reasoned; the process immediately respawned, and because it managed a stateless procedure, nothing was lost. Except the 1 out of ~30,000 times it didn't respawn. No worries though, they reasoned; they put in a request to the monitoring team to look for the absence of the process for longer than three minutes and auto-spawn it. Ta da! Problem solved!

They let this go on for about six years before I arrived and saw it. I had to point out that there is a decent chance this is caused by the process relying upon something in the OS kernel that isn't quite aligned with the kernel documentation but is good enough, and that the immediate respawn behavior might rely upon that in turn. If the kernel ever is "fixed", then this audit-compliance-related process suddenly stops and doesn't respawn, or worse, won't start at all, becoming top of mind for all the senior management. It is always cheaper to fix problems in the small, before they become problems no one can ignore.

The urge to sweep problems under the rug and move on is very powerful within our industry. Once you've done the same yourself and been bitten enough times, your scar tissue twitches every time you see the same pattern again. These days, I treat code problems like I treat cleaning up messes while cooking: I clean up as I go along. My scar tissue thanks me now. There is a delicate balance between addressing these problems in the small and bikeshedding, though.

BTW, I performed an strace that revealed a SIGKILL just pops in without anyone or any known process issuing it. The application developers suspect something in the OS, so we're now engaged with the application support team, the OS support team, and our own internal OS support team to track it down and beat it into submission.


If it's Linux there are various ways to find out who is sending that signal, killsnoop being one (part of bcc: https://github.com/iovisor/bcc), another using a trivial systemtap script (https://www.ibm.com/support/pages/systemtap-kill-who-killed-...).
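
For reference, here's roughly what those tools do under the hood. A minimal bcc-style sketch (assuming bcc is installed and your kernel exposes the syscalls:sys_enter_kill tracepoint; the real killsnoop is more robust than this):

  from bcc import BPF

  # Attach to the kill() syscall tracepoint and log every SIGKILL,
  # with the sender's PID and the target PID.
  prog = r"""
  TRACEPOINT_PROBE(syscalls, sys_enter_kill) {
      if (args->sig != 9)    // SIGKILL only
          return 0;
      u32 sender = bpf_get_current_pid_tgid() >> 32;
      bpf_trace_printk("SIGKILL from pid %d to pid %d\n", sender, args->pid);
      return 0;
  }
  """

  b = BPF(text=prog)
  print("Tracing SIGKILL senders... Ctrl-C to stop")
  b.trace_print()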

I <3 Linux for just this kind of community. Thank you, you're righteous!

> Rug Driven Development

I'll steal this :) Encountered a few of those during my career as well. As if, when you can't see a problem, it can't hurt you...


I once just straight up cleaned out my desk and turned in my notice upon discovering something similar while delving deeply into the innards of an old embedded hardware stack.

So you're saying it's not just technical debt it's "national technical debt" ?

£1000/day to work on a COBOL project ain't bad though

I doubt you're going to get that much if you haven't got a decade-plus of experience with COBOL systems.

Helps when you still have your dad's notes when he wrote the system ;)

It sounds like the majority of that rate is for knowing how to navigate an organization still using COBOL. A Fortune 50 using such a system as its backbone may be less cost-sensitive than a FANG with only a small number of engineers responsible for holding the whole thing up.

It depends on whether you are paid regardless of the outcomes beyond your control.

Assume standard contracting in the UK. It's always a daily rate, nothing to do with outcomes.

Rather than reading yet another article about how there's all that COBOL still out there powering critical systems — which should come as no surprise whatsoever to regular readers of Hacker News — why not read a book about the deeper reasons how and why old technologies persist, and the ethic and duty of maintaining them. Specifically:

David Edgerton, The Shock of the Old: Technology and Global History Since 1900 (2006).

Andrew L. Russell and Lee Vinsel, The Innovation Delusion: How Our Obsession with the New Has Disrupted the Work That Matters Most (2020).

You won’t agree with everything in those books, but they will make you think deeply about old technologies and the stories we tell ourselves about innovation.


Chicago FD dispatched fire engines by telegraph until the 1990s.

http://chicagoareafire.com/blog/2015/02/chicago-fd-history-t...


Somewhat related question you might be able to answer - what are good examples of really resilient code bases a junior developer can look at to learn?

Old code still in production just sounds antifragile to me and that's the sort of code I'd like to write


I’m not the best person to answer this - others should chime in too.

I think sadly you won’t be able to read the preeminent examples of really old antifragile code that’s still powering the world. That stuff is mostly proprietary - banks and governments and so on. Hence this article.

BUT - there are still plenty of mature open source codebases to read, especially operating systems, databases, and Unix utilities. Sure, they’re living projects that now look quite different from the original, but they’re descended from code a few decades old and written in the same language — C. There’s Emacs (35 yrs old), GCC (33 yrs), the Linux kernel (29 yrs), MySQL (25 yrs), and Postgres (a youngster, only 24).

And then there’s the programming languages. C itself is now 48 years old! Since it’s the language of the Linux Kernel and Postgres and so much else, I’d be willing to bet it’ll be around for another 48 years. (Yes, I love Rust, and I’d use it over C for any new project, but this is the Linux Kernel folks!)

But - since this is Hacker News - I should note that C is beaten by a long long way by Lisp, which is 62 years old. That’s older than COBOL - and people still love it. John McCarthy really had programming figured out.


Lisp is amazing. John McCarthy played an amazing role, but there were others too - eg Alonzo Church and the guys that later made Lisp what it is (Common Lisp), with a significant extra set of features including lexical scope.

For those who want to learn it - check out this short guide. Highly recommended language!:

https://github.com/ashok-khanna/common-lisp-by-example


> Lisp is amazing. John McCarthy played an amazing role

But in terms of real world production code it is barely a footnote whereas COBOL fills entire libraries.

So while it might be great, there just aren't a lot of examples to look at.


Lisp as a language might be a footnote, but the ideas the language proposed were revolutionary. Recursion, the REPL and even conditionals all came from Lisp (FORTRAN had a conditional construct at the time, the arithmetic IF, but not what we'd recognise as a conditional today, and I believe it later adopted Lisp-style conditionals anyway). IIRC, Lisp could be expressed in 5 small base functions defined in assembly, from which the entire language could be defined in turn, making it outrageously portable.

It's hard to overstate how seminal Lisp was.
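
To make the "handful of base functions" claim concrete, here's a toy sketch in Python (an illustration of the idea, not McCarthy's actual definition): given atom, eq, car, cdr and cons, plus quote and cond, you can write the evaluator itself.

  def evaluate(expr, env):
      if isinstance(expr, str):        # a symbol: variable lookup
          return env[expr]
      op, *args = expr
      if op == "quote":
          return args[0]
      if op == "atom":
          return not isinstance(evaluate(args[0], env), list)
      if op == "eq":
          return evaluate(args[0], env) == evaluate(args[1], env)
      if op == "car":
          return evaluate(args[0], env)[0]
      if op == "cdr":
          return evaluate(args[0], env)[1:]
      if op == "cons":
          return [evaluate(args[0], env)] + evaluate(args[1], env)
      if op == "cond":                 # first clause whose test is true
          for test, branch in args:
              if evaluate(test, env):
                  return evaluate(branch, env)

  print(evaluate(["cons", ["quote", "a"], ["quote", ["b", "c"]]], {}))
  # ['a', 'b', 'c']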


I can understand how the ideas are important and revolutionary, but they were then adopted/extended/reused in other languages that seem to have far surpassed lisp in recognition and pragmatic use. Why hasn't lisp reached the heights of usage of something like C++ or Javascript or Java if it's so popular with computer scientists?

It's an interesting question, and one I'm not old enough to answer properly. From what I can tell, Lisp was popular up until the 80s, when it was heavily used in AI. Then came the AI winter, when I suspect OO languages gained a lot of traction and Lisp fell out of fashion.

> It's hard to overstate how seminal Lisp was.

I don’t disagree but the OP was asking for examples of production code to look at


Ah, but you missed my point :)

Maybe you aren't seeing Lisp itself in production code (although someone suggested Emacs, which is Lisp), but you see its ideas and influence in almost every language and program you'd find today. There are little bits of Lisp everywhere you look, even if you don't recognise them as Lisp.


Which is nice (and very true) but hardly useful when all you want is to look at some actually used lisp code. Emacs might be a good candidate for that.


One example is Metabase, which turns out to be written in Clojure: https://github.com/metabase/metabase

The Linux kernel is a great achievement, but I'm not sure if I'd call it resilient or antifragile.

The code base is underdocumented, has no automated tests, and is full of security issues waiting to be found. The moment you're off the beaten path, you end up debugging random bugs and regressions. Code quality is generally very high, as one would expect, but the code is not necessarily resilient or well-designed.

SQLite is a much better example of truly antifragile code.


The kernel is the opposite of a long running system. Developers are working on the master branch that is never run or deployed.

Every 5 years or so, linux distributions will pick up the latest kernel version and ship it. To upgrade you have to reformat and reinstall your computer, there's no continuity of service, there's no upgrade path.


What? To upgrade Debian (the archetypical stable distro) to a new release, you just do an "apt dist-upgrade": https://wiki.debian.org/DebianUpgrade

Minor patches to the Debian kernel are released all the time and you get those with a regular "apt upgrade". With Arch and other rolling-release distros, you get major new kernel versions all the time as part of the normal package upgrade flow.


>>Every 5 years or so, linux distributions will pick up the latest kernel version and ship it. To upgrade you have to reformat and reinstall your computer, there's no continuity of service, there's no upgrade path.

What in the world are you talking about? Distros are constantly pulling versions of Linux kernel from "master" tweaking it a little bit for their system and shipping it. Most at least a few times a year.


This does not match my lived experience over the last 25 years. I would argue from that experience that you are incorrect.

Macsyma and REDUCE are written in Lisp. The development of those two computer algebra programs started in the mid/end 1960s. They are still maintained today. That makes them over 50 years old.

Some history on Macsyma:

https://www.sciencedirect.com/science/article/pii/S074771711...

REDUCE:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.153...


I've never actually seen anyone use Macsyma. Is it underneath something "more popular"?

The Postgres project started in 1986. It was renamed to PostgreSQL in 1996, but it’s a lot older than that.

I heard that PostgreSQL was a kind of add-on to Postgres, adding SQL syntax. Then the non-SQL parts of Postgres atrophied away until the add-on more or less swallowed its parent product. A bit like how Emacs became an editor and TECO was forgotten.

It's sad that so much well-written code is proprietary.

I'm currently doing a master's in CS as a career change from mechanical engineering, and what surprises me is how little time we spend learning to read code. It would be nice to have a Great Works type program for code bases.


LISP and its variants were the introductory programming language at MIT from the 1970s to the 2000s, partly because they had the LISP/A.I. royalty on their faculty in that era. Students complained that they wanted something more modern to get a better internship. That was silly, because the majority of students entered MIT having already learned the latest fad language during high school.

Fortran is 63, so get off our lawn, kids.

I would add Pascal to the mix. It used to rule the world of computing.

Pascal was not really used professionally (the original Macintosh toolbox notwithstanding -- Microsoft used C to reimplement it when they developed the Windows API). It was the undergrad teaching language until the recommended curriculum of the ACM replaced it with Java. Neither choice is an excellent technical language for professional use but they're both considered simplistic enough for pedagogy.

If we talk about their importance and longevity, Pascal deserves a spot

It ran in many places in the 70s and was quite popular at the time of Turbo Pascal.

The TP compiler was created by Anders Hejlsberg, who later became chief architect of Delphi, then convinced Microsoft to hire him as chief architect on C#, and is now a core developer on TypeScript.


I thought it used to be one of the most popular languages:

https://www.youtube.com/watch?v=YqxeLodyyqA&ab_channel=elmus...

Not sure about the source though.


I recently learned that the programming language for the Navision ERP system (now Microsoft Dynamics 365 Business Central) is a Pascal variant called AL.

> Pascal was not really used professionally

Delphi was/is used quite a bit professionally.


Of course, but it never reached the usage levels of C, C++, VB or Java.

I’m mid-30s, working for a finance megacorp and we have a few production codebases older than me. Some observations:

* Sadly, as others have said, these are proprietary internal systems and members of the public won’t ever be able to see them.

* Although the codebases have ancient lineage, the code doesn’t stay static and has been extended and patched over the years, and sometimes transplanted wholesale to a different technology using automated tools (e.g. mainframe to .NET, although that particular one was an unmitigated disaster for maintainability).

* Ancient code usually persists in production for reasons unrelated to its technical qualities, and usually in spite of them. For example, an insurance company managing a closed book of pensions has very few compelling reasons to change its technology because the product it supports doesn’t change, and regulatory changes can be handled by tweaks and auxiliary reporting systems.

* Having said that, ancient code was usually written with a great deal more discipline than modern code, and I believe this stems from the division of labor between Programmer and Analyst, which no longer exists. By having two people involved in every part of the system, you had all the benefits of rubber ducking, peer-review, documentation, and the simple act of talking to another human about what you are doing.

* Ancient systems tend to be very rigorously tested for functionality that directly supports business activity. Ancillary functionality such as management information reports, less so. These systems change infrequently enough that the major defects have all been flushed out and fixed, over the years. There is very little feature churn, which does wonders for stability.

* The code at the heart of these systems usually wasn’t designed with a modern threat model in terms of security. Security is often bolted-on by isolating these systems behind other systems, and they get a lot of mileage out of obscurity too.

* The code is often the least interesting thing about ancient systems anyway. Few are performing rocket science tasks - they are usually just doing batch data processing and green screens. More interesting is the ecosystem of infrastructure, interfaces, job schedules, copybooks and data models, resilience features, and documentation. These are the things that make the system really work. Lines of COBOL are usually unilluminating.


> Having said that, ancient code was usually written with a great deal more discipline than modern code, and I believe this stems from the division of labor between Programmer and Analyst, which no longer exists.

Shotgun compiling wasn’t a thing and programs were planned on paper (flowcharts) before any code was written.

Modern day programmers scoff as they crank out another web app which will be thrown out and totally rewritten in the fashionable framework of the day in less than a year.


I am old enough to have started my career exactly in such an environment (COBOL in automotive industry).

You are right, there was more discipline, and systems, no matter what size they were, were painstakingly described on paper first, then implemented (programmers had very little chance to come up with neat tricks), then tested. Infrastructure was very "primitive", but on the other hand you basically never had obscure bugs manifesting in your database, or your batch scheduler etc.

On the other hand (I am talking of the late 80s) systems had already grown to a behemoth size, and unfortunately you could already find stuff that was messy and poorly documented.

Take also into account that IT (at least in Europe) had a boom between the 70s and the 80s... but this also meant that lots of people started working in IT with almost no formal education and without any "craft experience": exactly the same mistakes and cul-de-sacs were independently made and discovered all over the place, and there was basically no way for developer X to know that someone else had already solved those, even if the other person was sitting across the street or two floors above.


Regarding your point about ancient systems having higher quality, maybe there has been a selection effect? The systems that were crap might have been rewritten years ago before you even came to the company.

There's a very strong drive for quality. When you handle payments or trading systems, having 1% of transactions disappear doesn't cut it.

It's the environment around critical systems. When shit hits the fan, it's really bad and there's a feedback loop to developers/managers/companies.

Software is ironed out over time. Bugs are removed and things are very stable in finance.

Software that really doesn't work (and can't be fixed) might be killed or never see the light of day. The result of failure is too bad and too visible.

It's way more than that. There are all sorts of strange incentives. Like, you don't get dinged for fixing a bug instead of making a new feature (frequently there are no new features to make; it's largely maintenance work).


Of course, but that's kinda the point. If you have a 20-year-old system still kicking, odds are good that it lasted because it's quality work. If it wasn't, it would have been replaced ages ago.

Or more probably the company using them folded...

This comment (I think, from my limited experience) really does capture the context around my question well. Of course not all old code is well-written by mere virtue of its survival - a lot of it is, as mentioned here and elsewhere, sustained because of internal company politics or financial reasons, or is in flux as a living codebase, etc.

But I am interested in the stuff that survives because of how well-written it is. To your point though, the context and ecosystem of these codebases are as important, if not more so, than the code itself. A lot of comments are pointing to the Linux kernel, which is a tour de force in and of itself, but also really interesting for the community/communities it's inspired.


That last point might be applicable to any software system.

I'm not the best person to answer this either, but whenever anyone says old code is fragile and we should just throw it all out and start over, I always point at this possibly-famous post by Joel Spolsky: https://www.joelonsoftware.com/2000/04/06/things-you-should-...

Couple of key comments for me "The idea that new code is better than old is patently absurd. Old code has been used. It has been tested. Lots of bugs have been found, and they’ve been fixed."

"It’s important to remember that when you start from scratch there is absolutely no reason to believe that you are going to do a better job than you did the first time. First of all, you probably don’t even have the same programming team that worked on version one, so you don’t actually have “more experience”. You’re just going to make most of the old mistakes again, and introduce some new problems that weren’t in the original version."

A similar analogy might be Chesterton's Fence: it's important to know why it's in the state it's in before you decide to alter it. https://fs.blog/2020/03/chestertons-fence/


The sqlite codebase is often mentioned as an example of well-written, well-tested antifragile code.

It is not at all remotely "antifragile". It's extremely brittle, and after so many years you just know the happy path through the code. Any deviation results in spectacular failure, and bug reports get closed as "wontfix". I can't show you examples because of copyright, licensing, and the fact that I'd lose my job.

It has been my experience that modern new software is just like this, only it's more ubiquitous.

"Antifragile" doesn't feel like a very robust word to me.

Google it. It's a commonly used word with a real definition.

Good pun.

It's just the opposite. Old code is so fragile because you don't know what's important in it.

An example is Bitcoin's consensus-critical code: anything that goes in there probably has to be maintained forever (or for the life of Bitcoin).


> Old code is so fragile because you don't know what's important in it. An example is Bitcoin's consensus-critical code

I don't think Bitcoin is a perfect example here; the Bitcoin consensus rule is fragile because it's Bitcoin, not because it's old. The unchangeable consensus code is a result of the fundamental property of Bitcoin itself (every client must run in lockstep; even the slightest deviance cannot be tolerated, which means there should be one and only one Bitcoin client whose behavior shall remain consistent for as long as Bitcoin exists). This outcome is independent of its code quality, age, and other factors.

...Nevertheless, on second thought, the Bitcoin analogy isn't too far from the truth. If an aged legacy system is the dependency of many other systems and serves a critical role, it effectively becomes Bitcoin-like.


Of course, regulations help old code to remain as well, which is probably the case in finance.

Just think about Boeing's case, where the company is trying to fit new code into the memory of an 8086 chip.


I thought it was the 80286, not the original 8086. Don't underestimate the computing power of an aircraft too much...

Probably you are right; I was too lazy to check it :)

Old codebases are really more about exploiting the sunk cost fallacy, nepotism and cultivating a fear and uncertainty of new technology.

That is to say, old codebases are mostly about organizational politics and social dynamics, rather than anything inherently stable or high-quality about the codebase itself.

So basically, if you want your codebase to still be in production 50 years from now, here's what you do: build something important and mission critical. It should be something that could cost millions if it ever fails. Make the design so bespoke and specialized that only you and people trained by you actually know how it works and how to fix it. Fraternize with the management and use nepotism to grease the wheels and convince them to overlook the inefficiencies inherent in your project. This works well in government projects where you can offer kickbacks in the form of campaign donations. Once you are providing a necessity, and there is a real financial risk associated with your system failing, you're set for life. Expect your contract to be renewed next year, and every year after that, forever and ever and ever and ever.


Thanks for these!

I have a copy of The Shock of The Old and I did find it thought-provoking. I had recently become aware of the ideas/discipline of discourse analysis[1] so it was more enlightening than it might have been to someone who wasn't really aware that our narratives are framed (like younger me).

I'll be checking out the Russell and Vinsel.

1. https://en.wikipedia.org/wiki/Discourse_analysis


I'd throw Postman's work in there as well...

https://en.wikipedia.org/wiki/Technopoly


These books sound great. Would you recommend I read them in that order too?

Yes! The second book is heavily influenced by the first.

The first one (Edgerton) is history, with the argument being roughly that we should frame the history of technology not just in terms of inventions/innovations, but also in terms of what was actually in use at a particular moment. (So the early 21st century is about self-driving cars and the iPhone, but also the Haber-Bosch Process and COBOL.)

The second book is more about the present, and is concerned with how we tend to talk too much about innovation and not enough about maintenance. (So they love Right to Repair and Open Source maintainers, among other things.)


Thank you :)

Coding in COBOL was a real downer after 2 years of assembly language coding. In assembler, one register pointed to the top of all your main memory. Solving COBOL dumps required resolving pointers to pointers, and I had not kept good notes in COBOL class. Fortunately, soon after I returned to COBOL programming, debuggers came online that resolved the COBOL misdirection for you. Misdirection is the kindest word I can think of to describe COBOL dumps.

I've really never solved a dump since COBOL, which I have not worked in since 1996. Debugging has come a long way.


This was such an incredibly insightful historical perspective. Thanks for sharing.

I don't like articles about COBOL, because most of them talk about how young people don't want to learn it. I'm fine with learning shitty technology. The real issue is the money. Look at the salaries for COBOL jobs. They suck shit.

If banks need to update their software, hire people and train them. It will take time, and time is money. That money could be invested into something else (there is an opportunity cost). If that something else is a higher priority to you, then maintaining the integrity of your critical software is not actually as important to you as you say it is.

I will not take a pay cut to work on COBOL for a bank.


> I will not take a pay cut to work on COBOL for a bank.

You're probably just not the right type of personality. It seems that COBOL programmers have nice, slow paced, easy and safe jobs. That could be why there's so little hiring for COBOL programmers. Once you get someone, they stay put for decades until they retire. I'm sure there are plenty of programmers out there ready to drop the stressful sleepless start-up jobs and settle into something easy and 9-5, but perhaps don't know how to get from A to B.


I have a brother-in-law who is a COBOL programmer. Very much as you describe. I'd like to know more about it, honestly, but he's very much the kind of guy for whom work is just work and not interesting to talk about after hours. In his early fifties he is one of the youngest at his job - and despite having been at one company his whole career, he's one of the shortest-serving. We don't talk money; his lifestyle doesn't seem lavish, but maintaining a single-income family in a high cost of living area is a lifestyle not to be sniffed at.

I'd say this totally depends on your area.

When I graduated most of my classmates were hired by the local branch of an international consultancy firm (hint: it's currently involved in a legal feud with the GOP in the US). Pay is below market average, and their job involves dealing with legacy Java and COBOL codebases. The Java part is OK, but they find COBOL very painful to work with. They are also required to do unpaid overtime due to hard deadlines.

On the other hand, a few of us went on to work for startups all over the country, and if you compare our careers ever since, they are like night and day. We can switch jobs easily due to having marketable skills (COBOL is a dead end outside a few companies) and we are making twice as much without working overtime.


I do not get why you say that COBOL programmers get paid less. Once you know everything about the systems and the language, you should be able to get above-market pay, because you are really hard to replace. "Pay me more or I leave" would be a pretty easy management decision.

You would think, but no: programmers can never make more than managers. Management makes sure of it.

Poignant but ultimately true.

“Should” is the key word. I’ve looked at COBOL job listings every few years and I’ve never seen one for more than $90k US. So maybe once people are on the job they have more leverage, but those job listings don’t indicate any desperation for talent.

Look at daily rates, not permanent positions.

What daily rates do you see?

Right. I grew up hearing that COBOL and Fortran devs would become extinct and extremely valuable in my lifetime.

They mention this in the article. They used to grab people off the street and teach them COBOL, paid for by the banks. What happened between the 1960s and today that this type of practice stopped?

They realized they could wait around for the government to bail them out on this expense too?

Either it’s a really easy language to learn or the banks used to invest much more in teaching it than they do now (= zero).

If COBOL is so domain-specific, why not teach it as such on the job? Why is it the public’s fault for not investing in it?


> Either it’s a really easy language to learn or the banks used to invest much more in teaching it than they do now (= zero).

Very few companies invest in training these days. They expect schools and colleges to teach some professional skills (this is not what they should be doing) and invest as little as possible in new hires, especially for technical jobs.


> If COBOL is so domain-specific, why not teach it as such on the job? Why is it the public’s fault for not investing in it?

That's what my comment was saying. According to the article, the banks used to pay for their staff to learn COBOL on the job. It's 100% the financial industry's fault that they have allowed this situation to arise. Why did that practice stop? I think this is one of the areas where capitalism has been slowly failing in recent decades.


Presumably their codebases grew hairier and hairier and now it takes an enormous amount of work to change things without breaking stuff.

Maintaining and understanding legacy codebases is often something given to outsourcing companies too.

As it requires a deep understanding of how the monolith works and its connections to other applications, it usually needs to be done in conjunction with the support team which may be outsourced too.


I'm hearing that COBOL programmers are paid way above market rate right now.

Wait it out; eventually it will have to go up as the old-timers die off.

I have some experience in fintech. I wrote the translation layer between financial cores and a RESTful API (essentially wrapper logic). D+H, Fiserv, BottomLine all provided us integration documentation which allowed us to perform actions (transfer funds between accounts, get balances, etc.), and it wasn't _too_ bad.

I do remember one particular core that was surprisingly difficult. Everything was encoded in EBCDIC. You had to rotate telnet (!) connections depending on the day of the week. There was one person (a man named Earl) who could help you if you had a problem with the continuous stationery documentation (luckily it was scanned).

Oh the memories...
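
(For anyone who hasn't had the pleasure: Python ships EBCDIC codecs, so you can play along without a mainframe. cp037 is the common US/Canada code page; the core in question may well have used a different variant.)

  # EBCDIC bytes off the wire; note 'H' is 0xC8, not ASCII 0x48
  raw = bytes([0xC8, 0x85, 0x93, 0x93, 0x96])
  print(raw.decode("cp037"))              # Hello
  print("Hello".encode("cp037").hex())    # c885939396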


I was amazed to learn how much the US depends on SFTP for ACH. Though if a more instant system becomes available through the Federal Reserve, we could finally get to quicker payments like the rest of the world.

In the EU, SEPA was forced through via legislation[1]. The financial industry had to be dragged kicking and screaming along.

[1] https://en.m.wikipedia.org/wiki/Single_Euro_Payments_Area


I didn't realize how lucky we are with SEPA until after having to deal with US banks.

And this is why lobby groups should be regulated into good behavior - so that economic interests don't have the power of hundreds of millions of voters.

rotating telnet connections lol

I have no problem with older systems being used for financial stuff like this. Sometimes it’s easier to just keep using what you already have. What I do have a problem with is when the antiquated computer systems make an entire country’s banking sector move glacially. ACH is so slow, and other countries (the EU and UK especially) have payment systems like SEPA and Faster Payments which are universally used and clear pretty much instantly. I don’t know why the US doesn’t have something like this (I believe the Fed is working on it), but it’s definitely created the room for the myriad of p2p payment apps like Cash App and Venmo in the US, which really need not exist in a more developed banking system.

The Federal Reserve is rolling out an instant payment system called FedNow in 2023. Its design and implementation languished until Congressional pressure, driven by concerns about Zelle becoming a dominant payment platform as a private entity (owned by Early Warning Services).

A separate issue is the Fed exploring creation of “digital currency”, essentially deposit accounts at the central bank. That was encouraged by the threat of Facebook’s Libra and China’s “digital yuan”.

https://corpgov.law.harvard.edu/2020/08/31/fednow-the-federa...

https://www.reuters.com/article/us-china-currency-digital-ex...


This is not without controversy. It has placed the Fed in the awkward position of competing with some of the firms it regulates, who had already begun building their own systems.

In all fairness, same shit in Europe with PSD2 (the Payment Services Directive). Imagine an auditorium with representatives from each bank in Europe, no matter how big or small, each shouting at the others about how their systems are the way they are and will not be changed! It makes agreeing upon standards awfully hard.

SEPA standards are pretty slick though, built a couple implementations myself for transfer documents (pain.00X) and It Just Works™.


> SEPA standards are pretty slick though, built a couple implementations myself for transfer documents (pain.00X) and It Just Works™.

Where did you learn about them enough to be able to implement them? Are the standard(s) publicly available?


I don't know why, when there are working examples of things, the USA just has to go it alone. Why not just wholesale copy SEPA and use it, rather than the "we have to build it here" syndrome? I'll never understand that. Same with working educational and prison systems.

COBOL has a SEP field on it (and for good reasons - it is someone else's problem), but when discussing APL, I've often reached for it as an example that everything is just preference.

When I show an APL or K gem such as "|/0(0|+)\" (which is a complete and rather efficient implementation of the maximum-subarray-sum problem), people usually complain that "it's unreadable".

But then "ADD ONE TO X GIVING X" compared to "++x" or "x=x+1" shows that explicit is only better than implicit when you're not familiar with the notation.

Readability is all about your expectation and familiarity.
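
(For anyone squinting at the APL/K gem above, here's what I believe is the same computation written out longhand in Python: the scan computes acc = max(0, acc) + x, and the fold takes the running maximum, seeded with 0 for the empty subarray.)

  def max_subarray_sum(xs):
      # mirrors |/0(0|+)\ : scan, then max-reduce
      best = acc = 0
      for x in xs:
          acc = max(0, acc) + x   # the (0|+) step of the scan
          best = max(best, acc)   # the |/ fold
      return best

  print(max_subarray_sum([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6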


> COBOL has a SEP field on it (and for good reasons - it is someone else's problem),

What do you mean by SEP field? (I can't find it in the article either). Is this a programming construct like a field of a record?


Somebody Else's Problem field. It's a joke from Hitchhiker's Guide to the Galaxy. Instead of an invisibility field, where you can't see something, you use an SEP field, which makes you look at something, subconsciously think "that's somebody else's problem", and then you look away.

I found out most people confound familiarity and "intuitiveness" when trying to teach Emacs. They would say it was not intuitive because common key shortcuts were not default. I would try to explain, to no avail I must say, that it is rather intuitive since, once you understand the basics, everything follows quite logically and key bindings are customizable.

One of the rules of design is that there is no such thing as intuitive, only familiar.

While I get your point, your example seems only partially valid. Programming languages are moving away from the ++ operator due to its ambiguity, in favor of x=x+1, exactly to make sure that what you expect to happen actually happens.

Moving away from? Ambiguity? Maybe from "x++", and even that is not clear.

There is no ambiguity, "++x" is as strong as ever in the C family including C, C++, C#, Java, JavaScript. Python added "x+=1" which is as ambiguous as "++x" and did so (relatively) recently, even though it had "x=x+1".

I'm not sure why you think "programming languages are moving away from the ++ operator". Care to elaborate?


If you know C++, do you also know that `a[i] = i++` is UB... but only until C++17? If not, I hope that answers the question.

That's one of the reasons some newer languages like Rust purposefully avoided increment operators, where instead you just do `x += 1` (and the add-assign operator doesn't return anything) or `x = x + 1`.


C has lots of undefined behavior unrelated to ++ (which only adds a small twist in the grand scheme of things).

In C afaik f(g(),h()) is still UB because either g() or h() could be called first. ++ is just another small detail. (And no, I didn’t know C++17 made that behavior defined. Do you have a quick description of how it is defined now?)

“Newer languages like Rust” - just say “Rust”, unless you have another good example of a non-fringe language that adopted most C syntax, including +=, but without ++.


`f(g(),h())` is not UB. The order of evaluation of parameters is unspecified, but it's not undefined. They will be evaluated in some order before the call to `f()` takes place. If both functions modify the same global state, the order of modification is unspecified. It's still not undefined, and your use of global variables is still not recommended.

Zig did too.

> Python added "x+=1" ... (relatively) recently

Not all that recently anymore – it was added 20 years ago in Python 2.0: https://docs.python.org/2/whatsnew/2.0.html#augmented-assign...

But the walrus operator, "x := x + 1", was added very recently: https://docs.python.org/3/whatsnew/3.8.html#assignment-expre...


I didn't mean to say that old languages would change the behavior of well-established operators; that seems unlikely to happen and would be quite bad for backwards compatibility. But newer languages, like Rust and Go (although Go went with a different solution [0]), and probably others as well, don't want to deal with the ambiguity of post- or pre-increment operators. Python as well, as you mentioned. From your other replies I see that you don't agree that there is any ambiguity. That is fine with me; I don't plan on trying to convince you. I lack the in-depth knowledge to be able to do that in a credible way anyway. I just note that the designers of newer languages seem to disagree with your conclusion.

[0] https://golang.org/doc/faq#inc_dec

Not that I ever had a problem with it, but the ambiguity with ++ lies in the prefix/postfix versions of the operator. I don't see how x+=1 is ambiguous in any way; it's just a shorter way to write it, and there are no side-effects like with x++?

There is no ambiguity. Prefix is increment pre (increment before getting the value). Postfix is increment post (increment after getting the value). Both have side effects as does x+=1.

Consider the call f(x+=1,x+=1) when x=5; what is the call? f(6,7) or f(7,6)? IIRC even f(6,6) and f(7,7) were valid behavior in the past.


"x += 1" in Python is a statement, not an expression, so that's a SyntaxError.

But I suppose you would run into this kind of ambiguity with the walrus operator, i.e.

  >>> x = 5
  >>> f = lambda a, b: (a, b)
  >>> print(f(x := x + 1, x := x + 1))
  (6, 7)
PEP 572 appears[1] to specify that the evaluation order should always be left to right.

[1] https://www.python.org/dev/peps/pep-0572/#change-to-evaluati...


Any COBOL contractors here that can tell us what the pay per hour is? First-hand experience please; I've read all the rumors already.


My firm pays a contractor for supporting our mainframe systems. His company charges us ~$900 a day for his services.

Your firm pays his company ~$216k a year ($900 × 240 billable days). He probably sees 1/3 to 1/2 of that, maybe less. I'm not in software, but that seems low, doesn't it?

$900/day is $225,000/year (at 250 working days).

Or about $112.50/hour.

But this is what your company pays as Corp-to-Corp, to his company.

His gross pay is significantly less. Maybe at $65/hour. So his annual pay might only be $130,000/year.

And while this is decent pay as middle-class wages go, it is not really very impressive.


Also, how does one learn COBOL nowadays? It's mostly a curiosity, but it would be interesting to see what people who use it recommend.

I'm working on learning COBOL for IBM i, and to do so, I picked up a server that runs it on eBay (€500), learned to install it, then started reading the IBM documentation[1]

You can skip the hardware purchase by joining the IBM i Hobbyists discord[2]; there are a decent number of people with machines who are happy to give out accounts (myself included)

IBM i is a lesser used platform for COBOL, though; the main one is IBM mainframes (aka MVS and z/OS). To get started there, you could try Master the Mainframe (I don't have a link handy, sorry) or tk4[3], and get support from the Mainframe discord[4]. I'm focusing on IBM i because I found it significantly easier to get started, and the platform itself has a really unique design that I was interested in learning about.

[1] https://www.ibm.com/support/knowledgecenter/ssw_ibm_i_73/rza...

[2] https://discord.gg/s5xs5ceW26

[3] http://wotho.ethz.ch/tk4-/

[4] https://discord.gg/XFrzjs2ydP


https://www.coursera.org/learn/cobol-programming-vscode is one place to start. :)

If you've done that, I presume there's also https://www.coursera.org/professional-certificates/ibm-z-mai... though it's starting to feel even more like marketing after seeing all that.


Download and install the GnuCOBOL compiler. Go through some of the many tutorials on the web. ????. Profit.

Next up: learning Ada.


There’s also https://github.com/openmainframeproject/cobol-programming-co... which IIRC is an IBM backed open source course.

It's funny: the more readable and maintainable and testable and modular some code written in a certain programming language is, the easier it is to understand and rewrite it step by step.

Hence, code written in good programming languages might have a shorter lifetime compared to, say, COBOL or PHP code. A bit of a paradox.


I think the elitist attitude towards COBOL really should stop. COBOL is a domain-specific language for implementing commercial/financial batch processing jobs, no more, no less. With adequate control over decimal arithmetic and established declarative practices for structured file exchange, subroutine reuse, and job control. Based on a language spec with reasonable portability. It's not about good or bad but fitness for a purpose. COBOL programs can be very clear and to the point compared to eg Java programs replacing them with lots of misguided OOP patterns and metaprogramming.
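
To make the decimal-arithmetic point concrete: COBOL declares fixed-point decimal fields natively (a PIC 9(7)V99 field has exactly two decimal places), where most modern languages make you opt in through a library. A rough Python sketch of the same discipline (illustrative numbers only):

  from decimal import Decimal, ROUND_HALF_UP

  # money code can't tolerate binary-float representation error
  rate = Decimal("0.0525")
  balance = Decimal("1234567.89")

  # quantize() enforces the two-decimal-place "picture" explicitly
  interest = (balance * rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
  print(interest)             # 64814.81, exactly
  print(1234567.89 * 0.0525)  # the float result carries rounding noise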

You make a good point; I've not thought about COBOL as a DSL before, but the analogy fits.

I think there are legitimate criticisms, such as the fact that it’s not portable between architectures, compilers are typically closed source and expensive, code is typically not portable between compilers.

These are all things we mostly expect languages to have gotten past now, so I can understand the feeling that it’s stuck in the past. The thought experiment of what would a DSL for banking look like though does suggest COBOL isn’t too bad.

It feels like this would work well with the approach of companies like Micro Focus: compile COBOL to the JVM/CLR, then allow pieces to be replaced with Java/CLR languages as necessary, and allow running on "normal" machines, removing the dependencies on architecture (by emulating in the compiler). That way, these old codebases almost become a specialised VM language, while regular modern languages can be used to augment them.


First of all, why would COBOL be considered a DSL? Is Java a DSL too then?

Second, you make it sound as if COBOL would be somehow better than using Java, C#, ... because it is a DSL. But that's not the case - or can you give some more concrete points for why it should be better?


> First of all, why would COBOL be considered a DSL?

Let me answer, as GP and having made that claim. Because COBOL can't really do much of anything else ;) E.g. have fun implementing an event-driven GUI app (actually there are/were solutions for running COBOL green-screen mainframe apps in browsers, and things like IBM's HATS for running 3270 apps in Java portlets). Though apart from batch processing, I guess COBOL the language, but not necessarily the runtime, works well also for writing backend service implementation code.

More seriously, as I recall it, COBOL simply has straightforward idioms for arithmetic, date calculations, statically-typed structured file I/O, ISAM file access (and SQL?) as part of the language rather than Java's BigDecimal and various date libraries that all suck in a different way and at best cause enormous fluent-style expressions with cognitive overhead. Plus, COBOL doesn't have reflection and metaprogramming so self-important idiomatic Java code golf is spotted immediately as out of place next to actual business logic.


> Eg have fun implementing an event-driven GUI app

Which leads to the question: why is this harder in COBOL than in JavaScript or Java? Is it just the lack of a library? If so, does the lack of libraries make a programming language a DSL? Or is it because it is difficult to create such a library? If so, why is that difficult for COBOL and not for JavaScript/Java, and what else is difficult to build with COBOL?

> I guess COBOL the language but not necessarily the runtime works well also for writing backend service implementation code

I'm sorry to be picky on words here, but as it is the core of the whole discussion: "works well" is quite meaningless if not put into context. "Works better than X" or "has an advantage over X in some way, for example..." is much more fruitful for a discussion about this topic.

> More seriously, as I recall it, COBOL simply has straightforward idioms for arithmetic, date calculations, statically-typed structured file I/O, ISAM file access (and SQL?) as part of the language rather than Java's BigDecimal and various date libraries that all suck in a different way and at best cause enormous fluent-style expressions with cognitive overhead.

I don't want to be ignorant here... and I'm also not a fan of Java. In fact I dislike Java so much that I declined highly paid jobs. However, is COBOL really better in these things?

So I have seen some production COBOL code at work before and I just looked up some COBOL questions on stackoverflow, e.g.: https://stackoverflow.com/questions/48016044/formatting-date...

Sorry, but even Java's horribly verbose syntax looks way better than this. Apart from the annoying verbosity, I think Java's new time library as well as BigDecimal/BigInteger and also its SQL libraries are not too bad anymore. And if one wants nicer syntax - there are enough languages out there that do it better.

Okay, having SQL directly embeddable is nice, I agree. I don't think it is good programming language design to do that, but then again SQL has been very stable over time and it's certainly nice to use it like that.

As for...

> statically-typed structured file I/O

What does that mean? I tried to find it out but didn't really get an idea what you mean by that.

> Plus, COBOL doesn't have reflection and metaprogramming so self-important idiomatic Java code golf is spotted immediately as out of place next to actual business logic.

That's true, but that doesn't make COBOL better - it just makes Java worse. ;)


Plus you don't need to pay for more CPU/LPAR licenses on the mainframe.

COBOL just didn't improve as a language (yeah, I know that there is also OOP COBOL) and hence it's not productive to use it.

Even Java is improving more than COBOL and improving fast is certainly not Java's strength...


For those who haven't seen this old joke, Bruce Clement suggested in SIGPLAN 1992 that OOP COBOL should be called "ADD 1 TO COBOL GIVING COBOL".

I think it is fair to say that Java up till about Java 8 had a problem with not evolving fast enough.

However, I'd say Java is now really one of the quicker-evolving languages out there, which is really impressive. It went from a Go-like evolution pace to being closer to a Rust or JavaScript pace.

Turns out, fast release cycles do a lot of good for a language.


Java certainly got faster, which is good. Relative to the manpower, it is still moving super slowly. Which makes sense to some degree, as the language just has so many old problems, does not want to considerably break compatibility, and also hasn't invested in how to move fast as a language.

The problem is that if you convert your COBOL code to the language du jour (LDJ), you've only kicked the can down the road. You will always have to keep some COBOL experts on staff (the conversion is never finished), plus the new LDJ experts.

In ten or twenty years, LDJ will be obsolete, now you have to convert to LDJ2. Now you have COBOL programmers on staff, plus LDJ programmers, plus LDJ2 programmers. No one wants to be on the obsolete teams, your salary budget bloats, and you have more Babel.

If there's always going to be something better, and you're always going to be fragmented, why not stay with just the one single obsolete technology?


This is one of my great fears as a software maintainer.

It seems almost impossible for any product to ever fully shed itself of a technology. Primarily because while the initial work may yield benefits that management loves, they never prioritize finishing the job as some feature always takes priority.

And now you have 2 problems. You have half your product in one language/tech, and another dark corner in another tech. Your build is more complex and your product more fragile.


To me the appropriate language for a re-write of (software written in) COBOL is SQL.

Edit: clarification in parentheses.


I can't confirm this is fact, but remember when unemployment systems in NJ couldn't keep up and they were asking for volunteers to help when the pandemic started? The people who worked on such systems were complaining that they had previously been laid off and replaced by contractors.

https://www.reddit.com/r/recruitinghell/comments/fz6b74/pitc...

If I were going to work for a bank, I'd look at Nubank.

* They have made 2 tech acqui-hires - Cognitect and Plataformatec

* Clojure

* Engineering culture

I can't imagine how bad it would be to build up 10 years in a COBOL factory just to be let go. Pretty easy to get a new job I suppose. Most big business places I've dealt with don't embrace remote either.


Our wonderful reserve currency rests atop a fairly well-tested and robust system called SWIFT, which at its core is just FTP servers with some message queues. If it ain’t broke?

Have some doubts about its robustness and security [0]. To be honest, though, once you are up against a dedicated nation-state team, all bets are off.

[0] https://en.wikipedia.org/wiki/2015%E2%80%932016_SWIFT_bankin...


blink

I've never heard of SWIFTNet using FTP at its core, do you have some links I could read?

Wikipedia, for example, while not a fully trusted source, has

> Alliance Access (SAA) and Alliance Messaging Hub (AMH) are the main messaging software applications by SWIFT, which allow message creation for FIN messages, routing and monitoring for FIN and MX messages. The main interfaces are FTA (files transfer automated, not FTP)

https://en.wikipedia.org/wiki/Society_for_Worldwide_Interban...


Is this still really true? Most trading shops use C++ and a ton of banks use Java. This article provides no evidence that COBOL is still the dominant language.

Yes, it dominates. We don't talk about it a lot, because talking about things working isn't all that interesting. We don't talk about it when things don't work, because the failure isn't the language but the design.

We don't talk about COBOL so much as we talk with COBOL. I have email after email with COBOL snippets discussing various projects (and I'm an analyst here, not a programmer). No one mentions COBOL by name, but almost anyone, even nonprogrammers, understand it to a degree that they can reason about what is happening and point out errors in the assumptions made by the programmer.

There are IVRs that talk to mainframes via tn3270 to execute CICS transactions. Most CICS programs are written in COBOL. There are external websites, internal web applications, even VBA macros that rely on tn3270 and CICS, thus COBOL.

While other languages may dominate the business of languages, COBOL dominates the business of business.


Exactly my experience. Been working in fintech for 15+ years. Went through 4 tier-1 banks and a few smaller ones. The most popular languages are C++, Java and sometimes even C# or OCaml. Never did I hear COBOL even being mentioned.

Just wondering how true those statements about COBOL still being so popular are...


How many retail banks, insurers and government departments have still got a mainframe sitting at the very back, though? Based on a good few engagements in banks and gov in the UK, I haven't yet come across one where there wasn't ultimately some mainframe batch job hidden behind a fair few layers of middleware and MQ. And my guess is that most of that is COBOL.

I have an intuition that COBOL will become very valuable in the years to come, as the majority of the greybeards that keep it running stop work - and then the projects to rewrite core business functions in banks and insurers will come thick and fast, and will provide a never-ending gravy train for contractors who can help with migrations or keeping the lights on. I, for one, have COBOL on the list of languages I want to know well enough to understand.

Here's a place where interop can do magic. There's a .NET implementation of COBOL out there, for instance. The first thing I'd do if I inherited a project like this is change the runtime and write new code in C#, then slowly migrate away the internals (and for places where the claim that COBOL is great holds merit, wrap those up in assemblies implemented in COBOL).

This is like the code you write for the stuff you blast off into space—sometimes the most important things need the simplest, most trustworthy, and, yes, oldest code.

The header (as high as the screen) doesn't work without js. Scroll to read the actual article (readable without js).

I realize this is a naive perspective, but rather than try to “translate millions of lines of COBOL to Java”, why not just build a new system from the ground up, based on requirements and first principles? The function of these systems seems fairly simple from the outside. What am I missing?

As someone else pointed out, there are no requirements; the code is the only source.

A long time ago I worked on the core processing system for a large US insurer; the business would continually call up for confirmation of the business rules from the code. Not a basis for success in a complex business laden with tonnes of regulatory oversight.


Absolutely, but for systems like this, there should be clean separation between business rules and core application logic. Filters, hooks, rules engines, etc. I very much realize “easier said than done” but the complacency with regard to keeping these systems the way they are is truly alarming.

For companies like banks, the transition is the riskiest part. For management, there is no upside in such projects since the system already works, but downsides are project failure and your banking systems ceasing to work. Not to mention cost.

The only thing driving change in such a system is when it can no longer scale to handle the "traffic", or legislation that forces it.

That would take a long time and can't be realistically predicted (even if it could be, the prediction would be in months). Management will never OK a development effort whose schedule is predicted at more than a few weeks (ESPECIALLY in fintech).

I work at an OTA, and you would be surprised to know how much of the accommodation industry still relies on good old fax machines. I'm not aware of any other industry using fax as much as this one (though I'm sure there is one).

Indeed.com lists 690 COBOL job ads for the whole of the USA. 36,600 Java ads.

So if I learn Cobol, I'll land a job in a bank?

You might also need to build a time machine to go back to the 1960s

Yes, but it will pay 1/5 of what you can make expending the same effort learning Python and doing “machine learning” for a web startup.

On the other hand, your COBOL job will last a long time. And you won't be talked into taking a shitty salary with the promise of big payout that doesn't happen.

Why can't new languages be as easy for newbies as BASIC and COBOL were? Why does technology get harder instead of easier?

Because we keep building on top of things, and trying to make stuff simpler to use rather than simpler to change. Hence why I'm curious to see what programming could be in 10-15 years' time, when programmers will have had an iPhone (or equivalent) their _whole_ life.

"Because we keep building on top of things, and trying to make stuff simpler to use, rather than simpler to change"

wow, very good point


It's been a long time since I have used BASIC, but is Python really harder?

It introduces some concepts that are absent from BASIC. If you think about it, BASIC, with its limited conditional branching and no concept of functions (you set variables, then used GOSUB, with all variables being globally visible), is very close to the machine language of the host computer, with some added types. If you wanted to do recursion, you'd need to invent a stack.
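
A toy illustration of that last point, sketched in Python for readability: with no parameters or locals (all variables global, GOSUB for control flow), recursion forces you to save and restore state on a stack you maintain yourself. A hypothetical factorial:

  stack = []     # the stack you'd have to invent in BASIC
  n = 5
  result = 1

  def gosub_factorial():    # stands in for GOSUB 1000
      global n, result
      if n <= 1:
          return            # RETURN
      stack.append(n)       # save "local" state by hand
      n -= 1
      gosub_factorial()
      n = stack.pop()       # restore state after the call
      result *= n

  gosub_factorial()
  print(result)             # 120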

Python is much easier than BASIC for anything other than the simplest of tasks. Plus there is memory safety and much more dynamism. It also encourages you to write in a more explicit fashion than things like LISP or Ruby or Perl (ug!), which I prefer.

Classic BASIC is memory safe, if you don't use POKE to alter a memory location or CALL to invoke machine code.

ctrl-f blockchain: nothing?

This technology is clearly crap and ripe for disruption. Sure: there are some good reasons for things to be the way that they are but decades of compatibility work won't save banks against a competent competitor.



