flatpepsi17 a day ago

Article starts mentioning 4GL's - a term I have not heard in a long, long time.

COBOL's promise was that it was human-like text, so we wouldn't need programmers anymore. A lot like "low code" platforms, and now LLM generated code.

The problem is that the average person doesn't know how to explain & solve a problem in sufficient detail to get a working solution. When you get down to breaking down that problem... you become a programmer.

The main lesson of COBOL is that it isn't the computer interface/language that necessitates a programmer.

  • vishnugupta a day ago

    I agree with you by and large except for this part.

    > COBOL's promise was ... we wouldn't need programmers anymore ... average person doesn't know how to explain & solve a problem

    COBOL wasn't intended to be used by an "average" person but rather those with deep domain knowledge. They would know the business processes so well that they could transcribe it in COBOL with little or no need to learn how the computers worked. In some ways similar to analysts/data folks using SQL to communicate with databases.

    While at it, let me share a few more aspects off the top of my head.

    COBOL and 4GLs in general were primarily intended to be used to build business applications: payroll, banking, HRMS, inventory management and so on. Even within that, the emphasis was more towards batch processing, to reduce the burden on people doing routine bulk operations like reconciliation.

    COBOL harks back to the times when there was no dedicated DBMS software, which is why you see so much focus on how files are organised, and the extensive verbs around files that somewhat resemble SQL today.
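
    To give a flavour of those file verbs, here's a minimal from-memory sketch (untested, file layout and names invented, roughly COBOL-85 as GnuCOBOL would take it): an indexed file declared up front, then a record fetched by key, which today you'd write as a SELECT ... WHERE in SQL:

        IDENTIFICATION DIVISION.
        PROGRAM-ID. CUSTLOOK.
        ENVIRONMENT DIVISION.
        INPUT-OUTPUT SECTION.
        FILE-CONTROL.
            SELECT CUSTOMER-FILE ASSIGN TO "CUSTMAST"
                ORGANIZATION IS INDEXED    *> a keyed flat file; no DBMS involved
                ACCESS MODE IS RANDOM
                RECORD KEY IS CUST-ID.
        DATA DIVISION.
        FILE SECTION.
        FD  CUSTOMER-FILE.
        01  CUSTOMER-RECORD.
            05  CUST-ID    PIC X(5).
            05  CUST-NAME  PIC X(30).
        PROCEDURE DIVISION.
            OPEN INPUT CUSTOMER-FILE
            MOVE "00042" TO CUST-ID    *> set the key, then READ: roughly a WHERE clause
            READ CUSTOMER-FILE
                INVALID KEY DISPLAY "CUSTOMER NOT FOUND"
                NOT INVALID KEY DISPLAY CUST-NAME
            END-READ
            CLOSE CUSTOMER-FILE
            STOP RUN.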

    • moomin a day ago

      In my experience, it's often hard to find that person with deep domain knowledge, and even when you do, their knowledge is unstructured, they take things for granted they shouldn't* and they have no appreciation of the demands of formalism.

      Getting anything you can use to construct a work plan, never mind a detailed feature list, out of clients can be a dark art.

      *To the point I have repeatedly experienced a point close to the end of the project where they go “What do you mean you don’t handle a case I have failed to mention for the entire duration of the project?”

      • dcminter 21 hours ago

        I recall a spec doc from a domain expert that said something like:

        "The transaction consists of a credit stub and a debit stub. If the debit stub is missing and is of type X then we do A and if it is of type Y then we do B."

        How to know what flavour the missing item was? Absolutely no mention of that...

        • chii 17 hours ago

          It's interesting that domain experts all exhibit the same cognitive issue - their assumptions are just so ingrained that they cannot articulate them at all.

          The fact that they "know" a missing stub would have a type is because they actually have some more information than they let on, and this information is only known by the expert. For example, they know if the submission was from party A, it must be type X.

          But that fact might never be recorded in the computer system, even though the old business process would've had a record of it.

          And this is just one small example - imagine something more complex!

          So realistically, the job of a programmer is to force an expert to articulate all of their assumptions. IMHO, the best way to do it is to be sitting with the expert, and observe exactly what they do.

          • DebtDeflation 10 hours ago

            My experience with "business domain experts" is that the majority of them are simply executing a process that someone else defined a long time ago. Their definition of "success" is usually that all steps within the process execute successfully without error and they can move on to the next transaction or activity. Very few of them are capable of taking a step back and considering what the process is actually trying to achieve and whether there might be a better way of accomplishing it. This leads to constant "paving of the cowpath" where archaic processes just get replicated in new technologies every so many years.

            • chii 8 minutes ago

              That is a fair point. The programmer will have to learn the process, but also understand the process' goals and intentions.

              If they do, then they by definition become a domain expert. It's just that this takes a while, and projects usually don't allow enough time for that to happen, unfortunately.

          • dcminter 9 hours ago

            > For example, they know if the submission was from party A, it must be type X.

            Ha! As far as I remember it was almost exactly this when we interrogated them (but it's been a while).

          • eru 15 hours ago

            > IMHO, the best way to do it is to be sitting with the expert, and observe exactly what they do.

            Or you give them a prototype of the program, and see what they complain about?

            • Seylox 14 hours ago

              Oh, you're onto something. May I slap a sticker on it and call it "agile"?

              • normie3000 14 hours ago

                How about "extreme programming"?

                • eru 6 hours ago

                  I have much more sympathy for 'extreme programming' than for 'agile'. Mostly because I see many more clueless people use 'agile', while 'extreme programming' seems to be a term that time forgot and eg http://www.extremeprogramming.org/ is still in a state of relative innocence. The website was apparently last updated in 2009.

            • discretion22 14 hours ago

              My experience is that this invariably results in "development by veto". Each prototype, they say that's not what I want, give me something else (that I'll fail to describe just like the last time) and I'll tell you that is wrong too after you've worked on it for a few weeks.

              Occasionally, you'll randomly get something they accept - but only for a few weeks until they come across some missing capability for some other thing they never told you about.

              • eru 36 minutes ago

                > My experience is that this invariably results in "development by veto".

                Yes, I wasn't entirely serious.

                Though you can get pretty far by doing some roleplay, where you pretend to be the computer/system (perhaps put up a paper screen to make it easier to roleplay, and pass messages written on paper) and have the expert interact.

              • randomdata 12 hours ago

                You still have to actually listen to the complaints. "That's not what I want" does not mean try again, it means they have no interest in what you are trying to offer in even the most basic sense. The lesson from that type of complaint is that you are barking up the wrong tree. Time to move on to something else.

                When you are solving a real problem, you will still receive complaints, but they will be much more constructive.

          • zknow 16 hours ago

            you might say the domain expert expects the computer to also already be a domain expert

    • tannhaeuser a day ago

      > COBOL and 4GLs in general

      COBOL dates back to 1959, much earlier than the 4GLs, and the cited 1992/1999 articles make the point that 4GLs were poised to replace the likes of COBOL and FORTRAN, when in fact those dinosaurs (or rather nautili, since they're still living) turned out to outlive the 4GLs, except SQL (when counted as a 4GL).

    • nerdponx 16 hours ago

      > In some ways similar to analysts/data folks using SQL to communicate with databases.

      But SQL has the exact same problem. Except for very trivial scenarios, you can't just be an expert and plop your expertise into a SQL query. You have to learn how to use SQL to use SQL.

      • vishnugupta 12 hours ago

        > You have to learn how to use SQL to use SQL.

        As with any other tool, one has to learn it to use it effectively. Some find the learning curve not worth it and stick with Excel, which is OK. But the thing is, even Excel has to be learned to make full use of its potential.

        • randomdata 12 hours ago

          Keep in mind that the context is around domain experts being able to transcribe their domain knowledge into a machine-understandable language without concern for the intricacies of the machine it is executed on. That is where COBOL and SQL are said to have failed to live up to the hype, and I'd agree. SQL is not a particularly good abstraction. Even for relatively trivial tasks, you still need to understand how computers work. EXPLAIN is the bread and butter of SQL users.

          Ultimately every abstraction is leaky. There will never be a solution where you never need to understand how computers work under all circumstances. But my impression is that you can go a lot further in Excel before the stuff going on behind the scenes starts to get in your way? From what I have seen, Excel itself is more likely to get in your way before not knowing how computers work does.

  • froh a day ago

    > The problem is that the average person doesn't know how to explain & solve a problem in sufficient detail to get a working solution.

    I intuit this also is an intrinsic limit to LLM based approaches to "you don't need them expensive programmers no more"

    with LLMs magically "generating the solution" you move the responsibility for concise expression of the problem up the ladder.

    and then you "program" in prompts, reviewing the LLM-proposed formalization ("code").

    In other words, the nature of "programming" changes to prompt engineering. alas you still have to understand formal languages (code)...

    so there'll always be plenty to do for humans who can "math" :-)

    • mmcdermott 19 hours ago

      There is a disconnect somewhere. When I read online, I hear about how GenAI/LLMs replace programmers and office workers. When I go to work, I mostly hear the question of how we can apply GenAI/LLMs, apart from discussion of the general buzz.

      Maybe this is a reflection of local conditions, I'm not sure, but it doesn't seem like the truly revolutionary changes require the solution to find a problem. It was immediately clear what you could do with assembly line automation, or the motor car, or the printing press.

      • eru 15 hours ago

        Electricity famously took perhaps twenty years for people to slowly figure out how to re-organise factories around it. Hence the delayed impact on productivity figures.

        To elaborate: in the bad old days you had one big engine, eg a steam engine, that was driving shafts and belts all around the factory. There was a lot of friction, and this was dangerous. So you had to carefully design your factory around these constraints. That's the era of multi-story factories: you used the third dimension to cram more workstations closer to your prime mover.

        With electricity, even if you have to make your own, you just need cables and you can install small electric motors for every task on every workstation. Now your factory layout becomes a lot more flexible, and you can optimise for eg material flow through your factory and for cost. That's when factories became mostly sprawling one-story buildings.

        I simplify, but figuring all of that out took time.

        • conradev 15 hours ago

          a quote from Steve Jobs, explaining that the breakthrough invention was the fractional horsepower motor:

          "Let’s look at the brief history of computers. Best way to understand it’s probably an analogy. Take the electric motor. The electric motor was first invented in the late 1800s. And when it was first invented, it was only possible to build a very, very large one, which meant that it could only be cost-justified for very large applications. And therefore electric motors did not proliferate very fast at all.

          But the next breakthrough was when somebody took one of these large electric motors and they ran a shaft through the middle of a factory and, through a series of belts and pulleys, brought…shared the horsepower of this one large electric motor on 15 or 20 medium-size workstations, thereby allowing one electric motor to be cost-justified on some medium-scale tasks. And electric motors proliferated even further then.

          But the real breakthrough was the invention of the fractional-horsepower electric motor. We could then bring the horsepower directly to where it was needed and cost-justified it on a totally individual application. And I think there’s about 55 or so fractional-horsepower motors now in every household."

        • mmcdermott 13 hours ago

          Adoption takes time, for sure, especially when dealing with fixed assets like a factory. The difference I'm poking at is that electricity had a clear value proposition and improved over time. I see people looking for the value proposition in GenAI/LLMs, which brings me to the original question.

          If GenAI now was like early electricity, we would know what we wanted to use it for, even if we weren't there yet. That isn't what it looks like to me, but I'd be curious to know if that's just where I'm sitting, metaphorically speaking.

          Every company I have worked for had more work than hands for programming and other knowledge work. Capacity is valuable. Does anyone here see GenAI teams being spun up for "management" by a human? Or do we see fancy Google search / code completion?

          • eru 5 hours ago

            > Adoption takes time, for sure, especially when dealing with fixed assets like a factory.

            I was talking about the need to re-imagine and re-organise how factories work, not about the physical factories themselves. So it's more like a 'software' problem.

            > Does anyone here see GenAI teams being spun up for "management" by a human? Or do we see fancy Google search / code completion?

            How would the two cases look different? If you have a human worker that uses GenAI to help her complete tasks (via something like fancy auto-completion of text, code etc) that previously took a whole human team, that's exactly what 'spinning up a team of GenAI for management by a human' would look like, wouldn't it?

            It's just our framing that's different, and perhaps who that human is: you take someone who's familiar with the actual work and give her the tools to be faster, instead of taking someone who's more familiar with the meta-level work of managing humans.

            I suspect that's because managing humans is a rather specialised skill in the grand scheme of things, and one that doesn't help much with telling GenAI what to do. (And, human managers are more expensive per hour than individual contributors.)

            ---

            In any case, I agree that GenAI at the moment is still too immature to be trusted with much on its own. I hope more globally optimising AI like AlphaGo etc comes back in style, instead of essentially 'greedy' contemporary GenAI that just produces one token after another.

      • jerf 14 hours ago

        I think it's a lot of little things. There's a lot of people very motivated to keep presenting not just AI in general, but the AI we have in hand right now as the next big thing. We've got literally trillions of dollars of wealth tied up in that being maintained right now. It's a great news article to get eyeballs in an attention economy. The prospect of the monetary savings has the asset-owning class salivating.

        But I think a more subtle, harder-to-see aspect, that may well be bigger than all those forces, is a general underestimation of how often the problem is knowing what to do rather than how. "How" factors in, certainly, in various complicated ways. But "what" is the complicated thing.

        And I suspect that's what will actually gas out this current AI binge. It isn't just that they don't know "what"... it's that they can in many cases make it harder to learn "what" because the user is so busy with "how". That classic movie quote "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should" may take on a new dimension of meaning in an AI era. You were so concerned with how to do the task and letting the computer do all the thinking you didn't consider whether that's what you should be doing at all.

        Also, I'm sure a lot of people will read this as me claiming AI can't learn what to do. Actually, no, I don't claim that. I'm talking about the humans here. Even if AI can get better at "what", if humans get too used to not thinking about it and don't even use the AI tool properly, AI is a long way from being able to fill in that deficit.

    • devjab a day ago

      This is only true to an extent. We have a lot of digitally inclined workers who're developing programs or scripts to handle a lot of things for them. It's imperfect and often wildly insecure and inefficient, but unlike any previous no-code or "standard" solution it actually works. Often in conjunction with "standard" solutions.

      On one hand you're correct in that there will always be a need for programmers. I really doubt there will be a great need for generalist programmers though. The one area that may survive is the people who're capable of transforming business needs and rules into code, which requires a social and analytical skillset for cooperating with non-tech people. You'll also see a demand for skilled programmers at scale and for embedded programming, but the giant workforce of generalist developers (and probably web developers, once Figma and similar let designers generate better code) is likely going to become much smaller in the coming decades.

      That is basically what the entire office workforce is facing. AI believers have been saying for years that AI would do to the office what robots did to the assembly line, but now it actually seems like they're going to be correct.

      • stroupwaffle 20 hours ago

        Another parallel is type foundries and printing presses. At one point people operated these linotype machines, which used molten lead. Of course this transitioned to phototypesetting which, to the dismay of everyone, had poor results. Along came Donald Knuth and TeX to fix those deficiencies. NOTE: mechanical printing has a profoundly better result no matter what. It is the ink and the impression in paper that makes it superior (for letterforms and such).

        So, if AI follows suit, we will witness the dumb (but very knowledgeable) AI start to supplant workers with questionable results; and then someone (or a team) will make a discovery to take it to the limit and it’ll be game over for large swaths of jobs.

        https://en.m.wikipedia.org/wiki/Hot_metal_typesetting

        • Animats 19 hours ago

          TeX was predated by a family of macro-based document languages that began with RUNOFF and continued through roff, nroff, troff, ditroff, and groff. Plus tbl, eqn, and mm*. Some manual pages still use that stuff. Most Linux systems still install it by default. TeX has roughly the same concept, a macro-based layout language, but a better design with far less cruft.

    • adamc 16 hours ago

      Not to mention someone would need to evaluate and test the proposed solution... which, with today's LLMs, I would not bet heavily on being correct.

      This was some years ago, but a friend of mine, trained in a 4GL that was still a procedural programming language, went somewhere that was using higher-level, model-based generation of code based on that language. It turned out they still needed a few people who understood how things worked beneath the hood.

      I am deeply skeptical that human-language level specifications will ever capture all the things that really need to be said for programming, any more than they do for mathematics. There are reasons for formalisms. English is slippery.

    • jasfi a day ago

      A lot of business people want to get something functional that they can sell, and hire a programmer if/when they can afford one. That niche is seeing a lot of uptake with regards to LLM based approaches.

      This works for them because an MVP typically isn't a lot of code for what they need, and LLMs have a limited scope within which they can generate something that works.

    • immibis 19 hours ago

      In fact we've been using programming LLMs for a long time, which we call compilers.

      • mannykannot 16 hours ago

        The acronym LLM stands for what is now a term of art for a class of information-processing systems which are produced by, and themselves produce their output by, methods very unlike those for compilers. This is just as well, considering the consequences that would follow from compilers routinely hallucinating.

  • unscaled a day ago

    4GLs were supposed to be even more of that, with more "human-language-like" constructs added to the language to deal with things besides general logic, simple data structures and arithmetic.

    The author mentions "4GLs" were all the rage in the early 1990s, but I doubt that that was true outside of the mainframe world. The 4GL movement, as a conscious movement, seems to have always been highly mainframe-oriented (the Wikipedia article mentions reducing the number of punched cards necessary for a program as an initial goal). By the 1990s you could categorize many languages as 4GL, but I doubt this term was used with any enthusiasm outside of the mainframe world. It was the opposite of a buzzword.

    1992 wasn't too long ago. Linus Torvalds had already released Linux, and Guido van Rossum was already working on Python. Perl was already gaining popularity, and Haskell had also seen its first versions released. The forefront of technology was already shifting from expensive workstations to consumer-grade PCs, and language designers gave little thought to 4GL concepts, even when they happened to design something that could qualify as a 4GL for personal computers (e.g. dBase, HyperTalk, AppleScript).

    I agree that human-like text is a bad idea for most use cases of programming, but I think this is not why the 4GL movement failed, and in fact most 4GLs weren't more "natural-language-like" than the 3GL COBOL. I think the main problem was that the 4GL movement never really defined a new generation, or anything useful at all. The previously defined generations of languages introduced revolutionary changes: translation from friendlier assembly language to machine code (2GL) and compilation (3GL). The only change we can properly derive from the loose definition of 4GL is "put more features that used to be external routines or libraries directly into the language".

    This approach worked out relatively well when the language was domain-specific. This is how we got some of the most successful 4GLs, like SQL, R and MATLAB. These languages bake data tables, statistics and linear algebra directly into their syntax and became very useful in their own niches. The concept of a general-purpose 4GL, on the other hand, was always destined to boil down to an overly bloated language.

    • int_19h a day ago

      dBase and its numerous descendants and competitors (FoxPro, Clipper etc) were extremely popular for line-of-business desktop applications in the 90s. And, yes, they are indeed traditionally categorized as 4GLs - and, given how nebulous the definition always has been anyway, I think that "traditionally categorized" is the most practical definition that you can use here.

      But, yes, I agree that aside from the generally more verbose and sometimes unwieldy syntax, there wasn't really that much to it in practice. I did work with FoxPro, and the reason why it was popular was not because you had to write things like "ACTIVATE WINDOW", but because it had many things baked directly into the language that nicely covered all the common tasks a pre-SQL data-centric app would need - e.g. a loop that could iterate directly over a table.

      • farrelle25 19 hours ago

        Gosh, it's a long time since I heard 'Clipper' mentioned. I used to do 'PC' apps for banks in the early 90s. Turbo Pascal and Clipper were popular with us. (We used PL/1 rather than COBOL for batch processing.)

        Then VB 4.0 started to get popular around 1996 and ruled the roost...

        So many technologies... does anyone remember 'SUPRA' from that era? (I think it was supposed to be a 4GL language/interface for mainframe databases.)

        • El_RIDO 17 hours ago

          Sigh. I work at a company that not long ago added support for applications written against SUPRA to its portfolio. It's not dead yet; there are companies out there still running it in production and willing to spend money to replace it, while keeping their business logic.

          • adamc 16 hours ago

            Where I work we still use Software AG's Natural for mainframe programming. It's not really a bad language for what it is (very much focused on database programming). The main limitation is that they never created or provided great mechanisms for something like a standard library, so we do a lot in Python now, and occasionally other languages.

            From my perspective, the standard libraries of languages like Python and Java, as well as effective package managers such as pip or npm or cargo, have raised the bar so high that it is difficult for old, specialist languages to compete in most cases. Although the security problems of the package managers give me some pause.

      • Delphiza 17 hours ago

        That class of software also allowed for very efficient data capture against normalised tables. I recall Paradox for DOS (something I haven't thought of for a while), as early as about 1990, offering really simple tools for creating one-to-many database capture 'forms' (with selection boxes, date drop-downs, the lot). The richness of form design and tight coupling to the database meant that the language did not need to be very powerful and could just run as a script on top of a rich database environment. The PC-based successor to mainframe 4GL concepts was the late-nineties RAD (Rapid Application Development) of Delphi and VB. MS Access was the Windows successor to those tools and was wildly successful as a way for 'business people' to build apps. It took many years for low-level Windows app development or the web to catch up to that richness, and they have never really achieved the same level of non-programmer usability.

        • int_19h 12 hours ago

          Yep, and C# (or VB.NET) + WinForms sort of carried that torch well into the aughts. You can still see traces of that all over classic .NET - stuff like DataSet and stock widgets designed specifically for those kinds of CRUD apps such as BindingNavigator.

          It's interesting that we have largely abandoned this approach to software development despite its amazing productivity in that niche. I guess a large part of it is because custom-made software is much less common in general than it used to be.

        • gopher_space 14 hours ago

          In my mind ‘low-code’ was perfected in FileMaker Pro and then quietly abandoned because you still needed an interest in the subject to use it.

    • cmiles74 16 hours ago

      I have to disagree with 4GL languages being aimed at the big-iron, mainframe world. After browsing the Wikipedia page, there seems to be some confusion around what a 4GL would actually be... For instance, RPG is lumped into this category despite it functioning at a pretty low level and predating the idea by about 30 years. When I first started working with RPG we had worksheets from IBM that felt reminiscent of punch cards.

      In my experience, most 4GL languages were aimed at microcomputers and did reasonably well. Others have mentioned FoxPro and dBase; 4D and FileMaker also slot nicely into this category. IMHO, they had great success in the back office of small businesses.

      I have seen some effort to force SQL into this category, perhaps with the idea that a SQL database with stored procedures technically meets the 4GL definition.

  • AdieuToLogic a day ago

    > COBOL's promise was that it was human-like text, so we wouldn't need programmers anymore. A lot like "low code" platforms, and now LLM generated code.

    The more things change, the more they are the same.

  • bloppe a day ago

    Even LLMs have not realized the dream of a natural language computer interface. Everyone who uses them significantly has to read up on prompt engineering and add little things like "explain your steps" or "describe it like I'm 5" or other oddly specific sequences of characters to get the results they want. That's not natural language. It's a DSL.

    • rbanffy a day ago

      Worse. It’s a DSL without a formal specification. You are writing prompts blindly in hopes they trigger the desired behaviour from the LLM.

      A bit like writing enchantments to force demons to do your bidding.

      • namaria 19 hours ago

          Worse. You're just providing tokens which will get values you don't know or can't predict attached to them, and trying to influence values which will produce tokens based on rules you don't know, which change all the time for reasons you also don't know. Hopefully the tokens you get are the ones you're hoping for, and if you don't have complete mastery of the subject you won't know if they are the tokens you need, or whether you can even trust their meaning.

        • anthk 18 hours ago

          Z Machine text adventures are far more predictable...

      • ykonstant 19 hours ago

        >A bit like writing enchantments to force demons to do your bidding.

        But without the cool chanting and poetic language; just like cyberpunk was realized without the vivid imagery and neon lights :(

        • rbanffy 19 hours ago

          > just like cyberpunk was realized without the vivid imagery and neon lights :(

          The 21st century never ceases to disappoint. It’s a cheap, low budget and dystopian version of what we imagined.

          • NoGravitas 17 hours ago

            To be fair, what we imagined was dystopian, too. It's just that some people with a lot of ambition and not much media literacy didn't realize it was dystopian, and set about to build that future.

            • rbanffy 15 hours ago

              I am from the Thunderbirds generation. It wasn't a perfect 21st century, but at least it was cool. And the soundtrack was excellent.

            • zknow 16 hours ago

              could have been high budget dystopian at least!

              • TheOtherHobbes 14 hours ago

                The budget is extremely high, but very unevenly distributed.

                • rbanffy 14 hours ago

                  This is why I love Hacker News so much.

  • UniverseHacker a day ago

    > we wouldn't need programmers anymore

    This blows my mind, since it seems like a fairly low level/terse language compared to more modern domain specific languages.

    But in some sense they were dead right... since (I assume) what "programming" meant at the time was being able to write raw machine code by hand on paper, and have it work - something few people can or need to do nowadays

    • AdieuToLogic a day ago

      > This blows my mind, since it seems like a fairly low level/terse language compared to more modern domain specific languages.

      I have heard others and myself describe COBOL in many ways, most involving creative expletive phraseology which would make a sailor blush, but "low level/terse language" is a new one to me.

      > But in some sense they were dead right... since (I assume) that what "programming" meant at the time was being able to write raw machine code by hand on paper ...

      LISP and Fortran predate COBOL IIRC.

      • UniverseHacker 14 hours ago

        > LISP and Fortran predate COBOL IIRC

        I didn't mean to imply COBOL was anything close to the first programming language, only that I was speculating about what 'programming' generally meant within computer culture at the time. I was not around at that time, but I strongly suspect that directly writing machine code and/or assembly was still common practice throughout the entire 1950s, whereas it is far less common nowadays.

        I wonder what year Fortran overtook assembly and became the most popular programming language during that era? I suspect it was well after COBOL came out. Surely there is a lag time for any new programming language to become commonplace.

        I couldn't find any data on that, but I was able to find that C was released in 1972, took until 1982 to overtake Fortran, and until 1985 to overtake Pascal. I often forget how slowly new things propagated through the culture in pre-internet times.

      • andsoitis a day ago

        > LISP and Fortran predate COBOL IIRC.

        Correct. Fortran, LISP, and COBOL were invented in ‘57, ‘58, and ‘59, respectively.

        • jll29 a day ago

          > Yes, but the ideas behind COBOL were older still. Flowmatic, COBOL’s predecessor, dates back to 1955, so it really just depends how you count.

          Yes, but the ideas behind LISP were older still: Church's lambda calculus was conceived in 1936.

        • moomin a day ago

          Yes, but the ideas behind COBOL were older still. Flowmatic, COBOL’s predecessor, dates back to 1955, so it really just depends how you count.

    • thorin 20 hours ago

      Saying we don't need "programmers" any more was true when a programmer was someone who used very low-level languages such as assembly and had probably used punched cards in the past. Languages like COBOL / Fortran / PL/SQL gave analysts a chance of designing things on paper and handing off to developers, or even doing the development themselves, which couldn't have happened before. Using something like Python these days feels like the kind of thing that would have been thought of as a 4GL back then, for some use cases. However, Python also works as a general-purpose programming language.

    • electroly a day ago

      Do you mean something other than "terse" here? Or are you perhaps thinking of a different language? I cannot possibly imagine that adjective being used to describe COBOL. It is the #1 textbook example of a verbose language--the opposite of terse.
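
      For the record, the textbook illustration (a from-memory sketch, untested; field names invented, roughly GnuCOBOL free format): a one-liner like "total = price * qty; print total" becomes a stack of English sentences:

          IDENTIFICATION DIVISION.
          PROGRAM-ID. VERBOSE.
          DATA DIVISION.
          WORKING-STORAGE SECTION.
          01  PRICE          PIC 9(3)V99 VALUE 19.99.
          01  QUANTITY       PIC 9(3)    VALUE 3.
          01  LINE-TOTAL     PIC 9(5)V99.
          01  LINE-TOTAL-ED  PIC ZZZZ9.99.
          PROCEDURE DIVISION.
              MULTIPLY PRICE BY QUANTITY GIVING LINE-TOTAL
              MOVE LINE-TOTAL TO LINE-TOTAL-ED  *> even printing needs an edited field
              DISPLAY "TOTAL: " LINE-TOTAL-ED
              STOP RUN.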

      • UniverseHacker a day ago

        What I mean is that it is an attempt to make a high-level domain-specific language, but it is not by modern standards

  • systems_glitch 19 hours ago

    This is often the exact reason I give to people when they don't understand why Amy and I don't think "everyone should be a 'coder!'" should be pushed in schools, heh.

    When you're graduating students from high school who go into college as engineering hopefuls but can't solve X - 2 = 0 for X, what hope does the average individual have for solving programming problems?

  • wvenable a day ago

    My company is finally upgrading away from a product that is written in a 4GL language. This product probably started out on a Unix but was ported to Windows decades ago. It has both a web and classic VB front ends.

    All the source code is available and theoretically I could make changes and compile it up. The language itself is basically just plain procedural code but with SQL mixed right in -- somewhat like dBase or FoxPro but worse. I think the compiler produces C code which is then compiled with a C compiler, but it's been a while since I looked into it. It requires a version of KornShell for Windows as well.

  • agumonkey 19 hours ago

    I spent very little time in the COBOL world, but what I got is that its use outgrew its original design (batch processing of not-too-complex tables/rows) many times over. Whenever you start to need complicated state machines and abstractions, the language will implode.

  • actionfromafar a day ago

    Vision 4GL. Like VB but cross platform and with a horribly unstable IDE which would corrupt the source code. (Which was in some kind of binary format not amenable to source control.)

norir a day ago

I think of scala in this context. I think that scala is basically dead at this point in the way that COBOL was framed in the article. Yes, there are still many businesses/services that have critical components written in scala but current mindshare has cratered for new projects. I only single out scala because I have spent a lot of time with it and have seen it go through the hype cycle (in 2012-14 it seemed like I was constantly seeing "doing $X in scala" pieces on HN and I almost never see it referenced here anymore). It's probably a natural and inevitable phenomenon (and a bit of a shame because scala did get some things right that other mainstream languages still have not).

  • guessmyname a day ago

    I know a couple of engineering teams at Apple that are working on new projects in Scala, while also maintaining some legacy systems. Some of these projects are quite critical to the company’s ecosystem, e.g. test systems. I’ve spoken with several engineers who helped create these systems years ago; they’re all now in senior management positions. Some still stand by the technology choices made back then, while others are more open when given a chance to reflect. The general consensus is that if Kotlin had been available at the time, or if Swift had been a viable option for back-end services, they definitely wouldn’t have chosen Scala for those projects.

    • emmelaich a day ago

      Surprised they don't use Swift. Or is that too unstable? Or is there an on-JVM requirement?

      • worthless-trash a day ago

        My money is that they started these projects before Swift was available on Linux.

        I have no evidence to say that Apple uses Linux, but businesses gotta business, so it isn't a big bet to make.

        • swiftcoder a day ago

          Even if the servers run macOS, Swift wasn't really being aimed at backend use cases for the first few years of its existence...

        • mozman a day ago

          Apple is BSD based - not Linux.

          • scottlamb a day ago

            Apple is a company, not an operating system. The parent is almost certainly aware macOS is BSD-based and is suggesting Apple also uses Linux in e.g. cloud deployments. They are of course correct.

            • randomdata 10 hours ago

              > The parent is almost certainly aware macOS is BSD-based

              Doubtful. Surely they would know macOS is XNU-based?

              • worthless-trash 2 hours ago

                I am very aware of the situation of what Mac OS and Linux kernels and userspace are.

                I work for Red Hat in the kernel maintenance team.

                Edit: I just realised you were doubting the correctness that it was BSD, not that I knew.

    • andsoitis a day ago

      > The general consensus is that if Kotlin had been available at the time, or if Swift had been a viable option for back-end services, they definitely wouldn’t have chosen Scala for those projects.

      But they were not.

  • bad_user a day ago

    Scala is very much alive and kicking.

    https://redmonk.com/sogrady/2024/09/12/language-rankings-6-2...

    The initial hype has died off and that's OK. The hype cycle is inevitable for all languages. Also, predictions rarely happen, mostly because the landscape has changed. Mainstream programming languages can no longer die like COBOL did.

    E.g., Java has been dying ever since 2001, surviving the dotcom bubble, .NET, the P in LAMP, Ruby, JS, and Go. Python was supposed to die on its version 3 migration, with people supposedly moving to Ruby.

    FWIW, Scala is the world's most popular FP language, it has good tooling and libraries, and Scala 3 is a wonderful upgrade.

    • norir a day ago

      I spent 3 years working on scala tooling in my free time. One of my libraries is used by the vast majority of scala users (it is a dependency of other widely used tools). There was growth from 2018-2023 but it has flatlined over the last year. Right when theoretically it should be getting the boost from scala 3 having now been stable for a bit.

      Personally I feel that scala has too much in the language and the compiler is too slow. The tooling is pretty good but it is finicky and ends up getting slow and/or unreliable with larger projects. Even if I were to restrict myself to a small subset of scala, I would still be unsatisfied with the long compile times which was the primary reason I decided to move on.

      I don't know if I agree with your contention that languages can't die like COBOL. I think you can relatively easily keep a legacy scala system up, put it in maintenance mode and write new features/products in something else. That is what I expect is already happening with scala and that this trend is likely to accelerate. Keep in mind also that Martin Odersky is nearing retirement age and it's really hard to imagine scala without him. He has much more power/control than the head of most languages.

      • bad_user a day ago

        IMO, there's rarely such a thing as maintenance mode. Projects constantly evolve, and in turn this drives more investment in tooling and the ecosystem needed to keep those projects up. And this investment is what eventually drives more new projects and fresh blood, keeping the language fresh and in demand.

        Again, look at Java.

        Ofc, there's always the question of what happens with a market that isn't constantly growing due to the zero-interest-rate phenomenon. I guess we'll see, but IMO, that's problematic for newer languages, not established ones.

        I too am a contributor to very popular libraries and am very familiar with the ecosystem. One thing to keep in mind is that the language's culture has evolved. When I picked up Scala, back in 2010, the Future pattern and Future-driven libraries were all the rage. Whereas nowadays people prefer alternatives, which now include blocking I/O (Loom), with Future-driven libs being a risk going forward.

        • int_19h a day ago

          I don't think many people would describe Java as "fresh" these days. In demand, sure, but this is overwhelmingly driven by existing large enterprise codebases. Also, for all the talk about nifty new features, how much stuff is still on v11 even?

          • bad_user 21 hours ago

            I understand where your reply is coming from, but again, I was reading the same opinions about Java more than two decades ago.

            > overwhelmingly driven by existing large enterprise codebases

            That happens with all mainstream languages, but it's a feedback cycle. The more popular a language is (in large enterprise codebases), the more it will get used in new projects, for obvious reasons. People want to get shit done and to have good ROI and maintenance costs. Therefore, the availability of documentation, tooling, libraries, and developers helps, in large and small projects alike.

            And yes, Java is quite fresh, IMO.

            • randomdata 10 hours ago

              > The more popular a language is (in large enterprise codebases), the more it will get used in new projects

              It seems to me the more popular a language, the more poorly written libraries are found in it, which soon starts to draw people away from what is popular to a new language that has a limited library ecosystem thinking they can fix the mistakes they saw last time and make a name for themselves in the process. Lather, rinse, repeat.

            • int_19h 12 hours ago

              Two decades ago was 2004; that would be when Java first shipped generics. I remember those times and I don't think the sentiment was similar then. People certainly had many complaints about Java, and more specifically about some elements of the stack such as EJB, but as a whole I don't recall it being predominantly seen as "legacy" back then the way it is now.

          • baud147258 20 hours ago

            > how much stuff is still on v11 even?

            We've had a potential client ask for a PoC in Java 8, to integrate with their current system... But yeah, our product is deployed with Java 11 and since some dependencies have issues with 18, we'll likely stay that way for a few more years

          • surgical_fire 21 hours ago

            I've been a Java developer for nearly two decades, in multiple companies, despite being proficient with other languages. Java just happened to pay better.

            Nearly all companies I worked for were developing new systems, tools, etc. Rarely was I doing maintenance on "existing larger enterprise systems".

  • bigger_cheese a day ago

    I think Perl today is probably closer to COBOL: it was massive for a time; it felt like it was everywhere.

    Nowadays it is increasingly niche. Like COBOL there is still a lot of perl code out in the wild.

    • bigiain a day ago

      Perl footgunned itself with the Perl5/Perl6/Raku debacle and almost two decades between major releases.

      I wrote a _lot_ of Perl, starting with Perl4 cgi scripts in the mid 90s, then Perl5 and FastCGI and Apache ModPerl. I loved it as a language. But by the time I left that gig in 2008, nobody wanted Perl any more. I mostly drifted around PHP, Python, Ruby, and Javascript for a few years until moving away from full time coding and up (sideways?) into leadership and mentoring roles.

      Interestingly I got _into_ the Perl gig when I bailed on a COBOL maintenance gig where it was clear nobody was at all interested in listening to how their 10+ year old custom COBOL warehouse management app (written by the company the boss's sister used to own), running on EOLed Wang minicomputers, was completely incapable of dealing with 4-digit dates for Y2K. I jumped ship to make that somebody else's problem.

      • codr7 9 hours ago

        I lost interest long before Raku, Parrot was a wild ride :)

    • chihuahua 12 hours ago

      When I worked for ZipRecruiter in '22-'23, much of their codebase was Perl. Pretty mind-boggling. There were even people working on it who would construct arguments for why it was a perfectly reasonable language for ongoing software development. But some VP put his foot down and said "no more new projects in Perl!" and they started using Go for new projects.

    • enriquto a day ago

      > Nowadays it is increasingly niche.

      Still, if you buy a brand new Mac today, most of the executable scripts in the system are written in Perl.

      You can check it yourself by running:

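          # count /bin and /usr/bin entries by the first word of their file(1) type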
          file -bL /bin/* /usr/bin/* | cut -d' ' -f1 | sort | uniq -c | sort -n
      
      As of 2024, macOS is essentially a Perl operation.

      • rightbyte 20 hours ago

        The bad thing with using a proper language like Perl for admin scripts is that they will degenerate into programs.

        The good thing with Bash etc. is that they are so bad you won't, and when you do it anyway, at least you get some whip lashes for it.

        • layer8 11 hours ago

          The problem is that Bash also isn’t really great for admin scripts.

        • anthk 18 hours ago

          There are good programs written in Perl: PangZero, Synaptic, most of Debian's config tools...

    • xarope a day ago

      time to brush up my perl. Requires some zen'ess and flow time to grok the $@%[]{} again...

      • kevindamm a day ago

        It's been over a decade for me, but I remember the % for hashes (tables) and @ for arrays not being that hard to decipher; it was the "where is this $_ referring to at this point?" kind of puzzles that would stump me, especially when pattern matching implicitly uses it too.

        • skipkey 17 hours ago

          A pretty good rule of thumb was always, $_ is most likely the thing you need it to be. Oh, there are exceptions, but in my experience they were generally caused by code that had been made inappropriately clever.

  • mcv a day ago

    Scala can never be dead like COBOL because it has never been alive like COBOL was. I love it too, but Scala has always been fringe. COBOL was everywhere.

  • alfalfasprout a day ago

    What about Spark? Given the incredible adoption of Spark across the industry, I don't see Scala going away anytime soon.

    • SOLAR_FIELDS a day ago

      Probably PySpark and similar connectors are robust enough now that they are not necessarily joined at the hip like they were 10 years ago. If you were working in Spark at the peak of its hype cycle around then, you basically had to use Scala to at least some extent - even if it was simply a core team exposing native APIs in other languages - since it was the most native approach that exposed all the APIs you needed. Nowadays other languages and wrappers have probably caught up enough that using Scala is not the absolute requirement it was before.

      • tdeck a day ago

        This is very true in my experience. I worked in Spark for 3 years and never touched Scala code. I imagine there are many people using Spark who don't even know it's written in Scala, or whose only interaction with Scala is accidentally stumbling on Scala Spark documentation when you were meaning to Google for PySpark.

  • darksaints a day ago

    It's a shame too. Scala3 is actually an amazing language, and has the best type system out of all of the common functional languages. Part of me wonders if Scala would still have died off if Scala3 came out first.

    • xiaodai 19 hours ago

      OCaml, F# and Haskell

  • 7thaccount a day ago

    I assume it became less popular when Java became more bearable.

    • n_plus_1_acc a day ago

      And Kotlin came around with great IDE support, and with good features without the complexity of Scala

      • lol768 19 hours ago

        It's hard to overstate how much better the Kotlin IDE support is vs Scala. In terms of reliability, the Scala IntelliJ plugin really seemed to go downhill for me with Scala 3, which was a real shame.

  • Lance_ET_Compte a day ago

    Scala is the basis for Chisel HDL, which is widely used in the RISC-V design community.

  • jackcviers3 a day ago

    We use it for all new services at Writer. Jack Henry, SiriusXM, Starbucks, Disney streaming services, and Capital One all have services (not data-science) divisions producing new projects in Scala, ranging from the last five years to today.

    There are many others, of course, but those are the teams at places people have heard of, off the top of my head. It's far from dead.

    What does seem to be dying are the framework-centric Play and Akka shops, and the non-Airflow raw Spark jobs out there. Now, a lot of that is because they were framework jobs that happened to originate in the Scala ecosystem - Scala was largely incidental and was chosen because of founding project members' preferences or due to the need to develop a commercial market, imho.

    • BirAdam a day ago

      That’s precisely why people think it died. It became stable and therefore useful. It is therefore not talked about every 3 seconds by startup founders.

    • mozman a day ago

      As a fellow vendor for one of those names you dropped - I recommend you not to name any companies.

      • wholinator2 a day ago

        Why? Googling every name given returns public job postings for (senior) scala engineers. Presumably scala divisions at these companies are public knowledge?

  • xiaodai 19 hours ago

    > scala did get some things right that other mainstream languages still have not

    Examples?

  • mhh__ a day ago

    Scala3 looks fairly interesting.

    The problem however is that I can't be bothered to roll out a JDK, and secondly if I did it might encourage someone else to start writing Java again internally. Risky payoff...

tombert a day ago

You know, one of these days I really need to sit down and play with some of these "legacy" languages, like Fortran or COBOL or Ada or APL; languages that have certainly fallen out of popularity but are still used in some critical places.

It does make me wonder about millions and millions of lines of Java out there; Java has more or less eaten the enterprise space (for better or worse), but is there any reason to think that in 30-40 years the only people writing Java will be retirees maintaining old banking systems?

  • Muromec a day ago

    COBOL is still there not because of COBOL itself, but because of vendor and platform lock-in. And, I guess, because of having a monolithic codebase/platform.

    it’s not even esoteric and difficult, just a lot of it without much structure visible to you.

    • danielmarkbruce a day ago

      This is what people miss about COBOL. It's not like people are compiling COBOL and running it on Linux on an x86 box. They are running it on legacy operating systems (and hardware) which provide a different set of underlying services. It's a whole different planet.

      • crackez a day ago

        Negativo friendo.

        The mainframe is turning into a middleware layer running on Enterprise Linux. We've containerized the mainframe at this point, and I mean that directly - e.g. JCL and multiple CICS regions, all in COBOL that originated on z/OS, now running in k8s on amd64.

        • kjellsbells a day ago

          I hope you're right, but many comments here on HN suggest their experience with mainframes is very different. z/OS and its predecessors provided so many services completely transparently to the application that a mainframe to modernity migration is doomed to fail unless it can completely emulate (or design around) the capabilities provided by the OS and other subsystems.

          Even ignoring the needs of the super high end customers like banks (eg, cpus in lockstep for redundancy), being able to write your app and just know that inter-node message passing is guaranteed, storage I/O calls are guaranteed, failover and transaction processing is guaranteed, just raises the bar for any contender.

          K8s is wonderful. Can it make all the above happen? Well, yes, given effort. If I'm the CTO of an airline, do I want to shell out money to make it happen, risk it blowing up in my face, or should I just pay IBM to keep the lights on, kick the can down the road, and divert precious capital to something with a more obvious ROI? I think their "no disasters on my watch/self preservation" instinct kicks in, and I can't really blame them.

          HN thread:

          https://news.ycombinator.com/item?id=36846195

          • Spooky23 a day ago

            Like anything else, some places are awesome, some not. I’ve seen both. The worst ones are just like modern places with overcustomized PeopleSoft or SAP - except the blobs of off the shelf software were purchased 30 years ago by people long dead.

            Other places stopped development 20 years ago and surrounded the mainframe with now legacy middleware. A lot of the “COBOL” problems with unemployment systems during COVID were actually legacy Java crap from the early 2000s that sat between the mainframe and users.

          • Muromec a day ago

            >If I'm the CTO of an airline, do I want to shell out money to make it happen, risk it blowing up in my face, or should I just pay IBM to keep the lights on

            But that's the thing, we are at the point when "keep paying IBM" isn't the acceptable answer anymore.

          • zifpanachr23 13 hours ago

            I work on them full time (not doing application programming and so I can't really speak to COBOL) but this is mostly accurate as it relates to the environment.

            A lot of these services are completely transparent to the application, but that doesn't mean they are totally transparent to the entire programming staff. The system configuration and programming is, all things considered, probably more complicated than something like k8s (and usually lower level; certainly YAML hasn't really caught on in the mainframe world outside of the Unix environment).

            So that's where a lot of the complications come into play. Every application migration is going to necessarily involve recreating in Kubernetes or some other distributed system a lot of those same automations and customizations that decades' worth of mainframe systems programmers have built up (many of whom will no longer be around). And however bad the COBOL labor shortage really is, the shortage of mainframe assembly programmers and personnel familiar with the ins and outs of the hardware and system configuration is 10x worse.

            It should also be noted that not everywhere that has a mainframe has this issue. There is a wide disparity between the most unwieldy shops and the shops that have done occasional migrations to new LPARs, cleaned up tech debt, and adopted new defaults as the operating system environments became more standardized over time. In the second case, where a shop has been following the more modern best practices and defaults and has fewer custom systems lying around, the amount of effort for a migration (but also, in a lot of ways, the motivation to take on a migration project) is lessened.

            The cases where a company is just absolutely desperate to "get off the mainframe" tend to be the cases where the tech debt has become unmanageable; the catch-22 being that these are also the cases where migrations are most likely to fail, for all of the reasons mentioned above.

          • kjs3 15 hours ago

            > I hope you're right, but many comments here on HN suggest their experience with mainframes is very different.

            HN is not the place to seek authoritative experience with something like COBOL.

        • accra4rx a day ago

          [I work as an SA.] There are many companies that don't have the original COBOL source code, only compiled objects that have been running for more than a few decades. How can you guarantee that those will run perfectly in k8s? Major companies can never take that risk unless you give them some insurance against failure.

        • Muromec a day ago

          There is a major drawback to this approach -- you need to have somebody who knows what they are doing. Total deal breaker in most of the places that have this problem in the first place.

          • gerdesj a day ago

            "you need to have somebody who knows what they are doing"

            That applies everywhere.

            Your parent comment has managed to stuff a mainframe in a container and suddenly hardware is no longer an issue. COBOL is well documented, so all good, and so too will be the OS they are emulating. I used to look after a System/36 and I remember a creaking book shelf.

            The code base may have some issues, but it will be well battle-tested due to age. It's COBOL, so it is legible and understandable, even by the cool kids.

            If you lack the skills to engage with something then, yes, there will be snags. If you are prepared to read specs and manuals and have some reasonable programming aptitude and so on, then you will be golden. No need for geniuses, just conscientious hard workers.

            It's not rocket science.

            • Muromec a day ago

              It's not the point I'm trying to make. Yes, you can do fancy stuff like that, and de-mainframing COBOL to run it on k8s is the path I would personally choose if I had to deal with it. It sounds like a lot of fun, and the sense of accomplishment in finally having it running should be great.

              The problem is -- it's very smart and unique, while organizations that have this kind of a problem don't want to depend on unique set of skills of a few highly capable individuals. Everything needs to be boring and people have to be replaceable.

              In this paradigm, vendor Java with AWS lock-in is a cost, but fancy in-house COBOL on k8s done by smart people is worse -- it's a risk.

            • SonOfLilit a day ago

              The need applies everywhere, the difficulty of fulfilling it tends to be an order of magnitude more in places that tend to run COBOL.

              I'm working at one. You wouldn't believe the stories.

        • mathgorges a day ago

          This is fascinating to me as an ex-mainframer that now works on a niche hyperscaler. I would love to learn more!

          Will you let me know some of the names in the space so that I can research more? Some cursory searching only brings up questionably relevant press releases from IBM.

          • crackez 18 hours ago

            Look up Micro Focus Enterprise Server and Enterprise Developer. They are now owned by Rocket.

            • danielmarkbruce 13 hours ago

              I second this, and I know some of the folks who work on Enterprise Server. Good people. They have a partnership of some sort with AWS, and there are a bunch of decent docs around Enterprise Server on AWS.

          • yourapostasy a day ago

            Sounds like they’re talking about running IBM Wazi on Red Hat OpenShift Virtualization. As far as I know, there isn’t a System z-on-a-container offering, like you install from a Helm Chart or comes to you from an OCI registry. If it is the IBM I know, it’s completely out of reach of most homelab’ers and hobbyists.

            IBM Wazi As A Service is supposed to be more affordable than the self hosted version and the Z Development and Test Environment (ZD&T) offering. ZD&T is around $5000 USD for the cheapest personal edition, so maybe around $2500-3500 USD per year?

        • danielmarkbruce a day ago

          Yup, but the COBOL application doesn't know you've done that.

      • Muromec a day ago

        A different kind of cloud, you could say.

        • danielmarkbruce a day ago

          ha yes. There is actually a pretty cool product made by a division of Rocket Software named "AMC"; it takes a COBOL app running on an IBM system and deploys it to a whole set of services on AWS. There are some smart dudes at that shop.

          • Muromec a day ago

            Doesn't surprise me at all, somebody out there should be smart enough to make good money on that and not be very loud about it either.

      • WesleyJohnson a day ago

        We're running RM/COBOL on RHEL8 VMs powered by VMware. I don't work with it, I'm in a different IT area, but our COBOL codebase supports the lion's share of our day-to-day operations.

  • 9659 a day ago

    Ada is an order of magnitude more modern and sophisticated than your other examples.

    I expect Ada will capture 0.05% of the market for the next 100 years.

    • johnnyjeans 16 hours ago

      Ada will probably go the way of the dodo as dependent types catch on. It's phenomenal how ahead of its time it was, and continues to be. Contracts are an absolute killer feature, and I see a lot of people who are otherwise very serious about memory safety scoff at logical safety, not understanding just how powerful that construct really is.

    • tombert a day ago

      Fair, I guess the list was “languages that I know were popular at one point but I don’t know anyone really using now”.

      Ada definitely does seem pretty cool from the little bit I have read about it. I’m not sure why it’s fallen by the wayside in favor of C and its derivatives.

      • johnnyjeans 15 hours ago

        It's easy to get lost in the modern way we look at compilers and toolchains, but it wasn't always like this. Free compilers basically didn't exist 30+ years ago. Certainly none of the free compilers were good. For the longest time, your only options for Ada compilers were priced at government contractor-levels (think $10k per seat... in the 80s). It's also an extremely complicated language, while C isn't. A single, moderately skilled programmer who can at least make their own FSM parser can write a reasonably complete C compiler in the space of a month. There's no hand-rolling your own Ada compiler. Even just complying with SPARK is a herculean task for a team of experts.

        This is much the same reason I'm highly skeptical of Rust as a replacement systems language to C. A multitude of very talented folk have been working on writing a second Rust compiler for years at this point. The simplicity and ease of bootstrapping C on any platform, without any special domain skills, was what made it absolutely killer. The LLVM promise of being easily ported just doesn't hold true. Making an LLVM backend is outrageously complicated in comparison to a rigid, non-optimizing C compiler, and it requires deep knowledge of how LLVM works in the first place.

        • 9659 7 hours ago

          If GNAT (the GNU Ada translator) from NYU had come out 5 years earlier, Ada might have caught on with the masses.

      • aidenn0 a day ago

        Ada was mandated by the DoD for a bit. My understanding is that, in practice, this involved making a half-hearted effort in Ada, failing and then applying for a variance to not use Ada.

        • hardburn a day ago

          I actually met a programmer who worked on military jets. According to her, Ada is only used these days on the older jets that were already programmed in it; she worked in C++.

          • greenavocado a day ago

            Military jets coded in C++. God help us all.

            • hardburn 6 hours ago

              Most aerospace stuff is. The thing is, they have reams of very specific rules about how it's coded, how to verify that code, how to verify the compiler of that code, and how to verify the code output from that compiler. It's not an easy process to replace, but it's proven reliable just by all the commercial planes flying every day without falling out of the sky.

              In theory, something like Rust could do the job instead, but they'd still have to verify the entire chain. Rust is for the rest of us to get something half as reliable as that while also being able to write more than two lines of code per day.

            • FpUser a day ago

              No need to be so dramatic. Shitheads will make software fail in any language. Memory "safety" will not help you correctly, and in a timely manner, calculate the position of flight controls, for example.

              • User23 a day ago

                One can write reliable software (and I mean airtight, good enough for medical devices and nuclear deterrence) in basically any even vaguely modern language (think Algol-60 or later). It's simply a matter of disciplined design and running on hardware that's sufficiently predictable.

          • 9659 a day ago

            Yes, this is true - mainly due to a perceived lack of Ada programmers on the market.

        • actionfromafar a day ago

          Often, I'm sure, but there are large code bases in Ada still. It's a shame, it looks like a really great language I would love. But it's a chicken and egg problem. If only Mozilla had decided on Ada instead of Rust! :-)

          • cyberax a day ago

            Ada doesn't offer any safety for dynamic memory. In fact, Ada is now adopting Rust's approach with the borrow checker.

            • actionfromafar a day ago

              Great! Time to jump on the Ada bandwagon then! ;)

    • 7thaccount a day ago

      Ada is pretty cool, but not sure if any more modern than APL. Both are actively maintained and useful in different areas.

      • int_19h a day ago

        Ada has seen quite a few major features added to it in the past couple of decades.

    • wbl a day ago

      The one shop that really used it is now open to C++ and I expect Rust. But their projects tend to last a long time: 3 generations have flown in one of them, etc.

    • thayne a day ago

      Modern fortran is actually fairly modern too. But most fortran codebases aren't modern fortran, they're Fortran 77. If you're lucky.

      • atrettel a day ago

        I agree that many modern Fortran codes aren't truly "modern" Fortran, but in my experience most codes have at least been ported to Fortran 90, even if they largely keep a lot of Fortran 77 baggage (especially the type system and indentation!). In all of my experience, I've really only encountered a single Fortran code being used currently that is actually Fortran 77 in the flesh. That said, I still think many Fortran codes would benefit from using more modern features, since so many are stuck in the past and are difficult to maintain for that reason.

        • jsrcout a day ago

          The codebase I've been working in lately is mostly pre-77 FORTRAN, maintained as such for all this time. "Stuck in the past" is an apt description.

        • thayne a day ago

          Perhaps I should have said "originally written in f77", and still look like it.

  • Mc91 a day ago

    I program an Android app for a Fortune 100 company. Last commit where someone edited a Java file was last week.

    Most of the new code from the past few years has been in Kotlin though.

    • Muromec a day ago

      This. Nobody wants to have the COBOL problem again, so the developer hiring money follows the programming language popularity market (with a certain regulatory approved laf ofc)

      • psjs a day ago

        “laf” or “lag”?

        • Muromec a day ago

          Lag of course. Math doors only open once in 25 years, you know the drill.

    • layer8 11 hours ago

      That’s because it’s Android.

  • ecshafer a day ago

    Fortran is pretty nice to write in if you are just writing numerical stuff. If I were doing a pure numerical simulation, I would rather do it in Fortran than C++ or Python (without NumPy, which is just Fortran and C++).

  • masto 17 hours ago

    I feel like APL is worth the experience, because it's such a different paradigm.

    I've got a soft spot for it as well because I actually used it. At work. On a PC. In the 90s. My assignment was to figure out how to get data into it, for which I ended up writing a routine that operated on floating point numbers as vectors of 1s and 0s and swapped the bits around to convert from Microsoft to IEEE format. While wearing an onion on my belt, of course.

  • overtomanu 13 hours ago

    A similar thing applies to SAP ABAP. It is like Java from a parallel world, where the accumulated cruft for maintaining backward compatibility is 3-4 times more than Java's. It is also like a low-code/no-code environment where the language, the UI, the ABAP IDE, etc. are tightly coupled to one another. Like Java, it has continued to add language features with time, but the legacy code using old constructs is still there in the codebases of many orgs.

    Initially, and to some extent still now, it was verbose and, wording-wise, very similar to COBOL. Then somewhere in, I guess, the late 90s, the OO paradigm wave came in, and it got "OO ABAP" with classes and methods. Now the cloud wave is influencing it, and ABAP has a new cloud flavor, "ABAP for Cloud", where most of the old constructs are not supported.

  • rqtwteye 13 hours ago

    "is there any reason to think that in 30-40 years the only people writing Java will be retirees maintaining old banking systems?"

    I don't think so. But it's pretty much guaranteed that a lot of the people who are complaining about COBOL today are writing systems that will be legacy in 30 years. And the generation of programmers then will be complaining about today's programmers.

    Especially when I look at Node or Python with tons of external packages (.NET is going the same way), I don't see a good long-term future.

  • adastra22 a day ago

    Fortran is not a legacy language.

  • Suppafly a day ago

    >but is there any reason to think that in 30-40 years the only people writing Java will be retirees maintaining old banking systems?

    It feels like we're getting into that space already.

    • Muromec a day ago

      Nah, not really. People have just started replacing COBOL with Java, and employers are wise enough to hire people who are 30-40 years minimum from retirement.

      It can also be upgraded in smaller chunks, and finding enough developers for the tooling is an important metric corporate looks at.

      If anything, banks are actively optimizing for developer experience to make sure 60% of new hires don't run away in the first year. Banks are actually better at navigating these kinds of structural risks; they were just slow to understand that such risks exist.

      If you have an episode of existential anxiety because of dat AI eating mijn job, getting a union job in a bank is a way to hedge this particular risk.

      • rightbyte 20 hours ago

        > employers are wise enough to hire people who are 30-40 years minimum from retirement.

        Uhm... loyalty is punished and workers need to change jobs to keep 'market rate' wages. So dunno about that.

        I think it is more that newcomers to the job market are easier to abuse.

      • gwd a day ago

        > ...employers are wise enough to hire people who are 30-40 years minimum from retirement.

        Um oh yeah, the reason we're hiring 20-year-olds is because we want to ensure we have lifelong support for the new system we're writing. Not because they're cheaper, they're still idealistic and naive, they'll work long hours for foosball tables and stacks, or anything like that...

        • Muromec a day ago

          In a place where you can imagine having COBOL, working long hours is frowned upon, and being idealistic beyond personal integrity isn't a good quality either. Not saying such places aren't cheap; of course they are. Being cheap is their exact point.

      • User23 a day ago

        > employers are wise enough to hire people who are 30-40 years minimum from retirement.

        Well I hope they’re wise enough to not let any good employment attorneys catch wind because that’s blatantly illegal.

        • consteval 11 hours ago

          The problem with such laws is that they're trivial to avoid. Do they look old? I mean, you can't presume someone is thinking about age when they choose not to hire someone, but they definitely could be.

          Discrimination is an almost "thought crime", meaning you can commit it entirely in your head. But the outcome is real. So it's very tough to spot, particularly when said discrimination also aligns with the most common societal biases.

        • Muromec a day ago

          It's not a stated requirement, but the outcome of hiring, demographics-wise, is very visible.

    • strken a day ago

      I think Android makes a difference here. Sure, a lot of people are on Kotlin, but a lot aren't.

  • eslaught a day ago

    I wrote a small program in Algol 68 once. It was horrible because it didn't even have heap allocation in the language, so things you'd think of doing in C (e.g., tree data structures) just didn't work. That and all the compiler errors were pure numerical codes which you had to go look up in the manual (not provided). And forget about getting line numbers.

    I am very much glad I wasn't alive at the time this was the state of the art.

    • pdw 18 hours ago

      You're probably thinking of Algol 60? Algol 68 definitely had heap operations, the sample code on Wikipedia even showcases them to build linked lists.

  • fastneutron 20 hours ago

    Fortran is alive and well in science and engineering. The more modern standards are much nicer to work with, but largely backwards compatible with stuff written 50 years ago.

  • Yodel0914 a day ago

    I’m not sure I’d choose to use Fortran, but at one point I had to maintain an app that had a Delphi UI and Fortran business logic. The Fortran, although spaghetti, was much less frustrating to work with.

  • karlgkk a day ago

    > in 30-40 years the only people writing Java will be retirees maintaining old banking systems?

    I kinda suspect that if Java is still around in 30 years, what we call Java will be - at best - vaguely recognizable.

  • jtolmar a day ago

    I can't say whether Java as a whole will ever become the next COBOL, but Java 8 already is well on the way there.

  • RickJWagner a day ago

    IBM offers a free COBOL plugin for VSCode and a nice tutorial with it.

    I started programming in COBOL (circa 1990) and took the tutorial just for fun earlier this year.

lefessan 19 hours ago

COBOL is not dead, but it's difficult to get access to, because there is almost no open-source tooling around it for Linux. We (OCamlPro) have created a project, called SuperBOL, to create an open-source environment around the GnuCOBOL open-source compiler (that is now very mature and used by companies). We have already released the VScode extension with an LSP for COBOL to get a modern IDE, and we are working on other tools, mostly depending on our customers.

  • kwanbix 19 hours ago

    The problem is not so much access to tooling, but access to mainframes. I can learn COBOL in a day or two, and I would love to work on a "boring" COBOL job, but I have no experience with mainframes.

    • lefessan 17 hours ago

      The goal of SuperBOL (https://superbol.eu) is to allow companies to migrate from mainframes to Linux workstations, to get a "modern" experience of development and deployment.

      Indeed, mainframes are hard to get access to, and require training in themselves. I have worked on Linux and Windows for years, and development on a mainframe has nothing in common with them :-)

      I think the problem of COBOL is not only the lack of COBOL developers, it is the lack of expertise in COBOL environments, because they have become obsolete (both on mainframe and proprietary tooling for Linux/Windows). By providing a modern environment on Linux for COBOL, our goal is to solve the hardest part of the problem, as learning COBOL itself is not so hard for existing open-source developers...

      • kwanbix 11 hours ago

        I imagine 99% of the work is to be done on "obsolete" mainframes as you call them. I doubt there is much new development in COBOL in 2024.

    • imgabe 18 hours ago

      Is there anything particularly different about mainframes compared to working on a server besides it probably being a different operating system? I assume it has a command line and you ssh into it somehow (or something similar)? Or are they still running punch cards or something?

      • julian_t 18 hours ago

        It's a very different (and foreign) environment. Job control language, how data is stored... if you come from a typical modern server environment you'd be pretty lost in the mainframe world.

        • macintux 17 hours ago

          In 1996 I took a TCP class in Chicago for which it turned out I was overqualified; it was mainly how to use tools like telnet and FTP.

          But what I remember most: the two other students were mainframe programmers, and they were just as baffled by my world as I was by theirs. It really was an entirely different computing paradigm, although 30 years later I probably have enough experience to make more connections than I could then.

      • lefessan 17 hours ago

        You can find videos on Youtube, it's like an IDE where you would do everything in a user interface with menus, but the interface is a 25x80-char screen only :-) Everything is "almost" similar to a Unix system, but with different names, files are called "datasets", there are very few tools, but every one of them has a language to do very complex tasks...

    • tephra 18 hours ago

      I know two people who spend some of their time writing COBOL for a major bank. They do find that part of the job pretty boring; it is basically just writing down SQL queries in a COBOL file and then trying to get past their 50-year-old development workflow (merge to master, then do testing in a testing environment, then get code review...).

      • noisy_boy 16 hours ago

        I would love to do that for a living assuming it has job security, no crazy obsession with "velocity" and sane working hours.

  • pantalaimon 18 hours ago
    • lefessan 17 hours ago

      We are among the major contributors to GnuCOBOL; we use https://github.com/OCamlPro/gnucobol to review our contributions before pushing them to the SVN repository on SourceForge.

      Cobolworx is indeed working on a GCC frontend for COBOL. It's impressive work (it was presented at FOSDEM this year), but it is less mature than GnuCOBOL and tied to GCC, whereas GnuCOBOL can work with any C compiler (LLVM, MSVC, etc.) by translating COBOL to C.

      Though we designed SuperBOL to work with GnuCOBOL, it could also be used with GCOBOL once it is officially stable.

msla a day ago

"I don't know what the language of the year 2000 will look like, but I know it will be called Fortran." —Tony Hoare

COBOL is alive in that it keeps changing from era to era, to the point that modern COBOL looks rather little like the 1950s COBOL everyone instinctively thinks about when they hear the term. It's as if we were still programming in Algol because Java had been called Algol-94 or something.

  • Animats a day ago

    Nobody writes

        MULTIPLY A BY B GIVING C ON SIZE ERROR STOP RUN.
    
    any more.
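
    These days the same calculation would more likely be spelled with COMPUTE; a minimal sketch, using the same variable names:

        COMPUTE C = A * B
            ON SIZE ERROR
                DISPLAY 'size error'
        END-COMPUTE
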
    • graypegg a day ago

      I mean, if you squint your eyes a bit, that could be SQL! So even if it's not COBOL, there are people out there writing in a vaguely English business programming language.

      • throw-the-towel a day ago

        And SQL kinda dates from the same era, I wonder if this type of language was in vogue 50 years ago?

        • tdeck a day ago

          The only notable similarities I see are lack of special characters, all caps by default (most languages from this era are actually case insensitive), and using English words. Those characteristics were in vogue 50 years ago because many computers didn't support lowercase characters, and the set of non-alphanumeric characters supported tended to vary a lot between machines. Here's what the Jargon File had to say about EBCDIC, for example:

          > EBCDIC: /eb´s@·dik/, /eb´see`dik/, /eb´k@·dik/, n. [abbreviation, Extended Binary Coded Decimal Interchange Code] An alleged character set used on IBM dinosaurs. It exists in at least six mutually incompatible versions, all featuring such delights as non-contiguous letter sequences and the absence of several ASCII punctuation characters fairly important for modern computer languages (exactly which characters are absent varies according to which version of EBCDIC you're looking at).

      • tannhaeuser a day ago

        So you spotted that? I have no proof or links to share, but I've always thought SQL was inspired by, or at least made to not look out of place next to, COBOL. I recall that the COBOL coding-card layout interpreted a flag on punch cards, at the same character column where top-level picture clauses had to start, specifically for designating a line as SQL for static embedded SQL preprocessing.
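
        For what it's worth, static embedded SQL still reads like more COBOL; a minimal sketch of the DB2-style form (table and variable names invented here), which a precompiler expands before the COBOL compiler ever sees it:

            EXEC SQL
                SELECT BALANCE
                  INTO :WS-BALANCE
                  FROM ACCOUNTS
                 WHERE ACCT_ID = :WS-ACCT-ID
            END-EXEC.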

        • DaiPlusPlus a day ago

          I think it's more that computers at the time didn't all have lowercase characters. Consider that even C and C++ supported trigraph/digraph compatibility chars until something like last year (and IBM still complained…).

      • Suppafly a day ago

        Seriously, sometimes writing SQL feels more like composing a Google query than programming.

        • jl6 a day ago

          A great thing about being a programmer is getting to complain about the crappy requirements you have to work with. SQL, on the other hand, is not a program - it’s a precise specification of the result you want, in a format that lets the database engine write the “program” for you. Thus, writing SQL helps you appreciate the struggle to achieve good requirements, and there is a chance you will develop empathy for those cursed to write them.

          • int_19h a day ago

            That can be said of any program written in a pure declarative language, but even so not all of them look like SQL. And, yes, they are still programs.

        • Ekaros a day ago

          Well, it is in the name: Structured Query Language. And I would argue that it is very often the right mindset. You are trying to query data, not process it. Thus actually making it a query seems a rather reasonable paradigm.

      • zozbot234 a day ago

        The nice thing about a vaguely English like language is that your average LLM is going to do a better job of making sense of it. Because it can leverage its learnings from the entire training set, not just the code-specific portion of it.

        • kibwen a day ago

          Not for generating it, because the more it looks like prose the more the LLM's output will be influenced by all the prose it's ingested.

          • crackez a day ago

            I've used O365 Copilot to analyze a COBOL app I had source code to, and it was great at explaining how the code worked. It made writing an interface to it a breeze with some sample code, and I swear I am not a COBOL person - I'm just the Linux guy trying to help a buddy out...

            It also does a reasonable job of generating working COBOL. I had to fix up just a few errors in the data definitions, as the LLM generated badly sized data members, but it was pretty smooth. Much smoother than my experiences with LLMs and Python. What a crapshoot Python is with LLMs...

        • valval 15 hours ago

          The exact opposite is true.

    • Smar a day ago

      > print food if tasty?

      Ruby is nice.

      • zdragnar a day ago

        Maybe I'm in a minority, but I genuinely dislike conditions placed afterwards.

        They feel great to type out when you're in the flow, but coming back and reading them grates on my nerves. Seeing the condition first means I load a logical branch into my mental context. Seeing the condition after means I have to rewrite the context of what I just read to become part of a logical branch, and now the flow of reading is broken.

        • User23 a day ago

          Try thinking of it as prefix if and infix if?

          And in any event it’s a very natural language pattern if you know what I mean.

          • zdragnar 12 hours ago

            Ostensibly, code is read more than it is written. I'd rather stick to forms that favor staying in a flow.

            Natural language patterns are conversational, and / or use pacing to create emphasis and imply meaning.

            With code, we aren't reading a natural language. Injecting natural language conventions amongst things like some_string.chars.each { |c| ... } is entirely unnecessary and unhelpful, in my not very humble opinion.

            • User23 9 hours ago

              I'm not really sure what you mean. Both prefix and infix if are based on natural language conventions. So, for that matter, is reading code from left to right, and any number of other patterns.

              The infix if form is as readable as, if not more readable than, the prefix if in cases where only a single statement is guarded.

              I mean we could code without any pesky natural language at all by using some kind of lambda calculus with de Bruijn indices, but I think most people would find that considerably less readable.

    • analog31 a day ago

      Man that's almost like Hypercard.

    • kernal a day ago

      >Nobody writes MULTIPLY A BY B GIVING C ON SIZE ERROR STOP RUN.

      You had me at MULTIPLY A BY B

  • MathMonkeyMan a day ago

    More accurate might be "I don't know what the language of 2000 will be called, but I know it will look like Fortran."

  • 9659 a day ago

    This was almost true in 2000. It is not true now. Things change. Slowly.

  • j0hnyl a day ago

    But are these legacy systems from the 70s, 80s, and 90s using modern COBOL?

    • NikolaNovak a day ago

      Depends what you mean; but not necessarily.

      I am managing an ERP system implemented / went live in 2016. It's working on modern P10 hardware, which was released in 2021. The ERP system is continually updated by the vendor and customized by the client.

      Even for COBOL running on an actual mainframe, which I think most HNers would think of as a 1970s dinosaur, most of the actual machines in production are pretty new. IBM z16 was launched in 2022.

      So they are "legacy systems" in the sense that they're not written in a JavaScript framework launched last week, running on Lambda instances in AWS :). But they are not "OLD" systems, as such.

      • zifpanachr23 12 hours ago

        Yep, the system is old in the same way that we could call x86 "old". The architecture is backwards compatible with instructions going back to the mid 1960s...but that doesn't mean new instructions and updates to the ISA aren't being pushed out on a pretty regular cadence.

        The new Telum II processor (and certainly this also implies another big ISA update and new hardware cycle) was announced at Hot Chips just a few weeks ago for example. See:

        https://chipsandcheese.com/2024/09/08/telum-ii-at-hot-chips-...

    • jcranmer a day ago

      Almost certainly yes. The "legacy systems" are likely running on versions of the mainframe with long-term support contracts, whose vendors are committed to providing newer compilers with support for newer versions of the specification as necessary.

    • ithkuil a day ago

      When you hear about people being paid $X vs. 10x $X to fix some COBOL, is there a correlation with the age of the COBOL system?

      • HeyLaughingBoy a day ago

        Probably not; just a matter of how desperate they are.

        • ithkuil a day ago

          Which is also a function of how hard it is to find someone who has the required skills to address the problem

  • TMWNN a day ago

    > "I don't know what the language of the year 2000 will look like, but I know it will be called Fortran." —Tony Hoare

    Kemeny and Kurtz described Fortran as "old-fashioned" in 1968! <https://dtss.dartmouth.edu/sciencearticle/index.html>

    • randomdata 12 hours ago

      Hence "I don't know what the language [...] will look like". Hoare seems to have made that remark around the same time work to redesign Fortran, what eventually became Fortran 90, began. Presumably he was aware of that effort and thought that Fortran would be able to keep reinventing itself as future needs dictated.

kukkeliskuu a day ago

Cloud is the new mainframe, except worse. It has all the downsides, but does not have the biggest upside.

The grandpa could create (using CICS) a very reliable and performant service that would call other services inside the same transaction. The platform would handle all the complicated stuff, such as maintaining data integrity.
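
In CICS COBOL such a call was roughly one statement; a minimal sketch (program name and commarea invented), where the linked program runs under the same unit of work, so CICS commits or backs out both programs' updates together:

    EXEC CICS LINK
        PROGRAM('PAYPOST')
        COMMAREA(WS-COMM-AREA)
    END-EXEC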

Try to write AWS Lambdas that call each other within the same transaction.

  • sofixa 21 hours ago

    > It has all the downsides

    Vendor lock-in from a single vendor? Wildly expensive capex and opex? Impossibility for people to know any of the tech involved without you sending them on a course to learn about it or them already having experience with it?

    > Try to write AWS Lambdas that call each other within the same transaction.

    Why is that your comparison? Was deploying to the mainframe as simple as throwing a .zip with your code at an API that you could give access to developers?

    • otabdeveloper4 15 hours ago

      > Vendor lock-in from a single vendor? Wildly expensive capex and opex? Impossibility for people to know any of the tech involved without you sending them on a course to learn about it or them already having experience with it?

      Is this a trick question? The answer is 'yes' to all three.

      • sofixa 14 hours ago

        For mainframes, it is.

        For AWS, it isn't. Outside of a few narrow exceptions, there is no vendor lock-in to a single vendor. (A container that can run in Lambda can run in Google Cloud Run just fine.)

        There is no capex with AWS.

        There's a free tier and it's freely accessible to anyone. Anyone, and I mean anyone, can start learning it if they want to.

        Good luck getting access to a mainframe to play around to see how and what works. Or finding any useful tutorials from this century.

        • kukkeliskuu 4 hours ago

          On the mainframe renewal projects I have done, the mainframes have been leased. Hence there is no "CapEx" in the sense you mean it. On the enterprise cloud transformation projects I have done, there has been a substantial development cost ("CapEx") to get to the point where you can just drop in a .zip file. So the CapEx/OpEx question is complex.

          The grandpa developer delivered a function, not a .zip file. Nowadays the developer needs to deliver a .zip file -- because the developer is responsible for wrapping the function in something that executes the function -- often SpringBoot in corporate environment.

          He could use AWS Lambdas, but that locks them in. Also, you need to worry about restart times, and the price relative to performance is high, because there are many layers of virtualization.

          But the biggest loss is that in a "best-of-breed architecture" (a bunch of microservices running in Kubernetes) the developers have in practice no way of guaranteeing data integrity. Systems are in a perpetually inconsistent state (called "eventual consistency") and we just pretend the problem does not exist.

          The grandpa developer could develop functions that would call other functions, and all would be executed within a transaction. It would be within his means to maintain data integrity.

          For the individual developer, the situation is much better. I can get my Django application to be hosted on fly.io in no time and it is quite cheap. I think the cost of running services is temporarily subsidized by influx of VC money, but that will eventually change.

snovymgodym a day ago

As always, these discussions will depend on your definition of "dead" and "alive".

If we can call a technology dead once no new business is built on it, then I think we can safely call COBOL dead (and the IBM 390x aka Z/OS platform along with it, for which "COBOL" is usually a proxy).

But if we say that anything still being used in production is not dead, then of course COBOL is alive and significantly more alive than many other things which are younger than it.

But this shouldn't really be taken as a positive point for COBOL or the mainframe ecosystem. It's simply a fact of life that organizations tend to stick with the first thing that works, and for the types of entities involved in the first wave of digitalization (e.g. governments, banks, airlines) that was usually an IBM mainframe along with the software that runs on it.

  • 8fingerlouie a day ago

    > and the IBM 390x aka Z/OS platform along with it

    The problem with killing the mainframe is that no other platform really exists that can handle the amount of simultaneous IO that you can get on a mainframe. Our mainframe easily processes 100m transactions per hour, with room to spare. And keep in mind that those transactions are for the most part synchronous, and will result in multiple SQL transactions per transaction.

    Yes, eventual consistency is a thing, but it's a very bad match with the financial world at least, and maybe also military, insurance or medical/health. You can of course also partition the workload, but again, that creates consistency issues when going across shards.

    Also, COBOL is far from dead, but it's slowly getting there. I don't know of a single bank that isn't actively working on getting off the mainframe, though all projections I've seen say that the mainframe and COBOL will be around until at least 2050.

    Give that a thought. That's 26 years of writing COBOL. Considering that COBOL programmers are also highly sought after, and usually well paid, one could literally still begin a career as a COBOL programmer today and almost get a full work life worth of it.

  • zifpanachr23 13 hours ago

    If by new business we literally mean startups, then that has been the case basically forever, even back in the 60s and 70s. Mainframes were never really for new businesses in the startup sense of the word. The barrier to entry has always been kind of extreme. So the big customers even back in the day were always old insurance companies and banks and governments etc. It wasn't really until minicomputers that "new businesses" doing computing at all was feasible.

    So in that sense, not much has really changed, and for the target market of the product, I don't think it makes sense as a good metric for whether the platform is dead or alive.

  • DaiPlusPlus a day ago

    > we can call a technology dead once no new business is built on it

    You don't suppose any bank - or other large financial institution - might have standardised on COBOL for their core business flows/processes? In which case a new business unit or "internal startup" team (e.g. a new category of insurance product) might very well have some part written in COBOL so it integrates with the rest of the bank - or at the very least might be built on top of the org's existing COBOL-running infrastructure (i.e. not written in COBOL, but still runs on z/OS because there's no budget for buying new commodity x86 racks and the people to manage and run them).

    • snovymgodym a day ago

      Sure, I know for a fact that what you're describing exists. That's not really what I mean by new business being built on it. That's a case of a very large and old business already being so locked into the mainframe ecosystem for their core systems that anything new they try to do ends up needing some kind of integration system with the legacy system.

      What I mean is that nobody starts a business today and says "Ok, we need an IBM mainframe running DB2 and we'll have a bunch of COBOL, ReXX, and PL/I programs for handling our business logic".

      • zifpanachr23 13 hours ago

        There was a decent amount of that going on in China in the 90s and early 2000s actually in the banking sector. You probably won't see much "new business" until you see companies in new large markets explode in size. As in you are unlikely to ever see "new business" in the US, because you'd need a new bank or something to somehow explode to the size of one of the big four and suddenly realize they need to get on board with what all the competition is doing in order to compete.

        But it has happened at least a little within the past couple of decades, most notably with China but there have probably been other examples in Asia.

    • makeitdouble a day ago

      I was under the impression that banks with core COBOL processes all had an intermediate layer in Java/C# to deal with these kind of integration.

      We saw exactly the case of a new business unit being created, and like most other units it wouldn't get direct access to the lowest layer, and interact instead with a saner level of API and modules in the language of their stack.

      • jamesfinlayson a day ago

        Yeah, that's my impression too - I haven't worked in banking, but I've worked at a few places with core functionality written in Fortran and then web-facing API layers on top of that (some was in Java I think).

  • calibas a day ago

    COBOL is undead.

palisade a day ago

Note: I'm getting some hate from others who think I would pick or prefer COBOL over a modern language. I wouldn't. I was making an outside-the-box "devil's advocate" objective observation. I just wanted to preface that here. Okay, the rest of my original comment remains below:

The irony is that we already had a memory safe and stable language in Cobol that was easier to read and understand than Rust. But, no one wants to use it so it is "dead" but it runs everything that made the modern age possible.

RUST:

    use std::io;

    fn main() {
        println!("Enter number: ");

        let mut input_string = String::new();
        io::stdin().read_line(&mut input_string).unwrap();

        let number: i32 = input_string.trim().parse().expect("Please enter a valid number.");

        let result = if number % 2 == 0 { "EVEN" } else { "ODD" };

        println!("The number: {}", result);
    }

COBOL:

    display 'Enter number: '
    accept number
    if function mod(number, 2) = 0
        move 'even' to result
    else
        move 'odd' to result
    end-if
    display 'The number: ', result

  • sestep a day ago

    This is a weird take. Sure, plenty of cool/nice things from old languages (e.g. variable-sized stack frames in Ada) get lost, and some then get rediscovered by future languages, potentially wasting effort. And I don't know COBOL, so maybe you're actually making a good point.

    But I find that hard to believe. Does COBOL really solve all the same problems Rust is intended to solve? Is it as performant? Can it interface with native code from other languages in the same way? Does it have a usable and sane package manager built on top of a module system that facilitates composability and backward compatibility? Does it have a way to describe the shape of data and errors as ergonomically as Rust's algebraic data types?

    Genuinely curious: as I said, I don't know COBOL. I'd find it extremely surprising if the answers to all these questions are "yes," though. Just as there are reasons COBOL is still used, there are also (good) reasons new languages have been created.

    • palisade a day ago

      A lot to unpack in this question.

      Do they solve all the same problems? No, for example COBOL lacks a modern concept of concurrency within a single program. COBOL's concurrency features are based on task-level parallelism, which involves dividing a program into multiple tasks that can be executed concurrently.

      Is it performant? Yes. COBOL is highly efficient particularly in handling large datasets and complex business logic and its compilers are optimized for reliability and speed.

      Can it interface with native code? Yes.

      Does it have a package manager? No.

      Does it describe shape of data? No. Data structures in COBOL are defined using fixed-length records.
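
      For illustration, a minimal sketch of such a record (field names invented); every field's size is fixed at compile time:

          01 CUSTOMER-RECORD.
             05 CUST-ID      PIC 9(6).
             05 CUST-NAME    PIC X(30).
             05 CUST-BALANCE PIC S9(7)V99 COMP-3.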

      Note: I'm not a COBOL expert. I did learn it in college, though.

    • Muromec a day ago

      Imagine having a shell script being called from a cron job that writes data in a bunch of tab separated memory mapped files (memory mapping happens when you configure the thing), but you have more files than memory. And all the shell scripts call and include each other and have global variables too.

      And that underpins most of the critical infrastructure in your country.

      • User23 a day ago

        Except mainframe IO and interrupts actually work reliably. Unix on the other hand is a proud member of the worse is better club. It still doesn’t really handle interrupts correctly, but thanks to 40 years of kludges most people consider it close enough.

  • kibwen a day ago

    It's a bit odd to say these programs are comparable when the Cobol version isn't handling errors whereas the Rust program is (by panicking, but that's better than the silently wrong behavior of the Cobol one). Here's a runnable version of the above Cobol program (adding the necessary boilerplate); note that it prints "even" for an input of `abc` and "odd" for an input of `12`:

        identification division.
            program-id.
                even-or-odd.
        data division.
            working-storage section.
                01 num pic 9.
                01 result pic x(4).
        procedure division.
            display 'Enter number: '
        
            accept num
        
            if function mod(num, 2) = 0
                move 'even' to result
            else
                move 'odd' to result
            end-if
        
            display 'The number: ', result
        stop run.
    
    It's peculiar to call out Rust's syntax specifically when, like most other languages these days, is mostly C-like (though with a sprinkling of OCaml). And syntax aside, Rust and Cobol have wildly different goals, so "just use Cobol" doesn't suffice to obviate Rust's purpose for existing.
    • palisade a day ago

      Good catch! My cobol is rust-y. :D

      I guess my post is getting misread as "just use cobol" when it was more of a XKCD-like reflection; e.g. why did we all do that / keep doing that. We done did Cobol, and Rust. And, one is "dead" but not really and now here we are.

      https://xkcd.com/927/

  • erik_seaberg a day ago

    Sorry, https://www.ibm.com/docs/en/cobol-zos/6.2?topic=statement-ex... seems to be demonstrating a language that is not memory-safe (maybe it used to be, but how?)

      COMPUTE SIZE-NEEDED = LENGTH OF OBJ + LENGTH OF VARTAB * NUM-ELEMENTS
      ALLOCATE SIZE-NEEDED CHARACTERS INITIALIZED RETURNING VPTR
      SET ADDRESS OF VARGRP TO VPTR
      MOVE NUM-ELEMENTS TO OBJ
      MOVE BUFFER(1:SIZE-NEEDED) TO VARGRP
      SET VPTR TO ADDRESS OF BUFFER
      FREE VPTR
    • palisade a day ago

      The compiler would have rejected that, if I remember correctly. I'm not in the field of cobol myself, I learned it briefly in college ages ago.

      • electroly 16 hours ago

        Which part do you think would be rejected? This code is an example from the z/OS COBOL documentation--I'm quite sure it works.

        • palisade 13 hours ago

          The heap stuff is new, I guess; we didn't have that back when I was writing programs in it. So, yeah, not so safe anymore. :D I take back what I said about it being safer. I can't go back and edit my original post; it's been too long.

          The compiler did normally warn for data bounds checking, so I figured it would in this case. If that's not the case anymore then I'm wrong.

  • Muromec a day ago

    Shell script is memory safe too, but you don't write anything longer than 100 lines in it for a reason.

    • palisade a day ago

      When you bank, COBOL (40% of online banks). When you use the ATM, COBOL (95% of ATM transactions). When you travel, COBOL (96% of airline ticket bookings). Healthcare, COBOL. Social Security, COBOL. Point of Sale, COBOL. IRS, COBOL. Pension funds? COBOL. Hotel bookings? COBOL. Payroll programs? COBOL.

      It is estimated that there are 800 billion lines of COBOL code in production systems in daily use. That is a bit more than 100 lines.

      This was why Y2K genuinely scared everyone and was a very real problem. The only reason we can look back at it and laugh now is that an army of engineers sat down and rewrote it all in the nick of time.

      • wglb a day ago

        The Y2K effort was much more nuanced than this. I was there for it and it was more like highly targeted patching based on carefully crafted test sets and frameworks.

        > army of engineers sat down and rewrote it all in the nick of time.

        No way did all get rewritten. Where source was available, fixes were applied and systems retested.

        True drama ensued for programs for which the source was no longer obtainable.

        The company I was at during that time had programs that had been in production since at least 1960.

        The other effort that took place was attending to the systems during the midnight boundary with everybody either in the office or on call.

        The other strong observation was that the risks were very much not understood, with exaggerations both extreme and dismissive. Also not discussed in the popular press at the time was the extent that most of these systems were not truly totally automated.

      • Muromec a day ago

        I'm a big enjoyer of arcane arts, but I happen to work in a place that actually has it, and no -- nobody likes COBOL and it's not cool in any sense.

        • palisade a day ago

          Well, there is a good reason no one likes it. It isn't cool, I completely agree. Readable, simple, safe, performant and still relevant though? Ya.

          • Muromec a day ago

            >Readable, simple, safe, performant and still relevant though?

            It's performant, you can't take away that.

      • arcticbull a day ago

        Legacy code, yeah - nobody's hitting File > New Project in COBOL.

        It's just that nobody understands how the systems work and they're ossified. Those systems are going to be emulated until our grandchildren take over because nobody can understand them well enough to craft a replacement. Juuuust up until an LLM rewrites them for us.

        [edit] I mean, those airline systems are so old that they don't support special characters in names; passenger names are two fixed-length fields (first name, last name), and title and middle name just get appended together.

        So you get LASTNAME/FIRSTNAMEMIDDLENAMETITLE on your bookings. And each of those fields is truncated lol.

        and of course flight numbers are fixed at 4 digits, so we're running out of those.

        Not exactly a great ad.

        • toast0 a day ago

          "Legacy code" is also known as "the important code that makes the business work"

          If these new fangled languages are so great, one day they can be legacy code too. :P

          • Muromec a day ago

            That's not what makes something legacy. Legacy is something highly inadvisable to change, because it both makes the business work and can't easily be changed, due to complexity, loss of context, high blast radius, or whatever. It's just there and you have to deal with it. If it wasn't complex, opaque, and scary to touch, it would just be another piece of the system to be replaced and updated, like the copyright date in the footer.

        • palisade a day ago

          Oof, I've got good news and bad news for you.... they still are creating new code in it.

          Yeah, there are fewer engineers in COBOL which is why it pays BIG bucks now. They desperately need someone to maintain that massive infrastructure that has been built up over 75 years that cannot be replaced easily or quickly.

    • duskwuff a day ago

      Besides - standard COBOL is only "memory-safe" by way of not supporting dynamic memory allocation. Like, at all. Even strings are stored in fixed-length arrays.
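
      A minimal sketch of what that looks like (names invented): a "string" is a fixed-size PIC X field, and moving a longer value into it silently truncates:

          01 WS-NAME PIC X(5).
          ...
          MOVE 'BARTHOLOMEW' TO WS-NAME
          *> WS-NAME now holds 'BARTH'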

      "A ship in harbor is safe, but that is not what ships are built for."

  • hollerith a day ago

    Bizarre comment. No developer who should be allowed anywhere near a computer would ever consider choosing COBOL where Rust is appropriate or vice versa.

    • zozbot234 a day ago

      Agreed. It's easy to have memory safety when you don't even support heap allocation. Now if OP had said "Java" or "C#" instead of "COBOL", they would've had a solid point. But the way Rust ensures memory safety without mandating GC while still allowing for complex allocation patterns can be said to be practically unfeasible for any of the usual "legacy" languages, with the notable exception of Ada.

    • palisade a day ago

      Well, I said it was ironic that we went out of our way to make a newer more complicated to read language that was memory safe when we already had a language that was simpler and readable that was safe.

      I didn't say I wanted to code in it, though. I'd prefer in no particular order Kotlin, Python, Go, C++, Rust, Perl, C#, Java, Zig, etc. Anything really over COBOL myself. I'm part of the problem.

      But, if I was hard up for money and wasn't getting nibbles for jobs? I could see getting into COBOL because there is a lot of money in it and always work available.

      My statement stands though: we need to do better when designing the syntax of our languages. COBOL is disliked, yet simple and readable. What does that say about our new languages? How hated are our "new" language remnants going to be when few of us are still around to maintain them 50-75 years from now? And how easy are they going to be to pick up?

      Addendum: I guess it won't matter if the singularity comes and just writes it all for us, of course. Then it will all just be machine code and we won't need these "only human" translation layers any longer.

      • strken a day ago

        Is COBOL actually memory safe in the same way Rust is memory safe? I thought it was just "we don't allow dynamic allocation", and I'd assume programmers often implement their own half-baked dynamic allocation on top.

        • rightbyte 20 hours ago

          Just like Rust, the 'use after free' problem becomes the 'use after it does not make sense' problem instead. Which Valgrind won't find for you either.

          I think new Cobol has 'allocate' and 'free' though.

    • 7thaccount a day ago

      I don't think the use cases for Cobol (bank software) typically overlap with those for Rust (operating systems...etc).

      It's like saying no gardener should be allowed near a garden that would choose a shovel over a pair of shears. Both have a place.

eddieroger 17 hours ago

I'm very late to this post, so I'm sure this will get lost, but in case OP sees it, I'm very sorry for the loss of your grandparents, and hope that you found some joy and comfort in writing about your grandfather fondly in this article, and he has found peace after the loss of your grandmother.

cwbriscoe a day ago

Started my career doing Y2K stuff in 1998 and I still touch COBOL here and there today. I have a 10,000 line CICS program that runs every 30 seconds that I wrote in 2010. It has never failed since.

  • supportengineer a day ago

    That's what I liked about developing Oracle stored procedures activated by cron jobs. Ran for 5 years, no maintenance needed.

    • lloydatkinson a day ago

      That seems like a low bar of expectations. I can think of several DBs that would run exactly like that.

  • bdjsiqoocwk a day ago

    I don't understand these "never failed" comments. Without further context, it's meaningless. If I write a Python script and never change anything in its environment or inputs, it won't fail either. That's not specific to COBOL.

    • _old_dude_ a day ago

      COBOL changes very slowly, once in a decade or two. Python does not offer support for a release for more than three and a half years [1].

      [1] https://en.wikipedia.org/wiki/History_of_Python

      • 0cf8612b2e1e a day ago

        I could believe there are legacy installations happily humming away on Python 2.7 without issue.

        • remlov a day ago

          Several years ago I briefly worked at a major telecommunications provider with services across the southern United States that ran Python 2.4 on their production provisioning servers. Worked just fine.

          • gavindean90 a day ago

            The difference being that the COBOL is still supported after a decade.

            • int_19h a day ago

              ActiveState still offers a supported Python 2.7 version across all major platforms for those who need it (https://www.activestate.com/products/python/python-2-7/), so that's 14 years and counting.

              If enough stuff needs it, people will keep it running. Java 8 will probably be in the same boat eventually if/when Oracle finally drops support.

              • 0cf8612b2e1e 14 hours ago

                I am not even sure what support is needed at this point. The interpreter is what it is. You know there are no new libraries to integrate.

                I guess deploying it on a newer OS might be challenging unless it is a freshly compiled build?

                • int_19h 12 hours ago

                  Patches for security issues, most notably.

      • yieldcrv a day ago

        But a compute instance or bare-metal computer that never needs a new release won't have to deal with that in Python either.

        It's only new builds on someone else's computer that have this modern issue.

    • tannhaeuser a day ago

      I understand the context to be that COBOL, as a DSL for batch processing, declares its .data and .bss segments (or the equivalents on host systems) statically in the DATA DIVISION and usually doesn't dynamically allocate memory. This, coupled with CPU, memory, and I/O bandwidth reservation from a job scheduler on an exclusive hot-swappable partition on a host (z/OS aka MVS), plus UPSs, redundant disks/disk ports, and networks, makes "never fail" much more a consequence and primary objective of the mainframe architectures where COBOL workloads are usually run.
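
      A typical batch-style skeleton shows what that static declaration looks like (hypothetical field names): every record buffer and accumulator is carved out at compile time, so the job's entire footprint is known before it runs.

          IDENTIFICATION DIVISION.
          PROGRAM-ID. BATCH-SKELETON.
          ENVIRONMENT DIVISION.
          INPUT-OUTPUT SECTION.
          FILE-CONTROL.
              SELECT TXN-FILE ASSIGN TO "TXN.DAT"
                  ORGANIZATION IS SEQUENTIAL.
          DATA DIVISION.
          FILE SECTION.
          FD  TXN-FILE.
          01  TXN-RECORD.                *> fixed 24-byte record layout
              05  TXN-ACCOUNT  PIC 9(10).
              05  TXN-AMOUNT   PIC S9(9)V99 COMP-3.  *> packed decimal
              05  TXN-DATE     PIC 9(8).
          WORKING-STORAGE SECTION.
          01  WS-GRAND-TOTAL   PIC S9(13)V99 COMP-3 VALUE ZERO.
          PROCEDURE DIVISION.
             *> The DATA DIVISION above is the whole memory map: no
             *> heap, no growth, nothing the scheduler can't reserve
             *> up front.
              STOP RUN.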

    • kibibu a day ago

      I imagine the backwards compatibility story of COBOL is a little better than Python's

    • ang_cire a day ago

      If you're actually patching your python installs, that is by no means certain.

      • andreasmetsala a day ago

        I don’t think those mainframes running COBOL are getting patched either.

        • p_l a day ago

          They are patched up regularly. The COBOL code itself maybe not, but the runtimes?

        • ang_cire a day ago

          They absolutely are. Modern COBOL (even 25-year-old COBOL) isn't running on ancient System/360, /370, or /390 machines or something; it's running on modern z/OS mainframes.

    • Spooky23 a day ago

      If the python script has external dependencies… lol.

martinclayton a day ago

In case anyone is interested...

The SO Developer Surveys give some info on the job market for COBOL as it appears on the average-salary-versus-years-of-experience graphs, which I like, as there are as many stories or reasons as you can think of to explain them.

In 2023 there were 222 respondents, averaging 19 years of experience and an average salary of $75,500. In 2024 the exact number of respondents is not shown, though it is likely similar based on the color code of the point; the average experience had dropped to 17 years.

Elsewhere in the graph my favourite open question is: how come the 2000+ respondents mentioning Swift average over 11 years of experience in a language that's only been public for 10 years?

2024 https://survey.stackoverflow.co/2024/work#salary-comp-total-...

2023 https://survey.stackoverflow.co/2023/?utm_source=so-owned&ut...

  • clarle a day ago

    iOS development has been around for quite some time now. Most senior iOS and Cocoa developers probably started with Objective-C before slowly migrating codebases over to Swift.

    • martinclayton a day ago

      I think this must be it, or at least this is one story that fits.

      Seems a shame that people report Objective-C experience as Swift experience to such a great extent. These surveys are not resumes...

      Perhaps it just "proves" that all data in these charts is questionable.

palisade 18 hours ago

Oh, btw, COBOL has the 2038 problem, and it is right around the corner. We're going to need A LOT of new COBOL engineers to fix it. It runs so much of our world. We managed to save the world from Y2K in the nick of time, but I'm not sure we're going to have the minds necessary to solve 2038 in time, as the can has just been kicked down the road without consideration. If anyone is worried there won't be jobs, there WILL be jobs.

Not to be too macabre, but we need to transfer the knowledge while the people who have it are still alive, can remember, and can teach others to pick up the torch. And, let us call it what it is: while those who remain still have the desire to make the effort to transfer that knowledge.

It is easy to look back on Y2K and think, well, that wasn't a big deal, but the only reason it wasn't is because people worked tirelessly to stop it. It is a testament to their success.

Regarding Y2K: Robert Bemer tried to warn people in 1971, with 29 years left to go. And Peter de Jager published his attention-getting article "Doomsday 2000" in Computerworld in 1993, with a mere 7 years left, which finally lit a fire under everyone's ass. Keep in mind, there were still many original COBOL programmers and mainframe experts left to talk to at that time. And there was a lot less code to change back then than there is now.
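
For anyone who wasn't there: as I understand it, much of the Y2K remediation wasn't widening two-digit years on disk (which ripples through every file layout) but "windowing" patches along these lines (a sketch with a made-up pivot), which only push the cliff out to the pivot year:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. Y2K-WINDOW.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01  WS-YY    PIC 99 VALUE 38.   *> 2-digit year, as stored on disk
    01  WS-YYYY  PIC 9(4).
    PROCEDURE DIVISION.
       *> Sliding window, pivot 1950: 50-99 => 19xx, 00-49 => 20xx.
        IF WS-YY >= 50
            COMPUTE WS-YYYY = 1900 + WS-YY
        ELSE
            COMPUTE WS-YYYY = 2000 + WS-YY
        END-IF
        DISPLAY WS-YYYY             *> 2038
       *> Cheap in 1999; guaranteed wrong again once years reach 2050.
        STOP RUN.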

Voting tabulation, insurance, utilities, administrative systems, banking, ATMs, travel, healthcare, social security, point of sale, IRS, pension funds, TACTICAL NUKES, hotel bookings and payroll programs. More than 800 billion lines of COBOL code in production systems in daily use. For better or worse, it is the very bedrock of our modern society.

If you want to replace it with something that you want to maintain instead, that's fine too but we're running out of time.

"Danger, Will Robinson! DANGER!" https://www.youtube.com/watch?v=OWwOJlOI1nU

"Listen the nothing will be here any minute. I will just sit here and let it take me away too. They look... like... big... strong hands.... Don't they?" https://youtu.be/symP4QT7wLU?feature=shared&t=24

  • JackSlateur 15 hours ago

    > we need to transfer the knowledge while the people who have it are still alive

    Nah. We need to not transfer that knowledge, because the problem will be solved when the house is on fire.

    But do not worry: nothing will happen until then. If those people cared, they would work to replace all that cruft, not enhance it to fix 2038.

    • palisade 14 hours ago

      I hope you're right and it just solves itself.

  • coldpie 16 hours ago

    This sounds interesting, but I wonder who this message needs to be directed to? As a dev who doesn't work there, I can't just go "fix 2038 for the post office." Are you encouraging devs like me to try to get hired into these positions now, and to advocate allocating resources to fix these problems? Are you trying to tell the higher-ups at these places about a problem they might not know about?

    • palisade 14 hours ago

      I'm not sure! The problem might need to be addressed from lots of different directions; university coursework, companies and organizations depending on these systems paying more and hiring more people, more devs getting into these jobs, a thinktank to sit down and think through how best to address the problem, government regulation, etc. I'm not sure any one thing is going to deal with this, it's such a massive issue.

  • palisade 16 hours ago

    Forgot to mention the post office. And, there are probably many more.

gpraghu a day ago

A touching article! I have enjoyed similar times with my grandpa. On the topic of COBOL, I simply don't understand why people hate it so much. It has a shallow learning curve like Python, is self-documenting enough that one doesn't need to write a bunch of text, and is available on every conceivable architecture, with great performance. I personally wrote a payroll for an entire factory on a B1800 with 128K of memory and a 10MB hard disk! So what's to complain about? In my mind, Java is deader than COBOL!

  • ape4 a day ago

    It's the amount of boilerplate that people hate.

    • acdha a day ago

      I think there’s something to that but there’s also a lot of selectivity there. Many of the same people who complained about COBOL because it was verbose adopted things like enterprise Java, so there’s more than a suggestion that this might be a less than completely objective assessment.

      The bigger problem: COBOL was an open standard, but none of the implementations were open source for ages (I haven't looked at GNU COBOL in years, but I think this is no longer the case), so nobody was building new things or gaining experience when they had to pay just to get started.

masfoobar 20 hours ago

Condolences to the writer on his grandad's passing.

It is a bit of a reality check when words like "grandpa" are linked to an article from 1992! My brain expects the article to be from the 60s, 70s... or possibly 80s.

In my world view, it is hard to imagine that a child born in 2000 is 24 years old now. Their grandparents could be as old as I am, if they had children (and their children did too) at a young age.

Then I read at the end that he was 91 when he passed. He did well! Likely around my Grandad's age, and he managed to last an extra 24 years on this planet!

I remember reading a book on COBOL in my younger days learning to program, alongside BASIC, C, and Pascal. I might still have it. Despite reading about it, I have been fortunate (I guess) never to have programmed in it.

I do agree with the writer that using the word "dead" in the programming language world is unrealistic. Some would describe popular, modern languages out there as "dead", but they might get a huge push for one reason or another in the future. Could COBOL find a new, niche spot?

Maybe.

jtotheh 14 hours ago

I worked for a while as a contractor for the US Dept. of Education student loan system. It was z/OS with DB2, and most new business logic was done in this weird language "Gen" https://www.broadcom.com/products/mainframe/application-deve... . Gen can supposedly "generate" Java and other stuff, but they used it to generate COBOL for the mainframe. You could debug the Gen code in the 3270 emulator rather than trying to deal with the generated COBOL. There were a small number of people (like 6) who were handling that code. The data, and I guess some of the code, went back to at least 1980. There was so much legacy code, I doubt they've changed platforms.

I was supposed to be more of a Java guy, but I did a little Gen. Mainframe is very alien to me. The people that knew it well could really crank on it, though. I joined when they were converting an MS ASP front end to a Java one. So we wrote Java that users interacted with via the web, and that made calls to Gen (really, to COBOL). In retrospect there was a lot wrong with that operation... One interesting problem that came up once was that the mainframe didn't sort things the same way as Java. It turned out to be caused by EBCDIC vs. UTF-8.
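
The gotcha, for anyone who hasn't hit it: EBCDIC orders lowercase before uppercase and digits last, roughly the reverse of ASCII/UTF-8, so the same sort gives different answers on and off the mainframe. COBOL can even pin the collation explicitly; a sketch along these lines:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. COLLATE-DEMO.
    ENVIRONMENT DIVISION.
    CONFIGURATION SECTION.
    OBJECT-COMPUTER. MAINFRAME
        PROGRAM COLLATING SEQUENCE IS MAINFRAME-SEQ.
    SPECIAL-NAMES.
        ALPHABET MAINFRAME-SEQ IS EBCDIC.
    PROCEDURE DIVISION.
       *> Under EBCDIC: "a" < "A" < "1". Under ASCII: "1" < "A" < "a".
        IF "a" < "A"
            DISPLAY "EBCDIC ordering in effect"
        END-IF
        STOP RUN.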

  • sigmonsays 13 hours ago

    Thanks for sharing that; it's super entertaining to consider what crazy things people might be doing in the future. Debugging EBCDIC was a surprise and got me laughing.

adamc a day ago

Technologies die very slowly once things of economic value depend on them. COBOL probably isn't used for new projects very often, but the economics of ditching it aren't very good either. It already works. Rewriting things that work is a great way to create new problems at great expense, so institutions are hesitant.

WaitWaitWha a day ago

(Programming) languages take a very long time to "die". Most often you get a long drawn-out tail, and often parts of a language get absorbed into other languages. Only the sages and etymologists will know where they came from.

Old man reminiscence following, skip if you are not bored:

I worked with SNOBOL and I thought it would be a long-term programming language. I also like to think that I had some tiny, minuscule hand in the development of RIPscrip pre-Telegraphix; alas, it went the way of the dodo.

I think I have forgotten more programming languages than I can count on my hands. Yet, I see them in some part every day in newer languages, "discovered" by some expert. "What has been will be again, what has been done will be done again; there is nothing new under the sun."

One language has come to my aid for the last 30-ish years: Perl has saved me many times.

(I'll tell you a secret: in the deep, deep bowels of a very, very large, jungle-named company, servers still have tiny Perl scripts running some core functions. I discovered this when there was a problem I had to deep-dive into. I made a recommendation to change a hard-coded variable. The answer was "it will take two weeks". Why? Because no one knew what it would do, or could read Perl. It was a 30-second job, including SDLC. Think xkcd Dependency https://xkcd.com/2347/ )

socketcluster a day ago

It's interesting reading articles from previous generations: they make it sound like people remembered what everyone in the tech industry said, as if everyone mattered. I guess there weren't many people in the industry back then.

Nowadays, even if someone is right about something and most people are doing it wrong, nobody will care to even discuss it unless the person making the statement is one of maybe 3 top influencers in that field.

mbloom1915 a day ago

Almost all major financial institutions, utilities, gov't agencies, etc. still rely heavily on COBOL today. If it ain't (extremely) broken, don't fix it?

COBOL developers are literally dying out which has made for a competitive market for remaining talent. I've heard of some large consultants charging over $500/hr to their clients for a COBOL developer!

  • akavi a day ago

    I feel like every time COBOL is mentioned we get these stories about crazy high comp for COBOL developers, but anecdotally my aunt worked on COBOL projects in the mid-2010s and was paid a much more modest $45/hr. Good money for small-town middle America where she lives, but nowhere close to what a decent JS dev can get.

    • chucksmash a day ago

      There's also the difference between what a consulting company bills for the COBOL developer and what they pay the developer. Not every consultant is the captain of their own ship.

      My first job after college was at a software shop organized on a "services" model, where clients would have to sponsor teams to do feature dev or support beyond initial onboarding. It's been a long time and my memory is hazy, but as I recall I was expected to bill ~40 hours a week to clients, and if I only worked 40 hours that week (being OT-exempt, this was always the goal), my hourly pay came out to between 10% and 20% of what the company billed the client.

      So $500/hr on the bill and $45/hr on the paycheck both manage to sound plausible, even at the same company.

    • ghosty141 a day ago

      Similar experience with a friend of mine. I feel like these high salaries only apply to people who have worked at one of these companies for a looong time.

      • Ekaros a day ago

        High salaries are when a greybeard consultant is brought in for a few months to fix something or implement some new regulation. And I don't think it was ever as high as what financialized tech companies managed.

      • adastra22 a day ago

        High salaries are relative. $90k is a high salary for most people in the world, even for tech workers outside of Silicon Valley.

  • psunavy03 a day ago

    In COBOL implementations, it's generally not just knowledge of the language that makes you valuable, it's knowledge of the implementation at that particular organization. I'm not a COBOL dev myself, but I work with them, and part of the challenge is that everything is so uber-customized, tightly coupled, and there's 40+ years of undocumented business logic buried in the code.

    It's like the old joke about the engineer being asked for an itemized bill: "Chalk mark: $1. Knowing where to put it: $4,999."

  • jeremyjh a day ago

    I think the moat that COBOL developers have is not just their knowledge of the language, but knowledge of the mainframe programming and operating environment. It's just so alien to developers familiar with Windows/Linux, and there is really no way to get experience with the environment that I know of, other than to be employed doing it.

    But yeah, that stuff is never going away as far as I can tell. It's just too risky to rewrite those core systems, and many a boondoggle has tried and failed.

    • rodgerd a day ago

      About a decade ago I looked into moving some COBOL components off-mainframe (either as COBOL-on-Linux or a rewrite into Java, which itself is really COBOL Mk II at this point), and your point about the operating environment is one of the key elements, but not all of it; there's also the fact that the first big shift to automation, via mainframe assembler and COBOL, is when companies sacked a lot of the folks who knew how and why the pre-automation processes worked - that knowledge exists in the mainframe code and the heads of the people who work(ed) on it, and nowhere else. A rewrite or a replatform is very, very hard and risky as a result; the system is now defined by how the mainframe runs the processes, to a very large degree.

      The third is that COBOL is only the tip of the iceberg. As soon as I spent time learning about the code I was being asked to look at, you get into decades of evolving programming practises. Modern COBOL is multithreaded, probably uses DB2 and relational datamodels. COBOL from thirty years ago is probably single-threaded, only runs right on high-clocked single-execution models, cuts down to hand-written s390 assembler regularly, and uses VSAM files with non-relational data. Older code still will be sharing data simply by banging it into memory regions for other code to read out of, because that's how you got performance back in the day.

      Trying to identify how you'd pull a function out of that and move it off is somewhere between extremely difficult and impossible. It's usually so complicated and expensive it's easier to try and hire people who want to apprentice as mainframe programmers and keep the current codebase running.

      • mschuster91 a day ago

        > A rewrite or a replatform is very, very hard and risky as a result; the system is now defined by how the mainframe runs the processes, to a very large degree.

        And that's why so many neo-banks/fintechs are eating the lunch of the established banks left and right; same for insurance. The "old guard" is unwilling to pay the costs not just of upgrading off of mainframes (aka the rewrite work itself)... but of changing their processes. That is where the real cost is:

        When you have 213,000 employees like BoA has, and everyone needs at least 10 hours of training and 2 weeks until they're familiar enough with the new system to be fully productive, that's like 2 million man-hours just for training and 16 million hours in lost productivity; assuming a $50/h average salary, that's around 900 million dollars in cost. Unfortunately for the dinosaurs, the demands of both customers and (at least in Europe) regulatory agencies, especially for real-time financial transfers, push the old mainframe stuff to its limits, while at the same time banks don't want to cede more and more of that cake to PayPal and friends, who charge quite the sum for (effectively) lending money to banks.

        In contrast, all the newcomers start with greenfield IT, most likely some sort of more-or-less standard SAP. That one actually supports running unit and integration tests automatically, drastically reducing the chance of fuck-ups that might draw in unwanted regulatory attention.

        • jeremyjh a day ago

          BOA doesn't train the vast, vast majority of its workforce on mainframe systems these days. No one working in a branch or call center is looking at green screens anymore. The mainframe systems are simply used as back-ends connected through web services (yes, even in CICS!) or MQ Series and the like to web GUIs.

          Source: worked there for many years, and built some of those integration systems.

        • panopticon a day ago

          Eh, I think the tech stack is less important than the legal and regulatory structure.

          Most fintechs aren't banks and partner with a Real Bank™ to provide the actual bank accounts. Fintechs are under much less regulatory scrutiny (for now—that may be changing with recent, high-profile screwups) and can move with much more freedom regardless of the tech stack they've chosen.

    • psunavy03 a day ago

      Migrations are still a thing, with various approaches and success rates.

  • bespokedevelopr a day ago

    I work for a major utility, and they used to run everything on mainframes and COBOL, but that went away long before I started programming. My coworker is nearing retirement, around 30 years here; he started on COBOL and worked on transitioning off it. He has some really fun stories. But my point being: the tales of COBOL prevalence are very exaggerated. Maybe some parts of finance are still using it; not my area.

  • amelius a day ago

    Can't we just apply a bunch of correctness-preserving translations toward a modern PL, perhaps aided by an LLM to keep the source as human-readable as possible, while (I'm stressing this) preserving correctness?

    • exhilaration a day ago

      IBM offers just such a service under the WatsonX branding, it's an LLM to convert COBOL to Java: https://www.ibm.com/products/watsonx-code-assistant-z

      I work at a company with a large COBOL codebase and this has been mentioned in a few presentations about our modernization efforts.

      • grammie a day ago

        You should take a look at my company, Heirloom Computing (Heirloom.cc). We have migrated many mainframe applications and millions of lines of COBOL and PL/I into Java, and deployed them into production on-prem and in the cloud.

      • russfink a day ago

        But is the conversion maintainable by a human? I’ve seen Fortran to C translators that end up encoding state transition machines that are impossible to read.

      • refneb a day ago

        How did that go? My employer is going to try and evaluate the watsonx product. Have you had any luck converting large/complex COBOL modules?

        • artificialLimbs 9 hours ago

          Don’t know about COBOL, but we recently threw some ancient RPG 3 programs into ChatGPT for conversion to both ‘plain English’ and PHP and it got pretty close - maybe 80-90%.

    • Muromec a day ago

      You can't, unless you transform COBOL to COBOL and run the emulator on AWS. It will still manage to fail you in some way.

  • jillesvangurp a day ago

    Not a bad gig to take if you can swallow your pride a bit.

    I bet LLMs can make working with COBOL a lot easier and more fun than it ever was. I bet that's true for a lot of legacy stuff.

    • the_af a day ago

      Working with COBOL was never fun, so that's a low bar.

      Like others have said, what's valuable is an understanding of the business and the legacy cruft that comes with spending time working at these kinds of companies/banks/etc., rather than knowledge of COBOL.

  • Spooky23 a day ago

    That’s for specialists for the mainframe or specific skill.

    Generalists are usually offshored and are cheap.

  • IshKebab a day ago

    That seems like a myth to me. I actually looked up COBOL salaries and they were a bit higher (like 20%), but definitely not enough to make them tempting.

    • francisofascii a day ago

      There is typically a big difference between a consultant's hourly rate and a full time salary hourly rate.

      • IshKebab a day ago

        Yeah exactly. I was comparing like for like (contracts or full time). The difference due to the fact that it was COBOL was definitely not enough to make me want to learn COBOL.

  • Muromec a day ago

    It is very much broken and said institutions don’t like it

  • the_af a day ago

    COBOL jobs are not particularly well paid in my country.

    In any case, they would have to pay well by a large margin to justify working on dead boring legacy systems, too.

ryukoposting a day ago

When I was in college, I knew a guy who got an internship at Wells Fargo writing COBOL. He hated it.

The punchline is that this was in 2018.

deenadz a day ago

COBOL is dead, long live COBOL.

For any cobol devs here, we at https://cobolcopilot.com would love to hear from you

  • Muromec a day ago

    You need to sell on-prem to those people. No way a single byte of that sweet sweet poison is going to ever leave the corporate network boundary.

timvdalen 15 hours ago

> such as fourth-generation programming language (4GL). If you’re not familiar with that term, suffice it to say that the Wikipedia page lists several examples, and Cobol has outlasted most of them.

I'll have you know I was approached for a FileMaker project not too long ago!

  • calvinmorrison 15 hours ago

    cries in maintaining our entire business backend in Aestiva HTML/OS

krackout 13 hours ago

COBOL is dead? Not at all. Are new projects created in COBOL? Yes, they are. If not in older COBOL form, then definitely in SAP ABAP.

For those who haven't heard of it, ABAP (Advanced Business Application Programming) is the name of SAP's proprietary, fourth-generation programming language :) It's SAP's main language. It's a direct descendant of COBOL; I'd describe it as COBOL with OOP extensions.

Since SAP's ecosystem is sneaking in everywhere, COBOL, in its modern and very close incarnation (ABAP), keeps gaining ground!

If in any doubt, check some ABAP code. It's not simply influenced by COBOL; it is COBOL.

FLT8 a day ago

20 years ago I worked on a mainframe system that, at the time, was said to have "18 months to live". Fast forward to today, the system is more entrenched than it ever was, and still has "18 months to live".. I'm convinced it will outlive me, and probably the next generation too.

facorreia a day ago

I worked for a company in the late 1980s that started developing with a 4GL product (Dataflex) instead of COBOL. The article is right that COBOL has outlasted most (all?) of those 4GL solutions.

Looking back, COBOL would have been a better technical choice back then. Dataflex's metadata-based dynamic UI and report generation saved some simple, repetitive work, but much more effort was wasted working around its limitations.

xiande04 13 hours ago

I read The Cuckoo's Egg by Clifford Stoll (highly recommend btw) published in 1989. I laughed out loud when he described Cobol as an antiquated language that no one wanted to support. In 1989.

solatic a day ago

COBOL is endangered, even for banks and airlines. Just look at the executives who decide to open new digital banks: they're not building on top of COBOL or mainframes. The old banks will be outmaneuvered by the new ones, which will eventually succeed them in the market.

The story of languages like COBOL isn't that a language can be too deeply embedded and too expensive to replace. It just means the replacement happens at a higher level, the business itself, and takes more time as a result.

  • nasmorn a day ago

    A single COBOL mainframe application is not a problem for a bank. Big banks are usually made by buying up dozens of other banks, so they might have very many of these mainframes running and interoperating. That is where the real insanity lies.

pnw 13 hours ago

My first job was programming an ancient COBOL system in a government agency riddled with outdated tech.

The only real upside was, COBOL is so wordy, it forced me to improve my typing speed!

bigiain a day ago

This makes me feel old.

In '92 I was maintaining COBOL code for a custom written warehouse management system for a wholesale boat bits distributor. The company that wrote it had lost almost all their COBOL devs, and were all in on Windows NT application dev.

I hate to admit it to myself, but I am in fact _just_ old enough that I could have cs grad aged grandkids, if I'd had kids early and they'd also had kids early. :sigh:

mcv a day ago

Just this week a colleague asked if someone knew Cobol. Apparently another team had a Cobol-related issue.

So despite its long death, it still seems to be kicking about. I doubt we'll ever get rid of it.

happyjim a day ago

Key components of the U.S. Internal Revenue Service tax processing code (e.g., the "Individual Master File" or IMF) are written in COBOL and IBM Assembly Language.

There is an ongoing effort to refactor as Java. This will ultimately take years and cost $100s of millions of dollars. There is still a small but shrinking team of graybeards who can actually maintain the code, which has to be reprogrammed every year to accommodate changes to tax code.

See, e.g., IRS IT Strategic Plan documents, publicly available.

rogerian 18 hours ago

I think many would dispute that it's dead. Apparently more than 95% of ATM swipes and 43% of banking systems rely on COBOL. No idea how true that is.

lasermike026 19 hours ago

With LLMs, which programming language is used becomes irrelevant. LLMs do not replace programmers yet, but they do give programmers incredible leverage, making these points moot.

HPsquared 18 hours ago

Scientists still use Greek, lawyers still use Latin.

  • anthk 18 hours ago

    Romance language users will use both at the same time to create new terms.

    • HPsquared 18 hours ago

      I guess those ancient languages still live on through a lot of modern languages. Unlike COBOL which is more of an offshoot.

HackerQED a day ago

RIP. He was an old man with wisdom and a sense of humor.

kayo_20211030 a day ago

Great story. There's something wicked personal in it, and it's very good. I reckon that this bloke's grandfather was an interesting bloke - cobol or no.

LarsDu88 a day ago

As long as there are tactical nukes that depend on COBOL, COBOL ain't dead.

We might all die, but COBOL will sit happy in its steel-reinforced nuclear bunker.

  • diggan a day ago

    Still doesn't beat Assembly, which will continue running on Voyager 1 even after the inevitable demise of our planet. Would survive the end of our solar system too.

    • LarsDu88 a day ago

      Assembly ain't a language. It differs for every chip microarchitecture. I doubt there are many folks who know Voyager 1 assembly.

      • criddell 19 hours ago

        Assembly is a language. It’s human readable, not machine readable. Modern assemblers support all kinds of higher level constructs through macros.

      • TheSkyHasEyes 15 hours ago

        > Assembly ain't a language. Differs for every chip microarchitecture.

        Your last sentence explains why ASM is a language. ASM compiles to machine language.

yawnxyz a day ago

huh so are any languages actually dead? ChatGPT mentions FORTRAN, ALGOL, or Pascal... which I don't think are dead at all.

Ada I've never heard of, so maybe that one's dead?

If they're able to write WebAssembly compilers for all these languages, then they'll probably live forever!

The only reason punchcards are "dead" is bc the machines are gone or mostly unavailable...

  • int_19h 21 hours ago

    It depends on how you define "dead". ALGOL proper has been dead for many decades, but pretty much all mainstream general purpose PLs today are its direct descendants, and sometimes this ancestry is plain to see (e.g. every time you write "struct" or "void" in a C-like language, that's straight from ALGOL 68). I once wrote a comment on HN summarizing all the various bits and pieces I know of that are still around: https://news.ycombinator.com/item?id=18691821

  • marcolussetti a day ago

    Ada is still updated; the latest standard was released in 2023. Given that its original audience was the Department of Defense, it seems to me very likely it is far from dead.

sshine a day ago

I know someone my age (mid-late 30s) who is a COBOL programmer for a bank.

He's been that for ~5 years.

I don't think it's going away any time soon.

Crontab a day ago

Do open source COBOL programs exist? Just wondering since I see it mentioned occasionally here.

mckn1ght a day ago

Huh, so it mentions 4GLs… what generation would we consider Rust/Kotlin/Swift then?

  • jcranmer a day ago

    The idea of programming language generations was based on paradigms of programming that never really caught on. The idea, roughly, is that a 3GL is a language where you specify how something is to be done, a 4GL is where you specify what is to be done instead, and a 5GL is where you specify the problem and the computer does everything for you.

    This breaks down because it's really difficult, outside of really constrained spaces, to turn a "what" specification into a high-performance implementation, so any practical language needs to give you some degree of control over the "how". As a result, any modern language sits somewhere uncomfortably between 3GL and 4GL, not fitting entirely well in either category.
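
    A COBOL-flavored illustration of the two ends (fragments only, with hypothetical names; embedded SQL stands in for the 4GL end and assumes the usual precompiler/database setup):

       *> 3GL style: spell out HOW -- read, test, accumulate.
        PERFORM UNTIL WS-EOF = "Y"
            READ TXN-FILE
                AT END
                    MOVE "Y" TO WS-EOF
                NOT AT END
                    IF TXN-ACCOUNT = WS-WANTED
                        ADD TXN-AMOUNT TO WS-TOTAL
                    END-IF
            END-READ
        END-PERFORM

       *> 4GL-ish style: state WHAT; the engine decides how.
        EXEC SQL
            SELECT SUM(AMOUNT) INTO :WS-TOTAL
            FROM TXN
            WHERE ACCOUNT = :WS-WANTED
        END-EXEC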

  • stonethrowaway a day ago

    They haven’t been around long enough to even be considered in the running.

    • adamc a day ago

      4GL was really more a marketing slogan than a true generation. The idea was something like: with third-generation tools, you have to drive the car, making every turn yourself; with 4th gen, you say "Go to the Ritz!"

      It wasn't true, although they did make some operations easier in tangible ways.

      Rust is a wholly different kind of thing -- not easier than, say, Java, but lots more control with better guarantees. It's more a systems programming language. 4GLs were all application-focused.

  • rodgerd a day ago

    The modern analogue of 4GLs would be the promise of LLMs letting you write prompts so you don't have to learn a programming language; the promise of the original 4GLs like Pearl (not to be confused with perl) and Objectstar was to let you have non-programmers writing business logic without being COBOL or FORTRAN programmers.

    • psunavy03 a day ago

      Ironically, the whole reason COBOL has its weird-ass syntax was to let you have non-programmers writing business logic without being assembly or C programmers. We can see how well that worked.

      • acdha a day ago

        I think about that every time I hear someone saying LLMs will make programmers unemployable. There’s no doubt that the work will change but I think a lot of the premise is based on a fundamental misunderstanding of the problem: business systems are just more complex than people like to think so you’re basically repeating https://xkcd.com/793/ where people higher on the org chart think the problem is just cranking out syntax because they “know” how it should work.

        I think we’ve had at least 4 generations of that idea that reducing coding time will be a game-changer: the COBOL/SQL era of English-like languages promising that business people could write or at least read the code directly, 4GLs in the 80s and 90s offering an updated take on that idea, the massive push for outsourcing in the 90s and 2000s cutting the hourly cost down, and now LLMs in the form being pushed by Gartner/McKinsey/etc. In each case there have been some real wins but far less than proponents hoped because the hard problem was deciding what it really needed to do, not hammering out syntax.

        There’s also a kind of Jevons paradox at work because even now we still have way more demand than capacity, so any productivity wins are cancelled out. At some point that should plateau but I’m not betting on it being soon.

sys_64738 a day ago

I recently found a 3.5” disk image I had with my 1990 COBOL programs on it.

  • iefbr14 21 hours ago

    You are lucky. I started in '75 and my first COBOL programs were on punch cards. Maybe some bits are still going around as napkins and toilet paper...

Frummy a day ago

It's tragicomic. Since it's at the core of renowned institutions, I thought surely this must be a world of logical, crisp perfection: a perfectly engineered engine. Surely, if these systems are so important and at the very center of what makes society work and all flow of money and whatever, geniuses must have perfected it all thrice over. I wouldn't say reality was equal to 1/expectations^3, but maybe 1/expectations^2. Probably no one will relate; a COBOL job was the first developer job of a relatively young guy like me. Crash course in tech debt, decades' worth of managerial shortsightedness, bureaucracy, and all that. At least the naive hope provided energy to learn it better, so it wasn't useless. But maybe it veered into delusion when I hoped to rewrite ALL of it at the company I was at.

nrollinson 21 hours ago

COBOL's gone? Time to tell grandpa his coding skills are officially retro chic.

MarkusWandel a day ago

Frankly, in all these stories about COBOL programs being modified for Y2K and whatever... isn't COBOL a compiled language? What's really amazing is that all these legacy systems have buildable source code and the toolchain to build them with, i.e. that that stuff hasn't suffered "bit rot" or other neglect.

bitwize 16 hours ago

I find it fascinating that the recommended environment for IBM mainframe COBOL development is... Visual Studio Code. IBM makes a plugin that lets you remotely edit COBOL code on the mainframe in your editor.

Guess COBOL is alive enough to warrant this kind of support.

  • zifpanachr23 13 hours ago

    Works for Assembly and JCL and REXX as well, and if you don't mind turning off some of your local LSP support like header resolution and the like, also C/C++ and Java and Shell.

    The old guard mostly still prefers ISPF, though, 'cause they've become really fast in it, not unlike a Unix greybeard preferring something like vim.

    I'm sorta torn on it. I like using the 3270 environment cause I can get around to different places a little easier than via vscode, but if I'm editing a lot of large files, it's nice to be able to see more code at once and have them open in multiple side by side tabs. You can do that in ISPF, but it's a little more unwieldy and you have less dynamic control over font size.

dev1ycan 16 hours ago

COBOL programmers spreading fake rumors about COBOL being dead so they keep their $200k-300k salaries

blastonico a day ago

Soon after Java was released, when the hype around the language was on fire, people used to say that it was going to replace COBOL, due to the "write once, run anywhere" motto. Java indeed gained market share in the finance industry, but COBOL is still there.

  • cies 16 hours ago

    Java is much bigger in that space, though. Java's relative market share in the finance industry, by %LOC, is waaaay bigger than COBOL's.

FrustratedMonky a day ago

Is PERL dead yet?

  • cies 11 hours ago

    I asked ChatGPT -- Please estimate the popularity of these languages, relative to the most popular one (that gets 100%). Base your answer on community activity on different platforms (stackoverflow, reddit, hacker news, dev.to, github) and the number of books available and sold: COBOL, Perl, PHP, BASIC, TCL, ColdFusion.

    Estimated popularity:

        PHP: 100% (reference)
        Perl: 30%
        COBOL: 25%
        BASIC: 10%
        TCL: 8%
        ColdFusion: 5%
    
    
    (I consider all of these dead, or "in maintenance mode")

    • rafark 7 hours ago

      Why PHP though? PHP is as old as Java, very popular, and constantly updated.

throw0101b a day ago

Bloomberg's Odd Lots podcast had an episode last year, "This Is What Happens When Governments Build Software":

* https://www.youtube.com/watch?v=nMtOv6DFn1U

One reason COBOL systems have been around for so long is because they encoded business rules that need to be understood if you want to try to transfer them to a new system. From the podcast (~16m):

> Like when we're working in unemployment insurance, again during the pandemic, my colleague was talking with the claims processors week over week and we're trying to dissect it and figure out what's going wrong and clear this backlog and one of these guys keeps saying, “Well, I'm not quite sure about that answer. I'm the new guy. I'm the new guy.” And she finally says, “How long have you been here?” And he says, “I've been here 17 years. The guys who really know how this works have been here 25 years or more.”

> So think about. You know, going from doing some simple cool, you know, tech app, you know, easy consumer app to trying to build or fix or improve upon a system that is so complex that it takes 25 years to learn how to process a claim.

> That's sort of, I think, what needs to be on the table as part of this agenda is not just “can the tech be better?” But can we go back and simplify the accumulated like 90 years of policy and process that's making that so hard to make?

Also an observation on how decisions are sometimes made:

> And I think that there's a deep seated culture in government where the policy people are the important people. They do the important stuff and technology, digital is just part of implementation, which is not just the bottom of a software development waterfall. It's the bottom of a big rigid hierarchy in which information and power and insights only flows from the top to the bottom.

> And so it's problematic in part because the people who are doing the tech are really just sort of downstream of everything else and the power and ability and willingness to step up and say “Hey, we probably shouldn't do those 6,700 requirements, we should probably focus on these 200, get that out the door and then, you know, add edge cases as as later.” There's no permission really to say that.

  • Muromec a day ago

    > There's no permission really to say that.

    There is no permission to say that, because your requirements are often set in black-letter law, and you didn't buy the right kind of suit to be present where they were decided over the last 40 years.

  • shadowgovt a day ago

    > ...add edge cases as as later.” There's no permission really to say that.

    I think there would be some value to closing that feedback loop to give legislators the signal "You know, what you're considering is actually pretty fuzzy conceptually... We're discovering while considering how to code it up that you probably don't actually have good, clear definitions for all the terms in this bill." But the biggest thing to remember about government IT is the clientele, which changes the approach from commercial / industry software.

    Google can optimize for the common case. Google can cut the edge cases. Google can change APIs on a whim.

    Google's users choose to be Google's users and can go elsewhere if they don't like it.

    Government citizens don't have that choice. And in general, people don't lose access to their food if Google effs up. Or go without their legally-deserved unemployment pay. Or go to jail because their taxes were mis-calculated.

    In the government space, the "edge cases" are human beings, alike in dignity. The rules and policies end up complicated because human beings are complicated. And yeah, it ends up being some messy software. Because you can't just decide to ignore the law when it's inconvenient to plumb the information that the client has a child under the age of 18 who is not a dependent because they're an emancipated minor, but said emancipated minor does have a child of their own, and the client is the primary caregiver for that child while her parent is in prison... from here to there in the dataset.

    • Muromec a day ago

      >Because you can't just decide to ignore the law when it's inconvenient to plumb the information that the client has a child under the age of 18 who is not a dependent because they're an emancipated minor, but said emancipated minor does have a child of their own, and the client is the primary caregiver for that child while her parent is in prison... from here to there in the dataset.

      That's all very true, but nobody ever codifies that. When the data doesn't fit the constraints of the form that aims to handle a reasonable generalized case, you simply get a phone call from a human in the loop. That human has a supervisor, and you can also go to court when they write your name with E instead of É and try to bullshit you about some kind of ASCII/EBCDIC nonsense like it's real.

      In the end you have one dataset which tells who is a child of whom, another telling who has custody rights, and a third one making sense of amounts and recipients of childcare subsidies. Maintained by different departments, and eventually consistent, or maybe not.

    • spongebobstoes a day ago

      My interpretation is a little different. We agree that humans are affected by the edge cases, although I believe that's also true at very large companies like Google or Meta.

      I don't think it's about avoiding programming 6,700 edge cases, but more so that when you have an excessive number of cases, it's likely an indication that something is being missed. That could be due to a bug in the software or due to unclear definitions in the legislation.

      In those cases, rather than attempting to program it exactly, it might be better to bring a human into the loop.

      And to me, that could be the point of having a tighter feedback loop. Because otherwise the developers will just do their best, which will be buggy or incomplete. Because they can't not do their job.