Programmers are dying out, dying out…

They just can’t manage to die out.

Have you come across such headlines?

How did they make you feel? When I came across them, I felt a surge of aggression, some teeth-grinding and a slight déjà vu.

And the longer I stared at the apocalyptic headlines, the stronger that feeling became.

After all, I’ve already seen this somewhere.

My memory is poor, but a vague feeling somewhere in the back of my mind led me deeper and deeper. And here I am, already back in 2016.

And here, in 2013.

At some point I asked myself: where did all this come from? What have these notional “programmers” done to displease everyone so much? And why programmers in particular?

That is how this article was born. In it I will try to get at the reasons that push authors to write apocalyptic forecasts (apart from the desire for hype, of course). I can’t promise I’ll get all the way there, and I won’t play psychiatrist and hand out diagnoses, but I will put forward a couple of hypotheses. Whether they are true is for you to decide.

Disclaimer. I will use the term “programmers” loosely, as a convention. I am going to hit the enemy with his own weapon in order to rank higher in search results. I hope to rise to the very top (if you know what I mean), so that people read my article first, and only then, with a smile, the hyped-up articles full of horror stories.

I won’t drag this out – I’ll go straight to answering the question: where did this hostility towards programmers, this urge to get rid of them, come from?

Fortran and COBOL as a means of “getting rid of” programmers

As the history articles tell it, one day Backus got tired of writing in assembly language – it was hard – so he came up with Fortran. There is truth in this, but one nuance gets missed: Fortran was conceived as a tool that would let scientists write programs themselves… and reduce the number of programmers.

Do you think I’m making this up? No, friends. This is written about, for example, in Encyclopedia Britannica:

The clipping says that Fortran made it possible to write programs quickly that ran about as efficiently as those written in machine or assembly language. At the same time, Fortran programs could be written much faster, and not only by programmers: engineers and scientists could do it too. The text (in the screenshot) says exactly this:

“With the creation of an efficient higher-level (or natural) language, also known as third-generation language, computer programming expanded beyond a narrow circle to include engineers and scientists who were instrumental in expanding the use of computers.”

That is exactly what I needed. It turns out that one of the reasons Fortran appeared was to bring more scientists into programming, in place of the programmers who wrote in assembly or machine language.

My hypothesis is supported by the fact that Fortran was introduced in the USSR for similar purposes. Here is what the article “Govorun and His Team” says about it.

“At JINR (Joint Institute for Nuclear Research) there were mathematicians who wrote programs according to the instructions of physicists. And at CERN, physicists wrote the programs themselves. In Fortran.

Govorun was a man of action, not a philosopher. He was interested not so much in theory as in the tree of life. He saw: it works, and it works effectively. So it will work for us too.

….<>

Nikolai Nikolaevich Govorun was given all the documentation for the translator. Fortran-66 was written here at CERN, and this Fortran standard was subsequently implemented on BESM-6. CERN transferred all processing programs written in Fortran and its entire Fortran library to JINR. The rest, as they say, was a matter of technique.”

For context: Nikolai Nikolaevich Govorun was a Soviet mathematician, corresponding member of the USSR Academy of Sciences, and editor-in-chief of the journal “Programming” from 1977 to 1988. It was under his leadership that Fortran was introduced at JINR and the first Fortran translator for the BESM-6 was created. And the results were the same as Backus’s.

Quote from the article (link above).

“Previously there were programmers, who were also mathematicians; there was a computing centre; there were customers from other laboratories. Orders came in an impersonal form. N. N. Govorun cited as a curiosity the case when Candidate of Physical and Mathematical Sciences I. N. Silin carried out a BLTP order placed by a graduate student.

The Fortran revolution put everything in its place. Physicists themselves began to write processing programs.”

My conclusion: one of the fundamental reasons for Fortran’s emergence was to give scientists the ability to write programs themselves – and to reduce the number of programmers. There is another reason, money, but more on that later.

Now let’s move on to COBOL and go straight to Grace Hopper. This is how she describes her work on the language in an interview she gave in December 1980.

Quote:

Hopper: “I kept calling for more user-friendly languages. I have always tried to do that, which is why I need these other people-oriented languages. Most of what we get from computer science (and academia?) is not adapted to people at all.”

And in her work on COBOL, as Grace Hopper herself said (in the same interview), she was entrusted with exactly what she wanted to do – making it easier for business users to work with computers. To back this up, here is a quote from a historical article about COBOL.

“Experience with the previous system, the Harvard Mark I, led Grace Hopper to the conviction that computers needed a programming language consisting of commands in English, not special characters….<>…COBOL was created from the very beginning as a language for business – the main types of input and output data in it are numbers and text strings. At the same time, thanks to the use of clear commands in English, even a non-specialist, looking at the code, could figure out exactly what actions the program performs.”

Pay attention to the phrase that “even a non-specialist could figure out the code.” It turns out that COBOL was NOT created for programmers. It was created for business, as they write on a Brown University resource (one of the oldest and most prestigious private universities in the USA).

“Although FORTRAN was good at processing numbers, it was not so good at handling input and output, which mattered most for business computing. Business computing began to gain momentum in 1959, and because of this COBOL was developed. It was designed from the ground up as a language for business people

…<>…

COBOL instructions also have a grammar very similar to English, making it quite easy to learn. All of these features were designed to make it easier for average businesses to learn and implement.”

One of the goals of COBOL’s development was to push professional programmers aside, so that anyone could sit down at a computer and managers could do without developers, writing code in plain English words. The idea is the same as with Fortran.

And COBOL delivered – you could take a person from a completely different field, train them, and within a month or two they could already write programs. I am not exaggerating. I found a piece from 2012 in which a developer who wrote COBOL from at least 1969 to 1989 describes exactly this.

Translation (Google Translate).

“…COBOL had one undeniable advantage that continues to this day: it is easy to learn. It is a simple language that encourages a clear coding style. Each variable is clearly defined in detail, as are the file formats. We typically recruited employees with little or no computer education and spent several months getting them to be productive in terms of computer programming. And COBOL completely freed employees to focus on the complexity of business applications rather than the complexity of the language.

Our bank then discovered that bringing in an experienced loan officer, training them in COBOL, and then asking them to help program a complex loan business process worked extremely well. We also found that we could take the computer operations staff, give them a 30-day training course, and then send them to the user departments to learn the business processes before putting them back to programming some of those business processes.”

How do you like the sound of a “COBOL developer in 30 days” course? Sorry, just an unfunny joke.

But why have I been going on about this for several pages? Because I want to get one idea across: the idea of ridding software development of programmers has been in the air ever since computers, software development and programmers themselves first appeared.

But where did this attitude come from?

Let’s try to figure it out. The first place to start is economics.

These days computing resources are relatively cheap, and personnel costs far exceed the cost of the hardware. From the 1950s through the 1970s, however, it was the other way round. For example, in “A View of 20th and 21st Century Software Engineering”, Barry Boehm describes the cost ratio in 1955 as follows.

“On my first day at work, my manager showed me the GD ERA 1103 computer, which occupied a large room. He said: ‘Now listen. We’re paying $600 an hour for this computer and $2 an hour for you, and I want you to act accordingly.’”

Although one person’s salary was a pittance compared with the machine, in aggregate programmers were VERY, VERY expensive. Let’s turn to Backus again. Here is what he writes about the reasons for Fortran’s appearance in the article “Programming in America in the 1950s—Some Personal Impressions”.

“FORTRAN did not really grow out of a brainstorm about the beauty of programming in mathematical notation. It started with the recognition of a basic problem of economics: the costs of programming and debugging had already exceeded the cost of running the programs.

As computers became faster and cheaper, this imbalance became more and more intolerable. This prosaic economic understanding, plus experience with tedious programming, plus an unusually lazy nature, led to my constant interest in simplifying programming. This interest led directly to work on Speedcoding for the 701 and efforts to introduce floating point operations to the 704, as well as indexing.”

Prosaic, but it comes down to money.

Here’s a clarification from a book on the history of Fortran…

…which draws on Backus’s own account. Here is what it says…

In translation (thanks, Google Translate), the gist is this:

“…one of the factors that influenced the development of FORTRAN was economics. In the 1950s, the cost of programmers was no less than the cost of the computer itself (this fact follows from the average salary plus overhead costs and the number of programmers in each center, as well as from computer rental data.)

In addition, between a quarter and half of the computer’s operating time was spent on debugging. That is, programming and debugging accounted for as much as three-quarters of the cost of operating a computer. Obviously, as computers became cheaper, this situation got worse. This economic factor was one of the main motives that led me to propose FORTRAN in a letter to my boss Cuthbert Hurd in late 1953 (exact date unknown).”

What can we understand from this passage? Programming is expensive.
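To make the arithmetic behind that conclusion concrete, here is a back-of-the-envelope sketch in Python. It is my own illustration, not something from Backus or Boehm; the numbers are just the round figures implied by the quotes above (people cost about as much as the machine, and a quarter to a half of machine time goes to debugging).

```python
# Illustrative cost split, using the assumptions from the quotes above:
# programmer cost roughly equals machine cost, and 25-50% of machine time
# is burned on debugging. Numbers are normalized, not historical data.

machine_cost = 1.0      # cost of renting/running the computer
programmer_cost = 1.0   # "no less than the cost of the computer itself"

for debug_share in (0.25, 0.50):                 # share of machine time spent debugging
    debugging_cost = machine_cost * debug_share
    total = machine_cost + programmer_cost
    people_plus_debug = programmer_cost + debugging_cost
    print(f"debug share {debug_share:.0%}: programming + debugging "
          f"= {people_plus_debug / total:.0%} of total cost")

# At a 50% debugging share this comes out to about 75% - the "three-quarters"
# from the quote.
```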

The next thing to move on to is “independence.”

For example, here is what the 1977 article “Programmer: An Assault on Bugs” has to say.

It contains an interesting quote from its protagonist, the programmer George McMahon. This is how he describes his colleagues’ work a decade earlier.

“In the 1960s…programmers held some of the few office jobs that offered a lot of independence…Independence has always been one of the benefits of being a programmer. They often worked on their own. Moreover, there is no good way to check the programmer’s progress. If you want to see how the builder is doing, you can go and look at the house. There’s not much to pay attention to in the program until everything is ready.”

Here’s another quote from Backus, from the article “Programming in America in the 1950s—Some Personal Impressions”.

“Programming in America in the 1950s was full of enthusiasm, virtually untainted by scholarship or the stuffiness of academia. The computer inventors of the early 1950s were too impatient to hoard an idea until it was fully developed and written up in scientific papers. They wanted to convince others.

Action, progress and superiority over competitors were more important than simply authoring an article. Recognition in the small community of programmers was more often given for a bright personality, outstanding programming achievements, or the ability to drink well than for intellectual insight. Ideas flowed freely along with the drinks in countless meetings, as well as in sober private discussions and informally circulated documents. Ideas were the property of anyone who might use them, and the scientific practice of noting sources and related work was almost universally unknown or unpractised.”

So the programmers were on their own, and authorities were no authorities to them? I think so: they had to strain their brains incredibly hard to make programs run fast enough to justify the high cost of keeping them around. And all they had at hand was a description of the task and the machine’s instruction manual.

And now the question: how did credentialed scientists feel working alongside a person on whom their own work depended, but who cared nothing for academic degrees – and whose doings at the computer were a mystery besides? I don’t know the answer, so I’ll just leave here a quote from John A. N. Lee, historian and PhD (University of Nottingham, UK, 1958 – a contemporary of Backus).

“Mr. Backus and his young team developed a programming language that was like a combination of shorthand and algebra. Fortran was very similar to the algebraic formulas that scientists and engineers used in their daily work. Having undergone training, they no longer depended on priest-programmers to translate their scientific and engineering problems into a language that a computer could understand.”

Did you notice the word “priests” (yes, “priesthood” can be translated in other ways, but let’s keep “priests”)? That word is not there by accident. Backus himself called the activity a “black art,” in the sense that it was secretive, and wrote about it in the same article, “Programming in America in the 1950s—Some Personal Impressions”:

“Programming in the early 1950s was a black art, a private, secret affair involving only the programmer, the problem, the computer, and perhaps a small library of routines, and a primitive assembly program. Existing programs for similar problems were unreadable and therefore could not be adapted for new purposes. There were practically no general principles of programming. Thus, every problem required starting from scratch, and the success of the program depended primarily on the personal techniques and ingenuity of the programmer.”

And here is a quote from Backus about “priests.”

“Just as freewheeling Westerners developed a chauvinistic pride in their frontiersmanship and a corresponding conservatism, many programmers of the heady 1950s began to regard themselves as members of a priesthood guarding skills and secrets far too complex for mere mortals…

…This sentiment is noted in a 1954 article by J. H. Brown and John W. Carr: “many professional machine users strongly opposed the use of decimal numbers… for this group the process of instructing the machine was one that could not be handed over to the uninitiated.” This attitude cooled the impulse for more sophisticated programming aids. The priesthood wanted, and got, simple mechanical aids for the clerical drudgery that burdened them, but they regarded with hostility and derision the more ambitious plans to make programming accessible to a wider population.

To them it was obviously a foolish and presumptuous dream to imagine that any mechanical process could perform the mysterious feats of invention required to write an efficient program. Only the priests could do that.

They remained steadfastly opposed to those crazy revolutionaries who wanted to make programming so easy that anyone could do it.”

That’s it.

Summing up, we get a more or less clear picture of why the profession was treated this way and why people kept trying to get rid of it. It boils down to something like: “Very expensive programmers are doing something incomprehensible, on which a great deal depends… how do we get rid of them?” Sound familiar? I am not claiming to be 100% right, or that these were the only reasons Fortran and its kin were created (there were others), but this is one of the sources of the attitude towards programmers in general.

And the echo of this attitude can still be heard.

And the horror stories about replacing programmers with AI fit neatly into this old idea of replacing the “priests” with something more predictable…

“Evil” computers from the 70s to the 90s

Initially there was no idea of replacing programmers specifically with a program or some kind of device – I did not find a single quote to that effect from scientists, researchers or other notable figures.

There were hopes that machines would be able to think like a person. For example, in 1970 Marvin Minsky told Life magazine:

“In three to eight years, we will have a machine with the general intelligence of the average person.”

There were hopes that machines would learn to play complex games – Herbert Simon, one of the pioneers of AI research, said in 1957 that AI would be able to beat humans at chess within the next 10 years.

Pioneers like Newell hoped that AI would become a universal solver of problems beyond human capabilities, mathematical ones in particular. He, by the way, did everything he could to bring that dream closer by creating the General Problem Solver.

In general, AI was seen as an assistant, an “amplifier” of human abilities.

“In an interview with the magazine “Economic Strategies” Valery Ovchinnikov gives an example of arithmetic devices based on threshold logic elements created in the 1960s. These modular arithmetic circuits could be trained by “weighting” the signals at the inputs of threshold logic gates. It’s easy to draw a direct analogy with neurons and perceptrons and the regulation of neuron thresholds: modern neural networks learn in exactly the same way.

…<>…

Scientists designed a system that could learn to solve simple problems: recognize images and speech, encode information, calculate the trajectories of moving targets.

When A. N. Kosygin arrived in Zelenograd, he was shown a small microelectronic device that could, at the hardware level, solve the problem of predicting various events and calculate aircraft trajectories. Kosygin immediately realized that this miracle could be used for management purposes. Surprisingly, he grasped the essence at once: the main thing is to teach the elements to learn, and then any problem can be solved… He realized that we were developing artificial intelligence that greatly enhances human capabilities, and turned to us with the question: “Could you build such a system, put it in my office, and run programs on it so that it learns to do something useful for me, studies my subordinates and, if possible, teaches me to make certain decisions?”
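The analogy in that quote – training by adjusting the weights on the inputs of threshold elements – is essentially the perceptron learning rule. Here is a minimal sketch of that rule (my own toy illustration in Python, not code from the interview or the article), learning a logical AND:

```python
# Perceptron-style learning on a threshold element: nudge the input weights
# (and the bias, i.e. the adjustable threshold) whenever the thresholded
# output is wrong. Toy illustration only.

def step(x: float) -> int:
    """Threshold gate: fires (1) when the weighted sum reaches the threshold."""
    return 1 if x >= 0 else 0

# Toy task: learn logical AND of two binary inputs.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # input weights
b = 0.0          # bias (shifts the effective threshold)
lr = 0.1         # learning rate

for _ in range(20):                                # a few passes over the data
    for (x1, x2), target in samples:
        out = step(w[0] * x1 + w[1] * x2 + b)
        error = target - out                       # -1, 0 or +1
        w[0] += lr * error * x1                    # classic perceptron update
        w[1] += lr * error * x2
        b += lr * error

print(w, b)                                        # learned weights and threshold
print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in samples])  # -> [0, 0, 0, 1]
```

Modern neural networks replace the hard threshold with smooth functions and the simple update with gradient descent, but the core move – adjusting weights based on errors – is the same, which is what the analogy in the quote points at.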

AI and robots were seen as helpers in the home and in the national economy. For example, here is a screenshot from the book by A. G. Ivakhnenko and V. G. Lapa, “Prediction of Random Processes” (Naukova Dumka, 1971), taken from the article “History of neural networks in the USSR”.

These were roughly the expectations.

But I did not find any forecasts that AI would replace the programmer specifically. I found only fears and concerns that “computers” in general would replace humans as such. Take, for example, a quote from Mr. McMahon from the 1970s (link above).

“Computers are taking over the world. Computers are putting everyone out of work. A computer as president. A computer as a garbage man. Interesting concepts. I think I can understand why people worry about them, but it doesn’t bother me. We are at such a primitive stage; computers have only been around for 25 years. Computers can’t do everything, and, at least in my lifetime, they won’t be God. It’s an amazing device for getting things done. But it is, of course, not God.”

And here is an excerpt from an interview with Zaal Lyanov, head of the EPAM training centre in St. Petersburg:

“In the late 1970s, when I was a high school student who had just begun programming, the possibility of AI replacing developers was already being discussed…<>…At university I was partly involved with AI, and our department was working on processing texts in an artificial language. According to the futurologists, AI would soon learn to handle specific tasks and deliver ready-made solutions. More than 30 years have passed, and that “soon” still hasn’t arrived. And judging by what is happening in the field now, despite the seemingly grandiose successes, there is still a very long way to go before humans are replaced.”

Again, this is not about programmers specifically.

Then I thought: maybe I would find such horror stories in publications from the 80s and 90s? There were, after all, plenty of achievements: self-learning robots, machine vision systems, a self-driving van built by Mercedes-Benz (look up Ernst Dickmanns), the first chatbots (Jabberwacky and Cleverbot), and in the 90s Deep Blue actually beat grandmaster Garry Kasparov at chess. No, I found nothing about programmers…

And then I thought I should look deeper again and found a quote from AI pioneer Herbert Simon from 1965:

“In twenty years, machines will be able to do any job that a person can do.”

Here it is! This is the key! At some point the idea took root in the mass consciousness that machines / AI / robots would take jobs away from people – people in general. And for many decades, right up to the 2000s, advances in the field only intensified those fears. Nobody gave a thought to AI taking programmers’ jobs specifically: “We’re all about to be out of a job here – who cares about programmers!?”

Everyone was simply afraid of being left without work. And authoritative people only fed those fears.

Timothy Leary: “By 2007, the deficit problem will be solved. Since robots and computers will do most of the work, you won’t have to work.”

So nobody paid attention to programmers. And when it became clear that the predictions about robots replacing everyone were not coming true, attention switched to IT. The proverbial robots won’t replace the hypermarket cashier? Well, we have these incomprehensible programmers over here – paid a lot, keeping to themselves, living in their own little world. How about we replace them?…

But until the 2000s arrived and AI matured enough to take anyone’s job, the job-taking was done by programmers from India.

Horror stories about outsourcing to India

In May 2019, RBC published an interview with Gerd Leonhard – futurist, billionaire and philanthropist. Here is what Gerd predicted back in those distant, wonderful times:

“We live in a world where more than 70% of the professions in demand in the future do not yet exist, and 50% of currently existing professions will soon turn into freelance ones. Everything is changing too quickly. For example, the social media industry, which didn’t really exist a decade ago, now employs 21 million people. Now people are trying to teach children the exact sciences – mathematics, physics, development, engineering disciplines. But this is exactly what machines are already better at than we are!

In ten years, all developers will be unemployed—or at least most of them. India produces 1 million engineers a year – can you imagine what an army of unemployed people there will be? We need to teach what makes us human – the ability to communicate, understanding, humanism. I tell my son: traveling the world is much more useful than studying for an MBA.”

And what a headline it was.

Oh-ho-ho… Dear Gerd, I have bad news for you.

Besides, you are a little late: your horror stories are outdated. As the saying goes, The Simpsons already did all of this back in the 90s, which is where this bogeyman comes from. Digging through forums full of memories of those times, I found plenty of interesting things.

Translation.

“I started programming in the early 90s and went to college in the late 90s/early 00s. It may seem crazy, but because of these tools and outsourcing, everyone I spoke with thought that being a programmer was a dead-end minimum wage job, and many strongly advised me to choose another specialty.”

And further.

“When I majored in CS in 2000, I was told that…it would be impossible for an American to find a job because they would hire all their programmers from India.”

And further.

Translation

“My friend’s mother, the only professional programmer I knew, told us that all programming work in the US would be done remotely from China/India”

And further.

“I graduated from school in 2001. People told me that everything would be outsourced to India and that there was no point in starting programming.”

And further.

Translation:

“I did my CS degree in 2000, I heard the same thing, companies would rather send someone from India than hire you”

It may seem that I have cherry-picked a few quotes and am passing them off as the truth? Perhaps, but whole books were written on the subject. Here is Edward Yourdon’s “Decline and Fall of the American Programmer”. The first edition. And the most recent one.

Here’s the summary.

“Software development could soon move from the United States to software factories in a dozen other countries if American development organizations do not use the key software technologies discussed in this new publication, according to Edward Yourdon…”

In the book, Ed Yourdon puts forward the very plausible-sounding thesis that software development will move overseas, where labour is cheaper. To be fair, the thesis has a right to exist: watching corporations move manufacturing to China, India and Southeast Asia, such a thought comes naturally. But the conclusion drawn from it turned out to be wrong.

Ed strongly advises programmers to adopt mechanistic methods so they can churn out code like “cookies in a factory.” But Yourdon “slightly” confused programming with what we might loosely call coding. He also mixed up stamping out a product – a cookie or a car, built from a set of standard operations – with software development.

Software development is not just code. In fact, it is NOT mostly code. Replacing a developer with a machine, as in a factory, will not work. The idea is attractive, but not viable. After all, already in the 2000s no one was talking about programmers from India as a threat anymore.

What’s interesting is that the myth took off in its own way: India now has an enormous number of IT workers – and fairly substantial unemployment among Indian developers.

Moreover, the situation has turned 180 degrees – now it is the developers in India whom the media are scaring!

The headline reads: “Top AI expert says most outsourced programmers in India will be unemployed in 2 years thanks to this technology.”

Very ironic.

Interestingly, later the scare shifted from programmers from India to journalists from India who would supposedly work for food.

And now, instead of journalists and programmers from India, we have ChatGPT – and the very same threats.

What did I want to say?

It has all come out a bit chaotic, so I will try to pull everything together.

First thought.

As I see it, the idea of “abolishing” programmers has been around since the very moment software development appeared as a phenomenon. The thought settled long ago somewhere in the back of the mind – maybe in the collective unconscious (I am no expert here, correct me) – and it breaks out again and again, given the slightest excuse.

While AI threatened everyone, programmers were forgotten. Or maybe the problem simply did not exist back then? Perhaps, but most likely it did – otherwise why put the outsourcing horror stories into circulation? And when the apocalyptic predictions predictably failed to come true, the “unconscious” switched (once again) to IT.

So the horror stories about programmers being “cancelled” by AI are a kind of displacement of a general fear of a frightening, incomprehensible future onto programmers – the people everyone has been trying to get rid of for 70 years, because the broad public does not understand what programmers actually do or why they are paid that kind of money. Hence the giddy thought: “Aha, ChatGPT can write code – now you’ll be left without a job!”

Second thought. From here on it is just musing – venting what has boiled over.

On top of the first thought comes plain, banal human stupidity and overestimation.

AI is overrated. It was overrated in the 60s, the 70s, the 80s, and it is overrated now. And not because of AI itself, but because it is treated as a kind of magical technology that people latch onto and make noise about. A typical example from the recent past is blockchain. I won’t bother digging up article clippings, because you remember perfectly well how much noise there was about it. So where is it now? It is not about the technology. People simply hyped blockchain up and then dropped it, because something more interesting came along.

It is about the media. They latch onto something “magical”, graze all the hype grass down to bare ground, and run on without going into any detail.

  • It doesn’t matter that a supercomputer beating the world chess champion doesn’t mean it is good at anything else. The headline is what counts.

  • It doesn’t matter that an algorithm being able to solve programming problems doesn’t mean it can write software. It doesn’t matter that code is not the biggest part of the process, and that besides the people who write code there are people who write requirements, people who test, and much more. No – let’s just generalize and engage in wishful thinking!

That is why I simply don’t read such articles. Say I see something like: “AI will replace all copywriters and editors within a year. Link to vacancies at the end of the article – apply before it’s too late, you fool!”

First, let them try. Purely as an experiment. I already know what the results will be 🙂

And secondly, I have seen all this many times before. These horror stories repeat themselves in different variations year after year. “History is living memory, the teacher of life and the herald of new events,” as comrade Pliny said. He who knows history knows the future. And it seems to me I know a little history, so I sit calmly and don’t fret.

And if I am not worried, then you – developers, QA, DevOps engineers, analysts and everyone else – have nothing to worry about either. If my work is hard to hand over to a “Chinese room” language model, then yours is harder by orders of magnitude! Not several times harder – orders of magnitude harder!

They have been trying for 70 years, but they just can’t.

But this is my purely objective opinion.

And in general, don’t worry – it will pass. The wave will subside over time, as always, and they will scare us with something else. This is what I am 100% sure of.


In general, I may be wrong, I may be mistaken, I may be overestimating things, and so on. Despite the mess I have written here, I hope for objective, constructive criticism and your generosity. Thank you for your attention – I shake your hand firmly.
