Is this really the end of software development?

At beeline cloud, we often discuss topics that concern the IT community.

Layoffs, layoffs, layoffs: they never seem to end. The website layoffs.fyi even keeps a tracker of layoffs in the technology sector. Startup funding has fallen to its lowest level in five years, writes Crunchbase. And on top of the general economic problems, a fear looms on the horizon: that AI will completely displace programmers.

Tech news these days reads like reports of the end of the world. But is it really?

As someone who started a programming career right after the dot-com bubble burst, I can confidently say that things are not all bad. When I was first starting out as a professional, people told me that “the Internet is dead” and that I should get a real job. But the problems that dot-com startups struggled with never went away. Moreover, many of the solutions found in the early 2000s were buried, only to resurface a few years later.

It's not over yet, just like in 2000

When I started working as a junior programmer in 2001, the dot-com crash was battering the tech industry. It was what forced Google to launch an advertising platform in order to turn a profit. Tech companies lucky enough to stay afloat scrambled to start making money as quickly as possible; many others did not survive at all. The technology and software industry gradually recovered, then improved and developed rapidly over the following years.

The doomsday tale that “the Internet is dead” ultimately proved untrue. The Internet, technology, and software went on to evolve faster than ever before. It wasn't the end then, and it isn't the end today. Take a deep breath and look around. We still fly subsonic planes, there are still no flying cars, and I can't even order a hoverboard. Glasses-free 3D turned out to be a dud, and we still don't have the holographic calls that Star Wars showed us back in the 1980s.

Your car can't even park on its own, let alone drive around town unattended. Yes, there are self-driving cars stuffed with a ridiculous number of sensors, like Google's Waymo. But… just place a traffic cone on the hood of one to experience the full power of current AI.

AI won't be able to replace even a single developer in the near future

“New development tools and increased computing power mean we will need fewer programmers” was the credo of the early 2000s. In reality, the exact opposite happened. Even though individual software projects got by with fewer developers, the general appetite for technology meant there were more and more projects, and therefore more and more developers had to be brought in. AI and cloud technologies are radically improving developer productivity. But they also create new problems and opportunities that require even more developers to solve.

If you've ever used GitHub Copilot, ChatGPT, Google Gemini, or Chevrolet Chat to generate code, you've probably noticed that these models' training and scope are significantly limited.

LLM training and the “AI dilemma”

If you ask any LLM to write Swift code for an in-app purchase on iOS, the output will most likely be a StoreKit 1 implementation of roughly 50-60 lines. Yet StoreKit 2, announced two years ago, greatly simplifies the in-app purchase flow and reduces the code to just a few lines. LLMs such as ChatGPT or Copilot are trained either on outdated data or on a sample dominated by StoreKit 1 implementations, which pushes the model to reach for StoreKit 1 every time, since the neural network “considers” it the most suitable answer.
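
For comparison, here is roughly what the few-line StoreKit 2 flow looks like. This is a minimal sketch, assuming iOS 15+; the product identifier com.example.premium is a placeholder, not a real product:

```swift
import StoreKit

// Minimal StoreKit 2 purchase sketch (iOS 15+).
// "com.example.premium" is a placeholder identifier.
func buyPremium() async throws {
    // Fetch the product, then start the purchase flow.
    guard let product = try await Product.products(for: ["com.example.premium"]).first else {
        return
    }
    let result = try await product.purchase()
    // Unlock content only for a cryptographically verified transaction.
    if case .success(let verification) = result,
       case .verified(let transaction) = verification {
        // ...grant the entitlement here...
        await transaction.finish()
    }
}
```

Compare that with the 50-60 lines of delegate callbacks a typical StoreKit 1 implementation needs, and it is easy to see why a model stuck on the older API produces noticeably worse code.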

The reason lies in how LLMs work. They are language models: they do not “understand” or “grasp” the true meaning of words. They use the available information to produce a variation on a given theme. The variation does look like something new, since the information never existed in exactly this form before. But the result, whichever way you look at it, is still assembled from ready-made training data. This leads to the AI dilemma: it becomes very hard for innovation to break through, because the model tends to produce the most applicable, and therefore most popular, answer.

“Create a photo of a programmer in 2050 who writes applications for a computer.” Wait, what? Will computer screens get smaller again, and will we go back to wearing ties while writing code? 😉

In the social sciences and in computer science this effect is known as the Matthew effect, or “the rich get richer.” The more output an LLM produces, the more often that output, once humans have put it to use, becomes training data for the next version of the model. At some point, if people stop feeding it their own creativity, the LLM will stop producing anything new and will simply circle through old information. Ultimately that leads us to a dead end.
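
To see why this loop is dangerous, here is a toy simulation, a deliberately crude sketch rather than a real training pipeline: a “model” that favors already-popular answers is retrained on its own output each generation.

```swift
import Foundation

// Toy corpus: four competing answers, initially equally common.
var corpus = ["StoreKit 1": 25, "StoreKit 2": 25, "RevenueCat": 25, "custom IAP": 25]
let corpusSize = corpus.values.reduce(0, +)

for generation in 1...8 {
    // The "model" prefers popular answers (weight grows as count^2),
    // and its output becomes the next generation's training corpus.
    let weights = corpus.mapValues { $0 * $0 }
    let totalWeight = weights.values.reduce(0, +)
    var next: [String: Int] = [:]
    for _ in 0..<corpusSize {
        var pick = Int.random(in: 0..<totalWeight)
        for (answer, weight) in weights {
            pick -= weight
            if pick < 0 {
                next[answer, default: 0] += 1
                break
            }
        }
    }
    corpus = next
    print("generation \(generation): \(corpus)")
}
```

Within a few generations one answer crowds out all the others: the “rich get richer” loop in miniature.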

There are ways to mitigate this on a small scale and in simpler systems, such as recommendation engines. But for the broader problems where LLMs are now widely applied, a scientifically grounded algorithmic solution to this dilemma is hardly in sight.

The only viable solution at this point? Human intervention.

There are things LLMs may never know

LLMs need to be trained, and for that they need to obtain a large amount of data from people. Moreover, they need that data in digital form. Here is a very simple, illustrative example.

Human: How many Reichsmarks did a first-class trip from Cologne to Barcelona cost in 1931?

Copilot: Unfortunately, I don't have exact fares for this particular route in 1931.

ChatGPT: It is difficult to provide accurate information on the cost of first class train travel from Cologne to Barcelona in 1931, as historical ticket prices can vary and are not always well documented.

By the way, there is an answer: 160 Reichsmarks (110 in second class and 70 in third). How do I know? Because I can read historical documents that have never been scanned and are not available digitally.

The same fundamental problem applies to programming. When I was writing my article “Vintage Programming On Macintosh System 7.5 With Think C & ResEdit”, I tried several LLMs. Not one of them could create even a simple button that shows a dialog with a message when clicked. They failed completely, producing either gibberish or code not even remotely related to the classic Macintosh system. And these are just isolated examples; the list of LLM limitations is almost endless.

Leave it to an LLM to port a 1990s Macintosh application to modern macOS? Well, good luck.

This is not so much archival information as a very specific domain of knowledge that a model would have to be specially trained for. And accuracy is far from a strong point of neural networks; the algorithms simply do not provide for it. The media is full of examples of LLMs already in production drifting away from the source data they were given and producing errors that lead to real problems.

It is impossible to copy an organic object that has not been fully studied

Ask any neuroscientist what exactly happens in the human brain when a person falls in love. The answer will be: we don't know for sure ourselves. And that's completely normal. Science has not yet “debugged” the human brain. It cannot be reproduced. It cannot even really be treated, with the exception of certain operations that people of the future will probably consider barbaric and medieval.

Science still has no idea how this whole “thinking” thing works.

How do you create artificial general intelligence when you have no idea how an organic brain works? You don't. To reproduce something, you need to fully understand its workings in every detail. Humanity is very, very far from understanding exactly how the human brain functions, and we are even further from understanding how the brain learns and stores information. All we can do is “emulate” the mind by primitive means.

Don't get me wrong: I love LLMs and think we've made great strides in AI. Synthesized voices have become very good, to the point of being extremely difficult to distinguish from natural human ones. Image and video generation is fantastic. Still, I think we are a long way from seeing a full-fledged Hollywood blockbuster filmed entirely by a computer without human involvement.

Software and technology have outpaced people

Rapid progress in technology, software, cloud computing, and artificial intelligence lets people find many new solutions to known (and even not-yet-known) problems. On the surface, some of our challenges look purely technological, such as self-driving cars. In reality, things are more complicated.

In my experience driving an autonomous BMW, most problems arise from the clutter people create on the road: unclear lane markings that even an experienced driver can't make sense of, or conflicting road signs. And all this in Germany, a country known for its precision and legal scrupulousness, where following very specific, precise rules is the custom. It would seem the ideal environment for an autonomous car. Not at all.

If even humans cannot make sense of the traffic situation, how will a computer?

It is the chaos inherent in humans that hinders technological progress.

Most people don't know what chips are in their phones or what they are for, let alone how any of it works.

People don't understand why their credit card works abroad. They don't understand how their mobile phone transmits data or authenticates with the VLR and HLR registers of the cellular network. Technology dictates the rules of our daily lives every second, and most of the people who use it have no idea how it works at even the most basic level. They know how to operate technology but lack the fundamental knowledge behind it. The amount of technology people rely on every day without any idea of how it works is simply mind-boggling.

A society in which technology develops faster than the people in it, and pulls far ahead of them, may give rise to another dark age: an era in which technology and science have to be hidden from people to prevent fear and myth-making. Sam Altman's warnings may indicate that our society is already moving in this direction.

Society is experiencing an education crisis

For a person to keep pace with technological development, constant training and education are vital. However, the decline of education systems is leaving millions of people behind.

This applies not only to the proverbial Tom, Dick, and Harry, but also to software developers. Look at the adoption of modern cloud technologies, serverless applications, and approaches such as event-driven architecture, then compare it with the number of “classic” three-tier web apps, and you will see that the adoption of modern application architectures still has a long way to go. New applications are still being built on traditional architectures that make little sense in today's world, not because it is expedient, but because their developers simply don't know anything better suited.
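
To make the contrast concrete, here is a deliberately simplified sketch; every name in it (Order, OrderPlaced, EventBus) is invented for illustration and belongs to no real framework. The classic handler performs every side effect inline before responding, while the event-driven one only publishes a fact and lets independent consumers react:

```swift
import Foundation

// All names here are invented for illustration, not a real framework's API.
struct Order { let id: UUID }
struct OrderPlaced { let orderID: UUID }

func saveToDatabase(_ order: Order) { /* persist somewhere */ }
func sendConfirmationEmail(for id: UUID) { print("email for order \(id)") }

// Classic three-tier style: one synchronous handler does everything,
// so every side effect delays the response to the client.
func createOrderClassic() -> Order {
    let order = Order(id: UUID())
    saveToDatabase(order)                 // data tier
    sendConfirmationEmail(for: order.id)  // side effect handled inline
    return order                          // response waits for all of it
}

// Event-driven style: the handler records that something happened and
// publishes an event; consumers react independently. (A real system would
// deliver events asynchronously via a queue or topic; delivery here is
// synchronous only to keep the sketch short.)
final class EventBus {
    private var handlers: [(OrderPlaced) -> Void] = []
    func subscribe(_ handler: @escaping (OrderPlaced) -> Void) { handlers.append(handler) }
    func publish(_ event: OrderPlaced) { handlers.forEach { $0(event) } }
}

let bus = EventBus()
bus.subscribe { event in sendConfirmationEmail(for: event.orderID) }

func createOrderEventDriven() -> Order {
    let order = Order(id: UUID())
    saveToDatabase(order)
    bus.publish(OrderPlaced(orderID: order.id)) // fire and forget
    return order
}

_ = createOrderEventDriven()
```

The point of the second style is that the write path no longer has to know about, or wait for, everything that happens downstream; consumers can be added, scaled, or fail independently.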

It now takes twice as long to fly from London to New York as it did in the 1980s.

Believe it or not, there are still plenty of websites that are not optimized for mobile devices (in 2024!). I've personally met web developers who couldn't explain, even in the most general terms, how TCP/IP or even HTTP works. Yet these are the network protocols they use and work with every day.

Recent layoffs in the technology sector are often explained as “oversupply.” But if overstaffing had produced more software, more services, more users, and more revenue, there would be no need to cut staff. Instead of gaining productivity by adding employees, tech companies became less productive as they hired more people.

Companies have had to realize that there must be a balance between quantity and quality in hiring.

Hiring tons of people does not solve every problem; what you need is exactly the right number of qualified specialists. Today, companies are not so much cutting staff as restructuring it to account for the productivity gains that new, more capable AI tools will bring. Eventually they will start hiring again, but they will be much more selective.

The learning curve has grown steeper these days, even as learning itself has become much easier. Developers and software engineers with the right skills will have no trouble even in the current job market, just as they didn't in the early 2000s. Constantly learning and keeping up with technology was, is, and will remain the only way to survive in the software industry.

By the way, we at beeline cloud regularly launch free courses in a variety of areas.
