Programming and IT are in the dark ages, but that's not certain

Usually an article like this begins with a story about how the developers suffered with their monolith. Commits wouldn’t merge, absolutely everything was bad, releases shipped once a month, migrations were terrifying, girls wouldn’t give out their phone numbers. But then the team read a success story, and as soon as the monolith was cut up, life and development were transformed. Except that in the end development became an order of magnitude more complicated, and here’s why.

How a monolith works: there is one database plus a couple of read replicas, and all requests run in a single address space. A transaction either commits or it doesn’t; there simply are no other options. The project lives in one mono-repo, which lets you update a library and be sure the service uses exactly that version of it. Deployment and testing are easy. Typically it’s a relational database plus Redis for caching. Profiling is simple, since the entire request is visible at a glance and the database’s built-in tools are enough. Logs go to a single place, which makes it easy to build tooling for visualization and monitoring.
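The “a transaction either commits or it doesn’t” property is worth seeing concretely. A minimal sketch using Python’s built-in sqlite3 (standing in for the monolith’s single relational database; the `accounts` table and `transfer` function are illustrative, not from the article):

```python
import sqlite3

# An in-memory database stands in for the monolith's single relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 0)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move money between accounts in one ACID transaction:
    either both UPDATEs apply, or neither does."""
    try:
        with conn:  # commits on success, rolls back on exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
            # A business-rule check failing mid-transaction undoes everything.
            (balance,) = conn.execute(
                "SELECT balance FROM accounts WHERE name = ?", (src,)).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")
        return True
    except ValueError:
        return False

transfer(conn, "alice", "bob", 60)   # succeeds atomically
transfer(conn, "alice", "bob", 60)   # fails: rolled back, no partial update
```

The second call leaves the balances untouched: the database, not application code, guarantees there is no half-applied state.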

What do microservices have? A complete set of problems. There are no local transactions anymore; now everything is distributed. A request must go over the network to other services, “wait” for their responses, collect the information, validate it, make sure every service answered, and only then, after all these actions, is the transaction considered complete, and even this is a huge simplification. Simply by sawing the monolith apart, the problem of distributed transactions appears out of nowhere; of course it can be solved, but only with extra code, extra tools, extra network latency, synchronization, and so on. As the number of microservices grows, API drift between services grows with it. How to deploy all of this for testing and debugging is also unclear. Keeping the shared libraries at the same version across services I consider an overwhelming task; you still end up creating some kind of mono-repo for the libraries and dragging them into the microservices. Absolutely every aspect of the project becomes more complicated when cutting up a monolith or switching to a microservice architecture. Cognitive load increases. And we’re urgently hiring >= 1 DevOps engineer, because without one it simply won’t take off.
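The “extra code” that replaces the lost database transaction is typically a saga: run each step, and if one fails, execute compensating actions for the steps already done. A minimal sketch; the `PaymentService` and `InventoryService` classes are hypothetical in-process stubs standing in for remote services that would really be reached over the network (and could also time out, which this sketch ignores):

```python
class PaymentService:
    def __init__(self):
        self.charged = []
    def charge(self, order_id):
        self.charged.append(order_id)
    def refund(self, order_id):          # compensating action for charge()
        self.charged.remove(order_id)

class InventoryService:
    def __init__(self, stock):
        self.stock = stock
    def reserve(self, order_id):
        if self.stock <= 0:
            raise RuntimeError("out of stock")
        self.stock -= 1

def place_order_saga(order_id, payment, inventory):
    """Run each step in order; on failure, undo completed steps in reverse."""
    done = []  # compensations for steps that already succeeded
    steps = [
        (lambda: payment.charge(order_id),   lambda: payment.refund(order_id)),
        (lambda: inventory.reserve(order_id), None),  # last step: nothing after it
    ]
    try:
        for action, compensate in steps:
            action()
            done.append(compensate)
        return True
    except RuntimeError:
        for compensate in reversed(done):
            if compensate:
                compensate()
        return False
```

Even this toy version needs bookkeeping that a single `BEGIN ... COMMIT` made unnecessary, and a production saga must additionally survive the coordinator itself crashing mid-way.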

There is really only one argument in favor: each microservice becomes small and independent. And yes, I absolutely agree, but what stops you from turning a legacy monolith into a modular monolith? You combine the advantages of a monolith and microservices, without the microservices 🙂
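What “modular monolith” means in practice: modules live in one process but talk only through narrow, explicit interfaces, so each stays as independent as a microservice would be. A minimal sketch, with hypothetical `BillingModule` and `OrdersModule` names:

```python
from typing import Protocol

class BillingApi(Protocol):
    """The ONLY surface other modules are allowed to use."""
    def invoice(self, user_id: int, amount: int) -> str: ...

class BillingModule:
    """Internals (storage, helpers) stay private to this module."""
    def __init__(self):
        self._ledger = {}  # in a real system: this module's own tables
    def invoice(self, user_id, amount):
        self._ledger.setdefault(user_id, []).append(amount)
        return f"invoice:{user_id}:{amount}"

class OrdersModule:
    def __init__(self, billing: BillingApi):  # depends on the interface only
        self._billing = billing
    def checkout(self, user_id, amount):
        return self._billing.invoice(user_id, amount)

orders = OrdersModule(BillingModule())
orders.checkout(42, 100)  # an ordinary, in-process, transactional call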

And the craziest thing is that 95% of companies don’t need microservices. High performance is achievable on a modern mid-range server, without terabytes of RAM, with a simple modular monolith. You just need to pay attention to performance from time to time and analyze the slowdowns. Correctly tuning the PostgreSQL configuration alone, for example, is half the battle.
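For a sense of what “correctly configured” might mean, here is an illustrative postgresql.conf fragment for a server with about 16 GB of RAM. The exact values are assumptions for this example, starting points to measure against, not universal recommendations:

```
# Illustrative fragment; tune against your own workload and hardware.
shared_buffers = 4GB              # ~25% of RAM for the shared page cache
effective_cache_size = 12GB      # planner hint: how much the OS likely caches
work_mem = 32MB                   # per sort/hash operation; multiply by concurrency
maintenance_work_mem = 512MB      # VACUUM, CREATE INDEX
max_connections = 200
random_page_cost = 1.1            # assumes SSD storage
```

The defaults PostgreSQL ships with are deliberately conservative, so even this modest tuning often yields a large improvement on a dedicated database server.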

I am sure this trend is declining, and many companies are already returning to monolithic solutions.

  1. And what's the result?

    Back to basics :)

    I have no answer. Managers and businesses want features and the product itself released quickly; that affects profits. The capitalist ship of IT is already leaking and will definitely capsize someday, but obviously not in the coming years. So developers have no chance to change anything: profit will always come first and will sweep away everyone and everything. So if you are reading this, you are the resistance 🙂

    But my soul aches for fast software. I’m drawn to the idea of using little and getting much. That’s why, outside working hours, I develop a library called LDL, an analogous substitute for the SDL library. It is written in C with classes and has a single API across all supported systems: DOS, Linux, and Windows.

    Perhaps things aren’t so bad in modern IT. We received more modern tools and faster development. We got very powerful hardware. But often, at the next lag on a website or in a program, you realize it shouldn’t be lagging; the cause is clearly in the frameworks or libraries. Basic tasks have started to take disproportionately long relative to the amount of data being processed. Simply put, there is nothing there to slow down on. And when you imagine billions of devices around the world executing billions of instructions with no payload, all as the price for a certain kind of simplicity, you understand what that price is. And how much more will it grow over time? Is there a limit?

Thank you for your attention.
