You could start the story of output devices for data visualization from far back – even with computers equipped with a cathode-ray tube. In the modern history of monitors, however, the usual reference point is the seventies and eighties, a period when several key hardware developments coincided. It was then that the programmer's workstation took on its familiar form: terminals with remote access to a shared computer gave way to personal computers. Image generation was transformed as well – with the advent of video adapters (the first of which, the Monochrome Display Adapter, was released by IBM in 1981), the load on the processor and RAM decreased, freeing up resources for graphics without harming the rest of the system.
MDA screens had a resolution of 720 × 350 and worked with a very narrow range of data: monochrome text characters. The same year saw the release of the Color Graphics Adapter, which offered a number of additional advantages: a graphics mode alongside the text mode, support for sixteen colors, and the ability to work at several resolutions depending on color-rendering needs (the maximum, 640 × 200, was available in a two-color graphics mode). A couple of years later they were followed by the improved Enhanced Graphics Adapter, with an expanded palette of sixty-four colors and a resolution of 640 × 350 pixels.
The culmination of this series of IBM developments, and an important milestone in the history of monitors, was the Video Graphics Array, introduced in 1987. It was a technological leap in several respects at once. Color rendering became far more accurate and detailed: the number of supported colors jumped by several orders of magnitude (262,144 in total, of which sixteen could be displayed in any given image). The picture stretched vertically with the newly introduced 640 × 480 resolution. Its 4:3 aspect ratio proved comfortable to perceive and was long treated as the default afterwards. In total, VGA supported ten resolution modes, letting users trade off color depth and image size against their preferences and their monitor's capabilities. Finally, VGA introduced an analog interface between the adapter and the monitor – a foundation for future improvements in color reproduction. All of this combined made VGA the market standard for years to come.
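The figure of 262,144 colors follows directly from VGA's analog output stage, which used six bits per RGB channel; a quick sanity check of the arithmetic:

```python
# VGA's DAC resolves 6 bits per color channel, i.e. 64 intensity
# levels each for red, green and blue.
BITS_PER_CHANNEL = 6

levels_per_channel = 2 ** BITS_PER_CHANNEL   # 64 levels per channel
total_colors = levels_per_channel ** 3       # 64^3 = 262,144 colors

print(levels_per_channel)  # 64
print(total_colors)        # 262144
```

Of that palette, the hardware could place only sixteen entries on screen at once in the standard 640 × 480 mode, which is why the article speaks of sixteen usable colors per image.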
IBM PS/2 Model 50 – the first computer to ship with VGA
This, of course, did not bring the development of the technology to a halt. In the late eighties and the first half of the nineties, several improved versions of the adapter appeared under the umbrella name Super Video Graphics Array, gradually increasing the amount of video memory, the color range (up to 16.7 million) and the picture size. At the turn of the decade the famous 800 × 600 resolution arrived, and its record (though not its popularity) was soon surpassed by a 1024 × 768 mode. According to the data analysts have managed to recover from that era, until the 2000s users mostly worked with screens at 800 × 600, 1024 × 768 and 640 × 480 pixels – it is no accident that these were the three resolutions popular games usually supported. As is easy to verify, despite the growing numbers, the 4:3 ratio remained unchanged.
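The "easy to verify" claim amounts to reducing each resolution by its greatest common divisor; a minimal sketch:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a resolution to its simplest whole-number aspect ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

# The three resolutions that dominated until the 2000s all share one ratio.
for w, h in [(640, 480), (800, 600), (1024, 768)]:
    print(f"{w}x{h} -> {aspect_ratio(w, h)}")  # each prints "... -> 4:3"
```

The same helper shows the later shift discussed below: `aspect_ratio(1920, 1080)` reduces to `16:9`.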
Meanwhile, a revolution was brewing in the world of monitors. For a long period the cathode-ray tube was considered the most practical and efficient way of producing an image on personal computers. An alternative approach – displaying data with liquid crystals – had been known since the sixties and seventies, but attempts to apply the technology to large screens ran into numerous problems caused by the instability of the crystal layer. For the time being, liquid-crystal displays remained confined to calculators, watches, and the vague hope of proving themselves in the one market niche closed to CRT monitors: laptop computers.
The 1985 Toshiba T1100, one of the earliest laptop models, featured an LCD screen
And so it happened. By the mid-nineties the main shortcomings of LCD panels in laptops had been eliminated: contrast leveled out, color arrived, and the need for external lighting was offset by a built-in backlight. When the technology moved to monitors for desktop computers, it became obvious that, at comparable image quality, it brought many pleasant bonuses – light weight, compactness, low power consumption. CRT screens stayed afloat for quite some time, but by 2003 the balance had finally tipped in favor of flat monitors. Along with the bulk, one historically important feature gradually disappeared: the ability to change the monitor's resolution with acceptable losses in image quality. LCD screens were designed to work strictly at the native resolution they were manufactured for.
So the thickness of the average monitor dropped sharply, while its other dimensions kept steadily growing. To appreciate the pace of this progress, just look at the picture below. In this installation (right side), the artist Aram Bartholl succinctly demonstrates how screen sizes changed over the course of fifteen years. Suffice it to say that three quarters of this pile belongs to the twenty-first century. The work dates from 2013, so bear in mind that today it would be topped up with a number of further elements, including displays at 3840 × 2160 and 7680 × 4320 pixels.
Finally, alongside the continuous scaling of the past decade, another noteworthy shift occurred: a rethinking of the classic 4:3 aspect ratio. Deviations from this standard had, of course, happened before, especially once laptops began to develop rapidly, but until the mid-2000s they were not systemic. The trigger for radical change was the desire to bring television screens and computer displays to a single standard – perhaps partly because the user experience had begun to revolve more around video content. Funnily enough, evolution came full circle here: after all, the history of monitors actually began with television screens. One way or another, starting in 2008 the title of the standard moved from 4:3 first to 16:10 and then to 16:9.
Having made this short excursion into history, let us return to the question that interested us in the first place: how do PC users respond to all this variety of possibilities? If we talk about the mass audience – with restraint and prudence. Despite the ongoing struggle for every extra hundred pixels on the screen, the migration of the mainstream audience to new, more spacious monitors has always been unhurried. As already mentioned, although the thousand-pixel border was formally crossed back in the late eighties, 800 × 600 remained the uncontested leader for almost the entire following decade (as far as the fragmentary statistics of those years allow us to judge). According to W3Schools, as late as 2000 it held a 56% market share – a fantastic figure by today's standards – and only by 2003 did the lead finally pass to 1024 × 768.
The accelerating pace of graphics development has not translated into user appetite for new products – the pattern of slow, gradual adoption continues to this day. With the onset of the LCD era, resolutions became fixed; this probably played a role too, closing off the possibility of experimenting with different densities on older monitors. If we turn to the latest statistics, for 2019, we can see that on a global scale 1366 × 768 is still the most widespread resolution, despite the abundance of higher-pixel options. Notably, it reached the top in 2013, after picking up momentum over six years, and has held there steadily. In a word, all the data we have examined across the past three decades points to low market mobility.
Statistics on the popularity of screen resolutions around the world over the past year
The reasons for this state of affairs are not hard to guess. First, better image quality is a significant bonus, but hardly enough to push the average user into immediately replacing their equipment, especially while the new hardware still sits in the price bracket of an ultramodern exclusive. Broad popularity mostly reaches new displays once they stop being new and begin shifting toward the market standard.
At the same time, the evolution of user preferences cannot be called a linear, steady march from smaller values to larger ones. As the range of resolutions grows, people scatter more and more across the available options. StatCounter graphs show that even the absolute leaders have lately captured no more than a third of the total audience, and the "Other" category, lumping together a whole scattering of assorted resolutions, has entrenched itself among the three most popular entries. It strikes us as curious that more than half of those working on computers are satisfied with thoroughly outdated displays. Perhaps image quality has by now passed a certain milestone: the standard has risen so high that even resolutions falling short of it are acceptable to the average person who does not work intensively with visual content.
However, within the mass audience there are distinct groups with a much more pressing need for high-quality graphics, and they actively embrace new technological opportunities. These are, above all, graphic designers, digital artists and gamers (for instance, 1920 × 1080 took the leading position among the Steam community back in 2017). And here a logical question arises: do developers belong to this part of the audience?
A review of online sources showed that there is as yet no definitive, quantitative answer to this question – no mass surveys have been conducted among this group of users. Still, it cannot be said that the community is indifferent to the problem: the Web offers more than enough scattered, subjective accounts of personal experience, from forum debates to recommendations by bloggers and online publications. Taken together, of course, all this paints a motley and contradictory picture of preferences.
If we try to distill a common core from this polyphony, the following picture emerges. A developer's work is primarily about processing data in text form; graphic content is a more peripheral area. The text involved in working with code is highly dense, which demands strong visual concentration and strains the eyes. In addition, a spacious, well-ordered workspace matters a great deal: programmers value being able to keep before their eyes not only the relevant piece of code but also related materials, sources and programs.
From these premises, with a few reservations, several conclusions can be drawn:
- Since high resolution gives a clearer image that is easier on the eyes, developers should, in theory, strive for the maximum possible number of pixels.
- At the same time, the benefit of an excellent picture can be completely negated by excessive pixel density. Many note that a high resolution on a small screen brings a host of unpleasant side effects, from rippling before the eyes to headaches.
- In theory, programmers should opt for large screens – both to keep density reasonable and to expand the workspace. But another factor comes into play here: in recent years the two-monitor setup has been very popular. Some consider a modern 4K monitor a worthy alternative, but this substitution also has opponents, who prefer a clearer boundary between code and non-code zones and a smaller range of vertical eye movement.
- Finally, everything said above about users' tendency to compromise when choosing between image quality, performance and price largely applies to developers too. For all the potential benefits of high-end screens, most will aim not for the ideal but for the acceptable. Given the secondary role of graphics, the average programmer will invest first and foremost in the machine's power.
- To some extent the previous factor is probably softened by the fact that programmers are, on the whole, better informed about technological novelties and place higher demands on their computers as their primary work tool.
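The "too high a density" concern from the list above can be quantified as pixels per inch (PPI): the length of the screen diagonal in pixels divided by the diagonal in inches. A small sketch – the panel sizes here are illustrative examples, not figures from the survey:

```python
from math import hypot

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal length in pixels over diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

# The same 1920x1080 or 3840x2160 resolution feels very different
# depending on the (hypothetical) physical size it is stretched over.
print(round(pixels_per_inch(1920, 1080, 24)))  # ~92 PPI, the long-time desktop norm
print(round(pixels_per_inch(3840, 2160, 32)))  # ~138 PPI
print(round(pixels_per_inch(3840, 2160, 24)))  # ~184 PPI, dense enough to need UI scaling
```

This is why the same 4K resolution can be a comfort on a large monitor and an eye-strain complaint on a small one.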
This chain of reasoning led us to the following conclusion: in their preferences, programmers probably run a little ahead of the majority of users, adopting new models a couple of years before those finally become the accepted standard.
These mental constructions, of course, needed to be checked against a real sample, even a limited one. The first step was a survey we conducted among the company's developers. The local statistics broadly confirmed our conclusions: while 1366 × 768 still prevails in the world of ordinary mortals, slowly yielding ground to 1920 × 1080, for developers that stage is long past – the main competition unfolds between more modern formats. The results of this first validation encouraged us, and our team of analysts is now determined to verify the result on a wider audience. We ask Habr readers to contribute to our statistics – we will report on the results later.