So, let’s begin. Somewhere here there should be a beautiful introduction, in the style of “but did you know about the planned obsolescence of equipment?”, about how many manufacturers deliberately design hardware so that the consumer has to buy a new unit every year. We could even bring up Apple’s latest presentation with their A17 and a whole new button…
…But the conversation will be about more mundane things. In fact, it was my laptop that prompted me to write this article. And this is a jab at AMD. No, no, owners of Ryzen ultrabooks and the like can relax; we will be talking about prehistoric times.
So, on the third try: what is this article about?
About technologies forgotten by their manufacturers, which leaves owners of the hardware with a pumpkin instead of a not-so-bad machine! I’ll consider:
My own precedent, which became the reason for the article (AMD Fusion).
Less obvious precedents (Intel Kaby Lake-G).
Things I’ve only heard about (Itanium / Larrabee / Cannon Lake).
Use Case – APU Drivers + Discrete Graphics
What was it: A technology for hybrid use of discrete and integrated graphics cards.
Why didn’t it take off: Performance did not meet expectations, even though the technology fully covered the lower price segment and allowed very cheap laptops to run modern games.
Why is it better not to have something like this now than to have it?: A complete lack of support at the driver level. In the best case, only the discrete card of the two will work. And in most cases, if the driver fails to detect the type of load, it will try to run games on the integrated one.
So, the patient: a magical pink laptop from HP, purchased in 2015 and still serving faithfully (albeit in the state of a dying swan). A laptop from the lower price segment (at the time it cost around $200). And for its price and age, it has simply impressive specs:
A 4-core processor from AMD.
BUT! Without L1 and L3 cache. One core, one thread.
12 gigabytes of RAM (it shipped with 4, but that was fixed).
The beautiful R8 365DX, which is actually two whole video cards: a discrete R7 360 plus an R6 integrated into the processor, with 2.5 gigabytes of memory (naturally slow DDR3, shared with system RAM).
At the time of purchase, it was the most powerful budget laptop (and a pink one at that) money could get. The key words: pink and budget. And the power itself came from the dual video card, which carried everything.
No joke: at one time, after some shamanic rituals, this laptop even ran the real Witcher 2 smoothly.
There was a catch right away: the lack of L1 and L3 cache, plus a 5400 RPM hard drive, which made basic work on the laptop even worse than on its predecessor (a 2010 Asus K53Z). In practice, though, an SSD solved the everyday performance problem.
So there I was, the owner of a $200 laptop that was good for work (after installing an SSD, of course), looked good, and could even run some games. And then one day it turned into a pumpkin.
This happened in 2018…
Just 3 years after the laptop’s release, it stopped receiving graphics drivers. At that point, two things saved me:
Old drivers still present on my hard drive.
Ability to Google.
Actually, I still live on those drivers. But every year, with every Windows update, they work worse and worse… Moreover, modern games no longer understand the AMD Fusion technology, which means that, at best, only the R7 M360 (not a very powerful video card) gets used. At worst, games run on the R6, turning gameplay into a slideshow.
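If the driver picks the wrong GPU, on Windows 10 (version 1803 and later) you can at least try to pin a specific game to the discrete card via the per-application graphics preference, which is stored in the registry. A minimal sketch as a .reg fragment; the game path here is purely hypothetical, and whether this actually helps on ancient Fusion drivers is anyone’s guess:

```ini
; Per-application GPU preference, stored under the current user hive.
; GpuPreference=2 asks Windows to use the high-performance GPU,
; GpuPreference=1 the power-saving (integrated) one.
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Games\\Witcher2\\witcher2.exe"="GpuPreference=2;"
```

The same setting is exposed in Settings → System → Display → Graphics settings, so no registry editing is strictly required.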
Of course, you could say: upgrade your laptop, who plays games on budget laptops these days? But the fact remains that just 3 years after the laptop’s release, the manufacturer stopped producing drivers and single-handedly turned it into a pumpkin.
These days, drivers for this HP are very hard to find. They are not on the laptop manufacturer’s website, and this graphics hardware is no longer listed on AMD’s website.
And even if you find graphics drivers for the A10 8780 APU on AMD’s website, they completely disable the built-in graphics card and enable only the R7. One could put up with some loss of performance, if not for one “but”.
With modern drivers, the laptop regularly crashes into a BSOD. With the old ones, it works like clockwork.
Why the fuss?
And really, you might think: this is just one particular case of AMD Fusion being abandoned for a specific processor and video card. Why make such a fuss, write anything, decide anything?
Well, firstly, until recently I was completely satisfied with the laptop’s performance (as a work machine).
Secondly, curiosity, which made me wonder whether this is really the only case of a manufacturer abandoning its own technologies and bricking its own devices.
The first and most obvious example that came to mind was Apple, whose release of each new model turns the previous ones into a “lagodrome”. But that one is obvious; everyone knows it.
But how do other manufacturers deal with their failures, and which technologies have not survived to the present day? What about:
An Intel chip with a built-in graphics card from AMD?
The first 10 nm ultra-low-power processor, the one that “failed”?
A Xeon on steroids that could run in both servers and desktops, but due to its complexity remained a niche project, supported by the manufacturer only as long as certain supply and maintenance contracts lasted?
A 48-core graphics card, straight out of 2008?
And this is probably not everything that has happened in recent years, but this is what caught my eye.
Intel Kaby Lake-G. A chip for team blue? Team red?
What was it? An Intel processor with built-in “discrete” graphics and 4 gigabytes of dedicated graphics memory right on the processor package.
Why didn’t it take off? A very specific solution that never found its consumer. On the one hand, a full-fledged Vega built into the processor is much better than the UHD graphics of that time. On the other, it was inferior in power to any GeForce or Radeon. The result was neither fish nor fowl: too expensive for integrated graphics, too weak for discrete. And then Iris Xe arrived.
Why is it better not to have this now than to have it: For the same reason as the previous device, although the situation is not as critical. Kaby Lake-G no longer gets up-to-date drivers, which technically turns it into half a pumpkin. On the one hand, separate drivers for the Radeon Vega used in this design are still being released. On the other, the internal synergy between the processor and the memory soldered directly onto the package is lost.
So let’s start with the obvious: a situation completely identical to mine, but one that happened to Intel and AMD literally a year and a half after the “bricking” of my own device.
This was another brilliant solution at the intersection of competitors: a full-fledged Vega inside an 8th-generation Intel processor. This was a case where full-fledged graphics on a chip could have been an excellent solution.
But! The appearance of Iris broke all plans for computers with good (for the time) graphics at an affordable price. And even if you paired processors of this series with a discrete graphics accelerator, the soldered-on 4 gigabytes of memory could still be used for other purposes.
However, Iris turned out to be cheaper, and just a year and a half after release, Intel Kaby Lake-G stopped receiving driver updates, which makes it impossible to use these chips with modern OSs or to realize Vega’s potential.
Intel Cannon Lake. The i3 that couldn’t
What was it? An attempt to answer Ryzen in process node and power consumption. Intel’s first 10 nm processor, which appeared back in 2018.
Why didn’t it take off? Intel was unable to get mass production going.
Why is it better not to have something like this now than to have it? A strange architecture, a high defect rate, underpowered processors. And in case of failure, they are not eligible for warranty replacement.
Who suffered this time? Laptops again, namely the Lenovo A330. Incidentally, iXBT (I don’t know whether links to third-party resources are allowed) has a whole article about the other processors of this line, including a 16-thread 10 nm processor from back in 2018.
But only the i3 8121U reached the general consumer.
What can you say about it? Absolutely nothing, except that used laptops with it have barely survived to this day, and those that exist are suspiciously cheap (under $100). Perhaps it’s the high defect rate; perhaps something else.
But the fact remains a fact: a great technology. The breakthrough in process node that AMD pulled off with its Zen could have belonged to Intel.
Xeon on steroids – Itanium
What was it? An EPIC-architecture processor, an alternative to the then seemingly dead-ended x86-64.
Why didn’t it take off? An exclusive focus on the server market, plus the lack of competition in the consumer market, slowed down investment and, as a result, the development of the architecture.
Why is it better not to have something like this now than to have it? Everything is ambiguous here. An EPIC-architecture server is a good thing to have if you’re concerned about security: thanks to its low prevalence, it is somewhat better protected, and its vulnerabilities are not widely known. On the other hand, if you are looking for a solution for mass use, the high cost and the backwardness of the architecture make it the worse choice.
Intel Larrabee – instead of pins
What was it? A 48-core processor, or a video card with a RISC architecture.
Why didn’t it take off? The complexity of the architecture made it impossible to adapt the software-writing process to it.
Why is it better not to have something like this now than to have it? The technology never saw the light of day.
Finally, I wanted to talk about a device that never came out, and thank God. Or maybe not, because otherwise the era of video cards that could run The Witcher 3 might have begun even before The Witcher 3 was released.
A 48-core video card. It sounds like a miner’s paradise, but unlike the previous precedents, this time the device fortunately never came out.
Intel Larrabee was supposed to compete with AMD Fusion, which at the time was preparing to release its first graphics adapters combined with integrated graphics.
At the time of the announcement, this was supposed to be a real revolution, ready to rival the NVIDIA GeForce GTX 285 in power.
In fact, this was team blue’s answer to the Cell architecture of the day, which, all other things being equal, offered a solid performance boost over other hardware of the time. But the complexity of programming for a multi-core RISC architecture with a graphics chip settled the matter for Intel, and along with the industry’s abandonment of the Cell architecture, Intel abandoned Larrabee too.
But just imagine the future that could have awaited us, with 48-core processors in every home! Just a joke, of course: the technology proved its own inconsistency back in those very years. But no one can stop you from fantasizing, right?
There was also ARM for desktop PCs. There were x86 tablets. And a whole sea of things that, amateur that I am, I don’t even know about!
P.S. These days I’m watching processors with E and P cores with the same interest. On the one hand, they look like the next step in the evolution of processors; on the other, there is the worry that they, too, will join the horde of forgotten technologies, and that devices on Intel’s 12th and 13th generations will turn into a pumpkin in just 3 years. Or 4?