How close are we to creating a Terminator?

A cold-blooded machine that knows neither sadness nor pity, relentlessly and single-mindedly carrying out its assigned task to the very end. That is how viewers remember the Terminator from the film of the same name. According to its plot, a soulless machine came to us from the future, and right now we are approaching the moment when, in the world of the film, the machines take over.

If you look at the technical side of this robot, what seemed astonishing in 1984 now feels rather familiar and is already looming on the horizon. So what technologies are these robots built on, deep inside their steel skulls?

Let's assume, just for one day, that James Cameron already knew something back in 1984, and that he was not making a science fiction film but trying to send us a warning. What if the director was trying to protect us from what the abuse of new technologies could lead to, and to show how close they could bring us to creating such machines? I propose that today, April 1, 2024, we conduct a deep and thoughtful analysis of the Terminator's operating mechanisms and together find the answer to this burning question.

Neural networks bring us closer to the future

People have been trying to teach computers to talk almost from the moment computers appeared. Now even Alice, the assistant speaking from almost every phone, has a voice indistinguishable from a human one.

Still, a voice alone is not enough for a smart robot system to operate. There must be a source of text that tells it what to say. Neural networks are already coping with this task: models such as ChatGPT, YandexGPT, RuGPT and LLaMA supply the voice engine with answers to your questions, texts and fairy tales, and the voice engine, in turn, reads them aloud.
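
Just to make the idea concrete, here is a minimal Python sketch of that pipeline, assuming OpenAI's API as a stand-in for any of the models above and a local offline speech engine for the voice; the model name and prompt are purely illustrative.

```python
# Minimal sketch of the "LLM writes the text, a voice engine speaks it" pipeline.
# Assumptions: the openai and pyttsx3 packages are installed and OPENAI_API_KEY is set;
# any other LLM or TTS engine could be swapped in.
from openai import OpenAI
import pyttsx3

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_reply(question: str) -> str:
    """Ask the language model for the text the robot should say."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

def speak(text: str) -> None:
    """Voice the generated text with a local, offline TTS engine."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    speak(generate_reply("Tell a short bedtime story about a friendly robot."))
```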

Neural networks are developing very quickly, and on the whole we can already say that today's models behave a lot like a robot pretending to be a person. On the one hand, they write plain, unremarkable texts, do not understand jokes and cannot joke themselves, are poor in emotion, do not always answer your questions correctly, and the text they generate is as dry and straightforward as the speech of this robot. On the other hand... on the other hand, they are frighteningly good at recognizing your voice, recognizing faces, collecting information about people, solving problems in mathematics and physics, and even composing pieces of music.

For example, the latest version of ChatGPT can already recognize what is in an image and describe it. This is how Copilot, the former Bing chat that runs on the latest version of ChatGPT, recognizes images.
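
Here is a rough sketch of how such image recognition is typically wired up through an API; the model name, prompt and file name are my own illustrative assumptions, not anything taken from Copilot itself.

```python
# Sketch: asking a multimodal model to describe what it "sees" in a photo.
# Assumes the openai package is installed and OPENAI_API_KEY is set.
import base64
from openai import OpenAI

client = OpenAI()

def describe_image(path: str) -> str:
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",  # any vision-capable model would do
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe the objects in this image."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

print(describe_image("street_scene.jpg"))  # placeholder file name
```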

The world through the eyes of a cyborg

Let's now look at the world through the eyes of the Terminator. We already know perfectly well what he looks like.

The red tint represents the shift of the light range perceived by the robot into the infrared zone. Beyond that, the interface is a plain command line without a graphical shell (GUI), like a Unix terminal or the command prompt in Windows.

This interface displays lines of output in different parts of the screen, each representing its own part of the result of processing incoming information. It resembles the text interface of modern neural networks that today run on a desktop computer, doing their calculations on a CPU or a video card, without any network access.

We have all seen such text responses: they are the lines of output that a neural network like Alice produces after processing your messages. The popular Stable Diffusion neural network also runs on a discrete video card inside an ordinary computer. And here is an example of the interface of another large language model (LLM), LLaMA, which already works autonomously, using only the computer's own resources.
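
For the curious, this is roughly what "an LLM running autonomously on your own hardware" looks like in code, assuming the llama-cpp-python bindings and a quantized model file already downloaded to disk (the path below is a placeholder).

```python
# Rough sketch: running a LLaMA-family model fully offline with llama-cpp-python.
# Assumes a quantized .gguf model file has already been downloaded locally.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,   # offload all layers to the GPU if one is available
    n_ctx=2048,        # context window size
)

output = llm(
    "Q: What year was the first Terminator film released?\nA:",
    max_tokens=32,
    stop=["Q:"],
)
print(output["choices"][0]["text"].strip())
```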

Look: the cyborg's system works in exactly the same way as modern language models, through text. Images from the cameras located where the eyes should be are recognized and converted into a text description. An LLM such as GPT-4 does the same thing when it explains the objects in a photograph.

From this we can assume that several neural networks operate in parallel in the Terminator's OS: the first processes and displays messages about the state of the system, the second handles visual information, a third LLM is responsible for communicating with people, and so on. Like gears in a complex mechanism, each of them processes its own portion of incoming information. The responses of the individual neural networks are each shown in their own window, while the main neural network reads them all at once, analyzes them and decides on further actions.
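
Purely as a thought experiment, such an arrangement might be sketched like this: a few "subsystem" reports arrive as text, and one coordinating model reads them all and picks the next action. The subsystems here are stubs and the action list is invented for illustration; this is not a reconstruction of any real robot OS.

```python
# Thought-experiment sketch: several specialised networks report in text,
# and a main network reads the combined report and chooses an action.
# The subsystem functions are stubs; in reality each would wrap its own model.
from openai import OpenAI

client = OpenAI()

def vision_report(frame_path: str) -> str:
    return "VISION: one person ahead, holding a phone, not a weapon."  # stub

def system_report() -> str:
    return "SYSTEM: power 87%, left servo temperature nominal."        # stub

def dialogue_report(last_heard: str) -> str:
    return f"AUDIO: human said: '{last_heard}'"                        # stub

def decide(reports: list[str]) -> str:
    """The 'main' network reads every report and returns one action."""
    prompt = (
        "You coordinate a robot. Based on the reports below, reply with exactly "
        "one action from: WAIT, SPEAK, WALK_FORWARD.\n\n" + "\n".join(reports)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice of coordinator model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

reports = [vision_report("frame_0042.jpg"), system_report(),
           dialogue_report("Have you seen Sarah Connor?")]
print(decide(reports))
```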

Most of today's LLMs need internet access to work. Yet in 1984 the Internet as we know it did not exist, so the Terminator's head must have held a solid set of such neural networks running autonomously.

Neural networks for different purposes

To understand which neural networks could be running on its built-in computer, let's go through the full list of tricks our hero resorts to in the films.

It must be said right away that the robot is anthropomorphic, that is, it copies the structure of a human body; we will look at this in more detail later. Beyond that, in the films the robot from the future can: shoot, sometimes with both hands at once, and almost never miss; fake a voice; start a car without a key; drive vehicles; assess its surroundings and calculate how events will unfold; make a diagnosis and perform simple surgical operations.

Could it be ChatGPT that underlies the robot's consciousness, hidden in the microcircuits of its steel skull? Or perhaps it is precisely this combination of different neural networks that will lead to the creation of the main one, capable of processing information at the speed of the human brain. Arnold Schwarzenegger himself tells us much the same thing: in the video, he warns about the danger of terminators appearing in reality.

Faking a voice

There are now plenty of neural networks for faking voices. One of them is HeyGen, which lets you not only create a video but also dub it with a voice copied from a chosen character. Such a neural network can very quickly copy a voice heard in conversation and use it to read out any given text. The Terminator's toolkit may well include an analogue of it, running autonomously, for example, on a graphics chip.
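
As an illustration of how little such voice copying takes these days, here is a hedged sketch using the open-source Coqui TTS library and its XTTS model as a stand-in (HeyGen itself is a closed web service): a few seconds of reference audio are enough to read out arbitrary text in that voice.

```python
# Hedged sketch of voice cloning with the open-source Coqui TTS library (XTTS v2).
# This is a stand-in for services like HeyGen, not their actual API.
# Assumes the TTS package is installed and a short reference recording exists.
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="I'll be back.",
    speaker_wav="reference_voice.wav",  # a few seconds of the voice to copy
    language="en",
    file_path="cloned_voice.wav",
)
```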

Ability to operate technical devices

In the film we are shown how deftly the Terminator starts various cars, and it already knows how to drive them. Today several companies are developing autopilots, and all of them rely on new neural networks. Among them are Tesla and even our own Yandex. One of the latest neural networks for driving a car is used by the Sberbank subsidiary Cognitive Pilot.

When Schwarzenegger is about to steal a truck in the first film, look at how the neural network instantly guides him through the controls, pulling up a diagram of the vehicle's internals based on the car model.

ChatGPT can now recognize a car model without any trouble; it can also read such a diagram if shown one from another source, extract the details that matter for driving, and pass them on to the main neural network.
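
A sketch of that hand-off might look like this: the vision model is asked to return the driving-relevant details as JSON so that the "main" network, or any other program, can consume them. The field names and model choice are my own assumptions.

```python
# Sketch: extracting driving-relevant details from a photo of a car as JSON,
# so they can be handed to another ("main") neural network or program.
# The field names and model choice are illustrative assumptions.
import base64, json
from openai import OpenAI

client = OpenAI()

def extract_car_details(path: str) -> dict:
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},  # ask for machine-readable JSON
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Identify the car model and return JSON with keys "
                         "'model', 'transmission', 'ignition_type'."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return json.loads(response.choices[0].message.content)

print(extract_car_details("truck.jpg"))  # placeholder file name
```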

Knowledge of medicine

Everyone remembers how, in the film, the Terminator shoots at people but, at John Connor's request, does not kill anyone.

To pull that off, it must be able to assess people's physical injuries and make a diagnosis. Today we can see how exactly such neural networks are already being actively introduced into Russian medicine.

Perhaps the 12-minute appointment slot in Russian clinics was chosen with the prospects of neural networks in mind: the hope being that in the near future you will be met at the desk not by a person but by a neural network, since it will work several times faster.

Don't move.

Event forecasting

As for predicting how a situation will unfold, ChatGPT has already learned to do this quite well. It breaks a photograph down into its component parts, translates it into text, and then its LLM determines from that text what will happen next. Similar examples are floating around the internet, where it looks at images from a physics point of view and says what will result from the actions taking place in the photograph.
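
The two-stage idea described here, picture to text and then text to forecast, can be sketched in a few lines; in practice the scene description below would come from a vision model like the one shown earlier, and the model name is again just an assumption.

```python
# Sketch of the second stage of "photo -> text -> forecast": a language model
# reasons about what happens next from a plain-text scene description.
# The description would normally come from a vision model; here it is hard-coded.
from openai import OpenAI

client = OpenAI()

scene = ("A bowling ball is rolling off the edge of a table. "
         "A glass of water stands on the floor directly below the edge.")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": f"Scene: {scene}\n"
                   "From a physics point of view, what happens in the next two seconds?",
    }],
)
print(response.choices[0].message.content)
```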

It's incredible how close James Cameron was to the truth! Listen to the Terminator himself in the second film, at 1:05.

John Connor: "Can you learn stuff you haven't been programmed with, so you can be more human?"

Terminator: "My CPU is a neural net processor, a learning computer. The more contact I have with humans, the more I learn."

If James Cameron could guess all this back in 1991, so far removed from our time, how can we fail to heed his warning?

Two-handed shooting

There is one thing nobody in the world has tried: putting a weapon in the hands of a bipedal robot. Nobody, that is, except one company, and you know perfectly well where it is based. Their robot's name is very simple: Fedor.

https://dzen.ru/a/XT754B6OPwCyHqx3

According to the assurances of his designer, "Fedor can speak and recognize speech, walk, climb stairs, orient himself in space by turning his head, overcome an obstacle course, drive a car and an ATV, and even crawl on all fours. Thanks to his fine hand motor skills, he can work with various tools that Emergency Ministry employees use to save people. The robot can also apply splints and give injections, and can help in production by assembling other robots."

By the way, our Fedor is now one of the few robots that drive a vehicle with their own manipulator hands rather than through electronic interfaces.

As we can see, all the actions of the iron villain can already be performed by neural networks. For now this takes them a great deal of time and resources, but work is under way to improve their energy efficiency and performance. The only thing missing is a neural network that can quickly process the outputs of all these other networks, each reporting the result of processing its own portion of the information.

Physical body

To move around and take part in everyday human life, the Terminator uses a body that can walk, work with its hands and, in its parameters, differs little from a person's. It is anthropomorphic; such robots are also called humanoids or androids. From a manufacturing point of view, these are among the most complex robots there are.

In the scenes where the robot moves after its protective skin is damaged, the whine of servomotors is clearly audible, just as in modern mechanisms. In other words, it is built on an entirely modern component base.

Of course, we all know that the most advanced humanoid robot currently available is called Atlas, manufactured by Boston Dynamics.

Its competitors for the title of most agile robot today are the robots from Agility Robotics and the Chinese android Unitree H1.

The Chinese, true to form, have approached the production of the Unitree H1 in a big way and are churning them out in batches. Oddly enough, the Unitree H1 can even be bought on AliExpress.

The developers at OpenAI, with their Figure 01, and Tesla are not far behind Boston Dynamics and Agility, but they are still a long way behind Atlas.

Something must be said about the Japanese. Honda's ASIMO, once the first robot to walk on two legs, lost the competitive race to Atlas and bowed out; the project to develop it has been closed.

In terms of interaction with technology from the human world, Fedor is now, perhaps, in first place. Atlas and the Chinese robot can neither shoot nor drive; all they do is move boxes around on shelves.

However, Fedor cannot do the most important thing he was given a human form for: walk. His foreign counterparts cope with this much better. Perhaps the new neural network will build its body from these androids; they already strongly resemble a human skeleton.

When it comes to skin and facial expressions, the furthest progress has been made by another robot, Ameca. Its developers created a covering with properties similar to real skin and taught the robot to express emotions.

Some of these machines have excellent mechanisms for walking on two legs, others have decent artificial arms, and still others can drive cars or change facial expressions. Will some future neural network figure out how to combine the best qualities of each of them?

Processor and code

If you look at all the walking robots above, you may notice they have one thing in common: the absence of a proper head. And you and I have already worked out why – a proper head has not been invented yet.

What separates us from packing all the capabilities of modern neural networks into a single head is the size of the servers required to run them. To operate ChatGPT, for example, its creator OpenAI built entire data centers, whose capacity users access by subscription. Such computing power not only requires a huge amount of electronics, it also consumes an incredible amount of electricity and demands that a huge amount of heat be dissipated, which means an efficient liquid cooling system.

To run such a complex system that interacts with its environment in real time, several neural networks would have to work together at once, so the required computing power would be comparable to a modern data center.

It turns out they will manage to pack all that computing power into one small package sometime in the future. For such a high-speed computing system to work, the Terminator's head, which is the same size as a normal human one, must contain a processor comparable in speed to a real data center of some notional MTS. This processor is powered by a source said to last 120 years, and it is so energy efficient that it can be cooled by a liquid cooling system comparable in size to the human circulatory system. And since the Terminator is a cyborg, why shouldn't that circulatory system pump not coolant but warm blood, shedding the heat, like ours does, into the environment?
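
To put some very rough numbers on that gap, here is a back-of-envelope comparison; every figure is a ballpark assumption (an eight-GPU inference server drawing on the order of 10 kW, a human brain around 20 W), not a measurement.

```python
# Back-of-envelope comparison of power budgets; every number is a rough assumption.
SERVER_POWER_W = 10_000     # an 8-GPU LLM inference server, order-of-magnitude figure
BRAIN_POWER_W = 20          # commonly cited estimate for the human brain
HEAD_COOLING_BUDGET_W = 50  # what a head-sized liquid loop might plausibly shed

gap = SERVER_POWER_W / HEAD_COOLING_BUDGET_W
print(f"A Terminator-sized head would need hardware roughly {gap:.0f}x "
      f"more power-efficient than today's inference servers.")
print(f"For comparison, the human brain manages on about {BRAIN_POWER_W} W.")
```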

It is possible that this was achieved either by moving to ultra-small process nodes, or by building processors on a new material that allows the process node to shrink even further. Switching from the conventional material used in their manufacture, silicon, to, say, germanium would help reduce the size of the microcircuits. Or one could go further and use entirely new materials, such as graphene. One could go further still and imagine that these are promising chips with a quantum structure, or photonic processors.

As one programmer discovered, the code displayed on the screen is MOS 6502 assembly. Combining a new processor with fast code written in a low-level language would greatly boost the performance of such a system. But why assembly, when writing that much code in it would be so laborious? In fact it is not hard at all, because that task can be handed to the neural network itself, which converts the code.
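
A hedged sketch of that last idea: hand a language model a small routine and ask it to emit MOS 6502 assembly. Whether the output actually assembles is another matter; the point is only that the translation step itself can be delegated to a neural network.

```python
# Sketch: delegating "write it in assembly" to the neural network itself.
# Model choice and prompt are illustrative; the generated assembly would still
# need to be checked and assembled by conventional tools.
from openai import OpenAI

client = OpenAI()

routine = """
def countdown(n):
    while n > 0:
        n -= 1
    return n
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "Translate this routine into commented MOS 6502 assembly:\n" + routine,
    }],
)
print(response.choices[0].message.content)
```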

It looks like now is the time to heed the Terminator director's warnings. Only a small matter remains before such a robot can be created: existing devices probably just need some refinement. AGI, which the leading artificial intelligence developers are building, may well suit the role of the main neural network, the one whose fictional prototype is Skynet. It may be exactly the piece still missing to create such an intelligent robot.

How might the scenario for the appearance of terminators like the T-101 develop?

The film warns us about the rise of the super-corporation Cyberdyne Systems, which developed the terminators and which Sarah Connor and her iron bodyguard tried to destroy. Who should we be afraid of, from whom should we expect trouble, and who best fits the role of Cyberdyne today? Will it be Boston Dynamics; a factory producing AI chips and video cards, such as NVIDIA; the makers of quantum computers, ours or foreign; Copilot; or Skynet, the company that today produces the robot Fedor?

Just recently, news spread around the world that Agility Robotics has built a factory for producing humanoid robots. The highlight was that, as the news stories put it, the robots there would be assembled by robots themselves.

The first Skynet robots looked like people, though they could still be told apart from us. Even so, they were already quite similar. Let's work out what is still missing in our time for terminators to be created.

Today, for them to appear, we still lack: the main ultra-fast neural network, a small yet (by today's standards) ultra-powerful processor, a long-life power source and artificial skin. Work on all of these, however, is already proceeding by leaps and bounds.

It is quite possible that the missing pieces will be the robot skin being developed by Saudi scientists; a power supply based on cold nuclear fusion, which will continue to be developed here even after the collapse of science in the nineties; and AGI, the ultra-insightful artificial intelligence that OpenAI, Microsoft and Google alike are striving for.

Let's assume that a new version of ChatGPT, with a squad of several Atlas and Fedor robots, seizes the Agility Robotics factory. There it replicates itself many times over, building endless copies of these robots. For an entire army of future terminators, taking over Ameca's production of realistic facial expressions would then be no trouble at all.

Taking the best of its bodies, Atlas, as a basis, and after several iterations of improving and reinforcing the internal mechanisms, it will add the best qualities of each of the other bipedal robot models. Autonomous neural networks will go into the head. It will stretch artificial skin and Ameca's smile over the reinforced Atlas skeleton, and the result will be a machine very much like a person.

This robot will put on dark glasses, and then just try to spot it among us. And if the robot Fedor shares its car-driving technology and some of its other skills, nothing will stop them on their way to taking over the world!
