Human brain in Python

I welcome those who like to speculate about how soon neural networks will take people's jobs and take over the world, and also those who have never been interested in the topic. Either way, make yourself comfortable.

I plan to release several articles describing my attempts to recreate a neural network in its original form. That is, to reproduce the functionality of a real neuron, and then of an entire neural network, in code.

Where it all started

I have been using neural networks for a long time. I remember with nostalgia how poor the first version of ChatGPT was… And I was always curious about the engine compartment of at least the simplest of them. But as soon as I started to delve into the topic, one thing immediately brought me down to earth. Yes, yes: MATHEMATICS. Since I dislike mathematics, to put it mildly, yet neural networks would not let go of my curiosity, I decided to take a different route.

Firstly, a question arose: does the "neural network" live up to its name? Are the mathematical models of neural networks really that similar to what happens inside our skulls?

Secondly, it would be interesting to recreate a neural network in its original sense and see what comes of it.

Surely I'm not the first to do this. But looking at other people's results is quite boring… so let's go!

Neurobiology

How can you program a neuron without knowing what it is? So, after reading a couple of articles from the Internet, talking with ChatGPT, and watching a dozen lectures from this wonderful playlist, I came to the following conclusions.

  • At the first stage, forget about the brain as a whole and its structures, because exactly how they all interact has not yet been established.

  • There are different types of neurons, but by applying the main tool of Homo sapiens, abstraction, I can create a single neuron model and obtain any other by changing the characteristics of the original one.

  • There are also many cells that support the life of a neuron. They, too, can be excluded from the model.

  • Since the most primitive nervous systems give organisms little more than the ability to move and eat, I should not immediately aim at solving complex problems. Let the neural network at least be able to bring a model worm to life.

Neuron device

Externally, a neuron looks like this:

Internal structure of a neuron. Watch the lectures for more details: the first and the second.

The work of a neuron is divided into four parts.

Part 1: Capacity Building

Here we introduce a new concept: the "neurotransmitter" (also called a "neuromediator"). This is a substance that can open special channels in the dendrites. Through these channels, positively and negatively charged ions enter the neuron.

The ions, in turn, create an electrical potential. When the potential reaches a threshold value, the potential is transmitted along the axon.

Part 2: Potential Transfer

Due to the electrical potential, channels open that let sodium ions in, which in turn open the next channels. The result is a chain reaction that travels along the axon.

When the sodium ions reach the end of the axon, they open other channels that let in calcium ions.

Part 3: Neurotransmitter Release

During its life, a neuron produces a large number of substances, including neurotransmitters. This is itself quite a fascinating process, but I won't describe it now.

These substances are transferred in vesicles to the end of the axon.

Calcium ions allow these vesicles to fuse with the neuron's membrane and release their contents into the synapse. A synapse is the junction where the dendrites of one neuron meet the axon terminals of another.

Part 4: Repolarization

Next comes a wave of repolarization. It moves in the opposite direction and equalizes the electrical potential, after which the neuron is ready to accept neurotransmitters again. The cycle is complete.
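The four parts above can be sketched as a tiny state machine. This is a hypothetical simplification of my own (the `Phase` and `next_phase` names are not from the real code built later in the article), just to show the cycle:

```python
from enum import Enum, auto

class Phase(Enum):
    ACCUMULATING = auto()   # Part 1: ions build up the potential
    TRANSMITTING = auto()   # Part 2: the spike travels along the axon
    RELEASING = auto()      # Part 3: vesicles dump NT into the synapse
    RECOVERING = auto()     # Part 4: the repolarization wave resets the neuron

def next_phase(phase: Phase, potential: float, threshold: float) -> Phase:
    """One step of the cycle; the potential only matters while accumulating."""
    if phase is Phase.ACCUMULATING:
        return Phase.TRANSMITTING if potential >= threshold else phase
    if phase is Phase.TRANSMITTING:
        return Phase.RELEASING
    if phase is Phase.RELEASING:
        return Phase.RECOVERING
    return Phase.ACCUMULATING  # recovery complete, ready again

phase = Phase.ACCUMULATING
phase = next_phase(phase, potential=12, threshold=10)  # crosses the threshold
print(phase)  # Phase.TRANSMITTING
```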

I hope that from my brief description you have gained at least a slight understanding of how a neuron works. Let's move on to the code.

General scheme

The general scheme of how a neural network works looks like this:

General model of the nervous system in the body

Let's say the controller class is the skin. It generates an impulse and sends it to a neuron. The neural network processes this impulse, and through other neurons the impulse returns to a controller class, for example the muscles, so that the creature flinches away from the external stimulus.
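That loop can be sketched roughly like this. The `Skin` and `Muscle` classes here are hypothetical stand-ins for the real controller classes from later articles, and the network's processing is faked with a simple multiplication:

```python
class Skin:
    def sense(self, stimulus: float) -> float:
        # turn an external stimulus into an impulse for the network
        return stimulus

class Muscle:
    def __init__(self):
        self.contracted = False

    def react(self, impulse: float):
        # contract if the processed impulse is strong enough
        self.contracted = impulse > 0

def reflex_arc(stimulus: float, skin: Skin, muscle: Muscle):
    impulse = skin.sense(stimulus)
    processed = impulse * 0.5  # stand-in for the neural network's processing
    muscle.react(processed)

skin, muscle = Skin(), Muscle()
reflex_arc(2.0, skin, muscle)
print(muscle.contracted)  # True
```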

The neural network will be computed in a simple loop. The network class will drive the actions inside each neuron and transfer neurotransmitters between neurons. First, let's program the neuron itself.

Neuron class

Abstracting away the complex chemical and biological details, and after much thought, I arrived at the following:

class Neuron:
    def __init__(self, name):
        self.name = name
        self.treshold = 10  # threshold of activation
        self.returnablity = 0.1  # fraction of transmitters kept after release
        self.speed = 5  # the smaller, the sooner NT is released after activation
        self.recovery = 5  # if 0, neuron is ready to fire; counts down each step

        self.sta = 5  # sta - steps to activation; set > 0 on creation to autostart
        self.str = 0  # str - steps to recovery

        # outer NT
        self.dendrite = [0, 0]  # received from another neuron's synapse
        self.synapse = [0, 0]  # sent to other neurons, then decays

        # inner NT
        self.reproductivity = [0.5, -0.1]  # transmitters produced each step
        self.accumulated = [0, 0]  # moved to synapse; accumulated *= returnablity

        self.current_state = [0, 0]  # potential being accumulated this step
        self.last_state = [0, 0]  # potential after the previous step; [activator, inhibitor]

Let's go in order.

# outer NT
self.dendrite = [0, 0]  # received from another neuron's synapse
self.synapse = [0, 0]  # sent to other neurons, then decays

Neurotransmitters come in two broad types: excitatory and inhibitory, depending on whether they let positive or negative ions in. So we abstract away from specific substances: let element zero always be responsible for the excitatory NT and element one for the inhibitory NT.

dendrite stores the excitatory and inhibitory NT received at the dendrites.

synapse stores them as well, but in the synapse (between the dendrites of one neuron and the axon of another).

self.treshold = 10  # threshold of activation
self.sta = 5  # sta - steps to activation; set > 0 on creation to autostart
self.speed = 5  # the smaller, the sooner NT is released after activation

When the potential exceeds the threshold value treshold, potential transfer begins. It is driven by the counter variable sta: each time the neuron is activated, sta is set to the constant speed (here 5), and once it counts down, the NT is released into the synapse.
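To make the countdown concrete, here is a toy trace (my own illustration, assuming one step() call per loop iteration): with speed = 5, activation sets sta = 5 and the release fires on the step where sta has counted down to 1, i.e. on the fifth step after activation.

```python
def trace_activation(speed: int) -> int:
    """Count loop iterations from activation until the NT release fires."""
    sta = speed          # set on activation, like self.sta = self.speed
    steps = 0
    while sta > 1:       # the 'transmitting' steps that just decrement sta
        sta -= 1
        steps += 1
    # on the next step sta == 1, so the release branch fires
    return steps + 1

print(trace_activation(5))  # 5
```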

self.recovery = 5  # if 0, neuron ready to send signal. -=1 on each step after
self.str = 0  # str - steps to recovery

After releasing its NT, the neuron recovers. str is another countdown counter.

# inner NT
self.reproductivity = [0.5, -0.1]  # transmitters produced each step
self.accumulated = [0, 0]  # moved to synapse; accumulated *= returnablity

Throughout its life, a neuron produces neurotransmitters. reproductivity shows how much is created on each iteration, and accumulated how much the neuron already contains.

Yes, this is rather straightforward, clumsy logic, but later, if desired, you can plug in any function for computing the NT dynamically.

When the sta counter reaches zero, the values from accumulated are added to synapse, and accumulated is multiplied by returnablity, so only a small fraction stays behind.

The connected neurons then take NT from the synapse, while the retained fraction remains in the neuron. (Yes, part of the NT goes back to the neuron.)
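A quick numeric sketch of that release step, with the default returnablity = 0.1 and hypothetical accumulated values (the helper function is my own; in the real class this happens inside step()):

```python
def release(accumulated, synapse, returnablity=0.1):
    """Move accumulated NT into the synapse, keeping a fraction behind."""
    synapse = [synapse[0] + accumulated[0], synapse[1] + accumulated[1]]
    accumulated = [accumulated[0] * returnablity, accumulated[1] * returnablity]
    return accumulated, synapse

acc, syn = release([8.0, -2.0], [0.0, 0.0])
print(syn)  # [8.0, -2.0] released into the synapse
print(acc)  # [0.8, -0.2] kept back in the neuron
```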

self.current_state = [0, 0]  # potential being accumulated this step
self.last_state = [0, 0]  # potential after the previous step; [activator, inhibitor]

current_state stores the potential level in the neuron. Since the calculation happens sequentially in a loop, a situation can arise where a neuron must accept NT from many others, but some of them have already been updated this step while the rest have not. Therefore each neuron gets an extra attribute, last_state, which is updated only after all the calculations are complete. That is, during a pass the new data is written to current_state, while last_state is read.
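This double-buffering trick can be shown in isolation. In this toy example (my own, with plain dicts standing in for neurons), everyone reads only the published value and writes only the working one, so the second neuron sees the first neuron's old state rather than its half-finished update:

```python
# each 'neuron' is a dict with a working buffer and a published buffer
neurons = [
    {"current": 5.0, "last": 5.0},
    {"current": 0.0, "last": 0.0},
]

# update pass: everyone reads only 'last', writes only 'current'
for n in neurons:
    others = sum(m["last"] for m in neurons if m is not n)
    n["current"] = n["last"] + others * 0.5

# publish pass: runs only after every update is done
for n in neurons:
    n["last"] = n["current"]

print(neurons[0]["last"])  # 5.0  (the other neuron contributed nothing yet)
print(neurons[1]["last"])  # 2.5  (saw the first neuron's *old* state)
```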

In the diagram it looks like this

And this is already a whole neural network!

Neural network class

Okay, there is a neuron class. But by itself it is useless. It needs to be made to work.

class Network:
    def __init__(self):
        # replaces explicit axons and dendrites
        self.neurons: dict[Neuron, list[Neuron]] = {}
        self.run = False

The neurons dictionary stores every neuron in the network as a key, with a list of the neurons it connects to as the value.

There is also a run flag, which lets us stop the neural network when it is running in a separate thread.

Next are a couple of methods for creating and editing a network

# first to second (one-way connection); calling it a second time removes the link
def link(self, n1, n2):
    if n2 in self.neurons[n1]:
        self.neurons[n1].remove(n2)
    else:
        self.neurons[n1].append(n2)

def add(self, n: Neuron):
    self.neurons[n] = []
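Note that link is a toggle: linking the same pair twice removes the connection. A quick self-contained check (these methods are copied here with plain objects standing in for Neuron instances):

```python
class Network:
    def __init__(self):
        self.neurons = {}  # neuron -> list of neurons it feeds into
        self.run = False

    def add(self, n):
        self.neurons[n] = []

    # first to second (one-way); calling it twice toggles the link off
    def link(self, n1, n2):
        if n2 in self.neurons[n1]:
            self.neurons[n1].remove(n2)
        else:
            self.neurons[n1].append(n2)

net = Network()
a, b = object(), object()  # stand-ins for Neuron instances
net.add(a)
net.add(b)
net.link(a, b)
print(b in net.neurons[a])  # True
net.link(a, b)              # the same call again removes the link
print(b in net.neurons[a])  # False
```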

And finally, the main logic of its work

def maincycle(self):
    while self.run:
        for neuron in self.neurons.keys():
            neuron.step()
            tm = list(neuron.synapse)  # snapshot before decay
            neuron.synapse[0] *= 0.1  # most of the NT leaves the synapse
            neuron.synapse[1] *= 0.1
            amount = len(self.neurons[neuron])
            for target in self.neurons[neuron]:
                target.dendrites((tm[0] / amount, tm[1] / amount))

        for neuron in self.neurons.keys():
            neuron.last_state = list(neuron.current_state)  # copy, don't alias

        time.sleep(0.01)

Here we iterate over all the neurons in the dictionary, distribute the neurotransmitters from each synapse equally across the neurons connected to it, and then update the published state of each one.
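The distribution step in isolation, with hypothetical numbers (a helper of my own; in the real loop the shares go into each target's dendrite buffer via the dendrites method):

```python
def distribute(synapse, targets):
    """Split the synapse contents equally among target dendrite buffers."""
    amount = len(targets)
    if amount == 0:
        return  # nothing connected, nothing to deliver
    share = (synapse[0] / amount, synapse[1] / amount)
    for dendrite in targets:
        dendrite[0] += share[0]
        dendrite[1] += share[1]

targets = [[0.0, 0.0], [0.0, 0.0]]
distribute([6.0, -2.0], targets)
print(targets)  # [[3.0, -1.0], [3.0, -1.0]]
```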

As you may have noticed, the loop calls the neuron's step method, which implements the neuron's own logic:

def step(self):
    # pcns() is a timestamp helper from the author's repository (not shown here)
    self.accumulated[0] += self.reproductivity[0]
    self.accumulated[1] += self.reproductivity[1]
    if self.str > 0:
        print(pcns(), self.name, 'RECOVERING')
        self.str -= 1
    elif self.sta == 1:
        print(pcns(), self.name, 'RELEASING')
        self.sta = 0
        self.synapse[0] += self.accumulated[0]
        self.synapse[1] += self.accumulated[1]
        self.accumulated[0] *= self.returnablity
        self.accumulated[1] *= self.returnablity
        self.str = self.recovery
    elif self.sta > 0:
        print(pcns(), self.name, 'TRANSMITTING')
        self.sta -= 1
    elif self.last_state[0] + self.last_state[1] > self.treshold:
        print(pcns(), self.name, 'ACTIVATING')
        self.current_state = [0, 0]
        self.sta = self.speed
    else:
        print(pcns(), self.name, 'ACCUMULATING')
        self.current_state[0] += self.dendrite[0]
        self.current_state[1] += self.dendrite[1]
        self.dendrite[0] = 0
        self.dendrite[1] = 0

And the dendrites method, which receives NT from a synapse:

def dendrites(self, tm):
    print(pcns(), self.name, 'RECEIVING')
    self.dendrite[0] += tm[0]
    self.dendrite[1] += tm[1]

Now let's try to run it all

import threading
import time

if __name__ == '__main__':
    net = Network()
    n1 = Neuron('FIRST')
    n2 = Neuron('SECOND')
    net.add(n1)
    net.add(n2)
    net.link(n1, n2)
    # start the loop only after the network is built, so the dict
    # is not modified while maincycle is iterating over it
    net.run = True
    threading.Thread(target=net.maincycle).start()
Part of the output in the console

----------- 1 -----------
386500223981600 FIRST TRANSMITTING
386500224013300 SECOND RECEIVING
386500224025400 SECOND TRANSMITTING
----------- 2 -----------
386500727221500 FIRST TRANSMITTING
386500727240500 SECOND RECEIVING
386500727253100 SECOND TRANSMITTING
----------- 3 -----------
386501230345700 FIRST TRANSMITTING
386501230378100 SECOND RECEIVING
386501230395500 SECOND TRANSMITTING
----------- 4 -----------
386501739995900 FIRST TRANSMITTING
386501740041700 SECOND RECEIVING
386501740061700 SECOND TRANSMITTING
----------- 5 -----------
386502247167700 FIRST RELEASING
386502247208500 SECOND RECEIVING
386502247231500 SECOND RELEASING
----------- 6 -----------
386502761955400 FIRST RECOVERING
386502761996500 SECOND RECEIVING
386502762028300 SECOND RECOVERING
----------- 7 -----------
386503265130200 FIRST RECOVERING
386503265159800 SECOND RECEIVING
386503265181800 SECOND RECOVERING

Everything works exactly as intended! (Although this is unlikely to be clear on a first reading.)

GUI

"But nothing is clear!" you may say. And I completely agree with you. So, after spending an hour with ChatGPT, I put together a graphical interface in pygame.

Using this interface you can add and remove neurons, create and delete connections, save and load models, pan around the screen and zoom, and also display indicators in real time. (I was pleasantly surprised by the quality of ChatGPT 4.0.)

Here the connection is highlighted during signal transmission

The source code can be found on my GitHub.

Conclusion

I managed to realize the original idea: everything works the way I wanted. I was surprised how similar the mathematical models of neural networks turned out to be to the real thing.

In the following articles I will show how I design the neural network, try to attach it to objects and have it interact with them, and also provide a guide to fine-tuning the network (the indicators whose values can be changed).

I want to create a mechanism for the evolutionary development of the neural network. It's hard to imagine creating thousands of neurons by hand, so let them be generated randomly, and I will just set the conditions for natural selection.

In general, there are still a lot of ideas. As they are implemented, I will write new articles. Thanks for reading!
