Xenobots and evolutionary algorithms

We decided to take a break from the technical side of our articles and write a short, lighter piece on biorobots in the context of artificial intelligence.

The AI approach as a whole does not aim to imitate the human brain or the brain of any other animal. Only a few specialized projects set out to replicate neural processes or cognitive abilities.

Although classical MLPs are loosely inspired by the way neurons activate and wire together in our brains, a biorobot is a synthesis of robotics and organic tissue, and that synthesis is hard to pull off. It is genuinely difficult to reproduce an animal's body shape well enough to recreate the key functions of real living creatures.

Machine learning, however, has found its way into this field too, for example in the creation of a biorobot from frog cells whose shape was selected by AI. What for? The most fruitful replication possible.

Scientists from the University of Vermont, Tufts University, and the Wyss Institute for Biologically Inspired Engineering at Harvard University have created the world's first self-replicating living robots, or xenobots. They are made from frog cells, and the “robots” themselves are designed using AI algorithms.

The AI was able to test billions of body shapes to find the most efficient designs for replication, resulting in Pac-Man-shaped organisms that could harvest cells to create xenobot "babies." These mini-robots could also pick up payloads, such as medicine, and could regenerate cells.

At one point, Michael Levin, the project's lead, even had to reassure everyone that the little Pac-Men were not going to destroy anything.

It is important to note that, unlike other bio-flavored robotics, xenobots can reproduce: they are, in a sense, new living creatures molded by humans. You might think news like this could never have slipped past you, yet the development actually dates back to 2020, before the hundreds of generative neural networks and thousands of bubble startups had burst onto the scene with their first seed rounds. The neural network hype was only just beginning.

Since the dawn of agriculture, humans have manipulated organisms for their own benefit.

In recent decades, genetic editing technology has become widespread, and several artificial organisms imitating the forms of known animals have recently been created.

What sets this research apart is that it is the first to create "fully biological machines from scratch."

Sam Kriegman, a graduate student and lead author of the paper, used an evolutionary algorithm to generate thousands of candidate designs for new life forms.

In their research, using the Deep Green supercomputer cluster at UVM's Vermont Advanced Computing Core, they set out to solve tasks posed by the scientists, such as locomotion in a single direction. The supercomputer repeatedly assembled various body shapes and configurations from simulated cells.

The programs, grounded in basic principles of biophysics and in the capabilities of frog skin and heart cells, gradually kept the most successful designs while discarding the unsuccessful ones. After hundreds of runs of the algorithm, the most promising designs were selected for further testing.

The next step was taken by a team at Tufts University led by Michael Levin and including microsurgeon Douglas Blackiston. These scientists brought the computer-generated designs to life.

They used stem cells harvested from embryos of the African clawed frog Xenopus laevis, which earned the organisms the name "xenobots." The cells were separated and left to incubate, then cut apart and joined together under a microscope with tiny forceps and an electrode to form the computer-designed structures.

Assembled into shapes that do not exist in nature, the cells began to work together. Skin cells formed a passive scaffold, while the contractions of the heart muscle cells became coordinated, producing forward motion. This movement was directed by the computer-generated design and sustained by spontaneous self-organizing patterns, allowing the xenobots to move on their own.

The same team that created the first living robots (the "xenobots" assembled from frog cells, as reported in 2020) found that these computer-designed, hand-assembled organisms can swim out into their tiny dish, find individual cells, gather hundreds of them together, and assemble xenobot "babies" inside their Pac-Man-shaped "mouths." A few days later, these become new xenobots that look and move just like their parents.

Evolutionary algorithms: types and an example of neural network parameter optimization

Evolutionary algorithms are a class of methods inspired by biological evolution. They come in several main types, each with its own mechanisms and applications; let's look at these types, their operating principles, and key technical aspects.

Genetic Algorithms (GA). Genetic algorithms are one of the best-known types of evolutionary algorithms. They operate on a population of candidate solutions that are encoded as genotypes (usually strings or arrays). The main operators include selection, crossover, and mutation.
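
To make these mechanics concrete, here is a minimal GA sketch on bit strings with the classic "count the ones" toy objective; the objective, population size, and mutation rate are arbitrary illustrative choices, not part of the xenobot work.

import random

def onemax(bits):
    # Toy fitness: the number of ones in the bit string
    return sum(bits)

def tournament(pop, k=3):
    # Pick the best of k randomly chosen individuals
    return max(random.sample(pop, k), key=onemax)

def crossover(a, b):
    # One-point crossover
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits, p=0.01):
    # Flip each bit with probability p
    return [1 - b if random.random() < p else b for b in bits]

population = [[random.randint(0, 1) for _ in range(32)] for _ in range(50)]
for _ in range(100):
    children = []
    while len(children) < len(population):
        c1, c2 = crossover(tournament(population), tournament(population))
        children += [mutate(c1), mutate(c2)]
    population = children
print(max(onemax(ind) for ind in population))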

Evolutionary strategies (ES). Evolutionary strategies focus on the processes of mutation and selection. Unlike genetic algorithms, ES use numerical vectors to represent individuals and apply Gaussian distributions for mutation. The main parameters are population size, mutation parameters, and selection rules.
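
A correspondingly minimal sketch of a (1+1) evolution strategy, minimizing a toy sum-of-squares objective with a fixed Gaussian mutation step; both the objective and the constant step size are illustrative assumptions (practical ES variants adapt the step size).

import random

def sphere(x):
    # Toy objective: sum of squares, with its minimum at the origin
    return sum(v * v for v in x)

parent = [random.uniform(-5, 5) for _ in range(10)]
sigma = 0.5  # standard deviation of the Gaussian mutation

for _ in range(1000):
    # Mutation: add Gaussian noise to every coordinate
    child = [v + random.gauss(0, sigma) for v in parent]
    # (1+1) selection: keep the child only if it is at least as good
    if sphere(child) <= sphere(parent):
        parent = child

print(sphere(parent))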

Genetic programming (GP). In genetic programming, individuals are programs or expressions, usually represented as parse trees.

The main operators include crossover and mutation, similar to those used in GA but applied to program trees. GP is used to automatically generate programs, algorithms, and expressions, as well as to solve regression and classification problems.
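
As an illustration of GP, the sketch below uses DEAP's gp module for a toy symbolic-regression task; the primitive set and the target function f(x) = x² + x are arbitrary choices made only for this example.

import operator
import random
from deap import base, creator, gp, tools

# Primitive set: the building blocks of the evolved expression trees
pset = gp.PrimitiveSet("MAIN", 1)
pset.addPrimitive(operator.add, 2)
pset.addPrimitive(operator.sub, 2)
pset.addPrimitive(operator.mul, 2)
pset.renameArguments(ARG0="x")

creator.create("FitnessMinGP", base.Fitness, weights=(-1.0,))
creator.create("Tree", gp.PrimitiveTree, fitness=creator.FitnessMinGP)

gp_toolbox = base.Toolbox()
gp_toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=1, max_=3)
gp_toolbox.register("individual", tools.initIterate, creator.Tree, gp_toolbox.expr)

def eval_symreg(tree):
    # Compile the tree into a callable and measure its squared error
    # against the toy target f(x) = x**2 + x on a few sample points
    func = gp.compile(tree, pset)
    return sum((func(x) - (x * x + x)) ** 2 for x in range(-5, 6)),

ind = gp_toolbox.individual()
print(str(ind), eval_symreg(ind))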

The evolutionary algorithm, implemented on the Deep Green supercomputer cluster at UVM's Vermont Advanced Computing Core, operates by mimicking the principles of biological evolution to optimize new life forms.

This process involves several key steps that interact and are refined through mechanisms similar to those seen in natural selection.

The algorithm starts by generating an initial population of random cell configurations, each of which represents a potential solution to a given problem. These configurations can be associated with various parameters and characteristics that depend on the biophysical capabilities of the cells.

After the initial population has been created, each configuration is evaluated for its effectiveness in performing a given task.

The assessment is based on criteria defined by scientists, such as speed of movement or efficiency in performing specific functions. The process is similar to phenotypic assessment in biology, where the expressed traits of an organism are assessed for their adaptive value.
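
The real xenobot pipeline relies on a soft-body physics simulation, but the general shape of the data can be hinted at with a deliberately simplified sketch: a candidate body is a small 3-D voxel grid whose cells are empty, passive (skin), or contractile (heart), and fitness is the net displacement reported by some simulator. The grid size and the simulate_displacement stub below are illustrative assumptions, not the researchers' code.

import random

GRID = (8, 8, 7)  # illustrative voxel workspace; 0 = empty, 1 = passive (skin), 2 = contractile (heart)

def random_configuration():
    # Random candidate body: assign a cell type to every voxel
    return [[[random.choice((0, 1, 2)) for _ in range(GRID[2])]
             for _ in range(GRID[1])]
            for _ in range(GRID[0])]

def simulate_displacement(config, steps=100):
    # Placeholder for the physics simulation: a biased random walk
    # whose drift grows with the number of contractile voxels
    contractile = sum(cell == 2 for plane in config for row in plane for cell in row)
    x = 0.0
    for _ in range(steps):
        x += random.gauss(0.001 * contractile, 1.0)
    return x

def fitness(config):
    # Net displacement along one axis, matching the one-directional
    # locomotion task described above
    return simulate_displacement(config)

population = [random_configuration() for _ in range(20)]
scores = [fitness(c) for c in population]
print(max(scores))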

The configurations that show the greatest efficiency are selected to participate in further stages of the algorithm, while less successful variants are discarded. This selection is based on the principle of survival of the fittest, where only the best solutions move on to the next stage.

The retained configurations then undergo recombination and mutation, which produces new configuration variants.

Recombination is analogous to sexual reproduction in biology, where the genetic material of two parents combines to create offspring with new properties. Mutation, on the other hand, introduces random changes in configuration, allowing new possibilities to be explored and potentially more efficient solutions to be found.
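
Continuing the same toy voxel representation from the sketch above, recombination and mutation over configurations could look roughly like this; again, this is purely illustrative and not the published implementation.

import copy
import random

def recombine(parent_a, parent_b):
    # "Sexual" recombination: split both bodies at a random x-plane
    # and glue the left part of one parent to the right part of the other
    cut = random.randint(1, len(parent_a) - 1)
    return copy.deepcopy(parent_a[:cut]) + copy.deepcopy(parent_b[cut:])

def mutate_config(config, p=0.02):
    # Random point mutations: re-roll the type of a few voxels
    child = copy.deepcopy(config)
    for plane in child:
        for row in plane:
            for k in range(len(row)):
                if random.random() < p:
                    row[k] = random.choice((0, 1, 2))
    return child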

The process of recombination and mutation is repeated many times, with each new generation of configurations undergoing the same procedure of evaluation, selection, recombination and mutation.

At each stage, the algorithm aims to improve the overall efficiency of the configurations, which is analogous to the accumulation of adaptive changes in biological populations.

Ultimately, after numerous iterations, the algorithm identifies the most optimal cell configurations to perform the given tasks.

Technically, the algorithm is controlled by neural networks that model cell interactions and behavior based on given biophysical rules. Neural networks are trained to recognize and predict the most successful configurations, which increases the efficiency and accuracy of the algorithm.

They play a key role in the evaluation and optimization process, providing a high degree of adaptability and self-organization capacity.
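
One common pattern that matches this description (used here only as an assumption about what such a component might look like, not as a description of the actual system) is a surrogate model: a neural network that learns to predict fitness from simple configuration features, so that expensive simulations can be skipped for unpromising candidates.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: each row is a feature vector describing a
# configuration (e.g. counts of passive/contractile voxels, symmetry scores),
# and y is the fitness measured by the full physics simulation
rng = np.random.default_rng(0)
X = rng.random((500, 6))
y = X[:, 0] * 2.0 - X[:, 1] + rng.normal(0, 0.1, 500)  # synthetic target

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X, y)

# Cheap pre-screening: only candidates the surrogate rates highly
# would be passed on to the expensive simulator
candidates = rng.random((100, 6))
predicted = surrogate.predict(candidates)
promising = candidates[np.argsort(predicted)[-10:]]
print(promising.shape)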

The Deep Green supercomputer cluster provides the necessary computing power to perform complex calculations and simulate a large number of configurations, which allows the evolutionary optimization process to be carried out with high speed and accuracy.

An example of a simple evolutionary algorithm:

Let us look at how this works in Python, using the DEAP (Distributed Evolutionary Algorithms in Python) library to implement an evolutionary algorithm that optimizes the parameters of a neural network.

  1. Initialization of the population. Each individual in the population is a set of neural network parameters, including the architecture (number of layers, number of neurons in each layer) and weight parameters. In Python, this can be implemented as follows:

import random
from deap import base, creator, tools, algorithms
import numpy as np

# Define a fitness type that minimizes the error function
creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
creator.create("Individual", list, fitness=creator.FitnessMin)

def create_individual():
    # Generate a random architecture: a list with the number of neurons in each hidden layer
    num_layers = random.randint(1, 5)
    layers = [random.randint(10, 100) for _ in range(num_layers)]
    return layers

toolbox = base.Toolbox()
toolbox.register("individual", tools.initIterate, creator.Individual, create_individual)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
  2. Fitness evaluation. Each individual (neural network) is assessed for its ability to perform a given task, such as classification or regression:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

data = load_iris()
X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, test_size=0.2)

def evaluate(individual):
    # Build and train a neural network with the given architecture
    layers = tuple(individual)
    clf = MLPClassifier(hidden_layer_sizes=layers, max_iter=1000)
    clf.fit(X_train, y_train)
    predictions = clf.predict(X_test)
    accuracy = accuracy_score(y_test, predictions)
    return 1 - accuracy,  # return the error as a one-element tuple (to be minimized)

toolbox.register("evaluate", evaluate)
  3. Crossover and mutation operators. These operators generate new solutions based on existing ones, maintaining genetic diversity in the population:

def mate(ind1, ind2):
    # One-point crossover on the neural network architecture
    size = min(len(ind1), len(ind2))
    if size > 1:
        cxpoint = random.randint(1, size - 1)
        ind1[cxpoint:], ind2[cxpoint:] = ind2[cxpoint:], ind1[cxpoint:]
    return ind1, ind2

def mutate(individual, indpb=0.2):
    # Mutation: re-roll the number of neurons in a layer with probability indpb
    for i in range(len(individual)):
        if random.random() < indpb:
            individual[i] = random.randint(10, 100)
    return individual,

toolbox.register("mate", mate)
toolbox.register("mutate", mutate, indpb=0.2)
toolbox.register("select", tools.selTournament, tournsize=3)
  4. The main loop of the evolutionary algorithm. The algorithm is executed over several generations, each time updating the population and improving the parameters of the neural networks:

population = toolbox.population(n=50)
ngen = 40
cxpb = 0.5  # crossover probability
mutpb = 0.2 # mutation probability

for gen in range(ngen):
    # Evaluate every individual in the population
    fitnesses = list(map(toolbox.evaluate, population))
    for ind, fit in zip(population, fitnesses):
        ind.fitness.values = fit

    # Select individuals for the next generation
    offspring = toolbox.select(population, len(population))
    offspring = list(map(toolbox.clone, offspring))

    # Apply crossover and mutation
    for child1, child2 in zip(offspring[::2], offspring[1::2]):
        if random.random() < cxpb:
            toolbox.mate(child1, child2)
            del child1.fitness.values
            del child2.fitness.values

    for mutant in offspring:
        if random.random() < mutpb:
            toolbox.mutate(mutant)
            del mutant.fitness.values

    # Evaluate the new generation (only individuals with invalidated fitness)
    invalid_ind = [ind for ind in offspring if not ind.fitness.valid]
    fitnesses = map(toolbox.evaluate, invalid_ind)
    for ind, fit in zip(invalid_ind, fitnesses):
        ind.fitness.values = fit

    # Replace the old generation with the new one
    population[:] = offspring

    # Report the current best solution
    fits = [ind.fitness.values[0] for ind in population]
    print(f"Generation {gen}: Best fitness {min(fits)}")

It is important to understand that an evolutionary algorithm applied to a neural network is about optimizing parameters through an evolutionary process, not about fully implementing Darwinian selection throughout the whole system. And here lies the point that many modern startups forget: the single-purpose nature of AI. Optimizing the shape of the "new frog" with a neural network is an ideal way to apply ML in projects like this. One problem, one feature.
