How MEPhI students summoned Shiva at Eurobot 2024, part 2

In the first part of this article we described the design and electronics of SheevaBot, the robot we built to compete at Eurobot 2024. Now let's move on to the software.

Communication with low-level microcontrollers

The first step in programming SheevaBot was to establish communication between the main microcomputer (RPi4) and the microcontrollers responsible for controlling the various components of the robot. It was decided to abandon the use of Wi-Fi due to the risk of signal loss, which could lead to unpredictable behavior of the robot.

At the beginning of development, we tried to create our own API for data transfer via SPI, using a protocol with the transfer of three numbers (motor number, rotation direction, and speed/RPM). This approach worked with servos and stepper motors, but turned out to be ineffective for electric motors, especially when integrated with ROS. As a result, we had to look for ready-made communication options. This is how we came to microROS.

So we settled on UART + microROS. It remained to pick, from the whole variety of options, a controller that is relatively inexpensive and readily available, yet powerful enough for our tasks. Here you can find tables with controller "tiers", where colcon.meta, colcon_lowmem.meta, and colcon_verylowmem.meta characterize how many nodes, services, and so on a microcontroller of each class can run.

Low-level control

To control the motors, servos, various sensors, and other electronics in our project, we used a Raspberry Pi Pico microcontroller board based on the RP2040. This board was not chosen by accident.

First, it is inexpensive yet powerful: 264 KB of RAM, 2 MB of flash memory, and two ARM Cortex-M0+ cores clocked at 133 MHz. For comparison, the table shows the specifications of a well-known STM32 board, the STM32F411CEU6 (popularly known as the Black Pill).

| Characteristic | Raspberry Pi Pico | STM32F411CEU6 |
|---|---|---|
| Microcontroller | RP2040 | STM32F411CE |
| Architecture | Dual-core ARM Cortex-M0+ | ARM Cortex-M4 |
| Core frequency | 133 MHz | 100 MHz |
| RAM | 264 KB | 128 KB |
| Flash | 2 MB | 512 KB |
| GPIO | 26 | 16 |
| SPI | 2 | 3 |
| I2C | 2 | 2 |
| UART | 2 | 3 |

Second, the microROS package is quite easy to install on the Pico, which, together with everything above, sealed the choice of microcontroller.

The Pico was programmed in VS Code using the Raspberry Pi Pico C/C++ SDK. Incidentally, the Raspberry Pi website provides comprehensive documentation for this microcontroller, as well as a quick-start guide for the Pico SDK that covers all the required setup.

The autonomous robot project turned out to be quite demanding in terms of microcontroller pin count, and all the components (namely, a display driven over the I2C bus, 6 servos, 4 Nema17 steppers paired with DRV8825 drivers, 2 limit switches, 4 brushless electric motors, and an MPU6050 sensor) could not fit on a single Pico. We therefore increased the number of microcontrollers to two. At the same time, we had to solve two rather tricky and partly conflicting problems at once: distribute the load evenly while placing the microcontrollers conveniently and compactly on the shield. The resulting split looks like this:

  • Pico1 – motors, servo drives, MPU6050, limit switches;

  • Pico2 – stepper motors, display.

Why do we need microROS?

As noted above, one of the criteria for choosing the Raspberry Pi Pico microcontroller was the ability to install microROS on it quite easily. So why did we choose microROS?

What is microROS? In essence, microROS is "ROS 2 for microcontrollers". Its main advantage is that it allows a microcontroller to be integrated into the ROS 2 environment quickly, without manually writing an API for communication between the microcontroller and the microcomputer. Thanks to this, the results of various "heavy" operations (image processing, navigation, and so on) performed on the microcomputer can be sent to the microcontroller, which in turn drives the actuators and sends back sensor readings.

Thus, thanks to microROS, a single robotic system is formed that can perceive the surrounding world, analyze it and change it.

Peripheral control

Below are links and descriptions of the code for controlling motors, servos, stepper motors, display and MPU6050.


Nema17 stepper motor control was implemented using DRV8825 drivers. The code can be found at pico2\external_lib\stepper.

For minimal motor control, a PWM signal must be fed to the driver's STEP pin (each pulse rotates the shaft by a fixed increment), and the rotation direction is selected via the DIR pin. In our project we went further and monitored the driver state by reading the FAULT pin. If the driver overheats, this pin goes to "0", and on the next timer-driven data transfer to the microcomputer (via timer_callback(...) in pico2.cpp) the stepper_overheat() function fires, returning the appropriate error code and (just in case) disabling the driver via the EN pin. The rotation function itself, stepper(), selects the required motor, checks the drivers for overheating (reads the FAULT pin), then sets the rotation direction and drives the STEP pin with a 50% duty-cycle PWM for a certain time, after which it resets the duty to 0. The motor thus rotates through the given angle.
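To make the "PWM for a certain time equals a given angle" arithmetic concrete, here is a minimal sketch, not the project's actual code: the 200 steps-per-revolution figure is standard for a Nema17, while the microstepping divisor (set by the DRV8825's mode pins) and the function names are assumptions.

```c
#include <stdint.h>

/* Assumed constants: a Nema17 has 200 full steps per revolution;
   the DRV8825 microstep divisor depends on its M0..M2 pin jumpers. */
#define FULL_STEPS_PER_REV 200
#define MICROSTEP_DIVISOR  8

/* Number of STEP pulses needed to rotate the shaft by `degrees`. */
uint32_t steps_for_angle(float degrees)
{
    float steps = degrees / 360.0f * FULL_STEPS_PER_REV * MICROSTEP_DIVISOR;
    return (uint32_t)(steps + 0.5f);   /* round to the nearest pulse */
}

/* With a 50% duty PWM at `freq_hz` on the STEP pin, how long (in ms)
   the PWM must stay on to emit that many pulses. This is how "drive
   PWM for a certain time" maps to a target angle. */
uint32_t pulse_time_ms(uint32_t steps, uint32_t freq_hz)
{
    return steps * 1000u / freq_hz;
}
```

For example, a full revolution at 1/8 microstepping needs 1600 pulses, which takes 1.6 s at a 1 kHz STEP frequency.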

To minimize driver heating, we decided to switch the drivers off once the cassette holder finishes turning. This option is not suitable for lifting the cassette holder, however, since it must be held in a fixed position at all times.

To receive messages from the microcomputer, a stepper_subscriber_callback() function was declared in pico2.cpp; the message itself was the standard geometry_msgs/msg/Vector3 type, carrying the motor number, rotation direction, and rotation angle (exactly three coordinates).

MG90S servo control was organized quite simply. The code can be found at pico1\external_lib\servo. The rotation angle of this servo is set by the PWM duty cycle at a fixed frequency of 50 Hz (you can read more about this here). All of this is handled by the servo() function, which selects the required servo and sets the pulse width in the 500–2500 µs range (0–180 degrees) according to the requested angle.

To receive messages from the microcomputer, a servo_subscriber_callback() function was declared in pico1.cpp; the message itself was the standard geometry_msgs/msg/Vector3 type, carrying the servo number and rotation angle (two coordinates; the third is unused).

Controlling the JGB37-3525 geared motors involves PWM plus several pins to select the direction and read the rotation speed. The code can be found at pico1\external_lib\motors. For correct operation, the motor needs a PWM signal at a frequency of about 20 kHz, with the speed depending on its duty cycle.

Speed and direction are controlled by motorN_controller() (N is the motor number), which takes a single argument, the speed, which can be positive or negative: the sign selects the rotation direction and the absolute value sets the speed.
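The sign/magnitude decomposition can be sketched like this; the struct and function names are illustrative, since the real motorN_controller() drives the direction pins and the PWM slice directly.

```c
#include <stdlib.h>
#include <stdint.h>

/* Decompose a signed speed command into a direction flag and a PWM
   duty (0-100%), the two things the motor driver actually needs. */
typedef struct {
    int forward;       /* 1 = forward, 0 = reverse */
    uint8_t duty_pct;  /* magnitude, clamped to 100 */
} motor_cmd_t;

motor_cmd_t motor_decompose(int speed)
{
    motor_cmd_t cmd;
    cmd.forward = (speed >= 0);          /* sign selects direction */
    int mag = abs(speed);                /* magnitude sets the speed */
    cmd.duty_pct = (uint8_t)(mag > 100 ? 100 : mag);
    return cmd;
}
```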

To receive messages from the microcomputer, a twist_subscriber_callback() function was declared in pico1.cpp; the message itself was the standard geometry_msgs/msg/Twist type. For full control of the robot's motion in the plane it is enough to use the linear velocities along the x and y axes and the angular velocity around the z axis. This data arrives from the microcomputer, is preprocessed by the kinematics library, which converts the received velocities into revolutions of each of the four motors, and is then fed into motorN_controller().
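The project's kinematics library is not shown in the article, so here is a hedged sketch of the textbook inverse kinematics for a four-wheel mecanum base (the wheel type is our assumption), which turns the three Twist components into per-wheel angular speeds:

```c
/* Wheel angular speeds (rad/s): front-left, front-right, rear-left, rear-right. */
typedef struct { float fl, fr, rl, rr; } wheel_speeds_t;

/* Standard mecanum inverse kinematics: vx, vy are body-frame linear
   velocities (m/s), wz the angular velocity about z (rad/s),
   `wheel_radius` the wheel radius (m), and `half_lx_plus_ly` the sum of
   half the wheelbase and half the track width (m). */
wheel_speeds_t mecanum_ik(float vx, float vy, float wz,
                          float wheel_radius, float half_lx_plus_ly)
{
    wheel_speeds_t w;
    w.fl = (vx - vy - half_lx_plus_ly * wz) / wheel_radius;
    w.fr = (vx + vy + half_lx_plus_ly * wz) / wheel_radius;
    w.rl = (vx + vy - half_lx_plus_ly * wz) / wheel_radius;
    w.rr = (vx - vy + half_lx_plus_ly * wz) / wheel_radius;
    return w;
}
```

Pure forward motion spins all four wheels equally; pure rotation spins the left and right sides in opposite directions.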

Reading data from the MPU6050 was done with the i2cdevlib library, which includes an example of using this module to compute rotation angles around the x, y, and z axes, as well as linear accelerations along x and y. The code can be found at pico1/external_lib/imu; the library itself and auxiliary files live in the MPU6050 folder, and the data-reading code is in imu.cpp.

The library contains quite heavy mathematics that is demanding on the microcontroller's resources. To reduce the load, you can increase the MPU6050_DMP_FIFO_RATE_DIVISOR value in MPU6050_6Axis_MotionApps_V6_12.h; more details are written here.

Angle and linear-acceleration values were sent to the microcomputer by a timer in timer_callback(rcl_timer_t *timer, int64_t last_call_time); the message itself was the standard sensor_msgs__msg__Imu type, carrying the linear accelerations along the x and y axes and the rotation angle about the z axis. When the MPU6050 powered up, a calibration was performed, establishing a fixed reference frame relative to which the angle was then measured.

Unfortunately, the peculiarities of the mathematical processing of the sensor readings make drift unavoidable: the initially fixed reference frame rotates arbitrarily, and the more abruptly the robot moves, the stronger the drift. The only remedy is periodic recalibration with the robot positioned exactly as it was initially, which requires some external system, since the sensor itself already accumulates error. A magnetometer, present in more advanced versions of this sensor, can partially mitigate the problem, but that solution looks very questionable indoors, where strong magnetic fields are common.

Display control relied on an external library. The code can be found at pico2\external_lib\display; the library itself and auxiliary files live in the SSD1306 folder, and the Print_string() and Print_char() functions in display.cpp print a string and a character, respectively.

To receive messages from the microcomputer, a display_subscriber_callback() function was declared in pico2.cpp; the message itself was the standard std_msgs/msg/UInt8 type, carrying the number to be displayed.

For historical fairness, it is worth noting that we initially used a larger ST7735 TFT display with a more capable control library. However, that display required many more wires, and their number was limited by the slip-ring, so in the end we settled on a screen driven over I2C.

Description of algorithms and software

Structural diagram of the SheevaBot control system

Detailed description of the SheevaBot algorithm

At the top level, the development of the robot's algorithm can be divided into several parts: robot movement, the grabbing process, working with the cameras, and the main algorithm.

Structural diagram of the program

Movement

The input is a radius vector in the coordinate system centered on the playing field, with the angle measured relative to the robot (from the MPU6050 sensor).

Scheme of choosing the direction of movement of the robot depending on the radius vector

The figure shows the robot's local coordinate system; the arrows indicate rotation directions, and the dotted lines divide the plane into sectors. Depending on which sector the radius-vector angle falls into, the robot turns toward it. Note that the MPU6050 IMU reports values either from 0° to 180° or from -180° to 0°, so the algorithm had to be adapted.
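A hypothetical helper for adapting to that ±180° convention is to normalize every computed heading into the same range before comparing it with the sensor reading:

```c
/* Normalize an angle (degrees) into the (-180, 180] range the MPU6050
   reports in, so target headings computed in field coordinates can be
   compared with the sensor value directly. */
float wrap_angle_deg(float a)
{
    while (a > 180.0f)   a -= 360.0f;   /* fold down anything past +180 */
    while (a <= -180.0f) a += 360.0f;   /* fold up anything at or below -180 */
    return a;
}
```

For example, a computed heading of 270° becomes -90°, matching the sensor's convention.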

The length of the radius vector was computed in the control algorithm as the distance from the robot's position to the target, but more on that later.

External camera

A great deal of material and many GitHub projects have been written about detecting ArUco markers with the OpenCV library.

To obtain distances to specific ArUco markers from the camera, you need to know the camera parameters (focal length, principal-point coordinates, and distortion coefficients). With good calibration you can squeeze the most out of even the cheapest camera, but still, the closer a marker is to the edge of the field of view, the larger the error in the predicted distance on the field.
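As a rough illustration of why the focal length matters, a pinhole-model estimate of marker distance needs only the focal length in pixels (from calibration) and the marker's known physical size. This is a simplification: real pose estimation, such as OpenCV's ArUco functions, also uses the principal point and distortion coefficients, which is exactly why accuracy degrades toward the frame edges.

```c
/* Pinhole-model range estimate: an object of physical size X that
   appears x pixels wide through a lens of focal length f (in pixels)
   lies at distance Z = f * X / x. */
float marker_distance_m(float marker_side_m, float focal_px,
                        float apparent_side_px)
{
    return focal_px * marker_side_m / apparent_side_px;
}
```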

Flower picking

Flower capture algorithm

Since closing/opening the servo gripper and raising/lowering the flower manipulator are essentially binary commands, we got by with a single node containing a large number of conditionals. When a command to grab a flower arrives, the manipulator is lowered, the servo gripper closes, the manipulator is raised, and the cassette rotates; on arrival at the flower delivery point the task is even simpler: send the servos the release command.

Basic algorithm

The interaction between the main algorithm and the robot's subsystems is implemented through two ROS topics: one for reading commands from the main algorithm, the other for responses from the nodes. This approach made the system easy to debug by inspecting the messages in the topics. All information sent to the topics has a standard format: commands like "pick up a flower" or "move along the radius vector {r, φ}", and node responses with the structure "system name/node code: response".
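A sketch of parsing such a response string follows; the exact delimiters, buffer sizes, and the assumption that the node code is numeric are ours, not the article's.

```c
#include <stdio.h>
#include <string.h>

/* Parse a node response of the assumed form "system/code: response".
   Writes the system name (up to 31 chars), the numeric node code, and
   the response text (up to 63 chars). Returns 0 on success, -1 on a
   malformed message. */
int parse_response(const char *msg, char *system_name, int *node_code,
                   char *response)
{
    /* "%31[^/]" reads the system name up to '/', "%d" the code,
       then everything after ": " is taken as the response text. */
    if (sscanf(msg, "%31[^/]/%d: %63[^\n]",
               system_name, node_code, response) != 3)
        return -1;
    return 0;
}
```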

The quirks of working with the RPi4 and microROS became obvious during testing: launching the node that connects the Pico via microROS required rebooting the Pico every time, which complicated the work. To solve this, one RPi4 pin was wired to the Pico's reset pin: if no messages arrived from the Pico, the RPi4 pulled the pin low, forcing a reboot.
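The decision logic of such a watchdog can be sketched like this; GPIO handling is omitted, and the function name and timeout value are illustrative.

```c
#include <stdint.h>
#include <stdbool.h>

/* Heartbeat watchdog (decision only): the RPi4 records the timestamp of
   the last message received from the Pico and asserts the reset line
   when the silence exceeds `timeout_ms`. */
bool pico_needs_reset(uint64_t now_ms, uint64_t last_msg_ms,
                      uint64_t timeout_ms)
{
    return (now_ms - last_msg_ms) > timeout_ms;
}
```

In the main loop this check runs periodically; when it returns true, the RPi4 pulses the pin wired to the Pico's reset.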

Strategy of behavior on the field

Before starting the movement, the robot uses a camera and 4 ArUco markers to determine the center of the coordinate system and the coordinates of the points of interest on the field (scanning locations, flowers, starting bases). Then the robot heads to the scanning location optimally selected for searching for flowers.

During the scanning stage, the robot rotates on the spot, reading information about each flower: the index, the distance to it, and the angle of rotation. Then the distance to the nearest flower is calculated, and the robot moves along the radius vector, which is constructed based on the known angle and distance. After approaching the flower, the manipulator goes down, grabs the flower, and rises. This cycle is repeated 6 times until all the robot's “hands” have grabbed flowers.
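Choosing the nearest flower from the scan results is a simple minimum search over the measured distances; the sketch below assumes the scan data is stored as a plain array, which the article does not specify.

```c
#include <stddef.h>

/* Return the index of the nearest detected flower, or -1 if the scan
   found nothing. `dist` holds the measured distance to each flower. */
int nearest_flower(const float *dist, size_t n)
{
    if (n == 0) return -1;
    size_t best = 0;
    for (size_t i = 1; i < n; ++i)       /* linear scan for the minimum */
        if (dist[i] < dist[best]) best = i;
    return (int)best;
}
```

The chosen index, together with the angle recorded for that flower during the rotation, gives the radius vector {r, φ} the movement subsystem expects.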

When all the flowers are collected, the robot heads to the unloading base, after which the algorithm completes its work.

Anti-collision system

To prevent collisions on the field, a simple but reliable anti-collision system was implemented. It triggers emergency braking when the distance to the opponent's robot drops below a minimum threshold. If the opponent retreats to a safe distance, our robot can resume moving. In our matches, however, that never happened, and triggering the system meant the match simply ended when time expired.
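Such a brake/resume decision can be sketched with hysteresis so the robot does not oscillate at the boundary; the thresholds are illustrative, and the article does not say whether hysteresis was actually used.

```c
#include <stdbool.h>

/* Emergency-brake decision with hysteresis: brake when the opponent is
   closer than `stop_m`, resume only once it retreats beyond `resume_m`
   (resume_m > stop_m), otherwise keep the current state. */
bool collision_brake(float dist_m, bool braking,
                     float stop_m, float resume_m)
{
    if (dist_m < stop_m)   return true;   /* too close: stop */
    if (dist_m > resume_m) return false;  /* safely away: go */
    return braking;                       /* in the dead band: hold state */
}
```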

Results and conclusions

Although the final results were modest, our team managed to pass homologation, win one match, and, most importantly, overcome ourselves and receive the "Will to Win" award. Having walked this path, we realized that preliminary prototyping matters at the preparation stage, the right camera matters a great deal, and a stock of fireproof Raspberry Pis is priceless. 🙂

Based on the first experience of participation, we will be able to build a more reliable and multifunctional robot in the new season. For the preparation process for Eurobot 2025 and other robotics competitions, we invite you to t.me/engistories (Contact person: Valkovets Danila).

SheevaBot was created by:

1. Ermakov Alexander (captain)
2. Dvinskikh Pavel (electronic engineer)
3. Karavaev Kirill (electronic engineer)
4. Konovalov Georgy (designer)
5. Ivan Lyubimtsev (programmer)
6. Manshin Timur (programmer)
7. Monastyrny Maxim (programmer)
8. Osipov Kirill (designer)
9. Trifonov Fedor (technologist)
10. Shuvalov Gleb (SMM).
