Positioning a multi-joint manipulator by tilting the head

When adding another joint, the question immediately arises of how to control it together with the previous one. My first idea was to extend the principle used for the forearm to the entire structure: the endpoint of the whole arm tracks the viewing plane, and the positions of the intermediate joints are determined by inverse kinematics. In practice, such an arm would behave like a snake wriggling along the line of sight. I decided to shelve this idea until I have more accurate servos and sensors at my disposal, and the current implementation turned out to be much simpler. The forearm is positioned by the old algorithm, but at any moment we can switch to hand positioning mode, in which the forearm position is fixed. In other words, the two joints are controlled separately. This is not an ideal solution, but if you observe your own arm in real life, you will notice that many movements have a similar separated character: first the forearm is roughly positioned, and only then fine operations are performed with the hand.

Let's look at the hand positioning algorithm. After fixing the forearm, we also fix the viewing plane. We can then tilt the head forward and back and rotate it left and right, changing the angles of the line of sight relative to the X and Z axes. We want the hand to rotate around these same axes of the global XYZ coordinate system. For reasons of comfort and aesthetics, we also want head rotations to stay small while the corresponding hand movements cover a wider range. In the demonstration at the end of the article, you can see that head rotations in hand control mode are almost imperceptible, while the hand itself rotates through large angles. This is achieved by mapping small ranges of head rotation angles onto large ranges of hand rotation angles: the working range of head rotation for each of the X and Z axes is from -8 to 8 degrees, which maps onto a range from -90 to 90 degrees for the hand. Thus, from the head rotation angles we obtain two angles, vX and vZ, each with a 180-degree range. Next, using these angles, we construct a vector codirectional with the hand in the xyz coordinate system attached to the wrist, that is, to the end of the forearm. Finally, we find the angles α and β for the servos in the xyz coordinate system that produce this vector.
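The clamp-and-scale mapping described above can be sketched as a standalone function (the helper name headToHandAngle is mine, not from the project code):

```cpp
#include <algorithm>

// Map a head rotation angle (degrees) within [-headRange, headRange]
// to a hand rotation angle within [-handRange, handRange].
// Angles outside the head range are clamped to its limits.
double headToHandAngle(double headAngle,
                       double headRange = 8.0,
                       double handRange = 90.0) {
    double clamped = std::min(std::max(headAngle, -headRange), headRange);
    return handRange * clamped / headRange;
}
```

With the ranges from the article, a 4-degree head turn maps to a 45-degree hand rotation, and anything past 8 degrees saturates at 90.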

Now let's start implementation. We'll look at the hardware first and then the software.

The design has undergone some changes. First, I got rid of the L293D driver and manual voltage control of the servomotors. Last time I mistakenly thought that the Arduino was broken and was generating incorrect PWM signals, which prevented it from controlling the servomotor; the problem, as it turned out, was insufficient current from the power supply. Now all three servos are connected directly to the Arduino, and they have a separate power line through a DC-DC boost converter with a maximum current of 4 amps.

The central servomotor responsible for rotating the forearm, and the forearm itself, have been modified. First, since there are now two servos attached to the far end of the bar, the bar had to be balanced by attaching a weight to the near end so that the forearm servo does not do extra work against gravity. However, this increased the total moment of inertia, which leads to oscillations of the motor around the target angle: the shaft reaches the target angle but overshoots it due to inertia, then tries to correct its position and overshoots again in the opposite direction, and so on. This problem was solved by adding an improvised friction clutch to the servomotor: two plates, one attached to the motor body and the other to the shaft, with a sponge-like material between them that provides friction and damps the small oscillations.

The remaining two servos are mounted at the far end of the rod in series, forming a joint with two degrees of freedom. It is worth noting right away that a real wrist has all three rotational degrees of freedom (in fact, all six, since the joints have very complex geometry and, along with rotations, translations occur along all three axes, which causes a lot of problems for exoskeleton developers, but that is a topic for another conversation). With two degrees of freedom we can only set the longitudinal axis of the hand, but we cannot set the rotation around this axis (try extending your index finger and pointing it in a certain direction, and then, maintaining this direction, rotating your hand around the finger's axis).

Two MPU6050 sensors are still used: one is attached to a strap worn on the head, and the second to the shoulder. Let me remind you of a peculiarity of this sensor: the angle of rotation around the Z axis is measured inertially, that is, by integrating angular velocities over time, which leads to drift in the absolute value of the angle due to measurement errors. For this reason, we cannot simply use the absolute value of the angle; below we will see how this problem was solved.

The last design detail I want to mention is how exactly the device switches between forearm and hand control modes. Everything is trivial here: we use a button, and pressing it switches us into hand control mode. In the future, I plan to replace the button with a myosensor (EMG sensor), which is already on its way to me, and use biceps contractions to control the switching.

Let's now look at the code. Here I want to dwell on two things that I consider non-trivial to understand; everything else can be viewed on the project's GitHub page: Wosk1947/Eye_Guide_Bionic_Hand.

First, let's look at how we get the angles of rotation of the hand vX and vZ from the angles of head rotation.

const double headPalmThreshold = 8; // working range of head rotation, degrees
const double palmThreshold = 90;    // corresponding range of hand rotation, degrees

// Integrate the relative change of the Z angle, zeroing small
// increments (dead band) to suppress gyro drift
dZ = mpu.getAngleZ() - prevZ;
if (abs(dZ) < 0.08) {dZ = 0;}
prevZ = mpu.getAngleZ();
currentZ += dZ;
// The X angle is measured relative to the initial orientation originX
currentX = mpu.getAngleX() - originX;
// Keep both angles within the working range of the head
if (currentX < -headPalmThreshold) {
    currentX = -headPalmThreshold;
}
if (currentX > headPalmThreshold) {
    currentX = headPalmThreshold;
}
if (currentZ < -headPalmThreshold || currentZ > headPalmThreshold) {
    currentZ -= dZ; // undo the last increment instead of clamping
}
// Scale head angles to hand angles: [-8, 8] -> [-90, 90] degrees
vX = palmThreshold * currentX / headPalmThreshold;
vZ = palmThreshold * currentZ / headPalmThreshold;

To eliminate drift around the Z axis, we filter out small changes in dZ, setting them to zero. This simple dead-band solution proved better than a low-pass filter, since during drift the unwanted small increments of the angle are continuous. Next, we limit the head rotation angles to 8 degrees in each direction and scale the result to the range from -90 to 90 degrees. This method does not let us use the absolute value of the angle; instead we use relative changes, so the center (vZ = 0) shifts over time (but only while the head is moving). This is easily corrected by turning the head back toward the center slightly beyond the 16-degree working range. The solution, again, is not ideal, but in practice it hardly interferes with using the device.

Let's also look at how we get the α and β angles for the wrist servos:

void palmMotorAngles(BLA::Matrix<4,4> headRotation, BLA::Matrix<4,4> armRotation, double vX, double vZ, double* angles) {
  // Hand rotation in the global coordinate system: fixed head
  // orientation combined with the vX/vZ rotations
  rVH = headRotation * eulerAnglesToMatrix(vX, 0, vZ, EEulerOrder::ORDER_ZYX);
  // Transform into the wrist (end-of-forearm) coordinate system and
  // extract the hand direction vector
  getTranslation(Inverse(armRotation) * rVH * palmForward, vPalm);
  // Restrict the vector to the upper hemisphere (y >= 0): the servos
  // only cover 180 degrees. Outside it, project onto the base plane
  // and renormalize
  if (vPalm[1] < 0) {
    vPalm[1] = 0;
    double lenMultiplier = 1 / sqrt(vPalm[0] * vPalm[0] + vPalm[2] * vPalm[2]);
    vPalm[0] = vPalm[0] * lenMultiplier;
    vPalm[2] = vPalm[2] * lenMultiplier;
  }
  beta = asin(vPalm[2]);
  // At gimbal lock (cos(beta) == 0) keep the previous alpha
  if (cos(beta) != 0) {
    double s = vPalm[0] / cos(beta);
    // Clamp to [-1, 1]: rounding errors can push |s| slightly past 1
    if (s > 1) {
      s = 1;
    }
    if (s < -1) {
      s = -1;
    }
    alpha = -asin(s);
  }
  angles[0] = beta * 180 / PI;
  angles[1] = alpha * 180 / PI;
}

As arguments we take the head rotation matrix headRotation and the forearm rotation matrix armRotation, both recorded at the moment of switching from forearm positioning mode to hand positioning mode, the angles vX and vZ, and the angles array where the result will be written.

The rVH matrix is the rotation matrix of the hand in the global coordinate system. To obtain the hand rotation matrix in the wrist coordinate system, we multiply rVH on the left by the inverse of armRotation. After that, we multiply the resulting matrix by the unit vector palmForward, extract the translation components, and obtain the hand vector vPalm in the wrist coordinate system.

Next we must restrict this vector to non-negative y values. This is because the SG90 servomotors have a working range of 180 degrees, so the hand can only move within a hemisphere directed along the forearm axis. If the vector leaves this hemisphere, we construct a new vector as the projection of the original onto the base plane of the hemisphere. Finally, from the vector we find the angles α and β of its rotation in the wrist coordinate system. Separately, it's worth clarifying how we handle gimbal lock, which arises when the angle β equals +90 or -90 degrees. The solution, again, is trivial: in this case we simply do not recalculate the angle α and use its previous value instead. Even without gimbal lock, we may run into another problem: the sine obtained by dividing two very small numbers can, due to limited floating-point precision, exceed 1 in absolute value. In this case we simply clamp the value to 1.
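One way to sanity-check this inverse computation is a forward-kinematics round trip: rebuild the direction vector from α and β, then recover the angles again. A minimal sketch under the same conventions as palmMotorAngles (z = sin β, x = -sin α · cos β, with y ≥ 0 carrying the remaining length; the helper names are mine, not from the project code):

```cpp
#include <cmath>

// Forward kinematics for the two wrist servos: rebuild the unit hand
// vector from alpha and beta (radians), matching the conventions used
// in palmMotorAngles.
void anglesToVector(double alpha, double beta, double v[3]) {
    v[0] = -sin(alpha) * cos(beta);
    v[1] =  cos(alpha) * cos(beta);
    v[2] =  sin(beta);
}

// Inverse, mirroring the asin logic from palmMotorAngles.
void vectorToAngles(const double v[3], double& alpha, double& beta) {
    beta = asin(v[2]);
    double c = cos(beta);
    if (c != 0) {
        double s = v[0] / c;
        if (s > 1)  s = 1;   // guard against rounding pushing |s| past 1
        if (s < -1) s = -1;
        alpha = -asin(s);
    }
}
```

For any α in [-90, 90] and β strictly inside (-90, 90) degrees the round trip recovers the original angles exactly (up to floating-point error), which is a quick way to catch sign or axis-ordering mistakes.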

For the tests, a visualization was written in Python:

Here the gray frame is the plane of the head tilt. Red, blue and green denote the wrist coordinate system, blue denotes the original hand vector, and orange denotes the resulting one.

And finally, let's look at the whole structure in action:

In the video I demonstrate several fine motor tasks. In the first, the hand reaches for closely spaced small objects. In the second, it turns an object over on the table. In the third, it types on a keyboard.

Conclusion

This iteration took me much longer because I ran into many difficulties: power supply issues, balancing the central servomotor. Shortly after the tests, the central servomotor failed – the gears of its gearbox were stripped. Also, while writing the code, I began running into the memory limitations of the Arduino itself. For subsequent iterations, it will be necessary to move to a more serious hardware platform, with more powerful servos, more accurate sensors, and a different controller (at the moment I am leaning toward a Raspberry Pi). There are several directions in which the project can be developed. First, as mentioned above, I would like to finally add the myosensor. Second, in the next iteration I would like to create a mechanism for grasping objects. In general, there are a lot of plans, and I can't wait to get to work. With that I say goodbye – thank you all for your attention!
