TX report
Exoskeleton arm: designing the link between the user and the arm
Université de Technologie de Troyes
TX laboratory project
Autumn semester 2017
19-01-18
Gijs Verhoeven
Abstract
Three separate parts for an exoskeleton arm have been developed: an Arduino program with an accompanying small-scale electronics prototype, a glove with force sensors, and a connector that attaches the user's arm to the exoskeleton arm. This last part was modeled in Creo4 and then 3D printed.
Two prototypes have been made for the Arduino program and the electronics. The first one could measure the values of two force sensitive resistors and use those values to steer two simple motors accordingly. It was possible to distinguish between four different movements. A second prototype was made for more stable and secure control of the motors. This version added a current sensor to measure the actual torque of the motors, and a control loop in the Arduino code that compares the actual torque to a calculated target torque and sends the resulting value back to the motors.
The target group of this exoskeleton arm is nurses. They have to help people up and lay them down all day, and also regularly carry heavy objects. After several years, this can cause severe back pain. To prevent this, the arm is being developed to help them do their work. Other people could of course use the arm for their own purposes as well, but nurses are the main target group.
Index

Introduction
The goal
Existing work
The first prototype
The advanced prototype
The glove
The connector
Future improvements
Conclusion
References
Appendix
Introduction

The goal of this project is to design the link between an exoskeleton arm and its user, a nurse.
An exoskeleton arm is being developed to help these nurses with heavy work. The arm will support the movements of the nurse's arm.
This report elaborates on the user experience of this arm, in other words the interplay between the user, the sensors and the actuators. It does so by describing the development of three parts of the robot arm: the implementation of the sensors on the user, the attachment of the robot arm to the human arm, and the electronics and code that steer the actuators according to the sensor input.
First the problem is specified further, then existing work is reviewed and initial solutions to the issues are proposed. These solutions are then realised in the three different parts. Finally, suggestions for further improvements are made and conclusions are drawn.
Enjoy reading this report!
The goal

Nurses have to help people get up and lie down many times per day. They also need to carry heavy objects regularly. After doing this for many years, nurses can get serious back
problems. To solve this, the development of an exoskeleton arm was started. Besides the creation of the mechanical arm itself, more aspects need to be considered: how the arm is connected to the user in a comfortable way, how the arm knows which movement the user makes and whether it should assist that movement, and with what force the motors need to help. Moreover, which movements should the arm be able to make, since a human (unconsciously) puts her arm and wrist in many positions during a day? These were all issues about the interface between human and machine that still needed a solution when this TX project started.
During the first weeks and meetings, initial solutions to these issues were generated and the problem was narrowed down. It was agreed to focus first on four basic movements in the 2D plane: push, pull, lift and lower. These four are used often and, once successfully implemented, form a solid basis for expanding to more complex movements later. Soon after, a first prototype was made that implemented the initial solutions.
Existing work

Before narrowing down to four movements, it was difficult to figure out how to measure all the different movements, so a look was taken at existing work in the field of movement sensing. It was found that many gloves have been created by various organisations and businesses. The goal of these gloves is to support hand movements like grabbing and releasing objects, and they all use sensors in some way. Images 1 and 2 show the 'Bioservo SEM glove' [1], a glove for ordinary people that strengthens the grip. In this case, sensors on the fingertips were integrated in a glove. This seemed an elegant solution for force sensing on the hand and fingers.
Images 3 and 4 show the 'Exo Glove Poly' [2], made to help people with a disability in their hands. It measures the intention of the user and reinforces the movement; a button is used to start measuring the intention.
The solutions of using a glove with sensors and a button to start measuring are also used in this project, as they seemed elegant and reliable.
A short look was also taken at existing exoskeleton arms, but they all had a very different base design than this project, so no inspiration was derived from them.
Image 1
Image 2
Image 3 Image 4
The first prototype

Specifications

The following list of initial specifications was agreed upon after discussing the problem:
- Use Arduino as microcontroller
- Move in 2 directions: up/down & front/back (so in every x and y direction in a 2D plane
perpendicular to the user’s body)
- Know when to aid and when not to aid
- Sense when one of the above moves is made
- Respond appropriately with a force relative to the force of the user
- Use 2 motors (on elbow and on shoulder) as actuators
- The user’s hand should be free to do any movement as normal without the exoskeleton
First solution

To build a first prototype, technical solutions had to be found for some of the above specifications.
Know when to aid and when not to aid
The arm should not always help; some actions can perfectly be done without it. The user should be able to "turn on" the arm when its aid is desired. To realise this, two solutions were evaluated. The first is a speech recognizer [3], as can be seen in image 5. This device can be controlled by saying commands like "start", "up" and "left". It was bought and tested, but it appeared not to always trigger after a command was said, depending on accent and pronunciation. It was decided that this would be quite an unreliable solution, leading to annoyance for the user.
The second option is a simple button [4], as in image 6. The user can push this button, located on the arm itself, to turn the exoskeleton arm on and off. When turned on, the motors do not immediately start to work; instead, the sensor data is evaluated by the Arduino, which drives the motors when needed. This option was chosen for the first prototype because of its reliability, ease of use and ease of implementation.
Image 5
Image 6
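The on/off behaviour of such a button can be sketched as a small toggle latch. This is an illustrative sketch in plain C++, not the project's actual Arduino code, and it assumes the raw button samples are already debounced:

```cpp
// Illustrative toggle latch for the on/off pushbutton (not the project's
// actual code). Each rising edge (released -> pressed) flips the "armed"
// state; holding the button down does not flip it again.
struct ButtonToggle {
    bool armed = false;       // is the exoskeleton currently turned on?
    bool lastPressed = false; // previous (debounced) button sample

    // Feed one button sample; returns the resulting armed state.
    bool update(bool pressed) {
        if (pressed && !lastPressed) { // react to the rising edge only
            armed = !armed;
        }
        lastPressed = pressed;
        return armed;
    }
};
```

Pressing once turns the arm on; pressing again turns it off, matching the behaviour described above.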
Sense when one of the above movements is made
The nurse can make many different moves in the 2D plane. Using sensors, the different moves should be distinguished and the motors driven accordingly. This prototype focuses first on the push and pull movements; many other movements can be added later. It was found that, to pull, forces are exerted on the fingertips, and to push, forces are exerted on the palm of the hand. To measure these forces, two different force sensitive resistors [5] are used: a big version for the hand palm and a small version for one fingertip (images 7 and 8). By trial and error, it was found that the middle of the hand palm is best suited to accurately measure exerted forces (image 9). Load cells were also evaluated as a solution but, since they are bigger, they were found to be worse than the force sensitive resistors from a user experience point of view. To let the device sense more movements later on, more sensors and buttons can be added, or another solution may be found.
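The conversion from a raw ADC value to an estimated force works through the voltage divider that the FSR forms with a fixed resistor. The following plain C++ sketch mirrors the conversion in the appendix code; the constants VCC = 4.98 V and R_DIV = 3333 Ohm are the measured values used there, and the two conductance slopes come from the SparkFun FSR example it is based on:

```cpp
#include <cmath>

const float VCC = 4.98f;     // measured voltage of the Arduino 5 V line
const float R_DIV = 3333.0f; // measured value of the 3.3 kOhm resistor

// adc: raw 10-bit reading (0..1023). Returns the estimated force in grams,
// or 0 when the reading is too small to be meaningful.
float fsrForceGrams(int adc) {
    if (adc < 10) return 0.0f;
    float v = adc * VCC / 1023.0f;      // voltage at the divider output
    float r = R_DIV * (VCC / v - 1.0f); // resistance of the FSR
    float g = 1.0f / r;                 // conductance
    // The datasheet's parabolic curve is broken into two linear slopes:
    if (r <= 600.0f)
        return (g - 0.00075f) / 0.00000032639f;
    return g / 0.000000642857f;
}
```

A harder press lowers the FSR's resistance, raises the divider voltage and therefore yields a larger force estimate.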
Respond appropriately with a force relative to the force of the user
The arm should not always work at maximum power. Therefore it was decided to make the force of the arm proportional to the force sensed by the sensors. This way, the arm helps a little when a small force is sensed, and more when a big force is sensed. This is done by controlling the motors with PWM (pulse width modulation). Two simple motors [6] are used for this first prototype (image 10). The only problem with this solution is that, once the arm exerts a force, the sensor senses more force, so the arm exerts even more force, and so on. This potential positive feedback loop has to be stopped; it is discussed in further detail later on.
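In the prototype code, this proportional response is realised with Arduino's map() function, for example map(fsrbADC, 0, 1023, 0, 30), which linearly rescales the sensed value to a PWM duty cycle. Its integer arithmetic can be sketched in plain C++ as:

```cpp
// Sketch of the linear rescaling Arduino's map() performs: a value x in
// [inMin, inMax] is mapped proportionally into [outMin, outMax] using
// integer arithmetic (the fractional part is truncated).
long remap(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}
```

With an output range of 0 to 30, even a fully pressed sensor only drives the motor at a modest duty cycle, which keeps the first prototype gentle.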
The user’s hand should be free to do any movement as normal without
the exoskeleton
To achieve this, it was decided to implement the sensors into a thin
glove. This way the hand should be free to move without being hindered
by the exoskeleton arm.
Image 7 Image 8
Image 9
Image 10
Realisation

List of components:
- Arduino Uno
- Laptop
- Arduino to laptop cable
- 2 simple motors
- 2 Infineon motor control shields with BTN8982TA for Arduino
- 2 wall socket adapters
- Breadboard
- Wires
- 6 resistors of 10 kOhm
- Small force sensitive resistor, 1.30 cm sensing area
- Big force sensitive resistor, 2.30 cm sensing area
- Allegro ACS712-05A current sensor for Arduino (added later, in the advanced version)
Schematic
Image 11
Building the prototype
Image 11 shows a schematic of the total (first) setup: two sensors sending values to an Arduino with breadboard, which steers two motors accordingly.
First, the two motors were connected separately, to check that they were not broken. The setup for one motor can be seen in image 12.
Image 12
Then, the two sensors were connected to the same circuit as the two motors. An Arduino program was written in which the sensor values are read, mapped and used as input for the motors. This way, the motors' rotational speed can be controlled by pressing the sensors: pressing harder makes the motors spin faster. In image 13, a motor is spinning while the sensor is being pushed.
Different actions were defined: Push, pull, lift and lower.
Push: the arm moves from folded up to stretched out.
Pull: the arm moves from stretched out to folded up.
Lift: the arm is stretched out and goes up.
Lower: the arm is folded up and goes down.
For each of these actions it was defined which sensor measures the force, which motors need to turn on and in which direction they need to spin. Note that in this prototype, physical buttons have been replaced by coded buttons because of a lack of breadboard space.
Table 1 gives an overview for each movement, with motor 1 being the one on the shoulder and motor 2 the one on the elbow. 'Coded button' is the key that needs to be pressed (on the keyboard) to activate the movement, 'Data from sensor' is the sensor from which the force is read, 'Motor activated' is the motor(s) controlled with that data, and 'Spinning direction' is the direction in which each motor spins.
Image 13
Movement | Coded button | Data from which sensor | Motor activated | Spinning direction
Push     | a            | big                    | 1 & 2           | 1: clockwise, 2: counterclockwise
Pull     | c            | small                  | 1 & 2           | 1: counterclockwise, 2: clockwise
Lift     | e            | big and/or small       | 1               | 1: clockwise
Lower    | g            | big and/or small       | 1               | 1: counterclockwise
Table 1
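Purely as an illustration (the actual code in Appendix 1 uses boolean flags set from serial input instead), Table 1 could be expressed as a small lookup function:

```cpp
#include <string>

// Illustrative encoding of Table 1 (hypothetical helper, not part of the
// project's code): each coded button selects a movement, which determines
// the sensor that supplies the force data and the motors that are driven.
struct Movement {
    const char* name;
    const char* sensor;  // which FSR supplies the force data
    bool motor1;         // shoulder motor
    bool motor2;         // elbow motor
};

Movement movementFor(char codedButton) {
    switch (codedButton) {
        case 'a': return {"push",  "big",           true,  true};
        case 'c': return {"pull",  "small",         true,  true};
        case 'e': return {"lift",  "big or small",  true,  false};
        case 'g': return {"lower", "big or small",  true,  false};
        default:  return {"none",  "-",             false, false};
    }
}
```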
The advanced prototype
This first prototype was able to recognise four different movements via buttons, measure the force, and use this data to steer the motors accordingly. A good beginning, but with some flaws, for example the positive feedback loop noticed before: if the sensor senses force, the arm presses, so the sensor senses more force and the arm presses harder, and so on. To remove this, and to get a more stable and reliable output overall, a feedback loop was designed and integrated, consisting of an additional electronics part and a code part. It works as follows: the rotational force (torque) of the motors should be known at all times, so it can be compared to the torque the motors should have to help the user with a certain force. See image 14: the sensors measure the force of the user. From that, a target force for the exoskeleton is set. Then, with geometric functions, the target torque is calculated for both motors (shoulder and elbow). The current is measured to know the actual torque, which is compared to the target within a control loop. The difference times 20 is sent as the new input for the motor, and this process is repeated continuously. The value of 20 was chosen arbitrarily and can be tweaked. The user can counteract the arm to reduce its torque. Note that only one sensor is drawn, but this could also be two sensors, depending on which movement is being made: as can be seen in Table 1, sometimes one sensor is used for both motors, and sometimes both sensors are used.
This entire process was implemented in Arduino code, which can be found in Appendix 1. Separate functions were developed to read the sensors, set the target force, calculate the target torque, read the actual torque, compare the real and target values in a control loop, and steer the motors with the calculated value.
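A single pass of this control loop can be sketched in plain C++. The gain of 20 is the arbitrarily chosen value mentioned above, and the clamping to the 0..255 PWM range mirrors the limits applied in the appendix code:

```cpp
// One pass of the proportional control loop described above: the measured
// torque is compared to the target, the difference is multiplied by the
// (arbitrary, tweakable) gain of 20, and the result is clamped to the
// 8-bit PWM range before being sent to the motor.
int controlStep(float targetTorque, float actualTorque, float gain = 20.0f) {
    float out = (targetTorque - actualTorque) * gain;
    if (out > 255.0f) out = 255.0f; // do not exceed the PWM limits
    if (out < 0.0f)   out = 0.0f;
    return static_cast<int>(out);
}
```

When the user counteracts the arm, the measured torque rises above the target, the error goes negative and the motor command drops toward zero.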
Image 15 shows what this advanced prototype looks like. Note that the only difference from the first prototype is the addition of the current sensor, the small device connected with yellow wires.
Image 15
Image 14
The geometry to calculate the target torque works as follows. By definition [9], the torque τ is equal to the force times the arm length times the sine of the angle θ between them: τ = R · F · sin(θ). Following the Z-rule of geometry, θ = θ1 + θ2, as can be seen in image 16. The perpendicular arm length to the elbow is LE = R2 · sin(θ1 + θ2), and to the shoulder it is LS = R1 · sin(θ1) + R2 · sin(θ1 + θ2). The elbow torque is then τE = LE · F, and the shoulder torque is τS = LS · F.
These functions were integrated into the code.
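As a sketch, these formulas translate into the following plain C++ functions, with the arm lengths taken from the appendix code (upper arm R1 = 0.35 m, lower arm R2 = 0.39 m); in a real implementation the angles would come from sensors:

```cpp
#include <cmath>

const double R1 = 0.35; // upper arm: shoulder to elbow, in meters
const double R2 = 0.39; // lower arm: elbow to hand, in meters

// theta1: shoulder angle, theta2: elbow angle (radians); force in newtons.
// Torque = perpendicular arm length * force, in newton-metres.
double elbowTorque(double theta1, double theta2, double force) {
    double LE = R2 * std::sin(theta1 + theta2); // perpendicular arm to elbow
    return LE * force;
}

double shoulderTorque(double theta1, double theta2, double force) {
    double LS = R1 * std::sin(theta1) + R2 * std::sin(theta1 + theta2);
    return LS * force;
}
```

For equal angles the shoulder torque is always at least the elbow torque, since the shoulder's perpendicular arm contains the elbow's plus an extra positive term.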
As can be seen in the graph in image 17, torque is linearly proportional to the current [7]. This relation was ultimately used to measure the actual torque: a current sensor, the Allegro ACS712-05A for Arduino [8] (image 18), was used. The torque of the motor is proportional to the amount of current flowing through it, which is why current measurement was found to be a reliable way of measuring the torque indirectly.
Image 16
Image 17
Image 18
This measurement worked but at first gave very inaccurate results. This was a big problem for some time, until it was found that the PWM of the motor was the cause: because of the PWM, the sensor would sometimes measure during a high PWM state and sometimes during a low PWM state, which gave strongly fluctuating results. This problem was solved in two ways. Firstly, the PWM rate was increased by a factor of about 50 (with a single line of code); compare images 19 and 20. The up-down ratio stayed the same but the periods became much smaller, giving a higher chance of measuring during the up-time. Secondly, a bit of code was added that calculates the running average of the measurements. This average is used in the further functions and calculations, to remove peak values and other inaccurate measurements and to get a steady output.
To make the measurements of the current sensor more reliable, the 'flexitimer' library and the attachInterrupt function in Arduino were investigated first. Both make it possible to execute code only when a certain event occurs; in this case, they could be used to measure only when the PWM signal is high. However, neither was successfully put into practice, so it was decided to use a running average function instead.
Image 19
Image 20
The glove

For this prototype it was decided that a glove would be the best way to attach the sensors to the user unobtrusively. A glove allows the user to keep doing everything with her hand and fingers as usual, while still carrying the sensors necessary for the exoskeleton arm. These are the two sensors discussed before: one small and one big force sensitive resistor. Via trial-and-error testing it was found that putting the small sensor on the middle finger and the big sensor on the palm of the hand captures all the data necessary to distinguish and measure the four movements of this prototype: push, pull, lift and lower.
The glove can be seen in images 21 and 22:
Image 21 Image 22
The connector

This is the part that connects the arm of the user to the exoskeleton arm. Image 23 shows the situation before this part existed [7]: the user is there, the exoskeleton arm is there, but the connection had yet to be designed. Images 24 and 25 show the final design of this connector. It consists of two parts, an inside and an outside part. The tube of the robot arm goes through the small hole of the outside part, and the wrist goes through the inside part. When the two parts are locked into each other, the user is connected to the arm. These two parts were 3D printed after being drawn in the software Creo4.
Note how the inner part can rotate a bit inside the outer part thanks to the ball-shaped design. This was done so as not to fully lock the user to the robot arm: even when connected, the user still has to be able to make small movements with her own arm. Moreover, this design allows the user to rotate her wrist independently of the robot arm, which is essential for movements like pushing and pulling, as they require the wrist to be in different positions.
Also note how the inner part is well rounded off on the inside. This is done to make the wrist fit as comfortably as possible: using the robot arm multiple times per day should be no problem and should definitely not cause any pain for the user. Images 26 to 31 show close-ups of the models [7] of the different parts.
Image 23
Image 24
Image 25
Image 26 Image 27
Image 28 Image 29
Image 30
Image 31
Future improvements

Of course there are still things to improve about this project; some ideas for future improvements are given here. For example, the connector could be used for much more than just connecting the arm to the robot. The buttons for selecting the different movements could be placed on this part, so that they are easily reachable for the user and cannot easily be pressed accidentally. Going even further, sensors could be added to measure heart rate and/or muscle tension. This information could be used to sense which movement is being made, replacing the buttons with a much more elegant solution. Whether this is actually feasible still needs to be tested, however: maybe it is not possible, or too hard, to clearly distinguish movements this way, or much calibration would be needed before each use and for each different user.
Another improvement that is really needed for actual deployment is reducing the weight of the robot arm as a whole. The mechanical design process of the arm is not described in this report, but at the moment it consists of metal and heavy motors. This is a lot of extra weight for a nurse to carry, and it would make the arm almost useless, since the force saved thanks to the arm is cancelled out by the weight of the arm itself.
Regarding the code, the flexitimer library or the attachInterrupt method could be implemented to make the results of the current sensor even more stable. This is important because the measured torque is directly used to calculate the proportional value that is sent to the motors. So if this torque constantly changes slightly, the motors will be a bit shaky as well, which is bad for doing precise work with the arm.
The control loop could also be expanded with a differential and an integral component. This would form a PID controller that, when calibrated well, reaches a stable value very quickly and stays there. This would send more stable values to the motors as well: sudden peak values would not make it to the motors.
The current code does not take into account the position of the motors and the arm. This information could be used to prevent the arm from turning into strange positions that could harm the user, by setting maximum and minimum values for the position. The arm would then, for example, know when it is fully stretched out, turned backwards or maximally folded up.
At the moment, the Arduino code has only been tested with the electronic setup described earlier, not with the actual arm. This code of course needs to be used with the actual arm to calibrate values and make final adjustments; after all, the simple motors will behave differently from the big motors of the arm itself. Adjustments may then also be needed on the connector, for example to make it move more smoothly. It would be painful if the robot arm suddenly moved while the connector slightly resisted the movement because of internal friction. It must be ensured that the connector supports all movements made by the robot arm.
Conclusion

It can be concluded that much progress was made in the development of an exoskeleton arm. The link between the user and the robot arm was created via a glove carrying the sensors, a connector between the user and the robot arm, and Arduino code that handles the sensor data and translates it into useful values for the motors.
Most of the time was spent developing the Arduino code and the accompanying electronics. As described, specifications for the robot arm first needed to be agreed upon. These then had to be translated into a solution in Arduino code and electronics, which led to the first prototype. Several flaws were then found and solutions had to be discussed. These solutions were implemented and led to the advanced prototype, which uses a control loop for more stable and reliable control of the robot arm.
First versions of the glove and the connector were made as solutions for linking the robot arm to the user. However, these have not yet been tested on users, so possible flaws have yet to be discovered.
It can also be concluded that the start-up of the project, especially in the first weeks, was a bit slow. With only one meeting per week, time passed quickly before the project was made concrete and specifications were set. Only once the building and developing process started did the project advance at a higher pace, because each time concrete progress could be presented and feedback given, which formed clear next steps for the following meeting. In the beginning it was also not yet clear what the exact end goal of the project would be; this only became known later. Working with a clear end goal in mind is much more convenient than not knowing where the project will go and what is expected of you.
The supervisor, on the one hand, gave me much space to come up with solutions myself and work them out, and on the other hand supported me when needed: by supplying electronic components like the simple motors, their shields and the current sensor, and by helping me with, for example, creating the 3D model of the connector in Creo4, something that would have cost me much time but which he did in a few minutes.
By doing this project I learned a lot about doing solo projects. Previously I had always done projects in groups. In a solo project, everything has to be done by you, including the aspects you are less good at, which takes much time. A solo project carries more responsibility than a team project, because there is no task division with other team members: if something does not work, it is your responsibility to solve the problem. Of course there is help from one or more supervisors, which is very useful, but that requires good communication. During all stages of the project, all parties should be up to date and should have the exact same project in mind while talking about it. I therefore learned that getting concrete, and starting to build, as early as possible is important in a project. This way you can simply show the solution you are thinking about, the other person sees the same thing in front of him, and he can give direct feedback. Sharing ideas without prototypes, drawings or mock-ups can be very confusing and tiring when you constantly fail to really understand each other. At the beginning of the project I maybe should have started building earlier, although this is hard to say because we also advanced slowly simply because we only had one meeting per week.
The importance of clear communication also became clear to me later in the project, when we were discussing the control loop system for the code. Only by drawing a schematic did our ideas on this complex system become clear.
This solo project will be good preparation for my final Bachelor assignment, which I will do next semester. I am now already familiar with working alone for a supervisor on a bigger project, going through the design process and solving the problems that occur.
References

[1] Healthlink Holdings Ltd, "Bioservo SEM glove", https://www.youtube.com/watch?v=2Z6wHzuA6BQ, 18-03-2016, last visited 18-01-18
[2] SNU BioRobotics Lab, "Exo-Glove Poly (extended) (Seoul National University)", https://www.youtube.com/watch?v=oNEFXcWRIG8, 15-02-16, last visited 18-01-18
[3] Grove Speech Recognizer, http://wiki.seeed.cc/Grove-Speech_Recognizer/, last visited 20-10-17
[4] SparkFun mini pushbutton switch, https://www.sparkfun.com/products/97, last visited 01-11-17
[5] SparkFun force sensitive resistor hookup guide, https://learn.sparkfun.com/tutorials/force-sensitive-resistor-hookup-guide?_ga=2.196452648.528605695.1508501959-225737108.1507732004, last visited 07-11-17
[6] SparkFun hobby motor (gear), https://www.sparkfun.com/products/11696, last visited 07-11-17
[7] Pierre-Antoine Adragna, Université de Technologie de Troyes (2017)
[8] DC current measurement using ACS712-05A, https://circuits4you.com/2016/05/13/arduino-asc712-current/, 13-05-16, last visited 01-12-17
[9] patrickJMT, "Torque – An application of the cross product", https://www.youtube.com/watch?v=2LqtwgrDRQ8, 19-10-08, last visited 26-11-17
Appendix 1 – Code for the advanced prototype

/****************************************************************
Force_Sensitive_Resistor_Example.ino
Example sketch for SparkFun's force sensitive resistors
(https://www.sparkfun.com/products/9375)
Jim Lindblom @ SparkFun Electronics
April 28, 2016

Create a voltage divider circuit combining an FSR with a 3.3k resistor.
- The resistor should connect from A0 to GND.
- The FSR should connect from A0 to 3.3V.
As the resistance of the FSR decreases (meaning an increase in pressure),
the voltage at A0 should increase.
Development environment specifics: Arduino 1.6.7
****************************************************************/
// Code to control the FSRs taken from the above link. Code adjusted by
// Gijs Verhoeven, using Arduino 1.8.5, January 2018,
// Université de Technologie de Troyes, TX laboratory project,
// designing the link between an exoskeleton arm and the user.
// Code adjusted to drive motors according to calculated values derived
// from the FSRs' values, on button push.

int fsrsADC = 0;            // measured value of the small FSR
int fsrbADC = 0;            // big FSR
int FSRB_PIN = A5;          // pin connected to the FSR/resistor divider
const int numReadings = 10; // change this value for an average over more values
float readings[numReadings]; // the readings from the analog input
int readIndex = 0;           // the index of the current reading
float total = 0;             // the running total
float average = 0;           // the average
int inputPin = A2;           // input pin for the current sensor

void setup()
{
  pinMode(FSRB_PIN, INPUT);
  Serial.begin(9600);
  TCCR2B = (TCCR2B & 0b11111000) | 0x01; // increase the PWM rate, to increase the accuracy of the current sensor
  for (int thisReading = 0; thisReading < numReadings; thisReading++) { // initiate the running average array
    readings[thisReading] = 0;
  }
}

void loop() {
  delay(500);
  sensorRead();
  targetForce(sensorRead());
  targetTorqueElbow(targetForce(sensorRead()));
  //targetTorqueShoulder(targetForce(sensorRead()));
  actualTorque();
  steerMotor();
  Serial.print("sensorRead: ");
  Serial.println(sensorRead());
  Serial.print("targetForce: ");
  Serial.println(targetForce(sensorRead()));
  Serial.print("targetTorque: ");
  Serial.println(targetTorqueElbow(targetForce(sensorRead())));
  Serial.print("actualTorque: ");
  Serial.println(actualTorque());
  Serial.print("propValue: ");
  Serial.println(controlLoop());
  Serial.println(" ");
}
// If the FSR has no pressure, the resistance will be
// near infinite, so the voltage should be near 0.
const float VCC = 4.98;     // measured voltage of the Arduino 5V line
const float R_DIV = 3333.0; // measured resistance of the 3.3k resistor

float sensorRead() {
  fsrbADC = analogRead(FSRB_PIN);
  if (fsrbADC >= 10) // if the analog reading is non-zero
  {
    // Use the ADC reading to calculate the voltage:
    float fsrbV = fsrbADC * VCC / 1023.0;
    // Use the voltage and static resistor value to calculate the FSR resistance:
    float fsrbR = R_DIV * (VCC / fsrbV - 1.0);
    // Guesstimate the force based on the slopes in figure 3 of the FSR datasheet:
    float forceb;
    float fsrbG = 1.0 / fsrbR; // calculate conductance
    // Break the parabolic curve down into two linear slopes:
    if (fsrbR <= 600) {
      forceb = (fsrbG - 0.00075) / 0.00000032639;
    } else {
      forceb = fsrbG / 0.000000642857;
    }
    return forceb; // e.g. 1400 gram
  }
  if (fsrsADC >= 10) // same for the small FSR
  {
    float fsrsV = fsrsADC * VCC / 1023.0;
    float fsrsR = R_DIV * (VCC / fsrsV - 1.0);
    float forces;
    float fsrsG = 1.0 / fsrsR;
    if (fsrsR <= 600) {
      forces = (fsrsG - 0.00075) / 0.00000032639;
    } else {
      forces = fsrsG / 0.000000642857;
    }
    return forces;
  }
  return 0; // no meaningful pressure on either sensor
}
// Function that takes the measured force and returns a target force.
float targetForce(float force) {
  return force / 100; // divided by 100 to go from grams to newtons
}

// Function that takes the calculated target force and returns the
// target torque for the elbow motor.
float targetTorqueElbow(float targetForce) {
  float angleShoulder = 40 * (2 * PI / 360.00); // rads; these angles should be sensor input
  float angleElbow = 30 * (2 * PI / 360.00);    // rads
  // Perpendicular arm length to the elbow, in meters (lower arm: 0.39 m, upper arm: 0.35 m):
  float armLength = 0.39 * sin(angleShoulder + angleElbow);
  return armLength * targetForce; // in Nm: torque = perpendicular arm length * force
}

// Same for the shoulder.
float targetTorqueShoulder(float targetForce) {
  float angleShoulder = 40 * (2 * PI / 360.00); // rads
  float angleElbow = 30 * (2 * PI / 360.00);    // rads
  float armLength = 0.35 * sin(angleShoulder) + 0.39 * sin(angleShoulder + angleElbow); // in meters
  return armLength * targetForce;
}
#define IN_3 5    // motor 2 PWM pin, clockwise rotation
#define IN_4 10   // motor 2 PWM pin, counterclockwise rotation
#define IN_1 3    // motor 1 PWM pin, clockwise rotation
#define IN_2 11   // motor 1 PWM pin, counterclockwise rotation
#define INH_1 12  // pin to turn motor 1 on
#define INH_2 13  // pin to turn motor 2 on

// In all cases, 's' stands for 'small FSR' and 'b' for 'big FSR'.
int FSRS_PIN = A0;
bool push = false; // button states
bool pull = false;
bool lift = false;
bool lower = false;
int Motor_DC1 = 0; // actual duty cycle
int Motor_DC2 = 0;
int incByte = 0;   // read button

void steerMotor() {
  pinMode(FSRS_PIN, INPUT); // measure the sensor values
  pinMode(IN_1, OUTPUT);    // the motors
  pinMode(IN_2, OUTPUT);
  pinMode(IN_3, OUTPUT);
  pinMode(IN_4, OUTPUT);
  pinMode(INH_1, OUTPUT);
  pinMode(INH_2, OUTPUT);

  digitalWrite(INH_1, 1); // activate the motors
  digitalWrite(INH_2, 1);

  if (Serial.available() > 0) { // turn the arm on/off for the different movements;
    // this code replaces the physical buttons for now
    incByte = Serial.read();
    if (incByte == 'a') {
      push = true;
    }
    if (incByte == 'b') {
      push = false;
      analogWrite(IN_1, 0); // stop the motors
      analogWrite(IN_4, 0);
    }
    if (incByte == 'c') {
      pull = true;
    }
    if (incByte == 'd') {
      pull = false;
      analogWrite(IN_2, 0);
      analogWrite(IN_3, 0);
    }
    if (incByte == 'e') {
      lift = true;
    }
    if (incByte == 'f') {
      lift = false;
      analogWrite(IN_1, 0);
    }
    if (incByte == 'g') {
      lower = true;
    }
    if (incByte == 'h') {
      lower = false;
      analogWrite(IN_2, 0);
    }
  }
  if (push && !pull && !lift && !lower) { // push movement
    Motor_DC1 = controlLoop(); // write the control loop value to the motor
    if (Motor_DC1 > 255) { // do not exceed the PWM limits
      Motor_DC1 = 255;
    }
    if (Motor_DC1 < 0) {
      Motor_DC1 = 0;
    }
    // In this version, only the push movement works on the control loop's
    // value; the others are still driven directly by the sensor value.
    analogWrite(IN_1, Motor_DC1);
    /* Motor_DC1 = map(fsrbADC, 0, 1023, 0, 30);
       Motor_DC2 = map(fsrbADC, 0, 1023, 0, 30);
       analogWrite(IN_1, Motor_DC1);
       analogWrite(IN_4, Motor_DC2); */
  }
  if (!push && pull && !lift && !lower) { // pull movement
    fsrsADC = analogRead(FSRS_PIN);
    Motor_DC1 = map(fsrsADC, 0, 1023, 0, 60);
    Motor_DC2 = map(fsrsADC, 0, 1023, 0, 60);
    analogWrite(IN_2, Motor_DC1);
    analogWrite(IN_3, Motor_DC2);
  }
  if (!push && !pull && lift && !lower) { // lift movement
    fsrbADC = analogRead(FSRB_PIN);
    fsrsADC = analogRead(FSRS_PIN);
    if (fsrbADC > 10) {
      Motor_DC1 = map(fsrbADC, 0, 1023, 0, 30);
      analogWrite(IN_1, Motor_DC1);
    }
    if (fsrsADC > 10) {
      Motor_DC1 = map(fsrsADC, 0, 1023, 0, 60);
      analogWrite(IN_1, Motor_DC1);
    }
  }
  if (!push && !pull && !lift && lower) { // lower movement
    fsrbADC = analogRead(FSRB_PIN);
    fsrsADC = analogRead(FSRS_PIN);
    if (fsrbADC > 10) {
      Motor_DC1 = map(fsrbADC, 0, 1023, 0, 30);
      analogWrite(IN_2, Motor_DC1);
    }
    if (fsrsADC > 10) {
      Motor_DC1 = map(fsrsADC, 0, 1023, 0, 60);
      analogWrite(IN_2, Motor_DC1);
    }
  }
}
/* Code to use the current sensor taken from
   https://circuits4you.com/2016/05/13/arduino-asc712-current/.
   Code for the running average taken from
   https://www.arduino.cc/en/tutorial/smoothing.
   Codes combined to return the running average of the torque of
   the motors, measured by the current sensor. */
double mVperAmp = 185;
double RawValue = 0;
double ACSoffset = 2500;
double Voltage = 0;
double Amps = 0;
float torque = 0;
int currentMeasure = A2;

float actualTorque() {
  pinMode(currentMeasure, INPUT);
  // Subtract the last reading:
  total = total - readings[readIndex];
  // Read from the sensor:
  readings[readIndex] = analogRead(currentMeasure);
  // Add the reading to the total:
  total = total + readings[readIndex];
  // Advance to the next position in the array:
  readIndex = readIndex + 1;
  // If we're at the end of the array, wrap around to the beginning:
  if (readIndex >= numReadings) {
    readIndex = 0;
  }
  // Calculate the average and use it in the next calculations for the torque:
  average = total / numReadings;
  Voltage = (average / 1024.0) * 5000;
  Amps = ((Voltage - ACSoffset) / mVperAmp);
  torque = Amps * .5; // just to make the real value a bit bigger/easier to use
  return torque;
}

float controlLoop() {
  float error = targetTorqueElbow(targetForce(sensorRead())) - torque; // amount of deviation from the target
  float propValue = error * 20; // proportional value
  return propValue;
}