EN40 dynamics and vibrations


Transcript of EN40 dynamics and vibrations


EN4 Dynamics and Vibrations Design Project Quadcopter-pong

Synopsis
In this project you will design and test a control system (written in MATLAB) that will fly a quadcopter along a prescribed path. The control system must be programmed to pick up a colored beer-pong ball and deliver it to a prescribed location.

1. Organization

• This project must be done in groups of up to 4 students.
• The project 'deliverables' are: (i) a Matlab code that will control the quadcopter; (ii) a short (2-3 page) report (one report per group) describing your design calculations; (iii) a short oral presentation describing your design procedures and demonstrating the performance of your design; and (iv) an evaluation of the performance of your team-mates on the project.

• You will demonstrate your design to faculty on Saturday May 7th. As part of the testing, you should give a short (10 min) presentation describing your design procedure. Your written report is due at the presentation.

• Warning: We only have 5 sets of equipment to test designs.

2. Overview of design requirements
The goal of this project is to design and implement a control system that will fly a quadcopter along a prescribed path. You will be provided with the following equipment:

1. A Microsoft Kinect 2, which will measure the position of the quadcopter;
2. A Crazyflie2 micro-quadcopter;
3. A Crazyradio PA USB radio that will send control signals to the quadcopter;
4. A battery charger and spare batteries for the quadcopter;
5. Python software that will (i) manage the radio interface to the quadcopter; and (ii) run your MATLAB scripts that fly the quadcopter;
6. A Matlab code that will do the image processing to track the quadcopter;
7. A few template MATLAB codes that you can extend to implement your design;
8. A thrust-stand that you can use to measure the thrust developed by the quadcopter motors as a function of battery voltage.

Before you start: Download and save the following Matlab code from the course website:

• controller.m (you will write your feedback controller inside this template script)

You can also download the following functions – you can modify these if you like. Each Kinect sensor has its own unique optics, so we have had to pre-install a customized version on each of the computers in B&H096 – if you want to use your own modified versions, please contact Prof Bower for assistance.

• getKinectCoords_sample.m (reads the Kinect and calculates coordinates from the images)
• preview_kinect_image_sample.m (an interactive MATLAB GUI that helps set up the image processing)
• thrust_logger.m (a MATLAB script used to measure the quadcopter thrust – its use is optional).


Some additional software is needed to manage communications between the quadcopter, the Kinect, and your control scripts. This only works on recent Windows operating systems and is tricky and time-consuming to install, so we have provided a dedicated computer with each quadcopter. You will need to copy your codes onto these computers to test your designs. To complete the design, you will need to:
(i) Complete some preliminary hand calculations and MATLAB simulations to understand how the control system works, and to calculate values of important parameters in the control system.
(ii) (Optional) Measure the thrust developed by the quadcopter motors as a function of the control setting and the battery voltage.
(iii) Write a MATLAB code that will calculate roll/pitch/yaw/thrust settings that will fly the quadcopter to where it should be.
(iv) Test and fine-tune your control system on an actual quadcopter.
(v) Write a code to make your quadcopter pick up a colored ping-pong ball and deliver it to a prescribed location.
The steps in the project are described in more detail in the sections to follow.

3. Design calculations
The basic design problem you must solve is to (i) measure the position of the quadcopter; and (ii) use the measured position to calculate the roll, pitch, thrust and yaw settings to be sent to the quadcopter. You will do this using a so-called 'PID feedback-control system' (PID stands for Proportional, Integral, Derivative). To implement your design, you should:

(i) Do some quick hand calculations to understand how the PID controller works;
(ii) Write a more sophisticated MATLAB code that will solve the equations of motion for the quadcopter and an idealized version of the control system;
(iii) Use your MATLAB code to estimate values for the parameters needed in the actual control system.

3.1 Hand calculations
The following simple calculations will help you understand how a feedback controller works. The figure shows an idealization of a quadcopter. To keep things simple, we will focus only on controlling its height, and assume that its roll, pitch and yaw are stabilized separately. Assume that:

• The quadcopter has mass m.
• At time t=0 the quadcopter is on the ground, at z=0.
• We wish to hold the quadcopter at a constant altitude z*.
• We can measure the actual height z of the quadcopter.
• We control the motors by sending a signal τ (a number between 0 and 60000) to the quadcopter. A computer on board the quad will then adjust the motor speeds to produce a total vertical thrust T = βτ, where β is a constant (which we will need to measure).

You will now derive the equations of motion for the quadcopter and its controller, and determine how it will behave.

[Figure: an idealized quadcopter of mass m at height z, with the target altitude z* marked]


(a) Write down F=ma for the quadcopter and hence show that its position is related to τ by

m d²z/dt² = βτ − mg

(b) For a basic control system we could choose to set τ = K_P (z* − z), where K_P is a constant (proportional control). For this choice, use the solutions to vibration problems to write down the solution for z(t) in terms of z*, m, g, K_P and β. Does this controller work?

(c) For a better system, we can determine the velocity dz/dt of the quadcopter, and set

τ = K_P (z* − z) − K_D dz/dt

(proportional-derivative control). For this choice, write down the modified equation of motion for z, and hence use your understanding of vibrations to (a) give a formula for the value of K_D that will make the quadcopter go to its final position as quickly as possible; and (b) for this choice, derive a formula for z(t) in terms of z*, m, g, K_P and β.

(d) Notice that the quadcopter never gets to the right height with the controller in (c). We can do a bit better if we measure the mass, and use

τ = τ₀ + K_P (z* − z) − K_D dz/dt

where τ₀ = mg/β is the thrust setting that will just support the weight of the quadcopter. What is the final position with this choice?

Summary: This idea is called 'feedback control' – the basic idea is to apply a control setting (the number τ) that is determined by the error e = z* − z between where you would like the quadcopter to be (z*), and where it actually is (z). For P-D control, we choose τ = K_P e + K_D de/dt. We can use the same

idea to control the x, y position and the yaw of the quadcopter too: but instead of sending a thrust setting, we will send signals that control the roll, pitch and yaw rate. If you are wondering 'What about the I in PID?' stay tuned – we will add this later.

Complications: As is usual in engineering, the basic operating principles in this design are simple, but several complications arise when we try to implement it. The two big ones here are (a) the motor thrust does not just depend on τ – because of the airflow around the quad, the thrust depends on how the quadcopter is moving; and (b) it takes a finite amount of time to read the Kinect images, process them to calculate the quadcopter position, and calculate the necessary value of τ. Your code will be able to run at about 30 frames per second. For this reason, we need to select controller settings that will make the proportional controller oscillate at about 0.5 cycles per second or slower. The main purpose of your design calculations is to estimate the required controller settings – with this initial estimate, you can fine-tune the design by hand.

Optional additional calculations: If you like, you can set up a similar model that will help design the controller for horizontal motion. In this case, you will send pitch and roll settings (in degrees) to the quadcopter.


3.2 Writing a MATLAB quadcopter simulator
You can do much more sophisticated design calculations by using MATLAB to solve the equations of motion for the quadcopter instead of doing them by hand. In this section, you will write a MATLAB code to simulate the motion of a feedback-controlled quadcopter in 3D, and will use the code to find values for the P-D control parameters K_PZ, K_DZ and K_PXY, K_DXY controlling vertical and horizontal motion. For this step, we will consider the full 3D (x, y, z) motion of the quadcopter. Assume that:

• The position and velocity of the quadcopter are specified in a fixed {i,j,k} basis, with i along the line of sight of the Kinect, and k vertical (see the figure).
• We specify the desired position of the quadcopter as (x*, y*, z*) (these could vary with time).
• A sensor on the quadcopter will measure its yaw angle ψ. The yaw is the rotation of the quadcopter about the e_N axis (with right-hand convention) shown in the figure. We use the convention that yaw is zero when e_L is parallel to i.
• The controller will send signals to the quadcopter that specify the thrust τ (a number between 0 and 60000); the roll and pitch angles θ, φ (which represent rotations of the quadcopter about the e_L, e_T axes, in degrees); and the yaw rate λ (in degrees per second).

• These settings cause the motors to apply the thrust at an angle. The resulting vector components of thrust are

T_x = βτ (sinθ sinψ + cosθ sinφ cosψ)
T_y = βτ (cosθ sinφ sinψ − sinθ cosψ)
T_z = βτ cosθ cosφ

• The quadcopter will respond to the yaw signal λ by rotating about the e_N axis at a rate dψ/dt = λ.


With this preamble, work through the following calculations:
(a) Use F=ma and the usual manipulations to show that the equations of motion for the position (x, y, z), velocity v_x, v_y, v_z and yaw ψ are

dx/dt = v_x,  dy/dt = v_y,  dz/dt = v_z
dv_x/dt = (sinθ sinψ + cosθ sinφ cosψ) βτ/m
dv_y/dt = (cosθ sinφ sinψ − sinθ cosψ) βτ/m
dv_z/dt = (cosθ cosφ) βτ/m − g
dψ/dt = λ

(b) Write a MATLAB script that will use ode45 to calculate the position (x, y, z), velocity v_x, v_y, v_z and yaw ψ with constant values of τ, φ, θ, λ. Use the following parameters:

• m = 35 grams
• β = 1.17 × 10⁻⁵ N (the units are Newtons)
• Initial position and velocity x = y = z = 0, v_x = v_y = v_z = 0

Check your code by predicting the position and velocity of the quadcopter for a few (constant) values of τ, φ, θ, λ, and make sure the code gives the answers you expect.
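As a starting point, the script in step (b) might be sketched as follows. The function names, the 5-second flight time, and the test setting τ = 30000 are illustrative choices only, not values prescribed by the project:

```matlab
function quad_openloop_demo
% Simulate the quadcopter with constant control settings (step (b)).
m = 0.035;                 % mass, kg (35 grams)
beta = 1.17e-5;            % thrust constant, N per unit of tau
g = 9.81;                  % gravitational acceleration, m/s^2
tau = 30000; theta = 0; phi = 0; lambda = 0;   % constant test settings
w0 = zeros(7,1);           % state [x;y;z;vx;vy;vz;psi], all zero at t=0
[t,w] = ode45(@(t,w) eom(w,m,beta,g,tau,theta,phi,lambda),[0,5],w0);
plot(t,w(:,3)); xlabel('t (s)'); ylabel('z (m)');
end

function dwdt = eom(w,m,beta,g,tau,theta,phi,lambda)
psi = w(7);  T = beta*tau;   % total thrust, N
dwdt = [w(4); w(5); w(6);
        (sin(theta)*sin(psi) + cos(theta)*sin(phi)*cos(psi))*T/m;
        (cos(theta)*sin(phi)*sin(psi) - sin(theta)*cos(psi))*T/m;
        cos(theta)*cos(phi)*T/m - g;
        lambda];
end
```

A convenient sanity check: with τ = 30000 the thrust βτ ≈ 0.35 N just exceeds the weight mg ≈ 0.34 N, so the simulated quadcopter should climb slowly with zero roll and pitch.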

(c) Next, add the control system to the calculation. As before, the basic idea is to make the controller settings – the thrust τ; the roll and pitch angles θ, φ (in radians); and the yaw rate λ – respond to any deviation between the actual and desired positions of the quadcopter. Here is one way to do this¹:

τ = τ₀ + K_PZ (z* − z) − K_DZ v_z
θ = −C_y cosψ + C_x sinψ
φ = C_x cosψ + C_y sinψ
C_x = K_PXY (x* − x) − K_DXY v_x,    C_y = K_PXY (y* − y) − K_DXY v_y
λ = K_Pψ (ψ* − ψ)

Modify your MATLAB code to use these formulas to calculate τ, θ, φ and λ inside the 'EOM' function.

¹ The trig terms in the formulas for roll and pitch need a little explanation. If the yaw is zero, then pitching the quadcopter will make it move parallel to the x direction, and rolling it will make the quadcopter move parallel to the y direction. But if the quadcopter yaws, this is no longer the case. The trig terms do the necessary basis change for you.
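As a sketch, the lines added inside the 'EOM' function might look like this. The gain names, target coordinates, and velocity variables are illustrative – use whatever names your script already defines:

```matlab
% P-D feedback inside the 'EOM' function (angles in radians here).
% KPZ, KDZ, KPXY, KDXY, KPpsi, tau0 and the targets xstar, ystar,
% zstar, psistar are assumed to be available -- illustrative names only.
Cx = KPXY*(xstar - x) - KDXY*vx;
Cy = KPXY*(ystar - y) - KDXY*vy;
tau    = tau0 + KPZ*(zstar - z) - KDZ*vz;   % thrust setting
theta  = -Cy*cos(psi) + Cx*sin(psi);        % roll setting
phi    =  Cx*cos(psi) + Cy*sin(psi);        % pitch setting
lambda =  KPpsi*(psistar - psi);            % yaw-rate setting
```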


3.3 Adding Integral Control
The P-D controller you implemented in the preceding section works quite well, but if the quadcopter is not perfectly balanced (i.e. it does not stay in one place when it is flying with zero roll and pitch angles), or if the thrust is not calibrated perfectly, the quadcopter tends to fly some distance away from the desired point, instead of exactly where it should be. The 'I' term in the P-I-D controller is designed to fix this problem. It works like this.

• The controller keeps track of the integrated error between the desired and actual position, by calculating

e_x = ∫₀ᵗ (x* − x) dt,    e_y = ∫₀ᵗ (y* − y) dt,    e_z = ∫₀ᵗ (z* − z) dt

• The feedback control signal is modified to include a term proportional to (e_x, e_y, e_z) as follows:

τ = τ₀ + K_PZ (z* − z) − K_DZ v_z + K_IZ e_z
θ = −C_y cosψ + C_x sinψ
φ = C_x cosψ + C_y sinψ
C_x = K_PXY (x* − x) − K_DXY v_x + K_IXY e_x,    C_y = K_PXY (y* − y) − K_DXY v_y + K_IXY e_y
λ = K_Pψ (ψ* − ψ)

To understand why this works, notice that if the quadcopter is flying any distance away from the desired position, the integrated error will continue increasing. A term proportional to this error is fed back to the controller. This means that the thrust and roll/pitch settings will be adjusted until the error no longer increases: if the error stops increasing, the quadcopter must be exactly at the desired position.

You can test the effects of the 'integral' control by adding it to your MATLAB design code. When you write your quadcopter control code (Section 6) you will need to do the integrals to calculate the error, but in the MATLAB design code it is more convenient to have the MATLAB ODE solver do the integrals. To do this, you can modify the equations in Section 3.2(a) to add the error variables to the unknowns. The modified system of equations becomes:

dx/dt = v_x,  dy/dt = v_y,  dz/dt = v_z
dv_x/dt = (sinθ sinψ + cosθ sinφ cosψ) βτ/m
dv_y/dt = (cosθ sinφ sinψ − sinθ cosψ) βτ/m
dv_z/dt = (cosθ cosφ) βτ/m − g
dψ/dt = λ
de_x/dt = x* − x,  de_y/dt = y* − y,  de_z/dt = z* − z

At time t=0 we know e_x = e_y = e_z = 0. Add the three additional lines to your matlab code to compute the error, and add the extra 'I' terms to the feedback control equations. Then follow the steps in Section 3.4 to test your code, and to calculate values for the PID coefficients in your controller.
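Putting the pieces together, the modified 'EOM' function with the error integrals appended to the state vector might be sketched as follows (the struct-based parameter passing and all names are illustrative, not part of the template):

```matlab
% Sketch of the EOM function with the three error integrals appended.
% State vector: w = [x;y;z;vx;vy;vz;psi;ex;ey;ez]
function dwdt = eom_pid(~, w, m, beta, g, k, target)
x=w(1); y=w(2); z=w(3); vx=w(4); vy=w(5); vz=w(6); psi=w(7);
ex=w(8); ey=w(9); ez=w(10);
% P-I-D feedback (angles in radians inside the simulation)
Cx = k.PXY*(target.x - x) - k.DXY*vx + k.IXY*ex;
Cy = k.PXY*(target.y - y) - k.DXY*vy + k.IXY*ey;
tau    = k.tau0 + k.PZ*(target.z - z) - k.DZ*vz + k.IZ*ez;
theta  = -Cy*cos(psi) + Cx*sin(psi);
phi    =  Cx*cos(psi) + Cy*sin(psi);
lambda =  k.Ppsi*(target.psi - psi);
T = beta*tau;
dwdt = [vx; vy; vz;
        (sin(theta)*sin(psi) + cos(theta)*sin(phi)*cos(psi))*T/m;
        (cos(theta)*sin(phi)*sin(psi) - sin(theta)*cos(psi))*T/m;
        cos(theta)*cos(phi)*T/m - g;
        lambda;
        target.x - x; target.y - y; target.z - z];   % error rates
end
```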


3.4 Tuning the controller parameters
Finally, run the following calculations to obtain initial estimates for the controller parameters. In each case run the simulation for about 20-30 seconds, and plot the position and yaw as functions of time. Please follow these instructions carefully – incorrect values for the controller gains cause crashes that often break the tracking ball off the quadcopter. This is not a big problem as the balls are designed to be replaceable, but replacing them does take some time…

(a) Set the initial position, velocity, yaw and error to zero. Set the desired coordinates to x* = y* = ψ* = 0, z* = 1. Set the controller parameters K_PXY = K_DZ = K_DXY = K_IXY = K_IZ = 0. Then adjust the value of K_PZ to give a period of vertical oscillation of about 4 seconds. The value of K_PZ will probably be between 1000 and 10000.

(b) Next, keep the initial and desired positions in (a) and set K_PXY = K_DXY = K_IXY = K_IZ = 0. With the value of K_PZ from part (a), adjust K_DZ to give a slightly under-damped response (you should see a small overshoot of the desired position). The value of K_DZ will be similar to the value of K_PZ.

(c) Repeat (b) with the new values of K_PZ and K_DZ, but now gradually increase K_IZ until you just start to see a second oscillation in the height.

(d) Next, you need to tune the horizontal controller. Set the initial position, velocity and error in the quadcopter to zero. Set the desired coordinates to z* = y* = ψ* = 0, x* = 0.1. Set K_DXY = K_IXY = 0 and adjust the value of K_PXY to give a period of horizontal oscillation (in the x direction) of about 3 seconds. The value of K_PXY will probably be less than 1.

(e) Next, adjust K_DXY to give a slightly under-damped response (you should aim to produce a small overshoot of the desired position, but no observable subsequent oscillations).

(f) Finally, increase K_IXY until you just start to see a second oscillation appear in the x position.

(g) The last parameter to set is the yaw constant K_Pψ. To set this, use x* = y* = z* = 0, ψ* = 1. Then adjust K_Pψ so that the yaw reaches 90% of the desired value in about 2.5 seconds.

You now have values for the parameters in the feedback controller – you will use these values when you write your Matlab script to control the quadcopter (Section 6).


4. Thrust calibration
To function correctly, the control system that you designed in the preceding sections requires a value for the constant τ₀ that specifies the thrust setting required to just support the quadcopter's weight. (To see what happens when this term is not included in the control system, see the last part of this video). This turns out to be a little more complicated than it seems, because β is a function of the battery voltage: as the battery drains, the motors produce progressively less thrust. You can deal with this by continuously monitoring the battery voltage and using a formula to calculate τ₀. We have found that the following function works well:

τ₀ = 101600 − 15300 V

where V is the battery voltage.

You can obtain a more accurate calibration by measuring the thrust directly. This step is optional and may not be necessary. Try flying the quadcopter with the calibration provided first (you can try increasing the 101600 slightly if the quadcopter consistently sinks after takeoff, or reduce it slightly if it consistently ascends too fast). If after a bit of experimenting you are still unhappy with its altitude control, follow the instructions below to run the thrust calibration. We have not yet installed the Arduino support package on the computers in B&H096, so contact Prof Bower if you want to use the thrust-stand.

The figures below show a thrust-stand designed for this purpose: it consists of a small 100 gram load-cell with an attachment that fits the quadcopter, and an Arduino board that will measure the voltage from the load cell and convert it to a mass reading. A software interface has been provided that reads the Arduino, manages the radio communications with the quadcopter, and provides an interface to a MATLAB script that can be used to measure the thrust. The matlab script thrust_logger.m provided runs a feedback controller (described in Appendix C) that keeps the motor thrust just equal to the weight. The controller setting, battery voltage, and force values are printed to a data file that can be read and plotted after the test.

To use the thrust-stand:

• Log into the computer provided in 096 with your Brown user ID and password (the same one you use to log into the computers in the B&H instructional facility).

• Copy all the MATLAB codes controller.m, getKinectCoords.m, preview_kinect_image.m and thrust_logger.m to the desktop (if you wish, you can store them in some other directory, but then you will need to browse to select the directory every time you open the thrust controller).

• Double-click the 'EN40 thrust calibrator' program on the desktop and wait for it to start.
• Optional: If you did not store your MATLAB codes on the desktop, browse to the folder where you stored them.
• Before you attach the quadcopter to the test stand, press the 'Zero load-cell' button.

[Figure: the thrust-stand, with the load cell labeled]


• Put a fresh battery in the quadcopter, to ensure you get thrust readings for the full range of battery voltages.

• Attach the quadcopter to the test-stand with the screws provided. Be careful not to damage the load-cell while you do this (the load-cell has a 100gram capacity). The figures above show a quadcopter without the red ball, but you should run the test with the red ball and the training-wheels attached. The mass of the quadcopter in grams will be displayed on the GUI. If you like, you can use this measurement, together with the measured value for 0τ , to determine the constant β .

• Turn on the quadcopter by pressing the small button at the front. The LEDs should flash and the motors will spin briefly if the power-up was successful. If not, try pressing the button again.

• Press 'scan' to look for active quadcopters. If any are found, they will be listed in the drop-down menu. Each quadcopter is set to communicate on a different radio channel – make sure you select the one you want to test and not one being used by another group! Then press 'connect' to initialize the radio connection. If the connection is successful the link quality bar will display, and (shortly afterwards) the battery voltage is displayed.

• Press ‘Begin Measurement’ to start the test. The thrust calibrator will attempt to adjust the thrust until the load cell records zero weight (so the motor thrust is just supporting the weight of the quadcopter). The time, battery voltage, thrust setting, and the raw and filtered force-sensor readings are saved in a .csv file that you can read and process after the measurement is complete. Note that motor vibrations cause a lot of noise on the force sensor reading so it is filtered using a first order low-pass filter with a time-constant of about 5 seconds. The data taken during the first 8 seconds or so is not reliable.

• The measurement will end automatically when the thrust value exceeds 60000 or the battery gets too low to continue radio communication. You can also end the measurement manually.

NOTE: For unknown reasons the radio or Python script will occasionally hang up and the quadcopter motors won't spin up. If this happens, try (1) turning off the quadcopter, turning it back on again, and reconnecting. If this fails, try (2) shutting down the thrust calibrator window, reopening it, turning the quadcopter off and back on, and then connecting.

After you have completed a full test, you can read the .csv file into MATLAB and plot a graph of the thrust that just supports the quadcopter weight against the battery voltage (it's helpful to plot thrust-v-time and load cell reading-v-time as well). It's best to do a scatter plot, because the thrust fluctuates a lot, partly because of air currents, and partly because of noise on the load-cell signal. You can fit a regression curve through the thrust-v-battery voltage data to find a suitable function for τ₀.
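As an illustrative sketch of this post-processing step (the file name and column order here are assumptions – check the header of the file your test actually produced):

```matlab
% Read the logged thrust data, scatter-plot thrust setting against
% battery voltage, and fit a straight line through the data.
% Assumed columns: [time, voltage, thrust, raw force, filtered force].
data = csvread('thrust_log.csv');     % use csvread(...,1,0) if there is a header row
t = data(:,1);  V = data(:,2);  tau = data(:,3);
keep = t > 8;                         % discard the unreliable first ~8 seconds
plot(V(keep), tau(keep), '.');
xlabel('Battery voltage (V)'); ylabel('Thrust setting \tau_0');
p = polyfit(V(keep), tau(keep), 1);   % linear fit: tau0 = p(1)*V + p(2)
fprintf('tau0 = %.0f %+.0f*V\n', p(2), p(1));
```

The slope and intercept from the fit replace the default 101600 and −15300 values in your controller.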

5. Image Processing
MATLAB codes to read the Kinect images and calculate coordinates have been provided for you. This section provides a short overview of how this is done. A more detailed description of the Matlab code is provided in Appendix A. You can modify these codes if you like. The Kinect behaves much like a camera, but in addition to recording a digital color image, it also records a 'depth' image, which measures the distance of objects in the field of view from the sensor (it does this by measuring the time it takes for a laser beam to leave the Kinect, reflect off a surface, and return to the Kinect sensor – see this article for some more detail). When you fly your quadcopter, you will be able to preview the images captured by the Kinect, and may need to tweak the default settings to make sure the images are captured correctly. A brief outline of the tracking algorithm follows.


When you press the 'preview kinect image' button in the flight manager, a MATLAB window will open that displays the color image (the quadcopter GUI will be unresponsive until you close the MATLAB window). The figure below shows a representative image.

Some brief explanations:

• The red border in the color image marks the edge of the region where the quadcopter is permitted to fly. If the quadcopter crosses the red border an ‘out of frame’ error will be returned and the controller will activate an automatic ‘crash land’ sequence that powers down the motors.

• The software will try to detect the red ball attached to your quadcopter. It does this by searching for red pixels in the image (the algorithm will get confused if anything else red appears in the image).

• If enough red pixels are detected to estimate the position of the ball, a green rectangle surrounding the quadcopter (or whatever red object is found) will be displayed in the color image. To speed up the tracking algorithm, only the image inside the green rectangle is searched for red pixels during flight. The code keeps track of the velocity of the quadcopter and tries to move the search rectangle to the expected position of the quadcopter each time the Kinect is sampled.

You can identify (and modify) which regions of the color image are detected by the code to be part of the red ball by pressing the 'filtered color image' button below the color image. Pixels that are red are shaded black in this image; any other color is white. If the coordinates can be extracted from the images, they are also displayed. A representative example is shown in the figure below.


Note that

• Red pixels are identified by their 'RGB' values (red, green, blue). The Matlab code searches for pixels that have RGB values that lie within a specified range.

• You can change the value and range for the RGB values by using the buttons on the bottom right hand corner of the filtered color image. Their values will be provided to your MATLAB control code while you fly your quadcopter.

The color image is not used directly to calculate the quadcopter’s coordinates. Instead, it is used to determine the approximate region inside the ‘depth image’ that contains the quadcopter. You can view the ‘depth image’ by pressing the ‘depth image’ button at the bottom of the color or filtered color images.

• A white border on the outside edge of the depth image shows the edges of the region where the quadcopter is permitted to fly. If the quadcopter crosses the border an ‘out of frame’ error will be returned and the controller will activate an automatic ‘crash land’ sequence that powers down the motors.

• Dark blue pixels (with a value of 0) are areas where the laser-beam did not reflect properly and the Kinect could not measure a depth value.

• If a red ball is detected in the color image, the approximate region containing the quadcopter is outlined by a white rectangle in the depth image.

• The Matlab script determines the quadcopter coordinates by finding the closest point to the Kinect inside the rectangular region, and then calculating the centroid of all the pixels in the rectangular sub-region with depth values within 100 mm of the closest point. This yields three numbers: (1) the distance of the centroid from the Kinect, in mm; (2,3) the pixel coordinates of the centroid in the depth image. These have to be converted into physical (x,y,z) coordinates. The code provided uses a basic pinhole camera approximation to do this.
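For illustration only, a pinhole conversion might look like the sketch below. The focal length and image-centre values are assumed nominal Kinect-2 depth-camera numbers, and the signs depend on the axis conventions used in the provided code – the calibrated values for your sensor live in getKinectCoords_sample.m, so treat this purely as a sketch of the idea:

```matlab
% Pinhole-camera conversion sketch (assumed nominal intrinsics).
f = 365;  cx = 256;  cy = 212;   % focal length and image centre, pixels (assumed)
d = 1500;                        % centroid distance from the Kinect, mm (example)
px = 300; py = 180;              % centroid pixel coordinates (example values)
x = d/1000;                      % distance along the line of sight, m
y = (px - cx)*d/(f*1000);        % horizontal offset, m
z = -(py - cy)*d/(f*1000);       % vertical offset, m (pixel rows count downwards)
```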


You can use the MATLAB codes provided to detect colors other than red, if you like. To do this, you can press the 'detect region color' button. This will display a static snapshot of the color image. You can zoom in on a region that you would like to track (which must have a distinct color that does not appear anywhere else in the image), then crop the image by dragging a rectangle around the region you would like to track. The code will sample the average and maximum/minimum RGB values inside this region, and then display the 'filtered color' image with these new values. You can adjust the filter values and windows until only the object of interest appears in the filtered image.

6. Implementing the controller
You will control your quadcopter by writing a MATLAB script that calculates the roll, pitch, thrust and yaw-rate settings to be sent to the quadcopter. The Python client that manages communications with the quadcopter will repeatedly call your function to obtain values for the control settings. The sample rate is typically of order 15-20 per second. A code template has been provided for you. The input and output arguments of this function should not be changed (they have to be compatible with the Python code that manages communications with the quadcopter). The template should be fairly straightforward to follow, but there is one feature of the code that needs a little explanation. Before take-off, the Python client will call your code to make sure that the Kinect has been able to detect the quadcopter and to obtain its initial position. During this phase the motors will not be turned on. The INITIALIZING variable will be set to 1 during this phase. You can test the value of this variable, and when it is set to 1, open any output files you need, and so on.


Your MATLAB script should accomplish the following tasks:
(i) Obtain the position and velocity of the quadcopter from the Kinect images. A MATLAB code (described in the Appendix) has been provided for this purpose, but you can modify it if you wish.

(ii) Calculate the integrated error measures

e_x = ∫₀ᵗ (x* − x) dt,    e_y = ∫₀ᵗ (y* − y) dt,    e_z = ∫₀ᵗ (z* − z) dt

You can use a simple Euler time-stepping scheme to continuously update the error, as follows: Let ( )xe t denote the error in x at time t. The error at a short time t t+ ∆ later is

*( ) ( ) ( )x xe t t e t t x x+ ∆ = + ∆ − You will need to store the values of ( , , )x y ze e e as persistent variables in your Matlab script (see the code template for instructions).
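The Euler update above is just an accumulation of the position error at each sample. A minimal sketch (in Python for illustration only – the flight code itself is MATLAB, and the function name here is made up):

```python
def update_errors(errors, pos, pos_desired, dt):
    """Euler update of the integrated position errors (e_x, e_y, e_z).

    errors, pos and pos_desired are (x, y, z) tuples; dt is the time
    since the last Kinect sample, in seconds."""
    return tuple(e + dt * (p_star - p)
                 for e, p, p_star in zip(errors, pos, pos_desired))

# Hover 0.1 m below the target for 10 samples at dt = 0.05 s:
e = (0.0, 0.0, 0.0)
for _ in range(10):
    e = update_errors(e, (0.0, 0.0, 0.3), (0.0, 0.0, 0.4), 0.05)
# e[2] has accumulated roughly 10 * 0.05 * 0.1 = 0.05
```

In the real controller.m the accumulated errors would be kept in persistent variables between calls, as the template describes.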

(iii) Calculate $\tau_0$ using the formula in Section 4 or the results of your measurements.

(iv) Calculate the required controller settings from the formulas

$$\tau = \tau_0 + K_{PZ}(z^* - z) - K_{DZ} v_z + K_{IZ} e_z$$

$$\theta = C_x \cos\psi + C_y \sin\psi \qquad \phi = -C_x \sin\psi + C_y \cos\psi$$

$$C_x = K_{PXY}(x^* - x) - K_{DXY} v_x + K_{Ix} e_x \qquad C_y = K_{PXY}(y^* - y) - K_{DXY} v_y + K_{Iy} e_y$$

$$\lambda = -K_P (\psi - \psi^*)$$

The formulas above assume that the angles $\theta$, $\phi$ and the angular velocity $\lambda$ are in radians. Note that the yaw sensor on the quadcopter gives $\psi$ in degrees. You must also convert $\theta$, $\phi$ and $\lambda$ into degrees when you return their values (as output variables from your function) to the flight controller.
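Evaluating these formulas takes only a few lines. Here is a sketch (Python for illustration only – in the project this logic lives in controller.m, and the gain names and values below are placeholders, not tuned settings; a single 'IXY' gain stands in for the separate x and y integral gains):

```python
import math

def controller_settings(pos, pos_des, vel, err, psi_deg, gains, tau0):
    """Evaluate the PID formulas above; returns (tau, theta, phi, lambda)
    with the angles and yaw rate converted back to degrees."""
    (x, y, z), (xs, ys, zs) = pos, pos_des
    vx, vy, vz = vel
    ex, ey, ez = err
    psi = math.radians(psi_deg)            # yaw sensor reports degrees
    tau = tau0 + gains['PZ']*(zs - z) - gains['DZ']*vz + gains['IZ']*ez
    Cx = gains['PXY']*(xs - x) - gains['DXY']*vx + gains['IXY']*ex
    Cy = gains['PXY']*(ys - y) - gains['DXY']*vy + gains['IXY']*ey
    theta = Cx*math.cos(psi) + Cy*math.sin(psi)   # pitch, radians
    phi = -Cx*math.sin(psi) + Cy*math.cos(psi)    # roll, radians
    lam = -gains['P']*psi                          # yaw rate (psi* = 0 here)
    # convert the angles to degrees for the flight controller
    return tau, math.degrees(theta), math.degrees(phi), math.degrees(lam)
```

Note the radians-to-degrees conversion on the way out, and degrees-to-radians on the way in for the yaw reading – forgetting either is a common source of wildly unstable flights.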

(v) Print data recording the performance of the flight to a file. We suggest printing at least the desired position and the actual position of the quadcopter to a csv file. You can then read the file in MATLAB and plot the data after the flight, so you can see how accurately your controller is working. You can print a lot of other data as well, e.g. the velocity; the controller settings; the battery voltage; and (if you chose to log them) the accelerometer and gyro signals.

Of course, the desired positions $(x^*, y^*, z^*)$ can be functions of time. You will need to write code to make the quadcopter fly along some path. Don’t try anything too fancy for your first test – try:

(1) Set the desired yaw $\psi^* = 0$.
(2) With $(x^*, y^*)$ held fixed, raise the quadcopter at constant speed for 5 sec until it reaches a height 0.4m above its initial position.
(3) Next, with $(y^*, z^*)$ held fixed, move $x^*$ 0.4m towards the Kinect at constant speed for 5 sec.
(4) Hold position steady for 10 sec.
(5) With $(x^*, y^*)$ held fixed, lower the quadcopter to 0.1m below its initial height over 5 sec.
(6) At the end of the flight, activate the ‘crash land’ sequence by returning FAIL=1. This will cause the controller to power down the motors for a reasonably soft landing.

Once you get this to work, you can try more complicated paths. It’s best not to try to fly too close to the ground – ‘ground effect’ makes the quadcopter very difficult to control, and the ground can also interfere with the Kinect’s depth measurement system.


7. Testing your controller

Once you have all your codes written, you can try a test flight. This is straightforward:

1) Make sure the takeoff stand is about 1.5m away from the Kinect.
2) Log into the computer provided with your Brown user name and password (the same one you use for the computers in the instructional computer lab).
3) Copy the MATLAB code controller.m to the desktop (if you wish, you can store it in some other directory, but then you will need to browse to select the directory every time you open the flight manager).
4) Open the flight manager from the PC desktop. This will (eventually) open up a GUI.
5) If you stored your matlab codes somewhere other than the desktop, use the buttons provided to browse to the folder.
6) Place the quadcopter on the takeoff stand, with its front facing away from the Kinect (the battery connection is at the back of the quadcopter).
7) While the quadcopter is level, press the small switch at its front to turn it on. The LEDs should flash, and the motors spin briefly, if the power-up was successful. If not, try turning it on a second time.
8) By default, only the battery and yaw signals are logged from the quadcopter. If you wish, you can log data from the accelerometers and gyros on the quadcopter as well. This will slow down the controller communications slightly. Note that if you click the checkboxes the radio link will disconnect. A new radio link needs to be established to update the logger.
9) You can enter parameter values to be passed to your controller script if you need them (they get passed to your controller.m script, and your code has to do something with them; if you enter nothing, that’s fine).
10) You can preview the Kinect image if you wish (it is a good idea to do this before your first flight, just to make sure that nothing red other than the ball appears in the field of view, and that a depth image is being acquired). This will open some matlab windows – click them to bring them to the front. The flight manager GUI will not respond until you close the Matlab windows (use the ‘close’ button at the bottom of the window to do this, otherwise the connection to the python script will break).
11) Press the ‘scan’ button to search for quadcopters. Any that are found will be listed in the drop-down menu. Each quadcopter is set to communicate on a different radio channel – make sure you select the one you want to fly and not one being used by another group! Then press ‘connect’ to initialize the radio connection. If the connection is successful the link quality bar will be displayed, a message appears in the Status bar, and (shortly afterwards) the battery voltage gets displayed. If the connection doesn’t work, try again – also try restarting the quadcopter.
12) Press the ‘Start Flight’ button to begin a flight. It is helpful to have someone ready to catch the quadcopter for your first flight in case it goes crazy. If something goes wrong you can press the red button to abort the flight – this will activate a ‘crash land’ sequence that ramps down the motors and hopefully brings the quadcopter to a reasonably soft landing. Once you have a bug-free controller code and the correct controller gains, the quadcopter should behave well, as long as it doesn’t lose radio contact.

Troubleshooting:

• If you press the ‘Start Flight’ button and the motors don’t spin up within a few seconds it is because of radio interference. In this case press the ‘Abort Flight’ button, disconnect the radio, and re-connect. You can try turning off and restarting the quadcopter as well. The problem usually resolves itself in a few minutes, but we haven’t found a way to stop this happening.

• Radio interference can also cause the controller to lose contact with the quadcopter during flight, in which case it might plummet to the ground, or fly off in some strange direction. Just try the flight again (assuming the quadcopter survives).


• The battery chargers are flaky. Move the battery connectors around in the sockets until the lights come on. The light turns off again when the battery is fully charged.

• Consult a TA or faculty member for help repairing damage…

To fine-tune your controller, it is helpful to write a MATLAB script that will read the .csv file recorded during the flight and plot graphs showing the actual and desired positions of the quadcopter (and any other data you might find interesting). You can then try adjusting the controller settings to see if you can improve performance. You will find that the quadcopter is stable only for a rather narrow range of values of the feedback-controller constants – with the wrong values the quadcopter will oscillate around the desired position, sometimes quite violently. It is likely that the optimal controller settings are functions of the battery voltage. We have not explored this.

Once you are happy with your controller you can work on getting the quadcopter to pick up and deliver its payload. You can use the ‘preview_kinect_image.m’ function, or the ‘getKinectCoords.m’ function, to detect and measure the position of the colored ball that must be picked up, as well as the bowl that serves as the delivery area. It is best to measure their coordinates with the Kinect, since this guarantees that the quadcopter will fly to the correct points even if the physical coordinates are not exactly right. For this year you can choose the initial and final positions of the payload yourself…

8. Report Guidelines

Your report should

(1) Give the values you selected for your feedback control parameters.
(2) Describe how you calculated the feedback-control parameters, and show some graphs that predict the expected behavior of the feedback controller.
(3) If you measured the motor thrust, report the results of this experiment.
(4) Describe any measurements you made to test the actual performance of the control system.
(5) Describe the strategy you used to pick up and transport the beer-pong ball to the desired position.

9. Grading Rubric

Your project will be graded as follows (50 points total):

• Matlab design calculations: 10 points
• Coding and tuning a feedback controller: 5 points
• Report writing and presentation: 10 points
• Oral presentation: 10 points
• Performance during testing: 5 points
• Peer evaluation: 10 points


Appendix A: Calculating the position of the quadcopter from the Kinect Sensor

The quadcopter is tracked using a simple MATLAB code that will

(i) Read a color and depth image from the Kinect;
(ii) Detect the quadcopter in the color image (by detecting the red ball);
(iii) Use the results of (ii) to crop out a small region of the depth image containing only the quadcopter;
(iv) Find the depth and pixel coordinates of the centroid of the quadcopter in the cropped depth image;
(v) Convert the depth and pixel coordinates to actual physical x,y,z coordinates;
(vi) If (x,y,z) coordinates cannot be found, the code must return an error message.

For this project it is very important to write efficient matlab code, because the calculations have to be done in real-time. For this reason we can’t use the sorts of sloppy brute-force loops that we often use to process data in EN40 homeworks, but instead need to use the fast methods that you learn in CS40. This appendix describes how the image processing codes provided with the project work. You can modify and improve these if you wish.

A.1 Reading the kinect

The Kinect behaves much like a camera, but in addition to recording a digital color image, it also records a ‘depth’ image, which measures the distance of objects in the field of view from the sensor (it can also record audio and an infra-red image, but we won’t use those in this project – unless you choose to write your own fancy code to use them for something, of course…). You can use the following code to read the Kinect sensor:

depth_image = kinect_handle.getDepth;
color_full = kinect_handle.getColor;

The variable ‘color_full’ is a 1080x1920x3 matrix that specifies the RGB (red, green, blue) values of every pixel in the image, as follows:

color_full(i,j,1) – an integer from 0 to 255 specifying the red content of the pixel at (i,j)
color_full(i,j,2) – an integer from 0 to 255 specifying the green content of the pixel at (i,j)
color_full(i,j,3) – an integer from 0 to 255 specifying the blue content of the pixel at (i,j)

A pixel is red if color_full(i,j,1) is large, and color_full(i,j,2) and color_full(i,j,3) are both small. In practice reflections and shadows give an object that appears ‘red’ to our brains a fairly large range of RGB values on an image.

[Figure: the (i,j) pixel coordinate system, with pixel (1,1) at the top left corner of the image and the quadcopter in view]


The variable ‘depth_image’ is a 424x512 matrix that specifies the distance of every pixel in the image from the Kinect sensor (an integer number between 0 and 4000). A value of 0 in a pixel indicates that a valid depth reading for that pixel could not be acquired; a value of 4000 means the object in the pixel is out of range of the sensor. You can think of the (i,j) indices of the pixels as coordinates, with i=1, j=1 in the top left corner of the image, as shown in the picture: i specifies the vertical position, and j specifies the horizontal position. Note that because it is used for gaming, the Kinect does not behave quite like a camera: it records the mirror image of what the camera actually sees. A pixel on the left of the image is actually to the right of the Kinect if you face in the direction the sensor is pointing. The kinect updates its images roughly 15-20 times a second, and the code has to wait until this process has been completed before it can acquire the data.

A.2 Cropping the color image

It takes too long to search the entire color image for red pixels. Instead, a rectangular region of the image that is expected to contain the quadcopter is first cropped out using the lines

xlo = max(current_color_coords(1)-search_window_size(1),1);
xhi = min(current_color_coords(1)+search_window_size(1),c_width);
ylo = max(current_color_coords(2)-search_window_size(2),1);
yhi = min(current_color_coords(2)+search_window_size(2),c_height);
color = imcrop(color_full,[xlo,ylo,xhi-xlo,yhi-ylo]);

Here, color is a smaller matrix (approx. 250x150 pixels) of RGB values that gets searched for the quadcopter. The vector current_color_coords specifies the image coordinates of the expected location of the quadcopter. This is continuously updated as the quadcopter moves around to keep the cropped box in the right place.
A.3 Finding the red ball in the color image

Red pixels are found using the following line of code:

[row, col] = find(color(:,:,1)<color_vals(1)+color_range(1) ...
    & color(:,:,1)>color_vals(1)-color_range(1) ...
    & color(:,:,2)<color_vals(2)+color_range(2) ...
    & color(:,:,2)>color_vals(2)-color_range(2) ...
    & color(:,:,3)<color_vals(3)+color_range(3) ...
    & color(:,:,3)>color_vals(3)-color_range(3) );

Here, row is a vector containing the i index of every red pixel, and col is a vector containing the j index of every red pixel. The vectors color_vals and color_range specify the range of RGB values that are expected to be part of the red ball. You can then find the (i,j) coordinates of the centroid of the red area as follows:

i_centroid = round(median(row));
j_centroid = round(median(col));



A.4 Cropping the depth image

Next, we need to find the quadcopter in the depth image. The first step is to crop out a piece of the depth image that will (hopefully) contain only the quadcopter and some background. We know that the quadcopter will be the object that is closest to the kinect in the cropped image (because if there were anything between the quadcopter and the kinect, the sensor would not see the quadcopter!), so it can be identified easily.

The following procedure is used to crop the depth image

(i) Find the approximate location of the quadcopter in the depth image from the color coordinates. This is a bit trickier than it seems, because (a) the depth image has a different number of pixels from the color image, and (b) the lens for the depth sensor is not located in the same place as the camera, and so does not see exactly the same image. It is possible to convert from one location to the other very precisely (the Kinect software has a built-in function that does this), but the calculation is too slow and unreliable for our purposes. Instead the camera coordinates are mapped to depth coordinates using a quick and dirty mapping:

xoffset = floor(-70*(329+1380-xav)/1380 + 35*(xav-329)/1380);
yoffset = floor(22*(13+1053-yav)/1053 - 18*(yav-13)/1053);
depthCoords = ...
    round([xav*d_width/(c_width)+xoffset, yav*d_height/(c_height)+yoffset]);

(ii) Compute the coordinates (ilow,jlow), (ihigh,jhigh) of two corners defining a rectangular region around the cropped image

(iii) You can then crop the depth image using the following MATLAB commands:

rowrange = (ilow:ihigh)';
colrange = (jlow:jhigh)';
cropped_depth = depth_image(rowrange,colrange);
cropped_depth(cropped_depth == 0) = 4000;

Here, the variable cropped_depth is a matrix with size (ihigh-ilow+1)x(jhigh-jlow+1). The matrix stores data in exactly the same way as the full depth image, except that the pixel cropped_depth(1,1) is located at the top left hand corner of the cropped region. The last line puts any pixels with invalid depth out of range (this removes them from the calculation in Section A.5).



A.5 Calculating the pixel coordinates of the quadcopter in the cropped depth image

We can now detect the coordinates of the quadcopter as follows:

(i) Find the distance of the point in the cropped depth image that is closest to the kinect:

minD = min(min(cropped_depth));

(ii) Find all the pixels in the depth image that have depth values between minD and minD+L, where L is the approximate size of the quadcopter (in mm), as follows:

[row,col] = find(cropped_depth<minD+L);

(iii) Find the average depth value of all the pixels identified in step (ii). We could do this with a loop, but that is too slow. Instead, we use the following two lines of MATLAB:

index = sub2ind(size(cropped_depth),row,col);
quadcopter_depth = round(mean(cropped_depth(index)));

(iv) Calculate the i,j coordinates of the centroid (as in Section A.3) relative to the corner of the cropped image.

(v) Correct the result of (iv) to find the coordinates relative to the top left corner of the full depth image.

(vi) The results of (iii) and (v) now specify the depth coordinate and the two pixel coordinates of the quadcopter.
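The sub2ind/mean trick above is just vectorized indexing; the same averaging looks like this in NumPy, for comparison (illustration only – the toy depth array below is made up):

```python
import numpy as np

# Toy cropped depth image (mm); 4000 marks out-of-range pixels
cropped_depth = np.array([[4000,  900,  910],
                          [ 950, 4000, 4000]])
L = 100                              # approximate quadcopter size (mm)

minD = cropped_depth.min()           # closest point to the Kinect
mask = cropped_depth < minD + L      # pixels belonging to the quadcopter
quad_depth = int(round(cropped_depth[mask].mean()))   # averages 900, 910, 950
```

The boolean mask plays the same role as the index vector built by sub2ind: both select the quadcopter pixels in one vectorized operation instead of a slow element-by-element loop.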

A.6 Calculating the physical coordinates of the quadcopter

Finally, we need to convert the pixel coordinates found in step A.5 into actual physical coordinates. It’s better to get a fast, approximate answer than a slow, highly accurate one, because we need to do the computations as fast as possible.

• We use the coordinate system shown in the figure: x will specify the distance from the Kinect2; z will be vertically upwards, and y will be the 3rd coordinate (using right handed Cartesian coordinates).

• The depth sensor measures the distance d of the quadcopter from the sensor, in millimeters. It is accurate enough to assume $x \approx d$.

• We can convert from (i,j) pixel coordinates to (y,z) physical coordinates by idealizing the Kinect as a ‘pinhole camera’. Light-rays pass from the object, through a pin-hole, to a sensor a distance w behind the pin-hole. If we take y=z=0 when the quadcopter is at the center of the image (j=256, i=212), this suggests that we could set

$$y \approx d(j - 256)/w \qquad z \approx -d(i - 212)/w$$

(the signs in these formulas are confusing – it’s best to use the i,j coordinate system shown in the figure in Sect 5.1 to work them out).

• We can do a simple experiment to measure the value of w (e.g. put a red object in a known position, and record the values of i and j).
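Putting A.6 together, the conversion is just a few multiplications. A sketch (Python for illustration – the value of w is a placeholder you would replace with your own measurement, and the signs should be checked against the figure as noted above):

```python
def pixels_to_physical(i, j, d, w=365.0):
    """Convert depth-image pixel indices (i, j) and a depth reading d (mm)
    into physical (x, y, z) coordinates in mm. w is the effective pinhole
    distance in pixels (placeholder value - measure your own)."""
    x = d                      # depth reading ~ distance from the Kinect
    y = d * (j - 256) / w      # horizontal offset from the image centre
    z = -d * (i - 212) / w     # i increases downward, z points upward
    return x, y, z
```

A point at the image centre maps to (d, 0, 0), which gives a quick sanity check on whatever value of w you measure.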

A.7 Fail-safe features

In addition to computing the coordinates, the code must do two more things:

(i) It must set the variable FAIL to 1 if the quadcopter is not detected;
(ii) It must return an error message if the quadcopter is about to fly out of the range of view of the Kinect2.

These error messages will be sent to the Python code that communicates with the quadcopter, which will power down the motors and (hopefully) bring the quadcopter to a reasonably controlled crash. The code has failed to detect the quadcopter if the vectors returned by [row,col]=find(…..) in Section A.3 are empty. We use the matlab function isempty(vector) to detect this. The quadcopter is about to fly outside the range of the kinect if d gets too large or small, or if i or j get close to the edge of the image. By default an out-of-range signal is sent if d falls below 0.6m or exceeds 2.5m, or if i or j come within 10 pixels of the edge of the depth image.

Appendix B: Calculating the quadcopter velocity

The PID controller needs to know the velocity of the quadcopter, in addition to its position. We could estimate this by recording the position at successive time intervals, and then using $\mathbf{v} = \Delta\mathbf{x}/\Delta t$, but this produces rather noisy data. We can do a better job by using ideas from the vibrations section of the course to implement a simple digital filter, which also gives us an estimate of the quadcopter velocity (this produces a version of a ‘Kalman filter’). When we analyzed the behavior of a forced spring-mass system, we found that:

• If the force has a high frequency, the mass doesn’t move;
• If the force has a low frequency, the mass displacement is proportional to the force;
• In between, the system resonates, but if the system is critically damped, the amplification at the resonant frequency is small.

The critically damped spring-mass system therefore behaves just like a low-pass filter: it responds to low-frequency excitation, but high-frequency forcing has no effect. Following this idea, we can use the equation of motion for a critically damped, forced, spring-mass system to create a simple low-pass filter. Specifically, let $\mathbf{x}$ be the measured position vector, and let $\mathbf{x}_F$ be the filtered signal. Then:

$$\frac{1}{\omega_{nF}^2}\frac{d^2\mathbf{x}_F}{dt^2} + \frac{2}{\omega_{nF}}\frac{d\mathbf{x}_F}{dt} + \mathbf{x}_F = \mathbf{x}$$

We can choose the natural frequency $\omega_{nF}$ to determine what frequencies will be filtered out. If we were given a signal $\mathbf{x}$, we could solve this equation to produce a filtered signal $\mathbf{x}_F$. But our problem is a bit more complicated: we only know $\mathbf{x}$ at discrete intervals of time (whenever the Kinect is able to provide them). To deal with this, we need to borrow some ideas that we use to solve differential equations (e.g. using the ODE solvers in MATLAB). First, we re-write the equation in a form that MATLAB could solve, by introducing the filtered velocity as a new variable:

$$\frac{d}{dt}\begin{bmatrix}\mathbf{x}_F \\ \mathbf{v}_F\end{bmatrix} = \begin{bmatrix}\mathbf{v}_F \\ \omega_{nF}^2(\mathbf{x}-\mathbf{x}_F) - 2\omega_{nF}\mathbf{v}_F\end{bmatrix}$$

Now, suppose that we are able to get signals from the Kinect at discrete times $t = 0, t_1, t_2, t_3, \ldots$ Suppose, in particular, that we already know the values $\mathbf{x}_F^n, \mathbf{v}_F^n$ at time $t_n$. We then get a new signal $\mathbf{x}$ from the Kinect at time $t_{n+1} = t_n + \Delta t$, and wish to find $\mathbf{x}_F^{n+1}, \mathbf{v}_F^{n+1}$. We can replace the equation of motion with a discrete approximation:

$$\frac{\mathbf{x}_F^{n+1} - \mathbf{x}_F^n}{\Delta t} = \mathbf{v}_F^{n+1} \qquad \frac{\mathbf{v}_F^{n+1} - \mathbf{v}_F^n}{\Delta t} = \omega_{nF}^2(\mathbf{x} - \mathbf{x}_F^{n+1}) - 2\omega_{nF}\mathbf{v}_F^{n+1}$$

We can easily solve these two equations to find

$$\mathbf{v}_F^{n+1} = \frac{\mathbf{v}_F^n + \Delta t\,\omega_{nF}^2(\mathbf{x} - \mathbf{x}_F^n)}{1 + 2\omega_{nF}\Delta t + \omega_{nF}^2\Delta t^2} \qquad \mathbf{x}_F^{n+1} = \mathbf{x}_F^n + \Delta t\,\mathbf{v}_F^{n+1}$$

We now have our filter. To use it:

(1) Measure the position $\mathbf{x}(0)$ of the quad at time t=0;
(2) Set $\mathbf{x}_F^n = \mathbf{x}(0)$, $\mathbf{v}_F^n = 0$;
(3) Acquire a new value of $\mathbf{x}$ from the kinect, and find the time increment $\Delta t$;
(4) Calculate $\mathbf{x}_F^{n+1}, \mathbf{v}_F^{n+1}$ using the formulas;
(5) Update $\mathbf{x}_F^n = \mathbf{x}_F^{n+1}$, $\mathbf{v}_F^n = \mathbf{v}_F^{n+1}$ and loop back to (3) for the next sample.

This algorithm is implemented inside the ‘getKinectCoords’ MATLAB function provided. It uses a rolloff frequency of 18 Hertz for the filter, which corresponds to the approximate sampling rate of the Kinect (so $\omega_{nF} = 2\pi \times 18$).


Appendix C: Designing the controller for thrust measurement

The Matlab script thrust_logger.m implements a simple feedback loop that uses readings from a load-sensor to continuously adjust the motor thrust to just balance the weight of the quadcopter. This appendix explains how the controller was designed, and gives equations that will help you change it.

Let $F(t)$ denote the force reading (in Newtons) from the load-cell. The reading is very noisy (because of high-frequency vibrations during the test). The noise can be reduced by putting the signal through a simple first-order filter. With this approach, the filtered signal $\hat F(t)$ is related to $F$ by

$$\frac{d\hat F}{dt} = K_F(F - \hat F) \qquad (1)$$

where $K_F$ is a constant (with units of s⁻¹). The smaller the value of $K_F$, the more the noise is reduced. But if the value of $K_F$ is too small, the filter will not respond quickly enough to changes in force that we want to detect. The code uses $K_F = 0.25$ for a 4 sec filter time constant. The motor thrust is controlled with a simple PI controller (there is no need for the ‘D’ term) by calculating the thrust setting from the filtered force:

$$\tau = K_P(F^* - \hat F) + K_I e \qquad e = \int_0^t (F^* - \hat F)\,dt \qquad (2)$$

where $F^*$ is the desired value for the force sensor reading. Of course, in our application we are trying to make $F^* = 0$, so the motors balance the quadcopter weight. To design the feedback controller, we need to predict how $\hat F$ varies with time. Recall that the actual thrust produced by the motors is roughly proportional to $\tau$:

$$F = \beta\tau \qquad (3)$$

With some algebra, we can use this to get an equation of motion for $\hat F$. Differentiating equation (1) with respect to time and eliminating $F$ using equation (3) gives

$$\frac{d^2\hat F}{dt^2} = K_F\left(\beta\frac{d\tau}{dt} - \frac{d\hat F}{dt}\right) \qquad (4)$$

Now differentiate (2) with respect to time and substitute for $d\tau/dt$ in (4):

$$\frac{d^2\hat F}{dt^2} = K_F\beta K_I(F^* - \hat F) - K_F\beta K_P\frac{d\hat F}{dt} - K_F\frac{d\hat F}{dt} \qquad (5)$$

We can re-arrange this to get a standard ‘Case 3’ vibration equation

$$\frac{1}{K_F\beta K_I}\frac{d^2\hat F}{dt^2} + \frac{1+\beta K_P}{\beta K_I}\frac{d\hat F}{dt} + \hat F = F^* \qquad (6)$$

The controller in the thrust_logger.m script was designed as follows:

• We choose $K_F = 0.25$ for a 4 sec filter time constant (so the filter takes about 5 secs to respond to a step change in force);
• We choose the natural frequency $\omega_n = \sqrt{K_F \beta K_I} = 0.3536$ in eq (6), which gives $\beta K_I = 0.5$;
• We choose the damping factor $\zeta = 0.88$ in eq (6) to give a slightly under-damped response.

This gives $\zeta = \frac{1+\beta K_P}{2}\sqrt{\frac{K_F}{\beta K_I}} = 0.88$, and so $\beta K_P \approx 1.5$.
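These numbers can be checked with a few lines of arithmetic (Python, just reproducing the algebra above):

```python
import math

K_F = 0.25                        # filter constant (4 s time constant)
beta_KI = 0.5                     # product beta*K_I used in the script
w_n = math.sqrt(K_F * beta_KI)    # natural frequency of eq (6)
zeta = 0.88                       # chosen damping factor

# Writing eq (6) in standard form gives (1 + beta*K_P)/(beta*K_I) = 2*zeta/w_n:
beta_KP = 2 * zeta * beta_KI / w_n - 1
# w_n comes out near 0.3536 and beta_KP near 1.5, matching the values above
```

Repeating this calculation with your own choices of time constant and damping factor is the quickest way to retune the thrust-stand controller.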

If you like, you can tweak these values to change the behavior of the controller.