Report Project 3


    Ashish Lal

    Project 3

    MTRN3100


    10/22/2010


Contents

1. Introduction
2. Generating a real map from a mobile robot's laser scans
3. Path finding
   a) Dijkstra's Algorithm
4. Fail-Safe Methods for the Robot
   a) Implementation of the Barrier and Rings Method
5. Discussion
6. Conclusion
7. Appendix
   a) Appendix 1
   b) Appendix 2
   c) Appendix 3


[Fig 1 & Fig 2: occupancy grid of the scanned area, 300 × 300 cells at 10 cm resolution, colour scale 0–200]

1. Introduction

The objective of this project is to design a map which is built from real-time simulated data from the robot's laser scans. A path-finding technique should also be developed, where a user can specify a desired destination, along with fail-safe measures. It is also essential to take the robot's size into account.

2. Generating a real map from a mobile robot's laser scans

To create a 2D map (OG grid) for the robot, data is collected from the LMS200 laser. This data is processed in real time, displaying the obstacles and the safe and unsafe areas on the map for the robot. Fig 1 and Fig 2 (the same map with different colour maps) show the complete occupancy grid of the area, with obstacle information, once the robot has moved around the room scanning; the grid resolution is 10 cm by 10 cm per cell and the grid dimensions are 300 by 300 cells.

The occupancy grid (Fig 1 & Fig 2) is obtained after a devised algorithm is applied. When a single beam of the laser hits an obstacle it stops there, and that is how an obstacle is determined.

Since the area the laser travelled through before hitting the obstacle is seen as empty, a method is devised to clear this area in the OG.

Firstly, while considering the robot to be stationary (Fig XX), a vector is defined with a specified angle. This vector is then accessed in a loop where l (the y value) is increased and decreased; in this way all the cells in front of the reference cell can be accessed and given a value depending on whether they are obstacle, unknown or known areas on the map.

Fig 2: Shows the vector from a stationary robot.
Fig 3: Shows the vector from a mobile robot.

However, since the robot is moving in 2-D space, problems occur when the robot scans a location that is behind it: the actual vector (i.e. the vector normally defined from the laser sensor to the location of the obstacle it scanned) becomes inverted, points the wrong way and clears spaces behind the robot. To overcome this dilemma, the directions of the vectors are checked before the spaces are declared as clear. If a vector is negative, its direction is inverted to the correct orientation, as shown in Fig 3, so that it points from the reference point toward the obstacle.
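The same clearing idea can also be written by stepping from the robot's position toward each scanned point, which fixes the beam's direction by construction and therefore avoids the inverted-vector problem described above. The short sketch below is only an illustration of that idea, not the project's code (which is in Appendix 1); the example pose, the scanned point, the free/unknown cell values (0 and 100) and the row/column indexing are assumptions.

% Sketch: clear the cells a laser beam passes through and mark its end cell.
OG = 100*ones(300,300);                    % 100 = unknown (assumed value)
Lx = 0;  Ly = 5;                           % example robot position (metres)
px = 3;  py = 8;                           % example scanned obstacle point (metres)
nStep = ceil(10*hypot(px-Lx, py-Ly));      % roughly one step per 10 cm cell
for s = 1:nStep
    t  = s/nStep;                          % fraction of the way toward the obstacle
    cx = floor((Lx + t*(px-Lx))*10 + 150); % same world-to-grid mapping as Appendix 1
    cy = floor((Ly + t*(py-Ly))*10);
    if cx > 0 && cx <= 300 && cy > 0 && cy <= 300
        if s < nStep
            OG(cy,cx) = 0;                 % beam passed through this cell: clear it
        else
            OG(cy,cx) = 200;               % beam ended in this cell: mark the obstacle
        end
    end
end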


[Figure: interactive occupancy grid display with a 'Click Here to END' button and a 'Click at desired Initial Point' prompt, two panels, 300 × 300 cells, colour scale 0–200]

    3. Path finding

a) Dijkstra's Algorithm

For a given map, the algorithm finds the path with the lowest cost (i.e. the shortest path) between two specified points. However, this technique accounts for points only; designing fully functional autonomous software for the robot requires considering the volume of the trolley itself. There are two ways of applying a solution: one is to modify the map, taking the robot's size into account, and the other, more difficult, approach is to modify the algorithm. Modifying the algorithm would require the search to be performed on a series of different cells of the occupancy grid (those that define the robot), increasing the time taken to find a feasible path to the desired destination. Therefore the more feasible approach is to construct barriers around the obstacles in the OG.
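To make the search concrete, the sketch below is a minimal re-implementation of Dijkstra's algorithm over the 300 × 300 OG. It is not the project's path-finding code: the function name gridDijkstra, the 4-connected neighbourhood, and the cost rule (a step cost of 1 plus the value of the cell being entered, with cells of 199 or more treated as impassable, matching the barrier value used in Section 4) are assumptions made for illustration.

function path = gridDijkstra(OG, start, goal)
% Dijkstra's algorithm on a 4-connected occupancy grid.
% start and goal are [row col]; cells with OG >= 199 are impassable and
% other OG values (e.g. ring costs) are added to the step cost.
    [nr, nc] = size(OG);
    dist    = inf(nr, nc);                  % best known cost-to-come
    prev    = zeros(nr, nc);                % predecessor as a linear index
    visited = false(nr, nc);
    dist(start(1), start(2)) = 0;
    moves = [1 0; -1 0; 0 1; 0 -1];         % 4-connected neighbours
    while true
        work = dist;  work(visited) = inf;
        [d, idx] = min(work(:));            % cheapest unvisited cell
        if isinf(d), path = []; return; end % goal unreachable
        [r, c] = ind2sub([nr nc], idx);
        if r == goal(1) && c == goal(2), break; end
        visited(r, c) = true;
        for m = 1:4
            rn = r + moves(m,1);  cn = c + moves(m,2);
            if rn < 1 || rn > nr || cn < 1 || cn > nc, continue; end
            if OG(rn, cn) >= 199, continue; end           % obstacle or barrier
            nd = d + 1 + OG(rn, cn);                      % ring value adds cost
            if nd < dist(rn, cn)
                dist(rn, cn) = nd;
                prev(rn, cn) = sub2ind([nr nc], r, c);
            end
        end
    end
    % walk back from the goal to recover the path
    path = goal;
    p = prev(goal(1), goal(2));
    while p ~= 0
        [r, c] = ind2sub([nr nc], p);
        path = [r c; path];
        p = prev(r, c);
    end
end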

    4. Fail-Safe Methods for the Robot

    a) Implementation of the Barrier and Rings Method

Fig 4: The OG with barriers (199) only.
Fig 5: The OG with rings and barriers.


The implementation of the barriers and rings (fig 5) occurs before the path-finding function is called. The obstacle (200) and the barrier (199) are set in the OG with a high value so that under no circumstances will the path-finding algorithm travel through that area. To be cautious, the implementation of the rings is also essential, since these provide an additional safety margin for the robot when navigating around an obstacle. The level of safety can be set according to how many rings surround the barriers. The source code for the implementation of the barriers and rings is provided in Appendix 2 and Appendix 3 respectively.

The dimensions of the barrier are consistent with the dimensions of the robot. Specifically, the barrier is half the width of the robot, because this is the distance from the centre of the robot to its edge. However, if the robot is not a perfect square, it can still get very close to walls or obstacles. To avoid this situation the maximum dimension of the robot could be used instead, but this would stop the robot going through narrow corridors where it would actually fit in reality (e.g. navigating a crowded hotel hallway). A better solution is to implement rings (fig 6 & 7) around the barriers, which have a cost low enough to form a feasible path when there is no other path; a toy illustration is sketched after the figure captions below.

Fig 6: Shows the path generated on the OG.
Fig 7: Shows the robot moving through the narrow space.
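As a toy illustration of this point (not data or code from the project), the snippet below builds a small grid containing a wall with a narrow gap, marks the cells beside the wall with a ring value of 90, and calls the illustrative gridDijkstra function sketched in Section 3. The path is still found through the gap; the ring cells merely make it expensive.

OG = zeros(20,20);
OG(10,[1:8 12:20]) = 200;                  % a wall with a narrow gap at columns 9-11
OG(9,:)  = 90;                             % ring cells beside the wall: costly but legal
OG(11,:) = 90;
path = gridDijkstra(OG, [2 10], [18 10]);  % start above the wall, goal below it
% the returned path squeezes through the gap instead of being blocked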


Fig 8: Shows the topographic map, i.e. the cost function for the chosen destination, where red = expensive and blue = cheap route.
Fig 9: A zoomed-in version of the topographic map.

The cost-to-go function in Fig 8 & 9 shows, as a topographic map, the cost of travelling to each location. The scale is set so that blue indicates places that are easy to reach and therefore cheap, while red indicates places that are hard to reach, i.e. more expensive. The blue areas lie close to the chosen point, since this area is easier and faster to reach and the robot has to avoid fewer obstacles. The red areas are further from the robot's origin, since reaching those points takes longer and requires avoiding a greater number of obstacles.
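A cost-to-go surface of this kind can be reproduced with a simple relaxation over the grid. The sketch below is illustrative only: the destination cell, the toy obstacle block used when no OG exists in the workspace, the uniform step cost (ring values are ignored here for brevity) and the display calls are assumptions, not the project's code.

% Sketch: cost-to-go to a destination cell, shown with blue = cheap, red = expensive.
if ~exist('OG','var')
    OG = zeros(300,300);  OG(100:120,140:160) = 200;   % toy obstacle block
end
dest = [80, 140];                            % example destination cell [row col], in free space
cost = inf(size(OG));
cost(dest(1), dest(2)) = 0;
free = OG < 199;                             % traversable cells (obstacles/barriers excluded)
changed = true;
while changed
    % cost of reaching each cell from its four neighbours in one step
    up    = [inf(1,size(cost,2)); cost(1:end-1,:)];
    down  = [cost(2:end,:); inf(1,size(cost,2))];
    left  = [inf(size(cost,1),1), cost(:,1:end-1)];
    right = [cost(:,2:end), inf(size(cost,1),1)];
    best  = min(min(up,down), min(left,right)) + 1;
    newCost = min(cost, best);
    newCost(~free) = inf;                    % obstacles stay unreachable
    changed = any(newCost(:) < cost(:));
    cost = newCost;
end
show = cost;
m = max(show(isfinite(show)));
if isempty(m), m = 1; end                    % guard: nothing reachable
show(isinf(show)) = m;                       % cap unreachable cells for display
figure; imagesc(show); axis equal tight; colorbar;
colormap(jet);                               % blue = cheap, red = expensive
title('Cost-to-go to the destination');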

5. Discussion

The software developed for the robot is capable of running on a real-life robot; however, its processing time is not as fast as would be expected in a real-world industrial application. As development of this software continues in the future, one could look into making Dijkstra's algorithm itself consider the robot's size; this would be a difficult task, but if successful it would increase the efficiency of the robot.

Another limitation of this software is that it does not take into account the driving and manoeuvring capabilities of the robot, and it only produces a straight-lined path (fig 9).

6. Conclusion

The project was a success: a consistent map was generated from the mobile robot's laser scans, and Dijkstra's algorithm was used on this map for path finding. Fail-safe measures, such as applying barriers and rings around obstacles, were successfully implemented and fully functional. The robot's size was also taken into consideration, and the robot could travel through narrow spaces within the map without causing any damage to itself or its surroundings.

    7. Appendix

a) Appendix 1
This source code outlines the reversing of the vector and the conversion of the map to the global axis.

timeLaser = double(Data.time);
a = MyAPI.ReadPose() ;              % read all new position records since last reading
if a.n>1,                           % any new records?
    % yes, there are 'n' new readings
    timePose = a.Data(1:a.n,1) ;    % timestamps of the readings.

    % estimate (in this case just interpolate) the robot's pose at
    % the time of the laser measurement
    if timeLaser<timePose(1),   continue ; end ;
    if timeLaser>timePose(end), continue ; end ;

    % [Lx,Ly,Lh] are (x,y,heading) at time=timeLaser
    Lx = interp1(timePose,a.Data(1:a.n,2),timeLaser) ;
    Ly = interp1(timePose,a.Data(1:a.n,3),timeLaser) ;
    Lh = interp1(timePose,a.Data(1:a.n,7),timeLaser) ;

    % however the position corresponds to the centre of the back
    % axis, ==> I need to translate it to the laser position (front
    % of the robot)
    % translate the point to the laser position (H meters ahead of
    % the robot's position)
    Lx = Lx + H*cos(Lh) ;


    Ly = Ly + H*sin(Lh) ;
else,
    continue ;
end;

% I convert the local laser readings --> global.
aa2 = aa+Lh-pi/2+OffsetLaserAngle ;
cosaa = cos(aa2) ;
sinaa = sin(aa2) ;
xx = Lx + Ranges.*cosaa ;
yy = Ly + Ranges.*sinaa ;
% xx[],yy[] are the laser's points expressed in the global
% coordinate frame
X = xx;     % laser in global
Y = yy;

for u=1:length(X)
    %define vector length of laser globally as the distance to the
    %obstacle minus the robot's current position
    l = abs(Y(u)-(Ly));
    %angle of laser vector also defined globally
    th = atan((X(u)-Lx)/(Y(u)-Ly));
    a=1;
    if Y(u)-(Ly) < 0.1
        %the scanned point lies behind the robot, so the vector is
        %inverted to the correct orientation (see Fig 3)
        a = -1;
    end
    OG_x = floor((Lx+a*l*tan(th))*10+150);
    OG_y = floor((Ly+a*l)*10);
    if OG_x>0 && OG_x<300 && OG_y>0 && OG_y<300
        %the cells the beam passed through are cleared and the cell in
        %which the beam stopped is marked as an obstacle (200), as
        %described in Section 2
    end
end


b) Appendix 2
This source code outlines the implementation of the barrier method.

wr=60; br=60;   % robot width and breadth; wg and bg (defined elsewhere) are the grid cell dimensions

%half the robot width/breadth expressed in grid cells
left_right=ceil((wr/wg)/2);
front_back=ceil((br/bg)/2);
i_i=1:left_right;
i_j=1:front_back;

%create barrier around obstacle
for b_i=1+left_right:300-left_right
    for b_j=1+front_back:300-front_back
        if OG(b_i,b_j) == 200
            %implement barrier
            %left and right
            OG(b_i+i_i,b_j)=199;
            OG(b_i-i_i,b_j)=199;
            %above and below
            OG(b_i,b_j+i_j)=199;
            OG(b_i,b_j-i_j)=199;
            %top left and top right
            OG(b_i-i_i,b_j+i_j)=199;
            OG(b_i+i_i,b_j+i_j)=199;
            %bottom left and bottom right
            OG(b_i-i_i,b_j-i_j)=199;
            OG(b_i+i_i,b_j-i_j)=199;
        end
    end
end
%barrier ends here

c) Appendix 3
This source code outlines the implementation of the rings method.

%% Analogue Bubble

%decrementer 3,2,1
for ring = 3:-1:1
    %analogue bubble: the closer the ring is to the barrier, the higher its cost
    if ring==1
        intermediate = 90;
    elseif ring == 2
        intermediate = 50;
    else
        intermediate = 10;
    end
    for r_i = 1+max(ring):300-max(ring)
        for r_j = 1+max(ring):300-max(ring)
            %if barrier
            if OG(r_i,r_j) == 199
                for k = -1:1
                    if OG(r_i-ring,r_j+k) < intermediate
                        OG(r_i-ring,r_j+k) = intermediate;
                    end
                    if OG(r_i+ring,r_j+k) < intermediate
                        OG(r_i+ring,r_j+k) = intermediate;
                    end
                    if OG(r_i+k,r_j-ring) < intermediate
                        OG(r_i+k,r_j-ring) = intermediate;
                    end


                    if OG(r_i+k,r_j+ring) < intermediate
                        OG(r_i+k,r_j+ring) = intermediate;
                    end
                end
            end
        end
    end
end   %Analogue Bubble ends here