[IEEE 2010 IEEE International Conference on Intelligent Systems and Knowledge Engineering (ISKE) - Hangzhou, China (2010.11.15-2010.11.16)]

Multilevel Fuzzy Navigation Control Scheme Applied to a Monitoring Mobile Robot

Erick Rojas-Ramírez División de Estudios de Posgrado e Investigación

Instituto Tecnológico de Toluca Metepec, Edo. de México, México

[email protected]

Jorge S. Benítez-Read Gerencia de Ciencias Aplicadas

Instituto Nacional de Investigaciones Nucleares La Marquesa, Ocoyoacac, Edo. de México, México

[email protected]

Abstract— The design, construction, and real-time performance of a mobile monitoring system based on a Khepera mobile robot are presented. The functions performed by the system are: (a) line following, (b) obstacle avoidance, (c) identification of test points along the path, (d) recognition of the mark (bar code) located at each test point, and (e) measurement of a physical parameter. For the navigation, an innovative multilevel fuzzy control scheme is implemented in which fuzzy sensor fusion, related to the perception of the environment, reduces the complexity of the navigation function. Other distinctive characteristics are the identification of test points by means of a Kohonen's neural network and the processing of a one-dimensional video signal for the recognition of landmarks located at each test point.

Keywords - mobile monitoring system; multilevel fuzzy control; neural nets; Khepera robot.

I. INTRODUCTION

The monitoring of some physical variables, such as radiation, corrosion, high humidity or high temperatures, is important in certain physical areas, especially if humans must enter these environments for maintenance or other related tasks. In some cases, the monitoring task itself is carried out manually by humans. The exposure of personnel to these variables might represent risks to their health and safety. In order to reduce these risks, it is desirable to have automatic or semi-automatic monitoring systems. In addition, the hardware platforms of these systems can be designed to reach areas of difficult physical access.

Monitoring systems based on mobile robots, in which sensors, associated electronics, and computer equipment have been integrated, offer the versatility of navigation in environments of interest with a certain degree of autonomy. This mobility advantage has led some researchers, over the last two decades, to focus on the development of control techniques and strategies to solve the mobile robot navigation problem in real non-structured or partially structured environments. In [5], an obstacle avoidance neuro-fuzzy controller is presented, in which the membership function parameters of the fuzzy controller are modified through supervised learning. With respect to the application of AI techniques in mobile robotics, Cuesta and Ollero [4] designed a virtual perception function to control the direction and speed of a Romeo-3R robot. Fuzzy logic has been reported, for instance, in [3] for the orientation and motion control of a mobile robot, and in [1] to arbitrate behaviors in autonomous vehicles. Artificial neural networks have also been used on navigation problems [9]. Likewise, an implementation of a genetic algorithm that uses an adaptation function to self-tune the membership functions of another obstacle avoidance fuzzy system is reported in [7]. Finally, a sensorial processing technique, known as the general perception vector, is used to obtain information about the robot's surroundings [2].

In this paper, the design, construction, and testing of a monitoring system based on a Khepera mobile robot are presented. The functions performed by the system are: (a) line following, (b) obstacle avoidance, (c) identification of test points along the path, and (d) recognition of the mark (bar code) located at each test point. For the task of navigation, an innovative cascade fuzzy control scheme is implemented in which fuzzy sensor fusion, related to the perception of the environment, has been used to reduce the complexity of the navigation function. In the first level, the information collected by the optical sensors is fused to reduce the number of variables utilized by the second level, which controls the motion of the robot wheels. The navigation of the robot considers the presence of obstacles along the robot path and incorporates an algorithm based on fuzzy logic to avoid them and to return to the path. Along the trajectory, several light sources indicate the points where measurements of certain variables must be taken.

As distinctive characteristics, the system identifies these test points using a Kohonen's neural network and processes a one-dimensional image, taken by a linear vision module, for the recognition of a bar code located at each test point. When a test point is detected, the robot orients toward it, departs from the line, and approaches the light source to a specified distance. Once the measurement of the variable of interest is completed, the data are transferred to a PC using a radio-frequency module mounted on the robot.

II. THE KHEPERA ROBOT

The Khepera is a mobile robot 5.5 cm in diameter (figure 1). The manufacturer (K-Team, see www.k-team.com) developed this robot for research and educational purposes. Algorithms for obstacle avoidance, object recognition using artificial vision, and robot arm manipulation can be designed and implemented on the robot's microprocessor platform. This robot includes

___________________________________ 978-1-4244-6793-8/10/$26.00 ©2010 IEEE


some modules for artificial vision, radio-frequency communication, and a robot arm.

Figure 1. Khepera robot

III. MULTILEVEL FUZZY NAVIGATION CONTROL

A serious problem in the design of fuzzy systems is the exponential increase in the number of rules with the number of input variables. For instance, for a system with 10 inputs (sensors) and two fuzzy sets defined for each input variable, the number of rules is 2^10, or 1024; a real-time implementation of such a control algorithm on the PC becomes impractical. To deal with this problem, a multilevel fuzzy control was proposed as an alternative. The main idea is to divide the input variables into different groups, where each group becomes the input to a low-dimensional fuzzy system [10], instead of sending all the inputs to a single high-dimensional fuzzy system. In this work, multilevel fuzzy control has been used to solve the navigation of the mobile robot.
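The rule-count arithmetic above can be checked directly. The short sketch below (illustrative, using the 3 + 3 + 2 + 4 sensor grouping described in Section III.A; the function names are not from the paper) compares a single flat rule base over all twelve robot sensors with the two-level decomposition:

```python
def rule_count(n_inputs, sets_per_input=2):
    """Number of rules in a complete fuzzy rule base."""
    return sets_per_input ** n_inputs

# Flat design: all 12 sensor inputs feed one fuzzy system.
flat = rule_count(12)                                    # 2**12 = 4096 rules

# Multilevel design: the first level fuses sensors into four groups
# (3 + 3 + 2 + 4 inputs); their four outputs feed the second level.
first_level = sum(rule_count(n) for n in (3, 3, 2, 4))   # 8 + 8 + 4 + 16 = 36
second_level = rule_count(4)                             # 2**4 = 16
multilevel = first_level + second_level                  # 52 rules

print(flat, multilevel)   # 4096 vs 52
```

The decomposition trades one intractable rule base for five small ones, which is what makes the on-board real-time implementation feasible.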

The navigation of the robot follows the flow diagram shown in figure 2. In this diagram, several fuzzy systems are used to describe the robot's surroundings, which in turn leads to the definition of the motion of each of the robot's motors.

Figure 2. Multilevel fuzzy navigation control scheme.

A. Sensor Fusion

The first level (see figure 2) performs a sensory fusion of eight proximity signals and four signals from the sensors that detect a black line drawn on the floor surface. These signals are the inputs to four fuzzy systems (LFP, RFP, BFP, and LP) that characterize the perception of the environment around the robot. The first fuzzy system, LFP (Left Fuzzy Perception), receives the signals from the proximity sensors x1, x2 and x3. The second system, RFP (Right Fuzzy Perception), uses the signals coming from the proximity sensors x4, x5 and x6. The third fuzzy system, BFP (Back Fuzzy Perception), processes the signals of the proximity sensors x7 and x8. Finally, system LP (Line Perception) receives the signals from the infrared reflective sensors x9, x10, x11 and x12. The first three perception systems describe approximately the surroundings of the robot (fuzzy description). The last system gives information about the line along which the robot moves. The four outputs, one per system, are used as the inputs to the following level.
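As an illustration, one of the four perception systems can be sketched as follows. The membership breakpoints and the max (fuzzy OR) combination are assumptions made for the sketch; the paper does not give the actual membership functions or rule base:

```python
def near(x):
    # Degree to which a Khepera proximity reading (0..1023) means
    # "obstacle near"; the breakpoints 200 and 800 are assumptions.
    return min(max((x - 200) / 600.0, 0.0), 1.0)

def lfp(x1, x2, x3):
    # LFP (Left Fuzzy Perception) sketch: max t-conorm (fuzzy OR),
    # so the left side is "blocked" to the degree that any of the
    # three left proximity sensors reads near.
    return max(near(x1), near(x2), near(x3))
```

Each of the four systems reduces its group of raw readings to a single degree in [0, 1], which is what keeps the second-level rule base small.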

B. Navigation Control

The second level receives the four signals generated by the sensor fusion layer, processes the information, and generates the signals that control the speed and direction of the robot motors. The algorithm corresponding to this second stage (shown in figure 3) selects one of four possible actions: line following, wall following, obstacle avoidance, or go to xy.

Figure 3. Navigation control scheme.
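A minimal sketch of this action selection, assuming simple thresholds on the four fused perception values (the actual second-level fuzzy rules are not given in the paper, so the thresholds below are purely illustrative):

```python
def select_behavior(lfp, rfp, bfp, lp):
    """Pick one of the four navigation actions from the fused
    perception values (each in [0, 1]); thresholds are assumed."""
    if max(lfp, rfp, bfp) > 0.7:
        return "obstacle avoidance"   # something is very close
    if max(lfp, rfp) > 0.3:
        return "wall following"       # keep a lateral distance
    if lp > 0.5:
        return "line following"       # line visible under the robot
    return "go to xy"                 # no line: head to a stored point
```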

IV. LANDMARK DETECTION

A. Landmark Detection with Kohonen's Neural Net

This stage, known as the detection of test points, is in charge of searching for each test point along the robot's main trajectory. One infrared light source is located at each test point. IR sensors around the robot measure the amount of IR light.

Figure 4. Angle formed between the robot and the light source.

A Kohonen-type neural net [6] is used to orient and move the robot towards the light source. Using the array of six IR proximity sensors located on the front part of the robot, a computational algorithm that represents a Kohonen neural network was implemented on the robot processor. The function of this network is to detect the angular position between an infrared light source and the relative axis of the Khepera robot [8].

The neural network has a two-layer architecture. In the first layer (input layer), the inputs are the signals coming from the six photo-transistors of the proximity sensors located around the front of the robot. The second layer (competition layer) is composed of 90 neurons (see figure 5).

Figure 5. Unidimensional Kohonen’s neural net architecture.

B. Training

For the training phase, a data set is obtained by placing the robot at a distance of 10 cm from a light source and then rotating the robot 180 degrees while recording both the sensor vectors and the associated angles. During training, the network is adapted using (1).

w_j(t+1) = w_j(t) + a(t) h_{j,i}(t) [x(t) − w_j(t)],   (1)

where w_j(t+1) represents the updated weights of neuron j, and i is the winning neuron associated with the input vector x(t). From one cycle to the next, the effective width of the neighborhood function is reduced from 45 to 1 neighboring neurons (2).

h_{j,i}(t) = exp( −d_{j,i}^2 / (2σ^2) ),   (2)

where d_{j,i} represents the distance between the winning neuron and a neighboring neuron, and σ represents the effective width of the neighborhood. The learning rate a(t) is updated according to (3).

a(t) = a_0 exp(−t/ε),   (3)

where a_0 = 0.9 and ε = 1000.

C. Application for Angle Detection

Finally, the output of the neural net is used to compute the angle between the relative axis of the robot and the light source, as shown in (4).

φ(x) = 2 P_{i(x)} − 90°,   (4)

where P_{i(x)} is the position of the winning neuron i(x) in the competition layer. The average error is 2 degrees (the 90 neurons are distributed over the 180 degrees with a spacing of ±2 degrees). The maximum error, 4 degrees, occurs at the ends of the sensor array. Plots of the desired and network responses are shown in figure 6.
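Equations (1) through (4) can be gathered into a short training-and-decoding routine. The sketch below is a minimal reconstruction, not the authors' on-board implementation; the random weight initialization and input scaling are assumptions:

```python
import numpy as np

# Minimal 1-D Kohonen map matching Eqs. (1)-(4): 6 sensor inputs,
# 90 competition neurons, Gaussian neighborhood, exponentially
# decaying learning rate a(t) = a0 * exp(-t / eps).

rng = np.random.default_rng(0)
N_IN, N_OUT = 6, 90
W = rng.random((N_OUT, N_IN))      # one weight vector per neuron
a0, eps = 0.9, 1000.0              # values given in the paper

def train_step(x, t, sigma):
    i_win = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # winner
    d = np.arange(N_OUT) - i_win                           # lattice distance
    h = np.exp(-d**2 / (2 * sigma**2))                     # Eq. (2)
    a = a0 * np.exp(-t / eps)                              # Eq. (3)
    W[:] += a * h[:, None] * (x - W)                       # Eq. (1)
    return i_win

def angle_deg(p):
    """Map the winning-neuron position to an angle, Eq. (4)."""
    return 2 * p - 90
```

During training, σ would be shrunk from 45 down to 1 across the cycles, as stated above; at run time, the winning-neuron position is mapped by `angle_deg` to an angle with the ±2 degree resolution reported in the paper.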

[Plot: winning neuron (0 to 90) versus angle to the light source (−π to π rad); the ANN response closely follows the desired response.]

Figure 6. Response of the neural net after 1000 iterations.


V. RECOGNITION OF TEST POINTS

To perform this task, the K213 linear vision module of the Khepera is used. Each test point has a bar code that identifies the point. The scanning of a bar code using the vision module produces a matrix of 1x64 pixels. This information is then processed pixel by pixel with the gradient equation:

G_i = p_i − p_{i+1},   (5)

where p_i is the gray value of the i-th pixel and G_i is the resultant gradient for the i-th pixel. Then, a threshold is applied to each gradient value as follows:

G_i < 40 → G_i = 0,   (6)

The purpose of this threshold is to reduce the noise inherently attached to the signals. Figure 7 shows the results of the recognition of test points. The values obtained from this process are used to decode the type of landmark.
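Equations (5) and (6) amount to a pixel-difference gradient followed by a noise gate, which can be sketched as follows. The scan line below is synthetic, and applying the threshold to the gradient magnitude is an assumption (the paper states the condition only as G_i < 40):

```python
def edge_gradients(pixels, threshold=40):
    # Eq. (5): gradient between adjacent pixels.
    g = [pixels[i] - pixels[i + 1] for i in range(len(pixels) - 1)]
    # Eq. (6): suppress small gradients as noise (magnitude assumed).
    return [gi if abs(gi) >= threshold else 0 for gi in g]

# A synthetic 1x64 scan line: white (200) / black (50) bars.
scan = [200]*10 + [50]*5 + [200]*5 + [50]*5 + [200]*39
grad = edge_gradients(scan)
n_edges = sum(1 for gi in grad if gi != 0)
print(n_edges)   # 4 bar transitions survive the threshold
```

The surviving edge positions give the start pixel, end pixel, and bar widths from which the binary code is decoded.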

[Plot: gray scale versus pixel index (10 to 60) for a scanned bar code. Decoded data: width 35, number of edges 6, start pixel 13, end pixel 47, binary code 1 0 1 0 1, decimal value 21.]

Figure 7. Recognition of test points.

VI. GRAPHICAL USER INTERFACE

The communication between the Khepera robot and the PC is established through a GUI (Graphical User Interface). The speed of communication through a standard RS-232 link allows the monitoring of the robot's sensors: temperature, proximity, intensity of ambient light, line on the floor, voltage applied to each motor, and the scanned image of a bar code. Likewise, it is possible to visualize the angular position of the infrared light sources with respect to the robot, obtained with the Kohonen neural network; this value is displayed by means of a compass. The idea is to provide the user with enough tools to control the robot navigation, as well as to display a plot of the variable of interest in real time.

Figure 8. User Interface for monitoring and control of the Khepera robot.

VII. RESULTS AND CONCLUSIONS

The use of a sensor fusion layer, which processes the data in a fuzzy manner, reduces the amount of information to be processed by the stage that controls the speed and direction of the robot. The complete scheme is easily implemented on the robot processor. The processing is fast enough to prevent delays that could damage the robot. In addition, the sensor fusion layer reduces the disturbances caused by noise, mainly originated by the light sources, thus giving a certain degree of robustness to the system: the robot moves smoothly. On the other hand, the response of the neural network is very close to the ideal one. In fact, part of the error comes from the non-homogeneous placement of the sensors. Nevertheless, the network offers a resolution of ±2 degrees with an execution time of 10 ms, which is ideal for applications that require real-time processing. At this moment, only temperature is measured by the prototype. The designed temperature measuring circuit is based on the AD594 sensor from Analog Devices, with an operating range from 0 to 125°C and near-linear behavior. The measured value is sent to the PC using a radio-frequency module. One of the purposes of this work has been to show the feasibility of applying this kind of system in critical facilities, for instance, in restricted areas of a nuclear reactor or a gamma irradiation facility, to monitor levels of radiation, pressure, temperature, and humidity, among others.

REFERENCES

[1] Abreu, Antonio and Correia, Luis; "Fuzzy behaviors arbitration in autonomous vehicles," Proceedings of the Portuguese Meeting in Artificial Intelligence, EPIA99, Vol. 1695 of LNCS, 1999, pp. 237-251.

[2] Braunstingl, R. and Sans, P.; “Fuzzy logic wall following of a mobile robot based on the concept of general perception,” Proceedings of the 7th International Conference on Advanced Robotics, Vol 1, 1995, pp. 367-376, Spain.

[3] Cardoso, Filipe and Custódio, Luis; "Fuzzy logic steering controller for a guided vehicle," Proceedings of the IEEE 7th Mediterranean Electrotechnical Conference, MELECON '94, 1994, pp. 711-714, Turkey.

[4] Cuesta, Federico and Ollero, Aníbal; “Intelligent mobile robot navigation,” Springer tracts in advanced robotics, Springer-Verlag, Vol 16, 2005.

[5] Floreano, D., Godjevac, J., Martinoli, A., Mondada, F. and Nicoud, J. D.; "Design, control, and application of autonomous mobile robots," Swiss Federal Institute of Technology in Lausanne, 2000.

[6] Haykin, Simon; "Neural networks: A comprehensive foundation," MacMillan, ISBN 002352781-7, 1999.

[7] Lee, Seung-Ik and Cho, Sung-Bae; "Emergent behaviors of a fuzzy sensory-motor controller evolved by genetic algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 31, No. 6, 2001, pp. 919-929.

[8] Malmostrom, K. and Munday, L.; "A simple robust robotic vision system using Kohonen feature mapping," Proceedings of the 2nd IEEE Australia and New Zealand Conference on Intelligent Information Systems, 1994, pp. 135-139.

[9] Sugihara, K., Tabuse, M., Shinchi, T. and Kitazoe, T.; “Control system for the Khepera robot by a neural network with competition and cooperation,” Artificial Life and Robotics, Springer Japan, Vol. 5, Num. 6, 2001, pp. 26-28.

[10] Wang, Li-Xin; "A course in fuzzy systems and control," Prentice Hall, Inc., Englewood Cliffs, N.J., 1996.