Rapid Aerial Photo System for Precision Agriculture Application Using Unmanned Aerial Vehicle


  • 7/30/2019 Rapid Aerial Photo System for Precision Agriculture Application Using Unmanned Aerial Vehicle

    1/36

    Rapid Aerial Photo System for Precision Agriculture

    Application using Unmanned Aerial Vehicle

    Submitted in Fulfillment of the Requirements for the Course:

    EL8002 Topik Lanjut di Teknik Elektro (Advanced Topics in Electrical Engineering) - Semester II 2006/2007

    By

    Widyawardana Adiprawita

    NIM 33206007

    Supervised By:

    Prof. Dr. Ir. Adang Suwandi Ahmad

    NIP 130672118

    School of Electrical Engineering and Informatics

    Bandung Institute of Technology

    2007


    Table of Contents

    Table of Contents................................................................................................................ 2

    Introduction......................................................................................................................... 3

    Research Description .......................................................................................................... 4

    Problem Definition.......................................................................................................... 4

    Research Objective ......................................................................................................... 5

    Research Methodology ................................................................................................... 5

    Latest Progress .............................................................................................................. 7

    Agricultural Remote Sensing Basics .................................................................................. 8

    Remote Sensing . . . How You Can Use It On Your Farm............................................. 8

    The Electromagnetic Spectrum....................................................................................... 8

    Electromagnetic Energy and Plants ................................................................................ 9

    How Does Remote Sensing Work?............................................................................... 11

    Remote Sensing: The Complete Process ...................................................................... 13

    Near Infra Red in Simple Remote Sensing for Agriculture.............................................. 14

    What is Near Infrared?.................................................................................................. 14

    What does NIR tell us? ................................................................................................. 15

    The Digital Advantage.................................................................................................. 17

    Testing for IR Sensitivity.............................................................................................. 18

    NIR Images................................................................................................................... 19

    The Proposed System........................................................................................................ 21

    The Autopilot System................................................................................................... 22

    Control Algorithm..................................................................................................... 22

    Sensor Selection........................................................................................................ 22

    Autopilot Hardware Design and Prototyping ........................................................... 24

    Hardware in the Loop Simulation................................................................................. 25

    HIL Simulator Development...................................................................................... 26

    HIL Simulator Utilization......................................................................................... 29

    The Flight Planner Software......................................................................................... 31

    Preliminary Test and Concluding Remark........................................................................ 33

    References......................................................................................................................... 35


    Introduction

    Agriculture is one of the main sources of income in Indonesia. Most Indonesian

    citizens work in the agricultural sector. Despite the importance of agriculture in

    Indonesia, good agricultural practices are still lacking.

    One of the emerging practices in agriculture is "Precision Agriculture". Precision

    Agriculture refers to the use of an information and technology-based system for field

    management of crops. An information technology-based system helps the farmer make

    the right decision. This approach basically means applying the right amount of treatment at

    the right time and the right location within a field; that is the "precision" part. Farmers

    want to know the right amounts of water, chemicals, pesticides, and herbicides they

    should use, as well as precisely where and when to apply them.

    By using the tools of Precision Agriculture, farmers can specifically target areas of need

    within their fields and apply just the right amounts of chemicals where and when they are

    needed, saving both time and money and minimizing their impact on the environment.

    Irrigation is both difficult and expensive, and it gets even more difficult when the

    topography of the terrain is graded. Farmers have a tendency to over-irrigate, spending

    more time and money than is necessary. Often, farmers look at weather variables and then

    schedule irrigation based on that information. But if they had better information,

    they could use scientific models and equations to compute more precisely

    how much water their crop is using or how much more is needed. All of this requires

    an accurate map of the field. Much of the ability to implement precision agriculture

    is based on information technologies; in particular, global positioning and navigation and

    geospatial / remote sensing mapping and analysis.

    As mentioned before, one of the key technologies in precision agriculture is geospatial /

    remote sensing mapping and analysis. An optimum remote sensing system for precision

    agriculture would provide data as often as twice per week for irrigation scheduling and

    once every two weeks for general crop damage detection. The spatial resolution of the

    data should be as high as 2 to 5 square meters per pixel, with positional accuracy of within

    2 meters. Additionally, the data must be available to the farmer within 24 hours of


    acquiring them. Turnaround time is more important to farmers than data accuracy. They

    would gladly accept remote sensing measurements that are as poor as 75 percent accurate

    if they were assured of getting them within 24 hours of acquisition. Unfortunately, there

    are currently no Earth-orbiting satellites that can meet all of a precision farmer's

    requirements. This is where the Autonomous Unmanned Aerial Vehicle (UAV) will play

    its role.

    This research proposes a new kind of relatively low-cost autonomous UAV that will

    enable farmers to make just-in-time aerial photo mosaics of their crops. These aerial

    photo mosaics should be producible at relatively low cost and within the 24-hour

    acquisition constraint. The autonomous UAV will be equipped with a payload

    management system specifically developed for rapid aerial mapping. As mentioned

    before, turnaround time is the key factor, so accuracy is not the main focus (the aerial

    maps are not orthorectified). The system will also be equipped with special software to

    post-process the aerial photos into the mosaic aerial photo map.

    Research Description

    Problem Definition

    1. Aerial photos of crop fields are needed to enable farmers to make the right decision about the kind and amount of treatment at the right time and the right location within

    a field.

    2. Turnaround time of the aerial photo is more important to farmers than data accuracy. Usually the farmers need the information within 24 hours of

    acquisition.

    3. A system that enables the farmers to achieve a fast turnaround time for the aerial photo of the crop field is needed.

    4. Cost is an important matter; this includes a low first-time investment and low operational and maintenance costs.

    5. Ease of operation is also an important matter, considering the available quality of human resources.


    Research Objective

    1. Design and implement a UAV platform small enough to be operated from a typical crop field in Indonesia without the need for a special airstrip. The

    proposed launching method is by hand, so the UAV platform should be capable of short take-off and landing (STOL).

    2. Design and implement a low-cost autonomous autopilot navigation system that will be used to automatically navigate the UAV over the crop field to produce

    the mosaic aerial photo.

    3. Design and implement simple flight planning software that generates waypoints covering the crop field optimally.

    4. Design and implement a payload management system onboard the UAV that enables the automatic timing of digital camera shutter releases for aerial photo taking.

    5. Design and implement post-processing software that automates the mosaicking process of the aerial photos, so that the turnaround time requirement is fulfilled.

    6. For future development: development of specific payloads for precision agriculture other than a digital camera (such as biochemical sensors, environmental

    sensors, weather sensors, etc.).

    Research Methodology

    Development of Autonomous Unmanned Aerial Vehicle

    1. Airframe: the aerial platform that will be instrumented with the autonomous autopilot system and carry the mission-specific payload (automatic flight planner

    for low-altitude photography and a digital camera).

    2. Attitude, Heading and Position Reference System (AHPRS): the main reference input for the autonomous autopilot system. The AHPRS outputs Euler

    angles (roll, pitch and yaw), true-north absolute heading, and position (latitude,

    longitude and altitude).

    3. Autopilot System: the main controller of the airframe. It consists of two main parts. The first is the low-level control system that governs the pose / attitude of the

    aircraft based on the objective trajectories. The second part is the waypoint

    sequencer, which determines which location the airframe should go to (latitude,


    longitude and altitude), and thus determines the trajectory inputs to the control

    system.

    4. Digital Camera Payload Management System and Automatic Flight Planner: the Flight Planner component will first generate waypoints (latitude, longitude,

    and altitude) that optimally cover the area of interest to be photographed.

    Inputs to this system are the boundary of the area (latitude and longitude), the scale of

    the desired aerial photo, and the horizontal-vertical overlap of each photo segment;

    the system will then automatically determine the altitude and the automatic

    sequencing of digital camera shutter releases. The Digital Camera Payload

    Management component simply commands the shutter release sequence and logs the

    exact time and position of each shutter release (usually known as metadata; this

    information is needed for post-processing and automatic rapid mosaicking).

    5. Ground Station Software: this component enables the operator to plan and monitor the mission execution of the Autonomous Unmanned Aerial Vehicle, as

    well as to reconfigure the mission during execution. The monitoring is done in real

    time because a high-speed, long-range data modem is used to transmit and receive

    mission parameters between the UAV and the ground station.

    6. Final Integration: this step is taken when all supporting components of the UAV are completely developed.
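As a rough illustration of the waypoint generation described for the Flight Planner, the sketch below lays out a back-and-forth ("lawnmower") grid over a rectangular field. The function name, the metre-based field model, and the default overlap fractions are assumptions made for illustration, not the actual Flight Planner implementation:

```python
import math

def plan_lawnmower(width_m, height_m, footprint_w_m, footprint_h_m,
                   side_overlap=0.3, forward_overlap=0.6):
    """Generate a back-and-forth (lawnmower) waypoint grid, in metres
    relative to the south-west corner of a rectangular field.

    footprint_*_m : ground footprint of one photo at the chosen altitude.
    Overlaps are fractions (0.3 = 30%), needed for later mosaicking.
    """
    track_spacing = footprint_w_m * (1.0 - side_overlap)    # between flight lines
    shot_spacing = footprint_h_m * (1.0 - forward_overlap)  # between shutter releases
    n_tracks = max(1, math.ceil(width_m / track_spacing))
    n_shots = max(1, math.ceil(height_m / shot_spacing))
    waypoints = []
    for i in range(n_tracks):
        x = min(i * track_spacing, width_m)
        ys = [min(j * shot_spacing, height_m) for j in range(n_shots)]
        if i % 2 == 1:           # reverse every other line for a continuous path
            ys.reverse()
        waypoints.extend((x, y) for y in ys)
    return waypoints
```

In a full planner, each (x, y) pair would be converted back to latitude/longitude and paired with the altitude derived from the requested photo scale.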

    Development of Post Processing Software

    Automatic Photo Mosaicking: this component automatically combines the aerial photographs that each cover a small area, along with their

    metadata (longitude, latitude and altitude of the digital camera), into a single

    large aerial photograph covering the larger area.

    Aerial Photo based Agriculture Information System: this is the tool that will be used by the farmers to analyze the aerial photograph and

    support decision making about the crop field. Remote Sensing and

    Geographic Information System concepts are involved in this system,

    along with Precision Agriculture good practices.
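A minimal sketch of the rapid (non-orthorectified) placement step in Automatic Photo Mosaicking: convert each photo's latitude/longitude metadata to local metre offsets with a flat-Earth approximation, then to pixel positions on the mosaic canvas. All names and the projection choice are illustrative assumptions:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def latlon_to_local_m(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Approximate east/north offsets in metres from a reference point,
    using a flat-Earth (equirectangular) projection -- adequate over a
    single crop field, consistent with the 'fast, not orthorectified' goal."""
    east = (math.radians(lon_deg - ref_lon_deg) * EARTH_RADIUS_M
            * math.cos(math.radians(ref_lat_deg)))
    north = math.radians(lat_deg - ref_lat_deg) * EARTH_RADIUS_M
    return east, north

def place_photos(metadata, gsd_m_per_px):
    """metadata: list of (lat, lon) shutter positions; returns the pixel
    (col, row) of each photo centre on the mosaic canvas (row 0 = north)."""
    ref_lat = min(m[0] for m in metadata)   # south-west corner as origin
    ref_lon = min(m[1] for m in metadata)
    max_north = max(latlon_to_local_m(la, lo, ref_lat, ref_lon)[1]
                    for la, lo in metadata)
    placements = []
    for la, lo in metadata:
        east, north = latlon_to_local_m(la, lo, ref_lat, ref_lon)
        placements.append((round(east / gsd_m_per_px),
                           round((max_north - north) / gsd_m_per_px)))
    return placements
```

Each photo would then be pasted at its computed (col, row), with the recorded overlap hiding small placement errors.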


    Overall System Testing: this step is conducted after the Unmanned Aerial System and the

    Post-Processing software are completely developed. The objective is to provide feedback to

    the overall research and development and to publicize the system to the potential users:

    the farmers.

    Latest Progress

    Not all of these tasks have been completed as of this reporting phase. The airframe will be

    developed after the completion of the other systems; at this time a simple remote-controlled

    trainer-60 airframe is used for development purposes. The AHPRS is still under

    development (there is another research report about this); for now, the attitude reference for

    the autopilot uses thermopile sensors. The autopilot system has been developed in

    previous research and will be used as the basis. The Digital Camera Payload Management

    System and Automatic Flight Planner have been completely developed and tested.

    The Ground Station Software has been completely developed and tested.


    Agricultural Remote Sensing Basics

    When farmers or ranchers observe their fields or pastures to assess their condition

    without physically touching them, it is a form of remote sensing. Observing the colors of

    leaves or the overall appearance of plants can reveal the plants' condition. Remotely

    sensed images taken from satellites and aircraft provide a means to assess field conditions,

    without physical contact, from a point of view high above the field.

    Most remote sensors see the same visible wavelengths of light that are seen by the human

    eye, although in most cases remote sensors can also detect energy from wavelengths that

    are undetectable to the human eye. The remote view of the sensor and the ability to store,

    analyze, and display the sensed data on field maps are what make remote sensing a

    potentially important tool for agricultural producers. Agricultural remote sensing is not

    new and dates back to the 1950s, but recent technological advances have made the

    benefits of remote sensing accessible to most agricultural producers.

    Remote Sensing . . . How You Can Use It On Your Farm

    Remotely sensed images can be used to identify nutrient deficiencies, diseases, water

    deficiency or surplus, weed infestations, insect damage, hail damage, wind damage,

    herbicide damage, and plant populations.

    Information from remote sensing can be used as base maps in variable rate applications

    of fertilizers and pesticides. Information from remotely sensed images allows farmers to

    treat only affected areas of a field. Problems within a field may be identified remotely

    before they can be visually identified.

    Ranchers use remote sensing to identify prime grazing areas, overgrazed areas or areas of

    weed infestations. Lending institutions use remote sensing data to evaluate the relative

    values of land by comparing archived images with those of surrounding fields.

    The Electromagnetic Spectrum

    The basic principles of remote sensing with satellites and aircraft are similar to visual

    observations. Energy in the form of light waves travels from the sun to Earth. Light

    waves travel similarly to waves traveling across a lake. The distance from the peak of one


    wave to the peak of the next wave is the wavelength. The full range of these radiant

    energies is called the electromagnetic spectrum.

    The wavelengths used in most agricultural remote sensing applications cover only a small

    region of the electromagnetic spectrum. Wavelengths are measured in micrometers (µm)

    or nanometers (nm). One µm is about 0.00003937 inch, and 1 µm equals 1,000 nm. The

    visible region of the electromagnetic spectrum is from about 400 nm to about 700 nm.

    The green color associated with plant vigor has a wavelength that centers near 500 nm.

    Wavelengths longer than those in the visible region and up to about 25 µm are in the

    infrared region. The infrared region nearest to the visible region is the near

    infrared (NIR) region. Both the visible and infrared regions are used in agricultural

    remote sensing.

    Electromagnetic Energy and Plants

    When electromagnetic energy from the sun strikes plants, three things can happen.

    Depending upon the wavelength of the energy and characteristics of individual plants, the

    energy will be reflected, absorbed, or transmitted. Reflected energy bounces off leaves

    and is readily identified by human eyes as the green color of plants. A plant looks green

    because the chlorophyll in the leaves absorbs much of the energy in the visible


    wavelengths and the green color is reflected. Sunlight that is not reflected or absorbed is

    transmitted through the leaves to the ground.

    Interactions between reflected, absorbed, and transmitted energy can be detected by

    remote sensing. The differences in leaf colors, textures, shapes or even how the leaves are

    attached to plants, determine how much energy will be reflected, absorbed or transmitted.

    The relationship between reflected, absorbed and transmitted energy is used to determine

    spectral signatures of individual plants. Spectral signatures are unique to plant species.

    Remote sensing is used to identify stressed areas in fields by first establishing the spectral

    signatures of healthy plants. The spectral signatures of stressed plants appear altered from

    those of healthy plants. The following figure compares the spectral signatures of healthy

    and stressed sugarbeets.


    Stressed sugarbeets have a higher reflectance value in the visible region of the spectrum

    from 400-700 nm. This pattern is reversed for stressed sugarbeets in the nonvisible range

    from about 750-1200 nm. The visible pattern is repeated in the higher reflectance range

    from about 1300-2400 nm. Interpreting the reflectance values at various wavelengths of

    energy can be used to assess crop health.

    The comparison of the reflectance values at different wavelengths, called a vegetative

    index, is commonly used to determine plant vigor. The most common vegetative index is

    the normalized difference vegetative index (NDVI). NDVI compares the reflectance

    values of the red and NIR regions of the electromagnetic spectrum. The NDVI value of

    each area on an image helps identify areas of varying levels of plant vigor within fields.

    How Does Remote Sensing Work?

    There are several types of remote sensing systems used in agriculture but the most

    common is a passive system that senses the electromagnetic energy reflected from plants.

    The sun is the most common source of energy for passive systems. Passive system

    sensors can be mounted on satellites, manned or unmanned aircraft, or directly on farm

    equipment.


    There are several factors to consider when choosing a remote sensing system for a

    particular application, including spatial resolution, spectral resolution, radiometric

    resolution, and temporal resolution.

    Spatial resolution refers to the size of the smallest object that can be detected in an

    image. The basic unit in an image is called a pixel. One-meter spatial resolution means

    each pixel in the image represents an area of one square meter. The smaller the area

    represented by one pixel, the higher the resolution of the image.

    Spectral resolution refers to the number of bands and the wavelength width of each

    band. A band is a narrow portion of the electromagnetic spectrum. Shorter wavelength

    widths can be distinguished in higher spectral resolution images. Multi-spectral imagery

    can measure several wavelength bands, such as visible green or NIR. The Landsat, QuickBird

    and SPOT satellites use multi-spectral sensors. Hyperspectral imagery measures energy in

    narrower and more numerous bands than multi-spectral imagery. The narrow bands of

    hyperspectral imagery are more sensitive to variations in energy wavelengths and

    therefore have a greater potential to detect crop stress than multi-spectral imagery. Multi-

    spectral and hyperspectral imagery are used together to provide a more complete picture

    of crop conditions.

    Radiometric resolution refers to the sensitivity of a remote sensor to variations in the

    reflectance levels. The higher the radiometric resolution of a remote sensor, the more

    sensitive it is to detecting small differences in reflectance values. Higher radiometric

    resolution allows a remote sensor to provide a more precise picture of a specific portion

    of the electromagnetic spectrum.

    Temporal resolution refers to how often a remote sensing platform can provide

    coverage of an area. Geo-stationary satellites can provide continuous sensing while

    normal orbiting satellites can only provide data each time they pass over an area. Remote

    sensing taken from cameras mounted on airplanes is often used to provide data for

    applications requiring more frequent sensing. Cloud cover can interfere with the data

    from a scheduled remotely sensed data system. Remote sensors located in fields or

    attached to agricultural equipment can provide the most frequent temporal resolution.


    Remote Sensing: The Complete Process

    The following figure illustrates a satellite remote sensing process as applied to

    agricultural monitoring. The sun (A) emits electromagnetic energy (B) toward plants

    (C). A portion of the electromagnetic energy is transmitted through the leaves. The sensor

    on the satellite detects the reflected energy (D). The data is then transmitted to the ground

    station (E). The data is analyzed (F) and displayed on field maps (G).


    Near Infra Red in Simple Remote Sensing for Agriculture

    Near infrared is one region of the electromagnetic spectrum that can be used in remote sensing

    for agricultural applications. One interesting thing about the near infrared spectrum

    is that it can be captured using a low-cost, modified consumer digital still

    camera.

    What is Near Infrared?

    The electromagnetic spectrum is a plotted distribution of all radiant energies as a function

    of their wavelength. It ranges from the shorter wavelengths of x-rays and gamma rays to

    the longer wavelengths of radio waves and microwaves. Remember, wavelength and

    frequency have an inverse relationship; a higher frequency means a shorter wavelength and

    vice versa. There are several regions of the electromagnetic spectrum that are useful for

    remote sensing. Probably the most common, and the one we will discuss, is infrared (IR).

    IR is found between the visible and microwave portions of the electromagnetic spectrum.

    Near infrared (NIR) makes up the part of IR closest in wavelength to visible light and

    occupies the wavelengths between about 700 nanometers and 1500 nanometers (0.7 µm to

    1.5 µm). NIR is not to be confused with thermal infrared, which is at the extreme end of

    the infrared spectrum and measures heat.


    What does NIR tell us?

    Since NIR has longer wavelengths than visible light, it exhibits peculiar properties that

    can be exploited for remote sensing applications. Some of the information that can be

    obtained from NIR includes crop stress (water and nutrient stress being the most common)

    and weed/pest infestations.

    Leaf chlorophyll absorbs energy in the visible red wavelengths (600-700 nm); crops with

    healthy leaves absorb higher levels of energy at these wavelengths. Healthy leaves are

    characterized by a nutrient-rich, hydrated, and disease-free leaf-cell structure. The healthy

    and turgid spongy cells within these leaves reflect NIR (Figure 2). Conversely, stressed

    crops and crops in stages of senescence are characterized by an increase in red reflectance

    and a decrease in NIR reflectance.


    Absorbance and reflectance of different wavelengths

    By measuring the amounts of reflected NIR and Red wavelengths, it is possible to

    determine the vegetative health of plants in an image. There are a number of different

    methods for quantifying the relationship between NIR/Red reflectance and plant

    vigor. These methods are called vegetative indices.

    Vegetative Indices

    The following is a short list of some common indices:

    Ratio Vegetation Index (RVI)

    This is simply a ratio of NIR and Red reflectance. It has a typical range of around 1 for

    bare soil and >20 for thick vegetation.
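The report's figures are not reproduced in this transcript, so the formula is written out here, with ρ denoting reflectance in each band:

```latex
\mathrm{RVI} = \frac{\rho_{\mathrm{NIR}}}{\rho_{\mathrm{Red}}}
```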


    Normalized Difference Vegetation Index (NDVI)

    The NDVI is the most commonly used vegetative index for remote sensing. Because the

    difference is divided by the sum of the reflectance, it normalizes the output and

    overcomes any problem with varying light levels at the time the measurements were

    taken. Typical numbers are between 0 for no vegetation and 1 for dense vegetation.

    Negative numbers are only possible over water bodies since 99.9% of land features will

    reflect more NIR than Red.
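Written out, with ρ denoting reflectance in each band:

```latex
\mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}}}
```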

    Soil-Adjusted Vegetation Index (SAVI)

    The SAVI index is almost identical to the NDVI, but it adds a soil adjustment factor that

    takes into account soil brightness. The values for L vary between 0 and 1, but a factor of

    0.5 is a common approximation when the correct value is not known.
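Written out, with ρ denoting band reflectance and L the soil adjustment factor:

```latex
\mathrm{SAVI} = \frac{(\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}})\,(1 + L)}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}} + L}
```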

    These vegetation indices can be very useful. Many studies have shown a strong

    correlation between an index and a crop's overall health. Recent studies have even used

    reflectance measurements in determining nitrogen content.
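The three indices above are simple enough to compute directly from per-pixel band reflectance. A minimal sketch in Python (the sample reflectance values are illustrative assumptions, not measurements from this research):

```python
def rvi(nir, red):
    """Ratio Vegetation Index: NIR / Red reflectance."""
    return nir / red

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index; L = 0.5 when soil brightness is unknown."""
    return (nir - red) * (1.0 + L) / (nir + red + L)

# Illustrative reflectance values (fractions of incident energy, assumed):
healthy = {"nir": 0.50, "red": 0.08}   # strong NIR reflectance, red absorbed
stressed = {"nir": 0.30, "red": 0.15}  # more red reflected, less NIR
```

Applied per pixel over a mosaic, such functions yield the vigor maps described in the previous section.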

    The Digital Advantage

    The CCDs (charge-coupled devices) found in digital cameras turn out to be quite sensitive

    to NIR. This fact has led many digital camera manufacturers to install special internal IR

    cut filters designed to reduce IR contamination in visible light photos. These internal IR

    cut filters vary in the amount of IR that they transmit and therefore some cameras work

    much better for this sort of work than others.


    In order to use a digital camera for NIR work, a special external filter is needed to filter

    out all visible light and allow only IR to pass through. These filters are known as IR pass

    filters; they have been used for years by film-based IR photographers and are available at

    many camera stores.

    Since digital cameras use no film and all image information is recorded at the CCD, they

    provide many advantages over film-based IR photography:

    Instant images

    Proper exposures, since the camera uses its CCD to measure light and set the shutter speed and aperture

    Affordable: no more expensive IR film

    Testing for IR Sensitivity

    Before purchasing an IR filter for a camera, make sure the digital camera will even work.

    The internal IR cut filters installed on cameras vary greatly, so it is best to check the

    camera to be sure it will be sensitive enough to NIR to capture quality images.

    Here is a simple test of the sensitivity of a digital camera to NIR:

    Aim an infrared remote control (like the one you have for your television) at the camera.

    Press any button on the remote.

    If you can see a bright light (similar to the one below) emitting from the front of the remote control through the camera, then the camera is sufficiently sensitive to the

    NIR band and can be used for the purpose of obtaining NIR images.

    The image on the LCD of the camera should look something like this:


    NIR Images

    The NIR images that you receive from a digital camera are monochrome, meaning they

    appear like black and white photos. It is possible to make the image look like an

    'Ektachrome' color IR photo by combining a standard RGB photo with the IR image in

    Adobe Photoshop (see the tutorial for Photoshop).

    Taking both images can easily be done with a tripod and a single camera where one

    image is taken with the filter and the other without.

    One method we are testing is using two digital cameras in tandem, attached to a helium

    filled blimp. These cameras are then operated from the ground by a serial connection and

    images are stored on flash memory cards.

    By using consumer digital cameras to obtain IR images, a farmer could detect crop stress

    early and receive images instantly at a fraction of the cost of other methods.


    The Proposed System

Here is the diagram of the proposed system:


    The Autopilot System

    Control Algorithm

The control algorithm consists of two layers. The upper layer is a waypoint sequencer; the lower layer is a set of PID (proportional, integral, derivative) controllers. The waypoint sequencer reads the waypoints given to the autopilot control system by the operator. Each waypoint consists of a 3D world coordinate: latitude, longitude, and altitude. Based on this waypoint information and the current position, attitude, and ground speed, the waypoint sequencer outputs several objectives: attitude objectives (roll, pitch, and yaw/heading) and a ground speed objective. Each PID controller reads its objective as a setpoint, compares it with the actual value, and produces a servo command that actuates the airframe's control surfaces (aileron, elevator, and rudder) and throttle.
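As a sketch of the lower control layer, a minimal PID controller with output clamping might look like the following (illustrative Python, not the autopilot's actual firmware; the gains and output limits are assumed values):

```python
class PID:
    """Minimal PID controller with output clamping (illustrative sketch)."""

    def __init__(self, kp, ki, kd, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None  # no derivative term on the first update

    def update(self, setpoint, measurement, dt):
        """One control step: compare setpoint with actual value,
        return a clamped servo command."""
        error = setpoint - measurement
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(self.out_min, min(self.out_max, out))
```

In this scheme the waypoint sequencer would feed, for example, a roll-angle objective into one such controller, whose clamped output drives the aileron servo.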

    Sensor Selection

Based on the control algorithm development step, several measurements are needed by the PID control scenarios: position measurements and attitude measurements.

Position measurements are:

- speed,
- latitude, longitude, and altitude.

Attitude measurements are:

- heading/yaw, roll, and pitch.

For acquiring position measurements, a GPS receiver is used. The uBlox TIM-LA was chosen because it is relatively low cost and provides position information (speed, latitude, longitude, altitude, and heading) four times per second (4 Hz).

For measuring roll and pitch angles, the best solution would be an Attitude and Heading Reference System (AHRS). An AHRS consists of inertial sensors (gyroscopes and accelerometers) and a magnetic field sensor (magnetometer). A strap-down inertial navigation mechanization and a proprietary fusion algorithm are usually used to combine the sensor readings into reliable attitude information. However, commercially available AHRS units are beyond this research's budget. A simple, low-cost alternative is to use pairs of thermopile sensors to sense the horizon. The idea comes from Co-Pilot™, an auxiliary device to train beginner aeromodellers.

The basic principles are:

- a thermopile sensor can sense the temperature of a remote object,
- the sky typically has a lower temperature than the ground, and
- by installing thermopile sensors on the roll and pitch axes (four thermopile sensors in total), attitude can be inferred from their readings.

During level flight all sensors see approximately the same amount of sky and ground, so their outputs are approximately equal. During a pitch-up (nose-up) attitude, the forward-facing thermopile sensor sees more sky than ground, and the backward-facing sensor sees more ground than sky; the forward-facing sensor therefore senses a cooler temperature than the backward-facing one. From the difference between the two sensors, the pitch-up angle can be calculated. The same principle applies to the roll axis.
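The horizon-sensing principle above can be sketched numerically. This is a simplified model: the sky-to-ground temperature span and the arcsine mapping are assumptions made here for illustration; real Co-Pilot-style units calibrate the temperature span in flight.

```python
import math

def thermopile_attitude(t_front, t_back, t_left, t_right):
    """Estimate pitch and roll (degrees) from four horizon-sensing
    thermopile temperature readings. Positive pitch = nose up,
    positive roll = right wing down. Illustrative sketch only."""
    span = 40.0  # assumed full sky-to-ground temperature difference (deg C)

    def angle(diff):
        # clamp to [-1, 1] before asin so extreme readings stay valid
        return math.degrees(math.asin(max(-1.0, min(1.0, diff / span))))

    # Nose up: the front sensor sees more (cooler) sky than the back
    # sensor, so back-minus-front maps to positive pitch.
    pitch = angle(t_back - t_front)
    # Rolling right: the right sensor sees more (warmer) ground.
    roll = angle(t_right - t_left)
    return pitch, roll
```

With equal readings on all four sensors (level flight) the estimate is zero pitch and zero roll, as the text describes.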


The thermopile sensor used in this prototype is the Melexis MLX90247. For sensing heading/yaw, a rate gyroscope is used; the absolute heading offset for yaw-rate integration is taken from the GPS heading. The yaw rate gyro used in this prototype is the Analog Devices ADXRS401.
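Combining yaw-rate integration with the GPS heading offset can be sketched as a simple complementary filter (the blend factor `alpha` is an assumed tuning value for illustration, not a parameter taken from the prototype):

```python
def update_yaw(yaw, gyro_rate, dt, gps_heading=None, alpha=0.02):
    """Propagate yaw (degrees) by integrating the rate gyro reading
    (deg/s), then nudge the estimate toward the GPS heading whenever a
    fix is available. Illustrative complementary-filter sketch."""
    yaw = (yaw + gyro_rate * dt) % 360.0
    if gps_heading is not None:
        # shortest-angle error so 359 deg -> 1 deg corrects by +2, not -358
        err = ((gps_heading - yaw + 180.0) % 360.0) - 180.0
        yaw = (yaw + alpha * err) % 360.0
    return yaw
```

The gyro term dominates between GPS fixes, while the slow GPS correction keeps the integrated yaw from drifting without bound.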

    Autopilot Hardware Design and Prototyping

The UAV autopilot hardware system consists of two parts: the first handles sensor processing and the second handles stabilization and navigation control. Here is the block diagram of the autopilot system:


    Hardware in the Loop Simulation

The field trial is one of the most critical steps in UAV development. A UAV usually consists of a relatively high-priced airframe, engine, actuators/servos, and payload system, so when the control system fails during a field trial the airframe can crash, and usually only a minor part of the crashed UAV can be reused in further research and development. This step has proven to be one of the main problems in UAV research and development.

One solution for minimizing the effect of control system failure in field trials is Hardware in the Loop (HIL) simulation.

A hardware-in-the-loop (HIL) simulator simulates a process such that the input and output signals show the same time-dependent values as the real dynamically operating components. This makes it possible to test the final embedded system under real working conditions, with different working loads and in critical/dangerous situations.


In the case of UAV autopilot system development, a HIL simulator can be developed to simulate the flight characteristics of the airframe, including the sensor outputs and the control input signals. The UAV autopilot system can then be connected to the HIL simulator to see how the overall system works as a closed loop. Here we can tune the PID gain parameters as well as the other system parameters and watch their effect on the airframe in the HIL simulator.

HIL Simulator Development

The author developed the HIL simulator on top of commercially available simulation software. With this approach, the basic simulation features do not have to be implemented from scratch; only the specific functionality needed by the HIL simulator has to be added. This functionality mainly concerns the interface between the simulation software and the autopilot hardware (sensor measurement and servo actuation simulation).

The simulation software chosen for the HIL simulator is X-Plane, for these reasons:

- X-Plane is very attractive for non-aerodynamicist developers, because an airframe can be modeled from its geometric dimensions alone. The physics model is based on a process called Blade Element Theory. This set of principles breaks an airframe down by geometric shape and determines a number of stress points along its hull and airfoils. Factors such as drag coefficients are then calculated at each of these areas to ensure the entire plane is affected in some way by external forces. This system produces figures that are far more accurate than those achieved by, for example, taking averages over an entire airfoil. It also yields extremely precise physical properties that can be computed very quickly during flight, ultimately resulting in a much more realistic flight model. The accuracy of the X-Plane flight model has been approved by the FAA for full-motion simulators used to train commercial airline pilots.

- X-Plane's functionality can be customized using plug-ins. A plug-in is executable code that runs inside X-Plane, extending what X-Plane does. Plug-ins are modular, allowing developers to extend the simulator without having the simulator's source code; they can extend the flight simulator's capabilities or gain access to the simulator's data. For HIL simulator purposes, we need a plug-in that:
  o reads attitude and position data to simulate sensor measurements,
  o writes surface control deflection values to simulate servo commands, and
  o can communicate with the autopilot hardware over some kind of hardware interface (to deliver sensor measurements and accept servo commands).

The HIL simulator consists of three main parts: sensors, process, and actuators.

The sensors part simulates the sensor output data of the airframe. These data are processed by the UAV autopilot hardware as input. The sensor output data that the HIL simulator must produce are position data (speed, altitude, latitude, and longitude) and attitude data (roll, pitch, and yaw). This is accomplished by reading data from the simulator.

The actuators part simulates how the UAV autopilot hardware changes the airframe's control surfaces (aileron, elevator, and rudder) and throttle position. In a real-world application this is done by driving the hobby servos attached to the corresponding control surface or to the engine throttle. In the HIL simulator it is done by writing data to X-Plane that affect the control surfaces of the simulated airframe.

The process part simulates how the airframe reacts to the input given by the UAV autopilot hardware, so this is the part where the system dynamic model (transfer function) belongs. It is generally the most complex part of a HIL simulator, but fortunately it is already provided by X-Plane (via its blade-element approach). The HIL simulator plug-in communicates with the UAV autopilot hardware through RS232 serial communication.
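The serial exchange between the plug-in and the autopilot can be sketched as a pair of framing helpers. The wire format below (little-endian 32-bit floats, seven sensor values out, four servo values back) is a hypothetical layout chosen for illustration; the report does not specify the actual protocol.

```python
import struct

def pack_sensors(roll, pitch, yaw, speed, lat, lon, alt):
    """Encode one simulated sensor frame for the autopilot
    (hypothetical little-endian float32 layout, 28 bytes)."""
    return struct.pack("<7f", roll, pitch, yaw, speed, lat, lon, alt)

def unpack_servos(frame):
    """Decode aileron, elevator, rudder, throttle commands from a
    16-byte reply frame (same layout assumption)."""
    return struct.unpack("<4f", frame)
```

In the simulation loop, the plug-in would read the aircraft state from X-Plane each cycle, send `pack_sensors(...)` over the serial port, then apply the values from `unpack_servos(...)` to the simulated control surfaces and throttle.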

    Here is the flowchart of the HIL simulator


    HIL Simulator Utilization

The HIL simulator is utilized to:

- develop the Ground Control Station software,
- refine the firmware implementation through simulated closed-loop tests,
- refine the hardware implementation through long-run UAV autopilot reliability tests,
- tune the PID gains, and
- test the automatic shutter release mechanism.


One finding emerged while testing the UAV autopilot's reliability: the power supply regulator was unstable, which was visible in the overall system's performance in the HIL simulator. At worst, this failure would have resulted in an airframe crash. Since the test was conducted in the HIL simulator, no financial loss occurred. This is one example of how a HIL simulator can prevent airframe crashes in real-world field trials.

The HIL simulator enables PID gain tuning on a trial-and-error basis. Analytical PID gain tuning is much more difficult, since it requires a mathematical model of the plant (the airframe transfer function). The author found it easier to tune the PID gains by trial and error, since an airframe crash is not a problem in the HIL simulator.

A shutter release mechanism is added to the autopilot system, and the HIL simulator is also utilized to test it. It is basically a proximity detector on the latitude position if the aerial photo trajectory runs north-south, or on the longitude position if the trajectory runs east-west. The HIL simulator enables the automatic shutter release mechanism to be rigorously tested before real-world tests.
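The proximity-detector idea can be sketched as follows (an illustrative check; the function name and the tolerance threshold are assumptions, not the autopilot's actual code):

```python
def shutter_due(position, next_trigger, trajectory, tolerance=1e-4):
    """Return True when the airframe is close enough to the next
    trigger line to release the shutter.

    For a north-south trajectory the latitude is compared against the
    trigger value; for east-west, the longitude. All values are in
    decimal degrees; `tolerance` is an assumed proximity threshold."""
    lat, lon = position
    value = lat if trajectory == "north-south" else lon
    return abs(value - next_trigger) <= tolerance
```

The autopilot would advance `next_trigger` by the shutter release interval each time the check fires, producing evenly spaced photos along the leg.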


    The Flight Planner Software

The flight planner software enables us to define the following requirements:

- area boundary to be photographed (defined as the latitude-longitude of the bottom-left corner and the latitude-longitude of the top-right corner),
- the field of view of the camera used,
- desired aerial photo scale (which determines the altitude of the trajectory),
- horizontal and vertical photo overlap,
- small and large overshoot needed for the airframe to turn,
- ground altitude, and
- trajectory direction (north-south or east-west),

and this software outputs the following:

- trajectory/waypoints for the autopilot system,
- shutter release interval, and
- other information such as:
  o AreaHoriz
  o AreaVert
  o AreaPhotoHoriz
  o AreaPhotoVert
  o HorizPhotoCount
  o VertPhotoCount
  o TotalPhotoCount
  o VertPhotoDistance
  o HorizPhotoDistance
  o PhotoAltitude
  o WaypointCount

The waypoints generated by this software have to be uploaded to the autopilot. Here is a screenshot of this software:
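The geometry behind outputs such as HorizPhotoCount and TotalPhotoCount can be sketched with a simplified flat-ground, square-field-of-view model (the function and parameter names are illustrative, not the planner's actual code):

```python
import math

def plan_photos(area_w, area_h, fov_deg, altitude, overlap=0.3):
    """Rough photo-count calculation for a rectangular survey area.

    Simplified model: flat ground, square field of view. The ground
    footprint of one photo is 2 * altitude * tan(fov/2); the spacing
    between shots shrinks by the requested overlap fraction.
    All distances are in metres."""
    footprint = 2.0 * altitude * math.tan(math.radians(fov_deg) / 2.0)
    spacing = footprint * (1.0 - overlap)
    cols = math.ceil(area_w / spacing)   # photos per east-west row
    rows = math.ceil(area_h / spacing)   # photos per north-south column
    return {"footprint_m": footprint, "spacing_m": spacing,
            "horiz_count": cols, "vert_count": rows,
            "total_count": cols * rows}
```

For example, a 1000 m by 500 m field photographed from 100 m with a 60-degree field of view and 30% overlap yields roughly a 13 by 7 grid of shots.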


Preliminary Test and Concluding Remarks

Several tests have been conducted with the system. In these preliminary tests we did not yet use an NIR camera, only a conventional digital still camera. Several remarks can be drawn from the test results:

- The HIL simulator functions as expected.
- The flight planner functions as expected.
- The autopilot functions as expected.
- The autopilot shutter release mechanism functions as expected.
- The large and small overshoots for the trajectories need to be determined carefully (in the test, the large overshoot was too small, so when photos were taken in the headwind direction the completed trajectories oscillated).
- A vibration damper for the camera needs to be developed. In the preliminary test only a simple foam damper was used; it could not isolate the engine vibration, so the photos are not sharp.

Here is a photo of the airframe:


Here are several examples of photos taken by the system:


References

1. Ronnback, Sven, 2000, Development of an INS/GPS Navigation Loop for a UAV, Institutionen for Systemteknik, Avdelningen for Robotik och Automation, Lulea Tekniska Universitet
2. Sanvido, Marco, Hardware-in-the-loop Simulation Framework, Automatic Control Laboratory, ETH Zurich
3. Arya, Hemendra, Hardware-In-Loop Simulator for Mini Aerial Vehicle, Centre for Aerospace Systems Design and Engineering, Department of Aerospace Engineering, IIT Bombay, India
4. Gomez, Martin, 2001, Hardware-in-the-Loop Simulation, Embedded System Design, URL http://www.embedded.com/showArticle.jhtml?articleID=15201692
5. Desbiens, Andre and Manai, Myriam, Identification of a UAV and Design of a Hardware-in-the-Loop System for Nonlinear Control Purposes, Universite Laval, Quebec City, Canada
6. B. Taylor, C. Bil, and S. Watkins, 2003, Horizon Sensing Attitude Stabilisation: A VMC Autopilot, 18th International UAV Systems Conference, Bristol, UK
7. URL http://www.x-plane.com
8. Widyawardana Adiprawita, Development of Simple Autonomous Fixed Wing Unmanned Aerial Vehicle Controller Hardware (draft), School of Electric Engineering and Informatics, Bandung Institute of Technology, 2006
9. Sven Ronnback, Development of an INS/GPS Navigation Loop for a UAV, Lulea Tekniska Universitet, 2000
10. Yong Li, Andrew Dempster, Binghao Li, Jinling Wang, Chris Rizos, A Low-Cost Attitude Heading Reference System by Combination of GPS and Magnetometers and MEMS Inertial Sensors for Mobile Applications, University of New South Wales, Sydney, NSW 2052, Australia
11. Stelian Persa, Pieter Jonker, Multi-sensor Robot Navigation System, Pattern Recognition Group, Technical University Delft, Lorentzweg 1, Delft, 2628 CJ, The Netherlands
12. J.F. Vasconcelos, J. Calvario, P. Oliveira, C. Silvestre, GPS Aided IMU for Unmanned Air Vehicles, Instituto Superior Tecnico, Institute for Systems and Robotics, Lisboa, Portugal, 2004
13. Michael J. Caruso, Applications of Magnetic Sensors for Low Cost Compass Systems, Honeywell, SSEC