Remote sensing image processing


  • 7/27/2019 Remote sensing image processing

    1/137

    Image processing and data interpretation

    Mahesh Pal

    NIT Kurukshetra

  • 7/27/2019 Remote sensing image processing

    2/137

    Remote Sensing: Definition

    To acquire data about an object without being in contact with it.

    The art, science, and technology of obtaining reliable information about physical objects and the environment, through the process of recording, measuring, and interpreting imagery and digital representations of energy patterns derived from non-contact sensor systems (Colwell, 1997, ASPRS).

  • 7/27/2019 Remote sensing image processing

    3/137

    Remote sensing system

    Different stages in remote sensing:

    1. Source of electromagnetic energy.

    2. Transmission of energy through the atmosphere from source to earth.

    3. Energy interaction with the earth (reflection, absorption, transmission).

    4. Transmission of reflected/emitted energy towards the sensor.

    5. Detection of energy by the sensor and conversion into an electrical output or image.

    6. Transmission/recording of the sensor output.

    7. Pre-processing of the data and production of data products.

    8. Collection of ground truth and other information.

    9. Data analysis and interpretation.

    10. Integration of interpreted images with other data for the final application.

  • 7/27/2019 Remote sensing image processing

    4/137

    Fundamentals of Remote Sensing by George Joseph, 2005

    (source: NPTEL, IIT Kanpur)

  • 7/27/2019 Remote sensing image processing

    5/137

    Electromagnetic spectrum

  • 7/27/2019 Remote sensing image processing

    6/137

    Name                                 Wavelength (µm)

    Optical wavelength                   0.30 - 15.0

    (a) Reflective portion               0.4 - 3.0

        (i) Visible                      0.4 - 0.7

        (ii) Near IR                     0.7 - 1.30

        (iii) Middle IR                  1.30 - 3.0

    (b) Far IR (Thermal, Emissive)       7.00 - 15.0

  • 7/27/2019 Remote sensing image processing

    7/137

    Creating Landsat Images from Raw Data: San Francisco - Oakland

  • 7/27/2019 Remote sensing image processing

    8/137

    Microwave region

  • 7/27/2019 Remote sensing image processing

    9/137

    Remote sensing data

    Data collected by sensors onboard space/air/terrestrial platforms is available in the form of digital images.

    The received data is processed to derive useful information about earth features.

    To interpret these images, suitable corrections, enhancements, and classification techniques are used.

    A typical image interpretation may involve manual and digital (computer-assisted) procedures.

  • 7/27/2019 Remote sensing image processing

    10/137

    Concept of a digital image

    A digital image is an array of numbers: a file containing numbers that constitute grey-level or digital number (DN) values.

    An image is represented as a matrix of rows and columns.

    An image is a pictorial representation of the pattern of a landscape, which is composed of elements - indicators of things and events that reflect the physical, biological, and cultural components of the landscape.

    Similar conditions in similar surroundings reflect similar patterns, and unlike conditions reflect unlike patterns.
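    As an illustration of the idea above, here is a minimal sketch (not from the slides; the DN values are made up) showing one band of a digital image held as a NumPy matrix of rows and columns.

```python
import numpy as np

# one band of a digital image: a matrix of made-up DN values
band = np.array([[ 12,  15,  20, 180, 200],
                 [ 10,  14,  22, 190, 210],
                 [  9,  13,  25, 185, 205],
                 [ 11,  16,  30, 170, 195]], dtype=np.uint8)

rows, cols = band.shape            # matrix of rows and columns
print(rows, cols)                  # 4 5
print(band.min(), band.max())      # darkest and brightest DN in the band
print(band[0, 3])                  # DN of the pixel in row 0, column 3
```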

  • 7/27/2019 Remote sensing image processing

    11/137

    A sample image of Landsat ETM+ data

  • 7/27/2019 Remote sensing image processing

    12/137

    Band 1 pixel values

    Each DN in this digital image corresponds to one small area of the visual image and provides the level of darkness or lightness of that area.

    The higher the DN value, the brighter the area.

    Hence the zero value represents perfect black.

    The maximum value represents perfect white, and the intermediate values are shades of gray.

  • 7/27/2019 Remote sensing image processing

    13/137

    Landsat ETM+ image of Littleport in Cambridgeshire (England), acquired on 19

    June 2000. Band combination 4,3,2

  • 7/27/2019 Remote sensing image processing

    14/137

    Radar intensity image of same study area

  • 7/27/2019 Remote sensing image processing

    15/137

    Digital image processing

    Digital image processing involves the manipulation and interpretation of digital images with the use of computer algorithms.

    It is an extremely broad subject involving substantial mathematical computation.

    Digital image processing has many advantages over analog image processing: it allows a wider range of algorithms to be applied to the input data.

  • 7/27/2019 Remote sensing image processing

    16/137

  • 7/27/2019 Remote sensing image processing

    17/137

    Pre-processing

    Operations involved in pre-processing aim to correct the degraded image acquired from the sensor (the raw image) to create a better representation of the original scene.

    These operations are called pre-processing because they normally precede further manipulation and analysis of the image data to extract specific information.

    Radiometric correction

    Atmospheric correction

    Geometric correction

  • 7/27/2019 Remote sensing image processing

    18/137

    Radiometric correction

    Radiometric corrections are used to modify DN values in order to account for noise, that is, contributions to the DN that are NOT a function of the feature being sensed but are due to:

    The intervening atmosphere

    The sun-sensor geometry

    The sensor itself (errors and gaps)

  • 7/27/2019 Remote sensing image processing

    19/137

  • 7/27/2019 Remote sensing image processing

    20/137

    Radiometric correction

    Missing lines

    Striping

    Illumination and view angle effects

    Sensor calibration

    Terrain effects

  • 7/27/2019 Remote sensing image processing

    21/137

    Missing scan line

    Source: earthobservatory.nasa.gov

    Occurs due to errors in the scanning or sampling equipment of the sensor.

    Pixel values are generally missing in these lines.

    The easiest method is to replace each missing value with the corresponding pixel on the immediately preceding scan line.
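    A minimal sketch of that simplest fix (an assumption here: the band is a 2-D NumPy array and a dropped scan line shows up as a row of zeros): each missing line is replaced by the immediately preceding line.

```python
import numpy as np

def fill_missing_lines(band):
    """Replace zero-filled scan lines with the preceding line."""
    fixed = band.copy()
    for i in range(1, fixed.shape[0]):
        if np.all(fixed[i] == 0):      # crude test for a dropped scan line
            fixed[i] = fixed[i - 1]    # copy the immediately preceding line
    return fixed
```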

  • 7/27/2019 Remote sensing image processing

    22/137

    Striping or banding

    Figure: striping in Landsat 2 MSS red, green and blue values.

    Occurs due to non-identical detector response: if a detector of an electromechanical sensor goes out of adjustment, it provides readings lower or higher than the other detectors for the same band over the same ground cover.

    Several methods have been proposed to remove this error.

    Linear method - assumes a linear relation between input and output.

    Histogram matching - a histogram of each line is created; stripes are characterized by distinct histograms.
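    A minimal sketch of the linear destriping idea (my simplified version, not necessarily the slides' exact algorithm): lines recorded by the same detector (every k-th line) are rescaled with a gain and offset so that their mean and standard deviation match the whole band; k = 6 mimics the six MSS detectors and is only an example value.

```python
import numpy as np

def destripe_linear(band, k=6):
    """Gain/offset adjustment per detector, assuming k detectors scan alternate lines."""
    out = band.astype(float).copy()
    target_mean, target_std = out.mean(), out.std()
    for d in range(k):
        lines = out[d::k]                                  # all lines from detector d
        gain = target_std / (lines.std() + 1e-6)
        offset = target_mean - gain * lines.mean()
        out[d::k] = gain * lines + offset
    return out
```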

  • 7/27/2019 Remote sensing image processing

    23/137

    Illumination and View angle correction

    The position of the sun relative to the earth changes depending on the time of day and the day of the year.

    In the northern hemisphere the solar elevation angle is smaller in winter than in summer.

    An absolute correction involves dividing the DN values in the image data by the sine of the solar elevation angle.
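    A minimal sketch of that absolute correction (the DN values and the 35 degree sun elevation below are made-up example numbers):

```python
import numpy as np

dn = np.array([[40, 42], [55, 60]], dtype=float)    # made-up DN values
solar_elevation_deg = 35.0                           # example sun elevation angle
corrected = dn / np.sin(np.radians(solar_elevation_deg))
```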

  • 7/27/2019 Remote sensing image processing

    24/137

    Sensor calibration

    Necessary to generate absolute data on physical properties:

    Reflectance

    Temperature

    Emissivity

    Backscatter

    Calibration values are provided by the data provider / agency.

  • 7/27/2019 Remote sensing image processing

    25/137

    Terrain effects

    Cause differential solar illumination:

    Some slopes receive more sunlight than others,

    affecting the magnitude of reflected radiance reaching the sensor.

    Topographic slope and aspect introduce radiometric distortion

    (bidirectional reflectance distribution function, BRDF).

    Correction requires a DEM and the sun elevation.

  • 7/27/2019 Remote sensing image processing

    26/137

    BRDF

    The BRDF gives the reflectance of a target as a function of illumination geometry and viewing geometry.

    The BRDF depends on wavelength and is determined by the structural and optical properties of the surface, such as shadow-casting, multiple scattering, mutual shadowing, transmission, reflection, absorption and emission by surface elements, facet orientation distribution and facet density.

    The BRDF is needed in remote sensing for the correction of view and illumination angle effects.

  • 7/27/2019 Remote sensing image processing

    27/137

  • 7/27/2019 Remote sensing image processing

    28/137

    Radiometric correction

    The atmosphere leads to selective scattering, absorption and emission. The total radiance received at the sensor depends on the ground radiance (directly reflected) and the scattered radiation from the atmosphere (path radiance).

    The relationship between radiance received at the sensor (above the atmosphere) and radiance leaving the ground is

    Ls = (ρ T H) / π + Lp

    where Ls is the at-sensor radiance, H the total downwelling radiance (incident energy), ρ the reflectance of the target (albedo), T the atmospheric transmittance, and Lp the atmospheric path radiance (wavelength dependent).

  • 7/27/2019 Remote sensing image processing

    29/137

    In the solar reflection region, scattering is the most dominant process causing path radiance.

    Path radiance causes (1) a reduction in contrast due to a masking effect, making dark objects appear less dark and bright objects less bright, and (2) an adjacency effect: atmospheric scattering may direct some radiation away from the sensor FOV, causing a decrease in the effective spatial resolution of the sensor.

    Two methods for correction:

    Dark object subtraction, and using atmospheric models such as MODTRAN (http://modtran.org/) or 6S (Second Simulation of a Satellite Signal in the Solar Spectrum; http://rtcodes.ltdri.org/).

  • 7/27/2019 Remote sensing image processing

    30/137

    Dark object subtraction

    The most common atmospheric effect on remotely-sensed imagery is an increase in DN values due to haze, etc.

    This increase represents error and should be removed.

    Dark object subtraction simply involves subtracting the minimum DN value in the image from all pixel values.

    This approach assumes that the minimum value (i.e. the darkest object in the image) should be zero.

    The darkest object is typically water or shadow.
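    A minimal sketch of dark object subtraction as described above (assuming the image is stored as a NumPy array of shape (bands, rows, cols)): the per-band minimum DN is subtracted from every pixel, so the darkest object becomes zero.

```python
import numpy as np

def dark_object_subtraction(image):
    """image: array of shape (bands, rows, cols); subtract each band's minimum DN."""
    out = image.astype(float).copy()
    for b in range(out.shape[0]):
        out[b] -= out[b].min()        # haze estimate = minimum DN of the band
    return out
```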

  • 7/27/2019 Remote sensing image processing

    31/137

    Atmospheric corrections are needed in three cases:

    1. When we want the ratio of two bands (scattering is inversely proportional to wavelength: shorter wavelengths scatter more).

    2. When we want to relate the upwelling radiance from a surface to some property of that surface in terms of a physically based model.

    3. When comparing satellite data acquired on different dates, as the state of the atmosphere changes from time to time.

    Note that radiometric corrections for illumination, atmospheric influences, and sensor characteristics are often done prior to distribution of the data to the user.

  • 7/27/2019 Remote sensing image processing

    32/137

    Atmospheric correction

    Beijing, China, May 3, 2001

  • 7/27/2019 Remote sensing image processing

    33/137

    Geometric correction

    Geometric correction is the process of rectification of geometric errors introduced in the imagery during the process of its acquisition. It is the process of transformation of a remotely sensed image so that it has the scale and projection properties of a map.

    A related technique called registration is the fitting of the coordinate system of one image to that of a second image of the same area.

    Geocoding and georeferencing are the terms often used in connection with the geometric correction process. The basic concept behind geocoding is the transformation of satellite images into a standard map projection so that image features can be accurately located on the earth's surface and images can be compared.

  • 7/27/2019 Remote sensing image processing

    34/137

    Scan skew: the main source of geometric error in satellite data is the satellite path orientation (non-polar).

  • 7/27/2019 Remote sensing image processing

    35/137

    Reasons for image rectification

    For scene-to-scene comparison of individual pixels in applications such as change detection or thermal inertia mapping.

    For integration with GIS data for GIS modeling.

    For identifying training samples according to map coordinates.

    For creating accurately scaled photomaps.

    To overlay an image with vector data such as ARC/INFO.

    For extracting accurate area and distance measurements.

    For mosaicking.

    To compare images that are originally at different scales.

  • 7/27/2019 Remote sensing image processing

    36/137

    The geometric correction process involves:

    1. Identifying the image coordinates of several clearly identifiable points, called ground control points (GCPs), in the distorted image (A - A1 to A4).

    2. Obtaining the true ground coordinates from a map (B - B1 to B4, or another image) and matching the GCPs to their true positions in ground coordinates. This is called image-to-map or image-to-image registration.

    To geometrically correct the original distorted image, resampling is used to determine the DN values of the new pixel locations of the corrected image.

    The resampling process calculates the new pixel values from the original digital pixel values in the uncorrected image.

    Nearest neighbour, bilinear interpolation, and cubic convolution are three resampling methods. The nearest neighbour method uses the DN value from the original image which is nearest to the new pixel location in the corrected image (a sketch of this follows below).
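    A minimal sketch of the two steps under simplifying assumptions of my own (a first-order/affine mapping is sufficient, GCPs are supplied as matching (row, col) pairs, and the output grid has the same size as the input): an affine transform is fitted to the GCPs by least squares, then each corrected pixel takes the nearest original DN (nearest-neighbour resampling).

```python
import numpy as np

def fit_affine(dst_pts, src_pts):
    """Least-squares fit of src = [r, c, 1] @ coeffs from GCPs (arrays of shape (n, 2))."""
    n = dst_pts.shape[0]
    G = np.hstack([dst_pts, np.ones((n, 1))])       # design matrix in output coordinates
    coeffs, *_ = np.linalg.lstsq(G, src_pts, rcond=None)
    return coeffs                                    # shape (3, 2)

def warp_nearest(image, coeffs, out_shape):
    """Nearest-neighbour resampling of the distorted image onto the corrected grid."""
    rows, cols = np.indices(out_shape)
    G = np.stack([rows.ravel(), cols.ravel(), np.ones(rows.size)], axis=1)
    src = G @ coeffs                                 # corresponding (row, col) in the input
    sr = np.rint(src[:, 0]).astype(int)
    sc = np.rint(src[:, 1]).astype(int)
    valid = (sr >= 0) & (sr < image.shape[0]) & (sc >= 0) & (sc < image.shape[1])
    out = np.zeros(out_shape, dtype=image.dtype)
    out.ravel()[valid] = image[sr[valid], sc[valid]]
    return out
```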

  • 7/27/2019 Remote sensing image processing

    37/137

    Resampling methods

    http://www.geo-informatie.nl/courses/grs20306/course/Schedule/Geometric-correction-RS-new.pdf

    New DN values are assigned in 3 ways:

    a. Nearest Neighbour: the pixel in the new grid gets the value of the closest pixel from the old grid; retains the original DNs.

    b. Bilinear Interpolation: the new pixel gets a value from the weighted average of the 4 (2 x 2) nearest pixels; smoother but synthetic.

    c. Cubic Convolution (smoothest): new pixel DNs are computed by weighting the 16 (4 x 4) surrounding DNs.

  • 7/27/2019 Remote sensing image processing

    38/137

    Image Enhancement

    Visual analysis and interpretation are often sufficient for many purposes to extract information from remote sensing data.

    Enhancement means altering the appearance of a digital image so as to make it more informative for visual interpretation.

    The characteristics of each image, in terms of the distribution of pixel values, will change from one area to another.

  • 7/27/2019 Remote sensing image processing

    39/137

    Image transformation and filtering are also used as image enhancement techniques.

    For visual enhancement, changing the image contrast is one of the most important operations.

    Contrast is defined as the difference in colour that makes an object (or its representation in an image) distinguishable.

    The range of brightness values present in an image is also referred to as contrast.

  • 7/27/2019 Remote sensing image processing

    40/137

    In raw imagery, the useful data often covers only a small portion of the available range of digital values (for example, 256 levels for 8-bit data). Contrast enhancement involves changing the original values so that more of the available range is used, thereby increasing the contrast between targets and their backgrounds.

    For contrast enhancement the concept of an image histogram is important.

    A histogram is a graphical representation of the brightness values that comprise an image. The brightness values (i.e. 0-255) are displayed along the x-axis of the graph. The frequency of occurrence of each DN value in the image is shown on the y-axis.

  • 7/27/2019 Remote sensing image processing

    41/137

    By manipulating the range of DN values represented by the

    histogram of an image, various contrast enhancement techniques

    can be applied.

    The simplest type is a linear contrast stretch. This involves

    identifying the minimum and maximum brightness values in the

    image (lower and upper bounds from the histogram) and applying a

    transformation to stretch this range to fill the full range.
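    A minimal sketch of a linear contrast stretch as described above (my own function name and the assumption of an 8-bit output range): the chosen lower and upper bounds are mapped onto 0-255.

```python
import numpy as np

def linear_stretch(band, low=None, high=None):
    """Stretch [low, high] (defaults: observed min/max) onto the full 0-255 range."""
    low = band.min() if low is None else low
    high = band.max() if high is None else high
    stretched = (band.astype(float) - low) / (high - low) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)
```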

  • 7/27/2019 Remote sensing image processing

    42/137

    Histogram equalization: the histogram is transformed so that all DN levels have approximately the same frequency across the whole range. This method expands some parts of the DN range at the expense of others by dividing the histogram into classes containing equal numbers of pixels.

    Piecewise linear stretch: used when the histogram is bi-modal. In this approach a number of linear enhancement steps that expand the brightness ranges using breakpoints are applied.
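    A minimal sketch of histogram equalization for an 8-bit band (an assumption on my part; the cumulative histogram is used as a look-up table so each output level holds roughly the same number of pixels):

```python
import numpy as np

def equalize(band):
    """Histogram equalization of an 8-bit band via its cumulative histogram."""
    hist, _ = np.histogram(band, bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min()) * 255.0   # normalise CDF to 0-255
    return cdf[band].astype(np.uint8)                           # apply the look-up table per pixel
```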

  • 7/27/2019 Remote sensing image processing

    43/137

    Density slicing - combining DNs of different values within a specified range into a single value.

    This transforms the image from a continuum of gray tones into a limited number of gray or color tones reflecting the specified ranges in digital numbers. This is useful in displaying weather satellite information.

  • 7/27/2019 Remote sensing image processing

    44/137

    Filtering

    Filtering includes a set of image processing functions used to enhance the appearance of an image.

    Filter operations can be used to sharpen or blur images, to selectively suppress image noise, to detect and enhance edges, or to alter the contrast of the image.

    Two broad categories:

    Spatial domain filtering

    Frequency domain filtering

  • 7/27/2019 Remote sensing image processing

    45/137

    Spatial Domain Filtering

    An image enhancement method that modifies pixel values based on the values of the surrounding pixels, with the objective of enhancing areas of high or low spatial frequency.

    The spatial frequency of a remotely sensed image is defined by the change in brightness value per unit distance in any part of the image.

    Rapid variations in brightness levels reflect a high spatial frequency; 'smooth' areas with little variation in brightness level or tone are characterized by a low spatial frequency.

  • 7/27/2019 Remote sensing image processing

    46/137

    Spatial filtering

    Low pass filters - these are used to emphasize large homogeneous areas of similar tone and reduce the smaller detail. Low-frequency areas are retained in the image, resulting in a smoother appearance of the image. Average, median and majority filters are examples.

    Figure: original image with a profile line, and the low-pass filtered image with its profile line.

  • 7/27/2019 Remote sensing image processing

    47/137

    Spatial filtering

    In spatial domain filtering, a moving window of pixels of a set dimension (e.g. 3x3 or 5x5) is passed over each pixel in the image.

    A mathematical calculation using the pixel values under the window is applied and the central pixel of the window is replaced by the resulting value.

    This window is called a convolution kernel.
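    A minimal sketch of that moving-window idea using a 3 x 3 averaging (low-pass) kernel (my simplification: border pixels are left unchanged):

```python
import numpy as np

def mean_filter_3x3(band):
    """Replace each interior pixel by the mean of the 3 x 3 window centred on it."""
    out = band.astype(float).copy()
    for r in range(1, band.shape[0] - 1):
        for c in range(1, band.shape[1] - 1):
            out[r, c] = band[r - 1:r + 2, c - 1:c + 2].mean()
    return out
```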

  • 7/27/2019 Remote sensing image processing

    48/137

    Convolution Filters (ERDAS)

    1. Edge detection/enhancement

    2. Low pass / High pass

    3. Horizontal

    4. Vertical

    5. Sharpen

    6. Summary

  • 7/27/2019 Remote sensing image processing

    49/137

    Edge detection filters highlight linear features, such as roads or field boundaries.

    These filters are useful in applications such as geology, for the detection of linear geologic structures (lineaments).

    They are also used to determine the boundaries of homogeneous regions in radar images.

    Roberts and Sobel filters (high-pass filters) are most commonly used.
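    A minimal sketch of edge detection with the Sobel kernels mentioned above: horizontal and vertical gradients are combined into a gradient magnitude image in which edges appear bright (border pixels are skipped for simplicity).

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_edges(band):
    """Gradient magnitude from the two Sobel kernels."""
    band = band.astype(float)
    gx = np.zeros_like(band)
    gy = np.zeros_like(band)
    for r in range(1, band.shape[0] - 1):
        for c in range(1, band.shape[1] - 1):
            window = band[r - 1:r + 2, c - 1:c + 2]
            gx[r, c] = (window * SOBEL_X).sum()
            gy[r, c] = (window * SOBEL_Y).sum()
    return np.hypot(gx, gy)
```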

  • 7/27/2019 Remote sensing image processing

    50/137

    Figure: actual image and the results of median, edge detection and high-pass filters.

  • 7/27/2019 Remote sensing image processing

    51/137

    Frequency Domain Filter

    The Fourier transform of an image is expressed by its amplitude spectrum, which involves breaking the image down into its frequency components.

    Filtering of these components is done using frequency domain filters.

    These filters operate on the amplitude spectrum of an image by removing, attenuating or amplifying the amplitudes in specific wavebands.

    The frequency domain can be represented as a 2D plot known as the Fourier spectrum, in which lower frequencies fall at the centre and progressively higher frequencies are plotted outwards.

  • 7/27/2019 Remote sensing image processing

    52/137

    Fourier spectrum of ETM+ PAN image

  • 7/27/2019 Remote sensing image processing

    53/137

    After applying the circular mask

    Converted image

  • 7/27/2019 Remote sensing image processing

    54/137

    Frequency domain filtering

    1. Fourier transform the original image to compute its Fourier spectrum.

    2. Select an appropriate filter function and multiply it by the elements of the Fourier spectrum.

    3. Perform an inverse Fourier transform to obtain an image in the spatial domain.

    4. Ideal, Bartlett, Butterworth, Gaussian and Hanning are some of the filters used for frequency domain filtering.
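    A minimal sketch of those three steps using an ideal (circular) low-pass mask, analogous to the circular mask shown earlier; the cut-off radius of 30 is just an example value.

```python
import numpy as np

def ideal_lowpass(band, radius=30):
    """Forward FFT, ideal circular low-pass mask on the shifted spectrum, inverse FFT."""
    f = np.fft.fftshift(np.fft.fft2(band.astype(float)))   # spectrum with low frequencies at the centre
    rows, cols = band.shape
    r, c = np.indices((rows, cols))
    dist = np.hypot(r - rows / 2, c - cols / 2)
    mask = dist <= radius                                   # keep only the low frequencies
    filtered = np.fft.ifft2(np.fft.ifftshift(f * mask))
    return np.real(filtered)
```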

  • 7/27/2019 Remote sensing image processing

    55/137

    Image Transformation

    A process through which one can re-express the information content of an image.

    Image transformations generate new images from two or more source images which highlight particular features or properties of interest better than the original input images.

  • 7/27/2019 Remote sensing image processing

    56/137

  • 7/27/2019 Remote sensing image processing

    57/137

    Various methods

    Arithmetic Operations

    Principal component analysis

    Hue, saturation and intensity (HSI) transformation

    Fourier and wavelet transformation

  • 7/27/2019 Remote sensing image processing

    58/137

    Arithmetic operations

    Image addition

    Image subtraction

    Image division (image ratioing)

    Image multiplication

    These operations are performed on two or more co-registered images of the same area.

    Division is the most widely used operation for geological, ecological and agricultural applications.

  • 7/27/2019 Remote sensing image processing

    59/137

    Vegetation indices (VIs)

    Image division serves to highlight subtle variations in the spectral responses of various surface covers.

    By ratioing the data from two different spectral bands, the resultant image enhances variations in the slopes of the spectral reflectance curves between the two spectral ranges that may otherwise be masked by the pixel brightness variations in each of the bands.

    VIs are combinations of surface reflectance at two or more wavelengths designed to highlight a particular property of vegetation.

  • 7/27/2019 Remote sensing image processing

    60/137

    More than 150 VIs have been published in the scientific literature so far.

    Only a small subset has a substantial biophysical basis or has been systematically tested.

    Important VIs are:

    Normalized Difference Vegetation Index (NDVI)

    Simple Ratio Index (SR)

    Enhanced Vegetation Index (EVI)

    Atmospherically Resistant Vegetation Index (ARVI)

    Soil Adjusted Vegetation Index (SAVI)

    NDVI

  • 7/27/2019 Remote sensing image processing

    61/137

    NDVI = (NIR - R) / (NIR + R)

    SR = NIR / R

    EVI = 2.5 (NIR - R) / (NIR + 6R - 7.5B + 1)

    ARVI = (NIR - (2R - B)) / (NIR + (2R - B))

    SAVI = (NIR - R)(1 + L) / (NIR + R + L)

    where NIR = near infrared, R = red, B = blue and L = a soil-brightness dependent correction factor.
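    A minimal sketch computing the indices listed above from co-registered reflectance bands held as NumPy arrays; the small eps term and the default L = 0.5 are my assumptions (L = 0.5 is a commonly used SAVI value), not from the slides.

```python
import numpy as np

def vegetation_indices(nir, red, blue, L=0.5, eps=1e-6):
    """Compute NDVI, SR, EVI, ARVI and SAVI from reflectance arrays of equal shape."""
    ndvi = (nir - red) / (nir + red + eps)
    sr   = nir / (red + eps)
    evi  = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1 + eps)
    arvi = (nir - (2 * red - blue)) / (nir + (2 * red - blue) + eps)
    savi = (nir - red) * (1 + L) / (nir + red + L + eps)
    return {"NDVI": ndvi, "SR": sr, "EVI": evi, "ARVI": arvi, "SAVI": savi}
```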

  • 7/27/2019 Remote sensing image processing

    62/137

    Principal component Analysis (PCA)

    An image transformation technique based on processing the statistical characteristics of multi-band data sets to produce new bands.

    Can be used to reduce data redundancy and correlation between bands.

    The new bands are called components.

    PCA attempts to pack the maximum amount of information (or variance) from the original data into the smallest number of new components.
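    A minimal sketch of PCA on a multi-band image (assuming the image is stored as an array of shape (bands, rows, cols)): pixels are treated as band vectors, the band covariance matrix is diagonalised, and the image is projected onto the eigenvectors ordered by decreasing variance.

```python
import numpy as np

def principal_components(image):
    """image: array (bands, rows, cols); returns the principal components, same shape."""
    bands, rows, cols = image.shape
    X = image.reshape(bands, -1).astype(float)       # one column per pixel
    X -= X.mean(axis=1, keepdims=True)
    cov = np.cov(X)
    eigvals, eigvecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]                # sort by decreasing variance
    pcs = eigvecs[:, order].T @ X
    return pcs.reshape(bands, rows, cols)
```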

  • 7/27/2019 Remote sensing image processing

    63/137

    Actual Image

    PCA1

    PCA2 PCA3

  • 7/27/2019 Remote sensing image processing

    64/137

    HSI transform

    Images are generally displayed in RGB colours (primary colours).

    An alternative to this is the hue, saturation and intensity (HSI) system.

    Hue refers to the average wavelength of the light contributing to the colour.

    Saturation specifies the purity of the colour relative to gray.

    Intensity relates to the total brightness of a colour.

    This is used for image enhancement.
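    A minimal sketch of an RGB-to-HSI conversion using the standard textbook formulas (my assumption; the slides may use a different variant). R, G and B are float arrays scaled to 0-1, and the hue is returned in radians.

```python
import numpy as np

def rgb_to_hsi(r, g, b, eps=1e-6):
    """Convert RGB (0-1 floats) to hue (radians), saturation and intensity."""
    i = (r + g + b) / 3.0                                     # intensity
    s = 1.0 - np.minimum(np.minimum(r, g), b) / (i + eps)     # saturation
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2 * np.pi - theta)            # hue
    return h, s, i
```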

  • 7/27/2019 Remote sensing image processing

    65/137

    HSI image of the first 3 PCA components

    RGB image from HSI

  • 7/27/2019 Remote sensing image processing

    66/137

    Image Classification

    A process of assigning pixels or groups of pixels in an image to one of a number of classes.

    The output of image classification is a thematic map.

    A thematic map is a map that focuses on a specific theme or subject area (like land use/cover in remote sensing).

    Land cover is the natural landscape recorded as surface components: forest, water, wetlands, urban, etc. Land cover can be documented by analyzing the spectral signatures of satellite and aerial imagery.

    Land use is the way humans use the landscape: residential, commercial, agricultural, etc. Land use can be inferred, but not explicitly derived, from satellite and aerial imagery.

  • 7/27/2019 Remote sensing image processing

    67/137

  • 7/27/2019 Remote sensing image processing

    68/137

    Conventional multispectral classification techniques perform class assignments based only on the spectral signatures of a classification unit.

    Contextual classification refers to the use of spatial, temporal, and other related information, in addition to the spectral information of a classification unit, in the classification of an image.

    Object based classification (based on segmentation techniques).

    A classification unit could be a pixel, a group of neighbouring pixels or the whole image.

  • 7/27/2019 Remote sensing image processing

    69/137

    Classification Approaches

    There are three approaches to pixel labeling

    1. Supervised

    2. Unsupervised

    3. Semi-supervised

    Steps in supervised classification

  • 7/27/2019 Remote sensing image processing

    70/137

    Steps in supervised classification

    Definition of Information Classes

    Training/Calibration Site Selection

        Locate areas of known classes on the image.

    Generation of Statistical Parameters (if a statistical classifier is used)

        Define the unique spectral characteristics of the selected pixels.

    Classification

        Assignment of unknown pixels to the appropriate information class (a sketch follows below).

    Accuracy Assessment

        Test/validation data are used for accuracy assessment.

    Output Stage
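    A minimal sketch of the statistics-based route through these steps, using a Gaussian maximum likelihood classifier of the kind listed later among the supervised classifiers (function names and data layout are my assumptions): a mean vector and covariance matrix are estimated per class from training pixels, and each unknown pixel is assigned to the class with the highest log-likelihood.

```python
import numpy as np

def train_mlc(samples):
    """samples: dict class_name -> array (n_pixels, n_bands) of training pixels."""
    stats = {}
    for name, X in samples.items():
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        stats[name] = (mu, np.linalg.inv(cov), np.log(np.linalg.det(cov)))
    return stats

def classify_mlc(pixels, stats):
    """pixels: array (n_pixels, n_bands); returns a class label per pixel."""
    names = list(stats)
    scores = np.empty((pixels.shape[0], len(names)))
    for j, name in enumerate(names):
        mu, inv_cov, log_det = stats[name]
        d = pixels - mu
        # Gaussian log-likelihood up to a constant: -0.5 * (log|C| + Mahalanobis distance)
        scores[:, j] = -0.5 * (log_det + np.einsum("ij,jk,ik->i", d, inv_cov, d))
    return np.array(names)[scores.argmax(axis=1)]
```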

  • 7/27/2019 Remote sensing image processing

    71/137

    Principle of supervised classification: training pixels and test pixels are drawn from the full image data, and the trained classifier is applied to the image.

  • 7/27/2019 Remote sensing image processing

    72/137

    Supervised classification

    Requires training areas to be defined by the

    analyst in order to determine the characteristics

    of each class.

  • 7/27/2019 Remote sensing image processing

    73/137

    Ground reference image of the study area

  • 7/27/2019 Remote sensing image processing

    74/137

  • 7/27/2019 Remote sensing image processing

    75/137

    Result of supervised classification: land cover map (legend: water, conifer, deciduous).

  • 7/27/2019 Remote sensing image processing

    76/137

    Supervised classifiers

    Maximum likelihood

    Neural network

    Decision tree

    Kernel based sparse classifiers

  • 7/27/2019 Remote sensing image processing

    77/137

    Back-propagation neural network classifier: input layer, hidden layer and output layer, connected by weights w_ij and w_jk.

  • 7/27/2019 Remote sensing image processing

    78/137

    By Maximum likelihood classifier

  • 7/27/2019 Remote sensing image processing

    79/137

    By Neural network classifier

  • 7/27/2019 Remote sensing image processing

    80/137

    By support vector machine

  • 7/27/2019 Remote sensing image processing

    81/137

    Accuracy

    Classification accuracy with the ETM+ dataset using 7 classes:

    Classifier                 Accuracy (%)

    Maximum likelihood         82.60

    Decision tree              85.60

    Neural network             85.10

    Support vector machine     87.37


  • 7/27/2019 Remote sensing image processing

    82/137

    Unsupervised classifier

    Unsupervised classification searches for natural groups of pixels, called clusters, present within the data by assessing the relative locations of the pixels in feature space.

    An algorithm is used to identify unique clusters of points in feature space, which are then assumed to represent unique land cover classes.

    The number of clusters (i.e. classes) is defined by the user.

    These are automated procedures and therefore require minimal user interaction.


  • 7/27/2019 Remote sensing image processing

    83/137

    Unsupervised classifiers

    The most popular clustering algorithms used in remote sensing image classification are:

    1. ISODATA, a statistical clustering method, and

    2. the SOM (self-organising feature map), an unsupervised neural classification method.

    ISODATA

  • 7/27/2019 Remote sensing image processing

    84/137

    ISODATA

    Iterative Self-Organizing Data Analysis Technique. Parameters to be entered include:

    N - the maximum number of clusters the user wants (depends on knowledge of the area).

    T - a convergence threshold: the percentage of pixels whose class values must remain unchanged between iterations for the clustering to be considered converged.

    M - the maximum number of iterations to be performed.

  • 7/27/2019 Remote sensing image processing

    85/137

    ISODATA Procedure

    N arbitrary cluster means are established.

    The image is classified using a minimum distance classifier.

    A new mean for each cluster is calculated.

    The image is classified again using the new cluster means.

    Another new mean for each cluster is calculated.

    The image is classified again, and so on.

  • 7/27/2019 Remote sensing image processing

    86/137

    After each iteration, the algorithm calculates the percentage of pixels that remained in the same cluster between iterations.

    When this percentage exceeds T (the convergence threshold), the program stops, or,

    if the convergence threshold is never met, the program continues for M iterations and then stops (a sketch of this loop follows below).
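    A minimal sketch of the iterative procedure described on the last three slides, in a simplified, k-means-like form without ISODATA's cluster splitting and merging (the function name and the default N, T, M values are my own): N arbitrary means, minimum-distance classification, recompute means, repeat until the unchanged-pixel fraction exceeds T or M iterations are reached.

```python
import numpy as np

def isodata_like(pixels, N=5, T=0.98, M=20, seed=0):
    """pixels: array (n_pixels, n_bands); returns one cluster label per pixel."""
    rng = np.random.default_rng(seed)
    means = pixels[rng.choice(len(pixels), N, replace=False)].astype(float)
    labels = np.full(len(pixels), -1)
    for _ in range(M):
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)                    # minimum-distance classifier
        unchanged = np.mean(new_labels == labels)            # fraction of pixels that kept their cluster
        labels = new_labels
        for k in range(N):
            if np.any(labels == k):
                means[k] = pixels[labels == k].mean(axis=0)  # recompute the cluster means
        if unchanged >= T:                                   # convergence threshold reached
            break
    return labels
```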


  • 7/27/2019 Remote sensing image processing

    87/137

    Semi-supervised classification

    Uses a small number of labeled training samples to label a large amount of unlabeled data, because the collection of training data is expensive.

    The basic assumptions of most semi-supervised learning algorithms are:

    Nearby points are likely to have the same label.

    Similar data should have the same class label.

    Two points that are connected by a path going through high-density regions should have the same label.

    General image classification procedures

  • 7/27/2019 Remote sensing image processing

    88/137

    1. Selecting information classes such as urban, agriculture, forest areas, etc. Conduct field studies and collect ground information and other ancillary data of the study area.

    2. Pre-processing of the image, including radiometric, atmospheric, geometric and topographic corrections, and image enhancement.

    3. Creating a reference image from ground survey and the actual image to generate training signatures.

    4. Image classification:

    5. supervised mode: using training signatures;

    6. unsupervised mode: image clustering and cluster grouping.

    7. Post-processing: complete geometric correction, filtering and classification decoration.

    8. Accuracy assessment: compare classification results with ground truth.


  • 7/27/2019 Remote sensing image processing

    89/137

    Parametric and non parametric

    Parametric classifiers are based upon statistical parameters (the mean and standard deviation used by MLC) and on the assumption that the data are normally distributed.

    Non-parametric methods make no assumptions about the probability distribution of the data and are often considered robust because they may work well for a wide variety of class distributions, as long as the class signatures are reasonably distinct (NN, SVM, DT, etc.).


  • 7/27/2019 Remote sensing image processing

    90/137

    Image/Photo Interpretation

    The act of examining images for the purpose of identifying objects and judging their significance.

    Depending upon the instruments employed for data collection, one can interpret a variety of images such as aerial photographs and scanner, thermal and radar imagery.

    Even digitally processed imagery requires image interpretation.


  • 7/27/2019 Remote sensing image processing

    91/137

    Basic principles of interpretation

    An image is a pictorial representation of the pattern of a landscape, which is composed of elements - indicators of things and events that reflect the physical, biological, and cultural components of the landscape.

    Similar conditions in similar surroundings reflect similar patterns, and unlike conditions reflect unlike patterns.

    The type and nature of the extracted information depend on the knowledge, skill, and experience of the interpreter, the method used for interpretation and an understanding of its limitations.

    Elements of image interpretation

  • 7/27/2019 Remote sensing image processing

    92/137

    Elements of image interpretation

    Shape

    Size

    Tone

    Shadow

    Pattern

    Texture

    Association

    Site

    Time

    Shape

  • 7/27/2019 Remote sensing image processing

    93/137

    Shape

    Shape refers to the general form or outline of an individual object.

    Man-made features have specific shapes.

    A railway is readily distinguishable from a road or a canal, as its shape consists of long straight tangents and gentle curves, as opposed to the curved shape of a highway.

  • 7/27/2019 Remote sensing image processing

    94/137

    Size

  • 7/27/2019 Remote sensing image processing

    95/137

    Size

    Length, width, height, area and volume of the object. Size is a function of image scale.

  • 7/27/2019 Remote sensing image processing

    96/137

    Tone/colour

  • 7/27/2019 Remote sensing image processing

    97/137

    Tone/colour

    The tone of an object refers to its relative brightness or colour in an image.

    It is one of the fundamental elements used to differentiate between different objects.

    It is a qualitative measure.

    No feature has a constant tone.

    Tone varies with the reflectivity of the object, the weather, the angle of light on the object and the moisture content of the surface.

    The sensitivity of tone to all the aforementioned variables makes it a very discriminating factor.

    Slight changes in the natural landscape are more easily comprehended because of tonal variations.

  • 7/27/2019 Remote sensing image processing

    98/137

    Shadow

  • 7/27/2019 Remote sensing image processing

    99/137

    Shadow

    A shadow provides information about the object's height, shape, and orientation (e.g. tree species).

    Patterns

  • 7/27/2019 Remote sensing image processing

    100/137

    Patterns

    Similar to shape, the spatial arrangement of objects (e.g. row crops vs. pasture) is also useful to identify an object and its usage.

    Spatial phenomena such as the structural pattern of an object in an image may be characteristic of artificial as well as natural objects, such as parceling (land plot) patterns, land use, the geomorphology of tidal marshes or shallows, land reclamation, erosion gullies, tillage, planting direction, ridges of sea waves, lake districts, natural terrain, etc.

  • 7/27/2019 Remote sensing image processing

    101/137

    Texture

  • 7/27/2019 Remote sensing image processing

    102/137

    Texture

    The frequency of tonal change in a particular area of an image.

    A qualitative characteristic, generally referred to as rough or smooth.

    Texture involves the total sum of tone, shape, pattern and size, which together give the interpreter an intuitive feeling for the landscape being analyzed.

  • 7/27/2019 Remote sensing image processing

    103/137

    Association

  • 7/27/2019 Remote sensing image processing

    104/137

    Association

    Associating the presence of one object with another, or relating it to its environment, can help identify the object (e.g. industrial buildings often have access to railway sidings; nuclear power plants are often located beside large bodies of water).

    For example, white irregular patches adjacent to the sea refer to sand.

  • 7/27/2019 Remote sensing image processing

    105/137

  • 7/27/2019 Remote sensing image processing

    106/137

    Site

  • 7/27/2019 Remote sensing image processing

    107/137

    Site

    The location of an object amidst certain terrain characteristics shown by the image may exclude incorrect conclusions, e.g. the site of an apartment building is not plausible in a swamp or a jungle.

    Time

  • 7/27/2019 Remote sensing image processing

    108/137

    Time

    The temporal characteristics of a series of photographs can be helpful in determining the historical change of an area (e.g. looking at a series of photos of a city taken in different years can help determine the growth of suburban neighborhoods).

  • 7/27/2019 Remote sensing image processing

    109/137

    Activities in image interpretation

  • 7/27/2019 Remote sensing image processing

    110/137

    Activities in image interpretation

    Detection: selectively picking up the objects of importance for the particular kind of interpretation.

    Recognition and identification: classification of an object, by means of specific knowledge, within a known category upon its detection in the image.

    Analysis: the process of separating sets of similar objects; involves drawing boundary lines.

    Deduction: separation of different groups of objects and deducing their significance based on converging evidence.

    Classification: establishment of the identity of the objects delineated by analysis.

    Idealization: standardization of the representation of what is actually seen in the imagery.

  • 7/27/2019 Remote sensing image processing

    111/137

    Applications

    Assessment of Ground water Quality

  • 7/27/2019 Remote sensing image processing

    112/137

    for Potability (Journal of Geographic Information System, 2010, 2, 152-162)

    Water quality management is an important issue in modern times.

    The data collected for Tiruchirappalli city have been used to develop the approach.

    Groundwater quality for potability indicated high to moderate water pollution levels in the Srirangam, Ariyamangalam, Golden Rock and K. Abisekapurm zones of the study area, depending on factors such as depth to groundwater, the constituents of groundwater and the vulnerability of groundwater to pollution.

  • 7/27/2019 Remote sensing image processing

    113/137

    Groundwater vulnerability mapping

  • 7/27/2019 Remote sensing image processing

    114/137

    Published in Applied Geography by Barnali Dixon

    The overall goal of this research is to improve the methodology for the generation of contamination potential maps by using detailed landuse/pesticide and soil structure information in conjunction with selected parameters from the DRASTIC model.

    Other objectives are to incorporate GIS, GPS, remote sensing and a fuzzy rule-based model to generate groundwater sensitivity maps, and to compare the results of the new methodologies with the modified DRASTIC Index (DI) and field water quality data.

  • 7/27/2019 Remote sensing image processing

    115/137

    DRASTIC, an overlay and index method developed for the Environmental Protection Agency (EPA) by the American Water Well Association (Aller et al., 1987), is a widely used model.

    This model assesses the contamination potential of an area by combining key factors influencing solute transport.

    The original DRASTIC Index (DIorg) was calculated using the most important hydrogeologic factors that affect the potential for groundwater pollution.

    The DRASTIC Index does not provide absolute answers; it only differentiates highly vulnerable areas from less vulnerable areas.

    The model does not include soil structure.

  • 7/27/2019 Remote sensing image processing

    116/137

  • 7/27/2019 Remote sensing image processing

    117/137

    The landuse data used in this study were obtained from Landsat 5 Thematic Mapper (TM) imagery from 1992. TM images from two seasons, spring and summer, were used in this study.

    The images were classified into 4 level-I classes: urban, forest, water and agriculture. The images were then further classified for agricultural crops.

  • 7/27/2019 Remote sensing image processing

    118/137

  • 7/27/2019 Remote sensing image processing

    119/137

    Satellite remote sensing of surface air quality

  • 7/27/2019 Remote sensing image processing

    120/137

    In Atmospheric Environment 42 (2008) 7823-7843

    by Randall V. Martin

    Global observations are now available for a wide range of species including aerosols, tropospheric O3, tropospheric NO2, CO, HCHO, and SO2.

  • 7/27/2019 Remote sensing image processing

    121/137

  • 7/27/2019 Remote sensing image processing

    122/137

    HCHO (formaldehyde) columns are closely related to surface VOC (volatile organic compound) emissions, since HCHO is a high-yield intermediate product of the oxidation of reactive non-methane VOCs.

  • 7/27/2019 Remote sensing image processing

    123/137

  • 7/27/2019 Remote sensing image processing

    124/137

    Hyperspectral Remote Sensing of Water

  • 7/27/2019 Remote sensing image processing

    125/137

    Hyperspectral Remote Sensing of Water Quality Parameters for Large Rivers in the Ohio River Basin

    Naseer A. Shafique, Florence Fulk, Bradley C. Autrey, Joseph Flotemersch

    Compact Airborne Spectrographic Imager (CASI) data was used.

    In situ water samples were collected and a field spectrometer was used to collect spectral data directly from the river.

  • 7/27/2019 Remote sensing image processing

    126/137

    Method of 2D and 3D Air Quality monitoring using a Lidar

  • 7/27/2019 Remote sensing image processing

    127/137

    Published in the 16th Conference on Air Pollution Meteorology

    To characterize urban and industrial pollution in France.

    A lidar (Light Detection And Ranging) equipped with a scanning device allows mapping of particles.

    Used in industrial sites for plume detection, in urban sites to show pollution from traffic, and also in a tunnel with heavy traffic.

    During this experiment a lidar working at 355 nm with a spatial resolution of 1.5 m was used.

    It is equipped with a cross-polarised channel which discriminates non-spherical particles from the others.

    For these measurements the lidar was placed in a horizontal position. The lidar signal is inverted using the so-called slope method. From this calculation we retrieve the backscatter profile and then calculate an extinction value along the optical path, and detect the plumes very accurately.

  • 7/27/2019 Remote sensing image processing

    128/137

  • 7/27/2019 Remote sensing image processing

    129/137

  • 7/27/2019 Remote sensing image processing

    130/137

    Horizontal scanning from Lyon near a tunnel. We can observe a huge

    concentration of particles at the intersection of many roads.

    Mapping of heavy metal pollution in stream

    sediments using combined

  • 7/27/2019 Remote sensing image processing

    131/137


    geochemistry, field spectroscopy, and hyperspectral

    remote sensing

    The aim of this study is to derive parameters from spectral variations associated with heavy metals in soil and to explore the possibility of extending the use of these parameters to hyperspectral images and to map the distribution of areas affected by heavy metals on HyMAP data. Variations in the spectral absorption features of lattice OH and oxygen on the mineral surface due to the combination of different heavy metals were linked to actual concentrations of heavy metals.

  • 7/27/2019 Remote sensing image processing

    132/137

    Location map and HyMAP image of the Rodalquilar area, SE Spain. (a) Locations of

    sampling points along the studied main stream, showing the five sections. (b) HyMAP

    image acquired in 2004.

  • 7/27/2019 Remote sensing image processing

    133/137

  • 7/27/2019 Remote sensing image processing

    134/137

    Soil Mapping

  • 7/27/2019 Remote sensing image processing

    135/137

    Soil survey and mapping using remote sensing, Tropical Ecology 43(1): 61-74, 2002

    M.L. Manchanda, M. Kudrat & A.K. Tiwari

  • 7/27/2019 Remote sensing image processing

    136/137

    Impact of industrialization on forest and non-forest

    Assessing impact of industrialization in terms of LULC in a dry tropical region (Chhattisgarh), India using remote sensing data and GIS over a period of 30 years, Environ Monit Assess (2009) 149:371-376

    Multi-sensor data fusion for the detection of underground coal fires, X.M. Zhang, C.J.S. Cassells & J.L. van Genderen, Geologie en Mijnbouw 77: 117-127, 1999.

  • 7/27/2019 Remote sensing image processing

    137/137

    http://www.fas.org/irp/imint/docs/rst/Front/tofc.html

    For large number of applications of remote sensing data

    www.cof.orst.edu/cof/teach/...powerpoint.../imageclassification.ppt

    http://www.nrcan.gc.ca/earth-sciences/geography-boundary/remote-

    sensing/fundamentals/

    http://nature.berkeley.edu/~gong/textbook/

    rst.gsfc.nasa.gov/

    http://nptel.iitm.ac.in/courses/Webcourse-contents/IIT-

    KANPUR/ModernSurveyingTech/ui/TOC1.htm

    http://www.ccrs.nrcan.gc.ca/glossary/

    http://maic.jmu.edu/sic/rs/resolution.htm

    http://www.gis.unbc.ca/courses/geog432/lectures/lect7/index.php
