Rem Sensing 8
Transcript of Rem Sensing 8
![Page 1: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/1.jpg)
Image Classification: Supervised Methods
Lecture 8
Prepared by R. Lathrop 11/99
Updated 3/06
Readings:
ERDAS Field Guide 5th Ed. Ch 6:234-260
![Page 2: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/2.jpg)
Where in the World?
![Page 3: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/3.jpg)
Learning objectives
• Remote sensing science concepts
– Basic concept of supervised classification
– Major classification algorithms
– Hard vs. fuzzy classification
• Math concepts
• Skills
– Training set selection: digitized polygon vs. seed pixel/region growing
– Training aids: plots of training data, statistical measures of separability
– Edit/evaluate signatures
– Applying classification algorithms
![Page 4: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/4.jpg)
Supervised vs. Unsupervised Approaches
• Supervised - image analyst "supervises" the selection of spectral classes that represent patterns or land cover features that the analyst can recognize
Prior Decision
• Unsupervised - statistical "clustering" algorithms used to select spectral classes inherent to the data, more computer-automated
Posterior Decision
![Page 5: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/5.jpg)
Supervised vs. Unsupervised
Supervised workflow: Select training fields → Edit/evaluate signatures → Classify image → Evaluate classification
Unsupervised workflow: Run clustering algorithm → Edit/evaluate signatures → Identify classes → Evaluate classification
![Page 6: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/6.jpg)
Supervised vs. Unsupervised
(Spectral feature space plot: Red vs. NIR reflectance.)
Supervised Prior Decision: from Information classes in the Image to Spectral Classes in Feature Space
Unsupervised Posterior Decision: from Spectral Classes in Feature Space to Information Classes in the Image
![Page 7: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/7.jpg)
Training
• Training: the process of defining criteria by which spectral patterns are recognized
• Spectral signature: result of training that defines a training sample or cluster
– parametric: based on statistical parameters that assume a normal distribution (e.g., mean, covariance matrix)
– nonparametric: not based on statistics, but on discrete objects (polygons) in feature space
![Page 8: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/8.jpg)
Supervised Training Set Selection
• Objective - selecting a homogeneous (unimodal) area for each apparent spectral class
• Digitize polygons - high degree of user control; often results in an overestimate of spectral class variability
• Seed pixel - region-growing technique to reduce within-class variability; the analyst sets a threshold of acceptable variance, total # of pixels, and adjacency criteria (horiz/vert, diagonal)
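The seed-pixel procedure described above can be sketched as a simple region-growing routine. This is a minimal single-band illustration, not the ERDAS implementation: it measures spectral distance from the seed pixel's DN (rather than a running region mean), and the image values, threshold, and adjacency option are hypothetical.

```python
from collections import deque

def region_grow(image, seed, max_dist, diagonal=False):
    """Grow a training region from a seed pixel: accept connected
    neighbors whose DN differs from the seed's DN by at most max_dist."""
    rows, cols = len(image), len(image[0])
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # horiz/vert adjacency
    if diagonal:  # optionally allow diagonal adjacency too
        offsets += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    seed_dn = image[seed[0]][seed[1]]
    region, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in offsets:
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_dn) <= max_dist):
                region.add((nr, nc))
                frontier.append((nr, nc))
    return region

# Hypothetical 3x3 single-band image: a dark patch next to a bright one
image = [[50, 52, 90],
         [51, 53, 91],
         [49, 88, 92]]
print(sorted(region_grow(image, (0, 0), max_dist=7)))
# [(0, 0), (0, 1), (1, 0), (1, 1), (2, 0)]
```

Tightening `max_dist` shrinks the training region, which is how the analyst controls within-class variability.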
![Page 9: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/9.jpg)
ERDAS Area of Interest (AOI) tools
Seed pixel or region growing dialog
![Page 10: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/10.jpg)
Region Growing: good for linear features
(left) Spectral Distance = 7; (right) Spectral Distance = 10
![Page 11: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/11.jpg)
Region Growing: good for spectrally heterogeneous features
(left) Spectral Distance = 5; (right) Spectral Distance = 10
![Page 12: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/12.jpg)
Supervised Training Set Selection
Whether using the digitized polygon or seed pixel technique, the analyst should select multiple training sites to identify the many possible spectral classes in each information class of interest
![Page 13: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/13.jpg)
Guided Clustering: hybrid supervised/unsupervised approach
• Polygonal areas of known land cover type are delineated as training sites
• ISODATA unsupervised clustering performed on these training sites
• Clusters evaluated and then combined into a single training set of spectral signatures
![Page 14: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/14.jpg)
Training Stage
• Training set ---> training vector
• Training vector for each spectral class- represents a sample in n-dimensional measurement space where n = # of bands
For a given spectral class j:
Xj = [X1, X2, …, Xn]ᵀ, where X1 = mean DN in band 1, X2 = mean DN in band 2, etc.
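Computing the training vector is just a per-band mean over the training pixels; a minimal sketch with made-up two-band samples:

```python
def training_vector(samples):
    """Mean DN in each band over a training set.
    samples: list of per-pixel DN tuples, one value per band."""
    n = len(samples)
    return [sum(pixel[b] for pixel in samples) / n
            for b in range(len(samples[0]))]

# Three hypothetical training pixels in a two-band image (band 1, band 2)
pixels = [(40, 120), (44, 118), (42, 122)]
print(training_vector(pixels))  # [42.0, 120.0]
```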
![Page 15: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/15.jpg)
Classification Training Aids
• Goal: evaluate spectral class separability
• 1) Graphical plots of training data
– histograms
– coincident spectral plots
– scatter plots
• 2) Statistical measures of separability
– divergence
– Mahalanobis distance
• 3) Training area classification
• 4) Quick alarm classification
– parallelepiped
![Page 16: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/16.jpg)
Parametric vs. Nonparametric Distance Approaches
• Parametric - based on statistical parameters assuming normal distribution of the clusters
e.g., mean, std dev., covariance
• Nonparametric - not based on "normal" statistics, but on discrete objects and simple spectral distance in feature space
![Page 17: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/17.jpg)
Parametric Assumption: each spectral class exhibits a unimodal normal distribution
(Histogram: # of pixels vs. Digital Number 0-255; Class 1 and Class 2 are each unimodal, while a bimodal histogram indicates a mix of Class 1 & 2.)
![Page 18: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/18.jpg)
Training Aids
• Graphical portrayals of training data
– histogram (check for normality): a unimodal histogram is "good", a multimodal one is "bad"
![Page 19: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/19.jpg)
Training Aids
• Graphical portrayals of training data
– coincident spectral mean plots
![Page 20: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/20.jpg)
Training Aids
• Scatter plots: each training set sample constitutes an ellipse in feature space
• Provides 3 pieces of information:
– location of ellipse: mean vector
– shape of ellipse: covariance
– orientation of ellipse: slope & sign of covariance
• Need training vector and covariance matrix
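The mean vector and covariance matrix that define each signature ellipse can be estimated directly from the training pixels. A minimal sketch with hypothetical two-band samples (the sign of the off-diagonal term gives the ellipse orientation):

```python
def covariance_matrix(samples):
    """Mean vector and sample covariance matrix of a training set
    (list of per-pixel DN tuples, one value per band)."""
    n = len(samples)
    bands = len(samples[0])
    means = [sum(p[b] for p in samples) / n for b in range(bands)]
    cov = [[sum((p[i] - means[i]) * (p[j] - means[j]) for p in samples) / (n - 1)
            for j in range(bands)]
           for i in range(bands)]
    return means, cov

# Hypothetical two-band training pixels
pixels = [(40, 120), (44, 118), (42, 122)]
means, cov = covariance_matrix(pixels)
print(means)      # [42.0, 120.0]  -> location of the ellipse
print(cov[0][1])  # -2.0           -> negative covariance: ellipse tilts down
```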
![Page 21: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/21.jpg)
Spectral Feature Space
(Spectral feature space plot, Red vs. NIR reflectance, with signature ellipses for Water, Grass, Trees (Conifer, Broadleaf), Mix: grass/trees, and Impervious Surface & Bare Soil.)
Examine ellipses for gaps and overlaps. Overlapping ellipses are OK within information classes; limit overlap between information classes.
![Page 22: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/22.jpg)
Training Aids
• Are some training sets redundant, or do they overlap too greatly?
• Statistical measures of separability: expressions of statistical distance that are sensitive to both mean and variance
– divergence
– Mahalanobis distance
![Page 23: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/23.jpg)
Training Aids
• Training/Test Area classification: look for misclassification between information classes; training areas can be biased, better to use independent test areas
• Quick alarm classification: on-screen evaluation of all pixels that fall within the training decision region (e.g., parallelepiped)
![Page 24: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/24.jpg)
Classification Decision Process
• Decision Rule: mathematical algorithm that, using data contained in the signature, performs the actual sorting of pixels into discrete classes
• Parametric vs. nonparametric rules
![Page 25: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/25.jpg)
Parallelepiped or box classifier
• Decision region is the rectangular area bounded by the highest and lowest DNs in each band; specify by range (min/max) or std dev.
• Pro: Takes variance into account, but lacks sensitivity to covariance (Con)
• Pro: Computationally efficient, useful as a first pass
• Pro: Nonparametric
• Con: Decision regions may overlap; some pixels may remain unclassified
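A minimal sketch of the box decision rule, using per-band min/max limits from hypothetical training pixels; a pixel can fall in zero boxes (unclassified) or several (overlap region):

```python
def box_limits(samples):
    """Per-band (min, max) limits from a class's training pixels."""
    bands = list(zip(*samples))
    return [(min(b), max(b)) for b in bands]

def box_classify(pixel, boxes):
    """Return the labels of every box the pixel falls inside.
    Zero labels -> unclassified; more than one -> overlap region."""
    hits = []
    for label, limits in boxes.items():
        if all(lo <= dn <= hi for dn, (lo, hi) in zip(pixel, limits)):
            hits.append(label)
    return hits

# Hypothetical two-band training sets for two classes
boxes = {
    "water": box_limits([(10, 5), (14, 9), (12, 7)]),
    "veg":   box_limits([(30, 80), (36, 90), (33, 85)]),
}
print(box_classify((12, 6), boxes))     # ['water']
print(box_classify((200, 200), boxes))  # [] -> unclassified
```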
![Page 26: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/26.jpg)
Parallelepiped or Box Classifier
(Spectral feature space plot, Red vs. NIR reflectance: upper and lower limits of each box set by either range (min/max) or # of standard devs. Note overlap in the Red band but not the NIR band.)
![Page 27: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/27.jpg)
Parallelepipeds have “corners”
(Figure, adapted from ERDAS Field Guide: a signature ellipse centered at (μred, μNIR) sits inside its parallelepiped boundary; a candidate pixel can fall inside a “corner” of the box yet far outside the ellipse.)
![Page 28: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/28.jpg)
Parallelepiped or Box Classifier: problems
(Figure, adapted from Lillesand & Kiefer, 1994: feature space (Red vs. NIR reflectance) with boxes for Soil 1-3, Water 1-2, and Veg 1-3, showing an overlap region, a misclassified pixel, and unclassified pixels outside all boxes.)
![Page 29: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/29.jpg)
Minimum distance to means
• Compute the mean of each desired class, then classify unknown pixels into the class with the closest mean using simple Euclidean distance
• Con: insensitive to variance & covariance
• Pro: computationally efficient
• Pro: all pixels classified; thresholding can be used to eliminate pixels far from any mean
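A minimal sketch of the minimum-distance rule with hypothetical class means; the optional threshold illustrates how pixels far from every mean can be left unclassified:

```python
import math

def min_distance_classify(pixel, means, threshold=None):
    """Assign a pixel to the class with the nearest mean vector
    (Euclidean distance); optionally leave it unclassified if even
    the nearest mean is farther than the threshold."""
    best, best_d = None, float("inf")
    for label, mean in means.items():
        d = math.dist(pixel, mean)
        if d < best_d:
            best, best_d = label, d
    if threshold is not None and best_d > threshold:
        return None  # thresholded out: unclassified
    return best

# Hypothetical two-band class means
means = {"water": (12, 7), "veg": (33, 85), "soil": (60, 50)}
print(min_distance_classify((30, 80), means))         # veg
print(min_distance_classify((200, 200), means, 100))  # None
```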
![Page 30: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/30.jpg)
Minimum Distance to Means Classifier
(Figure, adapted from Lillesand & Kiefer, 1994: feature space (Red vs. NIR reflectance) with class means for Soil 1-3, Water 1-2, and Veg 1-3; each pixel is assigned to the nearest mean.)
![Page 31: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/31.jpg)
Minimum Distance to Means Classifier: Euclidean Spectral Distance
Example: two points in feature space, (X, Y) = (92, 153) and (180, 85):
Xd = 180 − 92 = 88
Yd = 85 − 153 = −68
Distance = √(Xd² + Yd²) = √(88² + 68²) ≈ 111.2
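The distance in the worked example above can be checked directly:

```python
import math

# The two feature-space points from the slide
x1, y1 = 92, 153
x2, y2 = 180, 85

xd = x2 - x1  # 88
yd = y2 - y1  # -68
distance = math.sqrt(xd**2 + yd**2)
print(round(distance, 1))  # 111.2
```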
![Page 32: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/32.jpg)
Feature Space Classification
• Image analyst draws decision regions directly on the feature space image using AOI tools - often useful for a first-pass broad classification
• Pixels that fall within a user-defined feature space class are assigned to that class
• Pro: Good for classes with a non-normal distribution
• Con: Potential problem with overlap and unclassified pixels
![Page 33: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/33.jpg)
Feature Space Classifier
(Spectral feature space plot, Red vs. NIR reflectance: the analyst draws decision regions directly in feature space.)
![Page 34: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/34.jpg)
Statistically-based classifiers
• Defines a probability density (statistical) surface
• Each pixel is evaluated for its statistical probability of belonging in each category, assigned to class with maximum probability
• The probability density function for each spectral class can be completely described by the mean vector and covariance matrix
![Page 35: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/35.jpg)
Parametric Assumption: each spectral class exhibits a unimodal normal distribution
(Histogram: # of pixels vs. Digital Number 0-255; Class 1 and Class 2 are each unimodal, while a bimodal histogram indicates a mix of Class 1 & 2.)
![Page 36: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/36.jpg)
2D vs. 1D views of class overlap
(Figure: classes wi and wj overlap in the 1-D Band 1 histogram (# of pixels vs. Digital Number 0-255) but separate in the 2-D Band 1 vs. Band 2 feature space.)
![Page 37: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/37.jpg)
Probabilities used in likelihood ratio
(Figure: class-conditional probability densities p(x | wi) and p(x | wj) plotted against Digital Number 0-255.)
![Page 38: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/38.jpg)
Spectral classes as probability surfaces
(Spectral feature space plot, Red vs. NIR reflectance.) Ellipses defined by class mean and covariance create likelihood contours around each spectral class.
![Page 39: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/39.jpg)
Sensitive to large covariance values
(Spectral feature space plot, Red vs. NIR reflectance: some classes may have large variance and greatly overlap other spectral classes.)
![Page 40: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/40.jpg)
Mahalanobis Distance Classifier
D = (X − Mc)ᵀ (COVc⁻¹) (X − Mc)
where:
D = Mahalanobis distance
c = a particular class
X = measurement vector of the candidate pixel
Mc = mean vector of class c
COVc = covariance matrix of class c
COVc⁻¹ = inverse of the covariance matrix
T = transpose
Pro: takes the variability of the classes into account with info from the COV matrix
Similar to maximum likelihood, but without the weighting factors
Con: parametric, therefore sensitive to large variances
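A minimal two-band sketch of the distance formula above, using the closed-form inverse of a 2x2 covariance matrix; the mean vector and covariance values are hypothetical:

```python
def mahalanobis2(x, mean, cov):
    """D = (X - Mc)^T (COVc^-1) (X - Mc) for two bands, using the
    closed-form inverse of a 2x2 covariance matrix."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    # (X - Mc)^T COV^-1 (X - Mc)
    t0 = inv[0][0] * dx[0] + inv[0][1] * dx[1]
    t1 = inv[1][0] * dx[0] + inv[1][1] * dx[1]
    return dx[0] * t0 + dx[1] * t1

# Hypothetical class signature: band variances 4 and 25, no covariance
mean = [40.0, 120.0]
cov = [[4.0, 0.0], [0.0, 25.0]]
print(mahalanobis2([44.0, 120.0], mean, cov))  # 4.0: 2 std devs away in band 1
```

Because the covariance matrix rescales each axis, a 4-DN offset in the low-variance band counts for much more than the same offset in the high-variance band.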
![Page 41: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/41.jpg)
Maximum likelihood classifier
• Pro: potentially the most accurate classifier as it incorporates the most information (mean vector and COV matrix)
• Con: Parametric procedure that assumes the spectral classes are normally distributed
• Con: sensitive to large values in the covariance matrix
• Con: computationally intensive
![Page 42: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/42.jpg)
Bayes Optimal approach
• Designed to minimize the average (expected) cost of misclassification in the maximum likelihood approach
• Uses an a priori (prior probability) term to weight decisions - weights more heavily toward common classes
• Example: if prior probability suggests that 60% of the pixels are forest, the classifier weights borderline cases more heavily toward forest
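A minimal one-band sketch of prior weighting, assuming normally distributed classes with hypothetical signatures; at a DN exactly between the two class means the likelihoods tie, and the prior decides the borderline case:

```python
import math

def gauss_pdf(x, mean, std):
    """Normal density, used as the class likelihood p(x | class)."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayes_classify(x, classes, priors):
    """Pick the class maximizing prior * likelihood."""
    return max(classes, key=lambda c: priors[c] * gauss_pdf(x, *classes[c]))

# Hypothetical one-band signatures: (mean DN, std dev)
classes = {"forest": (100, 10), "grass": (120, 10)}

# DN 110 is exactly between the two means, so the likelihoods are equal
# and the prior tips the decision.
print(bayes_classify(110, classes, {"forest": 0.6, "grass": 0.4}))  # forest
print(bayes_classify(110, classes, {"forest": 0.4, "grass": 0.6}))  # grass
```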
![Page 43: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/43.jpg)
Hybrid classification
• Can easily mix various classification algorithms in a multi-step process
• First pass: a nonparametric rule (feature space or parallelepiped) handles the most obvious cases; pixels remaining unclassified or in overlap regions fall to the second pass
• Second pass: a parametric rule handles the difficult cases; the training data can be derived from unsupervised or supervised techniques
![Page 44: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/44.jpg)
Thresholding
• Statistically-based classifiers do poorest near the tails of the training sample data distributions
• Thresholds can be used to define those pixels that have a higher probability of misclassification; these pixels can be excluded and labeled unclassified, or retrained using a cluster-busting type of approach
![Page 45: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/45.jpg)
Thresholding: define those pixels that have a higher probability of misclassification
(Histogram: # of pixels vs. Digital Number 0-255; the tails of Class 1 and Class 2 beyond the threshold remain unclassified.)
![Page 46: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/46.jpg)
Thresholding
• Chi-square distribution used to help define a one-tailed threshold
(Figure: chi-square distribution of distances; values above the threshold remain unclassified.)
![Page 47: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/47.jpg)
Hard vs. Fuzzy Classification Rules
• Hard - “binary” either/or situation: a pixel belongs to one & only one class
• Fuzzy - soft boundaries, a pixel can have partial membership to more than one class
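One simple way to illustrate partial membership is to normalize inverse spectral distances to the class means so the memberships sum to 1. This is a hedged example of one possible membership function, not the ERDAS rule, and the class means are hypothetical:

```python
import math

def fuzzy_memberships(pixel, means):
    """Soft class memberships from inverse spectral distance to each
    class mean, normalized to sum to 1 (one simple membership
    function among many possible choices)."""
    inv = {c: 1.0 / max(math.dist(pixel, m), 1e-9) for c, m in means.items()}
    total = sum(inv.values())
    return {c: v / total for c, v in inv.items()}

# Hypothetical two-band class means; the pixel sits between them,
# so it gets partial membership in both classes.
means = {"water": (10.0, 5.0), "forest": (30.0, 80.0)}
m = fuzzy_memberships((20.0, 42.0), means)
print(round(sum(m.values()), 6))  # 1.0
```

A hard classifier would force this mixed pixel into a single class; the fuzzy memberships retain the "partly water, partly forest" information.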
![Page 48: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/48.jpg)
Hard vs. Fuzzy Classification
(Figure, adapted from Jensen, 2nd ed., 1996: Water, Forested Wetland, and Forest mapped with hard vs. fuzzy classification.)
![Page 49: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/49.jpg)
Hard vs. Fuzzy Classification
(Feature space plot, NIR vs. MIR reflectance, adapted from Jensen, 2nd ed., 1996: hard decision boundaries separating Water, Forested Wetland, and Forest.)
![Page 50: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/50.jpg)
Fuzzy Classification: In ERDAS
• Fuzzy Classification: in the Supervised Classification option, the analyst can choose Fuzzy Classification and then choose the number of “best classes” per pixel.
• This creates multiple output classification layers, as many as the number of best classes chosen above.
![Page 51: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/51.jpg)
Fuzzy Classification: In ERDAS
• Fuzzy Convolution: calculates the total weighted inverse distance of all the classes in a window of pixels and assigns the center pixel the class with the largest total inverse distance summed over the entire set of fuzzy classification layers.
• This has the effect of creating a context-based classification.
• Classes with a very small distance value will remain unchanged, while classes with higher distance values may change to a neighboring value if there are a sufficient number of neighboring pixels with class values and small corresponding distance values.
![Page 52: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/52.jpg)
Main points of the lecture
• Training:
– Training set selection: digitized polygon vs. seed pixel/region growing
– Training aids: plots of training data, statistical measures of separability
– Edit/evaluate signatures
• Classification algorithms:
– box classifier
– minimum distance to means classifier
– feature space classifier
– statistically-based classifiers (maximum likelihood classifier, Mahalanobis distance classifier)
• Hybrid classification: statistical + threshold method
• Hard vs. fuzzy classification
![Page 53: Rem Sensing 8](https://reader035.fdocuments.net/reader035/viewer/2022062320/55cf9aac550346d033a2d834/html5/thumbnails/53.jpg)
Homework
1. Homework: Unsupervised classification (hand in your Excel file and figure process);
2. Reading: Textbook Ch. 9: 337-389;
3. Reading: Field Guide Ch. 7: 226-231, 235-253.