
HYPERSPECTRAL DATA PROCESSING


HYPERSPECTRAL DATA PROCESSING
Algorithm Design and Analysis

Chein-I Chang

University of Maryland, Baltimore County (UMBC), Maryland, USA


Copyright © 2013 by John Wiley & Sons, Inc. All rights reserved

Published by John Wiley & Sons, Inc., Hoboken, NJ

Published simultaneously in Canada

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data:

Chang, Chein-I.

Hyperspectral data processing : algorithm design and analysis / Chein-I Chang.

p. cm.

Includes bibliographical references and index.

ISBN 978-0-471-69056-6 (hardback)

1. Image processing–Digital techniques. 2. Spectroscopic imaging. 3. Signal processing. I. Chang, Chein-I. Hyperspectral imaging. II. Title.

TA1637.C4776 2012

621.39’94–dc23

2011043896

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1


This book is dedicated to members of my family, specifically my mother, who provided me with her timeless support and encouragement during the course of preparing this book. It is also dedicated to all of my students who have contributed to this book.


Contents

PREFACE xxiii

1 OVERVIEW AND INTRODUCTION 1

1.1 Overview 2

1.2 Issues of Multispectral and Hyperspectral Imageries 3

1.3 Divergence of Hyperspectral Imagery from Multispectral Imagery 4

1.3.1 Misconception: Hyperspectral Imaging is a Natural Extension of Multispectral Imaging 4

1.3.2 Pigeon-Hole Principle: Natural Interpretation of Hyperspectral Imaging 5

1.4 Scope of This Book 7

1.5 Book’s Organization 10

1.5.1 Part I: Preliminaries 10

1.5.2 Part II: Endmember Extraction 12

1.5.3 Part III: Supervised Linear Hyperspectral Mixture Analysis 13

1.5.4 Part IV: Unsupervised Hyperspectral Analysis 13

1.5.5 Part V: Hyperspectral Information Compression 15

1.5.6 Part VI: Hyperspectral Signal Coding 16

1.5.7 Part VII: Hyperspectral Signal Feature Characterization 17

1.5.8 Applications 17

1.5.8.1 Chapter 30: Applications of Target Detection 17

1.5.8.2 Chapter 31: Nonlinear Dimensionality Expansion to Multispectral Imagery 18

1.5.8.3 Chapter 32: Multispectral Magnetic Resonance Imaging 19

1.6 Laboratory Data to be Used in This Book 19

1.6.1 Laboratory Data 19

1.6.2 Cuprite Data 19

1.6.3 NIST/EPA Gas-Phase Infrared Database 19

1.7 Real Hyperspectral Images to be Used in this Book 20

1.7.1 AVIRIS Data 20

1.7.1.1 Cuprite Data 21

1.7.1.2 Purdue’s Indiana Indian Pine Test Site 25

1.7.2 HYDICE Data 26

1.8 Notations and Terminologies to be Used in this Book 29


I: PRELIMINARIES 31

2 FUNDAMENTALS OF SUBSAMPLE AND MIXED SAMPLE ANALYSES 33

2.1 Introduction 33

2.2 Subsample Analysis 35

2.2.1 Pure-Sample Target Detection 35

2.2.2 Subsample Target Detection 38

2.2.2.1 Adaptive Matched Detector (AMD) 39

2.2.2.2 Adaptive Subspace Detector (ASD) 41

2.2.3 Subsample Target Detection: Constrained Energy Minimization (CEM) 43

2.3 Mixed Sample Analysis 45

2.3.1 Classification with Hard Decisions 45

2.3.1.1 Fisher’s Linear Discriminant Analysis (FLDA) 46

2.3.1.2 Support Vector Machines (SVM) 48

2.3.2 Classification with Soft Decisions 54

2.3.2.1 Orthogonal Subspace Projection (OSP) 54

2.3.2.2 Target-Constrained Interference-Minimized Filter (TCIMF) 56

2.4 Kernel-Based Classification 57

2.4.1 Kernel Trick Used in Kernel-Based Methods 57

2.4.2 Kernel-Based Fisher’s Linear Discriminant Analysis (KFLDA) 58

2.4.3 Kernel Support Vector Machine (K-SVM) 59

2.5 Conclusions 60

3 THREE-DIMENSIONAL RECEIVER OPERATING CHARACTERISTICS (3D ROC) ANALYSIS 63

3.1 Introduction 63

3.2 Neyman–Pearson Detection Problem Formulation 65

3.3 ROC Analysis 67

3.4 3D ROC Analysis 69

3.5 Real Data-Based ROC Analysis 72

3.5.1 How to Generate ROC Curves from Real Data 72

3.5.2 How to Generate Gaussian-Fitted ROC Curves 73

3.5.3 How to Generate 3D ROC Curves 75

3.5.4 How to Generate 3D ROC Curves for Multiple Signal Detection and Classification 77

3.6 Examples 78

3.6.1 Hyperspectral Imaging 79

3.6.1.1 Hyperspectral Target Detection 79

3.6.1.2 Linear Hyperspectral Mixture Analysis 80

3.6.2 Magnetic Resonance (MR) Breast Imaging 83

3.6.2.1 Breast Tumor Detection 84

3.6.2.2 Brain Tissue Classification 87

3.6.3 Chemical/Biological Agent Detection 91

3.6.4 Biometric Recognition 95

3.7 Conclusions 99


4 DESIGN OF SYNTHETIC IMAGE EXPERIMENTS 101

4.1 Introduction 102

4.2 Simulation of Targets of Interest 103

4.2.1 Simulation of Synthetic Subsample Targets 103

4.2.2 Simulation of Synthetic Mixed-Sample Targets 104

4.3 Six Scenarios of Synthetic Images 104

4.3.1 Panel Simulations 104

4.3.2 Three Scenarios for Target Implantation (TI) 106

4.3.2.1 Scenario TI1 (Clean Panels Implanted into Clean Background) 106

4.3.2.2 Scenario TI2 (Clean Panels Implanted into Noisy Background) 107

4.3.2.3 Scenario TI3 (Gaussian Noise Added to Clean Panels Implanted into Clean Background) 108

4.3.3 Three Scenarios for Target Embeddedness (TE) 108

4.3.3.1 Scenario TE1 (Clean Panels Embedded in Clean Background) 109

4.3.3.2 Scenario TE2 (Clean Panels Embedded in Noisy Background) 109

4.3.3.3 Scenario TE3 (Gaussian Noise Added to Clean Panels Embedded in Background) 110

4.4 Applications 112

4.4.1 Endmember Extraction 112

4.4.2 Linear Spectral Mixture Analysis (LSMA) 113

4.4.2.1 Mixed Pixel Classification 114

4.4.2.2 Mixed Pixel Quantification 114

4.4.3 Target Detection 114

4.4.3.1 Subpixel Target Detection 114

4.4.3.2 Anomaly Detection 122

4.5 Conclusions 123

5 VIRTUAL DIMENSIONALITY OF HYPERSPECTRAL DATA 124

5.1 Introduction 124

5.2 Reinterpretation of VD 126

5.3 VD Determined by Data Characterization-Driven Criteria 126

5.3.1 Eigenvalue Distribution-Based Criteria 127

5.3.1.1 Thresholding Energy Percentage 127

5.3.1.2 Thresholding Difference between Normalized Correlation Eigenvalues and Normalized Covariance Eigenvalues 128

5.3.1.3 Finding First Sudden Drop in the Normalized Eigenvalue Distribution 128

5.3.2 Eigen-Based Component Analysis Criteria 128

5.3.2.1 Singular Value Decomposition (SVD) 128

5.3.2.2 Principal Components Analysis (PCA) 129

5.3.3 Factor Analysis: Malinowski’s Error Theory 129

5.3.4 Information Theoretic Criteria (ITC) 130

5.3.4.1 AIC 131

5.3.4.2 MDL 131

5.3.5 Gershgorin Radius-Based Methods 131

5.3.5.1 Thresholding Gershgorin Radii 134

5.3.5.2 Thresholding Difference Gershgorin Radii between RL×L and KL×L 134


5.3.6 HFC Method 135

5.3.7 Discussions on Data Characterization-Driven Criteria 138

5.4 VD Determined by Data Representation-Driven Criteria 140

5.4.1 Orthogonal Subspace Projection (OSP) 140

5.4.2 Signal Subspace Estimation (SSE) 142

5.4.3 Discussions on OSP and SSE/HySime 143

5.5 Synthetic Image Experiments 144

5.5.1 Data Characterization-Driven Criteria 144

5.5.1.1 Target Implantation (TI) Scenarios 145

5.5.1.2 Target Embeddedness (TE) Scenarios 146

5.5.2 Data Representation-Driven Criteria 149

5.6 VD Estimated for Real Hyperspectral Images 155

5.7 Conclusions 163

6 DATA DIMENSIONALITY REDUCTION 168

6.1 Introduction 168

6.2 Dimensionality Reduction by Second-Order Statistics-Based Component Analysis Transforms 170

6.2.1 Eigen Component Analysis Transforms 170

6.2.1.1 Principal Components Analysis 170

6.2.1.2 Standardized Principal Components Analysis 172

6.2.1.3 Singular Value Decomposition 174

6.2.2 Signal-to-Noise Ratio-Based Components Analysis Transforms 176

6.2.2.1 Maximum Noise Fraction Transform 176

6.2.2.2 Noise-Adjusted Principal Component Transform 177

6.3 Dimensionality Reduction by High-Order Statistics-Based Components Analysis Transforms 179

6.3.1 Sphering 179

6.3.2 Third-Order Statistics-Based Skewness 181

6.3.3 Fourth-Order Statistics-Based Kurtosis 182

6.3.4 High-Order Statistics 182

6.3.5 Algorithm for Finding Projection Vectors 183

6.4 Dimensionality Reduction by Infinite-Order Statistics-Based Components Analysis Transforms 184

6.4.1 Statistics-Prioritized ICA-DR (SPICA-DR) 187

6.4.2 Random ICA-DR 188

6.4.3 Initialization Driven ICA-DR 189

6.5 Dimensionality Reduction by Projection Pursuit-Based Components Analysis Transforms 190

6.5.1 Projection Index-Based Projection Pursuit 191

6.5.2 Random Projection Index-Based Projection Pursuit 192

6.5.3 Projection Index-Based Prioritized Projection Pursuit 193

6.5.4 Initialization Driven Projection Pursuit 194

6.6 Dimensionality Reduction by Feature Extraction-Based Transforms 195

6.6.1 Fisher’s Linear Discriminant Analysis 195

6.6.2 Orthogonal Subspace Projection 196

6.7 Dimensionality Reduction by Band Selection 196


6.8 Constrained Band Selection 197

6.9 Conclusions 198

II: ENDMEMBER EXTRACTION 201

7 SIMULTANEOUS ENDMEMBER EXTRACTION ALGORITHMS (SM-EEAs) 207

7.1 Introduction 208

7.2 Convex Geometry-Based Endmember Extraction 209

7.2.1 Convex Geometry-Based Criterion: Orthogonal Projection 209

7.2.2 Convex Geometry-Based Criterion: Minimal Simplex Volume 214

7.2.2.1 Minimal-Volume Transform (MVT) 214

7.2.2.2 Convex Cone Analysis (CCA) 214

7.2.3 Convex Geometry-Based Criterion: Maximal Simplex Volume 215

7.2.3.1 Simultaneous N-FINDR (SM N-FINDR) 216

7.2.3.2 Iterative N-FINDR (IN-FINDR) 216

7.2.3.3 Various Versions of Implementing IN-FINDR 218

7.2.3.4 Discussions on Various Implementation Versions of IN-FINDR 222

7.2.3.5 Comparative Study Among Various Versions of IN-FINDR 222

7.2.3.6 Alternative SM N-FINDR 223

7.2.4 Convex Geometry-Based Criterion: Linear Spectral Mixture Analysis 225

7.3 Second-Order Statistics-Based Endmember Extraction 228

7.4 Automated Morphological Endmember Extraction (AMEE) 230

7.5 Experiments 231

7.5.1 Synthetic Image Experiments 231

7.5.1.1 Scenario TI1 (Endmembers Implanted in a Clean Background) 232

7.5.1.2 Scenario TI2 (Endmembers Implanted in a Noisy Background) 233

7.5.1.3 Scenario TI3 (Noisy Endmembers Implanted in a Noisy Background) 234

7.5.1.4 Scenario TE1 (Endmembers Embedded into a Clean Background) 235

7.5.1.5 Scenario TE2 (Endmembers Embedded into a Noisy Background) 235

7.5.1.6 Scenario TE3 (Noisy Endmembers Embedded into a Noisy Background) 236

7.5.2 Cuprite Data 237

7.5.3 HYDICE Data 237

7.6 Conclusions 239

8 SEQUENTIAL ENDMEMBER EXTRACTION ALGORITHMS (SQ-EEAs) 241

8.1 Introduction 241

8.2 Successive N-FINDR (SC N-FINDR) 244

8.3 Simplex Growing Algorithm (SGA) 244

8.4 Vertex Component Analysis (VCA) 247

8.5 Linear Spectral Mixture Analysis-Based SQ-EEAs 248

8.5.1 Automatic Target Generation Process-EEA (ATGP-EEA) 248

8.5.2 Unsupervised Nonnegativity Constrained Least-Squares-EEA (UNCLS-EEA) 249


8.5.3 Unsupervised Fully Constrained Least-Squares-EEA (UFCLS-EEA) 250

8.5.4 Iterative Error Analysis-EEA (IEA-EEA) 251

8.6 High-Order Statistics-Based SQ-EEAs 252

8.6.1 Third-Order Statistics-Based SQ-EEA 252

8.6.2 Fourth-Order Statistics-Based SQ-EEA 252

8.6.3 Criterion for kth Moment-Based SQ-EEA 253

8.6.4 Algorithm for Finding Projection Vectors 253

8.6.5 ICA-Based SQ-EEA 254

8.7 Experiments 254

8.7.1 Synthetic Image Experiments 255

8.7.2 Real Hyperspectral Image Experiments 258

8.7.2.1 Cuprite Data 258

8.7.2.2 HYDICE Data 260

8.8 Conclusions 262

9 INITIALIZATION-DRIVEN ENDMEMBER EXTRACTION ALGORITHMS (ID-EEAs) 265

9.1 Introduction 265

9.2 Initialization Issues 266

9.2.1 Initial Conditions to Terminate an EEA 267

9.2.2 Selection of an Initial Set of Endmembers for an EEA 267

9.2.3 Issues of Random Initial Conditions Demonstrated by Experiments 268

9.2.3.1 HYDICE Experiments 268

9.2.3.2 AVIRIS Experiments 270

9.3 Initialization-Driven EEAs 271

9.3.1 Initial Endmember-Driven EEAs 272

9.3.1.1 Finding Maximum Length of Data Sample Vectors 272

9.3.1.2 Finding Sample Mean of Data Sample Vectors 273

9.3.2 Endmember Initialization Algorithm for SM-EEAs 274

9.3.2.1 SQ-EEAs 274

9.3.2.2 Maxmin-Distance Algorithm 275

9.3.2.3 ISODATA 275

9.3.3 EIA-Driven EEAs 275

9.4 Experiments 278

9.4.1 Synthetic Image Experiments 278

9.4.2 Real Image Experiments 281

9.5 Conclusions 283

10 RANDOM ENDMEMBER EXTRACTION ALGORITHMS (REEAs) 287

10.1 Introduction 287

10.2 Random PPI (RPPI) 288

10.3 Random VCA (RVCA) 290

10.4 Random N-FINDR (RN-FINDR) 290

10.5 Random SGA (RSGA) 292

10.6 Random ICA-Based EEA (RICA-EEA) 292

10.7 Synthetic Image Experiments 293

10.7.1 RPPI 293


10.7.2 Various Random Versions of IN-FINDR 296

10.7.2.1 Scenario TI2 297

10.7.2.2 Scenario TI3 299

10.7.2.3 Scenario TE2 301

10.7.2.4 Scenario TE3 303

10.8 Real Image Experiments 305

10.8.1 HYDICE Image Experiments 305

10.8.1.1 RPPI 306

10.8.1.2 RN-FINDR 306

10.8.2 AVIRIS Image Experiments 309

10.8.2.1 RPPI 309

10.8.2.2 RN-FINDR 310

10.9 Conclusions 313

11 EXPLORATION ON RELATIONSHIPS AMONG ENDMEMBER EXTRACTION ALGORITHMS 316

11.1 Introduction 316

11.2 Orthogonal Projection-Based EEAs 318

11.2.1 Relationship among PPI, VCA, and ATGP 319

11.2.1.1 Relationship Between PPI and ATGP 319

11.2.1.2 Relationship Between PPI and VCA 320

11.2.1.3 Relationship Between ATGP and VCA 321

11.2.1.4 Discussions 322

11.2.2 Experiments-Based Comparative Study and Analysis 323

11.2.2.1 Synthetic Image Experiment: TI2 323

11.2.2.2 Real Image Experiments 325

11.3 Comparative Study and Analysis Between SGA and VCA 330

11.4 Does an Endmember Set Really Yield Maximum Simplex Volume? 339

11.5 Impact of Dimensionality Reduction on EEAs 344

11.6 Conclusions 348

III: SUPERVISED LINEAR HYPERSPECTRAL MIXTURE ANALYSIS 351

12 ORTHOGONAL SUBSPACE PROJECTION REVISITED 355

12.1 Introduction 355

12.2 Three Perspectives to Derive OSP 358

12.2.1 Signal Detection Perspective Derived from (d,U)-Model and OSP-Model 359

12.2.2 Fisher’s Linear Discriminant Analysis Perspective from OSP-Model 360

12.2.3 Parameter Estimation Perspective from OSP-Model 362

12.2.4 Relationship Between dLSap(r) and Least-Squares Linear Spectral Mixture Analysis 362

12.3 Gaussian Noise in OSP 364

12.3.1 Signal Detector in Gaussian Noise Using OSP-Model 365

12.3.2 Gaussian Maximum Likelihood Classifier Using OSP-Model 366

12.3.3 Gaussian Maximum Likelihood Estimator 367

12.3.4 Examples 367


12.4 OSP Implemented with Partial Knowledge 372

12.4.1 CEM 373

12.4.1.1 d Is Orthogonal to U (i.e., P⊥U d = d) and R = I (i.e., Spectral Correlation is Whitened) 374

12.4.1.2 An Alternative Approach to Implementing CEM 374

12.4.1.3 CEM Implemented in Conjunction with P⊥U 375

12.4.1.4 CEM Implemented in Conjunction with P⊥U in White Noise 376

12.4.2 TCIMF 377

12.4.2.1 D = mp = d with nD = 1 and U = [m1 m2 ··· mp−1] with nU = p − 1 378

12.4.2.2 D = mp = d with nD = 1 and U = [m1 m2 ··· mp−1] with nU = p − 1 and R = I 378

12.4.2.3 D = d and U = ∅ (i.e., Only the Desired Signature d is Available) 378

12.4.3 Examples 379

12.5 OSP Implemented Without Knowledge 383

12.6 Conclusions 390

13 FISHER’S LINEAR SPECTRAL MIXTURE ANALYSIS 391

13.1 Introduction 391

13.2 Feature Vector-Constrained FLSMA (FVC-FLSMA) 392

13.3 Relationship Between FVC-FLSMA and LCMV, TCIMF, and CEM 395

13.4 Relationship Between FVC-FLSMA and OSP 396

13.5 Relationship Between FVC-FLSMA and LCDA 396

13.6 Abundance-Constrained Least Squares FLDA (ACLS-FLDA) 397

13.7 Synthetic Image Experiments 398

13.8 Real Image Experiments 402

13.8.1 Image Background Characterized by Supervised Knowledge 402

13.8.2 Image Background Characterized by Unsupervised Knowledge 405

13.9 Conclusions 409

14 WEIGHTED ABUNDANCE-CONSTRAINED LINEAR SPECTRAL MIXTURE ANALYSIS 411

14.1 Introduction 411

14.2 Abundance-Constrained LSMA (AC-LSMA) 413

14.3 Weighted Least-Squares Abundance-Constrained LSMA 413

14.3.1 Weighting Matrix Derived from a Parameter Estimation Perspective 414

14.3.1.1 MD-Weighted AC-LSMA 415

14.3.1.2 LCMV-Weighted AC-LSMA 415

14.3.2 Weighting Matrix Derived from Fisher's Linear Discriminant Analysis Perspective 416

14.3.3 Weighting Matrix Derived from an Orthogonal Subspace Projection Perspective 417

14.3.3.1 OSP-Weighted AC-LSMA 417

14.3.3.2 SSP-Weighted AC-LSMA 418

14.4 Synthetic Image-Based Computer Simulations 419

14.5 Real Image Experiments 426

14.6 Conclusions 432


15 KERNEL-BASED LINEAR SPECTRAL MIXTURE ANALYSIS 434

15.1 Introduction 434

15.2 Kernel-Based LSMA (KLSMA) 436

15.2.1 Kernel Least Squares Orthogonal Subspace Projection (KLSOSP) 436

15.2.2 Kernel-Based Non-Negative Constraint Least Square (KNCLS) 438

15.2.3 Kernel-Based Fully Constraint Least Square (KFCLS) 439

15.2.4 A Note on Kernelization 440

15.3 Synthetic Image Experiments 441

15.4 AVIRIS Data Experiments 444

15.4.1 Radial Basis Function Kernels 449

15.4.2 Polynomial Kernels 452

15.4.3 Sigmoid Kernels 454

15.5 HYDICE Data Experiments 460

15.6 Conclusions 462

IV: UNSUPERVISED HYPERSPECTRAL IMAGE ANALYSIS 465

16 HYPERSPECTRAL MEASURES 469

16.1 Introduction 469

16.2 Signature Vector-Based Hyperspectral Measures for Target Discrimination and Identification 470

16.2.1 Euclidean Distance 471

16.2.2 Spectral Angle Mapper 471

16.2.3 Orthogonal Projection Divergence 471

16.2.4 Spectral Information Divergence 471

16.3 Correlation-Weighted Hyperspectral Measures for Target Discrimination and Identification 472

16.3.1 Hyperspectral Measures Weighted by A Priori Correlation 473

16.3.1.1 OSP-Based Hyperspectral Measures for Discrimination 473

16.3.1.2 OSP-Based Hyperspectral Measures for Identification 473

16.3.2 Hyperspectral Measures Weighted by A Posteriori Correlation 474

16.3.2.1 Covariance Matrix-Weighted Hyperspectral Measures 474

16.3.2.2 Correlation Matrix-Weighted Hyperspectral Measures 475

16.3.2.3 Covariance Matrix-Weighted Matched Filter Distance 475

16.3.2.4 Correlation Matrix-Weighted Matched Filter Distance 476

16.4 Experiments 477

16.4.1 HYDICE Image Experiments 477

16.4.2 AVIRIS Image Experiments 478

16.5 Conclusions 482

17 UNSUPERVISED LINEAR HYPERSPECTRAL MIXTURE ANALYSIS 483

17.1 Introduction 483

17.2 Least Squares-Based ULSMA 486

17.3 Component Analysis-Based ULSMA 488

17.4 Synthetic Image Experiments 490

17.4.1 LS-ULSMA 491

17.4.2 CA-ULSMA 499


17.5 Real-Image Experiments 503

17.5.1 LS-ULSMA 503

17.5.2 CA-ULSMA 505

17.5.3 Qualitative and Quantitative Analyses between ULSMA and SLSMA 511

17.6 ULSMA Versus Endmember Extraction 517

17.7 Conclusions 524

18 PIXEL EXTRACTION AND INFORMATION 526

18.1 Introduction 526

18.2 Four Types of Pixels 527

18.3 Algorithms Selected to Extract Pixel Information 528

18.4 Pixel Information Analysis via Synthetic Images 528

18.5 Real Image Experiments 534

18.5.1 AVIRIS Image Data 534

18.5.2 DAIS 7915 Image Data 537

18.6 Conclusions 539

V: HYPERSPECTRAL INFORMATION COMPRESSION 541

19 EXPLOITATION-BASED HYPERSPECTRAL DATA COMPRESSION 545

19.1 Introduction 545

19.2 Hyperspectral Information Compression Systems 547

19.3 Spectral/Spatial Compression 549

19.3.1 Dimensionality Reduction by Transform-Based Spectral Compression 550

19.3.1.1 Determination of Number of PCs/ICs to be Retained 551

19.3.1.2 PCA (ICA)/2D Compression 551

19.3.1.3 PCA (ICA)/3D Compression 552

19.3.1.4 Inverse PCA (Inverse ICA)/2D Compression 553

19.3.1.5 Inverse PCA (Inverse ICA)/3D Compression 553

19.3.1.6 Mixed Component Transforms for Hyperspectral Compression 554

19.3.2 Dimensionality Reduction by Band Selection-Based Spectral Compression 556

19.4 Progressive Spectral/Spatial Compression 557

19.5 3D Compression 557

19.5.1 3D-Multicomponent JPEG 557

19.5.2 3D-SPIHT Compression 558

19.6 Exploration-Based Applications 559

19.6.1 Linear Spectral Mixture Analysis 559

19.6.2 Subpixel Target Detection 559

19.6.3 Anomaly Detection 560

19.6.4 Endmember Extraction 561

19.7 Experiments 561

19.7.1 Synthetic Image Experiments 562

19.7.2 Real Image Experiments 567

19.8 Conclusions 580

20 PROGRESSIVE SPECTRAL DIMENSIONALITY PROCESS 581

20.1 Introduction 582

20.2 Dimensionality Prioritization 584


20.3 Representation of Transformed Components for DP 585

20.3.1 Projection Index-Based PP 585

20.3.2 Mixed Projection Index-Based Prioritized PP (M-PIPP) 587

20.3.3 Projection Index-Based Prioritized PP (PI-PRPP) 587

20.3.4 Initialization-Driven PIPP (ID-PIPP) 588

20.4 Progressive Spectral Dimensionality Process 589

20.4.1 Progressive Principal Components Analysis 591

20.4.1.1 Simultaneous PCA 591

20.4.1.2 Progressive PCA 592

20.4.1.3 Sequential PCA 593

20.4.1.4 Initialization-Driven PCA 595

20.4.2 Progressive High-Order Statistics Component Analysis 596

20.4.3 Progressive Independent Component Analysis 596

20.5 Hyperspectral Compression by PSDP 597

20.5.1 Progressive Spectral Dimensionality Reduction 597

20.5.2 Progressive Spectral Dimensionality Expansion 597

20.6 Experiments for PSDP 598

20.6.1 Endmember Extraction 598

20.6.2 Land Cover/Use Classification 599

20.6.3 Linear Spectral Mixture Analysis 603

20.7 Conclusions 608

21 PROGRESSIVE BAND DIMENSIONALITY PROCESS 613

21.1 Introduction 614

21.2 Band Prioritization 615

21.3 Criteria for Band Prioritization 617

21.3.1 Second-Order Statistics-Based BPC 617

21.3.1.1 Variance-Based BPC 617

21.3.1.2 Signal-to-Noise-Ratio-Based BPC 618

21.3.2 High-Order Statistics-Based BPC 618

21.3.2.1 Skewness 618

21.3.2.2 Kurtosis 618

21.3.3 Infinite-Order Statistics-Based BPC 618

21.3.3.1 Entropy 619

21.3.3.2 Information Divergence 619

21.3.4 Classification-Based BPC 619

21.3.4.1 Fisher’s Linear Discriminant Analysis (FLDA)-Based BPC 619

21.3.4.2 OSP-Based BPC 620

21.3.5 Constrained Band Correlation/Dependence Minimization 620

21.3.5.1 Band Correlation/Dependence Minimization 621

21.3.5.2 Band Correlation Constraint 622

21.4 Experiments for BP 624

21.4.1 Applications Using Highest-Prioritized Bands 625

21.4.1.1 Unsupervised Linear Spectral Mixture Analysis 626

21.4.1.2 Endmember Extraction 632

21.4.2 Applications Using Least-Prioritized Bands 635

21.4.2.1 Unsupervised Linear Spectral Mixture Analysis 636

21.4.2.2 Endmember Extraction 637


21.4.3 Applications Using Mixing Highest-Prioritized and Least-Prioritized Bands 646

21.4.3.1 Unsupervised Linear Spectral Mixture Analysis 646

21.4.3.2 Endmember Extraction 646

21.5 Progressive Band Dimensionality Process 651

21.6 Hyperspectral Compression by PBDP 653

21.6.1 Progressive Band Dimensionality Reduction Via BP 654

21.6.2 Progressive Band Dimensionality Expansion Via BP 655

21.7 Experiments for PBDP 656

21.7.1 Endmember Extraction 656

21.7.2 Land Cover/Use Classification 658

21.7.3 Linear Spectral Mixture Analysis 660

21.8 Conclusions 662

22 DYNAMIC DIMENSIONALITY ALLOCATION 664

22.1 Introduction 664

22.2 Dynamic Dimensionality Allocation 665

22.3 Signature Discriminatory Probabilities 667

22.4 Coding Techniques for Determining DDA 667

22.4.1 Shannon Coding-Based DDA 667

22.4.2 Huffman Coding-Based DDA 668

22.4.3 Hamming Coding-Based DDA 669

22.4.4 Notes on DDA 669

22.5 Experiments for Dynamic Dimensionality Allocation 669

22.5.1 Reflectance Cuprite Data 670

22.5.2 Purdue’s Data 672

22.5.3 HYDICE Data 674

22.6 Conclusions 682

23 PROGRESSIVE BAND SELECTION 683

23.1 Introduction 683

23.2 Band De-Correlation 684

23.2.1 Spectral Measure-Based BD 684

23.2.2 Orthogonalization-Based BD 685

23.3 Progressive Band Selection 686

23.3.1 PBS: BP Followed by BD 687

23.3.2 PBS: BD Followed by BP 687

23.4 Experiments for Progressive Band Selection 688

23.5 Endmember Extraction 688

23.6 Land Cover/Use Classification 690

23.7 Linear Spectral Mixture Analysis 694

23.8 Conclusions 715

VI: HYPERSPECTRAL SIGNAL CODING 717

24 BINARY CODING FOR SPECTRAL SIGNATURES 719

24.1 Introduction 719


24.2 Binary Coding 720

24.2.1 SPAM Binary Coding 720

24.2.2 Median Partition Binary Coding 721

24.2.3 Halfway Partition Binary Coding 722

24.2.4 Equal Probability Partition Binary Coding 722

24.3 Spectral Feature-Based Coding 723

24.4 Experiments 725

24.4.1 Computer Simulations 725

24.4.2 Real Hyperspectral Image Data 730

24.5 Conclusions 740

25 VECTOR CODING FOR HYPERSPECTRAL SIGNATURES 741

25.1 Introduction 741

25.2 Spectral Derivative Feature Coding 743

25.2.1 Re-interpretation of SPAM and SFBC 743

25.2.2 Spectral Derivative Feature Coding 744

25.2.3 AVIRIS Data Experiments 746

25.2.3.1 Signature Discrimination 747

25.2.3.2 Mixed Signature Classification 748

25.2.4 NIST Gas Data Experiments 749

25.2.4.1 Signature Discrimination 750

25.2.4.2 Mixed Signature Classification 751

25.3 Spectral Feature Probabilistic Coding 755

25.3.1 Arithmetic Coding 755

25.3.2 Spectral Feature Probabilistic Coding 756

25.3.3 AVIRIS Data Experiments 758

25.3.4 NIST Gas Data Experiments 760

25.4 Real Image Experiments 764

25.4.1 SDFC 764

25.4.2 SFPC 766

25.5 Conclusions 771

26 PROGRESSIVE CODING FOR SPECTRAL SIGNATURES 772

26.1 Introduction 772

26.2 Multistage Pulse Code Modulation 774

26.3 MPCM-Based Progressive Spectral Signature Coding 783

26.3.1 Spectral Discrimination 784

26.3.2 Spectral Identification 785

26.4 NIST-GAS Data Experiments 786

26.5 Real Image Hyperspectral Experiments 790

26.6 Conclusions 796

VII: HYPERSPECTRAL SIGNAL CHARACTERIZATION 797

27 VARIABLE-NUMBER VARIABLE-BAND SELECTION FOR HYPERSPECTRAL SIGNALS 799

27.1 Introduction 799


27.2 Orthogonal Subspace Projection-Based Band Prioritization Criterion 801

27.3 Variable-Number Variable-Band Selection 803

27.4 Experiments 806

27.4.1 Hyperspectral Data 806

27.4.1.1 Signature Discrimination 806

27.4.1.2 Signature Classification/Identification 809

27.4.1.3 Noise Effect on VNVBS 811

27.4.2 NIST-Gas Data 813

27.4.2.1 Signature Discrimination 813

27.4.2.2 Signature Classification/Identification 814

27.4.2.3 Signature Discrimination between Two Signatures with Different Numbers of Bands 816

27.5 Selection of Reference Signatures 819

27.6 Conclusions 819

28 KALMAN FILTER-BASED ESTIMATION FOR HYPERSPECTRAL SIGNALS 820

28.1 Introduction 820

28.2 Kalman Filter-Based Linear Unmixing 822

28.3 Kalman Filter-Based Spectral Characterization Signal-Processing Techniques 824

28.3.1 Kalman Filter-based Spectral Signature Estimator 825

28.3.2 Kalman Filter-Based Spectral Signature Identifier 826

28.3.3 Kalman Filter-Based Spectral Signature Quantifier 828

28.4 Computer Simulations Using AVIRIS Data 831

28.4.1 KFSSE 831

28.4.2 KFSSI 832

28.4.2.1 Subpixel Target Identification by KFSSI 832

28.4.2.2 Mixed Target Identification by KFSSI 838

28.4.3 KFSSQ 839

28.4.3.1 Subpixel Target Quantification by KFSSQ 839

28.4.3.2 Mixed Target Quantification by KFSSQ 840

28.5 Computer Simulations Using NIST-Gas Data 843

28.5.1 KFSSE 843

28.5.2 KFSSI 843

28.5.2.1 Subpixel Target Identification by KFSSI 843

28.5.2.2 Mixed Target Identification by KFSSI 848

28.5.3 KFSSQ 849

28.5.3.1 Subpixel Target Quantification by KFSSQ 849

28.5.3.2 Mixed Target Quantification by KFSSQ 849

28.6 Real Data Experiments 852

28.6.1 KFSSE 852

28.6.2 KFSSI 852

28.6.3 KFSSQ 856

28.7 Conclusions 857

29 WAVELET REPRESENTATION FOR HYPERSPECTRAL SIGNALS 859

29.1 Introduction 859

29.2 Wavelet Analysis 860


29.2.1 Multiscale Approximation 860

29.2.2 Scaling Function 861

29.2.3 Wavelet Function 862

29.3 Wavelet-Based Signature Characterization Algorithm 863

29.3.1 Wavelet-Based Signature Characterization Algorithm for Signature Self-Tuning 863

29.3.2 Wavelet-Based Signature Characterization Algorithm for Signature Self-Correction 866

29.3.3 Signature Self-Discrimination, Classification, and Identification 867

29.4 Synthetic Image-Based Computer Simulations 868

29.4.1 Signature Self-Tuning and Self-Denoising 869

29.4.2 Signature Self-Discrimination, Self-Classification, and Self-Identification 870

29.5 Real Image Experiments 871

29.6 Conclusions 875

VIII: APPLICATIONS 877

30 APPLICATIONS OF TARGET DETECTION 879

30.1 Introduction 879

30.2 Size Estimation of Subpixel Targets 880

30.3 Experiments 881

30.3.1 Synthetic Image Experiments 881

30.3.2 HYDICE Image Experiments 886

30.4 Concealed Target Detection 891

30.5 Computer-Aided Detection and Classification Algorithm for Concealed Targets 892

30.6 Experiments for Concealed Target Detection 893

30.7 Conclusions 895

31 NONLINEAR DIMENSIONALITY EXPANSION TO MULTISPECTRAL IMAGERY 897

31.1 Introduction 897

31.2 Band Dimensionality Expansion 899

31.2.1 Rationale for Developing BDE 899

31.2.2 Band Expansion Process 901

31.3 Hyperspectral Imaging Techniques Expanded by BDE 902

31.3.1 BEP-Based Orthogonal Subspace Projection 903

31.3.2 BEP-Based Constrained Energy Minimization 903

31.3.3 BEP-Based RX-Detector 903

31.4 Feature Dimensionality Expansion by Nonlinear Kernels 904

31.4.1 FDE by Transformation 905

31.4.2 FDE by Classification 907

31.4.2.1 FDE by Classification using Sample Spectral Correlation 907

31.4.2.2 FDE by Classification using Intrapixel Spectral Correlation 908

31.5 BDE in Conjunction with FDE 909

31.6 Multispectral Image Experiments 909

31.7 Conclusion 918


32 MULTISPECTRAL MAGNETIC RESONANCE IMAGING 920

32.1 Introduction 920

32.2 Linear Spectral Mixture Analysis for MRI 923

32.2.1 Orthogonal Subspace Projection to MRI 925

32.2.2 Band Expansion Process-Based OSP 927

32.2.3 Unsupervised Orthogonal Subspace Projection 928

32.3 Linear Spectral Random Mixture Analysis for MRI 928

32.3.1 Source Separation-Based OC-ICA for MR Image Analysis 930

32.3.2 Band Expansion Process Over-Complete ICA for MR Image Analysis 931

32.3.2.1 Eigenvector-Prioritized ICA 931

32.3.2.2 High-Order Statistics-Based PICA 932

32.3.2.3 ATGP-Prioritized PCA 932

32.4 Kernel-Based Linear Spectral Mixture Analysis 933

32.5 Synthetic MR Brain Image Experiments 933

32.6 Real MR Brain Image Experiments 951

32.7 Conclusions 955

33 CONCLUSIONS 956

33.1 Design Principles for Nonliteral Hyperspectral Imaging Techniques 956

33.1.1 Pigeon-Hole Principle 956

33.1.1.1 Multispectral Imagery Versus Hyperspectral Imagery 957

33.1.1.2 Virtual Dimensionality 957

33.1.2 Principle of Orthogonality 963

33.2 Endmember Extraction 964

33.3 Linear Spectral Mixture Analysis 970

33.3.1 Supervised LSMA 970

33.3.2 Unsupervised LSMA 973

33.4 Anomaly Detection 974

33.5 Support Vector Machines and Kernel-Based Approaches 977

33.6 Hyperspectral Compression 981

33.7 Hyperspectral Signal Processing 984

33.7.1 Signal Coding 986

33.7.2 Signal Estimation 986

33.8 Applications 987

33.9 Further Topics 987

33.9.1 Causal Processing 987

33.9.2 Real-Time Processing 988

33.9.3 FPGA Designs for Hardware Implementation 989

33.9.4 Parallel Processing 990

33.9.5 Progressive Hyperspectral Processing 990

GLOSSARY 993

APPENDIX: ALGORITHM COMPENDIUM 997

REFERENCES 1052

INDEX 1071


Preface

Hyperspectral imaging has witnessed tremendous growth over the past few years, yet its applications to new areas are still to be explored. Many hyperspectral imaging techniques have been developed and reported in various venues. My first book, Hyperspectral Imaging: Techniques for Spectral Detection and Classification, referenced as Chang (2003a), was written in an attempt to summarize the research conducted at that time in my laboratory (Remote Sensing Signal and Image Processing Laboratory, RSSIPL) and to provide readers with a glimpse of this fascinating and exciting area. With rapid advancement in this area, many signal processing techniques have since been developed for hyperspectral signal and image processing. This book has been written with four goals in mind. One is to continue exploring new statistical signal processing algorithms in this area for various applications. Many results in this book are new, particularly some in Chapters 2, 4, 5–6, 11, 16, 18–19, 23, 24, 29, 30–31, and 33. A second goal is to supplement Chang (2003a), where many potential research efforts were only briefly mentioned (in Chapter 18 of that book). A third goal is to distinguish this book from Chang (2003a) in many ways. Unlike Chang (2003a), whose main theme was hyperspectral target detection and classification from the viewpoint of subpixel and mixed pixel analysis, this book focuses on a more in-depth treatment of hyperspectral signal and image processing from a statistical signal processing point of view. A fourth and last goal is to focus on several unsettled but very important issues that have been avoided and never addressed in the past.

One issue is "how many spectral signatures are required to unmix data?", which arises in linear hyperspectral unmixing. This has been a long-standing and unresolved issue in remote sensing image processing, specifically hyperspectral imaging, since the number of signatures used for data unmixing has a significant impact on image analysis, while its accurate number is never known in real applications. Another is "how many pure spectral signatures, referred to as endmembers, are supposed to be present in the data to be processed?" It is common practice to assume that the number of signatures used for spectral unmixing is the same as the number of endmembers. Unfortunately, this claim, although widely accepted by the community, is not always true in practical applications (see Chapter 17). The issue of endmembers has not received much interest in multispectral image analysis because its low spectral and spatial resolutions generally result in mixed data sample vectors. However, owing to recent advances in hyperspectral imaging sensors with hundreds of contiguous spectral bands, endmember extraction has become increasingly important, since endmembers provide crucial "nonliteral" information for spectral interpretation, characterization, and analysis. Interestingly, this issue was never seriously addressed until recently, when it was investigated in a series of papers (Chang, 2006a, 2006b; Chang and Plaza, 2006; Chang et al., 2006; Plaza and Chang, 2006) that introduced a new concept of virtual dimensionality (VD).

In addition, some controversial issues result from misinterpreting VD. Therefore, one of the major chapters in this book is Chapter 5, which revisits VD to explore its utility in various applications. Unlike the intrinsic dimensionality (ID), also known as effective dimensionality (ED), which is somewhat abstract and is defined as the minimum number of parameters needed to represent general high-dimensional multivariate data, VD is more practical and realistic. It is defined as the number of "spectrally" distinct signatures and was developed particularly for hyperspectral data, in which the nonliteral (spectral) information is more crucial than the information provided by other dimensions, such as spatial information. In particular, an issue arises in how to define the spectral distinction among signatures in VD estimation. Furthermore, unlike ID, which is a one-size-fits-all definition for all data sets, VD should adapt to the data sets used for different applications as well as vary with the techniques used to estimate it. To address this issue, Chapter 5 explores two types of VD criteria, data characterization-driven criteria and data representation-driven criteria, to define spectrally distinct signatures, and further decouples the concept of VD from the techniques used to estimate it. Consequently, when VD is poorly estimated by one technique for a particular data set, it is not the definition of VD that is to blame, but rather the technique used for VD estimation, which is not applicable to that particular data set. An issue related to VD is "characterization of pixel information." For example, an anomaly is not necessarily an endmember, and vice versa. So the questions "what is the distinction between these two?" and "how do we characterize these two?" become interesting issues in hyperspectral data exploitation, to be discussed in Chapter 18.

Another interesting topic presented in this book is the new concept of "hyperspectral information compression," introduced in Chapters 19–23. It differs from the commonly used "hyperspectral data compression" in that hyperspectral information compression is generally performed based on the information that must be retained rather than on the size of the hyperspectral data to be compressed. Therefore, a more appropriate term is "exploitation-based lossy hyperspectral data compression." Nevertheless, it should be noted that the definitions and terminologies used in these chapters are by no means standard.

Finally, the issue of "multispectral imagery versus hyperspectral imagery" is also investigated. There seems to be no cut-and-dried definition distinguishing these two terms. A general understanding is that a hyperspectral image is acquired by hundreds of contiguous spectral channels/bands with very high spectral resolution, while a multispectral image is collected by tens of discrete spectral channels/bands with low spectral resolution. If this interpretation is used, we run into a dilemma: "how many spectral channels/bands are enough for a remotely sensed image to be called a hyperspectral image?" or "how fine should the spectral resolution be for a remote sensing image to be considered a hyperspectral image?" For example, if we take a small set of hyperspectral band images with a spectral resolution of 10 nm, say five spectral band images, to form a five-dimensional image cube, do we still consider this newly formed five-dimensional image cube a hyperspectral image, or simply a multispectral image? If we adopt the former definition, based on the number of bands, this five-dimensional image cube should be viewed as a multispectral image. On the other hand, if we adopt the latter definition, based on spectral resolution, it should be considered a hyperspectral image. Thus far, there seems to be no general consensus on this issue. In Chapter 31, an attempt is made to address this issue from the viewpoint of how two versions of independent component analysis (ICA), over-complete ICA and under-complete ICA, can be used to resolve this long-debated question in the context of linear spectral mixture analysis (LSMA). After all, some of these issues may never be settled or standardized for years to come. Many researchers can always argue differently at their discretion and provide their own interpretations. I have no intention of disputing any of them, but rather respect their opinions.

Since processing hyperspectral signatures as one-dimensional signals and processing hyperspectral images as three-dimensional image cubes are rather different, this book makes a distinction by treating hyperspectral image processing and hyperspectral signal processing as two separate categories to avoid confusion. To this end, three categories are specifically outlined in this book: Category A: hyperspectral image processing; Category B: hyperspectral signal processing; and Category C: applications.

For better understanding, a set of six chapters is included in Part I as preliminaries that cover fundamentals and provide the basic background required for readers to follow the algorithm design and development discussed in this book. Category A is made up of 17 chapters (Chapters 7–23), treated separately in four parts, Part II to Part V. Category B consists of six chapters (Chapters 24–29) in two separate parts, Part VI and Part VII. Finally, applications make up Category C.

It is worth noting that many of the materials presented in this book became available only after Chang (2003a). These include endmember extraction (Chapters 7–11), algorithm design using different levels of information (supervised linear hyperspectral mixture analysis in Chapters 12–15), pixel characterization and analysis (unsupervised hyperspectral analysis in Chapters 16–18), exploitation-based hyperspectral information compression (Chapters 19–23), hyperspectral signature coding and characterization (Chapters 24–29), and applications (Chapters 30–32) in Category C.

There are three unique features in this book that cannot be found in Chang (2003a): (1) Part I: preliminaries (Chapters 2–6); (2) extensive studies of synthetic image-based experiments for performance evaluation; and (3) an appendix containing an algorithm compendium that compiles signal processing algorithms recently developed in the RSSIPL, all of which are believed to be useful and beneficial to those who design and develop algorithms for hyperspectral signal/image processing. Because this book also addresses many issues that were not explored in Chang (2003a), it can be used in conjunction with Chang (2003a) without much overlap, where the latter provides the necessary basic background in the design and development of statistical signal processing algorithms for hyperspectral image analysis, especially for subpixel detection and mixed pixel classification. Therefore, on one end, those who have been involved in hyperspectral imaging and are familiar with hyperspectral imaging techniques will find this book useful as reference material. On the other end, those who are new to the area will find this book a good and valuable guide to the topics that may interest them.

I would like to thank the Spectral Information Technology Applications Center (SITAC) for providing its HYDICE data for use in the experiments in this book. I would also like to acknowledge the use of Purdue's Indiana Indian Pines test site and the AVIRIS Cuprite image data website.

I owe my sincere gratitude and deepest appreciation to my former Ph.D. students, Drs. Sumit Chakravarty, Hsian-Min Chen, Yingzi Du, Qian Du, Mingkai Hsueh, Baohoing Ji, Xiaoli Jiao, Keng-Hao Liu, Weimin Liu, Bharath Ramakrishna, Hsuan Ren, Haleh Safavi, Chiun-Mu Wang, Jianwei Wang, Jing Wang, Su Wang, Englin Wong, Chao-Cheng Wu, and Wei Xiong, and M.S. student Ms. Farzeen Chaudhary, as well as my current Ph.D. student, Shih-Yu Chen. My appreciation also extends to my colleagues, Professor Chinsu Lin of the Department of Forestry and Natural Resources at National Chiayi University, Dr. Ching Wen Yang, Director of the Computer Center, Taichung Veterans General Hospital, and Professor Ching Tsorng Tsai of the Computer Science Department at Tunghai University. I would particularly like to thank my former Ph.D. students Dr. Chao-Cheng Wu, who carried out most of the experiments presented in Chapters 7–11, Dr. Keng-Hao Liu, who performed many of the experiments described in Chapters 21–23, Dr. Su Wang, who did all the work mentioned in Chapter 29, and Dr. Englin Wong, who performed all the experiments described in Chapter 32, as well as Professor Antonio J. Plaza, who contributed to parts of Chapter 18 while visiting my laboratory in 2004 during his sabbatical leave from the Computer Science Department, University of Extremadura, Spain. This book could not have been completed without their contributions.

I would also like to thank the Ministry of Education in Taiwan for supporting me as a Distinguished Lecture Chair within the Department of Electrical Engineering from 2005 to 2006, a Chair Professorship of Reduction Technology within the Environmental Restoration and Disaster Reduction Research Center and Department of Electrical Engineering from 2006 to 2009, and a Chair Professorship of Remote Sensing Technology within the Department of Electrical Engineering from 2009 to 2012, all at National Chung Hsing University, where Professor Yen-Chieh Ouyang of Electrical Engineering has been a very supportive host during my visits. In particular, during 2009–2010, I was on sabbatical leave from UMBC to visit National Chung Hsing University, where my appointment as a distinguished visiting fellow/fellow professor was supported and funded by the National Science Council in Taiwan under projects NSC 98-2811-E-005-024 and NSC 98-2221-E-005-096. All their support is highly appreciated.

Last but not least, I would also like to thank my friends, Dr. San-Kan Lee (Deputy Superintendent of Taichung Veterans General Hospital (TCVGH)), Dr. Clayton Chi-Chang Chen (Chairman of Radiology at TCVGH), Dr. Jyh-Wen Chai (Section Chief of Radiology at TCVGH), and Dr. Yong Kie Wong (Head of the Dental Department at TCVGH), who have selflessly provided their expertise and resources, especially an excellent testbed environment, to help me apply hyperspectral imaging techniques to magnetic resonance imaging (MRI). Chapter 32 is indeed the culmination of this great working relationship.

As a final note, I would like to share that this book was originally supposed to be delivered by 2008. The most important factor in the delay was the urge to include the latest reports on hyperspectral data analysis; it is very difficult and challenging to keep track of such new developments. As a result, this book has grown to three times the size I had originally proposed. Those who are interested in my forthcoming 2013 book can have a quick peek at its topics, briefly discussed in Chapter 33, which includes a new development of target-characterized virtual dimensionality (VD), real-time and progressive processing of endmember extraction, unsupervised target detection, and anomaly detection, as well as their field-programmable gate array (FPGA) implementations.

CHEIN-I CHANG

Professor of Electrical Engineering

Remote Sensing Signal and Image Processing Laboratory (RSSIPL)

University of Maryland, Baltimore County

Baltimore, Maryland

USA

Chair Professor of Remote Sensing Technology

National Chung Hsing University

Taichung, Taiwan

Republic of China


Distinguished Professor

Providence University

Taichung, Taiwan

Republic of China

International Chair Professor

National Taipei University of Technology

Taipei, Taiwan

Republic of China

Technical Advisor

Center for QUantitative Imaging in Medicine (CQUIM)

Taichung Veterans General Hospital

Taichung, Taiwan

Republic of China

Fall 2012
