Sparsity, Randomness and Compressed Sensing


Transcript of Sparsity, Randomness and Compressed Sensing

Page 1: Sparsity, Randomness and Compressed Sensing

Sparsity, Randomness and Compressed Sensing

Petros Boufounos Mitsubishi Electric Research Labs

[email protected]

Page 2: Sparsity, Randomness and Compressed Sensing

Sparsity

Page 3: Sparsity, Randomness and Compressed Sensing

Why Sparsity

•  Natural data and signals exhibit structure

•  Sparsity often captures that structure

•  Very general signal model

•  Computationally tractable

•  Wide range of applications in signal acquisition, processing, and transmission

Page 4: Sparsity, Randomness and Compressed Sensing

Signal Representations

Page 5: Sparsity, Randomness and Compressed Sensing

Signal example: Images

•  2-D function

•  Idealized view: some function in a function space defined over the image domain

Page 6: Sparsity, Randomness and Compressed Sensing

Signal example: Images

•  2-D function

•  Idealized view: some function in a function space defined over the image domain

•  In practice

i.e., a matrix of samples

Page 7: Sparsity, Randomness and Compressed Sensing

Signal example: Images

•  2-D function

•  Idealized view: some function in a function space defined over the image domain

•  In practice

i.e., a matrix of samples (pixel averages)

Page 8: Sparsity, Randomness and Compressed Sensing

Signal Models Classical Model: Signal lies in a linear vector space (e.g. bandlimited functions)

Sparse Model: Signals of interest are often sparse or compressible

Signal transform examples: an image in a wavelet basis; a bat sonar chirp in a Gabor/STFT frame. In both cases there are very few large coefficients, and many close to zero.

[Figure: a sparse signal on coordinate axes x1, x2, x3.]

Page 9: Sparsity, Randomness and Compressed Sensing

Sparse Signal Models

[Figure: 1-sparse and 2-sparse signal sets, and a compressible set (an lp ball, p < 1), on coordinate axes x1, x2, x3.]

Sparse signals have few non-zero coefficients.

Compressible signals have few significant coefficients; the coefficients decay as a power law.

Page 10: Sparsity, Randomness and Compressed Sensing

Sparse Approximation

Page 11: Sparsity, Randomness and Compressed Sensing

Computational Harmonic Analysis

•  Representation: expand the signal in basis/frame coefficients

•  Analysis: study the signal through the structure of its coefficients; should extract features of interest

•  Approximation: uses just a few terms; exploits the sparsity of the representation

Page 12: Sparsity, Randomness and Compressed Sensing

Wavelet Transform Sparsity

•  Many small coefficients (blue)

Page 13: Sparsity, Randomness and Compressed Sensing

Sparseness ⇒ Approximation

[Figure: coefficient magnitude vs. sorted index: few big, many small.]

Page 14: Sparsity, Randomness and Compressed Sensing

Linear Approximation


Page 15: Sparsity, Randomness and Compressed Sensing

Linear Approximation

•  K-term approximation: use the “first” K coefficients (a fixed index set)

Page 16: Sparsity, Randomness and Compressed Sensing

Nonlinear Approximation

•  K-term approximation: use the K largest coefficients, chosen independently of their position

•  Greedy/thresholding
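A minimal NumPy sketch of this K-term thresholding (the helper name `best_k_term` is mine, not from the slides):

```python
import numpy as np

# K-term nonlinear approximation: keep the K largest-magnitude
# coefficients and zero out the rest (greedy thresholding).
def best_k_term(x, K):
    xk = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-K:]   # indices of the K largest entries
    xk[idx] = x[idx]
    return xk

x = np.array([0.1, 5.0, -0.2, 3.0, 0.05])
xk = best_k_term(x, 2)   # keeps 5.0 and 3.0, zeros everything else
```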

Page 17: Sparsity, Randomness and Compressed Sensing

Error Approximation Rates

•  Optimize the asymptotic error decay rate

•  Nonlinear approximation works better than linear as K grows

Page 18: Sparsity, Randomness and Compressed Sensing

Compression is Approximation

•  Lossy compression of an image creates an approximation: quantize the basis/frame coefficients to a total bit budget

Page 19: Sparsity, Randomness and Compressed Sensing

•  Sparse approximation chooses coefficients by thresholding but does not quantize them or worry about their locations

Sparse approximation ≠ Compression

Page 20: Sparsity, Randomness and Compressed Sensing

Location, Location, Location

•  Nonlinear approximation selects the largest coefficients to minimize error (easy: threshold)

•  A compression algorithm must encode both a set of coefficients and their locations (harder)

Page 21: Sparsity, Randomness and Compressed Sensing

Exposing Sparsity

Page 22: Sparsity, Randomness and Compressed Sensing

Spikes and Sinusoids example

Example Signal Model: Sinusoidal with a few spikes.

DCT basis: f = Ba

Page 23: Sparsity, Randomness and Compressed Sensing

Spikes and Sinusoids Dictionary

Dictionary of DCT basis and impulses: f = Da

Lost uniqueness!!

Page 24: Sparsity, Randomness and Compressed Sensing

Overcomplete Dictionaries

f = Da

Strategy: Improve sparse approximation by constructing a large dictionary.

How do we design a dictionary?

Page 25: Sparsity, Randomness and Compressed Sensing

Dictionary Design

[Figure: a dictionary D as a bucket collecting DCT, DFT, the impulse basis, wavelets, edgelets, curvelets, …, and oversampled frames.]

Can we just throw everything we know into the bucket?

Page 26: Sparsity, Randomness and Compressed Sensing

Dictionary Design Considerations

•  Dictionary size:
–  Computation and storage increase with size

•  Fast transforms:
–  FFT, DCT, FWT, etc. dramatically decrease computation and storage

•  Coherence:
–  Similarity between elements makes the solution harder

Page 27: Sparsity, Randomness and Compressed Sensing

Dictionary Coherence

Two candidate dictionaries: D1 and D2. D2 is BAD!

Intuition: D2 has too many similar elements. It is very coherent.

Coherence (similarity) between elements: |〈di, dj〉|

Dictionary coherence: μ = maxi≠j |〈di, dj〉|
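The coherence definition is easy to check numerically. A hedged sketch (the `coherence` helper and the DCT construction are mine) for a spikes-plus-DCT dictionary like the one on the previous slides:

```python
import numpy as np

# Dictionary coherence: mu = max over i != j of |<d_i, d_j>|,
# computed over unit-norm atoms.
def coherence(D):
    G = np.abs(D.T @ D)        # all pairwise inner products
    np.fill_diagonal(G, 0.0)   # exclude self inner products (i = j)
    return G.max()

N = 8
n, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
dct = np.cos(np.pi * (2 * n + 1) * k / (2 * N))   # DCT-II atoms as columns
dct /= np.linalg.norm(dct, axis=0)                # unit-norm columns
D = np.hstack([dct, np.eye(N)])                   # DCT basis plus impulses
mu = coherence(D)   # small: spikes and the DCT are mutually incoherent
```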

Page 28: Sparsity, Randomness and Compressed Sensing

Incoherent Bases

•  “Mix” the signal components well:
–  Impulses and the Fourier basis
–  Anything and a random Gaussian basis
–  Anything and a random 0-1 basis

Page 29: Sparsity, Randomness and Compressed Sensing

Computing Sparse Representations

Page 30: Sparsity, Randomness and Compressed Sensing

Thresholding

Compute set of coefficients

f = Da

a=D†f

Zero out small ones

Computationally efficient. Good for small and very incoherent dictionaries.
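A sketch of this thresholding recipe in NumPy (the random dictionary, support, and threshold value are illustrative choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 32, 64
D = rng.standard_normal((N, M))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms

a_true = np.zeros(M)
a_true[[3, 17, 40]] = [2.0, -1.5, 1.0]    # a 3-sparse synthetic signal
f = D @ a_true

a = np.linalg.pinv(D) @ f                 # a = D† f (minimum-norm coefficients)
a[np.abs(a) < 0.5] = 0.0                  # zero out the small ones
f_hat = D @ a                             # thresholded approximation
```

For overcomplete dictionaries the pseudoinverse spreads energy across correlated atoms, which is why thresholding only works well for small, very incoherent dictionaries, as the slide notes.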

Page 31: Sparsity, Randomness and Compressed Sensing

Matching Pursuit

ρ = DTf

Measure image against dictionary

Select largest correlation

ρk

Add to representation

ak ← ak+ρk

Compute residual f ← f - ρkdk

Iterate using residual

〈dk,f 〉 = ρk
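The loop on this slide translates almost line-for-line into NumPy; a minimal sketch (function name and test setup are mine):

```python
import numpy as np

# Matching Pursuit: correlate against the dictionary, pick the atom
# with the largest correlation, add it to the representation,
# subtract it from the residual, iterate.
def matching_pursuit(D, f, n_iter=10):
    a = np.zeros(D.shape[1])
    r = f.copy()                        # residual, initially the signal
    for _ in range(n_iter):
        rho = D.T @ r                   # correlations <d_k, r>
        k = np.argmax(np.abs(rho))      # select the largest correlation
        a[k] += rho[k]                  # add to the representation
        r -= rho[k] * D[:, k]           # compute the new residual
    return a, r

rng = np.random.default_rng(1)
N, M = 20, 40
D = rng.standard_normal((N, M))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
a0 = np.zeros(M)
a0[[2, 9]] = [3.0, -2.0]                # a 2-sparse ground truth
f = D @ a0
a, r = matching_pursuit(D, f, n_iter=30)
```

The invariant f = Da + r holds at every iteration, and with unit-norm atoms the residual norm never increases.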

Page 32: Sparsity, Randomness and Compressed Sensing

Greedy Pursuits Family

•  Several variations of MP: OMP, StOMP, ROMP, CoSaMP, TreeMP, … (You can create an AndrewMP if you work on it…)

•  Some have provable guarantees

•  Some improve the dictionary search

•  Some improve the coefficient selection

Page 33: Sparsity, Randomness and Compressed Sensing

CoSaMP (Compressive Sampling MP)

ρ = DTf

Measure image against dictionary

Iterate using residual

〈dk,f 〉 = ρk

Select location of largest

2K correlations

supp(ρ|2K)

Add to support set

Truncate and compute residual

Ω = supp(ρ|2K) ∪ T

b = D†Ωf

Invert over support

T = supp(b|K), a = b|K, r ← f − Da
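The CoSaMP iteration above can be sketched compactly in NumPy. This is my own toy version: for a clean, deterministic demo it uses an orthonormal dictionary, although CoSaMP is designed for compressive (RIP) matrices.

```python
import numpy as np

def cosamp(D, f, K, n_iter=5):
    M = D.shape[1]
    a = np.zeros(M)
    T = np.array([], dtype=int)                      # current support estimate
    r = f.copy()
    for _ in range(n_iter):
        rho = D.T @ r                                # correlations with residual
        omega = np.argsort(np.abs(rho))[-2 * K:]     # supp(rho|2K): 2K largest
        Omega = np.union1d(omega, T)                 # merge with current support
        b = np.zeros(M)
        b[Omega] = np.linalg.lstsq(D[:, Omega], f, rcond=None)[0]  # invert over support
        T = np.argsort(np.abs(b))[-K:]               # T = supp(b|K)
        a = np.zeros(M)
        a[T] = b[T]                                  # a = b|K (truncate)
        r = f - D @ a                                # r <- f - Da
    return a

rng = np.random.default_rng(0)
D = np.linalg.qr(rng.standard_normal((16, 16)))[0]   # orthonormal demo dictionary
a0 = np.zeros(16)
a0[[2, 9]] = [3.0, -2.0]                             # 2-sparse ground truth
f = D @ a0
a_hat = cosamp(D, f, K=2)
```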

Page 34: Sparsity, Randomness and Compressed Sensing

Optimization (Basis Pursuit)

Sparse approximation:

Minimize non-zeros in representation s.t.: representation is close to signal

min ‖a‖0 s.t. f ≈ Da

Number of non-zeros (sparsity measure)

Data Fidelity (approximation quality)

Combinatorial complexity. Very hard problem!

Page 35: Sparsity, Randomness and Compressed Sensing

Optimization (Basis Pursuit)

Sparse approximation:

Minimize non-zeros in representation s.t.: representation is close to signal

min ‖a‖0 s.t. f ≈ Da

min ‖a‖1 s.t. f ≈ Da

Convex Relaxation

Polynomial complexity. Solved using linear programming.
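One way to see the "solved using linear programming" claim concretely (a sketch assuming SciPy's `linprog` is available; the split a = u − v with u, v ≥ 0 is the standard LP reformulation, so ‖a‖1 = Σ(u + v)):

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit as a linear program:
#   min sum(u + v)  s.t.  D(u - v) = f,  u >= 0, v >= 0,  a = u - v.
def basis_pursuit(D, f):
    N, M = D.shape
    c = np.ones(2 * M)                   # objective: sum of u and v entries
    A_eq = np.hstack([D, -D])            # equality constraint D(u - v) = f
    res = linprog(c, A_eq=A_eq, b_eq=f, bounds=(0, None))
    u, v = res.x[:M], res.x[M:]
    return u - v

rng = np.random.default_rng(0)
N, M = 16, 32
D = rng.standard_normal((N, M))
a0 = np.zeros(M)
a0[7] = 2.0                              # a 1-sparse signal in the dictionary
f = D @ a0
a_hat = basis_pursuit(D, f)
```

The solution is feasible (Da ≈ f) and its l1 norm cannot exceed that of the true sparse coefficients, since a0 is itself feasible.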

Page 36: Sparsity, Randomness and Compressed Sensing

Why l1 relaxation works

f = Da

min ‖a‖1 s.t. f ≈ Da

l1 “ball”

Sparse solution

Page 37: Sparsity, Randomness and Compressed Sensing

Basis Pursuits

•  Have provable guarantees:
–  Find the sparsest solution for incoherent dictionaries

•  Several variants in formulation: BPDN, LASSO, Dantzig selector, …

•  Variations on the fidelity term and relaxation choice

•  Several fast algorithms: FPC, GPSR, SPGL, …

Page 38: Sparsity, Randomness and Compressed Sensing

Compressed Sensing: Sensing, Sampling and Data Processing

Page 39: Sparsity, Randomness and Compressed Sensing

Data Acquisition

•  Usual acquisition methods sample signals uniformly:
–  Time: A/D with microphones, geophones, hydrophones.
–  Space: CCD cameras, sensor arrays.

•  Foundation: Nyquist/Shannon sampling theory:
–  Sample at twice the signal bandwidth.
–  Generally a projection onto a complete basis that spans the signal space.

Page 40: Sparsity, Randomness and Compressed Sensing

Data Processing and Transmission

•  Data processing steps:
–  Sample densely
–  Transform to an informative domain (Fourier, wavelet)
–  Process/compress/transmit: set small coefficients to zero (sparsification)

Signal x, N coefficients

K ≪ N significant coefficients

Page 41: Sparsity, Randomness and Compressed Sensing

Sparsity Model

•  Signals can usually be compressed in some basis

•  Sparsity: a good prior for picking from a lot of candidates

Page 42: Sparsity, Randomness and Compressed Sensing

[Figure: 1-sparse and 2-sparse signal sets on coordinate axes x1, x2, x3.]

Compressive Sensing Principles

If a signal is sparse, do not waste effort sampling the empty space.

Instead, use fewer samples and allow ambiguity.

Use the sparsity model to reconstruct and uniquely resolve the ambiguity.

Page 43: Sparsity, Randomness and Compressed Sensing

Measuring Sparse Signals

Page 44: Sparsity, Randomness and Compressed Sensing

Compressive Measurements

[Figure: measurement (projection) of a sparse signal on axes x1, x2, x3 down to measurements y1, y2; reconstruction must resolve the resulting ambiguity.]

N = Signal dimensionality
K = Signal sparsity
M = Number of measurements (dimensionality of y)

N ≫ M ≳ K

Φ has rank M ≪ N
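In code, a compressive measurement is just a short, fat random matrix applied to a long sparse vector (the dimensions below are illustrative):

```python
import numpy as np

# Dimensions follow the slide's convention: N >> M >~ K.
rng = np.random.default_rng(2)
N, M, K = 256, 32, 4
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random M x N projection
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)  # K-sparse signal
y = Phi @ x                                      # M measurements of an N-dim signal
```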

Page 45: Sparsity, Randomness and Compressed Sensing

One Simple Question

Page 46: Sparsity, Randomness and Compressed Sensing

Geometry of Sparse Signal Sets

Page 47: Sparsity, Randomness and Compressed Sensing

Geometry: Embedding in RM

Page 48: Sparsity, Randomness and Compressed Sensing

Illustrative Example

Page 49: Sparsity, Randomness and Compressed Sensing

Example: 1-sparse signal

N = 3, K = 1, M = 2K = 2

[Figure: two projection choices on axes x1, x2, x3. Projecting to y1 = x2, y2 = x3 (dropping x1): Bad! Projecting so that y1 = x1 = x2, y2 = x3: Bad!]

Page 50: Sparsity, Randomness and Compressed Sensing

Example: 1-sparse signal

x2

x3

x1

X

N=3 K=1 M=2K=2

x1

y1=x2

y2 =x3 X

Good!

x2

y1=x1

y2 X

x3

Better!

Page 51: Sparsity, Randomness and Compressed Sensing

Restricted Isometry Property

Page 52: Sparsity, Randomness and Compressed Sensing

RIP as a “Stable” Embedding

Page 53: Sparsity, Randomness and Compressed Sensing

Verifying RIP

Page 54: Sparsity, Randomness and Compressed Sensing

Universality Property

Page 55: Sparsity, Randomness and Compressed Sensing

Universality Property

Page 56: Sparsity, Randomness and Compressed Sensing

Democracy

[Figure: measurement matrix Φ with bad/lost/dropped measurement rows removed, leaving a trimmed matrix Φ̃.]

• Measurements are democratic [Davenport, Laska, Boufounos, Baraniuk]
- They are all equally important
- We can lose some arbitrarily (i.e., an adversary can choose which ones)

• The trimmed Φ̃ still satisfies the RIP (as long as we don’t drop too many)
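A small numerical sketch of democracy (the dropped row indices are an arbitrary, "adversarial" choice of mine): deleting a few rows of a random Φ leaves a trimmed matrix that is still a full-rank random projection.

```python
import numpy as np

rng = np.random.default_rng(6)
N, M = 128, 40
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # M random measurements

dropped = [0, 7, 13]                             # adversary picks rows to lose
keep = np.delete(np.arange(M), dropped)
Phi_t = Phi[keep]                                # the trimmed Phi
```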

Page 57: Sparsity, Randomness and Compressed Sensing

Reconstruction

Page 58: Sparsity, Randomness and Compressed Sensing

Requirements for Reconstruction

•  Let x1, x2 be K-sparse signals (i.e., x1 − x2 is 2K-sparse)

•  The mapping y = Φx is invertible for K-sparse signals:

Φ(x1 − x2) ≠ 0 if x1 ≠ x2

•  The mapping is robust for K-sparse signals:

‖Φ(x1 − x2)‖2 ≈ ‖x1 − x2‖2

–  Restricted Isometry Property (RIP): Φ preserves distances when projecting K-sparse signals

•  Guarantees that a unique K-sparse signal explains the measurements, and is robust to noise
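The robustness condition ‖Φ(x1 − x2)‖2 ≈ ‖x1 − x2‖2 is easy to probe empirically (the sizes and trial count below are illustrative):

```python
import numpy as np

# Draw random 2K-sparse difference vectors d = x1 - x2 and check that
# a random Gaussian Phi approximately preserves their norms.
rng = np.random.default_rng(3)
N, M, K = 512, 128, 5
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

ratios = []
for _ in range(200):
    d = np.zeros(N)                               # d is 2K-sparse
    d[rng.choice(N, 2 * K, replace=False)] = rng.standard_normal(2 * K)
    ratios.append(np.linalg.norm(Phi @ d) / np.linalg.norm(d))
ratios = np.array(ratios)                         # all ratios concentrate near 1
```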

Page 59: Sparsity, Randomness and Compressed Sensing

Reconstruction Ambiguity

•  The solution should be consistent with the measurements

•  Projections imply that an infinite number of solutions are consistent!

•  Classical approach: use the pseudoinverse (minimize the l2 norm)

•  Compressive sensing approach: pick the sparsest

•  RIP guarantee: the sparsest solution is unique and reconstructs the signal

x̂ s.t. y = Φx̂ or y ≈ Φx̂

Becomes a sparse approximation problem!
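A quick sketch contrasting the classical pseudoinverse with the sparse signal (sizes and support are my own toy choices): the minimum-l2 solution is perfectly consistent with y, but it is dense rather than sparse.

```python
import numpy as np

rng = np.random.default_rng(4)
N, M = 64, 16
Phi = rng.standard_normal((M, N))
x = np.zeros(N)
x[[5, 40]] = [1.0, -2.0]                 # the 2-sparse signal we measured
y = Phi @ x

x_l2 = np.linalg.pinv(Phi) @ y           # classical min-l2 consistent solution
```

The pseudoinverse picks the smallest-l2 point on the solution set, which generically has all N entries nonzero; resolving the ambiguity in favor of the sparsest point is what turns reconstruction into a sparse approximation problem.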

Page 60: Sparsity, Randomness and Compressed Sensing

Putting everything together

Page 61: Sparsity, Randomness and Compressed Sensing

Compressed Sensing Coming Together

•  Signal model: provides prior information; allows undersampling

•  Randomness: provides robustness/stability; makes proofs easier

•  Non-linear reconstruction: incorporates information through computation

Measurement: y = Φx

Reconstruction using sparse approximation: x → y → x̃

Signal structure (sparsity)

Stable embedding (random projections)

Non-linear reconstruction (Basis Pursuit, Matching Pursuit, CoSaMP, etc…)

Page 62: Sparsity, Randomness and Compressed Sensing

Beyond: Extensions, Connections, Generalizations

Page 63: Sparsity, Randomness and Compressed Sensing

Sparsity Models

Page 64: Sparsity, Randomness and Compressed Sensing

Block Sparsity

[Figure: measurements y = Φx of a sparse signal x whose non-zero entries occur in blocks of length L; blocks are not allowed to overlap.]

Mixed l1/l2 norm—sum of l2 norms: Σi ‖xBi‖2

Basis pursuit becomes: minx Σi ‖xBi‖2 s.t. y ≈ Φx
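The mixed l1/l2 norm Σi ‖xBi‖2 is a one-liner over non-overlapping blocks (the helper name is mine):

```python
import numpy as np

# Sum of per-block l2 norms for non-overlapping blocks of length L.
# Promotes solutions whose non-zeros cluster into a few blocks.
def mixed_l1_l2(x, L):
    blocks = x.reshape(-1, L)                 # each row is one block B_i
    return np.sum(np.linalg.norm(blocks, axis=1))

x = np.array([3.0, 4.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0])
val = mixed_l1_l2(x, 2)   # blocks [3,4], [0,0], [0,0], [1,0] -> 5 + 0 + 0 + 1
```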

Page 65: Sparsity, Randomness and Compressed Sensing

Joint Sparsity

[Figure: M × L measurements y = Φx of L sparse signals in RN (x is an N × L matrix), with sparse components per signal sharing a common support.]

Mixed l1/l2 norm—sum of l2 norms: Σi ‖x(i,·)‖2

Basis pursuit becomes: minx Σi ‖x(i,·)‖2 s.t. y ≈ Φx
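For joint sparsity the same mixed norm is applied across the rows of the N × L coefficient matrix, so all L signals are pushed toward a common support (helper name and toy data are mine):

```python
import numpy as np

# Sum over rows i of ||x_(i,.)||_2: each row of X holds one component
# across all L signals, so a row is "on" for every signal at once.
def row_l1_l2(X):
    return np.sum(np.linalg.norm(X, axis=1))

X = np.zeros((6, 3))          # N = 6, L = 3 signals as columns
X[2] = [1.0, -1.0, 1.0]       # shared support: rows 2 and 4
X[4] = [0.0, 2.0, 0.0]
val = row_l1_l2(X)            # sqrt(3) + 2
```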

Page 66: Sparsity, Randomness and Compressed Sensing

Randomized Embeddings

Page 67: Sparsity, Randomness and Compressed Sensing

Stable Embeddings

Page 68: Sparsity, Randomness and Compressed Sensing

Johnson-Lindenstrauss Lemma

Page 69: Sparsity, Randomness and Compressed Sensing

Favorable JL Distributions
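The Johnson-Lindenstrauss behavior from the slides above is easy to observe numerically. A hedged sketch (point-cloud sizes are arbitrary): project a cloud of points with a random Gaussian matrix and look at the distribution of pairwise distance ratios.

```python
import numpy as np

rng = np.random.default_rng(5)
N, M, n_pts = 1000, 200, 30
A = rng.standard_normal((M, N)) / np.sqrt(M)   # JL-style random projection
P = rng.standard_normal((n_pts, N))            # a cloud of n_pts points in R^N

dists = []
for i in range(n_pts):
    for j in range(i + 1, n_pts):
        d = P[i] - P[j]
        dists.append(np.linalg.norm(A @ d) / np.linalg.norm(d))
dists = np.array(dists)   # all pairwise-distance ratios concentrate near 1
```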

Page 70: Sparsity, Randomness and Compressed Sensing

Connecting JL to RIP

Page 71: Sparsity, Randomness and Compressed Sensing

Connecting JL to RIP

Page 72: Sparsity, Randomness and Compressed Sensing
Page 73: Sparsity, Randomness and Compressed Sensing

More?

Page 74: Sparsity, Randomness and Compressed Sensing

The tip of the iceberg: Today’s lecture

Compressive Sensing Repository: dsp.rice.edu/cs

Blog on CS: nuit-blanche.blogspot.com/

Yet to be discovered… Start working on it