Decentralized Jointly Sparse Optimization by Reweighted Lq Minimization
Decentralized Jointly Sparse Optimization by Reweighted Lq Minimization
Qing Ling, Department of Automation, University of Science and Technology of China
Joint work with Zaiwen Wen (SJTU) and Wotao Yin (Rice)
2012/09/05
A brief introduction to my research interest
Optimization and control in networked multi-agent systems
Autonomous agents:
- collect data
- process data
- communicate
Problem: how to efficiently accomplish in-network optimization and control tasks through the collaboration of agents?
Large-scale wireless sensor networks: decentralized signal processing, node localization, sensor selection, …
- How to fuse big sensory data? E.g., structural health monitoring
- How to localize blind nodes with anchor nodes?
- How to assign sensors to targets?
Difficulty in data transmission → decentralized optimization without any fusion center
Computer/server networks with big data: collaborative data mining
New challenges in the big data era:
- big data is stored in distributed computers/servers
- data transmission is prohibited due to bandwidth, privacy, …
- computers/servers collaborate to do data mining
→ distributed/decentralized optimization
Wireless sensor and actuator networks: with application in large-scale greenhouse control
Decentralized control system design
Wireless sensing: temperature, humidity, …
Wireless actuating: circulating fans, wet curtains, …
Disadvantages of traditional centralized control:
- communication burden in collecting distributed sensory data
- lack of robustness due to packet loss, time delay, …
Recent works
Wireless sensor networks:
- decentralized signal processing with application in SHM
- decentralized node localization using SDP and SOCP
- decentralized sensor node selection for target tracking
Collaborative data mining:
- decentralized approaches to jointly sparse signal recovery
- decentralized approaches to matrix completion
Wireless sensor and actuator networks:
- modeling, hardware design, controller design, prototype
Theoretical issues:
- convergence and convergence rate analysis
Outline
- Background: decentralized jointly sparse optimization with applications
- Roadmap: nonconvex versus convex, difficulty in decentralized computing
- Algorithm development: successive linearization, inexact average consensus
- Simulation and conclusion
Background (I): jointly sparse optimization
Structured signals:
- a sparse signal: only a few elements are nonzero
- jointly sparse signals: each sparse, with the same nonzero support
Jointly sparse optimization: recover X = [x(1), …, x(L)] from linear measurements y(i) = A(i)x(i) + e(i), where A(i) is the measurement matrix and e(i) is the measurement noise of agent i. [Figure: the columns of X share common nonzero rows]
Background (II): decentralized jointly sparse optimization
Decentralized computing in a network:
- distributed data in distributed agents, and no fusion center
- considerations of privacy, difficulty in data collection, etc.
Goal: agent i holds y(i) and A(i), and recovers x(i) through collaboration: decentralized jointly sparse optimization
Background (III): applications
Cooperative spectrum sensing [1][2]:
- cognitive radios sense jointly sparse spectra {x(i)}
- measurements from the time domain [1] or frequency-selective filters [2]
- decentralized recovery from {y(i) = A(i)x(i)}
Decentralized event detection [3]:
- sensors {i} sense a few targets represented by jointly sparse {x(i)}
- decentralized recovery from {y(i) = A(i)x(i)}
Collaborative data mining, distributed human action recognition, etc.

[1] F. Zeng, C. Li, and Z. Tian, "Distributed compressive spectrum sensing in cooperative multi-hop wideband cognitive networks," IEEE Journal of Selected Topics in Signal Processing, vol. 5, pp. 37–48, 2011
[2] J. Meng, W. Yin, H. Li, E. Hossain, and Z. Han, "Collaborative spectrum sensing from sparse observations in cognitive radio networks," IEEE Journal on Selected Areas in Communications, vol. 29, pp. 327–337, 2011
[3] N. Nguyen, N. Nasrabadi, and T. Tran, "Robust multi-sensor classification via joint sparse representation," submitted to Journal of Advances in Information Fusion
Roadmap (I): nonconvex versus convex
Convex model: group lasso, i.e., L21-norm minimization (with a regularization parameter balancing data fidelity and joint sparsity)
Nonconvex versus convex:
- convex: with global convergence guarantee
- nonconvex: often with better recovery performance
Looking back at nonconvex models for recovering a single sparse signal: reweighted L1/L2 norm minimization [4][5]
Question: reweighted algorithms for jointly sparse optimization?

[4] E. Candes, M. Wakin, and S. Boyd, "Enhancing sparsity by reweighted L1 minimization," Journal of Fourier Analysis and Applications, vol. 14, pp. 877–905, 2008
[5] R. Chartrand and W. Yin, "Iteratively reweighted algorithms for compressive sensing," in Proceedings of ICASSP, 2008
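The reweighted L1 scheme of [4] is easy to sketch in NumPy. Below, the inner weighted-L1 problem is solved with a plain ISTA loop; the solver choice and the values of `lam` and `eps` are illustrative, not taken from the slides.

```python
import numpy as np

def weighted_l1(A, y, w, lam, n_iter=500):
    """Solve min_x 0.5*||Ax - y||^2 + lam * sum(w * |x|) by ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)              # gradient of the data-fidelity term
        z = x - g / L
        thresh = lam * w / L
        x = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)  # soft threshold
    return x

def reweighted_l1(A, y, lam=0.01, eps=0.1, n_rounds=5):
    """Reweighted L1 as in Candes-Wakin-Boyd: w_n = 1/(|x_n| + eps)."""
    w = np.ones(A.shape[1])
    for _ in range(n_rounds):
        x = weighted_l1(A, y, w, lam)
        w = 1.0 / (np.abs(x) + eps)        # large weights push small entries to zero
    return x
```

Each round solves a weighted L1 problem and then places large weights on small entries, pushing them toward zero while penalizing large entries only lightly.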
Roadmap (II): difficulty in decentralized computing
A popular decentralized computing technique: consensus
- original form: minimize the sum of objective functions over a common optimization variable, where agent i holds its own objective function
- consensus form: each agent i keeps a local copy, with the constraint that neighboring copies are equal
Obviously, the two problems are equivalent for a connected network
Efficient algorithms (ADM, SGD, etc.) exist for the consensus form if it is convex [6]
Nothing for consensus in jointly sparse optimization! Signals are different; common supports bring nonconvexity

[6] D. Bertsekas and J. Tsitsiklis, Parallel and Distributed Computation: Numerical Methods, second edition, Athena Scientific, 1997
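The consensus reformulation described on this slide can be written out explicitly; with $f_i$ the local objective of agent $i$ and $\mathcal{E}$ the edge set of the network:

```latex
\min_{x}\ \sum_{i=1}^{L} f_i(x)
\qquad\Longleftrightarrow\qquad
\min_{\{x(i)\}}\ \sum_{i=1}^{L} f_i\bigl(x(i)\bigr)
\quad\text{s.t.}\quad x(i)=x(j),\ \ \forall\,(i,j)\in\mathcal{E}.
```

The equivalence holds because, on a connected graph, the edge constraints force all local copies to be equal.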
Roadmap (III): solution overview
- Nonconvex model + convex decentralized computing subproblem
- Nonconvex model → successive linearization → reweighted Lq
- Natural decentralized computing, with one nontrivial subproblem
- Inexactly solving that subproblem still leads to good recovery
Algorithm (I): successive linearization
Nonconvex model (q = 1 or 2), with a regularization parameter and a smoothing parameter
"Successive linearization" of the joint-sparsity term at iterate t
Actually a majorization-minimization approach
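The slides do not reproduce the model equation; a standard instantiation consistent with the reweighted-L1 construction of [4] (the symbols $\lambda$ for the regularization parameter and $\delta$ for the smoothing parameter are my notation, and the exact penalty in the talk may differ) is

```latex
\min_{\{x(i)\}}\ \sum_{i=1}^{L}\bigl\|A(i)\,x(i)-y(i)\bigr\|_2^2
\;+\;\lambda\sum_{n=1}^{N}\log\bigl(\|x_n\|_q+\delta\bigr),
\qquad
\|x_n\|_q=\Bigl(\sum_{i=1}^{L}|x_n(i)|^q\Bigr)^{1/q},
```

where $x_n$ collects the $n$-th entries of all agents' signals. Linearizing the concave log term at iterate $t$ gives the weighted subproblem

```latex
\min_{\{x(i)\}}\ \sum_{i=1}^{L}\bigl\|A(i)\,x(i)-y(i)\bigr\|_2^2
\;+\;\lambda\sum_{n=1}^{N} u_n^{t}\,\|x_n\|_q,
\qquad
u_n^{t}=\frac{1}{\|x_n^{t}\|_q+\delta},
```

which majorizes the original objective up to a constant, hence the majorization-minimization interpretation.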
Algorithm (II): reweighted algorithm
Centralized reweighted Lq minimization algorithm:
- updating the weight vector u = [u1; u2; …; uN]
- updating the signals
From a decentralized implementation perspective:
- the x-update admits natural decentralized computing
- one subproblem in the u-update needs a decentralized solution
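Assuming weights of the reweighted form u_n = 1/(||x_n||_q + delta), the familiar choice from [4] adapted to rows across agents (the slides' exact formula may differ), the u-update is a per-row computation over cross-agent sums:

```python
import numpy as np

def update_weights(X, q=1, delta=0.1):
    """Weight update for reweighted Lq: u_n = 1 / (||x_n||_q + delta),
    where x_n is the n-th row of X (one entry per agent).
    Assumes a log-sum-style majorization; delta is the smoothing parameter."""
    row_norms = np.sum(np.abs(X) ** q, axis=1) ** (1.0 / q)
    return 1.0 / (row_norms + delta)
```

Note that the row norm sums over all agents, which is why a decentralized implementation needs average consensus: no single agent sees an entire row of X.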
Algorithm (III): average consensus
Check the u-update: it is an average consensus problem
Rewrite it in more familiar forms
Algorithm (IV): inexact average consensus
Solve the average consensus problem with ADM (at time t, slot s of S):
- updating the weight vectors (local copies)
- updating the Lagrange multipliers (c is a positive constant)
Exact versus inexact average consensus:
- exact average consensus: exact implementation of reweighted Lq, but the inner loops cost coordination and communication
- inexact average consensus: one iteration in the inner loop
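The flavor of ADM-based average consensus can be illustrated with a generic consensus-ADMM sketch; this is a standard scheme, not necessarily the slides' exact update, and `c` plays the same role as the positive constant above:

```python
import numpy as np

def admm_average_consensus(a, neighbors, c=1.0, n_iter=1000):
    """Decentralized average consensus via consensus ADMM.

    Each agent i holds a scalar a[i]; agents agree on mean(a) using only
    neighbor-to-neighbor exchanges. Local objective: f_i(x) = 0.5*(x - a_i)^2.
    """
    L = len(a)
    x = np.array(a, dtype=float)                         # local estimates
    alpha = np.zeros(L)                                  # Lagrange multipliers
    deg = np.array([len(neighbors[i]) for i in range(L)])
    for _ in range(n_iter):
        x_old = x.copy()
        for i in range(L):
            # closed-form x-update of the quadratic local subproblem
            s = sum(x_old[i] + x_old[j] for j in neighbors[i])
            x[i] = (a[i] - alpha[i] + c * s) / (1.0 + 2.0 * c * deg[i])
        for i in range(L):
            # multiplier update penalizes disagreement with neighbors
            alpha[i] += c * sum(x[i] - x[j] for j in neighbors[i])
    return x
```

Running S inner iterations per outer step gives "exact" consensus at a coordination and communication cost; the inexact variant of the slides runs just one inner iteration per outer step.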
Algorithm (V): decentralized reweighted Lq
Algorithm outline:
- updating the weight vectors (local copies)
- updating the Lagrange multipliers (c is a positive constant)
- updating the signals
Simulation (I): simulation settings
Network settings:
- L = 50 agents, randomly deployed in a 100×100 area
- communication range = 30, bidirectionally connected
Measurement settings:
- signal dimension N = 20, signal sparsity K = 2
- measurement dimension M = 10
- random measurement matrices and random measurement noise
Parameter settings
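The settings above can be reproduced with a small NumPy sketch. The noise level and RNG seeds are illustrative (the slide does not specify them), and in practice the deployment would be redrawn until the network is connected:

```python
import numpy as np

def make_network(L=50, side=100.0, comm_range=30.0, rng=None):
    """Randomly deploy L agents in a side x side area; connect pairs within comm_range."""
    if rng is None:
        rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, side, size=(L, 2))
    dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)
    adj = (dist <= comm_range) & ~np.eye(L, dtype=bool)   # symmetric, no self-loops
    return pos, adj

def make_measurements(L=50, N=20, K=2, M=10, noise_std=0.01, rng=None):
    """Jointly sparse signals with a common support, plus noisy linear measurements."""
    if rng is None:
        rng = np.random.default_rng(1)
    support = rng.choice(N, size=K, replace=False)        # shared nonzero rows
    X = np.zeros((N, L))
    X[support, :] = rng.standard_normal((K, L))           # per-agent values differ
    A = rng.standard_normal((L, M, N))                    # one M x N matrix per agent
    Y = np.einsum('imn,ni->mi', A, X) + noise_std * rng.standard_normal((M, L))
    return X, A, Y, support
```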
Simulation (II): recovery performance
Simulation (III): convergence rate
Conclusion
Decentralized jointly sparse optimization problem: jointly sparse signal recovery in a distributed network
Reweighted Lq minimization algorithms:
- Feature #1: nonconvex model, handled via successive linearization
- Feature #2: decentralized computing, enabled by inexact average consensus
Outlook: many open questions in decentralized optimization
Good news and bad news:
- local convergence of the centralized algorithms
- excellent performance of the decentralized algorithms
- no theoretical performance guarantee (recovery and convergence)
Thanks for your attention!