Communication with Imperfect Shared Randomness
(Joint work with Venkatesan Guruswami (CMU), Raghu Meka (?), and Madhu Sudan (MSR))
Who? Clément Canonne (Columbia University)
When? November 19, 2014
Communication & Complexity
There is a world outside of n

Context: there is Alice, Bob, what they communicate, and what they don't have to. The function

f : {0,1}^n × {0,1}^n → {0,1}

they compute; the protocol Π they use; the distributions D_x, D_y from which their inputs come; what blue means and what red means.
Communication & Complexity
But context is not perfect...

Noise, misunderstandings, false assumptions: context is almost never perfectly shared.

- My periwinkle is your orchid.
- The printer on the 5th floor of Columbia is not exactly the model my laptop has a driver for.
- What precisely is a "French baguette" around here?
Communication & Complexity
What about randomness?

Equality testing: I have x ∈ {0,1}^n, you have y ∈ {0,1}^n; are they equal?

Complexity?
- Deterministic: det-cc(EQ) = Θ(n)
- Private randomness: private-cc(EQ) = Θ(log n)
- Shared randomness: psr-cc(EQ) = O(1)

(Recall Newman's Theorem: private-cc(P) ≤ psr-cc(P) + O(log n).)
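The O(1)-bit shared-randomness protocol for EQ can be sketched as follows (a minimal Python illustration, not part of the talk: the shared random string is modeled by a common seed, and k random parity checks drive the error below 2^-k):

```python
import random

def equality_protocol(x, y, shared_seed, k=64):
    """One-way EQ protocol with (perfectly) shared randomness.

    Alice and Bob derive identical random vectors from the shared seed;
    Alice sends k parity bits <x, r> mod 2, and Bob compares them against
    his own. Communication: k bits, independent of n; error <= 2^-k.
    """
    rng = random.Random(shared_seed)
    n = len(x)
    for _ in range(k):
        r = [rng.randrange(2) for _ in range(n)]
        alice_bit = sum(xi * ri for xi, ri in zip(x, r)) % 2
        bob_bit = sum(yi * ri for yi, ri in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False  # inputs provably differ
    return True  # equal, except with probability <= 2^-k
```

If x = y every parity check agrees, so the protocol never errs on equal inputs; if x ≠ y, each check catches the difference with probability 1/2.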
This work
Randomness and uncertainty

ISR (Imperfectly Shared Randomness): what if the randomness (the "context") were not perfectly in sync?

To compute f(x, y):
- Alice has access to r ∈ {±1}^*, gets input x ∈ {0,1}^n;
- Bob has access to s ∈ {±1}^*, gets input y ∈ {0,1}^n;

with r ∼_ρ s: E[r_i] = E[s_i] = 0, E[r_i s_i] = ρ, and (r_i, s_i) ⊥⊥ (r_j, s_j) for i ≠ j.

Studied (independently) by [BGI14] (different focus: "referee model"; more general correlations).
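A ρ-correlated pair (r, s) as in the definition above can be sampled coordinate by coordinate; this small Python sketch (an illustration, not part of the talk) makes the correlation structure concrete:

```python
import random

def correlated_bits(n, rho, seed=None):
    """Sample r, s in {-1,+1}^n with E[r_i] = E[s_i] = 0, E[r_i s_i] = rho,
    and (r_i, s_i) independent across coordinates i."""
    rng = random.Random(seed)
    r, s = [], []
    for _ in range(n):
        ri = rng.choice((-1, 1))
        # s_i = r_i with probability (1+rho)/2, else -r_i, so that
        # E[r_i s_i] = (1+rho)/2 - (1-rho)/2 = rho.
        si = ri if rng.random() < (1 + rho) / 2 else -ri
        r.append(ri)
        s.append(si)
    return r, s
```

ρ = 1 recovers perfectly shared randomness, and ρ = 0 recovers fully private randomness.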
ISR: general relations

For every P with x, y ∈ {0,1}^n and 0 ≤ ρ ≤ ρ′ ≤ 1,

psr-cc(P) ≤ isr-cc_{ρ′}(P) ≤ isr-cc_ρ(P) ≤ private-cc(P) ≤ psr-cc(P) + O(log n).

(Also true for one-way: psr-cc^ow, isr-cc^ow_ρ, private-cc^ow.)

But for many problems, log n is already huge.
Rest of the talk

1. A first example: the Compression problem
2. General upper bound on ISR in terms of PSR
3. Strong lower bound: Alice, Bob, Charlie, and Dana
First result: Compression

Compression with uncertain priors: Alice has P, gets m ∼ P; Bob knows Q ≈ P, wants m.

Previous work:
- P = Q: H(P) bits (Huffman coding)
- P ≈_∆ Q: H(P) + 2∆ [JKKS11] (with shared randomness)
- P ≈_∆ Q: O(H(P) + ∆ + log log N) [HS14] (deterministic)

This work: for all ε > 0,

isr-cc^ow_ρ(Compress_∆) ≤ (1+ε)/(1 − h((1−ρ)/2)) · (H(P) + 2∆ + O(1)),

via a "natural protocol".
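The "natural protocol" in the shared-randomness setting of [JKKS11] is hash-based: Alice sends enough random hash bits of m for Bob to single it out among the messages plausible under his prior. A toy Python sketch (illustrative only; the function names and the dict representation of priors are assumptions, and perfectly shared randomness is modeled by a common seed):

```python
import random

def shared_hash(seed, message, length):
    # Both parties derive the same `length` random bits for each message
    # from the shared randomness (modeled here by a common seed).
    rng = random.Random(f"{seed}:{message}")
    return tuple(rng.randrange(2) for _ in range(length))

def alice_encode(m, length, seed):
    # Alice transmits only the hash bits of her message.
    return shared_hash(seed, m, length)

def bob_decode(h, Q, seed):
    # Bob outputs the most likely message under HIS prior Q whose hash matches.
    matches = [m for m in Q if shared_hash(seed, m, len(h)) == h]
    return max(matches, key=Q.get) if matches else None
```

Sending roughly log(1/P(m)) + 2∆ + O(1) hash bits suffices: any message Bob deems much more likely than m would need an improbable hash collision to beat it.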
General upper bound
It's inner products all the way down!

Theorem: ∀ρ > 0, ∃c < ∞ such that ∀k, we have PSR-CC(k) ⊆ ISR-CC^ow_ρ(c^k).

Proof (outline):
- Define GapInnerProduct, "complete" for PSR-CC(k) (view the players' strategies as vectors X_R, Y_R ∈ [0,1]^{2^k}; use Newman's Theorem to bound the number of R's);
- Show there exists a (Gaussian-based) isr protocol for GapInnerProduct with O_ρ(4^k) bits of communication.
General upper bound
Can we do better?

For problems in PSR-CC^ow(k)? Is PSR-CC^ow(k) ⊆ ISR-CC^ow_ρ(c^{o(k)})?
For ISR-CC_ρ? Is PSR-CC(ω(k)) ⊆ ISR-CC_ρ(c^k)?

Answer: No.
Strong converse: lower bound
It's as good as it gets.

Theorem: ∀k, ∃P = (P_n)_{n∈N} s.t. psr-cc^ow(P) ≤ k, yet ∀ρ < 1, isr-cc_ρ(P) = 2^{Ω_ρ(k)}.

Proof (high level):
- Define SparseGapInnerProduct, a relaxation of GapInnerProduct.
- Show it has an O(log q)-bit one-way psr protocol (Alice uses the shared randomness to send one coordinate to Bob).
- isr lower bound: argue that for any (fixed)* strategy of Alice and Bob using fewer than √q bits, either (a) something impossible happens in the Boolean world, or (b) something impossible happens in the Gaussian world.
Strong converse: lower bound
Two-pronged impossibility, first prong.

Case (a): the strategies (f_r, g_s)_{r,s} have a common high-influence variable (recall the one-way psr protocol). But then two players, Charlie and Dana, can* leverage these strategies to win an agreement distillation game.

Definition (Agreement distillation): Charlie and Dana have no inputs. Their goal is to output w_C and w_D satisfying:
(i) Pr[w_C = w_D] ≥ γ;
(ii) H∞(w_C), H∞(w_D) ≥ κ.

But this requires Ω(κ) − log(1/γ) bits of communication (via [BM10, Theorem 1]).
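The min-entropy H∞ in condition (ii) is determined entirely by the most likely outcome; a one-line Python helper (illustrative, assuming distributions are given as dicts mapping outcome to probability):

```python
import math

def min_entropy(dist):
    """H_inf(W) = -log2(max_w Pr[W = w]) for a distribution given as a dict."""
    return -math.log2(max(dist.values()))
```

So requiring H∞(w_C) ≥ κ forbids Charlie from concentrating his output on fewer than 2^κ near-certain values.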
Strong converse: lower bound
Two-pronged impossibility, second prong.

Case (b): f_r : {0,1}^n → K_A ⊂ [0,1]^{2^k} and g_s : {0,1}^n → K_B ⊂ [0,1]^{2^k} have no common high-influence variable. We then show that this implies k = 2^{Ω(√q)}, by using an Invariance Principle (in the spirit of [Mos10]) to "go to the Gaussian world": if f, g are low-degree polynomials with no common influential variable, then

E_{(x,y)∼N^⊗n}[⟨f(x), g(y)⟩] ≈ E_{(X,Y)∼G^⊗n}[⟨F(X), G(Y)⟩]

and Charlie and Dana can use this to solve (yet another) problem, the Gaussian Inner Product (GaussianCorrelation_ξ).

But... a reduction to Disjointness shows that (even with psr) this requires Ω(1/ξ) bits of communication.
Conclusions

Summary:
- dealing with more realistic situations: Alice, Bob, and what they do not know about each other;
- comes into play when n is huge (Newman's Theorem becomes loose);
- general and tight relations and reductions in this model, with both upper and lower bounds;
- a new invariance theorem, and its use in communication complexity.

What about...
- more general forms of correlations?
- cases where even randomness is expensive? (minimize its use)
- one-sided error?
Thank you.
(Questions?)
Theorem (Our Invariance Principle): Fix any two parameters p1, p2 ∈ (−1, 1). For all ε ∈ (0, 1], ℓ ∈ N, θ0 ∈ [0, 1), and closed convex sets K1, K2 ⊆ [0, 1]^ℓ, there exist τ > 0 and mappings

T1 : { f : {+1,−1}^n → K1 } → { F : R^n → K1 }
T2 : { g : {+1,−1}^n → K2 } → { G : R^n → K2 }

such that for all θ ∈ [−θ0, θ0], if f, g satisfy

max_{i∈[n]} min( max_{j∈[ℓ]} Inf_i^{(d)}(f_j), max_{j∈[ℓ]} Inf_i^{(d)}(g_j) ) ≤ τ,

then, for F = T1(f) and G = T2(g), we have

| E_{(x,y)∼N^⊗n}[⟨f(x), g(y)⟩] − E_{(X,Y)∼G^⊗n}[⟨F(X), G(Y)⟩] | ≤ ε,

where N = N_{p1,p2,θ} and G is the Gaussian distribution which matches the first and second-order moments of N.
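For the special case of an unbiased ±1 pair with correlation θ, the Gaussian ensemble G matching the first and second moments of N is just a pair of standard Gaussians with correlation θ. A Python sketch of sampling it (illustrative, not from the talk):

```python
import math
import random

def correlated_gaussians(n, rho, seed=None):
    """Sample n independent pairs (X_i, Y_i) of standard Gaussians with
    E[X_i] = E[Y_i] = 0, Var = 1, and E[X_i Y_i] = rho, via the
    decomposition Y = rho*X + sqrt(1 - rho^2)*Z for a fresh Gaussian Z."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rng.gauss(0.0, 1.0)
        pairs.append((z1, rho * z1 + math.sqrt(1.0 - rho * rho) * z2))
    return pairs
```

Matching the first two moments is exactly what the invariance principle needs: the expectations of low-influence, low-degree polynomials are then nearly the same under the Boolean and Gaussian ensembles.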
Theorem (Invariance Theorem of [GHM+11]): Let (Ω, µ) be a finite probability space with each probability at least α ≤ 1/2. Let b = |Ω| and L = {χ0 = 1, χ1, χ2, ..., χ_{b−1}} be a basis for random variables over Ω. Let Υ = {ξ0 = 1, ξ1, ..., ξ_{b−1}} be an ensemble of real-valued Gaussian random variables with first and second moments matching those of the χ_i's; and let h = (h1, h2, ..., ht) : Ω^n → R^t satisfy

Inf_i(h_ℓ) ≤ τ and Var(h_ℓ) ≤ 1

for all i ∈ [n] and ℓ ∈ [t]. For η ∈ (0, 1), let H_ℓ (ℓ ∈ [t]) be the multilinear polynomial associated with T_{1−η} h_ℓ w.r.t. L. If Ψ : R^t → R is Λ-Lipschitz (w.r.t. the ℓ2-norm), then

| E[Ψ(H1(L^n), ..., Ht(L^n))] − E[Ψ(H1(Υ^n), ..., Ht(Υ^n))] | ≤ C(t) · Λ · τ^{(η/18) log(1/α)} = o_τ(1)

for some constant C = C(t).
References

[BGI14] Mohammad Bavarian, Dmitry Gavinsky, and Tsuyoshi Ito. On the role of shared randomness in simultaneous communication. In ICALP (1), volume 8572 of Lecture Notes in Computer Science, pages 150–162. Springer, 2014.

[BM10] Andrej Bogdanov and Elchanan Mossel. On extracting common random bits from correlated sources. CoRR, abs/1007.2315, 2010.

[GHM+11] Venkatesan Guruswami, Johan Håstad, Rajsekar Manokaran, Prasad Raghavendra, and Moses Charikar. Beating the random ordering is hard: Every ordering CSP is approximation resistant. SIAM J. Comput., 40(3):878–914, 2011.

[HS14] Elad Haramaty and Madhu Sudan. Deterministic compression with uncertain priors. In Proceedings of the 5th Conference on Innovations in Theoretical Computer Science (ITCS '14), pages 377–386. ACM, 2014.

[JKKS11] Brendan Juba, Adam Tauman Kalai, Sanjeev Khanna, and Madhu Sudan. Compression without a common prior: an information-theoretic justification for ambiguity in language. In ICS, pages 79–86. Tsinghua University Press, 2011.

[Mos10] Elchanan Mossel. Gaussian bounds for noise correlation of functions. Geometric and Functional Analysis, 19(6):1713–1756, 2010.