Transcript of "Massive Access and Many-User Information Theory"
Sections: Motivation | Identification | Classical IT | Many-access | Many-broadcast | Concluding remarks
Massive Access and Many-User Information Theory
Dongning Guo
Joint work with Lei Zhang, Jun Luo, Xu Chen, and Tsung-Yi Chen
Department of Electrical Engineering and Computer Science, Northwestern University
Presented at the Conference on Applied Mathematics, The University of Hong Kong, August 25, 2016
Dongning Guo (Northwestern Univ.) p. 1
Massive access and many-user information theory
Motivation
Internet of Things
Fundamental questions
1. ℓ devices in a cell; k of them are active.
2. ℓ and k are large numbers.
3. Massive grant-free access in the uplink: Who transmitted? What are their messages?
4. Selective addressing in the downlink: Who should listen? What is the message for each receiver?
Need a Many-User Information Theory
- Classical single-user information theory: 1 user, coding blocklength n → ∞.
- Multiuser information theory: k users (fixed, usually small), n → ∞.
- Large-system analysis: n → ∞ first, then k → ∞.
- However, k > n in many systems, e.g., large sensor networks and the Internet of Things.
- Taking n → ∞ for fixed k may be inaccurate and provide little insight.
- We propose a many-user information theory: k, n → ∞ simultaneously. For example: k = αn → ∞, or k = n^α → ∞ (e.g., k = 1,000,000 devices with frame length n = 1,000).
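The two scalings above can be illustrated with a quick numeric sketch; the value α = 2 in the polynomial regime reproduces the slide's example (k = 10^6 devices, frame length n = 10^3), while the linear-regime α is an arbitrary illustrative choice:

```python
# Two scaling regimes in which the number of users k grows with blocklength n.
n = 1_000        # frame length from the slide's example

# Linear regime: k = α·n (α = 2 chosen only for illustration).
alpha = 2
k_linear = alpha * n
print(k_linear)  # 2000

# Polynomial regime: k = n^α; α = 2 recovers the slide's example.
k_poly = n ** alpha
print(k_poly)    # 1000000, i.e., k = 1,000,000 devices with n = 1,000
```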
Outline
- (Almost practical) device identification
- Classical information theory
- Many-access channel
- Many-broadcast channel
Neighbor discovery/device identification
- Goal: acquire the network interface addresses (NIAs) of all neighbors.
- Prior art: random access. Each node sends its NIA repeatedly with random delay.
Discovery is fundamentally sparse signal recovery
- b-bit address, ℓ = 2^b valid NIAs in total.
- Node i sends signal s_i.
- Multiaccess channel with path loss and fading:

  Y = ∑_{i ∈ neighborhood} s_i U_i + W = ∑_{i=1}^{ℓ} s_i X_i + W = S X + W

- Given Y, what is X?
- X_i ≈ 0 for all but a few neighbors.
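Since X is sparse, recovering the active neighbors from Y = SX + W is a compressed-sensing problem. A minimal sketch with a generic random signature matrix S; orthogonal matching pursuit here is a stand-in for the chirp decoder used in the actual scheme, and all dimensions and the noise level are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

n, l, k = 64, 256, 5      # signature length, candidate NIAs, active neighbors (illustrative)

S = rng.standard_normal((n, l)) / np.sqrt(n)    # generic random signatures (not RM codewords)
X = np.zeros(l)
support = rng.choice(l, size=k, replace=False)  # the few active neighbors
X[support] = 1.0 + rng.random(k)                # their (positive) channel gains
Y = S @ X + 0.01 * rng.standard_normal(n)       # noisy superposition at the receiver

# Orthogonal matching pursuit: greedily add the column most correlated with the residual.
chosen = []
residual = Y.copy()
for _ in range(k):
    chosen.append(int(np.argmax(np.abs(S.T @ residual))))
    coef, *_ = np.linalg.lstsq(S[:, chosen], Y, rcond=None)
    residual = Y - S[:, chosen] @ coef

print(sorted(chosen))     # estimated active set; compare with sorted(support)
```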
Network-wide discovery in one frame interval
- Each node transmits a single frame of signature.
- Synchronized transmissions (can be relaxed).
- One key challenge is decoding complexity (need to scale to 2^20–2^48 NIAs).
- Second-order Reed-Muller codes + chirp decoding algorithm [Calderbank, Gilbert & Strauss '06], [Howard, Calderbank & Searle '08].
- In an ad hoc network with half-duplex transceivers, use symbol erasures to achieve rapid on-off division duplex (RODD).
Second-order Reed-Muller signature generation

- Signature length n = 2^m.
- P is an m×m binary symmetric matrix; x, t ∈ Z_2^m:

  ϕ_{P,t}(x) = (√−1)^{x^T P x + 2 t^T x}

- Codebook size up to 2^{m(m+3)/2}:
  - m = 5: n = 2^5 = 32, ℓ up to 2^20 codewords;
  - m = 10: n = 2^10 = 1,024, ℓ up to 2^65;
  - m = 12: n = 2^12 = 4,096, ℓ up to 2^90.
- Introduce about 50% erasures in the case of virtual full duplex.

Example signatures for m = 5 (length 32, entries in {1, −1, j, −j}):
1 1 j -j j -j -1 -1 1 1 -j j j -j 1 1 1 -1 j j j j -1 1 -1 1 j j -j -j -1 1
1 1 j -j j -j 1 1 1 -1 -j -j -j -j 1 -1 j -j 1 1 -1 -1 -j j j j -1 1 1 -1 -j -j
1 1 j j j -j -1 1 j j 1 1 -1 1 j -j j -j -1 1 1 1 j j 1 -1 -j j -j -j -1 -1
1 -1 j -j j j 1 1 j j 1 1 1 -1 j -j 1 1 -j -j -j j 1 -1 j -j -1 1 -1 -1 j j
1 -j 1 j j 1 -j 1 1 j -1 j j -1 j 1 j -1 j 1 -1 -j 1 -j j 1 -j 1 -1 j -1 -j
1 -1 1 1 j -j -j -j 1 1 -1 1 -j -j -j j j j -j j -1 -1 -1 1 -j j -j -j -1 1 1 1
1 j -1 -j j -1 j -1 j 1 j 1 -1 j 1 -j 1 -j -1 j -j -1 -j -1 j -1 j -1 1 j -1 -j
1 1 -1 -1 j j j j j -j j -j 1 -1 -1 1 1 -1 1 -1 -j j j -j -j -j j j 1 1 1 1
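The signature formula can be implemented directly. A minimal sketch; the particular P and t are arbitrary choices, and the final line checks the codebook-size count, which follows from the 2^{m(m+1)/2} binary symmetric matrices times 2^m choices of t:

```python
import numpy as np
from itertools import product

def rm_signature(P, t):
    """Second-order RM signature: entry (sqrt(-1))^(x^T P x + 2 t^T x) for each x in Z_2^m."""
    m = len(t)
    sig = []
    for x in product((0, 1), repeat=m):
        xv = np.array(x)
        sig.append(1j ** int(xv @ P @ xv + 2 * (t @ xv)))
    return sig

m = 5
rng = np.random.default_rng(1)
U = rng.integers(0, 2, size=(m, m))
P = np.triu(U) + np.triu(U, 1).T             # binary symmetric m×m matrix
t = rng.integers(0, 2, size=m)

s = rm_signature(P, t)
print(len(s))                                 # 32, i.e., n = 2^m
print(set(s) <= {1, -1, 1j, -1j})             # QPSK entries, as in the table above
print(2 ** (m * (m + 1) // 2) * 2 ** m == 2 ** (m * (m + 3) // 2))  # codebook-size count
```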
The same signatures after roughly 50% symbol erasures (shown as 0), as used for virtual full duplex:
0 0 j 0 j -j 0 0 0 0 -j j 0 -j 1 1 0 0 j 0 j j 0 0 0 0 j j 0 -j -1 1
0 0 0 0 j -j 0 1 0 -1 0 0 -j 0 0 0 0 0 0 0 -1 -1 0 j 0 j 0 0 1 0 0 0
1 1 0 0 j 0 -1 0 0 0 0 1 -1 1 j -j j -j 0 0 1 0 j 0 0 0 0 j -j -j -1 -1
0 0 0 -j 0 0 0 1 j j 0 1 1 0 0 0 0 0 0 -j 0 0 0 -1 j -j 0 1 -1 0 0 0
0 0 0 0 0 1 0 0 1 j 0 0 j 0 0 0 0 0 0 0 0 -j 0 0 j 1 0 0 -1 0 0 0
1 -1 1 1 j -j -j -j 1 1 -1 0 0 0 0 0 j j -j j -1 -1 -1 1 -j j -j 0 0 0 0 0
1 0 0 0 0 -1 j 0 0 0 0 0 -1 j 1 -j 1 0 0 0 0 -1 -j 0 0 0 0 0 1 j -1 -j
0 1 0 -1 j 0 j 0 j -j 0 -j 0 -1 -1 1 0 -1 0 -1 -j 0 j 0 -j -j 0 j 0 1 1 1
Error rate vs. SNR (2^20 nodes, path-loss exponent 3, Rayleigh fading)

[Figure: rate of miss, from 10^−3 to 10^−1, vs. SNR from 8 to 16 dB; one curve for M = 1,024 with 10 neighbors on average, one for M = 4,096 with 30 neighbors on average.]
Comparison with random access
- ℓ = 2^20 nodes, 10 neighbors on average, SNR = 11.5 dB.
- Target Pe = 0.002:

                 Random access        RODD
  # of frames    194                  1
  # of symbols   ≥ 194 × 20 = 3,880   1,024

- In addition, significant reduction of per-frame overhead.
- More results in:
  - L. Zhang and D. Guo, "Virtual full duplex wireless broadcasting via compressed sensing," IEEE/ACM Trans. Networking, 2014.
  - L. Zhang, J. Luo, and D. Guo, "Neighbor discovery for wireless networks via compressed sensing," Performance Evaluation, 2013.
  - X. Chen and D. Guo, "Robust sublinear complexity Walsh-Hadamard transform with arbitrary sparse support," ISIT, 2015.
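The symbol counts in the table follow from simple arithmetic; the 20 symbols per frame is an assumption matching the 20-bit NIA (one symbol per address bit), while 194 frames and the RODD frame length of 1,024 are taken from the slide:

```python
# Symbol-count comparison for the target Pe = 0.002.
frames_ra, symbols_per_frame = 194, 20     # random access: 194 frames, assumed 20 symbols each
symbols_ra = frames_ra * symbols_per_frame
symbols_rodd = 1024                        # one RODD frame of length n = 2^10

print(symbols_ra)                          # 3880
print(round(symbols_ra / symbols_rodd, 2)) # 3.79, i.e., nearly 4x fewer symbols with RODD
```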
Dongning Guo (Northwestern Univ.) p. 11
Massive access and many-user information theory
What are the fundamental limits?
Outline
- (Almost practical) neighbor discovery
- Classical information theory: a digression
- Many-access channel
- Many-broadcast channel
Classical (single-user) information theory

- [Shannon–McMillan–Breiman '48, '60] For a discrete stationary ergodic sequence,

    −(1/n) log p_{X1,...,Xn}(X1, ..., Xn) → H   almost surely.

- Typical set:

    T_ε^(n) = { (x1, ..., xn) : |−(1/n) log p_{X1,...,Xn}(x1, ..., xn) − H| ≤ ε }

- Asymptotic equipartition property:
  - Almost all sequences that occur are typical: lim_{n→∞} P{T_ε^(n)} = 1;
  - There are about 2^{nH} of them: |T_ε^(n)| ≈ 2^{nH}.
- Hence Shannon's lossless source coding theorem.
- A joint AEP is the workhorse for Shannon's channel coding theorem and rate-distortion theorem.
Classical multiaccess channel
    w1 ↦ (X_{1,1}, ..., X_{1,n}) ↘
                                    P_{Y|X1,X2} → (Y1, ..., Yn) ↦ (ŵ1, ŵ2)
    w2 ↦ (X_{2,1}, ..., X_{2,n}) ↗
The capacity region is due to Ahlswede (1971) and Liao (1972). It can be achieved by random coding, joint typicality decoding, and time sharing.
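For a concrete feel, the region specializes in the Gaussian case to a pentagon cut by a sum-rate constraint. A minimal numeric sketch using the standard two-user Gaussian MAC bounds (the powers `P1`, `P2` and noise level `N` are illustrative, not taken from the slides):

```python
import math

def gaussian_mac_region(P1, P2, N):
    """Bounding rates (bits/channel use) of the two-user Gaussian MAC region."""
    C = lambda x: 0.5 * math.log2(1 + x)  # AWGN capacity formula
    R1_max = C(P1 / N)          # individual bound on R1
    R2_max = C(P2 / N)          # individual bound on R2
    R_sum  = C((P1 + P2) / N)   # sum-rate bound on R1 + R2
    return R1_max, R2_max, R_sum

R1_max, R2_max, R_sum = gaussian_mac_region(P1=10.0, P2=10.0, N=1.0)
# The sum-rate bound sits strictly below R1_max + R2_max, so the
# achievable region is a pentagon rather than a rectangle.
print(R1_max, R2_max, R_sum)
```

The corner points of the pentagon are achieved by successive decoding; time sharing between them traces out the dominant face.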
Jointly typical set
T_ε^(n) = { (x1, x2, y) :
    |(1/n) log 1/p_{X1}(x1) − H(X1)| < ε,
    |(1/n) log 1/p_{X2}(x2) − H(X2)| < ε,
    |(1/n) log 1/p_{Y}(y) − H(Y)| < ε,
    |(1/n) log 1/p_{X1,Y}(x1, y) − H(X1, Y)| < ε,
    |(1/n) log 1/p_{X2,Y}(x2, y) − H(X2, Y)| < ε,
    |(1/n) log 1/p_{X1,X2}(x1, x2) − H(X1, X2)| < ε,
    |(1/n) log 1/p_{X1,X2,Y}(x1, x2, y) − H(X1, X2, Y)| < ε }

The "empirical entropy" converges to the entropy for all subsets of (X1, X2, Y).
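Checking membership in a typical set is mechanical once the empirical entropy is computed. A toy sketch of the single-sequence condition for an i.i.d. Bernoulli source (the joint version repeats the same test for every subset of variables; the sequence length and ε here are illustrative):

```python
import math
import random

def empirical_entropy(seq, p):
    """-(1/n) log2 p(x1,...,xn) under an i.i.d. Bernoulli(p) model."""
    n = len(seq)
    log_p = sum(math.log2(p) if x else math.log2(1 - p) for x in seq)
    return -log_p / n

def is_typical(seq, p, eps):
    """True if the sequence lies in the typical set T_eps^(n)."""
    H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # binary entropy
    return abs(empirical_entropy(seq, p) - H) <= eps

random.seed(0)
p, n = 0.3, 10_000
x = [1 if random.random() < p else 0 for _ in range(n)]
# By the AEP, a long i.i.d. sequence is typical with high probability.
print(is_typical(x, p, eps=0.05))
```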
Proof of multiaccess channel capacity
- Two users transmit X1(w1) and X2(w2) from random codebooks.
- Receiver puts out the first (w1, w2) satisfying (X1(w1), X2(w2), Y) ∈ T_ε^(n).
- Let E_{w1,w2} = {(X1(w1), X2(w2), Y) ∈ T_ε^(n)}. Then

    Pe ≤ P(E_{1,1}^c)                      [→ 0 by the AEP]
       + Σ_{w1=1, w2≠1} P(E_{1,w2})        [⇒ bound on R2]
       + Σ_{w1≠1, w2=1} P(E_{w1,1})        [⇒ bound on R1]
       + Σ_{w1≠1, w2≠1} P(E_{w1,w2})      [⇒ bound on R1 + R2]
       → 0
The large-system limit is ill-suited for many-user systems

- Example: the sum rate of the k-user Gaussian multiaccess channel:

    C_sum = (1/2) log(1 + kγ) → ∞
    (1/k) C_sum = (1/2k) log(1 + kγ) → 0    as k → ∞

- "When the total number of senders is very large, so that there is a lot of interference, we can still send a total amount of information that is arbitrarily large even though the rate per individual sender goes to 0." —Cover & Thomas, Elements of Information Theory.

- Rate or capacity in bits per channel use is ill-suited for many-user systems.
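The two limits in this example are easy to verify numerically (γ = 10 in linear scale, chosen for illustration):

```python
import math

def sum_rate(k, gamma):
    """Sum capacity of the k-user Gaussian multiaccess channel (bits/use)."""
    return 0.5 * math.log2(1 + k * gamma)

gamma = 10.0
for k in (10, 1_000, 100_000):
    Cs = sum_rate(k, gamma)
    # The total rate keeps growing while the per-user rate vanishes.
    print(k, Cs, Cs / k)
```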
Outline
- (Almost practical) neighbor discovery
- Classical information theory
- Many-access channel
- Many-broadcast channel
Gaussian many-access channel (MnAC)
    Y = Σ_{j=1}^{ℓ} S_j(w_j) + Z

- Many (ℓ) transmitters, each active w.p. α ∈ (0, 1] in a block.
- Average number of active users: k = αℓ.
- Message of user j: wj, with corresponding codeword Sj(wj).
- User j is silent if wj = 0; Sj(0) = 0.
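A minimal simulation of this signal model, using random Gaussian signatures as in the achievability scheme described later; the sizes ℓ, n, α, γ below are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
ell, n, alpha, gamma = 1000, 200, 0.01, 10.0  # illustrative parameters

# User j transmits its signature S[j] when active (Wj = 1), and 0 otherwise.
S = rng.normal(scale=np.sqrt(gamma), size=(ell, n))
W = rng.random(ell) < alpha        # i.i.d. activities, P{Wj = 1} = alpha
Z = rng.normal(size=n)             # unit-variance Gaussian noise

Y = S[W].sum(axis=0) + Z           # received superposition of active users
print(int(W.sum()), "active users out of", ell)
```

Identifying the active set from Y is then a sparse recovery problem, which is the connection to compressed sensing made earlier in the talk.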
New challenges
- Time sharing is not good in general. If n = 1,000 and k = 2,000, an average user gets half a channel use!
- Classical joint typicality does not apply as n, ℓ → ∞:

    X_{1,1}  X_{1,2}  ...  X_{1,n}
    X_{2,1}  X_{2,2}  ...  X_{2,n}
      ...      ...           ...
    X_{ℓ,1}  X_{ℓ,2}  ...  X_{ℓ,n}

- The union bound fails because the number of error events, 2^k, grows exponentially with n (since k scales with n).
Identification code for the memoryless Gaussian MnAC
1. Let ℓ denote the total number of users.
2. If user j is inactive (Wj = 0), it transmits 0.
3. If user j is active (Wj = 1), it transmits s_j satisfying the power constraint (1/n) Σ_{i=1}^{n} s_{ji}^2 ≤ γ.
4. On average, k_ℓ users are active, with i.i.d. activities P{Wj = 1} = k_ℓ/ℓ.
5. Average identification error probability: p_ℓ = P{D(Y) ≠ (W1, ..., W_ℓ)}.
Main result 1: identification cost
Theorem. Let

    n_ℓ = ℓ H2(k_ℓ/ℓ) / ((1/2) log(1 + k_ℓ γ)).

For every ε ∈ (0, 1), as ℓ → ∞, arbitrarily reliable identification (p_ℓ → 0) is achievable with (1 + ε) n_ℓ channel uses, whereas p_ℓ → 0 is not achievable with (1 − ε) n_ℓ channel uses. Here

    H2(α) = α log(1/α) + (1 − α) log(1/(1 − α)).

- Intuition: the activity uncertainty of ℓ users is ℓ H2(k_ℓ/ℓ).
- Achievable code: random Gaussian sequences as signatures.
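The blocklength n_ℓ in the theorem is straightforward to evaluate. A sketch using base-2 logarithms (the base cancels in the ratio); γ = 10 corresponds to the 10 dB of the plot on the next slide, and k_ℓ = ⌈√ℓ⌉ is one of the plotted curves:

```python
import math

def H2(a):
    """Binary entropy, H2(a) = a log(1/a) + (1-a) log(1/(1-a))."""
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a) if 0 < a < 1 else 0.0

def identification_cost(ell, k, gamma):
    """Minimum blocklength n_ell = ell * H2(k/ell) / ((1/2) log(1 + k*gamma))."""
    return ell * H2(k / ell) / (0.5 * math.log2(1 + k * gamma))

gamma = 10.0
for ell in (2_000, 10_000):
    k = math.ceil(math.sqrt(ell))
    print(ell, identification_cost(ell, k, gamma))
```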
Dongning Guo (Northwestern Univ.) p. 23
Massive access and many-user information theory
![Page 72: @let@token Massive Access and Many-User Information Theoryghan/WAM/Dongning.pdf · Need aMany-UserInformation Theory I Classical single-user Information Theory: 1 user, coding blocklength](https://reader036.fdocuments.net/reader036/viewer/2022062603/5f48b4f16d5cac6438372cc4/html5/thumbnails/72.jpg)
Motivation Identification Classical IT Many-access Many-broadcast Concluding remarks
Main result 1: identification cost
TheoremLet
n` =`H2(k`/`)
12 log(1 + k`γ)
.
For every ε ∈ (0, 1), as `→∞, arbitrarily reliable identification (p` → 0) isachievable with (1 + ε)n` channel uses; whereas p` → 0 is not achievable with(1− ε)n` channel uses. Here
H2(α) = α log1
α+ (1− α) log 1
1− α.
I Intuition: The activity uncertainty of ` users is `H2(k`/`).
I Achievable code: random Gaussian sequences as signatures.
Dongning Guo (Northwestern Univ.) p. 23
Massive access and many-user information theory
![Page 73: @let@token Massive Access and Many-User Information Theoryghan/WAM/Dongning.pdf · Need aMany-UserInformation Theory I Classical single-user Information Theory: 1 user, coding blocklength](https://reader036.fdocuments.net/reader036/viewer/2022062603/5f48b4f16d5cac6438372cc4/html5/thumbnails/73.jpg)
Motivation Identification Classical IT Many-access Many-broadcast Concluding remarks
Main result 1: identification cost
TheoremLet
n` =`H2(k`/`)
12 log(1 + k`γ)
.
For every ε ∈ (0, 1), as `→∞, arbitrarily reliable identification (p` → 0) isachievable with (1 + ε)n` channel uses; whereas p` → 0 is not achievable with(1− ε)n` channel uses. Here
H2(α) = α log1
α+ (1− α) log 1
1− α.
I Intuition: The activity uncertainty of ` users is `H2(k`/`).
I Achievable code: random Gaussian sequences as signatures.
Dongning Guo (Northwestern Univ.) p. 23
Massive access and many-user information theory
![Page 74: @let@token Massive Access and Many-User Information Theoryghan/WAM/Dongning.pdf · Need aMany-UserInformation Theory I Classical single-user Information Theory: 1 user, coding blocklength](https://reader036.fdocuments.net/reader036/viewer/2022062603/5f48b4f16d5cac6438372cc4/html5/thumbnails/74.jpg)
Motivation Identification Classical IT Many-access Many-broadcast Concluding remarks
Main result 1: identification cost
TheoremLet
n` =`H2(k`/`)
12 log(1 + k`γ)
.
For every ε ∈ (0, 1), as `→∞, arbitrarily reliable identification (p` → 0) isachievable with (1 + ε)n` channel uses; whereas p` → 0 is not achievable with(1− ε)n` channel uses. Here
H2(α) = α log1
α+ (1− α) log 1
1− α.
I Intuition: The activity uncertainty of ` users is `H2(k`/`).
I Achievable code: random Gaussian sequences as signatures.
Dongning Guo (Northwestern Univ.) p. 23
Massive access and many-user information theory
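The theorem's formula is straightforward to evaluate numerically. The following sketch (my illustration, not part of the talk; the parameter values are arbitrary examples) computes the identification cost $n_\ell$ in the natural-logarithm convention used above.

```python
import math

def binary_entropy(a):
    """H2(a) = a*log(1/a) + (1-a)*log(1/(1-a)), in nats."""
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a * math.log(a) - (1.0 - a) * math.log(1.0 - a)

def identification_cost(l, k, gamma):
    """n_l from the theorem: channel uses needed to reliably identify
    k active users out of l at SNR gamma."""
    return l * binary_entropy(k / l) / (0.5 * math.log(1.0 + k * gamma))

# Example point in the talk's regime: l = 10000 users,
# k_l = ceil(l^(1/2)) active, gamma = 10 dB (i.e. 10 in linear scale)
n_id = identification_cost(10000, math.ceil(10000 ** 0.5), 10.0)
```

Note how $n_\ell$ is far smaller than $\ell$: the cost of resolving the activity pattern grows sublinearly in the user population.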
Identification cost vs. user number ($\gamma$ = 10 dB)

[Figure: identification cost $n(\ell)$ versus the number of users $\ell$ (0 to 10000), for $k_\ell = \lceil \ell^{2/3} \rceil$, $k_\ell = \lceil \ell^{1/2} \rceil$, and $k_\ell = \lceil \ell^{1/3} \rceil$. The cost grows sublinearly in $\ell$ and is largest for the densest activity pattern $k_\ell = \lceil \ell^{2/3} \rceil$.]

Dongning Guo (Northwestern Univ.) p. 24
Identification and channel code for the Gaussian MnAC

An $(M, n)$ symmetric code:

1. Encoders $\mathcal{E}_k : \{0,\dots,M\} \to \mathcal{S}_k^n$ yield codewords $s_k(0), \dots, s_k(M)$ with
$$ \frac{1}{n}\sum_{i=1}^{n} s_{ki}^2(w) \le \gamma, \quad \forall w \in \{1,\dots,M\}, $$
and $s_k(0) = \mathbf{0}$.
2. Decoder $\mathcal{D} : \mathcal{Y}^n \to \{0,\dots,M\}^{\ell_n}$.

I.i.d. messages $\{W_k\}$:
$$ P\{W_k = w\} = \begin{cases} 1-\alpha, & w = 0, \\ \dfrac{\alpha}{M}, & w \in \{1,\dots,M\}. \end{cases} $$
$$ P_e^{(n)} = P\left\{\mathcal{D}(Y) \ne (W_1, \dots, W_{\ell_n})\right\}. $$

Dongning Guo (Northwestern Univ.) p. 25
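To make the code definition concrete, here is a small NumPy sketch (my illustration, not the paper's construction) of the i.i.d. message model and a random Gaussian codebook meeting the per-codeword power constraint; the values of `alpha`, `M`, `n`, and `gamma` are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_messages(l, M, alpha):
    """Each user is silent (W_k = 0) w.p. 1 - alpha, and otherwise picks
    one of M messages uniformly, per the slide's i.i.d. message model."""
    active = rng.random(l) < alpha
    return np.where(active, rng.integers(1, M + 1, size=l), 0)

def gaussian_codebook(M, n, gamma):
    """M codewords of length n with i.i.d. N(0, gamma) entries, scaled
    down where needed so (1/n) sum_i s_i^2 <= gamma holds for every
    codeword. Row 0 is the all-zero codeword s_k(0)."""
    C = rng.normal(0.0, np.sqrt(gamma), size=(M + 1, n))
    C[0] = 0.0
    power = np.maximum(np.sum(C**2, axis=1) / n, 1e-12)
    C *= np.minimum(1.0, np.sqrt(gamma / power))[:, None]
    return C

W = draw_messages(l=1000, M=16, alpha=0.05)
C = gaussian_codebook(M=16, n=64, gamma=10.0)
```

One such codebook would be drawn independently for each of the $\ell_n$ users.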
Capacity

- $(v(n))_{n=1}^{\infty}$ with $v(n) > 1$ is a sequence of asymptotically achievable message lengths for the MnAC if there exists a sequence of $(\lceil \exp(v(n)) \rceil, n)$ codes such that $P_e^{(n)}$ vanishes as $n \to \infty$.
- $B = (B(n))_{n=1}^{\infty}$ is said to be a symmetric capacity of the MnAC if, for every $\varepsilon \in (0,1)$, $((1-\varepsilon)B(n))$ is asymptotically achievable but $((1+\varepsilon)B(n))$ is not.
- If $(B(n))$ is a capacity, then $(B(n) + o(B(n)))$ is also a capacity.
- In classical IT, $B(n) = nC$; in many-user IT, $B(n)$ may be nonlinear in $n$.

Dongning Guo (Northwestern Univ.) p. 26
Main result 2: capacity of Gaussian MnAC

Theorem (Symmetric capacity). Suppose $k_n \to \infty$, $k_n = O(n)$, and $\ell_n = o(e^{\delta k_n})$ for all $\delta > 0$. Then
$$ B(n) = \left( \frac{n}{2k_n}\log(1 + k_n\gamma) - \frac{H_2(\alpha)}{\alpha} \right)^{\!+} = \left( B_1(n) - \frac{H_2(\alpha)}{\alpha} \right)^{\!+} \text{ nats}. $$

- The capacity is $B_1(n)$ if $\alpha = 1$ or the set of active users is known.
- The penalty $H_2(\alpha)/\alpha$ is the total amount of activity uncertainty divided by the number of active users.
- If $H_2(\alpha)/\alpha > B_1(n)$, an average user cannot send 1 bit reliably.
- Large-system analysis ($k \to \infty$ after $n \to \infty$) obliterates the identification cost.

Dongning Guo (Northwestern Univ.) p. 27
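The capacity formula above is also easy to evaluate. This sketch (my illustration; the parameters mirror the plotted regime $\gamma$ = 10 dB, $k_n = n/4$, $\ell_n = n^{1.5}$) computes $B(n)$, including the $(\cdot)^+$ clamp when the identification penalty exceeds $B_1(n)$.

```python
import math

def H2(a):
    """Binary entropy in nats."""
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a * math.log(a) - (1.0 - a) * math.log(1.0 - a)

def symmetric_capacity(n, k_n, l_n, gamma):
    """B(n) = (B1(n) - H2(alpha)/alpha)^+ in nats per user, where
    alpha = k_n / l_n and B1(n) = n/(2 k_n) * log(1 + k_n * gamma)."""
    alpha = k_n / l_n
    B1 = n / (2.0 * k_n) * math.log(1.0 + k_n * gamma)
    penalty = H2(alpha) / alpha
    return max(B1 - penalty, 0.0)

# One curve from the talk's plot: gamma = 10 dB, k_n = n/4, l_n = n^1.5
n = 10000
B = symmetric_capacity(n, n // 4, n ** 1.5, 10.0)
bits = B / math.log(2)   # convert nats to bits
```

At small $n$ the penalty dominates and the formula correctly returns zero, matching the flat portions of the plotted curves.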
Capacity (message length) vs. blocklength ($\gamma$ = 10 dB, $k_n = n/4$)

[Figure: symmetric capacity $B(n)$ in bits versus blocklength $n$ (0 to 10000), for $\ell_n = n$, $n^{1.5}$, $n^2$, and $n^3$. The larger the user population $\ell_n$, the larger the identification penalty and the smaller $B(n)$.]

Dongning Guo (Northwestern Univ.) p. 28
Proof using an equivalent model

$$ Y = SX + Z $$

- Concatenated codebook:
$$ S = \left[\, s_1(1), \dots, s_1(M),\; \dots,\; s_{\ell_n}(1), \dots, s_{\ell_n}(M) \,\right]_{n \times (M\ell_n)} $$
- $X \in \{0,1\}^{M\ell_n}$ selects the codewords:
$$ X = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_{\ell_n} \end{bmatrix}, \qquad X_k = \mathbf{0} \text{ or } e_j = [0 \cdots 0 \; 1 \; 0 \cdots 0]^{\top}. $$

Dongning Guo (Northwestern Univ.) p. 29
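The equivalent model is just a sparse linear system. A tiny NumPy sketch (my illustration; all dimensions are arbitrary toy values) builds $S$, a valid selector $X$ with at most one unit-vector block per user, and the observation $Y = SX + Z$:

```python
import numpy as np

rng = np.random.default_rng(1)

l_n, M, n, gamma = 8, 4, 32, 10.0

# Concatenated codebook: n x (M * l_n); column k*M + (j-1) is s_k(j).
S = rng.normal(0.0, np.sqrt(gamma), size=(n, M * l_n))

# X holds one block per user: X_k is all-zero (inactive) or a unit
# vector e_j selecting message j.
W = rng.integers(0, M + 1, size=l_n)   # W_k = 0 means user k is inactive
X = np.zeros(M * l_n)
for k, w in enumerate(W):
    if w > 0:
        X[k * M + (w - 1)] = 1.0

Z = rng.normal(size=n)
Y = S @ X + Z
```

Decoding the MnAC is thus equivalent to recovering a block-sparse binary vector from noisy linear measurements.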
Proof of achievability: The codebooks

[Figure: each user's length-$n$ codeword is the concatenation of a signature segment and a message-bearing segment; the codebooks of users $1, \dots, \ell_n$ are shown side by side.]

- The first $n_0$ symbols form a user-specific signature. The remaining symbols carry data.
- The capacity is achieved by separate identification and decoding.
- X. Chen and D. Guo, "Gaussian many-access channels with random user activities," ISIT 2014.

Dongning Guo (Northwestern Univ.) p. 30
Achievability: Separate identification and decoding

$$ Y = \begin{bmatrix} Y^a_{n_0 \times 1} \\ Y^b_{(n-n_0) \times 1} \end{bmatrix} = \begin{bmatrix} S^a X^a + Z^a \\ S^b X + Z^b \end{bmatrix} $$

1. Identification:
$$ \begin{aligned} \text{minimize} \quad & \|Y^a - S^a x^a\|^2 \\ \text{subject to} \quad & x^a \in \{0,1\}^{\ell_n}, \quad \sum_{i=1}^{\ell_n} x^a_i \le \left(1 + 2k_n^{-1/3}\right) k_n \end{aligned} $$
2. ML joint message decoding based on the result of identification:
$$ \begin{aligned} \text{minimize} \quad & \|Y^b - S^b x\|^2 \\ \text{subject to} \quad & x_k = e_{w_k}, \quad \forall k = 1, \dots, k_n. \end{aligned} $$

Dongning Guo (Northwestern Univ.) p. 31
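The identification step is an integer least-squares problem over activity patterns. Exhaustive search is exponential in $\ell$ and only feasible on a toy instance, but it illustrates the objective; this sketch is my illustration (the sizes, noise level, and true pattern are arbitrary), not the analysis used in the talk.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

l, n0, gamma = 6, 24, 10.0
Sa = rng.normal(0.0, np.sqrt(gamma), size=(n0, l))  # one signature per user

x_true = np.array([1.0, 0.0, 1.0, 0.0, 0.0, 1.0])   # true activity pattern
Ya = Sa @ x_true + 0.1 * rng.normal(size=n0)        # noisy superposition

# Brute-force least-squares activity detection: try every binary
# pattern and keep the one with the smallest residual.
best, best_cost = None, np.inf
for pattern in itertools.product([0.0, 1.0], repeat=l):
    xa = np.array(pattern)
    cost = float(np.sum((Ya - Sa @ xa) ** 2))
    if cost < best_cost:
        best, best_cost = xa, cost

# Step 2 (ML message decoding) would then search only over the users
# declared active in `best`.
```

With $n_0$ on the order of the identification cost, the correct pattern wins by a wide margin, which is what the error-exponent argument on the next slide makes precise.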
Identification error probability

- A simple union bound fails because of the exponential number of error events.
- The identification error event is decomposed into a polynomial number of events ($t_1$ misses and $t_2$ false alarms):
$$ E = \bigcup_{t_1, t_2} E_{t_1, t_2} $$
- Techniques from Gallager '68 upper bound the error:
$$ P\{E_{t_1,t_2} \mid X^a = x^a\} \le e^{-n h_n(t_1,t_2)}, $$
where for large enough $n$, $h_n(t_1,t_2) \ge c_0(\varepsilon) > 0$ for all $(t_1,t_2)$.
- Hence identification is correct with high probability using $n_0$ channel uses.

Dongning Guo (Northwestern Univ.) p. 32
ML decoding error probability

- Characterized similarly by lower bounding the error exponents.
- With $k_n$ active users, the error event is decomposed into $k_n$ events according to the number of users in error:
$$ E = \bigcup_{k=1}^{k_n} E_k $$
- Error exponent:
$$ P\{E_k\} \le e^{-n f(k,\rho)}, \quad \forall \rho \in [0,1], $$
$$ f(k,\rho) = E_0\!\left(\frac{k}{k_n}, \rho\right) - \rho\,\frac{k}{n}\log M - \frac{k_n}{n} H_2\!\left(\frac{k}{k_n}\right). $$
- Let $\log M = B(n) - \varepsilon n/k_n$. For large enough $n$, there exists $d(\varepsilon) > 0$ such that
$$ \min_{k}\; \max_{\rho \in [0,1]} f(k,\rho) \ge d(\varepsilon). $$
- The decoding error probability therefore vanishes; hence the achievability.

Dongning Guo (Northwestern Univ.) p. 33
Recap

Theorem (Symmetric capacity).
$$ B(n) = \left( \frac{n}{2k_n}\log(1 + k_n\gamma) - \frac{H_2(\alpha_n)}{\alpha_n} \right)^{\!+} = \left( B_1(n) - \frac{H_2(\alpha_n)}{\alpha_n} \right)^{\!+} \text{ nats}. $$

- Achieved by using random Gaussian codebooks with separate identification and decoding.
- Large-system analysis ($k \to \infty$ after $n \to \infty$) would obliterate the identification cost.

Dongning Guo (Northwestern Univ.) p. 34
![Page 110: @let@token Massive Access and Many-User Information Theoryghan/WAM/Dongning.pdf · Need aMany-UserInformation Theory I Classical single-user Information Theory: 1 user, coding blocklength](https://reader036.fdocuments.net/reader036/viewer/2022062603/5f48b4f16d5cac6438372cc4/html5/thumbnails/110.jpg)
Motivation Identification Classical IT Many-access Many-broadcast Concluding remarks
Outline
I (Almost practical) neighbor discovery
I Classical information theory
I Many-access channel
I Many-broadcast channel
Classical degraded broadcast channels (BC)

I 2-user BC: P_{Y1 Y2 | X}.
I Degraded if X–Y1–Y2 is Markov.
I The capacity region is the union, over all P_{UX} with U–X–Y1–Y2, of

  { (R1, R2) : 0 ≤ R2 ≤ I(U; Y2), 0 ≤ R1 ≤ I(X; Y1 | U) }.

I Generalizing to a k-user degraded BC:

  Rj ≤ I(Uj; Yj | Uj+1), j = 1, . . . , k,

  where (0 = Uk+1)–Uk–. . .–(U1 = X)–Y1–. . .–Yk.
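For the Gaussian case this union specializes to superposition coding with a power split: the weak user decodes the cloud center treating the rest as noise, and the strong user subtracts it first. A sketch (the power split α and the noise levels are illustrative choices, not values from the talk):

```python
import math

def gaussian_degraded_bc_rates(gamma, sigma1_sq, sigma2_sq, alpha):
    """Rate pair (nats/symbol) achieved by superposition coding with power split alpha.
    Requires sigma1_sq <= sigma2_sq, i.e., user 1 is the stronger receiver."""
    assert 0.0 <= alpha <= 1.0 and sigma1_sq <= sigma2_sq
    # Weak user: decodes its cloud center U, treating the fraction alpha as noise.
    r2 = 0.5 * math.log(1 + (1 - alpha) * gamma / (sigma2_sq + alpha * gamma))
    # Strong user: subtracts U, then decodes its own message: R1 = I(X; Y1 | U).
    r1 = 0.5 * math.log(1 + alpha * gamma / sigma1_sq)
    return r1, r2

# Sweep alpha from 0 to 1 to trace the boundary of the region.
r1, r2 = gaussian_degraded_bc_rates(gamma=10.0, sigma1_sq=1.0, sigma2_sq=2.0, alpha=0.3)
```

At α = 1 all power serves the strong user; at α = 0 all power serves the weak user.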
Gaussian degraded many-broadcast channel (MnBC)

I kn channel outputs:

  Yj = X + σn,j Zj , j = 1, 2, . . . , kn,

  where the Zj are i.i.d. N(0, 1) and σn,j ≤ σn,j+1.
I kn → ∞ monotonically, with kn = O(n).
I Power constraint γ.
I How many bits can one send to each user reliably?
I The noise levels form a triangular array (σn,j : n = 1, 2, . . . ; j = 1, . . . , kn).
Definitions

I A triangular array:

  V =  v1,1
       v2,1  v2,2
       v3,1  v3,2  v3,3
       ...
       vn,1  vn,2  . . .  vn,kn
       ...

I A triangular array V = (vn,j : n = 1, 2, . . . ; j = 1, . . . , kn) describes an asymptotically achievable message length array for an MnBC if there exists a sequence of ((2^⌈vn,j⌉)_{j=1}^{kn}, n) codes such that lim_{n→∞} Pe^(n) = 0.
I The message length capacity of an MnBC is a collection of triangular arrays B such that (1 − δ)B is asymptotically achievable and (1 + δ)B is not asymptotically achievable.
Capacity results

I Let the auxiliary variables U1, . . . , Ukn be jointly Gaussian.
I Power allocation is described by a triangular array (αn,j : n = 1, 2, . . . ; j = 1, . . . , kn), with Σ_{j=1}^{kn} αn,j = 1.
I User j's SNR is then αn,j γ / σ²n,j.
I An asymptotically achievable triangular array:

  Bn,j = Bj(n) = (n/2) log( 1 + αn,j γ / ( σ²n,j + Σ_{i=1}^{j−1} αn,i γ ) ).
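A row of this array is straightforward to compute: interference from the messages of the stronger users (smaller j) accumulates as we move down the degraded order. A sketch, with the kn = 2 case as a sanity check against the classical two-user Gaussian BC (parameters illustrative):

```python
import math

def achievable_lengths(n, gamma, sigma_sq, alpha):
    """B_{n,j} = (n/2) log(1 + alpha_j*gamma / (sigma_j^2 + sum_{i<j} alpha_i*gamma)),
    in nats. sigma_sq must be nondecreasing (degraded order); alpha sums to 1."""
    assert abs(sum(alpha) - 1.0) < 1e-9
    lengths = []
    interference = 0.0  # sum_{i<j} alpha_i * gamma from stronger users' messages
    for a_j, s_j in zip(alpha, sigma_sq):
        lengths.append(0.5 * n * math.log(1 + a_j * gamma / (s_j + interference)))
        interference += a_j * gamma
    return lengths

# Two users, illustrative split: matches superposition coding for the 2-user BC.
vals = achievable_lengths(n=1, gamma=10.0, sigma_sq=[1.0, 2.0], alpha=[0.3, 0.7])
```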
Gaussian MnBC: a numerical example

n = 1000, γ = 20, kn = 250 (i.e., c = 1/4), and σ²j = exp[ −j/(1 + j)² ].

[Figure: message length vs. β (user index k = βn), showing the numerical evaluation of Cj(n) for n = 1000 and its asymptotic approximation.]

Using uniform power allocation:
I Can send ≥ 1 bit reliably to all users;
I Can send ≥ 10 bits reliably to about 20% of the users.
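Both claims can be checked by plugging the slide's parameters into the Bj(n) expression, using log base 2 so that message lengths come out in bits. The noise profile σ²j = exp[−j/(1+j)²] below is my reading of the garbled slide (it is nondecreasing in j, as degradedness requires):

```python
import math

n, gamma, k_n = 1000, 20.0, 250
# Assumed reconstruction of the slide's noise profile (nondecreasing in j).
sigma_sq = [math.exp(-j / (1 + j) ** 2) for j in range(1, k_n + 1)]
alpha = 1.0 / k_n  # uniform power allocation

bits, interference = [], 0.0
for s in sigma_sq:
    snr = alpha * gamma / (s + interference)   # SINR of user j
    bits.append(0.5 * n * math.log2(1 + snr))  # message length in bits
    interference += alpha * gamma              # stronger users' share piles up

at_least_1 = all(b >= 1 for b in bits)
frac_10bits = sum(b >= 10 for b in bits) / k_n
```

Under these assumptions every user gets more than 1 bit and roughly a fifth to a quarter of the users get at least 10 bits, matching the slide's claims.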
Grouping

I What is the optimal strategy if a fraction q of the users can be dropped?
I It is optimal to drop the group of qkn least capable users.
I In general, multiple groups with different rates, leading to a notion of "capacity region."

[Figure: message length vs. β (user index j = βn) under uniform power allocation, symmetric-rate power allocation, and symmetric rate with 30% outage.]
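One way to read the "symmetric rate with 30% outage" curve: drop the qkn noisiest users, then choose the power split that equalizes the remaining users' message lengths. A sketch under that interpretation (bisection on the common length; the required power per user is increasing in the target, so feasibility is monotone; all parameters illustrative):

```python
import math

def alpha_for_common_bits(b, n, gamma, sigma_sq):
    """Power fractions giving every served user exactly b bits; may sum to > 1."""
    target = 2 ** (2 * b / n) - 1           # SINR needed for b bits over n uses
    alphas, interference = [], 0.0
    for s in sigma_sq:                      # sigma_sq nondecreasing (degraded order)
        a = target * (s + interference) / gamma
        alphas.append(a)
        interference += a * gamma
    return alphas

def symmetric_bits_with_outage(n, gamma, sigma_sq, q):
    """Common message length (bits) when the fraction q of noisiest users is dropped."""
    served = sorted(sigma_sq)[: round(len(sigma_sq) * (1 - q))]
    lo, hi = 0.0, 0.5 * n * math.log2(1 + gamma / served[0])
    for _ in range(60):                     # bisect: total power is increasing in b
        mid = (lo + hi) / 2
        if sum(alpha_for_common_bits(mid, n, gamma, served)) <= 1.0:
            lo = mid
        else:
            hi = mid
    return lo
```

Dropping the least capable users frees power for the rest, so the common length rises with q.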
Many-Source coding?

An ln × n array of source symbols:

  X1,1  X1,2  . . .  X1,n
  X2,1  X2,2  . . .  X2,n
  ...
  Xln,1 Xln,2 . . .  Xln,n

I No joint typicality in general.
I A certain joint typicality can be established if the sources form a Markov chain.
Conclusion

I Proposed a new many-user paradigm.
I Determined the minimum device identification cost.
I Determined the symmetric capacity of the Gaussian many-access channel with random user activities.
I Capacity results for the Gaussian degraded many-broadcast channel were also developed (not shown here).
I large-system ≠ many-user.
I Ongoing: other many-user channel models, source coding, rate distortion.
I The goal is to develop a Many-User Information Theory for emerging many-user systems.
Remark: a related work
The only prior many-user model in the literature is the noiseless binary adder channel:

S.-C. Chang and E. Weldon, "Coding for t-user multiple-access channels," IEEE Trans. Inform. Theory, vol. 25, no. 6, pp. 684-691, 1979.

The number of users and the blocklength are taken to infinity simultaneously. They studied uniquely decodable multiuser codes and the capacity.
Remark: large-system ≠ many-user

I Large-system analysis of CDMA and MIMO: send the number of users and the spreading factor to infinity simultaneously with a fixed ratio, or the numbers of transmit and receive antennas with a fixed ratio; blocklength n → ∞ before that. [Foschini & Gans '96, Telatar '99, Verdu & Shamai '99, Tanaka '02, Guo & Verdu '05, Huh, Tulino & Caire '12]
I Massive MIMO: first n → ∞, then the number of antennas → ∞. [Rusek, Persson, Lau, Larsson, Marzetta, Edfors & Tufvesson '13, Hoydis, ten Brink & Debbah '13]
I The CEO problem [Berger & Zhang '96]: n → ∞ before the number of agents.
I Broadcast strategy for point-to-point slow-fading channels [Shamai '97]: n → ∞ before the number of layers.