Sying-Jyan Wang Dept. Computer Science and Engineering National Chung-Hsing University.
Sying-Jyan Wang
Dept. Computer Science and Engineering
National Chung-Hsing University
Outline
- Fundamentals
  - Information theory
  - Data compression
- Digital testing: an introduction
- Code-based test data compression: a case study
  - Maximum compression
  - Improving compression rate
112/04/10 2
Compressibility of a Data Set
Data compression (source coding)
- The process of encoding information using fewer bits than an unencoded representation would, through specific encoding schemes.
How much compression can you get? It depends on the encoding scheme and on the data.
How do we quantify the compressibility of a data set?
Shannon Entropy (1/4)
A measure of the uncertainty associated with a random variable
- Quantifies the information contained in a message as an expected value (usually in bits), or
- A measure of the average information content one is missing when one does not know the value of the random variable
Self-information: −log p(x)

C.E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, July, October 1948.

$$H(X) = -\sum_{x \in X} p(x)\log p(x) = \sum_{x \in X} p(x)\log\frac{1}{p(x)}$$
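The definition above can be checked in a few lines of Python (a sketch; `shannon_entropy` is an illustrative helper name, not from the slides):

```python
import math

def shannon_entropy(probs):
    """H(X) = sum over x of p(x) * log2(1/p(x)); zero-probability terms contribute nothing."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([1.0]))         # certain outcome: 0.0 bits
```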
Shannon Entropy (2/4)
Self-information
- How much information is carried by a particular value of a random variable?
XXXXXXXXX XXXX XXXX XX XXXXXX XXXXXXXXX.
XXXXXXXXX XXXX XeXX XX XXXXXX XXXXXXXXX.
XXXXXXXXX XXXX wXnt XX XXXXXX XXXXXXXXX.
Professor Wang wXnt to Taipei yesterday.
Shannon Entropy (3/4)
Tossing a coin with known probabilities of coming up heads or tails
- Maximum entropy of the next toss if the coin is fair: 1 bit of information
- A double-headed coin: 0 bits (no information)
Shannon Entropy (4/4)
Shannon's entropy gives an absolute limit on the best possible lossless compression of any communication.
- A source produces a sequence of letters chosen from among A, B, C, D with probabilities 1/2, 1/4, 1/8, 1/8, successive symbols being chosen independently
- H = 7/4 bits per symbol
- Encoding: A → 0, B → 10, C → 110, D → 111
Example sequence: AAAA BB DC
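A quick Python check of this example (sketch; variable names are illustrative): the prefix code meets the 7/4-bit entropy bound exactly, and the sample sequence AAAABBDC takes 14 bits instead of the 16 a fixed 2-bit code would need.

```python
import math

probs = {'A': 1/2, 'B': 1/4, 'C': 1/8, 'D': 1/8}
code  = {'A': '0', 'B': '10', 'C': '110', 'D': '111'}

H = sum(p * math.log2(1 / p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(H, avg_len)                      # 1.75 1.75 — average code length equals entropy

encoded = ''.join(code[s] for s in "AAAABBDC")
print(encoded, len(encoded))           # 14 bits vs. 16 for a fixed 2-bit encoding
```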
Entropy in Thermodynamics
- A measure of the unavailability of a system's energy to do work; also a measure of disorder; the higher the entropy, the greater the disorder.
- A measure of disorder; the higher the entropy, the greater the disorder.
- In thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder, the higher the entropy.
- A measure of disorder in the universe or of the availability of the energy in a system to do work.
Eight Allotropes of Carbon (from Wikipedia)
(a) Diamond, (b) Graphite, (c) Lonsdaleite, (d) Fullerene (C60), (e) Fullerene (C540), (f) Fullerene (C70), (g) Amorphous carbon, (h) Carbon nanotube (CNT)
Allotropy is a behavior exhibited by certain chemical elements: these elements can exist in two or more different forms, known as allotropes of that element. In each allotrope, the element's atoms are bonded together in a different manner. Allotropes are different structural modifications of an element.
Carbon Allotropes Span a Range of Extremes
Synthetic diamond nanorods are the hardest materials known.
Graphite is one of the softest materials known.
Diamond is the ultimate abrasive.
Graphite is a very good lubricant.
Diamond is an excellent electrical insulator.
Graphite is a conductor of electricity.
Diamond is the best thermal conductor known.
Some forms of graphite are used for thermal insulation
Diamond is highly transparent.
Graphite is opaque.
Diamond crystallizes in the cubic system.
Graphite crystallizes in the hexagonal system.
Amorphous carbon is completely isotropic.
Carbon nanotubes are among the most anisotropic materials ever produced.
Amorphous Carbon vs. Carbon Nanotube
Shannon's Entropy vs. Entropy: An Analogy
- Various crystalline structures of the same element exhibit different properties.
- How about different structures of the same set of data?
- Can we put the data into a format that is easier to compress (via a transform f and its inverse f⁻¹)?
Digital Testing: A Quick Overview (1/3)
- All manufactured components are subject to physical defects.
- Digital testing is the process of verifying whether a chip (system) meets its specification.
- Requirements: input vectors can only be applied to the chip's input pins, and only circuit outputs are observable.
[Figure: test vectors are applied to the manufactured circuit; a comparator checks the circuit response against the correct responses and reports pass/fail.]
Digital Testing: A Quick Overview (2/3)
Why is testing difficult?
- Test application time explodes for exhaustive testing of VLSI. For a combinational circuit with 50 inputs, we need 2^50 ≈ 1.126×10^15 test patterns. Assuming one test per 10^-7 s, it takes about 1.126×10^8 s ≈ 3.57 years to test such a circuit.
- Test generation for sequential circuits is even more difficult: flip-flops (latches) lack controllability and observability.
How to generate input test vectors (patterns):
- Functional testing
- Fault-oriented test generation
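The exhaustive-testing arithmetic above is easy to verify directly (a quick sketch):

```python
patterns = 2 ** 50                          # exhaustive test set for 50 inputs
seconds = patterns * 1e-7                   # one test applied every 10^-7 s
years = seconds / (365 * 24 * 3600)
print(patterns, seconds, round(years, 2))   # ~1.126e15 patterns, ~3.57 years
```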
Digital Testing: A Quick Overview (3/3)
Fault models
- Model the effects of physical defects on the logic function and timing.
- Common fault models: stuck-at fault, bridging fault, memory fault, etc.
Single stuck-at fault model: the most widely used fault model. Assumptions:
- Only one line is faulty; gates are fault-free.
- The faulty line is permanently set to 0 or 1; the fault can be at an input or output of a gate.
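As a toy illustration of the single stuck-at model (a hypothetical two-gate circuit, not from the slides): a test pattern detects a fault exactly when the good and faulty circuits produce different outputs.

```python
from itertools import product

def good(a, b, c):
    """Good circuit: z = (a AND b) OR c."""
    return (a & b) | c

def faulty(a, b, c):
    """Same circuit with the AND-gate output stuck-at-1."""
    return 1 | c

# A detecting pattern must drive the faulty line to the opposite value
# (a & b = 0) and propagate the difference to the output (c = 0).
tests = [v for v in product((0, 1), repeat=3) if good(*v) != faulty(*v)]
print(tests)   # [(0, 0, 0), (0, 1, 0), (1, 0, 0)]
```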
Scan-Based Design-for-Testability (DFT)
- Sequential circuits are hard to test.
- Scan-based design: modify all flip-flops to provide parallel load and scan shift.
- Provides controllability and observability of internal state variables for testing.
- Turns the sequential test problem into a combinational one.
[Figure: combinational logic with primary inputs X, primary outputs Z, present state y, and next state Y; the state vector is held in scan flip-flops (D, DI, SI) chained between ScanIn and ScanOut.]
Test Compaction & Compression
Why?
- Modern chips are large and require a lot of test data: more than 1 Gb for a chip.
- Automatic test equipment (ATE) is expensive; test time must be reduced.
- Test data contain a lot of X (don't-care) bits: up to 97%.
Compaction: reduce the number of test patterns by merging compatible patterns.
Compression: reduce data volume through coding and architectural approaches.
Code-Based Test Data Compression
Partition test data into symbols, and replace symbols by codewords.
- Fixed-to-fixed: lengths of both symbols and codewords are fixed (linear decompressor, dictionary code)
- Fixed-to-variable: Huffman code
- Variable-to-fixed: run-length code
- Variable-to-variable: Golomb, FDR, VHIC
Example data: AAAA BB DC
Fixed-Symbol-Length Codes (1/2)
Notation: p0, p1, pX are the probabilities of 0, 1, and X in a test set.
Entropy analysis: assume pX = 0 and let the symbol length be n. Then

$$H = \sum_{i=0}^{n} \binom{n}{i}\, p_0^{\,i} (1-p_0)^{n-i} \log \frac{1}{p_0^{\,i} (1-p_0)^{n-i}}$$

[Figure: entropy H versus p0 for symbol lengths n = 2, 4, 6, 8, 10.]
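The symbol entropy above can be evaluated numerically (a sketch assuming pX = 0 and independent bits); for i.i.d. bits it equals n times the binary entropy, so an unbiased source is incompressible.

```python
import math
from math import comb

def symbol_entropy(p0, n):
    """Entropy of an n-bit symbol whose bits are i.i.d. with P(0) = p0."""
    p1 = 1 - p0
    H = 0.0
    for i in range(n + 1):                 # i = number of 0's in the symbol
        p_sym = p0 ** i * p1 ** (n - i)    # probability of one such symbol
        if p_sym > 0:
            H += comb(n, i) * p_sym * math.log2(1 / p_sym)
    return H

print(symbol_entropy(0.5, 8))              # 8.0 — no structure to exploit
print(round(symbol_entropy(0.9, 8), 3))    # 3.752 — skewed data compress well
```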
Fixed-Symbol-Length Codes (2/2)
Maximum compression rate:

$$\frac{\text{symbol\_length} - \text{entropy}}{\text{symbol\_length}}$$

No compression when p0 = p1 = 0.5.
[Figure: maximum compression versus p0.]
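The maximum compression rate above can be sketched in a few lines; since the symbol entropy for i.i.d. bits is n·h(p0), the rate simplifies to 1 − h(p0), independent of n (a derivation consistent with the formula above, not stated on the slide).

```python
import math

def binary_entropy(p0):
    return sum(p * math.log2(1 / p) for p in (p0, 1 - p0) if p > 0)

def max_compression(p0, n):
    """(symbol_length - entropy) / symbol_length for n-bit symbols of i.i.d. bits."""
    entropy = n * binary_entropy(p0)
    return (n - entropy) / n

print(max_compression(0.5, 8))             # 0.0 — unbiased data: no compression
print(round(max_compression(0.9, 8), 3))   # 0.531
```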
Variable-Symbol-Length Codes (1/3)
Most variable-symbol-length codes are based on run-length encoding.
- If runs of both 0's and 1's are encoded — Theorem 1: the expected run length of 0's is 1/p1, and the expected run length of 1's is 1/p0.
- If only runs of 0's are encoded — Corollary 2: the expected run length is 1/p1 when X-bits are assigned to 0's.
Why? Since a run of 0's is terminated by a 1, the expected run length of 0's is determined by the occurrence frequency of 1's. Similarly, the expected run length L1 = 1/p0 when X-bits are assigned to 1's.
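Theorem 1 is easy to confirm by simulation (a sketch; a run here is a block of 0's plus its terminating 1, so its length is geometrically distributed with mean 1/p1):

```python
import random

random.seed(1)
p1 = 0.2                                   # expected run length: 1/p1 = 5
bits = [1 if random.random() < p1 else 0 for _ in range(1_000_000)]

runs, zeros = [], 0
for b in bits:
    if b == 0:
        zeros += 1
    else:                                  # the 1 terminates the current run
        runs.append(zeros + 1)
        zeros = 0

print(sum(runs) / len(runs))               # close to 5.0
```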
Variable-Symbol-Length Codes (2/3)
Entropy analysis: assume 0-filling and let y = p0 + pX, so a bit is 0 with probability y and 1 with probability p1 = 1 − y. A run of l 0's terminated by a 1 has probability Pr(RL = l) = y^l p1, so

$$H = -\sum_{l} \Pr(RL = l) \log \Pr(RL = l) = -\frac{y}{p_1}\log y - \log p_1$$

[Figure: entropy H versus p1.]
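One way to evaluate the run-length entropy under 0-filling (a sketch; the closed form −(y/p1)·log2 y − log2 p1 follows from summing −Pr·log Pr over the geometric run-length distribution, and the code cross-checks it against a direct sum):

```python
import math

def run_entropy(y):
    """Closed-form run-length entropy when Pr(RL = l) = y**l * (1 - y)."""
    p1 = 1 - y
    return -(y / p1) * math.log2(y) - math.log2(p1)

y, p1 = 0.9, 0.1
direct = sum(p1 * y ** l * math.log2(1 / (p1 * y ** l)) for l in range(5000))
print(run_entropy(y), round(direct, 6))    # both ≈ 4.69 bits per run
```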
Variable-Symbol-Length Codes (3/3)
Maximum compression rate:

$$\frac{\text{avg\_symbol\_length} - \text{entropy}}{\text{avg\_symbol\_length}}$$

No compression when p0 = p1 = 0.5.
[Figure: maximum compression versus p0.]
Implications
- No compression if the data are truly random.
- Higher compression rate if the probabilities of 0's and 1's are skewed.
- Extremely skewed data are unlikely: low information rate.
How to improve the maximum compression rate?
- X-filling: X's are redundant information that can be compressed.
- ATPG: generate patterns with the desired distribution.
- Partition scan cells: rearrange data to create the desired distribution.
Scan Cells Partition (1/4)
In a given test set:
- The number of specified 1's and 0's in each scan cell may not be balanced: a not-so-random distribution in the cell.
- By putting "similar" cells together, we can create symbols with smaller entropy: a higher maximum compression rate and easier-to-compress data.
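A small sketch of this idea (made-up patterns, not from the slides): grouping 0-dominant and 1-dominant scan cells yields two skewed sub-streams whose per-bit entropy is far below that of the unpartitioned stream.

```python
import math

def bit_entropy(bits):
    p0 = bits.count('0') / len(bits)
    return sum(p * math.log2(1 / p) for p in (p0, 1 - p0) if p > 0)

# Rows are test patterns; column i is the value loaded into scan cell i.
patterns = ["000111", "001110", "000111", "100011"]
cells = list(zip(*patterns))

zero_dom = [i for i, c in enumerate(cells) if c.count('0') > c.count('1')]
one_dom = [i for i in range(len(cells)) if i not in zero_dom]

for name, idx in (("0-dominant", zero_dom), ("1-dominant", one_dom)):
    stream = ''.join(p[i] for p in patterns for i in idx)
    print(name, round(bit_entropy(stream), 3))          # 0.65 per partition
print("unpartitioned", bit_entropy(''.join(patterns)))  # 1.0
```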
Scan Cells Partition (3/4)
A bi-partitioned scan chain is not good enough:
- It is not natural; the two parts have to be encoded separately.
To achieve even better encoding efficiency:
- Introduce an inversion between the two parts.
- From the encoder's point of view, there is then only one partition.
Scan Cells Partition (4/4)
Multiple scan chains
Routing Consideration
How does the physical order of scan cells affect compressibility?
- 1000 random moves in the initial chain
[Table: results for S9234, S15850, and S38417, not partitioned vs. partitioned.]
Results (1/5)
- Mintest patterns; use bit-raising to create X's.
- Partition scan cells:
  - 0-dominant cells: #0 > #1
  - 1-dominant cells: #1 > #0
  - Balanced cells (#0 ≈ #1): assigned according to geometric adjacency
- Cells in a partition are connected for minimum routing length.
Results (3/5)
Routing Lengths
S38417 Shortest length = 0 = 5
Results (4/5)
Shift power
- Partitioned scan chains: lower entropy implies more homogeneous symbols, hence fewer signal transitions and lower shift power for input patterns.
- 0-filling: more 0's in the output response, hence lower overall shift power.
Results (5/5)
How about the compression rate of another test set?
- Assume the same partitioned scan chain order.
- Generate a new test set with ATALANTA.
- 2/3 of X's in chain-0 are assigned 0; 2/3 of X's in chain-1 are assigned 1.
About C.E. Shannon (1916–2001)
- Bachelor's degrees, U. Michigan, 1936: EE & Mathematics
- Master's, MIT, 1937: "A Symbolic Analysis of Relay and Switching Circuits," on designing electric computers using Boolean algebra. "… possibly the most important, and also the most famous, master's thesis of the century." -- Howard Gardner, Harvard University
- PhD, Mathematics, MIT, 1940: "An Algebra for Theoretical Genetics"
- Institute for Advanced Study, 1940–1941
- AT&T Bell Labs, 1941–1956: information theory
- Visiting Professor, MIT, 1956; tenured Professor, MIT, 1958, teaching for only several semesters
- Gambling & investing using the Kelly criterion with Ed Thorp
~ THE END ~
Thank you