Copyright 2007
Product Performance Intelligence
On-Demand Business Intelligence Solutions for
Product Lifecycle Management
SigmaQuest
www.sigmaquest.com
408-524-3180
Verigy Manufacturing and Test Process: Functional Test at CM (currently Flex in Shanghai)
[Process-flow diagram. Steps shown: FCT-Board1 (ICT); FCT-Board2 (ICT); ASSY (e.g. water block); Site Module Boundary Scan (genealogy in EEPROM); Site Module Functional Test; Site Module Assembly; Integration (System Controller, Test Head, Site Modules, Cables, etc.); System Test; System Burn-in; Final Test Inspection Sign-off (no correlation between sign-off and ship in ERP). Data capture annotations: Test Results (manual entry) at several steps; Parts Association (manual entry); Parametric Results (auto upload); Returns. Parametric data, pass/fail and repair data, and return data feed the QIS/FTS analyses (P/F + Repair + RMA + Genealogy).]
Verigy Manufacturing and Test Process with SigmaQuest Solution: Data Flow
[Data-flow diagram, one lane per CM (CM 1: Jabil; CM 2: Flex). Flow at each CM: FCT-Board (ICT) -> Assembly -> Site Module Boundary Scan (genealogy) -> Site Module Functional Test -> Integration (System, Test Head, Site Modules) -> System Test -> System Burn-in -> Final Test Inspection Sign-off, with a Repair loop and RMA handling. Each step writes to a File Server: pass/fail and parametric data, genealogy data, FA/repair data with genealogy updates (from Repair), and pass/fail data from Final Test Inspection Sign-off. SigmaProbe picks the files up via auto FTP and loads them into SigmaQuest. The ERP system supplies ship data, return data, and BOM data. SigmaQuest hosts the SigmaSure analyses.]
Current Status & Shortcomings of Verigy Process & QIS System
1. Data Collection:
a) Pass/fail test and repair data are entered manually, making the collected data subject to inconsistency, errors, and missing data fields.
b) There are two separate databases: pass/fail and repair data are in one db, parametrics are in another db.
c) No ICT data collection.
d) Manual input of associations data, approximately 430 inputs per system. Very time consuming and prone to error.
2. Current Quality Software Tool Maintenance:
a) The QIS system is the umbrella system and contains modules for failure tracking, warranty, failure analysis and communication, and quality. This system was developed internally at Verigy. System maintenance is sometimes limited due to IT resource availability. The charts are built from SQL Server Reporting.
b) Charts are hardcoded, and implementation of a new chart (utilizing a different database query) must be custom built, thereby requiring IT resources.
c) Slow lead time on any enhancements, requests, or changes.
3. Analytics:
a) Currently no method of associating repair actions to field failure modes.
b) The timestamp in the parametric data doesn't match the timestamp in the Failure Tracking System (QIS), mainly due to manual entry of data. Therefore it is difficult to correlate shop floor yields with parametric data.
c) In QIS, there is no automatic way of selecting relevant dates for data (i.e. no "suggested dates").
d) The system interface is not user friendly and is often error prone due to the complexity of field selection.
e) There is no "live drilldown" capability. The user must select the filter fields before the chart is drawn (and only certain charts are available).
f) There is no consolidated dashboard for management audiences.
g) Data crunching is manual and often requires data export to Excel for customizing charts.
h) Getting data at the appropriate time is not possible, resulting in delayed response to assembly teams.
Current Status & Shortcomings of Verigy Process & QIS System (Cont'd)
4. Process Limitations:
a) No method of confirming the test flow (sequence) that a particular device (or product) has undergone.
b) Low visibility: the data reporting system is not user friendly, which can lead to a time-consuming process for CPE, R&D, and others who may want to monitor product performance.
c) Limited to no traceability of the product by serial number once it leaves the factory.
d) Data is prone to error at the CM during the data entry process.
e) Manual entry to the FTS GUI.
f) Currently no capability to monitor system performance of fielded units.
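Item 4a, confirming the test sequence a device has undergone, becomes a mechanical check once flow data is collected consistently. A minimal sketch in Python (the expected flow and the step names are illustrative assumptions drawn from the examples later in this deck, not Verigy's documented routing):

```python
EXPECTED_FLOW = ["SCAN_TEST", "AC_SITE", "SYS_BURNIN", "Integration",
                 "SYS_TEST", "FINAL_TEST", "FINAL_INSP"]

def flow_conforms(observed, expected=EXPECTED_FLOW):
    """True if the observed step sequence visits every expected step in
    order (retests and repair loops permitted in between). Subsequence
    check: `in` on an iterator consumes it, so order is enforced."""
    it = iter(observed)
    return all(step in it for step in expected)

# A unit with a retest, but in the correct order:
print(flow_conforms(["SCAN_TEST", "AC_SITE", "AC_SITE", "SYS_BURNIN",
                     "Integration", "SYS_TEST", "FINAL_TEST", "FINAL_INSP"]))  # True
# A unit that jumped straight to final inspection:
print(flow_conforms(["SCAN_TEST", "AC_SITE", "FINAL_INSP"]))  # False
```

A real deployment would pull each unit's step sequence from the test database keyed by serial number; this only shows the conformance check itself.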
Verigy Data Evaluation
• Following slides show examples of SigmaSure Analysis of a sampling of pass/fail test, parametric test, and repair data.
• Modules demonstrated: Genealogy, Mfg/Test, Test Flow Visualization, Repair.
• Modules not demonstrated on Verigy data but which can be demonstrated on generic SigmaQuest dataset, and would be of significant value to Verigy: RMA, Maverick Analysis, Supplier Quality.
• Data slice is for approximately 1 month, Jan 4, 2007 – Feb 13, 2007. 10 production systems.
• Data supplied for this evaluation came from multiple queries of Verigy's Test/Repair databases and was supplied to SigmaQuest in CSV or Excel format.
• The nature of the parametric data precludes demonstration of the full capability of the SigmaQuest SPC and Parametric module (e.g. data for a particular test run under multiple conditions with multiple sets of spec limits is mixed together).
• Mixing of repair data and test/retest data at single stations (without correct "event" specification) causes some confusion in the Test Flow Visualization results and some of the data analysis wizards. The confusion is inherent in the data itself and the method under which it was imported, not in the SigmaQuest analyses.
• A Verigy demo-site was created: http://demo.sigmaquest.com/verigy Login: [email protected] Password: Please request from Barbara Johnson or Ed Matias
Sample Management Dashboard
Site Module Test Flow (Test Flow Visualization): Product Model Number E7120-66805
This diagram is a direct output of TFV and shows the sequence of test steps through which the particular serial number went. The test flow can be shown for any desired serial number in the database, simply by clicking on the serial number and selecting "show system test flow". See the next slide for a textual description of this diagram.
Test History of Product E7120-66805, Serial Number E7120668054634FS2007040050
1. First notice that station name FS_SysTest_2A1 is used in multiple test steps (SYS_BURNIN, Integration, SYS_TEST, FINAL_TEST, FINAL_INSP, and SIGN_OFF). Test steps map to Verigy's "TestAreaName" nomenclature. Station names (FS_SITE05, FS_Repair1, etc.) map to Verigy's "WorkStationName" nomenclature.
2. The device starts at SCAN_TEST (at FS_Bscan01), passes, and moves to the AC_SITE test.
3. The device fails at the AC_SITE test (for apache pac) and goes to the repair station (FS_Repair1). It is repaired and goes to a different station for retest (FS_SITE02).
4. The device fails again at the AC_SITE test (at FS_SITE02), this time for apache ecrdiag, and goes back to repair.
5. After repair, the device goes BACK to SCAN_TEST, where it passes.
6. The device then goes to yet another test station for the AC_SITE test (FS_SITE03), where it again fails, this time for an unknown reason (the TestGroupName field is left blank).
7. The device goes back for repair, then returns to FS_SITE03 again for the AC_SITE test.
8. The device fails the AC_SITE test again, again for an unknown reason (TestGroupName field again left blank).
9. The device goes back to repair again. After repair, it goes back to the original test station FS_SITE05 for the AC_SITE test. This time it passes and moves on to the SYS_BURNIN test step.
10. The device fails at SYS_BURNIN (for apache eeprom) and goes back to FS_SITE05 for the AC_SITE test. It again fails the AC_SITE test (this time for accalib) and goes to repair again.
11. After repair, the device goes back to the AC_SITE test, where it passes, and moves on to Integration, skipping any retest at SYS_BURNIN.
12. The device then passes to SYS_TEST, where it passes, then moves BACK to SYS_BURNIN, where it passes this time.
13. The device then moves on to FINAL_TEST, passes, and goes through the rest of the process.
14. This device was tested at AC_SITE 7 times; it was repaired at AC_SITE 5 times.
Site Module Test Flow (Test Flow Visualization): For All Serial Numbers
This diagram is a direct output of TFV and shows the sequence of test steps through which all Site Module products were processed. This flow has SCAN_TEST and FINAL_INSP highlighted, so we only see the flow of devices to and from these steps. The other steps can be highlighted in the same manner to show the corresponding information. A few examples of interesting data of note (for which there may or may not be a valid explanation):
1) Skips:
* 9 devices passed AC_SITE and proceeded directly to FINAL_INSP, skipping all test steps in between.
* One device passed at SCAN_TEST and proceeded directly to FINAL_INSP, skipping the test steps in between.
* 4 Devices skipped AC_SITE, and 2 devices skipped AC_SITE and Integration.
2) Passing fails: 12 devices failed AC_SITE and proceeded directly to FINAL_INSP; 2 devices failed SCAN_TEST and proceeded to AC_SITE.
3) Retests: 29 devices passed and were retested at Integration; 30 devices passed and were retested at FINAL_INSP.
4) Reprocessing of passed devices: 4 devices passed FINAL_INSP and were returned to SCAN_TEST; 6 passed FINAL_INSP and were returned to Integration.
From the chart, you can drill directly into the 9 systems that passed at AC_SITE and proceeded directly to FINAL_INSP to show the particular Serial Numbers and
corresponding Unit Reports. See next page.
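Skip anomalies like those listed above can be flagged automatically once per-unit step sequences exist. A small sketch (the expected step order is an illustrative assumption, not Verigy's documented routing):

```python
EXPECTED = ["SCAN_TEST", "AC_SITE", "SYS_BURNIN", "Integration",
            "SYS_TEST", "FINAL_TEST", "FINAL_INSP"]

def find_skips(flow, expected=EXPECTED):
    """Return the expected steps a unit never visited, given its observed
    step sequence in time order."""
    visited = set(flow)
    return [step for step in expected if step not in visited]

# A unit that went straight from AC_SITE to final inspection:
print(find_skips(["SCAN_TEST", "AC_SITE", "FINAL_INSP"]))
# ['SYS_BURNIN', 'Integration', 'SYS_TEST', 'FINAL_TEST']
# A fully processed unit skips nothing:
print(find_skips(EXPECTED))  # []
```

Running this over every serial number and tallying the non-empty results reproduces counts of the "N devices skipped step X" form.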
9 Site Modules which passed at AC_SITE and proceeded directly to FINAL_INSP
Site Module (66805 and 66806) FPY trend chart: “Functional Test”
Note: this chart is somewhat misleading, as it shows "Functional Test" FPY, which only looks at the first subtest in the "Functional Test Sequence", which is SCAN_TEST. The FPY at AC_SITE is the main contributor to failures, so it is better to look at the FPY by test in the Functional Test Sequence, whereby you can see the turn-on rate for each test separately. See the next page for this chart.
Site Module (66805 and 66806) FPY trend chart: “Functional Test”
This chart shows only the Site Module product 66805 (we have the most data for this PartNumber). It shows the FPY by test. Here we can see that the AC_SITE FPY is quite low compared to SCAN_TEST and SYS_BURNIN. For data points which show a 0% yield, there are most likely only one or two boards being tested, one or both of which fail, thus yielding a 0% FPY.
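FPY by test, as charted here, considers only each unit's first attempt at each step: retests after repair do not redeem a first-pass failure. A minimal sketch of the computation (field names and the 'P'/'F' result encoding are assumptions):

```python
from collections import defaultdict

def fpy_by_test(records):
    """First Pass Yield per test step: of the units that reached a step,
    the fraction whose FIRST attempt at that step passed. Records must be
    in time order per serial number."""
    first = {}  # (serial, step) -> result of first attempt
    for rec in records:
        key = (rec["SerialNumber"], rec["TestAreaName"])
        first.setdefault(key, rec["Result"])
    passed, total = defaultdict(int), defaultdict(int)
    for (_, step), result in first.items():
        total[step] += 1
        passed[step] += result == "P"
    return {step: passed[step] / total[step] for step in total}

records = [
    {"SerialNumber": "A", "TestAreaName": "AC_SITE", "Result": "F"},
    {"SerialNumber": "A", "TestAreaName": "AC_SITE", "Result": "P"},  # retest, ignored for FPY
    {"SerialNumber": "B", "TestAreaName": "AC_SITE", "Result": "P"},
]
print(fpy_by_test(records))  # {'AC_SITE': 0.5}
```

The small-denominator effect called out above falls straight out of this formula: with total[step] of 1 or 2, a single failing board drags the point to 0%.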
APE Board (E7088-66501) FPY by Test.
You can chart the yield of any product in the same manner. Here is the APE Board FPY by test for the one-month data slice.
Site Module Failures by TestAreaName (Test Name in SigmaQuest)
In this chart you can see failures by TestAreaName (this is the Verigy nomenclature) for the Site Module over the one-month data slice. Most of the failures occur at AC_SITE, which we have seen from the yield run chart previously. From the first "bin" or "bar" on this chart, you can directly drill into the TestGroup parameter to find out which TestGroup is failing most often within the AC_SITE test. See the next page for the chart.
Note that the date range shown is 12/2005-12/2007. This is an artifact of the automatically generated chart, which we designated to review 2 years' worth of data. There is really only a one-month data slice: 1/4/2007-2/14/2007.
Site Module: Failures at AC_SITE by Test Group
APE Failures by TestAreaName
In this chart you again see the failures by TestAreaName, this time for the APE board. This time, we'll drill into the PE_TEST bar to see the order of TestGroup failures (the test group for which PE_TEST failed the most). The resulting chart is on the next page. It is important to note that you can drill into any UnitReport property from these paretos; TestGroupName is only one of the many properties available.
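The pareto-with-drilldown pattern described here amounts to counting failures by one field, then re-counting by another field within the selected bar. A minimal sketch (field names are assumptions modeled on the slide's nomenclature; the records are synthetic):

```python
from collections import Counter

def pareto(records, field, **filters):
    """Count failing records by `field`, optionally restricted by prior
    drilldown selections (e.g. TestAreaName='PE_TEST'). Returns bars in
    descending order, as a pareto chart would draw them."""
    fails = (r for r in records if r["Result"] == "F"
             and all(r.get(k) == v for k, v in filters.items()))
    return Counter(r[field] for r in fails).most_common()

records = [
    {"Result": "F", "TestAreaName": "PE_TEST", "TestGroupName": "Apache_peftest"},
    {"Result": "F", "TestAreaName": "PE_TEST", "TestGroupName": "Apache_peftest"},
    {"Result": "F", "TestAreaName": "PE_TEST", "TestGroupName": "other"},
    {"Result": "P", "TestAreaName": "PE_TEST", "TestGroupName": "Apache_peftest"},
]
# Top-level pareto, then drill into the PE_TEST bar:
print(pareto(records, "TestAreaName"))                           # [('PE_TEST', 3)]
print(pareto(records, "TestGroupName", TestAreaName="PE_TEST"))  # [('Apache_peftest', 2), ('other', 1)]
```

Because the drilldown field is just a parameter, any record property can serve as the next level, which is the point made above about UnitReport properties.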
APE PE_TEST Failures by TestGroup
Here you can see that the PE_TEST fails the most for TestGroupName: Apache_peftest.
SITE Retests at AC_SITE by Serial Number
In this chart you can see the number of SITE serial numbers which have been retested multiple times at AC_SITE. 2 serial numbers have been retested 11 times, and 5 serial numbers have been retested 10 times. From this pareto, you can drill into the bar showing the 2 serial numbers which have been tested 11 times (bar 09) to get the actual serial numbers and the Unit Reports, which give the exact details of the tests. See the next chart for these details.
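A retest pareto like this one reduces to counting tests per serial number at a station, then bucketing serials by that count. A minimal sketch (field names are assumptions; the records are synthetic):

```python
from collections import Counter

def retests_at(records, step):
    """Count how many times each serial number was tested at `step`, and
    build the histogram behind the pareto: test count -> number of serials."""
    per_serial = Counter(r["SerialNumber"] for r in records
                         if r["TestAreaName"] == step)
    histogram = Counter(per_serial.values())
    return per_serial, histogram

records = [{"SerialNumber": s, "TestAreaName": "AC_SITE"}
           for s in ["X", "X", "X", "Y", "Y", "Z"]]
per_serial, histogram = retests_at(records, "AC_SITE")
print(per_serial["X"])  # 3 tests for serial X
print(histogram[2])     # 1 serial (Y) was tested exactly twice
```

Drilling into a bar is then just listing the serials whose count matches that bucket.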
SITE Retests at AC_SITE by Serial Number
Here you can see the Unit Reports for the 2 serial numbers which had 11 retests, namely E7120668054634FS20070400050 and E7120668054634FS20070100173. From this view we can see that some of these retests are actually "repairs" (such as FS_Repair1) with a subsequent retest. This illustrates the benefit of collecting test and repair data separately and in a consistent manner. In the Verigy methodology, there is a field called "event" which specifies whether the record is for a repair or a test. The methodology was not adhered to in this case, and all of these records have the "event" of "test". Thus the records are mixed, and unless you notice that the station name indicates a "repair", the analysis will be misleading. TFV, as shown earlier in this report, clarifies visually what is going on with the test/repair history of these two devices.
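Until the "event" field is populated consistently, a downstream analysis can only approximate the test/repair split heuristically. A sketch of such a fallback (the field names and the station-name convention are assumptions based on the data described above):

```python
def effective_event(record):
    """Return the record's event if populated as 'repair'; otherwise fall
    back on a station-name heuristic (stations like FS_Repair1 are repair
    stations). The 'Repair' substring convention is an assumption."""
    if record.get("event") == "repair":
        return "repair"
    if "repair" in record.get("WorkStationName", "").lower():
        return "repair"
    return "test"

# Mislabeled repair record, recovered by the heuristic:
print(effective_event({"event": "test", "WorkStationName": "FS_Repair1"}))  # repair
# Genuine test record, left alone:
print(effective_event({"event": "test", "WorkStationName": "FS_SITE05"}))   # test
```

A heuristic like this cleans up the retest counts, but it is no substitute for entering the "event" field correctly at the source.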
Repair By Product
Customization Required
This chart shows the number of repairs by product. 303 of the repairs are performed on the SITE Module product, Part Number
E7120-66805. 134 repairs are performed on the PPS board, Part Number E7120-66513. You can drill into any one of these bins to
find the reason (defect code) the specific parts were repaired. Drilling into the SITE Module bin shows the defect codes found on
the next page.
Repair By Product -> Defect Code
This chart shows the defect codes for the 303 repaired SITE Modules (previous chart). 235 were repaired for Component
Defective, and 68 were NTF. You can drill into the 235 Component Defective bin to see which components were found defective
the most often (see next page).
Repair By Product -> Defect Code -> Failed Component
This chart shows the failed component part numbers for the 235 SITE Modules with “COMPONENT DEFECTIVE” defect code
(previous chart). 99 of the failed components were part number 66513, which is the PPS board. 90 of the failed components were
part number 66501 which is the APE board. You can drill into the 66513 bin to see the Repair Code for these 99 PPS boards. See
next slide.
Repair By Product -> Defect Code -> Failed Component -> Repair Action Code
Of the 99 failed PPS boards (66513) from the previous chart, you can see that 96 had Repair Code 9 and 3 had Repair Code 7.
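The four-level drilldown in this sequence of charts (product -> defect code -> failed component -> repair action) is successive filtering and counting. A minimal sketch (field names are assumptions modeled on the slides; the records and their counts are synthetic, not the 303/235/99 figures above):

```python
from collections import Counter

def count_by(records, field, **filters):
    """Count repair records by `field` after applying the drilldown
    selections made so far (each filter is an equality on a field)."""
    rows = (r for r in records
            if all(r.get(k) == v for k, v in filters.items()))
    return Counter(r[field] for r in rows)

repairs = [
    {"Product": "66805", "DefectCode": "COMPONENT DEFECTIVE",
     "FailedComponent": "66513", "RepairCode": "9"},
    {"Product": "66805", "DefectCode": "COMPONENT DEFECTIVE",
     "FailedComponent": "66501", "RepairCode": "9"},
    {"Product": "66805", "DefectCode": "NTF",
     "FailedComponent": None, "RepairCode": None},
]
# Level 2: defect codes for one product; level 4: repair codes for one component.
print(count_by(repairs, "DefectCode", Product="66805"))
print(count_by(repairs, "RepairCode", Product="66805",
               DefectCode="COMPONENT DEFECTIVE", FailedComponent="66513"))
```

Each chart in the sequence adds one more filter and swaps the counted field, so a single function covers the whole chain.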
Parametric Data: PE-DCCalDCDiag Test, Test Number 527, raw data. Notice the 3 distributions, indicating 3 separate measurements included in a single TestNumber value (527).
Parametric Data: PE-DCCalDCDiag Test, Test Number 527, using Data Inclusion Limit filtering. Displaying the distribution of one measurement.
Parametric Data: PE-DCCalDCDiag Test, Test Number 215, using Data Inclusion Limit filtering and "What If" spec limits. Displaying the distribution of one measurement. Notice the distribution is within the spec limits but skewed toward the upper limit.