
A Historical Survey of Radiochemistry Laboratory Audit Findings

Bob Shannon
Contact: BobShannon91@earthlink.net

Presented June 27, 2006 at the 16th Annual RETS – REMP Workshop

2

Changes in Radioanalytical Capabilities and Needs

• Transition from Site labs to contract labs
• Increasingly complex SOWs
• Data processing enables complex data work-up and reporting
• Increasing concern about defensibility of data
• Rules, Rules, Rules - Raw Data and Rules, Rules, Rules ...

3

Integrated Contractor Purchasing Team Basic Ordering Agreement

Mid-1990s -- GAO pointed out a lack of coordination in oversight of analytical services

1999 - Formation of ICPT BOA - a national contract vehicle for DOE environmental lab services

4

Increased Efficiency and Quality through Standardized Requirements

• Single SOW with standardized requirements levels the playing field
• Site-specific SOWs address unique needs (... wants?)
• With fewer SOWs and deliverables, labs should be able to operate more efficiently and concentrate on quality

5

DOECAP

• DOE Consolidated Audit Program
• National team of technically qualified auditors supplied by DOE Sites
• Fewer audits save sites and labs time and money

6

The QSAS is NELAC … and more ...

7

QSAS

“Quality Systems for Analytical Services” (Revision 2.2 - November 2005)

• Single, integrated QA program for environmental analytical laboratories supporting DOE

• Step toward harmonizing analytical quality requirements across Federal agencies

8

Technical Basis for QSAS

• Chapter 5 of NELAC (2003)

• Gray text supplements, notes deviations from, and clarifies the NELAC Standard

• Compliant with DOE Order 414.1A - Quality Assurance

• The QSAS and audit checklists are available at: http://www.oro.doe.gov/DOECAP

9

An evaluation of historical findings and observations was conducted

Priority 1 Findings 9 (2%)

Priority 2 Findings 289 (50%)

Observations 276 (48%)

Total Findings and Observations 574 (100%)

10

Frequency of Findings / Observations

Findings and Observations by Functional Area:
• Not specific to area: 37%
• Gas Proportional Counting: 11%
• Alpha Spectroscopy: 10%
• Gamma Spectroscopy: 10%
• Liquid Scintillation: 8%
• Preparation: 8%
• Multiple Instruments: 7%
• KPA: 4%
• M&TE: 3%
• Alpha Scint. (Rn): 1%
• All others: 1%

11

Frequency of Findings / Observations

Findings and Observations by Issue:
• Calibration: 16%
• Method Technical: 14%
• Standards: 14%
• Other: 8%
• Quality Control Practices: 7%
• Calculations: 7%
• SOP: 7%
• Contamination Control: 6%
• Performance check (CCV): 5%
• Control Charts: 4%
• Method Compliance: 3%
• Documentation: 2%
• External Performance Testing: 2%
• Document control: 2%
• Review practices: 2%
• Background: 1%

Breakdown of "Other" (8%):
• Corrective action process: 1.4%
• Personnel: 1.0%
• RM Licensing: 0.9%
• Communication: 0.5%
• Quality Systems: 0.5%
• Capability: 0.3%
• Narrative: 0.3%
• Software QA: 0.3%
• Configuration control: 0.2%
• Miscellaneous: 2.3%

12

Findings and Observations by Deficiency Type:
• Established criteria inadequate: 14%
• Documentation inadequate: 10%
• Proceduralization inadequate: 10%
• Requirement not performed: 8%
• Criteria not established: 8%
• Technically inadequate: 6%
• DQOs not satisfied: 5%
• Review inadequate: 4%
• Frequency requirements not satisfied: 4%
• Corrective action not performed: 3%
• Implementation inadequate: 3%
• Cannot demonstrate control over process: 3%
• Standard expired: 2%
• Add tracer before prep: 1%
• No expiration date: 1%
• No second source: 1%
• QC and field samples treated differently: 1%
• Training inadequate: 1%
• Tracer contaminant check not performed: 0.9%
• Minimum count criteria not satisfied: 0.9%
• Configuration control inadequate: 0.7%
• Point source used for plateau: 0.7%
• Geometry mismatch: 0.5%
• Applied beyond calibrated range (quench or attenuation): 0.5%
• Wrong constant (2.71) used in MDA: 0.5% (see the note after this list)
• Others: 9%
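Note on the "wrong constant (2.71)" item: the 2.71 comes from Currie's detection limit, L_D = 2.71 + 4.65·√B for a paired blank counted for the same time as the sample at the 95% confidence level (2.71 = k² with k = 1.645). A common, generic formulation of the resulting MDA is sketched below; the symbols are chosen here for illustration, and individual methods define the denominator differently.

$$ \mathrm{MDA} \;=\; \frac{2.71 + 4.65\sqrt{B}}{\varepsilon \, Y \, t \, V} $$

where B is the background counts, ε the counting efficiency, Y the chemical yield, t the count time, and V the sample aliquot. Using a different constant in place of 2.71 (or dropping the term) is the deficiency counted above.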

13

General Calibration Practices (16%)

• Almost all labs perform technically adequate calibrations for efficiency, self-absorption, quench, energy and shape.

The most common weaknesses include:

• Processes not adequately proceduralized

– Definition of nuclide, geometry, activity, matrix
– Calibration standard preparation
– Counting procedures
– Calculations, documentation and review

• Standards issues
– Standards not traceable, expired, or not verified
– Standards use or prep not adequately documented
– Calibration source not independent of QC samples

14

General Calibration Practices (16%)

• Acceptance criteria for calibrations
– Criteria not established
– Curve fit acceptance criteria not documented
– Count precision requirements not met
– Frequency requirements not met

• Background determinations
– Frequency requirements not met
– Background count time shorter than associated samples (see the note below)
– Geometry or mount not representative of samples (e.g., LSC / filters)
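On the background count time item, a standard Poisson counting-statistics relation (symbols chosen here for illustration) shows why a short background count inflates uncertainty:

$$ u\!\left(R_{\text{net}}\right) \;=\; \sqrt{\frac{N_s}{t_s^{2}} + \frac{N_b}{t_b^{2}}} $$

where N_s and N_b are the gross sample and background counts and t_s and t_b the corresponding count times. A background count time shorter than the sample count time enlarges the background term, raising both the reported uncertainty and the detection limit.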

15

GPC Set-up / Calibration Practices

GPC Plateau Measurements / Discriminator Set-up
• Slope does not meet acceptance criteria (note that the slope at the HV set-point may approach or exceed 5%); see the sketch below
• Discriminators not set consistently between detectors
• Instrument not recalibrated when HV is reset
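A minimal sketch of the plateau slope check referenced above, assuming the common convention of percent change in count rate per 100 V; the count rates, voltages, and function name are illustrative, not taken from any DOECAP checklist.

```python
# Hypothetical sketch: GPC plateau slope expressed as percent change in
# count rate per 100 V between two points on the measured plateau.
def plateau_slope_pct_per_100v(v1, c1, v2, c2):
    """Percent change in count rate per 100 V between (v1, c1) and (v2, c2)."""
    return (c2 - c1) / c1 / (v2 - v1) * 100.0 * 100.0

# Illustrative count rates (cpm) bracketing a candidate HV set-point.
low_v, low_cpm = 1350.0, 9800.0
high_v, high_cpm = 1450.0, 10150.0

slope = plateau_slope_pct_per_100v(low_v, low_cpm, high_v, high_cpm)
print(f"Plateau slope: {slope:.1f}% per 100 V")  # ~3.6% per 100 V for these numbers
```

A set-point chosen near the knee of the plateau rather than on its flat region is one way the measured slope can approach or exceed a 5% acceptance criterion.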

16

GPC Calibration Practices

GPC Performance Checks
• Discrete alpha and beta sources not used for daily checks -- track major & minor channel response (these support both efficiency and crosstalk calibrations)
• Limits do not reflect conditions at point of calibration (a rolling mean does not reflect conditions prior to the first point averaged)

17

GPC Calibration Practices

• GPC efficiency calibrations
– Choice of reference nuclide inappropriate
– 230Th and 137Cs or 90Sr/90Y are the required reference nuclides for Gross Alpha and Beta drinking water compliance testing
– Other nuclides (non-DW) per application - consider the technical basis; document in raw data and case narrative

• Standards not representative of sample (range of masses, chemical composition, source geometry and configuration)

• Self-absorption curve does not span range of masses for analysis

18

GPC Calibration Practices

• Self-absorption and crosstalk calibrations not performed using reference nuclide

• Crosstalk calibrations not performed (required for Gross Alpha / Beta); see the sketch below
• Crosstalk calibration does not span self-absorption curve mass range
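For the crosstalk items above, a minimal sketch of a simultaneous gross alpha/beta correction, assuming a two-channel proportional counter whose efficiencies and crosstalk fractions come from the lab's own calibrations at the relevant sample mass; all numbers below are made up.

```python
# Hypothetical sketch of a gross alpha/beta crosstalk correction for GPC data.
import numpy as np

# Observed net count rates (cpm) in the alpha and beta channels.
obs = np.array([12.0, 85.0])

# Response matrix: rows = observed channel, columns = true emitter.
response = np.array([
    [0.30, 0.01],   # alpha detection efficiency, beta-into-alpha crosstalk
    [0.05, 0.42],   # alpha-into-beta crosstalk, beta detection efficiency
])

# Solve the 2x2 system for the true alpha and beta disintegration rates (dpm).
true_alpha_dpm, true_beta_dpm = np.linalg.solve(response, obs)
print(f"alpha: {true_alpha_dpm:.1f} dpm, beta: {true_beta_dpm:.1f} dpm")
```

Because the efficiencies and crosstalk fractions both change with sample mass, the crosstalk calibration needs to cover the same mass range as the self-absorption curve; a calibration that does not is the gap the finding describes.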

19

Technical Issues (14%)

• Most technical practices at most labs are adequate to provide acceptable data

• Internal and External QC results, and experience with reported data, reveal these problems

• Technical issues tend to reflect staff expertise and the level of scrutiny to which the lab’s data is subjected

20

Technical Issues (Examples)

• Gamma Spectroscopy
– Software not configured properly for intended purpose
– Software parameter set-up not proceduralized / documented
– Gamma spec library not based on technically defensible assumptions (e.g., U-238 determined in a water sample by measurement of Th-234 although secular equilibrium cannot be defended; see the note below)
– Sample geometry does not conform to calibration geometry

• Other examples
– Digestion processes inadequate
– LSC background not representative of sample
– Method validation not technically adequate or documented
– Reagent preparation instructions inaccurate
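On the Th-234 / U-238 example above: Th-234 has a half-life of about 24.1 days, so after any chemical separation it grows back toward secular equilibrium with U-238 only over several half-lives, following

$$ A_{\text{Th-234}}(t) \;=\; A_{\text{U-238}}\left(1 - e^{-\lambda_{\text{Th-234}}\, t}\right) $$

which reaches roughly 94% of equilibrium only after about four half-lives (≈ 96 days). For a treated or recently processed water sample, the equilibrium assumption therefore has to be demonstrated, not assumed.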

21

Standard Reference Materials (14%)

• Standards preparation documentation inaccurate or ambiguous

• Standards use ambiguously documented
• Standards not NIST traceable (or ASTM C1128)
• Standards verification requirements not met
– Annual reverification not performed
– Criteria not formalized
– Tracers not tested for contaminants (e.g., Am-241 in Am-243)
– Independent (second) source not used for verification

• Expiration date issues

22

Quality Control Practices (7%)

• Historical performance (control charts) does not support QC limits applied (see the sketch below)

• QC samples not subjected to same process as samples – esp. blanks!

• QC sample evaluation acceptance criteria ambiguous or not defined
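A minimal sketch of what "historical performance supporting the QC limits" can mean in practice: deriving warning and control limits from the lab's own control chart data rather than applying fixed defaults. The recovery values and variable names below are illustrative only.

```python
# Hypothetical sketch: QC acceptance limits derived from historical control
# chart data (e.g., LCS percent recoveries).
import statistics

historical_recoveries = [98.2, 101.5, 96.8, 103.1, 99.4, 100.7, 97.5,
                         102.3, 98.9, 101.0, 99.8, 100.2]  # percent

mean = statistics.mean(historical_recoveries)
s = statistics.stdev(historical_recoveries)

warning_limits = (mean - 2 * s, mean + 2 * s)   # ~95% of points expected inside
control_limits = (mean - 3 * s, mean + 3 * s)   # ~99.7% of points expected inside

print(f"mean = {mean:.1f}%, s = {s:.1f}%")
print(f"warning limits: {warning_limits[0]:.1f}% to {warning_limits[1]:.1f}%")
print(f"control limits: {control_limits[0]:.1f}% to {control_limits[1]:.1f}%")
```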

23

Quality Control Practices (7%)

• PT failure without corrective action
• Chemical yield practices inadequate
– Yield monitor not added early in process (prior to digestion / separation)
– Acceptance criteria not established, proceduralized or implemented
– Overcorrection for yields >> 100% (see the note below)
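On the yield overcorrection item above, a generic yield-corrected activity relation (symbols chosen here for illustration) makes the issue concrete:

$$ A \;=\; \frac{R_{\text{net}}}{\varepsilon \, Y \, m}, \qquad Y \;=\; \frac{\text{tracer or carrier recovered}}{\text{tracer or carrier added}} $$

A measured yield well above 100% is nonphysical (for example, tracer contamination or a standard error); dividing by Y = 1.3 would bias the result low by roughly 23% rather than correct it, so such yields call for investigation instead of application.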

24

Calculations (7%)

• Inadequate software V&V
• Inadequate Configuration Control
• Spreadsheets not protected
• Instrument settings not documented or checked
• Calculations incorrectly implemented
• CSU not reported, not proceduralized, or technical basis not documented / not adequate (see the sketch below)
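A minimal sketch of a combined standard uncertainty (CSU) calculation, assuming a generic counting result A = R_net / (efficiency × yield × aliquot) with uncorrelated inputs combined in quadrature; the values and names are illustrative and not a reproduction of any lab's algorithm.

```python
# Hypothetical sketch: combined standard uncertainty (CSU) for a simple
# radiochemical result, propagating counting statistics and calibration terms.
import math

gross_counts, sample_time = 5200.0, 60.0     # counts, minutes
bkg_counts, bkg_time = 900.0, 120.0          # counts, minutes
eff, u_eff = 0.32, 0.01                      # counting efficiency
yld, u_yld = 0.85, 0.03                      # chemical yield
aliquot, u_aliquot = 0.500, 0.002            # liters

net_rate = gross_counts / sample_time - bkg_counts / bkg_time         # cpm
u_net_rate = math.sqrt(gross_counts / sample_time**2 +
                       bkg_counts / bkg_time**2)                      # cpm

activity = net_rate / (eff * yld * aliquot)                           # dpm/L

# Relative standard uncertainties combined in quadrature (uncorrelated inputs).
rel_u = math.sqrt((u_net_rate / net_rate)**2 + (u_eff / eff)**2 +
                  (u_yld / yld)**2 + (u_aliquot / aliquot)**2)
csu = activity * rel_u

print(f"activity = {activity:.1f} +/- {csu:.1f} dpm/L (1-sigma CSU)")
```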

25

Standard Operating Procedures (7%)

• Procedures don’t support practice
• Details inadequate or ambiguous
• Uncontrolled SOPs in use
• No SOP for method (esp. for HTDs)
• Deviations from regulatory reference methods not documented

26

Contamination Control (6%)

• Program for contamination control not proceduralized or inadequate
– Instrument background not checked after high-activity samples
– No corrective action after background check failure
– Detector cleaned prior to background checks (prevents identification of cross-contamination)
– Inadequate separation of work areas for environmental versus elevated-activity operations
– Glassware cleaning practices inadequate

27

Conclusions

• Technical quality has improved over the lifetime of DOECAP
• Labs are less successful at maintaining pace with increased QA and documentation requirements
• Labs and their customers must work to maintain both the technical and the legal defensibility of their data - besides, that is nothing more than good science
• Harmonizing requirements around quality standards (e.g., NELAC) simplifies requirements for labs and leads to more effective operation, lower costs, and better data

Caveat Emptor!!!


29

Acknowledgments

DOECAP Rad Labs
DOECAP Auditors
DOECAP Management Team

Graphic Design Support: Maya Shannon

Correspondence:
Bob Shannon
QRS, LLC
BobShannon91@earthlink.net
2349 Bellaire St
Denver, CO 80207
303-432-1137