1
1© M G Gibson 2010RSS Destructive Testing MSA1 ISO/TS 16949:2009(E) and AIAG MSA 4 th edn. (2010) Martin Gibson CStat, CSci, MSc, MBB AQUIST Consulting gg1000@waitrose.com
2
Making sense of MSA
Do you know how accurate and precise your measurement and test equipment are?
Do you suspect that good work is sometimes condemned as bad simply because of uncertainty in the measurement system? Is bad work ever released as good?
Do you know the cost of non-capable measurement systems?
Do you realise how important it is to understand measurement system uncertainty?
Does your auditor share your understanding of measurement systems?
What can you do about it?
3
ISO/TS 16949:2009(E) 7.6.1 Measurement System Analysis
Statistical studies shall be conducted to analyse the variation present in the results of each type of measuring and test equipment system.
... applies to measurement systems in the control plan.
... analytical methods and acceptance criteria used shall conform to those in customer reference manuals on MSA. Other analytical methods and acceptance criteria may be used if approved by the customer.
Questions:
1. What is the operational definition of "statistical studies"?
2. Do organisations, auditors and quality managers understand statistical studies?
3. Why do auditors ask, "Can you show me GR&R studies for each type of measuring and test equipment system referenced in the control plan?"
4
ISO/TS 16949 Scheme Update, IF SMMT Webinar, 5 Nov. 2013
Common problems found in ISO/TS 16949 audits – calibration and MSA (7.6 and 7.6.1):
Definition of laboratory scope
Control of external laboratories
Traceability to national or international standards
MSA not done for all types of measuring systems
MSA only considering gauge R&R
Question:
1. Why is MSA regarded as GR&R?
5
Ford Motor Company MSA requirements (2009), 4.35 (ISO/TS 16949 cl. 7.6.1)
All gauges used for checking Ford components/parts per the control plan shall have a gauge R&R performed in accordance with the appropriate methods described by the latest AIAG MSA to determine measurement capability.
Variable gauge studies should utilize 10 parts, 3 operators and 3 trials.
Attribute gauge studies should utilize 50 parts, 3 operators and 3 trials.
Questions:
1. Are some customers leading the thinking?
2. Why is this limited to products?
3. What are your customer expectations?
6
Measurement System Variation
[Diagram: components of measurement system variation – accuracy (bias, stability, linearity, calibration) and precision (repeatability, reproducibility, gauge R&R)]
Observed Variation = Process Variation + Measurement System Variation (see the sketch below)
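The "Observed = Process + Measurement" decomposition holds for variances, not for standard deviations. A minimal sketch of that point, using illustrative numbers that are not from the presentation:

```python
import math

# Illustrative standard deviations (assumed values, not from the slides)
sigma_process = 0.40      # true part-to-part standard deviation
sigma_measurement = 0.30  # measurement system standard deviation

# Variances add; standard deviations do not
var_observed = sigma_process**2 + sigma_measurement**2
sigma_observed = math.sqrt(var_observed)

print(f"observed sigma = {sigma_observed:.3f}")  # 0.500, not 0.40 + 0.30 = 0.70
```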
7
AIAG MSA 4th edn. (2010)
Accuracy, bias, stability, linearity, precision, repeatability, reproducibility, GR&R
Attribute, variable and non-replicable data considered
Variables GR&R study: 10 parts, 3 operators, 3 measurements; parts chosen from 80% of tolerance
Destructive testing requires 90 parts from a homogeneous batch
Three analytical methods (method 2 is sketched below):
1. Range – basic analysis, no separate estimates of repeatability and reproducibility
2. Average & Range – provides estimates of repeatability and reproducibility
3. ANOVA – preferred; estimates parts, appraisers, part*operator interaction, and replication error due to the gauge
Question:
1. Do organisations, auditors and quality managers understand MSA?
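As a hedged illustration of the Average & Range method for the 10-part, 3-operator, 3-trial layout above, the sketch below uses the commonly tabulated K1/K2/K3 constants for that layout and simulated readings as a placeholder for real data; it is a sketch of the usual calculation sequence, not the AIAG worked example.

```python
import numpy as np

# data[operator, part, trial] - placeholder readings, shape (3 operators, 10 parts, 3 trials)
rng = np.random.default_rng(1)
true_part = rng.normal(0, 1.0, size=10)
data = true_part[None, :, None] + rng.normal(0, 0.15, size=(3, 10, 3))

n_ops, n_parts, n_trials = data.shape

# Commonly tabulated constants for 3 trials, 3 appraisers, 10 parts (assumed here)
K1, K2, K3 = 0.5908, 0.5231, 0.3146

# Repeatability (equipment variation): average within-part range times K1
ranges = data.max(axis=2) - data.min(axis=2)     # range of the 3 trials, per operator/part
EV = ranges.mean() * K1

# Reproducibility (appraiser variation): range of operator averages, adjusted for EV
op_means = data.mean(axis=(1, 2))
x_diff = op_means.max() - op_means.min()
AV = np.sqrt(max((x_diff * K2) ** 2 - EV**2 / (n_parts * n_trials), 0.0))

GRR = np.sqrt(EV**2 + AV**2)

# Part variation and total variation
part_means = data.mean(axis=(0, 2))
PV = (part_means.max() - part_means.min()) * K3
TV = np.sqrt(GRR**2 + PV**2)

print(f"EV={EV:.3f}  AV={AV:.3f}  GRR={GRR:.3f}  PV={PV:.3f}  %GRR={100*GRR/TV:.1f}%")
```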
8
AIAG ANOVA Models: Crossed vs. Nested
Crossed: Y_{ijk} = μ + Operator_i + Part_j + (Operator × Part)_{ij} + ε_{(ij)k}
Nested: Y_{ijk} = μ + Operator_i + Part_{j(i)} + ε_{(ij)k}
Crossed vs. nested? See Barrentine; Moen, Nolan & Provost; Bower; Burdick; Skrivanek.
Fixed vs. mixed effects models? Software?
Minitab V16+ includes fixed and mixed effects models and enhanced models; the pooled standard deviation approach is not included.
SPC for Excel – fixed effects. Other software packages?
Question: Do organisations, auditors and quality managers understand ANOVA? (A sketch of the crossed analysis follows.)
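A minimal sketch of the crossed random-effects analysis: fit the ANOVA table with statsmodels, then back-solve the variance components from the standard expected-mean-square relationships for a balanced p-part × o-operator × r-trial study. The simulated data and variable names are assumptions for illustration, not the AIAG worked example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulate a balanced crossed study: p parts x o operators x r replicate measurements
p, o, r = 10, 3, 3
rng = np.random.default_rng(7)
part_eff = rng.normal(0, 1.0, p)
oper_eff = rng.normal(0, 0.10, o)
rows = [
    {"part": i, "operator": j,
     "y": 10 + part_eff[i] + oper_eff[j] + rng.normal(0, 0.15)}
    for i in range(p) for j in range(o) for _ in range(r)
]
df = pd.DataFrame(rows)

# Crossed model: every operator measures every part
model = smf.ols("y ~ C(part) + C(operator) + C(part):C(operator)", data=df).fit()
aov = anova_lm(model, typ=2)
ms = aov["sum_sq"] / aov["df"]            # mean squares

# Variance components from expected mean squares (balanced design)
var_repeat = ms["Residual"]                                               # gauge / repeatability
var_inter  = max((ms["C(part):C(operator)"] - ms["Residual"]) / r, 0.0)   # part*operator
var_oper   = max((ms["C(operator)"] - ms["C(part):C(operator)"]) / (p * r), 0.0)
var_part   = max((ms["C(part)"] - ms["C(part):C(operator)"]) / (o * r), 0.0)

var_grr = var_repeat + var_oper + var_inter   # reproducibility = operator + interaction
print(f"repeatability={var_repeat:.4f}  reproducibility={var_oper + var_inter:.4f}  "
      f"GRR={var_grr:.4f}  part={var_part:.4f}")
```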
9
GR&R Variables Data Acceptance Criteria (a computation sketch follows the list)
% Contribution – measurement system variation as a percentage of total observed process variation, using variances (additive): < 1% good; 2–9% acceptable; > 9% unacceptable
% Study Variation – measurement system standard deviation as a percentage of total observed process standard deviation (not additive): < 10% good; 11–30% acceptable; > 30% unacceptable
% Tolerance – measurement error as a percentage of tolerance: < 10% good; 11–30% acceptable; > 30% unacceptable
Number of Distinct Categories (ndc) – measures the resolution of the scale: > 10 good; 5–10 acceptable; < 5 unacceptable
Question: Do organisations, auditors and quality managers understand the metrics?
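To make the four metrics concrete, here is a short sketch computing them from variance components. The component values and tolerance are assumed figures; the 6-sigma study-variation spread and the 1.41 (≈ √2) factor in ndc follow the commonly quoted AIAG formulas.

```python
import math

# Assumed variance components (e.g. from an ANOVA GR&R study) and an assumed tolerance
var_grr, var_part = 0.09, 0.16
var_total = var_grr + var_part
sd_grr, sd_part, sd_total = map(math.sqrt, (var_grr, var_part, var_total))
tolerance = 6.0   # USL - LSL

pct_contribution = 100 * var_grr / var_total          # variances: additive
pct_study_var    = 100 * sd_grr / sd_total            # standard deviations: not additive
pct_tolerance    = 100 * (6 * sd_grr) / tolerance     # 6-sigma measurement spread vs tolerance
ndc              = int(1.41 * sd_part / sd_grr)       # truncated to an integer, per usual practice

print(f"{pct_contribution:.0f}%  {pct_study_var:.0f}%  {pct_tolerance:.0f}%  ndc={ndc}")
# roughly 36%, 60%, 30%, ndc=1 -> fails %Contribution, %Study Variation and ndc
```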
10
Non-replicable GR&R case study (Anon, 2002)
Ensure that all conditions surrounding the measurement and testing environment are defined, standardised and controlled:
appraisers should be similarly qualified and trained
lighting should be adequate and consistently controlled
work instructions should be detailed and operationally defined
environmental conditions should be controlled to an adequate degree
equipment should be properly maintained and calibrated, failure modes understood, etc.
11
Non-replicable GR&R case study (Anon, 2002), continued
If the overall process appears to be stable and capable, and all the surrounding prerequisites have been met, it may not make sense to spend the effort on a non-replicable study, since the overall capability already includes measurement error: if the total product variation and location are OK, the measurement system may be considered acceptable.
Ironically, a high Cp/Cpk can give a poor ndc (see the worked illustration below).
Question:
1. Do organisations, auditors and quality managers understand this concept?
AIAG FAQ response: 'If your process is stable and capable, the spread of this acceptable process distribution includes your measurement error. There may be no need to study your measurement error from a purely "acceptability" viewpoint.'
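A short worked illustration of the irony, with assumed numbers: when the process is very capable, part-to-part variation is small relative to the tolerance, so even a gauge that looks fine against the tolerance cannot distinguish the parts from one another.

```python
import math

# Illustrative numbers (assumed): a very capable process measured with a mediocre gauge
tolerance = 10.0
sd_part, sd_grr = 0.4, 0.3
sd_total = math.sqrt(sd_part**2 + sd_grr**2)      # 0.5

cp      = tolerance / (6 * sd_total)              # ~3.3 -> process looks excellent
pct_tol = 100 * (6 * sd_grr) / tolerance          # 18%  -> gauge acceptable vs tolerance
ndc     = int(1.41 * sd_part / sd_grr)            # 1    -> gauge cannot resolve part-to-part

print(f"Cp={cp:.2f}  %Tolerance={pct_tol:.0f}%  ndc={ndc}")
```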
12
Questions for Making Sense of MSA
What is the operational definition of statistical studies?
Do organisations, auditors and quality managers understand statistical studies?
Why do auditors ask for GR&R studies?
Why is MSA regarded as GR&R?
Are (some) customers leading the thinking?
Why is MSA limited to products?
Do you know your customer expectations?
Do organisations, auditors and quality managers understand MSA, ANOVA, crossed vs. nested designs, fixed vs. mixed models, the metrics, and why a high Cp/Cpk can give a low ndc?
Is MSA seen just as a QMS requirement or as a true part of continuous improvement?
13
References
AIAG, Measurement System Analysis, 4th edn. (2010)
Anon., Non-replicable GR&R case study (circa 2002)
Barrentine, Concepts for R&R Studies, 2nd edn., ASQ (2003)
Bower, A Comment on MSA with Destructive Testing (2004); see also keithbower.com
Gorman & Bower, 'Measurement Systems Analysis and Destructive Testing', ASQ Six Sigma Forum Magazine, Vol. 1, No. 4 (August 2002)
Burdick, Borror & Montgomery, 'A review of methods for measurement systems capability analysis', Journal of Quality Technology, 35(4): 342-354 (2003)
Burdick, Borror & Montgomery, Design and Analysis of Gauge R&R Studies, SIAM/ASA (2005)
Moen, Nolan & Provost, 'Using a nested design for quantifying a destructive test', in Improving Quality Through Planned Experimentation, 1st edn., McGraw-Hill (1991)
Skrivanek, 'How to conduct an MSA when the part is destroyed during measurement', moresteam.com/whitepapers/nested-gage-rr.pdf
14
Example: crossed vs. nested (5 × 2 × 2 for brevity)
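The transcript does not include the table from this slide, so the sketch below reconstructs one plausible 5 × 2 × 2 illustration of the structural difference: in the crossed layout both operators measure the same five parts twice, while in the nested (destructive) layout each operator consumes pieces cut only for them, so parts/batches are nested within operators and nothing is measured twice. The batch labels and layout details are assumptions for illustration.

```python
import pandas as pd

# Crossed (non-destructive): both operators measure the SAME 5 parts, 2 trials each
crossed = pd.DataFrame(
    [(op, part, trial) for op in ("A", "B") for part in range(1, 6) for trial in (1, 2)],
    columns=["operator", "part", "trial"],
)

# Nested (destructive): each operator tests their OWN pieces, cut from homogeneous
# batches, so 'batch' is nested within operator and no piece is measured twice
nested = pd.DataFrame(
    [(op, f"{op}{batch}", piece)
     for op in ("A", "B") for batch in range(1, 6) for piece in (1, 2)],
    columns=["operator", "batch", "piece"],
)

print(crossed.groupby("operator")["part"].unique())   # A and B share parts 1-5
print(nested.groupby("operator")["batch"].unique())   # A1-A5 vs B1-B5: no overlap
```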