1
Chapter 36 Quality Engineering (Review) EIN 3390 Manufacturing Processes Spring, 2011
2
36.1 Introduction Objective of quality engineering: the systematic reduction of variability, as shown in Figure 36-1. Variability is measured by sigma, the standard deviation, which decreases as variability is reduced. Variation can be reduced by applying statistical techniques such as multiple-variable analysis, ANOVA (analysis of variance), designed experiments, and Taguchi methods.
3
36.1 Introduction QE History:
- Acceptance sampling
- Statistical Process Control (SPC)
- Companywide Quality Control (CWQC) and Total Quality Control (TQC)
- Six Sigma, DOE (Design of Experiments), and Taguchi methods
- Lean Manufacturing: "lean" is a production practice that treats the expenditure of resources on any goal other than the creation of value for the end customer as wasteful, and thus a target for elimination
- Poka-yoke: a concept developed by the Japanese manufacturing engineer Shigeo Shingo; poka-yoke (pronounced "poh-kah yoh-kay") means to avoid (yokeru) inadvertent errors (poka)
4
Process Control Methods FIGURE 36-1 Over many years, many techniques have been used to reduce the variability in products and processes.
5
36.1 Introduction In a manufacturing process, there are two groups of causes of variation: ◦Chance causes – produce random variations and are an inherent, stable source of variation. ◦Assignable causes – causes that can be detected and eliminated to help improve the process.
6
36.1 Introduction The behavior of a manufacturing process is determined by measuring the output of the process. In quality control, the process is examined to determine whether or not the product conforms to the design's specifications, usually the nominal size and tolerance.
7
36.1 Introduction Accuracy is reflected in your aim (the average of all your shots; see Figure 36-2). Precision reflects the repeatability of the process. A process capability (PC) study quantifies the inherent accuracy and precision. Objectives: - root out problems that can cause defective products during production, and - design the process to prevent the problem.
8
Accuracy vs. Precision FIGURE 36-2 The concepts of accuracy (aim) and precision (repeatability) are shown in the four target outcomes. Accuracy refers to the ability of the process to hit the true value (nominal) on the average, while precision is a measure of the inherent variability of the process.
10
36.2 Determining Process Capability The nature of a process refers to both its variability (or inherent uniformity) and its accuracy, or the aim of the process. Examples of assignable causes of variation in a process: multiple machines producing the same components, operator blunders, defective materials, and progressive wear in tools.
11
36.2 Determining Process Capability Sources of inherent variability in the process: variation in material properties, operator variability, and vibration and chatter. These kinds of variations usually display a random nature and often cannot be eliminated. In quality control terms, these variations are referred to as chance causes.
12
36.2 Making PC Studies by Traditional Methods The objective of a PC study is to determine the inherent nature of the process as compared to the desired specifications. The output of the process must be examined under normal conditions, with the inputs (e.g., materials, setups, cycle times, temperature, pressure, and operators) fixed or standardized. The process is allowed to run without tinkering or adjusting, while the output is documented, including time, source, and order of production.
13
36.2 Making PC Studies by Traditional Methods A histogram is a frequency distribution. The histogram shows the raw data and the desired value, along with the upper specification limit (USL) and lower specification limit (LSL). A run chart shows the same data, but plotted against time. The statistical data are used to estimate the mean and standard deviation of the distribution.
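As a minimal Python sketch (the measurement values are hypothetical, not from the text), the calculations behind a PC study look like this: estimate the mean and standard deviation, then tally a frequency distribution for the histogram.

import statistics

# Hypothetical measurements (e.g., a shaft diameter in mm) recorded in production order.
measurements = [50.1, 49.8, 50.3, 50.0, 49.7, 50.4, 50.2, 49.9, 50.1, 50.0]

mean = statistics.mean(measurements)      # estimate of the process mean
sigma = statistics.stdev(measurements)    # sample standard deviation (n - 1 divisor)
print(f"estimated mean  = {mean:.3f}")
print(f"estimated sigma = {sigma:.3f}")

# Frequency distribution (histogram counts) over equal-width class intervals.
k = 5
lo, hi = min(measurements), max(measurements)
width = (hi - lo) / k
for i in range(k):
    lower, upper = lo + i * width, lo + (i + 1) * width
    count = sum(lower <= x < upper or (i == k - 1 and x == hi) for x in measurements)
    print(f"[{lower:.2f}, {upper:.2f}): {count}")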
14
Process Capability FIGURE 36-3 The process capability study compares the part as made by the manufacturing process to the specifications called for by the designer. Measurements from the parts are collected for run charts and for histograms for analysis (see Figure 36-4).
15
Example of Process Control FIGURE 36-4 Example of calculations to obtain estimates of the mean (m) and standard deviation (s) of a process
16
36.2 Making PC Studies by Traditional Methods ±3σ defines the natural capability limits of the process, assuming the process is approximately normally distributed. A sample is of a specified, limited size and is drawn from the population. The population is the larger source of items, which can include all items the process will produce under specified conditions. Figure 36-5 shows a typical normal curve, with the areas under the curve defined by the standard deviation. Figure 36-6 shows other distributions.
17
Normal Distribution FIGURE 36-5 The normal or bell-shaped curve with the areas within ±1σ, ±2σ, and ±3σ for a normal distribution; 68.26% of the observations will fall within ±1σ of the mean, and 99.73% will fall within ±3σ of the mean.
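The percentages quoted in the figure can be checked with the standard normal model; a quick sketch using only the Python standard library (math.erf) is shown below.

import math

# P(|Z| <= k sigma) for a standard normal variable is erf(k / sqrt(2)).
for k in (1, 2, 3):
    coverage = math.erf(k / math.sqrt(2))
    print(f"within ±{k} sigma: {coverage * 100:.2f}%")
# Prints roughly 68.27%, 95.45%, and 99.73%, matching the figure
# (68.26% vs. 68.27% is only a rounding difference).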
18
36.2 Histograms A histogram is a representation of a frequency distribution that uses rectangles whose widths represent class intervals and whose heights are proportional to the corresponding frequencies. All the observations within an interval are considered to have the same value, the midpoint of the interval. A histogram is a picture that describes the variation in a process. A histogram is used to 1) determine the process capability, 2) compare the process with the specifications, 3) suggest the shape of the population, and 4) indicate discrepancies in the data. Disadvantages: 1) trends are not shown, and 2) the time order of the data is lost.
19
Mean vs. Nominal FIGURE 36-7 Histogram showing the output mean μ from the process versus nominal, and the tolerance specified by the designer versus the spread as measured by the standard deviation σ. Here nominal = 49.2, USL = 62, LSL = 38, μ = 50.2, σ = 2.
20
36.2 Run Chart or Diagram A run chart is a plot of a quality characteristic as a function of time. It provides some idea of general trends and the degree of variability. A run chart is very important at startup for identifying the basic nature of a process. Without this information, one may use an inappropriate tool in analyzing the data. For example, a histogram might hide tool wear if frequent tool changes and adjustments are made between groups of observations.
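A minimal run-chart sketch is shown below (the readings are hypothetical, and matplotlib is assumed to be available); plotting the same data in production order makes a tool-wear drift visible that a histogram alone would hide.

import matplotlib.pyplot as plt

# Hypothetical dimension drifting upward as the tool wears.
readings = [50.00, 50.02, 50.03, 50.05, 50.06, 50.08, 50.10, 50.11, 50.13, 50.15]

plt.plot(range(1, len(readings) + 1), readings, marker="o")
plt.axhline(50.0, linestyle="--", label="nominal")
plt.xlabel("Order of production")
plt.ylabel("Measured dimension")
plt.title("Run chart")
plt.legend()
plt.show()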
21
Example of a Run Chart FIGURE 36-8 An example of a run chart or graph, which can reveal trends in the process behavior not shown by the histogram.
22
36.2 Process Capability Indexes The most popular PC index indicates whether the process has the ability to meet specifications. The process capability index, Cp, is computed as follows: Cp = (tolerance spread) / (6σ) = (USL – LSL) / (6σ). A value of Cp >= 1.33 is considered good. For the example in Figure 36-7: Cp = (USL – LSL)/(6σ) = (62 – 38)/(6 x 2) = 2.
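The same Cp calculation, written out as a short Python sketch using the Figure 36-7 values:

def cp(usl: float, lsl: float, sigma: float) -> float:
    """Process capability index: tolerance spread over the 6-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

print(cp(usl=62, lsl=38, sigma=2))   # (62 - 38) / (6 * 2) = 2.0, well above 1.33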
23
36.2 Process Capability Indexes The process capability ratio, Cp, only looks at the variability or spread of the process (compared to the specifications) in terms of sigma. It does not take into account the location of the process mean, μ. Another process capability ratio, Cpk, is used for off-center processes: Cpk = min(Cpu, Cpl), where Cpu = (USL – μ)/(3σ) and Cpl = (μ – LSL)/(3σ).
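A sketch of the Cpk calculation, using the Figure 36-7 estimates (mean 50.2, sigma 2) for illustration:

def cpk(usl: float, lsl: float, mean: float, sigma: float) -> float:
    cpu = (usl - mean) / (3 * sigma)   # room between the mean and the upper spec limit
    cpl = (mean - lsl) / (3 * sigma)   # room between the mean and the lower spec limit
    return min(cpu, cpl)               # the worse (smaller) side governs

print(cpk(usl=62, lsl=38, mean=50.2, sigma=2))   # min(1.967, 2.033) = 1.967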
24
Output Shift FIGURE 36-9 The output from the process is shifting toward the USL, which changes the Cpk ratio but not the Cp ratio.
26
36.2 Process Capability Indexes In Figure 36-10, the following five cases are covered:
a) 6σ < USL – LSL, or Cp > 1
b) 6σ < USL – LSL, but the process has shifted
c) 6σ = USL – LSL, or Cp = 1
d) 6σ > USL – LSL, or Cp < 1
e) The mean and variability of the process have both changed.
If the process capability is on the order of 2/3 to 3/4 of the design tolerance, there is a high probability that the process will produce all good parts over a long time period.
27
FIGURE 36-10 Five different scenarios for a process output versus the designer's specifications for the nominal (50) and upper and lower specification limits of 65 and 38, respectively.
29
36.3 Inspection to Control Quality Inspection is the function that controls quality, either manually or automatically. How much should be inspected: 1. Inspect every item being made. 2. Sample. 3. None: assume that everything is acceptable, or the product is inspected by the customer, who will exchange it if it is defective.
30
36.3.1 Statistical Process Control (SPC) Sampling requires statistical techniques for making decisions about the acceptability of the whole based on the sample's quality. This is known as statistical process control (SPC). The most widely used basic SPC technique is the control chart.
31
Control charts for variables are used to monitor the output of a process by sampling, measuring selected quality characteristics, plotting the sample data on the chart, and then making decisions about the performance of the process.
32
Figure 36-13 shows the basic structure of two charts commonly used for variable types of measurements. The X-bar chart tracks the aim (accuracy) of the process. The R chart (or σ chart) tracks the precision or variability of the process. Usually, only the X-bar chart and R chart are used unless the sample size is large, in which case the σ chart is used in place of the R chart.
33
36.3 Inspection to Control Quality
34
Quality Calculations FIGURE 36-13 Quality control chart calculations. The charts plot X-bar and R values over time. The constants used to calculate the UCL and LCL values for the X-bar and R charts are based on 3 standard deviations: R = X_high – X_low, X-bar = (sum of x)/n, sigma is estimated as R-bar/d2, and A2 = 3/[d2*SQRT(n)], where n is the sample size.
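A minimal sketch of the X-bar and R chart limit calculations in Figure 36-13 is given below. The subgroup data are hypothetical, and the constants are the standard control-chart table values for samples of n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114).

samples = [  # hypothetical subgroups of n = 5 measurements
    [50.1, 49.9, 50.0, 50.2, 49.8],
    [50.0, 50.3, 49.7, 50.1, 50.0],
    [49.9, 50.1, 50.2, 49.8, 50.0],
]
A2, D3, D4 = 0.577, 0.0, 2.114

xbars = [sum(s) / len(s) for s in samples]     # subgroup means
ranges = [max(s) - min(s) for s in samples]    # subgroup ranges R = X_high - X_low

xbarbar = sum(xbars) / len(xbars)              # grand mean (center line of the X-bar chart)
rbar = sum(ranges) / len(ranges)               # average range (center line of the R chart)

ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

print(f"X-bar chart: CL={xbarbar:.3f}  UCL={ucl_x:.3f}  LCL={lcl_x:.3f}")
print(f"R chart:     CL={rbar:.3f}  UCL={ucl_r:.3f}  LCL={lcl_r:.3f}")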
35
Samples are drawn over time. Because sample statistics tend to be normally distributed about their own mean, the X-bar values are normally distributed about the grand mean, and the R values are normally distributed about R-bar. Quality control charts are widely used as aids in maintaining quality and detecting trends in quality variation before defective parts are actually produced.
36
When sampling inspection is used, the typical sample sizes are from 3 to about 12. Figure 36-14 shows an example of X-bar and R charts for measuring the dimension of a gap on a part, with 25 samples of size 5 taken over 6 days.
37
FIGURE 36-14 Example of X-bar and R charts and the data set of 25 samples [k = 25 samples of size n = 5]. (Source: Continuing Process Control and Process Capability Improvement, Statistical Methods Office, Ford Motor Co., 1985.)
38
FIGURE 36-14 (continued) Using the data from the 25 samples: Cp = ? Cpk = ?
39
Errors in Textbook
40
After control charts have been established, they act as a control indicator for the process. If the process is operating under chance-cause conditions, the data will appear random (no trends or patterns). If X-bar or R values fall outside the control limits, or if nonrandom trends occur (such as 7 points on one side of the center line or 6 successive increasing or decreasing points), an assignable cause or change may have occurred, and action should be taken to correct the problem.
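A sketch of these out-of-control checks on hypothetical chart points (the thresholds follow the rules quoted above; details of run rules vary between references):

def out_of_control(points, center, ucl, lcl):
    reasons = []
    # Rule 1: any point beyond the control limits.
    if any(p > ucl or p < lcl for p in points):
        reasons.append("point outside control limits")
    # Rule 2: 7 consecutive points on one side of the center line.
    for i in range(len(points) - 6):
        window = points[i:i + 7]
        if all(p > center for p in window) or all(p < center for p in window):
            reasons.append("7 points on one side of the center line")
            break
    # Rule 3: 6 successive increasing (or decreasing) points.
    for i in range(len(points) - 5):
        window = points[i:i + 6]
        diffs = [b - a for a, b in zip(window, window[1:])]
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            reasons.append("6 successive increasing or decreasing points")
            break
    return reasons

print(out_of_control([50.0, 50.1, 50.2, 50.3, 50.4, 50.5, 50.6],
                     center=50.0, ucl=51.0, lcl=49.0))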
41
36.4 Process Capability Determination from Control Chart Data
42
After the process is determined to be "under control," the data can be used to estimate the process capability parameters. A sample size of 5 was used in the example, so n = 5 (Figure 36-14). 25 groups of samples were drawn from the process, so k = 25. For each sample, the sample mean X-bar and the sample range R are computed. For large samples, n > 12, the standard deviation of each sample should be computed rather than the range.
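A sketch of the capability calculation from in-control chart data follows: sigma is estimated from the average range as R-bar/d2 (d2 = 2.326 for subgroups of n = 5). The numerical values here are hypothetical, not the Ford data set, but the same procedure answers the Cp = ?, Cpk = ? question posed with Figure 36-14.

rbar = 0.5            # average range from the control chart (hypothetical)
d2 = 2.326            # table constant for subgroups of n = 5
xbarbar = 50.2        # grand mean of the subgroup means (hypothetical)
usl, lsl = 52, 48     # design specification limits (hypothetical)

sigma_hat = rbar / d2
cp = (usl - lsl) / (6 * sigma_hat)
cpk = min((usl - xbarbar) / (3 * sigma_hat), (xbarbar - lsl) / (3 * sigma_hat))
print(f"sigma ~ {sigma_hat:.3f}, Cp ~ {cp:.2f}, Cpk ~ {cpk:.2f}")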
43
36.5 Determining Causes For Problems in Quality The fishbone diagram, developed by Kaoru Ishikawa in 1943, is used in conjunction with control charts to root out the causes of problems. The fishbone diagram is also known as a cause-and-effect diagram. Fishbone lines are drawn from the main line; these lines organize the main factors. Branching from each of these factors are even more detailed factors.
44
36.5 Determining Causes For Problems in Quality Four "M"s are often used in a fishbone diagram: Men, Machines, Materials, and Methods. CEDAC – a cause-and-effect diagram with the addition of cards. The effect is often tracked with a control chart. The possible causes of defects or problems are written on cards and inserted in slots on the diagram.
45
Fishbone Diagram FIGURE 36-15 Example of a fishbone diagram using a control chart to show effects.
47
36.5.1 Sampling Errors Two kinds of decision errors: Type I error (alpha error): the process is running perfectly, but the sample data indicate that something is wrong. Type II error (beta error): the process is not running perfectly and is making defective products, but the sample data do not indicate that anything is wrong.
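As a small illustration (a simulation of my own, not from the text), the Type I error rate for 3-sigma limits on the sample mean can be estimated directly: for an in-control process, how often does a sample mean fall outside the limits by chance alone?

import random

random.seed(0)
n, trials = 5, 100_000
mu, sigma = 50.0, 2.0
limit = 3 * sigma / n ** 0.5          # 3-sigma limits for the mean of n observations

false_alarms = 0
for _ in range(trials):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    if abs(xbar - mu) > limit:        # signal raised even though nothing is wrong
        false_alarms += 1

print(f"estimated alpha ~ {false_alarms / trials:.4f}")   # close to 0.0027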
48
Errors FIGURE 36-16 When you look at some of the output from a process and decide about the whole (i.e., the quality of the process), you can make two kinds of errors.
49
36.5.3 Design of Experiments (DOE) and Taguchi Methods SPC deals with process monitoring and control; Taguchi methods loosely imply "improvement." DOE and Taguchi methods span a much wider scope of functions and include the design aspects of products and processes, areas that were seldom treated from a quality standpoint. The consumer is the focus of quality, and the methods of quality design and control have been incorporated into all phases of production.
50
Taguchi Method FIGURE 36-18 The use of Taguchi methods can reduce the inherent process variability, as shown in the upper figure. Factors A, B, C, and D versus process variable V are shown in the lower figure.
51
36.5.4 Six Sigma FIGURE 36-19 To move from four-sigma to six-sigma capability requires that the process capability (variability) be greatly improved (σ reduced). The curves in these figures represent histograms or curves fitted to histograms.
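As a back-of-envelope sketch (not from the text), the expected defect rates for spec limits at ±4σ versus ±6σ can be computed from the normal model, with and without the 1.5-sigma mean shift conventionally assumed in Six Sigma accounting.

import math

def tail(z):                      # P(Z > z) for a standard normal variable
    return 0.5 * math.erfc(z / math.sqrt(2))

for level in (4, 6):
    centered = 2 * tail(level)                       # mean on target
    shifted = tail(level - 1.5) + tail(level + 1.5)  # mean shifted by 1.5 sigma
    print(f"{level}-sigma spec: {centered * 1e6:9.3f} ppm centered, "
          f"{shifted * 1e6:9.3f} ppm with a 1.5-sigma shift")
# The shifted 6-sigma case gives the familiar 3.4 defects per million.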
52
36.5.5 Total Quality Control (TQC) The term Total Quality Control (TQC) was first used by A. V. Feigenbaum in May 1957. TQC means that all departments of a company must participate in quality control (Table 36-1).
54
Final Exam Date: April 26, 2011 (Tuesday); Time: 12:00 pm – 2:00 pm; Classroom: EC 2410