Sunday, October 27, 2019
The Taguchi Methods for Quality Improvement
INTRODUCTION

Taguchi methods are statistical methods developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently applied also to engineering, biotechnology, marketing, and advertising. Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly Taguchi's development of designs for studying variation, but have criticized the inefficiency of some of his proposals. Taguchi's work includes three principal contributions to statistics: a specific loss function (the Taguchi loss function); the philosophy of off-line quality control; and innovations in the design of experiments.

Loss functions

Loss functions in statistical theory

Traditionally, statistical methods have relied on mean-unbiased estimators of treatment effects: under the conditions of the Gauss-Markov theorem, least squares estimators have minimum variance among all mean-unbiased estimators. The emphasis on comparisons of means also draws (limited) comfort from the law of large numbers, according to which sample means converge to the true mean. Fisher's textbook on the design of experiments emphasized comparisons of treatment means. Gauss proved that the sample mean minimizes the expected squared-error loss function (while Laplace proved that a median-unbiased estimator minimizes the absolute-error loss function). In statistical theory, the central role of the loss function was renewed by the statistical decision theory of Abraham Wald. Loss functions, however, were avoided by Ronald A. Fisher.[6]

Taguchi's use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher's methods in the design of experiments, Taguchi interpreted them as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher's work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests. However, Taguchi realised that in much industrial production there is a need to produce an outcome on target: for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality, and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal results in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, who are more interested in their private costs than in social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics.
Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets, and generate profits. Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as the region where we deny that losses exist. As we diverge from nominal, losses grow until the point where losses are too great to deny and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them statistically.

Taguchi specified three situations: larger the better (for example, agricultural yield); smaller the better (for example, carbon dioxide emissions); and on-target, minimum-variation (for example, a mating part in an assembly). The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function for several reasons: it is the first symmetric term in the Taylor series expansion of real analytic loss functions; total loss is measured by the variance, and since variance is additive for uncorrelated random variables, the total loss is an additive measurement of cost; and the squared-error loss function is widely used in statistics, following Gauss's use of it in justifying the method of least squares.
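To make the three situations concrete, here is a minimal Python sketch of the loss functions usually associated with them. The quadratic form for the on-target case follows the discussion above; the particular monotonic forms used for the first two cases (k·y² and k/y²) are the conventional textbook choices rather than something specified in this text, and every numeric value is purely illustrative.

```python
# A minimal sketch of the three loss situations described above. The quadratic
# form for the on-target case follows the text; the particular monotonic forms
# for the first two cases (k*y**2 and k/y**2) are the conventional textbook
# choices, and every numeric value here is purely illustrative.

def loss_nominal_is_best(y, target, k=1.0):
    """On-target, minimum-variation: L(y) = k * (y - target)**2."""
    return k * (y - target) ** 2

def loss_smaller_is_better(y, k=1.0):
    """Smaller the better (e.g. emissions): loss grows monotonically with y."""
    return k * y ** 2

def loss_larger_is_better(y, k=1.0):
    """Larger the better (e.g. yield): loss shrinks monotonically as y grows."""
    return k / y ** 2

if __name__ == "__main__":
    # A mating part with a nominal diameter of 10.0 mm: the loss grows with the
    # square of the deviation, even while the part is still inside specification.
    for diameter in (10.0, 10.1, 10.3):
        print(diameter, loss_nominal_is_best(diameter, target=10.0, k=50.0))
```

The point of the on-target case is visible in the last loop: a part that is still inside its specification limits nonetheless carries a loss that grows with the square of its deviation from nominal.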
Reception of Taguchi's ideas by statisticians

Though many of Taguchi's concerns and conclusions are welcomed by statisticians and economists, some ideas have been especially criticized. For example, Taguchi's recommendation that industrial experiments maximise some signal-to-noise ratio (representing the magnitude of the mean of a process compared to its variation) has been widely criticized.

Off-line quality control

Taguchi's rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages: system design, parameter design, and tolerance design.

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set; this is the detail design phase of conventional engineering. Taguchi's radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimise the effects on performance arising from variation in manufacture, environment, and cumulative damage. This is sometimes called robustification.

Tolerance design

With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions (see the Pareto principle).

Design of experiments

Taguchi developed his experimental theories independently, reading works following R. A. Fisher only in 1954. His framework for design of experiments is idiosyncratic and often flawed, but contains much of enormous value. He made a number of innovations.

Outer arrays

Taguchi's designs aimed to allow greater understanding of variation than did many of the traditional designs from the analysis of variance (following Fisher). Taguchi contended that conventional sampling is inadequate here, as there is no way of obtaining a random sample of future conditions.[7] In Fisher's design of experiments and analysis of variance, experiments aim to reduce the influence of nuisance factors to allow comparisons of the mean treatment effects. Variation becomes even more central in Taguchi's thinking.

Taguchi proposed extending each experiment with an outer array (possibly an orthogonal array); the outer array should simulate the random environment in which the product would function. This is an example of judgmental sampling. Many quality specialists have been using outer arrays. Later innovations in outer arrays resulted in compounded noise. This involves combining a few noise factors to create two levels in the outer array: first, noise factors that drive the output lower, and second, noise factors that drive the output higher. Compounded noise simulates the extremes of noise variation but uses fewer experimental runs than would previous Taguchi designs.
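As a concrete illustration of the inner/outer array layout just described, the Python sketch below crosses a standard L4 orthogonal array of control factors with a two-level compounded-noise outer array. The factor names, level codes, and noise conditions are hypothetical and chosen only to show the structure of the experimental plan.

```python
from itertools import product

# Sketch of Taguchi's crossed (inner x outer) array layout, with hypothetical
# factor names. The inner array is the standard L4 orthogonal array for three
# control factors at two levels; the outer array uses two compounded noise
# conditions (N_minus: noise settings that push the response low,
# N_plus: noise settings that push it high).

CONTROL_FACTORS = ("temperature", "pressure", "hold_time")   # hypothetical
L4_INNER_ARRAY = [          # level codes 1/2 for each control factor
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]
COMPOUNDED_NOISE = ("N_minus", "N_plus")   # two extreme noise conditions

def experimental_plan():
    """Cross every inner-array row with every outer-array noise condition."""
    runs = []
    for row, noise in product(L4_INNER_ARRAY, COMPOUNDED_NOISE):
        setting = dict(zip(CONTROL_FACTORS, row))
        runs.append((setting, noise))
    return runs

if __name__ == "__main__":
    for i, (setting, noise) in enumerate(experimental_plan(), start=1):
        print(f"run {i:2d}: controls={setting} noise={noise}")
    # 4 inner rows x 2 noise conditions = 8 runs; the replicated responses at
    # each inner row are later summarised by a mean and a signal-to-noise ratio.
```

Each inner-array row is thus observed under both noise extremes, which is what later allows a mean response and a signal-to-noise ratio to be computed for every control-factor setting.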
Management of interactions

Interactions, as treated by Taguchi

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for control factors, that is, factors in the inner array. By combining an inner array of control factors with an outer array of noise factors, Taguchi's approach provides full information on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise-factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.

* Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a confirmation experiment offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the likelihood of control-factor-by-control-factor interactions is greatly reduced, since energy is additive.

Inefficiencies of Taguchi's designs

* Interactions are part of the real world. In Taguchi's arrays, interactions are confounded and difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the sequential assembly of designs: in the RSM approach, a screening design is followed by a follow-up design that resolves only the confounded interactions judged to merit resolution. A second follow-up design may be added, time and resources allowing, to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi's designs.

TAGUCHI METHODS

There has been a great deal of controversy about Genichi Taguchi's methodology since it was first introduced in the United States. This controversy has lessened considerably in recent years due to modifications and extensions of his methodology. The main controversy, however, is still about Taguchi's statistical methods, not about his philosophical concepts concerning quality or robust design. Furthermore, it is generally accepted that Taguchi's philosophy has promoted, on a worldwide scale, the design of experiments for quality improvement upstream, at the product and process design stage.

Taguchi's philosophy and methods support, and are consistent with, the Japanese quality control approach that asserts that higher quality generally results in lower cost. This is in contrast to the widely prevailing view in the United States that quality improvement is associated with higher cost. Furthermore, Taguchi's philosophy and methods support the Japanese approach of moving quality improvement upstream. Taguchi's methods help design engineers build quality into products and processes. As George Box, Soren Bisgaard, and Conrad Fung observed: "Today the ultimate goal of quality improvement is to design quality into every product and process and to follow up at every stage from design to final manufacture and sale. An important element is the extensive and innovative use of statistically designed experiments."

TAGUCHI'S DEFINITION OF QUALITY

The old traditional definition of quality states that quality is conformance to specifications. This definition was expanded by Joseph M. Juran (1904-2008) in 1974 and then by the American Society for Quality Control (ASQC) in 1983. Juran observed that quality is "fitness for use." The ASQC defined quality as "the totality of features and characteristics of a product or service that bear on its ability to satisfy given needs."

Taguchi presented another definition of quality. His definition stressed the losses associated with a product: quality is the loss a product causes to society after being shipped, other than losses caused by its intrinsic functions. Taguchi asserted that losses in his definition should be restricted to two categories: (1) loss caused by variability of function, and (2) loss caused by harmful side effects. Taguchi is saying that a product or service has good quality if it performs its intended functions without variability and causes little loss through harmful side effects, including the cost of using it. It must be kept in mind here that society includes both the manufacturer and the customer. Loss associated with function variability includes, for example, energy and time (problem fixing) and money (replacement cost of parts). Losses associated with harmful side effects could be lost market share for the manufacturer and/or physical effects, such as those of the drug thalidomide, for the consumer.

Consequently, a company should provide products and services such that possible losses to society are minimized; in other words, the purpose of quality improvement is to discover innovative ways of designing products and processes that will save society more than they cost in the long run. The concept of reliability is appropriate here. The next section will show that Taguchi's loss function yields an operational definition of the term "loss to society" in his definition of quality.
TAGUCHI'S LOSS FUNCTION

We have seen that Taguchi's quality philosophy strongly emphasizes losses or costs. W. H. Moore asserted that this is an enlightened approach that embodies three important premises: for every product quality characteristic there is a target value which results in the smallest loss; deviations from the target value always result in increased loss to society; and loss should be measured in monetary units (dollars, pesos, francs, etc.).

Figure 1 depicts Taguchi's typical loss function and contrasts it with the traditional view that there are no losses if specifications are met.

(Figure 1: Taguchi's Loss Function)

It can be seen that small deviations from the target value result in small losses. These losses, however, increase in a nonlinear fashion as deviations from the target value increase. The function shown in Figure 1 is a simple quadratic equation that compares the measured value of a unit of output, Y, to the target value, T:

L(Y) = k(Y − T)²

Essentially, this equation states that the loss is proportional to the square of the deviation of the measured value, Y, from the target value, T. This implies that any deviation from the target (based on customers' desires and needs) will diminish customer satisfaction. This is in contrast to the traditional definition of quality as conformance to specifications. It should be recognized that the constant k can be determined when the loss L(Y) associated with some particular value of Y is known. Of course, under many circumstances a quadratic function is only an approximation.

Since Taguchi's loss function is presented in monetary terms, it provides a common language for all the departments or components within a company. Finally, the loss function can be used to define performance measures of a quality characteristic of a product or service. This property of Taguchi's loss function will be taken up in the next section. But to anticipate the discussion of this property, Taguchi's quadratic function can be converted to an expected, or average, loss:

E[L(Y)] = k[σ² + (μ − T)²]

This can be accomplished by assuming that Y has some probability distribution with mean μ and variance σ². This second mathematical expression states that average or expected loss is due either to process variation or to being off target (called bias), or both.
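As a small worked sketch of these two expressions, the Python fragment below determines k from one known pair of deviation and loss and then checks the variance-plus-bias decomposition against the directly averaged loss. The target, cost figure, and measurements are all hypothetical.

```python
import statistics

# Sketch of the two loss-function expressions above, with hypothetical numbers.
# Determining k: if a deviation of 0.5 mm from target is known to cost $20 per
# unit, then k = L / (Y - T)**2 = 20 / 0.5**2 = 80 dollars per mm squared.

TARGET = 10.0          # target value T (hypothetical)
K = 20.0 / 0.5 ** 2    # loss coefficient from one known (deviation, loss) pair

def loss(y, target=TARGET, k=K):
    """Quadratic loss L(Y) = k * (Y - T)**2."""
    return k * (y - target) ** 2

def expected_loss(sample, target=TARGET, k=K):
    """Average loss = k * (variance + bias**2), the decomposition above."""
    mu = statistics.mean(sample)
    sigma2 = statistics.pvariance(sample, mu)
    return k * (sigma2 + (mu - target) ** 2)

if __name__ == "__main__":
    measurements = [10.1, 9.9, 10.2, 10.0, 9.8, 10.3]   # hypothetical data
    direct_average = statistics.mean(loss(y) for y in measurements)
    print(round(expected_loss(measurements), 4), round(direct_average, 4))
    # The two numbers agree: the average quadratic loss splits exactly into a
    # variance term and an off-target (bias) term.
```

With these illustrative figures, k = 20/0.5² = 80, so a unit that is only 0.1 mm off target already carries a loss of $0.80 even though it may sit comfortably inside its specification limits.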
TAGUCHI, ROBUST DESIGN, AND THE DESIGN OF EXPERIMENTS

Taguchi asserted that the development of his methods of experimental design started in Japan about 1948. These methods were then refined over the next several decades and were introduced in the United States around 1980. Although Taguchi's approach was built on traditional concepts of design of experiments (DOE), such as factorial and fractional factorial designs and orthogonal arrays, he created and promoted some new DOE techniques, such as signal-to-noise ratios, robust designs, and parameter and tolerance designs. Some experts in the field have shown that some of these techniques, especially signal-to-noise ratios, are not optimal under certain conditions. Nonetheless, Taguchi's ideas concerning robust design and the design of experiments will now be discussed.

DOE is a body of statistical techniques for the effective and efficient collection of data for a number of purposes. Two significant ones are the investigation of research hypotheses and the accurate determination of the relative effects of the many different factors that influence the quality of a product or process. DOE can be employed in both the product design phase and the production phase.

A crucial component of quality is a product's ability to perform its tasks under a variety of conditions. Furthermore, the operating environmental conditions are usually beyond the control of the product designers, and therefore robust designs are essential. Robust designs are based on the use of DOE techniques for finding product parameter settings (e.g., temperature settings or drill speeds) which enable products to be resilient to changes and variations in working environments. It is generally recognized that Taguchi deserves much of the credit for introducing the statistical study of robust design. We have seen how Taguchi's loss function sets variation reduction as a primary goal for quality improvement. Taguchi's DOE techniques employ the loss function concept to investigate both product parameters and key environmental factors. His DOE techniques are part of his philosophy of achieving economical quality design.

To achieve economical product quality design, Taguchi proposed three phases: system design, parameter design, and tolerance design. In the first phase, system design, design engineers use their practical experience, along with scientific and engineering principles, to create a viable functional design. To elaborate, system design uses current technology, processes, materials, and engineering methods to define and construct a new system. The system can be a new product or process, or an improved modification of an existing product or process.

The parameter design phase determines the optimal settings for the product or process parameters, which have been identified during the system design phase. DOE methods are applied here to determine the optimal parameter settings. Taguchi constructed a limited number of experimental designs, which U.S. engineers have found easy to select and apply in their manufacturing environments. The goal of parameter design is a robust product or process which, as a result of minimizing performance variation, minimizes manufacturing and product lifetime costs. Robust design means that the performance of the product or process is insensitive to noise factors such as variation in environmental conditions, machine wear, or product-to-product variation due to raw material differences. Taguchi's DOE parameter design techniques are used to determine which controllable factors and which noise factors are the significant variables. The aim is to set the controllable factors at those levels that will result in a product or process that is robust with respect to the noise factors.

In our previous discussion of Taguchi's loss function, two equations were discussed. It was observed that the second equation could be used to establish quality performance measures that permit the optimization of a given product's quality characteristic. In improving quality, both the average response of a quality characteristic and its variation are important. The second equation suggests that it may be advantageous to combine the average response and the variation into a single measure, and Taguchi did this with his signal-to-noise (S/N) ratios. Consequently, Taguchi's approach is to select design parameter levels that will maximize the appropriate S/N ratio. These S/N ratios can be used to get closer to a given target value (such as tensile strength or baked tile dimensions), or to reduce variation in the product's quality characteristic(s). For example, one S/N ratio corresponds to what Taguchi called "nominal is best." Such a ratio is selected when a specific target value, such as tensile strength, is the design goal.

For the nominal-is-best case, Taguchi recommended finding an adjustment factor (some parameter setting) that will eliminate the bias discussed in the second equation. Sometimes a factor can be found that will control the average response without affecting the variance. If this is the case, our second equation tells us that the expected loss becomes:

E[L(Y)] = kσ²

Consequently, the aim now is to reduce the variation. Therefore, Taguchi's S/N ratio is:

S/N = −10 log₁₀(S²)

where S² is the sample variance. In this formula, by minimizing S², −10 log₁₀(S²) is maximized. Recall that all of Taguchi's S/N ratios are to be maximized.
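The short Python sketch below applies this idea: it computes the variance-only S/N ratio just given for replicated responses at each setting of a single control factor and picks the setting that maximizes it. The smaller-is-better and larger-is-better forms are included as the usual textbook definitions, which is an assumption since the text above does not spell them out, and all factor levels and response data are hypothetical.

```python
import math
import statistics

# Sketch of selecting a control-factor level by maximizing Taguchi's S/N ratio.
# The ratio used for selection is the variance-only form from the text,
# -10 * log10(S**2), appropriate once an adjustment factor can put the mean
# back on target. The smaller/larger-is-better forms are the usual textbook
# definitions, included here as an assumption. All data are hypothetical.

def sn_variance_only(replicates):
    """Nominal-is-best after bias removal: -10 * log10(sample variance)."""
    return -10.0 * math.log10(statistics.variance(replicates))

def sn_smaller_is_better(replicates):
    return -10.0 * math.log10(statistics.mean(y * y for y in replicates))

def sn_larger_is_better(replicates):
    return -10.0 * math.log10(statistics.mean(1.0 / (y * y) for y in replicates))

if __name__ == "__main__":
    # Replicated responses (e.g. across the outer noise array) at three
    # hypothetical settings of one control factor.
    responses = {
        "level_1": [52.1, 47.8, 50.3, 49.9],
        "level_2": [50.4, 50.1, 49.7, 50.0],
        "level_3": [55.0, 45.2, 51.8, 48.1],
    }
    ratios = {lvl: sn_variance_only(r) for lvl, r in responses.items()}
    for lvl, sn in ratios.items():
        print(f"{lvl}: S/N = {sn:.2f} dB")
    best = max(ratios, key=ratios.get)
    print("choose", best, "then use an adjustment factor to move the mean onto target")
```

The two-step logic mirrors the recommendation above: first choose the levels that maximize the S/N ratio (minimize variance), then use an adjustment factor to bring the average response back onto the target.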
Finally, a few brief comments concerning the tolerance design phase. This phase establishes tolerances, or specification limits, for the product or process parameters that have been identified as critical during the second phase, the parameter design phase. The goal here is to establish tolerances wide enough to reduce manufacturing costs, while at the same time assuring that the product or process characteristics stay within certain bounds.
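One common way to make this trade-off concrete, offered here only as an illustrative reading of tolerance design and not as Taguchi's own procedure, is to place the specification limits at the deviation where the quadratic loss equals the cost of reworking a unit before shipment. The cost figures in the sketch below are hypothetical.

```python
import math

# Sketch of a loss-function-based tolerance: place the specification limits at
# the deviation where the quadratic loss equals the in-house rework cost, so
# that it is cheaper to rework anything further from target than that. This
# break-even reading of tolerance design is an assumption of this sketch, and
# all numbers are hypothetical.

TARGET = 10.0       # nominal value T
K = 80.0            # loss coefficient, dollars per unit of deviation squared
REWORK_COST = 5.0   # cost to rework one unit before shipping, dollars

def breakeven_tolerance(k, rework_cost):
    """Solve k * delta**2 = rework_cost for the tolerance half-width delta."""
    return math.sqrt(rework_cost / k)

if __name__ == "__main__":
    delta = breakeven_tolerance(K, REWORK_COST)
    print(f"tolerance: {TARGET} +/- {delta:.3f}")
    # Wider tolerances lower manufacturing cost but increase the loss shipped
    # to the customer; the loss function makes that trade-off explicit.
```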
EXAMPLES AND CONCLUSIONS

As Thomas P. Ryan has stated, Taguchi, at the very least, has focused our attention on new objectives in achieving quality improvement. The statistical tools for accomplishing these objectives will likely continue to be developed. Quality management gurus such as W. Edwards Deming (1900-1993) and Kaoru Ishikawa (1915-1989) have stressed the importance of continuous quality improvement by concentrating on processes upstream. This is a fundamental break with the traditional practice of relying on inspection downstream. Taguchi emphasized the importance of DOE in improving the quality of the engineering design of products and processes. As previously mentioned, however, his methods are frequently statistically inefficient and cumbersome. Nonetheless, Taguchi's designs of experiments have been widely applied and theoretically refined and extended. Two application cases and one refinement example will now be discussed.

K. N. Anand, in an article in Quality Engineering, discussed a welding problem. Welding was performed to repair cracks and blown holes on the cast-iron housing of an assembled electrical machine. Customers wanted a defect-free weld; however, the welding process had resulted in a fairly high percentage of welding defects. Management and welders identified five variables and two interactions that were considered the key factors in improving quality. A Taguchi orthogonal design was performed, resulting in the identification of two highly significant interactions and a defect-free welding process.

The second application, presented by M. W. Sonius and B. W. Tew in a Quality Engineering article, involved reducing stress components in the connection between a composite component and a metallic end fitting for a composite structure. The connections were traditionally made by bonding, pinning, or riveting the fitting in place. Nine variables that could affect the performance of the entrapped-fiber connections were identified, and a Taguchi experimental design was performed. The experiment identified two of the nine factors as significant, along with their respective optimal settings, and stress levels were thereby significantly reduced.

The theoretical refinement example involves Taguchi robust designs. We have seen how such a design can result in products and processes that are insensitive to noise factors. Using Taguchi's quadratic loss function, however, may provide a poor approximation of the true loss and suboptimal product or process quality. John F. Kros and Christina M. Mastrangelo established relationships between nonquadratic loss functions and Taguchi's signal-to-noise ratios. Applying these relationships in an experimental design can change the recommended settings of the key parameters and result in smaller losses.