

# Measuring and Achieving Six Sigma Performance


Graphing data points can visually present operations personnel with the sigma metric and level-of-defectives actually achieved

Published Date : 7/1/2004



**By Robert L. Horst, PE, Senior Member SME, Life Fellow IEEE, Founder, Peak Productivity USA, Lancaster, PA**

Manufacturers cannot know when they've achieved six-sigma performance--or some lesser goal--without measuring the performance of individual production variables in sigma-level metrics. But conventional statistical tools don't readily provide that knowledge.

Graph shows the 6σ statistical model compared to a lesser 4½σ design that yields 1.3 defectives per thousand (dpK).

The gold standard to be achieved and certified is six-sigma performance. It's a realizable objective and a highly desirable performance goal for enterprise profitability, though it sometimes appears not to be economically justifiable. Sigma (σ) is the population standard deviation, a measure of data dispersion or scatter.

"Six sigma" is a statistical measure of excellence in process performance as defined by Motorola in the 1980s, wherein process tolerance corresponds to ±6σ. It's not a total quality management program, strategy, or method, although some consultancies are marketing their TQM, CQI (continuous quality improvement), and quality-team implementation systems under the Six Sigma moniker.

The Motorola model defines a 6σ criterion for excellence, promising extremely high yields, with a maximum of 3.4 defectives per million (dpM). (Note that a defective is an error, faulty part or action, or out-of-tolerance variable.) A unique feature of the Motorola peak-yield ideal is that it acknowledges an acceptable degree of drift (process shift) of variables from target, and permits a defined zone of variation. No process adjustments need to be made when the collected data stay within the limits of ±1½ sigma, as long as manufacturing specs are consistent with a process tolerance of ±6σ--corresponding to a process capability index (C_p) of 2--or, alternatively, ±4½σ.
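The 3.4 dpM figure can be checked directly from the normal distribution: with spec limits at ±6σ and the mean allowed to drift 1.5σ toward one limit, the nearer tail sits 4.5σ away. A minimal standard-library Python sketch (the function name is illustrative, not from the article):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def defectives_per_million(sigma_level, shift=1.5):
    """Expected defectives per million for a normal process with spec
    limits at +/- sigma_level and the mean drifted `shift` sigma
    toward one limit (the Motorola drift allowance)."""
    p_defect = phi(-(sigma_level - shift)) + phi(-(sigma_level + shift))
    return p_defect * 1e6

print(f"{defectives_per_million(6.0):.1f}")  # -> 3.4
```

Setting `shift=0` reproduces the centered case, so the same function covers both interpretations of a sigma level.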

A second unique feature of the Motorola-defined model is the relevance of short-term versus long-term data collection. To meet the 6σ criterion, short-term data need to exhibit a standard deviation that fits with process tolerance. The focus is always on reducing data scatter represented by the spread of the bell curve.

The power of sigma-level performance analysis for the improvement of manufacturing processes kicks in where the usefulness of statistical process control (SPC) diminishes. SPC is a powerful analytical tool for out-of-control process/product variables, but it's inadequate for quantitative analysis of "in-control," high-yield processes.

All quality management regimens are extensions and expansions of the PDCA cycle (plan, do, check, act) originated by W.A. Shewhart in his book*The Economic Control of Quality of a Manufactured Product* (published in 1931), in combination with problem definition, data collection, analysis, and testing of hypotheses. The action step is the engineering (or re-engineering) that leads to improvement of quality in a manufacturing process and/or product. The work and writings of W.E. Deming and most contemporary quality consultants are founded upon Shewhart's teachings, including the principal analytical tools of SQC/SPC (statistical quality control/statistical process control).

To use the isogrammetric analysis methodology (IAM), statistical performance data are plotted on an isogrammetric chart (left) or entered into a computer programmed with the isogrammetric format. This procedure reveals probable process yield and helps certify quality of performance in terms of the level of defectives produced.

Today, however, Shewhart's tables for estimating process standard deviation σ are of little utility on the factory floor, because of the availability of handheld calculators and laptop computers that provide instant statistical calculations for collected data. In the October 2003 issue of *Manufacturing Engineering*, Vivek Sharma states correctly that Six Sigma consultancies have "not introduced even one original tool to the quality field." (See *Six Sigma: A Dissenting Opinion* on page 16 of that issue.)

**SPC practice always focuses** on centering the mean value, on reducing process shift to a minimum. Standard deviation is used to establish upper and lower control limits of ±3σ representing 99.7% yield for a perfectly in-control, centered-on-target process.
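The 99.7% figure for ±3σ control limits is likewise a one-line normal-distribution calculation, sketched here in standard-library Python:

```python
from math import erf, sqrt

# Fraction of a centered normal process falling inside +/- 3 sigma
yield_3sigma = erf(3 / sqrt(2))
print(f"{yield_3sigma:.2%}")  # -> 99.73%
```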

Surprisingly--and perplexingly to some quality improvement practitioners--the long-term performance of a 6σ-controlled process may show data dispersion equivalent to a centered 4½σ process (actually, 4.65σ), and still meet the 3.4 defectives-per-million requirement!

Multiple sets of short-term data will have central values (means) that scatter across the allowable ±1½σ zone within which no correction is required. The data distribution for the subgroups may be systematic or irregular, but the distribution of the data over the long term will tend to be normal (Gaussian), with mean value centered at, or near, the specification target value. (Statisticians rely on something called the central-limit theorem to explain this outcome.)
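The equivalence claimed above can be verified numerically: a 6σ process drifted by the allowed 1.5σ and a centered process with limits near ±4.65σ yield roughly the same defective rate. A short Python check, assuming normal data:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Shifted model: spec limits at +/-6 sigma, mean drifted 1.5 sigma
shifted = phi(-(6 - 1.5)) + phi(-(6 + 1.5))

# Centered model: spec limits at +/-4.65 sigma, mean on target
centered = 2 * phi(-4.65)

# Both land near 3.4 defectives per million
print(f"{shifted * 1e6:.2f} dpM vs {centered * 1e6:.2f} dpM")
```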

In practice, short-term data are considered to be 10-30 consecutive data points per set, spanning a minimum of one to three or more process cycles, depending upon the dynamics of the given process. Long-term data are usually collected at regular intervals over the course of an extended factory run with a minimum of 10 data points per subgroup.

It's very unwieldy to calculate manufacturing performance in sigma-level metrics, especially for 6σ yield. When we strive to determine yields higher than that of a ±3σ SPC-controlled process, a six-place statistical table is needed. The NIST handbook of such tables weighs five pounds (2.3 kg)!

The value of sigma-level analysis is in the quantification of variable data with respect to required process tolerances rather than intrinsic control limits assigned by SPC rules. It's an approach that leads to the discovery of rogue variables that prevent the achievement of high yields. In the broad perspective, knowledge of sigma-level performance for key variables--in every production process--alerts operators and signals management, leading to fact-based decision-making and corrective actions that are essential for higher productivity.

**Our group's proprietary approach** to presentation of sigma-level data is called the Isogrammetric Analysis Method (IAM). It uses isograms of constant process yield as a metric to determine the probable sigma-level yield associated with measurable process variables in a production process. Mean-value shift (in σ units) and ratio (tolerance divided by σ) are plotted, respectively, as *X,Y* coordinates on isogrammetric graphs. Data points show the sigma metric and the achieved level-of-defectives, without the encumbrance of reference tables and associated calculations that, likely, are not practical for use by production personnel.

The intellectual requirements for successful use of IAM by operations personnel are:

- Know the specification tolerance for every key process variable (documented specs are required).
- Understand the process instrumentation and how to record variables data.
- Check data periodically and determine mean value and standard deviation (can be done manually with handheld statistical calculator).
- Enter the mean-value shift and ratio (see above) on the isogrammetric chart (or computer).
- Take corrective action as needed.
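The operator steps above can be sketched in a few lines of standard-library Python. The function below is an illustrative reduction of a data subgroup to the two IAM plotting coordinates described in the article (mean-value shift in σ units, and tolerance divided by σ); the function name, the normality assumption, and the `tolerance` convention (spec half-width from target to one limit) are assumptions of this sketch, not the author's proprietary implementation:

```python
from math import erf, sqrt
from statistics import mean, stdev

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def iam_point(data, target, tolerance):
    """Reduce a subgroup of readings to the two plotting coordinates
    described in the article -- mean-value shift (in sigma units) and
    the ratio tolerance / sigma -- plus the implied probable
    defectives per million, assuming normally distributed data.
    `tolerance` is the spec half-width from target to one limit."""
    m, s = mean(data), stdev(data)
    shift = abs(m - target) / s          # X coordinate: drift from target
    ratio = tolerance / s                # Y coordinate: spec width in sigma
    dpm = (phi(-(ratio - shift)) + phi(-(ratio + shift))) * 1e6
    return shift, ratio, dpm

# A hypothetical subgroup whose scatter fits +/-0.09 at about the 6 sigma level
readings = [10.01, 9.99, 10.02, 9.98, 10.00, 10.01, 9.99, 10.00, 10.02, 9.98]
shift, ratio, dpm = iam_point(readings, target=10.00, tolerance=0.09)
```

A point plotted at these coordinates on the isogrammetric chart would fall on (or between) the isograms of constant yield, giving the operator the probable level of defectives without table lookups.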

The required management actions to achieve these requirements are straightforward:

- Post the specifications for every key process variable.
- Train the operator and/or quality technician to understand the process instrumentation and how to record variables; check data periodically and determine mean value and standard deviation; and enter the mean-value shift and ratio (see above) on the isogrammetric chart (or computer).
- Finally, take corrective action by upgrading the process.

When all in-plant data are examined relative to isograms, the probable yield level can be known for every factory variable for which data can be acquired. The goal always is to stabilize offending variables and subprocesses, in order to maximize the potential process yield and economics for a given production line or factory.

The IAM tool is particularly effective with high-yield processes where statistical sampling and inspection methods tend to miss the relatively few defectives, and where process variables are supposedly "in control" by SPC rules.

Isogrammetric analysis merges SPC and sigma-level measurement criteria. When implemented on-line in factories, it provides real-time feedback in terms of probable level-of-defectives from variables data, informing plant-floor employees of the need to take action to stabilize production processes. Incremental increases in process yield will produce calculable savings in materials, energy, and labor, and can mean the difference between profit and loss on the balance sheet.

This article was first published in the July 2004 edition of *Manufacturing Engineering* magazine.
