
Measuring and Achieving Six Sigma Performance


Graphing data points can visually show operations personnel the sigma metric and the level of defectives actually achieved

 

By Robert L. Horst, PE
Senior Member SME, Life Fellow IEEE
Founder
Peak Productivity USA
Lancaster, PA
horst@ieee.org


Manufacturers cannot know when they've achieved six-sigma performance--or some lesser goal--without measuring the performance of individual production variables in sigma-level metrics. But conventional statistical tools don't readily provide that knowledge.

 
Graph shows the 6σ statistical model compared to a lesser 4½σ design that yields 1.3 defectives per thousand (dpK).

The gold standard to be achieved and certified is six-sigma performance. It's a realizable objective and a highly desirable performance goal for enterprise profitability, though it sometimes appears not to be economically justifiable. Sigma (σ) is the population standard deviation, a measure of data dispersion or scatter.

"Six sigma" is a statistical measure of excellence in process performance as defined by Motorola in the 1980s, wherein process tolerance corresponds to ±6σ. It's not a total quality management program, strategy, or method, although some consultancies are marketing their TQM, CQI (continuous quality improvement), and quality-team implementation systems under the Six Sigma moniker.

The Motorola model defines a 6σ criterion for excellence, promising extremely high yields, with a maximum of 3.4 defectives per million (dpM). (Note that a defective is an error, faulty part or action, or out-of-tolerance variable.) A unique feature of the Motorola peak-yield ideal is that it acknowledges an acceptable degree of drift (process shift) of variables from target, and permits a defined zone of variation. No process adjustments need to be made when the collected data stay within the limits of ±1½σ, as long as manufacturing specs are consistent with a process tolerance of ±6σ--corresponding to a process capability index (Cp) of 2--or, alternatively, ±4½σ.
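The arithmetic behind these figures is a pair of normal-distribution tail probabilities. As a minimal sketch in stdlib Python (the defectives_per_million helper and its parameters are names invented for illustration, not part of any published toolkit), this reproduces both the 3.4 dpM figure and the roughly 1.3 defectives per thousand of the ±4½σ design shown in the graph above:

    from math import erf, sqrt

    def phi(z):
        """Standard normal cumulative distribution function."""
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    def defectives_per_million(k, shift=1.5):
        """Parts per million falling outside a +/-k-sigma tolerance for a
        normal process whose mean has drifted `shift` sigmas off target."""
        p_out = phi(-(k - shift)) + phi(-(k + shift))
        return p_out * 1e6

    print(f"{defectives_per_million(6.0):.1f} dpM")   # ~3.4, the Motorola figure
    print(f"{defectives_per_million(4.5):.0f} dpM")   # ~1350, i.e. ~1.3 per thousand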

A second unique feature of the Motorola-defined model is the relevance of short-term versus long-term data collection. To meet the 6σ criterion, short-term data need to exhibit a standard deviation that fits with process tolerance. The focus is always on reducing data scatter represented by the spread of the bell curve.

The power of sigma-level performance analysis for the improvement of manufacturing processes kicks in where the usefulness of statistical process control (SPC) diminishes. SPC is a powerful analytical tool for out-of-control process/product variables, but it's inadequate for quantitative analysis of "in-control," high-yield processes.

All quality management regimens are extensions and expansions of the PDCA cycle (plan, do, check, act) originated by W.A. Shewhart in his book Economic Control of Quality of Manufactured Product (published in 1931), in combination with problem definition, data collection, analysis, and testing of hypotheses. The action step is the engineering (or re-engineering) that leads to improvement of quality in a manufacturing process and/or product. The work and writings of W.E. Deming and most contemporary quality consultants are founded upon Shewhart's teachings, including the principal analytical tools of SQC/SPC (statistical quality control/statistical process control).

 
To use the isogrammetric analysis method (IAM), statistical performance data are plotted on an isogrammetric chart or entered into a computer programmed with the isogrammetric format. This procedure reveals probable process yield and helps certify quality of performance in terms of the level of defectives produced.

Today, however, Shewhart's tables for estimating process standard deviation σ are of little utility on the factory floor, because handheld calculators and laptop computers provide instant statistical calculations for collected data. In the October 2003 issue of Manufacturing Engineering, Vivek Sharma states correctly that Six Sigma consultancies have "not introduced even one original tool to the quality field." (See Six Sigma: A Dissenting Opinion on page 16 of that issue.)

SPC practice always focuses on centering the mean value--on reducing process shift to a minimum. Standard deviation is used to establish upper and lower control limits of ±3σ, representing a 99.7% yield for a perfectly in-control, centered-on-target process.
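As an illustrative sketch (the sample readings and the control_limits helper below are invented for the example; formal SPC practice estimates σ from rational subgroups rather than from raw individual readings), the ±3σ limits amount to a few lines of Python:

    from statistics import mean, stdev

    def control_limits(samples):
        """Shewhart-style limits: mean +/- 3 standard deviations. The plain
        sample standard deviation of individual readings is used here only
        to keep the sketch short."""
        m, s = mean(samples), stdev(samples)
        return m - 3 * s, m + 3 * s

    readings = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00, 10.04, 9.96]
    lcl, ucl = control_limits(readings)
    print(f"LCL = {lcl:.3f}, UCL = {ucl:.3f}")
    # A centered, in-control normal process stays inside +/-3 sigma with
    # probability 2*Phi(3) - 1 = 0.9973 -- the 99.7% yield quoted above.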

Surprisingly--and perplexingly to some quality improvement practitioners--the long-term performance of a 6σ-controlled process may look like that of a centered 4½σ process (actually, 4.65σ) and still meet the 3.4-defectives-per-million requirement!
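That equivalence is easy to verify with the same normal-tail arithmetic; a quick stdlib-Python check:

    from math import erf, sqrt

    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF

    # 6-sigma tolerance with the permitted 1.5-sigma drift:
    shifted = phi(-(6.0 - 1.5)) + phi(-(6.0 + 1.5))    # ~3.40e-6
    # Centered process with a +/-4.65-sigma tolerance:
    centered = 2.0 * phi(-4.65)                        # ~3.32e-6
    print(f"{shifted * 1e6:.2f} dpM vs {centered * 1e6:.2f} dpM")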

Multiple sets of short-term data will have central values (means) that scatter across the allowable ±1½σ zone within which no correction is required. The data distribution for the subgroups may be systematic or irregular, but the distribution of the data over the long term will tend to be normal (Gaussian), with mean value centered at, or near, the specification target value. (Statisticians rely on something called the central-limit theorem to explain this outcome.)
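A toy simulation (ours, not from the article) makes the claim concrete: even when the individual readings are not normal and each subgroup's mean wanders within the ±1½σ zone, the long-term collection of subgroup means piles up in a bell shape around target:

    import random
    from statistics import mean, stdev

    random.seed(1)

    def subgroup(n=10):
        """One short-term data set: readings scattered (non-normally) around
        a subgroup mean that wanders within the +/-1.5-sigma zone."""
        drift = random.uniform(-1.5, 1.5)
        return [drift + random.uniform(-3.0, 3.0) for _ in range(n)]

    means = [mean(subgroup()) for _ in range(500)]

    # The readings are uniform, not normal, and the subgroup means drift --
    # yet the distribution of means is bell-shaped and centered near the
    # target (here, zero), as the central-limit theorem predicts.
    print(f"grand mean = {mean(means):+.3f}, spread of means = {stdev(means):.3f}")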

In practice, short-term data are considered to be 10-30 consecutive data points per set, spanning a minimum of one to three or more process cycles, depending upon the dynamics of the given process. Long-term data are usually collected at regular intervals over the course of an extended factory run, with a minimum of 10 data points per subgroup.
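One plausible way to compute the two estimates in Python (the subgroup data and the short_and_long_term_sigma helper are illustrative, and the c4 small-sample bias correction used in formal SPC is skipped to keep the sketch short):

    from statistics import mean, stdev

    def short_and_long_term_sigma(subgroups):
        """Short-term sigma: average within-subgroup standard deviation,
        capturing only instantaneous scatter. Long-term sigma: standard
        deviation of all readings pooled over the whole run, which also
        picks up any drift between subgroups."""
        short = mean(stdev(g) for g in subgroups)
        overall = stdev([x for g in subgroups for x in g])
        return short, overall

    subgroups = [
        [10.1, 9.9, 10.0, 10.2, 9.8],
        [10.4, 10.3, 10.5, 10.2, 10.4],   # this subgroup's mean has drifted
        [9.7, 9.8, 9.6, 9.9, 9.7],
    ]
    st, lt = short_and_long_term_sigma(subgroups)
    print(f"short-term sigma ~ {st:.3f}, long-term sigma ~ {lt:.3f}")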

It's very unwieldy to calculate manufacturing performance in sigma-level metrics, especially for 6σ yield. When we strive to determine yields higher than that of a ±3σ SPC-controlled process, a six-place statistical table is needed. The NIST handbook of such tables weighs five pounds (2.3 kg)!

The value of sigma-level analysis is in the quantification of variable data with respect to required process tolerances rather than intrinsic control limits assigned by SPC rules. It's an approach that leads to the discovery of rogue variables that prevent the achievement of high yields. In the broad perspective, knowledge of sigma-level performance for key variables--in every production process--alerts operators and signals management, leading to fact-based decision-making and corrective actions that are essential for higher productivity.

Our group's proprietary approach to presentation of sigma-level data is called the Isogrammetric Analysis Method (IAM). It uses isograms of constant process yield as a metric to determine the probable sigma-level yield associated with measurable process variables in a production process. Mean-value shift (in σ units) and ratio (tolerance divided by σ) are plotted, respectively, as X,Y coordinates on isogrammetric graphs. Data points show the sigma metric and the achieved level of defectives, without the encumbrance of reference tables and associated calculations that are likely impractical for use by production personnel.
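IAM itself is proprietary, so the sketch below reconstructs only what this paragraph defines: the two chart coordinates, plus the defect level a normal model implies for them. The iam_point name is hypothetical, and treating the tolerance argument as a half-width (±T about target) is an assumption:

    from math import erf, sqrt
    from statistics import mean, stdev

    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF

    def iam_point(data, target, half_tolerance):
        """Reduce one variable's readings to the two coordinates named in
        the text -- X: mean-value shift in sigma units; Y: tolerance/sigma
        ratio -- plus the probable defect fraction they imply."""
        s = stdev(data)
        shift = abs(mean(data) - target) / s        # X coordinate
        ratio = half_tolerance / s                  # Y coordinate
        p_defect = phi(-(ratio - shift)) + phi(-(ratio + shift))
        return shift, ratio, p_defect

    data = [4.98, 5.03, 5.01, 4.97, 5.04, 5.02, 5.00, 4.99, 5.05, 5.01]
    x, y, p = iam_point(data, target=5.00, half_tolerance=0.15)  # spec: 5.00 +/- 0.15
    print(f"shift = {x:.2f} sigma, ratio = {y:.1f}, ~{p * 1e6:.2f} dpM")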

The intellectual requirements for successful use of IAM by operations personnel are:

  • Know the specification tolerance for every key process variable (documented specs are required).
  • Understand the process instrumentation and how to record variables data.
  • Check data periodically and determine mean value and standard deviation (can be done manually with handheld statistical calculator).
  • Enter the mean-value shift and ratio (see above) on the isogrammetric chart (or computer); a rough sketch of such a chart appears after this list.
  • Take corrective action as needed.
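The published isogrammetric chart is proprietary; as a rough approximation of the idea, a contour plot of constant defectives per million over the (shift, tolerance/σ) plane--drawn here with numpy and matplotlib, our choice of tools, not the article's--gives the operator yield isograms against which to plot such points:

    import numpy as np
    import matplotlib.pyplot as plt
    from math import erf, sqrt

    phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))

    shift = np.linspace(0.0, 2.0, 201)            # X: mean-value shift (sigma units)
    ratio = np.linspace(3.0, 7.0, 201)            # Y: half-tolerance / sigma
    X, Y = np.meshgrid(shift, ratio)
    dpm = (phi(-(Y - X)) + phi(-(Y + X))) * 1e6   # defectives per million

    levels = [3.4, 10, 100, 1000, 10000]          # isograms of constant yield
    cs = plt.contour(X, Y, dpm, levels=levels)
    plt.clabel(cs, fmt="%g dpM")
    plt.xlabel("mean-value shift (sigma units)")
    plt.ylabel("tolerance / sigma")
    plt.title("Isograms of constant probable yield (normal model)")
    plt.show()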

The required management actions are straightforward:

  • Post the specifications for every key process variable.
  • Train the operator and/or quality technician to understand the process instrumentation and how to record variables; check data periodically and determine mean value and standard deviation; and enter the mean-value shift and ratio (see above) on the isogrammetric chart (or computer).
  • Finally, take corrective action by upgrading the process.

When all in-plant data are examined relative to isograms, the probable yield level can be known for every factory variable for which data can be acquired. The goal always is to stabilize offending variables and subprocesses, in order to maximize the potential process yield and economics for a given production line or factory.

The IAM tool is particularly effective with high-yield processes where statistical sampling and inspection methods tend to miss the relatively few defectives, and where process variables are supposedly "in control" by SPC rules.

Isogrammetric analysis merges SPC and sigma-level measurement criteria. When implemented on-line in factories, it provides real-time feedback in terms of probable level-of-defectives from variables data, informing plant-floor employees of the need to take action to stabilize production processes. Incremental increases in process yield will produce calculable savings in materials, energy, and labor, and can mean the difference between profit and loss on the balance sheet.

 

 

This article was first published in the July 2004 edition of Manufacturing Engineering magazine. 


