Monday, March 26, 2007

Some thoughts on "Six Sigma"...

As previously promised....

The hypothesis here is that each and every process, whether it be pouring a cup of coffee, coloring in a coloring book with crayons, or precision machining a steel forging, operates most efficiently when it operates within its inherent capability. Furthermore, that inherent capability is determined by the make-up of that process and its design. And lastly, a process may be made to operate in excess of its capability, but only by careful tending and oversight - and the cost of that tending becomes the basis of the investment case for improving the process capability itself.

That's all there is to it. Really.

Here's the shorthand sketch. Suppose you have a bar that has to be sawed to length. And let's say that the product designers require that bar to be 8" long, with an allowable tolerance of +/- 1.0".

It can be demonstrated that the process then is as follows: set up a backstop, or bar stop, on the far side of the saw blade, 8" from the blade. Advance the raw bar stock under the blade until it hits the stop, and start the saw. When the cut is complete, move the bar to the "finished" basket. Repeat.

Common sense might tell us that for this process, given the generous tolerance, one need not even check the bars - the inherent process capability is well within the required tolerance. In fact, a statistical study could be performed looking at the variability of the length of the bars, and establishing the odds of the finished length of the bars being within or outside of tolerance. The term "six sigma" refers to a capability in which the tolerance limits sit +/- 6 standard deviations from the process mean. The area under the normal curve outside that range corresponds to the probability that the next part falls out of tolerance: roughly 2 in a billion for a perfectly centered process, or the commonly quoted 3.4 per million once the conventional 1.5-sigma drift of the mean is allowed for. By contrast, a +/- 3 sigma process lets about 1 part in 370 fall out of tolerance.
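Assuming the bar lengths are normally distributed and centered on the target, those odds can be checked with a short calculation (the function name `frac_outside` is just for illustration):

```python
from math import erf, sqrt

def frac_outside(k):
    """Fraction of a centered, normal process falling outside +/- k sigma."""
    # P(|X - mean| <= k*sigma) = erf(k / sqrt(2)) for a normal distribution
    return 1.0 - erf(k / sqrt(2))

print(frac_outside(3))  # ~0.0027, i.e. about 1 in 370 out of tolerance
print(frac_outside(6))  # ~2e-9, i.e. about 2 in a billion
```

For the saw example, if measured bars show a standard deviation of, say, 0.05", then the +/- 1.0" tolerance corresponds to 20 sigma - far beyond six - which is why checking every bar adds nothing.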

Now, imagine if the required tolerance is +/- 0.001". The process looks the same; however, in order to verify the conformance of the finished product, each part would have to be checked. If a part is too long by 0.001", it might be possible to slightly adjust the stop, clean off its face, and attempt to re-work the non-conforming part. If the part winds up too short, it must be discarded, or it must be inventoried as raw material for another, shorter part. If the part is OK, then it is placed in finished goods.
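The sorting step that the tight tolerance forces on us can be sketched as a simple decision rule - a hypothetical `disposition` function, with the target and tolerance from the example above:

```python
TARGET = 8.000   # nominal bar length, inches
TOL = 0.001      # allowable tolerance, +/- inches

def disposition(length):
    """Classify a cut bar: 'finished' (in tolerance), 'rework'
    (too long - the stop can be adjusted and the bar re-cut),
    or 'scrap' (too short - no way to add material back)."""
    if abs(length - TARGET) <= TOL:
        return "finished"
    if length > TARGET + TOL:
        return "rework"
    return "scrap"

print(disposition(8.0005))  # finished
print(disposition(8.0030))  # rework
print(disposition(7.9980))  # scrap
```

Note the asymmetry: a long part keeps some of its value, while a short part loses it all - which is exactly where the scrap segregation and yield accounting below come from.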

Look at all the activity (read that - cost) that has been added. There is the measuring, the sorting, the inventory and its management, the segregation of scrap, and the establishment of a yield for the material (at a 50% yield, in order to make 30 parts you have to buy enough material to cut 60). This is inherently wasteful when compared to running the process within its inherent capability.

So what's the answer? As a start - use the "waste" caused by running the process under a "control" model to justify improving the process, or replacing it with one that runs under a "capability" model. Design or create a process that is capable, and assure or verify conformance, rather than controlling or creating it.

Secondly - challenge your design engineers to create designs which utilize the capabilities of the processes at hand, and demand that they cost into their projects the capability improvements required to generate designs and products which exceed the available capabilities. Burden the design projects that demand excessive capabilities with the costs of creating those capabilities. You would be surprised at how innovative your engineers will become when this requirement is enforced. It will cause better and more realistic business decisions to be made.