28 August 2012

Does nothing wrong mean anything right?

A couple of interesting papers recently crossed my desktop that I'd like you to reflect upon.

The first was a 1994 paper by Dr. Juran (one of America's top quality gurus) titled "Quality Problems, Remedies and Nostrums" that focused on the Zero Defect (ZD) movement. In this paper, he states that "the results of the ZD movement are not very impressive," first because failures greatly exceeded successes, and second because published results appeared more qualitative than quantitative, as if their main purpose were to impress customers.

The second document is an ISO related discussion on the difference between "corrective action" and "preventive action" to eliminate the causes of non-conformance. The paper explains that corrective action is about stability, and preventive action is about capability. For QFD practitioners, this explanation also demonstrates the difference between a problem solving approach using DMAIC, and a design approach using DMADV to understand true customer needs and assure satisfaction.

Neither paper answers this critical QFD question, however: "Does nothing wrong mean anything is right?" 

image - "nothing wrong" may not be "anything right"
We ask this question at the start of every QFD Green Belt® course in order to provoke students to go beyond fixing and preventing negative quality, and to search for positive quality.

In other words, customers don't buy a product or service because the product is problem-free; they buy a product because it helps them, the customer, become problem-free. This means you must understand what outcomes the customer truly wants in their life and work.

Unfortunately, customers are not always good at explaining themselves. After all, few suppliers ever bother to ask, so customers are not practiced at describing their problems or unfulfilled opportunities.

This is why VOC tools such as gemba visits, the Customer Process model, and the Customer Voice table are essential to good QFD. These tools help customers use words and actions to show us what "success" means to them and why they are failing. Through these tools, customers can explain their biggest headaches and missed opportunities.

With this knowledge, a QFD team can then identify solutions that are capable of delighting the customer better than the competitors. This is how QFD differs from other quality initiatives.

If you find this topic helpful, you might also be interested in reading "Finding Customer Delights Using QFD" in the 2006 Symposium Transactions. Better yet, plan to join us this fall at the 24th Symposium on QFD in St. Augustine, Florida, to learn more about these modern tools.

    22 August 2012

    Romney PDCA

    Mr. Newt Gingrich, in his 2006 book "Saving Lives and Saving Money", expounded on his decade-long fascination with Total Quality Management (TQM), six sigma, and lean thinking. Perhaps he could share his library with U.S. Republican Party presidential candidate Mr. Mitt Romney.

    According to Romney advisor Beth Myers, who also worked on his 2003 Massachusetts governor transition team, Mr. Romney has his own brand of "problem solving" that might interest others in the quality field.

    In an August 16, 2012 article in the New York Times, "Campaigning Aside, Team Plans a Romney Presidency" by Ashley Parker, Ms. Meyers is quoted:
    “With Mitt, his approach to problem solving is first to identify the problem, make sure you’re solving for the problem actually there; second, look at best practices; third, apply best practices to the problem at hand; and fourth, execute on it.”

    While Mr. Romney's four steps resemble Dr. Shewhart's and later Dr. W. E. Deming's "Plan-Do-Check-Act" (PDCA) approach to problem solving, they deserve some examination -- especially if he wins the election and employs this technique in government.

    Let's compare the two approaches.

    image - QFD is a PDCA approach
    QFD is a PDCA approach.
    • PLAN. Define the problem. This means to identify an undesirable state (problem) or a desired state (opportunity). How important is this problem relative to other problems? This requires deep analysis, including:
    1. Prioritization of problems and opportunities so that people, time, and money can be focused where they will do the most good. By what criteria will "good" be defined? Is the problem due to common causes of variation or special causes?
    2. Set a measurable target or outcome (how much must the problem be mitigated to be acceptable, or how much of the opportunity must be realized).
    3. What is the current state of the problem or opportunity?
    4. What is preventing the current state from achieving the target? That is, what is the root cause(s) of the problem or missed opportunity? If there are many root causes, which has the greatest impact or correlation?
    5. In order to address the root causes with the greatest impact, define what a good solution must do or be, independent of any particular solution.
    6. Use creativity and innovation to propose solutions that will do or be what is defined in step 5.
    7. Define a way to test the solutions for efficacy.
    8. Select the best solutions relative to efficacy, time, cost, and other considerations.
    • DO. Test the best solutions to see how well they work in real application. Measure both the inputs and the outputs of the solution to determine whether the results achieve statistical stability and are not just luck.
    • CHECK (also referred to as Study). Check the results of the solutions against the targets set in the Plan phase. Are they acceptable and sustainable? If not, search for new solutions or as a last resort, lower the targets (and be able to justify why, and when they will be raised again).
    • ACT. Roll out the solution and standardize the improvements by issuing/training new operating procedures in order to prevent recurrence. Measure inputs periodically to assure that the procedures and systems are being followed. Measure outputs periodically to assure the improved process remains stable and predictable. Determine when the process will be reviewed for further improvement, or begin work on the next priority problem.
      There are many variations on the above, including DMAIC, but this will work for our discussion.

    • "Identify the problem, make sure you’re solving for the problem actually there." This sounds like good advice to confirm that the problem is real. But where is the analysis of the cause of the problem, the current state, the desired state?
    • "Look at best practices." It is interesting that Deming did not care for benchmarking best practices, ridiculing the process as “the last stage of civilization.” His argument was that if your company is the same as the others, why would your customers buy from you and not from them? Unique conditions require unique solutions. Where in this approach are creativity and innovation? (See our previous post "Benchmarking – the fatal flaw in modern quality thinking".)
    • "Apply best practices to the problem at hand." Where is the testing to see if the solutions are delivering the desired results? Where is the refinement?
    • "Execute on it." This sounds like a repeat of "Apply best practices," so it is not clear that this adds anything to the process. Where is the follow-up to see if the solution continues to work?

    Remember that QFD is also a PDCA approach. The Plan phase includes all the modern Blitz QFD® tools, up to and including parts of the Maximum Value table.

    The House of Quality matrix actually starts at the end of the Plan stage -- which is why it should be preceded by Blitz QFD® anyway. Do is the design, development, and prototyping phase. Check is the testing and market validation phase. Act is the roll-out, commercialization, product maintenance, and product retirement phase.


    14 August 2012

    Which country won the 2012 London Olympics — Quantifying and prioritizing subjective data

    The 2012 Olympics were fantastic and our British friends are to be congratulated on putting together a memorable experience for athletes and viewers alike.

    image - olympic gold medal
    But after each series of Games, whether summer or winter, I always marvel at the discussion of which country won.

    Those of us in the quality field, for whom numbers are our bread and butter, may be interested to know that I posed this question to Dr. Thomas Saaty, creator of the Analytic Hierarchy Process (AHP), a method for quantifying and prioritizing subjective data. Tom, not surprisingly, has authored papers on the subject.

    Most news reports go by a straight count of total medals. For London 2012:
    1. The U.S.: 104
    2. China: 87
    3. Russia: 82
    4. Great Britain: 65
    This assumes that all medals, regardless of color, are of equal value. But what about other considerations, such as:
    • What was the score difference between gold and silver? (7 points in men's basketball, 0.12 seconds in the men's 100 meter sprint, 0.100 points in women's balance beam gymnastics)
    • How strong was the competition? (The US men's basketball team was made up of NBA professionals.)
    • How important is the event relative to other events? (Past modern Olympics included events such as hot air ballooning [1900]; poodle grooming, often cited, was actually an April Fools' Day joke.)
    Dr. Saaty raises interesting questions for developing weighting criteria, such as how long and difficult the training is, how many challengers in the world engage in the sport, and other intangibles. He also looks at factors such as national Purchasing Power Parity per person (Ethiopia won in 2008 with 161.441) and the country population from which athletes are drawn (the Bahamas won in 2008 with 6.5433 medals per million people). He also raises the question of how exciting an event is, based on the average ticket price paid by spectators.

    After looking at the medals from multiple perspectives in the 2010 Winter Olympics, Dr. Saaty settles on 7, 2, 1 for the values assigned to gold, silver and bronze medals respectively.

    Applying these weights to 2012, he arrives at a very interesting observation:
    1. The U.S.: 104 total medals, score 409
    2. China: 87 medals, score 342
    3. Great Britain: 65 medals, score 256, and
    4. Russia: 82 medals, score 251 -- a reversal for third place.
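The 7-2-1 weighting can be sketched in a few lines of Python. The per-color medal counts below are my reconstruction of the London 2012 tallies as reported at the time (later reallocations changed some totals); they reproduce the totals and scores cited above, including the third-place reversal.

```python
# A sketch of Dr. Saaty's 7-2-1 weighting applied to London 2012.
# Per-color counts are reconstructed from tallies reported at the time.

GOLD, SILVER, BRONZE = 7, 2, 1  # Saaty's medal weights

medals = {  # country: (gold, silver, bronze)
    "United States": (46, 29, 29),
    "China":         (38, 27, 22),
    "Great Britain": (29, 17, 19),
    "Russia":        (24, 25, 33),
}

def weighted_score(gold, silver, bronze):
    """Weighted medal score with gold=7, silver=2, bronze=1."""
    return GOLD * gold + SILVER * silver + BRONZE * bronze

# Rank by weighted score rather than by raw medal count.
ranking = sorted(medals.items(),
                 key=lambda item: weighted_score(*item[1]),
                 reverse=True)

for country, counts in ranking:
    print(f"{country}: {sum(counts)} medals, score {weighted_score(*counts)}")
```

Sorting by the weighted score instead of the raw count is what moves Great Britain (256) ahead of Russia (251).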
    Readers who want to know more are encouraged to read the QFDI newsletter "Decision Making with AHP (Analytic Hierarchy Process)".

    Today's quality professionals should know how to apply AHP in their projects for better analytic precision, and this includes six sigma black belts and anyone who is involved with prioritization of customer needs and product features in QFD. Case studies using AHP will be presented at the upcoming Symposium.
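For readers unfamiliar with the mechanics of AHP, here is a minimal sketch: pairwise judgments on Saaty's 1-9 scale are collected into a comparison matrix, and priority weights are derived by approximating its principal eigenvector (here with simple power iteration). The three criteria and the judgment values are illustrative assumptions, not figures from Dr. Saaty's papers.

```python
# A minimal AHP sketch: derive priority weights from a pairwise
# comparison matrix by approximating its principal eigenvector.
# Criteria and judgments are hypothetical, for illustration only.

def ahp_priorities(matrix, iterations=200):
    """Approximate the normalized principal eigenvector of a positive matrix."""
    n = len(matrix)
    weights = [1.0 / n] * n
    for _ in range(iterations):
        # Multiply the matrix by the current weight vector...
        raw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
        # ...then renormalize so the weights sum to 1.
        total = sum(raw)
        weights = [value / total for value in raw]
    return weights

# Hypothetical 1-9 scale judgments for weighting a sport's medals:
criteria = ["training difficulty", "number of challengers", "spectator excitement"]
comparisons = [
    [1.0, 3.0, 5.0],   # training difficulty vs. the others
    [1/3, 1.0, 2.0],   # number of challengers vs. the others
    [1/5, 1/2, 1.0],   # spectator excitement vs. the others
]

for name, weight in zip(criteria, ahp_priorities(comparisons)):
    print(f"{name}: {weight:.3f}")
```

With these judgments, training difficulty dominates (roughly 65% of the weight); in a real study the judgments would come from stakeholders and be checked for consistency before use.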

    08 August 2012

    Social Media for VOC

    A recent New York Times article, "Facebook, Twitter and Foursquare as Corporate Focus Groups" (July 30, 2012), noted that producers of hit food products and other retail fads are using social media to extract new ideas from consumers, as well as to select which ideas to commercialize.

    Younger consumers, who are more adept at online communication, can be attracted in larger numbers and more quickly than traditional focus groups. Further, their online profiles are self-populated and can provide far more demographic and attitudinal detail than could otherwise be obtained. Sorting responses by these criteria can yield valuable insight by age, income, location, and other characteristics important to target marketing.

    photo - young people using social media
    In recent years, several QFD practitioners have been using social media to acquire Voice of Customer (VOC) data during new product development. QFD, of course, usually begins well before focus groups are employed to evaluate solution options; its aim is to acquire VOC that defines customer needs and product requirements. In these cases, users are asked to send in videos and photos of their activities and frustrations, usually around a product theme. Another use is to search personal social media postings for issues related to the new product project.

    In my experience, this has proven to be a rich source of candid VOC in which the user directs the script: a kind of virtual "gemba." In one case, we uncovered that a company's product was being abused by young teens, allowing the maker to change the product and make it less prone to tampering.

    Of course, like any customer gemba data, these are only inputs to a deeper QFD analysis that includes translating VOC into prioritized customer needs, product requirements, and features. These features can then be tested, again using social media, as described in the article cited above.

    01 August 2012

    Benchmarking – the fatal flaw in modern quality thinking

    In quality conference presentations and papers, we frequently hear high praise for benchmarking and for "shamelessly stealing" the ideas of others. But does it make sense to take what is successful elsewhere and expect it to work in a different context, with different staff and customers? Two recent news reports are noteworthy.

    "How Apple Store Seduces You With the Tilt of Its Laptops" (Forbes Magazine, June 14, 2012): Apple Retail has found that tilting demonstration laptop screens at a specific angle encourages customers to adjust them to their ideal viewing angle – and, by virtue of touching the computer, invites them to experience the product and its apps in a multi-sensory mode.

    "A Store Without a Checkout Counter? JCPenney Presses on with Retail Revolution" (Time Magazine, July 20, 2012): In late July 2012, J.C. Penney (a large American department store chain) announced that by 2014, it will eliminate cash registers and checkout counters at their retail stores. This idea emulates Apple Computer's successful retail store format, also the brainchild of Ron Johnson who left Apple Retail this spring to become CEO of J.C. Penney (JCP). Key functions of the plan are to have store employees with remote scanners roam the store and record purchases and payments, as well as create an iPhone app that allows customers to check themselves out.
    photo - shopping
    Readers who have shopped at an Apple Retail store know that you are surrounded by staff who are happy to leave you alone to play with the devices, yet instantly there should you have questions or wish to make a purchase. If you use a credit card, you can be checked out right where you stand and instantly receive your receipt by email. But can a clothing and general merchandise retailer imitate this successfully?

    From a QFD perspective, let's examine JCP's decision to emulate Apple as a new solution to an existing problem or opportunity. At the start of a technology-driven QFD project (Apple may have been customer-driven, but benchmarking is usually technology- or solution-driven), we look at the functions of the technology and ask what important customer problems it addresses.

    For example:
    • Who are the target customers and how do they shop?
    • Do they come in with a purchase already in mind or do they browse?
    • Do they buy things from multiple departments, and do they mind paying for different purchases in different departments?
    • Do they pay with cash?
    • Do they add additional items as they walk towards the checkout counter?
    • How big a problem is checking out and purchasing at JCP today?
    • Do customers abandon their purchases due to waiting in line?
    • What other problems do customers face at JCP such as poor selection or size availability?
    • How will floor staff handle lost sales when customers cannot find what they want?

    So, when benchmarking another business, be careful to understand the "spirit" and not just the "form." We talked about this in the QFDI Newsletter "Hoshin Kanri - Understanding the spirit behind the form".

    What fits others may need alterations before it fits your business.