Monitoring Cost of Quality and Merging with Non-Conformance Reports

The purpose of the cost of quality (COQ) technique is to give management a tool for facilitating quality programs and quality improvement activities. Quality cost reports can point out the strengths and weaknesses of a quality system. Quality costs express the activities of the quality program and of quality improvement efforts in a language that management can understand and act on: dollars.

One of the lessons I have learned through my work as a quality engineer is that speaking the language of money is essential. For a successful quality effort, the single most important element is leadership by upper management. To gain that leadership, one could propose concepts or tools, but in my opinion that is the wrong approach. Instead, management should be convinced that a problem exists that requires its attention and action, such as excessive costs due to poor quality. Excessive cost of poor quality and loss of sales revenue are both quality-related hot buttons for management.

In my previous blog, we discussed automating non-conformance reporting, monitoring trends, and taking preventive actions. However, driving continuous improvement efforts from the raw number of non-conformances alone is counterproductive and may not yield measurable results for the organization. Instead, merging this data with quality costs (supplier chargebacks, MRB, production line downtime, on-the-line sorting, customer chargebacks) points out the potential for improvement and gives management a basis for measuring the improvement accomplished.

Do

Phase 2:

In order to merge quality costs with non-conformances, you first need to check whether the data sources holding those costs are compatible with your business analytics software. Most modern tools support a wide range of data sources, including Excel workbooks and CSV files. Most likely, your organization keeps quality costs in a database separate from its non-conformance records. I connected multiple data sources and refined the data. Refining the data involves several steps, including (but not limited to):

  • Verify the data – Upon the first import, verify the data and make sure it is displayed as you expect.
  • Remove unnecessary rows – When shaping your data, you may need to remove rows that are blank or that contain data you do not need in your reports.
  • Rename the columns – Shape your initial data by identifying the column headers and giving the columns clear names.
  • Verify data categories – It is good practice to specify data categories for data fields (columns). Most BI tools are quite good at setting these automatically, but check every field anyway. This matters especially for address-related fields, where the category must be set correctly so geocoding can occur properly, and for date fields.
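As an illustration, the four refinement steps above can be sketched in pandas. The file layout, column names, and figures here are hypothetical stand-ins for your own export:

```python
import io
import pandas as pd

# Hypothetical sample standing in for a CSV export of non-conformance
# costs; the column names are assumptions, not your actual schema.
csv_export = io.StringIO(
    "NC_No,Date_Opened,Cost_USD\n"
    "NC-1001,2023-01-15,450.00\n"
    ",,\n"  # blank row, typical of spreadsheet exports
    "NC-1002,2023-02-03,1200.50\n"
)
raw = pd.read_csv(csv_export)

# 1. Verify the data: inspect the first rows and the inferred types.
print(raw.head())
print(raw.dtypes)

# 2. Remove unnecessary rows: drop rows that are entirely blank.
cleaned = raw.dropna(how="all")

# 3. Rename the columns to clear, consistent headers.
cleaned = cleaned.rename(
    columns={"NC_No": "nc_number", "Date_Opened": "date_opened", "Cost_USD": "quality_cost"}
)

# 4. Verify data categories: coerce dates and costs to proper types so
#    the BI layer treats them as dates and numbers, not text.
cleaned["date_opened"] = pd.to_datetime(cleaned["date_opened"], errors="coerce")
cleaned["quality_cost"] = pd.to_numeric(cleaned["quality_cost"], errors="coerce")
```

The same four steps map directly onto the query editor of most BI tools; the script simply makes each transformation explicit.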

Once the data is cleaned and transformed, it is time to build the model and establish relationships between the different source files. For example, all quality costs trace back to a non-conformance, whether it is a customer complaint, supplier complaint, production issue, or internal quality cost. The joint connector between the non-conformance records and the quality costs is therefore the non-conformance number that is generated when an engineer files a complaint. Since I was unifying data from multiple locations, I created a custom column that ties cost of quality to the non-conformance number across sites.
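A minimal sketch of that join, with invented site names and figures: the composite "site + NC number" column keeps records distinct when multiple locations reuse the same numbering.

```python
import pandas as pd

# Hypothetical tables: non-conformance records and quality costs kept in
# separate systems, to be joined on the non-conformance number.
nc_records = pd.DataFrame({
    "site": ["PlantA", "PlantA", "PlantB"],
    "nc_number": ["NC-1001", "NC-1002", "NC-1001"],
    "category": ["Supplier", "Production", "Customer"],
})
quality_costs = pd.DataFrame({
    "site": ["PlantA", "PlantA", "PlantB"],
    "nc_number": ["NC-1001", "NC-1002", "NC-1001"],
    "cost": [450.00, 1200.50, 300.00],
})

# Custom column: site + NC number, so NC-1001 at PlantA stays distinct
# from NC-1001 at PlantB.
for df in (nc_records, quality_costs):
    df["nc_key"] = df["site"] + "-" + df["nc_number"]

# Total cost of quality accumulated against each registered non-conformance.
cost_per_nc = quality_costs.groupby("nc_key", as_index=False)["cost"].sum()
merged = nc_records.merge(cost_per_nc, on="nc_key", how="left")
print(merged[["nc_key", "category", "cost"]])
```

In a BI tool the equivalent is a relationship between the two tables on the custom key column; a left join keeps every non-conformance visible even when no cost has been booked against it yet.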

The graphs I have created are similar to those in the previous blog, except that this time I am presenting the total cost of quality accumulated for each registered non-conformance. And this is live data: as soon as an engineer updates the files, the BI tool refreshes and presents the result to management.

With drill-down and trend-prediction features, this tool is now running at its full potential.

Check

Act

Quality cost measurement and publication do not solve quality problems unless they are acted on. Improvement projects must be identified (use tools like Pareto analysis to identify the vital few), clear responsibilities established, and resources provided to diagnose and remove the causes of problems, along with the other essential steps. A dedicated team needs to attack and reduce the high cost of poor quality.
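The Pareto step can be sketched as follows; the cause labels and cost figures are purely illustrative:

```python
import pandas as pd

# Hypothetical cost of poor quality broken down by cause.
costs = pd.Series({
    "Supplier defects": 52000,
    "Rework": 26000,
    "Line downtime": 12000,
    "Sorting": 6000,
    "Scrap": 4000,
})

# Rank causes by cost and compute the cumulative share -- the classic
# Pareto view used to separate the vital few from the trivial many.
pareto = costs.sort_values(ascending=False).to_frame("cost")
pareto["cum_share"] = pareto["cost"].cumsum() / pareto["cost"].sum()

# The vital few: causes that together account for ~80% of the cost.
vital_few = pareto[pareto["cum_share"] <= 0.80]
print(vital_few)
```

These are the causes a dedicated improvement team would attack first, each as its own project with clear ownership.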

(A Picture of Before and After)

This concludes my brief summary of how one can use BI tools to provide a deep-dive visualization of a company's core issues. During this work, I have laid a pathway to widen the scope of traditional quality costs. I will go into those details in my next blog.

Did you find the information in this article helpful? How has it worked for you? Let me know in the comments below!
