Quality assurance & process improvement

The common view of "quality" in business is limited to finding defects in products.  However, Quality is a discipline with schools of thought, practices, and analytical tools.  That is why, by convention, the "Q" is capitalized when referring to "quality" as a discipline (to differentiate it from the common use of the word).

During my time working in the Operations and Quality Departments of Nabisco Refrigerated Foods in the mid-1990s, I was introduced to the ideas of the Theory of Constraints and "Quality is Free". My interest in, study of, and practice of Quality have continued since then.  In 2019 this culminated in my passing the exam for certification as a Quality Improvement Associate through the American Society for Quality.

Certified Quality Improvement Associate, ASQ

My clients for quality and process improvement projects, including the work showcased on this page, are:

DEA/Northern New Jersey HIDTA Task Force
Streamlined and systematized intelligence support to make it more science than art.

Global Traffic Technologies
CRM record deduping and address verification.

Wells Fargo - Data Management & Insights
Assisted in data warehouse project by cataloging quality controls and documenting processes.

YMCA of the North
Researched exercise class schedules and how they coincided with visit times and volumes by target demographics.

Nabisco: fixing a process & restoring compliance

Ameriprise Financial: using dissent to help measure and eliminate wasted effort

Medtronic: quality testing & tracking for a large database project

Nabisco

East Hanover, NJ

Fixing a process & restoring compliance

A procedure will only be followed when it hurts more to not follow it than to follow it.

Nabisco Refrigerated Foods (a division of Nabisco, then part of RJR Nabisco of "Barbarians at the Gate" fame) made margarine, yogurt, and microwavable omelets. Before anyone was permitted to change an ingredient, packaging, or anything else about a product, the managers of various departments in the Division --- Marketing, Operations, Quality, and Purchasing --- had to review and approve the change.

All these reviews were to avoid problems the change requester might not have imagined.  For example, margarine changes color when exposed to light for a long period; decreasing the thickness or opacity of the container could therefore change the margarine's color. To avoid these kinds of problems, the various department managers had to be consulted on every product change.

These change requests came from production facilities around the country as well as headquarters staff in two New Jersey cities. One designated person in the Quality Department, the Quality Coordinator, administered the process of getting change requests reviewed and, if approved, published.  When the incumbent Quality Coordinator left the company, the position went unfilled and the backlog grew until it stretched back six months.  Officially, no changes were implemented while the position was vacant.

I was promoted from my position in Operations to restore and maintain this Quality process.

My Analysis

Using the Ishikawa (fishbone) diagram and the Five Whys approaches to root cause analysis, I realized I had four problems to solve:  (1) restoring trust in the process, (2) demonstrating value in following the process, (3) getting quick reviews for the most critical of the proposed changes, and (4) eliminating the backlog so new changes would not be delayed by old ones in the queue.

Any one of these four problems would have been a challenge on its own.  However, I knew I would have to solve all four quickly and virtually at the same time if I was to gain compliance with a process that had ceased to exist. This is because the entire process --- from submittal through publication of approved changes --- had to be fully functional and seen to be so.  I did not have the convenience of rolling out improvements over the course of 40-hour work weeks.  Instead, I knew I was in for some very late nights and weekends in the office.

Problems #1 & 2 - Restoring trust & providing value

Getting people to wait for review and approval before implementing a change would require trust that the change process was working and would be timely. More importantly, there had to be demonstrated value in simply submitting a change request for review, apart from getting official approval.  For providing that value, I had something in mind, which I will explain shortly.

Problem #3 - Prioritizing the critical

I would need to review every change request in the six-month backlog and prioritize each by a combination of its potential impact and how old the request was.  I did so through a point system I devised:  I would award 500 points to any change request related to QUALITY, 200 points for changes intended for COST SAVINGS, and no points for anything else (which would include marketing-related changes to packaging graphics).  I did not have the expertise to evaluate the relative benefit of each QUALITY and COST SAVINGS change request beyond my topical score.

To account for how long a request had been awaiting review, I added one point for each day since it had been submitted. With such a point system, QUALITY-related changes (500 points) would always have highest priority except in the unlikely event another request had been in the queue for more than a year (which was never the case).  I would update the days-waiting score every day a request awaited the final department manager's review.
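
The arithmetic is simple enough to show. Here is a minimal VBA sketch of the scoring rule (the function and argument names are illustrative, not Q-Track's original code):

    ' Priority = topical score + one point per day the request has waited.
    Public Function PriorityScore(changeType As String, dateSubmitted As Date) As Long
        Dim topical As Long
        Select Case UCase$(changeType)
            Case "QUALITY":      topical = 500
            Case "COST SAVINGS": topical = 200
            Case Else:           topical = 0   ' e.g., packaging graphics changes
        End Select
        PriorityScore = topical + DateDiff("d", dateSubmitted, Date)
    End Function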

Problem #4 - Eliminate the backlog

Ideally, the department heads would review all the backlogged change requests quickly.  That was impractical, however, due to the size of the backlog.  I realized I could let the requesters themselves pare down the backlog:  as each submitted a new change request for something, I would ask if they wanted me to cancel any of their backlogged requests so their new requests could be reviewed sooner.  I also realized my idea for "demonstrating value" --- to be described shortly --- would also pare down the backlog.

Automating process management iteratively (taking an organic approach)

The magnitude of what I had to accomplish quickly, and would have to manage continuously in the future, made me realize the need for a database with automated processes.  Using my technical skills, I created one: Q-Track (short for "Quality Tracking").  I built it iteratively, developing each set of tables and features needed to make progress in resurrecting the review process in general and in solving my four problems specifically.  Because I had to restore every aspect of the change review process virtually at the same time, I was building out multiple features of Q-Track every day.  As the days progressed, I was increasingly able to transition manual work to semi-automated features and eventually to fully automated ones.

Q-Track was entirely an MS Access/VBA application, since a robust relational database was not available to me.  The VBA automation extended to tasks in MS Outlook, Word, and Excel --- all driven from within MS Access.  Q-Track's features would eventually include:

  • A database of all change requests, their priority scores, and their statuses (received, out-for-review, approved, published, etc.).

  • Automated log entries for each step performed (including when I received a request via email and when I sent requests out for management's review).

  • Automated reminders to reviewers (and me) --- sketched below.

  • Automated backups and edits to the product specification documents prior to publishing a revised version, ensuring these administrative actions were done quickly and correctly.
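
Here is a minimal sketch of the reminder feature: Access VBA driving Outlook. The procedure and message text are illustrative stand-ins, not Q-Track's original code:

    ' Send one overdue-review reminder; Outlook must be installed.
    Public Sub SendReminder(reviewerEmail As String, requestId As String)
        Dim olApp As Object, msg As Object
        Set olApp = CreateObject("Outlook.Application")   ' late-bound automation
        Set msg = olApp.CreateItem(0)                     ' 0 = olMailItem
        msg.To = reviewerEmail
        msg.Subject = "Reminder: change request " & requestId & " awaits your review"
        msg.Body = "Please review change request " & requestId & ". " & _
                   "Its priority score rises by one point for each day it waits."
        msg.Send
    End Sub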

Providing VALUE for compliance by identifying possible change conflicts

One of Q-Track's features was critical to demonstrating value in submitting a change request for review:  conflict identification.

Because all product specifications --- ingredients, formulas, packaging, etc. --- carried an identification code for the product involved, I could leverage my relational database to quickly identify change requests that might conflict.

If two people proposed two different changes to the same product, that could be a problem.  Using Q-Track, I could identify these kinds of potential problems and inform the requesters.  The requesters could then reconcile any conflicts and potentially combine their separate requests into one.  With Q-Track, I could track these potential conflicts and wait until both requesters gave me an "OK" to submit the changes for management review.
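
The check itself amounts to grouping pending requests by product code and flagging any code with more than one. A simplified sketch of the kind of query involved (table, field, and status names are illustrative):

    ' List products with more than one pending change request (possible conflicts).
    Public Sub ListPossibleConflicts()
        Dim rs As DAO.Recordset
        Set rs = CurrentDb.OpenRecordset( _
            "SELECT ProductCode, Count(*) AS Pending " & _
            "FROM ChangeRequests " & _
            "WHERE Status IN ('received','out-for-review') " & _
            "GROUP BY ProductCode HAVING Count(*) > 1;")
        Do While Not rs.EOF
            Debug.Print rs!ProductCode, rs!Pending   ' inform the requesters involved
            rs.MoveNext
        Loop
        rs.Close
    End Sub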

During my entire time in the role, the percentage of true conflicts among the possible ones Q-Track identified was small. When I applied this check to the backlog, however, many true conflicts surfaced because the backlog had been growing for six months.  Often, different people were requesting different changes to the same aspect of a product, such as the vendor of an ingredient. Having these requesters agree to consolidate their several changes into one helped reduce the backlog.  Even when Q-Track raised false alarms, the requesters saw that its conflict identification provided value for having submitted a change request for review.  The change request process was no longer merely bureaucratic, but was now also risk-reducing --- for the product, the company, and my internal customers' careers.

Rapid reviews of critical requests in the backlog 

Earlier I had described my scoring method for prioritizing the backlog by the nature of the change (e.g., quality or cost savings) and age.  To support my efforts in restoring trust in the process, the department heads agreed to dedicate time to get together and review as a group the top-priority change requests in the six-month backlog.  The large number of decisions handed down in a short amount of time was the first and biggest evidence to my internal customers that the process was working again.

After the initial backlog reduction, the department heads would review change requests on their own.  Here, too, my priority score had a role to play.  A manager who failed to review and give a decision on a request within a week would receive automated reminders from Q-Track, their frequency determined by the priority score.  After a long enough delay, I would call the manager about working through their personal backlog.

Eliminating the backlog

After the initial surge of high-priority reviews, I worked with the requesters to pare down the backlog.  Q-Track's conflict identification feature served a secondary purpose here: identifying requesters who had more than one old change request against the same specification.  I would then work with the requester to combine or eliminate unneeded requests.  Another way I pared down the backlog was to suggest a requester rescind one or more of their old, minor change requests simply to move a newer, more important request up in the queue.

In fewer than three months, old requests disappeared from the queue through approval or attrition, even as new high-priority quality and cost-saving changes moved quickly through the review process.  With the old requests gone, even new requests for low-priority changes could now get through the process quickly.

Maintaining trust & perceived value

To maintain the trust and perceived value I had gained for the restored process, I created an automated weekly newsletter to my internal clients.  It reported, by product category, all approved changes of the previous week and the approval status of pending ones --- including the names of department managers who were delaying the review process. The newsletter also described new requests submitted that week.  In that way, everyone possibly impacted by the proposed changes knew of them before they might go into effect.

Efficiency through automation

Once I had the process running smoothly, I added more automation to Q-Track. This included automatically extracting change requests from Outlook emails, logging them, running my conflict checks, and sending the requests out for review --- all as one automated task I would run every morning.
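
A simplified sketch of that kind of morning sweep (the folder choice, subject filter, and table names are illustrative; the real task also parsed the request details and ran the conflict checks):

    ' Log new change-request emails from the Outlook inbox into the database.
    Public Sub SweepInbox()
        Dim olApp As Object, ns As Object, inbox As Object, itm As Object
        Set olApp = CreateObject("Outlook.Application")
        Set ns = olApp.GetNamespace("MAPI")
        Set inbox = ns.GetDefaultFolder(6)   ' 6 = olFolderInbox
        For Each itm In inbox.Items
            If InStr(1, itm.Subject, "Change Request", vbTextCompare) > 0 Then
                CurrentDb.Execute _
                    "INSERT INTO RequestLog (Subject, Sender, ReceivedOn) VALUES ('" & _
                    Replace(itm.Subject, "'", "''") & "', '" & itm.SenderName & "', #" & _
                    Format(itm.ReceivedTime, "mm/dd/yyyy hh:nn:ss") & "#)", dbFailOnError
            End If
        Next itm
    End Sub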

Thanks to the automation I created in my Q-Track application, my Quality Coordinator responsibilities eventually required only 33% of my time.  This time became available just as Nabisco began creating competitive intelligence roles in its divisions.  With a background in intelligence analysis and a lot of newly free time, I was given the competitive intelligence role for Nabisco Refrigerated Foods.  This is how I first began performing intelligence analysis in the private sector.  You can read about that here.  In addition, I was able to take on a greater role in the Quality Department, researching possible causes of consumer complaints and observing consumer focus groups.

News of my Q-Track application made its way around Nabisco.  Soon Q-Track was adopted by my Quality counterparts in both the Planters/Lifesavers and Food Service divisions of Nabisco.

Ameriprise Financial

Minneapolis, MN

Using dissent to help measure & eliminate wasted effort

Ameriprise Financial uses lean implementation to assess and improve the efficiency of its processes. Lean implementation involves capturing metrics in each work process to identify and eliminate wasted effort.

In one of its divisions, Ameriprise conducted clearing operations for special-rule investment products. It did this through a large number of MS Access user interfaces tied to Oracle databases. As with other large clients of mine, Ameriprise had realized that MS Access's forms and VBA programming language can provide user interfaces that are more cost-effective to create and maintain than browser-based interfaces. To implement lean metrics in this department, Ameriprise contracted a team of MS Access developers. I was one of these developers. Our numbers would vary between 10 and 15 over the course of this Agile-managed project.

My fellow developers and I were each assigned to work with a specific group of team managers to identify what metrics to capture and where in each team's work processes (performed using an MS Access interface). At a minimum, each application would need to capture the number of inputs and outputs each day and the time it took to produce the outputs. At their discretion, managers could define additional metrics to be captured anywhere in their processes. All metrics would be automatically delivered to an Oracle database where they would be centrally tracked and available for review by both the team manager and their senior management.
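
Capturing those minimum metrics takes only a few lines in each Access application. An illustrative sketch (the table and field names are hypothetical; in production the rows were delivered to Oracle):

    ' Record one work session's input/output counts and elapsed time.
    Private sessionStart As Date

    Public Sub BeginSession()
        sessionStart = Now
    End Sub

    Public Sub EndSession(teamId As String, inputs As Long, outputs As Long)
        CurrentDb.Execute _
            "INSERT INTO LeanMetrics (TeamID, Inputs, Outputs, Seconds, LoggedOn) " & _
            "VALUES ('" & teamId & "', " & inputs & ", " & outputs & ", " & _
            DateDiff("s", sessionStart, Now) & ", Now())", dbFailOnError
    End Sub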

Dissent

Some of the team managers with whom I worked were initially resentful of lean implementation. Their reasons fell into two categories:

  • Their team's work processes were complex and labor-intensive; metrics would only suggest bad productivity, not the diligence required by the team's responsibilities.

  • Their team's work was bottlenecked by upstream quality problems and delayed answers from other departments; again, metrics would only suggest bad productivity, not the sources of wasted time.

Using dissent in design

I did not take a "never mind that" approach with these dissenting managers, but offered to work with them to address their concerns. After all, lean implementation's goal of eliminating waste is achieved by identifying where and how time is spent.  How their teams spent their time, and why, was the very story these managers wanted to tell. The team managers accepted my offer to help them tell that story.

I then worked with members of each manager's team to map the work process in each of their MS Access applications. We also worked to identify where the bottlenecks were and their causes. While doing all this, I noted those current application features that could be made more efficient.

With the bottleneck causes identified, I worked with the team managers to classify each cause. We then composed metrics that would give a picture of what was happening and what the time cost was.

The result of my efforts in this requirements-gathering phase was a lean metrics plan that not only supported Ameriprise's lean goals, but gave the team managers the support they had originally feared was missing from this project.

Measuring & efficiently dealing with complexity

For those applications with complex, labor-intensive steps, my metrics capturing started by grouping imported records by what had to be done. This not only provided record counts for each type of treatment to be provided, but let me improve efficiency by creating new purpose-tailored screens for the employee to apply each type of treatment. A main menu let the employee navigate between the different screens for each treatment and would keep track of what had been completed and what was still undone. Capturing the start and end times of when each treatment screen was used let me record how long each treatment took. By recording the work time and record count for each type of treatment, the impact of complexity on productivity could be measured. My customized screens and other enhancements helped improve that productivity.

Measuring & dealing with poor-quality inputs

For the processes complicated by quality problems (caused by other departments), the managers and I identified and classified the different quality problems. Where possible, I had my metrics capture begin by looking for those problems, separating and counting the affected records by type of problem, and then (with the click of a button) emailing them back to the sender with a pre-formatted message describing the specific problem to be fixed.

For those problems only identified by an employee during the treatment process, I added a feature where the employee could tag a record for inquiry, letting the employee work the remaining records to completion. "Inquiry records" would later be counted and reported automatically.  The user could then return the "inquiry records" to their sender for correction, and the delay for completion would not count against their productivity statistics.
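
In essence, tagging set the record aside so it no longer counted as pending work. An illustrative sketch (the table, field, and status names are hypothetical):

    ' Set a problem record aside so the rest of the batch can proceed.
    Public Sub TagForInquiry(recordId As Long, problemType As String)
        CurrentDb.Execute _
            "UPDATE WorkItems SET Status = 'inquiry', ProblemType = '" & _
            problemType & "' WHERE RecordID = " & recordId, dbFailOnError
    End Sub

    ' The automated report then counted tagged records by problem type:
    ' SELECT ProblemType, Count(*) FROM WorkItems
    ' WHERE Status = 'inquiry' GROUP BY ProblemType;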

Quick victories

Creating a new version of an app, one that included metrics gathering and new productivity features, could take me a week or more to complete given my other responsibilities. There could also be several days of parallel processing as part of the user acceptance testing.  I took advantage of that delay by adding to the current version of the app the capture of simple metrics: overall inputs, outputs, and time to completion.  As a result, after my new version had been in production for a while, the team manager would have before-and-after metrics to demonstrate how they had quickly improved their team's productivity using lean principles.  It was an easy and significant victory for these managers in this company-wide lean initiative.

As you might expect, my work was appreciated by the team managers.  They now had metrics to illustrate the problems they faced.  My work was also appreciated by the lean project leadership, who made me the sole trainer of new contractors, on-shore and off-shore, as they joined the project. This was in addition to continuing my own metrics implementation work.

Medtronic

Mounds View, MN

Quality testing & tracking for a large database project

Don't try to solve a big problem; break it into small problems you can solve easily.

Following Medtronic's acquisition of medical device manufacturers and vendors on five continents, the voluminous information of those companies had to be recorded in Medtronic’s SAP ERP system.  First, that data had to be cleaned, completed where incomplete, and conformed to Medtronic's conventions.

Medtronic organized the project as a multi-phased, multi-staged series of hygiene, conversion/load testing, and data transformation tasks. Transformations were to be made as a series of transitions through intermediate table structures.  Quality checks and corrections were to be made before and after each transition. The project could not progress to the next stage until the data met all the data quality standards established for the current stage.

I was hired to provide the needed quality testing.

My primary responsibility was to compose custom test scripts when asked by the development team. These scripts were to be based on data standards composed for each stage of the project.  Depending on what the developers wanted to test, a script would need to include some or all of the quality standards for a given stage.

The standards for the early stages of the project were concerned with the completeness of the consolidation into one table of similar data from each of the acquired companies. Standards for middle stages were concerned with data completeness and its conformance to conventions.  The standards of the final stages were concerned with normalizing the data so it could be appended to the data structure of the SAP ERP system for which it was destined.

My role in this project was described to me as follows:  the developers would ask me to compose a test script involving certain data standards.  I would then need to translate these standards, written in plain English, into Oracle's programming language, PL/SQL. My script would need to record details on every error found:  table name, record ID, field, actual value, and the name of the standard that was violated. I would then need to track and report the test results. Implied in the responsibility of writing test scripts was that I would test my own scripts to verify they worked correctly.  I would need to repeat this process each time a custom test script was needed --- which would be often. Composing test scripts quickly and frequently posed a high risk of error.

My analysis of the situation

Given the complexity and time needed to write custom test scripts from scratch, I knew I needed a proactive approach to producing these scripts. As I had done in other projects before and since, I would use the concept of modularity: creating components at a granularity that would let me assemble them, as needed, into whatever was required.  I had come by this approach from my training as an intelligence analyst:  always break a big problem into small ones you can easily solve.

Another need was to automate the reporting of each test script's results. Part of each test script's job would be recording in a table all the errors found.  During the project, there would be many tests for many different things.  A tremendous number of errors would be found, especially in the early stages of each phase.  The volume and complexity of the error data would make reporting the results of a test complicated:  I would need to query for those particular results, export and format them, summarize them in an email message, and distribute those results to my internal customers.  This would be time-consuming and error-prone. I knew I would have to automate this task.

My solution:  modularity & automation

For my quality testing scripts, I decided to be proactive.  I immediately began to compose and test PL/SQL "snippets" for each individual data standard. A snippet would consist only of those lines of PL/SQL needed for the quality standard.  There would be no variable definitions or output code --- just the few lines of code pertaining to the quality standard.

I saved each snippet in its own text file.  I could then combine these snippets as needed to create a test script.  The last steps in creating a test script would be the insertion of reusable code for defining needed variables (standardized for use with any quality standards) and the code needed to record the test results.
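
The assembly step, in outline, was simple concatenation. An illustrative VBA sketch (the wrapper file paths are hypothetical):

    ' Assemble a test script: standard header + chosen snippets + result-logging footer.
    Public Function BuildScript(snippetPaths As Collection, outputPath As String) As String
        Dim text As String, p As Variant
        text = ReadFile("C:\QUATRA\wrapper\header.sql")   ' standardized variable definitions
        For Each p In snippetPaths
            text = text & vbCrLf & ReadFile(CStr(p))      ' one snippet per data standard
        Next p
        text = text & vbCrLf & ReadFile("C:\QUATRA\wrapper\footer.sql")   ' records test results
        Dim f As Integer
        f = FreeFile
        Open outputPath For Output As #f
        Print #f, text
        Close #f
        BuildScript = outputPath
    End Function

    Private Function ReadFile(path As String) As String
        Dim f As Integer
        f = FreeFile
        Open path For Input As #f
        ReadFile = Input$(LOF(f), #f)
        Close #f
    End Function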

To automate both the production of these custom scripts and the reporting of their results, I created an Oracle and MS Access/VBA solution, QUATRA (for QUAlity Testing & Reporting App).  It was characterized by:

  • An MS Access/VBA user interface from which I performed my automated tasks.

  • A strict folder structure and file-naming convention that kept my PL/SQL snippets organized for automated retrieval.

  • Oracle tables housing each test's results and summary data.

QUATRA in operation

When the developers asked me for a custom test script based on certain data standards, I would select those standards in my Access interface. Then, at the click of a button, my VBA programming retrieved each of the needed PL/SQL "snippets", inserted their text and the standard supporting code into a new text file, saved the resulting script, and opened it for me to use. After I had run the script in Oracle, I could click another button to have the results of that test script extracted, summarized, and prepared in an email for delivery. Although a script might take hours to run to completion, my own labor for each script took only minutes.  With the time QUATRA saved me, I wrote and tested the "snippets" for the data standards of the next stage of the project.  I also had time for other projects.

Development time pays off

Designing and building QUATRA and the initial PL/SQL snippets took me several weeks. Since this was at the start of the project, when managers and developers were still planning, I had the time to do this. The investment paid off handsomely.  One key benefit was that I avoided all the mistakes I might otherwise have made had I created all my scripts from scratch and by hand. Once again, the combination of my analytical experience, Quality background, and skills with databases and programming had provided a solution to what started out as a large, complex problem.


Copyright Will Beauchemin 2024