
"Every application must have an inherent amount of irreducible complexity. The only question is who will have to deal with it."

--- Larry Tesler

Larry Tesler, a pioneer in human-computer interaction (and the inventor of copy-and-paste), observed that the complexity of an automated system cannot be eliminated. It can only be shifted between the system and the end user --- and Tesler argued that it should always be shifted to the system, in favor of the end user.

I not only subscribe to this principle, I take it further: within a system, complexity can also be shifted between two parties, the people designing and coding the system and the people who will need to test and maintain it. A system that is easy to test and maintain is only the result of careful planning, which puts the complexity burden on the designer. When I have been the business analyst on a project, I have always taken on that burden.

My approach to dealing with complexity is the way I was taught in military intelligence school: break a big question into smaller and smaller ones. From my training as a quality improvement associate, I've learned techniques for root cause analysis and process improvement. From my training as a business analyst, I've learned to shepherd a project through the software development life cycle. Read about some of my successes below.

International Institute of Business Analysis
American Society for Quality - CQIA
Retailer: Create new automated pricing system
  • Industry: retail
  • Topics: requirements gathering, risk analysis, iterative design, agile, backlog management, system design, prototyping, process improvement
  • Software: Teradata, MS SQL Server, Access/VBA
Nabisco: Automation to remove bottlenecks and promote adherence
  • Industry: food manufacturing
  • Topics: root cause analysis, process improvement, getting buy-in, backlog management
  • Software: MS Access/VBA
HIDTA Task Force: Unconventional process documentation for intelligence support
  • Industry: law enforcement
  • Topics: project management, process improvement, documentation


Retailer: Create new automated pricing system

[Flowchart]

This case study focuses on my business analyst role in a project where I was also the developer and data hygienist. For a case study on my work as a developer of a similar pricing tool, see MTS Systems: Pricing tool for thousands of products.

SITUATION

Retailer was unexpectedly losing access to a third-party product pricing tool. They had invested a large amount of time in defining and refining pricing algorithms and other strategy components in that tool, so they wanted a new pricing system that could leverage similar functionality. Retailer was also dissatisfied with many features of the third-party tool; they did not want a simple replacement of the old system, but something new and better. The immediate need was to have enough functionality in place to avoid disruption of business when the third-party tool went away. Once that was met, Retailer wanted to build new functionality into the system.

Although Retailer envisioned eventually having a browser-based tool as the interface for the system I would create, they understood that an MS Access front end was the fastest way to get the application they needed. Access would also simplify adding enhancements. The Access user interfaces I would create would serve as working prototypes for the eventual browser-based interfaces. The databases and stored procedures I would create would persist.

ANALYSIS - Phase 1 of 2

After gathering requirements and composing my use cases, I identified three critical questions whose answers would determine the project's success or failure.  They were:

1) Were Retailer's existing pricing algorithms usable in Teradata (the database management system I would have to use)?

2) Could I write Teradata stored procedures that could produce what was needed within the nightly time window?

3) Could I create those stored procedures --- tested, debugged, and ready for production --- before the 3rd-party tool went away?

Following my principle of failing quickly, I dealt with those critical areas first:

Were Retailer's existing pricing algorithms usable in Teradata?

I was able to quickly verify that Retailer's algorithms would work in Teradata, although some data hygiene would be needed. This meant the project did not need to divert time and effort to recreating all the algorithms. Determining this quickly left more time to build and test more functionality in the first release of the system.

Could I write Teradata stored procedures that could produce what was needed within the nightly time window?

Before I began outlining my new Teradata database and stored procedures, I worked with the client to create a mock-up of the reviewer's app I would later need to build. Getting approval on this mock-up up front, rather than later, determined what ancillary information my stored procedures and tables would need to capture.

I then created an MS Access prototype to develop and test the concept I had in mind for the Teradata tables and stored procedures. This prototyping confirmed my concept and refined its details.  I had only to translate my MS Access table structures and VBA programming into Teradata.

With a tentative plan created, I met with a Teradata database administrator (DBA) to learn about the process that prepared the input data my process would need. I learned that it would take hours to compile all the inputs my stored procedures required, and that pre-processing time would cut into the nightly window for my own process. I suggested, and the DBA agreed, that I should have my own pre-processing scripts that would identify all possible components for that night's pricing process before the input data arrived. When the input data became available to my system, my scripts would only need to filter by the factors in that input data to arrive at the set of components needed for that night's processing. I therefore traded efficiency for the certainty of on-time completion.
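
The sketch below shows the shape of that pre-processing in Teradata-style SQL. The table, column, and component names (product, pricing_rule, nightly_input, and the staging tables) are invented for illustration; they are not the actual production objects.

    -- Early evening, before the nightly feed arrives: stage every pricing
    -- component that could possibly be needed tonight.
    CREATE TABLE stg_all_components AS
    (
      SELECT p.product_id,
             p.category_id,
             r.algorithm_id,
             r.price_expression
      FROM   product      p
      JOIN   pricing_rule r
        ON   r.category_id = p.category_id
    ) WITH DATA;

    -- Later, once the nightly input lands: a quick filter against the staged
    -- set replaces hours of assembly inside the critical time window.
    CREATE TABLE stg_tonight AS
    (
      SELECT s.*
      FROM   stg_all_components s
      JOIN   nightly_input      n
        ON   n.product_id = s.product_id
    ) WITH DATA;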

Could I create those stored procedures --- tested, debugged, and ready for production --- before the 3rd-party tool went away?

To create working code as quickly as possible, I needed to quickly verify that the inputs and outputs of each step in my automated process were correct. To support this, I outlined my Teradata scripts to produce temporary tables for each step. Similarly, I divided the automated process into separate stored procedures, each comprising a small set of related tasks. I could therefore validate one large part of the automated process before starting to write the next. This modularization saved me from the time-consuming process of debugging a single large, complex stored procedure.
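
The pattern looked roughly like the sketch below, with placeholder object names rather than the production code: each stored procedure handled one small group of related tasks and left its output in a step table that I could inspect before writing or running the next procedure.

    -- One self-contained step: assemble tonight's inputs into a step table.
    REPLACE PROCEDURE sp_step1_assemble_inputs ()
    BEGIN
      -- Rebuild this step's output from scratch each night.
      DELETE FROM step1_inputs;

      INSERT INTO step1_inputs (product_id, algorithm_id, unit_cost, competitor_price)
      SELECT t.product_id,
             t.algorithm_id,
             c.unit_cost,
             m.competitor_price
      FROM   stg_tonight     t
      JOIN   product_cost    c ON c.product_id = t.product_id
      LEFT JOIN market_price m ON m.product_id = t.product_id;
    END;

    -- Validate step 1's output (row counts, missing values) before moving on.
    SELECT COUNT(*)                                           AS rows_loaded,
           SUM(CASE WHEN unit_cost IS NULL THEN 1 ELSE 0 END) AS missing_costs
    FROM   step1_inputs;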

With all the critical areas addressed successfully and with key lessons learned, I could begin to organize my requirements backlog for each of my sprints.  Although I generally completed one component of my system --- database, scripts, reviewer interface, and administrator interface --- before proceeding to the next, I did have one major digression. While testing my scripts, I found they were failing due to syntax problems in some algorithms.  These were problems my earlier quality checks had missed. I therefore created a syntax testing tool, intended for the administrator interface, far ahead of my original schedule. This helped repair the algorithms and so got my scripts into user acceptance testing sooner. I later integrated this testing tool into the administrator interface.  It enables new and revised algorithms to be tested for syntax and tentative results before they go into production.
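
In spirit, the tool splices a candidate pricing expression into a trial query over a small sample of products, roughly like the sketch below (the sample table and the expression shown are invented). A syntax error surfaces immediately, and the trial prices let the pricing team sanity-check results before an algorithm is promoted to production.

    -- Trial run of one candidate algorithm against a small sample table.
    SELECT product_id,
           current_price,
           (unit_cost * 1.18) + 0.99 AS trial_price   -- candidate pricing expression spliced in
    FROM   sample_products
    ORDER BY product_id;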

ANALYSIS - Phase 2 of 2

BUSINESS PROCESS & INTERFACE IMPROVEMENT

After the first release of the reviewer interface, I talked with price reviewers about their experience.  They expressed their frustration with not being able to quickly make decisions on batches of prices that fit certain criteria. This functionality had not existed in the previous 3rd-party system, but reviewers considered it a need.  I therefore suggested to the client that I create this for the next major release of the app.  They agreed.

With that next major release, reviewers can now filter products by up to five criteria and make a pricing decision on the entire filtered set. The filtering criteria can be selected from dozens of fields and include user-specified ranges and comparisons, such as ">=" and "BETWEEN." Creating this new feature was greatly simplified by how I had structured the code in the first release: I had modularized the actions performed for different types of decisions. As a result, when creating the new filter-and-decide feature, the only new coding needed was to call the appropriate module for each product listed on the filtered screen. Because I did not need to change the code shared with existing features, only the new functionality needed testing. Regression testing of the original features was not necessary.
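
The query below gives a feel for what the filter screen assembles behind the scenes before handing each returned product to the existing decision module. The table, field names, and values are invented for illustration; the real screen offers dozens of fields and up to five criteria at a time.

    -- Three example criteria chosen by a reviewer; the WHERE clause is built
    -- from whatever fields and operators the reviewer picks.
    SELECT product_id
    FROM   price_review
    WHERE  review_status = 'Pending'
      AND  margin_pct    >= 0.25                  -- ">=" comparison
      AND  current_price BETWEEN 5.00 AND 10.00   -- "BETWEEN" range
      AND  category_id   = 240;                   -- exact match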

As business analyst on this Retailer project, I addressed critical areas first, prototyped the solution, and modularized my automated processes. Combining that approach with my programming and data hygiene skills, I delivered a working pricing system in time to avoid business disruption. My subsequent efforts in continuous improvement led to final versions of the user interfaces that, the client has told me, exceeded their expectations.


Nabisco: Automation to remove bottlenecks and promote adherence

Situation

Nabisco Refrigerated Foods, a division of Nabisco, made margarine, yogurt, and microwavable omelets. Before a change could be made to the ingredients, packaging, or other aspects of a product, the managers of impacted departments had to review and approve the requested change. These requests came from production facilities around the country as well as from headquarters staff in two New Jersey cities. One designated person in the Quality Department, the Quality Coordinator, administered the process. When that employee left the company, a six-month backlog of change requests built up.

I was hired to fix the problem and maintain the process.

Analysis

  • There was only token adherence to the process because the process had simply collapsed.
  • Any solution would need to address the six-month backlog even as new requests came in. There could be no time-out.
  • The greatest weakness in the process was that it had depended on one person doing a large number of tasks by hand.

My solution would therefore need to do five critical things:

  1. Identify and act on the most important requests first.
  2. Be fast.
  3. Be thorough in following through on review and approval.
  4. Encourage trust & adherence.
  5. Allow for quick knowledge transfer when necessary.

Solution

I was able to do all five things in large part by creating Q-Track, an MS Access/VBA application. Q-Track provided:

  • A database of all change requests, their priority score (an innovation of mine), and status.
  • Ability to identify when one requested change might conflict with another requested change.
  • Automated log entries for each step performed.
  • Automated reminders to reviewers (and me).
  • Automated edits and edit-tracking in the final version of specifications.
  • Generation of a weekly newsletter to the community of interest.

Restoring the process and building trust required that I organize the entire backlog of requests, prioritize them, and move each request forward as quickly as warranted. I therefore developed Q-Track iteratively, building each new feature as it was needed to move the backlog forward.

I prioritized requests by a combination of importance and urgency. Importance was scored based on the reason for a request: #1 was quality (weighted much higher than all else), #2 was cost savings, and #3 was "other" (such as marketing-related changes to packaging). Q-Track then scored requests by age, with higher scores going to older requests. The total score let me focus my time, and management's, on the most important and urgent items. It let me tell requesters where their request ranked in the queue. It also let me determine programmatically how often automatic reminders were sent to managers: the higher the score, the more frequent the reminders.
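
Q-Track itself was an MS Access/VBA application, but the scoring logic can be sketched in the same SQL style as the earlier examples. The weights, reason labels, and field names below are placeholders, not the actual Q-Track values.

    -- Importance comes from the reason for the request (quality far outweighs
    -- the rest); urgency adds one point per day the request has been waiting.
    SELECT request_id,
           CASE reason
                WHEN 'Quality'      THEN 100
                WHEN 'Cost Savings' THEN  20
                ELSE                      10   -- marketing, packaging, and other reasons
           END
           + (CURRENT_DATE - date_submitted)   -- days waiting (Teradata-style date arithmetic)
             AS priority_score
    FROM   change_request
    WHERE  request_status <> 'Closed'
    ORDER BY priority_score DESC;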

Another innovation was to immediately identify any change request that might impact another change request. For example, a change in a product's ingredients might be at odds with proposed new quality standards. With a database, this was simple to identify, whereas it had been nearly impossible for a person who could only read and remember the details of each request.
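
A database makes this a simple self-join, sketched below in the same SQL style, with invented table and field names rather than Q-Track's actual Access schema.

    -- Any pair of open requests touching the same product is flagged the
    -- moment the second request is logged.
    SELECT a.request_id  AS request_a,
           b.request_id  AS request_b,
           a.product_code
    FROM   change_request a
    JOIN   change_request b
      ON   b.product_code = a.product_code
     AND   b.request_id   > a.request_id      -- report each pair once
    WHERE  a.request_status = 'Open'
      AND  b.request_status = 'Open';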

Showing stakeholders that change requests were being processed again, and that there was value in following the procedure (such as the conflict identification described earlier), restored trust in the procedure --- and therefore adherence. To maintain this trust, I eventually created a weekly newsletter for all stakeholders. It reported, by product category, all approved changes of the previous week and the approval status of pending ones --- including the names of managers who were delaying completion of a review. It also described new requests so that conversations on pros and cons could start sooner rather than later.

As for knowledge transfer, the automated features reduced the Coordinator's hours per week by 66% and eliminated the complexity of the role. The role could be quickly shifted to another employee, even on a part-time basis.

The Q-Track application was so successful that it was soon adopted by both the Planters/Lifesavers and Food Service divisions of Nabisco.


HIDTA Task Force: Unconventional process documentation for intelligence support

[Hierarchy diagram]

Situation

The Northern New Jersey High Intensity Drug Trafficking Area Task Force (HIDTA) was a joint federal, state, and local drug enforcement task force. It investigated and eliminated international heroin smuggling organizations operating through Newark's airport and seaport. I was contracted to train and oversee a staff of intelligence analysts as well as to provide intelligence support myself.

The Task Force had six to eight major investigations going on at any time, plus several more in the early stages. The analysts I would supervise were all military reservists, most on six-month temporary assignments to our task force. While most of them were trained intelligence analysts, drug smuggling investigations were new to all of them. Because of the urgency of the Task Force's work and the limited time an analyst would be available to me, I had to get them trained and productive quickly. I would also have to repeat this every six months as the reservists rotated in and out, and I had to accomplish it all while supporting several investigations of my own. I summarized my situation as follows:

  • My new analysts needed to know immediately and specifically what they had to do to move their investigations forward. For my own productivity, they needed to do that with as little help from me as possible.
  • When the Task Force investigators returned to the office periodically, they needed to know immediately what intelligence would help their investigations move forward and so plan accordingly.

Analysis

  • Everything the intelligence analysts did needed to produce actionable intelligence, ready to deliver when the investigator was ready to act on it.
  • I had to take the complex job of intelligence support for investigations and break it into simple tasks.  For each task, I would need to compose instructions and standards.
  • Intelligence analysts had to be able to easily determine which tasks to perform, in what order, what the objective of each was, and where to find the instructions and standards for each task.
  • The intelligence analyst (who was new to drug enforcement) and the investigator (who was new to having intelligence support) needed a means of communicating in the same terms. Misunderstandings would lead to missed opportunities.

Solution

In addition to identifying the discrete tasks needed for intelligence production and documenting each of them, I created a large wall chart: "The HIDTA Intelligence Cycle." It consisted of three concentric rings:

  • Inner ring:  The specific task of the intelligence analyst.  Each task had its own short procedural manual and standards, so the chart and the documentation stayed synchronized.
  • Outer ring:  The subsequent activity of the investigator. This could be to organize a surveillance plan, subpoena particular types of documents, or seek telephone monitoring.
  • Middle ring:  The bridge between the analyst's task and the investigator's activity.  These were the questions to be answered by the analyst's task; the answers would enable the investigator to take the action needed at that particular stage of the investigation.

By looking at the chart, an intelligence analyst knew what task they had to perform next, what its purpose was, and where to find my instructions on how to perform it. They also saw how that task moved the investigation forward. When investigators returned to the office and wanted an update from their analyst, the chart provided a common reference in discussions of progress, obstacles, and next steps.

Shown below is an abstract of this chart, which, due to its confidentiality (not to mention my fading memory), I cannot represent in detail.

For this and my other work for the Task Force, I received awards from the Task Force and from the US Customs Service (now Immigration & Customs Enforcement), shown on my "About" page.

[Abstract of "The HIDTA Intelligence Cycle" wall chart]