On Quality Velocity

Love Field in Dallas was fairly quiet this morning at 5:00 AM. Starbucks had a huge line of people waiting to order and another crowd waiting for their coffee. I always forget what it is like to travel this early in the morning. My preference for evening flights home is well founded. I’m not sure why that preference gets ignored so frequently.

From time to time it may be necessary to manage a work queue within a defined workflow. That workflow defines the path along which work gets completed. The process is well defined, and whatever unit of work is being done builds up in the queue. Managing that queue has come to the forefront of my thoughts today. Perhaps a treatise on workflows will be forthcoming later; today is not that day. Today is a day to think about completing the work. Achieving a degree of quality velocity while managing a work queue requires planning and the right framework of accountability. Setting up that framework ensures that work is done quickly and with a high degree of quality. It also means that if speed or quality metrics are not being met, the data is available to explore the causes of that imbalance.

In this example, velocity is a measure of speed to resolution along the workflow, and quality is a measure of accuracy based on a calculation of error rates. Depending on the work being done, the measure of quality could be defect density or some other calculation. Quality does have to be measured, though. Mature, well-defined workflows should have an equally well-defined mechanism for tracking quality velocity over time. Ensuring that mechanism is set up and running is the part of the equation that has caught my attention today.
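To make that concrete, here is a minimal Python sketch of what those two measurements could look like for a hypothetical work queue. The `WorkItem` fields and the error-rate formula are illustrative assumptions on my part, not a standard definition.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

@dataclass
class WorkItem:
    started: datetime   # when the item entered the workflow
    resolved: datetime  # when the item reached the end of the workflow
    defects: int        # errors found when the item was reviewed

def velocity_hours(items):
    """Velocity: average speed to resolution, in hours."""
    return mean((i.resolved - i.started).total_seconds() / 3600 for i in items)

def error_rate(items):
    """Quality: share of items that finished with at least one defect."""
    return sum(1 for i in items if i.defects > 0) / len(items)

items = [
    WorkItem(datetime(2019, 5, 1, 9), datetime(2019, 5, 1, 13), 0),
    WorkItem(datetime(2019, 5, 1, 10), datetime(2019, 5, 2, 10), 2),
]
print(velocity_hours(items), error_rate(items))  # 14.0 0.5
```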

Within any workflow, a defined beginning and end exist. All of the points along the path from beginning to end have to be well understood. Mapping those points into some type of workflow is usually either very straightforward or a real adventure. It could be as easy as defining the unit of work and tracing the route of one unit through the system. Thinking of a workflow as a living system can change your view of things; complexity within living systems can change your evaluation path, and truly complex systems can be incredibly hard to map. Brevity will always be the soul of wit. Taking something that is truly complex and presenting an easy-to-understand version of it requires a keen understanding.
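Tracing one unit through the system really can be that simple. Below is a sketch of a single unit's trace against an invented five-stage workflow; the stage names are placeholders that a real mapping exercise would have to discover.

```python
from datetime import datetime

# An invented workflow as an ordered list of stages.
STAGES = ["intake", "triage", "work", "review", "done"]

# Timestamps for one unit of work crossing each stage boundary.
trace = {
    "intake": datetime(2019, 5, 1, 9, 0),
    "triage": datetime(2019, 5, 1, 9, 30),
    "work":   datetime(2019, 5, 1, 10, 0),
    "review": datetime(2019, 5, 1, 15, 0),
    "done":   datetime(2019, 5, 1, 16, 0),
}

# How long the unit sat between each pair of adjacent stages.
for a, b in zip(STAGES, STAGES[1:]):
    hours = (trace[b] - trace[a]).total_seconds() / 3600
    print(f"{a} -> {b}: {hours:.1f}h")
```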

After the workflow has been mapped, the next step in the process is to figure out how to retain information on quality and velocity. It is very possible that as a workflow crystallized out of formational chaos, no mechanism for tracking quality or velocity was built into the system. Adding those layers of tracking may be easy, or it could be incredibly challenging. Setting up a mechanism for collecting that information without having adverse effects on the process introduces an interesting planning challenge.
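One low-friction approach, sketched below, is an append-only event log written at each hand-off point, so the existing process is observed rather than reworked. The file location and event shape here are assumptions for illustration.

```python
import csv
from datetime import datetime, timezone

LOG_PATH = "workflow_events.csv"  # hypothetical log location

def record_event(item_id, stage):
    """Append one stage-transition event; the workflow itself is untouched."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow(
            [item_id, stage, datetime.now(timezone.utc).isoformat()]
        )

# Called at each hand-off point in the existing workflow:
record_event("ticket-1042", "review")
```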

Introducing that framework of quality velocity accountability opens the door to more advanced methodologies. Data mining, or something even more advanced like process mining, could be introduced. A variety of advanced analytic techniques are becoming more mainstream and accessible. People really do seem to be accepting the idea that analytic engines can separate the signal from the noise, and newer methodologies are making it easier to evaluate which signals are the most meaningful. Linking a well-defined course of action to a signal from an analytic engine still seems to be a relatively recent phenomenon.
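A very primitive first cut at that kind of mining, assuming the event log sketched earlier, might just replay the log and rank stages by average dwell time; the slowest stage is the first signal worth linking to a course of action. A dedicated process-mining tool would take over from there.

```python
import csv
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Replay the append-only event log: item_id -> [(stage, timestamp), ...]
events = defaultdict(list)
with open("workflow_events.csv") as f:
    for item_id, stage, ts in csv.reader(f):
        events[item_id].append((stage, datetime.fromisoformat(ts)))

# Hours each item spent in each stage, keyed by stage.
dwell = defaultdict(list)
for rows in events.values():
    rows.sort(key=lambda r: r[1])
    for (stage, t0), (_, t1) in zip(rows, rows[1:]):
        dwell[stage].append((t1 - t0).total_seconds() / 3600)

# The stages with the longest average dwell are the loudest signals.
for stage, hours in sorted(dwell.items(), key=lambda kv: -mean(kv[1])):
    print(f"{stage}: avg {mean(hours):.1f}h across {len(hours)} items")
```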
