OODA v PDSA by example

Glenn Wilson
Oct 15, 2021


While discussing the merits of learning and improvement cycles with an esteemed colleague of mine, I realised that many people seem to choose between using either an Observe → Orient → Decide → Act (OODA) loop or a Plan → Do → Study → Act (PDSA) cycle. However, I believe that both are highly relevant in today’s organisation and that neither should be used exclusively for every process of learning. In this article, I will describe real examples of how each cycle can benefit security practices within an organisation. For those of you unfamiliar with one or both cycles, I shall explain them next.

OODA

The OODA loop came to prominence within software delivery processes following the publication of Jeff Sutherland’s Scrum: The Art of Doing Twice the Work in Half the Time in 2014. In this book, Sutherland, a former fighter pilot, describes a technique that gave US fighter pilots an advantage over their adversaries during dogfights. It was developed by US Air Force strategist John Boyd, drawing on his experience as a fighter pilot during the Korean War (1950 to 1953). It is described as a loop, although it is effectively a loop of smaller loops (see figure 1). Its use in software delivery provided a foundation for creating feedback loops in Agile Scrum. The technique involves taking in information through observation, understanding how the pieces of information relate to one another (orient), forming a hypothesis (decide), and then testing the hypothesis through action (act). The cycle is designed to be adaptive: during each step, feedback flows back into the observation process to continually refine the orientation, decision, and action steps.
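
To make the shape of the loop concrete, here is a minimal sketch in Python; the observe, orient, decide, and act functions are hypothetical placeholders rather than any standard implementation, and the key point is simply that the outcome of each action feeds the next observation.

```python
# A minimal, hypothetical sketch of the OODA loop. Each function is a
# placeholder; the key point is that the outcome of 'act' feeds back
# into the next 'observe'.

def observe(feedback):
    """Gather information, including feedback from the previous action."""
    return {"data": [], "feedback": feedback}

def orient(observations):
    """Relate the observations to each other and to prior knowledge."""
    return {"situation": observations}

def decide(situation):
    """Form a hypothesis about the best response to the situation."""
    return {"hypothesis": "respond", "context": situation}

def act(decision):
    """Test the hypothesis through action and return the outcome."""
    return f"outcome of acting on '{decision['hypothesis']}'"

def ooda_loop(iterations: int = 3):
    feedback = None
    for _ in range(iterations):
        feedback = act(decide(orient(observe(feedback))))
    return feedback
```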

Figure 1 John Boyd’s OODA loop

PDSA

The PDSA cycle gained traction in Japan through the influence of W. Edwards Deming, who taught it to Japanese manufacturing organisations in the years immediately after World War II. Deming attributed the cycle to Walter Shewhart, under whom he had studied during the 1920s and 1930s. The PDSA cycle is defined as a cycle for continual learning and improvement (see figure 2). Its origins lie in the scientific approach to experimentation, applied in a business context to continually improve knowledge and understanding. In simple terms, it allows us to create a theory, define the success criteria, and produce a plan to test the theory. The plan is implemented during the ‘do’ phase of the cycle. The results of the action are studied to test the validity of the theory and the plan. Based on the outcomes of the study, an action is undertaken, resulting in acceptance of the theory, adjustment of the plan, broadening of the theory, or creation of a new theory.
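
As a rough illustration, and with field names of my own choosing rather than Deming’s terminology, a single PDSA iteration can be captured as a structured experiment record:

```python
# A hypothetical sketch of one PDSA iteration as a structured experiment.
# The field names are illustrative, not part of any standard.
from dataclasses import dataclass, field

@dataclass
class PDSAIteration:
    theory: str                                   # the hypothesis to test
    success_criteria: str                         # how the result will be judged
    plan: list[str]                               # steps to carry out the test
    results: dict = field(default_factory=dict)   # filled in during 'do'
    conclusion: str = ""                          # filled in during 'study'
    next_action: str = ""                         # 'act': accept, adjust, broaden, or replace

# Illustrative instance only; the content of the theory is a made-up example.
experiment = PDSAIteration(
    theory="Enabling dependency scanning reduces known-vulnerable libraries in our builds",
    success_criteria="Fewer known-vulnerable libraries reported each sprint",
    plan=["enable the scanner", "record findings per build", "compare against the baseline"],
)
```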

Figure 2 Deming’s PDSA cycle

The PDSA cycle has often been called the PDCA cycle, where the “C” stands for “Check”. However, Deming was insistent that the term ‘check’ is inappropriate within this process. His main argument was that the term ‘study’ implies analysing the results and interpreting the outcomes in a considered manner. The term ‘check’, on the other hand, suggests a brief look at the results, a tick in the box, or a token glance at the outcome. It also suggests closure, reducing the PDSA cycle to plan-do-tick, plan-do-tick: the study and act phases are reduced to a minimum or removed from the cycle altogether. For the PDSA cycle to be effective, studying the results is an essential step.

Difference between OODA and PDSA

Obviously, the two cycles are not identical, yet there are some similarities. For example, both result in actions based on the outcomes of previous steps in their respective cycles. However, they differ in purpose and use. The OODA loop, designed for fast-moving scenarios such as a dogfight between opposing fighter pilots, provides a framework for making decisions in reaction to a situation. The PDSA cycle, designed as a tool for understanding and knowledge, allows people to make decisions in relation to testing a theory. The first is reactive; the second is proactive. Used appropriately, they are equally valuable tools for improving security within an organisation. The following examples demonstrate how each methodology can be applied in selecting and using a static application security testing product.

PDSA by example

Let’s start with the general goal of identifying and reducing the number of potential vulnerabilities (or security weaknesses) within an application’s source code. To set the scene for this scenario, imagine that we have failed a Payment Card Industry Data Security Standard (PCI DSS) audit because of weak encryption algorithms. To resolve this problem, we decide to apply the PDSA cycle. First, we start with a theory: in this case, we hypothesise that the strength of the encryption algorithms used within the application source code meets PCI DSS requirements. We then devise a plan to test this theory using a static application security testing (SAST) tool (it is assumed that a SAST tool is available to the engineering teams). The plan describes how the SAST tool is configured to identify the encryption algorithms used within the application and to assert that they meet PCI DSS requirements. Most commercial tools and a few open-source tools can identify specific types of vulnerability within source code. Other options exist, of course, such as writing unit tests to assert that the encryption algorithms used in the source code meet the desired standards. In addition to the theory, the plan describes the steps needed to carry out the test, how the results are collected, and how they are measured.

When the planning phase is complete, we move on to the ‘do’ phase. In other words, we run the relevant tests or scans to target weak encryption within the application source code. If the application contains this weakness, this phase will generate data such as the names of the source files containing weak encryption and the line numbers of the affected code blocks. The data may also include a score indicating the severity of each weakness.

After running the unit tests or scans and collating the data, we enter the ‘study’ phase of the PDSA cycle. It is important that the data is analysed objectively, avoiding jumping to conclusions based on false assumptions. The results must be analysed in the context of the hypothesis: did we prove or disprove it? In this example, we evaluate the encryption algorithms identified by the unit tests or SAST tool to determine whether those used in the code meet the PCI DSS requirements. Finally, we ‘act’ based on our evaluation of the results. We may need to change the code to pass the tests or scans. If the results are inconclusive, we may choose to modify the hypothesis, modify the plan, or create a new hypothesis. At this stage, we have come full circle and we start the PDSA cycle again.
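
To make the ‘plan’ and ‘do’ phases concrete, here is a minimal sketch of the unit-test option mentioned above. It assumes a hypothetical list of weak algorithm names (MD5, SHA-1, DES, RC4) rather than an authoritative PCI DSS list, searches Python source files for them, and reports the file, line number, and a severity, which is the kind of data a SAST tool would produce. In practice, the check would be driven by the organisation’s approved-algorithm policy or the SAST tool’s own rule set.

```python
# A minimal, hypothetical sketch of the 'do' phase: scan source files for
# references to weak algorithms and report file, line number, and severity.
import re
from pathlib import Path

# Illustrative list only; a real check should come from the organisation's
# approved-algorithm policy or the SAST tool's rule set.
WEAK_ALGORITHMS = {"md5": "high", "sha1": "high", "des": "high", "rc4": "high"}
PATTERN = re.compile(r"\b(" + "|".join(WEAK_ALGORITHMS) + r")\b", re.IGNORECASE)

def scan_for_weak_crypto(root: str) -> list[dict]:
    """Return findings shaped like SAST output: file, line, algorithm, severity."""
    findings = []
    for path in Path(root).rglob("*.py"):
        lines = path.read_text(errors="ignore").splitlines()
        for lineno, line in enumerate(lines, start=1):
            match = PATTERN.search(line)
            if match:
                algo = match.group(1).lower()
                findings.append({
                    "file": str(path),
                    "line": lineno,
                    "algorithm": algo,
                    "severity": WEAK_ALGORITHMS[algo],
                })
    return findings

if __name__ == "__main__":
    # The 'study' phase then analyses these findings against the hypothesis.
    for finding in scan_for_weak_crypto("src"):
        print(finding)
```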

OODA by example

The reactive nature of the OODA loop provides an opportunity to apply the benefits of a SAST tool in a different context. In this scenario, we have a large and growing backlog of vulnerabilities. These are mainly security defects accumulated through various scans and penetration tests, as well as known issues we were going to fix later. As a result, we have not met an earlier compliance finding requiring us to reduce the number of security defects. In fact, the increase in security defects is becoming a major cause for concern.

The first task within the OODA cycle is to ‘observe’, which, in this case, means identifying all the different vulnerability types. Information is gathered from the vulnerability backlog, from the Common Weakness Enumeration (CWE) database, from internal security teams, and from other resources. Once we have grouped all the vulnerabilities by type, the next step is to ‘orient’ and understand their impact. In this case, we want to react to the most critical vulnerability group, the one that could have the greatest impact on the organisation. We derive this from our experience and from knowledge of our products, our industry, and current global security threats. Once we have identified the most critical vulnerability type, we have enough information to move to the next step of the cycle, which is to ‘decide’ what to do. We hypothesise about the best way to resolve the risk and make a decision based on that hypothesis. For example, we may decide to rewrite the code, apply a compensating control, or even remove vulnerable features that offer no value to customers. Throughout this decision process, we make further observations that can be fed back into the cycle. For example, we may notice that the most critical vulnerability type only affects a legacy application, which may lead to a decision to decommission the application rather than fix it. Effectively, we are making an informed decision based on our observation and orientation.
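
As a sketch of the ‘observe’ and ‘orient’ steps, assuming an illustrative severity weighting of my own rather than any formal risk model, the backlog can be grouped by vulnerability type (here, by CWE identifier) and ranked by aggregate impact; the group at the top is the candidate we ‘decide’ to tackle first.

```python
# A hypothetical sketch of the 'observe' and 'orient' steps: group raw
# findings by vulnerability type (CWE id) and rank the groups by a simple
# impact score. The weights are illustrative; real orientation also draws
# on product knowledge, industry context, and current threat intelligence.
from collections import defaultdict

SEVERITY_WEIGHT = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def rank_vulnerability_types(findings: list[dict]) -> list[tuple[str, int]]:
    groups: dict[str, int] = defaultdict(int)
    for finding in findings:
        groups[finding["cwe"]] += SEVERITY_WEIGHT.get(finding["severity"], 1)
    # Highest aggregate impact first: the top group is the one to act on.
    return sorted(groups.items(), key=lambda item: item[1], reverse=True)

backlog = [
    {"cwe": "CWE-327", "severity": "high"},      # weak cryptographic algorithm
    {"cwe": "CWE-79", "severity": "medium"},     # cross-site scripting
    {"cwe": "CWE-327", "severity": "critical"},
    {"cwe": "CWE-89", "severity": "high"},       # SQL injection
]
print(rank_vulnerability_types(backlog))  # CWE-327 ranks highest
```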

The final step in the OODA loop is to ‘act’ based on our decision. During this phase, we should maintain observability into whether the action we have taken is resolving the original problem. If not, we return to the orient phase to reassess our initial observation before making a new decision. Ultimately, iterating through this cycle, we work our way through the most critical vulnerability types within our organisation until we reach a point where compliance is satisfied.
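
One lightweight way to keep that observability, sketched below on the assumption that scans are re-run after the action is taken, is to compare the count of the targeted vulnerability type before and after, and return to the ‘orient’ step if it is not falling.

```python
# A hypothetical sketch of observing whether the 'act' phase is working:
# compare counts of the targeted vulnerability type before and after the
# action, and signal whether to continue or return to 'orient'.

def review_action(target_cwe: str, before: list[dict], after: list[dict]) -> str:
    count_before = sum(1 for f in before if f["cwe"] == target_cwe)
    count_after = sum(1 for f in after if f["cwe"] == target_cwe)
    if count_after < count_before:
        return "continue"      # the action is reducing the problem
    return "re-orient"         # reassess observations and make a new decision
```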

Conclusion

Using frameworks to help us understand and improve our situation, and to make good decisions, is important. The OODA and PDSA cycles are useful tools that allow us to continually improve the security of our organisation. I have often heard engineers use phrases like ‘trial and error’, ‘suck it and see’, or ‘try it and see what happens’. In these cases, there is no structure: we may learn what does not work, but not necessarily what does. Using real-world examples, this article has shown how to use the PDSA and OODA cycles to tackle different issues relating to vulnerability management. Each cycle shapes the way we create theories to resolve problems through continuous experimentation, allowing us to continuously improve our daily lives, pay down technical debt, and increase flow, ultimately giving us greater enjoyment in what we do.
