Let’s just say there’s a lot to learn from history without quoting Sun Tzu… again, especially in information and cybersecurity. Much of the cyber realm’s birth revolves around the military (many members of our community are current or former members of various armed forces), and many of us still draw on that military influence when working through business planning and cybersecurity decisions. A great example is the common reference to Boyd’s OODA (Observe–Orient–Decide–Act) loop in both offensive and defensive security applications. In keeping with the military theme, I want to touch on a story from World War II and its applicability to today’s cybersecurity world.
The year was 1943, and the protagonist of our history lesson, a mathematician named Abraham Wald, was working for the Statistical Research Group (SRG). The SRG was a classified program that leveraged statisticians and mathematicians to support the ongoing war effort. Located in what are present-day Columbia University faculty apartments and doctors’ offices, this number-crunching facility buzzed with top minds hunched over desks and calculators, sweating over equations for things like proximity fuses, optimal flight curves and speeds for fighter pilots in dogfights, and radar detection. Suffice it to say, long lines of numbers and letters with even bigger applications.
With the enormous scale of World War II and aviation’s quickly evolving role in the war effort, the military was constantly gathering data and working through a delicate equation. It went something like this: the military wants to avoid its planes being shot down, so it adds armor; but armor makes the planes heavier, which changes their flight dynamics and makes them far less maneuverable. Less maneuverable planes are easier for enemy fighters to target and attack. So the military was searching for a balance point at which planes were both well armored and maneuverable. The root question: could they get the same protection with less armor by concentrating it in the areas of greatest need (the most damage) on high-value planes (think bombers or transport planes)? A real catch-22.
The military started gathering statistical data from the planes that returned from missions, noting the number of bullet holes per square foot on certain components: how many holes were in engines or fuel systems (areas more critical to keeping a plane flying) versus how many were in the fuselage, landing gear, and the rest of the plane (areas less critical to staying aloft). They then used these numbers to harden the armor in the areas with the largest number of holes. More holes in an area meant planes were taking more damage there. The problem was, the number of planes lost never changed dramatically through these tests. So they brought the research to the SRG for evaluation, hoping to solve the armor trade-off efficiently.
This is where Wald comes into play. While many other SRG members looked at the data and concluded that adding armor to the areas with more holes would reduce damage and potentially increase the number of returning planes, Wald took a step back. He noted that the data consistently showed fewer holes in critical areas like the engines and fuel systems than in the fuselage, wings, or landing gear. Looking at the problem as a whole, he came to a different conclusion altogether. Rather than answering “Where would armor be most effective?”, he asked his own question: “Where are the missing holes?” That is, why are there fewer holes in critical systems like engines and fuel systems than in the rest of the plane? The answer, of course, is that those holes were never counted. The planes that took heavy damage to critical systems never returned, so their data was never included.
Wald proposed that the armor shouldn’t go where the bullet holes are; it should go where the bullet holes aren’t. The data set only included the planes that returned, so the full data set never existed. A plane that came back to base with a shredded wing or a cabin that looked like a wiffle ball still completed its mission. Adding defenses to an area just because its damage is readily visible, or at the forefront of the available information, doesn’t make it the right choice. When the military armored the areas that were essentially the inverse of the visible data, the return rates of planes skyrocketed (pun fully intended).
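Wald’s insight is easy to demonstrate with a toy simulation. The sketch below is purely illustrative (all the numbers are made up, not historical): planes take hits uniformly across the airframe, but only hits to the critical zone tend to bring a plane down, so the holes counted on *returning* planes systematically understate how often critical systems are actually hit.

```python
import random

random.seed(1)

# Illustrative survivorship-bias simulation -- parameters are invented,
# not historical data.
N_PLANES = 10_000
HITS_PER_PLANE = 8
CRITICAL_AREA_FRACTION = 0.3   # critical systems cover 30% of the airframe
P_DOWN_PER_CRITICAL_HIT = 0.4  # chance a single critical hit downs the plane

returned_critical = returned_other = 0
for _ in range(N_PLANES):
    # Hits land uniformly, so ~30% strike the critical zone.
    critical_hits = sum(random.random() < CRITICAL_AREA_FRACTION
                        for _ in range(HITS_PER_PLANE))
    # Each critical hit independently risks downing the plane.
    survived = all(random.random() >= P_DOWN_PER_CRITICAL_HIT
                   for _ in range(critical_hits))
    if survived:  # only returning planes get their holes counted
        returned_critical += critical_hits
        returned_other += HITS_PER_PLANE - critical_hits

share = returned_critical / (returned_critical + returned_other)
print(f"critical-zone share of holes on returned planes: {share:.1%}")
```

Even though 30% of incoming fire hits the critical zone, the holes observed on survivors show a noticeably smaller critical-zone share, because the planes hit there mostly never made it home. Armoring by observed holes alone would leave the truly vulnerable areas unprotected.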
I try to remember this when looking at a system as a whole from a security perspective. To fully appreciate security and its applicability to a business, you first must look at the entire problem critically. Implicit biases like the sunk cost fallacy (irrationally clinging to things that have already cost you something; think of a security tool you have invested time and money in) or the availability heuristic (making judgments influenced by whatever most easily springs to mind; think responding to an issue as a red-alert situation because another business was breached recently, when really it’s just the Fortnite launcher) are very easy to fall into, especially when you are already knee-deep in a system that has grown organically around you or that you built by hand!
I like to think of it like this: often when we complete a penetration test for a business, they are blown away by the findings. They struggle to wrap their heads around everything that happened during the engagement, often without their knowledge, because they were fairly confident they were in a good security position to start. The unexpected results aren’t because they are bad at security or don’t understand the complexity of the system they live in; it’s often because they have been adding armor to the holes they can see and not taking the time to ask where the holes aren’t. And through no fault of their own! It’s incredibly difficult to keep up with all the technology, tools, methods, and vulnerabilities that come falling from the sky every day, especially while balancing the spinning plates of cybersecurity, business or software development, IT uptime, and every other leg of that good ol’ CIA triad.
The biggest takeaway from this story for me is twofold. First, having a pair of fresh eyes evaluate where things stand, especially eyes focused on a slightly different aspect of the cyber landscape, like a red team, is invaluable for moving your goalposts forward and keeping things on track in this ever-changing cyberscape. Second, of course, is occasionally taking the time to evaluate things from this perspective on your own. These are the situations that may break you free of your unintended biases and let you ask questions like “Where are our missing holes?”, allowing you to evaluate your current situation more critically. I guess the overarching moral of the story is: always think critically when evaluating any system, and call ProCircular for a penetration test ASAP (personal bias included)! Take the time to question where your holes aren’t and where you may be getting blindsided.