To Go There, Look Elsewhere
Non-Intuitive Complex Solutions
Every complex problem initially feels and behaves like a black box.
We may never gain full transparency into the system’s hidden machinations, but there is one main thing we can do to make the box more “grey”.
A complex system’s non-linear nature, along with its potential dependence on or integration with other systems (some of which are complex or complicated themselves), can produce unexpected or drastically different results.
When faced with a problem or system that produces unanticipated outputs, our initial recourse is often to focus narrowly on the problem itself.
What are the variables?
What are the parameters?
For what numerical range do I get rational answers?
Or in the case of real-life non-linear problems:
What did I do wrong?
How do I get promoted?
Did I not articulate my viewpoint clearly?
What can I do to help with (Haiti, inflation, political corruption, or the obesity crisis)?
This may be the wrong course of action.
The Problem with Complexity
Complex problems are not complicated; they are complex.
The same methods cannot be used to solve both.

Simple vs. Complicated vs. Complex Problems
Complicated systems are:
Linear (There is a direct cause and effect relationship)
Visible (All the factors of the problem can be seen. We know where all the information gaps lie.)
Predictable (The system provides consistently repeatable results)
Complex systems, in contrast, exhibit:
Multiplicity (There may be more than one interacting element in the situation)
Interdependence (The interacting elements may depend on and influence one another)
Diversity (The interacting elements may come from differing domains: psychology, mathematics, environmental science, etc.)
The characteristics of complex systems demand a different examination method.
The insight you need to solve a problem in one domain may be found in another.
In place of a neat, representative formula we can manipulate to our liking, the best practice for solving complex problems is to increase knowledge around the problem’s factors, compare results against other arenas, experiment, or (what people try to avoid) resort to trial and error.
Look Here and There
Overwhelmed by complexity, people shrink their options just when they need to expand them.
When dealing with things we don't understand, we need to EXPAND the information we are absorbing rather than shrink it.
Nate Silver’s book “The Signal and the Noise” shows an excellent example of this in action.
Post-9/11, US intelligence agencies were looking for a way to model the frequency and magnitude of future terrorist attacks.
This was a complex problem that depended on several factors:
How many terrorist organizations are active?
Are these groups operating outside of government oversight?
How many terrorists are in each organization?
What are their goals and ideals?
What types of targets are they looking for?
Some of these questions could not be answered, and the agencies were unsure whether they could even assemble a complete question set, the hallmark of a complex problem.
The most useful model was found neither in another intelligence agency's files nor by spying more on the terrorist groups.
(Those efforts are needed to gather more information but, by themselves, do not contribute to the predictive model.)
The answer was found in an earthquake predictive model.
The modeling data for the frequency and severity of attacks and for the frequency and severity of earthquakes were similar: both followed a power-law distribution.
A power-law distribution means that most of the total effect comes from a small minority of occurrences.
(Think Pareto’s Law or the 80/20 Rule: 80% of output comes from 20% of input.)
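To make that concentration concrete, here is a minimal Python sketch (my own illustration, not from the newsletter or Clauset's work; the shape parameter and sample size are arbitrary choices, not fitted to real data) that draws synthetic event magnitudes from a Pareto distribution and checks how much of the total effect the largest 20% of events contribute:

```python
# Illustrative only (assumed parameters, not real attack or earthquake data):
# power-law-distributed magnitudes concentrate most of the total effect
# in a small minority of events.
import numpy as np

rng = np.random.default_rng(seed=0)

# Synthetic event "magnitudes" from a classical Pareto distribution.
# A shape of ~1.16 makes the theoretical top-20% share roughly 80%.
magnitudes = rng.pareto(a=1.16, size=10_000) + 1.0

# Share of the total magnitude contributed by the largest 20% of events.
largest_first = np.sort(magnitudes)[::-1]
top_fifth = largest_first[: len(largest_first) // 5]
share = top_fifth.sum() / largest_first.sum()

print(f"The largest 20% of events account for {share:.0%} of the total magnitude")
```

Sampling noise aside, the printed share lands near the 80% mark: the same qualitative pattern described below for earthquakes and attacks.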
The researcher, Aaron Clauset, noted that most earthquakes have minimal effect, while a small minority are responsible for most of the devastation.
This same pattern is reflected in terrorist attacks.
By modeling previous attacks the same way earthquakes are modeled, Clauset could estimate the probability that an attack of a given scale of mortality would occur.
Clauset’s model doesn’t mean that an earthquake or large-scale attack WILL happen every 100 years, but that each year there is a 1-in-100 chance that one could, and preparations can be made based on this assumption.
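As a back-of-the-envelope illustration (my own arithmetic, not a figure from the book), a constant 1-in-100 annual chance compounds over a planning horizon rather than guaranteeing one event per century:

```python
# Illustrative only: convert an assumed 1-in-100 annual probability into the
# chance of seeing at least one such event over a planning horizon,
# treating years as independent.
annual_probability = 0.01

for years in (10, 30, 100):
    # P(at least one event in N years) = 1 - (1 - p)^N
    at_least_one = 1 - (1 - annual_probability) ** years
    print(f"{years:>3} years: {at_least_one:.0%} chance of at least one event")
```

Even over a full century the chance of at least one such event is only about 63%, not a certainty, which is why the model supports preparation rather than prediction of a specific date.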
Instead of attempting to shoehorn a complex system into a framework or formula you are familiar with, look to other domains to see if another phenomenon exhibits similar characteristics you can model against.
Will this take extra time?
Yes.
Will this mean admitting that you don’t know?
Yes.
Will this involve trial and error?
Potentially.
However, attempting to force-fit a complex problem into a non-optimal framework results in the loss of the resolution and accuracy needed for prediction.