A growing number of companies are deploying sophisticated predictive models powered by artificial intelligence and machine learning, and many are using them to inform critical decisions.
Yet even the most advanced models couldn’t predict the arrival of the COVID-19 pandemic or Russia’s invasion of Ukraine — which serve as ongoing reminders of the limitations that come with trying to foretell the future.
Predictive models are based on data from past events and used to project future outcomes, but they must evolve to account for complex situations and environments.
Inherently unpredictable events leave no preceding data points, which biases model output toward “safe and predictable” outcomes.
Companies that base their operations on such outcomes are inevitably caught off-guard when unforeseen circumstances arise, and the results can be detrimental.
The current supply chain crisis provides a salient example of the dangers facing companies that place too much faith in predictive modeling.
Supply chains, in particular, are geared toward predictability, and most aren’t resilient to disruption.
The product supply shortages that sent consumers into a panic early in the pandemic and the microchip shortage now plaguing manufacturers are two of the latest hits to a global supply chain that has experienced ongoing disruption, and they likely won’t be the last.
The business landscape, and the world, are increasingly defined by the unpredictable.
As complex challenges proliferate, the business environment is evolving into one that is altogether chaotic.
Companies that rely on predictive models to drive decision-making in a chaotic environment must develop organizational resilience, because models always reflect the organizations that leverage them.
In the case of the supply chain crisis, added resilience could be gained by building a distributed supply chain model, which may mitigate the negative impact of disruption in any one place.
While historical data might not be sufficient for building models that forecast disruption before it occurs, enterprises can still run experiments to estimate the impact of potential disruptions.
Businesses can then use the outcomes of these experiments to build contingency plans to ensure success and be better prepared when the unexpected inevitably occurs.
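As a rough illustration of this kind of experiment, the Monte Carlo sketch below compares a single-source supply chain to a distributed one under the same per-supplier disruption risk. The supplier counts, outage probability, and shortfall threshold are all hypothetical assumptions chosen for the example, not real operational figures:

```python
import random

random.seed(7)  # reproducible runs

def shortfall_risk(num_suppliers, outage_prob, threshold=0.5, trials=100_000):
    """Estimate the probability that less than `threshold` of demand is met,
    assuming demand is split evenly across suppliers and each supplier
    independently fails with probability `outage_prob` (toy assumptions)."""
    shortfalls = 0
    for _ in range(trials):
        working = sum(random.random() > outage_prob for _ in range(num_suppliers))
        if working / num_suppliers < threshold:
            shortfalls += 1
    return shortfalls / trials

# Same per-supplier risk; only the sourcing structure changes.
single = shortfall_risk(num_suppliers=1, outage_prob=0.10)
distributed = shortfall_risk(num_suppliers=5, outage_prob=0.10)
print(f"single-source shortfall risk: {single:.1%}")
print(f"distributed shortfall risk:   {distributed:.1%}")
```

Both structures have the same average supply, but the distributed chain makes a severe shortfall far less likely, which is the kind of insight a contingency plan can be built on.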
As complexity increases, the amount of data generated by modern enterprises will likewise increase.
Eventually, organizations will have such vast quantities of data that deriving actionable value from their information will be nearly impossible without a method and platform that can corral it all.
Predictive modeling might still serve some enterprises today, but ultimately it should be a stepping stone to a prescriptive modeling approach: one that doesn’t just project possibilities but also pinpoints appropriate responses.
Humans have evolved to operate in a world that is complicated but linear. Consequently, when we encounter genuinely complex problems, we tend to approach them in a linear fashion.
However, data can help us develop custom business and technology solutions by giving us the ability to test our environment.
Experimentation allows us to roll up our sleeves and act on probabilities that account for the unexpected, rather than on preconceived notions. The more data we can leverage, the more we can learn about the levels of complexity shaping our environment and take action accordingly.
When business leaders take the time to thoroughly test model outcomes based on inputs that reflect a chaotic state, complexity reveals itself.
However, this approach doesn’t come naturally. In most cases, it will require organizations to reevaluate their relationship with their data. Here are three ways they can do that:
When conducting modeling activities, I’m often reminded of the adage, “All models are wrong; some are useful.” Not even the most sophisticated models will yield entirely accurate predictions, because they are fed exclusively by data from the past.
Rather than treating model outputs as concrete evidence of what’s ahead, they should be viewed as indicators of what’s possible.
When using models to inform critical decisions, leaders should test a range of inputs to understand where optimizations can be made.
A business going down one path might use the operational status quo as a control and then test inputs that reflect hypothetical changes (internal processes, people, or some other variable) to see what different paths open up.
The more inputs an organization can test, the more insight it can gain into the strengths and weaknesses of its operations.
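One way to operationalize this input testing is a simple scenario sweep: score the status-quo inputs as a control, then rerun the same model under hypothetical changes and compare. The toy cost model and scenario values below are illustrative assumptions, standing in for whatever predictive model an organization actually uses:

```python
def weekly_cost(demand, lead_time_days, unit_cost):
    """Toy cost model: purchase cost plus a delay penalty that grows
    with lead time beyond one week (a stand-in for a real model)."""
    delay_penalty = demand * 0.5 * max(lead_time_days - 7, 0)
    return demand * unit_cost + delay_penalty

# Status-quo inputs serve as the control; each scenario changes one input.
baseline = {"demand": 1000, "lead_time_days": 7, "unit_cost": 4.0}
scenarios = {
    "status quo": baseline,
    "demand spike": {**baseline, "demand": 1800},
    "port delay": {**baseline, "lead_time_days": 21},
    "cost inflation": {**baseline, "unit_cost": 5.5},
}

control = weekly_cost(**baseline)
for name, inputs in scenarios.items():
    cost = weekly_cost(**inputs)
    print(f"{name:>15}: {cost:>8.0f}  ({cost / control - 1:+.1%} vs control)")
```

Even a sweep this crude makes the model’s sensitivities visible: here, a lead-time disruption dwarfs the other scenarios, flagging where contingency planning would pay off most.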
Sometimes, even companies that engage in testing fail to make the right decisions — usually because they resort to acting on preconceived notions rather than the data at hand.
The Case of the Seriously Flawed Data
When leaders have already invested in a plan based on one expected outcome, they might be more inclined to trust their instincts than the output of a model that suggests their plan has serious flaws.
However, by remaining patient and continually gathering more data to inform their models, they can get a more accurate sense of the true nature of the environment they’re operating within and make more innovative plans for navigating it.
The post How Modeling Must Evolve to Account for Complex Environments appeared first on ReadWrite.