This post builds on a previous attempt to peek into the future. Having discussed the economic, political and social risks facing EU Member States, as well as their consequences, I now turn my attention to the probabilistic quantification of these events: the likelihood with which they might occur. This is a form of forecasting that all macroeconomic consultancies like to engage in. However, the majority of them put a finger to the wind, see where it is blowing and go with it. Probabilities are plucked from the analyst's mind without much tractability or coherence. This tends to result in relatively arbitrary values based on heuristics, which are difficult to evaluate and to criticise. More often than not, such values are provided to satisfy a primal need from clients for a numerical and visual aid, at no cost to the analyst. Businesses and individuals should be very sceptical of any such values. They provide noise, not news or certainty.
What this post offers is a tentative exercise in providing tractable, criticisable estimates of the likelihood of future developments in the EU, based on the risks identified in the previous post. Following the argument that a crisis in Europe is still “likely”, I show that the probability of this resulting in a collapse of the Euro-Zone is small and that comments to that effect are alarmist at best, hysterical at worst. The post is divided into four parts. The first part offers a user guide on how to read the chart shown in the second part. The third part discusses the criticisms of this methodology. The last part concludes by mentioning an alternative analytical methodology and by explaining why the criticisms presented in the third part do not extend to econometric and macroeconomic academic models.
In the next section I present a flow chart offering some tentative insights into how I expect the Euro-Zone sovereign debt crisis to progress in the coming year. Before I do, I must explain how to read it.
The chart is to be read vertically, from top to bottom, except where the reader reaches a “crossroads” offering two paths. At that stage the reader should follow the nodes to the right to trace the next events in that “universe”. The total probability of a given sequence of events is the product of all the probabilities along its path. The probabilities of all such sequences add up to 100%, give or take a decimal rounding error.
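The arithmetic behind the chart can be sketched in a few lines of Python. The paths and probabilities below are invented purely for illustration; they are not the values from the actual flowchart.

```python
from math import prod

# Each branch of the tree carries a conditional probability; a full
# "universe" is one path from the root to a leaf. All numbers here are
# made up to illustrate the mechanics only.
paths = {
    "crisis, ECB/EFSF intervene": [0.90, 0.95],
    "crisis, no intervention":    [0.90, 0.05],
    "no crisis":                  [0.10],
}

# The probability of a sequence of events is the product of the
# conditional probabilities along its path.
path_probs = {name: prod(ps) for name, ps in paths.items()}

# Because the leaves partition all possible outcomes, the path
# probabilities sum to 1, up to rounding error.
total = sum(path_probs.values())
```

Note that this only holds because, at each crossroads, the branch probabilities themselves sum to one.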
The flowchart below provides some insights into what might happen. As argued previously, there is a very high probability that there will be a crisis in Europe. However, the chart consistently shows that the Euro-Zone itself is not necessarily at risk, because the ECB and/or the EFSF are extremely likely to intervene at some point to stop the crisis from getting worse.
There is an argument to be made that this scenario analysis assumes those two institutions would always be able to react fast enough. To address that concern, I have included the case where no one reacts and the problems are either not solved at all or solved by non-Europeans. The probability of such an event, according to the analysis above, is less than 1%, although you may disagree.
Criticisms of Quantified Scenario Analysis
However, much as I would like to state that these values are scientific, they are not. I do not claim to have solved the fundamental problem with this type of analysis. I do not have an enormous data set of events, and so have been unable to conduct a Monte Carlo experiment to identify the probability distributions that would underlie a serious analysis. But then again, neither do any of the consultancies that charge tens of thousands for their advice on political events. The main merit of the values I produced is that they are tractable, in the sense that I have made a conscious effort to make it possible to trace how I arrived at them. They are coherent and consistent to the extent that I have attempted to follow the rules I set out for this exercise to their logical conclusion. Moreover, by describing them in this manner I have endeavoured to clearly expose the different conditions underlying the many narratives floating around.

However, they are not falsifiable, which makes them a great tool for analysts who can never be proved wrong. As long as the events are worded in a general enough manner, they can always account for all relevant scenarios. Because we do not have access to the infinitude of parallel universes in which each of the above sequences of events would play out, the actual probability can never be found. Therefore, even if the event with the lowest probability happened, so long as you accounted for it, the analysis will always be right. Ex ante, an event may be uniformly, normally, binomially or otherwise distributed. Ex post, the event has become a certainty.
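For what it is worth, even without real data one can sketch what a crude Monte Carlo treatment of this uncertainty might look like. The ranges below are entirely invented; the point is only to show how uncertainty in each branch propagates into the headline number, rather than to reproduce any figure from the chart.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

N = 10_000
samples = []
for _ in range(N):
    # Instead of a single point estimate per branch, draw each branch
    # probability from a (made-up) range reflecting our uncertainty.
    p_crisis = random.uniform(0.70, 0.95)     # "crisis is likely"
    p_no_rescue = random.uniform(0.00, 0.05)  # "intervention very likely"
    samples.append(p_crisis * p_no_rescue)

mean_p = sum(samples) / N
spread = max(samples) - min(samples)
# mean_p hovers around the product of the midpoints (roughly 2%), while
# the spread shows how fragile any single headline figure is.
```

A serious version would calibrate those ranges from historical data, which is precisely the data set I do not have.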
Let me save you some time and explain all that is wrong with my own analysis, aside from the callous ignorance of the actual probability distributions:
- However well I may have argued the case for crisis in the past, the claim that a crisis in Europe, or the interventions of the EFSF or the ECB, are “likely” or “very likely” remains something of an assumption.
- Choosing the average probability as the most representative value could also be argued to be flawed. The median might be more adequate, particularly if the distribution is not symmetrical around its mean. Others might have preferred to consider the variance instead. Given my ignorance of the actual probability distribution, this would have been impossible; I have instead derived the statistics from a value judgement based on my anecdotal observation of events.
- Why the probabilities should be independent, and thus multipliable down the branches of the scenario analysis, is also arbitrary and probably inaccurate. The real world is chaotic and everything influences everything else. In such an endogenous world, I doubt that independent and identically distributed events are the rule.
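The last point above can be made concrete with hypothetical numbers: if one event makes the next more likely, multiplying unconditional probabilities understates the joint probability.

```python
# Hypothetical numbers, purely to illustrate the independence point.
p_worsens = 0.5      # P(crisis worsens)
p_no_rescue = 0.1    # P(no intervention), unconditionally

# The tree's multiplication rule implicitly assumes independence:
joint_if_independent = p_worsens * p_no_rescue           # 0.05

# But if a worsening crisis paralyses the would-be rescuers, the
# conditional probability is higher, and so is the true joint:
p_no_rescue_given_worsens = 0.3
joint_dependent = p_worsens * p_no_rescue_given_worsens  # 0.15
```

Multiplying down a tree is still valid, but only if each branch probability is read as conditional on everything above it, which is far harder to estimate honestly.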
There were two main reasons to conduct this analysis. The first was to show that the collapse of the Euro-Zone is not the logical inevitability it is so often made out to be. The second was to show that assertions such as these rest on very fragile methodological foundations.
Two final comments are warranted about other forms of analysis. First, there are largely unexplored alternatives to this heuristic approach. One such influential model is Axelrod’s Landscape model of coalitions. Unfortunately, despite being a relatively old model, it has not yet gained much traction in economic analysis. I hope to use it in the future to provide more insightful analysis.
Finally, the poor quality of this type of analysis is the exception rather than the rule in economics. While they may not yield magic wands that predict shocks, econometric and theoretical models are more serious. Although they still suffer from imperfections, they are supported by decades of research and tend to be built incrementally by adjusting and improving existing models. Please consult the relevant sections of this website if you are interested in learning more about them.
Hopefully, this post shows how easy it is to predict the future, and how much easier it is to explain why your prediction was “inaccurate” but not “wrong”. For your own sake, the next time someone tells you that the Euro-Zone is going to collapse, or that it is going to thrive, at least ask them to explain the chain of events that will lead to either destination.