There is a very pertinent article today on the dangers of putting too much faith in models that are built on inadequate information.
We’ve now lived through the same new disaster twice. Computer simulations, more or less universally adopted as the solution to a major problem, turned out to have been based on flawed assumptions and faulty data. As a result policy or markets became heavily skewed in an inappropriate direction. Wall Street’s risk managers and climate change scientists both acted as super-salesmen for a paradigm that turned out to be flawed. After two examples of the same error have each cost the world a substantial percentage of a year’s GDP, we’d better figure out how to avoid further examples of this syndrome.
I have previously linked to an article that compared Obama to Mikhail Gorbachev. I think this comparison is also valid and interesting.
As the credit crisis of 2008 recedes into history, the part played in it by misguided computer models, particularly in the risk management area, is becoming generally acknowledged. Rating agencies made unfounded assumptions about the probabilistic independence of different home mortgages. As a result, many of their AAA ratings proved to be completely spurious, particularly in the subprime area, where the loans’ vulnerability to a house price downturn was especially extreme.
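To make the independence point concrete, here is a minimal sketch, with entirely invented numbers (pool size, default probability, correlation) and nothing resembling the agencies’ actual models, comparing a mortgage pool whose defaults are independent with one whose defaults share a common driver such as a nationwide house price decline:

```python
# Toy comparison: independent defaults vs. defaults driven by a common factor.
# All parameters are hypothetical and chosen only to illustrate the point.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_loans, p_default, rho, trials = 1000, 0.05, 0.30, 10000
cutoff = norm.ppf(p_default)   # latent-variable threshold below which a loan defaults

# Case 1: each mortgage defaults independently of all the others
indep = rng.standard_normal((trials, n_loans)) < cutoff

# Case 2: a shared factor (a nationwide house price decline) moves all loans together
market = rng.standard_normal((trials, 1))
idiosyncratic = rng.standard_normal((trials, n_loans))
correlated = (np.sqrt(rho) * market + np.sqrt(1 - rho) * idiosyncratic) < cutoff

for name, defaults in (("independent", indep), ("correlated", correlated)):
    frac = defaults.mean(axis=1)   # fraction of the pool defaulting in each trial
    print(f"{name:11s}: P(more than 15% of the pool defaults) = {(frac > 0.15).mean():.4f}")
```

With independence, a loss large enough to touch the senior tranches is essentially impossible; allow a common factor and it becomes merely unlikely, which is a very different thing.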
Investment banks managed their risks based on the “Value-at-Risk” risk management paradigm, which assumed that the distribution of securities’ returns was approximately Gaussian (normally distributed), with a very low probability of high losses. The “Basel II” system of global capital adequacy standards for banks, which came into effect in 2008, just in time for the crash, was so impressed with these models that it ruled that any bank using such obviously sophisticated and superior modeling techniques could calculate risks on its own, without reference to the crude guidelines deemed appropriate for smaller, less mathematically attuned houses. The Securities and Exchange Commission (SEC) essentially agreed with the Basel Committee; from 2004, it allowed the largest U.S. investment banks to manage their own leverage, under the theory that no mere regulator could match the exquisite precision of a modern VaR-based risk management system.
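For readers who have not met it, here is what a Gaussian “Value-at-Risk” number looks like in practice; this is a toy sketch with a made-up portfolio size and volatility, not any bank’s actual system:

```python
# Toy parametric VaR under the Gaussian assumption described above.
# Portfolio size and volatility are hypothetical.
from scipy.stats import norm

portfolio = 100_000_000     # a hypothetical $100 million book
daily_vol = 0.01            # hypothetical 1% daily return volatility
confidence = 0.99

# The loss that should be exceeded on only one trading day in a hundred,
# assuming returns really are normally distributed.
var_99 = -norm.ppf(1 - confidence) * daily_vol * portfolio
print(f"Gaussian 99% one-day VaR: ${var_99:,.0f}")

# The same model treats a crash on the scale of October 1987 (roughly a
# twenty-standard-deviation move) as effectively impossible:
print(f"Probability the model assigns to a ~20-sigma day: {norm.cdf(-20):.1e}")
```

The particular numbers are beside the point; what matters is the shape of the tail, which the Gaussian assumption makes vanishingly thin.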
The model, and the confidence placed in it by financial managers who should have known better, resembles an old paradigm: confidence in machines that we don’t understand, the proverbial “black box.” It had happened before: programs were written by physics PhDs who did not understand finance, for financial experts who did not understand programming.
It’s not as if Wall Street had no warning; mathematical models based on modern financial theory had caused huge losses as far back as 1987, and had caused the collapse of Long-Term Capital Management in 1998. Yet the world’s best remunerated people went on using the mathematical models that had caused moderate-sized disasters before, only to watch them cause a truly impressive disaster in 2008. It must have been some kind of compulsion.
Then we come to global warming and the cap-and-trade legislation that relies on the theory.
Turning now to my other example, that of global warming: the possibility that excess carbon dioxide, through a “greenhouse effect,” might cause a global rise in temperature is based on well-established chemistry and physics. Deniers of the possibility of global warming are thus being as irrational as the extreme eco-alarmists; global warming is indeed possible because of physical and chemical processes that are perfectly well understood, indeed fairly elementary.
The difficulty arises in estimating whether it is actually happening. The rise in temperatures so far observed is well within the level of “noise” in global temperatures over a period of a century or so, let alone the more extreme fluctuations that have taken place when the observation period is extended to millennia. It is thus necessary to match the very limited temperature data we have, stretching back no more than a century on a worldwide basis, with secondary observations of such things as tree rings and ice cores, synthesizing the result with a computer model of what is believed to be the carbon forcing process in order to predict the range of possible future warming effects.
This is of course a very similar process to that undertaken by Wall Street’s rating agencies and risk managers. Assumptions and simplifications are made, without which it would be impossible to construct a model. Then the model is matched up against a few years’ observations in real time, being “tweaked” as real data comes in that does not quite fit with it. By the time this has been done, careers have been invested in the model, institutions have been built around its predictions and eminent people have become enthralled by its results. It thus takes on the appearance of a scientific reality as solid as Newtonian mechanics.
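As a toy illustration of the signal-from-noise problem described above, and nothing more (the trend, the variability, and the record length are all invented), one can fit a simple straight-line trend to a short, autocorrelated series and watch how much the inferred trend, and any extrapolation built on it, moves around from one noise realization to the next:

```python
# Toy trend-fitting exercise with invented numbers; not a real climate reconstruction.
import numpy as np

rng = np.random.default_rng(0)
n_years = 110                 # roughly the length of the worldwide instrumental record
true_trend = 0.007            # hypothetical warming trend, degrees C per year

def natural_variability(n, phi=0.8, sigma=0.1):
    """AR(1) noise standing in for natural, year-to-year fluctuations."""
    x = np.zeros(n)
    shocks = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + shocks[t]
    return x

years = np.arange(n_years)
fitted_trends = []
for _ in range(2000):
    temps = true_trend * years + natural_variability(n_years)
    fitted_trends.append(np.polyfit(years, temps, 1)[0])   # slope of the fitted line

lo, hi = np.percentile(fitted_trends, [2.5, 97.5])
print(f"True trend: {true_trend:.4f} C/yr; 95% of fitted trends lie in [{lo:.4f}, {hi:.4f}]")
print(f"Implied warming over the next 90 years: {90 * lo:.2f} to {90 * hi:.2f} C")
```

With invented numbers the exact spread means nothing; the point is that the forecast inherits all the uncertainty in the fitted trend before any of the model’s structural assumptions are even questioned.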
The potential economic effects of this model are even greater than those of the financial models.
The political left, even its Nobel Prize winners, continues to lie about the causes of these recurrent crises.
The first big wave of deregulation took place under Ronald Reagan — and quickly led to disaster, in the form of the savings-and-loan crisis of the 1980s. Taxpayers ended up paying more than 2 percent of G.D.P., the equivalent of around $300 billion today, to clean up the mess.
I’m sure that Paul Krugman knows the story of Fernand St Germain and the midnight amendment that brought down the S&Ls.
By the time Ronald Reagan took office in 1981, two-thirds of the nation’s S&Ls were losing money and many were broke. If all the problem thrifts had been shut down right then, the government’s insurance fund would have covered their debts.
Instead, the government delayed an average of two years, and in some cases as many as seven, thus allowing bankrupt S&Ls to go on losing billions of dollars. This delay also gave S&Ls a chance to gamble on questionable investments, in an attempt to regain solvency. But first they had to convince Congress to deregulate them.
One night in 1980, Representative Fernand St Germain (D-Rhode Island), whose $10,000-to-$20,000-a-year restaurant and bar tab was paid for by the S&L industry’s chief lobbyist, proposed raising federal insurance on S&L savings accounts from $40,000 to $100,000, even though the average size of an S&L account was $6,000. He waited until after midnight, when only eleven representatives were still on the floor of the House; they approved his proposal unanimously.
But St Germain was just getting warmed up. In 1982, he cosponsored a bill that removed all controls on what S&Ls could charge for interest and released them from their century-old reliance on home mortgages.
That was Regulation Q.
Around the same time, the Reagan administration ended the requirement that S&Ls lend money only in their own communities, allowed them to offer 100% financing (i.e. no down payments), let real estate developers own their own S&Ls, and permitted S&L owners to lend money to themselves.
These changes were like taping a sign to the S&Ls’ backs that read, “Defraud me.”
This has little to do with models, but I ran across that Krugman column, which is so duplicitous that I had to add a comment.
Yep… Representative Fernand St Germain (D-Rhode Island).
(Soul mate to Barney Frank!)
Hey, doc, do you ever bother with Frum’s site nowadays?
Have ANY of the idiots – even one – “adjusted” their views on the “settled science” of global warm… er… of “climate change” one iota?
BILL
I will look at Frum once a week or so. I could not keep up with the changes in passwords and names, and it seems to be mostly leftists commenting.
Funny, I was just thinking about the problem of models. I read a book called The Best-Laid Plans that said the forest plans of the 1970s were based on computer modeling using faulty data, like 600-foot trees, resulting in an orgy of clearcutting. Then when the USFS suddenly pulled back, it threw wood products into a depression. It was a matter of GIGO: garbage in, garbage out.
BTW our last big mill here is closing…