Sandy's Cone of Uncertainty, National Weather Service
Still, I think it is appropriate to notice the success of the National Weather Service and its associated forecasters. Without their efforts, and the subsequent evacuations, many more people might have been killed or injured. One key to that success, according to Nate Silver, is the NWS's embrace of uncertainty, its frank acknowledgement of error, and its skepticism of its own models. The "cone of uncertainty," above, exemplifies the NWS ethos and makes it public. Instead of a single trajectory, we get a range of possible paths, along with the caveat that areas outside the cone may still feel some impact of the storm. I'm sure you've all seen some version of this graphic recently.
Silver, whose political forecasting blog (fivethirtyeight) offers some of the best analysis of the ins and outs of predictive modeling that I've read anywhere*, published an excerpt from his new book last month on the development of weather forecasting. It's snappily written with a standard new journalism flair. Silver shows us the shiny innards (in-nerds? ha, ha, get it?) of the National Centers for Environmental Prediction (which I had never heard of) and makes lame jokes along the way (but who am I, really, to judge?).
While Silver offers a capsule history of weather prediction, his real points are: 1) the embrace of uncertainty has been key to improved prediction, and 2) even the best models have flaws that require expert judgment and human intervention. On the first point, Silver contemplates the impact of chaos theory on the way forecasters deal with their data, and he notes the increasing tendency of forecasters to make their uncertainty more evident. It makes me wonder whether the general public has been learning a new, probabilistic way of thinking, or is at least getting more comfortable with ranges of prediction.
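That link between chaos theory and ranges of prediction can be made concrete with a toy sketch of my own (nothing like what the NWS actually runs, and every name and number below is invented for illustration): run the same crude one-dimensional "storm track" model from many slightly perturbed starting points and watch the spread of outcomes widen with each step. That widening range is the basic intuition behind the cone of uncertainty.

```python
# Toy illustration only: small differences in the starting state grow into
# a wide range of outcomes, which is why forecasters report a cone, not a line.
import random

def simulate_track(start_position, steps=10, drift=1.0, noise=0.3):
    """Advance a one-dimensional 'storm position' with steady drift plus random wobble."""
    position = start_position
    track = [position]
    for _ in range(steps):
        position += drift + random.gauss(0, noise)
        track.append(position)
    return track

# An ensemble: the same model, 50 runs, each starting a little differently
# (standing in for uncertainty in the initial observations).
ensemble = [simulate_track(random.gauss(0, 0.1)) for _ in range(50)]

# Report the range of predicted positions at each step -- a crude "cone".
for step in range(len(ensemble[0])):
    positions = [track[step] for track in ensemble]
    print(f"step {step:2d}: {min(positions):6.2f} to {max(positions):6.2f}")
```

Run it a few times and the early steps stay tightly bunched while the later ones fan out, which is the whole point: the further ahead you look, the more honest it is to report a range rather than a path.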
On the second point, Silver draws our attention to NWS technicians who "draw on their maps with light pens, painstakingly adjusting the contours of temperature gradients produced by the computers...." He interviews a NOAA official who explains that there are lots of models and they seldom agree. Those closest to the predictions know just how much they still don't know. But that does not incapacitate them. Silver draws analogies to poker players, billiards sharks, and high-frequency stock traders in an effort to explain how human judgment can make it possible for people plagued by uncertainty to find an advantage. The moral here is that uncertainty need not lead to inaction, and in fact can lead to smarter action.
I'll close this post with the "Modelers' Hippocratic Oath," written by Emanuel Derman and Paul Wilmott for the sake of financial modelers, but more broadly applicable:
The Modelers' Hippocratic Oath
~ I will remember that I didn't make the world, and it doesn't satisfy my equations.
~ Though I will use models boldly to estimate value, I will not be overly impressed by mathematics.
~ I will never sacrifice reality for elegance without explaining why I have done so.
~ Nor will I give the people who use my model false comfort about its accuracy. Instead, I will make explicit its assumptions and oversights.
~ I understand that my work may have enormous effects on society and the economy, many of them beyond my comprehension.
Good science and good modeling, it turns out, require a strong dose of humility.
*Seriously, I could make the argument that Silver is just using the election to teach a huge number of people how to think responsibly about statistics and predictive modeling. Ben Schmidt does a similar service for the digital humanities using nostalgic TV shows like Downton Abbey.