It’s a skill that would’ve been familiar, even ancient, in the time of Homer, and it has never not been in daily use all over the world, from shallow coastal waters to the deepest blue seas. It’s so simple that it’s almost overstating things to call it a skill – rather, it’s a discipline, one that reflects that word’s root meaning in scholarship, because students of this ancient discipline learn to survive.
Dead reckoning involves keeping track of your course and speed throughout the voyage so that you can plot your best estimate of your current position on a chart. But that’s only the first step to the safe navigation of the ship. What makes dead reckoning work is that, as you plot your course for hours and days and sometimes even weeks, you draw a constantly expanding circle around the point that you hope represents your own true position. That circle of error grows larger and larger the longer you have gone without “getting a fix,” which doesn’t have to do with drugs, although it’s possible to see sailors craving a navigational fix with the same intensity.
Because safe navigation requires that you never let any part of that ever-expanding error circle touch any known hazards, such as shallow waters or submerged rocks. The only way to shrink that circle is to get a fix – to establish your true position with certainty, such as by finding multiple landmarks with the periscope and “shooting” bearings to those landmarks, and drawing the resulting bearing lines on your chart. With luck, the lines all converge at a point, the point where you must have been to view those landmarks from those angles. With such a fix, you can collapse your error circle for a brief moment … though it begins building up again immediately until you get your next “ground truth” fix.
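The chartwork described above can be sketched in a few lines of code. This is a flat-earth toy with hypothetical landmark positions and bearings, not real navigation math – real chartwork accounts for chart projection and typically uses three or more bearing lines:

```python
import math

def fix_from_bearings(lm1, brg1, lm2, brg2):
    """Cross two bearing lines to estimate a fix.
    lm1, lm2: landmark (east, north) positions in nautical miles.
    brg1, brg2: bearings from the ship to each landmark, in degrees
    true (0 = north, 90 = east). The ship lies somewhere along the
    reciprocal-bearing line through each landmark; the fix is where
    those two lines cross."""
    u1 = (math.sin(math.radians(brg1 + 180)), math.cos(math.radians(brg1 + 180)))
    u2 = (math.sin(math.radians(brg2 + 180)), math.cos(math.radians(brg2 + 180)))
    det = u2[0] * u1[1] - u1[0] * u2[1]
    if abs(det) < 1e-9:
        raise ValueError("bearing lines are parallel; no fix possible")
    rx, ry = lm2[0] - lm1[0], lm2[1] - lm1[1]
    # Cramer's rule for lm1 + t1*u1 == lm2 + t2*u2
    t1 = (u2[0] * ry - u2[1] * rx) / det
    return (lm1[0] + t1 * u1[0], lm1[1] + t1 * u1[1])

# A landmark due north and another due east put the fix at the origin
fix = fix_from_bearings((0, 10), 0, (10, 0), 90)
```

With three landmarks, the three lines rarely meet at a single point; the small triangle they form (the navigator’s “cocked hat”) is itself a visible, humbling measure of the error in your shots.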
Even more important than the temporary comfort of re-establishing your position with certainty for a moment is what else a fix allows you to do:
A fix after a period of dead reckoning lets you gauge – with real numbers – how good or bad your dead reckoning was, and the rate at which your estimate of your own position is degrading.
And since dead reckoning is pretty simple – the distance you travel in any given direction is simply how fast you are going times the length of time you head in that direction – when you get a reality check that makes you see just how far off your dead reckoning position estimate was compared to reality, what you are really getting a handle on are all the factors you can’t measure directly or control from inside a submarine: You are getting the integrated summation of all the errors and hidden forces that create a difference between your theoretical (dead reckoning) position and reality: ocean tides, deep currents, wind forces, instrument errors, and a hundred other, including the degree to which the sailor at the helm wobbles around the ordered course and the degree of astigmatism in her eyeglass prescription.
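The arithmetic really is that simple: speed times time per leg, summed up. A minimal sketch with made-up legs, again using a flat-earth approximation and ignoring all the hidden forces the text lists:

```python
import math

def dead_reckon(legs):
    """Sum dead-reckoned legs from a start at (0, 0).
    Each leg is (speed_knots, heading_degrees, hours); the distance run
    on each leg is simply speed * time, resolved into east/north
    components (0 degrees = north, 90 = east)."""
    east = north = 0.0
    for speed_knots, heading_deg, hours in legs:
        run = speed_knots * hours                      # nautical miles
        east += run * math.sin(math.radians(heading_deg))
        north += run * math.cos(math.radians(heading_deg))
    return east, north

# e.g. 10 knots due east for 3 hours, then 5 knots due north for 2 hours
east, north = dead_reckon([(10, 90, 3), (5, 0, 2)])
```

Everything the code ignores – tide, current, wind, helm wobble, instrument error – is exactly what the gap between this estimate and the next fix reveals.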
If, in the six hours between two fixes, you managed to put a nautical mile between where you thought you were at the moment of the second fix and where the second fix actually showed that you were, you just established what you should assume is your minimum error rate. And with that minimum error rate established, ship’s safety requires you to assume that you will continue to suffer at least that same rate of positional uncertainty in the future, minute by minute, hour by hour. And that shows up as a bigger and bigger error circle around your position estimate.
That is, if you have found that you were off by a mile after six hours between fixes, then until you have hard evidence to the contrary, you have to let your position error estimate – the ever-growing circle around your dead reckoned position – grow at the rate of 1 nm per six hours (about 0.17 nm/hour). That’s even if you do nothing but simply sail in (what you believe to be) a straight line for six hours under what you believe to be perfect conditions with no tide, current, wind or wave effects. You learn, the hard way, that your “dead reckoning” position is not the point in the center of your circle. To the contrary, you must always assume that you are actually sailing at the very edge of the circle, on the side closest to the nearest hazard – the undersea mountain, the submerged cable, or what have you.
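The bookkeeping this discipline demands – establish a minimum error rate from your last fix, grow the circle at that rate, and assume you sit on its worst edge – is just a couple of arithmetic rules. A sketch using the one-mile-in-six-hours figure from the text:

```python
def error_radius(hours_since_fix, observed_error_nm=1.0, observed_hours=6.0):
    """Radius of the positional-uncertainty circle, grown at the minimum
    error rate your last fix established (here, 1 nm per 6 hours)."""
    return (observed_error_nm / observed_hours) * hours_since_fix

def safe_margin_to_hazard(charted_distance_nm, hours_since_fix):
    """Distance you may assume remains to a hazard, taking yourself to be
    on the edge of the circle nearest it -- not at its comfortable center."""
    return charted_distance_nm - error_radius(hours_since_fix)

# Six hours after a fix, a hazard charted 10 nm from your dead-reckoned
# position must be treated as possibly only 9 nm away.
margin = safe_margin_to_hazard(10.0, 6.0)
```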
Without the humility of the error circle, dead reckoning is a recipe for running aground and causing the loss of the ship with all lives. And if fixes are easy to come by – if you’re sailing along a coast with lots of buoys and on-shore landmarks – it’s easy to keep “shrinking the circle” so that it never gets so large that it interferes with where you’d like to go.
But get well away from shore, or get socked in by bad weather such that visibility is nil, and the error circle keeps growing and growing, such that you can wind up unable to go in the direction you want to head because the huge positional uncertainty circle includes hazards in several directions. Until you get some ground truth that lets you shrink the error circle, you are stuck in deep waters, unable to approach the shore.
And that’s where things go wrong. Because it takes iron discipline – in the sense of toughness, not learnedness – to refuse to let your human ego and fondest wishes lead you astray. Humans are less rational creatures than we are rationalizing creatures. And humans in hierarchies are primed for groupthink, which is what leads to refusal to remain humble in the face of fervent desires. When people really want something, and when their superiors really want something, it’s quite difficult to resist the urge to start “shrinking the error circle” based on phony pretexts (rationalization). Nobody says that they want to shrink the circle because, if we don’t, we won’t be able to reach the harbor on time and they’ll miss the Super Bowl or whatever. Instead, they invent persuasive arguments to claim that the error circle was needlessly large, and that it can be reduced without risk.
What does dead reckoning have to do with Oregon, except for helping explain why some of the many ships lost off our coast found a hazard where they hoped for a safe harbor?
Just this: Public agencies – all public agencies, but most especially those that either make or rely on long-range forecasts — need to learn both the humility and the discipline of dead reckoning, of having a constantly growing measure of uncertainty around every number that they forecast. We need for all government statements about the future to be qualified with the agency’s current dead reckoning error, using numbers that are based on past agency performance, not the hubris of the management.
When an urban renewal agency forecasts the dazzling results to be had from seizing private homes through eminent domain and lavishing cash on contractors and developers to build a new project, there are countless implicit forecasts made, and so the agency should have to show, for each one, how well the agency estimates have performed historically, and the largest of those historical errors should be applied to the proposed project outcomes, giving taxpayers a way to assess those sunny estimates against the wisdom of experience.
Imagine if Oregonians got accustomed to asking all politicians and public officials for the error estimates that apply to every public pronouncement.
Imagine the day when, after the politician proposes longer prison sentences and claims that these longer sentences will reduce crime rates, the citizens respond by pointing out that we know, with hard numbers, that the error in this kind of forecast is approximately infinite (because our prison sentences are already so excessive, longer prison sentences have roughly the same effect on crime as sentencing according to the astrological chart of the prisoner).
Better yet, imagine the discipline of requiring politicians and educators to report and apply their own error rates to their rosy forecasts about the benefits of the next round of “improved” standardized testing for students.
The Oregon Department of Transportation – which is best thought of as simply the Oregon Highway Department hiding under a flimsy pseudonym – is the classic offender: blind dead reckoning without feedback.
They have the dead reckoning part down – they apply a simple formula to draw a straight line from a past known position (say, measured traffic volumes) to a foregone, forecast conclusion (that we have to pour more concrete and asphalt and build more bridges).
But what ODOT and other agencies never do is determine their own rate of error expansion and apply it to future estimates.
One measure of autism is the degree to which an autistic person is unable to recognize social cues from others – the feedback from other people about how the autistic person is perceived. Public agencies are positively autistic in their spooky, otherworldly ability to be fantastically wrong in their forecasts, year after year after year, and never have that message penetrate into the part of the agency responsible for forecasting.
From roads and bridges to prisons to energy facilities, from tax policy to spending programs justified by projected returns on investment – for any expensive, long-lived project that takes a long time to plan and construct or implement – the most important thing for a planner – and for taxpayers – should be forcing the planners and politicians to study their own tendency to err, and to include that tendency in all subsequent forecasts. And they must be forced to identify whether that tendency is random or is biased in one particular direction, like the DOT estimates that always, always, always, always shoot high, which just happens to be the direction that calls for lots more highway projects.
The other thing that anyone who forecasts or who pays taxes should keep constantly in mind is the way that forecasting requires humility, because it takes humility to admit that your estimates have errors, and that those errors increase with time. And staying with this humility requires great discipline, because it’s demoralizing to confront the fact that what we actually know is much less than we would like, and that the uncertainty about what we know often dwarfs what we can state with confidence.
This discipline is critical because the effect of the errors grows as the size of the decisions grows, and government tends to deal in large decisions. A tiny 1% (0.01) error in the ship’s measured speed is a much bigger problem at 20 knots than at 5 knots. And a 5% overestimate of traffic demand compounds year after year, producing an estimate that is twice the actual demand in just 14 years.
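That doubling claim is easy to check, because a constant percentage error compounds multiplicatively, just like interest:

```python
def compounded_forecast_factor(annual_error, years):
    """Factor by which a forecast diverges from reality when a fixed
    fractional error compounds every year."""
    return (1 + annual_error) ** years

# A 5% annual overestimate: 1.05**14 is about 1.98, i.e. the forecast
# is roughly double the actual demand after 14 years.
factor = compounded_forecast_factor(0.05, 14)
```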
Just as a ship’s navigator who wants to get home safely must be obsessed with understanding and minimizing the accumulating position estimate errors, citizens who want government to work well must become obsessed with getting government to adopt the discipline of measuring and applying its own forecasting uncertainty to all its programs and projects.
(To be continued.)