Highway Robbery
Government highway agencies have enabled the blatant falsification of traffic model results. Consequently, the United States wastes billions on road expansions that fail to cure congestion and make it harder to get around without a car.
In 1996, the state highway agencies of Kentucky and Indiana set out to build a new bridge over the Ohio River, adding more lanes to Interstate 65 where it leaves downtown Louisville. Their planners employed an elaborate computer model to forecast future traffic volumes. The model predicted that by 2025, 160,000 cars would cross the old and new bridges on an average weekday. Based on that forecast, the states decided to make the new bridge six lanes wide. When it finally opened, in 2016, the project had cost more than a billion dollars.
In 2023, just 70,000 cars crossed the two adjoining bridges on an average day. The model was wrong, but it did its job for the highway agencies: they got to spend all that money on the new bridge.
Highway construction is a very big business. The United States spends nearly $150 billion per year on road and highway construction, an amount that has increased by almost 50 percent in the past five years. The highway-building bureaucracy has created a powerful and well-organized political machine that mobilizes construction companies, engineering firms, truckers, and local business boosters. Politicians are always keen to take credit at ribbon-cuttings. Highway departments routinely shortchange maintenance to cobble together funding for massive empire-building highway and bridge projects.
In pursuit of these goals, highway agencies depend on traffic models. These models are bewilderingly complex, their results are offered with false certainty, and when they are challenged in court, judges routinely defer to “agency expertise.” To understand how these impenetrable models work, let alone contest their accuracy or validity, is a daunting task. The models thus serve as powerful technocratic weapons in securing funding, dismissing environmental concerns, and blocking outside scrutiny. Concrete keeps pouring into new highway lanes, regardless of their utility for drivers or their damage to the world around them.
Bad Science
The National Environmental Policy Act, passed in 1970, requires highway builders to assess environmental impacts before an interstate highway can be built or expanded. These assessments hinge directly on estimates of future traffic levels. The forecasters, usually employees or consultants for the state highway agency, use models developed by regional planning organizations. Established by federal law in each metropolitan area, the regional planners are theoretically independent of the highway agencies, but in practice are usually under their thumb.
The models divide the region into areas of a few thousand inhabitants each, called Traffic Analysis Zones. Each model starts from the number of residents in each zone and the locations of their jobs, both currently and as predicted for a “forecast year” twenty or thirty years in the future. It then finds the optimum route for each trip to work, balancing travel time against tolls or transit fares. Non-commuting trips, such as shopping trips, along with trucking and through travel by long-distance drivers, are added in as well.
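To see what is inside the box, consider a toy version of one step: the gravity-model trip distribution that regional models conventionally use. The zones, counts, and impedance parameter below are invented for illustration; a real model uses thousands of zones and carefully calibrated functions.

```python
# Toy trip-distribution step of a regional travel demand model.
# All zones, counts, and parameters here are hypothetical.
import math

zones = ["A", "B", "C"]
workers = {"A": 5000, "B": 3000, "C": 2000}  # commuters living in each zone
jobs    = {"A": 1000, "B": 6000, "C": 3000}  # jobs located in each zone
minutes = {("A","A"): 5,  ("A","B"): 20, ("A","C"): 30,
           ("B","A"): 20, ("B","B"): 5,  ("B","C"): 15,
           ("C","A"): 30, ("C","B"): 15, ("C","C"): 5}

def friction(t, beta=0.1):
    """Gravity-model impedance: longer trips are exponentially less likely."""
    return math.exp(-beta * t)

# Split each zone's commuters among job locations in proportion to
# jobs available, discounted by travel time.
for o in zones:
    weight = {d: jobs[d] * friction(minutes[(o, d)]) for d in zones}
    total = sum(weight.values())
    for d in zones:
        print(f"{o} -> {d}: {workers[o] * weight[d] / total:,.0f} trips")
```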
These models need a vast amount of data about current travel patterns, much of which can only be estimated. Extrapolating such data decades into the future creates further potential for error and manipulation. Congestion piles on mathematical difficulties of its own: when traffic backs up, the speed at one location depends on volumes elsewhere, and whether a given route is fastest for one driver depends on how many other commuters choose the same route. Highway builders take advantage of this complexity, presenting models to the public as black boxes that only experts understand. Key assumptions are not disclosed.
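That circularity, where route choices depend on travel times and travel times depend on route choices, is typically resolved by iterating toward an equilibrium. A minimal sketch with two invented routes, using the standard Bureau of Public Roads delay curve:

```python
# Assign a fixed demand between two hypothetical routes, shifting traffic
# toward whichever is faster until neither route offers an advantage.

def bpr_time(free_flow, volume, capacity, alpha=0.15, beta=4):
    """Bureau of Public Roads curve: delay grows steeply near capacity."""
    return free_flow * (1 + alpha * (volume / capacity) ** beta)

demand = 6000.0                                    # vehicles per hour, total
free_flow = {"highway": 10.0, "arterial": 15.0}    # minutes on an empty road
capacity  = {"highway": 4000.0, "arterial": 3000.0}

vol = {r: demand / 2 for r in free_flow}
for step in range(1, 200):
    times = {r: bpr_time(free_flow[r], vol[r], capacity[r]) for r in vol}
    fastest = min(times, key=times.get)
    # Method of successive averages: blend toward "everyone on the fastest
    # route" with a shrinking step size, so the volumes settle down.
    vol = {r: vol[r] + ((demand if r == fastest else 0.0) - vol[r]) / step
           for r in vol}

print({r: round(v) for r, v in vol.items()})       # near-equal travel times
```

Even this stripped-down version has to be run to a fixed point; a regional model repeats the exercise across tens of thousands of road segments.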
It’s not news that powerful economic interests can pervert science. The cases of climate change, tobacco, asbestos, and lead are only the most notorious examples. Research is kept in friendly hands so that dangers are known only to the manufacturers (asbestos), or even better, remain undiscovered (leaded gasoline). When that fails, companies manufacture doubt by sponsoring a cadre of friendly researchers who slant studies to yield desired results (cigarettes, global warming).
For all their faults, industry-backed researchers in those fields generally avoided flat-out falsification of study results. The highway agencies, however, have taken the perversion of science to a new level.
Until recently, lack of transparency shielded the inner workings of the modeling process from public view. But two recent investigations, one by each author of this article, managed to get behind the curtain. Both revealed blatant falsification of model results. When forecasters were disappointed by the computer outputs, they simply changed the numbers by hand, passing off the doctored figures as genuine results of the model. The practice of manually altering the results of calculations turns out to be widespread, and the Federal Highway Administration, which should police the modelers, has given it a wink and a nod.
The I-5 Columbia River Bridge
Since 2004, the Oregon and Washington State Departments of Transportation have been promoting a five-mile-long, ten-lane, $7.5 billion bridge and highway expansion on I-5 across the Columbia River between Portland and Vancouver, Washington. The Interstate Bridge Replacement project, previously branded the Columbia River Crossing, has been touted for two decades—long enough to bring to light fundamental flaws in the project’s traffic modeling.
The project’s claimed rationale, repeated despite years of evidence to the contrary, is that traffic volumes across the river will grow rapidly, creating intolerable congestion if nothing is done. In 2005, state highway officials predicted that in the “no-build case”—the scenario if the project is not built—traffic would grow 1.3 percent per year for the next two decades. In reality, traffic growth from 2005 to 2019 averaged just 0.3 percent per year. The travel demand models overstated actual growth by a factor of four—a mistake that current forecasts still repeat.
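The gap compounds over time. A quick check of the arithmetic, using the published rates over the fourteen years from 2005 to 2019:

```python
# Compound the forecast and the observed annual growth rates, 2005-2019.
years = 2019 - 2005
forecast_growth = 1.013 ** years - 1   # 1.3 percent per year, as forecast
actual_growth   = 1.003 ** years - 1   # 0.3 percent per year, as observed

print(f"forecast: +{forecast_growth:.0%}, actual: +{actual_growth:.0%}")
# forecast: +20%, actual: +4%
```

Compounded over the period, the forecast growth of about 20 percent dwarfs the roughly 4 percent that actually materialized.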
The state DOTs presented their traffic projections for the revived project as the findings of a regional travel demand model. But rather than use the model’s outputs, the project’s consultants altered them, inflating predicted rush-hour volumes to falsely support the need for a wider roadway.
Local advocates—including Joseph Cortright, co-author of this article—were able to obtain the actual model results only under state public records laws. Comparing the actual outputs to the DOTs’ published forecast showed that project consultants had systematically altered numbers to favor the proposed project and minimize environmental impacts. These changes made “no-build” traffic volumes look larger, and congestion vastly worse, than the model had predicted. Moreover, the consultants failed to show their work, leaving outsiders no way to check the validity of the alterations.
When the changes were discovered, the DOTs justified them as “post-processing.” Post-processing is a real part of modeling, used in many fields to describe an auxiliary computer program that puts the numerical output of a simulation model into a form understandable by humans or by another computer program. Typical post-processing operations include graphing, interpolation, unit conversion, or smoothing to remove numerical noise. But crucially, genuine post-processing does not alter the findings of the simulation model.
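The distinction is easy to state in code: genuine post-processing transforms the model’s output without changing what it says. A schematic example, with invented numbers:

```python
# Legitimate post-processing: smooth numerical noise in (hypothetical)
# hourly volumes from a simulation run.  The underlying finding survives.
model_output = [1480, 1523, 1497, 1551, 1502, 1534, 1519]  # vehicles/hour

def smooth(series, window=3):
    """Centered moving average; endpoints use a shorter window."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window // 2)
        hi = min(len(series), i + window // 2 + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

print([round(v) for v in smooth(model_output)])

# NOT post-processing: overwriting the model's answer with a preferred one.
# doctored = [round(v * 1.25) for v in model_output]
```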
Maryland Toll Lanes
In September 2017, Maryland Governor Larry Hogan announced a grandiose plan to widen nearly 100 miles of highways around Washington, D.C., by adding privatized toll lanes. The proposal was hotly contested from the start, and due in part to grassroots opposition, was repeatedly scaled back in the years after Hogan’s initial announcement.
Just as the Maryland DOT was winding up its draft environmental study, the D.C.-area Transportation Planning Board issued a new version of its traffic model. Maryland had already done its analysis using the previous model, so it ran the new model for the no-build case to confirm that the two versions yielded similar results. The comparison was included in the draft report, published in October 2020.
A few months later, the project shrank once again, down to a thirteen-mile stretch across the Potomac River on Washington’s famous Beltway and continuing northward on I-270. A supplement to the draft environmental report, issued in October 2021, stated that its forecasts for both build and no-build cases were based on the regional planning board’s newer model version.
Notably, the supplement predicted traffic volumes in the no-build case that were substantially different, by as much as 10 percent, from the traffic predicted by the same model in the previous report. Yet the model had only been run once—a fact never mentioned in the report. Not until two years later, after a contentious fight under the Public Information Act, was it revealed that Maryland DOT had attributed two different sets of numbers to the same model run.
There were manifest errors in the October 2021 forecast. It predicted, for example, that widening highways west of Washington would substantially reduce traffic toward Baltimore and Annapolis on the northeast side of the city. Co-author Benjamin Ross and other opponents of the toll lanes wrote to the Federal Highway Administration, pointing out that the model had to be flawed to produce such patently incorrect predictions. We asked for the model to be fixed and the report redone.
The final environmental report, with a new traffic forecast, appeared the following June. The anomalies identified the previous October had been corrected, but the traffic volumes had also been changed, in ways no computer model could have produced. On July 11, 2022, three weeks before final federal approval of the project was expected, Ross requested an investigation into possible scientific fraud, attracting media attention.
On August 11, this request and Maryland DOT’s reply were referred to specialists at the Volpe Center, a federal transportation research organization in Cambridge, Massachusetts. Just four days later Volpe responded, saying that while manual adjustments to model outputs are sometimes necessary, the Maryland modelers had not explained their adjustments and therefore Volpe could not “assess their plausibility or validity.”
Meanwhile, the scheduled August 5 federal signoff date had passed. Governor Hogan, who had hoped to put the toll lanes at the center of a possible presidential campaign, was furious at the delay. Calling it “outrageous and shocking,” he wrote to President Joe Biden to demand immediate action, and threatened a lawsuit if it were not forthcoming.
Federal approval came on August 25. The Maryland DOT press release announcing the decision blatantly misrepresented the Volpe Center’s findings: “USDOT Independent Review Finds No ‘Scientific Fraud’ in Toll Lane Traffic Model,” the headline declared.
Deep in the fine print of the approval document, however, in the legend of a figure on page twenty of Appendix D, the Maryland DOT admitted to the public for the first time that it had manually changed traffic model outputs. In all, we now know, it had published three substantially different sets of numbers and attributed all of them to a single model run.
A Common Practice
Exaggeration of traffic growth is endemic to the highway engineering profession. Researcher Tony Dutzik reviewed two decades of predictions of automobile usage by state transportation departments, the Federal Highway Administration, and industry groups. In nearly every case, Dutzik found, actual traffic volumes grew substantially more slowly than forecasted. Predictions for individual highways ran even farther off base.
The Ohio and Kentucky highway departments have spent a decade pushing to expand the Brent Spence Bridge connecting Cincinnati and Covington, Kentucky, ostensibly to serve the future traffic increases predicted by the agencies’ models. Over that decade, traffic levels on the bridge have in fact gone down. Nonetheless, the agencies are proceeding with a $3.6 billion project to almost double the size of the bridge.
Again and again, critics such as traffic engineering consultant Norm Marshall find highway agencies ignoring real growth trends and capacity constraints to overstate projected traffic congestion. The predicted no-build congestion is exaggerated; the environmental damage from the added traffic that the wider road will attract is minimized. Building these unrealistic assumptions into traffic models serves the interests of highway builders.
But the rot goes deeper. Much evidence suggests that the practice of altering model results, as uncovered in Oregon and Maryland, is widespread. In an informal survey last summer, modelers from seven states told the advocacy group Transportation for America that their organizations alter outputs manually based on “engineering judgment” or “long-range trends” as part of their post-processing. Similar reports come from former employees of highway agencies elsewhere.
To be clear, simulation modeling need not be done purely by computers. In proper circumstances, the computer output can be combined with other numbers: for example, if a traffic model only simulates the movement of passenger cars, trucks are estimated manually and added to get the total traffic volume. But without a quantitative basis, such changes are mere opinion, not modeling. Concealing alterations to portray manually adjusted numbers as the outputs of an impartial computer model is scientific fraud.
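Schematically, a defensible adjustment carries its justification with it. In this hypothetical example, the truck share and its source are placeholders, not real figures:

```python
# A defensible manual adjustment: the model simulates passenger cars only,
# so trucks are added from an independent, citable source.
car_volume = 52_000    # average weekday cars, from the model run (invented)
truck_share = 0.08     # share of total traffic, per a (hypothetical) state
                       # truck count program, documented in the report

total_volume = round(car_volume / (1 - truck_share))
print(f"cars {car_volume:,} -> total with trucks {total_volume:,}")
# The adjustment is disclosed, sourced, and reproducible, unlike quietly
# replacing model outputs and labeling the result "post-processing."
```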
Many younger traffic engineers are troubled by these practices. Last year, California Department of Transportation Deputy Director Jeanie Ward-Waller filed a whistleblower complaint over the agency’s plans to illegally divert maintenance funding and avoid environmental reviews to widen a stretch of I-80 between Sacramento and Davis. Shortly afterward, Caltrans (as the agency is known) fired Ward-Waller, who is now suing the department for illegal retaliation. Caltrans continues to push ahead with the project, despite opposition from the state’s air pollution regulators. The California Air Resources Board had taken the extraordinary step of debunking Caltrans traffic modeling, which claims that the highway will generate fewer vehicle miles of travel and less pollution if it is widened than if it is not.
Why the Falsification?
If even malignant economic interests such as cigarette and asbestos manufacturers rarely resorted to flat-out falsification of results, why is it so common in traffic modeling? Part of the answer lies in the environmental legislation that requires highway agencies to come up with traffic forecasts. It’s not enough for them to suppress bad results; they must manufacture good ones. Another factor is the models’ sheer complexity. Most model users rely on computer programs and input data developed by others. To cook the books by changing algorithms or inputs would require coordinating a team of people across multiple organizations; it is much simpler to just change the answers.
There are even deeper problems. Even when results aren’t blatantly falsified, they are distorted by inherent biases and shortcomings. For all their complexity, the models omit two basic processes that determine traffic volumes on congested highways. First, they assume drivers respond to congestion only by taking a different route, ignoring the many other ways people adjust. Second, they disregard the limited physical capacity of a highway and the way traffic jams spread beyond the bottlenecks that cause them.
When a car trip takes more time or costs more money, some people walk, cycle, carpool, or choose not to take the trip. Others shift their schedules to avoid rush hour. Over time, people move or change jobs. If a highway is widened to speed up traffic, the missing traffic will return, and job and home relocations will create new traffic. The models in current use are unable to count the drivers waiting in the wings, let alone predict how the number of cars on the road will vary as congestion gets better or worse. As a result, the models often fail when trying to analyze congested roadways.
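The missing feedback can be sketched with a constant-elasticity demand curve. The elasticity below is an illustrative assumption, not an empirical estimate; published values for induced demand vary widely:

```python
# Latent-demand feedback that conventional models omit: when travel time
# falls, trips that were priced out by congestion come back.
base_volume = 100_000   # daily trips under current conditions (invented)
base_time = 40.0        # minutes per trip today (invented)
elasticity = -0.8       # hypothetical response of demand to travel time

def induced_volume(new_time):
    return base_volume * (new_time / base_time) ** elasticity

# A widening that initially cuts the trip to 32 minutes draws out roughly
# 20 percent more traffic, eroding much of the promised time saving.
print(round(induced_volume(32.0)))
```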
On top of that, the spatial structure of the models, based on Traffic Analysis Zones, blurs detail. Traffic is not divided accurately among nearby roads. The user’s guide for at least one regional model even warns against using it to predict traffic on individual roads, before going on to say that it will be used in just that way.
With these weaknesses, models tend to depart from reality even when used with the best intentions. When they fail even to reproduce current traffic conditions, as often happens, modelers introduce fudge factors to create a match, which in turn makes them less sensitive to future changes. Algorithms pushed far outside their realm of applicability spew out nonsense. Modelers replace the nonsense with their own best guesses and call what they’ve done post-processing. From there it’s a short step to altering results to please the boss.
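The fudge-factor problem fits in a few lines; every number here is invented:

```python
# Calibration by fudge factor: the model misses today's traffic count, so
# a fixed multiplier is bolted on to force a match, then carried into the
# forecast year, where nothing guarantees the same error still applies.
observed_today = 80_000
modeled_today  = 64_000
fudge = observed_today / modeled_today     # 1.25, chosen to match today

modeled_future = 70_000                    # raw model output, forecast year
print(round(modeled_future * fudge))       # 87,500 "adjusted" forecast
```

The correction conceals the model’s error instead of diagnosing it, and whatever made the model insensitive today is baked into every future number.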
Indeed, the best possible forecast may be one that forgoes elaborate computations altogether: in crowded urban areas, traffic congestion will remain the same, whether highways get wider or narrower. This prediction is far from perfect; no one doubts, for instance, that widening a highway at a bottleneck point can move the traffic jam elsewhere. But in our experience, it is substantially more accurate on average than current traffic modeling.
The Columbia River bridge story is typical. Modelers two decades ago predicted growing delays unless something were done; but as the widening project has languished, traffic volumes have barely changed. Tearing down San Francisco’s Embarcadero Freeway after a 1989 earthquake made downtown traffic no worse than before. An extreme example is I-405 in Los Angeles, where a carpool lane was added to a ten-mile stretch of highway through the mountains west of Beverly Hills, at a cost of a billion dollars. This was supposed to cut ten minutes off commuting times. But after the new lane opened in 2014, the drive took a minute longer than the year before.
There is, of course, no need to feed data into a computer if your model always predicts that traffic will move at the same speed twenty years hence as it moves now. Scientifically, a simpler model is a better model. But for highway agencies building a case for larger roads and more expensive projects, such a model would be a disaster. They need to predict worse traffic if the highway isn’t widened and better if it is, and to fend off criticism by obscuring the basis for these predictions in a fog of complexity.
By contrast, the last thing the highway agencies want to consider is the one proven way to reduce traffic congestion: charging tolls on existing highways. Such tolls are the reason the Louisville bridges carry fewer cars than they did years ago. (The modelers took the tolls into account, but wildly underestimated their effect in discouraging traffic.) As this example shows, charging a toll high enough to pay for a new bridge will often reduce traffic so much that there’s no reason to build the bridge at all—a fact that explains highway agencies’ widespread resistance to tolling for congestion relief.
Until recently, New York City was poised to use tolls to relieve the traffic jams that have plagued Lower Manhattan for a century. New York stopped adding road capacity decades ago, and much street space has since been converted into bus and bike lanes, parks, and outdoor dining space. In that time, the city has gained more than a million residents and jobs with little effect on traffic congestion, while two-thirds of all trips are now made on foot, by bicycle, or by transit. The overwhelmingly negative reaction to Governor Kathy Hochul’s decision to abort congestion pricing shows the growing support for managing traffic congestion by limiting automobile use instead of making more room for cars.
Traffic modeling, as now practiced, spreads a pseudoscientific veneer over highway engineers’ and contractors’ never-ending quest for ever-larger roads. The demonstrated inaccuracy of current methods is persistently and willfully disregarded, while “post-processing” results to fit a preferred narrative is all too common. The United States keeps wasting billions on road expansions that not only fail to cure congestion, but also make it ever harder to get around without a car. The outcome is more driving, more pollution, more climate-warming gases—and more traffic jams to boot.
Benjamin Ross, a longtime Dissent contributor, is chair of the Maryland Transit Opportunities Coalition.
Joseph Cortright is the director of City Observatory in Portland, Oregon.