FairWarning Reports

Highway Agency Takes a Hit Over Safety Report on Electronic Billboards

An electronic billboard in Sarasota, Florida

Photo courtesy of Scenic America

Why did the billboard cross the road?

It sounds like the opening line of a corny joke, but it’s actually a question raised by a baffling glitch in a Federal Highway Administration study on the safety of electronic billboards. Billboards that seem magically to have moved from one side of the highway to the other are part of a detailed critique by a former FHWA researcher, who says the federal report is so badly flawed that no one should rely on its conclusions.

It’s just the latest, and most prominent, black eye for the federal research, which was announced with fanfare in 2007.

The $859,000 FHWA study had been eagerly anticipated by local agencies across the country, including some that held up permit decisions on electronic billboards to await federal guidance. They hoped the report would shed light on whether the visually stunning digital signs, which change messages every few seconds, might pose a threat to traffic safety.

The FHWA study, performed for the agency by the consulting firm Leidos, used sophisticated eye-tracking equipment to time drivers’ glances at billboards and other visual features along assigned routes in Reading, Pa., and Richmond, Va. Finally released years behind schedule, the study found that the digital signs did not prompt drivers to look away from the road long enough to increase the risk of crashes. The billboard industry, which has aggressively pushed to install more of the lucrative displays, trumpeted the results as confirmation of their safety.

But the lengthy critique issued last month by Jerry Wachtel, who worked for the highway administration in the 1970s and ’80s and served as an adviser in a preliminary phase of the study, says the research is so riddled with errors and contradictions that it should be disregarded. (A summary of his critique can be found on the website of the Eno Center for Transportation, a think tank.) His report lists experts from the U.S. and five other countries who reviewed his conclusions, including one who wrote: “It is highly disappointing, even irresponsible, that a study anticipated for so long on such an important question has been so poorly executed.”

For their part, officials of the FHWA and Leidos, which formerly was known as SAIC, have refused to answer questions about the federal study. “FHWA has no additional comment beyond those made in the report itself,” spokesman Doug Hecox said in an email.

Wachtel, who heads The Veridian Group, a consulting firm, says he wasn’t hired to dissect the report, but decided to do it on his own. He said he was concerned that local officials lacking the technical background to analyze the FHWA study would simply read the conclusions “and begin to promulgate regulations based on this faulty data.”

“The breadth and depth of the mistakes and errors were so substantial,” he said, “that it was either very, very poor science, or there was something about it that they were trying to hide.”

Seizing on Safety Issue

While billboard foes oppose digital signs mainly on aesthetic grounds, they’ve also seized on the safety issue, and are touting Wachtel’s critique. “We know that the issue of digital billboards and traffic safety is far from settled,” said Mary Tracy, president of the anti-billboard group Scenic America, in a prepared statement. “Any public agency considering allowing the bright, blinking signs on their roadsides should take this critique into account first.” The Outdoor Advertising Assn. of America, the billboard industry trade group, did not respond to requests for comment.

Jerry Wachtel

As reported by FairWarning, the federal study was originally slated for completion in 2009. It was already overdue when peer reviewers shredded a draft in spring 2011. According to records obtained under the Freedom of Information Act, the expert reviewers said the eye-glance times recorded for the test drivers were far too brief to be credible, suggesting serious problems with the equipment or mistakes in analyzing the data.

“The reported glances to billboards here are on the order of 10 times shorter than values reported elsewhere,” one reviewer wrote. Said another: “The data reported as average glance durations are not plausible.”

The reviewers “have serious concerns” about glance data that “greatly undermine their confidence in the report,” former FHWA official Christopher Monk told the lead author, William A. Perez, in a May 2011 email released in response to FairWarning’s Freedom of Information Act request.

“Suffice to say that if we cannot adequately address these concerns either through counterargument or through re-analyzing, I doubt OST [Office of the Secretary of Transportation] will let it go out and it will be perceived (correctly) as a failure on our part.”

By then, agency officials were being peppered with inquiries about the status of the overdue study. Records show that they repeatedly answered, euphemistically, that it was under review. “Have no idea when we can change that message (do you?) but we will plan to continue to sound like a broken record,” wrote one official in an email to another. “Wish we could end this.”

Changes Not Explained

The study was finally released more than two years later, on Dec. 30, 2013. It featured major adjustments to the eye-glance data, without explaining how they were recalculated. It said the longest recorded glance at an electronic billboard was 1.34 seconds—less than the two seconds that some authorities say raises crash risks.

According to the study, “The results did not provide evidence” that electronic billboards, “as deployed and tested in the two selected cities, were associated with unacceptably long glances away from the road.”

The Outdoor Advertising Assn. of America quickly embraced the finding. “Studies have long shown that digital billboards do not cause distracted driving behavior,” said its president and CEO, Nancy Fletcher, “and this new study comes to the same conclusion.”

Much of Wachtel’s critique is steeped in bone-dry technical argot, though some puzzling details and factual discrepancies are apparent to an ordinary reader.

It notes, for example, that the federal study did not measure glances for the entire time drivers were approaching billboards. And while the draft report had listed a combined 20 electronic and 10 standard billboards on the driving courses of the two cities, the final report included only eight signs of each type. There was no explanation for discarding the rest.

The sizes of billboards and distance of setbacks from the road were also changed from one version to the other, the critique said. It added: “Perhaps the greatest concern for a reader attempting to understand the findings of this study is that, between the draft and final reports, some target billboards appear to have crossed from one side of the road to the other.”

Myron Levin - FairWarning

About the author

Myron Levin is editor of FairWarning.

3 comments to “Highway Agency Takes a Hit Over Safety Report on Electronic Billboards”

  1. Maggie

    My concern with the digital billboards is not so much that a driver looks at them too long, though I agree that could well be a problem. Moving lights are hard not to look at, and then you may look too long. My main concern has been when I drive by these beasts at night. Sometimes they are sooooo bright that they blow out my night vision and make seeing the roadway and other cars up ahead more difficult for a while, even if I intentionally do not look at the billboard (which can be hard, because our eyes do tend to look at these moving images even if we don’t want to look at them). And around here they are located along some of the busiest areas of busy highways, where even a small distraction can have serious consequences.

  2. Bill Brinton

    A fundamental question arises about the original “draft” study that was shredded by a peer review of three people. Did the “final” study substantially change from the “draft” study? If so, was that final study (the one released to the public) subject to its own peer review? If not, why not? Members of the billboard industry often explain that it takes 5 seconds of viewing time for billboard advertising copy to be effective. If that is true, then the federal study indicates that even those drivers who do look in the direction of a billboard advertisement do so for at most 1.34 seconds, the longest recorded glance. Perhaps that is a fact that the outdoor advertising industry should share with would-be advertisers. How many drivers were not even looking in the direction of the billboard? Advertising professionals who sell and market competing modes of media communication may now have data that suggests what may not be effective at all, as opposed to what may in fact be effective. If you have an advertising budget, what is the best way to spend it?

Leave a comment