Outdoor Advertising Control

National Study

Executive Summary

Survey Design And Implementation

Survey Results

Conclusions

Recommendations For Future Research

Executive Summary

The Federal Highway Administration's (FHWA's) Office of Real Estate Services (ORES) sponsored, in coordination with the National Alliance of Highway Beautification Agencies (NAHBA), a national survey of all states to determine the types of outdoor advertising (ODA) data each state collects, how that data is collected, and how it is maintained and used. The survey was conducted over a five-month period utilizing the World Wide Web and achieved an 81 percent response rate.

The survey revealed that the data collected by each state and the manner in which it is collected depends on several factors. These factors are present in various combinations and to varying degrees in different states. Consequently, the availability of a consistent set of data from state to state is relatively low. Moreover, even where there is consistency in data collection, the exact definition of ODA terms and various sign categories appears to differ to the degree that data continuity is questionable.

Based on the survey, some of the factors that appear to affect the type of data that is collected by a state and the manner in which it is collected include:

The survey showed that many states are now using automated data collection systems for the administration of their outdoor advertising programs. The majority of those states find these systems sufficient for managing the day-to-day needs and requirements of their ODA program. On the other hand, the majority of states using non-automated data collection systems find them inadequate for managing their ODA program.

An attempt was made in the survey to determine the level of consistency in the definition of certain outdoor advertising terms amongst the states. The terms investigated include "normal maintenance", "urban area" and "unzoned commercial and industrial area". There was a high level of consistency amongst the definitions of "urban area", although as many as 18 out of 39 responding states stated that their state laws, regulations and/or agreements do not contain a specific definition of an "urban" area. There was only a fair degree of consistency amongst the definitions of "unzoned commercial and industrial area", although a greater percentage of respondents at least confirmed that their state laws, regulations and/or agreements do contain a specific definition of an "unzoned commercial and industrial area". There was very little consistency amongst the states in the definition of "normal" or "customary" maintenance, and many states responded that their state laws, regulations and/or agreements do not contain such a definition.

A detailed explanation of the survey design and implementation process, along with a presentation of the survey's results, is provided in the paragraphs below.

Survey Design and Implementation

Survey Design

There are two basic aspects of survey design - content (questions) and format (length, style, array of possible responses, etc.). To some degree, both of these aspects depend on the manner in which the survey will be conducted, e.g. via telephone, in person or via the web. Due to various project constraints, it was decided by JFA in consultation with FHWA and NAHBA that the ODA survey would be conducted via the NAHBA website, with responses submitted electronically directly to JFA. Conducting the survey in this manner emphasized the need for short, concise questions; pull-down menus or radio buttons that limited the set of possible responses; and a survey that could stand on its own without a great deal of introduction or explanation. The format also allowed for some narrative response, and space was provided for comments and further explanation of responses if necessary. JFA created the survey using Microsoft's FrontPage software. The survey page was actually located on JFA's website, with a seamless link to the page provided on NAHBA's website.

Content

Two shorter surveys previously conducted by NAHBA (See Appendix A) were the starting point for the content of the current survey instrument. At the time the survey instrument was designed, JFA had already attended the 1999 NAHBA Annual Conference in Lexington, KY and visited five states on a related outdoor advertising research project. Knowledge gained through these experiences also contributed to the content of the current survey.

The survey questions were written by JFA and reviewed by FHWA and NAHBA. The survey instrument was then pre-tested on seven states. The pre-test was designed to test both the survey questions and the method of conducting the survey. Just as when the survey was conducted nationally, pre-test states were notified of the survey via email and encouraged to visit NAHBA's website to complete the survey. These states were given further instruction to provide their comments and feedback on the survey via telephone or email to JFA.

Pre-test participants were asked to comment on the clarity of the survey's introduction, clarity of the questions, ease of the response process (e.g. using the web), possible question omissions and overall impressions of survey (e.g. length, relevance, etc.). Pre-test participants were also asked to provide an estimate of the amount of time it took them to complete the survey. All seven of the pre-test states completed the survey and their comments were used to further refine the survey instrument. Problems were detected at this time with the electronic receipt of responses. These problems were fixed prior to implementing the survey nationally. Several questions were re-worded, a few questions were deleted and others were added.

After pre-test revisions, the survey instrument contained a total of thirty-two questions (although not all questions applied to all respondents). The questions were split into two main topical areas - outdoor advertising program administration issues and outdoor advertising data and data collection issues. The data and data collection issues section was further split into questions related to non-conforming signs, illegal signs and questions specific to states with certified cities. (See Appendix B for a copy of the full survey.)

Format

The survey had a brief, five-sentence introductory paragraph that was separated from the body of the survey with a line and different text color. As many questions as possible were worded so as to elicit a "yes" or "no" response. For those questions that could not be answered "yes" or "no", limited options were provided as responses. On the few occasions when a question required a textual response, text boxes of sufficient length were provided.

Out of the thirty-two questions in the survey, twenty-four were yes/no questions, three required the respondent to check all that apply from a list of possible responses, and three provided a list of three or fewer possible responses from which the respondent could select no more than one answer. Several questions of this variety also provided space for further textual clarification if necessary. Two questions required a numerical or short text (one word) response. In these instances, reasonable numerical ranges were set for the possible responses. Several of the yes/no questions also included a separate area for a textual response. Large text boxes were provided, for example, for those questions that asked for the definition of a term. Exhibit 1 shows a sample of two survey questions.

Exhibit 1: Sample of Survey Format

  1. Does your state permit by number of sign faces or by sign location?

     Sign Face / Sign Location / Other (explain)

  2. Does your state conduct regularly scheduled updates of your statewide sign inventory?

     Yes / No

Survey Implementation

When the survey instrument was complete, a group email was sent to all ODA administrators in the 47 states and the District of Columbia that allow billboards [1]. This email explained the research project and directed all ODA administrators to visit NAHBA's website and complete the survey. The email contact list was compiled by FHWA from its regional office staff and checked by JFA against contact names supplied by NAHBA. A few state administrators did not have access to email; a copy of the survey was sent to these states via facsimile. (See Appendix C for the complete list of ODA survey contacts.)

Responses were received by JFA's web server and were checked daily. Responses were downloaded into an Excel spreadsheet upon receipt. Each response was reviewed and respondents were contacted via telephone or email if there were any incomplete or inconsistent answers.

A second group email was sent to all ODA contacts a month after the first. This message thanked those states that had already participated in the survey and again urged those that had not to visit NAHBA's website and complete the survey. A third email message was sent another month later. This email again thanked states for their participation and reminded those that had not completed the survey of the date by which JFA would be tallying the responses. These follow-up emails also mentioned that the results of the survey would be presented at the NAHBA Annual Conference in San Antonio, TX in October 2000. NAHBA representatives also solicited responses to the survey when contacting states about their registration for the 2000 NAHBA conference.

Survey responses were solicited over a five-month period (June through October 2000). The responses were tallied and analyzed using an Excel spreadsheet. A PowerPoint presentation summarizing the results and a numeric tally of the survey results were prepared for the NAHBA 2000 Annual Conference held in October 2000. States attending the conference that had not responded to the survey were encouraged to still do so. The survey results were, therefore, not finalized until December 2000. Two additional states responded as a result of extending the response time beyond the conference.
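Although the actual tally was produced in Excel, the counts and percentages reported in Exhibit 2 could be reproduced with a short script. The following is a minimal sketch only; the CSV file name ("oda_responses.csv") and the question-named columns (e.g. "Q4", "Q4A") are hypothetical and are not part of the original survey materials.

    # Minimal sketch (hypothetical file and column names): tally survey answers
    # the way Exhibit 2 reports them - a count of states and the percentage of
    # the applicable universe.
    import csv
    from collections import Counter

    def tally(rows, question):
        """Count each answer to a question and its share of the answering universe."""
        counts = Counter(row[question] for row in rows if row.get(question))
        universe = sum(counts.values())
        return {answer: (n, round(100.0 * n / universe, 1))
                for answer, n in counts.items()}

    with open("oda_responses.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Full-universe question, e.g. Q4 (regularly scheduled inventories):
    print(tally(rows, "Q4"))
    # Sub-universe question, e.g. Q4A asked only of states answering "Yes" to Q4:
    print(tally([r for r in rows if r["Q4"] == "Yes"], "Q4A"))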

Survey Results

Thirty-nine out of 48 states responded to the survey, resulting in an 81.3 percent response rate. A tally of all the responses is presented in Exhibit 2. Beyond Exhibit 2, this section also highlights the results with respect to some of the general topics covered in the survey. These topics include: the various forms of ODA program administration; definitions of ODA terms; types of data collection systems; types of automated field data collection devices; and methods of inventorying legal conforming, non-conforming and illegal signs. (Appendix D provides the PowerPoint presentation of the survey results given at the Washington, DC conference in January 2001.)

Statistics

Summary of ODA Programs

The following charts show the distribution amongst the respondents of various attributes of state ODA program administration. The majority of states, 64 percent versus 36 percent, operate their programs in a centralized environment. Sixty-nine percent (27 states) permit signs by sign location rather than by sign face. Most respondents perform regularly scheduled inventories. Of those that do, just under half (14 states) perform them annually, 8 states perform them biennially, 4 perform them biannually and 4 perform them quarterly.
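For reference, the inventory-frequency shares reported in Exhibit 2 (Q4A) are computed over the 30 states that conduct regularly scheduled inventories, not over all 39 respondents:

    Annual       14 / 30 = 46.7%
    Biennial      8 / 30 = 26.7%
    Biannual      4 / 30 = 13.3%
    Quarterly     4 / 30 = 13.3%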

[Pie charts: Scheduled Inventory (about 80% yes); How Often (from largest to smallest: Annual, Biennial, Quarterly, Biannual)]

EXHIBIT 2: SURVEY TALLY

(Each response is shown as the number of states, followed by the percentage of the question's universe in parentheses.)

Q1. Environment in which your ODA program is administered (universe: all respondents)
    Centralized: 25 (64.1%)
    Decentralized: 14 (35.9%)

Q2. Require a business to be licensed before issuing a sign permit (universe: all respondents)
    Yes: 11 (28.2%)
    No: 28 (71.8%)

Q3. Permit by sign face or sign location (universe: all respondents)
    Location: 27 (69.2%)
    Face: 6 (15.4%)
    Both: 4 (10.3%)
    Other: 2 (5.1%)

Q4. Do you conduct regularly scheduled sign inventories (universe: all respondents)
    Yes: 30 (76.9%)
    No: 9 (23.1%)

Q4A. How often (universe: respondents answering "yes" to question 4)
    Annual: 14 (46.7%)
    Biannual: 4 (13.3%)
    Biennial: 8 (26.7%)
    Quarterly: 4 (13.3%)

Q5. Controlled routes defined by (universe: all respondents)
    Map: 15 (38.5%)
    Written description: 3 (7.7%)
    Both: 21 (53.8%)

Q6. If FAP designation eliminated, controlled route miles lost (universe: respondents (13) providing a numeric answer to this question)
    Average: 2,432
    Greatest: 9,730
    Least: 60

Q6A. If FAP designation eliminated, percent of controlled route miles lost (universe: respondents (20) providing a numeric answer to this question)
    Average: 30.6%
    Greatest: 80%
    Least: 0%

Q7. Do your state laws/regulations have a definition of unzoned commercial/industrial area (universe: all respondents)
    Yes: 32 (82.1%)
    No: 7 (17.9%)

Q8. Do your state laws/regulations have a definition of urban area (universe: all respondents)
    Yes: 21 (53.8%)
    No: 18 (46.2%)

Q9. Do your state laws/regulations have a definition of normal maintenance (universe: all respondents)
    Yes: 21 (53.8%)
    No: 18 (46.2%)

Q10. Do your state laws/regulations address movable/portable signs (universe: all respondents)
    Yes: 13 (33.3%)
    No: 26 (66.7%)

Q11. Are your state laws more restrictive than your federal/state agreement (universe: all respondents)
    Yes: 14 (35.9%)
    No: 25 (64.1%)

Q12. Have you modified your federal/state agreement (universe: all respondents)
    Yes: 9 (23.1%)
    No: 30 (76.9%)

Q13. Do you use an automated data collection/inventory system (universe: all respondents)
    Yes: 26 (66.7%)
    No: 13 (33.3%)

Q14. How many months have you used an automated system (universe: respondents answering "yes" to question 13)
    Average: 137.2
    Median: 78
    Greatest: 372
    Least: 12

Q15. Do you consider your automated system sufficient (universe: respondents answering "yes" to question 13)
    Yes: 16 (61.5%)
    No: 10 (38.5%)

Q16A. Automated system includes use of digital photos (universe: respondents answering "yes" to question 13)
    6 (23.1%)

Q16B. Automated system includes use of GPS (universe: respondents answering "yes" to question 13)
    7 (26.9%)

Q16C. Automated system includes use of laser binoculars (universe: respondents answering "yes" to question 13)
    5 (19.2%)

Q17. Do you consider your non-automated system sufficient (universe: respondents answering "no" to question 13)
    Yes: 4 (30.8%)
    No: 9 (69.2%)

Q18. Do you plan to implement a new automated system within a year (universe: all respondents)
    Yes: 13 (33.3%)
    No: 26 (66.7%)

Q19. Does your system track all signs, including those that do not require a permit under the HBA (universe: all respondents)
    Yes: 29 (74.4%)
    No: 10 (25.6%)

Q20. Can you distinguish between signs on different types of controlled routes (universe: all respondents)
    Yes: 31 (79.5%)
    No: 8 (20.5%)

Q21. Can you distinguish between signs in urban vs. rural areas (universe: all respondents)
    Yes: 16 (41.0%)
    No: 23 (59.0%)

Q22A. Does your system track square footage of legal signs (universe: all respondents)
    34 (87.2%)

Q22B. Does your system track square footage of illegal signs (universe: all respondents)
    17 (43.6%)

Q22C. Does your system track square footage of nonconforming signs (universe: all respondents)
    31 (79.5%)

Q23. Can you create an inventory of nonconforming signs (universe: all respondents)
    Yes: 31 (79.5%)
    No: 8 (20.5%)

Q24. Can you tell why a sign is nonconforming (universe: respondents answering "yes" to question 23)
    Yes: 16 (51.6%)
    No: 15 (48.4%)

Q25. Can you identify the law/regulation under which a sign is nonconforming (universe: respondents answering "yes" to question 23)
    Yes: 7 (22.6%)
    No: 24 (77.4%)

Q26. Can you tell if a nonconforming sign was part of a ROW purchase (universe: respondents answering "yes" to question 23)*
    Yes: 10 (31.3%)
    No: 22 (68.8%)

Q27. Can you tell if federal dollars were spent to acquire a nonconforming sign (universe: respondents answering "yes" to question 23)
    Yes: 16 (51.6%)
    No: 15 (48.4%)

Q28. Can you inventory illegal signs (universe: all respondents)
    Yes: 19 (48.7%)
    No: 20 (51.3%)

Q29. Does an illegal sign remain trackable throughout the legal process (universe: respondents answering "yes" to question 28)**
    Yes: 20 (95.2%)
    No: 1 (4.8%)

Q30. Are removal dates for illegal signs maintained (universe: respondents answering "yes" to question 28)***
    Yes: 12 (60.0%)
    No: 8 (40.0%)

Q31. Do you have certified cities (universe: all respondents)
    Yes: 8 (20.5%)
    No: 31 (79.5%)

Q32A. Could you inventory legal signs within certified cities (universe: respondents answering "yes" to question 31)
    6 (75.0%)

Q32B. Could you inventory nonconforming signs within certified cities (universe: respondents answering "yes" to question 31)
    6 (75.0%)

Q32C. Could you inventory illegal signs within certified cities (universe: respondents answering "yes" to question 31)
    4 (50.0%)

*One state indicated that it cannot create an inventory of non-conforming signs, but that it can tell if a non-conforming sign is part of a ROW purchase.

**Two states indicated that they cannot create an inventory of illegal signs, but an illegal sign is tracked throughout any legal process.


[Pie charts: Program Environment (Centralized 64%, Decentralized 36%); Permit Type (approximately 70% Location, 15% Face, 10% Both, 5% Other)]

Definitions Provided in State Laws, Regulations and Agreements

The vast majority of respondents, 82 percent, stated that their state laws, regulations and/or Federal/State agreements include a specific definition of an "unzoned commercial and industrial area". Only 54 percent, or 21 states, said that their state laws, regulations or agreements include a definition of an "urban area". Similarly, 54 percent said that their state laws, regulations or agreements include a definition of "normal maintenance" as it applies to non-conforming signs.

There is a fair degree of consistency amongst the definitions of an unzoned commercial and industrial (C/I) area. These definitions usually contain the following components:

Differences between states' definitions of an unzoned commercial and industrial area include:

The definitions provided for an urban area quite consistently reference the U.S. Census Bureau definition of an area with 5,000 or more people. There is very little consistency amongst the definitions of "normal" or "customary" maintenance, however. Most states with a definition of "normal" maintenance at least state that "substantial change" is not allowed, but not all states establish a specific value or percentage of the total value of a sign at which point repairs become more than normal maintenance. When states do monetarily define the point at which repairs become more than "customary" maintenance, it is usually expressed in terms of a percent of the replacement cost of the sign.
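As a purely illustrative example (the threshold percentage and dollar figures below are assumed, not drawn from any particular state's law): if a state capped customary maintenance at 50 percent of a sign's replacement cost and a sign would cost $20,000 to replace, the calculation would be

    customary-maintenance ceiling = 0.50 x $20,000 = $10,000

so repairs exceeding $10,000 in that state would be more than "customary" maintenance.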

Types of Data Collection Systems

As the charts below illustrate, the majority of respondents (67 percent) utilize automated data collection systems. Most of those using such systems find them sufficient for supporting the functions of their outdoor advertising program. Most of the states using non-automated systems do not find those systems sufficient for administering their ODA programs.

[Pie charts: Is your program automated? (67% yes, 33% no); Is your automated system sufficient? (62% yes, 38% no); Is your non-automated system sufficient? (31% yes, 69% no)]

The areas in which states identified their systems (automated or non-automated) as insufficient include:

As seen below, several states indicated that they are planning to implement a new automated system within a year.

[Pie chart: Implementing an automated system within a year? (67% no, 33% yes)]

Those states reporting that they will be implementing a new automated system within a year include Kansas, North Carolina, New York, Oregon, Nebraska, Connecticut, Arizona, Oklahoma, Texas, Colorado, Tennessee, Wisconsin, and Ohio.

Components of Automated Systems

Of those states using automated systems, some are also utilizing various automated field data collection tools. These tools include Global Positioning System (GPS) location tools (backpack units, magnetic car-top mounted units, hand-held laser guns), digital cameras, and distance/dimension measuring binoculars. Of the 26 respondents utilizing automated systems, six are using digital cameras, seven are using GPS units of some kind, and five are using dimension measuring binoculars.

[Bar chart: Data collection technologies used by respondents with automated systems - Digital photos: 6; GPS: 7; Laser binoculars: 5]


Types of Sign Information and Inventories Available

Regardless of the type of data collection system used, the survey attempted to ascertain what information states had the capability of providing about various types of signs. In other words, even if a state does not currently track or report a particular piece of information, could its data collection system (automated or non-automated) allow it to assemble that information if asked? The following charts show that most states said they could create an inventory of all signs in their state, including signs exempted under the HBA. Most respondents, 79 percent, could also create an inventory of non-conforming signs. Slightly less than half of the respondents, however, could create an inventory of illegal signs.

[Pie charts: Can create inventory of all signs (74% yes, 26% no); Can create inventory of nonconforming signs (79% yes, 21% no); Can create inventory of illegal signs. Bar chart: Can you determine the square footage of these types of signs? Legal: about 35; Illegal: about 15; Non-conforming: about 30]


States were also asked whether or not they could determine the size of various types of signs. Thirty-four states said they could determine the square footage of legal, conforming signs. Seventeen states said the same about illegal signs, and 31 states could determine the square footage of non-conforming signs.

The survey also investigated whether states could determine if signs in their inventories are located in urban versus rural areas (regardless of what their particular state definition of urban/rural may be) and whether they could distinguish the type of controlled route on which a sign is located (e.g. interstate versus FAP). As the charts below illustrate, the vast majority of respondents indicated that they could determine the type of controlled route on which a sign is located. The majority of respondents could not, however, determine whether a sign is in a rural versus an urban area.

[Pie charts: Distinguish between different types of controlled routes (79% yes, 21% no); Distinguish between signs in urban vs. rural areas (41% yes, 59% no)]

Non-Conforming Sign Inventories

The survey asked several questions specifically about non-conforming signs. The following charts summarize the types of information that respondents reported they could provide about non-conforming signs. States were asked if they could determine the reason a sign is non-conforming, e.g. size, location, etc. They were asked further if they record the specific law, regulation or agreement under which a sign is non-conforming. States were also asked if they could determine whether or not a non-conforming sign was acquired as part of a Right-of-Way acquisition and whether or not federal funds were spent to acquire a non-conforming sign. As the charts reveal, slightly more than half of those states that can create an inventory of non-conforming signs could also tell why the sign is non-conforming. Most, however, do not keep track of the specific regulation, law or agreement under which a sign is non-conforming.

Similarly, most states responded that they could not inventory non-conforming signs that had been part of a Right-of-Way acquisition. Slightly more than half of the states that can create a non-conforming sign inventory can also identify those non-conforming signs that were acquired with federal dollars.

[Pie charts: Why is the sign nonconforming? (52% yes, 48% no); Regulation/law that makes the sign non-conforming (23% yes, 77% no); Non-conforming signs part of ROW acquisition (31% yes, 69% no)]

Illegal Sign Inventories

The survey also addressed illegal sign inventories specifically. States were asked if illegal signs remain trackable within their system (automated or non-automated) throughout the legal process. They were also asked whether removal dates are maintained for those illegal signs that are removed. As the charts below illustrate, the vast majority of those respondents that can create an inventory of illegal signs also have the ability to track illegal signs throughout the legal process. Approximately 60 percent of respondents with the ability to create an illegal sign inventory also maintain removal dates for those illegal signs that are removed.

[Pie charts: Illegal sign trackable throughout legal process (95% yes, 5% no); Removal dates maintained for illegal signs (60% yes, 40% no); Federal dollars spent to acquire (52% yes, 48% no)]

Conclusions

Beyond the more specific outdoor advertising control issues highlighted by the statistics presented above, a few other general conclusions can be drawn from the survey results. The results reveal that states administer their ODA programs very differently from one another. As a result, the data that each state considers important to collect and maintain, and the systems used to do so, also vary considerably. Not one question in the survey, for example, was answered 100 percent uniformly by all respondents. The highest degree of uniformity on any question whose universe was the entire set of respondents was 87 percent (34 states). That was in response to the question of whether or not a state's data collection system (automated or non-automated) reflects the square footage of legal, conforming signs - a question that many involved in outdoor advertising control would probably have expected all respondents to answer in the affirmative.

The survey also shows that states utilizing automated data gathering systems feel better able to administer their ODA programs effectively than states that do not use an automated system. With many states implementing automated systems, the potential for uniformity in data collection increases (e.g. most automated systems are sophisticated and flexible enough to accommodate various reporting requests even if a particular format was not specifically built into the system).

However, even if all states were automated and could produce a similar report containing the same data elements, those reports would be useless unless each data element were defined the same way in each state. The survey shows that not all states define certain basic outdoor advertising terms similarly. Beyond the terms that were part of the survey ("normal maintenance", "urban area" and "unzoned commercial and industrial area"), respondents' state laws and regulations were reviewed as part of the survey results analysis, and differences were also observed in the definitions of other terms such as "illegal" sign and "non-conforming" sign.

Recommendations for Future Research

No matter how carefully and clearly the survey questions may have been written and pre-tested, asking the questions verbally, with the opportunity for immediate follow-on questioning, would result in more accurate responses. Selected follow-up phone calls and/or in-person visits, guided by NAHBA staff's knowledge of certain states' programs, could produce even more useful results.

Perhaps certain issues could be identified as the target for follow-up surveying. The number of states to be surveyed could be dictated by the specific issue to be investigated. Some questions, for example, may only need to be asked of Bonus states or states with certified cities. Some questions may only apply to states that are utilizing automated data collection systems or certain types of automated field data collection devices. Some issues may require surveying all states.

Some issues to be further investigated could include:


[1] Maine, Vermont, Hawaii and Alaska do not allow billboards.
