5. Data Collection

5.1 Data Collection Schedule

The survey was conducted over 37 days to enable 1,500 interviews to be completed. The survey period was initially from October 1 through October 31, 2009, but was extended by one week to November 7 to increase the response rate and number of completions.

5.2 Interview Procedures

The following sections outline the key phases of the interviewing procedures utilized in the survey.

5.2.1 Pre-Testing

Standard pre-testing protocols were utilized to ensure that the survey instrument was programmed correctly and to make sure each survey item was clear and easy to understand.

  • The pre-test instrument was reviewed by a project manager together with the pre-testing interviewers to discuss question intent and any potential challenges and issues.
  • A pre-test sample was created from a list of households in targeted areas.
  • The listed households were called, and pre-test interviews for the survey were conducted when appropriate.
  • The pre-test interviews were monitored by the project manager and data collection manager, and the interviewers were debriefed after an interview was conducted.
  • Issues that emerged during survey administration, such as respondent questions or confusion and interviewer errors, were recorded by the project manager and data collection manager.
  • Clients listened in on interviews and provided feedback.
  • All calls that lasted over one minute were recorded and placed into the archive for future reference.
  • A Pre-Test Form was filled out by an interviewer to record any problems or issues that emerged during an interview.

Problems and issues that pre-testing interviewers focused on included the amount of time needed to administer the survey, the wording and order of questions, respondent motivation, and transitions (i.e., whether changes in topic were smooth or abrupt). Questions that yielded high occurrences of the same behavior (e.g., the respondent asking what a question meant) were carefully examined and recorded, along with how long it took respondents to answer them.

Two rounds of pre-tests were conducted for the October 2009 OHS. The first round of pre-tests consisted of 17 interviews. During these interviews, it was found that each interview took more than the 10-15 minutes that DOT specified for the interview time. As a result, Questions L1010 and L1030 were deleted from the questionnaire. The second round of pre-tests was conducted using the shortened questionnaire. This round consisted of eight interviews. The two rounds of pre-tests also led to the rephrasing of a question and correction of typos in the questionnaire.

Timing
Certain items were only asked of individuals who gave a specific response to a previous question. Thus, the length of time it took to administer the survey varied among respondents. During pre-testing, time used for each pre-test was recorded, and the average time of administering the survey was calculated. The average time for an interview was 20 minutes in the first round of pre-tests; it was reduced to 16 minutes in the second round of pre-tests as a result of shortening the questionnaire and improving the wording of the questions.

Question Wording and Order
The following situations regarding question wording and order were recorded and examined by interviewers:

  • Questions that had awkward wording.
  • Questions that asked something other than what they were intended to ask.
  • Questions that were difficult for the respondent to understand. (During the first round of pre-tests, it was found that respondents did not understand the meaning of "video monitor." The question in concern was then rephrased and was tested in the second round of pre-tests.)
  • Questions that appeared to be out of order.
  • Questions that were redundant.
  • Questions that were not applicable for a certain set of respondents.

Behavior Coding
Interviews were monitored to determine whether the interviewer read each question correctly, whether the respondent answered correctly and/or asked for clarification, and how long it took the respondent to answer. This ensured that all questions were clearly understood and that each question served its intended purpose. Questions that were not clearly understood were identified for modification in order to obtain the necessary information.

Respondent Motivation
Interviewers were asked to provide the respondent's motivations for taking the survey on the Pre-Test Form. The information helped to determine whether "encouraging" statements needed to be inserted at any point during the survey to keep the respondent's desire to complete it at the optimal level.

Transitions

Transitions were inserted throughout the survey to indicate to the respondent that they were progressing well and to make them aware of how many more sections of the survey remained. Through pre-testing, interviewers and managers noted when they believed such statements needed to be inserted based on their administration of the survey and how well the topics followed one another.

5.2.2 Interviewer Training

All interviewers were given general training in interviewing techniques and skills and in the use of the Computer Assisted Survey Execution System (CASES), computer-assisted telephone interviewing software developed by the University of California, Berkeley. They were also provided an intensive training session tailored to the requirements of the October 2009 OHS. All interviewers were required to review and sign a confidentiality statement before working on the October 2009 OHS project.

The interviewer training first focused on identifying factors that can cause interviewer and respondent bias, as well as interviewing and record keeping techniques. Special attention was given to training interviewers on how to introduce themselves and the project to respondents and on how to make appointments and callbacks. They were taught correct interviewing and probing techniques, including how to read questions exactly as worded, record open-ended responses verbatim appropriately, and respond to respondents' questions. Interviewers were also trained on how to fill out call sheets and enter correct call disposition codes both on call sheets and in the data file.

Interviewers were then trained on how to use CASES to administer telephone interviews. They worked through a CASES training survey instrument to learn how to enter responses effectively and how to manipulate the survey instrument during an interview. As a part of the general training, they role played different interviewing scenarios with a supervisor, reviewing all of the common questions and responses by respondents.

All interviewers participated in a special training session for the October 2009 OHS project. The goals and the objectives of the project were reviewed with the interviewers. BTS staff members discussed confidentiality requirements, gave their perspective on the survey, and discussed the use of the survey data. The new survey instrument was reviewed and potential problems or issues were fully discussed with interviewers. A special role playing using the questionnaire for the October 2009 OHS was conducted with interviewers acting as both interviewer and respondent in turn.

A customized interviewing manual, the 2009 OHS Training Manual, was prepared for training and was reviewed by interviewers during training. The manual provided information on the scope and potential issues that could arise during an interviewing session. The manual included the goals and objectives of the project, terms specific to the survey instrument, and information on administering the survey. Scripted responses to common questions regarding the OHS project were also included for the interviewers to use.

5.2.3 Pre-Contact Letter

Five calendar days prior to the start of data collection, a BTS-approved pre-contact letter was sent to households of sampled telephone numbers with an address. The intent was for each household with an address to receive the pre-contact letter several days before they received a call for an interview.

The pre-contact letters were sent out in four batches, with an interval of roughly a week between consecutive mailings. The first mailing was sent on September 25, 2009, consisting of 944 respondents in the national sample and 425 respondents in the sample of targeted MSAs. The second mailing went out on October 3, 2009, consisting of 964 in the national sample and 440 in the sample of targeted MSAs. The third mailing went out on October 11, 2009, with 511 in the national sample and 209 in the sample of targeted MSAs. The fourth mailing went out on October 19, 2009, with 179 in the national sample and 140 in the sample of targeted MSAs. In total, 2,598 pre-contact letters were sent to respondents in the national sample and 1,214 to respondents in the sample of targeted MSAs, which accounted for approximately 49 percent of the national sample and 43 percent of the sample of targeted MSAs.

An "800" number was listed in each letter with the specific times to call (Monday through Friday, 9 a.m. to 12 a.m. EST; Saturday, 10 a.m. to 2 p.m.; and Sunday, 5 p.m. to 12 a.m. EST). Should respondents call outside the listed times, they would receive a phone message asking them to leave their name and number so that someone would contact them as soon as possible to conduct the interview.

5.2.4 Call Attempts and Callbacks

A standardized procedure of multiple call attempts and a three-phase message procedure were used to encourage participation. Under the standardized calling procedure, a sampled telephone number was called up to 30 times, with calls during the daytime, in the evening, and on weekends. Standardized multiple call attempts were made between voice messages, but a message was not left at every call attempt when an answering machine was reached (due to concern that people might avoid the call or feel "harassed" if they were away for a few days and found multiple messages on their answering machine upon returning home). Given the limited duration of fielding, a household with an answering machine was called two to three times per day during the October 2009 OHS. This number was established to strike a balance between perceived harassment and encouraging participation.

Messages were left to encourage households to at least pick up the telephone when they were called or to encourage them to call back when they were available. To avoid annoying a respondent by leaving multiple messages, a three-phase message procedure was implemented: the first message was left after reaching an answering machine two or three times; the second message was left halfway through the calling window; and the third message was left two or three days before the end of the calling window. Each message was progressively more earnest and urgent. This three-phase message procedure resulted in more call-ins from respondents after each successive message.
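As a minimal sketch of this three-phase message schedule (written in Python purely for illustration; this is not the software used in the study, and the exact thresholds are assumptions drawn from the description above), the decision of whether to leave a message on a given call could be expressed as follows:

```python
from datetime import date
from typing import Optional

def message_phase(answering_machine_hits: int, today: date,
                  window_start: date, window_end: date,
                  messages_left: int) -> Optional[int]:
    """Decide whether to leave the first, second, or third message on this
    call, following the three-phase rule described above; returns None when
    no message should be left. Thresholds are illustrative assumptions."""
    midpoint = window_start + (window_end - window_start) / 2
    if messages_left == 0 and answering_machine_hits >= 3:
        return 1                      # first message after 2-3 machine hits
    if messages_left == 1 and today >= midpoint:
        return 2                      # second message halfway through window
    if messages_left == 2 and (window_end - today).days <= 3:
        return 3                      # third message near the end of fielding
    return None
```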

Toward the end of the survey, a more aggressive approach to reaching more respondents was employed in an effort to improve the final response rate. Daily multiple attempts (up to five times per day) were made to reach respondents.

5.2.5 Disposition Codes

Table 7 shows a list of disposition codes and their descriptions. Interviewers used these codes to record the outcome of each call and to determine its scope.

Table 7: Interviewer Disposition Codes

Out of Scope (ineligible for study participation)  
22 No one 18 years old or older in household
28 Respondent unavailable before and during the study period
40 Business
42 Disconnected number
43 Number changed
47 Computer/Fax/Pager
48 Cell Phone
Scope Undetermined  
11 - NQ No answer after 5 rings - not qualified
12 - NQ Busy - not qualified
15 - NQ Answering machine - not qualified
16 - NQ Left first message - not qualified
17 - NQ Left second message - not qualified
18 - NQ Left third message - not qualified
21 - NQ Too ill/hearing disabled/Mental Incapacitation - not qualified
29 - NQ Respondent does not speak English or Spanish - not qualified
30 - NQ General callback - not qualified
31 - NQ Callback at time/date by informant - not qualified
32 - NQ Callback at time/date by respondent - not qualified
45 - NQ Cannot complete call - not qualified
46 - NQ Privacy manager on - not qualified
60 - NQ First informant refusal - not qualified
70 - NQ Second informant refusal - not qualified
80 - NQ First respondent refusal - not qualified
90 - NQ Second respondent refusal - not qualified
91 - NQ Hard refusal/Take me off of your list - not qualified
In Scope  
01 Interview Completed
03 Partial complete: willing to finish
06 Partial complete: refused to finish
06, 60, 70, 80, 90, 91 → 01 Refusal Conversion
11 - Q No answer - qualified
12 - Q Busy - qualified
15 - Q Answering machine - qualified
16 - Q Left first message - qualified
17 - Q Left second message - qualified
18 - Q Left third message - qualified
21 - Q Too ill/hearing disabled/Mental Incapacitation - qualified
23 - Q Respondent deceased prior to completion of the interview - qualified
28 - Q Respondent not available during study - qualified
29 - Q Respondent does not speak English or Spanish - qualified
30 - Q General callback - qualified
31 - Q Callback at time/date by informant - qualified
32 - Q Callback at time/date by respondent - qualified
44 - Q Area code changed, but not the number
45 - Q Cannot complete call - qualified
46 - Q Privacy manager on - qualified
60 - Q First informant refusal - qualified
70 - Q Second informant refusal - qualified
80 - Q First respondent refusal - qualified
90 - Q Second respondent refusal - qualified
91 - Q Hard refusal/Take me off of your list - qualified

Note: For our purposes, Q (qualified) indicates that a respondent was screened and found to be 18 years of age or older and residing in an eligible household according to the parameters of the study. NQ (not qualified) indicates that an eligible respondent had not yet been selected, so it was unknown whether the household was eligible for participation in this study.

5.2.6 Household Screening

A qualified respondent had to be a household member 18 years of age or older. While the household member who answered the phone - the informant - might be 18 or older, the survey was not conducted with him or her, to avoid potential bias. Instead, the informant was asked to identify, as the qualified respondent, the household member whose birthday came immediately after his or hers. A random selection was made when the informant did not know the birthdays of any household members. The next-birthday method of respondent selection has been shown to be a relatively efficient procedure for selecting a sample that is representative of all household members. If the selected household member was not available at the time of the call, a callback was scheduled to screen and/or interview the respondent. On average, screening respondents and reviewing the required confidentiality statement took less than four minutes.
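The selection rule can be pictured with the short Python sketch below. It is only an illustration of the next-birthday logic described above, not the actual CASES screener, and the data structure and function name are assumptions:

```python
import random
from datetime import date
from typing import Dict, Optional

def select_respondent(adult_birthdays: Dict[str, Optional[date]],
                      informant: str) -> str:
    """Illustrative sketch of next-birthday selection (not the actual CASES
    screener). adult_birthdays maps each household member aged 18 or older
    to a birthday, or to None when the birthday is unknown."""
    others = {m: b for m, b in adult_birthdays.items() if m != informant}
    if not others:                        # informant is the only adult
        return informant
    ref = adult_birthdays.get(informant)
    known = {m: b for m, b in others.items() if b is not None}
    if ref is None or not known:
        # Birthdays unknown: fall back to a random selection, as in the text.
        return random.choice(list(others))
    def days_after(b: date) -> int:
        # Day-of-year distance from the informant's birthday (year 2000 is a
        # leap year, so February 29 birthdays are handled).
        return (b.replace(year=2000).toordinal()
                - ref.replace(year=2000).toordinal()) % 366
    return min(known, key=lambda m: days_after(known[m]))
```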

5.2.7 Interviewing Methods

Incentives were not offered to potential respondents in exchange for their participation in the survey. Interviews were conducted in both English and Spanish. When a potential respondent refused to be interviewed, the reason for refusal was recorded. The average length of a completed interview was about 15 minutes, in addition to the time needed to screen and recruit a potential respondent.

At the beginning of an interview, interviewers introduced themselves, specified whom they worked for and the purpose of the survey, and assured the potential respondent that this was not a sales call. Interviewers then determined whether there was an eligible person in the household. Once contact was made with the eligible household member, the interviewers reintroduced themselves as needed and explained the purpose of the survey. The interviewers also indicated to the respondents that the survey would take about 15 minutes to complete, that all information would remain confidential, and that the study was voluntary, so respondents could refuse to answer any question.

If a potential respondent agreed to participate in the survey, the respondent was provided an opportunity to ask any questions to be answered by the interviewer, and then the interview was conducted. However, if it was not a convenient time, a callback was scheduled. When a respondent refused to participate in or complete the survey, the case was moved to a "refusal buster" who was trained to overcome refusals. The "refusal buster" called the respondent back after waiting two days. Refusal conversion efforts helped to increase the number of valid cases in the final samples. In the final data of the October 2009 OHS, over 13 percent of cases in the national sample and about 15 percent of cases in the original sample of targeted MSAs resulted from refusal conversions.

5.3 Data Quality Control Procedures

Standard procedures were implemented for data quality control. Data were reviewed and examined by senior analysts for outliers, entry errors, and missing variables. Each variable was examined to ensure that responses fell within expected parameters. Potentially invalid responses and outliers were further investigated. Variables were cross-checked against each other to ensure internal consistency between responses to interrelated variables.
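As a rough illustration of such range checks (the file name, variable names, and bounds below are hypothetical; the actual checks were defined by the project's senior analysts), a reviewer might run something like:

```python
import pandas as pd

# Hypothetical extract file and variables, for illustration only.
data = pd.read_csv("ohs_oct2009_extract.csv")
expected_ranges = {"respondent_age": (18, 110), "household_size": (1, 20)}

# Flag any responses that fall outside the expected parameters for review.
for var, (low, high) in expected_ranges.items():
    flagged = data[(data[var] < low) | (data[var] > high)]
    if not flagged.empty:
        print(f"{var}: {len(flagged)} response(s) outside [{low}, {high}] to review")
```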

When inconsistencies or outliers were found, related call sheet logs and notes and the actual recordings of the interview in question were reviewed to determine if data had been incorrectly interpreted or entered by the interviewer. While the survey was still in the field, callbacks were made by supervisors to respondents for cases that could not be reconciled through a review of the logs or recordings. Once the survey interval ended, these cases were flagged and reported.

5.3.1 Interviewer Performance

Interviewer performance was ensured through the implementation of standard survey interviewing procedures, constant monitoring, and a verification process. The implementation of standard interviewing procedures provided the prerequisites for high-level interviewer performance. Each interviewing shift began with a staff meeting to review any issues that had emerged from previous calling efforts. Interviewers were then assigned a set of call sheets to cover that shift. All call dispositions (date, time, interviewer number, and result) were captured in two ways. First, the survey questionnaire was programmed in CASES to capture the results of each call and to place the information into a database for analysis. Second, call disposition results were collected electronically, and the interviewer identification number, date and time of the call, final disposition, and any comments that the interviewer deemed relevant were entered on paper call sheets. The paper call sheets allowed interviewers to quickly assess each case and determine when it was best to call the respondent again. Before each shift, a supervisor reviewed the call sheets and then passed them out to interviewers for calling at all standard times. Analysis of the call dispositions from previous shifts helped to determine when a respondent was most likely to be available to complete the interview. At the end of each shift, the head supervisor filled out a supervisor log documenting any events and issues that emerged during the shift. The log sheets were reviewed by the survey manager each day. The supervisor log was made available to the BTS OHS project team upon request.

Throughout the survey, a one-to-five supervisor-to-interviewer ratio was maintained, and each interviewer was monitored at least once each shift. Supervisors were always on the floor with the interviewers and were available to answer questions or handle problems throughout all phases of interviewing. All interviewers were also monitored via a monitoring station in the survey unit to ensure that unbiased and reliable data were collected. A silent monitoring process allowed supervisors to listen to interviews live without the interviewers' knowledge. Corrective feedback was promptly provided to interviewers whenever needed, and appropriate actions were taken when necessary. At least once a week, each interviewer's progress was evaluated through discussion with supervisors, and the interviewers were provided with written evaluations documenting both positive and inappropriate behaviors. All completed surveys were reviewed by supervisors for completeness of responses.

In addition, verification of completed interviews was conducted through callbacks to respondents. About 15 percent of all completed interviews from the previous day were selected by supervisors for verification. The respondents were asked a set of questions to ensure that the appropriate respondent was interviewed and to obtain feedback on the interviewer's administration of the questionnaire. The verification process was completed by a supervisor alongside the interviewers, further reminding them of the importance of obtaining quality data while treating all respondents with respect.

5.3.2 Other Procedures

In addition to general checking and cleaning, responses to "other specify" items were pulled out to determine whether they could be back-coded into the pre-existing response codes for closed-ended questions. During an interview, the interviewer must make quick decisions regarding the correct response code to use for any item. While most items were easily coded, the coding of responses to some types of questions, such as race or ethnicity, could be difficult to determine. For this survey, when a response was not easily placed into a pre-existing code, the verbatim response was recorded instead. A review of the verbatim responses helped determine whether they could be recoded back into the initial codes. If a response in the "other specify" category did not match any code, it was left unchanged and was provided to the client along with any open-ended question responses. All open-ended verbatim responses were reviewed to ensure that they were complete and understandable. In cases where a response was not complete, the interviewer was asked to call the respondent to re-ask the question.
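A toy example of that back-coding step is sketched below; the item, the response codes, and the matching rule are hypothetical and far simpler than the review that was actually performed:

```python
# Hypothetical pre-existing codes for a closed-ended item; the actual OHS
# codebook is not reproduced here.
precoded = {"own car": 1, "bus": 2, "walk": 3}

def back_code(verbatim: str):
    """Return the matching pre-existing code when the verbatim 'other
    specify' answer maps onto one; otherwise pass the text through
    unchanged, as described above."""
    return precoded.get(verbatim.strip().lower(), verbatim)
```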

For call-in interviews, telephones were staffed by daytime interviewers and by staff who were trained to conduct interviews. Interviewers were available Monday through Friday, 9 a.m. to 12 a.m.; Saturdays, 10 a.m. to 2 p.m.; and Sundays, 5 p.m. to 12 a.m. in each time zone.

5.4 Summary of Data Cleaning

The use of the survey interviewing software CASES greatly facilitated the process of data cleaning because the software is designed to allow only pre-programmed response codes to be entered into the system. Thus, it effectively prevents invalid responses from being entered erroneously during interviewing. CASES is also extremely flexible in that it allows for continuous internal data quality checks. Once interviewing was completed for the survey, all data were sent through a cleaning process that checked for data inconsistencies. All substantive and disposition result data were extracted into an ASCII file format several times so that the quality checking process was continuous throughout the survey effort.

After the data were extracted, they were reviewed by research analysts to check for internal consistency according to the interrelationship between variables and to identify any potential error. When an error was identified during the data checking and cleaning process, original data files were reviewed for verification. Corrections were made once the error was confirmed. Detailed notes and records of all changes and corrections were kept and maintained.

5.5 Coding of Missing Values

The OHS contains questions that are not asked of certain respondents based on their response(s) to other questions. In addition, there are usually some respondents who do not know the answer to or choose not to answer some questions in the survey. Each of these responses can have a different meaning to the data user. While each of these response categories is important in characterizing the results of the survey, they are often removed from certain analyses, particularly those involving percentages. To preserve the unique characteristics of these responses, three categories of responses are given special codes for easy identification. Table 8 presents the response categories and how they are represented in each data file for October 2009 OHS.

Table 8: Summary of Codes for Missing Values by Data File Format

Response Category | SAS® Version 9.1 | Microsoft Excel® | Text Comma Delimited
Appropriate skip | -9 | -9 | -9
Refused | -7 | -7 | -7
Don't know | -8 | -8 | -8
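For data users, a minimal sketch of setting these codes aside before computing percentages is shown below (Python/pandas, with a hypothetical file name; the procedure applies equally to the SAS or Excel versions of the data):

```python
import pandas as pd

# Hypothetical file name for the text comma-delimited version of the data.
ohs = pd.read_csv("ohs_oct2009.csv")

# -9 = appropriate skip, -7 = refused, -8 = don't know (Table 8).
special_codes = [-9, -7, -8]

# Recode the special values to NA so they drop out of percentage calculations,
# while the original file keeps the distinct codes.
analysis = ohs.replace(special_codes, pd.NA)
```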

5.6 Response Rates

The procedures for response rate calculation are based on the guidelines established by the Council of American Survey Research Organizations (CASRO) for defining a response rate. The response rate is the proportion of completed interviews to the total number of eligible respondents. The total number of eligible respondents is the sum of the in-scope respondents and the estimated eligible portion of the scope-undetermined respondents, where the eligible portion is estimated by applying the percentage of in-scope respondents among all respondents whose scope was determined (in scope plus out of scope).

The final response rate for the survey is obtained using the following formula:

\[
\text{Response rate} = \frac{\text{Completed household interviews}}{\text{Households in scope} + \text{Scope undetermined} \times \dfrac{\text{Households in scope}}{\text{Households in and out of scope}}}
\]

A total of 1,082 interviews were completed during the survey period for the national survey and 504 interviews for the survey of targeted MSAs. The numbers of households in scope and out of scope and scope-undetermined households are shown in Table 9 for the national sample and in Table 10 for the sample of targeted MSAs. As shown below, the response rates for both the national sample and the sample of targeted MSAs were below the targeted response rate of 50 percent despite a variety of efforts made to increase responses.

For the national sample, a response rate of 44.1 percent was achieved in the following manner:

\[
\text{Response rate} = \frac{1{,}082}{1{,}580 + \left(1{,}902 \times \dfrac{1{,}580}{3{,}448}\right)} = 0.4413
\]

For the original sample of targeted MSAs, a response rate of 43.3 percent was achieved in the following manner:

\[
\text{Response rate} = \frac{504}{755 + \left(993 \times \dfrac{755}{1{,}837}\right)} = 0.4333
\]
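The same arithmetic can be reproduced from the counts in Tables 9 and 10 below; the short Python sketch here is only a convenience for checking the calculations, not part of the survey software:

```python
def casro_response_rate(completed: int, in_scope: int,
                        out_of_scope: int, undetermined: int) -> float:
    """CASRO response rate as defined above: completed interviews divided by
    in-scope households plus the estimated eligible share of the
    scope-undetermined households."""
    eligibility = in_scope / (in_scope + out_of_scope)
    return completed / (in_scope + undetermined * eligibility)

# Counts from Tables 9 and 10 below.
print(f"{casro_response_rate(1082, 1580, 1868, 1902):.1%}")  # 44.1%
print(f"{casro_response_rate(504, 755, 1082, 993):.1%}")     # 43.3%
```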

Table 9: Final Dispositions - National Sample

Summary of disposition

Telephone numbers available: 6,964
# Telephone No. released: 5,350
Telephone numbers not dialed: 0
Telephone number dialed: 5,350
CASRO Response rate (%): 44.1%

Distribution of household cases by disposition code

Interviewer Disposition Code | Final Disposition Code | Disposition Description | Number of Households
In-Scope Numbers | | | 1,580
01 | 1 | Interview Completed | 940
03, 06 | 2 | Partial Complete | 57
06, 60, 70, 80, 90, 91 → 01 | 3 | Refusal Conversion | 142
15, 16, 17, 18 - Q | 4 | Answering Machine/Message - Q | 0
21, 29 - Q | 5 | Deaf/Lang/Ill/Mental Incap - Q | 36
60, 70, 80, 90 - Q | 6 | Refusal - Q | 173
91 - Q | 7 | Hard refusal - Q | 79
28 - Q | 8 | R not available during study - Q | 16
23 - Q | 9 | R deceased prior to completion of the interview - Q | 0
44 - Q | 10 | Area code changed, but not the number - Q | 0
30, 31, 32 - Q | 31 | Callback - Q | 137
45 - Q | 12 | Cannot complete call - Q | 0
46 - Q | 13 | Privacy Manager - Q | 0
Out-of-Scope Numbers | | | 1,868
40 | 14 | Business | 674
47, 48 | 15 | Computer/Fax/Pager/Cell Phone | 751
42 | 16 | Disconnected number | 410
43 | 17 | Number change | 24
22 | 18 | No one 18 years old or older in HH | 2
28 | 19 | Respondent unavailable before and during study | 7
Scope-Undetermined Numbers | | | 1,902
11 - NQ | 32 | No answer - NQ | 367
12 - NQ | 21 | Busy - NQ | 29
15 - NQ | 33 | Answering Machine - NQ | 119
16, 17, 18 - NQ | 23 | Left Message - NQ | 0
45 - NQ | 24 | Cannot complete call - NQ | 2
46 - NQ | 25 | Privacy Manager - NQ | 0
21, 29 - NQ | 26 | Deaf/Lang/Ill/Mental Incap - NQ | 33
60, 70, 80, 90 - NQ | 27 | Refusal - NQ | 136
91 - NQ | 28 | Hard refusal - NQ | 199
03, 06 - NQ | 29 | Partial Complete - NQ | 0
44 - NQ | 30 | Area code changed, but not the number - NQ | 1
30, 31, 32 - NQ | 34 | Callback - NQ | 1,016
Total | | | 5,350

Table 10: Final Dispositions - Sample of Targeted MSAs

Summary of disposition

Telephone numbers available: 7,326
# Telephone No. released: 2,830
Telephone numbers not dialed: 0
Telephone number dialed: 2,830
CASRO Response rate (%): 43.3

Distribution of household cases by disposition code

Interviewer Disposition Code | Final Disposition Code | Disposition Description | Number of Households
In-Scope Numbers | | | 755
01 | 1 | Interview Completed | 429
03, 06 | 2 | Partial Complete | 15
06, 60, 70, 80, 90, 91 → 01 | 3 | Refusal Conversion | 75
15, 16, 17, 18 - Q | 4 | Answering Machine/Message - Q | 0
21, 29 - Q | 5 | Deaf/Lang/Ill/Mental Incap - Q | 25
60, 70, 80, 90 - Q | 6 | Refusal - Q | 78
91 - Q | 7 | Hard refusal - Q | 54
28 - Q | 8 | R not available during study - Q | 3
23 - Q | 9 | R deceased prior to completion of the interview - Q | 0
44 - Q | 10 | Area code changed, but not the number - Q | 0
30, 31, 32 - Q | 31 | Callback - Q | 76
45 - Q | 12 | Cannot complete call - Q | 0
46 - Q | 13 | Privacy Manager - Q | 0
Out-of-Scope Numbers | | | 1,082
40 | 14 | Business | 393
47, 48 | 15 | Computer/Fax/Pager/Cell Phone | 413
42 | 16 | Disconnected number | 257
43 | 17 | Number change | 12
22 | 18 | No one 18 years old or older in HH | 3
28 | 19 | Respondent unavailable before and during study | 4
Scope-Undetermined Numbers | | | 993
11 - NQ | 32 | No answer - NQ | 242
12 - NQ | 21 | Busy - NQ | 6
15 - NQ | 33 | Answering Machine - NQ | 27
16, 17, 18 - NQ | 23 | Left Message - NQ | 0
45 - NQ | 24 | Cannot complete call - NQ | 5
46 - NQ | 25 | Privacy Manager - NQ | 4
21, 29 - NQ | 26 | Deaf/Lang/Ill/Mental Incap - NQ | 46
60, 70, 80, 90 - NQ | 27 | Refusal - NQ | 83
91 - NQ | 28 | Hard refusal - NQ | 131
03, 06 - NQ | 29 | Partial Complete - NQ | 0
44 - NQ | 30 | Area code changed, but not the number - NQ | 0
30, 31, 32 - NQ | 34 | Callback - NQ | 449
Total | | | 2,830

The OHS contains questions that ask respondents to supply the demographic information necessary to categorize their age, gender, and/or education. There are usually some respondents who choose not to answer some of these questions in the survey. For respondents who do not want to provide this information, the most common reasons given for non-response are:

  • "I don't like giving my age,"
  • "I would rather not say,"
  • "I don't like to be labeled," and
  • "That is personal information."

Common reasons for non-response when respondents are asked about contacts they may have had with any government agencies and/or why they contacted the agencies are:

  • "I don't want to say because I don't trust the government,"
  • "I don't want to answer because I have an issue pending," and
  • "I would rather not say."