AAPOR Task Force Report, 66
Active listening, 44
American Association for Public Opinion Research (AAPOR), 83
American Consumer Opinion, 61
American Trends Panel (ATP), 60
Answering questions, willingness to, 13–14
Audience, survey results
knowing, 75
lack of understanding by, 75–77
Behavior coding, 35
Built-in assumptions questions, 17
Callback, 40
Changing technology, and survey research, 53–71
Chartjunk, 90
Checks-and-balances, 85
Coding, 42
behavior, 35
Cognitive interviews, 34
Computer-assisted personal interviewing (CAPI), 55
Computer-assisted telephone interviewing (CATI), 9, 55
Content, survey results, 79–81
privacy, confidentiality, and proprietary information, 83–84
Data, 1
coding, 42
collection, 2
confidentiality agreements, 83
making available to other social scientists, 43–44
Data entry, 43
Digital divide, 58
Disposition codes, 40
Documentation, for interviewer, 45–46
Double-barrel questions, 16
Double-negative questions, 17
Dynamic probes, 39
Empirical, 1
Experts, reviewing survey, 35
Expression, survey results, 85
text, graphs, and tables, 86–93
Focus groups, 34
Good questions. See also Survey questions
goals for, 11
key elements of
common question pitfalls, avoiding, 16–17
specificity, clarity, and brevity, 14–15
Google, 34
Google Scholar, 34
Granularity, 24
Human subject protections, 83
IBM 305 RAMAC, 54
Inflation, 8
Inter-university Consortium for Political and Social Research (ICPSR), 43–44
Interactive probes, 39
Internet, 64
-accessible smartphones, 57
Interviewer
-administered survey, 44
debriefing, 35
manual, 46
–respondent–questions interaction, 3
training
practice interviews, 46
providing documentation for, 45–46
Interviewing. See Questioning
Level of granularity, 80
Listening, 44
Loaded/leading questions, 16
Mailed surveys, probing in, 39–40
Marketing Research, 73
Measurement, 2
advantages of, 10
Mobile technologies, 64. See also MCAPI; MCATI
internet access and survey mobility, 56–59
Moore’s Law, 54
Narrative presentation. See Text
National Center for Health Statistics, 56
New York Times, 53
Nonresponse bias, 47
Online survey, 9, 19, 57, 61–62
Ordered/scalar response category, 25
Participants, 74
Practice interviews, 46
Probability-based panels, 59–60
Probes, defined, 38
Probing
in mailed surveys, 39–40
in web surveys, 39
Proprietary information, 83–84
Questioning, as social process, 2–3, 6
Refusals, 45
Research design, 2
Researchers, 74
Role playing, 46
Satisficing, 36
Scientific approach, 1
Self-administered survey, 41
Semantic Differential Scale, 26–28
Skip patterns, 36
Smartphones
-dependent, 57
internet-accessible, 57
Social process, questioning as, 2–3, 6
Sponsors, 74
Statistical story-telling, 86
Structured questions. See Closed-ended questions
Survey panels, 59
probability-based panels, 59–60
Survey presentations, 73
Survey questions. See also Good questions
responses to, 87
validity and reliability in, 11–13
Survey reports
writing, 43
Survey results
audience
knowing, 75
lack of understanding by, 75–77
content, 79–81
privacy, confidentiality, and proprietary information, 83–84
expression, 85
text, graphs, and tables, 86–93
Survey technology, 53–71. See also specific technologies
Surveys
administering
linking to other information, 41–42
probing in mailed surveys, 39–40
probing in web surveys, 39
behavior coding and interviewer debriefing, 35
cognitive interviews, 34
focus groups, 34
other surveys, 34
interviewer training
practice interviews, 46
providing documentation for, 45–46
listening, 44
making data available to other social scientists, 43–44
mode of delivery, 37
processing data
coding, 42
data analysis, 43
data entry, 43
writing reports, 43
Tailored Design Method, 9
Telephone surveys
disposition codes for, 40
Total Design Method, 9
Universal Automatic Computer. See UNIVAC
Unstructured questions. See Open-ended questions
Wall Street Journal, 54
Web surveys, probing in, 39
YouGov, 61