Description of Courses

Analysis Methods for Complex Sample Survey Data

SurvMeth 614 (3 credit hours)

Instructors: Yajuan Si and Brady West, University of Michigan

This course provides an introduction to specialized software procedures that have been developed for the analysis of complex sample survey data. The course begins by considering the sampling designs of specific surveys: the National Comorbidity Survey-Replication (NCS-R), the National Health and Nutrition Examination Surveys (NHANES), and the Health and Retirement Study (HRS). Relevant design features of the NCS-R, NHANES and HRS include weights that take into account differences in probability of selection into the sample and differences in response rates, as well as stratification and clustering in the multistage sampling procedures used in identifying the sampled households and individuals.
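
Although the course does not prescribe a particular software package, a minimal sketch of the kind of design-based analysis these features require, using the R survey package (named here only as an illustration) and its built-in California API example data, looks like this:

    # Minimal sketch: declare a stratified design and produce design-corrected estimates
    library(survey)
    data(api)   # example data shipped with the survey package; apistrat is a stratified sample

    # Strata (school type), sampling weights, and finite population corrections;
    # cluster (multistage) designs would be declared through the ids argument
    dstrat <- svydesign(ids = ~1, strata = ~stype, weights = ~pw, fpc = ~fpc, data = apistrat)

    # Design-corrected mean and standard error of an outcome
    svymean(~api00, dstrat)

    # Design-based regression with linearized standard errors
    summary(svyglm(api00 ~ ell + meals, design = dstrat))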

Prerequisite: Two graduate-level courses in statistical methods, familiarity with basic sample design concepts, and familiarity with data analytic techniques such as linear and logistic regression.

Why take this course? 

  • To gain an understanding of modern methods and software for the secondary analysis of survey data collected from large complex samples
  • To have the opportunity for one-on-one interaction with the instructors when walking through analyses of survey data
  • To see several examples of applied statistical analyses of survey data
  • To have the experience of writing a scientific paper that presents an analysis of complex sample survey data, and getting expert feedback on that paper

2018 Syllabus (PDF)

 

Applied Sampling/Methods of Survey Sampling

SurvMeth 625 (3 credit hours)

Instructors: James Wagner and Raphael Nishimura, University of Michigan

A fundamental feature of many sample surveys is a probability sample of subjects. Probability sampling requires rigorous application of mathematical principles to the selection process. Methods of Survey Sampling is a moderately advanced course in applied statistics, with an emphasis on the practical problems of sample design, which provides students with an understanding of the principles and practical skills required to select subjects and analyze sample data. Topics covered include stratified, clustered, systematic, and multi-stage sample designs; unequal probabilities and probabilities proportional to size; area and telephone sampling; ratio means; sampling errors; frame problems; cost factors; and practical designs and procedures. Emphasis is on practical considerations rather than on theoretical derivations, although an understanding of the principles requires review of statistical results for sample surveys. The course includes an exercise that integrates the different techniques into a comprehensive sample design.

Prerequisite: Two graduate-level courses in statistical methods.

2018 Syllabus (PDF)

 

Cognitive Interviewing for Testing Survey Questions  Cancelled for 2018

SurvMeth 988.204 (1 credit hour)

Instructor: Pamela Campanelli, UK Survey Methods Consultant
 
 

This 2-day course is designed to familiarize participants with Cognitive Interviewing, a powerful and efficient method of testing survey questions, and to give them practice in using it. A full range of Cognitive Interviewing techniques will be covered (e.g., think-alouds, probing, observation, paraphrasing, rating tasks, response latency measurement, and card sort classification tasks). The course includes practical information on how to implement the various methods as well as “hands-on” sessions where participants will have the chance to practice the major methods. There is also a session where cognitive interviewing is practiced on participants’ questionnaires. The course also covers recruitment and sampling of participants, analysis, moving from results to improved survey questions, and reporting, as well as broader uses of Cognitive Interviewing and its comparison and combination with other survey question testing methods.

Prerequisite: There is no prerequisite, but some knowledge of questionnaire design is of value.

2018 Syllabus (PDF)

 

Data Collection Methods Cancelled for 2018

SurvMeth 623 (3 credit hours, remote participation option available)

Instructors: Frederick Conrad, University of Michigan and Florian Keusch, University of Michigan

This course reviews a range of survey data collection methods that are both interview-based (face-to-face and telephone) and self-administered (mailed and online questionnaires, i.e., web surveys). Mixed-mode designs are also covered, as well as several hybrid modes for collecting sensitive information (e.g., self-administering the sensitive questions in what is otherwise a face-to-face interview). The course also covers newer methods such as mobile web and SMS (text message) interviews, and examines alternative data sources such as social media. It concentrates on the impact these techniques have on the quality of survey data, including error from measurement, nonresponse, and coverage, and assesses the tradeoffs between these error sources when researchers choose a mode or survey design. This is not a how-to-do-it course on survey data collection; rather, it focuses on the error properties of key aspects of the data collection process.

Students will view recorded lectures and complete reading assignments in preparation for class discussion sessions, which will occur twice per week, one hour per session. Students are expected to attend all discussion sessions either in person or via BlueJeans. Successful discussion sessions depend on preparation and active participation by everyone enrolled in the course, so students should have questions or discussion topics in mind for each class session.

Remote participation option:  It is not necessary to be physically in Ann Arbor to take the course.  Students who cannot be in Ann Arbor can enroll and join discussion sessions via  BlueJeans (https://www.bluejeans.com/).

Once enrollment is confirmed via email, indicate whether course attendance will be in person in Ann Arbor or via BlueJeans.

Prerequisite: An introductory course in survey research methods or equivalent experience. If joining remotely, participants must have a computer, camera, and headset available to join the class via BlueJeans (https://www.bluejeans.com/).

2018 Syllabus (PDF)

 

Generalized Linear Latent and Mixed Models (GLLAMMs) for Complex Survey, Biometric, and Educational Data  Cancelled for 2018

SurvMeth 988.201 (2 credit hours)

Instructor: Stephen Schilling, University of Michigan

The last 50 years have seen the development and use of multilevel and mixed models, latent and structural equation models, generalized linear models, generalized linear mixed models, item response theory (IRT) models, and longitudinal models across a wide variety of disciplines. Statisticians have often noted the overlap between these methods but have lacked a unifying approach to estimation, testing, and application. Recently, Rabe-Hesketh and colleagues have developed such a unified approach: Generalized Linear Latent and Mixed Models. GLLAMMs allow estimation of multilevel models for binary, ordinal, and count data that include structural equation relationships among latent variables underlying observed data. Diverse applications include 1) multilevel models of complex survey data with multistage sampling, unequal sampling probabilities, and stratification; 2) explanatory IRT models; 3) modeling endogenous switching and sample selection models for binary, count, and ordinal data; 4) generalized linear models with covariate measurement error; 5) biometrical modeling of twin and family data; 6) multivariate methods for meta-analysis of genetic association studies; 7) multilevel models for discrete choice and rankings; 8) conjoint choice models of consumer preference; 9) differential item functioning and test bias; and 10) enhancing the validity and cross-cultural comparability of survey research.

This class will consist of daily morning lectures and afternoon labs. Lectures begin with an introduction to GLLAMMs, focusing on their structure and estimation, and then move on to specific applications, concentrating on modeling of complex survey data, biometric data, and educational data. Worked examples will be performed in the labs using the GLLAMM software, available for free for use with STATA. We will also demonstrate Bayesian estimation of GLLAMM models using STAN within R.
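
As a rough orientation to the simplest model in this family (and only as an illustration; the course itself uses the GLLAMM software for STATA and STAN within R), a two-level random-intercept logistic model can be fit in R with the lme4 package. The data below are simulated and the variable names are hypothetical:

    # A minimal sketch: pupils nested in schools, binary outcome, random school intercepts
    library(lme4)
    set.seed(1)

    df <- data.frame(
      school = factor(rep(1:50, each = 20)),
      x      = rnorm(1000)
    )
    u <- rnorm(50, sd = 0.8)                                   # school-level random effects
    df$y <- rbinom(1000, 1, plogis(-0.5 + 0.7 * df$x + u[df$school]))

    # Random-intercept logit: y ~ x with a random intercept for school
    fit <- glmer(y ~ x + (1 | school), data = df, family = binomial)
    summary(fit)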

Prerequisites: One or more courses in statistical methods that include regression analysis and/or structural equation models. Familiarity with STATA or R would be helpful, but is not essential.

2018 Syllabus (PDF)

 

Introduction to Data Collection Methods

SurvMeth 988.225 (1 credit hour)

Instructor: Florian Keusch, University of Mannheim

This 2-day workshop will introduce students to different methods of collecting data in the social sciences. Surveys are the most common form of collecting primary data in many disciplines, and this course will provide students with an overview of interviewer-administered (face-to-face and telephone) and self-administered (mail, web, mobile web, and SMS) survey data collection as well as the combination of multiple modes (mixed-mode surveys). In particular, the course will discuss the implications of survey design decisions for data quality. In addition, students will receive an overview of alternative data sources (e.g., passive measurement, social media, administrative data) and how they can be used in combination with traditional survey data.

2018 Syllabus (PDF)

 

Introduction to the Health and Retirement Study (HRS) Workshop

Not for credit

Instructor: Amanda Sonnega, University of Michigan

The Health and Retirement Study (hrsonline.isr.umich.edu) Summer Workshop is intended to give participants an introduction to the study that will enable them to use the data for research. HRS is a large-scale longitudinal study with more than 20 years of data on the labor force participation and health transitions that individuals undergo toward the end of their work lives and in the years that follow. The HRS Summer Workshop features morning lectures on basic survey content, sample design, weighting, and restricted data files. Hands-on data workshops are held every afternoon, in which participants learn to work with the data (including the user-friendly RAND version of the HRS data) under the guidance of HRS staff. Staff of the Gateway to Global Aging project (G2Aging.org), which harmonizes data across HRS international sister studies, conduct an afternoon training session. At the end of the week, students have the opportunity to present their research ideas to the class and HRS research faculty and obtain feedback. Topics include (but are not limited to) in-depth information on HRS data about health insurance and medical care; biomarkers, physical measures, and genetic data; cognition; health and physical functioning; linkage to Medicare; employment, retirement, and pensions and linkage to Social Security records; psychosocial and well-being measures; family data; and international comparison data. The data training portion assumes some familiarity with SAS or STATA.

2018 Syllabus (PDF)

 

Introduction to Focus Group Interviewing Research Methods

SurvMeth 652 (1 credit hour)

Instructors: Richard Krueger and Mary Anne Casey, University of Minnesota

Introduction:

This course introduces the skills needed to conduct focus group interviews. Students will learn about the critical components of successful focus group research. They will develop a plan for a focus group study and then practice key skills. Attention will be placed on moderating, recruiting, developing questions, and analysis of focus groups. This course will be particularly applicable for those conducting focus group research in academic, non-profit, and government settings.

Course Topic

The course will cover these skills:

Planning—When to use focus groups and designing a study

Recruiting—Identifying information-rich participants and getting them to show up

Hosting—Creating a permissive and nonthreatening environment

Moderating—The crucial first few minutes and moderating techniques

Developing questions—Characteristics of good focus group questions

Analyzing—Options for analysis

Reporting—Options for sharing what was learned

Course Format

The course format includes daily lectures along with opportunities to practice critical skills in small groups. 

Why Take This Course?

Focus groups are used to understand issues, pilot test ideas, and evaluate programs. They also provide great insight when used in combination with surveys. Focus groups have been used to help design surveys, to pilot test surveys, and to understand survey findings. Take this course if you want to learn more about how focus groups might add to your research toolbox.

Prerequisite: An introductory course in research methods or equivalent experience.

2018 Syllabus (PDF)

 

Introduction to Applied Questionnaire Design

SurvMeth 988.206 (1 credit hour)

Instructor: Nora Cate Schaeffer, University of Wisconsin-Madison

This course provides students with practice applying principles of question design. Students leave the course with tools to use in diagnosing problems in survey questions and writing their own survey questions. The lectures provide guidelines for writing and revising survey questions, using problematic questions from actual surveys as examples for revision. Each day's session combines lecture with group discussion and analysis. For some class activities, students work in small groups to apply the lecture material, identify problems in survey questions, and propose solutions. Assignments require that students write new questions or revise problematic questions and administer them to fellow students. Sessions consider both questions about events and behaviors and questions about subjective phenomena (such as attitudes, evaluations, and internal states).

20 Ways to Test Your Survey Questions is a course that complements this class well.

2018 Syllabus (PDF)

 

Introduction to Survey Methodology

SurvMeth 988.208 (1 credit hour)

Instructor: Emilia Peytcheva, RTI International

This 2-day course will introduce participants to the basic principles of survey design, presented within the Total Survey Error framework. The course provides an introduction to the skills and resources needed to design and conduct a survey, covering topics such as sampling frames and designs, modes of data collection and their impact on survey estimates, cognitive processes involved in answering survey questions, best practices in questionnaire design, and pretesting methods.

2018 Syllabus (PDF)

 

Introduction to Survey Research Techniques

SurvMeth 988.229 (2 credit hours)

Instructor: Tuba Suzer-Gurtekin, University of Michigan

Surveys continue to play an important role in addressing many kinds of problems about many kinds of populations, whether standing alone or as part of an integrated information system. Application of the scientific principles underlying surveys depends on a good understanding of theories and empirical research from disciplines such as psychology, sociology, statistics, and computer science. A set of principles and empirical research will be introduced through the Total Survey Error (TSE) framework. The principles include problem and hypothesis formulation, study design, sampling, questionnaire design, interviewing techniques, pretesting, modes of data collection, and data cleaning, management, and analysis. Students will be trained to determine the major steps in data collection design and implementation and to refer to the literature to justify those steps. The course will also discuss team and project management in the context of survey research, identifying the skill sets and technical language required, and will provide training in an important subset of the skills needed to conduct a survey from beginning to end.

2018 Syllabus (PDF)

 

Introduction to Survey Sampling

SurvMeth 988.219 (1 credit hour)

Instructor: Jim Lepkowski, University of Michigan

This is a foundation course in sample survey methods and principles. The instructor will present, in a non-technical manner, basic sampling techniques such as simple random sampling, systematic sampling, stratification, and cluster sampling, and will provide opportunities to implement these techniques in a series of exercises that accompany each topic.

Participants should not expect to obtain sufficient background in this course to master survey sampling.  They can expect to become familiar with basic techniques well enough to converse with sampling statisticians more easily about sample design.

 

 

Introduction to Item Response Theory (IRT)  Cancelled for 2018

SurvMeth 988.203 (2 credit hours)

Instructor: Stephen Schilling, University of Michigan

Over the past half century Item Response Theory (IRT) has revolutionized test analysis and scoring in education, psychology, and medicine. IRT modeling is now the standard for almost all educational assessments, college readiness exams, and patient reported outcomes measures.

IRT involves modeling subjects’ responses to individual items, in contrast to Classical Test Theory, which models test scores on complete test forms. IRT offers substantial advantages for many technical problems that arise in creating and using tests, including test design, test equating, assessment of item and test bias, and test scoring.  IRT models have the advantage of invariance of person estimates to the collection or sample of items in a particular test, and the invariance of item parameter estimates to the sample of subjects used in test calibration.

This course will begin by comparing Item Response Theory to Classical Test Theory, with attention to the assessment of measurement error. In doing so, we will focus on the key components of IRT: the item characteristic curve (ICC), the item information function, and the test information function.

We will then move to a survey of models for unidimensional sets of dichotomously scored items, including the 1-parameter (Rasch) model and the 2- and 3-parameter IRT models, before looking at extensions of IRT to ordinal and nominal data, including the partial credit model, the generalized partial credit model, the graded response model, and the nominal response model.
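
For reference, the item characteristic curve at the heart of these models has a simple closed form. In the two-parameter logistic (2PL) model, the probability that a person with latent trait \theta answers item j (with discrimination a_j and difficulty b_j) correctly is

    P(X_j = 1 \mid \theta) = \frac{1}{1 + \exp\{-a_j(\theta - b_j)\}}

The Rasch (1-parameter) model constrains the a_j to be equal across items, and the 3-parameter model adds a lower asymptote c_j to allow for guessing.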

Finally, we will examine specific applications of IRT, including test design and equating, assessment of test and item bias (differential item functioning), and test scoring, including computerized adaptive testing. Here we will work through real-world problems and applications of IRT in educational assessment and the assessment of patient-reported outcomes. Students will be provided with the knowledge and skills to perform IRT analyses using freely available software in the R statistical environment. Course work will include three assignments and a final project that requires students to run IRT analyses on their own data. Class will consist of a morning lecture and an afternoon computing lab.

Prerequisites: One or more courses in statistical methods that covered regression analysis, notions of statistical inference, and probability, as well as some familiarity with statistical software such as SPSS and SAS.

2018 Syllabus (PDF)

 

Mixed Method Research Design and Data Collection 

SurvMeth 653 (1.5 credit hours)

Instructors:  William Axinn, Dirgha Ghimire and Emily Treleaven, University of Michigan

This course reviews multiple methods of data collection and presents study designs for combining multiple methods within a single research project. The course focuses on the integration of survey methods with multiple alternative methods to achieve a single data collection approach using the strengths of some methods to compensate for weaknesses in other methods. The methods examined include unstructured or in-depth interviews, semi-structured interviews, focus groups, survey interviews, observation, geographic information systems, archival research, social media analysis and hybrid methods. Emphasis will be placed on the specific contribution of each method, as well as the use of combined methods to advance specific research questions. This course is designed for those with a specific research question in mind. Throughout the course, participants will be asked to design multi-method approaches to a research question of their choice. By the end of this course, participants will have an overview of multi-method research that will enable them to design, understand, and evaluate multi-method approaches within a single project.

Prerequisite: An introductory course in survey research methods or equivalent experience.

 

 

Multi-Item Scale Development and Testing  Cancelled for 2018

SurvMeth 988.220 (1 credit hour)

Instructor: Pamela Campanelli, UK Survey Methods Consultant

Standardized multi-item scales are more common in some disciplines than others. This 2-day course is designed to show participants from all disciplines that it is possible to develop their own high-quality multi-item scales (or correctly adapt existing multi-item scales) and offers an introduction to how to do this. It covers the psychometric principles of question development while adding in principles of general questionnaire design. Focusing first on Classical Measurement Theory, participants design their own multi-item scales. This is followed by a group discussion of existing multi-item scales. The course then introduces some basic statistical tools for assessing the reliability and dimensionality of multi-item scales, and participants practice evaluating existing scales in a computer lab session. The course finishes with an introduction to Item Response Theory.

Prerequisite: There is no prerequisite, but a little knowledge about questionnaire design, multi-item scales and SPSS would be of value.

2018 Syllabus (PDF)

 

Multi-Level Analysis of Survey Data  Cancelled for 2018

SurvMeth 988.202 (2 credit hours)

Instructor: Robert Henson, University of North Carolina 

Although many surveys gather data on multiple units of analysis, most statistical procedures cannot make full use of these data and their nested structures: for example, individuals nested within groups, measures nested within individuals, and other nesting levels that may be of analytic interest. In this course, students are introduced to an increasingly common statistical technique used to address both the methodological and conceptual challenges posed by nested data structures: hierarchical linear modeling (HLM). The course demonstrates multiple uses of HLM software (such models are also known as mixed models or random-effects models), including growth-curve modeling, but the major focus is on the basic logic of multi-level models and the investigation of organizational effects on individual-level outcomes. The multi-level analysis skills taught in this course are equally applicable in many social science fields: sociology, public health, psychology, demography, political science, and the general field of organizational theory; typically, the course enrolls students from all of these fields. Students will learn to conceptualize, conduct, interpret, and write up their own multi-level analyses, as well as to understand relevant statistical and practical issues. The course will be taught over one week and will include both classes that present the basic concepts of these models and labs where participants gain hands-on experience determining the appropriate model, running the analysis, evaluating the reasonableness of the model, and interpreting the results.
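
For orientation, the simplest model of this kind, a two-level random-intercept model for individual i nested in group j, can be written as

    y_{ij} = \beta_0 + \beta_1 x_{ij} + u_j + e_{ij}, \qquad u_j \sim N(0, \tau^2), \quad e_{ij} \sim N(0, \sigma^2)

so that the group-level variance \tau^2 captures organizational effects and induces an intra-class correlation of \tau^2 / (\tau^2 + \sigma^2) among individuals in the same group.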

Prerequisites: At least one graduate-level course in statistics or quantitative methods, and experience with multivariate regression models, including both analysis of data and interpretation of results. School of Education students must have successfully completed ED-795 or equivalent. If you cannot meet this criterion, you must speak directly to the instructor prior to being given permission to enroll.

2018 Syllabus (PDF)

 

Probability and Non-Probability Sampling Methods

SurvMeth 988.224 (2 credit hours)

*Remote participation option: It is not necessary to be physically in Ann Arbor to participate in this course. Students who cannot be in Ann Arbor can enroll and join sessions via BlueJeans (https://www.bluejeans.com/). Once enrollment is confirmed via email, indicate whether course attendance will be in person in Ann Arbor or via BlueJeans.

Instructors:  Jim Lepkowski and Sunghee Lee, University of Michigan

Probability and Non-probability Sampling Methods is a sampling course that differs from traditional sampling classes in two ways. First, it gives equal attention to probability and non-probability sampling methods, since non-probability sampling cannot be discussed meaningfully without an understanding of probability sampling, and the two approaches offer distinct advantages and disadvantages. Second, it combines the theoretical and conceptual side of sampling, taught through lectures, with practical application of the different approaches in lab sessions.

The course will start by examining probability sampling techniques and their properties, including simple random selection, systematic selection, cluster sampling, stratified sampling, and probability proportionate to size selection. Issues of weighting to compensate for unequal chances of selection and variance estimation for calculating confidence intervals are also examined. Then a wide variety of non-probability sampling methods is examined, from panel-based convenience samples to river samples, quota samples, respondent-driven samples, and other techniques. The properties of these samples are discussed, and the assumptions needed to obtain estimates from them are examined. We will also compare the two approaches from a total survey error perspective.

The lab sessions, held after each class, will combine R programming with group discussion of the issues that need to be considered when implementing various sampling approaches. Hands-on examples of frame preparation, sample selection, post-survey adjustments, and design-specific analysis will be provided and discussed.
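
As a minimal sketch of the kind of hands-on exercise involved (the frame, stratum names, and sample size below are hypothetical), a stratified simple random sample with design weights can be drawn in base R as follows:

    # Hypothetical sampling frame with two strata
    set.seed(42)
    frame <- data.frame(
      id      = 1:10000,
      stratum = sample(c("urban", "rural"), 10000, replace = TRUE, prob = c(0.7, 0.3))
    )

    # Allocate n = 500 proportionally to stratum size
    n_total <- 500
    alloc   <- round(n_total * table(frame$stratum) / nrow(frame))

    # Select a simple random sample without replacement within each stratum
    samp <- do.call(rbind, lapply(names(alloc), function(h) {
      rows <- frame[frame$stratum == h, ]
      sel  <- rows[sample(nrow(rows), alloc[h]), ]
      sel$weight <- nrow(rows) / alloc[h]   # design weight = N_h / n_h
      sel
    }))

    head(samp)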

The course is not designed to provide mastery of survey sampling. Rather, it provides material that helps participants become familiar with the advantages, disadvantages, and implementation of the two approaches, allowing them to make informed design decisions.

2018 Syllabus (PDF)

 

Qualitative Methods: Overview and Semi-Structured Interviewing  Cancelled for 2018

 SurvMeth 651 (1.5 credit hours)

Instructor: Nancy Riley, Bowdoin College

This course will focus on semi-structured, or in-depth, interviewing. This methodology is often most helpful in understanding complex social processes. The course will examine the goals, assumptions, process, and uses of interviewing and compare these methods to other related qualitative and quantitative methods in order to develop research designs appropriate to research goals. The course will cover all aspects of interviewing, including how to decide who to interview, how to ask good interview questions, and how to conduct successful interviews. Students will conduct interviews, and discuss the process and outcome of those interviews. We will examine the strengths and weaknesses of this methodology, particularly through discussion of some of the critiques of these methods.

Prerequisite: An introductory course in survey research methods or equivalent experience.

2018 Syllabus (PDF)

 

Questionnaire Design  Cancelled for 2018

SurvMeth 988.230 (2 credit hours)

Instructor:  Jessica Broome, Jessica Broome Research

This course introduces students to the art and the science of questionnaire design and focuses on "real world" applications of questionnaires. Topics will include basic principles of questionnaire design; factual and non-factual questions; techniques for asking about sensitive topics; designing scales and response options; survey mode considerations; and an introduction to pre-testing surveys. 
Students will be expected to develop a questionnaire on a topic of their choosing during this course. Homework assignments will build up to the construction of a usable questionnaire for each student.
A time commitment of approximately 30 hours over 4 weeks (June 18-July 13) is required for this course, including 3-5 hours of viewing online video lectures. Mandatory online class discussions will be held weekly on Thursdays from 2-3 PM EST.

Prerequisite: An introductory course in survey research methods or equivalent experience.

2018 Syllabus (PDF)

 

Questionnaire Design-short course

SurvMeth 988.223 (1 credit hour)

Instructor:  Jessica Broome, Jessica Broome Research

This course provides an overview of the art and science of questionnaire design. Topics will include basic principles of questionnaire design; factual and non-factual questions; techniques for asking about sensitive topics; designing scales and response options; survey mode considerations; and an introduction to pre-testing surveys. The course will consist of both lectures and hands-on activities.

2018 Syllabus (PDF)

 

Responsive Survey Design:  A Research Education Program  

For more information on this program, please visit the RSD Program web site: https://rsdprogram.si.isr.umich.edu/

Workshop, not for academic credit (*Remote participation option available)

RSD has financial support available to those who qualify.

Responsive survey design (RSD) refers to a method for designing surveys that has been demonstrated to increase the quality and efficiency of survey data collection. RSD uses evidence from early phases of data collection to make design decisions for later phases. Beginning with the 2018 Summer Institute, we will offer a series of eleven short courses (most of them one day long) in RSD techniques.

*Remote participation option: It is not necessary to be physically in Ann Arbor to participate in these workshops. Students who cannot be in Ann Arbor can enroll and join sessions via BlueJeans (https://www.bluejeans.com/). Once enrollment is confirmed via email, indicate whether course attendance will be in person in Ann Arbor or via BlueJeans. Survey Methodology for Randomized Controlled Trials does not have the remote participation option.

These courses will include:

1.  Survey Methodology for Randomized Controlled Trials (half-day workshop)

Instructor: Mick Couper

Topics covered: Randomized Controlled Trials (RCTs) are an important tool for tests of internal validity of causal claims in both the health and social sciences. In practice, however, inattention to crucial details of data collection methodology can compromise the internal validity test. One crucial example is recruitment and retention of participants: though randomized to treatment, unequal reluctance to participate or unequal attrition from the RCT jeopardizes the internal validity of comparisons within the RCT design. Another crucial example is the interaction of treatment and measurement: if the measures themselves change in response to the RCT treatment, then observed treatment and control differences may reflect these measurement differences rather than treatment differences. In both cases, specific tools from survey methodology can be used to maximize the internal validity test in the RCT design. This course will focus on the survey methodology topics most important for maintaining the internal validity of RCT studies and will feature specific examples of applications to RCTs. One set of tools will focus on maximizing participation and minimizing attrition of participants. Core survey methodology tools for encouraging participation in both pre-treatment measurement and the treatment itself, as well as tools for minimizing the loss of participants to follow-up measures, will be featured. These tools include incentives, tailoring refusal conversion, switching modes, and tracking strategies. Links to RSD will also be made. A second set of tools will focus on measurement construction to reduce the chances of interaction with treatment. These tools include mode options, questionnaire design issues, and special instruments (such as life history calendars) to minimize reporting error. Each portion of the course will feature examples applying each specific tool to RCT studies.

 

2.  Basic Concepts and Theoretical Background (one-day workshop)

Instructors: James Wagner, Brady West, and Andy Peytchev

This course will provide participants with an overview of the primary concepts underlying RSD. This will include discussion of the uncertainty in survey design, the role of paradata, or data describing the data collection process, in informing decisions, and potential RSD interventions. These interventions include timing and sequence of modes, techniques for efficiently deploying incentives, and combining two-phase sampling with other design changes. Interventions appropriate for face-to-face, telephone, web, mail and mixed-mode surveys will be discussed. Using the Total Survey Error (TSE) framework, the main concepts behind these designs will be explained with a focus on how these principles are designed to simultaneously control survey errors and survey costs. Examples of RSD in both large and small studies will be provided as motivation.  Small group exercises will help participants to think through some of the common questions that need to be answered when employing RSD.

 

3.  Case Studies in Responsive Design Research (one-day workshop)

Instructors: Brady West, William Axinn, and Barry Schouten

This course will explore several well-developed examples of RSD. Dr. West will serve as a moderator of the course, and also introduce a case study from the National Survey of Family Growth (NSFG). The instructors will then provide independent examples of the implementation of RSD in different international surveys. All case studies will be supplemented with discussions of issues regarding the development and implementation of RSD. Case studies will include the NSFG, the Relationship Dynamics and Social Life (RDSL) survey, the University of Michigan Campus Climate (UMCC) Survey, and the Netherlands Survey of Consumer Satisfaction, among others. This variety of case studies will reflect a diversity of survey conditions. The NSFG (West) is a cross-sectional survey that is run on a continuous basis with in-person interviewing. The RDSL (Axinn) is a panel survey that employed a mixed-mode approach to collecting weekly journal data from a panel of young women. The UMCC survey is a web survey of students at UM that employed multiple modes of contact across the phases of the design. The Netherlands Survey of Consumer Satisfaction (Schouten) is a mixed-mode survey combining web and mail survey data collection with telephone interviewing. The focus of the course will be on practical tools for implementing RSD in a variety of conditions, including small-scale surveys.

 

4.  Responsive Survey Design for Web Surveys (one-day workshop)

Instructors: William Axinn and Stephanie Coffey

Topics covered: Web surveys can be an inexpensive method for collecting data. This is especially true for designs that repeat measurement over several time periods. However, these relatively low-cost data collections may result in reduced data quality if the problem of nonresponse is ignored. This course will examine methods for using RSD to effectively deploy scarce resources in order to minimize the risk of nonresponse bias. Recent experience with the University of Michigan Campus Climate Survey and the National Survey of College Graduates is used to illustrate this point. These surveys used phased designs and multiple modes of contact. This approach produced relatively high response rates and used alternative contact methods in later phases to recruit sample members from subgroups that were less likely to respond in earlier phases. In the case of the UMCC Survey, all of this was accomplished on a very small budget and with a small management team. Lessons from these experiences can be directly applied in many similar settings.

 

5.  Developing RSD Dashboards for Active Monitoring  (one-day workshop)

Instructor: Brad Edwards

Topics covered: This course will cover basic concepts for the design and use of “dashboards” for monitoring survey data collection. We will begin with a detailed discussion of how to design dashboards from an RSD perspective, including concrete discussion of how relevant data may be collected and summarized across a variety of production environments. We will also discuss how these dashboards can be used to implement RSD interventions on an ongoing basis, demonstrating these points with examples from actual dashboards. We will briefly explore methods for modeling incoming paradata in order to detect outliers. We will then consider practical issues associated with the development of dashboards, including software alternatives. Finally, we will demonstrate how to update dashboards using data reflecting the results of ongoing fieldwork. Students will be provided with template spreadsheet dashboards of the kind discussed in the course.

 

6.  Alternative Indicators Designed to Maximize Data Quality (one-day workshop)

Instructors: Barry Schouten and Natalie Shlomo

Topics covered: The response rate has been shown to be a poor indicator of data quality with respect to nonresponse bias. Several alternatives have been proposed: the fraction of missing information (FMI), R-indicators, subgroup response rates, etc. This course will explore the use of these indicators as guides for data collection when working within an RSD framework. We will also explore optimization techniques that may be useful when designing a survey to maximize these alternative indicators, consider the consequences of optimizing a survey to other indicators, and discuss how the response rate fits into this approach. We will end with a brief discussion of methods for post-data-collection evaluation of data quality.
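
For reference, the R-indicator mentioned here is commonly defined from estimated response propensities \hat{\rho}_i as

    R(\hat{\rho}) = 1 - 2\, S(\hat{\rho})

where S(\hat{\rho}) is the standard deviation of the propensities over the sample; R equals 1 when response is perfectly balanced (all propensities equal) and decreases as response becomes more variable across subgroups.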

 

7.  A Management Model for Responsive Survey Design (one-day workshop)

Instructors: Heidi Guyer, Joe Murphy, and Shonda Kruger-Ndiaye

This course will cover issues associated with implementation of RSD to manage field work. Instructors will provide concrete instruction on active monitoring of key indicators across a variety of environments – small-scale surveys, large-scale surveys, and web, telephone, face-to-face and mixed-mode surveys. Methods for implementing RSD interventions in a diversity of production environments will be discussed. RSD will be presented within the framework of the principles of project management, with a particular focus on risk management. A checklist of steps for implementing RSD will be discussed in detail. This course will draw upon a semester-long graduate course in survey management, which includes sections on RSD.

 

8.  Getting SMART about the Collection of Data for Informing the Construction of Optimal Adaptive Interventions (one-day workshop)

Instructor: Daniel Almirall

Topics covered: The effective treatment and management of a wide variety of health disorders often requires individualized, sequential decision making, whereby treatment is adapted over time based on the changing disease state or specific circumstances of the patient. Adaptive interventions (also known as dynamic treatment regimens) operationalize this type of individualized treatment decision making using a sequence of decision rules that specify whether, how, for whom, or when to alter the intensity, type, or delivery of pharmacological, behavioral, and/or psychosocial treatments. There has been a huge surge of scientific interest in constructing adaptive interventions via the sequential multiple assignment randomized trial (SMART) design. SMART is a type of multi-stage randomized trial design developed specifically for the purpose of collecting high-quality data for building optimal adaptive interventions. SMARTs are still new to the great majority of behavioral and social science investigators. In this course, we will introduce adaptive interventions and SMARTs (including basic design principles and cutting-edge analytic methods for SMART data, such as Q-learning), and discuss how these ideas can guide responsive and adaptive survey designs.

 

9.  Two-phase Sample Designs in a Responsive Survey Design Framework (one-day workshop)

Instructor: James Wagner

Topics covered: Two-phase sampling is an important tool for RSD. In this course, we will review the theoretical underpinnings of the method, and elaborate on the use of this method for controlling costs and errors in the context of RSD by combining two-phase sampling with other (often more expensive) design changes. We will also discuss implementation issues, such as timing of the sample across various modes and designs and the development and use of appropriate sample weights. Examples from several studies will be included.

 

10.  Implementing, Managing, and Analyzing Interventions in a Responsive Survey Design Framework (one-day workshop)

Instructor: Brady West

Topics covered: This course will discuss a variety of potential RSD interventions. Many of these have been implemented experimentally, and the course will include evaluations of those experiments, along with discussion of why experimental evaluations matter in the early phases of RSD. Methods for implementing interventions will also be covered, including experiments aimed at evaluating new interventions, with strategies for both interviewer-mediated and self-administered (e.g., web and mail) surveys. Finally, methods for evaluating the results of interventions (experimental and otherwise) will be considered, with evaluations covering both costs and errors.

 

11.  Implementation of Responsive Survey Design at the U.S. Census Bureau (one-day workshop)

Instructors: Peter Miller, Ben Reist, and Stephanie Coffey

Topics covered: This course will provide an overview of challenges and successes experienced in the development of adaptive survey design at the U.S. Census Bureau, including illustrations from the National Survey of College Graduates, the National Health Interview Survey, and the Survey of Income and Program Participation. The presentation includes a brief history of the evolution of adaptive design capabilities at the Bureau. We also discuss the development of a protocol for adaptive survey design that guides implementation and transparent documentation. The three case studies show applications of adaptive design in surveys with different designs (cross-sectional vs. longitudinal, single- vs. multi-mode) and different cost and quality objectives. We discuss successes and failures in these applications and the factors that will shape future uses of adaptive design.

 

20 Ways to Test Your Survey Questions   

SurvMeth 988.221 (1 credit hour) 

Instructor: Pamela Campanelli, UK Survey Methods Consultant

Testing your survey questionnaire is essential for ensuring a high-quality survey, and there has been a large proliferation of question testing methods (both new methods and variations of existing methods). This course covers 20 different question testing methods. It is a very practical course, looking at what is known in the research literature about these methods but also focusing on "hands-on" practice of implementing the main ones. The course will cover the standard pilot test; review of item nonresponse and response distributions; interviewer rating forms and variations; behaviour coding (classical, sequence-based, and shortened); expert review; systematic forms appraisal (including two online programmes); respondent debriefing (including vignettes and web probing); cognitive interviewing (a very short introduction, given the Summer Institute course on cognitive interviewing, but with discussion of variations); focus groups for testing survey questions; split ballot tests; usability testing; analysis methods such as latent class and multi-trait-multi-method (with an appendix on item response theory); computational linguistics; crowdsourcing; eye-tracking; measuring reliability and validity; and using record check studies and paradata. Discussion will also focus on the strengths and weaknesses of each method as well as proposals for multi-method question evaluation strategies.

Introduction to Applied Questionnaire Design is a course that complements this class well.

Prerequisite: There is no prerequisite, but some knowledge of questionnaire design is of value.

2018 Syllabus (PDF)

 

Workshop in Survey Sampling Techniques

SurvMeth 616 (6 credit hours)

Instructors: Steve Heeringa, Jim Lepkowski and Raphael Nishimura, University of Michigan

The Workshop in Sampling Techniques is a component of the Sampling Program for Survey Statisticians. The workshop can only be taken in conjunction with the sampling methods courses, Methods of Survey Sampling and Analysis of Complex Sample Survey Data. The workshop gives students the opportunity to implement methods studied in the companion methods courses, such as segmenting and listing in area sampling; selection of a national sample of the U.S.; stratification; controlled selection; telephone sampling; national samples for developing countries; and sampling with microcomputers.

The workshop is a required class for the Sampling Program for Survey Statisticians (SPSS). The SPSS is an eight-week program. It consists of three courses: a methods course (SurvMeth 612), a course on the analysis of complex sample survey data (SurvMeth 614), and a hands-on daily workshop (SurvMeth 616). Students enrolled in these three courses are considered Fellows in the Program. The methods and the analysis courses may be taken without being a Fellow. However, the workshop cannot be taken alone. Fellows receive a certificate upon successful completion of the program.

2018 Syllabus (PDF)