Testing

User Testing

CSR utilizes Web-based and other user testing methods such as A/B testing, usability testing, and user observation. A/B testing is crucial for measuring the relative effectiveness of two Web designs. Usability tests are generally designed around a series of tasks or scenarios that represent the end goals of our users. User observation, often casual or semi-structured, allows our staff to get a better idea of how our potential users interact with our designs and provides vital baseline information to guide how we adjust our Web products to better suit our users.
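
As a rough illustration of how A/B test results might be summarized, the sketch below compares task completion rates between two hypothetical Web designs using a two-proportion z-test; the function name, counts, and design labels are assumptions for illustration, not CSR data or tooling.

# Hypothetical sketch: comparing task completion rates from an A/B test
# of two Web survey designs. The counts are made-up illustration data.
from math import sqrt, erfc

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in completion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal distribution
    return p_a, p_b, z, p_value

p_a, p_b, z, p = two_proportion_z_test(success_a=412, n_a=500, success_b=376, n_b=500)
print(f"Design A: {p_a:.1%}, Design B: {p_b:.1%}, z = {z:.2f}, p = {p:.4f}")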

Whether it is an online survey design element, a client interface, or a Web portal, we are always looking for new ways to reduce our users’ burdens and barriers.

Field Pretests and Pilots

CSR employs a number of industry-standard field pretesting techniques. In a field pretest, the survey is sent to a sample of potential respondents according to the same protocol as planned for the main study. Returned responses from self-administered surveys are analyzed for ambiguous responses or high rates of missing data that may indicate respondent difficulties. Respondents can also be asked debriefing questions at the end of the questionnaire or in a follow-up telephone call to gather their feedback on the survey instrument and process. In telephone surveys, interviewers may be asked to code problems observed during the interview, such as “question had to be repeated” or “respondent requested clarification”; this technique, known as behavior coding, helps identify potential measurement problems with survey items. We work with each collaborator to develop a field pretesting plan, conduct pretest data collection, and analyze findings, including proposed recommendations. Other field pretesting methods are also employed at CSR as suits the mode, target population, topic, and depth of testing required for the research project.
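
As a minimal sketch of one pretest check described above, the example below computes item-level missing-data rates from returned self-administered questionnaires and flags items that exceed a threshold; the records, item names, and 25% cutoff are illustrative assumptions rather than CSR's actual data or standards.

# Hypothetical sketch: flagging pretest items with high rates of missing data,
# which may signal respondent difficulty. None marks a missing answer.
pretest_responses = [
    {"q1": 4, "q2": None, "q3": 2},
    {"q1": 5, "q2": None, "q3": 3},
    {"q1": None, "q2": 1, "q3": 4},
    {"q1": 3, "q2": None, "q3": 5},
]

MISSING_THRESHOLD = 0.25  # flag items missing in more than 25% of returns

for item in pretest_responses[0].keys():
    missing = sum(1 for record in pretest_responses if record[item] is None)
    rate = missing / len(pretest_responses)
    flag = "  <-- review item wording" if rate > MISSING_THRESHOLD else ""
    print(f"{item}: {rate:.0%} missing{flag}")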

Survey Methodological Experiments

As part of its mission to advance survey methods innovation, CSR conducts research on a variety of topics including questionnaire design, email and Web design, social desirability bias, interviewer effects, mode effects, and recruitment and nonresponse reduction strategies.

Given the empirical nature of survey and sample-based research, “best practices” are sometimes best evaluated in context. In a methodological experiment, for example, one randomized half of the sample is assigned a treatment that adjusts the sequence of contact modes while the other half serves as the control (or receives a different treatment). Such experiments are efficient tools not only for determining the efficacy of the treatment itself with the target population but also for forecasting costs and response and conducting sensitivity analyses for the project. We work closely with collaborators to determine whether and what kinds of methodological experiments may be appropriate.
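
As a minimal sketch of the random assignment behind such an experiment, the example below splits a sample frame into a treatment half and a control half with a reproducible seed; the case IDs, arm labels, and seed value are assumptions for illustration only.

# Hypothetical sketch: randomly splitting a sample frame in half, assigning one
# half to an alternative sequence of contact modes and the other to the control
# protocol. Outcomes would later be compared by arm from the case-management data.
import random

random.seed(2024)  # fixed seed so the assignment is reproducible and documentable

sample_ids = [f"case_{i:04d}" for i in range(1, 1001)]
shuffled = random.sample(sample_ids, k=len(sample_ids))

half = len(shuffled) // 2
assignments = {case_id: "treatment" for case_id in shuffled[:half]}
assignments.update({case_id: "control" for case_id in shuffled[half:]})

print(sum(1 for arm in assignments.values() if arm == "treatment"), "treatment cases")
print(sum(1 for arm in assignments.values() if arm == "control"), "control cases")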
