If you ask me…

Wright State psychology research on improving the quality of questionnaire data to be published in prestigious international journal

Research on the length of surveys by Tony Gibson, left, a Ph.D. psychology student, and Nathan Bowling, professor of psychology, will be published in the European Journal of Psychological Assessment. (Photo by Erin Pence)

Research at Wright State University on how the length of surveys affects the accuracy of the results has been accepted for publication in a prestigious international journal.

The research by Tony Gibson, a graduate psychology student, and Nathan Bowling, professor of psychology, will be published in the European Journal of Psychological Assessment.

The journal is the official publication of the European Association of Psychological Assessment and is also sponsored by the International Association of Applied Psychology. Its purpose is to present practitioners and academic researchers with important articles that provide seminal information on developments in psychological measurement.

“We got into this topic because we thought this would be a chance to make a fundamental contribution to psychology and the social sciences since so much research in the social sciences uses surveys,” said Bowling. “I don’t think there is any way around it. Surveys provide the most direct and convenient means of studying many topics that are of interest to social scientists.”

Bowling and Gibson examined careless responding to questionnaires, which occurs when people respond without closely reading the content.

“We believe this is a widespread problem that undermines the accuracy of survey research done within companies and within social science research,” said Bowling.

Surveys are widely used in psychology and social sciences to measure attitudes, opinions, political views and other beliefs and behaviors. Personality surveys are extensively used in the corporate world to determine the suitability of job candidates.

Gibson said careless responding can give researchers results that are misleading.

Bowling said that typically 10 to 15 percent of people taking surveys respond carelessly, enough to produce results very different from what would be obtained if every person in the study responded carefully.
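As a rough illustration of that point (not drawn from the study itself), the short simulation below mixes a 15 percent share of random responders into data with a known correlation and shows how the observed correlation shrinks. The sample size, correlation, and careless rate are hypothetical assumptions chosen for the sketch.

```python
# Hypothetical illustration: how a modest share of careless (random) respondents
# can distort a survey correlation. Numbers are assumptions, not study results.
import numpy as np

rng = np.random.default_rng(0)
n, careless_rate, true_r = 1000, 0.15, 0.50

# Careful respondents: two scale scores with a true correlation of ~0.50.
cov = [[1.0, true_r], [true_r, 1.0]]
careful = rng.multivariate_normal([0, 0], cov, size=int(n * (1 - careless_rate)))

# Careless respondents: answers unrelated to their true standing (pure noise).
careless = rng.standard_normal((int(n * careless_rate), 2))

sample = np.vstack([careful, careless])
print("Correlation, careful only:", round(np.corrcoef(careful.T)[0, 1], 2))
print("Correlation, full sample: ", round(np.corrcoef(sample.T)[0, 1], 2))
```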

Companies often use surveys to measure job applicants’ suitability for certain positions.

“If the applicant is not paying close attention to the questions, the employer is not capturing that information to successfully place them,” said Gibson. “As an applicant, you might get the job and find out later down the line that you weren’t a good fit because the company wasn’t able to measure your personality well.”

Some surveys are very long. Many personality tests, for instance, contain hundreds of self-report questions. And even when it doesn't include any single lengthy measure, a survey may still be long because it includes dozens of relatively brief measures.

For years, researchers have suspected that long surveys tend to cause some respondents to lose interest as they respond to questions.

The Wright State research grew out of Gibson’s master’s thesis as a student in the College of Science and Mathematics. Gibson, of Marion, earned his bachelor’s degree in psychology from Wright State in 2011 and is working on his Ph.D.

His research involved surveying 362 undergraduate students in a psychology lab and another 280 online. Gibson and Bowling used several different methods to detect if survey respondents were answering carelessly.

They looked at the length of time it took respondents to complete the questionnaires. They also checked whether respondents gave different answers to what was virtually the same question. And they included occasional items that instructed respondents to select a particular option to show they were paying attention.
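As a rough sketch of how screening on those three indicators might look in practice (the column names, cutoffs, and item pairs below are illustrative assumptions, not the researchers' actual procedure), a few lines of code can flag potentially careless respondents:

```python
# Illustrative careless-responding screen; column names and cutoffs are hypothetical.
import pandas as pd

def flag_careless(df: pd.DataFrame) -> pd.Series:
    """Return True for respondents flagged by any of three common indicators."""
    # 1. Response time: finishing implausibly fast suggests skimming.
    too_fast = df["completion_seconds"] < 180            # assumed cutoff

    # 2. Consistency: near-identical items (e.g., item_5 rephrased as item_22)
    #    should receive similar answers on a 1-to-5 scale.
    inconsistent = (df["item_5"] - df["item_22"]).abs() >= 3

    # 3. Instructed-response item: "Select 'Strongly agree' for this item."
    failed_check = df["attention_check"] != 5

    return too_fast | inconsistent | failed_check

# Example usage:
# df = pd.read_csv("responses.csv")
# clean = df[~flag_careless(df)]
```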

Conducting surveys online, a growing practice, raises the concern that the absence of face-to-face contact with the researcher makes respondents more likely to answer carelessly.

“The takeaway here is that data quality suffered the most when participants completed the survey online,” said Gibson.

The researchers also studied the effects of manipulating the surveys by offering respondents rewards for careful answering and threatening penalties for careless responses.

“We found some evidence that these incentives work,” said Bowling.

One study involved telling the students they would be entered in a drawing for a gift card if they answered carefully. Another involved a potential penalty for answering carelessly.

“Having consequences for carelessness can reduce careless responding up to a point,” said Gibson. “I think that small rewards or incentives like raffles are a really good way to go.”
