6 Tips for Designing and Conducting an Online Survey

May 9, 2016, by Toni Klemm

Online surveys are everywhere these days, and with free tools like SurveyMonkey or Google Forms, anyone can conduct a survey. Preparing and conducting a survey for research, however, is no small endeavor and requires careful planning and consideration. Here are six tips to help you get the most out of your efforts. I recently finished an online survey of agricultural advisors in Texas, Oklahoma, Kansas, and Colorado about seasonal forecasting for winter wheat farmers. That survey took about three months to plan and another three to conduct, and I will use it as an example here.

1. Survey or Not?

Ask yourself: What information am I interested in, and is a survey the best method to get it? Surveys work well for gathering quantitative information that can be put into rankings, lists, like-dislike scales, or counts. They are not ideal for qualitative research, such as descriptions of events, detailed anecdotes, or open-ended conversations, for which in-depth interviews are much better suited. Conducting, recording, transcribing, and analyzing interviews is also much more labor-intensive, which is one reason why interview studies generally have fewer participants than survey-based studies. The pros and cons of qualitative approaches are nicely laid out in Berg (1998), a seminal book on qualitative methods, while Babbie (2014) covers the basics of survey research (chapter 9) and quantitative analysis (chapter 14). Also discuss your approach with faculty, experienced coworkers, and your committee. I decided that a survey followed by a small number of interviews would give me the best of both worlds: a large quantitative dataset to analyze and detailed information to explain some of the most interesting survey results, all while being time-efficient.

2. Survey Methods

You’ve established that a survey is your method of choice. But which survey method should you choose? Many surveys today are conducted online as opposed to via phone or snail mail, and for obvious reasons: They are easy to disseminate via email or social media, and they are cheap or free to produce (try Google Forms or SurveyMonkey). Online surveys also deliver instant results in a digital format, reducing errors from digitizing mailed responses, and they have lower labor costs than phone surveys. They also tend to have higher response rates: about 25% for online surveys versus 8 to 12% for phone surveys, according to FluidSurveys.

However, online surveys, convenient as they are, can create biases. Your target population might not all have internet or social media access, or you might not have a complete email list. These biases can lower the explanatory power and generalizability of the survey results. Biases can’t always be avoided, and avoiding them can increase survey costs, for example by running a phone survey instead of an online survey. In any case, these shortcomings should be mentioned in the publication. My survey, too, faced the problem of internet bias, but instead of changing my method I decided to change my survey population. Instead of surveying farmers, a group that generally has low computer and internet access, I surveyed agricultural advisors, who all have desk jobs, internet access, and publicly available email addresses. They are also in contact with many farmers and thus can, to some degree, speak to farmers’ concerns. I couldn’t ask them quite the same questions I would have asked farmers, but that was a compromise I was willing to make.

3. Survey Design

Ask yourself again: What is it that I’m interested in? This should help you decide which question formats are best: matrices, multiple choice, open-ended text boxes, Likert scales, images and sketches, …? The literature can give you direction, but think critically about what you read in papers. Was that really the best way to get at the research question, or just convenient in that particular case? Could I do it differently and achieve more robust results? Discuss your ideas with your committee or peer researchers. My survey was inspired by focus-group and interview research with corn farmers in the Midwestern U.S., which I adapted to fit my time budget and to answer my research questions.

Surveying also means explaining differences in responses and, often, trying to confirm or reject a hypothesis. Why did some participants answer one way and not another? Was it their income, their education level, their geographic location? Whatever it is, make sure you ask about it in your survey so you can later cross-tabulate answers and test them for significant differences (a short sketch of this step follows below). Also, think about the order of your questions and whether you really need to ask all of them. People might be okay spending 10 or 15 minutes on your survey, but too many questions can frustrate them into quitting. Keep it succinct, but still ask everything you need. Let people know at the beginning how long the survey will take; pretests can help you estimate this.
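If you want to see how that cross-tabulation step might look in practice, libraries such as pandas and SciPy handle it in a few lines. The sketch below is illustrative only: the file name (survey_responses.csv) and the column names (education, uses_forecasts) are hypothetical placeholders, not part of my actual survey.

    # A minimal sketch: cross-tabulate a demographic variable against an
    # answer of interest and test whether the differences are significant.
    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical survey export; replace with your own file and columns.
    responses = pd.read_csv("survey_responses.csv")

    # Counts of each answer, broken down by education level.
    table = pd.crosstab(responses["education"], responses["uses_forecasts"])
    print(table)

    # Chi-square test of independence: a small p-value suggests the two
    # variables are related beyond what chance alone would produce.
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p-value = {p:.4f}")

Keep in mind that the chi-square test assumes reasonably large expected counts in each cell; for small samples, an exact test may be more appropriate.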

4. Question Language

By now you probably see that designing a survey can take some time. After weighing the pros and cons of each question format, phrasing, testing, and refining your questions can take weeks or even months. Which words should you avoid? Farmers in the Southern Great Plains, for example, don’t like terms like “sustainability” (which many associate with government regulations) or “climate change,” for obvious reasons, so I tried to avoid them. Jargon is okay to use, but make sure your survey population understands what you mean. Consult experts to fine-tune the wording. Make sure questions are unambiguous and easy to understand, and check that answer choices cover every possibility. Again, pretesting will reveal most of the wrinkles and issues before you release your survey. The easier you make it for your participants, the more likely they are to finish your survey.

5. IRB Approval

Getting your survey approved by your Institutional Review Board (IRB, also called an Independent Ethics Committee, IEC) is required for all research on human subjects (meaning survey, medical, psychological, and other research on humans) that is intended for publication. In general, the IRB’s job is to make sure you treat your participants fairly, protect their information, and don’t compromise the reputation of your institution. The University of Oklahoma produced a series of short videos to explain the IRB process. Expedited review for low-risk studies, like mine of agricultural advisors, can take as little as one week. But when your target population includes children, prisoners, or pregnant women (so-called “vulnerable populations”), a full panel review is necessary, which can take months, and reviewers might ask you to justify every question in your survey. Some studies need approval by multiple IRBs; studies of Native American tribes, for example, may also need approval by each tribe involved. Last but not least, make sure your survey is final before you submit it for IRB approval. Even small changes, for example in the wording of questions, have to be approved again.

6. Spreading Your Online Survey

Congratulations! Your survey got IRB approval and is ready to go. Now on to getting it out there. Depending on who your target population is, this can be a challenge in several ways. For my survey of agricultural advisors, for example, I couldn’t just spread it via Facebook or Twitter (i.e., the snowball method). For one, I hadn’t listed that method of dissemination in my IRB application; I also wouldn’t have been able to tell who took the survey, and social media was not a good way to reach my target population, so my results would have been meaningless. In my case, direct email to certain mailing lists was the only method that made sense. Timing is also critical. Winter wheat advisors have a lighter workload during the cold months, before temperatures rise in spring and farm work picks up again, leaving them less time for my survey.

There are several ways to increase the number of responses. Connect with your survey population by attending their meetings and introducing yourself. Reach out to trade publications and ask if they would report on your research and the survey you are conducting in their circulation area. When they do, you can link to their coverage in your survey invitation. My research was covered by the Kansas Farm Bureau and the Texas Farm Bureau, and I included that coverage in my survey reminder emails. Especially for out-of-state surveys, this can create trust and familiarity among the people you are surveying.

[Photo: Wheat harvest. Credit: Craig Taylor, Flickr.]

“Local champions,” people well known and respected by your target group, can also help you boost your response rate. They can send out the survey invitation or a reminder on your behalf. Their name in people’s inboxes (as opposed to yours) makes potential respondents more likely to take time out of their busy schedules and do your survey. But local champions are busy people, too. Provide them with email templates, a list of email addresses (semicolon-separated; see the sketch below), and perhaps a PDF with additional information about your research that they can attach. Also, ask to be copied on their emails so you know the invitation was actually sent out. Collaborating with local champions means additional work for you, but your efforts will be greatly rewarded. Without local champions, which for me were regional extension directors and state climatologists, I would not have gotten anywhere near the response rate I did: after several rounds of emails and three months of surveying, it was just over 40%! And as a nice side effect, several people said they would be interested in a presentation of my results.
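As an aside, preparing that semicolon-separated address list by hand is tedious and error-prone. Here is a minimal sketch of how it might be automated, assuming your contacts live in a CSV file with an email column; the file name (advisor_contacts.csv) and the column name are hypothetical placeholders:

    # Build a semicolon-separated recipient list from a contact file.
    import csv

    with open("advisor_contacts.csv", newline="") as f:
        reader = csv.DictReader(f)
        emails = [row["email"].strip() for row in reader if row["email"].strip()]

    # Drop duplicates while preserving order, then join with semicolons
    # so the result can be pasted straight into an email client.
    unique_emails = dict.fromkeys(emails)
    print("; ".join(unique_emails))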

____________________________________________________________________________________________________________

Toni is a Ph.D. candidate in the Department of Geography and Environmental Sustainability at the University of Oklahoma and works for the South-Central CSC.

____________________________________________________________________________________________________________

Babbie, E. R. (2014). The Basics of Social Research (6th ed.).

Berg, B. L. (1998). Qualitative Research Methods for the Social Sciences (3rd ed.).

Check back in a while for tips on analyzing survey results.

You can also follow us on Facebook and Twitter to stay up to date on new blog posts.

