
Is your data trustworthy?


iModerate

Oct 09, 2015


Do you trust your data? No, I’m not talking about the sentient android from Star Trek: The Next Generation, and I’m not talking about a family wireless plan either. I’m talking about the end product you get from a research study – the data you plan your marketing campaign around, or the study that tells you where your product development should go.

How accurate is your data? How can you be sure that respondents are actually engaged in your study, reading the questions carefully, and responding thoughtfully? It’s a perennial concern among researchers, especially for studies that require longer engagements. Here are some tips and tricks for collecting quality, accurate data from your research:

Data Verification

  • Add a clicker trap. Make sure your respondents are engaged and paying attention by adding a simple “clicker” question. In the question text, ask the respondent to select a particular option from the answer choices, e.g., “select the 2nd option below.” If they choose any other response, disqualify the respondent or flag the behavior in the system.
  • Set a timer for the survey. A good practice is to flag or disqualify anyone who completes the survey in less than half its expected duration. You can take this further by flagging anyone who finishes in less than half the mean completion time from the soft launch (a smaller, controlled batch of interviews fielded before you target your complete audience).
  • Check for straight-lining of grid or matrix questions. These question formats can tempt respondents to “straight-line” (e.g., select the second option in every row) just to get through the survey. Implementing code to check for this behavior, and flagging or even terminating those respondents, helps you sort out the ones who are genuinely engaged.
  • Flag or disqualify respondents coming from the same IP address. Unless you’re surveying an audience that’s likely to participate from the same building or network – say, doctors or educators working at the same site – the majority of respondents will come from unique IP addresses. Flagging or disqualifying duplicates prevents the same respondent from taking the study multiple times. A minimal sketch of all four checks in this list follows.
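
To make these checks concrete, here is a short Python sketch that applies all four flags to a table of responses. The column names (trap_answer, duration_sec, ip, grid_q1 through grid_q5) and every threshold are illustrative assumptions rather than the conventions of any particular survey platform, so tune them against your own study and soft-launch data.

```python
import pandas as pd

# Illustrative thresholds and column names -- all assumptions, not standards.
EXPECTED_TRAP_ANSWER = 2        # "select the 2nd option below"
MEAN_SOFT_LAUNCH_SEC = 600      # mean completion time observed in soft launch
GRID_COLS = ["grid_q1", "grid_q2", "grid_q3", "grid_q4", "grid_q5"]

def flag_suspect_respondents(df: pd.DataFrame) -> pd.DataFrame:
    """Add one boolean column per data-verification check."""
    # 1. Clicker trap: anything other than the requested option is a miss.
    df["flag_trap"] = df["trap_answer"] != EXPECTED_TRAP_ANSWER

    # 2. Speeders: finished in under half the soft-launch mean.
    df["flag_speeder"] = df["duration_sec"] < MEAN_SOFT_LAUNCH_SEC / 2

    # 3. Straight-lining: the same answer in every row of the grid.
    df["flag_straightline"] = df[GRID_COLS].nunique(axis=1) == 1

    # 4. Duplicate IPs: the same address appearing on more than one complete.
    df["flag_dup_ip"] = df.duplicated(subset="ip", keep=False)

    flag_cols = ["flag_trap", "flag_speeder", "flag_straightline", "flag_dup_ip"]
    df["flag_any"] = df[flag_cols].any(axis=1)
    return df

# Usage: responses = pd.read_csv("responses.csv")
#        review = flag_suspect_respondents(responses)
```

In practice you would review flagged respondents rather than drop them automatically – as noted above, a shared IP address can be perfectly legitimate for some audiences.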

 

End-of-Interview Analysis

  • Ask for feedback on the experience. Follow the end of the study with a couple of simple questions asking respondents to judge the length of the survey and describe (in an open end) how they felt about it. A person who calls the survey short when you know it runs long (or the opposite) can be flagged, and you can set coding in the survey to flag these individuals automatically; a sketch of this check follows. Finally, asking about the experience of taking the survey can give you insight into what was frustrating, and some respondents may even share what they liked.
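
As one way that automatic flag might work, the sketch below cross-checks a respondent’s stated impression of survey length against their actual completion time. The answer labels and the 20-minute “long survey” cutoff are assumptions for illustration.

```python
def flag_length_mismatch(perceived: str, duration_sec: float,
                         long_cutoff_sec: float = 1200) -> bool:
    """Flag a respondent whose stated impression of the survey's length
    contradicts the clock. `perceived` is the self-reported impression:
    "too short", "about right", or "too long" (assumed labels)."""
    actually_long = duration_sec >= long_cutoff_sec
    # Calling a long survey "too short" (or a short one "too long")
    # suggests the respondent was not really paying attention.
    return (actually_long and perceived == "too short") or \
           (not actually_long and perceived == "too long")

# Example: a 25-minute completion that the respondent calls "too short".
assert flag_length_mismatch("too short", duration_sec=1500)
```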

 

Qualitative Component

  • Require a higher level of engagement. Whether it’s an online interview or focus group, or something more advanced like a video/facial-tracking interface, the ability to probe and communicate instantly removes most doubt that your respondent is just clicking away with no thought to the substance of the study. While this requires extra logistics and technology to be added to the study, it’s well worth it when it can be done.

 

Survey Design

  • Vary the format. Incorporating elements like interactive buttons, card sorts, or drag-and-drop questions generally makes the experience more enjoyable. Respondents often spend more time on them because they are more engaging and require an activity beyond “just clicking.”
  • Leverage engaging content. When appropriate, use media in the survey. Whether you’re simply asking for reactions to a video or conducting a “dial test” (which lets respondents continuously slide a dial along a favorable-to-unfavorable scale as they react to what’s shown over the course of a video), people are more eager to respond to something they just saw. A sketch of how dial-test traces can be aggregated follows this list.
  • Ask them to upload a photo where applicable. This is not only a fun way to interact with respondents and bring them to life, but it also gives you another checkpoint: did the upload match what was requested? While it certainly depends on what’s being asked, you’d be surprised what respondents are willing to share in pictures – their cabinets, cars, office spaces, etc. I have seen consumer behavior studies achieve great success with this level of engagement.
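
As a rough illustration of what dial-test data looks like once collected, this sketch averages hypothetical per-second dial traces into a single favorability curve. The trace format and the 0–100 scale are assumptions, not a description of any specific dial-testing tool.

```python
from statistics import mean

# Hypothetical dial-test traces: one {second: rating} dict per respondent,
# where ratings run from 0 (unfavorable) to 100 (favorable).
traces = [
    {0: 50, 1: 55, 2: 60, 3: 40},
    {0: 50, 1: 45, 2: 70, 3: 35},
]

def favorability_curve(traces: list[dict[int, float]]) -> dict[int, float]:
    """Average the dial position across respondents at each second of video."""
    seconds = sorted({s for trace in traces for s in trace})
    return {s: mean(t[s] for t in traces if s in t) for s in seconds}

print(favorability_curve(traces))  # -> {0: 50, 1: 50, 2: 65, 3: 37.5}
```

Dips in the curve point to the exact moments of a video that turned respondents off, which is far more actionable than a single overall rating.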

 

There is no silver bullet that ensures complete accuracy all of the time. But implementing some or all of these practices will go a long way toward giving you confidence in the accuracy of the data you collect and deliver. And we can all sleep better when data is trustworthy!

