Improving Data Quality in Insight Communities

Written by Philippe Dame
Posted on June 11, 2013

One of the most common concerns we hear from clients who have never run an insight community is: how do we ensure high-quality data?

In any insight community, it’s reasonable to expect some vaguely worded responses or answers that fail to address the research question. While the fear of not collecting enough “good data” usually fades once the community’s output is analyzed, there are concrete steps you can take to minimize poor responses.

First, insight communities are still relatively new. Many participants will have little or no prior experience with them, which makes it very important to set firm requirements and be as clear as possible about your expectations. This sets the stage for the right behaviour and ultimately helps build the “social proof” that reaffirms it. In Recollective, that kind of clear communication happens on the study homepage (summary tab) and when posing the research question (task). One of our research partners, AllPoints Research, has run many Recollective insight communities using this approach:

“In our experience, the single most important thing to do is to set the respondents’ expectations early and often. The summary page is the first place we do this, but at the beginning of each day or session, we start with a prompt that outlines the plan for that session. Then at the end, we go ahead and set them up for what will be expected in the next session. This constant communication is key to our success and to the respondents’ enjoyment of the research experience.”

Erin Sattenfield, Director of Research Services at AllPoints Research, Inc.

Starting with the summary tab: in the studies we observed, some of the most successful went as far as to detail specific expectations, including the frequency or number of anticipated logins, how and to what extent participants are meant to contribute, and any best practices you want them to follow. Some asked participants to read each question a few times before typing a response. Another kept participants focused on their content by de-emphasizing spelling and grammatical errors.

Researchers can make the most notable impact on data quality by setting clear expectations in the research question itself (the task instructions). Recollective lets you set minimums and maximums (word counts, photo attachments, etc.), but these alone don’t guarantee you’re collecting useful information. The key is to describe what a well-balanced, suitable response looks like.

A great example taken from a real Recollective study: “Please go beyond statements that simply say it is your favorite or that it’s the best. We’d like you to take the time to describe what’s unique about it. What separates it from the rest?” It clearly communicates what constitutes a helpful response, leaving little doubt about how the participant should answer. In turn, this puts the participant at ease, which improves their propensity to comment and to add more depth in follow-ups.

Lastly, in social communities where activity responses are shared, many participants will reflect on their responses in light of other people’s, absorbing cues about relevancy, tone, length and depth from the rest of the community. Consequently, participants who respond well because of clearly communicated expectations can influence how future activities are answered.

While it’s impossible to eliminate the risk that participants will upload an amusing picture of cats instead of what’s in their fridge, stressing the requirements, eliminating uncertainty and helping build a strong community norm will reduce low-quality responses and improve your data overall.

Philippe Dame
Co-Founder & CPO