Survey Incentives

Reading: Singer, Eleanor. 2018. “Survey Incentives.” In The Palgrave Handbook of Survey Research, 405–15. Cham: Springer International Publishing. doi:10.1007/978-3-319-54395-6_50.

General:

  • Focused on large-scale, often national, social research surveys sponsored by government agencies or research organizations.
    • Excludes market research, customer satisfaction surveys, or short polls.

History:

  • Response rates have been declining steadily, and incentives are now widely used in major national surveys.

Effects of Incentives:

  • Response Rates
    • General Impact: Incentives increase response rates, primarily by reducing refusals, which are the main cause of nonresponse in the surveys considered here.
      • Prepaid incentives outperform promised incentives or no incentives.
    • Monetary vs. Non-monetary: Monetary incentives are more effective than gifts, and response rates increase with larger amounts, though not always linearly.
    • Interviewer-Mediated Surveys: Incentives have a smaller effect in interviewer-mediated surveys than in mail surveys, since interviewers themselves partially compensate for the absence of incentives.
    • Random Digit Dialing Experiments: A 2008 study by Cantor, O’Hare, and O’Connor reviewed 23 RDD experiments and found that $5 prepaid incentives increased response rates by 2–12 percentage points. Larger incentives yielded higher rates, but with diminishing returns. The effect size remained stable over time, even as baseline response rates declined.
    • Longitudinal Studies: Incentives reduce refusals and sometimes noncontact.
      • Initial payments can sustain motivation in later waves, though respondents may perceive waves as separate surveys.
      • Prepaid incentives are effective for converting prior refusals but not for those who previously cooperated, suggesting a ceiling effect.
  • Response Quality
    • Measurement Metrics: Response quality is typically assessed through item-nonresponse and the length of open-ended responses.
    • Hypotheses: Two competing hypotheses exist:
      • Incentives may lead to minimal effort, as respondents feel obligated to complete the survey but not to provide high-quality answers.
      • Incentives may create a sense of obligation to provide accurate responses.
    • Findings: Most studies find no significant effect on response quality.
    • Key Study: Becca Medway’s JPSM doctoral research found that a $5 incentive increased response rates (22% vs. 11%) and reduced item nonresponse while shortening survey completion time.
      • However, when controlling for cognitive ability and conscientiousness, these effects disappeared, suggesting no overall impact on quality.
  • Sample Composition
    • General Findings: Most studies show incentives have little effect on sample composition, but some exceptions exist:
      • Berlin’s literacy study found incentives attracted more low-socioeconomic status (SES) respondents.
      • Merkle et al. noted more Democrats in samples offered a branded pen.
      • Groves et al. found a higher proportion of respondents with lower civic duty in incentivized samples.
    • Topic Interest: Experiments targeting groups with low topic interest show incentives can increase participation among these groups but do not significantly reduce nonresponse bias or alter sample composition meaningfully.
  • Response Distributions
    • Direct Effects: There is little evidence that incentives directly alter response content. A study of former mental patients found no effect of incentives on their evaluations of treatment facilities.
    • Indirect Effects: Changes in sample composition could indirectly affect response distributions, but research on this is sparse. The lack of direct bias is reassuring, but incentives’ ability to counter nonresponse bias remains unclear.
  • Internet Surveys
    • Similarities: Findings from other modes apply to internet surveys: monetary incentives outperform gifts, and prepaid incentives are more effective than promised ones.
    • Lotteries: Lotteries, commonly used in web surveys, are generally no more effective than no incentives. Anna Göritz’s research suggests lotteries may not be necessary, as other incentives perform better.
    • Outcomes: Incentives in internet surveys do not significantly affect item nonresponse or sample composition.

Differential Incentives

  • Refusal Conversion Payments: These are more cost-effective than universal prepaid incentives and are believed to reduce nonresponse bias by targeting reluctant respondents. However, some argue they are unfair, rewarding non-cooperative respondents while those who participate without incentives receive less.
  • Public Perception: Respondents often view refusal conversion payments as unfair but are still willing to participate in future surveys by the same organization, even if differential incentives were used previously.
  • Singer’s Recommendation: Offer a small upfront incentive to all respondents to acknowledge their effort, then use differential incentives for refusal conversion to address bias.

Incentive Amount and Effectiveness

  • Uncertainty: There is no clear guideline on optimal incentive amounts. Larger incentives are generally better, but differences are sometimes negligible.
  • Ceiling Effect: Incentives have a greater impact on those less inclined to respond, but provide little additional boost for already motivated respondents.

Cost-Effectiveness

  • Incentives can reduce costs by lowering callback efforts. Medway’s study found lower completion costs with incentives. However, few studies systematically evaluate cost-effectiveness, an area needing further research.

Conclusion

  • The article confirms that incentives effectively increase survey response rates, particularly by reducing refusals, with prepaid monetary incentives being the most effective.
  • However, their impact on response quality and sample composition is limited, and their ability to reduce nonresponse bias is uncertain.
  • Strategic use of incentives, such as refusal conversion payments, offers promise for addressing bias, but more research is needed.