The Impact of Varying Financial Incentives on Data Quality in Web Panel Surveys
Reading: Spreen, Thomas Luke, Lisa A House, and Zhifeng Gao. 2020. “The Impact of Varying Financial Incentives on Data Quality in Web Panel Surveys.” Journal of Survey Statistics and Methodology 8(5): 832–50. doi:10.1093/jssam/smz030.
KEYWORDS: Data quality; Financial incentives; Web panel.
General
- This study looks at whether giving people different amounts of financial incentive (ranging from $0.50 to $3.00) to complete an online survey about fresh strawberries changes how they answer or behave.
- Conducted in 2015 with 2,156 US respondents.
- Question: Does more money make people answer more honestly, pay better attention, or finish the survey? Does it reduce cheating or lazy responses?
Intro and Context
- Web Panel Surveys: Web panels, where participants voluntarily join to complete online surveys for rewards, are cost-effective compared to mail, phone, or face-to-face surveys.
- However, concerns exist about non-representative sample frames and data validity due to fraudulent or inattentive respondents.
- Data Quality Issues: Prior research (e.g., Jones et al., 2015; Miller, 2006) indicates that 16-25% of web panel respondents may engage in fraudulent behavior (misreporting qualifications) or satisficing, impacting data quality.
- Incentive Effects: Incentives increase response rates across survey modes, with prepaid monetary incentives being most effective (Singer and Ye, 2013). However, their impact on response quality and problematic behaviors in web panels is understudied. Theoretical perspectives are mixed:
- Rational Choice: Higher incentives attract disinterested respondents who may provide low-quality responses to quickly earn rewards (Barge and Gehlbach, 2012).
- Reciprocity Norm: Incentives may encourage more careful responses as respondents feel obligated to reciprocate (Gouldner, 1960).
- Study Objective: To test whether increasing contingent financial incentives reduces qualification fraud and satisficing, and whether it affects eligibility or break-off rates in web panel surveys.
Experimental Design
- Survey Details: The survey focused on fresh strawberry preferences.
- People were randomly assigned to six incentive levels:
- Amounts ranged from $0.50 to $3.00 (including $0.75 and $1.25) and were awarded upon survey completion (see the assignment sketch below).
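As a rough illustration of the design (not the authors' code), the sketch below randomly assigns respondent IDs to incentive arms. The arm amounts are only the four reported in these notes; the actual study used six levels between $0.50 and $3.00.

```python
# Minimal sketch of random assignment to contingent-incentive arms.
# The amounts listed are a partial set; the study used six levels.
import random

INCENTIVE_ARMS = [0.50, 0.75, 1.25, 3.00]  # four of the six reported amounts

def assign_arms(respondent_ids, seed=2015):
    """Return a dict mapping each respondent ID to a randomly chosen incentive arm."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {rid: rng.choice(INCENTIVE_ARMS) for rid in respondent_ids}

if __name__ == "__main__":
    print(assign_arms(range(10)))
```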
- Checking Behavior: The researchers built several checks into the survey to spot dishonest or careless answers (a flagging sketch follows this list):
- Low-Probability Fraud: Asked if people bought rare fruits (rambutan, ugli fruit, goji berries) in the last two months. If they said yes to two or more, they were likely lying to qualify (3.1% did this).
- Trap Questions: They used simple directives that request the respondent to select a specific answer. For example, “Please verify where you are in the survey by marking a ‘2’ for this item”. About 9.7% failed.
- Speeding: Flagged people who answered a set of questions too fast (less than 4 seconds per question, 1.3% of people).
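The following is a minimal sketch, not the authors' code, of how these three flags could be computed with pandas. The column names (`rambutan`, `ugli_fruit`, `goji_berries`, `trap_answer`, `grid_seconds`) and the grid-question count are hypothetical; only the thresholds come from the notes above.

```python
# Minimal sketch of the three data-quality flags described above.
# Column names and the grid-question count are hypothetical.
import pandas as pd

RARE_FRUITS = ["rambutan", "ugli_fruit", "goji_berries"]  # low-probability purchase items
MIN_SECONDS_PER_QUESTION = 4  # speeding threshold from the notes above

def flag_quality(df: pd.DataFrame, n_grid_questions: int) -> pd.DataFrame:
    """Add boolean flags for likely qualification fraud, trap failure, and speeding."""
    out = df.copy()
    # Fraud screen: claiming to have bought two or more rare fruits recently.
    out["flag_fraud"] = out[RARE_FRUITS].sum(axis=1) >= 2
    # Trap question: any answer other than the requested "2" counts as a failure.
    out["flag_trap"] = out["trap_answer"] != 2
    # Speeding: average time per grid question below the 4-second floor.
    out["flag_speeding"] = out["grid_seconds"] / n_grid_questions < MIN_SECONDS_PER_QUESTION
    return out

# Toy example: respondent 0 passes every check, respondent 1 fails all three.
toy = pd.DataFrame({
    "rambutan": [0, 1], "ugli_fruit": [0, 1], "goji_berries": [0, 0],
    "trap_answer": [2, 3], "grid_seconds": [120, 20],
})
print(flag_quality(toy, n_grid_questions=10))
```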
Findings
- Did more money change who joined or quit?
- No. The share of people who qualified (87%) or quit early (3%) was about the same, no matter how much money they were offered.
- Did more money stop cheating?
- Not really. About 18% of people were flagged as possibly dishonest (saying they bought rare fruits). More money didn’t clearly reduce this, though older people lied less, and those without college degrees were less likely to cheat as the reward increased.
- Did more money improve attention?
- A little, but only for some people.
- Employed people were more likely to fail the trap question (8% more likely), but bigger rewards made them 4% less likely to fail it. This suggests more money helped them focus.
- Other behaviors (like speeding or inconsistent answers) didn’t change much with more money.
- Response Quality: People who were dishonest or careless gave different answers than careful respondents (they were more likely to say they’d pay extra for Florida strawberries). This happened at all reward levels.
- Why People Joined: About half of honest respondents (55%) and slightly fewer careless ones (51%) said they took the survey for the money. Only 23% of dishonest people said this, maybe to hide their motives. The reward amount didn’t change why people said they participated.
What It Means
- Main Point: Giving more money doesn't make a big difference in how honest or careful people are in web surveys. It helps a bit with employed people paying attention to trap questions, but that's about it.
- Tips for Better Surveys:
- Use trap questions and rare-item questions to catch dishonest or lazy respondents early.
- Be careful though — some honest people might get flagged by mistake (30% of those marked as dishonest didn’t mess up other questions).