Up: presentation-increasing-response-rates-incentives
Prepaid and Promised Incentives in Web Surveys: An Experiment
Reading: Bosnjak, Michael, and Tracy L. Tuten. 2003. “Prepaid and Promised Incentives in Web Surveys: An Experiment.” Social Science Computer Review 21(2): 208–17. doi:10.1177/0894439303021002006.
KEYWORDS: web-based surveys; incentives; noncompliance; nonresponse
Overview
- This study tests whether different types of incentives — prepaid cash, promised cash, prize draws, or no incentive — encourage people to participate in an online survey and complete it fully.
- Published in 2003 and conducted with members of a Virginia real estate association, it explores whether prepaid online payments (via PayPal) work as well as prepaid cash does in mail surveys, where it significantly boosts response rates.
- The researchers expected prepaid incentives to outperform others but found surprising results, with prize draws proving most effective.
What they did
- Background: Web surveys often have low response rates, and methods to boost participation (like prepaid cash in mail surveys) are hard to apply online.
- A 1993 study (Church) showed prepaid cash increases mail survey responses by 19%, far more than promised rewards. With new online payment services like PayPal, the researchers tested whether prepaid cash works similarly for web surveys.
Experiment Setup:
- Participants: 1,466 members of a Virginia real estate association (agents, brokers) with valid email addresses. After excluding 134 unreachable addresses, 1,332 were included.
- Groups: Randomly assigned to four groups of 329-336 people each (a minimal assignment sketch appears after the hypotheses list below):
- Prepaid Group: Received $2 via PayPal before the survey started.
- Promised Group: Offered $2 via PayPal after completing the survey.
- Prize Draw Group: Entered into a draw for one of 25 prizes upon completion.
- Control Group: No incentive, just a survey invitation.
- Survey Process:
- Day 1: Initial email about the survey (mentioned incentives for relevant groups).
- Day 3: Survey start email with a unique URL and ID to track participation.
- Day 8: First reminder for non-respondents, repeating incentive details.
- Day 13: Second reminder, same content plus tech support offer.
- Day 22: Final email announcing survey closure.
- The survey's content isn't detailed in the paper; the survey was hosted on Inquisite software, which tracked participation but couldn't distinguish specific incomplete patterns (e.g., dropouts vs. skipped questions).
- What They Measured:
- Willingness to Participate:
- Number of unique visits to the survey’s welcome page.
- Number of people starting the survey (clicking past the welcome page).
- Speed of accessing the welcome page (days after invitation).
- Actual Participation: Percentage of people completing all questions (completes).
- Incomplete Participation: Percentage of people starting but not finishing (e.g., dropouts, skipped questions, or “lurkers” who viewed without answering); software limitations made this a single broad category.
- Hypotheses:
- Prepaid incentives would outperform promised incentives, based on mail survey findings: more willingness to participate (Hypothesis 1), more completes (Hypothesis 2), and fewer incompletes (Hypothesis 5).
- Promised incentives would perform no better than no incentive on willingness (Hypothesis 3), completes (Hypothesis 4), or incompletes (Hypothesis 6).
- Prize draws' effects were explored without specific predictions due to mixed prior evidence.
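The paper doesn't spell out its randomization mechanics, so below is a minimal sketch of the setup described above, assuming a simple shuffled round-robin split of the 1,332 valid email addresses into the four incentive conditions plus a unique tracking token per member; the function names, URL, and field names are illustrative, not taken from the study.

```python
import random
import uuid

# The four incentive conditions from the study design.
GROUPS = ["prepaid", "promised", "prize_draw", "control"]

def assign_groups(emails, seed=42):
    """Shuffle the address list, deal it round-robin into the four groups,
    and attach a unique token used to build each member's survey URL."""
    rng = random.Random(seed)
    shuffled = list(emails)
    rng.shuffle(shuffled)
    assignments = []
    for i, email in enumerate(shuffled):
        token = uuid.uuid4().hex[:10]  # illustrative tracking ID
        assignments.append({
            "email": email,
            "group": GROUPS[i % len(GROUPS)],
            "survey_url": f"https://survey.example.org/s/{token}",  # hypothetical host
        })
    return assignments

# Placeholder addresses standing in for the 1,332 valid ones.
members = [f"member{i}@example.com" for i in range(1332)]
assignment = assign_groups(members)
print({g: sum(a["group"] == g for a in assignment) for g in GROUPS})
# Round-robin dealing gives exactly 333 per group; the study's groups were
# 329-336, so the actual procedure evidently differed slightly.
```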
Findings
- Willingness to Participate:
- Welcome Page Visits:
- Prize draw group had the most visits (35.9%, 118/329), followed by promised (27.3%, 91/333), control (26.6%, 89/334), and prepaid (25.6%, 86/336).
- No difference between prepaid and promised (p > 0.05) or promised and control (p > 0.05). Prize draw beat all others (p < 0.05).
- Finding: Prepaid didn’t outperform promised (rejecting Hypothesis 1), and promised matched control (supporting Hypothesis 3). Prize draws sparked the most interest.
- Speed of Access:
- Prepaid group was slowest to reach the welcome page (average 6.5 days), followed by prize draw (5.3 days), with promised and control tied at 4.3 days.
- Prepaid was significantly slower than promised (p < 0.01) and control (p < 0.01), but not prize draw (p = 0.098).
- Finding: Prepaid incentives didn’t speed up participation, contrary to expectations.
- Starting the Survey:
- Of those visiting the welcome page, 79.7% of prize draw (94/118), 73.6% of promised (67/91), 70.9% of prepaid (61/86), and 69.7% of control (62/89) started the survey.
- No significant differences between prepaid and promised (p > 0.05), promised and control (p > 0.05), or prize draw and others (p > 0.05, two-sided tests).
- Finding: Prize draw trended higher but wasn’t statistically better; prepaid didn’t lead.
- Actual Participation (Completes):
- Prize draw had the highest completion rate (65.3%, 77/118), followed by promised (58.2%, 53/91), prepaid (55.8%, 48/86), and control (48.3%, 43/89).
- No difference between prepaid and promised (p > 0.05) or promised and control (p > 0.05). Prize draw beat control (p < 0.05) but not prepaid or promised (p > 0.05); a two-proportion sketch of the prize draw vs. control comparison follows the findings list.
- Finding: Prepaid didn’t increase completions (rejecting Hypothesis 2), and promised matched control (supporting Hypothesis 4). Prize draws boosted completions.
- Incomplete Participation:
- Control had the most incompletes (30.6%, 19/62), followed by prepaid (21.3%, 13/61), promised (20.9%, 14/67), and prize draw (18.1%, 17/94).
- No significant differences between prepaid and promised (p > 0.05), promised and control (p > 0.05), or prize draw and others (p > 0.05, two-sided). Prize draw vs. control was significant with a one-sided test (p < 0.05).
- Finding: Prepaid didn’t reduce incompletes (rejecting Hypothesis 5), and promised matched control (supporting Hypothesis 6). Prize draws slightly reduced incompletes.
What it means
- Main Points:
- Unlike mail surveys, prepaid $2 online payments didn’t boost participation or reduce incompletes compared to promised payments or no incentive. This challenges social exchange theory, which predicts prepaid rewards build trust and obligation.
- Prize draws were most effective, increasing welcome page visits and completions (by ~17 percentage points vs. control) and slightly reducing incompletes. This suggests people are drawn to the chance of a big reward.
- Promised $2 payments were no better than no incentive, showing small guaranteed rewards may not motivate.
- Why Prepaid Failed:
- The $2 was sent via PayPal, not cash-in-hand, so it may have felt less tangible.
- Participants (real estate professionals) might see $2 as too small, given their economic mindset.
- Lack of trust in PayPal or the offer’s legitimacy could reduce impact.
- People might be used to prize draws in surveys, making prepaid cash seem unusual or suspicious.
- Tips for Web Surveys:
- Use prize draws (e.g., a few large prizes) to boost participation and completion rates.
- Avoid small prepaid or promised cash incentives ($2) unless trust in the payment system is high.
- Test larger cash amounts or more trusted payment methods (e.g., bank transfers).
- Track participation patterns (visits, starts, completes) to understand incentive effects; a minimal tracking sketch follows below.
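As a companion to that last tip, here is a minimal per-group bookkeeping sketch, assuming simple tracking records with a furthest-stage-reached status field; the statuses and field names are illustrative (not Inquisite's actual output), and the denominators mirror the measures summarized in the findings above (visits per invitation, completes per welcome-page visitor, incompletes per starter).

```python
# Stage order: invited -> visited (welcome page) -> started -> completed.
ORDER = ["invited", "visited", "started", "completed"]

# Illustrative per-respondent records: assigned group plus furthest stage reached.
records = [
    {"group": "prize_draw", "status": "completed"},
    {"group": "prize_draw", "status": "started"},
    {"group": "control", "status": "visited"},
    {"group": "control", "status": "invited"},
    # ... one record per invited member
]

def funnel_rates(records, group):
    """Participation measures for one group: visits per invitation,
    completes per welcome-page visitor, incompletes per starter."""
    statuses = [r["status"] for r in records if r["group"] == group]

    def reached(stage):
        return sum(ORDER.index(s) >= ORDER.index(stage) for s in statuses)

    invited, visited = len(statuses), reached("visited")
    started, completed = reached("started"), reached("completed")
    return {
        "visit_rate": visited / invited if invited else 0.0,
        "completion_rate": completed / visited if visited else 0.0,
        "incomplete_rate": (started - completed) / started if started else 0.0,
    }

print(funnel_rates(records, "prize_draw"))
# -> {'visit_rate': 1.0, 'completion_rate': 0.5, 'incomplete_rate': 0.5}
```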