Huge shoutout to Thomas Bialek and @chuxin_h on the OP Labs data team, and @andyhall from Stanford GSB/ a16z crypto, for leading the design and analysis of Airdrop 5 summarized below!
Summary
We leverage the quasi-experimental design of airdrop 5 to study whether airdrop rewards increased subsequent user retention. We find that:
(1) Receiving 50 OP through Airdrop 5 increased 30-day retention by 4.2 percentage points and 60-day retention by 2.8 percentage points.
(2) Bonus categories that focused on cross-chain activity moderately increased retention, whereas a bonus category focusing on frequent activity decreased retention rates over time.
(3) A more frequent, recurring airdrop cadence may be more effective at re-engaging addresses over the long term, given the marked decline in the treatment effect observed between the 30- and 60-day marks.
This raises some interesting questions for the future:
- Why do airdrops sometimes succeed in encouraging long-term engagement, and sometimes fail to?
- How might we design airdrops to encourage sustained positive effects on retention (rather than positive but declining effects over time), especially given our public pledge to offer additional sequential rewards?
- Are there other causal inference tools (e.g., synthetic control) we might leverage to study the effects of airdrops and other policies in Optimism’s token allocation strategy?
We speculate that the positive treatment effect may in part be driven by expectations of a future airdrop with similar criteria as well as power users parlaying their rewards into a variety of activities across the Superchain. Furthermore, we hypothesize that the Frequent User bonus may target a different segment of addresses prone to farming activity, which could help explain the resulting negative treatment effect.
By extension, the results suggest that airdrops may be a viable tool for bottom-of-the-funnel activities geared toward boosting retention rates. Interestingly, the fact that a small reward of 50 OP already demonstrated a powerful effect on retention implies that the "minimum effective dose" may be lower than initially expected.
Airdrops: Short-term Adoption versus Long-term Retention?
In web3, airdrops directly distribute tokens to specific addresses, often in an effort to raise awareness and adoption of crypto projects. Optimism has earmarked 19% of the total initial token supply for airdrops to the community. To date, Optimism has issued 5 airdrops, and approximately 550M OP remains for future ones. We've learned and iterated each time, finding for instance that airdrop 2 increased governance engagement, and that user friction, such as the need to claim rewards, can be a barrier to receiving airdrop 4.
Airdrop 5, the airdrop we study here, aimed to reward Superchain power users who actively engaged with apps across the ecosystem. We wanted to understand: Did this airdrop actually reach its goal of raising retention? And was the lift in retention durable (e.g., is there still meaningful activity after one or two months)? We crunched the data and leveraged the quasi-random design to estimate the effects of airdrop rewards on subsequent retention.
We Exploit the Natural Experiment in Optimism’s Airdrop 5 to Study Causal Effects of Rewards on Retention
On October 9, 2024, Optimism distributed 10.4M OP to 54.7K addresses. As with airdrops 1 and 4, eligible addresses needed to claim their rewards. Airdrop 5's reward function consisted of:
- A Superchain Power User main reward
- Seven bonus rewards
Reward Criteria | Description |
---|---|
Main Reward: Superchain Power User | Interacted with at least 20 unique contracts on the Superchain and had a contracts-to-transactions ratio of at least 10% during the eligibility period from Mar 15, 2024, to Sep 15, 2024. |
Bonus Reward 1: Active Delegator | Had at least 9,000 total OP delegated x days delegated during the eligibility period from Mar 15, 2024, to Sep 15, 2024. |
Bonus Reward 2: Frequent User | Made at least 10 app transactions per week in at least 20 distinct weeks during the eligibility period from Mar 15, 2024, to Sep 15, 2024. |
Bonus Reward 3: Superchain Explorer | Made at least 1 app transaction on at least 7 chains in the Superchain during the eligibility period from Mar 15, 2024, to Sep 15, 2024. |
Bonus Reward 4: Early Superchain Adopter | Made at least 1 app transaction on at least 3 chains in the Superchain in the first week after each chain’s public mainnet launch. |
Bonus Reward 5: Quester | Completed at least 1 Optimism quest between Sep 20, 2022, and Jan 17, 2023. |
Bonus Reward 6: SuperFest Participant | Participated in at least 5 SuperFest missions during the campaign period from Jul 9, 2024, to Sep 3, 2024. |
Bonus Reward 7: SUNNYs Fan | Minted NFTs from at least 3 unique contracts that registered for the SUNNYs during the eligibility period from Mar 15, 2024, to Sep 15, 2024. |
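To make the main eligibility rule concrete, here is a minimal sketch of the Superchain Power User check described in the table. The function name and inputs are illustrative, not Optimism's actual schema or pipeline.

```python
# Hypothetical sketch of the Main Reward rule: at least 20 unique contracts
# AND a contracts-to-transactions ratio of at least 10% during the
# eligibility period. Field names are illustrative assumptions.

def qualifies_for_power_user(unique_contracts: int, app_transactions: int) -> bool:
    """Return True if the address meets the Superchain Power User criteria."""
    if app_transactions == 0:
        return False
    ratio = unique_contracts / app_transactions
    return unique_contracts >= 20 and ratio >= 0.10

print(qualifies_for_power_user(25, 200))  # 25/200 = 12.5% -> True
print(qualifies_for_power_user(25, 400))  # 25/400 = 6.25% -> False
```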
Overview of Regression Discontinuity Design for Studying the Effects of Airdrops 5 on Retention
Our goal is to estimate the effect of giving an address OP rewards on the subsequent retention rate of that address. Isolating this causal effect is challenging due to a basic issue of statistical confounding: addresses more likely to qualify for OP rewards based on their past usage are likely to continue using Optimism more in the future, regardless of whether or not they receive rewards. We do not want to confuse a positive correlation between OP rewards that were awarded based on engagement and subsequent retention with an actual causal effect of the airdrop.
To overcome this issue of confounding and to estimate the effects of receiving airdrop 5 on subsequent retention, we used a regression discontinuity (RD) design. This quasi-experimental design allows us to analyze the effects of an intervention by comparing individuals directly below and above an eligibility threshold. Because these individuals should be essentially the same, we can treat this as a quasi-random intervention and compare the group directly below the threshold, which does not receive the intervention (our effective control group), to the group directly above it, which does (our effective treatment group).
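The core RD comparison can be sketched with simulated data: fit a local linear regression on each side of the cutoff within a bandwidth, and take the difference in intercepts at the cutoff as the estimated jump. This is a toy illustration with made-up numbers, not the rdrobust pipeline used in the actual analysis.

```python
# Minimal RD sketch on simulated data: a running variable with a known
# 4-percentage-point jump at the cutoff, estimated via separate local
# linear fits on each side. All numbers here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

cutoff, bandwidth, true_jump = 50.0, 10.0, 0.04
x = rng.uniform(20, 80, 20_000)  # running variable (e.g., a reward score)
y = 0.3 + 0.002 * x + true_jump * (x >= cutoff) + rng.normal(0, 0.05, x.size)

def rd_estimate(x, y, cutoff, bandwidth):
    """Difference in local-linear intercepts at the cutoff."""
    left = (x >= cutoff - bandwidth) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + bandwidth)
    # Fit y ~ (x - cutoff) on each side; the intercept is the fitted
    # value at the cutoff. np.polyfit returns [slope, intercept].
    b_left = np.polyfit(x[left] - cutoff, y[left], 1)
    b_right = np.polyfit(x[right] - cutoff, y[right], 1)
    return b_right[1] - b_left[1]

print(round(rd_estimate(x, y, cutoff, bandwidth), 3))  # close to true_jump (0.04)
```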
We find Positive Effects of Airdrop 5 on 30-day User Retention Rates
Specifically, in our case we examine the treatment effect close to the eligibility cutoff of 50 OP, which determined whether an address would receive the airdrop or not. This allows us to isolate the causal effect of the airdrop, as addresses just above and below the threshold are otherwise nearly identical in their observed characteristics, with the treated addresses receiving 50 OP and the control addresses receiving nothing. Importantly, we did not announce this threshold beforehand, so there is no reason to believe we'd see sorting directly around the threshold (we also confirm this with the data). We measure the 30-day retention rate as the number of unique days on which the address transacted within the 30 days after the airdrop announcement date.
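The retention outcome can be illustrated with a small helper that counts unique active days in the window after the announcement. The exact windowing conventions (inclusive vs. exclusive endpoints) are our assumption for this sketch.

```python
# Illustrative computation of the 30-day retention outcome: the number of
# unique days an address transacted in the 30 days after the announcement.
# Window boundaries here are an assumption, not the exact production logic.
from datetime import date, timedelta

ANNOUNCEMENT = date(2024, 10, 9)  # Airdrop 5 distribution date

def active_days(tx_dates, announcement=ANNOUNCEMENT, window=30):
    """Count unique active days within `window` days after the announcement."""
    end = announcement + timedelta(days=window)
    return len({d for d in tx_dates if announcement < d <= end})

txs = [date(2024, 10, 10), date(2024, 10, 10),  # same day counted once
       date(2024, 10, 25),
       date(2024, 12, 1)]                       # outside the 30-day window
print(active_days(txs))  # 2
```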
As we can see in the figure above, there is a notable jump in the 30-day retention rate right at the threshold: a statistically significant increase of 4.2 percentage points (p = 0.000000348; SE: 0.008). Notably, we rely on the rdrobust package, which implements a standardized, data-driven procedure to select the optimal bandwidth and is widely regarded as a standard for regression discontinuity analysis. As a robustness check, we assessed how sensitive the RD estimate is to the choice of bandwidth. With a narrow bandwidth (1 to 5), the estimated treatment effect is around 7–11 percentage points; as the bandwidth widens (5 to 50), the estimate drops to about 3.5–5.5 percentage points.
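The bandwidth-sensitivity check can be sketched on simulated data: estimate the jump at the cutoff with local linear fits on each side, repeating over a range of bandwidths and watching how the estimate moves. Again, this is a toy illustration with made-up numbers; the actual analysis uses rdrobust's data-driven bandwidth selection.

```python
# Sketch of the bandwidth robustness check: re-estimate the discontinuity
# over several bandwidths. Simulated data with a known 4pp jump.
import numpy as np

rng = np.random.default_rng(1)
cutoff, true_jump = 50.0, 0.04
x = rng.uniform(0, 100, 50_000)
y = 0.3 + 0.002 * x + true_jump * (x >= cutoff) + rng.normal(0, 0.05, x.size)

estimates = {}
for bw in (2, 5, 10, 25, 50):
    left = (x >= cutoff - bw) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + bw)
    # Local linear fit on each side; the intercept is the value at the cutoff.
    jump = (np.polyfit(x[right] - cutoff, y[right], 1)[1]
            - np.polyfit(x[left] - cutoff, y[left], 1)[1])
    estimates[bw] = jump
    print(f"bandwidth {bw:>2}: estimated jump = {jump:+.3f}")
```

In this simulation the relationship is truly linear, so every bandwidth recovers roughly the same jump; in real data, wider bandwidths trade lower variance for more bias from curvature away from the cutoff.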
We find Positive, though Smaller, Effects of Airdrop 5 on 60-day User Retention Rates
We run a similar RD analysis to estimate the effects of the airdrop on retention after 60 days. As with the 30-day outcome, we see a positive and statistically significant increase in retention among those awarded the airdrop, though the effect size is smaller (+2.8 percentage points; p = 0.000466; SE: 0.008).
Superfest Participant and Superchain Explorer Bonus Categories Significantly Increased Retention
Furthermore, we run another RD analysis to measure the incremental effect of each bonus on 30-day retention rates. To this end, we restrict the sample to addresses that were eligible for airdrop 5 and received either 0 bonus points or exactly 1 specific bonus point, which helps us isolate the incremental treatment effect and compare the efficacy of the various bonuses. We find that qualifying for the SuperFest Participant bonus leads to a statistically significant increase in the 30-day retention rate of 10.0 percentage points, while receiving the Superchain Explorer bonus results in a 4.7 percentage point increase. This highlights that encouraging cross-chain activity might prove beneficial for increasing retention rates. In addition, participating in SuperFest may lead to better onboarding and exploration of DeFi, which could cause participants to continue engaging after the rewards.
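The sample restriction described above can be sketched as a simple filter: keep only eligible addresses that earned either no bonus or exactly the one bonus under study. Field names and values here are hypothetical.

```python
# Hypothetical sketch of the sample restriction for the per-bonus RD analysis.
# Keep eligible addresses with no bonus (effective control) or exactly the
# one bonus being studied (effective treatment). Schema is illustrative.
addresses = [
    {"addr": "0xA", "eligible": True,  "bonuses": []},
    {"addr": "0xB", "eligible": True,  "bonuses": ["superfest"]},
    {"addr": "0xC", "eligible": True,  "bonuses": ["superfest", "explorer"]},
    {"addr": "0xD", "eligible": False, "bonuses": ["superfest"]},
]

def restrict_sample(rows, bonus):
    """Eligible addresses with zero bonuses or only the specified bonus."""
    return [r for r in rows
            if r["eligible"] and r["bonuses"] in ([], [bonus])]

sample = restrict_sample(addresses, "superfest")
print([r["addr"] for r in sample])  # ['0xA', '0xB']
```

Restricting to "0 or exactly 1" bonus avoids attributing the combined effect of multiple overlapping bonuses to any single one.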
Bonus Point | Est. Treatment Effect | Standard Error | p-Value | Confidence Interval (95%) |
---|---|---|---|---|
SuperFest Participant | +10.0pp | 0.03 | 0.0009187 (Statistically Significant) | [0.041, 0.159] |
Superchain Explorer | +4.7pp | 0.012 | 0.0001576 (Statistically Significant) | [0.022, 0.071] |
Frequent User Bonus Rewards Significantly Decreased Retention
Conversely, we observe a statistically significant 7.1 percentage point decrease in the 30-day retention rate for the Frequent User bonus. At first glance, this result is counterintuitive and runs against the bonus's intended outcome, but on closer examination it illustrates the complexities of large-scale incentives. More concretely, we posit that this negative effect may arise from farmers who immediately sell their rewards before exiting the system. These findings suggest that directly rewarding extremely high activity might not further increase retention rates. Note that we also examined the other four bonuses, but the results were not statistically significant.
Bonus Point | Est. Treatment Effect | Standard Error | p-Value | Confidence Interval (95%) |
---|---|---|---|---|
Frequent User | -7.1pp | 0.035 | 0.04142 (Statistically Significant) | [-0.139, -0.003] |
Conclusion
To summarize, the key lessons from this analysis are:
- Airdrop 5 increased 30-day retention by 4.2 percentage points and increased 60-day retention by 2.8 percentage points
- Encouraging cross-chain activity seemed beneficial for increasing retention rates
- Directly rewarding very high activity may not lead to an increase in retention rates, perhaps because this captures farmers who immediately sell rewards
More broadly, we hope this study illustrates how we might use various causal inference tools to study policy questions related to Optimism's token allocation, as part of our data-driven approach to informing system design. For instance, others in this space might consider incorporating regression discontinuity designs into future airdrop designs. If this sounds interesting to you, please get in touch.