[Measuring Impact] Data-Driven Content Performance

WHY

On the premise that our collective North Star is Impact = Profit, it is important that we establish transparent standards to differentiate impact (value) from work (time). Specific to quality content, analytics tools and industry standards already exist as an objective means to quantify performance, which can then:

  • inform impact measurement using data
  • guide proposal development to deliver impact
  • allocate fair reward based on impact vs work

Quality Content: High-quality content is a cornerstone of brand, marketing and communication, and there is no shortage of industry standards and data-driven metrics to quantify its performance (direct, proxy and attributed) in ways that inform impact measurement.

Objective Measures: Data-driven performance measures are a reliably objective signal of quality. While there may be subjective measures of quality content and impact, objective industry practices could be our starting point.

Proof of Performance: Content is one area of work where we have the analytics tools to effectively monitor, quantify and reward (future and retroactive) based on proven performance. The same performance data provides retroactive proof of performance.

Consistent Reward: With a standard set of data metrics we can ensure that people are aware of, working towards and rewarded equally for impact. Whether it takes someone one week or two years to attain some x performance (e.g. 10k subscribers), they should receive y reward.
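As a minimal sketch of this principle (Python, with hypothetical thresholds and reward values that are not an agreed standard), reward keys off attained performance and ignores elapsed time entirely:

```python
# Hypothetical reward schedule: reward is keyed to attained performance,
# never to the time spent reaching it. Thresholds/values are illustrative.
REWARD_SCHEDULE = [
    (25_000, 3_000),  # >= 25k subscribers -> 3,000 OP
    (10_000, 1_000),  # >= 10k subscribers -> 1,000 OP
    (1_000, 100),     # >=  1k subscribers ->   100 OP
]

def reward_for(subscribers: int) -> int:
    """Return the reward for a performance level, regardless of time taken."""
    for threshold, reward in REWARD_SCHEDULE:
        if subscribers >= threshold:
            return reward
    return 0

# One week or two years to reach 10k subscribers: the reward is identical.
print(reward_for(12_500))  # -> 1000
```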

HOW

I would like to open a conversation to discuss performance standards specific to content creation and performance measurement. Industry terminology and metrics can 1) help build our collective understanding of how content funnels deliver impact and 2) evolve with our learning around impact.

The case for transparency of these metrics is that, even though they may evolve, people will be aware of the standards that apply to them when seeking future or retroactive grants. I recognise this reflects my personal results-driven, performance orientation and reliance on data-driven decision making, so please excuse these biases.

The terminology shared below is drawn from professional experience across in-house, agency and consultancy practices. The associated example (shown in quotes) is provided for illustration purposes only. This is minimum viable information and does not offer a complete set of performance measures for all content approaches.

My primary intent is to share my perspective as a basis for discussion, debate and constructive feedback from which to speculate <> collaborate <> learn. I hope this might broaden our collective understanding of content performance and accountability so that we effectively reward impact.

WHAT

Purpose

Strategic purpose should speak to an overarching and quantifiable end result (e.g. # product demos, # dapp users, # course completions). We are fortunate to have a set of collective intents that clearly inform strategic purpose.

Purpose: Intent 4 Governance Accessibility: Increase and/or diversify the total voteable supply through a series of podcasts designed to lead people to the Agora platform to delegate their OP

Objectives & Key Results

Strategic OKRs are not all equal in terms of impact. It is helpful to think of content as a funnel: a series of steps towards achieving a strategic purpose. Items at the top of the funnel rarely, if ever, carry the same value as those further down; closing a sale is a strategic result with higher impact (conversion) than top-of-funnel awareness or lead generation. Funding should reflect impact (the sketch after the funnel examples below makes this concrete).

OKRs

  • promote ease of use for the Agora delegation process to encourage delegation
  • promote delegates with <0.25% voteable supply to highlight governance diversity
  • baseline and measure delegated OP for participants to quantify changes and assess the impact on voteable supply

Funnel Examples

| Simplified    | Blended        | Purchase Cycle | Sales       | B2B          | Lead Gen   |
|---------------|----------------|----------------|-------------|--------------|------------|
| Awareness     | Problem        | Problem        | Prospect    | Problem      | Attract    |
|               | Acquisition    | Solution       |             | Requirements |            |
|               | Qualified Lead | Product        | Qualify     | Options      | Convert    |
| Consideration | Engage         | Compare        |             | Solution     |            |
|               | Influence      | Trial          |             |              | Inspire    |
|               | Facilitate     | Select         | Demonstrate | Buy In       |            |
| Decision      | Convert        | Purchase       | Close       | Purchase     | Close      |
| Experience    | Service        | Evaluate       | Concierge   | Service      | UX Support |
|               | Activate       |                |             |              |            |
| Loyalty       |                |                |             |              |            |
| Advocacy      |                |                |             |              |            |
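To make the funnel logic concrete, here is a minimal sketch (stage names and counts are illustrative, not real data) of how stage-to-stage and end-to-end conversion would be computed:

```python
# Illustrative audience counts remaining at each stage of a simplified funnel.
funnel = [
    ("Awareness", 10_000),
    ("Consideration", 1_200),
    ("Decision", 150),
    ("Experience", 120),
]

# Stage-to-stage conversion shows where the audience drops off.
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    print(f"{stage} -> {next_stage}: {next_count / count:.1%} progressed")

# End-to-end conversion is the number that ties back to strategic purpose.
print(f"Overall conversion: {funnel[-1][1] / funnel[0][1]:.2%}")
```

The down-funnel numbers are small but carry the impact; the top-of-funnel count is large but only a precondition.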

Approach

The tactical approach describes a plan to achieve OKRs through a combination of options for publishing and disseminating content:

  • types, e.g. video, audio, copy
  • channels, e.g. owned, earned, paid
  • formats, e.g. AMA, podcast, course, memes
  • methods, e.g. email, socials, search

Approach: Publish a series of # delegate podcasts at [location] and repurpose the content for sharing via Twitter and LinkedIn to drive engagement on the call to action.

Deliverables

Describe the process implementation and resulting output informed by your approach. While this is where work and time are invested, neither is a measure of impact, though impact should remain the goal here.

Deliverables

  • Draft delegate interview agenda and questions to share during delegate outreach
  • Research, identify and engage # willing delegates and collaborate to finalise a tailored agenda
  • Conduct, edit, publish and promote # delegate interviews
  • Report end of season on changes to voteable supply for participating delegates

Call to Action

Whatever level of the funnel you are targeting, content deliverables should be designed to progress your audience through the funnel with a clear call to action that answers the question "Where to next?"

CTA: Delegate Today! Check out [delegate profile [link on agora]] and delegate your OP in a matter of seconds. Simply Connect Wallet > Select Delegate > Click Delegate Your Votes & Sign! Remember you can change your delegation anytime and even delegate to yourself by visiting [agora link]

Remember to use free tools like bit.ly or Google to identify, track and quantify engagement on your content links.
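For example, a UTM-tagged link lets analytics tools attribute clicks to a specific piece of content. A sketch (URL and parameter values are hypothetical):

```python
from urllib.parse import urlencode

def utm_link(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters so analytics tools can attribute clicks."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

# Hypothetical CTA link for a delegate podcast episode promoted on Twitter.
print(utm_link("https://example.org/delegates", "twitter", "social", "delegate_podcast_ep1"))
```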

Key Performance Indicators

Strong KPIs quantify results and represent actions that directly achieve the strategic purpose. Because they quantify your purpose, these KPIs should be monitored, measured and optimised for.

  • lead conversion rate %
  • new users #
  • increase voteable supply %

Solid KPIs measure performance against OKRs attributed to achieving the strategic purpose, yet they can sit several steps higher in the funnel. These are the levers we optimise to achieve results.

  • qualified leads #
  • video watch time (hrs)
  • average view duration
  • subscriber growth %

Many KPIs act as funnel progression measures. They either filter out your audience or move it closer to achieving your strategic purpose. These can be refined to aid OKR performance (see the sketch after this list).

  • open rate
  • link clicks
  • unsubscribed / undelivered
  • time on site
  • forms submitted
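A sketch of how these progression measures fall out of raw counts for a single email send (all numbers hypothetical):

```python
# Illustrative raw counts from one email send.
sent, delivered, opened, clicked, unsubscribed = 5_000, 4_800, 1_440, 360, 24

open_rate = opened / delivered            # share of delivered emails opened
click_through = clicked / opened          # share of openers who clicked a link
unsubscribe_rate = unsubscribed / delivered
undelivered_rate = (sent - delivered) / sent

print(f"open rate: {open_rate:.1%}, CTR: {click_through:.1%}, "
      f"unsub: {unsubscribe_rate:.2%}, undelivered: {undelivered_rate:.1%}")
```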

Other KPIs provide only a proxy measure for more qualitative actions that underpin the strategic purpose:

  • Trust measured as new followers %
  • Authority measured as retweets/likes %
  • Engagement measured as engagement/impressions %

Conversely, metrics such as impressions, likes, views and visits are rarely an effective measure of impact. Although they quantify visibility, they are a weak link to strategic purpose and no substitute for more refined measures. For example (sketched after this list):

  • instead of impressions measure engagement
  • instead of likes measure authority
  • instead of views measure the average watch duration
  • instead of visits measure time on site
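A sketch of these substitutions (all input numbers hypothetical): each refined measure normalises a raw visibility count into a rate that actually tracks attention or trust:

```python
# Hypothetical raw counts for one video and one tweet.
impressions, engagements = 50_000, 1_500
views, total_watch_min, video_length_min = 2_000, 24_000, 30
likes, retweets = 900, 120

engagement_rate = engagements / impressions   # instead of impressions
avg_watch_min = total_watch_min / views       # instead of views
retention = avg_watch_min / video_length_min  # share of the video actually watched
authority = retweets / likes                  # instead of likes: reshares signal authority

print(f"engagement rate: {engagement_rate:.1%}")
print(f"avg watch: {avg_watch_min:.0f} min ({retention:.0%} of video)")
print(f"authority: {authority:.1%}")
```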

Project KPIs & S4 Targets

  • Podcast targeting 1,000 hrs total listening time with an average listening duration of more than 33% (attributed)
  • CTA targeting 500 clicks and a conversion rate of 15% (progression)
  • Participant delegated OP targeting an average 5% increase from baseline (conversion)
  • Voteable supply % change to be assessed, recognising that only new delegation will affect it (conversion)
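As a sketch, end-of-season attainment against these targets could be checked mechanically (the actuals below are placeholders, not results):

```python
# Placeholder end-of-season actuals; targets are taken from the list above.
total_listening_hrs = 1_150
avg_listen_pct = 38
cta_clicks, cta_conversions = 540, 90
baseline_op, current_op = 400_000, 432_000  # participant delegated OP

checks = {
    "total listening time >= 1,000 hrs": total_listening_hrs >= 1_000,
    "average listening duration > 33%": avg_listen_pct > 33,
    "CTA clicks >= 500": cta_clicks >= 500,
    "CTA conversion rate >= 15%": cta_conversions / cta_clicks >= 0.15,
    "delegated OP >= 5% over baseline": (current_op - baseline_op) / baseline_op >= 0.05,
}
for target, met in checks.items():
    print(f"{'MET' if met else 'MISSED'}: {target}")
```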

7 Likes

Thanks for this post! Thinking about how to move this conversation forward: this is a topic we've been thinking a lot about for RetroPGF Round 3. We're planning to share our initial thinking on this soon, and it would be awesome if you could help collect badgeholder feedback on our initial thoughts. We're going to start Citizens' House community calls in early July and this would be a great topic of conversation. Let me know if that's a conversation you'd like to help lead!

5 Likes

Keen to contribute, but due to limited bandwidth I'm much better positioned for async collab. If the meetings are scheduled for a time I can attend, I will be there! This post was intended as a starting point for public discussion and I'd love to hear from people. If you know others working on or interested in this topic, please tag them.

3 Likes

Loved your insights and 100% agree that we should be thinking in terms of KPIs that measure impact as opposed to measuring time spent.

There are some challenges here, with two things to keep in mind:

  1. These metrics, while valid in any setting, come from traditional work environments where reward isn't directly tied to impact. That means that although one might be strict in measuring, they are safe in knowing that missing expected KPIs won't affect their return on time invested.

  2. Critical Milestones towards a mission are proactively set by the proposer, but success or failure in having the target impact can only be seen retroactively. That leads to a situation where two things can happen:
    a) A proposer sets a relatively easy-to-achieve milestone so as not to lose their grant, or
    b) A proposer sets a harder-to-achieve milestone to justify their proposed budget, which jeopardizes their ability to actually receive the grant at the end of the mission.

If you take a look at some of the proposals already shared on Discourse, you'll see that people requesting a grant for their time is a common occurrence, e.g.:

Is the problem that all the aforementioned mission proposals, and by extension the people submitting them, don't understand what it means to measure impact over time spent working?

The challenge, in my opinion, lies in developing a framework that accounts for both the time spent working and the impact generated. That is the distinction I make between Missions and RPGF: Missions reward people for putting in the work and making an effort, while RPGF measures impact. That's why I also agree with Lavande. This is something to be considered for RPGF, not Missions.

I believe we shouldn't make Missions conform to the "Impact = Profit" rule because it creates a lack of alignment.

2 Likes

Agree, objective performance data is not the only way, and I appreciate your consideration of and feedback on the information shared. I am using broad brush strokes here to consider how performance data can inform impact measures.

I would counter that Impact = Profit aligns directly with the Optimistic Vision.
Q: If not the vision, what do you feel we should align on?

I 100% support funding work to deliver impact, just with funding tied to impact (results) as opposed to work (time).

EXAMPLE
Consider the results of work to create, publish and promote video(s) that generates a 50,000 OP increase to voteable supply

Person A takes 12 hours to create one video to achieve the result
Person B takes three months to create 10 videos (480 hrs) to achieve the result

All other factors/results aside, and on the basis that Impact = Profit, should person B be rewarded 40x more (480 hrs vs 12 hrs) or even 10x more (10 videos vs one) than person A for delivering the same results?
Why?

You also offered the following example, which for me highlights the distinction I draw above between
work vs impact | quantity vs quality

imo the choice here is clear for Collective Intent 4 (Governance Accessibility) because expected results are indicated.

Conversely, the resulting impact might not be so clear for Intent 3 (Raising Awareness of the Optimistic Vision). Even there, however, average time watched is a more effective measure of performance than views.

In terms of creator rewards, which results are more likely to correlate with higher impact?

  • y people who watch a video for an average of 1 of 30 minutes (3%)
  • y people who watch a video for an average of 20 of 30 minutes (66%)

I recognise that, having built a career serving clients with success defined in terms of growth, performance, revenue and results, I am biased towards measures that I understand. Quantitative is my go-to reference. I'm not against qualitative or subjective measures, I've employed both, but even there I revert to quantitative measures as a baseline to compare "apples with apples".

2 Likes

I think we agree on what needs to happen, but we see things differently when it comes to what these changes need to apply to. I'm all for using quantitative measures to gauge the impact a mission had, but that should be taken into consideration for RPGF, not Missions.

The brief on Token House Missions mentions:

And that's my response to your comment:

In my thinking, the example you provided should be rewarded as follows:

  • Person A would request less funding through a Mission proposal, but they would receive hefty RPGF
  • Person B would need more funding through their Mission proposal, but they would get less than Person A through RPGF.

In short, we should consider Missions as something complementary to RPGF that incentivizes people to begin working toward something, while RPGF rewards the impact that something has on the broader ecosystem.

That's just my opinion and how I understand things tho. I've suggested we add that topic to the agenda of the Community Call on Tuesday so we can get more people into the discussion and see where the consensus lies.

This is a very good example of funding impact [result = upgrade], not work [time = 3 months].

It's interesting how our perspectives differ. Remember, RPGF is not guaranteed.

1 Like

Agreed, the upgrade is the impact/result of the mission. The funding is meant for the upgrade to happen, not for the impact the upgrade will have on the ecosystem after it has happened.

Similarly, with my Delegate Corner Podcast proposal as an example, the impact of the mission is the creation and publishing of the podcast's episodes, not the effect those episodes will have after they've been published.

It is exactly because RPGF is not guaranteed that mission proposals should fund the completion of work, rather than the impact the work has on the community/ecosystem.

1 Like

Can we also agree that the 'Bedrock upgrade' is work that directly serves a high-impact strategic purpose with quantifiable improvement measures:

  • lower maintenance costs $
  • Ethereum equivalence %

The Delegate Podcast example, meanwhile, offers a solid tactical approach and series of deliverables.

I believe it is the strategic purpose, objectives and key results that should attract funding, because these convey an author's understanding of the Optimistic Vision & Ether's Phoenix.

Content development is one space within the public goods arena with a wealth of industry practices and tools to draw from, which should:

  • guide content planning and development practices
  • set baseline minimum performance thresholds (acceptable/not acceptable)
  • quantify results that inform impact
1 Like

Thank you for the continuous discussion and useful insights. I am by no means an expert on measuring content performance and impact, so much of my knowledge comes from my understanding of the views you share.

I 100% agree with your points. I guess my biggest concern is adapting your approach to the way Mission Proposals are currently set up.

Maybe I don't disagree as much as I initially thought. I just understand Mission Proposals, as introduced by the Foundation, differently than you do. And I believe others do too.

I believe your input as someone with experience in the space will be much needed to help shape future iterations of Missions or RPGF experiments. Hopefully you'll find some time to help, as Lavande suggested.

I know that I personally would appreciate your contributions on that front immensely!

1 Like

We are all learning here, and have diverse experiences, knowledge and perspectives to build from. Thanks for taking the time to share too.

Sharing this here too, as HubSpot has a wealth of resources.