Season 6 Grants Council Final and Retrospective Report

First of all, thank you @Gonna.eth for putting this Retrospective Report together and sharing the learnings identified by the Grants Council Members, and congratulations on all the work completed in Season 6.

As part of my role in the Feedback Commission, I’d like to share some comments and questions that I hope contribute to the depth of this Retrospective. I would like to note that I was not part of the Grants Council in Season 6, and therefore have no insight into internal conversations or the day-to-day operations of the Council. I hope this outside perspective, removed from the internal ops, enriches the conversation and analysis.

I believe the structure of the Report works well in succinctly summarizing the major problems and their solutions, along with how to move forward into the next cycle.

To strengthen the report as a retrospective, it would be beneficial to include the changes introduced for Season 6 as part of the Grants Council Charter and Budget Proposal, and whether these changes (and the underlying assumptions that led to them) had positive, negative, or no outcomes at all. This will create a long-lasting public record and understanding of how the Council has evolved, and of which things, and under what conditions, are not worth trying again.

Some examples:

  1. The unification of the Ops and Lead roles into a single person was a change introduced in Season 6; however, it would seem (based on the end state of the Council) there is a need to keep these roles separated and performed by two different contributors, as had been the case in previous Seasons.

From the Operating Budget Proposal:

Could you please share more on the learnings that led to bringing @Bunnic onboard? Was the work split the same as before (Seasons 4-5), or did it change? Were any changes made to the rewards split compared to what was proposed in the Operating Budget?

  1. Were there any additional changes to the structure of the Council that were needed during S6 vs what was initially suggested?

  2. Could you also please share the learnings on the division of work between the 4 reviewing teams? Was this division of labor useful in achieving the KPIs, both on Builder Experience and on Grant Performance? Do you feel these KPIs were useful and sufficient to challenge these teams and set them up for success?

Another reflection that seems to be missing is the sharp decline in the number of applications, especially when considering the initial assumptions for the budget:

It would seem that, beyond the difficulty of managing a large group of reports, one of the main conclusions supporting the reduction in members is that there was no demand (based on the number of applications) to account for such a large working group. Does this resonate with what was experienced?

  1. What other reflections can we explore based on the original budget and the materialized workload?

These are some of the questions I would encourage the Council to answer when analyzing the original Charter design and Operating Budget Proposal as part of this Retrospective Report.

Comments on the Major Problems and Proposed Solutions:

Problem 1:

I wasn’t able to find in the Grants Council Charter who is responsible for preparing the forms, rubrics, and QA processes, so I’m not sure whether this competence belongs to one team in particular (the Lead and Ops Manager) or to the whole Council. If it belongs to the whole Council, what are the tradeoffs (if any) of delegating it to a task force?

If the Council has historically made few or no changes to a proposed set of forms, rubrics, and QA processes led by the Lead, then this solution makes a lot of sense though!

Problem 2:

I think this solution might be addressing the symptoms rather than the problem, as a “General Innovation” MR would become a catch-all option. If I understand correctly, the problem is that “Mission Requests were too tightly scoped”. Are there any learnings we should take from more loosely outlined MRs vs. very tightly scoped ones? Do we have additional data on the “reduced overall participation in creative solutions”?

Problem 4:

These seem to be two different problems; it might be best to address them separately, given the solutions for each are different (funds allocated vs. ops). The Ops takeaway is a very good finding to reflect on when looking at the initial Charter design.

Before increasing the budget, it would be good to understand the outcomes of completed audits and how they feed into the Council’s KPIs. Does funding more audits translate into the outcomes the Collective needs?

Problem 5:

This would seem to be an effort better co-led with the Foundation’s BD team and the Developer Relations team at OP Labs, if they are already employing resources to this end. These individuals are already at events, building partnerships, and in touch with builders. @vonnie610’s proposal is a very good starting point. I highly doubt that having yet another BD person to talk to is ideal for developers…

One big opportunity area, though, that builders have highlighted consistently across the forum, social media, and other spaces is the complexity of the Grants Program. Would you consider that streamlining how the program works might be a good approach to improving engagement?

Problem 7:

I think this is the most important challenge the Council faces in successfully completing its mandate of allocating funding to grow the Ecosystem based on identified needs and opportunities. The proposed solution is really good, as it enables a continued dialogue to ensure that changes in priorities are quickly communicated to the Council as they take place. Additionally, it would be good to define a procedure for the Grants Council to call for a session outside of the regular check-ins if needed.