Grants for voting neurons

I'll try to separate the recommendations so we can have a focused discussion on each:

  • is each concern probable and relevant?
  • is the recommendation the best way to mitigate it?

Wenzel, Nathan, Zane, a few others on the CodeGov team, and I have voiced our concerns about the change from a cooperative model to a competitive model :disappointed:

Concern 1 - Shared Training & Tooling:
We are concerned that peer training, documentation (like review primers) and open-source tools (like the review website, notifications, the tally bot, etc.) will cease to exist in this competitive model.

In a nutshell, we would completely lose the economies of scale in shared services / needs.

Concern 2 - Unfair / Ineffective Competition:
Furthermore, it seems we are heading towards a “the bigger the Voting Power, the bigger the reward” kind of mechanism.

We worry that reviewers will be judged by popularity, marketing effort, or collusion with power, rather than by their technical quality.

Reviewers should be competing on the number of missed reviews, the number of issues found, or a Dfinity devs satisfaction (NPS) score, not on individual/subteam popularity.
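
To make this concrete, here is a minimal sketch of ranking reviewers on measurable facts rather than popularity; the metric names and weights are my own assumptions, purely to show that such a ranking can be computed from auditable numbers:

```python
# Minimal sketch: rank reviewers by objective metrics, not popularity.
# Field names and weights are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class ReviewerRecord:
    name: str
    assigned_reviews: int  # reviews the reviewer was expected to deliver
    missed_reviews: int    # reviews not delivered on time
    issues_found: int      # confirmed issues raised during reviews
    dev_nps: float         # Dfinity devs satisfaction (NPS), -100..100

def score(r: ReviewerRecord) -> float:
    """Higher is better; the weights are illustrative, not a proposal."""
    completion = 1 - r.missed_reviews / max(r.assigned_reviews, 1)
    return 50 * completion + 2 * r.issues_found + 0.5 * r.dev_nps

reviewers = [
    ReviewerRecord("alice", 20, 1, 7, 60.0),
    ReviewerRecord("bob", 20, 4, 2, 30.0),
]
for r in sorted(reviewers, key=score, reverse=True):
    print(f"{r.name}: {score(r):.1f}")
```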

On another note, the reviewers also don’t want to do communication-type tasks; they would rather delegate them to a marketing / project manager who can run those tasks efficiently.

Recommendation 2:
I believe these concerns can be mitigated by having a “core team” that is responsible for all the “shared” services and needs of all reviewers. How to implement it can be discussed in more depth, but think of it as just another grant (the same process of applying and being selected, or maybe the reviewers could elect this core/supporting team).

The responsibilities of this core team are:

  • facilitate periodic calls between all reviewers.
  • with the reviewers, define which benchmarks they agree to track and be compared on.
  • by the end of each month, release the latest results (facts that a third party / anyone else could audit); a sketch of such a release follows after this list.
  • communicate frequently, in all relevant channels, with Dfinity and NNS token holders: a summary of results, the most important issues found, and whether they were considered addressed.
  • gather the reviewers’ needs and turn them into initiatives or dev-grant requests for shared tooling / best practices / collaboration opportunities.
  • look after the reviewers’ well-being, ensure the conditions stay attractive, and make sure potential new reviewers have an equal chance of being trialed and onboarded.
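
To make “results anyone can audit” concrete, here is a minimal sketch of what the monthly release could look like; the schema and field names are my assumptions, not an agreed format. The point is simply that the release is a plain machine-readable file whose numbers any third party can re-derive from public proposal and review data:

```python
# Minimal sketch of an auditable monthly results release.
# The schema and field names are hypothetical, for illustration only.
import json
from datetime import date

monthly_report = {
    "period": "2024-05",  # hypothetical reporting month
    "released_on": date.today().isoformat(),
    "benchmarks": ["missed_reviews", "issues_found", "dev_nps"],
    "reviewers": [
        {"name": "alice", "missed_reviews": 1, "issues_found": 7, "dev_nps": 60.0},
        {"name": "bob", "missed_reviews": 4, "issues_found": 2, "dev_nps": 30.0},
    ],
}

# Publish as a plain JSON file; anyone can diff it against public review data.
with open("review-results-2024-05.json", "w") as f:
    json.dump(monthly_report, f, indent=2)
```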

With this “core team” concept, I think we can mitigate all the concerns without adding significant new risks. Of course there is a cost, but we should see it more as an investment: an investment in the consistency, quality and sustainability of the whole system. This “axis” allows for economies of scale in the shared services, qualified supervision of the reviewers, and impartial communication with Dfinity and NNS token holders.

Looking forward to feedback on this recommendation; it’s by far the most important one I have.
