Talk to the City

zkIgnite

Would you suggest any changes to the scoring or voting phase to improve the proposer's experience?

Overview:
The public consultation revealed four main clusters of opinions. Cluster 0 focused on improving the voting experience, with suggestions such as increasing the number of electors and implementing a minimum bar for proposals. Cluster 1 highlighted satisfaction with the scoring and voting mechanics but called for refinements, including addressing rating disparities. Cluster 2 emphasized concerns about the transparency and effectiveness of the voting process, suggesting improvements such as providing more information and addressing bias in the scoring system. Cluster 3 focused on improving transparency and criteria, with suggestions for clear guidelines, ethical behavior guides, and involving the ecosystem in proposal voting. Overall, participants sought improvements in the voting process, scoring system, and transparency.

Improving Voting Experience

(12 arguments, 29% of total)
Cluster analysis:
Participants expressed a range of opinions on the voting process. Some suggested increasing the number of electors and implementing a minimum bar/score for proposals. Others called for better coordination and anticipation among electors, including discussions and consensus-building. Participants also recommended adding a page that explains the voting process, and suggested one-on-one meetings between proposers and electors with compensation for electors' time.
Representative comments:
  • "If we can get more than three electors to score in the future, it could help negate any large discrepancies."
  • "Include a brief paragraph in the scoring section to provide a neutral perspective on the project, helping remaining electors understand it from a different point of view."
  • "I support the current voting phase and would not make any changes."
  • "We should implement a minimum bar/score for proposals to qualify for voting, reducing the load on electors."
  • "Next time, there should be better coordination and anticipation within electors."

Improving Scoring and Voting Experience

(10 arguments, 24% of total)
Cluster analysis:
Participants expressed satisfaction with the scoring and voting mechanics, noting improvements and appreciation for the quantified calculation process. However, concerns were raised about the lack of detail and range in the scoring system, as well as the potential for arbitrary scores on technical proposals. Suggestions included a lightweight process for resolving rating disparities and a formal document for recording disagreements. Overall, participants found the scoring system helpful but called for refinements.
Representative comments:
  • "The mechanics of scoring and voting played well and were a massive improvement."
  • "I submitted three projects and received the same average score for all of them."
  • "I also happened to be right at the average of all submitted proposals in cohort 2."
  • "From this, I can deduce that my performance is probably just average."
  • "The scoring system should be more detailed and include a wider range of numbers."

Improving Voting Process

(10 arguments, 24% of total)
Cluster analysis:
Participants expressed concerns about the effectiveness and transparency of the voting process, citing votes that felt meaningless, a lack of information, confusion about voting status, and insufficiently detailed scores and explanations. They suggested improvements such as providing more information, adding optional comments, and addressing the subjective and biased nature of the scoring system. Participants also felt anxious about the possibility of receiving partial funding and sought clarity on budget allocation.
Representative comments:
  • "Voters may not have been capable of effectively voting on projects."
  • "I found it confusing when the status was 'in voting' but I saw information on Discord that voting had already ended."
  • "The scoring and voting process was obscure and led to the realization that the proposed projects were out of scope and not eligible for funding."
  • "It was unclear if receiving less funding than the budget meant receiving that amount or needing to be fully funded."
  • "Being partially funded before the final decision added extra anxiety to the process."

Improving Voting Process Transparency and Criteria

(9 arguments, 22% of total)
Cluster analysis:
Participants highlighted the need for clear guidelines and specific criteria for evaluating projects, as well as a revision of the current criteria to align them with the strengths of the Mina Foundation. They also called for a strategy brief, detailed criteria, and an ethical behavior guide to improve the voting process. Transparency in funding allocation and the voting process was emphasized, with suggestions to explore on-chain solutions and involve the ecosystem in proposal voting. Additionally, participants suggested combining quadratic funding with weighted votes for a more democratic process (sketched after the comments below), while also emphasizing the credibility and qualifications of voters.
Representative comments:
  • "Clear guidelines should be provided to electors on the goals of Mina Foundation and criteria for evaluating projects."
  • "A strategy brief, detailed criteria, and ethical behavior guide should be developed to remove agency costs and prevent zero-sum games in the voting process."
  • "Scoring and voting processes should be more transparent in allocating funds"
  • "We would like to explore on-chain solutions for implementing the funding platform."
  • "There should be more involvement of the ecosystem in proposal voting"

Appendix

This report was generated using an AI pipeline that consists of the following steps:
Step 1: extraction (gpt-3.5-turbo)
Step 2: embedding
Step 3: clustering
Step 4: labelling (gpt-3.5-turbo)
Step 5: takeaways (gpt-3.5-turbo)
Step 6: overview (gpt-3.5-turbo)
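
For readers without the interactive links, the following is a minimal sketch of how such a pipeline can be wired together. It is an illustration only, not the code behind this report: it assumes the OpenAI Python client (chat completions and embeddings) and scikit-learn's KMeans, with four clusters to mirror the four opinion groups above.

  import numpy as np
  from openai import OpenAI
  from sklearn.cluster import KMeans

  client = OpenAI()  # reads OPENAI_API_KEY from the environment

  def extract_arguments(response_text):
      # Step 1: ask the chat model to split one survey response into
      # distinct, self-contained arguments, one per line.
      completion = client.chat.completions.create(
          model="gpt-3.5-turbo",
          messages=[
              {"role": "system",
               "content": "Extract each distinct argument from the response, one per line."},
              {"role": "user", "content": response_text},
          ],
      )
      lines = completion.choices[0].message.content.splitlines()
      return [line.lstrip("-• ").strip() for line in lines if line.strip()]

  def embed(arguments):
      # Step 2: map each argument to a dense vector.
      result = client.embeddings.create(model="text-embedding-3-small", input=arguments)
      return np.array([item.embedding for item in result.data])

  def cluster(vectors, k=4):
      # Step 3: group similar arguments. Steps 4-6 (labelling, takeaways,
      # overview) would prompt the chat model again, once per cluster and
      # once for the whole consultation.
      return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(vectors)

  # Usage: flatten all responses into arguments, embed, then cluster.
  # responses = [...]  # raw survey answers
  # arguments = [a for r in responses for a in extract_arguments(r)]
  # labels = cluster(embed(arguments))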