How can we improve quality AND quantity?
KPIs are set at the current average capture rate of roughly 10% (# of submissions ÷ # of analysts in the channel). KPIs scale with channel growth, targeting 10-20% per challenge with the goal of eventually surpassing 20%.
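To make the metric concrete, here is a minimal sketch of the capture-rate arithmetic; the function name and the figures are hypothetical illustrations, not actual program data.

```python
def capture_rate(submissions: int, analysts_in_channel: int) -> float:
    """Capture rate = # of submissions / # of analysts in the channel."""
    if analysts_in_channel == 0:
        return 0.0  # avoid division by zero for an empty channel
    return submissions / analysts_in_channel

# Hypothetical example: 12 submissions from a channel of 100 analysts.
rate = capture_rate(12, 100)
print(f"Capture rate: {rate:.0%}")  # -> Capture rate: 12% (clears the ~10% baseline)
```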
Challenge Type | # of Submissions | # of Insights* | Engagement in Channel** |
---|---|---|---|
Tier A Challenges | 5-10+ | 5+ | 5+ posts per challenge |
Tier B Challenges | 20+ | 10+ | 5+ posts per challenge |
*Insights are above-average submissions containing valuable analysis.
**Based on Challenges 3 and 4, we believe this is a fair starting KPI. We've also updated our grading rubric to spell out the new rules for grading submissions more specifically.
Category | Low Score (1-3) | Medium Score (4-7) | High Score (8-10) |
---|---|---|---|
Quality | Demonstrates little understanding of the topic area. Analysis lacks depth and rigor. Methodology is flawed or poorly explained. Conclusions are unsupported or irrelevant. Code (if applicable) is poorly written or non-functional. | Demonstrates a genuine attempt to understand the question and develop a solid methodology for answering it. Analysis shows adequate depth. Methodology is sound but may have minor flaws. Conclusions are mostly supported by the data. Code (if applicable) is functional but may lack efficiency or best practices. | Demonstrates high mastery of the subject. Analysis is thorough, insightful, and goes beyond the basic requirements. Methodology is well-designed and clearly explained. Conclusions are well-supported and offer valuable insights. Code (if applicable) is efficient, well-documented, and follows best practices. |
Engagement | Did not share in the channel. No visible participation in discussions. Showed no interest in collaborating or learning from others. | Engaged at least once in the channel. Participated in some discussions, asked questions, or offered basic comments. Showed some interest in the community aspect of the challenge. | Shared insights and/or specific code/queries from their analysis in the channel to help others. Actively participated in discussions, offering valuable insights and constructive feedback. Demonstrated a collaborative spirit and significantly contributed to the learning environment. |
Sharing Insights/Helping Others | Did not share any insights or offer help. Kept findings and methodologies to themselves. Showed no interest in others' progress or questions. | Shared some basic insights or findings. Offered occasional help or suggestions to others. Showed willingness to engage in knowledge sharing, even if not extensive. | Regularly shared detailed insights, novel approaches, or unexpected findings. Actively helped others by explaining concepts, troubleshooting code, or suggesting resources. Significantly contributed to the collective learning experience of the group. |
Technical Complexity | Methodology is shoddy or fails to meet the minimum requirements for a valid submission. | Methodology is sound; the user makes use of leading analytics tools like Dune, Flipside, or Allium and demonstrates moderate technical understanding despite making some assumptions. | Methodology is advanced and complex; the user leverages leading analytics platforms or APIs for proprietary analysis. The presentation is sophisticated, not just a single dashboard from Flipside or Dune. |
Ken challenged us to come up with more specific and realistic KPIs for each challenge given where we believe the program currently stands. Ken's comments on the challenges: