Wikimania 2021 - An Evaluation of a Virtual Conference

Annex A: Summary of Methods and Limitations

Methodology: The evaluation used a utilization-focused evaluation approach. The intended use of the evaluation is to better understand the experience of participants at Wikimania and to inform future event planning processes.

The evaluation used a mixed-methods approach, collecting both qualitative and quantitative data. The data sampling aimed to collect the information needed to make strategic decisions.

Interviews: A total of 12 one-hour phone/Zoom interviews were conducted. Interviewees included four Wikimedia Foundation staff, five Wikimania Core Organizing Team members, and three Steering Committee members.

Registration Form: All self-selected individuals who wished to participate in the conference completed a registration form via Eventbrite (100% of registered participants). Some of the information collected was mandatory and some was optional. After deleting duplicates, 3,888 registration forms were used in this evaluation.

Limitations: There were two limitations in the registration form. The first, and perhaps most important, was that individuals could not indicate that they were part of a technical community. As a result, participant information cannot be disaggregated by membership in this group. This error was replicated in the Post-Event Survey because the registration data was not processed until after the post-event survey was released. The second limitation was that the question on where people were presently living was too broad: individuals indicated cities, states, provinces, countries, and even continents. Coding all answers to sub-continent geographies was therefore very labor intensive and increased the potential for human error.

Post-Event Survey of Wikimania Participants: A survey was sent to all conference participants in the final hours of the conference via the Wikimania chat. A week later the post-event survey was shared with all registrants; this one-week delay in distribution resulted from translating the survey into six languages. A total of three reminders were sent to participants. The post-event survey was completed by 456 individuals (26% response rate). The one-week delay in releasing the translated survey and communicating it via email may have affected the response rate. There may also have been translation discrepancies, but these should be minimal, as the survey was professionally translated and 87% of surveys were completed in English (Arabic: 1, German: 17, English: 396, Spanish: 19, French: 12, Russian: 7, and Chinese: 4).

Limitations: The post-event survey did not ask individuals if they were scholarship recipients. While outside the scope of the evaluation, this was a missed opportunity, as no data was collected on the perceptions of scholarship recipients.

Post-Event Scholarship Affiliates Survey: Originally, the evaluation was meant to interview Affiliates that distributed scholarships, to better understand the effectiveness of the scholarship model applied (i.e., distribution through Affiliates). However, due to time delays, the interview guide was transformed into a survey. A total of 11 out of 18 affiliates answered the survey (61% response rate).

Remo Data: Wikimania’s technology platform, Remo, recorded who attended the conference and how long they stayed. The Remo data was used to identify how many people actually attended the conference, and the dataset was combined with the registration results.

Observations: The evaluator also participated in the conference and took observational notes that were used to triangulate data. Observations of social media and the Wikimania chats (including Telegram) were also captured. In addition, the evaluator attended a group retrospective session, led by the Wikimedia Foundation, on lessons learned about organizing Wikimania as a virtual event.

Annex B: List of Evaluation Questions

Goal of the Evaluation: This evaluation seeks to understand whether and to what extent the intentional design of Wikimania 2021 not only achieved behavior change among key stakeholders, but also reduced barriers to participation while enabling community growth. Moreover, it is interested in understanding whether the process of creating the event adhered to the values and principles of the Wikimedia Foundation and the Wikimedia Movement, as outlined in the Movement Strategy Principles.

Evaluation Questions: The evaluation will answer the following questions:

Effectiveness: To what extent did Wikimania 2021 achieve its expected results?

  • Did each of the key stakeholder groups participating in Wikimania 2021 achieve the expected changes? Were there other expected, unexpected, positive, or negative results of Wikimania 2021?
  • To what extent was the conference able to reduce barriers to attendance and meaningful participation, particularly among newcomers?[1]
  • What was the experience of participants during Wikimania 2021? In particular, what was the experience of new Wikimania participants and the Technical Community?
  • To what extent were participants able to grow their social connections through Wikimania? Is there a difference between veterans and newcomers in terms of their social connections?
  • To what extent was the event able to generate new ideas or projects for volunteers?

Process: To what extent did the process of developing Wikimania 2021 embody the values of the Wikimedia movement and the principles of the Wikimedia Foundation?

  • What lessons were learned during the process of developing and implementing Wikimania 2021? In particular, the Foundation is interested in understanding how well the organizers communicated, collaborated, and co-created.
  • How did the experience of developing and implementing Wikimania differ for each of the stakeholder groups involved (Wikimania Steering Committee, Core Organizing Team, Wikimedia Foundation)? Did the experience of developing, organizing, and implementing a virtual event meet their goals and expectations?
  • Which aspects of a virtual conference were most strategic in achieving the goals of Wikimania? Which aspects need further improvement? (Including language, technology platforms, session structures, scholarships, etc.)
  • Was the level of support to speakers and participants sufficient to be effective? (Including language accessibility, technology, instructions)
  • How accessible and inclusive was the event? What measures were taken to ensure the safety of participants? Were there any differences in how included and welcome newcomers and veterans felt?[2]
  • In addition, the process review will also highlight key learnings on the following processes: Hackathon, Wikimedian of the Year, Language Options, and Committee Collaboration.

Additional Indicators:

The evaluation will also seek to answer the following questions, if evaluation data of prior years is available and comparable.

  • What were the key differences between Wikimania 2019 and Wikimania 2021 as they relate to conference experience, application of learnings, or relationship growth between demographic groups? To what extent did the differences in the conferences impact the experience?
  • How did participant demographic information compare (i.e., number of attendees, new participants, geographies, languages, gender)?

Annex C: Biography of Evaluator

Vanessa Corlazzoli is an experienced evaluator and team leader with over 15 years of experience in the not-for-profit, international development, and research sectors. She has significant experience implementing organization-wide learning agendas and monitoring and evaluation (M&E) frameworks. She is an independent evaluation consultant whose clients include the Mastercard Foundation, the World Food Programme, Mercy Corps, Democracy Fund, the World Bank, the International Budget Partnership, and The Washington Center. Vanessa is considered a thought leader, having authored several papers on M&E theory and practice within the peacebuilding and governance sector.

For six years, Vanessa led Search for Common Ground’s Monitoring and Evaluation unit. She has also worked for the Ontario Ministry of Research and Innovation and the G8 Research Group. She has a Master’s degree from Tufts University - The Fletcher School of Law and Diplomacy and a Bachelor’s degree, focused on Peace and Conflict Studies, from the University of Toronto.

References

  1. A newcomer is defined as a person who has not attended Wikimania before.
  2. For the purpose of this evaluation, a veteran is anyone who has attended a Wikimania in the past.