
Experience & Program Evaluation

I honed much of my craft as a researcher and evaluator by leading a multitude of experience and program studies. I have been involved in projects across all phases of development, from iterative front-end evaluation, concept testing, and piloting to rigorous diagnostic and summative studies. Throughout the evaluation cycle, I partnered closely with experience developers and program managers to develop guiding questions, determine methodological approaches, devise implementation plans, and form interpretive strategies. Usability and application of results were key goals of any evaluation project I led, so I readily employed various data visualizations and reporting formats for non-technical audiences. Likewise, I frequently held data interpretation meetings with stakeholder groups; during these meetings I would present results, discuss key insights, and facilitate conversations about implications and recommendations. Results from studies I led were ultimately used in strategic decision making, in understanding and advocating for program impact, and in reporting to funders. Centering the perspectives of program participants and elevating their voices to those in power were always central aims of my evaluative work. Please explore a few examples of the evaluation studies I have led below.

This executive summary exemplifies the robust mixed-methods approach I aimed to employ in the evaluation projects I led. This accessible, non-technical report also showcases the results of the empowerment approach I used in reporting and interpretation: recommendations and implications were collectively developed through facilitated conversation with stakeholders.

This insight-driven report is an example of a project that was used in strategic decision making. Requested by senior leadership to inform a new strategic vision for the organization, the analysis and reporting had to balance the complexity and nuance of participants' experiences with the need for concrete insights and clear visualizations.

This full, technical report is an example of the kind of report that would be made available to stakeholders and accompanied by a data interpretation meeting. These meetings allowed the project team to dig into results, ask critical questions, reflect on practice, and make meaning of the findings.

Associated Presentations & Publications

  • Volunteer Evaluation to Scale

    • Panel presentation at the 2019 Association of Zoos and Aquariums Conference, New Orleans, LA

  • Exploring Value of Immersive Technology

    • Panel presentation at the 2019 Visitor Studies Association conference, Detroit, MI

  • Direct and Indirect Experiences with the Natural World: Comparing Great Lakes Programming and Exhibit Experiences

    • Poster presented at the 2016 Association of Zoos and Aquariums conference, San Diego, CA

  • Observing School Groups at Shedd Aquarium

    • Presentation at the Spring 2016 CCORN meeting

  • Children's Perceptions of their Exhibit Experience After Participating in Free-play and Facilitated Activities

    • Panel presentation at the 2016 Jean Piaget Society conference, Chicago, IL

  • Penguin Play Packs: A Case Study in Play Facilitation

    • Panel presentation at the 2015 Visitor Studies Association conference, Indianapolis, IN

  • Understanding Children's Connections to Nature through Facilitated and Non-Facilitated Experiences

    • Poster presented at the 2015 Visitor Studies Association conference, Indianapolis, IN

  • Small Changes, Big Impact: Scalable Renovations Lead to Improved Visitor Experiences

    • Peer-reviewed article in the Spring 2014 issue of Exhibitionist

    • Co-authored by K. Nesbit and L. Maldonado
