The Evaluation Programme at RUHBC, Edinburgh

Feedback

Anticipated Evaluation Questions

  • How can I broker among the needs of many different stakeholders given limited time and resources?

  • Balancing representation of voices/stakeholders in planning policy with determinants (indicators) of outcome: for example, funders and government frequently have greater capacity and ability to communicate what they have / what is perceived as “desired”.

  • Creating reciprocity across sectors and stakeholders

  • Health outcomes

  • Client/participant satisfaction

  • Quality of services received

  • How can we identify appropriate measures/indicators?

  • How do we work around structural barriers?

  • Evaluating impact of one of our strategic granting programs (what has been impact over time? What has the impact been on community?)

  • Addressing culture specific evaluation approaches

  • Impact of the move to an integrated client care model in the home care sector

  • How do we evaluate an organization’s strategic plan or operational plan?

  • How to develop internal capacity for evaluation?

  • Exploring the development of an evaluation policy to guide overall evaluation

  • How to evaluate a government funding program for a medical device?

  • How do we build capacity (of users, practitioners) to carry out evaluation that correctly maps the markers of progress for the long-term impact? Process? Developmental?

  • Evaluating team operational processes

  • Evaluation 101: how to plan and design an evaluation? How to identify meaningful objectives? Metrics and indicators?

  • How to measure our tools/approaches: are they meeting our needs?

  • What is our organization influencing?

  • Health/health care → how can we do better? / spheres of influence?

  • Scientific experimentation vs community-based evaluations?

  • Evaluation capacity building

  • Developmental evaluation

  • How to ensure/measure uptake, while acknowledging structural boundaries/limitations?

  • What are the impacts of supportive housing services on people living with HIV in Ontario?

  • How to make an evaluation a learning process accepted by all parties (community and researchers)?

  • Challenge of rigor and evidence from a practical, community-based approach!

  • How to do initiative planning (including evaluation planning) in a short time frame with pressure to implement initiatives?

  • Different views on the differences between research and evaluation

 
 

Strengths of the Workshop

  • Free

  • Great speakers with lots of experience

  • Discussion time was great → more would be appreciated

  • Many perspectives provided a rich discussion

  • Expertise, knowledge of presenters

  • Diversity of participant knowledge and questions/comments shared

  • Good combination of theory & practice and discussion of how the two must inform each other

  • Good first step and would be interested in expanded versions of each of the elements

  • Diversity of participants

  • Format was great

  • I thought the speed/pace of the workshop was effective and kept the discussion moving

  • The topics and breakdown of the day’s discussion were very interesting and enlightening

  • small group of people with common knowledge needs

  • short but helpful → didn’t drag (much appreciated)

  • very clear questions, and the facilitators followed them

  • the flip chart

  • opportunity to network and get ideas

  • The exchange of ideas and sharing of similar challenges within evaluation was helpful. If anything, it is nice to vent about problems and to know that others feel the same way

  • Very interesting dialogue & discussion. While not what I initially thought, I thoroughly enjoyed the session

  • Lots of opportunity for participation

  • A lot of great discussion

  • Very democratic

  • The energy and commitment by the speakers

  • Facilitators are very knowledgeable

  • The atmosphere was also very good

  • Mix of people in the room (academics, program implementation, research, government)

  • Notes and slides online

  • Great overview of great information

  • Good to have practitioners presenting; good mix of presentations and discussions

  • Allowing group discussions and encouraging participation

  • Answered individual questions

  • Sanjeev was an excellent moderator and commentator. Enjoyed all presentations and the format.

  • Several different facilitators to provide a variety of perspectives and answers to people’s questions.

  • Website was very helpful

  • Multiple presenters, roundtable approach and lots of discussion

  • Great sharing of theory and practical knowledge

  • Thought-provoking presentations and discussions

  • Liked the built-in time for comments

 
 

Weaknesses of the Workshop

  • It would have been helpful to have received the handouts prior to July 5th to review

  • Some discussion went off topic

  • It would be a good idea to have a break midway through

  • Sometimes it was difficult to hear the speakers or questions

  • Felt rushed at times → perhaps a longer workshop next time, or cover less material/topics

  • I think this could have been broken out into a whole series

  • Having a government representative present and discuss the role of evaluation in programs & policies would have been good, and would also have touched on the reality of program implementation & local initiatives

  • too general → would have liked concrete examples

  • lack of focus on rigorous evaluation design

  • lack of information on innovative evaluation designs

  • can we have a 5 minute stretch break?

  • Could provide resources to help people engage in evaluation (e.g. THCU.ca, online program planner on OHAPP)

  • Seating and layout

  • The goals of the workshop presented at the beginning were not threaded throughout the workshop

  • A bit too much participation → led to a slightly disjointed flow to the workshop

  • Lack of clear definition of evaluation

  • Highly theoretical rather than practical for those of us working in community organizations

  • Nothing → excellent

  • Short, but thanks for adding the second workshop

  • Presenters’ contact information was not readily available

  • Not enough time to fully discuss topics

  • I wish it was longer

  • More case studies?

  • Questions could have been given out before we all came together

  • Why/what will the new Evaluation Center do?

  • Full day workshop would be great. We just touched on some of the challenges.

  • Maybe define program theory more clearly

  • Sometimes people’s questions were not answered (I don’t mean the complex questions with no real answers, e.g. the difference between evaluation and performance management)

  • Presumed more advanced knowledge and experience in evaluation practice and methodology

  • Little opportunity for networking

  • Not a lot of context given about the definition of evaluation and its functions

  • Squeezed into too short a time period

  • Did not define (or acknowledge an open definition of) evaluation vs. performance measurement or research from the very beginning – this would have been very helpful

  • none

  • Might have been helpful to define evaluation and complex community initiatives at the start, so the audience had a clearer idea of the context of discussions

 
 

Utility of the Workshop

  • 10/10

  • Amazing – one of the most useful workshops I have attended in a long while

  • Provided a good starting point for future discussion, networking

  • Connections to other areas of work in public sector management and organizational theory

  • Great introduction on evaluating complex initiatives

  • Useful for work in program design and the role of evaluation in this area

  • Great to hear how others are using developmental evaluation

  • Very useful – will definitely look up presentations online

  • Very useful when I go forward in analyzing evaluations and discussing evaluative components of programs & initiatives

  • picked up some interesting tidbits

  • very useful as a networking tool

  • useful, although not too much new knowledge

  • Somewhat – I am a PhD student in Health Services Research and Policy. It wasn’t clear from the advertisement that the focus would be non-academic

  • Great opportunity to learn from others

  • Not as useful as I had hoped. Would have liked to learn more about concrete methodology examples and strategies → what’s worked for people, what hasn’t? Practical, concrete methods of evaluation. More small-group activities would have been useful too (hard for all to participate in big group discussions). Less lecture style

  • Multi-disciplinary

  • I personally find the topic very interesting with my background in health administration, but my work currently focuses more on academic evaluation, which the workshop did not address.

  • Furthers my thinking

  • Knowledge of resources available to me

  • Excellent: provides a good framework for discussion of several common evaluation questions

  • Affirmed some of my work and gave me some new perspectives

  • I feel that I deal with the issues raised today on a day-to-day basis and am hungry to move on and inform application & practice → need future space for continuous learning, reflection and discussion

  • Good for understanding a wider perspective on evaluation

  • I realize that the challenges I face in evaluation are being faced by others (e.g. buy-in, partnerships)

  • Very useful in clarifying some ideas and thoughts regarding a project I am working on

  • As a graduate student in applied social psych, it was an excellent supplement to my program evaluation course work.

  • Somewhat → stimulated lots of thinking for me, but the focus was less useful as I work in a health care context that doesn’t provide health care directly (i.e. evaluation of ethics policy and programs)

  • Not very useful because I am not working on directly designing or conducting evaluations → I’m reading academic evaluations of RCTs as part of a literature review.

  • Excellent! Congrats on your first workshop!

  • Very useful. I look forward to other workshops or dialogues as you called them.

  • Quite useful: good resources and links to look up. Helpful to listen to practical examples and learn from others.

  • Reinforced the importance of doing evaluations and including them in the planning phase
 
 

Other Feedback

  • How do we build a common understanding of the problem/issue among stakeholders?

  • I would be very interested in attending future workshops on accountability and participating in this newly created community. I hope the forum will continue and grow/maintain its (web) presence through the site you established

  • References and actual tools/examples would have helped

  • Long time coming. Thanks.

  • How to measure individual behaviour change and capacity/skill increase?

  • How/what short term impacts or goals are appropriate to measure when working in a climate focused on short-term political cycles?

  • What safeguards can you build into evaluation to mitigate staff/community turnover?

  • How do you convince stakeholders to accept longer-term qualitative measures?

  • good start

  • great workshop → looking forward to the website and using it

  • I think it would have been helpful if evaluation had been defined at the beginning so everyone knew what was being talked about. Other definitions that would have been helpful: performance measurement and performance management.

  • Great initiative ☺

  • How will you link with other evaluation organizations (e.g. CES-ON, Canadian Evaluation Society – Ontario Chapter)?

  • Looking forward to more engagement

  • Thank you for opening a second session.

 
 
 
torontoevaluation.ca/complexity