Design and Data

All elements of the design for this evaluation were planned and executed jointly by the Evaluation Centre and the DWP executive team. After developing the theory of change and mapping possible evaluation questions and potential measures onto it, we decided to focus the current evaluation on client outcomes and the mechanisms associated with those outcomes (realist evaluation), and on beginning to examine the impacts of the program (summative evaluation), rather than on learning about best practices, strategies for organizational improvement, or other formative, developmental, or process-type evaluative questions.

The six data collection instruments that were developed include:

  1. Pre-summer survey
  2. Post-summer survey
  3. Daily client journal
  4. Weekly caregiver/care-partner journal
  5. New client survey
  6. Follow-up interviews/online survey

Design Considerations

An unusual design for the evaluation arose from happenstance: most DWP locations take a summer break of roughly two and a half months, from late June until early September. We employed a pre-post study design, but the pre- and post-contexts were somewhat reversed: the pre-summer surveys were actually post-program measures (administered at the end of the term), while the post-summer surveys followed two and a half months without the intervention (making them more like a typical pre-program survey). We wanted to find out whether clients who did not dance over the summer experienced a decline in positive outcomes.

We were also interested in any patterns that might be observed both while clients were dancing regularly and while they were not, so we asked clients to respond to a daily journal starting the last week of the term, running through the summer, and concluding several weeks into the Fall term. To avoid burdening clients, we asked only one daily question: “How are you doing?” On Sundays we asked additional questions to track in greater detail how clients perceived they were doing and what activities they had engaged in during that week. We also asked caregivers/care-partners to respond to a weekly set of journal questions about how they were doing and their observations of the client. The aim was to capture any patterns in caregivers’/care-partners’ energy and overall wellbeing, and to corroborate (or not) what clients were reporting in their daily journals.

We also developed a New Client Survey, which is similar to the Pre-Summer Survey but shortened so as not to be too taxing for new clients to complete. DWP will continue to use the New Client Survey to collect a few key baseline measures for clients, and follow-up surveys administered longitudinally will allow DWP to track the trajectory of client outcomes over time.

A follow-up set of questions, for both an online survey and interviews (in person or by telephone), was developed that focuses solely on tracing impacts. The data gathered with this instrument will inform the development of a more involved evaluation project seeking to establish the causal impacts of DWP.

DWP has also begun keeping attendance records. Its client retention rate over the years has been high but, until now, has not been documented.