An evaluation framework has been developed for IPDET that involves the annual
collection, review, and reporting of program data by an outside evaluator.
Annual evaluations have consistently shown positive results, using a variety of
tools: knowledge tests to measure learning effectiveness, and student
evaluations for the Core Course, for Weeks 3 and 4, and for each workshop.
The majority of data gathered corresponds to what is commonly understood as
Level 1 evaluation, focusing on participant reactions and satisfaction with the
program experience. In addition, pre- and post-tests measure actual cognitive
development of participants as a result of their participation in the program.
IPDET Tracer Study (November 2015)
In November 2015, IPDET conducted its first tracer survey, measuring the performance of IPDET training through the experience of alumni roughly 1.5 years after attending the program: how their capacity to carry out evaluation-related tasks has changed, and how this affects decision-making. The findings from this first survey are as follows:
66% of respondents said that they have applied the knowledge gained from IPDET in their work very frequently, and a large majority (72%) stated that the evaluation-related work they have done since IPDET has very frequently contributed to their decision-making. 85% of respondents continue to actively follow the IPDET listserv, primarily to gather or share new information, look for job opportunities, and network or stay connected. An even larger proportion (89%) feel connected to a global community because of the IPDET listserv.
GAC 2015 Tracer Report
IPDET Evaluation of Program Impact: Volume 1 Main report (October 2010) & Volume 2 Case Studies (April 2011),
B. Cousins, C. Elliott and N. Gilbert, University of Ottawa
This evaluation, which was conducted by Drs. Brad Cousins and Catherine Elliott
with Nathalie Gilbert of the University of Ottawa, examined the extent to which
IPDET has met its objectives of knowledge and skill development and its networking
role in the broader evaluation community. The evaluation also looked at the transfer
of knowledge and skills to the home workplace as well as the factors that are most
powerful in explaining successful application of knowledge and skills. A
non-comparative, retrospective design was employed, using multiple lines of evidence.
Two primary sources were an online questionnaire survey of IPDET alumni who had
attended the program since its inception, and a multiple case study involving two
organizational cases (located in Ottawa and Geneva) and three country-level cases
(Botswana, China, Sri Lanka). Content analyses were conducted on two secondary data
sources: email communications to IPDET from alumni, and a recent six-month sample of
IPDET listserv traffic.
The evaluators concluded that:
IPDET is a very successful program that is unparalleled in its ability to develop
foundational knowledge and skill in development evaluation;
there is a fairly high degree of knowledge transfer of M&E skills learned at IPDET
to the participants’ home work environment;
IPDET is helping to build local capacity and an appetite for development evaluation.
Through the efforts of alumni who are 'championing' M&E, IPDET is also making
inroads in contexts that show a readiness for development M&E.
Evaluation of the International Program for Development Evaluation Training,
Jua Consulting, Ottawa, Canada, March 2004
An independent impact evaluation by
Buchanan in 2004 considered the actual application of participants'
learning and skill development in the workplace, as well as the impact on
organizations. It demonstrated a strong fit between the program, the job, and
strategic contexts. Ninety-one percent of respondents found the content of
IPDET relevant to their work, 92% found that the selection of workshops met their
needs, and 97% reported that they applied their new skills and knowledge in
their workplace.
IDRC Tracer Study (October 2006)
From 2001 to 2006, the International Development Research Centre (IDRC)
Evaluation Unit sponsored 53 people to attend the IPDET program. Participants
included IDRC staff and project partners sponsored through IDRC programs.
In keeping with the Centre's aim of building the evaluation capacity of
partners and staff, this tracer study was designed to determine whether IPDET
is the best method of building this capacity.
Overall, the study found the IPDET program to be an effective way for IDRC
partners and staff to build their evaluation capacity.
To view the complete report, please visit the IDRC website. The report is
available in English and in French.