Posted by ""Keith Linard"" <
klin4960@bigpond.net.au>
Offline I received the following comment on my previous posting:
<<I agree completely about the correct problem.
<<But what is a correct problem?
In response I have attached an extract from an Australian Federal Department
of Finance manual that I wrote, as Chief Finance Officer, in 1986:
"Evaluating Government Programs - A Handbook". Whilst originally focussed on
program evaluation generally, I suggest the checklist is equally relevant to
the process of undertaking any SDM project (and is consistent, e.g., with
processes such as Vennix's 'Group Model Building' or Eden's 'Cognitive
Mapping'). It also addresses issues such as whether there should be a
steering committee (comprising, e.g., various stakeholders including QA).
All major management consulting firms have developed internal methodologies,
which would cover most if not all of the points I have noted, to ensure
quality control. I believe every SDM consultant should apply such a
framework, adapted to their particular consulting niche.
APPENDIX A: STEPS IN THE PRE-EVALUATION ASSESSMENT * A CHECKLIST *
1. Define purpose of the evaluation:
* background to the evaluation
* audience
2. Define the nature, scope and objectives of the program:
* nature of problem addressed by program
* program authority or mandate
* program objectives
* actual or planned resource use
3. Analyse program logic:
* logical links between inputs, outputs, outcomes and objectives
4. Specify alternative ways of meeting program objectives
5. Identify key evaluation issues:
* program rationale
* impacts and effects
* objectives achievement
* alternatives
6. Identify evaluation constraints:
* time, cost, expertise and credibility
7. Assess appropriate evaluation designs:
* management issues
* alternative evaluation methods
* data collection issues
* data analysis issues
8. Develop strategy for evaluation study:
* terms of reference
* preliminary work plan
* consider input from other agencies
* consider composition of steering committee and evaluation team
* prepare pre-evaluation assessment report
(Steps 9 & 10 taken from Appendix B - 'The Evaluation Study')
9. Do the evaluation study:
* get agreement on terms of reference
* assign steering committee and study team
* approve work program
* undertake evaluation
* communicate results.
10. Suggested framework of evaluation report:
* Executive Summary
* Introduction
* Substance of the Report
* Recommendations
* Resource Issues
* Appendices
_______________________________________________
A-1: PURPOSE OF THE EVALUATION * CHECKLIST FOR STEP 1 *
1. What are the objectives of the evaluation?
2. Who, or what event, initiated the evaluation?
(e.g., routine evaluation cycle, Auditor-General's report, ministerial
request).
3. What is the stated reason for conducting the evaluation?
(e.g., assist development of program proposal, review of agency
priorities).
4. What is the 'hidden agenda', if any?
(e.g., defuse public controversy, answer criticism, provide rationale
for abolishing program).
5. Who are the primary audience for the evaluation report and what
authority do they have over program resourcing or management?
6. Which other key decision makers have a strong interest in the
evaluation and what influence do they have on program decisions?
7. Have the decision makers' needs and expectations been determined?
8. To what phase of the program development and implementation cycle
will the evaluation relate?
(e.g., new policy proposal, review of existing program, review of
completed program).
9. What issues are of particular interest?
(e.g., matters raised in Parliament or by the Auditor-General;
achievement of key program objectives; cost effectiveness).
10. How important is each issue?
(e.g., in terms of political impact, cost, or scope for improved
performance).
_______________________________________________
A-2: NATURE, SCOPE AND OBJECTIVES OF THE PROGRAM * CHECKLIST FOR STEP 2 *
1. What is the mandate or authority for the program and are program
activities consistent with this?
2. What are the stated objectives of the program?
3. What were the catalysts which led to the development of the program?
(e.g., who were the key proponents, what studies/inquiries recommended
this approach?)
4. What key needs, gaps in services, or problems is/was the program
intended to solve?
5. What results are/were expected from the program?
6. What reasons are/were there for believing that the program would be
effective in achieving these results?
7. Is there a clear and unambiguous definition of the target group at
which the program is aimed?
8. Have program implementation or other changes in the social/political
environment affected the relevance of the original program objectives
or introduced new objectives?
(e.g., changes in demographic profile, strong popular support for
program, creation of perceived "rights" to a benefit.)
9. What would be the consequences if the new program were introduced
(or an existing one abolished)?
Who would be affected? Who would complain and who would be glad? Why?
10. What measures or criteria were identified at the program development
and implementation phase as appropriate output and outcome indicators?
11. Are these performance indicators still relevant?
12. In the light of program operation experience, are there other
performance indicators which are more relevant or which assist further
in understanding the success or otherwise of the program?
13. In respect of each performance indicator, were targets (standards,
levels of service) set; when; by whom; with what justification; and
were they achieved?
___________________________________________
A-3: ANALYSE THE PROGRAM LOGIC * CHECKLIST FOR STEP 3 *
1. Specify the ultimate objectives of the program.
2. Subdivide the operation of the program into a manageable number of
major activities or phases
(between 5 and 10 segments is usually appropriate).
3. Specify intermediate objectives relevant to each phase/activity
(there should be at least one intermediate objective for each of the
program's ultimate objectives).
4. Identify the inputs and intended outputs or outcomes of each of
these major phases.
5. Identify significant secondary outputs/outcomes (whether desirable
or undesirable).
6. Specify the perceived logical relationships
(i.e., how a particular phase is supposed to achieve the intermediate
objectives) and the implicit assumptions underlying the relationships
between the inputs, the outputs, the outcomes and the intermediate or
final objectives.
7. Confirm with the program managers and line managers that the model
is a realistic representation of what happens or, for a new program,
is supposed to happen.
8. Identify the evaluation questions which are of interest in respect
to each phase
(these should directly address each assumption in point 6).
9. Specify performance indicators which can be used to monitor or
answer the evaluation questions in point 8.
10. Assess, in conjunction with program managers, what are the critical
assumptions and the corresponding key performance indicators.
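The logic-modelling steps above lend themselves to a simple data structure. As a minimal sketch (the class names and the job-training example below are purely illustrative, not from the handbook), one might record each phase with its inputs, outputs, intermediate objectives and assumptions, and then check the Step 3 rule that every ultimate objective is supported by at least one intermediate objective:

```python
# Illustrative sketch of a program logic model: phases carry inputs,
# outputs, intermediate objectives and assumptions; the model can report
# any ultimate objective with no supporting intermediate objective.
from dataclasses import dataclass, field

@dataclass
class Phase:
    name: str
    inputs: list
    outputs: list
    intermediate_objectives: list  # each should map to an ultimate objective
    assumptions: list = field(default_factory=list)  # candidate evaluation questions

@dataclass
class ProgramLogicModel:
    ultimate_objectives: list
    phases: list

    def uncovered_objectives(self):
        """Ultimate objectives lacking any supporting intermediate objective."""
        covered = {obj for p in self.phases for obj in p.intermediate_objectives}
        return [obj for obj in self.ultimate_objectives if obj not in covered]

# Hypothetical example: a job-training program.
model = ProgramLogicModel(
    ultimate_objectives=["reduce long-term unemployment"],
    phases=[
        Phase(
            name="training delivery",
            inputs=["trainers", "funding"],
            outputs=["participants trained"],
            intermediate_objectives=["reduce long-term unemployment"],
            assumptions=["training improves employability"],
        )
    ],
)
print(model.uncovered_objectives())  # [] -> every ultimate objective is covered
```

The assumptions recorded against each phase are exactly the evaluation questions of point 8, and a phase whose intermediate objectives cover no ultimate objective is a warning sign under point 10.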
___________________________________________________
A-4: IDENTIFY ALTERNATIVES * CHECKLIST FOR STEP 4 *
1. Review history of program and the reasons for the current
evaluation.
2. Review reports, journals, etc. on approaches to the problem in
question.
3. Use structured problem-solving techniques (Delphi, brainstorming,
etc.) with groups of professionals, clients, etc.
4. Undertake screening of options.
5. Report briefly on discarded options.
6. Include selected options for analysis.
__________________________________________________
A-5: IDENTIFY KEY EVALUATION ISSUES * CHECKLIST FOR STEP 5 *
1. Program Rationale
* Does the Program make sense?
* Are the objectives still relevant?
2. Impacts and Effects
* What has happened as a result of the Program?
3. Objectives Achievement
* Has the Program achieved what was expected?
4. Alternatives
* Are there better ways of achieving the results?
________________________________________
A-6: IDENTIFY EVALUATION CONSTRAINTS * CHECKLIST FOR STEP 6 *
1. Time
2. Cost
3. Expertise
4. Credibility
5. Political and Social Environment
_______________________________________________
A-7: ASSESS APPROPRIATE EVALUATION DESIGNS * CHECKLIST FOR STEP 7 *
1. Specify those activities or changes (due to the program) which must
be measured.
2. Identify sources of data.
3. Decide appropriate means of measuring changes due to program as
distinct from changes due to non-program factors.
4. Decide procedures for obtaining data (e.g., sample survey, automatic
monitoring, simulation, modelling).
5. Decide appropriate analytical approaches for analysing the data.
6. Decide how the results are to be aggregated and presented.
_______________________________________________
A-8: DEVELOP STRATEGY FOR EVALUATION STUDY * CHECKLIST FOR STEP 8 *
1. Prepare terms of reference or brief which includes a clear and
unambiguous statement of the purpose and nature of the evaluation
(key issues to be addressed, the specific questions to be answered,
the audience, the timing and the decision context).
2. Prepare preliminary work plan indicating
* how the purpose of the study is to be achieved;
* when each and all tasks are to be completed; and
* what evaluation products are required.
3. Provide a clear statement of procedures for review of progress and
of any key decision points.
4. Provide a clear statement of time and resource constraints, and of
procedures for amending these.
5. Consider input from other agencies, and the composition of the
steering committee and evaluation team; identify official points of
contact among program officials and, where appropriate, clients.
6. Prepare an outline of procedures for amending the evaluation work
plan should this subsequently be required.
___________________________________________________
A-9: DO THE EVALUATION STUDY * CHECKLIST FOR STEP 9 *
1. Get executive agreement on the terms of reference, proposed work
plan, review and decision points, time and resource expectations, data
and personnel involvement expectations, and work program change
procedures.
2. Assign steering committee and study team; decide on the extent of
involvement of other agencies including central agencies, etc.
3. Prepare detailed work plan.
4. Prepare time, resources and methodology schedule for data
collection, analysis and reporting.
5. Undertake evaluation (e.g., collect data, test reliability, document
and analyse data).
6. Communicate results.
__________________________________________________
A-10: SUGGESTED OUTLINE FOR EVALUATION REPORTS * CHECKLIST FOR STEP 10 *
1. Executive Summary
* a brief statement of evaluation objectives and methods
* a summary of major findings and conclusions
* recommendations and matters needing further consideration
2. Introduction
* terms of reference for the study
* identification of constraints on the study
* statements of key assumptions and values underlying the report
3. The Substance of the Report
(a) Program (Element) Description
* a statement of the mandate and key objectives of the program
* an exposition of the logic of the program
* definition of key concepts
(b) Summary of Analyses Conducted
* justification for key evaluation issues addressed
* description of data collection procedures and measurement devices
* outline of collection results, with indication of reliability
4. Findings and Conclusions
* results of analysis related to program (element) objectives
* overall findings and discrepancies between these and program
objectives
* conclusions organised in terms of major evaluation study issues
5. Recommendations
* recommendations set out to show derivation from findings and
conclusions
* alternative options considered and reasons for rejection
* matters recommended for further study and estimates of the required
resources.
6. Resource Issues
* resources required to implement the recommendations
* offsetting savings
7. Appendices
* detailed documentation of data collection and analysis procedures
* list of references
* list of staff/organisations consulted during the study
* list of steering committee and study team members.
_____________________________________
Keith Linard
Posted by ""Keith Linard"" <
klin4960@bigpond.net.au>
posting date Sat, 17 Nov 2007 07:58:02 +1100
_______________________________________________