This section contains material on various evaluation approaches, models, and theories, and their respective proponents.
Haubrich, K. (2001). Cluster-Evaluation - lokale Kontexte berücksichtigen, Innovation und Beteiligung fördern. In BMFSFJ (Ed.), QS 35 Materialien zur Qualitätssicherung in der Kinder- und Jugendhilfe.
Millet, R. (1995). W.K. Kellogg Foundation cluster evaluation model of evolving practices. Battle Creek, MI: W.K. Kellogg Foundation.
Sanders, J. R. (1997). Cluster Evaluation. In E. Chelimsky & W. R. Shadish (Eds.), Evaluation for the 21st century. A handbook (pp. 396-404). Thousand Oaks: Sage.
"Evaluation may be done to provide feedback to people who are trying to improve something (formative evaluation); or to provide information for decision-makers who are wondering whether to fund, terminate, or purchase something (summative evaluation)." (Scriven, 1980, S. 6-7)
The term formative evaluation (not the concept) goes back to Scriven (1972) and, together with its counterpart summative evaluation, forms what is probably the most prominent pair of terms in the evaluation literature. Nevertheless, it is a problematic term: it is imprecisely defined, theoretically inconsistent, and correspondingly arbitrary in much of its practical use (see, for example, the contributions by Patton, Chen, and Wholey in Evaluation Practice, 1996, Vol. 17, No. 2).
Since the pair of terms will certainly persist despite these problems because of its strong intuitive appeal, the following usage seems sensible to me:
The terms formative/summative are used exclusively to designate intended purposes of evaluation, as Scriven's quote above suggests. All other addenda advanced by Scriven and his apologists are dispensed with. These include:
Subject: Re: Formative/Summative & Process/Outcome 2*2 Matrix?
Date: Tue, 28 Dec 2004 12:24:10 -0500
From: Eileen Stryker
I don't remember whether or where Dan Stufflebeam might have written this, but back in olden times when I took his class, he talked about how he and Scriven grew to understand that the CIPP model and Formative / Summative evaluation complement, rather than compete with, each other. It went something like what I've portrayed below (roughly and w/o the careful thought portrayed by Dan in class -- I haven't enough room or time for that right now). Context evaluation includes (but is not limited to) evaluation of goals; input includes evaluation of designs and resources; process includes implementation; and product includes effects -- outputs, outcomes, short, medium, long term, etc. These are further defined in the design phase of any evaluation study, of course. Formative focuses on providing information for program development, summative for accountability (with the extended meanings Scriven has portrayed in previous posts and writings, of course).
          | Context | Input | Process | Product
----------+---------+-------+---------+--------
Formative |         |       |         |
Summative |         |       |         |
Some sample questions might include:
Formative/Context: Are program goals responsive to participant needs? Are the goals good?
Summative/Context: Were the goals appropriate to participant needs? To the setting? What contextual factors were important to project successes/failures?
Formative/Input: What designs might be most effective to reach the goals? What can (educational, social, health, management) theory tell us about effective intervention designs?
Summative/Input: Was the project design well-founded in theory, best practice, organizational experience?
I'm sure you can fill in the rest.
Hope this helps. It has certainly helped me think about questions an evaluation might address as I meet with client groups.
Happy New Year,
Eileen
Dr. Eileen Stryker Stryker and Endias, Inc. Planning, Research and Evaluation Services Kalamazoo, Michigan 269-668-2373
----- Original Message -----
From: "Charles Partridge"
> Group,
>
> Before I reinvent the wheel, if someone out there has already put
> together a 2*2 matrix that defines the Formative/Summative &
> Process/Outcome dimensions, could you please forward it to me?
>
> Thanks in advance.
>
> Charles R. Partridge
> Evaluation Specialist
> Center for Learning Excellence
> The John Glenn Institute for Public Service and Public Policy
> The Ohio State University
> Columbus, Ohio 43212-1421
> Email: Partridge.6@osu.edu
EVALTALK - American Evaluation Association (AEA) Discussion List. See also
the website: http://www.eval.org
To unsubscribe from EVALTALK, send e-mail to listserv@bama.ua.edu
with only the following in the body: UNSUBSCRIBE EVALTALK
To get a summary of commands, send e-mail to listserv@bama.ua.edu
with only the following in the body: INFO REFCARD
To use the archives, go to this web site: http://bama.ua.edu/archives/evaltalk.html
For other problems, contact a list owner at kbolland@sw.ua.edu or carolyn.sullins@wmich.edu
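As an illustration outside the thread above, Stryker's CIPP x formative/summative crossing can be sketched as a small lookup table. The sample questions are quoted from her post; the data structure itself is only a hypothetical sketch, not part of Stufflebeam's or Scriven's work, and the remaining cells are left empty, as in the original ("fill in the rest").

```python
# Hypothetical sketch: the 2x4 matrix as a dict keyed by
# (purpose, CIPP phase). Questions quoted from the post above.
matrix = {
    ("Formative", "Context"): [
        "Are program goals responsive to participant needs?",
        "Are the goals good?",
    ],
    ("Summative", "Context"): [
        "Were the goals appropriate to participant needs? To the setting?",
        "What contextual factors were important to project successes/failures?",
    ],
    ("Formative", "Input"): [
        "What designs might be most effective to reach the goals?",
        "What can theory tell us about effective intervention designs?",
    ],
    ("Summative", "Input"): [
        "Was the project design well-founded in theory, best practice, "
        "organizational experience?",
    ],
    # Cells left to be filled in during the design phase of a study:
    ("Formative", "Process"): [],
    ("Summative", "Process"): [],
    ("Formative", "Product"): [],
    ("Summative", "Product"): [],
}

def questions(purpose: str, phase: str) -> list[str]:
    """Return the sample evaluation questions for one cell of the matrix."""
    return matrix[(purpose, phase)]
```

Filling in the remaining cells per study then amounts to appending questions to the empty lists, which keeps the purpose/phase distinction explicit when planning with client groups.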
From: "Alan Listiak"
Last week a request went out for info on logic models. I have accumulated a number of resources on "How-to" develop and use logic models in program development and evaluation. Here they are.
1. Mayeske, George W. and Michael T. Lambur (2001). How to Design Better Programs: A Staff Centered Stakeholder Approach to Program Logic Modeling. Crofton, MD: The Program Design Institute. Highly Recommended.
And, Mayeske, George W. (2002). How to Develop Better Programs & Determine Their Results: An Organic & Heuristic Client & Staff Centered Approach with Stakeholder Involvement. Bowie, MD: The Program Design Institute. Highly Recommended.
The first manual (How to Design Better Programs) is a step-by-step guide to developing and implementing logic models. The second manual (How to Develop Better Programs) focuses on how to develop experiential educational programs "based on, but not restricted to, the use of program logic models which serve as a tool for the development process." (from the Foreword).
Both manuals are available from The Program Design Institute, c/o Dr. George W. Mayeske, 12524 Knowledge Lane, Bowie, MD 20715-2622. The Logic Modeling manual is $28.00 (includes shipping) and the Better Programs manual is $45.00 (including shipping) - checks only. But both manuals can be purchased at a discount. Contact Dr. Mayeske for details at gwmayeske@aol.com.
2. W. K. Kellogg Foundation (2001). W. K. Kellogg Foundation Logic Model Development Guide. Available for no cost at http://www.wkkf.org/ by clicking on the link to the guide on the right of the page.
This guide is not as detailed as the Program Design Institute guides on the nuts and bolts of logic modeling, but is better at discussing program theory and its application. And it's free for the downloading. Highly Recommended.
Also see: W. K. Kellogg Foundation (1998). W. K. Kellogg Foundation Evaluation Handbook. Available at no cost through this site at http://www.wkkf.org/ by clicking on the link to the handbook.
3. Devine, Patricia (1999). Using Logic Models in Substance Abuse Treatment Evaluations. Fairfax, VA: National Evaluation Data and Technical Assistance Center, Caliber Associates. Available at
http://www.calib.com/home/work_samples/files/logicmdl.pdf.
Highly Recommended.
This paper discusses the use of logic models in planning and evaluating substance abuse treatment services. The best part is the "sample data maps" that specify evaluation questions, measures, and variables.
The paper is part of the Integrated Evaluation Methods Package
for substance abuse treatment programs developed under the auspices of the Center for Substance Abuse Treatment, Department of Health and Human Services. The full discussion of this evaluation framework, concepts, and tools is presented in: Devine, Patricia (1999). A Guide for Substance Abuse Treatment Knowledge-Generating Activities. Fairfax, VA:
National Evaluation Data and Technical Assistance Center, Caliber
Associates. Available at http://www.calib.com/home/work_samples/files/iemdoc.pdf.
There are other papers in the Integrated Evaluation Methods Package available at http://www.calib.com/home/work_samples/pubs.cfm under the heading Substance Abuse Research and Evaluation, Evaluation Tools and Resources. These papers include:
Devine, Patricia (1999). A Guide to Process Evaluation of Substance Abuse Treatment Services. Fairfax, VA: National Evaluation Data and Technical Assistance Center, Caliber Associates.
Devine, Patricia, Bullman, Stephanie, & Zeaske, Jessica (1999). Substance Abuse Treatment Evaluation Product Outlines Notebook. Fairfax, VA: National Evaluation Data and Technical Assistance Center, Caliber Associates.
Devine, Patricia, Christopherson, Eric, Bishop, Sharon, Lowery, Jacquelyn, & Moore, Melody (1999). Self-Adjusting Treatment Evaluation Model. Fairfax, VA: National Evaluation Data and Technical Assistance Center, Caliber Associates.
4. The University of Wisconsin-Cooperative Extension has an online course entitled, Enhancing Program Performance with Logic Models. The course contains two modules - Module 1, "Logic Model Basics," is an introduction to logic models; and Module 2, "Introducing The Community Nutrition Education Logic Model," is an application of logic models to community nutrition education programs. Each module has various interactive elements, including practice activities designed to help students better understand the course content. The free course is available at http://www1.uwex.edu/ces/lmcourse/. The citation is:
Taylor-Powell, E., Jones, L., & Henert, E. (2002) Enhancing Program Performance with Logic Models. Retrieved December 1, 2003, from the University of Wisconsin-Extension web site: http://www1.uwex.edu/ces/lmcourse/.
5. United Way of America (1996). Measuring Program Outcomes: A Practical Approach. This manual can be purchased for $5.00 plus S&H by calling 1-800-772-0008 and ordering item number 0989. You can find the manual's table of contents and excerpts on the United Way web site at http://national.unitedway.org/outcomes/resources/mpo/.
6. Harrell, Adele, with Burt, Martha, Hatry, Harry, Rossman, Shelli, Roth, Jeffrey, and Sabol, William (no date). Evaluation Strategies for Human Service Programs: A Guide for Policymakers and Providers. Washington, DC: The Urban Institute.
This guide focuses on developing a logic model and selecting and implementing an evaluation design. It gives an example of a logic model for a children-at-risk program. It is available at http://www.bja.evaluationwebsite.org/html/documents/evaluation_strategies.html.
7. Hernandez, M. & Hodges, S. (2003). Crafting Logic Models for Systems of Care: Ideas into Action. Making children's mental health services successful series, volume 1. Tampa, FL: University of South Florida, The Louis de la Parte Florida Mental Health Institute, Department of Child & Family Studies. Available at http://cfs.fmhi.usf.edu/TREAD/CMHseries/IdeasIntoAction.html.
This monograph is a guide to developing a system of care using a theory-based approach. System stakeholders can use the theory of change approach to move from ideas to action-oriented strategies to achieve their goals and understand the relationships among the populations that the system is intended to serve.
Other resources
Alter, C. & Murty, S. (1997). Logic modeling: A tool for teaching practice evaluation. Journal of Social Work Education, 33(1), 103-117.
Conrad, Kendon J., & Randolph, Frances L. (1999). Creating and using logic models: Four perspectives. Alcoholism Treatment Quarterly, 17(1-2), 17-32.
Hernandez, Mario (2000). Using logic models and program theory to build outcome accountability. Education and Treatment of Children, 23(1), 24-41.
Julian, David A. (1997). The utilization of the logic model as a system level planning and evaluation device. Evaluation and Program Planning, 20(3), 251-257.
McLaughlin, J. A., & Jordan, G. B. (1999). Logic models: A tool for telling your program's performance story. Evaluation and Program Planning, 22(1), 65-72.
Stinchcomb, Jeanne B. (2001). Using logic modeling to focus evaluation efforts: Translating operational theories into practical measures. Journal of Offender Rehabilitation, 33(2), 47-65.
Unrau, Y.A. (2001). Using client exit interviews to illuminate outcomes in program logic models: A case example. Evaluation and Program Planning, 24(4), 353-361.
Alan
Alan Listiak, Ph.D. Coordinator of Sex Offender Program Certification Minnesota Department of Corrections 1450 Energy Park Drive St. Paul, MN 55108 651.642.0317 Alan.Listiak@state.mn.us
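The guides listed above share a common logic-model chain (inputs, activities, outputs, outcomes, long-term impact). Purely as an illustrative sketch under that assumption - the structure and the example program below are hypothetical and not taken from any of the cited guides - the chain can be represented as a simple data structure:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal sketch of a program logic model (hypothetical structure)."""
    program: str
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products
    outcomes: list[str] = field(default_factory=list)    # short/medium-term changes
    impact: list[str] = field(default_factory=list)      # long-term change

    def chain(self) -> str:
        """Render the if-then chain as a one-line summary ('-' marks empty cells)."""
        labels = ["Inputs", "Activities", "Outputs", "Outcomes", "Impact"]
        parts = [self.inputs, self.activities, self.outputs,
                 self.outcomes, self.impact]
        return " -> ".join(
            f"{label}: {', '.join(items) or '-'}"
            for label, items in zip(labels, parts)
        )

# Hypothetical example, loosely echoing the UW-Extension nutrition module:
model = LogicModel(
    program="Community nutrition education",
    inputs=["staff", "funding"],
    activities=["workshops"],
    outputs=["participants trained"],
    outcomes=["improved eating habits"],
)
```

Writing the chain down this way makes the gap between what a program produces (outputs) and what it changes (outcomes) explicit, which is the distinction most of the manuals above spend their effort on.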