- Hardcover
Other customers were also interested in
- Qualitative Inquiry in Evaluation, 90,99 €
- Sue C. Funnell, Purposeful Program Theory, 102,99 €
- Brenda Russell, Online Research Essentials, 67,99 €
- Don A. Dillman, Internet, Phone, Mail, and Mixed-Mode Surveys, 95,99 €
- William E. Martin, Quantitative and Statistical Research Methods, 105,99 €
- Robert A. Hanneman, Basic Statistics for Social Research, 100,99 €
- David Thornburg, From the Campfire to the Holodeck, 28,99 €
The gold standard evaluation reference text
Now in its second edition, Evaluation Theory, Models, and Applications is the vital text on evaluation models, perfect for classroom use as a textbook, and as a professional evaluation reference. The book begins with an overview of the evaluation field and program evaluation standards, and proceeds to cover the most widely used evaluation approaches. With new evaluation designs and the inclusion of the latest literature from the field, this Second Edition is an essential update for professionals and students who want to stay current. Understanding and choosing evaluation approaches is critical to many professions, and Evaluation Theory, Models, and Applications, Second Edition is the benchmark evaluation guide.
Authors Daniel L. Stufflebeam and Chris L. S. Coryn, widely considered experts in the evaluation field, introduce and describe 23 program evaluation approaches, including, new to this edition, transformative evaluation, participatory evaluation, consumer feedback, and meta-analysis. Evaluation Theory, Models, and Applications, Second Edition facilitates the process of planning, conducting, and assessing program evaluations. The highlighted evaluation approaches include:
- Experimental and quasi-experimental design evaluations
- Daniel L. Stufflebeam's CIPP Model
- Michael Scriven's Consumer-Oriented Evaluation
- Michael Patton's Utilization-Focused Evaluation
- Robert Stake's Responsive/Stakeholder-Centered Evaluation
- Case Study Evaluation
Key readings listed at the end of each chapter direct readers to the most important references for each topic. Learning objectives, review questions, student exercises, and instructor support materials complete the collection of tools. Choosing among evaluation approaches can be an overwhelming process, but Evaluation Theory, Models, and Applications, Second Edition updates the core evaluation concepts with the latest research, making this complex field accessible in just one book.
Note: This item can only be shipped to a German delivery address.
Product details
- Series: Research Methods for the Social Sciences
- Publisher: Wiley & Sons
- 2nd edition
- Number of pages: 800
- Publication date: October 13, 2014
- Language: English
- Dimensions: 244mm x 182mm x 48mm
- Weight: 1465g
- ISBN-13: 9781118074053
- ISBN-10: 111807405X
- Item no.: 40888465
- Manufacturer information
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- 06621 890
DANIEL L. STUFFLEBEAM, PhD, is Distinguished University Professor Emeritus at Western Michigan University, Kalamazoo. CHRIS L. S. CORYN, PhD, is director of the Interdisciplinary PhD in Evaluation (IDPE) program and assistant professor in the Evaluation, Measurement, and Research (EMR) program at Western Michigan University. He is the executive editor of the Journal of MultiDisciplinary Evaluation.
List of Figures, Tables, and Exhibits xiii
Dedication xvii
Preface xix
Acknowledgments xxiii
The Author xxv
Introduction xxvii
Changes to the First Edition xxviii
Intended Audience xxviii
Overview of the Book's Contents xxix
Study Suggestions xxxii
Part One: Fundamentals of Evaluation 1
1 OVERVIEW OF THE EVALUATION FIELD 3
What Are Appropriate Objects of Evaluations and Related Subdisciplines of Evaluation? 3
Are Evaluations Enough to Control Quality, Guide Improvement, and Protect Consumers? 4
Evaluation as a Profession and Its Relationship to Other Professions 4
What Is Evaluation? 6
How Good Is Good Enough? How Bad Is Intolerable? How Are These Questions Addressed? 17
What Are Performance Standards? How Should They Be Applied? 18
Why Is It Appropriate to Consider Multiple Values? 20
Should Evaluations Be Comparative, Noncomparative, or Both? 21
How Should Evaluations Be Used? 21
Why Is It Important to Distinguish Between Informal Evaluation and Formal Evaluation? 26
How Do Service Organizations Meet Requirements for Public Accountability? 27
What Are the Methods of Formal Evaluation? 29
What Is the Evaluation Profession, and How Strong Is It? 29
What Are the Main Historical Milestones in the Evaluation Field's Development? 30
2 EVALUATION THEORY 45
General Features of Evaluation Theories 45
Theory's Role in Developing the Program Evaluation Field 47
Functional and Pragmatic Bases of Extant Program Evaluation Theory 48
A Word About Research Related to Program Evaluation Theory 49
Program Evaluation Theory Defined 50
Criteria for Judging Program Evaluation Theories 52
Theory Development as a Creative Process Subject to Review and Critique by Users 56
Status of Theory Development in the Program Evaluation Field 57
Importance and Difficulties of Considering Context in Theories of Program Evaluation 58
Need for Multiple Theories of Program Evaluation 58
Hypotheses for Research on Program Evaluation 59
Potential Utility of Grounded Theories 62
Potential Utility of Metaevaluations in Developing Theories of Program Evaluation 63
Program Evaluation Standards and Theory Development 63
3 STANDARDS FOR PROGRAM EVALUATIONS 69
The Need for Evaluation Standards 71
Background of Standards for Program Evaluations 73
Joint Committee Program Evaluation Standards 74
American Evaluation Association Guiding Principles for Evaluators 80
Government Auditing Standards 83
Using Evaluation Standards 97
Part Two: An Evaluation of Evaluation Approaches and Models 105
4 BACKGROUND FOR ASSESSING EVALUATION APPROACHES 107
Evaluation Approaches 109
Importance of Studying Alternative Evaluation Approaches 109
The Nature of Program Evaluation 110
Previous Classifications of Alternative Evaluation Approaches 110
Caveats 112
5 PSEUDOEVALUATIONS 117
Background and Introduction 117
Approach 1: Public Relations Studies 119
Approach 2: Politically Controlled Studies 120
Approach 3: Pandering Evaluations 122
Approach 4: Evaluation by Pretext 123
Approach 5: Empowerment Under the Guise of Evaluation 125
Approach 6: Customer Feedback Evaluation 127
6 QUASI-EVALUATION STUDIES 133
Quasi-Evaluation Approaches Defined 133
Functions of Quasi-Evaluation Approaches 134
General Strengths and Weaknesses of Quasi-Evaluation Approaches 134
Approach 7: Objectives-Based Studies 135
Approach 8: The Success Case Method 137
Approach 9: Outcome Evaluation as Value-Added Assessment 143
Approach 10: Experimental and Quasi-Experimental Studies 147
Approach 11: Cost Studies 152
Approach 12: Connoisseurship and Criticism 155
Approach 13: Theory-Based Evaluation 158
Approach 14: Meta-Analysis 164
7 IMPROVEMENT- AND ACCOUNTABILITY-ORIENTED EVALUATION APPROACHES 173
Improvement- and Accountability-Oriented Evaluation Defined 173
Functions of Improvement- and Accountability-Oriented Approaches 174
General Strengths and Weaknesses of Decision- and Accountability-Oriented Approaches 174
Approach 15: Decision- and Accountability-Oriented Studies 174
Approach 16: Consumer-Oriented Studies 181
Approach 17: Accreditation and Certification 184
8 SOCIAL AGENDA AND ADVOCACY EVALUATION APPROACHES 191
Overview of Social Agenda and Advocacy Approaches 191
Approach 18: Responsive or Stakeholder-Centered Evaluation 192
Approach 19: Constructivist Evaluation 197
Approach 20: Deliberative Democratic Evaluation 202
Approach 21: Transformative Evaluation 205
9 ECLECTIC EVALUATION APPROACHES 213
Overview of Eclectic Approaches 213
Approach 22: Utilization-Focused Evaluation 214
Approach 23: Participatory Evaluation 219
10 BEST APPROACHES FOR TWENTY-FIRST-CENTURY EVALUATIONS 229
Selection of Approaches for Analysis 230
Methodology for Analyzing and Evaluating the Nine Approaches 230
Our Qualifications as Raters 230
Conflicts of Interest Pertaining to the Ratings 231
Standards for Judging Evaluation Approaches 231
Comparison of 2007 and 2014 Ratings 236
Issues Related to the 2011 Program Evaluation Standards 237
Overall Observations 237
The Bottom Line 240
Part Three: Explication of Selected Evaluation Approaches 247
11 EXPERIMENTAL AND QUASI-EXPERIMENTAL DESIGN EVALUATIONS 249
Chapter Overview 249
Basic Requirements of Sound Experiments 250
Prospective Versus Retrospective Studies of Cause 251
Uses of Experimental Design 251
Randomized Controlled Experiments in Context 252
Suchman and the Scientific Approach to Evaluation 256
Contemporary Concepts Associated with the Experimental and Quasi-Experimental Design Approach to Evaluation 265
Exemplars of Large-Scale Experimental and Quasi-Experimental Design Evaluations 269
Guidelines for Designing Experiments 271
Quasi-Experimental Designs 280
12 CASE STUDY EVALUATIONS 291
Overview of the Chapter 291
Overview of the Case Study Approach 292
Case Study Research: The Views of Robert Stake 294
Case Study Research: The Views of Robert Yin 297
Particular Case Study Information Collection Methods 301
13 DANIEL STUFFLEBEAM'S CIPP MODEL FOR EVALUATION: AN IMPROVEMENT- AND ACCOUNTABILITY-ORIENTED APPROACH 309
Overview of the Chapter 309
CIPP Model in Context 309
Overview of the CIPP Categories 312
Formative and Summative Uses of Context, Input, Process, and Product Evaluations 313
Philosophy and Code of Ethics Underlying the CIPP Model 314
The Model's Values Component 317
Using the CIPP Framework to Define Evaluation Questions 319
Delineation of the CIPP Categories and Relevant Procedures 319
Use of the CIPP Model as a Systems Strategy for Improvement 332
14 MICHAEL SCRIVEN'S CONSUMER-ORIENTED APPROACH TO EVALUATION 341
Overview of Scriven's Contributions to Evaluation 341
Scriven's Background 343
Scriven's Basic Orientation to Evaluation 343
Scriven's Definition of Evaluation 343
Critique of Other Persuasions 344
Formative and Summative Evaluation 345
Amateur Versus Professional Evaluation 347
Intrinsic and Payoff Evaluation 347
Goal-Free Evaluation 347
Needs Assessment 348
Scoring, Ranking, Grading, and Apportioning 349
Checklists 352
Key Evaluation Checklist 353
The Final Synthesis 354
Metaevaluation 357
Evaluation Ideologies 357
Avenues to Causal Inference 361
Product Evaluation 363
Professionalization of Evaluation 366
Scriven's Look to Evaluation's Future 366
15 ROBERT STAKE'S RESPONSIVE OR STAKEHOLDER-CENTERED EVALUATION APPROACH 373
Stake's Professional Background 374
Factors Influencing Stake's Development of Evaluation Theory 374
Stake's 1967 "Countenance of Educational Evaluation" Article 375
Responsive Evaluation Approach 383
Substantive Structure of Responsive Evaluation 390
Functional Structure of Responsive Evaluation 390
An Application of Responsive Evaluation 392
Stake's Recent Rethinking of Responsive Evaluation 397
16 MICHAEL PATTON'S UTILIZATION-FOCUSED EVALUATION 403
Adherents of Utilization-Focused Evaluation 404
Some General Aspects of Patton's Utilization-Focused Evaluation 405
Intended Users of Utilization-Focused Evaluation 407
Focusing a Utilization-Focused Evaluation 407
The Personal Factor as Vital to an Evaluation's Success 408
The Evaluator's Roles 408
Utilization-Focused Evaluation and Values and Judgments 409
Employing Active-Reactive-Adaptive Processes to Negotiate with Users 410
Patton's Eclectic Approach 411
Planning Utilization-Focused Evaluations 411
Collecting and Analyzing Information and Reporting Findings 412
Summary of Premises of Utilization-Focused Evaluation 413
Strengths of the Utilization-Focused Evaluation Approach 414
Limitations of the Utilization-Focused Evaluation Approach 415
Part Four: Evaluation Tasks, Procedures, and Tools 421
17 IDENTIFYING AND ASSESSING EVALUATION OPPORTUNITIES 423
Sources of Evaluation Opportunities 423
Bidders' Conferences 431
18 FIRST STEPS IN ADDRESSING EVALUATION OPPORTUNITIES 435
Developing the Evaluation Team 436
Developing Thorough Familiarity with the Need for the Evaluation 437
Stipulating Standards for Guiding and Assessing the Evaluation 437
Establishing Institutional Support for the Projected Evaluation 437
Developing the Evaluation Proposal's Appendix 438
Planning for a Stakeholder Review Panel 439
19 DESIGNING EVALUATIONS 445
A Design Used for Evaluating the Performance Review System of a Military Organization 446
Generic Checklist for Designing Evaluations 462
20 BUDGETING EVALUATIONS 479
Ethical Imperatives in Budgeting Evaluations 480
Fixed-Price Budget for Evaluating a Personnel Evaluation System 483
Other Types of Evaluation Budgets 486
Generic Checklist for Developing Evaluation Budgets 493
21 CONTRACTING EVALUATIONS 505
Definitions of Evaluation Contracts and Memorandums of Agreement 506
Rationale for Evaluation Contracting 508
Addressing Organizational Contracting Requirements 511
Negotiating Evaluation Agreements 511
Evaluation Contracting Checklist 512
22 COLLECTING EVALUATIVE INFORMATION 519
Key Standards for Information Collection 519
An Information Collection Framework 540
Useful Methods for Collecting Information 543
23 ANALYZING AND SYNTHESIZING INFORMATION 557
General Orientation to Analyzing and Synthesizing Information 558
Principles for Analyzing and Synthesizing Information 559
Analysis of Quantitative Information 560
Analysis of Qualitative Information 575
Justified Conclusions and Decisions 580
24 COMMUNICATING EVALUATION FINDINGS 589
Review of Pertinent Analysis and Advice from Previous Chapters 590
Complex Needs and Challenges in Reporting Evaluation Findings 591
Establishing Conditions to Foster Use of Findings 592
Providing Interim Evaluative Feedback 600
Preparing and Delivering the Final Report 603
Providing Follow-Up Support to Enhance an Evaluation's Impact 619
Part Five: Metaevaluation and Institutionalizing and Mainstreaming Evaluation 629
25 METAEVALUATION: EVALUATING EVALUATIONS 631
Rationale for Metaevaluation 632
Evaluator and Client Responsibilities in Regard to Metaevaluation 634
Formative and Summative Metaevaluations 634
A Conceptual and Operational Definition of Metaevaluation 634
An Instructive Metaevaluation Case 640
Metaevaluation Tasks 643
Metaevaluation Arrangements and Procedures 647
Comparative Metaevaluations 662
Checklists for Use in Metaevaluations 664
The Role of Context and Resource Constraints 664
26 INSTITUTIONALIZING AND MAINSTREAMING EVALUATION 671
Review of This Book's Themes 671
Overview of the Remainder of the Chapter 672
Rationale and Key Principles for Institutionalizing and Mainstreaming Evaluation 673
Early Efforts to Help Organizations Institutionalize Evaluation 674
Recent Advances of Use in Institutionalizing and Mainstreaming Evaluation 675
Checklist for Use in Institutionalizing and Mainstreaming Evaluation 676
Glossary 691
References 713
Index 744