J. Michael Bamberger (Independent Consultant), Linda S. Mabry (Washington State University)
RealWorld Evaluation
Working Under Budget, Time, Data, and Political Constraints
- Paperback
RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints addresses the challenges of conducting program evaluations in real-world contexts where evaluators and their clients face budget, time, data, and political constraints. The Third Edition adds a new chapter on gender equality and women's empowerment, along with a discussion of digital technology and data science.
Other customers were also interested in
- Kristen L. Rohanna, Leading Change Through Evaluation, 48,99 €
- Core Concepts in Evaluation, 103,99 €
- Leila Kahwati, Qualitative Comparative Analysis in Mixed Methods Research and Evaluation, 77,99 €
- Michael Quinn Patton, Utilization-Focused Evaluation, 140,99 €
- Anastasia (Tessie) Tzavaras Catsambas, Evaluation Management, 98,99 €
- Egon G. Guba, Fourth Generation Evaluation, 101,99 €
- Laura R. Peck, Experimental Evaluation Design for Program Improvement, 48,99 €
Note: This item can only be shipped to a delivery address in Germany.
Product Details
- Publisher: SAGE Publications Inc
- Edition: 3rd revised edition
- Number of pages: 568
- Publication date: 27 September 2019
- Language: English
- Dimensions: 254 mm x 202 mm x 27 mm
- Weight: 1222 g
- ISBN-13: 9781544318783
- ISBN-10: 1544318782
- Item no.: 54471994
Michael Bamberger has been involved in development evaluation for fifty years. Beginning in Latin America, where he worked in urban community development and evaluation for over a decade, he became interested in the coping strategies of low-income communities, how they were affected by development efforts, and how they influenced them in turn. Most evaluation research fails to capture these survival strategies and frequently underestimates the resilience of these communities, particularly women and female-headed households. During his 20 years with the World Bank he served as monitoring and evaluation advisor for the Urban Development Department, evaluation training coordinator with the Economic Development Department, and Senior Sociologist in the Gender and Development Department. Since retiring from the Bank in 2001, he has worked as a development evaluation consultant with more than 10 UN agencies as well as development banks, bilateral development agencies, NGOs, and foundations. Since 2001 he has also been on the faculty of the International Program for Development Evaluation Training (IPDET). Recent publications include: (with Jim Rugh and Linda Mabry) RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints (second edition, 2012); (with Marco Segone) How to Design and Manage Equity-Focused Evaluations (2011); Engendering Monitoring and Evaluation (2013); (with Linda Raftree) Emerging Opportunities: Monitoring and Evaluation in a Tech-Enabled World (2014); and (with Marco Segone and Shravanti Reddy) How to Integrate Gender Equality and Social Equity in National Evaluation Policies and Systems (2014).
List of Boxes, Figures, and Tables
List of Appendices
Foreword by Jim Rugh
Preface
Acknowledgments
About the Authors
PART I: THE SEVEN STEPS OF THE REALWORLD EVALUATION APPROACH
Chapter 1. Overview: RealWorld Evaluation and the Contexts in Which It Is Used
1. Welcome to RealWorld Evaluation 2. The RealWorld Evaluation Context 3. The Four Types of Constraints Addressed by the RealWorld Approach 4. Additional Organizational and Administrative Challenges 5. The RealWorld Approach to Evaluation Challenges 6. Who Uses RealWorld Evaluation, for What Purposes, and When? Summary. Further Reading.
Chapter 2. First Clarify the Purpose: Scoping the Evaluation
1. Stakeholder Expectations of Impact Evaluations 2. Understanding Information Needs 3. Developing the Program Theory Model 4. Identifying the Constraints to Be Addressed by RWE and Determining the Appropriate Evaluation Design 5. Developing Designs Suitable for RealWorld Evaluation Conditions Summary. Further Reading.
Chapter 3. Not Enough Money: Addressing Budget Constraints
1. Simplifying the Evaluation Design 2. Clarifying Client Information Needs 3. Using Existing Data 4. Reducing Costs by Reducing Sample Size 5. Reducing Costs of Data Collection and Analysis 6. Assessing the Feasibility and Utility of Using New Information Technology (NIT) to Reduce the Costs of Data Collection 7. Threats to Validity of Budget Constraints Summary. Further Reading.
Chapter 4. Not Enough Time: Addressing Scheduling and Other Time Constraints
1. Similarities and Differences Between Time and Budget Constraints 2. Simplifying the Evaluation Design 3. Clarifying Client Information Needs and Deadlines 4. Using Existing Documentary Data 5. Reducing Sample Size 6. Rapid Data-Collection Methods 7. Reducing Time Pressure on Outside Consultants 8. Hiring More Resource People 9. Building Outcome Indicators Into Project Records 10. New Information Technology for Data Collection and Analysis 11. Common Threats to Adequacy and Validity Relating to Time Constraints Summary. Further Reading.
Chapter 5. Critical Information Is Missing or Difficult to Collect: Addressing Data Constraints
1. Data Issues Facing RealWorld Evaluators 2. Reconstructing Baseline Data 3. Special Issues Reconstructing Baseline Data for Project Populations and Comparison Groups 4. Collecting Data on Sensitive Topics or From Difficult-to-Reach Groups 5. Common Threats to Adequacy and Validity of an Evaluation Relating to Data Constraints Summary. Further Reading.
Chapter 6. Political Constraints
1. Values, Ethics, and Politics 2. Societal Politics and Evaluation 3. Stakeholder Politics 4. Professional Politics 5. Political Issues in the Design Phase 6. Political Issues in the Conduct of an Evaluation 7. Political Issues in Evaluation Reporting and Use 8. Advocacy Summary. Further Reading.
Chapter 7. Strengthening the Evaluation Design and the Validity of the Conclusions
1. Validity in Evaluation 2. Factors Affecting Adequacy and Validity 3. A Framework for Assessing the Validity and Adequacy of QUANT, QUAL, and Mixed-Method Designs 4. Assessing and Addressing Threats to Validity for Quantitative Impact Evaluations 5. Assessing Adequacy and Validity for Qualitative Impact Evaluations 6. Assessing Validity for Mixed-Method (MM) Evaluations 7. Using the Threats-to-Validity Worksheets Summary. Further Reading.
Chapter 8. Making It Useful: Helping Clients and Other Stakeholders Utilize the Evaluation
1. What Do We Mean by Influential Evaluations and Useful Evaluations? 2. The Underutilization of Evaluation Studies 3. Strategies for Promoting the Utilization of Evaluation Findings and Recommendations Summary. Further Reading.
PART II: A REVIEW OF EVALUATION METHODS AND APPROACHES AND THEIR APPLICATION IN REALWORLD EVALUATION: FOR THOSE WHO WOULD LIKE TO DIG DEEPER
Chapter 9. Standards and Ethics
1. Standards of Competence 2. Professional Standards 3. Ethical Codes of Conduct 4. Issues Summary. Further Reading.
Chapter 10. Theory-Based Evaluation and Theory of Change
1. Theory-Based Evaluation (TBE) and Theory of Change (TOC) 2. Applications of Program Theory in Program Evaluation 3. Using TOC in Program Evaluation 4. Designing a Theory of Change Evaluation Framework 5. Integrating a Theory of Change Into the Program Management, Monitoring, and Evaluation Cycle 6. Program Theory Evaluation and Causality Summary. Further Reading.
Chapter 11. Evaluation Designs: The RWE Strategy for Selecting the Appropriate Evaluation Design to Respond to the Purpose and Context of Each Evaluation
1. Different Approaches to the Classification of Evaluation Designs 2. Assessing Causality Attribution and Contribution 3. The RWE Approach to the Selection of the Appropriate Impact Evaluation Design 4. Tools and Techniques for Strengthening the Basic Evaluation Designs 5. Selecting the Best Design for RealWorld Evaluation Scenarios Summary. Further Reading.
Chapter 12. Quantitative Evaluation Methods
1. Quantitative Evaluation Methodologies 2. Experimental and Quasi-Experimental Designs 3. Strengths and Weaknesses of Quantitative Evaluation Methodologies 4. Applications of Quantitative Methodologies in Program Evaluation 5. Quantitative Methods for Data Collection 6. The Management of Data Collection for Quantitative Studies 7. Data Analysis Summary. Further Reading.
Chapter 13. Qualitative Evaluation Methods
1. Design 2. Data Collection 3. Data Analysis 4. Reporting 5. Real-World Constraints Summary. Further Reading.
Chapter 14. Mixed-Method Evaluation
1. The Mixed-Method Approach 2. Rationale for Mixed-Method Approaches 3. Approaches to the Use of Mixed Methods 4. Mixed-Method Strategies 5. Implementing a Mixed-Method Design 6. Using Mixed Methods to Tell a More Compelling Story of What a Program Has Achieved 7. Case Studies Illustrating the Use of Mixed Methods Summary. Further Reading.
Chapter 15. Sampling Strategies for RealWorld Evaluation
1. The Importance of Sampling for RealWorld Evaluation 2. Purposive Sampling 3. Probability (Random) Sampling 4. Using Power Analysis and Effect Size for Estimating the Appropriate Sample Size for an Impact Evaluation 5. The Contribution of Meta-Analysis 6. Sampling Issues for Mixed-Method Evaluations 7. Sampling Issues for RealWorld Evaluation Summary. Further Reading.
Chapter 16. Evaluating Complex Projects, Programs, and Policies
1. The Move Toward Complex, Country-Level Development Programming 2. Defining Complexity in Development Programs and Evaluations 3. A Framework for the Evaluation of Complex Development Programs Summary. Further Reading.
Chapter 17. Gender Evaluation: Integrating Gender Analysis Into Evaluations
1. Why a Gender Focus Is Critical 2. Gender Issues in Evaluations 3. Designing a Gender Evaluation 4. Gender Evaluations With Different Scopes 5. The Tools of Gender Evaluation Summary. Further Reading.
Chapter 18. Evaluation in the Age of Big Data
1. Introducing Big Data and Data Science 2. Increasing Application of Big Data in the Development Context 3. The Tools of Data Science 4. Potential Applications of Data Science in Development Evaluation 5. Building Bridges Between Data Science and Evaluation Summary. Further Reading.
PART III: MANAGING EVALUATIONS
Chapter 19. Managing Evaluations
1. Organizational and Political Issues Affecting the Design, Implementation, and Use of Evaluations 2. Planning and Managing the Evaluation 3. Institutionalizing Impact Evaluation Systems at the Country and Sector Levels 4. Evaluating Capacity Development Summary. Further Reading.
Chapter 20. The Road Ahead
1. Conclusions 2. Recommendations
Glossary of Terms and Acronyms
References
Author Index
Subject Index