Paperback
This text aims to build evaluation capacity by increasing knowledge about evaluation and improving skills to conduct evaluations. The Second Edition adds topics suggested by users of the book, incorporates content that the author has added to her own classes, and covers areas of evaluation that have emerged since the publication of the first edition, such as artificial intelligence and equity in evaluation.
Note: This item can only be shipped to a delivery address in Germany.
Product details
- Publisher: Sage Publications Inc Ebooks
- Edition: 2nd revised edition
- Number of pages: 560
- Publication date: 14 July 2025
- Language: English
- Dimensions: 231mm x 187mm
- ISBN-13: 9781071918289
- ISBN-10: 1071918281
- Item no.: 73527970
- Manufacturer information:
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- gpsr@libri.de
Sue Giancola, Ph.D., has over 20 years of experience as an evaluator in both academia and private business. Before becoming an evaluator, she worked in corporate and higher education settings on process improvement initiatives. Dr. Giancola is a graduate of the University of Pennsylvania's Policy Research, Evaluation, and Measurement program. She also holds a bachelor's degree in systems engineering from the University of Virginia and a master's degree in management from Pennsylvania State University. She is currently Senior Associate Director of the Center for Research in Education and Social Policy (CRESP) at the University of Delaware. Her work focuses on research and evaluation of programs to improve human services, primarily in the areas of education and health. She evaluates many local, state, and federal initiatives, including projects funded through the NIH, the NSF, the U.S. Department of Health and Human Services, and the U.S. Department of Education. Dr. Giancola is a member of the American Evaluation Association and the American Educational Research Association. She lives in Kennett Square, Pennsylvania, with her husband and two daughters.
Preface
Acknowledgments
Digital Resources
About the Author
About the Contributors
Section I: Introduction
Chapter 1: Evaluation Matters
1.1 What is Evaluation?
1.2 Why Evaluate?
1.3 Values and Standards in Evaluation
1.4 Types of Evaluation
1.5 Internal and External Evaluation
1.6 Embedding Evaluation Into Programs
1.7 Textbook Organization
1.8 Chapter Summary
Key Terms
Chapter 2: History of Evaluation
2.1 The Evolution of Evaluation
2.2 The History of Ethics in Research and Evaluation
2.3 Common Threads and Current Issues in Evaluation
2.4 Chapter Summary
Key Term
Chapter 3: Evaluation Ethics
3.1 Ethics Defined
3.2 Research Ethics Guidelines and Legislation
3.3 IRB Protocols and Amendments
3.4 Ethical Responsibilities of Organizations
3.5 Ethical Responsibilities of Evaluators
3.6 Additional Considerations
3.7 Chapter Summary
Key Terms
Chapter 4: Evaluation Ideologies and Approaches
4.1 Inquiry and Ideology
4.2 Evaluation Ideology
4.3 Evaluation Design
4.4 Evaluation Approach
4.5 Embedded Evaluation
4.6 Chapter Summary
Key Terms
Section II: Embedded Evaluation - Planning and Design
Chapter 5: Define, Part 1
5.1 Embedded Evaluation
5.2 Understanding the Program
5.3 Delineating Goals and Strategies
5.4 Explaining the Program Theory
5.5 Determining Contextual Conditions
5.6 Program Theory and Other Theories
5.7 Considering Alternative Theories
5.8 Chapter Summary
Key Terms
Chapter 6: Define, Part 2
Chapter in Context
6.1 What is a Logic Model?
6.2 Creating the Logic Model
6.3 Using the Program's Logic Model
6.4 More On Logic Models
6.5 Logic Model Cautions
6.6 Chapter Summary
Key Terms
Chapter 7: Plan, Part 1
7.1 Creating Evaluation Questions
7.2 Overarching Evaluation Questions
7.3 Embedding Evaluation Questions Into the Logic Model
7.4 Determining What Data to Collect
7.5 Creating the Evaluation Matrix
7.6 Chapter Summary
Key Terms
Chapter 8: Plan, Part 2
8.1 Attribution
8.2 Evaluation Design
8.3 Evaluation Methods and Tools
8.4 Evaluation Matrix: Identifying Data Sources
8.5 Chapter Summary
Key Terms
Section III: Embedded Evaluation - Implementation and Use
Chapter 9: Implement, Part 1
Chapter in Context
9.1 Informed Consent
9.2 Collecting The Data
9.3 Organizing Quantitative Data
9.4 Organizing Qualitative Data
9.5 Special Considerations for Mixed Methods
9.6 Chapter Summary
Key Terms
Chapter 10: Implement, Part 2
Chapter in Context
10.1 Quantitative Data Analysis: Descriptive Statistics
10.2 Quantitative Data Analysis: Inferential Statistics
10.3 Quantitative Data Analysis: Advanced Statistical Methods
10.4 Qualitative Data Analysis
10.5 Mixed Method Integrative Analysis
10.6 Managing the Unexpected and Unintended
10.7 Chapter Summary
Key Terms
Chapter 11: Interpret
11.1 The Home Stretch
11.2 Examining Results
11.3 Interpreting Results
11.4 Communicating Evaluation Results
11.5 Enhancing Reporting and Communication
11.6 Chapter Summary
Key Terms
Chapter 12: Inform and Refine
12.1 Purpose of Evaluation
12.2 Pre-Evaluation: Efforts to Promote Utilization
12.3 During Evaluation: Ongoing Utilization Efforts
12.4 Post-Evaluation and Data Dissemination
12.5 Some Final Thoughts
12.6 Chapter Summary
Key Terms
Section IV: Resources
Chapter 13: Case Study Applications
13.1 LEND Evaluation
13.2 ACCEL Evaluation
13.3 YAP Evaluation
Chapter 14: Logic Model Examples
14.1 Birth-3 Early Intervention Screening Program
14.2 Gender Equity in STEM Academic Professions Program
14.3 Graduate Pipeline to Diversify the STEM Workforce Program
14.4 K-3 Cybersecurity Awareness Program
14.5 Higher Education Cybersecurity Program
14.6 Mental Health Services Program
Appendices: Special Topics
Appendix A: An Integrated MERLA (Monitoring, Evaluation, Research, Learning, and Adapting) Framework for Evidence-Based Program Improvement
Appendix B: Community Needs Assessment Among Latino Families in an Urban Public Housing Development
Appendix C: Leveraging Artificial Intelligence to Advance Implementation Science: Potential Opportunities and Cautions
Appendix D: How Mixed-Methods Research Can Improve the Policy Relevance of Impact Evaluations
Appendix E: Learning, Unlearning, and Sprinkling In: Our Journey with Equitable Evaluation
References