Leslie Fierro is an assistant clinical professor of evaluation in Claremont Graduate University’s Division of Behavioral and Organizational Sciences. Fierro’s research focuses on identifying factors that can be leveraged to strengthen the conduct of evaluation practice within organizations. Her current research continues along similar lines and is expanding to examine the potential influence of evaluation policy on the design, conduct, and use of program evaluations.
Fierro received a BA in Biology from Pitzer College, an MPH with a co-concentration in Epidemiology and Biostatistics from Loma Linda University, and a PhD in Evaluation and Applied Research Methods from Claremont Graduate University. She is an active member of the American Evaluation Association, where she serves as chair of the Research on Evaluation Topical Interest Group. Her early research examined the strengths and gaps in the academic preparation of current and prospective evaluators. Her dissertation research focused on clarifying the distal intended outcomes of building evaluation capacity within organizations. As part of the dissertation, she developed and tested an instrument designed to measure constructs of evaluation capacity and the intended outcomes of this capacity in public health programs.
Prior to joining the faculty at CGU, she worked for Deloitte Consulting as an evaluation specialist in the Human Capital Service Line. In this role, she led medium- to large-scale evaluation projects for clients in the federal health space and assisted with growing Deloitte’s evaluation service offerings. Fierro’s professional background also includes a lengthy tenure with the Centers for Disease Control and Prevention (CDC), where she served as an epidemiologist and evaluator. While at the CDC, she was central to the growth of evaluation within its National Asthma Control Program.
Fierro enjoys working collaboratively with nonprofit organizations and government agencies to enhance their ability to support and sustain effective evaluation practice. Along these lines, she is interested in engaging with organizations to develop clear long-term strategies for implementing a portfolio of monitoring and evaluation activities capable of providing timely, informative, and actionable insights about organizational and program performance.
Co-authored with Christina A. Christie and Patricia Quiñones. “Informing the Discussion on Evaluator Training: A Look at Evaluators’ Course Taking and Professional Practice.” American Journal of Evaluation 35, no. 2 (2014): 274–90.
Co-authored with J. H. Reed. “Book Review: Evaluating Public and Community Health Programs.” American Journal of Evaluation 34, no. 3 (2013): 432–35.
“Clarifying the Connections: Evaluation Capacity and Intended Outcomes.” PhD diss., Claremont Graduate University, 2012.
Co-authored with Christina A. Christie. “Evaluation Policy to Implementation: An Examination of Scientifically Based Research in Practice.” Studies in Educational Evaluation 38, no. 2 (2012): 65–72.
Co-authored with E. J. Herman, et al. “A Model-Driven Approach to Qualitatively Assessing the Added Value of Community Coalitions.” Journal of Urban Health 88, no. 1 (2011): 130–43.
Co-authored with Christina A. Christie. “Understanding Evaluation Training in Schools and Programs of Public Health.” American Journal of Evaluation 32, no. 3 (2011): 448–68.
Monitoring & Evaluation in Global Public Health Programs
Comparative Evaluation Theory
Evaluation Capacity Building
Introduction to Performance Monitoring & Management of Social Programs