Jennifer’s research interests focus on the intersections of social science and social policy. Her work in the domain of educational and social program evaluation seeks to advance the theory and practice of alternative forms of evaluation, including qualitative, democratic, and mixed methods evaluation approaches. Current work emphasizes evaluation as a venue for democratizing dialogue about critical social and educational issues, with a focus on conceptualizing evaluation as a "public good."
Education
- Ph.D., Educational Psychology, Stanford University, 1976
- M.A., Education, Stanford University, 1972
- B.A., Psychology, Wellesley College, 1971
Key Professional Appointments
- Professor, Educational Psychology, University of Illinois at Urbana-Champaign, 1999-present
- Assistant, Associate, Full Professor, Department of Policy Analysis and Management, Cornell University, 1983-1999
- Assistant, Associate Professor, Department of Education, University of Rhode Island, 1977-1983
Activities & Honors
- Distinguished Fellow Award, School of Critical Studies in Education, University of Auckland, 2012-present
- Reviewer, American Educational Research Journal, 2010-present
- Series Editor, Evaluation and Society, 2010-present
- Member of Evaluation Team, University of Illinois, Education Justice Project, 2009-present
- Distinguished Scholar, Aalborg University, 2011-2013
- Member of the Board, American Evaluation Association, 2012-present
- Fellow, American Educational Research Association, 2010
- Alva Myrdal Guest Professor, Department of Education and Culture, Mälardalen University, Eskilstuna, Sweden, 2008-2009
- R. Stewart Jones Award for the Outstanding Teacher in Educational Psychology, Department of Educational Psychology, 2006
- Distinguished Senior Scholar, College of Education, 2003
- Paul F. Lazarsfeld Award, for contributions to evaluation theory, American Evaluation Association, 2003
Research Statement
My research interests focus on the intersections of social science and social policy. I work in the domain of educational and social program evaluation, and I seek to advance the theory and practice of alternative forms of evaluation, including qualitative, democratic, and mixed methods evaluation approaches. My current work emphasizes evaluation as a venue for democratizing dialogue about critical social and educational issues, with a focus on conceptualizing evaluation as a "public good."
Grants
- Principal Investigator, Advancing the State-of-the-Art in Evaluation: Field-Testing and Disseminating an Educative, Values-Engaged Approach to Evaluating STEM Education Programs, National Science Foundation, 2006-2011
- Co-Principal Investigator, Center Evaluation, Center on Democracy in a Multiracial Society, 2005-2006
- Principal Investigator, Internal Evaluation: Spring 2005, Center for African Studies, 2005
- Co-Principal Investigator, 12-Step Participation After Adolescent Treatment, National Institutes of Health, 2005-2008
- Co-Principal Investigator, Illinois Project for Democratic Accountability, Campus Research Board, 2004-2005
- Principal Investigator, An Evaluation Plan for the 2003 Annual Conference of the American Evaluation Association, U.S. Department of Education (American Evaluation Association), 2003-2004
- Principal Investigator, An Evaluation of the EPICS K-12 Program at UIUC, National Science Foundation (College of Engineering), 2003-2004
- Co-Principal Investigator, Evaluation of the APPEALS Training, U.S. Department of Veterans Affairs (Leads Corporation), 1998-2002
Publications
- DeStefano, L., Greene, J., Anderson, J. The external evaluation of the AERA/IES postdoctoral fellows and grants program, year 2 report. American Educational Research Association: Washington, DC.
- Greene, J., et al. The learning in community (LINC) program, evaluation summary report. Prepared for the Learning in Community (LINC) Program Staff. College of Engineering, University of Illinois at Urbana-Champaign: Champaign, IL.
- Greene, J. (2013). Consumers, curmudgeons, and courage: Traveling evaluation's byways with Michael Scriven. The future of evaluation in society: A tribute to Michael Scriven. Information Age Publishing: Greenwich, CT.
- Johnson, J., Hall, J., Greene, J., Ahn, J. (2013). Exploring alternative approaches for presenting evaluation results. American Journal of Evaluation, 34(4), 486-503.
- Greene, J. (2012). Engaging critical issues in social inquiry by mixing methods. American Behavioral Scientist, 56(Special issue on mixing methods), 755-773.
- Hall, J., Greene, J., Ahn, J. (2012). Values-engagement in evaluation: Ideas, implications, and illustrations. American Journal of Evaluation, 33(2), 195-207.
- Greene, J. (2012). Values-engaged evaluation. Evaluation for equitable development results. UNICEF: New York.
- Greene, J. (2012). La contribution des données probantes au processus de crédibilisation d’une évaluation. [How evidence earns credibility in evaluation]. L’évaluation de programme axée sur le jugement crédible. Les Presses de l’Université du Québec.
- Greene, J., Boyce, A., Ahn, J. (2011). A values-engaged educative approach for evaluating education programs: A guidebook for practice. University of Illinois at Urbana-Champaign: Champaign, IL.
- Greene, J., Kreider, H., Mayer, E. (2011). Combining qualitative and quantitative methods in social inquiry. Theory and methods in social research, second edition. Sage: London.
- Greene, J. (2011). The construct(ion) of validity as argument. In H.T. Chen, S.I. Donaldson, & M.M. Mark (eds.), Advancing validity in outcome evaluation: Theory and practice. New Directions for Evaluation, 130 (pp. 81-). Wiley: San Francisco.
- Greene, J. (2010). Evaluation in service of the public good: The views of one US American evaluator. Zeitschrift für Evaluation [the journal of the German Evaluation Society], 9(2), 199-210.
- Greene, J., Hall, J. (2010). Dialectics and pragmatism: Being of consequence. Sage handbook of mixed methods in social and behavioral research. Sage: Thousand Oaks, CA.
- Greene, J., Sommerfeld, P., Haight, W. (2010). Mixing methods in social work research. The SAGE handbook of social work research. Sage: London.
- Benjamin, L., Greene, J. (2009). From program to network: The evaluator’s role in today’s public problem-solving environment. American Journal of Evaluation, 30(3), 296-309.
- Greene, J. (2009). Meaningfully engaging with difference through mixed methods educational evaluation. The sage international handbook of educational evaluation. Sage: Thousand Oaks, CA.
- Greene, J. (2009). Evidence as “proof” and evidence as “inkling.” What counts as credible evidence in applied research and evaluation practice? Sage: Thousand Oaks, CA.
- Greene, J. (2008). Is mixed methods social inquiry a distinctive methodology? Journal of Mixed Methods Research, 2(1), 7-21.
- Greene, J. (2006). Toward a methodology of mixed methods social inquiry. Research in the Schools. Special Issue: New Directions in Mixed Methods Research, 13(1), 93-99.
- Greene, J., DeStefano, L., Burgon, H., Hall, J. (2006). An educative, values-engaged approach to evaluating STEM educational programs. Critical issues in STEM evaluation. New Directions for Evaluation: San Francisco.
- Shaw, I., Greene, J., Mark, M. (2006). Handbook of evaluation. Sage: London.
- Greene, J. (2005). Various entries. Encyclopedia of evaluation. Sage: Thousand Oaks, CA.
- Greene, J. (2005). Synthesis: A reprise on mixing methods. Discovering successful pathways in children's development: Mixed methods in the study of childhood and family life. University of Chicago Press: Chicago.
- Greene, J. (2005). Evaluators as stewards of the public good. The role of culture and cultural context: A mandate for inclusion, truth, and understanding in evaluation theory and practice. Evaluation and Society Series: Greenwich, CT.
- Greene, J., Kreider, H., Mayer, E. (2005). Combining qualitative and quantitative methods in social inquiry. Research methods in the social sciences. Sage: London.
- Greene, J. (2005). The generative potential of mixed methods inquiry. International Journal of Research and Method in Education, 28(2), 207-211.
- Greene, J. (2005). A values-engaged approach for evaluating the Bunche-Da Vinci Learning Academy. Theorists' models in action: New Directions for Evaluation, 106. Jossey-Bass: San Francisco.
- Greene, J., Millet, R., Hopson, R. (2004). Evaluation as a democratizing practice. Foundations and evaluation: Contexts and practices for effective philanthropy. Jossey-Bass: San Francisco.
- Greene, J. (2003). Commentary: Margaret Mead, the Salzburg Seminar, and a historical evaluation report. The American Journal of Evaluation, 24(1), 115-121.
- Costantino, T., Greene, J. (2003). Reflections on the use of narrative in evaluation. The American Journal of Evaluation, 24(1), 35-49.
- Greene, J. (2002). Mixed-method evaluation: A way of democratically engaging with difference. Evaluation Journal of Australasia, 2(2), 23-29.
- Greene, J. (2001). Mixing social inquiry methodologies. In V. Richardson (Ed.), Handbook of research on teaching, 4th ed. (pp. 251-258). American Educational Research Association: Washington, DC.
- Greene, J. (2000). Understanding social programs through evaluation. Handbook of qualitative research. Sage: Thousand Oaks, CA.
- Greene, J. (1999). The inequality of performance measurements. Evaluation, 5, 160-172.
- Greene, J. (1999). Balancing philosophy and practicality in qualitative evaluation. The 1998 Robert E. Stake Symposium on Educational Evaluation. University of Illinois: Urbana-Champaign, IL.
- Greene, J., Caracelli, V. (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. National Council for the Social Studies: San Francisco.
In The News
U of I team, including two from Education, leads $2M NSF study on best practices for STEM education reform
Sep. 27, 2013
U of I physicist and chair of Educational Psychology José Mestre is the principal investigator on a new study that will seek to provide U.S. higher education institutions with a model of best practices and methods for reforming gateway STEM courses offered in the first two years of study. Jennifer Greene, professor of Educational Psychology and evaluation expert, is a co-principal investigator.