CIRCE: Past to Present
The Center for Instructional Research and Curriculum Evaluation (CIRCE)
was officially organized in 1964 as a service and research agency within
the College of Education, University of Illinois at Urbana-Champaign.
This agency brought together a number of activities that in the previous
twenty years had operated informally from the office of the University Examiner,
the Bureau of Educational Research, the office of the Illinois Statewide
High School Testing Program, and other college offices. More specifically,
it institutionalized a portion of the growing involvement of University
of Illinois personnel in the nation's educational reforms.
CIRCE's early attention focused mainly on the problems of evaluating
new curricula. Occasionally before but especially following Sputnik,
scholars from the various disciplines sought ways to upgrade the content
of school and college courses. Many of them, knowing little about
procedures of testing students and evaluating school programs, sought aid
from such specialists as J. Thomas Hastings, Lee J. Cronbach, John A. Easley,
and Philip J. Runkel. Often their discussions turned to problems
of instructional method, in-service teacher training, and curriculum coordination.
The decision of the University of Illinois to found and support CIRCE made
it possible to integrate those discussions into a program of evaluation
model building, instrumentation, field studies, collection of literature,
and professional training in measurement and evaluation.
Among the projects involved with CIRCE's first efforts were:
a. the School Science Curriculum Project (Finlay);
b. the Elementary School Science Project (Atkin);
c. the BSCS Biology Project (Grobman);
d. the Social Studies Curriculum Center (Leppert); and
e. the Illinois Statewide High School Testing Program (Hastings).
Later projects included:
a. the Consortium of Professional Associations for Study of Special Teacher Improvement Programs (CONPASS);
b. the Social Science Education Consortium;
c. the Joint Council on Economic Education DEEP project;
d. the CERLI Regional Laboratory;
e. the Educational Products Information Exchange (EPIE);
f. the Commission on College Geography;
g. the Illinois Gifted Program, Office of the Superintendent of Public Instruction;
h. the Federal/State Task Force on Educational Evaluation;
i. the Twin City Institute for Talented Youth;
j. the Training of Teachers of Teachers;
k. the National Assessment of Educational Progress;
l. the Goal-Free Evaluation of BSCS;
m. the Michigan Accountability Study;
n. the National Science Foundation Project City Science evaluation;
o. the Cleveland Area Arts Council Self-Evaluation of Arts in Education;
p. critique of the Follow Through Evaluation;
q. development of an evaluation manual for the Office of Environmental Education;
r. evaluation of a National Humanities Faculty program;
s. the Getty Trust Case Studies of School Art; and
t. audit of the NYC Promotional Gates program.
Improvement of instructional programs on campus has not been within CIRCE's jurisdiction, but there have been many instances of collaboration with various University of Illinois groups, particularly:
a. the Curriculum Laboratory (University High School);
b. the Office of Research in Medical Education (Chicago); and
c. the Cooperative Extension Service.
CIRCE's work has also extended abroad, including:
a. development of Program Evaluation Guidelines (OECD);
b. the International Seminar on Educational Evaluation (PUC-RJ), Brazil;
c. the UNCAL CAI Evaluation, Great Britain;
d. the PANG Project (Rural School Survival), Sweden; and
e. the Assessment of Icelandic Schools (Proppe), Iceland.
The rationale for work in CIRCE is that educational change should be based on empirical studies, drawing concepts and methods from the social sciences and humanities as well as from the behavioral sciences. Educational research is seen as a search for why one educational result obtains rather than another, with findings intended to be useful to policy makers, administrators, and teachers. The burden on evaluators is seen to be that of describing, fully and from different value perspectives, what occurs inside the classroom and out, so that people already involved with the program can understand it better and perhaps run it better.