Evaluating Arts Education

Evaluations of arts education programs raise some of the greatest challenges I face in reviewing proposals. Even in a secular age, when people are pressed to describe the nature of art, they come to words like "essence." How do we reach a point where we know that children have learned to make art and to encounter it, and how do we measure that kind of knowing? Unlike Professor Mihaly Csikszentmihalyi, whose elegant "Assessing Aesthetic Education" appeared in the Spring 1997 Grantmakers in the Arts Newsletter, I am writing from the perspective of a grantmaker with three years of experience facing a desk filled with slippery, eclectic piles of publications and reports, continually trying to glean from them an approach that can be both authentic and pragmatic.

According to those reports, arts education can improve math and reading ability, raise Scholastic Aptitude Test scores, enhance the teaching of history, boost self-esteem, contribute to harmony among members of different racial groups, and improve the likelihood of participation in the arts as an adult. The potential of thoughtful programs is rich, but every approach to teaching art in and after school and then measuring what happens comes up against ill-fitting tools of measurement or redefines the tools and the rules.

Based on my knowledge of the behavior of school districts in the greater San Francisco Bay Area, administrators have heard that there are success stories out there—high test scores and great achievements hatching out of arts-intensive curricula and arts magnet schools in low- and middle-income communities. Not content with national studies, each of these school administrators wants to know whether the arts are working in their schools, with their students. They want proof at hand to justify allocations. Arts organizations come to grantmakers seeking increased funding to evaluate their work. This is a difficult, expensive endeavor. Many organizations design assessment plans that have multiple purposes, trying to measure for the values they themselves maintain, to comfort school district administrators, and to satisfy their funding sources. And many are in over their heads. The possibility that they will fail or waste resources worries me.

What follows are brief responses to ten evaluation documents, each of which takes a distinct approach. My hope is that having examples for reference will enable grantmakers to help arts organizations and schools know how questions have been asked by others. This is by no means a comprehensive collection. I used the “random sampling” approach of collecting what came in the mail or what I learned was out there based on what came in the mail. I hope others will respond with recommendations for studies that demonstrate other possibilities.

I have clustered the readings into three groups according to ways they might be compared and used: a) evaluations that provide arguments on behalf of arts education; b) evaluations that ask what can and ought to be measured, whether new terms of measurement are needed, and how assessment can be integrated into teaching; and c) evaluations that address how to strengthen nonprofit agencies or support effective partnerships among organizations.

Making the Argument for Arts Education
Properties we didn't notice are like ideas we have not had. They leave no gap in the world; it takes information to specify gaps.
— Ulric Neisser, Cognition and Reality

“Now, most public elementary schools offer instruction in music and visual art (97 percent and 85 percent respectively), although relatively few offer dance and drama courses (43 percent and 8 percent respectively); and only 39 percent of the nation's public secondary schools require credit specifically in the arts for graduation.” Transforming Ideas for Teaching and Learning the Arts by Charles L. Gary (U.S. Department of Education, 1997) is addressed to teachers and school administrators and provides a brisk overview of the current status of arts education in U.S. schools and of contemporary thought about the teaching opportunities that arts education programs provide. Intellectual and social skills mentioned include decision making, aesthetic awareness, originality, self-acceptance, and craftsmanship. Opportunities to employ new technologies in this field are abundant. A modest document, Transforming Ideas... includes a good “Suggested Reading” list as well as a useful list of resources (books, service organizations, and internet resources).

The Creative Ticket to Student Success (Alliance for Arts Education, New Jersey, 1995) makes a complex, well-articulated argument on behalf of arts education. An effective tool for designing a position paper, The Creative Ticket includes a statement of goals and values for arts education, and then maps out arguments from the points of view of national policy, academic achievement, effectiveness with at-risk youth, and workplace preparation, along with examples of exemplary professional development and parental involvement programs. The study quotes examples of standardized test score results, including:

In 1995, the College Entrance Examination Board found that Scholastic Assessment Test scores for students who studied the arts more than four years were 59 points higher on the verbal and 44 points higher on the math portion than those of students with no course work or experience in the arts.

A 1994 American Psychological Association report demonstrated "...that the spatial reasoning performance of nineteen preschool children who received eight months of music lessons far exceeded the spatial reasoning performance of a demographically comparable group of fifteen preschool children who did not receive music lessons;" it reported similar results for college students who listened to Mozart.

Other examples highlight specific public schools with strong arts curricula where test scores exceed those of comparable schools in their districts.

Effects of Arts Education on Participation in the Arts by Louis Bergonzi and Julia Smith (NEA Research Division, 1996) analyzes an assumption that many arts grantmakers make when supporting arts education: exposure to the arts as a child increases participation in the arts as an adult (whether as a producer of art, as an audience member, or by gaining access to the arts through the media). This report uses data from the 1992 Survey of Public Participation in the Arts, which was conducted by the U.S. Census Bureau on behalf of the National Endowment for the Arts; a new report on arts participation in the United States is now underway. Analysis of the data reveals differences in participation based on arts education, general education, racial and ethnic group, socioeconomic status, and patterns of leisure activities. Arts education emerges as the most important factor in predicting arts participation of adults at live events or through the media.

The report differentiates between school-based arts education and community-based arts education. For almost every type of participation, the more of either school- or community-based arts education that children received, the more likely they were to participate as adults, through either consumption or creation. There was one exception. In "arts performance"—that is, in an individual's own activity as a performer—the study found that community-based arts education as a child did nothing to predict adult activity, and receiving school-based arts education actually decreased the likelihood that an individual would continue to perform as an adult.

Rethinking Assessment: What Can and Ought to Be Measured
Midway between the unintelligible and the commonplace, it is metaphor which produces most knowledge.
— Aristotle

Arts PROPEL: An Introductory Handbook, edited by Ellen Winner (1991), and Taking Full Measure: Rethinking Assessment through the Arts by Dennie Palmer Wolf and Nancy Pistone (1991) introduce some of the premises of the student-centered approach to arts education developed at Harvard Project Zero. The theses of Arts PROPEL are that students learn best when they are invited to solve challenging, open-ended questions, and that in the creative process, an artist "...must mine both his or her own experience and the heritage of the art form in fashioning new work." Proposing that arts education mimic the experiences of artists, the program accepts that assessing and reflecting upon creative work are integral to the creative process. Arts PROPEL recommends a system of portfolio review that evaluates student work based on criteria that can be applied across artistic disciplines, including craftsmanship, originality, willingness to pursue a problem in depth, development of a work over time, ability to work independently and in a group, ability to perceive characteristics of a work, and ability to think critically about one's work.

Taking Full Measure illustrates these principles. Written for classroom teachers, this article is good reading for both generalists and specialists. It introduces examples of creative teaching that incorporate reflection and assessment activities. Teaching examples are provided in visual arts, drama, music, and dance. The final chapter describes open-ended problem solving and portfolio assessment as applied to the teaching of mathematics, turning the tables on arts education studies that seek success in improved standardized mathematics test scores. While it is not the primary focus, each example conveys the importance and power of observing "professional" exhibits and performances in the process of teaching and learning art. A refreshing quality of this piece is its focus on individuality. It highlights the particular approaches of different teachers, using different art forms, and reaching very different students. Project Zero has produced a rich selection of publications, some of which may report more specifically on results. Taking Full Measure is intended as an exposition, not an analysis, and does not stand outside of its approach to compare its composite qualities to those of a different program.

Improving Visual Arts Education: Final Report on the Los Angeles Getty Institute for Educators on the Visual Arts, 1982-1989 (published in 1993) addresses both philosophical concerns about what visual arts education ought to be and structural questions about how programs can be institutionalized within school districts. The Getty Institute's Discipline-Based Art Education (DBAE) curriculum was developed in response to the "problem" that much arts teaching focused exclusively on art-making activities and production skills, and overlooked the potential inherent in arts education to further such cognitive skills as analysis, interpretation, and problem solving.

Improving Visual Arts Education outlines the implementation of the Getty approach between 1982 and 1989 in selected Los Angeles County school districts. Schools received a specific DBAE curriculum, staff development for teams of teachers and administrators, and support for curriculum implementation. The evaluation included classroom observation and measured several variables including educators' responses to staff development, levels of support for the program at school and district levels, the maintenance and expansion of the program, and growth in student achievement in art. A very helpful feature of this report is its “Appendix B” that lists all of the evaluation instruments used. The breadth of this list is a valuable brainstorming tool for anyone designing an evaluation plan.

A disappointing aspect of the report is that it describes the evaluation methods more clearly than any criticisms the evaluation uncovered. In this respect, an interesting passage is a discussion of the Institute's development of pre- and post-"Achievement in Art" tests that measure what students learn. The tests were abandoned or postponed due to a "controversy" that the report mentions but does not explain. However, the breadth of Getty's effort has informed subsequent national work in defining arts education standards.

The Schooled Mind: Do the Arts Make a Difference? by Richard L. Luftig (1994, 1995) provides a contrast to the Project Zero and Getty Institute efforts to redefine the methods of teaching along with the instruments of examination. This two-volume evaluation of the SPECTRA+ program in the Hamilton and Fairfield school districts in Ohio takes a classic, empirical approach. Two schools—one in each district—were selected to participate in a multidisciplinary, multi-year arts education program that included professional development for teachers in interdisciplinary teaching and that incorporated the use of arts specialists. The evaluation compares SPECTRA+ classrooms to two control groups in schools in the same districts. These control groups include one that had a different special program (family math), and one that had a standard curriculum. The report outlines scored results by gender. Both academic and behavioral tests were used—Culture-Free Self-Esteem Inventory (Battle, 1981), Bialer-Cromwell Locus of Control Scale (Bialer, 1961), Torrance Tests of Creative Thinking (Torrance, 1962, revised 1990), Arts Appreciation Test (Manitoba Department of Education, 1988), Iowa Tests of Basic Skills, and Stanford Achievement Tests. A second-year evaluation examines whether gains made in year one of the study were temporary.

As with any report of this type, the nuances of achievement are numerous and, in spite of the manageable sample, results are complex. A strong coincidental theme was girls' poor math test results compared to boys'. The strength of this evaluation approach is that it generates results that are easily understood by school administrators. Its rigor demonstrates how difficult it is to secure clear, reliable information about learning. These efforts can be like the physics of sub-atomic particles: the closer one looks, the more mystical the object of observation appears to be.

The National Gallery of Art Teachers' Institutes: A Five-Year Evaluation (1989-1993) by Ellyn Berk (1994) looks at a museum's intensive summer institute program. This Institute has reached a large number of teachers (both arts and non-arts) and school administrators from schools of all types across the United States since 1989. The study focuses on the program's impact on teachers, although the inclusion of a site visit suggests that the effect on students also was a concern. Surveys of participating teachers were conducted at the outset of their Institute experiences and six months later. In addition to analyzing who was served, the study evaluates teacher motivation, teacher morale, attitudes toward teaching art, and changes in teaching. Surveys evolved over the five years of the study to include questions about the teachers' roles in their schools and communities. One hope that developed as the program matured was that it would foster leadership, with participants going back to their jobs to train and inspire others. The version of the study that was distributed to grantmakers may have been abridged. It includes observations of a single site visit to a classroom. While it is an inspiring example, the program is national and has served over 1,000 teachers. A broader variety of site visits would have been more appropriate.

Arts Education in Context
From error to error we discover the entire truth.
— Sigmund Freud

Evaluation of the Implementation and Outcomes of WritersCorps Programming by Rebecca Schaffer, Steve Hulett, Robyn Harris, and Michael Peters is hot off the presses. WritersCorps is a collaborative effort of the National Endowment for the Arts, AmeriCorps, and arts agencies in Washington D.C., the Bronx, and San Francisco to incorporate creative writing into existing support services for at-risk populations. This program is challenging to evaluate because it was implemented to serve different populations in three cities, and WritersCorps Members were placed in a wide variety of schools and social service agencies. Caliber Associates, the evaluation firm, determined that data sources and resources were not sufficient to produce a reliable, scientifically rigorous evaluation. Therefore, they documented the process and strategies for implementing the program at each site, noting factors that impeded the program's implementation. Through focus groups and surveys, they also created a synopsis of "direct outcomes."

WritersCorps Members produced extensive and innovative literary arts programs. “The anecdotal and qualitative evidence of the program's impact included many success stories...” One outcome repeatedly mentioned was increased self-esteem (although that term is not defined). Nearly all host agencies were interested in continuing their work with the program. Some obtained additional resources to hire their Members as regular staff.

The evaluation uncovers an essential and thorny question: was WritersCorps meant to serve the agencies with the most urgent needs, or to create sustainable programs? If urgency of need were most important, more resources would have been needed, and Members should have had tenures of longer than two years. Caliber Associates emphasizes the question of urgency specifically because AmeriCorps is a federal initiative:

...to create a demand for writing without providing long-term means of meeting this demand would be very detrimental for the Host Agencies, the Members, and the participants.... WritersCorps may fall into the ranks of the many federal projects that frustrate local agencies by promising success, showing signs of achieving this success, and ending before they can be sustained or replicated at the local level. Such programs leave everyone feeling “empty-handed.”

Another interesting question raised was whether the AmeriCorps structure impeded successful implementation of program goals by limiting writers' tenure to two years, by insisting on a large number of hours relative to the fees paid to writers, and by requiring time-consuming reporting (and mandatory three-hour weekly meetings). Several Host Agencies became very attached to their writers and were distressed when the two-year period ended. Before the end of the program's initial three-year period, the Bronx broke away from AmeriCorps to gain more flexibility. Despite these criticisms of the program structure, AmeriCorps' commitment to breadth, service, and access was critical to the program's design and spirit in all three cities.

Power in Practice: Arts Education Development Project (The Pew Charitable Trusts, 1995) offers good lessons for any kind of partnership program involving schools and community centers. Grantmakers are one of its primary audiences. The Pew Charitable Trusts' Arts Education Development Project (AEDP) selected a cluster of ten cultural organizations and provided up to $85,000 over a four-year period for them to strengthen and enhance the breadth of their programs. Pew's goal was to nurture ten model programs, and it allowed for the possibility that each would be distinct. Each was to be judged on its own terms. The purpose of assessment in this case was not to justify a program to outsiders, but to inform the work of the agencies involved. The premise behind this grantmaking was that better arts education would be produced by strengthening the organizations that deliver the service. Support consisted of financial assistance and frequent convening of all participants to share ideas and information. Pew's report does not attempt to compare the effect of the programs on the students served. Each agency was invited to look at "impact on constituents" for itself. Over the course of the project, AEDP's original goal (creating ten models for national replication) was changed to seeking common concerns or strategies that held true no matter what the audience or setting. In the report, the lessons learned appear not only alongside the profiles of each project and group, but also on a colorful poster inserted in the publication. The evaluation focuses on the behavior of organizations and ways to strengthen them. Patience and adaptability are critical to success.

Evaluating arts education programs raises pragmatic questions for grantmakers. When should we expect empirical results and when should we allow a program time to grow and be refined before asking it to face the chilly lens of standards? When are self-defined goals appropriate and when are they self-indulgent? When does our investment in a sophisticated design cause us to lose sight of what children are learning? When must we make accommodation for the pressured environments in which teachers and artists work—where survey forms are lost and focus groups represent unpaid overtime?

It is difficult to maintain a controlled experimental environment when studying any kind of learning. It is even more difficult in the arts where defining what learning is involves questioning every one of our values, class assumptions, and preconceptions about the nature of culture and civilization. Also, arts education often depends heavily on organizations and artists from outside of traditional schools, so that to strengthen arts education, we must affect and evaluate at least three fluctuating systems: teacher education, school districts, and arts organizations.

The reports cited here are a small sampling of important work undertaken in this complex field in the past two decades. Reading across the range that they represent highlights the difficulties of assessment itself, but also demonstrates that arts education contributes to positive social and intellectual change in the lives of students.


  1. Transforming Ideas for Teaching and Learning the Arts by Charles L. Gary, U.S. Department of Education, March 1997. For purchase, contact the U.S. Government Printing Office, Superintendent of Documents, Mail Stop: SSOP, Washington D.C. 20402-9328. Call GPO order desk: 202-512-1800.
  2. The Creative Ticket to Student Success: A Position on the State of the Arts in New Jersey Education Reform, Alliance for Arts Education, New Jersey, December 1995. For copies, contact National Assembly of State Arts Agencies, 1010 Vermont Avenue, N.W., Suite 920, Washington D.C. 20005, 202-347-5348.
  3. Effects of Arts Education on Participation in the Arts by Louis Bergonzi and Julia Smith, NEA Research Division, Report #36, 1996. For copies, call Seven Locks Press, 800-354-6352.
  4. Arts PROPEL: An Introductory Handbook by Ellen Winner, series editor, Educational Testing Service and the President and Fellows of Harvard College on behalf of Project Zero, Harvard Graduate School of Education, 1991. To receive an order form for Harvard Project Zero publications write Harvard Project Zero, 323 Longfellow Hall, Cambridge, MA 02139.
  5. Taking Full Measure: Rethinking Assessment through the Arts by Dennie Palmer Wolf and Nancy Pistone; Richard Orrill, executive editor, College Entrance Examination Board, New York, 1991. Photocopies are available by ordering from Harvard Project Zero (see #4 above).
  6. Improving Visual Arts Education: Final Report on the Los Angeles Getty Institute for Educators on the Visual Arts (1982-1989), J. Paul Getty Trust, 1993. To receive a catalogue of publications of the Getty Institute for Education in the Arts, contact Getty Trust Publications, Distribution Center, Dept. TFC7, P.O. Box 49659, Los Angeles, CA 90049-0659, 800-223-3431.
  7. The Schooled Mind: Do the Arts Make a Difference?, Year 2, by Richard L. Luftig, Ph.D., Center for Human Development, Learning, and Teaching, Miami University, Oxford, OH, 1994 and 1995. Copies are available through the Fitton Center for Creative Arts, 101 S. Monument Avenue, Hamilton, OH 45011-2833, 513-863-8873. (Year I, also by Richard Luftig, is out of print, although photocopies are available at cost from the Center at Miami University. Year 1 results are summarized in the Year 2 report.)
  8. The National Gallery of Art Teachers' Institutes: A Five-Year Evaluation, Ellyn Berk, Ph.D., June 1994. Order from the National Gallery of Art, Office of Teacher and School Programs by calling 202-842-6187.
  9. Evaluation of the Implementation and Outcomes of WritersCorps Programming by Rebecca Schaffer, Steve Hulett, Robyn Harris, and Michael Peters, evaluation director, Caliber Associates, Fairfax, VA, 1997. For copies, contact Ed Taylor, National Endowment for the Arts, 1100 Pennsylvania Avenue N.W., Washington D.C. 20506-0001, 202-682-5441.
  10. Power in Practice, The Pew Charitable Trusts, 1995. Contact the Trusts at 2005 Market Street, Suite 1700, Philadelphia, PA 19103, 215-575-9050.