Seeking Truth to (Em)Power

How to Fix Our Field’s Broken Link between Research and Practice

In this moment of upheaval, challenge, and resistance for our country, the phrase “speaking truth to power” has taken on a new urgency. Rarely asked amid the fervor pervading the corner offices and Twitter feeds of so many of our foundations and other civic institutions in recent months, however, are two important questions: Where does our “truth” come from? And how do we make judgments about truth in so subjective a field as arts and culture?

This is not merely an idle philosophical debate. Every year, our society invests thousands of hours and millions of dollars in generating knowledge about arts and culture.1 But just when choices about how to distribute resources seem to matter more than at any time in living memory, the arts field’s system for knowledge production, dissemination, and consumption is under tremendous strain, if not entirely broken — a predicament only exacerbated by a rapidly changing media environment.

As I have had the privilege to work with organizations such as Createquity, Fractured Atlas, and the Cultural Research Network over the past decade, along with a number of philanthropic institutions, the outlines of this dysfunction have become progressively clearer to me. In the abstract, it seems reasonable to expect that the decisions we make as a field should be informed by the best research and evidence available. But that expectation is stymied by a combination of factors, including the sharply limited capacity of our most influential decision makers to seek out and process information, warped incentives facing those who make a living conducting research on behalf of the field, and arts leaders’ persistent failure to develop and carry out a coordinated research agenda for the benefit of all.

During the roughly ten years that I spent as the founder of an arts think tank and the director of research and information strategy for a leading arts service organization, I often found myself advocating for the relevance and utility of research in the arts with funders and grantmaker-serving organizations. I learned from those conversations that philanthropy professionals typically place research, data, and evidence in a separate mental silo from everyday grantmaking practice. In private conversations and at large annual convenings alike, I would often hear words like “of course, data is so important” spoken, and yet I would rarely hear of actual case studies where data or research shaped an important internal decision that wouldn’t have been made otherwise. At first, I found this gap puzzling, but increasingly I have come to understand why it exists. And though its persistence is frustrating, I also believe that with awareness of the problem and focused attention toward fixing it, a better world is indeed possible.

Stumbling in the Dark

Arts leaders today grapple with tremendous limitations on their time. According to a 2016 study commissioned by the Knight and Hewlett Foundations, more than 80 percent of arts leaders report having difficulty keeping up with information in the field.2 (It should be noted that the people who didn’t bother to respond to the survey likely had even less time to spare!) An earlier report commissioned by the Kresge Foundation via the Cultural Data Project likewise identifies “very real capacity constraints within cultural nonprofits, in terms of both resources and ‘know-how,’” along with a “lack of a strong organizational vision for how data can be used to inform internal planning and decision-making,” as barriers to better use of information in the field.3

It seems fair to assume that grantmakers must be among those most severely affected by this reality, and that assumption is consistent with my own experiences speaking, working, and consulting with arts grantmakers over the past decade. At some organizations, arts grantmakers are expected to carry out in-depth due diligence on hundreds or even thousands of grant applications a year with a staff of two, three, or four people. With time at such a premium, it is not surprising that hundred-page PDF reports on topics not directly related to the task at hand drop to the bottom of the priority list. Indeed, one problem exacerbating the gap between research and practice is the fact that the typical publication format for research reports is an increasingly poor fit with contemporary reading habits. As a particularly dramatic example, the World Bank conducted a study on its own publications from 2008 to 2012, and found that roughly three-quarters of them had been downloaded fewer than a hundred times; stunningly, nearly a third of them had never been downloaded even once.4

Given this state of affairs, one might imagine that the field must rely on robust intermediaries to filter all of the relevant information and deliver it to grantmakers and others in the field in an accessible format. Alas, that is not the case. The arts sector has repeatedly tried and failed to sustain various think tank–like entities that can serve this function of translating existing research into practice.5 (My own arts think tank, Createquity, ultimately succumbed to the same structural factors that claimed the others.6) In the absence of such third-party information brokers, funders have few tools available to streamline the laborious process of seeking out relevant research literature and evaluating each individual publication’s trustworthiness and key takeaways.

Making matters worse is the fact that funders rely heavily on each other for knowledge about their field.7 A Hewlett Foundation study found that peers were the most trusted source of information among foundation professionals, sought out by more than 90 percent of respondents. Yet because so few grantmakers have the requisite bandwidth to seriously engage with a wide swath of research, this peer-to-peer approach lends itself to a potentially disastrous insularity — a “blind leading the blind” situation wherein funders encounter only a very small number of loosely vetted publications throughout the year, even as a vast and rich well of potentially transformative knowledge sits untapped just beyond their view.

These systemic failures have two very unfortunate implications. First, it is likely that numerous decisions we make every day to shape the arts ecosystem are at best suboptimal, and at worst actively harmful, because they do not benefit from the evidence available. Second, they suggest that the thousands of hours and millions of dollars we collectively invest in new research every year are largely a waste of time and money. We are not taking advantage of the knowledge we have. What does that say about the value of producing knowledge in the first place?

What We Are Not Seeing

The apparent paradox of a field that generates large quantities of research while doing little to ensure its use reflects, I believe, a collective internal disagreement about what knowledge can do for us. The act of commissioning and conducting a new study is rooted in a faith that knowledge can change minds, perspectives, and practices. Otherwise, if there is no chance that anyone would make a different decision no matter what the research tells us, why bother?

It is not crazy to be skeptical that knowledge has this power. Any single research study is extremely unlikely to be relevant, compelling, and irrefutable enough to be transformative on its own. In our experience at Createquity, the vast majority of publications we encountered did not change our worldview at all, either because they confirmed it, or more frequently, because they were not designed in a way that allowed us to draw any meaningful conclusions from them. Furthermore, many of the questions most relevant to grantmaking practice in arts and culture are difficult to measure, which means that studies offering real insight on those questions are rare indeed. Finally, even when faced with genuinely good evidence, human beings exhibit a remarkable ability to continue believing what they have already decided to be true.8

If evidence can’t help us do our jobs better, then it really is a mistake for us to invest in evidence at all. However, my own experience suggests this is not the case. In the course of reviewing hundreds of research publications for Createquity over the past several years, either on my own or by training and supervising the work of my team, the weight of the evidence has repeatedly taken me by surprise. Below are just a couple of examples of how my mind changed during that time:

  • Createquity’s first large-scale research investigation examined the intersection between arts attendance and socioeconomic status in the United States. I was surprised to learn that it is not just classical music, theater, and art museums that attract a disproportionately wealthy and educated audience; the pattern also extends to movie theaters, sporting events, and activities like dancing socially. Our analysis gave me more appreciation for the dominant role that television plays in shaping cultural life, especially among our most marginalized citizens, and caused me to become much more skeptical of audience engagement strategies that rely on free or reduced-price admission as the primary lure.9
  • I had long been vaguely aware of the strand of research into the health benefits of the arts but had not taken it seriously for a number of reasons. When Createquity investigated the wealth of literature across all of the benefits of the arts, however, I was surprised to discover that some of the strongest available evidence supports the value of arts participation in clinical settings and for older adults.10 I now believe that the relative lack of attention to these target populations in most arts grantmaking programs is a significant blind spot for the field.

Experiences like these give me faith that if we are willing to adopt an intellectually curious approach and be open to having our minds changed, evidence can indeed be transformative — even in the arts.

Let There Be Light

To my mind, this is what a well-functioning knowledge ecosystem looks like, in the arts or any other field:

  • Agreement on what information is important. Grantmakers, scholars, and other ecosystem leaders have a working consensus on what we need to know in order to optimally support the field and thereby improve the lives of people in our communities. The tangible demonstration of this consensus is a shared and prioritized research agenda with widespread buy-in among the key players.
  • Sufficient investment to get confident answers to our highest-priority questions. Qualified professionals design research studies and evaluations that have the best chance to yield the kind of evidence desired, taking gaps in existing knowledge into account. Foundations, governments, and donors coordinate to fund the highest-priority inquiries at realistic price points and continue to do so as long as the expected value of the new information exceeds the cost of seeking it.
  • A centralized resource (or resources) making sense of those answers and what they mean for practice. At least one and ideally more than one policy shop interprets and synthesizes other people’s research using a transparent and consistent framework for judging evidence. It is important that this function be independent of the people carrying out the research and evaluation of projects under review, as professional incentives make it very difficult for researchers to be objective in public about the significance of their own work.
  • An audience of professionals eager to use this information in their organizations and programming. A culture of curiosity embedded throughout organizations in the field makes people hungry for knowledge and enthusiastic about its potential for practice. Programming initiatives are designed to adapt appropriately to new information when it becomes available, regardless of what it might say. As new questions come up, they feed back and contribute to the further development of the shared research agenda.

What would it take to make this dream a reality? One obvious move in the right direction would be to establish the sort of entity described above to synthesize and interpret arts research for funders and practitioners in a centralized fashion; the same organization(s) could also help to coordinate conversation regarding a shared research agenda. This is the role that Createquity tried to carve out for itself in its last three years of existence. Unfortunately, with Createquity set to close, establishing that infrastructure anew will require retracing many of our steps. But with much of the blueprint and precedent established, any effort along those lines will at least be given a head start.

That’s the good news. The bad news is that before any of this infrastructure can be sustained, we first need to establish more buy-in across the arts sector for supporting knowledge-building activities with the requisite funds. Currently, that is a tremendous challenge because of the highly place-based nature of arts funding in the United States, which is a bad fit with the largely borderless nature of knowledge. The arts field desperately needs more resource providers who are willing to engage strategically on a national (or international) basis and across discipline boundaries.

Thus, my primary recommendation for the field as a whole is to find ways to expand the pool of available dollars for national, cross-discipline field leadership and knowledge-building activities. Below are a few thoughts on how that might be accomplished:

  • Bring donors into the conversations that are already happening. Grantmakers in the Arts itself is ideally positioned to make progress on this particular point. In recent years, more foundation board members have attended the annual conference, though their numbers are still small in relation to the total. More programming like 2014’s offsite session aimed specifically at engaging this audience would go a long way toward raising awareness of the key issues facing the field among a broader range of ecosystem players.
  • Bring the conversation to spaces where donors congregate. I was pleased to moderate a session organized by Grantmakers in the Arts at the Exponent Philanthropy conference a few years ago. Exponent (formerly the Association of Small Foundations) specializes in philanthropic entities with few or no staff, entities that often support a range of causes and do not have the budget to participate in more narrowly focused affinity groups like GIA. GIA or individual funders could use platforms like these to engage more strategically with foundation board members and individual donors not otherwise participating in national dialogue.
  • Convene donors directly. Since so many donors are place based, one idea could be to work in partnership with (i.e., fund) value-aligned major local institutions to host donor salons about important issues in the arts. The salons could be coordinated to take place simultaneously, making possible real-time communication across local communities to emphasize the shared nature of the challenges and opportunities facing the field.

The above ideas mostly focus on Grantmakers in the Arts and other field leadership institutions. That said, individual grantmakers can also play a helpful role by examining and adjusting practices related to commissioning research and using knowledge in everyday practice:

  • Always assume, until demonstrated otherwise, that someone out there has already answered your question. The first step in any research process should be to answer the question, What do we already know about this? It is almost always cheaper to find and use the knowledge that has already been generated than it is to generate new knowledge.
  • Don’t be afraid to look outside your own community for answers. Your community probably has more in common with other geographies than you think.
  • Seek to stay current on the field’s active knowledge-building activities, and broker collaboration among researchers where appropriate. Consider joining the Cultural Research Network if you are interested in staying abreast of what projects are going on at any given time.
  • Remember that as a gatekeeper to capital, you actively shape the marketplace for what research does and doesn’t get done. That is a responsibility to take seriously. Accordingly, when commissioning new research, try to design it in such a way that the rest of the field benefits, not just your immediate community. If you don’t consider yourself a research professional and don’t have internal support at your organization, think about convening a research advisory committee or hiring a consultant to manage the contractor/grantee selection process.
  • Don’t free ride. Information is a public good and thus very difficult to monetize. If you find yourself benefiting in your practice from information resources that you didn’t pay for, try to find a way to support the people or organizations that made that work possible. If you take those resources for granted, they might just disappear on you.

The arts field’s knowledge infrastructure is broken, but I believe it can and ought to be fixed. Doing so, however, will take conscious commitment on the part of present and future field leadership. My hope is that one day, we will be able to say with confidence what we need to know, what research is out there, what it tells us, and how we can use it. And I hope on that day, we will share an appreciation for why research matters — not for its role as part of our institution’s thought leadership strategy, but because it really can help us make the most of the little time we have on this Earth and these precious resources that have been entrusted to our care.


  1. In 2016 alone, my arts think tank, Createquity, catalogued 516 research publications that had the arts as a primary focus; the true number of such publications released that year is likely at least 20 percent higher. Conservative assumptions of two hundred hours and $20,000 per publication yield figures of well over one hundred thousand hours and $10 million in investment, respectively.
  2. Barry Hessenius, “Communication and Information Management in the Nonprofit Arts Sector,” January 2016.
  3. Sarah Lee and Peter Linett, “New Data Directions for the Cultural Landscape: Toward a Better-Informed, Stronger Sector,” 2013, prepared for the Cultural Data Project by Slover Linett Audience Research.
  4. Christopher Ingraham, “The Solutions to All Our Problems May Be Buried in PDFs That Nobody Reads,” Washington Post, May 8, 2014.
  5. Barry Hessenius, “Arts Think Tank Follow-Up,” Barry’s Blog, June 4, 2017.
  6. Ian David Moss, “A Milestone and a Sunset for Createquity,” Createquity, October 26, 2017.
  7. Harder+Company Community Research, “Peer to Peer: At the Heart of Influencing More Effective Philanthropy,” February 2017. This report is not specific to the arts, but if anything, one would expect arts funders to be even more reliant on peers given the dearth of strong think tanks and other knowledge intermediaries in our field.
  8. For a comprehensive overview of cognitive biases and how they affect judgment and decision making, I highly recommend Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2013).
  9. Ian David Moss, Louise Geraghty, Clara Inés Schuhmacher, and Talia Gibas, “Why Don’t They Come?,” Createquity, May 6, 2015.
  10. Salem Tsegaye, Ian David Moss, Katie Ingersoll, Rebecca Ratzkin, Sacha Wynne, and Benzamin Yi, “Everything We Know about Whether and How the Arts Improve Lives,” Createquity, December 19, 2016. See also Salem Tsegaye, Sacha Wynne, Rebecca Ratzkin, Ian David Moss, and Katie Ingersoll, “(Eng)Aging with the Arts Has Its Benefits,” Createquity, November 2, 2016.