Taking Note: Arts Impact Measurement from Our Neighbor to the North


By Sunil Iyengar, Arts Endowment Director of Research & Analysis
Four years ago, the Canada Council for the Arts announced its first funding increase toward a doubling of the agency’s budget by 2021. Preparing for this cash infusion, the agency restructured its grant-making programs, moving from a “disciplinary-based” model to an “outcomes-oriented” one. Accordingly, the Council’s six grant programs are: Explore and Create; Engage and Sustain; Creating, Knowing, and Sharing (a program designed especially to serve indigenous populations); Supporting Artistic Practice; Arts Across Canada; and Arts Abroad.

With budgetary windfalls comes great responsibility—for program evaluation and performance measurement, that is. Last month, the Council published a “qualitative impact framework” to articulate and measure the “intrinsic” impacts of its contributions to the Canadian arts ecosystem. Working with the arts consulting firm WolfBrown, the Council’s research director, Gabriel Zamfir-Emache, and his team led three regional stakeholder group meetings to vet the tool, which originated with “deep research into different models and frameworks used in a variety of sectors around the world, including the arts,” a Q&A document explains.

When it comes to measuring the value of the arts, the notion of intrinsic versus so-called instrumental benefits has been circulating since at least 2005, when the RAND Corporation produced its report Gifts of the Muse: Reframing the Debate about the Benefits of the Arts. Intrinsic value, at times, can seem like a panacea: as if the term were a repository for all the abstract and elusive qualities of the artistic experience that social science research methods have yet to plumb. Using customizable surveys and dashboards, however, WolfBrown works routinely with arts organizations in the U.S. and abroad to track the emotional responses of audiences and art-goers, often to specific artworks or productions.
These “intrinsic impact” surveys confirm rather than contradict the enduring appeal of psychometrics—and of good old mixed-methods research—to measure knotty concepts such as engagement and captivation.

The Canadian report does not focus exclusively on audience experiences, nor does it propose any single survey instrument to measure the Council’s cumulative impact. Instead, the framework offers five impact areas as worthy of study, and it lists detailed questions for each. Broadly, the five impact areas are: artists; organizations; artistic/creative work and practices; national and international audiences; and communities and society. According to the Council, the questions within each impact area will fuel a qualitative research implementation plan. Resulting studies will examine “the effects of the Canada Council’s funding on first-time grant recipients” and also “how organizations across Canada can articulate the broader impacts they have on their communities,” the Council’s website states.

Discussing the impacts it will study, the Council distinguishes between “upstream” and “downstream”: the former are “impacts most directly related to the Council’s funding,” while the latter are “impacts that result from the work of funded artists and organizations and [that] may be attributed (but not limited) to the Council’s funding.”

A summary of stakeholder meetings on the framework mentions initial concern among some participants that the Council would use the framework to affect individual granting decisions. The final framework document dispels this possibility, noting: “The framework focuses on the impact that the Canada Council generates both through its grant programs and strategic commitments. It will not be used to assess the work of the artists and organizations the Council funds, or influence the decisions of peer assessment committees.”

Similarly, in a later section, the report describes two types of artistic programming: one that is based primarily on an “artistic impulse…without a specific outcome in mind,” and another that “is designed with a specific outcome in mind, as is the case with many education and community engagement programs.” In connection with the first type, the report states: “Evaluation principles don’t apply here because there is no logic model behind an artistic vision.” Those of us who toil in performance measurement for arts and cultural institutions would do well to hang that sentence above our desks.