Philosophy and Definitions
At Createquity, respect for and understanding of research are embedded in our DNA. Every year, governments, foundations, universities, and scientists invest thousands of hours and millions of dollars generating research about arts and culture. Yet without a shared understanding of how to interpret that research and a framework for how the findings can be used, much of that effort goes to waste. Our role has been to vet and synthesize this literature in a way that policy-makers, funders, thought leaders, arts administrators, and the general public will find useful and compelling. Our goal has been to help people apply knowledge to actual, real-life decisions, decisions that ultimately improve people's lives through the arts.
To that end, the overarching research question driving all of our work has been this one: “What are the most important issues in the arts, and what can we do about them?”
In order to actually answer this, we needed to decide what “importance” means from the perspective of the arts ecosystem. Accordingly, our editorial team hunkered down during the summer of 2014 to come up with a definition of a healthy arts ecosystem. Although always a work in progress, for the most part this definition served as a stable representation of the guiding principles and topics that drove our work. In subsequent years, we extended this definition by drawing an equivalence between the cross-disciplinary, holistic concept of wellbeing (or quality of life) and ecosystem health. Doing so enabled us to connect the arts to broader conversations across the social sector about human progress, and to create a framework that would make it possible to compare priority areas within the arts against each other.
Thus, by 2016, we were describing a healthy arts ecosystem as one in which “the maximum possible collective wellbeing is generated through the arts.” [Note that the use of a term like “maximum possible” is aspirational in the sense that Createquity had to make judgments in an environment of significant uncertainty. We aimed with our work to create a fuzzy but fundamentally accurate picture of (a) the world that is and (b) the world that could be with the benefit of different choices.]
With those definitions in hand, we were able to operationalize our tagline as follows:
- “What are the most important issues in the arts?” ➤ “What are the biggest gaps between current conditions and the maximum collective wellbeing that could be generated through the arts?”
- “What can we do about them?” ➤ “For any given gap, what is the most promising strategy or set of strategies available to close it, after taking cost and risk into account?”
Core Research Process
Our core research process placed these questions in the context of a three-phase process, ultimately leading to an advocacy campaign for some kind of concrete change in the sector (what we called a “case for change”):
Phase I, the Discovery Phase, involved examining a wide range of potential problems or opportunities in the arts in order to determine which ones were most pressing from the standpoint of increasing overall quality of life. Each of these gaps between present-day reality and the world that could be was conceived as a separate research investigation. Many of the big feature articles published on Createquity, such as Why Don’t They Come?, are the direct result of a Discovery Phase investigation – in that particular case, an exploration of the extent to which socioeconomic disadvantage was interfering with adults’ ability to experience arts and culture as consumers.
We identified potential investigations primarily through two routes: our own intuitions and experiences, and external input. The latter involved assessing the results of our reader polls, as well as feedback from our advisory council members. In addition, from time to time we would be alerted to a promising topic via information we came across in the process of investigating a different topic. By blending these various methods, we could have some assurance that we were investigating issues that our audience cared about, while not ignoring neglected topics that deserved more attention than they were receiving.
Each of these investigations involved a thorough review of the evidence in order to estimate as precisely as possible how many people are affected by each issue, by how much, and in what ways. The plan was to then move the most consequential of these issues into the second phase, where we would consider strategies to close the gap between the status quo and the better future that might be possible. Finally, where we'd identified both a significant gap and at least one promising strategy to address it, we'd develop a case for change translating everything we had learned into concrete recommendations and calls to action.
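As a purely hypothetical sketch of the prioritization step, ranking candidate issues by a rough “gap size” (people affected times severity) could look something like this in Python. The issue names, counts, and weights below are invented for illustration and are not Createquity data:

```python
# Illustrative prioritization of Discovery Phase issues by rough gap size.
# All names and numbers are invented placeholders, not real findings.
issues = [
    {"name": "Cost barriers to attendance", "people_affected": 40_000_000, "severity": 0.3},
    {"name": "Arts education access", "people_affected": 10_000_000, "severity": 0.7},
]

def gap_size(issue):
    """Crude proxy for an issue's consequence: reach multiplied by severity."""
    return issue["people_affected"] * issue["severity"]

# Most consequential issues first; these would move into the next phase.
ranked = sorted(issues, key=gap_size, reverse=True)
print([i["name"] for i in ranked])  # ['Cost barriers to attendance', 'Arts education access']
```

In practice the estimates feeding such a calculation would be uncertain ranges rather than point values, which is why the evidence review described above matters so much.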
One of the benefits of combining an online publication model with a think tank model, as Createquity did, was that the outputs of our research were not 100-page PDFs that nobody reads. We strove to make the research, or more to the point the lessons we learned from it, come alive in the form of compelling and highly shareable narratives. Furthermore, our feature articles were cumulative by design: each one built upon the last. For example, one feature in a series might examine whether a problem we had identified actually existed. Another might further explore aspects of the problem for which there was evidence and ask how people, either in the arts or in adjacent fields, had tried to address that problem in the past.
Conducting a Research Investigation
Each topic explored as part of the research agenda described above would be the subject of its own research investigation. The process template for conducting a research investigation was fairly consistent, albeit subject to adjustments based on the topic and the team investigating. In addition, the process evolved over time due to our own collective learning and growth as a research-based organization. We sought to adopt a structured approach to reviewing literature that was as systematic as possible in the context of our grassroots operation, yet nimble enough to respond to broadly framed research objectives. In doing so, we drew inspiration from the practice of “systematic evidence reviews” and “rapid evidence assessments,” as well as research synthesis methodology as described in Harris Cooper's textbook Research Synthesis and Meta-Analysis: A Step-by-Step Approach and implemented by organizations such as Cochrane and the Campbell Collaboration. (Note: Cooper himself served as a member of Createquity's advisory council.)
In general, we start by defining hypotheses that will guide the investigation. For a Discovery Phase investigation, the goal is to understand how reality falls short of our definition of a healthy arts ecosystem. Formulating specific hypotheses to that effect helps to clarify where we think those gaps might be. For example, within the research area of “Economic Disadvantage/Insecurity and Opportunities to Participate in the Arts,” our hypotheses included:
- Poor and economically insecure adults are significantly less likely to have access to opportunities to participate in the arts for a variety of reasons, including inability to afford the cost of participating, inability to afford indirect costs, lack of time, lack of awareness of opportunities, and lack of ability to take financial risk (for potential producers).
- Many people who would benefit from opportunities to participate in the arts do not take advantage of them due to pressure from social and/or professional environments that treat participation in the arts as an unwelcome distraction from economically productive activities.
Next, we conduct an initial scoping review to get a broad sense of the available literature that is relevant to the hypotheses, and whether it warrants (and can support) a more thorough investigation.
The basic selection criteria for including literature at this stage were:
- Is the work relevant to the issue area?
- Was the work written to increase the existing knowledge base? (as opposed to, for example, expressing a personal opinion)
Sources for literature include academic databases (primarily JSTOR and EBSCO), Google Scholar, combing through the bibliographies of key texts, reviewing the tables of contents of relevant academic journals, peer references, and research aggregators (e.g., CultureCase, Arts Research Monitor, the monthly Taking Note column from the NEA’s Art Works blog). We developed a starter guide for sourcing literature reviews for new editorial staff so that everyone could start from the same base level of understanding.
The next step is to prioritize the studies that appear to be most compelling and able to address our hypotheses, and conduct preliminary reviews of each of them. As part of this process, we typically create a formal or informal coding guide to help track relevant features of each study and organize our analysis. We review each publication for its relevance to the investigation topic, transparency in sharing and explaining methods and their potential limitations, overall rigor of design, and soundness of interpretation in order to gauge the strength of the research.
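To make the coding guide idea concrete, here is a minimal sketch of what one row of such a guide might look like as a structured record. The field names, rating scale, and scoring rule are our own illustrative assumptions, not Createquity's actual template:

```python
from dataclasses import dataclass

@dataclass
class StudyRecord:
    """One hypothetical coding-guide entry for a reviewed study.
    Fields mirror the dimensions described above: relevance, transparency,
    rigor, and soundness of interpretation (all on an assumed 1-5 scale)."""
    citation: str       # full reference for the study
    relevance: int      # how directly it addresses our hypotheses
    transparency: int   # are methods and limitations explained?
    rigor: int          # overall soundness of the design
    interpretation: int # do the conclusions follow from the data?
    notes: str = ""     # free-text reviewer comments

    def priority_score(self) -> float:
        """Simple average used to rank studies for deeper review."""
        return (self.relevance + self.transparency + self.rigor + self.interpretation) / 4

study = StudyRecord(
    citation="Author (2015), 'Arts participation and income', Journal X",
    relevance=5, transparency=3, rigor=4, interpretation=4,
    notes="Large sample, but self-reported participation data.",
)
print(study.priority_score())  # 4.0
```

Even an informal spreadsheet version of this structure forces reviewers to rate every study on the same dimensions, which is what makes later comparison across studies possible.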
After prioritization, we go into review mode. The highest-priority publications receive a preliminary review of the full text through capsule reviews, a structured brief review format. The primary purpose of a capsule review is to provide reviewers and readers with a good understanding of the major claims made by the research and its contributions to a body of knowledge about the arts. The format provides a standard template that allows us not only to summarize a particular piece of research, but also to analyze its strengths and weaknesses and connect it back to our hypotheses. We created a guide to writing capsule reviews for our internal training purposes, which you can read here.
At this stage, since we are simply trying to get a quick assessment of which texts have the most to teach us about our hypotheses, we rely on a technique called power browsing. Power browsing is a way of strategically scanning nonfiction texts that vastly cuts down on the time necessary to access and understand the key points. We've spent a significant amount of time developing and testing this strategy internally, finding that in most cases there is remarkable consistency in the features that different reviewers highlight as particularly important or concerning about a study, and that these major highlights and concerns can often be extracted in as little as 30 minutes with proper recruitment and training.
Next, we create a summary evidence matrix, reporting on arguments, sources, key evidence, countervailing evidence, and reviewer notes. This matrix provides a bird's-eye view of the existing evidence, and enables us to quickly assess the strength and depth of the evidence as well as identify gaps and weaknesses. Here is a link to an example from our most recent research investigation taking stock of the evidence around the benefits of the arts to wellbeing.
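A summary evidence matrix can be thought of as one row per argument, with supporting and countervailing evidence side by side. The sketch below is purely illustrative: the column names follow the description above, but the arguments, sources, and the gap-spotting helper are invented for this example and do not come from an actual Createquity investigation:

```python
# Illustrative summary evidence matrix: one row per argument.
# All example data below is invented.
evidence_matrix = [
    {
        "argument": "Cost is a major barrier to arts attendance for low-income adults",
        "sources": ["Survey A (2012)", "Interview study B (2014)"],
        "key_evidence": "Attendance rises when ticket subsidies are introduced",
        "countervailing_evidence": "Free events still show an income gradient",
        "reviewer_notes": "Subsidy studies are small-n; treat as suggestive",
    },
    {
        "argument": "Lack of time is a bigger barrier than lack of money",
        "sources": ["Time-use study C (2016)"],
        "key_evidence": "Work hours predict non-attendance better than income",
        "countervailing_evidence": "",
        "reviewer_notes": "Single source; needs corroboration",
    },
]

def weakly_supported(matrix, min_sources=2):
    """Flag arguments backed by fewer than `min_sources` sources,
    a quick way to spot gaps and weaknesses in the evidence base."""
    return [row["argument"] for row in matrix if len(row["sources"]) < min_sources]

print(weakly_supported(evidence_matrix))
# ['Lack of time is a bigger barrier than lack of money']
```

Laying the evidence out this way makes thin spots visible at a glance, which is exactly the "gaps and weaknesses" assessment the matrix is meant to support.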
Finally, we home in on the most promising of this set of studies and subject them to more thorough analysis and in-depth review. Based on the initial inspection of the work, the reviewer makes a recommendation about whether or not to proceed with a deeper treatment or review. She makes that recommendation based on relevance to our hypotheses and/or strength of methodology, among other factors outlined in our guidelines for evaluating research. Sometimes, if the work is deemed valuable but the reviewer is confident that she got the main points in her first pass, we will make use of the material but won't necessarily give the underlying text a second read. Other times, the outcome of the initial review will be a recommendation for a second pass, in which case the work is set aside for a more detailed inspection or “deeper dive.”
A deeper dive retains the capsule review format and target length, but devotes more time to understanding the details and full ramifications of the text. Depending on the length and complexity of the text and its centrality to our emerging conclusions, we may be satisfied with setting a longer time limit for the review or go so far as to have multiple people read the entire text and compare notes. In general, the more a publication’s conclusions promise to influence our own conclusions, the more safeguards we want to put into place to ensure we are not misinterpreting the study or missing an important detail that would change our view.
Our core research process provides the raw material for all of the major feature articles we publish at Createquity. Periodically, we take a step back from the research process to assess what we’ve learned so far and, if warranted, synthesize that learning for our general readership.
Other Research Activities
While the core research process was the backbone of our work at Createquity, it was not the only way in which we engaged with research. To keep readers engaged in between feature articles, our Research Spotlight series offered quick reviews and discussions of current and/or notable research that might or might not fit directly into our editorial themes. We tried to focus attention on studies and reports that deserved a wider reception than they were getting, although we would also sometimes shine a light on high-profile research that was already being used to drive decisions and strategy at a policy level.
In addition, in 2016 we formalized a research screening process and methodology for vetting and rating all published English-language arts research on an ongoing basis. Through the end of 2016 and continuing into 2017, 12 Createquity team members and volunteers took on the task of reviewing over 500 works. The screening process involved identifying key hypotheses and research questions, describing methods, and making quick, almost gut-based recommendations on whether the work deserved greater treatment and time. In addition to providing the raw material for selecting the winner of the Createquity Arts Research Prize, the screening process also generated several capsule reviews that were not associated with a core research process investigation or a Research Spotlight article. You can review the training materials we developed to assist in this effort here.
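A screening log of this kind reduces, at minimum, to a quick rating plus a yes/no recommendation per work. The sketch below shows one hypothetical way to shortlist works flagged for deeper treatment; the titles, the 1-5 rating scale, and the field names are all our own assumptions, not Createquity's actual screening template:

```python
# Hypothetical shape for a research screening log. Titles, ratings,
# and field names are invented for illustration.
screened = [
    {"title": "Study A", "rating": 4, "recommend_deeper": True},
    {"title": "Study B", "rating": 2, "recommend_deeper": False},
    {"title": "Study C", "rating": 5, "recommend_deeper": True},
]

# Works flagged for greater treatment, best-rated first.
shortlist = sorted(
    (s for s in screened if s["recommend_deeper"]),
    key=lambda s: s["rating"],
    reverse=True,
)
print([s["title"] for s in shortlist])  # ['Study C', 'Study A']
```

Keeping the gut-level pass this lightweight is what made it feasible for a dozen reviewers to cover over 500 works; anything promising then flows into the capsule review process described earlier.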
Unfinished Business: The Synthesis Project
Although we were very proud of the research process we developed and executed at Createquity, we were not able to raise the funds necessary to complete the Discovery Phase of our research agenda. Had funding been available, we would have pursued the rest of the work using an innovative model called the Synthesis Project.
Borrowing from regranting arrangements often used by foundations and public granting agencies to reach smaller organizations, artists, and communities that they don't have the capacity to reach directly, the Synthesis Project was a strategy to dramatically scale up Createquity's Discovery Phase work over a two-year period. In this model, funding and management of research are funneled through one organization, which in turn subcontracts individual projects out at market rates to teams of consultants. Instead of one to two research investigations a year, there might then be eight to ten. And instead of multiple agencies managing different timelines and approaches, there would be one centralized agency (in this case, Createquity) coordinating and overseeing all research projects.
The goal of fast-tracking myriad research projects is to build and share knowledge fast enough that it can be acted upon. This means we aren't continually stuck in Discovery mode, and instead can move into the Strategy and Advocacy phases with a smart prioritization of the relative levels of urgency associated with a wide range of problems and opportunities facing the arts sector. At key moments, collective review of and reflection on these investigations could be used to prioritize areas for further field-wide research and advocacy.
Although Createquity was ultimately unable to transition the Synthesis Project from concept to reality, we still think it’s a great idea, and welcome efforts by others to adapt it in the future. The industry is rich with research that can and should be mined for the gems that will help us to determine where the greatest opportunities lie to advocate for and build a healthier arts ecosystem, and what questions still remain to be answered in order to help us get further along the path toward a case for change.
We invite you to read our other recommendations for the arts research field here. If you are interested in building upon Createquity’s work or discussing our findings, leave a comment on one of the articles or feel free to contact founder Ian David Moss.