Title: The Cultural Lives of Californians: Insights from the California Survey of Arts & Cultural Participation
Author(s): Jennifer Novak-Leonard, Michael Reynolds, Ned English, Norman Bradburn
Publisher: The James Irvine Foundation
Methods: Survey of 1,238 California adults by telephone (both landline and cellphone), stratified by urban/rural county and including oversamples for black and Chinese respondents
What it says: Conducted as a companion piece to “A Closer Look at Arts Engagement in California,” “The Cultural Lives of Californians” attempts to model a different approach to cultural participation surveys. Rather than limiting the inquiry to closed-ended questions about specific types of participation, the survey opens with an open-ended prompt about the types of creative activities the respondent participates in. The report seeks to understand how and where Californians participate in arts and culture, the value they gain from such participation, and the role technology plays in their cultural lives. It broadly finds patterns of engagement similar to those of previous surveys, particularly “A Closer Look at Arts Engagement in California,” but with a few notable twists. In particular, “The Cultural Lives of Californians” draws attention to a pattern of lower participation among immigrants, even those of the same ethnic background as non-immigrants. The authors suggest that immigrants work more hours and have less leisure time than the rest of the population, which could account for the difference. (Unfortunately, the report does not ask about barriers to participation.) The report also highlights the very significant effect of age on art-making (as distinct from arts attendance), which overshadows the effects of income and education (though those effects do not disappear entirely).
What I think about it: The report provides a tremendous level of transparency regarding the survey’s design and administration, and it approaches its core questions through both primary and secondary research methods; both are welcome. A significant investment was made in the primary data collection component of the work, with robust methodological principles applied. Despite this, the values reported by respondents to “The Cultural Lives of Californians” are consistently higher, sometimes dramatically so, than those seen for comparable questions in the Survey of Public Participation in the Arts (SPPA) and other data sources. For example, the California Survey reports a prevalence of acting six times higher than the SPPA; four times higher for purchasing or acquiring art; and double the rate of attending a cultural fair or festival. By way of explanation, the authors cite the broader frame of the survey as a whole (which doesn’t explain the difference on directly comparable questions) and cast doubt on the methodology of the SPPA, implying that the abrupt transition to the set of questions about arts and culture, as well as the switch in recall period from the past week to the past year, depresses its results. While this is probably true to some extent, the authors seem to go out of their way not to consider another possibility: that the California Survey may suffer from increased nonresponse bias. Its response rate is substantially lower than that of both the SPPA and the General Social Survey, which raises the risk of bias given the targeted nature of the survey (i.e., it is upfront about being interested in people’s cultural lives, so people with richer cultural lives were probably more likely to respond).
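The nonresponse mechanism suspected above is easy to illustrate. The sketch below (in Python, with entirely hypothetical numbers; none come from the California Survey or the SPPA) simulates a population in which culturally engaged adults are three times as likely as everyone else to answer a survey that announces itself as being about culture. The naive estimate computed from respondents then substantially overstates the true participation rate.

```python
import random

random.seed(0)

# All figures here are assumptions for illustration, not survey data:
# in a population where 20% of adults engage in a given activity,
# suppose engaged adults are three times as likely to respond.
POP = 100_000
TRUE_RATE = 0.20
P_RESPOND_ENGAGED = 0.15   # assumed response propensity if engaged
P_RESPOND_OTHER = 0.05     # assumed response propensity otherwise

respondents = []
for _ in range(POP):
    engaged = random.random() < TRUE_RATE
    propensity = P_RESPOND_ENGAGED if engaged else P_RESPOND_OTHER
    if random.random() < propensity:
        respondents.append(engaged)

# The estimate among respondents, ignoring who chose not to answer
observed = sum(respondents) / len(respondents)
print(f"true rate: {TRUE_RATE:.0%}, observed among respondents: {observed:.0%}")
```

Analytically, the respondent pool here should show a rate of about 0.20 × 0.15 / (0.20 × 0.15 + 0.80 × 0.05) ≈ 43%, more than double the true 20% — which is the shape of distortion the response-rate concern points to.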
The authors assert repeatedly that the report asks about a wider range of arts activities than the 2012 SPPA, but the difference is not enormous, with perhaps half a dozen new activities gauged in addition to the open-ended prompt at the beginning. That said, one important difference is that examples are provided for many of the prompts, which may encourage respondents to think more expansively about experiences they’ve had that qualify as arts-related.
Overall, the California Survey and accompanying report excel in two respects: the design of the survey, particularly the questions, and the level of transparency and rigor involved in validating the results against other data sources and making the data available for others to use. The administration of the survey seems not to have been robust enough to overcome nonresponse bias, making it less useful than it could have been, but it is a great model for integration into the next edition of the SPPA.
What it all means: There are really two stories here – one about survey methodology and one about arts participation patterns. Regarding the latter, even with potential bias we can establish an upper bound on overall participation: 6.3% of respondents couldn’t name a single cultural activity they participate in, even with an open-ended prompt, so overall participation can be no higher than about 94% (and is likely lower, since any nonresponse bias would run toward the culturally engaged). It’s also interesting that about 15% of the remaining responses had no correspondence with the closed-ended questions about specific types of participation. In terms of Createquity’s original interest in the study – determining whether low-SES adults are more likely to participate in the informal arts than high-SES adults – we don’t see much evidence here to suggest that’s the case. Certainly, the differences across education and especially income are muted compared to various forms of physical attendance, but no category of participation demonstrates a clear pattern in the opposite direction – that is, higher participation by adults in the bottom income quartile or who never attended college. Put another way, while economically disadvantaged adults may be more likely to sing to themselves or dance with friends than to see the opera, the same is true of people with a college degree.
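The bounding arithmetic above is simple enough to spell out, using only the two figures quoted from the report:

```python
# Two figures quoted from the report; everything else is arithmetic.
no_activity = 0.063    # share of respondents naming no cultural activity at all
uncaptured = 0.15      # share of remaining open-ended responses not matched
                       # by any closed-ended participation question

# If 6.3% report no activity even under an open-ended prompt, overall
# participation cannot exceed the complement.
upper_bound = 1 - no_activity
print(f"upper bound on overall participation: {upper_bound:.1%}")   # 93.7%

# A purely closed-ended instrument would have captured at most 85% of
# the activities respondents volunteered, hinting at how much a
# conventional survey design may undercount.
captured = 1 - uncaptured
print(f"open-ended responses matched by closed-ended items: {captured:.0%}")  # 85%
```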
The second story here is about what it takes to conduct a truly reliable survey. Response rates for phone surveys continue to deteriorate, and there is a strong suggestion here that even with a random sample and a professional approach, a survey that signals the subject under study too strongly can bias results if part of what’s being measured is interest in that subject. This has implications for every arts organization that surveys its own audience members, an extremely common practice throughout the industry. If your market research relies on people taking the time to tell you whether they’re interested in what you have to offer, odds are you’ll be hearing from the most interested people.