SNAAP Report

(For a quick summary of this post, see “Strategic National Arts Alumni Project: The Condensed Version.” SNAAP has responded in the comments.)

Is an arts degree worth it or worthless? Many an art or art history major has had to defend the value of her studies. Indeed, in a Kiplinger article that used data from payscale.com and Georgetown University’s Center on Education and the Workforce to determine the “Worst College Majors for Your Career,” fine arts, studio arts, film/photography, graphic design, and drama and theater arts all made the list. The article warns college students who are tempted to major in fine arts that the unemployment rate for recent grads is 12.6% (almost twice the national average of 6.8%) and that they are 1.8 times as likely to work in retail as the average college graduate:

The employment situation for recent art-school grads is anything but aesthetic. Slow job growth and an abundance of fine-arts majors means unemployment is high – the second highest on our list. When fine-arts majors do find jobs, they generally don’t pay well. Even experienced artists can expect to make 20% less than their college classmates. While few people have ever gone into art for the money, the East Village isn’t as cheap as it used to be.

So the recent Strategic National Arts Alumni Project (SNAAP) report that finds that the unemployment rate for arts alumni is less than half the unemployment rate for all Americans is heartening, but surprising. Moreover, the report claims that most arts alumni “are satisfied with the opportunities their ‘primary job’ affords to demonstrate their creativity.” The sunny outlook that SNAAP presents in “A Diverse Palette: What Arts Graduates Say About Their Education and Careers” seems to be in conflict with both the aforementioned Kiplinger article and conventional wisdom. But is it? Let’s analyze what the SNAAP report actually has to say about the prospects for arts alumni.

SUMMARY

SNAAP, based at the Indiana University Center for Postsecondary Research, was founded for the stated purposes of providing a comprehensive look at artist development in the United States, and identifying how best to connect arts education and training to artistic careers. SNAAP defines “the arts” and “art” as inclusive of

 a broad range of creative activity including performance, design, architecture, creative writing, music composition, choreography, film, illustration, and fine art.

Since 2009, SNAAP has distributed a yearly report based on the results of an annual online survey that it gives to alumni of participating institutions. These institutions include arts high schools, comprehensive colleges and universities, liberal arts colleges and special-focus arts institutions. Institutions pay to participate in the project, with fees for post-secondary schools ranging from $3,300 to $7,800, depending upon the size of the arts alumni population. The survey comprises 83 questions and takes 20-30 minutes to complete.

The most recent 2012 report, based upon results from the 2011 SNAAP survey, presented many positive findings, or at least findings that appear to be positive. Most SNAAP respondents are currently employed. In fact, for the past two years, the SNAAP respondent unemployment rate has been less than half the national unemployment rate for all Americans. What’s more, 87% of currently employed SNAAP respondents are “somewhat” or “very” satisfied with their primary job. Even most of those who are working in non-arts fields report general work satisfaction, and most also say that despite working in an area outside of the arts, their arts training is still relevant to their work.

When asked if they would “do it all over again,” 77% of SNAAP respondents said that yes, if given the chance to go back in time, they would make the same choices as they originally had in terms of institution and major. Indeed, 92% reported that their overall experience at their institution was either good or excellent, and 88% would recommend their school to other prospective students.

Furthermore, arts alumni are also likely to participate in the arts outside of work. The report explains why this is meaningful:

One of the arguments for public support for the arts is that the presence and contributions of artists add depth and meaning to the human experience, thereby enhancing the quality of life for all. Thus, it’s important to know how arts graduates contribute to the arts and their communities independent of their income-producing work.

More than a quarter of SNAAP respondents have volunteered for an arts organization in the past year, and 45% have donated money to an arts organization or artist in the past 12 months. These are significantly higher rates than those in the general population, where just 2% of Americans volunteer for arts, cultural or humanities organizations, and only 6% of US households with incomes under $100,000 have given money to the arts. Moreover, in their leisure time, 72% of SNAAP respondents remain active in the arts community by creating, exhibiting and performing.

But it isn’t all coming up roses for arts graduates. Despite the encouraging employment data, 50% of survey respondents were “somewhat” or “very” dissatisfied with the career advising they received at their school. Nearly half were unhappy with the opportunities for degree-related internships and other work that their institutions provided, and 41% found opportunities to network with alumni lacking.

Additionally, 40% of currently employed respondents have two or more jobs, which may be a sign that they are unable to find full-time work or that their primary job does not provide enough income to live on. In fact, the majority of respondents from all arts majors except architecture earn less than $50,000 per year in their primary job.

Still, the 2012 SNAAP report concludes:

For many of these graduates, going to an arts training institution was “worth it”; they gained invaluable skills that they continue to draw upon whether or not they work as professional artists—both at work and in their non-work time.

ANALYSIS

Certainly the value of an arts education must be measured as more than the average earning potential of its graduates. Even so, the SNAAP report findings are inconsistent with data reported in the Kiplinger article, which states that there are higher unemployment rates for arts graduates. Why might this be? Let’s analyze the methodology of SNAAP’s research.

Respondent Representativeness

The 2011 SNAAP Questionnaire is only available online, and SNAAP relies on the individual institutions to disseminate information about the survey. This means that SNAAP survey respondents must have maintained valid email and/or mailing addresses on file with their alumni institutions, or been active enough on social media to be located either directly by their schools or by Harris Connect, the “people finder” service contracted by SNAAP. Respondents who take the time to remain in email contact with their alumni institutions may be more likely to think favorably about those institutions than those who don’t. And of the alumni contacted, those motivated to respond to the 20-30 minute survey may have been more likely to hold positive viewpoints about their institutions and present career situations.

The exclusive online availability of the survey, as well as the fact that information about the survey is primarily disseminated online, means that respondents most likely have regular internet access and are comfortable navigating the web. The survey delivery method may therefore skew toward a more well-off demographic who are able to pay for internet in their homes. The questionnaire takes 20-30 minutes to complete. If you’re using your local coffee shop’s internet to look for work and email resumes, you may not be as inclined to use a half hour of that time to complete an alumni survey as someone who is sitting at home after work watching Hulu and cruising Facebook.

Anecdotally, I often hear of individuals from well-off backgrounds enjoying success in the arts, in part because they are able to afford to take unpaid internships, or participate in residencies where they generate no income for months at a time. These opportunities may lead to greater things. Additionally, these are pleasurable experiences, and although one may not be earning a large income, a SNAAP respondent may still reflect on them positively, especially if paying rent and buying food isn’t a concern. Thus, it’s possible that the SNAAP respondents are more financially comfortable in the aggregate than arts alumni as a whole, and if so we might expect to see this bias reflected in the responses. (Unfortunately, while the SNAAP survey asks the age, gender and ethnicity of each respondent, it does not ask any questions about the socio-economic background of the respondent, or whether he or she has additional financial support or means beyond his or her income.)

Danielle J. Lindemann and Steven J. Tepper’s recently published follow-up report on SNAAP’s 2010 survey, which was structured and delivered in the same way as the 2011 survey, acknowledges the survey’s potential response bias: “It is plausible that more financially successful arts graduates are more likely to fill out a survey about their experiences.” However, they counter that results from an earlier study indicate that the SNAAP results are not skewed in this respect. In 2009, SNAAP conducted a shadow study using a variety of incentives, such as a $15 gift card and entry into a lottery for a $100 award, as well as different modes of delivery, including paper, web and phone. Their report explains:

We found that there were no meaningful differences in the characteristics of the graduates in the high response rate group compared to the low response rate group. Related to the question of employment, for example, respondents from the higher-response rate sample indicated that they were currently doing paid work an average of 31 hours per week. Comparable individuals from the full SNAAP sample in 2009 indicated that they were doing paid work an average of 34 hours per week.

Still, even if alumni who are financially better off weren’t more likely to respond to the SNAAP survey, the issue of which alumni the SNAAP survey reached remains. More than 36,000 alumni responded to the survey, for an average institutional response rate of just over 20%. But alumni who have kept their institutions updated with current contact information may already be more inclined to respond positively about those institutions than those who haven’t.

Even if there is no bias among respondents to the survey, however, the selection of institutions participating in SNAAP is not random. The 2011 SNAAP survey was sent to alumni from 183 programs in 66 institutions, including 8 arts high schools, 20 private nonprofit postsecondary schools, and 38 public postsecondary schools. These postsecondary institutions include colleges and universities with top-ranked arts programs, such as Maryland Institute College of Art, New York University Tisch School of the Arts, University of California – Los Angeles and Virginia Commonwealth University, as well as a number that are less well known for their arts programs. At first glance, this seems like a reasonable cross-section of higher arts education across the nation. Still, these institutions have all chosen to participate in SNAAP, and paid a fee for the privilege of doing so. This suggests that they are already paying above-average attention to students and alumni of their arts programs. What’s more, of the 58 postsecondary institutions, more than 30% have art programs in disciplines that include fine arts, ceramics, graphic design, multimedia/visual communications, painting/drawing, photography, printmaking, and sculpture, ranked in the top 20 by US News. To put this in perspective, the College Art Association’s directory of graduate programs in Studio Art and Design includes almost 250 individual institutions. These arts graduates, with degrees from esteemed institutions, may be more financially successful and happier with their arts training than arts alumni from second- and third-tier schools.

SNAAP’s efforts to increase and test response rates, through its use of Harris Connect and the shadow survey, are commendable. Still, the report repeatedly uses “arts alumni,” “arts graduates” and “SNAAP respondents” interchangeably. Are respondents to the survey representative of arts alumni as a whole? Unfortunately, there is no way for us to know.

Conflicting Interests

As mentioned earlier, participating institutions pay to have the survey sent to their alumni. This potentially creates a bias, if not in the survey’s responses, then in how this data is ultimately interpreted in the final SNAAP report and “packaged” to a broader audience. SNAAP explains why it requires fees from participating institutions:

…as a self-sustaining research project, institutional participation fees underwrite the cost of survey administration, data analysis, and school reports.

In other words, SNAAP needs these fees in order to remain viable. Although the individual institutions’ reports aren’t made public, schools might understandably be hesitant to participate in a study that openly casts doubt on the value of an arts education. In turn, that lack of participation could mean the end of SNAAP. Indeed, the SNAAP report does include negative statistics, but they are always countered with a positive statement, so that the overall tone and takeaway are optimistic. For example, the following passage from the SNAAP report came after a list of mixed results about alumni satisfaction with various aspects of their education:

While the results suggest a variety of strengths and weaknesses for institutions to consider, they also indicate that despite any less than stellar experiences alumni may have had, most who obtained an arts degree have few regrets. When asked if they would still attend their institution if they could start over again, over three quarters (77%) say definitely or probably yes. Furthermore, when asked if they would recommend their institution to another student like them, 88% say yes.

Although the 2011 report findings are not drastically negative overall, it’s unclear whether SNAAP would be in a position to draw attention to a significant and sustained deterioration in these numbers in the future.

Data Comparability

SNAAP reports that the SNAAP respondent unemployment rate is less than half the national unemployment rate for all Americans, which in 2011 was 8.9%. This figure becomes less impressive when compared with the unemployment rate for college-educated Americans, a group that includes most SNAAP respondents. The 2011 national unemployment rate for college graduates was 4%; for SNAAP respondents with a bachelor’s degree, it was also 4%, and for those with a master’s, it was 5%. Still, it would appear that arts alumni are not significantly better or worse off than college graduates with non-arts majors, which conflicts with the Georgetown research cited in the Kiplinger article. How could this be?

SNAAP explains that, unlike Carnevale, Cheah and Strohl’s Georgetown Center on Education and the Workforce study, its employment figures are based upon different measures than those used by the Bureau of Labor Statistics (BLS):

The difference in employment numbers between data from SNAAP and from other sources may be due in part to SNAAP’s employment measures, which include intermittent work—not uncommon among professional artists—as among the ways of being employed. The U.S. Census, for example, would label such people as unemployed.

In particular, SNAAP may be counting severely underemployed respondents as employed because they identify as self-employed/freelancers. Freelancers are often more affected than traditional employees during a recession, from which the US continues to recover. When pockets are tight and businesses aren’t growing, customers are more likely to view purchases of goods like art as luxuries, to treat continuing-ed or community art classes as optional, and to have less need for services like design. In fact, BLS projects that employment opportunities for craft and fine artists will grow by 5% over the next decade, slower than the 14.3% average projected for all occupations.

Because of the nature of SNAAP, the project only collects data about arts alumni. However, in order to truly draw conclusions about the value of an arts degree, we would need to see data collected in the same way for a cross-section of majors. Unfortunately, one of the few metrics that could potentially serve as a common yardstick is complicated by SNAAP’s alternative approach to measuring employment status. SNAAP could easily have designed its survey to enable analysis of unemployment figures comparable to BLS statistics alongside numbers derived from the alternative method. It’s unclear why this path wasn’t taken.
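To make the distinction concrete, here is a minimal sketch of how one set of survey responses could feed both measures. The field names and toy respondents are hypothetical, and both definitions are simplified (the official BLS concept includes additional criteria, such as availability for work); this is an illustration of the general idea, not SNAAP’s or BLS’s actual coding.

```python
# Illustration only: hypothetical fields, simplified definitions.

def bls_style_rate(respondents):
    """BLS-style: a person without a current job counts as unemployed only
    if seeking work; a person neither working nor seeking work is out of
    the labor force and excluded from the rate."""
    labor_force = [r for r in respondents if r["has_job"] or r["seeking_work"]]
    unemployed = [r for r in labor_force if not r["has_job"]]
    return len(unemployed) / len(labor_force)

def broader_rate(respondents):
    """SNAAP-style: intermittent or freelance work also counts as employment."""
    def working(r):
        return r["has_job"] or r["works_intermittently"]
    labor_force = [r for r in respondents if working(r) or r["seeking_work"]]
    unemployed = [r for r in labor_force if not working(r)]
    return len(unemployed) / len(labor_force)

respondents = [
    {"has_job": True,  "works_intermittently": False, "seeking_work": False},
    {"has_job": False, "works_intermittently": True,  "seeking_work": True},
    {"has_job": False, "works_intermittently": False, "seeking_work": True},
    {"has_job": False, "works_intermittently": False, "seeking_work": False},
]
print(f"BLS-style rate: {bls_style_rate(respondents):.0%}")  # 67%
print(f"broader rate:   {broader_rate(respondents):.0%}")    # 33%
```

The same four answers produce two very different headline numbers, which is why reporting both would have made SNAAP’s figures directly comparable to BLS statistics.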

Considering the Alternatives

Many of the questions in the SNAAP survey ask alumni about subjective impressions, like these:

In your opinion, how much did [INSTITUTION] help you acquire or develop each of the following skills and abilities?

Please describe how your arts training is or is not relevant to your current work.

Describe how your arts training at [INSTITUTION] is or is not relevant to your participation in civic and community life.

Certainly most arts alumni value the arts—that’s why they chose an arts major—and the SNAAP survey provides them with the opportunity to opine about the importance of the arts. But for most arts alumni, the choice was not between pursuing an arts degree or doing nothing. Instead, the choice was between studying the arts and studying something else. Or perhaps it was between pursuing an arts-related graduate degree and gaining additional experience in the workplace, or investing in a home. Respondents weren’t asked to reflect upon the sacrifices that they might have made in choosing their field of study.

I have no doubt that the education that arts alumni received contributed to skill sets that are relevant to their primary employment and civic engagement. But it’s also possible that alumni may have developed equivalent or better skill sets related to their current employment with an alternative course of study. Still, it is encouraging to learn that most SNAAP respondents have found work “congruent with their values and dispositions” that affords them the opportunity to “demonstrate their creativity.”    

IMPLICATIONS

So what does this mean? Certainly not that the SNAAP study has no value, nor that arts education is worthless. But the SNAAP report is largely inconclusive. SNAAP respondents are not necessarily reflective of the larger pool of arts alumni. A significant proportion of the participating institutions have top-ranked arts programs. It’s possible that a sample with alumni from a greater range of arts programs would produce the same results, but it’s also possible that SNAAP’s sample is biased. The report also presents an alternative way of measuring employment data, which could be valuable in creating a more accurate view of the employment situations of those who freelance or are self-employed, regardless of whether that work is arts related. Unfortunately, absent a comprehensive employment survey across fields using SNAAP’s method, the report tells us very little about how arts alumni compare to holders of college and graduate degrees as a whole.

As an artist, arts educator, and arts administrator, I want to believe that an arts education is always worth it. Certainly the median salary of arts alumni should not be the sole factor in determining its value. However, with the average student loan debt for recent college graduates at over $26,000, and many borrowing beyond that to pursue graduate degrees, financial considerations shouldn’t be eliminated from the equation entirely. As I read the SNAAP report, I found myself thinking about Vanderbilt University Law Professor Herwig Schlunk’s groundbreaking 2009 academic essay, “Mamas Don’t Let Your Babies Grow Up To Be… Lawyers,” in which he calculates opportunity and out-of-pocket costs for law students, and the likely return for different types of students, depending upon their class rank and the prestige of their program. He concludes that for many, particularly average students graduating from second- and third-tier schools, law school is a losing proposition.

It’s not possible to determine for whom an arts education is a worthy investment because no similar study has been done for art students. The difficulty for SNAAP is that its first responsibility is to analyze data for paying customers, not to use that data as field research to draw larger conclusions. Still, at this time SNAAP is the most comprehensive data resource available for pre-professional arts education.

Few people go into the arts for the money. It’s likely that most of us instinctively knew that the median salary of an arts major would be significantly less than that of an engineering or business major, even before Georgetown released their report. SNAAP’s report does tell us that although a career in the arts may not be incredibly lucrative, not all artists are starving. But it also reminds us that an education should be valued for more than the average earning potential of its graduates:

… the worth of an arts degree must be measured by both pecuniary and non-pecuniary benefits. Much has been made of recent reports using national income data showing that arts graduates have lower than average earnings… Tangible economic benefits are unquestionably important, but calibrating the success of arts graduates only by how much they make does a disservice not only to those who practice their art and apparently derive great satisfaction from doing so, but also to the communities they enrich with artistic contributions through sharing their artistic creations, teaching, and supporting other artists.

FURTHER READING

For those who would like to dig into SNAAP’s data in more detail, the 2011 Aggregate Frequency Report is what you’re looking for.

The 2011 SNAAP Annual Report covers the 2010 survey, along with the 2010 Aggregate Frequency Report.

Danielle J. Lindemann and Steven J. Tepper’s SNAAP special report, “Painting With Broader Strokes: Reassessing the Value of an Arts Degree,” addresses potential bias issues and further analyzes the 2010 survey findings.

SNAAP publishes nifty interactive visualizations of its survey data, called SnaapShots. Here is the SnaapShot of the 2011 survey data, and the previous version, SnaapShot 2010.

Daniel Luzer critiques the SNAAP survey bias in Washington Monthly.

The Kennedy Center discusses the survey.

SNAAP manager Sally Gaskill blogs about SNAAP’s findings for Americans for the Arts.

COMMENTS
  • http://createquity.com Ian David Moss

    One interesting thing I noticed, while poring over the Aggregate Frequency Report mentioned in the “further reading” section, is that the graduates of arts high schools, which represent the smallest cohort of survey respondents, have drastically more positive views of their institutions. For example, 82% of these respondents reported that they had an “excellent” experience overall, compared to 50% of college alumni and 53% of graduate program alums. 86% of the high schoolers said they would “definitely” attend their institution again, compared to 42% of the college grads and 43% of the master’s/doctoral/diploma students. And an amazing 97% would recommend their institution to a student like them, versus 87% and 85% for the other two cohorts. Not surprisingly, the high schoolers had a more diverse range of career paths and fewer of them intended to become professional artists, although those differences are not as large as those above.

  • http://arts.vcu.edu Sarah B. Cunningham

    Createquity recently asked important questions about the Strategic National Arts Alumni Project (SNAAP) regarding response rates, bias and value. Is SNAAP getting reasonable response rates? Can online surveys with a 20% response rate be statistically significant? Do participation fees bias the survey to produce false positive results? Are the SNAAP results suspiciously “positive”? As they should, Createquity authors demand high standards in data, analysis, and writing. And yet, SNAAP does make concerted efforts to proceed with caution, establish scientifically valid response rates, and maintain research integrity to circumvent bias.

    Createquity suggested that the use of online surveys means that SNAAP is more likely to receive results from a “well-off demographic.” While SNAAP’s demographic could be skewed in a number of ways, the online surveys won’t necessarily skew to the income demographics of arts graduates. Why? Recent research indicates that 90% of college graduates use the internet. By seeking online data from college graduates, the SNAAP survey might expect that 90% of potential respondents may have access to the survey, regardless of whether or not they pay for an internet connection in their home. Even so, the research on internet usage also suggests that 83% of individuals earning more than $30,000 will be online. With the median arts salary at about $45,000 per year (based on US Census data), SNAAP has a good chance of reaching its target audience through the internet. Additionally, survey researchers utilizing phone or mail techniques can’t always persuade respondents to participate. In 2012, Pew Research Center reported that they could successfully survey only 9% of households using phone and mail techniques.

    As a relatively new survey, is the 20% SNAAP response rate reflecting accurate information about college graduates? After all, many graduates will have never heard of the survey until their institution sends them a link. And many young graduates move frequently, making them difficult to track. Pew Research Center for the People and the Press reported that when comparing surveys that range from Standard (9%) to High-Effort (22%), “the vast majority of results did not differ between the survey conducted with the standard methodology and the survey with the higher response rate.” They conclude, “in general, the additional effort and expense in the high-effort study appears to provide little benefit in terms of the quality of estimates.” Therefore, the 20% response rate can yield valid results and, even with an exclusively online approach, comes very close to a “high-effort” yield.

    But if top-ranked schools participated in SNAAP, does this mean the survey caters to respondents who are “more financially successful and happier”? Arguing that school-ranking correlates to individual income and also happiness is quite a leap. Since ranking reflects evaluation of graduate programs by peer institutions, it fails to provide us with information about undergraduate arts education. Can one even assume that college selectivity could be causally linked to “happiness” amongst alumni? Not necessarily. In fact, the SNAAP study reveals no correlation between income and school satisfaction.

    Regardless, could institutional fees for survey administration contribute to bias as suggested by the critique? Institutional fees do not bias the outcomes. While the SNAAP team remains vigilant to potential distortions in data, institutions pay for the benefit of impartial data. Collective fees have made it possible to create an instrument far less biased than any one school’s self-administered alumni survey. As one leader recently noted, ‘it would be impossible for any one school to afford or develop such a sophisticated tool with national benchmarks.’ To establish research validity, SNAAP has employed the national leaders in higher education survey research at Indiana University’s Center for Postsecondary Research and Center for Survey Research. These efforts are complemented with research oversight provided by Vanderbilt University’s Curb Center for Art, Enterprise and Public Policy, whose nationally recognized researchers produce scholarly analyses of aggregated SNAAP data. The reputation of these research centers is established precisely by producing unbiased, scholarly reports to deepen knowledge of campus culture, student engagement, and learning outcomes. As a project embedded within research universities, SNAAP must maintain high research standards consistent with other university endeavors.

    But, still, are the results “too positive?” The national data will shift from year to year depending on the cohort of institutions participating; during the past two years, a total of 124 unique institutions have partnered with SNAAP. Each institution receives a confidential report on the responses of its alumni that can provide more detailed accounts of alumni dissatisfaction. Createquity researchers likely did not have the opportunity to read the candid alumni discontent reflected in comments and statistics focused on a single school. Institutions can consider this feedback when making strategic decisions about campus culture, curriculum, or opportunities. Likewise, the alumni reports can often confirm existing wisdom about a weakness. On many occasions, students, faculty and alumni have already vocalized their concerns through existing feedback mechanisms and the school has already taken action to introduce improvements related to targeted weaknesses. At the same time, individual schools must contextualize the value (positive or negative) of feedback, depending on their history and strategic investments. They alone must decide what it means when alumni describe challenging teachers, the amount of free time, or the quality of coursework. In reading a comprehensive institutional report, one encounters a rich dialogue between what the numbers say, and how the alumni narrative comments shed light on what those numbers mean.

    Thank you to Jacquelyn Strycker for her questions about the research quality of the SNAAP project. SNAAP’s collective effort to maintain research excellence should dispel any perception that arts schools are soft on accountability. I hope that ongoing dialogue with the public will continue to sharpen our efforts and tell the stories of the 3 million arts graduates nationwide. Thanks to the Surdna Foundation’s visionary initiation of this project in 2002, arts schools can now respond to the increased scrutiny of volatile funding environments. We can now provide the public with more accurate information about arts careers, illuminating public policy debates through ongoing research.

    Dr. Sarah Bainter Cunningham
    Executive Director, Research, VCU School of the Arts
    National Advisory Board Member, SNAAP

    • Jacquelyn Strycker

      Thank you, Dr. Cunningham, for this thoughtful response. However, it does seem as though you may have misinterpreted my critique of how the exclusive online availability of the survey may bias responses toward a more well-off demographic. I don’t doubt that 90% of college graduates use the internet, and that a large number of the arts alumni surveyed would have found a link to the SNAAP Questionnaire in their inbox. But my argument is that those who are unemployed or underemployed, and accessing the web from a coffee shop, the public library or a friend’s house because they are unable to afford a connection in their home, may be less likely to actually respond to the online survey they’ve received than those who are able to access the internet from their homes. Why? Because that valuable time on the internet is being used to work or find work. But for those who have internet access in their homes, possibly because they are more financially stable, spending a half hour completing a survey one evening is less of a big deal.

      Does the internet exclusivity of SNAAP’s survey create a bias? I don’t know, but it’s plausible that it could.

      What are other options, particularly if phone surveys only yield a 9% response rate? The Georgetown Center on Education and the Workforce study that I cited gathered much of its data from the American Community Survey. Conducted by the US Census Bureau, the ACS uses multiple modes of data collection: a mailer, a telephone follow-up for those who don’t respond, and a personal visit for those who still don’t respond. Responses can be sent in online or via mail, or they can be collected over the phone or in person. That might not make sense for SNAAP, but SNAAP could continue to use email as its main mode of distribution, and then follow up with non-responders via phone calls and snail mailers. The survey could also be available to download as a PDF that could be printed and returned via USPS, in addition to its current online format.

      Would a “high-effort” yield make a difference for this survey? Perhaps it would, and perhaps it wouldn’t. But, since SNAAP is relatively new, and is currently leading the way in conducting research about pre-professional arts education, I’d like to see them err on the side of comprehensiveness.

  • http://createquity.com Ian David Moss

    I think it may be helpful to take a step back and consider things from a broader frame here. SNAAP is really two entities in the guise of a single project. On the one hand, it is a service, one that provides a standardized methodology and centralized resources for gathering information about the alumni of its institutional customers. On the other, it is a dataset, a gigantic one, consisting of responses from arts alumni of all kinds to questions about their experiences and satisfaction with their schooling, current employment and financial circumstances, and other items of interest.

    I don’t think Jacquie or I have much to quibble with about the first “version” of SNAAP. Sarah is absolutely right that it would be difficult and far more expensive for schools to undertake an equivalent study on their own. SNAAP’s model admirably scales resources, allowing those schools to understand how their alumni stack up against those of other schools, when asked the same questions and approached in the exact same manner. If I were running an art school, I would have it sign up to participate in SNAAP in a heartbeat on the basis of that comparative data alone.

    But that’s not what Jacquie’s Arts Policy Library piece is analyzing. She is writing about the second “version” of SNAAP, the one that takes the resulting data and publishes research reports based on it, research reports that attempt to draw conclusions about the entire universe of arts graduates on the basis of that data.

    It is this version of SNAAP that we find problematic. Although Jacquie makes several critiques throughout her article, by far the most important is this question of representativeness. What does a nonrandom sample of alumni of a nonrandom sample of institutions tell us about the whole in a situation like this?

    Survey bias (both bias arising from people choosing not to respond to the survey and coverage error, i.e., failing to even reach people who would have been eligible to take your survey) is a complex and subtle thing. A survey can be biased without that bias skewing the results. For example, if a political poll administered by telephone reaches mostly people who are more interested in politics, but interest in politics doesn’t correlate with how they’ll vote in the election, the poll will still be useful for determining the likely election results.
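    To see the point in code: below is a toy simulation (my own, with made-up numbers) of the telephone-poll example. Interest in politics quadruples the odds of responding, so the sample is badly biased toward the politically engaged, but because interest is independent of vote intention, the poll’s estimate is still on target.

    ```python
    # Toy simulation: a biased sample need not produce a skewed estimate.
    import random

    random.seed(0)
    population = []
    for _ in range(100_000):
        interested = random.random() < 0.5   # interested in politics?
        votes_a = random.random() < 0.40     # vote intention, independent of interest
        population.append((interested, votes_a))

    # Interested people are four times as likely to answer the phone.
    sample = [p for p in population if random.random() < (0.8 if p[0] else 0.2)]

    true_share = sum(v for _, v in population) / len(population)
    sample_share = sum(v for _, v in sample) / len(sample)
    print(f"true support:    {true_share:.3f}")   # ~0.400
    print(f"sampled support: {sample_share:.3f}") # ~0.400, despite the bias
    ```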

    Of course we don’t know whether SNAAP’s results are skewed or not. But the core problem is that there are several highly plausible reasons, all of which Jacquie lays out in her piece, to think that the results might be skewed in one particular direction. By contrast, we have no compelling reasons to think that the results might be skewed in the opposite direction. It seems that the best we can come up with in SNAAP’s defense is reasons why the results might be on target, or not skewed that much. To me, that suggests that we’re probably looking at a biased sample. The question is, biased by how much?

    Once again, this only matters insofar as we are trying to generalize from the respondents in SNAAP’s survey to those who didn’t respond, either because they chose not to or because they didn’t have the opportunity. That’s why the first version of SNAAP, the service for participating institutions, can still be useful. Even if there is a bias in the SNAAP dataset, there’s no systemic reason to think that bias would differ from one school to the next. So if my goal is to compare my school’s results to those of others, I’m still getting a valid comparison, because all of the schools’ graduates were surveyed the same way. We just have to be realistic about which students we’re hearing from.

    I should add that it’s clear to me that SNAAP takes itself and its work seriously. I was pleasantly surprised to learn both that the project works with Harris Connect to track down alumni who have lost contact with their institutions, and that a shadow version of the survey with incentives was tested to assess the feasibility and value of increasing response rates. Both of these actions are indicative of a seriousness of purpose that is to be commended. If SNAAP’s primary goal was to generate feel-good numbers to pump up arts training programs, there would be much cheaper and easier ways of going about it.

    Despite these positive steps, though, questions remain. (The “Painting with Broader Strokes” report cites several papers with intriguing titles such as “An analysis of mode effects in three survey modes in the Strategic National Arts Alumni Project” and “Lower response rates on alumni surveys might not mean lower response representativeness,” but these do not appear to be available online.) I just don’t know if it’s realistic to expect that broad conclusions about arts graduates as a whole can ever be reliably drawn from SNAAP data, no matter how many safeguards are put in place to increase response rates. I may well be proven wrong about that, and hope that I will be. But in the meantime, I would advise caution in applying the SNAAP results beyond the survey sample.

  • Steven Tepper

    Great comments, Ian and Jacquelyn. You raise important points about the challenges of survey research. This project is at the beginning stages and we will learn more about this incredible data as the research advances. As research director, I can confidently say that we are not disregarding potential bias. We have done a bias study where we tripled the response rate for five schools through multiple modes and found very little difference between the respondents. For example, in the regular SNAAP pool, 87.5 percent of respondents rated their school experience as good or excellent, compared to 87.4 percent in the higher response rate sample. In terms of life satisfaction, 74 percent in both the regular pool and the higher response rate pool said their lives are close to their ideal. There are additional comparisons we are still looking into as we study potential bias, but our first attempt suggests that on the issues we discuss in our recent research report, we do not think we are subject to bias that would change our overall interpretation.

    Likewise, while future reports will weight the data to take into account differences in the sample pool, the results in the most recent report confirm the general findings from previous years when the sample was different. In other words, we have no reason to suspect that on the issues we are writing about, the pool of institutions in the sample will change our findings in any substantive way. This is what scholars must always do: consider the limitations of their data and then report findings when they believe the substance of their findings would not be contradicted by any improvements in the data. That is the case with the recent SNAAP report. So, we acknowledge that there can be bias; we will continue to improve our understanding of what that bias might be; and we will only report those findings where we think we are on pretty safe ground in terms of the general story we are telling.

    • http://createquity.com Ian David Moss

      Thanks, Steven. If there is a possibility of releasing the full results of that bias study to the public, I am sure we would all learn a lot.

      I am curious, have you or the SNAAP team given any consideration to trying to model the nonresponses on the basis of that study, or apply weights to the survey results in some way? (By degree awarded/major would seem to be an obvious one.) It seems like either of these methods might yield more precise estimates without significantly adding to the cost, no?
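      For the curious, here is a minimal sketch of what such post-stratification weighting might look like, with invented shares and outcomes (SNAAP’s actual strata and figures would differ): each respondent is weighted by the ratio of his or her major’s share of the invited population to that major’s share of the respondents.

      ```python
      # Hypothetical post-stratification by major; all numbers invented.
      from collections import Counter

      # Known major shares among all invited alumni.
      population_share = {"fine_arts": 0.50, "design": 0.30, "architecture": 0.20}

      # (major, employed?) pairs; design over-responds relative to its share.
      sample = ([("fine_arts", True)] * 70 + [("fine_arts", False)] * 30 +
                [("design", True)] * 95 + [("design", False)] * 5 +
                [("architecture", True)] * 18 + [("architecture", False)] * 2)

      counts = Counter(major for major, _ in sample)
      n = len(sample)
      weight = {m: population_share[m] / (c / n) for m, c in counts.items()}

      raw = sum(emp for _, emp in sample) / n
      weighted = (sum(weight[m] * emp for m, emp in sample) /
                  sum(weight[m] for m, _ in sample))
      print(f"raw employment rate:      {raw:.1%}")       # 83.2%
      print(f"weighted employment rate: {weighted:.1%}")  # 81.5%
      ```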

  • http://createquity.com Ian David Moss

    A new article from the Wall Street Journal analyzes data from the Department of Education and finds that “median debt loads at schools specializing in art, music and design average $21,576.” The data refers only to students who have loan debt burden, and (as far as I can tell) just to those getting up to a Bachelor’s degree. A true apples-to-apples comparison is not possible because the WSJ is averaging median debt burdens by school, rather than finding the overall median debt burden across an entire alumni population, and is isolating its analysis to entire schools “specializing” in art, music and design rather than programs with that specialization. But for what it’s worth, the roughly equivalent figure for SNAAP works out to somewhere between $15,000 and $20,000. (I got this by looking at the distribution of the 49% of SNAAP undergraduate institution alumni respondents who reported some student loan debt; see page 60 in the link above.)
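    For anyone who wants to reproduce that kind of back-of-envelope figure, here is a minimal sketch of interpolating a median from binned frequencies. The bins and shares below are placeholders, not the actual distribution from page 60 of the Aggregate Frequency Report.

    ```python
    # Estimate a median from binned data by linear interpolation.
    bins = [  # (lower bound, upper bound, share of debt-holders)
        (0,      10_000, 0.30),
        (10_000, 20_000, 0.30),
        (20_000, 30_000, 0.25),
        (30_000, 50_000, 0.15),
    ]

    cumulative = 0.0
    for lo, hi, share in bins:
        if cumulative + share >= 0.5:
            # The median falls in this bin; interpolate within it.
            median = lo + (0.5 - cumulative) / share * (hi - lo)
            print(f"estimated median debt: ${median:,.0f}")  # $16,667
            break
        cumulative += share
    ```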