<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Createquity.</title>
	<atom:link href="https://createquity.com/tag/decision-analysis/feed/" rel="self" type="application/rss+xml" />
	<link>https://createquity.com</link>
	<description>The most important issues in the arts...and what we can do about them.</description>
	<lastBuildDate>Wed, 15 Jul 2020 20:17:39 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Fractured Atlas as a Learning Organization: An Introduction</title>
		<link>https://createquity.com/2013/10/fractured-atlas-as-a-learning-organization-an-introduction/</link>
		<comments>https://createquity.com/2013/10/fractured-atlas-as-a-learning-organization-an-introduction/#respond</comments>
		<pubDate>Thu, 31 Oct 2013 13:03:06 +0000</pubDate>
		<dc:creator><![CDATA[Ian David Moss]]></dc:creator>
				<category><![CDATA[Research]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[decision analysis]]></category>
		<category><![CDATA[Fractured Atlas]]></category>
		<category><![CDATA[Fractured Atlas as a Learning Organization]]></category>
		<category><![CDATA[measurement in the arts]]></category>
		<category><![CDATA[risk]]></category>
		<category><![CDATA[strategy]]></category>

		<guid isPermaLink="false">https://createquity.com/?p=5743</guid>
		<description><![CDATA[(Cross-posted from the Fractured Atlas blog, as I expect many Createquity readers will be interested in this series. -IDM) If you&#8217;ve been paying any attention at all to technology trends the past few years, you know that we live in the era of Big Data. All of those videos we upload to YouTube, hard drives <a href="https://createquity.com/2013/10/fractured-atlas-as-a-learning-organization-an-introduction/" class="read-more">Read&#160;More</a>]]></description>
				<content:encoded><![CDATA[<p><em>(Cross-posted from the <a href="http://www.fracturedatlas.org/site/blog/">Fractured Atlas blog</a>, as I expect many Createquity readers will be interested in this series. -IDM)</em></p>
<p>If you&#8217;ve been paying any attention at all to technology trends the past few years, you know that we live in the era of <a href="http://en.wikipedia.org/wiki/Big_data">Big Data</a>. All of those videos we upload to YouTube, hard drives we fill with government secrets (or cat photos, take your pick), and tweets we awkwardly punch out on touchscreen keyboards add up to a whole lot of gigabytes, the bulk of which are stored by <a href="http://www.huffingtonpost.com/2013/06/07/nsa-prism-program_n_3401695.html">someone, somewhere, indefinitely</a>. By some estimates, human beings <a href="http://techcrunch.com/2010/08/04/schmidt-data/">generate more data</a> every two days than we did in the entire history of civilization prior to 2003 &#8211; and that was as of three years ago!</p>
<p>Indeed, these are <a href="http://www.wired.com/wiredenterprise/2013/03/big-data/all/1">exciting times for data nerds</a>, and <a href="http://trevorodonnell.com/2013/03/07/six-big-data-predictions-for-the-arts/">data nerds in the arts</a> are <a href="http://artsfwd.org/big-data-in-arts-orgs/">no exception</a>. Initiatives such as the <a href="http://www.culturaldata.org/">Cultural Data Project</a>, Southern Methodist University&#8217;s <a href="http://blog.smu.edu/artsresearch/">National Center for Arts Research</a>, and the Americans for the Arts <a href="http://www.artsindexusa.org/">National Arts Index</a> seek to collect or organize relevant indicators pertaining to everything from arts organizations&#8217; financial health to audience reach and characteristics to long-term trends for musical instrument purchases.</p>
<p>Fractured Atlas is no stranger to data initiatives in the arts. Our <a href="http://www.fracturedatlas.org/site/technology/archipelago">Archipelago data visualization software</a> is one of the largest such efforts, bringing together information on arts nonprofits, for-profits, fiscally sponsored projects, funding, audience distributions, and community context all in one place in the service of better understanding the arts ecosystem in a region. Facilitating data-driven decisions is a major long-term objective of <a href="http://www.artful.ly/">Artful.ly</a>, our <a href="http://www.fracturedatlas.org/site/blog/2013/10/15/join-us-to-celebrate-artfully-taking-off-the-training-wheels/">just-launched</a> cloud-based arts management tool, and a present-day reality for <a href="http://www.fracturedatlas.org/site/technology/spaces">Spaces</a>, our venue listing and booking service that <a href="http://www.dnainfo.com/new-york/20120904/east-village/booking-website-for-city-rehearsal-spaces-relieves-headache-for-performers">can promote spaces with last-minute availability to users</a>. Through our research advisory services work, we&#8217;ve <a href="http://www.youtube.com/watch?v=ziyprUZHnj0">helped funders such as ArtsWave</a> organize their entire grantmaking process around principles of data-driven decision-making in order to further their philanthropic objectives. Everyone benefits when funders, organizations and individuals in the arts ecosystem make thoughtful decisions about resource allocation, setting up and responding to incentives, and more. At Fractured Atlas, we believe that data can and should be a crucial input into that thoughtful decision-making process, and we&#8217;ve been increasingly vocal in evangelizing for data-driven decision making throughout the arts and cultural sector.</p>
<p>There&#8217;s just one problem. Up until now, Fractured Atlas has not had any formal guidelines in place to ensure that we use data in our <em>own</em> decision making, with the result that our internal decisions &#8211; relating to management, marketing, strategy, and the like &#8211; have been guided primarily by managerial intuition. In a &#8220;doctor, heal thyself!&#8221; moment, we&#8217;ve agreed that it is time for our practices to reflect our preaching, at both the program and institutional levels. In 2013, the scope of our operations, the size of the community we serve, and the financial stakes in our work demand informed analysis at a level of rigor that we have not historically practiced. (This directive was immortalized by our fearless leader Adam Huttler in the organization&#8217;s annual Strategic Priorities Memo with the colorful title, &#8220;Eating Our Data-Driven Dog Food.&#8221;)</p>
<p>So between now and next summer, <strong>Fractured Atlas is embarking on a pilot initiative to explore how we can use data and evidence to improve our decision-making process at all levels.</strong> We&#8217;re calling it Fractured Atlas as a Learning Organization, and through this and future blog posts, we&#8217;re giving you the opportunity to be a fly on the wall as we use this process as a way of grappling with issues of organizational identity, strategy, culture, and impact.</p>
<p>&nbsp;</p>
<p><strong>What Is a Learning Organization?</strong></p>
<p>As I define it*, a learning organization is one for which <strong>information and strategy are joined at the hip</strong>. It is, quite literally, an organization that has successfully forged a culture of learning and integrated that culture into its decision-making process at all levels.</p>
<p>Why is this integration between information and strategy important? Because every organization operates in an environment of uncertainty about what is going to result from its decisions, and every decision we make on behalf of an organization is based on a prediction, whether explicitly articulated or not, about the results of that decision.</p>
<p>If you can reduce the uncertainty associated with your decisions, the chances that you will make the right decision will increase. Of particular interest here are what I call <strong>decisions of consequence</strong>: dilemmas for which the consequences of making the wrong decision and uncertainty about the nature of the right decision are both high. So, how do you reduce that uncertainty? Why, through research, of course! Studying what has happened in the past can inform what is likely to happen in the future. Studying what has happened in other contexts can inform what is likely to happen in your context. And studying what is happening now can tell you whether your assumptions seem spot on or off by a mile.</p>
<p>In fact, I subscribe to the notion that research is<em> only</em> valuable insofar as it helps to answer a question that matters. I&#8217;m not the only one who thinks so, either: Jake Porway, the founder of a <a href="http://www.datakind.org/">nonprofit</a> that connects data scientists with social enterprises in need, <a href="http://blogs.hbr.org/2013/03/you-cant-just-hack-your-way-to/">wrote this past spring</a> that &#8220;any data scientist worth their salary will tell you that you should start [a data project] with a question, NOT the data.&#8221; In fact, all of the excitement around Big Data notwithstanding, <a href="http://www.artsjournal.com/artfulmanager/main/measuring-only.php">data divorced from strategy is not likely to be very useful</a>.</p>
<p>A learning organization solves this problem by forging a powerful feedback loop between information and strategy, with each feeding the other and adapting in relation to the other. The more obvious implication of this symbiosis is that organizational decisions must adapt in response to new information, as discussed above. But the less obvious implication is no less important:<em> information-gathering must be directed by the organization&#8217;s decision-making needs</em>. Without that intimate connection, there are no real safeguards to prevent organizations from thinking they are making data-driven decisions without really putting much thought into either the data or the decisions.</p>
<p>More broadly, a learning organization develops a culture of seeking out and using information thoughtfully from the highest levels to the organization&#8217;s grassroots. The most effective organizations are conscious about the impact they are trying to achieve, and willing to be open-minded regarding the paths they take to maximizing that impact.</p>
<p><em>*Some readers may be familiar with the term &#8220;learning organization&#8221; as defined by MIT management scientist Peter Senge in his well-known 1990 book </em><a href="http://en.wikipedia.org/wiki/The_Fifth_Discipline">The Fifth Discipline</a><em>. My use of the phrase is broadly in the same spirit as Senge&#8217;s, but he sets out a very specific formula for what constitutes a learning organization that I don&#8217;t make use of here.</em></p>
<p>&nbsp;</p>
<p><strong>Fractured Atlas as a Learning Organization</strong></p>
<p>This fiscal year, which started in September and goes through next summer, we are undertaking a pilot project to put some of these principles into practice. The primary goal of the pilot is to develop<strong> a conceptual framework and a toolkit of situation-adaptable methods for reducing uncertainty about decisions of consequence</strong>. If we can reduce the uncertainty we have about those decisions through strategic measurement and information-gathering efforts, over time we&#8217;re likely to make better decisions that will in turn lead to better outcomes for Fractured Atlas and the people who benefit from our work.</p>
<p><a href="http://www.fracturedatlas.org/site/blog/wp-content/uploads/2013/10/falo-process.jpg"><img fetchpriority="high" decoding="async" class="aligncenter size-full wp-image-10830" title="falo-process" alt="falo-process" src="http://www.fracturedatlas.org/site/blog/wp-content/uploads/2013/10/falo-process.jpg" width="676" height="480" /></a></p>
<p>As powerful as this idea is, it only works if we have a very concrete sense of what we&#8217;re trying to accomplish as an organization. While we&#8217;ve had a <a href="http://www.fracturedatlas.org/site/about/">mission statement</a> for some time now, the huge variety of programs and services Fractured Atlas offers is virtually impossible to fully capture in a single sentence. Accordingly, the first step in this process is to <strong>create a </strong><a href="http://www.fracturedatlas.org/site/blog/2012/06/28/in-defense-of-logic-models/"><strong>theory of change</strong></a><strong> for every program at the organization</strong>, from which we&#8217;ll roll up an overall theory of change and logic model for the organization as a whole. This will allow us to define our overall goals as well as some key success metrics at various levels of operation, taking into account both Fractured Atlas&#8217;s mission objectives and its focus on <a href="http://www.fracturedatlas.org/site/about/business">developing programs that are sustainable with earned income</a>.</p>
<p>Meanwhile, we&#8217;ve formed an internal task force to work on this project at a deeper level of engagement throughout the year. Affectionately called the <strong>Data-Driven D.O.G. Force</strong> (the &#8220;D.O.G.&#8221; stands for Data Over Gut), the group will meet every 6-8 weeks to receive <a href="http://en.wikipedia.org/wiki/Calibrated_probability_assessment">calibrated probability assessment training</a>, identify real-world decisions of consequence to use as case studies, and come up with measurement experiments to gather information relevant to those decisions. In doing so, we&#8217;ll be using a modified version of a methodology called Applied Information Economics as described in Douglas W. Hubbard&#8217;s book <a href="http://www.amazon.com/How-Measure-Anything-Intangibles-Business/dp/1452654204"><em>How to Measure Anything: Finding the Value of &#8220;Intangibles&#8221; in Business</em></a>. One major advantage of AIE is that it explicitly takes into account the cost-benefit of measurement strategies by calculating something called the <a href="http://en.wikipedia.org/wiki/Value_of_information">value of information</a>, which we&#8217;ll be exploring further in a future post.</p>
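<p>To make the value-of-information idea concrete, here is a minimal sketch of Hubbard&#8217;s expected-opportunity-loss formulation for a simple go/no-go decision. The function name and all dollar figures are hypothetical illustrations, not anything from Fractured Atlas&#8217;s actual pilot:</p>

```python
def expected_value_of_perfect_information(p_true, gain_if_true, loss_if_false):
    """For a go/no-go decision, the value of perfect information equals the
    expected opportunity loss (EOL) of the best bet under current uncertainty."""
    # Expected value of acting, given a calibrated probability p_true
    ev_go = p_true * gain_if_true - (1 - p_true) * loss_if_false
    ev_no_go = 0.0  # doing nothing gains and loses nothing
    if ev_go >= ev_no_go:
        # Best bet is "go"; we lose loss_if_false whenever the belief is false
        return (1 - p_true) * loss_if_false
    # Best bet is "no go"; we forgo gain_if_true whenever the belief is true
    return p_true * gain_if_true

evpi = expected_value_of_perfect_information(0.7, 50000, 20000)
```

<p>With a calibrated 70% belief that a program change will pay off $50,000 (and cost $20,000 if it doesn&#8217;t), the expected loss of proceeding uninformed is $6,000 &#8211; an upper bound on what any measurement of that question is worth.</p>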
<p>At the end, we&#8217;ll attempt to formalize a process for identifying decisions of consequence in the future and fitting measurement strategies to the situation at hand. We&#8217;ll also present some recommendations for building infrastructure in the form of ongoing data collection, to address those questions that are likely to be asked again and again. And through it all, I&#8217;ll be writing about it here &#8211; so that anyone who wants to can learn alongside us.</p>
<p>&nbsp;</p>
<p><strong>Learning in Context: Why Philosophy Matters as Much as Performance</strong></p>
<p>Data-driven decision-making isn&#8217;t just about crunching numbers. It&#8217;s a practice that requires certain values in order to work. The hard part of being data-driven is not the &#8220;data&#8221; but the &#8220;driven&#8221; &#8211; you have to be willing to question your assumptions and actually change your behavior in response to the new information coming in. Put another way, a learning organization is, well, open to learning new things &#8211; even things that suggest that the way that we&#8217;re currently doing things isn&#8217;t working as well as it could, or that we&#8217;re missing important opportunities to increase our impact.</p>
<p>It&#8217;s much easier to attain that kind of open stance if we train ourselves to expect failure upfront. In general, organizations as well as people have a tendency to be far too risk-averse. Being a learning organization means embracing a culture of intentional experimentation and productive failure: we&#8217;re likely not going to hit upon the secret sauce the very first time we try something &#8211; or, sometimes, at all.</p>
<p>Being a learning organization similarly requires that we think about ourselves from a system perspective &#8211; how are we making a difference in light of what everyone else is doing? And how can our experiences shed light on those of others? That&#8217;s why we&#8217;re not just going down this path on our own and in private. If the specific activities of the pilot project turn out to be a big waste of time (and I can&#8217;t guarantee that they won&#8217;t), we won&#8217;t be able to hide that from you or the world. But even that would ultimately be a good thing &#8211; because, in true learning organization fashion, it would cause us to reconsider the limitations of a data-driven approach. Embracing change is hard, but one of the very best things about it is that it can allow us to extract just as much (if not more) value from failure as success.</p>
<p>For me, personally, this project is very exciting. Of course I&#8217;m eager to find out what we&#8217;ll learn. But more than that, Fractured Atlas as a Learning Organization is an opportunity for us to exercise leadership in a way that reaffirms our <a href="http://www.fracturedatlas.org/site/blog/2010/04/13/the-future-of-leadership/">highest standards</a> for ourselves and for the field. I&#8217;m looking forward to sharing our journey with you.</p>
]]></content:encoded>
			<wfw:commentRss>https://createquity.com/2013/10/fractured-atlas-as-a-learning-organization-an-introduction/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Why Arts Research is Hard (And Why We Should Do it Anyway)</title>
		<link>https://createquity.com/2011/01/why-arts-research-is-hard-and-why-we-should-do-it-anyway/</link>
		<comments>https://createquity.com/2011/01/why-arts-research-is-hard-and-why-we-should-do-it-anyway/#comments</comments>
		<pubDate>Thu, 20 Jan 2011 17:34:49 +0000</pubDate>
		<dc:creator><![CDATA[Ian David Moss]]></dc:creator>
				<category><![CDATA[Research]]></category>
		<category><![CDATA[decision analysis]]></category>
		<category><![CDATA[evaluation]]></category>
		<category><![CDATA[Fractured Atlas]]></category>
		<category><![CDATA[impact assessment]]></category>
		<category><![CDATA[strategy]]></category>

		<guid isPermaLink="false">https://createquity.com/?p=1878</guid>
		<description><![CDATA[(Crossposted from the Fractured Atlas blog. This is the first in a series of posts about Fractured Atlas&#8217;s research approach and philosophy.) I was a participant in a couple of conversations with fellow arts research nerds recently in which we discussed the notion of cause and effect. You remember that one from grade school, right? Well, <a href="https://createquity.com/2011/01/why-arts-research-is-hard-and-why-we-should-do-it-anyway/" class="read-more">Read&#160;More</a>]]></description>
				<content:encoded><![CDATA[<p><em>(Crossposted from the <a href="http://www.fracturedatlas.org/site/blog/2011/01/20/why-arts-research-is-hard-and-why-we-should-do-it-anyway/">Fractured Atlas blog</a>. This is the first in a series of posts about Fractured Atlas&#8217;s research approach and philosophy.)</em></p>
<p>I was a participant in a couple of conversations with fellow arts research nerds recently in which we discussed the notion of cause and effect. You remember that one from <a href="http://www.studyzone.org/testprep/ela4/o/causeeffectp.cfm">grade school</a>, right? Well, it turns out that when it comes to research (and especially arts research), it&#8217;s <a href="http://en.wikipedia.org/wiki/Causality">not as simple</a> as we all thought.</p>
<p>You see, in science, when we say that something caused something else, we tend to want to be sure. A common concept in statistics is &#8220;<a href="http://en.wikipedia.org/wiki/Significance_level">significance</a>&#8220;: the idea that a meaningful connection exists between two variables that can&#8217;t be explained by random noise. If you&#8217;ve formed a <a href="http://en.wikipedia.org/wiki/Hypothesis">hypothesis</a> and designed your experiments correctly, and you get statistically significant results, you can be fairly confident that the results you&#8217;re looking at indicate something real and not merely an accident or coincidence.</p>
<p>Of course, it&#8217;s never possible to be entirely sure. But in some fields, you can get pretty close. The technical term for the degree of uncertainty we&#8217;re willing to tolerate in an analysis like this is the <a href="http://en.wikipedia.org/wiki/Statistical_significance">alpha</a>: an alpha of 0.05 means that we&#8217;d like to be 95% sure that we can reject the <a href="http://en.wikipedia.org/wiki/Null_hypothesis">null hypothesis</a> (i.e., that we might just be looking at random noise) before we go ahead and report the result as meaningful. In some fields, it&#8217;s common to require an alpha of 0.001 or even lower &#8211; that is, a 99.9% certainty that random variation isn&#8217;t behind the results. And for something like testing a new drug, you most definitely want to be that sure that, for example, it doesn&#8217;t cause <a href="http://en.wikipedia.org/wiki/Rofecoxib">heart attacks as a side effect</a> before you put that sucker on the market.</p>
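<p>As a concrete illustration of the alpha threshold (a hypothetical sketch, not an example from the drug trials above): suppose 60 of 100 patients improve when chance alone would predict 50. A one-sided test against alpha = 0.05, using the normal approximation to the binomial, looks like this:</p>

```python
import math

def p_value_one_sided(successes, n, p0=0.5):
    """One-sided p-value via the normal approximation to the binomial:
    the probability of seeing at least this many successes if the true
    rate is really p0 (the null hypothesis)."""
    mean = n * p0
    sd = math.sqrt(n * p0 * (1 - p0))
    z = (successes - mean) / sd
    # Upper-tail probability of a standard normal, via the error function
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

alpha = 0.05
p = p_value_one_sided(60, 100)  # z = 2.0, p is roughly 0.023
significant = p < alpha         # True: we can reject the null hypothesis
```

<p>Had only 52 of 100 improved, the p-value would sit well above 0.05, and the &#8220;effect&#8221; could comfortably be random noise.</p>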
<p>In the social sciences, though, which is where arts research generally lives, it&#8217;s much harder to be certain about your results. That&#8217;s not necessarily because it&#8217;s harder to get statistically significant results. It&#8217;s because it&#8217;s harder to design the models and experiments &#8212; in other words, your hypothesis as to what is happening and why, and the means you use to test it &#8212; with integrity.</p>
<p>In order for a model to work, it needs to account for everything that might affect the result that you&#8217;re looking for. For health sciences, this can be pretty simple: you give people a drug or administer them a treatment, and you measure whether they got better. You can collect other information about them, such as their race, age, gender, and so forth, in order to catch any differences along those axes, but otherwise it&#8217;s fairly straightforward. It&#8217;s also relatively routine (though sometimes complicated by ethical issues) to construct what&#8217;s known as a <a href="http://en.wikipedia.org/wiki/Experimental_control">control group</a>: a set of comparable individuals with the same problem who don&#8217;t receive the treatment.</p>
<p>In the arts, by contrast, both of these assumptions are challenged. First, when you&#8217;re looking at the impact of an arts program on things like, say, crime rates in a community, or educational outcomes for children, or hell, even just straight-up happiness for individual arts participants, it&#8217;s really difficult to isolate the unique contribution of the arts program from everything else that could be entering into the picture and affecting the results (e.g., the economy, quality parenting, or what they had to eat that day). And second, unlike in the case of giving somebody a pill, it&#8217;s hard to isolate the recipients or beneficiaries of arts programs from people who don&#8217;t benefit in a way that is scientifically rigorous &#8211; especially when the desired results concern whole communities or ecosystems.</p>
<p>*</p>
<p>It would be tempting, faced with that litany of challenges, to conclude that arts research simply isn&#8217;t worth the trouble. I would disagree with such a conclusion. Well-executed arts research, though rarely providing evidence beyond all doubt, can nevertheless help to illuminate some of the key assumptions that we make when we design arts programs. And boy, do some of those assumptions <a href="https://createquity.com/2010/12/the-myth-of-the-transformative-arts-experience.html">need illumination</a>!</p>
<p>I view causality and arts research more generally through a frame of <a href="http://en.wikipedia.org/wiki/Program_evaluation">program evaluation and impact assessment</a>. Fractured Atlas is currently undertaking an evaluation of one of its own programs and also helping an external client develop a framework to assess the impact of its grantmaking. One of the most important steps in that process is to identify the underlying assumptions upon which your program strategy rests. Every time we employ strategy to make decisions, no matter what the context, we are making assumptions. When we sign up for a test-prep course to improve our score on the GRE, for example, we assume that the instructor and materials are of a sufficient quality to help us learn, and also that we&#8217;re intellectually capable of achieving a better score with assistance. When we open a Twitter account to drive traffic to a website, we assume that we&#8217;ll have sufficient time and inspiration to generate content for the medium in a way that will gain traction over time. And when we start an organization whose goal is to bring about world peace through the arts, well, there are a LOT of assumptions that go into that one!</p>
<p>Where research can help us most is by telling us whether or not our assumptions are valid. We might feel more confident about our decision to sign up for the test prep class if we can first view data on how much improvement previous participants saw in their scores after taking it. Our organization&#8217;s decision to sign up for Twitter would be made easier if we had information on the trajectories of comparable peers&#8217; tweet activity and followers over time along with measures of how much of a drain it was on staff resources. To me, research is not especially meaningful or worthwhile unless it has the potential to inform, either directly or indirectly, the decisions we make. But if it does, it can be very valuable indeed.</p>
<p>That&#8217;s because, unlike in health care or the pharmaceutical industry, in the arts we&#8217;re (usually) not dealing with life and death. It&#8217;s okay if we make a mistake once in a while; the world will continue on. So we don&#8217;t need to have 99.9% or even 95% certainty that the choices we make are the right ones before we move ahead. Indeed, as of now it&#8217;s likely that we make some decisions with virtually no certainty of their wisdom at all! To the extent that research can play a role in reducing the uncertainty we face in making decisions within a strategic framework, that research can provide real, quantifiable value to its users.</p>
<p>Let me elaborate on that last point. One of the most powerful tools I learned in business school was <a href="http://en.wikipedia.org/wiki/Decision_Analysis">decision analysis</a>, a conceptual approach useful for incorporating uncertainties into scenario planning. A common concept in decision analysis is what&#8217;s known as &#8220;<a href="http://en.wikipedia.org/wiki/Expected_value_of_perfect_information">the value of perfect information</a>.&#8221; You know you have perfect information when there is absolutely no uncertainty in the outcomes that might result from an action or set of actions you take. The value of perfect information is the difference in your &#8220;<a href="http://en.wikipedia.org/wiki/Expected_value">expected value</a>&#8221; (i.e., the result of the best possible strategy given the average of all possible outcomes, weighted by probability) with certainty and without. For example, if you&#8217;re only 60% sure that taking the test prep class will get you to the GRE score you need, there&#8217;s a 40% chance the amount you spend on the class will be a waste. In the language of decision analysis, that&#8217;s equivalent to saying that you can &#8220;expect&#8221; to lose 40% of your investment. With perfect information that taking the class will lead to the result you want, you have no risk of wasting that money. Thus, the value of perfect information in this case is 40% of the price of the class.</p>
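<p>The test-prep arithmetic above can be written out directly. A minimal sketch, where the $1,000 price is a made-up number for illustration:</p>

```python
def evpi_for_class(price, p_works):
    """Value of perfect information for the enroll/don't-enroll decision.
    Without information you enroll and pay `price` no matter what; with
    perfect information you enroll only when the class will actually work."""
    expected_cost_without_info = price         # pay every time
    expected_cost_with_info = p_works * price  # pay only when it pays off
    return expected_cost_without_info - expected_cost_with_info

evpi = evpi_for_class(1000, 0.6)  # 400.0: 40% of the price, as in the text
```

<p>If you are 60% sure the class works, perfect information is worth 40% of its price &#8211; the most you should rationally spend researching the decision before making it.</p>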
<p>Research, especially research in the arts, can&#8217;t give us perfect information. But it can sure as hell give us better information than we already have. Even if it can reduce our uncertainty that our strategy is the right one from 40% to, say, 20%, that&#8217;s still quite a boost to our confidence. But the value of research is only as high as its quality. Badly designed or poorly executed studies can be next to useless in reducing uncertainty, or worse, can actually increase it by confusing the underlying issues. Unfortunately, no certification body currently exists to ensure the research conducted in the arts is of a sufficient quality to be helpful. The best way to make sure as a field that we don&#8217;t get taken in by low-quality work is to take some time to educate ourselves on good research practices. For a good, short primer, I recommend <a href="http://www.amazon.com/Evaluation-Essentials-Conducting-Research-Sciences/dp/0787984396">Evaluation Essentials</a> by my own program evaluation teacher, Beth Osborne Daponte.</p>
<p>Next time, some thoughts on how Fractured Atlas puts these principles into practice.</p>
]]></content:encoded>
			<wfw:commentRss>https://createquity.com/2011/01/why-arts-research-is-hard-and-why-we-should-do-it-anyway/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Lessons I Learned in Business School (Or, My Humble Attempt to Save You $150k)</title>
		<link>https://createquity.com/2009/06/lessons-i-learned-in-business-school-or/</link>
		<comments>https://createquity.com/2009/06/lessons-i-learned-in-business-school-or/#comments</comments>
		<pubDate>Thu, 04 Jun 2009 03:54:00 +0000</pubDate>
		<dc:creator><![CDATA[Ian David Moss]]></dc:creator>
				<category><![CDATA[Economy]]></category>
		<category><![CDATA[business school]]></category>
		<category><![CDATA[decision analysis]]></category>
		<category><![CDATA[evaluation]]></category>
		<category><![CDATA[negotiation]]></category>
		<category><![CDATA[personal]]></category>
		<category><![CDATA[risk]]></category>
		<category><![CDATA[statistics]]></category>
		<category><![CDATA[textbook economics]]></category>

		<guid isPermaLink="false">https://createquity.com/2009/06/lessons-i-learned-in-business-school-or-my-humble-attempt-to-save-you-150k.html</guid>
		<description><![CDATA[I really didn&#8217;t know what to expect when I came to business school in the fall of 2007. I had lived my whole post-college life in the nonprofit sector, and most of that time was spent hanging around musicians. I was brought up by two ex-hippies who, shall we say, did not exactly fit in <a href="https://createquity.com/2009/06/lessons-i-learned-in-business-school-or/" class="read-more">Read&#160;More</a>]]></description>
				<content:encoded><![CDATA[<p><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://2.bp.blogspot.com/_jSTeDrbLy7I/SidkSHcQ3JI/AAAAAAAAAU0/AHK_WPSEdG8/s1600-h/Manual_decision_tree.jpg"><img decoding="async" style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 338px; height: 400px;" src="http://2.bp.blogspot.com/_jSTeDrbLy7I/SidkSHcQ3JI/AAAAAAAAAU0/AHK_WPSEdG8/s400/Manual_decision_tree.jpg" alt="" id="BLOGGER_PHOTO_ID_5343349745296399506" border="0" /></a>I really didn&#8217;t know what to expect when I came to business school in the fall of 2007. I had lived my whole post-college life in the nonprofit sector, and most of that time was spent hanging around musicians. I was brought up by two ex-hippies who, shall we say, did not exactly fit in with corporate culture. (My mom did have a job at a bank once, which she used primarily as an opportunity to read up on astrology textbooks.) So I had a number of questions in my mind as I got ready to start my MBA program. Would I get along with my classmates? Would I get involved in the life of the school? Would I have time to compose and pursue individual projects? Would I actually learn anything?</p>
<p>If nothing else, I knew I would come out of the experience with two things: (a) a piece of paper in my hand that said I was a smart person and (b) a whole lotta debt. Both of these things turned out to be true. But that was about the extent to which my premonitions and preconceptions held water. Time and again, something about the experience would surprise me, and I remain now in awe of not only how much my feelings about b-school evolved, but how much <span style="font-style: italic;">I</span> evolved. In no particular order, then, I offer the most valuable concepts, ideas, skills, and learnings I take away from my four semesters in New Haven, CT:</p>
<ul>
<li>As math subjects go, an understanding of<span style="font-weight: bold;"> statistics </span>must rank close behind basic algebra in highest usefulness-in-daily-life-to-mental-effort-required ratio. It&#8217;s amazing to read websites like <a href="http://fivethirtyeight.com">fivethirtyeight.com</a> now and actually understand how those graphs were produced. And if that&#8217;s not practical enough for you, <a href="http://en.wikipedia.org/wiki/Decision_analysis"><span style="font-weight: bold;">decision analysis</span></a> can shed light onto even the most intransigent dilemmas by identifying dominated (i.e., foolish) strategies and helping to organize one&#8217;s thinking. A tool that I will use for the rest of my life.</li>
<li><span style="font-weight: bold;">Program evaluation </span>is not rocket science. All it requires is that you apply logic <span style="font-style: italic;">relentlessly</span>, a skill that does take some practice. A class on ethics and human behavior taught us that the only reliable way to counteract persistent biases, such as overconfidence, is to ask oneself at critical junctures: &#8220;how could I be wrong?&#8221; This is what program evaluators, good ones anyway, do at every step of the process. It&#8217;s so simple, but it can save you a world of remorse down the line.</li>
<li>The teaching of <span style="font-weight: bold;">introductory economics</span> needs serious reform. I have <a href="https://createquity.com/2008/01/economics-myths.html">written about this</a> at length, but having taken <a href="https://createquity.com/2008/10/behavioral-economics.html">additional coursework</a> in the subject and witnessed a global financial meltdown since my original rants, I now feel quite confident in saying that externalities and behavioral analysis should occupy a <span style="font-style: italic;">much</span> more prominent role in the discussion, and that normative judgments about policy should be taken <span style="font-style: italic;">out</span>. Free markets with perfect information, perfect competition, and rational actors are the exception, not the rule.</li>
<li>Most of us are far too <span style="font-weight: bold;">risk-averse</span>. As a simple example: have you ever been really attracted to someone but decided not to ask them out because you were afraid of getting rejected? It&#8217;s silly, right? What, exactly, are the consequences of getting rejected? How is it worse than not asking in the first place? Take this thought and apply it to your job, a fundraising ask, an application to school &#8212; you name it. Make sure the risks you&#8217;re avoiding are actually risks, and not just fears in disguise.</li>
<li>When it comes to negotiation, it&#8217;s amazing what a difference a little <span style="font-weight: bold;">planning</span> makes. Knowing <span style="font-style: italic;">exactly</span> what your preferences are and thinking about what concessions you&#8217;re prepared to make&#8211;and not&#8211;beforehand will equip you with the tools you need to guide the conversation toward an optimal conclusion. This little lesson can be applied to almost any kind of negotiation, not just the usual suspects like big business deals or bargaining at the flea market.</li>
<li><span style="font-weight: bold;">Presentation</span> matters. It really does. Oh sure, it doesn&#8217;t matter as much as content, at least in the long run. The thing is, presentation <span style="font-style: italic;">is a part of your content</span>. That&#8217;s why it matters.</li>
</ul>
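<p>The decision-analysis idea in the first bullet can be made concrete with a small sketch. The strategy names, scenario labels, and payoff numbers below are invented purely for illustration; the technique itself, comparing expected values and pruning dominated (i.e., foolish) strategies, is the standard one.</p>

```python
def expected_value(outcomes):
    """Expected value of a gamble given (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

def dominated(strategies):
    """Names of strategies that some other strategy matches or beats in
    every scenario (payoffs listed in the same scenario order) and
    strictly beats in at least one scenario."""
    losers = set()
    for a, pay_a in strategies.items():
        for b, pay_b in strategies.items():
            if a != b \
               and all(x >= y for x, y in zip(pay_b, pay_a)) \
               and any(x > y for x, y in zip(pay_b, pay_a)):
                losers.add(a)
    return losers

# Hypothetical payoffs under two scenarios (good year, bad year):
payoffs = {
    "take the job":     [50, 20],
    "start a business": [90, -30],
    "do nothing":       [40, 10],  # worse than "take the job" either way
}
print(dominated(payoffs))                       # {'do nothing'}
print(expected_value([(0.5, 90), (0.5, -30)]))  # 30.0
```

<p>Once the dominated options are pruned, the choice reduces to comparing expected values (or expected utilities, if risk matters to you) across the strategies that remain&#8211;which is exactly the kind of organized thinking the bullet describes.</p>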
<p>As much as I&#8217;ve learned from business school, perhaps the most profound lesson I take away is <span style="font-style: italic; font-weight: bold;">how much I still don&#8217;t know</span>. In 2006, I knew a hell of a lot about choral music, the history of experimental rock ensembles, who&#8217;s who in the American classical composer social hierarchy, and where to find hassle-free street parking in New York City. At the time, that seemed like a pretty decent chunk of stuff to know a hell of a lot about. I now know that it&#8217;s about 0.00000000001% of the full extent of human knowledge and achievement. After two years in business school, I can now claim a passing conversance with, oh, maybe 0.00000000003%. Which, you know, is triple what I knew before &#8212; and still BARELY ANYTHING.</p>
<p>On the other hand, I now have tools to help me make decisions and manage in situations even when I don&#8217;t have all the information. I know how to figure out what questions I must ask in order to get to the answers I need. I know how to surround myself with people who know more than I do about particular subjects so that I don&#8217;t have to keep reinventing the wheel. And most importantly, I can feel secure in the knowledge that, for as long as I live, I will never stop learning new things. And that is a great gift.</p>
]]></content:encoded>
			<wfw:commentRss>https://createquity.com/2009/06/lessons-i-learned-in-business-school-or/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
	</channel>
</rss>
