ArtsBlog (the blog of Americans for the Arts) recently hosted a forum called: “So, Does Size Matter?” The short answer is hell yes it does, but I disagree with most of the writers about why. The best piece in the series, I found, was penned by the whip-smart Ian David Moss (Economies and Diseconomies of Scale in the Arts – Take Two), and it was his post that inspired me both to write an initial comment and then to take on the subject more fully below.
You see, Dear Reader, like many of my fellow funders and financiers I’ve often touted the benefits of moving toward greater scale: improved operational efficiencies, greater programmatic reach, increased access to resources, heavier political punch. But I’ve also struggled with the oft-recognized but seldom-addressed reality that scale is not an answer in and of itself, and that sometimes scaled solutions leave even larger problems in their wake. Thanks to Ian, I think I got the mental kick in the epiphany I needed.
And here’s why I think scale sometimes, well, stinks up the joint.
The Mechanics of Moving Capital
I don’t care how you’re doing it, when it comes to getting money out the door it’s always easier to do it in big chunks. Whether you’re making a grant, extending a loan, or placing private equity, cost per transaction is lower if you make fewer, larger transactions. This is axiomatic. There is an inherent bias, therefore, toward systems, institutions, organizations, or entities that can absorb cash and generate returns (whether social or financial) in said big chunks. In other words, “efficiency of delivery” is an important driver of seeking large-scale solutions in and of itself.
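To make the “big chunks” point concrete, here is a back-of-the-envelope sketch of why fewer, larger transactions look more efficient from the funder’s side. All of the figures (the pool size, the per-transaction cost) are hypothetical illustrations I’ve chosen for the example, not real funder data:

```python
# Back-of-the-envelope: a fixed cost per transaction favors fewer, larger grants.
# All dollar figures below are hypothetical, chosen only to illustrate the math.

def overhead_share(total_pool, grant_size, cost_per_transaction):
    """Fraction of the pool consumed by the funder's transaction costs."""
    num_grants = total_pool // grant_size
    return (num_grants * cost_per_transaction) / total_pool

pool = 1_000_000   # $1M to move out the door
cost = 2_000       # assumed cost to underwrite and process one grant

# Ten $100K grants vs. two hundred $5K grants from the same pool:
big   = overhead_share(pool, 100_000, cost)   # 10 grants
small = overhead_share(pool, 5_000, cost)     # 200 grants

print(f"large grants: {big:.0%} of pool spent on transactions")   # 2%
print(f"small grants: {small:.0%} of pool spent on transactions")  # 40%
```

Under these assumed numbers, moving the same million dollars in small grants eats twenty times as much of the pool in transaction costs, which is exactly the “efficiency of delivery” bias described above.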
There’s also a bias within philanthropy, particularly within big-ticket philanthropy, to be associated with well-recognized, highly visible organizations. These tend to be larger organizations with board members and executive leaders who are themselves power brokers. Fewer, larger gifts provide a direct reputational benefit to those who bestow them, and therefore figure directly into the calculus of supporting “scaled” solutions.
Finally, there are things that simply cannot be accomplished without being bigger. One must be able to aggregate capital and expertise in order to do things like build bridges and power plants, or maintain the military, or move the Temple of Dendur several thousand miles brick by ancient brick.
The problem is, we tend to conflate the concept of “scale” with the concept of “size.” Greater scale and greater size appear interchangeable, because they appear to accomplish the same things. Size, after all, implies the ability to aggregate more resources, take on larger projects, and confer reputational benefits.
But scale implies something more (or at least it should): that all the while we have been increasing size we have also been creating more complex systems of communication to manage within this infrastructure; that we have been seeking redundancies and weeding them out; that we have been discovering parallel processes and routinizing them; that we have built specialization and systemization. That we have become more efficient.
Therefore: Scale = Size + Efficiency.
The problem with this little equation is that there are trade-offs when building larger and more efficient systems. For instance, most scaled solutions achieve efficiencies by reducing personnel (a real headache BTW for public and nonprofit folks with a mission-orientation towards employment and living wages, not to mention any entity employing unionized labor). Scaled systems also require more layers of management, firm hierarchies, task specialization, and centralization of resources and decision-making.
But the biggest problem with scaled systems is that they both rely upon and produce standardized outputs. Ian refers to them as “TV dinners” – consistent, bland, normal.
We both fetishize standardized products, and despise them.
The Means of Production
When you “produce” something, that’s a very different process from “creating” something. Production is about assembly, and scaled production means you can bring all the pieces together in an orderly, timely fashion. Again, this works best when both inputs and outputs are standardized. Automobiles, microfinance, and high school educations all share this in common. In my comments to Ian’s blog post, I noted that the Metropolitan Museum of Art, with its $300 million annual budget, “produces” quite a bit of art: that is, it has assembled a stunning diversity of work created by others. But the process it uses to produce this art is highly standardized, as is the way that we consume it. When it comes right down to it, the Metropolitan Museum of Art actually creates very little art itself. The same is true for the other captains of the NYC cultural sector (Lincoln Center, MoMA, the Guggenheim, Carnegie Hall), and the rule holds true in other sectors as well.
Therefore: Greater scale = Greater standardization.
Now here’s the rub: from the perspective of creating new works, the vast, vast majority of art is being developed by a veritable horde of small cultural organizations, unincorporated artist confederations, solo artists and professional amateurs. They are just churning it out. These are the folks whose work may someday (through a combination of skill, capable self-promotion and sheer luck) wind up being “produced” through a major cultural institution. And these are the exact same folks whose creative efforts are likely to be completely untouched by “scaled” investment in larger cultural partners, and by the funding efforts of grantmakers seeking to place fewer, larger grants.
In other words, there is an inverse relationship between scaled arts production and large-scale arts creation. And this is exactly where “scale” fails us: standardized production cannot handle diversity, granularity, exception, or fragmentation. As a matter of fact, scaled production is threatened by these phenomena. This is because in order to support the niche, the artisanal, and the quirky, we actually have to move away from “efficiency of delivery.”
Size Matters: The Matter of Size
Oh dear. Doesn’t this mean that art-making is inherently inefficient? Well, it cannot be standardized and therefore it cannot be scaled – at least when it comes to delivering capital to the system. And as capital providers, as the folks with the money, it’s very easy for us to think about the question of efficiency only from our perspective. Again, it’s easier to give money in big chunks. The problem is that as we do this, we can actually reduce the amount of new art that gets made.
The good news is that if you think about efficiency in terms of creation, then investing through small grants in a diversity of creative producers is far more efficient than making fewer, larger grants to scaled institutions. Why? Because for a few thousand dollars per grant (plus the sweat equity you invest in your award making process), you will generate far more art per dollar. As a matter of fact, not only will you generate more art, but more of every dollar will go directly towards art creation (as opposed to administration and overhead).
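The claim above can be sketched with a toy model. The overhead rates here are illustrative assumptions I’ve made up for the comparison (not measured figures), and the model deliberately ignores the funder-side transaction costs discussed earlier so as to isolate the grantee-side question of how much of each dollar reaches actual art-making:

```python
# Hypothetical comparison of "dollars reaching art creation" under two strategies.
# The overhead rates are illustrative assumptions, not measured sector data.

def dollars_to_creation(total_pool, grant_size, overhead_rate):
    """Dollars that reach actual art-making after grantee overhead is deducted."""
    num_grants = total_pool // grant_size
    per_grant_to_art = grant_size * (1 - overhead_rate)
    return num_grants * per_grant_to_art

pool = 500_000  # hypothetical grant pool

# Assumption: a large institution routes ~60% of a grant to administration
# and production machinery, while a solo artist or small collective
# routes ~15% to overhead.
institutional = dollars_to_creation(pool, 250_000, 0.60)  # 2 large grants
artist_direct = dollars_to_creation(pool, 5_000, 0.15)    # 100 small grants

print(f"to creation via institutions: ${institutional:,.0f}")   # $200,000
print(f"to creation via small grants: ${artist_direct:,.0f}")   # $425,000
```

Under these assumptions, the same half-million dollars puts more than twice as much money into actual creation when spread across many small grants, which is the trade the funder makes in exchange for the higher delivery costs noted above.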
And there are some handy ways that you can make this process less painful for yourself as a funder:
- Invest in your staff so that they have the relevant expertise in their area of grant-making to pick strong arts creators;
- Work with an intermediary to provide scaled services to the field, or to have them make smaller awards out of a larger commitment you’ve provided to them;
- Use a panel of artists and creative sector stakeholders to help recommend and evaluate funding decisions to ensure quality and diversity.
If we want to invest substantially in developing the creative output of New York City (or any city, town or ‘burb), we actually have to move away from scaled investment towards decentralized, knowledge- and process-intensive investment. From the perspective of capital placement, that is a highly inefficient process.
But if our goal is to support the generation of new work, it’s actually more efficient to support many smaller artists. I believe we will get more art made per dollar deployed if we start giving more grants in the $1,000 – $10,000 range. In fact, we may actually get more art consumed as well. Some answers to these questions lie within the Cultural Data Project, which does track audience data; but as far as I know, no one has actually looked to see what the audience-per-dollar ratio is.
Not yet at least. Hint hint.
Nice piece. It reminds me of Max Weber’s line about great art never being monumental and always having intimacy. Also reminds me of James Scott’s amazing book, Seeing Like a State, about how large regimes — could be corporations — want things legible and therefore standardized in a way that too often depletes vitality and erodes sustainability.
Keep up the blogs!
A wonderful insight into what efficiency in the arts is now, and what we could consider it to be. So often, we (arts administrators) are considered necessary because we know how to make that budget and function in a left-brain capacity. It is a horrible myth that most artists cannot do this. Most can, but have just been taught that they can’t because they are creative and not practical.
I would love to see more investment in programs that empower artists to take grant dollars and use them in the most efficient way possible! Teach them to do their own budgets, taxes, etc! It would liberate artists from their dependence on large organizing bodies. As funding for the arts decreases, I agree that funders should be talking about the efficiency of the dollar after it has left them, not the most efficient way to get money out the door.
A great post! Keep it coming.