The recent report on ending the “Starvation Cycle” confirms that the vast majority of funding is project-based, that this type of funding rarely covers the full cost of the associated work, that “cost-minus” funding is not just unfair but also causes a whole host of financial and non-financial problems for nonprofits, and that funders serious about doing their “fair share” need to pay more or loosen their restrictions. These are important issues, but a few related matters merit discussion as well:
To scale, or not to scale?
As a nonprofit grows, its unrestricted funding almost always falls as a fraction of its total budget (even if it grows in absolute terms) while the proportion of cost-minus funding goes up. From this perspective, scaling makes it harder to cover full costs. At the same time, though it’s impolite to say, larger nonprofits are often more efficient, which makes it easier to cover full costs.
Likewise, larger nonprofits are often better able to recruit and retain high-quality finance staff and to put robust financial systems in place. Yet this increased financial acumen can be more than offset by the greater managerial complexity that comes with size.
This complex relationship between scaling and cost coverage means that if a nonprofit is too small, it may be inefficient or unable to attract the right people, but if it’s too big, its funding model may become unviable or its finance team overwhelmed. Like Goldilocks, a nonprofit needs to find the scale that’s just right.
What is a “fair” funder?
The premise that funders who knowingly fail to cover the full costs of the projects they support are “unfair” implies a nonprofit-as-vendor mindset, in which the funder is unfair because it underpays for the projects it commissions. Yet many nonprofits don’t see themselves as mere vendors for funders; they view themselves as independent, autonomous organizations with their own missions, plans, and programs, over and above whatever a particular set of government agencies and restricted donors happen to be keen on funding at any given moment.
For organizations with this sense of self, even a cost-minus restricted grant might be seen as far from “unfair.” They can say, “This is an important program of ours. The grant, while covering only 90 percent of the full costs, allows our precious unrestricted funds to go ten times further since we only need to cover 10 percent of the full costs we would otherwise pay.”
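The arithmetic behind that “ten times further” claim can be sketched with hypothetical numbers (the $100,000 project cost is an assumption for illustration, not a figure from the report):

```python
# Hypothetical project: full cost $100,000; a "cost-minus" restricted grant
# covers 90 percent of it.
full_cost = 100_000
grant = 0.90 * full_cost                 # restricted grant: $90,000
gap = full_cost - grant                  # unrestricted funds needed: $10,000

# Without the grant, unrestricted funds would have to cover the entire cost,
# so each unrestricted dollar now stretches ten times as far.
leverage = full_cost / gap
print(f"Unrestricted funds needed: ${gap:,.0f} ({leverage:.0f}x leverage)")
```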
What would happen if foundations actually changed their grantmaking practices?
There are several ways that foundations could change their grantmaking to end the starvation cycle:
- Increase funding to more fully cover indirect costs, with a corresponding increase in the foundation’s annual spending rate (i.e., the percent of the endowment spent each year). Organizations receiving the new, larger grants would be better able to invest in their capacity or to build reserves. All nonprofits would be better off for a long time, although the higher spending rate would gradually shrink the endowment and eventually lead to less grantmaking. (For example, a foundation moving from a five-percent spending rate to a six-percent spending rate would, after about 20 years, be making smaller grants than if it had stayed at five percent, given reasonable assumptions about investment returns.)
- Increase funding to more fully cover indirect costs without a corresponding increase in the foundation’s spending rate. In this case, there would be a smaller number of larger grants. In effect, some would-be grantees would be starved to better feed their brethren.
- Make no change to the number or size of grants but make them more flexible. This would allow grantees to cover their full costs and/or build reserves, provided that they shrink their direct program spending. In the extreme case, foundations would make substantially all their grants in the form of unrestricted, general operating support.
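The trade-off in the first option can be sketched numerically. Below is a minimal simulation assuming a normalized $100 endowment and a 7 percent annual investment return; both figures are assumptions for illustration, not from the report:

```python
# Sketch of the first option's trade-off: a higher spending rate means larger
# grants now but a smaller endowment, and therefore less grantmaking, later.

def annual_grants(spend_rate, annual_return=0.07, years=30, endowment=100.0):
    """Yearly grant amounts from an endowment spent down at a fixed rate."""
    grants = []
    for _ in range(years):
        grant = spend_rate * endowment
        grants.append(grant)
        # Spend first, then grow the remainder by the investment return.
        endowment = (endowment - grant) * (1 + annual_return)
    return grants

five = annual_grants(0.05)
six = annual_grants(0.06)
# First year in which the five-percent path out-grants the six-percent path.
crossover = next(t for t, (a, b) in enumerate(zip(five, six)) if a > b)
print(f"The 5% path overtakes the 6% path in year {crossover}")
```

Under this simple model the crossover lands in the high teens, roughly the same ballpark as the 20 years cited above; the exact year depends on modeling choices such as fees and the order in which spending and growth are applied.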
While cost coverage is important, it seems highly unlikely to prompt foundations to raise their spending rates. The most likely outcome, if anything happens, is that grants will become a little more concentrated and a little more flexible, but not larger. That would be great news, though not the cash windfall that many nonprofits are hoping for.
Finally, any discussion of project funding and cost coverage presumes a real connection between a grant and a certain category of expenses. But what is the nature of this connection? Imagine an organization that receives grants of $50,000 from each of two funders. The grants—totaling $100,000—are paid into one bank account, out of which $75,000 is later spent on direct project expenses and $25,000 is spent on indirect expenses. Which grant was used for the indirect expenses?
There is no objective answer to this question because, despite foundations’ best efforts to restrict the use of their funds, money is fungible. There is no fact of the matter. The only way to attribute the indirect expenses to one grant rather than the other is through an invented, arbitrary, and mutually agreed-upon convention. Conventions are neither true nor false, but they can be consistent or inconsistent, helpful or unhelpful.
There are many possible conventions that would make it easier for funders to support indirect costs that they acknowledge are important but remain loath to pay for. Here are two:
- The General Operating convention: A funder covering a portion of an organization’s indirect costs could attribute to its grant a similar portion of any of the organization’s program expenses. Imagine a $5 million organization with $1 million in indirect costs, $4 million in direct program expenses, and requiring $500,000 in general operating support to stay afloat. A funder making an unrestricted grant of $100,000 (20 percent of the required $500,000) could “attribute” to its grant any 20 percent of the direct expenses. If there were one program—Program X—that was particularly mission-aligned with the fu