Process questions were raised by some of the experts involved in the review of the first round of Social Innovation Fund (SIF) grants awarded by the Corporation for National and Community Service (CNCS) in 2010.

One of the issues that NPQ identified concerned the potential for conflicts of interest during the later stages of the review process. This emerged directly out of questions raised by some of the expert reviewers involved at earlier stages. NPQ was also concerned about the appearance of a historically strong but informal network of certain grantees and CNCS staff that was operating within a fairly opaque process. This combination of a perceived “insiders” network and a lack of transparency was bound to engender questions. As our article of September 23, 2010 stated, “We would still like to know which reviewers were used for the rounds that followed the initial review. The issues of conflict of interest get more intense at the levels where decisions are made that contradict the judgment of reviewers at an earlier phase.”

These and other concerns helped prompt a review (PDF) by CNCS’s Office of the Inspector General (OIG), which was released on August 19. As noted below, problems did in fact occur during the later phases of the review process, as well as during the run-up to the process, particularly the recruitment and screening of external reviewers. While the OIG noted in its report that the first two phases of the process appeared to be fair and objective, it was not able to make such a determination about the third stage due to a lack of documentation.

Here are four important issues raised by the report:

1. Lack of diversity among external reviewers. The OIG found some problems with recruitment of external reviewers. Not only were two reviewers found to lack the requisite 10 years of experience, but there was also a problem of reviewer diversity: 11 reviewers out of a sample of 23 were affiliated with one institution, Harvard University. The report said, “Focusing on reviewers with affiliations at one location could prevent [CNCS] from obtaining a wider breadth of expert knowledge of and insights into SIF programs.”

2. Conflicts of interest. Both external reviewers and staff were required to identify conflicts of interest (COIs), but in some cases the COIs were reviewed and approved only after the review process had concluded, and in two cases, the OIG discovered conflicts of interest that had escaped the disclosure process entirely. In our opinion, this is related to the reviewer diversity issue: when reviewers are chosen from a narrow universe of candidates, the risk of stumbling into conflicts of interest is magnified.

3. Failure to follow external review procedures. All applications were supposed to be reviewed by two expert panels, but in two cases (which the report did not identify) the second review was performed by CNCS staff rather than outside reviewers. This is significant because reviewers had raised questions about how an application rated fairly low by one panel could end up in the final grants pool. Of the two applications reviewed by CNCS staff, one was staff-reviewed because of a miscue in getting the application to a panel of outside reviewers, and the other because delays prevented the recruitment of back-up or alternate reviewers. We are still interested in which applications were staff-reviewed, how the staff reviews compared to the panel reviews, and how these two applicants fared, but these questions will likely remain unanswered.

4. Lack of documentation of decision-making process. At the end of the review process, CNCS conducted a “late stage review,” geared not toward picking the best individual applicants but toward creating “the ideal SIF portfolio [that] would leverage the individual strengths of applicants and create the most robust network and ‘learning community’ of grantees possible.” In these sessions, the pool of 16 fundable projects was winnowed down to the final 11 in the grants pool. This part of the process consisted of two meetings: the first involved a few external reviewers working on panels with CNCS staff, and the second was a meeting of CNCS staff only. The OIG found that documentation of this process was missing. “In our review of the meeting notes from the late stage review,” the OIG wrote, “[CNCS] could not provide documentation to demonstrate how it determined its final decision to award the 11 grants that were ultimately funded.” The criteria SIF said it used were geographic mix, type of SIF program activity, leveraging, and SIF grant amounts needed. Given that the program, as we pointed out, effectively excluded the “flyover” states, “geographic mix” apparently didn’t win the day, but it’s hard to know how all the criteria were weighted or operationalized.

In short, the OIG report confirms that the first round of SIF grantmaking was less than pristine in its processes, particularly in the selection and screening of external reviewers and in the documentation of the final decision-making process. CNCS has since, in its response to the OIG, laid out plans to redress many of the issues identified.