“Declined”: Can my proposal be revived?


By Rick Nader

Chief Research and International Officer at Long Island University
Principal, Global Proposal Solutions, Consultant, Academic Research Funding Strategies

Why do so many grant proposals get rejected?

Funding rates hover at about 20% across all federal agencies: the competition is fierce. For new, inexperienced faculty, the chances of getting funded are closer to 10–15%. This is due in part to PI inexperience with proposal writing and with interpreting review comments and using the feedback constructively. However, shelving the proposal without considering why it was rejected, and whether there are subsequent opportunities to improve and resubmit it, is a mistake: the worst thing faculty can do is never try again.

At many funding agencies (like the NSF), it takes two to three submissions on average before a PI receives a first award. So it’s wise as a faculty member to think of the declined proposal as “declined for now” rather than “declined forever.” In this article, I offer a handful of tips that will help you a) interpret your reviews, b) use your critiques to understand the cause(s) of the declination, and c) make an evidence-based decision about whether and how to revise and resubmit.

Reviewing your declined proposal: What to look for

Review your ratings carefully.

The first questions you should ask yourself are: What were your ratings? Can they be trusted?

Generally, two or more “Good” ratings, or even a single “Fair” (or a “no score” at NIH), is a genuine death sentence. Even if the other reviews were more positive (“Very Good”), I would be disinclined to resubmit any proposal with even one “Fair,” because these ratings often point to a major weakness, or combination of weaknesses, that led to the poor rating. In these cases (such as E, VG, VG, F, G), the more positive reviewers may have a generally favorable disposition toward the proposal but a hard time explaining why. This can occur when a reviewer is impressed by the PI’s institution, excellent writing, or a glossy presentation, but remains unclear about the details of the research question or methods. Many middle-of-the-road positive reviews (Good to Very Good) are a result of this phenomenon. The clearer you are in conveying key ideas, the more accurate your reviews will be.

Two other rating patterns among declined proposals are bimodal reviews (e.g., E, VG, E, G, F, P) and outlier reviews: all high scores except one Fair or Poor (e.g., E, VG, E, E, VG, P). The reasons for these results are many, and it is critical for the PI to interpret the review comments accurately. When there is a stark contrast among reviews, as with “bimodal” or “outlier” ratings, the proposal is almost always discussed by the panel to ascertain what an appropriate rating should be. However, once even one review is negative, it is very hard (especially considering group dynamics) to overcome the influence it has on the other reviewers.

The good news is that program officers or the chairs of study sections are responsible for ensuring good review panel practices and flagging any unfair or ad hominem reviews. Program officers can also ask that these types of comments be tagged and not considered in the review decision. Due to these safeguards, the majority of proposal reviews are fair and can be trusted. This also means it is very difficult to prove that your proposal was declined unfairly. Therefore, it is best not to blame the reviewers and to take the criticism as a learning opportunity.

At the top end of the review spectrum, even all positive scores sometimes result in a declination. To illustrate the competitive nature of award decisions and the subtleties involved, consider that in 2009, the government issued a stimulus (known as ARRA) and with it, NSF and other agencies “un-declined” many previously rejected proposals and awarded them. One would expect the quality of these awards, as measured by average score, to have been lower. It was not. The point here is that scores are only one of many indicators derived from reviewer comments. A PI must understand the context and meaning of the scores and the associated review comments in order to ascertain whether a declined proposal is revivable.

Examine reviewer weaknesses.

The second question you should ask yourself is, what were the weaknesses? It is unhelpful to look at the positive comments; focus instead only on the weaknesses. You are seeking first to understand whether there was a critical flaw in the fundamental premise of the proposal as written, or whether the rejection was due to a collection of minor weaknesses.

Weaknesses typically fall into three categories:

  • Novelty or intellectual contribution: The proposal fails to “move the needle” on answering a gap in the literature or a question of critical importance or interest to the field.
  • Capabilities to carry out the work: The methodological plan falls apart, or the proposal fails to include needed skills or resources, raising doubts about the ability of the PI, or the institutional context, to complete the work successfully.
  • Specific review criteria: The proposal fails to tailor the approach to the nuances of specific program types, such as NSF CAREER, RUI, or NIH AREA/REAP, where special review criteria apply.

First, you should ask yourself about the validity of the weaknesses. Were the reviewers right? This is usually an exercise in being honest with yourself and simply answering yes or no. If you truly feel that they were not, it’s best to get a second or third opinion, and be prepared to address the stated weaknesses anyway when you do resubmit.

If you review the stated weaknesses and feel that the reviewers were right, then you need to test for criticality. In other words, did the identified weakness strike a fatal blow to Intellectual Merit (IM), novelty, or hypothesis? A flaw in IM is equivalent to heart failure: it means your proposal is not viable and will stay that way until you can go back to the drawing board and home in on a novel question. If the stated weaknesses were in the methods section, ask yourself a) whether they can be corrected and b) where exactly (methods, personnel, management) they occurred. Most non-critical weaknesses tend to occur in the methodology section. Can you tell whether the reviewer disagreed with your techniques, your statistics, or your analysis plan? Often, these critiques can be overcome by acknowledging the reviewers’ perspectives and describing why your proposed method is the best choice given the alternatives.

Sometimes proposals suffer not from one large weakness, but from a collection of small ones:

  • If your proposal’s weaknesses were stylistic in nature, try to find a good proposal writer to help you correct them.
  • If the sum of the minor mistakes led the reviewers to lose faith and confidence in the PI’s ability to carry out the work, then the PI should take each weakness from each review, across all reviews, OUT of the context of the individual review, and list them together in a new “weaknesses only” review matrix. This removes the comments from the individual reviewer’s context and allows for de novo analysis.
  • The matrix allows weaknesses to be examined on their own merits and in combination with other weaknesses. One weakness is often related to others, so it is very important to relate them to one another and look for patterns, and to distinguish the major weaknesses from the minor ones.

Deciding whether to resubmit

Once your analysis of the declination is clear, you can decide whether you’d like to revise and resubmit. Remember to check for any resubmission limits at the funding agency. Here are some things to consider for each decision point.

  • I’ve decided to revise and resubmit.
    Great! As you do so, make sure you address all validated criticisms without creating additional weaknesses, and watch for weaknesses the reviewers failed to mention previously. Remember, for those mid-range (Good to Very Good) proposals, not all weaknesses were discovered or mentioned. So do not assume that once you fix one set of reviews, you are done. A revise-and-resubmit can take 25–40 additional hours, almost as much work as a new proposal.
  • I’m not going to revise and resubmit right now, but I may do so in the future.
    If you choose not to resubmit your proposal right away, keep it “on ice” until the idea matures or until you find another funding source that is a better fit. Don’t forget to check alternative sources: your proposal may be retrofittable for a private foundation. For federal agencies, check with your program officer to see whether the proposal is still a good fit. Sometimes, you can check your analysis of the reviews with your program officer to see if you missed anything. They won’t be able to discuss the specifics of the conversation that occurred in the panel review, but they can still help validate the key weaknesses you uncovered in your matrix.
  • I’ve decided not to revise or resubmit.
    Proposal declinations, though inevitable in a faculty member’s life, are always disappointing and can be painful losses that require coping strategies. Even if you choose not to resubmit, conducting an in-depth post-mortem analysis of the reviews is a great way to learn from your mistakes and better understand your options.

Richard H. Nader, PhD, Chief Research and International Officer at Long Island University and Principal Consultant at Global Proposal Solutions
Dr. Richard (Rick) Nader has 30 years of experience developing teams and writing proposals to federal agencies, private and corporate foundations, and international funding sources. He has helped hundreds of faculty become competitive.