[The 9-HI™ Basics] Responder & Evaluator Scoring

How to create an accurate score

These guidelines summarize how to create accurate scores when performing a Self-Evaluation of a Response to an opportunity announcement (RFI, RWP, RFP, Pitch Event), and how to create a score for each Success Factor in general. It is critical that a Responder provide accurate scores: Self-Evaluation scores that are misleading or unsubstantiated can lead to non-selection of a proposal. Evaluators will be looking for proposal content and Development Plans that represent Evidence justifying the self-evaluation scores. We highly recommend that your proposal address each Success Factor identified for your Topic and that you identify Evidence that already exists OR that you will develop in your Development Plan. Note that some Success Factors are weighted higher than others, so these should be prioritized with detailed evidence in your proposal.

Your proposal needs a Starting TRL and a Finishing TRL. This TRL range establishes the current state of the Technology to be developed and how far you will develop it to reach the Finishing TRL. In your proposal you want to prove that you understand the current state of the Technology for the intended application, and that your Team and Stakeholders can advance the Technology all the way to the Finishing TRL within the proposed Project schedule and budget. You should show that you understand the Risks, how the Success Factors will address those Risks, and, most importantly, that the Evidence you are pointing to (existing or to be developed in your Development Plan) will address the Success Factors and underlying Risks identified by the Host. The more Evidence you have (in amount and quality), the higher the score you can post for each Success Factor. The Chart below shows the Scores assigned to each level of Evidence in 9-HI™.

You can see that each of the Evidence Levels above has a range of scores. Example: 5.0-5.9 is the scoring range that can be assigned to an SF if the proposal includes existing Evidence that "High fidelity assembly of the components has occurred." If a low amount and/or quality of Evidence is presented, then a lower score in this range, such as 5.0 or 5.1, should be used. Conversely, if many points of high-quality Evidence exist, then a score of 5.8 or 5.9 should be used.
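This band-plus-quality rule can be sketched in a few lines of code. The sketch below is purely illustrative: the 0.9-point band width, the `evidence_strength` normalization, and the function name are assumptions for demonstration, not the official 9-HI™ algorithm.

```python
# Illustrative sketch of the band-based scoring rule described above.
# The band arithmetic and the evidence_strength scale are assumptions
# for demonstration, not the official 9-HI(TM) method.

def score_within_band(band_floor: float, evidence_strength: float) -> float:
    """Pick a score inside a one-point Evidence band.

    band_floor:        lower bound of the band (e.g. 5.0 for the 5.0-5.9 band)
    evidence_strength: combined amount and quality of Evidence, scaled 0.0-1.0
    """
    if not 0.0 <= evidence_strength <= 1.0:
        raise ValueError("evidence_strength must be between 0.0 and 1.0")
    # The band spans 0.9 points (e.g. 5.0 through 5.9 inclusive).
    return round(band_floor + 0.9 * evidence_strength, 1)

# Sparse or low-quality Evidence stays near the bottom of the band...
print(score_within_band(5.0, 0.1))  # -> 5.1
# ...while abundant, high-quality Evidence reaches the top of it.
print(score_within_band(5.0, 1.0))  # -> 5.9
```

The key point the sketch captures is that the Evidence Level selects the band, while the amount and quality of Evidence select the position within it.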

Note that these Evidence definitions apply to all Success Factors, including those for Team and Stakeholders and Market Applications as well as the Technology. Thus, Evidence needs to be presented in the proposal to address all Success Factors. Note that 5 of the 9 FPM categories of Success Factors below require high Evidence scores (8.0-9.9 range) even if the Starting TRL is low: Appeal, Personnel, Planning/Processes, Finances, and Market Size/Scope. These FPM scores should never be low. If they are, then essentially the Customer does not want the Technology, the Team capability is flawed, or the Application cannot justify the Development Project. The Host or investor is always seeking high scores for these categories, so be sure to display your Evidence accordingly.



Note that Technologies sought by a Host Organization at a "Low" TRL are, by definition, not fully developed; they may be conceptual in nature and inherently have LOW STARTING Value and Reliability FPM scores. The Value and Reliability of a low-TRL technology are what your Development Plan will build. Therefore, only proposals with lower starting Value and Reliability scores are eligible for low-TRL proposal technology needs. As the TRL increases throughout a development program, the scores for Value and Reliability need to increase as well.
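One simple way to sanity-check this progression is to confirm that planned Value and Reliability scores never decrease as the TRL advances. The sketch below is a hypothetical check with invented milestone data; 9-HI™ does not prescribe this code.

```python
# Hypothetical sanity check: planned Value and Reliability FPM scores
# should rise (or at least not fall) as the Development Plan raises the TRL.
# The milestone data and function name are invented for illustration.

def scores_track_trl(milestones: list[tuple[int, float, float]]) -> bool:
    """milestones: (TRL, value_score, reliability_score) tuples."""
    ordered = sorted(milestones)  # order milestones by TRL
    for (_, v0, r0), (_, v1, r1) in zip(ordered, ordered[1:]):
        if v1 < v0 or r1 < r0:
            return False  # a score regressed as TRL increased
    return True

plan = [(3, 3.2, 2.8), (5, 5.5, 5.0), (7, 7.1, 6.8)]
print(scores_track_trl(plan))  # -> True
```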

Significant mismatches between the Evidence you identify and your self-evaluation scores can reflect poorly and may diminish the credibility of your proposal.
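A Responder can screen for such mismatches before submitting. The following is a minimal sketch, assuming each Success Factor can be mapped to the floor of the Evidence band its proposal content actually supports; the function name, data, and band arithmetic are hypothetical, not part of 9-HI™.

```python
# Illustrative pre-submission check: flag Success Factors whose
# self-score exceeds the Evidence band (floor to floor + 0.9) that the
# cited Evidence supports. Names and data are hypothetical.

def flag_mismatches(self_scores: dict[str, float],
                    supported_band_floor: dict[str, float]) -> list[str]:
    """Return the SFs whose self-score is above the supportable band."""
    flagged = []
    for sf, score in self_scores.items():
        floor = supported_band_floor.get(sf, 0.0)
        if score > floor + 0.9:
            flagged.append(sf)
    return flagged

scores = {"Appeal": 9.2, "Reliability": 6.5}
evidence = {"Appeal": 9.0, "Reliability": 4.0}  # Reliability Evidence only supports 4.0-4.9
print(flag_mismatches(scores, evidence))  # -> ['Reliability']
```

Any flagged SF means either the self-score should come down or more Evidence belongs in the proposal or Development Plan.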

Below is an example of a Responder scorecard. As a Responder, you are required to add a score to every box on the right. You determine your score for each SF based on the amount of Evidence that you include in your proposal related to each SF. 

Responder Best Practices to Align with Host Needs

When providing a technical write-up as part of a response, the Responder must ensure that all Risks and Success Factors are addressed, and must provide sufficient Success Evidence in support of the identified SFs. By doing so, they provide proof of how their Plan allows each SF to mitigate, or burn down, one or more Risks. A technical response must include sufficient Success Evidence, either existing or to be developed in the Development Plan, to align with the Responder's self-score. Within the 9-HI™ scoring process, Success Evidence must be able to survive an audit and/or stringent SME scrutiny. Examples of Success Evidence include testing, prototyping, analysis, calculations, pilot reviews, focus group feedback, confirmed supporting data, established theories, similar prior developments, etc., that can be linked together in a cohesive proposal and Development/Deployment Plan. If the Evidence doesn't exist now, your Development Plan needs to demonstrate how the Evidence will be developed.

Very often a Host will designate an acceptable Starting TRL and Finishing TRL for the Topic they are requesting. Therefore, it is paramount to provide Success Evidence and Development Plans that support a logical progression of technology development from Starting TRL to the Finishing TRL. Not all development projects target TRL 9 as a completion objective, so the Responder should only address needs through the required exit TRL, unless the Host requests additional transition or commercialization information beyond the exit TRL.

Step-by-Step Response Development Guidance

  1. Review all materials provided by the Host, and schedule appropriate time for completion.
  2. Evaluate Risks and SFs for your selected Topic. Only one Topic is allowed per response submission, but multiple submissions are typically allowed if the Host approves multiple registrations (i.e., one response, covering one Topic, for each registration the Host approves).
  3. Determine the technical solution that will be proposed to the Host. (You may elect to run your own 9-HI™ Responder Solution Project to build and evaluate your best Response.)
  4. Outline all technical response documents and information needed, including existing evidence and development plans that address each of the SFs identified by the Host. Be sure not to miss any highly weighted SFs.
  5. Prepare all proposal materials in accordance with Host requirements, rules, and guidance.
  6. Review self-assessed scores for alignment with the TRL objectives of the Host, and modify the technical response as needed to achieve alignment.
  7. Upload all scores and documentation needed in your Response Portal.