written applications to get a sense of which teams are of interest. A sample format might include the following sections: the problem or unmet need, market size, team introductions, and the envisioned solution.

◉ Full Proposal: A full proposal expands on the aspects first introduced in the pre-proposals and allows teams to update their responses based on feedback from reviewers during the pre-proposal phase. In addition to providing a more in-depth market analysis, teams are asked to discuss competitors, intellectual property position, project budget, and detailed technical milestones. To help teams dive deeper into their initial markets, a business mentor is assigned (see below) to each team. For our energy and biomedical accelerators, given the centrality of intellectual property protection, an external law firm is enlisted to perform cursory IP reviews, which have proved extremely helpful. For a few thousand dollars per team, the IP review can provide judges with an independent, objective lay summary of the core technology, including the prior art landscape.

◉ Live Pitch: Well in advance of pitch day, teams are provided with a structured format for their materials, including guidelines on the content to be covered, the length of each section, the time and location of their pitch, and a list of the judges. Teams continue to work with their mentors to finalize their pitches and practice delivery in order to make the best impression on the judges. Sample pitch guidelines might include a general overview of how to draft a compelling story, specific slides to include (e.g., IP, competitive landscape), a list of questions that should be answered, and/or a list of common mistakes to avoid. In addition to these guidelines, program administrators run a pitch practice session (see below) to assist teams in refining their stories and sharpening their presentation skills.

• Communication of scores and comments to the teams.
  ◉ Allow reviewers two to three weeks to review written proposals. Pre-scores and comments are due from the judges at least two business days in advance of the review meeting.
  ◉ An online review and scoring platform is helpful for efficiently collecting and managing judges’ feedback in advance of and during the proposal rounds. FluidReview©, for example, is a commercially available tool that we have found useful for this task.
  ◉ During review sessions, the scores of all teams are pre-loaded and teams are ranked from lowest to highest based on that scoring. Review discussions are focused on the teams in the middle rather than on discussing the consensus “winners” and “losers” in depth (see the sketch after this list).
    ■ An introduction at the beginning of the meeting is important to establish the goals of the program with the judges.
    ■ Show both the average scores and the scores of individual judges. To keep vocal judges from monopolizing the conversation, use the individual scores to guide the discussion and draw out quieter judges.
    ■ Judges can change their individual scores (and thus the overall average score of the team) at any time.
    ■ At the end, display the final ranking of teams and estimate where the cutoff will be. Ask the reviewers if they agree with which teams will be moving forward or if they would like to change their scores.
    ■ Have a dedicated notetaker capture anonymized verbal comments to be given directly to teams along with the de-identified written comments from the reviewers themselves. Provide teams with the unfiltered (but aggregated and anonymized) feedback from the reviewers regardless of whether or not they move forward in the competition. This allows teams to see the judges’ perception of their material so that they can improve for future rounds or subsequent submissions.
    ■ Once awards are made, keep judges updated on the awardees’ progress. The structure of these updates can vary from formal periodic reviews to informal invitations to pitch or demo days that highlight how awardees are progressing.
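Programs that lack access to a commercial review platform can reproduce the core of this ranking step with a short script. The minimal Python sketch below uses hypothetical team names, scores, and an assumed award count; it pre-loads judges’ scores, ranks teams from lowest to highest average, and prints each judge’s score alongside the average so that outlier opinions can be used to draw quieter judges into the discussion.

```python
from statistics import mean

# Hypothetical pre-loaded scores: one list of judges' scores per team.
scores = {
    "Team A": [4, 5, 3, 4],
    "Team B": [2, 3, 2, 3],
    "Team C": [3, 4, 4, 2],
    "Team D": [5, 4, 5, 4],
}

AWARDS = 2  # assumed number of teams that can move forward

# Rank teams from lowest to highest average score, as in the review session.
ranked = sorted(scores.items(), key=lambda item: mean(item[1]))

for rank, (team, judge_scores) in enumerate(ranked, start=1):
    # Show individual judge scores next to the average so discussion
    # can focus on disagreement rather than on consensus teams.
    advancing = rank > len(ranked) - AWARDS  # estimated cutoff
    flag = " <- estimated to advance" if advancing else ""
    print(f"{rank}. {team}: avg={mean(judge_scores):.2f} judges={judge_scores}{flag}")
```

In a live meeting, the entries in `scores` would be updated as judges revise their individual scores, and the ranking and estimated cutoff recomputed on each change.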

