
Page 1

Methodology for the Evaluation of the National Science Foundation's Experimental Program to

Stimulate Competitive Research (EPSCoR)

Brian L. Zuckerman, Rachel A. Parker, Thomas W. Jones, and Brian Q. Rieksts

IDA Document NS D-5322

October 2014

Log: H 14-001125

Approved for public release; distribution is unlimited.


IDA SCIENCE & TECHNOLOGY POLICY INSTITUTE

1899 Pennsylvania Ave., Suite 520 Washington, DC 20006-3602

Page 2

Copyright Notice © 2014 Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, Virginia 22311-1882 • (703) 845-2000.

This material may be reproduced by or for the U.S. Government pursuant to the copyright license under the clause at FAR 52.227-14 (December 2007).

Page 3

Methodology for the Evaluation of the National Science Foundation’s Experimental Program to

Stimulate Competitive Research (EPSCoR)

American Evaluation Association Session 616, October 17, 2014

Brian Zuckerman

Rachel Parker

Thomas Jones

Brian Rieksts

Page 4

EPSCoR Origins

• “Balance” built into NSF’s authorizing statute (42 U.S.C. §1862e) – “an objective of the Foundation to strengthen research and education in the sciences and engineering… throughout the United States, and to avoid undue concentration of such research and education” [emphasis added]

• 1977–78: NSF establishes EPSCoR as a limited-time experiment (NSB resolution 78-12)

• 1988: EPSCoR enacted into law as part of NSF reauthorization (Pub. L. 100-570, title I, Sec. 113)


Page 5

Original Legislative Authorization

National Science Foundation Authorization Act of 1988, Section 113 [emphasis added]:

(a) The Director shall operate an Experimental Program to Stimulate Competitive Research, the purpose of which is to assist those States that

(1) historically have received relatively little Federal research and development funding; and

(2) have demonstrated a commitment to develop their research bases and improve science and engineering research and education programs at their universities and colleges


Page 6


EPSCoR-Eligible States, by Year of Entry

• Gray = not eligible

• Red = 1980 cohort: AR, ME, MT, SC, and WV

Page 7


EPSCoR-Eligible States, by Year of Entry

• Gray = not eligible

• Red = 1980 cohort: AR, ME, MT, SC, and WV

• Orange = 1985 cohort: AL, KY, ND, NV, OK, PR, VT, and WY

Page 8


EPSCoR-Eligible States, by Year of Entry

• Gray = not eligible

• Red = 1980 cohort: AR, ME, MT, SC, and WV

• Orange = 1985 cohort: AL, KY, ND, NV, OK, PR, VT, and WY

• Yellow = 1987 cohort: ID, LA, MS, and SD

Page 9


EPSCoR-Eligible States, by Year of Entry

• Gray = not eligible

• Red = 1980 cohort: AR, ME, MT, SC, and WV

• Orange = 1985 cohort: AL, KY, ND, NV, OK, PR, VT, and WY

• Yellow = 1987 cohort: ID, LA, MS, and SD

• Green = 1992 cohort: NE and KS

Page 10


EPSCoR-Eligible States, by Year of Entry

• Gray = not eligible

• Red = 1980 cohort: AR, ME, MT, SC, and WV

• Orange = 1985 cohort: AL, KY, ND, NV, OK, PR, VT, and WY

• Yellow = 1987 cohort: ID, LA, MS, and SD

• Green = 1992 cohort: NE and KS

• Blue = entries into EPSCoR 2000+: AK, DE, GU, HI, IA*, MO, NH, NM, RI, TN*, UT*, and VI

* Iowa, Tennessee, Utah above current threshold of 0.75% of NSF R&RA

Page 11

EPSCoR Uses Multiple Approaches

• Research Infrastructure Improvement (RII) awards:

– Track-1 Awards. Currently $4M/yr for 5 yrs. One award per state funds academic research infrastructure based on state S&T plan

– Track-2 Awards. $2M/yr for 3 yrs for collaborative, multi-state research (started in 2009)

– C2 Awards. $500K/yr for 2 yrs for cyber-infrastructure (2009-2010)

– Track-3 Awards. $150K/yr for 5 yrs for education (started in 2013)

• Co-Funding of Research Projects:

– EPSCoR co-invests with NSF Directorates and Offices in proposals that have been merit reviewed and recommended for award, but could not be funded without joint support

• Workshops and Outreach Activities


Page 12

EPSCoR Timeline

[Timeline graphic, 1978–2014, with cohort entries in 1980, 1985, 1987, 1992, and 2000+. Events shown:]

• 1978: NSF establishes EPSCoR

• 1979: Eligibility criteria: <$1M NSF funds, 5 secondary indicators

• 1980: First RII Track-1 awards: $3M/5 years, 100% match

• 1984: Eligibility criteria: <$3M NSF funds, 6 secondary indicators

• 1988: Congress first authorizes EPSCoR

• 1991: Eligibility criteria: <0.5% of NSF funds to universities, 6 secondary indicators

• 1992: RII Track-1 awards: $4.5M/3 years, 100% match

• 1998: NSF initiates formal EPSCoR co-funding

• 2001: RII Track-1 awards: $9M/3 years, 50% match

• 2002: Eligibility criteria: <0.7% of NSF R&RA funds; no secondary indicators

• 2003: Eligibility criteria: <0.75% of NSF R&RA funds; no secondary indicators

• 2006: RII Track-1 awards: $9M/3 years, no match

• 2008: RII Track-1 awards: $15M/5 years, no match

• 2009: RII Track-1 awards: $20M/5 years, 20% match; first RII Track-2 awards

• 2010: First RII C2 awards

• 2013: First RII Track-3 awards

Page 13

Task Origins and Context

• Both internal and external (Office of Management and Budget) drivers for task

• Task initiated August 2011

• Other EPSCoR-related studies previously released

– EPSCoR community convenes internal workshops (reports issued August 2006 and April 2012)

– America COMPETES Reauthorization Act requires National Academies to report on all EPSCoR programs (report issued November 2013)


Page 14

Task Objective

• Perform a two-year, in-depth, life-of-program assessment of NSF EPSCoR activities and their outputs and outcomes

– Competitiveness for funding

– Enhanced science and engineering (S&E) research base

• Provide recommendations on better targeting of funding to those jurisdictions for which the EPSCoR investment can result in the largest incremental benefit to their research capacity


Page 15

Study Methods Overview

• Literature review on EPSCoR and research capacity development

• Developed EPSCoR logic model

• Qualitative data

– Survey of EPSCoR jurisdictions

– Interviews of EPSCoR State Committee members

– Analysis of EPSCoR RII proposals and annual reports

• Quantitative data

– Analysis of NSF awards data

– Analysis of National Center for Science and Engineering Statistics (NCSES) survey data

– Information from journal articles with U.S. authors, as identified through the Thomson Reuters Web of Knowledge

– Analysis of EPSCoR eligibility criteria and NSF eligibility determinations


Page 16

EPSCoR Logic Model

[Logic model diagram: flow from inputs and context through activities, outputs, and outcomes to congressional objectives. Elements by stage:]

INPUTS/CONTEXT

• Resource Base: number of universities and colleges and quality of their S&T programs; state-level policies and institutions supporting S&T; sociodemographic distribution of population in jurisdiction

• NSF EPSCoR Award Types: research capacity development (RII Track-1); collaborative research support (RII Track-2); cyberinfrastructure support (RII C2); E/O/D support (RII Track-3); co-funding of other NSF single-investigator and small-team awards

• EPSCoR Eligibility Criteria

ACTIVITIES

• Influence university and departmental policies and programs

• Support faculty hiring

• Support thematic/large-scale research

• Support research infrastructure/cyberinfrastructure

• Seed funding, student and post-doc support

• State Committee plans and coordination

• Innovation activities and industry support

• Activities to broaden participation in STEM

OUTPUTS

• Policy and program changes

• Faculty hires

• Added incentives for research

• Better funded research staff and research projects

• New equipment and facilities; research services

• Collaboration development

• Research and innovation plans

SHORT-TERM OUTCOMES

• New and existing faculty retained

• More and higher quality research and publications

• More faculty submit proposals

• More awards received

• Increased award success rates

• Larger awards received

• Enhanced research capabilities

• Increased collaboration (cross-university, with industry, and within state)

• Collaborations and academic-industry co-funding of research

• Agreement on state S&E priorities

• State S&E funding programs created or expanded

• STEM education programs; degrees granted; graduates move to STEM careers

INTERMEDIATE OUTCOMES

• Stronger universities

• Stronger STEM workforce state-wide

• Stronger high-technology industry

• More STEM workers and demographically broader STEM workforce

CONGRESSIONAL OBJECTIVES/BROADER IMPACTS

• Legislative objective: competitiveness for Federal research funding increases

• Legislative objective: state S&E research and education base increases

• Broader impact: decreased concentration of S&T funding

• Broader impact: enhanced capabilities to support innovation/economic development

Page 17

Presentations that Follow Describe Methods Used

• Quantitative Analyses/Competitiveness for Funding

– Change in NSF Funding (per-jurisdiction, per-investigator)

– Concentration Analyses

• Qualitative Analyses/Enhanced Research Base

– Institution-Building

– State Committees

– Education, Outreach, and Diversity (E/O/D)

– Academic Development

– Innovation


Page 18

REPORT DOCUMENTATION PAGE (Standard Form 298, Rev. 8/98; Form Approved OMB No. 0704-0188)

Report date: October 2014; dates covered: October 2013 – October 2014

Title: Methodology for the Evaluation of the National Science Foundation’s Experimental Program to Stimulate Competitive Research (EPSCoR)

Authors: Zuckerman, Brian L.; Parker, Rachel A.; Jones, Thomas W.; Rieksts, Brian Q.

Task number: CS 43

Report number: 5322

Distribution/availability: Approved for public release; distribution is unlimited.

Responsible person: Lewis, Mark J.

Abstract: This presentation was prepared for a meeting of the American Evaluation Association in October 2014. It presents background regarding the National Science Foundation’s Experimental Program to Stimulate Competitive Research (EPSCoR), including a synopsis of the program’s legislative authorization, goals, and descriptive statistics regarding eligibility and participating jurisdictions. The EPSCoR program logic model is presented, identifying the legislatively mandated goals—increasing the competitiveness of investigators for NSF and other Federal funding and increasing participating jurisdictions’ science and engineering research bases—and the program’s theory of action to reach those goals. The presentation concludes with an overview of the methods that were used to conduct the evaluation.

Subject terms: EPSCoR, Logic Model, Theory of Action, Program Goals, Descriptive Statistics

Pages: 18