
Crowdsourcing Medical Research Priorities

A Guide for Funding Agencies

Lessons and Best Practices from PRANCCER PRecision Medicine Advances Using Nationally Crowdsourced Comparative Effectiveness Research

PRANCCER is a joint initiative by:


Contents

Overview
Crowdsourcing Basics
    The What and the Why of Crowdsourcing
    To Crowdsource or Not to Crowdsource?
Implementing a Crowdsourcing Program
    Software Platforms
    Process and Implementation Steps
Case Study: PRANCCER
    Planning
    Challenge I: Patients and Caregivers
    Challenge II: Researchers and Clinicians
Recommendations & Lessons Learned
References
Appendix


Overview

Traditional funding mechanisms for biomedical research are poised for a transformation in coming years. Competition for limited funds to answer narrow questions in siloed disciplines will give way to open competition across multiple disciplines, aimed at solving specific critical questions posed by funding entities, driven by patients’ most urgent needs.

This new research paradigm will require an unprecedented degree of coordination among funding agencies, donors, and private and corporate foundations. Recognizing this, the American Heart Association (AHA) and the Patient-Centered Outcomes Research Institute (PCORI) joined forces in 2015 to accelerate research addressing their overlapping priorities in precision medicine and comparative effectiveness research.

Acknowledging the power of open, multidisciplinary research to drive medical progress, AHA and PCORI turned to the rapidly evolving methodology of crowdsourcing to find out what patients, clinicians, and researchers consider the most urgent priorities in cardiovascular medicine and to shape the direction and design of research targeting those priorities.

These discussions culminated in a comprehensive yearlong pilot initiative called PRANCCER (Precision Medicine Advances using Nationally Crowdsourced Comparative Effectiveness Research).

This playbook is meant to serve PCORI as a guide to best practices in scientific crowdsourcing. It begins with general background and guidance on crowdsourcing campaigns, followed by an account of the PRANCCER experience. It also details the resources and considerations required to help other organizations adapt the PRANCCER model to other disease areas as the new era of multi-dimensional healthcare and research unfolds.



Crowdsourcing Basics

The What and the Why of Crowdsourcing

“Crowdsourcing” means outsourcing to the crowd: assigning an information-gathering task to the public at large or to groups with shared interests or expertise. The host organization provides a forum or platform, usually online, for participants to solve a problem, which might be specific or open-ended.1

Since the early 2000s, crowdsourcing has proven effective across varied disciplines.1-3 Its best known manifestation is Wikipedia, a public-access encyclopedia crowdsourced from contributions by unpaid volunteers. The Wikipedia phenomenon highlights the natural, and understandable, appeal of sharing one’s own ideas and experiences.

Having matured into a well-vetted tool in some settings, crowdsourcing is finding increasing applications among sizable organizations, academic institutions, and businesses, to support endeavors in areas as diverse as information technology, fundraising, event planning, business development, aviation, personal computing,1,2 and, more recently, science and healthcare.3-5

In the realm of biomedical science, researchers at the University of Washington have harnessed the puzzle-solving skills of thousands of people around the world – no expertise required – to simulate protein folding on the university’s Foldit platform (fold.it) in the search for new proteins. As one measure of its power, these external contributors helped map the structure of an AIDS-related virus that had puzzled academic and industry experts for more than 15 years.

At the Tisch Cancer Institute of Mount Sinai School of Medicine, researchers incorporated crowdsourced input from physicians, researchers, and patients into the design of a cancer clinical trial. This improved consensus among these stakeholders about methods and outcomes and uncovered potentially project-altering information “gems.”5

Crowdsourcing initiatives may provide cash or prize incentives, but often participation serves as its own reward: a desire to share ideas or to learn, enhanced reputation among peers, establishment of social contacts, a sense of accomplishment and belonging, and for patients, insight into conditions and treatment options.6

Patients in particular are quick to relate details about their personal experiences in the course of a crowdsourcing exercise. These remain valuable in their own right, and worth archiving and revisiting, even when they don’t rise to the highest ranks among contributions.2,5,7



To Crowdsource or Not to Crowdsource?

Why choose crowdsourcing over surveys or other research methods? Surveys may be preferable when the goal is obtaining statistically valid data from large, tightly defined populations. Crowdsourcing is generally the better means to obtain in-depth, creative ideas aimed at solving problems. Crowdsourcing may elicit answers to questions researchers might never have thought to ask, and it is the mechanism of choice when energized engagement in the search for answers is a critical objective in itself.

Crowdsourcing is most powerful when engaging a passionate group of individuals with a broad range of experiences. Casting a wide net increases the likelihood of obtaining unique feedback that could steer innovation in unexpected directions.1,2,8 Crowdsourcing may also help fill knowledge gaps and provide useful perspective as a component of an otherwise conventionally designed study.9

Figure 1 summarizes how study objectives can guide the choice of information-gathering method, while Table 1 breaks out key advantages and disadvantages of crowdsourcing vs surveys. Considerations in deciding whether and how to crowdsource will also help inform the choice of a vendor.

Critically, one should select a crowdsourcing vendor with a solid track record in accessing the audience(s) most appropriate for the project’s objectives.



Figure 1

Process flow to discern when to use scientific crowdsourcing vs alternative options. Note that crowdsourcing can be used for both solutions and ideas. [Flow chart: the decision questions are “Are we asking for solutions and/or ideas?”, “Are we looking for a wide variety of responses from a diversified audience?”, and “Will a conversation be needed or questions asked about the information provided?”, with yes/no branches leading to Use Crowdsourcing, Use Focus Group, or Use Survey.]

Patient participation in crowdsourcing can improve health outcomes by deepening providers’ understanding of patient perspectives and by increasing patients’ understanding of their conditions and participation in health decisions.10,11 Moreover, patients participating in a highly publicized crowdsourcing initiative are more likely to share their ideas and experiences with family, friends, co-workers, and others than they would be if participating in a focus group or scientific survey. High public visibility around a crowdsourcing initiative enhances the profile of the sponsoring organization, highlighting its investment in giving the “crowd” a voice.


Table 1

Pros and Cons for Scientific Surveys and Crowdsourcing

Surveys

PROS
> Quick
> Reusable
> Can be low cost
> Can utilize more than one population at a time

CONS
> May be difficult to recruit participants
> Follow-up questions may require additional research
> Participants are confined to the questions at hand
> May require Institutional Review Board approval
> Perception of monotony by participants

Crowdsourcing

PROS
> Knowledge sharing between people with different experiences and talents
> Can identify and solve complex challenges
> Versatile
> Obtain unstructured responses

CONS
> Generally online only
> Can be expensive
> Quality of responses may vary
> May require management of planning and evaluation activities
> Participant engagement may require incentives


Implementing a Crowdsourcing Program

Crowdsourcing can generate a wide variety of information to help guide decisions or set goals. It can solve complex mathematical models,12 or be used to select, design, or craft marketing materials. Figure 2 summarizes options for when and how to use specific types of crowdsourcing based on organizational needs.


Figure 2

When and how to use the crowd. Adapted with permission from Harvard Business Review.2

I Need: Specific Solutions
> Purpose: Generate independent high-value solutions to highly challenging technical and analytical problems
> Challenges: Defining the problem in a generalized way and removing company-specific details to protect Intellectual Property (IP)
I Use: Campaigns and Contests

I Need: People to Share Ideas
> Purpose: Open collaboration within an organization or externally with multiple diverse stakeholder groups
> Challenges: Engaging participation; establishing a shared purpose, culture, and cohesiveness; protecting IP
I Use: Collaborative Communities


Software Platforms

More than 85% of Americans now use the internet via computer or mobile device, according to Pew Research.13 This near-universal access has led to an explosion in applications, websites, and new methods for people to communicate from disparate locations. A number of sophisticated crowdsourcing software platforms – designed to collect large amounts of information, sort it, and allow for easy tracking and evaluation – take full advantage of this deepening connectivity, making it easier for host organizations to engage stakeholders in ever more interactive ways.

While software can help initiate, collect, evaluate, and implement crowdsourcing initiatives, choosing the platform best suited to an organization’s needs can be daunting. The software capabilities listed in Table 2 should be discussed and prioritized early on in planning a crowdsourcing campaign.

Table 2

Features and Functions of Different Crowdsourcing Platforms

> Gamification Engine/Capabilities: Allows the crowdsourcing community to win points, badges, etc., based on participation level, votes received, or other evaluation measures
> Social Media Integration: Ability to integrate with outlets such as Facebook, LinkedIn, and Twitter to distribute and collect crowdsourced information and ideas
> Idea Posting: Capacity for software to accept rich media, uploads, photos, and videos
> Idea Collaboration: Ability to control who can comment on ideas and/or the ability for groups of people to work together on the same idea; capacity to perform automatic idea merging
> Idea Browsing: Ability to control who can see the ideas and comments being collected
> Idea Management: Ability to automatically sort, filter, cluster, and/or package comments and ideas as part of workflow control
> Idea Evaluation: Capacity to allow voting by the crowdsourcing community or a smaller select group; ability to create customized scorecards; ability to make comparisons between ideas and set priorities
> Community Integration: Availability of a “built-in” community associated with the vendor (e.g., a network of university students) that could extend the size and breadth of the participating community, if appropriate
> Remuneration Management: Capacity to manage and control compensation, honoraria, and/or prizes
> Multilanguage Translation: Capacity for auto-translation of crowdsourcing content
> Reporting and Analysis: Capacity to generate custom reports or analytics
> Systems Integration: Ability to integrate with other platforms or databases
> Integration Experience: The software company’s level of knowledge and experience working with your type of organization, on projects of the current type and scale


Process and Implementation Steps

Developing and implementing a successful crowdsourcing initiative breaks down into four main stages, as summarized in Table 3. Although broadly applicable, the stages are presented with a particular focus on ideation campaigns like PRANCCER. (Note: In this document, “campaign” may refer to an overall crowdsourcing initiative or just that part of the initiative during which participants can provide input; the meaning will be clear from context.)

Table 3

Core Implementation Stages for an Ideation Campaign

Plan
> Determine campaign-specific:
  - Desired outcomes
  - Leadership expectations
  - Target audience
  - Success metrics
  - Response process
  - Plan for reviewing and using entries
> Timeline
> Communications plan

Design
> Draft challenge language
> Determine custom entry fields
> Set evaluation criteria
> Draft entry tips and FAQs
> Write communications materials
> Set up and test the website
> Organize team workflow and roles

Launch
> Send communications
> Activate the campaign
> Moderate incoming entries
> Address users’ questions
> Capture metrics
> Provide points for weekly report

Evaluate
> Draft follow-up communications
> Close and archive the campaign
> Initiate the evaluation process
> Communicate status and results to users
> Run metrics
> Draft final report with data analysis



1 | Plan

Set specific goals and outcomes

Ask, “What are we asking people to provide?” An ideation campaign focuses on receiving ideas from participants, while a solution campaign focuses on receiving the steps and workflow to solve a problem.

These campaign outcomes must be defined early; they reflect the guiding principles from which all tactical considerations follow.

Project plan

Create a document spelling out program objectives, roles and responsibilities, and workflow for handling entries throughout the campaign lifecycle (intake, moderation, evaluation, addressing/implementation, close-out). Establish the hierarchy of approval and evaluation – a final decision maker is a necessity for any campaign. This plan should also include legal guidance on potential Intellectual Property (IP) issues, payment constraints, permission to re-contact participants, etc.

Communications plan

All crowdsourcing campaigns need a communications plan that spells out how, where, and when to deliver core campaign messages to build awareness and boost participation. Communications plans typically include the following:

> Key messages. Key messages must convey enough context and information to engage the intended participants, clearly articulating “What do we want you to do, and why?” All communications and materials should pull through the key messages to maximize exposure (“impressions”) to the target audience.

> Target audiences. Whom are we reaching out to? Why would they care? Understanding what motivates them toward involvement is the foundation of successful crowdsourcing of any type. Do they respond best to cash incentives? Market analysis may be needed to help understand your diverse stakeholder landscape and how best to message different groups within it.

> Channels/tactics. These can include news stories, social media posts, interpersonal communications, etc. For crowdsourcing, the most effective channels include email, social media, and other online communities and hubs. Ask, “What available avenues are most effective, and economical, for recruiting participants?”



> Delivery timeline. The delivery timeline aligns with the overall campaign timeline to ensure messages are being sent to drive crowds to the site. For scientific audiences, starting message delivery earlier is more effective to help raise awareness and interest among prospective crowds.

Customized tracking links are an essential tool for keeping tabs on which channels drive the most traffic to your campaign site. When someone clicks a link to your site from an online ad, news article, social media post, etc., a tracking link will create a record of where that link was clicked. Various commercial services can develop tracking links for your campaign.

Additionally, if your platform requires participant registration, you can capture metrics of impression (visit to the campaign site), conversion (registration of an account), and success ratio (submission of a response), as in the sketch below.
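As a minimal sketch of how these funnel metrics might be tabulated from tracking-link data – the event log and its field names here are hypothetical, not part of any particular platform’s API:

```python
from collections import Counter

# Hypothetical event log: one record per tracked visitor, noting the
# referring channel (decoded from the tracking link) and how far they got.
events = [
    {"channel": "email",    "registered": True,  "submitted": True},
    {"channel": "email",    "registered": True,  "submitted": False},
    {"channel": "twitter",  "registered": False, "submitted": False},
    {"channel": "facebook", "registered": True,  "submitted": True},
]

impressions = Counter(e["channel"] for e in events)
conversions = Counter(e["channel"] for e in events if e["registered"])
successes = Counter(e["channel"] for e in events if e["submitted"])

for channel, visits in impressions.items():
    registered = conversions.get(channel, 0)
    submitted = successes.get(channel, 0)
    conversion = registered / visits if visits else 0.0        # registrations per visit
    success_ratio = submitted / registered if registered else 0.0  # submissions per registration
    print(f"{channel}: {visits} visits, {conversion:.0%} conversion, "
          f"{success_ratio:.0%} success ratio")
```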

2 | Design

Draft campaign language

Each campaign is unique in its goals and desired outcomes. Building off the core messages, the campaign language needs to provide clear and concise information and directions to ensure participants provide responses that are on point. More involved crowdsourcing efforts may require more extensive background information.

Determine custom entry fields

Custom entry fields are embedded in forms or documents submitted by crowdsourcing participants. Completion of entry fields and submission of the document is itself a metric of participant engagement with the campaign. Additionally, criteria relevant to the evaluation process can often be converted into entry fields to aid in the evaluation stage.



Converting as much essential information as possible into entry fields that can populate a reference database is especially valuable for campaigns expecting a high volume of participation (see the sketch below). It is helpful to beta test entry forms with a pilot crowd of individuals not involved in the design process. This helps clarify whether more entry fields are needed or if any need to be reworked.
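A minimal sketch of the reference-database idea, assuming entry fields have already been parsed out of each submitted form; the field names and SQLite schema are illustrative, not the actual PRANCCER data model:

```python
import sqlite3

# Hypothetical entry fields parsed from each submitted form; the field names
# and schema are illustrative only.
submissions = [
    {"entry_id": "S-001", "condition_area": "stroke", "dilemma_summary": "..."},
    {"entry_id": "S-002", "condition_area": "heart failure", "dilemma_summary": "..."},
]

conn = sqlite3.connect("reference.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS entries (
           entry_id TEXT PRIMARY KEY,
           condition_area TEXT,
           dilemma_summary TEXT
       )"""
)
conn.executemany(
    "INSERT OR REPLACE INTO entries VALUES (:entry_id, :condition_area, :dilemma_summary)",
    submissions,
)
conn.commit()

# Structured entry fields now support simple queries during evaluation,
# e.g., counting submissions per condition area.
for area, count in conn.execute(
    "SELECT condition_area, COUNT(*) FROM entries GROUP BY condition_area"
):
    print(area, count)
conn.close()
```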

Set evaluation criteria

Deciding what to crowdsource and conceiving how to evaluate it are integral parts of the same process, which requires thorough planning from the beginning. Crowdsourced entries may be evaluated by external reviewers and/or the crowd itself. Especially in campaigns where a monetary incentive is at stake, it often makes sense to articulate the key evaluation criteria in the crowdsourcing document.14 It is also wise to prepare reviewer recruitment and introductory materials in tandem with setting the evaluation criteria. If it is hard to articulate an evaluation criterion for a reviewer, the criterion itself may need revisiting. Useful questions to ask during this phase include:

> Who will review the campaign? What is the necessary expertise?

> In what format will submissions be presented to the reviewers?

> How will reviewers submit their assessments, in what time frame, and how many submissions will they be assigned?

> Will reviewers assigned to the same submission have an opportunity to discuss and revise their scores prior to the final ranking?

> How will reviewers be educated on the process, and how can they have their questions answered?

> If lay members are included as reviewers, will the review process for them differ?

> How can diverse submissions be quantified and scored to allow fair competitive ranking?

> How will all campaigns and reviews be analyzed and stored?

> How will the review process be adjudicated?



Draft entry tips and FAQs

Crowds – the collective of your outreach audience – look for quick access to guidance on how to draft the best entry. FAQs need to be tailored to the needs of participants in specific campaigns; they must necessarily differ in campaigns for patients vs researchers. Ideally, these tips should be posted to the campaign website and expanded as needed, as real-world questions emerge. Again, beta testing the crowdsourcing process can help anticipate a first-time user’s points of confusion and is especially important in a campaign seeking ideas or solutions in a relatively unexplored area.

Develop communications materials

The communications plan should lay out timelines for developing and activating emails, handouts, web pages, social media posts, and other vehicles for campaign messages. For campaigns co-sponsored by more than one organization, build in review time to ensure consistent messaging and co-branding. These materials can do double duty as recruitment vehicles and as institutional promotion through news outlets, journals, and forums. (See Appendix for selected generalized communication resources that can be customized.)

Organize team workflow and roles

Workflows and processes, with clearly delineated roles and responsibilities, are crucial to a well-run campaign of any sort. Useful tools for managing the complexities of a scientific crowdsourcing campaign include Gantt charts and Responsible, Accountable, Consulted, Informed (RACI) matrices (see Appendix). Gantt charts identify tasks aligned with dates and responsible staff. A RACI matrix helps identify appropriate channels and additional staff who may help facilitate specific tasks over the course of the program, as well as approval and authorization pathways.



Set up and test the campaign website

The website, housed on the vendor’s platform, should include, at minimum, the campaign language, start and close dates, moderator names, workflow, and evaluation criteria; any other features needed for the campaign should be configured at this stage. Prior to launch, members of your team or others should test the site with simulated submissions, making deliberate mistakes in the process, to ensure the site is bug-free and functions as intended.

3 | Launch

Send communications

Based on the communications channels identified in the communications plan, send the materials drafted in the pre-launch stage to the targeted audiences.

Activate the campaign

Ensure the automated activation takes place at the desired launch time. If using a vendor, identify a point of contact at the vendor in case rapid response is required.

Moderate incoming entries

Monitor incoming entries and implement a response plan assessing each entry for completeness and adherence to policies and basic entry criteria. Based on entry suitability, send an appropriate email response to each entrant. Some vendors provide this service as part of their contractual agreement. If using a vendor-moderated entry system, ensure that the vendor’s staff is adequately trained to screen incoming submissions. Regardless of the monitoring protocol used, retain all submissions to capture any potentially valuable ideas.

Address users’ questions

Some users have questions above and beyond the FAQs. Collect feedback via social media channels and email, and provide responses in a timely fashion to keep feedback loops open and functional. Addressing feedback also includes routing complex questions through appropriate channels. Utilizing the host organization’s helpline or consultation line can provide a complete operations loop, provided the operators are informed and trained to answer general questions.



Capture metrics

As the campaign proceeds, identify, define, and report success metrics data to the team and leadership (such as the origin of interest/traffic coming to the campaign site). These data help quantify the campaign’s impact and inform adjustments to the communications plan, site user interface, campaign timing, and other factors. Metrics also help identify cost-saving measures that can make future campaigns more impactful.

Provide points for weekly report

Report campaign progress against the metrics and criteria for success established during the planning process. Reporting achievement of benchmarks enables leadership to monitor what is happening on the ground and provide feedback on the campaign’s progress so that necessary adjustments can be made.

4 | Evaluate

Close and archive the campaign

When the campaign closes, new entries are no longer permitted and the existing entries are archived for future reference.

Initiate the evaluation process

Based on the evaluation protocol established during program planning, reviewers review and analyze the ideas gathered during the response period against pre-determined criteria.

Communicate status and results to users

Final-stage communications bring closure to the campaign while also informing users about what will happen with the entries. These communications describe the impact of the campaign and may inform current entrants of future efforts. This is the time to express appreciation for all participants’ contributions and, if a monetary prize was offered, to identify winners.

Run metrics

Run and report aggregate metrics that paint a picture of the campaign as a whole. These reports help identify lessons learned and future best practices.

Draft final report with data analysis

Analyze the aggregated data from the entire campaign for patterns and insights. While reports may not be necessary for all campaigns (depending on the desired outcomes), they signify a final close-out and quantify the campaign’s impact.



Case Study: PRANCCER

The PRANCCER pilot initiative took shape in the course of discussions between AHA and PCORI leaders on recent technological and medical advances relevant to patient outcomes research. Patients continue to demand more active roles in their own healthcare, a trend that also has great potential to shape health research and make it more patient-centered. Finding ways to increase opportunities for patients and those who care for them to have a voice in research is fundamental to PCORI’s research funding and core to the AHA. AHA’s and PCORI’s leaders noted that rapid advances in crowdsourcing technologies could help empower patients and other stakeholders further, giving them a voice in the direction of research that could improve outcomes for millions.


Crowdsourcing could help guide research funding decisions toward problems that matter most to patients – a priority the AHA and PCORI both share – and be tailored to address the organizations’ respective missions in precision medicine and comparative effectiveness research. AHA and PCORI announced their partnership and launched the yearlong pilot initiative at AHA’s November Scientific Sessions.

Planning

Early decisions

For this pilot program, AHA and PCORI decided to enlist the crowd for ideation – the generation of promising ideas for new research in cardiovascular medicine.

Initially, organizers planned on a single crowdsourcing campaign that would reach out to both scientists and patients for input on unmet needs in cardiovascular disease research. It soon became clear that separate, consecutive campaigns, directed first at patients and caregivers and then at scientists and clinicians, made better sense both logistically and for generating actionable research recommendations addressing AHA and PCORI priorities. This also enabled the operational team to plan different systems of review and scoring for the two campaigns: the scientist/clinician responses would be evaluated by a peer review process that would not be applicable to the anecdotal patient/caregiver submissions.

Guiding Principles for PRANCCER

AHA and PCORI agreed that PRANCCER ideation campaigns must:

> Be based on US health outcomes
> Incorporate comparative effectiveness research
  - Be feasible for researchers and clinicians to study
  - Fit within PCORI’s comparative effectiveness research guidelines
> Incorporate precision medicine
  - Involve factors including genetics, diet, and lifestyle
  - Seek ways to tailor treatment to patients’ individual characteristics


Another key early choice was to focus the ideation campaign on treatment rather than prevention. Soliciting patients’ most difficult decisions regarding specific treatments and procedures, it was felt, had the best chance of eliciting unique questions translatable into research designs. Moreover, the difficulty of studying and comparing a multitude of lifestyle options argued for keeping prevention and treatment separate.

A vendor was chosen based on its previous work in healthcare and its breadth of experience with major clients in business and government. The two PRANCCER campaign sites would be housed on the vendor’s website. Although their content would be customized to meet AHA and PCORI needs, the sites would otherwise conform to the vendor’s format and nomenclature. Because the vendor calls crowdsourcing campaigns “challenges,” the PRANCCER campaigns were known as Challenges I and II.

AHA and PCORI’s communications and marketing teams played integral roles in program strategy and planning. When the partnership was launched, they posted concurrent blog and news articles about the partnership. Social and traditional media campaigns supported subsequent PRANCCER milestones throughout the following year (Figure 3).


Figure 3

PRANCCER timeline: AHA and PCORI collaboration announced (Nov–Dec); PRANCCER planning and design phase (Jan–April); Challenge I, patients and caregivers (May); evaluation and planning phase (June–Aug); Challenge II, clinicians and researchers (September); evaluation and wrap-up (Oct–Dec).


Forming the joint advisory committee

A joint PCORI-AHA Advisory Committee (PAAC), formed in December, comprised former AHA presidents, clinical cardiologists, nurses, experts in precision medicine and comparative effectiveness research, and lay/patient representatives, with two appointed leads, one each from AHA and PCORI. PAAC input was instrumental in the initial decision to divide PRANCCER into sequential patient/caregiver (Challenge I) and scientist/clinician (Challenge II) campaigns. Throughout the PRANCCER initiative, the PAAC provided ongoing guidance to help ensure that PRANCCER generated crowdsourced insights applicable to new research.

Legal considerations

Specific legal agreements needed to be established during the PRANCCER planning and design phases to ensure adequate protection, indemnification, and policy adherence.

> AHA and PCORI signed a business contract spelling out their respective contributions to the PRANCCER initiative, objectives of the program, and roles and responsibilities. AHA assumed the operating role in interacting with the crowdsourcing vendor.

> The contract with the crowdsourcing vendor specified AHA and PCORI policy requirements that needed to be maintained. In particular, because PCORI may only fund ideas and solutions relevant to healthcare in the United States, this would be a central criterion in evaluating crowdsourced submissions. Also, to protect participants’ privacy, the project would not personally identify participants, require verification of medical diagnoses, or request medical records. The vendor had requirements of its own – for example, setting minimum prize amounts for a campaign and disclosing the review criteria by which submissions would be scored.

> The vendor’s “challenge-specific agreement” set out requirements for crowdsourcing participants, especially regarding IP. For both campaigns, participants were required to acknowledge that while they owned the content of their submissions, AHA and PCORI retained unrestricted but nonexclusive rights to use them. The agreement also noted that participants may receive recognition for their entries other than prize money.

> Given the sensitivity of content based on patients’ lives and healthcare experiences, AHA and PCORI requested that the vendor increase the level of its indemnity coverage for the PRANCCER challenges. The vendor complied and raised its coverage to the minimum level required by AHA.


Challenge I | Patients and Caregivers

Design

Challenge I was directed toward patients and caregivers. The key focus in the design phase was to elicit patient dilemmas that were potentially remediable through AHA/PCORI-funded research in comparative effectiveness and precision medicine. In March, the PAAC refined the campaign’s target questions, calling for patients and caregivers to relate their “biggest challenges” and “what was hardest” in dealing with heart disease, stroke, and other vascular diseases. This language would form the core of the participant submission form, as well as the promotional communications driving individuals to the campaign site.

Launch

Based on past experience, the vendor recommended one month as the ideal length of time to keep a crowdsourcing campaign open for submissions. Following this advice, Challenge I started in early May and ended in early June.

To raise awareness of Challenge I, communications across multiple channels began at intervals, starting just prior to launch and continuing over the course of the campaign. These channels included:

> AHA and PCORI home pages

> Email marketing

> Social media, including Facebook groups, Twitter posts, and dedicated ad space

> News and blog posts

> Patient Support Networks

> Forums and teleconferences

Critically, a series of three mass emails – one prior to launch, one during the campaign, and one three days before the end – had the potential to reach hundreds of thousands of subscribers to various AHA patient support networks, newsletters, and specialized interest and volunteer groups. During the campaign, two social media posts, one Twitter and one Facebook, went out each week. PCORI amplified AHA’s messaging through targeted pushes to its networks of patients, caregivers, patient advocates, clinicians, and other stakeholders, and shared word about the challenge several times through its weekly email alerts and regularly through its social media channels. Per the vendor’s recommendation, the campaign was not promoted through national news: for a previous campaign that the vendor conducted for a US government agency, promotions through national news generated thousands of inappropriate or otherwise unusable submissions.


The Three Core Questions in Challenge I (Patients and Caregivers)

1. Despite multiple available treatment options that your physician may describe, you may not always know which option is right/best for you. Tell us about the most difficult decision that you or your family faced when evaluating treatment options for heart disease or stroke.

2. We would like to understand more about what you considered most important when making this choice or decision. Please pick your top three choices and tell us why they are important to you and your family.

3. If you could change the way things were done, what would you do to improve your experience in the decision-making process concerning the treatment of your heart disease or stroke? How satisfied are you with the current treatment plan?




Challenge I website visitors who registered to participate were directed to download a “fillable” PDF document, with blank spaces for entering data and answering questions. They could complete or leave blank an optional set of confidential demographic questions before engaging the three main questions about the most difficult decisions they faced in evaluating treatment options for heart disease or stroke (see “The Three Core Questions” above). To complete the submission, they had to upload the completed form back to the website.

Out of 8,000+ visitors to the campaign site, 522 began work on submissions, of whom 158 completed a submission within the allotted one-month period. Nearly every channel used to promote the campaign yielded at least one submission, underscoring the value of a comprehensive communications plan.

A “first pass” review by the vendor and operational staff checked whether participants followed directions properly, answered all three main questions, and provided decisional dilemmas, as opposed to, for example, recommending specific treatments or suggesting topics for basic scientific research. In the end, 103 submissions were deemed acceptable for official evaluation (Figure 4).

Figure 4

Responses, submissions, finalists, and winners in the patient/caregiver ideation challenge (Challenge I). Explanations of the steps appear in the Launch and Evaluation sections.

> Visitors to campaign site: >8,000
> Submissions started: 522
> Submissions entered: 158
> Submissions validated: 103
> Top submissions: 52
> Selected submissions: 29
> Prize winners ($1,000 each): 10


Evaluation

A pool of 33 reviewers from AHA and PCORI was recruited during the campaign, including AHA Council chairs (both physicians and nurses), individuals experienced in comparative effectiveness research and precision medicine, and two additional lay/patient volunteers. All reviewers signed nondisclosure agreements to preserve the confidentiality of the reviewing process. The reviewers were briefed on the review process via a detailed email and an instructional Prezi presentation that organized review steps graphically in a form reviewers could explore and absorb at their own pace.

The size of the review team was based on the anticipated number of submissions. Each reviewer received about ten submissions to review, with the aim of having each submission judged by three reviewers. Submissions were supplied digitally as pdf documents, accompanied by digital scoring forms (see Appendix). Reviewers scored each submission on the basis of translatability into topics for comparative effectiveness and precision medicine research. Entries were also categorized by cardiovascular condition. Reviewers worked independently, without meeting in person or via teleconference.
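A minimal sketch of how such an assignment might be generated; the ID formats are invented, and the round-robin scheme is one reasonable choice, not necessarily the one PRANCCER used:

```python
import itertools

def assign_reviews(submission_ids, reviewer_ids, per_submission=3):
    """Round-robin assignment: each submission gets `per_submission` distinct
    reviewers, and reviewer workloads stay within one submission of each other."""
    assert len(reviewer_ids) >= per_submission
    assignments = {r: [] for r in reviewer_ids}
    cycle = itertools.cycle(reviewer_ids)
    for sub in submission_ids:
        for _ in range(per_submission):  # consecutive cycle picks are distinct
            assignments[next(cycle)].append(sub)
    return assignments

# 103 validated submissions and 33 reviewers yield loads of 9-10 submissions
# each, mirroring the "about ten submissions per reviewer" described above.
subs = [f"S-{i:03d}" for i in range(1, 104)]
reviewers = [f"R-{i:02d}" for i in range(1, 34)]
workload = assign_reviews(subs, reviewers)
loads = sorted(len(v) for v in workload.values())
print(loads[0], loads[-1])  # -> 9 10
```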

Out of the 103 submissions, the highest-scoring 54 were organized into a spreadsheet including summary, score, identified condition areas, and reviewer comments (see Appendix). The PAAC was then convened to further refine the potential candidates to 29, from which the ten $1,000 prize winners were selected, with final confirmation following an online background check (Figure 4).

At the close of the campaign, metrics were tabulated to evaluate the performance of different marketing channels. Through the use of tracking links from AHA channels, as well as statistics from the vendor, the PRANCCER operational team could keep track of how many people viewed the campaign website, how they arrived there, and how many ultimately submitted entries. These data provide insight on the most effective channels for directing interested individuals to the campaign site, as well as which groups, among those who began the submission process, followed it all the way to completion.

Most Challenge I participants completed a set of optional demographic questions in the submission form. Because the submission process carried an intrinsic bias toward individuals knowledgeable enough to download and upload fillable PDF forms, there was concern that submissions would be skewed toward younger respondents. Notably, however, responses showed that individuals approximately 45 to 54 years of age were the most likely to engage with the campaign.


Challenge II | Researchers and Clinicians

Design

The design of Challenge II was built on the foundation of the top-scoring patient and caregiver uncertainties and decisional dilemmas from Challenge I. Researchers and clinicians were invited to submit specific ideas for future research that could help resolve these dilemmas and that fulfilled criteria for both comparative effectiveness research and precision medicine. The top-scoring Challenge I submissions fell into seven major clinical categories (see the list below).

Challenge II participants were instructed to design a framework for a clinical study around one of the seven major areas. They were required to provide two or more treatment or preventive options that could be compared in a comparative effectiveness research study in the selected clinical area. Then they were asked to define the specific population or type of patient to which the research question applies. This called for more directive language in the online submission form than in Challenge I. Table 4 shows the prompts provided to participants to help ensure their submissions covered criteria for both comparative effectiveness research and precision medicine.
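One way the submission requirements above might be captured as a structured record with basic completeness checks; this is a sketch only, and the field names and checks are invented, not the actual Challenge II form:

```python
from dataclasses import dataclass

@dataclass
class ChallengeIISubmission:
    """Illustrative structure for one Challenge II entry (hypothetical fields)."""
    clinical_category: str         # one of the seven major clinical areas
    comparator_options: list[str]  # two or more treatments/approaches to compare (CER)
    target_population: str         # the specific patient population for the question
    pm_rationale: str              # how the study tailors treatment to individuals (PM)

    def validate(self) -> list[str]:
        """Return a list of problems; an empty list means basic criteria are met."""
        problems = []
        if len(self.comparator_options) < 2:
            problems.append("CER: provide at least two options to compare")
        if not self.target_population.strip():
            problems.append("Define the specific population the question applies to")
        if not self.pm_rationale.strip():
            problems.append("PM: explain how treatment is tailored to patient characteristics")
        return problems
```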

Launch

Participants again had a one-month window (early September to early October) for submission. $5,000 would be awarded to each of four winning entries. Members of topic-relevant AHA Councils, PCORI’s contacts, and the broader cardiovascular research community were alerted to Challenge II via channels including the following:

> Email marketing

> News articles

> Journal ads

> Social media posts


Clinical Categories of the Top Patient Decisional Dilemmas

> Patients with heart failure due to coronary artery disease and other conditions
> Individuals who have experienced symptoms or evidence of coronary heart disease at a young age
> Patients who experienced a stroke and wish to prevent a recurrence
> Individuals who experienced a stroke and are receiving treatment for an underlying autoimmune condition such as lupus or rheumatoid arthritis
> Elderly individuals with aortic stenosis
> Parents or caregivers of newborns with congenital heart disease, including total anomalous pulmonary venous return
> Patients with atrial arrhythmias, including atrial fibrillation and atrial tachycardias


Table 4

Prompts for Incorporating Comparative Effectiveness Research (CER) and Precision Medicine (PM) Into Challenge II Submissions


Topic: Patients with heart failure due to coronary artery disease and other conditions
> CER example: Consider comparing medical therapy, implantable cardiac defibrillators, or other options
> PM example: Comparing populations of patients with and without known genetic mutations associated with a specific form of heart disease

Topic: Individuals who have experienced symptoms or evidence of coronary heart disease at a young age
> CER example: Consider comparing treatment options such as bypass surgery, percutaneous coronary intervention with stenting, and/or medical therapy
> PM example: Comparing populations with and without a strong family history of early heart disease, within which genetic susceptibilities and potential modifiers could be discovered

Topic: Patients who experienced a stroke and wish to prevent a recurrence
> CER example: Consider comparing the relative impact of preventative therapies, which may include medication, exercise, and diet
> PM example: Using metabolomics to identify biomarkers as risk predictors

Topic: Individuals who experienced a stroke and are receiving treatment for an underlying autoimmune condition such as lupus or rheumatoid arthritis
> CER example: Consider comparing populations with and without autoimmune diseases and their responses to different treatment options or medical therapies
> PM example: Using a systems biology approach merged with clinical outcomes to identify how stroke therapy interacts with medical support for lupus, rheumatoid arthritis, or other autoimmune conditions

Topic: Elderly individuals with aortic stenosis
> CER example: Consider comparing different approaches to valve replacement in patients with negligible symptoms but severe aortic stenosis
> PM example: Comparing populations with and without specific genetic markers

Topic: Parents or caregivers of newborns with congenital heart disease, including total anomalous pulmonary venous return
> CER example: Consider comparing the timing of surgery and accompanying rehabilitation and medication regimens
> PM example: Population stratification with and without known congenital heart mutations

Topic: Patients with atrial arrhythmias, including atrial fibrillation and atrial tachycardias
> CER example: Consider comparing treatment options: cardioversion, ablation, antiarrhythmic medication(s), anticoagulation, or combinations thereof
> PM example: Are there genetic markers that can predict the best treatment options?


These communications began in August and continued through the one-month submission window. Promotions were worded to make clear that this was not a call for a full-fledged research proposal, but rather for a summary description of a clinical study specific enough to guide the design of a subsequent full study. As before, participants were required to download and complete a fillable PDF submission form and then re-upload it to the Challenge II website. Over 4,000 individuals viewed the campaign website; 196 registered to begin work on submissions, of whom 33 completed the process (Figure 5).

Figure 5

Visits, responses, submissions, and winners in the researcher/clinician ideation campaign (Challenge II). See text for more detail.

> Visitors to campaign site: >4,000
> Submissions started: 196
> Submissions entered: 33
> Prize winners ($5,000 each): 4


Evaluation

AHA and PCORI recruited 34 reviewers with expertise spanning the seven clinical fields to evaluate the 33 Challenge II submissions. Reviewers were drawn from members of the Challenge I review team and the PAAC, along with additional experts from relevant AHA councils, selected reviewers knowledgeable in comparative effectiveness research and precision medicine, and patient volunteers.

One month prior to the start of the submission period, reviewers were contacted by email with background on the review process. Training included an instructional website and self-guided Prezi presentation, as well as a Word document explaining the similarities and differences between the Challenge II review process and traditional peer-reviewed grant evaluation.

In contrast to the patient/caregiver campaign, submissions were reviewed using a five-point scoring system roughly modeled after the AHA and NIH peer review system for scoring research applications. The peer review process focused on reviewing and rank-ordering the submissions based on research feasibility and compatibility with comparative effectiveness research and precision medicine criteria. For each submission, a primary reviewer provided initial scoring and highlighted key strengths and weaknesses. Additional reviewers added highlights not already covered, with lay/patient reviewers providing qualitative comments.

After reviewers submitted their initial scores independently, the review team met via teleconference. All reviewers had the opportunity to revise their initial scores based on the evolving input and discussion. The final rank order was determined in a subsequent group discussion, and the four winning entries were selected in coordination with AHA/PCORI leadership.
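A minimal sketch of how scores like these might be aggregated and rank-ordered; the entry IDs and scores are invented, and the assumption that lower scores are stronger follows NIH convention rather than anything specified here:

```python
from statistics import mean

# Hypothetical scores from three reviewers per entry on the five-point scale
# described above; lower = stronger is an assumption (NIH-style), since the
# actual PRANCCER scale direction is not specified.
scores = {
    "S-101": [2, 1, 2],
    "S-102": [3, 4, 3],
    "S-103": [1, 2, 1],
}

# Average each entry's scores, then rank-order from strongest to weakest.
ranked = sorted(scores, key=lambda entry: mean(scores[entry]))
for rank, entry in enumerate(ranked, start=1):
    print(f"{rank}. {entry} (mean score {mean(scores[entry]):.2f})")
```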

As before, metrics were gathered showing how individuals found their way to the Challenge II website, who went on to register, and who ultimately submitted entries.


Recommendations & Lessons Learned

Ideation campaigns like Challenge I demand time and effort from participants, naturally limiting the number of submissions. Reserve crowdsourcing for information that cannot be collected more efficiently, and from more respondents, through survey research (see Figure 1).

Consider the population(s) you wish to sample and choose crowdsourcing approaches that will reach them, taking into account, for example, their access to the internet and their scientific literacy.

Develop clear objectives for your crowdsourcing initiative, and select a vendor whose platform serves those objectives – for example, allowing a combination of survey research and crowdsourcing, if appropriate to your needs.

Select a vendor with a platform that is flexible and does not demand conformity to specialized jargon unrelated to your field.

If possible, select a vendor capable of supporting a fully online submission process, ideally through an interface or applet that works for mobile devices as well as computers.

Leave time in the planning process for beta testing of the submission entry process by testers not involved in the content or design of your campaign.

Recommendations & Lessons Learned

Case Study: PRANCCER

Page 28: Crowdsourcing Medical Research Priorities · 2017-01-26 · Crowdsourcing Medical Research Priorities A Guide for Funding Agencies Lessons and Best Practices from PRANCCER PRecision

28

References Appendix

Implementing a Crowdsourcing Program

Recommendations & Lessons Learned

OverviewCrowdsourcing

Basics

Recommendations & Lessons Learned

In Challenge I, the difficulty of explaining the scientific ends to which submissions would be applied led to a wide range of on- and off-topic responses. This could have been addressed by providing general examples of appropriate responses, as was done in Challenge II (see Table 4).

Based in part on beta testing results, be sure to provide adequate instructions – for example, a tutorial video – that address site navigation and the submission process, especially for less technically proficient participants. Test their effectiveness before proceeding.

Allow sufficient time to market and publicize your crowdsourcing campaign, taking your audience(s) into account.

Patients are highly motivated participants; when they contribute to a crowdsourcing campaign, consider providing them with opportunities to join relevant support networks or participate in other patient-centered research or program efforts, as a way to keep them engaged with your organization moving forward.

Researchers and clinicians will likely require different incentives to participate in a crowdsourcing initiative. In addition to cash prizes, consider structuring the experience to promote networking opportunities, greater visibility and stature among peers, and similar professional benefits. In assessing whether Challenge II was sufficiently engaging to attract participants, questions to consider in future planning include: Was the task of developing a preliminary Request for Applications (RFA) topic interesting enough? Did the dual criteria of comparative effectiveness research and precision medicine make the submission and review processes too complex? Was there another component of scientific engagement that could have been employed?

Keep information and instructions for reviewers concise, and hold a meeting or teleconference early on for all reviewers, to answer operational and content questions.


Tracking links provided invaluable information about which promotional channels drove the most and the least traffic to the Challenge I and II websites. These links identified where site visitors came from online, but apart from country of origin, no geographic data were captured. A regional breakdown of site visitors and participants would be a valuable addition to tracking information for future crowdsourcing initiatives.
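As one concrete way to capture that breakdown, the sketch below (in Python) tallies a visit funnel by tracking-link channel and by region. It is a hypothetical illustration only: the record layout, channel names, regions, and stage labels are invented assumptions, not the PRANCCER tracking data or any vendor platform's API.

```python
from collections import Counter

# Hypothetical visit records combining the tracking-link channel with the
# regional field recommended above, plus how far each visitor progressed
# (visited -> registered -> submitted). All values are invented.
visits = [
    {"channel": "email",    "region": "Midwest",   "stage": "submitted"},
    {"channel": "twitter",  "region": "Northeast", "stage": "registered"},
    {"channel": "email",    "region": "South",     "stage": "visited"},
    {"channel": "facebook", "region": "West",      "stage": "visited"},
]

# What tracking links already showed: traffic per promotional channel.
by_channel = Counter(v["channel"] for v in visits)

# The missing piece: the same counts broken down by region and stage.
by_region = Counter((v["region"], v["stage"]) for v in visits)

print("Visitors by channel:", dict(by_channel))
for (region, stage), n in sorted(by_region.items()):
    print(f"{region:>10}  {stage:<10} {n}")
```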

Both challenges employed unfamiliar formats and scoring systems, and Challenge I involved a new category of applicants: patients and caregivers. Clinicians and researchers were called upon to score submissions both within and outside their specialized areas of expertise, and the role of lay reviewers in scoring was sometimes ambiguous. Moreover, comparative effectiveness research and precision medicine were not easy terms for reviewers to define and agree upon during the scoring process. Therefore, select a review process that is as efficient and user-friendly as possible, and provide reviewers with clear instructions and training in triaging and scoring submissions.

Finally, seek input from your particular types of crowdsourcing participants, as well as from reviewers, on ways to improve their experience. Determine whether you need to respond to all participants or post the results, so that those who participated know the outcome.


References

1. Howe J. The rise of crowdsourcing. Wired Magazine. 2006;14(6):1-5.

2. Boudreau KJ, Lakhani KR. Using the crowd as an innovation partner. Harvard Business Review. April 2013. https://hbr.org/2013/04/using-the-crowd-as-an-innovation-partner. Accessed December 7, 2016.

3. Lakhani KR, Boudreau KJ, Loh PR, et al. Prize-based contests can provide solutions to computational biology problems. Nat Biotechnol. 2013;31(2):108-11.

4. Khatib F, DiMaio F, Cooper S, et al. Crystal structure of a monomeric retroviral protease solved by protein folding game players. Nat Struct Mol Biol. 2011;18(10):1175-7.

5. Leiter A, Sablinski T, Diefenbach M, et al. Use of crowdsourcing for cancer clinical trial development. J Natl Cancer Inst. 2014;106(10):dju258.

6. Leimeister JM, Huber M, Bretschneider U, Krcmar H. Leveraging crowdsourcing: activation-supporting components for IT-based ideas competition. J Management Information Systems. 2009;26(1):197-224.

7. Riedl J, Riedl E. Crowdsourcing medical research. Computer. 2013;46(1):89-92.

8. Behrend TS, Sharek DJ, Meade AW, Wiebe EN. The viability of crowdsourcing for survey research. Behav Res Methods. 2011;43(3):800-13.

9. Courtney H, Lovallo D, Clarke C. Deciding how to decide. Harvard Business Review. November 2013. https://hbr.org/2013/11/deciding-how-to-decide. Accessed December 7, 2016.

10. Rumsfeld JS, Brooks SC, Aufderheide TP, et al. Use of mobile devices, social media, and crowdsourcing as digital strategies to improve emergency cardiovascular care: a scientific statement from the American Heart Association. Circulation. 2016;134(8):e87-e108.

11. Weiner M. The potential of crowdsourcing to improve patient-centered care. Patient. 2014;7(2):123-7.

12. Bilal E, Dutkowski J, Guinney J, et al. Improving breast cancer survival analysis through competition-based multidimensional modeling. PLoS Comput Biol. 2013;9(5):e1003047.

13. Anderson M, Perrin A. 13% of Americans don't use the internet. Who are they? Fact Tank: News in the Numbers. Pew Research Center; September 7, 2016. http://www.pewresearch.org/fact-tank/2016/09/07/some-americans-dont-use-the-internet-who-are-they/. Accessed December 8, 2016.

14. Gouillart F, Billings D. Community-powered problem solving. Harvard Business Review. April 2013. https://hbr.org/2013/04/community-powered-problem-solving. Accessed December 7, 2016.


Appendix


Project Templates

The following are links to templates for use in planning, promoting, and evaluating scientific crowdsourcing campaigns. Clicking on the links will open separate windows displaying editable versions of the files.


Project plan Gantt chart template

Blank worksheet for planning a two-stage scientific crowdsourcing initiative modeled on PRANCCER Challenges I and II.

RACI template

Blank Responsible, Accountable, Consulted, Informed (RACI) matrix for establishing roles and responsibilities for a scientific crowdsourcing initiative.

Patient/caregiver recruitment template

Sample promotional messaging in email format for recruiting participants for a patient/caregiver crowdsourcing campaign modeled on PRANCCER Challenge I.

Researcher/clinician recruitment template

Sample promotional messaging in email format for recruiting participants for a researcher/clinician crowdsourcing campaign modeled on PRANCCER Challenge II.

Reviewer recruitment template

Sample promotional messaging in email format for recruiting reviewers for a crowdsourcing campaign modeled on PRANCCER Challenge I.

Submission review meeting agenda template

Sample agenda for a live or teleconferenced review meeting for a researcher/clinician crowdsourcing campaign, outlining the basic review process used in PRANCCER Challenge II following initial independent scoring of submissions.

Scoring worksheet template

Blank worksheet for scoring researcher/clinician submissions during a live or teleconferenced review meeting, modeled after the review process used in PRANCCER Challenge II.


Commercial Crowdsourcing Resources

The companies and applications listed below are just a sampling of the crowdsourcing solutions available. Each of these options, as well as others, should be vetted against your selected and prioritized features and functions.

100% Open Toolkit – www.toolkit.100open.com

Amazon Web Services: Mechanical Turk – www.mturk.com

Brightidea – www.brightidea.com

Chaordix – www.chaordix.com

CogniStreamer – www.cognistreamer.com

Communifire – www.axerosolutions.com

Crowdicity – www.crowdicity.com

IdeaScale – www.ideascale.com

Imaginatik – www.imaginatik.com

InnoCentive – www.innocentive.com

Nosco – www.nos.co

QMarkets – www.qmarkets.net

Spigit – www.spigit.com


Patient-Centered Outcomes Research Institute and American Heart Association Challenge for Researchers and Clinicians

To participate in this Challenge, please save this template first. Then complete this document, save again, and upload it to our crowdsourcing platform. We encourage the use of references to support your answers; include citations at the end of this document in the space provided.

Please indicate if you are willing to be contacted.

Yes    No

Email:

Challenge Deliverables

For this challenge, we are asking you to select one area where there is treatment uncertainty or that poses a decisional dilemma for patients or their clinicians. A requirement of this challenge is to start with a decisional dilemma or treatment uncertainty from the list below.

The form will ask several questions directed specifically at how you might compare two or more treatment or preventive options to address this decisional dilemma for patients, and how you will incorporate precision medicine approaches within the overall strategy. Both comparative effectiveness and precision medicine approaches must be used in this challenge. Examples are provided under each treatment uncertainty/decisional dilemma below. If you have more than one idea for this challenge, you may make multiple submissions, but please make sure that each solution is represented on a separate PDF template.

1. Patients with heart failure due to coronary artery disease and other heart diseases affecting normal heart function.

i. Consider comparing medical therapy, implantable cardiac defibrillators, or other options.
ii. Precision Medicine component example: comparing populations of patients with and without known genetic mutations associated with a specific form of heart disease.

2. Individuals who experienced symptoms or evidence of coronary heart disease at a young age.

i. Consider comparing treatment options such as bypass surgery, percutaneous coronary intervention with stenting, and/or medical therapy.
ii. Precision Medicine component example: comparing populations with and without a strong family history of early heart disease, within which genetic susceptibilities and potential modifiers could be discovered.

3. Patients who have experienced a stroke and wish to prevent a recurrent stroke.

i. Consider comparing the relative impact of preventive therapies, which may include medication, exercise, and diet.
ii. Precision Medicine component example: using metabolomics to identify biomarkers as risk predictors.


4. Individuals who experience a stroke with underlying autoimmune conditions, such as lupus or rheumatoid arthritis, for which they have been receiving therapy.

i. Consider comparing populations with and without autoimmune diseases and their responses to different treatment options or medical therapies.
ii. Precision Medicine component example: using a systems biology approach merged with clinical outcomes to identify how stroke therapy interacts with medical support for lupus, rheumatoid arthritis, or other autoimmune conditions.

5. Elderly individuals with aortic stenosis.

i. Consider comparing different approaches to valve replacement in patients with negligible symptoms but severe aortic stenosis.
ii. Precision Medicine component example: comparing a population with and without specific genetic markers.

6. Parents or caregivers of newborns with congenital heart disease, including total anomalous pulmonary venous return.

i. Consider comparing the timing of surgery and accompanying rehabilitation and medication regimens.
ii. Precision Medicine component example: population stratification with and without known congenital heart mutations.

7. Patients with atrial arrhythmias, including those with atrial fibrillation and atrial tachycardias.

i. Consider comparing treatment options: cardioversion, ablation, antiarrhythmic medication(s), anticoagulation, or a combination thereof.
ii. Precision Medicine component example: are there genetic markers that can predict the best treatment options?

1. You must choose one of the seven areas where a patient identified a clinical condition [refer to Patient Decisional Dilemmas 1-7 above]. State this choice below.

2. Name two or more treatment or preventive options that may be used to manage this condition and that could be compared in a comparative effectiveness research study.

Treatment / Preventive Option 1:

Treatment / Preventive Option 2:

Treatment / Preventive Option 3 (if applicable; list more as appropriate):

3. State the specific population or type of patient to which the research question applies, for example, age group, clinical context, or stage of disease. These patient subgroups could be defined by characteristics such as phenotype (e.g., age or gender), genotype, co-morbid conditions, environomics, or other measures.


4. List patient-centered outcomes:

5. State the Precision Medicine approach:

6. Provide evidence to support your question or your stated uncertainty by citing guidelines that define clinical areas where evidence is suboptimal, or a recent systematic review or meta-analysis that cites the need for more evidence.

Citations:

1.

2.

3.

4.


7. Summarize your comparative effectiveness research study design and precision medicine approaches as they might appear in a potential funding announcement.

8. Additional References


OPTIONAL Demographic Reporting

The AHA/ASA and PCORI would like to better understand the needs of individuals who have been diagnosed with heart disease or stroke, as well as those who are concerned with their heart health. Your input will help us develop materials, resources, and tools to help you or your loved one manage your health. Your decision to provide or not provide this demographic information in this InnoCentive Challenge is completely voluntary. If you do not wish to complete this survey, please scroll to page 3 to begin the submission form. The demographic information you provide will be completely confidential and will not impact the evaluation of your submission in any way. Any use of this information will not include your name or any other individual information by which you could be identified. These data are essential for identifying improvements that are meaningful to you.

We thank you for your involvement! Please answer the following:

1. How do you identify yourself?
American Indian or Alaska Native
Asian
Black or African American
Hispanic or Latino
Native Hawaiian or Other Pacific Islander
White
Other

2. What is your age group?
18-24
25-34
35-44
45-54
55-64
65-74
75 or older

3. What is your highest education completed?
Some High School
High School / G.E.D.
Some College
College Degree
Graduate Degree

4. Gender Identity
Male
Female
Other (Please Specify) _________

5. Would you be willing to receive further communication regarding this survey or subsequent Challenges to help us better understand your needs and focus our research efforts?
Yes
No

Contact Email:

NOTE: The mandatory submission form begins on page 3.

The following three questions are for patients who have been diagnosed with heart disease or stroke. If you are not part of this group, please skip or select Not Applicable (N/A) for questions 6-8.

6. Have you been diagnosed with heart disease or stroke?
Yes
No
No, but I have family or friends with heart disease or stroke.

7. Which underlying conditions related to heart disease or stroke have you been diagnosed with? (Select all that apply)
High blood pressure
High cholesterol
Heart attack, acute coronary syndrome, or angina
Stroke or transient ischemic attack (TIA)
Atrial fibrillation or AFib
Other arrhythmia (heart rhythm problems)
Diabetes
Heart failure
Heart valve disease or problems
Congenital heart defect
Peripheral artery disease
Venous thromboembolism
Not Applicable or N/A
Other

8. What treatment plan did your healthcare provider recommend? (Select all that apply)
Medication
Increasing your physical activity
Reducing the amount of sodium in your diet
Medical procedure (e.g., stent, pacemaker, implantable cardioverter defibrillator, bypass or valve surgery)
Cardiac rehab
Stopping tobacco use
Other diet changes
Other lifestyle changes
N/A, None
Other

*Note: Information from this form will not be tied in any way to other personal information. It will not be sold, distributed, or otherwise used in a way that would impact your privacy. It will be used for data collection purposes only, to further aid the science you are contributing to. InnoCentive will not use any of this information; full liability rests with the AHA/ASA and PCORI.


Heart Disease & Stroke Treatment Challenge - Submission Form

If you wish to submit additional documentation (text, graphics, diagrams, pictures), please upload them as an attachment in your submission. If you have any issues with this form or questions about the Challenge, please submit your questions through the Challenge Message Center.

Despite the multiple treatment options your physician may describe, you may not always know which option is right or best for you. Tell us about the most difficult decision that you or your family faced when evaluating treatment options for heart disease or stroke.


We would like to understand more about what you considered most important when making this choice or decision. Please pick your top three choices and tell us why they are important to you and your family.

1.

2.

3.

Are you a patient, caregiver or an individual interested in cardiovascular health?

Patient

Caregiver

Interested Individual

What condition do you have? If you are not a patient, which condition concerns you as you consider your treatment options in the future?

If Other, please specify:


If you could change the way things were done, what would you do to improve your experience in the decision-making process concerning the treatment of your heart disease or stroke? How satisfied are you with your current treatment plan?