Software Assurance: Crippling Coming Cyberassaults

Dr. Paul E. Black, National Institute of Standards and Technology, http://samate.nist.gov/, [email protected]

Transcript of Software Assurance: Crippling Coming Cyberassaults

Page 1:

Software Assurance: Crippling Coming Cyberassaults

Dr. Paul E. Black
National Institute of Standards and Technology
http://samate.nist.gov/
[email protected]

Page 2:

Report Documentation Page (Form Approved, OMB No. 0704-0188)

Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to a penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number.

1. REPORT DATE: APR 2010
2. REPORT TYPE:
3. DATES COVERED: 00-00-2010 to 00-00-2010
4. TITLE AND SUBTITLE: Software Assurance: Crippling Coming Cyberassaults
5a. CONTRACT NUMBER:
5b. GRANT NUMBER:
5c. PROGRAM ELEMENT NUMBER:
5d. PROJECT NUMBER:
5e. TASK NUMBER:
5f. WORK UNIT NUMBER:
6. AUTHOR(S):
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): National Institute of Standards and Technology, 100 Bureau Dr # 1000, Gaithersburg, MD, 20899
8. PERFORMING ORGANIZATION REPORT NUMBER:
9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES):
10. SPONSOR/MONITOR'S ACRONYM(S):
11. SPONSOR/MONITOR'S REPORT NUMBER(S):
12. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for public release; distribution unlimited
13. SUPPLEMENTARY NOTES: Presented at the 22nd Systems and Software Technology Conference (SSTC), 26-29 April 2010, Salt Lake City, UT.
14. ABSTRACT:
15. SUBJECT TERMS:
16. SECURITY CLASSIFICATION OF: a. REPORT: unclassified; b. ABSTRACT: unclassified; c. THIS PAGE: unclassified
17. LIMITATION OF ABSTRACT: Same as Report (SAR)
18. NUMBER OF PAGES: 29
19a. NAME OF RESPONSIBLE PERSON:

Standard Form 298 (Rev. 8-98), Prescribed by ANSI Std Z39-18

Page 3:

What is NIST?
U.S. National Institute of Standards and Technology
– A non-regulatory agency in the Dept. of Commerce
– 3,000 employees + adjuncts
– Gaithersburg, Maryland and Boulder, Colorado
– Primarily research, not funding
– Over 100 years in standards and measurements: from dental ceramics to microspheres, from quantum computers to fire codes, from body armor to DNA forensics, from biometrics to text retrieval.


Page 4:

The NIST SAMATE Project
The Software Assurance Metrics And Tool Evaluation (SAMATE) project is sponsored in part by DHS.
Current areas of concentration:
– Web application scanners
– Source code security analyzers
– Static Analyzer Tool Exposition (SATE)
– Software Reference Dataset
– Software labels
– Malware research protocols
Web site: http://samate.nist.gov/


Page 5:

Software Reference Dataset
– Public repository for software test cases
– Almost 1800 cases in C, C++, Java, and Python
– Search and compose custom test suites
– Contributions from Fortify, Defence R&D Canada, Klocwork, MIT Lincoln Laboratory, Praxis, Secure Software, etc.


Page 6:

Software Facts Label
Software Facts should:
– Be voluntary
– Be absolutely simple to produce
– Have a standard format for other claims
What could be easily supplied? (One possible rendering is sketched after this list.)
– Source available? Yes/No/Escrowed
– Default installation is secure?
– Accessed: network, disk, ...
– What configuration files? (registry, ...)
– Certificates (e.g., "No severe weaknesses found by CodeChecker ver. 3.2")
Cautions:
– A label can give false confidence.
– A label can shut out better software.
– Labeling diverts effort from real improvements.
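To make the idea concrete, here is a minimal sketch of such a label rendered as a C structure. The slide lists the fields but defines no format, so every name and type below is an assumption made purely for illustration.

```c
/* Hypothetical rendering of a Software Facts label as a C structure.
 * Field names and types are assumptions, not a defined standard. */
struct software_facts {
    const char *source_available;       /* "Yes", "No", or "Escrowed" */
    int         default_install_secure; /* 1 = yes, 0 = no */
    const char *resources_accessed;     /* e.g., "network, disk" */
    const char *configuration_files;    /* e.g., "registry" */
    const char *certificates;           /* e.g., "No severe weaknesses found
                                           by CodeChecker ver. 3.2" */
};

/* An example label for a made-up product. */
static const struct software_facts example_label = {
    .source_available       = "Escrowed",
    .default_install_secure = 1,
    .resources_accessed     = "network, disk",
    .configuration_files    = "registry",
    .certificates           = "No severe weaknesses found by CodeChecker ver. 3.2",
};
```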


Page 7:

Researching Risky Software
Many people research malware, but there are no widely accepted protocols.
Biological research has defined levels with associated practices, safety equipment, and facilities.
Some approaches are:
– Weakened programs (auxotrophs)
– Programs that ALERT
– Outgoing firewalls
– Isolated networks

Page 8:

Assurance that software is less vulnerable to coming cyberassaults
Static and dynamic analysis
Static Analysis Tool Exposition: 2009 outcomes and 2010 progress


Page 9:

Assurance from three sources

A = f(p, s, e)

where A is functional assurance, p is process quality, s is assessed quality of software, and e is execution resilience.


Page 10:

p is process quality
High assurance software must be developed with care, for instance:
– Validated requirements
– Good system architecture
– Security designed in and built in
– Trained programmers


Page 11:

s is assessed quality of software
Two general kinds of software assessment (a weakness both kinds can catch is sketched below):
– Static analysis
  • e.g., code reviews and scanner tools
  • examines code
– Testing (dynamic analysis)
  • e.g., penetration testing, fuzzing, and red teams
  • runs code
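As a concrete illustration (invented for this transcript, not taken from the slides), the tiny C program below contains an unchecked strcpy(). A source code scanner can flag it by examining the code; a fuzzer or penetration test can trigger it by running the program with an overlong argument.

```c
/* Illustrative weakness: a classic unchecked strcpy(). */
#include <stdio.h>
#include <string.h>

static void greet(const char *name) {
    char buf[16];
    strcpy(buf, name);          /* no bounds check: possible buffer overflow */
    printf("hello, %s\n", buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        greet(argv[1]);         /* attacker-controlled input reaches strcpy */
    return 0;
}
```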


Page 12:

e is execution resilience
The execution platform can add assurance that the system will function as intended.
Some techniques are:
– Randomize memory allocation
– Execute in a "sandbox" or virtual machine
– Monitor execution and react to intrusions
– Replicate processes and vote on output (a voting sketch follows this list)
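The last technique can be illustrated with a minimal sketch, assuming three replicas of the same computation whose results are compared by a simple majority vote. The replica and function names are invented for this example; real systems would run independently built or independently executing copies of the component.

```c
/* Minimal sketch of "replicate ... and vote on output". */
#include <stdio.h>

typedef int (*replica_fn)(int);

/* Hypothetical replicas of the same computation. */
static int replica_a(int x) { return x * x; }
static int replica_b(int x) { return x * x; }
static int replica_c(int x) { return x * x; }

/* Majority vote over three results; returns -1 if no two agree. */
static int vote3(int r1, int r2, int r3) {
    if (r1 == r2 || r1 == r3) return r1;
    if (r2 == r3) return r2;
    return -1;
}

int main(void) {
    replica_fn replicas[3] = { replica_a, replica_b, replica_c };
    int results[3];

    for (int i = 0; i < 3; i++)
        results[i] = replicas[i](7);

    int agreed = vote3(results[0], results[1], results[2]);
    if (agreed == -1)
        printf("replicas disagree; raise an alarm\n");
    else
        printf("accepted output: %d\n", agreed);
    return 0;
}
```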


Page 13:

Software analysis is vital
Benefits are:
– Provide feedback to the development process
– Build product assurance when the process is less visible:
  • contractors
  • open source
  • legacy software
– Confirm minimum quality for execution


Page 14:

Analysis is like a seatbelt ...


Page 15:

Assurance that software is less vulnerable to coming cyberassaults
Static and dynamic analysis
Static Analysis Tool Exposition: 2009 outcomes and 2010 progress


Page 16:

Comparing Static Analysis with Dynamic Analysis
Static Analysis:
– Code review
– Binary, byte, or source code scanners
– Model checkers & property proofs
– Assurance case
Dynamic Analysis:
– Execute code
– Simulate design
– Fuzzing, coverage, MC/DC, use cases
– Penetration testing
– Field tests


Page 17:

Strengths of Static Analysis
– Applies to many artifacts, not just code
– Independent of platform
– In theory, examines all possible executions, paths, states, etc.
– Can focus on a single specific property


Page 18:

Strengths of Dynamic Analysis
– No need for code
– Conceptually easier: "if you can run the system, you can run the test."
– No (less) need to build or validate models or make assumptions.
– Checks installation and operation, along with end-to-end or whole-system behavior.


Page 19:

Static and Dynamic Analysis Complement Each Other
Static Analysis:
– Handles unfinished code
– Can find backdoors, e.g., full access for user name "JoshuaCaleb" (sketched below)
– Potentially complete
Dynamic Analysis:
– Code not needed, e.g., embedded systems
– Has few(er) assumptions
– Covers end-to-end or system tests
– Assesses as-installed
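Here is a minimal sketch of the kind of backdoor the slide mentions; only the magic user name "JoshuaCaleb" comes from the slide, and the surrounding code is invented for illustration. A reviewer or static analyzer reading the source sees the hard-coded name at once, while dynamic testing notices it only if a test happens to supply that exact name.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical authorization check with a hard-coded backdoor. */
static int is_authorized(const char *user, int has_valid_password) {
    if (strcmp(user, "JoshuaCaleb") == 0)
        return 1;                   /* backdoor: full access, no password */
    return has_valid_password;      /* normal path */
}

int main(void) {
    /* Typical tests never supply the magic name, so dynamic testing is
     * unlikely to expose the backdoor; reading the source does. */
    printf("%d\n", is_authorized("alice", 0));        /* 0: denied */
    printf("%d\n", is_authorized("JoshuaCaleb", 0));  /* 1: backdoor */
    return 0;
}
```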


Page 20:

Assurance that software is less vulnerable to coming cyberassaults
Static and dynamic analysis
Static Analysis Tool Exposition: 2009 outcomes and 2010 progress


Page 21:

Static Analysis Tool Exposition (SATE) Overview
Goal: advance research in, and improvement of, static analysis tools for security-relevant defects, and speed tool adoption by demonstrating their use on real software.
Checkpoints:
– Participants run tools on Java and C programs we choose
– NIST-led researchers analyze the reports
– Everyone shares results and observations at a workshop
– Later, release a final report and all data
http://samate.nist.gov/SATE.html
Co-funded by NIST and DHS/NCSD


Page 22:

SATE Participants
2008:
– Aspect Security ASC
– Checkmarx CxSuite
– Flawfinder
– Fortify SCA
– Grammatech CodeSonar
– HP DevInspect
– SofCheck Inspector for Java
– UMD FindBugs
– Veracode SecurityReview
2009:
– Armorize CodeSecure
– Checkmarx CxSuite
– Coverity Prevent
– Grammatech CodeSonar
– Klocwork Insight
– LDRA Testbed
– SofCheck Inspector for Java
– Veracode SecurityReview


Page 23:

"Number of bugs" is undefined
Tangled flow: 2 sources, 2 sinks, 4 paths
[Diagram: two free sites (lines 1503 and 2644) and two use sites (lines 808 and 819) connected by four possible paths. A sketch of the pattern follows.]
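The sketch below is invented code, not the SATE test case, but it shows the same shape: two places that may free a buffer and two later places that use it, so the same underlying mistake can be counted as one weakness, two, or four source/sink pairs.

```c
#include <stdlib.h>
#include <string.h>

/* Two sites that may free 'buf' and two later sites that use it:
 * 2 sources x 2 sinks = 4 possible use-after-free paths. */
static void process(char *buf, int early_free, int log_twice) {
    if (early_free)
        free(buf);              /* free site 1 */
    else if (buf[0] == '\0')
        free(buf);              /* free site 2 */

    buf[0] = 'x';               /* use site 1: use-after-free if freed above */
    if (log_twice)
        buf[1] = 'y';           /* use site 2: another possible use-after-free */
}

int main(void) {
    char *buf = malloc(16);
    if (buf == NULL)
        return 1;
    strcpy(buf, "data");
    process(buf, 0, 1);         /* this call takes a safe path ... */
    /* ... but process(buf, 1, 1), for example, would free and then use. */
    free(buf);
    return 0;
}
```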


Page 24:

Summary of 2009 tool reports
– Reports from 18 tool runs
– About 20,000 total warnings
  • but tools prioritize by severity and likelihood
– Reviewed 521 warnings; 370 were not false
– Number of warnings varies a lot by tool and test case
– 83 CWE ids / 221 weakness names


Page 25:

Tools don't report the same warnings
[Pie chart: overlap in not-false warnings by how many tools reported each warning (1, 2, 3, or 4 tools); the segment values are 207, 120, 40, and 30.]


Page 26:

Some types have more overlap
[Pie chart: overlap in not-false buffer errors, again broken down by whether 1, 2, 3, or 4 tools reported each warning.]


Page 27:

Why don't tools find the same things?
– Tools look for different weakness classes
– Tools are optimized differently
[Chart: warnings plotted along two axes, apparently "more certain" (vertical) and "more severe" (horizontal).]

Page 28:

Tools find things that people find
[Chart comparing manually found vulnerabilities with tool warnings:]
– IRSSI (3): 1 same or other, 1 coincidental, 1 none
– Roller (10): 4 same or other, 1 coincidental, 5 none
Includes two access control issues, which are very hard for tools.


Page 29:

SATE 2010 tentative timeline
– Hold organizing workshop (12 Mar 2010)
– Recruit planning committee. Revise protocol.
– Choose test sets. Provide them to participants (17 May)
– Participants run their tools. Return reports (25 June)
– Analyze tool reports (27 Aug)
– Share results at workshop (October)
– Publish data (after Jan 2011)


Page 30:

Acronyms
CWE - Common Weakness Enumeration, http://cwe.mitre.org/
DHS/NCSD - Department of Homeland Security / National Cyber Security Division
MC/DC - Modified Condition/Decision Coverage
SAMATE - Software Assurance Metrics And Tool Evaluation (project at NIST)
SATE - Static Analysis Tool Exposition (annual event)
NIST - National Institute of Standards and Technology

