COMPUTER SECURITY
8. Secure Programming (2)
Motivation (3)
Famous vulnerability by way of bad programming practice (5)
Malicious logic: attacks by way of programming (6)
  General types (6)
  Descriptions (6)
Design principles for a secure computer system (Saltzer & Schroeder) (7)
Programming: quality certification (11)
Sound advice for programming robustly (Matt Bishop) (12)
Testing for Vulnerabilities (Vulnerability Analyses) (15)
Pointers... (21)
J. Magalhães Cruz Computer Security – Secure Programming 1/22
8. Secure Programming
GETS(3)                    Linux Programmer's Manual                    GETS(3)

SYNOPSIS
       char *gets(char *s);

DESCRIPTION
       gets() reads a line from stdin into the buffer pointed to by s until
       either a terminating newline or EOF, which it replaces with a null
       byte ('\0'). No check for buffer overrun is performed (see BUGS below).

BUGS
       Never use gets(). Because it is impossible to tell without knowing the
       data in advance how many characters gets() will read, and because
       gets() will continue to store characters past the end of the buffer,
       it is extremely dangerous to use. It has been used to break computer
       security. Use fgets() instead.
Programming for Defense or for Attack?
● learn how to defend by studying the attack!
Motivation
● Situation: American government report, 2000: 7 of the 10 main threats to the Internet were exploitations of programming errors!1
● Attack prevention: many computer attacks would be avoided if critical software was well written!
● Is it feasible? In principle, yes; in practice, no!
1 I do not have current statistics, but software errors are still considered a major threat to computer systems' security (e.g. see the CWE/SANS report on the 2011 Top 25 Most Dangerous Software Errors).
Difficulties of prevention
● Human:
○ people make mistakes
○ use of “hidden doors” planted by the programmers themselves!
● Software complexity:
○ difficult to foresee all possible interactions between different programs, especially considering their future developments...
● Cost:
○ writing secure software is often not viable for economic reasons:
■ reduced development time
■ loss of application efficiency
Famous vulnerability by way of bad programming practice
Buffer overflow (transbordamento de dados)
Buffer overflow (or overrun) vulnerability and exploitation.
Malicious logic: attacks by way of programming
General types
● Trojan horse (cavalo de Tróia)
  ○ subtypes
● Virus
  ○ subtypes
● Worm
● Rootkit (PT: pacote de controlo)
Descriptions
● see Chapter on Malware!
Design principles for a secure computer system (Saltzer & Schroeder)
Main, useful heuristic principles
Principle of least privilege (Princípio do mínimo privilégio) [S & S #6 1]
● a subject should only maintain the strictly necessary privileges for the completion of a task
Principle of failsafe defaults (Princípio das pré-definições seguras) [S & S #2]
● a subject should have no access to an object, unless it was explicitly granted
  ○ this seems to be a “child” of the Principle of least privilege: for all situations where access control is not specified, access is denied!
1 order of presentation in Saltzer & Schroeder's 1975 paper “The protection of Information in Computer Systems”
... Saltzer & Schroeder's design principles for a secure computer system (cont.)
Principle of complete mediation [verification!] (Princípio da verificação completa) [S & S #3]
● all accesses to objects should always be verified for allowance
  ○ so, it will also be possible to check for unauthorized accesses in successive accesses!
Principle of separation of privilege (Princípio da separação de privilégios) [S & S #5]
● authorizations should not be granted based on a single condition
  ○ an important door should need at least two keys to be opened
Principle of least common mechanism (Princípio do mínimo mecanismo comum) [S & S #1]
● mechanisms used to access resources should not be shared
  ○ sharing is an opportunity for information leakage, and problems in shared areas affect a larger population
... Saltzer & Schroeder's design principles for a secure computer system (cont.)
Principle of economy of mechanism (Princípio da economia de mecanismo) [S & S #7]
● security mechanisms should always be as simple as possible
  ○ application of the widely quoted engineering design principle KISS, “Keep it simple, stupid”
Principle of open design (Princípio da abertura de projecto) [S & S #4]
● the security of a system should not depend on the secrecy of its design or implementation
  ○ negation of “security through obscurity”...
Principle of psychological acceptability (Princípio da aceitabilidade psicológica) [S & S #8]
● security mechanisms should never make the system more difficult to use in a normal way (than it was without security)
  ○ otherwise, users may misuse the mechanisms or try to bypass them!
... Saltzer & Schroeder's design principles for a secure computer system (cont.)
Additional principles of doubtful usefulness
Principle of work factor (Princípio do factor esforço) [S & S #9]
● security mechanisms should cost more to circumvent than to develop & deploy
  ○ difficult to implement: how to measure the effort of a yet-unknown type of attack?
  ○ “child” of the principle that protection should cost no more than the value of the to-be-protected system
Principle of compromise recording (Princípio do registo de ataques) [S & S #10]
● logging of a system's compromises might be preferred to the system's complete protection
  ○ if the system is compromised, you might be able to learn how
  ○ difficult to implement: a clever attacker might leave no traces
  ○ “child” of the principle of “full append-only logging” of resource usage
Programming: quality certification
● correctness:
○ expected operation, in accordance with the known and certified algorithms used
● robustness:
○ correct operation, with additional resistance to non-malicious input and intermediate actions or results
○ graceful and informative termination if the normal, planned conclusion of the program turns out to be impossible
● secureness:
○ robust operation with additional resistance to malicious usage
Sound advice for programming robustly (Matt Bishop)
Robust Programming:
● Robust programming, also called bomb-proof programming, is a style of programming that prevents abnormal termination or unexpected actions. Basically, it requires code to handle bad (invalid or absurd) inputs in a reasonable way. If an internal error occurs, the program or library terminates gracefully, and provides enough information so the programmer can debug the program or routine.
...Sound advice for programming robustly (cont.)
(Some) Guidelines:
● Be paranoid. Don't trust anything you don't generate!
● Assume human behaviour. The routine caller or user is either an idiot or too lazy to read manual pages or documentation.
  [Comment: this applies to the programmer him/herself, because he/she will frequently and easily forget the details of his/her own programs.]
  [Comment 2: of course, a good program should be self-explanatory and need little or no documentation in the first place.]
● Avoid dangerous implements. A "dangerous implement" is anything that your routines expect to remain consistent across calls.
● Never think “can't happen”. It will! As the old saw goes, "never say never"!
Example: Java
● both a programming language and an execution environment
● secure programming is more easily achieved than with older programming languages, such as C:
  ○ range-checking on arrays
  ○ access modifiers (private, protected, public...)
  ○ the bytecode verifier checks for conformance to the Java Language Specification: violations of Java language rules, memory-management violations, stack underflows or overflows...
  ○ ...
Example: utilization of SUID programs
● Write a small program that illustrates the difference between the user identities used in Unix: real user ID, effective user ID, saved set-user-ID.
Testing for Vulnerabilities (Vulnerability Analyses)
Definitions
● vulnerability (or security flaw): a specific failure of security controls
● exploitation: actual use of vulnerability to violate the system's security policy
● attacker: one who attempts to exploit a vulnerability (possibly after first identifying it)
Vulnerability Analyses
● Generally, involve all the parts and operation of the computer system.
● In particular, vulnerabilities due to bad programming practices (including the design, coding, deployment and maintenance of all software) are the most important.
...Testing for Vulnerabilities (cont.)
Detecting system's vulnerabilities
● Formal verification
  ○ logic proof of the degree of conformance of a system design (or of a real system) to the requirements' specification stated by the security policy
● Penetration testing
  ○ authorized attempt to violate specific security controls
  ○ usually has its leeway defined by specific goals and time constraints
...Testing for Vulnerabilities (cont.)
Example of vulnerability classification: computer program flaws
● NRL's1 Taxonomy of Computer Program Security Flaws
  ○ classes depend on three questions:
    ■ where in the system is the fault manifest? (Location TAB)
      ● classes: software, hardware; subclasses: operating system, support...
    ■ how did the fault enter the system? (Genesis TAB)
      ● classes: intentional, inadvertent; subclasses: malicious...
    ■ when did the fault enter the system? (Time of introduction TAB)
      ● classes: during development, maintenance, operation
1 United States' Naval Research Laboratory
NRL's vulnerability classes: flaws by Location (Landwehr et al., 1994)
Question    Class      Subclass           Sub-subclass             % cases
Location?   software   operating system   system initialization    16
                                          memory management         4
                                          process management       20
                                          device management         6
                                          file management          12
                                          authentication           10
                                          other / unknown           2
                       support            privileged utilities     20
                                          unprivileged utilities    2
                       application                                   2
            hardware                                                 6
NRL's vulnerability classes: flaws by Genesis (Landwehr et al., 1994)
Question   Class         Subclass                       Sub-subclass           % cases
Genesis?   intentional   malicious                      trojan horse / virus   18
                                                        trap door              (4)
                                                        logic bomb              2
                         non-malicious                  covert channel          6
                                                        other                  10
           inadvertent   validation error                                      20
                         domain error                                          14
                         serialization                                          4
                         authentication                                        10
                         boundary condition violation                           8
                         other logic error                                      8
NRL's vulnerability classes: flaws by Time of introduction (Landwehr et al., 1994)
Question                Class         Subclass                               % cases
Time of introduction?   development   requirement / specification / design   44
                                      source code                            30
                                      object code                             2
                        maintenance                                           6
                        operation                                            18
More on this subject in → Chap. 9 Protection of local systems.
Pointers...
● A report on “Processes to Produce Secure Software”, 2004 – S. T. Redwine, Jr. and Noopur Davis
  ○ www.cigital.com/papers/download/secure_software_process.pdf
● The “Saltzer & Schroeder design principles”, 1975 – J. H. Saltzer and M. D. Schroeder
  ○ ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=1451869
● 2011 CWE/SANS “Top 25 Most Dangerous Software Errors” – MITRE Corporation and SANS Institute
  ○ cwe.mitre.org/top25/
● A formative paper on “Buffer Overruns”, 2000 – Lefty
  ○ www.windowsecurity.com/whitepapers/Buffer_Overruns_Whats_the_Real_Story.html
● Penetration tests: the “Flaw Hypothesis Methodology”, 1975 – R. R. Linde
  ○ dl.acm.org/citation.cfm?id=1500018
● The NRL's “A Taxonomy of Computer Program Security Flaws”, 1994 – C. Landwehr, A. Bull, J. McDermott, W. Choi
  ○ dl.acm.org/citation.cfm?id=185412

Further References
● Software without Security Holes – Prabhaker Mateti
  ○ www.cs.wright.edu/~pmateti/InternetSecurity/Lectures/SecSoftware/index.html
● Robust Programming – Matt Bishop
  ○ nob.cs.ucdavis.edu/~bishop/secprog/robust.html
● Secure Programming for Linux and Unix HOWTO – David A. Wheeler
  ○ en.tldp.org/HOWTO/SecureProgramsHOWTO/index.html
● Programming Secure Applications for Unix-like Systems – David A. Wheeler
  ○ www.dwheeler.com/secureprograms/secureprogramminghandouts.pdf
● Carnegie Mellon's CERT (Computer Emergency Readiness Team) program
  ○ www.cert.org/ & www.cert.org/securecoding/