Producing Production Quality Software
Producing Production Quality Software
Lecture 14: Group Programming Practices
Prof. Arthur P. Goldberg
Fall, 2004
Topics
• Logistics
• MIS ‘Best Practices’: Capers Jones’s view
• Final Assignment
• Improving Software Management
• Effort Estimation
• Literature
• Productivity
Oral Exam Logistics
– Out morning of Dec. 11, with 2 example answers
– Last day for answering questions about the final is Dec. 14
– Exam session
  • 20 minutes, location TBD
  • Be on time, or be rescheduled
  • 5 questions, 4 minutes each
    – 2 you pick
    – 2 chosen at random
    – 1 I pick for you
  • Closed book
  • While answering, you may talk or write on paper or a whiteboard
– Results will be released on or about 12/23, after everyone has taken the exam
HW5 Logistics
• Last day for answering questions is Dec. 9
• Grading
  – With Chris
    • Chris will read your code in advance
    • Exam session
      – 20 minutes, location TBD
      – Be on time, or be rescheduled
      – Chris gives you a file containing what you emailed us, and you run it
      – Chris gives you some sequences of commands, and you run them
      – Different students will receive different command sequences, depending on what needs testing
  – Results will be released after everyone has had HW5 graded
Best Practices
• As influences on
  – Quality
  – Productivity
Software Development Quality
• Capers Jones and Software Productivity Research’s quantitative approach
• Material from Capers Jones, Software Assessments, Benchmarks, and Best Practices, Addison-Wesley, 1st edition (April 28, 2000)
  – Measured 9,000 development projects from 600 organizations
    • In 150 of the Fortune 500
    • In 100 small companies
    • In many government and military organizations
Classification
• Classify organizations and projects by
  – Client country
  – Industry
  – Project ‘nature’
  – Project scope
  – Software type
  – Project class
• 37,400 permutations
• Calibrate and benchmark measures of productivity and defect rate
Project ‘Nature’
1. New Development
2. Enhancement (new functions added to existing software)
3. Mandatory change (updates for new statutes or regulations)
4. Maintenance (defect repairs to existing software)
5. Performance updates (revisions needed to improve throughput)
6. Conversion or adaptation (migration to a new platform)
7. Nationalization (migration to a new national language)
8. Reengineering (re-implementing a legacy application)
9. Mass update (modification for the Euro or Y2K)
10. Hybrid (concurrent repairs and functional additions)
Project Scope
1. Subroutine or sub-element of a program
2. Module of a program
3. Reusable module or object
4. Disposable prototype
5. Evolutionary prototype
6. Stand-alone program
7. Component of a system
8. Release of a system
9. New system or application
10. Compound system (multiple linked systems)
Software types
• Non-procedural (spreadsheet, query, generators, and so forth)
• Web applet
• Batch application
• Interactive application
• Batch database application
• Interactive database application
• Pen-based application
• Client/server application (two tier)
• Client/server application (three tier)
• Enterprise resource planning (ERP) application
• Scientific or mathematical application
• Systems or hardware control application
• Communications or telecommunications application
• Process control application
• Embedded or real-time application
• Trusted system with stringent security
• Graphics, animation, or image-processing application
• Robotic or manufacturing control application
• Expert system with substantial knowledge acquisition
• Artificial intelligence application
• Neural net application
• Hybrid project (multiple types)
Project class
• Personal application for private use
• Personal application to be shared by others
• Academic program developed in an academic environment
• Internal application to be installed at one location
• Internal application to be accessed via an intranet or time-sharing
• Internal application to be installed at many locations
• Internal application developed by contract personnel
• Internal application developed using military standards
• External application to be freeware or shareware
• External application to be placed on the World Wide Web
• External application leased to users
• External application embedded in hardware
• External application bundled with hardware
• External application marketed commercially
• External application developed under outsource contract
• External application developed under government contract
• External application developed under military contract
General project classes
End-user applications - developed privately for personal use
Information systems - developed in-house for corporate use (MIS)
Outsource or contract projects - developed under legally binding contract
Commercial software - developed to be marketed to external customers
Systems software - developed to control physical devices
Military software - developed using military standards
Review ‘Group Programming Practices Handout’
• ‘Best Technical Practices for MIS Software’
• Distribution of SPR Project Benchmarks circa 1999
The AMS
• Changes in the specification
• Questions
Software Effort Estimation
• Issues
  – When will the bleeping system be done?
– How much will it cost?
– If I give you Joe’s team, when will it be ready?
• Approaches
  – Little or no estimation: XP
– Function Point Analysis
– Lister’s “Estimating Quality Factor”
– Wideband Delphi
Software Effort Estimation
Function Point Analysis
Function points
• A conceptual measure of complexity
• A linear sum, approximately
– Simplistically, FPC = External_inputs * 4 + External_outputs * 7 + External_inquiries * 5 + Internal_logical_files * 4 + External_interface_files * 10
• Accuracy
  – To +/- 10%
  – IFPUG study by Kemerer at MIT in 1993
• Rarely used
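The slide's simplistic formula can be sketched directly in code. This is only the unadjusted linear sum quoted above, using the slide's weights; as the following slide notes, full IFPUG counting varies each multiplier with functional complexity.

```python
# Sketch of the slide's simplified (unadjusted) function point count.
# The weights are the simplistic averages quoted above; full IFPUG
# counting varies each multiplier with functional complexity.

WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 7,
    "external_inquiries": 5,
    "internal_logical_files": 4,
    "external_interface_files": 10,
}

def function_point_count(counts: dict) -> int:
    """Linear sum of the five element counts times their weights."""
    return sum(WEIGHTS[name] * counts.get(name, 0) for name in WEIGHTS)

if __name__ == "__main__":
    example = {
        "external_inputs": 20,
        "external_outputs": 10,
        "external_inquiries": 15,
        "internal_logical_files": 5,
        "external_interface_files": 2,
    }
    # 20*4 + 10*7 + 15*5 + 5*4 + 2*10 = 265
    print(function_point_count(example))
```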
Functional complexity modifier
– The multipliers vary with complexity
– Data element type (DET) count
– File type reference (FTR) count
– Review tables in ‘Group Programming Practices Handout’
FP ‘Priesthood’: International Function Point Users Group, www.ifpug.org
• Rules Version 4.1 1999
• Certified Function Point Specialist
  – Examination
• http://www.spr.com/products/function.htm, http://www.spr.com/products/programming.htm
Backfiring
• Convert SLOC to FPs
• Driven by tables in ‘Group Programming Practices Data’
• About +/- 20% accurate
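Backfiring amounts to a table lookup. The SLOC-per-FP ratios below are illustrative values only, roughly in the spirit of Jones's published tables; the tables in the 'Group Programming Practices Data' handout are the ones to use for real conversions.

```python
# Backfiring: estimate function points from a SLOC count via
# language-level ratios. The ratios here are illustrative, not
# authoritative; real backfiring uses published tables.

SLOC_PER_FP = {      # approximate, illustrative values
    "assembly": 320,
    "c": 128,
    "cobol": 107,
    "java": 53,
    "smalltalk": 21,
}

def backfire(sloc: int, language: str) -> float:
    """Convert a SLOC count to an approximate function point total."""
    return sloc / SLOC_PER_FP[language.lower()]

# 64,000 SLOC of C is roughly 500 function points at 128 SLOC/FP.
print(backfire(64_000, "C"))
```

Consistent with the slide, a result obtained this way should be treated as accurate to about +/- 20% at best.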
Software Effort Estimation: EQF
• Tim Lister’s “Estimating Quality Factor”
• Lister
  – The Atlantic Systems Guild, Inc.
  – Books
    • DeMarco and Lister, Waltzing With Bears: Managing Risk on Software Projects. Dorset House, 2003.
    • DeMarco, Tom, and Timothy Lister: Peopleware: Productive Projects and Teams. Dorset House, New York, 1999.
  – Seminars
    • Risk Management for Software
    • Leading Successful Projects
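The slide does not define EQF itself. As commonly described in DeMarco and Lister's work, it is the total actual result divided by the area between the running estimate and the eventual actual value, so perfect estimating gives an unbounded EQF and larger values mean better estimating. A rough sketch under that assumed definition:

```python
# Rough sketch of an Estimating Quality Factor calculation, assuming
# the usual DeMarco/Lister definition: total actual area divided by
# the accumulated |estimate - actual| area over the project's life.
# The exact formulation in Lister's materials may differ in detail.

def eqf(estimates, actual, duration):
    """
    estimates: list of (time, estimate) pairs, sorted by time, with
               each estimate holding until the next one is recorded.
    actual:    the final measured value (e.g. total staff-days).
    duration:  total elapsed time of the project.
    """
    error_area = 0.0
    for i, (t, est) in enumerate(estimates):
        t_next = estimates[i + 1][0] if i + 1 < len(estimates) else duration
        error_area += abs(est - actual) * (t_next - t)
    if error_area == 0:
        return float("inf")  # perfect estimates throughout
    return (actual * duration) / error_area

# One estimate of 80 staff-days held for a whole 10-day project that
# actually took 100 staff-days: error area = 20 * 10 = 200, EQF = 5.
print(eqf([(0, 80)], actual=100, duration=10))
```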
Software Effort Estimation
Wideband Delphi
Delphi History
• Consensus without conflict
• Invented by RAND
  – Santa Monica ‘think tank’
  – Used to estimate nuclear bombing damage
• Many other applications
• Wideband Delphi
  – Developed by Barry Boehm in the ’70s; see his Software Engineering Economics
Wideband Delphi Overview
• Input
  – Specification
• Outputs
  – Detailed task list
  – Estimation assumptions
  – Effort estimates from each participant
Wideband Delphi Process Overview
• Form a group of experts
• Estimate: each group member provides an anonymous estimate
• Coordinator assesses the estimates
  – Asks for reassessment of crazy estimates
• Coordinator combines estimates
• Coordinator presents combined results to experts
• If combined results do not converge, then go to Estimate
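The loop above can be sketched in code. The convergence test (relative spread of the estimates below a threshold) and the round cap are illustrative assumptions, not part of the slide; real sessions converge through discussion rather than a formula.

```python
# Sketch of the Wideband Delphi loop described above. The convergence
# criterion (spread within 10% of the average) and the 4-round cap
# are assumptions made for this sketch.
from statistics import mean

def wideband_delphi(collect_estimate, experts, max_rounds=4, spread=0.10):
    """collect_estimate(expert, summary) -> anonymous numeric estimate."""
    summary = None
    for _ in range(max_rounds):
        estimates = [collect_estimate(e, summary) for e in experts]
        lo, hi, avg = min(estimates), max(estimates), mean(estimates)
        if (hi - lo) <= spread * avg:     # acceptable convergence
            return avg
        summary = (lo, avg, hi)           # presented back to the experts
    return mean(estimates)                # terminate after max_rounds
```

In use, `collect_estimate` would prompt each participant with the distribution from the previous round and gather a fresh, secret estimate.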
Wideband Delphi
• Benefits
  – Practical
    • Employs multiple minds; promotes reasoning
    • An accurate estimate avoids the underestimate dilemma of “skip & compromise quality vs. blow the schedule”
    • Builds a comprehensive task list
  – Psycho-social
    • Reduces bias by influential people, or those with divergent agendas
    • May resolve disagreement among hostile parties
    • ‘Buy-in’ of ‘stakeholders’
  – Other
    • Acknowledges uncertainty
• Drawbacks / costs
  – Time
  – Takes power away from the manager or ‘guru’ (their drawback)
Wideband Delphi - 1
• Each individual develops an estimate, which is a list of
  – Tasks
  – Assumption(s)
  – Effort
• Rules
  – Assume you’ll do all the work
  – Assume uninterrupted effort
Wideband Delphi – 2
• Moderator presents
  – Distribution of effort estimates (anonymous)
  – All task lists (or merged task lists)
• All participants modify estimates concurrently and secretly
Wideband Delphi
• Termination, at the earliest of
  – 4 rounds
  – Acceptable convergence
  – Meeting time over
  – Nobody willing to change
• Completion
  – Moderator assembles the tasks into a single list
  – Moderator merges assumptions
  – Moderator combines estimates
    • Possibilities
      – Average
      – Min, average, max
      – Average and standard deviation
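The three combination options the moderator might use can all be computed with the Python standard library:

```python
# The slide's three ways a moderator might combine final estimates.
from statistics import mean, stdev

estimates = [12, 15, 14, 20, 13]        # staff-weeks, for example

avg = mean(estimates)                    # average
lo, hi = min(estimates), max(estimates)  # min, average, max
sd = stdev(estimates)                    # average and standard deviation

print(f"average = {avg:.1f}")
print(f"min/avg/max = {lo}/{avg:.1f}/{hi}")
print(f"avg +/- std-dev = {avg:.1f} +/- {sd:.1f}")
```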
Wideband Delphi Practice
• Let’s try it on the AMS
  – Use the forms in the handout
  – Break into groups of 3 or 4
• Record your effort for the final assignment
• Compare
Improving Software Management
• Improve an organization
  – Involves
    • Technology
    • Management
    • Psychology
  – Approaches
    • CMM
    • SPR
    • Weinberg
Software Process Assessment Approaches
• SEI CMM
  – Focus: mandated activities, and change processes
  – Review ‘Group Programming Practices Handout’
    • Five Levels of the SEI CMM
    • Five Levels of the SPR Excellence Scale
    • Approximate Conversion Between SPR and SEI Software Scores
  – Trained assessors
• SPR
  – Focus: project
SPR assessment
• SPR consultant holds a meeting with project manager and up to 7 technical staffers
  – 2 to 4 hours
• Official questionnaire
• Group consensus answers
• Finding themes
  1. Findings about the projects or software products assessed
  2. Findings about the software technologies utilized
  3. Findings about the software processes utilized
  4. Findings about the ergonomics and work environments for staff
  5. Findings about personnel and training for management and staff
SPR assessment example
• Large telecom
  – One software lab, 450 people
  – MIS
  – 30 projects, related to core business
Example strengths (better than telecom industry average)
• Requirements analysis (quality function deployment [QFD])
• Change control methods (fully automated with traceable requirements)
• Project management: quality measurement and metrics
• Design methods, fully automated
• Customer support tools, fully automated
• Customer support methods, both hot-lines and Internet support
• Maintenance release testing
• Regression testing
• Development testing
• Performance modeling
• Performance testing
• Lab testing of hardware and software concurrently
• Support for telecommuting and remote employees
• Support for multi-site projects
Example average factors (with respect to other major telecoms)
• Staff experience with application types
• Staff experience with in-house development processes
• Staff experience with development tools
• Staff experience with programming languages
• Staff specialization: testing
• Staff specialization: quality assurance
• Staff specialization: maintenance
• Quality control: use of formal design inspections
• Quality control: use of formal code inspections
• Quality control: formal software quality assurance teams
• Unit testing by development personnel
Example weaknesses (worse than industry average)
• Project management: annual training in state-of-the-art methods
• Project management: cost estimating
• Project management: quality estimating
• Project management: risk analysis
• Project management: value analysis
• Project management: schedule planning
• Project management: lack of productivity measurements
• Project management: lack of productivity metrics
• Maintenance: no use of complexity analysis
• Maintenance: no use of code-restructuring tools
• No reuse program: requirements
• No reuse program: design
• No reuse program: source code
• No reuse program: test materials
• No reuse program: documentation
• No reuse program: project plans
Literature
Au revoir et merci (goodbye, and thank you)
10 Key Process Areas for Software Process Assessment
1. Project management methods such as estimating
2. Requirements-gathering and analysis methods
3. Design and specification methods
4. Coding methods
5. Reusability methods
6. Change control methods
7. User documentation methods
8. Pretest defect removal methods such as inspections
9. Testing methods and tools
10. Maintenance methods and tools
10 Key Personnel-related Topics for Software Process Assessment
1. Staff hiring practices
2. Staff training and education
3. Management training and education
4. Specialists and occupational groups
5. Compensation levels
6. Office ergonomics
7. Organizational structures
8. Morale surveys and results
9. Work patterns and overtime
10. Staff turnover rates
Productivity
Benchmark studies
• Compare like with like, i.e., within Classification
  – Percent of corporate employees in information technology
  – Number of users supported per staff member in information technology
  – Annual corporate spending for computer hardware equipment
  – Revenues per information technology employee
  – Sales volumes correlated to information technology expenses
  – Profitability correlated to information technology expenses
  – Average salaries for information technology workers
  – Numbers of specialist occupations employed within information technology
  – Annual attrition or turnover among information technology employees
  – Corporate revenues per information technology worker
Benchmark
• Achievement
• Expense
• Customer satisfaction
Missing benchmarks
• Numbers and sizes of corporate databases
• The gaps associated with database benchmarks reflect an interesting problem. As this book is written, there are no metrics available for expressing the size or volume of information in a database. For software, both LOC metrics and function point metrics have long been available for performing benchmark studies. However, the database domain does not have any size metrics at all. Due to the lack of data metrics, there are no benchmarks on many critical topics in the database, data warehouse, and data mining domains.
• As this book is written there are no published studies that explore how much data a corporation owns, the numbers of errors in the data, how much it costs to create a database, the value of the data, or the eventual costs of removing obsolete data. Thus, a topic of considerable economic importance is, essentially, a void in terms of benchmark studies.
Benchmark client questions
• What are the best-in-class productivity rates in our industry?
• What are the best-in-class development costs in our industry?
• What are the best-in-class development schedules in our industry?
• What are the best-in-class maintenance assignment scopes in our industry?
• What are the best-in-class quality levels in our industry?
• What are the best development processes for software like ours?
• What are the best development tools for software like ours?
• What does it cost to move from SEI CMM level 1 to SEI CMM level 3?
• What are the differences in results between SEI CMM level 1 and CMM level 3?
Size/accomplishment measures
• “Source lines of code” (SLOC)
• Advantages
– Easy
– Easily automated
– Used by many tools
• Problems
  » CJ p. 71
20 Software Artifacts for which size information is important
• The functionality of the application
• The volume of information in databases
• The quantity of new source code to be produced
• The quantity of changed source code, if any
• The quantity of deleted source code, if any
• The quantity of base code in any existing application being updated
• The quantity of "dead code" no longer utilized but still present
• The quantity of reusable code from certified or uncertified sources
• The quantity of paper deliverables (plans, specifications, documents, and so on)
• The sizes of paper deliverables (pages, words)
• The number of national languages (English, French, Japanese, and so on)
• The number of on-line screens
• The number of graphs and illustrations
• The size of nonstandard deliverables (for example, music, animation)
• The number of test cases that must be produced
• The number of bugs or errors in requirements
• The number of bugs or errors in specifications
• The number of bugs or errors in source code
• The number of bugs or errors in user manuals
• The number of secondary "bad fix" bugs or errors
Cost Factors
1. Variations due to inflation rates for studies spanning several years
2. Variations due to industry compensation averages
3. Variations due to company size
4. Variations due to geographic regions or locations
5. Variations due to merit appraisals or longevity
6. Variations due to burden rate or overhead differences
7. Variations due to work patterns and unpaid overtime
Productivity paradox
– CJ p. 73
Cost measures
– Money
  – Start with the official budget
• Complexities
  – Overhead, or burden rate (ranges from 30 to 100% of salary)
  – Bonus and option costs
  – Can be distorted: when running out of money, costs are sometimes charged to another project
– Variations
  » p. 84
– Compensation variation
  » p. 88
Labor (staff months)
» Total effort
– Measurement challenge
  • Cost tracking systems ‘leak’
  • Phone and mail surveys miss costs
    – In one case
» 15% unpaid overtime
» 12% project management effort
» 19% technical work done by client (Joint Application Design (JAD))
• Face-to-face interviews measure these costs
Important to track all activities of a software project
• Activities vary across domains
  – Typical activity patterns for 6 software domains: pp. 94-95