NAME and the Sheffield Software Observatory

NAME

• NAME, the Network of Agile Methodologies Experience, is a European Union Fifth Framework network with three major aims:

Aim 1

• To bring together researchers and practitioners of Agile Methodologies for the dissemination of experiences.

Aim 2

• To define a research roadmap on Agile Methodologies (AMs) for the European Union Sixth Framework Programme.

• To this end we will explore avenues for research in Extreme Programming (XP), Scrum, and other agile methodologies,

Aim 2 (contd.)

• and in their relationships with other, well-established methodologies and techniques, such as Object-Oriented Development, Component-Based Development, Software Product Lines, Open Source Development, etc.

Aim 3

• To create an experimental framework to collect, classify, and analyze already existing and new experience in the area of XP and AMs.

Main partners

• NAME is a partnership among the Free University of Bolzano-Bozen, CWI, Datasiel, the Polytechnic of Valencia, the Munich University of Technology, the University of Cagliari, and the University of Sheffield.

Goals for today

• To gather information about experiences of using XP or AMs.

Sheffield Software Observatory

• Purpose: to examine software development processes under laboratory conditions.

• Comparative studies under realistic circumstances.

• Quantitative and qualitative approaches.

• Technologies and human factors in software development.

Research approach

• Quantitative analysis of project data:

• Requirements documents, timesheets, plans, designs, tests, code, quality reviews, client feedback, etc.

• Qualitative analysis of collected data:

• Positivist approach – questionnaires, interviews, focus groups, evaluation reports.

Software Hut

• Comparative experiments.

• Typically 20 teams competing to build solutions for real business clients.

• Runs every year with 2nd-year students.

• Typically half the teams use methodology X and the other half methodology Y.

• Statistical analysis on all data collected.
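The statistical comparison of the two treatment groups can be illustrated with a simple permutation test on the difference of group means, which needs no external libraries and makes no normality assumption. The marks below are invented for illustration only; they are not the real Software Hut data.

```python
import random

# Hypothetical marks out of 60 for two groups of ten teams each
# (illustrative only -- not the actual Software Hut results).
xp_marks   = [42, 51, 38, 47, 55, 40, 44, 49, 36, 53]
trad_marks = [39, 45, 33, 41, 48, 37, 43, 35, 46, 40]

def mean(xs):
    return sum(xs) / len(xs)

def permutation_test(a, b, trials=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Repeatedly reshuffles the pooled marks into two groups of the
    original sizes and counts how often the shuffled difference is
    at least as large as the observed one.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible p-value
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            extreme += 1
    return extreme / trials

p = permutation_test(xp_marks, trad_marks)
print(f"p-value: {p:.3f}")
```

With 10 teams per treatment, an exact test over all splits would also be feasible; the Monte Carlo version above is shorter and scales to larger groups.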

2001 pilot study

• 3 clients each with 5 teams of 4-5 developers.

• Half the teams used XP; the rest used a traditional, UML-led approach.

• Data collected and analysed.

• Some evidence that XP teams produced slightly better solutions/documentation.

2001 continued

• Problem with data validation.

• Used the experience to improve the experiment – repeated in 2002.

• Needed better training in XP.

• Needed better data collection.

• Needed better analysis.

2002 Software Hut

• 20 teams – 4 clients:

• Small Firms Enterprise Development Institute – wanted an intranet with company processes and information on it.

• Learn Direct – a data analysis tool for studying marketing trends etc.

2002 contd.

• Dentistry Research Institute.

• Web based questionnaire system for field trials.

• National Cancer Screening Service (NHS) – document handling system – archived scientific information etc.

Experiment

• Half of the teams used XP, the rest used the traditional approach.

• Randomised experiment.

• Data collected includes all system documentation throughout the project.

• 15 hours per week per person in each team.

• Systems evaluated by clients.

• Processes evaluated by academics.

Blocks and treatments

          A           B           C           D
XP        5, 7, 8     2, 6        1, 9        3, 4, 10
Trad      18, 20      12, 14, 17  11, 13, 19  15, 16

Software Hut Project 2002

Allocation of teams to blocks and treatments. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.
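The randomised block design described above (teams nested within client blocks, split between two treatments) can be sketched in a few lines of Python. The team numbers and block labels here are illustrative; this is not the procedure actually used in 2002.

```python
import random

def allocate(blocks, treatments=("XP", "Trad"), seed=2002):
    """Randomly split each block's teams across the two treatments.

    `blocks` maps a block label (client) to its list of team IDs.
    Returns {treatment: {block: [team IDs]}}.
    """
    rng = random.Random(seed)  # fixed seed makes the allocation reproducible
    plan = {t: {} for t in treatments}
    for block, teams in blocks.items():
        shuffled = teams[:]
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        # Within each block, roughly half the teams go to each treatment,
        # so every client is served by both XP and traditional teams.
        plan[treatments[0]][block] = sorted(shuffled[:half])
        plan[treatments[1]][block] = sorted(shuffled[half:])
    return plan

# Hypothetical example: 20 teams, 5 per client, as in the 2002 experiment.
blocks = {
    "A (SFEDI)":    [1, 2, 3, 4, 5],
    "B (S. Dent)":  [6, 7, 8, 9, 10],
    "C (UFI)":      [11, 12, 13, 14, 15],
    "D (Cancer S)": [16, 17, 18, 19, 20],
}
plan = allocate(blocks)
for treatment, assignment in plan.items():
    print(treatment, assignment)
```

Blocking on the client before randomising keeps client difficulty from confounding the XP-vs-traditional comparison.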

Time spent in XP teams

• Requirements 14%, Spec/Design 7%, Coding 36%, Testing 20%, Review 11%, Other 12%, Research 0%.

Time spent in Traditional teams

• Requirements 14%, Spec/Design 16%, Coding 42%, Testing 7%, Review 1%, Other 17%, Research 3%.

Software Hut Project 2002

[Chart: External quality (client's assessment), XP vs Trad. Marks given by the clients, split by treatment and sorted before plotting.]

Software Hut Project 2002

[Chart: External quality (client's assessment) for all 20 teams. Marks given by the clients. Teams 1 to 10 are XP; teams 11 to 20 are Traditional.]

Software Hut Project 2002

[Chart: External quality per client, blocks A (SFEDI), B (S. Dent), C (UFI), D (Cancer S). Marks given by the clients, split by block and sorted before plotting.]

Software Hut Project 2002

[Chart: External quality (client's assessment), grouped by block. Marks given by the clients, split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.]

Software Hut Project 2002

[Chart: Internal quality (lecturer's assessment), XP vs Trad. Marks given by the lecturers, split by treatment and sorted before plotting.]

Software Hut Project 2002

[Chart: Internal quality (lecturer's assessment) for all 20 teams. Marks given by the lecturers. Teams 1 to 10 are XP; teams 11 to 20 are Traditional.]

Software Hut Project 2002

[Chart: Internal quality per client, blocks A (SFEDI), B (S. Dent), C (UFI), D (Cancer S). Marks given by the lecturers, split by block and sorted before plotting.]

Software Hut Project 2002

[Chart: Internal quality (lecturer's assessment), grouped by block. Marks given by the lecturers, split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.]

Software Hut Project 2002

[Chart: Internal + external quality, XP vs Trad. Marks given by clients + lecturers, split by treatment and sorted before plotting.]

Software Hut Project 2002

[Chart: External + internal quality for all 20 teams. Marks given by clients + lecturers. Teams 1 to 10 are XP; teams 11 to 20 are Traditional.]

Software Hut Project 2002

[Chart: Internal + external quality per client, blocks A (SFEDI), B (S. Dent), C (UFI), D (Cancer S). Marks given by clients + lecturers, split by block and sorted before plotting.]

Software Hut Project 2002

[Chart: External + internal quality, grouped by block. Marks given by clients + lecturers, split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.]

Software Hut Project 2002

[Chart: Number of test cases for all 20 teams. Teams 1 to 10 are XP; teams 11 to 20 are Traditional.]

Software Hut Project 2002

[Chart: Number of test cases per client, split by block and sorted. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.]

Software Hut Project 2002

[Chart: Number of requirements (non-functional, functional, total) for all 20 teams. Teams 1 to 10 are XP; teams 11 to 20 are Traditional.]

Software Hut Project 2002

[Chart: Number of requirements per client (non-functional, functional, total), split by block and sorted before plotting. A) SFEDI. B) S. of Dent. C) UFI. D) Cancer S.]

Conclusions so far

• Testing was emphasised in both groups.

• This probably ensured that the traditional teams' results were also good.

• Incremental delivery worked well.

• Pair programming is hard for some developers.

• Pair programming works well in testing and debugging.

• Test-first is hard but worth it.

Conclusions contd.

• XP groups spent more time on the project.

• More talking – better communication?

• Some practices are easier to adopt than others.

• Some practices may not be so important.

• More research is needed.