Process Modeling (Calgary 2004), Prof. Dr. Michael M. Richter. Chapter 6: Experiences


Page 1

Process Modeling, Calgary 2004

Prof. Dr. Michael M. Richter

Chapter 6: Experiences

Page 2

Additional Recommended Literature

• IESE's Knowledge Management Product Experience Base (KM-PEB): an Internet-based information system on knowledge management tools (http://demolab.iese.fhg.de:8080/KM-PEB/)

• R. Bergmann, K.-D. Althoff, S. Breen, M. Göker, M. Manago, R. Traphöner & S. Wess (2003). Developing Industrial Case-Based Reasoning Applications. Springer Verlag, LNAI 1612.

• Althoff, K.-D. & Nick, M. (2004). How to Support Experience Management with Evaluation – Foundations, Evaluation Methods, and Examples for Case-Based Reasoning and Experience Factory. Springer Verlag, LNAI.

• M.M. Richter: Knowledge Management for E-Commerce. Lecture Notes.

Page 3

Process Models

• In a process model one can find several methods for realizing a process. Each time a new project plan is made, it can be stored and thereby enrich the process model.

• In this way a process model constitutes an experience and can be part of an experience base.

• This leaves the following problems unsolved:

– How do we get the most useful experience for an actual problem?

– There are many experiences that are not coded in the form of a process model.

• So the experience base needs to be much broader than just a process model.

Page 4

PART 1

Lessons Learned and Best Practices

Page 5

Informal Examples for Lessons Learned

• Experience regarding a customer: "Company X wants to change as little as possible."
This experience helps in project acquisition talks and in making offers.

• Experience regarding a specific topic (e.g., reports as deliverables):

– "The effort for report writing must be carefully planned. Plan four weeks for structuring, formatting, layout, and phrasing a report. This suggestion is based on two reports with approx. 70 pages each."

– "Do not hand out draft reports without writing 'DRAFT' on each page."

• Experience regarding certain project types:

– "Availability of a case study partner: Finding a case study partner IS a risk factor for projects of the combined type 'R&D and transfer'. [...]"

• and any combination of the above.

Page 6

Lessons Learned: Example

Guideline (preventive solution)

• Title: Process modeling notation
• Description: If you want to use Spearmint™, check whether the customer will accept its process modeling notation.
• Problem:
• Business process:
• Evaluation: { , …}

Project

• Name: XXX
• Goal: ...
• Funding: 20.000 €
• Team members: { ... }
• ...

Problem

• Issue: Suitability of the Spearmint notation
• Situation: The project members wanted to represent their process model for Y.
• Description: The notation of Spearmint was criticized by X.
• Cause: X uses a different process modeling notation.

Business Process

• Name: Set-Up Project
• ...

Page 7

Lessons Learned - more than 1 Context

[Diagram: Lesson Learned 1 and Lesson Learned 2 are each linked via Business Process 1 and Business Process 2 to several projects (Project 1, 2, 3), each characterized by attributes 1..n with values.]

Comprehensibility problem for a user who analyses a retrieval result:

– The comprehensibility of a lesson learned decreases with an increasing number of projects linked with it.

Page 8

Context Aggregation as Solution

A solution to the comprehensibility problem: context aggregation.

[Diagram: Project 1 (Attribute 1: V1, ..., Attribute n: Vn) and Project 2 (Attribute 1: X1, ..., Attribute n: Xn) are aggregated into a single context/validity record over attributes 1..n that is attached to Lesson Learned 1.]

Page 9

Example: Aggregation Using the Union Function

Project 1: Funding: Industrial; ProjectType: {Consulting, Study}

Project 2: Funding: Public; ProjectType: {Consulting, Transfer}

Aggregated context/validity info of Lesson Learned 1:

Funding: {Industrial, Public}
ProjectType: {Consulting, Study, Transfer}
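The union-based aggregation above can be sketched in a few lines of Python (a minimal illustration, not from the original slides; the attribute names follow the example):

```python
# Aggregate the context of a lesson learned by taking the union of the
# attribute values of all projects linked to it.
def aggregate_contexts(projects):
    context = {}
    for project in projects:
        for attribute, values in project.items():
            context.setdefault(attribute, set()).update(values)
    return context

project1 = {"Funding": {"Industrial"}, "ProjectType": {"Consulting", "Study"}}
project2 = {"Funding": {"Public"}, "ProjectType": {"Consulting", "Transfer"}}

validity_info = aggregate_contexts([project1, project2])
# Funding: {Industrial, Public}; ProjectType: {Consulting, Study, Transfer}
```

The aggregated record replaces the individual project links, so the user sees one compact validity description instead of many project boxes.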

Page 10

Benefits of “Lessons Learned”

• Prevention of the recurrence of undesirable events

• Promotion of the recurrence of good practice

• Improvement of current practices

• Increase in efficiency, productivity, and quality by reducing costs and meeting schedules

• Increase in communication and teamwork

• Lessons learned complement more formal measurement programs by allowing people to articulate insights (usually informally) not captured in the measurement programs.

– As such they can also be a vehicle for recording the data interpretations given during feedback sessions that are part of a GQM-based measurement program.

Page 11

Maintenance Loop: Organizational Aspects

Challenge problem: How can the value of an experience base (EB) be preserved and improved with systematic maintenance after its initial build-up?

The feedback-and-learn loop around the experience base comprises:

• analyze the EB, its usage, and its value (evaluation)
• decide about maintenance (possibly driven by an external trigger)
• organize maintenance
• maintain experience packages
• maintain the EB infrastructure
• learn about EB maintenance

Example maintenance rules:

• Coverage of a competence area is too low ~> acquire more experience by conducting project analyses for more projects in that competence area.
• If more than 20 lessons learned are attached to a business process, the business process becomes hard to comprehend ~> integrate (some of) the lessons learned into the business process description.

Page 12

Maintenance: Project Analysis

Business and knowledge processes are coupled with feedback in situations such as project set-up, the start of project execution, and the start of a work package.

The loop works as follows:

• Identify + Evaluate: guidelines and observations (e.g., a checklist for the subject area "Project Management") are identified in the experience base.
• Evaluate + Select: feedback before application records the estimated/expected perceived usefulness.
• Apply: the selected guidelines and observations are applied in the project.
• Feedback after application records the actual perceived usefulness together with the respective observations/problems.

Page 13

Improving Processes

Page 14

Observation

Split process (PAR):

Question 8: What did you learn about the customer?

A-LL: Company A likes to do as much as possible on their own, even if the available resources are insufficient.

A-GQM:
• Company A wants to change as little as possible.
• It is important to agree to a customer's wishes.

A-LL and A-GQM:
• Even if those Company A employees involved in a project react very positively within meetings, it can happen that IESE suggestions are not taken up because the responsible manager does not make the required resources/money available.
• Project members have to learn to differentiate between the customer employees present during meetings and the responsible managers who have to provide the required resources/money.
• Project members have to learn to talk to their customers "close to a possible offer" and in such a way that it is realistic for the customer.

These statements are then split into experience items:

• Observation: Even if those Company A employees involved in a project react very positively within meetings, it can happen that suggestions are not taken up because the responsible manager does not make the required resources/money available.

• Guideline: Project members have to learn to talk to their customers "close to a possible offer" and in such a way that it is realistic for the customer.

• Guideline/Observation: Project members have to learn to differentiate between the customer employees present during meetings and the responsible managers who have to provide the required resources/money.

Page 15

Experience and Process Models

• Process models can constitute valuable experience.

• Experiences can be deposited in process models, e.g.:

– Best practices

– Information needs and information sources

• Example: INO (InformationNeedObject)

Page 16

MILOS Process Modelling

• Deposit of experiences in process models:

– Best practices

– Information needs and information sources

• INO: InformationNeedObject

Page 17

PART 2

Similarity-Based Reasoning (Case-Based Reasoning, CBR)

Page 18

The Problem

• If there are many experiences, which is the most suitable one in a given situation?

• The difficulty is that usually none of the experiences deals with exactly the same problem as presented in the actual situation.

• We need a ranking of the experiences which corresponds to their usefulness for the actual problem.

• This will be achieved by defining numerical functions, the similarity measures. They are the dual of distance measures.

Page 19

Experience Units

• Experience units are the elementary (atomic) pieces of experience.

• We call such a unit a case. A case is structured into a problem description and a solution description. These descriptions can be more or less detailed; in particular, the solution description can contain many comments.

• The case took place in a certain context, which should also be mentioned.

• A complex experience is a set of cases, organized in a case base.

• The idea is to use cases in non-identical contexts for non-identical problems.

Page 20

Case-Based Reasoning (CBR)

• Notions:

Case = (Problem, Solution)

Case Knowledge versus General Knowledge

Database of cases (called Case Base, Experience Base)

Similarity relation between problem descriptions

[Diagram: a new problem is matched against Case-1 = (Problem, Solution) in the case base to derive a new solution.]

CBR is an approach to solving new problems by adapting the solutions of similar past problems.

Page 21

How to Use a Case

[Diagram: the new problem is compared to the problem of the case; the solution of the case is adapted to obtain a solution for the new problem.]

In general, there is no guarantee of getting good solutions, because the case may be "too far away" from the problem. Therefore the question arises of how to define when a case is "close enough".

Page 22

Representation of Cases - more than 1 Context

[Diagram (as on page 7): Lesson Learned 1 and Lesson Learned 2 are each linked via Business Process 1 and Business Process 2 to several projects (Project 1, 2, 3), each characterized by attributes 1..n with values.]

Comprehensibility problem for a user who analyses a retrieval result: the comprehensibility of a lesson learned decreases with an increasing number of projects linked with it.

Page 23

How to Use a Case-Base

• A case base is a database of cases.

• If a new problem arises, one uses a case from the case base in order to solve it.

• The more cases we have, the higher the chance of finding one with a suitable solution.

• Because the given problem usually does not occur exactly in the case base, one wants to retrieve a case that solved a problem which is "similar enough to be useful".

• Hence, the notion of similarity is central to CBR.

• Similarity-based retrieval can be contrasted with database retrieval.

Page 24

Retrieve: Modeling Similarity

• Similarity-based retrieval realizes an inexact match that is still useful: it delivers useful solutions from a case base.

• We restrict ourselves to attribute-value representations of the problems. The solution can be described in an arbitrary way, e.g. in textual form.

• Similarity measures operate on attribute-value vectors. They are functions that compare two cases:

sim: Case x Case -> [0, 1]

The first argument is always the query case and the second one the case from the case base.

Page 25

A Very Simple Similarity Measure

• Representation language: objects are described as attribute-value pairs (attribute vectors).

• To begin with, we restrict ourselves to binary attributes.

• Similarity assessment between two objects x and y:

– x = (x1, ..., xn), xi ∈ {0, 1}

– y = (y1, ..., yn), yi ∈ {0, 1}

– The Hamming measure counts the number of attributes on which x and y agree:

H(x, y) = Σ_{i=1}^{n} xi·yi + Σ_{i=1}^{n} (1 − xi)·(1 − yi)
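The Hamming measure is easy to state in code (a minimal sketch; the normalization to [0, 1] by dividing by n is the usual convention and not spelled out on the slide):

```python
# Hamming measure: counts the attributes on which two binary vectors agree.
# The first sum counts shared 1s, the second counts shared 0s.
def hamming(x, y):
    assert len(x) == len(y)
    return sum(xi * yi + (1 - xi) * (1 - yi) for xi, yi in zip(x, y))

# Normalized variant, so that sim: Case x Case -> [0, 1] as required.
def sim_hamming(x, y):
    return hamming(x, y) / len(x)
```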

Page 26

Attributes of Different Importance

• The importance of the respective attributes differs.

• Extension: the weighted Hamming measure, with a weighting vector ω = (ω1, ..., ωn), ωi ∈ [0, 1] and Σ_{i=1}^{n} ωi = 1:

H_ω(x, y) = Σ_{i=1}^{n} ωi·xi·yi + Σ_{i=1}^{n} ωi·(1 − xi)·(1 − yi)

• Problem: the weighting vector has to be determined! (What is important?)
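The weighted extension follows directly (a sketch; because the weights sum to 1, the result already lies in [0, 1] without further normalization):

```python
# Weighted Hamming measure: each agreement (shared 1 or shared 0) on
# attribute i contributes its weight w_i; weights sum to 1.
def weighted_hamming(x, y, w):
    assert abs(sum(w) - 1.0) < 1e-9
    return sum(wi * (xi * yi + (1 - xi) * (1 - yi))
               for xi, yi, wi in zip(x, y, w))
```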

Page 27

Local Measures

• Local measures are defined on the range of an attribute A:

simA: r(A) x r(A) -> [0, 1]

where r(A) is the range of attribute A.

• Example: A = time needed for a task.

• An interesting local measure is given for the attribute "available budget": if the actually available budget is higher than the budget in the case, this is perfect ("better is perfect"); otherwise the similarity decreases with |b − a|. This measure reflects usefulness and does not match the intuitive concept of similarity: it is asymmetric!
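The asymmetric "better is perfect" measure might look like this (a sketch; the linear decay and the `scale` parameter are assumptions, since the slide only shows the qualitative shape):

```python
# Asymmetric local similarity for "available budget": if the budget a
# available in the query covers the budget b required in the case,
# similarity is 1; otherwise it drops linearly with the shortfall.
def sim_budget(a, b, scale=10_000.0):  # scale: assumed decay constant
    if a >= b:
        return 1.0                     # "better is perfect"
    return max(0.0, 1.0 - (b - a) / scale)
```

Note that `sim_budget(a, b)` and `sim_budget(b, a)` generally differ, which is exactly the asymmetry the slide points out.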

Page 28

Local Similarity Measures for Unordered Symbols

• Similarity: simA(x, y) = s[x, y]

• Symbol type TA = {v1, ..., vk}

• For reflexive similarity measures: diagonal entries = 1.

• For symmetric similarity measures: the upper triangular matrix equals the lower triangular matrix.

s[x,y]  v1      v2      ...  vk
v1      s[1,1]  s[1,2]  ...  s[1,k]
v2      s[2,1]  s[2,2]  ...  s[2,k]
...
vk      s[k,1]  s[k,2]  ...  s[k,k]

Page 29

A Typical Similarity Measure

• Given two problem descriptions C1, C2
• p attributes y1, ..., yp used for the representation

SIM(C1, C2) = Σ_{j=1}^{p} ωj · simj(C1, C2)

simj: similarity for attribute j (local measure)
ωj: describes the relevance of attribute j for the problem

Local similarity measure: similarity on the attribute level.
Global similarity measure: similarity on the case or object level.
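The weighted sum of local measures can be sketched as follows (an illustration; the equality-based local measure and the attribute names are hypothetical, not from the slide):

```python
# Global similarity as a weighted sum of local measures, one per attribute.
# locals_and_weights: attribute -> (local measure sim_j, relevance weight w_j)
def global_sim(query, case, locals_and_weights):
    return sum(w * lsim(query[a], case[a])
               for a, (lsim, w) in locals_and_weights.items())

eq = lambda x, y: 1.0 if x == y else 0.0   # trivial local measure
cfg = {"Funding": (eq, 0.3), "ProjectType": (eq, 0.7)}
q = {"Funding": "Public", "ProjectType": "Consulting"}
c = {"Funding": "Industrial", "ProjectType": "Consulting"}
# global_sim(q, c, cfg): only ProjectType matches, contributing 0.7
```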

Page 30

Motivation

• Similarity measures: heuristics to select useful cases.

• Knowledge-poor similarity measures (the traditional approach):

– e.g. the Hamming measure
– mainly based on syntactical differences
– consider no or only little domain knowledge
+ easy to define
– often lead to poor retrieval results

• Knowledge-intensive similarity measures (the alternative):

– e.g. use of sophisticated local similarity measures
– based on knowledge about influences on the utility of cases
+ better retrieval results
– require a deeper analysis of the domain and more modelling effort

Page 31

Local Measures (1)

Binary symbol: lsim(vk, vl) = 1 if vk = vl; 0 if vk ≠ vl

Date: lsim(vk, vl) = (year(vk) − year(vl)) / (year(d0) − year(da))

Number: lsim(vk, vl) = max − |vk − vl| / (max − min)

Ordered symbol: lsim(vk, vl) = |pos(vk) − pos(vl)|

String: lsim(vk, vl) = 0 if |vk| = 0; otherwise (max lsimword(wk, vl)) / |vk|, where the maximum over lsimword takes all synonyms for wk into account

lsimword(wk, vl) = 0 if |wk| = 0; otherwise max |cw| / |wk|

Symbol taxonomy: lsim(Ni, Nm) = 1 if id(Ni) = id(Nm); otherwise S<Ni, Nm>

Unordered symbol: lsim(vk, vl) = Skl

Page 32

Local Measures (2)

Data collection instrument:

               tool  interview  questionnaire
tool           1.0   0.5        0.5
interview            1.0        0.5
questionnaire                   1.0

Goal purpose:

            char.  monitor  eval  prediction  control  change
char.       1.0    0.6      0.4   0.4         0.4      0.4
monitor            1.0      0.2   0.2         0.2      0.2
eval                        1.0   0.2         0.2      0.2
prediction                        1.0         0.6      0.4
control                                       1.0      0.4
change                                                 1.0
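Such a reflexive, symmetric matrix only needs its upper triangle stored; a lookup can be sketched like this (the values are the "data collection instrument" entries from the table above):

```python
# Upper triangle of the symmetric similarity matrix for the unordered
# symbol type "data collection instrument".
S = {
    ("tool", "interview"): 0.5,
    ("tool", "questionnaire"): 0.5,
    ("interview", "questionnaire"): 0.5,
}

def lsim_instrument(x, y):
    if x == y:                                 # reflexive: diagonal = 1
        return 1.0
    return S.get((x, y), S.get((y, x), 0.0))   # symmetric lookup
```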

Page 33

Local Measures (3)

[Figure: taxonomy for the attribute "quality type": the root "quality factor" (similarity value 0) has the children maintainability, reliability, efficiency, functionality, and usability (each with similarity value 0.5); maintainability splits into expandability, correctability, and understandability; usability includes ergonomy.]
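A common reading of such a taxonomy (a sketch under that assumption, not stated explicitly on the slide) is that the similarity of two symbols is the value attached to their deepest common ancestor:

```python
# Taxonomy from the "quality type" figure: child -> parent edges and the
# similarity values attached to the inner nodes.
PARENT = {
    "expandability": "maintainability", "correctability": "maintainability",
    "understandability": "maintainability", "ergonomy": "usability",
    "maintainability": "quality factor", "reliability": "quality factor",
    "efficiency": "quality factor", "functionality": "quality factor",
    "usability": "quality factor",
}
NODE_SIM = {"quality factor": 0.0, "maintainability": 0.5, "usability": 0.5}

def ancestors(node):
    path = [node]
    while node in PARENT:
        node = PARENT[node]
        path.append(node)
    return path

def lsim_taxonomy(x, y):
    if x == y:
        return 1.0
    common = next(a for a in ancestors(x) if a in set(ancestors(y)))
    return NODE_SIM[common]   # value at the deepest common ancestor
```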

Page 34

Global Similarity Measure (1)

is_subinstance:

isim(Ni, Nm) = 1 if id(Ni) = id(Nm); otherwise S<Ni, Nm>

[Figure: an instance taxonomy for "business" with the companies IntelliCar, IntelliShop, and IntelliBank, the departments FI, ABS, and TA, and the groups A1, A2, HYPER, TS, ...; the levels are annotated with similarity values (0.1 at the top, 0.4 at the company level, 0.7 at the department level).]

Page 35

Global Similarity Measure (2)

has_partinstance:

psim(Ni, Mh) = 1 if id(Ni) = id(Mh)

psim(Ni, Mh) = max osim(Ni, Mh) if Ni is a leaf node

psim(Ni, Mh) = α · max osim(Ni, Mh) + β · Σ_{Nk ∈ <Ni>} psim(Nk, Ml) / |<Ni>| if Ni is an inner node,
with Ml ∈ <Mh> chosen to maximize osim

osim(Ni, Mh) = gsim(Ni, Mh) excluding the has_partinstance attribute

Page 36

[Figure: two process-model trees, N1..N12 and M1..M12, whose corresponding subtrees are annotated with similarity values (0.91 at the roots, and 1, 0.75, 0.5, and 0 further down).]

Example

• N1: root
• N2: maintenance process
• N3: development process
• N4: requirements analysis
• N5: design
– N6: high-level design
– N7: low-level design
• N8: implementation
• N9: test
– N10: integration testing
– N11: system testing
– N12: acceptance testing

• M1: root
• M2: maintenance process
• M3: development process
• M4: requirements analysis
• M5: design
– M6: architecture
– M7: database design
– M8: interface design
• M9: implementation
• M10: testing
– M11: system testing
– M12: acceptance testing

Page 37

Retrieval Parameter Set

• identifier: used for reference purposes.

• reuse goal: reference to the respective reuse goal to which the parameters belong.

• retrieval index set: definition of a set of concepts and their attributes, which are used as indexes for the identification of useful cases.

• relevance vector: definition of a relevance factor for each index of the retrieval index set.

• global similarity threshold: definition of a threshold with respect to the global similarity of two cases, indicating when a case is considered sufficiently similar to be useful.

• number of returned reuse candidates: limits the number of reuse candidates presented to the user.
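A retrieval governed by such a parameter set can be sketched as follows (an illustration; the attribute names and the trivial equality measure are placeholders):

```python
# Retrieval driven by a parameter set: a relevance vector (one weight per
# retrieval index), a global similarity threshold, and a cap on the number
# of reuse candidates returned.
def retrieve(query, case_base, local_sims, relevance, threshold, max_candidates):
    scored = []
    for case in case_base:
        sim = sum(w * local_sims[a](query[a], case[a])
                  for a, w in relevance.items())
        if sim >= threshold:          # only sufficiently similar cases are useful
            scored.append((sim, case))
    scored.sort(key=lambda p: p[0], reverse=True)
    return scored[:max_candidates]

# Toy usage with a hypothetical index "t":
eq = lambda x, y: 1.0 if x == y else 0.0
candidates = retrieve({"t": "a"}, [{"t": "a"}, {"t": "b"}],
                      {"t": eq}, {"t": 1.0},
                      threshold=0.5, max_candidates=3)
```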

Page 38

Specification

SpecEB = ({c1, ..., cn}, {t1, ..., tl})

with cj = (aj1, ..., ajmj), j = 1, ..., n,

where type(ajk) ∈ {t1, ..., tl}, k = 1, ..., mj.

For a concept c with attributes a1, ..., am, the similarity between an instance i and a query q is computed attribute-wise:

sim(i, q) = simc(i, q) = f( sima1(vala1(i), vala1(q)), ..., simam(valam(i), valam(q)) )

Page 39

Similarity and Explanation

• Similarity measures can provide some form of explanation.

• Suppose that for an actual problem P the case C is retrieved as the nearest neighbor.

• If there are only a few attributes with high weights, these have to be considered:

– If the first k neighbors agree (or almost agree) on these attributes, they are almost equally useful.

– Otherwise they differ on these attributes, and one can name those that are responsible for the choice.

• What is not explained is the solution itself, only why it was chosen.
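This style of explanation can be sketched in code (an illustration; the weight cut-off and the example attributes are assumptions):

```python
# Explain a retrieval result by reporting, for each highly weighted
# attribute, how the query and the retrieved case compare on it.
def explain(query, case, local_sims, weights, min_weight=0.2):
    report = []
    for attr, w in weights.items():
        if w < min_weight:       # only the few high-weight attributes matter
            continue
        report.append((attr, w, local_sims[attr](query[attr], case[attr])))
    return report

eq = lambda x, y: 1.0 if x == y else 0.0
report = explain({"a": 1, "b": 2}, {"a": 1, "b": 3},
                 {"a": eq, "b": eq}, {"a": 0.9, "b": 0.1})
# report names attribute "a" (weight 0.9, local similarity 1.0)
```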

Page 40

Knowledge Model for Case-Based Systems

[Diagram: a spectrum from informal to formal knowledge representation, with a continuous transition between the two.]

• Database systems: knowledge coded into "cases" (all situations); only a simple "similarity measure".

• Case-based systems: notion = case base + similarity measure.

• Knowledge-based systems: knowledge coded into the "similarity measure" (predicates); no cases.

Page 41

PART 3

Experience Factory and Experience Management

Page 42

Experience Factory and Information Sources

• An experience factory contains several information sources.

• These sources are, however, more than just data bases.

• They establish a connection between the objects the information is concerned with and the possible contexts in which they can/should be used:

– Which method?

– Which tool?

– Which agent?

– Etc.

• Therefore the experience factory gives recommendations.

• The access to the factory is usually similarity based.

Page 43

Experience management, experience-based information systems and case-based systems

• From a technical point of view, the term „case-based system“ stands for a knowledge-based software system that employs the technology of Case-Based Reasoning, i.e., software systems that process case-specific knowledge in a (largely) automated way.

• From an organizational point of view, in the lecture, the term "case-based (reasoning) system" stands for a socio-technical system that processes human-based and automated case knowledge in a combined manner. Such systems are called Experience-based Information Systems (EbIS).

• Experience management is the term used for the management of experience-based information systems as well as all related processes.

Page 44

Representation of Experiences (Example)

Guideline (preventive solution)

• Title: Process modeling notation
• Description: If you want to use Spearmint™, check whether the customer will accept its process modeling notation.
• Problem:
• Business process:
• Evaluation: { , …}

Project

• Name: XXX
• Goal: ...
• Funding: 20.000 €
• Team members: { ... }
• ...

Problem

• Issue: Suitability of the Spearmint notation
• Situation: The project members wanted to represent their process model for Y.
• Description: The notation of Spearmint was criticized by X.
• Cause: X uses a different process modeling notation.

Business Process

• Name: Set-Up Project
• ...

Example from COIN (IESE's in-house EF)

Page 45

Architectural Principle

• Principle: artifact and characterization are separate physical objects (linked 1 : 0..1).

• Artifact

– describes a piece of knowledge wrt a topic area
– often in a native format of a given tool
– often not extensible

• Characterization (artifact info; interface, i.e. preconditions for reuse; application domain; solution domain; quality factors, i.e. usage statistics)

– used for indexing
– contents may overlap with the artifact
– extensible (new attributes, restructuring, …)
– links to other characterizations
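The separation might be modelled as two distinct objects, with the characterization holding the extensible index and an optional link to the artifact (a minimal sketch; the field names are illustrative, not from the slide):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Artifact:
    content: bytes   # often a tool's native format; typically not extensible

@dataclass
class Characterization:
    attributes: dict = field(default_factory=dict)   # extensible index
    links: list = field(default_factory=list)        # links to other characterizations
    artifact: Optional[Artifact] = None              # 0..1 link to the artifact

example = Characterization(
    attributes={"application-domain": "software process"},
    artifact=Artifact(b"process model"),
)
```

Keeping the index separate means it can be restructured or extended without touching the stored artifact.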

Page 46

Experience Management Methodology

(1) Vision (oriented at stakeholders' needs, integrative): objectives, domains, existing knowledge

(2) System (goal oriented): modeling, implementation, "going online"

(3) Use (open, integrative): use/filling, maintenance

Page 47

Experience Factory (EF)

Experience Factory (EF) principle: the EF is a separate organizational unit for knowledge management.

Why? Conflicting aims:

– Project teams try to reach their project goals.
– The EF aims at transferring knowledge across projects.

[Diagram: a software company split into the project organization (Project #1, ...) and the experience factory, which contains the experience base.]

Page 48

Experience Factory

[Diagram: projects A, B, C, and D interact with the experience factory via an EF portal, reusing experiences ("?") and feeding lessons back ("!"); the roles involved are the project supporter, EF manager, librarian, experience manager, and experience engineer, grouped around the experience base. Powered by CBR technology.]

Page 49

Experience Factory (EF) Organization

Project organization (project 1, ..., project N; project team):

• Planning: 1. status quo, 2. goal, 3. project plan
• Executing: 4. project implementation

Experience factory (experience engineer; case base):

• Learning: 5. analysis, 6. preparation

The project organization passes case-specific knowledge to the experience factory; the experience factory returns knowledge from past projects. The case base stores general knowledge and feedback.

Page 50

QIP/KM: Goal-Oriented Knowledge Build-Up According to the Quality Improvement Paradigm

A general method to increase knowledge according to the Quality Improvement Paradigm, cycling between the project organization and the experience factory (experience engineer, case base):

1. Characterize the current state of knowledge.
2. Set goals where to build up knowledge.
3. Plan how to acquire and package knowledge.
4. Acquire and package knowledge.
5. Disseminate knowledge and analyze how knowledge reuse works in practice.
6. Package knowledge and improve knowledge dissemination.

Page 51

Experience Factory and Case-Based Reasoning

CBR strategy on the organizational level: the experience factory organization (project organization plus experience factory with experience base).

CBR technology for EF implementation: an EF infrastructure for CBR applications, with goal-oriented evaluation of the CBR applications.

[Diagram: the CBR cycle inside the EF. A query produces a New Case. Retrieve yields a Retrieved Case from previous cases and conceptual knowledge; Reuse yields a Proposed Case (suggested artifact); Revise yields a Tested/Repaired Case (applied artifact); Retain stores the Learned Case.]
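The four-step cycle can be sketched in miniature (an illustration only, not IESE's implementation; `sim`, `adapt`, and `verify` are placeholder callbacks):

```python
# Minimal Retrieve / Reuse / Revise / Retain loop over a case base of
# {"problem": ..., "solution": ...} records.
def cbr_cycle(query, case_base, sim, adapt, verify):
    retrieved = max(case_base, key=lambda c: sim(query, c["problem"]))  # Retrieve
    proposed = adapt(query, retrieved)                                  # Reuse
    solution, ok = verify(query, proposed)                              # Revise
    if ok:                                                              # Retain
        case_base.append({"problem": query, "solution": solution})
    return solution

# Toy usage: numeric problems, nearest problem wins.
base = [{"problem": 1, "solution": "s1"}, {"problem": 10, "solution": "s10"}]
result = cbr_cycle(2, base,
                   sim=lambda q, p: -abs(q - p),
                   adapt=lambda q, case: case["solution"],
                   verify=lambda q, sol: (sol, True))
```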

Page 52

Roles within the experience factory

• Manager: provides resources, defines strategic goals, and initiates improvement programs. He determines the structure and content of the case base and controls its quality.

• Supporter: responsible for documenting new experiences and supporting the project team. He collects and qualifies artifacts from the projects in accordance with the reuse criteria and the goals of the engineer. Upon request, he supports the project team in retrieving and modifying the experience knowledge.

• Engineer: responsible for packaging and analyzing existing experiences. Together with the manager, he identifies new reuse criteria and, based on that, acquires new cases. He analyzes the case base in order to detect (further) improvement potential.

• Librarian: responsible for technical aspects like setting up and maintaining the case base, and storing and publishing new cases.

Page 53

Goal-oriented Development of Experience-Based Information Systems (EbIS)

Figure: The EbIS life cycle runs through two phases, design the EbIS and then infuse the EbIS into the organization, with the steps:
• Identify informal usage scenarios
• Set objectives
• Set subject areas
• Identify all scenarios
• Conceptualize
• Define »record«

Page 54

Example: Measurement Planning with EF

• Wanted: organization-specific experience knowledge
• To do:
– Modeling of knowledge in reusable form
– Efficient and effective access, integrated into the GQM planning process
• Multi-modal support for different goals
• Similarity-based retrieval
– Continuous evolution of the knowledge base
– Maintenance of the technical infrastructure
– Adaptation to specific contexts
– Interactive assistant system
 Necessity of a tailored approach

Page 55

Solution: GQM - Corporate Memory Management System (GQM-CMMS)

Figure: The GQM Corporate Memory holds GQM planning concepts, a glossary, etc., together with the measurement programs 1..n of the GQM Experience Factory, characterized by reuse goals. GQM-CMMS provides access and maintenance for the GQM process: define the GQM goals, develop the GQM plan, develop the data collection plan, data collection, analyze & interpret, and packaging. New experiences flow back into the corporate memory.

Page 56

GQM Method (1/2)

Figure: GQM definition. A business goal is refined into a GQM goal using the template "Analyze … for the purpose of … with respect to … from the viewpoint of … in the context of …". An abstraction sheet captures quality factors, variation factors, a baseline hypothesis, and an impact hypothesis. The GQM plan refines the goal into questions Q1, Q2, Q3, …, each answered by metrics M1, M2, M3, M4, …, together with interpretation models (Model1, Model2, Model3, …).
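The goal/question/metric refinement described above can be sketched as a small data structure. This is an illustrative sketch only; the class names and the example metric texts (taken from the Q-9/M-9.x.x example later in these slides) are assumptions, not an API from the slides.

```python
# Sketch of the GQM plan structure: goal -> questions -> metrics.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str            # e.g. "M-9.1.1"
    description: str

@dataclass
class Question:
    name: str            # e.g. "Q-9"
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class GQMGoal:
    analyze: str
    purpose: str
    quality_focus: str
    viewpoint: str
    context: str
    questions: list[Question] = field(default_factory=list)

    def statement(self) -> str:
        # Render the goal in the standard GQM template sentence.
        return (f"Analyze {self.analyze} for the purpose of {self.purpose} "
                f"with respect to {self.quality_focus} "
                f"from the viewpoint of {self.viewpoint} "
                f"in the context of {self.context}")

goal = GQMGoal("retrieved information", "characterization",
               "economic utility", "project managers",
               "decision support for system development")
goal.questions.append(
    Question("Q-9",
             "What is the impact of the origin of the case on the maturity?",
             [Metric("M-9.1.1", 'case attribute "case origin"')]))
print(goal.statement())
```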

Page 57

GQM Method (2/2)

Figure: The GQM process runs through Prestudy, Plan (identify GQM goals, develop GQM plan, derive measurement plan), Execute (collect data), Evaluate (interpret collected data and compare to goals), and Package. Goals are elicited in interviews with stakeholders (business goal refined into GQM goal), and results are discussed in feedback sessions with stakeholders; a learning spiral connects GQM plans, quality models, and lessons learned via evaluation and reuse from the repository. Abstraction sheets capture quality factors, variation factors, baseline hypotheses, and impact hypotheses.

Example:
Q-9 What is the impact of the origin of the system on its degree of maturity?
H-9 The more "industrial" the origin, the higher its degree of maturity.

Degree of maturity (origin of system: university, industrial research, industry, unknown):
– Development: 9%
– Prototype: 51%
– Pilot use: not given
– Practical use: 30%
– Unknown: 9%

Page 58

Identify GQM Goals

Goal #1: Analyze retrieved information for the purpose of characterization with respect to economic utility from the viewpoint of project managers in the context of decision support for system development.
Goal #2: Analyze retrieved information for the purpose of characterization with respect to technical utility from the viewpoint of project managers in the context of decision support for system development.
Goal #3: Analyze the organizational memory for the purpose of characterization with respect to user friendliness from the viewpoint of project managers in the context of decision support for system development.

→ Success

Page 59

Identify Reusability Factors

• E.g., the adaptation cost of a code component may be estimated using a difference in functionality (needed functionality versus provided functionality) as well as size and complexity of the module.

• The abstraction sheet of the GQM technique is a means for acquiring such variation factors.

• The identified reusability factors (consisting of both quality and variation factors) can be validated:

– Each artifact must be characterized using attributes from each of the following three classes, namely about the artifact itself, about its interface, and about its context.
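The adaptation-cost estimate sketched in the first bullet (difference in functionality plus size and complexity of the module) can be illustrated with a toy formula. The linear combination and all weights below are assumptions for illustration; the slides do not prescribe a concrete cost model.

```python
# Hedged sketch: adaptation cost grows with the functionality gap
# (needed vs. provided functionality) and with module size/complexity.
def adaptation_cost(needed: set[str], provided: set[str],
                    size_loc: int, complexity: float,
                    w_gap: float = 10.0, w_size: float = 0.01) -> float:
    gap = len(needed - provided)        # functions still missing
    return w_gap * gap + w_size * size_loc * complexity

cost = adaptation_cost({"parse", "validate", "export"},
                       {"parse", "validate"},
                       size_loc=1200, complexity=2.5)
print(cost)   # 10*1 + 0.01*1200*2.5 = 40.0
```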

Page 60

GQM Abstraction Sheet

Goal: Analyze … perceived usefulness …   Names: N.N., M.M.   Date: 00/07/13

• Quality factors: What are the factors of the quality focus "perceived usefulness"?
• Variation factors: Which factors influence the quality focus "perceived usefulness"?
• Baseline hypothesis: current estimation about the quality factors, based on expert opinion.
• Impact of variation factors: How do the variation factors influence the quality focus "perceived usefulness"?


Page 62

Classify Reusability Factors (1)

Goal: divide reusability factors into record attributes and feedback indicators

Input: reusability factors

Output: feedback indicators; record attributes

Method: For each reusability factor, it is checked whether a value for it can be determined at the time the corresponding artifact is recorded in the experience base (»record attribute«) or whether the value can be determined only after the artifact has been utilized (»feedback indicator«). The latter are highly context-dependent.

• Consequently, values for feedback indicators are a kind of experience that must be recorded with the context in which it was gained.

Page 63

Classify Reusability Factors (2)

• Examples are adaptation effort, reliability, etc.

• Typically, record attributes are used to compute the similarity between an artifact's characterization and a query (task »recommend«), whereas feedback indicators are considered by the user to examine artifacts further (task »examine«).

• There are the following reasons for this:

– Values for record attributes are available for every recorded artifact, whereas values for feedback indicators are not available before the artifact has been utilized after its initial recording.

– The development and the first utilization before the recording of an artifact do not matter in this regard, because feedback indicators are usually concerned with the modification of an already existing artifact.
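The classification rule described across these slides is simple enough to sketch: a reusability factor whose value is known when the artifact is recorded becomes a record attribute (usable for similarity in the »recommend« task), while one whose value only becomes known after utilization is a feedback indicator (shown for the »examine« task). The sample factors below are illustrative assumptions.

```python
# Sketch: split reusability factors into record attributes and
# feedback indicators, based on when their values become known.
def classify(factors: dict[str, bool]) -> tuple[list[str], list[str]]:
    """factors maps factor name -> known_at_record_time."""
    record_attributes = [n for n, known in factors.items() if known]
    feedback_indicators = [n for n, known in factors.items() if not known]
    return record_attributes, feedback_indicators

factors = {
    "size": True,                   # measurable when recorded
    "programming language": True,
    "adaptation effort": False,     # only known after reuse
    "reliability in use": False,
}
record_attrs, feedback = classify(factors)
print(record_attrs)   # ['size', 'programming language']
print(feedback)       # ['adaptation effort', 'reliability in use']
```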

Page 64

Classify Reusability Factors (3/3)

• There are the following reasons for this (Continued):

– Feedback indicators are often text attributes (e.g., a short description of why and how the artifact was modified), that is, their values are often difficult to formalize, because utilization is a creative process that cannot be foreseen in its entirety.

– However, in general, meaningful similarity functions can only be defined for formalized attributes.

Page 65

Systematic Application of GQM in Practice

Figure: The GQM process (Prestudy; Plan: identify GQM goals, develop GQM plan, derive measurement plan; Execute: collect data; Evaluate: interpret collected data and compare to goals; Package), applied to Goal #2: "Analyze retrieved information for the purpose of monitoring with respect to economic utility in the context of decision support". The abstraction sheet for Goal #2 lists, e.g.:
• Quality factors (What are the factors of the quality focus?): 2. degree of maturity [prototype, …]; 3. origin of system [research, …]
• Baseline hypothesis (What is the current knowledge about the quality factors?): 2. experts: 30/50/15/5%
• Variation factors (Which factors influence the quality focus?)
• Impact of variation factors (How do the variation factors influence the quality focus?)

Page 66

Construct GQM Plan: Example (1)
Excerpt from the GQM plan for quality focus "Economic Utility":

Q-9 What is the impact of the origin of the case on the degree of maturity?
Q-9.1 What is the origin of the case?
M-9.1.1 Per retrieval attempt, for each chosen case: case attribute "case origin" [university, industrial research, industry]
Q-9.2 What is the degree of maturity of the system?
M-9.2.1 Per retrieval attempt: case attribute "status" ["being developed", "prototype", "pilot system", "application in practical use", "unknown"]

• Relative distribution: degree of maturity (development, prototype, pilot use, practical use, unknown) over origin of case (university, industrial research, industry, unknown).
• H-9 The more "industrial" the origin of the case, the higher the degree of maturity.
• I-9 There should be an accumulation on the diagonal from (prototype, university) to (practical use, industry); in the expected-distribution table, asterisks mark these diagonal cells.
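A measurement plan like the one above is evaluated by tallying, per retrieval attempt, the (origin, maturity) pairs of the chosen cases and comparing the distribution to hypothesis H-9. The following sketch uses invented sample data; only the attribute names come from the slide.

```python
# Illustrative evaluation of M-9.1.1 / M-9.2.1: cross-tabulate
# origin of case vs. degree of maturity over retrieval attempts.
from collections import Counter

retrievals = [                       # (case origin, status), invented
    ("university", "prototype"),
    ("university", "prototype"),
    ("industrial research", "pilot system"),
    ("industry", "application in practical use"),
    ("industry", "application in practical use"),
]
crosstab = Counter(retrievals)
for (origin, maturity), n in sorted(crosstab.items()):
    print(f"{origin:20s} {maturity:30s} {n}")
```

An accumulation on the diagonal from (university, prototype) to (industry, practical use), as in this sample, would support H-9.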

Page 67

Construct GQM Plan: Example (2)
Excerpt from the measurement plan for questions 5/9/12.

Page 68

Construct GQM Plan: Example (3)
Feedback session with experts: objectives and example results.

Measurement result for Q-9 (What is the impact of the origin of the case on the degree of maturity? H-9: The more "industrial" the origin of the case, the higher the degree of maturity). Degree of maturity (origin of case: university, industrial research, industry, unknown):
– Development: 9%
– Prototype: 51%
– Pilot use: not given
– Practical use: 30%
– Unknown: 9%

• Interpretation of the results of the measurement data analyses
• Verification of the hypotheses: "verified: 3; changed: 2; discarded: none; more data required: 4; missing attribute values: 4"
• Evaluation of the measurement program: "Better than expected -> very acceptable."
• Possibilities for improving the EbIS, e.g.: "Improve on-line help!"
• Possibilities for improving the measurement program, e.g.: "Try to get feedback using a field for comments!"

Page 69

Identify Reuse Information (Summary)

Figure: Task breakdown for »identify reuse information«:
• conceptualize: define nonterminal concept attributes; define terminal concept attributes (classify attribute, define type, define value range, define local similarity); define dependencies between concepts
• identify reusability factors: determine reusability factors; classify reusability factors; determine minimal quality requirements; determine application boundaries; define application policies; define maintenance policies; acquire distinguishing characteristics
• define meaning of similarity: define global similarity

Page 70

Reuse (1): Define Value Range

Goal: define possible values for an attribute

Input: terminal attribute; application boundary rules

Output: value range (part of type)

Method:
• The possible values are derived from the application boundary rules and entered in the »value range« column of the type table.
• The unit of measure is also entered.
• In the case of a symbol type, the meaning of each possible value is defined separately in the symbol glossary.
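A row of such a type table can be sketched as a small record holding the value range, the unit of measure, and, for symbol types, a glossary entry per value. The class name, fields, and sample values below are illustrative assumptions.

```python
# Sketch of one "type table" row: value range, unit, symbol glossary.
from dataclasses import dataclass, field

@dataclass
class AttributeType:
    name: str
    value_range: list                   # allowed values
    unit: str = ""                      # unit of measure, if any
    symbol_glossary: dict[str, str] = field(default_factory=dict)

    def valid(self, value) -> bool:
        """Check a value against the value range."""
        return value in self.value_range

status = AttributeType(
    "status",
    ["prototype", "pilot use", "practical use"],
    symbol_glossary={
        "prototype": "runs, but not deployed",
        "pilot use": "deployed with a pilot group",
        "practical use": "in regular production use",
    },
)
print(status.valid("prototype"))   # True
print(status.valid("sketch"))      # False: outside the value range
```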

Page 71

Reuse (2): Define Local Similarity

Goal: define meaning of similarity regarding a single attribute formally

Input:

• reuse scenarios;

• terminal attribute;

• value range (part of type);

• application boundary rules

Output: local similarity function (part of type)
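Two common shapes of local similarity functions can serve as a sketch here: a distance-based function for a numeric attribute and a table-based function for a symbol attribute. The value ranges and the similarity table below are illustrative assumptions, not values from the slides.

```python
# Hedged sketch of local similarity functions for single attributes.
def sim_numeric(q: float, c: float, lo: float, hi: float) -> float:
    """Linear similarity on a bounded numeric value range [lo, hi]."""
    return 1.0 - abs(q - c) / (hi - lo)

MATURITY_SIM = {   # symmetric similarity table for a symbol type
    ("prototype", "prototype"): 1.0,
    ("prototype", "pilot use"): 0.6,
    ("prototype", "practical use"): 0.3,
    ("pilot use", "pilot use"): 1.0,
    ("pilot use", "practical use"): 0.7,
    ("practical use", "practical use"): 1.0,
}

def sim_symbol(q: str, c: str) -> float:
    # Look up the pair in either order; unknown pairs count as 0.
    return MATURITY_SIM.get((q, c), MATURITY_SIM.get((c, q), 0.0))

print(sim_numeric(400, 300, 0, 1000))        # 0.9
print(sim_symbol("pilot use", "prototype"))  # 0.6
```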

Page 72

Reuse (3): Define Global Similarity

Goal: define meaning of similarity regarding a kind of artifact formally

Input: informal description of global similarity; terminal and nonterminal attributes in the form of concept attribute tables

Output: global similarity (part of the changed schema for the EbIS)

Method:

– Using the defined attributes, the informal description of the global similarity is expressed formally.

– This will allow the retrieval system to calculate the similarity automatically (task »calculate similarity«).
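One common way to express a global similarity formally, and the one sketched here, is a weighted average of the local similarities over the terminal attributes; the weights and local functions below are illustrative assumptions.

```python
# Hedged sketch: global similarity as a weighted average of local
# similarities, so the retrieval system can compute it automatically.
def global_similarity(query: dict, case: dict,
                      local_sims: dict, weights: dict) -> float:
    total = sum(weights.values())
    score = 0.0
    for attr, w in weights.items():
        sim = local_sims[attr]                 # local similarity function
        score += w * sim(query[attr], case[attr])
    return score / total

local_sims = {
    "size": lambda q, c: 1.0 - min(abs(q - c) / 1000, 1.0),
    "language": lambda q, c: 1.0 if q == c else 0.0,
}
weights = {"size": 1.0, "language": 2.0}       # assumed relevance weights
s = global_similarity({"size": 400, "language": "Java"},
                      {"size": 300, "language": "Java"},
                      local_sims, weights)
print(round(s, 3))   # (1*0.9 + 2*1.0) / 3 = 0.967
```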

Page 73

Part 4

Maintenance

Page 74

Motivation & Goals of Maintenance

Figure: Two value-over-time curves motivate maintenance. Left: "best practice with lessons learned (LLs)" stays closer to the ideal learning curve (without forgetting) than "best practice without LLs"; the additionally available knowledge from the LLs is the benefit for the best practice gained by the combination. Right: after the build-up phase, maintenance plus evaluation is needed to keep the value from degrading.

• Experience is situated (context), of high quality, and up-to-date
• Experience is a corporate asset
• Continuous flow of experience
• Pure accumulation of experience leads to "experience overload" and value degradation
• Issue: preserve and improve the value of experience packages and the experience base after the initial build-up

Page 75

Challenge Problem: How to preserve and improve the value of an experience base with systematic maintenance after its initial build-up?

Ideas & Principles:
• Evaluation guides EB maintenance
• Learning about EB maintenance, to improve EB maintenance itself

Ingredients:
• Experience life-cycle model
• Integration of maintenance in usage and business processes
• Representation issues w.r.t. EB maintenance needs
• EB quality model
• Maintenance knowledge
• Process & schedule (organizational aspects)
• EB maintenance tools

Page 76

Value Degradation by "Experience Overload"

Figure: Over time, a process description ("project process") accumulates more and more attached lessons learned. Each is still used, but as experience accumulates without consolidation, the perceived value of the experience base decreases; hence the need for maintenance.

Page 77

Value Degradation by the "Ravages of Time"

Figure (example): Project X1 records the problem "Project not finished in time" (cause: needed data from company A was not available on time; context: Project X1, company A). Later, company A improves its business processes, and Project Y1 records the observation "Company A delivers in time" (context: Project Y1, company A). When Project Z now consults the experience base, it finds inconsistent information (new vs. obsolete experience package); the user is confused, refuses to use the EB, etc. The perceived value of the experience base decreases; hence the need for maintenance.

Page 78

Types of Maintenance Knowledge + Relations

Figure: Three types of maintenance knowledge and their relations:
• Quality knowledge: quality of single experience cases, and overall quality of the whole EB (cases, data structure, retrieval mechanisms)
• Maintenance decision knowledge: human-based and tool-based; provides guidance for content-related and organizational decisions
• Maintenance process knowledge

Open questions: How to acquire and develop such maintenance knowledge initially? How to support maintenance decision making?

Page 79

Deriving Maintenance Decision Knowledge

Figure: Objectives, scenarios, and subject areas shape the schema, the knowledge collection plan, and the knowledge life-cycle model. Evaluation (e.g., with GQM) against a measurement plan and an evaluation plan yields maintenance guidelines; formalizing the guidelines gives maintenance policies, and automating these yields a maintenance decision support system.

Page 80

Example for Maintenance Decision Knowledge

Maintenance guideline "Merging Project Process Descriptions with Lessons Learned":
• Trigger: cardinality(relationship(Project Process, Lesson Learned)) > 20
• Actions: Decide which of the lessons learned should be integrated […]; aggregate process description(s) and lessons learned.
• Expected benefits: […]
• Generated change requests, e.g. change request "Merging …": affected: the instances concerned; priority: importance medium, deadline not applicable; status: new; actions: decide which … […], aggregate …; assigned to: Mr. X.

The guideline refers to the sub-concepts "Project Process" (1) and "Lesson Learned" (n) of "Experience Case", to their instances (e.g., the project process "Execute Project" and several lessons learned), and to maintenance processes such as the techniques "Decision …" and "Aggregate …".
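The trigger/actions pattern of this guideline can be sketched as code: when more than 20 lessons learned hang off one project process description, a change request proposing aggregation is generated. The threshold, the action texts, and the "new"/"medium" fields are from the slide; the data model itself is an illustrative assumption.

```python
# Sketch of a tool-supported maintenance guideline: trigger check
# that generates a change request when too many lessons learned
# are attached to one project process description.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ChangeRequest:
    title: str
    affected: list[str]
    importance: str = "medium"
    status: str = "new"
    actions: list[str] = field(default_factory=list)

def check_merge_trigger(process: str, lessons: list[str],
                        threshold: int = 20) -> Optional[ChangeRequest]:
    if len(lessons) <= threshold:
        return None                     # trigger does not fire
    return ChangeRequest(
        title=f"Merging '{process}' with its lessons learned",
        affected=[process] + lessons,
        actions=["Decide which of the lessons learned should be integrated",
                 "Aggregate process description(s) and lessons learned"],
    )

lessons = [f"LL-{i}" for i in range(25)]
cr = check_merge_trigger("Execute Project", lessons)
print(cr is not None)   # True: 25 attached lessons exceed the threshold
```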

Page 81

Typical Knowledge Life-Cycle Model

Experiences result from application in practice; their quality criteria are generality and validity (:= successful application in same/similar/different situations). Examples: lessons learned, descriptive business process descriptions.

Figure: The life cycle over time: "lessons learned" are packaged and, via repackaging, become partially tested knowledge (experimental use) and finally "best practice" (wide-spread use). Along the way they are aggregated, generalized, and improved, driven by feedback from (re-)use, with a decision point at each transition. Theory knowledge (examples: external knowledge from the Internet, literature, …; models for which the correctness has been proved) can be infused into the cycle and investigated; with the application of theory knowledge you can gather experiences that enrich the theory knowledge.
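The promotions in this life cycle can be sketched as a simple rule: an experience advances from lesson learned to partially tested knowledge after successful experimental reuse, and to best practice after wide-spread use. The numeric thresholds below are illustrative assumptions; the slides only name the stages and decision points.

```python
# Sketch of life-cycle promotion driven by feedback from (re-)use.
STAGES = ["lesson learned", "partially tested knowledge", "best practice"]

def promote(stage: str, successful_reuses: int,
            test_threshold: int = 3, spread_threshold: int = 10) -> str:
    """Return the stage an experience should hold, given reuse feedback."""
    if successful_reuses >= spread_threshold:
        return "best practice"            # wide-spread use
    if successful_reuses >= test_threshold:
        return "partially tested knowledge"  # experimental use
    return stage                          # not enough evidence yet

print(promote("lesson learned", 1))                    # lesson learned
print(promote("lesson learned", 5))                    # partially tested knowledge
print(promote("partially tested knowledge", 12))       # best practice
```

In practice each promotion would pass a human decision point, as the figure indicates; the rule only proposes the transition.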

Page 82

Validity

There are different notions of validity, depending on the viewpoint.

Validity for users (for retrieval result analysis):

• validity := how much one can trust the experience to be successfully reused and applied in its anticipated new application context.

Validity for maintainers:

• validity provides decision support for maintenance: identifying invalid ("wrong") experiences, and identifying matured experiences that should be transformed into another type of experience.
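The user-facing notion of validity can be sketched as an estimate from past reuse outcomes, with reuses in the same or a similar context counting more toward trust in the anticipated new application context. The context weights below are illustrative assumptions.

```python
# Hedged sketch: validity as a context-weighted success ratio of
# past reuse attempts (same/similar/different situations).
def validity(outcomes: list[tuple[bool, str]]) -> float:
    """outcomes: (success, context), context in {'same','similar','different'}
    relative to the anticipated new application context."""
    weight = {"same": 1.0, "similar": 0.7, "different": 0.4}  # assumed
    total = sum(weight[c] for _, c in outcomes)
    if total == 0:
        return 0.0
    good = sum(weight[c] for ok, c in outcomes if ok)
    return good / total

v = validity([(True, "same"), (True, "similar"), (False, "different")])
print(round(v, 2))   # (1.0 + 0.7) / (1.0 + 0.7 + 0.4) = 0.81
```

For the maintainer's viewpoint, a persistently low score would flag the experience as invalid, while a high score over many reuses would flag it as matured.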

Page 83

Quality Model: Goals and Metrics over Time

Figure: Evaluation phases and quality focus over time. During prototypical use, the focus is on the utility of the contents: understand (characterize, monitor), control, improve, and predict what knowledge makes the system useful, and create/increase awareness and understanding of utility among users. During regular to wide-spread use, the focus shifts to economic value ($, effort) against the cost to set up and run: understand, monitor, control, improve, and predict how useful the system is in terms of money/effort, show the economic value, and determine whether the system is accepted (i.e., used and usable).

Page 84

The Principle of EbIS Maintenance

• Issue/task: How to preserve and improve the value of an EbIS with systematic maintenance after its initial build-up?
• 1. Systematic evaluation
– The EbIS must be valuable and alive (i.e., useful, usable, and being used); otherwise no benefit is possible
– Value for the organization: perceived value for users and other stakeholders; perceived usefulness, usability, ease of use, …
• 2. Maintenance
– EbIS maintenance addresses: process patterns and experience items; the EbIS infrastructure
– That is: the conceptual model (ontology), the implementation of the EbIS, and the set of methods/techniques for populating and utilizing the EbIS

Page 85

Outlook Integrated Architecture

Figure: Integrated architecture of the organizational memory. Data artifacts are cleansed, validated, aggregated, and integrated into information artifacts; these are abstracted and filtered (data mining) into characterizations of knowledge artifacts, which characterize the artifacts in the knowledge artifact repository. Domain experts contribute via knowledge acquisition and transfer, knowledge evaluation and goal setting, and knowledge validation and verification; users (re)use and access the memory with continuous feedback and update. Short-term (online) and long-term (offline) update paths are distinguished.

Page 86

Add-on Value of Experience Management (1)

Figure: Individual learning is too slow with respect to knowledge acquisition and knowledge distribution: experience flows from projects to individual employees only. Organizational learning is faster with respect to both, because an intensified exchange of experiences connects employees and projects over time.

Page 87

Add-on Value of Experience Management (2)

• Enriching process descriptions with experience: more experience is included in a process description during its major revisions → closer to the ideal learning curve (for each process description) → higher quality of products delivered by projects → higher value of the SE portal and its repository.

Figure: The value of one best-practice description over time: sharing the description together with related experience approaches the ideal learning curve (without forgetting) more closely than sharing the best-practice description alone (without experience).