Schedule Metrics – Beyond the Ordinary
NASA Project Management Challenge 2007
John Krahula / PM Metrics
Copyright 2006 John Krahula and PM Metrics 2
What is a Schedule

► Model/simulation of the process leading to the creation of a desired event or deliverable
► Source of important management information
► Repository of historic information for contractual purposes and for creating subsequent schedules
What Makes a Good Schedule

► Properly structured
  - Activity type usage
  - Durations, constraint use, logic
  - Follows your scheduling business rules
► Effective and appropriate statusing process
► Effective coding strategy
  - WBS is only the beginning
► Effective measurement/reporting strategy
Performance Measurement/Metrics

► Schedules generate tons of information; what is relevant, appropriate, in context?
► Levels
  - Schedule – validate the schedule as a tool
  - Project – validate the project's success
► Types
  - Structural (S) – schedule development/statusing
  - Progress/Status (S/P)
  - Code/Calculated/Management (Key Performance Indicators, etc.) (P)
► Filter what is measured
Metrics in Perspective/Context

► Snapshot metrics
  - Description of the current situation
► Trends
  - Analysis of values over time
  - Trend of current period/snapshot metrics
  - Examine cumulative values – history
  - Time-phased metrics
► Measurement strategy matches the project
  - Not all projects are the same
  - PM ROI
Trip Levels

Step 1 – Set up rules and triggers to highlight areas for analysis
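Step 1's rules and triggers can be sketched as simple threshold checks over schedule fields. The field names and trip limits below are illustrative assumptions for the sketch, not PM Metrics' actual configuration.

```python
# Minimal sketch of trip-level rules: each rule names a schedule field,
# a comparison, and a threshold that "trips" a task into the analysis set.
# Field names and limits here are hypothetical examples.
import operator

TRIP_RULES = [
    ("total_slack",     operator.lt, 0),   # negative slack: deliverable at risk
    ("finish_variance", operator.gt, 10),  # finishing >10 days late vs. baseline
    ("duration",        operator.gt, 44),  # long-duration task
]

def tripped(task):
    """Return the names of every rule this task trips."""
    return [field for field, cmp, limit in TRIP_RULES
            if cmp(task.get(field, 0), limit)]

tasks = [
    {"id": "A", "total_slack": 5,  "finish_variance": 0,  "duration": 10},
    {"id": "B", "total_slack": -3, "finish_variance": 12, "duration": 50},
]
flagged = {t["id"]: tripped(t) for t in tasks if tripped(t)}
print(flagged)  # → {'B': ['total_slack', 'finish_variance', 'duration']}
```

Once the trip rules are in place, only tasks that trip at least one rule flow into the later filtering and analysis steps.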
Trip Levels – Filter

Step 2 – When running an analysis, analyze the appropriate information. Get rid of some trees.
Filter/Setup References

Process and Analyze

Step 3 – Analyze Results – High Level
General Information Screen
The High Level View

► Triage
► Red is bad
► Validate statusing or schedule structure
► Drill down to a lower level of detail
Find Details, Weights, Those Responsible

Step 4 – Sort through the detail, find what is important, and communicate with those responsible
Distribution – Buckets
Distribution Graph
Trends
Detail Reports
Summary Reports
Details – Other/Chokepoints/Averages
Burndowns and Aggregates
Work Burndown
Task Burndown
Task Day Burndown
Had Problems, Getting Better
(09/02/06 is 09/02/05)
Constraints
Resource Information
Baseline, Forecast and Previous Period Metrics
Cumulative/Task Density
More Trends and Time-phased Trends

► Look at what happens over time
► Combine different metrics
► Defensive metrics – use metrics to tell a story
Comparing Baseline and Forecast
Distributions give an idea of weight
Baseline vs. Forecast/Actual
[Chart: Baseline vs. Forecast Finishes – cumulative finishes (0 to 2500) over time, 15-Jan-04 through 19-Oct-06; series: bfinish, cffinish]
Milestones By Time
Subcontract MS, Givers/Receivers, Lower Level Events
[Chart: Milestones by Week – milestone finishes (0 to 20) per week, 15-Jan-04 through 30-Nov-06; series: bccm]
Task Density
How many tasks are in play at any one time
[Chart: Task Density – tasks active weekly (0 to 80), 15-Jan-04 through 04-Jan-07; series: bcct]
Task Density for 1 WBS Element
[Chart: Task Density, WBS 01020312 – number of tasks (0 to 7) by month (1 to 8)]
Is –DV Good or Bad?
It's Bad
Are We Recovering?
Not Really
When Are We Tackling the Big Boys?
[Chart: Average Finish Variance of Late Tasks – finish variance in days (0 to 40) by week number (1 to 8)]
Did a RW and Now…
But – Greatly Reducing Lag Use
But 3× the Number of Tasks/MS
Task Duration Down – Lower Level of Detail
So What Happened

► Went to a lower level of detail
  - Smaller durations
  - More tasks and milestones
► Got away from using lags
► This is an IMS-level schedule; many links represent interactions between IPTs
► Tons of spec reviews, etc.
Comprehensive Comparison
Activity/Date Tab
Sample
Progress Tab
Variance and Slack Tab
Resources and Relationships Tab
Conclusion

► There is a lot of data in a schedule
► Find the right measurement strategy
► Look for answers, but more importantly, look for the questions
Contact PM Metrics at [email protected] or [email protected]
Supporting Information
More Detailed Look at Screens
Activities

► Types of activities: tasks, milestones, and summary
  - Tasks may represent work packages or, more likely, EV milestones
  - Milestones may be part of a strategy for performance measurement
  - Could use summary activities for work packages or cost accounts, WBS elements, etc.
► Start, finish, and isolated activities give an indication of linking in the schedule and/or the number of deliverables
► Summary logic is generally not acceptable
Relationships

► Ratio gives an indication of linking
► Relationship-type usage is important in determining the structural health of the schedule and possible hiding of lateness (FS to SS or FF)
► Beware of schedulers using SF
► The great debate: lags versus constraints
► Negative lags can help model total slack better
Constraints

► The Honor Constraints option is dangerous
► Hard constraints should be used sparingly
  - Use for deliverables
  - Deadlines in MS Project can act as hard constraints
► Develop a strategy for using soft constraints
  - Logic still wins
  - Ersatz resources – used to model resource availability
Progress

► Numbers of complete, in-progress, and planned activities
► Missing baselines
► Should have started/finished – tasks unstatused before the status date
► Future status – out-of-sequence status
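The should-have-finished and future-status checks above can be sketched directly. The task dictionaries and field names below are made-up illustrations of the idea, not a real scheduling tool's data model.

```python
# Sketch of two statusing checks: tasks that should have finished by the
# status date but carry no actuals, and out-of-sequence "future status"
# (actuals recorded after the status date). Field names are illustrative.
from datetime import date

def should_have_finished(tasks, status_date):
    """Planned finish before the status date, but no actual finish recorded."""
    return [t for t in tasks
            if t["planned_finish"] < status_date and t["actual_finish"] is None]

def future_status(tasks, status_date):
    """Out-of-sequence statusing: actual finishes later than the status date."""
    return [t for t in tasks
            if t["actual_finish"] is not None and t["actual_finish"] > status_date]

status = date(2006, 9, 1)
tasks = [
    {"id": 1, "planned_finish": date(2006, 8, 20), "actual_finish": None},
    {"id": 2, "planned_finish": date(2006, 8, 25), "actual_finish": date(2006, 9, 15)},
]
print([t["id"] for t in should_have_finished(tasks, status)])  # → [1]
print([t["id"] for t in future_status(tasks, status)])         # → [2]
```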
Duration and Duration Variance

► Long-duration tasks
► Duration variance > 0: tasks that have taken longer than expected
► Duration variance < 0: tasks that have completed sooner than expected
► Duration variance is useful for history
► Useful for dynamic/radical/agile PM
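The sign convention above amounts to a one-line calculation; the sample task history below is fabricated purely to illustrate it.

```python
# Duration variance: current (or actual) duration minus baseline duration,
# in working days. Positive = took longer than planned; negative = finished
# sooner. The history dict maps task name -> (baseline_days, current_days).
def duration_variance(baseline_days, current_days):
    return current_days - baseline_days

history = {"design": (20, 26), "code": (15, 15), "test": (10, 7)}
dv = {name: duration_variance(b, c) for name, (b, c) in history.items()}
overruns = {k: v for k, v in dv.items() if v > 0}
print(dv)        # → {'design': 6, 'code': 0, 'test': -3}
print(overruns)  # → {'design': 6}
```

Kept per period, these values become the historical record the slide mentions, and feed duration estimates for the next schedule.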
Finish Variance

► Used along with total slack for standard analysis of schedules
► Has a user-definable distribution
► Finish variance calculated on interim dates helps gauge performance if the baseline is not relevant
Total Slack and Free Slack

► Large values show missing relationships
► Negative values show missed deliverables or delay of the entire project
► Standard method for identifying problem areas
► Depends on constraints being properly used
► Decreasing free slack means a compressing schedule
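The first two bullets can be turned into a simple classifier for triage. The 60-day "large" threshold here is an assumption for the sketch; pick one that fits your own project's durations.

```python
# Sketch of total-slack triage: negative slack signals a missed deliverable
# or project delay; very large slack often signals missing relationships.
# The threshold below is an illustrative assumption, not a standard.
LARGE_SLACK_DAYS = 60

def classify_slack(total_slack_days):
    if total_slack_days < 0:
        return "negative: deliverable/project at risk"
    if total_slack_days > LARGE_SLACK_DAYS:
        return "large: check for missing relationships"
    return "ok"

print(classify_slack(-5))   # → negative: deliverable/project at risk
print(classify_slack(120))  # → large: check for missing relationships
print(classify_slack(10))   # → ok
```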
Distributions

► Start variance, finish variance, duration variance, total slack, and lags
► The duration distribution gives an idea of how discretely you have planned
► Gives an idea of the scope of problems rather than just the long pole
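A user-definable distribution is just a set of bins over one of these fields. The bin edges and sample finish-variance values below are illustrative assumptions.

```python
# Sketch of bucketing finish-variance values into user-defined bins so the
# scope of the problem is visible, not just the single worst task.
from collections import Counter

BINS = [(-10**9, -1, "early"), (0, 0, "on time"),
        (1, 10, "1-10 days late"), (11, 10**9, ">10 days late")]

def bucket(value):
    """Map a finish-variance value (in days) to its bin label."""
    return next(label for lo, hi, label in BINS if lo <= value <= hi)

variances = [-3, 0, 0, 4, 9, 15, 40]
distribution = Counter(bucket(v) for v in variances)
print(dict(distribution))
```

Plotting these counts per status period gives the distribution graphs shown earlier in the deck.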
Chokepoints

► Chokepoints are an analysis of how many relationships a task has
► As with important reviews, etc., tasks with many relationships are important to track
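Counting relationships per task is a small graph-degree computation. The link list and the threshold of three relationships below are made up for illustration.

```python
# Sketch of a chokepoint count: total predecessor + successor links per task,
# over a made-up list of (predecessor, successor) relationship pairs.
from collections import Counter

links = [("A", "C"), ("B", "C"), ("C", "D"), ("C", "E"), ("C", "REVIEW"), ("D", "REVIEW")]
degree = Counter()
for pred, succ in links:
    degree[pred] += 1
    degree[succ] += 1

# Tasks with many relationships (threshold is an illustrative choice).
chokepoints = [t for t, n in degree.items() if n >= 3]
print(chokepoints)  # → ['C']
```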
Other Information

► Subprojects
► Task calendars
► Deadlines
► Elapsed duration
► Estimated duration
Averages

► Average durations give an idea of the granularity of the tasks
► Changes/trends of averages can show degradation or the turning around of a project
Baseline Metrics

► Determine what should have been worked
► Started/finished on the exact day
► Started/finished within the status period
► Start early/finish early
► What was not started or finished
► Previous lates and healed
► Results by activity type
Forecast Date Metrics

► Metrics based on forecast dates
► Metrics for both starts and finishes
► Metric results by activity type
► Negative differences mean tasks not completed and not re-forecast
Previous Status Period

► How is the schedule doing week to week, or status period to status period?
► Starts/finishes within the status period
► Actual starts/finishes within the status period
► Start early/finish early
► By activity type
Task Density

► Concurrent tasks are all tasks scheduled during a time period
► The more tasks in the works during a period, the greater the chance of not meeting deliverables
► How many tasks can we effectively manage? Do we have structure problems?
► Defining the bow wave
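Task density as defined above is a count of tasks whose start-finish span overlaps each period. The task dates below are fabricated for illustration; a real schedule would supply per-task start and finish dates.

```python
# Sketch of weekly task density: for each week, count tasks whose
# (start, finish) span overlaps that week. Dates are made up.
from datetime import date, timedelta

tasks = [(date(2006, 1, 2), date(2006, 1, 20)),
         (date(2006, 1, 9), date(2006, 1, 13)),
         (date(2006, 1, 16), date(2006, 2, 3))]

def density(tasks, first_week, weeks):
    """Tasks in play per week, starting from the Monday first_week."""
    counts = []
    for w in range(weeks):
        wk_start = first_week + timedelta(weeks=w)
        wk_end = wk_start + timedelta(days=6)
        # A task overlaps the week if it starts before the week ends
        # and finishes after the week begins.
        counts.append(sum(1 for s, f in tasks if s <= wk_end and f >= wk_start))
    return counts

print(density(tasks, date(2006, 1, 2), 5))  # → [1, 2, 2, 1, 1]
```

A rising density curve ahead of the status date is one way to see the bow wave forming: more and more tasks pushed into the same future periods than the team can realistically work.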