
Frequency Analysis Problems

Problems

1. Extrapolation
2. Short Records
3. Extreme Data
4. Non-extreme Data
5. Stationarity of Data
6. Data Accuracy
7. Peak Instantaneous Data
8. Gauge Coverage
9. No Routing
10. No Correct Distribution
11. Variation In Results
12. No Verification Of Results
13. Mathematistry

1. Extrapolation

• Danger in fitting to a known set of data and extrapolating to the unknown without understanding the physics

• Example: US population growth chart:

• Tight fit with existing data

• Application of “accepted” distribution

• No understanding of underlying factors

• Results totally wrong
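The toy calculation below (entirely synthetic numbers, not the Thompson census data) illustrates the point: a curve fitted tightly to the known record says nothing about what the underlying process does outside that record.

```python
# Toy illustration with synthetic data only (not the US census figures):
# an exponential fitted tightly to the early record extrapolates badly
# once the underlying process levels off.
import numpy as np

years = np.arange(0, 101, 10)
true = 300.0 / (1.0 + np.exp(-0.08 * (years - 60)))   # "true" process flattens out

# Fit an exponential to the first six points only (the "known" record)
slope, intercept = np.polyfit(years[:6], np.log(true[:6]), deg=1)
fitted = np.exp(intercept + slope * years)

for y, t, f in zip(years, true, fitted):
    tag = "extrapolated" if y > years[5] else "fitted"
    print(f"year {y:3d}   true {t:6.1f}   exponential fit {f:8.1f}   ({tag})")
```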

1. Extrapolation

US Population Extrapolation (Thompson 1942, as reported in Klemes 1986)

2. Short Records

• Ideally require a record length several times greater than the desired return period

• Alberta has over 1000 gauges with records, but very few are long

• Frequency analysis results can be very sensitive to addition of one or two data points

• Subsampling larger records indicates sensitivity
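A minimal sketch of that subsampling idea, using a synthetic 90-year annual-maximum record and a method-of-moments Gumbel fit (both assumed here purely for illustration; this is not the method behind the Qn/Q90 chart further below):

```python
# Minimal sketch (synthetic data, method-of-moments Gumbel fit): subsample a
# long annual-maximum record and watch how much the 1:100 estimate moves with
# record length.
import numpy as np

rng = np.random.default_rng(1)
record = rng.gumbel(loc=400.0, scale=150.0, size=90)   # 90-year synthetic record

def q_gumbel(sample, T=100):
    """1-in-T quantile from a method-of-moments Gumbel fit."""
    scale = np.std(sample, ddof=1) * np.sqrt(6) / np.pi
    loc = np.mean(sample) - 0.5772 * scale
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

for n in (10, 20, 30, 50, 90):
    estimates = [q_gumbel(rng.choice(record, size=n, replace=False)) for _ in range(500)]
    print(f"n = {n:2d} years: Q100 ranges {np.min(estimates):6.0f} to {np.max(estimates):6.0f} m3/s")
```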

[Chart: Number of gauges vs. minimum record length (years)]

2. Short Records

Min. Record Length (Years)   Number of Gauges   Percent
                         0               1085     100.0
                        10                564      52.0
                        20                354      32.6
                        30                212      19.5
                        40                109      10.0
                        50                 62       5.7
                        60                 44       4.1
                        70                 29       2.7
                        80                 21       1.9
                        90                  3       0.3

[Chart: Qn/Q90 vs. sample length 'n' (years), with 5% and 95% confidence bounds for the 1:25, 1:50, and 1:100 estimates]

2. Short Records

3. Extreme Data

• The years recorded at a gauge may or may not have included extreme events

• Large floods known to have occurred at gauge sites but not recorded

• Some gauges may have missed extreme events only by chance, e.g. the 1995 flood was originally predicted for the Red Deer basin but ended up on the Oldman basin; the Red Deer and Bow River basins have not seen extreme floods in 50 to 70 years

• Presence of several extreme events could cause frequency analysis to over-predict

• Presence of no extreme events could cause frequency analysis to under-predict
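A minimal sketch of this sensitivity (synthetic 40-year record and a two-parameter lognormal fitted by log-moments, assumed only for illustration and unrelated to the Bow River data shown below):

```python
# Minimal sketch (synthetic record, lognormal fitted by log-moments):
# adding or dropping one large flood shifts the whole upper tail of the fit.
import numpy as np

def q_lognormal(sample, z=2.326):
    """1:100 quantile from a lognormal fitted to the log-moments of the sample
    (z = 2.326 is the standard normal quantile at non-exceedance 0.99)."""
    logs = np.log(sample)
    return np.exp(np.mean(logs) + z * np.std(logs, ddof=1))

rng = np.random.default_rng(7)
record = rng.lognormal(mean=6.0, sigma=0.5, size=40)   # 40 "ordinary" years
with_flood = np.append(record, 2200.0)                 # one large historic flood added

print(f"Q100 without the extreme event: {q_lognormal(record):7.0f} m3/s")
print(f"Q100 with the extreme event:    {q_lognormal(with_flood):7.0f} m3/s")
```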

[Chart: Annual peak instantaneous discharge (cms), 1870-1990]

3. Extreme Data

Gauge 05BH004

Bow River At Calgary

[Chart: Discharge (cms) vs. return period (years), LN3 fits with and without ungauged data]

3. Extreme Data

Gauge 05BH004

Bow River At Calgary

4. Non-extreme Data

• All data points are used by statistical methods to fit a distribution. Most of these points are for non-extreme events, which have very different physical responses from extreme events, e.g.:

magnitude, duration, and location of storm
snowmelt vs. rainfall
amount of contributing drainage area
initial moisture
impact of routing at lower volumes of runoff

• Fitting to smaller events may produce a poor fit and poor extrapolation for larger events

• Changes in values at the left tail affect the extrapolation at the right tail, which makes no physical sense (see the sketch below)
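A minimal sketch of that left-tail effect (synthetic 30-year record and a two-parameter lognormal fitted by log-moments, assumed only for illustration; not the East Humber data from Klemes 1986 shown further below):

```python
# Minimal sketch (synthetic record, lognormal fitted by log-moments): nudging
# only the three smallest annual floods still moves the extrapolated 1:100 value.
import numpy as np

def q100(sample):
    logs = np.log(sample)
    return np.exp(np.mean(logs) + 2.326 * np.std(logs, ddof=1))  # 2.326 = normal quantile at 0.99

rng = np.random.default_rng(3)
record = np.sort(rng.lognormal(mean=4.0, sigma=0.6, size=30))

lowered = record.copy()
raised = record.copy()
lowered[:3] *= 0.7    # three lowest points slightly reduced
raised[:3] *= 1.3     # three lowest points slightly increased

for label, data in (("original", record), ("lowest 3 reduced", lowered), ("lowest 3 increased", raised)):
    print(f"{label:20s} Q100 = {q100(data):6.1f} m3/s")
```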

[Chart: Discharge (cms) vs. return period (years), LN3 fits with and without ungauged data]

Gauge 05BH004

Bow River At Calgary

4. Non-extreme Data

4. Non-extreme Data

A - Original fit
B - 3 lowest points slightly reduced
C - 3 lowest points slightly increased

East Humber River, Ontario (Klemes 1986)

[Chart: Discharge (cms) vs. return period (years); combined, spring, and summer data with LP3 fits for each]

4. Non-extreme Data

5. Stationarity Of Data

• Changes may have occurred in the basin during the flow record that affect its runoff response, e.g.:

man-made structures - dams, levees, diversions
land use changes - agriculture, forestation, irrigation

• In order to keep the equivalent length of record, hydrologic modelling would be required to convert the data so that it would be consistent.

• This modelling would be very difficult, as it would cover a wide range of events over a number of years

6. Data Accuracy

• Extreme data often not gauged

• Extrapolated using rating curves (see the sketch after this list)

• Channel changes during large floods - geometry, roughness, sediment transport

• Problems with operation of stage recording gauges e.g. damage, ice effects

• Problems with data reporting e.g. Fish Ck, 1915

• Hydrograph examination can ID problems
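A minimal sketch of the rating-curve issue (made-up gaugings and an assumed zero-flow stage, not the Pincher Creek record): a power-law rating Q = a(h - h0)^b fitted to in-bank measurements, then extrapolated to a flood stage well above the highest gauging.

```python
# Minimal sketch: power-law rating curve fitted to low- and mid-range
# gaugings (made-up numbers), then extrapolated to an ungauged flood stage.
import numpy as np

# Gauged stage (m) and discharge (m3/s) pairs, all below ~2 m stage
stage = np.array([0.6, 0.8, 1.0, 1.3, 1.6, 1.9])
flow = np.array([8.0, 18.0, 32.0, 60.0, 95.0, 140.0])

h0 = 0.4                                   # assumed zero-flow stage
b, log_a = np.polyfit(np.log(stage - h0), np.log(flow), deg=1)
a = np.exp(log_a)

flood_stage = 4.2                          # highest recorded water level, never gauged
q_flood = a * (flood_stage - h0) ** b
print(f"fitted rating: Q = {a:.1f} * (h - {h0}) ** {b:.2f}")
print(f"extrapolated flood discharge at stage {flood_stage} m: {q_flood:.0f} m3/s")
# The extrapolation assumes the fitted geometry and roughness still hold at flood
# stage; overbank flow or channel change during the flood can make it badly wrong.
```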

6. Data Accuracy

[Chart: Flood hydrograph, discharge (m3/s) vs. date (Jun-06 to Jun-10), with gauge measurements and the mean annual flood marked]

6. Data Accuracy

[Chart: Stage (m) vs. discharge (m3/s) rating curve, Gauge 05AA004, Pincher Ck - 1995, showing the highest recorded water level and the highest gauge measurement]

[Chart: Mean daily discharge (cms), Jun-24 to Jul-02]

6. Data Accuracy

• Qi reported as 200 m3/s
• Does not fit mean daily flows

Gauge 05BK001

Fish Ck - 1915

7. Peak Instantaneous Data

• Design discharge is based on peak instantaneous values, but sometimes this data is not available

• Conversion of mean daily data to instantaneous requires consideration of the hydrograph timing e.g. peaks near midnight vs. peaks near noon

• Different storm durations can result in very different peak-to-mean-daily ratios for the same basin (see the sketch below)

• Applying a multiplier to the results of a frequency analysis based on mean daily values can lead to misleading results

• Statistical methods require that all data points be consistent, even though many are irrelevant to extrapolation
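A minimal sketch of the timing effect (idealized triangular hydrograph with assumed rise and recession times, not Gauge 05AA023): the same instantaneous peak gives very different mean-daily peaks depending on whether it occurs near noon or near midnight.

```python
# Minimal sketch: identical flood, different peak timing, different mean-daily
# maximum and therefore a different peak-to-mean-daily ratio.
import numpy as np

hours = np.arange(0, 96)                      # four days, hourly steps

def hydrograph(peak_hour, qpeak=1200.0, base=50.0, rise=6.0, fall=12.0):
    """Triangular flood hydrograph over a base flow (assumed shape)."""
    up = np.clip(1.0 - (peak_hour - hours) / rise, 0.0, None)
    down = np.clip(1.0 - (hours - peak_hour) / fall, 0.0, None)
    return base + (qpeak - base) * np.minimum(up, down)

for label, peak_hour in (("peak at noon (day 2)", 36), ("peak at midnight (day 2/3)", 48)):
    q = hydrograph(peak_hour)
    daily_means = q.reshape(4, 24).mean(axis=1)
    print(f"{label:28s} instantaneous peak {q.max():6.0f}  "
          f"highest mean daily {daily_means.max():6.0f}  "
          f"ratio {q.max() / daily_means.max():4.2f}")
```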

7. Peak Instantaneous Data

[Chart: Discharge (m3/s) vs. time (hours): flood hydrograph with mean daily values for a peak at noon vs. a peak at midnight]

Gauge 05AA023

Oldman R - 1995

7. Peak Instantaneous Data

[Chart: Discharge (cms) vs. time (hours), Oldman R Dam: 1975 flood and 1995 flood hydrographs with the pre-1995 1:100 estimate]

8. Gauge Coverage

• Limited number of gauges in province with significant record lengths

• Difficult to transfer peak flow number to other sites without consideration of hydrographs and routing

• Area exponent method is very sensitive to the assumed exponent (see the sketch below)
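A minimal worked example (illustrative numbers only) of transferring a peak flow with Q2 = Q1 (A2/A1)^n and how the result shifts with the assumed exponent:

```python
# Minimal sketch: sensitivity of the area-exponent transfer to the assumed n.
q_gauged = 500.0          # peak flow at the gauged site, m3/s (illustrative)
area_ratio = 0.5          # ungauged area / gauged area (illustrative)

for n in (0.5, 0.7, 0.9):
    q_transfer = q_gauged * area_ratio ** n
    print(f"n = {n}: transferred peak = {q_transfer:5.0f} m3/s")
# 0.5**0.5 = 0.71 but 0.5**0.9 = 0.54, roughly a 25% spread from the exponent alone
```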

8. Gauge Coverage

All Gauges (1085)

Gauges >30 Years (212)

8. Gauge Coverage

Minimum Record Length (Years)   Number of Gauges   Percent
                            0               1085     100.0
                           10                564      52.0
                           20                354      32.6
                           30                212      19.5
                           40                109      10.0
                           50                 62       5.7
                           60                 44       4.1
                           70                 29       2.7
                           80                 21       1.9
                           90                  3       0.3

[Chart: Discharge ratio vs. drainage area ratio for exponents n = 0.5, 0.7, and 0.9]

8. Gauge Coverage

9. No Routing

• Peak instantaneous flow value is only applicable at the gauge site

• Need a hydrograph to route flows, not just a peak discharge (see the sketch after this list)

• Major routing factors include:

Basin configuration

Lakes and reservoirs

Floodplain storage

Inter-basin transfers, e.g. Highwood - Little Bow River
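A minimal routing sketch (Muskingum channel routing with assumed K, X, and a made-up inflow hydrograph): the outflow peak is attenuated and delayed relative to the inflow peak, which is why a peak value alone cannot simply be carried downstream.

```python
# Minimal sketch: Muskingum routing with assumed parameters (K = 6 h, X = 0.15)
# and an illustrative inflow hydrograph; not taken from any gauged basin.
import numpy as np

dt, K, X = 2.0, 6.0, 0.15                  # time step and routing parameters (hours)
denom = 2 * K * (1 - X) + dt
c0 = (dt - 2 * K * X) / denom
c1 = (dt + 2 * K * X) / denom
c2 = (2 * K * (1 - X) - dt) / denom        # c0 + c1 + c2 = 1

t = np.arange(0, 80, dt)
inflow = 2.0 + 13.0 * np.exp(-(((t - 20.0) / 8.0) ** 2))   # peaked inflow, m3/s

outflow = np.empty_like(inflow)
outflow[0] = inflow[0]
for i in range(1, len(t)):
    outflow[i] = c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[i - 1]

print(f"inflow peak  {inflow.max():5.1f} m3/s at t = {t[inflow.argmax()]:4.1f} h")
print(f"outflow peak {outflow.max():5.1f} m3/s at t = {t[outflow.argmax()]:4.1f} h")
```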

[Chart: Inflow and outflow hydrographs, discharge (m3/s) vs. time (hrs)]

9. No Routing

10. No Correct Distribution

• Application of theoretical probability distributions and fitting techniques originated with Hazen (1914) in order to make straight line extrapolations from data

• There is no reason why they should be applicable to hydrologic observations

• None of them can account for the physics of the site during extrapolation

discharge limits due to floodplain storage
addition of flow from inter-basin transfers at extreme events
changes in contributing drainage area at extreme events

11. Variation in Results

• Different distributions and fitting techniques can yield vastly different results (see the sketch after this list)

• Many distributions in use - LN2, LN3, LP3, GEV, P3

• Many fitting techniques - Moments, Maximum Likelihood, Least Squares Fit, PWM

• No way to determine which one is the most appropriate for extrapolation

• Extrapolated values can be physically unrealistic
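A minimal sketch of that variation (synthetic 30-year record and scipy's default maximum-likelihood fits, assumed only for illustration; not the Waterton or Trap Creek records shown below):

```python
# Minimal sketch: three common distributions fitted to the same synthetic
# sample can give quite different 1:100 discharges.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
annual_max = rng.gumbel(loc=300.0, scale=120.0, size=30)   # 30-year synthetic record
p = 1.0 - 1.0 / 100.0                                      # non-exceedance probability for 1:100

q_gev = stats.genextreme.ppf(p, *stats.genextreme.fit(annual_max))
q_ln3 = stats.lognorm.ppf(p, *stats.lognorm.fit(annual_max))        # 3-parameter (shifted) lognormal
logs = np.log(annual_max)
q_lp3 = np.exp(stats.pearson3.ppf(p, *stats.pearson3.fit(logs)))    # log-Pearson type III

print(f"GEV  Q100 = {q_gev:6.0f} m3/s")
print(f"LN3  Q100 = {q_ln3:6.0f} m3/s")
print(f"LP3  Q100 = {q_lp3:6.0f} m3/s")
```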

[Chart: Discharge (cms) vs. return period (years): data with GEV, LN3, and LP3 fits]

11. Variation in Results

Gauge 05AD003

Waterton River Near Waterton

74 Years of Record

[Chart: Discharge (cms) vs. return period (years): data with GEV, LN3, and LP3 fits]

11. Variation in Results

Gauge 05BL027

Trap Ck Near Longview

20 Years of Record

12. No Verification Of Results

• Due to the separation of frequency analysis from physical modelling, the process cannot be tested.

• 1:100 year flood predictions cannot actually be tested for hundreds or thousands of years.

• There is therefore little opportunity to refine an analysis or to improve confidence in its applicability.

13. Mathematistry

• Gain artificial confidence in accuracy due to mathematical precision

statistics - means, standard deviations, skews, kurtosis, outliers, confidence limits
curve fitting - moments, maximum likelihood, least squares, probability weighted moments
probability distributions - LN3, LP3, GEV, Wakeby

• Lose sight of the physics with the focus on numbers

Conclusions

• Statistical frequency analysis has many problems in application to design discharge estimation for bridges.

• If frequency analysis is to be employed, extrapolation should be based on extreme events. This can be accomplished using graphical techniques if appropriate data exists.

• Alternative approaches to design discharge estimation should be investigated. These should:

be based on all relevant extreme flood observations for the area, minimizing extrapolations
account for the physical hydrologic characteristics of the area and the basin

Conclusions

Recommended articles by Klemes:

• “Common Sense And Other Heresies” - Compilation of selected papers into a book, published by CWRA

“Dilettantism in Hydrology: Transition or Destiny?” (1986)
“Hydrologic and Engineering Relevance of Flood Frequency Analysis” (1987)

• “Tall Tales About Tails Of Hydrological Distributions” - paper published in ASCE Journal Of Hydrologic Engineering, July 2000