
An Academic Data Center Build Out at the University of Michigan

Paul Killey
[email protected]

Net@EDU, Tempe, AZ, Feb 5, 2007

Copyright

Text is copyright Paul Killey 2007. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.

Photos are copyright 2006-2007 the Regents of the University of Michigan. For information, questions, or permission requests for the photos, please contact the Computer Aided Engineering Network at the University of Michigan:

2170 Duderstadt Center Ann Arbor, Michigan 48109

[email protected] (734) 936-3565

Caveats

• Cost and budget information is as I understand it as of this presentation. The situation is still very dynamic, and these figures and their underlying assumptions could easily change.

• There is some cost sharing in the facility between the University of Michigan and other building occupants, which is not fully described here.

The Building

• The MITC building is not a U-Mich building.

• U-Mich has a fifteen-year lease with two optional five-year extensions.

• The data center was built out as a leasehold improvement.

• The data center is in a 10,000 sq ft space in the lower (basement) level.

• U-Mich has an 85% share.

Capital and Operating Expenses

• Overall project construction cost of $18M

• Maintenance of the major equipment in the data center (generators, flywheels, CRAC units, etc.) is estimated at $300K / year.

• $220K / year in rent.

• $200K / year in facility staff.

• $100K in annual network maintenance.

• $1.8M / year for electricity when full (a rough cross-check follows this list).
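A back-of-the-envelope cross-check of the electricity figure and the total recurring cost, sketched in Python. The electricity rate and the assumption of continuous full load are mine, not numbers from the presentation; everything else comes from the bullets above and the Power slide.

```python
# Rough cross-check of the recurring costs listed above.
# The electricity rate and load factor are illustrative assumptions,
# not figures from the presentation.

POWER_MW = 4.0                 # 2 MW for computers + 2 MW to cool (Power slide)
HOURS_PER_YEAR = 8760
ASSUMED_RATE_PER_KWH = 0.05    # $/kWh -- assumed for illustration
ASSUMED_LOAD_FACTOR = 1.0      # "when full"

electricity = POWER_MW * 1000 * HOURS_PER_YEAR * ASSUMED_LOAD_FACTOR * ASSUMED_RATE_PER_KWH
print(f"Electricity: ~${electricity / 1e6:.2f}M / year")   # ~$1.75M, near the quoted $1.8M

# Recurring costs from this slide: maintenance, rent, staff, network, electricity.
recurring = 300_000 + 220_000 + 200_000 + 100_000 + 1_800_000
print(f"Recurring operating cost when full: ~${recurring / 1e6:.2f}M / year")  # ~$2.62M
```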

Power

• 2 MW for computers, 2 MW to cool; 4 MW total.

• Flywheels and three 2 MW diesel generators.

• Single power feed to the property; dual feed from the property line to each of 9 PDUs.

• Power to circuit breaker panels and racks is single-cord.
– Could run additional circuits for dual-cord applications that may arise.

• When averaged across the floor (worked through in the sketch after this list):
– 200 W/sq ft
– 30 sq ft per rack
– 6 kW per rack

• Actual: 240 W/sq ft over the west half and 160 W/sq ft over the east half.
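The floor-average numbers in the previous list follow directly from figures elsewhere in these slides (2 MW of computer power, the 10,000 sq ft room, and roughly 30 sq ft allotted per rack); a minimal sketch of that arithmetic:

```python
# Floor-average power density from the slide's own figures.
IT_POWER_W = 2_000_000   # 2 MW for computers
FLOOR_SQFT = 10_000      # data center floor area
SQFT_PER_RACK = 30       # average footprint allotted per rack

watts_per_sqft = IT_POWER_W / FLOOR_SQFT              # 200 W/sq ft
kw_per_rack = watts_per_sqft * SQFT_PER_RACK / 1000   # 6 kW per rack

print(f"{watts_per_sqft:.0f} W/sq ft, {kw_per_rack:.0f} kW per rack")
```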

Cooling

• 16 traditional CRAC units force air underneath the raised floor.

• 78 Liebert XDO and XDV units overhead.

• A common glycol loop connects to eight dry coolers on the roof.

• At full load, two of the CRAC units and two of the dry coolers are redundant, and the glycol loop contains two parallel pathways with crossover valves at regular intervals (see the sketch after this list).
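One way to read that redundancy statement is as an N+2 arrangement at each stage: when two units are out of service, the remaining ones must carry the full design load. The sketch below only computes the implied per-unit duty; the assumption that each stage sees the full 2 MW of heat is mine (in practice the overhead XD units carry part of the room load), so the numbers are illustrative.

```python
# Per-unit duty implied by N+2 redundancy: with `redundant` units offline,
# the remaining units must carry the full design load.
def duty_per_unit(load_kw: float, units: int, redundant: int) -> float:
    return load_kw / (units - redundant)

DESIGN_HEAT_KW = 2000   # 2 MW of heat to reject -- assumed to reach each stage in full

print(f"CRAC duty:       {duty_per_unit(DESIGN_HEAT_KW, 16, 2):.0f} kW each")  # ~143 kW
print(f"Dry cooler duty: {duty_per_unit(DESIGN_HEAT_KW, 8, 2):.0f} kW each")   # ~333 kW
```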

New Costs

• Quality
– UPS
– Redundancy
– Appropriate space and environmentals

• Quantity
– 4 MW of power
– 10,000 square feet

• Dedicated facility staff

• Moving to “lights-out” management

Barriers

• Paying for things twice
– “I have three years of use left in my 30 W/sq ft room with the leaky ceiling.”

• Not all resources are (easily) fungible
– Some period of time may be needed for a non-wrenching resource reallocation.

• Lack of awareness of need or value
– “Why do you need a data center? Don’t you have a laptop?”

• Newly Visible (maybe unfamiliar) Costs v. Hidden Costs

How Does This Change Things?

• Makes computing an institutional activity and issue.

• Why did you need the space in the first place?
– Smaller unrelated activities were not going to gain critical mass.
– Competitiveness
– Business case

What Did We Just Buy?

• Productivity
– Capital instead of labor
– Improved process
– Greater efficiency
– Lower per-unit costs

• New Capability
– Aggregation
– Collaboration

How to Justify, Get Buy-in, and $$

• This needs to be sustainable after X years.

• Minimize barriers.

• Buffer or ramp up new costs.

• Pay for unused capacity.
– Who covers fixed costs that are not sensitive to usage?

• How to allocate shares of fixed costs?

• How to allocate variable costs?

Existing and New Funds

• Reallocate existing funding streams
– Faculty startup packages
– Sponsored research, including equipment grants
– Donations
– Construction projects that do not have to provide local data centers or machine rooms

• Find new opportunities
– Research
– Regional partners
– Economic development
– Aggregating “spend”