Overview of DETER


Transcript of Overview of DETER

Page 1: Overview of DETER

The Challenges of Repeatable Experiment Archiving – Lessons from DETER

Stephen Schwab, SPARTA, Inc. d.b.a. Cobham Analytic Solutions, May 25, 2010

Page 2: Overview of DETER

Overview of DETER

DETER Highlights

-3 distributed clusters, ~500 nodes

-Combination of DETER-developed software and legacy Emulab

DETER Capabilities

-Federation

-Security Experimentation Environment (SEER)

-Templates

[Diagram: federation architecture. The Federator uses plug-ins to configure federants, connecting DETER, Emulab, WAIL, and DRAGON (GMPLS) across the Internet; also shown are SEER, CEDL, user credentials, and provisioned connectivity.]

Page 3: Overview of DETER

DDoS Experiment on DETER (circa 2005)

Background Traffic: REPLAY | NTCG | HARPOON (high-fidelity traffic)

Topology: BUILDING-BLOCKS | JUNIPER ROUTER CORE (realistic connectivity and scale-down)

Attack Traffic: DETER-INTEGRATED ATTACK SCRIPTING (automation of the variety of scenarios under study)

Instrumentation: PACKET AND HOST STATISTICS CAPTURE | SPECTRAL ANALYSIS | METRICS CALCULATION | INTEGRATED VISUALIZATION

SEER: toolbox for rigorous investigation of results

[Diagram: experiment topology with a core (AS-11357) carrying attack traffic and background traffic.]
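
To make the workflow above concrete, here is a minimal Python sketch of how such a scenario might be scripted end to end. The Scenario class and the element names (topology, background_traffic, attack_traffic, instrumentation) are hypothetical, invented for illustration; this is not the actual SEER API.

# Hypothetical driver for a DDoS scenario in the style of this slide.
# Class and method names are illustrative only, NOT the real SEER API.

class Scenario:
    """Ties together topology, traffic, attack, and instrumentation elements."""
    def __init__(self, name):
        self.name = name
        self.elements = []

    def add(self, kind, **params):
        self.elements.append((kind, params))
        return self  # allow chaining

    def run(self):
        for kind, params in self.elements:
            print(f"[{self.name}] starting {kind}: {params}")

ddos = (
    Scenario("ddos-2005")
    .add("topology", builder="building-blocks", core="juniper-router-core")
    .add("background_traffic", generator="harpoon", fidelity="high")
    .add("attack_traffic", script="syn-flood", targets=["server-1"])
    .add("instrumentation", capture=["packets", "host-stats"],
         analysis=["spectral", "metrics"], visualize=True)
)
ddos.run()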

Page 4: Overview of DETER

Security Experiment Methodology & Tools (circa 2005)

Experimenters select from a palette of predefined elements: Topology, Background and Attack Traffic, and Data Capture and Instrumentation

Our Methodology frames standard, systematic questions that guide an experimenter in selecting and combining the right elements

Experiment Automation increases repeatability and efficiency by integrating the process within the DETER testbed environment

[Diagram: PALETTES (TOPOLOGY | TRAFFIC | ATTACK | DATA-CAPTURE), METHODOLOGY & GUIDANCE, and EXPERIMENT AUTOMATION feeding into the experiment.]

DETER -- integrated workbench & tools for experimenters… but this level of abstraction leads to major drawbacks
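
As a rough illustration of the palette idea above, the sketch below shows one way a set of predefined elements could be represented and an experimenter's selection validated against it. The PALETTE contents and the validate_selection helper are assumptions for illustration, not DETER code.

# Illustrative palette of predefined elements; a methodology check ensures
# the experimenter covers every category. All names are hypothetical.

PALETTE = {
    "topology": ["building-blocks", "juniper-router-core"],
    "background_traffic": ["replay", "ntcg", "harpoon"],
    "attack_traffic": ["deter-attack-scripting"],
    "data_capture": ["packet-capture", "host-stats", "spectral-analysis"],
}

def validate_selection(selection):
    """Guide-rail: every category must be covered by a known palette element."""
    for category, options in PALETTE.items():
        chosen = selection.get(category)
        if chosen not in options:
            raise ValueError(f"{category}: pick one of {options}, got {chosen!r}")

validate_selection({
    "topology": "building-blocks",
    "background_traffic": "harpoon",
    "attack_traffic": "deter-attack-scripting",
    "data_capture": "packet-capture",
})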

Page 5: Overview of DETER

Worm/Botnet Experiment (2009)

• 831 Virtual Nodes on 63 Physical PCs
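
A quick back-of-the-envelope check of that packing ratio; the round-robin placement below is purely illustrative, since real testbed mapping also weighs CPU, memory, and link-bandwidth constraints.

# 831 virtual nodes on 63 physical PCs: roughly 13.2 virtual nodes per PC.
virtual_nodes = 831
physical_pcs = 63
print(f"~{virtual_nodes / physical_pcs:.1f} virtual nodes per physical PC")

# Naive round-robin placement (illustrative only):
placement = {pc: [] for pc in range(physical_pcs)}
for node in range(virtual_nodes):
    placement[node % physical_pcs].append(node)
print(max(len(v) for v in placement.values()), "nodes on the fullest PC")  # 14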

Page 6: Overview of DETER

Experiment Specification

Large and Complex Experiments are more suitably constructed

• by combining abstract elements

• modeling different aspects (topology, traffic, networking devices, etc.)

• with constraints on behavior

An example of such an experiment appears on the previous slide:

• a hand-crafted (i.e., hand-compiled) experiment built from abstract elements (a declarative sketch follows below)
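
A minimal sketch, assuming a Python encoding, of what a specification built from abstract elements with behavioral constraints might look like; the element names, models, and constraint values are all hypothetical.

# Hypothetical declarative spec: abstract elements, each modeling one aspect
# of the experiment, with explicit constraints on behavior.

from dataclasses import dataclass, field

@dataclass
class Element:
    name: str                                        # e.g. "topology"
    model: str                                       # abstract model it instantiates
    constraints: dict = field(default_factory=dict)  # bounds on behavior

@dataclass
class ExperimentSpec:
    elements: list

worm_experiment = ExperimentSpec(elements=[
    Element("topology", model="scale-free-as-graph",
            constraints={"nodes": 831, "max_degree": 40}),
    Element("traffic", model="scanning-worm",
            constraints={"scan_rate_pps": (10, 100)}),
    Element("devices", model="software-router",
            constraints={"forwarding_rate_pps": 50_000}),
])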

Page 7: Overview of DETER

Initial Approach: Archiving it All

Intuition drawn from analogy with physical (discovery) sciences…

Record all aspects of an experiment to ensure (ideal) reproducibility:

• Software
  - Artifacts being investigated (often the researcher's new system!)
  - Operating environment (OS, standard software on clients, servers in the experiment scenario, network routers, firewalls, etc.)
  - Experiment & test infrastructure (initialization, control, data collection, data reduction, data analysis, data visualization, …)

• Hardware
  - All end-systems and routers/switches
  - All firmware
  - All chips/chipset variants (Tulip 21140As are not Tulip 21140Es!)

• Procedures
  - All scripts and manual interactions required to run the experiment

… networked systems require large and growing (unbounded) detail to describe precisely… which ideal reproducibility would seem to demand
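
To show how unbounded that detail becomes, here is a sketch of an "archive it all" manifest organized along the categories above. Every field name and value is hypothetical; a real archive would also fingerprint each artifact (e.g. a content hash) so it could be verified on re-run.

# Hypothetical manifest covering software, hardware, and procedures.

import json

manifest = {
    "software": {
        "artifact_under_test": {"path": "my-ids/", "version": "git:abc123"},
        "operating_environment": {"os": "Linux 2.6.18",
                                  "packages": ["apache-2.2", "iptables-1.3"]},
        "test_infrastructure": ["init.sh", "collect.py", "analyze.py"],
    },
    "hardware": {
        "end_systems": ["pc3000 x 63"],
        "firmware": {"bios": "rev 1.2"},
        # Chip-level detail matters: a Tulip 21140A is not a 21140E.
        "nics": ["DEC Tulip 21140A"],
    },
    "procedures": {
        "scripts": ["run.sh"],
        "manual_interactions": ["power-cycle node-7 before trial 3"],
    },
}
print(json.dumps(manifest, indent=2))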

Page 8: Overview of DETER

Challenges to Ideal Archiving

Separating Invariants from Contingencies

• An experiment requires certain properties; these are the essence of the experiment

• But every configuration detail must be specified; these are contingencies – merely choices (perhaps important to record)

• Repeatability should be defined primarily with respect to explicit invariants (see the sketch below)
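
A minimal sketch, assuming a Python encoding, of the invariant/contingency split: a re-run repeats the experiment iff the explicit invariants hold, while contingencies are merely recorded choices. All names and thresholds here are invented for illustration.

# Invariants: properties the experiment requires (its essence).
INVARIANTS = {
    "link_delay_ms": lambda v: 45 <= v <= 55,   # essence: ~50 ms path
    "background_load": lambda v: v >= 0.3,      # essence: non-idle network
}

# Contingencies: configuration choices, recorded but not required.
CONTINGENCIES = {
    "router_model": "juniper-m10",
    "os_image": "FC4-STD",
}

def check_repeatability(observed):
    """Judge a re-run against explicit invariants only."""
    return {name: pred(observed[name]) for name, pred in INVARIANTS.items()}

print(check_repeatability({"link_delay_ms": 51, "background_load": 0.4}))
# {'link_delay_ms': True, 'background_load': True}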

Experiment Internals

• Publications do not capture full details because the increasing complexity of software, hardware and networking technologies results in (exponential?) growth in the description of these aspects

• Peer-review process does not provide incentives to capture full details (noted in other position papers)

• Funding agencies do not provide sufficient funding to do so (how much detail can, will and should be demanded? Where is the limit on returns for dollars invested?)

Granularity of Reuse

• Individual researchers are interested in examining, studying and re-running different elements of any given experiment

• Experiments that archive everything do not clearly delineate the various pieces

Page 9: Overview of DETER

Future DETER Capability & Vision

DETER is developing the capability to

• Specify Experiments Declaratively

• Reason about the software (or hardware) alternatives that may be available to realize each element in a testbed

• Select implementations that are sufficient to perform the experiment correctly

• … and ensure detection of fidelity loss through the use of monitored invariants

• Resolve global conflicts among local element-to-implementation mappings (see the sketch below)
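
One way to picture the mapping problem in the last two bullets is as a small constraint search: choose an implementation for each abstract element, reject assignments that violate a global constraint, and backtrack. The toy solver below is a hypothetical sketch, not DETER's actual resolver; the alternatives and the conflict rule are invented.

# Toy element-to-implementation resolver with one global conflict rule.

ALTERNATIVES = {
    "topology": ["physical-nodes", "virtual-nodes"],
    "traffic": ["replay", "harpoon"],
    "monitor": ["pcap-full", "flow-summary"],
}

def conflict(assignment):
    # Global constraint: full packet capture is too heavy for virtual nodes.
    return (assignment.get("topology") == "virtual-nodes"
            and assignment.get("monitor") == "pcap-full")

def solve(elements, assignment=None):
    assignment = assignment or {}
    if conflict(assignment):
        return None                      # prune conflicting partial mappings
    if len(assignment) == len(elements):
        return assignment                # every element has an implementation
    element = next(e for e in elements if e not in assignment)
    for impl in ALTERNATIVES[element]:
        result = solve(elements, {**assignment, element: impl})
        if result is not None:
            return result
    return None

print(solve(list(ALTERNATIVES)))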

The DETER vision is to foster

• Reuse through sharing of tools, technology, results & ideas among researchers

• … and to promote this vision by providing abstractions, models and elements that are supported by our experiment life-cycle framework and tools

• … and to facilitate individual researchers and groups focused on specific topics in creating their own abstractions, models and elements