KI Absicherung


Transcript of KI Absicherung

Page 1: KI Absicherung

KI Absicherung

Dr. Stephan Scholz, Volkswagen AG

11th March 2021, Online, Interim presentation

Page 2: KI Absicherung

Partners

Consortium Lead: Volkswagen AG

Co-Lead: Fraunhofer IAIS

Duration: 36 months (01.07.2019 – 30.06.2022)

24 Partners

Budget: €41M

Funding: €19.2M

External technology partners


Page 3: KI Absicherung

1. Project vision and goals

Page 4: KI Absicherung

Making the safety of AI-based function modules for highly automated driving verifiable

Pedestrian detection


Page 5: KI Absicherung

Challenge

Figure: KI Absicherung as a common language between "AI Land" and "Safety Land": an assurance methodology for safety-relevant AI modules.

• Promising new technology with unimagined possibilities
• Established safety processes cannot be applied
• Target: a safe, trustworthy driving function
• Industry consensus (Safe AI): methodology for a joint safety argumentation


Page 6: KI Absicherung

Main goals


KI Absicherung develops and investigates means and methods for verifying AI-based functions for highly automated driving.

1. Methods for training and testing of AI-based functions
2. Safety argumentation
3. Communication with standardization bodies on AI certification

For the pedestrian detection use case, the project is developing an exemplary safety argumentation and methods for verifying a complex AI function. The project's results will be used in the exchange with standardization bodies to support the development of a standard for safeguarding AI-based function modules.

Page 7: KI Absicherung

The KI Familie and its projects


Page 8: KI Absicherung

KI Absicherung in connection with the KI Familie


Page 9: KI Absicherung

2. Methodological and conceptual approach

Page 10: KI Absicherung

From a data-driven AI function to an Assurance Case

Figure: Iterative loop from data generation and training of the AI function, via test methods, measures and metrics, to evidence for the Assurance Case; requirements and improvements feed back into data generation and training, and the mitigation effect of the measures is assessed against the metrics.


• Stringent argumentation for the AI function and its Operational Design Domain (ODD).
• Development and validation of testing methods for these metrics.
• Process-related generation of synthetic learning, testing and validation data.
• Development of measures and methods that improve the AI function over a wide array of metrics.

Source: BIT Technology Solutions
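Read as pseudocode, the loop on this slide amounts to: generate synthetic data, train the AI function, test it against the agreed metrics, and feed any shortfall back as requirements for the next round. A minimal Python sketch of one such iteration; make_synthetic_split, train_detector and compute_metrics are hypothetical stand-ins for the project's actual data-generation, training and test tooling:

# Minimal sketch of the data -> training -> test -> evidence loop on this slide.
# make_synthetic_split, train_detector and compute_metrics are hypothetical
# stand-ins for the project's actual tooling, passed in by the caller.

def assurance_iteration(model, targets, make_synthetic_split, train_detector, compute_metrics):
    """One pass: generate data, train, test, and derive evidence plus open improvements."""
    train_data, test_data = make_synthetic_split()   # process-related synthetic data
    model = train_detector(model, train_data)        # (re-)train the AI function
    evidence = compute_metrics(model, test_data)     # metric values back the Assurance Case
    # Metrics still below their target feed back into the next iteration as requirements.
    improvements = {name: value for name, value in evidence.items()
                    if value < targets.get(name, 0.0)}
    return model, evidence, improvements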

Page 11: KI Absicherung

Conceptual approach

1. Provide the AI function for pedestrian detection.

2. Generate synthetic learning, testing and validation data.

3. Develop and evaluate measures and methods for the verification of the AI function.

4. Establish an overall safety strategy for the AI function.

5. Define and implement an Assurance Case.


Page 12: KI Absicherung

Project structure with sub-projects and work packages


Page 13: KI Absicherung

Our Approach: Establishment of a Holistic Safety Strategy


Identification: Identify potential causes of insufficiencies in the function (in KIA: "DNN-related safety concerns")

Quality metric: Introduce metrics or some form of judgment to argue that an insufficiency has been mitigated

Countermeasure: Develop methods to mitigate the insufficiencies

Argumentation: Argue that the residual risk associated with the causes has been reduced to a tolerable level

Formalisation: Create the evidence-based safety argumentation in Goal Structuring Notation (GSN)

Structured safety verification (Assurance Case)
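The final formalisation step links goals via argumentation strategies down to evidence items. A minimal Python sketch of such a Goal Structuring Notation style structure, using simplified node kinds rather than the notation tooling actually used in the project:

# Minimal sketch of a GSN-style assurance structure with simplified node kinds;
# illustrative only, not the project's actual GSN tooling.
from dataclasses import dataclass, field
from typing import List


@dataclass
class GsnNode:
    kind: str                      # "Goal", "Strategy", "Solution" (evidence), "Context", ...
    statement: str
    children: List["GsnNode"] = field(default_factory=list)

    def supported(self) -> bool:
        """A node counts as supported once every leaf below it is a Solution (evidence)."""
        if not self.children:
            return self.kind == "Solution"
        return all(child.supported() for child in self.children)


# Tiny example: one safety goal, argued over an identified DNN-related safety concern.
goal = GsnNode("Goal", "Residual risk of missed pedestrians is tolerable", [
    GsnNode("Strategy", "Argue over identified DNN-related safety concerns", [
        GsnNode("Goal", "Adversarial errors are mitigated", [
            GsnNode("Solution", "Adversarial robustness metric above agreed threshold"),
        ]),
    ]),
])
print(goal.supported())  # True once every branch ends in evidence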

Page 14: KI Absicherung

Specification


Figure: System architecture and function for camera-based pedestrian detection under the specified surrounding conditions: pre-processing of the sensor output, the pedestrian-detection DNN, post-processing of the DNN output, a supervisor/plausibility check, and a monitor. (Source: Bosch)
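The supervisor/plausibility check sits between the DNN output and downstream consumers and rejects implausible detections. A minimal Python sketch of such a gate; the detection format and the thresholds are illustrative assumptions, not the interfaces of the architecture shown here:

# Minimal sketch of a supervisor / plausibility check on DNN detections.
# Detection format and thresholds are illustrative assumptions.
from typing import Dict, List

def plausibility_check(detections: List[Dict], image_height: int,
                       min_confidence: float = 0.3,
                       max_box_height_ratio: float = 0.9) -> List[Dict]:
    """Drop detections that violate simple plausibility rules before they are used."""
    plausible = []
    for det in detections:               # det: {"box": (x1, y1, x2, y2), "score": float}
        x1, y1, x2, y2 = det["box"]
        if det["score"] < min_confidence:
            continue                     # too uncertain to act on
        if x2 <= x1 or y2 <= y1:
            continue                     # degenerate box
        if (y2 - y1) / image_height > max_box_height_ratio:
            continue                     # implausibly large pedestrian
        plausible.append(det)
    return plausible

# Example: one valid detection, one degenerate box that gets filtered out.
dets = [{"box": (100, 200, 140, 320), "score": 0.8},
        {"box": (50, 50, 40, 40), "score": 0.9}]
print(plausibility_check(dets, image_height=720))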

Page 15: KI Absicherung

AI function: Pedestrian detection

Within the system architecture and its surrounding conditions, the pedestrian-detection function is provided in several variants by the partners:

• Semantic segmentation (Intel)
• 2D bounding box detection (Opel)
• Instance segmentation (ZF)
• 3D bounding box detection (BMW)
• 3D pose estimation (Uni Heidelberg)
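For the bounding-box variants, test metrics are typically built on the intersection over union (IoU) between predicted and ground-truth boxes, from which true positives and missed pedestrians (false negatives) are counted. A small self-contained Python sketch; the 0.5 matching threshold is an illustrative assumption:

# Minimal sketch: IoU between predicted and ground-truth pedestrian boxes,
# the usual basis for counting detections and missed pedestrians.
# The 0.5 matching threshold is an illustrative assumption.

def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

def missed_pedestrians(ground_truth, predictions, threshold=0.5):
    """Ground-truth boxes with no prediction above the IoU threshold (false negatives)."""
    return [gt for gt in ground_truth
            if all(iou(gt, pred) < threshold for pred in predictions)]

print(missed_pedestrians([(10, 10, 50, 120)], [(12, 15, 48, 118)]))  # [] -> detected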

Page 16: KI Absicherung

ML lifecycle

Figure: The ML lifecycle around the AI function: ML data, model design, pre-processing, training, post-processing, test/V&V incl. data, monitoring and maintenance; here with the focus on ML data, model design and training. (Sources: BIT Technology Solutions, Mackevision, Volkswagen AG)

Page 17: KI Absicherung

ML lifecycle: Validation data

Figure: The same ML lifecycle (ML data, model design, pre-processing, training, post-processing, test/V&V incl. data, monitoring, maintenance), here with the focus on test/V&V and the assurance (validation) data. (Source: Mackevision)

Page 18: KI Absicherung

Identify, measure and counteract DNN-specific safety concerns

DNN-specific safety concern:
• Adversarial error: leads to overlooking a pedestrian (false negative)

Method (Neurocat):
• Systematic analysis of adversarial errors
• Methods to evaluate adversarial resilience
• Counter mechanisms during training or operation
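One way to evaluate adversarial resilience is to measure how far performance drops under a simple gradient-based perturbation such as the fast gradient sign method (FGSM). A minimal PyTorch-style sketch for a classification-style output; this is an illustrative probe, not the analysis tooling used in the project:

# Minimal FGSM-style probe of adversarial resilience, assuming a generic
# PyTorch model with a scalar loss; illustrative sketch only.
import torch

def fgsm_perturb(model, images, labels, loss_fn, epsilon=0.01):
    """Return images perturbed in the direction that increases the loss."""
    images = images.clone().detach().requires_grad_(True)
    loss = loss_fn(model(images), labels)
    loss.backward()
    perturbed = images + epsilon * images.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()

def adversarial_accuracy(model, images, labels, loss_fn, epsilon=0.01):
    """Share of samples still classified correctly after the FGSM perturbation."""
    adv = fgsm_perturb(model, images, labels, loss_fn, epsilon)
    with torch.no_grad():
        predictions = model(adv).argmax(dim=1)
    return (predictions == labels).float().mean().item()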

DNN-specific safety concern:
• False positive / false negative: pedestrian detection is incorrect or not robust enough

Method:
• Assessment of uncertainty: stochastic evaluation of a multitude of model variations (Monte Carlo Dropout)
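Monte Carlo Dropout keeps the dropout layers active at inference time, so repeated forward passes yield a distribution of predictions whose spread serves as an uncertainty estimate. A minimal PyTorch sketch for a classification-style output, illustrative only:

# Minimal Monte Carlo Dropout sketch: keep dropout active at inference time and
# aggregate several stochastic forward passes into a mean prediction and a
# simple per-sample uncertainty estimate. Illustrative only.
import torch

def mc_dropout_predict(model, images, num_samples=20):
    model.eval()
    # Re-enable dropout layers only, so e.g. batch norm stays in eval mode.
    for module in model.modules():
        if isinstance(module, torch.nn.Dropout):
            module.train()
    with torch.no_grad():
        samples = torch.stack([torch.softmax(model(images), dim=1)
                               for _ in range(num_samples)])
    mean_probs = samples.mean(dim=0)             # averaged class probabilities
    uncertainty = samples.var(dim=0).sum(dim=1)  # predictive variance per sample
    return mean_probs, uncertainty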

Page 19: KI Absicherung

AI-Specific Evidence-Based Safety Argumentation

Figure: Overview of the evidence-based safety argumentation. A safety and risk analysis (HARA, GSA, …), guided by safety standards (ISO 26262, SOTIF, …), yields hazards/risks, safety goals and safety requirements. The Assurance Case (in GSN) combines assumptions/context with evidence strategies for DNN insufficiencies (e.g. generalisation) and with evidence drawn from safety methods, best practices, safety measures and metrics (performance & quality) addressing DNN-specific safety concerns via data, DNN and metrics. The AI-based function module itself is described by its specification (ODD, ground context, (AI) function, system architecture, functionality) and its ML lifecycle (ML data, model design, pre-processing, model training, post-processing, test/V&V incl. validation data, monitoring, maintenance), to which data measures, DNN measures, architecture measures and testing measures with associated metrics are applied.

Page 20: KI Absicherung

Parallel Activities – Available for other committees: Bringing our results "on the road"

• Establishing contacts with TÜV
  • VDTÜV meeting with the BSI on AI safety
  • TÜV@Conference 2020 – Meet the Expert
• Collaboration with "Certified AI"
  • KI.NRW initiative driven by Fraunhofer IAIS and the BSI (starting in 2021)
  • Project letter of intent for collaboration
• Contributions to the "DIN Roadmap AI"
  • Leading the "Quality and Certification" workgroup (Fraunhofer IAIS)
  • Contributing to the "Mobility" workgroup (BMW)


Page 21: KI Absicherung

Project coordination: Dr. Stephan Scholz, Volkswagen AG

Co-lead and scientific coordination: PD Dr. Michael Mock, Fraunhofer IAIS

Email: [email protected]