
DOCUMENT FRAUD DETECTION AT THE BORDER

PRELIMINARY OBSERVATIONS ON HUMAN AND MACHINE PERFORMANCE

Monica Gariup, Gustav Soederlind

Workshop on Innovation in Border Control (WIBC 2013) - 14 August 2013


Contents

1. Research Context: Frontex, Risks and IDCHECK2012 (Who are we and why do we care?)
2. State of the Art, Background Questions, and Research Rationale (Why this research?)
3. Document Challenge Exercise: Approach and Methodology (What did we do?)
4. Observations and Lessons Learned (What did we learn?)
5. Future Activities (What will we do next?)


1. Research Context: Frontex, Risks and IDCHECK2012


FRONTEX

The European Agency for the Management of Operational Cooperation at the External Borders of the Member States of the EU (Council Regulation (EC) No 2007/2004 of 26 October 2004)

Mission: "Intelligence-driven operational cooperation at EU level to strengthen security at external borders"

Tasks:
● carry out risk analyses;
● coordinate operational cooperation:
  ◊ joint operations
  ◊ support with technical and operational assistance
● assist EU Member States with training their border guards (Common Core Curriculum);
● contribute to, follow up and disseminate to the Commission and Member States developments in research relevant for the control and surveillance of external borders.

What Frontex does not do:
● does not execute border control/surveillance itself
● does not cover customs' area of responsibility (focus on movement of persons only)

Changes:
● Lisbon Treaty and crime prevention;
● new regulation and enlarged mandate (own equipment; limited processing of personal information)


COMMON EXTERNAL BORDERS + INTERNAL FREEDOM OF MOVEMENT

• 4,300,000 km2 (1,660,000 mi2)
• Covering a population of 400 million
• 800 million border crossings per year
• 50,000 km of sea borders (30,000 miles)
• 10,000 km of land borders (6,000 miles)
• 1,782 official BCPs (POEs): 665 air, 871 sea, 246 land

2013: CROATIA


Document Fraud as a high-priority issue (ARA 2012)

● Sophistication of document falsification/counterfeiting;
● New techniques to circumvent biometric checks;
● Use of genuine documents by impostors (impersonation, stolen identities);
● Abuse of genuine supporting documents to obtain genuine travel documents (abuse of legal channels and fraudulently obtained documents);
● Abuse of supporting documents to justify the purpose of stay;
● The number of detected document fraudsters increasingly under-represents the overall extent of the phenomenon;
● Fraud expected mostly among EU travel documents (TRUSTED? MINIMUM CHECK)


Technology and the Human Factor as Potential Risks (ARA 2012)

● Risk of less effective border control due to greater reliance on technological equipment (triggered by reduced resources and increased passenger flows);
● A decreasing role for human intervention, which may increase the risk that a new modus operandi passes undetected and reduces the possibility to collect human intelligence.


IDCHECK 2012 Project

● Capacity Building Division – Research and Development Unit
  ◊ Long-term objective: improve the capacity to detect document and identity fraud and to improve tactical/operational risk assessment in the FIRST LINE of border control
  ◊ Areas of research activities:
    – Document Fraud (focus on forged/counterfeit documents)
    – Identity Fraud (focus on impostors, fraudulently acquired genuine documents)
    – Risk Assessment (focus on interviewing, behavioral and credibility assessment)


2. State of the Art, Background Questions, and Research Rationale


BACKGROUND QUESTIONS: Why do we need to improve detection capabilities?

1. KNOWN KNOWN: How many/what types of fraudulent documents are detected at our border crossing points?
   – European Document Fraud Project: collection and statistical analysis of data on detected fraudulent documents (circa 22k detections per year)
2. KNOWN UNKNOWN: How many/what types of fraudulent documents are NOT detected at our border crossing points, and WHY? (WHAT ARE WE MISSING AND WHY, for a comprehensive situational picture)


THE DETECTION CHALLENGE

EDF:
• What is the meaning of the numbers? How big is the iceberg? (See the sketch below.)
• WHAT AFFECTS THE DETECTION RATE? The total detection rate is a function of:
  a) capabilities' performance in the first line (human; technology/equipment);
  b) time available for the check;
  c) total fraudulent documents in circulation (constant – unknown);
  d) total number of documents/persons passing through the BCP.

The risk definition (what causes the risk) has an IMPACT on the identification of the COUNTERMEASURE! (see example above)

EDF detections by fraud type:

                    2008   2009   2010   Jan-Aug 2011
Forgeries            249    182    238    143
Alterations          510    228    261    167
Blank stolen docs     28     15      9     16
Impostors             94     87     46     31
Other                  7      1      8      6
Total                888    513    562    363
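To make the detection-rate bullet above concrete, here is a minimal back-of-the-envelope sketch. Only the order of magnitude of roughly 22k detections per year comes from the slides; the functional form and the assumed detection probability are illustrative assumptions, not Frontex figures.

```python
# Illustrative sketch only: a toy reading of "total detections are a function of
# first-line capability, time, fraud in circulation and total traffic".
# All parameter values below are assumptions, not Frontex data.

def expected_detections(p_first_line, fraud_in_circulation):
    """Expected detections if each fraudulent document presented at a BCP is caught
    with probability p_first_line (capability and available time folded into one number)."""
    return p_first_line * fraud_in_circulation

def implied_fraud_in_circulation(observed_detections, p_first_line):
    """Invert the relation: how large would the 'iceberg' have to be to produce the
    observed detections, given an assumed first-line detection probability?"""
    return observed_detections / p_first_line

# With ~22,000 detections per year (the EDF order of magnitude) and a purely
# hypothetical detection probability of 0.25, the implied total would be ~88,000
# fraudulent documents in circulation per year.
print(implied_fraud_in_circulation(22_000, 0.25))  # 88000.0
```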


Document inspection as only one component of the border check task

10 seconds per passenger? What triggers detection? Officer + equipment.


DOCUMENT INSPECTION CAPABILITIES: USE CASES (restated as a small data model below)

1. HUMAN ONLY – Manual inspection by border control officers without the use of technical support.
   Strengths: physical checks (including booklet and stamps).
   Weaknesses: no optical, no electronic, no databases, no biometrics.

2. HUMAN-MACHINE – Manual inspection by border control officers with the support of computers, optical readers and watch-lists.
   Strengths: physical, optical, databases.
   Weaknesses: no electronic, no biometrics.

3. HUMAN-MACHINE ++ – Manual inspection by border control officers with the support of computers, optical readers, checks against watch-lists, and infrastructure for reading electronic passports.
   Strengths: physical, optical, electronic, databases, biometrics possible.

4. MACHINE ONLY – Fully automated gates for document authentication and biometric verification.
   Strengths: optical, electronic, databases, biometrics, speed.
   Weaknesses: no physical checks (smell, touch and feel).
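As an aside, the four configurations above can be restated as a small data model; the boolean capability flags below simply mirror the strengths/weaknesses listed and are not an official taxonomy (biometrics in use case 3 is marked as available because the slide lists it as "possible").

```python
# Restatement of the four use cases above as a small data model; the flags mirror
# the strengths/weaknesses list and are not an official taxonomy.
from dataclasses import dataclass

@dataclass(frozen=True)
class UseCase:
    name: str
    physical: bool     # booklet, stamps, smell, touch and feel
    optical: bool
    electronic: bool
    databases: bool
    biometrics: bool

USE_CASES = [
    UseCase("HUMAN ONLY",       physical=True,  optical=False, electronic=False, databases=False, biometrics=False),
    UseCase("HUMAN-MACHINE",    physical=True,  optical=True,  electronic=False, databases=True,  biometrics=False),
    UseCase("HUMAN-MACHINE ++", physical=True,  optical=True,  electronic=True,  databases=True,  biometrics=True),
    UseCase("MACHINE ONLY",     physical=False, optical=True,  electronic=True,  databases=True,  biometrics=True),
]

# Example query: which configurations retain the physical check?
print([u.name for u in USE_CASES if u.physical])  # ['HUMAN ONLY', 'HUMAN-MACHINE', 'HUMAN-MACHINE ++']
```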


Problem of FUNCTION ALLOCATION in the border check decision-making process:

● How do we decide which tasks people should perform, and which tasks machines should perform, in a HYBRID human-machine SECURITY system?
● How to distribute functionalities if you do not know the error rate, performance and limitations of the HUMAN – the Human Error Probability (HEP) and False Acceptance Rate (FAR)? (See the sketch below.)
● What affects the performance of border guards (physiological, psychological)?
● How can human and machine cooperate in an efficient way?
  ◊ Issue of trust and confidence in the machine (over-reliance/mistrust): how to ensure that the human is in full control of the decision he/she needs to take?
  ◊ Human motivation
● How do you train the border guard (the HUMAN) for the new tasks? (attention, cognition, workload, decision-making)
● How to mitigate the new (and old) RISKS introduced in the process? (attempts to defeat the machine/the human)
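To give the HEP/FAR question above a concrete shape, the following is a minimal sketch of combined error rates for one particular allocation (the officer re-checks every document the machine accepts), under the strong and debatable assumption that human and machine errors are independent; all numbers are hypothetical.

```python
# Minimal sketch, not from the presentation: combined error rates for one possible
# function allocation (the officer checks every document the machine accepts),
# assuming independent human and machine errors. All rates are hypothetical.

def combined_miss_rate(hep, machine_far):
    """A fraudulent document slips through only if the machine falsely accepts it
    (machine_far) AND the officer also misses it (hep)."""
    return machine_far * hep

def combined_referral_rate(human_frr, machine_frr):
    """A genuine document is referred to the second line if the machine OR the
    officer rejects it."""
    return 1 - (1 - machine_frr) * (1 - human_frr)

print(combined_miss_rate(hep=0.30, machine_far=0.05))            # 0.015
print(combined_referral_rate(human_frr=0.10, machine_frr=0.02))  # ~0.118
```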


3. Document Challenge Exercise: Approach and Methodology


Document Challenge Exercise 2012

● AIM:
  ◊ To collect preliminary observations
  ◊ To initiate a fact-based discussion on the performance of document inspection capabilities deployed in the first line of border control
● APPROACH:
  ◊ Action research (learning by doing)
  ◊ Maintain an empirical (evidence-driven) and critical analytical approach
  ◊ End-user driven
  ◊ Develop synergies with other areas of Frontex operations


Simulation of document inspection in the first line to better understand:

● How (method) and how well (performance) human experts verify document authenticity (physical and optical) under time constraints and without significant technical help;
● How (method) and how well (performance) automated document inspection systems (also referred to here as "machines") verify document authenticity (physical and optical) under time constraints;
● Whether machines significantly outperform human experts;
● What types of errors humans and machines make;
● What weaknesses/vulnerabilities are intrinsic to human and machine document verification taken in isolation, and how these vulnerabilities can be mitigated in the future.

NB: Artificial decoupling from identity verification, profiling ...!


Document Challenge Exercise 2012

● Designed and conducted in cooperation with:
  ◊ UK National Document Fraud Unit
  ◊ NL Royal Marechaussee Border Security Training Centre
  ◊ Frontex Training Unit
● One-day exercise back-to-back with the Frontex Document Specialist Board meeting: 28 June 2012, Amsterdam Schiphol
● Participants:
  ◊ 26 document experts from 24 EU MS
  ◊ 104 genuine and forged/counterfeited (global) passports collected from MS participants
  ◊ 10 document inspection system providers (responded to the public call; all accepted)


Document Challenge Exercise I – Participants

◊ 26 document experts from 24 EU MS, members of the Frontex Document Specialists Board (info on education, experience etc. collected)
◊ 19 genuine and 85 real forged/counterfeited (global: 62 EU) passports supplied by MS participants: 104 documents in total
◊ 10 document inspection system providers (responded to the public call; all accepted) (technical specifications + info on education, experience of the operator etc. collected)

Company – Equipment (model/version used) – Level of automation of reference-document comparison:
● Securitech – B5000 Passport Reader – Fully automated, e.g. RED/GREEN diode type
● Keesing Reference Systems – Keesing ID Authentiscan + ARH PRMc 233 RL – Combination of B, C, D
● Foster & Freeman – VSC-QC1 – Manual comparison of document against reference document from database
● Grabba – S-8472b-sig-wsq – No database available
● ID Scan Biometrics – IDSCAN BORDER – Fully automated, e.g. RED/GREEN diode type
● Inspect & Detect – ARH PRMC 143 RPLE – Fully automated, e.g. RED/GREEN diode type
● Bundesdruckerei – VISOTEC EXPERT 600 + VISOCORE INSPECT – Fully automated, e.g. RED/GREEN diode type
● Regula – Regula Model 7034.111 – Fully automated, e.g. RED/GREEN diode type
● AU10TIX – 3M AT9000 + Laptop – Fully automated, e.g. RED/GREEN diode type
● TrustID – 3M AT9000 + Veriscan Software – Combination of B, D


Exercise for Document Experts
- 15 seconds per document
- self-collection of information (genuine/not, forgery type, confidence level)


Exercise for Document Readers
- 15 seconds per document
- self-collection of information (genuine/not, forgery type, detection point, confidence level); a possible record layout is sketched below
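For illustration only, a record layout like the following could hold the self-collected information described in the two exercise set-ups above; the field names are hypothetical and do not reproduce the actual answer sheets used at Schiphol.

```python
# Hypothetical record layout for the self-collected information; field names are
# illustrative and do not reproduce the actual answer sheets.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Verdict:
    participant_id: str              # document expert or inspection system
    document_id: str                 # one of the 104 passports
    judged_genuine: Optional[bool]   # None = no answer within the 15 seconds ("Blank")
    forgery_type: Optional[str]      # e.g. "counterfeit", "alteration"
    detection_point: Optional[str]   # for readers: which check raised the alarm
    confidence: Optional[int]        # self-reported confidence level
    is_genuine: bool                 # ground truth (19 genuine, 85 false documents)

def outcome(v: Verdict) -> str:
    """Score a verdict as Correct / Wrong / Blank, as in the result charts that follow."""
    if v.judged_genuine is None:
        return "Blank"
    return "Correct" if v.judged_genuine == v.is_genuine else "Wrong"
```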


4. Observations and Lessons Learned


Observations and surprises

Human:
● Used mainly a magnifier, rarely UV
● Time constraint as a challenge
● The majority did not check the booklet
● Bias against genuine documents (genuine docs suspected as false) – probably because of second-line experience?
● Easy to detect:
  ◊ Watermarks were missing or poorly simulated;
  ◊ Inkjet had been used for the background, sometimes quite poorly;
  ◊ Tactile features were missing;
  ◊ Evidence of manipulation was "obvious" by looking at it, e.g. the outline of the new/old image was visible.

Machine:
● High proportion of BG/Police operators: what part of the result is from the machine, and what is from operator experience?
● Time constraint as a challenge
● Software reboots
● Multiple menu actions required
● Stops at an event, e.g. a date (cascade)
● Results depend on hardware, software, template databases and the operator

Document Challenge – Experience

[Charts: number of Correct / Wrong / Blank responses for Experts and Industry, each split by participants with and without first-line experience; vertical axis 0–80.]

Document Experts

[Chart: documents (vertical axis 0–120) judged Correct / Wrong / Blank by each of the 26 experts.]

Document Challenge – Inspection Systems

[Chart: documents (vertical axis 0–120) judged Correct / Wrong / Blank by each of the 10 inspection systems.]

Document Challenge – Experts versus Machines

[Chart: Correct / Wrong / Blank results (vertical axis 0–90) for Document Experts versus Industry Participants.]


Weaknesses

● Little time available for planning (the event was initially conceived as a demo)
● Lack of control over the choice of participants, documents, time, etc.
● Self-made collection of information because of lack of time/resources: reliability?
● Results in NO WAY representative or statistically meaningful
● ...


Lessons learned for the future

● Elaborate formal metrics to evaluate performance (false acceptance/false rejection) and establish objective methods to rigorously collect all available data (e.g. time needed, accuracy); a possible starting point is sketched below;
● Involve a larger number of first-line border control officers and thoroughly survey their background/skills/performance;
● Test the equipment and record results without the intervention of industry/expert operators;
● Compare the results with a full forensic analysis of the documents under examination;
● Include more genuine documents in the exercise to better reflect the ratio of genuine to false documents that first-line officers experience in real life.
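As a possible starting point for the first bullet, false acceptance and false rejection rates could be computed from scored verdicts roughly as follows; these definitions are an assumption about how such metrics might be set up, not an agreed Frontex methodology.

```python
# Sketch of possible formal metrics (false acceptance / false rejection / blank rate);
# the definitions are illustrative, not an agreed methodology.

def error_rates(results):
    """results: list of (judged_genuine, is_genuine) pairs, with judged_genuine set to
    None when the participant left the document blank.
    Returns (false_acceptance_rate, false_rejection_rate, blank_rate)."""
    results = list(results)
    answered = [(j, g) for j, g in results if j is not None]
    false_docs = [(j, g) for j, g in answered if not g]
    genuine_docs = [(j, g) for j, g in answered if g]

    far = sum(1 for j, _ in false_docs if j) / len(false_docs) if false_docs else 0.0
    frr = sum(1 for j, _ in genuine_docs if not j) / len(genuine_docs) if genuine_docs else 0.0
    blank = sum(1 for j, _ in results if j is None) / len(results) if results else 0.0
    return far, frr, blank

# Toy data: three false documents (one accepted, one left blank) and two genuine
# documents (one rejected).
sample = [(True, False), (False, False), (None, False), (False, True), (True, True)]
print(error_rates(sample))  # (0.5, 0.5, 0.2)
```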


Achievements of the Exercise

● Demonstrated to border control end-users how the document readers/verification equipment currently on the market works (including usability, performance, etc.);
● Gathered preliminary information and hands-on observations on inspection systems and human performance;
● Raised the awareness of border control end-users of the importance (and complexity) of the tasks (here in particular document inspection) that first-line officers are called to perform;
● Offered industry an occasion to:
  ◊ present their products in "speed-dating"-style presentation rounds to an audience of experts;
  ◊ test their products in a fast-paced simulated operational setting;
  ◊ better understand the functional/operational needs of the first line of border checks;
  ◊ interact and discuss with border control end-users.


5. Future Activities


● The IDCHECK2013 End-Users Advisory Board (EU MS) identified testing of document inspection systems and study of the human factor as a priority
● Experts identified the need for:
  ◊ a European survey of what equipment is in use, its capabilities, databases, etc.
  ◊ research on / technical tests of document inspection capabilities
  ◊ research on the operational use of document inspection capabilities (e.g. red/green issues and the human factor)


Document Challenge II

● Lisbon (second half of September)
● Call for industry participation open until 19 August
● Document experts from MS participating in a Frontex-coordinated Joint Operation (first-line officers) + document experts (second/third line)
● In cooperation with:
  ◊ PT SEF
  ◊ UK National Document Fraud Unit
  ◊ NL Royal Marechaussee
  ◊ DE Forensic Science Institute (Bundeskriminalamt)
● http://www.frontex.europa.eu/news/document-challenge-ii-invitation-to-document-inspection-system-hardware-manufacturers-F9MRgU


Monica Gariup
Research & Development Unit
Frontex
Rondo ONZ 1, 00-124 Warsaw, Poland
Tel: +48 22 205 96 19
Fax: +48 22 205 95 01
[email protected]

Thank you for your attention!