2020 Presentation at UM - TECoSA


Transcript of 2020 Presentation at UM - TECoSA

Human-Robot Collaborative Assembly: The State of the Art and Latest Advancement

An Invited Talk on

Presentation Outline


Background and definition of human-robot collaboration (HRC)

Safety standards and active collision avoidance in HRC assembly

Dynamic task planning and on-demand job dispatching

Adaptive and programming-free robot control

In-situ operator support in a changing HRC assembly environment

Challenges and future research directions


Why Human-Robot Collaboration


Key Requirements:

Safe environment
Programming-free
Plug-and-produce
User-friendly

(Adapted from the FP6 SME Robotics project's survey of 482 SMEs)


Scope of the Presentation

[Diagram] A safe HRC environment for human–robot collaboration, built on symbiotic communication (auditory, visual, haptic) and spanning five research areas (R1–R5):

Smart Sensor Network: 3D models of the structured environment; point-cloud images of human workers

Active Collision Avoidance

Dynamic Task Planning: dynamic task planning/re-planning; supervisory control

Adaptive Robot Control: IEC 61499 function blocks compliant; smart algorithms encapsulation; adaptation via event-driven behaviours; zero robot programming to end users; multimodal programming in real time

Mobile Worker Assistance: location-aware, task-specific, multimodal in-situ assistance; decision support; intuitive support; portal for end users

Classification of Human–Robot Relationships


Four types of human–robot relationship – coexistence, interaction, cooperation and collaboration – are distinguished by whether the workspace, the working task and the resources are shared, whether there is direct contact, and whether the human and robot processes run simultaneously or sequentially.

Reference Wang XV, Seira A, Wang L (2018) Classification, personalised safety framework and strategy for human-robot collaboration. Proceedings of CIE48, Vol. 3, pp. 2302–2311.
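To make the taxonomy concrete, here is a minimal sketch that classifies a work cell from boolean attributes; the attribute-to-category rules are illustrative assumptions, not the exact criteria of the cited paper.

```python
from dataclasses import dataclass

@dataclass
class CellAttributes:
    shared_workspace: bool   # human and robot occupy the same space
    direct_contact: bool     # physical contact is intended
    shared_task: bool        # both work on the same task
    simultaneous: bool       # both act at the same time

def classify(cell: CellAttributes) -> str:
    """Map attributes to a relationship category (illustrative rules only)."""
    if not cell.shared_workspace:
        return "coexistence"        # separate spaces, no interplay
    if not cell.shared_task:
        return "interaction"        # shared space, different tasks
    if not (cell.simultaneous and cell.direct_contact):
        return "cooperation"        # same task, but sequential or contact-free
    return "collaboration"          # same task, same time, direct contact

print(classify(CellAttributes(True, True, True, True)))  # -> collaboration
```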


Characteristics of HRC Assembly

The capability to track an operator's position allows the robot to be controlled to support the operator as needed.

An operator's thoughts can be converted into robot control commands.


Key Properties of HRC Assembly


Multiplicity · Initiative

Reference Wang XV, Kemény Z, Váncza J, Wang L (2017) Human–Robot Collaborative Assembly in Cyber-Physical Production: Classification Framework and Implementation. CIRP Ann Manuf Technol 66(1):5–8.


Definition of Symbiotic HRC


Symbiotic human robot collaboration places the interplay of human and robot into a cyber-physical environment where human and robot interact in a shared work environment to solve complex tasks which require the combination of their best, complementing competencies.

A symbiotic HRC system possesses the skills and abilities of perception, processing, reasoning, decision making, adaptive execution, mutual support and self-learning through real-time multimodal communication for context-aware human-robot collaboration.


Reference L. Wang, R.X. Gao, J. Váncza, J. Krüger, X.V. Wang, S. Makris and G. Chryssolouris, "Symbiotic Human-Robot Collaborative Assembly," CIRP Annals – Manufacturing Technology, Vol.68, No.2, pp.701–726, 2019.

AI as the Foundation of HRC


Timeline phases: The Rise of AI → Dark Age / AI Winter → AI Becomes a Science. AI R&D activities, 1943–2000s:

1943: Binary ANN model, Warren McCulloch & Walter Pitts
1950: Turing Test – Can machines think?, Alan Turing
1951: 1st neuron computer, Marvin Minsky & Dean Edmonds
1956: 1st AI workshop, John McCarthy
1958: LISP – 1st AI language, John McCarthy
1961: GPS, Allen Newell & Herbert Simon
1965: Fuzzy sets, Lotfi Zadeh
1969: DENDRAL, Edward Feigenbaum, Bruce Buchanan & Joshua Lederberg
1970: ANN learning, Bryson & Ho
1975: Genetic algorithm, John Holland
1976: MYCIN, Stanford University
1982: Hopfield net, John Hopfield
1986: Back-propagation and start of DAI, D.E. Rumelhart & J.L. McClelland
1987: Fuzzy logic-based electronics, Japan
1992: Genetic programming, John Koza
1995: Intelligent agents
1997: Hybrid AI systems
Since 2002: ACO, PSO, AIO, DNA computing …

Reference L. Wang, "From Intelligence Science to Intelligent Manufacturing," Engineering, Vol.5, No.4, pp.615–618, 2019.

Earlier AI Applications


Garry Kasparov playing Deep Blue in 1997

Honda ASIMO walking downstairs in 2005

AlphaGo in 2016


AlphaGo vs. Lee Sedol

DeepMind servers in the USA

During the legendary matches: Google cloud servers in the USA used 1,920 CPUs, 280 GPUs and 64 search threads. Big data: 30 million moves. Reinforcement learning and Monte Carlo tree search combined with deep neural networks for decision making.

Self-Learning of AlphaGo Zero


Human Tracking and Detection



Reference Meguenani A, et al. (2016) Energy based control for safe human-robot physical interaction. Int. Symp. Exp. Robot, pp 809–818.
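The human-tracking figure on this slide was lost in extraction. As a minimal, hypothetical sketch (not the cited authors' method), the snippet below segments a person from a depth image by background subtraction and back-projects the pixels into 3D points for tracking; the camera intrinsics and threshold are assumed values.

```python
import numpy as np

def detect_human(depth, background, fx=525.0, fy=525.0, cx=319.5, cy=239.5,
                 min_diff=0.05):
    """Return 3D points (metres) of pixels that differ from the static
    background by more than min_diff – a crude human/foreground detector."""
    mask = (np.abs(depth - background) > min_diff) & (depth > 0)
    v, u = np.nonzero(mask)                 # pixel coordinates of the human
    z = depth[v, u]
    x = (u - cx) * z / fx                   # back-project with a pinhole model
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)      # N x 3 point cloud

# Toy usage: a flat background at 3 m with a "person" patch at 1.5 m
bg = np.full((480, 640), 3.0)
frame = bg.copy()
frame[200:400, 300:360] = 1.5
points = detect_human(frame, bg)
print(points.shape, points[:, 2].mean())    # ~ (12000, 3) at ~1.5 m
```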


Active Collision Avoidance

Reference Schmidt B, Wang L (2014) Depth camera based collision avoidance via active robot control. J Manuf Syst 33(4):711–718.
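A minimal sketch of the underlying idea (not the exact controller of the cited paper): the robot tool is pushed away from the closest human point whenever the separation distance falls below a warning threshold; the distances, gain and maximum retreat speed are illustrative assumptions.

```python
import numpy as np

def avoidance_velocity(tcp, human_points, d_warn=0.8, d_stop=0.3, v_max=0.25):
    """Return a Cartesian velocity command (m/s) that retreats from the
    closest human point; full retreat inside the stop radius."""
    if len(human_points) == 0:
        return np.zeros(3)
    diffs = tcp - human_points               # vectors from human points to TCP
    dists = np.linalg.norm(diffs, axis=1)
    i = np.argmin(dists)
    d = dists[i]
    if d >= d_warn:                          # far away: no correction needed
        return np.zeros(3)
    direction = diffs[i] / (d + 1e-9)        # unit vector away from the human
    if d <= d_stop:                          # too close: full-speed retreat
        return v_max * direction
    scale = (d_warn - d) / (d_warn - d_stop) # ramp up as the human approaches
    return v_max * scale * direction

tcp = np.array([0.6, 0.0, 0.5])
human = np.array([[0.9, 0.1, 0.6], [1.5, 0.0, 0.4]])
print(avoidance_velocity(tcp, human))        # small retreat away from the hand
```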


Four Safety Policies
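The four policies themselves are not preserved in the extracted slide text; in the vocabulary of ISO 10218 / ISO/TS 15066 they would typically correspond to safety-rated monitored stop, hand guiding, speed and separation monitoring, and power and force limiting. The sketch below shows how a speed-and-separation-monitoring style policy might pick an operating mode from the measured human–robot distance; the thresholds are illustrative assumptions, not normative values.

```python
def select_safety_mode(separation_m: float):
    """Pick an operating mode and a speed scaling factor from the measured
    human-robot separation (illustrative thresholds only)."""
    if separation_m < 0.5:
        return "safety-rated monitored stop", 0.0  # human too close: stand still
    if separation_m < 1.5:
        # scale speed linearly between the stop and full-speed distances
        return "speed and separation monitoring", (separation_m - 0.5) / 1.0
    return "full speed", 1.0

for d in (0.3, 0.9, 2.0):
    mode, scale = select_safety_mode(d)
    print(f"{d:.1f} m -> {mode} (speed x{scale:.2f})")
```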

Dynamic Planning and Dispatching


Reference Michalos G, et al. (2014) ROBO-PARTNER: Seamless human-robot cooperation for intelligent, flexible and safe operations in the assembly factories of the future. Procedia CIRP 23:71–76.
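As a minimal illustration of on-demand job dispatching (a greedy sketch under assumed capability and duration data, not the ROBO-PARTNER method), each incoming task is assigned to the capable resource – human or robot – that becomes free first.

```python
# Greedy dispatcher: each task goes to the capable resource that frees up first.
tasks = [
    {"name": "pick housing",   "who_can": {"robot", "human"}, "duration": 8},
    {"name": "insert bearing", "who_can": {"robot"},          "duration": 12},
    {"name": "route cable",    "who_can": {"human"},          "duration": 15},
    {"name": "fasten screws",  "who_can": {"robot", "human"}, "duration": 10},
]

free_at = {"human": 0, "robot": 0}            # time each resource becomes free
schedule = []
for task in tasks:                            # tasks arrive "on demand", in order
    candidates = sorted(task["who_can"], key=lambda r: free_at[r])
    chosen = candidates[0]                    # earliest-available capable resource
    start = free_at[chosen]
    free_at[chosen] = start + task["duration"]
    schedule.append((task["name"], chosen, start, free_at[chosen]))

for name, who, start, end in schedule:
    print(f"{name:15s} -> {who:5s} [{start:2d}, {end:2d}]")
```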


ML in HRC Assembly Planning



Human action recognition

Part/tool identification

Part/tool identification: recognize what the human operator picks up. An identified part/tool is valid for a "picking up" + "installing" combination.

Deep Learning of Assembly Context

Reference Wang P, Liu H, Wang L, Gao RX (2018) Deep learning-based human motion recognition for predictive context-aware human-robot collaboration. CIRP Ann Manuf Technol 67(1):17–20.
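A compact sketch of deep-learning-based human motion recognition, assuming skeleton-joint sequences as input; the network shape and label set are illustrative assumptions, not the architecture of the cited paper.

```python
import torch
import torch.nn as nn

ACTIONS = ["reach", "pick up", "install", "fasten", "idle"]   # assumed label set

class MotionClassifier(nn.Module):
    """LSTM over a sequence of skeleton frames -> action class logits."""
    def __init__(self, n_joints=18, hidden=64, n_actions=len(ACTIONS)):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_joints * 3, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_actions)

    def forward(self, x):                     # x: (batch, frames, joints*3)
        _, (h, _) = self.lstm(x)              # final hidden state summarises motion
        return self.head(h[-1])               # class logits

model = MotionClassifier()
clip = torch.randn(1, 30, 18 * 3)             # 30 frames of 18 3-D joints
probs = torch.softmax(model(clip), dim=-1)
print(ACTIONS[int(probs.argmax())], probs.max().item())
```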


Prediction: based on cognition and modeling of human behavioral patterns and preferences. Human heterogeneity needs to be accounted for in a probabilistic manner.

Action: safely adapt to the human worker's planned and unplanned interactions.

Example: probabilistic modeling of human action and intention (see the sketch below).
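The equation on the original slide did not survive extraction; the sketch below is a generic stand-in showing one way to estimate action-to-action transition probabilities with a small positive floor so that no intention is ever ruled out entirely – an illustrative assumption, not the cited formulation.

```python
import numpy as np

ACTIONS = ["reach", "pick up", "install", "fasten"]           # assumed action set

def transition_model(observed_sequences, floor=0.05):
    """Estimate P(next action | current action) from observed operator
    sequences, keeping every probability above a positive floor."""
    n = len(ACTIONS)
    counts = np.ones((n, n))                  # Laplace smoothing
    idx = {a: i for i, a in enumerate(ACTIONS)}
    for seq in observed_sequences:
        for a, b in zip(seq, seq[1:]):
            counts[idx[a], idx[b]] += 1
    probs = counts / counts.sum(axis=1, keepdims=True)
    probs = np.maximum(probs, floor)          # positive lower bound
    return probs / probs.sum(axis=1, keepdims=True)

history = [["reach", "pick up", "install", "fasten"],
           ["reach", "pick up", "fasten"]]
P = transition_model(history)
current = "pick up"
nxt = ACTIONS[int(P[ACTIONS.index(current)].argmax())]
print(f"after '{current}' the most likely next action is '{nxt}'")
```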


Prediction for Robotic Assistance

Reference Wang P, Liu H, Wang L, Gao RX (2018) Deep learning-based human motion recognition for predictive context-aware human-robot collaboration. CIRP Ann Manuf Technol 67(1):17–20.


Context-Aware Trajectory Planning


Reference Maeda G, et al. (2017) Phase estimation for fast action recognition and trajectory generation in human–robot collaboration. Int J Rob Res 1–16.
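A toy sketch of the phase-estimation idea (illustrative only, not the cited probabilistic movement-primitive method): the completed fraction of a demonstrated human motion is estimated by matching the partial observation against a reference trajectory, and the remaining robot motion is time-scaled accordingly.

```python
import numpy as np

def estimate_phase(partial_obs, reference):
    """Slide the partial observation along the reference trajectory and return
    the completed fraction (phase in [0, 1]) at the best match."""
    n, m = len(partial_obs), len(reference)
    errors = [np.mean((reference[s:s + n] - partial_obs) ** 2)
              for s in range(m - n + 1)]
    end = int(np.argmin(errors)) + n          # index where the observation ends
    return end / m

# Reference human hand trajectory (1-D for simplicity) and a partial observation
reference = np.linspace(0.0, 1.0, 100) ** 2
partial = reference[:37] + np.random.normal(0, 0.002, 37)
phase = estimate_phase(partial, reference)
print(f"estimated phase: {phase:.2f}")        # ~0.37: robot can start early

# Time-scale the remaining robot motion so it finishes together with the human
remaining_time = (1.0 - phase) * 8.0          # assume the full task takes ~8 s
print(f"robot should complete its assistive motion within {remaining_time:.1f} s")
```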


Robot Assisting Human in Assembly


Typical Machine Learning Models


Machine learning models and their characteristics (supervised / unsupervised / semi-supervised; discriminative / generative; deep / not deep learning):

K-Means Clustering: unsupervised, generative, not deep learning
K-Nearest Neighbours: supervised, discriminative, not deep learning
Support Vector Machine: supervised, discriminative, not deep learning
Hidden Markov Model: supervised, generative, not deep learning
Random Forest: supervised, discriminative, not deep learning
XGBoost: supervised, discriminative, not deep learning
Ensemble Methods: supervised, discriminative, not deep learning
Convolutional Neural Network: supervised, discriminative, deep learning
Recurrent Neural Network: supervised, discriminative, deep learning
Long Short-Term Memory: supervised, discriminative, deep learning
Naive Bayes: supervised, generative, not deep learning
Gaussian Mixture Model: unsupervised, generative, not deep learning
Generative Adversarial Nets: semi-supervised, generative, deep learning

Multimodal Human–Robot Communication


Reference Pavlovic VI, Sharma R, Huang TS (1997) Visual interpretation of hand gestures for human-computer interaction: A review. IEEE Trans Pattern Anal Mach Intell 19(7):677–695.

Gesture-based
Brainwave-driven
Haptic guidance
Voice processing


Gesture-Based Robot Control


Reference Lambrecht J, Krüger J (2012) Spatial programming for industrial robots based on gestures and augmented reality. IEEE/RSJ Int. Conf. IROS, pp 466–472.

Reference Makris S, Tsarouchi P, Surdilovic D, Krüger J (2014) Intuitive dual arm robot programming for assembly operations. CIRP Ann Manuf Technol 63(1):13–16.
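A minimal, hypothetical sketch of gesture-based robot control (not the cited systems): per-frame labels from some gesture classifier are debounced over a short window and mapped to symbolic robot commands; the gesture names, window size and commands are assumptions.

```python
from collections import deque, Counter

# Assumed mapping from recognised hand gestures to robot commands
GESTURE_TO_COMMAND = {
    "open_palm": "STOP",
    "thumbs_up": "CONTINUE",
    "point_left": "MOVE_LEFT",
    "point_right": "MOVE_RIGHT",
    "fist": "GRIP",
}

class GestureCommander:
    """Debounce per-frame gesture labels and emit a command only when the
    same gesture dominates the last few frames."""
    def __init__(self, window=5, min_votes=4):
        self.history = deque(maxlen=window)
        self.min_votes = min_votes

    def update(self, gesture_label):
        self.history.append(gesture_label)
        gesture, votes = Counter(self.history).most_common(1)[0]
        if votes >= self.min_votes:
            return GESTURE_TO_COMMAND.get(gesture)
        return None                           # not confident yet

commander = GestureCommander()
stream = ["fist", "fist", "open_palm", "fist", "fist", "fist"]
for frame_label in stream:                    # labels from a gesture classifier
    cmd = commander.update(frame_label)
    if cmd:
        print("send to robot:", cmd)
```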


From Brain Signals to Robot Actions


EEG: electroencephalography

Types of Brainwaves
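The brainwave figure was lost in extraction. EEG is conventionally divided into delta, theta, alpha, beta and gamma bands; the sketch below computes relative band power for one raw EEG channel with NumPy, using typical band edges and an assumed sampling rate.

```python
import numpy as np

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}                 # Hz, typical edges

def band_powers(signal, fs=128.0):
    """Relative power of the classic EEG bands for one channel."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    total = psd[(freqs >= 0.5) & (freqs <= 45)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic 4-second channel: strong 10 Hz (alpha) component plus noise
t = np.arange(0, 4, 1 / 128.0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))
print(band_powers(eeg))                       # alpha should dominate
```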


[System architecture] A brainwave headset (Emotiv EPOC, connected via a Bluetooth dongle) streams brain waves to a brainwaves engine that handles EEG and gyro post-processing, brain training, mental-commands enabling, an emotional-state buffer and emotional event query handling, exposed through an API. An application server hosts the control logic, instruction unit, AI unit, status visualizer, commands handler and a function block, with a terminal as the user front end. A brain–robot interface (robot control and robot monitor modules) exchanges robot commands and robot status with the industrial robot controller over Ethernet, driving an ABB IRB 120 robot.

Brainwave-Driven Robot Control

Reference Mohammed A, Wang L (2018) Brainwaves driven human-robot collaborative assembly. CIRP Ann Manuf Technol 67(1):13–16.
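A highly simplified, hypothetical control loop in the spirit of this system (the headset call and command names are invented for illustration; they are not the Emotiv SDK or the authors' code): classified mental commands are accepted only above a confidence threshold and then forwarded to the robot.

```python
import random, time

# Assumed mapping from classified mental commands to robot actions
MENTAL_TO_ROBOT = {"push": "START_ASSEMBLY", "pull": "RETRACT", "lift": "PICK_PART"}

def read_mental_command():
    """Stand-in for a headset SDK call: returns (command, confidence)."""
    return random.choice(list(MENTAL_TO_ROBOT)), random.random()

def send_robot_command(cmd):
    """Stand-in for the robot interface (e.g. a socket to the controller)."""
    print("robot <-", cmd)

def control_loop(cycles=10, threshold=0.8):
    for _ in range(cycles):
        command, confidence = read_mental_command()
        if confidence >= threshold:           # ignore low-confidence detections
            send_robot_command(MENTAL_TO_ROBOT[command])
        time.sleep(0.05)                      # headset polling period (assumed)

control_loop()
```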


Macro–Micro Robot Control by FBs


[Diagram] Function-block-based macro–micro control: a Manager Function Block, a Material Handling Function Block and a set of Assembly Function Blocks (AF-FBs) for placing, inserting, press fitting, welding, chaining, screwing and riveting, chained by an execution-sequence FB (ES-FB) that routes execution among them. Each AF-FB has event inputs (EI_INI, EI_UPD, EI_RUN, EI_ESR), event outputs (EO_INI, EO_RUNRDY, EO_ESS), internal variables and internal algorithms (ALG_INI, ALG_RUN, ALG_UPDATE, ALG_MON) sequenced by an execution control chart with INI, START, RUN, UPDATE and MON states. The function blocks exchange data such as FB_EXE, FB_INFO, AT, EMT, MAC_ID, OPER, ROUTE and AC_UPD with the controller/HMI, actuators and I/Os, and with other resources (e.g. the Wise-Factory Cockpit), and drive the robot, fixture and finished-part conveyor in the assembly cell (positions P1–P4).

Reference Wang L, Schmidt B, Givehchi M, Adamson G (2015) Robotic assembly planning and control with enhanced adaptability through function blocks. Int J Adv Manuf Technol 77(1–4):705–715.
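A compact Python sketch of the event-driven function-block idea (a conceptual stand-in, not an IEC 61499 runtime): an assembly function block reacts to input events by running the corresponding internal algorithm, selected by a tiny execution-control mapping, and returns the output events it raises.

```python
class AssemblyFunctionBlock:
    """Toy event-driven function block: input events trigger internal
    algorithms and raise output events, mimicking an execution control chart."""
    def __init__(self, operation):
        self.operation = operation
        self.output_events = []

    # --- internal algorithms ------------------------------------------------
    def alg_init(self):
        self.output_events.append("EO_INI")

    def alg_run(self):
        print(f"executing '{self.operation}' on the robot")
        self.output_events.append("EO_RUNRDY")

    def alg_update(self):
        print(f"updating parameters of '{self.operation}'")
        self.output_events.append("EO_RUNRDY")

    # --- execution control: map input events to algorithms -------------------
    ECC = {"EI_INI": alg_init, "EI_RUN": alg_run, "EI_UPD": alg_update}

    def handle(self, event):
        self.output_events = []
        algorithm = self.ECC.get(event)
        if algorithm:
            algorithm(self)
        return self.output_events

fb = AssemblyFunctionBlock("inserting")
for ev in ("EI_INI", "EI_RUN"):               # events arriving from a manager FB
    print(ev, "->", fb.handle(ev))
```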


In-Situ Support to Mobile Workers


Reference Liu H, Wang L (2017) An AR-based Worker Support System for Human-Robot Collaboration. Procedia Manuf., pp 1–9.

VR/AR · AV · Text
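A minimal sketch of location-aware, in-situ worker support (the station layout, instruction texts and distance threshold are invented for illustration): the instruction shown to the mobile worker is selected from the tracked worker position.

```python
import math

# Assumed station layout (x, y in metres) and per-station instructions
STATIONS = {
    "fixture":  {"pos": (0.0, 0.0), "instruction": "Insert bearing into housing"},
    "conveyor": {"pos": (2.5, 0.5), "instruction": "Place finished part on conveyor"},
    "rack":     {"pos": (1.0, 3.0), "instruction": "Fetch next screw tray"},
}

def instruction_for(worker_xy, max_dist=1.5):
    """Return the instruction of the nearest station, or a guidance hint if the
    worker is not close enough to any station."""
    name, station = min(STATIONS.items(),
                        key=lambda kv: math.dist(worker_xy, kv[1]["pos"]))
    if math.dist(worker_xy, station["pos"]) > max_dist:
        return f"Move towards the {name} station"
    return station["instruction"]

print(instruction_for((0.3, 0.2)))            # near the fixture
print(instruction_for((5.0, 5.0)))            # far from everything -> guidance
```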


VR/AR Assisted HRC Assembly


Reference Alexopoulos K, et al. (2013) ErgoToolkit: an ergonomic analysis tool in a virtual manufacturing environment. Int J Comput Integr Manuf 26(5):440–452.

Reference Makris S, et al. (2013) Assembly support using AR technology based on automatic sequence generation. CIRP Ann Manuf Technol 62(1):9–12.


Remote HRC: Monitoring and Assembly


[Diagram] A mini robotic assembly cell modelled as a Java 3D-style scene graph: a Virtual Universe with background, lights and viewpoint nodes, and a BranchGroup (BG) holding TransformGroup (T) nodes for the base platform, joint 1, joints 2–5 and the gripper. Each part carries geometry (G) and appearance (A) nodes, while Behaviour (B) nodes with user-defined control codes drive the joint and gripper transforms.

Reference L. Wang, M. Givehchi, G. Adamson and M. Holm, "A Sensor-Driven 3D Model-Based Approach to Remote Real-Time Monitoring," CIRP Annals – Manufacturing Technology, Vol.60, No.1, pp.493-496, 2011.
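A small sketch of the sensor-driven principle behind such remote monitoring (illustrative, not the cited Java 3D implementation): streamed joint angles update a kinematic chain of homogeneous transforms, so the remote 3D model mirrors the physical robot instead of streaming camera images.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform: rotation about z by theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans_x(a):
    """Homogeneous transform: translation along x by link length a (metres)."""
    T = np.eye(4)
    T[0, 3] = a
    return T

LINK_LENGTHS = [0.3, 0.25, 0.1]               # assumed planar 3-link arm

def update_model(joint_angles):
    """Recompute each link frame from the latest sensor readings and return
    the tool position – the 'scene graph update' of the remote 3D model."""
    T = np.eye(4)
    frames = []
    for theta, a in zip(joint_angles, LINK_LENGTHS):
        T = T @ rot_z(theta) @ trans_x(a)
        frames.append(T.copy())
    return frames, T[:3, 3]

# Simulated stream of joint readings from the real robot's controller
for sample in ([0.0, 0.0, 0.0], [0.3, -0.2, 0.1]):
    _, tool = update_model(sample)
    print("tool position:", np.round(tool, 3))
```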


Future Research Directions


Current standards do not yet address many key issues of mixed human–robot teams. More realistic standards are needed.

The digital twin should combine and align all aspects of modelling the function, structure and behaviour of the robotic cell, including the worker, and represent the multimodal, bidirectional channels of communication and control once effectively tuned to the real environment.

Emergency cases require fast and guaranteed mitigation that brings the situation back to normal without interruption, provided it does not endanger human integrity or safety.

AR-based support to workers in dynamic HRC assembly deserves more attention so that it is both intuitive and free of mental stress. Work instructions need to adapt not only to the changing competence level of individual workers but also to their declining focus and concentration over the day or the week.

When humans are added to shared robotic environments, their trust in those environments becomes critically important and deserves attention.

In HRC assembly, mental stress and psychological discomfort that could lead to accidents can be monitored and diagnosed via workers' brainwaves, collected by sensors embedded in a safety helmet.


Statistics of Cited References


Near Industrial Applications

Symbiotic HRC Assembly in Action


Acknowledgement: The SYMBIO-TIC project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 637107.



Summary

[Diagram, repeated from the scope slide] A safe HRC environment built on symbiotic auditory, visual and haptic communication across the five research areas: Smart Sensor Network, Active Collision Avoidance, Dynamic Task Planning, Adaptive Robot Control and Mobile Worker Assistance.

Background and definition of human-robot collaboration (HRC)

Active collision avoidance in HRC assembly

Dynamic assembly planning and on-demand job dispatching

Adaptive and programming-free robot control

In-situ operator support in a changing HRC assembly environment

Challenges and future research directions


Thank You for Listening


Lihui Wang
Professor of Sustainable Manufacturing, KTH Royal Institute of Technology • Stockholm • Sweden

Reference L. Wang, R.X. Gao, J. Váncza, J. Krüger, X.V. Wang, S. Makris and G. Chryssolouris, "Symbiotic Human-Robot Collaborative Assembly," CIRP Annals – Manufacturing Technology, Vol.68, No.2, pp.701–726, 2019.