
USING MOTION CAPTURE AND VIRTUAL REALITY TO TEST THE ADVANTAGES OF HUMAN ROBOT COLLABORATION

Master Degree Project in Virtual Product Realization
One year level, 16.5 ECTS
Spring term 2019

Francisco García Rivera
Supervisors: Erik Brolin, Aitor Iriondo Pascual
Examiner: Anna Syberfeldt


Abstract

Virtual Reality (VR) and Human Robot Collaboration (HRC) are becoming increasingly important in both industry and science. This investigation studies the application of these two technologies in the ergonomics field by developing a system able to visualise and present ergonomics evaluation results in real time during assembly tasks in a VR environment, and by evaluating the advantages of Human Robot Collaboration through a Virtual Reality study of a specific operation carried out at Volvo Global Trucks Operation's factory in Skövde.

Regarding the first part of this investigation, an innovative system was developed that shows ergonomic feedback in real time and performs ergonomic evaluations of the whole workload inside a VR environment. This system can be useful for future research in the Virtual Ergonomics field on matters such as workers' ergonomic learning rate when performing assembly tasks, the design of ergonomic workstations, the effect of different types of assembly instructions in VR, and a wide variety of other applications.

The assembly operation, with and without the robot, was created in IPS, whose VR functionality makes it possible to test the assembly task with real users performing natural body movements. The users performed the task first without and then with the collaborative robot while their posture data was collected using a Motion Capture system called Smart Textiles (developed at the University of Skövde), and the two ergonomic evaluations (using Smart Textiles' criteria) of the two task variants were compared. The results show that when the robot is implemented in this specific assembly task, the posture of the workers (especially the posture of the arms) improves greatly compared to the same task without the robot.

Keywords: Virtual Reality, Human Robot Collaboration, Virtual Ergonomics, Industry 4.0


Acknowledgements

I want to thank Anna Syberfeldt, Erik Brolin, Dan Högberg and Aitor Iriondo for their great guidance and support during this project. It would not have been possible without them. I also want to thank my family for their unconditional love and support.

Skövde, May 2019

Francisco García Rivera


Certificate of Authenticity

Submitted by Francisco García Rivera to the University of Skövde as a Master Degree Thesis at the School of Engineering.

I certify that all material in this Master Thesis Project which is not my own work has been properly referenced.

Signature.

Francisco García Rivera


Table of Contents

1 Introduction
  1.1 Background
  1.2 Problem definition and Goals
2 Frame of References
  2.1 Ergonomics assessment
    2.1.1 Costs of Work-Related Musculoskeletal Disorders
    2.1.2 Assessment Methods
  2.2 Motion Capture
  2.3 Virtual Reality in Manufacturing Context
    2.3.1 Virtual Reality, Definition and Types
    2.3.2 Applications of Virtual Reality in Manufacturing
    2.3.3 Ergonomics of assembly processes in Virtual Reality
  2.4 Human Robot Collaboration
3 Literature Review
4 Method
  4.1 Feedback system able to inject information in the VR environment
  4.2 Simulation of the Assembly task using collaborative robot
5 Results
  5.1 Ergonomic Evaluation Platform
  5.2 Simulation and advantages of the Assembly task using collaborative robot
6 Discussion
  6.1 Ergonomic Evaluation Platform
  6.2 Simulation and advantages of the Assembly task using collaborative robot
7 Conclusion
  7.1 Future Work
8 References
Appendix 1: Colouring Criteria


Table of Figures

Figure 1. Method 1
Figure 2. Method 2
Figure 3. Desired Data Flow Diagram
Figure 4. Data Flow Diagram
Figure 5. Data Flow Diagram. Part 1
Figure 6. Data Flow Diagram. Part 2
Figure 7. Smart Textiles
Figure 8. Data Flow Diagram. Part 3
Figure 9. Smart Textiles Interface
Figure 10. Data Flow Diagram. Part 4
Figure 11. Assembly Task with Collaborative Robot
Figure 12. User's sight with collaborative robot
Figure 13. User's sight without collaborative robot
Figure 14. Attached to the controller
Figure 15. World Forward
Figure 16. Corner of the Screen
Figure 17. Ergonomic Evaluation without Robot
Figure 18. Ergonomic Evaluation with Robot


Index of Tables

Table 1. Primary and Secondary Objectives
Table 2. Objectives Reached
Table 3. Objectives Reached
Table 4. Colouring Criteria Arms
Table 5. Colouring Criteria Trunk
Table 6. Colouring Criteria Arm Speed


Terminology

AR      Augmented Reality
HMD     Head Mounted Display
HRC     Human Robot Collaboration
MPMWS   Multi-Parameter Monitoring Wearable Sensor
PLM     Product Lifecycle Management
SPMWS   Single-Parameter Monitoring Wearable Sensor
VE      Virtual Environment
VR      Virtual Reality
WMSDs   Work-Related Musculoskeletal Disorders


1 Introduction

1.1 Background

Technological advances have driven industry and its methods throughout history. According to many researchers, industry is moving towards a new stage called "Industry 4.0" (Rüßmann et al., 2015; Zhong et al., 2017). Industry 4.0 can be defined as the new trend in industry where communication and information systems converge. Although the term is not entirely clear, due to the number of contributions from companies, research institutions, universities and so on, it can be said that "Industry 4.0" stands for the next industrial revolution (Hermann et al., 2016). According to Rüßmann et al. (2015), Industry 4.0 is based on nine different technologies, of which "Augmented Reality/Virtual Reality" and "Advanced Robots" are two. These two topics are the central areas of this investigation.

The industrial applications of Virtual Reality (VR) are endless. In this investigation, its application to the design of workstations will be studied. The importance of considering production aspects in the early stages of the product lifecycle has been demonstrated (Seth et al., 2011a), as well as how this can increase the performance of the whole organisation (Riedel and Pawar, 1997). "Production aspects" is a broad term, but the design of workstations and their ergonomic aspects clearly fall within it, since production capability and quality are intimately related to the ergonomic quality of the workstations (Falck and Rosenqvist, 2014).

More specifically, this research studies the ergonomic advantages of collaborative workstations (workstations which make use of a collaborative robot) using VR. This technology enables workstations to be designed in a virtual environment and actually tested with natural body movements prior to their construction (Zawadzki and Żywicki, 2016). In addition, with the implementation of Motion Capture (MoCap) technologies ("XSENS-Research," 2019) and software such as IPS ("Industrial Path Solutions–Fraunhofer-Chalmers Centre," 2019), a large number of workstation parameters can be obtained (such as ergonomic data, path collisions, ease of assembly and so on). This gives the designer numerical data about the workstation (mathematical ergonomic evaluations, task time, ergonomic variety and so on) and makes it possible to improve, adjust and enrich it on a quantitative basis, which can increase the efficacy, efficiency and comfort of its use as well as of its design process.


Another important basis of Industry 4.0 is the implementation of autonomous robots able to work with humans in a collaborative context (Human Robot Collaboration; Rüßmann et al., 2015), in which the advantages of humans (adaptability, decision making, judgement and so on) are combined with the advantages of robots (speed, precision, repeatability, continuous operation and so on) in order to adapt companies to the conditions (mass customisation, mass production, regional customisation and so on) that the market demands nowadays (Bahrin et al., 2016).

So far, VR and Human Robot Collaboration (HRC) have been briefly introduced. With the concepts presented in this section, the problem will be explained in the next one.

1.2 Problem definition and Goals

The problem definition is divided into two sections, since there are two different problems as well as two different main objectives. Although they are intimately related, they are still quite different.

The advances in VR technologies and in simulation programs over the last few years have opened up a whole new range of possibilities concerning simulations in VR. However, although assembly tasks can be simulated in VR, there is no way to evaluate ergonomic aspects and show them to the user in such a scenario. Moreover, there is a lack of investigations into ergonomic feedback in VR, so it is necessary to increase the knowledge in the scientific community regarding this specific matter. Therefore, this project aims to:

Develop an ergonomic feedback system inside the VR environment which allows the people doing the manual assembly in VR to know how good (or bad) their postures are in real time and how they can be improved.

In a certain operation at Volvo's factory in Skövde, the workers must adopt an awkward posture in order to assemble one of the parts of the truck engine. They need to raise their arms excessively, which makes the task uncomfortable and unsafe (especially for short workers, since they have to raise their arms even more). The development of Work-Related Musculoskeletal Disorders (see section 2.1) is more probable for the workers performing this task.

Regarding this problem, a collaborative robot might be implemented in order to improve the ergonomic aspects of that operation. By making use of the technologies presented in the last section, this project aims to:


Simulate the assembly task that these workers have to perform in a Virtual Reality environment by using the Industrial Path Solutions (IPS) software, developed by the Fraunhofer-Chalmers Research Centre for Industrial Mathematics (FCC) and distributed by IPS AB as well as fleXstructures ("flexstructures GmbH Kaiserslautern," 2019). This software supports VR simulation through SteamVR ("SteamVR," 2019).

Study whether the collaborative robot can bring any improvements from the ergonomic point of view. The assembly tasks will be created in IPS and then, by using the VR hardware as well as Motion Capture equipment, ergonomic data will be collected in order to measure the improvement that the implementation of the collaborative robot brings to the assembly task.

Table 1 summarises the objectives.

Table 1. Primary and Secondary Objectives

Primary purpose | Main Requirements | Secondary Requirements
Development of an ergonomic feedback system | Possibility of overlaying information in the VR goggles | Use X-Sens as Mo-Cap equipment; implementation of the system in IPS
Creation of the assembly task in IPS | Creation with robot and without robot; ergonomic comparison | As realistic as possible (vacuum gripper, tools, screens)


2 Frame of References

2.1 Ergonomics assessment.

2.1.1 Costs of Work-Related Musculoskeletal Disorders

In the last few years, the importance of ergonomic assessment has been increasing, since Work-Related Musculoskeletal Disorders (WMSDs) in industry and services cause both economic and health costs in developing and developed countries (Madani and Dababneh, 2016). The exact economic cost of WMSDs is difficult to calculate due to the lack of standardisation (Moar et al., 2015); however, some investigations have been conducted to measure the actual cost of WMSDs in different regions. For instance, in the USA, WMSDs cause around 32% of all work absences across all industries (Antwi-Afari et al., 2017; Bhattacharya, 2014), and they are recognised by institutions such as the National Occupational Research Agenda (NORA) as one of the most expensive problems that the USA has to face nowadays (Ramos et al., 2017). The situation does not differ in the European Union, where WMSDs represent a cost of 476 billion euros per year (Pascual et al., 2019). The situation also persists in developing countries such as Colombia, where the cost of WMSDs reached 171.7 million US dollars in 2005 (Piedrahita, 2006).

Even in sectors such as semiconductor manufacturing, in which harm to workers is usually related to chemical products, WMSDs are still among the health problems observed in the workers (Wong and Richardson, 2010). In other industrial sectors where hard physical work is a must, WMSDs are probably the major health problem among the workers, due to repetitive weight lifting (Antwi-Afari et al., 2017). In some sectors, such as mining, fishing or construction, more than 70% of the workers have experienced health problems related to their work (Álvarez-Casado et al., 2016). Apart from the economic costs, WMSDs also imply personal problems for the workers suffering from them, such as psychological problems and stress (Jose, 2012). Furthermore, many studies have revealed that WMSDs also increase factories' production costs due to a lack of quality in the assembly; productivity as well as work quality can be severely reduced by a poor working environment (Falck and Rosenqvist, 2014; Falck et al., 2014, p. 2).


2.1.2 Assessment Methods

Approaches for assessing the posture

Techniques for assessing workers have been around for some time. Twenty years ago there were already reviews of the methods, approaches and resources available at that time. Systems such as Isotrack, Flock of Birds and 3space were computer-aided tools which made use of kinematic sensors to keep track of body posture (Li and Buckle, 1999).

However, the classification of methods for assessing workers has remained very similar over the years, and several papers divide ergonomic assessment into similar categories. The classification briefly shown below summarises the different methods for assessing worker-related musculoskeletal load concisely and straightforwardly, according to different parameters (David, 2005):

Self-report: traditionally used to collect data from the workers. Newer self-report methods include self-evaluation videotapes.

Simple observational techniques: rely on ergonomics experts assessing the posture in the workplace.

Advanced observational techniques: include videotapes and may also include dedicated software which analyses different parameters of the posture.

Direct methods: rely on sensors attached to the subjects to keep exact track of the movement and therefore evaluate the posture in real time. The level of accuracy reached by these methods is higher than with any other, although within this category there are different levels of accuracy. In the investigation carried out by Lee et al. (2017), the direct methods are divided into two main categories:

o Simple systems that only monitor body motion, "single-parameter monitoring wearable sensor" (SPMWS) systems: useful to assess workers' exposure, measuring aspects such as intensity of flexion during working hours.

o Complex systems that collect motion and physiological measures, "multi-parameter monitoring wearable sensor" (MPMWS) systems. Simple systems can only measure data such as acceleration or posture, whereas complex systems are able to measure a wider range of data such as electrocardiogram or respiration.


Methods to Evaluate Workload

Regarding the problem of Work-Related Musculoskeletal Disorders, posture evaluation methods for ergonomic assessment have been developed, such as the Rapid Upper Limb Assessment (RULA) (McAtamney and Nigel Corlett, 1993), the Rapid Entire Body Assessment (REBA) (McAtamney et al., 2004), the Ovako Working Posture Assessment System (OWAS) and so on. Their purpose is to identify and assess risk factors in the workplace in order to decrease the unsuitable musculoskeletal load produced by the tasks that the workers have to perform. The information that these methods provide can be applied to design more ergonomic workstations (Feyen et al., 2000), to improve existing workstations in some way (Kushwaha and Kane, 2016), or to provide the workers with ergonomic feedback so that they use the workstations properly (Mahdavian et al., 2018).

RULA is one of the most widely used assessment methods nowadays due to its simplicity and the ease of developing assessment software based on it (Lowe et al., 2018). Many posture assessment methods have been developed based on the principles of RULA in combination with different approaches to keep track of the body position, such as Microsoft Kinect (Haggag et al., 2013), video recording (Ansari and Sheikh, 2014), and motion capture technologies (Lee et al., 2017).
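As an illustration of how such RULA-based assessment software works, the sketch below scores the upper arm from a single shoulder flexion angle. It is a minimal example, not the scoring tables used later in this project: the thresholds follow the published RULA upper-arm ranges, the adjustments for abduction, shoulder raising and arm support are omitted, and the function name is hypothetical.

```python
def rula_upper_arm_score(flexion_deg: float) -> int:
    """Simplified RULA upper-arm score from the shoulder flexion angle.

    Thresholds follow the published RULA upper-arm ranges (McAtamney and
    Nigel Corlett, 1993); adjustments for abduction, shoulder raising and
    arm support are omitted in this sketch.
    """
    if -20.0 <= flexion_deg <= 20.0:
        return 1                        # close to neutral
    if flexion_deg < -20.0 or flexion_deg <= 45.0:
        return 2                        # extension, or moderate flexion
    if flexion_deg <= 90.0:
        return 3                        # arm raised towards shoulder height
    return 4                            # arm raised above shoulder height


# Example: holding the arm at 70 degrees of flexion scores 3.
print(rula_upper_arm_score(70.0))
```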

2.2 Motion Capture

Motion Capture (also known as MoCap) is, according to the Cambridge English Dictionary, "the process of recording the movements of people and objects, used for example in film-making, video games, and sports" ("MOTION CAPTURE | meaning in the Cambridge English Dictionary," 2019) and, according to the Oxford English Dictionary, "The process or technique of recording patterns of movement digitally, especially the recording of an actor's movements for the purpose of animating a digital character in a movie or video game" ("motion capture | Definition of motion capture in US English by Oxford Dictionaries," 2019).

However, the range of application of motion capture is much broader than recording human motion for the gaming or film industries. Other examples of fields of application of MoCap are robotics (Field et al., 2009), healthcare (Noonan et al., 2009), ergonomics (Yan et al., 2017), etc.


This project focuses on the applications of motion capture in industry, more specifically on the posture assessment of workers based on motion capture and computer vision software. In this field there are several types of motion capture systems, each of which uses a different technology to keep track of body movements. They can be divided into four main categories depending on the technology used: optical (Haggag et al., 2013), inertial (Kim and Nussbaum, 2013), magnetic (Yabukami et al., 2000) and mechanical (Gu et al., 2016).

In this project an inertial system will be used. Inertial Measurement Units (IMUs) have some advantages over other systems regarding latency, interference and range (Foxlin, 1996). The IMUs and a computer are the only equipment required to keep track of the body with this kind of system, which is a big advantage for the goals of this project, since building a demonstrator and testing it becomes much simpler. IMUs are usually composed of 3D gyroscopes, accelerometers and magnetometers. More specifically, X-Sens will be used if time allows. This system provides 6-Degree-Of-Freedom (DOF) tracking of body segments and is composed of 17 IMUs attached to different parts of the body (Roetenberg et al., 2013). The IMUs attached to the body continuously record kinematic data (up to 120 Hz), which is sent to the X-Sens software and saved there.
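As a small illustration of what such a data stream provides, the sketch below converts one segment's orientation, as it could be streamed per frame by an inertial suit, into an inclination angle relative to the vertical. The function name and the assumption of a (w, x, y, z) unit quaternion per segment are illustrative, not the X-Sens or Smart Textiles API.

```python
import math
from typing import Tuple


def segment_inclination_deg(q: Tuple[float, float, float, float]) -> float:
    """Angle (degrees) between a body segment's local z-axis and the world
    vertical, given the segment orientation as a unit quaternion (w, x, y, z).

    Illustrative helper: an inertial suit streams one such orientation per
    segment per frame (up to 120 Hz), and the ergonomic evaluation reduces
    them to joint and inclination angles like this one.
    """
    w, x, y, z = q
    # (3,3) element of the rotation matrix: the vertical component of the
    # segment's local z-axis expressed in the world frame.
    zz = 1.0 - 2.0 * (x * x + y * y)
    return math.degrees(math.acos(max(-1.0, min(1.0, zz))))


# Identity quaternion: segment perfectly upright, 0 degrees of inclination.
print(segment_inclination_deg((1.0, 0.0, 0.0, 0.0)))
```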

This system has been validated by several studies (Cognolato, 2012; Zhang et al., 2013; Blair et al., 2018) and has been used successfully in several projects ("BALANCE | Augmentation in Locomotion, through Anticipative, Natural and Cooperative control of Exoskeletons," 2019; "Andy Project - Home," 2019; "Bewegingsanalyse," 2016).

2.3 Virtual Reality in Manufacturing Context

2.3.1 Virtual Reality, Definition and Types

The foundation of Virtual Reality as it is known nowadays is attributed to Jaron Lanier (Firth, 2019). Virtual Reality is "the computer-generated simulation of a three-dimensional image or environment that can be interacted with in a seemingly real or physical way by a person using special electronic equipment, such as a helmet with a screen inside or gloves fitted with sensors" ("virtual reality | Definition of virtual reality in English by Oxford Dictionaries," 2019).

Virtual Reality can be divided into three main categories according to the immersivity achieved by the system (Mujber et al., 2004a):

Non-immersive VR: joysticks and screens are used.


Semi-immersive VR: large monitors or projectors are used, and the immersivity accomplished is slightly superior to that of the previous category.

Immersive VR: gloves and accelerometers as well as goggles are used to reach a fully immersive experience.

The growth of the gaming industry has created a large market for commercial immersive VR equipment (Watson, 2017), and due to competition, accurate VR equipment has become comparatively inexpensive (around 10,000 SEK) and standardised. These facts, together with the possibilities and application fields of VR, have led to an increase in the number of VR-related investigations being carried out nowadays.

The equipment used in this project is the HTC VIVE. The main characteristics that it has to fulfil, given the requirements of this project, are related to latency (22 ms) and precision. This equipment satisfies the requirements of this investigation, and it has been used successfully in many other similar projects (Soffel et al., 2016; Egger et al., 2017) without observed issues, even proving superior to other similar VR systems in pick-and-place-type tasks (Suznjevic et al., 2017).

2.3.2 Applications of Virtual Reality in Manufacturing

Usually VR is associated with gaming; however, there are many more applications in which VR is becoming a tool to take into consideration, such as healthcare (Seymour et al., 2002), manufacturing (Mujber et al., 2004b) and, of course, gaming (Zyda, 2005; Kilteni et al., 2013).

In this project, the central topic is the application of VR in manufacturing. VR aims to create a computer-generated world in which the user can immerse themselves, walk around and interact with the environment; applied to industry, this is defined as Virtual Manufacturing (VM), a concept first introduced by researchers at the University of Maryland 24 years ago (Lin et al., 1995). Nowadays, due to the facts listed below, companies are trying to implement Virtual Reality in their production and design methods:

Increasing competition affecting companies, and the pursuit of reducing costs and increasing efficiency in all cycles of the product's life (Zhong et al., 2017).

Demand from companies to reduce the time from conceptualisation to production (Mujber et al., 2004a).

Advances in computing power, and the consequent reduction of prices for heavy computing tasks such as VR (Azuma et al., 2001).


The applications of VM are endless; below, a short list of the applications of VM in different areas of a company is shown (Mujber et al., 2004a):

Design
  o Functionality
  o Human interaction
  o Environment
Operations management
Manufacturing processes
  o Machining
  o Assembly

This project focuses on the application of VR to the ergonomic aspects of manual assembly processes and to Human Robot Collaboration (HRC).

2.3.3 Ergonomics of assembly processes in Virtual Reality

Assembly processes cause the majority of a product's production and design costs; therefore, the development of a proper assembly plan is essential in the early stages of a product (Seth et al., 2011a). Virtual reality allows designers to implement their ideas in a virtual environment and test them realistically, and thus to prepare and perform assembly operations, sequences and so on. The operations can therefore be realistically tested in the early stages of the product lifecycle, allowing designers to improve the ergonomics or efficiency of the workstation based on data obtained from the simulations in virtual reality (Riedel and Pawar, 1997).

A powerful tool to virtually test ergonomic aspects of assembly tasks is the computer manikin, a computational representation of the human body based on anthropometric measurements. These digital human models allow the testing of different parameters of the product or the production line, saving time and gaining accuracy in the results (Dukic et al., 2007).

2.4 Human Robot Collaboration

Mass customisation and the drive for increased production efficiency in recent years demand more flexibility and changeability in industrial assembly lines. Batch sizes are decreasing as customer personalisation increases. It has been argued that assembly tasks can be better performed if they are robot assisted and human guided (Krüger et al., 2009). The basis of this idea is that humans do what they are best at and robots do what they do best: the speed, productivity and continuity of robots can be combined with human flexibility, which can provide enormous advantages for the production line.

Human Robot Collaboration is usually categorised in two ways, "workspace sharing" or "workspace and time sharing", although it can be categorised in more detail depending on the level of interaction between the human and the robot. One of the primary concerns of companies is always the safety of the workers, which has made the implementation of collaborative robots in industry slightly slower due to possible collisions and so on (Michalos et al., 2015). Some companies, such as BMW, have already introduced robots that share work area and work time with humans in order to avoid repetitive weight lifting and hence injuries (especially low-back injuries) (Knight, 2014). It has been demonstrated that Human Robot Collaboration can reduce ergonomic concerns, since it can decrease physical work and cognitive loading (Cherubini et al., 2016).

This project focuses on collaborative workstations in which humans and robots share working space, time and tasks. Since the main topic studied in this project is the virtual simulation of human robot collaboration tasks, the next section focuses on it.


3 Literature Review.

Several studies have been conducted in the field of Virtual Reality with different purposes. In the industrial field, as mentioned, VR has a wide range of applications; however, it has not actually been used in real-world companies until recent years due to its stage of development, and although there is some inexpensive commercial equipment as well as many ongoing projects, VR is still more present in research institutions than in actual industry.

In the project performed by Michalos et al. (2018) the workplace is analysed by using VR techniques. It is mainly focused on factory layout analysis from the Lean manufacturing perspective, according to data obtained from Lean tools. They state that the reduction in time related to the elimination of computer simulation is significant, due to (among other factors) the use of VR tools. Assembly operations in VR (Virtual Assembly) were also validated by Seth et al. (2011b), who state that the use of Virtual Assembly can reduce the money and time spent when testing new workstations. So far it has been shown that VR can be useful at factory level as well as at single-operation level. The next step in the literature review was to find out whether human beings can act naturally in the virtual environment when they are performing assembly tasks.

One of the main industrial applications of VR is to train operators for assembly tasks, and many research projects have been performed in this specific field. The results show a good learning rate which does not differ significantly from actual assembly training with real objects (Murcia-López and Steed, 2018; Pallavicini et al., 2015; Boud et al., 1999). These results are useful for this investigation, since it is now known that assembly tasks in virtual reality feel real to a human being, and if people can learn how to perform a task in VR, it is very likely that they can learn how to perform it with a correct posture by using the feedback system that is intended to be developed. Ergonomic aspects have also been studied in VR as well as AR, showing that VR can be a better tool than AR for studying Human Factors/Ergonomics, and this validates the study of ergonomic aspects in the virtual prototype (Aromaa and Väänänen, 2016).

Moving to human robot interaction using VR, some research has been carried out on this subject. It has been demonstrated that robot appearance and robot movements affect the user's perception in collaboration tasks, although a more anthropomorphic robot does not necessarily improve the user's perception, and neither do anthropomorphic movements, since the user might not find them efficient for collaboration tasks; generally, it seems that human robot collaboration tasks in a virtual reality context are well accepted by the user (Weistroffer et al., 2013). Within this field, telepresence, which includes human robot collaboration, is also studied extensively. Some telepresence projects for teleoperating robots have been developed with impressive results (Lipton et al., 2018; Rosen et al., 2018; Tran et al., 2018), in which the robot could be teleoperated by using the same technologies that are going to be used in this project. Although these three projects do not have the same objectives as this one, they address technical matters which are quite useful in this investigation.

There is a lack of articles focusing on ergonomic feedback systems in a VR environment. The current investigations mostly provide the feedback in a real environment by using one of the technologies presented in section 2.2. Although the technologies used can differ slightly, such as Kinect (Martin et al., 2012) or inertial sensors (Battini et al., 2014), the objectives and the methods are quite similar: the body movement is tracked, the posture is evaluated and, based on those evaluations, ergonomic feedback is given. This feedback can stimulate vision, hearing or touch. According to some research, visual feedback seems to be the most effective (Hu et al., 2012).

From this literature review some interesting conclusions can be extracted. They are summarised as follows:

VR can save money as well as time when testing new workstations, products and methods.

VR feels natural when the user interacts with the product, and it is useful for training workers, since the learning rate when VR is used does not differ from the learning rate when the actual objects are used.

The best way to make ergonomic feedback affect the posture of the workers is to show it to them visually.


4 Method.

The methods used to reach the objectives of this project include two processes, since two different objectives are to be reached:

Create an ergonomic feedback system able to inject information into the VR environment when using IPS (Figure 1).

Figure 1. Method 1.

Measure the ergonomic advantages of HRC, for which the process shown in Figure 2 is followed.

Figure 2. Method 2.

[Figure 1 comprises the steps: gather information about VR in IPS; collect and understand the code for VR in IPS; implement Smart Textiles software; development of the virtual screen.]

[Figure 2 comprises the steps: creation of the task (in IPS) without the collaborative robot; creation of the task (in IPS) with the collaborative robot; test them both with different users; measure (ergonomically) the advantages of HRC.]


The second method relies on the first one, since the ergonomic evaluation will be done using the software developed in the first method.

4.1 Feedback system able to inject information in the VR environment

Due to the development stage of the VR functionality of IPS, as well as the difficulty of editing the IPS source code, this system involves several software programs, and the data flow among these programs is slightly complex. Therefore, to start, the main pieces of software are listed below:

OpenVR
SteamVR
OpenDesktop

OpenVR can be quickly defined as a piece of code inside the IPS source code which is used as a connection between IPS (the program drawing the 3D environment) and SteamVR, whereas SteamVR is used to connect with the VR hardware (HTC VIVE in this case). To make a more efficient connection between this feedback system and the VR hardware, the best option would have been to edit the OpenVR code inserted into the source code of IPS so that this information could be shown in the goggles (Figure 3).

Figure 3. Desired Data Flow Diagram

However, that was not an option, since there was no access to this source code and the complexity would be too high to manage in 5 months; in addition, the integration of X-Sens in this system was not an option either, also due to the time constraints. Therefore, the second-best option was to overlay the information onto the VR environment from a different piece of software, in other words, using another program (external to IPS and SteamVR) which injects information into the VR environment. Instead of X-Sens, the ergonomic evaluation platform called Smart Textiles, developed at the University of Skövde (Mahdavian et al., 2018), was used, since it already includes ergonomic evaluation as well as the hardware necessary to do it. The problem with this approach is the complexity of the data flow among all the programs (shown in Figure 4).

Figure 4. Data Flow Diagram

To get a better understanding of this flowchart, it will be divided into smaller parts. The first one is the beginning of the data flow, starting with IPS. The task, as well as all the 3D elements (manikins, robots, solids, tools, environment and so on), is either created in IPS or imported into it; therefore, IPS is the starting point of the data flow. IPS connects to the HTC VIVE equipment by using OpenVR as well as SteamVR (Figure 5).


Figure 5. Data Flow Diagram. Part 1

OpenVR is a public API and can be defined as the piece of code which is in charge of making the VR environment run. It calculates the lens distortion, gets some information about the hardware (connected devices) and handles other matters related to the display. It also renders 3D content (virtual bounds, objects and so on) as well as 2D content (such as menus, icons and buttons). Although OpenVR can be implemented in any 3D software, SteamVR is still needed to run protocols at a lower level of abstraction, such as getting a correct frame rate, applying barrel distortion, or getting and interpreting more complex information from the hardware, such as positioning from the sensors, Head Mounted Display (HMD) or controllers, as well as the inputs and outputs of these devices. When all this information has been gathered and rendered, it is shown to the user. The user is going to have a certain perception of the environment and of the tasks that he/she has to perform, and his/her movements will be based on that perception of the environment. This is where motion capture comes in.

Moving on to the next phase of this data flow, as mentioned, the users will move, and hence adopt postures, according to their perception of the environment. It is worth mentioning that the absolute position of the user inside the VR environment is collected by the VR equipment.



Figure 6. Data Flow Diagram. Part 2

Figure 7. Smart textiles (Mahdavian et al., 2018)

The posture data will be collected by the Smart Textiles equipment. The movements of the user can be regarded as the "input" data of the ergonomic evaluation platform; they are collected by the motion capture equipment (Figure 7) and sent to the software via Bluetooth, which corresponds to the part of the diagram shown in Figure 8.

Figure 8. Data Flow Diagram. Part 3

Afterwards, while the ergonomic evaluation is being executed, it is mirrored by a program used to inject information into the VR equipment and overlay it onto the VR environment, in order to make the user able to see the ergonomic feedback and correct her/his movements if they are wrong in any way.
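The sketch below shows how such an overlay program can inject a 2D panel (for example, an image mirroring the Smart Textiles evaluation window) into the running VR session through SteamVR's overlay interface. It is a minimal illustration using the pyopenvr bindings, not the exact program used in this project; the overlay key, image path and tracked-device index are hypothetical, and call signatures may differ between binding versions.

```python
import openvr

# Connect to the running SteamVR session as an overlay application,
# so the panel is drawn on top of whatever IPS is rendering.
openvr.init(openvr.VRApplication_Overlay)
overlay = openvr.VROverlay()

# Create a named overlay and fill it from an image file (hypothetical path;
# in practice this image mirrors the Smart Textiles evaluation window).
handle = overlay.createOverlay("ergo_feedback", "Ergonomic feedback")
overlay.setOverlayFromFile(handle, "C:/ergo/feedback.png")
overlay.setOverlayWidthInMeters(handle, 0.30)

# Attach the panel 20 cm above a tracked device (e.g. one controller),
# corresponding to the "attached to the controller" mode shown in Figure 14.
transform = openvr.HmdMatrix34_t()
transform.m[0][0] = 1.0
transform.m[1][1] = 1.0
transform.m[2][2] = 1.0   # identity rotation
transform.m[1][3] = 0.20  # 20 cm offset along the device's y-axis
controller_index = 3      # hypothetical tracked-device index of a controller
overlay.setOverlayTransformTrackedDeviceRelative(handle, controller_index, transform)

overlay.showOverlay(handle)
```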


The information that the user sees while interacting in the VR environment shows the different parts of the body coloured according to the ergonomic evaluation (Appendix 1) of each one of them (Figure 9). For instance, if the left arm is placed in a good position, it is coloured green; if the position is not that good, it is coloured yellow; and if the position is hazardous, it is coloured red. Since the user can see this evaluation in real time, they can make use of the information to improve their posture (colouring criteria in Appendix 1; a minimal sketch of this traffic-light rule is given after Figure 9). Figure 10 shows the part of the diagram which corresponds to this step of the data flow.

Figure 9. Smart Textiles Interface
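The sketch below expresses this traffic-light rule for a single joint angle. The thresholds are illustrative placeholders, not the actual Smart Textiles criteria listed in Appendix 1.

```python
def posture_colour(angle_deg: float, warn: float = 45.0, danger: float = 90.0) -> str:
    """Map a joint angle to the traffic-light colouring shown in the overlay.

    'warn' and 'danger' are illustrative placeholder thresholds; the real
    per-body-part values are the Smart Textiles criteria in Appendix 1.
    """
    if angle_deg < warn:
        return "green"   # good posture
    if angle_deg < danger:
        return "yellow"  # acceptable, but should be improved
    return "red"         # hazardous posture


# Example: an upper-arm flexion of 70 degrees would be flagged yellow.
print(posture_colour(70.0))
```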


Figure 10. Data Flow Diagram. Part 4

In the list below, the whole data flow is summarised in order to give a better understanding and a global view (a minimal sketch of the final mirroring step follows the list).

1. The task is created in IPS.
2. IPS connects to the HTC VIVE through OpenVR (inside IPS' code) and SteamVR (software distributed by Valve).
3. The user reacts to and interacts with this VR environment.
4. Her/his posture is collected by the Motion Capture equipment (Smart Textiles).
5. The data is sent to the Ergonomic Evaluation Platform.
6. The data of this platform is mirrored into the VR environment so the user can correct her/his posture.
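A minimal sketch of the final mirroring step (step 6) is shown below: assuming the evaluation software periodically saves its window to an image file (hypothetical path), a small loop keeps reloading that file into the SteamVR overlay so the user always sees the latest evaluation. As in the previous sketch, it uses the pyopenvr bindings and is an illustration of the approach rather than the actual program.

```python
import time
import openvr

# Create the overlay once, as in the previous sketch.
openvr.init(openvr.VRApplication_Overlay)
overlay = openvr.VROverlay()
handle = overlay.createOverlay("ergo_feedback", "Ergonomic feedback")
overlay.setOverlayWidthInMeters(handle, 0.30)
overlay.showOverlay(handle)

# Hypothetical image written and overwritten by the evaluation program.
FEEDBACK_IMAGE = "C:/ergo/feedback.png"

while True:
    # Reload the image so the panel mirrors the current ergonomic evaluation.
    overlay.setOverlayFromFile(handle, FEEDBACK_IMAGE)
    time.sleep(0.1)  # ~10 refreshes per second is enough for a status panel
```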

4.2 Simulation of the Assembly task using collaborative robot

The task was imported into IPS by making use of the CAD models provided for the university investigation. Figure 11 shows a 3D representation of the modelled task.


Figure 11. Assembly Task with Collaborative Robot.

There were, however, some issues regarding the creation of the task explained in section 1.2. The actual gripper that is going to be used at the Volvo factory is a vacuum gripper; however, this one could not be properly imported into IPS, which is why a Hand-E adaptive gripper was used instead. Nevertheless, this does not affect the objectives of this project, since what is actually studied is the ergonomic improvement that the implementation of this robot brings to this specific assembly task.


Afterwards, the task was performed in VR with the Smart Textiles equipment on. The task was performed both with and without the robot, as shown in the pictures below, in order to compare the ergonomic advantages of the task using the robot with the task without the robot.

Figure 12 shows the states before and after the operator makes any movements when the robot is implemented. Figure 13 shows the states before and after the operator makes any movements when the robot is not implemented.


Figure 12. User's sight with collaborative robot


Figure 13. User's sight without collaborative robot


5 Results

5.1 Ergonomic Evaluation Platform

The ergonomic evaluation platform was implemented in VR. The pictures below show how it looks. There are three ways of showing the feedback; the virtual screen can be:

Attached to the controllers (Figure 14),

Figure 14. Attached to the controller

Shown in any fixed direction of the virtual world (Figure 15),


Figure 15. World Forward

Placed in the corner of the user's sight at all times (Figure 16).


Figure 16. Corner of the Screen

With the system developed in this project, the user is able to see his/her ergonomic evaluation while performing assembly tasks.

Table 2 shows the objectives set at the beginning of this project (regarding this part of the investigation) and the extent to which they were actually reached.

Table 2. Objectives Reached

Objectives | State
Develop an ergonomic feedback system inside of the VR environment | The system was successfully developed
Use X-Sens as Mo-Cap equipment for this project | The Mo-Cap equipment that was used was not X-Sens, due to technical limitations
Implement the system inside of IPS | Although the system is not implemented inside of IPS, it can be seen inside VR when using IPS

5.2 Simulation and advantages of the Assembly task using collaborative robot.

The task was simulated as shown in section 4.2. The position data of the trunk, left arm and right arm was collected with the Smart Textiles equipment and then evaluated with the Smart Textiles software. The results are shown in the figures below.

Figure 17 corresponds to the task without the collaborative robot. The percentage, together with the colour, stands for the proportion of time that the arm has been in a correct posture (green), a medium posture (yellow) or an incorrect posture (red). The colouring criteria are described in more detail in Appendix 1, and a sketch of how such time proportions can be computed from the recorded posture samples is given after Figure 18.

Figure 17. Ergonomic Evaluation without Robot.

Figure 18 corresponds to the task using the collaborative robot.

Figure 18. Ergonomic Evaluation with Robot.
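As an illustration of how these percentages can be obtained from the recorded data, the sketch below counts the proportion of motion-capture frames spent in each colour category; the function name and the sample data are hypothetical, since the real classification is done by the Smart Textiles software.

```python
from collections import Counter
from typing import Dict, List


def time_proportions(colours_per_frame: List[str]) -> Dict[str, float]:
    """Percentage of frames spent in each posture category.

    'colours_per_frame' holds one classification ("green", "yellow" or
    "red") per motion-capture frame; at a fixed sampling rate, the
    proportion of frames equals the proportion of time.
    """
    counts = Counter(colours_per_frame)
    total = len(colours_per_frame)
    return {c: 100.0 * counts[c] / total for c in ("green", "yellow", "red")}


# Hypothetical 10-frame recording of one arm:
print(time_proportions(["green"] * 6 + ["yellow"] * 3 + ["red"] * 1))
# {'green': 60.0, 'yellow': 30.0, 'red': 10.0}
```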


Table 3. Objectives reached

Objectives | State
Creation of the assembly task in IPS | The task was successfully created
Creation of this task with and without the robot | They were both created
Implement gripper, screwdrivers, assembly instructions and so on | Time did not allow the task to be made more realistic


6 Discussion

6.1 Ergonomic Evaluation Platform

As shown in section 5.1, the system to show ergonomic information in the VR environment was successfully developed; however, some of the objectives that were intended to be reached could not be properly fulfilled, as shown in Table 2.

Nonetheless, the system works and makes it possible to see the ergonomic evaluation inside the VR environment when doing assembly tasks, which ultimately was the main purpose of this project.

The approach used in the construction of this system is considerably complex, and in order to run the evaluation software it is necessary to run five programs at the same time (IPS, Smart Textiles, the Smart Textiles server, the Smart Textiles Android application, and the program that overlays information in VR). This makes it difficult to use, and unless the user has deep knowledge of how the system works, a specialist is needed to set it up and run it. In addition, it makes the system more likely to fail, because if one program stops working, everything collapses.

Nevertheless, it appears to be a valid first prototype. Furthermore, it is necessary to consider that this system's audience is rather small; it might therefore be too time consuming, and probably unnecessary, to spend time and resources developing it further when it will only be used by experts in this matter, who can easily learn how to use the system as it is at this point.

Moreover, according to the literature review there is no system similar to the one developed in this project, and it can open interesting lines of investigation regarding how people behave when they are using VR, how they respond to different visual stimuli regarding their posture, and so on. It is also worth noting that the information can be shown in different ways, as presented in section 5.1, since different information can be shown for different purposes. For instance, instead of showing an ergonomic evaluation, information such as instructions in the form of videos or text can be shown as well, for purposes such as training workers or improving the instructions based on the performance of the people using them.
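As a rough illustration of this flexibility, the overlay could accept different kinds of content through a simple message interface. The JSON-over-socket format below is purely an assumption made for the sketch; it is not the protocol actually used between the Smart Textiles server and the overlay program.

```python
import json
import socket

def send_overlay_message(kind: str, payload: str,
                         host: str = "127.0.0.1", port: int = 5005) -> None:
    """Send one overlay message; kind could be 'evaluation', 'text' or 'video'."""
    message = json.dumps({"kind": kind, "payload": payload}).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(message + b"\n")

# Possible uses: ergonomic feedback, a text instruction, or a video clip.
# send_overlay_message("evaluation", "Right arm: 72% green, 20% yellow, 8% red")
# send_overlay_message("text", "Place the bracket before tightening the bolts")
# send_overlay_message("video", "instructions/step_03.mp4")
```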


6.2 Simulation and advantages of the assembly task using a collaborative robot

As the results have shown, there is a clear advantage when the collaborative robot is implemented. As can be seen in figures 17 and 18, the main difference is that with the robot only one of the arms reaches an incorrect position (and never a hazardous one), whereas without the robot both arms are in an incorrect position for a while and then reach a hazardous position for a considerable period of time. However, there are some things to discuss about these results.

- The task was tested by just one person. It should have been tried on more people in order to obtain results from a wider target group, but there was no time to run several tests.

- The weight of the piece is not included in the evaluation; however, the robot clearly brings an advantage in that respect, since the operator would not have to lift the piece.

- There was a lack of information from the company regarding the implementation of the robot. For instance:
  o It was not entirely clear what the robot was supposed to do (only lifting, or lifting and screwing).
  o The layout of the workstation was not specified by the company; there were only some guidelines, such as that the robot should be behind the engine and the operator on the other side. No base for the robot and no distance between the robot and the engine were specified.
  o The task time, as well as the additional tasks within that workstation, were not specified either.

- Only the position of the arms and trunk was collected, due to limitations of the equipment.

Considering all of the parameters mentioned in the list above, the outcome might change somewhat; however, it is clear that implementing the robot brings ergonomic advantages, and it has been shown quantitatively how the posture of the worker performing this task improves when the robot is implemented.


7 Conclusion.

This investigation aimed to reach two different objectives. Although both are related through their ergonomic perspective, they ultimately differed considerably in approach as well as in results.

The main conclusions extracted from this project are:

- It is now possible to see the ergonomic evaluation while performing assembly tasks inside a VR environment, which opens new lines of investigation, especially related to cognitive ergonomics (how people react to virtual feedback and so on).

- VR has great potential, and there are many tools available to do things that ten years ago were beyond imagination.

- Although the complexity of running this system is quite high, it works reasonably well and is able to show the information given by the Smart Textiles platform with a satisfactory latency.

- The robot brings a great improvement to the task from the ergonomic point of view. Although other aspects (productivity, workload and so on) could not be studied, ergonomically the robot clearly implies an advantage.

This feedback system is, according to the literature review, quite innovative, and as with all innovations there is always a better way to do it. It has been mentioned several times that the complexity of use might be too high for an average user, but there are more aspects that can be improved, such as the efficiency of the code, implementation in more programs and the possibility of launching the system just by using the VR controllers. All in all, this system can be seen as a starting point for showing information while performing assembly tasks in a VR environment. More information can be shown, and the system can be applied to more fields apart from ergonomics (for instance instructing workers, displaying additional information that might be important in the assembly, teleoperation of robots and so on).

7.1 Future Work.

The system developed in this project sets the base for future studies within the VR field. As mentioned in the literature review, there is a lack of knowledge about how people react to feedback in VR, and whether this feedback remains useful when users go from the virtual world to the real one. With the system in its current state, this kind of investigation can already be


carried out; however, further development of the system could make these investigations more efficient and more reliable.

To make these investigations more efficient, implementing this system as a functionality of IPS would be a great achievement, since any average user of IPS would then be able to launch the feedback system in VR, and the data could be analysed directly in IPS, making the whole process faster.

To make this investigation more reliable, implementing X-Sens rather than Smart Textiles is the best option for continuing this project, given that this motion capture equipment provides more data (it has more sensors) and more accurate data (it has lower latency). This would make the ergonomic evaluations more reliable and more accurate.

Regarding the HRC analysis, the future work includes:

- Testing the virtual scene with a wider target group, to be able to analyse numerically how large the margin for improvement is.

- Improving the virtual scene by using more data provided by the company.

- Testing it with an improved system incorporating the advances stated above in this section.

When these issues are addressed, the analysis of the possible improvement that the robot can achieve will be quite solid.


8 References.

Álvarez-Casado, E., Zhang, B., Sandoval, S.T., Pedro, M., 2016. Using ergonomic digital human

modelling in evaluation of workplace design and prevention of work-related musculoskeletal

disorders aboard small fishing vessels. Hum. Factors Ergon. Manuf. Serv. Ind. 26, 463–472.

https://doi.org/10.1002/hfm.20321

Andy Project - Home [WWW Document], n.d. URL https://andy-project.eu/ (accessed 2.18.19).

Antwi-Afari, M.F., Li, H., Edwards, D.J., Pärn, E.A., Seo, J., Wong, A.Y.L., 2017. Biomechanical

analysis of risk factors for work-related musculoskeletal disorders during repetitive lifting task

in construction workers. Autom. Constr. 83, 41–47.

https://doi.org/10.1016/j.autcon.2017.07.007

Aromaa, S., Väänänen, K., 2016. Suitability of virtual prototypes to support human factors/ergonomics

evaluation during the design. Appl. Ergon. 56, 11–18.

https://doi.org/10.1016/j.apergo.2016.02.015

Arvidsson, I., Dahlqvist, C., Enquist, H., Dr, T., Nordander, C., n.d. Åtgärdsnivåer mot

belastningsskada 20.

Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B., 2001. Recent advances in

augmented reality. IEEE Comput. Graph. Appl. 21, 34–47. https://doi.org/10.1109/38.963459

Bahrin, M.A.K., Othman, M.F., Azli, N.H.N., Talib, M.F., 2016. INDUSTRY 4.0: A REVIEW ON

INDUSTRIAL AUTOMATION AND ROBOTIC. J. Teknol. 78.

https://doi.org/10.11113/jt.v78.9285

BALANCE | Augmentation in Locomotion, through Anticipative, Natural and Cooperative control of

Exoskeletons. [WWW Document], n.d. URL http://www.balance-fp7.eu/ (accessed 2.18.19).

Battini, D., Persona, A., Sgarbossa, F., 2014. Innovative real-time system to integrate ergonomic

evaluations into warehouse design and management. Comput. Ind. Eng. 77, 1–10.

https://doi.org/10.1016/j.cie.2014.08.018

Bewegingsanalyse [WWW Document], 2016. . Roessingh Res. Dev. URL

http://www.rrd.nl/bewegingsanalyse/ (accessed 2.18.19).

Bhattacharya, A., 2014. Costs of occupational musculoskeletal disorders (MSDs) in the United States.

Int. J. Ind. Ergon. 44, 448–454. https://doi.org/10.1016/j.ergon.2014.01.008

Blair, S., Duthie, G., Robertson, S., Hopkins, W., Ball, K., 2018. Concurrent validation of an inertial

measurement system to quantify kicking biomechanics in four football codes. J. Biomech. 73,

24–32. https://doi.org/10.1016/j.jbiomech.2018.03.031

Boud, A.C., Haniff, D.J., Baber, C., Steiner, S.J., 1999. Virtual reality and augmented reality as a

training tool for assembly tasks, in: 1999 IEEE International Conference on Information

Visualization (Cat. No. PR00210). Presented at the 1999 IEEE International Conference on

Information Visualization (Cat. No. PR00210), pp. 32–36.

https://doi.org/10.1109/IV.1999.781532


Cherubini, A., Passama, R., Crosnier, A., Lasnier, A., Fraisse, P., 2016. Collaborative manufacturing

with physical human–robot interaction. Robot. Comput.-Integr. Manuf. 40, 1–13.

https://doi.org/10.1016/j.rcim.2015.12.007

Cognolato, M., 2012. Experimental validation of Xsens inertial sensors during clinical and sport

motion capture applications [WWW Document]. URL http://tesi.cab.unipd.it/40443/ (accessed

2.18.19).

David, G.C., 2005. Ergonomic methods for assessing exposure to risk factors for work-related

musculoskeletal disorders. Occup. Med. 55, 190–199. https://doi.org/10.1093/occmed/kqi082

Dukic, T., Rönnäng, M., Christmansson, M., 2007. Evaluation of ergonomics in a virtual

manufacturing process. J. Eng. Des. 18, 125–137.

https://doi.org/10.1080/09544820600675925

Egger, J., Gall, M., Wallner, J., Boechat, P., Hann, A., Li, X., Chen, X., Schmalstieg, D., 2017. HTC

Vive MeVisLab integration via OpenVR for medical applications. PLOS ONE 12, e0173972.

https://doi.org/10.1371/journal.pone.0173972

Falck, A.-C., Örtengren, R., Rosenqvist, M., 2014. Assembly failures and action cost in relation to

complexity level and assembly ergonomics in manual assembly (part 2). Int. J. Ind. Ergon. 44,

455–459. https://doi.org/10.1016/j.ergon.2014.02.001

Falck, A.-C., Rosenqvist, M., 2014. A model for calculation of the costs of poor assembly ergonomics

(part 1). Int. J. Ind. Ergon. 44, 140–147. https://doi.org/10.1016/j.ergon.2013.11.013

Feyen, R., Liu, Y., Chaffin, D., Jimmerson, G., Joseph, B., 2000. Computer-aided ergonomics: a case

study of incorporating ergonomics analyses into workplace design. Appl. Ergon. 31, 291–300.

https://doi.org/10.1016/S0003-6870(99)00053-8

Field, M., Stirling, D., Naghdy, F., Pan, Z., 2009. Motion capture in robotics review, in: 2009 IEEE

International Conference on Control and Automation. Presented at the 2009 IEEE International

Conference on Control and Automation (ICCA), IEEE, Christchurch, New Zealand, pp. 1697–

1702. https://doi.org/10.1109/ICCA.2009.5410185

Firth, N., n.d. Virtual reality: Meet founding father Jaron Lanier [WWW Document]. New Sci. URL

https://www.newscientist.com/article/mg21829226-000-virtual-reality-meet-founding-father-

jaron-lanier/ (accessed 2.12.19).

flexstructures GmbH Kaiserslautern [WWW Document], n.d. URL http://www.flexstructures.de/

(accessed 3.11.19).

Foxlin, E., 1996. Inertial Head-Tracker Sensor Fusion by a Complimentary Separate-Bias Kalman

Filter, in: VRAIS. https://doi.org/10.1109/VRAIS.1996.490527

Gu, X., Zhang, Y., Sun, W., Bian, Y., Zhou, D., Kristensson, P.O., 2016. Dexmo: An Inexpensive and

Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR, in:

Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI ’16.

Presented at the the 2016 CHI Conference, ACM Press, Santa Clara, California, USA, pp.

1991–1995. https://doi.org/10.1145/2858036.2858487


Haggag, H., Hossny, M., Nahavandi, S., Creighton, D., 2013. Real Time Ergonomic Assessment for

Assembly Operations Using Kinect, in: 2013 UKSim 15th International Conference on

Computer Modelling and Simulation. Presented at the 2013 UKSim 15th International

Conference on Computer Modelling and Simulation, pp. 495–500.

https://doi.org/10.1109/UKSim.2013.105

Hermann, M., Pentek, T., Otto, B., 2016. Design Principles for Industrie 4.0 Scenarios, in: 2016 49th

Hawaii International Conference on System Sciences (HICSS). Presented at the 2016 49th

Hawaii International Conference on System Sciences (HICSS), pp. 3928–3937.

https://doi.org/10.1109/HICSS.2016.488

Hu, B., Zhang, W., Salvendy, G., 2012. Impact of multimodal feedback on simulated ergonomic

measurements in a virtual environment: A case study with manufacturing workers. Hum.

Factors Ergon. Manuf. Serv. Ind. 22, 145–155. https://doi.org/10.1002/hfm.20293

Industrial Path Solutions–Fraunhofer-Chalmers Centre [WWW Document], n.d. URL

http://www.fcc.chalmers.se/software/ips/ (accessed 3.11.19).

Jose, J.A., 2012. Outcome measures and prognosis of WRMSD. Work 41, 4848–4849.

https://doi.org/10.3233/WOR-2012-0775-4848

Kilteni, K., Bergstrom, I., Slater, M., 2013. Drumming in Immersive Virtual Reality: The Body Shapes

the Way We Play. IEEE Trans. Vis. Comput. Graph. 19, 597–605.

https://doi.org/10.1109/TVCG.2013.29

Kim, S., Nussbaum, M.A., 2013. Performance evaluation of a wearable inertial motion capture system

for capturing physical exposures during manual material handling tasks. Ergonomics 56, 314–

326. https://doi.org/10.1080/00140139.2012.742932

Knight, W., 2014. How Human-Robot Teamwork Will Upend Manufacturing. MIT Technol. Rev.

117, 66–67.

Krüger, J., Lien, T.K., Verl, A., 2009. Cooperation of human and machines in assembly lines. CIRP

Ann. 58, 628–646. https://doi.org/10.1016/j.cirp.2009.09.009

Kushwaha, D.K., Kane, P.V., 2016. Ergonomic assessment and workstation design of shipping crane

cabin in steel industry. Int. J. Ind. Ergon., New Approaches and Interventions to Prevent Work

Related Musculoskeletal Disorders 52, 29–39. https://doi.org/10.1016/j.ergon.2015.08.003

Lee, W., Seto, E., Lin, K.-Y., Migliaccio, G.C., 2017. An evaluation of wearable sensors and their

placements for analyzing construction worker’s trunk posture in laboratory conditions. Appl.

Ergon. 65, 424–436. https://doi.org/10.1016/j.apergo.2017.03.016

LI, G., BUCKLE, P., 1999. Current techniques for assessing physical exposure to work-related

musculoskeletal risks, with emphasis on posture-based methods. Ergonomics 42, 674–695.

https://doi.org/10.1080/001401399185388

Lin, E., Ioannis, M., Nau, D., Degli, W., 1995. Contribution to Virtual Manufacturing Background

Research. Manuacturing Technol. Dir. Air Force Wright Lab. Air Force Syst. Command

Wright-Patterson Air Force Base Ohio 45433-6533.


Lind, C.M., 2017. Assessment and design of industrial manual handling to reduce physical ergonomics

hazards : – use and development of assessment tools - Semantic Scholar (Dissertation). KTH,

Stockholm, SWE.

Lipton, J.I., Fay, A.J., Rus, D., 2018. Baxter’s Homunculus: Virtual Reality Spaces for Teleoperation

in Manufacturing. IEEE Robot. Autom. Lett. 3, 179–186.

https://doi.org/10.1109/LRA.2017.2737046

Lowe, B., Dempsey, P., Jones, E., 2018. Assessment Methods Used by Certified Ergonomics

Professionals. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 62, 838–842.

https://doi.org/10.1177/1541931218621191

Madani, D.A., Dababneh, A., 2016. Rapid Entire Body Assessment: A Literature Review. Am. J. Eng.

Appl. Sci. 9, 107–118. https://doi.org/10.3844/ajeassp.2016.107.118

Mahdavian, N., Lind, C.M., Diaz Olivares, J.A., Iriondo Pascual, A., Högberg, D., Brolin, E., Yang,

L., Forsman, M., Hanson, L., 2018. Effect of Giving Feedback on Postural Working

Techniques. Presented at the 16th International Conference on Manufacturing Research,

incorporating the 33rd National Conference on Manufacturing Research, September 11–13,

2018, University of Skövde, Sweden, IOS Press, pp. 247–252.

Martin, C.C., Burkert, D.C., Choi, K.R., Wieczorek, N.B., McGregor, P.M., Herrmann, R.A., Beling,

P.A., 2012. A real-time ergonomic monitoring system using the Microsoft Kinect, in: 2012

IEEE Systems and Information Engineering Design Symposium. Presented at the 2012 IEEE

Systems and Information Engineering Design Symposium, pp. 50–55.

https://doi.org/10.1109/SIEDS.2012.6215130

McAtamney, L., Hignett, S., Hignett, S., 2004. Rapid Entire Body Assessment [WWW Document].

Handb. Hum. Factors Ergon. Methods. https://doi.org/10.1201/9780203489925-17

McAtamney, L., Nigel Corlett, E., 1993. RULA: a survey method for the investigation of work-related

upper limb disorders. Appl. Ergon. 24, 91–99. https://doi.org/10.1016/0003-6870(93)90080-S

Michalos, G., Karvouniari, A., Dimitropoulos, N., Togias, T., Makris, S., 2018. Workplace analysis

and design using virtual reality techniques. CIRP Ann. 67, 141–144.

https://doi.org/10.1016/j.cirp.2018.04.120

Michalos, G., Makris, S., Tsarouchi, P., Guasch, T., Kontovrakis, D., Chryssolouris, G., 2015. Design

Considerations for Safe Human-robot Collaborative Workplaces. Procedia CIRP, CIRPe 2015

- Understanding the life cycle implications of manufacturing 37, 248–253.

https://doi.org/10.1016/j.procir.2015.08.014

Moar, J.M.R., Alvarez-Campana, J.M., Míguez, J.L., González, L.M.L., Ramos, D.G., 2015.

Comparative study of the relevance of musculoskeletal disorders between the Spanish and the

European working population. Work 51, 645–656. https://doi.org/10.3233/WOR-152027

motion capture | Definition of motion capture in US English by Oxford Dictionaries [WWW

Document], n.d. URL https://en.oxforddictionaries.com/definition/us/motion_capture

(accessed 2.16.19).

MOTION CAPTURE | meaning in the Cambridge English Dictionary [WWW Document], n.d. URL

https://dictionary.cambridge.org/dictionary/english/motion-capture (accessed 2.16.19).


Mujber, T.S., Szecsi, T., Hashmi, M.S.J., 2004a. Virtual reality applications in manufacturing process

simulation. J. Mater. Process. Technol., Proceedings of the International Conference on

Advances in Materials and Processing Technologies: Part 2 155–156, 1834–1838.

https://doi.org/10.1016/j.jmatprotec.2004.04.401

Mujber, T.S., Szecsi, T., Hashmi, M.S.J., 2004b. Virtual reality applications in manufacturing process

simulation. J. Mater. Process. Technol., Proceedings of the International Conference on

Advances in Materials and Processing Technologies: Part 2 155–156, 1834–1838.

https://doi.org/10.1016/j.jmatprotec.2004.04.401

Murcia-López, M., Steed, A., 2018. A Comparison of Virtual and Physical Training Transfer of

Bimanual Assembly Tasks. IEEE Trans. Vis. Comput. Graph. 24, 1574–1583.

https://doi.org/10.1109/TVCG.2018.2793638

Noonan, D.P., Mountney, P., Elson, D.S., Darzi, A., Guang-Zhong Yang, 2009. A stereoscopic

fibroscope for camera motion and 3D depth recovery during Minimally Invasive Surgery, in:

2009 IEEE International Conference on Robotics and Automation. Presented at the 2009 IEEE

International Conference on Robotics and Automation (ICRA), IEEE, Kobe, pp. 4463–4468.

https://doi.org/10.1109/ROBOT.2009.5152698

Pallavicini, F., Toniazzi, N., Argenton, L., Aceti, L., Mantovani, F., 2015. Developing effective Virtual

Reality training for military forces and emergency operators: From technology to human

factors. Presented at the International Conference on Modeling and Applied Simulation, MAS

2015, Dime University of Genoa, pp. 206–210.

Pascual, A.I., Högberg, D., Kolbeinsson, A., Castro, P.R., Mahdavian, N., Hanson, L., 2019. Proposal

of an Intuitive Interface Structure for Ergonomics Evaluation Software, in: Bagnara, S.,

Tartaglia, R., Albolino, S., Alexander, T., Fujita, Y. (Eds.), Proceedings of the 20th Congress

of the International Ergonomics Association (IEA 2018), Advances in Intelligent Systems and

Computing. Springer International Publishing, pp. 289–300.

Piedrahita, H., 2006. Costs of Work-Related Musculoskeletal Disorders (MSDs) in Developing

Countries: Colombia Case. Int. J. Occup. Saf. Ergon. 12, 379–386.

https://doi.org/10.1080/10803548.2006.11076696

Ramos, D.G., Arezes, P.M., Afonso, P., 2017. Analysis of the return on preventive measures in

musculoskeletal disorders through the benefit–cost ratio: A case study in a hospital. Int. J. Ind.

Ergon., New Approaches and Interventions to Prevent Work Related Musculoskeletal

Disorders 60, 14–25. https://doi.org/10.1016/j.ergon.2015.11.003

Research, n.d. . Xsens 3D Motion Track. URL https://www.xsens.com/research/ (accessed 3.11.19).

Riedel, J.C.K.H., Pawar, K.S., 1997. The consideration of production aspects during product design

stages. Integr. Manuf. Syst. 8, 208–214. https://doi.org/10.1108/09576069710182027

Roetenberg, D., Luinge, H., Slycke, P., 2013. Xsens MVN: Full 6DOF Human Motion Tracking Using

Miniature Inertial Sensors 10.

Rosen, E., Whitney, D., Phillips, E., Ullman, D., Tellex, S., 2018. Testing Robot Teleoperation using

a Virtual Reality Interface with ROS Reality 5.


Rüßmann, M., Lorenz, M., Gerbert, P., Waldner, M., Justus, J., Harnisch, M., 2015. Industry 4.0: The

Future of Productivity and Growth in Manufacturing Industries 14.

Seth, A., Vance, J.M., Oliver, J.H., 2011a. Virtual reality for assembly methods prototyping: a review.

Virtual Real. 15, 5–20. https://doi.org/10.1007/s10055-009-0153-y

Seth, A., Vance, J.M., Oliver, J.H., 2011b. Virtual reality for assembly methods prototyping: a review.

Virtual Real. 15, 5–20. https://doi.org/10.1007/s10055-009-0153-y

Seymour, N.E., Gallagher, A.G., Roman, S.A., O’Brien, M.K., Bansal, V.K., Andersen, D.K., Satava,

R.M., 2002. Virtual Reality Training Improves Operating Room Performance: Results of a

Randomized, Double-Blinded Study. Ann. Surg. 236, 458.

Soffel, F., Zank, M., Kunz, A., 2016. Postural Stability Analysis in Virtual Reality Using the HTC

Vive, in: Proceedings of the 22Nd ACM Conference on Virtual Reality Software and

Technology, VRST ’16. ACM, New York, NY, USA, pp. 351–352.

https://doi.org/10.1145/2993369.2996341

SteamVR [WWW Document], n.d. URL http://steamvr.com (accessed 3.11.19).

Suznjevic, M., Mandurov, M., Matijasevic, M., 2017. Performance and QoE assessment of HTC Vive

and Oculus Rift for pick-and-place tasks in VR, in: 2017 Ninth International Conference on

Quality of Multimedia Experience (QoMEX). Presented at the 2017 Ninth International

Conference on Quality of Multimedia Experience (QoMEX), pp. 1–3.

https://doi.org/10.1109/QoMEX.2017.7965679

Tran, N., Rands, J., Williams, T., 2018. A Hands-Free Virtual-Reality Teleoperation Interface for

Wizard-of-Oz Control 5.

virtual reality | Definition of virtual reality in English by Oxford Dictionaries [WWW Document], n.d.

. Oxf. Dictionaries Engl. URL https://en.oxforddictionaries.com/definition/virtual_reality

(accessed 2.18.19).

Watson, Z., 2017. VR for News: The New Reality? 48.

Weistroffer, V., Paljic, A., Callebert, L., Fuchs, P., 2013. A Methodology to Assess the Acceptability

of Human-robot Collaboration Using Virtual Reality, in: Proceedings of the 19th ACM

Symposium on Virtual Reality Software and Technology, VRST ’13. ACM, New York, NY,

USA, pp. 39–48. https://doi.org/10.1145/2503713.2503726

Wong, S.B., Richardson, S., 2010. Assessment of working conditions in two different semiconductor

manufacturing lines: Effective ergonomics interventions. Hum. Factors Ergon. Manuf. Serv.

Ind. 20, 391–407. https://doi.org/10.1002/hfm.20189

Yabukami, S., Kikuchi, H., Yamaguchi, M., Arai, K.I., Takahashi, K., Itagaki, A., Wako, N., 2000.

Motion capture system of magnetic markers using three-axial magnetic field sensor. IEEE

Trans. Magn. 36, 3646–3648. https://doi.org/10.1109/20.908928

Yan, X., Li, H., Li, A.R., Zhang, H., 2017. Wearable IMU-based real-time motion warning system for

construction workers’ musculoskeletal disorders prevention. Autom. Constr. 74, 2–11.

https://doi.org/10.1016/j.autcon.2016.11.007


Zawadzki, P., Żywicki, K., 2016. Smart Product Design and Production Control for Effective Mass

Customization in the Industry 4.0 Concept. Manag. Prod. Eng. Rev. 7, 105–112.

https://doi.org/10.1515/mper-2016-0030

Zhang, J.-T., Novak, A.C., Brouwer, B., Li, Q., 2013. Concurrent validation of Xsens MVN

measurement of lower limb joint angular kinematics. Physiol. Meas. 34, N63–N69.

https://doi.org/10.1088/0967-3334/34/8/N63

Zhong, R.Y., Xu, X., Klotz, E., Newman, S.T., 2017. Intelligent Manufacturing in the Context of

Industry 4.0: A Review. Engineering 3, 616–630. https://doi.org/10.1016/J.ENG.2017.05.015

Zyda, M., 2005. From visual simulation to virtual reality to games. Computer 38, 25–32.

https://doi.org/10.1109/MC.2005.297


1. Collaborative workstations: workstations which make use of a collaborative robot

Appendix 1: Colouring Criteria

Table 4. Colouring Criteria Arms

(Arvidsson et al., 2017), (Lind, 2017)

Table 5. Colouring Criteria Trunk

Table 6. Colouring Criteria Arm Speed