Linköping University | Department of Management and Engineering
Master thesis, 30 credits | Master of Science – Mechanical Engineering
Spring 2021 | LIU-IEI-TEK-A--21/04066--SE
Linköping University
SE-581 83 Linköping, Sweden
+46 13 28 10 00
www.liu.se
Automation of Offline Programming
for Assembly and Welding Processes
in CATIA/DELMIA using VBA
MASTER THESIS
Author: Henrik Müller-Wilderink
Examiner: Johan Persson, Linköping University
Supervisor: Mehdi Tarkian, Linköping University
MASTER THESIS INTRODUCTION
ABSTRACT
Programming industrial robots for welding or part-manipulation tasks is a time-consuming and
complicated process, which prevents many companies from implementing robot systems and
exploiting their advantages. To reduce the programming time, research is investigating ways to
automate this process and reduce manual labour.
In this thesis a concept for automating the programming process of industrial robots was
investigated using EXCEL VBA and CATIA/DELMIA. It was applied to an industrial grating
model of varying sizes and configurations, resulting in a time reduction of 99% compared to
manual creation. For this, the model was first created automatically from scratch for the
required configuration, after which a robot motion was generated fully automatically. The
concept and modelling approach are described, and the automation approach is detailed. Finally,
the results are analysed and discussed.
TABLE OF CONTENTS
1 INTRODUCTION
1.1 BACKGROUND
1.2 PROJECT GOALS
1.3 SCOPE
1.4 RESEARCH QUESTIONS
1.5 DELIMITATIONS
2 STATE OF THE ART
2.1 DESIGN AUTOMATION
2.2 INDUSTRIAL ROBOTS
2.3 MATHEMATICAL TRANSFORMATIONS
2.4 FIXTURE DESIGN
2.5 MIG/MAG WELDING
3 METHODOLOGY
3.1 PROOF OF CONCEPT
3.2 SPECIFICATION
3.3 CONCEPT GENERATION
3.4 MODELLING
3.5 AUTOMATION
3.6 ANALYSIS
4 RESULTS
4.1 PRODUCT DESCRIPTION
4.2 PROOF OF CONCEPT
4.3 SPECIFICATION
4.4 CONCEPT GENERATION
4.4.1 Concept Generation of the CAD Model
4.4.2 Concept Generation of the Fixture Design
4.4.3 Concept Generation of the RE and RM
4.5 MODELLING
4.5.1 Modelling the CAD Model
4.5.2 Modelling the RE and RM
4.6 AUTOMATION
4.6.1 Automating the CAD Model Generation
4.6.2 Automating the RM Generation
4.6.2.1 Automation of the Tag Generation
4.6.2.2 Automation of the Task Generation
4.7 ANALYSIS
5 DISCUSSION
6 CONCLUSION
7 REFERENCES
8 APPENDIX
8.1 APPENDIX A
8.2 APPENDIX B
8.3 APPENDIX C
8.4 APPENDIX D
TABLE OF FIGURES
FIGURE 1: INDUSTRIAL GRATINGS
FIGURE 2: REPRESENTATION OF THE DATA FLOW
FIGURE 3: TOP-DOWN MODELLING APPROACH
FIGURE 4: THE DIFFERENT STAGES OF MORPHOLOGICAL (LEFT) AND TOPOLOGICAL TRANSFORMATION (RIGHT) ACCORDING TO [10]
FIGURE 5: COMPONENTS OF INDUSTRIAL ROBOTS
FIGURE 6: COMMON ROBOT COORDINATE FRAMES (LEFT) AND ROBOT REFERENCE FRAMES (RIGHT) ACCORDING TO [14]
FIGURE 7: THE STEPS OF OFFLINE PROGRAMMING ACCORDING TO [1]
FIGURE 8: TRANSFORMATION OF COORDINATE SYSTEMS
FIGURE 9: STEPS OF THE FIXTURE DESIGN PROCESS [20]
FIGURE 10: VISUALIZATION OF THE PROCESS
FIGURE 11: FLOWCHART OF THE DIRECT AND INVERSE KINEMATIC OF AXIS TRANSFORMATIONS
FIGURE 12: COMPARISON OF THE ELAPSED TIME BETWEEN THE MANUAL AND AUTOMATIC OLP PROCESS
FIGURE 13: DISTRIBUTION OF THE ELAPSED TIME FOR A SINGLE MODEL
FIGURE 14: SUBCLASS "OUTER STRUCTURE" WITH TEMPLATE COMPONENTS
FIGURE 15: SUBCLASS "INNER STRUCTURE" WITH TEMPLATE COMPONENTS
FIGURE 16: SUBCLASS "WALKWAY" WITH TEMPLATE COMPONENTS
FIGURE 17: GUI FOR THE POC
FIGURE 18: FLOWCHART OF THE "GENERATE ROBOT MOTION" CODE, WITH THE MAIN CODE (LEFT), "CREATE TAG" CODE (MIDDLE) AND "CREATE MOTION" CODE (RIGHT)
FIGURE 19: THREE DIFFERENT CONFIGURATIONS FROM THE PRODUCT LINE
FIGURE 20: INTERACTION OF THE BAER AND STOED COMPONENTS WITH THE PLANEFIXTURE COMPONENT
FIGURE 21: CONCEPTS I AND II
FIGURE 22: CONCEPTS III AND IV
FIGURE 23: COMBINATION TABLE OF THE FOUR CONCEPTS (LEFT) AND CHOSEN CONCEPT (RIGHT), ROBOT MODELS BY [39]
FIGURE 24: SKELETON MODEL
FIGURE 25: SKETCH OF THE BASIC HLCT MODELLING PROCESS
FIGURE 26: PRODUCT TREE OF THE RE
FIGURE 27: PROCESS WITH FIRST AND SECOND LEVEL SUB-PROCESSES
FIGURE 28: INFORMATION HUB FOR THE REFERENCE CREATION AND HLCT INSTANTIATION
FIGURE 29: KB (AND IE) FOR THE REFERENCE CREATION IN THE SKELETON
FIGURE 30: FLOWCHART OF THE "CAD MODEL GENERATION" CODE (LEFT), "CONFIGURE SKELETON" (MIDDLE) AND "CREATE COMPONENTS" (RIGHT)
FIGURE 31: INFORMATION HUB FOR THE PLACE/WELD/APPROACH TAG CREATION
FIGURE 32: MAIN TAG CREATION CODE (LEFT) AND "CREATE PLACE & WELD TAGS" CODE (RIGHT)
FIGURE 33: INFORMATION HUB FOR PICK/APPROACH TAG CREATION
FIGURE 34: "MOVE WORKPIECE COMPONENTS" CODE (LEFT) AND "CREATE PICK TAGS" CODE (RIGHT)
FIGURE 35: INFORMATION HUB AND KB FOR THE ASSEMBOT RM CREATION
FIGURE 36: CREATION OF THE RM FOR ASSEMBOT (LEFT) AND "SEARCH TAG LIST" SUB-PROCESS (RIGHT)
FIGURE 37: "CREATE MOTION" (LEFT) AND "UNLOAD GALLERDURK" (RIGHT) SUB-PROCESSES OF THE ASSEMBOT RM CREATION
FIGURE 38: COMPARISON OF THE ELAPSED TIME BETWEEN THE MANUAL AND AUTOMATIC OLP PROCESS
FIGURE 39: DISTRIBUTION OF ELAPSED TIME FOR A MODEL OF SIZE 916X916 [MM]
TABLE OF TABLES
TABLE 1: ADVANTAGES AND DISADVANTAGES OF ROBOTS IN THE PRODUCTION CYCLE ACCORDING TO [14]
TABLE 2: HLCTS IDENTIFIED IN THE PRODUCT
TABLE 3: EXCERPT OF THE LIST OF DESIGN SPECIFICATIONS
TABLE 4: AUTOMATION INDEX FOR THE DIFFERENT ITERATIONS OF CAD AUTOMATION
TABLE 5: AUTOMATION INDEX FOR THE DIFFERENT ITERATIONS OF TAG AUTOMATION
TABLE 6: AUTOMATION INDEX FOR THE DIFFERENT ITERATIONS OF TASK AUTOMATION
TABLE 7: TIME ESTIMATION FOR THE INDIVIDUAL COMPONENTS
TABLE OF ABBREVIATIONS
AI: Automation Index
API: Application Programming Interface
CAD: Computer-Aided Design
CAE: Computer-Aided Engineering
CAFD: Computer-Aided Fixture Design
CAM: Computer-Aided Manufacturing
CAx: Computer-Aided x
CBR: Case-Based Reasoning
DOF: Degrees of Freedom
GUI: Graphical User Interface
HLCt: High Level CAD Templates
IE: Inference Engine
KB: Knowledge Base
KBE: Knowledge-Based Engineering
MAG: Metal Active Gas
MDO: Multidisciplinary Design Optimization
MIG: Metal Inert Gas
OLP: Offline Programming
POC: Proof of Concept
RE: Robot Environment
RM: Robot Motion
STEP: Standard for the Exchange of Product data
VBA: Visual Basic for Applications
YPR: Yaw, Pitch & Roll
1 Introduction
Traditionally, robots are programmed on site by workers familiar with robot programming, a
method called online programming. The robot motion (RM) is taught step by step by means of the
lead-through method. Each motion and action of the robot, e.g. how the workpiece is handled to
avoid collisions, must be programmed directly. As the process requires direct contact with the
robot, the workers are exposed to the dangers of the workplace. Online programming is time-
consuming and prohibits the usage of the robot until the program is sufficiently tested and
implemented [1]. To counteract these disadvantages, offline programming (OLP) was
developed. In OLP the 3D CAD (computer-aided design) data of the workpiece is used to
simulate the robot environment (RE). The robot is trained virtually in this environment, thus
avoiding the disadvantage of the robot being blocked from production use [2]. Additionally, this
process allows the programming to be conducted simultaneously with the development of the
workpieces, rather than in the traditional sequential way. This way, changes to the CAD models
can be integrated more quickly. The testing phase of the robot can be minimized through
simulations in the CAD environment. It is, however, still advisable to test the robot program
before starting production, to ensure its safety and reliability. OLP nevertheless remains a
tedious process, as the robot path is still programmed by hand [1]. A potential way of decreasing
the manual input required for OLP is investigated and developed in this master thesis, by
automating the RM creation process through an externally coded framework.
Small and medium-sized enterprises are currently unable to make use of robots, because the
time-consuming programming process does not pay off for their small product quantities [3].
Automated RM generation could enable the use of manufacturing robots even for small product
quantities.
1.1 Background
Industrial gratings, as shown in Figure 1, are widely used in public and industrial buildings, as
well as in factories and workshops. As of now, they are assembled and welded by hand by a
trained worker in the company. This is a time-consuming process, as all adjustments,
measurements and welds must be performed by the worker, and the quality depends on the
worker's physical condition. This is where the collaboration between the company, which is
not named here, and the university comes into play. Through the implementation of a handling
and welding robot, the dependency on manual labour could be reduced and the quantity of
manufactured gratings increased. By additionally automating the OLP process, the robot would
be usable for user-customizable gratings with small batch sizes. This is to be investigated in
this master thesis.
Figure 1: Industrial gratings
1.2 Project Goals
In this project the welding process of gratings is to be automated. This is achieved by means
of parametric models and macro-based RM creation. The currently existing models are rebuilt
in CATIA with integrated parameters, which allows external control of the shape and
configuration of the models through EXCEL-based macros. Based on these models, an RM is
created, defining the welding program used by the robot. This process is automated through
EXCEL-based macros, similar to the model creation. The whole process is controlled through a
GUI (graphical user interface) in EXCEL, where all specifications necessary to configure the
model can be made. The data flow of this process is shown in Figure 2.
Figure 2: Representation of the data flow
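The data flow of Figure 2 can be sketched as a top-level EXCEL macro that reads the configuration from the GUI worksheet and calls the generation routines in sequence. This is a minimal illustration only: the sheet name "Configuration", the cell addresses and the two Generate* routine names are assumed placeholders, not the thesis implementation.

```vba
Option Explicit

' Minimal sketch of the top-level driver macro (names are illustrative).
Sub RunAutomation()
    Dim catia As Object
    ' Attach to the running CATIA session via COM (late binding).
    On Error Resume Next
    Set catia = GetObject(, "CATIA.Application")
    On Error GoTo 0
    If catia Is Nothing Then
        MsgBox "Start CATIA before running the automation."
        Exit Sub
    End If

    ' Configure model: read the user's input from the GUI worksheet.
    Dim gratingWidth As Double, gratingDepth As Double
    With ThisWorkbook.Worksheets("Configuration")
        gratingWidth = .Range("B2").Value
        gratingDepth = .Range("B3").Value
    End With

    ' Generate CAD model, then generate robot motion.
    GenerateCadModel catia, gratingWidth, gratingDepth
    GenerateRobotMotion catia
End Sub

' Stubs standing in for the actual generation macros.
Sub GenerateCadModel(catia As Object, w As Double, d As Double)
    ' Would configure the skeleton and instantiate the templates.
End Sub

Sub GenerateRobotMotion(catia As Object)
    ' Would create the tags and tasks in DELMIA.
End Sub
```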
1.3 Scope
The scope of the project is defined through its restrictions and challenges. As this is a master
thesis, there is a clear restriction on the time available for the project: a deadline of 20
weeks is given, in which the task is to be completed. Furthermore, the tools used set a
boundary on what can be implemented. It was decided to use CATIA for creating the CAD
models, DELMIA for creating the RM, and the programming language Visual Basic in EXCEL to
automate the process, as the student has previous experience with these tools. The size of the
models is restricted by the specifications of the robot, e.g. its reach and payload. The
challenges facing the project are to not exceed the time limit, as well as to deliver good
results and a working concept. As this is a cooperation between the university and a company,
all parties involved should be satisfied with the outcome of the project.
The main goal of this master thesis is, as described in chapter 1.2, the automation of the welding
process. A possible way to evaluate how automated the process is, is to look at the number of
times a manual input is needed and how much information is necessary at a given input. By
reducing these numbers, especially the first one, the degree of automation is increased. To keep
within the time limit, the fixtures required for welding are disregarded and will not be included
in this automation process. They do, however, remain as an option to expand the project if time
allows. In that case, fixtures for the generated models would be generated as well, and further
RMs would be created which assemble the parts in the fixture prior to welding. These two steps
would be included between the steps "generate CAD model" and "generate robot motion" in
Figure 2. This extension is, however, entirely optional and solely dependent on the time available.
1.4 Research Questions
In this thesis the following research questions will be investigated:
I. How can VBA script be used to automate the OLP process in CATIA/DELMIA?
II. What influence does the design of the CAD model have on the automation process and
how should the design be structured to allow for automation?
As proof of fulfilment, the deliverables mentioned above will be used. A functioning VBA script
will confirm research question I. By comparing the number of manual steps necessary in the
process, a direct evaluation of the level of automation can be made, as shown in chapter 3.6.
Furthermore, a flowchart detailing the automation and modelling process will be provided to
address questions I and II. All findings made throughout the thesis will be documented and
presented to show the influence of the design of the CAD model on the automation process, see
question II.
1.5 Delimitations
To fit the research to the size of a master thesis, a few delimitations were made. Firstly, the
thesis will only investigate the automation of the OLP process using CATIA/DELMIA and VBA
script; other CAD programs such as SOLIDWORKS or CREO, or OLP programs such as
RobotStudio, will not be considered. Secondly, the automation framework is created for a pre-
assembled workpiece "floating" in space in the CAD program. As stated before, an automatic
fixture generation and assembly process is optional; if it is implemented, it would negate this
delimitation. Thirdly, no sensor technology for detecting the workpiece, as described in [4],
will be incorporated into the thesis work.
MASTER THESIS STATE OF THE ART
2 State of the Art
Increasing efficiency has always been, and will remain, important in a competitive market.
One way to increase efficiency is through automation. Automation describes a method or
technique of governing an action, or a collection of actions, with the aim of minimizing
human interaction [5] [6]. A large step towards automation in the design process has been made
through CAD tools. In recent years, knowledge-based engineering (KBE) systems have emerged
as the next step in automating the design process. A way of automating the manufacturing
process is the use of industrial robots, programmed to do specific, repetitive tasks
otherwise performed by human workers.
In the following chapter the state of the art in design automation and industrial robot technology
is presented. It starts with how KBE can be utilized to further automate the design process in
CAD systems, and how the modelling of components and products should be approached to achieve
a high level of automation. Afterwards, a short introduction to industrial robots and methods
for programming them is given. The process of fixture design is then explored, with a focus on
welding fixtures and a brief introduction to the advancements in fixture design automation.
Finally, the chapter is concluded with a short overview of the welding processes MIG and MAG.
2.1 Design Automation
Arguably the most important aspect of design automation is CAD, and by extension CAx,
tools. These refer to the computer-aided processes in a product lifecycle, where the x can be
substituted by the process in use, such as Design (CAD), Manufacturing (CAM) or Engineering
(CAE) [7]. CAD tools allow the engineer to model and design components virtually; materials
can be assigned directly, and the manufacturing process can be programmed through CAM.
Through this, the productivity of the designer, as well as the quality of the design, can be
increased [8].
The engineer controls the design process through various tools offered by the CAD
software, so-called modelling operations. To begin, simple operations (2D sketches) or
individual elements (points, lines, circles, etc.) are created. To fully define them, constraints
are needed; these can range from fixing an element to a certain plane to stating its length or
coordinates. From these, surfaces, 3D shapes and solids can be created and further
manipulated using design features, such as ribs, patterns, or chamfers, to create the final design
[9]. With 3D constraints, these parts can then be added to an assembly, creating the final product.
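As a concrete illustration of these modelling operations, the following macro drives CATIA through its COM automation interface to create a 2D sketch and extrude it into a solid with a pad. It is a sketch under the assumption of a running CATIA V5 session; the geometry values are arbitrary example dimensions.

```vba
' Create a rectangular sketch on the xy plane and pad it into a solid.
' Assumes CATIA V5 is running; dimensions are arbitrary examples.
Sub CreatePadExample()
    Dim catia As Object, partDoc As Object, part As Object
    Set catia = GetObject(, "CATIA.Application")
    Set partDoc = catia.Documents.Add("Part")
    Set part = partDoc.Part

    ' Create a 2D sketch on the xy plane of the main body.
    Dim body As Object, sketch As Object, factory2D As Object
    Set body = part.MainBody
    Set sketch = body.Sketches.Add(part.OriginElements.PlaneXY)

    ' Open the sketch for edition and draw a closed rectangle.
    Set factory2D = sketch.OpenEdition()
    factory2D.CreateLine 0, 0, 100, 0
    factory2D.CreateLine 100, 0, 100, 50
    factory2D.CreateLine 100, 50, 0, 50
    factory2D.CreateLine 0, 50, 0, 0
    sketch.CloseEdition

    ' Extrude the sketch into a solid with a pad (a "core" feature).
    part.ShapeFactory.AddNewPad sketch, 20
    part.Update
End Sub
```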
When designing products, there are two traditional approaches that can be followed, called
bottom-up and top-down. In the bottom-up approach, all parts are created separately from one
another and assembled in their final stage. When changes to the product occur, all parts must
be reworked independently. This approach is therefore not suited for design automation [10]
and will not be discussed further. The top-down approach starts at the highest level, where
a reference model is created. This reference model, called skeleton in Figure 3, contains
information about the components in the lower levels, such as locations or the lengths of
certain features. These are transferred down to the components, which themselves might have
lower-level components they supply information to. This allows the model to be controlled from
the top and thus enables the use of design automation, increasing the flexibility of the
model [10].
Figure 3: Top-down modelling approach
Following a clear modelling strategy is an important step in obtaining a robust model, as it
reduces the risk of errors and problems that can occur through geometry changes [10]. While the
top-down approach is generally used in assembly design, it still passes information down to the
lowest level, the individual parts. Another modelling strategy is needed for part design, to
make use of this information in a robust way. Here the resilient modelling strategy, developed
by Gebhard and further described in [9], can be used. The modelling operations described before
are divided into six groups to give a clear hierarchy of when and how the different features are
to be used. Similarly to the top-down approach, it starts with the reference group, which would
be linked to the information received from that approach. This is followed by the construction
group, containing all features necessary for the construction of the model, such as surfaces or
curves. Afterwards, the core features are created in the core group; these are made up of all
extrudes, sweeps or other operations adding material to the model. The fourth group, detail,
comprises the same kinds of operations used to remove material. In the fifth group,
modify, existing features are changed through operations such as patterns or mirrors. Finally,
the quarantine group consists of features that should not be linked to by other features.
These are chamfers, blends, or rounds, where it is important to start with the largest and
decrease to the smallest [9].
According to [11], almost 80% of the eventual product costs are determined in the early product
phase, a point where there is little knowledge about the product. A flexible CAD model,
achieved by means of design automation, is therefore crucial for exploring the design space to
find optimal solutions [12]. This is where KBE can be used. KBE is defined by various sources
[10], [11], [12] & [13] as a technology used to record engineering knowledge and store it in a
knowledge base (KB). From there it can be accessed through a reasoning mechanism, called the
inference engine (IE), and reused to solve engineering problems. KBE can be traced back to the
1960s, where it was presented as a means to reduce production time and cost by automating
repetitive tasks [12]. It is important to note that KBE can only model systems based on logic,
which is why it works well for automating repetitive tasks. The creative part of the design
process should be done by the engineer; KBE should only be a means to extend the time
available for creative design work. To create a KBE system, the restrictions acting on the
system, such as industry or company standards, costs or others, must be gathered and stored in
the KB [12]. This information can then be extracted by the IE, also called the inference
mechanism, which takes the form of a coded program [13]. Using additional information provided
through a GUI, it can manipulate the model according to the engineer's desires [12] [13].
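A minimal example of this KB/IE split in EXCEL VBA: the knowledge base is a worksheet of rules, and the inference engine is a function that looks up the applicable rule. The sheet name "KB", its column layout and the rule content are assumptions made purely for illustration.

```vba
' Hypothetical KB/IE sketch: sheet "KB" holds one rule per row, with the
' maximum admissible span in column A and the beam profile in column B.
Function InferBeamProfile(span As Double) As String
    Dim kb As Worksheet, row As Long
    Set kb = ThisWorkbook.Worksheets("KB")

    ' IE: scan the rules top-down and return the first profile whose
    ' limit covers the requested span.
    row = 2
    Do While kb.Cells(row, 1).Value <> ""
        If span <= kb.Cells(row, 1).Value Then
            InferBeamProfile = kb.Cells(row, 2).Value
            Exit Function
        End If
        row = row + 1
    Loop
    InferBeamProfile = "NO RULE FOUND"
End Function
```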
To allow automatic modification of CAD components, these must be modelled in a parametric way.
In [10] a method called high level CAD templates (HLCt) is presented, with which KBE and CAD
tools can be utilized to their full potential regarding design automation. The authors divide
the modification process into morphological transformations, changing existing components, and
topological transformations, each of which can be further divided into four stages, see Figure 4.
The assignment of parameters or logic rules to measurements allows quick modification of the
geometry by changing a parameter value. This process can be automated through a script and
linked to a GUI. Using the example of Figure 4 (left), the length of the rectangle can be changed
by changing the parameter in the GUI and updating the component. This is useful when the same
part is needed in different sizes, but does not allow for the configuration of different
components in a single product. To overcome this, topological transformation can be used. It
describes the removal, replacement, or addition of components or instances through different
stages of instantiation, see Figure 4 (right). Instantiation is the process of placing a
template, accessed from the KB, into a new surrounding, creating a new instance. Using scripts,
instances can be created automatically, and linking the template to constraints allows the
generation of different geometries from a generic template. [10]
Figure 4: The different stages of morphological (left) and topological transformation (right) according to [10]
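The distinction between the two transformation types can be sketched in code. The following Python fragment is purely illustrative: the class and parameter names are hypothetical, and the thesis realizes these concepts through CATIA templates driven by VBA rather than through Python.

```python
# Sketch of morphological vs. topological transformation for a
# template-driven model. All names here are hypothetical; a real
# implementation would drive CATIA/DELMIA through its automation API.

class RectangleTemplate:
    """A parametric template (HLCt): its geometry follows its parameters."""
    def __init__(self, length, width):
        self.length = length
        self.width = width

    def morph(self, **params):
        # Morphological transformation: change parameter values;
        # the existing instance updates its shape accordingly.
        for name, value in params.items():
            setattr(self, name, value)

class Product:
    """A product assembled from template instances."""
    def __init__(self):
        self.instances = []

    def instantiate(self, template_cls, **params):
        # Topological transformation: add a new instance of a template
        # to the product (removal/replacement would work analogously).
        instance = template_cls(**params)
        self.instances.append(instance)
        return instance

product = Product()
plate = product.instantiate(RectangleTemplate, length=100, width=50)
plate.morph(length=200)   # same component, new size
print(len(product.instances), plate.length)  # 1 200
```

Changing `length` corresponds to the parameter change in Figure 4 (left); calling `instantiate` again with different parameters corresponds to the instantiation stages in Figure 4 (right).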
By using a GUI, a KB, and an IE, the HLCts can be combined with the top-down approach to
create the dynamic top-down approach. A reference model, refer to Figure 3, provides the
information and constraints where the HLCts must be instantiated and what shape they will
take. The IE locates the corresponding HLCt, information and constraints. The process can be
influenced through the GUI, by changing parameters, selecting configurations (detailing what
templates are to be instantiated) or choosing which components to remove/replace. [10]
2.2 Industrial Robots
Industrial robots are widely used in factories to increase the productivity, quality of the results
and safety in the work environment [14]. There is, however, not one type of industrial robot.
Different robot types exist for the different tasks they are meant to perform. The following
chapter will give a short introduction into industrial robots and describe how they can be
programmed.
The word “robot” originates from the play R.U.R. (Rossum’s Universal Robots), written in 1920 by Karel Čapek, where the word “robota” (Czech for “forced labor”, “slavery” or “worker”) was used for manufactured, artificial people [14] [15]. In engineering terms, robots
are described as “devices that mimic living creatures” [14] or “automatically controlled,
programmable, multipurpose manipulators” [16]. Here it is important to note that there is a general difference between manipulators and robots. Manipulators refer to devices that are being controlled by human operators [14], for example a lift or a crane. They are not programmable and only perform a certain task; in the case of a lift, the task is the change in elevation of transported components. Robots, by contrast [16], are controlled through a computer and run a pre-determined program [14]. By changing the program, they can perform different tasks, which allows the same robot to be used for a multitude of different problems. While introducing industrial robots into the production cycle can have advantages, such as increased automation, it also entails some disadvantages. A few of each are displayed in Table 1.
Table 1: Advantages and disadvantages of robots in the production cycle according to [14]

Advantages:
- Potential increase of productivity, safety, efficiency, quality, and consistency
- Can work in hostile/hazardous environments
- Do not require breaks
- Increased accuracy

Disadvantages:
- Cause economic hardship by replacing human workers
- Cannot respond in emergencies unless coded to do so (requires a prediction of the emergency)
- Need programming not to harm workers
- Limited capabilities in DOF and real-time response
- High cost (initial cost, programming, training)
There are four general classifications for robots: fixed-sequence robots, playback robots, numerical control robots, and intelligent robots. Fixed-sequence robots only perform consecutive, previously established tasks. Thus, they are specialized for certain problems, making them hard to change. Playback robots are, as the name suggests, playing back
previously recorded motions. A human operator leads them through the motion, manually
performing tasks step by step. This programming process is called online programming. The
drawbacks of this programming method are that it requires direct interaction with the robot, rendering it unusable for the duration of the programming. Furthermore, it can be a tedious and time-consuming process, as it is not intuitive and collisions with the workpiece are difficult to avoid. In addition, testing is needed to reach the required reliability and safety standards [1]. Numerical control robots are controlled through a movement program that can be created through OLP. OLP makes use of the CAD models to program and simulate the RM offline in a
digital factory. This allows the robot to be in use while new programs are created. As the
programmer is disconnected from the robot, they are not exposed to dangerous environment
during the process [1]. The OLP process is described in more detail at the end of this chapter.
The final classification, intelligent robots, describes robots that can understand the environment and finish tasks even though their adjoining conditions are subject to change.
This can be accomplished through sensors giving feedback to the robot and a program deciding
and adjusting the next action. [14]
The main body of a robot is made up of joints, links and other required structural elements,
compare with Figure 5. The last joint, called hand, is connected to the end effector, the part of
the robot that interacts with objects and the environment. This can be a welding gun, a gripper
or other tools and is generally not provided by the robot manufacturer. The end effector is
usually specifically designed for the tasks it is meant to perform. The joints can be linear
(prismatic), rotary (revolute), sliding or spherical with hydraulic, pneumatic, or electric
actuators. [14]
Figure 5: Components of industrial robots
There exist different coordinate frames used for positioning the hand of the robot, visualized in
Figure 6 (left). The Cartesian coordinate frame is created from prismatic joints positioned on three different axes. That way it is possible to move the end effector in 3D space; a rotation of the end effector is, however, not possible [14]. This is the same way a 3D printer or a three-axis mill is set up: in 3D printers the component is created one layer at a time in the XY-plane, and by moving along the Z-axis, a 3D shape is generated. The most common coordinate
frame for industrial robots is the articulated or anthropomorphic coordinate frame, also used as
an example in Figure 5. In this frame all joints are made up of revolute joints. [14] When
programming the RM, three different reference frames can be used, these being the world
reference, joint reference, and tool reference frame, see Figure 6 (right). The world reference
frame works similarly to the cartesian coordinate system, where the RM is created on the
universal X-, Y-, & Z-axes. The robot joints create a motion based on the trajectory of the end
effector in this reference frame. In the joint reference frame, each joint is accessed and moved
individually which changes the motion type of the end effector depending on the joint accessed.
If prismatic joints are accessed the motion is linear, if revolute joints are accessed the robot
moves in a rotary motion. The third reference frame is the tool reference frame. The origin is
set in the end effector/tool and all motions and the surrounding is in reference to this origin.
While in the world reference frame the world is still and the end effector moves in relation to
it, in the tool reference frame the end effector is still and the whole world moves in relation to
it. [14]
Figure 6: Common robot coordinate frames (left) and robot reference frames (right) according to [14]
As stated previously, RMs can be programmed online or offline. Pan describes the steps needed to perform OLP in [1] and details them further in [17]. To start off, the CAD model of the system is needed, as it supplies necessary information for the following steps. If no CAD models exist, a model can also be generated by 3D scanning the workpiece. From this, tags can be created. They give the robot the position and orientation the end effector must adopt.
The creation of the RM trajectory requires multiple tags. To avoid collision with the workpiece
or robot singularities, careful planning of the tags is needed [18]. Singularities describe
positions of the robot resulting in the loss of one or more DOFs (degrees of freedom) of the end
effector [19]. Positioning tags manually is a very time-demanding process but can be automated with embedded VBA scripts, if the tag position and orientation are known to the program. More
advanced software, such as displayed in [17], can be employed to ease the identification of tag
locations in case they are previously unknown. In the next step the trajectory of the motion is
created. According to [18], robot trajectory entails the “displacement, velocity and acceleration
of the robot”, in other words it describes how the robot moves between the tags. When planning
the trajectory in the world reference frame there often exist multiple solutions to travel between
tags and it might be preferable to use the joint reference frame [1] [18]. The trajectory can be
created as a point-to-point motion or a continuous-path motion, where the start and end tag are
connected [18]. After the generation, the reachability of the robot must be assessed to ensure it can travel the path. Digital REs such as DELMIA possess tools to assess the reachability; according to [17], however, it was not possible to access these tools via VBA script to automate the process, and it was thus automated using MATLAB. In the fourth and fifth steps, process planning and post processing, the path is optimized, fine-tuned, and converted into the robot language [1]. In the final steps the program is simulated digitally and calibrated with the on-site work cell, as it might not fully match the digital representation. [1]

Figure 7: The steps of offline programming according to [1]
2.3 Mathematical Transformations
To enable the programming, and thus the use, of an industrial robot, a mathematical model of the robot itself is required. Of interest is the position and orientation of the end effector, based on the positions and orientations of all links and joints. The position of a component is described through its x, y & z attributes in the world reference frame. By placing a coordinate system in said point, the orientation of the component is defined [20]. This is visualized in Figure 8, where two coordinate systems are displayed in the world reference frame. The respective T_n^m describes the transformation matrix required to transform coordinate system n into coordinate system m and is defined by Eq. 1.
Figure 8: Transformation of coordinate systems
T_n^m = \begin{bmatrix} R_n^m & r_n^m \\ 0\;\;0\;\;0 & 1 \end{bmatrix}    (Eq. 1)

In the transformation matrix, R_n^m describes the rotation and r_n^m the translation applied to the coordinate system, with the translational vector given in Eq. 2.
r_n^m = \begin{bmatrix} \Delta x \\ \Delta y \\ \Delta z \end{bmatrix}    (Eq. 2)
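As an illustration of Eq. 1 and Eq. 2, the following Python sketch assembles a homogeneous transformation matrix from a rotation and a translation and applies it to a point. The thesis performs these computations in VBA; Python and plain lists are used here only for readability, and the example values are arbitrary.

```python
import math

def transform(rotation, translation):
    """Assemble the 4x4 homogeneous transformation matrix T of Eq. 1
    from a 3x3 rotation matrix R and a translation vector r (Eq. 2)."""
    return [rotation[i] + [translation[i]] for i in range(3)] + [[0, 0, 0, 1]]

def apply(T, point):
    """Transform a 3D point by appending the homogeneous coordinate 1."""
    p = point + [1.0]
    return [sum(T[i][j] * p[j] for j in range(4)) for i in range(3)]

# 90-degree rotation about z combined with a translation of (1, 2, 3)
a = math.pi / 2
Rz = [[math.cos(a), -math.sin(a), 0],
      [math.sin(a),  math.cos(a), 0],
      [0,            0,           1]]
T = transform(Rz, [1.0, 2.0, 3.0])
print(apply(T, [1.0, 0.0, 0.0]))  # the x-axis unit point maps to ~(1, 3, 3)
```

The bottom row (0 0 0 1) makes rotation and translation composable through ordinary matrix multiplication, which is what allows the chaining of transformations used later in Eq. 7.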
Rotations in a 3D space can be broken down into three elemental rotations about the three axes,
with Eq. 3, Eq. 4 & Eq. 5 describing the rotation around x, y & z, respectively.
R_x(\alpha) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix}    (Eq. 3)

R_y(\beta) = \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix}    (Eq. 4)

R_z(\gamma) = \begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}    (Eq. 5)
To acquire the combined rotational matrix, the elemental matrices are multiplied; however, due to the nature of matrix multiplication, the result depends on the order of operation. Depending on the application, different conventions are used: a multiplication from right to left (XYZ) produces the Cardan-angles or different definitions of the Euler-angles, and a multiplication from left to right produces the RPY-angles, shown in Eq. 6 [20]. The R stands
for roll and describes the angle of rotation around the z-axis, P stands for pitch and represents
the angle of rotation around the y-axis and Y stands for yaw, detailing the angle of rotation
around the x-axis [21]. In subsequent chapters these will be referred to as YPR angles, as it
corresponds to the naming convention utilized by CATIA.
R_{RPY}(\alpha,\beta,\gamma) = R_z(\gamma)R_y(\beta)R_x(\alpha) = \begin{bmatrix} c_\gamma c_\beta & c_\gamma s_\beta s_\alpha - s_\gamma c_\alpha & c_\gamma s_\beta c_\alpha + s_\gamma s_\alpha \\ s_\gamma c_\beta & s_\gamma s_\beta s_\alpha + c_\gamma c_\alpha & s_\gamma s_\beta c_\alpha - c_\gamma s_\alpha \\ -s_\beta & c_\beta s_\alpha & c_\beta c_\alpha \end{bmatrix}    (Eq. 6)

with c_x = \cos(x) and s_x = \sin(x).
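Eq. 3 to Eq. 6 can be checked with a short Python sketch that builds the elemental rotations and composes them in the RPY order Rz·Ry·Rx. This is an illustration only; the thesis carries out these calculations in VBA.

```python
import math

def rot_x(a):
    """Elemental rotation about x (Eq. 3)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0],
            [0, c, -s],
            [0, s, c]]

def rot_y(b):
    """Elemental rotation about y (Eq. 4)."""
    c, s = math.cos(b), math.sin(b)
    return [[c, 0, s],
            [0, 1, 0],
            [-s, 0, c]]

def rot_z(g):
    """Elemental rotation about z (Eq. 5)."""
    c, s = math.cos(g), math.sin(g)
    return [[c, -s, 0],
            [s, c, 0],
            [0, 0, 1]]

def matmul(A, B):
    """3x3 matrix product; the order of the factors matters."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rpy(alpha, beta, gamma):
    """RPY composition R_z(gamma) * R_y(beta) * R_x(alpha), Eq. 6."""
    return matmul(rot_z(gamma), matmul(rot_y(beta), rot_x(alpha)))
```

Multiplying the same matrices in a different order (e.g. Rx·Ry·Rz) generally yields a different result, which is why the convention has to be stated explicitly.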
Two different kinds of approaches can be used when computing the transformation matrix, i.e., direct kinematics or inverse kinematics. Direct kinematics follows the equations outlined previously: the transformation matrix is calculated from the given rotational angles (α, β, γ)^T. For inverse kinematics, the position and orientation of the end effector are known and the aforementioned rotational angles are to be calculated. Using Figure 8 as an example, it can be
explained as follows. Two transformation matrices T_0^1 and T_1^2 are given. To create coordinate system 2, the transformation matrix and the rotational angles (α, β, γ)^T, here denoted (φ, ψ, θ)^T, are required. The rotational matrix utilizing the RPY-angles, R_{RPY,0}^{2}, is calculated using Eq. 7.

R_{RPY,0}^{2} = R_{RPY,0}^{1} R_{RPY,1}^{2} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}    (Eq. 7)
Using the equations below, the rotational angles (φ, ψ, θ)^T are calculated.

\tan(\alpha) = \frac{\sin(\alpha)}{\cos(\alpha)} = \frac{R_{RPY,3,2}}{R_{RPY,3,3}} \overset{!}{=} \frac{a_{32}}{a_{33}} \;\Rightarrow\; \alpha = \arctan2(a_{32}, a_{33}) \equiv \varphi    (Eq. 8)

\tan(\beta) = \frac{\sin(\beta)}{\cos(\beta)} = \frac{-R_{RPY,3,1}}{R_{RPY,1,1}\,c_\gamma + R_{RPY,2,1}\,s_\gamma} \overset{!}{=} \frac{-a_{31}}{a_{11}c_\gamma + a_{21}s_\gamma} \quad (\text{using } \cos^2(\varphi) + \sin^2(\varphi) = 1)

\Rightarrow\; \beta = \arctan2(-a_{31},\; a_{11}c_\gamma + a_{21}s_\gamma) \equiv \psi    (Eq. 9)

\tan(\gamma) = \frac{\sin(\gamma)}{\cos(\gamma)} = \frac{R_{RPY,2,1}}{R_{RPY,1,1}} \overset{!}{=} \frac{a_{21}}{a_{11}} \;\Rightarrow\; \gamma = \arctan2(a_{21}, a_{11}) \equiv \theta    (Eq. 10)

The function \arctan2(y, x) is used, as its codomain is [-\pi; \pi], compared to the codomain [-\frac{\pi}{2}; \frac{\pi}{2}] of \arctan(\frac{y}{x}). [20] [22]
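The angle extraction of Eq. 8 to Eq. 10 can likewise be sketched in Python using the two-argument arctan2 and a matrix built with Eq. 6. This round-trip is an illustration under the stated conventions, not code from the thesis.

```python
import math

def rpy(alpha, beta, gamma):
    """Direct kinematics: build R_RPY = Rz(g)Ry(b)Rx(a), Eq. 6."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    return [[cg * cb, cg * sb * sa - sg * ca, cg * sb * ca + sg * sa],
            [sg * cb, sg * sb * sa + cg * ca, sg * sb * ca - cg * sa],
            [-sb,     cb * sa,                cb * ca]]

def rpy_angles(R):
    """Inverse step: recover (alpha, beta, gamma) via Eq. 8 to Eq. 10,
    using atan2 for the full [-pi, pi] codomain."""
    alpha = math.atan2(R[2][1], R[2][2])                      # Eq. 8
    gamma = math.atan2(R[1][0], R[0][0])                      # Eq. 10
    cg, sg = math.cos(gamma), math.sin(gamma)
    beta = math.atan2(-R[2][0], R[0][0] * cg + R[1][0] * sg)  # Eq. 9
    return alpha, beta, gamma

# Round trip: angles -> rotation matrix -> angles
print(rpy_angles(rpy(0.3, -0.5, 1.2)))  # ~ (0.3, -0.5, 1.2)
```

Note that gamma is computed before beta, since Eq. 9 needs cos(gamma) and sin(gamma) in its denominator.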
2.4 Fixture Design
Fixtures are an aiding tool in the manufacturing process, with the aim to hold, position and
stabilize, or “fix”, the workpiece(s) within the specified tolerances [23] [24]. The fixture can
hold a single workpiece in the desired position for grinding or it can fix multiple workpieces
for the assembly or welding process. While some progress has been made in the automation of
fixture design through CBR (case based reasoning) or expert systems, presented by Wang in
[23] and [25], it still requires a lot of time and engineering know-how [26]. Fixture design and
manufacturing can make up 10-20% of the total manufacturing system cost according to [27].
For a cost reduction and increased competitiveness, fixtures require an adjustable or reconfigurable design to allow for the holding of multiple components for different uses [24]
[27]. Welding fixtures are especially expensive [27] due to their need to withstand high
temperatures, precisely position multiple parts and, depending on the welding process used,
even conduct electricity [26]. Throughout the lifecycle of the fixture, it must endure the repeated
fixing/loading and unloading of components while maintaining the tolerances specified.
Wang divides the fixture design process into four underlying steps, compare with [23] and
Figure 9. It is started off by planning the setups. A setup describes a sequence of processes
performed on the workpiece by one tool not requiring the manual change of the orientation or
positioning of the workpiece [23]. After the number of setups and the positioning and orientation of the workpiece are defined, the fixture planning begins.

Figure 9: Steps of the fixture design process [23]

In this step, areas of the workpiece
suitable for clamps and locators are determined. In the unit design step, the CAD model of the
fixture is created. The components can be categorized in three different groups, a base plate on
which all components are mounted, a locating unit and a clamping unit. The locating unit
ensures the correct positioning of the workpiece, and the clamping unit fixes it in place. In the
final step, verification, the design is tested and checked. These tests may include material testing
i.e., stiffness or heat resilience, the positioning accuracy or the overall cost and weight of the
fixture. Zhang, in [26], includes a fifth step after validation called post processing, in which all
components required for the manufacturing of the fixture are defined. This may include load
and unload drawings or information regarding the required components and their needed
amount. Hajduk, in [24], presents an adapted process specific to the modular concept in fixture
design. While the steps largely remain the same, a step between fixture planning and unit design
called module selection is added. Pre-existing modules of clamp and locator systems are
selected and applied to the workpiece. Furthermore, the use of an industrial robot is
incorporated into the process steps. First, the positioning of robot to fixture is defined and
afterwards the RM is created using OLP.
Fixtures utilizing an adjustable or reconfigurable design can be classified into two categories: the above-mentioned modular fixtures and integral flexible fixtures. Modular fixtures are constructed of standardized components of the types baseplate, locators, clamps, and connections. This allows them to be used for many different assemblies, and they are especially useful for small batch sizes or prototype work; however, due to their standard parts they may not be applicable for more complicated assemblies. Furthermore, they require a large amount of experience during the design process and lack in structural properties, accuracy and stiffness compared to more solid, permanent solutions. Integral flexible fixtures consist of an
unchangeable structure containing variable components at the contact areas between fixture and
workpiece. This can be achieved through compliance engineering or temperature/electricity
induced phase changes in the material. [27]
In recent years advances in CAFD (computer-aided fixture design) have been made. Wang
gives an overview of some in [25]. One method is to design the fixtures with the help of expert systems. These utilize a multitude of IF-THEN rules to determine and suggest a design for the fixtures; the state of the IF-THEN rules is gathered through inquiries answered by the engineer. A newer method is CBR, where new solutions are suggested by mimicking past solutions. These older
solutions are stored and categorized in a database. The reasoning algorithm then compares the
workpiece to older problems to search for matching components. These are then presented to
the engineer to provide a good starting point in the fixture design and avoid a time-consuming
and expensive concept generation. In the proposed multi-tier CBR variant this process is
completed multiple times to iterate on the proposed solutions and find a better final proposal.
However, in all methods it is required of the engineer to rework the proposals to gain a
functioning fixture design. [23] [25]
2.5 MIG/MAG Welding
Welding is a manufacturing technique that has been used by humankind since Roman times in the form of forge welding [28]. It can be described as a technique that bonds or coats metallic materials using pressure and/or heat, with or without additional materials [29]. Through the usage of high-power-density heat sources the material undergoes a local phase change into the
liquid phase, which bonds the components after solidifying [29] [30]. The bond is especially
useful for transferring forces, bending and torsion torques and allows the use at high
temperatures. Furthermore, welding can provide an air-tight seal and is generally a cheap
joining technique for single parts, if the necessary tools are at hand [31]. Disadvantages caused
by the high heat can be the induction of stresses, shrinkage, or a change in material structure
close to the weld, potentially causing the material to become more brittle [29] [31].
A category of fusion welding is arc welding, where the material is molten through an electric
arc between the welding tool and the workpiece. Two widely spread arc welding processes are
MIG (metal inert gas) and MAG (metal active gas) welding. In these processes a metallic wire
is molten and used to carry the electric current to create the arc. The wire is used as additional
material to create a bond between the workpieces. As the wire is fed automatically and the
movement of the gun can be controlled by industrial robots these processes are suitable for
automation to increase the productivity. The difference between MIG and MAG welding lies
in the gases used during the process. MIG uses inert gases, such as argon or helium, which do
not react with the molten material and provide protection from outside influences. In MAG the
gas provides protection as well, but reacts with the molten material, hence “active”. Active
gases can be argon mixed with carbon dioxide or oxygen. [31] [32]
MASTER THESIS METHODOLOGY
- 15 -
3 Methodology
In the following chapter the approach used to realize the research questions stated in chapter 1.4 is presented. The process, visualized in Figure 10, was started off by performing a proof of concept, which was analyzed to produce a list of specifications. Then a concept was generated, modelled, automated, and analyzed in the respectively named steps. If the resulting automation index (AI) exceeded the defined value x, the process looped back to the concept generation phase. If it was below, the process was finished.
Figure 10: Visualization of the process
To perform these steps CATIA V5 was used as the modelling tool. The integrated tool DELMIA
was used to model the RE. The tool EXCEL/VBA was used to automate the process.
3.1 Proof of Concept
The work was started off by performing a proof of concept (POC). According to [33] & [34], a POC describes a small test or exercise performed to gain insight into whether a concept is viable or functional. The concept of whether the OLP process for changing CAD models could be
automated was investigated. No detailed brainstorming preceded, as the POC was to be used to
test different approaches. The results from the POC were analyzed and the insights gained were
used to create the list of specifications and help in the concept generation phase.
The role of the POC was to provide a quick test of whether the OLP process can be automated with VBA, which was necessary mainly because not all features in DELMIA are accessible through VBA.
3.2 Specification
In the design specification a list of what is to be created and achieved is defined. It also contains
requirements of and restrictions to the final product or the process of manufacturing. Design
specifications are used to define the customer requirements and contain customer needs,
metrics, targets, importance of metrics and dependency between them. [35] [36]
The product line and the manufacturing method posed some criteria on the part design and
manufacturing process. Other specifications were gathered by analyzing the POC.
Requirements needed for automating the CAD generation or the RM generation were identified
and included. These can range from measurement requirements to specific features in the model
or the name they are given. The design specification helps in defining and organizing the
information and its flow in the project.
3.3 Concept Generation
In the concept generation phase design concepts will be created that fulfill the design
specifications. Concepts and solutions are presented and used to perform the modelling phase
[36]. During concept generation, methods such as brainstorming can be used to gather creative
ideas and solutions [37].
The concept generation was started by stating the problems with the current model/process that
needed to be addressed. Using the insights gained from the POC and the design specification
as a design space, brainstorming methods were utilized to gather ideas. As the author was
working solitarily on the task, the alteration of the traditional oral brainstorming method, the
brainwriting method, was used. This method was chosen, as the author had prior experience
with it and it allowed the sketching of ideas. These were then sketched for solutions regarding
CAD design or mapped out in flow charts for solutions requiring programming. As the
overarching process was iterative, the concept generation could be repeated to improve already
existing solutions.
3.4 Modelling
In the modelling phase the generated concepts are turned into concrete solutions. The dynamic
top-down approach, discussed in chapter 2.1, was used and started off through the creation of a
reference model, the skeleton. It was followed by the creation of the HLCts for both the work
piece and the fixture components. The modelling phase furthermore contained the modelling of
the RE, though these steps especially were closely linked to the following phase, automation.
Even though the overarching process was iterative, most of the work in the modelling phase was done at the beginning of the project. Once HLCts and CAD products were created, they might undergo slight improvements in following loops, though the focus of the iterative process lay on increasing the level of automation, compare with Figure 10. This was primarily done through improving the code in the automation phase.
3.5 Automation
In the automation phase the framework for the code, created in the concept generation phase,
was implemented to automate the creation of the CAD models and RM. The code was written
in EXCEL/VBA starting off from a simple code only able to generate one model type. The code
was generalized using functions, and more steps were included as the iterations progressed. A GUI was created in EXCEL and linked to the code. At the beginning of the process it contained little information and usability; throughout the iterations, functions were added.
To allow for the mathematical modelling of the robot and the positioning of its attached end effector, the mathematical models presented in chapter 2.3 are applied. Figure 11 shows the process that was followed for both the direct and inverse kinematics.
Figure 11: Flowchart of the direct and inverse kinematic of axis transformations
3.6 Analysis
In the analysis phase the results from the previous modelling and automation phases were
analyzed regarding the level of automation, how well they fulfilled the design specifications,
as well as the general modelling approach, robustness, and flexibility. The findings were then
formulated into problem statements for the following concept generation phase. Using Eq. 11
the automation index (AI) was calculated. It consists of the sum of the manual steps multiplied by the corresponding penalty factor w. Manual steps that were required before the code was run were multiplied by w = 1. If the process of running the code was paused due to additional manual steps being needed, w was increased by 1 for each instance this occurred. This resulted in the automation index being lower if all manual steps were performed at the beginning of the process as opposed to throughout. The equation was created by the author specifically to enable the comparison of the level of automation between all iterations.
AI = \sum_{w=1}^{n} \left( \sum MS \right) \cdot w    (Eq. 11)
With:
AI=Automation Index
MS=Number of Manual Steps performed
w=Penalty factor
n=Pauses in process through manual steps+1
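As an illustration, Eq. 11 can be evaluated with the short Python function below. The grouping of manual steps into one up-front stage followed by one stage per pause is this sketch's reading of the description above, not code from the thesis.

```python
def automation_index(steps_per_stage):
    """Automation index, Eq. 11 (illustrative reading).

    steps_per_stage[0] holds the manual steps performed before the code
    is run (penalty w = 1); each later entry is a pause in the run caused
    by manual steps, with the penalty factor increasing by 1 per pause.
    """
    return sum(steps * w for w, steps in enumerate(steps_per_stage, start=1))

# 4 manual steps up front, then two pauses needing 2 and 1 manual steps:
print(automation_index([4, 2, 1]))  # 4*1 + 2*2 + 1*3 = 11
```

Performing the same 7 manual steps entirely up front would give `automation_index([7]) = 7`, reflecting the intended penalty for interruptions during the run.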
Additionally, the results were compared to a corresponding manual OLP process. For this, the
elapsed run-time of the code was calculated, and the number of components that needed to be manually created was acquired. Using this, the time required to manually create the OLP was estimated. An example of a comparison is shown in Figure 12. It is of note that the data sets are divided between two vertical axes, with the left axis linked to the manual process, showing the elapsed time in hours, and the right axis displaying the automatic process time in minutes. The horizontal axis shows the area of the workpiece in m². Trendlines were included for each data set, providing a linear correlation between the data points, with ETM for the manual and ETA for the automatic process. Using Eq. 12 the percentile difference between the two data sets was calculated. In Figure 13 the distribution of the elapsed time for a single model is shown. It is acquired by running the code for the same model size multiple times and gives information about how much the run-time varies between different runs, i.e., how accurate the estimation for the automatic process in Figure 12 is.
\text{percentile difference} = 1 - \frac{ETA\,[h]}{ETM\,[h]}    (Eq. 12)
Figure 12: Comparison of the elapsed time between the manual and automatic OLP process
Figure 13: Distribution of the elapsed time for a single model
From this, using Eq. 13 and Eq. 14 the arithmetic mean value and standard deviation was
calculated. [38]
\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i    (Eq. 13)
[Figure 12 content: chart “Comparison of Elapsed Time” plotting Elapsed Time Manual [h] (left axis) and Elapsed Time Automatic [min] (right axis) against Area [m²], with linear trendlines ETM = 15*A + 15 and ETA = 5*A + 8]
\sigma = \sqrt{\frac{\sum (x_i - \bar{x})^2}{N}}    (Eq. 14)
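Eq. 12 to Eq. 14 can be combined into a small Python sketch. The numbers below use the trendlines of Figure 12 (ETM given in hours, ETA in minutes, hence the unit conversion in Eq. 12); this is an illustration of the analysis, not the thesis's actual evaluation code.

```python
import math

def percentile_difference(eta_hours, etm_hours):
    """Eq. 12: relative time saving of the automatic process."""
    return 1 - eta_hours / etm_hours

def mean(xs):
    """Arithmetic mean, Eq. 13."""
    return sum(xs) / len(xs)

def std_dev(xs):
    """Population standard deviation, Eq. 14."""
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

# Using the trendlines of Figure 12 for a 1 m2 grating:
area = 1.0
etm = 15 * area + 15        # manual time in hours
eta = (5 * area + 8) / 60   # automatic time, converted minutes -> hours
print(round(percentile_difference(eta, etm), 3))  # ~0.993, i.e. ~99 %
```

The result of roughly 99 % for this example is consistent with the overall time reduction reported in the abstract.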
MASTER THESIS RESULTS
- 20 -
4 Results
Hereinafter the results will be presented, following the process outlined in chapter 3 and
visualized in Figure 10. Though the process is of an iterative nature, only the results of the last iteration will be presented, as they encompass all changes that were made throughout the loops.
However, if interesting observations or changes during a particular iteration were made, these
will be presented as well. It is started off with a product description, followed by a presentation
of the POC. Thereafter, the list of design specification and the results of the modelling and
automation phase will be shown. Lastly, the automation index for all iterations will be
presented.
4.1 Product Description
To allow for a better understanding of the following chapters a product description is given. It
is of note that the classification of components was performed after completing the POC, thus
resulting in a different classification being used in chapter 4.2.
Even though the components and configuration of the grating differ depending on the input parameters, elaborated in chapter 4.3 and following, they can be broken down into the three subclasses “Outer Structure”, “Inner Structure” and “Walkway”. These contain several template components (HLCt), as shown in Figure 14, Figure 15 and Figure 16. The components in the subclasses “Outer Structure” and “Walkway” are present throughout all configurations, only changing their size (and, for the subclass “Walkway”, their number) in relation to the input parameters. The components of “Inner Structure” vary in configuration and only one is shown
below as an illustration. Further information and illustrations can be found in Figure 19 or in
Appendix A. Additionally, a visual representation for each HLCt is given in Appendix A. The
naming convention for the HLCts was mostly a description of the visual. For PLANEFIXTURE, BAER and STOED, however, the component name given by the company was adapted.
Figure 14: Subclass "Outer Structure" with template components
[Figure 14 labels: 1 Outer Structure, with 1.1 PLANEFIXTURE, 1.2 WALL ONE, 1.3 WALL TWO]
Figure 15: Subclass "Inner Structure" with template components
Figure 16: Subclass "Walkway" with template components
4.2 Proof of Concept
In the POC the automation of the OLP process in DELMIA using VBA scripting was tested.
To reduce the time necessary for modelling, the complexity of the model was reduced and the parts creating the grating, see Figure 17, were cut in their entirety. The inner structures remained, however, to keep some complexity in the POC. A visualization of the CAD model
used for the POC can be seen in Figure 17. Through analyzing the product, it was broken down
into similar parts making use of the same HLCt. Four different HLCts were identified and are
listed in Table 2. To further reduce the amount of work, no HLCt for the rounded corner element
was created and the original STEP-file (Standard for the Exchange of Product data) was used in
the model. Furthermore, the HLCts were instantiated manually, corresponding to the third stage
of topological transformation in Figure 4. Manual instantiation was chosen, as it did not require
the creation of code for automatic instantiation. As the top-down approach was used, all
HLCts were linked to a skeleton model driven by parameters, through which the size changes
of the model could be controlled.
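The parameter-driven size control described above can be sketched in EXCEL VBA. This is a minimal sketch only: the skeleton document name and the parameter names "Length" and "Width" are illustrative assumptions, while the automation calls (Documents.Item, Part.Parameters, Part.Update) belong to the standard CATIA V5 automation API.

```vba
' Minimal sketch of the "Configure CAD Model" step. The document and
' parameter names are illustrative assumptions, not taken from the thesis files.
Sub ConfigureCADModel()
    Dim CATIA As Object
    Set CATIA = GetObject(, "CATIA.Application")      ' attach to running CATIA

    Dim skelDoc As Object
    Set skelDoc = CATIA.Documents.Item("SKELETON.CATPart")

    Dim params As Object
    Set params = skelDoc.Part.Parameters

    ' Transfer the two GUI inputs to the skeleton parameters (values in mm)
    params.Item("Length").Value = Range("B2").Value
    params.Item("Width").Value = Range("B3").Value

    skelDoc.Part.Update                               ' update all linked parts
End Sub
```

Since all HLCts are linked to the skeleton, a single Part.Update propagates the new dimensions through the whole product.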
Table 2: HLCts identified in the product.
HLCt Parts Count
1 Outer Elements 4
2 Inner Cross Section 2
3 Inner Stabilizers 2
4 Rounded Corner Element 1
The GUI, displayed in Figure 17, requires two input variables, the aforementioned length and
width of the model. Upon stating these, the button “Configure CAD Model” could be activated.
It changed the parameter values in the skeleton model to the stated ones and updated all parts
in the product. Afterwards the creation of the RM could be initiated by pressing the button
“Generate Robot Motion”. A visualization of the RM code is shown in Figure 18. It was started
by setting up the link to CATIA/DELMIA and defining all needed references. In the “create
tags” program, shown in Figure 18 (middle), the tag groups were created and the coordinates
for the outer tags were located. This was done by accessing the corresponding parts and saving the
coordinates of the corresponding points. A tag was then created at that location, with the
orientation selected from a table of possible orientations stored in EXCEL. This process was
redone for the inner tags and the moving tags. The outer tags are the welding locations of the
outer elements and the inner tags are the welding locations of the inner structures, both the inner
cross sections and inner stabilizers. The moving tags refer to approaching and retreating points
for the robot when it approaches or retreats from a welding point. After the tags had been created
the “create motion” code was run. It was started by creating a robot task in which all motion
and welding operations were stored. Afterwards the creation of operations was set up, by
defining the necessary parameters and references. For the POC the path of the robot was pre-
planned and stored in EXCEL. For each entry, information about where it was stored and what
operation was needed was provided. As the robot needs to travel to each location to be able to
weld, a motion was created for each tag. Afterwards, an inquiry determined whether a welding
operation was to be performed. In the following step it was checked whether the path was
completed. If that was not the case, the code looped back to “get tag nr & group from excel”
and jumped to the next operation in the pre-planned path list. This process was repeated until
the path was finished. After the “create motion” code was completed, the main code was
finished as well and a simulation of the motion was possible in DELMIA.
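The looping behaviour of the “create motion” code can be sketched as follows. The sheet name, its column layout (tag group, tag number, weld flag) and the two helper routines are assumptions standing in for the DELMIA-specific operation creation, which is not reproduced here.

```vba
' Sketch of the "create motion" loop. Sheet layout and helper routines are
' illustrative assumptions; the DELMIA-specific calls are not reproduced.
Sub CreateMotion()
    Dim pathSheet As Worksheet
    Set pathSheet = ThisWorkbook.Worksheets("Path")   ' pre-planned path

    Dim r As Long
    r = 2                                             ' first data row
    Do While pathSheet.Cells(r, 1).Value <> ""        ' until path is finished
        Dim tagGroup As String, tagNr As Long
        tagGroup = pathSheet.Cells(r, 1).Value        ' get tag nr & group
        tagNr = pathSheet.Cells(r, 2).Value

        CreateMotionOperation tagGroup, tagNr         ' robot travels to tag

        If pathSheet.Cells(r, 3).Value = "weld" Then  ' welding needed here?
            CreateWeldOperation tagGroup, tagNr
        End If

        r = r + 1                                     ' next operation in path
    Loop
End Sub

Private Sub CreateMotionOperation(tagGroup As String, tagNr As Long)
    ' placeholder for the DELMIA motion operation creation
End Sub

Private Sub CreateWeldOperation(tagGroup As String, tagNr As Long)
    ' placeholder for the DELMIA weld operation creation
End Sub
```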
Figure 17: GUI for the POC
Using Eq. 11 the AI of the POC could be calculated. Since the trajectory of the RM was pre-
planned, the POC requires a large input. The order in which the tags are moved to, as well as
their locations and orientations, lead to a total of 𝑀𝑆 = 117 manual steps, introduced over two
stages. Two of the MS were performed in the first stage and the rest in the second, resulting in
an automation index of 𝐴𝐼 = 2 × 1 + 115 × 2 = 232.
4.3 Specification
From the information received through the POC, and specifications from the company, the list
of design specifications was created. The specifications underwent a constant update throughout
the project, as new insights were gained. Only an excerpt is presented in this chapter, see Table
3, and the whole list can be found in Appendix B.
To identify the specifications more easily, they were divided into different categories.
Displayed in Table 3 are specifications on the skeleton model and on the HLCt. The
specifications 1.1 to 1.3 come from analyzing the models from the product line. The variables
that define the products are their length and width. Depending on these, different configurations
of inner support structures were activated. For small length and width values no inner support
structures are necessary, see Figure 19 left. When increasing the values, the support structures
can be made up out of single or multiple bars in parallel, at an angle or crossing each other.
This resulted in a total of nine configurations, displayed in Appendix A. As different
configurations required different references in the skeleton model, specification 1.4 was
introduced, lowering the number of features in the model at a given time.
Figure 18: Flowchart of the "Generate Robot Motion" Code, with the main code (left), “create tag” code (middle)
and “create motion” code (right).
Table 3: Excerpt of the list of design specifications
Design Specification - Excerpt
Number Category Description
1 Skeleton Model
1.1 Design The model follows the given design and includes all design
elements.
1.2 Parameter The skeleton model can change length and width through
parameters.
1.3 Measurement Except for the parameterized values, all measurements are
fixed.
1.4 Design features The skeleton model contains all constant features needed as
references to instantiate the HLCt.
1.5 Design features All changing features needed as references are created through
code.
4.4 Concept Generation
In this step concepts for the CAD model and RM were generated. For the following chapters,
the CAD model and RM are presented separately. It is, however, of note that they are closely
linked. As the RM requires the CAD model, it also influences the concept generation, modelling
and automation process of the CAD model.
4.4.1 Concept Generation of the CAD Model
Concepts for the CAD model were generated through brainwriting with the help of sketches.
As the process is similar for all components it will be presented by a single example, the
reference creation and HLCt modelling of the BAER and STOED components. Extra focus is
laid on the interaction with the PLANEFIXTURE component.
The number of instances of the BAER and STOED components was dependent on the length
and width of the product, as well as the spacing between each component. An additional
complication was provided through the PLANEFIXTURE. The two HLCt were generally
instantiated between the outer walls. However, in one corner the available space was reduced
through the PLANEFIXTURE, visualized in Figure 20. In the first concept the points were
created through patterns of two existing points, shown in Figure 21. Two start points could be
manually created and their distance influenced through parameters. In the EXCEL GUI, the
number of instances would be calculated, which in return would control the pattern. Upon
testing however, the concept proved not usable. While the pattern could be created via code it
was not possible to access the individual points through code, making an automatic instantiation
impossible.

Figure 19: Three different configurations from the product line

In the second concept, see Figure 21, the points that intersect with the
PLANEFIXTURE were still created manually, but the rest of the points were created through
code. The X-, Y- and Z-coordinates were calculated in the EXCEL file in relation to the width and
length. This allowed the individual features to be selected, but also required the
deletion of said features prior to the configuration of a new CAD model.
For the third concept, compare with Figure 22, all points were created via code to reduce the
complexity of the deleting code, allowing it to delete all features in the given geometrical set.
In return it required the change of the HLCt modelling, as the reference point no longer followed
the surface of the PLANEFIXTURE. An additional reference was needed to solve the issue. By
extruding the component to a reference surface, the length was variable. Using the surface of
the WALL or the surface of the PLANEFIXTURE the HLCt was able to take the required shape.
This however proved problematic when the component to instantiate was at the edge between
two surfaces.
To solve this issue the HLCt was changed to the previous modelling method, not requiring a
reference surface, and being instantiated between three points.

Figure 20: Interaction of the BAER and STOED components with the PLANEFIXTURE component

Figure 21: Concepts I and II

The locations of the points intersecting with the PLANEFIXTURE component were calculated
using the Pythagorean theorem, see Figure 22, and created via code.
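The point calculation can be illustrated with a small VBA function. The geometry below is hypothetical and does not come from the actual product; it only shows how the Pythagorean theorem yields the offset of a reference point along the inclined PLANEFIXTURE edge.

```vba
' Illustrative sketch only: offsetting a reference point along an inclined
' edge using the Pythagorean theorem. All geometry values are hypothetical.
Function OffsetAlongIncline(baseX As Double, spacing As Double, _
                            inclineHeight As Double, inclineRun As Double) As Double
    Dim hyp As Double
    hyp = Sqr(inclineHeight ^ 2 + inclineRun ^ 2)   ' length of the inclined edge
    ' project the spacing measured along X onto the inclined edge
    OffsetAlongIncline = baseX + spacing * (inclineRun / hyp)
End Function
```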
4.4.2 Concept Generation of the Fixture Design
As written in chapter 1.5, one of the delimitations imposed on this thesis was to disregard the
use of a fixture. To give visual aid and increase the realism and understanding of the problem,
a black box design of a fixture was included. Each component had 0-2 fixture components
“securing” it for the welding process. As the fixture generation is of black box design only and
not the focus of this thesis it will not be discussed further. As reference, a rendering of a
generated fixture is provided in Appendix A.
4.4.3 Concept Generation of the RE and RM
The concept generation for the robot can be divided into two segments: the concept generation
of the environment and that of the motion, whereby the concepts for the motion were influenced
through the choices made in the environment. Here, only the concepts for the RE will be
presented, as the concepts for the motion were gained mostly through trial and error with the
results presented in chapters 4.5.2 and 4.6.2.
Four concepts for the RE were drafted, see the table in Figure 23 (left), with WP = workpiece
positioner, APR = articulated positioning robot, MPR = multi-purpose robot, WR = welding
robot and HR = handling robot. Concepts I and II represent concepts often utilized in the
industry. A workpiece positioner was used to rotate the workpiece to the desired orientation for
the following manufacturing process and was therefore classified as a manipulator. In concepts
III and IV it was replaced by an articulated robot requiring a payload capacity for the fixture and
workpiece combined. The other option was whether the welding and assembly of the
workpiece was performed by a single multi-purpose robot or by two robots specialized for their
task. Concepts I and II were not chosen, as they represent a common concept in the industry.
Concept IV was chosen due to a more complex motion and interaction between three robots.
As this does not represent a common concept in the industry it lent itself to a more research-
based approach, furthering the potential impact and relevance of the master thesis. A
visualization of all four concepts is provided in Appendix C.
Figure 22: Concepts III and IV
Figure 23: Combination table of the four concepts (left) and chosen concept (right), robot models by [39]
For easier identification, the robots were named as shown in Figure 23. FIXBOT refers to the
APR, the robot attached to the fixture and positioning the workpiece for assembly and welding.
ASSEMBOT refers to the HR in charge of the workpiece assembly and WELDBOT is the WR
performing the welding operations.
4.5 Modelling
In the following two chapters the modelling process of the CAD model and RE are presented.
The skeleton model, as well as the modelling of a HLCt will be described.
4.5.1 Modelling the CAD Model
The skeleton model, described in chapters 2.1 and 4.2, was used for the instantiation process of
all components by providing necessary references. It therefore contained as few features as
possible to ensure flexibility and robustness, limiting the modelling to feature groups 1 and 2
as described by [9] presented in chapter 2.1. These contain planes, curves, points, and (if
necessary) surfaces, but exclude all 2D sketches. As mentioned in chapter 4.3, the
configuration of the product changed depending on its length and width. Therefore, the
references were divided into permanent and changing references and six geometrical sets were
created to organize the different references. This can be seen in Figure 24, where the basic
version of the skeleton model is shown. The geometrical set “OTR_STRUCT” contained all
permanent references used for the HLCts PLANEFIXTURE, WALL_ONE, and WALL_TWO.
The other geometrical sets were used for the changing references. How they were created will
be described in chapter 4.6.1.
Combination table from Figure 23:
Concept I: WP, MPR
Concept II: WP, WR, HR
Concept III: APR, MPR
Concept IV: APR, WR, HR
Robot names: 1 FIXBOT, 2 ASSEMBOT, 3 WELDBOT
Figure 24: Skeleton model
While the HLCts might differ in shape and detail, they were based on the same basic modelling
strategy and therefore only one HLCt will be described here. A sketch of the basic modelling
strategy is presented in Figure 25. Three reference points were needed to fully define the
geometry. By creating a plane through all three points and lines between point 1 and 2 and 1
and 3, see Figure 25, two more reference planes could be created. These were then used to
define all other features. The default planes (xy, yz and zx) provided by CATIA were not used,
and it was important not to reference them either, as that would reduce the flexibility
of the HLCt. Only referencing the three reference points allows the HLCt to be
instantiated in different ways, for example at an angle. Referencing to the default planes would
negate this and could lead to errors in the instantiation process. Because of this, 2D sketches
were not used either. The geometry was created by creating the surfaces based on curves. The
area between the surfaces was then filled to create the solid model.
Figure 25: Sketch of the basic HLCt modelling process
REFERENCE 1
REFERENCE 2
REFERENCE 3
4.5.2 Modelling the RE and RM
Based on the concept chosen for the robot environment, as described in chapter 4.4.3, the RE
and the basis of the RM were modelled. In this chapter, the structuring of the product tree and the
selected robots are presented. The steps taken in this phase serve as the foundation for the later
RM automation phase.
The RE was modelled in CATIA through a “.CATProcess“ document, in which three distinct
sub-levels were accessible, see Figure 26. These are “ProcessList”, “ProductList”, and
“ResourceList”. The sub-level “ProductList” is kept empty. The sub-level “ResourceList”
contains resources or tools required for the RM, such as the industrial robots and their end
effectors and the workpiece (GALLERDURK model). It additionally stores all tags. A tag
consists of XYZ coordinates and YPR angles for rotation around the corresponding axis. These
were used to convey the location and orientation the industrial robot was required to travel to.
This was done by matching the tag in the end effector of the robot to the tag of the destination.
Each robot encompassed a pre-defined number of tasks, shown in Figure 27, in which segments
of the RM were created. Different operations were saved in tasks that were used to influence
the behavior of the robot. Pick and place operations allowed for interaction with the workpiece,
through moving the location of selected workpiece parts. Weld operations simulated the
welding process and motion operations moved the end effector to a specified tag.
The sub-level “ProcessList” contains information about all operations that were performed and
in what order they were accessed, thereby linking the previous two sub-levels. Each process in
the process list was linked to a robot task. By simulating the process, the whole RM was
visualized. Due to that, a general structure for the motion was required, which was
accomplished by breaking down the motions by the different positions of the FIXBOT. A robot
task was created for the FIXBOT position at “Tag1”, permitting all parts to be assembled in
this position before moving on to the next. This allowed for permanent robot tasks to be created,
while generating their contents based on the workpiece-to-assemble.
Figure 26: Product tree of the RE
The process was broken down into nine sub-processes, which in turn were divided into a third
level of processes. A list of the first level sub-processes and second level sub-processes for sub-
process one can be found in Figure 27. A full list of all processes is provided in Appendix C.
Sub-process 1 describes the assembly of the workpiece without the BAER and STOED
components. For this it was broken down into eight further sub-processes, detailed on the right
in Figure 27. The tasks were divided between the FIXBOT and ASSEMBOT and describe the
movement to an assembly position, for example “Tag1”, and the subsequent assembly of
components at said position. Upon completion the FIXBOT moved to the WELDBOT and the
assembled components were welded. This continued until the workpiece was completed and
taken out of the fixture in process 9.
Figure 27: Process with first and second level sub-processes
The industrial robot models chosen were IRB_7600_500_255 for the FIXBOT,
IRB_6620_22_150 for the ASSEMBOT and IRB_1600ID_4_150 for the WELDBOT, from the
company ABB, [39]. The workpiece, with the largest inputs possible, was estimated to weigh
100 kg. The IRB_6620_22_150 with a payload of 150 kg was chosen. The weight of the fixture
was estimated to be three times that of the workpiece. Combined with the workpiece this
gives a required payload of 400 kg for the FIXBOT. To have a safety factor, the
IRB_7600_500_255 with a payload of 500 kg was selected.
4.6 Automation
In this chapter the steps taken to automate the process are shown. It is started off by detailing
the generation of the CAD model. Afterwards, the automation of the RM generation is
presented. The whole process is controlled through a single GUI in EXCEL, but clear
distinctions between the two generation processes can be made.
First Level Sub-Processes:
1. Assemble Gallerdurk without Baer and Stoed
2. Move to WELDBOT
3. Weld all except Baer and Stoed
4. Move to ASSEMBOT
5. Assemble Baer and Stoed
6. Move to WELDBOT
7. Weld Baer and Stoed
8. Move to ASSEMBOT
9. Unload Gallerdurk

Second Level Sub-Processes of 1.:
1.1 Move FIXBOT to Tag1
1.2 Assemble Components at Tag1
1.3 Move FIXBOT to Tag2
1.4 Assemble Components at Tag2
1.5 Move FIXBOT to Tag3
1.6 Assemble Components at Tag3
1.7 Move FIXBOT to Tag4
1.8 Assemble Components at Tag4
4.6.1 Automating the CAD Model Generation
To generate the CAD model two steps needed to be automated: the aforementioned reference
point creation in the skeleton model and the instantiation of the HLCts. For this a KB and an IE
were required. The KB, shown in Figure 29, was created in an EXCEL sheet and the IE was
created through a VBA script, also in the EXCEL sheet. The reference points for the
different configurations were created and information about their name and geometrical set was
defined. Through the IE, based on the length and width, the required configuration was
activated. The section to the left called “Grating” defines the start reference points for the BAER
and STOED HLCts. As described in chapter 4.4.1, multiple points, shifted along two different
axes, were needed. These values were calculated through the script, as the points were created.
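The point creation performed by the script can be sketched with CATIA's hybrid shape automation interface. The geometrical set and point names follow the naming shown in the KB (e.g. "BAER", "baer_fr."), but the exact call site and naming scheme are assumptions.

```vba
' Sketch of the reference point creation in the skeleton. Geometrical set
' and point names follow the KB naming but are illustrative here.
Sub CreateReferencePoint(skelPart As Object, geoSetName As String, _
                         ptName As String, x As Double, y As Double, z As Double)
    Dim hsf As Object
    Set hsf = skelPart.HybridShapeFactory

    Dim pt As Object
    Set pt = hsf.AddNewPointCoord(x, y, z)    ' point from X/Y/Z coordinates
    pt.Name = ptName                          ' name used later for instantiation

    Dim geoSet As Object
    Set geoSet = skelPart.HybridBodies.Item(geoSetName)
    geoSet.AppendHybridShape pt               ' store the point in its set

    skelPart.Update
End Sub
```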
The middle section, “Cross Section Support – One & Two”, calculated the reference points for
a diagonally crossed support section, as depicted in Figure 19 (middle). The section on the right
is used for all other configurations. Depending on the input variables different parts were
activated. This was then linked to an information hub shown in Figure 28. It contained
information on the HLCt and where to find the values calculated in Figure 29. In the second
column the status of the HLCt was shown and the third column detailed how many instances needed
to be created of each HLCt. The 11th and 12th columns showed where the HLCts
were saved. The script interacts primarily with the information hub to gather the
necessary information.

Figure 28: Information hub for the reference creation and HLCt instantiation (columns: HLCt name, active status, number of instances, reference name and geometrical set, EXCEL sheet and column/row references, part name, and file location)

In Figure 30, the flowchart of the “CAD Model Generation” code is
depicted. As described, it starts off with the “Configure Skeleton” code where all features that
were not permanent are deleted. Afterwards, using the information hub all necessary references
were created. This was followed by the “Create Components” code. In this part the references
previously created were used to instantiate the HLCts to create all components. The actions “Get
HLCt Location” and “Get nr. Instances & References” were linked directly to the information
hub. The other tasks were performed in CATIA.
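The “Instantiate HLCt” step can be sketched using CATIA's InstanceFactory automation interface. The file location and the template point names are taken from the information hub and KB shown above, but the input names ("Reference1" to "Reference3") are illustrative assumptions.

```vba
' Sketch of a single HLCt instantiation via CATIA's InstanceFactory.
' Template input names are illustrative; file path and point names follow
' the information hub and KB.
Sub InstantiateHLCt(targetPart As Object)
    Dim factory As Object
    Set factory = targetPart.GetCustomerFactory("InstanceFactory")

    factory.BeginInstanceFactory "HLCt_BAER", _
        "X:\Documents\Master Thesis\CAD_DATA_CATIA\First Iteration\GALLERDURK\HLCt_BAER.CATPart"

    factory.BeginInstantiation
    ' match the three template inputs to reference points in the target part
    factory.PutInputData "Reference1", targetPart.FindObjectByName("baer_fr.")
    factory.PutInputData "Reference2", targetPart.FindObjectByName("baer_bk.")
    factory.PutInputData "Reference3", targetPart.FindObjectByName("baer_bkdwn.")
    factory.Instantiate

    factory.EndInstantiation
    factory.EndInstanceFactory
    targetPart.Update
End Sub
```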
Figure 30: Flowchart of the “CAD Model Generation” Code (left), “Configure Skeleton” (middle) and “Create
Components” (right)
In Table 4, the AI for all iterations of the CAD model generation process is presented. The first
iteration required the manual selection of which codes to run for the reference creation, as well
as the manual instantiation of all components. This was reduced in the second iteration by fully
automating the reference creation. A manual instantiation was, however, still necessary. In the
third iteration the instantiation process was automated, leaving only three steps to be
performed: changing the two input variables to their required values and running the code.
Figure 29: KB (and IE) for the reference creation in the skeleton (EXCEL sheets listing, per configuration, the X/Y/Z coordinates, names, geometrical sets and state variables of the reference points for the grating, the cross section supports and the straight supports)
Table 4: Automation Index for the different iterations of CAD automation
Iteration AI Comments
1. Iteration 243 For a product with 57 components
2. Iteration 118 For a product with 57 components
3. Iteration 3 For all products
4.6.2 Automating the RM Generation
The automation of the RM generation was divided into two parts: the tag generation and the
task generation. Each part will be presented in the following two chapters.
4.6.2.1 Automation of the Tag Generation
The tag generation was structured similarly to the process described in chapter 4.6.1. An
information hub, shown in Figure 31, served as the KB and IE by allocating the correct
information to the components being worked on. This information hub was linked to the code
“Create Place & Weld Tags” shown in Figure 32 (right). The required reference part was
selected through its name combined with the instance integer. Afterwards, the tag category
was chosen, with four options available: place, place approach, weld, and weld
approach. The place category creates tags in the fixture and is used as a destination for the
components, i.e. where they must be placed. The weld category details where weld operations
were performed. Approach tags were used to approach place and weld tags (as well as the later
mentioned pick tags) from the correct side and orientation. Depending on the category chosen,
the references were accessed through the four segments visualized in Figure 31. The HLCts
contained reference points for their tag locations. The coordinates and YPR angles of the
reference tag, listed in “FIXBOT - REF TAGS” (Figure 31), were acquired and the tag
coordinates calculated. As the RM of the FIXBOT was manually determined, all of its positions
and orientations, and thereby those of the fixture, were known. From this, the tag coordinates and
orientations were calculated by performing an axis transformation as detailed in chapter 2.3,
using Eq. 1 to Eq. 10. A flowchart of the axis transformation code is provided in Appendix D.
The reference tag defined the position and orientation of the FIXBOT and, by extension, of the
fixture. Since the origins of the fixture and the GALLERDURK coincide, the reference tag could
be used to calculate all needed points in all FIXBOT positions by performing the respective rotational
and translational transformation. Additionally, the method detailed in chapter 2.3 using Eq. 7
to Eq. 10, was used to calculate the orientation of the ASSEMBOT when placing components.
To assemble the workpiece correctly the orientation of the ASSEMBOT needed to correspond
to the FIXBOT orientation. This was done by performing a modified axis transformation on the
YPR angles of the FIXBOT reference tag, compare with flowchart in Appendix D. By applying
a rotational transformation to the reference YPR angles, the corresponding YPR angles of the
ASSEMBOT were calculated. The required values were accessed through the “FIXBOT - REF
TAGS” and “ORIENT OPT” lists shown in Figure 31. As these steps differed between the
place/weld & pick tags, different information hubs and scripts were necessary.
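Since Eq. 1 to Eq. 10 are not reproduced in this chapter, the axis transformation can only be sketched generically. The routine below applies a yaw-pitch-roll rotation to a point; the rotation order (Z-Y-X) and the use of degrees are assumptions, and the thesis' Eq. 1 to Eq. 10 define the actual convention.

```vba
' Generic sketch of a yaw-pitch-roll rotation applied to a point. Rotation
' order (Z-Y-X) and degrees are assumptions; Eq. 1 to Eq. 10 define the
' actual convention used in the thesis.
Sub RotateYPR(ByRef x As Double, ByRef y As Double, ByRef z As Double, _
              yaw As Double, pitch As Double, roll As Double)
    Const DEG2RAD As Double = 3.14159265358979 / 180#
    Dim cy As Double, sy As Double, cp As Double, sp As Double
    Dim cr As Double, sr As Double
    cy = Cos(yaw * DEG2RAD):   sy = Sin(yaw * DEG2RAD)
    cp = Cos(pitch * DEG2RAD): sp = Sin(pitch * DEG2RAD)
    cr = Cos(roll * DEG2RAD):  sr = Sin(roll * DEG2RAD)

    Dim nx As Double, ny As Double, nz As Double
    ' rotation matrix R = Rz(yaw) * Ry(pitch) * Rx(roll) applied to (x, y, z)
    nx = cy * cp * x + (cy * sp * sr - sy * cr) * y + (cy * sp * cr + sy * sr) * z
    ny = sy * cp * x + (sy * sp * sr + cy * cr) * y + (sy * sp * cr - cy * sr) * z
    nz = -sp * x + cp * sr * y + cp * cr * z

    x = nx: y = ny: z = nz
End Sub
```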
Figure 31: Information hub for the PLACE/WELD/APPROACH tag creation
The corresponding tag group was accessed through the information hub and the tag created.
This process was done for all instances of all parts, except for the BAER and STOED
components. As these were pre-welded only one pick & place tag was needed for all instances
combined.
Figure 32: Main tag creation code (left) and "Create Place & Weld Tags" code (right)
The tag creation for the pick & pick approach tags was similar to the process described above.
An information hub, shown in Figure 33, provided the required references and a similar code,
compare with the “Create Pick Tags” code in Figure 34 (right), created the tags. The difference
from the previous process was that the workpiece components were not connected to the RM of the
the FIXBOT. Prior to assembly they were relocated next to the ASSEMBOT and the tags of the
FIXBOT could not be used to calculate their tag coordinates. When copying the coordinates of
the reference points, the local coordinates of the part were acquired. These were transformed
through a coordinate transformation [40] to get the global coordinates. The relocation of parts
followed the flowchart “Move Workpiece Components” presented in Figure 34 (left). The
individual components were required to be moved to the same destination each time,
independent of their initial location. This was accomplished by defining the coordinates of the
final location and linking it to a point in the component. Getting the coordinates of said point
allowed the destination of the part origin to be calculated and the part to be moved. This was
repeated until all components had been moved within reach of the ASSEMBOT. Moving the parts
was accomplished by accessing the compass tool through VBA.
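The two calculations above, transforming a local point to global coordinates and deriving the part-origin destination from a linked point, can be sketched as follows (in Python for brevity, though the thesis implementation used VBA; all numerical values are illustrative):

```python
def add(a, b): return [x + y for x, y in zip(a, b)]
def sub(a, b): return [x - y for x, y in zip(a, b)]

def rotate(R, p):
    """Apply a 3x3 rotation matrix R to a 3-vector p."""
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

def local_to_global(R, t, p_local):
    """Part-local point to global coordinates: p_global = R * p_local + t."""
    return add(rotate(R, p_local), t)

def origin_destination(p_dest_global, p_local, R):
    """Where must the part origin go so that the linked point lands on the
    fixed destination? Solving p_dest = R * p_local + t for t."""
    return sub(p_dest_global, rotate(R, p_local))

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]        # part not rotated (illustrative)
p_local = [100.0, 0.0, 0.0]                    # linked point in part coordinates
p_dest = [1500.0, -800.0, 1200.0]              # fixed global destination

t = origin_destination(p_dest, p_local, I3)    # move the part origin here
assert local_to_global(I3, t, p_local) == p_dest
```

The same arithmetic covers both directions: the global tag coordinates are obtained from local point coordinates, and the move destination is obtained by inverting that relation for the part origin.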
Figure 33: Information hub for PICK/APPROACH tag creation
Figure 34: "Move Workpiece Components" code (left) and "Create Pick Tags" code (right)
Table 5 shows the AI for all iterations of the tag generation process. In the first iteration, each
tag required 13 manual steps: the XYZ coordinates of the reference point and the reference tag,
the YPR angles of the reference tag and the to-be-created tag, as well as the destination tag
group had to be entered. It was possible to perform all steps before running the code once, thus
resulting in n = 1 and AI = 8685, with the number of tags for a model of size 1050x1050 [mm] being 668.
For the subsequent iterations, the definition of the reference point and tag group (2nd iteration),
the reference tag (3rd iteration) and the YPR angles of the created tag (4th iteration) was
automated. The AI = 1 for iteration four stems from running the code. All created tags, with their XYZ
coordinates and YPR angles, the corresponding part name and reference tag, along with further
information, were stored in an EXCEL sheet for later use.
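Storing the tag data can be pictured as writing one record per tag; the sketch below writes CSV text for simplicity (the thesis stored the data in an EXCEL sheet via VBA; field names and values are illustrative):

```python
import csv
import io

# One record per created tag: name, XYZ coordinates, YPR angles,
# owning part and the reference tag it was derived from.
tags = [
    {"tag": "PLACE_WALL_ONE", "x": 1517.975, "y": -1488.886, "z": 1322.693,
     "yaw": 0.0, "pitch": 18.727, "roll": -37.0,
     "part": "WALL_ONE", "ref_tag": "Tag2"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=list(tags[0].keys()))
writer.writeheader()          # column names become the header row
writer.writerows(tags)        # one line per tag
print(buffer.getvalue().splitlines()[0])
```

Keeping one flat row per tag is what later allows the task generation to find all rows for a given reference tag or part name by simple searching.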
Table 5: Automation Index for the different iterations of tag automation
Iteration AI Comments
1. Iteration 8685 For a product with 57 components
2. Iteration 6013 For a product with 57 components
3. Iteration 2004 For a product with 57 components
4. Iteration 1 For all products
4.6.2.2 Automation of the Task Generation
In the following chapter, the automation of the task generation will be described. This will
primarily be visualized through the automation of the ASSEMBOT. Additionally, all results of
further interest will be presented.
The basis for the automatic task generation was the preceding tag generation, as well as the manual
creation of all sub-processes and robot tasks, detailed in chapter 4.5.2. Furthermore, some
manually created tags and a general structure for the operations to be created, shown in the two
tables on the right side of Figure 35, were required. As previously, a KB and IE were built in
EXCEL and VBA, compare Figure 35. A code, visualized through the flowchart in Figure 36
(left), was deployed to create the RM for the ASSEMBOT. The process started by selecting the
reference point of the FIXBOT, following the list detailed in Figure 31. The list of all tags was
searched for all rows containing the specified reference tag, and by running through the list of
parts, the first part was defined. The rows containing said part’s “pick”, “pick approach”, “place”
and “place approach” tags were selected and the part name saved for later use, see Figure 36 (right).
Afterwards, the motion was defined following the pre-defined operation structure, compare
Figure 37 (left). The robot task was selected based on the reference tag used and the motion
created. This process was looped until a motion for all parts linked to the activated reference
tag was created. Next, a final operation was created, compare Figure 36 & Figure 37 and the
next reference tag was activated. Subsequently, the BAER and STOED components were
assembled in the fixture and finally the motion for unloading the workpiece was created, shown
in Figure 37 (right). The operation “robot motion” was created by accessing the API
(Application Programming Interface) through VBA. For the “pick”, “place” and “weld”
operations each, a single operation was manually created. The operations throughout the robot
tasks were created by copy-pasting and modifying said operations, as no API exists. This
method relied on an apparent bug for “pick” and “place” operations. Only the gripping element
could be modified through VBA; however, when copying the operation, the gripping element
and the element being gripped switched places. By modifying the operation before and
after the copy-paste process, it was possible to produce a functioning “pick” or “place” operation.
For this to work, all parts needed to be placed in the “ResourceList”, see Figure 26.
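The workaround can be illustrated schematically. Assume an operation holds a gripper element and a gripped element, only the gripper slot is writable, and copying swaps the two (the apparent bug). Writing the writable slot before and after the copy then yields a fully defined operation. This is a pure-Python illustration of the logic, not DELMIA API code; the class and field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PickOperation:
    gripper: str   # writable slot (the only one accessible in the real tool)
    gripped: str   # not directly writable

    def copy(self):
        # The apparent bug: copying swaps gripper and gripped.
        return PickOperation(gripper=self.gripped, gripped=self.gripper)

def make_pick(template: PickOperation, gripper: str, part: str) -> PickOperation:
    template.gripper = part    # 1) write the part into the writable slot
    op = template.copy()       # 2) copy: the part lands in the gripped slot
    op.gripper = gripper       # 3) write the real gripper afterwards
    return op

template = PickOperation(gripper="GRIPPER", gripped="DUMMY")
op = make_pick(template, gripper="GRIPPER", part="WALL_ONE")
assert (op.gripper, op.gripped) == ("GRIPPER", "WALL_ONE")
```

As the discussion notes, this only works because the swap happens on copy; if the bug were fixed, step 2 would no longer populate the gripped slot.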
Figure 35: Information hub and KB for the ASSEMBOT RM creation
Figure 36: Creation of the RM for ASSEMBOT (left) and “Search Tag List” sub-process (right)
Figure 37: "Create Motion" (left) and "Unload Gallerdurk" (right) sub-processes of the ASSEMBOT RM creation
The FIXBOT motion predominantly relied on manually created tags. Similar to the
ASSEMBOT motion, an operation structure was created and accessed through code.
Furthermore, upon locating the part in Figure 36 (left), its name and when it was to be accessed
were saved in the information hub for the FIXBOT RM. Using this, the RM for the FIXBOT
was automated, relying on similar code as detailed for the ASSEMBOT. A flowchart of the
FIXBOT RM code can be found in Appendix D. The RM of the WELDBOT is almost
identical to the motion of the ASSEMBOT, shown in Figure 36, with multiple weld points per
part. A flowchart of the WELDBOT RM code and a combined flowchart for the whole process
can be found in Appendix D.
Table 6 shows the AI for all iterations of the task creation process. In the first iteration
each motion task required four manual steps, while all “pick”, “place” & “weld” tasks needed
six. It was possible to perform all steps before running the code once, thus resulting in n = 1 and
AI = 6140, with the number of tasks for a model of size 1050x1050 [mm] being 1266. For the
subsequent iteration, the creation of motion tasks was automated, leaving the “pick”, “place” &
“weld” tasks to be created manually, thus raising n to n = 2. This leads to an increase in the AI
from the first to the second iteration. For the third iteration, the AI = 1 stems from running the code.
Table 6: Automation Index for the different iterations of task automation
Iteration AI Comments
1. Iteration 6140 For a product with 57 components
2. Iteration 6457 For a product with 57 components
3. Iteration 1 For all products
4.7 Analysis
As detailed in chapter 3.6, a comparison of the elapsed time for a manual and an automatic OLP
process was created, visualized in Figure 38. For this, a design of experiments with 90 entries
was created, containing varying sizes of the workpiece. The area, displayed on the x-axis, was
calculated by multiplying the input parameters length and width. The elapsed time
of the automatic process, displayed in minutes on the second y-axis, was the run-time of the
code. The manual time, on the first y-axis in hours, was estimated using the number of components
requiring manual execution. The time for a single component, for example creating a
point, was estimated by manually creating and measuring it in CATIA. These times were then
multiplied by the number of components required and summed to get the manual time estimation.
A list of time estimations for all components is provided in Table 7. This estimation represents a
simplified process and does not take breaks, moments of decreased productivity or other
deviations into consideration. Additionally, it is assumed that all HLCt were already created
and ready for instantiation, and that all required tag and point coordinates and YPR angles were
calculated. For the automatic process, only the run-time of the code was considered; the time
needed for setting up the code was disregarded.
For both data sets, a rise in the elapsed time can be seen for increased areas. Furthermore, a
spread with a defined lower and upper limit is visible in both data sets. The automatic elapsed
time starts below 5 min for the smallest model and rises to above 20 min for larger models.
The trendline ETA, displayed in Figure 38, shows an estimated linear correlation in the data set
for the elapsed automatic time. The estimated manual time starts below 15 h for the smallest
model, with above 40 h required for larger models. The elapsed manual time was estimated
through the trendline ETM. It is of note here that the area eliminates information about the
specific values of length and width: a workpiece of size 1050x900 [mm] does not have the
same number of parts as a workpiece of 900x1050 [mm], due to the different spacing between
BAER and STOED instances. Using Eq. 12, it was calculated that the code reduces the time
required for the OLP process by 99%.
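Assuming the trendline units of Figure 38 (ETA in minutes, ETM in hours, A in m²), the roughly 99% reduction can be checked with a few lines:

```python
def etm_hours(area):
    """Estimated manual time trendline from Figure 38 [h]."""
    return 8.1373 * area + 15.103

def eta_minutes(area):
    """Estimated automatic time trendline from Figure 38 [min]."""
    return 6.7199 * area + 5.942

# Reduction = 1 - automatic/manual, with both times in the same unit.
for area in (0.2, 1.0, 2.0, 3.0):
    reduction = 1 - (eta_minutes(area) / 60) / etm_hours(area)
    assert 0.985 < reduction < 0.995   # roughly 99% across the tested range
```

The unit conversion matters: dividing the trendlines directly, without converting minutes to hours, would understate the reduction considerably.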
Figure 38: Comparison of the elapsed time between the manual and automatic OLP process
Figure 39: Distribution of Elapsed Time for a Model the Size 916x916[mm]
Figure 39 shows the distribution of the elapsed time for a single model of size 916x916 [mm].
The largest frequency, just below 25, occurred in the range (00:10:12, 00:10:30], with the next
two largest frequencies in the ranges (00:10:30, 00:10:49] & (00:10:49, 00:11:07]. A single
outlier can be seen in the range (00:08:41, 00:08:59]. Using Eq. 13 & Eq. 14, the standard
deviation was calculated as σ = 25 s, centered around the arithmetic mean
x̄ = 00:10:33 [hh:mm:ss].
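The mean and standard deviation of the run times can be computed as below; the sample list is illustrative, not the actual 90-run data set:

```python
import statistics

def to_seconds(hms: str) -> int:
    """Convert an hh:mm:ss string to whole seconds."""
    h, m, s = map(int, hms.split(":"))
    return 3600 * h + 60 * m + s

# Illustrative run times around the reported mean of 00:10:33
runs = ["00:10:20", "00:10:33", "00:10:46", "00:10:33"]
secs = [to_seconds(r) for r in runs]

mean = statistics.mean(secs)       # arithmetic mean in seconds
sigma = statistics.pstdev(secs)    # population standard deviation
assert mean == to_seconds("00:10:33")
```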
Trendlines in Figure 38: ETM = 8.1373·A + 15.103 [h]; ETA = 6.7199·A + 5.942 [min]
Table 7: Time estimation for the individual components
Component Estimated Time [s]
Creating Point 18
Instantiating HLCt 60
Changing Parameter 10
Creating Tag 60
Creating Tag Group 11
Creating Operation 29
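Using the values of Table 7, the manual time estimation is a weighted sum over the component counts; the counts below are illustrative, not those of an actual model:

```python
# Estimated time per manual action, from Table 7 [s]
TIME = {"point": 18, "hlct": 60, "parameter": 10,
        "tag": 60, "tag_group": 11, "operation": 29}

# Illustrative counts of required manual actions for one model
counts = {"point": 700, "hlct": 57, "parameter": 120,
          "tag": 668, "tag_group": 16, "operation": 1266}

total_s = sum(counts[k] * TIME[k] for k in counts)
total_h = total_s / 3600
print(f"estimated manual time: {total_h:.1f} h")
```

This mirrors the procedure described in chapter 4.7: per-component times measured once in CATIA, multiplied by the counts, and summed.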
MASTER THESIS DISCUSSION
- 41 -
5 Discussion
The focus of the thesis was on investigating the potential of automating the OLP process of
industrial robots through VBA scripting and how it can be utilized to achieve this, as stated in
research question one in chapter 1.4.
As showcased in chapter 4.6.2, VBA scripting was utilized for automating repetitive tasks. For
the tag generation, APIs were accessible, and tag values could be calculated by measuring the
corresponding point coordinates and performing an axis transformation, as mentioned in chapter
4.6.2.1. For the task and operation generation, access to APIs and VBA scripting was only
conditionally applicable. Whilst tasks and robot motions had corresponding APIs, these did not
exist for other operations such as “pick”, “place” or “weld”. These operations could be created
by copy-pasting manually created operations and modifying them. Selecting, copy-pasting
and modifying the desired component is, however, computationally more expensive than
accessing an API to create a new component. This resulted in an increased run-time of the code.
Additionally, as explained in chapter 4.6.2.2, only the grabbing element was accessible through
VBA, and an apparent bug caused it to switch places with the grabbed element when copy-pasting.
While this allowed the full modification of the “pick” and “place” operations, it
increased the complexity of the code and required all components to be placed in the
“ResourceList”. Fixing the apparent bug would fully remove the accessibility of the “pick” and
“place” operations through VBA. Many other tools could not be accessed through APIs or copy
& pasting methods altogether. As described in chapter 4.5.2, different processes, all linked to a
specific robot and task, are required to simulate multiple robots in sequence. This was not
possible to automate through VBA, thus requiring meticulous planning from the start, as
adjustments were time-consuming. Robot analysis tools, such as the reach analysis were not
accessible either, thus requiring manual post processing after the code was run.
The second research question asked what influence the CAD model design has on the
automation process and how it should be structured; compare chapter 1.4.
The CAD model generation, visualized in Figure 30, was the backbone of the automation
process. Utilizing the methods of HLCt presented in [10], a robust and flexible model was
created, able to change configuration and topology of the CAD model, providing in return a
flexible basis for the OLP automation. For this, supplying information to the robot about tags
or what tasks were required, i.e., “pick”, “place” or “weld” operations, was necessary. This could
be accomplished by storing the information in the form of points in the individual HLCt, which were
accessible through a KB and IE. While the CAD model did not have a large impact on the
overall automation process, as, independent of the geometry, tag locations and tasks were
always needed, on a smaller scale it did have an impact. Depending on how the CAD model
was structured, a different KB and IE would be needed to locate the required points, with
increasing complexity in the CAD model increasing the complexity of the KB and IE. A
structured approach to modelling the HLCt was necessary, to enable the same code to be utilized
for all HLCt. The structure proposed in chapter 4.5.1, clearly defined the location of all points
required for later use. This allowed the selection of required points via code and the automatic
creation of tags at the specified coordinates; however, it could limit the reusability of the HLCt. As
the same HLCt instantiated in a different surrounding might require different welding points,
these would need to be included in the HLCt as well, thus increasing the complexity of
selection. All weld locations are required to be pre-planned, thus requiring the HLCt to be
updated if instantiated in a new environment. Due to this, the general structure could be applied
to different products, yet each would require reworking for it to fully function.
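The point selection described here can be pictured as a minimal KB with an IE query; the sketch below uses the geometrical-set names from chapter 4.5.1, while the point names and counts are illustrative:

```python
# Minimal KB: for each HLCt, the points stored per geometrical set
KB = {
    "WALL_ONE": {
        "WELD": ["weld_1", "weld_2", "weld_3",
                 "weld_4", "weld_5", "weld_6"],
        "WELD_APP": ["weld_app_1", "weld_app_2"],
        "PICK": ["pick_1"],
    },
}

def resolve_points(kb, hlct: str, geo_set: str):
    """IE step: return the point names a robot task needs for one HLCt."""
    try:
        return kb[hlct][geo_set]
    except KeyError:
        raise ValueError(f"no geometrical set '{geo_set}' stored for '{hlct}'")

assert len(resolve_points(KB, "WALL_ONE", "WELD")) == 6
```

A more complex CAD structure would show up here directly: more nesting in the KB and more logic in the IE lookup, which is the coupling the discussion points to.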
As stated in the beginning of this thesis, the manual OLP process is quite tedious and time-
consuming, with [3] stating that this keeps small and medium sized companies from
implementing this technology. Especially for small batch sizes and/or a large product variety
the time consumption and the related cost of manually programming an industrial robot unit
outweighs the benefit of implementation.
Chapter 4.7 and Figure 38 support that this thesis proposes a concept capable of decreasing
the time required for the OLP process. Overall, it was estimated that the code reduces the time
required by 99%. Figure 39 shows the consistency of the code, with a standard deviation of 25 s.
Even though some outliers occur, this, combined with the trendlines shown in Figure 38, gives a
good basis for estimating the run-time. This estimation can be further improved by creating
different trendlines for different configurations, as a length×width model is not identical to a
width×length model; compare the description in chapter 4.7. The reduction of 99% needs to be put into perspective, however,
since additional time would be required to adapt the structure to the required model. As
mentioned before, Figure 38 only considered the actual run-time of the code and estimated the
manual time for all steps that were automated. When implementing an automatic process,
however, the time needed for implementing the code needs to be considered. Especially for
small batch sizes, this would reduce the effect the automation has on the time reduction. To get
a more accurate estimation of the automation process, the implementation should be tested for
different products. Furthermore, post-processing is not included in the time estimation for the
automatic process; the same applies to the manual process. Additionally, the calculation process
for all tag coordinates was disregarded in the manual time estimation as a simplification. This,
in turn, would further increase the time difference between the two processes, as would the
breaks and deviations from the estimated component times. Implementing and improving on
the concept presented could be of great benefit to companies by significantly decreasing the
time required for programming.
MASTER THESIS CONCLUSION
- 43 -
6 Conclusion
Programming industrial robots is a time-consuming process, making its implementation harder
for small or medium sized companies or for smaller batch sizes, such as prototyping. This
makes companies unable to exploit the advantages industrial robots entail, such as increased
accuracy and quality in welding processes. The aim of this thesis was to develop a concept that
automates the OLP process, thus decreasing the time necessary before implementation. This
was successfully accomplished for a flexible product geometry. It was possible to decrease the
estimated manual time by 99% through the implementation of DA. Incorporating reference
points (for welding and other uses) directly in the HLCt allows for the automatic creation of
context-specific information required by the automation process. The instantiated part merely
needs to be accessed through an IE to gain the coordinates the robot is required to travel to.
This results in all weld locations needing to be known at the modeling stage. It is of further
note, that this thesis only developed a concept for a single model type (Gallerdurk) and only for
spot welding and not the full welding process. These limitations were necessary due to the time
restrictions imposed on the thesis. However, the concept proved auspicious and with further
development this could enable the use of industrial robots for smaller companies, smaller batch
sizes or potentially even prototyping.
Throughout the thesis, different areas of interest have crystallized. Further investigating the
missing APIs in the DELMIA environment could be of great benefit; eliminating the necessity
for copy-pasting operations would increase the overall performance of the code. Including
steps for post-processing, such as clash detection or robot reach analysis could further reduce
the manual input needed. This would have the potential to be paired with multidisciplinary
design optimization (MDO) to fully automate and optimize the robot placement to ensure
reachability. This would greatly increase the computational expense of the code, and the
applicability, especially for smaller companies, would be debatable, but in terms of scientific
potential it is an interesting subject. MDO has the potential to be included on smaller scales as
well. By investigating path planning for the robot, it could be possible to optimize for the
shortest path to further reduce the manufacturing time of parts. Research to explore could be
the “Anytime Ant System” for manipulator path planning by [41] or the work presented in [18].
Additionally, the generalizability of the concept developed should be improved to allow for an
easier and faster implementation. From this, the time required for implementing the automation
concept could be estimated to allow for a more accurate representation of time difference
between the manual and automatic process.
MASTER THESIS REFERENCES
- 44 -
7 References
[1] Z. Pan, J. Polden, N. Larkin, S. V. Duin and J. Norrish, “Recent progress on
programming methods for industrial robots,” Robotics and Computer-Integrated
Manufacturing, vol. 28, no. 2, pp. 87-94, 2012.
[2] J. Xiaoshu and Y. Xichen, “Off-Line Programming of a Robot for Laser Re-
Manufacturing,” Tsinghua Science and Technology, vol. 14, no. S1, pp. 186-191, 2009.
[3] L. A. Ferreira, Y. L. Figueira, I. F. Iglesias and M. Á. Souto, “Offline CAD-based robot
programming and welding parametrization of a flexible and adaptive robotic cell using
enriched CAD/CAM system for shipbuilding,” Procedia Manufacturing, vol. 11, pp.
215-223, 2017.
[4] U. Thomas, “Automatisierte Programmierung von Robotern für Montageaufgaben,” in
Ausgezeichnete Informatikdissertationen 2008, Bonn, Hölldobler, 2009, pp. 291-300.
[5] “dictionary.com,” DICTIONARY, [Online]. Available:
https://www.dictionary.com/browse/automation. [Accessed 12 02 2021].
[6] “techopedia.com,” techopdia, 07 08 2020. [Online]. Available:
https://www.techopedia.com/definition/32099/automation#:~:text=Automation%20is
%20the%20creation%20and,were%20previously%20performed%20by%20humans..
[Accessed 12 02 2021].
[7] C. W. Dankwort, R. Weidlich, B. Guenther and J. E. Blaurock, “Engineers` CAx
education - it´s not only CAD,” Computer-Aided Design, vol. 36, no. 14, pp. 1439-1450,
2004.
[8] M. M. M. Sarcar, K. Mallikarjuna Rao and K. Lalit Narayan, Computer Aided Design
and Manufacturing, PHI Learning Pvt. Ltd., 2008.
[9] J. D. Camba, M. Contero and P. Company, “Parametric CAD modeling: An analysis of
strategies for design reusability,” Computer-Aided Design, vol. 74, pp. 18-31, 2016.
[10] K. Amadori, M. Tarkian, J. Ölvander and P. Krus, “Flexible and robust CAD models for
design automation,” Advanced Engineering Informatics, vol. 26, no. 2, pp. 180-195,
2012.
[11] C. B. Chapman and M. Pinfold, “Design engineering - a need to rethink the solution
using knowledge based engineering,” Knowledge-Based Systems, vol. 12, no. 5-6, pp.
257-267, 1999.
[12] G. Frank, D. Entner, T. Prante, V. Khachatouri and M. Schwarz, “Towards a Generic
Framework of Engineering Design Automation for Creating Complex CAD Models,”
International Journal on Advances in Systems and Measurements, vol. 7, no. 1 & 2, pp.
179-192, 2014.
[13] G. La Rocca, “Knowledge based engineering: Between AI and CAD. Review of a
language based technology to support engineering design,” Advanced Engineering
Informatics, vol. 26, no. 2, pp. 159-179, 2012.
[14] S. B. Niku, Introduction to Robotics 3rd Edition: Analysis, Control and Applications,
San Luis Obispo: John Wiley & Sons, 2020.
[15] “interestingliterature.com,” Interesting Literature, [Online]. Available:
https://interestingliterature.com/2016/03/the-curious-origin-of-the-word-robot/.
[Accessed 16 02 2021].
[16] ISO Standard 8373:1994, Manipulating Industrial Robots - Vocabulary.
[17] J. Polden, Z. Pan, N. Larkin, S. Van Duin and J. Norrish, “Offline Programming for a
Complex Welding System Using DELMIA Automation,” in Robotic Welding,
Intelligence and Automation, Wollongong, Springer-Verlag, 2011, pp. 341-350.
[18] T. Zhang and S. Chen, “Path Planning and Computer Simulation of a Mobile Welding
Robot,” in Robotic Welding, Intelligence and Automation, Shanghai, Springer-Verlag,
2011, pp. 421-428.
[19] F. Reynier, P. Chedmail and P. Wenger, “Optimal positioning of robots: feasibility of
continuous trajectories among obstacles,” in IEEE International Conference on Systems,
Man and Cybernetics, Chicago, IL, USA, USA, 1992.
[20] J. J. Craig, Introduction to Robotics: Mechanics and Control, New Jersey, USA: Prentice
Hall, 2004.
[21] National Aeronautics and Space Administration, “grc.nasa.gov,” 07 May 2021.
[Online]. Available: https://www.grc.nasa.gov/www/k-12/airplane/rotations.html.
[Accessed 10 May 2021].
[22] W. Khalil and E. Dombre, Modeling, Identification & Control of Robots, New York,
USA: Routledge, 2002.
[23] H. Wang and Y. K. Rong, “Case based reasoning method for computer aided welding
fixture design,” Computer-Aided Design, vol. 40, no. 12, pp. 1121-1132, 2008.
[24] M. Hajduk, J. Semjon and M. Vagaš, “Design of the Welding Fixture for the Robotic
Stations for Spot Welding Based on the Modular Concept,” Acta Mechanica Slovaca,
vol. 13, no. 3, pp. 30-37, 2009.
[25] H. Wang, Y. K. Rong, H. Li and P. Shaun, “Computer aided fixture design: Recent
research and trends,” Computer-Aided Design, vol. 42, no. 12, pp. 1085-1094, 2010.
[26] Y. Zhang, J. Wen, J. Zeng and Q. Xiong, “Design method and information representation
for Computer Aided Welding Fixture Design,” in 2016 5th IIAI International Congress
on Advanced Applied Informatics (IIAI-AAI), Kumamoto, Japan, 2016.
[27] J. Zhang, J. Yang and B. Li, “Development of a Reconfigurable Welding Fixture System
for Automotive Body,” in 2009 ASME/IFToMM International Conference on
Reconfigurable Mechanisms and Robots, London, UK, 2009.
[28] J. F. Lancaster, “The physics of welding,” Physics in Technology, vol. 15, no. 73, pp.
73-79, 1984.
[29] H.-J. Bargel and G. Schulze, “Schweißen,” in Werkstoffkunde, 11., bearbeitete Auflage,
Heidelberg, Springer Vieweg, 2012, pp. 97-107.
[30] S. Katayama, “Introduction: fundamentals of laser welding,” in Handbook of Laser
Welding Technologies, Osaka, Woodhead Publishing, 2013, pp. 3-16.
[31] H. Wittel, D. Muhs, D. Jannasch and J. Voßiek, “Schweißverbindungen,” in
Roloff/Matek Maschinenelemente 22.,überarbeitete und erweiterte Auflage, Wiesbaden,
Germany, Springer Vieweg, 2015, pp. 215-198.
[32] K. Weman and G. Lindén, MIG welding, New York: Woodhead Publishing Limited,
2006.
[33] W. Malsam, “projectmanager.com,” PROJECTMANAGER, 24 09 2019. [Online].
Available: https://www.projectmanager.com/blog/proof-of-concept-definition.
[Accessed 17 02 2021].
[34] M. Singaram and P. Jain, “entrepreneur.com,” Entrepreneur INDIA, 13 01 2018. [Online].
Available: https://www.entrepreneur.com/article/307454. [Accessed 17 02 2021].
[35] X. Liu, W. J. Zhang, Y. L. Tu and R. Jiang, “An Analytical Approach to Customer
Requirement Satisfaction in Design Specification Development,” IEEE
TRANSACTIONS ON ENGINEERING MANAGEMENT, vol. 55, no. 1, pp. 94-102,
2008.
[36] H. Johansson, J. G. Persson and D. Pettersson, Produktutveckling-Effektiva metoder för
konstruktion och design, Stockholm: Liber AB, 2018.
[37] C. R. Bryant, R. B. Stone, D. A. McAdams, T. Kurtoglu and M. I. Campbell, “Concept
Generation from the Functional Basis of Design,” in ICED 05: 15th International
Conference on Engineering Design: Engineering Design and the Global Economy,
Barton, A.C.T., Engineers Australia, 2005, pp. 1702-1715.
[38] P. D.-I. H. Meerkamm, “Technische Statistik,” in Technisches Taschenbuch,
Herzogenaurach, Schaeffler Technologies GmbH & Co. KG, 2014, pp. 86-101.
[39] ABB, “new.abb.com,” ABB, [Online]. Available:
https://new.abb.com/products/robotics/industrial-robots. [Accessed 22 03 2021].
[40] G. Veredas, “Community of experts of Dassault Systèmes Solutions,” Dassault
Systèmes Solutions, 29 September 2008. [Online]. Available:
http://www.coe.org/p/fo/et/thread=17535. [Accessed 02 April 2021].
[41] D. Wang, N. M. Kwok, G. Fang and Q. P. Ha, “Anytime Ant System for Manipulator
Path planning,” in Robotic Welding, Intelligence and Automation, Berlin Heidelberg,
Springer-Verlag, 2011, pp. 411-420.
[42] Weland, “weland.com,” Weland, [Online]. Available: https://www.weland.com/sv-
se/produkter/gallerdurk/. [Accessed 01 06 2021].
MASTER THESIS APPENDIX
8 Appendix
8.1 Appendix A
Figure labels of the HLCt and fixture models:
HLCt PLANEFIXTURE, HLCt WALL ONE, HLCt WALL TWO, HLCt SUPPORT NO CUT, HLCt SUPPORT ONE CUT, HLCt SUPPORT TWO CUT, HLCt BAER, HLCt STOED, HLCt FIXTURE, FIXTURE, FIXTURE WITH GALLERDURK
MASTER THESIS APPENDIX
iv
8.2 Appendix B
Design Specification
Number / Category / Description
1 Skeleton Model
1.1 Design – The model follows the given design and includes all design elements.
1.2 Parameter – The skeleton model can change length and width through parameters.
1.3 Measurement – Except for the parameterized values, all measurements are fixed.
1.4 Design features – The skeleton model contains all constant features needed as references to instantiate the HLCt.
1.5 Design features – All changing features needed as references are created through code.
2 HLCt
2.1 Design features – The HLCt have a separate Geometrical Set called “External References” containing the needed external references.
2.2 Design features – The HLCt have an additional Geometrical Set called “WELD” containing the welding locations.
2.3 Design features – The HLCt have an additional Geometrical Set called “WELD_APP”, containing the approach and retreat locations for each welding operation.
2.4 Design features – The HLCt have an additional Geometrical Set called “PICK”, containing the location for picking up the part.
2.5 Design features – The HLCt have an additional Geometrical Set called “PICK_APP”, containing the approach and retreat locations for each pick operation.
2.6 Design features – The HLCt have an additional Geometrical Set called “FIX” (and “FIX2”), containing the reference points for the fixture components.
3 Tags
3.1 Organization – The tags are organized into the corresponding tag lists for pick, place, and weld under their respective HLCt name.
3.2 Naming – Tags are named after their tag group and HLCt name.
3.3 Numbering – Tags are numbered continuously, to create a unique name and number combination for all tags.
3.4 Accessibility – Reference points, reference tags and the created tags are stored in an Excel sheet called “TAG”.
3.5 Accessibility – Stored are the part name, point name, tag name, geometric set, tag group, XYZ values, and YPR angles.
3.6 References – A reference tag with its XYZ values and YPR angles is allocated to all place, place approach, weld and weld approach tags.
3.7 References – Pre-defined YPR angles are allocated to all tags.
4 Robot Environment and Motion
4.1 Concept – There are three robots: one for handling, one for positioning and one for welding.
4.2 Naming – The positioning robot is called FIXBOT, the handling robot ASSEMBOT and the welding robot WELDBOT.
4.3 Motion – The FIXBOT moves the whole product through the design space.
4.4 Motion – The robot motion of the FIXBOT is pre-defined but created via code.
4.5 Motion – The motions of the ASSEMBOT and WELDBOT are created through code.
4.6 Motion – Standard movement blocks between the part-specific tags are created manually and stored in Excel.
5 GUI
5.1 Input – The values of the length and width can be manipulated via code.
5.2 Output – The required reference points and, if needed, their coordinates are made accessible to the inference engine.
6 Fixture
6.1 Design – The fixture is modular to allow use for different configurations of gratings.
6.2 Design – The fixture is only created as a non-functional black-box model.
6.3 Assembly – The fixture can be assembled by the robot.
6.4 References – All reference points created are stored in the file “FIXTURE_REFERENCE_POINTS”.
7 General
7.1 Design – Depending on the chosen length and width of the product, the inner structures vary accordingly.
7.2 Manufacturing – Welds are only allowed to be created horizontally from above.
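Spec items 3.2 and 3.3 together imply a simple scheme: a tag name combines the tag group and the HLCt name, while a counter running continuously across all tags guarantees uniqueness. A minimal sketch of that scheme in Python (the thesis itself uses EXCEL VBA; the underscore separator and the class/method names here are assumptions for illustration):

```python
class TagNamer:
    """Generate unique tag names per spec items 3.2-3.3:
    tag group + HLCt name + a continuously running number.
    The '_' separator is an assumed convention, not taken from the thesis."""

    def __init__(self):
        self.counter = 0  # numbered continuously across ALL tag groups

    def next_name(self, tag_group, hlct_name):
        # Each call consumes the next number, so even identical
        # group/HLCt combinations yield distinct names.
        self.counter += 1
        return f"{tag_group}_{hlct_name}_{self.counter}"
```

For example, a `TagNamer` would emit `WELD_WALL ONE_1` followed by `PICK_BAER_2`: the number keeps climbing across groups, which is what makes every name-number combination unique.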
8.3 Appendix C
Figures of the generated concepts: CONCEPT I, CONCEPT II, CONCEPT III, CONCEPT IV
Robot Environment with Moved Components
First Level Sub-Processes:
1. Assemble Gallerdurk without Baer and Stoed
2. Move to WELDBOT
3. Weld all except Baer and Stoed
4. Move to ASSEMBOT
5. Assemble Baer and Stoed
6. Move to WELDBOT
7. Weld Baer and Stoed
8. Move to ASSEMBOT
9. Unload Gallerdurk

Second Level Sub-Process of 1.:
1.1 Move FIXBOT to Tag1
1.2 Assemble Components at Tag1
1.3 Move FIXBOT to Tag2
1.4 Assemble Components at Tag2
1.5 Move FIXBOT to Tag3
1.6 Assemble Components at Tag3
1.7 Move FIXBOT to Tag4
1.8 Assemble Components at Tag4
Second Level Sub-Process of 3.:
3.1 Weld Components at Tag5
3.2 Move FIXBOT to Tag6
3.3 Weld Components at Tag6
3.4 Move FIXBOT to Tag7
3.5 Weld Components at Tag7
3.6 Move FIXBOT to Tag8
3.7 Weld Components at Tag8
3.8 Move FIXBOT to Tag9
3.9 Weld Components at Tag9
3.10 Move FIXBOT to Tag10
3.11 Weld Components at Tag10
3.12 Move FIXBOT to Tag11
3.13 Weld Components at Tag11
3.14 Move FIXBOT to Tag12
3.15 Weld Components at Tag12

Second Level Sub-Process of 7.:
7.1 Weld Components at Tag5
7.2 Move FIXBOT to Tag6
7.3 Weld Components at Tag6
7.4 Move FIXBOT to Tag7
7.5 Weld Components at Tag7
7.6 Move FIXBOT to Tag8
7.7 Weld Components at Tag8
7.8 Move FIXBOT to Tag9
7.9 Weld Components at Tag9
7.10 Move FIXBOT to Tag10
7.11 Weld Components at Tag10

8.4 Appendix D
Flowchart of the axis transformation of coordinates (left) and YPR angle acquisition (right)
(Flowchart nodes, coordinate transformation: GET REF TAG VALUES → GET POINT COORDS → TRANSLATIONAL TRANSFORMATION → ROTATIONAL TRANSFORMATION → PASTE NEW COORDS IN EXCEL. YPR angle acquisition: GET REF YPR VALUES → ROTATIONAL MATRIX 01; GET ORIENTATION YPR VALUES → ROTATIONAL MATRIX 12; ROTATIONAL MATRIX 02 → EQUATING THE COEFFICIENTS → IMPLEMENT YPR VALUES.)
Creation of the RM for FIXBOT
Creation of the RM for WELDBOT
(Flowchart node labels; the branch structure is not recoverable from the transcript: GET ROBOT TASK; DEFINE TAGS MOTION; CREATE ROBOT MOTION; CREATE PICK OPERATION; GET OPERATION TYPE FROM EXCEL; CREATE FIRST OPERATIONS; GET PART FROM EXCEL; ACTIVATE LAST TASK; CREATE PLACE OPERATION FOR ALL PARTS; SET REFERENCE TAG; DEFINE SEARCH KEYWORDS; SEARCH TAG LIST; GET PART & POINT; CREATE MOTION; CREATE LAST OPERATION; CREATE PLACE OPERATION; SET PART TYPE; CREATE REST POSITION. Decisions: START?, MAIN?, ALL PARTS?, ALL TASKS?, ACTIVATE B. & S.?, ALL PARTS/POINTS?, MOTION?, ALL OPS?, END?)
Combined flowchart for the whole process:
CAD MODEL GENERATION: SET MODEL SIZE → CONFIGURE SKELETON → GENERATE COMPONENTS
TAG GENERATION: CREATE PICK TAGS → MOVE WORKPIECE → CREATE PLACE & WELD TAGS
ROBOT MOTION GENERATION: CREATE FIXBOT MOTION → CREATE ASSEMBOT MOTION → CREATE WELDBOT MOTION
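The combined flowchart is, in effect, a fixed three-phase pipeline. The sequencing can be sketched as follows (a minimal illustration only: the thesis drives CATIA/DELMIA from EXCEL VBA, and every function name below is a placeholder standing in for the corresponding flowchart step; here each step merely records itself so the ordering can be checked):

```python
steps = []

def step(name):
    # Stand-in for one flowchart step; in the thesis each of these
    # would be a VBA macro driving CATIA/DELMIA.
    steps.append(name)

def generate_cell(length, width):
    """Run the three phases of the combined flowchart in order."""
    # Phase 1: CAD MODEL GENERATION
    step(f"SET MODEL SIZE {length}x{width}")
    step("CONFIGURE SKELETON")
    step("GENERATE COMPONENTS")
    # Phase 2: TAG GENERATION
    step("CREATE PICK TAGS")
    step("MOVE WORKPIECE")
    step("CREATE PLACE & WELD TAGS")
    # Phase 3: ROBOT MOTION GENERATION (needs the tags from phase 2)
    step("CREATE FIXBOT MOTION")
    step("CREATE ASSEMBOT MOTION")
    step("CREATE WELDBOT MOTION")
```

The ordering is the essential constraint: robot motions reference tags, and tags reference model geometry, so the phases cannot be reordered or interleaved.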