
On Homogenization of Coal in Longitudinal Blending Beds

Pradyumn Kumar Shukla
Karlsruhe Institute of Technology

[email protected]

Michael P. Cipold
J&C Bachmann GmbH, Bad Wildbad, Germany

[email protected]

ABSTRACT

Coal blending processes mainly use static and non-reactive blending methods like the well-known Chevron stacking. Although real-time quality measurement techniques such as online X-ray fluorescence measurements are available, the possibility of dynamically adapting the blending process to the quality data obtained with these techniques has not been explored. A dynamic adaptation helps to mix the coal from different mines in an optimal way and deliver a homogeneous product. The paper formulates homogenization of coal in longitudinal blending beds as a bi-objective problem of minimizing the variance of the cross-sectional quality and minimizing the height variance of the coal heap in the blending bed. We propose a cone based evolutionary algorithm to explore different trade-off regions of the Pareto front. A pronounced knee region on the Pareto front is found and is investigated in detail using a knee search algorithm. Many interesting problem insights are gained by examining the solutions found in the different regions. In addition, all the knee solutions outperform the traditional Chevron stacking method.

Categories and Subject Descriptors
I.6.m [Simulation and Modeling]: Miscellaneous; J.m [Computer Applications]: Miscellaneous

Keywords
Multi-objective algorithms, Knees, Simulation optimization

1. INTRODUCTION AND RELATED WORK

Bulk material processing describes the handling of raw bulk materials such as minerals, coals, or ores. These are mostly mined from natural sources and thus are usually inhomogeneous in terms of their quality parameters. Quality equalization to guarantee efficient processing can be achieved by using blending systems in the processing chain. Most systems today still implement static blending methods like the well-known Chevron stacking. There, the material is placed in the center of the stockpile while the stacker is driven along the blending bed at constant speed. As the stacker is moved back and forth along the full length of the stockpile, the material forms layers along the sides of the partial stockpile below. In cross section these layers form a chevron pattern.

Blending with Chevron stacking results in partially equalized quality output curves which, in many cases, still contain fluctuations exceeding the desired limits. Real-time quality measurement makes it possible to follow the current quality trend and to react dynamically by adapting the blending process.

This paper utilizes a particle simulation to reproduce the coal blending process virtually and to calculate the outcome of a candidate solution during optimization. The optimization is done by modifying the stacker traverse path only. This constitutes a method of minimal system change and thus of minimal financial and hardware effort.

Previous studies focus on analyzing and improving coal blending efficiency; the work by Cipold et al. [4] is, to the best of our knowledge, the only one that presented a simulation based optimization of the stacker path to achieve a homogeneous coal stockpile. Kumral [8] describes a method to optimize a mineral blending system design before its construction to get the best performance. Using genetic algorithms and a multiple regression model, they determine the best blending bed parameters, simulating the stockpile with a simplified cell model. Pavloudakis and Agioutantis [10] use a more complex stockpile simulation approach consisting of multiple layers lying on top of each other. They assume a constant material flow and calculate the expected quality at the material output. However, their work did not focus on optimizing the blending efficiency. Bond et al. [2] describe methods to improve the efficiency of blending beds by modifying the volume of a stockpile dynamically and modifying the stacking speed while measuring the input quality with an online analyzer. They only suggest a solution with appropriate software to control the stacker speed dynamically during stacking, placing material with recognized quality at specific locations in the blending bed, but do not extend this thought in detail.

The main contributions of this paper are as follows:

• A multi-objective model of the problem that includes a realistic formulation of the objectives and a constraint violation measure is presented. (Section 3)

• A cone based algorithm, which additionally uses a utility measure to produce a biased density of solutions in good trade-off regions, is developed. (Section 4)


• A detailed physics simulation and a fast simulation are used to find the optimal solutions to the multi-objective problem. (Section 5)

• Interesting insights about the problem and common principles in the obtained optimal solutions are investigated (innovization). (Section 6)

• The solutions obtained by our algorithm produce better homogenization results than Chevron stacking.

2. COAL PROCESSING AND BLENDING

Raw bulk materials such as minerals, coals, or ores are mostly mined from natural resources, and hence the product is usually inhomogeneous in terms of the quality parameters of interest. Typical quality parameters for coal are, for example, moisture or ash content (as non-burning components), sulfur content, or calorific value, among others [1].

In most bulk material processes a range of acceptable quality parameters can be defined for the process to work properly. The process is then adapted for the specified average quality. As the quality of the material varies over time, the process is not operated in the most efficient way, and the process equipment can even be damaged by too high a variation, even though the material meets the average quality setpoints. As an example, consider an autoclave for coal gasification. This unit should be run at a specific heat setpoint in order to operate most efficiently. Even though the average particle size is constant over the day, some minutes or even hours of smaller particles (balanced by bigger particles later in the day) can cause the oven to heat up above the target setpoint. This is critical for the machines used: it has to be detected, the material flow has to be reduced, and the process is disturbed. Even more critical situations occur if the calorific value is no longer balanced against the amount of ash in the coal. If the energy contained in the coal is not sufficient to keep the temperature above a certain limit, the fluid ash will solidify, which puts the autoclave out of service.

In summary, the absolute average quality value is not easily controllable, but such control is also not necessary as long as the process is adapted for the specified quality value; what is required is a stable quality, that is, a low quality variance.

In the last decade a whole range of instruments has been developed to allow online determination of quality parameters like elemental composition, size distribution and other parameters of interest. These instruments determine the parameters directly on the main stream [9] and allow the operator to create a regulated process which is adapted to the current parameter measurement. Their main advantages are that no samples need to be drawn and that the quality is determined in real time.

The measurement instrument employed to determine the quality parameters used for the data in this paper is an online X-ray fluorescence analyzer [9]. The instrument is located over or even on the main material stream, as can be seen in Figure 1. An X-ray beam is directed onto the material surface and the fluorescence radiation is observed by a silicon drift detector. The spectra are then analyzed and the elemental composition and other desired process parameters (like moisture) can be derived. The instrument updates the current quality values every couple of seconds and is therefore usable for real-time control.

Figure 1: An online X-ray fluorescence analyzer mounted on a sled gliding over coal on a conveyor.

Figure 2: Illustration of coal mining and processing.

The natural deposits of coal are typically inhomogeneous and therefore in most cases the material needs to be processed in a plant. Coal is mined at places where exploration results indicate a worthwhile material extraction and then it is normally directly transported to the respective processing plant after some minor handling, as illustrated in Figure 2.

Mining patterns limit the average quality and assure that it does not exceed given limitations. But as layers of material are mined, the quality varies strongly, and variation levels can last for hours or days. To minimize the negative effect which poor material would have on the process, a blending step is inserted before processing. The aim is to homogenize the raw material and minimize the quality variation.

This paper focuses on longitudinal blending beds as illustrated in Figure 3. The blending bed is an area allocated for building so-called stockpiles of raw material. The blending bed can have a typical length of up to 1000 m and a width of up to 50 m. The bed is located between two rail tracks on which huge stacking and reclaiming machines move along the bed. The stacking machine is fed with the input bulk stream using a conveyor belt. The reclaimer collects the bulk material on the stockpile and feeds it to a conveyor belt moving to the plant. The stacking machine has a long arm which reaches over the middle of the blending bed, where the material is deposited. The material is then stacked to stockpiles by layering it with so-called cone-shell or chevron stacking [2]. For instance, using chevron stacking, the material is layered in multiple levels. Depending on the speed of the stacking machine and the material flow, the levels become thicker or thinner.

Figure 3: Illustration of a longitudinal blending bed with railed stacking and reclaiming machines.

The bridge reclaiming machine ablates the material in diagonal layers. If the stockpile was stacked using the chevron stacking method, the diagonally ablated cross sections contain material of all levels. The blending result correlates mostly with the number of layers, and hence with the speed of the stacker. It is additionally influenced by the material quality curve, as illustrated in Section 6.

By default, the necessary number of layers for a good blending result is calculated statically and optimized mainly at design time together with the general blending bed parameters [8]. In some cases the number of layers is varied by experienced operating personnel, but it is not directly controlled in reaction to quality variation changes.

3. MODELING AS A MULTI-OBJECTIVE OPTIMIZATION PROBLEM

The described real-world system consists of multiple elements which need to be mapped into a system model first. The model consists of:

System Parameters (ϱ) are geometrical stockpile parameters, maximum stacker speed, reclaimer angle and material properties,

Quality Input (q) is the quality curve for each simulation run plotted against the total amount of material, and

Traverse Path (p) is the path along which the stacker is driving during the stockpile creation.

Given that the stacking system is unchangeable and that the quality input is fixed for each run, the optimization focuses on modifying the stacker path. For simplicity, material properties (like the shape distribution of the coal particles) and weather parameters (like humidity) are not considered.

3.1 Objectives

The traverse path of the stacker is optimized for ideal material blending. The first objective here is the weighted variance of the quality at the reclaimer; this is defined as

F1(q,p,%) =

n′n∑i=1

wi(qouti − qout)2

(n′ − 1)

n∑i=1

wi

, (1)

where n is the number of cross sections ablated; wi and qouti

are the amount of material and the average material qualityfor Section i; n′ is the number of non-empty cross sections;and qout is the total average quality.

The traverse path of the stacker has a direct influence on the shape of the stockpile. This leads to the other primary objective: to create a stockpile with a ridge of nearly constant height. The variance of the height is \sum_{i=1}^{n} (h_i - \bar{h})^2 / (n - 1), where h_i is the height of cross section i and \bar{h} is the average height. From the geometry of the stockpile, it can easily be shown that the height of the i-th cross section satisfies h_i \propto \sqrt{w_i}, and hence the second objective function is given by

F_2(q, p, \varrho) = \frac{1}{n - 1} \sum_{i=1}^{n} \left( \sqrt{w_i} - \frac{1}{n} \sum_{i=1}^{n} \sqrt{w_i} \right)^2.   (2)

For given values of q, p, and ϱ, the values of both the above objective functions are computed from the stockpile that is formed. Simulation is used to create the stockpile.
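For illustration, the following Python sketch computes both objectives from per-cross-section data produced by a simulation run. The function and argument names are ours, and the total average quality is taken here as the material-weighted mean, which the text does not spell out.

import numpy as np

def objectives(weights, qualities):
    """F1: weighted variance of the reclaimed quality (Eq. 1); F2: variance of sqrt(w_i),
    a proxy for the height variance (Eq. 2).

    weights[i]   -- amount of material w_i in cross section i
    qualities[i] -- average material quality q_i^out in cross section i
    """
    w = np.asarray(weights, dtype=float)
    q = np.asarray(qualities, dtype=float)
    n = len(w)                              # number of cross sections ablated
    n_nonempty = int(np.count_nonzero(w))   # n', number of non-empty cross sections
    q_mean = np.average(q, weights=w)       # total average quality (assumed material-weighted)
    f1 = n_nonempty * np.sum(w * (q - q_mean) ** 2) / ((n_nonempty - 1) * np.sum(w))
    h = np.sqrt(w)                          # h_i is proportional to sqrt(w_i)
    f2 = np.sum((h - h.mean()) ** 2) / (n - 1)
    return f1, f2

print(objectives([10.0, 12.0, 9.0], [30.0, 31.5, 29.0]))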

3.2 Constraints

The constraints in this optimization problem restrict the movement of the stacker on the longitudinal blending bed. These constraints depend on the traverse path and the blending bed parameters (p and ϱ); they are defined next.

Position Constraint: The stacker must traverse within the stockpile region.

Speed Constraint: The given speed range must not be violated.

As the stacker moves along the rails, the Position Constraint is a hard constraint. The Speed Constraint, on the other hand, is a soft constraint and could be included as another objective (like driving only as fast as necessary to meet a defined threshold). For simplicity, we regard the Speed Constraint as a hard constraint.

3.3 Solution Representation

The solution representation is based on position sampling and utilizes an array of real values from the interval [0, 1] which correspond to the relative position in the valid traverse range of the stacking machine while creating the stockpile [4]. Hence, a solution is defined as

\delta = (\delta_1, \ldots, \delta_l), \quad \delta_i \in [0, 1], \; l > 1.   (3)


The time t^δ_slot for driving linearly between two positions with constant velocity is calculated as described in Equation 4, where t_total is the time to fully fill the stockpile:

t^\delta_{slot} = \frac{t_{total}}{l - 1}.   (4)

The position representation automatically satisfies the Position Constraint. However, the Speed Constraint can be violated if the required speed exceeds the maximum speed. This is the case if |δ_{i+1} − δ_i| / v_max > t^δ_slot. The probability of obtaining infeasible solutions increases the smaller t^δ_slot gets. As the constraints do not depend on the quality curve q, we can repair a solution using standard techniques, without having to run the simulator. To repair an infeasible solution, we use the direct correction and iterative balancing techniques from [4].
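A minimal sketch of this feasibility check; the helper name is ours, and v_max is assumed to be expressed in the same relative length units as the positions δ.

def speed_violations(delta, t_total, v_max):
    """Indices of path segments that violate the Speed Constraint.

    delta   -- positions delta_1..delta_l in [0, 1] (relative traverse range)
    t_total -- time to fully fill the stockpile
    v_max   -- maximum stacker speed in relative length units per time unit
    """
    l = len(delta)
    t_slot = t_total / (l - 1)                      # Eq. (4): equal time per segment
    return [i for i in range(l - 1)
            if abs(delta[i + 1] - delta[i]) / v_max > t_slot]

print(speed_violations([0.0, 1.0, 0.0, 1.0], t_total=30.0, v_max=0.09))   # -> [0, 1, 2]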

After repairing, the feasible solution produced by the evolutionary algorithm is used to run the simulation and calculate the objectives. The original infeasible solution can either be replaced by the repaired one in the population (Lamarckian technique) or remain in the population (Baldwinian technique). We investigate both Baldwinian and Lamarckian techniques in this paper. In addition, we define a constraint measure for the infeasible solutions. The measure puts a number on how far away a solution is from being feasible (degree of constraint violation). This way the solutions are sorted from feasible to infeasible to most infeasible. Feasible individuals are always preferred over infeasible individuals [5], while less infeasible individuals are preferred over more infeasible ones. The constraint measure is calculated by summing up the distance between each original position δ_i and the repaired position r(δ)_i as follows:

c(\delta) = \sum_{i=1}^{l} |\delta_i - r(\delta)_i|

We use both direct and iterative repairing methods to create feasible solutions. Repaired solutions are not written back to the population.
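A sketch of the constraint measure; the repair argument stands in for the direct correction or iterative balancing of [4], whose details are not reproduced here, and the clamping repair below is purely illustrative.

def constraint_measure(delta, repair):
    """Degree of constraint violation c(delta): summed per-position distance between
    the original solution and its repaired counterpart (zero for feasible solutions)."""
    repaired = repair(delta)
    return sum(abs(d - r) for d, r in zip(delta, repaired))

# Hypothetical repair that clamps each step to a reachable range (illustrative only;
# the actual direct correction in [4] may differ):
def clamp_repair(delta, max_step=0.3):
    out = [delta[0]]
    for d in delta[1:]:
        step = max(-max_step, min(max_step, d - out[-1]))
        out.append(out[-1] + step)
    return out

print(constraint_measure([0.0, 1.0, 0.0], clamp_repair))   # -> 0.7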

4. CONE UTILITY BASED GENETIC ALGORITHM

The main difficulty in a multi-objective optimization problem is to rank the solutions. Usually, the Pareto ordering is the natural relation that is used to rank the solutions. However, as the Pareto ordering is a partial order, an additional crowding or clustering measure is commonly employed to rank the Pareto non-dominated solutions. A commonly used extension of the Pareto ordering is a cone ordering, as a cone ordering helps to focus on good trade-off regions of the Pareto front [5]. Moreover, using a cone based approach can also help to focus on different regions of the Pareto front.

A cone based ordering is also a partial order and hence there can be many solutions that are equally good. To differentiate among these solutions, we use an additional utility based measure that assigns a trade-off based goodness value to each solution. This measure, together with the usual diversity measure (assumed to be the crowding distance), can be used to bias the solutions in the trade-off region. Although there exist many biasing techniques, we prefer a trade-off utility based one, as trade-offs have a meaning that is easily understood by a designer (rather than other mathematical biasing measures [3]).


Figure 4: Illustration of CUEA. A cone partial order is used together with a trade-off based utility measure to sort the population.

In this paper, we propose and investigate a cone utility based genetic algorithm (Cuea) to solve the multi-objective problem described in Section 3. This algorithm is based on NSGA-II [5] and PKEA [11] and uses a cone based approach to focus on different regions of the Pareto front.

We assume a population size of N. At any generation t, let Pt and Qt denote the parent and the offspring populations, respectively (in our simulation we assume |Qt| = 1, a steady-state version). Let Rt be the combined population, i.e., Rt = Pt ∪ Qt. Cuea uses three additional parameters. The first of them, T > 0, is a lower bound on the trade-off utility value. A second parameter, ϑ ∈ [0, 1], is employed to ensure diversity in the population. Finally, an ordering cone C needs to be specified. In order to rank the solutions in the set Rt (and create Pt+1), we first apply a non-dominated sorting based on the cone C to Rt and identify different fronts Fi, i = 1, 2, . . .
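A minimal sketch of this cone-based ranking for minimization; the inequality-matrix representation of the cone and the helper names are our own, not the paper's notation.

import numpy as np

def cone_dominates(u, v, A):
    """True if objective vector u dominates v w.r.t. the cone {d : A @ d >= 0}.

    With A = I (the Pareto cone) this reduces to the usual Pareto dominance
    for minimization (u <= v componentwise and u != v).
    """
    d = np.asarray(v, dtype=float) - np.asarray(u, dtype=float)
    return bool(np.all(A @ d >= 0) and np.any(d != 0))

def cone_fronts(objectives, A):
    """Sort a list of objective vectors into fronts F1, F2, ... by repeatedly
    peeling off the cone-non-dominated solutions; index lists are returned."""
    remaining = list(range(len(objectives)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(cone_dominates(objectives[j], objectives[i], A)
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Example with the Pareto cone: (3, 3) is dominated by (2, 2).
print(cone_fronts([(1.0, 4.0), (2.0, 2.0), (3.0, 3.0)], A=np.eye(2)))   # -> [[0, 1], [2]]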

For an arbitrary solution u ∈ Rt, such that u ∈ Fk for some k ∈ N (Fk is the k-th front in which u lies), let µ(u) > 0 be defined by

\mu(u) = \max\left\{ \max_{v \in F_k} \; \max_{\{i : u_i > v_i\}} \; \min_{\{j : u_j < v_j\}} \frac{u_i - v_i}{v_j - u_j}, \; T \right\}.   (5)

At the end, every solution u has a tuple (k, µ(u)) and thisis considered in a lexicographic way. The T parameter inEquation 5 serves as an upper trade-off bound, above whichsolutions are not considered relevant to the designer.
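A direct transcription of Equation 5 as a sketch; pairs against which u has no losses or no gains are skipped, which the equation leaves implicit.

def tradeoff_utility(u, front, T):
    """Trade-off utility mu(u) of Eq. (5), computed against the other members of
    u's front and clamped from below by the bound T (assumes minimization)."""
    mu = T
    for v in front:
        losses = [ui - vi for ui, vi in zip(u, v) if ui > vi]   # objectives where u is worse than v
        gains = [vi - ui for ui, vi in zip(u, v) if ui < vi]    # objectives where u is better than v
        if losses and gains:
            # max_i min_j (u_i - v_i)/(v_j - u_j) collapses to max(losses) / max(gains)
            mu = max(mu, max(losses) / max(gains))
    return mu

print(tradeoff_utility((2.0, 3.0), [(2.0, 3.0), (1.0, 5.0), (4.0, 2.5)], T=0.2))   # -> 0.5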

Next, we create the new parent population Pt+1. For this, we first set Pt+1 = ∅ and a counter i = 1. As long as |Pt+1| + |Fi| < N, we perform Pt+1 = Pt+1 ∪ Fi and i = i + 1. Now, we need to include N − |Pt+1| more solutions into Pt+1 from the last front Fℓ. For this, we sort the solutions of Fℓ in increasing order of their µ values and take the first Ñ := ⌈ϑ · (N − |Pt+1|)⌉ solutions from the sorted list into Pt+1. The N − |Pt+1| − Ñ remaining solutions are taken considering the crowding distance values (in Fℓ).
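The following sketch mirrors this environmental selection step, assuming mu and crowding are precomputed callables and that solutions are plain tuples (both assumptions are ours).

import math

def fill_parent_population(fronts, N, theta, mu, crowding):
    """Fill the next parent population front by front; split the last, partially
    fitting front between the trade-off utility and the crowding distance.

    fronts   -- list of fronts (lists of solutions), already sorted by the cone ordering
    N        -- population size
    theta    -- fraction of the remaining slots filled by utility, theta in [0, 1]
    mu       -- callable: solution -> trade-off utility value (Eq. 5)
    crowding -- callable: solution -> crowding distance within its front
    """
    parents = []
    i = 0
    while i < len(fronts) and len(parents) + len(fronts[i]) < N:
        parents.extend(fronts[i])
        i += 1
    if i < len(fronts) and len(parents) < N:
        remaining = N - len(parents)
        n_util = math.ceil(theta * remaining)
        by_utility = sorted(fronts[i], key=mu)                 # increasing mu preferred
        chosen = by_utility[:n_util]
        rest = sorted((s for s in fronts[i] if s not in chosen),
                      key=crowding, reverse=True)              # larger crowding preferred, as in NSGA-II
        parents.extend(chosen + rest[:remaining - n_util])
    return parents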

Figure 4 shows the schematic of creating the new parent population. The sorting is based on the given cone order and the utility values in a lexicographic way. We see that the last front (based on the cone ordering) cannot be fully accommodated, and here both the utility and the crowding distance values are used. The next offspring population is created from Pt+1 using a lexicographic (rank, crowding distance) binary tournament selection, crossover and mutation operators.


Figure 5: A quality curve from a coal processing plant in South Africa showing the variation in ash content.

5. PHYSICS SIMULATION AND EXPERIMENTAL SETUP

In contrast to most approaches for modeling the stacking and reclaiming of coal [8, 10], we proposed a physics based detailed simulation system that models the real world as closely as possible by simulating the dropping, falling, rolling, and stacking of bulk materials [4]. This physics simulation, however, was regarded as too slow to use for optimization and thus it was replaced by a simplified fast simulation system. The physics simulation was only used to verify the optimization results generated by the simplified simulation.

For a deeper problem understanding we accelerated the detailed simulation process and ran it on modern machines (Intel Core i5-3570K, single core speed up to 3.80 GHz). The acceleration was mainly achieved by reducing the simulation complexity, simulating only 60,000 particles for a 90,000 m³ stockpile. One simulation run takes about 25 minutes to complete, which allows 100 function evaluations in 42 hours with a population size of 10.

Using the detailed simulation, the generated stockpiles allow a deeper analysis of the stockpile composition. The results are closer to reality and, in contrast to the fast simulation system, they are free of integer artifacts. Furthermore, more realistic scenarios can be generated with a finer simulation granularity: increasing the number of simulated particles and allowing variable particle sizes and their effects on the stockpile building process is an outstanding feature of this simulation system. The system itself and its integration into the standard optimization problem format allow for the optimization of a highly complex problem and provide an ideal example of expensive function evaluations.

Figure 5 shows one of the quality curves used in this study. The quality data was collected in a coal processing plant in South Africa. At the measuring point, coal from up to seven different mines is fed to the stockpile. The actual data used in the quality input curves for the optimization was the ash content of the coal.

The complex optimization problem is influenced by a wide range of parameters. We use three different quality input curves (denoted by Q0, Q1, and Q2) as problem instances. Except for the detailed physics simulation, each algorithmic combination is run 31 times in order to be able to give a stable statement about its performance. The number of variables is l = 30 and the population size is set to 100. The maximum stacker speed is set equal to the speed necessary to place 20 layers with Chevron stacking. The stockpile dimensions are 300 by 50 meters, the simulation detail is set to 1.5 m³ per particle, and we set the maximum number of function evaluations to 20,000 (except for the detailed simulation). We use the simulated binary crossover (probability 0.9; distribution index 15.0), the polynomial mutation operator (probability 3.3/variable count, with variable count = 30; distribution index 20.0, after a limited parametric study) and binary tournament selection. The threshold parameter ϑ and the trade-off parameter T were taken as 0.9 and 1.5, respectively.

We used three cones C1 := {d ∈ R² : d1 ≥ 0, d2 ≥ 0}, C2 := {d ∈ R² : d1 ≥ 0, d1 + d2 ≥ 0}, and C3 := {d ∈ R² : d2 ≥ 0, d1 + d2 ≥ 0}. These cones help to focus on the middle (knee) region, the left part (F1 minima), and the right part (F2 minima) of the Pareto front, respectively [5].
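For concreteness, the same cones written in the inequality-matrix form used in the sketch of Section 4 (this representation is our own convenience, not the paper's notation).

import numpy as np

# A vector d lies in the cone iff A @ d >= 0 componentwise.
C1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])   # d1 >= 0, d2 >= 0       -> focuses on the knee region
C2 = np.array([[1.0, 0.0],
               [1.0, 1.0]])   # d1 >= 0, d1 + d2 >= 0  -> focuses on the F1-minima side
C3 = np.array([[0.0, 1.0],
               [1.0, 1.0]])   # d2 >= 0, d1 + d2 >= 0  -> focuses on the F2-minima side

d = np.array([0.3, -0.1])                 # an example difference vector d = v - u
print(bool(np.all(C2 @ d >= 0)))          # -> True: d1 >= 0 and d1 + d2 >= 0 hold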

To include problem knowledge in the algorithm, the steady state version of the Cuea algorithm was adapted to use a partially initialized population consisting of 30 solutions that are initialized with the Chevron stacking method with different speeds.

The initial population is generated as shown in Algorithm 1 by looping through m different speed levels, where m is the number of Chevron initialized solutions in the initial population. For all l position variables δ_ij of the i-th solution, a forward-backward motion is generated with the inner while loop, where the modulo in the inner if-statement decides on the direction of movement.

Algorithm 1: Generation of Chevron stacking solutions

for i ← 1 to m do
    j ← 0
    while j < l do
        k ← 0
        while (k < i) ∧ (j < l) do
            if ⌊j / i⌋ ≡ 0 (mod 2) then
                δ_ij ← k / i
            else
                δ_ij ← 1 − k / i
            end
            k ← k + 1
            j ← j + 1
        end
    end
end
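An illustrative Python re-implementation of Algorithm 1; the function name and the list-based representation are ours.

def chevron_population(m, l):
    """Generate m Chevron-style stacker paths, each with l position variables in [0, 1].

    Solution i sweeps back and forth in steps of size 1/i; the parity of floor(j/i)
    decides whether the current sweep runs forward or backward (cf. Algorithm 1).
    """
    population = []
    for i in range(1, m + 1):
        delta = []
        j = 0
        while j < l:
            k = 0
            while k < i and j < l:
                if (j // i) % 2 == 0:
                    delta.append(k / i)        # forward sweep
                else:
                    delta.append(1 - k / i)    # backward sweep
                k += 1
                j += 1
        population.append(delta)
    return population

# Example: 3 Chevron solutions with 12 position variables each.
print(chevron_population(3, 12))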

For problem analysis and a deeper understanding, three basic experimental setups were chosen with varying subordinate parameters:

• To classify the proposed knee finding algorithm and its variations, it was run on the different problem instances and compared to NSGA-II, ssNSGA-II and ssNSGA-II with the same partially initialized population as used for Cuea.

• To evaluate the proposed algorithms combined with various repairing strategies, the algorithm was parameterized to do direct and iterative repairing, either applied in the Baldwinian or Lamarckian way or used for constraint evaluation.

• To get deeper insights into the stockpile composition and problem behavior using a more realistic simulation, the detailed simulation was used with 100 function evaluations and a population size of 10 individuals.

For the coal homogenization multi-objective problem above there is no way to calculate the exact Pareto front. To get a better measure for partially calculated fronts, especially for the cone versions of the Cuea algorithm, another measure has to be used. The distance to a non-dominated set is a performance measure in which a non-dominated set of solutions is defined as a reference set. Each solution set is measured against this reference set by first calculating the minimum distance to the reference set for each of its points and then using the root mean square of these values as a distance measure. The reference set is obtained by grouping all ever-calculated fronts and computing the non-dominated set of these solutions, which can then be used to compare the results. Equation 6 describes the set distance, with r being the non-dominated reference set, f the objective vectors of the currently evaluated solution set, |r| and |f| the associated set sizes, and d(x, y) the Euclidean distance with equally scaled objective values:

d_{set}(\mathbf{r}, \mathbf{f}) = \sqrt{ \frac{1}{|\mathbf{f}|} \cdot \sum_{f_i \in \mathbf{f}} \Big( \min_{r_i \in \mathbf{r}} d(f_i, r_i) \Big)^2 }.   (6)

A solution set is regarded as better than another one if theset distance (computed using Equation 6) is smaller.
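A minimal implementation of Equation 6, assuming the objective vectors are already scaled to comparable ranges as the text requires.

import math

def set_distance(reference, front):
    """Eq. (6): root mean square of each solution's Euclidean distance to the closest
    point of the non-dominated reference set."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    total = sum(min(dist(f, r) for r in reference) ** 2 for f in front)
    return math.sqrt(total / len(front))

print(set_distance([(0.0, 1.0), (1.0, 0.0)], [(0.5, 0.5)]))   # -> 0.7071...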

Apart from the above distance metric, we use the hypervolume to evaluate our algorithms. The hypervolume measures the volume of the objective space dominated by a set of solutions with respect to a reference point, and it requires no knowledge of the exact Pareto front (which is unknown for our real-world problem).
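A minimal 2D sketch of the hypervolume indicator for two minimized objectives; the reference point (0.4, 0.2) used in the example is the one reported in Section 6, and the sweep assumes all points dominate the reference point.

def hypervolume_2d(front, ref):
    """Area dominated by a 2D front with respect to a reference point (minimization)."""
    volume, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(front):          # sweep by increasing F1; dominated points are skipped
        if f2 < prev_f2:                  # this point contributes a new slab
            volume += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return volume

print(hypervolume_2d([(0.1, 0.15), (0.2, 0.05)], ref=(0.4, 0.2)))   # -> 0.035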


Figure 6: Illustration of an optimization run using the detailed simulation. The initial points and all intermediate points are shown. The non-dominated set computed using the fast simulator is also shown.

6. ANALYSIS

Using the detailed physics simulation and the steady state version of the Cuea algorithm focusing on the knee region, together with a partially initialized population, the system was able to achieve results outperforming Chevron stacking and forming a knee region in the objective function scatter graph. The trade-off between the height constancy and the quality stability in the output stream is clearly visible, as is the knee region (Figure 7). For better orientation, the best results ever achieved in our optimization runs with the simulation system are also shown in the graph.

Figures 6 and 7 show a full-scale and a zoomed overview of the intermediate and final results achieved in this optimization run. The starting advantage of Chevron stacking over random solutions is clearly visible. Though the optimization was run for only 100 function evaluations, the improvement of the starting population can be seen in the distribution of all individuals. Newly generated individuals are located closer to the final knee solutions in most cases.

Problem knowledge is beneficial for genetic algorithms in general, and we see that most of the solutions in the customized initial population dominate the initial random solutions (shown by circles in Figure 6). The final solutions forming the knee region are mostly static Chevron stacking solutions with small modifications which improve the quality variance or the height variance.

The non-dominated front of this bi-objective problem is apparently a convex curve (as can be seen in Figure 7). A cone based analysis of such problems is a natural approach to find good trade-off solutions [14]; this also justifies the use of Cuea on this real-world problem.


Figure 7: Enlarged illustration of the final solution sets obtained using detailed and fast simulation. One can clearly observe a pronounced knee.

Figure 8 compares the quality variance of a random solution from the knee region of the objective space (one with an acceptable height variance) to the quality variance of solutions generated with static Chevron stacking at different speeds. It is clearly visible that Chevron stacking is always outperformed, even if it uses higher speeds than those given as freedom to the optimization system. Moreover, increasing the speed does not always improve the homogenization result. This graph shows the clear advantage of an optimized solution compared with static stacking.


Figure 8: Illustration of a random knee solution (dotted line) in comparison to static Chevron stacking with various speeds. The quality variance in log scale (y-axis) shows a clear outperformance by the optimized solution.

Figure 9 shows a three-dimensional plot of the objective values of all solutions in the fronts which were regarded for the generation of the non-dominated set in Equation 6. The z-axis of the plot indicates the average speed of each solution as a fraction of the maximum speed available. Most solutions are concentrated in a knee region, where a wide spread of speeds can be found, mainly from 0.3 to 0.7. Solutions with the lowest quality variance but high height variance tend to have the highest speeds, and vice versa. An important design implication of this is the following: to achieve a solution which is also good in terms of height variance, the speed needs to be reduced from the maximum. Such problem insights from optimized solutions are valuable (innovization); this is only possible if the problem is modeled and solved as an optimization problem, rather than adhering to a standard stacking procedure. The results from Figure 9 show not only the intuitive benefits of Chevron stacking with high speeds, but also that there is a wide range of strategic options to get results in the knee region.

Interestingly, the set of points in Figure 9 forms a smooth manifold (an analytical representation of this can be computed). The reason for this is currently being investigated.

In order to compare Cuea with other standard algorithms like NSGA-II and the steady state version of NSGA-II, we executed 31 runs for all combinations of problem instances and algorithms. The results, presented in Table 1, show the advantage of Cuea over standard algorithms (algorithms using a steady state version are marked with an ss prefix; algorithms using a partially Chevron initialized population are marked with an ini suffix). Cuea outperforms NSGA-II, in all extensions and all combinations of extensions, in terms of hypervolume (computed using the reference point (0.4, 0.2)) and set distance, based on median values. The steady state algorithms show slightly better performance. Although steady state algorithms require more computation (like more sorting operations), the function evaluations (at least when using the detailed simulation) are far more expensive and are the most time consuming element. Hence, these steady state algorithms are a useful choice for such optimization problems.


Figure 9: A 3D plot of the two objectives versus the average speed of the non-dominated solutions. Most of the knee solutions are concentrated in a central speed range. In addition, higher speeds lead to lower quality variance (F1) but worse height variance (F2).

Computations have shown that the standard deviations of all the algorithms are very small (stable performance) and lie in a similar range. As the standard deviations do not offer any further insights, they are not presented in Table 1.

Hypervolume measure (Quality Input Curve)
Algorithm        Q0      Q1      Q2
NSGA-II          0.048   0.045   0.036
ssNSGA-II        0.053   0.053   0.046
ssNSGA-II-ini    0.057   0.061   0.047
CUEA             0.060   0.054   0.051
ssCUEA           0.060   0.059   0.056
ssCUEA-ini       0.062   0.063   0.053

Set distance measure (Quality Input Curve)
Algorithm        Q0      Q1      Q2
NSGA-II          0.050   0.038   0.045
ssNSGA-II        0.045   0.030   0.043
ssNSGA-II-ini    0.027   0.019   0.034
CUEA             0.024   0.032   0.026
ssCUEA           0.023   0.020   0.017
ssCUEA-ini       0.022   0.012   0.024

Table 1: Median hypervolume (top) and distance to the non-dominated set (bottom) values based on 31 runs of the algorithms. Higher hypervolume and lower set distance values are better.

In direct comparison, the results for the repair mechanisms (Table 2) show the advantage of the simplest repair mechanism, the direct Baldwinian repairing: it outperforms all other mechanisms by far. The iterative repairing usually performs worse than the direct repairing, and the same is true for the Lamarckian repairing in comparison with the Baldwinian. Although the results for the constraint violation measure are the worst in direct comparison with the other mechanisms, the difference is small. Furthermore, the results are encouraging, as deeper investigations have shown that the population quickly converges towards feasible solutions, with a high percentage of the newly generated solutions being feasible and beneficial for the population.


Hypervolume measure (Quality Input Curve)
Mechanism               Q0      Q1      Q2
Direct Baldwinian       0.062   0.063   0.053
Iterative Baldwinian    0.059   0.060   0.051
Direct Lamarckian       0.056   0.062   0.051
Iterative Lamarckian    0.055   0.057   0.048
Direct Constraint       0.054   0.046   0.048
Iterative Constraint    0.054   0.049   0.045

Set distance measure (Quality Input Curve)
Mechanism               Q0      Q1      Q2
Direct Baldwinian       0.022   0.012   0.024
Iterative Baldwinian    0.030   0.022   0.027
Direct Lamarckian       0.034   0.012   0.036
Iterative Lamarckian    0.040   0.027   0.037
Direct Constraint       0.037   0.036   0.029
Iterative Constraint    0.032   0.038   0.033

Table 2: Median hypervolume (top) and distance to the non-dominated set (bottom) values based on 31 runs using different repair mechanisms.

7. CONCLUSIONS

The composition of the non-dominated set shows the benefits of using cone-based algorithms, where cones can be used to guide the search towards knee and extreme regions.

A new knee focusing algorithm with a biased density of solutions in good trade-off regions was proposed. We investigated various repair mechanisms and proposed a constraint violation measure. Systematic tests showed that the direct Baldwinian repairing is the most beneficial repairing method. The utilization of a distance measure to a non-dominated reference set was demonstrated; it is a more advantageous method for evaluating solution sets that cover only parts of the Pareto front, such as sets focused on a single objective.

The generation of a non-dominated set and the contributing solutions allowed insights about the benefits of the cone-based approach and about the speed distribution and its relation to the two objectives. The detailed simulation system was adapted to be usable from the optimization system, generating results in reasonable time and allowing the results produced by the optimization system to be of a higher degree of realism. The results generated by the optimization system outperform static stacking. When the starting population is initialized with Chevron strategies, these can be quickly improved by the algorithm to generate much better results.

We are currently exploring the use of ParEGO [7] on our coal homogenization problem, as ParEGO is designed for solving expensive multi-objective optimization problems. We are additionally employing the algorithmic framework of [12, 13] to search for robust trade-off solutions for this problem. The use of variable cone orderings [6] to simultaneously focus on different regions of the Pareto front also seems relevant.

8. ADDITIONAL AUTHORS

Additional authors: Claus Bachmann (J&C Bachmann GmbH, email: [email protected]) and Hartmut Schmeck (Karlsruhe Institute of Technology, email: [email protected]).

9. REFERENCES

[1] J. Alderman. Coal Blending and Optimization, pages 52–62. Coal Age, 2012.

[2] J. Bond, R. Coursaux, and R. Worthington. Blending systems and control technologies for cement raw materials. IEEE Industry Applications Magazine, 6:49–59, 2000.

[3] J. Branke and K. Deb. Integrating user preferences into evolutionary multi-objective optimization. In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, volume 167 of Studies in Fuzziness and Soft Computing, pages 461–477. Springer Berlin Heidelberg, 2005.

[4] M. Cipold, P. Shukla, C. Bachmann, K. Bao, and H. Schmeck. An evolutionary optimization approach for bulk material blending systems. In C. Coello et al., editors, PPSN XII, volume 7491 of Lecture Notes in Computer Science, pages 478–488. Springer, 2012.

[5] K. Deb. Multi-objective optimization using evolutionary algorithms. Wiley, 2001.

[6] C. Hirsch, P. Shukla, and H. Schmeck. Variable preference modeling using multi-objective evolutionary algorithms. In R. Takahashi, K. Deb, E. Wanner, and S. Greco, editors, Evolutionary Multi-Criterion Optimization, volume 6576 of Lecture Notes in Computer Science, pages 91–105. Springer, 2011.

[7] J. Knowles. ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Transactions on Evolutionary Computation, 10(1):50–66, 2006.

[8] M. Kumral. Bed blending design incorporating multiple regression modelling and genetic algorithms. International Journal of Surface Mining, Reclamation and Environment, 17:98–112, 2003.

[9] M. Laurila and C. Bachmann. X-ray fluorescence measuring system and methods for trace elements. US Patent US 2004/0240606, 2004.

[10] F. Pavloudakis and Z. Agioutantis. Simulation of bulk solids blending in longitudinal stockpiles. Journal of the South African Institute of Mining and Metallurgy, 106:229–237, 2006.

[11] P. Shukla, M. Braun, and H. Schmeck. Theory and algorithms for finding knees. In R. Purshouse, P. Fleming, C. Fonseca, S. Greco, and J. Shaw, editors, Evolutionary Multi-Criterion Optimization, volume 7811 of Lecture Notes in Computer Science, pages 156–170. Springer, 2013.

[12] P. Shukla, C. Hirsch, and H. Schmeck. A framework for incorporating trade-off information using multi-objective evolutionary algorithms. In R. Schaefer, C. Cotta, J. Kołodziej, and G. Rudolph, editors, PPSN XI, volume 6239 of Lecture Notes in Computer Science, pages 131–140. Springer, 2010.

[13] P. Shukla, C. Hirsch, and H. Schmeck. Towards a deeper understanding of trade-offs using multi-objective evolutionary algorithms. In C. Di Chio et al., editors, Applications of Evolutionary Computation, volume 7248 of Lecture Notes in Computer Science, pages 396–405. Springer Berlin Heidelberg, 2012.

[14] M. Wiecek. Advances in cone-based preference modeling for decision making with multiple criteria. Decision Making in Manufacturing and Services, 1(2), 2007.
