The Role of Existing Measures in Developing and Implementing Performance Measurement Systems

Marc Wouters and Mark Sportel (2005), "The role of existing measures in developing and implementing performance measurement systems", International Journal of Operations & Production Management, Vol. 25 No. 11, pp. 1062-1082. DOI: 10.1108/01443570510626899



Marc Wouters
School of Business, Public Administration and Technology, University of Twente, Twente, The Netherlands

Mark Sportel
Accenture Technology Solutions, Twente, The Netherlands

    Abstract

Purpose – The purpose of this study is to investigate the role of existing, local performance measures in the process of developing and implementing an integrated performance measurement system. Performance measurement has received much attention since the 1980s, based on the notion that performance measurement systems should be adapted to modern manufacturing systems. However, relatively few empirical studies have investigated implementation processes of such systems.

Design/methodology/approach – The paper describes a case study of the development of a performance measurement system in a medium-sized company.

Findings – It was found that the process was strongly guided by the need to identify existing reports and metrics at different levels within the organization, which informed the development and implementation of the new performance measurement system. This is a more significant role than has usually been proposed in the literature, namely being one side of the gap between existing measures and an ideal system that has first been developed following a kind of greenfield approach.

Research limitations/implications – Future research could use other longitudinal case studies to obtain more insights into development and implementation processes, and also focus on information systems in these processes.

Originality/value – The value of this paper lies in highlighting the interplay between organizational experiences that are embedded in informal, local performance reports and new performance measurement initiatives that are initiated from a higher management level.

Keywords – Performance measurement (quality), Process management

Paper type – Case study

Introduction

Performance measurement in operations has been receiving a lot of attention since the 1980s. Companies are at various stages of implementing and refining their performance measurement systems, and they are finding solutions for many practical and conceptual challenges. From a research perspective, it is interesting and relevant to understand how such implementation processes are actually developing, and to document and explain diversity in implementation processes.


The authors would like to thank the case-study company for the research opportunity and Leonard Fortuin for his comments on an earlier version of this paper.



Numerous papers in the literature have paid attention to the development of performance measurement systems. One of the major challenges that has been discussed is defining a consistent set of measures that are clearly linked to the operational strategy of the organization (Kaplan and Norton, 1992, 1993, 1996, 2000; Nanni et al., 1992; Neely et al., 1995; Bourne et al., 2000). A performance measurement system aims to support the implementation and monitoring of strategic initiatives. The definition of performance measures and the setting of targets for these measures are concrete formulations of the firm's strategic choices. First steps in the development process are to clearly define strategic objectives and operations' contribution to achieving these strategic objectives. From such analyses, the organization's global performance measures and functional measures are derived (Neely et al., 1995, 1997). Key to this process is the assurance of a link between strategic objectives and performance criteria used at each level. The emphasis on aspects such as overview, transparency, consistency, drill-down ability, and relationships to strategy indicates that the literature has mainly focused on higher-management information needs.

Managers often already use performance measures, without these being part of integrated performance measurement systems. McKinnon and Bruns (1992) found that managers get information from observations, talking to people (individually or in group meetings) and from performance reports. Reports are often informal, distributed frequently, developed locally, and contain a mix of local and centralized data. Existing performance measures are discussed in the performance measurement systems literature, and typically they come into view after determining the ideal performance measurement system, when the gap between that ideal and the existing performance measures is identified. Is this the only role of existing measures, or is there a more fundamental role of existing measures in the process of developing and implementing performance measurement systems?

Not much research has been done to study the interplay of existing local reports and new initiatives for the development and implementation of performance measurement systems. Lohman et al. (2004) described a case study of a large organization. One of the major challenges for developing and implementing a performance measurement system for high-level supply chain managers was to identify existing reports and ongoing initiatives in performance measurement, both within and outside the operations function. And this was not only the first step; the whole process was characterized by exchanging information and experiences, and by coordinating what was there and what was being changed. The overriding importance of identification and coordination was different from the typical processes described in the literature, which emphasize designing and updating of performance measures and reports.

While the current literature seems to identify many real-world challenges and useful approaches to developing and implementing performance measurement systems, more empirical research is needed to describe and understand the variety of implementation processes that organizations follow. This paper contributes to the literature by giving further empirical evidence on implementation processes. The objectives of the present study are to investigate the impact of existing informal performance measures on the development process and the content of formally initiated, integrated performance measurement systems. We report on a case study of the development and implementation of performance measurement in a medium-sized company.



The remainder of this paper is structured as follows. First we will review some of the literature on performance measurement and motivate our research objectives. Then we will provide more details about our case-study method. The section with empirical findings contains subsections on the research site, the performance measurement challenges that the case-study company was facing, and findings with respect to the content and the development process of the performance measurement system (both research objectives). A discussion and conclusions section ends the paper.

Literature review and research objectives

Several papers provide excellent reviews of research studies about performance measurement systems in operations (Chow et al., 1994; Neely et al., 1995; Beamon, 1999; Kennerley and Neely, 2003; Bourne et al., 2003; Evans, 2004). Early research identified the need to broaden performance measurement systems to support new operations practices. Traditional performance measures in operations put a one-sided emphasis on minimizing direct costs through low material costs, high capacity utilization, and high direct labor efficiency. A stream of literature on modern manufacturing systems and service operations started in the 1980s (Hayes et al., 1988) and was soon followed by papers and books arguing that performance measurement systems needed to be adapted to include measures on quality, throughput times, flexibility, responsiveness, etc. (Eccles, 1991; Hall et al., 1990; Kaplan, 1990; Maskell, 1991). The empirical literature has reported results that support the existence of relationships between the pursuit of specific operational strategies, such as JIT, quality improvement and flexibility, and the expansion of traditional efficiency-focused performance measures to embrace new performance measures (Abernethy and Lillis, 1995; Fullerton and McWatters, 2002; Perera et al., 1997).

In other words, performance measurement systems need to be clearly linked to the operational strategy of the organization. Definitions of performance measures and targets for these measures can be seen as concrete formulations of the firm's strategic choices; and the actual results achieved for the various measures reflect how well the firm succeeds in achieving these strategic choices (Kaplan and Norton, 2000). Both financial and non-financial measures are needed to translate the strategy into specific objectives that guide operational actions taken by middle and lower management. Performance measurement reports may include a large number of different measures for each responsibility unit, spanning financial performance, customer relations, internal business processes, and learning and growth objectives of the organization and the employees. Some measures may be unique to only one responsibility unit, while other measures are common across several units (Kaplan and Norton, 1992, 1993, 1996; Ittner and Larcker, 1998). The balanced scorecard concept attracted a lot of attention as a label to broaden PM initiatives:

• to include a variety of financial and non-financial measures from various perspectives;
• to pay attention to relationships between different measures; and
• to link PM explicitly to strategy development.

Development and implementation processes of performance measurement systems have also been investigated (Neely et al., 1997). The development of a performance measurement system may conceptually be separated into phases of design, implementation, and use (Bourne et al., 2000).



The design phase is about identifying key objectives and designing measures. The process is often iterative, whereby measures are developed and adjusted as more information about strategy, customers, processes, etc. becomes available. The appropriate measures are derived in several rounds to review and revise the measures. The availability of data is one of the considerations in the design process. Kaplan and Norton (1993) mention using documents, interviews, and executive workshops for gathering information and building consensus. In the implementation phase, systems and procedures are put in place to collect and process the data that enable the measurements to be made regularly. In the use phase, managers review the measurement results to assess whether operations are efficient and effective, and whether the strategy is successfully implemented. This may also lead to challenging the strategic assumptions. The design, implementation, and use of a set of performance measures are not a one-time effort: a firm should install processes that ensure continuous review of the system (Beamon and Ware, 1998; Bourne et al., 2000; Medori and Steeple, 2000; Kennerley and Neely, 2003). Review processes imply that a measure may be deleted or replaced, the target may change, and the definition of measures may change.

A typical development process is described by Neely et al. (1995) (Table I). First steps in the process are to clearly define the firm's mission statement and strategic objectives, and to develop an understanding of each functional area's role in achieving the various strategic objectives. From such analyses, the organization's global performance measures and functional measures are derived, which are then further refined. Key to this process is the assurance of consistency with strategic objectives among the performance criteria used at each level. Hudson et al. (2001) review the literature and identify requirements for the development process, which should:

• evaluate existing performance measurement systems;
• enable strategic objective identification;
• enable performance measure development;



• provide a maintenance structure;
• involve key users;
• have top-management support;
• have full employee support;
• have clear and explicit objectives; and
• have set timescales.

Table I. Development steps of a performance measurement system (Source: Neely et al., 1995)
Step 1 – Clearly define the firm's mission statement
Step 2 – Identify the firm's strategic objectives using the mission statement as a guide (profitability, market share, quality, cost, flexibility, dependability, and innovation)
Step 3 – Develop an understanding of each functional area's role in achieving the various strategic objectives
Step 4 – For each functional area, develop global performance measures capable of defining the firm's overall competitive position to top management
Step 5 – Communicate strategic objectives and performance goals to lower levels in the organization; establish more specific performance criteria at each level
Step 6 – Assure consistency with strategic objectives among the performance criteria used at each level
Step 7 – Assure the compatibility of performance measures used in all functional areas
Step 8 – Use the performance measurement system
Step 9 – Periodically re-evaluate the appropriateness of the established performance measurement system in view of the current competitive environment

While several authors describe empirical studies that support the importance of these challenges for implementing performance measurement systems (Bititci et al., 2000; Hudson et al., 2001; McAdam and Bailie, 2002; Medori and Steeple, 2000; Neely et al., 2000), more empirical research is needed to describe and understand the variety of implementation processes that organizations follow. The interplay of existing local reports and new initiatives for the development and implementation of performance measurement on a wider scale is an important aspect. Performance measures often exist at various levels within the operations function (McKinnon and Bruns, 1992). Many existing reports share certain characteristics: they are informal, distributed frequently, developed locally, contain a mix of local and centralized data, and report operating information covering very short periods (a week, a day, or less), status information (up-to-date accumulations of bits of operating data, e.g. inventory-level reports and backlog reports), and comparisons (for example, against budget, planning, or a previous period).

The existence of performance measures may impact the development and implementation of an integrated performance measurement system in a fundamental way. Meyer and Gupta (1994) discuss that variability of performance measures tends to decrease over time, so they can no longer discriminate between good and bad performers and become less informative. Several effects lead to diminishing variability (Meyer and Gupta, 1994): positive learning diminishes gaps between high and low performers; perverse learning causes people to strive to meet performance as measured to the detriment of objectives that are not measured; selection processes remove low performers; and persistent differences that are not eliminated through learning and selection are often suppressed simply because further improvements are not likely. Organizations also create new measures as existing ones become less informative.

Another consideration for the role of existing, local measures is that these may contain experience that can be used for developing the official performance measurement system. Lohman et al. (2004) present a case study of the development of a performance measurement system for top-management levels of the European supply chain organization of a large, worldwide operating company. The results suggest that in some cases developing a performance measurement system should to a large extent be understood as a coordination effort rather than a design effort. This means that it is crucial throughout the development to understand current performance metrics and reports in great detail, and to include ongoing initiatives that affect performance measurement (such as new information systems, parallel initiatives for developing performance measures, global scorecard development, etc.). Lohman et al. (2004) describe a process where documenting, coordinating, identifying gaps, and only some designing are central. While this may not be entirely different from what has been suggested in the literature, it describes a process with a different emphasis.



• At the start and throughout the development process, it is identified what is already being measured at various levels within operations and which performance measurement initiatives are being undertaken (both within and outside the operations function), rather than looking at existing measures to identify the gap between existing and developed measures.

• Strategy is taken as given from existing documents, or it is left implicit (because it is clear what is important in operations, and now the challenge is to measure this with a few numbers), rather than including discussions about strategy within the scope of developing and implementing a performance measurement system.

• The metrics dictionary (Neely et al., 1997) is used for describing existing measures and for coordination, not only for documenting newly developed measures.

The objectives of the present study are to investigate the impact of existing informal performance measures on the development process and the content of formally initiated, integrated performance measurement systems. We focus on a medium-sized company to investigate whether the identification of existing performance measures and the coordination of parallel performance measurement initiatives are as important as in a large company.

Case study as the research method

This research is based on a case study of a Dutch brewing company, which was studied during a 15-month period. Information was gathered through interviews, company documents, attending and participating in meetings, making proposals and gaining feedback on these.

Case study was chosen as the research method, because the objective was to gather data on the process of developing and implementing performance measurement systems in a real-life setting (Voss et al., 2002), specifically aimed at investigating the role of existing measures in a medium-sized company. It is the process of confrontation between existing measures (informally defined, locally implemented) and an official performance measurement initiative (that emphasizes transparency, consistency) that we want to study in a real-life context and over a longer period. A single case study is in our view the appropriate methodology for this investigation, because this is exploratory research and we are seeking to open up a stream of future work.

The research site was selected because it satisfied a number of criteria. First, we looked for a medium-sized company, as we wanted to investigate the role of existing measures and ongoing initiatives in such a setting (compared to very large companies). A second criterion was that the overall objectives of the logistics function and its role within the company's strategy were reasonably clear. In other words, we wanted to study a company that had a clear understanding of what its logistics function was supposed to do, because we wanted to avoid first having to go through a strategy development exercise. A third criterion was that top management of the logistics function had some experience with performance measurement.

There were elements of action research (Coughlan and Coghlan, 2002), as one of the authors was involved in developing and implementing performance measures, as part of his Master's program in industrial engineering and management, over a six-month period.



The project aimed to assist the company as well as contribute to science. The company participated in this study because they expected to be assisted with their development and implementation of performance measurement.

Data were gathered through individual interviews, meetings with groups of employees of the organization, and company documents. Notes of the interviews and meetings were kept. The data-gathering process was recorded in such a way that it was traceable how data were obtained, but also which inputs the researchers provided to the process. The process of obtaining quantitative and qualitative data should be transparent in case-study research (Stuart et al., 2002). Each interview or meeting was recorded in a table that listed: who took part, when and how long the interview or meeting took, a reference to a document with notes of the meeting or interview, a reference to any sample documents that were obtained in the meeting or interview, and a reference to any input that the researchers provided to the meeting or interview. The files and documents referred to in the table were kept in the case-study file. The final version of the table contained about 85 entries on interviews and other kinds of meetings. Table II summarizes the data gathered in the case study.

Findings

On the basis of the data, the case was analyzed from several perspectives. The first subsection describes what, at the outset of our study, were the main challenges for the further development and implementation of performance measurement in logistics. The subsection "Production processes and organization" contains a high-level process description and a description of the organization of the logistics function. One research objective is to study the impact of existing measures on the content of the performance measurement system that has been developed. This is the focus of the subsection "Performance measurement system". The data in Table III (discussed in later sections) bring out the difference between the measures that were in use in the existing situation and the measures that had been developed and in some cases implemented after the study. The other research objective focuses on how existing measures may play a role in the development process of the formal performance measurement system.

Table II. Data gathered in the case study

Interviews: Twenty-eight interviews with 22 different people, total duration 23 hours.
Meetings: Sixty-one meetings, total duration 83 hours, ranging from short personal meetings to group discussions/brainstorms. The interviews and meetings provided interaction with around 30 different people in the organization.
Sample documents: Description of strategic initiatives; Annual report 2002; Delivery reliability reports; Dashboard physical distribution (first initiative with KPIs), versions 2002 and 2003; Documents about supplier audit procedures; Supplier performance report; Thesis written by a student about performance measurement in a strategic co-operation; Historical documents about initiatives for a PI at the Internal Transport Department.



Table III. Existing and new performance measures in the case-study company

Logistics overall. Existing measures (kept): logistics costs per hl; delivery reliability. New, additional measures: percentage of FTE spent on innovation projects; percentage of FTE spent on training.

Materials management. Existing measures (kept): stock turnarounds; delivery reliability; warehouse utilization; percent of risk stock; productivity of order-pickers. Existing measures (cancelled): no. of changes in production planning. New, additional measures: delivery reliability on warehouse level (*) [Implemented]; no. of days inventory was planned between set boundaries [In development]; stock corrections (*) [Implemented].

Physical distribution. Existing and new measures: delivery reliability; trucking hours per sold pallet place [In development]; trucking kilometers per sold pallet place [In development]; transportation costs per HL tank beer; stock corrections (*) [Implemented]; HL tank beer per man hour; load factor; transportation costs per pallet place; warehouse occupation; availability of vehicles; no. of pallets handled per lift truck hour; percent of illness. Existing measures (cancelled): internal transportation costs per pallet place; internal transportation man hours per pallet place; average order size.

Purchasing. Existing and new measures: delivery reliability; percent purchased by purchasing department [In development]; percent of matching invoices; supplier performance; no. of innovative developments; cost level of (raw) materials; internal customer satisfaction.

Packaging development. Existing measures: none. New, additional measures: no. of ideas per period [Implemented]; percentage of hours spent on innovation (absolute and relative) [Implemented]; percent of packaging materials with functional specs.

Notes: it is indicated which existing measures were kept and which were cancelled; measures marked with (*) were already initiated, but not yet implemented; "In development" means the measure has been developed during the project, but is not yet finished; "Implemented" means the measure is completely developed and implemented during the project.


The subsection "Development process" provides information for assessing this. The data in Table IV summarize the development process.

Performance measurement challenge

The company where the case study took place is a brewing company in the Netherlands, with annual sales of around €300 million and 3 million hectoliters. The company employs around 900 people. The company has formulated three main objectives: continuity of the firm, value creation for shareholders, and volume growth. Continuity is seen as finding a balance between the interests of employees, shareholders, and environmental concerns. Volume growth is important for ongoing shareholder value creation and because of large investments that the company is making.

The company has selected strategic initiatives in order to achieve the objectives mentioned above. Important elements in the company's strategic initiatives are to focus on high-quality, high-margin premium segments of the Dutch market and on a limited number of international markets where the brand has a strong position. Innovation is important, both in specialty and seasonal beers, as well as in packaging. Cooperation with other brewing companies is a cornerstone of its international strategy, for a wider distribution of the brand, for combining purchases, and for better utilizing the available capacity (for example, the company brews and distributes an American brand in Europe). The company is currently in the process of completely rebuilding its brewery and office buildings on a new location, investing about €250 million.

The main functional areas of the company are marketing and sales, production, and logistics. This research project took place within the logistics function, which comprises a number of departments: purchasing, physical distribution, materials management, and packaging development. The Director of Logistics is responsible for the logistics function. The logistics director and the heads of the four logistics departments form the management team of the logistics function, the so-called logistics team. More information about these sub-departments is provided below.

The goals of the logistics function were pretty clear at the outset of this study. The large investment for the new brewery was connected to ambitious targets for growth of sales and profits.

Table IV. Types of steps taken in the development process

Week 1: Talk with people and study documents about the strategy of the company, the contribution of the logistics function, and the objectives of logistics (steps 1-4 in Table I).
Weeks 2-6: Study existing performance measures in logistics departments.
Weeks 7-15: Determine the usefulness of existing measures and start discussing potential new measures for each logistics department (steps 5, 6, 7; see note).
Weeks 14-22: Develop or improve reports and procedures for a selection of new or existing measures (steps 6, 7).
Weeks 22-...: Use the new measures (step 8).

Note: The reference to steps 5-7 does not completely reflect the process in this period; it does not reflect the ongoing process of identifying new measures and understanding the details about existing measures.


The company had recently discussed each functional area's role in achieving its strategic objectives, and the mission of the logistics function had been described as "to coordinate the supply chain in an effective, efficient, and innovative way for providing optimal service to our customers". This had resulted in four objectives for the logistics function: to be number one in customer satisfaction, excellence in supply chain efficiency, continuous supply chain innovations, and a professional and learning organization.

Besides a clear understanding of what logistics was supposed to contribute to the overall success of the company, the company had one performance measure in place that measured at least partly whether logistics was successful: delivery reliability. This measure was available at various levels of aggregation, ranging from overall delivery reliability in a particular month for the company as a whole, to, for example, the contribution of the finished goods warehouse in a particular week for a particular product group. This measure was reported and clearly understood throughout the logistics function. As the Director of Finance explained in one of the first interviews: "In Logistics there are also quality indicators, for example, are deliveries on time, do deliveries match exactly what has been ordered, are products out-of-stock?"

The situation described above sets the stage for further development of performance measurement in the case-study company. The challenges for the implementation of performance measurement in this case study can be understood as follows.

• Expand the number of performance measures for the logistics team. Delivery reliability was available and an important one, but it did not cover the total contribution of the logistics function to realizing the firm's strategy, so additional measures were required to provide a more comprehensive picture in relationship to the four objectives of the logistics function.

• Provide guidance for concrete actions to lower-level managers, by creating new measures that connect activities that take place at the shop floor to the four objectives of the logistics function. Again, relationships are very clear for the dimension of delivery reliability; almost every sub-department has a clearly formalized influence on this measure. However, other dimensions of performance of the sub-departments are not captured in performance measures that are clearly linked to the logistics objectives.

• Implement new measures in such a way that managers and other employees in the logistics function are going to create the periodic report, instead of the finance and accounting function. This required overcoming several practical constraints, such as computer skills and time availability. The controller had some doubts here: "People don't know enough about Excel and other IT tools, they cannot work with it very well." However, as the Director of Logistics expressed it: "If people are not going to take the effort to do the measurements and make the reports, it probably means it's not essential to do them." Whether new measures would actually be implemented was a kind of relevance test in his mind.

Production processes and organization

The supply chain and the company's production process are shown in Figure 1. The main inputs for the brewing process are water, malt and hops. These come from about ten different suppliers, located between approximately 10 and 500 km from the breweries.



There are currently two brewery sites that are about 40 km apart, but these are being replaced by a completely new, state-of-the-art brewery at a new location in between the two old sites. This new brewery will operate more efficiently and is logistically much better located.

The brewery process consists of four major steps. The first step in the brewing process is to produce wort (a malt mixture) from water, and this takes about eight hours. This involves gradually heating a mixture of water and malt to release sugars. After filtering this mixture, the clear liquid is brought to a boil and hops are added. After cooling it, the wort is pumped to the fermentation tanks.

In the next step, yeast is added to transform the sugar into alcohol and carbon dioxide. This fermentation process takes about two weeks, and after that the yeast is taken out of the beer and the beer is pumped to storage tanks to stabilize for several weeks (also called aging). This improves the taste and lets particles in the beer sink to the bottom of the tanks. This is not just a storage point, but part of the brewing process; that is why it is indicated as a process step in Figure 1, not as an inventory point. Because the time in storage is somewhat flexible, there is in fact also a buffer function involved. The beer is then filtered and bottled in various types of bottles and cans, 20 or 50 l containers for bars and restaurants, or into tanks for tank transportation.

Transportation to customers can be thought of as two different flows (these are shown in Figure 1): the beer is distributed to customers on pallets (that contain bottles, cans, or containers) or in tanks. Pallets are transported to retail customers who sell to consumers in stores, or to distributors who sell to bars, restaurants, etc. (the company does not distribute pallets directly to bars and restaurants). Transportation in tanks does go directly to bars, restaurants, etc. that store the beer in special tanks and sell draft beer. About 70 percent of the volume (brewed in the Netherlands) is distributed to customers in the Netherlands, the other 30 percent to international customers.

As mentioned above, the logistics function is made up of the materials management, physical distribution, purchasing and packaging development departments, which will now be described in some more detail (Figure 2). The materials management department consists of several sub-departments:

• logistics planning is responsible for controlling the level of finished goods inventory and determining the production plan, from which the required materials are derived;

• the central warehouse is where all materials are kept, except for what is in the marketing warehouse; and

• in the marketing warehouse the commercial materials for marketing purposes are stored (such as glasses, umbrellas, shirts and other clothing, toys, etc.; this is a significant material flow for a brewing company).

Figure 1. Supply chain and production process at the case study company



The physical distribution department is responsible for the goods flow from finished goods inventory until products reach the customer, and for return flows of empty bottles and crates that are reused. (A system of deposits leads to almost 100 percent of bottles and crates being returned to stores and then to the brewery.) The physical distribution department consists of several sub-departments.

• Customer Service Netherlands processes all incoming orders from customers in the Netherlands, records logistics complaints from customers and controls whether the ordering behaviour of customers complies with the commercial service level agreements.

• Transportation and distribution is responsible for executing all transportation activities towards customers, and also between the company sites and between the company and external warehouses. The company is in the process of gradually outsourcing almost all of its transportation and distribution activities (except for the distribution of beer that is delivered and stored at the customer's location in large tanks) by reducing the number of drivers and trucks over a number of years and increasing the share of transportation activities executed by external parties.

• Internal transport and warehouses loads and unloads trucks and executes all handling activities in the warehouses using a fleet of forklift trucks.

• One person is responsible for transportation planning to customers, but this is nevertheless considered as a separate sub-department.

The purchasing department is responsible for all purchasing processes of the company, and these include investments, raw materials, process materials, packaging materials, transportation, and other services. Purchasing aims for long-term relationships with suppliers; in many cases contracts are set by the purchasing department and some people outside the purchasing department can directly place call-off orders under these contracts.

Figure 2. Organizational structure of the case study company

The department for packaging development is responsible for developing new packaging materials, primary packaging (bottles, cans) as well as secondary packaging (boxes, foils, etc.). This requires a lot of cooperation with outside parties, such as designers, printing companies, suppliers of packaging materials. The department is also responsible for auditing suppliers of packaging materials.

Performance measurement system

Some results of the development process in our case-study company are summarized in Table III, which shows the measures that were in use at the start of the study and the new measures that were going to be used in the logistics function of the company. The table shows that various departments had measures in place, and 75 percent of these (12 out of 16) have been maintained. In addition, a number of new measures (three) were already being developed, but had not yet been implemented at the start of the study. For the new measures, there are priorities: a limited number (five) had been completely developed and operationalized at the end of the study period, four more measures were being implemented at that point in time, and the remaining new measures (13) had been identified as important, but their actual implementation was postponed until the first-priority measures had been used for some time. The metrics dictionary (Neely et al., 1997) was used for the development of the new measures.

As Table III shows, delivery reliability was already being measured and continues to be a central measure for the logistics function of the case-study company. We will provide some more detail on the measure, because it was defined in an interesting way that allowed drilling down. Overall monthly or weekly delivery reliability for the logistics function was defined as timeliness × completeness, or:

$$\left(1 - \frac{\text{\#complaints about timeliness}}{\text{\#order lines delivered}}\right) \times \left(1 - \frac{\text{\#complaints about completeness}}{\text{\#order lines delivered}}\right)$$

The number of complaints and the number of order lines delivered refer, of course, to the same period. This overall number can be broken down to a department, and then only the number of complaints caused by that specific department is counted in the numerator. (For example, suppose the total number of complaints is ten and the total number of order lines delivered is 1,000, so overall performance is 1 - 0.01 = 99 percent. If three complaints were caused by one department and seven complaints by a second department, the overall performance number is disaggregated into 1 - 0.003 = 99.7 percent and 1 - 0.007 = 99.3 percent for the two departments, respectively.) The customer service department investigates customer complaints and tracks each complaint to the department that caused it. Drilling down the overall delivery reliability to specific products or customers is not yet possible with the information systems, but would be possible if some queries in the information systems are adjusted and customers and products are registered. In the current measure all complaints count as one (have the same weight), while it would be possible to give different weights; for example, an incomplete order line for commercial materials could be considered as less severe than an incomplete order line for a large quantity of beer.
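To make the arithmetic concrete, the following sketch computes the overall measure and the departmental breakdown exactly as defined above. The complaint records, department names, and numbers are hypothetical illustrations (chosen to reproduce the 10-complaints-on-1,000-order-lines example in the text), not data from the case-study company.

```python
# Minimal sketch of the delivery reliability calculation described above.
# Department names and complaint records are hypothetical, not case data.
from collections import Counter

def overall_delivery_reliability(complaints, order_lines_delivered):
    """Overall reliability = timeliness x completeness.

    `complaints` is a list of (department, kind) tuples, where kind is
    "timeliness" or "completeness"; `order_lines_delivered` is the number
    of order lines delivered in the same period.
    """
    kinds = Counter(kind for _, kind in complaints)
    timeliness = 1 - kinds["timeliness"] / order_lines_delivered
    completeness = 1 - kinds["completeness"] / order_lines_delivered
    return timeliness * completeness

def departmental_delivery_reliability(complaints, order_lines_delivered):
    """Disaggregate: only the complaints caused by a department are counted
    in the numerator, against the total number of order lines delivered."""
    per_department = Counter(dept for dept, _ in complaints)
    return {dept: 1 - n / order_lines_delivered for dept, n in per_department.items()}

# Example with the numbers used in the text: 10 complaints on 1,000 order lines,
# 3 caused by one (hypothetical) department and 7 by another.
complaints = [("warehouse", "completeness")] * 3 + [("transportation", "timeliness")] * 7
print(overall_delivery_reliability(complaints, 1_000))       # ~0.99 (99 percent)
print(departmental_delivery_reliability(complaints, 1_000))  # ~99.7 and ~99.3 percent
```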

One of the interesting new measures that deserves a more detailed description concerns finished goods inventory. In Table III, this measure is called "No. of days inventory was planned between set boundaries" (in development). Based on expected sales and planned production, the inventory level for a certain standard-sized bottle of regular beer is projected in a graph. This graph contains an upper and a lower bound (Figure 3). When the inventory level becomes higher than the upper bound, which is determined by the capacity of the warehouse, transportation to external warehouses is needed. An inventory level lower than the lower bound, which is calculated as a certain service level, means a higher risk of out-of-stock situations.

The value of the lower bound is variable and is calculated as the sum of three components: one day of sales, safety stock and anticipation stock. One day of sales is kept as a constant value to avoid loading problems. The safety stock is calculated using variances in expected and actual sales per day and in production quantities planned and realized (expressed in delivery time), to cover uncertainties. The anticipation stock is calculated as the difference between the production capacity and the expected sales for a certain day. This is to anticipate the fact that the production capacity, which is mostly determined by the number of shifts, cannot be changed on a daily basis.

The number of days per month that the planned inventory level in this graph is going to be in between the bounds can be used as a performance measure. This graph is currently in use on an experimental basis for gaining experience, and some aspects might change, for example, the time horizon that is reviewed.
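A minimal sketch of how such a measure could be computed is shown below. The lower-bound components follow the description above (one day of sales, safety stock, anticipation stock), but all numbers, the simple safety-stock rule, and the function names are illustrative assumptions, not the company's actual planning logic.

```python
# Illustrative sketch of the "No. of days inventory was planned between set
# boundaries" measure. The safety-stock rule and all values are assumptions.
from statistics import pstdev

def lower_bound(one_day_of_sales, forecast_errors, production_capacity, expected_sales):
    """Lower bound = one day of sales + safety stock + anticipation stock."""
    safety_stock = 2 * pstdev(forecast_errors)  # assumed rule based on sales/production variances
    # Anticipation stock: stock built up when expected sales exceed the daily
    # production capacity, which cannot be changed on a daily basis (one reading
    # of "the difference between the production capacity and the expected sales").
    anticipation_stock = max(0.0, expected_sales - production_capacity)
    return one_day_of_sales + safety_stock + anticipation_stock

def days_between_bounds(planned_inventory, lower_bounds, upper_bound):
    """Count planning days on which the projected inventory stays within the bounds."""
    return sum(lb <= level <= upper_bound
               for level, lb in zip(planned_inventory, lower_bounds))

# Example over a 40-day horizon (as in Figure 3), in crates; all numbers hypothetical.
horizon = 40
upper = 1_500                                           # warehouse capacity
planned = [1_000 + 15 * day for day in range(horizon)]  # projected inventory levels
bounds = [lower_bound(800, [120, -90, 60, -30], production_capacity=900,
                      expected_sales=950) for _ in range(horizon)]
print(days_between_bounds(planned, bounds, upper))
```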

Figure 3. Graph of planned inventory level, 40 days into the future (for crates with 24 bottles of 33 cl), based on planned production and expected sales

This section explored the impact of existing performance measures on the content of the new, centrally initiated, formal performance measurement system, which was one of our research objectives. For our case study, the data suggest that the existing, informal measures are reflected in the official system, but it is not only a matter of cut-and-paste, because the centrally initiated system leads to new performance measures as well. Furthermore, in this case study it was found that performance measures are highly specific to the organization, as shown by the example of the measure for the finished goods inventory in Figure 3.

Development process

Table III suggests that the development and implementation are an ongoing process of developing, implementing, experimenting, and revising. There is not a specific point in time when the performance measurement system is ready. The company was and will be experimenting with measures and reports. The table also shows that the throughput time for actually implementing a few new measures is considerable and may easily take half a year to one year.

In the case study, it was crucial to look in detail at existing measures at the start of the development process. New measures could only be developed after understanding and using as much as possible from what was already in place. This was different from a gap analysis where the new measurement system is designed first and then compared to the existing system to identify gaps. Rather, it was about building an understanding of the following (a simple documentation record along these lines is sketched after the list):

• the precise definitions of existing measures;
• the rationales behind these;
• the data that were used for these;
• the limitations that people experienced with the existing measures;
• ideas that people are working on to improve the existing system; and
• changes in information systems that could impact existing reports.
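As a concrete illustration, the sketch below records an existing measure along the six dimensions just listed, in the spirit of the metrics dictionary mentioned earlier. The field names and the example entry are our own hypothetical illustration (populated with details of the delivery reliability measure discussed in this paper), not the case-study company's actual dictionary format.

```python
# Illustrative documentation record for an existing measure, covering the six
# dimensions listed above. Field names and the example entry are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ExistingMeasureRecord:
    name: str
    definition: str          # precise definition of the existing measure
    rationale: str           # rationale behind the measure
    data_sources: list       # data used for the measure
    limitations: list        # limitations people experience with it
    improvement_ideas: list  # ideas people are working on to improve it
    related_is_changes: list = field(default_factory=list)  # IS changes that could impact the report

# Hypothetical entry for the delivery reliability measure described earlier.
delivery_reliability_record = ExistingMeasureRecord(
    name="Delivery reliability",
    definition="timeliness x completeness, based on complaints per order line delivered",
    rationale="central indicator of the service the logistics function provides to customers",
    data_sources=["customer service complaint registration", "number of order lines delivered"],
    limitations=["all complaints carry the same weight", "no drill-down to products or customers yet"],
    improvement_ideas=["weight complaints by severity"],
    related_is_changes=["adjust queries so that customers and products are registered"],
)
```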

The understanding of these issues informed the development of new measures and the performance measurement system. As an illustration of the important nuances of the existing indicators that needed to be understood, the Director of Finance talked about efficiency performance measures on the packaging lines: "Various definitions have been set up that exactly indicate how it should be calculated, for example, whether start-up time is part of it, and those kinds of things. So there are several production efficiency measures with different definitions." The development process became a short-cycled iterative process of looking at existing and required measures. While current models of implementation processes do not exclude this, this observation has not received much attention in the literature thus far.

Another way to describe the case study is in terms of the kinds of steps that have been taken, which are depicted in Table IV. (The numbering refers to the steps listed in Table I.) Steps 1 through 4 were already clear and took almost no time; it only required studying documents and some interviews. Considerable time was spent on understanding the existing measures, as Table IV shows. It became clear that the steps listed in Table I cannot fully reflect the process in this case study, as the identification of existing performance reports and the coordination of initiatives are not made explicit in those steps.

The results of these analyses have been discussed with the Director of Logistics and his management team. They supported the findings reported in this paper. They also concluded that they would continue developing their PMS in a similar manner: giving much emphasis to developing new measures through a combined effort of managers, controllers, and researchers, and further developing and implementing the PMS step-by-step in a process of continuous revision and improvement. At the end of the period of this study, the director commented: "We now want to achieve that the indicators we have are up and running and visible on the publication boards." The researchers maintained contacts with the company, and about 18 months after this study, performance measurement was implemented more broadly in the company, and the performance measurement system implemented within logistics was presented by top management as an example for other departments in the organization.

Is there a danger in taking measures that are already there as the starting point for further development? Maybe this inhibits change and innovation? The controller remarked in this context: "You can never be fully sure that the right indicators are emerging." He also felt the improved performance measures were not only relevant for the employees themselves, but also for him in his capacity as controller, to have more facts about actual performance: "I often hear all kinds of false excuses: 'It's because of . . .' and then other people or uncontrollable circumstances are blamed." The control perspective on performance measures might not get sufficient attention in a purely bottom-up development process. Reflecting on our case-study experiences, we would suggest that it is important at some stage to take a fresh look and to try to think, individually and in group sessions, about new measures that may have nothing to do with existing measures. In the case study, the questions and ideas that the researchers brought to the process were one source of such new input. We could imagine that only looking at what is already there would hamper creativity. It is probably important to do both: identifying what exists, and taking a step back to come up with completely new ideas. We have emphasized the first point because it has received less attention in the literature so far.

This section presented findings about the impact of existing informal performance measures on the development process of formally initiated, integrated performance measurement systems (the other research objective). We found that much attention (time) is paid to discovering and documenting existing reports and initiatives. Even in a medium-sized company, this was not an easy process, because the knowledge was scattered across different people, in different departments, and at different levels in the organization. Knowledge about existing measures was local, and building an understanding of what was already there required time and attention. We also found that a standardized format is useful for documentation of existing measures, as suggested by Neely et al. (1997). Furthermore, our findings suggest that the revision process is a continuous activity rather than a periodic review with the PMS being frozen in between those official reviews.
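As a hedged illustration of what such a standardized documentation format might look like in practice, the sketch below records an existing measure as a simple structured record. The fields loosely echo elements of a performance measure record sheet in the spirit of Neely et al. (1997), but the exact field names and the example content are our own assumptions, not the template used in the case.

```python
# A minimal sketch of a standardized record for documenting an existing
# performance measure; field names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MeasureRecord:
    title: str                 # precise name of the measure
    purpose: str               # rationale: why the measure exists
    formula: str               # exact definition/calculation
    data_sources: List[str]    # where the underlying data come from
    frequency: str             # how often it is measured and reported
    owner: str                 # who measures it and who acts on it
    known_limitations: List[str] = field(default_factory=list)
    improvement_ideas: List[str] = field(default_factory=list)

# Hypothetical example of documenting an existing, informal report.
example = MeasureRecord(
    title="Packaging line efficiency (line 3)",
    purpose="Track output losses on the packaging line",
    formula="Good output / (nominal rate x available time)",
    data_sources=["line counters", "shift logs (spreadsheet)"],
    frequency="weekly",
    owner="Packaging team leader",
    known_limitations=["start-up time treated differently across lines"],
)
print(example.title)
```

Collecting such records across departments makes inconsistencies between similar measures, and between higher-level and lower-level definitions, visible at a glance.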

Discussion and conclusion

This paper described and explained a development and implementation process of a performance measurement system in a case study. Such processes are not the same in every organization, so there is a need to document and make sense of the diversity of development and implementation processes. Previous research (Lohman et al., 2004) suggested, based on a case study in a large company, that in the development and implementation process of performance measurement systems, the identification of existing performance measures and the coordination of independent performance measurement initiatives are crucial, while these have hardly been highlighted in the literature thus far. The objectives of this study were to investigate the impact of existing informal performance measures on the development process and on the content of formally initiated, integrated performance measurement systems. In this study we investigated a medium-sized company.

Regarding the development process, we found that much attention was required for discovering and documenting existing reports and initiatives. Even in a medium-sized company, this was not trivial, because the knowledge was dispersed across different people, departments, and organizational levels. We observed that in our case-study company, the contribution of logistics to the company's strategy was quite clear at the outset. The challenge was to make this more tangible and measurable at lower levels in the organization, and to provide a more concrete direction for action. The challenge simply did not lie in steps 1 through 4 of Table I. On the other hand, existing measures played a much more important role than often suggested in the literature, where they are one side of the gap: the other side is an ideal performance measurement system that is the result of a development process, and looking at the gap shows how feasible the ideal is. We found that existing measures and initiatives needed to be identified, the rationales of these measures and their precise definitions needed to be understood, and relationships (and inconsistencies) between similar measures in different departments and between higher-level and lower-level management needs needed to be explored. These activities did not come in place of creatively thinking about entirely new measures, but they are much more important for the process and the outcomes than has perhaps been recognized in the literature so far.

We also found that a standardized format is useful for documentation of existing measures, as suggested by Neely et al. (1997). Furthermore, our findings suggest that the revision process is a continuous activity rather than a periodic review with the PMS being frozen in between those official reviews.

Regarding the impact of existing performance measures on the content of the new, centrally initiated, formal performance measurement system, our findings suggest that existing, informal measures are reflected in the official system. However, it is not only a matter of cut-and-paste, because the centrally initiated system led to new performance measures as well. Furthermore, in this case study it was found that performance measures were highly specific to the organization.

Practical recommendations

Reflection on the research process may provide some practical recommendations for future use and development of the performance measurement system.

. Group discussions, such as brainstorming sessions, could be used successfully for generating ideas at lower levels of the logistics department. Although we expected it might save time to treat the different sub-departments of logistics separately, we feel it would have been better if we had also held a group discussion at management level earlier, instead of only later on.

. The metric design template proved to be a useful tool to define the necessary aspects of a performance indicator.


. It was important to first investigate which performance measures and reports exist and to judge their usefulness. This step is hardly mentioned explicitly in the literature.

. It was important to be aware of the level of computer skills of potential users of the performance measures. Providing support to users in the design and use phase becomes important, to prevent users becoming too dependent on people in the organization who are skilled with computers.

. At the case company, developing many metrics simultaneously led to many almost-finished metrics; only a few have been completely implemented. Therefore, it might be better to develop and implement metrics one by one, depending also on how much time people have available for developing and implementing new measures, in order to achieve tangible results quickly.

. Continuous involvement of managers, the future users of the reports, during the selection and design phase is crucial. Managers are mostly involved in many other projects as well. Despite their busy schedules, they should spend enough time discussing the selected metrics and specifications.

Limitations and future research

Several limitations should be mentioned. As with any case study, what is merely typical and coincidental for the particular company studied, and which empirical findings have theoretical significance beyond the company that has been studied? Our research suggests that the following factors could be important for the generalizability of the results. A first condition that seems relevant is that the firm's strategy and the logistics function's role are well defined. Maybe the underlying question is whether strategy formulation is part of a performance measurement initiative. Of course, in so far as the organizational strategic objectives and logistics' role in realizing those are vague, there is no basis for developing and implementing performance measurement in logistics. If that is the case, time needs to be spent clarifying those first. A second condition that may have to do with the generalizability of our results is that top management of a firm's logistics function has some successful experience with performance measures, so the company is ready to take performance measurement to the next level. A third condition could be that lower management levels inside the logistics function have experience with performance measurement. There is some empirical support (McKinnon and Bruns, 1992) that in many companies performance measures exist, although they are developed and reported informally, known inside departments (higher management may not be aware of these), measured according to various definitions (which may not be consistent across departments and management levels), and based on local data and information systems. The implication is that, when higher-level managers want more structure, more overview, more uniformity, and more numbers, digging deeply into the existing reports and metrics will be a key element throughout many development processes.

Doing action research introduces another limitation, because there is a risk of biasing the results through the influence of the actors. While this may be true, we believe that doing this type of research provides a great opportunity to see what is happening in real companies, to obtain much detailed information, and to experiment with approaches for developing and implementing performance measurement. Furthermore, many different people from the case-study company have been involved, who would not easily follow an approach they did not see as valuable, and who represent different views and provide different kinds of information.

The results so far suggest several directions for interesting and useful future research. First, it would be worthwhile if companies would cooperate in longitudinal case studies to investigate the process of using and modifying the new performance measurement system. Over a period of at least one year, it could be investigated, for example, whether it is feasible and beneficial to continuously improve the report while keeping the metrics quite stable. Is it beneficial to have a high-level performance-measurement manager prepare the report and the input for the meeting, as well as translate the discussion into a new format for the next month? How are experiences with a PMS shaping further developments?

A second research direction would be to pay attention to information systems. What are the problems when new information systems are introduced, and what are the causes? What are the advantages of more sophisticated tools for data warehousing and the like? Does this enable more use to be made of data in ERP systems? What are the barriers here? Case studies could be an appropriate method for researching these questions.

A third set of research issues pertains to using complex reports. Performance measurement as it was found in this case study, and as it is also discussed in the literature, may lead to complex reports, with detailed measures that are specific to particular departments. Can users analyze such information? Previous studies have shown that the cognitive complexity of reports, with many measures along different dimensions, some unique and some common, may prevent managers from taking full advantage of modern performance measurement systems (Lipe and Salterio, 2000).

    References

Abernethy, M.A. and Lillis, A.M. (1995), "The impact of manufacturing flexibility on management control system design", Accounting, Organizations and Society, Vol. 20 No. 4, pp. 241-58.

Beamon, B.M. (1999), "Measuring supply chain performance", International Journal of Operations & Production Management, Vol. 19 No. 3, pp. 275-92.

Beamon, B.M. and Ware, T.M. (1998), "A process quality model for the analysis, improvement and control of supply chain systems", Logistics Information Management, Vol. 11 No. 2, pp. 105-13.

Bititci, U., Turner, T. and Begemann, C. (2000), "Dynamics of performance measurement systems", International Journal of Operations & Production Management, Vol. 20 No. 6, pp. 692-704.

Bourne, M., Mills, J., Wilcox, M., Neely, A. and Platts, K. (2000), "Designing, implementing and updating performance measurement systems", International Journal of Operations & Production Management, Vol. 20 No. 7, pp. 754-71.

Bourne, M., Neely, A., Mills, J. and Platts, K. (2003), "Implementing performance measurement systems: a literature review", International Journal of Business Performance Management, Vol. 5 No. 1, pp. 1-24.

Chow, G., Heaver, T.D. and Henriksson, L.E. (1994), "Logistics performance: definition and measurement", International Journal of Physical Distribution & Logistics Management, Vol. 24 No. 1, pp. 17-28.

Coughlan, P. and Coghlan, D. (2002), "Action research for operations management", International Journal of Operations & Production Management, Vol. 22 No. 2, pp. 220-40.

Eccles, R.G. (1991), "The performance measurement manifesto", Harvard Business Review, Vol. 69 No. 1, pp. 131-7.

Evans, J.R. (2004), "An exploratory study of performance measurement systems and relationships with performance results", Journal of Operations Management, Vol. 22, pp. 219-32.

Fullerton, R.R. and McWatters, C.S. (2002), "The role of performance measures and incentive systems in relation to the degree of JIT implementation", Accounting, Organizations and Society, Vol. 27, pp. 711-35.

Hall, R.W., Johnson, H.T. and Turney, P.B.B. (1990), Measuring Up: Charting Pathways to Manufacturing Excellence, Business One Irwin, Homewood, IL.

Hayes, R.H., Wheelwright, S. and Clark, K. (1988), Dynamic Manufacturing, The Free Press, New York, NY.

Hudson, M., Smart, A. and Bourne, M. (2001), "Theory and practice in SME performance measurement systems", International Journal of Operations & Production Management, Vol. 21 No. 8, pp. 1096-115.

Ittner, C.D. and Larcker, D.F. (1998), "Innovations in performance measurement: trends and research implications", Journal of Management Accounting Research, Vol. 10, pp. 205-38.

Kaplan, R.S. (Ed.) (1990), Measures for Manufacturing Excellence, Harvard Business School Press, Boston, MA.

Kaplan, R.S. and Norton, D.P. (1992), "The balanced scorecard - measures that drive performance", Harvard Business Review, Vol. 70 No. 1, pp. 71-9.

Kaplan, R.S. and Norton, D.P. (1993), "Putting the balanced scorecard to work", Harvard Business Review, Vol. 71 No. 5, pp. 134-47.

Kaplan, R.S. and Norton, D.P. (1996), "Using the balanced scorecard as a strategic management system", Harvard Business Review, Vol. 74 No. 1, pp. 75-87.

Kaplan, R.S. and Norton, D.P. (2000), "Having trouble with your strategy? Then map it", Harvard Business Review, Vol. 78 No. 5, pp. 167-78.

Kennerley, M. and Neely, A. (2003), "Measuring performance in a changing business environment", International Journal of Operations & Production Management, Vol. 23 No. 2, pp. 213-29.

Lipe, M. and Salterio, S. (2000), "The balanced scorecard: judgemental effects of common and unique performance measures", The Accounting Review, Vol. 75 No. 3, pp. 283-98.

Lohman, C., Fortuin, L. and Wouters, M. (2004), "Designing a performance measurement system: a case study", European Journal of Operational Research, Vol. 156 No. 2, pp. 267-86.

McAdam, R. and Bailie, B. (2002), "Business performance measures and alignment impact on strategy: the role of business improvement models", International Journal of Operations & Production Management, Vol. 22 No. 9, pp. 972-96.

McKinnon, S.M. and Bruns, W.J. (1992), The Information Mosaic, Harvard Business School Press, Boston, MA.

Maskell, B.H. (1991), Performance Measurement for World Class Manufacturing: A Model for American Companies, Productivity Press, Cambridge, MA.

Medori, D. and Steeple, D. (2000), "A framework for auditing and enhancing performance measurement systems", International Journal of Operations & Production Management, Vol. 20 No. 5, pp. 520-33.


Meyer, M.W. and Gupta, V. (1994), "The performance paradox", in Staw, B.M. and Cummings, L.L. (Eds), Research in Organizational Behavior, Vol. 16, pp. 309-69.

Nanni, A.J., Dixon, J.R. and Vollmann, T.E. (1992), "Integrated performance measurement: management accounting to support the new manufacturing realities", Journal of Management Accounting Research, Vol. 4, pp. 1-19.

Neely, A.D., Gregory, M. and Platts, K. (1995), "Performance measurement system design", International Journal of Operations & Production Management, Vol. 15 No. 4, pp. 80-116.

Neely, A.D., Richards, H., Mills, J., Platts, K. and Bourne, M. (1997), "Designing performance measures: a structured approach", International Journal of Operations & Production Management, Vol. 17 No. 11, pp. 1131-52.

Neely, A.D., Mills, J., Platts, K., Richards, H., Gregory, M., Bourne, M. and Kennerley, M. (2000), "Performance measurement system design: developing and testing a process-based approach", International Journal of Operations & Production Management, Vol. 20 No. 10, pp. 1119-45.

Perera, S., Harrison, G. and Poole, M. (1997), "Customer-focused manufacturing strategy and the use of operations-based non-financial performance measures: a research note", Accounting, Organizations and Society, Vol. 22, pp. 557-72.

Stuart, I., McCutcheon, D., Handfield, R., McLachlin, R. and Samson, D. (2002), "Effective case research in operations management: a process perspective", Journal of Operations Management, Vol. 20, pp. 419-33.

Voss, C., Tsikriktsis, N. and Frohlich, M. (2002), "Case research in operations management", International Journal of Operations & Production Management, Vol. 22 No. 2, pp. 195-219.
