
    ROI: A Useful Tool for Corporate Learning Evaluation

    Cheri L. Fenton

    Purdue University


    ROI: A Useful Tool for Corporate Learning Evaluation

    Corporate instructional designers are constantly seeking ways to demonstrate the

    effectiveness of training they design. Management and company executives often want

    quantifiable ways to identify and measure the value from training as companies work to

    maximize cost benefits, train and retain employees, and keep company shareholders and boards

    of directors satisfied. According to Pine & Tingley (1993), learning professionals

    …are under increasing pressure to direct their efforts toward satisfying their internal

customers – and many of those customers want to see a measurable, bottom-line impact

    from training. This translates into an effort to tie training directly to the business results

    that management is emphasizing – increased productivity, fewer errors, higher employee

    morale, a stronger bottom line. (p. 56)

    One solution to evaluating training is to analyze the return on investment (ROI) of the

    training. According to Byerly (2005), “An ROI effort measures a single, numerical business

    metric, such as sales revenue or customer satisfaction; considers its financial impact; and

    identifies potential improvements” (para. 1). ROI can be used to forecast the value of training,

    plan for the most cost-effective training when multiple options are present, demonstrate business

    results and effectiveness of training, and guide decisions pertaining to future training (Mattox,

    2011). Phillips (2010a), an advocate of ROI, outlines:

    As executives and managers watch learning budgets grow, there is prevailing frustration

    from the lack of evidence showing that learning…programs can really help performance.

    Sponsors need to know how major investments of time, money, and resources are paying

    off and aligning with strategic business goals. A comprehensive measurement and

    evaluation process represents the most promising approach to meet rising accountability


    challenges. … Trends show that organizations with comprehensive measurement and

    evaluation systems in place have enhanced their program budgets while those without

    comprehensive measurement and evaluation systems have reduced or eliminated their

    program budgets. (para. 2)

    Like other tools, theories, and methods used by instructional designers, ROI has a

    purpose and place in learning and development. This paper seeks to identify conditions for when

    ROI is an appropriate tool for evaluating corporate learning. The use of ROI is explored to

    provide insight to and context for this type of evaluation. Arguments for the use of ROI to

    evaluate corporate training are presented to demonstrate a need for this tool. Arguments against

    the use of ROI as a means to evaluate corporate training are identified. A critical analysis of the

    literature follows, providing insight into appropriate uses of ROI in corporate training evaluation

    and alternatives to ROI. This paper concludes with recommendations for additional study.

    Literature Review

    How ROI is Used to Evaluate Corporate Learning

    Before an instructional designer can determine if ROI is the appropriate tool for corporate

    training evaluation needs, it is imperative to understand the purpose of ROI and how it is used.

    According to Phillips (2010a), “An accurate ROI calculation…requires data collection at four

    levels – reaction, learning, application, and impact. The impact data is isolated from other

    influences and converted to monetary value. This monetary value is then compared to the cost of

    [training]” (para. 4). The costs, savings, and results (impact data) include tangibles and

    intangibles.
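To make this comparison concrete, the sketch below shows the widely used benefit-cost ratio and ROI percentage calculations in Python. The dollar figures are hypothetical and stand in for impact data that has already been isolated to the training and converted to monetary value, as Phillips describes.

```python
# Minimal sketch of the core ROI arithmetic described above.
# All figures are hypothetical; a real study would first isolate the impact
# attributable to training and convert it to monetary value.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """Total monetary benefits divided by total program costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """Net benefits divided by costs, expressed as a percentage."""
    return (benefits - costs) / costs * 100

if __name__ == "__main__":
    program_costs = 80_000       # design, delivery, participant time, evaluation
    monetary_benefits = 200_000  # isolated impact data converted to dollars

    print(f"BCR: {benefit_cost_ratio(monetary_benefits, program_costs):.2f}")  # 2.50
    print(f"ROI: {roi_percent(monetary_benefits, program_costs):.0f}%")        # 150%
```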

    Best practices dictate that instructional designers should determine whether they plan to

    use ROI to evaluate corporate training as the training is being designed (Ellis, 2005). This way,


    instructional designers and stakeholders determine outcomes for measurement in advance and

    link training to the outcomes for appropriate and accurate measurement (Pine & Tingley, 1993).

Pine & Tingley recommend that evaluators determine which levels of evaluation will be used and then work backward to ensure adequate preparation for data collection.

    Corporate instructional designers can find multiple methods to approach ROI. Byerly

    (2005) suggests a five-step method. Steps include agreeing on the ROI strategy and goals,

    selecting evaluation metrics, considering previous group performance, analyzing data, and

    properly presenting findings. A more popular method of calculating ROI, the

    Kirkpatrick/Phillips model, builds a fifth level of evaluation onto the Kirkpatrick model. With

    the Kirkpatrick/Phillips model, financial implications on a company’s bottom line can be

    analyzed in addition to tracking reactions, learning, application, and business results. The

    assumption is that many organizations currently conduct evaluations of their training programs in

    terms of satisfaction, so companies can add another layer to the Kirkpatrick model to obtain ROI

    information in the evaluation process (Phillips, 1996). See Appendix A for figures and tables

    associated with the Kirkpatrick/Phillips model.
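For readers unfamiliar with the model, the short sketch below lists the five levels in data form; the descriptions paraphrase this paper's summary rather than an official taxonomy.

```python
# Illustrative summary of the five evaluation levels discussed above
# (Kirkpatrick Levels 1-4 plus Phillips's Level 5, ROI). Descriptions are
# paraphrased from this paper, not quoted from either model's authors.

EVALUATION_LEVELS = {
    1: "Reaction: participant satisfaction with the training",
    2: "Learning: knowledge and skills acquired",
    3: "Application: behavior change on the job",
    4: "Business results: impact, isolated from other influences",
    5: "ROI: impact converted to monetary value and compared to program costs",
}

for level, focus in EVALUATION_LEVELS.items():
    print(f"Level {level}: {focus}")
```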

    Arguments for Using ROI to Evaluate Corporate Learning

    Many arguments advocating the use of ROI as an evaluation tool for corporate learning

    have been presented in recent decades. Primarily, ROI is used and requested by corporate leaders

    because it can demonstrate cost effectiveness, unlike many evaluation types. ROI resonates with

    corporate leaders who seek quantifiable data to justify training costs (Mattox, 2011; Phillips &

    Phillips, 2011). In addition, ROI is a concept that company leaders and shareholders understand.

    It takes into account tangible, intangible, and hidden costs, such as loss of productivity and

    employee turnover, which evaluators do not always account for when considering the


effectiveness and value of training. Translating training results into ROI calculations helps management understand the financial impact of training and determine how much they are willing to invest in training teams and projects.

Considering the arguments for the use of ROI, it is not surprising that Mattox (2011) states that “ROI is often viewed as the ultimate measure of effectiveness…” (p. 30). It is also not surprising

    that Phillips & Phillips (2011) were able to cite results from a 2009 Fortune 500 CEO survey

    indicating that 74 percent of top executives surveyed wanted to see ROI evaluations from

    learning and development (p. 35).

    Arguments Against Using ROI to Evaluate Corporate Learning

    Some authors and learning professionals support the use of ROI to evaluate corporate

    training while others call attention to drawbacks and recommend alternatives. Common

    arguments indicate there are four main areas of concern with the use of ROI: ROI lacks

    credibility and is difficult to measure, ROI is costly and time consuming to conduct, ROI is

    irrelevant to stakeholders, and ROI is outdated (Ellis, 2005; Hassett, 1992; Jacobs, 2011; Mattox,

    2011; McGeough, 2011; Taylor, 2007).

    Credibility and measurement. While Phillips & Phillips (2011) disagree, detractors

    indicate that it is difficult to isolate the effects of training alone in order to precisely report ROI

findings. Hassett (1992) notes external factors, such as the economy and world events, that make isolating training effects difficult. According to Hassett, “…the results [of ROI] are never entirely unambiguous because it is so difficult to unravel the effects of training from other variables” (pp. 54-55). Because it is difficult to attribute quantitative data solely to training, estimates are often used, which can make ROI evaluations less credible and less accurate.


    Still, Phillips & Phillips (2011) claim that estimates can be credible and recommend

using participant estimates as a best practice, especially when other methods of calculation are not possible. Mattox (2011) agrees, noting that having learners estimate training impact and the related improvements in their ability to perform allows for a downward adjustment, creating conservative and more credible data isolated to training alone. Phillips & Phillips refer

    to this process as the “confidence” factor, noting that estimates are made meaningful when they

    come “…from the most credible source of data – the [participants]” (p. 37).
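A brief sketch of how such participant estimates could be applied is shown below. The isolation and confidence percentages are hypothetical, and the calculation is a simplified illustration of the conservative, downward adjustment described above rather than the exact procedure published by Phillips & Phillips.

```python
# Simplified illustration of adjusting a participant-reported improvement with
# isolation and confidence estimates. Figures are hypothetical.

def adjusted_benefit(annual_improvement: float,
                     isolation_pct: float,
                     confidence_pct: float) -> float:
    """Scale a reported monetary improvement by the share the participant
    attributes to training (isolation) and by how confident the participant
    is in that estimate (confidence). Both factors push the figure downward,
    making the final number more conservative."""
    return annual_improvement * isolation_pct * confidence_pct

# A participant reports a $50,000 annual improvement, attributes 60 percent of
# it to the training, and is 80 percent confident in that estimate.
print(adjusted_benefit(50_000, 0.60, 0.80))  # 24000.0
```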

    Costs and time. Detractors note concerns associated with the ROI analysis itself (Ellis,

2005; Jacobs, 2011; Mattox, 2011; Taylor, 2007). According to Taylor, complete and thorough ROI studies waste time and resources, and people request ROI analyses only to force a discussion about performance problems or because they do not understand how to evaluate training’s value in an organization. Ellis seems to agree, noting that “…tight budgets and poor pre-solution data often create an environment where ROI takes a back seat” (para. 15).

    While it may be difficult for instructional designers with limited resources and budgets to

    attempt a full ROI evaluation in addition to other duties, advocates of ROI seek ways to make

    this type of evaluation work. As an example, Anderson (2003) outlines a study in which a power

company used ROI to determine the business benefits of sales performance training. Due to time constraints, “…ROI analysis relied on existing data” (para. 2).

    Relevancy to stakeholders. Jacobs (2011) claims that ROI is not demanded by company

    executives and cites a 2006 study by ASTD and IBM that indicated company leaders evaluate

learning more by perception than by quantitative metrics. Jacobs noted that “word-of-mouth support” for training is commonly the measure of effectiveness that trainers deem most important. However, satisfaction is one area of evaluation, and a quantitative demonstration of bottom-line results is another, as indicated by their separate places in evaluation methods such as the Kirkpatrick/Phillips model. A positive view of training does not necessarily translate to

    improvements in learning, performance, or business results.

    Hassett (1992) notes that because ROI is time-consuming to conduct, results can become

    irrelevant before they are presented. According to Hassett,

    …the most important training to evaluate is not last year’s, it is next year’s. That’s the

    program you will go ahead with or cancel. And that’s the program that will affect the

    bottom lines you care about most: this year’s and next year’s. (p. 55)

Phillips & Phillips (2011) counter, however, that the inability to demonstrate

    training’s contribution may lead training departments to “…lose support, influence, commitment,

    and yes, funding” (p. 36).

    ROI is outdated. According to Jacobs (2011), “ROI for training is an ‘old technology’

    used by older leaders or maybe just old ‘thinking’ leaders” (para. 1). Jacobs cites a 2009 study

    conducted by the ROI Institute that discovered only four percent of 96 companies surveyed

    measured ROI. The study also outlined that “58 percent of training managers are not required to

    report on effectiveness” and “69 percent are not required to report on productivity” (para. 5).

    Mattox (2011), however, disagrees with Jacobs’s claim that ROI is outdated, citing:

    ROI is still alive and relevant. ROI has many more vital years to live. It is an excellent

    measure of cost effectiveness, and it resonates well with business leaders. The C-suite

    cares about investments and outcomes, not knowledge gain or satisfaction scores. ROI

    provides senior leaders with information about where their investments will produce the

    most benefit for the business. (p. 33)


    Critical Analysis

    When ROI is the Appropriate Tool

Given the benefits and concerns surrounding the use of ROI to evaluate corporate learning, instructional designers must consider evaluation needs and select the best tool for the

    situation. Different training projects have different goals and evaluation needs. For example, if a

    company’s need is to reduce turnover or increase efficiency, perhaps ROI is not appropriate.

    Still, ROI should not be discounted for situations in which it would provide relevant and critical

    evaluation data.

    McGeough (2011) indicates only 15 to 20 percent of his company’s training receives ROI

analysis. He explains:

    Our thought is if you try to put too much training under the ROI microscope, you spend

    too much time chasing numbers to justify programs. … [ROI] must be reserved for select

    programs; it should be looked at as one more tool to add to the other measures, such as

    surveys, return on value, and balanced scorecard, used to evaluate training. (para. 2)

    Others agree that constant measurement is not cost effective (Ellis, 2005; Hassett, 1992).

    Instructional designers should contemplate setting limits on the number of ROI analyses and

    other types of evaluation. Phillips (1996) indicates that some organizations prefer to set a target

    for each of the five levels of evaluation in the Kirkpatrick/Phillips model. Typically,

    organizations require Level 1 evaluation (satisfaction) for 100 percent of programs, 40 to 70

    percent at Level 2 (learning), 30 to 50 percent at Level 3 (application), 10 percent for Level 4

    (business results), and perhaps five percent for Level 5 (ROI). According to Phillips (1996),

the advantages of setting evaluation targets include measurable and focused goals for assessing training, a focus on accountability, and a message about the importance of measurement and

    evaluation to others in the organization.
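To show how such targets translate into an evaluation plan, the sketch below applies example percentages (chosen from within the ranges Phillips describes) to a hypothetical portfolio of 60 programs.

```python
# Hypothetical planning sketch: number of programs evaluated at each level of a
# 60-program portfolio, using example targets within the ranges Phillips (1996)
# describes. The portfolio size and exact percentages are illustrative.

portfolio_size = 60
targets = {  # level: fraction of programs evaluated at that level
    "Level 1 (satisfaction)": 1.00,
    "Level 2 (learning)": 0.55,
    "Level 3 (application)": 0.40,
    "Level 4 (business results)": 0.10,
    "Level 5 (ROI)": 0.05,
}

for level, fraction in targets.items():
    print(f"{level}: {round(portfolio_size * fraction)} programs")
```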

    Jack Phillips, founder and president of the ROI Institute, promotes reserving ROI analysis

    for “programs with a great deal of visibility, interest from management, or strong ties to the

    company’s strategic objectives” but not “task-oriented or technical training” (Ellis, 2005, para.

9). According to Jack Phillips (as quoted in Ellis):

    …good candidates for the ROI level of evaluation include programs that are:

    Focused on an operational issue, such as solving a quality bottleneck.

    Targeted to a company-wide strategy, such as enhanced customer service.

    Expensive. Some companies find it helpful to develop a decision tree based on a cost

    factor.

    Highly visible. An ROI evaluation may turn critics into advocates.

    Of particular interest to management.

    Attended by a large audience. …

    Permanent…. (para. 30)

    More specifically, ROI may be suitable for large-scale strategic initiatives (Ellis), certification

programs (Ellis), comparisons of the cost effectiveness of delivery methods (“How to,” 2002), or

    a series of courses rather than multiple evaluations of single courses in a curriculum (Phillips,

    1996).
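As a rough illustration of how these criteria, including the cost-based decision tree Ellis mentions, might be operationalized, the sketch below screens a program against several of the listed attributes. The threshold values and field names are hypothetical assumptions, not published guidance from Phillips or Ellis.

```python
# Hypothetical screening sketch: flag programs that look like strong candidates
# for a full ROI study based on the attributes listed above. Thresholds and
# field names are illustrative assumptions, not published criteria.

from dataclasses import dataclass

@dataclass
class Program:
    name: str
    cost: float           # total program cost
    strategic: bool       # tied to a company-wide strategy or operational issue
    highly_visible: bool  # visible to, or of particular interest to, management
    audience_size: int    # expected number of participants
    permanent: bool       # ongoing rather than one-off

def roi_candidate(p: Program, cost_threshold: float = 100_000) -> bool:
    """Return True if the program exhibits several of the listed attributes."""
    signals = [
        p.cost >= cost_threshold,
        p.strategic,
        p.highly_visible,
        p.audience_size >= 500,
        p.permanent,
    ]
    return sum(signals) >= 3

print(roi_candidate(Program("Customer service curriculum", 250_000, True, True, 1200, True)))  # True
print(roi_candidate(Program("One-off tool walkthrough", 8_000, False, False, 40, False)))      # False
```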

    While many considerations for using ROI come from the nature of training and the

    amount of evaluation, other factors should play into the decision to use this evaluation type.

    Instructional designers should consider what is driving ROI evaluation, be that internal concerns

    over evaluation or requests from management. Hassett (1992) cites an article stating the


    importance of company leaders supporting learning initiatives. If training teams are constantly

struggling for buy-in, concerns outside of ROI may need to be addressed before any training

    development or evaluation begins or continues.

    Alternatives to ROI. As with any tool in an instructional designer’s set, there are

alternatives to consider when selecting the appropriate evaluation tool. Detractors of ROI provide alternative methods of evaluation, some of which include quantitative measures. It is

    possible that ROI is not required when, for example, stakeholders are only looking at other

    measures, such as employee turnover and faster project completion rates, to evaluate training

    (Taylor, 2007). Some detractors also look to the Kirkpatrick model for inspiration while others

turn to different styles of evaluation entirely. Two alternatives to ROI include ROE and Training

    Investment Analysis.

    Jacobs (2011) recommends using return on expectations (ROE) in place of ROI, but in

    fairness, the two measurements do not attempt to evaluate the same things. He claims that ROE

    “is a collaborative, proactive, and customer-oriented way of ensuring the training delivered and

    the training expected is in synch with your customer” (para. 27). Jacobs’s ultimate argument is to

look to the customer to see what expectations exist for the training and its evaluation. Still, it is

    possible for ROI to be a part of an expectations-driven evaluation.

    As a part of ROE, Jacobs (2011) recommends the use of a “Results Contract.” This tool

    identifies what is important to stakeholders and allows them to rate satisfaction levels with or

    without a formulated, quantitative measure. The Results Contract contains a scorecard – another

    evaluation tool – to align business goals to performance objectives and training results. Refer to

    Table C2 for a sample scorecard. The Results Contract, which asks management and attendees to


    commit to attending and participating in the training, could be used in a variety of training

    projects, regardless of associated evaluation.

    Hassett (1992) is an advocate of the Training Investment Analysis, which can also

    provide quantitative evaluation data to aid accountability for training initiatives. He claims that

this approach is particularly suited to “situations in which time and money are severely limited”

    (p. 57). This “modest four-step procedure” can provide an alternative to ROI, helping

    instructional designers calculate “a simple, straightforward estimate of the impact of any training

    program on your organization’s bottom line” (p. 53). While this tool utilizes estimates rather than

    hard numbers, instructional designers can involve decision makers in creating estimates so

    results are more credible. See Table C1 for a sample Training Investment Analysis worksheet.
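The sketch below shows one way an estimate-driven calculation in this spirit might be set up, with decision makers supplying rough figures; the variable names and steps are illustrative assumptions and do not reproduce Hassett’s published worksheet (see Table C1 for the original).

```python
# Illustrative estimate-based impact calculation: decision makers supply rough
# figures, and the result is a quick, conservative view of net impact.
# This is an assumption-laden sketch, not Hassett's actual worksheet.

estimated_annual_benefit = 120_000  # decision makers' estimate of yearly value
confidence_in_estimate = 0.7        # their confidence that training drives the benefit
training_cost = 45_000              # total estimated cost of the program

expected_benefit = estimated_annual_benefit * confidence_in_estimate
net_impact = expected_benefit - training_cost

print(f"Expected benefit: ${expected_benefit:,.0f}")  # $84,000
print(f"Net impact:       ${net_impact:,.0f}")        # $39,000
```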

    Recommendations for Further Study

    There are four main areas of recommendation for further study on the topic of using ROI

    to evaluate corporate learning. First, it is important to continue to evaluate ROI and other

    evaluation methods as tools, technology, and methods change. Some of the research cited in this

    paper is approximately 20 years old. In a field as new as instructional design, this research could

    be considered outdated. As Phillips (1996) indicates, “Practitioners and researchers must

    continue to refine the techniques and show successful applications” (p. 47).

    Second, this paper does not contain case studies, works, and recommendations directly

    from the ROI Institute. “Since 1995, more than 3,000 professionals have been awarded the

    Certified ROI Professional designation” from the ROI Institute (Phillips & Phillips, 2011, p. 37).

    Due to its focus and growth in past decades, the ROI Institute could provide additional insight

    into current trends, best practices, and detailed methods for accurately representing ROI for

    evaluation purposes in corporate learning.


    Third, this paper does not begin to reflect upon the impact of ROI studies on specific

    learning and training teams or in specific industries or environments. Corporate instructional

    designers should conduct further analysis appropriate for their role, company size, and industry

    to pinpoint best practices and recommendations specific to them.

The fourth and final area of recommendation for further study relates to professional responsibilities associated with evaluating training and collecting or estimating data. In any method of evaluation, ethical practices must be followed to ensure that reported information is accurate. ROI results can be used in key corporate decisions to enact change and set organizational direction. Evaluators should therefore ensure that the methods used to calculate ROI are appropriate and that ROI results can be validated.

    Conclusion

In a world in which corporate leaders seek to identify and measure the value of training, corporate instructional designers are becoming more involved with evaluating and justifying training initiatives. ROI, one solution to the need to evaluate and justify training, has been presented in this paper.

While many advocates of ROI recommend the use of this tool, corporate instructional designers should not treat ROI – or any type of evaluation – as an “all or nothing” proposition. ROI has its place in training and development, and instructional designers must identify the conditions under which ROI can demonstrate value and meet corporate training evaluation needs.


    Appendix A

    Appendix A highlights figures and tables related to the Kirkpatrick/Phillips model of evaluation.

    Figure A1

    Questions Pertinent to the 5-Level Kirkpatrick/Phillips Model

    Note. Figure A1 is from Phillips (1996, p. 43). This figure outlines questions associated with a

    five-level Kirkpatrick/Phillips evaluation to demonstrate the thought process for instructional

    designers attempting evaluation using the Kirkpatrick/Phillips model and ROI. The

    Kirkpatrick/Phillips model leverages the four levels of the Kirkpatrick evaluation model and

    adds to it Jack Phillips’s fifth level of ROI.


    Figure A2

    Jack Phillips’s Model for Calculating ROI

    Note. Figure A2 is from Phillips (1996, p. 46). This figure identifies the key actions involved in

    an ROI calculation.

    Figure A3

    ROI Institute Model for ROI

    Note. Figure A3 is from Phillips & Phillips (2011, pp. 38-39). This figure identifies the ROI

    process and key actions involved in an ROI calculation.


    Table A1

    Sample Data Collection Plan for a Kirkpatrick/Phillips Evaluation

    Note. Table A1 is from Phillips (2010d, pp. 347-348). This table identifies ways in which a

    company can plan for a 5-Level evaluation including ROI.


    Table A2

    Sample Use of Isolation and Confidence Data to Adjust ROI Calculations

    Note. Table A2 is from Phillips (2010f, p. 365). This table identifies an example of how isolation

    data and confidence ratings were obtained from learners and used to average and adjust ROI

calculations. This adjustment makes ROI calculations more conservative and aids credibility.


    Appendix B

    Appendix B contains case studies and samples from ROI analyses.

    Table B1

    Outline of Case Studies Used in Jack Phillips’s 1994 Research

Note. Table B1 is from Phillips (1996, p. 45). This table exemplifies that ROI analysis can be conducted by a variety of companies and in multiple settings and industries. The evaluation process used

    and ROI results are also provided.


    Table B2

    Sample Objectives Associated with a Kirkpatrick/Phillips Evaluation

    Note. Table B2 is from Phillips (2010b, p. 25). This table demonstrates planning for evaluation

    and ROI by tying objectives to each level of measurement to forecast outcomes and create

    expectations for the evaluation.


    Table B3

    Using Action Plan Estimates to Calculate ROI Sample

    Note. Table B3 is from Phillips (2010c, p. 121). This table illustrates a sample project

    demonstrating the use of confidence estimates and isolating data to training impact for an

    appropriate ROI calculation.


    Table B4

    Leveraging Evaluation Results to Determine Course Impact and Recommendations

    Note. Table B4 is from Phillips (2010g, pp. 369-370). This table illustrates a sample course

    impact study to explain and contextualize evaluation results for a maintenance course.


    Appendix C

    Appendix C provides samples from ROI alternatives.

    Table C1

    Training Investment Analysis

    Note. Table C1 is from Hassett (1992, p. 57). This table displays a sample worksheet for

    conducting a Training Investment Analysis, an alternative to ROI.


    Table C2

    Training Scorecard

    Note. Table C2 is from Phillips (2010e, p. 352). This table provides insight into a scorecard some

    consider an alternative to ROI. In this table, however, the scorecard contains elements of the

Kirkpatrick/Phillips model, including ROI.


    References

    Anderson, M. C. (2003). ROI on the fly: Using existing data to determine ROI. ASTD. Retrieved

    from http://www.astd.org/Publications/Newsletters/ASTD-Links/ASTD-Links-

    Articles/2003/07/ROI-on-the-Fly-Using-Existing-Data-to-Determine-ROI.aspx

    Byerly, W. B. (2005). A look at ROI strategy. ASTD. Retrieved from

    http://www.astd.org/Publications/Newsletters/ASTD-Links/ASTD-Links-

    Articles/2005/02/A-Look-at-ROI-Strategy.aspx

Ellis, K. (2005). What's the ROI of ROI? Training, 42(1), 16-21. Retrieved from

    http://web.ebscohost.com.ezproxy.lib.purdue.edu

    Freer, K. (2011). ROI power in educating colleagues. ASTD. Retrieved from

    http://www.astd.org/Publications/Newsletters/ASTD-Links/ASTD-Links-

    Articles/2011/07/ROI-Power-in-Educating-Colleagues.aspx

    Hassett, J. (1992). Simplifying ROI. Training, 29(9), 53-53. Retrieved from

    http://search.proquest.com/docview/203369145?accountid=13360

    How to compute ROI for online vs. traditional training. (2002). HR Focus, 79(4), 10. Retrieved

    from http://web.ebscohost.com.ezproxy.lib.purdue.edu (6415091)

    Jacobs, S. A. (2011). ROI is DOA: How ROE is a better measure of training success than ROI: A

    results contract can prove it. ASTD. Retrieved from

    http://www.astd.org/Publications/Newsletters/ASTD-Links/ASTD-Links-

    Articles/2011/05/ROI-Is-DOA-How-ROE-Is-a-Better-Measure-of-Training-Success-

    Than-ROI-a-Results-Contract-Can-Prove-It.aspx

    Mattox, J. (2011, August). ROI: The report of my death is an exaggeration. T+D, 30-33.


    McGeough, D. (2011). Measuring ROI. Training, 48(2), 27. Retrieved from

    http://web.ebscohost.com.ezproxy.lib.purdue.edu

Phillips, J. J. (1996). ROI: The search for best practices. T+D, 50(2), 42-47. Retrieved from

    http://search.proquest.com/docview/227016875?accountid=13360

    Phillips, J. J. (2010a). Calculating the ROI of e-learning. ASTD. Retrieved from

    http://www.astd.org/Publications/Newsletters/ASTD-Links/ASTD-Links-

    Articles/2010/12/Calculating-the-ROI-of-E-Learning.aspx

    Phillips, J. J., & Phillips, P. P. (2011, August). Moving from evidence to proof: New directions

    for the way we think about metrics. T+D, 34-39.

    Phillips, P. P. (2010b). Table 2-8: Levels 1-5 objectives for a software implementation project

    [Table]. ASTD handbook of measuring and evaluating training (25). Alexandria, VA:

    ASTD Press.

    Phillips, P. P. (2010c). Table 8-2: Case example: Using action plan estimates to measure

    business impact and ROI [Table]. ASTD handbook of measuring and evaluating training

    (121). Alexandria, VA: ASTD Press.

    Phillips, P. P. (2010d). Table 25-3: Data collection plan [Table]. ASTD handbook of measuring

    and evaluating training (347-348). Alexandria, VA: ASTD Press.

    Phillips, P. P. (2010e). Table 25-5: Training scorecard [Table]. ASTD handbook of measuring

    and evaluating training (352). Alexandria, VA: ASTD Press.

    Phillips, P. P. (2010f). Table 26-1: Average performance ratings by participants [Table]. ASTD

    handbook of measuring and evaluating training (365). Alexandria, VA: ASTD Press.


    Phillips, P. P. (2010g). Table 26-5: Basic maintenance course impact study logic map [Table].

    ASTD handbook of measuring and evaluating training (369-370). Alexandria, VA:

    ASTD Press.

    Pine, J., & Tingley, J. C. (1993). ROI of soft-skills training. Training, 30(2), 55-60. Retrieved

    from http://search.proquest.com/docview/203404658?accountid=13360

    Taylor, D. (2007). ROI - is it any use? Training Journal, 12. Retrieved from

    http://search.proquest.com/docview/202949853?accountid=13360