Index

A

Academic databases, 115

Acquisition costs, 119

Action planning, 40

Action plans, 73–76

Analytics, vi

Appirio, 87–88

Application data, 81

Application objectives, 15–16

Assessment(s)

needs-based. See Needs assessment

of organizational climate for achieving results, 153

readiness, 3

Assessment center method, 73

Attitude surveys, 63

Attributes, 22

Audience

communication of program results to, 139–140, 142, 149

examples of, 149

progress reporting for, 160

B

Bad news, 159

Balanced Scorecard, 4

Baseline

building of process for, 25

data collection on, 22, 25

lack of, 23–25

post-then-pre assessment for creating, 24

Behaviorally anchored numerical scale, 69

Benefit-cost ratio (BCR), 123

Block, Peter, 137

Bloom’s Taxonomy, 14–15

Brainstorming, 20

Broad objectives

clarity of, 18–21

example of, 40

SMART objectives versus, 17–18

specific measures for achieving, 21, 40

stakeholder needs and, 21

Budget, 118

Business alignment

case study of, 52–53

model of, 18

Business games, 72

Business measures, 19–20. See also Measure(s)

C

Case studies

description of, 7

of evaluation planning, 51–58

of focus groups, 27–30

of implementation, 163–164

isolating of program effects using, 94

of objectives, 27–30, 53

of reporting of results, 143–149

as simulations, 72

Cause and effect, 102

CEB, 88

Center for Talent Reporting, 140

CEOs, vi

Champion, 153

Change, 36

Classic experimental control group design, 95–97

Clients

buy-in from, for evaluation planning, 50

as data sources, 80

Cognitive processes, Bloom’s taxonomy of, 14–15

Collection of data. See Data collection

Communication of program results

audience’s opinion after, 139–140

cautions in, 139–140

challenges in, 133

consistency in, 143

customizing of, for audience, 142

description of, 46

feedback action plan for, 137–138

guidelines for, 142–143

improvements from, 143

meetings for, 135–136, 142

mode of, 142

neutrality during, 143

planning of, 138–139

political aspects of, 139

recommendations after, 139

testimonials included in, 143

timeliness of, 142

tools for, 136–137, 142

Considerations, 8–9

Cornerstone OnDemand, 88

Coscarelli, Bill, 69, 70

Costs, of program

acquisition, 119

categories for, 118

delivery, 120

design and development, 119

evaluation, 120

needs assessment, 118–119

overhead, 120

worksheet for estimating, 121–122

Credibility, 104

Credible data sources, 41, 51, 78, 86, 106, 127

Criterion-referenced test (CRT), 70

Customer input, for isolating of program effects, 95

D

Darwin, Charles, 106

Data

application, 81

availability of, 81–82

baseline, collection of, 22

failure to use, 159–160

hard, 113–116

impact, 81

learning, 61

monetary conversion of

databases for, 115, 125

description of, 44, 46, 58

expert input used in, 114–115

historical costs for, 114

standard values for, 113–114, 124, 126

techniques for, 113–116

negative, 137

observational, 102

positive, 138

post-program, 81

reaction, 61

soft, 113

Data analysis, 8

Data analysis plan

case study of, 56–58

communication targets, 46

conversion of data into monetary values, 44–46, 58

cost categories, 46

influences, 47

intangible benefits, 46, 50

isolating the effects of program, 43–44

planning of, 50

template for, 45

Data collection

action plans for, 73–76

case study of, 53–56

constraints on, 83

convenience of, 82–83

data sources for, 41

demonstration for, 68–70

evaluation level considerations, 86

failure to use data after, 159–160

focus groups for, 40, 67–68

follow-up sessions for, 77–78

guidelines for, 85–86

high response rates for, 42–44, 50–51, 56

ideal times for, 82

importance of, 61

instruments for, 39–40

interviews for, 40, 67–68

methods of, 39–40

mixed method approach to, 83–85

multiple approaches to, 85–86

objectives used in developing questions for, 39

observation for, 39, 68–70

performance contracts/contracting for, 40–41, 77

performance monitoring for, 41

performance records for, 78

planning of, 38–42, 50

questionnaires for, 40, 62–67

quizzes for, 70

responsibility for, 42

simulations for, 70–73

stakeholder consensus on methods for, 106

surveys for, 40, 62–67

technologies for, 86–90

template for, 39

tests for, 70

timing of, 42, 61–62, 81–83

tracking technologies and devices for, 87–88

Data sources

clients as, 80

credible, 41, 51, 78, 86, 106, 127

description of, 41

direct reports as, 79

external experts as, 80

facilitators as, 79–80

internal customers as, 79

learning and developmental staff as, 80

participants as, 78

participants’ managers as, 78–79

peer groups as, 79

performance records as, 80

sponsors as, 80

Databases, 115, 125

Deceptive feedback cycle, 14

Demonstration, for data collection, 68–70

Design and development costs, of program, 119

Direct reports, 79

E

EBSCOhost, 115

Electrical simulations, 72

Employee grievances, 112

Employee-Customer-Profit Chain, 129

Epicenter, 86

Estimations

errors with, 106, 127

isolating of program effects using, 94, 102–106

Evaluating Learning Impact Certificate Program, 154

Evaluation

barriers to, 161–162

champion for, 153

communication targets of, 46

costs of, 120

criteria for success, 1

data from, failure to use, 159–160

high response rates for, 42–44, 50–51

human capital and, vi

ideal world of, 2

investments in, 8

learning and, 1

myths regarding, 158–159

organization’s policies, procedures, and practices for, 156

planning of. See Evaluation planning

as process improvement tool, 157, 161

purpose of, 36–38

in real world, 2

relevance of, vi–viii

research demand for, v–vi

resistance to, overcoming of. See Resistance, overcoming of

target setting for, 155

task force for, 154

value of, tangible evidence of, 157

Evaluation framework

description of, 4

five-level, 4–5

Evaluation leader, 153–154

Evaluation levels

characteristics of, 62

evaluation characteristics based on, 62

evaluation planning based on, 50–51

objectives and, matching of, 31

observation and, 69

schematic diagram of, 4–5

Evaluation planning

buy-in from client on, 50

case study of, 51–58

data analysis. See Data analysis

data collection. See Data collection

guidelines for, 50–51

for higher levels, 50

measures, 49

objectives, 49

people involved in, 35–36

reasons for, 35

team for, 49

Experts, 114–115

External experts, 80, 114

F

Facilitators

as data sources, 79–80

salaries of, 120

Feedback

360-degree, 79

deceptive cycle of, 14

focus groups for obtaining, 68

learning and development team’s role in, 137

questionnaires for obtaining, 65

reporting of, 137–138

Fitbits, 87–88

Flawless Consulting, 137

Focus groups

case study of, 27–30

data collection using, 40, 67–68

feedback from, 68

program outcomes identified by, 26

ROI measures created from, 26

Follow-up sessions, for data collection, 77–78

Forecasting techniques, for isolating of program effects, 94, 100–102

G

Galton, Francis, 106–108

Generally Accepted Accounting Principles (GAAP), 140

Grievances, 112

Guiding principles, of ROI Methodology, 6–7

H

Handbook of Training Evaluation and Measurement Methods, 4

Hard data

description of, 113–114

soft measures linked to, 115–116

Historical costs, 114

Human capital, vi

I

Impact data, 81

Impact objectives, 15–16

Impact study report, 134–135

Implementation

barriers to, 161–162

case study of, 163–164

champion for, 153

consistency for, 162

description of, 7, 151

efficiency in, 162–163

evaluation targets, 155

importance of, 162

organizational climate for achieving results, 153

plan for, 155–156

progress monitoring, 160

resistance to, overcoming of, 151–152

roles and responsibilities in, 153–155

staff preparation for, 156–157

team members’ involvement in, 156–157

timetable for, 155

Initiating of projects, 157–158

Intangible benefits, 46, 50, 127

Internal customers, 79

Internal experts, 114

Interviews, for data collection, 40, 67–68

Isolating of program effects

case studies for, 94

control group arrangement for

classic experimental design, 95–97

description of, 94

post-program only design, 97–99

credit given after, 93

customer input for, 95

data analysis plan, 43–44

estimations used for, 94, 102–106

forecasting techniques for, 94, 100–102

importance of, 105

methods of, 93

trend line analysis for, 94, 99–100

J

Job simulations, 71

K

Kirkpatrick, Donald, v, 4

Kravitz, Joan, 143–149

L

Leadership Challenge case study, 51–58

Leadership development programs

case study of, 51–58

description of, 37

Learning

evaluation and, 1

job simulations for measuring, 71

Learning and development professionals, 13

Learning and development team

feedback role of, 137

in implementation, 156–157

teaching of, 157

Learning and developmental staff, 80

Learning data, 61

Learning objectives, 14–16

Level 1, 5, 16, 37, 46–47, 54, 80

Level 2, 5, 16, 36–37, 46–47, 54, 71, 80

Level 3, 5, 16, 36–37, 41–43, 49, 55, 65, 73, 78, 81–82

Level 4, 5, 16, 37, 41–43, 55, 65, 73, 81, 111

Level 5, 5, 16, 37, 42–43, 55, 111

LexisNexis, 115

Litigation costs, 114

Lowery, Richard, 163

M

Management

meeting of, 136

preparation of, 158

Managers, as data sources, 78–79, 142

Martinez, Arthur, 128

Measure(s)

business, 19–20

change of performance in, 112

focus groups used to create, 26

as intangible benefits, 46

monetary conversion of

case study of, 127–130

databases for, 115, 125

description of, 44, 46, 58, 112–113

estimations used in, 116–118, 126

expert input used in, 114–115

four-part test for determining whether to convert, 124–126

historical costs for, 114

standard values for, 113–114, 124, 126

techniques for, 113–116

monetary value on, 111–112

for ROI calculation, 25–26

soft, 115–116, 127–130

survey questions used to create, 26

value of, 112

Measurement

human capital and, vi

relevance of, vi–viii

Meetings, for reporting of results, 135–136, 142

Metrics That Matter, 88

Mixed method approach, to data collection, 83–85

Money

data conversion into, 44, 46, 58

measures converted into

case study of, 127–130

databases for, 115, 125

description of, 112–113

estimations used in, 116–118, 126

expert input used in, 114–115

four-part test for determining whether to convert, 124–126

historical costs for, 114

standard values for, 113, 124, 126

techniques for, 113–116

normalizing with, 111–112

Motsinger, Joan, 86

Myths, 158–159

N

Needs assessment

baseline data collection during, 22

case study of, 52–53

costs associated with, 118–119

description of, 18, 21

as program cost, 118–119

target setting after, 22

Nielsen, Chantrelle, 87

Nominal group technique, 20

Norm-referenced tests, 70

O

Objective tests, 70

Objectives

application, 15–16

case study of, 27–30, 53

data analysis based on, 8

definition of, 13

evaluation levels and, matching of, 31

expectations and, 13

function of, 13–14

guidelines for creating, 27

impact, 15–16

lack of, 25–27

learning, 14–16

measurable, 27

powerful, 13–23

reaction, 14, 16

ROI, 15–16

SMART, 17–23

stakeholder needs and, 20

worksheet for developing, 32

Observation, for data collection

checklist for, 68–69

description of, 39, 68–70

Observational data, 102

Organization

business alignment model, 18

business measures of, 19–20

climate in, for achieving results, 153

payoff opportunities for, 18–19

policies, procedures, and practices of, 156

Overcoming of resistance

bad news, 159

building blocks for, 152

goals and plans used in, 155–156

guidelines for, 162–163

initiating of projects, 157–158

management team preparation for, 158

obstacle removal for, 158–160

progress monitoring for, 160

revising of policies, procedures, and practices for, 156

roles and responsibilities developed for, 153–155

staff preparation for, 156–157

Overhead costs, of program, 120

P

Participants

as data sources, 78

managers of, as data sources, 78–79, 142

salaries and benefits of, 120

Payback period, 124

Payoff opportunities for organization

business measures to achieve, 19

description of, 18–19

Peer groups, 79

Performance contracting, 40–41

Performance contracts, 77

Performance monitoring, for data collection, 41

Performance needs, 20

Performance records, 78, 80

Performance testing, 68, 70

Phillips, Jack, 4

Phillips Analytics, 88

Pilot program, ROI forecasting from, 38

Post-program data, 81

Post-program only control group design, 97–99

Post-then-pre assessment, 24

Process models, 5–6

Program

change brought by, 36

follow-up sessions for, 77–78

impact of, techniques for isolating the. See Isolating of program effects

planning of, people involved in, 35–36

reasons for conducting, 161

reporting the results of. See Reporting of results

results of

organizational climate for achieving, 153

reporting of. See Reporting of results

Program costs

acquisition, 119

categories for, 118

delivery, 120

design and development, 119

evaluation, 120

needs assessment, 118–119

overhead, 120

worksheet for estimating, 121–122

Progress monitoring, 160

Progress reports, 160

Project plan, 47–48

PTG, 89

Q

Qualtrics, 89

Quantitative analysis, 106

Questionnaires

data collection using, 40, 62–67

feedback, 65

non-question items on, 63

questions on

content issues included in, 65–66

response choices to, 66

sample types of, 64

writing of, 66

surveys versus, 62–63, 65

writing of, 66–67

Questions

for developing objectives, 27

stakeholder needs clarified through, 20

survey, for creating measures, 26

Quizzes, for data collection, 70

R

RaoSoft, 89

Reaction data, 61

Reaction objectives, 14, 16

Readiness assessment, 3

Real world evaluation

components of, 7

data collection for, 61

description of, 1–2

Recommendations, 139

Regression analysis, 100–101

Report, 133–135

Reporting of results

audience’s opinion after, 139–140

case study of, 143–149

cautions in, 139–140

challenges in, 133

consistency in, 143

feedback action plan for, 137–138

guidelines for, 142–143

improvements from, 143

meetings for, 135–136

neutrality during, 143

planning of, 138–139

reasons for, 133

recommendations from, 139

report used in, 133–135

testimonials included in, 143

tools for, 136–137

Resistance

examples of, 151–152

overcoming of

bad news, 159

building blocks for, 152

goals and plans used in, 155–156

guidelines for, 162–163

initiating of projects, 157–158

management team preparation for, 158

obstacle removal for, 158–160

progress monitoring for, 160

revising of policies, procedures, and practices for, 156

roles and responsibilities developed for, 153–155

staff preparation for, 156–157

presence of, 162

Results, of program

organizational climate for achieving, 153

reporting of. See Reporting of results

Return on investment. See ROI

ROI

description of, 1, 15

forecasting of, 38

Level 4 impact data conversion into money, 111

questionnaires for data collection about, 65

target, 49

ROI calculation

benefit-cost ratio, 123

measures for, 25–26

payback period, 124

ROI percentage, 123–124

ROI Methodology

case studies, 7

definition of, v

development of, 4

evaluation framework, 4–5

guiding principles that support, 6–7

implementation, 7

process models, 5–6

standards, 6–7

ROI objectives

example of, 15–16

function of, 15

setting of, 17

ROI percentage, 123–124

ROINavigator, 88

Role plays, 72

S

Seagate, 86

Sears, 127–130

Senior management meetings, 136

Sequential explanatory design, of mixed method research, 84–85

Sequential exploratory design, of mixed method research, 25, 84

Shrock, Sharon, 68, 70

Simple regression, 100–101

Simulations, 70–73

SMART objectives

attributes for measures, 22

baseline data, 22

broad objectives versus, 17–18

definition of, 17, 36

indicators included in, 17

objectives map for, 23

purpose of, 36

specific measures for achieving, 21

steps for developing, 18–23

target setting, 22

writing of, 22–23

Smith, Kirk, 89

Soft data, 113

Soft measures, 115–116, 127–130

Sponsors, as data sources, 80

Staff meetings, 135

Stakeholder(s)

consensus of, on data collection methods, 106

description of, 1

impact objectives effect on, 15

objectives effect on, 13, 15

value as defined by, 36

Stakeholder needs

broad objectives for describing, 21

business alignment model, 18

clarifying of, 20, 36

payoff opportunities for organization, 18–19

questions for clarifying, 20

Standard value, 113, 124

Standards, list of, 6–7

Statistical process control, 20

Structural equation modeling, 116

Survey Basics, 67

SurveyMonkey, 89

Surveys

attitude, 63

data collection using, 40, 62–67

questionnaires versus, 62–63, 65

questions on, measures created from, 26

writing of, 66–67

T

360-degree feedback, 79

Talent Development Reporting Principles, 140–141

Target audiences

communication of program results to, 139–140, 142, 149

examples of, 149

progress reporting for, 160

Target setting, 22

Task force, for evaluation, 154

Task simulations, 72

Taxonomy, Bloom’s, 14–15

Team, for evaluation planning, 49

Technical simulations, 72

Technology, for data collection, 86–90

Testimonials, 143

Tests, for data collection, 70

Thompson, Arthur A., Jr., 72

Time management, 161–162

Timing, of data collection, 42, 61–62, 81–83

Trend line analysis, 94, 99–100

Triangulation, 81

V

Value

credibility issues for, 125

stakeholder, 36

standard, 113, 124

Virtual reality simulations, 73

VoloMetrix, 86–87
