20

The Criticality Approach to Content Selection

For the purposes of instructional design, criticality is defined as a purposeful process for content selection and inclusion guided by criteria that contextually classify objectives into one of four areas: critical, essential, prerequisite, and unnecessary.

Operationally, this means a designer can take a set of objectives or skills and work with SMEs to determine which content truly needs to be included in a course or series of courses. This is useful because designers often need to reduce the quantity of content to fit the available implementation time, stay within a budget, or satisfy similar constraints. No instructional designer working with SMEs or in an unfamiliar content area wants to thin content without an accepted approach that ensures decisions are based on input from key design partners.

A recommended approach, which can be adapted to a variety of situations and expanded as the complexity of the content demands, rates content on two factors: importance (criticality) and frequency of use. The logic of the approach lies in rating each skill or objective on these criteria; each objective can then be compared against the others, and a simple ranking identifies the most critical objectives. The process is efficient enough that it nearly makes the decisions for the designer, and the collected data can also serve SMEs or a design group as a basis for discussion.

Criticality Basics

The steps for implementing the criticality rubric are as follows:

1.  Write objectives—Create draft objectives for all content.

2.  Select level of criticality—Assign a criticality value to each objective.

3.  Select frequency of application—Assign a frequency value to each objective.

4.  Combine criticality and application data—Combine the values for criticality and frequency for each objective.

5.  Determine ranking—Rank the objectives by their combined values.

6.  Calculate disposition—Determine the final disposition for each objective.

The next sections explain how each rubric works and how the criticality rankings are formed.

Step 1: Write Objectives

This is the easiest part of the process because a designer already knows how to write objectives. For the purposes of criticality, these can even be draft objectives that only contain the behavior that will be used for ranking.

Step 2: Select Level of Criticality

Create a criticality rubric for rating each objective or content element based on its level of criticality within the course, using one of the following categories: critical, essential, prerequisite, or unnecessary.

Although designers may develop their own definitions of these classifications, the descriptions shown here will be helpful.

Critical

Critical objectives are those that cannot under any circumstance be omitted from the course. The following are among them:

• mandated critical: objectives involving required legal or technical content

• performance critical: objectives that because of the severity of the consequences of omission or poor performance must be included

• organizational critical: required for reasons other than mandate or performance, such as internal political issues, policy, or practice.

Essential

Essential objectives are those objectives that are not critical but are required for a thorough course in the content area. Note these examples:

• skill steps: detailed content or procedures on a particular skill or concept

• objective domain specific: objectives that satisfy a required learning domain, such as teaching learners the specifics of a skill rather than just providing an overview.

Prerequisite

Prerequisite objectives cover content that is sometimes marginal in terms of necessity for implementation but is useful as background information or for ensuring learning conformance with prerequisites. Examples include the following:

• relevant policy, practice, or organizational procedures: setting the background

• skills review: equations, safety rules

• adjunct information: background readings, history, or added detail on a topic.

Unnecessary

Unnecessary objectives are just about every other objective there may be; they are not always easy to eliminate. This classification can cause distress in group or team settings where decisions need to be made to shorten a course and issues other than content are involved. For example:

• unrelated content: information that just is not necessary, connected, or useful

• loyalty content: content included out of organizational loyalty, such as a video of the organization’s president that adds nothing to the course

• political content: objectives that are clearly meant to aid organizational presence but offer little or no tangible content.

Each objective is ranked by level of criticality, and that rank is assigned a numerical value; for example, you could use a scale of 3 to 5 (prerequisite = 3, essential = 4, critical = 5) and then enter the value for each objective in the corresponding cell. Table 20-1 shows how a typical rubric looks when rating four objectives for level of criticality; how these numbers work together in the ranking will become clear shortly (Table 20-1).

Table 20-1. Rating Four Objectives for Level of Criticality

For this set of four objectives, objective 2 is rated as critical, objectives 1 and 3 are rated as prerequisite, and objective 4 is rated as essential.
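The ratings in Table 20-1 can be recorded as a simple mapping. This is only a sketch: the specific numeric values (prerequisite = 3, essential = 4, critical = 5, with unnecessary scored as 0) are an assumed reading of the 3-to-5 scale, and any scale a design team agrees on would work the same way.

```python
# Assumed numeric scale for criticality: prerequisite = 3,
# essential = 4, critical = 5; "unnecessary" falls below the range.
CRITICALITY_SCALE = {
    "unnecessary": 0,
    "prerequisite": 3,
    "essential": 4,
    "critical": 5,
}

# Ratings from Table 20-1: objective 2 is critical, objectives 1 and 3
# are prerequisite, and objective 4 is essential.
criticality = {
    1: CRITICALITY_SCALE["prerequisite"],  # 3
    2: CRITICALITY_SCALE["critical"],      # 5
    3: CRITICALITY_SCALE["prerequisite"],  # 3
    4: CRITICALITY_SCALE["essential"],     # 4
}
print(criticality)  # {1: 3, 2: 5, 3: 3, 4: 4}
```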

Step 3: Select Frequency of Application

How frequently an objective is most likely to be used in actual practice is determined through review by SMEs or others in a position to make that determination. Any scale that fits your needs can be used; for example, you could use the following frequency classifications: daily, weekly, monthly, quarterly, yearly, and never (Figure 20-1).

Figure 20-1. Illustration of Frequency of Application

Now complete Table 20-2, which measures the relative value of specific objectives based on the frequency of application of the content by a learner. This value may be entered for each individual potential learner or averaged over a population, assuming that each individual learner may have somewhat different frequency-of-use requirements. A scale of 0 to 5 can be used for values in this rubric. This is how four objectives might look in a typical frequency of use rubric.

Table 20-2. Measuring Frequency of Application in Objectives

For this rubric, four objectives have been rated for frequency of use, with objectives 1 and 3 rated as monthly, objective 2 rated as quarterly, and objective 4 rated as daily.
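The frequency ratings in Table 20-2 can be recorded the same way. The mapping of the six classifications onto the chapter's 0-to-5 scale (never = 0 through daily = 5) is an assumption for illustration:

```python
# Assumed 0-to-5 scale matching the six frequency classifications.
FREQUENCY_SCALE = {
    "never": 0,
    "yearly": 1,
    "quarterly": 2,
    "monthly": 3,
    "weekly": 4,
    "daily": 5,
}

# Ratings from Table 20-2: objectives 1 and 3 are monthly,
# objective 2 is quarterly, and objective 4 is daily.
frequency = {
    1: FREQUENCY_SCALE["monthly"],    # 3
    2: FREQUENCY_SCALE["quarterly"],  # 2
    3: FREQUENCY_SCALE["monthly"],    # 3
    4: FREQUENCY_SCALE["daily"],      # 5
}
print(frequency)  # {1: 3, 2: 2, 3: 3, 4: 5}
```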

Step 4: Combine Criticality and Application Data

It is now time to combine the criticality and frequency of application data into one rubric, providing another dimension for comparison of objectives. It is important to add this additional data rubric because critical objectives that are used most often by learners will trend toward the high range on this scale. Similarly, less important objectives that are seldom if ever used by learners trend toward the bottom of the scale.

Combine the criticality and frequency ratings and place that number in the appropriate cell of Table 20-3. This is what the rubric will look like before all the data are entered.

Table 20-3. Example for Combining Criticality and Application Data

Step 5: Determine Ranking

For ranking, use a rubric that takes the data from all of the combined criticality and frequency rubrics and places them on one ranking rubric (Table 20-4). This process provides a clear and concise placement for each objective and makes the process of prioritizing objectives much easier.

Table 20-4. Rubric for Determining Ranking

Objective Ranking
1 6
2 7
3 6
4 9

Now there is a clear picture of not only how the criticality process works, but also how each of the four objectives ranked in the review. Objective 4 cannot be deleted, and objective 2 should probably be kept. Objectives 1 and 3 are the lowest rated and therefore give a designer some options.
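Steps 4 and 5 reduce to adding each objective's two ratings and sorting the totals. A minimal sketch, assuming the scale values suggested earlier (prerequisite = 3, essential = 4, critical = 5; frequency from never = 0 to daily = 5):

```python
# Ratings for the four objectives from Tables 20-1 and 20-2,
# using the assumed scales (prerequisite = 3, essential = 4,
# critical = 5; frequency never = 0 through daily = 5).
criticality = {1: 3, 2: 5, 3: 3, 4: 4}
frequency = {1: 3, 2: 2, 3: 3, 4: 5}

# Step 4: combine the two ratings for each objective.
ranking = {obj: criticality[obj] + frequency[obj] for obj in criticality}
print(ranking)  # {1: 6, 2: 7, 3: 6, 4: 9} -- matches Table 20-4

# Step 5: order objectives from most to least critical.
ordered = sorted(ranking, key=ranking.get, reverse=True)
print(ordered)  # [4, 2, 1, 3] -- objective 4 ranks highest
```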

Step 6: Calculate Disposition

Now is the time to make the final decisions about objectives and to chart those decisions in this rubric (Table 20-5). This rubric, based on the format of the last table, will probably be the final worksheet needed for preparing the design plan. Different disposition options can be listed; it is up to the designer to decide which options are appropriate choices for this rubric.

Table 20-5. Rubric for Calculating Disposition
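Because the disposition options and their cutoffs are left to the designer, any rule that maps a combined ranking to a decision will do. The thresholds below are purely illustrative assumptions, not values from the rubric:

```python
def disposition(ranking_value, keep_at=7, optional_at=5):
    """Map a combined ranking to a disposition.

    The cutoff values are illustrative assumptions only; a designer
    would set them to fit the available implementation time.
    """
    if ranking_value >= keep_at:
        return "include"
    if ranking_value >= optional_at:
        return "include if time permits"
    return "move to prework or omit"


# Combined rankings from Table 20-4: objectives 2 and 4 are kept,
# while objectives 1 and 3 remain options for the designer.
rankings = {1: 6, 2: 7, 3: 6, 4: 9}
dispositions = {obj: disposition(r) for obj, r in rankings.items()}
print(dispositions)
```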

Putting Criticality to Work

Now work through the operation of the criticality rubric system using a real-world example. A designer must make critical content decisions for a course entitled Safety in the Workplace for first-year apprentices in the building trades.

For final disposition in a four-hour course, the designer must make determinations concerning the following terminal objectives. For this exercise, draft objective wording will be used because the design process is in the early stages and the objectives are not yet formalized into four parts (A-B-C-D).

• Objective 1: should be able to state OSHA regulation(s) concerning hard-hat use

• Objective 2: should be able to demonstrate recommended hard-hat use

• Objective 3: should be able to list union and company policy on hard-hat use

• Objective 4: should be able to cite possible types of head injury resulting from failing to use a hard hat

• Objective 5: should be able to list sources for purchasing hard hats.

Work through the criticality rubric using the five objectives. Table 20-6 represents a possible scenario for these objectives.

Table 20-6. Criticality Rubric Analysis of Five Objectives

The results may be arguable, but it is clear that there is a pattern of criticality based on these choices. Objective 2 is critical, objective 4 is essential, and the remaining objectives (1, 3, and 5) are prerequisite. What this tells an instructional designer is that there is a clear road map to use for gauging inclusion of objectives. If there is time for implementing all five objectives, the designer is fine. If there is time for implementing two objectives, there is also a clear guide to follow—objectives 2 and 4 will be included, and the remaining objectives would be prerequisites or pre- or post-course reading assignments.
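The first-pass triage in Table 20-6 can be sketched by sorting on criticality alone. The numeric values are the assumed scale from earlier (prerequisite = 3, essential = 4, critical = 5):

```python
# Criticality ratings from Table 20-6, using the assumed scale
# (prerequisite = 3, essential = 4, critical = 5): objective 2 is
# critical, objective 4 is essential, and 1, 3, and 5 are prerequisite.
criticality = {1: 3, 2: 5, 3: 3, 4: 4, 5: 3}


def select(ratings, slots):
    """Pick the most critical objectives that fit the available time."""
    ordered = sorted(ratings, key=ratings.get, reverse=True)
    return ordered[:slots]


print(select(criticality, 2))  # [2, 4] -- the two-objective case
```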

The application rubric allows you to dig deeper into objectives when the issues are not as clear-cut as those in the first example. Assume that there is some disagreement about the outcome of the first rubric and further digging is required to see which objectives should be included. Now, the frequency of application data is included for review.

Table 20-7 depicts how these objectives might be rated in terms of the frequency of application.

Table 20-7. Application Rubric Analysis of Five Objectives

A pattern is starting to appear for the five objectives. The second objective is also something that a learner will probably use every working day on the job. The other four objectives will probably find less frequent use, with one actually having only yearly value in terms of learner implementation. Again, these decisions can be made as a group and ironed out in the course of the conversation. The content criticality has been refined to the point where one objective, objective 2, is by far the most important for this course.

The objectives are further defined using the rubric that adds each objective’s criticality rating and application data, arriving at a final, multidimensional criticality rating.

Table 20-8 shows the rubric completed for objective 1, which was rated as prerequisite (criticality) and monthly (frequency).

Table 20-8. Rubric for Combining Criticality and Application Data of Objective 1

Tables 20-9 through 20-12 show the combined criticality and application matrices for the remaining four objectives.

Table 20-9. Rubric for Combining Criticality and Application Data of Objective 2

Table 20-10. Rubric for Combining Criticality and Application Data of Objective 3

Table 20-11. Rubric for Combining Criticality and Application Data of Objective 4

Table 20-12. Rubric for Combining Criticality and Application Data of Objective 5

The five objectives are ranked based on the last rubric results (Table 20-13).

Table 20-13. Rubric for Ranking Five Objectives

Objective Ranking
1 6
2 10
3 6
4 6
5 4

This analysis indicates that objective 2 has the highest ranking, 10, with objectives 1, 3, and 4 tied for the next highest ranking of 6. Objective 5 is last with a 4 ranking.
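Running the whole pipeline for the five safety objectives reproduces the rankings in Table 20-13. As before, the specific numeric scales are assumptions layered on the chapter's rubrics:

```python
# Assumed scales: prerequisite = 3, essential = 4, critical = 5;
# never = 0, yearly = 1, quarterly = 2, monthly = 3, weekly = 4, daily = 5.
criticality = {1: 3, 2: 5, 3: 3, 4: 4, 5: 3}  # Table 20-6 ratings
frequency = {1: 3, 2: 5, 3: 3, 4: 2, 5: 1}    # Table 20-7 ratings

ranking = {obj: criticality[obj] + frequency[obj] for obj in criticality}
print(ranking)  # {1: 6, 2: 10, 3: 6, 4: 6, 5: 4} -- matches Table 20-13
```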

The final rubric provides the final disposition for each objective (Table 20-14).

Table 20-14. Final Disposition of Analysis of Five Objectives

In Conclusion

The process of determining criticality for content selection and other aspects of the instructional design process has not been well documented, but it now exists in a form that can be easily utilized by designers. The more a designer can integrate this criticality process into her work, the more productive and efficient the work flow and decision-making process becomes. The introduction of criticality as an instructional design best practice will add credibility and professionalism to any designer’s skill set.

Discussion Questions

1.  A client is concerned that a criticality review might suggest that certain key content areas are not as important as the client believes them to be. What do you say in response?

2.  You are reviewing a criticality report written after a meeting with a group of SMEs, and they are recommending that all course content relating to the organization’s leadership, including an organizational overview and video from the CEO, not be included and be labeled as “unnecessary.” What should you do as the instructional designer?

3.  Can you think of any situation where an objective that might never be used by a learner should be rated as “critical”?
