When I get a systems engineer, I don’t know what I’m supposed to be getting.
– Anonymous Helix program manager participant
Chapter 4 explained the proficiencies that systems engineers need to perform their jobs well. This chapter presents a method to visualize how strong someone is in each proficiency – a PROFICIENCY PROFILE, or just PROFILE for short. We begin with three fundamental truths:
These truths drive the importance of PROFILES and several distinct uses for them:
An illustrative personal PROFILE is shown in Figure 6.1 for Matt, the hypothetical senior systems engineer introduced at the end of Chapter 3. Recall that Matt’s career spans 10 years with a very rapid rise – 5 years at DoesThatHurt as a biomedical engineer followed by 5 years at MedBright, first as a systems engineer and eventually as lead systems engineer. Figure 6.1 is a radar or spider plot that has six spokes, one for each proficiency area. Matt’s strength in each proficiency area is marked by a small circle along the radial axis from 1 (least proficient) to 5 (most proficient).
The PROFILE reveals that Matt believes himself to be a strong systems engineer. He self‐assesses with a score of 3, 4, or 5 on every proficiency area using the rubric that will be explained in the next section.
There is nothing sacred about a 5‐point scale to measure proficiency strength. In fact, at different times in our research, the scale has varied from a low of 3 points to a high of 11 points. For purposes of exploring PROFILES in this book, a 5‐point scale is sufficient, but a person or organization could choose a more or less granular scale.
The relatively simple radar diagram in Figure 6.1 is easy to complete and understand. The disadvantage is that, with only six data points, it offers a coarse view of a systems engineer’s strengths and weaknesses. Nevertheless, as a top‐level look, the more than 100 people who completed these PROFILES as part of our interviews found it insightful.
We can easily imagine expanding Matt’s personal PROFILE in two ways:
The sequence in Figure 6.2 shows that over the 10 years of Matt’s career, his strength in TECHNICAL LEADERSHIP, INTERPERSONAL SKILLS, SYSTEMS MINDSET, and SYSTEMS ENGINEERING DISCIPLINE grows steadily. His MATH/SCIENCE/GENERAL ENGINEERING strength remains constant, and his SYSTEM’S DOMAIN AND OPERATIONAL CONTEXT strength grows slightly. In the next chapter, such a sequence of PROFILES will be explored further, first in the context of the forces that lead to such changes in strength and second as part of a larger depiction of a career path – the specific sequence of positions a systems engineer assumes over time, together with the corresponding roles, events, PROFILES, and other milestones of consequence in a person’s career.
The sequence of six radar diagrams in Figure 6.3 shows the basis for Matt’s self‐assessment. For example, there are six constituent categories in MATH/SCIENCE/GENERAL ENGINEERING, and the individual category scores roll up to an overall area score of 4. The method used in Helix to calculate a proficiency area score from its constituent category scores is to compute their average and round it to a whole number. In Matt’s example, he scored 5 in NATURAL SCIENCE FOUNDATIONS and ENGINEERING FUNDAMENTALS, 3 in SOCIAL SCIENCE FOUNDATIONS and PROBABILITY AND STATISTICS, and 4 in CALCULUS AND ANALYTIC GEOMETRY and COMPUTING FUNDAMENTALS. The average of those six scores is 24/6 = 4. Certainly, more sophisticated roll‐up methods are possible, such as assigning weights to individual categories to reflect their relative importance to the organization or avoiding rounding and using noninteger values. Such methods are not shown here, although some organizations that participated in Helix preferred these more sophisticated approaches. For an individual, the most basic assessment using averages is easy and insightful. For an organization, any approach could work as long as it is applied consistently across all individuals.
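The averaging roll‐up described above can be sketched in a few lines of Python. The function and variable names here are illustrative, not part of Helix; the category names and scores follow Matt’s hypothetical self‐assessment.

```python
# A minimal sketch of the basic Helix roll-up: a proficiency area score
# is the average of its constituent category scores, rounded to a whole
# number. (Note: Python's round() rounds ties to the nearest even
# number; an organization might prefer a different tie-breaking rule.)

def area_score(category_scores):
    """Average the category scores and round to a whole number."""
    return round(sum(category_scores.values()) / len(category_scores))

# Matt's hypothetical self-assessment for MATH/SCIENCE/GENERAL ENGINEERING
matt_math_science = {
    "Natural Science Foundations": 5,
    "Engineering Fundamentals": 5,
    "Social Science Foundations": 3,
    "Probability and Statistics": 3,
    "Calculus and Analytic Geometry": 4,
    "Computing Fundamentals": 4,
}

print(area_score(matt_math_science))  # 24 / 6 -> 4
```

A weighted variant, as some Helix organizations preferred, would simply multiply each category score by an organization‐chosen weight before averaging.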
During organizational interviews, more than 100 people created their own personal PROFILES at the proficiency area level. This section explains how those self‐assessments were conducted and then presents a specific rubric for self‐assessment based on what we learned from those interviews. Small groups of two to seven people from the same organization usually sat together with members of our research team. Figure 6.4 shows an actual worksheet used for some data collection, with a 6‐point scoring scale from 0 to 5. Participants were asked to be mindful of the categories constituting each area but were not asked to score themselves against each category as shown in Figure 6.4. Although there was some variation, participants typically had only 10 minutes to complete a self‐assessment, making it impossible to do a category‐level assessment.
Because we wanted to understand how their proficiency strength changed over time, they created two PROFILES – one as they judged themselves on the day they conducted the exercise and one as they remembered themselves at some point in the past. This sequence is an abbreviated version of the one shown in Figure 6.2, with just two PROFILES rather than three.
We explored several options for how far back in the past a systems engineer’s first PROFILE should be. For some, it was immediately after completing their bachelor’s degree. For others, it was 10 years before the interview or when they joined their current organization. Completion of undergraduate education was a frequent point of comparison, but those with many years of experience were often uncomfortable with self‐assessment that far back because they doubted how accurately they could recall their strength. For each assessment, we recorded the general dates for the point of comparison.
Participants were generally told that a score of 1 meant “only has awareness” and the highest value (usually a number between 5 and 10) meant “among the best in the organization.” We avoided being too prescriptive and told people to use their “gut” to decide where to place themselves on the scale. The rubric itself varied somewhat over time from organization to organization, but this description fairly reflects the general approach.
After everyone in the room completed their PROFILES, the group talked openly about the scores and why they assessed themselves as they did. These conversations moved the group closer to a shared way to score. There were often comments such as “Well, I was going to rate myself a 5, but if you consider yourself a 5, I should be lower” or “I think you’ve under‐rated yourself; I personally see you as ….” Such shared understanding was valuable not only to our research team but to the subjects as well, and it somewhat reflected what we expected in personal PROFILES. Most people interviewed indicated that the experience of self‐assessing their proficiencies, as subjective as it was, offered insight into their own strengths and weaknesses and how they compared with their colleagues.
Initially, all PROFILE data was entered on paper. This was soon replaced by an Excel spreadsheet that automatically built the PROFILE visualization from entered data. As this book is being written, a Web‐based tool for data collection, analysis, and visualization is being developed.
With 6 proficiency areas, 33 categories, and an indefinite number of topics, the scale and complexity of using unique rubrics to assess strength for each proficiency would be daunting. We chose a simpler route, adapting a relatively clean rubric developed by the National Institutes of Health (NIH). Quoting from its website [1],
The NIH Proficiency Scale is an instrument used to measure one’s ability to demonstrate a competency on the job. The scale captures a wide range of ability levels and organizes them into five steps; from “Fundamental Awareness” to “Expert.”
Table 6.1 offers our adaptation of NIH’s five‐level scale.
TABLE 6.1 Proficiency Scale
| No. | Level | Level Description |
|-----|-------|-------------------|
| 1 | Fundamental Awareness | You have common knowledge or an understanding of basic techniques and concepts. Your focus is on learning rather than doing |
| 2 | Novice | You have the level of experience gained in a classroom or as a trainee on the job. You can discuss terminology, concepts, principles, and issues related to this proficiency and use the full range of reference and resource materials in this proficiency. You routinely need help performing tasks that rely on this proficiency |
| 3 | Intermediate | You can successfully complete tasks relying on this proficiency. Help from an expert may be required from time to time, but you can usually perform the task independently. You have applied this proficiency to situations occasionally while needing minimal guidance to perform it successfully. You understand and can discuss the application and implications of changes in tasks relying on the proficiency |
| 4 | Advanced | You can perform the actions associated with this proficiency without assistance. You are recognized within your immediate organization as “a person to ask” when difficult questions arise regarding the proficiency. You have consistently provided practical and relevant ideas and perspectives on ways to improve the proficiency and its application. You can coach others on this proficiency by translating complex nuances related to it into easy‐to‐understand terms. You participate in senior‐level discussions regarding this proficiency. You assist in the development of reference and resource materials in this proficiency |
| 5 | Expert | You are known as an expert in this proficiency. You provide guidance, troubleshoot, and answer questions related to this proficiency and the roles where the proficiency is used. Your focus is strategic. You have demonstrated consistent excellence in applying this proficiency across multiple projects and/or organizations. You are considered the “go‐to” person in this area within your own organization and perhaps externally. You create new applications for this proficiency or lead the development of new reference and resource materials on this proficiency. You can explain this proficiency to others in a commanding fashion, both inside and outside your organization |
This rubric can be used for PROFILES at any level – area, category, or topic. Nevertheless, any self‐assessment is notoriously problematic: people tend to rate their abilities higher than independent evaluators would. William Sedlacek identified the importance of personal characteristics, such as self‐awareness, in the ability to self‐assess accurately [2–5]. Validation by a second party is an important way to improve accuracy and consistency, building a validated PROFILE.
MITRE Corporation was chartered in 1958 “as a private not‐for‐profit corporation to provide engineering and technical guidance for the federal government – with an initial focus on defense needs” [6]. In 2016, MITRE had annual revenues exceeding $1.5 billion and employed more than 8000 people who support a wide swath of the US government, including the Department of Defense, Federal Aviation Administration, and the Department of Health and Human Services, among others [7].
MITRE has a large, diverse population of systems engineers and devotes considerable resources to growing their capability. It graciously participated in the original Helix research that spawned Atlas and was the first organization to rigorously pilot the use of PROFILES to support workforce development. MITRE has used PROFILES to support what it refers to as CLEAR Conversations – regular exchanges between systems engineers and their supervisors about performance and performance planning. The intention was to provide structure and a common vocabulary to guide CLEAR Conversations. After running a trial of this approach with approximately 50 systems engineers, MITRE was, as of the writing of this book, considering expanding its use across a wider swath of the organization.
Piloting Atlas required tailoring the proficiency model appropriately. In fact, the tailoring approach described in Chapter 4 resulted from our work with MITRE, which developed its own proprietary variant of the proficiency model, primarily by tailoring the topics in each category. Using a 5‐point scale, MITRE asks each participant to self‐assess their strength against the categories and tailored topics – a fairly granular approach. To facilitate this assessment, MITRE developed a set of instructions so that participants understand how to apply the tailored model correctly. Supervisors and systems engineers discuss the self‐assessments during periodic CLEAR Conversations. Each supervisor provides their own thoughts about the scoring, and together they consider how the systems engineer can improve scores over time through additional assignments, mentoring, and education and training. MITRE’s approach creates validated PROFILES.
The best people in an organization can help their colleagues grow by publishing their PROFILES as exemplars, i.e. profiles of leading systems engineers together with a rationale from those individuals on why their profiles look as they do. An exemplar PROFILE serves as the North Star for others who aspire to achieve their success. Someone entering the organization and looking to understand “what success looks like” could examine PROFILES of the best for guidance. Of course, the availability of this information will depend on personnel policies, organizational culture, and the willingness of people to share information about themselves. Consider the situation where a suitable exemplar PROFILE is not available, perhaps because one of the people best qualified to be an exemplar does not wish to share their PROFILE. The manager can develop recommended PROFILES based on hypothetical team members rather than real ones.
In one organization we visited, we performed such an exercise on several positions, including “system analyst.” In this organization, “system analyst” is a specific position that included the systems engineering role of SYSTEM ANALYST. Their management recommended that an individual be “4/Advanced” in every proficiency area. We then interviewed a number of their system analysts who self‐assessed their personal PROFILES to understand how well they stacked up compared to management recommendations. Most fell short, in some cases by quite a bit.
There are a few things we learned by running this exercise with this organization and others. It is a common and very human trait to answer the question “How good do you want an employee to be?” with the answer “the best.” However, it is impossible for every single individual to be the best at everything. In fact, some individuals we interviewed found these “superhero” profiles to be discouraging. If the organization truly expects them to be at least “Advanced” strength in every proficiency area, they felt they would always fall short. Among very junior systems engineers, some felt that it would take 10–15 years to grow to that level, which is fine provided management believes that “4/Advanced” everywhere is a long‐term goal and not an immediate requirement to fill the system analyst position.
The complete set of an organization’s or a team’s personal PROFILES can be a powerful tool to understand their overall relative strengths and weaknesses. This set becomes even more powerful if they are validated by managers and peers. Simple statistical analyses can be done, calculating, for example, the mean score for specific proficiencies of people who occupy various positions, such as system analyst, system architect, or requirements engineer. Of course, the 5‐level scale is only ordinal. There is clearly some subjectivity in how scores are assigned, even when using the proficiency scale in Table 6.1. Hence, it is important to recognize the limitations of such statistics. Nevertheless, they can be most helpful in understanding the general characteristics and patterns of strength and weakness in an organization. That understanding can guide workforce investments and policies.
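The kind of simple statistical analysis described above can be sketched as follows. The data, position names, and scores here are entirely hypothetical; only the computation of a mean score per proficiency area for people holding a given position mirrors the text.

```python
# Illustrative aggregate analysis over a set of personal PROFILES:
# the mean strength in one proficiency area across everyone who
# occupies a given position, allowing noninteger values as in the
# aggregate profile of Figure 6.5. All data below is made up.
from statistics import mean

profiles = [
    {"position": "system analyst",   "Systems Mindset": 3, "Technical Leadership": 2},
    {"position": "system analyst",   "Systems Mindset": 4, "Technical Leadership": 3},
    {"position": "system architect", "Systems Mindset": 5, "Technical Leadership": 4},
]

def aggregate_score(profiles, position, area):
    """Mean score in one proficiency area across all holders of a position."""
    scores = [p[area] for p in profiles if p["position"] == position]
    return mean(scores)

print(aggregate_score(profiles, "system analyst", "Systems Mindset"))  # 3.5
```

Because the underlying scale is only ordinal, such a mean should be read as a rough indicator of where a population sits, not as a precise measurement.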
Consider the organization discussed above that expected every system analyst to be “4/Advanced” in every proficiency area. We developed an aggregate profile for those we interviewed as shown in Figure 6.5, allowing for noninteger average values. One could read Figure 6.5 to mean that many system analysts do not meet management expectations and therefore are not effective. However, Figure 6.5 could also show where the workforce needs to grow over time, which could drive investment in training, rotational assignments, and other steps to improve the aggregate PROFILE. Also, Figure 6.5 groups all system analysts into one position. That grouping could be refined into three PROFILES, reflecting junior, mid‐level, and senior system analysts, for which different recommended PROFILES would be appropriate. Setting proper expectations is vital.