3
Evaluating the Maturity of Developing Technology

3.1 Background

Amongst the most fundamental questions to ask when developing a new technology or product is: ‘How far have I got and what remains to be done?’ If the answers to such questions were easy, then the past would not be littered with examples of technologies that have gone wrong.

Every engineer has his favourite stories, either from history or from his own experiences.

Scandinavians like to quote the example of the warship Vasa, which listed and sank immediately after being launched in August 1628. The ship's builders had been too afraid to tell the Swedish king that his ideas wouldn't work. And there is a lesson in this tale even for today's engineering companies – there is still a fear of giving bad news.

In the United Kingdom, an oft‐quoted example is the Advanced Passenger Train (APT). The ambition in the 1980s was to develop a tilting passenger train that could negotiate the ‘curvaceous’ West Coast Main Line and reduce journey times from London to Glasgow, which it was well capable of doing (in fact, at the time of writing, the APT still holds the record of 3 hours 55 minutes for the northbound journey). However, the technology was developed in the public eye and involved fare‐paying passengers – something that no right‐minded engineer would ever recommend. Not surprisingly, the problems that arose on the development programme, which were quite typical for a technology still maturing, became a source of ridicule and embarrassment. The project was cancelled. Today, tilting trains run very successfully on this route using technology based on the APT's but made by someone else.

The IT world has provided additional evidence. Even Apple, the doyen of domestic computing, introduced and then quietly withdrew the Apple Newton, for example, which achieved some notoriety at the hands of the strip cartoonists.

Could these problems have been avoided if there had been a better understanding of the maturity of the technologies in question? There are no guarantees of a magic cure, but an objective way of measuring the readiness of these technologies, acted upon, would surely have given a better chance of avoiding the embarrassment and loss of reputation that always accompany publicly obvious failures.

Graph of spend overrun versus predesign spend, displaying a descending dotted curve along with scattered dots labelled OMV, Galileo, TDRSS, Hubble Space Telescope, Shuttle, Ulysses, Voyager, and HEAO.

Figure 3.1 Spend overrun versus predesign spend.

3.2 Origins of Technology Readiness Measurement

This form of problem faced the National Aeronautics and Space Administration (NASA) in the 1970s. The US space program had a large portfolio of developing technologies that might be used on future missions. But which of them could be made to work, and by when? Historical cost analyses had shown that there was a direct link between the success of a technology and the amount (of dollars) spent on its early‐stage development – the greater the expenditure at that stage, the lower the likelihood of cost and timescale overruns. Most engineers would regard this as a statement of the obvious, but NASA cost engineers demonstrated the point and put some numbers to it.

The question was obviously crucial to the planning of future programmes, especially manned space flights where there is no room for error. According to Donna Shirley (Ref. 1), manager of Mars Exploration at the Jet Propulsion Laboratory, the business of technology readiness levels (TRLs) got started at NASA because of a ‘guy named Werner Gruel, who was a NASA cost analyst.’ Gruel ‘had this great curve that showed that, if you spent less than 5% of the project cost before you made a commitment to the technology, it would overrun. If you spent 10% of the total cost before that commitment, it would not overrun’. The curve is reproduced in Figure 3.1.
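Purely as an illustration, the thresholds quoted by Shirley can be expressed as a simple screening rule. The sketch below is not part of any NASA method; the function name and risk categories are invented, and only the 5% and 10% figures come from the quotation above.

```python
def overrun_risk(predesign_spend, total_project_cost):
    """Illustrative screening rule based on the thresholds quoted above."""
    fraction = predesign_spend / total_project_cost
    if fraction < 0.05:
        return "high risk of overrun"    # less than 5% spent before commitment
    if fraction < 0.10:
        return "intermediate risk"       # between the two quoted thresholds
    return "low risk of overrun"         # 10% or more spent before commitment


# Example: 8 units of predesign spend on a project costing 100 units in total
print(overrun_risk(8, 100))  # -> "intermediate risk"
```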

The approach to addressing this issue and taking it forward is credited to NASA researcher Stanley Sadin and two of his colleagues. In 1974, Sadin developed a seven‐step system for estimating the readiness of new technologies, which was formally defined in a published report in 1989. It was then extended and used more publicly by NASA from the early 1990s as a nine‐step system, running from initial concept to validation by operation in space – see Refs. 2–5.

Over the subsequent 25 or so years, use of the approach has expanded across both public and private organisations.

3.3 Purpose of Technology Maturity Assessment

Since its original development to support the technology planning of space programmes, a range of uses has been found for technology readiness assessment methods:

  • Understanding the maturity of individual projects within a portfolio of technology development projects. This helps relatively complex organisations to understand what they have in their development pipeline and the timescales within which a particular technology is likely to mature.
  • Evaluating the status of a specific technology, or product development, programme prior to deciding whether to invest resources in further development or to launch into a live production programme. Many larger companies have institutionalised this approach as a ‘stage‐gate’ system where formal sign‐off is required before that technology can be taken further – see Chapter 4.
  • Judging whether a product or technology is worthy of external investment. This is particularly relevant to angel or venture capital‐type investment organisations who often struggle to understand the maturity of early‐stage companies requesting funding.
  • Managing the procurement or acquisition of technology by (public) procurement bodies such as those running defence programmes. Here the idea is to make sure that what has been promised is being delivered and that the procuring organisation is not throwing good money after bad.

3.4 Users of Technology Maturity Assessment

Within this context, technology readiness assessment is, or could be, used by quite a wide range of organisations, including:

  • Space and defence system companies, the original users
  • Public procurement/technology acquisition bodies, such as national defence procurement organisations
  • Engineering companies and complex system integrators, e.g. aircraft manufacturers, jet engine manufacturers, car and commercial vehicle manufacturers, railway rolling stock manufacturers, oil and gas companies
  • Nuclear industry, including those involved in decommissioning
  • Investors such as venture capital and ‘angel’ investors
  • Pharmaceutical or biomedical products, e.g. US Department of Health & Human Services

The principles of its use can also be extended into other fields such as software development.

3.5 What Is Technology Maturity?

It may seem like a statement of the obvious, but a mature technology is one that works reliably when in the customer's hands and can be manufactured consistently at the appropriate cost without having to use specialised or difficult methods. It is likely to be used in a range of applications by a number of companies.

An immature or underdeveloped technology is the opposite of these and will at best cause frustration to the end user and at worst be downright dangerous. In the early stages of developing a technology, where technology is considered to be the application of science into something useful, immaturity translates into uncertainty and risk, and then into cost and time overruns. Arguably, the most crucial question in applying a new technology is whether it has reached a sufficient level of maturity to be inserted into a planned new product programme where a commitment will be made to ‘go live’ at a certain date, with all the associated costs and investment as well as the exposure to end customers.

Each level is defined in three forms: the NASA definition*, the EU Horizon 2020 definition, and the UK automotive definition (slightly simplified).

TRL 1
NASA: Basic principles observed and reported.
EU H2020: Basic principles observed.
UK Automotive: Basic principles observed and reported. Scientific research undertaken and beginning to be translated into applied research and development. Paper studies and scientific experiments taken place and performance predicted.

TRL 2
NASA: Technology concept and/or application formulated.
EU H2020: Technology concept formulated.
UK Automotive: Speculative applications identified. Exploration into key principles ongoing. Application‐specific simulations or experiments undertaken and performance predictions refined.

TRL 3
NASA: Analytical and experimental critical function and/or characteristic proof of concept.
EU H2020: Experimental proof of concept.
UK Automotive: Analytical and experimental assessments have identified critical functionality and/or characteristics and physically validated predictions of separate elements of the technology or components, not yet integrated or representative. Performance investigation using analytical experimentation and/or simulations underway.

TRL 4
NASA: Component and/or breadboard validation in laboratory environment.
EU H2020: Technology validated in laboratory.
UK Automotive: Technology component and/or basic subsystem validated in laboratory or test house environment. The basic concept observed in other industry sectors. Requirements and interactions with relevant vehicle systems determined.

TRL 5
NASA: Component and/or breadboard validation in relevant environment.
EU H2020: Technology validated in relevant environment.
UK Automotive: Technology component and/or basic subsystem validated in relevant environment, potentially through a mule or adapted current production vehicle. Basic technological components integrated with reasonably realistic supporting elements so that the technology can be tested with equipment that can simulate and validate all system specifications within a laboratory, test house or test track setting with integrated components. Design rules established. Performance results demonstrate the viability of the technology and confidence to select it for a new vehicle programme.

TRL 6
NASA: System/subsystem model or prototype demonstration in an operational environment.
EU H2020: Technology demonstrated in relevant environment (industrially relevant environment in the case of key enabling technologies).
UK Automotive: A model or prototype of the technology system or subsystem has been demonstrated as part of a vehicle that can simulate and validate all system specifications within a test house, test track or similar operational environment. Performance results validate the technology's viability for a specific vehicle class.

TRL 7
NASA: System prototype demonstration in an operational environment.
EU H2020: System prototype demonstration in operational environment.
UK Automotive: Multiple prototypes demonstrated in an operational, on‐vehicle environment. The technology performs as required. Limit testing and ultimate performance characteristics now determined. Technology is suitable to be incorporated into specific vehicle platform development programmes.

TRL 8
NASA: Actual system completed and “flight qualified” through test and demonstration.
EU H2020: System complete and qualified.
UK Automotive: Test and demonstration phases have been completed to the customer's satisfaction. Technology proven to work in its final form and under expected conditions.

TRL 9
NASA: Actual system flight proven through successful mission operations.
EU H2020: Actual system proven in operational environment.
UK Automotive: Actual technology system has been qualified through operational experience. Technology applied in its final form and under real‐world conditions.

TRL 10 (UK Automotive only)
UK Automotive: Technology is successfully in service in multiple application forms, vehicle platforms and geographic regions. In‐service and lifetime warranty data available, confirming actual market life, time performance and reliability.

Figure 3.2 Definition of TRL levels.

*https://www.nasa.gov/pdf/458490main_TRL_Definitions.pdf

3.6 Technology Readiness Level (TRL) Structure

The structure developed originally by NASA has stood the test of time and has been developed by different industries to suit their needs. Three of these structures are summarised in Figure 3.2. Alongside the NASA system itself are the definitions for each TRL used by the European Union in the Horizon 2020 research programme. There is then a 10‐step system developed by the UK automotive industry, using the language, processes, and practices of the automotive industry – Ref. 6. These are presented as illustrations. Further examples can be found from a wide range of industries such as energy and rail.
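To make the structure concrete, the sketch below captures the nine‐level scale as a simple lookup table, using wording abbreviated from the NASA column of Figure 3.2. The dictionary and function names are illustrative only; a real implementation would hold the full, industry‐specific criteria.

```python
# Minimal sketch: the nine TRL levels as a lookup table,
# with wording abbreviated from the NASA column of Figure 3.2.
TRL_SCALE = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component and/or breadboard validation in a laboratory environment",
    5: "Component and/or breadboard validation in a relevant environment",
    6: "System/subsystem model or prototype demonstration in an operational environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and 'flight qualified' through test and demonstration",
    9: "Actual system flight proven through successful mission operations",
}


def describe_trl(level):
    """Return a short description of a TRL, e.g. describe_trl(4)."""
    return f"TRL {level}: {TRL_SCALE[level]}"


print(describe_trl(6))
```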

3.7 Phases of Technology Readiness

The nine TRL levels can be divided into three distinct phases:

  • TRL 1–3, which is usually carried out in a university or corporate research laboratory
  • TRL 4–6, which is harder to categorise and which could be done in either a university, a research organisation or an industrial test house
  • TRL 7–9, which is very much in the industrial or commercial domain

The first phase establishes the basic principles of the technology – that it can be made to work. It is likely to require practical laboratory development by highly skilled staff, supported by calculations or mathematical simulations. The emphasis will be very much on making the technology function through a lot of iteration, rework, skill, and ingenuity. The dependability of the technology might be quite limited; we are all familiar with the laboratory demonstrations that worked successfully an hour ago but that fail when shown to an important visitor. There will, however, be much learning about the critical factors in making the technology a success. Finally, there will also be some understanding of the likely market applications of the technology, how it might be manufactured, and roughly what it might cost.

The intermediate TRL 4–6 phase has the aim of making the technology more robust and predictable so it could then be deployed, with confidence, into a detailed product development and manufacturing launch phase. Testing will become increasingly realistic in terms of the scale of any prototypes and will be conducted in an environment approximating the true operational situation. This could require significant facilities and supporting calculations. Simulations will be very detailed. Full design for manufacture (DfM) studies should be undertaken so the design does not have to be amended subsequently to make it capable of production (many technologies effectively regress if the DfM phase is left too late). This work will also flush out manufacturing costs and highlight any need for further development to hit realistic cost targets.

The decision to move beyond TRL 6 is one of the most critical in the product development cycle, as it leads to substantial expenditure in a phase where the scope to respond to immature technology is much more limited in timescale terms and increasingly expensive. However, the TRL 4–6 phase is also one where there is pressure to short‐cut and compromise, because the benefits of the work are often only apparent later – or, conversely, problems arise later that could have been ironed out at this stage.

The final, TRL 7–9 phase is the easiest to understand and by far the most expensive. Full and detailed product definition in the form of computer‐aided design (CAD) models, drawings, bills of material, and specifications is completed. Calculations and modelling are finished. Prototype and pre‐production programmes are undertaken to demonstrate the correct functioning of the technology and its durability, often using accelerated test regimes. Legally required certification is achieved and the full production system is launched.

Many companies and industries have detailed sign‐off requirements to ensure that a robust and legal product is produced. They may also have stage‐gate decision‐making processes which subdivide the progress through the different stages of maturity.

Progress through the TRL phases described above is characterised as follows:

  • An increasing level of detail in the definition of the product or the technology
  • Increasingly rigorous and representative (and costly) testing, test environment and supporting analysis
  • Increasingly realistic test items, produced by increasingly representative methods
  • Reducing level of skill in the production of development and test material

At the same time as developing the maturity of technology and products, the corresponding manufacturing processes must be developed and thought through. Without this, the new product may not be capable of efficient production.

Graph of cumulative cash flow vs. time displaying a curve that descends from the origin passing through the fourth quadrant then to the first quadrant along with dots.

Figure 3.3 Start‐up cash‐flow profile.

3.8 The ‘Valley of Death’

No discussion of technology maturity development is complete without touching on the concept of ‘the valley of death’. This term is frequently used in both the public and private arenas in the context of moving early‐stage companies from concept demonstration to financial viability. It is an imprecise and evocative term used to describe a period where the majority of start‐ups come to grief and where public funding of earlier technology research may therefore be seen as wasted.

As indicated above, these phases of work include the period when the technology itself undergoes thorough development beyond concept demonstration, followed by detailed engineering and investment in manufacturing launch. A cash flow profile might typically look like that in Figure 3.3, where there are four to five periods of cash outflow before sales revenue starts to come in.
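A purely hypothetical numeric sketch of such a profile is given below. The yearly figures are invented for illustration; they simply show the cumulative cash position deepening through several years of outflow before later sales revenue pulls it back above zero.

```python
from itertools import accumulate

# Hypothetical yearly net cash flows for a new technology programme (in £m):
# negative values are development and launch spend, positive values are net revenue.
yearly_cash_flow = [-0.5, -1.0, -1.5, -2.0, -0.5, 1.5, 2.5, 3.0]

cumulative = list(accumulate(yearly_cash_flow))
print(cumulative)
# -> [-0.5, -1.5, -3.0, -5.0, -5.5, -4.0, -1.5, 1.5]
# The minimum value (-5.5) is the depth of the 'valley'; the programme only
# becomes cumulatively cash-positive in the final year of this example.
```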

This profile applies whether the work is undertaken by a start‐up or an established corporation. However, in the latter case, the nature of the work will be better understood and experience will provide a better indication of the costs and timescales involved. There will probably also be a better understanding of the market potential and hence whether an adequate return can be earned from the investment.

In the case of start‐up companies, the opposite points apply. There is likely to be more of a tendency to underestimate what is required, in terms of both time and money, something guaranteed to annoy investors. This can then lead to shortages of funds and attempts to short‐cut, leading to underdeveloped products.

Can this be avoided? Stating the obvious, companies in this situation need robust marketing plans, realistic development plans and costs, and ideally early sales of trial units to supportive customers. More is said on the topic of early‐stage funding in Chapter 6.

3.9 Manufacturing Readiness Level (MRL) Structure

Moving back to readiness scales, the technology readiness concept has led to a proliferation of offshoot maturity scales in manufacturing, supply chains, and software, for example. The most important and relevant for technology development is manufacturing readiness, which can be applied either to a new manufacturing technology or to the manufacture of a new product technology using existing manufacturing methods. Figure 3.4 summarises the principles of the manufacturing readiness level (MRL) system used by the US Departments of Defense and of Energy, which uses similar language to the NASA TRL system. Alongside it is the scale developed by the UK automotive industry, which uses the language and activities of this particular industry. One of the key points about the MRL scales is the breadth of activities they cover. As well as manufacturing itself, there is coverage of materials availability, the supporting industrial base including supply chains, the business case, skills availability, and supply schedules. The term manufacturing readiness might well be replaced by business readiness.

Each level is defined in two forms: the definition specified by DoDI 5000.02 and the UK automotive definition (slightly simplified).

MRL 1
DoDI 5000.02: Basic manufacturing implications identified.
UK Automotive: Basic manufacturing implications identified. Materials for manufacturing characterised and assessed.

MRL 2
DoDI 5000.02: Manufacturing concepts identified.
UK Automotive: Manufacturing concepts and feasibility determined and processes identified. Producibility assessments underway, including advanced design‐for‐manufacturing considerations.

MRL 3
DoDI 5000.02: Manufacturing proof of concept developed.
UK Automotive: Manufacturing proof of concept developed. Analytical or laboratory experiments validate paper studies. Experimental hardware or processes created, but not yet integrated or representative. Materials and/or processes characterised for manufacturability and availability. Initial manufacturing cost projections made. Supply chain requirements determined.

MRL 4
DoDI 5000.02: Capability to produce the technology in a laboratory environment.
UK Automotive: Capability exists to produce the technology in a laboratory or prototype environment. Series production requirements, such as in manufacturing technology development, identified. Processes to ensure manufacturability, producibility and quality in place and sufficient to produce demonstrators. Manufacturing risks identified for prototype build. Cost drivers confirmed. Design concepts optimised for production. APQP processes scoped and initiated.

MRL 5
DoDI 5000.02: Capability to produce prototype components in a production relevant environment.
UK Automotive: Capability exists to produce prototype components in a production relevant environment. Critical technologies and components identified. Prototype materials, tooling and test equipment, as well as personnel skills, demonstrated with components in a production relevant environment. FMEA and DFMA have been initiated.

MRL 6
DoDI 5000.02: Capability to produce a prototype system or subsystem in a production relevant environment.
UK Automotive: Capability exists to produce an integrated system or subsystem in a production relevant environment. Majority of manufacturing processes defined and characterised. Preliminary design of critical components completed. Prototype materials, tooling and test equipment, as well as personnel skills, demonstrated on subsystems/systems in a production relevant environment. Detailed cost analyses include design trades. Cost targets allocated and approved as viable. Producibility considerations shaping system development plans. Long‐lead and key supply chain elements identified.

MRL 7
DoDI 5000.02: Capability to produce systems, subsystems or components in a production representative environment.
UK Automotive: Capability exists to produce systems, subsystems or components in a production representative environment. Material specifications approved. Materials available to meet planned pilot line build schedule. Pilot line capability demonstrated, including run‐at‐rate capability. Unit cost reduction efforts underway. Supply chain and supplier quality assurance assessed. Long‐lead procurement plans in place. Production tooling and test equipment design and development initiated. FMEA and DFMA completed.

MRL 8
DoDI 5000.02: Pilot line capability demonstrated. Ready to begin low rate production.
UK Automotive: Initial production underway. Manufacturing and quality processes and procedures proven in a production environment. An early supply chain established and stable. Manufacturing processes validated.

MRL 9
DoDI 5000.02: Low rate production demonstrated. Capability in place to begin full rate production.
UK Automotive: Full/volume rate production capability has been demonstrated. Major system design features stable and proven in test and evaluation. Materials available to meet planned rate production schedules. Manufacturing processes and procedures established and controlled to three‐sigma, or some other appropriate quality level, to meet design characteristic tolerances in a low rate production environment. Manufacturing control processes validated. Actual cost model developed for full rate production.

MRL 10
DoDI 5000.02: Full rate production demonstrated and lean production practices in place.
UK Automotive: Full rate production demonstrated. Lean production practices in place and continuous process improvements ongoing. Engineering/design changes limited to quality and cost improvements. System, components or other items in rate production and meeting all engineering, performance, quality and reliability requirements. All materials, manufacturing processes and procedures, and inspection and test equipment in production and controlled to six‐sigma or some other appropriate quality level. Unit costs at target levels and applicable to multiple markets. The manufacturing capability is globally deployable.

Figure 3.4 Definition of MRL levels.


Figure 3.5 Achievement of TRL levels.


Figure 3.6 Achievement of MRL levels.

3.10 Progressing through the Scales – Some Practical Points

The principles outlined above do require some effort to be translated into practical use. The language used in the TRL or MRL definitions is quite general and its origins in the space and defence industries do come through, even when adapted to other industries. Figures 3.5 and 3.6 outline in more practical terms what might be undertaken or completed in each stage of the TRL/MRL journey in a way that relates to the language of general company management.

3.11 International Standards

In terms of officially recognised standards relating to this topic, the only one which currently exists is ISO 16290:2013 – ‘Space Systems – Definition of the TRLs and their criteria of assessment’ – Ref. 7. It was produced by ISO Technical Committee 20 (TC 20, Aircraft and space vehicles), Sub‐Committee SC 14 (Space systems and operations), and runs to 11 pages.

The standard provides useful information:

  • Definition of nine TRL levels
  • Examples of the readiness level of various technologies at different points of time in the past
  • Work expected to have been achieved on completion of each TRL level

The language used in the standard is very much that of the space industry and, although it is explained well through definition of key terms (e.g. ‘breadboard’, ‘element’ and ‘relevant environment’), it would be more difficult to use directly in other industries using different terminology and processes. The less official documents produced by other industries, or indeed other individual companies, show that the principles of the system can be adapted to a wide range of circumstances.

3.12 Assessment of TRL and MRL Levels

TRL and MRL assessment can be carried out in individual companies on their products. When a company decides to make use of a TRL or MRL system, it is up to that company to decide how to do so and how to build it into its formal processes. Normal practice is to have a documented set of definitions and criteria that describe the process to be used in the company and what is required to pass each TRL or MRL point.

Starting with technology readiness, questions such as these should be asked:

  • What level of detail in product definition exists, in terms of drawings, specifications, bills of material, and so on?
  • What form of physical realisation has been achieved, and how realistic is it compared with the final product?
  • What calculation, modelling, and simulation work has been completed to verify the performance of the design, and what results has this work shown?
  • What physical test work has been undertaken, in what quantities, and in how realistic a test environment? What results has this shown?
  • Has intellectual property been protected?
  • Is there a programme for regulatory approval, and how far has it progressed?
  • To what extent has manufacturing feasibility been assessed, and how far has design‐for‐manufacture progressed?

Figure 3.7 is an example of a worksheet that could be used to assess whether a technology had reached, in this instance, TRL 3.

Technology Readiness Level – TRL 3 (with space to record ‘TRL achieved: Yes/No’ and overall comments)
TRL 3 – Basic concept of the new technology is shown to be viable (proof of concept). On the basis of this, further investment in the next, and more expensive, stages of development might be expected by either the organisation doing the development or by an external investor...

For each area, the supporting information expected at this stage is listed; each area is then marked Pass/Fail, with issues and comments recorded:

  • Concept design – The design of the solution will be complete as an overall system, but not in its details. This will probably be in the form of a CAD model plus coding of any embedded software.
  • Simulation & modelling – Substantial simulation and modelling will have been undertaken to prove the performance characteristics of the system.
  • Practical development – This will be supported by practical test work, in a laboratory environment, of a complete system, albeit using relatively crude representations of the final items. This demonstration system is unlikely to work reliably, and could be quite temperamental, but it will show that the concept can be made to work and can achieve something approaching the required level of performance.
  • Marketing evidence – The development work will also be supported by an initial QFD‐type (quality function deployment) analysis, linking the product in detail to the market need that has been identified.
  • Risk identification – FMEA or similar methodologies will also be started to identify areas critical to the reliability of the solution.
  • IP protection – The need for formal IP protection will be identified and may have been sought.
  • Manufacturability – A manufacturing approach will be defined and documented, suited to the volumes of the market identified.
  • Financial viability – A basic assessment will be available to demonstrate financial viability.

Figure 3.7 Example of TRL assessment sheet.

Manufacturing readiness can then be questioned in the same way:

  • Is there a clear understanding of the manufacturing technology and processes needed to produce the planned product?
  • Is there an understanding of the planned manufacturing volumes and manufacturing costs as a function of time that link to a viable business case and an introduction programme?
  • Have the manufacturing processes and quality management arrangements been defined and modelled?
  • Are the required materials and manufacturing infrastructure available, either within the company or outside it in the supply chain, in sufficient capacity to support the planned volumes?
  • Is there a costed programme of investment in facilities and tooling?
  • Have the manpower and skill requirements been defined?

These questions could form the basis of a review process that might be undertaken by independent staff in the company or through formal review meetings, also involving independent or experienced staff. To support this process, there is likely to be some form of check‐sheet system which records the outcome of the review and defines the TRL or MRL status of the technology or product. Reviews should be evidence‐based in the sense that they look directly at the available evidence and do not rely on second‐hand information or opinion. (The promoters of new technology usually overestimate its readiness and brush aside known weaknesses.)

There are also publicly available spreadsheet‐based calculators, such as that published by the US Air Force Research Laboratory – Ref. 8. As might be expected, it is aimed at aerospace and defence users, but the principles could be used across a wider range of industries. In this example, there is one sheet of questions, 10–20 in number, for each TRL level. These can be ticked off or a completion percentage estimated. The spreadsheet then calculates the resulting TRL outcome.

One point of debate occurs when a technology or product passes most of the criteria at a specific review point but still has some deficiencies. The correct argument is that these deficiencies should be corrected before a ‘pass’ can be awarded. Otherwise, the company misleads itself about where it has reached and is storing up problems for the future. Hence, the TRL or MRL level of a system is, in effect, dictated by the lowest ranking, or weakest, element of that system.
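A minimal sketch of this ‘weakest element’ logic is shown below. It assumes, purely for illustration, that the review criteria for each level have already been marked pass or fail (as they might be on a worksheet like Figure 3.7); the achieved TRL is then the highest level up to which every criterion has been met. The function and variable names are hypothetical and are not taken from any published calculator.

```python
def achieved_trl(criteria_passed):
    """Return the highest TRL for which this and every lower level is fully passed.

    criteria_passed maps each TRL level to a list of pass/fail results for its
    review criteria. A single failed criterion caps the achieved level,
    reflecting the 'weakest element' rule described above.
    """
    level = 0
    for trl in sorted(criteria_passed):
        if all(criteria_passed[trl]):
            level = trl
        else:
            break  # a deficiency at this level blocks any higher claim
    return level


# Hypothetical review outcome: TRL 1-3 fully passed, one TRL 4 criterion failed.
review = {
    1: [True, True],
    2: [True, True, True],
    3: [True, True, True, True],
    4: [True, False, True],
    5: [True, True],
}
print(achieved_trl(review))  # -> 3
```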

Figure 3.8 shows the typical outcome of a real TRL assessment, in this instance a new development in the field of renewable energy. In this example, the market was well understood, there was a lot of engineering detail with some degree of IP protection, prototypes were well made but constructed very laboriously, test work had been completed to demonstrate the concept but a more thorough programme had still to be undertaken, and risk and manufacturing work were still at early stages. Hence, although the design might be considered as being at TRL5, overall, the project was still at TRL3 and the various elements of it were out of synchronisation with each other. In particular, there was a concern that when a satisfactory method of manufacture had been found, then substantial redesign and retesting might be needed.

Horizontal bar graph illustrating the results from a TRL assessment, with bars for Understanding & analysis of markets & applications, Level of detail of product engineering, Level of IP protection, etc.

Figure 3.8 Example of results from a TRL assessment.

3.13 Synchronising Technology and Manufacturing Maturity

One of the key principles put forward in this book is that technology development can only be successful if the associated manufacturing development is carried out closely in parallel. In TRL and MRL terms, this means that MRL should never lag TRL by more than a couple of points (the best or acceptable lag is a matter of some debate).
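As a trivial illustration of that rule of thumb, the check below flags a project whose manufacturing readiness has fallen more than a chosen number of points behind its technology readiness. The two‐point default is only an assumption based on the ‘couple of points’ wording above; the right figure is a matter of judgement for each organisation.

```python
def mrl_lagging(trl, mrl, max_lag=2):
    """True if manufacturing readiness has fallen too far behind technology readiness.

    max_lag = 2 reflects the 'couple of points' rule of thumb discussed above;
    each organisation will need to set its own acceptable lag.
    """
    return (trl - mrl) > max_lag


print(mrl_lagging(trl=6, mrl=3))  # -> True: manufacturing work needs to catch up
```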

This ideal relationship is illustrated in Figure 3.9. The graph also illustrates what can happen if technology development runs too far ahead of the underpinning manufacturing work. Typically, it is discovered that the design, which has been detailed and tested, cannot be made economically, needs to be modified to facilitate manufacture, or needs to be made from a somewhat different material. Redesign is then necessary, resulting in the TRL level regressing and development work having to be repeated. Interpersonal factors can often then intervene, recrimination sets in, and the project never quite recovers.

Graph of TRL level vs. MRL level displaying an ascending dashed line for TRL ideal and a fluctuating solid curve for TRL problematic.

Figure 3.9 TRL versus MRL relationship.

The key point is always to ensure early assessment of manufacturing, supply chain, and business parameters in the low‐TRL stages of a new development (see, e.g. Ref. 9).

3.14 Limitations of Technology Maturity Assessment

One point to bear in mind when conducting maturity assessments is the consistency of TRL definitions. There has been much debate about how the various TRL and MRL levels should be defined. Different industries and different companies have taken their own approaches. Hence, the TRL or MRL declared by one organisation may not be directly comparable with one declared by another. The difference, however, is only likely to be ±1 TRL or MRL level.

However, the most obvious drawback is that maturity assessment is essentially backward‐looking. It considers how far a technology has progressed, which is useful information in judging whether enough effort has been expended to reduce the risk of future problems. However, this says nothing about the future – as the financial world says in its standard disclaimer, ‘past performance is no guarantee of future results’. Hence, some caution needs to be exercised in considering a new technology to be problem‐free simply because it has reached a certain maturity.

The assessment process should reduce the risk of future problems but it will not eliminate them. For example, it is not unusual for a technology that has worked well in one application or environment to be problematic in another where the duty cycle or operating environment is different. The engineering development process trials the new technology in an increasingly realistic environment, as progress is made through the TRLs, but it is not until the real operating environment is reached using production material that victory can be declared.

One way of addressing this issue is, at the same time as assessing current technology maturity, to undertake an expert review of future problems and risks, methods of identifying them, and measures that can then be taken to overcome them. Again, the aim is to reduce risk whilst acknowledging that risk can never be totally avoided. An approach to this forward‐looking assessment has been codified under the title ‘Advancement Degree of Difficulty’, or AD2. Chapter 7 looks in some detail at risk identification and avoidance.

3.15 Concluding Points

The principle that new technologies pass through identifiable phases of development is an important one to grasp when considering how to manage and plan the development of new technologies. There is an element of universal truth about the concept – rather like human development from cradle to grave, there are elements one would prefer to avoid. This, of course, is not possible, and all the examples from engineering history show that short‐cuts cause problems – the process can certainly be made more efficient, but all phases of maturity have to be worked through. The TRL and MRL scales give a common language for this process and, in particular, can be used as a means of communication with nonexpert parties such as general managers or investors.

References

The first five references provide background to the original development of the TRL system within the US space industry:

  1. NASA Headquarters Oral History Project, Edited Oral History Transcript, Donna L. Shirley, interviewed by Carol Butler, Norman, Oklahoma, 17 July 2001, https://www.jsc.nasa.gov/history/oral_histories/NASA_HQ/Herstory/ShirleyDL/ShirleyDL_7‐17‐01.htm
  2. Sadin, S.R., Povinelli, F.P., and Rosen, R. (1988). NASA technology push towards future space mission systems. Acta Astronautica 20: 73–77.
  3. Mankins, J.C. (1995). Technology Readiness Levels: A White Paper. NASA, Office of Space Access and Technology, Advanced Concepts Office.
  4. Tugurlan, C., Kirkham, H., and Chassin, D. (2011). Software Technology Readiness for the Smart Grid. Pacific Northwest National Laboratory, Advanced Power and Energy Systems, Richland, WA. PNSGC Proceedings.
  5. Nolte, W.L. and Bilbro, J.W. (2008). Did I Ever Tell You About the Whale?: Or Measuring Technology Maturity. Information Age Pub Inc.

The UK automotive industry has developed a system for its specific needs:

  6. Automotive Technology and Manufacturing Readiness Levels: A Guide to Recognised Stages of Development within the Automotive Industry, Automotive Council UK, January 2011.

The only formal international standard is:

  7. ISO 16290:2013 Space Systems – Definition of the Technology Readiness Levels (TRLs) and Their Criteria of Assessment.

A spreadsheet method of TRL estimation is provided in this reference:

  8. Technology Readiness Level Calculator, Air Force Research Laboratory.

This document provides an approach for using TRL estimation in the context of early‐stage investment:

  9. Measuring Technology Readiness for Investment, The Manufacturing Technology Centre & Heriot‐Watt University, March 2017.