12
Decision‐Making and Problem Solving

12.1 Introduction

As has been frequently stated in this book, the engineering development process revolves around iteration and learning. New facts are continually emerging, and therefore decisions are constantly being made. Problem solving and decision‐making are, from one point of view, just everyday elements of technology development work. Some of these decisions will be minor and hardly be recognised as decisions at all. Others will be substantial, requiring careful thought and objective analysis before a conclusion is reached.

The ability to apply the mind well and solve problems is vital to work of this nature. The processes of ‘critical thinking’ are therefore very relevant.

However, reaching the right decision, if there is such a thing, may not be quite as straightforward as might be thought. As will be shown, the human decision‐making process is subject to all sorts of biases and prejudices that get in the way of choosing the best way forward.

Research in the field of behavioural economics has provided some of the best insights into how the human mind works in this respect. This research has shown that human beings, when presented with ‘rational’ economic choices, behave far from rationally. The findings of this research apply across the spectrum of human endeavour, but particularly to technology work.

The other field that impinges on technology work is statistics. ‘Statistical thinking’ has revolutionised the approach to manufacturing over recent decades. This approach, which began in Japan under the guidance of the American statistician Dr W. Edwards Deming, treats manufacturing as a series of processes whose characteristics can be expressed in statistical terms, and it has radically changed the management of those processes (Ref. 1). In particular, it has produced that elusive combination of better and faster results at lower costs. As a result, many companies now train their people in Six Sigma, probably without realising the true meaning of the term.

Elements of the same statistical thought processes should be applied to technology and product development work. Of course, the quantities involved in this work, often ‘one‐offs’, are much lower than in manufacturing, which limits the extent to which detailed statistical analysis can be applied. However, this does not prevent the use of the principles of statistical thinking.

The field of decision‐making is one where some level of awareness is beneficial to everyone involved in technology development. This chapter aims to provide a start point for study of this field. As with many of the topics covered by this book, there are much more detailed, and rigorous, publications on this subject.

12.2 Decisions to be Taken

As noted above, decisions permeate development processes. Decisions are required in many areas, such as:

  • Choice of project to pursue
  • Technical route to be followed
  • Drawing conclusions from customer data
  • Choice of solutions to problems encountered
  • Type of analysis or test work to be undertaken
  • Whether to continue to invest money
  • Allocation of work among individual engineers
  • Selection of people for employment

Some of these decisions will be taken in a few moments by one individual. Others may be the subject of detailed and documented analysis for review by a senior company committee. The point is that there are thousands of decisions, small and large, short‐term and long‐term, to be dealt with in a development programme – all potentially subject to the vagaries of the human mind. This raises the question of how those decisions can best be made.

In principle, any significant decision should be made on the basis of a cost−benefit analysis. Costs can usually be calculated but benefits, which could be nonfinancial, may need more thought. However, the principle stands. Alongside costs and benefits, the opportunity costs should also be considered. All decisions come with an opportunity cost in terms of alternative uses for money, resources, or expertise, and hence the lost benefits of an alternative course of action. Rational evaluation of these points should underpin every decision.
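The evaluation described above can be sketched in a few lines of code. This is a minimal illustration, not a prescribed method: the option names and all figures are invented, and the opportunity cost is taken, as in the text, to be the net benefit forgone from the best alternative use of the same money and resources.

```python
# Hypothetical cost-benefit comparison of two competing projects.
# All names and figures are invented for illustration (values in £k).

def net_benefit(benefit, cost):
    return benefit - cost

options = {
    "project_a": {"benefit": 500, "cost": 300},
    "project_b": {"benefit": 420, "cost": 200},
}

# Net benefit of each option
nets = {name: net_benefit(o["benefit"], o["cost"]) for name, o in options.items()}
best = max(nets, key=nets.get)

# Opportunity cost of the chosen option: the net benefit forgone
# from the best alternative course of action.
opportunity_cost = max(v for k, v in nets.items() if k != best)

print(best, nets[best], opportunity_cost)
```

Note that the option with the higher gross benefit is not necessarily the one with the higher net benefit once costs are included, which is precisely the point of the analysis.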

12.3 Critical Thinking

The study of rational processes of human thought can be traced back more than two and a half thousand years to Greece and the times of Socrates, Plato, and Aristotle. ‘Critical thinking’ seems to be a more modern term, although the word critical has Greek origins. In today's world, it is a recognised term relating to the objective and rational evaluation of an issue in order to draw a sound conclusion. The importance of the topic is reinforced by the existence, in the United States, of the US National Council for Excellence in Critical Thinking and, around the world, numerous other centres and hubs for this topic. In academic circles, the subject is taught quite widely and forms the basis of rigorous academic analysis.

Another US organisation, the Foundation for Critical Thinking, based in California, describes the proficient critical thinker as someone who:

  • Raises vital questions and problems, formulating them clearly and precisely
  • Gathers and assesses relevant information, using abstract ideas to interpret it effectively
  • Comes to well‐reasoned conclusions and solutions, testing them against relevant criteria and standards
  • Thinks open‐mindedly within alternative systems of thought, recognising and assessing, as needs be, their assumptions, implications, and practical consequences
  • Communicates effectively with others in figuring out solutions to complex problems

To these points, it could be added that the proficient thinker is also aware of human fallibilities in reaching decisions and is competent in the use of statistical methods in arriving at sound conclusions.

These rather philosophical points underpin the following discussion about decision‐making in the field of technology development.

12.4 System 1 and System 2 Thinking

In the work by Daniel Kahneman referenced in the previous chapter (Ref. 2), the concept of two distinct modes of human thinking is described, based on classifications developed by the psychologists Keith Stanovich and Richard West. These two modes are described, somewhat unoriginally, as System 1 and System 2.

System 1 looks after short‐term reactions to situations as they develop. It is instinctive and fast. Some of its skills are ‘built‐in’, such as responding to a loud bang or ducking a flying object. Other skills are acquired through learning but become instinctive: language, car driving, or playing musical instruments. System 1 reacts to incomplete information so it is fast but imperfect. It links to the choice of ‘fight or flight’.

System 2 is slower but essentially rational. It relies on deeper thought, analysis, and reasoning. If applied, it can in effect override the instinctive reaction of System 1.

There is some academic criticism of this simple model; some argue, for example, for four distinct systems, and others for a graduated transition from one mode of thinking to another. However, the two‐system model does correspond with general human experience, and it is useful to bear in mind when faced with a pressurised decision‐making situation.

It might be argued that technology development work is entirely the domain of System 2. However, in the hurly‐burly world of engineering delivery projects with tight timescales, hasty decisions, subsequently regretted, are not unknown.

An extension of this model concerns the unconscious mind (Ref. 3). We are all familiar with the experience of finding the solution to a knotty problem (at least the principles, if not the detail) after ‘sleeping on it’. Studies of people who are regarded as very creative all report flashes of inspiration, almost from nowhere, relating to problems or topics that have often been under review for weeks and months. Psychologists have concluded that there is such a thing as the unconscious mind and that it acts in the background in a very powerful manner. It registers much more environmental information than the conscious mind can process and seems to be capable of solving problems that are beyond the reach of the conscious mind.

The relevance of these points lies in gaining greater understanding of the mental processes at work when attempting to manage technology development. This understanding may be helpful in achieving better results from the work.

12.5 Human Barriers to Decision‐Making

In addition to the points made above, it needs to be noted that the human mind is susceptible to all sorts of biases and fallacies which may obstruct logical thought, especially on complex issues. Many of these stem from a human tendency to form quick conclusions based on imperfect evidence and a tendency to gross overgeneralisation based on a small amount of information. This weakness probably derives from ancient survival instincts. Some of those relevant to the work of this book are described below:

  • Confirmation bias. We have preconceived views on almost every topic we encounter. These preformed hypotheses tend, then, to bias the way we look at information, emphasising data that supports our view (perhaps subconsciously) and filtering out (again subconsciously) data that go against our view. An extension of this bias is conducting experiments that only test one hypothesis and don't therefore allow other conclusions to be reached.
  • Post hoc fallacy. Described fully as post hoc ergo propter hoc (translated as ‘after this so because of this’), this logical fallacy is the erroneous connection of one event to a preceding event, especially if they are very close in time. Many of these fallacies are no more than superstitions: ‘I always play better football if I tie my left laces first’, for example. In engineering work, problems can arise in rapid succession, and it is important to understand which are related and which just happened to occur very closely in time.
  • Sunk cost fallacy. When a lot of money has been spent on a development, there is a natural reluctance to abandon it, even if the prognosis is that the project will not lead to a successful outcome. The decision about the future of a project should be based on future costs and revenues, independent of sunk costs. If the prognosis is poor, don't throw good money after bad and accept that the earlier investment was wasted.
  • Status quo bias. Another natural human tendency, when faced with a difficult choice, is to revert to the status quo, the security of what is already known, the safe option. It is linked to two other recognised biases: loss aversion (potential losses being more highly rated than potential gains of the same magnitude, generally by a factor of about 2) and endowment effect (a preference for keeping hold of what is already possessed). As the terms imply, the research in this area derives from behavioural economics. In technology work, it manifests itself in a reluctance to make changes when evidence is pointing to the need for change. The expression ‘bite the bullet’ comes to mind.
  • Attribution error. This well‐known phenomenon concerns the way we judge people's behaviour, and is therefore relevant to successful teamwork and the management of people. In essence, it is the tendency to judge others' behaviour as a consequence of who they are, intrinsically, rather than as a product of the situation in which they find themselves. Conversely, we judge or explain our own behaviour more as a reaction to circumstances and hence more favourably. A one‐hour car drive, with the usual encounters with ‘idiot drivers’, will illustrate the point.
  • Reliance on second‐hand information. This is not a recognised human fallacy, in the world of social psychology, but is an issue frequently experienced by the author. When discussing problems and making decisions, it is surprising how frequently and confidently people are prepared to form opinions without any first‐hand experience. It is just as surprising to see the effect of experiencing issues first‐hand and the changes it often brings about to those opinions. A useful dictum is to refuse to discuss any issue before experiencing it directly.

The six points described above can all come into play, not usually together, when making decisions. They can get in the way of making effective and often difficult decisions. Hence, when faced with a difficult decision, it is sometimes a good idea just to stand back and reflect on whether the thinking about that decision is being biased by any of these points.

12.6 East versus West

Whilst discussing mind‐sets and decision‐making, it is worth reflecting on the differing approaches taken in Eastern countries – Japan, China, and South Korea – compared with those taken in Western countries – Europe and the United States. In his book The Geography of Thought (Ref. 4), Richard Nisbett suggests there are fundamental differences in the way that East Asians think about issues and decisions, compared with Westerners. Anyone with experience of engineering business in these two cultures will know this to be the case, although those with this experience will also know that there are wide variations between:

  • Western countries, e.g. United Kingdom, United States, Germany, France, and Italy
  • Eastern countries, e.g. India, China, Japan, South Korea, and Malaysia
  • Specific companies, with their individual cultures
  • One person and another

Westerners are brought up and educated, whether they know it or not, on the basis of rational logic. Facts are gathered, logical truths follow and, if a contradiction occurs, further analysis will resolve the rights and wrongs of the contradiction. Once these points have been dealt with, the way is clear to proceed! So, if A = B, and A = C, then B = C. Hard logic of this type leaves no room for ambiguity. This approach is the basis of scientific theory, the development of which over the past 400 years has been undertaken largely in the Western world. It is interesting to note, however, that much recent development in quantum mechanics, for example, is based on theories where more than one answer is possible, e.g. wave‐particle duality, uncertainty principles and entanglement, and hence where classical logical principles appear to be overruled.

Eastern traditions, on the other hand, place much more emphasis on the environment or context surrounding the ‘facts’. They are more willing to live with contradictions, to develop a point of view by dialogue (so‐called dialectical reasoning), and to maintain some level of ambiguity as events proceed. Events inevitably bring change, and hence a new ‘answer’ becomes appropriate. There is also a greater willingness to explain human behaviour through the context in which an individual is operating, rather than through the intrinsic traits of that individual. Some Eastern countries also place much greater emphasis on data and use the gathering and discussion of data as a means of building consensus before taking action. In other Eastern countries, ‘loss of face’ is a major driver of behaviour.

It is interesting to consider the extent to which the recent success of Eastern countries, especially Japan, in technology and product development has been based on this mind‐set. It is certainly the case that many Western companies have changed their priorities in the last two decades, taking a more process‐centric approach and thereby adopting many of the Japanese principles of product development. It would seem that the most successful organisations are likely to be those which can make balanced use of both the Eastern and Western traditions.

However, it is difficult to be entirely prescriptive on this topic, and the most important preoccupation for any engineer going into a new, international cooperation is to be aware that significant cultural differences exist from country to country, company to company, and individual to individual.

12.7 Statistical Thinking

A good definition of statistics is ‘that branch of mathematics dealing with the collection, analysis, interpretation, presentation, and organization of data’ (Ref. 5). The subject is usually associated with large quantities, or series, of data from which inferences can be drawn. It is certainly the case that technology and engineering work will generate data sets in various circumstances to which classic statistical methods can readily be applied. Examples might include:

  • Data from undertaking customer surveys
  • Time series information from data logging on a prototype machine
  • Data from manufacturing operations over a period of time
  • Warranty and complaint information from products in the field
  • Increasingly, data from ‘connected’ products out in the field

In larger companies, engineers trained in statistical methods will be able to analyse such information and make good use of it.

However, it is also often the case that some data are available, but in very small quantities, and decisions still have to be made. Applying a ‘statistical mind‐set’ often helps such decisions. Lawrence Summers, the sometimes controversial former president of Harvard University, stated: ‘In an earlier era, almost every student entering a top college knew something of trigonometry. Today, a basic grounding in probability, statistics, and decision analysis makes far more sense’ (Ref. 6). These are wise words.

First, consider that information or data can usually be grouped into ‘populations’ of related information. The five groupings noted above, which could contain substantial amounts of data, could be considered as populations. It might also be a small grouping, such as a number of separate R&D projects or a small number of candidates for a job.

Large populations are normally analysed or characterised through samples of data, which clearly have to be representative of the population as a whole. Populations can then be characterised by parameters such as the mean and standard deviation. In the engineering world, dispersion or wide variability in the data about the mean usually causes problems. For example, in a manufacturing process, there is a need to hit a dimension on a drawing accurately, within narrow tolerances. There will be some process variability but it should be minimal, and one of the constant objectives of manufacturing engineering work is to reduce variability. This can be done by understanding the details of the process, including the causes of variability, and developing improvements.
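The characterisation described above can be illustrated with a short sketch. The measurements below are invented (a nominal 25.00 mm dimension), and the use of a ±3 standard deviation band as the process spread is one common convention, not the only one.

```python
# Characterising a sample from a manufacturing process.
# Measurements are invented for illustration (nominal dimension 25.00 mm).
from statistics import mean, stdev

measurements = [25.02, 24.98, 25.01, 24.99, 25.00, 25.03, 24.97, 25.00]

m = mean(measurements)
s = stdev(measurements)  # sample standard deviation

# Suppose the drawing allows a tolerance of ±0.05 mm. One common check
# is whether the process spread, taken here as ±3 standard deviations,
# fits inside the tolerance band.
tolerance = 0.05
process_spread = 3 * s

print(round(m, 3), round(s, 3), process_spread < tolerance)
```

In this invented example the mean is on target but the spread exceeds the tolerance band, so the priority would be to find and reduce the causes of variability rather than to adjust the process setting.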

Turning to small populations of data, analysis and presentation of key facts requires more thought. This was illustrated when the causes of the Space Shuttle Challenger disaster were analysed. The issue came down to the low‐temperature performance of the O‐ring seals in the joints between the different sections of the booster rockets. There had been some failures on previous missions. Putting aside all the issues of management culture exposed by the enquiry, and with the benefit of hindsight, a simple presentation (and a willingness to take it seriously) of failure rate versus ambient temperature (Figure 12.1) might have clarified that a serious problem existed.


Figure 12.1 Challenger O‐ring failure rate versus temperature with simple linear extrapolation (exponential extrapolation is worse).

This is an unfortunate but clear example of a situation where an understanding of data and statistical thinking would be beneficial to everyone.
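A simple trend fit of the kind suggested by Figure 12.1 can be sketched as follows. The data points below are invented for illustration and are not the real Challenger records; the point is only that a least‐squares line through failure counts versus temperature, extrapolated to the forecast launch temperature, points clearly in the wrong direction.

```python
# Least-squares linear fit of failure count versus temperature,
# extrapolated to a cold launch day. Data values are invented.
temps = [53, 57, 63, 70, 70, 75]    # launch temperature, °F (invented)
incidents = [3, 1, 1, 1, 0, 0]      # O-ring distress count (invented)

n = len(temps)
mx = sum(temps) / n
my = sum(incidents) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(temps, incidents))
         / sum((x - mx) ** 2 for x in temps))
intercept = my - slope * mx

# Extrapolate to a much colder forecast launch temperature of 31 °F
predicted = slope * 31 + intercept

print(round(slope, 3), round(predicted, 1))
```

Even this crude linear extrapolation predicts a failure count at 31 °F well above anything previously observed; as the caption of Figure 12.1 notes, an exponential extrapolation would be worse still.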

A more positive, and older, example of the clear presentation of data can be found in the work on the causes of cholera undertaken in the nineteenth century by Dr John Snow. At that time, cholera and a number of other diseases were thought to be caused by ‘miasma’, some sort of mysterious heavy vapour. Snow had first experienced a cholera outbreak in 1832. After an outbreak in London in 1849, he published On the Mode of Communication of Cholera, proposing that cholera was caused by contaminated drinking water, but his theory was not accepted. He then studied a further outbreak in 1854, centred on Soho in London, in which over 600 people died. He plotted the exact locations and numbers of the deaths on a map (Figure 12.2). In 1855, he published a second, much expanded edition of his work.


Figure 12.2 Map by John Snow showing the cholera cases in the 1854 London epidemic.

Source: Re‐drawn by Charles Cheffins.

Close inspection of the nineteenth‐century plot shows quite clearly that the outbreak centred on a water pump at the junction of Broad Street and Cambridge Street. Snow's further and careful analysis also explained why some nearby places had no outbreaks (the brewery, for example, which had its own water supply) and why someone in Islington suffered (she liked the taste of the Broad Street water and had it ferried in). A twenty‐first‐century replot of the data, with colours and various sized dots, shows an even clearer picture.

More examples of how best to present numerical data are given in Ref. 7.

12.8 Application to Management Processes

Some of the same principles could be applied to management processes. Take recruitment as an example. Studies have shown a 0.1 correlation between interview results and success in the job, i.e. the two are only very weakly related. This is hardly surprising – the sample here is a one‐hour interview versus, as an alternative, data samples from the real work a person may have undertaken over many years. If an organisation's recruitment process produces highly variable results in terms of the effectiveness of new recruits, then place more emphasis on finding out how effective candidates have been in their previous work and less emphasis on interviews.

As a slightly humorous example to illustrate how the wrong sample can mislead, the author has had one 20‐minute sample of the golfing prowess of the Japanese golfer Tsuneyuki (‘Tommy’) Nakajima. It happened to be at the 17th hole in the third round of the Open Championship at St. Andrews in 1978. It took him nine shots to sink the ball into the hole after having some trouble in a well‐known bunker. Based on that small sample, he would have trouble getting into a game with a normal Saturday morning fourball, let alone entry to professional competitions. He was, of course, one of the most successful Japanese golfers ever.

This example can also be used as an illustration of the concept of ‘regression to the mean’. This concept states simply that, after one extreme measurement, the next measurement is likely to be much closer to the mean. Tommy confirmed this principle by doing much better on the 18th hole. The point is that extreme results can occur even in stable processes and there should be no overreaction until there is more evidence that a process has radically changed.
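Regression to the mean is easy to demonstrate by simulation. The sketch below, with invented parameters, draws samples from a stable process and then looks at what follows unusually high readings: the follow‐up observations average close to the process mean, not close to the extreme, even though nothing about the process has changed.

```python
# Simulating a stable process to illustrate regression to the mean.
# The mean and standard deviation below are arbitrary choices.
import random

random.seed(1)
process_mean, sd = 100.0, 10.0
samples = [random.gauss(process_mean, sd) for _ in range(10_000)]

# Collect the observation immediately following each unusually
# high reading (more than 2 standard deviations above the mean).
followers = [samples[i + 1] for i in range(len(samples) - 1)
             if samples[i] > process_mean + 2 * sd]

avg_follower = sum(followers) / len(followers)
print(round(avg_follower, 1))  # close to 100, not to 120
```

This is the statistical basis for not overreacting to a single extreme result: unless further evidence shows the process itself has changed, the next result is likely to be unremarkable.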

Another example of a typical engineering management decision, where previous data should be taken into account, concerns product introduction timing. There is always pressure to speed up the process but it is worth reflecting on how long previous projects have actually taken (not what they should have taken) from approval to actual introduction – including an estimate of the mean and standard deviation, even if there are relatively few data points. If the proposed introduction timing is radically faster than has been achieved previously, and especially if there is no strategy to achieve it other than ‘trying harder’, then a statistical review of the proposal might suggest a different decision.
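The timing check described above can be made concrete with a few lines of arithmetic. The durations below are invented; the idea is simply to express the proposed timing as a number of standard deviations faster than the historical mean, which frames the question statistically rather than optimistically.

```python
# Comparing a proposed introduction time against the record of
# previous projects. All durations are invented (months, approval
# to actual introduction).
from statistics import mean, stdev

past_durations_months = [30, 34, 28, 36, 31, 33]
m, s = mean(past_durations_months), stdev(past_durations_months)

proposed = 22
# How many standard deviations faster than the historical mean?
z = (m - proposed) / s

print(round(m, 1), round(s, 1), round(z, 2))
```

A proposal more than three standard deviations faster than anything previously achieved, with no strategy beyond ‘trying harder’, should invite serious challenge.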

The overriding point of the sections above is that all decisions are based on information and, usually, numerical data. It is very helpful to frame information, data, and decisions in a way that is consistent with statistical principles and to use these principles to challenge the potential solutions being put forward.

12.9 Problem Solving – A3 Method

Problem solving and decision‐making are closely linked. Fortunately, there are some very straightforward methods of helping the process of solving problems. A quick scan of the internet will reveal numerous methodologies, usually designed around a certain number of steps, where that number can be anything between 4 and 10.

All methods follow a similar pattern of identifying the problem, characterising it, coming up with solutions, trying them out, and confirming a workable solution. This is essentially a slightly longer version of the iterative process outlined in Chapter 2.

A good method, which is widely used and may have its origins in Toyota, is based on an A3 sheet of paper – see Ref. 8 and Figure 12.3 for the format. This approach makes it easy to display the work in, for example, a team room so others can see what is happening and make a contribution. As shown in Figure 12.3, there are nine key steps:

  1. What has caused a problem to be identified and recognised, i.e. what are the symptoms of the problem? These may be described in nontechnical or semi‐technical terms, such as ‘the bracket holding the fluid reservoir has broken’.
  2. How is the problem defined? This may require some further observation, data gathering, and analysis; the more thorough and technical the definition, the better the problem can be understood. An example might be: ‘the fluid reservoir mounting bracket has failed through fatigue cracking between two bolt holes after 10⁵ cycles of the endurance test to test code xxx’.
  3. What should the situation be when the problem has been overcome? Again, good definition will help derive a solution and will give a point of reference against which to test solutions; for example: ‘the bracket should withstand 10⁶ cycles of the standard test’.
  4. What root causes can be established as the source of the problem? Arguably, this is the most important step; if the root cause(s) can be established accurately, a well‐designed solution is far more likely. In the example above, the root cause might be a manufacturing method that introduces surface defects from which cracks can develop and propagate.
  5. What solution(s) can be proposed? There may be several; in fact, a choice of solutions is ideal. Solutions to the above problem might include different manufacturing methods, or some form of surface treatment, or redesign to spread the loads more evenly.
  6. How are the solution(s) to be taken forward, in terms of review, analysis, testing, and implementation? The different solutions might be costed and analysed.
  7. What results have been obtained from the development of the solutions, and what choice should therefore be made?
  8. How has the chosen solution been shown to be effective in meeting the requirements defined in (3)?
  9. What lessons have been learnt for future use?

Figure 12.3 One‐page A3 problem‐solving model.

This process does not guarantee a solution, but it does provide a sensible and open structure for dealing with problems. By making the process explicit, it helps other people, including those with more experience, to make a contribution. The focus is on a clear definition of a problem and an organised approach to solving it.

The example above relates to a development test and therefore a limited number of test samples, possibly only one. If it was a problem in service, the approach would be somewhat different and would have to take into account a wider population of components. Further questions might include:

  • The number of components in service
  • The failure rate amongst those components
  • Whether the failure occurs uniformly across the population or whether it is related to a particular batch
  • Whether the failure is linked to a specific user or duty cycle
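The batch question in the list above lends itself to a simple numerical check. The sketch below uses invented batch names and counts; it computes a failure rate per batch and flags any batch whose rate is well above the overall rate (the factor of two used as the threshold is an arbitrary illustrative choice).

```python
# Grouping field failures by manufacturing batch to see whether the
# problem is uniform or batch-related. All names and counts invented.

# (batch, units_in_service, failures)
field_data = [
    ("batch_A", 400, 2),
    ("batch_B", 350, 1),
    ("batch_C", 420, 19),
    ("batch_D", 380, 2),
]

rates = {batch: failures / units for batch, units, failures in field_data}
overall = (sum(f for _, _, f in field_data)
           / sum(u for _, u, _ in field_data))

# Flag batches whose failure rate is well above the overall rate
suspects = [b for b, r in rates.items() if r > 2 * overall]
print(suspects)
```

A clearly batch‐related failure points towards a manufacturing or material cause confined to that batch, whereas a uniform rate across batches points towards design, duty cycle, or usage.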

There are also supplementary methodologies that can be used to support some of these stages. For example, the ‘5 whys’ method is a very good way of moving from symptoms to root causes. This method, very simply, consists of successively asking the question ‘why’ to unearth the root cause of a problem. Asking the question five times usually works, although it may require anything between 3 and 10 cycles. For example, in the instance above:

  1. Why did the bracket fail? – fatigue cracking
  2. Why did fatigue cracks occur? – surface imperfections
  3. Why were there surface imperfections? – insufficient clamping force during machining operations
  4. Why did the clamping problems occur? – lack of previous experience of this type of operation
  5. Why was there a lack of experience? – peer review of machining operation was cut short

This example is somewhat trite but it does illustrate the process. There is some criticism that ‘5 whys’ can be oversimplistic and one‐dimensional, particularly when problems are more complex, but it is a good starting point for thinking about the causes of problems. It focuses attention on the underlying causes of problems and not just the symptoms. Although it might not provide a complete answer, it is always worth thinking through before undertaking a more complex analysis.
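The chain of questions above can be represented as a simple data structure. The sketch below is purely illustrative: it records each symptom‐to‐cause link from the bracket example and follows the chain until no deeper cause is recorded, with a depth limit reflecting the observation that anything between 3 and 10 cycles may be needed.

```python
# Minimal representation of a '5 whys' cause chain. The links below
# mirror the bracket example in the text and are purely illustrative.
causes = {
    "bracket failed": "fatigue cracking",
    "fatigue cracking": "surface imperfections",
    "surface imperfections": "insufficient clamping force in machining",
    "insufficient clamping force in machining": "no previous experience of the operation",
    "no previous experience of the operation": "peer review of machining was cut short",
}

def five_whys(symptom, cause_of, max_depth=10):
    """Follow the chain of recorded causes until no deeper cause exists."""
    chain = [symptom]
    while chain[-1] in cause_of and len(chain) <= max_depth:
        chain.append(cause_of[chain[-1]])
    return chain

chain = five_whys("bracket failed", causes)
print(len(chain) - 1, chain[-1])  # number of 'why' steps, deepest cause
```

A single linear chain like this is exactly the one‐dimensionality that critics point to; for more complex problems, each ‘why’ may branch into several contributing causes.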

It is also important to appreciate that some problems are relatively straightforward whereas others are more intractable. Straightforward issues can often be solved by a single individual following a logical and linear process, working through the calculations to a single solution. A lot of engineering development falls into this category. Other problems may be more difficult to resolve within the constraints of performance requirements, material properties, space availability, procedural requirements, cost, and manufacturability. These latter will require a range of ideas from different people and may result in a range of solutions from which the best compromise is chosen.

12.10 Creative Problem Solving – TRIZ Method

The method just described relates to well‐defined, or bounded, problems to which a specific answer is sought or where a specific decision is required. When it comes to more creative work, where a creative engineering or technological solution is required, the TRIZ method (Ref. 9) has been widely taught and used. It is aimed at finding innovative solutions to problems where the root cause is largely understood. Rather than relying just on the instinctive creative talents of the engineer or scientist working on a problem, the method is intended to give structure to an otherwise unstructured process.

TRIZ has its origins in Soviet Russia, where it was first developed, after the Second World War, by Genrich S. Altshuller. Despite spending time in a labour camp during the Stalin era, he persisted in developing a methodical approach to solving technical problems, based on his thousands of observations of inventions described in patents. (Like Albert Einstein, he worked in a patent office for a period.) The method did not emerge from the former Soviet Union until the 1990s, since when it has developed a significant following and is taught quite widely.

In Russian, the acronym TRIZ stands for Teoriya Resheniya Izobreatatelskikh Zadach, which roughly translates as ‘Theory of Inventive Problem Solving’. It has a focus on problems that involve contradictions or trade‐offs. For example, making something stronger could involve also making it heavier – so how can the strength be improved without a corresponding weight increase?

The method has been criticised as being difficult to follow; there are about 40 different solution types that can be used, and it is not always clear which should be used in a particular situation. However, there are route maps to guide the process, and the reference by Gordon is particularly good in this respect (Ref. 10).

Altshuller's original work identified five levels of problem difficulty and solution sources. Level 1 is the simplest, where generally used solutions are already known within the industry concerned. Level 3 problems use known solutions from another industry, whilst Level 5 requires new scientific discoveries, e.g. solid‐state semiconductors in place of thermionic valves.

The process advocated by Cameron starts with an analysis of the problem via the ‘classic’ route outlined in the previous section, with its emphasis on identifying and structuring the problem. It only proceeds along the TRIZ route if a solution cannot be found by standard methods. Summarised in Figure 12.4, the first step of the TRIZ route is to assign the problem to one of four problem classes.


Figure 12.4 TRIZICS solution roadmap.

There is then a series of standard TRIZ tools for analysing the problem according to its type. Specific problems are then formulated from this analysis, often in the form of a ‘contradiction’ statement. The classic TRIZ tools can then be applied to find solutions – for which there is also a solution bank.
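The sequence just described – classify the problem, analyse it with the tools for that type, formulate a specific problem (often a contradiction statement), then apply the classic tools against the solution bank – can be sketched as a simple routing step. The four class names and their tool lists below are placeholders, not Cameron's actual categories, although tools such as cause–effect chain analysis and substance–field analysis are genuine TRIZ techniques.

```python
# Sketch of the roadmap's first routing step: problem class -> analysis tools.
# Class names and tool assignments are placeholders for illustration only.

ANALYSIS_TOOLS = {
    "class_1": ["cause-effect chain analysis"],
    "class_2": ["functional analysis"],
    "class_3": ["substance-field analysis"],
    "class_4": ["trends of technical evolution"],
}

def select_analysis_tools(problem_class: str) -> list[str]:
    """Map a classified problem to its standard analysis tools.

    The outputs of these tools would then be formulated as specific
    problems (often contradiction statements) before the classic TRIZ
    tools and the solution bank are applied.
    """
    try:
        return ANALYSIS_TOOLS[problem_class]
    except KeyError:
        raise ValueError(f"unknown problem class: {problem_class!r}")

print(select_analysis_tools("class_2"))
```

The design point is simply that classification comes first: choosing the wrong analysis route wastes the effort spent on the tools downstream of it.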

As can be seen from Figure 12.4, the TRIZ process is quite intricate and has a huge toolkit of analytical methods, all based on the original patent analysis. It clearly requires extensive training and practice to be effective, but it has a strong and enthusiastic following.

12.11 Concluding Points

The aim of this chapter has been to point out the frequency and importance of decision‐making and problem‐solving in technology development work. Decisions or choices must often be made under pressure with imperfect information and small sample sizes. Even without these pressures, decision‐making is subject to a wide range of human foibles, biases, and fallibilities. Framing and structuring a problem or decision in the right way is the first step to improving the quality of the process. At the same time, an awareness of these issues, and of some of the common pitfalls, might give pause for thought rather than just blundering on. For everyone involved in work of this type, some study of this topic is likely to be beneficial. An appreciation of statistical methods and thinking is particularly valuable, especially in terms of reacting to ‘bad news’ and distinguishing whether something has fundamentally changed or whether the bad news is just an extreme version of what is already known. Finally, clear methods of presenting numerical data can often make the difference between acceptance and rejection of potentially sound ideas.

References

This book by W. Edwards Deming is one of the classics of management thinking. It places an emphasis on statistical thinking but, as much as anything, is about management philosophy. Deming still has a strong following through the Deming Institute:

  1. Deming, W.E. (1982). Out of the Crisis. Massachusetts Institute of Technology.

More about how the human mind operates is described in:

  2. Kahneman, D. (2011). Thinking, Fast and Slow. New York: Penguin Books.

Tools for smarter everyday thinking are described in:

  3. Nisbett, R.E. (2015). Mindware: Tools for Smart Thinking. New York: Farrar, Straus & Giroux.

The same author has also written about the differences between Asian and Western thinking patterns, in a book referenced in an earlier chapter:

  4. Nisbett, R.E. (2003). The Geography of Thought: How Asians and Westerners Think Differently – and Why. New York: The Free Press.

For a definition of ‘statistics’, see:

  5. http://www.merriam‐webster.com/dictionary/statistics.

Thanks to technology, it is possible to read not only the full remarks by Lawrence Summers in the twenty‐first century but also the entire text of John Snow's great work on cholera from the nineteenth century:

  6. Summers, L.H. (2012). What You (Really) Need to Know. The New York Times (Jan. 20): https://www.nytimes.com/2012/01/22/education/edlife/the‐21st‐century‐education.html.
  7. Snow, J. (1855). On the Mode of Communication of Cholera. London: John Churchill: https://archive.org/details/b28985266.

This publication contains some interesting material on how to present data:

  8. Tufte, E.R. (2001). The Visual Display of Quantitative Information. Cheshire, CT: Graphics Press.

A3 problem solving is described in more detail here:

  9. Shook, J. (2008). Managing to Learn: Using the A3 Management Process. Cambridge, MA: Lean Enterprise Institute.

Finally, the TRIZ method is unravelled in:

  10. Cameron, G. (2010). Trizics: Teach Yourself TRIZ – How to Invent, Innovate and Solve ‘Impossible’ Technical Problems Systematically. CreateSpace.