5
Philosophical Positioning of Discrete-Event Simulation and System Dynamics as Management Science Tools for Process Systems: A Critical Realist Perspective

Kristian Rotaru,1 Leonid Churilov2 and Andrew Flitman3

1Department of Accounting, Monash University, Melbourne, Victoria, Australia

2Florey Institute of Neuroscience and Mental Health, Melbourne, Australia; RMIT University, Melbourne, Victoria, Australia

3Florey Institute of Neuroscience and Mental Health, Melbourne, Victoria, Australia

5.1 Introduction

Management science (MS) has historically developed as a scientific approach to analysing management problems and making management decisions, and is distinguished from other disciplines by its application of scientific principles in the context of practical management decision making. In line with Kuhn (1970) and Meadows (1980), a scientific discipline is traditionally supported by a set of explicitly formulated assumptions underlying its approach to the main phenomena under investigation. These assumptions, in particular, encapsulate the ontological and epistemological bases of a given scientific discipline, that is, they postulate ‘the way human beings comprehend knowledge about what is perceived to exist’ (Becker and Niehaves, 2007, p. 201; Burrell and Morgan, 1979). The main phenomenon of investigation for MS is management decision making. MS achieves its aims through the use of scientific methodologies. Explicit articulation of the fundamental philosophical assumptions underlying MS methodologies is therefore an important requirement for further development of MS as a scientific management decision-making discipline. Buchanan, Henig and Henig (1998) concisely express this requirement as: ‘Our convictions about the nature of the world … should indeed be made explicit as a necessary prerequisite for any proposed decision making methodology’ (p. 343).

The importance of adequate philosophical positioning for the science of management has been traditionally recognised, in particular, on the pages of the Management Science journal. This tradition goes back to the first Editor-in-Chief of Management Science, C. West Churchman, who believed that: ‘Philosophy should be used to study serious problems like war, security, and human living’ (Churchman, 1994). Mitroff (1994) underlines the direct dependency of the quality of management theories upon the quality of the underlying philosophical notions. The importance of philosophical debates in MS was emphasised at different times by Churchman (1955), Mitroff (1972) and Hopp (2008).

Over the years simulation modelling has become clearly recognised as an effective and robust part of the MS toolkit widely used in management practice (Altinel and Ulas, 1996; Dittus et al., 1996; Lehaney, Malindzak and Khan, 2008; Brailsford et al., 2004; Peña-Mora et al., 2008). One of the most successful areas of application of simulation modelling is providing scientific tools for management thinking and decision support for process systems (Van Horn, 1971; Doomun and Jungum, 2008; Gorunescu, McClean and Millard, 2002; Haraden and Resar, 2004; Pidd, 2003a). Simulation methods have been applied in virtually every process industry – from manufacturing production lines and logistics networks to call centres, air traffic control and patient flows in hospitals (Lehaney, Malindzak and Khan, 2008; Doomun and Jungum, 2008; Brailsford et al., 2004; Greasley, 2005). Simulation of a process system involves building a valid model of such a system and, subsequently, using this model in order to gain insight into the system's functioning under alternative conditions and courses of action, thus providing scientific support for management decision-making activities.

Depending on the management decision support needs, different simulation approaches are used, potentially resulting in several alternative models for a given decision-making situation (Rohleder, Bischak and Baskin, 2007; Morecroft and Robinson, 2005; Karpov, Ivanovsky and Sotnikov, 2007; Popkov and Garifullin, 2006). In addition to the published research on the application of individual simulation approaches, there is a growing body of literature that explores the possibilities and implications for providing management decision support based on more than one simulation approach (Venkateswaran and Son, 2005; Lorenz and Jost, 2006). In the context of process systems, the existing MS literature mainly explores the possibilities of the individual, combined (Tako and Robinson, 2008, 2009; Brailsford and Hilton, 2001; Morecroft and Robinson, 2005; Lane, 2000) or even integrated use (Peña-Mora et al., 2008; Coyle, 1985; Barton and Tobias, 2000; Borshchev, Karpov and Kharitonov, 2002; Brailsford, Churilov and Liew, 2003) of discrete-event simulation (DES) and system dynamics (SD) modelling approaches.

In line with the inherent subjectivity of the very notion of a model (e.g. defined by Pidd (2003b, p. 12) as ‘an external explicit representation of part of reality as seen by people who wish to use that model to understand, to change, to manage and to control that part of reality [emphasis added]’), any given simulation approach assumes ‘a particular worldview that prescribes how a model should be developed and expressed’ (Pidd, 2003a, p. 77). These ‘worldviews’ are also referred to in the literature as modelling philosophies (Morecroft and Robinson, 2005) or modelling paradigms (Meadows, 1980; Lorenz and Jost, 2006). The idea of simulation worldviews emerged from intelligent thinking about practical management experiences (Pidd, 2004a, p. 2). This echoes a more general observation by Buchanan, Henig and Henig (1998) who emphasise that ‘the worldview that we hold determines the process we will advocate for solving a decision problem’ (p. 343).

The MS literature includes quite a number of contributions illustrating how a combination of a particular simulation engine and a corresponding worldview can be applied to facilitate management decision support in process systems (e.g. Borshchev, Karpov and Kharitonov, 2002; Popkov and Garifullin, 2006). Unfortunately, much less attention has been paid historically to the scientific philosophical assumptions underlying the use of the respective simulation worldviews. As early as 1980, Meadows (1980) noted that these assumptions are often implicit and made by modellers at a subconscious level. Almost 30 years later, despite the recognised need to make explicit the fundamental assumptions underlying the use of simulation worldviews by management scientists (Pidd, 2004a) in order to effectively apply different simulation approaches to support managers in recognising and solving specific decision-making problems (Morecroft and Robinson, 2005), there is still a clear gap in the MS literature symptomatic of a very limited understanding of the fundamental assumptions underlying various simulation worldviews.

In particular, this leads to the situation where the systematic philosophical positioning of DES and SD remains largely unexplored. The relevant literature is scarce. Specifically, Lane (1999) maps a range of diverse SD research streams into a four-quadrant framework proposed by Burrell and Morgan (1979) based on the interaction between social science and society, thus depicting four paradigms for the analysis of social theory and philosophical schools that correspond to such social theories, but not explicitly addressing epistemological foundations of SD. Mingers (2003) provides a high-level epistemological framework for characterising the philosophical assumptions underlying OR/MS methodologies and techniques, including, but not specifically targeting, DES and SD. When it comes to DES, Morecroft and Robinson (2005) compare the advances of the SD ‘philosophy of practice’ (based on Lane, 1999) with the current advances in DES, concluding that while such a philosophical position does not yet exist in DES, it might be beneficial for extending the scope of DES.

Overall, the lack of research into the philosophical assumptions underlying the use of respective simulation worldviews precludes management scientists from using simulation methodology in a truly scientific manner, an issue that goes to the core of MS as a scientific discipline aimed at supporting managers in analysing management problems and making management decisions.

This chapter sets out to contribute to MS knowledge by explicitly examining the main philosophical assumptions underlying the DES and SD simulation worldviews while utilising practical simulation experience in process systems as described in the relevant body of the published literature. More specifically,

the objective of this chapter is to investigate how critical realist philosophy of science (Bhaskar, 1978, 1979) facilitates explicit articulation of the fundamental philosophical assumptions underlying system dynamics and discrete-event simulation worldviews, thus contributing to more effective use of simulation for intelligent thinking about, and management decision support in, process systems.

The novelty and original contribution of this research is in using the elegance and power of critical realism as a philosophy of science, and, in particular, critical realist stratified ontology and abductive mode of knowledge generation, to examine explicitly the philosophical bases of DES and SD simulation worldviews. The outcomes of this research are targeted at both the manager, who is the contributor to, as well as the end user of, the simulation model of a real-world process system and, as such, could benefit from clear understanding of how management knowledge is generated through the modelling process, and the management scientist who chooses to use simulation modelling to support management decision making in real-world process systems and requires an in-depth understanding of the scientific bases of respective modelling methodologies to apply it in a truly scientific manner.

Three observations regarding the scope of this study should be made explicit at the outset. First, the differences between the study by Mingers (2003) and our study are that the former is a broad review covering the entire spectrum of operational research methodologies, while the latter is specifically targeted towards simulation methodologies and uses critical realism to address their foundations in the context of process systems. Second, this chapter supports a management scientist by focusing on simulation worldviews and treating all issues of simulation engines, however important they might be, as beyond its scope. Third, while ‘any system that can be modelled by one approach can also be expressed in one of the others’ (Pidd, 2003a, p. 77), resolution of the popular argument regarding the potential use of DES in place of SD (or vice versa) is well beyond the scope of this chapter. When analysing both simulation modelling methodologies, our analysis is based on the findings reported in the literature on what simulation methodology was used for a specific purpose, rather than what it could have been used for. In this sense, this study is driven by the empirical evidence provided in the literature that explores the use of DES and/or SD modelling methodologies as MS tools for intelligent thinking and management decision support in process systems.

The remaining part of this chapter is organised as follows. In Section 5.2 stratified ontology and abductive modes of knowledge generation are introduced as the bases of critical realism (CR). Section 5.3 presents process system modelling with SD and DES through the prism of CR scientific positioning by superimposing the outcomes of each phase of a relevant simulation modelling lifecycle upon the stratified ontological representation of CR. Section 5.4 uses patient care process simulation with SD and DES as an example of a complex process system modelling activity and specifically explores the relevant MS literature through the ontological/epistemological lens of CR; this section also provides the analysis of apparent process system simulation modelling trends and formulates subsequent implications for MS from the adopted CR ontological/epistemological perspective. Finally, summary and conclusions are presented in Section 5.5.

5.2 Ontological and Epistemological Assumptions of CR

CR is a philosophy of science that is most commonly associated with the work of Bhaskar (1978, 1979, 1986, 1989). Epistemology, or theory of knowledge, is concerned with the nature and scope of knowledge, in particular addressing the important question of how knowledge is acquired (Meredith, 2001; Becker and Niehaves, 2007; Burrell and Morgan, 1979). Epistemological foundations of a modelling method cannot be considered in isolation from the ontological assumptions about the nature of existence (Mingers and Brocklesby, 1997; Meredith, 2001). Thus, for the purposes of this discussion, two fundamental elements of CR scientific positioning are of major relevance: the stratified CR ontology and CR epistemology, primarily presented by the abductive mode of reasoning.

5.2.1 The Stratified CR Ontology

The stratified CR ontology divides reality into three distinct ‘domains of being’ (Bhaskar, 1979) that are commonly referred to as the domains of real, actual and empirical (Danermark et al., 2002; Mingers, 2000, 2006a, 2006b).

The domain of real consists of underlying generative mechanisms and causal structures activated by these mechanisms. In the context of process system simulation, the causality underlying the behaviour of a process is not necessarily a reflection of the succession of distinct causes and effects, but rather an emergent property of the complex interaction between the agency and technical components of an open socio-technical process system. Taken from this perspective, the ‘cause thus has an ontological depth’ (Downward and Mearman, 2007, p. 88). For SD simulation, this perspective on causality results in the idea of systems archetypes (Mingers, 2000), generic structures (Lane, 1998), or recurring structures (Morecroft and Robinson, 2005).

The activation of the causal powers of the generative mechanisms triggers patterns of events and behaviours that reside in the ontological domain of actual and represent the universe of all events and behaviours that could possibly be produced by causal structures and generative mechanisms. In the context of process simulation, the ontological domain of actual is represented by the set of all possible simulation scenarios of a given process.

The domain of empirical is made up of those events and behaviours that are actually experienced or observed by humans. In particular, it is the domain where the empirical data is available to validate the simulation models.

Contrary to the realist philosophy of science grounded in the positivist/empiricist tradition (Meredith, 2001), the CR stratified ontology argues that data-free conceptualisation of the problem behaviour of a system is ontologically separated from the experienced/perceived behaviour of the system. This ontological gap explains the difficulty of building realistic conceptual models based on the limited and personalised view of the real world. The abductive mode of reasoning which is at the core of the CR epistemological position provides the link between the knowledge acquired at the empirical and real levels of the CR ontology.

5.2.2 The Abductive Mode of Reasoning

The general direction of scientific generalisation adopted by CR is from the available empirical data to the postulation of ‘actual events’ and ‘real causes’ (Downward and Mearman, 2007, p. 93). CR regards empirical knowledge as an antecedent for generating new knowledge, that is, knowledge about generative mechanisms and the universe of all possible events and behaviours generated by the actualised causal powers of those mechanisms. This position has been well articulated by Lawson (1998, p. 156):

The aim is not to cover a phenomenon under a generalization … but to identify a factor responsible for it, that helped produce, or at least facilitated, it. The goal is to posit a mechanism (typically at a different level to the phenomenon being explained) which, if it existed and acted in the postulated manner, could account for the phenomenon singled out for explanation.

In CR this type of scientific generalisation is supported by a mode of inferential logic referred to as abduction, also known as retroduction (Collier, 1994; Lawson, 1998; Mingers, 2000; Danermark et al., 2002; Downward and Mearman, 2007). Abduction aims to obtain knowledge about ‘what properties are required for a phenomenon to exist’ (Danermark et al., 2002, p. 206). Mingers (2000, 2006b) provides a more detailed account of abduction (referring to it as retroduction) as the realist method of science in which cognition about the phenomenon moves from the often limited empirical data about the perceived behaviour of the phenomenon to the postulation and testing of the generative mechanism(s) that, if it (they) existed, would cause this behaviour to be expressed empirically.

Bhaskar (1994) suggests a method for eliciting generative mechanisms, based on the abductive mode of reasoning, that is particularly suitable for scientific generalisation in the context of applied sciences including, in particular, MS. This approach (RRREI) is based on the following steps: Resolution, Redescription, Retroduction, Elimination and Identification. In this study, the RRREI approach is used to direct the abductive mode of logical inference that allows the acquisition of knowledge about the generative mechanisms that underlie the behaviour of the process systems being modelled. According to RRREI, the abductive process starts by classifying the empirical data available about the process or phenomenon under study; a general understanding of how the process/phenomenon is shaped is required at this stage (Resolution) (Mingers, 2000, 2006b). This is followed by the representation of the empirical data through the prism of a particular theory (Redescription) (Mingers, 2000, 2006b). Based on these empirically viable results, a ‘creative model’ of the possible generative mechanism leads to a new supposition about the phenomenon under generalisation, that is, the process system's behaviour. Herein the reasoning moves from the acquired empirical event data to the postulation of the underlying generative mechanisms which, if they existed, would causally generate the given empirical data (Retroduction) (Collier, 1994; Mingers, 2000; Downward and Mearman, 2007; Danermark et al., 2002). Next, the existence of the generative mechanism(s) is demonstrated by, first, isolating the hypothetical generative mechanism and, second, eliminating alternative hypotheses by comparing it against the available empirical data (Elimination). At this stage, the inductive mode of logical inference plays a role as part of the abductive process. The elimination process is iterated until the retained generative mechanism soundly corresponds to the available empirical data. This is followed by identifying the correct causal structure in the model under study (Identification).

In the next section philosophical assumptions underlying DES and SD worldviews are considered from a CR standpoint through superimposing relevant elements of SD and DES modelling constructs and methods upon the relevant domains of the stratified CR ontology.

5.3 Process System Modelling with SD and DES through the Prism of CR Scientific Positioning

In order to better understand the nature of knowledge acquired as a result of each phase of the simulation lifecycle, the outcomes of each simulation phase are to be superimposed upon the three-layered ontological representation of CR (see Figure 5.1). The stratified CR ontology clearly separates the steps of the knowledge building process according to the three ontological levels where this knowledge originates. For instance, the empirical nature of the Data Inputs modelling outcome pertains to the CR ‘empirical’ ontological domain. The Conceptual Model/Dynamic Hypothesis output, being non-empirical by definition and reflecting the underlying mechanism(s) and structure(s) that determine(s) the origins of the system behaviour, is related to the CR domain of ‘real’. Yet, the process of building the Computerised Simulation Model is inherently empirical and therefore pertains to the ‘empirical’ domain of CR ontology. And finally, the aim of the Solutions/Understanding output is to cover as many alternative scenarios generated by the modelled system as possible. This places this latter phase in the ontological domain of ‘actual’ where all the events and behaviours produced by underlying generative mechanism(s) are actualised.


Figure 5.1 DES and SD model building and implementation phases through the prism of a CR ontological/epistemological continuum.

5.3.1 Lifecycle Perspective on SD and DES Methods

In line with previous studies that discussed the lifecycle-based perspectives for the SD (Sterman, 2000; Randers, 1980; Lane, 2000; Größler, Thun and Milling, 2008) and DES (Wainer, 2009; Tako and Robinson, 2008; Pidd, 2004b) simulation methods, in Figure 5.1 the SD and DES modelling lifecycles are represented. The respective representation of SD and DES lifecycles is based on the following assumptions:

  1. The granularity of the lifecycle perspective may vary depending on the objectives of the simulation project; the granularity of the reported simulation lifecycles varies from two phases (Größler, Thun and Milling, 2008) to eight (Wainer, 2009; Tako and Robinson, 2008).
  2. The phases of the simulation lifecycle are not executed in a strictly sequential but rather an iterative manner and may involve feedback transitions in the opposite direction of the lifecycle view (Tako and Robinson, 2008; Sterman, 2000; Wainer, 2009). For an in-depth discussion of this point, see Chapter 8.
  3. The principles and assumptions underlying SD and DES methods are unfolded throughout the execution of the simulation lifecycle phases.

The implementation of the SD modelling method begins with the Problem Articulation (Boundary Selection) phase (Figure 5.1) (Sterman, 2000). At this phase the rationale and objective of the simulation project are specified, the key variables that reflect the behavioural characteristics of the system and that are of particular interest to modellers are determined, and the historical behaviour of the key variables – the system's reference modes (Randers, 1980; Sterman, 2000) – are specified. The reference modes may well be represented by both quantitative data describing past system performance and qualitative data that represents expert knowledge on the system's performance (Sweetser, 1999). This phase is followed by the Formulation of Dynamic Hypothesis phase that aims to generate a hypothesis about the underlying causal (feedback) structure(s) and mechanism(s) that generate(s) the behaviour of the modelled system (Randers, 1980; Sterman, 2000). Herein, Randers (1980) stresses the importance of the accumulated empirical data on the system's reference modes for the hypothesis building process. The dynamic hypothesis may be built using both causal loop and stock and flow SD grammatical constructs (Sterman, 2000).
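
To make these constructs concrete, the following minimal Python sketch (a hypothetical waiting-list model with purely illustrative names and parameter values, not drawn from any of the cited studies) shows how a simple dynamic hypothesis – one stock, two flows and a balancing feedback loop – generates a behaviour mode that can subsequently be compared against the system's reference modes:

```python
DT = 0.25              # integration time step (weeks)
HORIZON = 52           # simulated horizon (weeks)

referrals = 40.0       # inflow: new referrals per week (assumed constant)
treatment_delay = 3.0  # average weeks needed to treat a waiting patient

waiting_list = 200.0   # stock: patients currently waiting (illustrative initial value)

history = []
steps = int(HORIZON / DT)
for step in range(steps):
    t = step * DT
    # balancing (goal-seeking) loop: the larger the list, the larger the outflow
    discharge = waiting_list / treatment_delay      # outflow: patients treated per week
    waiting_list += (referrals - discharge) * DT    # Euler integration of the stock
    history.append((round(t + DT, 2), round(waiting_list, 1)))

# `history` is the behaviour mode generated by the hypothesised structure; it is
# this trajectory that would be compared against the system's reference modes
print(history[::20])
```

The trajectory produced by such a sketch plays the role of the simulated behaviour mode; in an actual SD study it is the correspondence between such trajectories and the reference modes that motivates acceptance or revision of the dynamic hypothesis.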

The DES lifecycle starts in the Problem Formulation/Structuring phase. Similar to the SD lifecycle initial phase, it requires the general nature of the simulation project to be understood (‘a precursor to conceptual modelling’ (Robinson, 2008, p. 283)) and the objectives of the project to be identified (Wainer, 2009; Tako and Robinson, 2008). This phase is followed by the Conceptual Model Building phase (Pidd, 2003a; Wainer, 2009; Tako and Robinson, 2008). Pidd (2003a, p. 35) refers to it as ‘an activity in which the analyst tries to capture the essential features of the system that is being modelled’. Robinson (2008, p. 283) defines the conceptual model building as developing ‘a non-software specific description of the computer simulation model (that will be, is or has been developed), describing the objectives, inputs, outputs, content, assumptions and simplifications of the model’.

Combining the perspectives on the conceptual model provided by the definitions of both Pidd (2003a) and Robinson (2008), it becomes clear that within the DES method the conceptual model is a non-empirical representation of ‘the essential features of the system that is being modelled’.

Contrary to the modelling logic inherent to the SD method, the DES method implementation prescribes that the Conceptual Model Building is not preceded but followed by the collection and analysis of Data Inputs (Wainer, 2009; Tako and Robinson, 2008). To complete this ‘super-phase’, Wainer (2009, p. 27) suggests that the modeller ‘observe and collect the attributes chosen in the previous phase’.
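
The separation between the structural skeleton of a DES conceptual model and the subsequently collected Data Inputs can be illustrated with a minimal next-event sketch in Python (a hand-rolled single-server queue; the distributions and parameter values are hypothetical and purely illustrative):

```python
import heapq
import random

def simulate_single_queue(interarrival, service, horizon=480.0):
    """Minimal next-event DES of a single-server FIFO queue.

    The structural skeleton (arrivals -> queue -> server) stands for the
    conceptual model; the empirical Data Inputs enter only through the
    `interarrival` and `service` samplers supplied by the caller.
    """
    events = [(interarrival(), "arrival")]   # event list of (time, kind) pairs
    queue, busy_until, waits = [], 0.0, []
    while events:
        time, kind = heapq.heappop(events)
        if time > horizon:
            break
        if kind == "arrival":
            queue.append(time)
            heapq.heappush(events, (time + interarrival(), "arrival"))
        # start the next service whenever the server is free and a patient waits
        if queue and busy_until <= time:
            arrived = queue.pop(0)
            waits.append(time - arrived)
            busy_until = time + service()
            heapq.heappush(events, (busy_until, "departure"))
    return sum(waits) / len(waits) if waits else 0.0

rng = random.Random(42)
# hypothetical Data Inputs: exponential inter-arrival and service times; in a
# real project these distributions would be fitted from observed records
mean_wait = simulate_single_queue(
    interarrival=lambda: rng.expovariate(1 / 10.0),
    service=lambda: rng.expovariate(1 / 8.0),
    horizon=8 * 60.0,
)
print(f"average wait: {mean_wait:.1f} minutes")
```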

The logic of the Implementation and Experimentation ‘super-phase’ of the SD and DES lifecycles is quite similar. The lifecycle perspective on both SD and DES modelling methods refers to the Formulation/Implementation of the simulation model based on computer simulation packages as the following simulation phase.

The next phase is referred to in the DES literature as Validation and Verification (e.g. Wainer, 2009) and in the SD literature as Testing of the Dynamic Hypothesis (e.g. Randers, 1980). In this phase: (a) the correctness of transferring the non-software-specific conceptual model into a computerised simulation model is verified; and (b) in order to check whether the executable computer model adequately reproduces the modelled system's behaviour, the executable computer model is validated against the empirical data (the so-called ‘comparison to reference modes’ (Sterman, 2000, p. 86)) collected in the previous phase of the simulation lifecycle (Problem Formulation phase for SD and Data Input phase for DES).

The final phase of both the SD and DES simulation lifecycles is the Experimentation and Output Analysis (DES)/Policy Formulation and Experimentation (SD). In this phase possible behavioural patterns exhibited by the system under consideration are simulated. The ultimate aim of this phase is to attain an improved understanding of the behaviour of the system as a consequence of a given management decision. This is achieved through the analysis of the ‘alternative histories’ of the simulated system's behaviour generated by the executable computer model in response to alternative courses of action chosen by a manager in the process of decision making.

Achieving better understanding of real-world management systems is a key goal for MS as a scientific discipline (e.g. Hall, 1985; Commission on the Future Practice of Operational Research, 1986; Meredith, 2001; Walsham, 1992). Based on the discussed lifecycle perspective of SD and DES modelling methods, simulation models can be regarded as tools for systematically attaining a better understanding of the real world and thereby for providing support for addressing the real-world management decision-making problems.

The ontological and epistemological assumptions of SD and DES worldviews informed by CR are discussed below in a stepwise manner in accordance with the phases depicted in Figure 5.1.

Phase 1: Problem Structuring (DES)/Problem Articulation (Boundary Selection) (SD)

Within phase 1 of the simulation lifecycle, irrespective of the chosen simulation approach, each simulation project starts by determining the goal and scope of the project. If the simulation model is built to analyse the behaviour of an existing system, the data that reflects the historical behaviour of this system is accumulated in this phase of the simulation lifecycle. This accumulation of the relevant empirical data on the system's past behaviour is paramount, in particular in SD modelling. To illustrate, Forrester (1961, 1992) refers to accumulated empirical knowledge as the ‘mental database’ of the simulation project. Randers (1980) argues that the conceptual modelling in SD should be preceded by solid empirical data expressed in the form of a ‘reference mode’. Hence, both SD and DES approaches determine the goal and boundaries of the simulation study based on the acquired empirical knowledge of the past behaviour of the process system under study (Sweetser, 1999). This indicates that, in the context of both SD and DES simulation, the knowledge acquired in this phase of the simulation lifecycle corresponds to the ‘empirical’ domain of CR ontology (step 1 in Figure 5.1).

There is a strong similarity between phase 1 of the SD/DES simulation lifecycle and the Resolution and Redescription steps of the RRREI abductive process discussed in Section 5.2. Indeed, the focus of the first phase in both SD and DES modelling lifecycles is on specification of the objective of the simulation study and on defining the boundaries of the real-world system described by the simulation model. The common objective of the simulation study includes the understanding and improvement of the performance of the real-world system. The recognised need to model and improve the behaviour of the real-world system is always supported by some empirical evidence of its past performance upon which a call for improvement is made. This directly matches the scope of the Resolution step of the RRREI abductive process. Moreover, the phase 1 requirement to specify the boundaries of the real-world system that is to be reflected in the simulation model is in line with the requirement to acquire a general understanding of the behaviour of the phenomenon under study as part of the Resolution step of the RRREI abductive process. In the following step of the abductive process, the empirical data about the behaviour and structural characteristics of the phenomenon under study is represented in the context of a particular theory. Hence, the process systems, which are the focus of this study, are grounded in one of two theoretical approaches: SD as a structural theory (Größler, Thun and Milling, 2008) or DES as another theoretical approach to modelling and understanding complex dynamic systems (Pidd, 2003a, 2004b). The choice of the modelling techniques for building the conceptual model of the process system therefore reflects the parallel between the RRREI abductive process and the lifecycle view of simulation modelling methods.

Phase 2: Conceptual Model Building (DES)/Formulation of Dynamic Hypothesis (SD)

The Conceptual Model Building phase (Figure 5.1) aims to capture the origins of the behaviour of the system being modelled (e.g. Pidd, 2004b). In SD the formulation of the dynamic hypothesis is regarded as the product of the conceptualisation phase and implies ‘creating an initial model that is capable of reproducing the major dynamics of the reference mode’ (Randers, 1980, p. 122) (step 2a in Figure 5.1). The principle underlying the SD method, and especially the conceptual model building stage, corresponds to the core epistemological premise of CR: that is, structure (including the relationships between structural components of the system) determines the system's behaviour (Richardson and Pugh, 1981). This is in line with the CR method informed by its ontological/epistemological position that ‘proceeds by trying to discover underlying structures that generate particular patterns of events (or non-events)’ (Mingers, 2000, p. 1266). In order to identify the dominant feedback structures in causal loop diagrams as well as the relationships between the stock and flow variables, the logic of the model building process progresses from the available empirical data to the postulation of the dynamic hypothesis (see step 2a).

In DES the conceptual model building phase has been reported as ‘a vital’ but at the same time ‘the most difficult and least understood’ phase of the simulation modelling (Robinson, 2008, p. 278). This phase ‘aim[s] to identify the main entities of the system and to understand the logical ways in which they interact [emphasis added]’ (Pidd, 2004b, p. 36). This is in line with the reported characteristic of ‘DES methodology [as] … a disciplined means of capturing the structure of an existing or proposed system [emphasis added]’ (Sweetser, 1999, p. 3). According to Brailsford and Hilton (2001), for process systems, it is the structure of the underlying queuing system (network) that represents the underlying control flow principle and determines all possible pathways of the simulated system behaviour. Therefore, while DES adopts different tools of conceptual model representation than SD, it is evident that the primacy of structure over behaviour as a core principle of conceptual model building still holds.

As the universe of all possible system pathways is primarily generated by the structure of the underlying queuing system, the CR epistemological lens refutes the reported argument that postulates the primacy of the inherently empirical concept of randomness in determining the origins of the system's behaviour. Indeed, in the context of DES simulation, randomness determines the probability of choosing one or another pathway whose existence is enabled by the underlying generative mechanism (e.g. a network of queues for a process system). As such, CR clarifies the role of randomness as an empirical factor that allows one to operationalise generative mechanisms in the later phases of the simulation project lifecycle.
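
A small Python sketch may help to illustrate this point (the two-pathway structure and the routing probability are hypothetical and purely illustrative): the structure fixes which pathways exist, while the random stream merely selects among them, producing different empirical traces from run to run.

```python
import random
from collections import Counter

# hypothetical two-pathway structure: triage routes each arriving patient either
# to a fast-track stream or to the main ED stream; the *structure* fixes which
# pathways exist, the random stream only selects among them for each patient
PATHWAYS = ("fast-track", "main ED")
P_FAST_TRACK = 0.3

def one_realisation(seed, patients=200):
    rng = random.Random(seed)
    return Counter(
        PATHWAYS[0] if rng.random() < P_FAST_TRACK else PATHWAYS[1]
        for _ in range(patients)
    )

# different seeds give different empirical traces (the 'empirical' domain),
# but every trace is drawn from the same structurally enabled set of pathways
for seed in (1, 2, 3):
    print(seed, dict(one_realisation(seed)))
```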

The process of developing the SD dynamic hypothesis from the available empirical data has not been explicitly formalised in the MS literature. However, Mingers (2000) demonstrated how the abductive mode of reasoning (RRREI) supported by CR corresponds to the logic of building and testing the SD dynamic hypothesis. In the same manner, the structure of the underlying conceptual model in DES may be based on the Resolution of the empirical data on the past behaviour of the modelled system. In light of the adopted CR epistemological position, the queuing systems (networks) which are at the core of the ‘non-software-specific’ conceptual model representation of process systems can be viewed as generative mechanisms. Hence, the Retroduction step would involve an understanding of the structural mechanisms (i.e. the structure and interconnectedness of the underlying queuing system) that, if they existed, would generate the problem behaviour of the modelled process system. In terms of CR this kind of relationship is referred to as generative causality. Then, in the subsequent phase of the simulation process, the most feasible generative mechanism(s) (i.e. the adopted conceptual model) is (are) identified by eliminating alternatives as the conceptual model is validated against the available Data Inputs.

Summarising the discussion of this simulation phase, CR epistemology informs the process of conceptual model building for both SD and DES by stressing the commonality of two knowledge building concepts:

  1. The primacy of structural factors that generate the behaviour of the modelled systems.
  2. The requirement for empirical data (accumulated in the preceding modelling phase) necessary to hypothesise about the underlying structural factors that could have generated the available data about the past system's behaviour.

From the point of view of the CR RRREI abductive process, these two factors form the necessary basis for the Retroduction and the following steps of the RRREI abductive process. Thus, the empirical data that reflects the behaviour of the real-world process system (or rather the part of this system that is of interest to the modeller) is used as a basis for hypothesising about the nature of the underlying generative mechanisms, which, if they existed, would causally generate the empirical data at hand. This is in direct correspondence with step 2a of the simulation modelling lifecycle (see Figure 5.1), referred to in the SD literature as ‘Initial hypothesis generation’ (Sterman, 2000, p. 86). Next, in line with the Elimination step, the hypothesised generative mechanism(s) is (are) represented in isolation from any other components of the process system (which are not part of the generative mechanism(s)) and then the hypothesised generative mechanism is validated against the initial set of empirical data. In SD (e.g. Randers, 1980) and DES (e.g. Sargent, 2004) simulation modelling (step 2b in Figure 5.1) this refers to the necessary validation of the hypothesised conceptual model (initial dynamic hypothesis(es)). And finally, based on the results of the Elimination step, the correct generative mechanism, tested against the actual behaviour of the modelled system, is identified (Identification).
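
The Elimination and Identification steps can be illustrated with a deliberately simple Python sketch (the reference mode values and the two candidate mechanisms are hypothetical and purely illustrative): each candidate generative mechanism is compared against the available reference mode, and the candidate that fails to reproduce it is eliminated.

```python
import math

# hypothetical reference mode: observed waiting-list sizes over ten weeks
reference_mode = [200, 188, 178, 171, 165, 161, 158, 156, 154, 153]

def linear_decline(t):
    # candidate mechanism 1: constant excess capacity drains the list linearly
    return 200 - 6 * t

def goal_seeking(t):
    # candidate mechanism 2: balancing loop closing the gap to a target of 150
    return 150 + 50 * math.exp(-0.3 * t)

def rmse(candidate, observed):
    errors = [(candidate(t) - y) ** 2 for t, y in enumerate(observed)]
    return math.sqrt(sum(errors) / len(errors))

candidates = {"linear decline": linear_decline, "goal seeking": goal_seeking}
scores = {name: rmse(f, reference_mode) for name, f in candidates.items()}

# Elimination: discard mechanisms that fail to reproduce the reference mode;
# Identification: retain the mechanism whose behaviour corresponds most closely
retained = min(scores, key=scores.get)
print(scores, "->", retained)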

Overall, there are strong similarities between the abductive mode of logical inference that was adopted by CR as a tool for knowledge building and the phases of the SD and DES lifecycles contained within the Data Collection and Conceptualisation ‘super-phase’ of the combined simulation modelling lifecycle depicted in Figure 5.1.

Phase 3: Model Coding/Computer Implementation and Data Input

In this phase of the simulation project the computerised simulation model is implemented/coded (step 3a) and populated with the relevant historical (empirical) data (step 3b). As shown in Figure 5.1, from the point of view of CR, both of these steps are performed at the empirical level of CR ontology. While DES modellers traditionally spend most of their project time in this simulation modelling phase, fitting and sampling probability distributions for the computerised model (e.g. Tako and Robinson, 2008; Sweetser, 1999), the epistemological premises underlying the knowledge building process with DES and SD do not considerably differ. In both DES and SD the incorporation of the relevant empirical data into the computerised simulation model allows one to operationalise the non-empirical generative mechanisms identified in the previous phase of the simulation lifecycle.
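
A minimal Python sketch of step 3b (the historical service-time records are hypothetical and purely illustrative) shows how empirical Data Inputs are condensed into distribution parameters that operationalise the structural model within the computerised simulation:

```python
import random
import statistics

# hypothetical historical records: observed treatment durations in minutes
# (in a real project these would come from the hospital information system)
observed_service_times = [7.2, 9.5, 6.1, 11.4, 8.3, 7.7, 10.2, 6.8, 9.0, 8.6]

# step 3b: the empirical Data Inputs are condensed into distribution parameters
mean_service = statistics.mean(observed_service_times)

rng = random.Random(7)

def sample_service_time():
    """Sampler plugged into the computerised model (step 3a): an exponential
    distribution fitted to the observed mean, i.e. the empirical layer used
    to operationalise the non-empirical structural model."""
    return rng.expovariate(1.0 / mean_service)

print(mean_service, [round(sample_service_time(), 1) for _ in range(5)])
```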

Phase 4: Verification and Validation (DES)/Testing (SD)

In the Verification (DES)/‘Testing for consistency’ (SD) step (step 4a in Figure 5.1) the correctness of the computer programming and implementation of the conceptual model/dynamic hypothesis is checked (Sargent, 2004; Tako and Robinson, 2008; Sterman, 2000; Randers, 1980). From the point of view of CR epistemology, the goal of this step is to verify whether the built-in parameters of the computerised model reflect the nature of the underlying generative mechanisms identified in step 2a of the simulation process.

Validation (DES)/‘Testing through comparison to reference modes’ (SD) (step 4b) implies checking that the ‘model's output behaviour has sufficient accuracy for the model's intended purpose over the domain of the model's intended applicability’ (Sargent, 2004, p. 132; Randers, 1980). As the computerised model and relevant data utilised for model validation are empirically driven, the knowledge generated in this step of the simulation project lifecycle refers to the CR ontological domain of ‘empirical’.
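
The logic of this validation step can be sketched in Python as follows (the observed reference value, the tolerance and the stand-in replication function are hypothetical and purely illustrative):

```python
import random
import statistics

OBSERVED_MEAN_WAIT = 23.0   # reference value from historical records (hypothetical)
TOLERANCE = 0.10            # accept the model if it is within 10% of the observation

def run_model(seed):
    """Stand-in for one replication of the computerised model; here it simply
    returns a noisy mean waiting time so that the sketch is self-contained."""
    rng = random.Random(seed)
    return statistics.mean(rng.gauss(22.0, 4.0) for _ in range(50))

replications = [run_model(seed) for seed in range(30)]
simulated_mean = statistics.mean(replications)

relative_error = abs(simulated_mean - OBSERVED_MEAN_WAIT) / OBSERVED_MEAN_WAIT
print(f"simulated {simulated_mean:.1f} vs observed {OBSERVED_MEAN_WAIT}: "
      f"{'valid' if relative_error <= TOLERANCE else 'not valid'} "
      f"for the intended purpose (error {relative_error:.1%})")
```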

Phase 5: Experimentation

The final step in the model building process involves running the simulation and exploring ‘what-if’ scenarios based on the simulation results.

From the CR viewpoint this step aims at unveiling all possible events and behaviours that may be produced by generative mechanisms identified during the Conceptual Model Building. The available simulation knowledge and the computational power of simulation packages thus allow experimental identification of how a hypothetical generative mechanism may be actualised. Varying the impact of different experimental factors (simulation model parameters) aims to reveal all possible ‘what-if’ scenarios produced by a certain generative mechanism (or interplay of multiple generative mechanisms). Therefore the Experimentation phase in both SD and DES allows the representation of the patterns of events that have been outside of the manager's empirical scope prior to the simulation project. In other words, this step aims at getting access to the knowledge on the behaviour of the process system that corresponds to the ‘actual’ domain of CR ontology (Figure 5.1).
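
A simple Python sketch of such an experimental frame (the model structure, factor levels and parameter values are hypothetical and purely illustrative, reusing the waiting-list structure of the earlier stock-and-flow sketch) shows how systematically varying the experimental factors enumerates alternative ‘what-if’ scenarios, that is, alternative actualisations of the same generative mechanism:

```python
import itertools

def final_waiting_list(referrals, treatment_delay, weeks=52, dt=0.25):
    """Compact stand-in for the computerised model: integrates the hypothetical
    waiting-list structure and returns the list size at the end of the run."""
    waiting_list = 200.0
    for _ in range(int(weeks / dt)):
        discharge = waiting_list / treatment_delay
        waiting_list += (referrals - discharge) * dt
    return waiting_list

# experimental frame: each combination of factors is one 'what-if' scenario,
# i.e. one alternative history drawn from the actual domain of the model
referral_levels = [30.0, 40.0, 50.0]    # referrals per week
treatment_delays = [2.0, 3.0, 4.0]      # average weeks to treat a waiting patient

for referrals, delay in itertools.product(referral_levels, treatment_delays):
    outcome = final_waiting_list(referrals, delay)
    print(f"referrals={referrals:>4}, treatment_delay={delay:>3} "
          f"-> waiting list after one year: {outcome:6.1f}")
```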

This section demonstrated in detail how the CR scientific position facilitates explicit articulation of the fundamental epistemological assumptions underlying SD and DES worldviews. In the following section the implications of the CR philosophical position for more effective use of DES and SD simulation to support managers' understanding of decision making in process systems are discussed.

5.4 Process System Modelling with SD and DES: Trends in and Implications for MS

In this section the CR-enabled phases of the SD and DES simulation lifecycle as described in Figure 5.1 are illustrated using a specific segment of the process system modelling literature: patient care process modelling. Patient care process modelling is a challenging area of research where simulation has traditionally played a role as a tool to support management decision making (Davies and Davies, 1994; Jun, Jacobson and Swisher, 1999). For each phase of the DES and SD modelling and implementation lifecycle, we first review the selected research literature that explored the operational perspective on patient care process simulation with SD and DES through the ontological/epistemological lens of CR and then discuss how the suggested CR ontological/epistemological perspective contributes to more effective use of simulation for intelligent thinking about, and management decision support in, process systems.

For simplicity, the modelling context discussed in this section is limited to the operational perspective on patient care processes (Cote, 2000). For instance, the patient flow can be represented from the operational point of view as the movement of patients (entities) through a set of locations in a health care facility (Cote, 2000). A more complete operational definition of a patient care process was suggested by Little (1986, p. 6), who categorised it as a type of designed flow process that ‘include[s] a wide variety of situations where tangible items … [i.e. patients or materials] or intangible entities (such as information) flow in some purposive way along paths that may be individually charted or collectively established but proceed with the intention of achieving some outcome’. The perspective chosen for illustrative purposes leaves the clinical perspective on patient care processes (ibid.) outside the focus of this discussion.

Phase 1: Problem Structuring (DES)/Problem Articulation (Boundary Selection) (SD)

SD: Lattimer et al. (2004) start with problem articulation and boundary selection that is followed by the accumulation of the empirical data on how the system operates as well as on the nature of interdependency between the components of the system. Initial interviews and hospital site visits are used in order to accumulate necessary empirical evidence on how the acute patient flows could be mapped. As described in Chapter 6, Brailsford et al. (2004) ran 30 interviews with key decision makers from across health and social care prior to developing a ‘conceptual map’ of the emergency health care system in Nottingham. Lane, Monefeldt and Rosenhead (2000) reveal that in order to build and refine the initial conceptual model of the process system, the ‘mental databases’ that reflected the knowledge of the process owners as well as the ‘formal sources’ of empirical data were accessed. Prior to developing a conceptual model (influence diagram) of ‘the flow of elderly people from the community into UK National Health Service (NHS) and out into community care’, Wolstenholme (1993, p. 927) identifies the source of the empirical data that allowed the behaviour of the modelled system to be understood as the collaboration with a senior manager conversant with both the NHS and Personal Social Services.

On the other hand, not all of the reviewed SD studies clearly specify the sources of the empirical data. For example, Wolstenholme (1999) provides a conceptual map of the patient flow model but remains silent on the nature of the empirical data used to build the conceptual model. Moreover, one of the reviewed studies did not follow the phases of the SD lifecycle in a consecutive manner: Worthington (1991) introduces the basic conceptual hospital waiting list management model before the description of the data collection and analysis stage. This results in a schematic and over-simplistic representation by Worthington (1991) of the basic waiting list model as part of the description of the ‘overall approach’ (p. 835) adopted in this study.

DES: The review of the DES studies in patient care process simulation reveals that, as is the case with SD modelling, there is no uniformity in whether the empirical data sources are described at this stage or not. For example, Moreno et al. (2000) and Ferreira et al. (2008) report on the methods of data collection and sources of the empirical data in order to better understand how the process system under study operates. The review of more than 350 papers on simulation modelling (mainly using DES) by Fletcher and Worthington (2009) reveals that data collection in patient flow modelling was generally performed through computer records, but also occasionally through work studies and consultation with the experts (e.g. clinicians).

On the other hand, a considerable number of the reviewed DES studies report on the results of the problem formulation and even conceptualisation before the sources of the empirical data are described (e.g. Dittus et al., 1996; Coelli et al., 2007; Werker et al., 2009; Cardoen and Demeulemeester, 2008; Levin et al., 2008). This poses questions on the availability of the necessary empirical support at the following conceptual model building phase of the DES simulation modelling.

Trends and implications for management science: As discussed in Section 5.3, this phase of the SD and DES simulation modelling lifecycle aims at specifying the objective of the simulation study as well as determining the boundary of the simulated system. The epistemological position of CR calls for the acquisition of empirical evidence on the problem behaviour of the modelled system as a prerequisite for further exploration of the mechanisms that trigger this behaviour. In most cases the objectives of the reviewed simulation projects are triggered by calls for understanding the reasons for the poor performance of the patient care processes and by attempts to improve this performance. Hence, collecting empirical data at a very early stage of the simulation project allows one to effectively identify the objective of the project and to understand which part of the real world is to be covered by the project.

Based on the review of the SD and DES simulation literature, not all of the reviewed SD- and DES-based patient flow simulation projects specify data sources in this phase of the simulation lifecycle. This does not necessarily imply that such data does not exist, as in most cases it is specified in the later phases of the simulation modelling lifecycle. However, as follows from the CR philosophical position, the logic of scientific enquiry calls for specification of the available empirical evidence on the modelled system upfront, as such data may well determine the research design and the selection of the specific simulation method (cf. Brailsford et al., 2004), as described in Chapter 6.

The nature of the empirical data sources in SD and DES projects often varies. Generally, the data sources used in SD simulation are more value based than those used in DES simulation: while SD modellers refer to the ‘mental databases’ (Forrester, 1961) collected through the interviews with the process owners and/or clinicians, DES modellers tend to utilise the computer records that contain historical data on the performance of the system of interest.

Phase 2: Conceptual Model Building (DES)/Formulation of Dynamic Hypothesis (SD)

SD: Lane and Husemann (2008) demonstrate how the conceptual models of the patient flow have been refined throughout a series of interviews, site visits and workshops. Lane, Monefeldt and Rosenhead (2000) also demonstrate how the initial conceptualisation of the modelled Accident and Emergency system elements, processes and pathways revealed the need to include in the conceptual model other factors that were unnoticed in the initial conceptualisation phase. While building the conceptual model, Wolstenholme (1993) concentrates on the development of the initial or ‘first type’ conceptual model of the flow of elderly people from the community into NHS and out into community care and its transformation into an archetype-driven conceptual model. The focus is on refining the conceptual model that provides a very high-level description of the interaction of the components of the modelling process into a more detailed feedback loop diagram that demonstrates key SD archetypes that influence the behaviour of the process system under study. From the point of view of the abductive mode of reasoning supported by CR, the initial conceptual model of the process system suggested by Wolstenholme (1993) reflects the Resolution step (where the boundaries of the modelled process systems were shaped and the key variables defined) and Redescription (where the SD modelling has been selected as an appropriate theory to reflect the behaviour of the modelled system). Redescription also results in the transformation from the initial ‘first-order’ conceptual model into the model that reveals the nature of the underlying system archetypes.

DES: In order to determine how bed demand affects Emergency Department (ED) patients' access to cardiac care, Levin et al. (2008) built a conceptual model of the patient flow between the cardiology macro-system units. The model is built using queuing principles. Bailey (1952), Jackson, Welch and Fyr (1964), Vissers and Wijnaard (1979), Brahimi and Worthington (1991) and Walter (1973) formulate the problem of outpatient and general practice/hospital department appointment scheduling as a queuing system. Lowery and Martin (1992), El-Darzi et al. (1998), Altinel and Ulas (1996), Cardoen and Demeulemeester (2008), Coelli et al. (2007), Ferreira et al. (2008), Werker et al. (2009), Levin et al. (2008) and de Bruin et al. (2007) regard the operational model of patient flow as a simulated queuing system. Jiang and Giachetti (2008) suggest an alternative structure of the queuing network model which handles the fork/join structure of patient flows and provides rapid analysis of alternative patient flows. Edwards et al. (1994) compare the performance of patient processing based on two different queuing system structures and find that patient waiting times can be reduced by up to 30% when using quasi-parallel processing as opposed to serial processing.

These reviewed DES studies of patient care processes strongly emphasise the structure vs randomness debate when building and/or simulating the conceptual model of the process system. They acknowledge that the structural characteristics of the queuing system, that is the structure of the entities' flow through their active and passive states, play a decisive role in behaviour generation of process systems. For example, de Bruin et al. (2007), based on expert meetings with cardiologists, develop a structural model of the emergency cardiac in-patient flow. According to the authors, the critical role of the provided conceptual model is not in a complete representation of all cardiac in-patient flows in the university medical centre, but rather in understanding the major underlying queuing mechanisms that affect the dynamics of the in-patient flow system under study. Hence, while not a completely accurate representation of a real process system, the suggested structural model allows one to ‘reduce complexity without losing integrity by focusing on the most critical patient flows’ (p. 127). Kolker (2008, 2009), by using in a combined manner process model simulation and queuing theory, identifies the core queuing mechanisms that determine the structure and behaviour of the intensive care unit and Emergency Department patient flows. Haraden and Resar (2004, p. 4) argue that ‘it is a common but an incorrect assumption that the healthcare flow is a result of what appears to be the randomness and complexity of the disease presentation’. Instead they argue that the health care flow is a result of more structured factors that can be consistently represented and managed based on historical data and queuing methods. In the same manner, according to Ryan and Heavey (2006) the application of queuing methods is ‘vital to the modelling of a discrete-event system when gathering requirements or building a conceptual model for the purposes of a simulation project’ (p. 441). This said, the randomness still plays a role in determining the process system dynamics; however, its primacy as a behaviour generating mechanism is questionable in the process system context.

The review of the DES studies also demonstrates that there is a discrepancy between the overly simplistic representation of the conceptual model on the one hand and the computerised model on the other. According to Pidd (2003a) and Robinson (2008), the representation of the conceptual model should reflect the essence of the behaviour of the dynamic system under study. We demonstrated above the role of the queuing mechanism for understanding the behaviour of the process system. However, in a number of studies (e.g. Dittus et al., 1996; Cardoen and Demeulemeester, 2008), while the queuing mechanism is used within the computerised model, it is omitted in the conceptual model. When reviewing the conceptual models suggested by these authors, a question arises as to whether the reported relationships between the model components indeed reflect the essence of the modelled system.

Trends and implications for MS: The review of the SD and DES studies reveals a number of epistemological differences when approaching this phase of the simulation modelling. Generally, SD simulation studies are more concerned with the link between the empirical data (both ‘hard’ and ‘soft’) and the conceptual model. Thus, in SD studies the conceptual model is developed through staged conceptualisation (Wolstenholme, 1993) or the initial conceptual model is validated against the reference modes of the simulated process system (Lane, Monefeldt and Rosenhead, 2000; Lane and Husemann, 2008; Wolstenholme, 1993), that is, against the sources of empirical data that describe the behaviour of the key concepts and variables of the system. The specification of reference modes can be run on a continuous basis, thus allowing a comparison of the hypothesised conceptual model against novel reference modes of the system under study (e.g. Lane, Monefeldt and Rosenhead, 2000; Lane and Husemann, 2008) (see Figure 5.1, step 2b).

On the other hand, most of the reviewed DES projects suggest a conceptual model for the patient care processes without mentioning the sources of the empirical data used in order to reflect the key features of the simulated system (e.g. Werker et al., 2009; Cardoen and Demeulemeester, 2008; Dittus et al., 1996; Levin et al., 2008; Coelli et al., 2007). None of the reviewed DES studies demonstrates whether and how the conceptual model is refined as a result of comparing the suggested conceptual model with the collected historical data that describes the modelled behaviour of the process system. This questions the role of the conceptual model as the one that aims ‘to capture the essential features of the system that is being modelled’ (Pidd, 2003a, p. 35). This also explains why SD modellers dedicate a considerably larger part of their project time to the conceptual model building phase than DES modellers do (Tako and Robinson, 2008), as described in Chapter 8.

From the point of view of CR epistemology, while the conceptualisation phase in the reviewed SD studies corresponds to the complete abductive cycle (both steps 2a and 2b in Figure 5.1), the DES studies cover only step 2a. Hence, the reviewed DES studies formally ignore the Elimination and Identification steps of the abduction process in the conceptual model building phase. Instead, these knowledge building activities are undertaken in the two subsequent phases of the DES simulation lifecycle.

Phase 3: Formulation of a Simulation Model/Computer Implementation

SD: Lane, Monefeldt and Rosenhead (2000) in their study of an Accident and Emergency department refer to this phase as ‘Formulation of the structure and equations’. This does not necessarily reflect the true nature of the described activity, as the structure has already been formulated in the previous phase of the simulation modelling when the initial dynamic hypothesis was suggested and subsequently refined. In this phase the qualitative conceptual model expressed as a causal loop diagram is transformed into a quantitative stock and flow diagram (Lane, Monefeldt and Rosenhead, 2000). The latter is based on 194 equations and designed using the iThink software simulation package. When describing this phase of simulation model building, Wolstenholme (1993) notes that the conceptual model supported by systems thinking falls short when comparing and contrasting the alternative interventions. Hence, the model parameters and variables are to be specified as a precondition for testing the modelled system's logic upon which the simulation model was built. Brailsford et al. (2004) refer to this phase as a ‘Quantitative phase’, defining its aim as ‘to facilitate experimentation with various potential changes in service configurations and demand rates’ (p. 37). They used the iThink simulation package to develop the computerised simulation model. The model was populated with data obtained from health care providers in Nottingham, UK, as described in Chapter 6.

DES: In DES modelling this phase also presupposes translation of the conceptual model into the formal computerised model, while specifying the necessary process tasks and preserving the underlying queuing mechanism (Werker et al., 2009; Lowery and Martin, 1992; El-Darzi et al., 1998; de Bruin et al., 2007; Altinel and Ulas, 1996; Kolker, 2008, 2009). These and all additional aspects of the model (such as the availability of resources and the data on arriving patients) are often represented in DES simulation studies by adopting one of the existing simulation packages, for example Arena (Werker et al., 2009; Cardoen and Demeulemeester, 2008), SIMUL8 (Katsaliaki et al., 2009) as well as MATLAB (Levin et al., 2008) and MedModel (Levin et al., 2008; Coelli et al., 2007). Some more complex modelling situations require additional model coding. For example, while investigating the opportunities to shorten the average time patients spend in an urgent care facility, Tan, Gubaras and Phojanomongkolkij (2002) built a simulation model using the SIMAN programming language; the model was then run within the Arena software package.
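
The queuing mechanism that such packages implement internally can be illustrated with a short, self-contained sketch. The hypothetical single-server example below keeps a future-event list ordered by time, lets patient entities queue for a scarce resource, and changes system state only at discrete event times; the exponential interarrival and service assumptions and the shift length are illustrative and are not taken from any of the cited studies.

```python
# A minimal, hypothetical sketch of the event-driven queuing mechanism that
# DES packages implement internally: a future-event list ordered by time,
# entities (patients) queuing for a scarce resource, and state changes that
# occur only at discrete event times. All numbers are illustrative.
import heapq
import random

random.seed(1)
MEAN_INTERARRIVAL = 10.0   # minutes between patient arrivals (assumed)
MEAN_SERVICE = 8.0         # minutes of treatment per patient (assumed)
SIM_END = 8 * 60.0         # one simulated 8-hour shift

events = []                # future-event list: (time, event_type, patient_id)
heapq.heappush(events, (random.expovariate(1 / MEAN_INTERARRIVAL), "arrival", 0))

queue = []                 # patients waiting for the single server
server_busy = False
arrival_times = {}
waits = []
next_id = 1

while events:
    time, kind, pid = heapq.heappop(events)
    if time > SIM_END:
        break
    if kind == "arrival":
        arrival_times[pid] = time
        queue.append(pid)
        # schedule the next arrival
        heapq.heappush(events, (time + random.expovariate(1 / MEAN_INTERARRIVAL), "arrival", next_id))
        next_id += 1
    else:  # "departure": the server becomes free
        server_busy = False
    # start service whenever the server is idle and someone is waiting
    if not server_busy and queue:
        patient = queue.pop(0)
        waits.append(time - arrival_times[patient])
        server_busy = True
        heapq.heappush(events, (time + random.expovariate(1 / MEAN_SERVICE), "departure", patient))

print(f"{len(waits)} patients served, mean wait {sum(waits) / len(waits):.1f} minutes")
```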

Trends and implications for MS: The major task of the computerised model is to reflect the essence of the modelled system that was represented within the conceptual model. Built on the empirical data inputs, the computerised model is formulated within the ‘empirical’ domain of CR ontology. At the same time, the computerised model should reflect the premises inferred by the conceptual model formulated within the ontological domain of the ‘real’. As noted when discussing the previous phase of the DES and SD simulation lifecycle, the SD Dynamic Hypothesis Definition step (step 2b in Figure 5.1) is often omitted in DES simulation projects. The DES simulation literature provides evidence that, once formulated, DES conceptual models are seldom compared with the historical behaviour of the key concepts and variables reflected within these models. Therefore, other factors being equal, the dynamic hypotheses suggested within SD simulation studies are more empirically grounded than most of the DES conceptual models, which are not empirically validated (see step 2a in Figure 5.1). This may well explain why, in the majority of the reviewed DES studies, the discrepancy between the complexity of the computerised (empirically based) model and the very high-level representation of the conceptual model was greater than in SD studies, to the extent that not all the key features responsible for the dynamics of the process system were reflected within the DES conceptual model. Hence, in terms of decision making, the process of gradual development of the SD conceptual model (including the Dynamic Hypothesis Definition step – see step 2b in Figure 5.1) potentially implies a tighter relationship between the representations of the computerised and conceptual models of the process system, that is, between the computerised model and the part of the real-world system that is being modelled.

Phase 4: Verification and Validation (DES)/Testing (SD)

SD: Brailsford et al. (2004, p. 38) refer to the validation of SD models as a ‘thorny topic’. The fact is that the validation of qualitative and quantitative SD models requires different degrees of understanding of the simulation model by managers. For example, building and validating a qualitative SD simulation model require a detailed understanding of the model's structure and outputs by managers, as their ‘mental databases’ are regarded as a paramount tool not only for model building, but also for model validation (e.g. Wolstenholme, 1993; Lane and Husemann, 2008). In this regard, the SD study performed by Lane and Husemann (2008) is particularly illustrative. It discusses the activities performed before and during a series of workshops that allowed a complex qualitative model of pathways for acute patients to be built, refined and validated, with expert knowledge serving as the major source of model building and validation.

At the same time, the quantitative SD models can be validated ‘by a “black box” process’ (Brailsford et al., 2004, p. 38), that is by comparing the simulated behaviour of the modelled system with the historical behaviour of the corresponding real-world system (e.g. Lane, Monefeldt and Rosenhead, 2000). This can also be the case when an SD study combines both approaches to model building and validation (e.g. Brailsford et al., 2004), as described in Chapter 6.

DES: While Pidd (2003a) refers to verification in DES modelling as an almost outdated phase of simulation modelling, owing to the introduction of visual interactive modelling systems (VIMSs), the DES studies in the area of patient care process simulation still treat model verification as a core step. This step ensures the correct transition from the conceptual model to its computerised representation. The discrepancy between the conceptual and computerised models revealed in most of the reviewed studies therefore calls for verification of the transformation process from the conceptual to the computerised model.

The validation of the simulation model in DES studies is predominantly quantitative and, as is the case with quantitative SD, includes a comparison of the behaviour exhibited by the simulation model with the historical behaviour of the corresponding real-world system. Examples of such an approach to validation of DES simulation models can be found in Dittus et al. (1996), Altinel and Ulas (1996), Cardoen and Demeulemeester (2008), Coelli et al. (2007), Ferreira et al. (2008), Kolker (2008, 2009), Werker et al. (2009) and Levin et al. (2008). As validation of DES is predominantly quantitative, a considerable amount of the project time in DES studies is spent on data accumulation and validation (Tako and Robinson, 2008), as discussed in Chapter 8.
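
Operationally, this ‘black box’ comparison reduces to running the computerised model over a period for which historical records exist and measuring how far the simulated behaviour departs from the recorded behaviour. The sketch below is a minimal, hypothetical illustration: the occupancy series, the error measure (mean absolute percentage error) and the 10% acceptance threshold are assumptions made for illustration, not a procedure prescribed by the cited studies.

```python
# A minimal, hypothetical sketch of quantitative ('black box') validation:
# the behaviour produced by the simulation model is compared against the
# recorded historical behaviour of the real-world system. The series, the
# error measure and the 10% threshold are illustrative assumptions only.

# Historical weekly bed-occupancy figures (assumed, for illustration)
historical = [72.0, 75.0, 78.0, 74.0, 80.0, 83.0, 79.0]
# Corresponding output of the computerised simulation model (assumed)
simulated = [70.5, 76.0, 77.0, 75.5, 78.0, 84.0, 81.0]

def mean_absolute_percentage_error(observed, modelled):
    """Average relative deviation of the model from the observed behaviour."""
    return sum(abs(o - m) / o for o, m in zip(observed, modelled)) / len(observed)

mape = mean_absolute_percentage_error(historical, simulated)
print(f"MAPE = {mape:.1%}")
if mape <= 0.10:
    print("Simulated behaviour reproduces the historical behaviour within tolerance.")
else:
    print("Model requires revision before it can be used for experimentation.")
```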

Trends and implications for MS: Evidently, the validation performed in SD and DES simulation projects is empirical in nature, as it questions whether the suggested computerised model can reproduce the problem behaviour of the process system (see step 4b in Figure 5.1). The empirical data describing the problem behaviour of the process system may take the form of ‘mental databases’ (in the case of qualitative SD) or of quantitative data about the system's past behaviour (in the case of quantitative SD and DES). Owing to the discrepancy in the level of detail between the DES conceptual and computerised models, as well as the lack of empirical grounding in a large part of DES conceptual models, verification in DES simulation is more problematic than in SD. Indeed, when such properties of a process system as the nature of the built-in queuing mechanism are not covered within the conceptual model but are reflected in the computerised model (e.g. Cardoen and Demeulemeester, 2008), the value of verifying the computerised model may be highly questionable.

Phase 5: Experimentation

The next phase within both quantitative SD studies (Worthington, 1991; Lane, Monefeldt and Rosenhead, 2000; Brailsford et al., 2004) and DES studies (Werker et al., 2009; de Bruin et al., 2007; Cardoen and Demeulemeester, 2008; Kolker, 2008, 2009) involves the analysis of alternative scenarios by means of model simulations. In this phase a set of simulation experiments is run in order to determine the possible scenarios of the system's behaviour under changing conditions. Both SD and DES simulation therefore support the management decision-making process when evaluating different operational alternatives in order to improve existing patient care processes or to assist in designing and planning new ones.
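
In computational terms, such experimentation amounts to re-running the same computerised model under alternative parameterisations and comparing the resulting behaviour. The following hypothetical ‘what-if’ sweep over treatment capacity, applied to the toy stock-and-flow model sketched earlier, is purely illustrative; the scenarios and values are assumptions rather than results from any of the reviewed studies.

```python
# A minimal, hypothetical sketch of the experimentation phase: the same
# computerised model is run under alternative 'what-if' scenarios (here,
# different treatment capacities) and the resulting behaviour is compared
# to inform the choice between operational alternatives. All values are
# illustrative assumptions, not taken from the cited studies.

def simulate_waiting_list(treatment_capacity, arrival_rate=8.0,
                          avg_treatment_time=1.5, horizon=60.0, dt=0.25):
    """Return the final size of the waiting list under a given capacity."""
    waiting, in_treatment, t = 20.0, 10.0, 0.0
    while t < horizon:
        admissions = min(waiting / dt, treatment_capacity)
        discharges = in_treatment / avg_treatment_time
        waiting += dt * (arrival_rate - admissions)
        in_treatment += dt * (admissions - discharges)
        t += dt
    return waiting

# 'What-if' experiments: evaluate alternative capacity configurations
for capacity in (8.0, 10.0, 12.0, 14.0):
    print(f"capacity {capacity:>4.1f} admissions/day -> "
          f"{simulate_waiting_list(capacity):6.1f} patients still waiting")
```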

Trends and implications for MS: Both SD and DES reportedly help managers to analyse the design of the patient care process and to arrive at a more effective design in terms of the use of limited resources to achieve the objectives that drive the operational patient care processes. DES and SD modelling help to gain insight into the functioning of the system under alternative conditions and courses of action. The ultimate aim of adopting SD and DES as tools for supporting management decision making in the context of patient care processes is to explore and evaluate the alternative scenarios of the behaviour of the process system under changing conditions. In order to understand how the alternative courses of action of the process system can be enabled, and why they are enabled in a specific way, the generative mechanisms that enact the ‘what-if’ scenarios must be understood, modelled and operationalised.

5.5 Summary and Conclusions

This research set out to investigate how the CR philosophy of science facilitates explicit articulation of the fundamental philosophical assumptions underlying SD and DES worldviews. In doing so, we shared the view of many management scientists that ‘the worldview that we hold determines the process we will advocate for solving a decision problem’ (Buchanan, Henig and Henig, 1998).

This research postulated the primacy of ontological positioning (i.e. ‘our convictions about the nature of the world’) that, in turn, strongly influences the nature of the produced knowledge (epistemology) about a specific decision-making problem as well as the choice of the methodology used to tackle this problem. In the context of this research, simulation models were regarded as tools for systematically attaining a better understanding of the real world and thereby for providing support for real-world management decision-making activities, in particular contributing to more effective use of simulation for intelligent thinking about, and management decision support in, process systems.

We have demonstrated how CR provides a unique response to a call (made in Buchanan, Henig and Henig, 1998) for a scientific position that would allow the combined use of the objectivist worldview that is based on the assumption of existence of ‘an objective reality which can be measured and described’ and a more subjectivist position that allows ‘the mind of the decision maker’ to interpret the reality based on the available, including value-laden, data. CR addresses this research call by explicitly articulating the fundamental philosophical assumptions underlying SD and DES simulation worldviews.

The adopted CR scientific position allowed an interpretation of the logic underlying the SD and DES simulation methods through the prism of its epistemological position dictated by its stratified ontology. Mapping distinct phases of the SD and DES lifecycles onto the ontological domains of CR allowed a comparison of SD and DES knowledge-generating/decision-making steps and their respective outcomes. In particular, this relates to the use of the available empirical data (generally more value-laden in the case of SD than DES) in order to identify the generative mechanisms that trigger the puzzling behaviour of complex process systems. While using different sets of grammatical constructs, both simulation methodologies aim at understanding the underlying structural pattern that triggers the behaviour of complex process systems.

Without denying the criticality of understanding the impact of randomness (a factor that operates at the ontological level of the empirical) on the behaviour of the system under study (Morecroft and Robinson, 2005), our perspective informed by CR epistemology demonstrated the need for a more structured enquiry into the logic that guides conceptual model building as part of the DES-induced decision-making process. The abductive mode of logical inference was therefore suggested as an established method for identifying the underlying entities of the modelled system and the logic of their interaction.

Based on the literature that reports the use of DES in the context of health care process systems, it was concluded that DES MS models are often less explicit in relating the existing data to the suggested conceptual model of the process system, as well as in relating the conceptual model to the computerised model. Hence, one of the suggestions for improving the results of using DES for decision support in process systems would be to focus on providing a better link between the data accumulation and conceptual model building phases (steps 2a and 2b in Figure 5.1) and between conceptual model building and computerised (simulation) model building (steps 3a and 4a in Figure 5.1). A tighter relationship between the available empirical data and the conceptual and computerised models would arguably assure a better understanding of the existing preferences and overall better decision-making results when adopting DES in the context of process systems.

In the case of SD simulation, the adoption of the abductive mode of logical inference as part of the CR position demonstrated the importance of acquiring the most complete dataset possible (whether value based or not) to assure the reliability of dynamic hypothesis building (step 2a in Figure 5.1) and testing (step 2b in Figure 5.1). Herein, the importance of random factors is not denied, and the ability to reflect them as part of the empirical dataset would provide better empirical bases for enquiry into the hypothesised causal mechanisms that triggered the given behaviour of the process system under study. The adoption of the CR epistemology also indicates that building the computerised model and running the experiments are essential in order to expand the set of decision-making alternatives concerning the behaviour of the process system under study.

This chapter is the first to examine explicitly the philosophical bases of the DES and SD simulation worldviews by adopting a particular philosophical position, and to report its findings through the prism of CR stratified ontology and the abductive mode of knowledge generation. By doing so, it provides a better understanding of how management decision-making knowledge is generated through the modelling process. This contribution is important for managers who, as ‘intelligent thinkers’ about real process systems, both contribute to and use simulation models for decision making. It is also important for management scientists who require an in-depth understanding of the scientific bases of modelling methodologies in order to choose an appropriate simulation worldview to support management decision making in real-world process systems in a truly scientific manner, thus living up to the original promise of MS as a scientific discipline.

References

  1. Altinel, I.K. and Ulas, E. (1996) Simulation modeling for emergency bed requirement planning. Annals of Operations Research, 67, 183–210.
  2. Bailey, N.T.J. (1952) A study of queues and appointment systems in hospital outpatient departments, with special reference to waiting times. Journal of the Royal Statistical Society, Series B, 14, 185–199.
  3. Barton, P.M. and Tobias, A.M. (2000) Discrete quantity approach to continuous simulation modeling. Journal of the Operational Research Society, 51 (4), 485–489.
  4. Becker, J. and Niehaves, B. (2007) Epistemological perspectives on IS research: a framework for analysing and systematizing epistemological assumptions. Information Systems Journal, 17, 197–214.
  5. Bhaskar, R. (1978) A Realist Theory of Science, Harvester Press, Brighton.
  6. Bhaskar, R. (1979) The Possibility of Naturalism: A philosophical critique of the contemporary human sciences, Harvester Press, Brighton.
  7. Bhaskar, R. (1986) Scientific Realism and Human Emancipation, Verso, London.
  8. Bhaskar, R. (1989) Reclaiming Reality: A Critical Introduction to Contemporary Philosophy, Verso, London.
  9. Bhaskar, R. (1994) Plato Etc: The Problems of Philosophy and their Resolution, Routledge, London.
  10. Borshchev, A., Karpov, Y. and Kharitonov, V. (2002) Distributed simulation of hybrid systems with AnyLogic and HLA. Future Generation Computer Systems, 18, 829–839.
  11. Brahimi, M. and Worthington, D.J. (1991) Queuing models for out-patient appointment systems: a case study. Journal of the Operational Research Society, 42 (9), 733–746.
  12. Brailsford, S.C., Churilov, L. and Liew, S.-K. (2003) Treating ailing Emergency Departments with simulation: an integrated perspective, in Health Sciences Simulation 2003 (eds J. Anderson and M. Katz), Society for Modeling and Computer Simulation, San Diego, CA, pp. 25–30.
  13. Brailsford, S.C. and Hilton, N.A. (2001) A comparison of discrete event simulation and system dynamics for modelling health care systems, in Planning for the Future: Health Service Quality and Emergency Accessibility (ed. J. Riley), Glasgow Caledonian University, Glasgow.
  14. Brailsford, S.C., Lattimer, V.A., Tarnaras, P. and Turnbull, J.C. (2004) Emergency and on-demand health care: modelling a large complex system. Journal of the Operational Research Society, 55, 34–42.
  15. Buchanan, J.T., Henig, E.J. and Henig, M.I. (1998) Objectivity and subjectivity in the decision making process. Annals of Operations Research, 80, 333–345.
  16. Burrell, G. and Morgan, G. (1979) Sociological Paradigms and Organisational Analysis: Elements of the Sociology of Corporate Life, Heinemann, London.
  17. Cardoen, B. and Demeulemeester, E. (2008) Capacity of clinical pathways – a strategic multi-level evaluation tool. Journal of Medical Systems, 32, 443–452.
  18. Churchman, C.W. (1955) Management Science, the journal. Management Science, 1 (2), 187–188.
  19. Churchman, C.W. (1994) Management science: science of managing and managing of science. Interfaces, 24 (4), 99–110.
  20. Coelli, F.C., Ferreira, R.B., Almeida, R.M.V.R. and Pereira, W.C.A. (2007) Computer simulation and discrete-event models in the analysis of a mammography clinic patient flow. Computer Methods and Programs in Biomedicine, 87, 201–207.
  21. Collier, A. (1994) Critical Realism: An Introduction to Roy Bhaskar's Philosophy, Verso, London.
  22. Commission on the Future Practice of Operational Research (1986) Report of the 1986 Commission on the Future Practice of Operational Research. Journal of the Operational Research Society, 37, 829–886.
  23. Cote, M.J. (2000) Understanding patient flow. Decision Line, 31, 8–10.
  24. Coyle, R.G. (1985) Representing discrete events in system dynamics models: a theoretical application to modeling coal production. Journal of the Operational Research Society, 36 (4), 307–318.
  25. Danermark, B., Ekstrom, M., Jakobsen, L. and Karlsson, J.C. (2002) Explaining Society: Critical Realism in the Social Sciences, Routledge, London.
  26. Davies, R. and Davies, H. (1994) Modeling patient flows and resources in health systems. Omega, 22, 123–131.
  27. de Bruin, A., van Rossum, A., Visser, M. and Koole, G. (2007) Modeling the emergency cardiac in-patient flow: an application of queuing theory. Health Care Management Science, 10, 125–137.
  28. Dittus, R.S., Klein, R.W., DeBrota, D.J. and Fitzgerald, M.A. (1996) Medical resident work schedules: design and evaluation by simulation. Management Science, 42 (6), 891–906.
  29. Doomun, R. and Jungum, N.V. (2008) Business process modeling, simulation and reengineering: call centres. Business Process Management Journal, 14 (6), 838–848.
  30. Downward, P. and Mearman, A. (2007) Retroduction as mixed-methods triangulation in economic research. Cambridge Journal of Economics, 31, 77–99.
  31. Edwards, R.H., Clague, J.E., Barlow, J. et al. (1994) Operations research survey and computer simulation of waiting times in two medical outpatient clinic structures. Health Care Analysis, 2, 164–169.
  32. El-Darzi, E., Vasilakis, C., Chaussalet, T. and Millard, P.H. (1998) A simulation modeling approach to evaluating length of stay, occupancy, emptiness and bed blocking in a hospital geriatric department. Health Care Management Science, 1, 143–149.
  33. Ferreira, R.B., Coelli, F.C., Pereira, W.C.A. and Almeida, R.M.V.R. (2008) Optimizing patient flow in a large hospital surgical centre by means of discrete-event computer simulation models. Journal of Evaluation in Clinical Practice, 14, 1031–1037.
  34. Fletcher, A. and Worthington, D. (2009) What is a ‘generic’ hospital model? A comparison of ‘generic’ and ‘specific’ hospital models of emergency patient flows. Health Care Management Science, 12 (4), 374–391.
  35. Forrester, J.W. (1961) Industrial Dynamics, MIT Press, Cambridge, MA.
  36. Forrester, J.W. (1992) Policies, decisions and information sources for modeling. European Journal of Operational Research, 5, 42–63.
  37. Gorunescu, F., McClean, S.I. and Millard, P.H. (2002) A queuing model for bed-occupancy management and planning of hospitals. Journal of the Operational Research Society, 53 (1), 19–24.
  38. Greasley, A. (2005) Using system dynamics in a discrete-event simulation study of a manufacturing plant. International Journal of Operations and Production Management, 25 (6), 534–548.
  39. Größler, A., Thun, J.-H. and Milling, P.M. (2008) System dynamics as a structural theory in operations management. Production and Operations Management, 17 (3), 373–384.
  40. Hall, R.W. (1985) What's so scientific about MS/OR? Interfaces, 15, 40–45.
  41. Haraden, C. and Resar, R. (2004) Patient flow in hospitals: understanding and controlling it better. Frontiers of Health Services Management, 20, 3–15.
  42. Hopp, W.J. (2008) Management science and the science of management. Management Science, 54 (12), 1961–1962.
  43. Jackson, R.R.P., Welch, J.D. and Fry, J. (1964) Appointment systems in hospitals and general practice. Operational Research Quarterly, 15, 219–237.
  44. Jiang, L. and Giachetti, R. (2008) A queuing network model to analyze the impact of parallelization of care on patient cycle time. Health Care Management Science, 11 (3), 248–261.
  45. Jun, J.B., Jacobson, S.H. and Swisher, J.R. (1999) Application of discrete-event simulation in health care clinics: a survey. Journal of the Operational Research Society, 50 (2), 109–123.
  46. Karpov, Y.G., Ivanovsky, R. and Sotnikov, K.A. (2007) Application of simulation approaches to creation of decision support system for IT service management, in Parallel Computing Technologies, Lecture Notes in Computer Science, vol. 4671, Springer, Berlin, pp. 553–558.
  47. Katsaliaki, K., Mustafee, N., Taylor, S.J.E. and Brailsford, S. (2009) Comparing conventional and distributed approaches to simulation in a complex supply-chain health system. Journal of the Operational Research Society, 60, 43–51.
  48. Kolker, A. (2008) Process modeling of emergency department patient flow: effect of patient length of stay on ED diversion. Journal of Medical Systems, 32, 389–401.
  49. Kolker, A. (2009) Process modeling of ICU patient flow: effect of daily load leveling of effective surgeries on ICU diversion. Journal of Medical Systems, 33, 27–40.
  50. Kuhn, T.S. (1970) The Structure of Scientific Revolutions, 2nd edn, Chicago University Press, Chicago.
  51. Lane, D.C. (1998) Can we have confidence in generic structures? Journal of the Operational Research Society, 49, 936–947.
  52. Lane, D.C. (1999) Social theory and system dynamics practice. European Journal of Operational Research, 113, 501–527.
  53. Lane, D.C. (2000) You just don't understand me: Models of failure and success in the discourse between system dynamics and discrete event simulation, LSE OR Department, Working Paper LSEOR 00-34.
  54. Lane, D.C. and Husemann, E. (2008) System dynamics mapping of acute patient flows. Journal of the Operational Research Society, 59 (2), 213–224.
  55. Lane, D.C., Monefeldt, C. and Rosenhead, J.V. (2000) Looking in the wrong place for healthcare improvements: a system dynamics study of an accident and emergency department. Journal of the Operational Research Society, 51, 518–531.
  56. Lattimer, V.A., Brailsford, S.C., Turnbull, J.A. et al. (2004) Reviewing emergency care systems: insights from system dynamics modelling. Emergency Medicine Journal, 21, 685–691.
  57. Lawson, T. (1998) Economic science without experimentation, in Critical Realism: Essential Readings (eds M. Archer et al.), Routledge, London, pp. 144–169.
  58. Lehaney, B., Malindzak, D. and Khan, Z. (2008) Simulation modeling for problem understanding: a case study in the East Slovakia coal industry. Journal of the Operational Research Society, 59, 1332–1339.
  59. Levin, S.R., Dittus, R., Aronsky, D. et al. (2008) Optimizing cardiology capacity to reduce emergency department boarding: a systems engineering approach. American Heart Journal, 156, 1202–1209.
  60. Little, J.D.C. (1986) Research opportunities in the decision and management sciences. Management Science, 32 (1), 1–13.
  61. Lorenz, T. and Jost, A. (2006) Towards an orientation framework in multi-paradigm modelling. Proceedings of the 24th International Conference of the System Dynamics Society, Nijmegen, The Netherlands.
  62. Lowery, J.C. and Martin, J.B. (1992) Design and validation of a critical care simulation model. Journal of the Society for Health Systems, 3, 15–36.
  63. Meadows, D.H. (1980) The unavoidable a priori, in Elements of the System Dynamics Method (ed. J. Randers), Productivity Press, Cambridge.
  64. Meredith, J.R. (2001) Reconsidering the philosophical basis of OR/MS. Operations Research, 49 (3), 325–333.
  65. Mingers, J. (2000) The contribution of critical realism as an underpinning philosophy for OR/MS and systems. Journal of the Operational Research Society, 51, 1256–1270.
  66. Mingers, J. (2003) A classification of the philosophical assumptions of management science methods. Journal of the Operational Research Society, 54, 559–570.
  67. Mingers, J. (2006a) A critique of statistical modeling in management science from a critical realist perspective: its role within multimethodology. Journal of the Operational Research Society, 57, 202–219.
  68. Mingers, J. (2006b) Realising Systems Thinking: Knowledge and Action in Management Science, Springer, New York.
  69. Mingers, J. and Brocklesby, J. (1997) Multimethodology: towards a framework for mixing methodologies. Omega, 25 (5), 489–509.
  70. Mitroff, I.I. (1972) The myth of objectivity or why science needs a new psychology of science. Management Science, 18 (10), B613–B618.
  71. Mitroff, I.I. (1994) The cruel science of world mismanagement: an essay in honor of C. West Churchman. Interfaces, 24 (4), 94–98.
  72. Morecroft, J.D.W. and Robinson, S. (2005) Explaining puzzling dynamics: comparing the use of system dynamics and discrete event simulation, in Proceedings of the 23rd International Conference of the System Dynamics Society (eds J.D. Sterman, M.P. Repenning, R.S. Langer et al.), System Dynamics Society, Boston, MA.
  73. Moreno, L., Aguilar, R.M., Martín, C.A. et al. (2000) Patient-centered simulation to aid decision-making in hospital management. Simulation, 74, 290–304.
  74. Peña-Mora, F., Sangwon, H., Lee, S.H. and Park, M. (2008) Strategic-operational construction management: hybrid system dynamics and discrete event approach. Journal of Construction Engineering and Management, 134 (9), 701–710.
  75. Pidd, M. (2003a) Computer Simulation in Management Science, 4th edn, John Wiley & Sons, Ltd, Chichester.
  76. Pidd, M. (2003b) Tools for Thinking: Modeling in Management Science, 2nd edn, John Wiley & Sons, Ltd, Chichester.
  77. Pidd, M. (2004a) Simulation worldviews – so what? Proceedings of the 2004 Winter Simulation Conference.
  78. Pidd, M. (2004b) Computer Simulation in Management Science, 5th edn, John Wiley & Sons, Ltd, Chichester.
  79. Popkov, T. and Garifullin, M. (2006) Multi-approach simulation modeling: challenge of the future, in Proceedings of the Asia Simulation Conference 2006 on Systems Modeling and Simulation: Theory and Applications (eds K. Koyamada, S. Tamura and O. Ono), Springer, Tokyo, pp. 103–108.
  80. Randers, J. (1980) Elements of the System Dynamics Method, Productivity Press, Cambridge.
  81. Richardson, G. and Pugh, A. (1981) Introduction to System Dynamics Modeling with DYNAMO, MIT Press, Cambridge, MA.
  82. Robinson, S. (2008) Conceptual modeling for simulation Part I: definition and requirements. Journal of the Operational Research Society, 59, 278–290.
  83. Rohleder, T.R., Bischak, D.P. and Baskin, L.B. (2007) Modeling patient service centers with simulation and system dynamics. Health Care Management Science, 10, 1–12.
  84. Ryan, J. and Heavey, C. (2006) Process modeling for simulation. Computers in Industry, 57 (5), 437–450.
  85. Sargent, R.G. (2004) Validation and verification of simulation models. Proceedings of the 2004 Winter Simulation Conference, pp. 130–143.
  86. Sterman, J.D. (2000) Business Dynamics: Systems Thinking and Modeling for a Complex World, Irwin McGraw-Hill, Boston, MA.
  87. Sweetser, A. (1999) A comparison of system dynamics and discrete event simulation. Proceedings of the 17th International Conference of the System Dynamics Society and 5th Australian & New Zealand Systems Conference.
  88. Tako, A.A. and Robinson, S. (2008) Model building: a quantitative comparison. Proceedings of the 26th International Conference of the System Dynamics Society.
  89. Tako, A.A. and Robinson, S. (2009) Comparing discrete-event simulation and system dynamics: users' perceptions. Journal of the Operational Research Society, 60, 296–312.
  90. Tan, B.A., Gubaras, A. and Phojanomongkolkij, N. (2002) Simulation study of Dreyer Care Facility. Proceedings of the Winter Simulation Conference, pp. 1922–1927.
  91. Van Horn, R.L. (1971) Validation of simulation results. Management Science, 17 (5), 247–257.
  92. Venkateswaran, J. and Son, Y.-J. (2005) Hybrid system dynamic—discrete event simulation-based architecture for hierarchical production planning. International Journal of Production Research, 43 (20), 4397–4429.
  93. Vissers, J. and Wijnaard, J. (1979) The outpatient appointment system: design of a simulation study. European Journal of Operational Research, 13, 459–463.
  94. Wainer, G.A. (2009) Discrete-Event Modeling and Simulation: A Practitioner's Approach, Taylor & Francis, Boca Raton, FL.
  95. Walsham, G. (1992) Management science and organisational change: a framework for analysis. Omega, 20, 1–9.
  96. Walter, S.D. (1973) A comparison of appointment schedules in a hospital radiology department. British Journal of Preventive and Social Medicine, 27, 160–167.
  97. Werker, G., Sauré, A., French, J. and Shechter, S. (2009) The use of discrete-event simulation modelling to improve radiation therapy planning processes. Radiotherapy and Oncology, 92 (1), 76–82.
  98. Wolstenholme, E. (1993) A case study in community care using system thinking. Journal of the Operational Research Society, 44 (9), 925–934.
  99. Wolstenholme, E. (1999) A patient flow perspective of U.K. health services: exploring the case for new ‘intermediate care’ initiatives. System Dynamics Review, 15 (3), 263–271.
  100. Worthington, D. (1991) Hospital waiting list management models. Journal of the Operational Research Society, 42 (10), 833–843.