13
Multiscale Modelling for Public Health Management: A Practical Guide

Rosemarie Sadsad1,2,3 and Geoff McDonnell3,4

1Centre for Infectious Diseases and Microbiology – Public Health, Westmead Hospital, Sydney, New South Wales, Australia

2Sydney Medical School, Westmead, The University of Sydney, New South Wales, Australia

3Centre for Health Informatics, Australian Institute of Health Innovation, University of New South Wales, Sydney, New South Wales, Australia

4Adaptive Care Systems, Sydney, New South Wales, Australia

13.1 Introduction

System modelling and simulation are increasingly applied to the management of chronic, persistent problems, including those of public health. Simulation models of complex and multilevel organisational systems such as health care are often abstracted at a single level of interest, and developing simulation models of systems that span multiple organisational levels and physical scales poses many challenges. We describe several theoretical and conceptual frameworks for multilevel system analysis and present an approach for developing multiscale and multimethod simulation models to aid management decisions. We use simulation models to illustrate how management actions, informed by patterns in stock levels, govern discrete events and entities, which, collectively, change the flow mechanism that controls stock levels.

13.2 Background

Public health management is complex and inherently multilevel. The determinants of public health are highly interrelated and context dependent (WHO, 2011). These determinants exist at different organisational levels, affect a range of population groups, and are effective at different times or for different lengths of time (Winsberg, 2010). Context refers to the conditions or circumstances surrounding an event or action. Context may include biological or individual capacities, interpersonal relationships, institutional settings (resources, culture, and leadership) or the wider infrastructure system (the physical, cultural and regulatory environment) (Pawson, 2006; Kaplan et al., 2010). Interventions are more effective and sustainable when these complex and multilevel aspects are understood and considered.

13.3 Multilevel System Theories and Methodologies

Few approaches support comparative studies across contexts (McPake and Mills, 2000; Mills, 2012) or the development of multilevel causal theories. Theories and methodologies that adopt systems approaches recognise multilevel and dynamic determinants (Leischow and Milstein, 2006; Homer and Hirsch, 2006; Galea, Hall and Kaplan, 2009).

Systems thinking, based on general system theory (von Bertalanffy, 1968), perceives the world as a system composed of interrelated parts that are coherently organised in a way that serves a purpose (von Bertalanffy, 1968; Meadows, 2008; de Savigny and Adam, 2009). Hierarchical system theory (Simon, 1962; von Bertalanffy, 1968) describes a multilevel system as nested systems and parts. Changes to the set of systems and parts, how they are connected, or the complex and sometimes circular relationships between them, can cause outcomes to respond in a nonlinear and often unpredictable manner over time (Sterman, 2000; Meadows, 2008). Koestler (1978) expands on the concepts of ‘part’ and ‘system’ for hierarchical systems and introduces the concept of a ‘holon’. A holon is a partially independent unit that is self-regulated by its internal parts and subsystems. Each holon is embedded within, serves as part of, and is influenced by a larger holon. A holon is both a part and a subsystem. Holons interact with internal and external parts and can span multiple levels of a system.

Problem abstraction involves developing a simpler, but adequate, representation of a real-world complex system. The complexity of a system can be reduced by applying the principle that complex systems are partially decomposable (Simon, 1962). The links between structures within a holon are stronger (or more tightly coupled) than the links between holons (see Figure 13.1). This allows holons to be separated at weaker (or loosely coupled) links, while acknowledging their interconnectedness, for a simpler analysis. It may also help resolve boundary definitions between hierarchical or organisational levels, which can be hard to specify in absolute terms, a commonly encountered problem (Rousseau, 1985).


Figure 13.1 Multilevel context, mechanisms and outcomes. Adapted from Pawson and Tilley (1997) and Ratzé et al. (2007).

The concept of decomposability aligns with the design rules of Baldwin and Clark (2000) for modular technology development. A module can be separated from the larger system within which it functions. Modular design, when applied to technology development, can produce increasingly complex technology that evolves and improves in an unpredictable, yet coordinated, way through a decentralised value-seeking process. As individual modules improve (or evolve), the technology as a whole evolves and, consequently, so does its ecosystem (Adomavicius et al., 2007). Holling and Gunderson (2002) use the term ‘panarchy’ to describe complex evolving hierarchical systems. Panarchy is a conceptual framework that describes systems as interlinked in ongoing adaptive cycles of growth, conservation, release and reorganisation. These cycles of change occur over a range of spatial and temporal scales. These concepts lend themselves to an adaptive approach to management: by understanding these cycles and the scales at which they operate, leverage points for change that promote resilience and sustainability may be identified.

An organised view of a system can be achieved by plotting holons, events or actions and their interrelationships on a map of hierarchical organisation. Such a map conveys the key organisational levels, timescales or spatial scales that may be involved in the public health problem, along with cross-level (or cross-scale) information. This can help determine appropriate levels of governance, key multidisciplinary stakeholders and actions, and shared responsibilities. An example of such a map is a Stommel diagram, a tool used in the physical sciences that presents on a graph the range of levels (or physical scales, such as time or space) that may be involved in the phenomena of interest (Stommel, 1963).
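Such a map can be sketched with standard plotting tools. The example below is a minimal Python (matplotlib) sketch of a Stommel-style diagram; the phenomena, time ranges and organisational levels are hypothetical placeholders chosen for illustration only.

```python
# Illustrative Stommel-style diagram: phenomena are drawn as boxes spanning the
# (logarithmic) time scales and organisational levels over which they operate.
# All phenomena and ranges below are hypothetical placeholders.
import matplotlib.pyplot as plt
import matplotlib.patches as patches

# (name, time range in days (min, max), organisational level range (min, max))
phenomena = [
    ("Patient colonisation", (1, 30), (1, 2)),
    ("Ward outbreak", (7, 90), (2, 3)),
    ("Hospital infection control policy", (90, 365), (3, 4)),
    ("National surveillance programme", (365, 3650), (4, 5)),
]

fig, ax = plt.subplots(figsize=(7, 4))
for name, (t0, t1), (l0, l1) in phenomena:
    ax.add_patch(patches.Rectangle((t0, l0), t1 - t0, l1 - l0,
                                   alpha=0.3, edgecolor="black"))
    ax.text(t0, l1, name, fontsize=8, va="bottom")

ax.set_xscale("log")
ax.set_xlabel("Time scale (days, log axis)")
ax.set_ylabel("Organisational level")
ax.set_yticks([1, 2, 3, 4, 5])
ax.set_yticklabels(["Individual", "Ward", "Hospital", "Region", "National"])
ax.set_xlim(0.5, 10000)
ax.set_ylim(0.5, 5.5)
plt.tight_layout()
plt.show()
```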

The Realist approach can help identify key determinants of public health outcomes and their management. Regularities observed in public health outcomes are conceptualised as being generated by particular mechanisms acting within a context (Pawson and Tilley, 1997). Mechanisms describe how outcomes are hypothesised to be produced. Context is the conditions or circumstances that trigger or control the operation of the mechanisms. These causal relationships are articulated with context–mechanism–outcome (CMO) configurations (Pawson and Tilley, 1997). A CMO configuration lists mechanisms and contextual factors together with the outcomes that result from their interaction. Context, mechanisms and, consequently, outcomes each change over time. The original CMO framework, as described by Pawson and Tilley (1997), is extended here to highlight the dynamic nature of this interaction (see Figure 13.1).

Contextual factors, mechanisms and outcomes are conceptualised as holons. Their state may change over time (t) and across levels of organisation (l). Holons are components and form part of a compound holon. Holons are connected by weak or strong links.

According to a Realist approach, an intervening (or management) action can change an outcome by creating a new mechanism, by modifying or disabling the operation of existing mechanisms, or by modifying the governing context (Pawson and Tilley, 1997; Kazi, 2003). Causal loop diagrams, a tool from the system dynamics approach (Sterman, 2000), can describe the direction of causation between CMO elements and whether changes in one CMO element drive a similar or opposite change in another CMO element (polarity). The concept of stocks and flows from the system dynamics approach (Sterman, 2000) can also be incorporated. In stock and flow diagrams, changes in the level of a stock or quantity are conceptualised as occurring through adjustments to the rates at which the stock is filled or emptied. Management actions can change flow rates to control stock levels. Whereas stock and flow diagrams represent changes in the accumulation of quantities or entities, Unified Modeling Language (UML) state charts (OMG, 2012) represent event-triggered transitions between finite states for a single entity, such as an individual. Management actions can govern the occurrence of these events so that the collective response of entities may affect outcomes. We describe a synthesis of these theories and methodologies for analysing complex and multilevel management outcomes. In Chapter 14, we apply this approach to frame the problem of MRSA endemicity in hospitals and explore alternative hospital infection control policies for its management.
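To make the distinction concrete, the following Python sketch contrasts the two representations; the stock, flow rates, states, events and the screening action are hypothetical illustrations rather than the chapter's case study.

```python
# Illustrative only: a stock-and-flow view vs. a state-chart view of the same
# hypothetical management action (e.g. tightening an admission screening policy).
import random

# --- Stock-and-flow view (aggregate): the action changes a flow rate ----------
def simulate_stock(inflow_rate, outflow_rate, steps=10, stock=100.0):
    """Euler-style update of a single stock controlled by two flows."""
    history = []
    for _ in range(steps):
        stock += inflow_rate - outflow_rate * stock  # net flow adjusts the stock
        history.append(stock)
    return history

baseline = simulate_stock(inflow_rate=5.0, outflow_rate=0.10)
with_action = simulate_stock(inflow_rate=2.0, outflow_rate=0.10)  # action reduces inflow

# --- State-chart view (individual): the action governs an event ---------------
SUSCEPTIBLE, COLONISED = "susceptible", "colonised"

def step_individual(state, exposure_prob, screening_on):
    """Event-triggered transition for one entity; the management action
    (screening_on) blocks the colonisation event when it fires."""
    if state == SUSCEPTIBLE and random.random() < exposure_prob and not screening_on:
        return COLONISED
    return state

random.seed(1)
person = SUSCEPTIBLE
for day in range(10):
    person = step_individual(person, exposure_prob=0.2, screening_on=(day >= 5))

print(baseline[-1], with_action[-1], person)
```

Collectively, many such individual transitions reproduce the flow that the aggregate view summarises as a single rate.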

13.4 Multiscale Simulation Modelling and Management

Conceptual models for framing problems and planning and evaluating public health actions can convey multilevel determinants and show their interrelationships, but they may not capture the magnitude of their impact or the way these determinants and relationships change with time (Sterman, 2000). A simulation provides a platform to rigorously test single-level or multilevel hypotheses in a single experimental framework. It can capture and explore the temporal dynamics of multilevel management problems.

Single-level or single-scale simulation, where only one level of abstraction of a problem is modelled and simulated, has been used extensively to study many management problems and actions, including those of public health (Fone et al., 2003; Brailsford et al., 2009; Forsberg et al., 2011; Sobolev, Sanchez and Vasilakis, 2011). The management model of Forrester (1961, 1992), conceptualised with stocks and flows, provides a high-level view of the impact of policy decisions. Patterns observed in stock levels over time inform management of the current state of the system, and management actions aim to control the flows into and out of stocks so that the desired stock-level patterns are produced. While this model shows what is intended to happen, how this happens is not explicit at this level of abstraction. We therefore extend the model to include details of how individuals respond to policy, abstracting the problem and its management at the individual level as well. This describes how high-level policies shape particular individual discrete events and decisions, and how the stream of collective corrective actions subsequently changes the flow mechanism that controls outcomes (see Figure 13.2).
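A minimal sketch of this feedback, assuming a hypothetical target level, loss rate and proportional decision rule, is shown below in Python.

```python
# Illustrative sketch of Forrester-style management control: the observed stock
# level informs a decision rule, which adjusts a flow to pursue a target level.
# All numbers and the proportional decision rule are assumptions for illustration.

def management_flow(stock, target, adjustment_time=4.0):
    """Decision rule: correct a fraction of the gap between target and stock."""
    return (target - stock) / adjustment_time

def simulate(target=80.0, stock=20.0, loss_rate=0.05, steps=40, dt=1.0):
    levels = []
    for _ in range(steps):
        inflow = max(0.0, management_flow(stock, target))  # management action
        outflow = loss_rate * stock                        # uncontrolled drain
        stock += (inflow - outflow) * dt
        levels.append(stock)
    return levels

levels = simulate()
print(f"final stock level: {levels[-1]:.1f}")  # settles near (just below) the target
```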


Figure 13.2 The multilevel system structure of decision making. Adapted from Forrester (1992) and Borshchev and Filippov (2004).

A high-level system structure for decision making is adapted from Forrester (1992) and Borshchev and Filippov (2004) to portray the role of individual discrete events and decisions in changing system outcomes. The multilevel management model can be developed using multiple methods. For example, high-abstraction methods, such as system dynamics, can model high-level policies and structures, while low-abstraction methods, such as agent-based or discrete-event modelling, can model discrete events and individuals interacting and acting within their context, guided by policies.
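The Python sketch below illustrates this division of labour under assumed names and parameters: an aggregate, system dynamics style policy variable guides heterogeneous agents, and their collective behaviour is aggregated back up to adjust the policy.

```python
# Illustrative multimethod coupling: a high-abstraction policy level (an aggregate
# "policy pressure" variable) guides low-abstraction agents, whose collective
# behaviour is aggregated back up. All parameters are illustrative assumptions.
import random

random.seed(0)

class Agent:
    def __init__(self):
        self.responsiveness = random.uniform(0.5, 1.0)  # individual heterogeneity
        self.compliant = False

    def act(self, policy_pressure):
        # An individual decision shaped by the high-level policy signal.
        self.compliant = random.random() < policy_pressure * self.responsiveness

agents = [Agent() for _ in range(200)]
policy_pressure = 0.2  # aggregate, system dynamics-style state

for month in range(12):
    for a in agents:
        a.act(policy_pressure)
    observed_compliance = sum(a.compliant for a in agents) / len(agents)
    # The policy level adjusts itself toward a target using the aggregated signal.
    policy_pressure += 0.3 * (0.9 - observed_compliance)
    policy_pressure = min(max(policy_pressure, 0.0), 1.0)

print(f"compliance after 12 months: {observed_compliance:.2f}")
```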

A multiscale simulation¹ models multiple levels of abstraction for different time or spatial scales and can link between these levels and scales (Bassingthwaighte, Chizeck and Atlas, 2006; Meier-Schellersheim, Fraser and Klauschen, 2009; Sloot and Hoekstra, 2010). It can also switch between views at different levels and scales during real-time simulation (Bassingthwaighte, Chizeck and Atlas, 2006). This flexibility enables the presentation of both broad and specific views of the problem and its solution to multidisciplinary stakeholders and may promote consensus and collaboration (Costanza and Ruth, 1998; Etienne, Le Page and Cohen, 2003; NCI, 2012).

Multiscale simulation is primarily applied in biology (Schlessinger and Eddy, 2002; Eddy and Schlessinger, 2003; Mitha et al., 2008; Dada and Mendes, 2011), environmental sciences (Millennium Ecosystem Assessment, 2005) and physical sciences (Horstemeyer, 2010). Barriers to its application in public health include the poor availability, quality and consistency of data and theory that span multiple levels of the health care system. This is being addressed with the advancement of electronic health records and the development of frameworks for managing multilevel data collection and analyses (Eddy, 2007; IOM, 2010). Often management problems involve variables that are difficult to observe and measure. Such latent variables have been shown in single-level or single-scale models to be quantifiable (Richmond, Peterson and Vescuso, 1987) and therefore computable (Brailsford and Schmidt, 2003). Given this and the successful application of multiscale simulation in other fields, there is potential for multiscale simulation to be applied to health care and other organisational management problems.

Our multiscale simulation modelling process is composed of the following steps (a skeleton of this workflow is sketched after the list):

  1. Develop a multilevel conceptual model of management actions.
  2. Decompose the conceptual model by levels of abstraction.
  3. Develop single-scale models for each abstraction.
  4. Integrate the single-scale models to form one multiscale model.
  5. Calibrate and validate the multiscale model.
  6. Develop an interface to the model for use as an interactive learning tool, decision support tool or experimental framework.
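A hypothetical skeleton of this workflow, with placeholder function names and artefacts that are not prescribed by the chapter, might look like the following Python sketch.

```python
# Hypothetical skeleton of the six-step workflow above. The functions are
# placeholders that simply pass artefacts along, to make the shape of the
# pipeline explicit; they do not implement any particular modelling method.

def develop_conceptual_model(problem):        # step 1
    return {"problem": problem, "levels": ["policy", "individual"]}

def decompose_by_abstraction(conceptual):     # step 2
    return [{"level": lvl} for lvl in conceptual["levels"]]

def develop_single_scale(abstraction):        # step 3
    return {"model_for": abstraction["level"]}

def integrate(models):                        # step 4
    return {"submodels": models, "bridges": []}

def calibrate_and_validate(model):            # step 5
    model["validated"] = True
    return model

def build_interface(model):                   # step 6
    return {"viewpoints": [m["model_for"] for m in model["submodels"]]}

conceptual = develop_conceptual_model("MRSA endemicity in hospitals")
singles = [develop_single_scale(a) for a in decompose_by_abstraction(conceptual)]
tool = build_interface(calibrate_and_validate(integrate(singles)))
print(tool)
```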

Multilevel conceptual models, such as that described in Figure 13.2, can be broken down into smaller parts, with each part determined by considering differences in how they could be abstracted. These differences include:

  1. Changes in scale or level (Bar-Yam, 2006).
  2. The core focus, whether patterns in aggregate quantities, key processes or interactions between individuals.
  3. The treatment of time, continuous or discrete (Brennan, Chick and Davies, 2006).
  4. The treatment of entities or quantities of interest, continuous or discrete (Brennan, Chick and Davies, 2006).
  5. The importance of heterogeneity.
  6. The importance of variability in outcomes and random events (Brennan, Chick and Davies, 2006).
  7. The importance of dynamic structure (Ratzé et al., 2007).

Each smaller conceptual model informs the structure of a corresponding single-scale simulation model. Each single-scale model can be developed and validated using well-documented model development frameworks (Richmond, Peterson and Vescuso, 1987; Sterman, 2000; Bossel, 2007; Morecroft, 2007). These frameworks comprise three stages that are often repeated for ongoing refinement of the model: problem conceptualisation, model formulation and testing, and simulation analysis. Modelling methods used to develop single-scale models, such as system dynamics (Forrester, 1961; Sterman, 2000), discrete-event (Jun, Jacobson and Swisher, 1999; Fone et al., 2003; Banks et al., 2009; Gunal and Pidd, 2010) or agent-based (Bonabeau, 2002; Epstein and Axtell, 1996; Axelrod and Tesfatsion, 2011) modelling, should be selected appropriately. Several frameworks can guide this selection (Koopman, Jacquez and Chick, 2001; Borshchev and Filippov, 2004; Brennan, Chick and Davies, 2006; RIGHT, 2009). Each framework matches characteristics of the data, theory and abstraction of the problem (i.e. the conceptual model) with characteristics of the modelling method. The RIGHT framework (2009) also considers, in the selection process, the resources available to build the model, such as time, money and expert knowledge.

The independently developed and validated single-scale models are then integrated to form one multiscale and often multimethod (Koopman, Jacquez and Chick, 2001; Mingers, 1997) model. There are five ways to link single-scale models so that information can propagate across multiple levels and scales within one multiscale model (Pantelides, 2001; Ingram, Cameron and Hangos, 2004) (see Figure 13.3).

  1. Serial method: The models operate sequentially. A model on one scale first generates information. Another model on a different scale then uses this information.
  2. Simultaneous method: Lower scale models simulate the entire system. A higher scale model samples the information produced by the lower scale models and aggregates the information to provide a high-level summary. All the models operate simultaneously.
  3. Hierarchical method: Lower scale models are embedded within higher scale models and allow information to be exchanged directly. The models operate simultaneously.
  4. Multidomain method: Information is exchanged between lower and higher scale models using a common interface placed between the models.
  5. Parallel method: Several multiscale models are integrated to form one model. Each model describes phenomena occurring over a range of scales with some of these scales overlapping.

Figure 13.3 Frameworks for multiscale model construction. Adapted from Pantelides (2001) and Ingram, Cameron and Hangos (2004).

The frameworks of Pantelides (2001) and Ingram, Cameron and Hangos (2004) for multiscale model construction are adapted and presented as Stommel diagrams (Stommel, 1963). These multiscale models are described here for two types of scale, time and health system organisational level, but the same frameworks can be described for any number of level or scale types.
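As an illustration, the Python sketch below links two trivial placeholder models, first with the serial method and then with the hierarchical method; the model contents and exchanged variables are hypothetical.

```python
# Illustrative linking of two single-scale models. The models themselves are
# trivial placeholders; only the information flow patterns matter here.

def ward_model(transmission_rate, days):
    """Lower-scale model: returns daily new colonisations (hypothetical)."""
    return [transmission_rate * (1 + 0.1 * d) for d in range(days)]

def hospital_model(weekly_burden):
    """Higher-scale model: sets policy strength from an aggregate burden."""
    return min(1.0, weekly_burden / 50.0)

# Serial method: the lower-scale model runs first and passes its output upward.
daily = ward_model(transmission_rate=2.0, days=7)
serial_policy = hospital_model(sum(daily))

# Hierarchical method: the models run together, exchanging information each step.
policy_strength, burden = 0.2, 0.0
for week in range(8):
    effective_rate = 2.0 * (1 - policy_strength)       # higher scale -> lower scale
    burden = sum(ward_model(effective_rate, days=7))   # lower-scale output
    policy_strength = hospital_model(burden)           # lower scale -> higher scale

print(f"serial: {serial_policy:.2f}, hierarchical: {policy_strength:.2f}")
```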

The challenge when combining models is how information at different scales and levels of abstraction is bridged (Koopman, Jacquez and Chick, 2001). The ‘bridging mechanism’ performs either a direct translation between physical scales (Ewert et al., 2006; Tsafnat and Coiera, 2009; Winsberg, 2010; Seck and Job Honig, 2012) or an approximation if no theory exists for mapping between levels of abstraction or conceptualisation. The bridging mechanism ensures that information from a model at one scale is sampled at a frequency appropriate to capture important information. These samples can be aggregated or disaggregated over the time period for which they are sampled, depending on whether the sampled data is used by higher or lower scale models. Care must be taken to avoid inaccurate generalisations or the loss of variability in individual-level information.
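A minimal sketch of such a bridge, assuming a simple averaging rule for aggregation and an exponential (independent events) rule for disaggregation, is given below in Python; the variables and sampling intervals are illustrative.

```python
# Illustrative bridging mechanism between scales. Aggregation averages sampled
# individual-level values over a coarser reporting interval; disaggregation
# converts an aggregate rate into a per-step event probability. The rules and
# numbers are assumptions for illustration.
import math
import random

def aggregate(samples, interval):
    """Average fine-grained samples over each coarse interval (upward bridge)."""
    return [sum(samples[i:i + interval]) / interval
            for i in range(0, len(samples), interval)]

def disaggregate(rate_per_interval, interval):
    """Convert an aggregate rate into a per-step event probability (downward
    bridge), assuming events arrive independently within the interval."""
    return 1.0 - math.exp(-rate_per_interval / interval)

random.seed(2)
# Hourly output sampled from a hypothetical lower-scale, individual-level model.
hourly_pressure = [random.uniform(0.0, 1.0) for _ in range(48)]
daily_pressure = aggregate(hourly_pressure, interval=24)

# A higher-scale model reports 3 expected events per day; each hourly step of
# the lower-scale model then uses this probability.
p_event_per_hour = disaggregate(rate_per_interval=3.0, interval=24)

print([round(x, 2) for x in daily_pressure], round(p_event_per_hour, 3))
```

Simple averaging discards individual-level variability; where that variability matters, a bridge would pass distributions or samples rather than means.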

The bridged models form the multiscale model, which must be validated to ensure accurate and plausible results. The model is first calibrated with empirical data, or with plausible estimates where data is unavailable. There are a number of methods for estimating unknown parameters, such as least-squares fitting, maximum likelihood estimation and Bayesian approaches, including Markov chain Monte Carlo methods. Parameter variation and sensitivity tests are conducted to describe the uncertainty in simulated outcomes caused by estimated parameters and to guide the interpretation of results (Granger Morgan and Henrion, 1990).
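For example, a least-squares calibration of a single unknown parameter, followed by a crude one-at-a-time sensitivity check, might look like the Python sketch below; the toy model, synthetic data and parameter bounds are assumptions for illustration.

```python
# Illustrative least-squares calibration of one unknown parameter of a toy
# stock-and-flow model against synthetic observed data, followed by a simple
# one-at-a-time sensitivity check. Model, data and bounds are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

def simulate(loss_rate, inflow=5.0, stock=10.0, steps=20):
    levels = []
    for _ in range(steps):
        stock += inflow - loss_rate * stock
        levels.append(stock)
    return np.array(levels)

# Synthetic "observed" data generated from a known rate plus noise.
observed = simulate(loss_rate=0.15) + np.random.default_rng(0).normal(0, 1.0, 20)

def sum_squared_error(loss_rate):
    return float(np.sum((simulate(loss_rate) - observed) ** 2))

fit = minimize_scalar(sum_squared_error, bounds=(0.01, 0.5), method="bounded")
best = fit.x
print(f"estimated loss rate: {best:.3f}")

# One-at-a-time sensitivity: how much does the final level move with +/-20%?
for factor in (0.8, 1.0, 1.2):
    print(factor, round(simulate(best * factor)[-1], 1))
```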

Once the model is calibrated, its structure and simulated public health outcomes are validated (Barlas, 1996). There are many frameworks for validating models (Forrester and Senge, 1980; Carley, 1996; Barlas, 1996; Balci, 2007; Sargent, 2010; Gurcan, Dikenelli and Bernon, 2011). Several of the validation tests included in these frameworks are listed in Table 13.1.

Table 13.1 Validation or confidence building tests of simulation models. Adapted from Forrester and Senge (1980), Barlas (1996), Carley (1996), Balci (2007) and Sargent (2010).

Tests of model structure: these compare the model equations, program code and parameters with empirical or established theoretical relationships and values.

  1. Structure verification
  2. Parameter verification
  3. Problem scope or boundary adequacy
  4. Dimensional consistency

Tests of model behaviour: these compare the range of simulated patterns in outcomes with those observed empirically, produced by other models (‘docking’), desired or expected.

  1. Behaviour at extreme conditions
  2. The reproduction or prediction of behavioural patterns, points, distributions or values
  3. Behaviour anomaly
  4. Behaviour sensitivity

The calibrated and validated multiscale model can then be used as an experimental framework. By performing simulation analyses, the problem can be investigated, responses and sensitivity to variations in a range of parameters can be examined, and relevant scenarios can be explored.
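As a toy illustration of such an experiment, and of the behaviour tests in Table 13.1, the Python sketch below sweeps a parameter across plausible and extreme values and checks an expected qualitative behaviour; the model and parameter ranges are hypothetical.

```python
# Illustrative simulation experiment: sweep one parameter across plausible and
# extreme values and check an expected qualitative behaviour (a higher clearance
# rate should never increase the long-run stock). Model and ranges are assumptions.

def simulate(clearance_rate, admissions=5.0, stock=20.0, steps=200, dt=0.1):
    for _ in range(steps):
        stock += (admissions - clearance_rate * stock) * dt
    return stock

rates = (0.05, 0.1, 0.5, 1.0, 5.0)   # includes extreme values
outcomes = [simulate(r) for r in rates]

for rate, outcome in zip(rates, outcomes):
    print(f"clearance_rate={rate:>4}: final stock {outcome:.1f}")

# Behaviour test: the final stock should fall (or stay flat) as clearance rises.
assert all(a >= b for a, b in zip(outcomes, outcomes[1:])), "unexpected behaviour"
```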

Multiscale simulation models can be used as learning and decision support tools. An interface can be designed that enables interactive learning, communicates and presents results simply, provides immediate feedback, and encourages thinking. To structure the design of the visual interface, the analytical design principles of Tufte (2006) can be followed: show comparisons, causality and multiple variables, use integrated text and figures, thoroughly describe the data and its sources, and use credible content. These principles are derived from analytical thinking and aim to aid the cognitive task of making sense of evidence.

The visual interface can present multiple ‘viewpoints’. Multidisciplinary stakeholders can view and understand their own role, and the roles of other stakeholders, in addressing the problem, thus promoting consensus on management actions (Costanza and Ruth, 1998; Etienne, Le Page and Cohen, 2003; NCI, 2012). This is similar to the Enterprise Conformance and Compliance Framework (NCI, 2012), in which the presentation of information is customised to the foci of the stakeholder groups within a multidisciplinary team.

Interfaces can encourage learning by allowing learners to make policy or practical decisions during the simulation and observe the effect of their decisions on outcomes (Forsberg et al., 2011). Immediate feedback encourages learners to reflect critically on their decisions and, through group interaction, can promote consensus for practice (Crichton, 1997).

13.5 Discussion

The analysis and virtual evaluation of competing public health actions are part of a larger iterative process for effective public health management. Surrounding these steps are the gathering of knowledge, evidence and theory to be synthesised and analysed, and the communication and interpretation of the results to inform real-world decisions for action (see Figure 13.4).


Figure 13.4 Models to inform decisions for action. Adapted from Sterman's (2000) idealised learning process.

While this multilevel systems approach is capable of complex analysis and virtual experimentation, the uncertainty surrounding its findings and conclusions is strongly linked to the quality of the evidence and underlying theories. The limited availability of knowledge, data and theory from multiple levels of the health system is therefore a major challenge in adopting this approach. In addition, available information must be consistent and coherent if it is to be synthesised. The difficulty of collecting this information is recognised, and progress has been made towards establishing a framework to assist with collecting multilevel evidence (IOM, 2010). Koopman (2004) proposes that parameter sensitivity experiments be conducted with mathematical or simulation models to challenge inferences and subsequently inform empirical studies. Gaps in evidence may be addressed by using techniques either to estimate missing evidence or to represent it in a different way. Further research into methods for managing missing evidence, knowledge or theory would be of great benefit.

There are opportunities to evaluate the use of multiscale simulations by a team of multidisciplinary stakeholders and gain insights into the capability of the approach for building consensus and encouraging collaborative action.

13.6 Conclusion

We present a systematic and structured approach for analysing multilevel and systemic public health problems with system modelling and simulation. The approach explicitly considers the role of context when designing and evaluating public health actions. It extends current analytical and experimental methods and has the potential to encourage a more collaborative and multidisciplinary effort towards effective public health management.

Note

References

  1. Adomavicius, G., Bockstedt, J.C., Gupta, A. and Kauffman, R.J. (2007) Technology roles and paths of influence in an ecosystem model of technology evolution. Information Technology and Management, 8 (2), 185–202.
  2. Axelrod, R. and Tesfatsion, L. (2011) On-line guide for newcomers to agent-based modeling in the social sciences, http://www2.econ.iastate.edu/tesfatsi/abmread.htm (accessed 12 November 2012).
  3. Balci, O. (2007) Verification, validation and testing, in Handbook of Simulation: Principles, Methodology, Advances, Applications and Practice (ed. J. Banks), John Wiley & Sons, Inc., Hoboken, NJ, pp. 335–398.
  4. Baldwin, C.Y. and Clark, K.B. (2000) Design Rules, Volume 1: The Power of Modularity, MIT Press, Cambridge, MA.
  5. Banks, J., Carson, J.S.II, Nelson, B.L. et al. (2009) Discrete-event System Simulation, Prentice Hall, Englewood Cliffs, NJ.
  6. Barlas, Y. (1996) Formal aspects of model validity and validation in system dynamics. System Dynamics Review, 12 (3), 183–210.
  7. Bar-Yam, Y. (2006) Improving the effectiveness of health care and public health: a multiscale complex systems analysis. American Journal of Public Health, 96 (3), 459–466.
  8. Bassingthwaighte, J.B., Chizeck, H.J. and Atlas, L.E. (2006) Strategies and tactics in multiscale modeling of cell-to-organ systems. Proceedings of the IEEE, 94 (4), 819–830.
  9. Bonabeau, E. (2002) Agent-based modeling: methods and techniques for simulating human systems. Proceedings of the National Academy of Sciences, 99 (3), 7280–7287.
  10. Borshchev, A. and Filippov, A. (2004) From system dynamics and discrete event to practical agent based modeling: reasons, techniques, tools. Proceedings of the 22nd International Conference of the System Dynamics Society, Oxford, England.
  11. Bossel, H. (2007) Systems and Models: Complexity, Dynamics, Evolution, Sustainability, Books on Demand, Norderstedt.
  12. Brailsford, S.C., Harper, P.R., Patel, B. et al. (2009) An analysis of the academic literature on simulation and modelling in health care. Journal of Simulation, 3, 130–140.
  13. Brailsford, S. and Schmidt, B. (2003) Towards incorporating human behaviour in models of health care systems: an approach using discrete event simulation. European Journal of Operational Research, 150 (1), 19–31.
  14. Brennan, A., Chick, S.E. and Davies, R. (2006) A taxonomy of model structures for economic evaluation of health technologies. Health Economics, 15 (12), 1295–1310.
  15. Carley, K.M. (1996) Validating computational models, http://www.casos.cs.cmu.edu/publications/papers/howtoanalyze.pdf (accessed 12 November 2012).
  16. Costanza, R. and Ruth, M. (1998) Using dynamic modeling to scope environmental problems and build consensus. Environmental Management, 22 (2), 183–195.
  17. Crichton, S. (1997) Learning environments online: a case study of actual practice. PhD thesis. University of Sydney.
  18. Dada, J.O. and Mendes, P. (2011) Multi-scale modelling and simulation in systems biology. Integrative Biology, 3 (2), 86–96.
  19. de Savigny, D. and Adam, T. (eds) (2009) Systems Thinking for Health Systems Strengthening, Alliance for Health Policy and Systems Research, World Health Organization, Geneva, http://www.who.int/alliance-hpsr/resources/9789241563895/en/index.html (accessed 12 November 2012).
  20. Eddy, D.M. (2007) Linking electronic medical records to large-scale simulation models: can we put rapid learning on turbo? Health Affairs, 26 (2), w125–w136.
  21. Eddy, D.M. and Schlessinger, L. (2003) Archimedes: a trial-validated model of diabetes. Diabetes Care, 26 (11), 3093–3101.
  22. Epstein, J.M. and Axtell, R. (1996) Growing Artificial Societies: Social Science from the Bottom Up, Brookings Institution, Washington, DC.
  23. Etienne, M., Le Page, C. and Cohen, M. (2003) A step-by-step approach to building land management scenarios based on multiple viewpoints on multi-agent system simulations. Journal of Artificial Societies and Social Simulation, 6 (2), 257–262.
  24. Ewert, F., van Keulen, H., van Ittersum, M.K. et al. (2006) Multi-scale analysis and modelling of natural resource management options, in Proceedings of the iEMSs Third Biennial Meeting Summit on Environmental Modelling and Software (eds A. Voinov, A.J. Jakeman and A.E. Rizzoli), International Environmental Modelling and Software Society, Manno, Switzerland.
  25. Fone, D., Hollinghurst, S., Temple, M. et al. (2003) Systematic review of the use and value of computer simulation modelling in population health and health care delivery. Journal of Public Health Medicine, 25 (4), 325–335.
  26. Forrester, J.W. (1961) Industrial Dynamics, MIT Press, Cambridge, MA.
  27. Forrester, J.W. (1992) Policies, decisions and information sources for modeling. European Journal of Operational Research, 59 (1), 42–63.
  28. Forrester, J.W. and Senge, P.M. (1980) Tests for building confidence in system dynamics models, in System Dynamics, Studies in the Management Sciences, vol. 14 (eds A.A. Legasto, J.W. Forrester and J.M. Lyneis), Elsevier, Amsterdam, pp. 209–228.
  29. Forsberg, H.H., Aronsson, H., Keller, C. et al. (2011) Managing health care decisions and improvement through simulation modeling. Quality Management in Health Care, 20 (1), 15–29.
  30. Galea, S., Hall, C. and Kaplan, G.A. (2009) Social epidemiology and complex system dynamic modelling as applied to health behaviour and drug use research. International Journal of Drug Policy, 20 (3), 209–216.
  31. Granger Morgan, M. and Henrion, M. (1990) Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, Cambridge University Press, Cambridge.
  32. Gunal, M.M. and Pidd, M. (2010) Discrete event simulation for performance modelling in healthcare: a review of the literature. Journal of Simulation, 4, 42–51.
  33. Gurcan, O., Dikenelli, O. and Bernon, C. (2011) Towards a generic testing framework for agent-based simulation models. Proceedings of the Federated Conference on Computer Science and Information Systems, IEEE, pp. 635–642.
  34. Holling, C.S. and Gunderson, L.H. (2002) Resilience and adaptive cycles, in Panarchy: Understanding Transformations in Human and Natural Systems (eds L.H. Gunderson and C.S. Holling), Island Press, Washington, DC, pp. 25–62.
  35. Homer, J.B. and Hirsch, G.B. (2006) System dynamics modeling for public health: background and opportunities. American Journal of Public Health, 96 (3), 452–458.
  36. Horstemeyer, M.F. (2010) Multiscale modelling: a review, in Practical Aspects of Computational Chemistry: Methods, Concepts and Applications (eds J. Leszczynski and M.K. Shukla), Springer, New York, pp. 87–135.
  37. Ingram, G.D., Cameron, I.T. and Hangos, K.M. (2004) Classification and analysis of integrating frameworks in multiscale modelling. Chemical Engineering Science, 59 (11), 2171–2187.
  38. IOM (Institute of Medicine) (2010) Bridging the Evidence Gap in Obesity Prevention: A framework to inform decision making, The National Academies Press, Washington, DC.
  39. Jun, J.B., Jacobson, S.H. and Swisher, J.R. (1999) Application of discrete-event simulation in health care clinics: a survey. Journal of the Operational Research Society, 50 (2), 109–123.
  40. Kaplan, H.C., Brady, P.W., Dritz, M.C. et al. (2010) The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Quarterly, 88 (4), 500–559.
  41. Kazi, M.A.F. (2003) Realist Evaluation in Practice: Health and Social Work, Sage, London.
  42. Koestler, A. (1978) Janus: A Summing Up, Random House, New York.
  43. Koopman, J.S. (2004) Infection transmission through networks, in Biological Networks (ed. F. Kepes), World Scientific, Singapore, pp. 449–505.
  44. Koopman, J.S., Jacquez, G. and Chick, S.E. (2001) New data and tools for integrating discrete and continuous population modeling strategies. Annals of the New York Academy of Sciences, 954, 268–294.
  45. Leischow, S.J. and Milstein, B. (2006) Systems thinking and modeling for public health practice. American Journal of Public Health, 96 (3), 403–405.
  46. McPake, B. and Mills, A. (2000) What can we learn from international comparisons of health systems and health system reform? Bulletin of the World Health Organization, 78 (6), 811–820.
  47. Meadows, D.H. (2008) Thinking in Systems: A Primer, Chelsea Green, Burlington, VT.
  48. Meier-Schellersheim, M., Fraser, I.D. and Klauschen, F. (2009) Multiscale modeling for biologists. Wiley Interdisciplinary Reviews: Systems Biology and Medicine, 1 (1), 4–14.
  49. Millennium Ecosystem Assessment (2005) Ecosystems and Human Well-Being: Multiscale Assessments, Island Press, Washington, DC.
  50. Mills, A. (2012) Health policy and systems research: defining the terrain; identifying the methods. Health Policy and Planning, 27 (1), 1–7.
  51. Mingers, J. (1997) Multi-paradigm multimethodology, in Multimethodology: Theory and Practice of Combining Management Science Methodologies (eds J. Mingers and A. Gill), John Wiley & Sons, Ltd, Chichester, pp. 1–20.
  52. Mitha, F., Lucas, T.A., Feng, F. et al. (2008) The multiscale systems immunology project: software for cell-based immunological simulation. Source Code for Biology and Medicine, 3 (6). doi 10.1186/1751-0473-3
  53. Morecroft, J. (2007) Strategic Modelling and Business Dynamics: A Feedback Systems Approach, John Wiley & Sons, Ltd, Chichester.
  54. NCI (National Cancer Institute) (2012) SAIF interoperability reviews. White Paper – Introduction to SAIF and ECCF, https://wiki.nci.nih.gov/display/VCDE/Introduction+to+SAIF+and+ECCF (accessed 12 November 2012).
  55. OMG (Object Management Group) (2012) ISO/IEC 19505-2. Information Technology – Object Management Group Unified Modeling Language (OMG UML) – Part 2: Superstructure, http://www.omg.org/spec/UML/ISO/19505-1/PDF (accessed 12 November 2012).
  56. Pantelides, C.C. (2001) New challenges and opportunities for process modelling. Computer Aided Chemical Engineering, 9, 15–26.
  57. Pawson, R. (2006) Evidence-based Policy: A Realist Perspective, Sage, London.
  58. Pawson, R. and Tilley, N. (1997) Realistic Evaluation, Sage, London.
  59. Ratzé, C., Gillet, F., Muller, J.P. et al. (2007) Simulation modelling of ecological hierarchies in constructive dynamical systems. Ecological Complexity, 4 (1–2), 13–25.
  60. Richmond, B., Peterson, S. and Vescuso, P. (1987) An academic user's guide to Stella Software, High Performance Systems, Inc., Lyme, NH.
  61. RIGHT (Research Into Global Healthcare Tools) (2009) Modelling and Simulation Techniques for Supporting Healthcare Decision Making – A Selection Framework, Engineering Design Centre, University of Cambridge, Cambridge.
  62. Rousseau, D.M. (1985) Issues of level in organisational research: multi-level and cross-level perspectives. Research in Organizational Behaviour, 7, 1–37.
  63. Sargent, R.G. (2010) Verification and validation of simulation models, in Proceedings of the 2010 Winter Simulation Conference (eds B. Johansson et al.), IEEE Press, Piscataway, NJ, pp. 166–183.
  64. Schlessinger, L. and Eddy, D.M. (2002) Archimedes: a new model for simulating health care systems – the mathematical formulation. Journal of Biomedical Informatics, 35 (1), 37–50.
  65. Seck, M.D. and Job Honig, H. (2012) Multi-perspective modelling of complex phenomena. Computational & Mathematical Organization Theory, 18 (1), 128–144.
  66. Simon, H.A. (1962) The architecture of complexity. Proceedings of the American Philosophical Society, 106 (6), 467–482.
  67. Sloot, P.M.A. and Hoekstra, A.G. (2010) Multi-scale modelling in computational biomedicine. Briefings in Bioinformatics, 11 (1), 142–152.
  68. Sobolev, B.G., Sanchez, V. and Vasilakis, C. (2011) Systematic review of the use of computer simulation modeling of patient flow in surgical care. Journal of Medical Systems, 35 (1), 1–16.
  69. Sterman, J.D. (2000) Business Dynamics: Systems Thinking and Modeling for a Complex World, Irwin/McGraw-Hill, New York.
  70. Stommel, H. (1963) Varieties of oceanographic experience. Science, 139 (3555), 572–576.
  71. Tsafnat, G. and Coiera, E.W. (2009) Computational reasoning across multiple models. Journal of the American Medical Informatics Association, 16 (6), 768–774.
  72. Tufte, E.R. (2006) Beautiful Evidence, Graphics Press, Cheshire, CT.
  73. von Bertalanffy, L. (1968) General System Theory: Foundations, Development, Applications, George Braziller, New York.
  74. WHO (World Health Organization) (2011) The determinants of health, http://www.who.int/entity/hia/evidence/doh/en/ (accessed 12 November 2011).
  75. Winsberg, E.B. (2010) Science in the Age of Computer Simulation, The University of Chicago Press, Chicago.