List of Tables

  1.1 Recommendation examples in current practice.
  1.2 Reviewed projects.
  1.3 Example findings of assessment.
  1.4 Recommendation description templates.
  1.5 Companies that participated in case studies.
  1.6 Assessment data used in case studies.
  1.7 Participants in case studies.
  1.8 Case study summary.
  1.9 Measurement items for PU and PEOU.
  1.10 Survey results on PU.
  1.11 Survey results on PEOU.
  1.12 Mann-Whitney U Test results on PU and PEOU.
  1.13 Example recommendations produced by an ad-hoc method in current practice.
  1.14 Process element coverage.
  1.15 Process area coverage per project.
  1.16 A comparison of ReMo with existing studies.
  1.17 Example results in case studies.
  1.18 Example results in case studies (continued).
  1.19 Example results in case studies (continued).
  3.1 Profile equivalence in e-CF, ESCO and Colomo’s study.
  3.2 Profiles involved in ISO 12207 processes.
  3.3 Skills relevance in ISO 12207 processes.
  3.4 Link from NCSF to processes in ISO 12207.
  3.5 Link from processes to e-CF competences through deliverables.
  4.1 Combination of different approaches used as monitoring and controlling methods in software project management.
  4.2 Tools/techniques used for traditional project scope definition.
  4.3 Tools/techniques used for agile project scope definition.
  4.4 Extracted factors from the literature.
  4.5 Mapping of factors elements.
  4.6 Descriptive statistics.
  4.7 Description of notations.
  4.8 A-SPSRI elements.
  4.9 Top contributing A-SPSRI elements.
  4.10 Top 8 contributing A-SPSRI elements.
  4.11 Assign definition level.
  4.12 Weight, dl and score of elements in Case Study 2.
  4.13 Assign cost to A-SPSRI elements.
  5.1 Acronyms and notations.
  5.2 Special cases of GMWD function with different values of parameters.
  5.3 Dataset: Beam 2.0.0, Beam 2.1.0, and Beam 2.2.0.
  5.4 Competing and proposed models with MVF.
  5.5 Competing and proposed models with MVF.
  6.1 Quality assessment criteria.
  6.2 Existing MMs assessment results.
  6.3 The process dimension of the OD-CMM.
  6.4 Capability levels.
  6.5 Existing maturity models in the open data domain.
  6.6 An example process definition of ODM2 open data discovery process.
  7.1 Search outcomes of different resources/libraries and databases.
  7.2 Details of extracted data.
  7.3 Synthesis of details of success factors.
  7.4 Success factors for software outsourcing human resource.
  7.5 Description of 9-point scale for intensity of importance.
  7.6 Relationship between size of matrix and random consistency index.
  7.7 Success factors identified in the questionnaire study.
  7.8 Pairwise comparison matrix between the success factors of the “procurement” category.
  7.9 Synthesized or normalized matrix of the “procurement” category.
  7.10 Pairwise comparison matrix between the success factors of the “organization” category.
  7.11 Synthesized or normalized matrix of the “organization” category.
  7.12 Pairwise comparison matrix between the success factors of the “reliance” category.
  7.13 Synthesized or normalized matrix of the “reliance” category.
  7.14 Pairwise comparison matrix between the success factors of the “quality” category.
  7.15 Synthesized or normalized matrix of the “quality” category.
  7.16 Pairwise comparison matrix between the categories of success factors.
  7.17 Synthesized or normalized matrices of the categories of success factors.
  7.18 Summary of local and global weights of issues and their rankings.
  7.19 Prioritizing the success factors.
  9.1 Search strings and digital libraries.
  9.2 Inclusion and exclusion criteria.
  9.3 Quality evaluation criteria.
  9.4 Search process.
  9.5 Identified challenges.
  9.6 Categorization of the challenges based on development and operations teams.
  9.7 Selected primary studies.
  9.8 DevOps practitioners.
  10.1 Search string construction.
  10.2 RQ1 search terms construction.
  10.3 RQ2 search terms construction.
  10.4 RQ3 search terms construction.
  10.5 Detail of search results of different databases.
  10.6 Detail of primary selected papers.
  10.7 Data extraction format.
  10.8 List of critical challenges.
  11.1 Search term construction composition.
  11.2 RQ1 search terms construction.
  11.3 RQ2 search terms construction.
  11.4 RQ3 search terms construction.
  11.5 Final search results.
  11.6 Snowballing technique results.
  11.7 Data demo table.
  11.8 Frequency-wise list of challenges.
  11.9 Results of data from different continents.
  11.10 Continent-wise frequency.
  12.1 SLR process.
  12.2 Quality assessment.
  12.3 Selection of studies using tollgate approach.
  12.4 Project management challenges identified from selected primary studies.
  12.5 Project management knowledge areas, GSD challenges and PM implications in GSD.
  13.1 General format of the construction of a search term.
  13.2 RQ1 search term construction.
  13.3 RQ2 search term construction.
  13.4 Search results of different resources.
  13.5 Primary and final selected papers results.
  13.6 Data extraction presentation.
  13.7 List of challenges in CSCM identified through SLR.
  13.8 Database/digital library-wise list of challenges.
  13.9 Methodology-wise list of challenges.
  14.1 The process definition of DX-HRSD.
  14.2 Scale definitions.
  14.3 Process capability level ratings (adapted from ISO/IEC 33002).
  14.4 Capability Level 1 assessment of DX-HRSD process.
  14.5 Capability level assessment of DX-HRSD process.