List of Figures

1.1 The process improvement process.
1.2 Summary of recommendation review results.
1.3 ReMo.
1.4 Correlation analysis.
1.5 Correlated tasks via work products.
1.6 Identifying finding correlations.
1.7 Finding correlations.
1.8 Identifying improvement packages.
1.9 Improvement package description.
1.10 Refining improvement packages.
1.11 Findings-to-organization matrix.
1.12 Findings from organizational view.
1.13 Finding-to-life cycle matrix.
1.14 Findings from software life cycle view.
1.15 Capability-based process model guidance.
1.16 Finding from assessment model view.
1.17 Finding-to-business value matrix.
1.18 Findings from business value view.
1.19 Building recommendations.
1.20 Abstract recommendation.
1.21 Identifying recommendation seeds.
1.22 Concrete recommendation.
1.23 Evolution of ReMo.
1.24 Concreteness comparison - Process element coverage.
1.25 Concreteness comparison per project.
1.26 Concreteness comparison of recommendation examples.
1.27 Comprehensiveness comparison - Process area coverage.
1.28 Comprehensiveness comparison - Recommendation example.
2.1 The ambidextrous security program.
2.2 Percentage of people in each adopter category.
3.1 Non-cognitive skills framework represented as clusters.
4.1 Research methodology.
4.2 Evaluation results.
4.3 Projections of percentiles for score and cost (F. Acebes [62]).
4.4 AON network diagram.
4.5 Sample burn-up chart showing schedule variance.
4.6 Burn-down chart reflecting scope change and schedule variance.
4.7 Projection of ith iteration into different percentiles.
4.8 Burn-up chart incorporating cost.
4.9 Projection into score and cost.
4.10 Projection into score and cost.
4.11 Multiple releases.
4.12 Features at single release.
4.13 Projection into score and cost.
5.1 Workflow diagram.
5.2 (a-i) Fitted shapes and residuals of GO, G-GO, and GMWD models on Beam 2.0.0, 2.1.0, and 2.2.0 failure datasets.
6.1 The capability dimension of the OD-CMM (adapted from [49]).
7.1 Division of complex problem into a hierarchical structure.
7.2 Schematic diagram of proposed research design of the analytic hierarchy process (AHP) consistency ratio.
7.3 Hierarchical structure of the present study for the analytic hierarchy process (AHP) method.
8.1 Proposed system model.
8.2 Performance evaluation of datasets using precision, recall and F-measure.
8.3 Change in recall.
9.1 Reporting the review process.
11.1 Layered architecture.
11.2 Graphical representation of microservices architecture pattern.
11.3 Event-driven architecture pattern.
11.4 Graphical representation of blackboard software architecture.
11.5 Graphical representation of the identified challenges.
13.1 Searching and selection process of research articles.
14.1 The measurement framework (adapted from ISO/IEC 330xx).