4.5. Architecture Linkage to Software Development

If we apply proper architectural principles to create and maintain software structure, the potential cost savings could be 50% or greater [Horowitz 93]. Good software structure is a function of the overall application architecture, the software interfaces (what is called the computational architecture), and the implementation itself (Figure 4.7).

Figure 4.7. Computational Specification Links Architecture and Implementation


Computational interfaces may be the key enabler for improved software structure. Software interfaces specified in IDL define boundaries between modules of software. If the software interfaces are coordinated architecturally, it is possible to define the boundaries for application programmers so that the intended structure of the application becomes its implemented structure. In practice we have found that the specification of software interfaces provides a tangible benefit to programmers, because it gives them a guideline for implementing software independently of other application developers. When developers share the same specification, their software can interoperate even though the applications are developed separately.
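As a concrete illustration, a shared IDL specification fixes a module boundary that separately developed clients and servers can both build against. The module, interface, and operation names below are hypothetical, invented for this sketch rather than taken from any actual system.

```idl
// Hypothetical IDL boundary: two teams implement against this
// contract independently; any client built to CustomerQuery can
// interoperate with any conforming server implementation.
module CustomerManagement {

  struct CustomerRecord {
    string id;
    string name;
  };

  exception UnknownCustomer {
    string id;
  };

  interface CustomerQuery {
    CustomerRecord lookup(in string customer_id)
      raises (UnknownCustomer);
    void update(in CustomerRecord record);
  };
};
```

Because the boundary is expressed in IDL rather than in any one programming language, each team is also free to choose its own implementation language behind the interface.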

Figure 4.8 describes the overall process for how these kinds of results can be achieved. Starting with a set of enterprise requirements for a community of users, a business object analysis process can define the overall structure and characteristics of the application environment. Business object analysis is an object-oriented analysis in which end users as well as object-oriented modelers and architects participate in defining new information technology capabilities that satisfy the needs of the business and the constraints of the technology implementation process. Once the business object analysis has produced object models, there is a further step: a drill-down exercise to define the common interface definitions. The common interface definitions are the software interfaces which define the actual internal software boundaries for the system. This is a drill-down exercise because these interfaces specify the actual operations and parameters which are passed throughout the software system.
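To make the drill-down concrete, suppose the business object analysis identifies an "Order" business object; the corresponding common interface definition pins down its actual operations, parameter types, and error behavior for all participating projects. The names and types below are hypothetical, chosen only to illustrate the level of detail involved.

```idl
// Hypothetical drill-down from a business object model: the
// coarse "Order" business object becomes a common interface
// definition with concrete operations and parameters.
module CommonFacilities {

  typedef sequence<string> ItemList;

  exception InvalidOrder {
    string reason;
  };

  interface OrderEntry {
    // Returns an order identifier assigned by the implementation.
    string submit_order(in string customer_id,
                        in ItemList items)
      raises (InvalidOrder);

    void cancel_order(in string order_id);
  };
};
```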

Figure 4.8. Sample Architecture-Driven Process


The common interface definitions must be coordinated with individual software projects in order for the appropriate lessons learned and legacy-migration considerations to be incorporated into the designs. As the common interface definitions mature and are applied across multiple projects, these definitions can become localized standards and profiles for the community of developers. These can provide useful information for new developers and commercial vendors that may want to participate in the interoperability solutions. It is not sufficient for interface specifications to stand alone. One lesson that has been learned repeatedly is that no matter how precise a specification is, a definition of how applications use that specification is required to assure interoperability. This requirement is equivalent to the profiling concept introduced in Chapter 2.

Figure 4.9 shows how a set of specifications, both horizontal and vertical, can be constrained with respect to a profile, so that application developers are much more likely to achieve interoperability between separate implementations. There is a distinct difference between specifications and profiles which needs to be incorporated into the software process. A specification such as an IDL definition should be designed so that it can be reused across multiple applications or families of systems. The profile information, on the other hand, should correspond to specific applications and families of systems, so that conventions can be specialized without compromising the reusability of the overall specification. Specifications can be standardized either locally within the organization or on a more global scale through organizations like the Object Management Group. Profiles, however, should remain fluid. At their best, profiles are documented developer agreements for how standard specifications are used in specific instances.
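The distinction can be made concrete with a small sketch: the IDL below is deliberately generic so that it can be standardized and reused, while the profile is recorded separately as a developer agreement about how one community uses it. Both the interface and the profile conventions are invented for illustration.

```idl
// Reusable specification: generic enough to standardize across
// families of systems, so it says nothing application-specific.
module DocumentServices {

  typedef sequence<octet> Content;

  interface Repository {
    void store(in string name, in Content document);
    Content retrieve(in string name);
  };
};

// Profile (kept separate from the standardized IDL, as a
// documented developer agreement for one family of systems):
//   1. "name" takes the form <project>/<doctype>/<id>.
//   2. Content is always XML, encoded as UTF-8.
//   3. retrieve() of an unknown name returns an empty sequence.
```

Note that the profile can change as the community's conventions evolve, without destabilizing the standardized specification it constrains.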

Figure 4.9. Interoperability Profile


Identifying the appropriate categories of specifications to be standardized is a challenge that many organizations never overcome. The process which has been applied repeatedly to achieve this purpose is shown in Figure 4.10. The problem for many individual software development projects and end users is understanding the need for commonality and how that need is distinguished from the actual design and architecture of specific applications. The same problem arises in the identification of common data elements when commonality of information architecture is desired. The first step in the process is to survey the available requirements, technologies, and other inputs through which stakeholders influence the selection of common functionality. When a broadly based survey across the scope of the enterprise is impractical, a smaller group of architects can convene and brainstorm candidate facilities for interface coordination.

Figure 4.10. Large-Scale Architecture-Driven Process


It is important to abstract the selection of these facilities in an architectural block diagram to display how some facilities play roles that are horizontal in relationship to some of the others. It is also important to define an abstraction of this diagram in order to communicate the structure of an architecture of this scale to the multiple stakeholders in these deliberations. In Step 4, the individual facilities identified earlier are defined and documented as to their scope and basic functionality. This definition is necessary in order to constrain the drill-down process, which will be needed to drive out the details of the interface definitions or data element definitions. In Step 5, a review process allows the various stakeholders in the architecture to verify that their needs are being met and also to build consensus across the enterprise for funding the reusable assets which will be defined when the interfaces are created.

Step 6 in the process is to slow the pace of architectural decision making and stabilize the architecture. After multiple iterations of the architecture document and review among all of the potential stakeholders, it is necessary to conclude the exercise and publish the document. It is then appropriate to tutorialize this information and make sure that there is a thorough understanding of it across the developer community. This final step of communicating architectural vision is overlooked in many organizations: once approval is obtained, many architects assume that developers will be constrained by the organizational decision, and that responsibility for understanding what has been documented transfers appropriately to the individual developers.

There is a key distinction between what happens in Steps 1–6 and what happens in Step 7. In Steps 1–6 the design of the architecture is being deliberated, and there is open discussion of potential extensions and changes, particularly among individual architects who are highly cognizant of the design implications. In Step 7 the assumption is that the architecture has been stabilized and that individual elements of the architecture are no longer the subject of debate. It is not possible to properly disseminate the architecture if the perception is that the debate is continuing. This phenomenon is the downfall of some significant and otherwise well-conceived architectures.

Figure 4.11 shows the overall process for architecture migration. The migration process starts with some preexisting software, including legacy applications, commercial software, and the possible use of shareware or freeware. Mixed into this is the creation of new software which will implement many new capabilities within the target system. The architecture migration process is influenced by business needs and by the definition of enterprise architecture that we described earlier, with a focus on the computational interfaces which are the real keys to controlling software boundaries. Once the target architecture is defined, there is a continuous process of migration.

Figure 4.11. System Architecture Migration


The process of migration may take many years to realize and may never truly be completed. The kind of migration that we recommend is sometimes called chicken-little migration, because it does not assume that on some specific date the legacy system will be shut down and the new system turned on, at potentially substantial risk to the organization if the new system is not perfect. In chicken-little migration, the capabilities of the legacy system which already provide business value can be brought on line or transferred into the target system as the target system takes form. Figure 4.12 shows one of the key concepts in how the target system is realized by leveraging legacy applications. The legacy application will have one or more mechanisms for transferring information. At a minimum a legacy system maintains some files on disk or perhaps a database, and the legacy application may have more than that; for example, it may have some documented application program interfaces or other types of command-line interfaces.

Figure 4.12. Legacy Object Wrapping Approach


Legacy applications may largely comprise commercial software, which has the same kinds of mechanisms available for the purpose of integration. In our experience with object-oriented integration, we found a different set of mechanisms for virtually every legacy and commercial package that we encountered. The purpose of the object wrapper is to map from the preexisting interfaces to the target architecture interfaces, which may be defined using IDL. In addition to providing a direct functional mapping, there are capabilities of the target architecture which should be considered and will reside in the resulting object wrapper. For example, a distributed object architecture typically has one or more directory services to enable applications to dynamically discover other applications in the environment without hardwired programming of point-to-point integration. Support for metadata directory services is one of the new functions that the object wrapper can provide. Other kinds of functions in the wrapper include support for security, for system management, and for data interchange.
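The wrapping idea can be sketched in IDL terms: the wrapper implements a target-architecture interface while internally reading the legacy system's files or invoking its command-line interface, and it also carries the added infrastructure functions such as directory registration. Every name in this sketch is hypothetical.

```idl
// Hypothetical target-architecture interface that an object
// wrapper implements on behalf of a legacy inventory system.
module TargetArchitecture {

  struct PartRecord {
    string part_number;
    long   quantity_on_hand;
  };

  interface InventoryAccess {
    // The wrapper satisfies this operation by parsing the
    // legacy system's flat files or invoking its command-line
    // interface, then mapping the results to PartRecord.
    PartRecord query_part(in string part_number);
  };

  // Infrastructure role added by the wrapper: at startup it
  // registers itself with a directory service so that other
  // applications can discover it dynamically, without
  // hardwired point-to-point integration.
  interface DirectoryRegistration {
    void register_service(in string service_name,
                          in Object service_reference);
  };
};
```

The clients of InventoryAccess see only the target architecture; whether the wrapper is backed by flat files, a database, or a documented API remains an implementation detail that can change during migration.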

Object-oriented technology enables the creation of significant applications. Through survey research we have discovered some of the key challenges in the migration to object technology. The key challenge is the difficulty of establishing an architecture for the information system of the enterprise. To quote one of our sources, "People start in the middle of the software process; they immediately begin development without doing their homework, with no vision, no business process, and an incomplete architecture." Another challenge is in management of the object-oriented process, which differs in some fundamental ways from how software processes of previous paradigms were managed. To quote one of our sources, "People are solving tomorrow's problems with today's technology and yesterday's methodology." Another challenge that we frequently encountered was the difficulty of sustaining an architecture during development and maintenance, once an architecture had been established. To quote our sources, "It is easier to scrap and start over rather than to figure out what they did." Another source noted that requirements evolve during design and implementation, leading to hacked design.

Other types of challenges were perceived as smaller obstacles than one might expect. For example, technology requirements were accorded a fairly low priority in the migration to object technology, compared to architectural and management issues.
