Chapter 4. The Role of IT in the Effective Knowledge Network

The theme of this chapter is the need for simplicity in choosing and implementing technology. It is tempting to think in terms of choosing or designing software that will do more work and thereby increase human productivity. However, there are important reasons for at least beginning with the simplest tools that will enable measurable improvement in knowledge exchange. One reason is cost. Another is that starting simply helps build a good working relationship between the IT department and the people looking to build the online knowledge network.

The overarching purpose of information technology (IT) is to increase productivity in the workplace. To that end, as we've mentioned in previous chapters, IT departments now assemble complex systems of specialized hardware and software applications to serve the varied and distinct information needs within the company. Some of these applications are designed to work together or to share standard interfaces, but many are not.

Once adopted, these applications and their software structures become indispensable to the company or business unit in direct proportion to the amount of information they hold. So in a busy business, as the information and knowledge needs of the company change, its rigid legacy systems become the weak links in the coevolutionary chain that would ideally evolve software in coordination with changing needs. Outdated and limiting features hold back the company's ability to adapt its technology to important aspects of its evolving business values and culture.

The twin preoccupations of the chief technology officer (CTO) today are overcoming the inertia of legacy systems and attaining greater interface flexibility, where the technology is able to adapt to meet the needs of the organization in flux. Knowledge networks, by their nature, seek change and discovery and are likely to bring demands for more change to the technical environment. This makes the relationship between IT and the concept and culture of knowledge sharing a critical one.

Until the technology in place can coevolve with the organization's changing business models and cultures, companies will go through periods where the design of the information interface is out of sync with operational needs. The use of information systems then becomes so cumbersome, nonintuitive, and inefficient that people refuse to use them, settling instead for less technical means to accomplish their tasks. That might be acceptable, except that it sacrifices the potential efficiencies that well-designed technology can bring. For the technology of an online knowledge network to succeed, its members must choose to use it regularly as an essential part of their jobs.

This chapter focuses on the importance of the IT department's collaborative role in building the knowledge network. This role centers on its cooperation in the overall design of the networking environment, including tool selection and configuration, and the integration of features and functions appropriate to meet the needs of each distinct knowledge-based community. The role extends beyond the provision of tools to an active part in an ongoing relationship with the social network as it strives to improve its knowledge-sharing environment incrementally.

The professional and social relationship between the people implementing the knowledge network and the people in the IT department, which constitutes a powerful knowledge network in its own right, is crucial. A close working relationship between IT and those leading the development of knowledge exchange systems will, in the end, benefit both groups.

IT and Knowledge Exchange

How do the people responsible for the information technology in an organization relate to the needs of those seeking to improve knowledge exchange and transfer through computer technologies? What issues consume IT's attention today, and how do those issues affect its ability and willingness to work with knowledge-networking advocates? It's important that we understand some practical and cultural realities about the people charged with IT responsibilities before we assume that implementing the right technologies is simply a matter of asking for it. IT is a busy place, and the technologies required by an effective knowledge network can be complex.

Our historical account of the tools and management models that heralded the arrival of the Information Age described email as one of the first applications created after the hardware, software, and networking protocols were made available to support it. The information technology community became the first working online knowledge network.

As technology was subsequently adopted and embraced by large businesses and organizations—and then customized to meet their growing information-handling needs—emphasis switched from group communication to more sophisticated ways of inputting, organizing, storing, and retrieving the burgeoning mountains of data. And as these information-processing organizations eventually connected to the Internet, the challenges and problems faced by the person in the role of CTO multiplied.

Now emphasis is swinging back from data storage and manipulation toward interpersonal communication as the most effective means of exchanging knowledge. How do the priorities, activities, and culture of IT—with its information-based worldview built in the 1990s—relate to the needs and culture of the interactive knowledge network? There are many ways in which IT inherently understands the model of the knowledge network and may be able to save itself some work in the long run by helping to establish strong technical bases for them.

The CTO's Growing To-Do List

In most companies, the work of the IT department is forever a work in progress and never without crisis. Consider just some of the department's responsibilities. Most IT departments must do many, if not all, of the following:

  • Provide the most up-to-date tools and connectivity for internal system access to every desktop

  • Configure, maintain, and upgrade the software used by every employee

  • Select, secure, install, and fix all of the company's computer technology

  • Provide system security, backups, Internet access, firewall configuration, and virus protection

  • Evaluate and approve the selection of new technical tools for new needs

  • Program in-house solutions to business problems

  • Work with outside technical consultants on a wide variety of projects

This is, of course, only the tip of a very complex iceberg. When all of this has been systematized or accomplished, IT is expected to integrate all of these components and requirements into a seamless system that can serve a long list of different and changing needs throughout the organization.

This would be a full plate under any circumstances, but the emergence of and connectivity with the Web have added yet another dense layer of complexity to IT's tasks. Among the many facets of that layer, IT must now account for a much greater volume of independent and collaborative online activity by the average worker. No longer is the individual at the PC simply the destination point of an information query's response or the source point of input for a page or stream of information. Now, every individual, both local and remote, is a potential correspondent, contributor, and editor in an interactive network of ad hoc publishers and readers. There is an ever-increasing volume of information passing back and forth across the last bastion of internal company security: the corporate firewall.

Since 1997, the Web interface has become the lingua franca of commerce over electronic networks. This level of standardization has simplified many interface challenges. However, IT managers still face an imposing array of incompatible software platforms with no accepted standard for integrating the many different applications now accessible through this global Web protocol. Such standards, now referred to under the umbrella label Web services, are under feverish development (and are described more fully later in this chapter). Yet there remains the question of whether the Web-using world, so accustomed to incremental grass-roots innovation, is ready to accept standardization and thereby change a culture with roots going far back to the original days of collaborative technology hacking in the sixties. Until that question is resolved, the major challenge to IT, especially from the knowledge management point of view, will remain the integration of disparate sources of information and knowledge through the shared interface of the Web.

The Daunting Task of Integration

Referring to software as a legacy system makes it sound as if it's been around for a long time. But these days, any system that has been installed and has become essential to a core function of the company is, de facto, a legacy system. The costs of replacing it can be prohibitive, so upgrades and new applications adopted by the company must be designed to work with it. The products provided to build bridges between noncompatible applications are sometimes referred to as middleware. However, the more applications that need to be made data compatible with one another, the more geometrically complex the integration becomes, and the less reliable the final results. In Chapter 5, "Fostering a Knowledge-Sharing Culture," we'll describe some middleware products as well as products that attempt to remove the need for middleware by integrating applications directly through the top-level user interface.

Application integration directly addresses knowledge management problems. The purpose for taking on this daunting task is to make the knowledge contained in various legacy information troves more accessible and useful across the company. Customer relationship management should be able to use information from the order control system. Stock control systems should be integrated with accounting. The software designed for these different applications has, historically, been provided by different vendors, so merging or integrating their formats into one that made sense for the end user has required translation.

Such data translations have been done in the past by programmers, who wrote the code that glued the applications together. The more applications requiring translation, the more consultants and programmers needed to be hired or assigned to work on the task. Return on investment often became the determining factor as costs rose, and for that reason, the relatively new genre of enterprise application integration (EAI) products has appeared to reduce the need for expensive programming. But EAI brings its own unintended consequences of complexity and expense.

The CTO has huge problems to solve, even as interoperability standards like XML (Extensible Markup Language) attempt to fill the solutions gap. The knowledge network is made smarter by the availability of information from the many applications at work in the company, and it has its own needs for application integration in the technologies of online conversation and content management. But the promise of integration standards and of new middleware solutions that can automatically standardize the input and output of a variety of software applications has yet to be fully realized. Companies are finding that the work involved in installing and applying these solutions can be unexpectedly difficult and expensive. This is a chief reason for our recommending a phased approach to implementing knowledge-networking systems.
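To make the integration problem concrete, here is a minimal sketch in Python of the kind of translation work being described; the record layouts, field names, and systems are invented for illustration and are not drawn from any particular product. It shows one application's record converted to a shared XML representation and then read back into the form another application expects.

```python
# A minimal sketch of XML as a shared interchange format between two applications.
# The record layouts and field names are invented; real order-control and CRM
# systems will differ.
import xml.etree.ElementTree as ET

# Record as exported by a hypothetical order-control system
order_record = {"cust_no": "C-1047", "item": "6mm brass nut", "qty": 500}

def order_to_xml(record):
    """Translate an application-specific record into a shared XML format."""
    root = ET.Element("order")
    for field, value in record.items():
        child = ET.SubElement(root, field)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_crm(xml_text):
    """Translate the shared XML format into the fields a hypothetical CRM expects."""
    root = ET.fromstring(xml_text)
    return {
        "customer_id": root.findtext("cust_no"),
        "last_purchase": root.findtext("item"),
        "units": int(root.findtext("qty")),
    }

shared = order_to_xml(order_record)
print(shared)
print(xml_to_crm(shared))
```

The appeal of a shared format is that each application then needs only one translation in and out of the standard, rather than a custom bridge to every other system—the geometric complexity described above.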

The technical fixes needed to improve knowledge management (KM), which relies on the manipulation and integration of information-handling applications, are more complex and expensive than the technologies required for interactive knowledge networking (KN), which provides online facilities for managed conversation and the sharing of relevant content. In that respect, KN should be simpler to implement than KM, but the more socially driven aspects of the knowledge network and its technical needs make it important to examine some intercultural issues between IT and KN.

IT Culture in the Organization

Although we, the authors, work with technology and information, we don't think of ourselves as "IT people." Yet having worked cooperatively with IT professionals over many years, we've come to understand how they work, how they think, and the language they use. In our roles as online community builders, we have served as bridges between the users of the community-supporting interfaces and our associates in IT who managed the servers, operating systems, and software applications. We recognize a distinct difference in culture between the tool makers and the tool users—between the technicians who build and maintain the digital and hardware infrastructure and the people who use it as one of their primary social communications channels and thereby discover flaws that they can't fix. One culture is dependent on the other, and that asymmetry can lead to less than optimum collaboration.

Most information technicians learn their science first in the classroom, removed from the realities of organizations that must react to opportunity and competition. Once on the job, technicians follow learned standards in building systems that have been proven over time to work. They face constant demands to expand and modify those systems to meet new and unique needs presented to them by the CEO in response to the often-competing needs of various departments. Two key goals of their work today are integration (making all of the internal systems compatible with each other) and scaling (configuring technical systems to expand to meet growing demands).

The population of users in most organizations is unsophisticated in its understanding of technology to an extent not fully appreciated by the technical culture. Often, the highly trained people of IT incorrectly assume that the technology, as provided and configured, is easy for untrained people to use when in fact it is not. Often, IT will assume that the applications are delivering the required solutions when they are not. And often, IT will overlook the importance of and need for training in the use of the tools they provide.

Still, as we've pointed out, IT has strong knowledge-sharing roots. Its culture has formed over the years around attention to detail, faith that there is a technical solution for every problem, the shared assumption that a technical system is never truly complete, and a united feeling that, were it not for the soldiers of IT, the company would grind to a sorry halt. IT is the original digital knowledge culture because it has traditionally depended on the free exchange of ideas, discoveries, and credible rumors for its collective learning and advancement. To the extent that proprietary standards and technologies are now becoming more prevalent, IT as an open, knowledge-sharing culture that traditionally spanned the loyalty boundaries of competing companies is changing. But within each organization large enough to support an IT department, the local IT culture usually remains a microcosm of the knowledge-sharing tradition.

People who work day in and day out with network technologies develop their own viewpoints of best approaches to system design and development. But we know many IT managers and departments that have worked cooperatively and collegially with their internal colleagues, building close consultant-client relationships. The best keep an open mind and go out of their way to understand the needs, values, and strategies of the departments that depend on them for making optimal use of the technical platforms and facilities. These managers serve as the communications liaison between the technicians and the nontechnicians in the company, helping to translate needs into tasks and reducing misunderstandings and communication disconnects. This level of cooperation is critical to the implementation of knowledge-sharing technologies and to meeting the unique challenges they bring.

IT Culture and Knowledge-Sharing Culture

Knowledge sharing is about dynamic information exchange and communication. Its technical challenges have to do with interaction, the retrieval of stored information, and the constant gathering of new information. The key players, who may range from specialized teams to cross-discipline experts to entire departments, must be enabled to interact through the network with one another and with information resources. As part of that process, these knowledge-sharing communities must be able to produce new collections of information—based on their interaction, conversation, and the content they create and gather—that can be categorized, searched, and retrieved.

For IT managers, knowledge networkers are a special class of client. People who rely on the availability and regular maintenance of online meeting places will have different relationships with IT than those who deal only episodically with the software and the data it carries. The direct conversational involvement of knowledge networkers with their supporting technology can lead to frustration with IT, or it can serve to build unique working relationships with individuals in IT. The social nature of the knowledge-networking community should ideally become an asset in forging strong alliances with the IT community by building active and well-nurtured communication links between the two.

Unlike most user populations served by IT, knowledge-sharing communities spend time "living in" the company's technical environment. They treat it as a malleable resource just as they would a physical meeting room where the furniture can be rearranged to facilitate conversation and where various audiovisual tools can be requisitioned and operated to present information to the group.

Self-determined local control over incremental improvement to the interface is important for both the knowledge community and IT. The ability of community managers to respond to needs and suggestions of their members without having to get approval from IT is both convenient and empowering. And with the right software setup, IT can be relieved of the responsibility of making every minor (in terms of system resources) interface-level change in software configuration.

Knowledge sharers converse through the technology and about the technology because they recognize together how improvements in interface design and content delivery can help them discover, exchange, and use information and conversation more effectively. More than with other technical clients, IT can expect the members of a knowledge culture to be well informed and involved in identifying needs for their own changes and modifications. When those experience-based changes are specified directly by the tool users, as shown in Figure 4.1, they are more likely to be appreciated and used productively when implemented. The rising quality curve in Figure 4.1 shows how suggestions made by the community to improve the interface are followed by small jumps in the quality of the online interaction. As the community adapts to the interface changes, the quality curve flattens until the next suggested improvement brings another jump in quality.

What IT managers most need to know about knowledge culture is that, like programmers and system administrators, people conversing through the Net about their special interests are likely to be experimenters and explorers. They fill disk space with their discourse, their writings, relevant documents, and with the information they gather and collect as the basis of their shared work. Through their activity, they discover the needs for new software features, changes in the design of their online work environment, and the composition of their online teams. The idea is to put knowledge directly to work, and the best way to do that through the Net is to establish a trusted communications loop between the knowledge network and the IT resources that support it. Figure 4.2 illustrates how the members of the knowledge network discover what is lacking in the technical interface, pass that information along to IT or the parties qualified to improve the interface, and then receive the benefits of those improvements. These improvements often serve to make use of the interface more convenient or more specific to the knowledge network's needs.

Figure 4.1. Incremental advances in conversation quality with technical improvements.

To establish an ongoing relationship between the knowledge exchange community and IT, a phased approach to implementation is most economical and productive. With each phase of technical improvement, as shown in Figure 4.3, the communications between the two communities can be refined and made more efficient. IT can assess its practical capabilities, assign resources, and work with the knowledge community to define the goals of each phase of implementation. The feedback process of technical design and actual use of the technical changes can be made smoother, with time set aside between phases for reevaluation of needs and capabilities.

Figure 4.2. The ongoing feedback loop between IT and the knowledge network.

The knowledge-sharing community, for its part, must be sensitive to the practical capabilities and limitations of the IT department and thereby minimize inappropriate demand. Communication between the two communities should be defined by an agreed-upon process, with identified liaisons on either end. IT departments prefer trouble ticket systems that keep complaints and bug reports in order and track responses to them. IT should provide training to the knowledge exchange community in how to obtain its services most effectively.

Teams representing the expressed needs of the knowledge community should meet, between build-out phases, with teams representing the relevant skills and responsibilities in the IT department. As Figure 4.4 shows, an uncoordinated barrage of individual requests and trouble reports from an active knowledge community can force a busy IT support team to shut down its intake. Orderly systems for reporting technical bugs and suggested improvements help preserve good relationships between an IT department and its clients. Uncoordinated communications can confuse technical fixers and cause them to avoid responding to a deluge of redundant or conflicting requests.

The mutual interests of knowledge exchange and IT are best served by (1) keeping solutions as simple as possible, (2) arranging efficient and steady communication about needs and capabilities, and (3) making the most positive difference for the company at the least cost in time and technology. With so many software integration solutions reportedly running far over already high budgets, the knowledge network should be technically managed to make the company smarter through the collaborative creativity of the knowledge sharers and the technicians.

Figure 4.3. The relationship between the knowledge network and its technical resources should be active and ongoing.

Figure 4.4. Coordinated and orderly communication with a busy IT department brings better results and relationships.

IT and the ROI of Knowledge Networks

A constant drumbeat in the technical press today is the heightened need for CTOs to work within ever-tightening budgets. Return on IT investment has become more important than at any time in the past decade, and the need to justify every dollar spent has forced CTOs to find better ways to evaluate in advance the return that can be expected from every purchase. So many intangibles affect those returns that accuracy and certainty are impossible. Costs may be easy enough to estimate, but returns can be affected by a range of unknowns such as the strategic fit of the technology, levels of customization required, and the possibility that a competitor's change in strategy will somehow devalue the investment.

Knowledge networks don't necessarily require the scale of expenditure that many other IT projects do, because they can be built on basic online communications tools and integrated interface design, and their objective ROI assessments can be based on the cost and revenue impact of those relatively modest components. However, improvements in online knowledge sharing tend to lead only indirectly to greater revenue returns. Thus, the ROI assessment of those improvements must be done more subjectively than in situations where changes directly affect costs of production or net profit on sales. Some experts have confronted this situation and have devised several useful approaches.

Jakob Nielsen, the widely respected user interface guru, in considering how ROI assessment could be achieved for improvements in the interface design of a company intranet, recommends "measuring the productivity gains and seeing how it improves the employee's ability to undertake their tasks."[45] He looks for objective criteria for assessment, achieved (in the specific case of intranets) by having "study groups of ten, twenty people being monitored in their tasks to see the gains they are making." Organizations, he says, should "list a number of key metrics right at the start of the project" and work toward defined goals to realize the cost savings that technical interface improvements can and should bring. But for many technical improvements, objective measurements of ROI are tricky.

A joint study by Intel and the Wharton School of Business[47] recommended more subjective evaluation of IT investment as one way to avoid purchases based only on objective (but speculative) revenue numbers. The study also emphasized the concept of revenue distance: how far the software or hardware proposed for purchase is from the collection of actual revenue.

Unlike technology acquisitions that can be used immediately and directly by buyers of the company's products, most knowledge-related applications have large revenue distance. Their use leads to clearer thinking in strategic planning more often than to immediate effects such as higher sales. But evaluated subjectively, an investment in improving knowledge transfer can be shown to bring greater overall long-term benefits to the company. Further combining this subjective evaluation with the objective observation of actual changes in employees' work efficiency, as Jakob Nielsen recommends, can provide reliable metrics for ROI assessment.

Another approach to economizing through the technologies and practices of knowledge networking is to improve "management leverage metrics." Through the wise use of online communications, the number of employees reporting to a single manager can be increased. Even a slight increase in this metric, multiplied across a company with thousands of employees, can pay for the investment in technology in a short time.
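As a purely hypothetical illustration of that arithmetic (all of the numbers below are invented, not drawn from any study), even a small widening of managers' span of control adds up quickly across a large payroll:

```python
# Hypothetical back-of-the-envelope calculation of management leverage savings.
# All numbers are invented for illustration only.
employees = 5000
current_span = 8            # direct reports per manager today
improved_span = 9           # reports per manager with better online communication
manager_cost = 120_000      # fully loaded annual cost of one manager (hypothetical)

managers_now = employees / current_span        # 625 managers
managers_later = employees / improved_span     # about 556 managers
annual_savings = (managers_now - managers_later) * manager_cost

print(f"Managers needed now: {managers_now:.0f}")
print(f"Managers needed with wider span: {managers_later:.0f}")
print(f"Hypothetical annual savings: ${annual_savings:,.0f}")   # roughly $8.3 million
```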

Of course, for the knowledge network, this level of analysis should be part of the initial strategic planning. If it is combined with phased implementations that employ basic and economical software solutions rather than elaborate and expensive ones, the CTO should be able to justify the initial phases of the knowledge network's technical infrastructure. The solutions we recommend in Chapter 8, "Initiating and Supporting Internal Conversation," and Chapter 9, "Conversing with External Stakeholders," will provide further ideas for ROI assessment.

Technical Approaches to Managing Knowledge

As we emphasize throughout this book, a knowledge network is a technosocial entity requiring a good match between the tools supporting conversation and the organization of the conversationalists. Without the involvement of humans and their social concerns at all stages of strategy, planning, design, implementation, management, and development, the technological components can do little to advance the spread and use of knowledge in the organization. And without the correct technology, selected and implemented with the wisdom of IT, many opportunities and conveniences for sharing knowledge and generating new knowledge will be lost.

Technology can only do so much, and it can be deceptively easy to provide what look like the right solutions only to find that they don't fit the process needs, work habits, or social culture of the people meant to use them. However, there are several areas where technology can provide tremendous leverage, and IT, in collaboration with the planners of the knowledge network, should prioritize the fulfillment of needs in the following areas:

  1. Integrating knowledge resources

  2. Organizing relevant information

  3. Providing the most appropriate basic tools to support the knowledge exchange conversation

Limitations of Technical Solutions

Knowledge is not like inventory items that can be stored by description in distinct bins on assigned shelves. A 6-millimeter hexagonal brass nut with standard threads, for example, is not subject to different interpretations. In contrast, a story about how a salesperson learned to understand the needs of a customer might be stored, presented, and understood in many different ways by different people because it has many subjective characteristics.

Knowledge is so dependent on human perception and context that one can't depend on a purely technical, automated solution to meet the learning needs of a group or a company. The group must involve itself in the design process of its technical knowledge-sharing environment. That effort is, in itself, a knowledge-sharing activity. The ideal role of IT in that process would be as the group's technical advisor and consultant.

This collaborative design process for knowledge-networking technologies distinguishes it from the more top-down implementation of many knowledge management technologies. Knowledge networks, by definition, are to be used as part of the daily work process. They require the participation of their members in their strategies and design. As the big consulting firm KPMG concluded, after a large-scale analysis of the realized benefits of knowledge management systems reported by 400 companies: "These responses confirm the fundamental flaw in viewing KM as a technology issue: It is not the technology that is holding organizations back but a lack of strategy and a failure to build KM in the organization's day-to-day operations and its culture in order to encourage end-user buy-in."[49]

What good is technology if it is not used? An online knowledge network does not exist without its technical tools, but it must wisely choose tools and design interfaces that are appropriate and will actually be used because they answer real needs. The base-level tools that one knowledge group requires will almost surely differ from those required by other groups. Some kind of technology will certainly be necessary for online knowledge exchange, but unlike KM systems, the most important exchange activity will not be in the retrieval of well-organized information. It will be in the active give-and-take between people through the communication and content delivery systems provided by IT.

Integrating Knowledge Resources

According to the technical dictionary site WhatIs.com, "a kludge is an awkward or clumsy (but at least temporarily effective) solution to a programming or hardware design or implementation problem."[51] In the pursuit of higher productivity per worker, many IT departments, lacking sophisticated solutions or strategies to guide them, have built kludges to provide access to different applications from a single location on an intranet. Although usable, the resulting online gateways have not really solved the interface compatibility problem, and the resulting confusion with incompatible interfaces presented in a common window has often eroded or reversed the very gains they were meant to achieve. Users refuse to use the confounding gateways, and their productivity is not improved. Thus the emphasis in IT on improved integration: the technical conversion or reconfiguration of data and interfaces from different software applications into single, unified, comprehensible "consoles" that users are more likely both to understand and employ on a regular basis.

The shift in business thinking toward the knowledge management approach brought greater focus on two things: (1) delivering specific information to the specialists who needed it and (2) avoiding unnecessary duplication of the same tasks within the organization. To those ends, specialists defined the knowledge resources that needed to be made more conveniently available to them. Depending on the business unit or department being served, these may have included records of client transactions, stored proposals and project histories, and locators for expertise and current related activities within and outside the organization. Different applications created and stored these resources, and IT provided kludges to tie those different applications and their databases together. Recognizing a clear opportunity for improvement, software providers began offering packaged products that claimed to serve the same purpose, saving time for IT and providing solutions that were more elegant and intentionally designed.

Whether the company's integration solution consisted of knitting together best-of-breed software applications or purchasing ready-made all-in-one applications, one great obstacle to utility remained: the inappropriate manner in which the content of the databases was selected and stored. IT would create and set up the information storage process, but without the essential advice and consent of non-IT experts who represented the knowledge needs and perspectives of the end users and internal clients. Thus, the stored information did not go through the essential processes of editorial selection, categorization, and filtering provided by the people most familiar with the content and how it would ultimately be used. The result of providing application integration without involving the end users of the information can be something like granny's attic, where piles of articles related to the family history have been stashed expediently over the years. There they sit, gathering dust in their random heaps, until a family member with a desire to do genealogical research (and plenty of spare time) finally comes along to make sense of the chaos. Without some systematic and meaningful ordering of content as application integration takes place, the knowledge held by the organization becomes, for all practical purposes, useless.

Web Services: A New Approach to Integration

In the pursuit of simpler application integration, the latest trend as we write this book is toward the creation of standardized "Web services." It's too early to be sure that these will fulfill their early promise, but there is no doubt that companies desire what Web services claim to deliver: the ability to mix and match utilities from different providers to build full Web applications for use in both internal and public networks. Web services would eliminate the shortcomings of both kludged integration and all-in-one solutions because they could neatly bring together the best-of-breed solutions for various functions in customized, internally consistent Web interfaces.

The software standards being bandied about go under different acronyms such as UDDI, WSDL, and SOAP. Through widespread adoption of a Web services standard, programmers hope to be able to assign "agents" that can go to specific Web sites to accomplish specific functions such as integrating various programs. For a knowledge network, this opens the possibility of selecting a variety of applications from different Application Service Providers (ASPs)—message boards from one, news feeds from another, supply chain management from yet another—and integrating them into one seamless Web site. Here all of the required applications follow the same formatting and functionality rules, thus eliminating the need for cutting and pasting data from one application to another.
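As a rough sketch of what such an exchange looks like at the wire level, the fragment below posts a hand-built SOAP envelope to a hypothetical message-board provider. The endpoint URL, namespace, and operation name are invented for illustration; in practice, an integration would be generated from the provider's published WSDL description rather than hard-coded like this.

```python
# A minimal sketch of calling a hypothetical Web service over SOAP.
# The endpoint URL, namespace, and operation name are invented for illustration.
import urllib.request

soap_envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ListTopics xmlns="http://example.com/messageboard">
      <community>product-support</community>
    </ListTopics>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    "http://example.com/soap/messageboard",       # hypothetical ASP endpoint
    data=soap_envelope.encode("utf-8"),
    headers={
        "Content-Type": 'text/xml; charset="utf-8"',
        "SOAPAction": "http://example.com/messageboard/ListTopics",
    },
)

with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))        # XML list of topics, if the service existed
```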

The greatest barrier to the widespread adoption of any one Web services standard, as we noted earlier, is the Internet's history of incremental grass-roots innovation and its tendency to resist the freezing-in-place effects of standardization. Microsoft (no surprise) is one of the leaders in bucking the resistance, offering its .NET standard for Web services. Through its alliances with eBay, CNBC, and its own Carpoint site, Microsoft has been able to demonstrate its protocol in action. Its allied companies can, through the Web, send custom alerts to their customers containing auction updates, stock prices, and real-time auto-related news tidbits. Customers can receive these alerts by way of email, cell phones, and personal digital assistants (PDAs).

Because of Microsoft's huge installed base of PCs and servers, there would be widespread compatibility with its standard. That would be a good thing for many companies, but the groups advocating competing standards maintain that there would be unacceptable interoperability problems with systems based on other operating systems, notably the very popular UNIX. UDDI also provides a list of applications that, under its UNIX-based standard, could be linked together into integrated products.

As of year-end 2001, the Web services standards issue remains unresolved. IT managers not concerned about compatibility with external systems managed by other companies might be persuaded to adopt any one of the standards internally to gain its application integration benefits for building out their intranets. But should they make a decision now, they might regret it later if an important partner or market turns out to be using a different Web services standard.

Besides compatibility problems, a secondary hurdle in the adoption of Web services across applications could be the assignment of responsibility for problems encountered in integrated systems. Suppose a company employs Web services to deliver an online product to customers or clients through integrated applications provided by five different companies. If a customer encounters problems with output from the system, who takes the blame and provides the support? The application provider or the application integrator?

Internal knowledge networks can benefit greatly by adopting Web services solutions. It's only when the networks extend outside the organization—as in applications where customers on the Internet are involved or where partnering companies become members of a knowledge-sharing extranet community—that the hard questions about choosing a standard must be answered.

Knowledge Organization

Knowledge networks rely on the organization and contextual availability of content to support their conversations and their work. Likewise, they need to quickly store the content they produce with the same quality of order and with the same level of availability. IT must provide the tools that allow this customized information flow.

The need to assign order to knowledge resources led to the development of taxonomies and categories as far back as the Library at Alexandria in Egypt. In today's world of knowledge, categorization helps match information with the tasks, projects, and departments that create and need to retrieve it. Editors and archivists, representing the focus of the knowledge community, are essential in making the best use of static and dynamic information as it flows between conversations, new content, and stored databases. IT provides the technical facilities, and the knowledge network provides appropriate human intervention for using them. As Figure 4.5 illustrates, a librarian or archivist fills an important role in any online knowledge-based community. Categorization must address the special needs of business units, teams, and communities where new information is being generated constantly.

To meet this need, automatic taxonomic software programs, which file information according to embedded or assigned keywords or by the context of its creation, are becoming more common. IT may recommend these as solutions for the knowledge community, but they are only as effective as the active involvement of their human users makes them. Automation of knowledge organization may help prevent the granny's attic scenario, but human involvement and evaluation are necessary to determine what knowledge is truly worth saving and in what context it should be saved.

Figure 4.5. Essential elements for online integration of discussion and content resources.
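A crude sketch of how such keyword-driven categorization works appears below; the taxonomy, keywords, and sample text are invented, and real taxonomic products are far more sophisticated, but the principle of matching new content against human-defined categories is the same.

```python
# A crude sketch of keyword-driven automatic categorization.
# The taxonomy and documents are invented; real taxonomic software is far more
# sophisticated, but the principle of matching content to human-defined categories holds.
taxonomy = {
    "customer-relations": {"customer", "complaint", "renewal", "satisfaction"},
    "product-development": {"prototype", "specification", "release", "defect"},
    "sales-process": {"proposal", "quote", "pipeline", "discount"},
}

def categorize(text):
    """Return the categories whose keywords appear in the text, or flag for review."""
    words = set(text.lower().split())
    matches = [name for name, keywords in taxonomy.items() if words & keywords]
    return matches or ["uncategorized: needs human review"]

print(categorize("Draft proposal with a volume discount for the renewal"))
# -> ['customer-relations', 'sales-process']
print(categorize("Notes from Tuesday's planning lunch"))
# -> ['uncategorized: needs human review']
```

Note that the interesting cases are the ones the program cannot place—exactly where the archivist or librarian of Figure 4.5 earns his or her keep.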

The categorizing functions of a knowledge network should have an online home that, through its location, provides some context and access to the people most likely to use it. This is an important integration point for IT, where it can bring such services to users in immediate online proximity to the conversations that will make use of, and contribute to, the contained subject knowledge. Such an online location would be the knowledge portal. In our discussion of portals later in this chapter (see The Knowledge Portal), we describe how knowledge communities can perform taxonomic functions through the portal interface.

Basic Tools of the Knowledge Network

Conversation and content are the basic building blocks of knowledge exchange. Putting the process online creates the need for basic tools to support the wide range of conversation styles and structures and the wide variety of content formats and shelf life. We emphasize the wisdom of starting small, following a phased implementation, and basing that implementation on an overall strategy. Thus, in this brief overview of the basic tools, we focus on technical products that are inexpensive to install, easy to scale, and simple to customize and manage. Chapter 5, "Fostering a Knowledge-Sharing Culture," provides a view of knowledge-networking software that expands beyond the basics.

The IT department's involvement in these tools will begin with evaluation and approval and can extend from integration into the company's intranet interface to full administration of the platform on company servers. In most cases, some level of support will be required for ongoing improvement and evolution of server-side integration and for support of the Common Gateway Interface (CGI) scripts and Java applications that are now part of most software.

Email

This oldest, most basic, and most ubiquitous software application is recognized as the one "killer app" that, more than any other, justifies the existence of the Internet. It is also the most abused of applications, as everyone who must delete junk email constantly or who has received misdirected messages (or sent them) understands.

Email was, and is, the channel of most communication between individuals and groups. As a means of participating in mail lists, Usenet newsgroups, and now, many commercially designed online message boards, it serves as the interface to group communications. Through its many user interfaces, it permits the sharing of files, links, and graphics.

For IT, the existence and maintenance of mail servers are among the most basic elements of its installation. Its involvement is necessary in configuring the special mail lists and aliases that allow defined groups to circulate announcements and participate in conversations. Where email is used as a means of participation in online message boards, IT may need to be brought in, or at least consulted, about the configuration of the program and its interaction with the company's mail servers.

The most vexing problems of email (besides the unending task of filtering junk) are user overload and the security risks of transmitting viruses to internal systems. The more groups or lists people subscribe to and the more alerts and updates they ask to receive, the more likely they are to begin ignoring those messages as time goes by. IT can only do so much to help relieve the email burdens that people put on themselves in their pursuit of the right knowledge and information. As to security, most competent IT departments have active virus-filtering programs in place and use firewalls and policies to minimize risk.

Instant Messaging

Email is asynchronous; two people corresponding don't have to be online, writing and reading at the same time, to carry on a conversation. But there is a different quality to communication when the medium is synchronous, like the telephone or instant messaging (IM), as America Online calls its Instant Messenger technology. The immediacy of response when people communicate in real time is much closer to the experience of talking face to face or on the telephone. For many people, that immediacy makes the communication more intimate, more exciting, or more social. It's a very popular way of communicating, as was demonstrated when a company called ICQ offered its instant messaging client over the Web several years ago and, without any marketing, had 14 million people download it over the course of a year.

Today, IM software is available primarily from two sources: AOL (even for non-AOL members) and Microsoft (and the many distributors and servers of its technology). Although compatibility problems still exist between the two main standards used by AOL and Microsoft, the use of IM within businesses has been skyrocketing because of its convenience for supporting teamwork.

The problem for IT is one of IM security, as noted in a report by the Gartner Group.[54] With up to 70 percent of enterprises expected to be using IM for various purposes such as customer support and workplace collaboration, the use of what Gartner calls "free" instant messaging clients opens the door to the interception of messages, transmission of computer viruses, and intrusion through nonstandard system ports. New enterprise-level secure instant messaging applications are now on the market.

Discussion, Conversation, and Conferencing

Online message boards and conferencing interfaces allow groups to engage in organized, moderated conversations that serve as both a means of meeting in virtual space and a content-generating activity. The main strengths of these platforms are that they offer the opportunity for participation and involvement at the convenience of people whose schedules may not allow them to attend real-time online meetings, and their interfaces allow conversations to be built on planned structures as an aid to organizing knowledge communities and their projects.

These systems are available to support conversations in two main formats: linear and threaded. Linear conversations begin with a title and topic header and proceed with messages that are added one after the other in a linear progression. As each participant reads through the list of messages and adds his or her own, it becomes the last message in the list. Threaded messages permit a participant to respond directly to any message posted after the topic header instead of only to the last one. Thus, any message responding to the topic header can, itself, become a topic header for a new conversation or thread. Which is best for a given knowledge community depends on the preferred format of conversation, the amount of participation, and the purpose of the individual conversation. Some products permit the use of both formats, with the participants able to choose their preferred view and use of the interface.
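In data terms, the difference is small but consequential: a linear board stores a simple time-ordered list, while a threaded board records which earlier message each post answers, forming a tree. The sketch below uses invented field names and messages to show both views of the same conversation.

```python
# A minimal sketch of linear versus threaded conversation structures.
# Field names and messages are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Message:
    msg_id: int
    author: str
    text: str
    reply_to: Optional[int] = None   # None in a linear board; a parent id when threaded

posts = [
    Message(1, "Ana", "Topic: lessons from the Acme proposal"),
    Message(2, "Ben", "The pricing section confused the client.", reply_to=1),
    Message(3, "Cho", "Agreed -- we should template that section.", reply_to=2),
    Message(4, "Dee", "Separate point: the demo went very well.", reply_to=1),
]

# Linear view: simply the posting order.
for post in posts:
    print(f"{post.msg_id}. {post.author}: {post.text}")

# Threaded view: indent each reply under the message it answers.
def print_thread(parent_id=None, depth=0):
    for post in posts:
        if post.reply_to == parent_id:
            print("  " * depth + f"{post.author}: {post.text}")
            print_thread(post.msg_id, depth + 1)

print_thread()
```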

Message boards can run as licensed applications on the company's own servers or can be hosted by ASPs for fees based on activity levels. The preference of the IT department and its needs for security or interface customization will determine a given company's approach. Different products provide different levels of control over customized interfaces, different ranges of organizational options, and different degrees to which users can be assigned permissions and powers for administrative control over levels of interaction.

For example, a system may be used to conduct ongoing meetings for four related work teams. Each would need its own set of conversation topics, and each would want to have its own conversation manager with powers to start, end, edit, and organize topics. An overall community manager would be empowered, under the software's tools, to set up these lower level administrative capabilities.
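A minimal sketch of how that tiered administration might be represented is shown below; the role names, permissions, and teams are invented, and actual conferencing products define their own schemes.

```python
# A sketch of tiered administrative permissions for a conferencing system.
# Role names, permission labels, and team names are invented for illustration.
PERMISSIONS = {
    "member":               {"read", "post"},
    "conversation_manager": {"read", "post", "start_topic", "end_topic", "edit", "organize"},
    "community_manager":    {"read", "post", "start_topic", "end_topic", "edit", "organize",
                             "assign_roles", "create_team_area"},
}

# Four related work teams, each with its own conversation manager.
teams = {
    "design":    {"manager": "priya"},
    "testing":   {"manager": "omar"},
    "marketing": {"manager": "lena"},
    "support":   {"manager": "kofi"},
}

def can(user_role, action):
    """Check whether a role includes a given permission."""
    return action in PERMISSIONS.get(user_role, set())

for team, info in teams.items():
    print(f"{team}: conversation manager is {info['manager']}")

print(can("conversation_manager", "end_topic"))     # True
print(can("conversation_manager", "assign_roles"))  # False -- reserved for the community manager
```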

The best of these products also permit the integration of content through their Web interfaces, making them hybrid platforms for conversation and relevant content publication. They may also permit the integration of other software applications such as email (for posting messages or for receiving new messages posted to selected conversations), real-time chat or IMs, and groupware tools such as collaborative white boards and copublishing interfaces such as wiki, which we'll describe in detail in Chapter 5, "Fostering a Knowledge-Sharing Culture."

The Peer-to-Peer World

Peer-to-peer, or P2P, applications are the rebellious youth of the organizational software world. They permit peers (that is, individual computer users) to collaborate directly over the Internet without the direct use of intermediating servers, which are the domain of control of IT. In fact, one of the reasons P2P applications are being developed is to circumvent the limitations and rules imposed by IT on users of its systems. Many people would prefer to configure their working interactions according to the needs of the moment rather than wait for approval, clearance, and possibly unsatisfactory results coming from the requirements of IT's involvement in the process.

P2P is truly the ultimate vision of the Web in that it gives equal power to every individual with a connection and a computer. Yet, as Eric Woods states in KM World,[55] "it has yet to make a significant impact on the corporate IT world." The reason he gives is that the prospect of people in the corporate workplace all doing their own technical thing "is the stuff of nightmares for most IT managers. Their systems—and their lives—are complex enough without adding new layers of connectivity and interaction."

That said, there's little more to say about P2P in this chapter on IT, but we'll return to its exciting possibilities for knowledge networking in Chapter 5, "Fostering a Knowledge-Sharing Culture."

Content Management and Publishing

Providing timely and relevant content through the Web to working knowledge communities is just as important as supporting their conversations. In fact, as we've pointed out repeatedly, those conversations often become the stuff of content—as edited transcripts of the actual interaction, as quotes extracted from dialogues, and as stimulators for new writings and documents that become available to the group.

There are many products available for enterprise-level content management, and most enterprise-level companies now have at least one installed for publishing to their intranets, extranets, or customer-facing Web sites. Many of them can be adapted for use in combination with the interactive interfaces described earlier. And as we mentioned, some message board interfaces provide at least limited content management capabilities.

All of these tools, of course, need to be integrated into a common, useful online space where the knowledge community can gather and share what it knows in the context of specific projects, goals, and practices. In the next section, we describe two approaches to bringing it all together into knowledge-sharing environments on the Net.

Online Environments for Knowledge Sharing

Though any of the foregoing Web-based tools can serve the needs of a knowledge-sharing community, they are more likely to be used and to make a difference in the productivity of that community if provided through an online facility meant to serve the entire organization. Such facilities have been available in many organizations since the late 1980s, yet they are still in their infancy in terms of design and utility. Application integration is only one of their shortcomings, and IT is usually given most of the responsibility for designing and creating them. However, it would be unfair to expect IT to understand the human engineering dimensions necessary to fit them to all of the possible uses that different groups within the organization will have for such resources.

The first attempt to bring useful services and resources to the desktop of employees was called the intranet because, unlike the Internet, it was meant only to network within organizations rather than between them. Intranets, as we've pointed out, suffered in their acceptance from poor design and limited integration and standardization. Only the most skilled or curious employees made good use of them. And though much has been written about them as knowledge management resources, their most telling limitation was in their lack of actual use. People simply chose not to devote the time necessary to learning how to penetrate their confusing interfaces and formats.

Knowledge networking depends for its success on participation. When the members of such a network find that they must devote too much of their time to searching for information, learning to use the tools, or establishing a connection between the conversation and its supporting content, they will give up and make decisions based on whatever information they can find before frustration sets in.

Thus, good design in the online environments that support employees and, in our case, knowledge communities is important. Companies are learning from their failures in early intranet design, and the examples set by Web pioneers like Yahoo! have demonstrated the value of building sites that provide access to many complementary resources, the so-called Web portals.

These can be key resources for knowledge exchange, so we will devote some pages to them in this chapter, where we'll concentrate on the IT department's role in providing good ones. In Chapter 8, "Initiating and Supporting Internal Conversation," and Chapter 9, "Conversing with External Stakeholders," we'll provide design and management tips for making these environments a good fit with the needs and goals of knowledge networks.

The Productive Intranet

In the 1990s, corporations began building intranets to give their employees access to commonly used information resources. These included directories of personnel, information from human resources about benefits, 401(k)s and stock option plans, online forms (also used by HR), and general news about the company. Some companies used their intranets as gateways to different databases of business-related information maintained by the company. These were accessible in their raw data formats and were not presented through user-friendly interfaces. As Jakob Nielsen described most intranets through the 1990s, they were "lacking interface design standards, unified information architecture, and task support for collaboration and other activities."[57] Employees weren't motivated to use them because their designs were confusing and difficult.

As the Web became more widely used and as its technology and more standardized interface features penetrated the internal design sensibilities of these organizations, the options for what could be provided through the intranet expanded, as did the capabilities for more integrated interfaces. Their original purpose—to enhance employee productivity through convenient access to often-needed information and resources—hadn't changed, but the ability to motivate their use by making them simpler and more attractive to use had.

People who study intranet utility now regard simplicity and adequate training as the best ways to entice employees to use them. Once the work force has become accustomed to intranet use as a normal part of the daily routine—checking shared calendars, company bulletin boards, paycheck stubs, and daily management announcements—its use for purposes beyond those administrative and HR-related tasks is more likely to be adopted. Groups seeking to share knowledge will begin to recognize the utility of taking their activities online and, following the design examples of simplicity and utility, will drive the building of their knowledge-networking environments.

Figure 4.6 shows the design of an intranet page chosen by interface guru Jakob Nielsen as one of the best of 2001. Simple design and navigation with emphasis on a "community" feel and intracompany communication are among its strengths. But intranet design must address the needs of the communities using them. This one, assembled by an intranet specialty company called silverorange (www.silverorange.com), meets Nielsen's design criteria and implicitly supports the values of knowledge sharing. However, it does not provide the specific utility required by a specialized knowledge network. Instead, it serves as an informative bulletin board for an organization.

Figure 4.6. The silverorange intranet design was judged by Jakob Nielsen to be among the best of 2001.

Good and useful intranets should be designed by the people who will actually use them. Following the theme of this chapter, IT should serve in the role of consultant to the client design team to make sure that the end product is something that will answer real needs. The resulting product should make it easy for people to innovate: to create new pages, post new content, and collaborate with colleagues. The people who use the intranet to find information should be able to find what they're looking for without satisficing—settling for merely good-enough results. Eric Hards, a senior designer for Lockheed Martin's intranet, recommends redundancy: "You need to give users as many ways as possible to find something."[58]

To ensure that the design will work, IT must implement it and then go through testing phases with the design team as focus groups and beta groups use the initial design and evaluate its usability, navigability, and search capability. Subsequent improvements will need to be made, and, where needed, training resources will need to be provided. Once the intranet is launched, IT will maintain the basic structure while providing an interface that is under the direct control of the various groups, including knowledge-exchange communities, that make use of it.

The Knowledge Portal

Portals are like intranets in that they provide online interfaces that bring a variety of resources together in one place. One difference in definition between intranets and portals is that the intranet is a system provided to all employees of the company, whereas the portal interface is a Web page devoted to serving the needs of more specific interest groups. Portals are often accessed through the company intranet.

Sometimes referred to as enterprise information portals (EIPs), portals have the same purpose as the intranet: to improve the productivity of employees. The best of them provide access to a range of services from company history and policy to training resources and detailed product information. They bring together all of the tools we've described in this chapter—conversation interfaces, content management, access to information databases—and they can be administered locally by leaders of the teams that use them.

Well-designed portals can reduce IT costs by distributing administrative responsibilities to the people who are most likely to understand the changes needed in them and are most able to respond promptly to the expressed needs of portal users. That effectively removes a time-consuming task from the support loop and spares IT the responsibility of making technical changes that are within the expertise of less skilled (and often, less expensive) people.

Not all portals are limited to internal access, as are, by definition, intranets. The broad definition of portals applies to Web interfaces that invite access and participation by customers and partners, fitting within the definition of extranets. In their support of knowledge exchange among customers and between customers and the sponsoring company, they serve as knowledge networks.

All of the divisions of responsibility and design we described for intranets apply equally to portals, the difference being that organizations that support portals are likely to have more than one of them, with corresponding teams designing and managing them. This creates the need for some restructuring within the IT department to serve what may be a whole new category of support under the IT umbrella rather than a single point of contact for intranet administration.

As with intranets, we will explore best practice solutions for portals as a powerful tool in the knowledge-networking process in Chapter 8, "Initiating and Supporting Internal Conversation." We'll revisit portals as an extension of the knowledge network into the realm of customers and interbusiness collaboration in Chapter 9, "Conversing with External Stakeholders."

Summary

An online knowledge network depends on the active support of the IT department for the creation and basic maintenance of its working environment. For that reason, it's important that a well-understood working relationship be established between the leadership of the knowledge community and the appropriate people in IT. The knowledge network is a dynamic entity that discovers better solutions for its needs through its internal conversations and exchanges. It requires an attentive ear in IT just as it needs software tools that it can modify at its own discretion.

Simplicity is the primary criterion for technical solutions both because of costs and the need for members of a knowledge network to adopt and use them. Beginning with the most basic of interface tools for conversation and content management will bring greater participation and a smoother path to incremental improvement of the interface. The role of IT should be to aid in tool selection, initial installation, and the maintenance and integration of relevant information applications within the company that will support the pursuit of knowledge.

Knowledge-exchange communities are most productive when provided with complete online environments that include current relevant content, appropriate conversation tools, and the ability to customize their virtual workspace as needed. Intranets are one approach to building these environments, but portals fit more of the criteria of meeting spaces specialized to the focus of distinct knowledge networks.
