CHAPTER 7
The Technology Trap
The knowledge of the world is only to be acquired in the world, and not in a closet.
—Lord Chesterfield (1694-1773), published 1774
 
Based on the flawed use of the term knowledge management to depict the management of an entity external to humans, a number of organizations approached the original issue of making best use of the knowledge within their organization from an information technology (IT) perspective. Many knowledge management projects have been started within the IT organization, and not too surprisingly they began with the evaluation and purchase of software and hardware. This was true 10 years ago, and in many cases it is still true today. Again and again, one of the first questions I get when I talk to those who have been charged with creating or reviving a knowledge management program and who are just getting started is “What software did you use?” When I investigate further, it becomes very clear that people think all they have to do to be successful is buy and install the right software. The software question should be one of the last ones I am asked, not the first.
But if we look back, the situation looked like this (do not feel bad if this is how your company approached it; you are definitely not alone):
A middle manager (Joe) encounters the potential value of knowledge management (via a conference, article, or book).
Joe goes back to his company, discusses it with others, and gets some excitement and buy-in from his boss. He then initiates a new project, appointing a project manager (Bill).
Bill and a few others do some additional research and stumble over a few knowledge management vendors.
Before they have planned a fully holistic view of what this might mean to their company culture and how they are going to deal with the ongoing issues, they invite three or four vendors to showcase their knowledge management software.
Companies A, B, C come in to show their super-polished portals, content management, or collaboration platforms, carefully demonstrating how great the systems look when they are filled with information that can be retrieved at the click of a button. At the same time, they are just as carefully steering around any questions about how you will fill the back end of that portal with useful, up-to-date, clean information on a daily basis, via processes integrated into the normal day job of your employees.
Everybody evaluating the software is very excited; the return on investment of saving time and avoiding reinventing the wheel and the innovation potential of such a system filled by thousands of employees dwarfs the $1.5 million that it will cost you. Well, your budget was only $1 million, but that extension is easy to argue.
The system gets installed, and IT support is handled by a few technically savvy geeks who just love to explore the 324 features the new software has to offer.
The roll-out is done in a strategic way. Everything is tested to make sure that technically it works perfectly at the big launch. After all, this is an important and strategic project. A high-level vice president or even the president either sends out a special message or introduces it at some kickoff presentation. The internal communications group produces and runs a news article about the way the new knowledge management system will make everybody more efficient and how easy it will now be to find all the knowledge they need. It will be only a few clicks away.
Usually the expectation is that everyone will immediately go to the new system and start sharing their knowledge.
And some people will do that. Some will share the excitement and start entering information representing some of their knowledge.
Just as typical is the situation 18 months down the road. The system is still there. It contains some content, usually entered once, seldom updated. There are a few users—usually a fraction of the number of people anticipated. The original team that developed and rolled it out has left the company or is on other projects within the organization. If you ask people about it, they will say things like “Oh yes, I remember we had something like that, but I am not sure if it is still there and where it is.”
So what went wrong? They forgot the engine in the car!
If we take a car as an analogy for a moment, what happened is a little bit like buying a car that has a lot of luxury items. It has alloy wheels and all the internal extras (power-everything, 500-watt stereo, leather seats); however, you buy it without an engine. As a result, it will just sit there. The car without the engine was already $45,000, so there was no money left for the engine. It would have cost another $5,000.
But as we all know, a car without an engine misses the point. What I am claiming is that a knowledge-sharing or knowledge flow initiative without investment in a lasting team to run the show (not only from a technical support perspective, but with the right strategy, ongoing support, frequent application adaptation, internal marketing, and motivational activities) is like buying a car without an engine. If getting knowledge to flow was your objective, without those elements you did not give yourself much of a chance.
For every dollar you spend on the technology, you need to spend at least 50 cents on ongoing initiative support. And that assumes you have well-experienced professionals running your knowledge flow management initiative.
This can be illustrated by the following, Leistner’s first law of knowledge flow management:
K = T + S
where
K = Full investment for a knowledge flow management initiative
T = Technology investment
S = Initiative support investment
For a successful initiative, I am proposing the following relationship, where the investment in initiative support is at least half the amount spent on technology. Ideally the amount spent on support should be close to the amount spent on technology.
0.5 × T < S < T
This relationship can vary depending on the type of initiative. It will also vary over the time that an initiative is running. Once basic technology is in place, S should grow a bit higher than T. This formula is most important in the launch year.
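The relationship above can be expressed as a small sketch in Python. The dollar figures are hypothetical, and the function name is my own invention, but the check is exactly Leistner's first law and the proposed range for S:

```python
def knowledge_flow_budget(technology, support):
    """Leistner's first law: K = T + S.
    Recommended balance: 0.5 * T < S < T, ideally with S close to T."""
    total = technology + support                        # K = T + S
    in_range = 0.5 * technology < support < technology  # support in range?
    return total, in_range

# Hypothetical launch-year numbers: $1.0M technology, $0.6M support
total, in_range = knowledge_flow_budget(1_000_000, 600_000)
print(total, in_range)  # 1600000 True
```

With only $0.4M of support against $1.0M of technology, the same check would flag the budget as out of balance.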
It is essential to realize that S is not just technical support. While some of that is necessary, the technical portion should be only a small part. Most of it should be spent on process and drivership support.
As mentioned earlier, technology is often easier to introduce and embed than processes. As soon as the focus shifts to human behavior, the situation becomes more complex; or at least it seems a lot more complex to those who lack a proper understanding of how to deal with human behavior. In some cases, the actual knowledge exchange can be simpler in a face-to-face situation, but there is no obvious handle for control in such processes, which makes them seem complex.
Because many knowledge management projects were driven primarily by technically focused people, it is easy to see why the focus was on the technical side. The human side, why people might participate or not and what could be done to get their ongoing involvement, is something that needs a specific skill set and experience, as discussed earlier.
Do not misunderstand me: I believe technology is very important. It is a great enabler, and its existence can actually change people’s behavior over time, as we have seen with some of the Web 2.0 technologies. The question is to what degree that is achieved by technology alone. Just building something and putting it in front of people is not enough. No matter how easy it is to use, it needs some type of guidance to be really efficient. Without that guidance, you might get good usage but potentially not very efficient usage. What good is it if you have large participation, because people find your technology easy to use, but they are not using it in a way that creates business value? You might think in that case that the system is insufficient, as it does not guide the user enough through a business process. But as discussed, locking the business process too heavily into the system would create another big long-term problem, because it would not allow enough flexibility.

ASSET OR POINTER?

But how should you guide people to use the flexibility? What are typical mistakes, and what guidance should you give? One good example would be the customization support for portals mentioned in Chapter 6, representing a more general support on how to use technology efficiently.
One other category of guidance is around the role that technology actually plays. The next story illustrates how viewing a system in one way or another can make a big difference.
In one of the earlier years of ToolPool, a consultant asked on a mailing list for a tool that would produce automatic process documentation as an add-on to one of our products. In the request she sent out, she mentioned that she had found a version of the tool that worked with a prior release of the software, but she needed it for the very latest release. It was Friday morning and she needed it rather urgently; without having such a tool, the project might be delayed.
When I saw her e-mail, I privately (directly) replied to her, asking whether she had contacted the author to ask if an update already existed. She had not done so. When she asked the author, it turned out that he already had the updated version but had not found time to contribute it yet, but he could send it to her. Within hours she had exactly what she needed, her project stayed on track, and the ToolPool team, aware of the update, supported the author to get it into ToolPool for everybody as soon as possible.
Many people looking at a “knowledge base” think of it as a repository of knowledge. If they cannot find what they are looking for, or if it is really not there, the reaction is to turn away and, in the worst case, start reinventing the wheel. But any such system is actually more than just a repository; it is what I refer to as a repository of pointers to the one who knows.
In the example, the consultant was looking for a certain knowledge asset that was not in the repository, but the older entry was a great pointer to the person who had the solution. In other cases, the solution might not be a new version of a tool; it could be that the author of the contribution has some specific knowledge that could be of value.
The contribution itself is only an incomplete representation of the knowledge that a contributor has, but it serves well as a pointer. In fact, people often acknowledge that they prefer to talk to an expert directly rather than look at some document or pick up something contributed to a database. In a large global organization, however, this method does not scale well; it can be very tough to find that person. One approach might be to use some type of skills database or enhanced staff directory system that records skill and experience levels. Often such systems have a data quality problem: participants are supposed to fill in their profiles but do not do it regularly and consistently enough. But even if you get over that problem by using enough drivers to ensure sufficient quality (as we managed with the skills database at SAS), you have to be clear that such systems map only one out of an endless number of dimensions of a person’s knowledge. As long as you understand those limitations, a skills database can serve a number of purposes.
However, a collection of an author’s contributions represents a work product that covers additional dimensions. For example, when a consultant indicated in the skills database that she knows product X at level 3, which is defined in detail as being able to perform certain tasks with the product, it might be somewhat one-dimensional. If the same consultant indicated via a different skills database item that she also knows a certain computer operating system, you would have additional information useful in a search for a given expert.
Now if the same person has made several contributions around the product, those contributions represent a pointer to certain knowledge that the consultant would have to have in order to produce what she contributed. Her contributions represent a few more dimensions of her knowledge that make it easier to identify her.
Often users do not look at knowledge repositories in this way. They see the repository and that is where it stops. If they do not find the right asset to use, they assume nothing is available. Their thinking has to go further, and they need to look beyond the repository that is in front of them. This limited view can be considered the asset view; the one that views contributions as pointers can be thought of as the pointer view.
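The difference between the asset view and the pointer view can be sketched in a few lines of Python. This is a hypothetical illustration, not ToolPool's actual implementation; the entry, the release names, and the e-mail address are all invented:

```python
from dataclasses import dataclass

@dataclass
class Contribution:
    title: str
    release: str
    author: str                                   # the pointer to the one who knows

# A repository entry for an older release, as in the consultant's story
repository = [
    Contribution("Automatic process documentation tool",
                 "release 8.2", "tool.author@example.com"),
]

def find_asset_or_pointer(topic, release):
    """The asset view stops at an exact match; the pointer view falls
    back to the author of a related, even outdated, entry."""
    for entry in repository:
        if topic in entry.title and entry.release == release:
            return "asset", entry                 # direct reuse
    for entry in repository:
        if topic in entry.title:
            return "pointer", entry.author        # contact the author directly
    return "nothing", None

# The needed release is missing, but the old entry still points to the expert
print(find_asset_or_pointer("process documentation", "release 9.1"))
# ('pointer', 'tool.author@example.com')
```

Under the asset view, the first loop failing would end the search; the pointer view adds the second step, which is where the real value of the repository often lies.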
Recently, through the use of Web 2.0 types of tools, people seem to be getting more used to this type of connecting to experts via pointers. In fact, Twitter, the rapidly growing microblogging service, is based largely on people taking a pointer view. An individual Twitter message, with its 140 characters, is too short to hold a full answer, but it can serve as a pointer to a person with more expertise.
One of the Twitter accounts that I follow is from somebody offering tips for competitive swimmers. The first pointer to that person was a small tweet (Twitter message) that a friend pointed out to me. That tweet led me to the actual person twittering, and I decided it would be worthwhile to follow that person. In this case it was actually a combination of pointers: A human (my friend) pointed me to a system, which in turn contained a pointer to another human.
The pointer view is important, and I recommend that all participants in a knowledge flow management initiative understand it. It ties back to the quality argument. Say there are contributions that are incomplete and leave open questions; they are therefore not considered of value by those who take only the asset view. Under the pointer view, though, that same contribution could actually be a key pointer to an expert, and the fact that there is an open question could well inspire direct contact. Once direct contact is established (via e-mail, telephone, or in person), more of the tacit knowledge could flow, some of it between the lines. A document by itself could not have provided that type of transfer. In an extreme case, an imperfect contribution could even raise the chances that two humans connect with each other, though that is taking it a little far. (Note that I am not advocating producing low quality to inspire direct contact.)

TOOLS: NOT ONLY TECHNOLOGY

For the remainder of this chapter, the word tools is used in a slightly different fashion than in previous chapters. So far it has been used to depict contributions within the ToolPool case study. But just as you have tools in a software environment, you can speak of tools to be used to enhance organizational knowledge flow. As you will see, some of those “tools” are technology based and others are just methods that do not necessarily need any technology to support them.
The tools I discuss are only a selection and by no means a complete list of everything you could or should use. But these examples cover different aspects of a knowledge flow. Some of them will be familiar, others might be new, or you might not have thought of them in that way before. Some of the latest tools, such as blogs, wikis, and networking platforms, are not covered in this chapter but are discussed in the context of Web 2.0 and social media in Chapter 9.
Each tool is looked at from the point of view of how it can support the knowledge flow. To discuss any one of them in detail would take much more than a section of a chapter; however, there are a number of books dedicated to tools and concepts such as Communities of Practice and Storytelling, for example.1

COPS

Communities of practice (CoPs) are one of the most important concepts when it comes to enhancing knowledge flow, especially when there is considerable tacit knowledge involved in whatever the members of the community practice on a daily basis. CoPs are largely centered around humans and their interaction. Nevertheless, some people still manage to turn them into a primarily technological topic. Recently a colleague told me that he had this CoP for “topic X.” When I asked some more questions, it turned out that basically all he had was a mailing list around “topic X” that he was managing.
A mailing list might be one of many tools that you could use to support your CoP, but it is definitely not a CoP in itself. In the days prior to the Web, some mailing lists were probably a good backbone that helped a community exchange ideas. But the actual CoP is a group of people who practice common things or have some common understanding. The individuals within the CoP usually play one or multiple roles; the key is that membership is defined by that common interest or knowledge. Membership happens by invitation or self-selection, but not by organizational structure or mandate. Etienne Wenger, one of the fathers of CoPs, often refers to them as the way that work really happens. They are an organizational structure that can sometimes be invisible but nevertheless plays an important role in letting knowledge flow around the organization.
As a CoP is not bound to organizational structures, it can span the boundaries and silos that exist in the organization due to bureaucracy or politics. Because members of the community focus more on the topical issues, they find a common ground in the CoP that enables them to develop the type of trust needed to share across boundaries.
CoPs could exist completely without technology support based only on face-to-face interaction between community members. But in today’s global organizations, where scaling is one of the success factors, they are often spread around the globe. To permit the needed connectivity, a number of technologies, such as mailing lists, video or telephone conferences, and virtual meeting places, will play an enabler role.
Analogous to the need to drive a knowledge flow management initiative using support roles, a CoP needs drivership. It needs a passionate leader to get started and survive the bootstrap phase. But it also needs other supporting functions to survive long term.

SKILLS MANAGEMENT

In 1998 I was working at the European headquarters of SAS in Heidelberg, Germany. At the time we had a central organization that was doing product marketing but was also responsible for knowledge transfer from headquarters to the different countries across Europe. Knowledge transfer was done via central or local training and by sending headquarters experts into the offices to work side by side with local people. After some time it became clear that the headquarters function could not cover all the engagements and that it would be better to add local experts to the pool of those going out to support projects.
Two issues prompted us to look into some type of skills management system:
1. How should you plan the training? How many people are up to speed with the latest products and solutions?
2. Where would you find those local experts whom you might want to lend out to those offices that need them?
To answer those questions and get a central overview, we needed some way to collect skill levels from local consultants. So the idea of a skills database was born. The very first version of this skills database was very simple, and the technology support was rudimentary. We basically had consultants fill in spreadsheets with a list of skills and a simple rating category (Novice/Advanced/Expert/Guru); then local personnel would transfer that data into a simple Web application. The centrally stored data could then be used for searches, reporting, and analysis. The list of competencies was one-dimensional. The descriptions of the four levels were the same across all skills entered. We dealt with only a specific kind of staff: consultants. While the system was simple, we developed a number of processes that got more and more sophisticated with experience. The simple skills database worked well, but over time it turned out that there was definitely room for improvement.
The data entry performed by agents for the person who had the skill turned out to be cumbersome. The descriptions were too broad; we needed more ways of defining rating categories for different types of skills. There were some other areas where we identified ways to improve processes and the system itself.
An assessment with users (consultants, managers, and administrators) revealed that we would actually be best off designing a new system, taking the experience gained into account. That new system, the ESDB (employee skills database), was developed iteratively with constant involvement of the user community. The programming started out using a method called extreme programming, where two developers worked side by side for months to come up with a modular and flexible design that we are still building on today to extend the system. But the ESDB did not evolve on technical ideas and comments as much as it did through the constant feedback from its users.
One of the key ideas was to build standard profiles of skills for each of the staff groups. As those relate to certain types of jobs, we call them job families. Each member of a job family gets the same set of skills to choose from. Members evaluate themselves on a subset of the whole list (only those skills that apply), and their manager reviews and approves the profile before it can be searched. Based on people’s concerns, several levels of confidentiality were introduced, and special roles were created that have different views and capabilities within the ESDB.
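A rough sketch of how such job-family profiles might be modeled follows. The names, skill lists, and rating levels here are hypothetical simplifications; the real ESDB is certainly more elaborate, with confidentiality levels and special roles omitted here:

```python
LEVELS = ["Novice", "Advanced", "Expert", "Guru"]

JOB_FAMILIES = {                    # each job family shares one skill list
    "consultant": ["Product X", "Operating System Y", "Data Modeling"],
}

profiles = {}

def rate_skill(employee, family, skill, level):
    """Self-evaluation on a subset of the family's skill list."""
    assert skill in JOB_FAMILIES[family] and level in LEVELS
    p = profiles.setdefault(employee, {"approved": False, "skills": {}})
    p["skills"][skill] = level
    p["approved"] = False           # any change requires a new manager review

def approve(employee):
    profiles[employee]["approved"] = True   # manager reviews and approves

def short_list(skill, min_level):
    """Approved profiles at or above min_level: a short list to be
    qualified further with human judgment, not a final answer."""
    floor = LEVELS.index(min_level)
    return [e for e, p in profiles.items()
            if p["approved"] and skill in p["skills"]
            and LEVELS.index(p["skills"][skill]) >= floor]
```

The design choice worth noting is that a profile drops out of search results until its manager re-approves it, which is one simple way to address the data quality problem mentioned earlier.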
In the early years I went through a few discussions with knowledge management experts from other organizations about skills management systems in general. If people had used them, they usually said they did not get as much out of them as they had expected, mostly due to data quality issues and participation problems.
As mentioned in Chapter 5, by using multiple drivers and integrating the system into multiple initiatives (skills management, resource sharing, and training development), we have been able to overcome those problems to a large extent. With the success of the initiatives, our skills database has turned out to be a success as well.
What made it successful was an adjusted expectation of what skills management could deliver. If the expectation is that you will have every skill of those involved perfectly mapped at all times, you will easily be disappointed. If, however, you need a way to identify prime candidates for a short list to be qualified further using human judgment and interaction, you can be successful. It is another case of the human-technology continuum, where not shooting for 100 percent technology but striking a good balance between human activity and technology can be the key to success.

“KNOWLEDGE BASES”

In Chapter 1 we discussed the fact that there is actually no such thing as a knowledge base, because knowledge cannot be stored in a database. But it was also mentioned that it would be unrealistic to expect that this terminology will go away soon. One possible way to put the term back into the perspective of an organizational knowledge flow could be to envision a knowledge base as a database of information that could be used to create new knowledge and also serve as a repository of pointers to the one who knows.
If you understand that distinction and operate any initiatives involving such a knowledge base accordingly, a knowledge base actually might prove to be valuable. But this will be true only if the surrounding processes cover all the human and motivational aspects that enable the creation of new knowledge and if key stakeholders understand the nature of what they are working with.
Knowledge bases can be a supportive piece of a full initiative that enables scaling and global reach. Like any repository, they usually are only as good as the content that they contain. The value of that content could be in the contribution itself or the potential value of serving as a pointer to the one who knows. The emphasis needs to be on getting good information into the knowledge bases by motivating the right individuals to share key pointers to the knowledge they possess.

PORTALS

Intranet portals, sometimes referred to just as portals, are usually Web sites that bring together a range of sources into one Web page. As the name indicates, they are supposed to be the entrance to a wide range of information sources. But unlike a normal entrance, a portal actually represents not one but many doors, each door an opening to an information stream that could be kept internal or external to an organization but is brought together into one common interface. The portal itself would not store much of the information but only bring it together.
One of the key functions of portals is that they usually are customizable. They offer a standard set of presented sources that can then be altered by an administrator or the users themselves. And the modifications usually happen at different levels. You can change the actual information streams that are presented. As an example, you might be able to select from a range of internal streams (company news, sales status numbers, events, cafeteria menus) as well as external streams (general and industry news, competitor watch lists, weather). The portal designer or portal administrator can also lock some streams to ensure that they are always present.2 Customization means the users could:
Subset or customize the streams. A good example of a customized stream might be a subset on a news stream. Instead of all the news flowing in via that stream, the user could choose a subset based on a given condition.
Change frequencies and refresh rates. Instead of updating the stream whenever something new appears, the user could choose to get it updated only once a day.
Adapt the appearance. This could mean that the user would be able to move the different stream presentations (portlets) around on the main pane (the desktop). It could also mean customizing titles, colors, sizes, and so on.
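These customization levels can be sketched as a small data-structure exercise. The stream names and configuration fields below are hypothetical, not those of any particular portal product:

```python
DEFAULT_PORTAL = {
    "company news":  {"locked": True,  "refresh": "hourly", "position": 0},
    "industry news": {"locked": False, "refresh": "hourly", "position": 1},
    "weather":       {"locked": False, "refresh": "hourly", "position": 2},
}

def customize(portal, remove=(), refresh=None, positions=None):
    """Apply a user's changes; administrator-locked streams always stay."""
    result = {name: dict(cfg) for name, cfg in portal.items()}
    for name in remove:
        if not result[name]["locked"]:      # locked streams survive removal
            del result[name]
    for name, rate in (refresh or {}).items():
        if name in result:
            result[name]["refresh"] = rate  # e.g. update only once a day
    for name, pos in (positions or {}).items():
        if name in result:
            result[name]["position"] = pos  # move portlets on the desktop
    return result

mine = customize(DEFAULT_PORTAL,
                 remove=["weather", "company news"],
                 refresh={"industry news": "daily"})
```

In this sketch, the attempt to remove “company news” silently fails because the administrator locked it, while “weather” disappears and “industry news” is refreshed only once a day.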
Portals are often mentioned as a component in knowledge management as they provide a way to link into multiple knowledge management systems. But remember: What is represented remains information; it is not knowledge. The portal might open a way to interact with others, which in the end could result in some type of knowledge exchange via information. Also, as portals offer a range of information, they can be seen as a good tool to bring together the right streams to produce new knowledge.
Portals have become a lot more flexible in recent years, through the wider distribution of RSS channels.3
With the use of RSS and smarter portlets, it has become very easy to present streams in the portlet without any programming.
One thing to understand very clearly about portals is that the value is under the hood. To come back to the analogy of the car without an engine, this is a similar case. A portal might be great looking, but it is only a presentation layer. It lives and dies with the underlying information: the streams and how they are filled. For that matter, you could show a great portal in a demonstration that is filled with some example information. People see those portlets and can imagine how they might look with their own information, offering all those valuable resources that might be out there around their organization. But they often underestimate the effort needed to get from the demonstration to a functional, ongoing, and valuable portal. I have seen many demonstrations that overlook the need for those efforts and focus solely on the output side of the application. But how does all that useful information get into the portal? Who drives the strategy on what types of information streams should feed the portal and how they should be integrated and presented? Who is in charge of maintaining the streams themselves and their sensible integration? Who manages common dimensions in a way that allows some of the streams to be combined?
Specific support is needed to drive the quality, deal with user issues, and constantly adapt the strategy to fit with what users need.
Customization has been discussed in earlier chapters. It has great potential, but a lot of people will not attempt it. Instead, they expect the perfect interface to magically appear, or they just suffer through inefficient processes pushed onto them via the technology.
My recommendation for portals is this: Make sure there is a strong support group that has typical knowledge intermediary skills beyond technical understanding. You will need an experienced information architect to deal not only with the portal itself but with the underlying information infrastructure that will be the key to the portal’s usefulness. And I would propose good ongoing training that includes not only general sessions about what is in the portal and how it can be used, but also personal trainers to help users get the most out of the possibilities the portal might offer.

OPEN SPACE TECHNOLOGY

The last two tools discussed had a strong technology support component. But knowledge flows most effectively when people are in close proximity. The following is a method used specifically to transfer knowledge not via remote means but when people are actually in one room: Open Space Technology (OST). The term technology is actually somewhat misleading, as OST is more a methodology than a technical concept.
OST is a special method for a gathering (e.g., a meeting, seminar, or even a full conference). But in contrast to traditional events, the organization of an OST event is quite different. The term open mainly refers to the agenda. While a traditional event might have a defined agenda with an occasional free-form element, such as a break, an OST event is pretty much the opposite. Topics usually are not chosen beforehand but are brought to the table by the participants. Any interested participant stands up and presents a topic, question, or struggle that she is interested in discussing in a group. The main facilitator guides the process to arrive at a sensible number of proposed topics. It depends on the number of people in the room, but, for example, for a group of 30 people you might have four or five topics. Each person who proposed a topic moves to a table, usually in the same room, and puts up a sign to indicate the original topic at that table. Participants then swarm out to the tables to form groups of roughly equal size.
One interesting rule is that people are not bound to stay at a table but can change tables whenever they like. They can also take a break, if they feel like it. (For more information on Open Space Technology, there are a number of books and articles.)4
From a knowledge flow management point of view, OST events are very effective. One prerequisite for effective knowledge sharing is that people are eager to discuss a topic that they are strongly interested in. Within the OST format, the likelihood that those really interested in a topic come together is much higher than in other forms of knowledge exchange.
Participants change tables to make sure they find the discussions beneficial and valuable.
Ideally an OST event starts with some activity to build basic trust between participants, which again makes interaction on a higher level more likely in the groups.

SEARCH

Search is another topic that could take up several books. In regard to an organizational knowledge flow, internal search engines play an important role. What search engines do is provide a quick way into a range of information that has not necessarily been preorganized in any fashion. Search also offers another way to find information apart from browsing. You could, for example, have a range of documents, e-mails, or Web pages that are structured to be browsed along certain dimensions, such as topic, time, or relevance. But browsing is not always the most effective way to search. If hierarchies get more complicated and the number of elements becomes large, or if the person searching does not like topical browsing, search will be a better way to find what is sought.
If elements found via search are properly attributed and point to those who had the knowledge when producing each specific element, search can also produce a list of pointers to the person with the knowledge. Search engines usually present a range of challenges to users, mostly related to the high number of result items. From a human side, these aspects need to be taken into account:
• Often it is hard to specify the right context in a way that search results really offer what you are after. There are some attempts to enhance search with semantics to improve the results.
• Most individuals these days rarely scroll beyond the first page of a search result. If the relevant or “correct” information is not on that page, the searcher might actually build knowledge from incorrect or outdated content.
• Results might be removed from proper context. An example would be a document that originally was linked from a Web page. The Web page might give some context, such as a disclaimer or positioning. Finding only the document without that context might lead to misinterpretation of the information it contains.
• The relevance of search results can be influenced by special features, such as highlighting of key words or phrases.
Even a purely technical tool like search requires people who understand the issues, can train others to use it properly, and can build an ongoing strategy around it in an organization.
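The idea that search results can double as pointers to the people who hold the knowledge can be sketched as a toy inverted index. This is a minimal illustration only; the sample documents, field names, and the `search` function are assumptions made for the example, not a description of any particular search engine.

```python
# Toy sketch: an inverted index whose results also point to document
# authors, so a search doubles as a "who knows about X?" lookup.
from collections import defaultdict

documents = [  # illustrative sample data
    {"id": 1, "title": "Pricing FAQ", "author": "Anna", "text": "pricing rules for renewals"},
    {"id": 2, "title": "Renewal process", "author": "Ben", "text": "how renewals are processed"},
    {"id": 3, "title": "Support handbook", "author": "Anna", "text": "support escalation rules"},
]

# Build the index: each word maps to the set of documents containing it.
index = defaultdict(set)
for doc in documents:
    for word in doc["text"].lower().split():
        index[word].add(doc["id"])

def search(query):
    """Return matching documents and the people behind them."""
    ids = set.intersection(*(index.get(w, set()) for w in query.lower().split()))
    hits = [d for d in documents if d["id"] in ids]
    people = sorted({d["author"] for d in hits})
    return hits, people

hits, people = search("renewals")
print([d["title"] for d in hits])  # ['Pricing FAQ', 'Renewal process']
print(people)                      # ['Anna', 'Ben']
```

The second return value is the point: even when the documents themselves are outdated or lack context, the search still tells you whom to ask.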

STORIES

There are many ways in which stories play a role in an organization. Stories are tools that do not need to be introduced; they are always there already. But there are different ways to deal with them: you could just ignore them, try to fight some of them, or use them strategically. No matter which way an organization chooses, stories will always play a key role in transporting a certain type of knowledge. With stories, it is not so much the information that gets shared but certain principles or analogies that carry a message. A good story always holds some meaning that cannot be put into plain informational text.
Some great work by Steve Denning from the World Bank and David Snowden from Cognitive Edge5 explains how stories apply in a business environment and how they can be used to drive organizational behavior. Stories often prove to be a lot more effective in transporting a message and getting people to act than just presenting these people with information (i.e., in a presentation). I strongly recommend exploring stories further by reading the available literature on storytelling in business.6

KNOWLEDGE TRANSFER SESSIONS

In Chapter 5, I discussed that you need to embed knowledge into the organization to make it harder to be lost when certain experts leave. One method that has been used at SAS is knowledge transfer sessions. The method is based on a similar method that Rolls-Royce Aerospace had used and that one of its key knowledge management experts shared.
The typical issue: A key person leaves the department, division, or the organization as a whole, and there is not much time between the announcement and the actual last day in the office of that person. How can you get some of his or her knowledge embedded or captured for later use?
Of course, in an ideal situation, you would have several months or even years for people to shadow the expert to transfer some of the knowledge to them, but realistically, the time frames are much shorter. Knowledge transfer sessions are a minimalistic approach to this problem, acknowledging that keeping some knowledge might be better than keeping nothing.
Here are the simple rules:
• The person who is leaving, the main actor, sits in the front of the room in the hot seat. In a small circle around the main actor are between 6 and 10 questioners and 1 or more facilitators. The facilitator role is very important; the person should be experienced in facilitating discussions and also have at least a basic understanding of the topic of expertise represented by the main actor in order to ask specific questions in case the conversation slows down.
• The questioners are the manager, all or a number of key subordinates, and some other colleagues with whom the main actor had ongoing contact in the job. Questioners should come somewhat prepared with questions. Even if they do not have time for preparation or are called in at the last minute, they can still be of value. They should not feel too much pressure to ask only smart questions.
• The typical session should not be longer than 90 to 120 minutes. If more time is needed, the sessions can be split. But since this is a pragmatic approach, one session might be all the time you can schedule for the person who is leaving and all the other people needed.
• The main actor is asked beforehand whether she objects to being taped during the session. If she does not have any issues with that, a small camera should be set up to record the session. If you can, it would be great to have somebody direct the camera to focus on those talking; if not, set it at a wider angle and focus primarily on the main actor.
• The session begins with the facilitator interviewing the main actor, saying things like:
• Please describe a typical daily routine of tasks you would perform in your job.
• Tell us about what you liked most about your job.
• Tell us about those things you liked least about your job.
At any time, questioners are encouraged to ask clarifying questions, bring up special cases, or ask those things they always wanted to know but never got around to asking.
The main actor is also encouraged to tell stories from the most memorable cases she encountered.
The questioners can then go on to ask questions they had written down beforehand or those that come up based on things other people say.
I have personally facilitated a few knowledge transfer sessions and was always amazed at the depth of the discussions and the astonishingly positive atmosphere that could be created.7 Questioners found the sessions very valuable and were surprised at how much they learned in the short time frame. It is also amazing how, in almost all those cases, the main actors were positively surprised by the event. Talking about all the things they have achieved and done, making some of their knowledge more visible to themselves and to others, produced a mix of positive feelings. Often others are surprised at the range of things that the actor had to deal with. Usually people do not get time to present themselves in stories and experience sharing, not even for one to two hours.
Other than the camera, there is no technology involved for the actual session when using this tool. If the actor agrees, the recordings can be cut into sensible and easy-to-digest chunks and made available to successors or new colleagues.
As the examples in the previous sections showed, tools can range from the technical to the people oriented. But even with technical tools, the key is to find ways of using them that really enhance the knowledge flow, so that they do not just become information graveyards.
Some of the tools discussed in this chapter (CoPs, skills management, Open Space, and stories) can provide business benefits to organizations by enhancing the degree to which knowledge can flow directly between staff. Other tools (knowledge bases, portals, or search) provide benefits by allowing larger scaling or offering collections of pointers to the one who knows. In both cases the increased knowledge flow can have considerable positive effects.

NOTES

1 Appendix B lists some of my favorite books regarding those tools.
2 Locking would mean taking away the possibility to remove them.
3 RSS (Really Simple Syndication) offers the possibility to address a stream of information (usually produced in an XML format) very easily via a fixed address—a specific type of Web address or URL. The key is that the stream might change in place but is immediately fed to all the places on the Web that have created a link to that address.
4 For an example see Harrison Owen’s book Open Space Technology: A User’s Guide (San Francisco, CA: Berrett-Koehler, 2008).
5 See Stephen Denning’s book The Springboard: How Storytelling Ignites Action in Knowledge-Era Organizations (Woburn, MA: Butterworth-Heinemann, 2001) and some articles on narrative by Dave Snowden at www.cognitive-edge.com/articlesbydavesnowden.php.
6 A more recent book on storytelling from Stephen Denning is The Leader’s Guide to Storytelling: Mastering the Art and Discipline of Business Narrative (San Francisco, CA: Jossey-Bass, 2005).
7 You would organize these events only with people who leave the organization under friendly conditions. Be careful with highly frustrated individuals who are not able to keep to business-relevant topics.