FOREWORD

Many years ago, the company I worked for had just been bought by IBM as one of the basic elements of what was to become IBM Global Services. I had been a part of it for well over a decade, rising from a developer working on decision support systems through to general management and a leading role in strategy. The GENUS program I put together combined Rapid Application Methods, Legacy System Management, and Object Orientation in a synthesis that had sufficient novelty to win marketing awards against Microsoft’s latest release of Windows and was also credited with being a significant part of the turnaround of the company, which made us attractive to IBM in the first place. Freed by the acquisition into a free-floating “do whatever interests you” role with some important top cover, I had a chance to go back to my origins in decision support and pick up the reins again with the emerging discipline of Knowledge Management. That led to my joining Larry Prusak at the Institute for Knowledge Management, working on counterterrorism before and after the tragic events of 9/11, and eventually into complexity theory and the newly formed Centre for Applied Complexity at Bangor University in my native North Wales.

Now, I tell that story not just to set a context for what follows, but also because it tells some of the story of knowledge management over the years, a story that continues in fits and starts, but continues nevertheless. Lesley’s book, for which I am honored to write the foreword (and flattered to be referenced therein), is a significant contribution to the development of the field, but as importantly to an understanding of its journey. My goal here is not to summarize her argument but to complement it. I do not agree with the idea that Discourse Analysis is the solution—in fact, that is not what Lesley is suggesting—but rather, as she notes, it is a valuable and neglected aspect of the field that could extend the directions of current thinking. I also think that she has made a significant contribution simply by pointing out the absence of coherent theories of language from most thinkers and practitioners in the domain. Her work and analysis would be important for that insight alone, but she also goes on to expand the theory and practice of one approach to rectify that omission.

So what do we know about knowledge in organizations? There is little dispute that it is a critical aspect of service provision, competitive advantage, strategic development, and so on. Hence, the value is rarely disputed; however, the practice of knowledge management is controversial in theory, practice, and adoption. The reality is that most knowledge management initiatives rarely survive in the long term, or at best become a subsection of Information Management within the IT function. A variation in professional services companies sees it manifested as the modern version of what used to be a library or registry. But it is no longer strategic as a function, despite being strategic as a practice. In the early days, we had Directors of Knowledge; now, they are few and far between. Not only that, companies seem to go through multiple adoption and abandonment cycles. In one industry that will remain nameless to protect the naively innocent, I have seen three knowledge management teams arise and two fall over similar time scales. In each case, the team, having established a reputation on the conference circuit, went on to form a consultancy unit that sold recipe-type approaches based on its self-reported successes. I confidently expect the third team to follow in due course. The various institutes rose and died in their turn. But despite this, people keep coming back; the trouble is that by the time they come back, they have forgotten (the supreme irony for knowledge management) what went wrong last time. Thus, they are doomed to repeat the same follies with the same inevitable result.

Lesley correctly points to a failure of definition in the field as a reason for this, or more specifically the brutal fact that the dominant paradigm is to treat knowledge as an object, with tacit knowledge an inconvenience of little value that walks out of the door each night until it is codified in digital form, at which point it becomes an asset. That paradigm traces back to The Knowledge-Creating Company and Nonaka’s SECI model, now rebranded as Ba. There are two facts about that book that people tend to neglect:

  1. The book was never intended to start a movement; it was an attempt to document the process of knowledge creation in product-based manufacturing. In that context, the process of observing and understanding a skill (such as that of the much-quoted baker) and transferring that insight into an explicit form that will allow a product to be manufactured makes a lot of sense.
  2. The publication coincided with the height of the Business Process Reengineering movement, the directly related rise of Enterprise-Wide Resource Planning systems, and the parallel growth of Management Consultancy from a small craft skill to a manufacturing process in its own right. The assumption was that knowledge management would follow a similar vector, with a focus on consultancy-led standardization enforced by technology-based augmentation (or more frequently replacement) of human agency.

The net result was that knowledge management as a discipline started in the wrong place. The problem was made worse by the obsession with case-based approaches in management science. That meant that academics who had knowledge that might have prevented the lapse into the objectivization of intangibles were not engaged. A very few of us, with a background in Philosophy, realized at the start that the paradigm being adopted was deeply limited, but we were voices crying in the wilderness.

One early method I created in an attempt to stem the flow toward codification was a simple form of mapping, together with a perspective question to force people to think about the issue through a more diverse set of lenses. The process involved self-ethnography, with employees reporting their decisions and the associated information flows. The results were then clustered, like with like, and the information flows consolidated. The result was rather like a spider’s web in the early morning after a light rain: you could see a coherent pattern, but it was messy. We then went to each decision cluster and asked five questions of each decision:

  1. What artifacts were used?
  2. What skills were needed?
  3. What heuristics or rules of thumb came into play?
  4. What experience is critical?
  5. What natural talent exists that simply makes some people better at this than others?

Known as the ASHEN model, it continues in use (often in modified form) to this day. The goal of the ASHEN questions was to look at knowledge from multiple perspectives, but in such a way that people would realize that some things simply could not be codified and that, in consequence, employee retention was more effective than codification. We also compared the process map with the decision map to show gaps between actual practice and formal process, and then matched the knowledge objects (any grouping of ASHEN aspects coherent enough to be managed) against core business goals. From the dependency maps that resulted, we ended up with portfolios of pragmatic knowledge projects. A knowledge management program thus became a portfolio of knowledge projects that emerged from day-to-day practice, informed by strategic needs. That method is still in play and, if anything, is growing in use.

The process of engaging in ethnography around decisions also produced an accidental effect. We found that decisions were best revealed in stories, so we went hunting for those. Not the grand stories of workshops or interviews, but the day-to-day stories of practice that inform and instruct. From those, we could extract the decisions. As a group, we came to the story from the perspective of discovery, not communication, and that produced one of the genuinely novel approaches of the last two decades, namely, scalable or distributed ethnography, now manifested in the SenseMaker® product. With the benefit of hindsight, it was inevitable that narrative forms of knowledge retention, capture, and distribution would emerge. We all come from cultures in which oral history dominated for ages. In work with Boisot, we identified that narrative acts as a transitionary device between the purely tacit knowledge of the person and the explicit knowledge of the database. Since then, that approach has extended to Development Sector evaluation, Patient Journeys, preradicalization monitoring, and understanding of entrepreneurial culture. That is to name only a few of what are now myriad applications, some of which might have been labeled knowledge programs in the past but these days stand on their own.

While narrative is a form of language, I accepted Deacon’s (The Symbolic Species) and others’ refutation of Chomsky’s idea of grammar being genetic. We realized that narrative carries with it essential ambiguity and constant change. That can be interpreted by Discourse Analysis, but not in real time. So we moved to self-created, high-abstraction metadata to allow novel forms of capture, interpretation, and advocacy-based solutions. That meant we could allow field engineers to capture narratives on the go, rather than write reports or be forced into a community of practice. The self-signification meant that recall of fragmented knowledge across silos became easy without the formal structures of taxonomies and Communities of Practice. We started to move to peer-to-peer knowledge flow, allowing conceptual blending of diverse fragmented memories and observations to come together, in the context of a need, to create a real and novel solution.

Also, the fragmented, loosely coupled nature of micronarratives (as they became known) required new theories of systems and an encounter with the literature of Complex Adaptive Systems theory. Known as the science of uncertainty, complexity deals with systems that have no linear causality and thus cannot be engineered toward goals. Instead, they are dispositional and need to evolve; the management of evolution is a very different process from that envisaged by most engineers. That led to the Cynefin framework, which appears in two award-winning articles: Complex Acts of Knowing and A Leader’s Framework for Decision Making. Lesley references the framework later in this book and kindly shows its capacity to embrace conflicting theories in a single framework by recognizing different states or types of causality that permit or disallow different approaches to both understanding and management, not just of knowledge but also more widely.

I have recounted this as a narrative of accidental discovery, as that is what happened. But the discoveries were informed by reading, by reaching out to experts from many fields, and by discussion with them. At a seminar at Mussolini’s former palace on Lake Garda, now the conference center for the University of Milan, I found that the process had a name, namely, exaptation. The contrast with adaptation is deliberate. A dinosaur’s feathers evolve over time for warmth or sexual display; that is a linear adaptation. Then we get a nonlinear exaptation for flight: it would not have evolved on its own; it required something to develop for another purpose first. In the same way, the cerebellum adapted over time to do fine-grained manipulation of muscles, allowing seeds to be picked from seed pods, but then that capability exapted to allow the sophistication of grammar in language. Art also precedes language in human evolution, allowing limited neuron clusters to handle abstract concepts rather than simply naming things. Human language is a glorious accident, a key knowledge component, but one that delights in ambiguity and meaning change.

So at its best, knowledge management has created a new form of generalist in a world of increasing specialization. That capability goes beyond the simplistic tacit-to-explicit codification of knowledge that dominates too much practice. But to embrace that capability, we have to develop a capacity to manage inherent, irreducible uncertainty and complexity. That requires a new simplicity and a humility before the glories of human evolution: knowledge that can be articulated and communicated, but which also allows and enables chance discovery, abductive rather than inductive reasoning, and the ability to fall into discovery by accident. Mary Midgley, one of the great British philosophers, wrote a wonderful book, Science and Poetry, in which she says more about knowledge management and the role of aesthetics in effective knowledge creation than many a dedicated textbook. Then, as to management, well, maybe we need to go back to the original meaning of the word. I quote from an article I wrote with Kurtz some years ago:

the English verb “to manage” was originally derived from the Italian maneggiare, meaning to handle and train horses. …the emphasis is on learning with, abiding with, adapting to, respecting, and working with another complex entity: the horse and rider as co-evolving brambles in a wider thicket of social traditions surrounding beauty and form. Around the early 18th century, this original meaning merged with the French term ménage, or household, making it easier to adapt the meaning of the combined term manage to the metaphor of the obedient machine, to the corridors of power, and to the actions of controlling and directing.

Thinking of knowledge management as riding a horse is a useful metaphor. Continuing to develop transdisciplinary approaches to the field is vital. Lesley has made a significant contribution to that, and I commend the book and am privileged to have been allowed to write the foreword.

Professor David Snowden

Founder and Chief Scientific Officer of Cognitive Edge

Director of the Centre for Applied Complexity at Bangor University
