Knowledge Newspeak
In George Orwell's "1984," the totalitarian government achieved its control in part through what today we might call "LM," language management. Newspeak is most famous for redefining elemental words, so that "peace" means "war." Today's knowledge managers would never stoop to such tactics; they leave that to marketing. But there's another aspect of Newspeak that is more relevant to KM: In Orwell's dystopia, every edition of the dictionary had fewer words than the one before it, for Orwell's nightmare government achieved control of ideas by driving out ambiguity.
Now, it would simply be a logical flaw, as well as vastly unfair, to conclude that KM systems that are intolerant of ambiguity are totalitarian. It'd be like saying that KM must be Kafkaesque because both begin with K. But there is a lesson here about the value of ambiguity and about the relationship of clarity and control.
The problem is that clarity is always desirable while ambiguity rarely is: You don't have to explain why you're being clear, but you do have to explain why you're being ambiguous. Or so it seems. In fact, a different "language game" is being played. For example, if we were having a beer, grousing about our bosses, and I were to whip out a manila file folder stuffed with seven inches of particulars—names and dates for every little offense to my sensibilities—something would have gone wrong with our discussion. Similarly, if I complained about the guy who was tailgating me this morning and obsessed over whether he averaged 38 inches or 39 inches from my rear bumper, you'd know that I don't really understand how humans talk.
The fact is that ambiguity is the normal case for how we talk and precision is the exception. This sounds odd because we carry around a model of language that says that words are arbitrary sounds with precise meanings. Sure, many of them do have precise meanings, but most of us couldn't articulate those meanings without looking them up in a dictionary. For example, tell me precisely what the word "public" means. Or try something simpler: "smile." Even when words have precise meanings, we use them ambiguously, with associations, connotations and context trailing them like a comet's tail. They are more like gestures that point us at something in the world than like codes that we translate by looking them up in a code book.
We adjust our need for clarity to the situation. If we're two friends sitting around, we can say a lot in a few words. If we're writing the technical documentation for how to fix a jet engine, we will use a precisely defined, standardized vocabulary to describe each operation. Even then, however, we'll assume a lot: We may specify that you turn the bolt a quarter turn to the right using a specific wrench, but we won't explain which end of the wrench to use.
If we generally get the mix of clarity and ambiguity right, where do we get it wrong? The failure to be sufficiently clear often occurs because one is insufficiently attuned to the reader's context: "When the documentation said 'Pull the lever,' I just assumed that it meant 'Pull it toward you.' " Failure to allow ambiguity, however, often has a political motive, for—unless it's presented as a genuine question—it can be a way to shut down talk and ideas, as if ambiguity were a sign of weakness of thought rather than of its richness.
Getting the balance between clarity and ambiguity right is essential for a KM system. Being ambiguous where clarity is required can be dangerous as well as wasteful. But being clear where ambiguity is required kills creativity and, in many instances, is motivated by a fear of losing control ... as Orwell understood so clearly.