A deep future approach to KM
The word “deep” has been gaining popularity. Deep learning. Deep medicine. Deep science. And the latest hot topic in the political sphere, the “deep state.” All reflect a growing desire to understand that which is hidden from view and which gives rise to what we see happening on the surface. Although not discussed quite as often, we need to give equal consideration to the deep aspects of time.
We’re familiar with the near-term portion of the time spectrum—from femtosecond lasers used in eye surgery to high-frequency trading in milliseconds on the major securities exchanges. Unfortunately, the extreme opposite end of the time spectrum, the “deep future,” receives little if any attention. Decisions in fields such as genetic engineering, nuclear energy, geopolitics and the like can have serious implications for human civilization. But the impact of those decisions might not become apparent for many thousands of years and hundreds of generations.
We’re not talking about predicting the deep future. We have a hard enough time predicting day-to-day events. But we can take steps to ensure the knowledge that goes into the decisions we make today is carried forward and evolves along with the event chains those decisions set in motion. To do this, we need a new weapon in our KM arsenal: the “knowledge continuum.”
From preservation to perpetuation
We’re quite familiar with the problems of knowledge loss, especially of the digital variety, as formats and technologies continue to change every few years. In response, we’ve taken steps such as digital records management and information governance to preserve our critical knowledge artifacts. These practices play a key role in societal evolution, including the ability to look back and draw upon lessons learned in order to go forward without making the same mistakes. However, changes in context may render an archived body of knowledge irrelevant, especially over long periods of time.
“Knowledge perpetuation” extends these practices to include not only the preservation of knowledge, but also the curation of its evolution. We define knowledge perpetuation as the capacity for extending and projecting actionable knowledge over long periods of time spanning multiple generations and even multiple millennia.
This demands the creation of a special type of knowledge base known as a knowledge continuum. Beyond preservation alone, knowledge curated within a knowledge continuum must be able to be recontextualized and correctly interpreted by future generations.
Contextualization deals with the values, norms and goals of the greater societal structure that influenced the decisions that generated the knowledge in the first place. Recontextualization deals with changes in societal values, norms and goals that necessitate re-examining and, if necessary, modifying an existing body of knowledge. In this way, knowledge continuums have the potential to provide broad, long-term benefit to society as a whole.
For knowledge to remain usable, it must also remain “interpretable.” Future users need to be able to understand knowledge in both its historical context (how and why it was created) and future context (how and why people will continue to interact with the knowledge).
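To make this concrete, here is a minimal sketch, in Python, of the metadata a knowledge-continuum entry might carry so that it can be preserved, recontextualized and kept interpretable. The class and field names and the sample entry are illustrative assumptions, not a published schema or standard.

```python
# Minimal sketch (illustrative only) of a knowledge-continuum entry: the artifact
# travels with its creation context plus an append-only log of re-examinations.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class ContextSnapshot:
    """Societal values, norms and goals in force when knowledge was captured or revisited."""
    recorded_on: date
    values: list[str]
    norms: list[str]
    goals: list[str]
    notes: str = ""


@dataclass
class ContinuumEntry:
    """One curated body of knowledge plus the context needed to interpret it later."""
    identifier: str
    content_location: str              # pointer to the preserved artifact itself
    creation_context: ContextSnapshot  # how and why it was created (historical context)
    intended_use: str                  # how future users are expected to interact with it
    recontextualizations: list[ContextSnapshot] = field(default_factory=list)

    def recontextualize(self, snapshot: ContextSnapshot) -> None:
        """Record a re-examination of the entry against current values, norms and goals."""
        self.recontextualizations.append(snapshot)


if __name__ == "__main__":
    entry = ContinuumEntry(
        identifier="nuclear-waste-siting-rationale",
        content_location="archive://repository/decision-records/0001",
        creation_context=ContextSnapshot(
            recorded_on=date(2024, 1, 1),
            values=["intergenerational safety"],
            norms=["deep geological disposal"],
            goals=["isolate waste for 10,000+ years"],
        ),
        intended_use="Explain to future custodians why the site was chosen and what must not change.",
    )
    entry.recontextualize(ContextSnapshot(
        recorded_on=date(2124, 1, 1),
        values=["intergenerational safety", "reversibility"],
        norms=["retrievable storage"],
        goals=["allow recovery if better remediation exists"],
        notes="Norms shifted toward retrievability; rationale re-examined and still valid.",
    ))
    print(len(entry.recontextualizations), "recontextualization(s) recorded")
```

The design point is simply that the knowledge never travels alone: every entry carries the context of its creation and a running record of how later generations have re-read it.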
To date, humankind seems to have a poor track record at engineering true long-term knowledge continuums. The challenge is to build something that will last into the future for a time longer than all previously recorded human history. The Long Now Foundation is answering the challenge with several interesting projects, including a 10,000-year clock, and the Rosetta Project, which aims to permanently archive all documented human languages.
The challenges
So what’s stopping us from creating as many knowledge continuums as we want, whenever we want? Three major challenges are: records management policies, knowledge artifact design and ontology management.
Records management policies essentially define what to keep for how long and what to dispose of and when. Existing policies and practices can serve as a starting point. However, they must be re-examined in light of long-term perpetuation.
In the United States, for example, the National Archives establishes the policies used to determine which federal records have archival value. These include records deemed to have an enduring historical or other value that warrants preservation beyond the period needed to support the activities for which they were originally created.
Although such policies legally apply only to certain elements of the federal government, they are often adopted as a de facto standard for a wide range of records management practices. However, these same policies also carry the implied expectation that most information does not retain its value in perpetuity. For records that don’t meet the “archival” criteria or that are no longer needed, disposition equates to permanent destruction.
Solution: Align current records management policies and practices with the capability for knowledge perpetuation.
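As one way to picture that alignment, here is a minimal sketch, assuming a purely hypothetical retention schedule, of how a “perpetuate” disposition could sit alongside the familiar retain, destroy and archive outcomes. The record series, retention periods and rules are invented for illustration and are not drawn from any actual National Archives schedule.

```python
# Sketch of a retention schedule extended with a "perpetuate" disposition
# (all series names, periods and rules are hypothetical).
from datetime import date, timedelta

RETENTION_SCHEDULE = {
    "routine-correspondence":  {"retain_years": 3,    "disposition": "destroy"},
    "facility-design-records": {"retain_years": 75,   "disposition": "archive"},
    # New category: decisions whose consequences outlive any fixed retention period.
    "long-horizon-decisions":  {"retain_years": None, "disposition": "perpetuate"},
}


def disposition_for(series: str, created: date, today: date) -> str:
    """Return what should happen to a record today under the (hypothetical) schedule."""
    rule = RETENTION_SCHEDULE[series]
    if rule["disposition"] == "perpetuate":
        return "transfer to knowledge continuum and schedule periodic recontextualization"
    if today - created < timedelta(days=365 * rule["retain_years"]):
        return "retain"
    return rule["disposition"]


print(disposition_for("long-horizon-decisions", date(2020, 5, 1), date.today()))
print(disposition_for("routine-correspondence", date(2020, 5, 1), date.today()))
```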
Knowledge artifact design has come a long way since the days of cuneiform tablets. But advances in packing information onto increasingly dense media have brought tremendous increases in volume and complexity, which can impede the long-term recoverability of actionable knowledge.
Solution: Improve the design of physical knowledge artifacts to meet the longer-term needs of knowledge perpetuation by achieving a balance across density, complexity and recoverability. This includes recognizing when conditions have shifted to the point where a “refresh” is needed, before a particular knowledge artifact becomes unintelligible.
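One way to picture such a refresh trigger is a simple check that flags an artifact when its encoding format is no longer supported, or when too long has passed since it was last verified as interpretable. The format registry and review window below are illustrative assumptions, not an established preservation rule.

```python
# Sketch of a "refresh" check: flag an artifact for migration or re-verification
# before it stops being understandable (thresholds and statuses are assumptions).
from datetime import date

FORMAT_STATUS = {          # hypothetical registry of encoding formats
    "pdf-a":    "supported",
    "tiff":     "supported",
    "wordstar": "obsolete",
}

REVIEW_WINDOW_YEARS = 25   # assumed maximum interval between interpretability checks


def needs_refresh(encoding: str, last_verified: date, today: date) -> bool:
    """True when the artifact should be migrated or re-verified now."""
    if FORMAT_STATUS.get(encoding, "unknown") != "supported":
        return True
    return (today.year - last_verified.year) >= REVIEW_WINDOW_YEARS


print(needs_refresh("wordstar", date(1998, 6, 1), date.today()))  # True: format obsolete
print(needs_refresh("pdf-a", date(2015, 6, 1), date.today()))     # False: still current
```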
Ontology management means keeping context fresh. The objects and artifacts through which knowledge is encoded are embedded within layers of context. Anne J. Gilliland-Swetland, author and UCLA archival studies professor, has identified the key attributes of ontology management as determining which information is created, organized and used; its organic relationships with other information objects; and the characteristics that provide meaning and evidential value.
Solution: For knowledge continuums to have long-term viability and stability, they will require mechanisms that provide active replenishment and renewal. Knowledge continuums must be able to evolve in a way that is both compatible with the society of which they are a part and consistent with the knowledge they perpetuate. This type of “directed evolution” will require an awareness of what has worked in the past and the ability to anticipate (and plan for) the future well-being of the continuum, while maintaining the validity of the knowledge.
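Here is a minimal sketch of what such “active replenishment” might look like in practice: an ontology that maps the terms used in archived artifacts to current definitions and related objects, with a renewal pass that flags concepts not re-examined within a set horizon. The terms, links and horizon are illustrative assumptions, not a formal ontology standard.

```python
# Sketch of an ontology renewal pass: surface concepts whose context has gone stale
# (all entries and the review horizon are illustrative assumptions).
from datetime import date

ONTOLOGY = {
    "disposition": {
        "definition": "Final action taken on a record: destroy, archive or perpetuate.",
        "related_objects": ["retention schedule", "knowledge continuum"],
        "last_reviewed": date(2010, 3, 1),
    },
    "evidential value": {
        "definition": "The capacity of a record to document the activity that produced it.",
        "related_objects": ["provenance", "creation context"],
        "last_reviewed": date(2023, 9, 1),
    },
}

REVIEW_HORIZON_YEARS = 10  # assumed interval for re-examining each concept


def terms_due_for_renewal(today: date) -> list[str]:
    """Concepts whose definitions have not been re-examined within the horizon."""
    return [term for term, entry in ONTOLOGY.items()
            if today.year - entry["last_reviewed"].year >= REVIEW_HORIZON_YEARS]


print(terms_due_for_renewal(date.today()))  # e.g. ['disposition']
```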
Actions to take
A fun way to get started is to consider making your best prediction about the deep future and placing a “long bet” on the Long Now Foundation’s Long Bets platform. Other, more serious steps include:
♦ making knowledge perpetuation as well-recognized and accepted as the current practice of physical preservation;
♦ advancing the science and methods needed to support knowledge perpetuation and the design, construction and curation of knowledge continuums; and
♦ taking an active role in helping to ensure that knowledge perpetuation is included as part of national-level science policies and priorities.
We are in the very early stages, which is both exciting and frightening, given the long-term implications. As KM professionals, we are ideally positioned to seize the opportunity to step up and lead the development of true knowledge stewardship. This means moving our legacy-minded organizations beyond a strictly preservation-oriented mindset to one that emphasizes perpetuation.
Whether the knowledge artifacts we leave to future generations will be all that much better than the faded legends, inscriptions and hieroglyphics left to us by our ancestors will depend on what we do now. Let’s not keep leaving it to the archeologists to try to rediscover the knowledge of the ancients. Remember, when it comes to the deep future, we are the ancients.