
Everything is fragmented—Managed serendipity

I’m old enough to remember the excitement surrounding object orientation (OO) when it first started to gain some traction in the IT community. I was responsible for a team that put together the Genus Program, an early attempt at achieving an evolutionary approach to system development. We linked OO with legacy system management and the very early stages of rapid application development (RAD). The idea was to avoid the trauma of all-or-nothing large system procurement and design, and shift to something more dynamic, flexible and evolutionary in nature.

That was well over a decade ago. The ideas were ahead of their time and got lost in the mass enthusiasm for enterprise resource planning (ERP) systems linked to business process re-engineering (BPR). The technology that could support the radical ideas of OO, and the even more radical idea that the human side of systems could also be thought of as objects with inheritance and polymorphism, was only just developing. It is only in the last few years that I have felt we could renew that early hope, thanks to the capability of social computing and in particular the growth of multi-application environments. These days, on a personal computing platform at least, applications evolve from multiple components; they are not designed as a whole based on predefined outcomes.

Given that potential, I am still amazed at how much effort it takes to overcome linear approaches to solution definition. Most people are taught to start with the customer, find out what the customer wants, seek supporting methods and tools that match those requirements, and implement them. As a result, clients get what they asked for (or rather what the analyst understood they wanted), which is not necessarily what they need. At the same time, useful technologies are not taken advantage of because the client is unaware of their capabilities. The owner of the technology might also be unaware of its true potential.

Technology developed for one application frequently proves more useful for something unexpected. In evolutionary psychology, it's called a freeloader: a trait that arises as a byproduct of an environmentally induced need turns out to be useful in unanticipated ways, and natural selection then reinforces it. Some argue that human intelligence and language arose in that way.

What we need is to move away from linear requirements capture and instead increase the interaction between capability and needs, allowing applications and solutions to emerge through a series of safe-fail experiments, rather than some over-structured fail-safe design process. In the old days (I’m feeling my age), technology didn’t support that type of approach, but now it can.

One problem in adoption is that the governance processes of corporate IT are locked into the capabilities of the last century and need to catch up. Modern capabilities exceed our past imaginations. For example, we can now capture hundreds of thousands of fragmented stories, audio recordings, sketches, screen shots and URLs in the field, under fire, as people engage in their work … all for far less investment than is needed in traditional requirements capture. If we use the same interpretative metadata structure for those fragments as we do for our own capabilities as a supplier, we can look for overlaps and synergies that cluster and group, indicating new and unexpectedly beneficial applications.
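To make that idea concrete, here is a minimal sketch in Python of how such an overlap might be computed. It assumes only that field-captured fragments and supplier capabilities share one interpretative metadata schema (a set of signifier tags); the item names, tags, similarity measure and threshold are all hypothetical illustrations, not any particular product or method.

```python
# Minimal sketch: need fragments and supplier capabilities tagged with the same
# interpretative metadata; tag-set overlap suggests candidate applications.
# All names, tags and the threshold below are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Item:
    name: str                                            # a captured fragment or a capability
    tags: frozenset = field(default_factory=frozenset)   # shared metadata signifiers


def jaccard(a: frozenset, b: frozenset) -> float:
    """Overlap between two tag sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0


def candidate_matches(fragments, capabilities, threshold=0.3):
    """Pair each fragment with capabilities whose metadata overlaps it enough."""
    matches = []
    for f in fragments:
        for c in capabilities:
            score = jaccard(f.tags, c.tags)
            if score >= threshold:
                matches.append((f.name, c.name, round(score, 2)))
    # Clusters of high-overlap pairs hint at unexpected, beneficial applications.
    return sorted(matches, key=lambda m: -m[2])


fragments = [
    Item("shift-handover story", frozenset({"interruption", "tacit-knowledge", "audio"})),
    Item("field sketch of a workaround", frozenset({"workaround", "tooling", "visual"})),
]
capabilities = [
    Item("narrative capture service", frozenset({"audio", "tacit-knowledge", "search"})),
    Item("pattern-detection toolkit", frozenset({"visual", "clustering", "workaround"})),
]

print(candidate_matches(fragments, capabilities))
```

At this scale the matching is trivial; the point of the sketch is simply that once needs and capabilities are described in the same metadata language, the search for unexpected fits becomes a clustering exercise rather than a requirements document.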

While we have the technology, one of the things we have lacked is a theoretical framework that integrates the various capabilities and allows consistent, scalable use, one that, in the immortal words of Lincoln, lets us think anew and act anew. For me, that framework is complexity science, which helps us understand how to achieve the form of serendipity outlined above. Next month, I'll provide an introduction to that theory and further explore its application.
