Cognitive computing: Why now and why it matters

We are arguably in the midst of a profound shift in computing, from programmatic to so-called cognitive, where intelligent workflows are no longer simply encoded into machines, but are instead discovered automatically through observation of best practices. Many would say that smart systems and data mining have been around for a while, so what is different now? It turns out that the change is not so much about new algorithms as about new engineering approaches, born of the need to cope with the sheer size, richness and volatility of big data.

Traditionally, inference models, which draw conclusions from data and content, have predominantly been used by experts in a “data lab” environment for the purpose of finding new lessons learned, or rules, that could then be applied within business workflows. Such models often had to rely on human knowledge, because the observed data alone was insufficient to produce meaningful rules. In a relatively static and data-poor environment, that approach was in many ways the only viable option.

Fast-forward to a world where every interaction is tracked digitally and where open access to shared knowledge creates both tremendous opportunities and intense pressure to move quickly and innovate. You are now faced with the imperative to rethink how you leverage your data. For one thing, all this available “sensor data” (Web, social, machine-generated and so on) creates a data-rich environment for learning about your customers' needs and proactively addressing them. But the engineering approach used to learn from this dynamic environment needs to be mostly unsupervised and adaptive, or it simply cannot scale.
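
To make that engineering constraint concrete, here is a minimal sketch, in Python, of what “unsupervised and adaptive” can look like in practice: a one-pass streaming k-means that updates cluster centers as each new interaction record arrives, with no labels and no offline retraining. The feature layout and data are hypothetical, purely for illustration, and not a description of any particular product.

def online_kmeans(stream, k):
    # Cluster interaction vectors as they arrive: one pass, no labels.
    centroids, counts = [], []
    for x in stream:
        if len(centroids) < k:        # seed the first k centers from the stream
            centroids.append(list(x))
            counts.append(1)
            continue
        # Assign the point to its nearest center (squared Euclidean distance).
        j = min(range(k),
                key=lambda i: sum((c - a) ** 2 for c, a in zip(centroids[i], x)))
        counts[j] += 1
        eta = 1.0 / counts[j]         # step size shrinks as a cluster matures
        centroids[j] = [c + eta * (a - c) for c, a in zip(centroids[j], x)]
    return centroids

# Hypothetical interaction vectors (e.g., recency and intensity of a behavior):
interactions = [(0.10, 0.20), (0.90, 0.80), (0.15, 0.25), (0.85, 0.90)]
print(online_kmeans(interactions, k=2))

Because each update touches only the arriving record, the model adapts continuously and never needs the full data set in memory, which is what allows this style of learning to scale.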

And as Thomas Davenport reported in his December 2013 Harvard Business Review article “Analytics 3.0,” over the past few years we have seen century-old businesses such as Schneider Electric, UPS and GE innovate with big data and deep machine learning, or “cognitive,” approaches to enrich their offerings and provide their customers with shortcuts to decisions and actions. Data-rich products and services are no longer exclusive to Internet giants like Google and Amazon. These new engineering approaches for automatically extracting actionable understanding from data were born on the Web, but have recently been embraced by early enterprise adopters with spectacular results. GE, for example, reports that for every $1 billion it invests in its big data program, it extracts $6 billion in net profits within 12 months.

In many industries, ranging from financial services to consumer goods, where price erosion has been rampant due to rapid product commoditization and the emergence of online alternatives, data-rich customer service has been the only sustainable competitive advantage for early adopters of enterprise cognitive systems. By synthesizing the complex and unique customer context and comparing it to similar past scenarios in real time, such a system can reliably identify the best actions to take, such as the best resolution, the best product or the best follow-up. Organizations are notoriously bad at learning from their collective smarts and at empowering each frontline employee with the right best practice at the right time. Bridging that gap between context and action is what enterprise cognitive systems (wikipedia.org/wiki/Enterprise_Cognitive_System) are designed to do.
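
As a rough illustration of that context-to-action matching, the sketch below compares a live customer context to past scenarios by cosine similarity and surfaces the action that worked in the closest matches. The field names, similarity measure and case data are all assumptions made for the example, not the mechanics of any specific enterprise cognitive system.

import math
from collections import Counter

def cosine(u, v):
    # Cosine similarity between two context vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def best_action(context, past_cases, top_k=3):
    # Rank past scenarios by similarity and vote among the top k actions.
    ranked = sorted(past_cases,
                    key=lambda case: cosine(context, case["context"]),
                    reverse=True)
    votes = Counter(case["action"] for case in ranked[:top_k])
    return votes.most_common(1)[0][0]

# Hypothetical history: each context vector might encode issue type, product
# line, customer tenure, etc.; each action is the resolution that worked.
history = [
    {"context": (1.0, 0.0, 0.3), "action": "offer_upgrade"},
    {"context": (0.9, 0.1, 0.4), "action": "offer_upgrade"},
    {"context": (0.0, 1.0, 0.8), "action": "escalate_to_specialist"},
]
print(best_action((0.95, 0.05, 0.35), history))  # -> offer_upgrade

A production system would weight candidate actions by their observed outcomes rather than a simple vote, but the core move is the same: match the live context to past contexts, then reuse what worked.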
