Cognitive computing: An evolution in computing

With interest and activity in cognitive computing heating up, a succinct definition of the term and an overview of some of the technologies behind it might be useful. I’ll also share a few of the practical applications emerging in the field, which are already demonstrating just how effective cognitive computing can be at helping answer complex questions.

At its core, cognitive computing is a collaboration between people and computers with a simple goal: to magnify human reason and insight.

To achieve that, cognitive computing systems must learn, iterate and adapt; interact with people and be responsive to them; and understand the correct context for the information they find. Those abilities allow computers to understand human questions, suggest answers and refine those answers based on human response and input.

That process differs greatly from the way most of today’s computing systems work: they must be explicitly programmed and are designed to run specific processes or give static responses to queries or requests. Those programs are limited to answering the exact question you ask, whereas cognitive computing systems can answer less specific questions and will change their answers based on human responses.

The technologies involved

To break beyond the constraints of today’s systems, cognitive computing systems must first understand human language. Natural language processing, statistical NLP and latent semantic indexing are some of the technologies that try to understand and interpret human language for computers. We already see that ability in the voice-driven assistants that increasingly play a role in our day-to-day lives: Apple’s Siri is a ubiquitous example, and voice recognition systems are now a fixture of customer engagement systems.
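To make one of those technologies concrete, here is a minimal sketch of latent semantic indexing using the open-source scikit-learn library. The four sample documents are invented for illustration, and the two-dimensional “concept” space is far smaller than anything a production system would use:

```python
# A minimal latent semantic indexing (LSI) sketch with scikit-learn.
# The four sample documents below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The patient reported chest pain and shortness of breath.",
    "Discharge summary notes improved cardiac function.",
    "Flight logs show repeated hydraulic pressure warnings.",
    "Mechanics recorded a hydraulic leak during inspection.",
]

# Weight each document's terms by TF-IDF, then project into a small
# "concept" space; documents about the same topic land close together
# even when they share few exact words.
tfidf_matrix = TfidfVectorizer(stop_words="english").fit_transform(docs)
concepts = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf_matrix)

# The two aviation reports, which share vocabulary, should score as similar.
print(cosine_similarity(concepts).round(2))
```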

But as we know from the experience of trying to deal with automated call centers, sounding like a person is not enough. Cognitive computing systems must be able to learn and gather information from their surroundings instead of being explicitly programmed to perform a task. Enter another core technology: machine learning in its several flavors. In machine learning, a computing system uses data from a trusted, verified source to understand the nature and context of its problem and train itself to work better. Machine learning-based systems get better over time as they learn more about a particular area of knowledge, but they, too, run into limitations in their ability to adjust the boundaries of their knowledge domains. (See the related article in KMWorld, Volume 24, Issue 10, page 8: http://www.kmworld.com/articles/readarticle.aspx?ArticleID=107050.)
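A minimal sketch of that training process, again assuming scikit-learn and using its bundled handwritten-digits dataset purely as a stand-in for a trusted, verified source, shows the “gets better as it learns more” behavior directly:

```python
# A minimal supervised learning sketch: the model trains on labeled
# examples and is scored on examples it has never seen. scikit-learn's
# bundled digits dataset stands in for a trusted, verified data source.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Accuracy climbs as the system learns from more of the domain.
for n in (100, 400, 1200):
    model = LogisticRegression(max_iter=2000).fit(X_train[:n], y_train[:n])
    print(f"{n:>5} training examples -> {model.score(X_test, y_test):.2f} accuracy")
```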

The most commonly used machine learning methods today are semi-automated, iterative processes in which a person must run a set of programs or algorithms on a data set, evaluate how well they work and adjust the algorithms’ parameters for the next run.
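That loop can be as simple as the sketch below, which assumes scikit-learn’s support vector classifier; the person’s role is to read the scores after each run and choose the parameters for the next one:

```python
# The semi-automated loop described above: run an algorithm with one
# parameter setting, evaluate it, adjust, and run again. In practice a
# person inspects these scores between runs and picks the next setting.
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

for C in (0.01, 1.0, 100.0):  # candidate regularization strengths
    scores = cross_val_score(SVC(C=C), X, y, cv=5)
    print(f"C={C}: mean accuracy {scores.mean():.3f}")
```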

Neural networks, software architectures that mimic the communication structure of the human brain, hold real hope for overcoming those limitations because they are capable of adjusting their own operating parameters while seeking a goal. Though researchers have been trying to simulate the complex thinking processes of the human brain since the 1960s, recent advances in hardware technology and neuroscience have made it possible to create better systems that take advantage of some of the distributed, parallel processing typical of brain functions.
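For contrast with the manual loop above, here is a sketch of a small neural network, using scikit-learn’s MLPClassifier for brevity; it adjusts its own internal weights toward its goal on every training pass, with no person re-tuning them between runs:

```python
# A small neural network that adjusts its own operating parameters:
# backpropagation updates the connection weights on every iteration
# while the network seeks its goal of minimizing prediction error.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(X_train, y_train)  # weights are learned, not set by hand
print(f"accuracy: {net.score(X_test, y_test):.2f}")
```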

The advantages of using the human brain as a template are myriad. As Roger Kay wrote on Forbes.com, “The human brain integrates memory and processing … operates as a massively parallel distributed processor … reacts to things in its environment, uses little power when active and even less while resting. It is a reconfigurable, fault-tolerant learning system. It is excellent at pattern recognition and teasing out relationships.”

Real-world applications

Organizations are finding many ways to use elements of cognitive computing. UNC Health Care is using natural language processing from IBM to sift through unstructured data, such as doctors’ notes, registration forms, discharge summaries and phone call notes, pairing it with structured data and using it to target high-risk patients and design prevention programs for them. The U.S. government is using analytics to examine some of NASA’s unstructured data, combing historical pilot reports and mechanics’ logs for potential flight problems, and to scan social media for evidence of terrorism and biological threats.

Emerging cognitive computing systems will take those successes further. Cognitive systems will coach doctors by asking questions specific to each patient; those questions will be chosen after the cognitive computing system sifts through large volumes of data from many sources and identifies the most likely avenues to pursue. Answers to those questions will guide doctors and patients to diagnoses and treatment options. And intelligence analysts will be able to ask indirect questions such as, “Who is funding this terrorist organization and how are its funds delivered?”

Those examples of cognitive computing applications involve processing large data sets, but it is worth noting that big data and cognitive computing are related yet distinct concepts. Gartner defines big data as “high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.” In other words, big data is large volumes of content from different sources that holds the potential to tell us something. Analytics will help pull relevant information from big data as well as define how that information is connected.

Cognitive computing is an emerging, unique discipline combining analytics, problem solving and communication. It draws on big data when necessary to answer ambiguous questions and solve problems, but its key contributions go well beyond the charter of analytics as understood today. Cognitive computing looks within and across disparate data sets, identifies conflicting data, uncovers surprises, finds patterns, understands context, offers suggestions, requests clarifications and provides ranked solution alternatives. It offers a new methodology for uncovering the potential in data and capturing value, whether the data is big or small.
