
Footprints on the Microsoft desktop


The vision of cognitive computing, a collaboration between people and computers that magnifies human reason and insight, is incredibly exciting. And it's not just hype; there are some remarkable, and real, early applications out there. But this morning, as I was working through my e-mail, I realized I'm already using cognitive computing every day, just in small and subtle ways.

I'm used to thinking of Cognitive Computing in capital letters: powerful systems that deliver deep insight in specific domains, a major technology trend pointing toward the future. But lowercase cognitive computing is already pragmatically at work on my desktop.

In case you don't know me, some context: I'm a geek and a very close Microsoft partner. I develop software around search and text analytics, and formerly worked on speech recognition and machine learning, so perhaps I'm more attuned than many to what machines are doing on my behalf. But mostly I am just trying to get my work done.

Facial recognition

You don't hear Microsoft talking much about cognitive computing. But in fact, Microsoft has been quietly working some genuinely deep technology into shallow applications. Microsoft Research's work in machine learning, image recognition and multiple modes of human-machine interaction finds its way to product groups and out into today's shipping software. These advanced capabilities lurk in pretty prosaic applications, in ordinary places such as my desktop. I'll describe a couple of them, and you can judge for yourself whether they are cognitive computing.

Windows 10 uses facial recognition in Windows Hello, which lets you log in with your face on PCs equipped with an Intel RealSense camera. Is that actually useful? At first it seemed like a gimmick: I could walk into my office and my computer would log me in and automatically connect to whatever virtual meeting I was supposed to be in at the moment. Now it feels like a real help; it saves me a minute or two, but more importantly it makes me much less frantic when I'm running late (which is often).

A little nagging

Of course, personal digital assistants are part of the picture, and Microsoft’s Cortana has pulled together speech recognition, intelligent agents, machine learning and a lot of human factors work into a pretty good package. I use speech commands to Cortana sparingly (mostly on my phone; I’m used to typing on my desktop) and notice that suggestions about meeting times are getting better and better. The new reminder features feel a bit like my computer is constantly nagging me, but after a week I’ve already had several pretty important reminders (examples: “Did you book your flights for that trip yet?” “You haven’t replied to that customer’s questions. Do you want me to remind you?” “Your calendar says you are supposed to meet Sid for lunch - NOW”). Perhaps I need a little nagging now and then.
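
If you're curious what such context-driven reminders might look like under the hood, here is a deliberately simple, rule-based sketch in Python. The rules, data shapes and thresholds are my own assumptions for illustration; they are not how Cortana actually works.

# A toy, rule-based reminder check; the rules and data shapes are
# assumptions for illustration only, not Cortana's implementation.
from datetime import datetime, timedelta

calendar = [
    {"title": "Lunch with Sid", "start": datetime(2016, 5, 3, 12, 0)},
    {"title": "Trip to Chicago", "start": datetime(2016, 5, 10, 8, 0)},
]
inbox = [{"from": "customer@example.com", "subject": "Questions", "replied": False}]
flights_booked = set()  # trips for which a flight confirmation has been seen


def reminders(now):
    notes = []
    for event in calendar:
        # Upcoming trip with no flight confirmation on file.
        if "trip" in event["title"].lower() and event["title"] not in flights_booked:
            notes.append(f"Did you book your flights for '{event['title']}' yet?")
        # Meeting starting within the next 15 minutes.
        if timedelta(0) <= event["start"] - now <= timedelta(minutes=15):
            notes.append(f"Your calendar says '{event['title']}' starts now.")
    for msg in inbox:
        # Mail still waiting for a reply.
        if not msg["replied"]:
            notes.append(f"You haven't replied to {msg['from']}. Want a reminder?")
    return notes


print(reminders(datetime(2016, 5, 3, 11, 50)))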

The Office Graph is also driving a bunch of subtle features in my everyday work. The Clutter feature sorts my e-mail into different types and priorities, keeping my focus on the most important messages. At one level, it’s sort of like a spam filter—and I would not call spam filters “cognitive.” But it is clearly tracking what I read and reply to, what I don’t, how I use folders, etc.—and adapting continuously. Much smarter and more convenient than any spam filter I’ve used.
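
To make the idea of a continuously adapting filter concrete, here is a minimal sketch of a priority model that learns from which messages get read and replied to and which get ignored. The features, counts and threshold are illustrative assumptions, not Microsoft's Clutter implementation.

# A minimal adaptive mail-priority sketch in the spirit of Clutter.
# Feature names, the update rule and the threshold are assumptions.
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class PriorityModel:
    # Running counts of how often each feature appears in messages the
    # user engaged with versus ignored.
    engaged: dict = field(default_factory=lambda: defaultdict(int))
    ignored: dict = field(default_factory=lambda: defaultdict(int))

    def update(self, features, was_engaged):
        """Adapt continuously: every read or reply (or lack of one) is a signal."""
        counts = self.engaged if was_engaged else self.ignored
        for f in features:
            counts[f] += 1

    def score(self, features):
        """Crude engagement probability with add-one smoothing."""
        total = 0.0
        for f in features:
            e = self.engaged[f] + 1
            i = self.ignored[f] + 1
            total += e / (e + i)
        return total / max(len(features), 1)


model = PriorityModel()
model.update({"sender:boss", "kw:deadline"}, was_engaged=True)
model.update({"sender:newsletter", "kw:sale"}, was_engaged=False)

# New message: route it based on the learned score.
msg_features = {"sender:boss", "kw:meeting"}
folder = "Focused" if model.score(msg_features) > 0.5 else "Clutter"
print(folder)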

Delve presents information and people I should be aware of in a Pinterest-like format. I use it continually, both to discover information and to see what people are up to. The learning features are highly damped, so I don't get a lot of spurious results, but the feed is still dynamic, and several times a day I see recommendations that are truly helpful and timely. I use Delve Organizational Analytics to see trends in who's working with whom, on what, and in what patterns, which hopefully makes me a more effective manager (or at least a better-informed one). And I've just added an app that previews who is involved in upcoming meetings and what material they've authored or viewed related to the topic, and suggests issues to be ready for or fruitful topics to bring up.
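
For a feel of how graph-based recommendations of this sort can be scored, here is a toy sketch that weights colleagues' document activity by how closely they work with you. The edge types, weights and formula are assumptions for illustration, not the actual Office Graph ranking.

# A toy graph-based relevance score for a Delve-style feed.
# Edge types, weights and the scoring formula are illustrative assumptions.
from collections import defaultdict

# Who works with whom (strength of the working relationship)...
works_with = {"me": {"sid": 0.9, "ana": 0.6}}

# ...and who has touched which documents, by interaction type.
ACTION_WEIGHT = {"authored": 1.0, "modified": 0.7, "viewed": 0.3}
activity = [
    ("sid", "authored", "Q3 roadmap.docx"),
    ("sid", "viewed", "Budget.xlsx"),
    ("ana", "modified", "Q3 roadmap.docx"),
]


def recommend(user, top_n=5):
    """Score documents by colleagues' activity, weighted by how closely
    each colleague works with `user` and how strong the interaction was."""
    scores = defaultdict(float)
    for person, action, doc in activity:
        closeness = works_with.get(user, {}).get(person, 0.0)
        scores[doc] += closeness * ACTION_WEIGHT[action]
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]


print(recommend("me"))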

When I say these applications are shallow and prosaic, I mean they support everyday activity: reading and writing e-mails and documents, organizing and participating in trips and meetings, thinking about my team, our projects and so on. Most of the time I don't even think about the fact that I'm using these tools or how they work; when I do, it intrigues me and prompts pieces like this one.

Subtle but cognitive

Is this cognitive computing? I think it clearly is. It's subtle, but these are genuinely complex situations, full of ambiguity and uncertainty, with fluid and evolving user goals, in information-rich environments where context matters. My desktop (and, to a lesser degree, my phone) is now full of tools that are contextual, interactive and adaptive; they provide machine-aided serendipity. That sounds like the definition of cognitive computing.
