In front of a standing-room-only crowd on the first day of Moogfest 2016, IBM’s Allison Schneider opened her talk with a question: “How many of you would say you know a lot about IBM Watson?”

Aside from a few scattered hands, there was little response, and a few muffled giggles. “How many of you have at least heard of it?” 
This time, nearly everyone raised their hands, or nodded in their seats. 
The quick sequence revealed what has become a recurring theme over the past few years: people know that Watson exists, but they don’t really know what the supercomputer actually does. Since its memorable 2011 appearance as a Jeopardy! contestant, Watson has remained a bit of a mystery. 
Five years later, though, Schneider—who specializes in working with companies in arts and culture to put Watson to use—explained that Watson’s “brain” is being utilized by developers and decision-makers across the world. Watson is now put to the test on everything from issues as broad and serious as cancer treatment to fun projects like creating new dinner recipes with unconventional ingredients.

What is Watson?

As Schneider put it, IBM Watson is a computer system that aims to “enhance and scale the human experience.” 
In more concrete terms, it is a cognitive technology—artificial intelligence—designed to read and understand human language, learn from it, and provide answers to questions. The idea behind Watson, Schneider said, is that the system operates and makes decisions in the same way that a human brain would.
While most technology needs clearly defined, structured data parameters to yield an output, Watson “learns” from unstructured text and audio, and increasingly from images. 

Putting Watson to use

In recent years, IBM has increased public access to the Watson system, which has let developers put the technology to work in surprising ways. 
IBM has publicly released 30 open-source APIs on its Watson Developer Cloud. They’re broken down into four categories: Language, Speech, Vision and Data Insights. Schneider explained that the potential impacts of each API are far-reaching, whether that means crafting visually pleasing artwork, or honing highly targeted marketing tactics. 
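For a rough sense of what “putting an API to work” looks like, here is a minimal sketch of how a developer might prepare a request to one of these language services. The endpoint URL, path, and request shape below are hypothetical placeholders, not the actual Watson Developer Cloud interface, which has its own service URLs, credentials, and SDKs.

```python
import json
from urllib.request import Request

# Hypothetical endpoint for illustration only; each real Watson
# Developer Cloud service has its own URL, credentials, and SDK.
BASE_URL = "https://gateway.example.com/watson/language"

def build_analyze_request(text: str) -> Request:
    """Prepare (but do not send) a POST asking a language-analysis
    service to process a snippet of unstructured text."""
    body = json.dumps({"text": text}).encode("utf-8")
    return Request(
        f"{BASE_URL}/v1/analyze",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_analyze_request("Watson turned heads at Moogfest.")
print(req.get_method(), req.full_url)
```

The point of the sketch is the shape of the workflow—free-form text in, structured analysis out—which is what distinguishes these services from databases that require rigidly structured input.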
Because of Moogfest’s musical focus, Schneider highlighted Quantone, a music recommendation engine built from a Watson API. While most musical recommendations are based on metadata and specific categorical filters, Quantone provides recommendations based on things like blog posts, social media discussion and similar unstructured content around the Internet. 
On the other end of the spectrum, SamPack utilizes Watson to create new algorithmic samples for musicians as they write new songs. SamPack was the product of a SXSW hackathon, after two musicians — doubling as developers — sought a solution to their struggle to find quality royalty-free samples for their new songs.
Watson’s reach has also extended to the kitchen, where a “cognitive cooking” tool allows users to input ingredients or meal parameters and receive 100 new recipes in return. Rather than drawing on a simple recipe database, Watson actually “creates” the recipes in real time.
While the implications for Watson are vast, Schneider maintained there is no desire whatsoever for the supercomputer to “replace” humans.
“It’s meant to equip creatives and designers,” she said, “to think deeper and make cool things happen.” 
“We want to disrupt the creative process and get artists thinking differently.”