IBM’s Watson collaborated with Grammy Award-winning producer Alex Da Kid to create a “cognitive song.” I thought this was a really cool project, since music isn’t something you typically associate with artificial intelligence. Looking a little deeper, though, you realize that incorporating Watson opened up even more opportunity, uncovering new sources of inspiration for artists.
Before the collaboration could begin, Watson first had to turn millions of unstructured data points into emotional insights that would help create a new kind of music. To do so, Watson analyzed millions of songs, lyrics, and five years of natural language texts, which ultimately taught Watson what made a “good” song.
One of Alex Da Kid’s main goals was to foster a deeper connection with his audience. So Watson was taught the most significant cultural themes, using the Watson Tone Analyzer to read news articles, blogs, and tweets and find out how people felt about them. After analyzing years’ worth of popular music, the Watson Tone Analyzer API read lyrics and ultimately created its own; the Cognitive Color Design Tool analyzed album art to create the song’s album art. Finally, Watson Beat examined the composition of songs to generate a fully immersive experience for Alex Da Kid.
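The article doesn’t show how Tone Analyzer actually scores text, but the basic idea of extracting emotional signals from writing can be sketched with a toy keyword-based scorer. This is a hypothetical stand-in for illustration only, not IBM’s API; the tone names and keyword lists here are made up:

```python
# Toy stand-in for a tone-analysis step: count how often each tone's
# hand-picked emotion keywords appear in a text. A real pipeline would
# call IBM's hosted Tone Analyzer service instead.
from collections import Counter

# Hypothetical keyword lists, not Watson's actual tone model.
TONE_KEYWORDS = {
    "joy": {"love", "happy", "celebrate", "bright"},
    "sadness": {"lost", "alone", "cry", "goodbye"},
    "anger": {"fight", "rage", "burn"},
}

def analyze_tone(text: str) -> dict:
    """Return a per-tone count of emotion keywords found in the text."""
    words = text.lower().split()
    counts = Counter()
    for tone, keywords in TONE_KEYWORDS.items():
        counts[tone] = sum(1 for word in words if word in keywords)
    return dict(counts)

print(analyze_tone("alone again i cry goodbye my love"))
# → {'joy': 1, 'sadness': 3, 'anger': 0}
```

Run over thousands of news articles, blogs, or tweets, even a crude scorer like this surfaces which emotional themes dominate a period of public conversation, which is the kind of insight the article describes Watson feeding back to the artist.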