Expert: AI-Generated Music Is A “Total Legal Clusterf*ck”

The legal industry isn't ready for AI-generated music, leading to all sorts of new questions about copyright in the age of creative machines.


If you train a music-generating artificial intelligence exclusively on tracks by Beyoncé, do you owe the pop star a cut of any resulting songs’ profits? And is it even legal to use copyrighted songs to train an AI in the first place?

Those are just a couple of the questions The Verge poses in a fascinating new story about AI-generated music published Wednesday. And while the publication consulted numerous experts from the music, tech, and legal industries for the story, the input of one person in particular — Jonathan Bailey, CTO of audio tech company iZotope — seemed to sum up the issue most concisely.

“I won’t mince words,” he told The Verge. “This is a total legal clusterfuck.”

Imitation Game

Though the U.S. Copyright Office raised the potential problems posed by computer songwriters way back in 1965, U.S. copyright law has yet to nail down exactly who owns what when a computer is involved in the creative process, according to The Verge.

As it stands, the Beyoncé-trained AI could crank out an entire album of “Lemonade”-esque tracks, and as long as none of them sounded too much like any specific Beyoncé song, the AI-generated music wouldn’t be infringing on her copyrights — and the AI’s creator wouldn’t legally owe the artist a penny, lawyer Meredith Rose told The Verge.

Less clear is the use of copyrighted songs to train an AI. Several of The Verge’s sources said there isn’t a straightforward answer as to whether buying a song grants a person the right to then use it to train a machine learning system.

Clock’s Ticking

Of course, programmers have yet to come anywhere near creating AIs capable of autonomously churning out hit songs in the key of Bey — or anyone else for that matter — but that doesn’t mean they won’t be able to one day.

“It’s like the future of self-driving cars,” media-focused venture capitalist Leonard Brody told Fortune in October. “Level 1 is an artist using a machine to assist them. Level 2 is where the music is crafted by a machine but performed by a human. Level 3 is where the whole thing is machines.”

We’ve already seen several examples of those first two levels — tech-forward songstress Taryn Southern shared songwriting credits with AI on her “I AM AI” album, released in September, and that same month, Iranian composer Ash Koosha released an album on which he sang songs composed by AI-powered software.

If Brody’s prediction is correct, the next step will be AIs creating music by themselves — and if we’re already in the midst of a “legal clusterfuck,” who knows what sort of legislative nightmare that will be?

READ MORE: WE’VE BEEN WARNED ABOUT AI AND MUSIC FOR OVER 50 YEARS, BUT NO ONE’S PREPARED [The Verge]

More on AI songwriters: This Musician Created an AI to Write Songs for Him, and They’re Pretty Strange
