What’s the answer to AI’s big copyright problem?

Artists are furious that their work has been used as fodder for AI tools, and copyright lawsuits are mounting as a result. We discuss AI’s intellectual property woes with Jelly head of artist management Nicki Field and illustrator Christoph Niemann

“It’s a bit of a weird grey area,” says Nicki Field, in what feels like a diplomatic understatement. We’re discussing artificial intelligence and its burgeoning role as an artist, via image-generation tools including Midjourney, DALL-E 2 and Stable Diffusion. Unsurprisingly, the global head of artist management at Jelly says she’s had plenty of conversations with her artists about it.

Feelings about the rapidly developing technology are understandably mixed. Some have hailed AI’s seemingly magical ability to produce artworks, while others have raised questions about how these tools learned to do so in the first place. It’s a murky arena, and companies haven’t been entirely open about how it all works. “There’s the issue of how these generators have been learning and what they’re being fed, because they’re infringing artists’ rights in the first instance,” explains Field. “There’s been no permission asked in the training data set. If they’re just trawling the internet, they’ve got access to millions and billions of images which they’re learning from.”

Many AI companies now face mounting legal issues. Getty Images is suing Stability AI, creator of Stable Diffusion, claiming the company scraped its image library without permission, and a group of artists has brought a class action lawsuit against the same company and Midjourney for similarly training their tools on the artists’ work.