Uncovering the covered – YouTube reinstates Frank Sinatra cover of 2004 hit “Toxic”

A recent YouTube spat over an AI-created cover of Britney Spears’ 2004 hit “Toxic” demonstrates that AI has now made it possible for one artist to sing a cover of another artist’s song, no human required! In this case, a group called DADABOTS, which describes itself as “a cross between a band, a hackathon team and an ephemeral research lab”, created the rendition using software that can generate audio in the voice of a particular artist, in a particular genre, or as a novel fusion of the two. The end result was a Frank Sinatra cover of the Britney Spears song.

Whilst this prospect is bound to excite some, from the perspective of conventional copyright law (and perhaps defamation law) the path to creating such a custom cover track using AI is fraught with complexity and uncertainty.

Yet that didn’t stop DADABOTS’ rendition being met with something a little more conventional: a copyright takedown notice and subsequent removal from YouTube, as Futurism’s Dan Robitzski recently reported. Futurism noted that GreyZone Inc., a company which offers copyright infringement identification and reporting services, was responsible for the complaint, but wasn’t able to identify on whose behalf it was made.

Appeal of YouTube takedown

The basis for the original takedown notice is unclear – presumably representatives of Frank Sinatra’s or Britney Spears’ record labels took the view that their copyright was infringed in some form. In Australia, the relevant songs are likely to be covered by a suite of copyright and related rights, potentially including copyright in the lyrics and in the musical work, performers’ rights and rights in the sound recordings.

In this case, DADABOTS appealed YouTube’s decision to remove the rendition, calling in aid the US copyright doctrine of fair use. This doctrine is codified in the US Copyright Act and allows “fair use” of copyright works without infringing copyright. Unlike in Australia, where the equivalent “fair dealing” exceptions are limited to quite specific circumstances, the US courts have a broad discretion to determine what use is “fair”, depending on factors such as the nature of the copyright work, the purpose of the use and the effect of the use on the potential market for the copyright work.

YouTube accepted the appeal and DADABOTS’ upload was reinstated, albeit with YouTube flagging it as a cover of “Toxic”. YouTube offers a service which allows eligible copyright holders to set up rules dealing with third party copies of their exclusive content uploaded to YouTube. Those rules might include blocking the uploaded media, claiming its ad revenue or tracking its viewership information. YouTube’s flag on the DADABOTS version presumably makes it subject to any such controls in place in respect of “Toxic”.
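
YouTube’s rights-management rules are configured through its own tools rather than code, but purely as a hypothetical illustration of the kinds of per-work rules described above (block the upload, claim its ad revenue, or track its viewership), a matched upload might be handled along the following lines. All names and structures here are assumptions for illustration only, not YouTube’s actual system or API.

```python
# Hypothetical sketch of per-work rules of the kind described above
# (block, monetise or track a matched upload). Illustrative only.
from dataclasses import dataclass


@dataclass
class MatchPolicy:
    work: str      # the copyright work the rule protects
    action: str    # "block", "monetise" or "track"


# A rights holder might configure one rule per protected work.
POLICIES = [
    MatchPolicy(work="Toxic", action="monetise"),
]


def apply_policy(upload_title: str, matched_work: str) -> str:
    """Return the action a configured rule would apply to a matched upload."""
    for policy in POLICIES:
        if policy.work == matched_work:
            return f'"{upload_title}" matched "{matched_work}": {policy.action}'
    return f'"{upload_title}": no rule configured, leave as is'


# e.g. the reinstated cover, flagged by the platform as a copy of "Toxic"
print(apply_policy("Frank Sinatra sings Toxic (AI cover)", "Toxic"))
```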

Use of copyright works during development of AI-powered applications

To make the rendition, DADABOTS used an AI software tool developed by California-based OpenAI. Known as “Jukebox”, the software utilises neural networks, a form of machine learning. Jukebox was trained by processing large datasets of music, and it is what the model learnt from that training data that allowed it to create the rendition. In this case, Jukebox performed the lyrics of “Toxic” in the voice and/or genre of an artist on which the software had been trained (Frank Sinatra). According to OpenAI’s website, Jukebox was trained on 1.2 million songs, together with the corresponding lyrics and metadata, including artist, genre, year and associated keywords.
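
OpenAI has open-sourced Jukebox itself, but the real system is far too large to reproduce here. Purely by way of illustration, the sketch below shows, in a hypothetical and heavily simplified form, the underlying idea described above: an autoregressive model over audio tokens whose output is steered by artist and genre conditioning learned from training data. The class names, dimensions and tiny GRU model are assumptions for this toy example, not Jukebox’s actual architecture or API.

```python
# Hypothetical, heavily simplified sketch of conditional audio-token generation.
# Not OpenAI Jukebox's actual code; dimensions and names are illustrative only.
import torch
import torch.nn as nn

VOCAB_SIZE = 256   # discrete audio codes (e.g. from a learned audio codec)
N_ARTISTS = 4      # toy conditioning vocabularies
N_GENRES = 3
EMBED_DIM = 64


class ConditionalAudioLM(nn.Module):
    """Autoregressive model over audio tokens, conditioned on artist and genre."""

    def __init__(self):
        super().__init__()
        self.token_emb = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.artist_emb = nn.Embedding(N_ARTISTS, EMBED_DIM)
        self.genre_emb = nn.Embedding(N_GENRES, EMBED_DIM)
        self.rnn = nn.GRU(EMBED_DIM, EMBED_DIM, batch_first=True)
        self.head = nn.Linear(EMBED_DIM, VOCAB_SIZE)

    def forward(self, tokens, artist, genre):
        # Add the conditioning vectors to every timestep's token embedding,
        # so the sampled audio is steered towards that artist/genre.
        cond = (self.artist_emb(artist) + self.genre_emb(genre)).unsqueeze(1)
        h, _ = self.rnn(self.token_emb(tokens) + cond)
        return self.head(h)  # next-token logits at each position


@torch.no_grad()
def generate(model, artist, genre, length=32):
    """Sample a short sequence of audio codes in the given artist/genre style."""
    tokens = torch.zeros(1, 1, dtype=torch.long)  # start token
    artist = torch.tensor([artist])
    genre = torch.tensor([genre])
    for _ in range(length):
        logits = model(tokens, artist, genre)[:, -1]
        next_tok = torch.multinomial(torch.softmax(logits, dim=-1), 1)
        tokens = torch.cat([tokens, next_tok], dim=1)
    return tokens[0, 1:]  # generated codes; a real system decodes these to audio


model = ConditionalAudioLM()  # untrained here; Jukebox was trained on ~1.2M songs
print(generate(model, artist=0, genre=1))
```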

Interestingly, Futurism’s article suggests the YouTube appeal arguments were crafted without reference to the use of AI to create the DADABOTS rendition. The fair use arguments focussed on the end result (i.e. the new rendition) as a fair use in itself, rather than on the method used to create it. Separate to issues around copyright ownership of works created wholly or substantially by AI technologies, the use of copyright works for AI-related functions, such as machine learning and data mining, is very much a live issue globally. The extent to which such uses are thought to be “fair” is likely to influence the development of the fair use doctrine in copyright, the bounds of which already vary considerably from country to country, if not lead to specific copyright provisions.

One notable mover in this space is Japan, which on 1 January 2019 brought into force provisions in its Copyright Act aimed at exempting computer data processing and analysis of copyright works from infringement in certain circumstances. The way in which the law on this topic develops globally may well become a factor influencing where companies focused on developing AI-powered applications choose to base themselves (if it is not already). Differences in the law around the globe may also lead to potentially complicated questions of jurisdiction where AI-generated material is exported from one country to another.

In Australia, the current law is likely to be less friendly to machine learning and AI-powered applications, as a result of the stricter confines of fair dealing and the other, limited, exceptions to copyright infringement. Certainly where AI applications are used for development which is commercial in nature, it is unlikely that the Australian copyright exemptions would apply. While section 40 of the Copyright Act 1968 (Cth) does provide an exemption for fair dealing for the purpose of research or study, it has generally been given a narrow reach.

Indeed, extension of the fair dealing provisions in Australia was a topic considered extensively by the Productivity Commission in its review of intellectual property laws in Australia and the consequent report issued in 2016. Submissions made on this issue included those from tech companies supporting a broader fair use exception, some specifically referring to machine learning in this context. The Productivity Commission ultimately recommended significant reforms in this area, paving the way for a US-style fair use exception. However, following two years of further consultation, the Government announced in August 2020 only a limited extension of the fair dealing exceptions, for non-commercial quotation.

As a consequence, use of datasets to train AI systems in Australia appears likely to carry with it a significant risk of copyright infringement for the foreseeable future, in the absence of an appropriate licence. As the law in this area continues to develop worldwide, this is clearly a space to watch for both copyright owners and anyone using databases of copyright material to power or utilise AI technologies.
