Deepfakes, or AI-generated images, audio, and video impersonating individuals, are increasingly shared online for abusive purposes, from disinformation campaigns to identity theft and fraud. They can also undermine democratic processes: during the 2024 US primary elections, an AI-generated robocall impersonating Joe Biden urged thousands of New Hampshire voters to refrain from voting, and Italian Prime Minister Giorgia Meloni has been targeted by dissemination campaigns of AI-generated pornographic videos.
Despite this, there is no unified framework protecting individuals targeted by these abuses. Some countries, however, are moving in that direction: Denmark recently proposed a first-of-its-kind copyright reform that would protect individuals from deepfake impersonation. What if Canada and the rest of the world took inspiration from it?
The Danish Approach
The European nation proposes two legal changes to combat deepfakes, both of which constitute amendments to its Copyright Act.
The first grants individuals the explicit right to their likeness (face, body, voice), allowing them to block non-consensual, AI-generated realistic imitations. This strengthens personal autonomy and counters abuse and disinformation.
The second protects artists and performers by preventing the non-consensual distribution of digital imitations of their artistic work. This approach reforms copyright to pragmatically address deepfake abuses.
Limitations
The Danish proposal, however, has two main limitations.
First, it excludes deepfakes that are not 'realistic digitally generated imitations'. This definition leaves out clearly labelled deepfakes as well as unlabelled but unrealistic ones. It also creates a grey area for 'semi-realistic' deepfakes, whose status ultimately rests on judicial interpretation.
Second, it also excludes deepfakes that constitute caricature, satire, parody, or social criticism to protect freedom of expression. A satirical deepfake can only be challenged if it constitutes misinformation posing what is deemed a 'serious threat'. Again, the practical impact depends on the Danish courts' subjective interpretation of terms like 'satire', 'social criticism', and what constitutes a 'serious threat'.
Canadian Hurdles
Thus far, deepfakes in Canada have not been approached from this intellectual property-centric angle. Instead, they are more likely to be covered by the 'right to image' (droit à l'image) in Quebec, alongside the common law's tort of appropriation of personality. Following Denmark's approach of using copyright for deepfake protection federally therefore risks constitutional friction: copyright is a federal power under the Constitution Act, 1867, while property and civil rights, including protections for a person's image, fall under provincial jurisdiction.
Extending Canadian copyright to protect a person's face or voice does not seem suitable: copyright currently covers only authored works meeting the 'skill and judgment' standard1, and aims to protect original works rather than a person's likeness or identity. Instead, a neighbouring sui generis right, akin to performers' rights and subject to a satire/social commentary exception, offers a more logical path.
However, introducing a new federal likeness copyright could still lead to complications, such as overlapping rights and veto points with existing copyright and moral rights.
A copyright approach, while adding remedies for identified harms, also fails to address authenticity and attribution, making it difficult to identify a deepfake's creator or uploader, or to distinguish real imagery from artificial.
Therefore, any copyright provisions would need supplementary horizontal rules focusing on labelling, watermarking, and provider transparency (like the EU AI Act and China’s Deep Synthesis regulations).
Should the new Danish regulation prove effective, it could serve as a model for other nations to design and implement their own. Denmark's example highlights that bold measures in AI governance are sometimes necessary to tackle the emerging challenges posed by such a powerful and ever-evolving technology.
1 CCH Canadian Ltd. v. Law Society of Upper Canada, 2004 SCC 13, [2004] 1 S.C.R. 339
The authors would like to express their gratitude to Karl Bissonnette, Paul Gagnon and Eve Gaumond for their comments on this blog post.