Deepfakes, Pornography and Consent

Philosophers' Imprint (forthcoming)

Abstract

Political deepfakes have prompted outcry about the diminishing trustworthiness of visual depictions, and the epistemic and political threat this poses. Yet this new technique is being used overwhelmingly to create pornography, raising the question of what, if anything, is wrong with the creation of deepfake pornography. Traditional objections focusing on the sexual abuse of those depicted fail to apply to deepfakes. Other objections—that the use and consumption of pornography harms the viewer or other (non-depicted) individuals—fail to explain the objection that a depicted person might have to the creation of deepfake pornography that utilises images of them. My argument offers such an explanation. It begins by noting that the creation of sexual images requires an act of consent, separate from any consent needed for the acts depicted. Once we have separated these out, we can see that a demand for consent can arise when a sexual image is of us, even when no sexual activity was actually engaged in, as in the case of deepfake pornography. I then demonstrate that there are two ways in which an image can be ‘of us’, both of which can exist in the case of deepfakes. Thus, I argue: if a person, their likeness, or their photograph is used to create pornography, their consent is required. Whenever the person depicted does not consent (or in the case of children, can’t consent), that person is wronged by the creation of deepfake pornography and has a claim against its production.

Author's Profile

Claire Benn
Cambridge University
