Since the dawn of advertising, celebrities have been used to endorse products, to give them a bit of star power and glamour. In more recent years, with the rise of influencer culture, this has become more nuanced: a celebrity showcasing a product or brand might bring appeal and reach, but influencers bring a closer relationship with their followers. If an influencer we admire and engage with says a product is good, we are inclined to trust them (in theory anyway).
So what happens when it transpires that the person fronting a brand or showcasing a product is in fact not quite what they appear? That it’s not entirely their face or voice appearing in an ad, but merely an outline of it?
It’s something that’s unfolding in real time with the use of deepfakes in advertising and marketing content. Deepfakes are made with neural networks, which are trained on a dataset of someone’s face or voice in order to produce an imitation of their likeness. Deepfakes on the more ‘professional’ end of the spectrum usually involve an actor with similar facial or vocal characteristics to the person they’re imitating – such as the series of ominous deepfakes of Donald Trump, Mark Zuckerberg and Kim Kardashian made by Bill Posters in 2019, or the excellent Tom Cruise spoofs that spread like wildfire across TikTok and beyond earlier this year.