Deep fakes and democracy: the threat
Any one of us can easily access the technology for creating fake news
No apologies from me in returning to antisocial media and the daily fallout from their malign influence on our lives. This week I want to write about deep fakes, not least because we are going to be hearing – and seeing – a lot about this technology in the coming months as general elections spring up all over the world, including here and, of course, in the USA, where this issue may well have a much greater prominence and devastating effect.
Recall Michelle Obama saying ‘when they go low, we go high’ in 2016? Whether to follow that advice is going to be an interesting question for the Democratic Party in the coming months, as they face off against a Republican Party, these days a front for the Trump Cult, who, you may be sure, will be using every kind of fakery they can muster – including audio and video clips of Joe Biden mouthing stuff he never uttered.
This particular rabbit hole is really a warren of immense size. I spent some time last weekend exploring this noisome labyrinth, and it gets scarier the deeper you go. First off, it has to be said that the really alarming (because so good) software is on offer as open source material (no money changes hands). So far it is for nerds and geeks, specialists in computer science, because downloading it is not easy and the manipulation tools are hard to understand. However, for a not very high price, you may buy off-the-shelf software that will do the trick almost as well. In a trice, for a few dollars, you’re in the business of deep-fake audio, video and stills forgery. All the evidence is that you won’t be caught when you start to post.
Unsurprisingly, such technology, which is heavily invested in AI, has mostly been used to create pornographic content. Luba Kassova, journalist and consultant on gender equality, writes: ‘Last year’s State of Deep-fakes report revealed a six-fold increase in deep-fake pornography in the year to 2023. Unsurprisingly, women were the victims in 90 per cent of recorded cases. [Some sources suggest the figure is closer to 99 per cent.]
‘Technology now allows a 60-second deep-fake video to be created from a single clear image in under 25 minutes – at no cost. Often using images lifted from private social-media accounts, every day more than 100,000 sexually explicit fabricated images and videos are spread across the web. Referral links to the companies providing these images have increased by 2,408 per cent year on year.
‘There is no doubt that non-consensual deep-fake pornography has become a growing human rights crisis. But what steps can be taken to stop this burgeoning industry from continuing to steal identities and destroy lives?
‘Britain is ahead of the US in having criminalised the sharing – but not creation – of deep-fakes and has some legislation designed to bring greater accountability to search engines and user-to-user platforms. But the legislation does not go far enough.’
‘Sophie Compton is a founder of the #MyImageMyChoice campaign against deep-fake imagery and director of Another Body, a 2023 documentary following female students seeking justice after falling victim to non-consensual deep-fake pornography. For her, search engines have a key role in disabling this abuse.
[A full programme about what happened to her and a fellow student may be found at: https://www.bbc.co.uk/iplayer/episode/m001w2jr/storyville-another-body-my-ai-porn-nightmare ]
‘However, according to Prof Hany Farid, a forensics specialist in digital images at the University of California, Berkeley, all of those parties indirectly making money from deep-fake abuse of women are unlikely to act. Their “moral bankruptcy” will mean they continue to turn a blind eye to the practice in the name of profits unless forced to do otherwise, he says.
‘As a gender-equity expert, it is also clear to me that there is something deeper and more systemic at play here.
‘My research has highlighted that male-dominated AI companies and engineering schools appear to incubate a culture that fosters a profound lack of empathy towards the plight of women online and the devastating impact that sexual deep-fakes have on survivors. With this comes scant enthusiasm for fighting the growing non-consensual sexual image industry.’
For another take on this issue go to:
So far, so bad. It’s yet more depressing evidence that violence against women, of which deep-fakes are an integral part, is on the rise, along with the attempts from all sides to push back on what little progress for women’s rights has been made in the past 50 years.
But just as alarming is the ease with which deep-fake imagery is being used to bamboozle the public with regard to political discourse. Given that a majority of younger people get their ‘news’ from anti-social media, and that deep-fakes – short, pithy videos – are custom-made for such platforms, this puts all democracies at risk. It is worth noting that when I was on various deep-fake software sites, pop-up ads kept appearing inviting me to join TikTok. The two go together like a horse and carriage.
The only way to combat this is much stronger legislation, as direct in intent as the laws relating to fraud and money-laundering. Anyone merging images using deep-fake software without the express written consent of both parties should be liable to prosecution. All the antisocial media platforms that allow deep-fake merged images without these consents in place should be prosecuted as well.
Meanwhile, we all need to be vigilant. Deep-faking is part of the armoury of authoritarian regimes. Russia, for instance, has been called out for spreading fake news about the so-called bed-bug epidemic in Paris (allegedly caused by Ukrainian refugees). We all need to check the sources of everything we read, hear or see in the real-world media. There used to be a simple rule in news gathering: you had to have at least two independent sources to confirm a story before it could be run.
As for antisocial media, I’d say: trust nothing you see or hear on any of these platforms. They’ve already lied to you, because they say they are not publishers when they palpably are. Best of all, wean yourselves off them entirely; they’re leeches on their users, audiences and followers, sucking the lifeblood out of real discussion, all in the name of data mining, selling you as a commodity to anyone who’ll pay the price. It’s a not-so-subtle form of mind-enslavement but, unlike the bodily kind, you can set yourself free.
This week: Tim started to read the excoriatingly brilliant Wifedom by Anna Funder. It unpicks in unflinching detail the root-and-branch misogyny that informed Eric Blair’s (George Orwell’s) writings. His wife, Eileen O’Shaughnessy, was, surprise, surprise, the genius behind the persona of a little man (although over six feet in height), out of his depth all his life and hating it. Once, she literally saved his life in civil-war-torn Spain. He never acknowledged that: he dared not. It’s a book about the unsung work of all women and it has rightly been called a masterpiece (should that, truly, be mistresspiece?). And, consider this, the story does not take away from George Orwell the writer; it simply adds much-needed clarification and context. And it poses the age-old question again: how do you judge an artist and their output set against their lived life? Remember this: the utterly sublime music of Wagner was