Barack Obama on Deepfakes

Why a Former US President is Speaking Out on Misinformation

Barack Obama is one of the most photographed people alive, and although this level of publicity is to be expected of a US president, it has made Obama an easy target for deepfake attacks because of the sheer amount of visual and audio data of him in the public domain. Elected in 2008, Obama secured the White House during the meteoric rise of the Internet and, perhaps more importantly, the smartphone. Between his many TV appearances and set-piece speeches, from the State of the Union to his inaugurations, Obama has been a central figure in the public eye for much of the 21st century.

To provide some context, deepfakes are digitally manipulated media that take a person's likeness and mimic them to create fake photos, audio recordings or videos. They are built from pre-existing data of that person, and the more data there is, the more convincing the deepfakes will be. Deepfakes have gradually become a talking point: as deepfake software grows more sophisticated, telling a real video from a fake one is becoming increasingly challenging.

This has led to hours of genuine footage being manipulated to create fake videos and audio recordings of Obama. Five years ago the BBC released a video showing how researchers at the University of Washington built a program that modelled the movement of Obama's mouth while speaking, producing an entirely fake recording of the former president.

Using free photo-editing software, a person could, in a matter of minutes, create a deepfake showing Obama somewhere he has never been, or talking to someone he has never met. Deepfake videos take more skill to create, but programs capable of generating them are also available to the public.

In a recent interview with his former senior White House adviser David Axelrod, Barack Obama discussed deepfakes and the easy target he has become, given the vast quantity of publicly available material that can be manipulated.

In this article we will discuss the points Obama made in the interview with Axelrod, what we can do to prevent the spread of misinformation in a digital world, and a solution that ensures video interviews remain tamper evident.

Barack Obama on Digital Misinformation 

The former US president had much to say about deepfakes and the threat they pose of spreading misinformation to the public. He said that the technology to create these imitations is here now, and will be a major problem in the next election cycle in 2024. Although the US election cycle is a major event, this is becoming a general threat across society, targeting anyone whose personal data is online. Social media sites host a significant amount of content that can be harvested and digitally manipulated, and the resulting deepfakes can then be used to threaten or coerce, which has become a serious problem for law enforcement to manage. In the political sphere, deepfakes can spread misinformation by manipulating voters into believing the system is corrupt or rigged, or through fake videos and recordings of politicians.

He also spoke about the misinformation spread during the COVID pandemic, but said that most people now know that not everything they receive on their phones is truthful. It is important to consult multiple sources to establish the truth.


Deepfake Prevention Strategy  

In the same interview with Axelrod, Obama pointed out that there is much the general public needs to do to prepare for identifying deepfake media, especially harmful deepfakes.

Obama suggested developing 'digital fingerprints' that watermark anything posted online, so that authentic media can be identified.

He and Axelrod also noted the public's tendency to watch only news outlets whose views they share, fostering reliance on a single news source. To counter this, it is vital to use multiple sources of information, ideally with verification of the media's legitimacy from experts or from the original source.

Deepfake Prevention with Mea: Connexus 

Although no program yet exists that can quickly and accurately identify a deepfake, there is a solution that uses Blockchain technology to keep content immutable. Mea: Connexus is a tamper-evident, secure (to government standards) remote interview tool that provides methods to prove the authenticity of digital interview content to any party. This makes Mea: Connexus an ideal platform wherever the content of an interview could later be challenged as false or fake, for instance in law enforcement, investigative and employment interviews, tribunals and other justice use cases.

Using a patented application of cryptography and Blockchain technology, Mea: Connexus ensures that: 

  • If the content of the interview is challenged as being fake or edited, it is easy to prove that it has not been tampered with in any way. 
  • A video interview file shared outside of your organisation is an accurate and unedited version of the original interview. 
  • There is confidence in the integrity of the file before it is used within your organisation's other systems. 
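Mea: Connexus's patented method is not public, but the general principle behind this kind of tamper evidence can be sketched with standard cryptography: compute a fingerprint (a cryptographic hash) of the interview file at recording time, anchor that fingerprint somewhere immutable such as a blockchain, and later re-hash the file to check it still matches. The function names below are illustrative, not part of any real product.

```python
import hashlib


def fingerprint(path: str) -> str:
    """Compute a SHA-256 digest of a file; editing even one byte changes it."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files never sit fully in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify(path: str, anchored_digest: str) -> bool:
    """Re-hash the file and compare against the digest anchored at recording time."""
    return fingerprint(path) == anchored_digest
```

Because SHA-256 is collision-resistant, a matching digest is strong evidence the file is byte-for-byte identical to the original; a mismatch proves it has been altered. Anchoring the digest on a blockchain (rather than alongside the file) is what stops the record itself from being rewritten.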

Book a tailored demonstration for your organisation. 

Get in touch with us to start a free trial so you can try Mea: Connexus for yourself. 
