
(Video) New Digital Storytelling Project from MIT Educates the Public About the World of Deepfakes

The Massachusetts Institute of Technology (MIT) has made a highly realistic manipulation of audio and video, also known as a deepfake, that shows Richard Nixon announcing that astronauts Neil Armstrong and Edwin Aldrin had died in a moon landing disaster and would “stay on the moon to rest in peace”.

The deepfake video shows the 37th U.S. President delivering a real contingency speech written in 1969 for a scenario in which the Apollo 11 crew could not return to Earth.

Sophisticated artificial intelligence was used to capture the movements of an actor reading the script and transfer them onto footage of Mr. Nixon, making it look as though he was the one actually delivering the speech – when he was not.

The project, called “In Event of Moon Disaster”, is a storytelling project from MIT’s Center for Advanced Virtuality.

Halsey Burgund, the project’s co-director from the MIT Open Documentary Lab, hopes that it will raise awareness about the dangers of spreading fake news, especially fakes that are well-produced and very convincing.

Burgund hopes the project will teach people not to be easily convinced by whatever they see online and to always seek solid proof before sharing.

Few people can recognize a digitally manipulated video, especially one with realistic visuals that can trick the eye.

The complete deepfake is featured on the project’s website. The team worked with a voice actor and Respeecher, a company that produces synthetic speech using deep learning techniques.

The team also worked with Canny AI, which used video dialogue replacement techniques to study and replicate the movements of Nixon’s mouth and lips.

Thanks to these collaborations and sophisticated techniques, the resulting deepfake is highly believable.

“Media misinformation is a longstanding phenomenon, but, exacerbated by deepfakes technologies and the ease of disseminating content online, it’s become a crucial issue of our time,” says D. Fox Harrell, professor of digital media and of artificial intelligence at MIT and director of the MIT Center for Advanced Virtuality, part of MIT Open Learning.

“With this project — and a course curriculum on misinformation being built around it — our powerfully talented XR Creative Director Francesca Panetta is pushing forward one of the center’s broad aims: using AI and technologies of virtuality to support creative expression and truth.”

Aside from the film, moondisaster.org has released multiple interactive and educational resources about deepfakes. The team partnered with artists, journalists, filmmakers, designers, and computer scientists to help media consumers better understand deepfakes: how they are made, the dangers of misinformation, and how to identify and combat it once it is out.

Scientific American released a documentary on the project and how it came to be.

The project is supported by the MIT Open Documentary Lab and the Mozilla Foundation, which awarded “In Event of Moon Disaster” a Creative Media Award last year. These awards are part of Mozilla’s mission to realize more trustworthy AI in consumer technology. The latest cohort of awardees uses art and advocacy to examine AI’s effect on media and truth.

“It’s our hope that this project will encourage the public to understand that manipulated media plays a significant role in our media landscape,” says co-director Burgund, “and that, with further understanding and diligence, we can all reduce the likelihood of being unduly influenced by it.”

Source: Moondisaster.org, News MIT, Scientific American


Copyright © 2021 Siakap Keli Sdn. Bhd.