Friday, January 11, 2019
Once trust in reality is broken, it's a swift descent into chaos.
A staffer at local Fox affiliate Q13 has been fired after the station aired what appears to be a doctored video of President Donald Trump's Tuesday night speech from the Oval Office.
The video was changed to make it look as if Trump were sticking his tongue out languidly between sentences. In addition, the colors in the video look more saturated, making the president's skin and hair appear orange.
During the speech, Trump said, “Hopefully, we can rise above partisan politics in order to support national security,” before briefly licking his lips and continuing to speak.
In the video broadcast on Q13, it appears that Trump lets his tongue hang out, resting it on his lower lip for an unusually long time.
“This does not meet our editorial standards and we regret if it is seen as portraying the president in a negative light,” Q13 news director Erica Hill wrote in an emailed statement early Thursday.
Later Thursday morning, Hill released another statement saying, “We’ve completed our investigation into this incident and determined that the actions were the result of an individual editor whose employment has been terminated.”
It is unclear whether the employee created the doctored video or just allowed it to go on air.
A side-by-side comparison posted on MyNorthwest.com shows the difference between what aired live on CNN and the clip shown later on Q13.
Altering and manipulating images has become all the more easy and tempting in today’s world of fractured politics and accessible technology. And this isn’t the first time a doctored video has made headlines.
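To give a sense of just how accessible that technology is, here is a minimal sketch, in Python with the free OpenCV library, of the two kinds of changes described above: boosting color saturation across a clip and slowing one short segment by duplicating its frames. This is not the tool used on the Q13 clip; the file names, frame numbers, and gain value are placeholders for illustration only.

```python
# Hypothetical sketch: boost saturation and slow a chosen segment of a video.
# Not the method used at Q13; paths and frame numbers below are placeholders.
import cv2
import numpy as np

SRC, DST = "speech.mp4", "altered.mp4"   # placeholder input/output files
SLOW_START, SLOW_END = 300, 330          # placeholder frame range to slow down
SATURATION_GAIN = 1.4                    # >1.0 exaggerates skin and hair color

cap = cv2.VideoCapture(SRC)
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(DST, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Boost saturation in HSV space, then convert back to BGR.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * SATURATION_GAIN, 0, 255)
    frame = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
    # Write frames in the chosen window twice to create a drawn-out, slowed effect.
    repeats = 2 if SLOW_START <= i <= SLOW_END else 1
    for _ in range(repeats):
        out.write(frame)
    i += 1

cap.release()
out.release()
```

A few dozen lines of freely available code, in other words, can produce the sort of subtle alteration that aired on Q13, which is part of what makes such edits so hard to guard against.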
CNN reporter Jim Acosta’s access to the White House was temporarily suspended in November when Press Secretary Sarah Huckabee Sanders said Acosta put “his hands on a young woman just trying to do her job as a White House intern.” Sanders tweeted a video snippet that appeared to show Acosta chopping the intern’s arm with his hand as she tried to take the microphone from him. But analysis by fact checkers and researchers noted that the version Sanders tweeted had been altered to distort and exaggerate Acosta’s movements.
Last year, a video of Marjory Stoneman Douglas High School shooting survivor Emma Gonzalez ripping a copy of the U.S. Constitution whipped conservatives into a froth. Gonzalez never did such a thing; the original picture was of her ripping a paper gun target. The video had been doctored and spread via social media.
In 2010, the Economist got into trouble for altering a photo to remove people standing on a beach with former President Barack Obama. Obama and two other people were at a Louisiana beach, with oil platforms in the distance. A photo editor at the magazine cropped out one person and digitally removed the other, creating the illusion of Obama standing alone at water’s edge contemplating the damage.
More common these days, however, are images altered by social-media users and circulated to stoke political ire, as the Gonzalez video did; to spread misinformation or disinformation; to target and harass particular people. For example, a recent Washington Post story explained how realistic-looking pornographic videos are being digitally generated to humiliate women. A fake sex tape, with “Wonder Woman” star Gal Gadot’s face swapped onto someone else’s body, circulated in 2017.
These forgeries, known as “deepfakes,” can be hard to spot. BuzzFeed and Quartz have published guides to help you sort fact from fiction.
Researchers at the University of Washington’s Paul G. Allen School of Computer Science & Engineering said in 2017 that they had developed algorithms that allow a user to turn audio clips into “lip-synced video.”
The UW’s technology is different from the tools that would have been used to doctor the Trump video that aired on Q13. But the announcement raised immediate concerns about the potential for malicious uses. One of the UW researchers defended the invention in a TED talk last year.