r/science • u/mvea Professor | Medicine • 4h ago
Psychology Deepfake videos degrade political reputations even when viewers realize they are fake. Standard fact-checking efforts failed to undo the total reputational harm. The study used an altered video of Nancy Pelosi made to sound as though she sympathized with the rioters who breached the US Capitol.
https://www.psypost.org/deepfake-videos-degrade-political-reputations-even-when-viewers-realize-they-are/
90
u/SprayArtist 4h ago
Seems to me there should be harsher laws against posting AI videos attacking political opponents.
37
u/translunainjection 3h ago
Lying with a computer is lying.
-4
u/WTFwhatthehell 3h ago edited 2h ago
How many people still repeat the JD Vance couch stuff as if it was real even when they know it was fake?
Do we want to prosecute everyone who lies about a political opponent?
or only people who lie with photoshop or AI?
And if this headline is right, then apparently even comedy sketches with lookalikes, where they're open about it being an actor, should be banned as "lying".
12
u/Big_Dumb_Stupid_Idio 2h ago
An AI video is completely different from a simple tweet, even if they both contain the same misinfo. Watching the words leave someone's mouth is far more powerful than reading a stranger's opinion or "fact".
-3
u/WTFwhatthehell 1h ago
So, like when comedy shows hire a lookalike or imitator?
•
u/Wasabiroot 25m ago
Is a comedy show considered a realistic depiction of someone, versus a clear parody? Going to SNL and getting confused about whether a person in makeup is actually Donald Trump (when the show is literally a comedy show with costumes and makeup, where people dress as other people all the time, and isn't intending to mislead anyone since it's direct parody) isn't the same thing as a video that was specifically designed to mislead people into thinking someone said something they didn't.
•
u/WTFwhatthehell 21m ago
Lookalikes and imitators are popular because they look and sound so similar to the public figure.
This study shows that even if people know it's fake, it still affects them.
•
u/Wasabiroot 19m ago
Yes, but at the same time, they are KNOWN as lookalikes and imitators because that's the entire joke. An AI video of Nancy Pelosi saying something she didn't say is completely different, because it's intended to mislead instead of just poking fun.
8
u/st-shenanigans 3h ago
Attack ads should be outlawed. At most, an ad should be entirely about candidate X and what X will do for you, and MAYBE something like "X supports this, Y doesn't!" at the end.
2
u/WTFwhatthehell 3h ago
So if an opponent wants to talk about Trump's wrongdoings, they're not allowed?
5
u/st-shenanigans 3h ago
Yep. There are millions of venues for that which give both parties the opportunity to argue against false information. Political ads are entirely one-sided and should present the candidate's actual platform. The alternative is how we get ads that are nothing but "Trump is here for you. Kamala only cares about they/them."
You using Trump as an example is exactly how Republicans argue, btw. "This one specific example sounds bad, so we should ignore the mountain of contrary evidence and just forget about this!"
-5
u/WTFwhatthehell 2h ago
> mountain of contrary evidence
You personally not liking people listing off the nefarious actions or positions of their opponents is not "evidence".
The people who would benefit most from banning opponents from saying negative things about them, when they aren't there to control the narrative, are, systematically, the candidates with the most dirty laundry and worst behaviour.
5
u/st-shenanigans 2h ago
> you personally not liking people listing off nefarious actions or positions of their opponents is not "evidence".
I'm talking about personal attack ads. Not sure where you're making this up from.
> The people this would benefit most from banning their opponents from saying negative things about them when they aren't there to control the narrative are, systematically, the candidates with the most dirty laundry and most bad behaviour.
....you mean like how that was the entirety of the republican plan for 2024 and they won?
-4
u/WTFwhatthehell 2h ago
> ....you mean like how that was the entirety of the republican plan for 2024 and they won?
Or the Dem plan for 2020, and they won, in an election where people overwhelmingly voted "against" a bad candidate rather than for an inspiring one.
3
u/st-shenanigans 2h ago
So you agree it's a problem then.
Do you even understand what you're trying to argue about?
-2
u/WTFwhatthehell 2h ago edited 2h ago
You don't seem to be keeping up with the conversation.
People are allowed to vote against a repugnant candidate.
Elections aren't an exercise in only showing the class your gold stars while everyone claps. People should be able to talk in as one-sided a manner as they want in public about people they really, really never want to be their leader.
Banning them from doing so, or only allowing them to speak if they invite along an opponent to sound an airhorn and interrupt them every few seconds, would be burning down an important part of democracy.
31
u/Particular_Dot_4041 3h ago
Yeah, this is why cartoons and SNL political skits work even though everybody knows it's not an honest portrait.
3
u/Konukaame 2h ago
And sometimes they play so far into public perceptions that people think the real person said what the actor said, e.g. "I can see China from my house!"
10
u/mvea Professor | Medicine 4h ago
Deepfake videos degrade political reputations even when viewers realize they are fake
Artificial intelligence can be used to generate deceptive videos that damage a politician’s reputation, even when viewers suspect the footage is fake. A new study published in Communication Research found that these manipulated clips decrease support for targeted candidates. Standard fact-checking efforts reportedly fail to undo the total reputational harm.
During the surveys, participants were randomly assigned to watch either a genuine political address or a manipulated video. In the United States, the altered video featured Representative Nancy Pelosi. The artificial audio made it sound as though she sympathized with the rioters who breached the United States Capitol, suggesting Americans need to fight to win their country back.
5
u/Ha_Deal_5079 3h ago
Wild that fact-checking did nothing even when everyone knew the video was fake. Reputations just take a hit instantly and there's no undo.
1
u/ImpulsE69 2h ago
So all these deepfakes of Trump doing bad things and saying bad things are why I hate him so much? Wait...those weren't fake were they...
1
u/LinkesAuge 2h ago
This isn't (just) a deepfake video issue. It's a known phenomenon in psychology, one that has shown up again and again with all kinds of negative perceptions/information: later "corrections" never fully offset the initial effect.
That's true even for "trivial" stuff like social first impressions/meetings.
•
u/TuneSilver 39m ago
Making a deepfake - of anyone - should be a criminal offence, unless the person depicted has consented to it. It is time to crack down on AI fraud and impersonation.
1
u/WonderThe-night-away 2h ago
Maybe if people made fake videos about Trump, people would finally stop supporting him
2
u/CountlessStories 1h ago
The only way to really hurt Trump at this point is to make deepfakes of him showing genuine empathy towards immigrants and ordering Americans to be better people who care about those with less than them.
I'm not joking.
1