Nailed it again. You're a freakin carpenter.
Excuse me while I have a nerd moment here. Some recent research validates your point - https://journals.sagepub.com/doi/full/10.1177/09637214241280907
The tl;dr - people are actually pretty good at telling true from false, but they're more skeptical of information that doesn't fit with their beliefs. So it's not that folks'll believe anything - it's that they tend to reject some really important things.
I mean, if you buy that whole "science" thing, that is. There's some irony here.
Completely. Though, I just took a quick glance at the research and they actually excluded deepfakes: "An exception is the identification of artificial intelligence–generated deepfakes, where people often show chance-level performance (Nieweglowska et al., 2023). The current analysis focuses on verbal information involving propositions about states of affairs that may be true or false, which is different from deepfakes." Though that was 2023 -- wonder if that has changed.