No one’s ready for this: Our basic assumptions about photos capturing reality are about to go up in smoke.

Everyone who is reading this article in 2024 grew up in an era where a photograph was, by default, a representation of the truth. A staged scene with movie effects, a digital photo manipulation, or more recently, a deepfake — these were potential deceptions to take into account, but they were outliers in the realm of possibility. It took specialized knowledge and specialized tools to sabotage the intuitive trust in a photograph. Fake was the exception, not the rule.

If I say Tiananmen Square, you will, most likely, envision the same photograph I do. This also goes for Abu Ghraib or napalm girl. These images have defined wars and revolutions; they have encapsulated truth to a degree that is impossible to fully express. There was no reason to express why these photos matter, why they are so pivotal, why we put so much value in them. Our trust in photography was so deep that when we spent time discussing veracity in images, it was more important to belabor the point that it was possible for photographs to be fake, sometimes.

This is all about to flip — the default assumption about a photo is about to become that it’s faked, because creating realistic and believable fake photos is now trivial to do. We are not prepared for what happens after. — Sarah Jeong, The Verge


Published by Dennis G. Jerz
Tags: ai