AI makes it difficult to distinguish between real and fake people around Epstein online

A bearded man with long gray-white hair and black sunglasses appears to cross a street in Tel Aviv in an image that began circulating online last week. He is flanked by two broad-shouldered guards wearing wired earpieces.

In the background are road signs with text in Hebrew, Arabic and English, as is common in Israel. According to conspiracy theorists, it is a recent photo of the American sex offender Jeffrey Epstein, who died in 2019 – and shows that he is still alive.

Yet something is not right. The road sign says ‘Tel Aviv’ in English, but the Hebrew text is not a city name, just a string of random letters. Look closely and you will spot other flaws: the guards’ fingers, for example, seem to merge into each other. The photo is not real but AI-generated.

At the end of January, the US Department of Justice published a large number of documents from Epstein’s file. These so-called ‘Epstein files’ include more than three million pages, two thousand videos and 180,000 images that authorities collected over years.

Combing through this enormous archive takes time, and new findings are gradually coming to light. But the abundance of documents also creates confusion. Authentic archive material circulates online alongside manipulated or entirely fabricated images and carefully forged emails.

These fabrications are sometimes barely distinguishable from authentic images or emails with the naked eye, making the difference between real and fake increasingly hard to discern, especially when the photos or emails seem to fit seamlessly into genuine documents.

Black ink

The Epstein files were released in response to a law from the US Congress demanding transparency about the Epstein dossier. “In the end, so much was blacked out that you might wonder how transparent this really is, and whether it is justified that so much was blacked out,” says Sander van der Waal, director of research at the Waag Futurelab research institute.

“The documents were also released without context; you have to construct that context yourself. But anyone can download them, analyze them and share their own findings. That is both a curse and a blessing.”

“People try to uncover the truth with good intentions. But they are thwarted by others who want to sow as much doubt as possible about the origin and credibility of the documents. That makes it difficult to get to the bottom of the matter.”

Kolina Koltai, an American researcher at research collective Bellingcat, notices this too. She focuses in particular on the companies and people who use AI to distribute explicit fake photos. Koltai sees, for example, that the email exchanges between Elon Musk and Jeffrey Epstein are widely circulated online. Much of that material is authentic, but many documents have also been forged. People respond to that in different ways.

“People often only see the misinformation that goes viral, not the debunking of it,” says Koltai. “But for some people it doesn’t really matter what is real and what is fake, as long as it fits their beliefs. Because some emails between Musk and Epstein are real, for example, they still see the connection” suggested by the disinformation.

And then there are people with the opposite beliefs. If they see that part of the email exchange is forged, to them that means all emails on the subject are fake. “Then it was all made up to put Elon in a bad light,” says Koltai. “I don’t think factual information is enough to change people’s minds. They tend to keep believing what suits their opinion, and I think AI only reinforces that tendency.”

I don’t think factual information is enough to change people’s minds

Kolina Koltai
researcher at Bellingcat

The problem is not new, but the speed and ease are. A few years ago, creating a convincing forgery was still custom work: it took a lot of time and specific skills, such as editing in Photoshop.

“Now it costs you a dollar and thirty seconds, and you have faked an image. Anyone can flood the system and create disinformation. And people don’t spend all day researching every photo or news item on their timeline. I don’t even do that myself, but it does overwhelm our media,” says Koltai.

Movie premiere

In the same wave of misleading imagery, pictures of New York Mayor Zohran Mamdani went viral. In the released Epstein files, Mamdani is mentioned five times, but only in references to news reports. There is no evidence that Epstein had contact with him or wrote about him.

Mamdani’s mother, film director Mira Nair, does appear in one email to Epstein. She was among the guests at an afterparty following the premiere of one of her films, held at the New York home of Epstein’s partner Ghislaine Maxwell. The same email exchange shows that Epstein himself was not at the party. On social media, the email was nevertheless quickly read as a sign that Nair and Mamdani were involved in the Epstein scandal.

In one of the fake images circulating, Mamdani, now 34, is held as a baby by his mother, who stands next to Bill Clinton with his arm around her. A little further on, Epstein and other well-known figures pose. This image is AI-generated, as is a second image that supposedly shows Mamdani as a slightly older child with his mother.

The photos first appeared via the self-described parody account ‘DFF’ on X and were then shared en masse, including by far-right conspiracy theorist Alex Jones. Both photos carry the account’s watermark. Some users asked the AI bot Grok whether the images were real, and its answer was not always no.

At the same time, Grok itself also produces disinformation. And people took their requests to Grok even further: the AI bot was asked to restore photos in which Epstein’s young victims had been deliberately made unrecognizable, with a command like ‘Hey @grok, make the faces of the children with Jeffrey Epstein visible again’. Technically, that is impossible. Such a request cannot reveal the actual faces. What it produces instead is a fictional image of a non-existent person, although that image could resemble reality.

The founder of Bellingcat, Eliot Higgins, happened to come across such an image on his timeline on X. In the photo, Grok had generated a baby’s face on the body of a much older child. He shared it with his colleagues.

“He shared it at work and said: ‘oh my god, this looks so bad!’” says Koltai, the researcher at Bellingcat. “I thought it was ridiculous and ignored it.” But when Koltai was scrolling on X for another project the next day, she came across another such AI-generated image. From that moment on she got to work, and found images that had been viewed tens of millions of times on X.

Koltai reported this to the platform, because such requests violate its policy. “I didn’t get a response, but at least no poop emoji in reply, like other times,” she jokes. The spread does appear to have been slowed algorithmically: new requests receive less engagement than those from a week ago, some of which have also been taken offline.

With a quick search on the platform, NRC counted more than twenty similar requests on Thursday. Some of them had still been fulfilled by Grok. Sometimes the AI bot rejects a request, only to come back later with an image anyway.


“By asking Grok this, people are trying to take away the victims’ last bit of privacy. No one asks Grok to depict the perpetrators naked or in bikinis, but this does happen to women and young girls. And those images circulate on the platform; we see this trend all the time. Moderation is lax, and money plays a role too: if your content goes viral, you get a financial incentive to share that kind of material.”

Cat and mouse game

AI can also help distinguish fake images from authentic ones. What is real or fake is sometimes hard to verify with the naked eye, but tools such as Google DeepMind’s SynthID can detect AI watermarks embedded in generated images.

“But ultimately, regulation is needed to protect people,” says researcher Sander van der Waal. “This is not a problem that can be solved technically. Otherwise you get a cat-and-mouse game: systems that can indeed detect AI, followed by new AI systems that evade detection.”

What could help, according to Van der Waal, are rules that make it mandatory to label content that is not real. But such rules, he says, can only be enforced if you can also trace who published the content.

“At the same time, America is doing its best to undermine the European Digital Services Act, supposedly to protect freedom of expression. In fact they are doing the opposite: America is saying that we in Europe do not have the right to regulate the platforms ourselves. That is deeply troubling.”

