AI deepfakes could increase divorce costs and endanger evidence in court


Americans seeking a divorce and custody of their children could rack up unforeseen court costs trying to refute artificial intelligence (AI) deepfake videos, photographs and documents, according to a leading family law attorney.

Michelle O'Neill, co-founder of the Dallas-based law firm OWLawyers, told Fox Business that courts are seeing a "real increase" in false evidence, often created with AI. The problem, she said, is becoming more common, and judges are being trained at schools and conferences to remain vigilant.

One type of fraudulent evidence is AI-generated revenge porn, including fake images and videos of individuals appearing to engage in intimate acts. O'Neill notes that while deepfakes have mostly made news when they target celebrities, the problem also affects ordinary people going through breakups or divorces in family court.

The use of artificial intelligence to generate fake images and videos could cost clients in divorce proceedings. (iStock / Kirill Kudryavtsev / AFP via Getty Images / Getty Images)

O'Neill's claim that this type of AI-generated content is "exploding" onto the scene is supported by statistics showing that the prevalence of deepfakes has grown by roughly 900% per year since 2019.

"When a client brings me evidence, I have to question my own client more than I ever have: Where did you get this? How did you get it? Do you know where it came from?" O'Neill said.

The problem disproportionately affects women. AI researchers have consistently found that between 90% and 95% of all online deepfakes are nonconsensual porn, and roughly 90% of that is nonconsensual porn featuring women.

Despite the staggering numbers, O'Neill says social media platforms have been slow to act.

First lady Melania Trump spoke on Capitol Hill in early March for the first time since returning to the White House, participating in a roundtable with lawmakers and victims of revenge porn and AI-generated explicit content.

Congress is currently pushing to penalize such internet abuse, including the sharing of nonconsensual explicit images.

A green wireframe model covers an actor's lower face during the creation of a synthetic facial reanimation video, known alternatively as a deepfake, in London, Britain, Feb. 12, 2019. (Reuters TV / Reuters / Reuters Photos)

The TAKE IT DOWN Act is a bill introduced in the Senate by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn. The bill passed the Senate unanimously earlier in 2025, and Cruz said Monday he believed it would clear the House before being signed into law.

As the government pursues new laws, O'Neill says AI used to create fraudulent and explicit content remains a "real threat" to the court system.

"The integrity of our very court system depends on the integrity of the evidence that you can go in and present. If you can't rely on the integrity of the evidence presented to the judge, and the judge can't rely on the integrity of the evidence they receive, then our court system can absolutely be threatened by the very existence of artificial intelligence," she said.

O'Neill also notes that AI takes an economic toll on Americans who fall prey to fraudulent evidence in court. An individual challenging the authenticity of evidence may now need to pay a forensic video expert to test and verify it.

AI-generated fake evidence poses a serious risk to the judicial system, according to family law attorney Michelle O'Neill. (Getty Images)

Fraudulent evidence can even extend to videos purporting to show child abuse when two parties are fighting over custody. If a party does not have the funds to prove that such evidence of abuse is AI-generated, judges are left to decide whether to take that party's word or believe the footage placed before the court.

"What happens to the people who don't have the money [to disprove that]? So not only do we have a threat to the integrity of the court system, but we also have an access-to-justice problem," O'Neill said.

The family law attorney noted that judges are primarily seeing AI used to create false documents, such as counterfeit bank records or drug test results.

One judge also told O'Neill that they had encountered a falsified audio recording that cast the other side in a negative light. The recording was not of high enough quality to pass as genuine, so the judge rebuked the individual and the evidence was excluded.

With the rapid advance of this technology, however, O'Neill fears that the gap between what is real and what is AI-generated will shrink.

"I think this is a problem at many levels of our society. And, you know, awareness is something that is very important," she said.
