As artificial intelligence expands across the world, there's going to come a point when AI turns into genuine stupidity. This might be that point:
A judge is suspected of refusing to do the job for which he is paid, using artificial intelligence instead of legal analysis to write a court opinion.
According to a report at Law&Crime, the lawyers in the case were "bewildered" by the statements from Henry Wingate, a federal judge in Mississippi.
His order, issued just a week ago, granted a request for a temporary restraining order from education groups, including the Mississippi Association of Educators, that stops the state government from using several parts of a new law to remove "diversity, equity and inclusion" ideologies.
But the ruling contained “apparent indisputable factual inaccuracies.”
The report suggested the judge may have been using artificial intelligence to write the comments.
Yikes! Another hallucinated opinion, this time justifying a temporary restraining order enjoining enforcement of a State law. The rush to issue orders like this undermines the judiciary. Even worse–apparently the “corrected” opinion still has a hallucinated case . . . pic.twitter.com/qfnRaMtSGQ
— Eric W. (@EWess92) July 30, 2025
Apparently there is little recourse, short of an appellate court (or perhaps a judicial complaint). When attorneys have engaged in behavior like this, they have faced serious sanctions. pic.twitter.com/VlbwuZIUlc
— Eric W. (@EWess92) July 30, 2025
A report at Not the Bee explained that the order later was corrected because it contained "multiple errors, which the defendants noted in an unopposed motion to clarify."
For example, the judge got the names of the plaintiffs wrong, and the names of the defendants wrong. He recited "allegations" that do not appear in the complaint at issue and are not supported by evidence. He inserted language into the disputed law that does not appear in the original, and he included testimony from four people whose statements were not in the record.
The defendants “respectfully request the court take appropriate steps to clarify or correct the following apparent and indisputable factual inaccuracies.”
Not the Bee commented, “This isn’t the first time this has happened, and it definitely won’t be the last: To put this in plain English, a black federal judge (likely) had AI help write an order that temporarily stopped laws passed by the state legislature and governor that would get rid of DEI programs.”
The report continued, “I know legalese is boring and nerdy, but think about the implications here. The residents of a state elected politicians to represent them in the legislature. Those politicians enacted the will of the people by writing a bill that defunds and removes race-based ‘equity’ programs meant to discriminate against residents with European heritage as payback for past injustices against non-Europeans. That bill was then signed and passed into law by the governor.
“Then, at the finish line, a federal judge (a Reagan appointee, no less!) temporarily stops the bill from being implemented. This could very well be in his constitutional authority, but he (or more likely his clerks) decided they can’t be bothered to do their jobs and explain themselves. Instead, they (allegedly) had a computer language model make-up fake rulings out of thin air. Are you starting to see how damaging this could be?”
Author: Bob Unruh
This content is courtesy of, and owned and copyrighted by, https://www.wnd.com and its author. This content is made available by use of the public RSS feed offered by the host site and is used for educational purposes only. If you are the author or represent the host site and would like this content removed now and in the future, please contact USSANews.com using the email address in the Contact page found in the website menu.