
Judge delivers opinion rife with ‘indisputable factual inaccuracies,’ prompts charges he used AI

By Bob Unruh, WorldNetDaily


As artificial intelligence expands across the world, there’s going to be a point when AI turns into genuine stupidity. This might be that point:

A judge is suspected of refusing to do the job for which he is paid and of using artificial intelligence, instead of legal analysis, to write a court opinion.

According to a report at Law and Crime, the lawyers in the case were “bewildered” by the statements from Judge Henry Wingate.

He’s a federal judge in Mississippi.

His order, from just a week ago, granted a request for a temporary restraining order from education groups, such as the Mississippi Association of Educators, that stops the state government from enforcing several parts of a new law to remove “diversity, equity and inclusion” ideologies.

But the ruling contained “apparent indisputable factual inaccuracies.”

The report suggested the judge may have been using artificial intelligence to write the comments.

A report at Not the Bee explained that the order later was corrected because it had “multiple errors, which the defendants noted in an unopposed motion to clarify.”

For example, the judge got the names of both the plaintiffs and the defendants wrong. He recited “allegations” that do not appear in the complaint at issue and are not supported by evidence. He inserted language into the disputed law that does not appear in the original. And he included testimony from four people whose statements were not in the record.

The defendants “respectfully request the court take appropriate steps to clarify or correct the following apparent and indisputable factual inaccuracies.”

Not the Bee commented, “This isn’t the first time this has happened, and it definitely won’t be the last: To put this in plain English, a black federal judge (likely) had AI help write an order that temporarily stopped laws passed by the state legislature and governor that would get rid of DEI programs.”

The report continued, “I know legalese is boring and nerdy, but think about the implications here. The residents of a state elected politicians to represent them in the legislature. Those politicians enacted the will of the people by writing a bill that defunds and removes race-based ‘equity’ programs meant to discriminate against residents with European heritage as payback for past injustices against non-Europeans. That bill was then signed and passed into law by the governor.

“Then, at the finish line, a federal judge (a Reagan appointee, no less!) temporarily stops the bill from being implemented. This could very well be in his constitutional authority, but he (or more likely his clerks) decided they can’t be bothered to do their jobs and explain themselves. Instead, they (allegedly) had a computer language model make-up fake rulings out of thin air. Are you starting to see how damaging this could be?”


Bob Unruh

Bob Unruh joined WND in 2006 after nearly three decades with the Associated Press, as well as several Upper Midwest newspapers, where he covered everything from legislative battles and sports to tornadoes and homicidal survivalists. He is currently a news editor for the WND News Center, and also a photographer whose scenic work has been used commercially. Read more of Bob Unruh’s articles here.



