Lawyers in Manitoba will now have to disclose whether they used artificial intelligence to prepare court documents for the Court of King's Bench, following a legal mishap south of the border in which an AI program generated fake case law.
Manitoba Chief Justice Glenn Joyal issued the practice direction last week, which acknowledged that artificial intelligence might be used in court submissions in the future.
“While it is impossible at this time to completely and accurately predict how artificial intelligence may develop or how to exactly define the responsible use of artificial intelligence in court cases, there are legitimate concerns about the reliability and accuracy of the information generated from the use of artificial intelligence,” the order states.
Though Joyal's directive doesn't mention it, it comes after a case in New York where attorneys blamed ChatGPT for their submission of fictitious legal research in an aviation injury claim.
Lawyers in Manitoba who spoke to CBC News said they welcomed the decision, though they weren't aware of AI being used to generate court documents within the province.
Chris Gamby with the Criminal Defence Lawyers Association of Manitoba said he thinks the directive is a good proactive measure given AI’s increasing popularity.
“I think that this technology at this point is in its infancy. It has great potential, it also has the possibility of having a number of drawbacks and we really don’t know what those are yet and it’s developing so quickly,” he said.