By Erica Kim, University of Alberta Law Student
The recent proliferation of Artificial Intelligence (AI) has extended to the legal context. AI is increasingly used to facilitate legal research, translation, and the drafting of court documents.[1]
This increase in AI use is particularly relevant when considering access to justice. Access to justice can be defined as the ability of people to address their law-related problems in ways that are consistent with fair legal standards and processes, and to obtain, understand, and act on information and services related to the law to achieve just outcomes.[2]
In the context of civil procedure, the concept of access to justice is typically used to ensure that procedural rules do not restrict the public’s access to the superior courts. For example, the Supreme Court of Canada in Trial Lawyers found that unreasonably high court hearing fees prevented access to justice, and the hearing fee scheme was deemed unconstitutional.[3]
Beyond physical access to courts, however, access to justice is also relevant to the preparation of pleadings. When a tool such as AI can facilitate efficient drafting of pleadings, especially for individuals who may not be able to afford counsel, the question arises whether AI-drafted pleadings comply with the rules on pleadings.
The use of AI to draft pleadings can increase access to justice but can also risk breaching rules on pleadings if used without caution. As such, users should only use AI as a supplementary tool and should ensure that their pleadings follow the court rules.
How AI Use for Drafting Pleadings Increases Access to Justice
The use of AI in drafting pleadings can increase access to justice by lowering costs for clients seeking legal help.[4] Reduced costs are possible because AI makes processes necessary for drafting pleadings like file management, legal research, and document review more efficient for counsel.[5]
Apart from lowering costs, AI can also increase access to justice by assisting self-represented litigants who may not be able to afford counsel. AI helps self-represented litigants research more efficiently in their preferred language and provides free guidance regarding how to prepare pleadings.[6]
Along with these benefits, however, come various drawbacks. As such, provincial and federal courts have released requirements regarding disclosure of AI use.
Requirements on Disclosure of AI Use
On June 23, 2023, the Court of King’s Bench of Manitoba released a Practice Direction addressing the use of AI in court submissions.[7] This Practice Direction requires court users to indicate in their materials when those materials were prepared with the help of AI.[8]
On May 7, 2024, the Federal Court also issued a Notice that addresses the use of AI in court proceedings.[9] This Notice requires parties to inform the Federal Court, and each other, if they have used AI to prepare a document filed with the Court.[10] The Notice acknowledges the importance of fair treatment of all parties, indicating that AI-related responsibilities are imposed on both represented and self-represented parties.[11]
These requirements indicate that Canadian courts recognize the potential dangers of AI use. These concerns are especially prevalent when AI is used to draft pleadings, which provide the foundational basis for litigation. As such, it is crucial that court users research the provincial rules on pleadings and ensure that their use of AI does not inadvertently breach these rules.
The Manitoba Rules on Pleadings
The Court of King’s Bench Rules (the “Rules”) govern civil proceedings in Manitoba.[12] Part VI of the Rules governs pleadings.
According to the Rules, each pleading must contain a concise statement of the material facts on which the party relies for its claim or defence.[13] However, the party may not include evidence in the pleadings.[14] As well, parties may raise any point of law in their pleadings, but can only plead conclusions of law if the material facts supporting them are pleaded.[15]
There are particular rules governing pleadings in a defence. In a defence, a party must admit every allegation of fact in the opposing party’s pleading that the party does not dispute.[16] Any allegation of fact affecting a party that is not denied in the party’s defence is deemed to be admitted, unless the party pleads that it has no knowledge of the fact.[17] As well, in a defence, a party must plead any matter on which it intends to rely to defeat the opposite party’s claim.[18]
Despite the benefits that AI provides when parties draft pleadings, such as increasing access to justice, the dangers lie in the potential for AI-drafted pleadings to breach the above rules.
How Using AI Can Breach the Rules on Pleadings
The shortcomings of AI when it is used in the legal context have become increasingly apparent in recent years. For example, in Zhang v Chen, a Vancouver lawyer used ChatGPT — a generative AI program — to prepare a notice of application.[19] She cited two cases in the notice, both of which were discovered to be non-existent: ChatGPT had hallucinated the cases.[20] The Supreme Court of British Columbia held that the lawyer should bear the additional expense caused to the other party due to her mistakes.[21] The Court also commented that generative AI is no substitute for a lawyer’s professional expertise, and that competence in the use of AI technology is critical.[22]
The lessons from Zhang can be applied to the use of AI to draft pleadings. If AI can hallucinate non-existent cases, it may also fail to capture the nuanced rules governing pleadings. For example, it may include evidence or unsupported conclusions of law in a pleading, which the Rules prohibit. An AI-drafted pleading may also fail to deny the allegations of fact that the drafting party intends to deny. This is especially problematic because the rules on denying allegations of fact differ between provinces. In Alberta, for example, every allegation of fact that is not admitted by the opposing party is deemed to be denied.[23] This differs from provinces like Manitoba and Ontario, where an allegation of fact that is not denied is deemed to be admitted.[24] Given these differences, AI may fail to follow the appropriate province’s rules when drafting a pleading. This could lead a party to admit an allegation of fact that it intended to deny, which could adversely affect its case.
How Court Users Should Proceed
Given the apparent risks of using AI to draft pleadings, court users should exercise caution and keep up to date with the provincial court rules.
To begin with, court users should not rely entirely on AI to draft their pleadings. While an AI-drafted pleading may appear to comply with the rules at first glance, AI cannot replace a court user’s own judgment.[25] Users should draft their pleadings themselves first, with reference to the provincial civil procedure rules, and only then use AI to refine their work or provide minor guidance.
Additionally, court users must abide by the Federal Court Notice and the Manitoba Court of King’s Bench Practice Direction. Even if AI was used for editing or other secondary purposes, users must disclose to the respective courts that it was used.
In all, AI is clearly a valuable tool that increases access to justice and facilitates the drafting of court documents. However, given the unpredictability of AI technology, court users must exercise caution when using it to draft pleadings. Researching the relevant court rules on pleadings, and using AI only as a supplementary guiding tool, will ensure that court users use AI effectively in a way that neither hinders their advocacy nor undermines the fairness of the judicial system.
[1] Office of the Commissioner for Federal Judicial Affairs Canada, “Demystifying Artificial Intelligence in Court Processes” (20 November 2024), online: <fja-cmf.gc.ca> [https://fja-cmf.gc.ca/COVID-19/Demystifying-Artificial-Intelligence-Demystifier-lintelligence-artificielle-eng.html].
[2] Government of Canada, “Access to Justice, Community Justice Help and Legal Empowerment” (15 August 2022), online: <justice.gc.ca> [https://www.justice.gc.ca/eng/rp-pr/jr/ecjh-eamjc/access-acces.html].
[3] Trial Lawyers Association of British Columbia v British Columbia (Attorney General), 2014 SCC 59 at paras 60–64 [Trial Lawyers].
[4] Office of the Commissioner for Federal Judicial Affairs Canada, “Use of Artificial Intelligence by Court Users to Help Them Participate in Court Proceedings” (20 November 2024), online: <fja.gc.ca> [https://www.fja.gc.ca/COVID-19/Use-of-AI-by-Court-Users-Utilisation-de-lIA-par-les-usagers-des-tribunaux-eng.html].
[5] Ibid.
[6] Ibid.
[7] Court of King’s Bench of Manitoba, “Practice Direction: Use of Artificial Intelligence in Court Submissions” (23 June 2023), online (pdf): <manitobacourts.mb.ca> [https://www.manitobacourts.mb.ca/site/assets/files/2045/practice_direction_-_use_of_artificial_intelligence_in_court_submissions.pdf].
[8] Ibid.
[9] Federal Court, “Notice to the Parties and the Profession: The Use of Artificial Intelligence in Court Proceedings” (7 May 2024), online (pdf): <fct-cf.gc.ca> [https://www.fct-cf.gc.ca/Content/assets/pdf/base/FC-Updated-AI-Notice-EN.pdf].
[10] Ibid at 1.
[11] Ibid at 3.
[12] Court of King’s Bench Rules, Man Reg 553/88 [Rules].
[13] Ibid, r 25.06(1).
[14] Ibid.
[15] Ibid, r 25.06(3).
[16] Ibid, r 25.07(1).
[17] Ibid, r 25.07(2).
[18] Ibid, r 25.07(5).
[19] Zhang v Chen, 2024 BCSC 285 at para 8 [Zhang].
[20] Ibid.
[21] Ibid at para 43.
[22] Ibid at para 46.
[23] Alberta Rules of Court, Alta Reg 124/2010, r 13.12(1).
[24] Rules, supra note 12, r 25.07(2); Rules of Civil Procedure, RRO 1990, Reg 194, r 25.07(2).
[25] Zhang, supra note 19 at para 46.