

AI raises billing questions for US lawyers

08 Aug 2024 | Technology


A task force on artificial intelligence (AI) set up by the American Bar Association (ABA) has warned lawyers that profiting from time savings made by the technology may be in breach of its rules.

In a discussion of the ethical issues raised by the use of generative AI models, the report points out that a brief produced by prompting an AI program might take substantially less time to complete than one written directly by a lawyer, even after the lawyer has checked all of the citations.

The report adds that profiting from those time savings may violate the ABA's rules of professional conduct if a lawyer who has agreed to bill clients by the hour does not pass on the reduction in billable time created by generative AI efficiencies.

Growth slowing

The report finds that extractive AI, which provides analysis or information based on a set of data fed into the model, is the most common AI tool in use in legal departments.

Legal practices mainly use such technology for managing data and documents, or for legal research and analysis.

The ABA team finds, however, that several issues have slowed the growth of generative AI for lawyers.

“Well-publicised cases demonstrating improper use of the technology – including the imposition of discipline and sanctions against lawyers using generative AI – have led a number of law firms to limit the preparation of work product using generative AI,” it states.

Confidentiality

The report warns that lawyers must be careful not to reveal information relating to the representation of a client when using generative AI on the client’s behalf, unless the client has given informed consent.

“Many generative AI platforms do not provide confidentiality for the prompts input into the tool or the outputs produced by it. Unless they represent otherwise, generative AI companies are likely to use these prompts for additional training of their AI models,” the report states.

“The prompts and the responses they produce could be revealed to the general public, either by accident or by specially designed inquiries to the generative AI tool,” it adds.

On ‘hallucinations’ – errors generated by AI models – the report warns that lawyers are responsible for confirming the existence, accuracy, and appropriateness of the citations they submit to a court, whether or not a court has special rules about AI.

Access to justice

The ABA task force is positive, however, about the potential of AI to improve access to justice, arguing that it can be developed to provide reliable and accessible information for litigants and much-needed support for lawyers.

“With trustworthy and responsible generative AI tools, individuals without legal representation can have the ability to get basic legal information to inform them about options when legal issues arise.

“AI tools could also alleviate the repetitive, labour-intensive, and sometimes tedious tasks that can often fill a legal advocate’s day, particularly with high-volume caseloads in most non-profit legal-services offices,” it states.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland
