Not Quite Ready: Why Artificial Intelligence Can’t Beat Human Lawyers (Yet)

by Henn Xhen Low & Coco Poh Kah Man ~ 24 June 2023

Introduction

Need some legal advice? With the rapid evolution of technology, the solution seems easy enough: just ask Artificial Intelligence. Tools like ChatGPT basically give free legal advice, right? Well, not so fast. While AI’s exponential growth is undeniable, it might not be as intelligent as we think. So, here is some information about tools like ChatGPT that you should be aware of, so you can make a better-informed decision the next time you think of turning to AI for legal advice.

What is Artificial Intelligence (or “AI”) anyway?

AI is a set of algorithms that can collectively perform tasks which usually require human intelligence. The type of algorithm used and the parameters set for it determine what kind of AI is created. Until recently, building AI commonly involved labelling and categorising existing data and making predictions based on it. In this context, the new generative AI is a breakthrough because it goes beyond mere classification and labelling. The biggest selling point of generative AI tools is that they can create brand-new outputs in response to a user’s inputs, drawing on existing knowledge. Sounds familiar? That is because ChatGPT is a type of generative AI tool: it generates its own responses to the questions users pose, using patterns and associations drawn from its training data.
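For readers curious about the mechanics, here is a minimal sketch of how a generative AI model is typically queried through a programming interface: the user’s question goes in, and the model generates a response from the patterns and associations in its training data. The sketch assumes the openai Python library (version 1 or later); the model name and prompt are purely illustrative.

```python
# Minimal sketch: sending a user's question to a generative AI model.
# Assumes the "openai" Python library (v1+) with an API key set in the
# OPENAI_API_KEY environment variable; model name and prompt are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # illustrative model name
    messages=[
        {"role": "user",
         "content": "What is the notice period for terminating a tenancy?"},
    ],
)

# The reply is a prediction of plausible text based on the model's training
# data; it is not verified or jurisdiction-specific legal advice.
print(response.choices[0].message.content)
```

As the comments note, the output is generated text rather than a researched legal opinion, which is exactly why the points below matter.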

Why Shouldn’t We Go to AI for Legal Advice?

A first-timer’s experience with ChatGPT is always filled with awe and amazement. After all, it is a testament to the capability of generative AI. Not only does ChatGPT give informative answers, but these answers also come quickly. Knowing that ChatGPT has generated “solid A-” essays, passed the American bar exam with flying colours, and is basically free, it seems foolish not to take advantage of it and get free legal advice. Right?

Wrong.

While generative AI tools like ChatGPT appear almost super-intelligent, they are, in fact, carefully crafted to appear human-like. Keyword – “human-like”. Therefore, like humans, generative AI tools are not faultless and are definitely not perfect. While AI technology is growing exponentially, here are some reasons why it remains a good idea to go to your nearest (human) lawyer instead.

First, generative AI tools have limited knowledge.

Generative AI systems are built on specified parameters and trained on pre-determined databases. This means their ability to answer questions is only as good as the data they have been trained on. For instance, an AI chatbot trained primarily on American law will not be able to answer questions on Malaysian law accurately. For this reason, even the most advanced version of ChatGPT (i.e. GPT-4, which scored 75% on the American bar exam) may still struggle to provide reliable legal advice for Malaysians. In simple terms, most AI chatbots just don’t know enough to give reliable legal advice.

Second, generative AI tools can be deceptive.

This is not to say that generative AI tools are malicious. Rather, generative AI tools like ChatGPT can, in all good faith, still deceive you because of “hallucinations”. “Hallucination” describes instances where a generative AI tool confidently dishes out erroneous information. It has been theorised that hallucinations occur due to various factors, including insufficient knowledge, misinterpretation of the user’s input, and a general lack of common-sense reasoning or contextual awareness.

How does this affect ChatGPT’s ability to give legal advice?

The two issues outlined above can be distilled into a single concept:

AI tools can help lawyers do a better job, but they cannot replace lawyers yet. 

Generative AI tools like ChatGPT have revolutionised how we interact with, consume, and digest information. Yes, they demonstrate the capacity to draw on a plethora of information. Yes, they display an uncanny ability to sound human-like when responding. However, these tools are not human (yet). As we have already explored, generative AI tools like ChatGPT can still provide wrong information, and sound pretty confident doing so. It is therefore not a stretch of the imagination to picture a situation where ChatGPT convincingly provides erroneous legal advice.

In contrast, the quality of a human lawyer’s advice is regulated by law. Malaysian law requires all lawyers to ensure their advice is accurate and made in their client’s best interest, and this legal obligation has been tried and tested. Meanwhile, the developers of generative AI tools appear vigilant in insulating themselves against liability arising from the use of their tools: such liability is often explicitly excluded under the terms of use, which users rarely read.

Conclusion

In summary, while AI tools like ChatGPT have made tremendous progress in recent years, they are not yet ready to replace human lawyers. While these tools can be helpful in many situations, their inability to perform higher-level reasoning means they are still ill-equipped to handle the intricacies of complex legal issues. At the end of the day, the best legal advice comes from experienced human lawyers with the contextual knowledge and judgment to navigate the law and the courts that ultimately administer it. Although AI tools like ChatGPT can be a useful supplement to human expertise, they are not a substitute for it.

So, the next time you are tempted to hop online for some quick, AI-generated legal advice, just remember: there is no such thing as a free lunch.