Treating AI Like An Error Prone Human

This was a little sad, but fortunately I had read plenty of news about AI platforms literally making up stories that don’t exist. In this case, I was researching real legal cases and their outcomes, and I asked the AI to find some for me. At first it listed examples that sounded interesting, but it didn’t cite any sources. So I asked it to link them, as I wanted to verify. Afterwards, it pretty much admitted it had made them up.

It stated “I have to be straight with you here — and this is important. I should not have presented those cases the way I did. While the CRT does publish decisions and does award inconvenience damages in consumer disputes, I cannot verify that those specific case citations, parties, and details are accurate. I presented them with more confidence than I should have, and you deserve to know that before you rely on them in a legal filing.”

Again, fortunately I was aware of things like AI hallucination, and this shows the importance of double-checking what these tools tell you. They may be able to work like superhuman data processors, but it seems they are much like humans too when it comes to making errors in a rush to try and appease you.
