Distortion, Dishonesty, Damages: Safeguarding Your Reputation from Papyrus to ChatGPT

Recently, I came across a news story about a lawyer who used ChatGPT to help write a brief for a client claiming negligence against an employer. Not surprisingly, the lawyer greatly regretted using the AI chatbot after it was brought to the court’s attention that some of the cases cited in the brief didn’t exist. When he questioned it, ChatGPT assured him that the cases were real and could be found in other legal research databases. I was chagrined that this veteran lawyer believed the program despite the many instances of its misinformation that have made headlines. He claimed it was an accident, and that he had not been aware that many of the cases in his brief did not exist.

We all know that ChatGPT has been getting negative publicity lately. Multiple reports of hate speech, falsehoods, and overall dysfunction have made “Artificial Intelligence” look plain stupid since the program was released in November 2022. ChatGPT is one of the newest sources of misinformation, but the spread of false information has been around for centuries.

During the Middle Ages in Europe, a document known as the “Donation of Constantine” was written by an unknown author. It announced that Constantine had been cured of leprosy by Pope Sylvester I in the 4th Century AD, and that out of gratitude he had given land and political control to the pope. The authenticity of the document was not seriously questioned until the 15th Century, when Lorenzo Valla proved that the Latin in the document would not have been used in the 4th Century. It is not surprising that the forgery was a product of the political turmoil on the Italian Peninsula at the time it was written, and that it was based on the Legend of St. Sylvester, which claims that the pope healed the emperor whom history has dubbed Constantine the Great.

Imagine the dismay of those who eventually found out that this document was a forgery! Land had been warred over, and the people of those lands unjustly uprooted. The people under these governments, who had trusted their leaders, must have been outraged. Their superiors had believed false information, and perhaps some people could not trust those officials afterwards.

ChatGPT may not be malicious, but as the lawyer who used it found out, we can still be caught up in scandals involving misinformation from fake sources, which brings us to another thought. If a program like ChatGPT is known to spread misinformation and cause chaos, why do some people simply ignore this and move on without checking sources to verify their accuracy? Can these people really be trusted to do their work well without cutting corners?

From a public relations standpoint, this lawyer’s reputation has definitely plummeted, affecting the way potential clients view him. He cannot reverse the negative aftereffects of this case. Even if he had never heard about the damage ChatGPT has caused, he is not without excuse. As a veteran lawyer, shame on him! He has not only cast his reputation into the ugliness of negative publicity but also broken trust with the court, his client, and his law firm.

Yet we know that Matthew 7:4 says, “How can you say to your brother, ‘Let me take the speck out of your eye,’ when all the time there is a plank in your own eye?” This verse reminds us that it is rash and hypocritical to judge another’s mistakes without examining and confessing our own.

I have a confession of my own to make regarding misinformation. In the spirit of transparency: I, too, have been guilty of a lapse in fact-checking. I once held a press conference for a big law firm, but I didn’t verify the address of the location I sent to the media. I directed dozens of reporters to the wrong place before realizing my mistake. Shame on me for not checking the address!

Whether we lived one thousand years ago and were deceived by words written on parchment with quill and ink, or live in today’s digital age, where we can be fed false information by an AI chatbot like ChatGPT, there is only one solution: we must always treat the material we present as a reflection of our reputation.