Could a bot run your safety and compliance program?
Bots and artificial intelligence (AI) are all the rage. They can do your kids’ homework, write new songs, and generate stunning artwork, all at the click of a mouse. But can you trust a bot to guide your safety and compliance program? Hardly.
Take the tool that started the AI craze: ChatGPT. It’s an AI chatbot that can process natural language to engage in a conversation, answer questions, and generate content that sounds eerily human.
Ask it to explain a regulation like the U.S. DOT’s 11-hour driving rule for truck drivers and it spits out a solid answer in seconds:
“The 11-hour driving rule is a federal regulation set by the Federal Motor Carrier Safety Administration (FMCSA) for commercial truck drivers. It states that a truck driver may drive a maximum of 11 hours within a 14-hour period, after which they must take a break of at least 10 consecutive hours. … ”
But chat about something more complex and things can quickly go off the rails. To test the system, we asked the bot a few common compliance questions. The accuracy of the results was not encouraging. For example, the bot told us that:
- A commercial driver’s license is required for 12,000-pound vans,
- Medical exam and drug testing forms must be kept in a driver’s qualification file,
- A driver’s travel time should be logged “off duty,” and
- Every driver who crosses state lines must use electronic logs.
Trust me, I’m an expert
All these “facts” were embedded in lengthy responses that sounded quite authoritative, as if you were chatting with a real expert — and therein lies the danger.
Studies have found that people can be swayed by the perceived authority and confidence of those delivering a message, even when the message itself is flawed or inaccurate.
When it comes to safety and compliance, bad advice can have costly consequences. In this case, for example, the bot made no mention of how placing medical and drug testing forms in a driver’s qualification file could violate privacy laws (and land you in legal trouble), or of how a driver’s “travel time” counts as off duty only if the driver gets a full 10-hour break upon arrival — otherwise it could contribute to fatigued driving.
1’s and 0’s
For a computer, the world is 1’s and 0’s, black and white. Compliance, on the other hand, is all about nuance — navigating the gray areas around a given set of facts. Give a chatbot a complex scenario, or only half the facts it needs, and it can seem more artificial than intelligent.
At least ChatGPT got one thing right: “If you have concerns or questions about compliance … it’s best to speak with a qualified legal or regulatory expert.” Amen.
Key to remember: Artificial intelligence may be here to stay, but it has a long way to go to replace true expertise when it comes to compliance and safety.