AI hallucinations - Check your med records for accuracy


begreen

Moderator
Staff member
Hearth Supporter
Nov 18, 2005
107,114
South Puget Sound, WA
AI can definitely hallucinate. I've used it to review documents, and on one occasion ChatGPT made up very specific technical requirements that were not in the document being reviewed. I rephrased the question about six different ways and every time got a continuation of the hallucinated requirement, each time with more detail added. The responses were very concise and specific, and would have been completely legitimate if that requirement had actually existed in the document.
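One cheap sanity check against this failure mode is to verify that any "requirement" the model cites actually appears in the source text before trusting it. The sketch below is a minimal, hypothetical helper (the function name and examples are illustrative, not from any real tool) that does a verbatim substring check after normalizing whitespace and case:

```python
import re

def claim_in_document(claim: str, document: str) -> bool:
    """Return True if the claimed excerpt appears verbatim in the
    document, ignoring case and runs of whitespace. A False result
    flags the claim as a possible hallucination worth checking."""
    def normalize(s: str) -> str:
        return re.sub(r"\s+", " ", s).strip().lower()
    return normalize(claim) in normalize(document)

# Illustrative document and claims (made up for this example):
doc = "The stove must be installed with 36 inches of clearance to combustibles."
print(claim_in_document("36 inches of clearance", doc))    # True
print(claim_in_document("requires a listed damper", doc))  # False
```

This only catches fabricated verbatim quotes, not paraphrased inventions, but it costs nothing and would have flagged the made-up requirement described above.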

It's a work in progress. It does have its uses; unfortunately, it's not yet the revolutionary tool it's claimed to be.
 
Unfortunately, the sales pitch for AI is strong, and tons of businesses are adopting AI solutions to reduce labor costs. There are going to be a lot of mistakes as these untested systems gain agency.