
OFF TOPIC: chatGPT glibly produces a lot of wrong answers?

Hi Bert,
The article notes that ChatGPT often gets the concept wrong, rather
than the facts. I think this can often be traced to the person who
poses the question. I have often encountered requests for help that
did not ask for what was really wanted. I was recently asked if I
could graphically concatenate years of derived quantities that were
based on an aggregation of daily cycles. You get an average daily
cycle for each year, but it doesn't necessarily connect to the
average daily cycle for the next year (a small sketch of the
mismatch follows below). It didn't work, but it was a case of YKWIM
(You Know What I Mean). In this kind of communication failure, the
questioner frames the question in a way that a human may be able to
puzzle out, but that is not logically consistent. It is the basis
for some very funny scenes in Douglas Adams' "Hitchhiker" series,
but it can be intensely frustrating to both parties in real life.
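
To make the mismatch concrete, here is a minimal sketch in Python.
The data, the drift term, and the avg_daily_cycle helper are all
invented for illustration; the real quantities were different.

import numpy as np

def avg_daily_cycle(hourly, hours_per_day=24):
    # Average the same hour of day across all days: 24 values.
    return hourly.reshape(-1, hours_per_day).mean(axis=0)

# Two years of synthetic hourly data; year 2 sits at a different
# level, so the two yearly average cycles differ.
hours = np.arange(365 * 24)
year1 = np.sin(2 * np.pi * hours / 24) + 0.001 * hours
year2 = np.sin(2 * np.pi * hours / 24) + 10.0

cycle1 = avg_daily_cycle(year1)   # 24-point cycle for year 1
cycle2 = avg_daily_cycle(year2)   # 24-point cycle for year 2

# "Graphically concatenating" the cycles puts cycle2 right after
# cycle1, but the join is a jump that was never in the data.
joined = np.concatenate([cycle1, cycle2])
print("jump at the join:", joined[24] - joined[23])

Plotted, joined shows a smooth cycle, then a step, then another
smooth cycle: the step is an artifact of averaging each year
separately, which is why the request couldn't work as asked.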

Jim
On Mon, Aug 14, 2023 at 3:50 AM Bert Gunter <bgunter.4567 at gmail.com> wrote: