Hand Caught In The Cookie Jar: How GPT-4 Sold Me My Own Fake News
Last Updated on December 21, 2023 by Editorial Team
Author(s): John Loewen, PhD
Originally published on Towards AI.
How being naive and lazy came back to bite me in the ass
GPT-4 will "find" whatever you ask it to look for, including "research" on whatever statistical information you want.
It will find it and present it in whatever way you ask: in a paragraph, in a chart, as a poem if you like.
But as a heavy user over the past 6 months, I have noticed an alarming glitch in how GPT-4 responds to specific statistical research queries.
It's not accurate. It uses "placeholder" data without telling you.
Itβs like the kid with crumbs on his face who denies eating the last cookie.
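You can watch this happen with a short script. Here is a minimal sketch, assuming the openai v1.x Python package and an OPENAI_API_KEY in your environment; the prompt wording is illustrative, not the exact query I used:

```python
# Minimal sketch: ask GPT-4 for survey statistics and print the reply.
# Assumes the openai v1.x Python client and an OPENAI_API_KEY
# environment variable; the prompt below is illustrative only.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{
        "role": "user",
        "content": (
            "According to recent developer surveys, what percentage of "
            "Python programmers use AI tools in their coding workflow?"
        ),
    }],
)

print(response.choices[0].message.content)
# The reply typically contains confident, specific figures. Nothing in
# it flags them as estimates or placeholders; you have to track down
# the underlying survey yourself to verify them.
```

The point is not the API call itself. It is that nothing in the response distinguishes verified numbers from invented ones.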
How do I know this? First-hand experience.
GPT-4 sold me my own fake news as fact, from my own article that I published a few months earlier.
Let me tell you what happened.
A few months back (in May), before I noticed this pattern, I wrote an article about how GPT-4 was being heavily adopted by Python programmers to improve their coding workflow.
While writing the article, I prompted GPT-4 for some statistics, and it glowingly spat them out for me: a Stack Overflow survey and a GitHub survey from the previous two months (March/April 2023).
Holy shit, I thought, this GPT-4 is awesome.
Not really knowing any…