

Hand Caught In The Cookie Jar: How GPT4 Sold Me My Own Fake News
Last Updated on December 21, 2023 by Editorial Team

Author(s): John Loewen, PhD

Originally published on Towards AI.

How being naive, and lazy, came back to bite me in the ass

GPT-4 will “find” whatever you ask it to look for — this includes “research” on statistical information that you may want.

It will find it, and present it in whatever way you ask it to — in a paragraph, a chart, as a poem if you like.

But as a heavy user over the past six months, I have noticed an alarming glitch in how GPT-4 responds to specific statistical research queries.

It’s not accurate — it uses “placeholder” data — without telling you.

It’s like the kid with crumbs on his face who denies eating the last cookie.

How do I know this? First-hand experience.

GPT-4 sold me my own fake news as fact — from my own article that I had published a few months earlier.

Let me tell you what happened.

A few months back (in May), before I noticed this pattern, I wrote an article about how GPT-4 was being heavily adopted by Python programmers to improve their coding workflow.

While writing the article, I prompted GPT-4 for some statistics, and it glowingly spat them out for me — a Stack Overflow survey and a GitHub survey from the previous two months (March/April 2023).

Holy shit, I thought, this GPT-4 is awesome.

Not really knowing any… Read the full blog for free on Medium.


Published via Towards AI
