

The Danger of Sharing Personal Information With Chatbots – Pay Attention
Last Updated on July 25, 2023 by Editorial Team

Author(s): Maks Lunev

Originally published on Towards AI.

An answer to what happens with your chatbot conversations, who can see them, which topics you should avoid, and why you shouldn’t treat an AI chatbot as your closest friend

Source: Photo by Luis Villasmil on Unsplash

Since the beginning of the AI revolution, chatbots like ChatGPT and Bard have become essential tools from which we are now almost inseparable. We use chatbots for everything: business, school homework, personal projects, and more. Some of us even chat with them simply because we have nothing better to do at the moment.

AI chatbots are genuinely useful, and there’s nothing wrong with using them. But as always, not everything is sunshine and roses. There are a few things you should know before you start treating a chatbot as your closest friend and sharing everything with it.

In this article, we will look at what happens to your chatbot conversations, which topics you should avoid, and why you shouldn’t share personal information with a chatbot: you are not the only one who can access those conversations.

What happens with your conversations?

You probably already know that your conversations are stored and used to improve the chatbot’s capabilities. This is the first thing to remember. Yes, it is done so the model can produce better responses in the future, and it doesn’t necessarily mean that someone will collect your information and start tracking you. However, it does mean there is a chance that someone, an employee for example, sees your conversations.

That’s why you shouldn’t share personal or sensitive information about yourself or the people you know while chatting with a chatbot. I asked ChatGPT what happens to our conversations, whether they are stored, and who can access them. Take a look at its response:

Source: Image from a ChatGPT conversation

It warns us as well. As you can see, it says that some chatbots do store data and user conversations. Maybe not all of them, but some do. It also gives us valuable advice: check the privacy policy of the specific chatbot before sharing personal information. If you do, you will understand more about how that particular chatbot works and how it manages data.

But is all of this data stored only to improve the quality of the chatbot’s responses? Well, it depends. Some companies may use the information for marketing purposes, such as targeted advertising. For example, Bing’s chatbot could potentially use stored information to discover which products you like and start showing you ads for them. I’m using Bing’s chatbot as an example because Bing is primarily a search engine, which makes it well placed to show you ads, but this applies to every company that operates both a search engine or browser and a chatbot, such as Google with Bard.

Let’s be honest: collecting and storing information for targeted advertising isn’t something you should worry about too much. It is worth considering, but it isn’t new. Almost every browser and social media platform does it, and of course, not every chatbot does. So it’s fine as long as you don’t share too much about yourself. There’s nothing too dramatic here.

On the other hand, there’s always a risk that a hacker gains access to your conversations. If the employees of the company that built the chatbot can read your conversations, an experienced hacker may be able to as well. Bing’s chatbot shares this opinion:

Source: Image from Bing’s chatbot

What topics should you avoid?

Now let’s look at the topics you should avoid mentioning too much. Avoiding them helps protect your privacy, maintain respectful communication, and uphold ethical standards. We’ll focus on the most common categories, since they are the most important.

Personal identification: This may be obvious, but it must be mentioned. Personal information such as your full name, phone number, address, and e-mail address must not be revealed unless it’s strictly necessary.

Finance: Be careful when seeking advice about financial troubles. You can explain your situation freely, but don’t mention your bank account numbers or credit cards. Explaining a personal problem without giving personal details is hard, which is why you should pay attention and find a way to do it without sharing anything sensitive.

Social media accounts: When you have a problem with a social media account, it’s always better to ask that platform’s support center for help. They are the ones who can find your account and resolve issues with your usernames and passwords. If you discuss the problem with a chatbot anyway, again, describe it without sharing your credentials.

Politics: Politics is a sensitive topic in general, which makes it even more sensitive in a chatbot conversation. If such a conversation gets exposed, it can lead to conflicts and trouble. This is particularly risky for famous people, since they influence millions.

Illegal activities: This one doesn’t need much explanation. If you discuss illegal activity with a chatbot and the conversation is discovered, it could be used against you. In any case, don’t do anything illegal.

Business: Sharing business-related information with a chatbot isn’t a good idea. Business affairs are confidential, not public. Imagine what would happen if a Google employee accidentally revealed the company’s future business plans to a chatbot created by another big company. Your ideas could be stolen, and your business countered or ruined.

Emotions: It may sound strange, but avoid sharing your deepest emotions with a chatbot. AI technology cannot yet fully understand human emotions and cannot help you with your troubles and concerns. Seeing a professional is a far better option. Besides, chatbots can occasionally be manipulative, and that is the last thing an emotionally vulnerable person needs.


As you can see, you are not the only person who can access your conversations. Nothing is certain; this may not happen with every chatbot or every conversation, but keep everything you’ve read in this article in mind. Think of an AI chatbot as a useful and powerful tool rather than a friend. This technology is here to assist us and make our lives easier, but for our own safety, we must know how these systems treat our data.

Thank you for reading! If you have suggestions, please share them. I will be more than happy to read them.



Published via Towards AI
