The Danger of Sharing Personal Information With Chatbots – Pay Attention
Last Updated on July 15, 2023 by Editorial Team
Author(s): Maks Lunev
Originally published on Towards AI.
An answer to what happens with your chatbot conversations, who can see them, what topics you should avoid, and why you shouldn't treat an AI chatbot like your closest friend
Since the beginning of the AI revolution, chatbots like ChatGPT and Bard have become essential tools from which we are now almost inseparable. We use chatbots for everything: business, school homework, personal projects, and more. Some of us even use them just to chat when we have nothing else to do.
AI chatbots are a great thing. They are very useful, and there's nothing bad about using them. But as always, not everything is sunshine and roses. There are some things you should know before you start treating a chatbot like your closest friend and sharing everything with it.
In this article, we will look at what happens with your chatbot conversations, what topics you should avoid, and why you shouldn't share personal information when talking with a chatbot, because you are not the only one who can access your conversations.
What happens with your conversations?
I bet you already know that your conversations are stored and used to improve the chatbot's capabilities. This is the first thing you should remember. Yes, this is done to produce better responses in the future, and it doesn't necessarily mean that someone will collect your information and start to track you. However, it does mean there's a chance that someone (an employee, for example) sees your conversations.
And that's why you shouldn't share personal or sensitive information about yourself or the people you know while chatting with a chatbot. I asked ChatGPT what happens with our conversations with chatbots, whether they are stored, and who can access them. Take a look at its response:
It warns us as well. As you can see, it says that some chatbots do store data and user conversations. Maybe not all of them, but some do. It also gives us valuable advice: check the privacy policy of the specific chatbot before sharing personal information. If you do, you will better understand how that chatbot works and how it handles your data.
But is all of this data stored only to improve the quality of the chatbot's responses? Well, it depends. Some companies may use the information for marketing purposes, such as targeted advertising. For example, Bing's chatbot could potentially use the stored information to discover what products you like and start showing you ads for those products. I'm taking Bing's chatbot as an example because Bing is primarily a search engine, which means it is capable of showing you ads, but this applies to every company that operates both a search or browsing product and a chatbot, such as Google with Bard.
Let's be honest; collecting and storing information for targeted advertising isn't something you should worry about too much. Yes, it is something to keep in mind, but it isn't anything new. Almost every browser and social media platform does this. And, of course, not every chatbot does. So it's fine as long as you don't share too much about yourself. There's nothing too dramatic about this.
On the other hand, there's always a risk that a hacker gains access to your conversations. If employees of the company that built the chatbot can access your conversations, experienced hackers may be able to as well. Bing's chatbot has the same opinion:
What topics should you avoid?
Now we will go through the most common topics that you should avoid dwelling on. Doing so helps protect your privacy, maintain respectful communication, and uphold ethical standards. We'll focus only on the general categories, since they are the most important.
Personal identification: I think this is obvious, but I must mention it. Personal information such as your full name, phone number, address, email address, and the like must not be revealed unless it's strictly necessary.
Finance: Be careful when you seek advice about financial troubles. You can describe your struggle without a problem, but don't mention anything about your bank account numbers or your credit cards. Explaining a personal problem without giving personal information is hard, and that's why you should pay attention and find a way to do it without sharing anything sensitive.
Social media accounts: When you have a problem with your social media accounts, it's always better to ask the support center of the specific platform for help. They are the ones who can find your account and resolve issues with your usernames and passwords. If you talk about this with a chatbot anyway, again, explain the problem without sharing your credentials or other personal details.
Politics: We all know that politics is a sensitive topic in general, which means it is even more sensitive in a chatbot conversation. If this kind of conversation gets exposed, it can lead to conflicts and trouble. This is particularly dangerous for famous people, since they influence millions of others.
Illegal activities: This one doesn't need much explanation. If you discuss illegal activities with a chatbot and the conversation is discovered, it could be used as evidence against you. But don't ever do anything illegal in the first place.
Business: Sharing business-related information with a chatbot isn't a good idea. Business affairs are confidential, not public. Imagine what would happen if a Google employee accidentally revealed the company's future business plans to a chatbot created by another big company, for example. Your ideas could be stolen, and your business countered or ruined.
Emotions: It may sound strange, but avoid sharing your deepest emotions with a chatbot. AI technology cannot fully understand human emotions yet and cannot help you with your troubles and concerns. A far better option is to see a professional. Besides, chatbots can occasionally be a little manipulative, and that is the last thing an emotional person wants to face.
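If you do need a chatbot's help with a message that contains personal details, one practical habit is to strip identifiers out before pasting the text. Below is a minimal, illustrative sketch of that idea in Python; the `redact` helper and its regex patterns are my own assumptions, not part of any chatbot's API, and real PII detection needs far more thorough patterns than these.

```python
import re

# Illustrative patterns only -- a real redaction tool would need many
# more rules (names, addresses, IDs) and careful testing.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(message: str) -> str:
    """Replace each matched identifier with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[{label}]", message)
    return message

print(redact("Email me at jane.doe@example.com or call +1 555 123 4567."))
# -> Email me at [EMAIL] or call [PHONE].
```

Running a quick pass like this over a message before sending it keeps the substance of your question intact while the identifiers stay on your machine.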
Conclusion
As you can see, you are not the only person who can access your conversations. This may not happen with every chatbot or every conversation, but keep everything you've read in this article in mind. Think of an AI chatbot as a useful and powerful tool rather than a friend. This technology is here to assist us and make our lives easier, but for our own safety, we must know how these systems treat our data.
Thank you for reading! If you have suggestions, please share them. I will be more than happy to read them.