Eight Things You Should Never Share With an AI Chatbot
A reminder that your conversations aren't private or secure.
Emily Long, Freelance Writer
April 10, 2026
Credit: Yalcin Sonat / Shutterstock
It probably goes without saying at this point, but your conversations with AI chatbots aren't private—everything you type or upload to Gemini, ChatGPT, and other models might be read and used in a variety of ways. If you wouldn't send a document or repeat information to someone you don't know, you shouldn't include it in a chatbot prompt either.
Researchers at Stanford reviewed the privacy policies of the six U.S. companies that developed the most popular AI chatbots, including Claude, Gemini, and ChatGPT, and found that all of them use chat data by default for training purposes. Some retain said data indefinitely, and most merge it with other information collected from consumers, such as search queries and purchases. In most cases, you can opt out of having your data used to train LLMs, but chats can also be read by human reviewers, and long-term retention policies increase the risk of your stored information being leaked in a breach.
If you're going to use an AI chatbot, these are the things you should avoid sharing:
Login credentials: Obviously, you should never include usernames or passwords in a prompt, or upload documents that contain login credentials. AI is also abysmal at generating secure passwords—use the tools in your password manager instead, or better yet, opt for a passkey if available.
Financial data: AI chatbots aren't financial experts, and you shouldn't upload financial documents or include details about your specific finances in prompts. This includes bank statements, credit card numbers, investment information, account numbers and balances, etc. Sharing financial details anywhere that isn't secure increases the risk of theft, fraud, and targeting by scammers.
Medical records: AI chatbots also aren't medical professionals and shouldn't be relied upon for medical advice. You probably don't want your medical records to be used to train LLMs—plus, uploading them exposes them to potential data breaches.
Personally identifiable information (PII): AI prompts should never include information like your name, address, email, phone number, birth date, Social Security number, passport number, or any other data that could be used to steal your identity. (Financial information and medical records are also considered sensitive PII.)
General health information: In addition to keeping your sensitive medical records private, you should avoid giving chatbots seemingly benign information about your health that could be used to profile you. For example, the Stanford report notes that it's possible for AI chatbots to infer health status from a request for heart-friendly dinner recipes, which could eventually be accessible to insurance companies. This also includes information related to topics like sexual health, medication use, and gender-affirming care.
Mental health concerns: Another thing your chatbot isn't is a therapist. AI has been unhelpful at best and harmful at worst when it comes to mental health. Even with updates intended to protect users in crisis, chatbots aren't a replacement for real, human support.
Photos: AI image editing is popular, but that doesn't mean it's without risk. You may not want your personal photos used for training purposes, and image metadata can contain information like your GPS location. At the very least, avoid uploading images of people (especially minors), and consider stripping EXIF data before sharing (see the sketch after this list).
Company documents: AI may be useful for summarizing documents, creating presentations, drafting emails, and completing other work-related tasks more quickly, but you should use caution when uploading files containing sensitive company information to a chatbot. Your employer may even have a policy prohibiting it.
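If you do want to share a photo, you can remove its metadata first. Many phones and operating systems have a built-in "remove location info" option when sharing; if you prefer to do it yourself, here is a minimal sketch in Python, assuming the Pillow imaging library is installed. The file names are just placeholders for illustration.

```python
# A minimal sketch using the Pillow library (install with: pip install Pillow).
# Copying the pixels into a fresh image drops the EXIF metadata (GPS coordinates,
# camera details, timestamps), so the saved copy is safer to share.
from PIL import Image

def strip_exif(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as original:
        clean = Image.new(original.mode, original.size)
        clean.putdata(list(original.getdata()))
        clean.save(dst_path)

# Hypothetical file names for illustration:
strip_exif("vacation.jpg", "vacation_clean.jpg")
```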
The bottom line is that you should be cautious about what you share with AI chatbots—assume everything in your prompts is stored and could be read by someone else. Avoid anything that is personal or identifiable, and enable all available privacy protections, such as opting out of data sharing and model training.