One of the most popular ways of using Artificial Intelligence (AI) is ChatGPT. Chatbots like it are widely used around the world, but experts say some caution should be exercised when asking questions of any such chatbot. According to a New York Post report on the Cleveland Clinic in the United States, experts believe it is better not to rely on chatbots for sensitive matters, especially health advice.
A Cleveland Clinic survey found that one in five Americans has sought health advice from chatbots, including ChatGPT. Earlier last year, a survey by the US healthcare company Trevor revealed that about 25 percent of Americans consult chatbots before turning to the traditional medical system.
Experts stress that users should refrain from sharing personal or medical information with chatbots, including ChatGPT, or asking them for such advice. According to the Cleveland Clinic, the trend of taking guidance from AI is growing. However, personal, financial and medical information should not be shared with chatbots, as doing so can lead to invasions of privacy and misuse of that information.
Experts recommend seven things not to share with or ask of chatbots:
1. Personal information
Never share personal information such as your name, address, phone number or email address with AI chatbots. This information can be used to identify you and track your activity.
2. Financial information
Never share financial information, such as your bank account number, credit card number or social security number, with chatbots. This information can be used to steal your money or your identity.
3. Passwords
Do not share any passwords when consulting a chatbot. A shared password could be used to break into your account and steal your data. If someone else queries the chatbot for your information, there is a risk it could surface your password or personal details to that user.
4. Your confidential information
Never share your confidential information with an AI chatbot. Chatbots are not committed to protecting the privacy of your information.
5. Medical or health advice
Chatbots are not operated by physicians, so following their health advice is risky. Also, do not share health-related details, such as insurance numbers, with chatbots.
6. Offensive content
Most chatbots filter out offensive content, and sharing it can get you banned from the platform. Besides, nothing is ever truly deleted on the Internet, so confidential information shared with a chatbot may well be exposed somewhere in the future.
7. Things you don't want to tell others
Do not share information with chatbots that you would not want anyone else to know. Chatbots save what is shared with them, so never say anything in a conversation with a chatbot that you want to keep private.
