Do you have a secret you want to keep? Be cautious with AI assistants as the companies running them may hold onto your data longer than expected.
Google’s Gemini, previously Bard, has been highly praised, surpassing OpenAI’s ChatGPT according to many. However, before using Gemini, it’s wise to skim through its privacy policy.
In a recently updated privacy policy, Google advises users not to share sensitive information with Gemini that they wouldn’t want a human to see.
Google retains many of your queries to improve its tools. Everything you share with Gemini could be stored by Google for up to three years, even if you delete your data from the app.
Google’s AI data policies align with those of its competitors like OpenAI. OpenAI’s standard ChatGPT saves conversations for 30 days unless users opt for a custom data retention policy through the enterprise-tier plan.
Google stated that human annotators read, label, and process conversations with Gemini to enhance its performance in future interactions.
Conversations are anonymized before review but stored for up to three years, along with related data such as user devices, languages, and location. It’s unclear whether these annotators are internal or outsourced.
“Conversations that have been reviewed or annotated by human reviewers (and related data like your language, device type, location info, or feedback) are not deleted when you delete your Gemini Apps activity because they are kept separately and are not connected to your Google Account. Instead, they are retained for up to three years,” states the latest version of Gemini’s privacy policy.
The recent privacy notice for Gemini attempts to reassure users that some level of anonymization is applied to the data.
“We take your privacy seriously, and we do not sell your personal information to anyone. To help Gemini improve while protecting your privacy, we select a subset of conversations and use automated tools to help remove user-identifying information (such as email addresses and phone numbers),” Google explains in its privacy policy.
Removing someone’s name from the data doesn’t necessarily make it anonymous, as metadata can still reveal a lot. Google is aware of this fact.
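To see roughly what this kind of automated scrubbing involves, here is a minimal, hypothetical sketch in Python. The patterns and placeholder tokens are illustrative assumptions, not Google's actual pipeline, which would be far more sophisticated than two regular expressions.

```python
import re

# Hypothetical patterns for two common identifiers Google mentions:
# email addresses and phone numbers. Real de-identification systems
# cover many more categories and edge cases.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(
    r"\b(?:\+?\d{1,2}[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"
)

def redact(text: str) -> str:
    """Replace obvious identifiers with placeholder tokens."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text

sample = "Contact me at jane.doe@example.com or 555-123-4567."
print(redact(sample))  # Contact me at [EMAIL] or [PHONE].
```

Even after such substitutions, the surrounding text and metadata (device, language, location) can still narrow down who wrote a message, which is exactly the limitation noted above.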
“Please don’t enter confidential information in your conversations or any data you wouldn’t want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies,” Google warns.
Avoid sharing any information with Gemini or similar AI-powered tools that you wouldn’t want a human to read. In particular, don’t type anything into Gemini that you wouldn’t want read aloud in a court of law.
Prosecutors typically obtain search histories and electronic records for individuals accused of serious crimes, so exercise caution.
Google and its competitors face a major challenge in generative AI: developing and training these models depends on large amounts of user data. For now, the guidelines governing the ethics and legality of that practice remain unclear.