ChatGPT introduces privacy-friendly chat history management: How to use
ChatGPT, OpenAI's AI-powered chatbot, has enthralled us with its abilities. However, as great as it is, data privacy has always been a sore point for the chatbot's critics. The company has now tightened ChatGPT's privacy controls, giving users a say in how OpenAI handles their data. Let's have a look at how to use the new feature.
Why does this story matter?
The age of advanced AI models is upon us, and there is no walking back from that. Human-like AI systems are our present and future. Their abilities, however, come at a cost: the vast amounts of data companies use to train them. Users and governments around the world have grown wary of potential privacy violations by these companies and their AI-powered products.
OpenAI won't be able to use chats as training data
ChatGPT's new privacy control lets you turn off your chat history, which prevents OpenAI from using your inputs as training data. To do so, open ChatGPT's 'Settings' and find the new 'Chat History & Training' option. If you toggle the switch off, your recent chats will no longer appear in the sidebar.
The company will retain chats for 30 days
The option to turn off chat history comes in addition to the existing ability to opt out of having your content used to train ChatGPT. There is a catch to the new feature, though. Even if you switch off history and training, OpenAI will store your chats for 30 days. The company says it will review them only to monitor for abuse.
Training data of AI models includes personal information from the internet
Companies like OpenAI train their AI models by scraping millions of web pages, Reddit posts, books, social media sites, and more. Data from platforms such as Reddit and Twitter is needed to give generative text systems their conversational style. The problem, however, is that the scraped data is bound to include some personal information.
Italy asked OpenAI to give users control over data
ChatGPT was recently banned in Italy for unlawfully processing users' data, and the new feature is interesting from that standpoint. The Italian data protection authority recently released a list of guidelines the company must comply with to operate in the country. One of them is giving users control over their data. It seems OpenAI does not want similar issues in other countries.