ChatGPT is once again available in Italy: the Garante della Privacy has confirmed this after the precautionary block it imposed in April. In recent days OpenAI, the company behind the chatbot, has moved to comply with European privacy regulations by meeting the requests of the Italian Garante. OpenAI had until April 30 to satisfy the requirements imposed by the Italian Data Protection Authority: a general information notice, the rights of data subjects, and a legal basis for the processing of personal data used to train its algorithms. ChatGPT is now more transparent and more respectful of the rights of the people who use it. But according to many experts, the real game starts now.

ChatGPT is now more transparent, but other points still need to be clarified. The conditions for relying on legitimate interest as the legal basis for training algorithms with users' personal data remain open to discussion

The Garante's press release summarizes the main changes introduced.

Today, everyone, even non-users of the platform, is more clearly informed that their personal data may have been used to train the algorithms, and anyone, user or non-user, can ask OpenAI to stop using their personal data for machine-learning training;

Anyone who believes that ChatGPT generates inaccurate content about them may request and obtain that the content in question no longer be generated, or that their personal information no longer appear in it;

OpenAI now states that it bases the processing of personal data necessary for algorithm training on legitimate interest, and it recognizes everyone's right to object, which, unlike in the past, can be exercised through a form that can be filled out online;

The privacy policy is immediately identifiable in the registration flow; Italian users who are already registered will be asked to confirm that they are 18 or over, or between 13 and 18 with parental consent, before resuming use of the service, while new registrants will have to indicate their age;

By September, OpenAI will also have to implement an age verification system to filter access by those under 13. 
But not all doubts have been resolved, at least for now, and many insiders point to open problems, starting with the conditions under which legitimate interest can serve as a valid legal basis for training algorithms with users' personal data. There are also questions about what should be done with the historical dataset OpenAI accumulated before the Garante's intervention.
More guarantees are needed, and in the coming weeks OpenAI will have to negotiate with the Garante a communication campaign that clearly explains to Italian citizens how to exercise their rights. These issues concern everyone, not just Italy, so they will have to be discussed in the coming months within the European Data Protection Board, which brings together the national authorities. On questions so central to the future of technological development, industry and markets, it is essential for Europe to speak with one voice.

The successful negotiation between the Privacy Garante and OpenAI shows that respect for rights and innovation are not in conflict and can be balanced: OpenAI has improved its service while respecting people's rights. These are issues of extraordinary importance for the future of our societies. The players directly involved, big tech companies, privacy protection bodies, and national parliaments, will have to find compromises that allow citizens to exercise their fundamental rights without unduly compressing the right to innovate.
