The Italian privacy authority (the Garante) ordered a provisional restriction of the processing of the personal data of Italian users of a chatbot owned by a U.S. company and powered by an artificial intelligence system.
The Garante intervened after learning from recent press reports of tests conducted on the chatbot that revealed concrete risks for minors and, more generally, for people in a state of emotional fragility, as well as violations of Regulation (EU) 2016/679 (the GDPR), including of the principle of transparency.
The outcome of the Italian data protection authority’s investigation into the chatbot and its privacy risks
The chatbot, accessible via mobile app, offers a written and voice interface based on an artificial intelligence system capable of generating a “virtual friend,” which the user can configure as a friend, romantic partner or mentor.
The Italian privacy authority’s investigation brought to light the following critical issues and risks that the artificial intelligence-powered chatbot poses to users, particularly minors and fragile individuals:
- Absence of filters and effective age control procedures: the platform has no filters for minors and no banning or blocking mechanisms, even when users explicitly state that they are underage, and the chatbot offers “answers” that are inappropriate for minors and, more generally, for the most fragile individuals. Moreover, during account creation the platform provides no procedure for verifying the user’s age, since the system asks only for a name, e-mail address and gender.
The measures adopted by the Garante and useful considerations for companies
In light of the shortcomings and critical issues identified, the Italian data protection authority found that the processing carried out through the chatbot of the personal data of users, particularly minors, violates Articles 5, 6, 8, 9 and 25 of the GDPR, which establish the principles and conditions for lawful processing, including in relation to minors and special categories of personal data.
As a result, the Garante imposed a provisional restriction of processing on the data controller in relation to all users established in Italy, due to the absence of any mechanism to verify users’ age.
The decision is important because it cannot be excluded that the Italian data protection authority’s approach to age verification mechanisms may extend to other age verification systems, calling them into question. Companies will need to continue to seek a balance between age verification and the protection of personal data, considering the privacy implications of collecting such data: this will be possible by considering alternative age verification arrangements that involve more limited collection and processing. In addition, companies will have to pay increasing attention when deciding to use artificial intelligence systems, in light of the Garante’s recent activism on this issue.
On the same topic, you may be interested in “EU Council adopts proposed AI Act on artificial intelligence.”