Artificial intelligence, a global crossroads for the use of data
The media spotlight is trained on artificial intelligence: the dispute between OpenAI and the Italian Privacy Guarantor over ChatGPT's use of personal data; the green light from the European parliamentary committees for the AI Act, the EU regulation that aims, among other things, to curb the excesses of biometrics and make the development of generative artificial intelligence more transparent (the report will be voted on at the Parliament's next plenary session, between 12 and 15 June). Google, meanwhile, is extending the rollout of its experimental chatbot Bard, but not yet in Europe: in order, said Sundar Pichai, CEO of Alphabet, "to comply with regulations" (read: the GDPR).

This evolution, now playing out in full view, engages the legal sector, which has been dealing with artificial intelligence systems for some time, on several fronts. "Let's start with reflections on the market. The consumer," says Carlo Rossi Chauvenet, partner at the law firm CRCLEX, "relies on AI for answers about services that have always been subject to authorization and supervision, such as advice on investments, medications or medical prognoses. Legislative interventions or administrative measures therefore become appropriate insofar as they serve to inform customers that, in exchange for sharing their most sensitive information on health or assets, they may receive indications of poor reliability." Yet companies, Rossi Chauvenet continues, "are often dazzled by opportunities and by the fear of losing competitiveness. In legaltech, as in fintech or e-health, caution is needed: a legal text drafted by a robot lawyer is not equivalent to the opinion of an expert; nor is it right to think that it is the expert's job to correct, and perhaps train for free, the algorithm of some large foreign corporation."
Certainly, as the path of the AI Act testifies, we are at a crossroads for the future of privacy at a global level, and several other countries are looking, in various ways, to the European legislation as a test case. "But beyond the still 'embryonic' problems created by the use of artificial intelligence, there are already pressing issues for companies, such as the stalemate in the international agreement between the EU and the US," explains Giuseppe Vaciago, partner at 42 Law Firm.
The European Data Protection Board (EDPB) has expressed reservations about the new Data Privacy Framework: on transatlantic data flows, the political pre-agreement has yet to find a technical implementation. "In the meantime," Vaciago continues, "the great challenge for law firms remains to mediate with US operators: to preserve the business while reconciling it with the innovative scope of the GDPR. Not to terrify the client, but to make them understand the importance of complying with the rules. In this context, the real crux is the monetization of data. This is why large operators are concerned about the future EU-US agreement: if users regain control over their data, the platforms' business model will be forced to change, with no more free service in exchange for sensitive information."
Today, adds Vincenzo Colarocco, lawyer at the Previti firm, "companies face a significant task: they must check that personal data is encrypted before it can be saved on the servers of US companies. Moreover, even if American companies have servers in Europe, this does not preclude the application of standard contractual clauses, because the parent companies reside in the US and FISA (the Foreign Intelligence Surveillance Act, Ed.) therefore applies, allowing American authorities to access data for national security reasons." To meet all these obligations, Colarocco continues, "the law firm's response lies in a legaltech approach: acting on innovation, with tools, wizards and the like, to simplify and automate procedures and thus concretely help companies that, especially SMEs, are often reluctant to invest in compliance."
When ChatGPT became available again in Italy at the end of April, with new features meeting the Guarantor's requests, some commented: is that all? So much noise for a few prior-consent notices? Some points remain open and under observation, such as the protection of minors. But the progress on the conditions of use is substantial, from the deletion of data to the limits on its use for machine training. And the Italian Authority's concerns were soon shared by other countries as well. "In general, we have a European regulation whose merits are widely praised, and we have a Guarantor Authority that has been able to see things before others. However, we are still struggling to talk about compliance, to 'digest' the principle of accountability," underlines lawyer Elia Barbujani, founder of the firm Slb consulting.

Many companies still refuse to talk about standard contracts or about transferring data abroad. And they underestimate accountability, that is, taking responsibility. Barbujani comments: "The law states that the data controller must also be concerned about how the data is processed by its external processors. However, a software company providing services as an external processor can hardly be supervised by a client that lacks the know-how. For this reason, it must try to adopt privacy-by-design procedures. But software companies that invest in this area are often not rewarded by the market, precisely because acquiring companies still lack sensitivity on the issue." While legislators and authorities are constantly at work, the 'circle' of privacy struggles to become virtuous.
by Dario Aquaro