
Law firm raises ChatGPT sensitive data concern

Italy’s recent ban on ChatGPT and a lack of certainty over the chatbot’s GDPR compliance have reignited the debate around businesses’ data privacy obligations, says law firm Hayes Connor.

Richard Forrest, legal director at Hayes Connor, said there are growing concerns that employees are negligently disclosing confidential data, with sensitive data making up 11% of what employees submit to ChatGPT.

Hayes Connor’s main concern is how large language models like ChatGPT use personal data for training purposes, meaning that information could be regurgitated in later responses.

As an example, the firm says a doctor might input patient information so the chatbot can draft a personalised email to that patient, but there is then a risk that ChatGPT could reproduce this information if a later user asks something about that patient.

For businesses using ChatGPT for admin tasks, Hayes Connor warns that confidentiality agreements with clients could be put at risk if employees enter sensitive information into the software.

Forrest says: “The news that ChatGPT is now banned in Italy demonstrates the importance of compliance measures for companies operating in Europe. There have been growing concerns over the nature of LLMs like ChatGPT posing issues around the integration and retrieval of data within these systems. If these services do not have appropriate data protection and security measures in place, then sensitive data could become unintentionally compromised.”

Forrest says all businesses that use ChatGPT should implement measures to ensure employees remain GDPR compliant. This includes: assuming that any information entered could later become accessible in the public domain; revising confidentiality agreements to cover the use of artificial intelligence; creating an explicit clause in employee contracts; and providing sufficient training on the use of AI. In addition, firms should ensure they do not input software code or internal data, and should have a specific company policy and employee user guide in place.

Forrest adds: “Businesses that use ChatGPT without proper training and caution may unknowingly expose themselves to GDPR data breaches, resulting in significant fines, reputational damage and legal action taken against them. As such, usage as a workplace tool without sufficient training and regulatory measures is ill-advised.”