"Can I Paste This Into ChatGPT?" A UK GDPR Guide for Employees - Digital Compliance Academy
Data privacy is the biggest blocker to AI adoption. We break down UK GDPR compliance into a simple Red, Yellow, Green traffic light system that any employee can understand.
It’s the question that keeps Compliance Officers awake at night.
Your marketing manager is struggling with a report. They grab a CSV file full of customer data, paste it into a public AI chatbot, and ask for a summary.
Boom. You’ve potentially just breached the UK GDPR and the Data Protection Act 2018.
That customer data has now left your secure environment and been sent to a third-party server (often in the US), potentially to be used to train future models.
But you can’t just say “Don’t use AI.” That’s like telling people “Don’t use the internet” in 1999. They will simply do it on their personal phones instead, which is even less secure (a practice known as “Shadow AI”).
The solution is not a ban. It’s clear, actionable guidance.
At DCA, we simplify complex data regulations into a Red / Yellow / Green traffic light system. It gives employees a split-second decision framework for “Can I paste this?”
🔴 RED LIGHT: Never Input (High Risk)
This information must never be entered into a public or unsecured AI tool. If this data leaked, it would cause significant harm to individuals or the business.
- PII (Personally Identifiable Information): Names, addresses, phone numbers, NHS numbers, precise location data.
- Special Category Data: Health data, racial or ethnic origin, political opinions, religious beliefs, trade union membership, sexual orientation, biometric data (as defined by UK GDPR Article 9).
- Secrets: Unreleased financial results, trade secrets, passwords, encryption keys, or source code for core IP.
- Client Confidential: Anything covered by a strict NDA.
The Test: If this appeared on the front page of the Daily Mail, would you be fired or sued? If yes, it’s Red.
🟡 YELLOW LIGHT: Anonymise First (Medium Risk)
This information is useful for productivity, but contains identifiers that need cleaning before it’s safe to process.
- Project Reports: Valuable for summarisation, but often contain client names.
- Action: Change “Project Alpha for Barclays” to “Project Alpha for a large bank.”
- Internal Emails: Remove the sender/receiver names, specific dates, and email signatures.
- Sales Figures: Use percentages or trends instead of raw financial totals.
- Action: Change “Revenue: £1.2m” to “Revenue: +20% YoY”.
The Rule: Take 30 seconds to “sanitise” the prompt. If the AI doesn’t need the name to do the job (e.g., summarising a meeting), remove it.
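For teams that paste text programmatically (or want a shared helper for the 30-second clean-up), the habit can be sketched in a few lines of Python. This is a minimal illustration only: the regex patterns and placeholder labels below are our own assumptions, not an exhaustive PII detector, and a real deployment should use a dedicated redaction tool.

```python
import re

# Illustrative patterns only -- catches obvious identifiers, not all PII.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),            # email addresses
    (re.compile(r"\b(?:\+44\s?7|07)\d{3}\s?\d{6}\b"), "[PHONE]"),   # UK mobile numbers
    (re.compile(r"£\s?[\d,]+(?:\.\d+)?\s?[mk]?", re.I), "[AMOUNT]"),  # raw £ figures
]

def sanitise(prompt: str) -> str:
    """Replace obvious identifiers with neutral placeholders before pasting."""
    for pattern, label in PATTERNS:
        prompt = pattern.sub(label, prompt)
    return prompt

print(sanitise("Email jane.doe@example.com re: Revenue £1.2m, call 07700 900123"))
# The email, the raw figure, and the phone number all become placeholders.
```

Names and client references still need a human eye: no simple pattern will know that “Project Alpha for Barclays” should become “Project Alpha for a large bank.”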
🟢 GREEN LIGHT: Generally Safe (Low Risk)
This information is already public or carries very low risk of harm.
- Public Info: Anything already on your website, in a press release, or published in a brochure.
- Generic Knowledge: “Draft an agenda for a marketing meeting” (no specific company data needed).
- Boilerplate Code: Generic functions (e.g., “Write a Python script to sort a list”) that contain no proprietary logic.
- Creative Ideation: “Give me 10 ideas for a summer social event.”
The “Training Data” Trap
Why does this matter? Most free AI tools (like the free version of ChatGPT or Gemini) reserve the right to use your inputs to train their future models.
This means if you paste a confidential strategy document into the free version today, it could theoretically be regurgitated as an answer to a competitor’s prompt next year.
Note: If your organisation pays for Enterprise versions (like Microsoft 365 Copilot or ChatGPT Enterprise), you often have “No Training” guarantees. Your data stays within your tenant. We strongly recommend businesses move to these tiers.
Practical Implementation: The Desk Card
A policy document hidden on the intranet is useless. In our workshops, we provide a Laminated Traffic Light Card for employees to keep on their desks.
It acts as a physical “speed bump” before they hit Ctrl+V.
- Pause: Look at the data.
- Categorise: Is it Red, Yellow, or Green?
- Act: Stop, Clean, or Paste.
Safety doesn’t come from complex legal jargon. It comes from simple habits repeated every day.