Photographs have the capacity to reveal sensitive aspects of a person’s life, which could mean a photograph qualifies as a “special category” of personal data under the GDPR, bringing heightened regulatory obligations.
Understanding Special Categories of Personal Data
You’ve probably heard the old adage, “a picture is worth a thousand words.” A photograph not only allows others to identify you, but it can reveal other aspects of your life, some of which are more sensitive than others. For example, a picture of you participating in a political demonstration could reveal your political opinions or affiliations. A picture of you at your wedding could allow others to determine your sexual orientation.
With billions of photos uploaded to the web every day, understanding your company’s obligations with respect to photographs under applicable privacy laws, such as the General Data Protection Regulation (GDPR), has become increasingly important.
For instance, do you need your users’ consent simply to store their photographs on your platform? Can you record information inferred from those photographs?
We put these questions to several data protection authorities in the European Economic Area (EEA) and have compiled their answers for you below.
What Does the GDPR Say About Photographs?
The GDPR mentions photographs only once. Recital 51 states that the processing of photographs should not systematically be considered processing of special categories of personal data. Photographs constitute special category data only if they fall within the scope of biometric data. Article 4(14) of the GDPR defines biometric data as:
Personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.
Accordingly, photographs are considered to be a subset of personal data and subject to the general processing obligations imposed by the GDPR. But they have the potential to fall into a special category of personal data when processed through specific technical means, for example, when they are used in facial recognition.
Implications of Photographs Being Considered Special Categories of Personal Data
As mentioned above, the GDPR allows data controllers to process photographs as personal data. (A data controller determines the purposes and means, that is, the what, why, and how, of the processing of personal data.) However, data controllers should exercise caution when processing photographs and facial images, since these can shift from “regular” personal data to “sensitive” personal data. Photographs can do much more than merely identify a natural person: they have the potential to reveal “special categories” of personal data.
According to Article 9(1) of the GDPR, special categories of personal data are defined as a natural person’s racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership. In addition, the processing of genetic data, biometric data for the purpose of uniquely identifying a person, data concerning health, and data concerning a person’s sex life or sexual orientation also qualify as “special categories” of personal data.
Some of the aforementioned special categories of personal data may be revealed or inferred simply by looking at a photograph, without being explicitly stated, such as an individual’s racial or ethnic origin or a particular medical condition.
A similar rationale was articulated in a recent judgment of the Grand Chamber of the Court of Justice of the European Union (CJEU). In its opinion, the CJEU emphasized that special categories of personal data should be interpreted widely[1] and that “publicizing” the name of a person’s spouse, partner, or cohabitee could reveal certain special categories of personal data, such as a person’s sexual orientation.[2] It is not a far stretch to contemplate the CJEU deploying the same rationale to find a photograph of a person similarly sensitive in nature, since a photograph may reveal a person’s race, medical condition, or religious affiliation.
If particular photographs fall into a special category of personal data, your company would generally be prohibited from collecting or storing them (even photographs of your employees) unless one of the very strict and specific exemptions in Article 9 of the GDPR applies. If your company processes photographs that qualify as special category data without such an exemption, it could be in violation of the GDPR and exposed to monetary fines and other sanctions.
What DPAs Have to Say
Given the uncertainty about photographs constituting special category data, VeraSafe’s privacy experts reached out to Data Protection Authorities (DPAs) in several EEA member states. We posed the following questions:
Question 1:
If an entity collects and stores a photograph of an identified individual, and special categories of personal data could be inferred from that photograph, but the entity does not perform any sort of additional processing on that photograph nor record the sensitive information anywhere, is the entity processing special categories of personal data?
Question 2:
If an entity collects a photograph and also manually records the sensitive information in its systems, is the entity processing special categories of personal data even though the photograph is not processed through a specific technical means, such as automated image matching and identification?
In 2022, five DPAs provided VeraSafe with a reasoned response, reflected in the table below.
All of the DPAs that responded were of the view that merely collecting or storing a photograph would not be considered processing special categories of personal data. Although the United Kingdom is no longer part of the European Union, we also contacted the Information Commissioner’s Office (ICO), and its responses were aligned with those of the other DPAs. However, the DPAs indicated that if a company manually records a special category of personal data inferred from a photograph, the company would then likely be considered to be processing special categories of personal data. This view was given in spite of the proviso in Recital 51 that photographs only constitute biometric data, and therefore special category data, when processed through a specific technical means.
Conclusive Guidance Needed
The views of the DPAs, while consistent, are individual opinions rather than binding guidance. Without guidance from an overarching body (such as the European Data Protection Board), data controllers and processors are left in the dark when it comes to processing photographs and facial images. This is a tenuous position, especially in view of the fines that organizations may face, as demonstrated by the significant fines imposed by the CNIL (the French data protection authority) against Clearview AI. The CNIL found that Clearview AI’s processing of photographs constituted processing of biometric data and that the company lacked a lawful basis for such processing. More recently, however, Clearview successfully appealed a fine imposed by another data protection authority, the United Kingdom’s ICO, though the appeal was granted largely on a technical, jurisdictional point. The uncertainty about this particular processing activity therefore remains.
VeraSafe will continue monitoring any developments surrounding the processing of photographs under the GDPR to ensure clients are in compliance.
1. CJEU’s Judgment responding to a Request for a Preliminary Ruling in OT v. Vyriausioji tarnybinės etikos komisija, Case C‑184/20 at Paragraph 125.
2. Id. at Paragraph 100.