How to fix the ePrivacy Regulation
In spite of all good intentions, the proposed ePrivacy Regulation (ePR) currently being discussed in the European Parliament will irrevocably clutter the field of digital privacy with a flood of consent requests, leaving users just as blind to their content as they are to today's cookie-acceptance popups.
There is no debate that your digital communication must be confidential and that no one should be allowed to spy on you on the internet. But this should not be bundled with an unreasonable insistence on being 100 percent anonymous when you enter a public area or someone else’s domain.
Just as we don’t expect the clerk at a shoe store to ask for your permission to look at you and determine your age, your gender and which rack of shoes you are browsing before assisting you with your purchase, we shouldn’t require an e-commerce store to obtain consent for the same level of service. In the proposed ePR legislation, your interest in shoes and your demographic information are treated with the same sensitivity as your medical prescriptions and history.
Cookies are not dangerous. They cannot contain viruses or malware, and they cannot spy on you. But information about your online behaviour could be used in a way you are not comfortable with, or even to your disadvantage. To protect us from that, we recently adopted the General Data Protection Regulation (GDPR), with very strict rules for processing personal data. This new legislation enters into force in May of next year, and the entire online industry (along with every other industry) is already preparing to meet the new requirements before the deadline. The fines for neglecting to do so are staggering.
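To see why a cookie cannot "contain" a virus, it helps to look at what a cookie actually is: a short piece of plain text that a server sends in an HTTP response header and that the browser stores and echoes back on later requests. The sketch below uses Python's standard-library cookie parser; the `session_id` name and its value are made-up examples, not anything from the article.

```python
from http.cookies import SimpleCookie

# A cookie exactly as a browser receives it: one line of plain text
# inside an HTTP "Set-Cookie" response header. Nothing here executes.
raw_header = "session_id=abc123; Path=/; Max-Age=3600; HttpOnly"

cookie = SimpleCookie()
cookie.load(raw_header)

# The browser stores only this key/value text plus a few attributes,
# and sends the key/value pair back with future requests to the site.
print(cookie["session_id"].value)    # the stored value: abc123
print(cookie["session_id"]["path"])  # the scope attribute: /
```

The point is that the entire mechanism is inert data: the privacy question is never about the cookie itself, but about what the server does with the behavioural information it records against that identifier.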
A positive feature of the GDPR is that it allows companies to approach their existing customers with direct marketing and to process information about how people use a website in order to present them with related content – including advertising. All users must be clearly informed of any data processing taking place, and each user is entitled to detailed insight, to have the information corrected or deleted, and to opt out of any targeted advertising. There are very strict rules for sensitive data such as health or financial information, and more flexible rules for processing less sensitive data such as web statistics and profiling users’ interests in products or subjects. This principle is referred to as a risk-based approach, and it makes a lot of sense from the user’s perspective.
Unfortunately, the proposed ePR legislation takes an entirely different approach. It does not differentiate between personal and non-personal data, or between sensitive and non-sensitive information. It requires prior explicit consent before any information can be stored.
At first glance, this might seem quite reasonable. After all, why shouldn’t you be asked for permission beforehand if someone wants to process information about you? But the fact is that this is not always in your best interest. First of all, it will annoy the hell out of you to be constantly asked for your consent. No one will read the endless boring information; everyone will just click yes or no according to what seems most convenient in the given situation. And as we get used to saying yes to data processing on almost every website we use, we will go blind to the actual terms we are consenting to, and acceptance will become so routine that we end up accepting even the data processing for which a genuinely informed consent should be expected.
Objectively, what serves us all best is when data is used to provide us with better service, better products, more innovation and relevant information, in a way that respects and protects our privacy. Realistically, however, you as an individual cannot always determine which data processing fulfils these requirements. It is impossible for anyone to make an informed decision in all matters.
If you buy a car, you do not need to understand the motor, the electronics, the airbag or the internal computers. You want car manufacturers to produce great cars that live up to strict safety standards defined and audited by experts. The same logic applies to data processing: let the data controllers (and processors) be responsible. How can this be achieved? The simplest way would be to apply the same legal grounds for data processing in the ePR that we just spent four years working out in the GDPR. But that would be almost too easy, wouldn’t it?
Originally published on LinkedIn.