15-03-2022
Data protection by design and default is a phrase that has become immortalised as the title of Article 25 of the GDPR, yet it is a subjective test that allows for a great degree of variance in how it is applied. This is what is known as "outcomes-focussed" regulation: it concentrates on the factors it wants you to consider, rather than clearly stating that one action is right and another is wrong (unlike, say, a speed limit). Part of any data protection lawyer's job is to advise on risks that could arise from the way a system is designed or could be used by end users, and that is where the law starts to step in to prevent marketers from using technology to play too many tricks on us.
It is the responsibility of those building or designing new systems or projects to take data protection requirements into account. Unless these requirements are considered early on, issues may only become apparent after it is already too late, or after significant costs have been incurred developing the wrong things or heading in the wrong direction. If you don't have a decent amount of knowledge in this area, seeking legal advice early can, and almost certainly will, save you money in the long run.
In all things however, there must be balance, and whilst I am fortunate to spend most of my time advising clients on ways to comply with data privacy laws, I do see bad practices continuing to emerge within the big tech companies. The opposite of "data protection by design and default" has a name, and that name is "dark mapping". Dark mapping involves the use of "dark patterns", which are designed purely to manipulate users, or to take advantage of known behaviours or trends, in order to gain more (usually data or revenue) than the user intended to provide. I would therefore define dark mapping as the strategic, concerted use of dark patterns to derive long-term value within a specific risk profile or setting.
Some general examples of dark patterns that you might already be familiar with:
- Making it dark. Literally, putting black text on a grey or other dark background to make it look like the option isn't available or in the hope that you won't realise the text or option is there.
- Influencer posts and other adverts that are not obviously adverts. If an influencer is sponsored (including by receiving free samples) and produces content which advertises those products, they must make it clear that it is an #Advert. Sometimes you might click on something that later turns out to be an advert, when you never intended to click on an advert at all.
- Trick Questions - the usual marketing trick of phrasing questions in a way that is confusing or designed to trick users into saying "Yes" when the user actually means "No". For example: "Do you not not not not not understand this sentence?" This also includes where the app asks the same question again if you don’t give the answer they wanted to hear. Are you sure?
- Sticky shopping basket - using pre-ticked boxes or other methods to include more than one item in a shopping basket when a user adds something.
- Zuckering - a term coined, not so subtly, with Facebook's founder in mind - means tricking people into providing data on a publicly facing site or forum, or making too much data publicly available, in order to argue that the GDPR does not apply to that data.
- The ol' Bait and Switch. Trying to click on one thing, but actually clicking (or being taken to) something else!
- The flying bird. Distractions, which turn out to be nothing special or important, but are designed to purposefully draw your attention away from something negative which might cause you concern or to stop using that thing.
- Barbing. Much like a barbed hook, once it's in, it's difficult to get out without doing damage. You sign up to a free trial or a one-day special discount offer, but in actual fact you're being locked into a much longer-term agreement (which probably then automatically renews for no good reason).
- Confirmshaming. No, I don't want free money. Are you sure you don't want to give more money to the starving proprietors of Wikipedia, whilst still using their stuff for free without paying for any of it, because you are a terrible person? Questions framed in a way that makes you feel bad for declining.
- Add contacts - friend spamming. When an app says, "it would be great to see if any of your friends also use the app" and suggests allowing access to your contacts to check, only to then send spam emails to every last contact in your address book trying to invite them to join the app.
- The Cambridge Analytica - asking you to give permission for a third party to collect/use your friends' personal data, when these companies realistically know that you cannot give permission for this. As I am coining this dark pattern myself, I'd say it extends to any request for permission which extends beyond what users can reasonably be expected to be able to give.
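For developers, the difference between a dark pattern and data protection by design often comes down to a single default value. As a purely illustrative sketch (this code does not come from any real product; the interface and function names are my own), here is how a pre-ticked "sticky" opt-in compares with a privacy-by-default alternative:

```typescript
// Hypothetical signup-form state, for illustration only.
interface SignupState {
  email: string;
  marketingOptIn: boolean; // consent to marketing emails
}

// Dark-pattern version: the opt-in box arrives pre-ticked, so a user
// who skims past it "consents" without ever making an active choice.
function darkPatternDefaults(email: string): SignupState {
  return { email, marketingOptIn: true };
}

// Privacy-by-default version: the most protective setting is the
// starting point, and consent is only recorded if the user later
// actively ticks the box themselves.
function privacyByDefault(email: string): SignupState {
  return { email, marketingOptIn: false };
}
```

Under the second approach, marketing consent only ever becomes true through a deliberate action by the user, which is the kind of default Article 25 is driving at.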
Whilst I have to admit that I don't know for certain that these companies have made additional profit from dark mapping, it is more likely than not that they have gained some sort of advantage, and it would be harder still to prove that they did not profit in one way or another. (Part of the difficulty in fining these companies is precisely that this advantage is hard to quantify.)
A few specific examples of recent ways in which pretty large companies, who should know better, have used dark patterns to map and then influence user actions:
- Microsoft Edge - the browser that no one uses, but which Microsoft sets as your default browser at every possible opportunity unless you opt out.
- Banks - click continue to switch to paperless banking! Only to be charged if you ever request a paper bill.
- Google Maps requires that you let it scan all Bluetooth devices/Wi-Fi networks near you in order to allow you to use Maps to navigate. Google then knows what devices are near you, including potentially your neighbours, by collecting data via your device.
- Norton 360 including a crypto-miner (Ethereum) within its security product and charging much more than the going rate as commission for using it, and making it difficult if not impossible to uninstall without deleting the whole 360 program.
- Apple - removing the home button and changing the way you view "open apps" means that users close their apps less often, simply because it is now more difficult to do so. The more background apps continue to run, the more data those apps gather, and the more they profit.
There are many decision points in the course of developing any service or product, and the law will always strive to keep up, but it won't always give clear answers to those decisions. The trend is for the law increasingly to shift the burden of these decisions onto those actually designing the systems, rather than prescribing specific dos and don'ts, subject to carrying out risk assessments and recording those decisions for future scrutiny.
Developers who embrace this reality will, hopefully, create their own value by building systems which are designed with privacy in mind. With the prospect of official data protection certifications on the horizon, organisations should get their record keeping in order now to prepare for certification, or risk falling behind and losing out to competitors who invested in compliance throughout development.
Sam Crich is a Digital Solicitor within Berwins’ Commercial team. If you need any advice on any of the above, call 07595 650226, or email SamCrich@Berwins.co.uk.