by Gabriel Mwangi
In Shakespeare’s The Merchant of Venice, Bassanio found himself in a spot of bother, as my colleague Guto Mogere habitually says. He urgently needed money to woo Portia – money which, unlike Bassanio, Guto does not habitually lack (nor, for that matter, Portias to woo). Bassanio enlisted the help of his friend, Antonio – a good man – who agreed to be bonded to Shylock for the sums Bassanio needed. Antonio struck a memorable deal with Shylock, whose terms – nay, one specific term – continue to be used to this day to describe an unconscionable bargain. Shylock proposed the following deal, which Antonio accepted in spite of Bassanio’s protestations –
“Shylock: This kindness will I show:-
Go with me to a notary, seal me there
Your single bond; and, in a merry sport,
If you repay me not on such a day,
In such a place, such sum or sums as are
Exprest in the condition, let the forfeit
Be nominated for an equal pound
Of your fair flesh, to be cut off and taken
In what part of your body pleaseth me.”
– The Merchant of Venice, Act I, Scene III
In a transaction, a stronger party is said to demand a pound of flesh if it exploits its position to derive a benefit which it would not otherwise have obtained had the playing field been more equal; it is an unconscionable bargain which, though legally binding, may still be set aside through the less rigorous doctrines of equity. In such a transaction, it is said that the weaker party’s consent was compromised; by virtue of their position and circumstances, it was not full consent.
A recent determination of the Data Commissioner, in which she interpreted section 32 (4) of the Data Protection Act (‘the Act’), suggested to this writer an analogy between Shakespeare’s dramatic transaction involving a pound of flesh and the reality of trade in biometric data. The analogy between flesh and data is not too remote or fantastic. A professor of the philosophy and ethics of information, Prof. Luciano Floridi, has argued that human dignity should be the foundation of the right to privacy. In his view, there is a distinction between the use of the word ‘my’ when referring to one’s data, one’s body part such as a hand, and one’s property such as one’s car. The use of the word is similar when referring to one’s data and one’s hand, but not to one’s property. This is so because, according to him, personal information has a role in constituting oneself. This is a fascinating question to which I will return in suggesting the philosophical ambivalence in the approach to personal data in the determination under discussion. For now, I will turn to the determination after this long-winded but not inessential digression.
On 6th September 2023, the Data Commissioner issued a decision on a saga that captivated the Kenyan public for a brief time this year, before it quickly faded from the headlines and Kenyans’ memories. That was until recently, when the President of the Republic of Kenya was reported to have declared that government services will be accessed through iris scans.
In her ‘Determination on the suo motu investigation by the Office of the Data Protection Commissioner on the Operations of the Worldcoin project in Kenya by Tools for Humanity Corporation, Tools for Humanity GMBH & Worldcoin Foundation’, ODPC Complaint No. 1394 of 2023, the Data Commissioner framed five issues for determination. This commentary will be limited in its scope to the second issue for determination, i.e., ‘Whether TFH [i.e., Tools for Humanity] and Worldcoin obtained proper consent for the processing of sensitive personal data’.
Before addressing the Data Commissioner’s determination on the second issue, it should be stated from the outset that the Data Commissioner was convincing, and right, on most of the issues she decided on, including some aspects of the second one. For example, it cannot be said that someone consented to the downloading and acceptance of terms and conditions if that was done on their behalf and without explanation. It was also wrong for one entity to take over data controller responsibilities from another entity without itself registering as a data controller. Relatedly, it is wrong for the entity assuming data controller responsibilities to continue operations without conducting a data protection impact assessment or at least demonstrating that the one submitted by the former entity addressed similar operations being conducted by the latter. That being said, I will now turn to this commentary’s specific subject.
The Data Commissioner decided the second issue in the complaint by referring to, among other legal provisions, section 32 of the Act. That section refers to “conditions of consent” and provides as follows:
(1) A data controller or data processor shall bear the burden of proof for establishing a data subject’s consent to the processing of their personal data for a specified purpose.
(2) Unless otherwise provided under this Act, a data subject shall have the right to withdraw consent at any time.
(3) The withdrawal of consent under sub-section (2) shall not affect the lawfulness of processing based on prior consent before its withdrawal.
(4) In determining whether consent was freely given, account shall be taken of whether, among others, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.
After elaborating her understanding of the concept of consent, the Data Commissioner expressed herself as follows:
“In the instant investigation, consent was relied upon by TFH to collect biometric data and to transfer the biometric data out of Kenya. In particular, the provision of Worldcoin tokens was conditional on provision of consent to process biometric data.
This Office therefore finds that TFH and Worldcoin placed themselves in a position of innate economic influence by issuing Worldcoin tokens, a cryptocurrency that is convertible to fiat money/legal tender. This introduced an element of influence upon the data subject’s expression of their free will.”
As I understand the activities that were investigated by the Office of the Data Protection Commissioner, as well as by an entire multi-agency investigation team that included that office, individuals were offered something valuable in exchange for their biometric data. In order to receive convertible cryptocurrency, they had to offer their biometric data. At the heart of this activity is simply trade in biometric data. In the mind of the Data Commissioner, such a deal was formed in violation of section 32 (4) of the Act. However, I respectfully and humbly beg to differ.
In my reading of section 32 (4) of the Act, it envisions a scenario where the data subject does not want to provide their data, but is required to do so in order for a contract to be performed, or a service to be provided – yet the provision of that data is not necessary for the performance of the contract or the provision of the service. It is concerned with an ancillary and unnecessary condition imposed on a data subject to provide their data in order for another contract (i.e., one other than for the provision of personal data) to be performed, or another service (likewise other than the provision of personal data) to be provided.
An example not too remote from contemporary experience may simplify the point: consider a navigational app that requires fingerprint data in order to be accessed, even though fingerprint data is not necessary for providing navigation services. In my view, this is the mischief that section 32 (4) of the Act seeks to address. The section does not aim to prohibit trade in biometric data by the means of invalidating consent in such transactions, without more. In my view, the determination of the Data Commissioner in this particular regard may have unwittingly outlawed trade in biometric/personal data without requiring more. As a ban on trade in what is personal to humanity, it is akin to the general ban on the trade in human body parts and organs. It is strongly felt in many parts of the world that such items should not be tradeable.
One wonders if the same instincts that apply to the human anatomy lurk in the shadows of our appreciation of personal data. In the same way that human flesh is frowned upon as security for a debt in The Merchant of Venice, or as consideration for any transaction, the trade in personal data may intuitively be resisted on similar instincts. It then becomes the philosophers’ dilemma to grapple with the fundamental question of how personal data should be conceived: should personal data be tradeable property or a non-tradeable aspect of humanity? Some philosophers of ethics, such as Prof. Beate Roessler, have approached this question from the perspective of the moral limits of markets, which is a captivating discussion for another day.
Although my reading of the Data Protection Act implies that its drafters’ philosophical position was the former, i.e., that personal data is tradeable property, my respectful view is that the determination of the Data Commissioner suggested the latter philosophical position from the authority in charge of implementing the Act. Moreover, it would appear that the determination could, unwittingly, be incongruent with the purposes of the Act, i.e., the regulation of personal data, including its commercial use, which was previously unregulated and left entirely to the market. I submit that the philosophical ambivalence in the approach to personal data and its protection is best resolved in these nascent days of data protection rather than later, to ensure consistency in the law and its application.