How to think about consent in the collection of biometric data
"…The great thing that feminism has put on the table is that autonomy, in which the idea of consent is anchored, is never individual, it is always relational… You have to create the conditions that allow people to be free and equal, and so on, to express consent in a meaningful way," said Dr Kovacs at the inaugural session of PrivacyNama 2021, a global conference on privacy regulations hosted by MediaNama on October 6-7. Kovacs spoke of consent in digital services and connected it to feminist theories.
Beni Chugh of Dvara Research; Dr. Anja Kovacs, Director of the Internet Democracy Project; Amber Sinha, Executive Director of the Centre for Internet and Society; Jhalak Kakkar, Executive Director of the Centre for Communication Governance; and Mark Andrejevic, Professor at Monash University, discussed the collection of biometric data and the consent and privacy-preservation processes it involves, in a discussion moderated by Nehaa Chaudhari, Partner at Ikigai Law.
Consent must be more meaningful
Consent in sexual relationships is discussed in a surprisingly different way from consent in data governance, Dr. Kovacs observed. "Look at how consent on the Internet is about ticking a box where, supposedly, we can completely waive our right to privacy. […] We said that there is an essence of this right that needs to be protected, and yet on the internet, certainly with privacy, it seems to go away," she said.
Consent is the biggest dark pattern
"I'm probably going to say something which is really provocative, but I think consent is the biggest dark pattern of all," Chugh said. She then clarified: "I mean consent as it exists: it is not free, it is not a contract between equals… it does not give you complete information, it can be modified unilaterally. If we had to apply the prism of contracts to consent, it would fail tremendously, right? … So I don't think consent really needs dark patterns." Dark patterns in privacy policies, terms and conditions, or user interfaces are unobtrusive ways of tricking you into giving your consent for certain things.
Consent is modular
Referring to dark patterns, Sinha said there should be some leeway in how consent is obtained: through modular or granular consent.
Kovacs concurred, saying that "we really need to think, in these specific situations where power relations come into play, about what we need to do to ensure that the available data cannot be misused against whoever is the weaker partner in this relationship." She also highlighted other general practices related to consent:
"We have no idea at all why you should be forced to provide data to third parties if that third party is not doing something that essentially improves the service you receive. Then there are changes to what can happen with your data after you give it away, because right now the terms and conditions are so broad that basically you are literally saying 'here is my data, now go do what you want,'" said Kovacs.
However, Chugh's remarks on modular consent highlighted the practical challenges it poses for companies collecting data. "There is a lot of talk around modular consent, that is, the person at the time of consent can say that out of the five data points you collect from me, I agree to only three being collected, and two [data points] you shouldn't collect. Our vendors tell us that the biggest challenge is getting that information flowing through your system and saying that for that particular person, you're not supposed to be using that particular dataset. How do you educate all of your automated systems to say that this person has expressed some kind of explicit reservation about sharing or using this data point?" Chugh said.
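The propagation problem Chugh describes can be sketched in code. The following is a minimal, hypothetical illustration (none of the names here come from any panelist's actual system): a central consent registry records which data points each person allowed, and every downstream system filters records through it before use.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    """Per-user record of which data points the user explicitly allowed."""
    user_id: str
    allowed_fields: set = field(default_factory=set)


class ConsentRegistry:
    """Central store that automated systems consult before using any data."""

    def __init__(self):
        self._records = {}

    def grant(self, user_id, fields):
        """Record the subset of data points this user consented to."""
        self._records[user_id] = ConsentRecord(user_id, set(fields))

    def filter_record(self, user_id, data):
        """Drop any data point the user did not explicitly consent to."""
        record = self._records.get(user_id, ConsentRecord(user_id))
        return {k: v for k, v in data.items() if k in record.allowed_fields}


# A user consents to three of the five data points being collected.
registry = ConsentRegistry()
registry.grant("user-42", {"name", "email", "phone"})

collected = {"name": "A", "email": "a@example.com", "phone": "123",
             "photo": b"...", "fingerprint": b"..."}
usable = registry.filter_record("user-42", collected)
print(sorted(usable))  # ['email', 'name', 'phone']
```

The hard part in practice, as Chugh notes, is not this lookup but ensuring every pipeline, cache, and vendor integration actually routes through such a check rather than touching the raw data directly.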
Rethinking the logic of consent
Professor Andrejevic chose to illustrate the logic of consent with an analogy: "When you enter a building, you don't have to look at every building inspection notice […] for you to consent to enter this building. Instead, one of the ways society works is by constructing buildings so that there are systems in place to make sure the thing doesn't fall on you when you walk into it […] to give you the option of deciding whether or not you want to enter this building." Andrejevic added that consent is limited as an individual model, and that where social structures do not rely on the logics of consent, "we have already built our priorities into how we want these things to be built. So when we enter them, we know that they are supposed to follow these rules."
Consent is used to legitimize inequality
"We are creating new inequalities of a kind that civil rights movements have been fighting against for decades," Kovacs said, referring to how body data has led to a fundamental restructuring of relationships between people and people, people and big companies, and people and governments. "Consent is mainly used right now to legitimize this growing inequality," Kovacs added.
She gave the example of starvation deaths due to the failure of Aadhaar biometric authentication to convey that “the way we talk about data really needs to change to fundamentally put bodies back at the center of this debate.”
Respect the demand for privacy
"Beyond the harm that can result from, you know, sharing personal data, at its core there's an expression of privacy, there's a claim of privacy that people don't want violated, and I think that should be the starting point for all of our conversations," said Chugh, referring to the results of a survey conducted by Dvara Research on people's willingness to share data. According to the research, nearly 30% of people were comfortable sharing their data, including phone numbers, photographs, and fingerprints, with their neighbours, while 42% would be uncomfortable sharing it simply because it was their private information. Chugh stressed that there is an inherent demand for privacy that should be respected, rather than constantly trying to limit the harms of sharing personal data.
Later, Chugh said that because people care about privacy, private sector companies, particularly those answerable to investors, should treat it as an important area of consideration as well.
MediaNama organized this event with the support of Facebook, Flipkart, Internet Society, Mozilla, Mobile Premier League, Omidyar Network, Paytm, Star India and Xiaomi. We also thank our community partners – the CyberBRICS project, the Center for Internet and Society and the Center for Communication Governance (NLU Delhi).