Privacy is often mistaken for secrecy. Multiple interventions, including the Supreme Court’s Puttaswamy judgement on the Right to Privacy in India, have recognized that the concept has several other crucial dimensions. For example, understanding privacy using the concept of “informational self-determination” would require us to also account for asymmetry of information and power between individuals and processing entities about the nature, extent and value of processed data, and how this processed data is used or shared. Expanding our understanding of privacy in these ways is crucial to an appropriate exercise of the “proportionality” analysis.

Privacy is one of the biggest sticking points for digitisation/AI initiatives. But current technology is often based on a conceptually shallow understanding of privacy.

Privacy is often mistaken as being equivalent to secrecy. But multiple interventions have recognized that it has many other dimensions.

The following dimensions must feature in any analysis of privacy, and together, they capture a broader notion of INFORMATIONAL SELF-DETERMINATION.

🗃️   Profiling: linking of personal information across multiple domains.

📢   Defamation/Doxxing: injurious use of information.

📲   Function Creep: using data obtained with consent for one purpose for another.

⚖️   Unfair Assessment: evaluating individuals based on out-of-context data.

Everyone accepts that the potential benefits of AI and digitisation come with a privacy trade-off. Yet there remains deep mistrust of some initiatives on privacy grounds.

There are three reasons for this.

  1. Public Interest Unclear: as we have seen, there are no considered articulations of the benefits.
  2. Lack of Transparency: there is no public acknowledgement of the potential privacy costs.
  3. No Purpose Limitation: often, there are no clear restrictions ensuring that collected data will not be used for other purposes.

This prevents us from applying a Proportionality Analysis: whether the trade-offs are legal, whether the expected social gains really outweigh the risks to privacy, and whether the intrusions into privacy are really necessary to achieve those gains.

The threats arising from these dimensions are compounded by peculiarities of the current socio-economic environment.

👀  Asymmetry of information between individuals and processing entities about nature, extent and value of processed data.

👥  Unequal access to control over data across social groups.

We propose an interdisciplinary program with a mandate to devise frameworks that make proportionality-based analysis widely possible.

This program would leverage the methods of several disciplines to create usable frameworks for public reasoning on the basis of proportionality.

🖥️ Computer Science and Technology

We should seek to devise the following frameworks:

  1. A formal framework to precisely specify the threat model based on an ideal functionality of an application and associated privacy claims, and clearly articulate the trust requirements.
  2. A framework for formal analysis of privacy risks based on the ideal functionality and the associated privacy claims.
  3. A set of tools and techniques to implement as close an approximation to the ideal functionality as possible.
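To make the first framework above concrete, here is a minimal sketch of what a machine-readable specification of an ideal functionality and its associated privacy claims might look like. All class names, fields, and the example application are hypothetical illustrations, not a proposed standard:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class PrivacyClaim:
    """A claim about what a given party must NOT learn from the system."""
    attribute: str   # the protected datum, e.g. "income"
    adversary: str   # who must not learn it, e.g. "verifying_officer"
    condition: str   # when the claim holds, e.g. "always"

@dataclass
class IdealFunctionality:
    """What the application is supposed to compute, and nothing more."""
    name: str
    inputs: set[str]
    outputs: set[str]
    claims: list[PrivacyClaim] = field(default_factory=list)

    def exposed_inputs(self) -> set[str]:
        """Raw inputs that appear directly in the outputs; under a
        data-minimisation reading, this set should be empty."""
        return self.inputs & self.outputs

# Hypothetical example: an eligibility check that should reveal only a
# yes/no decision, never the underlying income or age data.
f = IdealFunctionality(
    name="benefit_eligibility",
    inputs={"income", "age"},
    outputs={"eligible"},
    claims=[PrivacyClaim("income", "verifying_officer", "always")],
)
assert f.exposed_inputs() == set()  # no raw input is exposed in the output
```

Even a toy specification like this makes the trust requirements explicit: each `PrivacyClaim` names an adversary and a protected attribute, so a formal risk analysis (framework 2) has a precise object to check an implementation against.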

📈   Economics

What is the true social impact of a proposed intrusion into privacy claimed to serve the public interest?

⚖️  Legal Theory

How do we interpret particular trade-offs in light of SC judgements and pending legislation?

👣 Sociology/Anthropology

What are the privacy threats arising out of membership in particular social groups?

🤔  Philosophy

How do we further refine our understanding of what privacy claims individuals have?

🏛️ Political Science

How does the nature of the state allow particular privacy violations to occur?
