Authors: Mayank Mishra & Aparajita Bharti
Published: September 16, 2021 in the Hindustan Times
The rise of big data and machine learning has driven immense growth in powerful technologies and applications. But the same technologies have simultaneously become a privacy nightmare for their users. The algorithms behind them amass huge amounts of data from individuals, which is then used (or sold to other firms) to target, persuade, reward, or penalise users. While privacy issues have been extensively debated, a discussion on how data governance laws might impact women differently from men and affect their agency on the internet has been mostly missing.
Women and men use digital technologies differently. According to a 2017 survey, women use social media (such as Facebook and Instagram) significantly more than men. At the same time, there is a huge disparity in mobile ownership. The National Family Health Survey-5 (2019-20) indicates significant variation across states and union territories (UTs) in the percentage of women who own a mobile phone, with figures ranging from 49% in Gujarat and Andhra Pradesh to 91% in Goa. Lower penetration of phones among women in some areas points to shared use of mobile phones within Indian families, which, in turn, shapes women's behaviour on the internet.
Women also face a higher risk of reputational loss online. Between 2017 and 2018, cases of cyberstalking or bullying of women or children increased by 36%, while the conviction rate fell from 40% to 25%. Such incidents can harm victims' mental health through humiliation, diminished self-esteem, and social isolation, and they reinforce a perception of the internet as an unsafe place for women.
Given these sensitivities around women's data and its impact on their ability to use the internet, India's various data governance proposals currently under discussion must be evaluated through a gender lens.
For example, the proposed Personal Data Protection (PDP) Bill, 2019 imposes a blanket requirement of parental consent for processing the personal data of anyone below the age of 18 years. This effectively gives parents control over teens' access to any internet platform. While protecting minors' data is indeed important, such a blanket requirement, coupled with the shared usage of mobile phones, may compromise the agency of teenage girls far more than that of boys, as families exert control over their usage. Most other countries set this age threshold at 13 years, recognising that teenagers use the internet to learn new skills, build new relationships, and explore their identities.
Another example is the governance of non-personal data, a framework intended to facilitate the use of aggregate data to build Artificial Intelligence (AI) systems that deliver better services to Indian citizens. The Kris Gopalakrishnan Committee on non-personal data has come out with two different frameworks to facilitate this sharing of data. However, a larger discussion around algorithmic biases against women has been missing. AI algorithms learn patterns in their training datasets and use that learning for predictive analytics, among other things. There are multiple ways in which this could lead to discriminatory outcomes for women.
First, through underlying bias in the training datasets. If an algorithm is trained on outcomes that are unfavourable for women, it will replicate them in its predictions. Second, if women are underrepresented in the training dataset (very likely, given the existing digital divide), the result will be products that aren't designed for women, furthering the digital divide over time. For instance, if an automated speech recognition system is trained on a dataset with disproportionately few voice snippets of women talking, it will make errors when trying to comprehend women's voices. A policy on AI development is therefore all the more urgent, with guardrails to ensure that underlying datasets are not biased.
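The speech recognition example above can be sketched in a few lines of code. The sketch below uses entirely hypothetical numbers (pitch values, group sizes) and a deliberately simplistic "model" — a single learned threshold — but it shows the mechanism: when one group supplies 90% of the training data, the model's notion of "normal" input drifts towards that group, and accuracy for the underrepresented group collapses even on a balanced test set.

```python
import random

random.seed(0)

# Hypothetical toy data: a "voice system" learns one acoustic statistic
# (say, average pitch) and treats inputs near it as recognisable speech.
# Assume men's snippets cluster around 120 units, women's around 210.
def sample(group, n):
    centre = 120 if group == "men" else 210
    return [random.gauss(centre, 25) for _ in range(n)]

# Skewed training set: 900 male snippets, only 100 female snippets.
train = sample("men", 900) + sample("women", 100)

# The "model": anything within two standard deviations of the training
# mean counts as recognised speech.
mean = sum(train) / len(train)
std = (sum((x - mean) ** 2 for x in train) / len(train)) ** 0.5
recognised = lambda x: abs(x - mean) <= 2 * std

# Evaluate on balanced, equally sized test sets for each group.
def recognition_rate(xs):
    return sum(recognised(x) for x in xs) / len(xs)

men_rate = recognition_rate(sample("men", 1000))
women_rate = recognition_rate(sample("women", 1000))
print(f"recognition rate, men:   {men_rate:.0%}")
print(f"recognition rate, women: {women_rate:.0%}")
```

With these made-up numbers, the model recognises nearly all male snippets but fails on a large share of female ones — not because of any deliberate design choice, but purely because of who was in the training data.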
Further, from a privacy perspective, the risks of identification by piecing together different sets of non-personal data are far higher for women than for men. For example, non-personal data from women's health apps, when combined with shopping data, may reveal both their identities and their reproductive health issues.
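This re-identification risk is essentially a linkage attack, and it can be illustrated with a small sketch. All the records below are invented for illustration: the "anonymised" health-app export carries no names, only quasi-identifiers such as pincode and age, yet joining it against an ordinary named shopping dataset on those same fields can restore identities whenever the combination is unique.

```python
# Hypothetical "anonymised" health-app export: no direct identifiers,
# but quasi-identifiers (pincode, age) remain.
health_app = [
    {"pincode": "110001", "age": 29, "condition": "PCOS"},
    {"pincode": "400052", "age": 34, "condition": "endometriosis"},
]

# Hypothetical shopping dataset: ordinary customer records with names.
shopping = [
    {"name": "A. Sharma", "pincode": "110001", "age": 29},
    {"name": "B. Rao", "pincode": "400052", "age": 34},
    {"name": "C. Iyer", "pincode": "400052", "age": 41},
]

# Index the named data by the quasi-identifier pair.
by_quasi_id = {}
for row in shopping:
    key = (row["pincode"], row["age"])
    by_quasi_id.setdefault(key, []).append(row["name"])

# Linkage attack: if a (pincode, age) pair matches exactly one named
# customer, the "anonymous" health record gains a name.
reidentified = {}
for record in health_app:
    matches = by_quasi_id.get((record["pincode"], record["age"]), [])
    if len(matches) == 1:
        reidentified[matches[0]] = record["condition"]

print(reidentified)
# e.g. {'A. Sharma': 'PCOS', 'B. Rao': 'endometriosis'}
```

Stripping names from a dataset is therefore not the same as anonymising it: as long as quasi-identifiers survive, a second dataset is often enough to undo the protection, which is why aggregation and release rules for non-personal data matter.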
As India moves towards an increasingly digital society, how privacy and data governance laws may impact women's safety and agency on the internet should not come as an afterthought. We need to have these discussions front and centre, because these regulations can be a key building block of women's agency on the internet and their participation in the economy of the future. If we do not get this right, we risk deepening existing chasms in an increasingly digital world.