Needed, a new approach to data protection for minors

The principles of the ‘best interests of children’ and ‘more responsibility on platforms’ should inform India’s approach to data protection for minors.

Authors: Aparajita Bharti and Nikhil Iyer
Published: January 24, 2023 in The Hindu

How freely should Indian teenagers access the internet, and what responsibilities do platforms have towards their minor users? Getting these questions right is essential to achieving India’s digital ambitions.

The draft Digital Personal Data Protection Bill, 2022 currently mandates parental consent for all data processing activities involving children, defined as any person under the age of 18. This approach, however, misses the mark on two fronts.

First, instead of incentivising online platforms to proactively build safer and better services for minors, the Bill relies on parents to consent on behalf of the child in all cases. In a country with low digital literacy, where parents often rely on their children (who are digital natives) to help them navigate the internet, this is an ineffective approach to keeping children safe online.

Second, it does not take into account the “best interests of the child”, a standard originating in the Convention on the Rights of the Child, 1989, to which India is a signatory. India has upheld this standard in laws such as the Commissions for Protection of Child Rights Act, 2005, the Right of Children to Free and Compulsory Education Act, 2009, and the Protection of Children from Sexual Offences Act, 2012. However, it has not been applied to the issue of data protection.

The Bill does not factor in how teenagers use various internet platforms for self-expression and personal development, and how central the internet is to adolescent life today. From taking music lessons to preparing for exams to forming communities with people of similar worldviews – the internet is a window to the world. While the Bill does allow the Government to provide future exemptions from strict parental consent requirements and from prohibitions on profiling and tracking, this whitelisting process does not acknowledge how the lines between categories of platforms are blurring. For example, Instagram is, strictly speaking, a social media platform, but it is regularly used as an educational and professional development tool by millions of artists around the world.

Another issue in the current draft of the DPDP Bill is that each platform will have to obtain ‘verifiable parental consent’ in the case of minors. This provision, if enforced strictly, can change the nature of the internet as we know it. Since it is not possible to tell whether a user is a minor without confirming their age, platforms will have to verify the age of every user. The Government will later prescribe whether verification is to be based on ID proof, facial recognition, reference-based verification, or some other means. Whatever form it takes, all platforms will now have to manage significantly more personal data than before, and citizens will be at greater risk of harms such as data breaches and identity theft.

We thus need to shift our approach to children’s data before this Bill is brought to Parliament. To avoid the folly of treating unequals equally and blocking off the internet for teenagers, first, we should move away from a blanket ban on tracking, monitoring, etc. and adopt a risk-based approach to platform obligations. Platforms should be mandated to undertake a risk assessment for minors and, based on it, not only fulfil corresponding age-verification obligations but also design services with default settings and features that protect children from harm. This approach will bring in an element of co-regulation by creating incentives for platforms to design better products for children.

Second, we need to relax the age of mandatory parental consent for all services to 13 years, in line with many other jurisdictions around the world. By relaxing consent requirements, we will minimise data collection, which is one of the principles on which the Bill is built. This relaxation in the age of consent, in tandem with the risk-mitigation approach outlined above, will protect children online while allowing them access.

This solution draws on the experience and deliberations in the United Kingdom, California, New York, etc., where Age Appropriate Design Codes have been introduced. To tailor this solution to the Indian context, the government should also conduct large-scale surveys of both children and parents to find out more about their online habits, digital literacy, preferences and attitudes.

We must design a policy in India that balances the safety and agency of children online. We should not put the onus of keeping our young safe only on parents; instead, we should make it a society-wide obligation. We have to get this part of the data protection framework right, as India’s ‘techade’ cannot be realised without its young.

Aparajita Bharti is a Founding Partner and Nikhil Iyer is a Senior Analyst at TQH, a public policy consulting firm in Delhi.