Authors: Rohit Kumar & Avi Krish Bedi
Published: June 15, 2022 in The Hindu Business Line
Our data is more valuable than ever. With increased digital penetration, data has undoubtedly unlocked human potential to do a lot more – and to do it more efficiently. However, with more data comes a greater risk of misuse, often exemplified by data leaks and the illicit sale of personal data. The discourse on safeguarding our data, including the discussion on the PDP Bill, emphasises the primacy of privacy policies and user consent as our key bastions of defence. But as we become more aware of how businesses and other entities collect, share, and monetise our personal data, we must revisit the structural shortcomings of this approach and consciously work to devise meaningful alternatives to safeguard our privacy and autonomy.
Try recalling the last time you earnestly read through a verbose and jargon-laden privacy policy before consenting to share your data – but don’t beat yourself up over being lax about it. Multiple studies have demonstrated that privacy policies and informed consent are broken. They suffer from three behaviourally linked problems. First, the transparency/comprehension problem – the verbose legalese used in privacy policies is often incomprehensible to laypeople, a problem further compounded by low digital literacy in India. Second, the data repurposing problem – entities do not overtly disclose all the additional purposes for which user data could be used, resulting in ‘function creep’. And third, the consent fatigue problem – users, having to repeatedly consent to data sharing, grow tired of doing so and are unwilling to expend the time and effort required to consent meaningfully.
An over-reliance on consent has led to the prevalence of a binary “tick-the-box” approach to data protection, rendering “informed consent” perfunctory: users have the choice to share their data, but it is far from a meaningful choice.
Some solutions posit that data-collecting entities should remain legally accountable for any breach or misuse of personal data, regardless of whether they obtained consent. To give this approach some teeth, a set of inviolable ‘data rights’ is envisaged. However, the problem lies in implementing and enforcing such rights. As it stands, India still does not have a data protection law, and such rights have no legal grounding. Moreover, proving infringements can be difficult and time-consuming. For instance, if my data is used by AI and IoT systems for purposes other than those I consented to, how would I actually know? And if I somehow found out, would it be straightforward to mount a legal challenge? And by the time such a matter is adjudicated, would any recourse offered be enough to offset the harm already done?
If we step back and take another look at the problem, we may find some potential alternatives. Many of the core issues around data privacy are also behavioural in nature: users may wish to secure their data, but their intention doesn’t always translate into action. So, by nudging human behaviour through better design, human-centric design principles could become a route to better data privacy. By placing people, rather than the service contract, at the centre of this relationship, we can enable better decision-making.
While designing privacy policies, for instance, UI/UX designers should be involved at the very outset of the design process. Their inputs should be used to represent privacy policies visually – to show users how their data will be collected and used if they consent. Studies have shown that visually representing data flows – through short videos or animations – can make users more aware of what happens to their data when they consent, reducing incomprehensibility and increasing transparency while also tackling consent fatigue. Such formats have the added benefit of accommodating limited literacy and linguistic diversity in a country like India.
Device makers and operating systems can also be encouraged to implement a ‘master privacy preference setting’ on user devices. Effectively, this would give users a master control panel to preconfigure their data sharing preferences – where they decide the type and frequency of data they are comfortable sharing in the normal course of online activity. If a user’s master data sharing preferences do not meet the requirements of an app, they can either choose not to use it or take the time to specifically consent to its additional requirements. On the supply side, such a structure would incentivise apps to minimise data collection, or even to provide a ‘Lite’ version – with basic functionality requiring only essential data from users – to prevent large-scale user drop-off.
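To make the idea concrete, here is a minimal, purely illustrative sketch of how such a master preference check might work, assuming a few hypothetical data categories and a simple frequency cap; it does not reflect any real operating system API or anything specified in the PDP Bill.

```python
# Hypothetical sketch: matching an app's data requests against a user's
# preconfigured "master privacy preferences". Category names, fields, and
# thresholds are illustrative assumptions only.

from dataclasses import dataclass, field

@dataclass
class MasterPreferences:
    # Data categories the user is comfortable sharing by default (assumed names)
    allowed_categories: set = field(default_factory=lambda: {"email", "coarse_location"})
    # Maximum collection frequency the user accepts (requests per day)
    max_frequency_per_day: int = 1

@dataclass
class AppDataRequest:
    app_name: str
    categories: set
    frequency_per_day: int

def check_request(prefs: MasterPreferences, request: AppDataRequest):
    """Return a decision and the categories that need case-by-case consent."""
    extra_categories = request.categories - prefs.allowed_categories
    too_frequent = request.frequency_per_day > prefs.max_frequency_per_day
    if not extra_categories and not too_frequent:
        return "auto-approved", set()
    # Anything beyond the master preferences falls back to specific consent
    return "needs explicit consent", extra_categories

prefs = MasterPreferences()
request = AppDataRequest("ExampleApp", {"email", "contacts", "fine_location"}, 24)
print(check_request(prefs, request))
# -> ('needs explicit consent', {'contacts', 'fine_location'})  (set order may vary)
```

In this sketch, an app whose requests fit within the user’s preconfigured preferences proceeds without another consent prompt, while anything beyond them is surfaced for explicit, one-off consent – which is also what would nudge developers towards a data-light ‘Lite’ version.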
Businesses and other entities can also be incentivised to collect data ethically and responsibly by creating a government-approved market of accrediting agencies. These accreditors could carry out annual assessments of privacy policies and other data collection practices on a range of metrics – data minimisation, purpose specificity, and so on – and provide score-based certifications or star ratings. A similar mechanism is envisaged through the ‘Data Trust score’ in the PDP Bill. If well implemented, it could go a long way in addressing the shortcomings we see today.
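As a rough illustration of what a score-based certification could look like, the sketch below converts a weighted average of assumed metrics into a star rating; the metric names, weights, and 0–5 scale are placeholders, not the PDP Bill’s actual Data Trust score methodology.

```python
# Hypothetical, simplified scoring for an accreditor's star rating.
# Metric names, weights, and the 0-5 scale are assumptions made for
# illustration, not the PDP Bill's 'Data Trust score' formula.

METRIC_WEIGHTS = {
    "data_minimisation": 0.4,    # collects only the data it needs
    "purpose_specificity": 0.3,  # discloses every purpose up front
    "policy_readability": 0.3,   # plain-language, visual privacy policy
}

def star_rating(metric_scores: dict) -> float:
    """Weighted average of 0-5 metric scores, rounded to the nearest half star."""
    total = sum(METRIC_WEIGHTS[m] * metric_scores[m] for m in METRIC_WEIGHTS)
    return round(total * 2) / 2

example_assessment = {"data_minimisation": 4, "purpose_specificity": 3, "policy_readability": 5}
print(star_rating(example_assessment))  # -> 4.0
```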
Privacy policies today remain complicated and inaccessible for many. There is a case for behaviourally nudging users to invest more energy in comprehending and consenting to how their data is collected and used. Even as our lawmakers work towards devising a robust data protection law, we must also empower people and incentivise businesses to meaningfully safeguard privacy and autonomy in the digital realm – creating a win-win for all in the long term.