Is Privacy a Dead Letter?

In Osgoode Hall Law School’s course Legal Values: Artificial Intelligence, Professor Maura Grossman engages the class with weekly reflections on the legal and ethical issues raised by new advancements in technology. Based on class discussions and reading materials, students are encouraged to provide their opinions on specific questions each week. For one particular reflection, Professor Grossman assigned the following questions:

  • Is privacy a dead letter?
  • Regardless of whether or not you think so, how do we make the world safe and habitable under the present circumstances?

I argue that privacy protection, while at an all-time low in its enforcement, is not completely ineffectual. With better legislation and more technical solutions to strengthen the legal rules, privacy protection can improve.

In today’s age of technological connectivity, I do not believe it is realistic to hold organizations accountable to a standard of privacy as anonymity. Instead, privacy should be framed as the ability to control other people’s access to one’s personal information. So how much control do consumers have over their personal information? At this stage, businesses collect consumer data for purposes beyond the consumer’s control, such as turning data into cash flow or refining marketing strategies. Even though companies constantly collect personal data, the current Canadian privacy framework provides guidance on how to de-identify personal information and even gives consumers some limited power through access to information requests. Unfortunately, there is always the risk of re-identification, the process of linking de-identified data back to the individuals it describes. With a number of new legislative frameworks on the rise, such as the GDPR, the processes for collecting, storing, and sharing data can only become more secure.

Contrary to my views, some commentators are of the perspective that if we cannot maintain anonymity completely, then we might as well abandon the concept of privacy. For example, John Suler, professor of psychology at Rider University in New Jersey, explained that both a public and a private (or anonymous) self are necessary to form human identity, and both need to be intact for our well-being. Further research from Carnegie Mellon University in Pennsylvania found that people sought anonymity as a personal safeguard, which also helped them try new things and express themselves free from judgment. In a modern, connected world, however, I do not believe it is possible to be completely anonymous at all times, whether on the internet or in physical public spaces. This is not inherently a negative thing, nor do I believe complete anonymity is something everyone wants at all times. Research on the “privacy paradox” demonstrates this point quite well: while people say they value privacy, they do little to preserve it. I believe this is one of the issues that has led to the erosion of privacy protection. For a long time, consumers were not showing the government or technology companies that they cared about privacy. However, this was likely due to a lack of understanding of long privacy policies written in complex legal jargon, and to the reality that if you do not consent to these policies, you cannot use the services.

Is it fair, then, that all of the onus falls on the individual to protect their privacy? There is a clear power imbalance between big technology companies that provide free, convenient services and individuals, without any legal prowess or technical knowledge, looking for cost-effective technology. It is very easy to click “I accept” on terms of service and privacy policies when an application is free and everyone you know also uses it. This has led to the sharing of personal information in circumstances where individuals had a reasonable expectation that their information would remain private, not shared with, or even sold to, third parties.

I believe there is hope to turn this trend around. Take the example of the California Consumer Privacy Act. Following the GDPR, it is the next iteration of legislation that can put consumers on a more equal footing with big technology companies. The Act gives consumers the right to request the personal data that companies collect about them, to learn how that data is used, and even to prohibit the sale of their data to third parties. Most importantly, companies cannot charge consumers a different price for the product if they choose to exercise their privacy rights, thus erasing the dichotomy between privacy and convenience.

Strengthening privacy protection also requires further collaboration between regulators and those in the technology industry. One possibility is for the government to provide some form of incentive (likely monetary) for entrepreneurs to build more ethical technology companies. For example, Graphite Docs offers consumers a private alternative to Google Docs that does not collect their personal information and encrypts all of their work. Another good example came from a panel event with The Blockchain Hub at the Lassonde School of Engineering at York University. The panelists discussed the role the government plays in implementing and regulating new technology like “digital identity”. Essentially, this concept allows people to identify themselves electronically without paper documents. The most important feature of this technology is that the individual remains in control of their personal information, which does not have to be stored in a central database that can be sold, or even hacked. For the technology to be used for good and adopted broadly, the government would have a key role in distributing digital IDs and providing a way to authenticate them (so we can trust that people are who they say they are). The government could seek specific guidance from technology experts and look to case studies of countries that have already tried this technology, such as Brazil, in order to improve the current state of privacy protection and implement new technology-driven solutions.

Written by Summer Lewis, a second-year JD candidate at Osgoode Hall Law School. Summer is also the Content Editor of the IPilogue.

This post was originally submitted as a reflection for the course Legal Values: Artificial Intelligence and was reworked for publication with the IPilogue.

One Response

  1. I believe that Canada ought to enshrine privacy as a fundamental right in the Canadian Charter of Rights and Freedoms. Personally, I believe that this is simply an extension of section 7 of the Charter, the security of the person, because individuals now exist online in unprecedented ways and require protection of this online self. For instance, people can conduct highly sensitive transactions online, such as banking, which requires a high degree of protection of an individual’s privacy. Yet privacy policy does not do an adequate job of addressing constantly evolving privacy threats, nor does it hold companies liable when they fail to prevent gaps in their privacy protection strategies.
     Protection of one’s online self is a protection of some degree of anonymity, which cannot be allowed to cease to exist in some contexts. An increase in online convenience cannot lead to the erasure of anonymity.

    Amending the Charter to create a reasonable right to privacy offers an equal and equitable protection against opportunistic companies hoping to profit from personal data. The way I envision this scheme operating successfully is by including a broad definition of privacy that is not technology-dependent and that is written for the benefit of the average consumer. Cases like Hill v Scientology and Dolphin Delivery allow the Charter to apply to private individuals, which would then require conglomerates like Facebook to handle data more ethically.
