Privacy and Technology
John D. Gregory*
Osgoode Professional Development Program
September 30, 2004
Introduction
That the evolution of technology affects privacy is common
knowledge; it is a matter of frequent stories in the media.
Larry Ellison, CEO of Oracle, a very large information technology
company, made headlines in April 2002, when he said that with current
technology, “privacy is already gone.” A year
later, Phil Zimmermann, the creator of the PGP (Pretty Good Privacy)
encryption system, noted that Moore’s law – that computer
power doubles every 18 months – threatens privacy, since the
population of the world is not doubling that often. More
computing power is being aimed at the same number of people.
Whether the speakers consider the statements good or bad news may
vary, of course.
While the legal right to privacy can be said in some ways to have
sprung from Brandeis’ and Warren’s article in 1890,[1]
it is fair to say that technology is why the topic is of such
pressing interest today. The modern examination of privacy has a
couple of starting points: in the United States, Alan Westin’s
book Privacy and Freedom in 1967 and Arthur Miller’s in
1971 (The Assault on Privacy: Computers, Data Banks and
Dossiers). In France, a report on government information
banks in 1974 led to the creation of the Commission nationale de
l’informatique et des libertés (CNIL) in 1978.[2]
The French thinking was a big influence on the Organisation for
Economic Co-operation and Development (OECD), whose Guidelines
for fair information practices and data flows in 1980 have influenced
much of the discussion around the world since then.[3]
All of them were inspired by technology – in fact computer
technology.
This informal note will look at the impact of technology on
information practices, not by business sector but by cross-cutting a
bit through technologies and practices. Some of the issues at
play in evolving technology may be clarified in this way.
The note is not intended to be exhaustive, either as to the
technology or as to the legal implications, or yet as to the policy.
It is not a legal analysis of how particular technologies accord or
can be made to accord with existing or proposed laws. It
collects a few representative ideas that may be pertinent to the
legal themes of this set of materials.
We will look first at some of the big themes – what’s
all the fuss about – and review the key elements of the debate
about privacy and electronics. Then we will look more closely
at two areas where the privacy debate is very active –
authentication and electronic public records. This will permit
a kind of conclusion, or at least an end to the paper.
Themes of talk –
privacy and its limitations
We won’t spend time refining a definition of “privacy”
– since the details can get difficult. For our purposes it can
be said to be the ability of an individual to control information
associated with him or her – his or her personal information
(sometimes called personally identifiable information, which is to say
information that can be linked with him or her, even if it is not directly
connected).
One frequently comes across two related terms that should not be
confused with privacy. The first is security, which involves
protection of information from alteration or destruction, intentional
or not.[4] The second is
confidentiality, which involves protection of access to information,
allowing those who should have access to it and barring those who
should not. This can be considered a subset of security.
One can have a good level of security or confidentiality or both
without having privacy. It is hard to have privacy without
security and confidentiality, however. Thus some of the methods
of protecting privacy require increases in security or
confidentiality. These elements are sometimes comprehensively
known as “data protection” – which is not all of
privacy but which can be an important part of protecting privacy from
computer technology.[5]
Impact of technology on
privacy
Technology affects privacy in many areas, not just in information
technology. One thinks of the following examples, not as an
exhaustive list:
-
heat sensors that can be used from remote locations – e.g.
airplanes - to detect grow houses for marijuana cultivation;
-
compilations of patterns of electricity consumption that may be used
for the same purpose;
-
automobile data reporting devices (such as OnStar in General Motors
products) that log what the car is doing and provide a way to locate
it – sometimes without the consent of the owner, and sometimes
with the result of providing data showing that the owner was driving
dangerously before an accident;
-
spyware – microphones, cameras – that are becoming more
sensitive and subtle.
o
A recent article spoke of a “proof of concept” device –
that actually existed, but was not yet on the market – that
could read and copy mobile phones’ calling lists at a distance
of over a kilometer, or from inside buildings at several storeys’
remove from the device;
o
A currently available web site offers 15 “spy cellphones”
that can be turned on from a remote phone and listen in to
conversations where the spyphone is, without giving the appearance of
being active.[6] Combine
this with a phone with a good speakerphone feature and one can tap a
meeting undetectably.
o
A new program allows callers to fake their identity on caller-ID, so
the person called sees someone else’s name on his or her call
display.[7]
-
keystroke monitors – which record the strokes being made and
allow their users to reconstitute the messages, including
passwords. The FBI used one to get into the computer records of
an alleged – and thereafter convicted – gangster.
Many of these are physical not electronic technologies, though the
last example applies to a computer and the spyphones have computer
assistance. At least three of them have been considered by the
courts, so the law is keeping up to some extent.[8]
When one does turn to computers, one must also remember that
computers presented privacy issues well before the Internet.
The concerns of the 1970s focused on databases, not on online sharing
or spying.
These examples tell us that a focus on online privacy issues –
or laws limited to online privacy – is not enough to deal with
the threats, or to govern the opportunities, as the case may be.
That is why Canadian laws on privacy do not have much in them about
the Internet – the principles are broader.
That said, much of the interest today in the intersection of privacy
and technology does arise because of computers online. Here are
a few of the features that make this world of interest.
-
Computers talk to each other – the Internet dates from 1969,
its commercial use from a decade ago. There are now of course many
other forms of connection than the Internet itself, and the Net is
bigger than the World Wide Web that has made access so easy.
-
Individual computers have unique identifiers for the computer (not
the user), so using the Net leaves a trace that can be analyzed,
compiled and sold.
-
Internet communications are also interactive, in that information
flows both ways in a manner that users may not suspect. Data
can be transferred to the users’ computers – as cookies
are – without the knowledge of the user. Cookies and other
downloaded code can do many things – many of them harmless or even
desirable for the user.
Downloaded programs can analyse the contents of the computer and report
them back to their source. For example, Microsoft offers an automated
software update service which can tell what is in the computer and
transfer patches that install themselves. Such a capacity has
many less benign uses (and not everyone considers an automatic
Microsoft update entirely benign).
-
“Malware” – Trojan horses, worms, viruses, programs
like Back Orifice, and many others affect the operation of computers
and sometimes allow the infiltrated computers to be used for the
purposes of the originators. That can be to launch spam, to
participate in message attacks on chosen targets, or to collect
information for marketing or for fraud. In June 2004 a Trojan
program was found online that would monitor outbound messages,
detect any sent to a list of several prominent financial
institutions, and then send a copy of such messages – which could
contain passwords for online banking – to a computer in Russia.
-
Malware may affect private systems – of homes or businesses –
but may also be used to collect data from big databanks. Intercepting a
credit card number from a live message is much more difficult than
tapping into a database of thousands of numbers – or of course
just stealing a computer containing the information, a lower-tech
threat.[9]
-
Technology using Internet protocols offline, or other forms of
distributed computing and now grid computing, involves hooking up a
lot of computers to each other to increase processing power or simply
efficiency. This puts a lot more people into the information
loop, with an ability to extract (personal) information for their own
purposes.[10]
-
One might make a separate category of governments’ use of
technology. Many governments today are working towards
“government online”. The former federal Privacy
Commissioner expressed three concerns about this movement –
which he, like his Ontario counterpart, generally supported:
§
Risk of excessive access to data through online programs, with the
temptation to engage in data profiling
§
Tendency to outsourcing, raising the question of what happens to
personal information in the hands of third parties. (We have
heard a lot about this recently because of the risk of outsourcing to
US companies that might be subject to secret disclosure requirements
under US security legislation.[11])
§
Authentication systems are needed so government knows who it’s
dealing with, but what information must be collected and what will it
be used for?
Protection of privacy against
technology-based threats:
There are several methods to protect privacy against the threats from
computer technology.
= by physical security
One can restrict access to the physical computer to prevent tampering
or abusive dissemination of information. These days a common
source of loss of personal information seems to be the theft
of laptop computers.[12]
Better physical security would help prevent that.
= by technology
Several technologies offer protection to privacy.
-
Encryption is one. It contributes security and confidentiality
to the privacy equation. If the personal information in the stolen
laptops referred to above had been encrypted, then the thieves would
not have been able to get at it.[13]
-
A group of methods to protect privacy has become known as
Privacy Enhancing Technologies (PETs).[14]
o
Among them are methods of defending against online threats –
cookie fighters, spam filters, virus detectors, firewalls, and the
like. Security patches provided by software vendors could be
included as well.
o
Some more advanced PETs include web anonymizers (permitting one to
visit web sites without leaving any personal information) or
pseudonymizers (permitting one to operate under a different name on
different web sites); privacy coders that automatically judge the
privacy policies of the web sites before permitting one’s
browser to log into them (supported in some cases by sites’
technology announcing their policies in machine-readable form); and
relational database analysis that can establish different security
and access controls for the same person’s personal information
on different aspects of the person’s files.[15]
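The encryption point made above – that properly encrypted data on a stolen laptop is useless to the thief – can be sketched concretely. The following Python toy uses a keystream construction invented purely for illustration (real systems should use an established cipher such as AES, not this); only the idea of passphrase-derived keys and unreadable ciphertext is the point:

```python
import hashlib
import secrets

def _keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: hash the key with a counter. Illustration only --
    # not a vetted cipher; real systems should use AES or similar.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: bytes) -> tuple[bytes, bytes]:
    # Derive a key from the passphrase with a random salt, then XOR.
    salt = secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    stream = _keystream(key, len(plaintext))
    return salt, bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt(passphrase: str, salt: bytes, ciphertext: bytes) -> bytes:
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    stream = _keystream(key, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

salt, ct = encrypt("correct horse battery", b"SIN: 123-456-789")
assert ct != b"SIN: 123-456-789"   # what the laptop thief sees is gibberish
assert decrypt("correct horse battery", salt, ct) == b"SIN: 123-456-789"
```

Without the passphrase, the stored bytes reveal nothing; security of the data no longer depends solely on physical custody of the machine.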
While we are here, we should observe a major weakness of
technological fixes for privacy: the use of “social
engineering”, i.e. exploiting human weaknesses to circumvent
the technology. Why bother to crack encryption when one can
persuade (or bribe) a secretary to provide a password? Or
cruise through an office and find passwords stuck to monitors?[16]
Or persuade people to reveal their information by pretending to
be someone else (“phishing” for personal information)?
“Trusted insiders” may prove less trustworthy than they
are supposed to be. One security analyst ran an ad last year
reciting all the usual technical protections: anti-virus programs,
firewalls, etc – “but what about Rose in accounting?”
= by law
The law provides a lot of protection for privacy now, including
against attacks that arise from technology.
·
The Charter of Rights protects against unreasonable
searches, in section 8. The section has been used to prevent
the use of heat sensor searches as described above.[17]
·
The Criminal Code outlaws fraud, whether by identity
theft or otherwise. Other misuses of personal information are
crimes regardless of the medium used to accomplish them.
·
Four provinces have a statutory tort of invasion of
privacy.[18]
·
The common law can provide remedies as well. A
class action was launched against the business that lost a hard drive
of government records in Saskatchewan in 2003 – though no data
were shown to be extracted, so the losses suffered may be considered
speculative.[19]
·
There is an interesting discussion to be had –
elsewhere – about the role of civil liability generally to
enforce rights or to encourage good behaviour. US statutes
often provide private rights of action, hoping to inspire private
enforcement of public policies. The threat of heavy legal
losses may inspire better practices than exhortation or publicity
alone.
·
The statutory rights to privacy are the subject of much
discussion during this program and need not be elaborated on here.
They apply to most of the public sector and much of the private
sector in Canada. Health information is given special statutory
coverage in several provinces.
·
Why do we need these new(ish) laws when most of the things we
are worried about are already illegal or subject to civil remedy?
In part at least because prevention is worth more than a cure –
once the personal information is out there, it’s hard to
control.
= by policies
Numerous public or private guidelines or “best practices”
are available, and many have been very influential in setting
behaviour and, equally important, in establishing expectations of how
personal information should be treated, and how technology should be
designed if it is to collect, use or disclose personal information.
Here are a few examples.
-
the OECD guidelines of 1980 have been mentioned before.[20]
They are the source of most understandings of “fair information
practices”, including PIPEDA.
-
The Model Privacy Code of the Canadian Standards Association (1996)
was developed by a public-private collaboration and set best
practices that influenced sectoral codes in many parts of the
economy.[21]
-
Public sector policies – e.g. Ontario Management Board
Directives – require privacy impact assessments on new IT
projects, on programs that will permit data matching, and others.
-
Private sector privacy policies can be supported, checked, audited or
otherwise made credible by web seals etc. (which may in turn be
enforced by law, e.g. Federal Trade Commission actions in the US that
have held it to be a misleading trade practice not to comply with
one’s published privacy policy).
-
Human resources policies (subject to union resistance, on occasion)
may overlap with efforts towards education (next item). This is
in part where you deal with Rose in accounting.
= by education
A lot of information is available about how to protect privacy in an
age of technology, both respecting the implementation of technology
and respecting the human elements.[22]
This is perhaps the best way, as well, to reduce risk of “social
engineering” attacks (as well as by good human resources
policies and practices – subject to limits!)
One may sum up the discussion so far by noting that technology is not
all threats; it can protect as well as harm privacy. Computer
security experts speak of the TRA – threat/risk assessment –
in deciding how to configure their systems. To that the modern
designer adds a PIA – privacy impact assessment. They are
both forms of risk management – as (arguably) is most legal
advice.[23]
It is generally accepted that one has to build in the privacy
protection at the front end, as part of the architecture of an
information system. It is hard and expensive and disruptive –
and maybe too late – to retrofit privacy.
Policy factors
The discussion now needs to become a bit more complicated.
These assessments are not simple matters. They are not merely
technical; they require judgments of values – what values will
our systems promote or protect? Privacy is not the only value
of a system.
This is clear enough in the policies and in the laws, where rules
are followed by exceptions, and exceptions to the exceptions.
One example from Ontario’s Freedom of Information and
Protection of Privacy Act (FIPPA) uses a sextuple negative:[24]
s. 21(1)(f) “A head
shall refuse to disclose personal information to any person
other than the individual to whom the information relates
except if the disclosure does not constitute an
unjustified invasion of personal privacy.”
The formulation may be completely justified, as a matter of
drafting in its context. Certainly elements of it can be defended.
It is cited here to show the contradictory forces that the policy
analysis has to take into account.
One can find dozens if not scores of such exceptions –
express or implied – in FIPPA, PIPEDA, and other legislation
across the country.
For the remainder of this paper, we will look at one way the
governmental system works now – since there has been longer
experience in Ontario at least with public sector privacy laws - and
see its impact, then move on to a couple of thematic impacts for a
closer look at how technology affects privacy with legal effect.
The way the FIPPA system works now, institutions are key.
Institutions include all the ministries of the government.
Ministries are independent in FIPPA – one can’t share
personal information with another without statutory justification.
(There are justifications in the Act that this paper does not need to
catalogue.) The basic principle is that the government can’t
talk to itself. This is deliberate and actively defended, not
an oversight in drafting.
This is understandable with sensitive information – e.g. tax
information, where self-assessment depends on confidentiality, or
health information, which needs some sharing but which should not be
handed around for fear of prejudice as well as embarrassment.
When the Ministry of the Attorney
General sent out a million letters to collect old unpaid fines, about
ten years ago, it received a lot of complaints on the order of “My
father died last week”, “We’re bankrupt”, and
so on – as if the government had a master database of
everybody’s life events, kept right up to date (though people
had not sent in the information they were supposed to). People
expect it – except when they object to it. In any event,
the master database does not exist, and FIPPA is there to prevent it.
What has this got to do with technology? It illustrates the
principles at work even before we go electronic – principles
that can get tested hard when electronics are introduced.
The key issue here can be stated as
privacy versus efficiency. FIPPA requires the government to
sacrifice efficiency for privacy, except for express places where it
is preserved. And electronic communications are often
introduced on grounds of efficiency, or at least effectiveness.
The way out for the good lawyer or
policy advisor is to redefine terms to justify the situation one
finds oneself in. The former federal Privacy Commissioner
explained on the web site that “efficiency” does not just
mean the least-cost solution. Efficiency means getting to where
you want to go with the least cost – and where you want to go
includes privacy.
The question is how much privacy one wants, and at what cost.
That’s the debate, and it’s ongoing, and it’s
inevitable.
This has of course system design implications: could the same
level of protection of important values be achieved with less impact
on program efficiency? Could one impose principles
across the board, rather than insist on ministerial silos?
Would it make a difference to take a PIPEDA approach –
disclosure, consent, controls on use – to government as a
whole?
There are some limits to that approach for government.
·
government collects a lot of information without consent and
has to do so, with the result that an opportunity for informed
consent may not exist;
·
the fear of privacy advocates is often that of big government
– so they want to hobble government with internal baffles and
barriers as well as with principles. This way the privacy
safeguards are easier to see and thus to enforce. We will see
this thinking at work in both our specific focus areas below.
Two examples
Now we will look at two areas where technology and privacy
intersect: one drawn from computer security practices –
authentication; and the other drawn from where computers appear
to enable expansion of the public good – but with risks that
privacy advocates are striving to bring under control –
electronic public records. We can see how these analyses
of risks, and the policy balances and legal measures, can work in
practice.
All of this discussion represents work in progress. The
answers are not yet final and satisfactory for everybody.
Authentication
This discussion is an example of the interplay between security
and privacy (sometimes working in tension, sometimes together).
Authentication is the providing or assessment of evidence
(possibly but not necessarily in the courtroom sense) of a source of
a document or of the attributes of a person (or other entity).
The term is perhaps most frequently used of a demonstration that
people are who they say they are – that a document actually
comes from the person it purports to come from.
It may also go to the integrity of the data – that there
have been no alterations at any material time. I use the term
for the purpose of this discussion to deal with confirming the
identity or other attributes of a person, since those attributes will
be personal information in the sense we have been talking about
today.
The need for authentication is not new and did not arise with any
technology. We have always needed to be able to satisfy
ourselves about the identity or attributes of a person (“the
person carrying this signet ring has my authority”, “the
person carrying this letter is trusted by the signer of the letter”)
and of a document.
A frequent method of authenticating a document is the signature of
the originator of the document. This is of course a long-standing
practice that everybody knows – to the extent that everyone used
to know what a signature is!
In an electronic world, authentication becomes tricky. There
has been much talk of “electronic signatures”, though
some discussions of it (like this one) have veered off into
electronic authentication, which is not quite the same thing.
Technology reopens questions of identification and of how one
links an identity (a term that needs always to be read as including
“or other attribute”) to an electronic text (or image or
other electronic “thing”).
Authentication involves two big challenges that must be analyzed
differently:
i)
authenticating someone you know – is it the person I want it to
be? (You know the attributes of the identity.)
ii)
authenticating someone for the first time – does this person have
attributes I want (for a transaction, for a service…) and do I
trust the assertion of those attributes?
So one can say that identity is just another attribute.
This is true in principle, but knowing what is commonly called an
identity may carry knowledge of other attributes.[25]
It is easier to do the first task, which can be called a “one
to one” matching of authentication data – the purported
attribute (identity) with the existing data.
It is harder to do the second – “one to many”
matching – to see if a person (an attribute or collection of
attributes) matches anyone in one’s whole population of
possibly trustworthy people, or of known untrustworthy people!
Both tasks can be important for the person doing the
authentication. The second, however, risks more invasion of
privacy of the person coming to be authenticated, since it may pick
him or her out of a crowd, track him or her down, associate him or
her with other attributes also known to the authenticator in ways
that are hard to predict.
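The distinction between the two tasks can be sketched in a few lines of Python; the records, names and attributes here are invented for illustration only:

```python
# Hypothetical population database; all names and attributes are invented.
population = {
    "alice": {"pin_hash": "a1", "age_over_18": True},
    "bob":   {"pin_hash": "b2", "age_over_18": False},
}

def one_to_one(claimed_id: str, presented_pin_hash: str) -> bool:
    """Verify a claimed identity against that person's own stored record."""
    record = population.get(claimed_id)
    return record is not None and record["pin_hash"] == presented_pin_hash

def one_to_many(presented_pin_hash: str) -> list[str]:
    """Search the whole population for a match -- the privacy-invasive case:
    it can pick a person out of a crowd without any claim from them."""
    return [ident for ident, rec in population.items()
            if rec["pin_hash"] == presented_pin_hash]

assert one_to_one("alice", "a1")        # one-to-one: checks one record
assert one_to_many("b2") == ["bob"]     # one-to-many: scans everyone
```

The one-to-many function necessarily touches every record in the population, which is exactly why it raises the tracking concerns described above.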
Why does one want to be authenticated? What constitutes a
satisfactory identity – and thus supports a satisfactory
authentication of the identity – depends in part on why one
wants to know, what the purpose of identification is.
Authentication runs on, or combines, three main methods: what you
are,[26] what you know, what
you have. Examples in the physical world are my name –
who I am; the key to my house – what I have; and knowledge of
my address – what I know. On the simple technology side,
consider an ATM card – what you have – with its PIN –
what you know.
A common method now for being authenticated for access to systems
involves a user identification (a name or variant, or even a pseudonym)
and a password (what you know), as proof that the person asserting the
name is the right person.
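The user-ID-and-password check just described can be sketched as follows. This is a minimal illustration using salted hashing (a standard precaution, though the paper does not specify any particular implementation):

```python
import hashlib
import hmac
import secrets

def enroll(password: str) -> tuple[bytes, bytes]:
    """Store a salted hash rather than the password itself, so a stolen
    credential database does not directly reveal passwords."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(password: str, salt: bytes, stored: bytes) -> bool:
    """One-to-one check: does 'what you know' match the stored record?"""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison to avoid leaking information via timing.
    return hmac.compare_digest(candidate, stored)

salt, stored = enroll("hunter2")
assert authenticate("hunter2", salt, stored)
assert not authenticate("letmein", salt, stored)
```

Even a careful implementation like this remains only as strong as the password the user chooses, which is the weakness the next paragraph turns to.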
This may not be very secure: one forgets passwords, posts them
close to the machine – or chooses a weak password. So there
is a risk of improper access or even identity theft. So systems
designers look for more security. But more secure access
controls may bring different threats to privacy – partly
because the technology is so powerful one is tempted to over-use it.
Here is an obvious low-tech example: a “single sign-on”,
essentially the same user I.D. and password for everything.
This is often advocated as a way to simplify a complex electronic
world. However, if that one method is compromised, identity
theft is very easy.
Here are two higher-tech examples: (i) biometrics and (ii) public key
infrastructure (PKI).
(i)
biometrics – “what/who you are” – finger
scan, retina scan, facial scan, etc. (a handwritten signature is
arguably a form of biometric). To be technically acceptable, a system
using biometrics for authentication will have to produce acceptably low
rates of false positives and false negatives. Different systems will err in
different directions. Some work better than others for some
purposes, purely from the point of view of security.
The privacy issues in biometrics make some people very nervous.
The Information and Privacy Commissioner in Ontario has been very
vigilant about government uses of biometrics.[27]
Concerns include:
-
Potential overuse – biometric data give people an identifier
that may be used to trace them in one-to-many searches. The IPC’s
policy bans government use of biometrics wherever they could permit
this; as a result, there is no use of biometrics in the Ontario
public sector today.
-
A human being has only so many physical features – so there is a
tendency to use the readily accessible ones as a single sign-on for many
kinds of access and privileges, thus increasing the risk of compromise.
-
If the biometric identifier is compromised, how do you change what
you are? You can’t get a new retina as you can a new password.
-
There are some promising limited uses, e.g. as access to a PKI key: a
smart card reads a finger scan and gives access to the key, so the
biometric information does not have to go beyond the card itself.
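The false positive / false negative trade-off described under biometrics can be illustrated with a toy matching function; the numeric “templates” and threshold values below are invented for illustration, since real systems compare far richer feature sets:

```python
def match(enrolled_template: list[int],
          presented_sample: list[int],
          threshold: int) -> bool:
    """Toy similarity match. Biometric comparisons are never exact, so a
    threshold trades false accepts against false rejects."""
    # The 'templates' are just lists of numbers standing in for features.
    distance = sum(abs(a - b)
                   for a, b in zip(enrolled_template, presented_sample))
    return distance <= threshold

enrolled = [10, 20, 30]            # stored at enrollment
same_person_noisy = [11, 19, 31]   # genuine user, with sensor noise
impostor = [25, 5, 40]

# A loose threshold accepts the impostor (false positive)...
assert match(enrolled, impostor, threshold=50)
# ...while a very strict one rejects the genuine user (false negative).
assert not match(enrolled, same_person_noisy, threshold=2)
# A middle value gets both cases right -- for these particular samples.
assert match(enrolled, same_person_noisy, threshold=5)
assert not match(enrolled, impostor, threshold=5)
```

Where the threshold is set is a policy choice as much as a technical one: a border-control system and a gym turnstile will reasonably err in opposite directions.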
(ii)
Public key infrastructure
A public key infrastructure is a combination of technology, contract
and administrative rules that uses public key cryptography to guarantee
that a particular person has created a document (electronically) and that
the document has not been changed between its creation and its
reading. It often uses a certificate to link the encryption
system with a person (identity). The details of how it
works and the varieties of its implementation are beyond the scope of
this paper.[28]
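The core signing-and-verification idea underneath a PKI can be illustrated with textbook-sized RSA numbers. This is a toy sketch only: real systems use keys of 2048 bits or more, padding schemes, and certificates binding public keys to identities, none of which appear here.

```python
import hashlib

# Classic textbook RSA parameters (p=61, q=53) -- far too small for real use.
n, e, d = 3233, 17, 2753   # modulus, public exponent, private exponent

def digest(message: bytes) -> int:
    # Reduce a real hash into the toy modulus range (illustration only).
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    # Only the holder of the private exponent d can compute this.
    return pow(digest(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public key (n, e) can check origin and integrity.
    return pow(signature, e, n) == digest(message)

sig = sign(b"pay $100 to Alice")
assert verify(b"pay $100 to Alice", sig)        # genuine document verifies
assert not verify(b"pay $100 to Alice", (sig + 1) % n)  # altered signature fails
```

What a PKI adds on top of this arithmetic is exactly the certificate and administrative machinery the paper describes: some trusted party must vouch that the public key (n, e) really belongs to the person claimed.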
Three main elements of a PKI present privacy challenges:
a)
Enrollment: One has to sign people up for a PKI and give them
the keys to use to encrypt or sign or receive messages.
One needs to identify them clearly – often more clearly than
they have been identified before, since their electronic messages
will have only the “digital certificate” to identify
them, not letterhead or signature or personal identification.
But does this present a risk of over-collection of personal
information? Who gets it, what’s it used for? One
has to face the serious need for detailed information if the
assurance level of the whole system is to be maintained.
b)
Management of information: One has to deal with people
entering and leaving the system, losing their keys and having them
reissued, keeping keys so the managers can decrypt data when someone
leaves. There may be different criteria for public employees
and members of the general public here.
Another management question involves storage of authentication
data. Is there a central repository of passwords or keys or
certificates that a bad guy could attack and learn lots of useful
stuff all at once? Examples of credit card numbers abound.
c)
Use of keys: what does an encryption key unlock? What
does it give access to? What information is on a key? The
principle is not different from any single sign-on device, but the
power and mystique of PKI is sometimes thought to justify such a
system. The policy preference these days is for a base key and
additional keys for access to additional databases, or multiple keys
(but that may involve heavy administration expenses).
In addition to these PKI-specific issues, one also faces frequent
challenges to this kind of authentication system based on the alleged
ability of people giving access to see details of what else one has
access to – i.e. is the authentication method a kind of bridge
to let others pass from one data bank to another. When the
health authority sees your card to give you access to your health
data, it also sees information about your driver’s record and
your education and employment – or the information it keeps to
let you in would let its employees get into the other databases.
The compromise could work a number of different ways.
This is not just a PKI issue. It is an example of the
internal barriers within government, referred to earlier, designed to
preserve privacy by the design of the information flow itself, not
just by rules applicable to conduct of people.
One notes here again the increasing levels of interplay between
security and privacy – we have a system that takes one step
towards privacy, but to secure it, one more step that again raises
privacy issues. One example close to home: Ontario wants to do
background checks to ensure that enrolment authorities (the people
who sign people up to a PKI and confirm they are entitled to
participate and get encryption keys) are trustworthy – they are
the bedrock of the integrity of the whole system. But those
checks have been stopped in Ontario for labour relations and privacy
reasons. The background checks themselves are alleged to tell the
employer too much, and possibly irrelevant information, about too
many people – people who are already employees and should not
be put at risk of their employment for such information. It is
hard to say here who has missed the point worse … but it’s
an example of the debate, and the complexity of administering an
electronic authentication system.
That said, it should be noted that strong authentication can
protect privacy as well. There was some debate
lately in the United States whether the Health Insurance
Portability and Accountability Act (HIPAA) required use of PKI to
protect personal information. The answer seems to be no, but
draft regulations originally dealt with electronic signatures in a
way that suggested it.[29]
The fact that the question was asked shows a different perspective on
PKI, not as a threat but as an aid to privacy. Infomediaries
claim that common use of Social Security Numbers prevents identity
theft by ensuring one has the right person for other information in
one’s possession.
One might be inclined to conclude that technology – all
technology – may be value neutral, or at least subject to being
used properly with proper attention to privacy principles.
Recently the federal government ran a public/private study to develop
principles for authentication systems in government and business.
They work at a high level but are intended to balance the need for
strong authentication with the demands of privacy, while allocating
principled and legal responsibility appropriately.[30]
Here is an outline of the privacy principles among the Authentication
Principles:
- collect personal information only where necessary
- use collected information only for authentication, no other purpose
- focus if possible on business attributes, not identity or other personal characteristics
- collect, use and disclose only with the consent of individuals
- protect personal information against undue disclosure
- comply with privacy laws and codes
- include privacy in assessing compliance with authentication standards
The rules are not really different or surprising, but system
designers would do well to keep them in mind.
For example, one might think of unsigned electronic submission of
business-style registrations, where the personal identity of the
person submitting is not important or not kept on paper anyway;
anonymous access to government services where the same service or
information would be provided to anyone; and collection of attributes
other than identity to justify services or to keep records of
clients, because non-personal attributes may be harder to link to
others in the construction of a data biography.[31]
In short, adherence to the privacy principles in doing
authentication requires alert system design at the planning and
concept stage. This is no surprise, but it is further
confirmation that different studies with different priorities end up
with the same recommendation.
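The attribute-over-identity principle can be sketched in a few lines of code. This is a minimal illustration only, not any government's actual system; the function names, the attribute string, and the use of an opaque random token are all invented for the example:

```python
import hashlib
import secrets

def enrol(attribute: str, registry: dict) -> str:
    """Issue an opaque credential tied to a business attribute, not a name.

    The server stores only a hash of the token and the attribute itself;
    no personal identity is collected at all.
    """
    token = secrets.token_hex(16)  # random value; reveals nothing personal
    registry[hashlib.sha256(token.encode()).hexdigest()] = attribute
    return token  # given to the client; the server keeps only the hash

def authenticate(token: str, required: str, registry: dict) -> bool:
    """Authorize by attribute alone; identity is never asked for or disclosed."""
    return registry.get(hashlib.sha256(token.encode()).hexdigest()) == required

registry = {}
t = enrol("licensed-contractor", registry)
assert authenticate(t, "licensed-contractor", registry)
assert not authenticate("forged-token", "licensed-contractor", registry)
```

The design choice mirrors the principle in the list above: the system can confirm "this credential belongs to a licensed contractor" without ever knowing, or being able to disclose, who the contractor is.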
Public Records
When public records go electronic, it is arguable that they change
their nature. They have always contained personal information,
often subject to mandatory collection. But privacy was protected to a
significant extent by their “practical obscurity” –
it was hard to find the information and to collate it.
For example, the Ontario Companies Branch has records on all
corporations in the province. One can read in the files who are
directors of a particular company. But what companies is a
particular person a director of? One cannot find that out without
looking at all the records!
It is arguable that electronic records make the public records
really accessible to the public for the first time. The
electronic format provides real, convenient, searchable access –
possibly even remote access.
However, technology also allows the
information from the public records to be matched up with other
information from other public records, or from private sources, to
create data files on people – another element of the concern
we’ve seen before. One adds the usual policy concerns:
what will these biographies be used for, and how accurate is the
information anyway?
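The matching described above takes only a few lines of code once the records are electronic. The records and field names below are invented for the illustration; the point is how little effort collation takes, and how it answers exactly the question that practical obscurity used to make impractical:

```python
from collections import defaultdict

# Invented sample data standing in for two separate public records.
corporate_registry = [
    {"name": "A. Smith", "company": "Acme Ltd."},
    {"name": "A. Smith", "company": "Widgets Inc."},
]
voters_list = [
    {"name": "A. Smith", "address": "12 Main St."},
]

# Collate everything known about each person across both sources.
biographies = defaultdict(list)
for source in (corporate_registry, voters_list):
    for rec in source:
        biographies[rec["name"]].append(
            {k: v for k, v in rec.items() if k != "name"})

# The question the paper records could not answer without reading
# every file: what companies is a particular person a director of?
companies = [r["company"] for r in biographies["A. Smith"] if "company" in r]
```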
And we have to consider the permanence of these records, once the
information is out there, out of official or public control. Is
there any “right to forgetting”, in principle (as there
is in law in France, for example)? One thinks of the law on
pardons in Canada, which is only a partial example. How
long should difficulties or even bad actions be held against
someone? When do they become irrelevant?
So do we have to rethink why some records are public, or what
information should be in them, or what uses can be made of them?
This note covers three areas of inquiry on the topic:
i) PIPEDA’s rules on public records
ii) IPC policies and their expression
iii) Electronic court records
i) PIPEDA
PIPEDA says that one can use personal information in public
records as prescribed by regulation.[32]
This rule applies to public records on paper or in electronic form,
equally. The regulation[33]
says that one can use information in such records only for the
purposes for which the public record is maintained. So for
example (my examples, not in the regulation) one could get names of
directors of a corporation in order to send them suggestions about
corporate policy. One could use information about voters on a
voters’ list to send them solicitations to vote for your
candidate.
Amendments to the Election Act in Ontario in 1998
restricted the disclosure of an electronic version of the voters’
list and the purposes for which it could be used.[34]
The problem is knowing what the purpose is, in many instances.
Who gets to say what the purpose is: it’s not in the governing
statute, usually.
The information may have been very widely used, so far, even if
based on paper records. The commercial information compilers
provide the information to a lot of different users, in ways whose
usefulness is hard to predict. Finding missing people
(including deadbeat support payors), checking elements of a credit
history, creating genealogical studies – even targeting
commercial messages to people likely to be interested in them.
It is arguable that the commercial assembly of this information is
more economical for all potential users than having the information
assembled specially for special “permitted” uses –
but this is an element of the “efficiency” argument we
saw earlier.
At this stage, with little case law or direction from
Commissioner’s orders, we can conclude that the PIPEDA test
seems attractive in theory but it is sometimes very hard to apply.
ii) Information and Privacy Commission
In Ontario, FIPPA and MFIPPA (its municipal equivalent) allow the
disclosure of personal information from records “specifically
created to be public”. The IPC interprets this to mean
that even in the hands of another ministry the information is
not public. So the fact that ministry A has the personal
information in a public record does not allow ministry B to release
it. This interpretation, which is at least not the only
possible one, cost a minister her job once. It’s another
example of the intentional construction of internal barriers to flow
of information through government, and through government to the
outside, that I referred to in my general discussion of efficiency.
This is the context for IPC’s views on electronic records –
they look hard for ways to limit disclosure of personal information
in electronic public records. They are conscious of the ease with
which the information can be matched and merged with other such
information. The issue was raised in 1994 by the then
Commissioner, in his report to the Legislature.
Recently the issue came up with respect to municipal records of
election contributions. The Municipal Elections Act, 1996
says that records made under the Act are to be available to the
public, despite MFIPPA.[35]
A reporter had looked at the paper records of thousands of
contributions to scores of candidates in the 1997 municipal
election. He discovered that the clerk had an electronic record
of the contributions, which was used to administer the rebates to
contributors. He asked for that record and the municipality
refused, saying that the electronic record was (a) not made under the
Act since it was for another purpose, and (b) too risky to let out
for the kinds of reasons we have discussed.
On appeal to the IPC, the IPC agreed with the city. It read
the Act very narrowly to avoid the statement that “despite
MFIPPA the records are public”. The Commission went on to
say why an electronic record risked excessive disclosure, so the
paper records alone would serve the function of the Act, which was to
allow public scrutiny of contributions and thus of the integrity of
the election system.
The reporter appealed that ruling to the Divisional Court.
The Court had no sympathy with the Commission or the City. It
held that the IPC’s interpretation of the Municipal Election
Act was wrong and the electronic record had been prepared under
the Act and thus, despite MFIPPA, it must be disclosed.[36]
That would have disposed of the appeal, but the Court went on to
find that the IPC’s interpretation on electronic records was
“unreasonable” (the standard of review for an expert
tribunal). Almost all the information in the electronic records
was available in the paper records, which were admittedly public.
The difference in format did not change the obligation to disclose.
Further, in the circumstances the statutory “public interest
override” applied: the public interest in disclosure overrode
any privacy interest of contributors.
iii) Court records
Though the court was not persuaded by the reasoning of the IPC in
the Toronto election contributions case, Canadian courts as well as
foreign ones have been wrestling with what electronic format does to
the accessibility of court records.
Court records are presumptively public in Canada and in common law
systems generally. The Supreme Court has held that privacy concerns,
say in a family law case, do not override that presumption.[37]
But some privacy is protected: cases may be sealed or parties not
named, where they are vulnerable (e.g. young offenders) or the
details are particularly personal (e.g. “M v H”, a
same-sex support case). Publication bans are known, though they
are supposed to be rare.
Likewise some records “normally” sealed are released
by law for pressing public purposes, notably information on sex
offenders under “Megan’s Law” in the US and “Christopher’s
Law” here.[38]
These considerations are at play for paper records, not just
electronic. So the idea of privacy is not new, in the courts.
In the electronic field, the issues can come up for:
- the names of reported cases (notably in France and Belgium, where online reports do not give individual parties’ names – but such a policy is arguably easier for them because cases are generally known by the name of the court and the date, not the names of the parties as in common law);
- the decision and reasons for judgment – judges are asked to consider what personal information they need to put in the reasons to justify their decision; and
- court records – pleadings, evidence, transcripts, etc. (There does not seem to be much problem with schedules, dockets, etc. – it is generally admitted that these should proceed without editing.)
In Canada, a number of courts have been cutting back their online
decisions. The BC Supreme Court, for example, has stopped
putting family law cases online, despite the Edmonton Journal
case. The Supreme Court of Canada does not publish factums
electronically.
US federal and state courts have both collectively published
policies on the topic in the past year or two (and lots of individual
state courts have rules as well.) Federal courts suggested not
putting criminal materials on line at all, though some pilot projects
have been allowed to go ahead since then. State courts were
more case-sensitive, as it were. But both levels look to
“redacting” – cutting out – sensitive
personal information, which commonly includes Social Security
Numbers, credit card numbers, and names of minor children.[39]
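The kind of redaction the US court policies describe can be sketched with simple pattern matching. The patterns below are illustrative only, not a complete or reliable solution: formats vary, false matches occur, and real redaction still needs human review – which is part of the cost question raised next:

```python
import re

# Illustrative patterns for the identifiers the US policies single out.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "XXX-XX-XXXX"),       # Social Security Numbers
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD REDACTED]"),  # credit card numbers
]

def redact(text: str) -> str:
    """Mask sensitive identifiers before publication."""
    for pattern, mask in PATTERNS:
        text = pattern.sub(mask, text)
    return text

print(redact("SSN 123-45-6789; card 4111 1111 1111 1111"))
# SSN XXX-XX-XXXX; card [CARD REDACTED]
```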
Note that there is a cost to this editing – who pays? Who is
liable if the editing is not done? Commercial law publishers
have argued that the free case law websites won’t be able to
keep up with the cost of all the editing for privacy.
Some people suggest that information can be specially coded in
electronic documents at the input stage – by lawyers or judges
- so the publication can edit itself, as it were, at the output
stage. That may presume a lot more technical sophistication
than many such people now have. The whole issue suggests more
sensitivity to the privacy issue than we have had to have before
electronic publication became popular.
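The “code it at input, edit it at output” suggestion can be illustrated with a trivial tagging convention. The [private] tag syntax here is invented for the sketch; any real scheme would have to be fixed by court rules and supported by the drafting tools lawyers and judges actually use:

```python
import re

# Counsel or the judge tags sensitive spans while drafting;
# the two output versions are then generated automatically.
TAG = re.compile(r"\[private\](.*?)\[/private\]", re.DOTALL)

def court_copy(text: str) -> str:
    """Full text for the court file: tags stripped, content kept."""
    return TAG.sub(r"\1", text)

def public_copy(text: str) -> str:
    """Published version: tagged content cut out."""
    return TAG.sub("[REDACTED]", text)

draft = "The child, [private]Jane Doe, born 2001-03-14[/private], resides with..."
print(public_copy(draft))
# The child, [REDACTED], resides with...
```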
The questions came up in Ontario when we were doing rules of
practice for electronic filing. If documents are filed
electronically, should they be available to other parties and to the
public in this form? If so, with what limits? Limits by
user: only parties and their lawyers? Limits by information:
cut out sensitive material? Limits by type of case?
Limits by proposed use, e.g. no commercial use (but we have noted the
claims that commercial data banks of some court information –
e.g. bankruptcies – can be useful.)
Is it possible or even allowable to trace users of the records, or
those who apply? Is there not a privacy right – justified in
public policy – to consult public records anonymously?
In Ontario the proposed system was not going to produce electronic
files in the court offices for a while, so we did not have to answer
the questions to start the e-filing pilot projects. But the
questions are still valid.
And who decides anyway: the courts or the government, or the
privacy commissioner? With what input? What are the right
restrictions, given the constitutional value of openness?[40]
I do not have an answer to propose to these questions. It
seems to me that a good argument can be made for media neutrality –
impose the same standards on paper and electronic records.
Start with the fair information practices or privacy principles that
lie behind the CSA Code and PIPEDA.
That means starting with trying to
decide what the purpose of the collection is: why is the record
public? Then the restrictions and permissions will flow more
readily. In other words, the PIPEDA regulation makes some sense in
principle.
Do not expect that question to be answered easily. Court
records in particular have a history of wide and varied use –
it is hard to predict how accountability may have to play out in
particular situations for particular users – the public, the
government, the court system, and individual litigants.
Just one example of how hard it is to foresee all the possible
legitimate uses: in the US (I do not know about Canada) there have
been lawsuits against employers for ‘negligent hiring’
for not checking the background of people who offend once employed.
Checking court records could prevent liability. One has to
balance the right of the “offender” to privacy against the rights
of co-workers to protection.
Information gatherers that made submissions to the US courts’
study said that mortgage rates in the US are about 2% a year less
expensive than in Europe, because the more freely available personal
information about borrowers allows lenders to better estimate credit
risks, and thus lose less money, and thus charge less for loans.
Note that not all the privacy principles can apply equally here.
The consent principle in particular is often irrelevant. People
don’t necessarily participate in lawsuits voluntarily –
criminal accused in particular, but civil defendants too – and
they don’t necessarily get to define the public interest in
publicity.
To sum up, court files and decisions are far from the only public
records that raise privacy issues when they go electronic, but they
may help focus lawyers’ attention on the issues that are common
to many.
Conclusions
Looking at these general observations and these two specific areas
where technology has changed the terms or the urgency of the debate
over privacy, what can we conclude?
· Ultimately protecting privacy is a matter of policy, not a matter of technology (if I do say so myself, as a policy person not a technologist), and the law will reflect policy choices.
· An IBM executive said at a privacy conference last year that the problem is that the scientists have not been talking enough with the policy people.
· Obviously such conversations have been going on in Canada, and some fairly subtle collaborations have resulted – CSA Code, IPC publications, web sites – and elsewhere.
· There is a vital role for education. It is fair to say, without disrespect for legislation or for regulators, that the principal responsibility for protecting privacy in the age of technology rests with the individual.
· Education is getting lots of official help – and sometimes that help takes the form of law. The legal form of help, and whether it does help, occupies much of the rest of this program.
· The law of privacy makes much of a person’s “reasonable expectation of privacy”. One of the challenges to education and to law reform is the ease with which evolving technology can outdistance people’s expectations. If people do not understand the threats and how to respond in practice, then they look to the law to restrain the technology at least until the other means of protecting privacy can catch up.
· The debate about the impact of technology on privacy is far from over – and technology keeps changing, so the debate will stay open.
* General Counsel, Policy
Division, Ministry of the Attorney General (Ontario). The views in
this note are not necessarily those of the Ministry.
[4] We are talking here
about the security of information, not “national security”
against physical threats to life and property. “Security
measures” in that sense of the word can often threaten
privacy, a subject discussed briefly later in this paper (see below,
footnote 13.)
[5] This vocabulary is not used consistently by everyone discussing privacy.
[6] Original link was
classified as dangerous by Google in 2013 and removed. Reference was
to a site called 'endoacoustica' and a service called 'spy
telephone'.
[9] Just as Willie Sutton
robbed banks because that is where the money was, so today’s
identity thieves tap into data banks because that’s where the
information is.
[13] And California law
would not require a notice to the individuals whose personal
information was in the computer.
[14] Not to be confused
with STEPS – security technologies enhancing privacy –
again using the word “security” in the sense of national
security. Ontario’s Information and Privacy Commissioner
has a study of STEPS on the web site: www.ipc.on.ca
under Publications and Presentations, Descriptions of Papers.
[15] A brief description
of several PETs is found in a note by Barry Sookman in McIsaac,
Shields and Klein, The Law of Privacy in Canada
(Carswell:Thomson, looseleaf), section 1.3.
[16] Good security design
will not make users change their passwords too often, or users will
inevitably use memory aids that undermine the security.
[17] R. v Tessling, above,
note 8, subject to the views of the Supreme Court of Canada,
pending.
[18] British Columbia (R.S.B.C. 1996, c. 373); Manitoba (C.C.S.M. c. P125); Newfoundland (R.S.N.L. 1990, c. P-22); and Saskatchewan (R.S.S. 1978, c. P-24).
See also the Uniform Privacy Act of the Uniform Law Conference of
Canada (1994), online:
http://www.ulcc.ca/en/us/index.cfm?sec=1&sub=1p3
[22] The web sites of the
Information and Privacy Commissioner of Ontario (www.ipc.on.ca
) and to a lesser extent the Privacy Commissioner of Canada
(www.privcom.gc.ca)
offer a range of studies and recommendations on setting up
technologies to respect privacy and to conduct one’s personal
use of technology to minimize threats.
[23] Just as a lawyer
sometime has to tell the clients they can’t do what they want,
when the clients have gone too far down a path without talking to
the lawyer, so too the privacy authority may tell the technologists
they can’t build the system they want, while if they had
consulted their expert at the start, a way could have been found to
comply with the law and achieve most or all of the business goals.
[24] Not counting “disclose”, which has the negative form “dis-” but the positive meaning “give”.
[25] Roger Clarke, an
Australian privacy and IT expert, speaks of “entities”
that may have many “identities”. Identification is
establishing certainty about an identity, without necessarily
needing to know the core “entity” beneath it.
One’s name is only one possible identity and may not be needed
for particular tasks of authentication. His relevant
publications are listed online:
http://www.xamax.com.au/CV/PublDV.html
[26] Roger Clarke says
“what you can do”, like a signature. “What –
or who – you are” implies a biometric clue, and that may
not be needed for most questions of identity.
[27] Section 29 of Ontario’s Electronic Commerce Act, S.O. 2000, c. 17, limits the use of biometrics as an electronic signature because of representations made by the Commission.
[28] A very brief description of the principles is in my note “PKI in a (small) nutshell”, online: www.euclid.ca/pkishort.html. (Some of the further links are out of date at the moment.)
[31] It is arguable that
public sector needs for authentication differ in some respects –
and thus in legal framework – from private sector needs.
See John D. Gregory, “Solving Legal Issues in Electronic
Government: Authority and Authentication”, (2002) 1 Canadian
Journal of Law and Technology No 2 page 1, online:
http://cjlt.dal.ca/vol1_no2/pdfarticles/gregory.pdf
[32] PIPEDA paragraphs
7(1)(d)(collection), 7(2)(c.1)(use), and 7(3)(h.1)(disclosure).
[34] Election Act,
R.S.O. 1990 c. E.6, section 17.4, enacted by S.O. 1998 c.9 s.15.
The section is worded for any media, but it was inspired by the
prospect of an electronic list being made available.
[35] S.O. 1996 c.32
Schedule, s. 88(5).
[36] Gombu v. Ontario
(Assistant Information and Privacy Commissioner), [2002] O.J.
No. 1776.
[37] Edmonton Journal
v. Alberta (Attorney General), [1989] 2 S.C.R. 1326.
[38] Christopher’s
Law (Sex Offender Registry), S.O. 2000 c.1.