UN Special Rapporteur on the Right to Privacy - Annual Report; Seventy-third session of the UN General Assembly 2018 [2018] UNSRPPub 11 (17 October 2018)



A/73/45712

Advance Unedited Version
Distr.: General
17 October 2018

Original: English

Seventy-third session
Item 74 (b) of the provisional agenda[*]
Promotion and protection of human rights: human
rights questions, including alternative approaches for
improving the effective enjoyment of human rights
and fundamental freedoms

Right to privacy[**]

Note by the Secretary-General

The Secretary-General has the honour to transmit to the General Assembly the report prepared by the Special Rapporteur on the right to privacy, Joseph A. Cannataci, submitted in accordance with Human Rights Council resolution 28/16.

Summary
The report is divided into two parts. The first, introductory part is an executive summary of activities undertaken during 2017-2018. The second and main part is the final report on the work of the Big Data Open Data Taskforce established by the Special Rapporteur on the right to privacy.

Report of the Special Rapporteur on the right to privacy

  I. Overview of activities of the Special Rapporteur on the right to privacy
    1. The period from October 2017 to October 2018 has been extremely productive for the Special Rapporteur on the right to privacy, marked by engagements with civil society, Governments, law enforcement, intelligence services, data protection authorities, intelligence oversight authorities, academics, corporations and other stakeholders.
    2. In March 2018, the Special Rapporteur presented to the United Nations Human Rights Council a comprehensive review of his first three-year term as the inaugural holder of the mandate created by the Council in March 2015.[1] The report provided an account of the Special Rapporteur’s activities in each of the mandate’s thematic areas. The Special Rapporteur is greatly honoured to have had his term extended to 2021 and to continue the mandate’s important work.
    3. The work programme was affected when the Special Rapporteur underwent urgent and unexpected surgery in April 2018. The impact upon his commitments was managed well. The Special Rapporteur thanks the Office of the High Commissioner for Human Rights for its support and assistance during the difficult period of his hospitalisation and convalescence. He made a full recovery and resumed his duties in June 2018.

A. Work of the ‘Health Data Privacy’ Taskforce

  1. The Task Force on Health Data examined issues under the leadership of Dr. Steve Steffensen, MD, Associate Professor, Dell Medical School, University of Texas. Although work had commenced on a draft report, unanticipated events meant that the consultation planned for 2018 was postponed until 2019. The Vice-Chair, Professor Nikolaus Forgo, has agreed to undertake the responsibilities of Taskforce Chair.

B. Work of the ‘Use of Personal Data by Corporations’ Taskforce

  1. The right to privacy has never been more at the forefront of political, judicial or personal consciousness than now as the tensions between security, corporate business models and privacy continue to take centre stage.
  2. In response to events over the past year, including the Cambridge Analytica breach, the introduction of legislation such as the Clarifying Lawful Overseas Use of Data (CLOUD) Act (H.R. 4943) in the United States and the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill 2018 in Australia, and the United States v. Microsoft Corp. case before the United States Supreme Court, the Special Rapporteur brought forward the commencement of the Taskforce examining the corporate sector’s use of personal information.
  3. The Taskforce met for the first time in Malta in September 2018. Its membership is drawn from large corporations leading the digital era and key players promoting the protection of the right to privacy in the technology world. The Taskforce will advise the Special Rapporteur on emerging challenges to, and opportunities for, the promotion of the right to privacy, including the gender impacts of these issues.

C. Work of the ‘A Better Understanding of Privacy’ Taskforce

  1. The Taskforce explores the Human Rights Council’s recognition of the right to privacy as enabling the development of the person, and barriers to this enablement. It will collaborate with initiatives around the world, such as the Australian Human Rights Commission’s examination of the impact of the digital era upon human rights.[2]
  2. While all persons are entitled to enjoy the protection provided by international human rights law, there are reports that the enjoyment of the right to privacy is neither equal nor universal. Gender is one area where the protective and facilitative effects of privacy, and the harms of privacy breaches, can be experienced differently.
  3. Relevantly, the Indian Supreme Court read down section 377 of the Indian Penal Code, to the extent it criminalised consensual sexual activity between adults, in a judgment that recognises the rights of the LGBTQI community in India. This judgment will have a significant impact on the gender and privacy discourse in India. It flows from the 2017 right to privacy judgment in the Puttaswamy case.[3]
  4. The Special Rapporteur has initiated an online consultation on gender perspectives of the right to privacy in the digital era, seeking feedback on questions such as:

(1) What gender issues arise in the digital era? What challenges need to be addressed and what positive experiences can be promoted more widely?

(2) Has the digital era produced new or significantly different gender-based experiences of privacy?[4] If so, what are these?

(3) What are the gendered impacts of privacy invasions on women and men, and individuals of diverse sexual orientations and gender identities, gender expressions and sex characteristics, arising from violations of the right to privacy, including health issues, discrimination in employment, or other areas?

(4) What are good practices in law and service delivery models that address gender-based differences in the enjoyment of the right to privacy?

  5. Submissions were requested by 30 September 2018 for reporting to the Human Rights Council in 2019. The Special Rapporteur is happy to accept late submissions from Member States until 30 November 2018.
  6. This initiative follows the “Privacy, Personality and Flows of Information” consultations held around the world in July 2016, May 2017, and September 2017. The fourth “Privacy, Personality & Information Flows” event, planned for May 2018 in Latin America, was postponed due to the Special Rapporteur’s inability to travel and will occur in mid-2019.[5]

D. Work of the ‘Security and Surveillance’ Taskforce

  1. After Edward Snowden revealed details of surveillance and intelligence sharing programmes operated by the intelligence services of the United States and the United Kingdom, applications were lodged with the European Court of Human Rights (ECHR) concerning the bulk interception of communications; intelligence sharing with foreign governments; and the obtaining of communications data from communications service providers under the United Kingdom Regulation of Investigatory Powers Act 2000.
  2. The ECHR recently found that the United Kingdom’s bulk interception regime violated Article 8 of the European Convention on Human Rights (right to respect for private and family life/communications) due to insufficient oversight of the selection of internet bearers for interception and the filtering, search and selection of intercepted communications for examination, and inadequate safeguards for selection of “related communications data” for examination.
  3. The Court held the regime for obtaining communications data from communications service providers violated Article 8; and that both the regimes for bulk interception and for obtaining communications data from communications service providers violated Article 10 of the Convention due to insufficient safeguards for confidential journalistic material. It further found that the regime for sharing intelligence with foreign governments did not violate either Article 8 or Article 10.[6]
  4. While this judgement concerned the United Kingdom’s earlier statutory framework for surveillance, its findings are very significant and are brought to the attention of Member States for review of their practices and frameworks.
  5. In relation to the December 2016 ruling of the Court of Justice of the European Union regarding the retention of communications data, and the Government of the United Kingdom’s Consultation on its proposed response, the Special Rapporteur provided input in early 2018. This will be available on the OHCHR website according to usual protocols.
  6. In September 2018, the Australian Government introduced into the Australian Parliament the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill which has profound impacts on human rights and cybersecurity internationally and domestically.
  7. The Bill is fatally flawed. It is a poorly conceived national security measure that is as likely to endanger security as to enhance it; it is technologically questionable whether it can achieve its aims without introducing vulnerabilities into the cybersecurity of all devices, whether mobile phones, tablets, watches, cars or CCTV; and it unduly undermines human rights, including the right to privacy. Assurances that it is not a ‘backdoor’ into encrypted communications are unreliable, since it may create, in effect, additional keys to the front door, or even more front doors.
  8. The Bill confers an overly broad discretion in the use of exceptional powers. Accountability lies not with the Parliament but with agencies and the Attorney-General. It lacks judicial oversight and independent monitoring; there is an extremely troubling lack of transparency; and the proposed ability to introduce software, among other actions, into devices is disturbingly akin to government hacking. The Bill was introduced into Parliament after an inadequate period of consultation, just two weeks after the consultation closed and despite reportedly more than 14,000 submissions having been received.[7]
  9. The Special Rapporteur’s concerns are compounded by the Australian Government’s stance on remedies for serious invasions of privacy and by Australia’s limited human rights and privacy protections: no constitutional protection for privacy; no Bill of Rights enshrining privacy; no tort of privacy; and, unlike that of its neighbour New Zealand, a Privacy Act that has not passed European adequacy assessment.
  10. A new approach to addressing the challenges posed by encryption for law enforcement and national security is required. While technology poses challenges to law enforcement and intelligence services, and countering online child sexual abuse and terrorism threats is important, protecting the human rights of citizens is also legitimate and necessary in a democratic society. The technologies that empower criminals and terrorists to evade detection or launch malicious attacks also provide enormous benefits for cybersecurity, privacy, and the economy.[8] Weakening encryption technology puts at risk the modern information economy’s security.[9]
  11. Addressing the complications that encryption causes for law enforcement investigations and intelligence collection requires an approach that avoids weakening encryption and, with it, the national security of other countries.
  12. The Special Rapporteur commends to Member States the approach of the Government of the Netherlands, which has recognised that national action cannot be seen separately from its international context, and that there are no options for weakening encryption products without compromising the security of the digital systems that use encryption.[10]
  13. The Special Rapporteur’s International Intelligence Oversight Forum (IIOF) will meet in Malta in late November. Interest is such that the Forum is over-subscribed.

E. Communications

  1. The Special Rapporteur has submitted 17 communications since 22 September 2017: eight ‘allegation letters’, seven ‘other letters’, and two ‘urgent appeals’. Of the 17 communications, 15 were submitted jointly with other mandate holders and two by the Special Rapporteur alone.

F. Promoting the right to privacy

  1. The Special Rapporteur cooperated with other mandate holders through joint press releases and statements, and by exchanging advice and information. The Special Rapporteur acknowledges the very constructive consultations with the Special Rapporteur on violence against women, its causes and consequences.
  2. The Special Rapporteur has issued eleven press releases and statements. Of these, two were released jointly with other mandate holders: on the rights of environmental activists at the upcoming 24th Conference of the Parties to the United Nations Framework Convention on Climate Change (COP24)[11] and on Mexico’s draft security law.[12]
  3. On 19-20 February 2018, the Special Rapporteur delivered a presentation on the role of the right to privacy within the human rights framework and for civic space protection, and moderated a session on new and emerging trends at the ‘Expert workshop on the right to privacy in the digital age’, organized in Geneva by OHCHR.

G. Country Visits

  1. In June 2018 the Special Rapporteur visited the United Kingdom of Great Britain and Northern Ireland. The Special Rapporteur’s end of mission statement provides preliminary observations.[13] The final report will be submitted to the 40th session of the Human Rights Council in February/March 2019.
  2. In 2015 the Special Rapporteur was very critical of legislative proposals which increased the surveillance powers of the United Kingdom Government. Significant improvements have since been made to the intelligence oversight regime, including the establishment of a better resourced Investigatory Powers Commissioner’s Office (IPCO) and the double lock system, with the equivalent of five full-time Judicial Commissioners reviewing the most sensitive authorization decisions signed off by senior Government officials such as the Home Secretary or the Foreign Secretary. A positive aspect is that the safeguards against arbitrary or unlawful surveillance apply equally to all persons under surveillance by the United Kingdom authorities in its territory, without any distinction based on nationality or residence.
  3. The Special Rapporteur remains concerned about possible deficiencies in the new Investigatory Powers Act 2016, including the requirement that IPCO perform the dual tasks of authorizing surveillance and overseeing that very same surveillance. This may compromise the independence of the post-facto oversight.
  4. The Special Rapporteur identified a need for clear, strong guidelines and oversight of any data-sharing agreement for the National Health Service, and strongly recommended that these guidelines be made public at the earliest opportunity. Discussions with the National Data Guardian suggest this could be within the next 12-24 months. The Special Rapporteur recommended the early completion of work to place the National Data Guardian’s role on a statutory footing.
  5. Other issues within the end of mission statement include anti-radicalization measures, including the Prevent programme, and their impact on Muslims; proposals to criminalise access to extremist material; and matters raised by civil society organisations.

Planned country visits

  1. The next official country visit is to the Federal Republic of Germany (29 October to 9 November 2018). It will be preceded by a call for contributions from interested parties via the Special Rapporteur’s website[14].

Informal visits and international events

  1. While visiting Australia for the Big Data – Open Data consultation, the Special Rapporteur visited three States and met with civil society organisations, a Government Minister and the Shadow Attorney-General, Government officials, corporations, professional associations, academics and other individuals. The Special Rapporteur met the Australian Human Rights Commissioner and Commission President, and gave public lectures for Grand Challenges, University of New South Wales, Sydney; Melbourne University; La Trobe University; and Edith Cowan University. The Optus Macquarie University Cyber Security Hub hosted a briefing between the Special Rapporteur and top listed companies. The Australia and New Zealand section of the International Association of Privacy Professionals (iappANZ) organised meetings with privacy practitioners.
  2. The Special Rapporteur also participated in the following conferences: 16th International Conference on Cyberspace (Czech Republic, November 2017); 11th International Conference on Computers, Privacy and Data Protection 2018: The Internet of Bodies (Brussels, January 2018); Expert workshop on the right to privacy in the digital age (Geneva, February 2018); the Global Internet And Jurisdiction Conference (Ottawa, February 2018); MAPPING Conference (Malta, February 2018).

H. Developments on the right to privacy

The ability to seek remedy

  1. The Special Rapporteur continued to draw the attention of relevant Member States to allegations of violations of the right to privacy and, in his 2018 report, advised the Human Rights Council on violations of article 12 of the Universal Declaration of Human Rights and article 17 of the International Covenant on Civil and Political Rights.
  2. The Special Rapporteur remains convinced that repairing the harm caused by breaches of privacy requires confidence in receiving a fair hearing and possible remedy. The ability to access a remedy is central to the protection of human rights and remains high on the Special Rapporteur’s priorities.

Artificial intelligence

  1. As more of the decisions affecting the lives of all individuals are made using algorithms and machine learning, their impact on human rights needs to be carefully and continuously evaluated.
  2. These technologies are so pervasive they are even relied upon as evidence in court proceedings. Yet how complex algorithms operate is largely unknown, as is their developmental progression in the case of machine learning. This requires examination from the perspective of all human rights. This assessment is necessary prior to, or in tandem with, policies that encourage and enable the development and deployment of products based on artificial intelligence.[15] Strong legal and ethical frameworks are critical to protect affected human rights.

Introduction of privacy and data protection legislation globally

  1. There has been a great increase in the number of countries introducing privacy/data protection laws,[16] with 2018 having been a particularly active year around the world.
  2. Of particular note is India’s draft law following the Puttaswamy decision of the Supreme Court.[17] The draft Bill, released in mid-2018, has many positive features also found in the European General Data Protection Regulation 2016/679 (GDPR), such as data-protection impact assessments, a right to be forgotten and adequate enforcement penalties. But there are also concerns, such as restrictions on research into the potential re-identification of people in supposedly anonymized datasets. Further, while the use of personal data by law enforcement is to be “necessary and proportionate,” disclosure in legal proceedings has very broad exemptions.[18] The Special Rapporteur urges the Indian Government to engage with academics, researchers and civil society organisations raising such issues.
  3. In a judgement of 26 September 2018, the Indian Supreme Court upheld the constitutional validity of the Aadhaar Act, but struck down:

Section 57, whereby private companies could ask consumers for Aadhaar details for identification purposes for their services;

Section 33(2), which permitted the sharing of data with security agencies on the ground of national security; and

Section 47, whereby only the government could complain in case of theft of Aadhaar data.[19]

The Court required the Government to introduce robust data protection legislation.

  4. Within the European Union (EU), there has been significant reform. The GDPR came into force on 25 May 2018. A specific Directive on data protection in the police and justice areas became applicable from 6 May 2018. The Privacy and Electronic Communications Directive 2002/58/EC (ePrivacy Directive) is due to be replaced by the new ePrivacy Regulation[20]. Regulation (EC) 45/2001 lays down the rules for data protection in EU institutions and the duties of the European Data Protection Supervisor; on 10 January 2017 the European Commission adopted a proposal to repeal Regulation (EC) 45/2001 and align it with the GDPR. These measures are expected to apply from late 2018 onwards. With this reform, the EU will complete the first major modernisation of its framework for protecting privacy and data protection in over 20 years.[21]
  5. These important consolidation measures within the EU apply to all sectors, except to privacy in the field of “national security”, a matter excluded from the EU’s competence by Article 4.2 of the Treaty on European Union. Surveillance within the remit of national security, as opposed to law enforcement, is regulated in a much more disparate manner within the EU, through the efforts of countries like Belgium, France, the Netherlands and the United Kingdom to update their legislation.
  6. At the wider regional level, it is encouraging to note that the modernisation of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, ETS No. 108 (Convention 108), was finalised in June 2018 and that the enabling legal instrument, Protocol CETS 223, was opened for signature on 10 October 2018. This is an important milestone as this international treaty, unlike the GDPR, also covers national security and has been ratified by more than 55 UN member states, with an increasing number of non-European states also joining.
  7. In Brazil, the Federal Senate approved a General Data Protection Law to become effective in February 2020. Key elements include[22]:
  8. Non-compliance can result in fines amounting to two per cent of the gross sales of the company or group of companies, or a maximum sum, per infringement, of approximately USD 12.9 million.

Indigenous peoples and data

  1. The Special Rapporteur has studied the privacy culture of Aboriginal Australians for many years.[23] It is one of the most sophisticated, lived expressions of privacy, involving individual, familial and group privacy implemented through behaviours, rites and practices such as private and communal spaces. The Special Rapporteur was therefore pleased that the Big Data – Open Data consultation explored Indigenous Data Sovereignty, albeit in a modest fashion.
  2. The Special Rapporteur encourages Governments and corporations to recognise the inherent sovereignty of indigenous peoples over data about them or collected from them, and which pertain to indigenous peoples’ knowledge systems, customs or territories.

II. Consultation on interim Big Data-Open Data report

  1. The Special Rapporteur presented his interim report on Big Data – Open Data to the General Assembly in October 2017.[24] The report reviewed the challenges to the human right to privacy from a defining feature of the digital era, that is, Big Data – Open Data. Since then, the GDPR has come into force and the Facebook-Cambridge Analytica revelations have emerged.
  2. Consultation with Government officials, civil society organizations, companies and individuals on the interim report occurred in Australia on 26 and 27 July 2018. It was preceded by a call for submissions on the interim report, which closed on 28 April 2018; the submissions received were summarised for the consultation. Further input came from meetings with civil society organisations organised by the Australian Privacy Foundation and from submissions received after the consultation.

A. Summary of feedback

  1. The public consultation considered the origins and uses of Big Data and Open Data; the potential benefits and harms of each; the impact of the use of personal data on other human rights; the adequacy of de-identification techniques; good practices on the use of personal data; the importance of human rights and ethics in automated decision-making technologies; and indigenous data sovereignty, consumer and gender issues, and non-European country perspectives.[25] Much of the discussion concerned Open Data and the privacy consequences of its interaction with Big Data.

B. Open Data

  1. Big Data analysis and computational techniques based on artificial intelligence provide benefits while raising potential privacy risks for individuals and communities, and for the very fabric of democratic societies. The opening up of government information, particularly the iterative public release of data sets containing personal information, needs more nuanced and closer examination.[26]
  2. The consultation examined the proposition that Big Data analytics can identify individuals despite de-identification.[27] Participants heard how identifying whether data, or the results of a data analytics project, contain personal information depends on the circumstances of the use or disclosure, and how this can change depending on other factors. Consequently, re-identification is best described in terms of risk levels rather than as an absolute. Risk levels are based on who has access to the data, how granular the data is (the size of the smallest group in the data), what other data sets can be accurately linked to the data, and the associated external context.
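A minimal Python sketch of the ‘size of the smallest group’ measure described above (illustrative only; the records, field names and attribute choices are hypothetical and not drawn from the consultation materials):

```python
from collections import Counter

def smallest_group_size(records, quasi_identifiers):
    """Size of the smallest group of records sharing the same
    quasi-identifier values (the 'k' of k-anonymity)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical "de-identified" release: names removed, but
# quasi-identifiers remain.
records = [
    {"postcode": "2000", "birth_year": 1955, "sex": "F", "diagnosis": "X"},
    {"postcode": "2000", "birth_year": 1955, "sex": "F", "diagnosis": "Y"},
    {"postcode": "2913", "birth_year": 1988, "sex": "M", "diagnosis": "Z"},
]

k = smallest_group_size(records, ["postcode", "birth_year", "sex"])
print(k)  # 1: at least one record is unique on these attributes alone
```

The more granular the attributes (a full date of birth rather than a year of birth, say), the smaller the smallest group becomes and the higher the re-identification risk.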
  3. ‘Personal information’ within data covers a very wide field and its descriptions vary across jurisdictions. What most definitions have in common is that the scope of personal information can be very broad, and that they turn on the ability to identify an individual, not just on whether the data itself identifies the individual.
  4. Two key aspects for identifying data that contains personal information are either:
  5. The three main mechanisms for data sharing (explicit, derived and inferred) each come with considerations about the degree of personal information contained, and the obligations of the organisation which captures, uses and stores that data.
  6. Online browsing and purchasing data can be used to increasingly personalise services without knowing the identity of the user; however, concerns have been raised as to whether highly targeted anonymous identifiers constitute personal information. Mobile network data has been used for purposes beyond network optimisation, allowing customer churn prediction and even revealing relationships to other mobile users without knowing the identity of the individuals involved.[28]
  7. A key challenge for sharing data is that there is currently no way to unambiguously determine if there is personal information within aggregated data, or whether disaggregated data can be re-aggregated. The risk of re-identification depends on access to (and the ability to link) related data sets, the techniques used to de-identify, and the level of aggregation or perturbation of data. Consequently, different techniques and levels of aggregation of data are used across organisations depending on the perceived risk associated with the data being shared.
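A minimal sketch of the linkage risk described above, with entirely hypothetical data: a release containing no direct identifiers is joined to a separate, identified dataset on shared quasi-identifiers.

```python
# Hypothetical linkage attack: join a "de-identified" release to an
# identified public dataset on shared quasi-identifiers.
health = [
    {"postcode": "2000", "dob": "1955-03-01", "sex": "F", "condition": "X"},
]
public_register = [
    {"name": "A. Citizen", "postcode": "2000", "dob": "1955-03-01", "sex": "F"},
]

keys = ("postcode", "dob", "sex")
for record in health:
    matches = [p for p in public_register
               if all(p[k] == record[k] for k in keys)]
    if len(matches) == 1:
        # A unique match re-identifies the "anonymous" record.
        print(matches[0]["name"], "->", record["condition"])
```

The attack succeeds without any identifier ever appearing in the released data, which is why re-identification risk cannot be assessed by inspecting a dataset in isolation.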
  8. The development of standards of what constitutes "de-identified" data would help to address the challenges of dealing with privacy. Internationally, there is currently only very high-level guidance, and certainly nothing quantitative, as to what "de-identified" means, hence many organisations must determine what "de-identified" means to them on a case-by-case basis, based on different data sets and on how they can reasonably be used or combined with other data.
  9. In 2017, the Australian Computer Society released a technical whitepaper exploring the challenges of data sharing.[29] The paper highlighted that a fundamental challenge for the creation of smart services is determining whether a set of data sets contains personal information. Answering this question is a major challenge because the act of combining datasets creates information. The paper further proposed a modified version of the "Five Safes" framework for data sharing to quantify different thresholds for "Safe". This work continues with the support of Standards Australia and aims to initiate the creation of international standards for privacy-preserving data sharing. A second whitepaper is planned for October 2018 and is expected to form the basis of international standardisation activities, with the goal of ultimately defining robust privacy-preserving data sharing frameworks.[30]
  10. An example presented of the limitations of de-identification for protecting unit-level records was the release online, in August 2016, of a large longitudinal dataset for a 10 per cent sample of Australians who had claimed Medicare Benefits since 1984 or Pharmaceutical Benefits since 2003.[31] This affected the medical data of around 2.9 million Australians, comprising prescriptions, surgery episodes, tests (excluding results), and visits to general practitioners and specialists (excluding doctors’ notes).[32] The dataset had been downloaded 1,500 times before being taken offline following reports that doctors’ IDs could be easily decrypted[33] and, later, that patients could be identified.[34] The release by the Australian Department of Health sought to facilitate medical research.
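Without restating the actual scheme used in that release, the following hypothetical sketch illustrates the general weakness behind ‘easily decrypted’ identifiers: any deterministic transformation of a small identifier space can be reversed by exhaustively enumerating that space.

```python
import hashlib

def pseudonymise(provider_id: int) -> str:
    # Hypothetical scheme: an unsalted hash of a 6-digit provider ID.
    return hashlib.sha256(str(provider_id).encode()).hexdigest()

released = pseudonymise(123456)   # value appearing in the released file

# Only one million candidate IDs: trivially enumerable.
recovered = next(i for i in range(1_000_000)
                 if pseudonymise(i) == released)
print(recovered)  # 123456: the "encrypted" ID is recovered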
  11. Important questions raised by such examples are: ‘should government-held personal information datasets be released externally when there are increasing risks of large-scale privacy breaches from re-identification due in part to the availability of other publicly released information, and inadequate organisational technological capabilities?’, and, ‘what is an appropriate response to prevent such incidents re-occurring?’.
  12. The consultation was clear that the release of government-held information requires adequate privacy protections and regulatory responses. There were strong views that unrestricted access to unit level data, and other personal data unable to be disclosed safely in aggregate form, is incompatible with the right to privacy. Feedback was unsupportive of regulatory responses reliant upon the criminalisation of re-identification undertaken to test the security of released datasets.[35]
  13. Participants described existing mechanisms that permit the use of identifiable personal data for research purposes[36], pointing out that these could be expanded, as appropriate, for further public interest uses.[37]
  14. The question is: ‘if useful data were viewed as a limited resource rather than as an unlimited untapped resource, would we have more sustainable practices?’[38]
  15. Current practices that alienate the data subject are, in effect, ‘killing the goose that laid the golden egg’: distrust that government and private actors will manage personal information appropriately results in people not using services (risking adverse societal impacts, for example in public health) or providing incomplete or inaccurate information.[39] These behaviours also undermine data quality and, ultimately, the accuracy of machine-learning algorithms.
  16. Overwhelmingly, the view was that the sustainability of data practices increases if data subjects are fully-fledged partners in data operations.[40] Participants heard that nowhere is this more evident and important than for indigenous peoples.

C. Indigenous Data Sovereignty

  1. Data are a cultural, strategic, and economic resource for indigenous peoples. Yet, indigenous peoples remain largely alienated from the collection, use and application of data about them, their lands and cultures.[41] Existing data and data infrastructure fail to recognise or privilege indigenous knowledge and worldviews and do not meet indigenous peoples’ current and future data needs. Current practices around Open Data and Big Data, whether under the auspices of governments or corporations, will likely move indigenous peoples’ data interests even further from where decisions affecting indigenous peoples’ data are made.
  2. Indigenous Data Sovereignty (IDS) is a global movement concerned with the rights of indigenous peoples to own, control, access and possess data that derive from them, and which pertain to their members, knowledge systems, customs or territories.[42] IDS is supported by indigenous peoples' rights to self-determination and governance over their land, resources and culture as described in the United Nations Declaration on the Rights of Indigenous Peoples. Implicit in IDS is the desire for data to be used in ways that support and enhance the collective wellbeing of indigenous peoples.
  3. IDS has a place as an underpinning principle in governance arrangements related to Open Data and Big Data. IDS is practised through indigenous data governance that comprises principles, structures, accountability mechanisms, policy relating to data governance, privacy and security, and legal instruments. IDS frameworks can be applied to internally controlled and owned nation/tribal data as well as data that are stored or managed externally. The IDS networks in Australia and in Aotearoa New Zealand are developing protocols around indigenous data governance.[43]
  4. IDS illustrates that good practices in Big Data and Open Data require an awareness of data that is missing, under-represented or misrepresented[44], and of the interests served, or not, by such practices.

D. Gender issues

  1. The consultation heard that privacy can be experienced differently by persons of different gender or gender identity.
  2. Privacy is a heightened concern for LGBTIQ people[45], for example, and can also be essential for the safety of those, usually women, fleeing domestic, familial or religious violence.[46]
  3. While inclusive data collection practices communicate acceptance and respect, intrusive collection can be a significant barrier to accessing services as LGBTIQ communities and others have justified concerns for privacy following experiences of discrimination, stigma and targeted violence.[47]
  4. This issue will be explored in greater depth in the Taskforce on ‘Privacy and Personality’ and the gender project outlined above. In terms of Big Data – Open Data, however, good data practices require reviewing data collections with an awareness of the possible impact of poor privacy practices and of the differing consequences according to gender and gender identity.

E. Consumer rights and personal data collection and use

  1. In data-driven consumer markets, the use of more and more data for developing, selling and promoting consumer products has meant that many data protection issues also become consumer issues, and vice versa. The distinction between consumer law and data protection law is now less sharply defined.[48]
  2. The use of consumers’ personal data by financial services[49] and other sectors, has given rise to concerns at both public policy and individual levels.[50] The fair processing of personal data is increasingly part of the reasonable expectations of consumers regarding the services and products they utilise.[51]
  3. The consultation compared the approaches of consumer law and privacy/data protection law, noting some countries are introducing consumer-privacy initiatives.
  4. Following the Cambridge Analytica scandal, in June 2018 California enacted the California Consumer Privacy Act (AB-375), to take effect in January 2020, to protect the data privacy of technology users and others by imposing new rules on companies that gather, use and share personal data.[52] The Act creates four basic rights[53]:
  5. “The Act also creates a limited right for consumers to sue businesses for data security breaches, based on California’s existing data breach notification law”[55].
  6. It is reported these rights need strengthening, as:[56]
  7. In EU data-protection law, aspects of the GDPR provide EU consumers with new protections, such as greater transparency and control over the data companies collect about them, which may exceed the protections provided by the EU consumer protection directives.[59]
  8. Australia is establishing a Consumer Data Right - a data portability right[60] which falls short of the wider protections provided by privacy or data protection law for Australian consumers whose data is being collected, shared and used on a daily basis.[61] The consultation heard that this data portability right, rather than protecting consumers’ data, potentially will expose it to greater use by third parties.[62]
  9. There is advantage in strong collaboration between consumer law and privacy law, for example in coordinated legal action.[63] Consumer law can be a useful tool to safeguard the overall balance in the commercial relationship between consumers and suppliers, and can be used to assess the fairness of situations in which companies require consumers to consent to the processing of disproportionate amounts of data, and/or to the sharing of data with third parties.
  10. Feedback received on practical measures to help entities improve their trust relationship with users included communicating the terms of data use through standard licences akin to the six standardised Creative Commons licences. This was seen as potentially able to alleviate some of the challenges of complex privacy policies, while simplifying and standardising communication to users in different countries.[64] Draft licence types could be backed by more detailed ‘standard conditions’ privacy policies:[65]
  11. Capturing privacy risks in privacy rating labels could make privacy choices more accessible to consumers and increase the transparency and disclosure of privacy risks by data controllers.[66]
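As a purely illustrative sketch of how such a rating label might be derived (the criteria, weights and grading bands below are invented for illustration and are not taken from any proposal discussed at the consultation):

```python
# Hypothetical scoring of disclosed data-handling practices into a
# consumer-facing label (A-E), in the spirit of standardised labels.
CRITERIA = {
    "sells_data_to_third_parties": 3,
    "retains_data_indefinitely": 2,
    "uses_data_for_profiling": 2,
    "no_user_deletion_mechanism": 2,
    "transfers_data_across_borders": 1,
}

def privacy_label(practices: dict) -> str:
    risk = sum(weight for key, weight in CRITERIA.items()
               if practices.get(key))
    return "ABCDE"[min(risk // 2, 4)]  # risk 0-1 -> A ... 8+ -> E

print(privacy_label({"sells_data_to_third_parties": True,
                     "retains_data_indefinitely": True}))  # "C"
```

A standardised label of this kind would let consumers compare services at a glance, in the same way the six Creative Commons licences let reusers compare permissions without reading each licence in full.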

F. Artificial Intelligence

  1. Machine learning and artificial intelligence (AI) use enormous quantities of data and, in turn, create more data. The combination of data availability, computing power and analytic capabilities using sophisticated algorithms, coupled with machine learning and AI, has the potential to revolutionise societies positively - but also has the potential to profoundly change our world and our chances of survival, not necessarily for the better.[67]
  2. This latter outcome could occur through the potential negative impact on human rights, including the right to privacy. AI methods can be, and are being, used to identify people who wish to remain anonymous; to enable micro-targeting of messaging; to generate sensitive information about people from non-sensitive data; to profile people based upon population-scale data; and to make decisions using this data thereby profoundly affecting people’s lives.[68]
  3. As more and more actions and decision-making are transferred to machines, there is an urgency in ensuring that machine-learning and algorithms are transparent as to their logic and their assumptions. Algorithms used in machine-learning and AI are increasingly complex and transparency will be difficult to achieve. Yet that complexity should not prevent auditing to ascertain lawfulness.[69] Currently the use of Big Data related technologies is not being held sufficiently accountable whether under international human rights law, data protection, sectoral privacy regulation or ethical codes and industry standards.[70] It has been argued that machines should be held to higher ethical standards than humans, and that with ‘the right choices, privacy will not be an historical anomaly - it will be a technologically given right.’[71]
  4. The GDPR limits the use of automated decision-making in certain circumstances, and requires individuals to be provided with information as to the existence of automated decision-making, the logic involved and the significance and envisaged consequences of the processing for the individual.[72] There is an overall prohibition (with narrow exceptions) on decisions made by solely automated processes when such decisions have legal or other significant effects.
  5. The GDPR defines profiling as the automated processing of data to analyse or to make predictions about individuals, and sets an obligation to incorporate data protection by design and by default. Data Privacy Impact Assessments will be mandatory for many privacy-invasive AI and machine learning applications that fall within the scope of data protection law and have substantial anticipated risks, such as the processing of sensitive data. In the case of AI, a Data Privacy Impact Assessment could (perhaps should) enable entities to model the effects of their algorithms in much the same way climate scientists model climate change or weather patterns.[73]
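A minimal sketch of that modelling idea, with an invented decision rule and a synthetic population (nothing here reflects a real system): the rule under assessment is run over simulated data and its outcome rates are compared across groups before deployment.

```python
import random

random.seed(0)

def decision(applicant):
    # Toy automated decision rule under assessment (hypothetical).
    return applicant["income"] + 5 * applicant["years_at_address"] > 60

def synth_person():
    group = random.choice("AB")
    # Assumed structural difference between groups in the synthetic data.
    income = random.gauss(50 if group == "A" else 45, 10)
    return {"group": group, "income": income,
            "years_at_address": random.randint(0, 10)}

# Model the rule's effects on 10,000 simulated applicants.
population = [synth_person() for _ in range(10_000)]
for g in "AB":
    members = [p for p in population if p["group"] == g]
    rate = sum(decision(p) for p in members) / len(members)
    print(f"group {g}: approval rate {rate:.1%}")
```

A disparity between the printed rates would flag, before deployment, that the rule propagates the structural difference in the data, which is precisely the kind of effect an impact assessment is meant to surface.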
  6. The European Union Agency for Fundamental Rights has suggested one way of ensuring effective accountability could entail establishing dedicated bodies with an exclusive mandate to provide oversight of Big Data-related technologies, similar to the role of Data Protection Authorities.[74]
  7. While the means still need to be determined, it is relevant that new technologies have required the strengthening of international humanitarian law throughout the last century.[75]

G. Principles for Big Data and Open Data

  1. In the interim report[76], the Special Rapporteur raised the development of principles for regulating Big Data and Open Data. The consultation indicated that any such development should, as far as possible, draw from international agreements for data protection regarded as representing ‘best practice’. At present, these are the EU’s GDPR and the ‘modernised’ Convention 108 (‘Convention 108+’, 2018) which originated at the Council of Europe but is open to accession globally by States which have enacted consistent principles.[77]
  2. GDPR’s influence is not exerted only through local legislative enactments or its extraterritorial application. Companies outside Europe, Microsoft being the most prominent example, are voluntarily adopting 'GDPR compliance’ across their whole business operations irrespective of a legal obligation to do so. This ‘GDPR-creep’ may be just as significant as legislative adoption.[78]
  3. From another perspective, countries with broad data localisation laws are creating new privacy standards for data collected within their jurisdiction. The Cyber Security Law (CSL) of the People's Republic of China introduces restrictions on cross-border data transfers that differ from international privacy regimes such as the GDPR and the voluntary Asia-Pacific Economic Cooperation Cross-Border Privacy Rules (CBPR).[79] While the GDPR and CSL appear to have similar cross-border transfer tests, CSL does not provide for derogations found in the GDPR.[80] Neither does the CSL contain certain GDPR mechanisms such as Binding Corporate Rules and standard data protection clauses for companies to gain approval.
  4. The preliminary recommendations in the Special Rapporteur’s 2017 interim report to the General Assembly[81] were developed independently from the GDPR and Convention 108+, but are aligned to those instruments (see table below).[82] The alignment achieved is important when considering the wider international context: Convention 108 is steadily being ‘globalised’, and the ‘updated’ Convention (‘Convention 108+’) includes many, though not all, of the GDPR’s new elements.[83] It is likely, in the next five to ten years, that the extraterritorial effects of GDPR with the ever-widening club of Convention 108 countries, will have a significant effect on the deepening world-wide privacy culture. The precise nature of this evolution is still emerging, as is its relevance to the need for further developments such as stand-alone principles for Big Data and Open Data.

While a consistent international framework for the regulation of Big Data and Open Data is needed, it would be premature to commence work on stand-alone principles relating specifically to Big Data and/or Open Data before there has been sufficient time to ascertain the robustness and international effect of the GDPR and Convention 108+.

  5. The Big Data and Open Data recommendations are therefore to be understood in the spirit of existing privacy and data protection principles rather than as any new sui generis rules.

Alignment of Interim Report Recommendations with GDPR and Convention 108+.

Interim Report 2017 (A/72/43103), paras. 126–131(a)-(n) | EU General Data Protection Regulation (GDPR) | Convention 108+
131(a) Accountability | GDPR 5(2) ‘Accountability’ | 10(1)
131(b) Transparency | GDPR 12 ‘Transparency’; 22(3) ‘Automated decision-making’ transparency | 5(4), 8(1)
131(c) Quality | GDPR 5(1)(c) ‘data minimisation’, (d) ‘accuracy’ | 5(1)
131(d) Predictability of ML | GDPR 22 ‘Automated decision-making’ | 8(1), (2)
131(e) Security | GDPR 32-34 security (incl. breach notification) | 7(1)
131(f) Risk identification/mitigation tools | GDPR 35 ‘Data protection impact assessment’ (DPIA); 36 ‘Prior consultation’ | 10(2)
131(g) Employee training etc. | GDPR 37-39 ‘Data Protection Officer’ | –
131(h) Unambiguous focus of privacy regulation | GDPR 52 ‘Independence’ of DPA | 15(5)
131(i) Sufficient regulatory powers for Big Data | GDPR 57 ‘Tasks’, 58 ‘Powers’ | 12, 15
131(j) Privacy laws fit to handle technology advances | GDPR 4(1) ‘personal data’; 4(4) ‘profiling’; 4(5) ‘pseudonymisation’; 22 ‘Automated decision-making’; 25 ‘Data protection by design and by default’ | 8(1), (2); 10(3)
131(k) Formal consultative mechanisms | GDPR 36 ‘Prior consultation’; 57(b), (c), (d) and (g) DPA ‘Tasks’ | –
131(l) Consultations on dangerous practices | GDPR 36 ‘Prior consultation’ | –
131(m) Investigate new techniques, particularly re de-identification | GDPR 57(i) DPA ‘Tasks’ – ‘monitor new technology’; 25 ‘Data protection by design and by default’ | 10(3)
131(n) Ensure citizen awareness | GDPR 12 ‘Transparency’; 13-15 notice to data subjects; 57(b), (c), (e) DPA ‘Tasks’ | 15(2)(e)
126 Open Data to have binding requirements of reliable de-identification + robust enforcement | GDPR 25 ‘Data protection by design and by default’; 4(1) ‘personal data’; 4(5) ‘pseudonymisation’ | 10(3)
127 Rigorous Privacy Impact Assessments if unit-record data used in Open Data | GDPR 35 ‘Data protection impact assessment’ | 10(2)
128 No Open Data or exchange of unit-record data without robust de-identification | GDPR 4(1) ‘personal data’; 4(5) ‘pseudonymisation’ | 2(d)
129 Ensure extra protections for sensitive data | GDPR 9 ‘Special categories’ | 6

Source: Greenleaf, G. Post Consultation Submission, 7 August 2018.

H. Conclusions

  1. Data is and will remain a key economic asset, like capital and labour. Its integral dependency upon personal information demands accommodation with privacy and data protection laws.
  2. International human rights law requires that any interference with the right to privacy must be lawful, necessary and proportionate. On occasions, the challenge to privacy may be lawful, but whether it is ethical is another issue. It is questionable whether some examples discussed here are ethical, lawful, necessary and proportionate. Recent cases of mismanagement of personal data by private and public entities require strong responses to prevent reoccurrences.
  3. International human rights law also requires that those who experience a violation of their right to privacy have access to a remedy. This is even more significant in the Big Data and Open Data era.
  4. A key challenge for releasing data publicly as Open Data is the absence of a way to unambiguously determine if there is personal information in supposedly de-identified datasets or aggregated data.
  5. Economic and political drivers underlie the policies and practices surrounding Open Data. The business models inherent in capitalist economies have little incentive to protect personal data when there is no resultant economic disadvantage in counterbalance to the profits to be made.
  6. An international framework with consistent data protections and clear rules for transnational access would help weigh privacy protections and the competing interests that nations may have in accessing data, for example in the context of law enforcement, or that multi-national corporations may have in managing data flows internally.
  7. Initiatives enabling the unrestricted sharing of data and those dismantling existing privacy legal safeguards are contrary to the protection of the right to privacy, and must cease.
  8. The criminalisation of the re-identification (in the public interest) of de-identified datasets is not supported as a safeguard for personal data.
  9. Detailed unit-record level data (identifiable data) should not be disclosed or published online without the data subject's consent. The use of physical and technical methods (such as Secure Research Environments) to restrict access to sensitive unit-record-level data is appropriate.
  10. Consumer law and data protection law can usefully complement each other. Privacy law, with its human rights and societal dimensions, provides an anchor for consumer law. Sole reliance upon consumer law would deny individuals the broader enabling aspects of the interdependency between fundamental human rights and their remedial mechanisms.
  11. The current and potential manifestations of AI require independent oversight by experts in different subject fields. The evolution of this technology needs a strong legal and policy framework grounded in human rights. This is urgent and critical.
  12. The application of article 22 of the GDPR needs to be closely monitored on its ability to address automated processing issues arising from the use of AI.
  13. It is essential to address the lack of technological capabilities required to engineer appropriate systems, methods and processes and to ensure robust systems, methods and processes for strong protection for personal data. This should involve small tech companies and start-ups.
  14. Where Member States are contemplating legislation for the promotion of Open Data,[84] the consultation furnished the following parameters:

I. Recommendations

  1. The original recommendations made in 2017 (A/72/43103) have been expanded based on the consultation, as follows:
  2. The Special Rapporteur’s initial 2017 recommendations (A/72/43103) (with original paragraph number) were:

Governance:

(a) responsibility – identification of accountabilities, decision-making process and as appropriate, identification of decision makers

(b) transparency – what occurs, when and how to personal data prior to it being publicly available, and its use, including ‘open algorithms’.

(c) quality - minimum guarantees of data and processing quality

(d) predictability - when machine learning is involved, the outcomes should be predictable

(e) security - appropriate steps to be taken to prevent data inputs and algorithms from being interfered with without authorisation

(f) develop new tools to identify risks and specify risk mitigation

(g) support – train (and as appropriate accredit) employees on legal, policy and administrative requirements relating to personal information.

Regulatory environment:

(h) Ensure arrangements to establish an unambiguous focus, responsibility and powers for regulators charged with protecting citizens’ data

(i) Regulatory powers to be commensurate with the new challenges posed by big data for example, the ability for regulators to be able to scrutinise the analytic process and its outcomes

(j) Examination of privacy laws to ensure these are ‘fit for purpose’ in relation to the challenges arising from technology advances such as machine-generated personal information, and data analytics such as de-identification.

Inclusion of feedback mechanisms

(k) Formalise consultation mechanisms, including ethics committees, with professional, community and other organisations and citizens to protect against the erosion of rights and identify sound practices;

(l) Undertake a broad-based consultation on the recommendations and issues raised by this report, such as the appetite for a prohibition on the provision of government datasets.

Research

(m) Technical: investigate relatively new techniques such as differential privacy and homomorphic encryption to assess if they provide adequate privacy processes and outputs (an illustrative sketch of the differential privacy idea follows this list).

(n) Examine citizens’ awareness of the data activities of governments and businesses, uses of personal information including for research, technological mechanisms to enhance individual control of their data and to increase their ability to utilise it for their needs.
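As a purely illustrative aid to recommendation (m), the following sketch shows the core idea of differential privacy using the Laplace mechanism on a counting query (the epsilon value is arbitrary; real deployments require careful calibration and privacy-budget accounting):

```python
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise of scale 1/epsilon.
    A counting query changes by at most 1 when one person is added
    or removed, so this satisfies epsilon-differential privacy."""
    # The difference of two exponential variates is Laplace-distributed.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

print(dp_count(1234, epsilon=0.1))  # e.g. 1229.7: useful, but masked
```

The released value remains statistically useful in aggregate while masking whether any single individual's record is present in the data.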

III. Upcoming reports

  1. The Special Rapporteur will present further reports:

(a) To the Human Rights Council:

(b) To the General Assembly:

  1. The Special Rapporteur will finalise the reports on the official visits to the United States, France and the United Kingdom, for March 2019 and may also report, time and resources allowing, on other issues such as privacy and gender impacts.

IV. Acknowledgements

  1. The Special Rapporteur thanks OHCHR in Geneva for its ongoing work and support for his mandate. The assistance from the United Nations staff in Geneva is gratefully acknowledged.
  2. The Special Rapporteur sourced extra-mural funding and assistance from the Department of Information Policy and Governance, University of Malta, and the Security, Technology & e-Privacy Research Group, University of Groningen, Netherlands.
  3. The Special Rapporteur thanks the University of New South Wales Sydney; the Optus Macquarie University Cyber Security Hub and the University of Technology; Allens Hub UNSW Sydney; L-Universita ta’ Malta; the University of Groningen in the Netherlands; Grand Challenges UNSW Sydney; the Australian Human Rights Institute, the Schools of Mathematics, Social Sciences, Built Environment, Computing and Engineering Sciences, and Law UNSW Sydney, and the Australian Human Rights Commission, for supporting the Big Data - Open Data consultation in Sydney. The Special Rapporteur also thanks the moderators, speakers and audience for their contributions.
  4. The Special Rapporteur thanks the various Government representatives, civil society organisations, academics and individuals met over the past year.
  5. The Special Rapporteur’s work could not be undertaken successfully without the efforts and support of non-governmental organizations, small and large, local, national and international around the world.
  6. The Special Rapporteur thanks the many regulatory and professional bodies who have assisted the mandate, the Taskforce Chairs, members, interns, and the volunteers who provide research and secretariat support. Their commitment to human rights and to the work of Special Procedures mandate holders is unsurpassed.


[*] A/73/150.

[**] The report was submitted after the deadline to reflect the most recent developments.

[1] A/HRC/37/62.

[2] https://www.humanrights.gov.au/news/stories/major-project-focus-human-rights-and-technology

[3] Communication from Smitha Krishna Prasad, Centre for Communication Governance, National Law University, Delhi, 24 September 2018.

[4] Including experience inclusive of sexual orientation, gender identity, gender expression and sex characteristics.

[5] The fourth “Privacy, Personality and Flows of Information” edition in Latin America (2019) will discuss gender aspects.

[6] European Court of Human Rights, Case of Big Brother Watch and Others v. the United Kingdom, 13 September 2018. (Applications nos. 58170/13, 62322/14 and 24960/15) at https://hudoc.echr.coe.int/eng#{"itemid":["002-12080"]}

[7] https://www.itnews.com.au/news/decryption-laws-enter-parliament-512867

[8] Lewis, J.A., Zheng, D.E. and Carter, W.A. The Effect of Encryption on Lawful Access to Communications and Data, Center for Strategic and International Studies, 2017. https://csis-prod.s3.amazonaws.com/s3fs-public/publication/170221_Lewis_EncryptionsEffect_Web.pdf?HQT76OwM4itFrLEIok6kZajkd5a.r.rE

[9] New America, Coalition Raises Serious Concerns About Australian Draft Bill and Encryption Backdoors, Press Release, 9 September 2018; Mosey, M. and Henschke, A. Defining thresholds in law – sophisticated decryption and law enforcement Policy Options Paper No 8, National Security College, Australian National University, April 2018

[10] nl-cabinet-encryption-position, provided to the President of the House of Representatives of the States General, 4 January 2016.

[11] https://www.ohchr.org/en/NewsEvents/Pages/DisplayNews.aspx?NewsID=23042&LangID=E

[12] See Footnote 5.

[13] See Footnote 5.

[14] https://www.ohchr.org/EN/Issues/Privacy/SR/Pages/SRPrivacyIndex.aspx.

[15] OpenGov Newsletter (Bhunia, P.), Taskforce recommends establishment of national mission for coordinating AI-related activities across India, 9 April 2018.

[16] Greenleaf, G., Global Data Privacy Laws 2017: 120 National Data Privacy Laws, including Indonesia and Turkey, Privacy Laws & Business International Report, 10; [2017] UNSWLRS 45.

[17] http://supremecourtofindia.nic.in/pdf/jud/ALL%20WP(C)%20No.494%20of%202012%20Right%20to%20Privacy.pdf.

[18] The Register (Chirgwin, R.), India mulls ban on probes into anonymized data use – with GDPR-style privacy laws, 31 July 2018 https://www.theregister.co.uk/2018/07/31/india_privacy_boffin_ban/.

[19] Supreme Court of India, Civil Original Jurisdiction, Writ Petition (Civil) No. 494 of 2012; http://economictimes.indiatimes.com/articleshow/65961697.cms?utm_source=contentofinterest&utm_medium=text&utm_campaign=cppst.

[20] Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications, repealing Directive 2002/58/EC.

[21] https://edps.europa.eu/data-protection/data-protection/legislation_en as at 4 September 2018.

[22] https://www.onetrust.com/what-is-the-brazil-general-data-protection-law-lgpd/ as at 4 September 2018.

[23] Cannataci, Joseph (ed.), ‘The Individual and Privacy’, Ashgate, 2015, https://www.privacyandpersonality.org/publications/#LvuxXrjgZDTVzYa2.99; public lecture, ANU College of Law, 2016: http://www.anu.edu.au/events/public-lecture-by-un-privacy-rapporteur-joe-cannataci.

[24] www.ohchr.org/Documents/Issues/Privacy/A-72-43103_EN.docx.

[25] Lo, A., The Right to Privacy in the Age of Big Data and Open Data, article for AHRI, UNSW Sydney, August 2018 (law student, University of Hong Kong; intern, Allens Hub for Technology, Law and Innovation).

[26] Submission of Professor M. Paterson, Monash University, August 2018.

[27] See Office of the Victorian Information Commissioner, Protecting unit-record level personal information: the limitations of de-identification and the implications for the Privacy and Data Protection Act 2014, May 2018, <https://www.cpdp.vic.gov.au/images/content/pdf/privacy_papers/20180503-De-identification-report-OVIC-V1.pdf>.

[28] Submission of Dr Ian Opperman, CEO and Chief Data Scientist, Data Analytics Centre, NSW Government, Australia, 31 August 2018.

[29] https://www.acs.org.au/content/dam/acs/acs-publications/ACS_Data-Sharing-Frameworks_FINAL_FA_SINGLE_LR.pdf, cited in the above submission.

[30] Dr Opperman, as above.

[31] Dr Vanessa Teague, University of Melbourne, Australia, UN SRP Consultation on Big Data – Open Data, 26-27 July 2018, Sydney, Australia. See also Office of the Australian Information Commissioner (OAIC), Publication of MBS/PBS data: Commissioner initiated investigation report, 20 March 2018, pp. 7-9, <https://www.oaic.gov.au/resources/privacy-law/commissioner-initiated-investigation-reports/publication-of-mbs-pbs-data.pdf>.

[32] https://pursuit.unimelb.edu.au/articles/the-simple-process-of-re-identifying-patients-in-public-health-records.

[33] Culnane, C., Rubinstein, B. and Teague, V., University of Melbourne, 2017, https://arxiv.org/ftp/arxiv/papers/1712/1712.05627.pdf.

[34] The OAIC found that doctors were identifiable, and that patients, while capable of being identified, were not "reasonably identifiable" in the terms of the Australian Privacy Act. It is understood that affected people have not been notified.

[35] Submissions, e.g. that of Professor M. Paterson, Monash University, August 2018.

[36] These mechanisms are commonly subject to oversight by ethics committees with access restricted to researchers under confidentiality obligations.

[37] For example, based on mechanisms such as the 5 Safes Framework: see ABS, Managing the risk of disclosure: the five safes framework, http://www.abs.gov.au/ausstats/abs@.nsf/Latestproducts/1160.0Main%20Features4Aug%202017?opendocument&tabname=S.

[38] Submission of Associate Professor Theresa Dirndorfer Anderson, Course Director, Master of Data Science & Innovation, University of Technology Sydney.

[39] Office of the Australian Information Commissioner, Community Attitudes to Privacy Survey, 2017, at https://www.oaic.gov.au/engage-with-us/community-attitudes/australian-community-attitudes-to-privacy-survey-2017; Community Attitudes to Privacy Survey, 2013, at https://www.oaic.gov.au/images/documents/privacy/privacy-resources/privacy-reports/2013-community-attitudes-to-privacy-survey-report.pdf.

[40] Submission to the ‘Big Data Open Data’ consultation from Associate Professor Theresa Dirndorfer Anderson, Course Director, Master of Data Science & Innovation, University of Technology Sydney.

[41] Kukutai, T. & Walter, M. (2015) ‘Indigenising Statistics: Meeting in the Recognition Space’, Statistical Journal of the IAOS 31(2), 317-326.

[42] Kukutai, T. & Taylor, J. (2016) ‘Data Sovereignty for Indigenous Peoples: Current practice and future needs’, pp. 1-24 in T. Kukutai and J. Taylor (eds), Indigenous Data Sovereignty: Towards an Agenda, CAEPR Research Monograph 2016/34, ANU Press, Canberra, https://press.anu.edu.au/publications/series/centre-aboriginal-economic-policy-research-caepr/indigenous-data-sovereignty; Snipp, M. (2016) ‘What does data sovereignty imply: what does it look like?’, pp. 39-56 in T. Kukutai and J. Taylor (eds), Indigenous Data Sovereignty: Towards an Agenda, CAEPR Research Monograph 2016/34, ANU Press, Canberra, https://press.anu.edu.au/publications/series/centre-aboriginal-economic-policy-research-caepr/indigenous-data-sovereignty.

[43] Submission of Professor Maggie Walter, Professor of Sociology and Pro Vice-Chancellor (Aboriginal Research and Leadership), University of Tasmania, Australia.

[44] Submission of Associate Professor Dirndorfer Anderson, as above.

[45] A guide to LGBTIQ-inclusive data collection, The Canberra LGBTIQ Community Consortium, November 2017.

[46] A/HRC/38/47.

[47] OHCHR, Born Free and Equal: Sexual Orientation and Gender Identity in International Human Rights Law, 2012, at https://www.ohchr.org/Documents/Publications/BornFreeAndEqualLowRes.pdf.

[48] Helberger, N., Zuiderveen Borgesius, F. and Reyna, A. The Perfect Match? A Closer Look at the Relationship between EU Consumer Law and Data Protection Law, Common Market Law Review, Volume 54 (2017), Issue 5.

[49] Privacy International, Fintech: Privacy and Identity in the New Data-Intensive Financial Sector, 2017; Consumers International, Banking on the future: an exploration of FinTech and the consumer interest, July 2017.

[50] Pew Research Center (Rainie, L. and Duggan, M.), Privacy and Information Sharing, 2016, found that people’s comfort level depends on perceived trustworthiness, what happens after collection, and retention length. Consumer Policy Research Centre (Nguyen, P. and Solomon, L.), Consumer data and the digital economy - Emerging issues in data collection, use and sharing, 2018, found that consumers wanted more options over what data is collected and how it is used, and wanted government to participate in improving consumer control over data and protections from data misuse.

[51] Helberger, N., Zuiderveen Borgesius, F. and Reyna, A. The Perfect Match? A Closer Look at the Relationship between EU Consumer Law and Data Protection Law, Common Market Law Review, Volume 54 (2017), Issue 5.

[52] https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375.

[53] Electronic Frontier Foundation (Schwartz, A., Tien, L. and McSherry, C.), How to Improve the California Consumer Privacy Act of 2018, 8 August 2018.

[54] Ibid.

[55] Ibid.

[56] Ibid.

[57] “The Act defines a ‘business’ as a for-profit legal entity with: (i) annual gross revenue of $25 million; (ii) annual receipt or disclosure of the personal information of 50,000 consumers, households, or devices; or (iii) receipt of 50% or more of its annual revenue from selling personal information (Section 140(c))”: Electronic Frontier Foundation, How to Improve the California Consumer Privacy Act of 2018, 8 August 2018.

[58] Submission of Dr Roland Wen, UNSW Sydney, Australia.

[59] Helberger, N., Zuiderveen Borgesius, F. and Reyna, A. The Perfect Match? A Closer Look at the Relationship between EU Consumer Law and Data Protection Law, Common Market Law Review, Volume 54 (2017), Issue 5.

[60] Australian Competition and Consumer Commission, ‘Consumers' right to their own data is on its way’, Press Release, 16 July 2018. https://www.accc.gov.au/media-release/consumers-right-to-their-own-data-is-on-its-way.

[61] Consumer Policy Research Centre (Nguyen, P. Solomon, L.), Consumer data and the digital economy - Emerging issues in data collection, use and sharing, 2018.

[62] Dr Katherine Kemp, Centre for Law, Markets and Regulation, UNSW Sydney, UN SRP Consultation on Big Data – Open Data, 26-27 July 2018, Sydney, Australia.

[63] United States and EU consumer groups have jointly asked consumer agencies and data protection authorities to examine data protection and consumer law infringements by connected toys. BEUC, “Consumer organisations across the EU take action against flawed internet-connected toys”, 6 Dec. 2016, <www.beuc.eu/publications/consumer-organisations-across-eu-take-action-against-flawed-internet-connected-toys/html>, quoted in Helberger, N., Zuiderveen Borgesius, F. and Reyna, A., above.

[64] Submission of the Allens Hub for Technology, Law and Innovation, 14 August 2018.

[65] See footnote 66.

[66] See Lorrie Faith Cranor, “Necessary but not Sufficient: Standardized Mechanisms for Privacy Notice and Choice”, Journal on Telecommunications and High Technology Law, Vol. 10, Issue 2, 2012, 273-308.

[67] Walsh, T., 2062: The World that AI Made, La Trobe Press, 2018.

[68] Privacy International and ARTICLE 19, Privacy and Freedom of Expression in the Age of Artificial Intelligence, May 2018.

[69] Fundamental Rights Agency, #BigData: Discrimination in Data Supported Decision Making, 2018.

[70] Pew Research Center (2017), Code-Dependent: Pros and Cons of the Algorithm Age, referenced in Fundamental Rights Agency, #BigData: Discrimination in Data Supported Decision Making, 2018.

[71] Walsh, T., 2062: The World that AI Made, La Trobe Press, 2018, p. 171.

[72] Articles 13, 14 and 22 of the GDPR.

[73] The Guardian (Smith, A.), Franken-algorithms: the deadly consequences of unpredictable code, 30 August 2018, https://www.theguardian.com/technology/2018/aug/29/coding-algorithms-frankenalgos-program-danger?utm_source=esp&utm_medium=Email&utm_campaign=Morning+briefing&utm_term=284469&subid=25666105&CMP=ema-2793, quoting Johnson, N.F., Manrique, P., Zheng, M., Cao, Z., Botero, J., Huang, S., Aden, N., Song, C., Leady, J., Velasquez, N. and Restrepo, E.M., Population polarization dynamics and next-generation social media algorithms, https://arxiv.org/pdf/1712.06009.pdf, viewed 30 August 2018.

[74] Fundamental Rights Agency, #BigData: Discrimination in Data Supported Decision Making, 2018.

[75] Walsh, T. Op. cit. p. 146.

[76] A/72/43103.

[77] Submission of Professor Graham Greenleaf, School of Law, UNSW Sydney, August 2018.

[78] Greenleaf, G., Global convergence of data privacy standards and laws, speaking notes for the European Commission events on the launch of the General Data Protection Regulation in Brussels and New Delhi, 25 May 2018.

[79] Samm Sacks, Paul Triolo and Graham Webster, “Beyond the Worst-Case Assumptions on China’s Cybersecurity Law”, New America, October 2017, cited in Submission of the Allens Hub for Technology, Law and Innovation, 14 August 2018.

[80] Xiaoyan Zhang, “Cross-Border Data Transfers: CSL vs. GDPR”, The Recorder, January 2018, cited in Submission of the Allens Hub for Technology, Law and Innovation, 14 August 2018.

[81] A/72/43103, paras. 125-131.

[82] Submission of Professor Graham Greenleaf, School of Law, UNSW Sydney, August 2018.

[83] Since 2011, Convention 108 has added to its 47 European parties Uruguay, Mauritius, Senegal and Tunisia; Morocco, Cape Verde, Argentina, Mexico and Burkina Faso have requested accession. Eleven other countries, or their data protection authorities, are observers on its Consultative Committee.

[84] An example is Australia’s proposals in New Australian Government Data Sharing and Release Legislation: Issues paper for consultation, 4 July 2018, https://www.pmc.gov.au/resource-centre/public-data/issues-paper-data-sharing-release-legislation, cited in Submission of Professor M. Paterson, Monash University.

