Privacy is an evolving area of law as regulators try to keep up with fast-developing technologies, the rapid accumulation of private data and increasingly sophisticated cyber-criminals.
It is important to stay on top of these developments. The risks for organisations of getting it wrong can be very high – both when the organisation is a victim and when it fails to maintain expected standards of confidentiality and data integrity.
Resources to assist Privacy Act 2020 compliance
The Office of the Privacy Commissioner (OPC) has rolled out a range of tools and resources to assist compliance with the Privacy Act 2020.
- NotifyUs is an online tool to help businesses and organisations decide whether a privacy breach meets the “serious harm” threshold requiring that it be notified to the Privacy Commissioner and, if it does, to make that notification.
- The new information privacy principle 12 introduces restrictions on disclosing personal information to foreign persons or entities. The Principle 12 Decision Tree takes users through a series of questions covering all of the possible grounds for disclosure under the principle.
- If you are relying on an agreement to ensure personal information remains protected by adequate safeguards when disclosed to foreign persons or entities, the OPC has released template model clauses and a Model Clause Agreement Builder that can be used as a base for creating such information sharing agreements.
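For illustration only, the shape of the analysis the Decision Tree walks through can be sketched as a check across the possible disclosure grounds. The ground names below are paraphrased and simplified – they are our shorthand, not the statutory wording – and the OPC tool and the Act itself remain the authority:

```python
# Illustrative only: a loose sketch of the kind of logic the OPC's
# Principle 12 Decision Tree walks through. The ground names are
# paraphrased and simplified, not the statutory wording.

def may_disclose_overseas(*, individual_authorised=False,
                          recipient_subject_to_privacy_act=False,
                          comparable_privacy_laws=False,
                          prescribed_binding_scheme=False,
                          agreement_with_safeguards=False):
    """Return True if at least one (paraphrased) disclosure ground applies."""
    return any([
        individual_authorised,             # informed authorisation by the individual
        recipient_subject_to_privacy_act,  # recipient is itself subject to the Act
        comparable_privacy_laws,           # destination law offers comparable safeguards
        prescribed_binding_scheme,         # recipient is within a prescribed binding scheme
        agreement_with_safeguards,         # e.g. an agreement using the OPC model clauses
    ])
```

So, in this sketch, a disclosure backed by an agreement built from the OPC's model clauses clears at least one ground, while a disclosure with no ground at all does not.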
Guidance on responding to requests for CCTV footage
The Privacy Commissioner has also provided advice on how to respond to requests for CCTV footage, which fall under information privacy principle 6: people have the right to access information held about them.
Where the CCTV is in a public place, such as a supermarket, the response should be reasonably straightforward. But where the video is more sensitive (e.g. from a prison, a hospital, a swimming pool or a gym), it may be necessary to protect the identity of third parties, requiring the use of a secure masking technique. The guidance explores the protection options available.
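As a rough illustration of what masking involves technically – this is a hypothetical sketch, not the OPC's recommended technique, and real redaction tools work on video and must be irreversible – pixelating a region of a grayscale frame might look like:

```python
def pixelate_region(frame, top, left, height, width, block=8):
    """Coarsen a rectangular region of a 2D grayscale frame (a list of
    lists of ints) by replacing each block x block tile with its average,
    similar in spirit to masking a third party's face in a CCTV still.
    Returns a new frame; the original is left untouched."""
    out = [row[:] for row in frame]
    for by in range(top, top + height, block):
        for bx in range(left, left + width, block):
            tile = [frame[y][x]
                    for y in range(by, min(by + block, top + height))
                    for x in range(bx, min(bx + block, left + width))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, top + height)):
                for x in range(bx, min(bx + block, left + width)):
                    out[y][x] = avg
    return out
```

Averaging each tile discards the fine detail needed to identify a face; what makes masking "secure" in practice is releasing only the processed copy, never the original.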
For more information, read the guidance here.
Rental sector probe – Privacy Commissioner flexes new muscles
The OPC is now in the second stage of a proactive investigation into the collection, retention and disclosure of personal data in the residential rental sector, using its new powers under the Privacy Act.
The intervention follows reports that some landlords and property managers are being inappropriately intrusive in the information they are asking of prospective tenants and are using public forums to compile bad tenant blacklists.
“I am concerned about some of the practices we are seeing, particularly during a time when pressure on tenants is high,” Privacy Commissioner John Edwards said.
Tenants could legitimately be asked to provide proof of identity or whether they have pets, but questions regarding nationality, marital status, gender or detailed banking history were “almost never justified”.
The first phase of the investigation, which began in February, focused on general information gathering. The OPC is now embarking on a round of direct engagements with people in the sector.
Privacy breach forces review of vaccine booking systems
The Ministry of Health launched a review of vaccine booking systems nationwide last month after a privacy breach within the Canterbury District Health Board exposed the personal data of more than 700 people.
The exposed information was sensitive (including name, date of birth, National Health Index number, and where and when individuals were getting vaccinated) – and all of it was readily accessible within the public-facing code of the website.
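The general lesson is that personal data should be filtered server-side, so that nothing beyond what a user needs ever reaches public-facing code. A minimal sketch, with invented field names – nothing here is drawn from the actual booking system:

```python
# Hypothetical sketch of the pattern that avoids this class of breach:
# whitelist the fields a public response may carry, and strip everything
# else before it leaves the server. Field names are illustrative only.

PUBLIC_FIELDS = {"booking_time", "location"}

def public_view(record):
    """Return only the whitelisted fields of a booking record."""
    return {k: v for k, v in record.items() if k in PUBLIC_FIELDS}

booking = {
    "name": "Jane Doe",
    "date_of_birth": "1970-01-01",
    "nhi_number": "ABC1234",
    "booking_time": "2021-08-01T10:00",
    "location": "Christchurch",
}
```

With this pattern, even a user who inspects the page source sees only the booking time and location; the name, date of birth and NHI number never leave the server.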
Read the article here.
New Zealand’s facial recognition regulations to be reviewed
Digital Economy and Communications Minister David Clark is exploring whether New Zealand’s current facial recognition regulations provide adequate safeguards.
Clark told Radio New Zealand he would seek advice on the matter after a Victoria University of Wellington report funded by The Law Foundation found that our existing regime is not up to the task.
Fifteen recommendations are proposed. Among them are:
- that a definition of “biometric information” be inserted into the Privacy Act 2020
- that new statutory mechanisms be created to give people more control over their information including a right to object, a right to erasure and a right to portability, and
- that a Biometrics Commissioner or other oversight mechanism be established.
Inquiry into Police photographing members of the public
The Privacy Commissioner and the Independent Police Complaints Authority have released the Terms of Reference for their joint inquiry into the Police practice of photographing for database purposes members of the public who have not been detained and are not suspected of a crime.
A draft report is due by September this year.
Among the issues to be investigated are what compliance or enforcement actions would be required if the Police were found to have breached the privacy of the individuals concerned.
The investigation follows reports last year of the Police taking photos of young Māori in the Wairarapa.
Read the Terms of Reference here.
EU proposes stronger regulation of digital sector
The EU is planning to introduce a new level of regulation in the digital economy through two separate pieces of legislation:
- a Digital Services Act (DSA), which would create “a safer digital space” in which the fundamental rights of all users are protected, and
- a Digital Markets Act (DMA), which would prevent the established players from using their market dominance to squeeze out competition and would “establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally”.
The European Commission Vice President Margrethe Vestager said:
“We should be able to do our shopping in a safe manner and trust the news we read. Because what is illegal offline is equally illegal online.”
The proposals would:
- create ground rules around how businesses grow
- make major online platforms legally responsible for the content that users post on their sites and require them to be far more active in pursuing abuse and misinformation
- require companies that allow other businesses to sell services through their platform to allow equal access to their rivals, rather than prioritising their own products, and
- make more of their algorithms transparent.
EU proposes rules around AI
The European Commission has issued proposed regulations to harmonise the rules on artificial intelligence (AI). The regime would ban some AI applications and restrict others, but would allow the use of facial recognition technology in the event of a terrorist attack or when searching for criminals or missing children.
To become law, the long-awaited rules will need to be approved by the European Parliament and member states. This cannot be guaranteed, as the European Data Protection Supervisor has condemned facial recognition as a “deep and non-democratic intrusion” into people’s lives which should be banned from Europe.
Read the article here.
UK almost adequate?
The European Data Protection Board is now deciding whether the UK post-Brexit should retain its adequacy status allowing data to continue to flow freely between it and the EU bloc.
The Board released a draft decision on 19 February which found that, while the UK’s data protection legislation was largely identical to the GDPR, there were concerns over the extent of interception powers under the Investigatory Powers Act 2016, the rules about onward transfer of information outside the UK, and the “immigration exception” to the application of certain data protection obligations.
If adequacy is denied, businesses that share information between the EU and the UK will need to put other measures in place in order to continue with those cross-border transfers. As New Zealand’s adequacy status is also under review by the Board, we will be keeping a close eye on the UK outcome.
Read further commentary here.
Apple’s privacy update good for consumers, bad for online advertisers
In today’s online marketplace, it is standard practice for advertisers to collect information about individuals’ online activities in order to target their advertising and marketing messages.
But Apple’s new App Tracking Transparency feature has just made that much harder.
It allows users to opt out of having their online activity tracked, and prompts them to make that decision as they use their device. Initial indications are that take-up will be high.
Flurry Analytics, which has been tracking consumer reaction since 25 April, reports that only 5% of US users, and 13% of users worldwide, have chosen to allow tracking to continue.
The Privacy Commissioner has applauded Apple's move, and said that it shows that Apple believes consumers are influenced by privacy considerations when making their decisions.
To learn more, read the article.
NZX on path back from cyberattacks
The NZX is now in the implementation phase of an action plan agreed with the Financial Markets Authority (FMA) to address the issues exposed by a series of cyberattacks last year which forced NZX to repeatedly stop trading.
The plan is comprehensive, covering NZX’s arrangements for governance oversight, industry engagement, information technology capability, cyber security, human resources, crisis management planning and risk management.
For more information, see the FMA's update.
Large increase in reported cyber security breaches last year
CERT NZ received 7,809 security incident reports last year, up 65% on 2019. The most common complaints related to phishing and credential harvesting (up 76%), followed by scams and fraud (up 11%), and malware (up 2,008%). In 14% of cases, some form of financial loss was sustained, to a total value of $16.9m.
Win for Dotcom against Attorney-General
The Court of Appeal has found that Dotcom’s privacy was interfered with when some 52 public agencies transferred personal information requests received from him in 2015 to the Attorney-General.
Dotcom had asked that the requests be treated as urgent as they pertained to legal action relating to an extradition eligibility hearing due to commence in September that year. However, the Attorney-General declined the requests on the grounds that they were vexatious and included information that was trivial.
The Court of Appeal had been asked to rule on whether a request under the Privacy Act could be transferred to another agency where the requestor sought urgency and the basis for the urgency request was a matter that could only be properly evaluated by that agency (as opposed to the original recipient).
The Court determined that, even in these circumstances, an agency may transfer a request only if the personal information requested is more closely connected with the functions or activities of the transferee agency; it is irrelevant that the nature of the request is more closely connected with that agency.
The Court did, however, find that an application for urgency may be a relevant factor for an agency in determining whether to refuse an information request, “not of itself” but depending on the context – although the threshold for vexatiousness would be very high.
Examples might include a “grossly excessive” number of urgency requests, or reasons given for urgency that were not credible.
The case was decided under the Privacy Act 1993, but the principles in contention have been carried over to the Privacy Act 2020.
Read the decision here.
Invasion of privacy development
We reported in our May 2020 Data Points edition that the High Court judgment in Henderson v Walker expanded the scope of the tort of invasion of privacy by finding that disclosing personal documents without authorisation could meet the threshold for the tort; widespread publicity of those documents or the personal information in them was not required for an action to be available.
In Hyndman v Walker, the Court of Appeal was asked to rule on a similar case – this time determining whether Mr Hyndman’s privacy had been invaded after Mr Walker indirectly disclosed Mr Hyndman’s private communications with Mr Henderson to a Mr Holden, who then used that information against Mr Hyndman.
The Court accepted that Mr Hyndman had a reasonable expectation of privacy in his communications, but found that the disclosure was not “highly offensive”, so, under New Zealand law and unlike in the UK, no liability was established.
The Court did point out that the tort has not been developed much in the 16 years since its inception in New Zealand law and that it “may well benefit from re-examination” including being updated and liberalised to deal better with developments in technology.
To find out more, read the decision.
Win against Google in Australia may lead to similar action here
The recent successful action against Google by the Australian Competition and Consumer Commission (ACCC) may encourage the New Zealand Commerce Commission to take similar action here.
The ACCC successfully argued that Google’s privacy settings were misleading as they did not make it clear that, even if individuals switched off the ‘Location History’ setting on their Android device, the information was still available to Google through the ‘Web & App Activity’ setting.
“This is an important victory for consumers, especially anyone concerned about their privacy online, as the Court’s decision sends a strong message to Google and others that big businesses must not mislead their customers,” ACCC Chair Rod Sims said.
Google is considering appealing the case.
Read the ACCC statement.
H&M fined €35m for violating workers’ privacy
H&M has been fined €35m by the German privacy watchdog for recording extensive personal information from hundreds of employees, including medical symptoms and religious beliefs, and using it to evaluate work performance and to make employment decisions.
A configuration error in October 2019 made the database, which had been compiled from 2014 onwards, accessible company-wide.
Read the article here.
UK Information Commissioner cracks down on unsolicited calls
- CPS Advisory Limited: fined £130,000 for making over 100,000 unauthorised marketing calls in relation to personal pensions, using personal information bought from third party data providers.
- Digital Growth Experts Limited: fined £60,000 for sending over 16,000 “nuisance” marketing text messages without valid consent between 29 February and 30 April 2020. An aggravating factor was that the marketing sought to capitalise on the COVID-19 pandemic by promoting a hand sanitiser as effective against the virus.
- Reliance Advisory Limited: fined £250,000 for making over 15 million unauthorised calls in relation to claims management services between 1 January and 1 June 2019.
- Pownall Marketing Limited: fined £250,000 for making over 365,000 unsolicited marketing calls in relation to claims management services between 1 January and 28 May 2019.
- Rancom Security Limited: fined £110,000 for making over 850,000 direct marketing calls, of which over 560,000 were to numbers specifically protected against direct marketing calls, between 1 June 2017 and 31 May 2018.
- House Guard UK Limited: fined £150,000 for making over 660,000 direct marketing calls, of which over 370,000 were to numbers specifically protected against direct marketing calls, between 8 May 2018 and 21 December 2018.
…and big fines for failing to keep personal data secure:
- British Airways: fined £20m for failing to protect the personal data of over 420,000 individuals between 22 June and 5 September 2018 following a cyber-attack on the company’s IT systems. Names, addresses and credit card details were accessed, as well as the usernames and passwords of British Airways employees and exclusive club members.
- Marriott International Inc: fined £18.4m for failing to keep the personal data of over 330m individuals secure following a cyber-attack on two of its hotel networks during 2014. The privacy breach was not detected until September 2018.
- Ticketmaster UK Limited: fined £1.25m for failing to keep the personal data of potentially 9.4 million individuals secure between February and June 2018 after a chat-bot installed on its online payment page was compromised.
Clearview’s facial recognition deemed illegal in Canada
Canadian privacy authorities have ordered Clearview AI to delete all Canadian faces from its facial recognition database.
The order follows a year-long inquiry which found that the information was collected without consent and was being used for inappropriate purposes, creating a system that “inflicts broad-based harm on all members of society, who find themselves continually in a police line-up”.
Social media companies, including Facebook and LinkedIn, sent cease and desist letters to Clearview last year after a New York Times investigation revealed that Clearview was lifting images illegally from their platforms.
RBNZ Governor leads data breach response
Reserve Bank Governor Adrian Orr led the Bank’s response to the malicious breach of a file sharing application, saying the Bank was giving the incident its full attention and apologising unreservedly to those affected. “Personally, I own this issue and I am disappointed and sorry”, he said.
The Bank immediately commissioned a forensic cyber investigation and has also appointed an independent third party to undertake a comprehensive general review. The Terms of Reference for that review are publicly available.
“Be assured, we are taking action. We are working closely with public authorities and utilising international experts as we respond. We are doing so in a whole of government framework, using the National Security System,” Orr said.
The Bank was not a specific target but was caught up in a global attack on customers of US software company Accellion. Orr said later that the information leak might have been avoided had Accellion notified the Bank of the hack earlier, or told it that a patch was available which would have secured the system.
The Bank is currently looking for a new platform but has reported difficulty in sourcing a reliable alternative.
Spotless victim of serious data breach
Cleaning and catering company Spotless has sustained a serious data breach in both New Zealand and Australia.
Indications are that the company’s HR files may have been hacked, compromising personal information that might be used for identity theft, including the passport and tax numbers of past and present employees.
Spotless has advised the victims and notified the relevant privacy authorities.
Lumino dental firm patient information accessed
A Lumino staff member’s email account was hacked, resulting in personal patient information being accessed, and some patients receiving suspicious emails.
Lumino notified its patients, the Privacy Commissioner, CERT NZ, and its third-party cybersecurity provider – demonstrating how a company should respond to hacks under the Privacy Act 2020.
CERT NZ’s incident response manager said anyone experiencing a phishing scam should report it immediately. Phishing scams that mimic organisations such as Lumino and trick people into sharing their personal and financial data resulted in a cumulative loss of $7m in New Zealand last year.
Read the article.
High commercial value attached to data protection in Australia and New Zealand
A recent survey by OpenText of 1,000 Australian and New Zealand consumers found that 44% would pay more to do business with an organisation that is committed to protecting their data privacy.
This is high by international standards. The equivalent numbers in Germany, Spain and France are 41%, 36% and 17% respectively.
However, public confidence remains low: only 9% of Australian and New Zealand respondents are confident that businesses are meeting their privacy obligations, and only 23% believe there will ever be full compliance across the board.
Read the report.