Privacy impact assessments, Audits and privacy health checks, Privacy by design and Algorithmic impact assessments
Much of the data in organisations is about people, their lives, what they do, where they go, what they buy, what they like, what they say, what they look for, what they do for entertainment and so on – it is personal information and, in many instances, is thus subject to privacy and data protection requirements. Increasingly, decisions impacting people are made by algorithms through automated decision-making. Privcore can adapt privacy impact assessment frameworks and methodologies to assess the risk of automated decision-making through a process known as an algorithmic impact assessment (AIA). Our experienced staff have led, quality assured, supervised and conducted hundreds of client deliverables, including PIAs, audits and privacy health checks.
Privacy impact assessments
Privacy impact assessments (PIAs) help assess and minimise privacy risk. They are generally required for projects, systems, technologies or processes likely to have a high privacy impact, due to, for example, the effect on the individuals whose data it is, the effect on other stakeholders such as the media or community, or the amount, sensitivity or complexity of the data. They help your products or services become more robust, futureproof and successful.
Privcore can undertake PIAs to help you maintain the trust and confidence of your customers. A PIA can include a report, covering the privacy benefits and risks, that you can choose to publish if you wish. Recommendations will be included to help you manage and minimise your privacy risks. Privcore has partnered with TrustArc, which provides a leading technology platform to help manage the PIA process on client request.
Audits and privacy health checks
Reviewing your organisational practices from time to time to identify where your high-risk privacy issues lie should be a routine part of your internal or external audit practices. These reviews are sometimes also referred to as ‘privacy health checks’. Check that privacy is included in your risk management frameworks and assess accordingly, taking into account, for example, the quantity of data that your organisation holds or processes.
Privcore can undertake audits and privacy health checks of your operations, talking to staff and reviewing systems to assess where your high privacy risks lie, so that you can prioritise your resources accordingly.
Privacy by design
If you create products that are Internet-enabled, such as Internet of Things (IoT) devices, then you should be thinking about privacy (which includes security) by design. Privcore works with a global certification company to help manufacturers build privacy by design into their products and services.
Algorithmic impact assessments
An AIA can help identify issues such as data quality, transparency, bias, harms and failure management, and help create greater accountability for automated decision-making. This often involves internal and external stakeholder consultation processes. Conducting an AIA early (like privacy by design) can increase social licence, confidence and trust in automated decision-making processes, determine where risks lie, and establish whether those risks are proportionate to the objectives sought. Privcore is available both to help you create AIA frameworks and to apply that methodology to AI projects.
Data breach prevention and recovery
Each quarter, the Office of the Australian Information Commissioner (OAIC) reports on the causes of data breaches by regulated entities. A notifiable data breach is a subset of a privacy breach and refers to the unauthorised disclosure of, access to or loss of personal information that is likely to cause serious harm to the person to whom the information relates. The majority of data breaches are due to human factors, such as falling for phishing attacks, which can be thwarted if secure forms of two-factor or multi-factor authentication are in place. Indeed, half of the cyber incidents the OAIC reported in the July–September 2018 quarter were due to phishing attacks.
Data breaches rapidly become an ecosystem problem. One entity’s poor security practices can massively impact the security footprint of other entities. For example, the proliferation of data breaches exposing attributes of individuals means that data points such as date of birth, address and mother’s maiden name are no longer secure methods of identifying individuals. Trust becomes a two-way problem: individuals don’t trust entities and entities don’t trust individuals.
Our experienced staff help clients respond to data breaches in ways that restore trust with customers. Privcore can assist with responding to data breaches, including regulator and customer engagement strategies, and with developing data breach response plans and breach simulations (preferably before a breach occurs). Privcore has partnered with Iron Bastion, a cybersecurity company, to provide the technical expertise to make your systems less prone to data breaches caused, for example, by phishing.
Research and training
Our staff have extensive experience conducting privacy and cybersecurity training, including data breach simulations to train executives. We also have extensive research experience in areas such as automated decision making, biases in artificial intelligence, metadata retention, the right to be forgotten, data localisation, governance of data, mandatory data breach reporting, cross-border data transfer, two-factor and multi-factor authentication, cloud and privacy by design.
Privcore undertakes research and training to help make this complex area simple for you and help you apply insights within your organisation.
Thought leadership and advisory
Privcore provides thought leadership and strategic privacy advice to clients, including on the governance of data, the APEC Cross Border Privacy Rules System and the privacy aspects of artificial intelligence. Privcore has partnered with O'Connor Kingsford, a change, innovation and strategy firm, to develop privacy management frameworks and privacy maturity models that help clients understand how to govern and implement appropriate privacy practices within their organisations.
Governance of data
We are at a pivotal point in history – technology is changing our lives rapidly, for good and for ill, and needs to be governed. We need to build core human values and ethics into our products and services. We must keep individuals at the centre and build technology that respects human values, including privacy and security.
Download our paper on "Data is your organisation's core business: Are you prepared to govern it?"
APEC Cross Border Privacy Rules
Today, we trade in services more than goods, and data will soon be the most-traded commodity. Half of the world’s trade occurs in APEC economies. Yet APEC economies have disparate data protection frameworks, or none at all. The APEC Cross Border Privacy Rules (CBPR) System helps businesses and supply chains do business with one another using a common data protection framework across APEC. This facilitates trade whilst protecting personal data as it flows across APEC borders.
APEC approved Australia and Chinese Taipei to join the APEC CBPR System in November 2018. The USA, Mexico, Japan, Canada, Singapore and the Republic of Korea are also approved economies. When Australian businesses engage suppliers that are certified under the CBPR System, it helps them manage their supplier risk, as they are no longer accountable under the Australian Privacy Act for data breaches or other privacy breaches by their suppliers.
In a report published by APEC, lead author Annelies Moens looks at the benefits of the system from a multi-stakeholder perspective.
Artificial intelligence
Machine decision-making is increasingly replacing, and serving as a tool to assist, human decision-making. Each is fallible on its own, but together they can optimise decision-making, so long as biases and other flaws are recognised.
Privcore staff have looked at the trends impacting levels of trust in organisations and explained different types of artificial intelligence (AI). There are numerous pitfalls and biases in both human and machine decision making. Privacy and algorithmic impact assessments can improve AI so that it is more trusted and accountable to society.
Dr John Selby presented to the NSW Society for Computers and the Law on Algorithmic Bias on 12 September 2018.
Annelies Moens presented to Telstra Enterprise's Brilliant Connected Women Network on AI and accountability on 28 May 2019 and to the iappANZ Summit on AI - What's the Drama? on 30 October 2019 along with Australia's Human Rights Commissioner, Ed Santow, Microsoft's Sam Brown and Dr John Selby.