Personal Data and Privacy for Motor Vehicle Litigation and Law Firms

Chapter 5A

The pervasive use of data in our digital economy is paving the way for artificial intelligence (AI), machine learning, and the Internet of Things (IoT) to uncover insights and make decisions that can transform organizations and industries at large. Data has become an organization’s most valuable asset. It is the oil of the digital age. This holds true for the insurance industry as well. 

The insurance industry has long been perceived as traditional and archaic in its processes, from onboarding a consumer to claims handling. However, insurance companies have recently begun to recognize the value of data and are adopting emerging technologies. By doing so, insurance companies obtain more data about their consumers, which they can analyze for insights. Those insights are helping to change the way insurers manage risk and transforming the way premiums are set, claims are processed, and litigation is handled. 

Gone are the days when insurers relied on statistical samples of past performance to predict future outcomes. Today, insurers can make predictions in real time, based on real events and large datasets rather than small sample sizes. 

Undoubtedly, the digital transformation in which insurance companies are engaging will affect the way personal injury claims are handled. Claims will be processed faster and more efficiently, which will benefit both the insurer and the claimant. However, insurers will also have an abundance of data about the claimant that they otherwise would not have, which could negatively affect the claimant and put the insurer at risk if mishandled. 

The collection, use and disclosure of personal data may lead to a host of privacy implications that must be explored in order to protect both the injured claimant whose data is being analyzed and the insurer who is processing or defending a claim using the claimant’s personal data. 

5A:10.10 A Primer on AI & Big Data

To benefit from AI and achieve accurate results, a large dataset, which includes personal data, must be processed and analyzed using algorithms. As the system digests more data, the algorithm improves, recognizing more patterns and producing more accurate outputs without human intervention. This is known as machine learning. The neural networks in the system continue to advance, creating complex algorithms and teaching themselves new rules. 
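The idea that an algorithm improves as it digests more data can be illustrated with a toy sketch. The example below is purely illustrative and not any insurer's actual model: it uses a simple perceptron learner to discover a hypothetical risk rule, and its accuracy rises as the training set grows.

```python
import random

random.seed(0)

# Hypothetical "true" rule the learner must discover:
# a driver is high risk when 2*speeding + hard_brakes > 10.
def true_label(x):
    return 1 if 2 * x[0] + x[1] > 10 else 0

def predict(weights, bias, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

def train(n_examples):
    """Perceptron update rule: nudge the weights whenever a prediction is wrong."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(n_examples):
        x = [random.uniform(0, 10), random.uniform(0, 10)]
        error = true_label(x) - predict(weights, bias, x)
        weights = [w + error * xi for w, xi in zip(weights, x)]
        bias += error
    return weights, bias

def accuracy(weights, bias, n_test=1000):
    test = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(n_test)]
    return sum(predict(weights, bias, x) == true_label(x) for x in test) / n_test

small = accuracy(*train(20))    # model trained on very little data
large = accuracy(*train(2000))  # same algorithm, far more data
print(f"accuracy with 20 examples: {small:.2f}; with 2000 examples: {large:.2f}")
```

The learner is never told the rule; it infers it from examples alone, which is why the quality of the output depends so heavily on the quantity and quality of the data fed in.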

Algorithms are considered to live in a ‘‘black box’’ due to the lack of transparency into how they evolved and how they subsequently create outputs. Despite that opacity, humans continue to rely on algorithms to make important decisions that affect lives on a daily basis. 

5A:10.20 Building a Digital Profile: Getting to Know You 

Through the use of phones, watches, home devices, computers and, in the near future, vehicles, consumers disclose personal data on an almost constant basis. 

Personal data is any information which can directly or indirectly identify an individual. Personal data is more than just an individual’s name, address, and date of birth. It also includes an individual’s geolocation, health information, financial information, political opinions, religious beliefs, ethnicity, and online and offline behavior. Dependence on digital devices helps to capture personal data from an array of sources. 

Organizations, including insurers, collect the data crumbs that individuals spread in the digital world. This data may be combined with other datasets and reassembled through the use of AI and machine learning to create a precise digital profile of an individual. The more data that is collected and the larger the dataset, the more accurate the analysis becomes. The data collected about an individual, such as attendance at a gym, type of food eaten, number of steps taken, heartbeat, music listened to, and expressions captured through facial recognition, can be used to predict an individual’s past and future physical and psychological state. This information is useful for insurers that underwrite and defend personal injury claims. Today, discovery evidence and clinical notes and records are relied on to create a picture of an injured plaintiff. However, personal data that has been collected and analyzed can create a more accurate profile of a plaintiff, so long as the data is accurate and the algorithms are transparent enough to ensure the analytics lead to accurate, unbiased results. 

5A:10.30 Auto Insurance: How Personal Data is Collected 

Insurers have broadened the way they collect data about their consumers. Below are examples of ways personal data is collected by insurers: 

  • Chatbots: Insurers use chatbots to interact with potential consumers during the quotation or claims process. Chatbots answer or filter inquiries before forwarding the call to a customer service agent. The interaction with consumers allows chatbots to collect data that a human agent might not be able to (or not as efficiently). The data collected is added to a data pool and analyzed to identify suggested products, policies, or answers to assist in the interaction with the consumer. That data is stored to create or add to the digital profile of a consumer. Chatbots are also used to identify fraud risks.

  • Insurance Health Apps: Insurance companies provide consumers with health apps that track their health. The more consumers interact with the app, the greater the discount they receive on their premiums, creating an incentive for consumers to use the app and for insurers to collect data. The app, which can be installed on a consumer’s mobile or wearable device, captures data such as the number of steps taken, meals, fluid intake, blood pressure, heart rate, fitness level, and location. This data is added to the consumer’s digital profile, from which insights and predictions are generated about the consumer’s physical and psychological health.

  • Auto Insurance Apps: Auto insurers provide consumers with mobile apps or devices for their vehicles that collect the driver’s geolocation and data generated by the accelerometer and gyroscope. Through the use of telematics, the app is able to collect data about an individual’s driving habits, such as speed, acceleration, hard braking, hard cornering, distance travelled, and time of day travelled. That data is analyzed to generate a score, which may lead to a reduction in auto insurance premiums.
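How telematics data might be turned into a driving score can be sketched as follows. The weights, thresholds, and discount tiers below are entirely hypothetical; real scoring models are proprietary and far more sophisticated, but the shape of the calculation (deduct points for risky behavior, normalized per distance driven) is the same.

```python
from dataclasses import dataclass

@dataclass
class Trip:
    km: float            # distance travelled
    hard_brakes: int     # hard-braking events
    hard_corners: int    # hard-cornering events
    night_km: float      # km driven late at night
    speeding_km: float   # km driven above the posted limit

def driving_score(trips):
    """Hypothetical score: start at 100 and deduct points for risky
    behavior, normalized per 100 km so long trips aren't penalized."""
    total_km = sum(t.km for t in trips)
    if total_km == 0:
        return 100.0
    events = sum(t.hard_brakes + t.hard_corners for t in trips)
    night = sum(t.night_km for t in trips)
    speeding = sum(t.speeding_km for t in trips)
    penalty = (
        4.0 * events / (total_km / 100)   # event rate per 100 km
        + 20.0 * night / total_km         # share of night driving
        + 30.0 * speeding / total_km      # share of speeding
    )
    return max(0.0, round(100.0 - penalty, 1))

def premium_discount(score):
    """Hypothetical tiering: better scores earn larger discounts."""
    if score >= 90:
        return 0.15
    if score >= 75:
        return 0.05
    return 0.0
```

A driver with no risky events would score 100 and earn the top discount tier under this sketch, while frequent hard braking, night driving, and speeding would drive the score, and the discount, down.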

  • Autonomous Vehicles: With the use of sensor technology, autonomous vehicles capture data about their surroundings as well as their passengers’ daily routines. The vehicle not only interacts with the road and traffic signals, but also with other vehicles. Vehicle-to-vehicle communication is necessary to avoid both traffic and collisions. However, if vehicles need to communicate, they must share data with each other. For example, one vehicle may reveal to another vehicle its destination, revealing where its passenger is being driven. The data collected by a vehicle may be shared with the insurer of that vehicle, along with data from other vehicles with which it has interacted. 

The data collected about a claimant can assist an insurer when handling an accident benefits claim or defending an uninsured/underinsured claim. However, an insurer will not have personal data about a plaintiff if that plaintiff is not insured by the insurance company. Similar to the way in which defense counsel asks for undertakings for clinical notes and records, it is conceivable for counsel to ask for undertakings for personal data collected in an app, whether that app was provided by an insurer or not. The relevancy of that data will be discussed later in this paper. 

5A:10.40 The Benefits of Big Data in Auto Insurance 

Advancements in technology result in benefits to society. Through the use of AI, cancer can be diagnosed early, banks can spot fraudulent activity, and daily activities can be made more efficient. 

Insurance companies, along with consumers, benefit from the use of AI and the collection of data. Advancements in auto insurance include the following: 

  • Quality of Life: Through the use of predictive analysis, insurers can warn consumers when their health is declining and make recommendations on how to mitigate against illness. Consumers receive reminders about maintaining a well-balanced lifestyle and improving their health, while insurers benefit by processing fewer claims.

  • Profitability: The use of AI provides insurers with the ability to identify fraud quickly, not only benefiting insurers but also leading to a reduction in premiums for consumers. Through the use of analytics, insurers set premiums more accurately.

  • Safety: The data collected about a consumer’s driving habits can generate recommendations for driving improvements. An insurer’s driving app provides incentives for consumers who drive safely, resulting in safer roads for drivers, passengers, and pedestrians.

  • Customer Service: Insurers can process a claim expeditiously through the use of automated decision-making, while providing consumers with a convenient and customized experience. For example, some apps have an automated first notice of loss service that allows claimants with property damage to take a picture of the vehicle and submit it through the app, advising their insurer of a collision and eliminating the need to call and report it. 

5A:10.50 The Risks of Big Data in MVA Insurance 

While the use of AI has great potential for the auto insurance industry, there are risks that must be considered, especially in the context of personal injury litigation. Litigators must become familiar with serious legal implications associated with the use of AI in personal injury claims and the risks of privacy breaches. There needs to be clarity about the boundaries in which parties can use personal data in personal injury claims. 

Privacy is not about keeping a secret, but rather about having control over with whom personal information is shared and for what purpose. In Canada, the federal Personal Information Protection and Electronic Documents Act (PIPEDA) governs how private organizations, including insurance companies, collect, use and disclose personal information. As data stewards, insurance companies must comply with PIPEDA and its embedded privacy principles, which include but are not limited to the following: 

  1. Choice and Consent – PIPEDA is consent-based legislation. Insurers must seek meaningful consent from consumers before collecting, using, and disclosing personal data. To obtain ‘meaningful’ consent, insurers must provide a clear explanation of what they intend to do with that information. In the recently published Guidelines for Obtaining Meaningful Consent, the Office of the Privacy Commissioner of Canada (OPC) emphasized key elements in obtaining meaningful consent. Those include: 

  • Privacy policies written in user-friendly, easy to understand, and concise language so consumers can understand how their personal data will be used;

  • Transparency into the organization’s privacy management practices and accountability;

  • Information about what personal information is being collected, for what purpose, and with whom it may be shared; and

  • The potential consequences of providing consent to collect, use and disclose personal data. 

Consumers should also be given a choice about what they share. Just as consumers can choose to provide consent to share their personal data, they can also choose to withdraw consent at any time. Similarly, consumers should have the choice to provide some information but not all. For example, if an app seeks consent to collect personal data, such as a heartbeat, that is not integral to the service of the app, then the app must give the consumer the choice of whether to share that data. That consent cannot be bundled with the consent to collect geolocation, which may be integral to the app.

  2. Identifying Purpose/Notice – Insurers must outline the purpose for which they are collecting data in sufficient detail to ensure consumers understand why such data is collected. For example, if the insurer intends to use the data to value claims or to rely on it in a personal injury claim, it must disclose that purpose in its Privacy Policy or Terms and Conditions. Using personal data for a purpose other than the one disclosed to the consumer will likely constitute a privacy breach.

  3. Limiting Use, Disclosure and Retention – Once an insurer has used the data for a particular purpose and no longer requires it for the identified purpose, the insurer should dispose of that data unless required by law to retain it. Insurers should not retain data longer than necessary; doing so exposes the insurer to a greater risk of a breach.

  4. Limiting Collection/Data Minimization – Insurers should collect only the personal information needed to fulfill the identified purpose. They should not collect excess information unless they obtain consent from the individual to do so.

  5. Accuracy/Quality – Insurers must ensure that any personal data collected is accurate, complete and relevant for the purposes identified in the notice. Lack of accuracy may generate inaccurate and/or biased AI outcomes, which may significantly affect the consumer.

  6. Individual Access – Consumers who release personal information to insurers have the right to access and review their personal data. If an insurer cannot fulfill this request, it must provide the consumer with an explanation and inform the consumer of the right to make a complaint to the OPC. 

Other risks in using personal data in personal injury claims include: 

  • Profiling: Through the use of AI, consumers can be automatically segmented into categories based on their attributes. Humans, who are inherently biased, develop the algorithms that segment consumers, and that biased lens can lead to profiling. The OPC recently prepared Guidance on Inappropriate Data Practices, emphasizing that profiling or categorization that leads to discriminatory treatment contrary to human rights law is prohibited.

  • Costs to the Consumer: With the use of intimate personal data and the ability to create a profile using predictive analytics, consumers may be segmented into categories that attract higher premiums. Similarly, a personal injury plaintiff whose personal data is analyzed may be profiled based on demographic data, past medical history, and past activities. This analysis may affect the damages recovered in a lawsuit. However, if the personal data about that plaintiff was inaccurate or the algorithm was biased, the plaintiff would be at a disadvantage and would have little ability to establish as much, unless the plaintiff was able to prove the algorithm was flawed.

  • Lack of Transparency: Algorithms live in a ‘‘black box,’’ making it impossible to know what data points were used or how a decision or outcome was reached. An insurer will have difficulty explaining how an AI system generated a decision if that decision is challenged by a consumer. Amendments to PIPEDA are expected in the next couple of years, and algorithmic transparency is expected to be a requirement. The right to opt out of automated decision-making may also be incorporated into PIPEDA, as may the need to ensure there is a ‘‘human in the loop’’ when AI makes decisions that affect a consumer’s livelihood or may result in discrimination. 

As more data is collected by insurance companies, defence counsel in an accident benefits claim may be able to obtain a digital profile of a claimant directly from the insurer (depending on the terms and conditions). However, in tort claims, it is conceivable for defence counsel to seek that information in the form of an undertaking. Counsel may request access to that information either from the plaintiff’s insurer or from any other entity collecting personal data about the plaintiff (e.g. wearable devices). 

Before collecting personal data from a third party, both plaintiff and defence counsel should review the Privacy Policy and Terms and Conditions of the third party that holds the personal data to ensure the data was collected with informed consent in the first place. If the data was not collected in compliance with privacy legislation and regulatory guidance, it may not be usable in litigation, as the validity of the collection would be called into question. 

Plaintiff’s counsel who is asked to undertake to produce personal data collected by an insurer or another third party should take that request under advisement. Plaintiff’s counsel should first engage in due diligence to ensure the data was collected, used, and disclosed appropriately. For example, if the third party did not obtain meaningful consent, it may be in breach of privacy legislation and the data may not be usable. Plaintiff’s counsel should also explore whether the data collected exceeded the identified purpose and therefore should not have been collected in the first place. Plaintiff’s counsel should also review the data, just like a review of clinical notes and records, to ensure its accuracy. Lastly, it may be necessary for plaintiff’s counsel to seek transparency into the algorithm that analyzed the plaintiff’s data if defence counsel is relying on decisions made by that algorithm. 

5A:20 PRIVACY IN YOUR LAW FIRM 

PIPEDA applies to all private organizations, including law firms. As custodians of personal information, law firms should have a robust Privacy Management Program (PMP). Law firms need to be aware of their legal obligations in handling personal information. They should evaluate their current-state privacy maturity as well as their risk appetite for potential breaches, complaints, investigations, and/or lawsuits. As part of the PMP, law firms should account for the following: 

  • An easy to read, user-friendly and concise Privacy Policy that is accessible to clients. Amongst other things, the Privacy Policy should address the type of personal information collected, the purpose for collecting the information, who it will be disclosed to and for what purpose, how long the information will be retained, and how the client may access the information.

  • Information Management Policy to provide staff with direction and guidance for creating and managing information. A Records Retention Schedule should be included outlining minimum and maximum retention periods for various types of personal information. Instructions on how to securely dispose of personal information, including backups, should also be outlined and certificates of destruction should be obtained when necessary.

  • Confidentiality Agreements with staff and third parties who have access to personal information.

  • Accountability framework outlining the roles and responsibilities of staff and who is responsible for enforcing/monitoring the PMP.

  • Vendor due diligence ensuring third party vendors have similar or more mature privacy and security programs.

  • Information Security Policy with security controls (e.g. encryption, multi-factor authentication) in place to prevent security vulnerabilities.

  • Access restrictions limiting personal data to only those staff at the firm who need it.

  • Compliance with Canada’s Anti-Spam Legislation (CASL) if commercial electronic messages are sent by the law firm. If the firm uses cookies on its website, ensure proper consent is obtained.

  • Mandatory annual privacy and security training for all staff to ensure they understand their roles and responsibilities in handling personal information and what to do in the event of a breach or potential breach.

  • Breach Response Plan outlining a cyber playbook in the event of a breach or suspected breach. The Breach Response Plan should address a detailed escalation process including the breach response team, and how to report the breach to the OPC, notify affected individuals, contain the breach and remediate vulnerabilities. Being prepared for a breach and having a detailed plan can prevent privacy regulatory orders, significant fines, lawsuits, and irreparable reputational harm. 

As law firms become more technologically advanced, through the use of IoT (e.g. video conferencing systems, smart TVs, security cameras), or outsource services to third party vendors, they should become aware of their vulnerability to data breaches, whether in the form of a cyberattack or employee negligence. Breaches can occur through phishing attacks by bad actors, who can retrieve data from a system or freeze the digital environment in exchange for a ransom. Breaches can also occur when an employee loses a laptop, sends personal information to the wrong recipient, or leaves a notebook containing personal information at a coffee shop, to name just a few examples. 

When a breach occurs, a law firm must consider whether the breach results in a real risk of significant harm (‘‘RROSH’’) to an individual. Significant harm is defined broadly to include bodily harm, humiliation, damage to reputation or relationships, loss of employment, financial loss, and identity theft. If the breach meets the RROSH threshold, the firm must report the breach to the OPC as soon as feasible and notify the affected individual(s), or expose itself to a fine of up to $100,000. Law firms also have an obligation to keep a record of all breaches involving personal information, irrespective of whether the breach resulted in a RROSH. 
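The two-track obligation described above (record every breach; report and notify only when the RROSH threshold is met) can be sketched as a simple triage routine. This is an illustrative simplification, not legal advice: the harm categories below come from the text, but the real RROSH assessment also weighs the sensitivity of the information and the probability of misuse.

```python
# Harm categories drawn from the broad definition of significant harm.
SIGNIFICANT_HARMS = {
    "bodily harm", "humiliation", "damage to reputation",
    "damage to relationships", "loss of employment",
    "financial loss", "identity theft",
}

breach_register = []  # firms must record ALL breaches, RROSH or not


def triage_breach(description, potential_harms):
    """Return the follow-up steps for a breach (simplified sketch).

    Every breach is recorded; only breaches posing a real risk of
    significant harm trigger OPC reporting and individual notification.
    """
    breach_register.append(description)           # always recorded
    if SIGNIFICANT_HARMS & set(potential_harms):  # RROSH threshold met
        return ["report to OPC as soon as feasible",
                "notify affected individuals"]
    return []                                     # record-only breach
```

For example, a lost unencrypted laptop raising a risk of identity theft would trigger both reporting and notification, while a misdirected internal memo posing no significant harm would only be entered in the register.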

If a law firm shares its personal information with a third party vendor who is then breached, as the controller of that data, the law firm is responsible for reporting the breach to the OPC and notifying its clients. For that reason, law firms need to be vigilant in conducting vendor due diligence, ensuring that vendors have an adequate privacy and security hygiene. 

Law firms that mitigate privacy and security risk by implementing a PMP, training their staff on privacy and security, and instilling a privacy culture put themselves in a defensible position in the event of a breach, complaint, investigation, or lawsuit. 

Unfortunately, breaches are inevitable and no law firm, regardless of its privacy and security maturity, is completely safe from one. For that reason, law firms should consider obtaining cyber insurance to cover the costs associated with a breach (e.g. fines, damages arising from lawsuits, PR costs to repair reputational harm, and the cost of a forensic team to investigate a cyberattack and patch vulnerabilities). 


* This paper was included in materials for The Law Society of Ontario’s program: 2nd Motor Vehicle Litigation Summit, co-chairs Susan Gunter and Adam Wagman C.S., held in Toronto on March 28, 2019 and March 29, 2019 and subsequently published in the 2020 Oatley-McLeish Guide to Personal Injury.
