The ABCs of Bill 194: Pt. 3 Cybersecurity & AI

In the rapidly evolving environment of artificial intelligence (AI), staying ahead in the global AI race is crucial for maintaining technological and economic leadership. Ontario's newly enacted (though not yet in force) Bill 194 marks a significant shift in the province's cybersecurity and AI landscape. Schedule 1 of Bill 194, the Enhancing Digital Security and Trust Act (EDSTA), enhances cybersecurity standards and regulates AI systems across various institutions, including those under FIPPA (e.g., ministries, agencies, universities, and hospitals) and MFIPPA (e.g., municipalities, law enforcement, hydro companies, and school boards), as well as children’s aid societies (CAS).

This article is Part 3 of the series, The ABCs of Bill 194, and outlines what organizations from both the public and private sectors need to know to comply with these cybersecurity and AI changes. Read Parts 1 and 2 of this series to learn about Amendments to FIPPA and Balancing Children’s Privacy.

PART 3: CYBERSECURITY AND AI

Governments worldwide are racing to set cybersecurity standards and regulate AI to ensure public safety, boost economies, foster innovation, and protect personal information. Tech innovation is a top priority for many, and the AI race is heating up, with major players like the United States, China, and Europe vying for dominance.

The synergy between AI and cybersecurity is crucial for protecting digital infrastructure, but it requires careful ethical and regulatory consideration. Amid this international AI race, Bill 194 addresses the pressing need for robust cybersecurity measures and AI regulation within Ontario's public sector.

Cybersecurity

Under Bill 194, cybersecurity is understood as the comprehensive measures and technologies designed to protect digital information and its infrastructure. For added context, any collection, use, retention, or disclosure of digital information by a public sector entity also encompasses these actions when performed by third parties on the entity's behalf. This is why all organizations should take note of this bill: if your private sector entity enters into an agreement with a FIPPA/MFIPPA institution (or CAS), your contract and cybersecurity standards must comply with the mandates of Bill 194’s EDSTA.

In terms of what to expect moving forward, Bill 194 authorizes the government to develop cybersecurity regulations across several areas, such as:

a) Requiring public sector entities to create and implement cybersecurity programs.

b) Defining the components that must be included in these programs.

c) Mandating that public sector entities submit reports on cybersecurity incidents to the Minister (or a specified individual), with varying requirements for different incident types.

d) Specifying the form and frequency of these reports.

These newly regulated cybersecurity programs may be subject to a variety of criteria, such as:

a) Roles and responsibilities of specific individuals within the entity related to cybersecurity.

b) Progress reports on the entity's cybersecurity efforts.

c) Education and awareness initiatives related to cybersecurity.

d) Response and recovery measures for cybersecurity incidents.

e) Oversight measures for the implementation of the program.

The government's new powers will also include the ability to set technical standards that affected public sector entities must follow for cybersecurity compliance.

So, what can you do?

As is the case for many components of Bill 194, institutions must play a “wait and see” game until the impending EDSTA regulations outline more detailed, prescriptive policy requirements.

In the meantime, public and private organizations can evaluate the state of their personal information safeguards, including physical, technical, and administrative measures, to reduce privacy and security risks. While we don't know exactly what kind of cybersecurity programs will be prescribed, we know that industry standards can be relied on to ensure overarching technical compliance. These standards include the use of encryption and access controls as technical measures, and up-to-date incident response policies and plans as administrative safeguards. In addition to these practices, implementing a security framework such as the NIST Cybersecurity Framework or ISO/IEC 27001 is a strong proactive step toward security compliance. Conducting privacy risk assessments also helps evaluate your organization's state of privacy and security, identify potential risks, and prepare for the onset of regulations.
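To make the technical measures above more concrete, here is a minimal, hypothetical sketch of encrypting personal information at rest and gating access by role, written in Python with the widely used cryptography package. The roles and record contents are illustrative assumptions, not requirements drawn from Bill 194.

```python
# Minimal illustrative sketch: encryption at rest plus a simple role-based
# access check. The roles and data below are hypothetical examples.
from cryptography.fernet import Fernet

# In practice, the key would live in a managed key vault, never in source code.
key = Fernet.generate_key()
cipher = Fernet(key)

AUTHORIZED_ROLES = {"privacy_officer", "records_admin"}  # hypothetical roles

def store_record(plaintext: str) -> bytes:
    """Encrypt a record containing personal information before persisting it."""
    return cipher.encrypt(plaintext.encode("utf-8"))

def read_record(token: bytes, user_role: str) -> str:
    """Decrypt a record only for roles authorized to view personal information."""
    if user_role not in AUTHORIZED_ROLES:
        raise PermissionError("Role is not authorized to access this record.")
    return cipher.decrypt(token).decode("utf-8")

encrypted = store_record("Jane Doe, 123 Main St, Toronto")
print(read_record(encrypted, "privacy_officer"))
```

Technical safeguards like these cover only part of the picture; the administrative measures noted above, such as incident response plans and oversight, remain just as important.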

Artificial Intelligence

Under Bill 194, an AI system is a machine-based system that infers from the input it receives to generate outputs (e.g., predictions, content, recommendations, or decisions) that can influence physical or virtual environments. These are systems that are publicly available and developed or produced by a public sector entity, or developed by a third party on an entity's behalf.

In practical terms, if your public sector institution is using an AI system, it will have to abide by these requirements:

a) Publish information about your use of an AI system.

b) Develop and implement an accountability framework.

c) Manage risks associated with your use of an AI system.

More details will follow as regulations are introduced, but overall, the government will have the power to regulate how you use AI and even prohibit its use in certain cases.
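As a thought exercise, the transparency and risk-management duties above could be supported internally by something as simple as an AI system inventory. The Python sketch below is purely hypothetical; its field names are illustrative and are not prescribed by Bill 194 or any regulation.

```python
# Hypothetical sketch of an internal AI system inventory record that could
# support transparency, accountability, and risk-management duties.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AISystemRecord:
    name: str                   # public-facing name of the AI system
    purpose: str                # what the system is used for
    developed_by: str           # internal team or third-party vendor
    publicly_disclosed: bool    # has information about its use been published?
    accountable_owner: str      # role responsible under the accountability framework
    identified_risks: List[str] = field(default_factory=list)
    mitigations: List[str] = field(default_factory=list)

# Illustrative entry for a fictional municipal chatbot.
chatbot = AISystemRecord(
    name="Resident Services Chatbot",
    purpose="Answer routine questions about municipal services",
    developed_by="Third-party vendor, on the municipality's behalf",
    publicly_disclosed=True,
    accountable_owner="Chief Privacy Officer",
    identified_risks=["Inaccurate responses", "Personal information in chat logs"],
    mitigations=["Human review of escalations", "Log retention limits and redaction"],
)
print(chatbot)
```

Keeping such an inventory up to date makes it far easier to publish information about AI use and to demonstrate an accountability framework once the regulations arrive.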

This lack of detail is exactly why the Information and Privacy Commissioner of Ontario was critical of Bill 194, namely for Ontario's missed opportunity to take center stage and set out prescriptive AI requirements from the start rather than delegating responsibility to future regulations. The vagueness and risk-based approach to AI are mirrored in both Bill 194 and Canada's failed Artificial Intelligence and Data Act (AIDA), leaving Canadians hungry for clear AI direction. However, while this lack of certainty can leave many uneasy, there are still ways to prepare for AI compliance.

So, what can you do?

In the case of AI developments, knowledge and preparation are key. If relevant AI policies are not already in place, the department responsible for AI must prioritize their design and implementation. An AI Acceptable Use Policy (AUP) serves as a guide and privacy-by-design rulebook that governs the responsible use of artificial intelligence within an institution. The primary goals of an AI AUP are to ensure the ethical, fair, and secure use of AI technologies while protecting data privacy and mitigating risks. With an AI AUP in place, an institution can then evaluate and cross-reference all of its privacy and security policies to ensure AI coherence and a consistent commitment to privacy and data security.

AI literacy is another important consideration when it comes to staying ahead of regulatory compliance. AI literacy means understanding how AI works, its potential uses, and the ethical considerations involved. It empowers people to use AI responsibly, make informed decisions, and navigate the opportunities and challenges AI presents. Without a proper understanding of AI, an institution will be held back in developing and sustaining a strong, unifying AI strategy across all departments. Some helpful tools include:

a) Training Curriculum: Creating a training curriculum that covers basic AI concepts, terminology, and technologies. This curriculum can be deployed alongside the AI AUP to inform staff on how their institution is using AI and what they need to know about it.

b) AI Knowledge Hub: Institutions can also create digital AI Knowledge Hubs that compile approved AI resources for staff reference, as well as recorded videos on how to use AI safely.

With these foundational pieces in place, an institution must also designate a process and trained individual(s) to govern AI activities. With Bill 194 pushing for an accountability framework for AI systems, institutions must be able to assess what constitutes a risk in their use of AI systems and determine how to mitigate it. The AI-informed team should be made up of privacy, IT, and data science professionals to ensure a well-rounded approach to AI.

Moving Forward

In the face of an intensifying AI race and an increasingly vulnerable global digital ecosystem, Ontario's Bill 194 represents a proactive approach to cybersecurity and AI regulation. By implementing robust measures and staying ahead of the curve, Ontario empowers its public sector institutions to take their cybersecurity and AI responsibilities seriously. The future of AI is uncertain, but with strong regulation and a commitment to innovation, Ontario can navigate the challenges and opportunities that lie ahead.

Our team at Bamboo Data Consulting has robust experience helping our public sector clients reach privacy compliance through well-designed privacy programs and employee training initiatives. Don’t hesitate to reach out if you have any questions about the requirements of Bill 194 or your organization's privacy compliance posture.
