Regulating AI: What marketers need to know

Sep 02, 2022
Compliance

In June, the Canadian government proposed new legislation to regulate artificial intelligence (AI). The proposed Artificial Intelligence and Data Act (AIDA) is part of Bill C-27, which also proposes a new privacy framework, the Consumer Privacy Protection Act (for more on federal privacy reform, see our recent blog post). If passed, AIDA would be the first comprehensive law in Canada regulating AI.

What is AIDA trying to achieve?

AIDA intends to promote the responsible use of AI. It aims to ensure that high-impact AI systems are developed in a way that mitigates the risk of harm and bias. It would prohibit conduct that could result in serious harm to individuals or their interests, including when AI systems unlawfully obtain data or are used recklessly.

Who would AIDA apply to?

AIDA would apply to any private sector organization that designs, develops or “makes available for use” an AI system in the course of business, domestically or internationally. The proposed law focuses on “high-impact” AI systems.

AIDA proposes specific responsibilities for organizations, including to:

  • Assess and document whether a system is considered “high-impact”.
  • Establish measures to identify and mitigate any potential risk of bias or harm resulting from a system.
  • Publish information (e.g., on a website) about the system the organization is using and the risk mitigation measures it is implementing.
  • If using anonymized data, abide by new requirements for how data is to be anonymized and used.
  • Abide by new recordkeeping and monitoring requirements.

If the use of a high-impact system results (or is likely to result) in material harm, organizations would be required to notify the regulator.

Important terms in AIDA still need to be defined, including what constitutes a “high-impact” system and “material harm”.

AIDA is intentionally a principles-based framework and does not prescribe the specific measures that organizations would be required to adopt to meet its requirements. This would enable the law, once passed, to stand the test of time without requiring constant updates as technology advances.

What are the consequences for non-compliance?

AIDA would create an AI and Data Commissioner to monitor compliance and conduct audits. Proposed penalties would be significant: up to 5% of global revenue or $25 million, whichever is greater. An organization would not be found guilty of an offence if it could establish that it exercised due diligence to prevent the outcome. Individuals could also face penalties: those found guilty of an indictable offence under AIDA would be liable for a fine of up to $100,000, or even jail time.

What’s next?

Over the coming months, the bill will be debated in the House of Commons and the Senate, and studied by parliamentary committees. As it proceeds through this process, amendments will be proposed. If the bill is adopted, many aspects of the rules will be left to upcoming consultation, regulations and government guidance.

The CMA will be actively involved throughout this process, recognizing that AI is an increasingly important tool for marketers: it fuels data analytics and personalization, and helps companies better understand their customers and improve the customer experience. We have set up a working group to shape the input that we will provide to government on behalf of the marketing profession.

If you are a CMA member in the AI space and interested in shaping our response to the bill and subsequent regulations, please drop us a line.

In the meantime, organizations can begin preparing for the anticipated rules by voluntarily adopting risk- and harm-mitigation strategies for their AI systems.


Authors:
Fiona Wilson, Director, Public Policy and Chief Privacy Officer, CMA
Gurwinder Sidhu, Public Policy Intern, CMA




