Submission to the Standing Committee on Industry and Technology on Bill C-27

“Dear Members of the Standing Committee on Industry and Technology,

The Canadian Chamber of Commerce welcomes the opportunity to comment on Bill C-27. The Canadian Chamber of Commerce and its members strongly believe in protecting the personal information of customers, clients, and employees.

Few businesses today take any action without data to support it. Data is essential to driving innovation and business, and this is not mutually exclusive with protecting the privacy of Canadians. To sustain economic growth and Canada’s position as a leader in emerging technology, Parliament must balance data protection with the use of data to meet our goals as a society. Regulations do not build trust automatically, which makes it crucial that the Government of Canada develop regulations with industry that provide certainty for business.

This is why the Canadian Chamber of Commerce supports and welcomes the government’s efforts to strengthen the protection of the data of all Canadians, particularly children.

One of our key concerns remains that a range of amendments is needed to better harmonize the Consumer Privacy Protection Act (CPPA) with comparable provincial, EU, and US private sector privacy laws. A patchwork of inconsistent and potentially conflicting privacy laws that do not follow international best practices and standards increases regulatory costs and raises prices for consumers.

Members of the Canadian Chamber of Commerce strongly support the goals of the CPPA and the Artificial Intelligence and Data Act (AIDA) to improve existing legislation and support the competitiveness of Canadian business, but believe both pieces of legislation would be strengthened if separated:

  • The Canadian Chamber of Commerce recommends that AIDA be separated from the bill so that the need for greater consultations on AIDA will not slow the adoption of Parts 1 and 2 of Bill C-27.
  • Ideally, the provisions addressing automation in the CPPA should be moved to AIDA so that AI/automated decision making is regulated in one place; at a minimum, all definitions of automation should be harmonized between the two pieces of legislation and with the automated decision disclosures in Quebec Law 25 and other comparable provincial laws.

In addressing Part 3 of Bill C-27 on AIDA, members of the Canadian Chamber of Commerce recommend that Parliament consider separating AIDA from the rest of Bill C-27 to allow for more comprehensive consultation between government and stakeholders and to ensure the quick adoption of the urgently needed CPPA.

  • This will ensure that AIDA adequately regulates artificial intelligence (AI) in Canada without hindering innovation and is more harmonious with existing AI and AI-adjacent regulations of domestic and international counterparts.
  • This will also ensure that the urgently needed Parts 1 and 2 of Bill C-27 are not slowed down in Parliament due to the consultations needed to address AIDA.

There is a critical need for the government to conduct a broad consultation exercise with key stakeholders before shaping regulations and criteria for “high-impact” systems under AIDA. To date, consultations on AIDA have been insufficient in addressing many concerns, such as alignment with existing legal frameworks; unclear, vague, or missing definitions; overreliance on regulations to compensate for a lack of clarity in the legislation; and a lack of clarity on responsibilities and authorities.

Members of Parliament and the government must understand that algorithms are not inherently a privacy risk; they are a tremendous tool for improving people’s lives and are not exclusive to AI or emerging technology. AI models hold powerful potential to improve people’s lives, and any risks associated with their use can be managed appropriately.

AIDA must be strengthened with explicit references to, and alignment with, international standards and industry best practices. In addition to providing more clarity around the requirements and what is expected for compliance, alignment with international standards ensures the global interoperability of AI systems. It minimizes regulatory fragmentation and barriers to trade in the highly global AI ecosystem. In the development of regulations, a practical, principles-based and risk-based analysis of AI should be used, on the understanding that AI can be a tool providing significant benefit to Canadians.

As an example of the actions needed to ensure synchronization, the Canadian Chamber of Commerce advises Parliament to address the definition of AI exclusively in AIDA to avoid confusion with the definitions in the CPPA and thereby establish a uniform application of regulations.

A concern for members of the Canadian Chamber of Commerce is that AIDA provides for very strong penalties but leaves a significant amount of detail to the regulatory process, which by its nature receives much less democratic scrutiny and due process than legislation. This adds undue risk for businesses and may stifle economic investment and growth.

The Canadian Chamber of Commerce has provided an annex of recommendations to aid in the amendment process and ensure the harmonization of Bill C-27 with international and domestic legislation. We would also be glad to meet at your convenience to discuss Bill C-27 further.


Annex A – CPPA Recommendations

A core position of the Canadian Chamber of Commerce is the need for amendments to better define many of the principles and concepts in Bill C-27 and to harmonize the bill with the norms and standards found in existing provincial and international law. To help address this, members of the Canadian Chamber of Commerce have provided the following recommendations, each accompanied by the reasoning for the recommended amendment.

Section 2 (1): Definition of Anonymization

  • Recommend: Include a notion of reasonableness within the definition of “anonymize” to avoid imposing an absolute standard on organizations.

Reasoning: It is impossible to achieve a zero risk of re-identification within an anonymized dataset. For this reason, many jurisdictions have favoured a more nuanced, risk-based approach to the anonymization of personal information.

The provisions should focus on addressing risks that may arise when de-identified information is outside the control of the original accountable organization or when the purpose of re-identification is inconsistent with the original intent or requirements of the Act (e.g., many organizations use de-identification internally, such as to safeguard information).

The “irreversibly and permanently” standard is too strict; we prefer harmonization with Quebec, where the standard is that it is “reasonably foreseeable” that the person cannot be identified.

Section 2 (1): Definition of Automated Decision Making

  • Recommend: Remove all provisions that refer to “automated decision system” from the CPPA, as automated decision-making presents a complex set of policy issues better addressed in stand-alone legislation focused on AI.
  • If automated decision-making cannot be removed from the CPPA, we recommend changing the definition so that decisions made with the “assistance” of technology are not considered automated, aligning with the definition of automated decisions in Quebec Law 25.
  • Recommend: Alternatively, ensure that all definitions and references to automated decision-making are harmonized between the CPPA and AIDA and Quebec Law 25.

Reasoning: The definition of “automated decision system” in the CPPA, which includes “any technology that assists…the judgment of human decision-makers…”, is so broad as to cover most common software. The EU GDPR defines automated decision-making as making a decision by technological means without any human involvement, and Quebec has taken the same approach. Canada should align with them.

Section 2 (2)

  • Recommend: Define the term “minor” as a natural person under a single, defined age that is consistent with other privacy laws (e.g., the GDPR and Quebec Law 25, which sets the age at under 14 years).

Reasoning: As the term “minor” is not defined in the CPPA, the term will have the meaning ascribed to it by provincial/territorial “age of majority” laws, which provide that, in the absence of a definition or an indication of a contrary intention, a “minor” is a natural person under the age of 18 in AB, MB, ON, PEI, QC, and SK and a natural person under the age of 19 in BC, NB, NL, NT, NS, NU, and YT. Differing definitions of “minor” across Canadian jurisdictions will require businesses operating in multiple jurisdictions to develop and implement different: (1) consent management policies, practices, and procedures; (2) user/customer experiences; (3) retention and breach reporting policies; and (4) security safeguards for different sets of jurisdictions. It may also require such businesses to engage in age profiling in jurisdictions where a “minor” includes a person who is 18 years old. This will impose an undue burden on such businesses and may lead to customer confusion. We recommend harmonizing with Quebec Law 25, which establishes under 14 years of age as the threshold for a minor, to remain consistent with existing Canadian law.

Section 14: New Purpose

  • Recommend: Clarify that an organization need only obtain consent for a new purpose if consent would otherwise have been required for that purpose.

Reasoning: The language requires consent for new uses and disclosures that would otherwise not require consent, simply because the data is used for a new purpose. This expands the consent requirement to any potential use case, regardless of whether the new use or disclosure involves business activities, a legitimate interest, or another valid exception to consent.

Section 18: Business Activities

  • Recommend: Clarify the definition by explicitly stating the intent. For example, “the personal information is not collected or used for the purpose of directly influencing decision making or behaviours.”

Reasoning: The language is ambiguous and covers far too broad a range of activities.

  • Recommend: Revise to include the following business activities: (1) an activity undertaken for the purpose of influencing the individual’s behaviour or decision when necessary to provide a product or service that the individual has requested from the organization; (2) an activity that is necessary to develop new products or services; and (3) an activity that is necessary to improve, monitor or audit the quality or performance of existing products or services.

Reasoning: These are standard business activities to which a consent exception should apply.

  • Recommend: Remove Section 18(5) from the CPPA.

Reasoning: An organization is required to comply with the conditions described in Section 18(4) of the CPPA before relying on the legitimate interest exception. It does not benefit Canadians to require an organization to prepare a written record describing how those conditions are satisfied every time this exception is relied upon. Further, this requirement is unduly burdensome on businesses and is inconsistent with the other consent exceptions under the CPPA (which do not require written records to be prepared every time an exception is relied upon).

  • Recommend: In section 18(4)(a), replace “any adverse effect” with “any reasonably foreseeable adverse effect.”

Reasoning: The original wording sets an impossible standard.

Section 21: Research, Analysis, and Development

  • Recommend: Remove the qualifier “internal” from “research, analysis and development.”

Reasoning: The current drafting prevents organizations from transferring information to a third-party service provider or an affiliate to conduct research and analytics on their behalf, thereby requiring organizations to invest further resources in developing their own “internal” research and development capabilities. The current drafting may also undermine the development of innovative research partnerships that allow various stakeholders to share datasets broad enough to produce valuable and actionable insights. In addition, the current draft would create an uneven playing field for small and medium-sized businesses, as larger organizations, with access to greater internal resources, could rely on this exception more easily.

Section 39: Socially Beneficial Purposes Consent Exception

  • Recommend: Expand this consent exception to include disclosures made to all public sector entities and private sector organizations, not just designated public sector entities.

Reasoning: Much private sector work leads to tremendous social benefits, and we do not want to lose this innovation in Canada. For example, Canada’s life sciences sector has developed many innovative treatments with the help of advanced analytics and access to big data resources drawn from across healthcare systems, while maintaining the most stringent safeguards around privacy and data protection.

  • Recommend: Provide a principled basis on which a purpose that is otherwise “related to health, public education, diversity, equity, & inclusion, the provision or improvement of public amenities or infrastructure [or] the protection of the environment” can be labelled as “socially beneficial.”

Section 43: Request of Government Institution

  • Recommend: Strengthen by adding requirements that the government institution precisely explain the purpose of collection, the intended use, the retention period, the cybersecurity protections over the personal information, and how the request represents the minimum collection of personal information necessary in the circumstances.

Reasoning: The government is increasingly requesting large amounts of data from companies, and these requests undermine our members’ commitment to protect this personal data.

Section 54: Personal information for decision-making

  • Recommend: Delete the section or qualify the language to indicate that the type of decision covered is one having a legal effect on an individual.

Reasoning: As currently drafted, the section requires the retention of all personal information used to make any decision about an individual for an indeterminate amount of time. This language is overbroad in terms of the scope of what is retained and for how long it is retained and creates risk by requiring the retention of personal information that could otherwise be disposed of. It also runs counter to retention/destruction requirements in other laws.

Section 55: Disposal of Personal Information

  • Recommend: Require deletion of personal information “unless this proves impossible or involves disproportionate effort.”

Reasoning: The deletion requirement is too broad and could be more limited, as in the Quebec legislation. At the same time, the ability to refuse a deletion request is too limited (e.g., there is not enough protection in areas such as retention for fraud monitoring, retention to protect against the possibility of a human rights complaint, and situations involving an ongoing law enforcement investigation that tipping off the individual may jeopardize).

  • Recommend: Ensure consistency with the disposal notification requirements already established in the EU GDPR and the California CCPA.

Reasoning: It is not practical to require organizations to ensure that data is deleted by their service providers.

  • Recommend: The CPPA should be revised to require an organization to “take reasonable steps to ensure that the service provider has disposed of the [personal information]” rather than requiring an organization to “ensure that the service provider has disposed of the [personal information].”

Reasoning: Requiring organizations to “ensure” that a service provider disposes of personal information is a more onerous requirement that is difficult to fulfill, considering that the first organization does not control the systems of the service provider.

  • Recommend: Add to the exceptions in subsection (2): (g) to enable solely internal uses that are reasonably aligned with an individual’s expectations; (h) to help ensure security and integrity, to the extent the information is reasonably necessary for those purposes; (i) to address cases involving the prevention, detection and suppression of fraud or financial crimes; and (j) where retention is necessary to protect confidential commercial information, or where the information is related to an ongoing investigation or its disposal would contravene a law.

Reasoning: There are legitimate internal use cases such as audit, finance, or security requirements that are not considered in the current exceptions. In addition, consistent with the first point noted above, the exceptions should clearly address instances involving fraud/financial crimes, commercial information, investigations, or legal contraventions.

Section 56: Extent of Accuracy

  • Recommend: In association with the recommended change to Section 54, delete the section or qualify the language to indicate that the type of decision covered is one having a legal effect on an individual.

Reasoning: The language “used to make a [any] decision about an individual” is too broad.

Section 57 (3): Scope of Security Safeguards

  • Recommend: Delete or revise to clarify under what circumstances the requirement “and must include reasonable measures to authenticate the identity of the individual to whom the personal information relates” would apply.

Reasoning: Typically, authentication of the identity of the individual to whom the personal information relates occurs when that person requests access to that information (e.g., a privacy rights request), but the wording is ambiguous as to when the requirement would apply.

Section 62 (1) (e): Disclosing Sensitive Personal Information Retention Periods

  • Recommend: Not require organizations to disclose to the public important strategic information such as retention periods applicable to sensitive personal information.

Reasoning: Disclosing the retention periods applicable to sensitive personal information could attract the attention of malicious actors, who could take advantage of the opportunity to target the data repositories of organizations that retain a significant volume and variety of health data. This may significantly increase the risk of cyberattacks for organizations that legitimately hold health information for research purposes.

Section 62 (2) (c) and 63 (3): Definition of “Significant Impact”

  • Recommend: The CPPA should define what “significant impact” means so there is clarity for its use in Sections 62(2)(c) and 63(3) regarding automated decision systems, and the definition should align with that of “high-impact systems.”

Section 72: Mobility of Personal Information

  • Recommend: Revise to require only that organizations subject to a data mobility framework return data to the individual in a readily usable format.
  • Recommend: Consider developing an “allow list” of narrowly scoped data categories that would allow individuals to switch to new service providers more easily, and codify key exceptions in the legislation.

Reasoning: There are significant challenges to implementing data portability between companies, including defining the scope of what is or isn’t subject to data portability, how to transfer data in industries that lack agreed standard formats, and what to do about data that is formatted or organized in a way that is proprietary to a business.

In the implementation phase for the CPPA, we recommend working with industry to develop voluntary best practices that align with regional or international standards and codes of conduct.

Section 75: Prohibition on Re-Identification of De-Identified Personal Information

  • Recommend: Revise section 75 by replacing “de-identified” with “anonymized.”

Reasoning: Organizations de-identify personal information for safeguarding, data minimization and other privacy-enhancing purposes. Without this amendment, an organization that de-identifies personal information for these purposes would be worse off than if it had retained the information in an identifiable form. This would also align the CPPA with the EU GDPR.

Section 94: Financial Penalties

  • Recommend: Replace all references to “gross global revenue” or “gross global revenues” in CPPA and PIDPTA with references to “gross Canadian revenue” or “gross Canadian revenues,” as applicable.

Reasoning: Financial penalties imposed on an organization for contravening a Canadian privacy law should be tied to the organization’s gross Canadian revenues rather than its global gross revenues to ensure that such penalties are: (1) related to the organization’s activities in Canada (rather than related to its activities in other jurisdictions) and (2) proportionate to the offence and the size of the Canadian market. The proposed amendment ensures that the maximum penalties create strong incentives for organizations to comply with Canadian privacy law without discouraging organizations from doing business in Canada.

Section 103 (2): Standard of Review

  • Recommend: Lower the bar for judicial review.

Reasoning: As currently written, Bill C-27 lacks procedural fairness. The proposed standard of judicial review, under which the Commissioner must have committed a “palpable and overriding error” for their decision to be overturned, is so high that it makes judicial review almost impossible.

Section 107: Private Right of Action

  • Recommend: Revise Section 107(1) of the CPPA to provide that “An individual who is affected by an act or omission by an organization that constitutes a contravention of this Act and is set out in Section 94(1) has a cause of action against the organization for damages for loss or injury that the individual has suffered as a result of the contravention if the contravention was intentional or resulted from gross fault and…”

Reasoning: Narrowing the scope of the private right of action will render the CPPA consistent with Quebec Law 25, which recognizes the possibility for individuals to claim punitive damages when an unlawful infringement of a right conferred by the Private Sector Act or by articles 35 to 40 of the Civil Code causes an injury, provided the infringement is intentional or results from gross negligence. Narrowing the scope of the private right of action will ensure that expert regulators are positioned to shape Canadian privacy laws while balancing privacy protections and the legitimate needs of enterprises to collect, use and disclose personal information. Regulators understand the complexities of encouraging compliance while preventing and remediating harms. Furthermore, broad private rights of action may encourage inappropriate litigation, including costly class actions, aimed at forcing large settlements and payment of legal fees to plaintiffs’ class action counsel without improving the rights of individuals.

  • Recommend: Add a new Section 107(5) to the CPPA, which states, “The findings of the Commissioner or the Tribunal are not binding on a court which shall be required to make all determinations of facts and law and mixed questions of fact and law itself.”

Reasoning: Clarifying that the Commissioner’s or the Tribunal’s findings are not binding on a court is critical to address (in part) the lack of procedural protections found in the CPPA, including the informal and expedited process for inquiries mandated by Section 91(1).

  • Recommend: Add a new Section 107(6) to the CPPA, which states, “A person is not to be found liable under Section 107 if they establish that they exercised due diligence to prevent the commission of the acts or omissions.”

Reasoning: Adding a due diligence defence in respect of the private right of action is required to make the enforcement regime in the CPPA internally consistent.

Transitional Provisions

  • Recommend: Adopt a phased implementation plan for Bill C-27 over a period of 36 months, allowing organizations to adapt to the new requirements successfully.
  • Recommend: Following implementation, adopt a phased enforcement process to allow organizations to adapt to the new legal obligations imposed by the legislation without being subject to unreasonable penalties.

Reasoning: Bill C-27 introduces significant changes that will require substantial time and resources to adequately implement, especially considering the material increase in fines and penalties for violations of the federal data protection regime’s requirements. These changes will be particularly challenging for SMEs to implement in a short timeframe. A 36-month transitional period will allow businesses an adequate runway to implement the changes and is consistent with the period afforded under similar legislative reforms in other jurisdictions, such as Quebec, California and the EU.

Grandfathering

  • Recommend: Include a grandfather clause allowing existing data supply agreements to comply only with PIPEDA requirements rather than Bill C-27 for the remainder of their unexpired term.

Reasoning: Under the CPPA, organizations must ensure that service providers with whom they have a data outsourcing agreement provide an “equivalent” level of protection of personal information. The switch from “comparable” to “equivalent” creates ambiguity around the status of contractual safeguards, which may require organizations to revisit agreements with service providers. In most cases, reviewing agreements will require significant time and resources. However, in many cases, revisiting existing agreements will not be possible.

  • Recommend: Include a grandfathering clause allowing implied consent obtained under PIPEDA to remain sufficient for the collection of personal information.

Reasoning: If an organization has implied consent now, it should not have to collect express consent from its existing base, since this would impose unreasonable costs on the organization. Compliance with such requirements would be particularly onerous for SMEs. Instead, organizations should be required to obtain express consent only when entering a new contract with their base.


ANNEX B – AIDA Recommendations

Section 2: Definition of Artificial Intelligence System

  • Recommend: Align the definition with the OECD/EU AI definition.

Reasoning: The reference to “related to human activities” is confusing, inconsistent with other widely used definitions, and could affect interoperability.

Section 5: Definition of Anonymization

  • Recommend: Amend AIDA to use the same definition of anonymization as the CPPA.

Reasoning: This will ensure that the concept of anonymized data is used consistently in both statutory frameworks.

Section 5 (1): Definition of Harm

  • Recommend: Clarification around the concept and definition of “harm.”

Reasoning: The legislation centers around the concept of a high-impact AI system being “likely to cause material harm” but does not explain how that should be determined, the threshold at which a system becomes “likely to cause material harm,” or what types and levels of harm would constitute “material harm” (beyond a general reference to physical, psychological, property and economic harms). It also does not specify what is expected of organizations in terms of ongoing monitoring to determine whether a system becomes “likely to cause material harm.” This leaves significant uncertainty for organizations developing, deploying and using AI systems, which could slow innovation and adoption of AI (particularly by SMEs). The concept of harm in AIDA should align with globally accepted definitions already applied in industry, such as those from the European Commission’s High-Level Expert Group on AI and the OECD Network of Experts on AI.

  • Recommend: Introduce a principles-based approach to the analysis of “harm.”

Reasoning: Often, “harm” to one individual is the result of practical decisions that benefit a far greater number.

Section 5 (1): Definition of High-Impact System

  • Recommend: The legislation should specify clear criteria under which systems may be classified as “high-impact AI systems,” as well as a straightforward process and criteria for amending the list of “high-impact AI systems,” as the EU AI Act does.

Reasoning: “High-impact AI systems,” which form the crux of the law, are entirely undefined, and no criteria are provided for determining what constitutes a “high-impact AI system.” Under the law as written, the regulations introduced could scope it very narrowly, in line with the EU, or extremely broadly, leaving significant uncertainty and a lack of legal clarity.

  • Recommend: Use a risk-based approach that focuses on identifying and regulating “high-impact” applications of AI.

Reasoning: What qualifies as “high-impact” should be determined based on an assessment of the AI use case rather than dictated by law. Due attention must be paid to the differing purposes and contexts of “high-impact” and non-“high-impact” AI systems.

For example, the EU GDPR requires a data protection impact assessment (“DPIA”) where the processing of personal data is likely to result in a high risk to the rights and freedoms of individuals. A DPIA is designed to assess the level of risk, particularly whether it is a high risk, looking at the likelihood and severity of the potential harm and the benefits of the proposed data processing activity.

Section 5 (2): Person Responsible

  • Recommend: Clearly delineate the roles and responsibilities of different actors along the AI value chain.

Reasoning: The AI value chain is highly complex and can involve multiple developers, licensees, vendors, and users. As currently drafted, AIDA’s requirements could capture multiple entities, creating confusion around who is responsible for complying with the legislation, particularly the responsibilities of regulated parties in relation to managing AI systems and making them available for use.

Section 11: Publication of Description

  • Recommend: This requirement should include an exception for high-impact systems intended to protect systems or prevent financial crime. At a minimum, such descriptions should be provided to the Minister but not publicly disclosed.

Reasoning: Publicizing such descriptions could unveil sensitive information and jeopardize the security of critical systems.

Section 15: Ministerial Orders to Demand Records

  • Recommend: Not give the Minister the power to demand all records related to an AI system if an offence is suspected.

Reasoning: For technology companies heavily reliant on AI, the information requests alone could be crippling.

Section 33: Artificial Intelligence and Data Commissioner

  • Recommend: Split the inspection/investigation, prosecution, and adjudication functions under AIDA.

Reasoning: Having all of them exercised by a single individual does not meet administrative law standards of natural justice.”
