Experts urge changes to proposed Canadian privacy, AI laws before today’s hearing

In advance of the start of committee hearings today on proposed Canadian privacy and artificial intelligence legislation, 45 groups, experts, and academics released an open letter to Innovation Minister François-Philippe Champagne calling for major changes.

Champagne and department officials are scheduled to testify before the House of Commons Industry and Technology committee at 3:30 p.m. Eastern on Bill C-27, a combination piece of legislation that proposes:

• the Consumer Privacy Protection Act (CPPA). The CPPA would replace the Personal Information Protection and Electronic Documents Act (PIPEDA). Like PIPEDA, the CPPA would cover federally regulated businesses and businesses in provinces that don’t have their own private-sector privacy law;

• the Personal Information and Data Protection Tribunal Act, which would create a tribunal to hear recommendations from the federal privacy commissioner for punishing those who violate the CPPA;

• and the Artificial Intelligence and Data Act (AIDA) for overseeing AI.

All three proposed pieces of legislation — announced over a year ago — already face criticism. In a written submission to the committee, federal privacy commissioner Philippe Dufresne said the CPPA is “a step in the right direction,” but must go further to protect fundamental privacy rights.

On the other hand, in April, 75 AI experts urged Parliament to quickly pass AIDA despite its possible flaws.

Key recommendations in the letter released this week call on the Liberal government to:

— recognize privacy in the CPPA as a fundamental Canadian human right;

— remove AI regulation from the Industry department’s sole jurisdiction because its mandate to bolster the AI industry conflicts with the public interest in regulating the potential dangers of AI;

— address poorly defined language in AIDA that creates loopholes and a lack of enforceable rules;

— commit to far more active consultation with stakeholders beyond industry insiders to ensure AIDA and subsequent AI rules are well-balanced and protect rights;

— and expand AI regulation to apply to both the public and private sectors, including government security agencies.

Those signing the letter include the Canadian Civil Liberties Association, OpenMedia, the Public Interest Advocacy Centre, and University of Ottawa law professors Teresa Scassa and Michael Geist.

“Excluding private sector AI tech developed for government intelligence, defence and national security purposes from any form of regulation means a free pass for some of the most potentially harmful AI tools,” Tim McSorley, national coordinator at the International Civil Liberties Monitoring Group, said in a statement accompanying the letter. “If the government is serious about protecting the rights of people in Canada, AIDA isn’t up to task.”



