Bill C-216

First Session, Forty-fifth Parliament,

3 Charles III, 2025

HOUSE OF COMMONS OF CANADA

BILL C-216
An Act to enact the Protection of Minors in the Digital Age Act and to amend two Acts

FIRST READING, June 19, 2025

Ms. Rempel Garner


SUMMARY

Part 1 of this enactment enacts the Protection of Minors in the Digital Age Act, the purpose of which is to provide for a safe online environment for minors by requiring owners and operators of platforms such as online services or applications to ensure that minors’ personal data is not used in a manner that could compromise their privacy, health or well-being.

Part 2 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,

(a) clarify the types of Internet services covered by that Act;

(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;

(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child sexual abuse and exploitation material;

(d) extend the period of preservation of data related to an offence;

(e) extend the limitation period for the prosecution of an offence under that Act; and

(f) add certain regulation-making powers.

Part 3 amends the Criminal Code to, among other things,

(a) prohibit the publication of the image of a person created or edited through the use of computer software that falsely represents the person, in a manner that is intended to make the image appear authentic, as being nude, as exposing their genital organs, anal region or breasts or as being engaged in explicit sexual activity;

(b) create a separate offence of criminal harassment that is conducted by means of the Internet, a social media service or other digital network and require the court imposing a sentence for the offence to consider as an aggravating factor the fact that the offender, in committing the offence, communicated with the victim anonymously or using a false identity; and

(c) provide for the circumstances in which a person who presents a risk of committing an offence of online harassment may be required to enter into a recognizance and, if the person has communicated anonymously or using a false identity, provide for the circumstances in which a court may make a production order for the purpose of identifying the person.

Available on the House of Commons website at the following address:
www.ourcommons.ca


TABLE OF PROVISIONS

An Act to enact the Protection of Minors in the Digital Age Act and to amend two Acts
Short Title
1 Promotion of Safety in the Digital Age Act

PART 1
Protection of Minors in the Digital Age Act
2 Enactment of Act

An Act to provide for the protection of minors in the digital age
Short Title
1 Protection of Minors in the Digital Age Act
Interpretation
2 Definitions
Purpose
3 Purpose
Duty of Care
4 Duty of care
Safeguards
5 Safety settings
6 Parental controls
7 Accessibility
Reporting Channel
8 Reporting mechanism
Prohibitions
9 Prohibition
Disclosure
10 Clear and readily accessible information
Advertising and Marketing
11 Advertising and marketing
Transparency
12 Keeping records
13 Independent review
14 Annual report
Market Research Guidelines
15 Guidelines
Offences and Punishment
16 Contravention of sections 4 to 9
17 Contravention of sections 10 to 12
18 Due diligence
Private Right of Action
19 Private right of action
Standards and Conformity Assessment
20 Standards or codes of practice
Regulations
21 Regulations
Coming into Force
3 18 months after royal assent

PART 2
An Act respecting the mandatory reporting of Internet child sexual abuse and exploitation material by persons who provide an Internet service
Coming into Force
9 Coming into force of Part

PART 3
Criminal Code
Coming into Force
19 Coming into force of Part



1st Session, 45th Parliament,

3 Charles III, 2025

HOUSE OF COMMONS OF CANADA

BILL C-216

An Act to enact the Protection of Minors in the Digital Age Act and to amend two Acts

His Majesty, by and with the advice and consent of the Senate and House of Commons of Canada, enacts as follows:

Short Title

Short title

1 This Act may be cited as the Promotion of Safety in the Digital Age Act.

PART 1
Protection of Minors in the Digital Age Act

Enactment of Act

Enactment

2 The Protection of Minors in the Digital Age Act is enacted as follows:

An Act to provide for the protection of minors in the digital age
Short Title
Short title
1 This Act may be cited as the Protection of Minors in the Digital Age Act.
Interpretation
Definitions
2 The following definitions apply in this Act.

child means an individual who is under the age of 16 years. (enfant)

Commission means the Canadian Radio-television and Telecommunications Commission established by the Canadian Radio-television and Telecommunications Commission Act. (Conseil)

Minister means the Minister of Industry. (ministre)

minor means an individual who is under the age of 18 years. (mineur)

operator means the owner or operator of a platform, such as an online service or application, that connects to the Internet and that is used, or could reasonably be expected to be used, by a minor, including a social media service and an online video gaming service. (exploitant)

parent, in respect of a minor, includes a person who, in law,

  • (a) has custody of the minor or, in Quebec, parental authority over the minor; or

  • (b) is the guardian of the minor or, in Quebec, the tutor or curator to the person of the minor. (parent)

personal data means information that identifies or is linked or may reasonably be linked to a particular minor and includes a mobile device identifier associated with a minor. (données personnelles)

personalized recommendation system means a fully or partially automated system or computer algorithm used to suggest, promote or rank information based on the personal data of users. (système de recommandations personnalisées)

social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content. (service de média social)

Purpose
Purpose
3 The purpose of this Act is to provide for a safe online environment for minors by requiring operators to take meaningful steps to protect them and address online risks to their health and well-being, including by ensuring that their personal data is not used in a manner that could compromise their privacy, health or well-being.
Duty of Care
Duty of care
4 (1) Every operator must act in the best interests of a user whom it knows or should reasonably know is a minor by taking reasonable steps in the design and operation of its products and services to prevent or to mitigate the effects of the following:
  • (a) physical harm or incitement of such harm;

  • (b) online sexual violence against minors, including any conduct directed at a minor online that constitutes an offence under the Criminal Code and is committed for a sexual purpose or any unsolicited or unwanted sexual actions or communications directed at a minor online;

  • (c) the creation or dissemination of imagery of a minor, whether altered or not, that is sexually exploitative;

  • (d) the promotion and marketing of products or services that are unlawful for minors, such as any controlled substance as defined in subsection 2(1) of the Controlled Drugs and Substances Act, alcohol, cannabis and tobacco and products or services relating to gambling or pornography;

  • (e) mental health disorders, including eating disorders and substance use disorders, and the promotion of self-harm, suicide and suicidal behaviours;

  • (f) patterns of use that indicate or encourage addiction-like behaviours;

  • (g) the operation of an account by a user whom it knows or should reasonably know is a minor without first making reasonable efforts to verify the contact information for any of the user’s parents through, for example, the appropriate Internet service provider; and

  • (h) predatory or deceptive marketing practices.

Clarification
(2) Nothing in subsection (1) is to be construed as
  • (a) requiring an operator to prevent any minor from deliberately and independently searching for specific content; or

  • (b) preventing an operator or any user from providing resources for the prevention or mitigation of any harm described in subsection (1), including evidence-based information and clinical resources.

Safeguards
Safety settings
5 (1) Every operator must provide any parent of a user whom the operator knows or should reasonably know is a child, as well as that user, with clear and readily accessible safety settings on its platform, including settings to
  • (a) control the ability of other individuals to communicate with the child;

  • (b) prevent other individuals from consulting personal data of the child that is collected by, used or disclosed on the platform, in particular by restricting public access to personal data;

  • (c) reduce features that increase, encourage or extend the use of the platform by the child, including automatic displaying of content, rewards for time spent on the platform, notifications and other features that could result in addictive use of the platform by the child;

  • (d) control personalized recommendation systems, including the right to

    • (i) opt out of such systems, while still allowing content to be displayed in chronological order, with the latest published content displayed first, and

    • (ii) limit types or categories of recommendations from such systems; and

  • (e) restrict the sharing of the child’s geolocation and notify the child and their parent when their geolocation is being tracked.

Default settings
(2) The operator must ensure that the default setting for the safeguards described in subsection (1) is the option that provides the highest level of protection.
Additional obligations
(3) The operator must
  • (a) in restricting access to its platform or any of its content that is inappropriate for children, use computer algorithms that ensure reliable age verification and that preserve privacy;

  • (b) implement adequate measures to protect the privacy, health and well-being of children; and

  • (c) take remedial measures when it becomes aware of any issues raised in relation to the privacy, health or well-being of children on the platform.

Additional options
(4) The operator must provide any parent of a user whom it knows or should reasonably know is a child, as well as that user, with clear and readily accessible options on its platform to
  • (a) delete the child’s account;

  • (b) delete any personal data collected from or shared by the child on the platform; and

  • (c) limit the amount of time spent by the child on the platform.

Parental controls
6 (1) Every operator must provide on its platform clear and readily accessible controls for any parent to support a user that the operator knows or should reasonably know is a minor, including the ability to
  • (a) manage the minor’s privacy and account settings;

  • (b) view metrics of time spent by the minor on the platform; and

  • (c) prevent purchases and financial transactions by the minor.

Default parental controls
(2) The parental controls referred to in subsection (1) must be set as a default setting in the case of a user whom the operator knows or should reasonably know is a child.
Opt-out
(3) Every operator must provide any parent with a clear and readily accessible option to opt out of or turn off the default parental controls.
Notice to minor
(4) Every operator must notify a user whom it knows or should reasonably know is a minor when the parental controls are in effect and which settings or controls have been activated.
Notice to parent
(5) If the operator has reasonable grounds to believe that the default parental controls have been turned off by a minor, it must notify the parent.
Accessibility
7 Every operator must make the following readily accessible and provide it in the language, form and manner in which its platform provides the product or service used by minors and their parents:
  • (a) information and control options that take into consideration the differing ages, capacities and developmental needs of the minors most likely to access the platform and that do not encourage minors or parents to weaken or disable safety settings or parental controls; and

  • (b) options to enable or disable safety settings or parental controls, as appropriate.

Reporting Channel
Reporting mechanism
8 (1) Every operator must provide on its platform a dedicated and readily accessible reporting channel that any person may use to alert the operator to online harms and risks to minors.
Internal process
(2) Every operator must establish an internal process to receive and respond to the reports received through the reporting channel and must take any measures necessary to respond to the person who makes a report and to address any issues raised in a reasonable and timely manner.
Prohibitions
Prohibition
9 (1) It is prohibited for an operator to use any platform design features, including personalized recommendation systems, or use personal data in a manner that facilitates the advertising, marketing, soliciting, offering or selling of products or services that are unlawful for minors, such as any controlled substance as defined in subsection 2(1) of the Controlled Drugs and Substances Act, alcohol, cannabis and tobacco and products or services relating to gambling or pornography.
Prohibition
(2) It is prohibited for an operator to design, modify or manipulate a user interface in a manner that subverts or impairs user autonomy, decision-making or choice in order to weaken or disable the safety settings or parental controls required under this Act.
Prohibition
(3) It is prohibited for an operator to require or request the use of a digital identifier that serves as an electronic representation of an individual’s identity and of their right to access information or services online.
Interpretation
(4) Nothing in this section is to be construed as
  • (a) preventing an operator from taking reasonable measures to

    • (i) block, detect or prevent the distribution of unlawful, obscene or other harmful material, as described in paragraph 4(1)(a), to minors, or

    • (ii) block or filter spam, prevent criminal activity or protect the security of its platform or service; or

  • (b) requiring the disclosure of a minor’s browsing behaviour, search history, messages, contact list or other content or metadata of their communications.

Disclosure
Clear and readily accessible information
10 Every operator must, in a prominent location on its platform, provide clear and readily accessible information regarding the following:
  • (a) its policies, practices and safety settings, including those pertaining to and available for minors and their parents;

  • (b) access to the safety settings and parental controls required under sections 5 and 6, respectively;

  • (c) the type of personal data that the platform collects, uses or discloses and the manner in which it does so;

  • (d) the platform’s use of any personalized recommendation systems to prioritize, assign weight to or rank different categories of personal data and of the options available to users to modify or disable these settings; and

  • (e) the platform’s use of any labels or tags to indicate that specific advertisements, information, products or services are directed at minors.

Advertising and Marketing
Advertising and marketing
11 Every operator must provide, with respect to advertising on its platform, clear and readily accessible information and labels regarding the following:
  • (a) the name of the product, service or brand and the subject matter of each advertisement;

  • (b) if the platform conducts targeted advertising, the reasons for targeting minors regarding any given advertisement and the ways in which minors’ personal data is used to engage in such advertising; and

  • (c) the fact, if applicable, that content displayed to a minor consists of an advertisement or marketing material, including the disclosure of endorsements of products, services or brands, made by other users of the platform for commercial consideration.

Transparency
Keeping records
12 Every operator must keep and maintain audit logs for the collection, processing and use of personal data and relevant records of data and personal data in its possession or control that are necessary to determine whether it has complied with this Act.
Independent review
13 Every two years, every operator must cause an independent review of its platform to be conducted, including in respect of the risks and harms it poses to minors and the cumulative effects the use of the platform has on minors. The operator must make the findings publicly available.
Annual report
14 (1) Every operator must, in each year, prepare a report for the previous year on the risks and harms to minors identified in the independent review and the prevention and mitigation measures taken to address them.
Content
(2) The report must also include a systemic risk and impact assessment in relation to the following:
  • (a) the extent to which the operator’s platform is likely to be accessed by minors;

  • (b) if the platform is accessed by minors, data on the number of minors using it and on their daily, weekly and monthly usage;

  • (c) the platform’s safety settings and parental controls, including an assessment of their efficacy and a description of any breaches reported in relation to them;

  • (d) the extent to which the platform’s design features, including its personalized recommendation systems and its use of automatic displaying of content, rewards for time spent and notifications, pose risks to minors, including to their privacy, health or well-being;

  • (e) the collection, use and disclosure by the platform of personal data, such as geolocation or health data, and the purposes for which and the manner in which the data is collected, used or disclosed;

  • (f) the reports the operator has received through its reporting channel, including the number and nature of the reports;

  • (g) the internal process the operator has implemented to receive reports and the timeliness, effectiveness and types of responses provided following each report; and

  • (h) the prevention and mitigation measures taken by the operator to address any issues raised in the independent review.

Publication
(3) The operator must publish the report in a prominent place on its platform.
Market Research Guidelines
Guidelines
15 The Commission must, in consultation with relevant stakeholders, establish guidelines setting out how operators may conduct market research and product-focused research in relation to minors.
Offences and Punishment
Contravention of sections 4 to 9
16 Every operator that contravenes any of sections 4 to 9 is guilty of an offence and liable,
  • (a) on conviction on indictment, to a fine of not more than twenty-five million dollars; and

  • (b) on summary conviction, to a fine of not more than twenty million dollars.

Contravention of sections 10 to 12
17 Every operator that contravenes any of sections 10 to 12 or a provision of the regulations made under section 21 is guilty of an offence and liable on summary conviction to a fine of not more than ten million dollars.
Due diligence
18 An operator is not to be found guilty of an offence under this Act if it establishes that it exercised due diligence to prevent its commission.
Private Right of Action
Private right of action
19 (1) The user of a platform who is a minor, or any of their parents, who alleges that they have suffered serious harm as a result of a failure by its operator to comply with its duty of care under subsection 4(1) may, in any court of competent jurisdiction, bring an action against the operator and, in the action, claim relief by way of one or more of the following:
  • (a) damages for any serious harm, loss or damage suffered;

  • (b) aggravated or punitive damages;

  • (c) an injunction;

  • (d) an order for specific performance; or

  • (e) any other appropriate relief, including the costs of the action.

Limitation period
(2) Unless the court decides otherwise, no action may be brought later than three years after the day on which the minor or their parent becomes aware of the act or omission on which their action is based.
Definition of serious harm
(3) In this section, serious harm includes significant physical or psychological harm and substantial economic loss.
Standards and Conformity Assessment
Standards or codes of practice
20 If an operator has implemented standards or a code of practice that, in the Minister’s opinion, provides for substantially the same or greater protections as those provided for under this Act, the Minister may cause a notice to be published in the Canada Gazette confirming the extent to which this Act applies to the operator’s platform.
Regulations
Regulations
21 The Governor in Council, on the recommendation of the Minister following consultations with the Commission, may make regulations for carrying out the purposes and provisions of this Act, including regulations
  • (a) setting out the form and manner, including the languages, in which information is to be provided to users under section 10; and

  • (b) for the purpose of section 12, providing for the records of data and personal data to be kept and maintained by an operator, the manner in which they are to be kept and maintained and the period during which they are to be kept and maintained.

Coming into Force

18 months after royal assent

3 (1) Sections 1 to 11 and 16 to 21 of the Protection of Minors in the Digital Age Act, as enacted by section 2 of this Act, come into force on the day that, in the eighteenth month after the month in which this Act receives royal assent, has the same calendar number as the day on which this Act receives royal assent or, if that eighteenth month has no day with that number, the last day of that eighteenth month.

Second anniversary

(2) Sections 12 to 15 of the Protection of Minors in the Digital Age Act, as enacted by section 2 of this Act, come into force on the second anniversary of the day on which this Act receives royal assent.
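The formula in subsection 3(1) is ordinary date arithmetic with one edge case: if the eighteenth month after the assent month has no day with the assent day's calendar number, the last day of that month is used. A minimal illustrative sketch (the function name and the choice of Python are the editor's assumptions, not part of the bill):

```python
from calendar import monthrange
from datetime import date

def coming_into_force(assent: date, months: int = 18) -> date:
    """Return the day in the Nth month after the assent month that has the
    same calendar number as the assent day, or the last day of that month
    if no such day exists (e.g. assent on the 31st, a 30-day target month)."""
    total = assent.month - 1 + months
    year, month = assent.year + total // 12, total % 12 + 1
    last_day = monthrange(year, month)[1]  # number of days in target month
    return date(year, month, min(assent.day, last_day))
```

For example, assent on 31 August 2025 points at February 2027, which has no 31st, so the provision would come into force on 28 February 2027.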

PART 2
An Act respecting the mandatory reporting of Internet child sexual abuse and exploitation material by persons who provide an Internet service

2011, c. 4

Amendments to the Act

4 The definition Internet service in subsection 1(1) of An Act respecting the mandatory reporting of Internet child sexual abuse and exploitation material by persons who provide an Internet service is replaced by the following:

Internet service includes a service

  • (a) providing Internet access;

  • (b) providing Internet content hosting, regardless of the originator of the content or the manner by which the content is made accessible; or

  • (c) facilitating interpersonal communication over the Internet, including a service providing electronic mail. (services Internet)

5 Sections 3 and 4 of the Act are replaced by the following:

Duty to notify
3 If a person who provides an Internet service to the public has reasonable grounds to believe that their Internet service is being or has been used to commit a child sexual abuse and exploitation material offence, the person must notify the law enforcement body designated by the regulations of that fact, as soon as feasible and in accordance with the regulations.
Transmission data
3.1 If the content relating to an offence referred to in section 3 is manifestly child sexual abuse and exploitation material, a person who makes a notification under that section must include with the notification a document containing any transmission data, as defined in section 487.011 of the Criminal Code, associated with the content that could assist in the investigation of the offence.
Preservation of computer data
4 (1) A person who makes a notification under section 3 must preserve all computer data related to the notification that is in their possession or control for one year after the day on which the notification is made.
Destruction of preserved computer data
(2) The person must destroy the computer data that would not be retained in the ordinary course of business and any document that is prepared for the purpose of preserving computer data under subsection (1) as soon as feasible after the end of the one-year period, unless the person is required to preserve the computer data by a judicial order made under any other Act of Parliament or any Act of the legislature of a province.

6 The Act is amended by adding the following after section 9:

Clarification — privacy legislation
9.1 For greater certainty, this Act is not to be construed as limiting in any way any obligation under the Privacy Act or any applicable provincial privacy legislation.

7 Section 11 of the Act is replaced by the following:

Limitation period
11 A prosecution for an offence under this Act cannot be commenced more than five years after the time when the act or omission giving rise to the prosecution occurred.

8 (1) Paragraph 12(a) of the Act is replaced by the following:

  • (a) specifying the services included under the definition Internet service in subsection 1(1);

  • (a.1) designating an organization for the purpose of section 2;

(2) Section 12 of the Act is amended by adding the following after paragraph (c):

  • (c.1) designating the law enforcement body for the purpose of section 3;

(3) Section 12 of the Act is amended by adding the following after paragraph (d):

  • (d.1) requiring the law enforcement body designated under paragraph (c.1) to submit to the Minister of Justice and the Minister of Public Safety and Emergency Preparedness an annual report in relation to the information that it has received under this Act;

  • (d.2) specifying the form and content of the annual report referred to in paragraph (d.1), as well as the manner and time of its submission;

Coming into Force

Coming into force of Part

9 This Part comes into force on the day on which it receives royal assent unless, on that day, An Act to amend the Criminal Code and to make consequential amendments to other Acts (child sexual abuse and exploitation material), chapter 23 of the Statutes of Canada, 2024, is not in force, in which case this Part comes into force on the day on which that Act comes into force.

PART 3
Criminal Code

R.‍S.‍, c. C-46

Amendments to the Act

10 (1) Paragraph 162.1(1)(a) of the Criminal Code is replaced by the following:
  • (a) of an indictable offence and liable to imprisonment

    • (i) for a term of not more than five years,

    • (ii) for a term of not more than 10 years, if the person depicted in the image is engaged in explicit sexual activity, or

    • (iii) for a term of not more than 14 years, if the accused knew or ought to have known that, at the time the intimate image was created, aggravated sexual assault was being, or had just been, committed against the person depicted in the image; or

(2) Section 162.1 of the Act is amended by adding the following after subsection (1):
Publication, etc., of a false intimate image without consent
(1.1) Everyone who knowingly publishes, distributes, transmits, sells, makes available or advertises a false intimate image of a person knowing that the person depicted in the image did not give their consent to that conduct, or being reckless as to whether or not that person gave their consent to that conduct, is guilty
  • (a) of an indictable offence and liable to imprisonment

    • (i) for a term of not more than five years,

    • (ii) for a term of not more than 10 years, if the image depicts the person as being engaged in explicit sexual activity, or

    • (iii) for a term of not more than 14 years, if the accused knew or ought to have known that, at the time the false intimate image was created or edited, aggravated sexual assault was being, or had just been, committed against the person depicted in the image; or

  • (b) of an offence punishable on summary conviction.

(3) Section 162.1 of the Act is amended by adding the following after subsection (2):
Definition of false intimate image
(2.1) In this section, false intimate image means a visual recording made by any means, including a photographic, film or video recording, that is created or edited through the use of computer software, including artificial intelligence software, and that falsely represents a person, in a manner that is intended to make the recording appear authentic, as being nude, as exposing their genital organs or anal region or their breasts or as being engaged in explicit sexual activity.

11Subsection 162.‍2(1) of the Act is replaced by the following:

Prohibition order
162.‍2(1)When an offender is convicted, or is discharged on the conditions prescribed in a probation order under section 730, of an offence referred to in subsection 162.‍1(1) Insertion start or (1.‍1) Insertion end , the court that sentences or discharges the offender, in addition to any other punishment that may be imposed for that offence or any other condition prescribed in the order of discharge, may make, subject to the conditions or exemptions that the court directs, an order prohibiting the offender from using the Internet or other digital network, unless the offender does so in accordance with conditions set by the court.

12(1)Paragraph 164(1)‍(b) of the Act is replaced by the following:

  • (b)the recording, copies of which are kept for sale or distribution in premises within the jurisdiction of the court, is an intimate image Insertion start or a false intimate image Insertion end ;

(2)Subsections 164(3) to (5) of the Act are replaced by the following:

Owner and maker may appear
(3)The owner and the maker of the matter seized under subsection (1), and alleged to be obscene, child sexual abuse and exploitation material, a voyeuristic recording, an intimate image, Insertion start a false intimate image Insertion end , an advertisement of sexual services or an advertisement for conversion therapy, may appear and be represented in the proceedings to oppose the making of an order for the forfeiture of the matter.
Order of forfeiture
(4)If the court is satisfied, on a balance of probabilities, that the publication, representation, written material or recording referred to in subsection (1) is obscene, child sexual abuse and exploitation material, a voyeuristic recording, an intimate image, Insertion start a false intimate image Insertion end , an advertisement of sexual services or an advertisement for conversion therapy, it may make an order declaring the matter forfeited to His Majesty in right of the province in which the proceedings take place, for disposal as the Attorney General may direct.
Disposal of matter
(5) If the court is not satisfied that the publication, representation, written material or recording referred to in subsection (1) is obscene, child sexual abuse and exploitation material, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, it shall order that the matter be restored to the person from whom it was seized without delay after the time for final appeal has expired.

(3) Subsection 164(8) of the Act is amended by adding the following in alphabetical order:

false intimate image has the same meaning as in subsection 162.1(2.1); (fausse image intime)
13 (1) The portion of subsection 164.1(1) of the Act before paragraph (a) is replaced by the following:
Warrant of seizure
164.1 (1) If a judge is satisfied by information on oath that there are reasonable grounds to believe that there is material — namely, child sexual abuse and exploitation material as defined in section 163.1, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, or computer data as defined in subsection 342.1(2) that makes child sexual abuse and exploitation material, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy available — that is stored on and made available through a computer system as defined in subsection 342.1(2) that is within the jurisdiction of the court, the judge may order the custodian of the computer system to

(2) Subsection 164.1(5) of the Act is replaced by the following:

Order
(5) If the court is satisfied, on a balance of probabilities, that the material is child sexual abuse and exploitation material as defined in section 163.1, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, or computer data as defined in subsection 342.1(2) that makes child sexual abuse and exploitation material, the voyeuristic recording, the intimate image, the false intimate image, the advertisement of sexual services or the advertisement for conversion therapy available, it may order the custodian of the computer system to delete the material.

14 Subparagraph (a)(xxvii.2) of the definition offence in section 183 of the Act is replaced by the following:

  • (xxvii.2) subsection 162.1(1) (intimate image),

  • (xxvii.3) subsection 162.1(1.1) (false intimate image),
15 (1) Subsection 264(2) of the Act is amended by adding the following after paragraph (b):

  • (b.1) repeatedly communicating with, either directly or indirectly, the other person or anyone known to them through the Internet, a social media service or other digital network;
(2) Section 264 of the Act is amended by adding the following after subsection (2):
Definition of social media service
(2.1) In paragraph (2)(b.1), social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.

(3) Subsection 264(4) of the Act is replaced by the following:

Factors to be considered
(4) If a person is convicted of an offence under this section, the court imposing the sentence on the person shall consider as an aggravating factor that, at the time the offence was committed,
  • (a) the person contravened the terms or conditions of an order made under section 161 or a recognizance entered into under section 810, 810.1 or 810.2;

  • (b) the person contravened the terms or conditions of any other order or recognizance, or of an undertaking, made or entered into under the common law, this Act or any other Act of Parliament or of a provincial legislature that is similar in effect to an order or recognizance referred to in paragraph (a); or

  • (c) in the case of conduct referred to in paragraph (2)(b.1), the person communicated anonymously or using a false identity.

16 Subparagraph (a)(x) of the definition primary offence in section 490.011 of the Act is replaced by the following:

  • (x) subsection 162.1(1) (publication, etc., of an intimate image without consent),

  • (x.1) subsection 162.1(1.1) (publication, etc., of a false intimate image without consent),

17 Paragraph 738(1)(e) of the Act is replaced by the following:

  • (e) in the case of an offence under subsection 162.1(1) or (1.1), by paying to a person who, as a result of the offence, incurs expenses to remove the intimate image or the false intimate image, as the case may be, from the Internet or other digital network, an amount that is not more than the amount of those expenses, to the extent that they are reasonable, if the amount is readily ascertainable.

18 (1) Subsection 810(1) of the Act is amended by striking out “or” at the end of paragraph (a), by adding “or” at the end of paragraph (b) and by adding the following after paragraph (b):

  • (c) will continue to engage in conduct referred to in paragraph 264(2)(b.1).
(2) Section 810 of the Act is amended by adding the following after subsection (2):
Production order — identification of party
(2.1) If the information is in respect of the commission of an offence referred to in paragraph (1)(c) and the identity of the person against whom it is to be laid is unknown because that person communicated anonymously or using a false identity, the justice or summary conviction court may make an order under any of sections 487.015 to 487.017 for the purpose of identifying the person who transmitted the communication if, in addition to the conditions required for the issue of the order, the justice or court is satisfied that there is no other way by which any information that would reveal their identity can reasonably be obtained.
(3) Section 810 of the Act is amended by adding the following after subsection (3):
Recognizance — online criminal harassment
(3.001) However, if the information is in respect of the commission of an offence referred to in paragraph (1)(c), the justice or summary conviction court may order that a recognizance be entered into only if the conduct that is the subject of the offence was threatening or obscene and the defendant engaged in a pattern of repetitive behaviour or persistent aggressive behaviour.
Conditions
(3.002) If an order referred to in subsection (3.001) is made, the justice or summary conviction court

  • (a) must order that the recognizance include a condition prohibiting the defendant from communicating by any means — including a means referred to in paragraph 264(2)(b.1) — directly or indirectly, with the person on whose behalf the information was laid; and

  • (b) may order that the recognizance be entered into for any period — definite or indefinite — that the justice or summary conviction court considers necessary to protect the security of the person on whose behalf the information was laid, taking into consideration whether the defendant

    • (i) communicated with the person anonymously or using a false identity, or

    • (ii) created more than one social media service account to prevent their communications with the person from being blocked.
Definition of social media service
(3.003) In subparagraph (3.002)(b)(ii), social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.

Coming into Force

Coming into force of Part

19 This Part comes into force on the day on which it receives royal assent unless, on that day, An Act to amend the Criminal Code and to make consequential amendments to other Acts (child sexual abuse and exploitation material), chapter 23 of the Statutes of Canada, 2024, is not in force, in which case this Part comes into force on the day on which that Act comes into force.

Published under authority of the Speaker of the House of Commons
