S722: Children's Online Safety Act/Funds

Session: 2025 - 2026

Senate
Passed 1st Reading
Rules Committee


AN ACT enacting safeguards to protect children online, establishing the online safety division at the department of justice and the cyberbullying unit at the state bureau of investigation, creating the online child safety commission, and appropriating funds for those purposes.



The General Assembly of North Carolina enacts:



SECTION 1.(a) Chapter 114 of the General Statutes is amended by adding a new Article to read:



Article 11.



Online Safety Division.



§ 114‑75.  Division established; duties.



(a)        There is hereby established in the Office of the Attorney General of North Carolina the Online Safety Division (Division). The attorneys assigned to this Division shall function as an independent regulatory office within the Department of Justice focusing on online safety, with an emphasis on the protection of children online. The Division shall have the following duties:



(1)        Investigation of complaints made under Chapter 114B of the General Statutes (Children's Online Safety Act).



(2)        Education of law enforcement agencies and the general public about the online safety of all North Carolinians, with an emphasis on targeting harmful activities and dark patterns.



(3)        Issuance of online safety standards and guidelines, and review of relevant industry codes pertaining to internet safety, age‑appropriate design, and compliance with the Children's Online Safety Act.



(4)        Facilitation of advisory panels on internet safety, with membership including child development experts, technology specialists, parent representatives, community stakeholders, and industry representatives.



SECTION 1.(b)  Effective July 1, 2025, there is appropriated from the General Fund to the Department of Justice the sum of five million dollars ($5,000,000) for the 2025‑2026 fiscal year and the sum of four million five hundred thousand dollars ($4,500,000) for the 2026‑2027 fiscal year to establish the Online Safety Division as enacted by this section.



SECTION 2.(a)  The General Statutes are amended by adding a new Chapter to read:



Chapter 114B.



Children's Online Safety Act.



§ 114B‑1.  Definitions.



The following definitions apply in this Chapter:



(1)        Age‑appropriate design. – Design focused on developmental needs of children.



(2)        Child or minor. – An individual under eighteen (18) years of age.



(3)        Covered platform. – An internet platform providing online services that has more than five million (5,000,000) users in North Carolina and annual revenues exceeding twenty‑five million dollars ($25,000,000).



(4)        Dark patterns. – Manipulative design elements in online environments.



(5)        Division. – The Online Safety Division of the North Carolina Department of Justice.



(6)        Harmful content. – Content that includes cyberbullying, self‑harm promotion, sexually explicit material, violent content, discrimination, and other physical or psychological harms.



(7)        Online service. – Social media, gaming platforms, messaging services, content‑sharing platforms, and other user‑interactive services provided by a covered platform.



§ 114B‑2.  Duty of care established; mandated protections for children; requirements for covered platforms.



(a)        It is the public policy of this State that its children are owed a duty of care with regard to their online activities in order to limit foreseeable harm and their exposure to dark patterns and harmful content on covered platforms. Further, the General Assembly endorses age‑appropriate design and strong parental controls as central to protecting children in online service environments.



(b)        Covered platforms shall require parental notification for accounts created by children and must offer robust, easy‑to‑use parental supervision tools such as filtering options, contact management, time limits and scheduling, purchase controls, and activity reporting.



(c)        It is unlawful for covered platforms to use dark patterns or deploy features known to be addictive or manipulative.



(d)       Covered platforms shall establish clear definitions of what constitutes cyberbullying and include provisions for direct reporting to the Division, as well as provide intervention and support services for affected children, including mediation options where appropriate.



(e)        Each covered platform shall submit to the Online Safety Division an annual Child Impact Assessment for new and existing services on the platform. The assessment shall include documentation of potential risks to children and assessment of addiction and compulsive usage risks. The documentation supporting each annual assessment shall be retained for at least a three‑year period.



(f)        Covered platforms shall utilize the highest privacy settings by default for all users reasonably likely to be children and shall establish strict data minimization principles to include the following:



(1)        Limiting collection of personal data to what is necessary for the service.



(2)        Requiring deletion when no longer needed.



(3)        Prohibiting data use for commercial purposes unless strictly necessary.



(4)        Including Right to be Forgotten provisions empowering minors to request deletion of their data and content.



(5)        Prohibiting profiling and behavioral advertising targeting children.



(6)        Requiring child‑friendly privacy information and controls.



(7)        Mandating transparency about personal data use.



(8)        Establishing clear restrictions on geolocation data collection and use.



(9)        Establishing data broker restrictions for children's information.



§ 114B‑3.  Enforcement; construction; severability.



(a)        The Attorney General shall bring civil actions to enforce this Chapter. In any suit instituted by the Attorney General, in which the defendant is found to have violated this Chapter and the acts or practices which constituted the violation were, when committed, knowingly violative of a statute, the court may, in its discretion, impose a civil penalty against the defendant of up to five hundred thousand dollars ($500,000) for each violation. Civil penalties may be imposed in a new action or by motion in an earlier action, whether or not such earlier action has been concluded. In determining the amount of the civil penalty, the court shall consider all relevant circumstances, including, but not limited to, the extent of the harm caused by the conduct constituting a violation, the nature and persistence of such conduct, the length of time over which the conduct occurred, the assets, liabilities, and net worth of the defendant, and any corrective action taken by the defendant. The clear proceeds of penalties so assessed shall be remitted to the Civil Penalty and Forfeiture Fund in accordance with G.S. 115C‑457.2.



(b)        This Chapter shall be liberally construed for the protection of minors and the general public. Nothing in this Chapter shall be construed to infringe on any rights protected by the North Carolina or U.S. Constitutions.



(c)        If any provision of this Chapter is held by a court of competent jurisdiction to be invalid, void, or unenforceable, in whole or in part, the decision shall not affect the validity, enforceability, or applicability of the remaining provisions, which shall remain in full force and effect as if the provision held invalid, void, or unenforceable had not been included.



SECTION 2.(b)  Except as otherwise provided, this section becomes effective December 1, 2025, and applies to acts or omissions occurring on or after that date.



SECTION 3.(a)  Effective July 1, 2025, Article 13A of Chapter 143B of the General Statutes is amended by adding a new section to read:



§ 143B‑1209.  Cyberbullying Unit established.



(a)        There is established in the State Bureau of Investigation the Cyberbullying Unit (Unit) dedicated to the protection of children online and to aid in the enforcement of Article 11 of Chapter 114 of the General Statutes.



(b)        In addition to any other duties assigned by law, the Unit shall operate a toll‑free number and website on online child safety and cyberbullying jointly with the Department of Justice.



SECTION 3.(b)  Effective July 1, 2025, there is appropriated from the General Fund to the State Bureau of Investigation (SBI) the sum of two million dollars ($2,000,000) for the 2025‑2026 fiscal year and the sum of one million dollars ($1,000,000) for the 2026‑2027 fiscal year to create the Cyberbullying Unit at the SBI.



SECTION 4.(a)  There is hereby established the North Carolina Online Child Safety Commission as an independent regulatory body with oversight authority on matters relating to the safety and wellbeing of minors in digital environments. The Commission's primary mission shall be to protect North Carolina's children from online harms through research, education, regulation, enforcement, and ongoing adaptation to the evolving digital landscape. The Commission shall serve as the state's foremost authority on online child safety, advancing the digital wellbeing of minors while respecting fundamental rights. The Commission shall be guided by the best interests of the child, the recognition that digital safety is fundamental to childhood development, the understanding that technology must adapt to children's needs rather than the reverse, and the principle that powerful digital platforms must be held accountable for the safety of young users. The Commission shall consist of nine (9) members with demonstrated expertise and commitment to child welfare, digital technology, mental health, education, or related fields relevant to children's online safety. Members shall be appointed as follows:



(1)        Three (3) members appointed by the Governor.



(2)        Two (2) members appointed by the President Pro Tempore of the Senate.



(3)        Two (2) members appointed by the Speaker of the House of Representatives.



(4)        One (1) member appointed by the Attorney General



(5)        One (1) member appointed by the Superintendent of Public Instruction.



The Commission membership shall include, at minimum:



(1)        One (1) member with expertise in child development and psychology.



(2)        One (1) member with expertise in digital technology and data ethics.



(3)        One (1) member representing parents or guardians.



(4)        One (1) member with expertise in cybersecurity and privacy.



(5)        One (1) member with expertise in education.



(6)        One (1) member with legal expertise in child protection.



(7)        One (1) youth advocate between the ages of 18‑25.



Commission members shall serve four‑year staggered terms, with initial appointments varying in length to ensure continuity. The Commission shall elect a Chair and Vice‑Chair from among its members, serving two‑year terms. No member shall serve more than two consecutive terms.



SECTION 4.(b)  (a) The Commission, with the assistance and input of the Department of Justice, may do all of the following:



(1)        Issue binding regulations implementing the provisions of this act.



(2)        Establish safety standards for covered platforms.



(3)        Review and approve industry codes of practice.



(4)        Initiate investigations into potential violations.



(5)        Issue orders requiring compliance with the act.



(6)        Impose penalties for violations as provided in this act.



(7)        Seek injunctive relief through the courts when necessary to prevent harm to children.



(b)        The Commission shall provide expert guidance to:



(1)        The Governor and General Assembly on matters relating to online child safety.



(2)        State agencies on implementation of digital safety programs.



(3)        Educational institutions on digital literacy and safety curricula.



(4)        Parents and caregivers on tools and strategies to protect children online.



(5)        Technology platforms on best practices for age‑appropriate design.



(c)        Research and data collection. – The Commission shall:



(1)        Conduct or commission research on emerging online risks to children.



(2)        Collect and analyze data on the prevalence and impact of online harms.



(3)        Monitor global developments in online safety regulation.



(4)        Evaluate the effectiveness of interventions and regulations.



(5)        Maintain a database of safety incidents, platform responses, and outcomes.



(d)       Educational Initiatives. – The Commission shall develop and oversee:



(1)        Public awareness campaigns on online child safety.



(2)        Resource development for schools, parents, and children.



(3)        Training programs for educators and other professionals.



(4)        Digital literacy standards for K‑12 education.



(e)        Complaint handling. – The Commission shall:



(1)        Establish accessible mechanisms for receiving complaints about online harms to children.



(2)        Create specialized response protocols for different categories of harm.



(3)        Oversee platform compliance with complaint response requirements.



(4)        Intervene directly in serious cases where platforms fail to act appropriately.



(5)        Provide support resources for affected children and families.



(f)        Coordination and collaboration. – The Commission shall:



(1)        Coordinate with relevant state agencies including the Department of Justice, Department of Health and Human Services, and Department of Public Instruction.



(2)        Collaborate with federal authorities including the Federal Trade Commission and Department of Justice.



(3)        Engage with international counterparts to develop consistent approaches to digital safety.



(4)        Partner with academic institutions, non‑profit organizations, and other stakeholders dedicated to child safety.



SECTION 4.(c)  Annual duties and reporting requirements. –



(a)        State of Children's Online Safety Assessment. – The Commission shall prepare and publish an annual State of Children's Online Safety in North Carolina report that:



(1)        Assesses current online risks facing North Carolina's children.



(2)        Analyzes trends in online harms and platform responses.



(3)        Evaluates platform compliance with this act and related regulations.



(4)        Identifies emerging threats and technologies of concern.



(5)        Measures progress in addressing previously identified issues.



(6)        Provides data‑driven insights on the digital experiences of children by age group.



(b)        Platform Compliance Review. – The Commission shall conduct annual compliance reviews of covered platforms that:



(1)        Evaluate implementation of required safety measures.



(2)        Assess the effectiveness of age verification systems.



(3)        Review Child Impact Assessments and safety documentation.



(4)        Analyze complaint handling and response times.



(5)        Examine algorithmic systems for compliance with safety standards.



(6)        Identify best practices and areas requiring improvement.



(7)        Result in public compliance ratings for each major platform.



(c)        Legislative Recommendations. – The Commission shall submit annual recommendations to the General Assembly that:



(1)        Identify necessary amendments to this act based on technological developments.



(2)        Propose new legislative measures to address emerging concerns.



(3)        Suggest improvements to enforcement mechanisms.



(4)        Recommend funding priorities for child online safety initiatives.



(5)        Identify areas where federal action or coordination is needed.



(d)       Public Hearings and Testimony. – The Commission shall:



(1)        Hold at least four public hearings annually across different regions of the State.



(2)        Receive testimony from children, parents, educators, platforms, and experts.



(3)        Conduct specialized hearings on emerging issues of concern.



(4)        Ensure diverse perspectives are represented in deliberations.



(5)        Make hearing records publicly available.



(e)        Industry Engagement. – The Commission shall:



(1)        Convene an annual Industry Safety Summit with platform representatives.



(2)        Facilitate regular working groups on specific safety challenges.



(3)        Review and approve updates to industry codes of practice.



(4)        Evaluate voluntary safety initiatives.



(5)        Promote adoption of safety innovations across the industry.



(f)        Transparency Reporting. – The Commission shall publish annual transparency reports detailing:



(1)        Enforcement actions taken and their outcomes.



(2)        Complaints received and resolved.



(3)        Penalties assessed and collected.



(4)        Allocation and impact of Children's Online Safety Fund expenditures.



(5)        Commission activities, investigations, and initiatives.



(6)        Metrics for measuring the effectiveness of the Act's implementation.



(g)        Audit of Educational Programs. – The Commission shall annually audit and evaluate:



(1)        Digital literacy programs in North Carolina schools.



(2)        Parent education initiatives.



(3)        Professional development for educators and youth workers.



(4)        Public awareness campaign effectiveness.



(5)        Resource allocation and accessibility across communities.



(h)        Research Agenda Development. – The Commission shall:



(1)        Establish annual research priorities based on identified gaps.



(2)        Commission or conduct studies on priority areas.



(3)        Award research grants from the Children's Online Safety Fund.



(4)        Publish findings and recommendations based on research.



(5)        Ensure research informs regulatory and educational approaches.



SECTION 4.(d)  Operational framework. –



(a)        Staffing and Structure. – The Commission shall:



(1)        Be supported by a professional staff led by an Executive Director.



(2)        Maintain specialized divisions for enforcement, education, research, and policy.



(3)        Employ technologists, child development experts, data analysts, and legal specialists.



(4)        Establish advisory committees on specific issue areas as needed.



(b)        Funding Mechanisms. – The Commission shall:



(1)        Receive an annual appropriation from the General Assembly.



(2)        Administer the Children's Online Safety Fund.



(3)        Allocate penalties collected to prevention, education, and enforcement activities.



(4)        Report annually on budget allocation and performance metrics.



(c)        Technological Capabilities. – The Commission shall:



(1)        Maintain technical testing facilities to evaluate platform compliance.



(2)        Develop data analysis capabilities to identify patterns of harm.



(3)        Employ experts capable of evaluating platform algorithms and safety systems.



(4)        Keep pace with emerging technologies that may pose risks to children.



(d)       Ethical Frameworks. – The Commission shall:



(1)        Establish clear ethical guidelines for its work.



(2)        Ensure privacy protection in research and investigations.



(3)        Develop age‑appropriate methods for involving children in policy development.



(4)        Balance safety imperatives with other rights and considerations.



(e)        Accountability Mechanisms. – The Commission shall:



(1)        Be subject to oversight by the General Assembly.



(2)        Maintain transparent decision‑making processes.



(3)        Publish the basis for regulatory determinations.



(4)        Establish clear metrics for measuring its own effectiveness.



(5)        Undergo periodic independent evaluation.



SECTION 4.(e)  Implementation timeline. –



(a)        Within 90 days of this act's effective date, initial appointments to the Commission shall be made and the Commission shall convene its first meeting.



(b)        Initial Responsibilities. – Within the first year of operation, the Commission shall:



(1)        Hire key staff and establish organizational structure.



(2)        Develop initial regulations implementing this act.



(3)        Create complaint handling systems.



(4)        Establish public education programs.



(5)        Develop platform compliance guidelines.



(6)        Submit its first annual report to the General Assembly.



(c)        Phased Implementation. – The Commission shall develop a phased implementation plan that:



(1)        Prioritizes addressing the most serious harms.



(2)        Accommodates different compliance timelines based on platform size.



(3)        Allows for industry adjustment to new requirements.



(4)        Includes benchmarks for measuring progress.



(d)       Review and Adaptation. – Every three years, the Commission shall:



(1)        Conduct a comprehensive review of its activities and impact.



(2)        Assess changing technological landscape and emerging challenges.



(3)        Revise strategic priorities and approaches as needed.



(4)        Recommend statutory amendments to maintain effectiveness.



SECTION 5.  Except as otherwise provided, this act is effective when it becomes law.