H860: Social Media Control in IT Act. Latest Version

Session: 2025 - 2026

House
Passed 1st Reading


AN ACT TO COMBAT SOCIAL MEDIA ADDICTION BY REQUIRING THAT SOCIAL MEDIA PLATFORMS RESPECT THE PRIVACY OF NORTH CAROLINA USERS' DATA AND NOT USE A NORTH CAROLINA MINOR'S DATA FOR ADVERTISING OR ALGORITHMIC RECOMMENDATIONS, and appropriating funds for that purpose, AND TO MAKE WILLFUL VIOLATIONS OF USER DATA PRIVACY AN UNFAIR PRACTICE UNDER Chapter 75 of the General Statutes.



The General Assembly of North Carolina enacts:



SECTION 1.  Chapter 75 of the General Statutes is amended by adding a new Article to read:



Article 2B.



Social Media Control in Information Technology.



§ 75‑70.  Title; definitions.



(a)        This Article shall be known and may be cited as the Social Media Control in Information Technology Act.



(b)        Definitions. – The following definitions apply in this Article:



(1)        Accessible mechanism. – A user‑friendly, clear, easy‑to‑use, readily available, and technologically feasible method that allows individuals to exercise their data privacy rights without undue burden. The mechanism must be designed to accommodate diverse user needs, including those of users with disabilities, and must be available across commonly used platforms. The mechanism must provide clear instructions, function without excessive complexity, and be free of unreasonable barriers such as lengthy procedures, hidden settings, or excessive delays.



(2)        Algorithmic recommendation system. – A computational process that uses machine learning, natural language processing, artificial intelligence techniques, generative artificial intelligence, or other computational processing techniques to make a decision, or to facilitate human decision making, with respect to user‑related data in order to rank, order, promote, recommend, suggest, amplify, or similarly determine the delivery or display of information to an individual.
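
Illustrative sketch (not part of the Article's text): a minimal Python example of a process that falls within this definition, assuming a hypothetical platform that ranks content by predicted engagement; all names (rank_feed, chronological_feed, and the field names) are hypothetical, and the contrast with a reverse‑chronological feed anticipates the exception in G.S. 75‑71(c)(1).

    # Illustrative sketch only; all names are hypothetical.
    from typing import Dict, List

    def rank_feed(items: List[Dict], user_data: Dict) -> List[Dict]:
        """An "algorithmic recommendation system" in the sense of this
        definition: user-related data determines the display order."""
        interests = set(user_data.get("inferred_interests", []))

        def score(item: Dict) -> float:
            # Overlap between the user's inferred interests and the
            # item's topics drives the ranking.
            return (len(interests & set(item.get("topics", [])))
                    + item.get("base_popularity", 0.0))

        return sorted(items, key=score, reverse=True)

    def chronological_feed(items: List[Dict]) -> List[Dict]:
        # A pure reverse-chronological feed uses no user-related data
        # and is carved out by G.S. 75-71(c)(1).
        return sorted(items, key=lambda i: i["posted_at"], reverse=True)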



(3)        Collects, collected, or collection. – Buying, renting, gathering, obtaining, receiving, or accessing any personal information pertaining to a user by any means. This includes receiving information from the user, either actively or passively, or by observing the user's behavior.



(4)        Consent. – Any freely given, specific, informed, and unambiguous indication of a user's wishes by which the user, the user's legal guardian, a person who has power of attorney for the user, or a person acting as a conservator for the user signifies, by a statement or by a clear affirmative action, agreement to the processing of personal information relating to the user for a narrowly defined particular purpose. None of the following constitutes consent:



a.         Acceptance of general or broad terms of use, or of a similar document, that contains descriptions of personal information processing along with other, unrelated information.



b.         Hovering over, muting, pausing, or closing a given piece of content.



c.         Agreement obtained through use of dark patterns.



(5)        Default settings. – The predetermined options, values, and configurations that a program is set to when it is installed and first accessed.



(6)        Minor. – An individual who is under 18 years of age.



(7)        Operator. – As defined in section 1302 of the Children's Online Privacy Protection Act of 1998, 15 U.S.C. § 6501.



(8)        Opt‑in mechanism. – An accessible mechanism, separate from any other notifications, disclosures, or consents, such as a privacy policy or terms of service, that allows the user to consent to the platform engaging in a specific, narrow, and well‑defined practice. The Division of Health Service Regulation has the authority to specify requirements for the notification and consent process, including specific language and disclosures that may include a warning on the harmful effects of manipulative algorithms, the length of time for which the notification must appear before the user has the option to consent, and the process that the user must follow to consent.
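
Illustrative sketch (not part of the Article's text): one way a platform might record opt‑in consent separately from any terms‑of‑service acceptance, as this definition requires; the OptInRecord type and every field name are hypothetical.

    # Illustrative sketch only; all names are hypothetical.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import List

    @dataclass(frozen=True)
    class OptInRecord:
        user_id: str
        purpose: str            # one specific, narrow, well-defined practice
        disclosure_shown: str   # text of the standalone notification
        consented_at: datetime  # when the clear affirmative action occurred

    def has_opted_in(records: List[OptInRecord], user_id: str,
                     purpose: str) -> bool:
        # Consent is valid only for the exact purpose for which it was
        # given; accepting general terms of use does not count (see (4)a.).
        return any(r.user_id == user_id and r.purpose == purpose
                   for r in records)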



(9)        Personal information. – Information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular user or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular user or household:



a.         Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, Internet Protocol address, email address, account name, Social Security number, driver's license number, passport number, or other similar identifiers.



b.         Commercial information, including, but not limited to, records of personal property, products, or services purchased, obtained, or considered, or other purchasing or consumer histories or tendencies.



c.         Biometric information, meaning any information relating to an individual's physiological, biological, or behavioral characteristics, including, but not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, gait, or vein patterns, and voice recordings.



d.         Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a user's interaction with an internet website, application, or advertisement.



e.         Usage data.



f.          Third‑party data.



g.         Geolocation data.



h.         Audio, electronic, visual, thermal, olfactory, or similar information.



i.          Professional or employment‑related information.



j.          Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. § 1232g; 34 C.F.R. Part 99).



k.         Financial information from a user, including, but not limited to, a user's account log‑in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account.



l.          The contents of a user's mail, email, and text messages unless the platform is the intended recipient of the communication.



m.        A user's racial or ethnic origin, citizenship or immigration status, religious or philosophical beliefs, or union membership.



n.         Information related to a user's health, sex life, or sexual orientation.



o.         Inferences drawn from any of the information identified in this subdivision reflecting the user's preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.



(10)      Platform user. – An individual who resides in North Carolina and uses a social media platform.



(11)      Social media platform, covered platform, or platform. – An electronic medium with more than 1,000,000 monthly active users in the United States that functions as a social media service. The term does not include any of the following:



a.         An entity acting in its capacity as a provider of a common carrier service subject to the Communications Act of 1934 (47 U.S.C. § 151 et seq.) as amended and supplemented.



b.         A broadband internet access service under section 8.1(b) of Title 47 of the Code of Federal Regulations.



c.         An electronic mail service.



d.         Internet search engines specifically designed to lead a user to a result that the user expressly searched for.



e.         Internet service providers.



f.          A wireless messaging service provided through the short messaging service or multimedia messaging service protocols.



g.         Video game services specifically designed to serve solely as a platform for playing video games.



h.         Online shopping or e‑commerce services specifically designed for that sole purpose.



i.          Video‑streaming services that solely provide non‑user generated content.



(12)      Third‑party data. – Personal data obtained from another person, company, data broker, or platform that is neither the user to whom the data pertains nor the platform itself. The term does not include persons, companies, data brokers, or platforms that collect personal data from another entity if that entity shares common branding with the platform, controls the platform, is controlled by the platform, or is under common control with the platform by another legal entity.



(13)      Usage data. – Any information that is gathered about a user's interactions, behaviors, preferences, and usage patterns on a platform, including, but not limited to, information related to pages visited, clicks, scrolls, navigation patterns, search queries, button presses, feature usage, frequency of logins, session duration, items added to or removed from a shopping cart, purchasing history, subscription usage, content watched, content read, content listened to, or time spent using or engaging with any feature or piece of content on the platform. The term includes all inferences about a user derivable from this usage data, including user engagement statistics, content metrics, feature usage statistics, user flow data, retention rates, and churn rates.
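
Illustrative sketch (not part of the Article's text): a minimal representation of a single event of the kind this definition classifies as usage data; every field name is hypothetical.

    # Illustrative sketch only; all field names are hypothetical.
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class UsageEvent:
        user_id: str
        event_type: str     # e.g. "click", "scroll", "search", "watch"
        target: str         # page, button, query, or content identifier
        occurred_at: datetime
        session_id: str
        metadata: dict = field(default_factory=dict)  # e.g. watch duration

    # Aggregates and inferences computed from streams of such events
    # (engagement statistics, retention rates, churn rates) are usage
    # data under this subdivision as well.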



§ 75‑71.  User data privacy; targeting minors prohibited; registry.



(a)        Privacy Requirements. – The General Assembly finds that unhealthy social media use has been linked to depression, anxiety, eating disorders, and suicidal ideation, especially among young people. Exploitation of user data can result in users being targeted in ways that increase unhealthy social media use. It is the policy of this State that user data shall be respected by platforms. Special protections are warranted for users who are minors. Therefore, the operator of a social media platform shall comply with all of the following requirements for platform users:



(1)        The platform must specifically and clearly inform users in the following ways:



a.         A disclosure in a clear, easy‑to‑read, and accessible format when a user first initializes their use of a platform, or after a period of inactivity of six months or longer, about how the platform collects personal information, what personal information the platform collects, how the personal information is used by the platform for every use case, and how the user can exercise their rights and choices on the platform. This disclosure must be provided in no more than 500 words, and the platform must obtain a user's consent before the platform collects any user‑related data on the user.



b.         A disclosure in a clear, easy‑to‑read, and accessible format that details (i) the categories of information the platform has collected about the user, (ii) the categories of sources from which the information is collected, (iii) the business or commercial purpose for collecting, selling, or sharing personal information, (iv) the categories of third parties to whom the platform discloses personal information, and (v) the specific pieces of personal information it has collected about that user. This information must be made available upon receipt of a verifiable user request made through an accessible mechanism on the platform.



(2)        Personal information may be used in algorithmic recommendations only when both of the following requirements are met:



a.         The platform reasonably determines the user is not a minor from personal information collected by and available to the covered platform in its ordinary course of business.



b.         The user has been notified and has expressly consented, through an opt‑in mechanism, to the use of their own data in this manner.



(3)        Through an accessible mechanism, users must be given the capacity to alter, change, and delete which categories of personal information are used in a platform's algorithmic recommendation system or systems. This selection shall be modifiable at any time. If a user indicates that a certain category of personal information is not to be used in an algorithmic recommendation system, then the platform must not include that category or those categories within an algorithmic recommendation system. A covered platform shall not discriminate against a user in the provision of functionality or features of the covered platform because the user exercised any of the rights under this Article, unless the use of user‑related data in an algorithmic recommendation system is reasonably necessary to the feature or functionality.
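
Illustrative sketch (not part of the Article's text): a minimal way a platform might honor category‑level exclusions under subdivision (3), assuming personal information is grouped by category before it reaches any recommendation system; all names are hypothetical.

    # Illustrative sketch only; all names are hypothetical.
    from typing import Dict, List, Set

    def recommender_input(profile: Dict[str, List],
                          excluded: Set[str]) -> Dict[str, List]:
        """Drop every category the user has excluded before the profile
        is passed to an algorithmic recommendation system."""
        return {category: values
                for category, values in profile.items()
                if category not in excluded}

    # Example: a user excludes geolocation and browsing history.
    profile = {"geolocation": ["35.78,-78.64"],
               "browsing_history": ["/video/123"],
               "declared_interests": ["cycling"]}
    allowed = recommender_input(profile, {"geolocation", "browsing_history"})
    # allowed == {"declared_interests": ["cycling"]}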



(b)        Targeting Minors Prohibited. – A covered platform must establish comprehensive and effective controls to ensure that a minor's personal information is not used in any algorithmic recommendation system.



(c)        Exceptions. – Subsection (b) of this section does not apply to any of the following:



(1)        Recommending or presenting content from accounts that a user follows in reverse chronological order or a similar method of recommending or presenting content.



(2)        A user's explicit search for content or request for information for the sole purpose of providing immediate results to the search, and without retention or use of the user‑related data from the search or request for purposes other than providing results to the search or request.



(3)        A covered platform's action, voluntarily taken in good faith, to restrict access to or availability of material as described in section 230(c)(2)(A) of the Communications Act of 1934 (47 U.S.C. § 230(c)(2)(A)). Nothing in this section otherwise limits or affects the provisions of section 230 of the Communications Act of 1934, except as otherwise provided in this Article.



(d)       The operator of a social media platform may be held liable for violating subsection (a) of this section if the user was given algorithmic content recommendations without a proper opt‑in mechanism or without affirmation from the user through the opt‑in process. The operator of a social media platform may be held liable for violating subsection (b) of this section if the operator of the social media platform knew or had reason to know that the user was a minor. The operator of a social media platform that has made an estimation of a user's age based upon the user's self‑attestation is not liable if the user was a minor who falsely attested to not being a minor.



§ 75‑72.  Design features and digital rights of users.



(a)        Protective Default Settings for Minors. – A covered platform shall configure all privacy settings provided to any user by the online service, product, or feature to be both available to minors and, by default, set to the preferences that offer the highest level of privacy, unless the platform can demonstrate a compelling reason that a different setting is in the best interest of minors. These settings must include all of the following:



(1)        Notifications must be turned off by default.



(2)        The visibility of reaction or interaction counts on all content, including content generated by a minor and content generated by others and seen by a minor, must be turned off by default.



(3)        The ability of other users, not added by the user to a list of approved contacts, to communicate with the minor must be turned off by default.



(4)        The ability of other users, whether registered or not, and not added by the user to a list of approved contacts, to view the minor's user‑related data collected by or shared on the platform must be disabled by default.



(5)        The ability of other users to see the geolocation of a minor must be disabled by default.



(6)        Features that increase, sustain, or extend the use of the covered platform by a minor, such as automatic playing of media and rewards for time spent on the platform, must be disabled by default.
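
Illustrative sketch (not part of the Article's text): the six protective defaults above, encoded as a hypothetical configuration applied when a minor's account is created; every key is hypothetical, and any deviation would require the "best interest of minors" showing this subsection demands.

    # Illustrative sketch only; all keys are hypothetical.
    MINOR_DEFAULTS = {
        "notifications_enabled": False,             # subdivision (1)
        "show_reaction_counts": False,              # subdivision (2)
        "messages_from_non_contacts": False,        # subdivision (3)
        "data_visible_to_non_contacts": False,      # subdivision (4)
        "share_geolocation": False,                 # subdivision (5)
        "autoplay_and_time_rewards": False,         # subdivision (6)
    }

    def new_minor_account_settings() -> dict:
        # Start every minor's account at the highest level of privacy.
        return dict(MINOR_DEFAULTS)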



(b)        Rights to Change and Delete Data. – A covered platform shall provide users with both of the following:



(1)        An accessible mechanism to request the correction of any inaccurate personal information about the user, taking into account the nature of the personal information and the purposes of the processing of the personal information. A platform that receives a verifiable user request to correct inaccurate personal information shall use commercially reasonable efforts to correct the inaccurate personal information as directed by the user. A covered platform shall maintain a record of all requests.



(2)        An accessible mechanism to request the deletion of personal information about the user, taking into account the nature of the personal information and the purposes of the processing of the personal information. If the personal information is reasonably necessary for the platform to complete a transaction, to ensure the security and integrity of the user's personal information, to debug or to identify and repair errors in the platform, to exercise free speech or ensure another user's right to exercise free speech, to comply with existing federal and State regulations, to engage in public or peer‑reviewed scientific research, or to enable solely internal uses reasonably aligned with a user's expectations, then the covered platform is not required to comply with the user's request. Otherwise, the covered platform is required to complete the request. A covered platform shall maintain a confidential record of all requests.
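
Illustrative sketch (not part of the Article's text): a minimal deletion‑request handler that honors the exceptions listed in subdivision (2) and keeps the confidential record of requests the subdivision requires; all names are hypothetical.

    # Illustrative sketch only; all names are hypothetical.
    from typing import Dict, List

    # Statutory grounds on which a platform may decline to delete.
    RETENTION_EXCEPTIONS = {
        "complete_transaction", "security_and_integrity", "debugging",
        "free_speech", "legal_compliance", "scientific_research",
        "expected_internal_use",
    }

    def handle_deletion_request(record: Dict, request_log: List[Dict]) -> bool:
        """Return True if the data was deleted, False if retained under
        a statutory exception; every request is logged either way."""
        request_log.append({"user_id": record.get("user_id"),
                            "action": "deletion_request"})
        if record.get("retention_basis") in RETENTION_EXCEPTIONS:
            return False    # platform may retain, but the request is logged
        record.clear()      # otherwise the request must be completed
        return True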



(c)        Digital Rights of the User. – All of the following rights belong to every minor utilizing covered platforms:



(1)        Right to protection from manipulative design. – Every minor has the right to be protected from manipulative design techniques that exploit psychological vulnerabilities or have been shown by a preponderance of the evidence to create addiction or dependency.



(2)        Right to transparency. – Every minor has the right to understand the nature of their digital experiences. Platforms and services must provide clear and accessible explanations of platform features as well as of how covered platforms can negatively affect a minor's well‑being.



(3)        Right to protection from personalized recommendation systems. – Every minor has the right to be protected from algorithmic recommendation systems.



(d)       The operator of a covered platform may be held liable under subsection (a), (b), or (c) of this section if any of the requirements or rights established in this section is determined to have been violated.



§ 75‑73.  Investigation; enforcement; private right of action.



(a)        Violations. – Effective January 1, 2026, a platform's violation of this Article is an unfair or deceptive act or practice under G.S. 75‑1.1.



(b)        Investigations. – The Attorney General shall monitor social media platforms for compliance with this Article.



(c)        Complaints. – A platform user may make a complaint to the Attorney General alleging that a social media platform has failed to comply with the requirements of this Article. The Attorney General may bring a civil action in any case in which the Attorney General has reason to believe that the interest of the residents of this State has been or is threatened due to noncompliance with this Article.



(d)       Private Right of Action. – A minor affected by a covered platform found to be in violation of this Article may seek relief, including through mechanisms of parens patriae jurisdiction, by the following:



(1)        A civil suit brought through privately retained attorneys.



(2)        Relief. – In a civil action brought under subsection (c) of this section or this subsection in which a plaintiff prevails, the court may award the plaintiff any one or more of the following:



a.         An amount equal to the sum of any compensatory damages.



b.         Punitive damages.



c.         Injunctive relief.



d.         Declaratory relief.



e.         Reasonable attorneys' fees and litigation costs.



§ 75‑74.  North Carolina Data Privacy Task Force.



(a)        There is created the North Carolina Data Privacy Task Force (Task Force) within the Department of Justice for budgetary purposes only.



(b)        The Task Force shall be composed of 21 members. The ex officio members listed in subdivisions (1) through (6) of this subsection may designate representatives from their particular departments, divisions, or offices to represent them on the Task Force. In making appointments or designating representatives, appointing authorities and ex officio members shall use best efforts to select members or representatives with sufficient knowledge and experience to effectively contribute to the issues examined by the Task Force and, to the extent possible, to reflect the geographical, political, gender, and racial diversity of this State. The members shall be as follows:



(1)        The Attorney General.



(2)        The State Chief Information Officer.



(3)        The Secretary of the Department of Health and Human Services.



(4)        The Director of the State Bureau of Investigation.



(5)        The Director of the Maternal and Child Health Section of the Department of Health and Human Services.



(6)        The Director of the Division of Mental Health, Developmental Disabilities, and Substance Use Services.



(7)        A representative from NC Child, appointed by the Governor upon recommendation of the President of the organization.



(8)        A representative from a private group, other than NC Child, that advocates for children, appointed by the Governor upon recommendation of private child advocacy organizations.



(9)        A pediatrician, licensed to practice medicine in North Carolina, appointed by the President Pro Tempore of the Senate.



(10)      A psychiatrist, licensed to practice medicine in North Carolina, appointed by the Speaker of the House of Representatives.



(11)      Two public members, one of whom is an educator, appointed by the Speaker of the House of Representatives.



(12)      Two public members, one of whom is a social worker, appointed by the President Pro Tempore of the Senate.



(13)      Two members of the Senate, appointed by the President Pro Tempore of the Senate, and two members of the House of Representatives, appointed by the Speaker of the House of Representatives.



(14)      A representative from the North Carolina Young People's Alliance, appointed by the Governor upon recommendation of the head of the organization.



(15)      Two youth representatives under the age of 21 appointed by the Secretary of the Department of Health and Human Services after conducting an application‑based selection process.



(c)        All members of the Task Force are voting members. Vacancies in the appointed membership shall be filled by the appointing officer who made the initial appointment. Terms shall be two years. The members shall elect a chair who shall preside for the duration of the chair's term as a member. In the event a vacancy occurs in the chair before the expiration of the chair's term, the members shall elect an acting chair to serve for the remainder of the unexpired term.



(d)       Beginning March 15, 2026, and annually thereafter, the Task Force shall report to the General Assembly on its work, with a special focus on mental health issues related to social media, along with findings, recommendations, and any legislative proposals.



SECTION 2.  Effective July 1, 2025, there is appropriated from the General Fund to the Department of Justice the sum of one hundred thousand dollars ($100,000) for the 2025‑2026 fiscal year and the sum of one hundred thousand dollars ($100,000) for the 2026‑2027 fiscal year to develop the registry created in G.S. 75‑71, as enacted by this act.



SECTION 3.  Section 1 of this act becomes effective October 1, 2026. The remainder of this act becomes effective July 1, 2025.