H375: AI/Ban Deceptive Ads. Latest Version

Session: 2025 - 2026

House
Passed 1st Reading


AN ACT to regulate the use of deepfakes and deceptive advertisements in elections and protect minors and the general public from misuse of Artificial Intelligence and synthetic media.



The General Assembly of North Carolina enacts:



SECTION 1.  The General Statutes are amended by adding a new Chapter to read:



Chapter 170.



Artificial Intelligence and Synthetic Media.



Article 1.



Political Campaigns.



§ 170‑1.  Title; definitions.



(a)        This Chapter shall be known and may be cited as the "Artificial Intelligence and Synthetic Media Act."



(b)        The following definitions apply in this Chapter:



(1)        Artificial intelligence or AI. – A machine‑based system that can, for a given set of human‑defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.



(2)        Creator. – A person that uses artificial intelligence to generate synthetic media. The term does not include a person that solely provides the technology used in the creation of the synthetic media.



(3)        Deceptive and fraudulent deepfake. – Synthetic media that depicts a candidate or political party with the intent to injure the reputation of the candidate or political party or otherwise deceive a voter and that either:



a.         Appears to a reasonable person to depict a real individual saying or doing something that did not actually occur; or



b.         Provides to a reasonable person a fundamentally different understanding or impression of the appearance, action, or speech in an image, audio recording, or video recording than a reasonable person would have from an unaltered, original version of the image, audio recording, or video recording.



(4)        Deepfake. – A video, audio, or any other media of a person in which the person's face, body, or voice has been digitally altered so that the person appears to be someone else, the person appears to be saying something that the person has never said, or the person appears to be doing something that the person has never done.



(5)        Digital content provenance. – Purely factual information that details a digital resource's creator, origin, context, history, and editing process and that conforms to an open industry technical standard.



(6)        Digital impersonation. – Synthetic media, typically video or audio, that:



a.         Has been digitally manipulated to convincingly replace one person's likeness or voice with that of another using deep generative methods and artificial intelligence techniques, or for which one person's likeness or voice has otherwise been simulated using deep generative methods and artificial intelligence techniques;



b.         Was created with the intention to deceive or lead reasonable listeners or viewers into believing that the content is authentic;



c.         Reasonable viewers or listeners would believe actually represents the person's voice or likeness;



d.         Would cause reasonable viewers or listeners to conclude that the recording or image is a true and accurate depiction of something the person said or did;



e.         Is not commentary, parody, satire, criticism, or artistic expression; and



f.          Was not created by the person or with the person's consent.



(7)        Digitization. – Creating or altering an image of a person in a realistic manner utilizing images of another person or computer‑generated images, regardless of whether the creation or alteration is accomplished manually or through an automated process. The term includes, but is not limited to, creation or alteration of an image with the use of artificial intelligence.



(8)        Fabricated intimate image. – Any photograph, motion picture film, videotape, digital image, or any other recording or transmission of another person who is identifiable from the image itself or from information displayed with or otherwise connected to the image, and that was created or altered by digitization to depict:



a.         Computer‑generated intimate body parts or the intimate body parts of another person as the intimate body parts of the depicted person, whether nude or visible through less than opaque clothing and including the genitals, pubic area, anus, or postpubescent female nipple; or



b.         The depicted person engaging in sexual conduct in which the depicted person did not actually engage.



(9)        Generative artificial intelligence or Gen AI. – Artificial intelligence that:



a.         Is trained on data;



b.         Interacts with a person using text, audio, or visual communication; and



c.         Generates non‑scripted outputs similar to outputs created by a human, with limited or no human oversight.



(10)      Generated child pornography. – Any image that has been created, altered, adapted, or modified by electronic, mechanical, or other computer‑generated means to portray a fictitious person, who a reasonable person would regard as being a minor, engaged in sexual conduct.



(11)      Information content provider. – A person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.



(12)      Materially deceptive media. – Synthetic audio or visual media that:



a.         Exhibits a high level of authenticity or convincing appearance that is visually or audibly indistinguishable from reality to a reasonable person;



b.         Depicts a scenario that did not actually occur, or depicts events that have been altered in a significant way from how they actually occurred such that the alteration significantly changes how a reasonable person would understand the original content;



c.         Is likely or meant to harm reputation or mislead voters; and



d.         Is created by generative artificial intelligence or with software, machine learning, or any other computer‑generated or technological means, including adapting, modifying, manipulating, or altering a realistic depiction.



(13)      Regulated occupation. – Any occupation that is subject to licensing or certification by a state occupational licensing board or commission.



(14)      Sexual conduct. – As defined in G.S. 14‑190.5A. The term includes sexual activity as defined by G.S. 14‑190.13.



(15)      Sponsor. – A person that pays for the content that uses artificial intelligence to generate synthetic media.



(16)      Synthetic audio media. – Audio content that was substantially produced by generative artificial intelligence.



(17)      Synthetic media. – An image, audio recording, or video recording of an individual's appearance, speech, or conduct that has been created or intentionally manipulated with the use of digital technology in a manner to create a realistic but false image, audio, or video.



(18)      Synthetic visual media. – An image or video that was substantially produced by generative artificial intelligence.



§ 170‑2.  Use of synthetic media in political campaigns.



(a)        Within 90 days before an election at which a candidate for elected office will appear on the ballot, a person who acts as a creator shall not sponsor, or create and distribute, a synthetic media message that the person knows is a deceptive and fraudulent deepfake of that candidate or of a political party that is on that ballot unless the synthetic media message includes a clear and conspicuous disclosure that meets the following criteria:



(1)        An audio communication that contains synthetic audio media shall include audibly at the beginning and end of the communication the words "Contains content generated by AI." If the audio content is greater than two minutes in length, the words shall be interspersed within the audio at intervals of not greater than two minutes each, in the same language as the rest of the audio used in the communication, and in a pitch that can be easily heard by the average listener.



(2)        A visual communication that contains synthetic media shall display throughout the duration of each portion of the communication containing synthetic media, in legible writing, the words:



a.         "This video content generated by AI," if the content is a video that includes synthetic visual media but not synthetic audio media;



b.         "This image generated by AI," if the content is an image that includes synthetic visual media but not synthetic audio media;



c.         "This audio content generated by AI," if the video includes synthetic audio media but not synthetic visual media; or



d.         "This content generated by AI," if the communication includes both synthetic audio media and synthetic visual media.



(3)        For visual media, the disclosure shall be printed or typed in a legible font size easily readable by the average viewer that is no smaller than other text appearing in the visual media and in the same language used on the communication to read as follows: "This (image, video, or audio) has been manipulated."
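The audio timing rule in subdivision (1) above reduces to simple arithmetic: a disclosure at the start, one at the end, and, for audio over two minutes, additional disclosures at intervals of no more than two minutes. A minimal illustrative sketch (not part of the bill text; the function name and the choice of exact two-minute spacing are assumptions) is:

```python
# Illustrative sketch of the subdivision (1) timing rule: disclosures at the
# beginning and end of the audio, plus interspersed disclosures at intervals
# of no more than two minutes (120 seconds) for longer content.

def disclosure_timestamps(duration_seconds: float, interval: float = 120.0) -> list[float]:
    """Return the second offsets at which the disclosure would be spoken."""
    times = [0.0]  # beginning of the communication
    if duration_seconds > interval:
        t = interval
        while t < duration_seconds:
            times.append(t)  # interspersed disclosure, <= 2 minutes apart
            t += interval
    times.append(duration_seconds)  # end of the communication
    return times

# A five-minute (300-second) radio spot:
print(disclosure_timestamps(300.0))  # [0.0, 120.0, 240.0, 300.0]
```

A 90-second spot would need only the beginning and ending disclosures, since the interspersal requirement applies only past the two-minute mark.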



(b)        In addition to the requirements in subsection (a) of this section, a creator or sponsor who publishes an online digital audio or visual communication that is viewable, audible, or accessible in this State shall ensure the advertisement carries embedded tamper‑evident digital content provenance that discloses:



(1)        The initial author and creator of the content;



(2)        Any subsequent entities that edited, altered, or otherwise modified the content; and



(3)        Any use of generative artificial intelligence in generating or modifying the substantive content.
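Subsection (b) above requires three categories of provenance disclosure but does not name a standard or a data format. As a rough sketch only, the three required disclosures can be modeled as fields of a hypothetical manifest (the field names here are invented for illustration; a real implementation would follow whatever open industry standard the embedded provenance conforms to):

```python
# Hypothetical manifest check for the three disclosures required by
# subsection (b). Field names are assumptions, not part of any standard.

REQUIRED_FIELDS = {
    "initial_author",  # (1) initial author and creator of the content
    "edit_history",    # (2) entities that edited, altered, or modified it
    "gen_ai_usage",    # (3) any use of generative AI in the content
}

def missing_provenance_fields(manifest: dict) -> set[str]:
    """Return the required disclosure fields absent from a manifest."""
    return {f for f in REQUIRED_FIELDS if manifest.get(f) is None}

manifest = {"initial_author": "Campaign Media LLC", "edit_history": []}
print(sorted(missing_provenance_fields(manifest)))  # ['gen_ai_usage']
```

The tamper-evidence requirement (e.g., cryptographic signing of the embedded metadata) is a separate property this sketch does not attempt to model.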



(c)        This section applies to an audio or visual communication that:



(1)        Is paid for by a candidate campaign committee, political action committee, political issues committee, political party, or a person using a contribution;



(2)        Is intended to influence voting for or against a candidate or ballot proposition in an election or primary in this State; and



(3)        Contains synthetic media.



§ 170‑3.  Use of materially deceptive media in political communications.



(a)        A person that distributes or publishes any political communication that was produced by or includes materially deceptive media and knows or should know that it is materially deceptive shall disclose this use, as follows:



(1)        For visual media, the disclosure shall be printed or typed in a legible font size easily readable by the average viewer that is no smaller than other text appearing in the visual media and in the same language used on the communication to read as follows: "This (image, video, or audio) has been manipulated." This subdivision does not apply to any of the following:



a.         Materially deceptive media that constitutes satire or parody.



b.         Materially deceptive media created for the purposes of bona fide news reporting when the required disclosure is included.



c.         Initial dissemination by a platform or service, including, but not limited to, a website, regularly published newspaper, or magazine, where the content disseminated is materially deceptive media provided by another information content provider when a good‑faith effort has been made to establish that the depiction is not materially deceptive media.



d.         An interactive computer service as defined in 47 U.S.C. § 230.



(2)        For communication that is auditory, such as radio or automated telephone calls, the disclosure shall be clearly spoken at the beginning of the audio, at the end of the audio, and, if the audio is greater than two minutes in length, interspersed within the audio at intervals of not greater than two minutes each, in the same language as the rest of the audio used in the communication, and in a pitch that can be easily heard by the average listener.



§ 170‑4.  Enforcement and remedies for violations.



(a)        A candidate whose appearance, action, or speech is depicted through the use of a deceptive and fraudulent deepfake in violation of this Article may seek injunctive or other equitable relief prohibiting the publication of the deceptive and fraudulent deepfake.



(b)        A candidate whose voice or likeness appears in materially deceptive media in violation of this Article may seek reasonable attorneys' fees, costs, and injunctive relief prohibiting the distribution, publication, or broadcasting of any materially deceptive media in violation of this Article against such individual or entity who disseminated or published the media without the consent of the person depicted and who knew or should have known that it was materially deceptive.



An action under this section shall be initiated by filing an application for an order to show cause in the superior court where the materially deceptive media at issue could deceive and influence voters in an upcoming election. The action shall be entitled to an automatic calendar preference and be subject to expedited pretrial and trial proceedings.



(c)        In any action alleging a violation of this Article in which a plaintiff seeks preliminary relief with respect to an upcoming election, the court shall grant relief if it determines that the plaintiff is more likely than not to succeed on the merits and that the requested relief is possible to implement.



(d)       The plaintiff bears the burden of establishing the use of materially deceptive media by clear and convincing evidence in any action brought under this Article.



(e)        Any person who violates this Article is guilty of a Class 1 misdemeanor, except that:



(1)        A person who commits the violation within five years of one or more prior convictions under this section is guilty of a Class A felony.



(2)        A person who commits the violation with the intent to cause violence or bodily harm is guilty of a Class A felony.



§ 170‑5.  Exceptions.



This Article does not apply to any of the following:



(1)        A radio or television broadcasting station, including a cable or satellite television operator, programmer, or producer that:



a.         Broadcasts a deceptive and fraudulent deepfake that is prohibited by this Article and that is part of a bona fide newscast, news interview, or news documentary or on‑the‑spot coverage of bona fide news events, if the broadcast clearly acknowledges through its content or a disclosure in a manner that can be easily heard or read by the average listener or viewer that there are questions about the authenticity of the materially deceptive audio or visual media; and



b.         Is paid to broadcast a deceptive and fraudulent deepfake and has made a good‑faith effort to establish that the depiction is not a deceptive and fraudulent deepfake.



(2)        An internet website or a regularly published newspaper, magazine, or other periodical of general circulation, including an internet or electronic publication, that routinely carries news and commentary of general interest and that publishes materially deceptive audio or visual media that is prohibited by this Article if the publication clearly states that the materially deceptive audio or visual media was generated by artificial intelligence.



(3)        Media that constitutes satire or parody.



(4)        An interactive computer service as defined in 47 U.S.C. § 230.



Article 2.



Pornography and Fabricated Images.



§ 170‑6.  Generated child pornography.



(a)        A person who intentionally creates generated child pornography is guilty of a Class A felony.



(b)        It is unlawful for a person to knowingly possess, control, or intentionally view a photograph, a motion picture, a representation, an image, a data file, a computer depiction, or any other presentation which, in whole or in part, the person knows includes generated child pornography. A person who violates this subsection is guilty of a Class A felony.



(c)        The possession, control, or intentional viewing of each such photograph, motion picture, representation, image, data file, computer depiction, or other presentation constitutes a separate offense.



(d)       This section does not apply to any material possessed, controlled, or intentionally viewed as part of a law enforcement investigation.



(e)        In a criminal proceeding, any property or material that constitutes generated child pornography must remain secured or locked in the care, custody, and control of a law enforcement agency, the district attorney, or the court. Notwithstanding any law or rule of court to the contrary, a court shall deny, in a criminal proceeding, any request by the defendant to copy, photograph, duplicate, or otherwise reproduce any property or material that constitutes generated child pornography so long as the district attorney makes the property or material reasonably available to the defendant. For purposes of this section, property or material is deemed to be reasonably available to the defendant if the district attorney provides ample opportunity at a designated facility for the inspection, viewing, and examination of the property or material that constitutes generated child pornography by the defendant, the defendant's attorney, or any individual whom the defendant uses as an expert during the discovery process or at a court proceeding.



§ 170‑7.  Disclosure of fabricated intimate images.



(a)        A person is guilty of a Class 1 misdemeanor when the person knowingly discloses a fabricated intimate image of another person and the person disclosing the image:



(1)        Knows or should have known that the depicted person has not consented to the disclosure; and



(2)        Knows or reasonably should know that disclosure would cause harm to the depicted person.



(b)        A person who is under the age of 18 is not guilty of the crime of disclosing fabricated intimate images unless the person:



(1)        Intentionally and maliciously disclosed a fabricated intimate image of another person; and



(2)        Knows or should have known that the depicted person has not consented to the disclosure.



(c)        This section does not apply to:



(1)        Disclosures made in the public interest, including, but not limited to, the reporting of unlawful conduct, or the lawful and common practices of law enforcement, criminal reporting, legal proceedings, or medical treatment; or



(2)        Images that constitute commentary, criticism, or disclosure protected by the North Carolina Constitution or the United States Constitution.



(d)       This section does not impose liability upon the following entities solely as a result of content provided by another person:



(1)        An interactive computer service, as defined in 47 U.S.C. § 230(f)(2);



(2)        A mobile telecommunications service provider; or



(3)        A telecommunications network or broadband provider.



(e)        In any prosecution for a violation of this section, it is not a defense that:



(1)        The perpetrator lacked knowledge of whether the disclosed image had been created or altered by digitization; or



(2)        The depicted person consented to the creation or alteration of the image.



(f)        The crime of disclosing fabricated intimate images:



(1)        Is a Class 1 misdemeanor on the first offense; or



(2)        Is a Class A felony if the defendant has one or more prior convictions for a violation of this section or the section governing disclosure of intimate images.



(g)        A minor who possesses any image of any other minor which constitutes a fabricated intimate image forfeits any right to continued possession of the image and any court exercising jurisdiction over such image shall order forfeiture of the image.



§ 170‑8.  Remedies for disclosure of fabricated intimate images.



(a)        A depicted individual who is identifiable and who suffers harm from a person's intentional disclosure or threatened disclosure of a fabricated intimate image without the depicted individual's consent has a cause of action against the person if the person knew or acted with reckless disregard for whether:



(1)        The depicted individual did not consent to the disclosure; and



(2)        The depicted individual was identifiable.



(b)        A depicted individual's consent to the creation of the fabricated intimate image does not by itself establish that the depicted individual consented to its disclosure. Consent is deemed validly given only if it (i) is set forth in an agreement written in plain language signed knowingly and voluntarily by the depicted individual and (ii) includes a general description of the fabricated intimate image and, if applicable, the audiovisual work into which it will be incorporated.



(c)        It is not a defense to an action under this section that there is a disclaimer stating that the fabricated intimate image of the depicted individual was unauthorized or that the depicted individual did not participate in the creation or development of the fabricated intimate image.



(d)       In an action under this section, a prevailing plaintiff may recover:



(1)        The greater of (i) economic and noneconomic damages proximately caused by the defendant's disclosure or threatened disclosure, including damages for emotional distress whether or not accompanied by other damages, or (ii) statutory damages not to exceed ten thousand dollars ($10,000) against each defendant found liable for all disclosures and threatened disclosures;



(2)        An amount equal to any monetary gain made by the defendant from disclosure of the fabricated intimate image; and



(3)        Punitive damages in an amount not to exceed three times the amount of damages under subdivision (1) of this subsection.



(e)        The court may award a prevailing plaintiff reasonable attorneys' fees and costs and additional relief, including injunctive relief.
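The damages items above stack in a fixed way: item (1) is the greater of proven damages or capped statutory damages, item (2) adds disgorgement of the defendant's gain, and item (3) caps punitive damages at three times item (1). A short illustrative computation, under one reading of subsection (d) (the function and its single-defendant framing are assumptions for illustration, not part of the bill):

```python
# Illustrative maximum-recovery arithmetic under subsection (d) against one
# defendant: item (1) greater of proven or statutory damages (capped $10,000),
# item (2) defendant's monetary gain, item (3) punitive cap of 3x item (1).

def maximum_recovery(proven_damages: float, defendant_gain: float) -> float:
    item1 = max(proven_damages, 10_000.0)  # statutory damages cap: $10,000
    item2 = defendant_gain                 # disgorgement of monetary gain
    item3 = 3 * item1                      # punitive damages cap: 3x item (1)
    return item1 + item2 + item3

# Proven damages of $4,000 and a $2,500 gain: statutory damages control item (1),
# so the ceiling is 10,000 + 2,500 + 30,000.
print(maximum_recovery(4_000.0, 2_500.0))  # 42500.0
```

When proven damages exceed $10,000, they control item (1) and the punitive ceiling scales with them accordingly.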



(f)        An action under this section for an unauthorized disclosure may not be brought later than four years from the date the disclosure was discovered or should have been discovered with the exercise of reasonable diligence. An action for a threat to disclose may not be brought later than four years from the date of the threat to disclose.



(g)        In an action under this section by a depicted individual who was a minor on the date of the disclosure or threat to disclose, the time specified in subsection (f) of this section does not begin to run until the depicted individual attains the age of majority.



§ 170‑9.  Exemption from liability.



(a)        A person is not liable under this Article if the person proves that disclosure of, or a threat to disclose, a fabricated intimate image was made in good faith, as follows:



(1)        In connection with law enforcement activities, legal proceedings, or medical education or treatment.



(2)        In the reporting or investigation of unlawful conduct.



(3)        In connection with a matter of public concern or public interest.



If a defendant asserts an exception to liability under this section, the exception does not apply if the plaintiff proves the disclosure was prohibited by law other than this Article or made for the purpose of sexual arousal, sexual gratification, humiliation, degradation, or monetary or commercial gain.



(b)        Disclosure of, or a threat to disclose, a fabricated intimate image is not a matter of public concern or public interest solely because the depicted individual is a public figure.



§ 170‑10.  AI‑generated images in court proceedings.



(a)        In any criminal proceeding, any property or material that constitutes a depiction of a minor engaged in sexually explicit conduct, including any fabricated depictions, shall remain in the care, custody, and control of either a law enforcement agency or the court.



(b)        Despite any request by the defendant or prosecution, any property or material that constitutes a fabricated depiction of a minor shall not be copied, photographed, duplicated, or otherwise reproduced, so long as the property or material is made reasonably available to the parties. The property or material shall be deemed to be reasonably available to the parties if the prosecution, defense counsel, or any individual sought to be qualified to furnish expert testimony at trial has ample opportunity for inspection, viewing, and examination of the property or material at a law enforcement facility or a neutral facility approved by the court upon petition by the defense.



(c)        The defendant may view and examine the property and materials only while in the presence of his or her attorney. If the defendant is proceeding pro se, the court will appoint an individual to supervise the defendant while he or she examines the materials.



(d)       The court may direct that a mirror image of a computer hard drive containing such depictions be produced for use by an expert only upon a showing that an expert has been retained and is prepared to conduct a forensic examination while the mirror imaged hard drive remains in the care, custody, and control of a law enforcement agency or the court. Upon a substantial showing that the expert's analysis cannot be accomplished while the mirror imaged hard drive is kept within the care, custody, and control of a law enforcement agency or the court, the court may order its release to the expert for analysis for a limited time. If release is granted, the court shall issue a protective order setting forth such terms and conditions as are necessary to protect the rights of the victims, to document the chain of custody, and to protect physical evidence.



(e)        Whenever a depiction of a minor engaged in sexually explicit conduct, regardless of its format and whether it is a fabricated depiction, is marked as an exhibit in a criminal proceeding, the prosecutor shall seek an order sealing the exhibit at the close of the trial. Any exhibits sealed under this section shall be sealed with evidence tape in a manner that prevents access to, or viewing of, the depiction and shall be labeled so as to identify its contents. Anyone seeking to view such an exhibit must obtain permission from the superior court after providing at least 10 days' notice to the prosecuting attorney. Appellate attorneys for the defendant and the State shall be given access to the exhibit, which must remain in the care and custody of either a law enforcement agency or the court.



(f)        If the criminal proceeding ends in a conviction, the clerk of the court shall destroy any exhibit containing a depiction of a minor engaged in sexually explicit conduct, including any fabricated depictions, five years after the judgment is final, unless otherwise required by law. Before any destruction, the clerk shall contact the prosecuting attorney and verify that there is no collateral attack on the judgment pending in any court. If the criminal proceeding ends in a mistrial, the clerk shall either maintain the exhibit or return it to the law enforcement agency that investigated the criminal charges for safekeeping until the matter is set for retrial. If the criminal proceeding ends in an acquittal, the clerk shall return the exhibit to the law enforcement agency that investigated the criminal charges for either safekeeping or destruction.



Article 3.



Various Regulations.



§ 170‑11.  Generative artificial intelligence transparency disclosures.



(a)        Except as provided in subsection (b) of this section, when a person uses generative artificial intelligence to interact with an individual, the person shall disclose that the individual is interacting with Gen AI only if the individual asks whether the interaction involves generative artificial intelligence.



(b)        When generative artificial intelligence is utilized in the provision of services of a regulated occupation, a prominent mandatory disclosure must be clearly and conspicuously provided.



(c)        Regulated occupation professionals must disclose the use of Gen AI either verbally at the start of an exchange or conversation with a client or customer or through an electronic message before a written exchange.



(d)       Violation of this section is a Class 1 misdemeanor. Each interaction with a consumer constitutes a separate violation.
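The section above sets a two-tier disclosure rule: disclosure is always required when Gen AI is used in a regulated occupation (subsection (b)), and otherwise only when the individual asks (subsection (a)). A minimal sketch of that decision rule (the function and parameter names are assumptions for illustration):

```python
# Two-tier Gen AI disclosure rule: mandatory for regulated occupations,
# otherwise triggered only by the individual's inquiry.

def disclosure_required(regulated_occupation: bool, individual_asked: bool) -> bool:
    if regulated_occupation:
        return True          # subsection (b): mandatory, clear and conspicuous
    return individual_asked  # subsection (a): on-request disclosure only

print(disclosure_required(True, False))   # True
print(disclosure_required(False, False))  # False
```

Under subsection (c), the mandatory tier would be satisfied verbally at the start of a conversation or by electronic message before a written exchange.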



Article 4.



Miscellaneous Provisions.



§ 170‑12.  Nonexclusive remedies.



The remedies in this Chapter are nonexclusive. When an aggrieved person has multiple options for seeking relief, choosing an option provided by this Chapter does not prevent the person from pursuing other remedies. The aggrieved person may pursue multiple forms of relief allowed by law simultaneously or sequentially.



§ 170‑13.  Criminal liability for AI‑assisted offenses.



(a)        A defendant is guilty of a criminal offense under this Chapter if the defendant commits the offense with the aid of generative artificial intelligence or intentionally prompts or otherwise causes generative artificial intelligence to commit the offense.



(b)        It is not a defense to the violation of any statute that generative artificial intelligence made the violative statement, undertook the violative act, or was used in furtherance of the violation.



§ 170‑14.  Statutory construction.



(a)        It is the intent of the General Assembly that the provisions of this Chapter be liberally construed in the best interest of the citizens of this State, especially minors and voters.



(b)        Nothing in this section shall be construed to conflict with or prohibit compliance with Title IX of the Education Amendments of 1972, as amended; the Americans with Disabilities Act, as amended; the Age Discrimination in Employment Act, as amended; Title VI of the Civil Rights Act of 1964; or other applicable State or federal law. This section does not apply to speech protected by the First Amendment of the United States Constitution.



(c)        If a provision of this Chapter or its application to any person or circumstance is held invalid, the invalidity does not affect other provisions or applications of this Chapter that can be given effect without the invalid provision or application and, to this end, the provisions of this Chapter are severable.



SECTION 2.  This act becomes effective December 1, 2025, and applies to acts or omissions occurring on or after that date.