H1161: Omnibus Artificial Intelligence Protections

2025-2026



AN ACT ENACTING PROTECTIONS CONCERNING DEPLOYMENT OF ARTIFICIAL INTELLIGENCE AND RELATED TECHNOLOGIES IN ELECTIONS, EDUCATION, EMPLOYMENT, THE COURTS, AND THE INSURANCE INDUSTRY IN THIS STATE AND APPROPRIATING FUNDS FOR IMPLEMENTATION.



The General Assembly of North Carolina enacts:



 



PART I. ELECTIONS



 



PROHIBIT USE OF ARTIFICIAL INTELLIGENCE IN POLITICAL ADVERTISEMENTS



SECTION 1.1.  Article 22A of Chapter 163 of the General Statutes is amended by adding a new section to read:



§ 163‑278.18A.  Prohibit use of artificial intelligence in political advertisements.



(a)        For purposes of this section, the following definitions shall apply:



(1)        Artificial intelligence. – The capability of computer systems or algorithms to imitate intelligent human behavior. The term includes generative artificial intelligence.



(2)        Political advertisement. – An advertisement as defined under G.S. 163‑278.38Z(1), including communications sent by email, text, or appearing on a website or social media platforms.



(b)        No candidate, candidate campaign committee, political party organization, political action committee, referendum committee, individual, or other sponsor shall use artificial intelligence in any form of political advertisement, including, but not limited to, images, videos, voices, or writings.



(c)        A person who violates subsection (b) of this section is guilty of a Class 1 misdemeanor and shall pay a fine to the State Board equal to the cost to produce and distribute the artificial intelligence‑generated political advertisement multiplied by 10. In addition, the following shall apply:



(1)        If a candidate's campaign committee is convicted for a violation under this section, the campaign committee is prohibited from soliciting donations or making contributions for two years from the date of conviction, regardless of whether an election is occurring.



(2)        If a political action committee is convicted for a violation under this section, the political action committee shall have its bank account frozen and shall not solicit donations or make contributions. The chair or director of the political action committee is prohibited from (i) soliciting donations or making contributions for five years from the date of conviction and (ii) establishing, participating in, or working for any other political action committee during that five‑year period.
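The fine in subsection (c) is simple arithmetic: ten times the combined cost of producing and distributing the advertisement. A minimal sketch with hypothetical dollar figures (the act does not supply example amounts):

```python
# Fine under subsection (c): production plus distribution cost, times 10.
# The dollar figures below are hypothetical.

def ai_ad_fine(production_cost: float, distribution_cost: float) -> float:
    return (production_cost + distribution_cost) * 10

fine = ai_ad_fine(2_500.00, 1_200.00)  # ($2,500 + $1,200) x 10 = $37,000
```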



(d)       Any person not affiliated with a candidate's campaign or political action committee who creates an artificial intelligence‑generated video, writing, voice, or image of a candidate or a candidate's campaign with the intent to confuse or interfere with a candidate's campaign shall be subject to prosecution for fraud or election interference in accordance with Article 22 of this Chapter. This subsection shall not apply to (i) artificial intelligence‑generated content that clearly depicts or explicitly states that it is artificial intelligence or (ii) parody, so long as neither would qualify as any form of harassment, including sexual harassment.



SECTION 1.2.  This Part is effective when it becomes law and applies to political advertisements using artificial intelligence on or after that date.



 



PART II. EDUCATION



 



ALLOW PUBLIC SCHOOL UNITS TO RESTRICT STUDENT USE OF ARTIFICIAL INTELLIGENCE



SECTION 2.1.(a)  Part 3A of Article 8 of Chapter 115C of the General Statutes is amended by adding a new section to read:



§ 115C‑102.13.  Policy for student use of artificial intelligence.



(a)        For the purposes of this section, artificial intelligence (AI) tool means any algorithm, product, software, or system that uses artificial intelligence to perform tasks.



(b)        A local board of education may adopt a policy for student use of artificial intelligence. If a local board of education adopts a policy pursuant to this section, the policy may include any of the following:



(1)        Allow schools to block access to AI tools on student electronic devices.



(2)        Allow schools to block access to AI tools on internet connections available to students.



(3)        Disciplinary actions for unauthorized use of artificial intelligence, up to and including retaining the student in the current grade for repeated violations.



(c)        Any policy adopted pursuant to this section shall include exceptions for student use of artificial intelligence if the use is authorized by a teacher for educational purposes or is required by the student's individualized education program or section 504 (29 U.S.C. § 794) plan. If a student will be using artificial intelligence under an exception in this subsection, the parent of the student and the principal of the school shall be notified prior to the use.



SECTION 2.1.(b)  G.S. 115C‑150.12C is amended by adding a new subdivision to read:



(39)    Artificial intelligence. – The board of trustees may adopt a policy for student use of artificial intelligence so long as the policy is consistent with the requirements of G.S. 115C‑102.13.



SECTION 2.1.(c)  G.S. 115C‑218.33 is amended by adding a new subsection to read:



(c)      A charter school may adopt a policy for student use of artificial intelligence so long as the policy is consistent with the requirements of G.S. 115C‑102.13.



SECTION 2.1.(d)  G.S. 115C‑238.66(1) is amended by adding a new sub‑subdivision to read:



h.        The board of directors may adopt a policy for student use of artificial intelligence so long as the policy is consistent with the requirements of G.S. 115C‑102.13.



SECTION 2.1.(e)  G.S. 116‑239.8(b) is amended by adding a new subdivision to read:



(26)    Artificial Intelligence. – A laboratory school may adopt a policy for student use of artificial intelligence so long as the policy is consistent with the requirements of G.S. 115C‑102.13.



SECTION 2.1.(f)  This section is effective when it becomes law and applies beginning with the 2026‑2027 school year.



 



INSTRUCTION ON CRITICAL THINKING, CIVICS, AND TECHNOLOGY AWARENESS, INCLUDING ARTIFICIAL INTELLIGENCE



SECTION 2.2.(a)  Part 1 of Article 8 of Chapter 115C of the General Statutes is amended by adding a new section to read:



§ 115C‑81.46.  Middle school course on critical thinking and civics.



The State Board of Education shall include instruction on critical thinking and civics in the standard course of study for middle school students. The State Board shall develop standards for this instruction to be offered in a semester course that may be extended to a yearlong course at the discretion of the local board of education. The course shall include instruction in at least the following areas:



(1)        Components of the civic and citizenship standards adopted by the State Board pursuant to G.S. 115C‑81.45 that the State Board deems appropriate for this course.



(2)        Proper methods of research on topics related to civics and government, including methods of identifying reputable sources of information.



(3)        Identification of disinformation and misinformation.



(4)        Basic structures of the United States government and the North Carolina government at both the State and local levels.



(5)        Beginner‑level introduction to philosophies of governance, including instruction on at least the following writings:



a.         The Prince by Niccolò Machiavelli.



b.         The Republic by Plato.



SECTION 2.2.(b)  G.S. 115C‑81.90(b) reads as rewritten:



(b)      Introductory Course. – Each public school unit shall offer to middle school students a required introductory computer science course that surveys the field of computer science. The State Board of Education, in consultation with the Department of Public Instruction, shall adopt a list of approved courses that fulfill this requirement and make it publicly available on the Department's website. The approved courses shall include instruction in at least the following topics:



(1)        Instruction on typing on a physical keyboard.



(2)        Introduction to basic software design.



(3)        Proper uses of artificial intelligence in an academic and professional setting.



(4)        Instruction on identifying content generated using artificial intelligence.



(5)        Introduction to basic cybersecurity.



(6)        Instruction on the internet as a tool, including the benefits and dangers of its use.



SECTION 2.2.(c)  There is appropriated from the General Fund to the Department of Public Instruction the sum of five hundred thousand dollars ($500,000) in nonrecurring funds for the 2026‑2027 fiscal year to implement the provisions of this section.



SECTION 2.2.(d)  This section is effective when it becomes law and applies beginning with the 2026‑2027 school year.



 



REQUIRE UNIVERSITIES AND COMMUNITY COLLEGES TO DEVELOP STANDARDS FOR CLASSROOM USE OF ARTIFICIAL INTELLIGENCE



SECTION 2.3.(a)  G.S. 116‑11 is amended by adding a new subdivision to read:



(15)    The Board of Governors shall adopt a policy requiring each constituent institution of The University of North Carolina to develop standards for the use of artificial intelligence in the classroom. The policy shall permit constituent institutions to set their own standards but shall ensure that each constituent institution (i) presents its standards to all students every year and (ii) provides for appropriate disciplinary action in the event a student misuses artificial intelligence in a class. Appropriate disciplinary action, as described in this subdivision, could include automatic receipt of a failing grade in a class.



SECTION 2.3.(b)  Part 4 of Article 1 of Chapter 115D of the General Statutes is amended by adding the following new section to read:



§ 115D‑10.85.  Artificial intelligence.



The State Board of Community Colleges shall adopt a policy requiring each community college to develop standards for the use of artificial intelligence in the classroom. The policy shall permit community colleges to set their own standards but shall ensure that each community college (i) presents its standards to all students every year and (ii) provides for appropriate disciplinary action in the event a student misuses artificial intelligence in a class. Appropriate disciplinary action, as described in this section, could include automatic receipt of a failing grade in a class.



SECTION 2.3.(c)  This section is effective when it becomes law and applies beginning with the 2026‑2027 academic year.



 



PART III. EMPLOYMENT



SECTION 3.1.(a)  Effective October 1, 2026, the General Statutes are amended by adding a new Chapter to read:



Chapter 95A.



Fair Artificial Intelligence Hiring Act.



§ 95A‑1.  Title; findings.



(a)        This Chapter shall be known and may be cited as the North Carolina Fair AI Hiring Act (NC‑FAHA).



(b)        The General Assembly finds that:



(1)        Artificial intelligence and algorithmic decision‑making tools are increasingly used by employers in North Carolina to screen, rank, evaluate, and select applicants for employment and to evaluate employees for promotion. However, these AI tools can perpetuate or amplify patterns of discrimination based on race, sex, ethnicity, national origin, disability, age, and other protected characteristics, often through mechanisms that are opaque to both applicants and employers.



(2)        North Carolinians have a substantial interest in ensuring that the use of algorithmic tools in a covered employment decision is subject to independent scrutiny, public accountability, and meaningful notice to employees.



(3)        Existing anti‑discrimination laws prohibit discriminatory employment practices but do not specifically address the disclosure, audit, and notice requirements necessary to effectively regulate automated decision‑making tools.



(4)        It is the public policy of this State that hiring and promotion employment decisions made using artificial intelligence tools must be fair and not perpetuate or amplify patterns of discrimination.



§ 95A‑2.  Definitions.



The following definitions apply in this Chapter:



(1)        Automated employment decision tool or AEDT. – Any computational process, or any technology that incorporates such a process, derived in whole or in material part from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues a simplified output, including a score, classification, ranking, or recommendation, that is used to substantially assist or replace the exercise of discretionary judgment by an employer or employment agency in making a covered employment decision. A tool does not qualify as an AEDT solely because it translates or transcribes text, performs arithmetic computation on manually entered data without autonomous parameter adjustment, or conducts background checks governed exclusively by the federal Fair Credit Reporting Act, 15 U.S.C. § 1681, et seq.



(2)        Bias audit. – An impartial evaluation by an independent auditor that at a minimum: (i) calculates the selection rate and scoring rate for each race and ethnicity category, each sex category, and each intersectional race‑and‑sex category assessed by the AEDT; (ii) calculates the impact ratio for each such category relative to the most‑selected or highest‑scoring category; (iii) identifies the source and scope of the historical or test data used in the audit; and (iv) states whether the AEDT was found to have a statistically significant adverse impact on any protected category.



(3)        Covered employment decision. – A decision to hire an individual for, or to promote an employee to, a position within this State, including a remote position performed primarily by an individual who resides within this State.



(4)        Deployer. – Any employer or employment agency that uses an AEDT to make or substantially influence a covered employment decision, regardless of whether the AEDT was developed by the deployer or by a third party.



(5)        Employer. – A public or private employer.



(6)        Employment agency. – Any person that regularly undertakes, with or without compensation, to procure employees for an employer or to procure for employees opportunities to work for an employer.



(7)        Independent auditor. – A person that: (i) is not employed by and has no financial interest in the deployer or any vendor of the AEDT being audited, other than compensation for the audit itself; (ii) has demonstrated expertise in the statistical or computational methods necessary to evaluate the AEDT; and (iii) conducts the audit in accordance with the standards established by the State Human Resources Commission for State employment or the Commissioner of Labor as to private employers.



(8)        Private employer. – As defined in G.S. 95‑25.2, except that the term does not include public employers.



(9)        Public employer. – Any State agency, department, or institution; any constituent institution of The University of North Carolina; the North Carolina Community College System; and any local governmental employer that receives State appropriations used in whole or in part to compensate employees or that has any local employees subject to provisions of Chapter 126 of the General Statutes.



(10)      Substantially assist or replace. – Where the output of an AEDT: (i) serves as the sole basis for a covered employment decision; (ii) serves as a weighted factor that overrides or supersedes other evaluation criteria; or (iii) is a determinative threshold below or above which an individual is screened out of further consideration without independent human review. A tool does not substantially assist or replace discretionary judgment if the output is one of multiple factors reviewed by a human decision maker who retains genuine independent authority and who regularly departs from the tool's recommendation.
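The impact‑ratio arithmetic described in the bias audit definition (subdivision (2) of this section) mirrors the adverse‑impact calculation familiar from employment selection testing. The following is a minimal sketch, assuming hypothetical category labels and counts; the act leaves the actual statistical methodology to the standards adopted under G.S. 95A‑8.

```python
# Sketch of the selection-rate and impact-ratio arithmetic from the
# bias audit definition. Category labels and counts are hypothetical;
# the act defers statistical standards to G.S. 95A-8.

def selection_rate(selected: int, assessed: int) -> float:
    """Fraction of individuals in a category selected by the AEDT."""
    return selected / assessed if assessed else 0.0

def impact_ratios(counts: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Each category's rate relative to the most-selected category.

    counts maps category -> (number selected, number assessed).
    """
    rates = {cat: selection_rate(s, n) for cat, (s, n) in counts.items()}
    top = max(rates.values())
    return {cat: (r / top if top else 0.0) for cat, r in rates.items()}

# Hypothetical audit data: 40 of 100 applicants selected in category A,
# 24 of 100 in category B.
ratios = impact_ratios({"A": (40, 100), "B": (24, 100)})
# Category A is the most-selected category (ratio 1.0);
# category B's ratio is 0.24 / 0.40 = 0.6.
```

Whether a given ratio amounts to a "statistically significant adverse impact" is a determination for the independent auditor under the methodological standards of G.S. 95A‑8, not a fixed numerical threshold in the act.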



§ 95A‑3.  Bias audit requirement.



(a)        No deployer shall use an AEDT to make a covered employment decision in this State unless all of the following conditions are met:



(1)        The AEDT has been subjected to a bias audit conducted by an independent auditor within the 12‑month period immediately preceding each use of the AEDT.



(2)        A summary of the results of the most recent bias audit is publicly accessible as required by this Chapter.



(3)        Advance notice of the AEDT's use has been provided as required by this Chapter.



(b)        A deployer may commission a bias audit of its own AEDT or may use a bias audit commissioned by the vendor that developed or supplied the AEDT, provided that the auditor meets the independence requirements of this Chapter and the audit was conducted within the applicable 12‑month period.



(c)        Where the AEDT is used for the first time and no historical data from the deployer's own use is available, the bias audit may be conducted using test data or historical data from comparable deployers, provided that the audit summary discloses this limitation.



(d)       The absence of statistically significant adverse impact findings in a bias audit does not constitute a defense to a claim of employment discrimination under any other provision of State or federal law.



§ 95A‑4.  Disclosure requirement.



(a)        Within 30 days of completing a bias audit required under G.S. 95A‑3, and prior to any use of the AEDT in a covered employment decision, a deployer shall publish on the employment section of its publicly accessible website a summary of the bias audit results that includes all of the following:



(1)        The date of the bias audit and the name of the independent auditor.



(2)        The AEDT's name or description and the date the deployer began using it.



(3)        The source and scope of the data used in the audit, including whether historical or test data was used and any limitations on the data.



(4)        The selection and scoring rates and impact ratios for each required demographic category.



(5)        The auditor's findings regarding statistically significant adverse impact, if any.



(b)        A deployer that does not maintain a public website shall make the required disclosure available upon written request within five business days.



(c)        Bias audit summaries shall remain publicly accessible for a minimum of three years from the date of publication.



§ 95A‑5.  Advance notice requirement.



(a)        No later than 10 business days before using an AEDT to make a covered employment decision with respect to a particular applicant or employee, a deployer shall provide written notice to that individual indicating all of the following information:



(1)        That an AEDT will be used in connection with the covered employment decision.



(2)        The qualifications, characteristics, or criteria that the AEDT is designed to evaluate.



(3)        A hyperlink (URL) to the bias audit summary required under G.S. 95A‑4 or, if no website exists, instructions for requesting the summary.



(4)        The right to request an alternative selection or evaluation process under subsection (b) of this section.



(b)        Upon a timely written request from an applicant or employee, a deployer shall provide an alternative selection or evaluation process that does not rely upon the AEDT, unless the deployer can demonstrate that no reasonable alternative process exists. A request is timely if made within five business days of receiving the notice required under subsection (a) of this section.



(c)        For job applicants, the notice required by subsection (a) of this section may be included in the job posting, provided that it is clear and conspicuous and not buried within general terms and conditions.
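Both deadlines in this section run in business days: notice at least 10 business days before the AEDT is used, and an alternative‑process request within 5 business days of receiving the notice. A minimal sketch of the deadline arithmetic, assuming "business day" means Monday through Friday (the act does not address State holidays) and using hypothetical dates:

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance a date by the given number of Monday-Friday business days."""
    d = start
    while days:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday (0) through Friday (4)
            days -= 1
    return d

# Hypothetical: notice delivered Monday, October 5, 2026.
notice = date(2026, 10, 5)
earliest_use = add_business_days(notice, 10)     # earliest AEDT use under (a)
request_deadline = add_business_days(notice, 5)  # request deadline under (b)
```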



§ 95A‑6.  Enforcement against private employers.



(a)        The Attorney General is authorized to investigate and bring a civil action against any private employer or employment agency that violates this Chapter.



(b)        Upon finding a violation, a court may impose a civil penalty of:



(1)        Not less than five hundred dollars ($500.00) and not more than one thousand five hundred dollars ($1,500) per violation per day for an initial violation; and



(2)        Not less than one thousand dollars ($1,000) and not more than five thousand dollars ($5,000) per violation per day for each subsequent violation by the same deployer within any rolling three‑year period.
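The per‑day penalty ranges in subsection (b) can be read as floor‑and‑ceiling pairs, with the higher pair applying to repeat violations within a rolling three‑year period. A minimal sketch; the court, not a formula, fixes the actual amount within the applicable range:

```python
# Statutory per-violation, per-day penalty ranges from subsection (b).
INITIAL = (500, 1_500)       # first violation
SUBSEQUENT = (1_000, 5_000)  # repeat violation within a rolling 3-year period

def clamp_daily_penalty(proposed: int, repeat: bool) -> int:
    """Clamp a proposed per-day figure to the applicable statutory range."""
    low, high = SUBSEQUENT if repeat else INITIAL
    return max(low, min(proposed, high))

clamp_daily_penalty(200, repeat=False)   # raised to the $500 floor
clamp_daily_penalty(8_000, repeat=True)  # capped at the $5,000 ceiling
```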



(c)        Any applicant or employee aggrieved by a violation of this Chapter by a private employer may bring a civil action in the General Court of Justice within one year of the date the plaintiff knew or reasonably should have known of the violation. A prevailing plaintiff may recover one or more of the following remedies:



(1)        Compensatory damages.



(2)        Injunctive or declaratory relief.



(3)        Reasonable attorneys' fees and costs.



(4)        Such other relief as the court deems equitable.



(d)       It shall be a complete defense that the deployer conducted a timely and conforming bias audit; published a conforming audit summary; provided conforming advance notice; and that the violation was a technical or inadvertent defect subsequently cured within 30 days of notice from the complainant or the Attorney General.



(e)        The clear proceeds of any civil penalties collected under this section shall be remitted to the Civil Penalty and Forfeiture Fund pursuant to G.S. 115C‑457.2.



§ 95A‑7.  Administration; enforcement against public employers.



(a)        For public employers that are State agencies, the State Human Resources Commission (SHRC) shall, as part of its rulemaking authority under G.S. 126‑4, incorporate the requirements of this Chapter into the State Human Resources Act.



(b)        Public employers that are not State agencies shall incorporate the requirements of this Chapter into each entity's respective policies governing recruitment, selection, and promotion.



(c)        Bias audit summaries required of public employers shall be posted on the relevant agency's public website.



(d)       The SHRC shall investigate and adjudicate complaints and referrals alleging violation of this Chapter by a public employer. Any applicant or employee aggrieved by a violation of this Chapter by a public employer may file a written complaint with the SHRC within one year of the date the complainant knew or reasonably should have known of the violation. The SHRC shall do all of the following:



(1)        Conduct an investigation within 60 days of receipt of a complaint.



(2)        Issue written findings.



(3)        Where a violation is found, direct the public employer to remedy the violation within 30 days and, if the violation resulted in an adverse covered employment decision, afford the complainant priority consideration in any subsequent non‑AEDT selection process for the same or a comparable position.



(e)        The Office of State Human Resources (OSHR) may conduct compliance audits of public employer AEDT use on a periodic basis, not less than biennially, and shall report findings to the General Assembly, the Office of State Budget and Management (OSBM), and the Fiscal Research Division.



(f)        Nothing in this section creates or is intended to create a waiver of sovereign immunity beyond the administrative remedies provided herein. This section does not create a right of action in the General Court of Justice against a public employer, except that an aggrieved party who has exhausted the Commission's administrative process may seek judicial review of a final Commission order pursuant to G.S. 150B‑43.



(g)        Following a finding by the SHRC that a public employer has violated this Chapter, the Commission shall simultaneously transmit the finding to OSHR for initiation of personnel disciplinary proceedings under Chapter 126 of the General Statutes against the employee responsible for the violation, unless the Commission affirmatively determines that the violation was caused solely by circumstances beyond the responsible employee's reasonable control. Disciplinary proceedings initiated under this subsection shall be conducted in accordance with G.S. 126‑34.02 through G.S. 126‑34.05. Sanctions commensurate with the severity and willfulness of the violation may include written warning, demotion, suspension without pay, or dismissal.



(h)        Following a finding by the SHRC that a public employer subject to the State Budget Act, Chapter 143C of the General Statutes, has violated the independent bias audit provisions of this Chapter and has failed to cure the violation within 30 days, the SHRC shall certify the finding and the agency's failure to cure to the OSBM. Upon receipt of the certification, the OSBM shall withhold from the noncompliant agency's personnel services budget allotment an amount equal to two times the reasonable cost of a conforming independent bias audit for the AEDT at issue, as estimated by OSHR. However, the OSBM shall not withhold amounts from any program services, capital, or other budget allotments of the agency. The withholding shall take effect no later than 30 days after certification. The withheld amount shall be held in a reserve account by OSBM and shall be credited back to the agency's personnel services allotment in full upon OSHR's certification to OSBM that the agency has cured the violation. If the violation is not cured within 180 days of the withholding, the withheld amount shall be transferred to the General Fund as a nonrecurring credit.



(i)         A single withholding action under this section shall not exceed fifty thousand dollars ($50,000) per violation finding. Where multiple violations are found simultaneously, the OSBM shall sequence withholdings to avoid operational impairment of essential agency services, as determined in OSBM's reasonable discretion in consultation with the agency head.
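The withholding amount in subsections (h) and (i) reduces to twice the OSHR‑estimated cost of a conforming audit, subject to the fifty‑thousand‑dollar cap per violation finding. A minimal sketch with hypothetical cost estimates:

```python
# Withholding under subsections (h) and (i): twice the OSHR-estimated cost
# of a conforming independent bias audit, capped at $50,000 per finding.
# The audit-cost estimates below are hypothetical.

PER_FINDING_CAP = 50_000

def withholding(estimated_audit_cost: float) -> float:
    return min(2 * estimated_audit_cost, PER_FINDING_CAP)

withholding(18_000)  # 2 x $18,000 = $36,000, under the cap
withholding(40_000)  # 2 x $40,000 = $80,000, capped at $50,000
```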



(j)         The OSBM shall report each withholding action taken under this section to the General Assembly and Fiscal Research Division within 30 days of taking the action and shall include a summary of all withholding actions in its annual report.



§ 95A‑8.  Standards.



The SHRC, in conjunction with the Attorney General and the Department of Administration, shall develop and publish the following concerning employer use of AEDT:



(1)        Minimum qualification standards for independent auditors.



(2)        Methodological standards for bias audits, including approved statistical methods and minimum sample size requirements.



(3)        A standardized format for bias audit summaries.



(4)        A standardized form for the required advance notice.



(5)        The annual inventory reporting process for employer deployment of AEDT.



§ 95A‑9.  Government contractors; contract employees.



(a)        In addition to the provisions of G.S. 95A‑2, the following definitions apply in this section:



(1)        Contract employee. – Any individual selected to perform work for a State agency under a covered contract, whether provided by the prime contractor or a subcontractor of a prime contractor.



(2)        Covered contract. – A personal service contract with a total value exceeding twenty‑five thousand dollars ($25,000), including all renewals and amendments.



(3)        Personal service. – Professional or technical expertise provided by a consultant to accomplish a specific study, project, task, or other work statement. The term does not include professional services procured using the competitive selection requirements of Chapter 143 of the General Statutes.



(4)        Personal service contract. – An agreement, or any amendment thereto, with a consultant for the rendering of personal services.



(5)        Prime contractor. – Any person that enters directly into a covered contract with a State agency.



(6)        Purchased services. – Services provided by a vendor to accomplish routine, continuing, and necessary functions. The term includes, but is not limited to, services for equipment maintenance and repair; operation of a physical plant; security; computer hardware and software maintenance; data entry; key punch services; and computer time‑sharing, contract programming, and analysis.



(b)        A prime contractor shall comply with the audit, disclosure, and notice requirements of this Chapter with respect to any AEDT used to select contract employees under a covered contract. Such use constitutes a covered employment decision for the purposes of this Chapter.



(c)        The Department of Administration shall include in the standard terms and conditions for all covered contracts a requirement that the prime contractor do all of the following:



(1)        Certify at contract execution and at each annual renewal whether an automated employment decision tool was used or will be used to select contract employees performing work under the contract.



(2)        Provide the contracting agency the bias audit summary required by G.S. 95A‑4 prior to such use.



(3)        Acknowledge that material failure to comply with this section constitutes a breach of contract entitling the State to withhold payment, terminate for cause, or both, at the State's election.



(d)       Any individual who applied for or performed work as a contract employee under a covered contract and who was subjected to a nonconforming automated employment decision tool may file a written complaint with the SHRC within one year of discovering the violation. The SHRC shall investigate, issue written findings within 60 days, and upon finding a violation shall do all of the following:



(1)        Direct the contracting State agency to pursue available contract remedies under subdivision (3) of subsection (c) of this section.



(2)        Notify the Attorney General, who may bring a civil action against the prime contractor under G.S. 95A‑6.



§ 95A‑10.  Exemptions.



(a)        This Chapter does not apply to any of the following:



(1)        Tools used solely for the purpose of administering or scoring a standardized test, the content of which was designed entirely by human experts without autonomous algorithmic parameter adjustment.



(2)        Tools used by an employer with fewer than 15 employees.



(3)        Tools used solely for recruiting outreach or advertising, where no individual applicant's qualifications are evaluated.



(4)        Background checks governed exclusively by the federal Fair Credit Reporting Act.



(5)        Tools used by an employer subject to an express federal regulatory requirement governing the hiring process that is inconsistent with the requirements of this Chapter, to the extent of the inconsistency.



(b)        This Chapter does not apply to the judicial or legislative branches of government except that the Chief Justice of the Supreme Court and the Legislative Services Officer may establish substantially equivalent provisions and requirements concerning the deployment of AEDT.



§ 95A‑11.  Miscellaneous.



(a)        This Chapter supplements and does not supplant any obligation under: Title VII of the Civil Rights Act of 1964; the Age Discrimination in Employment Act; the Americans with Disabilities Act; the North Carolina Equal Employment Practices Act, G.S. 143‑422.1, et seq.; or any other State or federal antidiscrimination law.



(b)        This Chapter does not preempt any local ordinance, rule, or policy of a county, municipality, or other local political subdivision of the State that imposes obligations on automated employment decision tools equal to or greater than those imposed by this Chapter.



(c)        If any provision of this Chapter or its application to any person or circumstance is held invalid, the invalidity does not affect other provisions or applications that can be given effect without the invalid provision or application and, to this end, the provisions of this Chapter are severable.



SECTION 3.1.(b)  Effective July 1, 2026, there is appropriated from the General Fund to:



(1)        The State Human Resources Commission, Office of State Human Resources, the sum of two hundred fifty thousand dollars ($250,000) for the 2026‑2027 fiscal year for the purpose of implementing the rulemaking, compliance auditing, and oversight functions required by this act.



(2)        The Department of Justice the sum of two hundred fifty thousand dollars ($250,000) for the 2026‑2027 fiscal year for implementing the provisions of this act.



(3)        The Department of Administration the sum of one hundred thousand dollars ($100,000) for the 2026‑2027 fiscal year for implementing the provisions of this act.



 



PART IV. INSURANCE



 



LIMIT USE OF ARTIFICIAL INTELLIGENCE DURING INSURANCE CLAIM PROCESSING



SECTION 4.1.(a)  G.S. 58‑63‑15 reads as rewritten:



§ 58‑63‑15.  Unfair methods of competition and unfair or deceptive acts or practices defined.



The following are hereby defined as unfair methods of competition and unfair and deceptive acts or practices in the business of insurance:





(11)      Unfair Claim Settlement Practices. – Committing or performing with such frequency as to indicate a general business practice of any of the following: Provided, however, that no violation of this subsection shall of itself create any cause of action in favor of any person other than the Commissioner:





m.        Failing to promptly settle claims where liability has become reasonably clear, under one portion of the insurance policy coverage in order to influence settlements under other portions of the insurance policy coverage; and



n.         Failing to promptly provide a reasonable explanation of the basis in the insurance policy in relation to the facts or applicable law for denial of a claim or for the offer of a compromise settlement; and



o.         Using artificial intelligence as the primary method of processing a claim. For purposes of this sub‑subdivision, artificial intelligence means any machine‑based system that, for a given set of objectives, generates predictions, recommendations, or decisions influencing outcomes without direct human control.



….



SECTION 4.1.(b)  This Part becomes effective October 1, 2026, and applies to policies issued or renewed on or after that date.



 



PART V. COURTS



SECTION 5.1.(a)  Chapter 7A of the General Statutes is amended by adding a new Article to read:



Article 64.



Use of Artificial Intelligence.



§ 7A‑810.  Definitions.



The following definitions apply in this Article:



(1)        Artificial intelligence. – A machine‑based system that can, for a given set of human‑defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.



(2)        Deepfake. – A type of media that uses a generative artificial intelligence system to create or alter images, audio, or video in highly realistic ways, including to falsely depict people making statements or completing actions. This definition includes a person using the person's own likeness, image, or voice in such media.



(3)        Generative artificial intelligence. – Artificial intelligence technology that is capable of creating content such as text, audio, image, or video based on patterns learned from large volumes of data rather than being explicitly programmed with rules. This definition does not include any form of artificial intelligence used only for spell‑check, grammar, or editing purposes.



(4)        Hallucination. – Output from a generative artificial intelligence system that appears plausible but is inaccurate, fabricated, or unsupported by the underlying data or sources, including all of the following:



a.         Invented statutes, regulations, cases, citations, quotes, or facts.



b.         Misstatements of existing statutes, regulations, or case holdings.



c.         Prerecorded videos or statements or live playback portraying a nonexistent person or factual scenario.



§ 7A‑811.  Judicial power to dismiss.



If any party to a case in a trial court uses any form of generative artificial intelligence in a court filing or appearance, regardless of whether a deepfake or hallucination is present, the trial judge may dismiss the case without prejudice.



§ 7A‑812.  Right to bring a new action or appeal; judicial discretion.



The party against whom a case has been dismissed pursuant to G.S. 7A‑811 may take either of the following actions:



(1)        Bring a new action based on the same claim in the same trial court, after the party removes any and all forms of generative artificial intelligence from the court filings and subject to the approval of the trial judge.



(2)        Appeal the dismissal to the Court of Appeals if the party believes the dismissal was in error. The Court of Appeals may reverse the trial judge's decision if it finds the party in fact did not use any form of generative artificial intelligence.



§ 7A‑813.  Second use of generative artificial intelligence.



(a)        A party permitted to bring a new action or continue an initial case pursuant to G.S. 7A‑812 shall not use any form of generative artificial intelligence, including previously used and newly created content.



(b)        Upon the discovery of a violation of subsection (a) of this section, the trial judge may dismiss the case with prejudice and take any of the following disciplinary actions:



(1)        Impose monetary sanctions, including payment of a fine to the court and reasonable attorneys' fees.



(2)        Refer the party to the North Carolina State Bar for disciplinary proceedings, if the party is an attorney.



(3)        Hold the party in contempt of court.



SECTION 5.1.(b)  This Part becomes effective July 1, 2026, and applies to all court documents filed and proceedings initiated on or after that date.



 



PART VI. APPROPRIATION



SECTION 6.1.  Effective July 1, 2026, there is appropriated from the General Fund to the Office of State Budget and Management the sum of one million dollars ($1,000,000) in the 2026‑2027 fiscal year to be allocated for implementation of this act upon application made by the affected State agency, department, or institution. These funds are not subject to reversion under the State Budget Act.



 



PART VII. EFFECTIVE DATE



SECTION 7.1.  Except as otherwise provided, this act is effective when it becomes law.