S747: AI Learning Agenda (Latest Version)

Session: 2025 - 2026

Senate
Passed 1st Reading
Rules


AN ACT establishing the Office of Artificial Intelligence Policy and creating the AI Learning Laboratory Program.



Whereas, artificial intelligence technologies are transforming economies, societies, and industries globally, requiring proactive exploration of their implications for North Carolina; and



Whereas, this act establishes an iterative, stakeholder‑informed learning model to support thoughtful, evidence‑based governance of AI technologies; Now, therefore,



The General Assembly of North Carolina enacts:



SECTION 1.  The General Assembly finds it essential to proactively explore artificial intelligence technologies through an inclusive and iterative learning model that promotes innovation while protecting consumer interests and civil rights.



SECTION 2.  Chapter 114 of the General Statutes is amended by creating a new Article to read:



Article 11.



Artificial Intelligence Learning Laboratory.



§ 114‑75.  Definitions.



The following definitions apply in this Article:



(1)        Applicant. – A person that applies for participation in the Learning Laboratory.



(2)        Learning Agenda. – The areas of artificial intelligence applications, risks, and policy considerations selected by the Office established by this act for focus by the Learning Laboratory.



(3)        Learning Laboratory. – The artificial intelligence analysis and research program created in this act.



(4)        Office. – The Office of Artificial Intelligence Policy created in this act.



(5)        Participant. – A person that is accepted to participate in the Learning Laboratory.



(6)        Regulatory mitigation. – When restitution to users may be required, (i) the terms and conditions related to any cure period before penalties may be assessed, (ii) any reduced civil fines during the participation term, and (iii) any other terms tailored to identified issues of the artificial intelligence technology.



(7)        Regulatory mitigation agreement. – An agreement between a participant, the Office, and relevant State agencies entered into under this act.



(8)        State agency or agency. –  A State agency, department, or institution in the executive or legislative branches of government or a political subdivision of the State.



§ 114‑76. Office of Artificial Intelligence Policy.



(a)        The Office of Artificial Intelligence Policy is created in the Department of Commerce. The Secretary of Commerce shall appoint a Director to oversee the management and operations of the office and adopt rules to carry out the purposes of this section.



(b)        The purposes of the Office are to do all of the following:



(1)       Identify regulatory barriers to artificial intelligence (AI) development, deployment, and use in North Carolina and recommend regulatory proposals to remove or avoid such barriers.



(2)        Identify regulatory gaps where existing law is insufficient to prevent or redress substantial, non‑speculative, concrete, and redressable harm and recommend regulatory proposals to fill such gaps.



(3)        Conduct an inventory of existing State regulation of AI technology.



(4)        Create and administer an Artificial Intelligence Learning Laboratory program.



(5)        Consult with businesses and other stakeholders in the State about potential regulatory proposals.



(6)        Consult with Learning Laboratories or similar bodies in other states.



(7)        Establish and convene a multidisciplinary AI Learning Advisory Panel composed of academic experts, industry representatives, legal scholars, and civil society organizations to provide input into the learning agenda and ongoing evaluations.



(c)        At a minimum, rules adopted under this Article shall concern the following:



(1)        Procedures, requirements, and fees to apply to participate in the Learning Laboratory program and criteria for invitation, acceptance, denial, or removal of participants.



(2)        Data usage limitations and cybersecurity criteria for participants.



(3)        Required participant disclosures to consumers.



(4)        Reporting requirements for participants to the Office.



(5)        Criteria for limited extension of the participation period.



(6)        Other requirements as necessary to administer the Learning Laboratory.



(c1)      The Office shall maintain a public registry of Learning Laboratory participants and publish summary reports of research findings, best practices, and policy recommendations, excluding proprietary or security‑sensitive information.



(d)       Beginning July 1, 2026, the Office shall report annually to the General Assembly about the following:



(1)        The proposed learning agenda for the Learning Laboratory.



(2)        The findings, participation, and outcomes of the Learning Laboratory.



(3)        Recommended legislation based on findings from the inventories and the Learning Laboratory.



(4)       A review of the effectiveness of the Learning Laboratory model, and whether any elements should be codified into permanent regulatory structures, expanded, or sunset.



§ 114‑77.  State AI inventory.



(a)        By October 1, 2026, each State agency shall compile, in a form specified by the Office, an inventory of all artificial intelligence technologies that are in use by the State agency or being developed or considered by the State agency for use. The inventory shall be submitted to the Office, the Secretary of Commerce, the Governor, the Speaker of the House of Representatives, the President Pro Tempore of the Senate, and the Chairs and Ranking Minority Members of the Joint Legislative Commission on Governmental Operations. By March 1, 2026, the Office may prescribe a form for use by State agencies for compilation and submission of the inventory required by this subsection. This inventory shall include the following information for each artificial intelligence technology included in the inventory:



(1)        The vendor of the artificial intelligence technology.



(2)       A description of the function and capabilities of the artificial intelligence technology.



(3)        A description of (i) the purpose or purposes for which the State agency uses the artificial intelligence technology; (ii) any purpose for which the agency contemplates using the artificial intelligence technology in the future; and (iii) examples of the data or information produced by the artificial intelligence technology for each purpose.



(4)        Whether the artificial intelligence technology provides the State agency with information or data that is used to (i) inform decisions made by the agency or (ii) implement decisions by the agency without human intervention.



(5)        The types of information or data used by the artificial intelligence technology and the source of the information used by the artificial intelligence technology.



(6)        The manner in which the State agency secures the following from unauthorized access:



a.         The artificial intelligence technology.



b.         Information or data used by the artificial intelligence technology.



c.         Information or data produced by the artificial intelligence technology.



(7)        Any person with which the State agency shares the information or data produced by the artificial intelligence technology and the purpose for which the State agency shares the information or data with the person.



(8)        The documented or anticipated benefits and risks of the State agency's use of the artificial intelligence technology for both the State agency and State residents served by the agency.



(9)        Any information or data used by the State agency to assess the benefits and risks of the agency's use of the artificial intelligence technology.



(10)      The fiscal effect of the State agency's use of the artificial intelligence technology, including the following:



a.         Costs associated with the artificial intelligence technology, including initial acquisition or development costs and ongoing operating costs, including costs of licensing, maintenance, legal compliance, and data storage and security.



b.         Any funding source that is used, or could be used, by the State agency to defray the costs described.



c.         An estimate of the degree to which the costs described are offset by a reduction in the State agency's operating costs attributable to the agency's use of the artificial intelligence technology.



(11)      Whether the artificial intelligence technology has been tested or evaluated by an independent third party.



(12)      Whether the data or information produced by the artificial intelligence technology has been evaluated for bias against constitutionally protected classes of individuals, found to exhibit bias, and adjusted to mitigate any such bias.



(b)        By January 1, 2027, the Office, in consultation with relevant State agencies, shall conduct a comprehensive analysis of the existing regulatory governance of artificial intelligence technology in the State, as follows:



(1)        Contents of Analysis. – The analysis conducted under this subsection shall include all of the following:



a.         A review of existing laws, regulations, executive orders, and State agency rulemaking that pertain to the regulation of the development and use of artificial intelligence technology within the State.



b.         An assessment of the specific category of artificial intelligence use governed by each existing law, regulation, executive order, and State agency rulemaking, and whether each achieves its purpose without impeding the development and use of artificial intelligence technology.



c.         An identification of any gaps where existing law, regulation, executive order, and State agency rulemaking are insufficient to prevent or redress substantial, non‑speculative, concrete, and redressable harm from a specific use of artificial intelligence technology.



d.         An identification of State agencies that possess statutory authority to regulate the development and use of artificial intelligence technology.



(2)        Submission of Analysis. – Upon completion of the analysis under this subsection, the Office shall submit the findings to the Secretary of Commerce, the Governor, the Speaker of the House of Representatives, the President Pro Tempore of the Senate, and the Chairs and Ranking Minority Members of the Joint Legislative Commission on Governmental Operations.



§ 114‑78.  AI Learning Laboratory Program.



(a)        There is established the Artificial Intelligence Learning Laboratory Program, to be administered by the Office. The purposes of the Learning Laboratory are to do all of the following:



(1)       Analyze and research the benefits, risks, impacts, and policy implications of artificial intelligence technologies to inform the State's regulatory framework.



(2)        Encourage development of artificial intelligence technologies in the State.



(3)        Evaluate the effectiveness and viability of current, potential, or proposed regulation of artificial intelligence technologies in cooperation with artificial intelligence developers.



(4)        Produce findings and recommendations for legislation and regulation of specific artificial intelligence uses.



(5)       Analyze how AI technologies affect individual rights, fairness, and the public interest, including impacts on protected classes and opportunities for bias mitigation.



(b)       The Office shall periodically set a learning agenda for the Learning Laboratory that establishes the specific areas of artificial intelligence policy the Office intends to study. The initial learning agenda shall include identifying specific categories of artificial intelligence uses with similar profiles of benefits, risks, impacts, and associated regulatory bodies. In establishing the learning agenda, the Office may consult with relevant agencies, industry leaders, academic institutions in the State, and key stakeholders with relevant knowledge, experience, or expertise in the area. The Office may invite and receive an application from a person to participate in the Learning Laboratory. The Office shall establish the procedures and requirements for sending an invitation and receiving applications to participate in the Learning Laboratory in accordance with the purposes of the Learning Laboratory. Open‑source projects shall be eligible for participation in the Learning Laboratory. In selecting participants for the Learning Laboratory, the Office shall consider each of the following:



(1)        The relevance and utility of an invitee's or applicant's artificial intelligence technology to the learning agenda.



(2)        The invitee's or applicant's expertise and knowledge specific to the learning agenda.



(3)        Other factors identified by the Office as relevant to participation in the Learning Laboratory.



(c)        The Office shall work with participants to establish benchmarks and assess outcomes of participation in the Learning Laboratory.



§ 114‑79.  AI Learning Laboratory participation.



(a)        The Office may approve an applicant to participate in the program. An approved applicant becomes a participant by entering into a participation agreement with the Office and relevant State agencies. Each participant shall provide required information to State agencies in accordance with the terms of the participation agreement and report to the Office as required in the participation agreement. A participant shall retain records as required by Office rule or the participation agreement. A participant shall immediately report to the Office any incident resulting in consumer harm, a privacy breach, or unauthorized data usage; such an incident may result in removal of the participant from the Learning Laboratory.



(b)        A participant who uses or intends to use an artificial intelligence technology in the State may apply for regulatory mitigation according to criteria and procedures outlined by the Office by rule. The Office may grant regulatory mitigation to a participant on a temporary basis through a regulatory mitigation agreement entered into by the participant, the Office, and relevant agencies. To receive regulatory mitigation, a participant must demonstrate that the participant meets the eligibility criteria established under this Article. Any regulatory mitigation agreement between a participant, the Office, and relevant agencies shall specify limitations on the scope of the use of the participant's artificial intelligence technology, including (i) the number and types of users, (ii) geographic limitations and other limitations on implementation, and (iii) safeguards to be implemented, as well as any regulatory mitigation granted to the participant. The Office shall consult with relevant agencies regarding appropriate terms in a regulatory mitigation agreement. A participant remains subject to all legal and regulatory requirements not expressly waived or modified by the terms of the regulatory mitigation agreement. The Office may remove a participant at any time and for any reason, and the participant does not have an expectation of a property right or license to participate in the Learning Laboratory.



(c)        A participant demonstrating an artificial intelligence technology that violates legal or regulatory requirements (considering any regulatory mitigation agreement), or that violates the terms of the participation agreement, may be immediately removed from further participation and subject to all applicable civil and criminal penalties.



(d)       Participation in the Learning Laboratory does not constitute an endorsement or approval from the State.



(e)        Participation or non‑participation in the Learning Laboratory does not constitute a legally cognizable factor for any tort claim, civil law violation, or criminal law violation.



(f)        The State shall not be responsible for any claims, liabilities, damages, losses, or expenses arising out of a participant's involvement in the Learning Laboratory.



§ 114‑80.  Regulatory mitigation eligibility.



(a)        To be eligible for regulatory mitigation, a participant shall demonstrate all of the following to the Office:



(1)        The participant has the technical expertise and capability to responsibly develop and test the proposed artificial intelligence technology.



(2)        The participant has sufficient financial resources to meet obligations during testing.



(3)        The artificial intelligence technology provides substantial potential consumer benefits that may outweigh identified risks from mitigated enforcement of regulations.



(4)        The participant has an effective plan to monitor and minimize identified risks from testing.



(5)        The scale, scope, and duration of the proposed testing are appropriately limited based on risk assessments.



(6)       For participants with fewer than 50 employees or under $10 million in annual revenue, the Office may adjust eligibility or reporting requirements to reflect organizational capacity, while ensuring appropriate risk controls.



(b)        To evaluate whether an applicant meets eligibility criteria to receive regulatory mitigation, the Office may consult with relevant agencies and outside experts regarding the application.



(c)        An initial regulatory mitigation agreement shall be in force for no longer than 12 months. A participant may request a single 12‑month extension of the participation period no later than 30 days before the end of the initial 12‑month period. The Office shall grant or deny an extension request before expiration of the initial 12‑month period.



SECTION 3.  This act is effective when it becomes law.