33328.5.
(a) For purposes of this section, the following definitions apply:
(1) “Educator” means a certificated or classified employee of a local educational agency or charter school.
(2) “Local educational agency” means a school district or county office of education.
(b) The Superintendent, in consultation with the state board, shall convene a working group for all of the following purposes:
(1) Identifying safe and effective uses of artificial intelligence in education settings.
(2) Developing guidance on the safe use of artificial intelligence in education.
(3) Developing a model policy for local educational agencies and charter schools regarding the safe and effective use of artificial intelligence in ways that benefit, and do not harm, pupils and educators.
(4) Identifying other ways in which the state can support educators in developing and sharing effective practices involving artificial intelligence that minimize risk and maximize benefits to pupils and educators.
(c) The working group shall include all of the following:
(1) Current, credentialed teachers serving in elementary and secondary teaching positions.
(2) Classified public school staff.
(3) Schoolsite administrators.
(4) School district or county office of education administrators.
(5) University and community college faculty.
(6) Representatives of private sector business or industry.
(7) Pupils enrolled in public school.
(d) The working group shall do all of the following:
(1) (A) Assess the current and future state of artificial intelligence use in education, including both of the following:
(i) The current state of artificial intelligence use by local educational agencies and charter schools, including all of the following:
(I) Technologies most commonly in use.
(II) The typical cost of those technologies.
(III) The ownership structure of those technologies.
(IV) The licensing agreements for those technologies.
(V) The ability to access source code for those technologies.
(VI) The degree to which educators were involved in the decision to use artificial intelligence.
(ii) Anticipated and potential developments in artificial intelligence technology in education.
(B) Conduct at least six public meetings to incorporate feedback from pupils, families, and relevant stakeholders into the assessment required by subparagraph (A).
(2) (A) Identify safe and effective uses of artificial intelligence in education settings, including all of the following:
(i) The ethical, legal, and data privacy implications of artificial intelligence use in education.
(ii) Uses of artificial intelligence to support teaching and learning, including which pupils may benefit from, and avoid harm from, artificial intelligence technology.
(iii) Uses of artificial intelligence to support the work of educators, including ways in which educators may benefit from, and avoid harm from, artificial intelligence technology.
(iv) Strategies to ensure equitable pupil access to the benefits of artificial intelligence technology.
(v) Strategies to provide effective professional development for educators on the use of artificial intelligence technology.
(B) In performing the work required by this subdivision, the working group shall solicit input from educators and pupils on their experience using the technologies identified in subparagraph (A).
(3) On or before January 1, 2026, develop guidance for local educational agencies and charter schools on the safe use of artificial intelligence in education that addresses all of the following:
(A) Academic integrity and plagiarism.
(B) Acceptable and unacceptable uses of artificial intelligence for pupils and educators.
(C) Pupil and teacher data privacy and data security.
(D) Parent and guardian access to information that pupils enter into artificial intelligence systems.
(E) Procurement of software that ensures the safety and privacy of pupils and educators, and the protection of their data.
(4) On or before July 1, 2026, develop a model policy for local educational agencies and charter schools regarding the safe and effective use of artificial intelligence in ways that benefit, and do not harm, pupils and educators. This policy shall include all of the following topics:
(A) Academic integrity and plagiarism.
(B) Acceptable and unacceptable uses of artificial intelligence for pupils and educators.
(C) Pupil and teacher data privacy and data security.
(D) Parent and guardian access to pupil information.
(E) Procurement of software that ensures the safety and privacy of pupils and educators and their data.
(F) Effective use of artificial intelligence to support, and avoid risk to, teaching and learning.
(G) Effective practices to support, and avoid risk to, educators.
(H) Strategies to ensure equitable access to the benefits of artificial intelligence technology.
(I) Professional development strategies for educators on the use of artificial intelligence.
(5) Identify other ways in which the state can support educators in developing and sharing effective practices that minimize risk and maximize benefits to pupils and educators, including, but not limited to, establishing communities of practice on the use of artificial intelligence in education.
(6) On or before September 1, 2026, submit a report to the appropriate policy and fiscal committees of the Legislature, the Legislative Analyst’s Office, the state board, and the Department of Finance, in compliance with Section 9795 of the Government Code, on the process and products of the working group in meeting the requirements of this section, and any related findings or recommendations.
(e) The department shall post on its internet website the guidance developed pursuant to paragraph (3) of subdivision (d) and the model policy for local educational agencies and charter schools developed pursuant to paragraph (4) of subdivision (d).
(f) Implementation of this act is contingent upon an appropriation by the Legislature for these purposes in the annual Budget Act or another statute.