Policy & Governance for Responsible AI Implementation

For administrators and school leaders, establishing a strong policy and governance framework is essential to the thoughtful implementation of artificial intelligence (AI). This framework ensures that AI integration aligns with both instructional and operational goals while upholding ethical standards and legal compliance.


School & District Policies

Districts and libraries should review and update existing policies to incorporate AI-specific language, ethical expectations, and data privacy protections. The following school and district policies are a good starting point.

Acceptable / Responsible Use Policy (AUP)

Outlines appropriate uses of internet and digital tools, emphasizing ethical expectations for AI interactions in schools, libraries, and classrooms.

Sample Language: “Users may not employ generative AI tools to produce content that violates academic integrity, shares personally identifiable information (PII), or circumvents learning. Educators must supervise AI use in classrooms, ensuring students understand ethical use expectations.”

Quick use matrix

Task Type               | Allowed                         | Required                    | Disallowed
Brainstorm/outline      | Yes (AI-assist)                 | Disclose tool & how used    | —
Grammar/clarity edits   | Yes                             | Track changes or note edits | —
High-stakes assessments | Human-only unless accommodation | Teacher approval            | Fully AI-generated submissions
Sensitive data (PII)    | No                              | —                           | Entering PII into prompts
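The matrix above is, in effect, a lookup table. A minimal sketch of how a district might encode it for a help-desk script or self-service checker (task names and rulings here are illustrative, not adopted policy):

```python
# Hypothetical encoding of the quick use matrix; rulings and task keys are
# illustrative and would come from the district's adopted AUP.
AUP_MATRIX = {
    "brainstorm_outline":     {"allowed": True,  "required": "Disclose tool & how used"},
    "grammar_edits":          {"allowed": True,  "required": "Track changes or note edits"},
    "high_stakes_assessment": {"allowed": False, "required": "Teacher approval for accommodations"},
    "sensitive_data_pii":     {"allowed": False, "required": None},
}

def check_task(task: str) -> str:
    """Return the AUP ruling for a task type, or escalate if unlisted."""
    rule = AUP_MATRIX.get(task)
    if rule is None:
        return "No rule on file; escalate to administration."
    if not rule["allowed"]:
        return "Not permitted without explicit approval."
    return f"Permitted; requirement: {rule['required']}"

print(check_task("grammar_edits"))
```

Encoding the matrix once and reusing it keeps classroom guidance, staff handbooks, and any tooling in sync with the board-approved policy.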
Procurement Standards and Ethics
  • Guide the ethical, compliant, and strategic acquisition of generative AI tools.
  • Align procurement decisions with the Blueprint for an AI Bill of Rights and privacy laws.
  • Include protocols for evaluating, upgrading, and renewing AI-enabled tools.
  • Vendors must notify districts when generative AI is added to existing products.
  • Address environmental impact by encouraging sustainable, energy-efficient tools.
  • Manage vendor relationships to ensure tools continue to meet district standards.

Sample Language: “Vendors must disclose if AI is embedded in their tools or platforms. Districts will prioritize procurement of tools that are Family Educational Rights and Privacy Act (FERPA) / Children’s Online Privacy Protection Act (COPPA) compliant, prohibit third-party data sharing, and allow full data deletion upon contract termination.”

Suggested Request for Proposal (RFP) prompts

  • Describe your model’s training data sources and any guardrails for bias and safety.
  • Confirm no use of our tenant’s data for model training; provide your Data Processing Agreement (DPA).
  • Detail accessibility conformance with Web Content Accessibility Guidelines (WCAG) 2.1 Level AA and provide an Accessibility Conformance Report (ACR).
  • Explain data retention/deletion timelines after contract end and breach response workflow.
Code of Conduct Policy

Defines standards of conduct for staff, students, and families, including expectations and consequences related to AI use.

Sample Language: “All staff and students must refrain from using AI-generated content to mislead, bully, impersonate others, or promote disinformation. Violations will be addressed using existing behavioral guidelines and disciplinary processes.”

  • Explicitly align to anti-bullying/harassment policies; include examples (deepfake memes, fake quotes, impersonation).
  • Create a simple reporting pathway for AI-related harm and misinformation.

Here is one comprehensive example from the Sheboygan Area School District.

Academic Integrity Policies
  • Clarify expectations for using AI in research, writing, and curriculum projects.
  • Include guidelines for citation, attribution, and documentation of AI-generated content.
  • Focus on educating students and staff about responsible AI use over relying solely on detection tools.

Sample Language: “Students must clearly identify when and how AI tools were used in completing assignments. Unauthorized use of AI for completing assessments or assignments will be considered a violation of academic integrity.”

Disclosure box (copy-ready):
  AI tool(s): ____
  Purpose (brainstorm / outline / edit / summarize): ____
  What I kept/changed: ____
  Verification steps (facts/sources): ____
  Date used: ____
Data Governance Policy

Ensures consistent, secure, and lawful management of data—covering storage, use, and sharing—while complying with federal and state law.

Sample Language: “The district will maintain an inventory of AI-enabled tools in use and review them annually. AI systems must undergo review to ensure they align with board-approved data classification, retention, and risk assessment protocols.”

  • Maintain a living “AI Tool Registry” with owner, purpose, data flows, and retention.
  • Require Privacy Impact Assessment (PIA) / Data Protection Impact Assessment (DPIA) for new tools.
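A living registry can be as simple as one structured record per tool plus an annual-review check. A minimal sketch, assuming hypothetical field names (`owner`, `data_flows`, `retention_months` are illustrative, not a mandated schema):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical "AI Tool Registry" entry; fields mirror the policy bullets above.
@dataclass
class AIToolEntry:
    name: str
    owner: str            # accountable staff role
    purpose: str
    data_flows: list      # e.g., ["usage logs -> vendor"]
    retention_months: int
    last_reviewed: date

def needs_review(entry: AIToolEntry, today: date) -> bool:
    """Flag entries not reviewed within the past year, per the annual-review policy."""
    return (today - entry.last_reviewed).days > 365

registry = [
    AIToolEntry("ExampleWriter", "Director of Technology", "drafting support",
                ["usage logs -> vendor"], 12, date(2023, 1, 15)),
]
overdue = [e.name for e in registry if needs_review(e, date(2025, 1, 1))]
print(overdue)
```

Even a spreadsheet works for this; the point is that every tool has a named owner, a stated purpose, documented data flows, and a review date that someone is accountable for.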
Data Privacy Policy

Specifies how student and staff personal data is collected, used, and shared in compliance with applicable privacy laws.

Sample Language: “AI applications used in the district must adhere to data minimization principles, collecting only what is essential. All AI tools must demonstrate compliance with FERPA, COPPA, and Wisconsin state privacy regulations.”

  • Use role-based access; log access to sensitive records.
  • Ensure encryption in transit (TLS 1.2+) and at rest (e.g., AES-256).
Data Loss Notification Policy

Outlines procedures for notifying stakeholders in the event of a data breach, including timing, content, and responsible roles.

Sample Language: “In the event of an AI-related data breach or incident, parents/guardians, staff, and relevant agencies must be notified within 72 hours. Incident response protocols must include AI system risk assessment.”

Timeline  | Action                                                               | Owner
0–24 hrs  | Containment, log preservation, initial assessment                    | IT/Security
24–48 hrs | Legal review, draft notifications, regulator consult (as applicable) | Legal
≤72 hrs   | Notify affected parties; publish support steps                       | Comms/Admin
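The phase deadlines above are simple offsets from the incident start time, which makes them easy to generate automatically for an incident ticket. A hedged sketch (the phase boundaries mirror the timeline above; they are policy choices, not statutory text):

```python
from datetime import datetime, timedelta

# Illustrative 72-hour notification timeline; offsets and wording track the
# policy table above and should be adjusted to the district's adopted plan.
PHASES = [
    (timedelta(hours=24), "Containment, log preservation, initial assessment"),
    (timedelta(hours=48), "Legal review, draft notifications, regulator consult"),
    (timedelta(hours=72), "Notify affected parties; publish support steps"),
]

def phase_deadlines(incident_start: datetime):
    """Return (deadline, action) pairs for each response phase."""
    return [(incident_start + offset, action) for offset, action in PHASES]

start = datetime(2025, 3, 1, 9, 0)
for deadline, action in phase_deadlines(start):
    print(deadline.isoformat(), "-", action)
```

Pre-computing the deadlines removes ambiguity during an incident, when the 72-hour clock is already running.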
Ethical Considerations Policy

Confirms that AI use must uphold student rights and avoid unintentional harm or inequity.

Sample Language: “District staff must consider bias, transparency, and human oversight when adopting or using AI tools. AI must not be used to make high-stakes decisions (e.g., grading, discipline) without human review.”

  • Adopt a Human–AI–Human (H–AI–H) workflow: humans set intent, AI assists, humans evaluate/decide (see Stanford's "How technology is reinventing education").
  • Publish a “Bias Review” quick check (representation, sources, impact, mitigations)—example below in “Bias Prevention & Equity.”
Digital Citizenship Standards & Curriculum

Prepares students to engage critically with AI, building media literacy, digital literacy, and responsible online behavior.

Sample Language: “Students will receive age-appropriate instruction on ethical AI use, algorithmic bias, and digital decision-making. AI literacy will be integrated into digital citizenship lessons across grade levels.”


Legal & Data Privacy Considerations

Federal Regulations
Wisconsin Statutes
Vendor Contractual Obligations
  • Vendors must not use student/library user data to train AI models without consent.
  • Require full data deletion upon contract termination.
  • Prohibit third-party data sharing unless explicitly authorized.
  • All contracts must include specific terms around generative AI use.

Clause examples (copy-ready)

Topic          | Sample Clause
Model training | “Provider shall not use District Data to train, fine-tune, or otherwise improve models, except where explicitly authorized in writing by the District.”
Deletion       | “Within 30 days of contract end, Provider will permanently delete District Data and certify deletion in writing.”
Security       | “Data must be encrypted in transit (TLS 1.2+) and at rest (AES-256 or stronger). Provider will maintain SOC 2 Type II or equivalent controls.”
Student Data Privacy Consortium (SDPC)

Leaders should explore resources such as the National Data Privacy Agreement (NDPA) and Global Education Security Standards (GESS) to streamline contracts and protect student data.

Human-Centered AI Approach (H–AI–H)

Use AI as a complement to—not replacement for—human thinking. Encourage workflows where humans define the task and criteria, AI assists, and humans evaluate the output before implementation.

  • Scheduling: AI drafts options; principals decide with equity and safety in mind.
  • Family communications: AI drafts bilingual notices; staff review for accuracy and tone before sending.
Bias Prevention and Equity
  • AI tools may reflect societal or systemic bias—leaders must address and mitigate this risk.
  • Vendors should demonstrate a commitment to algorithmic fairness (testing on representative datasets, publishing model cards).
  • Ethical AI use includes fostering equity and discussing systemic fairness with staff and students.
Bias review quick check: Who is represented? Which sources? What harms could result? What mitigations are in place?
Data Encryption and Storage

Ensure secure data storage and encryption protocols to protect sensitive user information.

  • Encryption at rest (e.g., AES-256) and in transit (TLS 1.2+).
  • Backups encrypted; documented retention and disposal schedules.
  • Role-based access; audit logs for sensitive records.
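Enforcing the in-transit floor is typically a one-line configuration wherever district systems call vendor APIs. A minimal sketch using Python's standard `ssl` module (at-rest AES-256 is configured on the storage side and is not shown here):

```python
import ssl

# Refuse TLS 1.0/1.1 handshakes by setting TLS 1.2 as the minimum version,
# matching the "TLS 1.2+" requirement in the encryption policy above.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```

A context configured this way can be passed to `urllib.request.urlopen` or an HTTPS client so that connections to non-compliant endpoints fail rather than silently downgrade.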
Transparency and Consent

Users should be informed about how their data is used and have the opportunity to provide informed consent.

  • Publish plain-language privacy notices and parent letters describing data flows and purposes.
  • Offer an opt-out for model training where feasible; provide non-AI alternatives.
Accessibility & Language Access (WCAG)

Ensure tools and content meet WCAG 2.1 Level AA and provide language access (translations, interpreter info) in family communications.

Digital Equity and Access Policies
  • Ensure all stakeholders—regardless of background or ability—have equitable access to generative AI tools.
  • Improve infrastructure (devices, bandwidth, IT support) and provide non-AI alternatives when needed.
  • Support multilingual learners and students with disabilities via captions, alt text, and assistive technologies.


For questions about this information, contact Amanda Albrecht, (608) 267-1071, or Amy Bires, (608) 266-3851.