Readiness Self-Assessment

K–12 Gen AI Maturity Tool

The K–12 Gen AI (Generative Artificial Intelligence) Maturity Tool from CoSN (Consortium for School Networking) and CGCS (Council of the Great City Schools) is a comprehensive self-assessment resource designed to help school districts evaluate their readiness for implementing Generative AI. It supports district leaders in identifying strengths and gaps across key readiness areas, laying the groundwork for thoughtful and secure adoption aligned with educational goals.

Purpose & Structure

Purpose

This tool guides districts through a structured self-evaluation, helping them:

  • Assess their maturity in Generative AI implementation
  • Identify focus areas for growth and investment
  • Align AI use with instructional and operational goals
  • Build upon foundational steps introduced in the DPI AI Guidance & Roadmap (see planning checklist)
Tip: Pair the maturity tool with a short stakeholder survey (students, families, educators) to validate priorities before setting timelines.

Structure and Maturity Rubric

  • Organizes readiness into seven major domains with clearly defined subdomains
  • Uses a three-level maturity rubric—Emerging, Developing, and Mature—to guide self-assessment and progression

Rubric snapshot (examples)

  • Emerging: Pilots with ad-hoc guidance; limited staff training; minimal documentation. Evidence you can collect: list of pilots; draft policy notes; 1–2 PD (professional development) sessions.
  • Developing: Documented policies; cross-functional team; growing PD; initial metrics. Evidence you can collect: approved AUP addendum; AI tool registry; PD agendas; baseline KPI dashboard.
  • Mature: AI integrated into strategy and curriculum; continuous improvement cycle; clear KPIs. Evidence you can collect: annual review minutes; updated vendor DPAs; accessibility/equity audits; outcomes data.
Maturity Levels

  • Emerging: Low awareness and limited infrastructure; early experimentation with minimal planning or evaluation.
  • Developing: Growing awareness, increasing investment, evolving governance and training; AI use is becoming regular.
  • Mature: Deep AI expertise, full integration into educational and operational processes; dynamic policies and continuous evaluation in place.

Note: “Mature” ≠ “finished.” Districts revisit policies, equity reviews, and training as tools and laws evolve.
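The three-level roll-up can be sketched as a small scoring helper. This is an illustrative sketch only: the 1–3 numeric scale, the domain score dictionary, and the function names are assumptions, not part of the CoSN/CGCS tool.

```python
# Illustrative sketch: map per-domain self-assessment scores (1-3, assumed
# scale) onto the rubric's three levels and surface focus areas.

LEVELS = {1: "Emerging", 2: "Developing", 3: "Mature"}

DOMAINS = [
    "Executive Leadership", "Operational", "Data", "Technical",
    "Security", "Legal/Risk", "Academic AI Literacy",
]

def summarize(scores):
    """Translate each domain's numeric score into its rubric level."""
    return {domain: LEVELS[scores[domain]] for domain in DOMAINS}

def focus_areas(scores):
    """Domains still at 'Emerging': candidates for growth and investment."""
    return [domain for domain in DOMAINS if scores[domain] == 1]

# Example: a district strong on academics but early on security.
scores = {domain: 2 for domain in DOMAINS}
scores["Security"] = 1
scores["Academic AI Literacy"] = 3
```

A summary like this is only a starting point; the full tool's subdomains and evidence prompts drive the actual conversation.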

Seven Major Domains of Gen AI Readiness

1. Executive Leadership Readiness

Evaluates leadership’s ability to develop and sustain strategic AI initiatives. Areas include:

  • Vision and alignment with district goals
  • Policy and legislative awareness
  • Equity, ethics, and oversight responsibilities

Examples

  • Adopt a board-approved AI vision statement and success metrics tied to the Strategic Plan.
  • Convene a cross-functional AI Steering Committee (curriculum, IT, SPED, ELL, legal, HR, library/media, family engagement).
  • Publish a one-page “What AI Means for Our District” for families (plain language, multiple languages).

2. Operational Readiness

Focuses on organizational structures and staffing needed for effective Gen AI implementation:

  • Staffing models and capacity building
  • Procurement processes for AI tools
  • Cross-departmental coordination

Examples

  • Designate product owners for each AI tool; keep a living AI Tool Registry with purpose, data flows, and renewal dates.
  • Use RFP (Request for Proposal) prompts that require a VPAT/ACR (Voluntary Product Accessibility Template / Accessibility Conformance Report) and NDPA (National Data Privacy Agreement).
  • Set a 90-day pilot → feedback → approval/retire decision cycle.
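The "living AI Tool Registry" above can be kept as plain data and queried against the review cycle. A minimal sketch, assuming the field names (owner, purpose, data_flows, renewal_date) and the hypothetical product entry; the 90-day window echoes the pilot cycle:

```python
# Illustrative sketch of a living AI Tool Registry kept as plain data.
# Field names and the example entry are assumptions, not a schema
# prescribed by the maturity tool.
from datetime import date, timedelta

registry = [
    {
        "tool": "Example Writing Assistant",  # hypothetical product
        "owner": "Curriculum & Instruction",
        "purpose": "Draft feedback on student writing",
        "data_flows": ["student text (PII redacted)", "teacher comments"],
        "renewal_date": date(2025, 8, 1),
    },
]

def renewals_due(registry, today, window_days=90):
    """List tools whose contracts renew within the review window."""
    cutoff = today + timedelta(days=window_days)
    return [entry["tool"] for entry in registry
            if entry["renewal_date"] <= cutoff]
```

Keeping the registry in a shared, versioned format (spreadsheet, database, or flat file) lets product owners update purpose and data-flow entries as tools change.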

3. Data Readiness

Assesses systems for managing data with integrity and compliance:

  • Data governance, quality, and privacy protocols
  • IAM (Identity and Access Management) & SSO (Single Sign-On)
  • Security safeguards and system monitoring

Examples

  • Implement role-based access and periodic access reviews for AI tools.
  • Adopt data minimization: redact PII (Personally Identifiable Information) in prompts and uploads.
  • Create a Data Catalog with owners, retention, and lawful basis for processing.
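The data-minimization step (redacting PII in prompts and uploads) can be sketched as a pre-send filter. These regexes are simple assumptions for illustration, not a complete filter; real deployments would use a vetted DLP (data loss prevention) service:

```python
# Illustrative sketch: redact obvious PII patterns from a prompt before it
# leaves the district. Patterns are assumptions and intentionally minimal.
import re

PATTERNS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
]

def redact(prompt):
    """Replace matched identifiers with placeholder tokens."""
    for pattern, token in PATTERNS:
        prompt = pattern.sub(token, prompt)
    return prompt
```

Pairing a filter like this with staff training ("do not paste PII") gives a technical backstop to the policy rule.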

4. Technical Readiness

Ensures infrastructure supports Gen AI with reliability and accountability:

  • Technology architecture and system compatibility
  • Monitoring and response systems for misuse
  • Management of “hallucinations” (confident but incorrect AI output) or inappropriate content

Examples

  • Enable content filtering, logging, and safe-search for AI features in browsers and LMS (Learning Management System).
  • Adopt a ticketing tag (e.g., “AI-Issue”) to track patterns and remediation steps.
  • Pilot on-prem / private-tenant AI where needed for sensitive workflows.
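The "AI-Issue" ticketing tag above is meant to surface patterns; a minimal sketch of that roll-up, assuming hypothetical ticket records and sub-category tag names:

```python
# Illustrative sketch: tally "AI-Issue" tickets by sub-category tag to
# spot patterns (hallucinations, inappropriate content, etc.).
# Ticket fields and tag names here are assumptions.
from collections import Counter

tickets = [
    {"tags": ["AI-Issue", "hallucination"], "summary": "Chatbot cited a fake source"},
    {"tags": ["AI-Issue", "content-filter"], "summary": "Inappropriate image output"},
    {"tags": ["password-reset"], "summary": "Staff login help"},
    {"tags": ["AI-Issue", "hallucination"], "summary": "Wrong historical dates"},
]

def ai_issue_breakdown(tickets):
    """Count AI-Issue tickets by their other (sub-category) tags."""
    counts = Counter()
    for ticket in tickets:
        if "AI-Issue" in ticket["tags"]:
            counts.update(tag for tag in ticket["tags"] if tag != "AI-Issue")
    return counts
```

Reviewing these counts at a regular cadence helps decide where filtering, training, or tool retirement is the right remediation.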

5. Security Readiness

Evaluates practices that protect systems and users:

  • Cybersecurity frameworks and policies
  • Ongoing staff training and security awareness
  • Incident response and threat mitigation planning

Examples

  • Require MFA (Multi-Factor Authentication) for staff/admin roles on AI tools.
  • Run quarterly phishing simulations and AI-prompt safety training (do not paste PII).
  • Maintain an IRP (Incident Response Plan) with 24- and 72-hour communication templates.

6. Legal/Risk Readiness

Prepares districts to manage legal obligations and risk exposure:

  • Risk assessments and mitigation strategies
  • Compliance with legal standards
  • Protocols for breach/loss notification and remediation

Examples

  • Use DPIA (Data Protection Impact Assessment) or PIA (Privacy Impact Assessment) for new AI tools.
  • Ensure contracts include a DPA (Data Processing Agreement), data-deletion timelines, and breach terms.
  • Publish a plain-language AI FAQ for families, with translation options.

7. Academic AI Literacy Readiness

Supports AI integration into teaching and learning:

  • Curriculum design and instructional materials
  • Teacher professional development and support
  • Ethical use and evaluation of AI in academic settings
  • Accessibility, equity, and automation for instructional improvement

Examples

  • Map AI skills to ELA/Science/Social Studies standards; add “Human → AI → Human” checkpoints (student drafts → AI feedback → student revision).
  • Create brief disclosure prompts students paste at the end of work (tool, purpose, what they kept/changed, verification steps).
  • Ensure WCAG-aligned materials (captions, alt text, plain-language summaries) for AI-assisted lessons.

For a full version of the tool and all subdomains—including detailed evidence, examples, and next steps—refer to the K–12 Gen AI Maturity Tool (conference edition) and the current CoSN/CGCS PDF.



For questions about this information, contact Amanda Albrecht, (608) 267-1071, or Amy Bires, (608) 266-3851.