AI & K-12 Glossary (Jargon + Acronyms)
Quick definitions for terms used across DPI’s AI Guidance. Use your browser’s Find (Ctrl/Cmd+F) to jump to a term, or click a letter below.
A
Academic Integrity — Upholding honesty and original work when using AI (proper attribution, citations, allowed uses).
Why it matters: protects learning and fairness.
Adaptive Learning — Software that adjusts content or pacing based on learner data.
Why it matters: supports personalization and UDL.
AI (Artificial Intelligence) — Computer systems that perform tasks typically requiring human intelligence (recognizing patterns, generating text, etc.).
AI Literacy — Knowledge, skills, and dispositions to use, question, and create with AI ethically and effectively.
Why it matters: foundation of DPI’s guidance and the AI Literacy Framework.
Anonymization / De-identification — Removing direct identifiers from data; true anonymization is difficult because individuals can often be re-identified by combining the data points that remain.
Why it matters: protects PII when using AI tools.
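As a toy illustration of removing direct identifiers (a sketch only; real de-identification requires far more than pattern matching, which is why true anonymization is hard), the hypothetical `redact` function below masks email addresses and student-ID-like numbers:

```python
import re

def redact(text):
    """Mask obvious direct identifiers (toy sketch; NOT sufficient on its own)."""
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)  # email addresses
    text = re.sub(r"\b\d{6,9}\b", "[ID]", text)                 # student-ID-like numbers
    return text

note = "Contact jsmith@example.org about student 1234567."
print(redact(note))  # identifiers replaced with [EMAIL] and [ID]
```

Indirect identifiers (birth date, school, grade level in combination) would survive this pass, which is the core limitation the definition points to.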
Attribution — Crediting AI tools when they contribute to a product (and citing sources used by the model when appropriate).
Why it matters: transparency & integrity.
B
BadgerLink — Wisconsin’s statewide portal to licensed online resources (articles, reference sources).
Why it matters: trusted sources to verify AI outputs.
Bias (Algorithmic) — Systematic, unfair outcomes in model training or outputs (e.g., stereotypes, exclusion).
Why it matters: equity & legal considerations.
C
CESA — Cooperative Educational Service Agency (WI regional support organizations).
Why it matters: training & implementation partners.
Chatbot — Conversational AI interface (text or voice) that answers questions or completes tasks.
Context Window — The amount of text (tokens) a model can consider at once.
Why it matters: limits document length for analysis.
COPPA / PPRA — U.S. federal privacy laws: Children’s Online Privacy Protection Act (online collection of data from children under 13) and Protection of Pupil Rights Amendment (student surveys and certain instructional materials).
Why it matters: parent consent & student data protections.
D
Differential Privacy — Adding statistical noise so individual records can’t be identified.
Why it matters: safer analytics on student data.
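To make the "statistical noise" idea concrete, here is a minimal sketch (not a production mechanism) of the classic Laplace approach: a count is released with noise scaled to sensitivity divided by the privacy parameter epsilon.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverse-CDF: U ~ Uniform(-0.5, 0.5)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with noise scaled to sensitivity/epsilon.

    Counting queries have sensitivity 1: adding or removing one
    student's record changes the true count by at most 1.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# E.g., report "how many students used the tutoring tool" with noise:
print(dp_count(100, epsilon=0.5))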
Digital Durability — The idea that data and AI outputs persist and can be reused, resurfaced, or misused later.
Why it matters: teach students long-term consequences.
E
Embedding — Vector representation of text or images that captures meaning for search or retrieval.
Why it matters: powers RAG and semantic search.
Ethical Use — Using AI with fairness, transparency, privacy, safety, and accountability.
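Embeddings capture meaning because similar texts map to nearby vectors, and "nearby" is usually measured with cosine similarity. A minimal sketch with made-up 3-dimensional vectors (real embedding models produce hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical embeddings for three words:
homework   = [0.8, 0.1, 0.1]
assignment = [0.7, 0.2, 0.1]
volcano    = [0.1, 0.1, 0.9]

print(cosine_similarity(homework, assignment))  # close to 1
print(cosine_similarity(homework, volcano))     # much lower
```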
F
FERPA — Family Educational Rights and Privacy Act.
Why it matters: governs disclosure of student education records.
Fine-Tuning — Training an existing model further on your examples to specialize behavior.
G
Generative AI — Models that create new content (text, images, audio, code) from prompts.
G&T (Gifted & Talented) — Programs for advanced learners; in this guidance, also “Academic Support Faculty.”
H
Hallucination — Confident but incorrect or fabricated AI output.
Why it matters: teach verification habits.
H → AI → H (Human-in-the-Loop) — Human designs the task → AI drafts → Human reviews, revises, and owns final product.
HIPAA — Health Insurance Portability and Accountability Act (health data privacy; see School Nursing notes).
I
IEP — Individualized Education Program for students with disabilities.
Inference — A model generating outputs from inputs after training (the “use” phase).
ITL Standards — Wisconsin’s Information & Technology Literacy Standards (2025 revision).
J
Jailbreak — Techniques used to make a model ignore its safety rules; not appropriate for K-12 use.
K
K-12 GenAI Maturity Tool — Self-assessment rubric for district readiness and responsible AI implementation.
L
LEA / SEA — Local Education Agency (district) / State Education Agency (DPI).
LLM (Large Language Model) — A model trained on massive text to generate and reason over language.
LMS / SIS — Learning Management System (e.g., Canvas) / Student Information System (e.g., Infinite Campus).
M
Model Card — A transparency document describing how a model was built, tested, and its limitations.
Why it matters: helps vet tools.
MTSS — Multi-Tiered System of Supports (tiered academic/behavioral supports).
N
NIST AI RMF — U.S. National Institute of Standards and Technology’s AI Risk Management Framework.
NLP — Natural Language Processing (AI working with human language).
O
Opt-Out / Consent — District procedures letting families control student data sharing with vendors and tools.
P
PII (Personally Identifiable Information) — Data that can identify a student (name, ID, image, voice, etc.).
Why it matters: never paste PII into public AI tools.
Prompt / Prompt Engineering — The instructions you give an AI model (and techniques to improve results).
Q
Quality Review — The human evaluation step for accuracy, bias, privacy, and age-appropriateness (see DPI AI Output Review Checklist).
R
RAG (Retrieval-Augmented Generation) — Technique where the model looks up documents first, then writes using those sources.
Why it matters: reduces hallucinations; enables local content.
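The retrieve-then-write shape can be sketched in a few lines. This toy version uses keyword overlap where a real system would use embedding search, and returns the retrieved text where a real system would prompt an LLM with it:

```python
import re

def tokenize(text):
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question, documents, k=1):
    """Rank documents by word overlap with the question (stand-in for embedding search)."""
    q = tokenize(question)
    return sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)[:k]

def answer_with_sources(question, documents):
    """The RAG shape: look up documents first, then generate *from* them."""
    context = " ".join(retrieve(question, documents))
    # A real system would now prompt an LLM with `context`; here we just return it.
    return f"Based on: {context}"

handbook = [
    "Attendance policy: students must arrive by 8:00 am.",
    "Lunch menu: pizza is served on Fridays.",
]
print(answer_with_sources("What is the attendance policy?", handbook))
```

Because the answer is grounded in retrieved local documents (a handbook, a curriculum guide), the model has less room to fabricate, which is why RAG reduces hallucinations.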
Red Teaming — Testing a system to find safety, bias, or privacy failures before public use.
S
SEL — Social-Emotional Learning (skills for self-awareness, self-management, relationships, decision-making).
System Prompt — Hidden instructions that set a model’s role/behavior in a tool (“You are a student writing coach…”).
T
Temperature — Controls randomness/creativity in text generation (lower = more predictable).
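Under the hood, temperature divides the model's raw scores (logits) before they are turned into probabilities: low temperature sharpens the distribution toward the top choice, high temperature flattens it. A minimal sketch:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores to probabilities, scaled by temperature."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
cold = softmax_with_temperature(logits, 0.2)  # nearly deterministic
hot = softmax_with_temperature(logits, 2.0)   # closer to uniform
print(cold)
print(hot)
```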
Token — A chunk of text a model reads/writes (≈ 3–4 characters or 0.75 words on average).
Why it matters: affects limits & cost.
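Real tokenizers split text by learned subword rules, but the rough characters-divided-by-four rule of thumb from the definition is easy to sketch:

```python
def estimate_tokens(text):
    """Rough token estimate using the ~4 characters per token rule of thumb."""
    return max(1, round(len(text) / 4))

essay = "The water cycle moves water between the oceans, air, and land."
print(estimate_tokens(essay))
```

Estimates like this help explain why a long document may not fit in a model's context window, and why token counts drive usage costs.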
U
UDL (Universal Design for Learning) — Designing learning that is accessible and flexible from the start.
V
Vector Database — Specialized database for embeddings, used to power semantic search and RAG.
W
Watermarking (AI) — Hidden or visible markers to signal AI-generated content.
WDLC — Wisconsin Digital Learning Collaborative (statewide online/virtual learning partnership).
Wisconsin DPI — Department of Public Instruction (SEA for WI; this guidance site).
WISELearn — Wisconsin’s OER hub for educator-curated resources (collections include AI PD and classroom tools).
For questions about this information, contact Amanda Albrecht (608) 267-1071, Amy Bires (608) 266-3851