
Vulnerable Decision Point 4: Interpreting Data and Information

Introduction

After compiling existing data and collecting any additional assessment data and other information as part of a comprehensive special education evaluation, and in addition to taking overt steps to reduce confirmation bias, IEP team participants can also take specific steps to reduce attribution bias. When IEP team participants take a piece of information about a student and use it to over-explain the student's performance, attribution bias is likely to occur.

Untrue, Unfounded, and Unalterable Attribution Bias

Attribution bias occurs when IEP team participants take some data and use it to inappropriately or inaccurately explain (interpret) the student's behavior or performance. Attribution bias can take many forms, but three forms commonly observed on school teams are untrue attributions, unfounded attributions, and unalterable attributions.

Untrue Attributions

For untrue attributions, IEP team participants can state that the attribution is untrue (e.g. poverty does not cause behavioral or academic difficulties); therefore, as data-based decision-makers, we cannot use that information to guide our assessment procedures.

Unfounded Attributions

For unfounded attributions (e.g. a student is having reading difficulties because the parents do not care about education), IEP team participants can state that we do not have evidence or data to support that view; therefore, we cannot use that information to guide our assessment process.

Unalterable Attributions

For unalterable attributions (e.g. parents are incarcerated), IEP team participants can state that this is a factor that we cannot change; therefore, let’s focus on alterable factors that we can change to ensure that the student is college and career ready.

Interpreting Data and Information

Special education evaluation teams review and analyze a significant amount of data, but gathering and reviewing the data is only the first step. Teams must also accurately analyze the data to make appropriate decisions. One of the primary challenges to accurately analyzing these data is bias (Newell & Newell 2011); therefore, interpreting existing or new assessment data and other information, including information provided by the parent, is another vulnerable decision point. As explained in the Guide for Problem-Solving Teams (2017), there are four manifestations of bias that can occur when interpreting data:

  1. relying on stereotypes and prejudice and ignoring data,
  2. weighting data from school-based professionals more heavily than data from parents, guardians, or students,
  3. taking a deficit-based view of the data, and
  4. not seeking convergence across multiple sources of data.

Relying on Stereotypes and Prejudice Based on Identity instead of Data

During a comprehensive special education evaluation, IEP teams collect multidimensional data from multiple perspectives in order to reduce bias (Newell & Newell 2011). However, if IEP team participants ignore that data and instead rely on their own prejudices and stereotypes about the student's identity, the interpretation of the data can be biased. Oftentimes, when the data point to ways in which educational systems have failed students, especially students who have been historically marginalized, teams may turn away from that data and instead rely on prejudices or stereotypes to blame the student.

For example, the multidimensional data could indicate that a student who lives in poverty has not received appropriate instruction in reading because their reading teachers were substitutes who were not appropriately qualified. Upon review and analysis of all the data, it may become clear that a lack of appropriate instruction, not a disability, is the determining factor in the student's reading comprehension difficulties. However, the team may ignore the data indicating the missed instruction, focus instead on the unrelated fact that the student lives in poverty, and attribute the student's below-grade-level reading comprehension to poverty. With that interpretation, the team is likely to identify the student as having a disability instead of recognizing that the student's reading difficulties are due to a lack of appropriate instruction, which is an exclusionary factor.

Similarly, a student might have limited English proficiency due to the lack of bilingual and other multilingual learner supports in the school, but the team might ignore this information (which would be another exclusionary factor) and instead identify the student as having a disability. It can be difficult to hold educational systems accountable for failing to meet the needs of students who are marginalized; however, the purpose of exclusionary factors is to remedy this issue by acknowledging that educational systems can fail to educate students who are marginalized and that these students should not be further marginalized by being identified as having a disability when a lack of access to quality education has created the problem.

Weighting Some Data More Heavily than Other Types or Sources of Data

When using multidimensional data, teams will compile or collect multiple types of data from multiple sources. However, a team may weigh some types and sources more heavily than others due to bias. For example, a team may weigh test data more heavily than classroom work products, or teacher reports more heavily than parent or student reports. Unless there is a valid reason to dismiss a type or source of data, all data should be considered equally; bias should not determine how heavily teams weigh some data relative to others.

Taking a Deficit-based View of the Data

Teams may sometimes review data and see only what the student cannot do, what the student does not know, or what is wrong with the student and family. When this occurs, bias is leading to a deficit view of the student, and the student's difficulties are viewed as within-person, pathological, and unalterable. As a result, teams may conclude that there is nothing they can do to improve the student's performance. Instead, teams should take a strengths-based approach to data interpretation, also identifying what the student can do, what the student does know, and the assets and strengths the student brings to learning activities and environments.

Relying on One Type of Data instead of Seeking Convergence and Triangulation across Multiple Types and Sources of Data

When analyzing multidimensional data, teams should not rely on one source or type of data (e.g. teacher interview). Instead, teams should look across all data to identify points of convergence, which occur when multiple pieces of data point to the same problem and hypothesis. If the data do not converge, then the team should ask additional educationally relevant questions to re-evaluate the problem, re-evaluate the hypothesis, or, when applicable, request consent from the parent to collect additional data.

Strategies to Interrupt Bias

Given these vulnerabilities, teams should ask themselves, “Did we consider all the data and identify convergence to verify the problem and confirm the hypothesis?” To achieve this goal, teams can use the following strategies.

Strategy 1: Ask Questions When Interpreting Data

To reduce bias when interpreting data, the team can ask the following questions during and after the discussion of data:

  • Did we equally consider all the data? If not, what data did we not fully consider and why?
  • Did the data tell us what the student can do as well as student strengths? If not, go back and review the data or gather additional data.
  • Did the data converge to confirm the problem?
  • Did the data converge to confirm why the problem is occurring?
  • Did the data tell us what the student’s disability-related needs are so that we can develop IEP goals and align college and career ready IEP services?

If the team is unable to answer these questions, then it should revisit the data collection process so that all of these questions can be clearly answered and documented in the review and interpretation of the data. There is no one answer to these questions; rather, they are designed to guide the team through a reflective process that can illuminate what was, and was not, learned from the data. Moreover, this process keeps the team's decisions grounded in data rather than in bias.

Reflection and Application Activities

The following reflection and application activities were developed to build the knowledge and skills of adults so they can develop better systems for conducting comprehensive special education evaluations.

  1. Look at the four manifestations of bias that can occur when interpreting data, listed earlier in this section.

    • What do statements reflecting these types of bias look and sound like (e.g. what experiences have you had with these types of bias)?

    • Why is it important to redirect statements of bias when educational decisions about students are being made?

    • What processes or protocols (e.g. IEP meeting norms or agreements) can be or have been developed to address statements of bias, in a direct and professional manner, during IEP team meetings?

  2. What processes does the school or district have to ensure multiple perspectives about the student, including the student's own, are heard and discussed when interpreting data and making educational decisions at IEP team meetings?

  3. Explore how school staff value and interpret different types of academic information about the student.

    • When interpreting academic assessments, data, and other information, what process does the school or district have, or could have, to ensure staff are consistent in how they value and interpret the relevancy of certain academic data (e.g. homework, tests, assessments)?

    • Why is it important for school staff to have similar or the same values and expectations related to interpreting and making decisions about academic data and information?

    • What are the unintended consequences of staff having different values or interpretations about academic information when making decisions about special education eligibility and services?

  4. Explore how school staff value and interpret different types of functional information about the student (e.g. social and emotional learning, medical and health, independence, communication).

    • When interpreting functional assessments, data, and other information, what process does the school or district have, or could have, to ensure staff are consistent in how they value and interpret the relevancy of certain functional data (e.g. attendance, observable behavior, independence)?

    • How do different views on “behavioral” expectations lead to different interpretations of functional information discussed and interpreted at IEP team meetings?

    • What are the unintended consequences of staff having different values or interpretations about functional or behavioral information when making decisions about special education eligibility and services?

  5. Engage school and district staff in exploring different values, expectations, and mindsets about “student behavior” by reviewing resources from Wisconsin DPI’s “Inclusive Strategies to Address Behavioral Needs of Students with IEPs.”