
Resource Library

Guides. Briefs. Toolkits. Quick reference information. IDC and its partners created these data quality resources to help states better prepare to address their existing or emerging IDEA data quality needs. Use our search and filtering tools to navigate the library.

    An IDC Resource

    Format: Quick Reference

    618 Data Collection and Submission Timeline

    A graphic illustrating how different IDEA data collections can span multiple years and how a state may be working simultaneously with data from multiple school years.

    An IDC Resource

    Format: Applications and Spreadsheets

    618 Data Collection Calendar

    Who doesn't love a good calendar? This tool serves as a reference for states as they plan for the timely and accurate submission of their 618 data collections. Simply select a month to view the data collection and corresponding tasks and activities for that month, or select a report to view a monthly breakdown of activities and related resources. 

    An IDC Resource

    Format: Guides and Briefs

    Graduation Rate and Dropout Rate: Indicators 1 and 2 Measurement Changes From FFY 2019 to FFY 2020–2025

    This resource focuses on recent changes in the data source and measurement of Part B Indicators 1 and 2. The resource specifically addresses the treatment of “alternate diploma” in the new calculation. In FFY 2019, the calculation of graduation rate included students receiving an alternate diploma in the numerator. For FFY 2020–2025, the calculation of graduation rate includes students receiving an alternate diploma in the denominator only. The calculation for Indicator 2 remains largely the same from FFY 2019 to FFY 2020–2025; however, it now explicitly adds students receiving an alternate diploma to the denominator.
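    To make the numerator/denominator shift concrete, here is a minimal sketch in Python. The counts and cohort definition are illustrative assumptions for demonstration only, not OSEP's official Indicator 1 specification; consult the resource itself for the exact measurement.

    ```python
    # Hypothetical counts from a state's 618 Exiting data (illustrative
    # assumptions, not an official cohort definition).
    regular_diploma = 800    # exited with a regular high school diploma
    alternate_diploma = 50   # exited with a State-defined alternate diploma
    other_exiters = 150      # e.g., dropped out or aged out

    cohort = regular_diploma + alternate_diploma + other_exiters

    # FFY 2019: alternate diploma counted in the numerator (as graduates).
    rate_ffy2019 = (regular_diploma + alternate_diploma) / cohort

    # FFY 2020-2025: alternate diploma counted in the denominator only.
    rate_ffy2020 = regular_diploma / cohort

    print(f"FFY 2019 graduation rate:      {rate_ffy2019:.1%}")  # 85.0%
    print(f"FFY 2020-2025 graduation rate: {rate_ffy2020:.1%}")  # 80.0%
    ```

    With identical underlying data, moving alternate-diploma recipients out of the numerator lowers the reported rate, which is why trend comparisons across the FFY 2019/FFY 2020 boundary need care.
    
    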

    An IDC Resource

    Format: Applications and Spreadsheets

    Graduation Rate (Indicator 1) and Dropout Rate (Indicator 2) Calculator

    This tool from IDC and NTACT calculates graduation and dropout rates using the 618 Exiting data, as OSEP will require in coming years for reporting in states’ SPP/APRs. The tool can accumulate and graph multiple years of data, allowing users to observe trends in the rates over time and share the information easily with stakeholders.  

    An IDC Resource

    Format: Guides and Briefs

    Parent Involvement Data: How to Measure and Improve Representativeness for Indicator B8

    This interactive resource provides states with an overview of how to gather representative parent involvement data for Part B SPP/APR Indicator 8. The resource defines key concepts such as representativeness, sampling, nonresponse bias, response rates, and weighting. It also offers information on how to improve the quality of parent involvement data, including strategies that can help states collect representative data and evaluate and improve the representativeness of their data before, during, and after data collection.
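    Two of the concepts the resource defines, response rate and representativeness, can be sketched briefly. The counts, group names, and percentages below are illustrative assumptions, not IDC guidance or real state data.

    ```python
    # Response rate: share of distributed surveys that came back
    # (hypothetical counts).
    surveys_sent = 2000
    surveys_returned = 620
    response_rate = surveys_returned / surveys_sent  # 0.31

    # Representativeness: compare respondent demographics to the population
    # the survey should mirror (hypothetical groups and proportions).
    population_pct = {"Group A": 0.55, "Group B": 0.30, "Group C": 0.15}
    respondent_pct = {"Group A": 0.62, "Group B": 0.28, "Group C": 0.10}

    for group, pop_share in population_pct.items():
        gap = respondent_pct[group] - pop_share
        print(f"{group}: respondent share differs from population by {gap:+.0%}")
    ```

    A group whose respondent share falls well below its population share (Group C above) signals potential nonresponse bias; the resource discusses strategies such as targeted follow-up and weighting to address gaps like this.
    
    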

    An IDC Resource

    Format: Quick Reference

    SPP/APR Indicator Sampling Plan Checklist

    States are allowed to use sampling for collecting data for select Part B State Performance Plan/Annual Performance Report indicators. Sampling can provide an effective means for targeting data collection resources and improving data quality. However, there are important requirements that states must consider when designing and implementing their sampling plans. States can use this interactive self-assessment tool to determine whether their state’s sampling plan addresses Office of Special Education Programs sampling requirements and best practices and to identify action steps to improve their sampling procedures.

    An IDC Resource

    Format: Applications and Spreadsheets

    Survey Response Analysis App

    The Survey Response Analysis App (SRA App) is an interactive application, powered by state-of-the-art statistical software, that IDC designed to help states examine their SPP/APR Indicator 8 and Indicator 14 data. The SRA App allows users to conduct reproducible analyses of response rates, representativeness, and nonresponse bias, tailored to their survey’s data collection method.