Archived Resources

The Resource Library houses tools and products that were developed by IDC, developed with its collaborators, or submitted by IDC stakeholders. Search and filtering tools are available to help users navigate through the library.


    An IDC Resource

    Format: Presentations

    618 Data—What’s That? Getting to Know Your 618 Data

    Have you heard terms like 618, 616, EDFacts, EMAPS, file specifications, OMB-MAX, GRADS360, and Data Quality Reports? Do you understand what they refer to? Do you want to gain a deeper understanding of them? Participants in this presentation learned more about IDEA data reporting requirements related to the 618 data collections, as well as data quality considerations and tools states can use throughout the data collection and reporting process.

    An IDC Resource

    Format: Presentations

    SSIP Phase III: Operationalizing Your Evaluation Plan

    During this interactive role-alike workshop for SSIP Coordinators, the IDC Evaluation Team helped participants refine their SSIP evaluation plans, identify action steps and timelines for implementing evaluation activities, and learn about resources that may help them operationalize their plans. The workshop was conducted with an updated presentation and handout at the 2016 Conference on Improving Data, Improving Outcomes, held August 15-17.

    Format: Presentations

    Resources to Support Implementation of Evidence-Based Practices in Academics and Behavior

    The presentation explores data-based individualization (DBI) in reading, a research-based approach that entails systematically collecting and using data to determine when and how to intensify reading intervention. DBI shows promise for meeting the needs of children who do not respond to universal reading instruction, offering a methodical and efficient means of using data to analyze intervention instruction, address areas of deficit, and capitalize on learners’ strengths.

    Format: Presentations

    Data Collection Decisions to Improve Data Quality for Quality Data Use

    This presentation provided an overview of the challenges of collecting data for IDEA Part B Indicator 14 and identified successful efforts to overcome challenges such as representativeness and response rates. Presenters also shared one state's effort to improve the quality of its student post-school outcome data and increase confidence in the programmatic decisions that could be made from those data.

    Format: Presentations

    How States Use IDEA's Fiscal Flexibilities: LEA Maintenance of Effort and Coordinated Early Intervening Services

    The presentation provided an overview of LEA MOE and CEIS requirements, including the four new data collection elements OSEP implemented. Presenters also described the IDEA fiscal flexibilities available to states and noted that few LEAs take advantage of them. These flexibilities relate to the MOE reduction, CEIS, and their possible connection to the SSIP.

    An IDC Resource

    Format: Online Applications

    Part C Exiting Counts

    IDC's Part C Exiting Counts app lets users test their knowledge of the 10 Part C Exiting categories in two ways: starting with a child scenario and deciding which reason and category best fit it, or starting with a reason and category and deciding which child scenario best fits them.

    Format: Presentations

    MOE and CEIS Data Reporting and Quality Tools and Tips

    IDC and CIFR staff presented an overview of tools available to states and LEAs for meeting LEA MOE and CEIS requirements and ensuring data quality, along with considerations states and LEAs need to keep in mind regarding those requirements. One state described the policies, procedures, practices, and tools and templates it has in place to help LEAs meet the LEA MOE and CEIS requirements and improve data quality. The state staff also discussed their work with CIFR on the LEA MOE calculator.