WSU Greg Lobdell, September 2008: Data and Decision Making


  • 1. The Use of Data to Inform Instruction, Building, & District Decisions. Greg Lobdell, Co-founder & Research Director. 19 September 2008
  • 2. CEE & Data: Supporting a Cycle of Continuous Improvement
    • Expand Capacity
    • Provide Value-add services
    • Expertise in core areas
    • Partnership is critical
    (Diagram labels: CEE Services; OSPI & Summit Partners; ESDs; Districts; Schools & Classrooms; Demographics & Community Characteristics; Perceptual; Academic Achievement; Contextual: Program and Process.)
  • 3. Who we serve . . .
  • 4. Who We Serve
  • 5. Today's Outcomes
    • Knowledge and skills: Models and strategies for using data to inform our practice
    • Ideas and application: how do we get from here to there?
    • Time to reflect and share
  • 6. How Will We Get There?
    • Models of Interpretation
    • Application, investigation, and understanding of data use in the Schools of Distinction
    • Taxonomy of Assessment Responsiveness
  • 7. So Much Data, So Little Time
    • By itself, data has no value. When data is put into a form that is easily understandable, it becomes information. When information is used to guide decisions that are in the best interest of the students and families we serve, it becomes applied knowledge.
    • Stan Beckelman, former President of Boeing Information Services
    • and Board Member for the Center for Educational Effectiveness
  • 8. 3 Models of Assessment Interpretation
    • Status
    • Improvement
    • Growth
  • 9. Status
    • Where are we?
      • Typically viewed relative to a target or desired state
      • Can be viewed system wide or with any unit of analysis down to student-by-student
      • AYP is a 1-year Status model
  • 10. Improvement (Change)
    • Are we getting better?
    • Requires historical data
      • 1 year does NOT make a trend
      • Common tools (tests, surveys, etc.)
    • Requires leadership to define "better"
    • Safe Harbor is a simple Improvement model
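
As a concrete illustration, a minimal Python sketch of the Safe Harbor check, assuming the NCLB Safe Harbor rule (the percentage of non-proficient students must fall by at least 10% relative to the prior year); all numbers are hypothetical:

```python
def meets_safe_harbor(pct_proficient_prior: float,
                      pct_proficient_current: float) -> bool:
    """Safe Harbor (NCLB): the share of NON-proficient students must
    drop by at least 10% relative to the prior year."""
    not_prof_prior = 100.0 - pct_proficient_prior
    not_prof_current = 100.0 - pct_proficient_current
    return not_prof_current <= 0.9 * not_prof_prior

# Hypothetical school: proficiency rises from 40% to 47%, so
# non-proficiency falls from 60% to 53%, an 11.7% relative drop.
print(meets_safe_harbor(40.0, 47.0))  # True
```
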
  • 11. Growth
    • Most often applied to student achievement when viewing student-by-student achievement
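
To make the three interpretation models concrete, here is a minimal sketch contrasting status, improvement, and growth on hypothetical data; the target, years, scores, and student names are all invented for illustration:

```python
# Hypothetical cohort proficiency rates by year, and a hypothetical target.
pct_proficient = {2006: 58.0, 2007: 61.0, 2008: 66.0}
target = 70.0

# Status: where are we, this year, relative to the target?
status_gap = target - pct_proficient[2008]                  # 4.0 points short

# Improvement: is this year's cohort better than last year's cohort?
improvement = pct_proficient[2008] - pct_proficient[2007]   # +5.0 points

# Growth: did the SAME students gain, student by student?
scores = {"ana": (388, 405), "ben": (402, 399), "carla": (371, 392)}
gains = {name: post - pre for name, (pre, post) in scores.items()}
share_growing = sum(g > 0 for g in gains.values()) / len(gains)  # 2 of 3

print(status_gap, improvement, gains, share_growing)
```
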
  • 12. Measuring Growth and Improvement
    • Systems for measuring and acting on growth and improvement require:
      • Summative (evaluative) Assessments
        • E.g. WASL
      • Formative: Guiding (predictive) assessments
        • E.g. NWEA MAP assessment
      • Diagnostic: indicate student-level strengths and challenges which can be used by staff to assist each student
        • E.g. Pearson Benchmark
  • 13. Conclusive Validity
    • Are the conclusions we draw from the data the right ones?
    • Are the conversations and actions taken as a result of those conclusions
      • Supported by the data?
      • Constructive, leading to positive change?
    • Because of the volume of data, summarization is a required technique
  • 14. Be Careful with Cause & Effect (Jumping to Conclusions)
    • Establishing Cause (3 required conditions):
      • The cause is related to the effect
      • No plausible alternative explanation for the effect exists other than the cause
      • The cause precedes the effect
      • Robinson, D., et al. The Incidence of "Causal" Statements in Teaching-and-Learning Research Journals. American Educational Research Journal, Vol. 44, No. 2, June 2007.
  • 15. Status, Improvement, and Growth Data Must Share Certain Attributes
    • Educationally Significant
    • Aligned with Appropriate Standards
    • Longitudinal
    • Comparative / relevant
    • Community Sensitive / Culturally Responsive
    • Appropriate to target
    • Accurate / Valid
  • 16. The Three Critical Questions
    • Where are we?
      • Comparison points provide context
    • Where do we want to be?
      • The movement from Good to Great
    • Are we improving and growing? Are we on a path to get where we want to be?
  • 17. Application Across Domains of Data
    • Viewing data across Status, Improvement, and Growth should not be limited to student outcome data
    • But let's start with a simple achievement example
  • 18. Comparison Point
  • 19. Comparison Points
  • 20. Relevant SSD Data
  • 21. Additional Comparison Points (% = Results from 17 Latino-Majority Districts)
  • 22. What does your eye say about improvement? (% = Results from 17 Latino-Majority Districts)
  • 23.
  • 24. Improvement?
  • 25. 3 Models of Assessment Interpretation
    • Status
    • Improvement
    • Growth
  • 26. Growth Models
    • Accountability models that measure progress by tracking the achievement of the same students from year to year, to determine whether those students made progress (growth)
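
A growth model therefore needs a matched cohort: the same students observed in consecutive years. A minimal sketch, with hypothetical student IDs and scale scores:

```python
year1 = {"s01": 395, "s02": 410, "s03": 388}   # scale scores, year 1
year2 = {"s01": 412, "s03": 401, "s04": 399}   # s02 moved away; s04 is new

# Keep only students tested in BOTH years, then compute each student's change.
matched = year1.keys() & year2.keys()
growth = {sid: year2[sid] - year1[sid] for sid in sorted(matched)}
print(growth)  # {'s01': 17, 's03': 13}
```
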
  • 27. Two Flavors of Growth Models
    • Growth Models
    • Value-Add Models
    • Growth modeling is NOT a silver bullet; rather, it is another data analysis technique to use in improvement conversations.
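
To show how the two flavors differ, here is a minimal sketch contrasting a simple gain score with a value-added style residual; the linear expectation (slope and intercept) is purely illustrative, not any real model's coefficients:

```python
def gain_score(pre: float, post: float) -> float:
    """Simple growth model: the raw change for one student across years."""
    return post - pre

def value_added(pre: float, post: float,
                slope: float = 0.85, intercept: float = 70.0) -> float:
    """Value-add style: actual score minus the score PREDICTED from the
    prior year (a toy linear expectation; coefficients are made up)."""
    expected = slope * pre + intercept
    return post - expected

pre, post = 400.0, 418.0
print(gain_score(pre, post))   # 18.0 raw points of growth
print(value_added(pre, post))  # 8.0 points above expectation
```
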
  • 28. Interpreting Growth Rates
  • 29. Interpreting Growth
  • 30. Interpreting Growth: highest (or fastest) growth rate