2006 HSA Technical Report

Technical Documentation for the
Maryland High School Assessment Program

Algebra/Data Analysis, Biology, English, and Government End-of-Course Assessments



TABLE OF CONTENTS

Introduction

Section 1 Test Construction and Administration

Test Development
Table 1.1 Number of Items by Item Type for Each MDHSA Content Area

Test Specifications
Table 1.2 Algebra Blueprint
Table 1.3 Biology Blueprint
Table 1.4 English Blueprint
Table 1.5 Government Blueprint

Item Selection and Form Design
Table 1.6 Form Construction Specifications for the January 2006 Administration
Table 1.7 Form Construction Specifications for the May 2006 Administration
Table 1.8 Form Construction Specifications for the 2006 Summer Administration

Section 2 Validity

Section 3 Scoring Procedures and Score Types

Scale Scores

Conditional Standard Errors of Measurement

Lowest and Highest Obtainable Test Scores

Cut-Scores
Table 3.1 MDHSA 2006 Cut-Scores by Content Area

Year-to-Year Scale Maintenance

Section 4 Test-Level Analyses

Demographic Distributions
Table 4.1 Demographic Information for Algebra
Table 4.2 Demographic Information for Biology
Table 4.3 Demographic Information for Government

Reliability

Summary Statistics
Table 4.4 Mean Scores by Administration
Table 4.5 Comparison of Mean Scores
Table 4.6 Comparison of Percentage Passing Rates
Table 4.7 Classification Rates for Algebra
Table 4.8 Summary Statistics for Algebra Primary Forms
Table 4.9 Summary Statistics for Algebra Make-Up Forms
Table 4.10 Summary Statistics for Biology Primary Forms
Table 4.11 Summary Statistics for Biology Make-Up Forms
Table 4.12 Summary Statistics for Government Primary Forms
Table 4.13 Summary Statistics for Government Make-Up Forms

Decision Accuracy and Consistency
Table 4.14 Decision Accuracy and Consistency for the Algebra Tests
Table 4.15 Decision Accuracy and Consistency for the Biology Tests
Table 4.16 Decision Accuracy and Consistency for the Government Tests

Section 5 Field Test Analyses

Classical Item Analyses

Differential Item Functioning

IRT Calibration and Scaling

Statistical Summary Tables
Table 5.1 Distribution of P-Values for the January Field Test SR Items
Table 5.2 Distribution of P-Values for the January Field Test CR Items
Table 5.3 Distribution of Item-Total Correlations for the January Field Test SR Items
Table 5.4 Distribution of Item-Total Correlations for the January Field Test CR Items
Table 5.5 Distribution of P-Values for the May Field Test SR Items
Table 5.6 Distribution of P-Values for the May Field Test CR Items
Table 5.7 Distribution of Item-Total Correlations for the May Field Test SR Items
Table 5.8 Distribution of Item-Total Correlations for the May Field Test CR Items
Table 5.9 Field Test Items Excluded from Analyses by Administration and Content Area
Table 5.10 Field Test Items with Statistical Flags Retained in Analysis

Section 6 English Test

Procedures for Selecting Operational Items
Table 6.1 Number of Items Flagged by Field Test Form
Table 6.2 Summary Statistics Describing the P-Values for the Operational Selected Response and Constructed Response Items by Form
Table 6.3 Summary Statistics Describing the Item-Test Correlations for the Operational Selected Response and Constructed Response Items by Form
Table 6.4 Demographic Information for the May 2006 Population by Form

Calibration and Linking Procedure
Table 6.5 Content Representation of the 2006 Anchor Set

Summary Statistics of Student Achievement
Table 6.14 Demographic Information
Table 6.15 Summary Statistics for English Primary Forms
Table 6.16 Summary Statistics for English Make-Up Forms
Table 6.17 Percent of Students Performing in Each Performance Category

Decision Accuracy and Consistency
Table 6.18 Decision Accuracy and Consistency for the 2006 English Exam

IRT Analyses
Figure 6.1 Test Characteristic Curves for 2006 English Forms
Figure 6.2 Conditional Standard Error of Measurement for the 2006 English Forms

Factor Analysis
Table 6.19 Factor Analysis Results by Form


Appendices
Appendix A Scree Plots of Factor Analysis Results for the May 2006 English Forms
Appendix B Classical Item Statistics

A complete copy of the 2006 HSA Technical Report is also available.

Contact Information
Leslie Wilson, Assistant State Superintendent
Division of Accountability and Assessment
Maryland State Department of Education
200 West Baltimore Street
Baltimore, MD 21201