Formative Assessment Delivery System (FADS)

Principal Investigator: 
Project Overview
Background & Purpose: 
  1. Develop the FADS software through field-testing, providing:
    1. Teacher-designed on-line formative assessments of course material, where the assessment items offer computer interfaces (e.g., moving objects on the screen, rotating arrows) that are engaging to students.
    2. Reports on individual students and groups of students that teachers find useful.
  2. Implement additional functions, including a user-friendly scoring interface and an import/export wizard.
  3. Conduct field tests of the FADS system in order to answer the research questions.
Setting: 

K-12 classrooms.

Research Design: 

The research design for this project is longitudinal, cross-sectional, and comparative. The project will generate evidence that is descriptive [observational], associative and/or correlational [quasi-experimental], and causal [instrumental variables]. Original data are collected from K-12 students of 20 classroom teachers in each of two curricula, IQWST Science and CDMW Mathematics, using assessments of learning, observation [personal observation], and survey research [self-completion questionnaires, semi-structured or informal interviews, and focus groups].

The intervention is online formative assessment by students and teachers using the Formative Assessment Delivery System (see citations below), which at various points in the trials will be compared with paper-and-pencil assessments and paired with a variety of reports. The analysis plan uses Item Response Models to generate student proficiency estimates on latent traits, followed by statistical analysis of the results for comparison across treatments.
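As a hedged illustration of the Item Response Model step, the sketch below computes category response probabilities for a single polytomous item under a partial credit model (the model family used in the analyses). The function name and the step-difficulty values are illustrative assumptions, not code or parameters from the FADS project.

```python
import numpy as np

def pcm_probs(theta, deltas):
    """Category probabilities for one polytomous item under the
    partial credit model.

    theta  -- person proficiency on the latent trait
    deltas -- step difficulties delta_1..delta_m for the item
    """
    # Logit for category x is the cumulative sum of (theta - delta_j);
    # category 0 has logit 0 by convention.
    deltas = np.asarray(deltas, dtype=float)
    logits = np.concatenate(([0.0], np.cumsum(theta - deltas)))
    expl = np.exp(logits - logits.max())  # subtract max for numerical stability
    return expl / expl.sum()

# Example: a 3-category item with two (hypothetical) step difficulties.
# The returned probabilities always sum to 1.
print(pcm_probs(theta=0.5, deltas=[-0.2, 0.8]))
```

Per-respondent proficiency estimates (e.g., MLE or EAP) then follow from maximizing or averaging the likelihood built from these category probabilities across the items a student answered.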

Findings: 

Four FADS field trials with innovative assessments have been completed to date, and the fifth trial is underway. Findings to date indicate that reasonable assessment evidence can be generated fairly rapidly, with average assessment times as short as ten minutes, using model-based assessment in STEM settings. A range of innovative item types can be effectively incorporated, including interactive and hands-on assessments and formats that teachers and students described following the trials as interesting and engaging. The NSF-supported FADS project is designed to help teachers improve their instructional practice by more easily integrating formative assessment.

Growth over time can be tracked with robust evidence. The system employs item response models to substantiate validity and reliability evidence for rich tasks and learning progressions. FADS offers tools that developers can use to take advantage of advances in measurement and technology. It is a web-based system for designing, developing, delivering, and reporting on assessments within an interpretive context that helps teachers accurately diagnose students' comprehension and learning needs. Teachers to date report that the system is easy to use and to incorporate into classroom settings.

The computer-based assessments were based on state and national science standards and aligned to progress variables that indicate student progress as skills and knowledge develop. Data were modeled with a partial credit model. In the most recent assessment, an average assessment time of approximately 10 minutes online per respondent yielded estimated test reliabilities of .79 (Cronbach's alpha), .81 (MLE person separation reliability), and .80 (EAP/PV reliability). Item fit was reasonably good: none of the estimated item difficulty or step parameters fell outside the 3/4-4/3 range for mean square weighted fit (M. Wu, Adams, & Wilson, 1998), among parameters with a weighted fit T greater than 2. These item parameters and fit statistics were generated in ConQuest (M. Wu et al., 1998). Itanal item discrimination averaged .47 (SD = .13, min = .17, max = .65). In this assessment, automated scoring was possible for 91% of the items in the assessment objects; the remainder were scored with rubrics. Missing data percentages were low due to the engaging nature of the assessments, and some computer-adaptive components were included to better align questions with students' areas of challenge and learning growth.
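For readers unfamiliar with the reliability coefficients above, the following sketch shows how the classical Cronbach's alpha is computed from a respondents-by-items matrix of item scores. This is an illustration with made-up data, not code or data from the FADS trials.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative data: 5 respondents x 3 partial-credit items (scores 0-2).
scores = np.array([
    [0, 1, 0],
    [1, 1, 1],
    [1, 2, 1],
    [2, 2, 1],
    [2, 2, 2],
])
print(round(cronbach_alpha(scores), 2))
```

The MLE person separation and EAP/PV reliabilities reported above are model-based analogues produced by the item response model software rather than by this classical formula.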

Publications & Presentations: 

Wilson, M., Scalise, K., Kennedy, C. A., Galpern, A., Lin, Y.-H., Su, Y., et al. (2009). Formative Assessment Delivery System (FADS). Paper presented at the American Educational Research Association Annual Meeting 2009 in Technology Supports for Formative Assessment, San Diego, CA.

Scalise, K., & Wilson, M. (2009). Formative Assessment Delivery System and Uses for Teachers in Curriculum Development. Paper presented at the National Science Foundation 2009 DR-K12 PI Meeting, Arlington, VA.

Wilson, M., Scalise, K., Albornoz Reitze, A. M., Bein, E., Bousselot, T., Gochyyev, P., et al. (2010). Progress Monitoring for Real Classroom Contexts: The Formative Assessment Delivery System. Paper presented in the session Classroom Assessments in Balanced State Assessment Systems at the American Educational Research Association Annual Meeting 2010, Denver, CO.

Other Products: 

Online formative assessment tools for STEM education.