Proposal Title

Enabling More Feedback With Less Delay: Exploring novel approaches to constrain essay questions

Presenter Information

Eli Meir, SimBio

Session Type

Presentation

Room

PAB 150

Start Date

9-7-2013 11:00 AM

Keywords

evolution, evaluation, education

Primary Threads

Education Technologies and Innovative Resources

Abstract

There is a widely acknowledged tradeoff between evaluation items that are tractable to grade for large numbers of students (e.g., multiple-choice questions) and those that provide insight into students' higher-order thinking skills and deeper understanding of scientific concepts (e.g., essay questions and performance-based assessments). The latter provide much more information, but they are currently difficult or impossible to grade automatically and time-prohibitive to grade manually in large classes. One place among many where this presents problems is in developing open-ended lab experiences that can be done in large classes but still provide students the immediate formative feedback they need in order to learn how to properly conduct an experiment. Ideally, such experiences would also provide instructors enough summative feedback to target common problems in their classrooms.

I'll present a project that attempts to find a middle ground by taking open-ended assessments and adding constraints that make it possible to automatically categorize student answers. I'll focus on two forms of constrained essay questions, which we call "LabLibs" and "WordBytes", and show preliminary data indicating that students' answers to these constrained essays capture some of the confusions evident in free-response or interview answers. I'll also discuss ways in which we are attempting to constrain other portions of the full experimental cycle within open-ended simulations (focusing on a virtual lab on natural selection) to enable immediate feedback to students. Please bring your computer to try out the tools and simulations yourself.
