Proposal Title

The Integrated Testlet: A powerful multiple-choice approach for STEM assessment.

Session Type

Workshop

Room

P&A Rm 106

Start Date

July 9, 2015, 1:00 PM

Keywords

assessment, scaffolded learning, answer-until-correct (AUC), immediate feedback, multiple-choice testing, knowledge integration

Primary Threads

Evaluation of Learning

Abstract

Multiple-choice testing is becoming more common as student populations rise and instructional resources dwindle. Such testing is easy to implement, reliable, and inexpensive, yet its validity is often in question (Scott, Stelzer, & Gladding, 2006). In an effort to find inexpensive, streamlined, and valid ways to test knowledge integration and deeper levels of understanding, we have introduced "integrated testlets," which complement recent extensions of the traditional multiple-choice approach (e.g., Ding, Reay, Lee, & Bao, 2011; Wilcox & Pollock, 2014). Integrated testlets efficiently assess higher levels of learning by using answer-until-correct (AUC) assessment tools within a set of multiple-choice questions that share a common scenario and build upon one another. These testlets also allow for straightforward and demonstrably valid granting of partial credit. Integrated testlets enable conceptual scaffolding to be tested and, if desired, assembled during the assessment. Thus, they serve both summative and formative purposes.
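
Because answer-until-correct scoring and partial credit are central to the approach, the short Python sketch below illustrates one plausible scoring scheme for an integrated testlet. The function names and the decreasing credit schedule (full marks on the first attempt, half on the second, a quarter on the third, nothing thereafter) are illustrative assumptions only, not the scheme used in the workshop materials.

# Hypothetical sketch of answer-until-correct (AUC) scoring with partial credit.
# The credit schedule is an assumption for illustration, not the authors' published scheme.

def auc_item_score(attempts_used, max_score=4.0, schedule=(1.0, 0.5, 0.25)):
    """Credit earned on one multiple-choice item.

    attempts_used -- number of responses submitted up to and including the
    correct one (1 means correct on the first try).
    """
    if attempts_used < 1:
        raise ValueError("at least one attempt is required")
    # Attempts beyond the schedule earn no credit.
    fraction = schedule[attempts_used - 1] if attempts_used <= len(schedule) else 0.0
    return max_score * fraction

def testlet_score(attempts_per_item):
    """Total credit for an integrated testlet: the sum of its items' AUC scores."""
    return sum(auc_item_score(a) for a in attempts_per_item)

# Example: a three-item testlet answered correctly on the 1st, 3rd, and 2nd attempts.
print(testlet_score([1, 3, 2]))  # 4.0 + 1.0 + 2.0 = 7.0

In a real deployment the schedule would presumably be chosen so that blind guessing earns little expected credit; the values above are arbitrary.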

In this workshop we will introduce examples of integrated testlets that span the STEM disciplines. Participants will actively engage with one or two testlets of their choosing to gain experience with both the technology and the workings of this assessment tool. Time will then be devoted to unpacking these experiences and to highlighting the pedagogical implications of being able to assess integration of knowledge within a multiple-choice framework. Finally, we will discuss how integrated testlets are currently transforming students’ learning experiences at Trent University.

Wilcox, B.R., & Pollock, S. J. (2014). Coupled multiple-response versus free-response conceptual assessment: An example from upper-division physics. Physical Review Special Topics - Physics Education Research, 10(2), 020124-1 – 020124-11. http://dx.doi.org/10.1103/physrevstper.10.020124

Ding, L., Reay, N., Lee, A., & Bao, L. (2011). Exploring the role of conceptual scaffolding in solving synthesis problems. Physical Review Special Topics - Physics Education Research, 7(2), 020109-1 – 020109-11. http://dx.doi.org/10.1103/physrevstper.7.020109

Scott, M., Stelzer, T., & Gladding, G. (2006). Evaluating multiple-choice exams in large introductory physics courses. Physical Review Special Topics - Physics Education Research, 2(2), 020102-1 – 020102-14. http://dx.doi.org/10.1103/physrevstper.2.020102

Elements of Engagement

Participants will be given time to work with a variety of pre-prepared integrated testlets, initially from the perspective of an individual student and then in small groups drawn from the same or cognate disciplines. It is essential for educators to experience the testlets in both settings to fully appreciate the nuanced aspects that make this assessment tool transformative. Time will then be devoted to a larger group discussion that unpacks this engagement exercise and considers subsidiary aspects, such as the extent to which different disciplines can benefit from this tool and what determines the possible extent of testlet integration.
