Proposal Title

Inter-rater Reliability: Armies of Graduate T.A.s Grading in First Year

Session Type

Presentation

Room

P&A Rm 106

Start Date

July 2015

Keywords

inter-rater reliability, T.A. graders, large first-year classes, grading consistency, computer-assisted grading

Primary Threads

Evaluation of Learning

Abstract

Large, laboratory-based courses in first year inevitably require large numbers of graduate teaching assistants to manage the safety, operation, instruction, and assessment of students. At Guelph, we have around 2200 students in the fall semester of first-year chemistry and around 1800 in the companion winter course. This past winter semester, we had 74 laboratory sections staffed by 21 different graduate students. The G.T.A.s graded the written laboratory reports for the students in their sections. In addition, we held a mid-term and a final exam that each included two pages of written answers. A group of about 28 G.T.A.s would gather on the morning following the exam and grade those student papers en masse (taking about four hours). This past semester, we developed a method to electronically capture the grading of each T.A. and have compared parameters such as speed, average grade assigned, variance, and accuracy. We will discuss these results and how they may affect our confidence in the final grade assigned to a particular student. In the labs, we have also introduced the electronic submission of student lab reports, portions of which can be graded automatically by the computer. The T.A.s review the computer grading, while other parts of the reports still require full T.A. assessment. We will compare lab grades before and after this electronic change to see whether inter-rater reliability has been affected.
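The per-grader comparison described above (mean grade assigned and variance for each T.A.) can be sketched in a few lines. This is a minimal illustration, not the authors' actual analysis; the T.A. labels and scores below are hypothetical.

```python
import statistics

# Hypothetical electronically captured grades, keyed by T.A.
# (illustrative values only, not data from the course).
grades = {
    "TA_01": [72, 65, 80, 77, 68],
    "TA_02": [88, 91, 79, 85, 90],
    "TA_03": [70, 74, 69, 73, 71],
}

def grader_summary(scores):
    """Return the mean and population variance of one T.A.'s assigned grades."""
    return statistics.mean(scores), statistics.pvariance(scores)

for ta, scores in grades.items():
    mean, var = grader_summary(scores)
    print(f"{ta}: mean={mean:.1f}, variance={var:.2f}")
```

A large spread in per-T.A. means or variances on the same assignment would be one signal that grading is not consistent across sections; a fuller analysis might use an inter-rater statistic such as an intraclass correlation.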

Elements of Engagement

If time and space permit, we may provide a grading activity in which the audience can participate.

Jul 10th, 1:00 PM
