By Gina M. Copas & Steven Del Valle
Summary: New and innovative classroom technology may introduce novel learning environments and provide solutions for tedious grading tasks; however, it is important to determine whether such technology influences test performance. Results from a usability study of the Classroom Performance System, a system that allows students to upload test answers via a remote control, indicate discrepancies between perceived ease of use and actual performance.
The Classroom Performance System (CPS) is a testing software and hardware package developed by eInstruction Corporation that allows students to upload their quiz answers into the software via remote response pads. Once the answers are uploaded into the receiver unit, the software can instantly grade student responses and generate detailed reports for instructors. One option available to instructors, the Student Managed mode, is to give students printed homework or tests and have the students electronically transmit their answers. This automated assessment feature allows students to respond at an individual pace (eInstruction, n.d.).
In Student Managed mode, the instructor chooses whether students advance to the next test question automatically or manually. With the automatic forward, students enter a response and are immediately moved to the next question. With the manual forward, the remote advances to the next question only when the student presses an arrow key after entering a response.
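The behavioral difference between the two advance settings can be sketched as a small simulation. This is an illustrative model only, not eInstruction's actual firmware or API; the function name, inputs, and the assumption that an un-advanced pad records the next entry against the same question are all hypothetical.

```python
# Hypothetical sketch of the two Student Managed advance modes, from the
# response pad's point of view. Not based on eInstruction's implementation.

def take_quiz(entries, manual_forward):
    """Simulate entering answers on a response pad.

    entries: list of (answer, pressed_arrow) pairs; pressed_arrow is only
    consulted in manual-forward mode.
    Returns the list of (question_number, answer) pairs as submitted.
    """
    submitted = []
    question = 1
    for answer, pressed_arrow in entries:
        submitted.append((question, answer))
        if manual_forward:
            # Manual forward: the pad stays on the current question until
            # the student presses an arrow key after responding.
            if pressed_arrow:
                question += 1
        else:
            # Automatic forward: entering a response advances immediately.
            question += 1
    return submitted
```

The sketch also illustrates one way a student could become "lost": in manual-forward mode, forgetting the arrow press leaves the pad on the same question, so the next answer is logged against the wrong item.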
In a pilot usability study of this product, it was reported that the Student Managed mode was preferred approximately 2 to 1 to the Instructor Managed mode in which test questions were displayed on a screen and the instructor determined the pace (Copas, 2003). While most of the participants of that study reported that they were able to tell which test question they were answering, some also reported that they became lost. This study investigates the performance of participants using CPS in the Student Managed mode. To determine if students experienced any sense of being “lost,” a test was developed so that all participants should attain a perfect score.
Wichita State University students from beginning and upper-level psychology classes were invited to participate in the study. There were 137 participants (32% male and 64% female), tested in six different groups. Twenty-six percent of the participants reported having completed high school; 36% had completed 1-2 years of college; 33% had completed 3-4 years of college; and 2% reported having completed postgraduate studies. Three percent of the participants reported having used the CPS remote response system prior to the test.
Each participant was given a consent form, a quiz with either 15 or 45 questions, a satisfaction survey, and a CPS remote. The quiz questions were developed by a group of high school students participating in a human factors psychology class in the Upward Bound Math & Science program. The quiz questions were designed to be very easy in order to detect the influence of the remote on the test scores. Sample questions were as follows:
1. What rhymes with pink?
2. How many sides does a triangle have?
A large screen at the front of the classroom displayed the response indicators, and the CPS receiver unit collected the responses. Each student used a remote control to upload an answer to each test question.
The participants were seated together in a classroom. The remotes were set so that participants answered in one of the two advance conditions, either the automatic forward or the manual forward. Instructions were read to each group on how to use the CPS remote and how to determine which question they were answering by using the indicator box, corresponding with their remote number, displayed on the large screen at the front of the class. The projection screen (Figure 1) displays the number of each response pad (top) and the quiz item number of the next question to be answered (bottom). A question-and-answer period was held before the test began. Participants were instructed to read the questions on the quiz paper and then use their remote control to upload each answer to the receiver in the middle of the classroom. The participants completed the satisfaction survey once they felt they had successfully finished beaming all of the quiz answers into the receiver. Students were not given their scores at the end of the quiz.
Figure 1. Student Managed Assessment Answer Window
The mean percent-correct quiz score for all participants was 77.37 (SD = 30.97). There was no significant difference between participants who used the automatic forward and those who used the manual forward. Thirty-six percent of the participants received perfect scores (see Figure 2). Of those participants who did not receive perfect scores, the mean score was 64.37. An item analysis showed that for 40% of participants, incorrect answers did not appear to be random but were clustered toward the end of the quiz. In fact, 37% of these participants failed to answer the last question.
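One way to operationalize the item analysis described above is to flag response sheets whose errors fall only in the final portion of the quiz. The following sketch is illustrative, with hypothetical data and a hypothetical function name and cutoff parameter; it is not the study's actual analysis procedure.

```python
# Illustrative sketch (hypothetical, not the study's actual analysis):
# flag response sheets whose incorrect answers all cluster at the end.

def errors_cluster_at_end(responses, tail_fraction=0.33):
    """responses: list of booleans, True = item answered correctly.

    Returns True if the sheet has at least one error and every error
    falls within the final tail_fraction of the quiz.
    """
    n = len(responses)
    wrong = [i for i, correct in enumerate(responses) if not correct]
    if not wrong:
        return False  # a perfect sheet has no error cluster
    cutoff = n - max(1, int(n * tail_fraction))
    return all(i >= cutoff for i in wrong)

# A 15-item sheet that is correct until the last three questions,
# the pattern a "lost" student might produce.
lost_sheet = [True] * 12 + [False, False, False]
```

Such a check separates trailing-error patterns (consistent with losing one's place or failing to reach the last items) from errors scattered throughout the quiz.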
Figure 2. Difference between quizzes with 100% correct and less than 100% correct
Perceived Ease of Use
When asked whether they ever experienced a feeling of being "lost" as to which test question they should be answering, 57% of participants reported never feeling lost, while 39% reported that they did; 4% did not answer this question (see Figure 3).
Figure 3. Reported feeling lost during beaming
Forty-seven percent of the participants reported that the CPS system was “very easy” to use; 44% reported that it was “easy” to use; 7% reported that it was “difficult” to use; and 1% reported that it was “very difficult” to use (see Figure 4).
Figure 4. Reported ease of using the CPS system
For taking multiple-choice or true/false tests, 38% of participants reported preferring CPS, while 58% reported preferring the more traditional paper-and-pencil method (see Figure 5). When asked whether they would like to have the CPS system in their classrooms, 45% reported that they would; however, 51% reported that they would not.
Figure 5. Participants who would like to have CPS system in classroom.
An open-ended question on the satisfaction survey requested participant concerns. Four main concerns were: 1) not being able to see responses and worrying about pushing an incorrect button, 2) not being able to go back and look over answers, 3) having to hold arm up while testing and/or having to look up and down constantly during test-taking, and 4) weak or dead batteries in the remote in the middle of a test.
Results from this usability study of the Classroom Performance System (CPS) indicate that first-time users of the system experienced difficulty. In fact, over half of the participants failed to answer all the questions on the quiz correctly, and for 40% of those with less than 100% scores (or 18% of total participants), the errors could be attributed to difficulty in using the system.
A large majority (91%) of the participants reported that CPS was "easy" to use and that they were "satisfied" with their effort to use it. These satisfaction scores are inconsistent with actual performance, as measured by the number of perfect scores (36%). This suggests that many participants believed they performed better than they actually did (final test scores were not disclosed to the participants). There is also a discrepancy between the 91% of participants who thought the system was "easy" to use and the 57% who reported that they never felt "lost." In short, participants reported feeling confident using the CPS but, in fact, did not perform well.
It is important to further investigate what influence this technology has on actual performance scores. The fact that many students in this study were unable to complete the CPS quiz with a perfect score is concerning. Additional usability studies on the Classroom Performance System should address discrepancies found in this study as well as the effects of practice on performance and satisfaction.
Copas, G. (2003). Where’s my clicker? Bringing the remote into the classroom. Usability News, 5(2). Available at: http://psychology.wichita.edu/surl/usabilitynews/52/remote_test2.asp.
eInstruction Corporation (n.d.). Classroom Performance System user's guide (Version 3.12).
The authors would like to acknowledge the 2003 Upward Bound Human Factors Psychology class, Corey Alexander, De'ja Collins, Daphne Darter, Brandon Hardin, Jesse Hernandez, Byron Hightower, Angie Johnson, Carolyn Johnson, Kimberly Livengood, Michael Mendenhall, Ashley Reynolds, Tran Cuong, and Carl Wegener, for their assistance in developing the quiz questions for this usability study.