Scanning Services

Output provided from Scanned Tests
(These will be emailed to the instructor. No printed copies will be provided.)

  • Key Analysis Report

This report shows how the answer key was read, the responses recorded as correct, and how the key was processed by the scanner. Always check this report.

Example 6: Key Analysis Report

  • Exam Analysis Report

The report includes instructor and course information and the number of students tested.

High: The highest score (number and percent of questions answered correctly) obtained by any student on the test.
Low: The lowest score (number and percent of questions answered correctly) obtained by any student on the test.
Mean: The average of all scores (number and percent of questions answered correctly) of students who took the test.
Standard Deviation: The standard deviation of all scores (number of questions answered correctly) of students who took the test.
KR-21 Score: The Kuder-Richardson Formula 21, a common measure of the internal consistency (reliability) of the exam.
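
To make the KR-21 figure concrete: it can be computed from the number of questions on the exam, the mean raw score, and the variance of the raw scores. The Python sketch below is only an illustration of how these summary statistics relate to one another; it is not the scanning software's own code, and the score list is made up.

    import statistics

    def kr21(num_items, raw_scores):
        # Kuder-Richardson Formula 21, computed from the item count, the mean
        # raw score (questions answered correctly), and the score variance.
        k = num_items
        mean = statistics.mean(raw_scores)
        variance = statistics.pvariance(raw_scores)  # population variance (an assumption)
        return (k / (k - 1)) * (1 - (mean * (k - mean)) / (k * variance))

    # Made-up raw scores for a hypothetical 50-question exam
    scores = [42, 38, 45, 30, 36, 41, 27, 44, 39, 33]
    print("High:", max(scores), "Low:", min(scores))
    print("Mean:", round(statistics.mean(scores), 2))
    print("Standard Deviation:", round(statistics.pstdev(scores), 2))
    print("KR-21:", round(kr21(50, scores), 3))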

ITEM: Each row in this column gives a question (item) number and the percent of students who answered it correctly. This is a measure of the item's difficulty; the larger the percent, the easier the item.
PBS: The Point Biserial Correlation is the correlation between the responses to the item (correct or incorrect) and the students' total exam scores. PBS values range from -1 to 1. A low PBS for an item indicates that scores on that item do not track well with overall scores on the exam. PBS values that should be examined for reliability are highlighted in red.
The columns A through J represent the possible responses to the item. The TTL sub-column under each response lists the number of students who selected that response, and the R sub-column shows the average total exam score (with its standard deviation) of the students who selected it. The higher the R value, the better the group that chose that response did on the exam overall. Together with the PBS, this shows at a glance whether a distractor (incorrect response) pulled students who did well on the exam away from the correct answer: when high-scoring students commonly chose an incorrect response, that response will have a higher R and the item a lower PBS. Well-written distractors should instead draw in students who did not do well overall, giving those responses a lower R and the item a higher PBS. Students who did well on the exam are expected to select the correct response, producing a higher R for that response and a higher PBS for the item. The correct response for each item is highlighted in grey.
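
As an illustration of the item statistics described above, the Python sketch below computes the ITEM difficulty percentage and the point biserial correlation for a single question from a made-up set of responses. It uses the standard point biserial formula; the report's own calculation may differ in details such as which standard deviation is used.

    import statistics

    def item_stats(correct_flags, total_scores):
        # correct_flags[i] is True if student i answered this item correctly;
        # total_scores[i] is that student's total exam score.
        n = len(correct_flags)
        p = sum(correct_flags) / n                       # proportion correct (ITEM difficulty)
        right = [s for ok, s in zip(correct_flags, total_scores) if ok]
        wrong = [s for ok, s in zip(correct_flags, total_scores) if not ok]
        sd = statistics.pstdev(total_scores)             # population SD of all total scores
        pbs = (statistics.mean(right) - statistics.mean(wrong)) / sd * (p * (1 - p)) ** 0.5
        return p * 100, pbs

    # Hypothetical data: 8 students' total scores and whether each answered item 1 correctly
    flags = [True, True, False, True, False, True, True, False]
    totals = [47, 44, 28, 41, 35, 45, 39, 30]
    difficulty, pbs = item_stats(flags, totals)
    print(f"ITEM: {difficulty:.0f}% correct   PBS: {pbs:.2f}")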

Example 7: Exam Analysis Report

  • Exam Activity Report

The report includes instructor and course information and the number of students tested.

Student ID: Student ID number as entered on scan sheet. Each row represents a student.
Student Name: Student Name as entered on scan sheet.
Date Posted: Date and time exam was scanned.
Missed: Number of questions missed by that student.
Correct: Number of questions answered correctly by that student.
Points: Number of points the student received on the exam.
Score: Exam score for the student.

Example 8: Exam Activity Report

  • Student Score Report (Instructor Score Report)

There will be a Student Score Report for each student.
The left-hand box includes the student name, student ID, instructor, and course ID.

Exam Results: Lists Total Points Earned, Possible Points, and Grade.

Questions Missed: When requested by the instructor, each missed question is listed with its question number, the student’s answer (an “*” means the student chose more than one response for the item; a “?” means the student left the answer blank), and the correct answer. The option to include this information on the student reports must be selected on the scanning submittal form; otherwise, the student score reports will contain only the information listed under Exam Results above.

Example 9: Student Score Report with student scores only

Example 10: Student Score Report with student scores, the questions missed, the student’s responses to the missed questions, and the correct answers

  • CSV file

We will provide a CSV file that can be opened in Excel. It lists all students, ordered by Last Name, First Name, with the following information.

Instructor and course information will be in the file.

Student ID: Student ID number as entered on scan sheet. Each row represents a student.
Student Name: Student Name as entered on scan sheet.
Course ID
Missed: Number of questions missed by that student.
Correct: Number of questions answered correctly by that student.
Points: Number of points the student received on the exam.
Maximum Points Possible
Score: Exam score for the student.
Date Posted: Date and time exam was scanned.
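
As a sketch only, the CSV file can also be read programmatically rather than in Excel. The column names used in the Python snippet below are assumptions based on the field list above; check the header row of the actual file before relying on them.

    import csv

    # "exam_results.csv" is a placeholder name; use the file name you receive by email.
    with open("exam_results.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Column headers here are assumed from the field list above.
            print(row["Student Name"], row["Student ID"], row["Score"])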

Example 11: CSV file