Scanning Services
OIT offers optical scanning services to UTA faculty and staff. OIT can scan both tests and data, on the following sheets only: NCS form number 4521 (the blue form) and NCS form number 6703 (the green form). To order forms for your department, call 1-800-367-6627 (you will need to provide your UTA purchase order number when you place the order).
Customers, please note: OIT equipment cannot scan forms smaller than the standard 8.5 x 11 inch NCS forms (this includes the smaller Scantron forms used by various departments). Please contact your department head about having these smaller forms scanned.
To have your materials scanned, follow these steps:
- Fill out a Scanning Submittal Form for all scanning material (see Completing the Scanning Submittal Form; a copy of this form is also included in PDF format).
- Include all appropriate key sheets for your scanning materials (see Completing Key Sheets).
- Orient sheets to be scanned in the same direction (edge to edge, front to back).
- Drop off and pick up scanning materials at the OIT Production Control Office, B51 Davis Hall, Monday through Friday from 8am to 5pm.
Bring a photo ID, such as your UTA ID card or your driver's license, when you return to retrieve scanning materials.
Scanning reports will be couriered by UTA vehicle between Davis Hall and the Arlington Regional Data Center (ARDC, formerly UTACC) in Ft. Worth. OIT staff will strive to provide the fastest turnaround possible. We ask that each department dropping off test scanning or evaluations provide clear contact phone numbers and e-mail addresses so we may reach you with scanning questions if necessary and avoid any delay of your results. Please print your e-mail address in block-style, evenly spaced letters. Scanning Operations cannot be responsible for delays if contact cannot be made because of illegible e-mail addresses or phone numbers.
Our general courier pickup and delivery schedule is:
- Delivered to the OIT Production Control Office in B66 Davis Hall by 9:30 am - will be back by 4:00 pm the same day (see exception below).
- Delivered to the OIT Production Control Office in B66 Davis Hall by 2:30 pm - will be back by 9:00 am the next workday (see exception below).
E-mail results are sent immediately.
The exception to the above schedule is during midterm and final exams, when the workload is increased. You may contact OIT Production Control at any time regarding the status of your scanning. OIT Production Control Office e-mail: oit.ctl@uta.edu.
Evaluations are ready only after Dr. Nancy Rowe has contacted the user.
Scanning Location: Arlington Regional Data Center (ARDC, formerly UTACC), Ft. Worth
Scanning and evaluations will be dropped off and picked up in the OIT main office, B51 Davis Hall, where the Production Control office staff will assist you. The actual scanning of forms will take place at Arlington Regional Data Center (ARDC), formerly known as UTACC, in Fort Worth.
If you have any questions, please contact Joe Gilstrap, Computer Operations Manager, extension 27594, e-mail: joerg@uta.edu.
Optical scanning of data and test answer sheets is available at the Office of Information Technology. For information about buying scanning sheets, please refer to the OIT Scanning Services Guide.
Scanning Guide
- Completing the Scanning Submittal Form
- Scanning Services
- Completing Key Sheets
- Test Scoring
- Data Entry Only
- Filling in the SPECIAL CODES Section
- Designating Correct Answers
- Using a Weighting Key Sheet
- Marking Key Sheets
- Appendix A: Examples
- Example 1
- Example 2
- Example 3
- Appendix B: Generation Data Configuration
- Appendix C: Scanning Analysis Statistics
- Sample Scanning Submittal Form
Scanning Services
To use the services listed below, mark the appropriate box on the Scanning Submittal Form.
Test Scoring
When you mark this box, OIT performs the following three services (you can mark the following services individually if you only want one or two of them):
- Alphabetical Printout of the Scores: When you mark this box, OIT produces a list of student scores in ascending order by name, with a calculated mean and standard deviation.
- Student Reports: When you mark this box, OIT produces individual student reports showing a percentage grade, with the number of points given for a correct response to a question or the correct response(s) for each incorrectly answered question.
- Item Analysis: When you mark this box, OIT produces an in-depth item analysis of the test. The analysis includes: distribution, Z score, standard score, histogram of scores, frequencies, item-test correlation table, homogeneity analysis, and item response graphs.
Data Entry Only
When you mark this box, OIT saves the output of the scanning materials to a file. OIT retains a backup copy of this file for only 3 days. If you want the file saved to your account, specify the account where you want the file saved. For information about the arrangement of scanning data on disk, please refer to Appendix B: Generation Data Configuration.
Remember, you must provide an IBM PC formatted 3.5-inch floppy disk to store your data on if you choose Data Entry Only.
Completing Key Sheets
When you submit scanning materials for grading, you must include a key sheet. There are two types of key sheets: Response Key sheets, where you indicate the correct response for each question; and Weighting Key sheets, where you indicate the point value for each question. As many as four Response Key sheets and one Weighting Key sheet may be used for each test.
Each key sheet must include the following information:
NAME
Enter the name of the person for whom the materials are to be scanned. See the example below (be sure to bubble in the letters below your name, or the scanner won't read them).
IDENTIFICATION NUMBER
Mark all the circles containing nines (9) for each key sheet.
GRADE
Indicate the number of key sheets (including Response and Weighting sheets) by marking the appropriate response.
SPECIAL CODES
Used to indicate the type of key sheet (Response or Weighting) and the number of questions on the key sheet to be processed (see Filling in the SPECIAL CODES section, next).
Above, all 9s (A-J) have been marked in the IDENTIFICATION NUMBER box (this identifies the sheet as an answer key to the scanner hardware).
All 0s have been marked in the K-L-M fields of the SPECIAL CODES box (this tells the scanner that this is a Response Key and not a Weighting Key; Weighting Keys have K-L-M marked with 9s only). 020 has been marked in the N-O-P fields to indicate that 20 questions will be scanned per test (questions 21 and on are not marked because the test had only 20 questions).
Below, in the GRADE section, 1 has been marked for 1 answer key sheet (the 20 questions are marked as an example).
Filling in the SPECIAL CODES Section
The SPECIAL CODES area must include the following information:
For a Response Key sheet, mark the circles containing zeroes (0) in the K-L-M fields. Mark the number of questions on the Response sheet in the N-O-P fields (an example is available in Appendix A, Example 3).
For a Weighting Key sheet, mark all the circles containing nines (9) in the K-L-M fields. Mark the number of questions on the Response sheet in the N-O-P fields (examples are available in Appendix A, Examples 1 and 2).
Designating Correct Answers
You can designate which responses are correct for each question by following these rules:
If a question should be skipped, leave the answers to that question blank (see questions 6-10 in Example 2).
If a question has only 1 correct answer, mark that answer (see questions 1-20 in Example 1).
If a question has 2 to 4 correct answers, mark each correct answer on separate Response Key sheets (see question 8 in Example 3).
If a question is unconditionally correct (all answers are correct), mark all answers on the same Response Key sheet (see questions 16 and 20 in Example 2).
Using a Weighting Key Sheet
On the Weighting Key sheet, you can designate weighting (differential) point values for each question.
If all questions are worth only one point each, you don't need to fill out a Weighting Key sheet. The scanner assigns one point to each question marked on the response key.
If any question is worth MORE than one point, you must mark the response that indicates the point value for each question, AND you must also mark each question that is worth one point. All questions must be marked (see questions 1-6 in Example 3).
Remember to mark the circles containing nines (9) in the K-L-M fields in the SPECIAL CODES section of a Weighting Key sheet (see Example 3).
Marking Key Sheets
If scanning materials are not adequately marked, scanning requires much more time to complete and, in the case of lightly marked responses, cannot be completed if the requestor's intentions are not understood.
Please be certain that relevant key sheet areas are adequately darkened and that all the sheets in each scanning data set are aligned with the black hash marks (called "skunk" marks by NCS) along the same side; if they are not, the scanning will be returned.
Erased answers should be as dim as possible. If you have too many dimmed (erased) answers, you should complete a new key sheet. Sheets with tears or missing pieces will in most cases not scan. Please inspect your scan sheets before you bring them to OIT for scanning. Sheets with coffee stains, wrinkles, or folds most often cannot be scanned. It is better to re-mark bad pages on a fresh scanner sheet (see the beginning of this document if you need to order new sheets).
Appendix A: Examples
Example 1
A test has 20 questions, each with one correct answer. All questions are equally weighted (all are worth one point out of a total of 20).
| Key sheets: | 1 (Response Key) |
| GRADE: | 1 |
| ID NUMBER: | 9999999999 |
| SPECIAL CODES: | 000020 |
| Questions 1-20: | Correct answer for the question (a, b, c, d, or e) |
| Questions 21-200: | Not marked |
Example 2
Of the questions on the 20 question test, questions 6-10 are survey questions and are not to be graded as part of the test. Also, questions 16 and 20 were not phrased well and are to be counted as correct (unconditionally correct answer) for everybody who took the test.
Example 3
A 12-question test is given. The first 6 questions count three times as much as the other 6 questions. After the test is given, it is discovered that question 8 has 2 answers that are correct.
FIRST SHEET: (Response Key)
SECOND SHEET: (Response Key)
Now you are almost done. Since you want questions 1-6 to count as 3 points each instead of 1 point, you need to fill out the Weighting Key to give the point value of each question on the test.
A common mistake is to confuse Alternate Response Keys and Weight Keys. If you follow these instructions your keys will be correct and your scanning results should be on time. Call (817) 272-2271 if you have any questions.
Third Sheet (EXAMPLE 3 weighting key)
Remember: Weighting Keys must be placed beneath the Response Keys. If your Weighting Key is placed on top of your key sheets, the scanner will corrupt your data. Place the Response Key(s) first, then the Weighting Key.
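To illustrate how the two Response Keys and the Weighting Key from Example 3 combine, here is a minimal scoring sketch. It is not OIT's grading software, and the specific answer letters are made up; only the weighting scheme (questions 1-6 worth 3 points, questions 7-12 worth 1 point, question 8 with two accepted answers) comes from the example above.

```python
# Minimal scoring sketch for Example 3 (illustrative only; not OIT's grading software).
# Questions 1-6 are worth 3 points each, questions 7-12 are worth 1 point each,
# and question 8 accepts either of two correct answers, so two Response Keys are used.

response_keys = [
    # First Response Key: one correct answer per question (hypothetical answers)
    ["a", "c", "b", "d", "a", "e", "b", "c", "a", "d", "e", "b"],
    # Second Response Key: only question 8's alternate correct answer is marked
    [None, None, None, None, None, None, None, "d", None, None, None, None],
]
# Weighting Key: a point value for every question (all 12 must be marked)
weights = [3, 3, 3, 3, 3, 3, 1, 1, 1, 1, 1, 1]

def score(answers):
    """Return the total points earned for one student's list of answers."""
    total = 0
    for i, answer in enumerate(answers):
        accepted = {key[i] for key in response_keys if key[i] is not None}
        if answer in accepted:
            total += weights[i]
    return total

student = ["a", "c", "b", "d", "b", "e", "b", "d", "a", "d", "e", "b"]
print(score(student), "out of", sum(weights))  # prints: 21 out of 24
```

Note that the maximum possible score is 6 * 3 + 6 * 1 = 24 points, and question 8 earns credit for either of the two answers marked across the Response Keys.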
Appendix B: Generation Data Configuration
The data from optical scan sheets is available online for individual use (via e-mail). Each sheet is represented as one record of 272 bytes (characters). The records have the following fields:
| Section | Field Name | Contents | Columns | Length |
| Header | Test ID | 'G2' | 1-2 | 2 |
| | Document ID | Digit | 3 | 1 |
| | | 3 Digits | 4-6 | 3 |
| | | Blank | 7 | 1 |
| | Check Digit | Digit | 8 | 1 |
| | | Blank | 9 | 1 |
| | Serial Number | 4 Digits | 10-13 | 4 |
| | Page ID | Blank | 14 | 1 |
| | | Blanks | 15-16 | 2 |
| | Run Number | 1, 2, or 3 | 17 | 1 |
| | Worst Mark Flag | '#&#' | 20 | 1 |
| | Invalid Count | '000' | 21-23 | 3 |
| | | Blank | 24 | 1 |
| Biographical | Name | 20 Characters | 25-44 | 20 |
| | ID Number | 10 Digits | 45-54 | 10 |
| | Gender | 'M' or 'F' | 55 | 1 |
| | Grade | 2 Digits | 56-57 | 2 |
| | Date | 6 Digits | 58-63 | 6 |
| | | Blank | 64 | 1 |
| | Special Codes | 6 Digits | 65-70 | 6 |
| | | Blanks | 71-72 | 2 |
| Responses | Values | 200 Characters | 73-272 | 200 |
The response field is stored in one of two forms. For tests, the character stored is one of the letters 'A' to 'J'. For data, the character stored is one of the digits '1' to '9' or the digit '0' (To get zero, mark response number 10).
In the biographical and response areas, a blank is stored if no response was marked and an asterisk (*) is stored if two or more responses were marked. The serial number field of the header indicates the order in which the sheets were scanned.
Data may be sent upon request to your UTA e-mail address (for example, jdoe@uta.edu). To have results e-mailed to you, make a note in the Special Instructions box of the Scanning Submittal Form and include your e-mail address.
Please Note: Scanner data output is in text (.txt) format only. Special formatting is the responsibility of the user.
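As a rough illustration of how the fixed-width layout above can be read, here is a minimal sketch that slices one 272-character record into named fields. The field names, the file name, and the usage shown are assumptions for the example, not part of OIT's output; only the column positions come from the table.

```python
# Minimal sketch: split one 272-character scanner record into the fields
# listed in the table above (column numbers in the table are 1-based and inclusive).
FIELDS = {
    "test_id":       (1, 2),     # 'G2'
    "document_id":   (3, 6),
    "check_digit":   (8, 8),
    "serial_number": (10, 13),
    "run_number":    (17, 17),
    "invalid_count": (21, 23),
    "name":          (25, 44),
    "id_number":     (45, 54),
    "gender":        (55, 55),
    "grade":         (56, 57),
    "date":          (58, 63),
    "special_codes": (65, 70),
    "responses":     (73, 272),
}

def parse_record(record: str) -> dict:
    """Slice one fixed-width 272-character record into named fields."""
    if len(record) != 272:
        raise ValueError("expected a 272-character record")
    return {name: record[start - 1:end] for name, (start, end) in FIELDS.items()}

# Hypothetical usage: each line of the e-mailed .txt output is treated as one record.
# with open("scan_results.txt") as f:
#     for line in f:
#         fields = parse_record(line.rstrip("\n").ljust(272))
#         print(fields["name"].strip(), fields["responses"][:20])
```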
Appendix C: Scanning Analysis Statistics
Note: All statistics are computed on the dichotomized responses. For item i, i = 1, ..., M:
- x(i) = 0 if the answer is wrong
- x(i) = 1 if the answer is right
- For each of the first three subjects there is printed:
  - Name of subject
  - Item responses (right answer = 1, wrong answer = 0)
  - Total score
  - Mean response
- Number of items for analysis
- Number of items in this test (maximum possible score)
- Number of subjects
- Overall average score
- Standard deviation (S.D.) of the overall average score
- For each of the scores from the lowest score on this test to the highest score on this test there is printed:
  - The number of subjects with this score
  - The percentage of subjects with this score
  - Cumulative percentage (the percentage of all subjects with this score or a lower score)
  - Normalized score (Z score): Z = (lower bound of interval - overall average score) / (S.D. of overall average score)
  - Standardized score = Z * 10.0 + 50.0 (the overall average score = 50 on this scale and the S.D. of the overall average score = 10 on this scale); see the worked example after this list
  - The group number in which the subjects with this score are placed for the last part of the analysis
  - Histogram (horizontal bar diagram) of the percentage of the subjects with this score
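As a quick worked example with assumed numbers (not output from an actual scan): suppose the overall average score is 14.0, its S.D. is 4.0, and a score interval has a lower bound of 18.
Z = (18 - 14.0) / 4.0 = 1.0
Standardized score = 1.0 * 10.0 + 50.0 = 60.0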
- For each item there is printed (a small computational sketch follows this list):
  - Item number
  - Mean response
  - S.D. of mean response = MEAN * (1 - MEAN)
  - Pearson correlation between the mean item response and the overall scanning set mean response
  - Frequency count of the actual response values given; the possible response values are: omitted, multiple, ..., unknown
  - Standard error of the Pearson correlation = 1/SQRT(N), where N = number of subjects
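The sketch below computes these per-item statistics for a small assumed 0/1 response matrix (x[i][j] = 1 means subject i answered item j correctly). The data values are made up for illustration; this is not OIT's implementation.

```python
from math import sqrt

# Assumed dichotomized responses: x[i][j] = 1 if subject i answered item j correctly.
x = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0],
]
N, M = len(x), len(x[0])
subject_means = [sum(row) / M for row in x]    # each subject's mean response

def pearson(a, b):
    """Plain Pearson correlation between two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sqrt(sum((u - ma) ** 2 for u in a))
    sb = sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)                     # defined here because neither sequence is constant

for j in range(M):
    item = [row[j] for row in x]
    mean = sum(item) / N                       # mean response for item j
    spread = mean * (1 - mean)                 # MEAN * (1 - MEAN), as listed above
    r = pearson(item, subject_means)           # item-test correlation
    se = 1 / sqrt(N)                           # standard error of the correlation, 1/SQRT(N)
    print(j + 1, round(mean, 2), round(spread, 2), round(r, 2), round(se, 2))
```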
- Number of items (maximum possible score)
- Number of subjects
- Sum of all responses
- Overall average score
- Sum of responses / M, where M is the number of items
- Sum of responses / (N * M)
- Sums of squares (S.S.):
  - S.S.total
  - S.S.error = S.S.total - S.S.subject - S.S.item
  - S.S.subject
  - S.S.item
- Estimated variances:
  - VAR(error) = S.S.error / [(N-1) * (M-1)]
  - VAR(subject) = S.S.subject / (N-1)
  - VAR(item) = S.S.item / (M-1)
- Observed variances:
  - Total: S.S.total / (N*M)
  - Error: S.S.error / (N*M)
- Observed variances of the mean:
  - Subject: S.S.subject / (N*M)
  - Item: S.S.item / (N*M)
- Estimated variances of the mean:
  - Subject: [VAR(subject) - VAR(error)] / M
  - Item: [VAR(item) - VAR(error)] / N
- Proportions of total variance:
  - Error: S.S.error / S.S.total
  - Subject: S.S.subject / S.S.total
  - Item: S.S.item / S.S.total
- Cronbach's alpha indices (see the sketch after this list):
  - RGG = 1.0 - [VAR(error) / VAR(item)]
  - RTT = 1.0 - [VAR(error) / VAR(subject)]
- Alpha indices stepped down to one element:
  - RPP = RGG / (N + RGG - N*RGG)
  - RII = RTT / (M + RTT - M*RTT)
- The same indices as in table B projected for a test of 100 items.
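Below is a minimal sketch of the sums-of-squares decomposition, the estimated variance components, and the alpha indices RGG and RTT defined above. The 0/1 response matrix is an assumption for illustration; this is not OIT's code.

```python
# Assumed dichotomized responses: x[i][j] = 1 if subject i answered item j correctly.
x = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0],
]
N, M = len(x), len(x[0])                      # number of subjects, number of items

grand = sum(map(sum, x)) / (N * M)            # overall mean response
subject_means = [sum(row) / M for row in x]
item_means = [sum(row[j] for row in x) / N for j in range(M)]

ss_total = sum((v - grand) ** 2 for row in x for v in row)
ss_subject = M * sum((m - grand) ** 2 for m in subject_means)
ss_item = N * sum((m - grand) ** 2 for m in item_means)
ss_error = ss_total - ss_subject - ss_item    # S.S.error = S.S.total - S.S.subject - S.S.item

var_error = ss_error / ((N - 1) * (M - 1))    # VAR(error)
var_subject = ss_subject / (N - 1)            # VAR(subject)
var_item = ss_item / (M - 1)                  # VAR(item)

rgg = 1.0 - var_error / var_item              # alpha index across items
rtt = 1.0 - var_error / var_subject           # alpha index across subjects
print(round(rgg, 3), round(rtt, 3))
```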
- Subjects are divided into 5 groups according to their Z score:
  - Group 1: Z SCORE < -0.84 (worst overall)
  - Group 2: -0.84 <= Z SCORE < -0.26
  - Group 3: -0.26 <= Z SCORE < 0.26
  - Group 4: 0.26 <= Z SCORE < 0.84
  - Group 5: 0.84 <= Z SCORE (best overall)
- For each group there is printed:
  - Group number
  - Range of Z scores for this group
  - Number of subjects with a Z score in this range
  - Percentage of subjects with a Z score in this range
- For each item there is printed:
  - Mean Correct = mean of this item * 100
  - Difficulty = 100 - Mean Correct
  - For each group there is printed:
    - Frequency count of responses of the subjects who are in this group
    - Histogram (horizontal bar diagram) of the percentage of subjects in this group who answered this item correctly
  - Correct response(s) for the item
Sample Scanning Submittal Form