U.S. patent application number 10/765749, for a method for scoring and delivering to a reader test answer images for open-ended questions, was filed with the patent office on 2004-01-27 and published on 2004-09-23.
This patent application is currently assigned to Harcourt Assessment, Inc. Invention is credited to Jose Gonzalez and Bernard Kucinski.
United States Patent Application | 20040185424 |
Kind Code | A1 |
Application Number | 10/765749 |
Family ID | 25417866 |
Published | September 23, 2004 |
Inventors | Kucinski, Bernard; et al. |
Method for scoring and delivering to a reader test answer images
for open-ended questions
Abstract
A method for scoring an answer page containing an answer to an
open-ended question includes viewing a first visual image of a
first portion of an answer page. The first portion contains an
answer space in which an answer to an open-ended question is
expected to reside. If the first portion of the answer page
contains a complete answer, the answer is electronically scored. If
the first portion of the answer page does not encompass a complete
answer, a second visual image of a second portion of the answer
page is accessed and viewed. The second portion contains a sector
of the answer page outside the answer space. A method is also
provided for delivering an answer page to a reader for scoring.
Inventors: | Kucinski, Bernard (San Antonio, TX); Gonzalez, Jose (Richardson, TX) |
Correspondence Address: | ALLEN, DYER, DOPPELT, MILBRATH & GILCHRIST, PA, P.O. Box 3791, Orlando, FL 32802-3791, US |
Assignee: | Harcourt Assessment, Inc., San Antonio, TX |
Family ID: | 25417866 |
Appl. No.: | 10/765749 |
Filed: | January 27, 2004 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
10/765,749 | Jan 27, 2004 | |
10/113,035 | Apr 1, 2002 | 6,684,052
09/707,252 | Nov 6, 2000 | 6,366,760
08/903,646 | Jul 31, 1997 | 6,173,154
Current U.S. Class: | 434/359 |
Current CPC Class: | G06K 17/0032 20130101 |
Class at Publication: | 434/359 |
International Class: | G09B 007/00 |
Claims
What is claimed is:
1. A method for scoring an answer page containing an answer to an
open-ended question, the method comprising the steps of: viewing a
first visual image of a first portion of an answer page, the first
portion comprising an answer space in which an answer to an
open-ended question is expected to reside; if the first portion of
the answer page contains a complete answer, electronically scoring
the answer; if the first portion of the answer page does not
encompass a complete answer, accessing and viewing a second visual
image of a second portion of the answer page, the second portion
comprising a sector of the answer page outside the answer space,
and electronically scoring the answer.
2. The method recited in claim 1, wherein the viewing step
comprises receiving the first and the second visual image through a
processor onto a display device.
3. The method recited in claim 1, further comprising the steps of:
entering an electronic scoring system; requesting to view an answer
to score; and receiving the first visual image from a queue
comprising a plurality of answer page images.
4. The method recited in claim 1, wherein the electronically
scoring step comprises selecting a numerical score for the answer
from a score sector on a display.
5. The method recited in claim 4, wherein the score sector
comprises a score button bar displayed on a common display with the
first visual image.
6. The method recited in claim 1, further comprising the steps, if
the first and the second visual image do not encompass a complete
answer, of repeating the accessing and viewing steps until
substantially the entire answer page is viewed.
7. The method recited in claim 6, wherein, if the entire answer
page does not encompass a complete answer, viewing a first visual
image of a second answer page to search for a complete answer.
8. The method recited in claim 1, wherein the viewing steps
comprise receiving the first and the second visual image through a
processor onto a display device, and the accessing step comprises
electronically manipulating a scroll bar on the display device.
9. The method recited in claim 1, further comprising the step of,
if a question occurs during the scoring step, electronically
transmitting a query to a supervisor.
10. The method recited in claim 1, wherein the answer comprises an
answer in verbal form.
11. The method recited in claim 1, wherein the answer comprises a
geometric diagram.
12. The method recited in claim 11, further comprising the step of
accessing an electronically manipulable display of a geometric tool
for assessing the geometric diagram.
13. The method recited in claim 1, wherein the answer comprises a
calibration answer, and further comprising the steps of receiving a
score and comparing the received score with a target score.
14. The method recited in claim 13, further comprising the step of
calculating a reader effectiveness from the comparing step.
15. The method recited in claim 13, further comprising the steps of
calculating a time span between the first visual image viewing step
and the scoring step, and comparing the time span with a target
scoring time.
16. The method recited in claim 15, further comprising the step of
calculating a reader efficiency from the time-span comparing
step.
17. A method for delivering an answer page containing an answer to
an open-ended question to a reader for scoring, the method
comprising the steps of: retrieving from a storage device a visual
image of an answer page containing an answer to an open-ended
question; formatting a first display screen comprising a first
visual image of a first portion of an answer page, the first
portion comprising an answer space in which an answer to an
open-ended question is expected to reside; transmitting to a reader
display device the first display screen; and permitting reader
visual access on the display device to a second portion of the
answer page at least partially outside the first portion.
18. The method recited in claim 17, wherein the retrieving step
comprises the steps of: determining a batch comprising a plurality
of answers to be scored, the plurality of answers comprising
answers to a unitary test question; fetching answer page images
corresponding to the determined batch from a storage device; and
holding the fetched answer page images in a cache.
19. The method recited in claim 18, further comprising the steps,
prior to the formatting and transmitting steps, of retrieving a
scoring and a display protocol for answers to the test
question.
20. The method recited in claim 19, wherein the scoring protocol
comprises a numerical score range for answers to the test
question.
21. The method recited in claim 20, wherein the first visual image
comprises a score selection element, and the formatting step
comprises including the score selection element as dictated by the
display protocol.
22. The method recited in claim 21, wherein the score selection
element comprises a score button bar.
23. The method recited in claim 17, wherein the first display
screen further comprises a score selection element.
24. The method recited in claim 17, wherein, if the answer page
does not encompass a complete answer, repeating the retrieving,
formatting, transmitting, and permitting steps on a second answer
page.
25. The method recited in claim 17, wherein the first display
screen further comprises an electronically manipulable scroll bar,
and wherein the permitting step comprises receiving a signal
indicative of a manipulation of the scroll bar and transmitting to
the reader display device a second display screen comprising the
second portion of the answer page corresponding to the signal.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of and incorporates by
reference co-pending application Ser. No. 10/113,035, filed Apr. 1,
2002, now U.S. Pat. No. 6,684,052, which itself is a continuation
of application Ser. No. 09/707,252, filed Nov. 6, 2000, now U.S.
Pat. No. 6,366,760, issued Apr. 2, 2002, which itself is a
divisional application of application Ser. No. 08/903,646, filed
Jul. 31, 1997, now U.S. Pat. No. 6,173,154, issued Jan. 9, 2001,
which are commonly owned with the present invention and which are
incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention relates to systems and methods for
imaging test answer sheets and, more particularly, to automated
systems and methods for processing and storing test answer sheet
images that include answers to open-ended questions.
[0004] 2. Description of Related Art
[0005] The automation of test scoring is a complex problem that has
generated a great deal of interest, owing to a significant economic
pressure to optimize efficiency and accuracy and to minimize human
involvement. Optical mark reading (OMR) systems are well known in
the art, such as those for scanning forms having pencil marks
within preprinted areas such as ovals. OMR systems generally sense
data recorded within the preprinted areas by detecting light
absorbed in the near infrared, which is referred to as NIR
scanning. This method permits the differentiation of the pencil
marks from the preprinted information, which is provided in a
pigment that does not absorb in the NIR. OMR systems thus permit a
gathering of data that is easily converted into digital form,
scored against an answer database, and saved without consuming
excessive storage space.
[0006] An additional level of complexity is added, however, with
the inclusion of open-ended or essay-type questions. These
questions must typically be scored by a human reader, and thus
either the physical test form or a visible image thereof must be
available for at least the time required for scoring. A digitally
stored visible image can be obtained by an image processing
apparatus, for example.
[0007] A multiplicity of systems and methods for addressing the
scoring of test answer sheets have been disclosed in the art. For
example, Poor (U.S. Pat. No. 5,452,379), Keogh et al. (U.S. Pat.
No. 5,134,669), and Clark and Clark et al. (U.S. Pat. Nos. 5,321,611;
5,433,615; 5,437,554; 5,458,493; 5,466,159; and 5,558,521) disclose
systems and methods for combining OMR and image processing wherein
only a predefined area of a document (an "area of interest") is
captured and stored.
[0008] Another aspect of the problem of processing test answer
sheets having both multiple-choice and open-ended questions
involves the scanning apparatus used to convert a written document
into digital data. The use of combined OMR and image capture
devices is disclosed by Poor '379, Keogh et al. '669, and Clark et
al. '554.
SUMMARY OF THE INVENTION
[0009] It is therefore an object of the present invention to
provide a system and method for processing and scoring test answer
sheets having both multiple-choice and open-ended questions.
[0010] It is another object to provide such a system and method
that retains a full image of a test form so that it is retrievable
by a scorer.
[0011] It is an additional object to provide such a system and
method that captures OMR and image data in a unitary device.
[0012] It is a further object to provide such a system and method
that obviates the need for trigger or timing marks on a test
form.
[0013] It is yet another object to provide such a system and method
that distributes answers for scoring to a qualified reader.
[0014] It is yet an additional object to provide a flexible system
architecture for imaging test answer sheets, storing the images,
and distributing the images to a qualified reader for scoring.
[0015] It is yet a further object to provide such a system and
method that includes a tool for performing a geometric measurement
upon a displayed image of an answer sheet.
[0016] These and other objects are provided by the imaging and
scoring system and method of the present invention. The system
includes integrated hardware elements and software processes for
capturing optical mark and full visual images of an answer page,
for storing the images, for retrieving the images, for distributing
the visual images to a reader for scoring, for assisting the reader
in scoring, and for monitoring the reader's performance.
[0017] The scanning system comprises means for sequentially
advancing each page of a plurality of answer pages along a
predetermined path. Positioned along the path are mark imaging
means (OMR, optical mark recognition; OCR, optical character
recognition) for capturing a location of an optical mark on each
answer page and visual imaging means for capturing a full visual
image of each answer page. A forms database in a server is provided
that contains data on the physical location and type (e.g.,
multiple-choice or open-ended) of each answer on each page.
Software means resident in the server operate with the forms
database to determine whether the captured image contains an answer
to an open-ended question. If such an open-ended answer is supposed
to be found on the page being imaged, the full visual image of the
page is stored.
[0018] In a particular embodiment the scanner further comprises
means for aligning the page image without the use of timing or
tracking marks. The aligning means comprises means for detecting a
page edge, which is sufficient for pages having only open-ended
answers.
[0019] The present invention further includes a system and method
for distributing one of a batch of answer images to a reader for
scoring. The answer images typically comprise open-ended answers
such as are obtained from the scanning system and method as
described above. Preferably each batch of answer images are from a
common test, although this is not intended as a limitation.
[0020] The method comprises the steps of fetching a batch of
answers to a test question from a storage device and placing them
in a temporary cache. These fetching and temporary storing steps
are preferably under the control of a server. This server contains
a database associating each answer batch with a qualification
required of a reader. Another database resident therein contains a
list of qualifications possessed by each reader.
[0021] A reader who is in electronic communication with the cache
indicates a readiness for scoring, and that reader's
qualifications, which are resident in the server, permit the
routing to the reader of one of an available batch of answers based
upon predetermined criteria such as priority associated with a test
to be scored. An answer image from an appropriate answer batch is
electronically delivered to the reader's workstation for scoring.
Once the scoring of that answer is complete, the server will
distribute additional answer images to that reader until the batch
is completely scored or the reader exits the system. Typically, a
similarly qualified group of readers score answer images from the
same batch.
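The qualification-based routing described in the preceding paragraph might be sketched as follows. This is an editorial illustration only: the names (`Batch`, `route_answer`), the priority rule, and the data layout are assumptions, not part of the patent's disclosure.

```python
# Hypothetical sketch of routing one answer image from an eligible batch
# to a qualified reader. Names and structure are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Batch:
    batch_id: str
    required_qualification: str   # e.g. "sixth-grade math"
    priority: int                 # higher-priority batches are served first
    answers: list = field(default_factory=list)

def route_answer(batches: list, reader_qualifications: set) -> Optional[tuple]:
    """Pick the highest-priority non-empty batch the reader is qualified
    for and hand back one answer image from it, or None if none match."""
    eligible = [b for b in batches
                if b.required_qualification in reader_qualifications and b.answers]
    if not eligible:
        return None
    best = max(eligible, key=lambda b: b.priority)
    return best.batch_id, best.answers.pop(0)
```

A reader qualified only for "sixth-grade math" would receive the next image from the matching batch; a reader with no matching qualification receives nothing until the supervisor assigns a suitable queue.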
[0022] The present invention additionally includes a system and
method for displaying a test answer page to a reader for scoring.
In this aspect, the page number for a particular test is used to
access a forms layout database, which contains a location of the
sector on which the open-ended question is expected to be found.
The page image is then formatted to display that answer sector to
the reader. Means are also provided for permitting access to the
remainder of the page, such as by scrolling on a workstation
screen, or to additional pages if the item answer covers multiple
pages.
[0023] Formatting also comprises providing a scoring protocol for
the answer and displaying commensurate indicia to the reader to
assist in scoring. For example, a button bar can be displayed on a
screen, an item of which can be selected for entering a score.
[0024] Another scoring facilitator available to the reader
comprises a geometric measurement tool that can be superimposed on
an answer and manipulated to provide an indication of how close to
an "ideal" answer the student has come.
[0025] Scoring is also assisted by an electronic querying system
and method, whereby a query is electronically transmitted to
successively higher levels of supervisors until an answer can be
obtained. The answer is then electronically relayed back through
the same levels so that all intermediate personnel can benefit from
the knowledge.
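The escalation-and-relay pattern above can be illustrated with a short sketch. The supervisor chain, message format, and function names here are assumptions for illustration; the patent does not specify an implementation.

```python
# Illustrative sketch of the query escalation described above: pass a
# query up successive supervisor levels until one can answer, then relay
# the answer back through everyone who saw the query.
def escalate_query(question, supervisors, knows_answer):
    """supervisors: list ordered from lowest to highest level.
    knows_answer(level, question) -> bool is a stand-in oracle."""
    asked = []
    answer = None
    for level in supervisors:
        asked.append(level)
        if knows_answer(level, question):
            answer = f"answer from {level}"
            break
    # Relay back down in reverse so all intermediate personnel see it.
    relay_path = list(reversed(asked))
    return answer, relay_path
```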
[0026] In order to monitor the scoring effectiveness of a reader,
means are provided for transmitting a calibration answer for
scoring. The reader is unaware that this is not another answer in
the regular workflow queue. The score granted by the reader can be
compared against a target score to judge that reader's
effectiveness. In addition, scoring time can be tracked to obtain a
measure of scoring speed. Similarly, the calibration answer can be
given to a plurality of readers for obtaining effectiveness and
speed statistics for a group of readers.
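The calibration bookkeeping described above might be reduced to two small metrics. The exact formulas below are assumptions; the patent describes comparing scores to a target and tracking scoring time without fixing a formula.

```python
# Hypothetical calibration metrics; definitions are illustrative only.
def reader_effectiveness(given_scores, target_score):
    """Fraction of calibration answers scored exactly at the target."""
    hits = sum(1 for s in given_scores if s == target_score)
    return hits / len(given_scores)

def reader_efficiency(elapsed_seconds, target_seconds):
    """Ratio of target scoring time to actual time; > 1.0 means the
    reader scored faster than the target."""
    return target_seconds / elapsed_seconds
```

The same functions could be applied over a plurality of readers' scores on one calibration answer to obtain group statistics.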
[0027] The features that characterize the invention, both as to
organization and method of operation, together with further objects
and advantages thereof, will be better understood from the
following description used in conjunction with the accompanying
drawing. It is to be expressly understood that the drawing is for
the purpose of illustration and description and is not intended as
a definition of the limits of the invention. These and other
objects attained, and advantages offered, by the present invention
will become more fully apparent as the description that now follows
is read in conjunction with the accompanying drawing.
BRIEF DESCRIPTION OF THE DRAWINGS
[0028] FIG. 1 is a schematic of a hardware configuration of a
preferred embodiment of the scoring system.
[0029] FIG. 2 is a schematic of the data processing functions and
applications of the scoring system.
[0030] FIG. 3 is a schematic of a network architecture useful in
the scoring system.
[0031] FIG. 4 is a flowchart of representative image processing and
storing steps in the method of the present invention.
[0032] FIG. 5 is a flowchart of a representative process for
distributing an answer to a reader for scoring in the method of
this invention.
[0033] FIG. 6 is a flowchart of representative steps in the scoring
process of the present invention following the distribution of an
answer to a reader.
[0034] FIG. 7A illustrates an exemplary page of a literature test
having one multiple-choice question and one open-ended
question.
[0035] FIG. 7B illustrates a display of the image processed from
the page of FIG. 7A as displayed to a reader for scoring.
[0036] FIG. 8A illustrates an exemplary page of a geometry test
having one multiple-choice question and one question requiring the
student to draw a diagram.
[0037] FIG. 8B illustrates a display of the image processed from
the page of FIG. 8A as displayed to a reader for scoring.
[0038] FIG. 9 is a flowchart of representative steps in the reader
calibration process of the present invention for tracking scoring
efficiency and effectiveness.
[0039] FIG. 10 illustrates an exemplary header sheet for a batch of
test booklets.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0040] A description of the preferred embodiments of the present
invention will now be presented with reference to FIGS. 1-10.
[0041] The Image Capturing and Storage System and Method
[0042] A schematic of a hardware configuration of a preferred
embodiment of the present invention is illustrated in FIG. 1, which
includes the imaging and image storing elements, and in FIG. 3,
which includes the network architecture. Software application
elements are included in the data processing flow diagram of FIG.
2. A flowchart of representative image processing and storing steps
is given in FIG. 4, and two exemplary answer pages are illustrated
in FIGS. 7A and 8A. The imaging and scoring system 10 hardware
elements include a scanner 20 for imaging answer pages. A preferred
embodiment of the scanner 20 comprises a modified Scan-Optics 9000
unit, rated for 120 pages/min.
[0043] Standardized tests are typically given in batches to
students belonging to a particular group, for example, a plurality
of sixth-grade students from different schools and different
classrooms in a particular geographical region. Each student
receives a coded booklet comprising a plurality of pages, and,
following test administration, all the test booklets are delivered
to a scoring center for processing. A header page 13 (FIG. 10)
provides alphanumeric character and OMR-readable data for tracking
the booklets. Header page 13 includes, for example, such
information as teacher name 131 ("Mrs. Smith"), grade level 133
("6"), and school code 132 (134274), the latter two having an
associated "bubble" filled in for each number. This configuration
is exemplary and is not intended as a limitation. One or more of
such batches may together form an "order," and a number is also
assigned to track this (e.g., all Grade 6 classes in Greenwich,
Conn.). Another tracking means comprises "cart number," which
indicates a physical location of the booklets. Each test booklet is
entered, for example, via bar code, for later demographic
correlation with scores, and is cut apart into individual, usually
two-sided pages (FIG. 4, step 899).
[0044] The test booklet pages are stacked sequentially into an
entrance hopper 201 of a scanner 20, and each page 12 is fed by
methods well known in the art onto a belt 21 for advancing the page
12 along a predetermined path (FIG. 4, step 900). The belt 21 has a
substantially transparent portion for permitting the page 12 to be
imaged on both sides simultaneously by two sets of cameras.
[0045] A first set of cameras includes an upper 22 and a lower 23
camera, each filtered for infrared wavelengths. This set 22,23 is
for optical mark recognition (OMR), used to detect the location of
pencil marks, for example, filled-in bubbles such as are common in
multiple-choice answers, on both sides of the page 12 (step 903).
Alternatively, OCR marks are detected and processed (step 903).
[0046] The OMR scan data are greyscale processed by means 42 known
in the art for detection of corrections and erasures. The data are
then routed to a long-term storage device (step 906), such as
magnetic tape 41, for later scoring and further processing in a
mainframe computer 40.
[0047] A second set of cameras includes an upper 24 and a lower 25
camera, each substantially unfiltered. This set 24,25 is for
capturing a full visual image of both sides of the page 12 (step
907).
[0048] The page 12 continues along the path on the belt 21 and is
collected in sequence with previously scanned pages in an exit
hopper 202.
[0049] The scanner 20 is under the control of a first server 26,
such as a Novell server, which performs a plurality of
quality-control functions interspersed with the imaging functions.
Software means 261 resident in the first server 26 determine that
each page being scanned is in sequence (step 904) from preprinted
marks on the page indicating page number. If it is not, the
operator must correct the sequence before being allowed to continue
scanning (step 905).
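The step-904 sequence check above can be sketched as a simple comparison of each page's preprinted page number against its expected position. The function name and return convention are illustrative assumptions.

```python
# Hypothetical sketch of the step-904 page-sequence check.
def check_sequence(scanned_page_numbers):
    """Return the index of the first out-of-sequence page, or None if
    the stack is in order (step 904). An operator must correct any
    reported position before scanning resumes (step 905)."""
    for i, page_no in enumerate(scanned_page_numbers):
        if page_no != i + 1:
            return i
    return None
```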
[0050] The first server 26 also has software means 262 for
determining whether the page 12 is scannable (step 901). Pages
containing OMR data contain timing tracks 125 as are known in the
art (see FIG. 7A) for orienting the page with respect to optical
mark position. A page from which these are missing is not scannable;
a substitute page marked "unscannable" is placed into the document,
indicating to the reader that a request for a hard copy must be
made before this page can be scored (step 902).
[0051] In addition, a screen 27 is in communication with the first
server 26 that displays to the operator a preselected number of
visual images (step 911). For example, the operator may choose to
view every nth page scanned. Should the quality be deemed
insufficient (step 912), the scanner 20 is stopped (step 913),
maintenance functions or repairs are performed (step 914), and the
affected group of pages is rescanned (step 900). This is a
custom-designed function, a scanning activity monitor, that
automatically searches the output files looking for the latest
cart-stack combination and then displays the latest images from the
cameras 24,25 for operator review.
[0052] The first server 26 further contains a forms database 265 of
answer pages that comprises data on the physical location of each
answer and a type of answer for each page in the answer booklet.
The answer type may be, for example, an answer to an open-ended
question or a multiple-choice question. FIG. 7A illustrates a
sample page 12 from a literature test, wherein Question #1 71 is
multiple-choice and Question #2 72 is open-ended, with an answer
space 73 provided for writing an answer 74. Likewise in FIG. 8A, a
sample page 12' from a geometry test, Question #1 81 is
multiple-choice and Question #2 82 is open-ended, with an answer
space 83 provided for drawing a diagram 84. A correlation is
performed between the page number and the forms database (step 908)
to determine whether the page 12,12' contains an open-ended answer.
If so (step 909), the page image is prepared for storage (step
910); if not, the page image is not saved.
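The step-908 correlation above amounts to a lookup in the forms database followed by a retention decision. The structure of `forms_db` below is an assumption for illustration; the patent specifies only that the database records the physical location and type of each answer.

```python
# Sketch of steps 908-910: keep the full visual image only when an
# open-ended answer is expected on the page. forms_db layout is assumed.
forms_db = {
    # page number -> list of (question id, answer type)
    1: [("Q1", "multiple-choice"), ("Q2", "open-ended")],
    2: [("Q3", "multiple-choice")],
}

def should_store_image(page_number):
    """Return True if any answer expected on this page is open-ended."""
    return any(kind == "open-ended" for _, kind in forms_db.get(page_number, []))
```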
[0053] The first server 26 also contains means for detecting an
edge, preferably an uncut edge 120, of the imaged page. Edge
detection is utilized to align the visual image for answer pages
containing only open-ended answers. This is beneficial for several
reasons:
[0054] (1) the answer booklets are more economical to produce,
since tracks do not need to be printed and printing accuracy is
less important; (2) there is less chance of tampering; and (3) the
booklets have greater aesthetic appeal.
[0055] A page image that is to be saved is stored temporarily in a
second server, comprising a fast storage server 28 (step 915) that
has a response time sufficiently fast to keep pace with the visual
image scanning step 907. Such a second server 28 may comprise, for
example, a Novell 4.x, 32-Mb RAM processor with a 3-Gb disk
capacity. Means are provided here for ensuring that the OMR and
image data are in synchrony (step 916). If they are not, data may
have to be reconstructed or images rescanned (step 917).
[0056] The data are transferred at predetermined intervals to a
third server 30 having software means 302 resident therein for
performing a high-performance image indexing (HPII) on the visual
image (step 918). This is for processing the data for optical
storage and retrieval (OSAR). Third server 30 may comprise, for
example, a UNIX 256-Mb RAM processor with a 10-Gb disk capacity
having 3.2.1 FileNet and custom OSAR software resident thereon.
[0057] The answer images are finally transferred to a long-term
storage (step 919) unit 34 for later retrieval. Such a unit 34 may
comprise, for example, one or more optical jukeboxes, each
comprising one or more optical platters. Preferably two copies are
written, each copy to a different platter, for data backup.
[0058] Next the transaction log data are transferred to a fourth
server 32. Fourth server 32 may comprise, for example, a UNIX 64-Mb
RAM processor having Oracle and FileNet software resident
thereon.
The Distribution and Queue Monitoring System and Method
[0060] Once a complete batch of answer pages (a "batch" comprising,
for example, all test booklets from a particular grade level at a
particular school) has been imaged and stored, scoring can
commence. FIG. 5 is a flowchart of an exemplary distribution
process of the present invention, wherein a first step 950
comprises determining an answer batch from a queue to be scored
during a particular time period.
[0061] In a preferred embodiment, a determination is made prior to
the start of a scoring session as to which batches of answers are
desired to be scored during that session. This determination may be
based, for example, on predetermined criteria including an assigned
priority, project number, order number, and number and type of
readers available, and is entered into a fifth server 36, which
provides a communication link between the fourth server 32, the
cache 38, reader workstations 50, and the mainframe 40, as will be
discussed in the following (FIG. 1). Fifth server 36 comprises, in
an exemplary embodiment, a DEC-Alpha server having 512 Mb RAM and
12-Gb disk capacity, with 3.2c UNIX and 7.2.2.3 Oracle resident
therein.
[0062] The desired batches are prefetched (step 951) from the
long-term storage unit 34 and temporarily stored (step 952) in a
cache 38, as directed by the OSAR system 322 in the fourth server
32 under the control of the fifth server 36. These prefetching and
temporary storage steps 951,952 confer a speed advantage over
having readers access the long-term storage unit 34 directly, which
is comparatively slow, whereas the cache 38 response time is rapid.
An exemplary cache 38 for use in the system comprises a FileNet
residing on the OSAR server and contains 12 GB of magnetic storage
for this transient database.
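The prefetching steps 951 and 952 can be sketched as a copy from slow long-term storage into the fast cache ahead of reader demand. The dictionary interfaces below are hypothetical stand-ins for the OSAR unit and the FileNet cache.

```python
# Illustrative prefetch: stage a batch's images in the cache (step 952)
# after retrieving them from slow long-term storage (step 951).
def prefetch_batch(batch_id, long_term_storage, cache):
    """Copy one batch's answer images into the cache so subsequent
    reader requests are served at cache speed; returns the image count."""
    images = long_term_storage[batch_id]   # comparatively slow retrieval
    cache[batch_id] = list(images)         # transient fast copy
    return len(images)
```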
[0063] The fifth server 36 contains a first database 362
associating each answer batch with a qualification required of a
reader (e.g., sixth-grade math, New York State test). A second
database 364 resident therein contains a list of qualifications
possessed by each reader. A third database 366 resident therein
contains the form data for each answer, including the number of
questions and pages in the test, how each answer is to be scored,
and in what form the answer image is to be presented to a reader.
For example, information on the page in FIG. 7A would include the
location of the answer blank 73 to Question #2 and the answer scale
to be used in scoring that question (e.g., a score of 1-5).
[0064] After the answer batch is lodged in the cache 38, the
question qualification 362 and forms 366 databases are referenced
(steps 953 and 954), and a work queue is established, which is
selected by a supervisor managing a group of readers (step
955).
[0065] When a reader logs onto a workstation 50, his or her
qualifications will have been checked by the supervisor. The reader
receives an answer from the chosen batch for scoring (step 957).
The answer image is formatted for display (step 958) and delivered
to the reader's workstation 50 (step 959).
[0066] The formatting step 958 comprises accessing the forms
database 366 to determine how the answer image and scoring protocol
are to be displayed to the reader. For example, an area of interest
73 (FIG. 7A) or 83 (FIG. 8A), which comprises the space left for
writing in an answer, is delineated on each page image, and it is
this area that initially appears on the reader's workstation screen
51 (FIGS. 7B and 8B). An important feature of the present invention
is that the reader can also access the remainder of the image if
desired, which can be necessary if the student has written outside
the area provided for that particular question (see FIG. 6, steps
988,989), and may even spill over onto another page. Such access is
typically provided by a scroll bar 510 such as are known in the art
in Windows.RTM.-type applications (FIGS. 7B and 8B). This feature
provides an advantage over other systems known in the art in which
the visual image is clipped to include only a predetermined area of
interest, in which case this extra display information is lost.
[0067] Once the reader has finished with an answer, a score is
entered into the workstation 50 (step 960), which is delivered to
and stored at the fifth server 36 (step 962). Next the reader
receives another answer to score from the same batch, if there are
additional answers of the same test question remaining in the queue
(step 962). If that queue is empty, the supervisor selects another
answer batch from the queue (step 955). Once the batch is
completely scored, the scores are assembled and transmitted by the
fifth server 36 to the mainframe 40 (step 965), where all the
individual answer scores are correlated for each booklet and a
total test score is calculated. This step typically occurs once per
day.
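The scoring loop of paragraph [0067] can be summarized in a short sketch. This is an assumed, simplified model: the dictionary of answers, the callback standing in for the reader, and the function name are illustrative, not the patented implementation. It shows the essential flow of steps 960-965: score each answer in the batch, store the scores, and return the assembled batch for transmission.

```python
# Illustrative sketch of the batch-scoring loop: each answer in a
# batch is delivered to a reader, the entered score is stored, and
# the assembled scores are returned once the batch is complete.

def score_batch(batch, reader_score_fn):
    """Collect a score for every answer in the batch.

    batch: mapping of answer id -> answer image/content.
    reader_score_fn: stand-in for the reader entering a score (step 960).
    """
    stored = {}
    for answer_id, answer in batch.items():
        stored[answer_id] = reader_score_fn(answer)  # stored (step 962)
    return stored  # assembled for transmission to the mainframe (step 965)

scores = score_batch({"a1": "resp1", "a2": "resp2"}, lambda a: len(a))
```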
[0068] The progress and speed of any particular reader or the
status of a particular queue are monitored by accessing the fifth
server 36, which maintains statistics (step 963) and a table of
workflow queues (step 964). Access to this information may be
limited, for example, to supervisory or managerial personnel by
means known in the art.
[0069] The Scoring and Reader Monitoring System and Method
[0070] One aspect of the scoring system and method of the present
invention is illustrated in the flowchart of FIG. 6, which provides
further details of the steps occurring between step 957, the
delivery of an answer to a reader for scoring, and step 960, the
entry of a score, in FIG. 5.
[0071] As indicated above, the answer, prior to delivery (step
957), is formatted for electronically selecting an area of interest
73 or 83 for displaying to the reader, along with a scroll bar
75,85 for permitting the reader to access the remainder of the page
12,12' (FIGS. 7A,8A). The answer is also formatted for scoring
protocol, and, as illustrated in FIGS. 7B and 8B, a score button
bar 76,86 is provided that corresponds to the scoring range for
that question. In FIG. 7B, the scores are given on a scale of 1 to
5; in FIG. 8B, 1 to 4. Answers that cannot be given a numeric grade
are considered invalid and are scored in a separate category (e.g.,
blank, foreign language, off-topic).
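The score-entry rule implied by paragraph [0071] can be expressed as a small validation sketch. The function and the exact category names are assumptions for illustration (the patent names "blank," "foreign language," and "off-topic" as examples): an entry is valid if it is an integer within the question's scoring range, or one of the recognized invalid categories.

```python
# Minimal sketch of score-entry validation: a numeric score must fall
# within the question's range (e.g., 1-5 or 1-4), while unscorable
# answers fall into named invalid categories.

INVALID_CATEGORIES = {"blank", "foreign language", "off-topic"}

def validate_entry(entry, low, high):
    """Accept an integer in [low, high] or a recognized invalid category."""
    if isinstance(entry, int):
        return low <= entry <= high
    return entry in INVALID_CATEGORIES
```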
[0072] Scoring facilities such as are known in the art generally
comprise groups of readers having similar qualifications who are
assigned to types of questions to score. Such groups may be further
subdivided into smaller groups, and a commensurate management tree
structure created. Preferably this tree structure is mirrored in
the hardware architecture (FIG. 3), wherein, for example, a
supervisor has access to all reader workstations 50 in that
group.
[0073] To proceed with scoring, the formatted answer and score
button bar 76,86 are displayed to the reader (step 980). If the
reader has
a question regarding the scoring protocol (step 981), a query is
sent electronically upline to the reader's next-level supervisor
(step 982). If that supervisor can answer the question (step 983),
a response is transmitted electronically to the reader (step 984);
if that supervisor cannot answer the question (step 983), a query
is transmitted upline to the next-level supervisor (step 982),
looping through as many levels of supervisors as are present until
the query can be addressed. When the query is answered, the answer
is relayed to the reader through all intermediate query relayers
(step 984) so that all levels of personnel can view the answer to
the query. While the query is being routed, the reader can continue
scoring another answer.
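The escalation loop of steps 981-984 can be sketched as follows. This is a hedged illustration, not the patented software: the supervisor chain, the `can_answer` predicate, and the response text are hypothetical stand-ins. It captures the two behaviors the paragraph describes: the query climbs the supervisor chain until some level can address it, and the response is then relayed back down through every intermediate level.

```python
# Sketch of query escalation up a supervisor chain: the query travels
# upline until a supervisor can answer it (steps 982-983), and the
# response is relayed back through all intermediaries (step 984).

def escalate(query, chain, can_answer):
    """Return (answer, relay_path) for a reader's scoring-protocol query.

    chain: supervisors ordered from the reader's immediate supervisor up.
    relay_path: the levels the response passes through on its way down.
    """
    relay_path = []
    for supervisor in chain:
        relay_path.append(supervisor)        # step 982: send upline
        if can_answer(supervisor, query):    # step 983
            answer = f"{supervisor}: resolved"
            return answer, list(reversed(relay_path))  # step 984
    return None, relay_path  # no level could answer

ans, path = escalate("rubric?", ["sup1", "sup2", "lead"],
                     lambda s, q: s == "lead")
```

Returning the reversed relay path mirrors the requirement that all intermediate levels see the answer on its way back to the reader.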
[0074] Once the query is answered, or if there was no query, the
reader can continue scoring that answer. If the test is in geometry
or some other discipline wherein an answer can comprise the drawing
of a diagram, a software tool is made available to the reader to
assist in scoring (step 985). If needed, the geometric tool is
fetched (step 986) and utilized to score the answer. In the example
shown in FIG. 8B, a right triangle was drawn, and thus a floating
protractor 87 can be used to measure the right angle 840. Also
available are screen-manipulable tools for measuring areas, lines,
and circles. This software in the preferred embodiment comprises a
custom-designed package.
[0075] The reader then determines if the image display is
sufficient for scoring the answer (step 987). If so, the reader can
score the answer (step 960); if not, the reader can use the scroll
bar 510 to access another area of the page, or an area on another
page, to view additional parts of the visual image (step 988).
[0076] Another aspect of the present invention includes a system
and method for monitoring the scoring effectiveness of a reader,
the steps for which are included in the flowchart of FIG. 9. A
group supervisor, for example, sends a calibration answer having a
predetermined target answer to a reader (step 990). This answer is
interspersed with "real" student answers and is substantially
identical in form thereto, which permits the calibration to be
performed transparently.
[0077] A score entered by the reader (step 991) is collected (step
992) and electronically compared with the target score (step 993)
for providing an indication of effectiveness (step 994). At the
same time, the scoring time can be collected (step 992) and
compared with a target scoring time (step 993) for a calculation of
scoring efficiency (step 994).
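The two calibration checks of steps 992-994 can be sketched numerically. The specific metrics below are assumptions chosen for illustration, not the patented formulas: effectiveness is modeled as an exact match against the predetermined target score, and efficiency as the ratio of target scoring time to actual scoring time.

```python
# Illustrative effectiveness/efficiency checks for a calibration answer:
# the reader's score is compared with the target score, and the scoring
# time with a target time. Both metrics are assumed, simplified forms.

def effectiveness(reader_score, target_score):
    """True when the reader matched the calibration target exactly."""
    return reader_score == target_score

def efficiency(elapsed_seconds, target_seconds):
    """Ratio of target time to actual time; 1.0 or more meets the target."""
    return target_seconds / elapsed_seconds
```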
[0078] Another check is performed by comparing a score given
holistically and analytically by an inconsistency application (970,
FIG. 2). If these scores differ too widely, they are rechecked to
ensure that an error was not made.
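The inconsistency check of application 970 can be sketched as a simple threshold test. The tolerance value is an assumed parameter (the patent says only that scores differing "too widely" are rechecked): an answer is flagged when its holistic and analytic scores diverge by more than the tolerance.

```python
# Minimal sketch of the holistic-vs-analytic inconsistency check:
# flag an answer whose two scores differ by more than a chosen
# tolerance, so it can be rechecked for error.

def inconsistent(holistic, analytic, tolerance=1):
    """True when the two scores differ too widely."""
    return abs(holistic - analytic) > tolerance
```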
[0079] As mentioned, scoring is typically performed by
electronically linked groups of readers having similar
qualifications. Thus the method illustrated in FIG. 9 can also be
expanded to monitor the effectiveness and efficiency of the entire
group of readers (steps 991-991") substantially simultaneously if
desired.
[0080] Statistics can also be amassed at the system level on
scoring progress for each workflow queue, broken down into scoring
groups or by individual readers. As these statistics are being
collected continuously, the system provides enormous flexibility in
terms of optimization of effort.
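The system-level statistics described in paragraph [0080] amount to grouped counts over scoring records. The sketch below is hypothetical in its record layout and key names; it shows scoring progress broken down either by scoring group or by individual reader from the same continuously collected records.

```python
# Hypothetical sketch of per-queue statistics: count scored answers
# grouped by scoring group or by individual reader.
from collections import defaultdict

def progress_by(records, key):
    """Count scored answers grouped by the given key ('group' or 'reader')."""
    counts = defaultdict(int)
    for rec in records:
        counts[rec[key]] += 1
    return dict(counts)

records = [{"reader": "r1", "group": "g1"},
           {"reader": "r2", "group": "g1"},
           {"reader": "r1", "group": "g1"}]
```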
[0081] System Architecture and Software System Flow
[0082] An exemplary architecture for a preferred embodiment of the
present system 10 is schematically illustrated in FIG. 3, and
comprises a fiber distributed data interface (FDDI) 61 having a
throughput of 100 Mbit/s. In this embodiment a 100-Mbit/s fiber
link is employed to join the subsystems.
[0083] Connected to the FDDI 61 are the Novell server 28 and the
UNIX servers 30 and 36. The cache 38 and the jukebox 34 are
connected through the server 30. A first hub 62 is connected to the
FDDI 61 and, via 10-Mbit/s lines, to the scanners 20, which output
to magnetic tape 41, as shown in FIG. 1, and thence to mainframe 40.
A second hub 63 is connected to the FDDI 61 and, via 10-Mbit/s
lines, to the reader workstations 50. Second hub 63 acts as a
concentrator and has a 100-Mbit/s link to the FDDI 61. Each
workstation 50 has a 10-Mbit/s Ethernet connection out.
[0084] It is believed that this architecture confers advantages
over systems previously known in the art, which employ token rings
having limited throughput and one server per group. The present
system comprises central servers supporting all readers, which
permits improved flexibility both in hardware and in software
implementation. This architecture further permits adaptation to
remote scoring sites.
[0085] The software system flow is illustrated in FIG. 2, wherein
each "scoring work unit" (SCO WRK UN), here shown as 74 in FIG.
7A, comprises an answer image. The applications bear the same
numbers as the steps they perform in the flowcharts. In addition,
various
caches are maintained between applications, including: transaction
data 971 from the scanning operation 907; rescanned 972 and new
booklet 973 information from HPII document committal; image quality
work units 974 acted upon by the image quality application 912, the
distributor application 957, the question application 981, and the
scoring application 960; regular holistic and analytical scores 975
from the scoring 960, route 965, and question 981 applications;
domain item questions 976, wherein pending questions are held until
they are resolved; pending scores 977 for holding incomplete
scores; calibration work units 978; and inconsistency work units
979.
[0086] New Form Definition
[0087] The system of the present invention further comprises a
table-driven system for entering new project configurations,
including teams, forms, domains, and orders. This allows the
scoring to be customized for each project without any recoding.
[0088] It may be appreciated by one skilled in the art that
additional embodiments may be contemplated, including analogous
systems and methods for processing questionnaires.
[0089] In the foregoing description, certain terms have been used
for brevity, clarity, and understanding, but no unnecessary
limitations are to be implied therefrom beyond the requirements of
the prior art, because such words are used for description purposes
herein and are intended to be broadly construed. Moreover, the
embodiments of the apparatus illustrated and described herein are
by way of example, and the scope of the invention is not limited to
the exact details of construction.
[0090] Having now described the invention, the construction, the
operation and use of preferred embodiment thereof, and the
advantageous new and useful results obtained thereby, the new and
useful constructions, and reasonable mechanical equivalents thereof
obvious to those skilled in the art, are set forth in the appended
claims.
* * * * *