Publication number: US 5987302 A
Publication type: Grant
Application number: US 09/045,622
Publication date: Nov 16, 1999
Filing date: Mar 20, 1998
Priority date: Mar 21, 1997
Fee status: Paid
Also published as: CA2284912A1, CA2284912C, WO1998043223A1
Inventors: Gary Driscoll, Lou A. Hatfield, Arlene A. Johnson, Helen D. Kahn, Theodore E. Kessler, David L. Kuntz, Cheryl A. Pocino, Michele Rosenthal, Patricia G. Williams
Original Assignee: Educational Testing Service
On-line essay evaluation system
US 5987302 A
Abstract
Systems and methods for on-line essay evaluation offer students the opportunity to prepare practice essays, submit the essays to trained, expert readers, and retrieve an evaluation at the student's convenience. The system provides the capability for a user or test taker to submit essays at any time during the year, independent of the timing of an actual testing event, and to receive prompt, consistent evaluations of the submitted essays. Further, the system provides the capability to prioritize essays and schedule readers so that essays can be evaluated on a rush basis. The essays are evaluated in a manner that provides useful instructional feedback to students about their skills relative to any assessment or test that the student wishes to take.
Claims (15)
What is claimed is:
1. A method for on-line evaluation of a constructed response to an examination question, comprising the steps of:
accessing a web site via the Internet, ordering an evaluation account, and selecting an examination question;
an examinee constructing an essay response to said examination question;
electronically submitting said essay response to said web site for evaluation;
positioning the submitted essay response in a queue of essay responses that need to be evaluated by a grader, the essay being placed in the queue based upon at least one of the time of submission of said essay response and the date an evaluation of the essay response is due to the examinee;
the grader evaluating the essay response in accordance with calibrated grading guidelines, selecting an overall evaluation, at least one pre-defined feedback comment, and a score; and
releasing for retrieval by the examinee the overall evaluation and the pre-defined feedback comments regarding said essay response.
2. An on-line essay evaluation system for submitting essay responses for evaluation by graders and retrieving an evaluation of the essay response from the grader for submission to the examinee, comprising:
a database which stores a plurality of questions which elicit essay responses from an examinee;
an essay response submission system which selects a question from said database and which submits an essay response to the selected question;
an essay response administrative system which stores essay responses from a plurality of examinees in a queue and selectively distributes the queued essay responses to graders for evaluation;
an automated grading system which enables a grader to evaluate an essay response, to select an overall evaluation, at least one pre-defined additional comment, and a score; and
an evaluation delivery system which releases to the examinee for the examinee's retrieval said overall evaluation and any pre-defined additional comments from the grader.
3. The system of claim 2 wherein said essay response administrative system schedules graders to evaluate queued essay responses so that essay responses are evaluated and feedback is stored and released for retrieval by the examinee within a predetermined time period.
4. The system of claim 2 wherein said essay response administrative system prioritizes the presentation and distribution of essay responses to graders so that essay responses are evaluated and feedback is stored and released for retrieval by examinees within a predetermined time period.
5. The system of claim 2 wherein said essay response administrative system automatically calculates any compensation for the graders based upon the number of constructed responses the grader has evaluated.
6. The system of claim 2 wherein said automated grading system enables a grader leader to view the overall evaluation and pre-defined additional comments of the grader and assign a new overall evaluation and pre-defined additional comments if desired.
7. The system of claim 2 wherein said evaluation delivery system releases to the examinee for the examinee's retrieval at least one exemplary essay response, said overall evaluation, and said pre-defined additional comments.
8. The system of claim 2 wherein said essay response administrative system includes software which:
identifies the number of essay responses which need to be evaluated in an upcoming time period;
identifies the number of essay responses which are scheduled to be evaluated in the upcoming time period;
if the number of essay responses that need to be evaluated in the upcoming time period is less than the number of essay responses that are scheduled to be evaluated in the upcoming time period, identifies the number of essay responses which need to be evaluated in the upcoming time period; and
if the number of essay responses that need to be evaluated in the upcoming time period is greater than the number of essays that are scheduled to be evaluated in the upcoming time period, electronically notifies backup graders of the need to provide assistance in the upcoming time period.
9. The system of claim 2 wherein said essay response administrative system includes software which:
identifies the date and time an essay response was submitted;
identifies whether the essay response is to be graded on a regular schedule or a rush schedule;
calculates a date for which an evaluation of the essay response is due based upon the date the essay response was submitted and whether the essay response is to be evaluated on a regular schedule or a rush schedule; and
places the essay response in a queue of essay responses that is ordered by the date on which an evaluation of each response is due.
10. The system of claim 2 wherein said essay response administrative system includes software which:
identifies a grader;
identifies the total number of essay responses read by the grader; and
calculates the compensation for the grader as a function of the number of essay responses read.
11. The system of claim 2 wherein said essay response administrative system includes software which:
identifies that a grader has submitted an overall evaluation and score for an essay response;
forwards the essay response and overall evaluation to a grader leader for review; and
if the overall evaluation and score are not consistent with scoring and evaluation standards, allows the grader leader to assign a new score and evaluation, so long as the grader leader performs the review within a predetermined time period.
12. The system of claim 2 wherein said automated grading system presents an essay response to a grader so that the grader may evaluate the essay response; and presents evaluation categories to the grader so that the grader may select an overall evaluation.
13. The system of claim 2 wherein said evaluation delivery system groups the examination question, scoring guidelines, sample essay responses, the overall evaluation, and pre-defined additional comments into a feedback package; stores said feedback package; and releases said feedback package for retrieval by the examinee.
14. An Internet based system for submitting and evaluating essay responses to examination questions, comprising:
a server computer which forwards examination questions to examinees and accepts essay responses to said examination questions for evaluation, said server being connected to the Internet;
a first computer connected to said server computer so as to permit examinees to set up an account for purchasing access to an examination question and evaluation of an essay response submitted in response to said examination question; and
a second computer connected to said server computer so as to permit graders to read an essay response that was submitted by an examinee, evaluate the essay response by selecting an overall comment and at least one pre-defined feedback comment, store said overall comment and said at least one pre-defined feedback comment, and release said overall comment and said at least one pre-defined feedback comment for retrieval by the examinee via said server computer.
15. A method for on-line evaluation of constructed responses to an examination question over a system having a server computer connected to the Internet, a first computer electronically connected to said server computer, and a second computer connected to said server computer, comprising the following steps:
an examinee accessing the server computer from said first computer to set up an account for purchasing access to an examination question and evaluation of an essay response which is submitted in response to said examination question;
an examinee accessing the server computer from said first computer to submit an essay response in response to said examination question;
accepting said essay response at the server computer; and
a grader accessing the server computer from the second computer to read the essay response submitted by the examinee, evaluate the essay response by providing an overall comment and at least one pre-defined feedback comment, storing said overall comment and said at least one pre-defined feedback comment, and releasing said overall comment and said at least one pre-defined feedback comment for retrieval by the examinee.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

This application is related to U.S. Provisional patent application Ser. No. 60/039,383, filed Mar. 21, 1997, entitled "On-Line Essay Evaluation System," and U.S. Provisional patent application Ser. No. 60/071,893, filed Jan. 20, 1998, entitled "On-Line Essay Evaluation System," the contents of which are hereby incorporated by reference in their entirety.

FIELD OF THE INVENTION

The present invention relates to systems for on-line evaluation of essay responses to examination questions. More specifically, the invention relates to a system for the submission of essays to experienced readers/graders for evaluation and for providing consistent, instructional feedback to students.

BACKGROUND OF THE INVENTION

The use of standardized testing has increased significantly in recent years. Indeed, standardized tests are now an essential part of almost every admission or qualification procedure. A recent trend in standardized testing emphasizes a move beyond traditional multiple choice tests in favor of tests that require open-ended responses such as essay responses. These open-ended responses are often referred to as constructed responses (CRs). CRs are not limited to written text, but may include graphics, videotaped performances, audio responses, as well as other forms of responses. In order to improve the efficiency of scoring large scale standardized tests, both those offered at periodic administrations and those offered essentially on a daily basis, computer systems have been developed to automatically score multiple-choice responses and other simple response types. While some automatic scoring systems have been designed to score particular types of CRs (see e.g., U.S. patent application Ser. No. 08/794,498, entitled Automatic Scoring for Categorized Figural Responses, assigned to the same assignee hereunder), the evaluation of CRs is particularly well-suited to human raters. For this reason, certain computer scoring systems have been developed to facilitate and automate the electronic transmission of CRs to human raters for evaluation and scoring. However, these conventional CR transmission systems currently have many disadvantages.

Generally, essay evaluation systems operate in the context of a particular test administration. The systems are used to collect, distribute, and grade actual test responses for a particular test. There is not presently available a system which provides the capability of practicing test-taking skills, demonstrating content mastery, and receiving constructive feedback from qualified faculty who have scored actual test responses. Such a system would have tremendous benefits for those persons interested in improving essay writing scores and those preparing for tests. Thus, there is a need for a system that allows persons to improve testing skills by formulating responses to actual test questions in a practice setting and having their responses evaluated by actual test evaluators.

Current evaluation systems also lack an adequate means for scheduling readers/graders. A system which can accept large numbers of constructed responses requires a large number of readers to evaluate them. Such a system should track when readers are scheduled to be on the system and, if scheduled readers do not log on or do not grade a sufficient number of essays, notify backup readers.

Some prior systems permanently associate multiple essays or constructed responses with a particular evaluator or reader. The constructed responses remain associated with the reader regardless of whether the reader is logged onto the system. In such systems, when the reader is not logged-in, the constructed responses sit idle waiting for the reader to evaluate the response. A constructed response may wait in a reader's queue for several days before the reader evaluates the response. Under such a system it is not possible to control when a constructed response may be evaluated. Thus, there is a need in the art for a system for evaluating constructed responses that provides automated work load leveling of queued essay responses so that essay responses are evaluated and returned to the submitter or student within a predetermined time.

A further shortcoming of prior systems is the inability to prioritize constructed responses for grading. There is not presently available a means to expedite grading of particular constructed responses. Further, current systems do not account for such prioritization markings in the routing of constructed responses to readers so that the desired deadlines are satisfied. Thus, there is a need in the art for a system which allows users to prioritize their essays and which automatically routes essays to evaluators so that the desired dates for evaluation are met.

Another shortcoming in the art is the inability to automatically determine reader compensation based on work performed. Typically, in prior systems, a reader or evaluator is paid a flat fee for participation in a particular scoring project. In an optimal system, a reader's compensation is based upon the reader's work product, i.e., the number of constructed responses that were graded. Further, in an optimal system, the compensation might vary between questions. Thus, there is a need for a system which provides a means for calculating compensation based on the work actually performed by the reader.

An additional shortcoming in present systems is the inability to monitor reader evaluations so as to ensure consistency and quality in scoring. Without such a mechanism, scoring becomes inconsistent, which greatly diminishes the usefulness of the system to a student who needs consistent scoring in order to measure improvement. Thus, there is a need for a system whereby reader evaluations are monitored so as to ensure quality and consistency in grading.

Still a further shortcoming in present systems is the inadequate feedback that is provided to a student or user who has submitted an essay. Specifically, present systems lack the capability to provide consistent essay feedback from multiple readers and fail to provide students with samples of previously graded essay responses. Typically, in present grading systems, a single numerical score is assigned to a constructed response. In such systems, the user receives no feedback other than an overall score. Further, in those systems that provide additional feedback beyond an overall score, the responses are inconsistent and non-standardized across readers; generally, the feedback comprises free-hand comments by the reader. The lack of consistent feedback diminishes significantly the benefit of the scoring system to the user or test-taker. Additionally, present systems do not provide the user with samples of scored essays for the same question. Thus, there is a need in the art for a scoring system that provides consistent feedback and sample essay responses.

SUMMARY OF THE INVENTION

Methods and systems for on-line essay evaluation in accordance with the present invention address the above described and other shortcomings in the art. According to one aspect of the invention there is provided a method for on-line evaluation of a constructed response to an examination question. The method comprises the following steps: accessing a web site via the Internet, ordering an evaluation account, and selecting an examination question; an examinee constructing an essay response to the examination question; electronically submitting the essay response to the web site for evaluation; positioning the submitted essay response in a queue of essay responses that need to be evaluated by a grader, the essay being placed in the queue based upon at least one of the time of submission of the essay response and the date an evaluation of the essay response is due to the examinee; the grader evaluating the essay response in accordance with calibrated grading guidelines, selecting an overall evaluation, one or more pre-defined feedback comments, and a score; and releasing for retrieval by the examinee the overall evaluation and the pre-defined feedback comments regarding the essay response.

According to another aspect of the invention there is provided an on-line essay evaluation system for submitting essay responses for evaluation by graders and retrieving an evaluation of the essay response from the grader for submission to the examinee. The system comprises the following items: a database which stores a plurality of questions which elicit essay responses from an examinee; an essay response submission system which selects a question from the database and which submits an essay response to the selected question; an essay response administrative system which stores essay responses from a plurality of examinees in a queue and selectively distributes the queued essay responses to graders for evaluation; an automated grading system which enables a grader to evaluate an essay response, to select an overall evaluation, one or more pre-defined additional comments, and a score; and an evaluation delivery system which releases to the examinee for the examinee's retrieval the overall evaluation and any pre-defined additional comments from the grader.

The essay response administrative system schedules graders to evaluate queued essay responses so that essay responses are evaluated and feedback is stored and released for retrieval by the examinee within a predetermined time period. The essay response administrative system includes software which performs the following operations: identifies the number of essay responses which need to be evaluated in an upcoming time period; identifies the number of essay responses which are scheduled to be evaluated in the upcoming time period; if the number of essay responses that need to be evaluated in the upcoming time period is less than the number of essay responses that are scheduled to be evaluated in the upcoming time period, identifies the number of essay responses which need to be evaluated in the upcoming time period; and if the number of essay responses that need to be evaluated in the upcoming time period is greater than the number of essays that are scheduled to be evaluated in the upcoming time period, electronically notifies backup graders of the need to provide assistance in the upcoming time period.
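The shortfall check described above can be sketched as a small routine. This is an illustrative sketch only; the function and variable names (`check_coverage`, `needed`, `scheduled`, `backup_graders`) are assumptions, not terms from the patent.

```python
# Hypothetical sketch of the backup-grader notification check: compare
# the number of essays needing evaluation in the upcoming period with
# the number already scheduled, and alert backups on a shortfall.
def check_coverage(needed: int, scheduled: int,
                   backup_graders: list[str]) -> list[str]:
    """Return the backup graders to notify for the upcoming period."""
    if needed <= scheduled:
        return []          # coverage is sufficient; no notification
    return backup_graders  # shortfall: notify every backup grader

print(check_coverage(needed=120, scheduled=100, backup_graders=["r1", "r2"]))
```

In a real deployment the returned list would drive an electronic notification (e.g., e-mail) to each backup grader.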

The essay response administrative system prioritizes the presentation and distribution of essay responses to graders so that essay responses are evaluated and feedback is stored and released for retrieval by examinees within a predetermined time period. The essay response administrative system includes software which performs the following operations: identifies the date and time an essay response was submitted; identifies whether the essay response is to be graded on a regular schedule or a rush schedule; calculates a date for which an evaluation of the essay response is due based upon the date the essay response was submitted and whether the essay response is to be evaluated on a regular schedule or a rush schedule; and places the essay response in a queue of essay responses that is ordered by the date on which an evaluation of each response is due.
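The due-date and queueing logic above can be illustrated with a minimal sketch. The five-day and two-day turn-around figures match the regular and rush schedules named in the detailed description; everything else (names, heap-based queue) is an assumed implementation detail.

```python
from datetime import datetime, timedelta
import heapq

# Turn-around times for the two service levels (regular vs. rush).
TURNAROUND_DAYS = {"regular": 5, "rush": 2}

def due_date(submitted: datetime, schedule: str) -> datetime:
    """Evaluation deadline: submission time plus the schedule's turn-around."""
    return submitted + timedelta(days=TURNAROUND_DAYS[schedule])

# Min-heap ordered by due date, so the most urgent essay surfaces first.
queue: list[tuple[datetime, str]] = []

def enqueue(essay_id: str, submitted: datetime, schedule: str) -> None:
    heapq.heappush(queue, (due_date(submitted, schedule), essay_id))

enqueue("essay-a", datetime(1998, 3, 20, 9, 0), "regular")
enqueue("essay-b", datetime(1998, 3, 21, 9, 0), "rush")
print(heapq.heappop(queue)[1])  # → essay-b (the rush essay is due first)
```

A later-submitted rush essay can thus be presented to graders ahead of an earlier regular-schedule essay, which is exactly the prioritization behavior the administrative system provides.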

The essay response administrative system automatically calculates any compensation for the graders based upon the number of constructed responses the grader has evaluated. The essay response administrative system includes software which performs the following operations: identifies a grader; identifies the total number of essay responses read by the grader; and calculates the compensation for the grader as function of the number of essay responses read.
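As a sketch, the compensation calculation might look like the following. A flat rate per essay is an assumption for illustration; the patent states only that pay is a function of the number of essay responses read, and notes elsewhere that the rate could vary between questions.

```python
# Hypothetical per-essay compensation model; the rate is illustrative.
def compensation(essays_read: int, rate_per_essay: float = 2.50) -> float:
    """Compute a grader's pay as a function of essays evaluated."""
    return essays_read * rate_per_essay

print(compensation(40))  # → 100.0
```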

The essay response administrative system includes software which performs the following operations: identifies that a grader has submitted an overall evaluation and score for an essay response; forwards the essay response and overall evaluation to a grader leader for review; and if the overall evaluation and score are not consistent with scoring and evaluation standards, allows the grader leader to assign a new score and evaluation, so long as the grader leader performs the review within a predetermined time period.
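The review-window condition can be sketched as follows. The 24-hour window and all identifiers here are assumptions; the patent says only that the override must occur within a predetermined time period.

```python
from datetime import datetime, timedelta

# Assumed review window; the patent leaves the period unspecified.
REVIEW_WINDOW = timedelta(hours=24)

def apply_override(submitted_at: datetime, reviewed_at: datetime,
                   original_score: int, new_score: int) -> int:
    """Return the score that stands after the grader leader's review."""
    if reviewed_at - submitted_at <= REVIEW_WINDOW:
        return new_score       # review was timely; override takes effect
    return original_score      # window expired; the grader's score stands
```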

The automated grading system presents an essay response to a grader so that the grader may evaluate the essay response; and presents evaluation categories to the grader so that the grader may select an overall evaluation. The automated grading system enables a grader leader to view the overall evaluation and pre-defined additional comments of the grader and assign a new overall evaluation and pre-defined additional comments if desired.

The evaluation delivery system releases to the examinee for the examinee's retrieval at least one exemplary essay response, the overall evaluation, and the pre-defined additional comments. The evaluation delivery system groups the examination question, scoring guidelines, sample essay responses, the overall evaluation, and pre-defined additional comments into a feedback package; stores the feedback package; and releases the feedback package for retrieval by the examinee.

According to another aspect of the invention there is provided an Internet based system for submitting and evaluating essay responses to examination questions. The system comprises the following elements: a server computer which forwards examination questions to examinees and accepts essay responses to the examination questions for evaluation, the server being connected to the Internet; a first computer connected to the server computer so as to permit examinees to set up an account for purchasing access to an examination question and evaluation of an essay response submitted in response to the examination question; and a second computer connected to the server computer so as to permit graders to read an essay response that was submitted by an examinee, evaluate the essay response by providing an overall comment and pre-defined additional comments, store the overall comment and the pre-defined additional comments, and release the overall comment and the pre-defined additional comments for retrieval by the examinee via the server computer.

According to another aspect of the invention there is provided a method for on-line evaluation of constructed responses to an examination question over a system having a server computer connected to the Internet, a first computer electronically connected to the server computer, and a second computer connected to the server computer. The method comprises the following steps: an examinee accessing the server computer from the first computer to set up an account for purchasing access to an examination question and evaluation of an essay response which is submitted in response to the examination question; an examinee accessing the server computer from the first computer to submit an essay response in response to the examination question; accepting the essay responses at the server computer; and a grader accessing the server computer from the second computer to read the essay response submitted by the examinee, evaluate the essay response by providing an overall comment and pre-defined additional comments, storing the overall comment and pre-defined additional comments, and releasing the overall comment and pre-defined additional comments for retrieval by the examinee.

BRIEF DESCRIPTION OF THE DRAWINGS

A full understanding of the invention can be gained from the following description of preferred embodiments when read in conjunction with the accompanying drawings in which:

FIG. 1 provides an operational diagram for the essay evaluation system of the invention;

FIG. 2 depicts the architecture of the system of FIG. 1;

FIG. 3 provides a diagram of the functional components of the system of FIG. 1;

FIG. 4 provides an entity relationship diagram for the system data dictionary;

FIG. 5 represents several of the functional operations provided by the system of FIG. 1;

FIG. 6 provides a flowchart of the essay submission and evaluation process provided by the system of FIG. 1;

FIG. 7 illustrates an introductory system screen through which a student or user may access the system's user functionality;

FIG. 8 illustrates a system screen providing directions for using the system of FIG. 1;

FIG. 9 illustrates a system screen which provides fee information for using the system;

FIG. 10 illustrates a system screen which provides sample feedback information similar to that which would be provided upon submission and evaluation of an essay;

FIG. 11 illustrates a system screen which provides the topics for which a user can prepare and submit essays;

FIG. 12 illustrates a system screen which allows a user to order an access code to be used in submitting essays;

FIG. 13 illustrates a system screen which provides information on writing an essay for evaluation;

FIG. 14 illustrates a system screen which allows a user to submit an essay for evaluation;

FIG. 15 illustrates a system screen which allows a user to access feedback regarding an essay;

FIG. 16 illustrates a system screen which allows a user to access responses to frequently asked questions;

FIG. 17 illustrates a system screen which allows a user to access information on the operation of the system;

FIG. 18 provides a flow chart of the essay monitoring and reader scheduling process;

FIG. 19 provides a flow chart of the work load leveling process of the system of FIG. 1;

FIG. 20 provides a flow chart of the process for calculating reader compensation in the system of FIG. 1;

FIG. 21 provides a flow chart of the feedback process of the system of FIG. 1;

FIG. 22 illustrates a system screen which a reader may encounter when logging into the system for purposes of scoring essays;

FIG. 23 illustrates a system screen that may be presented to a reader who selects to view a system training pack;

FIG. 24 illustrates a system screen that may be presented to a reader who selects to view the question to which the user's essay is a response;

FIG. 25 illustrates a system screen that may be presented to a reader who selects to view the scoring guidelines;

FIG. 26 illustrates a system screen that may be presented to a reader who has selected to review an essay response that has a particular score associated with it;

FIG. 27 illustrates a system screen that may be presented to a reader who has selected to evaluate an essay; and

FIG. 28 illustrates a system screen that may be presented to a reader who has selected to review/assign pre-defined rubrics to an essay response.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

An on-line essay evaluation system with the above-mentioned beneficial features in accordance with a presently preferred exemplary embodiment of the invention will be described below with reference to FIGS. 1 through 28. It will be appreciated by those of ordinary skill in the art that the description given herein with respect to those figures is for exemplary purposes only and is not intended in any way to limit the scope of the invention. All questions regarding the scope of the invention may be resolved by referring to the appended claims.

The present system offers students the opportunity to write practice essays, submit the essays to trained, expert readers, and retrieve an evaluation at the student's convenience. The system provides the capability for a user or test taker to submit essays at any time during the year, independent of the timing of an actual testing event. Further, the system provides the capability to prioritize essays and schedule readers so that essays can be evaluated on a rush basis. The essays are evaluated in a manner that provides useful instructional feedback to students about their skills relative to the assessment or test that the student wishes to take.

The present system provides an excellent learning opportunity and is unique in that the essay evaluations may be used for instructional purposes rather than strictly for evaluative purposes. Since the essay scoring and feedback are provided by skilled readers, the evaluation comes from an unbiased source and can prove to be extremely beneficial in helping students develop writing skills and demonstrate content mastery. Further, the system allows for submission of essay responses continuously throughout the year so that a student can improve and prepare for a given test administration. The student receives detailed feedback information which develops the student's knowledge of the subject matter and provides the student with instructional guidance on how to improve his or her writing skills. The student can submit essays in response to the same question several times and obtain feedback each time an essay is submitted. In this way, a user is able to measure improvement. Students can select among the various topics to explore writing techniques applicable to each topic as well as develop their content knowledge in particular topic areas.

Essays are selected for presentation to a reader based on the essay topic and are presented in order by date and time received so that the first essays received are read first. Readers continue to process essays one after the other in the order the system presents the essays to them; readers do not select specific essays to read. Readers can opt to change to other topics for which they are qualified to read, thereby releasing any essays queued for them back into the pool to be presented to other readers.

Readers schedule themselves to perform evaluations on particular reading days. In the preferred embodiment, Monday, Tuesday, Wednesday, Thursday, and Friday are separate reading days for which a reader may schedule himself or herself. The weekend, Saturday and Sunday, is categorized as a single reading day for which a reader may schedule himself or herself.

Typically, a primary reader schedules himself or herself to evaluate essay responses to a specific question on a specific reading day. The primary reader assumes responsibility for evaluating a certain number of essays, if the essays are available, on selected days. A reader may also schedule himself or herself as a backup reader, whereby he or she will be available on a particular reading day to read essays if the primary readers are unable to grade all of the essays that are scheduled for reading. Further, selected readers are assigned to be scoring leaders; these highly expert leaders can monitor other readers or themselves serve as a primary reader. A scoring leader monitors the scores assigned by primary and backup readers for a specific question on a specific reading day.

An operational diagram for a preferred embodiment of the inventive system is presented in FIG. 1. As shown, a student or user can connect to the on-line essay evaluation system via personal computer 110 or a similar device such as a network computer. A student may connect using either personal account 112 with a credit card or under school account 114. When a user logs into the system, he or she is connected to system server 116. When a user pays by credit card, the credit card information is updated and verified using exterior credit card authorization system 118, accessed via system server 116 and database server 128. When an account is created, it must be specified whether essays submitted under that account are to be evaluated on a regular five day schedule or a two day rush schedule; the user or student identifies the desired type of service, i.e. either standard 5 day turnaround or rush 2 day turnaround, when ordering the service. Students select an essay question, fashion a response, and enter the response into the system for evaluation.

Primary readers and backup readers or evaluators also connect to system server 116 via personal computer 119 or similar device. Readers rate the essays submitted by students and provide feedback as explained below.

Scoring leader readers connect to system server 116 via personal computer 120 or equivalent device. Scoring leaders perform normal reader functions and may also check reader work and override a reader where appropriate. Staff of the scoring system service provider can connect to the system via personal computer 121 or similar device to perform system maintenance. Various activities performed by administrative staff such as generating reports 122, exporting data 124, and processing purchase orders 126 require interaction with system database server 128.

A preferred system architecture for an Internet based essay evaluation system is depicted in FIG. 2. As shown, a user may connect to the evaluation system over the Internet or through another network using web browser 140. The server side of the system comprises a web server 142 such as Netscape's Enterprise Server and Oracle database 144 running on a multitasking operating system 146 such as UNIX or Microsoft NT. Firewall 148 protects system resources such as legacy databases 150 and database administrative facilities 152 of the evaluation provider from unauthorized tampering.

FIG. 3 provides a diagram of the functional components of a preferred embodiment of the system. As shown, the system can be broken into several modules: administration module 160; subscription module 162; student and reader interface module 164; and reporting module 166. The administration module or essay response administrative system 160 comprises several subsystems: student management subsystem 168; reader management subsystem 170; essay management subsystem 172; and work distribution subsystem 174. Subscription module 162 comprises several subsystems: purchase order subsystem 176; credit card processing subsystem 178; and subscription management subsystem 180. Similarly, reporting module 166 comprises several subsystems: e-mail notification subsystem 182; and extract subsystem 184 which allows for sales, reader performance evaluation, and submission summaries. Finally, student and reader interface module 164 comprises several subsystems: submission subsystem or essay response submission system 186; user authentication subsystem 188; essay evaluation subsystem or automated grading system 190; and essay result viewing subsystem or evaluation delivery system 192.

In the preferred embodiment, the system is implemented with a relational database, although an object oriented database or flat file implementation could be used. FIG. 4 provides a portion of the entity relationship diagram for the database data dictionary. As shown, the database of the present system comprises table 193, in which is stored information regarding essays submitted for scoring. Data related to the question, and to the exam for which a particular question is posed, are stored in tables 194 and 195. Information about the students who submit the essays, as well as the readers who evaluate the essays, is stored in table 196. Likewise, the rubrics and overall evaluation scores are stored in tables 197 and 198. The system provides storage for additional information such as reader scheduling and credit card information (not shown).
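
A relational schema along the lines suggested by FIG. 4 might be sketched as below. All table and column names are illustrative assumptions; the patent does not disclose the actual data dictionary.

```python
import sqlite3

# Hypothetical sketch of the entity relationships of FIG. 4: essays reference
# a question, questions reference an exam, scores link an essay to a reader,
# and rubric bullets are pre-stored per question.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE exam     (exam_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE question (question_id INTEGER PRIMARY KEY,
                       exam_id INTEGER REFERENCES exam(exam_id),
                       topic TEXT, text TEXT);
CREATE TABLE person   (person_id INTEGER PRIMARY KEY,
                       role TEXT,   -- 'student' or 'reader'
                       name TEXT);
CREATE TABLE essay    (essay_id INTEGER PRIMARY KEY,
                       question_id INTEGER REFERENCES question(question_id),
                       student_id INTEGER REFERENCES person(person_id),
                       submitted_at TEXT, due_at TEXT, body TEXT);
CREATE TABLE rubric   (rubric_id INTEGER PRIMARY KEY,
                       question_id INTEGER REFERENCES question(question_id),
                       bullet TEXT);
CREATE TABLE score    (score_id INTEGER PRIMARY KEY,
                       essay_id INTEGER REFERENCES essay(essay_id),
                       reader_id INTEGER REFERENCES person(person_id),
                       overall INTEGER, comment TEXT);
""")
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(len(tables))  # six tables in this sketch
```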

FIG. 5 depicts some of the functional operations provided by the system. As shown, the system provides means 200 for a user or organization such as a school to set up accounts for submitting essays. Times and dates for which readers are scheduled to act as primary readers, backup readers, and reader leaders can also be entered into and retrieved from the system by means 202. Of course, the system provides means 204 to submit essays for evaluation. Readers use means 206 to evaluate essays. Means 208 to compensate readers for their efforts is also provided. As shown, these operations involve updating and accessing information which is stored on system database server 128.

FIG. 6 provides an overview of the essay submission and evaluation process. At step 210, the student/user purchases an access code. Access codes are purchased for either rush or regular grading. Thus, the access code identifies an essay as one to be scored either on a regular 5 day turnaround or a rush 2 day turnaround. At step 212, the student submits an essay. A primary reader who has signed up for the particular reading day logs into the system and is presented with a customized list of topics and questions which the reader is qualified to score. An additional icon appears if the reader is also qualified to serve as a scoring leader for the particular question. At step 214, a reader retrieves an essay from the question pool. The reader has two hours in which to read and score the essay. At step 216, if the reader does not submit an evaluation of the essay within two hours, the essay is returned to the pool at step 218 for another reader to evaluate. If at step 216 the reader completes evaluating the essay, the reader can continue scoring essays or can exit the system. After an essay is evaluated, at step 220 a scoring leader has a set time frame in which he or she can override the primary or backup reader's assigned score. The time frame for overriding a score is configurable and is preferably set to 15 minutes. The review function may also be used to provide a critique of a reader. After an essay has been evaluated and the time frame for the scoring leader override has expired, a student may retrieve the evaluation and feedback information at step 222. In the preferred embodiment, the student or user may retrieve his or her evaluation only if the account information indicates the bill has been paid.
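
The two-hour checkout rule of steps 214 through 218 can be sketched as follows. Essay names, the container choices, and the helper functions are illustrative assumptions; only the two-hour limit and the return-to-pool behavior come from the description above.

```python
from datetime import datetime, timedelta

LIMIT = timedelta(hours=2)             # reader's window to read and score

pool = ["essay-1", "essay-2"]          # essays awaiting evaluation
checked_out = {}                       # essay -> time it was handed to a reader

def assign(essay, now):
    # hand one essay to a reader and start its two-hour clock
    pool.remove(essay)
    checked_out[essay] = now

def reclaim_expired(now):
    # any essay not scored within two hours goes back to the pool
    for essay, t in list(checked_out.items()):
        if now - t > LIMIT:
            del checked_out[essay]
            pool.append(essay)

t0 = datetime(1998, 3, 20, 9, 0)
assign("essay-1", t0)
reclaim_expired(t0 + timedelta(hours=2, minutes=1))
print(pool)  # essay-1 is back in the pool for another reader
```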

FIGS. 7 through 17 provide a view of the various screens that a user may encounter while using the system to submit essays. FIG. 7 illustrates an introductory screen through which a user may access the system functionality. As shown, general information regarding the system is provided. The upper left hand side of the screen provides linked lines of text, 230 through 239, which a user may "click on" with a mouse button to access various components of the user interface.

A user can access directions for using the user interface of the system by selecting text line 230. When a user selects line 230, the user is presented with a screen similar to that shown in FIG. 8. A user can access information regarding the fees for using of the system by selecting text line 231. When a user selects line 231, the user is presented with a screen similar to that shown in FIG. 9. A user can sample the type of feedback information that is presented by the system when an essay is submitted by selecting text line 232. When a user selects line 232, the user is presented with a screen similar to that shown in FIG. 10. A user can preview the topics for which he or she can write and submit an essay by selecting text line 233. When a user selects line 233, the user is presented with a screen similar to that shown in FIG. 11. A user can order an access code under which an essay may be submitted by selecting text line 234. When a user selects line 234, the user is presented with a screen similar to that shown in FIG. 12. A user can access information regarding writing an essay for evaluation by selecting text line 235. When a user selects line 235, the user is presented with a screen similar to that shown in FIG. 13. A user can submit an essay for evaluation by selecting text line 236. When a user selects line 236, the user is presented with a screen similar to that shown in FIG. 14. A user can access feedback regarding an essay that he or she previously submitted by selecting text line 237. When a user selects line 237, the user is presented with a screen similar to that shown in FIG. 15. A user can access responses to frequently asked questions about the system by selecting text line 238. When a user selects line 238, the user is presented with a screen similar to that shown in FIG. 16. A user can access help on the system by selecting text line 239. When a user selects line 239, the user is presented with a screen similar to that shown in FIG. 17.

In the preferred embodiment, the on-line essay evaluation system may be accessed over the Internet. Using the above described screens, all of which may be accessed over the Internet with a world wide web browser, a user or student is able to order an access code under which he or she may submit an essay or constructed response. Using an access code, a user may access system functionality for submitting essays or constructed responses. After the essay has been evaluated, a user may enter the system to retrieve an evaluation of the essay that he or she submitted. All of this system functionality can be accessed over the world wide web. Thus, the preferred embodiment is an Internet based on-line essay evaluation system which provides functionality to read an essay question, construct an essay response to the question, submit the essay for evaluation, and retrieve feedback information from experienced, trained readers.

As noted above, the essay evaluation system provides for reader scheduling and prioritization of essay responses so that all essay responses are evaluated within the requested time period. In the preferred embodiment, essays are positioned in the queue by priority of date and time due. Thus, a rush essay that is due before a standard 5 day order essay is positioned ahead of it in the queue of essays to be read. The system automatically monitors the number of essays waiting to be read so that all essays are evaluated within the scheduled 48 hour or 5 day turnaround. When the system detects that essays are not being processed fast enough to meet the turnaround commitment, e-mails are automatically sent to the backup readers and to administrative staff. Alternatively, or in addition, a manual process is initiated to call backup readers and to schedule additional readers in order to process the backlog.
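
The deadline-priority queuing described above can be sketched with a standard priority queue. The due time is the submission time plus the purchased turnaround, and the earliest deadline is read first; all names and times here are illustrative assumptions.

```python
import heapq
from datetime import datetime, timedelta

def due_time(submitted, rush):
    # rush orders are due in 2 days, standard orders in 5 days
    return submitted + (timedelta(days=2) if rush else timedelta(days=5))

queue = []
t = datetime(1998, 3, 20, 9, 0)
heapq.heappush(queue, (due_time(t, rush=False), "standard-essay"))
heapq.heappush(queue, (due_time(t + timedelta(hours=1), rush=True), "rush-essay"))

# The rush essay is due sooner, so it comes off the queue ahead of the
# standard essay even though it was submitted an hour later.
print(heapq.heappop(queue)[1])
```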

A flow chart illustrating the essay monitoring and reader scheduling process is depicted in FIG. 18. At step 250, the number of essays that need to be scored in the upcoming 24 hours in order to satisfy all scoring deadlines is calculated. At step 252, the system searches the database to identify the readers, and the total number of each of the various question responses the readers are scheduled to read, during the same 24 hours. If at step 254 the number of essays that need to be scored in the upcoming 24 hours is less than the number of essays that are scheduled to be read by scheduled primary readers in the same period, the system returns to step 250 of determining the number of essays to be scored. If, however, at step 254 the number of essays that need to be scored in the coming 24 hours is greater than the number of essays that are scheduled to be read by scheduled readers, the system proceeds to step 256, where backup readers and appropriate administrative personnel are notified of the possible shortfall. In the preferred embodiment, the readers and operators are notified by e-mail. It is then incumbent upon the backup readers and the administrative personnel to ensure that the appropriate essays are read so as to meet all time constraints.
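
The comparison at the heart of FIG. 18 can be sketched as below. The per-question counts and reader names are illustrative assumptions, and the e-mail notification step is reduced to returning the list of backup readers to contact.

```python
# Essays that must be scored in the next 24 hours, per question.
due_in_24h = {"question-1": 30, "question-2": 12}
# Capacity of the primary readers scheduled for the same 24 hours.
scheduled  = {"question-1": 25, "question-2": 20}
# Backup readers on call for each question.
backups    = {"question-1": ["reader-7"], "question-2": ["reader-9"]}

def notify_list():
    # collect the backups for every question whose scheduled capacity
    # falls short of the deadline load
    to_notify = []
    for q, needed in due_in_24h.items():
        if needed > scheduled.get(q, 0):
            to_notify.extend(backups.get(q, []))
    return to_notify

print(notify_list())  # only question-1 has a shortfall here
```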

To ensure smooth operation, the system also automatically performs workload leveling. Workload leveling guards against the situation where the first reader who logs onto the system "ties up" all the essays on a particular topic. Instead, the present system provides essay responses to a reader one at a time, as each essay is requested by the reader. When the queue of unread essays on a given topic has been exhausted, a message is displayed indicating that there are no more essays on that topic. The reader then has the option of logging off or, if he or she is qualified to do so, switching to another topic.

FIG. 19 depicts the workload leveling process of the present invention. As shown, at step 270, the reader selects a topic area in which to evaluate essays. If at step 272 it is determined that essays for the particular topic and question are available to be scored, the next essay in the queue of essays to be scored is assigned to the particular reader at step 282. If, however, at step 272, it is determined that no essays on the particular topic need to be scored, the system at step 274 notifies the reader of such and determines which other topics the reader is authorized to score. If at step 276 it is determined that other topics which the reader is authorized to score have essays waiting to be read, the reader is allowed to select from the additional topics and essays. However, if at step 276 it is determined that there are no other essays to be read for which the reader is qualified, the system at step 280 notifies the reader of such.
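
The workload-leveling flow of FIG. 19 can be sketched as follows. Topic names and queues are illustrative assumptions: a reader receives one essay at a time, and when the chosen topic is empty the function instead lists the reader's other qualified topics that still have work waiting.

```python
# One queue of unread essays per topic (illustrative data).
queues = {"biology": ["essay-A"], "history": []}

def next_for(reader_topics, chosen):
    # essays are handed out one at a time, never as a batch
    if queues.get(chosen):
        return queues[chosen].pop(0)
    # nothing left on this topic: offer other qualified topics with work waiting
    return [t for t in reader_topics if t != chosen and queues.get(t)]

# The history queue is empty, so a history/biology reader is offered biology.
print(next_for(["history", "biology"], "history"))
```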

The system also provides a novel method of compensating readers, whereby readers in the present system are compensated for each essay that is read. This is in contrast to the traditional reader compensation model, whereby readers are paid by the scheduled reading event without regard to the actual number of essays read. A scheduled reading event would be, for example, "the AP reading," which would last for several days each June; in prior systems a participating reader would be compensated for the reading regardless of the number of essays he or she actually read. The compensation method of the present system correlates compensation to work actually performed. Because credit card rejections are rare, and to meet time commitments to end users, readers are paid regardless of whether the purchase is approved by the credit approval system. The system also provides a means to vary the amount paid for evaluating responses for different examination types, or even for different questions in the same examination. Quality is ensured by scoring leaders who monitor evaluations and through periodic statistical analysis of reader performance data.

FIG. 20 depicts a view of a process for calculating reader compensation. The first step in the process, step 310, involves identifying the particular reader. At step 312 the total number of essays evaluated by the particular reader is accessed, usually in a database where such information is stored. At step 314 the compensation for the particular reader is calculated as a function of the number of essays identified in step 312. In a simple arrangement, the compensation is equal to the number of essays evaluated multiplied by a constant rate. In more complicated embodiments, the rate of compensation may vary by question.
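
The per-essay calculation of FIG. 20 reduces to a sum over the essays each reader evaluated. The rates and counts below are illustrative assumptions; the description notes only that the rate may vary by question or examination type.

```python
# Dollars paid per essay read, varying by question (illustrative rates).
rates = {"question-1": 4.50, "question-2": 6.00}

def compensation(essays_read):
    # essays_read maps question id -> number of essays this reader evaluated;
    # pay is the per-question rate times the count, summed over questions
    return sum(rates[q] * n for q, n in essays_read.items())

print(compensation({"question-1": 10, "question-2": 2}))  # 45.0 + 12.0 = 57.0
```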

Another novel aspect of the present system is the scoring and feedback functionality. In the present system, a reader selects an overall evaluation comment and an overall score for each essay response. Thereafter, the reader may select various rubrics or feedback bullets which have been pre-stored in the system for the particular question and thereby provide more specific feedback comments regarding the essay. A reader may only choose from the bullets which are provided by the system. In this way the system ensures that consistent, standardized feedback is provided to users. Readers may choose as many feedback bullets as are applicable to the particular essay. The overall evaluation and the particular rubrics that may be selected for a particular evaluation are controlled and may be changed using system functionality. In addition to an overall evaluation and the rubrics, a user receives a copy of the scoring standards and guidelines for the particular question as well as several exemplar essay responses that have received various scores. In the preferred embodiment, the user does not receive the overall score. The overall score is kept in the system for monitoring and reader evaluation.

FIG. 21 provides a flow chart of the feedback process of the invention. At step 320, the reader is provided with a list of essays to score. The reader selects and evaluates an essay. The reader can reference various information regarding scoring the particular essay, such as guidelines for scoring the essay and sample responses with the scores that were assigned to each. At step 322, the reader is provided with a list of overall scores. The reader selects an overall evaluation comment and an overall score for the essay. At step 324, the reader may select from a predefined group of feedback rubrics or bullets. The rubrics associated with a high overall score tend to be positive and point out the positive aspects of the user's essay, while the rubrics associated with a low overall score tend to point out the weaknesses in the user's essay and suggest methods of improving. At step 326, the information that will be made available to the user is linked together for convenient access. The feedback information comprises the following items: the original question; the user's response; the scoring standards or guidelines for the particular question; sample essays that have received various scores; an overall evaluation; and a set of rubrics or feedback bullets. The score assigned by the reader is not provided to the user but is stored for use in research and monitoring readers. The final step, step 328, involves providing the feedback information to the user. In the preferred embodiment, the information is accessed by the user when the user logs onto the testing web site. Alternatively, the information could be transmitted to the user via e-mail or some other means.
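
The packaging step at the end of the feedback process can be sketched as below. Field names are illustrative assumptions; the substantive points from the description are that the bundle links the question, response, guidelines, samples, comment, and bullets together, and that the numeric score is deliberately withheld from the student.

```python
def package_feedback(question, response, guidelines, samples, evaluation):
    # assemble everything the student will see; the reader's numeric score
    # is retained elsewhere for monitoring and research, never delivered
    return {
        "question": question,
        "response": response,
        "scoring_guidelines": guidelines,
        "sample_essays": samples,
        "overall_comment": evaluation["comment"],
        "rubric_bullets": evaluation["bullets"],
    }

fb = package_feedback("Q", "my essay", "guide", ["sample"],
                      {"comment": "Strong thesis", "bullets": ["b1"], "score": 5})
print("score" in fb)  # the score never reaches the student's bundle
```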

FIG. 22 provides an illustrative screen that a reader may encounter when logging into the system for purposes of scoring essays. As shown, the reader is presented with a group of topics from which they may select responses for grading. It should be noted that the reader will be able to access only those topics and related responses for which the reader has been approved. For each question in each topic area, the reader may select to view the training pack, grade the essay, or perform scoring leader functions if qualified.

If the reader selects to view the training pack, the reader will be presented with a screen such as that shown in FIG. 23. At the top of the screen the reader has available several items which may assist in his or her grading of the particular question. The reader may select to review the question that was actually posed to the user by selecting text item 400. The reader might also review the scoring guide by selecting text item 402. Alternatively, the reader may select to review sample essay responses that have been assigned particular scores by clicking on text item 404.

If the reader selects to view the question that was posed to the user, the reader may be presented with a screen similar to that shown at FIG. 24. If the reader selects to view the scoring guideline, the reader may be presented with a screen similar to that shown in FIG. 25. This screen presents information that the reader may find helpful in grading essays. If the reader selects to view a sample essay which has a particular score associated with it, the reader may be presented with a screen similar to that shown in FIG. 26. By reviewing essays that may have been assigned various scores, the reader is able to calibrate him or herself for scoring recently submitted essays.

At the screen shown at FIG. 22, the reader may also select to score an essay. If the reader selects to score the essay, the system may present a screen similar to that shown in FIG. 27. At the top of the screen is displayed the possible scores that may be assigned. The lower portion of the screen displays the essay that is being scored. As shown at the top of the screen, once the reader has assigned a score and rubrics, the reader may submit the score and quit grading or submit the score and continue grading. The reader may also clear any scores that he or she has assigned by pressing the reset button. If the reader wishes, he or she may also select to see a description of the rubrics that correspond to the various overall scores. The screen on which the description of the rubrics is provided may appear similar to that shown in FIG. 28.

The reader functionality described above, like the user functions of the on-line essay evaluation system, may be accessed over the Internet. Using the above described screens, all of which may be accessed over the Internet with an Internet browser, a primary reader, backup reader, or scoring leader can access essays or constructed responses that have been submitted by a user. The reader may proceed to evaluate the essay using the evaluation guidelines and other aids that are described above and provided on-line. A reader may then submit an evaluation of an essay. Thus, the preferred embodiment of the on-line essay evaluation system provides Internet accessible functionality for retrieving an essay response, reviewing information helpful in evaluating the response, assigning an overall evaluation and rubrics to the essay, and submitting an essay evaluation into the system for retrieval by the user or student.

Software for performing the above described processing and presenting the above described screens resides primarily on system server 116 and database server 128. The on-line essay evaluation system, however, employs client-server technology. Thus, portions of the system software execute on the personal computers from which users and readers access the system.

Although exemplary embodiments of the invention have been described in detail above, those skilled in the art will readily appreciate that many additional modifications are possible in the exemplary embodiment without materially departing from the novel teachings and advantages of the invention. For example, those skilled in the art will appreciate that the screens employed may vary from those described above. Further, the rate of compensation for readers may vary between questions and test types. Accordingly, all such modifications are intended to be included within the scope of this invention as defined in the following claims.

Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US4671772 * | Oct 22, 1985 | Jun 9, 1987 | Keilty, Goldsmith & Boone | Performance appraisal and training system and method of utilizing same
US4768087 * | Oct 7, 1983 | Aug 30, 1988 | National Information Utilities Corporation | Education utility
US4867685 * | Sep 24, 1987 | Sep 19, 1989 | The Trustees Of The College Of Aeronautics | Audio visual instructional system
US4895518 * | Nov 2, 1987 | Jan 23, 1990 | The University Of Michigan | Computerized diagnostic reasoning evaluation system
US4953209 * | Oct 31, 1988 | Aug 28, 1990 | International Business Machines Corp. | Self-verifying receipt and acceptance system for electronically delivered data objects
US4958284 * | Dec 6, 1988 | Sep 18, 1990 | Npd Group, Inc. | Open ended question analysis system and method
US4978305 * | Jun 6, 1989 | Dec 18, 1990 | Educational Testing Service | Free response test grading method
US5002491 * | Apr 28, 1989 | Mar 26, 1991 | Comtek | Electronic classroom system enabling interactive self-paced learning
US5011413 * | Jul 19, 1989 | Apr 30, 1991 | Educational Testing Service | Machine-interpretable figural response testing
US5033969 * | Mar 13, 1990 | Jul 23, 1991 | Pioneer Electronic Corporation | Support device for resolving questions about reproduced information
US5059127 * | Oct 26, 1989 | Oct 22, 1991 | Educational Testing Service | Computerized mastery testing system, a computer administered variable length sequential testing system for making pass/fail decisions
US5170362 * | Jan 15, 1991 | Dec 8, 1992 | Atlantic Richfield Company | Redundant system for interactively evaluating the capabilities of multiple test subjects to perform a task utilizing a computerized test system
US5176520 * | Apr 17, 1990 | Jan 5, 1993 | Hamilton Eric R | Computer assisted instructional delivery system and method
US5180309 * | Dec 4, 1990 | Jan 19, 1993 | United States Of America As Represented By The Secretary Of The Navy | Automated answer evaluation and scoring system and method
US5195033 * | Jun 8, 1990 | Mar 16, 1993 | Assessment Systems, Inc. | Testing system including removable storage means for transfer of test related data and means for issuing a certification upon successful completion of the test
US5204813 * | Jun 8, 1990 | Apr 20, 1993 | Assessment Systems, Inc. | Computer-controlled testing process and device for administering an examination
US5211563 * | Jun 25, 1992 | May 18, 1993 | Hitachi, Ltd. | Computer assisted learning support system and processing method therefor
US5211564 * | Apr 25, 1991 | May 18, 1993 | Educational Testing Service | Computerized figural response testing system and method
US5247670 * | Mar 9, 1989 | Sep 21, 1993 | Fuji Xerox Co., Ltd. | Network server
US5287519 * | Sep 17, 1992 | Feb 15, 1994 | International Business Machines Corp. | LAN station personal computer system with controlled data access for normal and unauthorized users and method
US5309564 * | Mar 19, 1992 | May 3, 1994 | Bradley Graham C | Apparatus for networking computers for multimedia applications
US5318450 * | Nov 22, 1989 | Jun 7, 1994 | Gte California Incorporated | Multimedia distribution system for instructional materials
US5321611 * | Feb 5, 1993 | Jun 14, 1994 | National Computer Systems, Inc. | System for increasing speed at which answers are processed
US5432932 * | Oct 23, 1992 | Jul 11, 1995 | International Business Machines Corporation | System and method for dynamically controlling remote processes from a performance monitor
US5433615 * | Feb 5, 1993 | Jul 18, 1995 | National Computer Systems, Inc. | Categorized test item reporting system
US5437554 * | Feb 5, 1993 | Aug 1, 1995 | National Computer Systems, Inc. | System for providing performance feedback to test resolvers
US5442759 * | Dec 9, 1993 | Aug 15, 1995 | International Business Machines Corporation | Interactive online tutorial system with user assistance function for software products
US5466159 * | Aug 12, 1994 | Nov 14, 1995 | National Computer Systems, Inc. | Collaborative and quality control scoring system
US5509127 * | Dec 4, 1992 | Apr 16, 1996 | Unisys Corporation | In a central processing module
US5513994 * | Apr 3, 1995 | May 7, 1996 | Educational Testing Service | Centralized system and method for administering computer based tests
US5524193 * | Sep 2, 1994 | Jun 4, 1996 | And Communications | Interactive multimedia annotation method and apparatus
US5558521 * | Aug 12, 1994 | Sep 24, 1996 | National Computer Systems, Inc. | System for preventing bias in test answer scoring
US5565316 * | Jun 22, 1993 | Oct 15, 1996 | Educational Testing Service | System and method for computer based testing
US5672060 * | Nov 28, 1994 | Sep 30, 1997 | Meadowbrook Industries, Ltd. | Apparatus and method for scoring nonobjective assessment materials through the application and use of captured images
US5772446 * | Sep 19, 1995 | Jun 30, 1998 | Rosen; Leonard J. | Apparatus for providing information relating to a text file
US5782642 * | Dec 19, 1995 | Jul 21, 1998 | Goren; Michael | Interactive video and audio display system network interactive monitor module interface
US5788504 * | Oct 16, 1995 | Aug 4, 1998 | Brookhaven Science Associates Llc | Computerized training management system
US5829983 * | May 2, 1995 | Nov 3, 1998 | Fujitsu Limited | System for carrying out educational management
WO1995015654A1 * | Nov 16, 1994 | Jun 8, 1995 | Zing Systems L P | Transaction based interactive television system
Non-Patent Citations
1. "The Integrated Instructional Systems Report", Feb. 1990, EPIE Institute.
2. "The MicroCAT Testing System", 1992 Computerized Testing Products Catalog, Assessment Systems Corporation, 1992.
3. Abdel-fattah, "Comparison of the Chi-Square Procedure with the Symmetric Regression Procedure for Developing a Common Metric in Item Response Theory", 1992, Abstract.
4. Adams et al., "Quest: The Interactive Test Analysis System", 1993, Abstract.
5. Anthony DePalma, "Standardized College Exam Is Customized by Computers", The New York Times, Front Page Story, Mar. 21, 1992.
6. Burstein et al., "GE FRST Evaluation Report: How Well Does a Statistically-Based Natural Language Processing System Score Natural Language Constructed-Responses?", 1995, Abstract.
7. de la Torre et al., "The Development and Evaluation of a Computerized Adaptive Testing System", 1991, Abstract.
8. Elliot Solway, "Quick, Where Do the Computers Go; Computers in Education", Communications of the ACM, Association for Computing Machinery, Feb. 1991, vol. 34, No. 2, p. 29.
9. ETS/ACCESS Summer 1992 Special Edition Newsletter.
10. G. Gage Kingsbury, "Adapting Adaptive Testing: Using the MicroCAT Testing System in a Local School District", Educational Measurement: Issues and Practice, Summer 1990, pp. 3-6 and 29.
11. Gearhart et al., "Writing Portfolios at the Elementary Level: A Study of Methods for Writing Assessment", 1992, Abstract.
12. Halpin et al., "So You Have Chosen an Unequal Cell Size ANOVA Option--Do You Really Know What You Have?", 1991, Abstract.
13. Henning et al., "Automated Assembly of Pre-equated Language Proficiency Tests", Language Testing, 1994, 11, Abstract.
14. Kaplan et al., "Using the Free-Response Scoring Tool to Automatically Score the Formulating-Hypotheses Item", 1994, Abstract.
15. Results of literature search re: new products and educational/psychological academic literature performed by Educational Testing Service on Jul. 29, 1992 using various commercial databases.
16. Sebrechts et al., "Agreement between Expert System and Human Raters' Scores on Complex Constructed-Response Quantitative Items", 1991, Abstract.
17. Smittle, "Computerized Adaptive Testing: Revolutionizing Academic Assessment", Community College J., 1994, 65, Abstract.
18. Solomon et al., "A Pilot Study of the Relationship between Experts' Ratings and Scores Generated by the NBME's Computer-Based Examination System", Acad. Med., 1992, 67, Abstract.
19. Stowitschek et al., "A Computerized Risk Index Screening Program for At-Risk Students", 1990, Abstract.
20. Tse-chi Hsu and Shula F. Sadock, "Computer-Assisted Test Construction: A State of the Art", TME Report 88, Educational Testing Service, Nov. 1985.
21. Wang, "An Analytical Approach to Generating Norms for Skewed Normative Distributions", 1992, Abstract.
22. Wayne Patience, "Software Review-MicroCAT Testing System Version 3.0", Journal of Educational Measurement, Spring 1990, vol. 27, No. 1, pp. 82-88.
23. Weiss et al., "User's Manual for the Basic Math Mastery Tests", 1992, Abstract.
44 *Weiss et al., User s Manual for the Basic Math Mastery Tests , 1992, Abstract.
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US6144838 * | Dec 18, 1998 | Nov 7, 2000 | Educational Testing Services | Tree-based approach to proficiency scaling and diagnostic assessment
US6181909 * | Jul 22, 1998 | Jan 30, 2001 | Educational Testing Service | System and method for computer-based automatic essay scoring
US6254395 * | Apr 12, 1999 | Jul 3, 2001 | Educational Testing Service | System and method for automated testing of writing skill
US6256399 | Sep 27, 1999 | Jul 3, 2001 | Ncs Pearson, Inc. | Method of distribution of digitized materials and control of scoring for open-ended assessments
US6461166 * | Oct 17, 2000 | Oct 8, 2002 | Dennis Ray Berman | Learning system with learner-constructed response based testing methodology
US6466683 | May 21, 2001 | Oct 15, 2002 | Ncs Pearson, Inc. | System and method of distribution of digitized materials and control of scoring for open-ended assessments
US6471521 * | Jul 26, 1999 | Oct 29, 2002 | Athenium, L.L.C. | System for implementing collaborative training and online learning over a computer network and related techniques
US6544042 | Apr 13, 2001 | Apr 8, 2003 | Learning Express, Llc | Computerized practice test and cross-sell system
US6558166 | Sep 12, 2000 | May 6, 2003 | Ncs Pearson, Inc. | Multiple data item scoring system and method
US6615020 * | Mar 23, 2001 | Sep 2, 2003 | David A. Richter | Computer-based instructional system with student verification feature
US6625617 * | Dec 8, 2000 | Sep 23, 2003 | Timeline, Inc. | Modularized data retrieval method and apparatus with multiple source capability
US6631382 * | Jul 31, 2000 | Oct 7, 2003 | Timeline, Inc. | Data retrieval method and apparatus with multiple source capability
US6675133 | Mar 5, 2001 | Jan 6, 2004 | Ncs Pearsons, Inc. | Pre-data-collection applications test processing system
US6749435 | Apr 29, 2003 | Jun 15, 2004 | Ncs Pearson, Inc. | Collaborative and quality control scoring system and method
US6751351 | Mar 5, 2001 | Jun 15, 2004 | Nsc Pearson, Inc. | Test question response verification system
US6755662 * | Nov 6, 2002 | Jun 29, 2004 | Fujitsu Limited | Method for presenting most suitable question and apparatus for presenting most suitable question
US6768894 * | Dec 10, 2002 | Jul 27, 2004 | Harcourt Assessment, Inc. | Computerized system and method for teaching and assessing the holistic scoring of open-ended questions
US6773266 | Jun 18, 2002 | Aug 10, 2004 | Athenium, L.L.C. | Method for implementing collaborative training and online learning over a computer network and related techniques
US6810232 | Mar 5, 2001 | Oct 26, 2004 | Ncs Pearson, Inc. | Test processing workflow tracking system
US6859211 * | Sep 13, 2001 | Feb 22, 2005 | Terry H. Friedlander | System and method for generating an online interactive story
US6859523 | Nov 14, 2001 | Feb 22, 2005 | Qgenisys, Inc. | Universal task management system, method and product for automatically managing remote workers, including assessing the work product and workers
US6913466 * | Aug 21, 2001 | Jul 5, 2005 | Microsoft Corporation | System and methods for training a trainee to classify fundamental properties of media entities
US6918772 | Oct 20, 2003 | Jul 19, 2005 | Ncs Pearson, Inc. | Categorized data item reporting system and method
US6938048 | Nov 14, 2001 | Aug 30, 2005 | Qgenisys, Inc. | Universal task management system, method and product for automatically managing remote workers, including automatically training the workers
US6961482 | Mar 5, 2001 | Nov 1, 2005 | Ncs Pearson, Inc. | System for archiving electronic images of test question responses
US6965752 | Aug 19, 2003 | Nov 15, 2005 | Ecollege.Com | On-line educational system having an electronic notebook feature
US6970677 | Jul 27, 2004 | Nov 29, 2005 | Harcourt Assessment, Inc. | Computerized system and method for teaching and assessing the holistic scoring of open-ended questions
US6976847 | Oct 20, 2003 | Dec 20, 2005 | Ncs Pearsons, Inc. | System for providing feedback to evaluators
US6988895 | Jan 12, 2001 | Jan 24, 2006 | Ncs Pearson, Inc. | Electronic test item display as an image with overlay controls
US7035748 | Jul 20, 2004 | Apr 25, 2006 | Data Recognition Corporation | Priority system and method for processing standardized tests
US7054464 | Oct 15, 2002 | May 30, 2006 | Ncs Pearson, Inc. | System and method of distribution of digitized materials and control of scoring for open-ended assessments
US7074128 | May 12, 2003 | Jul 11, 2006 | Drb Lit Ltd. | Method and system for enhancing memorization by using a mnemonic display
US7127208 * | Jun 24, 2002 | Oct 24, 2006 | Educational Testing Service | Automated annotation
US7155400 | Nov 14, 2001 | Dec 26, 2006 | Qgenisys, Inc. | Universal task management system, method and product for automatically managing remote workers, including automatically recruiting workers
US7162198 | Sep 23, 2004 | Jan 9, 2007 | Educational Testing Service | Consolidated Online Assessment System
US7162433 * | Feb 22, 2006 | Jan 9, 2007 | Opusone Corp. | System and method for interactive contests
US7182600 * | Dec 12, 2002 | Feb 27, 2007 | M.I.N.D. Institute | Method and system for teaching vocabulary
US7308413 * | May 5, 2000 | Dec 11, 2007 | Tota Michael J | Process for creating media content based upon submissions received on an electronic multi-media exchange
US7318191 * | Oct 10, 2002 | Jan 8, 2008 | Bhk Systems, L.P. | Automated system and method for dynamically generating customized typeset question-based documents
US7321858 | Nov 30, 2001 | Jan 22, 2008 | United Negro College Fund, Inc. | Selection of individuals from a pool of candidates in a competition system
US7357640 | Mar 31, 2004 | Apr 15, 2008 | Drb Lit Ltd. | Lock-In Training system
US7364432 | Mar 31, 2004 | Apr 29, 2008 | Drb Lit Ltd. | Methods of selecting Lock-In Training courses and sessions
US7390191 | Feb 9, 2005 | Jun 24, 2008 | Drb Lit Ltd. | Computer system configured to sequence multi-day training utilizing a database
US7398467 * | Jun 13, 2000 | Jul 8, 2008 | International Business Machines Corporation | Method and apparatus for providing spelling analysis
US7406392 | Apr 13, 2006 | Jul 29, 2008 | Data Recognition Corporation | Priority system and method for processing standardized tests
US7574276 | Dec 21, 2005 | Aug 11, 2009 | Microsoft Corporation | System and methods for providing automatic classification of media entities according to melodic movement properties
US7657220 | May 21, 2004 | Feb 2, 2010 | Ordinate Corporation | Adaptive scoring of responses to constructed response questions
US7729655 | Sep 22, 2004 | Jun 1, 2010 | Educational Testing Service | Methods for automated essay analysis
US7769339 | Jul 27, 2006 | Aug 3, 2010 | Educational Testing Service | Automated essay scoring
US7770143 | Jan 19, 2007 | Aug 3, 2010 | Hughes John M | System and method for design development
US7778866 | Jan 14, 2005 | Aug 17, 2010 | Topcoder, Inc. | Systems and methods for software development
US7792685 | Jul 30, 2007 | Sep 7, 2010 | United Negro College Fund, Inc. | Selection of individuals from a pool of candidates in a competition system
US7796937 | Aug 21, 2006 | Sep 14, 2010 | Educational Testing Service | Automated annotation
US7831195 * | Dec 11, 2006 | Nov 9, 2010 | Sharp Laboratories Of America, Inc. | Integrated paper and computer-based testing administration system
US7881898 | Jul 7, 2008 | Feb 1, 2011 | Data Recognition Corporation | Priority system and method for processing standardized tests
US7937657 | Jun 2, 2008 | May 3, 2011 | International Business Machines Corporation | User specific error analysis
US8019641 | Sep 18, 2008 | Sep 13, 2011 | Opusone Corporation | System and method for interactive contests
US8021221 | Dec 18, 2008 | Sep 20, 2011 | Topcoder, Inc. | System and method for conducting programming competitions using aliases
US8027867 | Oct 14, 2008 | Sep 27, 2011 | Blenk Christopher W | System and method for decision of publishing literary work based on reviewer's satisfaction demographic factors
US8073792 | Mar 13, 2008 | Dec 6, 2011 | Topcoder, Inc. | System and method for content development
US8082279 | Apr 18, 2008 | Dec 20, 2011 | Microsoft Corporation | System and methods for providing adaptive media property classification
US8121851 | Jul 30, 2007 | Feb 21, 2012 | United Negro College Fund, Inc. | Selection of individuals from a pool of candidates in a competition system
US8137172 | Dec 18, 2008 | Mar 20, 2012 | Topcoder, Inc. | System and method for programming tournaments
US8155578 | Sep 27, 2004 | Apr 10, 2012 | Educational Testing Service | Method and system for generating and processing an assessment examination
US8187004 * | Sep 3, 2004 | May 29, 2012 | Desensi Jr Francis Joseph | System and method of education administration
US8202098 * | Feb 28, 2006 | Jun 19, 2012 | Educational Testing Service | Method of model scaling for an automated essay scoring system
US8213856 | Jul 28, 2008 | Jul 3, 2012 | Vantage Technologies Knowledge Assessment, L.L.C. | Unified web-based system for the delivery, scoring, and reporting of on-line and paper-based assessments
US8374540 | Dec 29, 2006 | Feb 12, 2013 | Educational Testing Service | Consolidated on-line assessment system
US8385811 | May 14, 2009 | Feb 26, 2013 | Data Recognition Corporation | System and method for processing forms using color
US8398407 | Oct 28, 2003 | Mar 19, 2013 | Iplearn, Llc | Learning method and system in a window environment
US8452225 | May 24, 2010 | May 28, 2013 | Educational Testing Service | Methods for automated essay analysis
US8467716 | Jul 30, 2010 | Jun 18, 2013 | Educational Testing Service | Automated essay scoring
US8472860 * | May 8, 2009 | Jun 25, 2013 | Benny G. Johnson | Artificial intelligence software for grading of student problem-solving work
US8475174 * | May 26, 2012 | Jul 2, 2013 | Iplearn-Focus, Llc | Learning method and system using detached sensor
US8488220 | Oct 22, 2008 | Jul 16, 2013 | Data Recognition Corporation | Method and apparatus for calibrating imaging equipment
US8491311 | Sep 29, 2003 | Jul 23, 2013 | Mind Research Institute | System and method for analysis and feedback of student performance
US8499278 | Feb 28, 2011 | Jul 30, 2013 | Topcoder, Inc. | System and method for software development
US8526055 | Oct 22, 2008 | Sep 3, 2013 | Data Recognition Corporation | Standardized test and survey imaging system
US8538320 | Mar 14, 2013 | Sep 17, 2013 | Iplearn-Focus, Llc | Learning method and system using detached sensor
US8538321 | Mar 14, 2013 | Sep 17, 2013 | Iplearn-Focus, Llc | Computing method and system using detached sensor
US8560333 | Feb 16, 2012 | Oct 15, 2013 | United Negro College Fund, Inc. | Selection of individuals from a pool of candidates in a competition system
US8608477 * | Apr 6, 2006 | Dec 17, 2013 | Vantage Technologies Knowledge Assessment, L.L.C. | Selective writing assessment with tutoring
US8626054 | Jul 20, 2010 | Jan 7, 2014 | Educational Testing Service | Automated annotation
US8632344 | May 25, 2012 | Jan 21, 2014 | Educational Testing Service | Method of model scaling for an automated essay scoring system
US8649601 | Oct 22, 2008 | Feb 11, 2014 | Data Recognition Corporation | Method and apparatus for verifying answer document images
US8655715 | Dec 20, 2006 | Feb 18, 2014 | Opusone Corp. | System and method for interactive contests
US8657606 * | Jul 5, 2005 | Feb 25, 2014 | Paul Fisher | Asynchronous art jurying system
US8727857 | Sep 30, 2011 | May 20, 2014 | Igt | Wager gaming voting leaderboard
US8727858 | Sep 30, 2011 | May 20, 2014 | Igt | Wager gaming voting leaderboard
US8734220 | Sep 30, 2011 | May 27, 2014 | Igt | Wager gaming voting leaderboard
US8734221 | Sep 30, 2011 | May 27, 2014 | Igt | Wager gaming voting leaderboard
US8734257 | Sep 30, 2011 | May 27, 2014 | Igt | Wager gaming voting leaderboard
US8738659 | Oct 22, 2008 | May 27, 2014 | Data Recognition Corporation | Method and apparatus for managing priority in standardized test and survey imaging
US8761658 | Jan 31, 2011 | Jun 24, 2014 | FastTrack Technologies Inc. | System and method for a computerized learning system
US8776042 | Dec 19, 2005 | Jul 8, 2014 | Topcoder, Inc. | Systems and methods for software support
US20090286218 * | May 8, 2009 | Nov 19, 2009 | Johnson Benny G | Artificial intelligence software for grading of student problem-solving work
US20090311659 * | Jun 11, 2008 | Dec 17, 2009 | Pacific Metrics Corporation | System and Method For Scoring Constructed Responses
US20100331088 * | Jun 29, 2009 | Dec 30, 2010 | Daniel Jason Culbert | Method and System for Real Time Collaborative Story Generation and Scoring
US20110269110 * | May 3, 2011 | Nov 3, 2011 | Mcclellan Catherine | Computer-Implemented Systems and Methods for Distributing Constructed Responses to Scorers
US20120166316 * | Mar 6, 2012 | Jun 28, 2012 | Richard Angelo Messina | Collective community Method of Integrated Internet-Based tools for Independent Contractors, their Collaborators, and Customers
US20120237920 * | May 26, 2012 | Sep 20, 2012 | Chi Fai Ho | Learning Method and System Using Detached Sensor
USRE39435 * | Sep 2, 2003 | Dec 19, 2006 | Drb Lit Ltd. | Learning system with learner-constructed response based methodology
WO2001061670A1 * | Feb 14, 2001 | Aug 23, 2001 | Starlab Nv | Open source university
WO2001073724A1 * | Mar 23, 2001 | Oct 4, 2001 | David A Richter | Computer-based instructional system with student verification feature
WO2002031800A1 * | Oct 11, 2001 | Apr 18, 2002 | Katzenbach Partners Llc | Assessment system and method
WO2002089093A1 * | May 1, 2002 | Nov 7, 2002 | Daniel Boehmer | Method for communicating confidential educational information
WO2003079313A1 * | Mar 14, 2003 | Sep 25, 2003 | Educational Testing Service | Consolidated online assessment system
WO2003094131A1 * | May 1, 2003 | Nov 13, 2003 | Psychological Corp | Holistic question scoring and assessment
WO2004001700A1 * | Jun 23, 2003 | Dec 31, 2003 | Educational Testing Service | Automated essay annotation system and method
WO2005098651A1 * | Mar 22, 2004 | Oct 20, 2005 | Bhk Systems L P | Automated system and method for dynamically generating customized typeset question-based documents
WO2006093928A2 * | Feb 28, 2006 | Sep 8, 2006 | Yigal Attali | Method of model scaling for an automated essay scoring system
Classifications
U.S. Classification: 434/353, 434/118, 434/322, 434/350, 382/321
International Classification: G09B5/06, G09B5/14, G09B7/02
Cooperative Classification: G09B7/02, G09B5/065, G09B5/14
European Classification: G09B5/06C, G09B5/14, G09B7/02
Legal Events
Date | Code | Event | Description
May 2, 2011 | FPAY | Fee payment | Year of fee payment: 12
May 16, 2007 | FPAY | Fee payment | Year of fee payment: 8
Mar 19, 2003 | FPAY | Fee payment | Year of fee payment: 4
Jun 1, 1998 | AS | Assignment | Owner name: EDUCATIONAL TESTING SERVICE, NEW JERSEY; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DRISCOLL, GARY;HATFIELD, LOU A.;JOHNSON, ARLENE A.;AND OTHERS;REEL/FRAME:009248/0197;SIGNING DATES FROM 19980520 TO 19980528