WO2003042786A2 - Extensible exam language (xxl) protocol for computer based testing - Google Patents


Info

Publication number
WO2003042786A2
Authority
WO
WIPO (PCT)
Prior art keywords
test
type
segments
data
name
Application number
PCT/US2002/036288
Other languages
French (fr)
Other versions
WO2003042786B1 (en)
WO2003042786A3 (en)
Inventor
Clarke D. Bowers
Tronster M. Hartley
Kyle M. Kvech
William H. Garrison
Original Assignee
Prometric, A Division Of Thomson Learning, Inc.
Application filed by Prometric, A Division Of Thomson Learning, Inc.
Priority to AU2002361616A1 (en)
Publication of WO2003042786A2 (en)
Publication of WO2003042786A3 (en)
Publication of WO2003042786B1 (en)

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F11/00 Error detection; Error correction; Monitoring
            • G06F11/36 Preventing errors by testing or debugging software
              • G06F11/3668 Software testing
                • G06F11/3672 Test management
                  • G06F11/3684 Test management for test design, e.g. generating new test cases
          • G06F40/00 Handling natural language data
            • G06F40/10 Text processing
              • G06F40/12 Use of codes for handling textual entities
                • G06F40/14 Tree-structured documents
                  • G06F40/143 Markup, e.g. Standard Generalized Markup Language [SGML] or Document Type Definition [DTD]
              • G06F40/166 Editing, e.g. inserting or deleting
                • G06F40/174 Form filling; Merging
      • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
          • G09B5/00 Electrically-operated educational appliances
            • G09B5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
          • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers
          • G09B19/00 Teaching not covered by other main groups of this subclass

Definitions

  • the present invention generally relates to the field of computer-based testing, and in particular, the present invention relates to a non-deterministic test definition language that defines a test specification and content of a computer-based test.
  • standardized testing has been a common method of assessing examinees as regards educational placement, skill evaluation, etc. Due to the prevalence and mass distribution of standardized tests, computer-based testing has emerged as a superior method for providing standardized tests, guaranteeing accurate scoring, and ensuring prompt return of test results to examinees.
  • Tests are developed based on the requirements and particulars of test developers. Typically, test developers employ psychometricians or statisticians and psychologists to determine the specific requirements specific to human assessment. These experts often have their own, unique ideas regarding how a test should be presented and regarding the necessary contents of that test, including the visual format of the test as well as the data content of the test. Therefore, a particular computer-based test has to be customized to fulfill the client's requirements.
  • Figure 1 illustrates a prior art process for computerized test customization, denoted generally by reference numeral 10.
  • a client details the desired test requirements and specifications, step 12.
  • the computerized test publisher then creates the tools that allow the test publisher to author the items, presentations, etc., required to fulfill the requirements, step 14.
  • the test publisher then writes an item viewer, which allows the test publisher to preview what is being authored, step 16.
  • An item presenter is then written to present the new item, for example, to the test driver, step 18.
  • Presenting the new item to the test driver requires a modification of the test driver's executable code.
  • the test driver must be modified so that it is aware of the new item and can communicate with the new item presenter, step 20.
  • the test packager must then also be modified, step 22.
  • the test packager, which may also be a compiler, takes what the test publisher has created and writes the result as new object code for the new syntax.
  • the scoring engine must also be modified to be able to score the new item type, step 24.
  • the results processor must be modified to be able to accept the new results from the new item, step 26. This process requires no fewer than seven software creations or modifications to existing software.
  • U.S. Patent No. 5,827,070 (Kershaw et al.) and U.S. Patent No. 5,565,316 (Kershaw et al.) are incorporated herein by reference.
  • the '070 and '316 patents, which have similar specifications, disclose a computer-based testing system comprising a test development system and a test delivery system.
  • the test development system comprises a test document creation system for specifying the test contents, an item preparation system for computerizing each of the items in the test, a test preparation system for preparing a computerized test, and a test packaging system for combining all of the items and test components into a computerized test package.
  • the computerized test package is then delivered to authorized examinees on a workstation by the test delivery system.
  • Figure 2 illustrates the relationship among session scripts 30, test scripts 32, and units.
  • a script consists of a series of files and further specifies the option settings and configuration data, which the Test Delivery Application needs for operation.
  • scripts are prepared and combined with the items prepared during item preparation. Scripts control the sequence of events during a testing session.
  • Two types of scripts are preferably used: the session script 30 and one or more test scripts 32.
  • the session script 30 controls the order in which units within the testing session are presented. Units provide specific services to the examinee, such as delivering a test or presenting a score report.
  • the test script controls what is presented to the examinee during the testing unit.
  • Each testing unit may include one or more delivery units, which are separately timed and scored subdivisions of a test.
  • the system can dynamically select, or spiral, scripts and other test components so that examinees are given what appear to be different tests.
  • FIG. 24 shows the relationship among session scripts 30, test scripts 32, and units.
  • the session script is the second-level component of the testing package. It performs two primary functions: First, it specifies the Session Control Information, which defines the default options that are in effect for the entire examinee testing session. Second, it controls the order in which units within the testing session are presented and the options used to present them.
  • the units that can be presented within a session script are: General information screen units, tutorial units, Break units, Data collection units, Scoring and reporting units, and Testing units.
  • the session control information contains the default options in effect for the entire session.
  • Control information can be provided at multiple levels within the testing session. Thus, the control information provided at the session level can be overridden by information that occurs later in the session.
  • the information provided at the session level would generally include the following: Name - the session script name to be used by administrators in selecting a specific session script from Administrative Application menus; Input device - the input device to be used during the session (e.g., mouse or keyboard); Color - the colors to be used during the session; Messages - program-specific messages to override default messages during the session; Demo Script - indicates whether the script presents a demonstration or operational test; Research Indicator - indicates whether the script presents a research pilot test; Special Timing - indicates whether the script is a standard or specially timed version.
  • the testing unit presents a test, based on the contents of a test script that may have been selected at runtime.
  • the following units can be included within a testing unit: general information screen unit; tutorial unit; break unit; delivery unit, which delivers items to the examinee. This permits testing programs to interleave general information screens, tutorials, and breaks with sections of a test.
  • the testing unit contains the following information: script selection mode, which indicates whether dynamic runtime selection is to be used to select the test script; and a reference to a test script, which controls the sequence of events and options used during the testing unit. If dynamic runtime selection is to be used, the reference is to a set of test scripts.
  • the test script performs two primary functions. First, it specifies the test and delivery unit control information.
  • Test control information defines the options that are in effect for the testing unit.
  • Delivery unit control information defines the options that are in effect for a particular delivery unit within a testing unit. It controls the order in which units are presented within the testing unit and the options used to present them. The rules for presentation of units are the same as those for the session script, except that an additional unit, the delivery unit, can be included within a test script.
  • U.S. Patent No. 5,513,994 (Kershaw et al.), which is incorporated herein by reference, discloses a centralized administrative system and method of administering standardized tests to a plurality of examinees.
  • the administrative system is implemented on a central administration workstation and at least one test workstation located in different rooms at a test center.
  • the administrative system software, which provides substantially administrative functions, is executed from the central administration workstation.
  • the testing system software, which provides functions carried out in connection with a test session, is executed from the testing workstations.
  • test definition language that allows the addition of new test functionality without necessitating changes to a test driver's executable code or other implementing functionality.
  • the test definition language used to implement a computer-based test supports, for example, a non-predetermined set of properties.
  • the test definition language used to implement a computer-based test supports the addition of named properties to, for example, any area of the test definition.
  • test definition language used to implement a computer-based test is in extensible markup language format and optionally has a grammar that is defined by a schema.
  • the test driver manages the computer-based test, controls progression of the computer-based test, controls scoring of the computer-based test, controls timing of the at least one test, controls printing of the at least one test, and controls results reporting of the computer-based test based on the test definition language.
  • the optional memory stores a plurality of first data structures.
  • the plurality of first data structures includes element specific data objects indicating a classification of at least one of the plurality of segments of the test definition language.
  • the plurality of segments defines information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the computer-based test.
  • the memory also stores second data structures that optionally depend from or are subordinate to the first data structures.
  • the second data structures include attribute specific data objects indicating at least one attribute of the segments of the test definition language implemented by the computer.
  • the memory further stores third data structures that depend from or are subordinate to the plurality of first data structures.
  • the third data structures include data specific data objects indicating at least one sub-classification of the at least one of the plurality of segments of the test definition language.
  • the memory further stores third data structures that depend from or are subordinate to the plurality of first data structures.
  • the plurality of third data structures include element specific data objects indicating a sub-classification of the at least one of the plurality of segments of the test definition language.
  • the sub-classification further indicates at least one property specific to the at least one of the plurality of segments of the test definition language.
  • the memory further stores fourth data structures that depend from or are subordinate to the plurality of first data structures.
  • the plurality of fourth data structures include group specific data objects indicating an order of an appearance of the at least one of the plurality of third data structures, a minimum occurrence of the appearance of the at least one of the third data structures, and a maximum occurrence of the appearance of the at least one of the third data structures.
  • a memory storing a schema for a test definition language in extensible markup language format that characterizes a computer-based test delivered to an examinee using a test driver and is implemented by a computer.
  • the test definition language has a plurality of segments.
  • the computer-based test has a presentation format and data content and the test driver delivers the computer-based test to an examinee using a display device, manages the computer-based test, controls progression of the computer-based test, controls scoring of the computer-based test, controls timing of the at least one test, controls printing of the at least one test, and controls results reporting of the computer-based test based on the test definition language, wherein the schema defines a permissible grammar for the test definition language.
  • An optional memory stores a plurality of first data structures.
  • the plurality of first data structures includes element definition specific data objects defining an element classification of at least one of the plurality of segments of the schema.
  • the plurality of segments defines classification identification information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the computer-based test.
  • the memory also stores second data structures.
  • the second data structures include attribute definition specific data objects defining at least one attribute classification of the plurality of segments of the schema.
  • the memory further stores third data structures.
  • the third data structures include element specific data objects indicating at least one element sub-classification of the at least one of the plurality of segments of the schema.
  • the memory also stores fourth data structures.
  • the fourth data structures include attribute specific data objects indicating at least one attribute of the at least one of the plurality of segments of the test definition language implemented by the computer.
  • a method for computer-based testing includes authoring a test specification and content of the at least one test using a test definition language.
  • the test specification and content defines the presentation format and the data content of the at least one test.
  • the method also includes compiling the test specification and content of the at least one test to create a compiled test specification and content. Compiling the test specification and content includes validating the test specification and content.
  • the method further includes storing the compiled test specification and content to a resource file and retrieving the compiled test specification and content from the resource file during delivery of the test.
  • a method for defining a schema for a test definition language includes defining a first set of elements, defining a set of attributes, and defining a second set of elements.
  • the second set of elements references the first set of elements and the set of attributes.
  • a method for a computer-based testing system that executes a test controlled by a test driver.
  • the test driver has an executable code that controls the test driver and functionality performed by the test driver that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, control timing of the at least one test, control printing of the at least one test, and control results reporting of the at least one test based on a test definition language in extensible markup language format.
  • the test has a presentation format and data content.
  • the test definition language has a plurality of segments that defines information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the test.
  • the method includes the sequential, non-sequential and/or sequence independent steps of authoring at least one of the plurality of segments and storing the at least one of the plurality of segments to the source file.
  • the method also includes instantiating a validation expansion module during a test production cycle and loading the at least one of the plurality of segments of the test definition language into a memory from the source file, validating the at least one of the plurality of segments.
  • the method further includes unloading the at least one of the plurality of segments from the memory into at least one of a plurality of storage elements and providing to the memory the at least one of the plurality of storage elements.
  • the method also includes loading the at least one of the plurality of segments of the test definition language from the at least one of the plurality of storage elements into the memory during a test delivery cycle and implementing directly by the validation expansion module the information defined by the at least one of the plurality of segments.
  • the method further includes accessing by the test driver the at least one of the plurality of segments of the test definition language to enable the functionality of the test driver via the direct implementation by the validation expansion module.
  • the test definition language has a plurality of element specific data objects and a plurality of attribute specific data objects that define information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the test.
  • the method includes the sequential, non-sequential and/or sequence independent steps of authoring at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects and storing the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects to the source file.
  • the method also includes instantiating a validation expansion module during a test production cycle, loading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language into a memory from the source file.
  • the method further includes validating the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects and unloading the at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects from the memory into at least one of a plurality of storage elements.
  • the method also includes providing to the memory the at least one of the plurality of storage elements and loading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language from the at least one of the plurality of storage elements into the validation expansion module during a test delivery cycle.
  • the method further includes implementing directly by the validation expansion module the information defined by the at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects and accessing by the test driver the at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects of the test definition language to enable the functionality of the test driver via the direct implementation by the validation expansion module.
  • Figure 1 is a flow diagram of a prior art method for computerized test customization
  • Figure 2 is a block diagram of a prior art testing script
  • Figure 3 is a schematic diagram of a computer-based testing system according to the present invention
  • Figure 4 illustrates various components that comprise an exam source file
  • Figures 5A and 5B are schematics illustrating the components, classes, and interfaces that comprise a test definition language compiler according to the present invention
  • Figure 6 is a flow diagram that illustrates a compile order for compiling a source file according to the present invention
  • Figure 7 illustrates how a test publisher defines the compile order
  • Figure 8 illustrates an output of the test definition language compiler based on the compile order
  • Figure 9 is a block diagram of main storage branches of an exam resource file according to the present invention.
  • Figure 10 is a block diagram illustrating an exams branch of the exam resource file
  • Figures 11A and 11B are block diagrams illustrating a forms branch of the exam resource file
  • Figure 12 is a block diagram illustrating an items branch of the exam resource file
  • Figure 13 is a block diagram illustrating a categories branch of the exam resource file
  • Figure 14 is a block diagram illustrating a templates branch of the exam resource file
  • Figure 15 is a block diagram illustrating a sections branch of the exam resource file
  • Figure 16 is a block diagram illustrating a groups branch of the exam resource file
  • Figures 17A, 17B, 17C, and 17D are block diagrams illustrating an events sub-branch of the groups branch of the exam resource file
  • Figure 18 is a block diagram illustrating a plugins branch of the exam resource file
  • Figure 19 is a block diagram illustrating a data branch of the exam resource file
  • Figure 20 is a block diagram illustrating a formGroups branch of the exam resource file
  • Figure 21 is a block diagram illustrating an attributes branch of the exam resource file
  • Figure 22 is a block diagram illustrating a scripts branch of the exam resource file
  • Figure 23 is a block diagram illustrating a message box branch of the exam resource file
  • Figure 24 is a flow chart of a method of test production and test delivery according to the present invention
  • Figure 25 is a flow chart of a method for validation of test specification and content according to the present invention.
  • Figure 26 is a flow chart of a method for test delivery according to the present invention
  • Figure 27 is a flow diagram illustrating the flow of a test according to the present invention
  • Figure 28 illustrates an example of a schema according to the present invention
  • Figure 29 illustrates an example of a schema definition of an item according to the present invention
  • Figure 30 illustrates the hierarchy of the test definition language according to the present invention
  • Figure 31 illustrates how a plugin is used with the test definition language to enable delivery of the computer-based test according to the present invention
  • Figure 32 illustrates a rendering of an example of a definition of data using the test definition language according to the present invention
  • Figure 33 illustrates an example of using the test definition language to define plugin data
  • Figure 34 illustrates test definition language referencing according to the present invention.
  • a test is delivered using a test driver that is, for example, object-oriented and is architected to dynamically add functionality through, for example, the use of an expansion module, and preferably through the use of plugins.
  • the test driver preferably references component object model servers using standard interfaces, and uses, for example, class names (that can be an Active Document) defined in a custom test definition language entitled extensible eXam Language (“XXL”) based on extensible Markup Language (“XML”) format to interact with existing applications while offering the flexibility of allowing development of new plugins.
  • These new plugins can be customized to a client's needs without changing the core test driver.
  • the XXL language is defined using an XXL schema that structures the allowable grammar of the XXL language.
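  • By way of illustration only, and not as an excerpt from the actual XXL schema, a schema fragment constraining one hypothetical XXL element might look like the following (all element and attribute names here are assumptions):

      <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
        <!-- a globally defined attribute that elements can reference by name -->
        <xs:attribute name="name" type="xs:string"/>
        <!-- a hypothetical "section" element built from groups -->
        <xs:element name="section">
          <xs:complexType>
            <xs:sequence>
              <!-- order and minimum/maximum occurrence of child elements are constrained here -->
              <xs:element name="group" minOccurs="1" maxOccurs="unbounded"/>
            </xs:sequence>
            <xs:attribute ref="name" use="required"/>
          </xs:complexType>
        </xs:element>
      </xs:schema>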
  • the plugins advantageously enable the test driver to support, for example, new item types, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types without change to the test driver's executable.
  • Plugins also allow expansion of the test driver's functionality without requiring the test driver to be recompiled or re-linked, and without requiring the test publisher to learn to program. Since plugins are written independently of the test driver, plugins can be written long after the test driver is built.
  • the client and the software developer can design and test the plugins and distribute the plugins to each test site. By using this method, large-scale regression testing of other examinations will not usually be necessary unless changes are made to the plugins that may be used by many examinations.
  • Test driver 110 is responsible for controlling all aspects of the computer-based test. Test driver 110 identifies examinees scheduled to take the computer-based test and identifies and creates the appropriate test.
  • Test driver 110 presents all of the test components to examinees using a display device (not shown), such as a computer monitor, and enables examinees to enter responses to test questions through the use of an input device (not shown), such as a keyboard, a mouse, etc.
  • Test driver 110 also monitors the security of the test. For example, test driver 110 can prevent access to the Internet and can validate examinees, although, these functions are preferably performed by the test center administration system. Test driver 110 also monitors the timing of the test, providing relevant warnings to examinee regarding the elapsed time of the test and the time remaining for a particular section of the test or for the entire test.
  • Test driver 110 is also responsible for scoring the test, once the test is completed or while the test is in progress, and for reporting the results of the test by physical printout using printer 182 or in a file format using candidate exam results file 180. If the test is interrupted while in progress, for example, due to a power failure, test driver 110 restarts the test, preferably at the point at which the test was interrupted, as will be described subsequently in more detail. Finally, if the test is left incomplete, test driver 110 cleans up the incomplete test. An incomplete test will have an exam instance file in the examinee's directory but will not have created a results file. A results file is created even though generally the candidate will fail. The number of items delivered to the examinee is recorded in the results file. Test driver 110 picks up where the event was interrupted and invisibly delivers the rest of the units of the test.
  • test specification is authored by a test publisher according to the specifications of the client and stored in exam source files 130.
  • Exam source files 130 include data files 132, XXL files 134, multimedia files 136, and hypertext markup language ("HTML") files 138.
  • XXL files 134 include the test specification, which contains the client's requirements for the test, a bank of test items or questions, templates that determine the physical appearance of the test, plugins, and any additional data necessary to implement the test. Additional data is also stored in data files 132.
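  • As a hedged sketch (the element names below follow the top-level XXL branches described later, but the exact markup is not reproduced from the patent), a minimal XXL source file combining these parts might be structured as follows:

      <xxl>
        <plugin name="helmNavigation"/>            <!-- hypothetical plugin declaration -->
        <template name="singleColumn"> ... </template>
        <item name="item001"> ... </item>
        <section name="section1"> ... </section>
        <form name="formA"> ... </form>
        <exam name="sampleExam"> ... </exam>
      </xxl>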
  • an adaptive selection plugin may need a, b, and c theta values. These values are stored in a binary file created by a statistical package.
  • HTML files 138 include, for example, any visual components of the test, such as the appearance of test items or questions, the appearance of presentations on the display device, the appearance of any client specified customizations, and/or the appearance of score reports. HTML files 138 preferably also include script, for example, VBScript, JScript, or JavaScript. HTML files 138 are preferably authored using Microsoft's FrontPage 2000. FrontPage 2000 is preferably also used to manage the source files in a hierarchy that is chosen by the test publisher. Multimedia files 136 include, for example, any images (.jpg, .gif, etc.) and/or sound files (.mp3, .wav, .au, etc.) that are used during the test.
  • XXL compiler 140 retrieves XXL files 134 from exam source files 130 using interface 190 and compiles the XXL test content stored in XXL files 134. XXL compiler 140 stores the compiled test files in exam resource file 120. In another embodiment, exam source files 130 do not contain XXL files 134 and contain, for example, only multimedia files. In this embodiment, XXL compiler 140 is merely a test packager that writes the data directly to exam resource file 120 without modification or validation. The data appears in a stream under the "data" branch of exam resource file 120. The name of the stream is specified by the test author.
  • XXL files 134 also include XXL language that defines plugins 150, in which case, plugins 150 assist XXL compiler 140 in compiling XXL files 134.
  • Test driver 110 preferably supports, for example, nine different types of plugins 150, including, for example: display plugin 152; helm plugin 154; item plugin 156; timer plugin 158; selection plugin 160; navigation plugin 162; scoring plugin 164; results plugin 166; and report plugin 168.
  • Plugins 150, which are also included in XXL files 134, are the first XML files compiled into exam resource file 120.
  • Exam resource file 120 receives the compiled test content from XXL compiler 140 and plugins 150.
  • Exam resource file 120 is preferably stored in a POLESS structured storage format, which is related to OLE (object linking and embedding). Other storage formats may optionally be used.
  • OLE allows different objects to write information into the same file, for example, embedding an Excel spreadsheet inside a Word document.
  • OLE supports two types of structures, embedding and linking.
  • In OLE embedding, the Word document of the example is a container application and the Excel spreadsheet is an embedded object.
  • the container application contains a copy of the embedded object and changes made to the embedded object affect only the container application.
  • In OLE linking, the Word document of the example is the container application and the Excel spreadsheet is a linked object.
  • Test driver 110 comprises Active Document container application 112 for the visible plugins, display plugin 152, helm plugin 154, and item plugin 156, which function as embedded objects, preferably COM objects.
  • Both XXL compiler 140 and plugins 150 are involved in storing the compiled test content into exam resource file 120, if any of plugins 150 are being used.
  • Exam resource file 120 comprises, for example, a hierarchical storage structure, as will be described in further detail below. Other storage structures may optionally be used.
  • XXL compiler 140 determines to which storage location a specific segment of the compiled test content is to be stored. However, if any of plugins 150 are used to validate any portion of the data from exam source files 130, then plugins 150 store the data directly to the exam resource file, based upon directions from XXL compiler 140.
  • XXL compiler 140 uses IPersistResource interface 192, co-located with IPlugin interface 167 in Figure 3, to control the persistence of the data to exam resource file 120.
  • XXL compiler 140 and plugins 150 write the data to exam resource file 120 using POLESS interfaces 192.
  • Figure 4 illustrates the contents of exam source file 130, which are compiled into exam resource file 120 by XXL compiler 140 and plugins 150.
  • FrontPage 2000 Web 200 is used, for example, to author the test.
  • Exam source files 130 contain media files 210, visual files 220, and logic files 230.
  • Media files 210 are multimedia files used to enhance the presentation of the test, including, for example, XML data files 212, sound files 214, image files 216, and binary files 218.
  • XML data files 212 include the XXL test definition language and the XXL extensions from the plugins 150 that use XML.
  • the test specification, presentation, scoring and other information is specified in the XML files.
  • Sound files 214 include any sounds that are to be used during the test, such as .mp3 files, .au files, etc.
  • Image files 216 include any images to be used during the test, such as .jpg files, .gif files, etc.
  • Binary files 218 include any data needed by a plugin 150 that is not in XXL format.
  • Visual files 220 are HTML files that specify the visual presentation of the test as presented to the examinee on the display device, including items files 222, presentation files 224, score report files 226, and custom look files 228.
  • Items files 222 include HTML files that are used to specify the visual component of test questions, e.g., stems and distractors. Items files 222 are also capable of referencing external exhibits. An exhibit could be a chart, diagram, or photograph. Formats of exhibits include, for example: .jpg,
  • Score report files 226 are typically HTML files with embedded script that include, for example, candidate demographics, appointment information, and candidate performance. The performance might include pass/fail status, achievement in different content areas, etc.
  • Custom look files 228 are typically HTML files with embedded script to layout, for example, the title bar and information contained therein.
  • Logic files 230 are XML files that specify the functional aspects of the test, including test specification files 232, plugin files 234, item bank files 236, and template files 238.
  • Test specification files 232 specify the content and progression of the test as provided by the client.
  • Plugin files 234 define plugins 150 and contain any data necessary to implement plugins 150.
  • Item bank files 236 include the data content and properties of the items, or test questions, that are to be presented to the examinee during the test. Properties of an item include the correct answer for the item, the weight given to the item, etc.
  • Template files 238 define visual layouts that are used with the display screen during the test. Referring again to Figure 3, once a test has begun, test driver 110 accesses exam resource file 120.
  • Test driver 110 also accesses plugins 150 for additional data that expands the functionality of test driver 110 in the areas of items, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types.
  • Test driver 110 communicates with plugins 150 using various COM interfaces 169.
  • COM interfaces facilitate OLE linking.
  • test driver 110 is an Active Document container application and plugins 150 are embedded objects. The COM interfaces function as communications paths between the container application and the objects.
  • Plugin interface 167, which is also a COM interface, is supported by all of plugins 150.
  • COM interfaces 169, therefore, include the Plugin interface.
  • the Plugin interface contains generic operations such as loading and unloading required of all plugins 150.
  • each plugin 150 also uses, for example, a second, individual COM interface 169 to communicate with test driver 110.
  • Alternative structures of the Plugin interface may also be used. Table 1 shows the relationship between each plugin 150 and the COM interface 169 used with that particular plugin 150.
  • Exam instance file 170 is used to restart a test if the test has been interrupted, for example, because of a power failure.
  • exam instance file 170 receives examination state information from test driver 110 and plugins 150 regarding the state of all running objects being used to deliver the test.
  • the examination state information includes the presentation that was being delivered on the display device before the interruption, the responses the examinee had entered in that presentation, etc.
  • the exam instance file 170 loads the state information back to test driver 110 and plugins 150, allowing the test to return to operation at the point where the test had been interrupted.
  • the running state of all objects is saved to exam instance file 170, rather than the state of only some of the objects.
  • Exam instance file 170 may also store additional information relating to the test, including, for example: the timing utilized and time remaining on units of the exam, the current unit of delivery, candidate score, etc.
  • Test driver 110 and plugins 150 communicate with exam instance file 170 using POLESS interfaces 195.
  • Test driver 110 controls communication between test driver 110 and plugins 150 using PersistInstance interface 196, which is collocated with COM interfaces 169 in Figure 3.
  • TCM refers to Test Center Manager; ETS refers to Educational Testing Service; UAS refers to Unified Administration System.
  • There are preferably two ways to run test driver 110: the first is through a series of command line options, and the second is using COM interfaces describing appointment information.
  • the command line option exists for backwards compatibility in a standard ETS environment and a TCM environment.
  • Table 2 shows a list of command line options test driver 110 supports. There are, for example, four programs which launch the test through the COM interface, for example: 1) the Test Center Manager environment. Other numbers of environments and/or programs may optionally be used.
  • IAppointment interface 176 is part of UAS 174 and allows access by test driver 110 to examinee information for the examinee taking the test, such as demographics.
  • the examinee information is included in candidate exam results file 180, which is created by the test driver.
  • ILaunch2 interface 177 functions as the primary control interface for UAS 174 and allows UAS 174 to control various components such as test driver 110, screen resolution change, accommodations for disabled candidates, examinee check-in, etc., in a test center, which is the physical location where the examinee is taking the test.
  • ITransfer interface 199 transfers candidate exam results file 180 and other files back to UAS 174.
  • Print interface 198 sends information regarding any reports to printer 182.
  • Figures 5A and 5B illustrate the main diagram for XXL compiler 140.
  • XXL compiler 140 comprises the following classes, for example: cCompile 2000; cData 2004; cArea 2006; cTemplate 2008; cCategory 2010; cItem 2012; cPresentation 2014; cGroup 2016; cSection 2018; cForm 2020; cFormGroup 2022; cExam 2024; cMsgBox 2026; cChecksum 2028; cEvent 2030; cResult 2032; cReport 2034; cPlugin 2036; and cXXL 2038.
  • The main interface to XXL compiler 140 is ICompile interface 2002.
  • ICompile interface 2002 is implemented by cCompiler class 2000. All control and initiation of compilation of exam source files 130 into exam resource file 120 occurs by way of this single public interface.
  • the core, non-plugin related elements of the XXL test definition language, as stored in XXL files 134, are compiled by classes in XXL compiler 140. For example, cSection class 2018 compiles the section element, and cGroup class 2016 compiles the group element.
  • ICompile interface 2002 supports the following operations, for example: createResource(); addSource(); addData(); closeResource(); about(); linkResource(); openResource(); and getCryptoObject().
  • CreateResource() creates a resource file, for example, an XXL based resource file such as exam resource file 120.
  • AddSource() compiles an XXL file into the resource file.
  • AddData() adds a file directly to a data branch of the resource file.
  • CloseResource() closes the resource file.
  • LinkResource() links a resource in the resource file and is performed after all compiling of the source files is completed.
  • GetCryptoObject() returns an ICrypto object containing the current encryption setting of POLESS, as described below.
  • The classes of XXL compiler 140 handle individual XXL core language elements. All of these classes compile the specific XXL source element into exam resource file 120. All of these class language elements are also symbols used in later references. Therefore, the classes all derive from cSymbol class 2040. cSymbol class 2040 allows the classes of XXL compiler 140 to reside in a symbol table.
  • the XXL element plugin 150 appears as follows in XXL files 134:
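  • A hypothetical declaration of such a plugin element (the element name follows the patent's terminology, but the attributes and values below are invented for illustration and are not the patent's own listing) might read:

      <plugin name="multiChoiceItem" progid="ItemPlugins.MultiChoice"/>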
  • XXL compiler 140 also contains the following token classes, for example: cToken 2042; cTokenCreatorNoRef 2044; cTokenCreator 2046; cTokenCreatorRef 2048; cTokenCreatorBase 2050; and cTokenFactory 2054.
  • token classes are involved in the identification of tokens. Tokens turn into symbols after identification. Symbols are any class derived from cSymbol, e.g., cTemplate, cSection, etc.
  • XXL compiler 140 also contains the following symbol table classes, for example: cPluginSymbolTable 2058; cTemplateSymbolTable 2060; cSymbolTable 2062; cFFGSymbolTable 2064; cSGPSymbolTable 2066; and cSymbolTableBase 2068. These classes are varieties of symbol tables. There are different symbol tables for different groups of symbols. A group of symbols defines a name space for the symbols. Common symbol table functions are located in the base symbol table classes and templates.
  • (fragment of an XXL item listing:) ... URI="itembank/info_item.htm#wantABreak"/> </data> </item>
  • the item element is handled by a cItem class 2012 object.
  • the data element in the XXL definition is handled by a cData class 2004 object.
  • Item plugin 156 will receive the source to compile from the cData class 2004 object, in this example, a multiChoice element.
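  • Combining the listing fragment above with this description, a hedged reconstruction of such an item definition (element nesting inferred, names assumed) could be:

      <item name="wantABreak">
        <data>
          <!-- plugin-specific content handed to item plugin 156 for compilation -->
          <multiChoice URI="itembank/info_item.htm#wantABreak"/>
        </data>
      </item>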
  • cWrapXML class 2052, a wrapper class for XML DOM nodes, supports error handling.
  • cCustomAttributes class 2056 compiles the custom attributes XXL element.
  • cWrapPropertySet class 2070 is a wrapper class for a POLESS property storage.
  • the test publisher advantageously and optionally is not forced to combine the entire test specification and content of the test into a single file. Rather, the test publisher is encouraged to break apart exam source files 130 to allow for maximum reuse between tests. Therefore, in accordance with one embodiment, in a single XXL source file, the order of the elements is enforced by XXL compiler 140 with the symbol tables. In alternative embodiments, more than one source file may be used.
  • An element defined in the test specification and content or an attribute of an element is preferably and optionally defined before it is referenced by the element or by a sub-element.
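  • For example (illustrative names only), a template element would be defined before the section element that references it:

      <template name="singleColumn"> ... </template>
      <section name="section1" template="singleColumn"> ... </section>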
  • Exam source files 130 include, for example, data files 132.
  • Data files 132 include, for example, several multimedia files, e.g., sound files 2072 (.wav, .mp3, etc.) and image files 2070 (.jpg, .gif, etc.).
  • Data files 132 are typically globally accessible to the test specification and content as defined in XXL files 134. Therefore, data files 132 are compiled first. It does not matter, however, in which order data files 132 are themselves compiled.
  • XXL files 134 preferably are compiled after data files 132, if data files 132 exist. Otherwise, XXL files 134 are compiled first. Other compilation orders may optionally be used. Any globally available scripts 2078 or other global data 2076 preferably are compiled first. Plugins 150 are compiled next. It should be noted that data files 2070, other data files 2076, and scripts 2078 are optional.
  • plugins 150 can be the first files to be compiled if the other files are not present in exam source file 130. Any files concerning the layout of the test, i.e., layout files 2082, are next in the compilation order.
  • Titlebar.html file 2084 and .bmp file 2086 are examples of pulled files.
  • Pulled files are typically files that are used to create the visual format of the test and are usually defined using HTML. (See HTML files 138 in Figure 3.) If a file is referenced in HTML, then the file is compiled at the same time as the XXL file that is referencing the HTML file.
  • categories files 2084 are compiled next, since categories files 2084 can reference any global data files 132, plugins 150, and layout files 2082. Items files 2086, which include test questions that are to be delivered to the examinee, are compiled next, and any HTML files referenced by items files 2086 are compiled along with items files 2086. Finally, test specification files 2090, which are part of XXL files 134, are compiled. Test specification files 2090 define the groups, sections, forms, form groups, and exams that comprise the test. Various files, e.g., score report files 2092 and displays files 2094, can be referenced by test specification files 2090 and are compiled along with test specification files 2090.
  • the test publisher defines the compile order before starting the first compile sequence.
  • Figure 7 illustrates how the test publisher defines the compile order.
  • the test publisher first compiles all .jpg and .gif image files. All XML files are compiled next.
  • Plugin files 2080 are first in the sequence, followed by the template files, which are included in layout files 2082.
  • Next category files 2081 are compiled, followed by three items files 2086.
  • test specification files 2090 are compiled.
  • Figure 8 shows the output of the compilation process. The files are compiled in the order specified by the test publisher, as shown in Figure 7.
  • Other compile sequences may optionally be used that accomplish the functionality and/or objects of the present invention.
  • Figure 9 illustrates the main storage branches of exam resource file 120, which corresponds to the top-level elements of the XXL test definition language, denoted by reference numeral 500.
  • the main storage branches of exam resource file 120 are, for example: exams branch 550; forms branch 600; items branch 650; category branch 700; templates branch 750; sections branch 800; groups branch 850; plugins branch 900; data branch 950; formGroups branch 1000; attributes branch 1050; scripts branch 1100; and message box (“Msgbox”) branch 1150.
  • Other storage branches may alternatively be used.
  • Exam branch 550 stores, for example, the primary attributes, properties, and data that govern the test.
  • Exam branch 550 can store information for various tests, as is denoted by the three, vertical ellipses.
  • a specific test is identified by the data stored in name attribute storage 552 or other identification schemes. Again, the various tests may each be identified by a different name, as denoted by the solid border around name attribute storage 552.
  • Attributes storage 554 stores version information 555, and title information 556 of the test as a stream of data or other data storage format.
  • Title information 556 is optional, as is denoted by the broken border. Any optional, customized information regarding the test is stored in custom properties 558 as a property storage or other data storage format. Information relating to the forms of the test are optionally stored in forms property storage 560. A form is a fixed or substantially fixed order of testing events. Many different forms can be stored in forms storage 560, giving flexibility to test driver 110 in controlling progression of the test.
  • FormGroups storage 562 optionally stores information relating to a collection of exam forms as a stream of data. Preferably, a single form from the formGroup is chosen to deliver to an examinee. The selection of the form from the group is performed by a selection plugin 160.
  • Exam branch 550 preferably contains at least one forms storage 560 either independently or within formGroups storage 562. Other information relating to the test may be stored under exam branch 550.
  • Forms branch 600 stores, for example, the primary attributes, properties, and data that govern the progress of the test.
  • Forms branch 600 can store information for various forms, as is denoted by the three, vertical ellipses. As described previously, a form is a fixed or substantially fixed order of testing events. A single form is identified by the data stored in name attribute storage 602. Other identification formats may optionally be used. Again, the various forms may each be identified, for example, by a different name, as denoted by the solid border around name attribute storage 602.
  • Attribute storage 604 stores, for example, begin section information 605, end section information 606, event information 607, and optionally stores version information 608, title information 609, skip allowed information 610, restartable information 611, width information 612, height information 613, and bit depth information 614. All information stored in attribute storage 604 is stored as a stream of data or other storage format. Begin section information 605 and end section information 606 indicate, respectively, which section of the test begins and ends the test.
  • Event information 607 indicates, for example, the order of events of the test for that form. Each event has a name and is prefixed with an event type and a colon. Other formats are optional.
  • the event type includes "section", “report”, and “results”.
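  • As a hedged illustration of this convention (the event names are invented), the stored event order for a form might therefore read:

      section:tutorial  section:scoredSection  report:scoreReport  results:candidateResults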
  • Version information 608 and title information 609 indicate the version and title of the form, respectively.
  • Skip allowed information 610 indicates whether or not by default skipping of sections is allowed.
  • Restartable information 611 indicates whether the form can be restarted. Any optional, customized information regarding the form is stored in custom storage 616 as a property set or other data storage format.
  • Timer storage 628 stores, for example, information relating to how the form is to be timed, as a storage element. Attributes storage stores, for example, the name of timer plugin 158 to be used with the form.
  • plugin data storage 632 and plugin data stream 633 store any data necessary for timer plugin 158, as a storage element and a stream of data, respectively. Plugin data storage 632 and plugin data stream 633 are optional.
  • Scoring storage 634 stores, for example, information relating to the scoring of the form.
  • Attributes storage 636 stores, for example, the name of scoring plugin 164 to be used with the form.
  • plugin data storage and plugin data stream 639 optionally store any data needed for scoring plugin 164, as a storage element and a stream of data, respectively.
  • Items Branch 650 stores, for example, the primary attributes, properties, and data that govern the items, or test questions, to be delivered to the examinee during the test.
  • Items branch 650 can store information for various items, as is denoted by the three, vertical ellipses.
  • a single item is identified by the data stored in name attributes storage 652.
  • the various items may each be identified by a different name, as denoted by the solid border around name attributes storage 652.
  • Attributes storage 654 stores, for example, weight information 654, scored information 655, and optionally stores, for example, skip allowed information 656, title information 657, start information 658, finish information 659, and condition information 660.
  • Weight information 654 indicates a value used for judging and scoring the item.
  • Scored information 655 indicates whether or not the item is scored as opposed to whether the item is being used as an example. The default of scored information 655 is true.
  • Skip allowed information 656 indicates whether the examinee can skip the item without answering.
  • Start information 658 indicates script execution at the beginning of the item and finish information 659 indicates script execution at the end of the item.
  • Condition information 660 indicates whether or not there is a condition on the item being delivered to the examinee.
  • the information stored in attributes storage 654 is stored as a stream of data or other data storage format. Data storage 662 and data stream 664 store any information regarding the properties of the item.
  • data storage 662 or data stream 664 can store the correct answer of a multiple choice item.
  • Data storage 662 and data stream 664 store the information as a storage element and a stream of data, respectively.
  • Any optional, customized information regarding the item is stored in custom storage 666 as a stream of data or other data storage format.
  • Category storage 668 stores, for example, information relating to each category to which the item belongs.
  • the information stored in category storage 668 preferably and optionally is redundant, as category branch 700 stores, for example, all the items within the specific categories. The reason for the optional redundancy is so that test driver 110 can quickly look up the category of any item.
  • Category branch 700 stores, for example, the primary attributes, properties, and data that govern the test categories.
  • a test category provides a grouping mechanism, which is independent of delivery of the test, allowing for exotic reporting and scoring if necessary.
  • the test delivers 50 questions of fire safety from a pool of 200 questions. The 50 questions are chosen at random. All 50 questions are delivered together to the examinee in one section. Each question is a member of one of three categories: fire prevention, flammable liquids and fire retardants.
  • the test sponsor wants the report and results to show how well the examinee did in each category.
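  • As an illustration only, such a category structure might be authored in XXL roughly as follows; the element and attribute names shown here ("category", "itemRef", "complete", "duplicates") are assumptions made for this sketch rather than text taken from the actual schema:

    <!-- three reporting categories for the fire safety item pool -->
    <category name="fireSafety" complete="true" duplicates="false">
      <category name="firePrevention">
        <itemRef name="fp001"/>
      </category>
      <category name="flammableLiquids">
        <itemRef name="fl001"/>
      </category>
      <category name="fireRetardants">
        <itemRef name="fr001"/>
      </category>
    </category>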
  • Category branch 700 is optional as denoted by the broken border.
  • Category branch 700 can store information for various categories, as is denoted by the three, vertical ellipses.
  • a single category is identified by the data stored in name attributes storage 702. Again, the various categories may each be identified by a different name, as denoted by the solid border around name attributes storage 702.
  • Attributes storage 704 stores, for example, complete information 705, duplicates information 706, contents information 707, and optionally stores description information 708.
  • Complete information 705 indicates whether or not every item in the category must appear within the category or within its subcategories.
  • Duplicates information 706 indicates whether the item can appear more than once within the category or within the subcategories.
  • Contents information 707 determines what can exist within a category.
  • Description information 708 is used within the category to contain a description of the category's contents.
  • Category storage 710 stores, for example, information relating to any subcategories under the category identified in name attribute storage 702. Items storage 712 indicates any items that exist within the category.
  • Sections storage 714 contains information indicating any sections that exist within the category.
  • Scoring storage 716 contains information relating to the scoring of the items within the category. Attributes storage 718 stores, for example, the name of the scoring plugin to be used with the item. Data storage 720 and data stream 722 contain the information needed to initialize scoring plugin 164. Data storage 720 and data stream 722 store the information as a storage element and a stream of data respectively.
  • Templates branch 750 stores, for example, the primary attributes, properties, and data that govern the templates used in the test.
  • Template branch 750 can store information for various main templates, as is denoted by the three, vertical ellipses.
  • a single main template is identified by the data stored in name attributes storage 752.
  • the various templates may each be identified by a different name, as denoted by the solid border around name attributes storage 752.
  • Attributes storage 754 stores, for example, split information 756, order information 757, and optionally stores size information 759.
  • Split information 756 defines how a specific area within the template is to be split or separated, for example, by rows or columns or other shapes and/or sizes.
  • Size information 759 indicates possible values for describing the size of the template, for example, pixels, percentages, or html syntax.
  • Template storage 760 stores, for example, information relating to any sub-templates to be used under the templates specified by the information in name attributes storage 752. Sub-templates are identified by the information in name attributes storage 762. Many sub-templates 760 can exist as denoted by the three vertical ellipses.
  • Areas storage 764 indicates, for example, information relating to the areas used within the template denoted by the information in name attributes storage 752. Many areas may exist within a template as denoted by the three vertical ellipses. Each area is identified by the information stored in name attribute storage 766. Attribute storage 768 stores, for example, visible plugin name information
  • Plugin name information 760 indicates the name of the visible plugin to be used with the area.
  • Size information 770 indicates the size of the area, as for example a pixel value, a percentage value, or HTML syntax.
  • Plugin data 772 and plugin data 774 store information relating to the visible plugin to be used in the area. The data stored in either plugin data storage 772 or plugin data stream 774 is executed by the visible plugin when the template is loaded.
  • Plugin data storage 772 and plugin data stream 774 store the information as a storage element and a stream of data, respectively. Other information may optionally be stored.
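  • As a rough sketch of how a template and its areas might be expressed in XXL, consider the following; the attribute spellings and plugin names used here are assumptions for illustration, not the verified schema:

    <template name="twoColumn" split="cols">
      <!-- each area names the visible plugin that renders it and gives its size -->
      <area name="contentArea" plugin="htmlViewer" size="80%"/>
      <area name="helmArea" plugin="nextPrevious" size="20%"/>
    </template>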
  • Section branch 800 stores, for example, the primary attributes, properties, and data that govern test sections.
  • Test sections dictate the navigation and timing of groups of items as well as displays within the test.
  • Sections branch 800 can store information for various sections, as is denoted by the three, vertical ellipses.
  • a single section is identified by the data stored in name attribute storage 802. Again, the various sections may each be identified by a different name, as noted by the solid border around name attributes storage 802.
  • Attributes storage 804 stores, for example, group information 805 and optionally stores, for example, title information 806, skip allowed information 807, start information 808, finish information 809, and condition information 810.
  • Group information 805 indicates to which group of the test the section belongs.
  • Skip allowed information 807 indicates whether or not the items within the section may be skipped.
  • Start information 808 indicates script execution at the beginning of the section and finish information 809 indicates script execution at the end of the section.
  • Condition information 810 indicates any conditions that exist regarding the section. Any optional, customized information regarding this section is stored in custom property storage 812 as a stream of data or other data storage format. Custom attributes will be stored as a property set. The "key" for each attribute will be a string or other acceptable format.
  • Timer storage 814 stores, for example, information regarding the timing of the section.
  • Attribute storage 816 stores, for example, information identifying timer plugin 158, which is to be used with a section.
  • Plugin data storage 818 and plugin data storage 820 store, for example, data needed for timer plugin 158.
  • Plugin data storage 818 and plugin data storage 820 store the information as a storage element and a stream of data or other acceptable format, respectively.
  • Navigation storage 822 stores, for example, information relating to the delivery of presentations and groups within the section.
  • Attributes storage 824 stores, for example, information indicating which navigation plugin 162 is to be used with this section.
  • Plugin data storage 826 and plugin data stream 828 store information needed for the navigation plugin 162.
  • Plugin data storage 826 and plugin data stream 828 store the information as a storage element and a stream of data respectively.
  • Groups branch 850 as seen in Figure 16, stores, for example, the primary attributes, properties, and data that govern the groups within the test. A group determines the order of events within the test.
  • Groups branch 850 can store information for various groups, as is denoted by the three, vertical ellipses.
  • a single group is identified by the data stored in name attributes storage 852.
  • the various groups may each be identified by a different name, as noted by the solid border around name attributes storage 852. Attributes storage stores, for example, type information 855, event information 856, and review name information 858.
  • Type information 855 indicates whether the group is either a "group holder” (group of presentations), or a "section holder” (group of sub-sections). These are mutually exclusive.
  • Event information 856 indicates, for example, the order of events within the test.
  • Review name information 858 indicates whether or not a presentation within the group is to be used as a review screen. Any optional, customized information regarding the group is stored in custom storage 860 as a stream of data or other data storage format.
  • Events storage 862 stores, for example, event information, for example, as is described in further detail in Figure 17.
  • Scoring storage 864 stores, for example, information relating to the scoring of items within the group.
  • Attributes storage 866 stores, for example, information indicating which scoring plugin 164 is to be used with the group.
  • Selection storage 872 stores, for example, information relating to the selection of items within the group.
  • Attributes storage 874 indicates which selection plugin 160 is to be used with the group.
  • FIGs 17A, 17B, 17C, and 17D illustrate the events sub-branch of groups branch 850 in greater detail in accordance with one embodiment of the invention.
  • events sub-branch 862 can store information for various events.
  • events sub-branch 862 is storing information in events name sub-branch 880, event name sub-branch 890, and event name sub-branch 897.
  • Attributes storage 881, in Figure 17B, under events name storage 880 stores, for example, type information 882, template information 883, and optionally stores, for example, title information 884, counted information 885, start information 886, finish information 887, and condition information 888.
  • Type information 882 indicates whether the event is an item or a display.
  • Template information 883 indicates which template is being used with the event.
  • Counted information 885 indicates whether a presentation should be included in the totals of presentations presented to the examinee in a section. Generally, presentations with items, or questions, are counted and introductory presentations are not counted.
  • Start information 886, finish information 887, and condition information 888 indicates start, finish, and conditional scripts respectively. Any optional, customized information regarding the event is stored in custom storage 889.
  • the "key" for each custom attribute will be a string.
  • event name storage 890 indicates, for example, a different event, which contains different attributes.
  • area information 891, in Figure 17B, indicates, for example, which area is rendering the presentation's content, and item information 892 indicates the name of the associated item if the event is of the item type.
  • data storage 893, data stream 894, data storage 895, and data storage 896 contain information used in a nested presentation. The data off of a nested presentation are the contents of the item or the presentation. This data may be a stream, a storage, a link to a stream, a link to a storage, or other format.
  • event name 897 indicates another event, which includes a sub-event 898, in Figure 17D.
  • Plugins branch 900 stores, for example, the primary attributes, properties, and data that govern any plugins 150 used for the test.
  • Plugins branch 900 can store information for various plugins, as is denoted by the three, vertical ellipses.
  • a single plugin is identified by the data stored in name attribute storage 902.
  • a CLSID is stamped with the name of the plugin 150.
  • Attributes storage 904 stores, for example, information identifying the plugin 150 by a program ID.
  • Data branch 950 stores, for example, data as either a stream of data or as a storage element for a plugin 150.
  • Data branch 950 stores, for example, any global data needed for the test.
  • Data stored optionally under data branch 950 may be stored as either a storage element or a stream of data as indicated by data storage 952 and data storage 954.
  • Data stored under data branch 950 may be directly used by a plugin 150 or the data may be resources (.gif, .jpeg, .wab, .mpeg, etc.) used internally by a plugin 150.
  • FormGroups branch 1000 stores, for example, the primary attributes, properties, and data that govern the formGroups of the test.
  • FormGroups branch 1000 can store information for various formGroups, as is denoted by the three, vertical ellipses.
  • a single formGroup is identified by the data stored in name attributes storage 1002.
  • the various formGroups may each be identified by a different name, as denoted by the solid border around name attributes storage 1002.
  • Attributes storage 1004 stores, for example, information indicating which forms are to be used within the formGroup.
  • Selections storage 1006 stores, for example, information relating to the selection of items within the formGroup.
  • Attributes storage 1008 indicates which selection plugin 160 is to be used with the formGroup.
  • Plugin data storage 1010 and plugin data storage 1012 store any information needed for the selection plugin 160.
  • Attributes storage branch 1050 stores, for example, attribute information that is global to exam resource file 120. This includes the last execution state of XXL compiler 140 [sMode] and the major [XXLMajorVersion] and minor [iXXLMinorVersion] versions of the XXL language.
  • Scripts branch 1100 stores, for example, information relating to scripts used within the test.
  • Attributes storage 1102 stores, for example, type information that specifies which type of language the script is in, for example, VBScript or JScript.
  • Scripts storage 1104 stores, for example, global scripts used within the test that may be referenced by the test driver.
  • MsgBox branch 1150 stores, for example, information relating to the size and content of any message boxes that may be delivered to the examinee during the test. Message boxes may be triggered by plugins 150 during the exam.
  • Figure 24 is a flow chart illustrating the overall method of test production and test delivery according to the present invention, denoted generally by reference numeral 1500.
  • the test publisher first authors the test specification and content in the test definition language, for example, XXL, step 1502.
  • the test specification and content are then stored in exam source files 130, specifically, in XXL files 134, step 1504.
  • the content of XXL files 134 are then compiled and validated, step 1506.
  • the compiled XXL test specification and content are stored in exam resource file 120, step 1508.
  • the compiled XXL test specification and content are delivered to the examinee, step 1510.
  • the validation of the test specification and content is illustrated in greater detail in Figure 25, by the method denoted generally by reference numeral 1512.
  • the XXL test specification and content stored in exam source files 130 specifically references a plugin 150
  • that plugin 150 is instantiated, step 1514.
  • the segment of the XXL test specification and content relating to that plugin 150 is loaded into the plugin 150 from exam source files 130, step 1516.
  • the partial test specification and content is loaded into a private memory in data communication with the plugin 150.
  • the plugin 150 validates the segment of the XXL test specification and content, step 1518.
  • the validated segment of the XXL test specification and content is then unloaded from the plugin 150 into a storage element within exam resource file 120.
  • Figure 26 illustrates the method of the test delivery cycle in greater detail.
  • the plugin 150 is instantiated, step 1525.
  • the storage element in exam resource file 120 containing the validated segment of XXL test specification and content is provided to the plugin 150, step 1527.
  • the validated segment of XXL test specification and content is loaded into the plugin 150 from the storage element within exam resource file 120, step 1529.
  • test driver 110 and plugins 150 retrieve the test specification and content from exam resource file 120.
  • Figure 27 illustrates the flow of a test.
  • Exam resource file 120 stores the entire test specification and contents of the test.
  • Exams branch 550 stores, for example, information identifying the various forms that define the order (e.g., sequential, non-sequential, random, etc.) of delivery of the test.
  • Forms branch 600 stores, for example, the information for the various forms, including which sections begin and end the test, the order in which presentations are to be delivered, timing information, and scoring information included in a particular form.
  • Sections branch 800 stores, for example, information identifying a particular group and the timing and navigation information associated with that group.
  • Groups branch 850 identifies the order of presentations 862, also called events, that are to be delivered to the examinee during a particular section. Groups branch 850 also stores, for example, information for the scoring and selection of items delivered to the examinee during a presentation 862. Groups 850 can also store information for other sub-groups stored in groups branch 850 and sub-sections stored in sections branch 800.
  • the XXL language is advantageously based on XML, although other functionally similar languages may optionally be used.
  • XML is a meta-language, meaning that XML can be used to define other languages.
  • XML based languages have several common concepts, including, for example: elements, attributes, and tags.
  • An element, for example "<exam>," is a label for a tag type.
  • a tag is a region of data surrounded by the "<" and ">" delimiters.
  • any start tags have, for example, matching end tags in which the element name is preceded by a forward slash. For example: Correct: <exam> ... </exam> Wrong: <exam> ... <exam>
  • An XML document is considered to be “valid” if the document correctly describes a specific set of data.
  • a document is "valid” if it correctly describes XXL.
  • a language created with XML uses a schema to define what tags create the language, what attributes can appear on a certain tag, and the order in which tags may appear.
  • Figure 28 illustrates an example of an XXL schema that defines the global attribute "name” and the element "form".
  • the element "form” has the global "name” attribute, as well as the attribute "restartable”, which is defined within the "form” definition.
  • the "form” element also has child, or sub, element “scoring”. The element “scoring” needs to be defined before being assigned (not shown).
  • the XXL schema makes many element tokens legal at the beginning of the schema.
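  • Although the schema text of Figure 28 is not reproduced here, a definition of the kind it describes (a global "name" attribute, plus a "form" element that carries "name" and "restartable" and contains a child "scoring" element) might look roughly like the following sketch, written in the ElementType/AttributeType style discussed below; the data types, occurrence values, and content/model settings are assumptions:

    <AttributeType name="name" dt:type="string"/>

    <ElementType name="form" content="eltOnly" model="closed">
      <AttributeType name="restartable" dt:type="enumeration"
                     dt:values="true false" default="true"/>
      <attribute type="name" required="yes"/>
      <attribute type="restartable"/>
      <element type="scoring" minOccurs="0" maxOccurs="1"/>
    </ElementType>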
  • Figure 29 illustrates another XXL schema example.
  • the element "item” is being defined.
  • the "item" element has several attributes, all of which are defined within the element definition, for example: "name," "title," "template," "area," "weight," "scored," and "skipAllowed." In this example, no global attributes have been defined; therefore, all attributes being assigned to the "item" element must be defined within the element definition.
  • the "item" element also has a child element, "data."
  • the XXL schema in its entirety may be found in Appendix A.
  • Figure 30 illustrates the hierarchy of the XXL test definition language.
  • the XXL language's outer most element is XXL element 1700.
  • XXL 1700 includes, for example, the following elements: exam element 1702, category element 1704, item element 1728, plugin element 1706, template element 1708, and MsgBox element 1710. Additionally, form element 1716, section element 1722, and group element 1724 may be defined directly inside XXL element 1700 or in their appropriate place in the hierarchy. When form element 1716, section element 1722, and group element 1724 are defined directly under XXL element 1700, form element 1716, section element 1722, and group element 1724 must be referenced later to be utilized.
  • Exam element 1702 contains formGroup element 1712 and form element 1716.
  • FormGroup element 1712 may also contain form element 1716 directly.
  • Form element 1716 may contain report element 1720, results element 1718, and section element 1722.
  • Section element 1722 may contain group element 1724.
  • Group element 1724 may contain section element 1722, other group elements 1724 and presentation elements 1726.
  • Presentation element 1726 references item elements 1728 that are defined under XXL element 1700. Presentation element 1726 may contain other presentation elements 1726.
  • Category element 1704 may contain other category elements 1704.
  • Template element 1708 may contain area elements 1714 and other template elements 1708.
  • The attribute element 1730, script element 1732, and data element 1734 may adorn many language constructs. Attribute element 1730 may be attached to many core elements, including, for example, exam element 1702, form element 1716, section element 1722, presentation element 1726, and item element 1728.
  • Script element 1732 may appear under the following elements, for example: XXL element 1700, form element 1716, section element 1722, presentation element 1726, and item element 1728.
  • Data element 1734 may appear under whichever element contains data for a plugin 150, for example: category element 1704, plugin element 1706, area element 1714, form element 1716, formGroup element 1712, section element 1722, group element 1724, presentation element 1726, and item element 1728.
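  • Taken together, the hierarchy of Figure 30 implies a nesting of XXL elements along the following lines; this skeleton is illustrative only, and the names and values are placeholders:

    <xxl>
      <!-- items, templates, categories and plugins are defined at the top level -->
      <item name="item001"> <!-- item data --> </item>
      <template name="mainTemplate"> <!-- areas --> </template>
      <plugin name="helmNextPrevious"> <!-- plugin data --> </plugin>

      <exam name="sampleExam">
        <form name="form001">
          <section name="sect01">
            <group name="grp01">
              <presentation>
                <!-- inner presentations reference areas and the item to render -->
              </presentation>
            </group>
          </section>
        </form>
      </exam>
    </xxl>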
  • a) Elements. Elements are defined using the <ElementType> tag.
  • An element can have child elements and/or attributes. There are several attributes on the <ElementType> tag that define the element, for example: "name," "content," "order," and "model".
  • the "name” attribute defines the name of the element, i.e., ⁇ form>.
  • the "name” attribute can be assigned, for example, any alphanumerical string value.
  • the "content” attribute defines whether test and/or elements can be defined within the element.
  • the "content” element may be assigned values including, for example: element only
  • the "order” attribute defines the order of any child elements defined within an element.
  • the "order” element may be assigned values including, for example: “many,” which indicates that any ordering of child elements is permissible, and "one,” which indicates that only one ordering of child elements is permissible.
  • the "model” attribute indicates whether the XML has to be closed or open and expandable.
  • An open element may contain constructs from another schema or 'namespace'. For example, a 'stem' XXL element, which is the text body of a question, is open. This allows a test publisher to insert an extensible hypertext markup language ("XHTML") tag provided the test publisher supplies the location of the schema defining XHTML.
  • the XHTML tag allows the test publisher to embed the question text inside the stem element, complete with formatting.
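  • For instance, assuming the 'stem' element is declared open as described, a test publisher might embed formatted question text along these lines; the namespace prefix and the surrounding markup are illustrative assumptions:

    <stem xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <xhtml:p>Which of the following is a <xhtml:b>flammable</xhtml:b> liquid?</xhtml:p>
    </stem>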
  • a closed element can contain only the elements that are defined.
  • Figure 28 illustrates how an <ElementType> tag is used to define the "form" element, and
  • Figure 29 illustrates how an <ElementType> tag is used to define the "item" element.
  • To place a previously defined element as a child within another element definition, the <element> tag is used.
  • There are several attributes on the <element> tag that define the use of the element, for example: "type," "minOccurs," and "maxOccurs".
  • the "type” attribute identifies which element is being placed. For example, in Figure 28, the child element “scoring” is being placed and is the value assigned to the "type” attribute.
  • the "type” attribute can be assigned any alphanumerical string value.
  • The "minOccurs" attribute defines the minimum number of times the element can appear and is assigned a numerical value, for example, "0".
  • The "maxOccurs" attribute defines the maximum number of times the element can appear and is also assigned a numerical value. Both "minOccurs" and "maxOccurs" can also be assigned the value "*," or other suitable designation, which indicates that the element may appear an infinite number of times. This value is typically only assigned to the "maxOccurs" attribute.
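  • A placement using these attributes might read, for example, as follows; the element name is a placeholder chosen for illustration:

    <element type="section" minOccurs="0" maxOccurs="*"/>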
  • Attributes are defined using the <AttributeType> tag.
  • There are several attributes on the <AttributeType> tag that define the attribute, for example: "name," "dt:type," "required," "dt:values," and "default".
  • the "name” attribute define the name of the attribute, i.e. "restartable”.
  • the "name” attribute can be assigned any alphanumerical string value.
  • the "dt:type” attribute defines the type of the attribute.
  • An ID is a string that is unique within the XML document. For example, if the attribute 'name' was a dt:type 'ID', then one name, e.g. "Joe", could be contained in the XML document. ID types are not used in XXL.
  • the "required” attributed indicates whether or not the element must exist on the element tag in which it is being placed.
  • the "required” attribute is assigned the values, for example, "yes" or "no".
  • the "devalues” attribute defines the possible values of the enumeration type if "dfctype” is assigned the value "enumeration”. For example, “d values” may be assigned the value "true false”.
  • the "default” attribute indicates the value the attribute is assigned if no other value is specified when the attribute is placed.
  • the <AttributeType> tag is used to define the attributes "name," "title," and so on.
  • To place a defined attribute on an element, the <attribute> tag is used.
  • the attribute on the <attribute> tag that defines the use of the attribute is "type".
  • the "type” attribute identifies which attribute is being placed.
  • the "type” attribute may be assigned any alphanumeric string value.
  • the <group> tag is used to allow for complex grouping of different elements.
  • the <group> tag is used to group, for example, elements "sectionRef" and "section" within the definition for the "form" element.
  • There are several attributes on the <group> tag that define the group, for example: "order," "minOccurs," and "maxOccurs".
  • the "order” attribute defines the order in which the child elements being grouped must appear.
  • the "order” element may be assigned values including, for example: “many,” which indicates that any ordering of child elements is permissible, and "one,” which indicates that only one ordering of child elements is permissible.
  • the "minOccurs” element defines the minimum number of times the element can appear and is assigned a numerical value, for example, "0".
  • the "maxOccurs” element defines the maximum number of time the element can appear and is also assigned a numerical value. Both “minOccurs” and “maxOccurs” can also be assigned the value "*,” which indicate that the element may appear an infinite number of times. This value is typically only assigned to the "maxOccurs” attribute.
  • Plugin files 2080 are, for example, the first .xml files that are compiled into exam resource file 120.
  • Plugins 150 and plugin files 2080 allow the test publisher to customize the behavior of test driver 110.
  • Plugin files 2080 describe what type of data can be accepted for a particular plugin 150.
  • Figure 31 illustrates how a plugin 150, in this case a report plugin 168, is used with data stored in a plugin file 2080 to produce the test's data content and/or presentation format 3000. Having all plugin 150 information in a separate XML file 134 is a convenient way of re-using XML file 134 for many tests and for organizing test data; however, this needn't be the case.
  • All XXL information may be in a single XML file 134 or in many files as long as the data is compiled preferably in the order specified in Fig 6.
  • the <data> tag is used within the XXL definition of a plugin 150 to place all information intended for the plugin 150.
  • Data tells a plugin 150 how the plugin 150 should behave. For example, data can tell the plugin 150 what to display (i.e., visual plugins), what information to print (i.e., report plugin 168), and in what order items should be delivered (i.e., selection plugins 160).
  • Data can be defined either within the data tag in an element definition or in an external file.
  • the "URT' attribute is a path to an external file. If the data should not be added to exam resource file 120, the "keepExternal" attribute should be assigned the value "true”.
  • Data can be placed as a child element within an element definition.
  • the <data> tag is used to identify the data being placed.
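  • In other words, a plugin's data can either be written inline or pulled in from a separate file; the following is a hypothetical sketch of both forms, in which the file name and the exact attribute spellings are assumptions:

    <!-- inline: the contents of the data tag are handed to the plugin -->
    <data>
      <!-- plugin-specific content goes here -->
    </data>

    <!-- external: the URI attribute points to a separate file, and keepExternal="true"
         keeps that file from being copied into exam resource file 120 -->
    <data URI="helms/nextPrevious.xml" keepExternal="true"/>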
  • Figure 32 illustrates the rendering of the "nextPrevious" helm plugin 154 defined in part with the <data> tag above.
  • the appearance of Next button 3002, Previous button 3004, and Review button 3006 are defined by the values assigned to the "img" attribute.
  • Figure 33 illustrates another <data> example.
  • Timer plugin 158, named "mainTimer," is assigned to the attribute "timer" within the <form> element "foobar001" definition.
  • the <form> element is a child element of the <exam> element "foobar".
  • the <data> tag assigns the information for "mainTimer" timer plugin 158.
  • the information assigned to "mainTimer" is a twenty-minute standard timer for form "foobar001" along with a sixty-second "One Minute Left" warning.
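  • The actual markup of Figure 33 is not reproduced here, but based on that description it would convey roughly the following; the elements inside the data block are invented for illustration, and only the surrounding structure follows the text:

    <exam name="foobar">
      <form name="foobar001" timer="mainTimer">
        <data>
          <!-- a standard twenty-minute timer with a sixty-second "One Minute Left" warning -->
          <timer type="standard" minutes="20">
            <warning seconds="60" message="One Minute Left"/>
          </timer>
        </data>
      </form>
    </exam>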
  • Figure 34 illustrates the concept of XXL referencing.
  • XXL allows for an element to be defined once, and then referenced multiple times later on in the test specification. The references all re-use the original definition of the element and do not perform copy operations.
  • CategoryRef references an existing category element.
  • FormRef references an existing form element.
  • GroupRef references an existing group element.
  • SectionRef references an existing section element.
  • the test definition illustrated in Figure 34 is defining the section element
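  • A minimal sketch of this referencing pattern, with invented names, is as follows: a section is defined once and then re-used by reference inside a form:

    <section name="sect01">
      <!-- groups, timing and navigation for the section are defined here -->
    </section>

    <form name="form001">
      <!-- re-use the earlier definition instead of copying it -->
      <sectionRef name="sect01"/>
    </form>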
  • similar languages to XML may optionally be used, such as web programming, object oriented programming, JAVA, etc., that provide the functionality described herein.
  • similar functionality as described herein may be utilized without departing from the present invention.
  • a form must contain at least one "section" (or a reference to one).
  • a section with a condition script is run only if the script evaluates to true.
  • a presentation is what tells the test driver what data to present.
  • presentation tags can be nested. The outermost presentation tags contain no attributes, while the inner set of presentation tags makes references to the areas to be updated.
  • a "data" tag can exist inline, to create a display or define a helm. Another option is to use the "data" attribute to point to an external file.
  • Test drivers will support "plug-ins" to handle scoring, item delivery, navigation, and many other functions.
  • the type attribute can be one of the following:

Abstract

A memory stores a plurality of first data structures, which includes element specific data objects indicating a classification of at least one of the plurality of segments of the test definition language, and second data structures, which include attribute specific data objects indicating at least one attribute of the segments of the test definition language implemented by a computer. A method for computer-based testing includes authoring a test specification and content of the at least one test using a test definition language, compiling the test specification and content of at least one test to create a compiled test specification and content, which includes validating the test specification and content, storing the compiled test specification and content to a resource file, and retrieving the compiled test specification and content from the resource file during delivery of the test.

Description

EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING
CROSS REFERENCE TO RELATED APPLICATIONS
This application is related to and claims the priority of U.S. Provisional Application Serial No. 60/331,228, filed November 13, 2001 and incorporated herein by reference, and is further related to: U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING PLUGINS TO EXPAND FUNCTIONALITY OF A TEST DRIVER" and having inventor Clarke Daniel Bowers (Docket No. 26119-142); U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING CUSTOMIZABLE TEMPLATES" and having inventor Clarke Daniel Bowers (Docket No. 26119-143); U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING A NON- DETERMINISTIC EXAM EXTENSIBLE LANGUAGE (XXL) PROTOCOL" and having inventor Clarke Daniel Bowers (Docket No. 26119-144); and U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING AN AMALGAMATED RESOURCE FILE" and having inventor Clarke Daniel Bowers (Docket No. 26119-145) all of which are being filed concurrently herewith and all of which are incorporated by reference in their entirety herein.
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The present invention generally relates to the field of computer-based testing, and in particular, the present invention relates to a non-deterministic test definition language that defines a test specification and content of a computer-based test.
BACKGROUND OF THE RELATED ART
For many years, standardized testing has been a common method of assessing examinees as regards educational placement, skill evaluation, etc. Due to the prevalence and mass distribution of standardized tests, computer-based testing has emerged as a superior method for providing standardized tests, guaranteeing accurate scoring, and ensuring prompt return of test results to examinees.
Tests are developed based on the requirements and particulars of test developers. Typically, test developers employ psychometricians or statisticians and psychologists to determine the requirements specific to human assessment. These experts often have their own, unique ideas regarding how a test should be presented and regarding the necessary contents of that test, including the visual format of the test as well as the data content of the test. Therefore, a particular computer-based test has to be customized to fulfill the client's requirements.
Figure 1 illustrates a prior art process for computerized test customization, denoted generally by reference numeral 10. First, a client details the desired test requirements and specifications, step 12. The computerized test publisher then creates the tools that allow the test publisher to author the items, presentations, etc., required to fulfill the requirements, step 14. The test publisher then writes an item viewer, which allows the test publisher to preview what is being authored, step 16.
An item presenter is then written to present the new item, for example, to the test driver, step 18. Presenting the new item to the test driver requires a modification of the test driver's executable code. The test driver must be modified so that it is aware of the new item and can communicate with the new item presenter, step 20. The test packager must then also be modified, step 22. The test packager, which may also be a compiler, takes what the test publisher has created and writes the result as new object codes for the new syntax. Subsequently, the scoring engine must also be modified to be able to score the new item type, step 24. Finally, the results processor must be modified to be able to accept the new results from the new item, step 26. This process requires no less than seven software creations or modifications to existing software.
Current computer-based test definition languages are deterministic and finite. There is a fixed set of grammar constructs that define the extent of the language. Also, test components in current computer-based test drivers are fixed to the set of exam constructs. Therefore, new test functionality cannot be added without code changes and compilation of many modules.
U.S. Patent No. 5,827,070 (Kershaw et al.) and U.S. Patent No. 5,565,316 (Kershaw et al.) are incorporated herein by reference. The '070 and '316 patents, which have similar specifications, disclose a computer-based testing system comprising a test development system and a test delivery system. The test development system comprises a test document creation system for specifying the test contents, an item preparation system for computerizing each of the items in the test, a test preparation system for preparing a computerized test, and a test packaging system for combining all of the items and test components into a computerized test package. The computerized test package is then delivered to authorized examinees on a workstation by the test delivery system.
Figure 2 illustrates the relationship among session scripts 30, test scripts 32, and units. A script consists of a series of files and further specifies the option settings and configuration data, which the
Test Delivery Application (TDA) needs for operation. During test preparation, scripts are prepared and combined with the items prepared during item preparation. Scripts control the sequence of events during a testing session. Two types of scripts are preferably used: the session script 30 and one or more test scripts 32. The session script 30 controls the order in which units within the testing session are presented. Units provide specific services to the examinee, such as delivering a test or presenting a score report. Just as the session script controls the session, the test script controls what is presented to the examinee during the testing unit. Each testing unit may include one or more delivery units, which are separately timed and scored subdivisions of a test. The system can dynamically select, or spiral, scripts and other test components so that examinees are given what appear to be different tests.
The session script is the second-level component of the testing package. It performs two primary functions: First, it specifies the Session Control Information, which defines the default options that are in effect for the entire examinee testing session. Second, it controls the order in which units within the testing session are presented and the options used to present them. The units that can be presented within a session script are: General information screen units, Tutorial units, Break units, Data collection units, Scoring and reporting units, and Testing units. The session control information contains the default options in effect for the entire session.
Control information can be provided at multiple levels within the testing session. Thus, the control information provided at the session level can be overridden by information that occurs later in the session. The information provided at the session level would generally include the following: Name-- the session script name to be used by administrators in selecting a specific session script from Administrative Application menus; Input device—the input device to be used during the session (e.g., mouse or keyboard); Color—the colors to be used during the session; Messages— program-specific messages to override default messages during the session; Demo Script—indicates whether the script presents a demonstration or operational test; Research Indicator—indicates whether the script presents a research pilot test; Special Timing— indicates whether the script is standard or specially timed version. The testing unit presents a test, based on the contents of a test script that may have been selected at runtime. The following units can be included within a testing unit: general information screen unit; tutorial unit; break unit; delivery unit, which delivers items to the examinee. This permits testing programs to interleave general information screens, tutorials, and breaks with sections of a test. The testing unit contains the following information: script selection mode indicates whether dynamic runtime selection is to be used to select the test script; reference to a test script which controls the sequence of events and options used during the testing unit. If dynamic runtime selection is to be used, the reference is to a set of test scripts. Like the session script, the test script performs two primary functions. First, it specifies the test and delivery unit control information. Test control information defines the options that are in effect for the testing unit. Delivery unit control information defines the options that are in effect for a particular delivery unit within a testing unit. It controls the order in which units are presented within the testing unit and the options used to present them. The rules for presentation of units are the same as those for the session script, except that an additional unit, the delivery unit, can be included within a test script.
U.S. Patent No. 5,513,994 (Kershaw et al.), which is incorporated herein by reference, discloses a centralized administrative system and method of administering standardized tests to a plurality of examinees. The administrative system is implemented on a central administration workstation and at least one test workstation located in different rooms at a test center. The administrative system software, which provides substantially administrative functions, is executed from the central administration workstation. The administrative system software, which provides functions carried out in connection with a test session, is executed from the testing workstations.
None of the Kershaw et al. patents appear to make any mention of a test definition language that is non-linear and does not require interpretation of the commands at delivery time. What is required is a non-deterministic test definition language that is able to expand with the definition of new testing components and allows the test driver to be expanded to support new item types, scoring algorithms, etc., without making any changes to the test driver's executable or recompiling the test driver to support the new testing components as described below in connection with the present invention. Other features and advantages in addition to the above, or in the alternative to the above, are described in the Summary of the Invention and the Detailed Description provided below.
SUMMARY OF THE INVENTION
It is one feature and advantage of the present invention to implement a test definition language that allows the addition of new test functionality without necessitating changes to a test driver's executable code or other implementing functionality.
It is another optional feature and advantage of the present invention that a test definition language used to implement a computer-based test supports, for example, a non-predetermined properties set.
It is another optional feature and advantage of the present invention that the test definition language used to implement a computer-based test supports adding named properties to, for example, any area of the test definition.
It is another optional feature and advantage of the present invention that the test definition language used to implement a computer-based test is in extensible markup language format and optionally has a grammar that is defined by a schema. These and other advantages are achieved in an optional memory storing a test definition language in extensible markup language format that characterizes or comprises a computer-based test delivered to an examinee using a test driver and is implemented by a computer. The test definition language has, for example, a plurality of segments. The computer-based test has a presentation format and data content and the test driver delivers the computer-based test to an examinee using a display device. The test driver, for example, manages the computer-based test, controls progression of the computer-based test, controls scoring of the computer-based test, controls timing of the at least one test, controls printing of the at least one test, and controls results reporting of the computer-based test based on the test definition language.
The optional memory stores a plurality of first data structures. The plurality of first data structures includes element specific data objects indicating a classification of at least one of the plurality of segments of the test definition language. The plurality of segments defines information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the computer-based test. The memory also stores second data structures that optionally depend from or are subordinate to the first data structures. The second data structures include attribute specific data objects indicating at least one attribute of the segments of the test definition language implemented by the computer. In an alternative embodiment, the memory further stores third data structures that depend from or are subordinate to the plurality of first data structures. The third data structures include data specific data objects indicating at least one sub-classification of the at least one of the plurality of segments of the test definition language. In another alternative embodiment, the memory further stores third data structures that depend from or are subordinate to the plurality of first data structures. The plurality of third data structures include element specific data objects indicating a sub-classification of the at least one of the plurality of segments of the test definition language. The sub-classification further indicates at least one property specific to the at least one of the plurality of segments of the test definition language. In a further alternative embodiment, the memory further stores fourth data structures that depend from or are subordinate to the plurality of first data structures. The plurality of fourth data structures include group specific data objects indicating an order of an appearance of the at least one of the plurality of third data structures, a minimum occurrence of the appearance of the at least one of the third data structures, and a maximum occurrence of the appearance of the at least one of the third data structures.
In another embodiment of the present invention, a memory is provided storing a schema for a test definition language in extensible markup language format that characterizes a computer-based test delivered to an examinee using a test driver and is implemented by a computer. The test definition language has a plurality of segments. The computer-based test has a presentation format and data content and the test driver delivers the computer-based test to an examinee using a display device, manages the computer-based test, controls progression of the computer-based test, controls scoring of the computer-based test, controls timing of the at least one test, controls printing of the at least one test, and controls results reporting of the computer-based test based on the test definition language, wherein the schema defines a permissible grammar for the test definition language. An optional memory stores a plurality of first data structures. The plurality of first data structures includes element definition specific data objects defining an element classification of at least one of the plurality of segments of the schema. The plurality of segments defines classification identification information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the computer-based test. The memory also stores second data structures. The second data structures include attribute definition specific data objects defining at least one attribute classification of the plurality of segments of the schema.
The memory further stores third data structures. The third data structures include element specific data objects indicating at least one element sub-classification of the at least one of the plurality of segments of the schema. The memory also stores fourth data structures. The fourth data structures include attribute specific data objects indicating at least one attribute of the at least one of the plurality of segments of the test definition language implemented by the computer. In another embodiment of the present application, a method for computer-based testing is provided, which includes authoring a test specification and content of the at least one test using a test definition language. The test specification and content defines the presentation format and the data content of the at least one test. The method also includes compiling the test specification and content of the at least one test to create a compiled test specification and content. Compiling the test specification and content includes validating the test specification and content. The method further includes storing the compiled test specification and content to a resource file and retrieving the compiled test specification and content from the resource file during delivery of the test.
In another embodiment of the present invention, a method for defining a schema for a test definition language is provided. The method includes defining a first set of elements, defining a set of attributes, and defining a second set of elements. The second set of elements references the first set of elements and the set of attributes.
In another embodiment of the present invention, a method is provided for a computer-based testing system that executes a test controlled by a test driver. The test driver has an executable code that controls the test driver and functionality performed by the test driver that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, control timing of the at least one test, control printing of the at least one test, and control results reporting of the at least one test based on a test definition language in extensible markup language format. The test has a presentation format and data content. The test definition language has a plurality of segments that defines information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the test.
The method includes the sequential, non-sequential and/or sequence independent steps of authoring at least one of the plurality of segments and storing the at least one of the plurality of segments to the source file. The method also includes instantiating a validation expansion module during a test production cycle and loading the at least one of the plurality of segments of the test definition language into a memory from the source file, validating the at least one of the plurality of segments. The method further includes unloading the at least one of the plurality of segments from the memory into at least one of a plurality of storage elements and providing to the memory the at least one of the plurality of storage elements. The method also includes loading the at least one of the plurality of segments of the test definition language from the at least one of the plurality of storage elements into the memory during a test delivery cycle and implementing directly by the validation expansion module the information defined by the at least one of the plurality of segments. The method further includes accessing by the test driver the at least one of the plurality of segments of the test definition language to enable the functionality of the test driver via the direct implementation by the validation expansion module. In another embodiment of the present invention, the test definition language has a plurality of element specific data objects and a plurality of attribute specific data objects that define information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the test. The method includes the sequential, non-sequential and/or sequence independent steps of authoring at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects and storing the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects to the source file. The method also includes instantiating a validation expansion module during a test production cycle, loading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language into a memory from the source file. The method further includes validating the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects and unloading the at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects from the memory into at least one of a plurality of storage elements.
The method also includes providing to the memory the at least one of the plurality of storage elements and loading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language from the at least one of the plurality of storage elements into the validation expansion module during a test delivery cycle. The method further includes implementing directly by the validation expansion module the information defined by the at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects and accessing by the test driver the at least one of the plurality of element specific data objects and at least one of the plurality of attribute specific data objects of the test definition language to enable the functionality of the test driver via the direct implementation by the validation expansion module.
There has thus been outlined, rather broadly, the more important features of the invention and several, but not all, embodiments in order that the detailed description thereof that follows may be better understood, and in order that the present contribution to the art may be better appreciated. There are, of course, additional features of the invention that will be described hereinafter and which will form the subject matter of the claims appended hereto.
In this respect, before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, methods and systems for carrying out the several purposes of the present invention. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the present invention.
Further, the purpose of the foregoing abstract is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The abstract is neither intended to define the invention of the application, which is measured by the claims, nor is it intended to be limiting as to the scope of the invention in any way.
These, together with other objects of the invention, along with the various features of novelty, which characterize the invention, are pointed out with particularity in the claims annexed to and forming a part of this disclosure. For a better understanding of the invention, its operating advantages and the specific objects attained by its uses, reference should be had to the accompanying drawings and descriptive matter in which there is illustrated preferred embodiments of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a flow diagram of a prior art method for computerized test customization;
Figure 2 is a block diagram of a prior art testing script; Figure 3 is a schematic diagram of a computer-based testing system according to the present invention;
Figure 4 illustrates various components that comprise an exam source file;
Figures 5A and 5B are schematics illustrating the components, classes, and interfaces that comprise a test definition language compiler according to the present invention; Figure 6 is a flow diagram that illustrates a compile order for compiling a source file according to the present invention;
Figure 7 illustrates how a test publisher defines the compile order;
Figure 8 illustrates an output of the test definition language compiler based on the compile order;
Figure 9 is a block diagram of main storage branches of an exam resource file according to the present invention;
Figure 10 is a block diagram illustrating an exams branch of the exam resource file;
Figures 11A and 11B are block diagrams illustrating a forms branch of the exam resource file;
Figure 12 is a block diagram illustrating an items branch of the exam resource file;
Figure 13 is a block diagram illustrating a categories branch of the exam resource file;
Figure 14 is a block diagram illustrating a templates branch of the exam resource file;
Figure 15 is a block diagram illustrating a sections branch of the exam resource file;
Figure 16 is a block diagram illustrating a groups branch of the exam resource file;
Figures 17A, 17B, 17C, and 17D are block diagrams illustrating an events sub-branch of the groups branch of the exam resource file;
Figure 18 is a block diagram illustrating a plugins branch of the exam resource file;
Figure 19 is a block diagram illustrating a data branch of the exam resource file;
Figure 20 is a block diagram illustrating a formGroups branch of the exam resource file;
Figure 21 is a block diagram illustrating an attributes branch of the exam resource file;
Figure 22 is a block diagram illustrating a scripts branch of the exam resource file;
Figure 23 is a block diagram illustrating a message box branch of the exam resource file;
Figure 24 is a flow chart of a method of test production and test delivery according to the present invention;
Figure 25 is a flow chart of a method for validation of test specification and content according to the present invention;
Figure 26 is a flow chart of a method for test delivery according to the present invention;
Figure 27 is a flow diagram illustrating the flow of a test according to the present invention;
Figure 28 illustrates an example of a schema according to the present invention;
Figure 29 illustrates an example of a schema definition of an item according to the present invention;
Figure 30 illustrates the hierarchy of the test definition language according to the present invention;
Figure 31 illustrates how a plugin is used with the test definition language to enable delivery of the computer-based test according to the present invention;
Figure 32 illustrates a rendering of an example of a definition of data using the test definition language according to the present invention;
Figure 33 illustrates an example of using the test definition language to define plugin data; and
Figure 34 illustrates test definition language referencing according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Reference now will be made in detail to the presently preferred embodiments of the invention. Such embodiments are provided by way of explanation of the invention, which is not intended to be limited thereto. In fact, those of ordinary skill in the art may appreciate upon reading the present specification and viewing the present drawings that various modifications and variations can be made. For example, features illustrated or described as part of one embodiment can be used on other embodiments to yield a still further embodiment. Additionally, certain features may be interchanged with similar devices or features not mentioned yet which perform the same or similar functions. It is therefore intended that such modifications and variations are included within the totality of the present invention.
The present invention discloses a system and method of computer-based testing using a non-deterministic test language. A test is delivered using a test driver that is, for example, object-oriented and is architected to dynamically add functionality through, for example, the use of an expansion module, and preferably through the use of plugins. The test driver preferably references component object model servers using standard interfaces, and uses, for example, class names (that can be an Active Document) defined in a custom test definition language entitled extensible eXam Language ("XXL") based on the extensible Markup Language ("XML") format to interact with existing applications while offering the flexibility of allowing development of new plugins. These new plugins can be customized to a client's needs without changing the core test driver. The XXL language is defined using an XXL schema that structures the allowable grammar of the XXL language.
The plugins advantageously enable the test driver to support, for example, new item types, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types without change to the test driver's executable. Plugins also allow expansion of the test driver's functionality without requiring the test driver to be recompiled or re-linked, and without requiring the test publisher to learn to program. Since plugins are written independently of the test driver, plugins can be written long after the test driver is built.
The client and the software developer can design and test the plugins and distribute the plugins to each test site. By using this method, large-scale regression testing of other examinations will not usually be necessary unless changes are made to the plugins that may be used by many examinations.
I. Overview of Computer-Based Test Delivery System
Figure 3 shows an overview of the software architecture for the computer-based test delivery system of the present invention, denoted generally by reference numeral 100. Test driver 110 is responsible for controlling all aspects of the computer-based test. Test driver 110 identifies examinees scheduled to take the computer-based test and identifies and creates the appropriate test. Test driver
110 then presents all of the test components to examinees using a display device (not shown), such as a computer monitor, and enables examinees to enter responses to test questions through the use of an input device (not shown), such as a keyboard, a mouse, etc. Test driver 110 also monitors the security of the test. For example, test driver 110 can prevent access to the Internet and can validate examinees, although these functions are preferably performed by the test center administration system. Test driver 110 also monitors the timing of the test, providing relevant warnings to the examinee regarding the elapsed time of the test and the time remaining for a particular section of the test or for the entire test. Test driver 110 is also responsible for scoring the test, once the test is completed or while the test is in progress, and for reporting the results of the test by physical printout using printer 182 or in a file format using candidate exam results file 180. If the test is interrupted while in progress, for example, due to a power failure, test driver 110 restarts the test, preferably at the point at which the test was interrupted, as will be described subsequently in more detail. Finally, if the test is left incomplete, test driver 110 cleans up the incomplete test. An incomplete test will have an exam instance file in the examinee's directory but will not have created a results file. A results file is created even though generally the candidate will fail. The number of items delivered to the examinee is recorded in the results file. Test driver 110 picks up where the event was interrupted and invisibly delivers the rest of the units of the test.
A test specification is authored by a test publisher according to the specifications of the client and stored in exam source files 130. Exam source files 130 include data files 132, XXL files 134, multimedia files 136, and hypertext markup language ("HTML") files 138. XXL files 134 include the test specification, which contains the client's requirements for the test, a bank of test items or questions, templates that determine the physical appearance of the test, plugins, and any additional data necessary to implement the test. Additional data is also stored in data files 132. For example, an adaptive selection plugin may need a, b, and c theta values. These values are stored in a binary file created by a statistical package.
HTML files 138 include, for example, any visual components of the test, such as the appearance of test items or questions, the appearance of presentations on the display device, the appearance of any client specified customizations, and/or the appearance of score reports. HTML files 138 preferably also include script, for example, VBScript and JScript, or JavaScript. HTML files 138 are preferably authored using Microsoft's FrontPage 2000. FrontPage 2000 is preferably also used to manage the source files in a hierarchy that is chosen by the test publisher. Multimedia files 136 include, for example, any images (.jpg, .gif, etc.) and/or sound files (.mp3, .wav, .au, etc.) that are used during the test.
XXL compiler 140 retrieves XXL files 134 from exam source files 130 using interface 190 and compiles the XXL test content stored in XXL files 134. XXL compiler 140 stores the compiled test files in exam resource file 120. In another embodiment, exam source files 130 do not contain XXL files 134 and contain, for example, only multimedia files. In this embodiment, XXL compiler 140 is merely a test packager that writes the data directly to exam resource file 120 without modification or validation. The data appears in a stream under the "data" branch of exam resource file 120. The name of the stream is specified by the test author.
In a preferred embodiment, XXL files 134 also include XXL language that defines plugins 150, in which case, plugins 150 assist XXL compiler 140 in compiling XXL files 134. Test driver 110 preferably supports, for example, nine different types of plugins 150, including, for example: display plugin 152; helm plugin 154; item plugin 156; timer plugin 158; selection plugin 160; navigation plugin 162; scoring plugin 164; results plugin 166; and report plugin 168. Plugins 150, which are also included in XXL files 134, are the first XML files compiled into exam resource file 120. Exam resource file 120 receives the compiled test content from XXL compiler 140 and plugins
150, if applicable, and stores the compiled test content in an object-linking and embedding ("OLE") structured storage format, called POLESS, which is described in greater detail below. Other storage formats may optionally be used. OLE allows different objects to write information into the same file, for example, embedding an Excel spreadsheet inside a Word document. OLE supports two types of structures, embedding and linking. In OLE embedding, the Word document of the example is a container application and the Excel spreadsheet is an embedded object. The container application contains a copy of the embedded object and changes made to the embedded object affect only the container application. In OLE linking, the Word document of the example is the container application and the Excel spreadsheet is a linked object. The container application contains a pointer to the linked object and any changes made to the linked object change the original linked object. Any other applications that link to the linked object are also updated. POLESS supports structured storage such that only one change made to an object stored in exam resource file 120 is globally effective. Test driver 110 comprises Active Document container application 112 for the visible plugins, display plugin 152, helm plugin 154, and item plugin 156, which function as embedded objects, preferably COM objects.
Both XXL compiler 140 and plugins 150 are involved in storing the compiled test content into exam resource file 120, if any of plugins 150 are being used. Exam resource file 120 comprises, for example, a hierarchical storage structure, as will be described in further detail below. Other storage structures may optionally be used. XXL compiler 140 determines the storage location to which a specific segment of the compiled test content is to be stored. However, if any of plugins 150 are used to validate any portion of the data from exam source files 130, then the plugins 150 store the data directly to the exam resource file, based upon directions from XXL compiler 140. XXL compiler 140 uses IPersistResource interface 192, co-located with IPlugin interface 167 in Figure 3, to control the persistence of the data to exam resource file 120. XXL compiler 140 and plugins 150 write the data to exam resource file 120 using POLESS interfaces 192.
Figure 4 illustrates the contents of exam source file 130, which are compiled into exam resource file 120 by XXL compiler 140 and plugins 150. FrontPage 2000 Web 200 is used, for example, to author the test. Exam source files 130 contain media files 210, visual files 220, and logic files 230. Media files 210 are multimedia files used to enhance the presentation of the test, including, for example, XML data files 212, sound files 214, image files 216, and binary files 218. XML data files 212 include the XXL test definition language and the XXL extensions from the plugins 150 that use XML. The test specification, presentation, scoring and other information is specified in the XML files. Sound files 214 include any sounds that are to be used during the test, such as .mp3 files, .au files, etc. Image files 216 include any images to be used during the test, such as .jpg files, .gif files, etc. Binary files 218 include any data needed by a plugin 150 that is not in XXL format. Visual files 220 are HTML files that specify the visual presentation of the test as presented to the examinee on the display device, including items files 222, presentation files 224, score report files 226, and custom look files 228. Items files 222 include HTML files that are used to specify the visual component of test questions, e.g., stems and distractors. Items files 222 are also capable of referencing external exhibits. An exhibit could be a chart, diagram, or photograph. Formats of exhibits include, for example: .jpg,
.png, etc. Presentation files 224 define what is seen by the examinee on the display device at a particular instant during the test. Score report files 226 are typically HTML files with embedded script that include, for example, candidate demographics, appointment information, and candidate performance. The performance might include pass/fail, achievement in different content areas, etc.
Custom look files 228 are typically HTML files with embedded script to lay out, for example, the title bar and information contained therein. Logic files 230 are XML files that specify the functional aspects of the test, including test specification files 232, plugin files 234, item bank files 236, and template files
238. Test specification files 232 specify the content and progression of the test as provided by the client. Plugin files 234 define plugins 150 and contain any data necessary to implement plugins 150. Item bank files 236 include the data content and properties of the items, or test questions, that are to be presented to the examinee during the test. Properties of an item include the correct answer for the item, the weight given to the item, etc. Template files 238 define visual layouts that are used with the display screen during the test. Referring again to Figure 3, once a test has begun, test driver 110 accesses exam resource file
120 for the instructions and files needed to implement the test, using POLESS interfaces 193. Test driver 110 also accesses plugins 150 for additional data that expands the functionality of test driver 110 in the areas of items, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types. Test driver 110 communicates with plugins 150 using various COM interfaces 169. COM interfaces facilitate OLE linking. As stated previously, test driver 110 is an Active Document container application and plugins 150 are embedded objects. The COM interfaces function as communications paths between the container application and the objects.
There are, for example, ten COM interfaces utilized in computer-based test delivery system 100. Plugin interface 167, which is also a COM interface, is supported by all of plugins 150. COM interfaces 169, therefore, includes the Plugin interface. The Plugin interface contains generic operations such as loading and unloading required of all plugins 150. In addition to the global Plugin interface, each plugin 150 also uses, for example, a second, individual COM interface 169 to communicate with test driver 110. Alternative structures of the Plugin interface may also be used. Table 1 shows the relationship between each plugin 150 and the COM interface 169 used with that particular plugin 150.
TABLE 1: COM INTERFACE FOR PLUGINS
Exam instance file 170 is used to restart a test if the test has been interrupted, for example, because of a power failure. During delivery of the test, exam instance file 170 receives examination state information from test driver 110 and plugins 150 regarding the state of all running objects being used to deliver the test. The examination state information includes the presentation that was being delivered on the display device before the interruption, the responses the examinee had entered in that presentation, etc. When the test is restarted, the exam instance file 170 loads the state information back to test driver 110 and plugins 150, allowing the test to return to operation at the point where the test had been interrupted. Preferably, the running state of all objects is saved to exam instance file 170 rather than only some of the objects. Saving the state of only some of the objects to exam instance file 170 causes the potential problem of only a portion of the test information being restored after a test interruption. Exam instance file 170 may also store additional information relating to the test, including, for example: the timing utilized and time remaining on units of the exam, the current unit of delivery, candidate score, etc. Test driver 110 and plugins 150 communicate with exam instance file 170 using POLESS interfaces 195. Test driver 110 controls communication between test driver 110 and plugins 150 using PersistInstance interface 196, which is collocated with COM interfaces 169 in Figure 3.
Several administrative environments perform the administrative functions of computer-based test delivery system 100, for example: Test Center Manager ("TCM") Bridge 172; Educational Testing Service ("ETS") Bridge 174; and Unified Administration System ("UAS") 174. Administrative functions include, for example: checking-in an examinee, starting the test, aborting the test, pausing the test, resuming the test, and transmitting results.
There are preferably two ways to run test driver 110. The first is through a series of command line options and the second is using COM interfaces describing appointment information. The command line option exists for backwards compatibility in a standard ETS environment and a TCM environment. Table 2 shows a list of command line options test driver 110 supports. There are, for example, four programs which launch the test through the COM interface: 1)
LaunchTest.exe (for test production and client review); 2) UAS; 3) UTD2ETS.dll (an internal compatibility module for use with the ETS administration environment); and 4) UTD2TCM (for the
Test Center Manager environment). Other numbers of environments and/or programs may optionally be used.
The administration environments use several interfaces to communicate with test driver 110. IAppointment interface 176 is part of UAS 174 and allows access by test driver 110 to examinee information for the examinee taking the test, such as demographics. The examinee information is included in candidate exam results file 180, which is created by the test driver. ILaunch2 interface 177 functions as the primary control interface for UAS 174 and allows UAS 174 to control various components such as test driver 110, screen resolution change, accommodations for disabled candidates, examinee check-in, etc., in a test center, which is the physical location where the examinee is taking the test. ITransfer interface 199 transfers candidate exam results file 180 and other files back to UAS 174. Print interface 198 sends information regarding any reports to printer 182.
II. Compilation of Exam Source Files
A. XXL Compiler Interfaces and Classes
Figures 5A and 5B illustrate the main diagram for XXL compiler 140. XXL compiler 140 comprises the following classes, for example: cCompiler 2000; cData 2004; cArea 2006; cTemplate 2008; cCategory 2010; cItem 2012; cPresentation 2014; cGroup 2016; cSection 2018; cForm 2020; cFormGroup 2022; cExam 2024; cMsgBox 2026; cChecksum 2028; cEvent 2030; cResult 2032; cReport 2034; cPlugin 2036; and cXXL 2038.
The main interface to XXL compiler 140 is ICompile interface 2002. ICompile interface 2002 is implemented by cCompiler class 2000. All control and initiation of compilation of exam source files 130 into exam resource file 120 occurs by way of this single public interface. The core, non-plugin related elements of the XXL test definition language, as stored in XXL files 134, are compiled by classes in XXL compiler 140. For example, cSection class 2018, compiles the section element, and cGroup class 2016 compiles the group element.
ICompile interface 2002 supports the following operations, for example: createResource(); addSource(); addData(); closeResource(); about(); linkResource(); openResource(); and getCryptoObject(). CreateResource() creates a resource file, for example, an XXL based resource file such as exam resource file 120. AddSource() compiles an XXL file into the resource file. AddData() adds a file directly to a data branch of the resource file. CloseResource() closes the resource file. LinkResource() links a resource in the resource file and is performed after all compiling of the source files is completed. GetCryptoObject() returns an ICrypto object containing the current encryption setting of POLESS, as described below.
The classes of XXL compiler 140, e.g., cForm 2020 and cItem 2012, handle individual XXL core language elements. All of these classes compile the specific XXL source element into exam resource file 120. All of these class language elements are also symbols used in later references. Therefore, the classes all derive from cSymbol class 2040. cSymbol class 2040 allows the classes of XXL compiler 140 to reside in a symbol table.
For example, the XXL element plugin 150 appears as follows in XXL files 134:
<plugin name="helmNextPrevious" progid="UTDP.cNextPrevious" />
This XXL call causes an instance of cPlugin class 2036 to be created, compiles the source, and writes the compiled result to exam resource file 120. The name and ID of plugin 150 are also added to the symbol table for later reference.
XXL compiler 140 also contains the following token classes, for example: cToken 2042; cTokenCreatorNoRef 2044; cTokenCreator 2046; cTokenCreatorRef 2048; cTokenCreatorBase 2050; and cTokenFactory 2054. These token classes are involved in the identification of tokens. Tokens turn into symbols after identification. Symbols are any class derived from cSymbol, e.g., cTemplate, cSection, etc.
XXL compiler 140 also contains the following symbol table classes, for example: cPluginSymbolTable 2058; cTemplateSymbolTable 2060; cSymbolTable 2062; cFFGSymbolTable 2064; cSGPSymbolTable 2066; and cSymbolTableBase 2068. These classes are varieties of symbol tables. There are different symbol tables for different groups of symbols. A group of symbols define a name space for the symbol. Common symbol table functions are located in the base symbol table classes and templates.
All content and specification destined for a plugin 150 appears in the data element in XXL. For example, below is an item definition in XXL:
<item name="wantABreak1" skipAllowed="false">
   <data>
      <multiChoice correctAnswer="A" maxResponses="1" minResponses="1" autoPrompt="false"
         URI="itembank/info_item.htm#wantABreak" />
   </data>
</item>
The item element is handled by a cItem class 2012 object. The data element in the XXL definition is handled by a cData class 2004 object. Item plugin 156 will receive the source to compile from the cData class 2004 object, in this example, a multiChoice element. cWrapXML class 2052, a wrapper class for XML DOM nodes, supports error handling. cCustomAttributes class 2056 compiles the custom attributes XXL element. cWrapPropertySet class 2070 is a wrapper class for a POLESS property storage.
B. Order of Compilation
The test publisher advantageously and optionally is not forced to combine the entire test specification and content of the test into a single file. Rather, the test publisher is encouraged to break apart exam source files 130 to allow for maximum reuse between tests. Therefore, in accordance with one embodiment, in a single XXL source file, the order of the elements is enforced by XXL compiler 140 with the symbol tables. In alternative embodiments, more than one source file may be used. An element defined in the test specification and content or an attribute of an element is preferably and optionally defined before it is referenced by the element or by a sub-element.
Figure 6 illustrates a compile order for exam source file 130 in accordance with one embodiment. Other compile orders are possible so long as the basic functionality described herein is performed. Exam source files 130 include, for example, data files 132. Data files 132 include, for example, several multimedia files, e.g., sound files 2072 (.wav, .mp3, etc.) and image files 2070 (.jpg, .gif, etc.). Data files 132 are typically globally accessible to the test specification and content as defined in XXL files 134. Therefore, data files 132 are compiled first. It does not matter, however, in which order data files 132 are themselves compiled.
XXL files 134 preferably are compiled after data files 132, if data files 132 exist. Otherwise, XXL files 134 are compiled first. Other compilation orders may optionally be used. Any globally available scripts 2078 or other global data 2076 preferably are compiled first. Plugins 150 are compiled next. It should be noted that data files 2070, other data files 2076, and scripts 2078 are optional.
Therefore, plugins 150 can be the first files to be compiled if the other files are not present in exam source file 130. Any files concerning the layout of the test, i.e., layout files 2082, are next in the compilation order. Titlebar.html file 2084 and .bmp file 2086 are examples of pulled files. Pulled files are typically files that are used to create the visual format of the test and are usually defined using HTML. (See HTML files 138 in Figure 3.) If a file is referenced in HTML, then the file is compiled at the same time as the XXL file that is referencing the HTML file.
If the test uses categories, categories files 2084 are compiled next, since categories files 2084 can reference any global data files 132, plugins 150, and layout files 2082. Items files 2086, which include test questions that are to be delivered to the examinee, are compiled next and any HTML files referenced by items files 2086 are compiled along with items files 2086. Finally test specification files 2090, which are part of XXL files 134, are compiled. Test specification files 2090 define the groups, sections, forms, form groups, and exams that comprise the test. Various files, e.g., score report files 2092 and displays files 2094 can be referenced by test specification files 2090 and are compiled along with test specification files 2090.
The test publisher defines the compile order before starting the first compile sequence. Figure 7 illustrates how the test publisher defines the compile order. In source webs window 2092, the test publisher first compiles all .jpg and .gif image files. All XML files are compiled next. Plugin files 2080 are first in the sequence, followed by the template files, which are included in layout files 2082. Next, category files 2081 are compiled, followed by three items files 2086. Finally, test specification files 2090 are compiled. Figure 8 shows the output of the compilation process. The files are compiled in the order specified by the test publisher, as shown in Figure 7. Other compile sequences may optionally be used that accomplish the functionality and/or objects of the present invention.
III. XXL Test Definition Language
A. Exam Resource File
Figure 9 illustrates the main storage branches of exam resource file 120, which corresponds to the top-level elements of the XXL test definition language, denoted by reference numeral 500. The main storage branches of exam resource file 120 are, for example: exams branch 550; forms branch 600; items branch 650; category branch 700; templates branch 750; sections branch 800; groups branch 850; plugins branch 900; data branch 950; formGroups branch 1000; attributes branch 1050; scripts branch 1100; and message box ("Msgbox") branch 1150. Other storage branches may alternatively be used.
Exam branch 550, as seen in Figure 10, stores, for example, the primary attributes, properties, and data that govern the test. Exam branch 550 can store information for various tests, as is denoted by the three, vertical ellipses. A specific test is identified by the data stored in name attribute storage 552 or other identification schemes. Again, the various tests may each be identified by a different name, as denoted by the solid border around name attribute storage 552. Attributes storage 554 stores version information 555, and title information 556 of the test as a stream of data or other data storage format.
Title information 556 is optional, as is denoted by the broken border. Any optional, customized information regarding the test is stored in custom properties 558 as a property storage or other data storage format. Information relating to the forms of the test is optionally stored in forms property storage 560. A form is a fixed or substantially fixed order of testing events. Many different forms can be stored in forms storage 560, giving flexibility to test driver 110 in controlling progression of the test. FormGroups storage 562 optionally stores information relating to a collection of exam forms as a stream of data. Preferably, a single form from the formGroup is chosen to deliver to an examinee. The selection of the form from the group is performed by a selection plugin 160. Exam branch 550 preferably contains at least one forms storage 560 either independently or within formGroups storage 562. Other information relating to the test may be stored under exam branch 550.
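By way of illustration only, an exam definition corresponding to exam branch 550 might be authored in XXL roughly as follows. This is a hedged sketch rather than a quotation from the XXL schema: the exam and formRef constructs follow the examples given later in this description, the formGroup and selection constructs are assumed to mirror formGroups storage 562 and the form-selection role of selection plugin 160, and every name shown (fire_safety, selectOneForm, and so on) is hypothetical.
<exam name="fire_safety" title="Fire Safety Certification">
   <formGroup name="fire_safety_forms">
      <formRef name="fire01" />
      <formRef name="fire02" />
      <selection name="selectOneForm">
         <data> ... data for selection plugin 160 ... </data>
      </selection>
   </formGroup>
</exam>
A definition of this general shape would be expected to populate name attribute storage 552, title information 556, and formGroups storage 562 when compiled.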
Forms branch 600, as seen in Figures 11A and 11B, stores, for example, the primary attributes, properties, and data that govern the progress of the test. Forms branch 600 can store information for various forms, as is denoted by the three, vertical ellipses. As described previously, a form is a fixed or substantially fixed order of testing events. A single form is identified by the data stored in name attribute storage 602. Other identification formats may optionally be used. Again, the various forms may each be identified, for example, by a different name, as denoted by the solid border around name attribute storage 602. Attribute storage 604 stores, for example, begin section information 605, end section information 606, event information 607, and optionally stores version information 608, title information 609, skip allowed information 610, restartable information 611, width information 612, height information 613, and bit depth information 614. All information stored in attribute storage 604 is stored as a stream of data or other storage format. Begin section information 605 and end section information 606 indicate, respectively, which section of the test begins and ends the test.
Event information 607 indicates, for example, the order of events of the test for that form. Each event has a name and is prefixed with an event type and a colon. Other formats are optional. The event type includes "section", "report", and "results". Version information 608 and title information 609 indicate the version and title of the form, respectively. Skip allowed information 610 indicates whether or not by default skipping of sections is allowed. Restartable information 611 indicates whether the form can be restarted. Any optional, customized information regarding the form is stored in custom storage 616 as a property set or other data storage format. Timer storage 628 stores, for example, information relating to how the form is to be timed as a storage element. Attributes storage
630 stores, for example, the names of Timer Plugin 158 to be used with the form. Plugin data storage
632 and plugin data storage 633 store any data necessary for timer plugin 158 as a storage element and a stream of data, respectively. Plugin data storage 632 and plugin data storage 633 are optional. Scoring storage 634 stores, for example, information relating to the scoring of the form. Attributes storage 636 stores, for example, the name of scoring plugin 164 to be used with the form. Plugin data 638 and plugin data 639 optionally store any data needed for scoring plugin 164 as a storage element and a stream of data, respectively.
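For illustration, a form that names its beginning and ending sections and attaches timer and scoring plugins might be written roughly as follows. The sketch assumes that timer and scoring child elements mirror timer storage 628 and scoring storage 634, that each names its plugin and carries plugin data in a nested data element, and that the beginSection and endSection attributes correspond to begin section information 605 and end section information 606; the exact element and attribute names are fixed by the XXL schema and may differ.
<form name="fire01" title="Fire Safety Form 1" beginSection="intro" endSection="wrapUp" restartable="true">
   <timer name="timerStandard">
      <data> ... timing data for timer plugin 158 ... </data>
   </timer>
   <scoring name="scoreSummation">
      <data> ... scoring data for scoring plugin 164 ... </data>
   </scoring>
</form>
The plugin names timerStandard and scoreSummation are placeholders for whatever plugins 150 the test publisher has declared.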
Items branch 650, as seen in Figure 12, stores, for example, the primary attributes, properties, and data that govern the items, or test questions, to be delivered to the examinee during the test. Items branch 650 can store information for various items, as is denoted by the three, vertical ellipses. A single item is identified by the data stored in name attributes storage 652. Again, the various items may each be identified by a different name, as denoted by the solid border around name attributes storage 652. Attributes storage 654 stores, for example, weight information 654, scored information 655, and optionally stores, for example, skip allowed information 656, title information 657, start information 658, finish information 659, and condition information 660. Weight information 654 indicates a value used for judging and scoring the item. By default, an item is given a weight of one in accordance with one embodiment, but other values may be utilized. Scored information 655 indicates whether or not the item is scored as opposed to whether the item is being used as an example. The default of scored information 655 is true. Skip allowed information 656 indicates whether the examinee can skip the item without answering. Start information 658 indicates script execution at the beginning of the item and finish information 659 indicates script execution at the end of the item. Condition information 660 indicates whether or not there is a condition on the item being delivered to the examinee. The information stored in attributes storage 654 is stored as a stream of data or other data storage format. Data storage 662 and data stream 664 store any information regarding the properties of the item. For example, data storage 662 or data stream 664 can store the correct answer of a multiple choice item. Data storage 662 and data stream 664 store the information as a storage element and a stream of data, respectively. Any optional, customized information regarding the item is stored in custom storage 666 as a stream of data or other data storage format. Category storage 668 stores, for example, information relating to each category to which the item belongs. The information stored in category storage 668 preferably and optionally is redundant, as category branch 700 stores, for example, all the items within the specific categories. The reason for the optional redundancy is so that test driver 110 can quickly look up the category of any item.
Category branch 700, as seen in Figure 13, stores, for example, the primary attributes, properties, and data that govern the test categories. A test category provides a grouping mechanism, which is independent of delivery of the test, allowing for exotic reporting and scoring if necessary. For example, the test delivers 50 questions of fire safety from a pool of 200 questions. The 50 questions are chosen at random. All 50 questions are delivered together to the examinee in one section. Each question is a member of one of three categories: fire prevention, flammable liquids and fire retardants.
The test sponsor wants the report and results to show how well the examinee did in each category.
Hence the results are grouped exotically by the category, rather than by the order delivered. Category branch 700 is optional as denoted by the broken border. Category branch 700 can store information for various categories, as is denoted by the three, vertical ellipses. A single category is identified by the data stored in name attributes storage 702. Again, the various categories may each be identified by a different name, as denoted by the solid border around name attributes storage 702.
Attributes storage 704 stores, for example, complete information 705, duplicates information 706, contents information 707, and optionally stores description information 708. Complete information 705 indicates whether or not every item in the category must appear within the category or within its subcategories. Duplicates information 706 indicates whether the item can appear more than once within the category or within the subcategories. Contents information 707 determines what can exist within a category. Description information 708 is used within the category to contain a description of the category's contents. Category storage 710 stores, for example, information relating to any subcategories under the category identified in name attribute storage 702. Items storage 712 indicates any items that exist within the category. Sections storage 714 contains information indicating any sections that exist within the category. Scoring storage 716 contains information relating to the scoring of the items within the category. Attributes storage 718 stores, for example, the name of the scoring plugin to be used with the item. Data storage 720 and data stream 722 contain the information needed to initialize scoring plugin 164. Data storage 720 and data stream 722 store the information as a storage element and a stream of data, respectively.
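Continuing the fire-safety example used above, a category might be declared roughly as follows. The complete, duplicates, and contents attributes correspond to complete information 705, duplicates information 706, and contents information 707; the itemRef construct used to list member items is hypothetical and is shown only to suggest how items storage 712 could be populated.
<category name="fire_prevention" complete="false" duplicates="false" contents="items">
   <itemRef name="fire_001" />
   <itemRef name="fire_002" />
</category>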
Templates branch 750, as seen in Figure 14, stores, for example, the primary attributes, properties, and data that govern the templates used in the test. Template branch 750 can store information for various main templates, as is denoted by the three, vertical ellipses. A single main template is identified by the data stored in name attributes storage 752. Again, the various templates may each be identified by a different name, as denoted by the solid border around name attributes storage 752. Attributes storage 754 stores, for example, split information 756, order information 757, and optionally stores size information 759. Split information 756 defines how a specific area within the template is to be split or separated, for example, either by rows or columns or other shapes and/or sizes. Size information 759 indicates possible values for describing the size of the template, for example, pixels, percentages, or HTML syntax. Template storage 760 stores, for example, information relating to any sub-templates to be used under the templates specified by the information in name attributes storage 752. Sub-templates are identified by the information in name attributes storage 762. Many sub-templates 760 can exist, as denoted by the three vertical ellipses.
Areas storage 764 indicates, for example, information relating to the areas used within the template denoted by the information in name attributes storage 752. Many areas may exist within a template as denoted by the three vertical ellipses. Each area is identified by the information stored in name attribute storage 766. Attribute storage 768 stores, for example, visible plugin name information
760, size information 770, and allow more information 771. Plugin name information 760 indicates the name of the visible plugin to be used with the area. Size information 770 indicates the size of the area, as, for example, a pixel value, a percentage value, or HTML syntax. Plugin data 772 and plugin data 774 store information relating to the visible plugin to be used in the area. The data stored in either plugin data storage 772 or plugin data stream 774 is executed by the visible plugin when the template is loaded. Plugin data storage 772 and plugin data stream 774 store, for example, the information as either a storage element or a stream of data, respectively. Other information may optionally be stored.

Section branch 800, as seen in Figure 15, stores, for example, the primary attributes, properties, and data that govern test sections. Test sections dictate the navigation and timing of groups of items as well as displays within the test. Sections branch 800 can store information for various sections, as is denoted by the three, vertical ellipses. A single section is identified by the data stored in name attribute storage 802. Again, the various sections may each be identified by a different name, as noted by the solid border around name attributes storage 802. Attributes storage 804 stores, for example, group information 805 and optionally stores, for example, title information 806, skip allowed information 807, start information 808, finish information 809, and condition information 810. Group information 805 indicates to which group of the test the section belongs. Skip allowed information 807 indicates whether or not the items within the section may be skipped. Start information 808 indicates script execution at the beginning of the section and finish information 809 indicates script execution at the end of the section. Condition information 810 indicates any conditions that exist regarding the section. Any optional, customized information regarding this section is stored in custom property storage 812 as a stream of data or other data storage format. Custom attributes will be stored as a property set. The "key" for each attribute will be a string or other acceptable format. Timer storage 814 stores, for example, information regarding the timing of the section. Attribute storage 816 stores, for example, information identifying timer plugin 158, which is to be used with a section. Plugin data storage 818 and plugin data storage 820 store, for example, data needed for timer plugin 158. Plugin data storage 818 and plugin data storage 820 store, for example, the information as a storage element and a stream of data or other acceptable format, respectively. Navigation storage 822 stores, for example, information relating to the delivery of presentations and groups within the section. Attributes storage 824 stores, for example, information indicating which navigation plugin 162 is to be used with this section. Plugin data storage 826 and plugin data stream 828 store information needed for the navigation plugin 162. Plugin data storage 826 and plugin data stream 828 store the information as a storage element and a stream of data, respectively.

Groups branch 850, as seen in Figure 16, stores, for example, the primary attributes, properties, and data that govern the groups within the test. A group determines the order of events within the test. Groups branch 850 can store information for various groups, as is denoted by the three, vertical ellipses.
A single group is identified by the data stored in name attributes storage 852. The various groups may each be identified by a different name, as noted by the solid border around name attributes storage 852. Attributes storage
854 stores, for example, type information 855, event information 856, title information 857, and review name information 858. Type information 855 indicates whether the group is either a "group holder" (group of presentations) or a "section holder" (group of sub-sections). These are mutually exclusive.
Event information 856 indicates, for example, the order of events within the test. Review name information 858 indicates whether or not a presentation within the group is to be used as a review screen. Any optional, customized information regarding the group is stored in custom storage 860 as a stream of data or other data storage format. Events storage 862 stores event information, for example, as is described in further detail in Figure 17. Scoring storage 864 stores, for example, information relating to the scoring of items within the group. Attributes storage 866 stores, for example, information indicating which scoring plugin 164 is to be used with the group. Selection storage 872 stores, for example, information relating to the selection of items within the group. Attributes storage 874 indicates which selection plugin 160 is to be used with the group.
Figures 17A, 17B, 17C, and 17D illustrate the events sub-branch of groups branch 850 in greater detail in accordance with one embodiment of the invention. In Figure 17A, events sub-branch 862 can store information for various events. For example, events sub-branch 862 is storing information in events name sub-branch 880, event name sub-branch 890, and event name sub-branch 897. Attributes storage 881, in Figure 17B, under events name storage 880 stores, for example, type information 882, template information 883, and optionally stores, for example, title information 884, counted information 885, start information 886, finish information 887, and condition information 888. Type information 882 indicates whether the event is an item or a display. Template information 883 indicates which template is being used with the event. Counted information 885 indicates whether a presentation should be included in the totals of presentations presented to the examinee in a section. Generally, presentations with items, or questions, are counted and introductory presentations are not counted.
Start information 886, finish information 887, and condition information 888 indicate start, finish, and conditional scripts, respectively. Any optional, customized information regarding the event is stored in custom storage 889. The "key" for each custom attribute will be a string. Referring again to Figure 17A, event name storage 890 indicates, for example, a different event, which contains different attributes. Additionally, area information 891, in Figure 17B, indicates, for example, which area is rendering the presentation's content and item information 892 indicates the name of the associated item if the event is of the item type. Additionally, data storage 893, data stream 894, data storage 895, and data storage 896 contain information used in a nested presentation. The data off of a nested presentation are the contents of the item or the presentation. This data may be a stream, a storage, a link to a stream, a link to a storage, or other format. In Figure 17C, event name 897 indicates another event, which includes a sub-event 898, in Figure 17D.
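As an illustrative sketch only, a group of presentations reflecting the event attributes just described (type, template, counted, area, and the referenced item) might be written as follows. The presentation and itemRef spellings, the groupHolder value, and the attribute placements are assumptions patterned on this description rather than quotations from the XXL schema.
<group name="fireQuestions" type="groupHolder">
   <presentation template="singleItem" counted="true">
      <itemRef name="fire_001" area="main" />
   </presentation>
   <presentation template="singleItem" counted="true">
      <itemRef name="fire_002" area="main" />
   </presentation>
</group>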
Plugins branch 900, as seen in Figure 18, stores, for example, the primary attributes, properties, and data that govern any plugins 150 used for the test. Plugins branch 900 can store information for various plugins, as is denoted by the three, vertical ellipses. A single plugin is identified by the data stored in name attribute storage 902. A CLSID is stamped with the name of the plugin 150. Attributes storage 904 stores, for example, information identifying the plugin 150 by a program ID. Data storage
906 stores, for example, the data for plugin 150 as either a stream of data or a storage element.

Data branch 950, as indicated in Figure 19, stores, for example, any global data needed for the test. Data stored optionally under data branch 950 may be stored as either a storage element or a stream of data, as indicated by data storage 952 and data storage 954. Data stored under data branch 950 may be directly used by a plugin 150 or the data may be resources (.gif, .jpeg, .wav, .mpeg, etc.) used internally by a plugin 150.

FormGroups branch 1000, as seen in Figure 20, stores, for example, the primary attributes, properties, and data that govern the formGroups of the test. FormGroups branch 1000 can store information for various formGroups, as is denoted by the three, vertical ellipses. A single formGroup is identified by the data stored in name attributes storage 1002. The various formGroups may each be identified by a different name, as denoted by the solid border around name attributes storage 1002. Attributes storage 1004 stores, for example, information indicating which forms are to be used within the formGroup. Selections storage 1006 stores, for example, information relating to the selection of items within the formGroup. Attributes storage 1008 indicates which selection plugin 160 is to be used with the formGroup. Plugin data storage 1010 and plugin data storage 1012 store any information needed for the selection plugin 160.

Attributes storage branch 1050 stores, for example, attribute information that is global to exam resource file 120. This includes the last execution state of XXL compiler 140 [sMode], and the major [iXXLMajorVersion] and minor [iXXLMinorVersion] versions of the XXL language.
Scripts branch 1100 stores, for example, information relating to scripts used within the test. Attributes storage 1102 stores, for example, type information that specifies which type of language the script is in, for example, VBScript or JScript. Scripts storage 1104 stores, for example, global scripts used within the test that may be referenced by the test driver. MsgBox branch 1150 stores, for example, information relating to the size and content of any message boxes that may be delivered to the examinee during the test. Message boxes may be triggered by plugins 150 during the exam.
B. Defining a Test with the XXL Test Definition Language
1) Test Production and Test Delivery
Figure 24 is a flow chart illustrating the overall method of test production and test delivery according to the present invention, denoted generally by reference numeral 1500. The test publisher first authors the test specification and content in the test definition language, for example, XXL, step 1502. The test specification and content are then stored in exam source files 130, specifically, in XXL files 134, step 1504. The content of XXL files 134 is then compiled and validated, step 1506. The compiled XXL test specification and content are stored in exam resource file 120, step 1508. Finally, the compiled XXL test specification and content are delivered to the examinee, step 1510. The validation of the test specification and content is illustrated in greater detail in Figure 25, by the method denoted generally by reference numeral 1512. When the XXL test specification and content stored in exam source files 130 specifically references a plugin 150, that plugin 150 is instantiated, step 1514. The segment of the XXL test specification and content relating to that plugin 150 is loaded into the plugin 150 from exam source files 130, step 1516. In an alternative embodiment, the partial test specification and content is loaded into a private memory in data communication with the plugin 150. The plugin 150 validates the segment of the XXL test specification and content, step 1518. The validated segment of the XXL test specification and content is then unloaded from the plugin 150 into a storage element within exam resource file 120.
Figure 26 illustrates the method of the test delivery cycle in greater detail. When the previously validated segment of XXL test specification and content stored in exam resource file 120 references a plugin 150, the plugin 150 is instantiated, step 1525. The storage element in exam resource file 120 containing the validated segment of XXL test specification and content is provided to the plugin 150, step 1527. Finally, the validated segment of XXL test specification and content is loaded into the plugin 150 from the storage element within exam resource file 120, step 1529.
2) Flow of a Test
During delivery of the test, test driver 110 and plugins 150 retrieve the test specification and content from exam resource file 120. Figure 27 illustrates the flow of a test. Exam resource file 120 stores the entire test specification and contents of the test. Exams branch 550 stores, for example, information identifying the various forms that define the order (e.g., sequential, non-sequential, random, etc.) of delivery of the test. Forms branch 600 stores, for example, the information for the various forms, including which sections begin and end the test, the order in which presentations are to be delivered, timing information, and scoring information included in a particular form. Sections branch 800 stores, for example, information identifying a particular group and the timing and navigation information associated with that group. Groups branch 850 identifies the order of presentations 862, also called events, that are to be delivered to the examinee during a particular section. Groups branch 850 also stores, for example, information for the scoring and selection of items delivered to the examinee during a presentation 862. Groups branch 850 can also store information for other sub-groups stored in groups branch 850 and sub-sections stored in sections branch 800.
3) XML-based Language
The XXL language is advantageously based on XML, although other functionally similar languages may optionally be used. XML is a meta-language, meaning that XML can be used to define other languages. XML based languages have several common concepts, including, for example: elements, attributes, and tags. An element, for example "<exam>," is a label for a tag type. An attribute, for example name="star_trek", is a property of an element. Therefore, the "exam" element would be named or identified as "star_trek". A tag is a region of data surrounded by <>. Tags are made up of elements and attributes, for example, <exam name="star_trek">. Alternative formats and/or implementations may optionally be used to implement the attributes, elements, and tags in the present invention. Below are other examples of XXL definitions used for a test:
<exam name="star_trek" >
<form name="strek01" title="Star Trek Certification" >
<item name="sta_001" weight="1.0" scored="true" >
The element declaration for each element is followed by the attributes assigned to that element. Any XML based language, including XXL of the present invention, must be both well formed and valid to serve its intended purpose. There are three rules which an XML definition generally follows to be considered "well formed." First, any start tags have, for example, matching end tags that are preceded by a forward slash. For example:
Correct: <exam> ... </exam>
Wrong: <exam> ... <exam>
Wrong: <exam> ... </narf>
Secondly, there is generally one end tag for every start tag or a single tag with a slash. For example:
Correct: <presentation> ... </presentation>
Correct: <presentation />
Wrong: <presentation>
Wrong: </presentation>
Finally, all tags preferably are properly nested without overlap. For example:
Correct: <exam> <form> ... </form> </exam>
Wrong: <exam> <form> ... </exam> </form>
The above rules are exemplary for use of the XXL language in accordance with XML format. However, other alternative formats that implement the standards of the rules described above may optionally be used.
An XML document is considered to be "valid" if the document correctly describes a specific set of data. For a test definition language such as XXL, a document is "valid" if it correctly describes XXL. For example:
Correct: <exam name="star_trek">
   <formRef name="strek01" title="Star Trek Certification" />
   <formRef name="strek02" title="Star Trek Certification" />
</exam>
Wrong: <exam name="star_trek">
   <music artist="Chosen Few" title="Name of My DJ" />
</exam>
4) XXL Schema
A language created with XML uses a schema to define what tags create the language, what attributes can appear on a certain tag, and the order in which tags may appear. Figure 28 illustrates an example of an XXL schema that defines the global attribute "name" and the element "form". The element "form" has the global "name" attribute, as well as the attribute "restartable", which is defined within the "form" definition. The "form" element also has a child, or sub, element, "scoring". The element "scoring" needs to be defined before being assigned (not shown). The XXL schema makes many element tokens legal at the beginning of the schema.
Figure 29 illustrates another XXL schema example. In this example, the element "item" is being defined. The "item" element has several attributes, all of which are defined within the element definition, for example: "name"; "title"; "template"; "area"; "weight"; "scored"; and "skipAllowed". In this example, no global attributes have been defined; therefore, all attributes being assigned to the "item" element must be defined within the element definition. The "item" element also has a child element, "data." The XXL schema in its entirety may be found in Appendix A.
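In abbreviated form, the corresponding "item" definition in Appendix A reads, for example (only a few of the attributes listed above are shown):
    <ElementType name="item" content="eltOnly" order="many" model="open">
        <AttributeType name="weight" dt:type="string" required="no" />
        <AttributeType name="scored" dt:type="enumeration"
                       dt:values="true false" default="true" required="no" />
        <AttributeType name="skipAllowed" dt:type="enumeration"
                       dt:values="true false" required="no" />
        <attribute type="weight" />
        <attribute type="scored" />
        <attribute type="skipAllowed" />
        <element type="data" minOccurs="1" maxOccurs="1" />
    </ElementType>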
Figure 30 illustrates the hierarchy of the XXL test definition language. The XXL language's outermost element is XXL element 1700. XXL 1700 includes, for example, the following elements: exam element 1702, category element 1704, item element 1728, plugin element 1706, template element 1708, and MsgBox element 1710. Additionally, form element 1716, section element 1722, and group element 1724 may be defined directly inside XXL element 1700 or in their appropriate place in the hierarchy. When form element 1716, section element 1722, and group element 1724 are defined directly under XXL element 1700, form element 1716, section element 1722, and group element 1724 must be referenced later to be utilized.
Exam element 1702 contains formGroup element 1712 and form element 1716. FormGroup element 1712 may also contain form element 1716 directly. Form element 1716 may contain report element 1720, results element 1718, and section element 1722. Section element 1722 may contain group element 1724. Group element 1724 may contain section element 1722, other group elements 1724 and presentation elements 1726.
Presentation element 1726 references item elements 1728 that are defined under XXL element 1700. Presentation element 1726 may contain other presentation elements 1726.
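Stated as a document skeleton, the containment described above allows, for example, a structure of the following general form. This is a sketch only; attributes and required sub-elements such as scoring, timers, and navigation are omitted, and the names "sec01", "grp01", and the version value are merely illustrative.
    <xxl version="1.0">
        <exam name="star_trek">
            <form name="strek01">
                <section name="sec01">
                    <group name="grp01">
                        <presentation item="sta_001" />
                    </group>
                </section>
            </form>
        </exam>
        <item name="sta_001">
            <data> ... </data>
        </item>
    </xxl>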
Category element 1704 may contain other category elements 1704. Template element 1708 may contain area elements 1714 and other template elements 1708. Attribute element 1730, script element 1732, and data element 1734 may adorn many language constructs. Attribute element 1730 may be attached to many core elements, including, for example, exam element 1702, form element 1716, section element 1722, presentation element 1726, and item element 1728. Script element 1732 may appear under the following elements, for example: XXL element 1700, form element 1716, section element 1722, presentation element 1726, and item element 1728. Data element 1734 may appear under whichever element contains data for a plugin 150, for example: category element 1704, plugin element 1706, area element 1714, form element 1716, formGroup element 1712, section element 1722, group element 1724, presentation element 1726, and item element 1728.
a) Elements
Elements are defined using the <ElementType> tag. An element can have child elements and/or attributes. There are several attributes on the <ElementType> tag that define the element, for example: "name," "content," "order," and "model". The "name" attribute defines the name of the element, i.e., <form>. The "name" attribute can be assigned, for example, any alphanumerical string value. The "content" attribute defines whether text and/or elements can be contained within the element. For example, the "content" attribute may be assigned values including, for example: element only ("eltOnly"); mixed; empty; and text only. The "order" attribute defines the order of any child elements defined within an element. For example, the "order" attribute may be assigned values including, for example: "many," which indicates that any ordering of child elements is permissible, and "one," which indicates that only one ordering of child elements is permissible. The "model" attribute indicates whether the XML has to be closed or open and expandable. An open element may contain constructs from another schema or 'namespace'. For example, a 'stem' XXL element, which is the text body of a question, is open. This allows a test publisher to insert an extensible hypertext markup language ("XHTML") tag, provided the test publisher supplies the location of the schema defining XHTML. The XHTML tag allows the test publisher to embed the question text inside the stem element, complete with formatting. A closed element can contain only the elements that are defined. Figure 28 illustrates how an <ElementType> tag is used to define the "form" element and Figure 29 illustrates how an <ElementType> tag is used to define the "item" element.
To use an element within the definition of another element, the <element> tag is used. There are several attributes on the <element> tag that define the use of the element, for example: "type," "minOccurs," and "maxOccurs". The "type" attribute identifies which element is being placed. For example, in Figure 28, the child element "scoring" is being placed and is the value assigned to the "type" attribute. The "type" attribute can be assigned any alphanumerical string value. The "minOccurs" attribute defines the minimum number of times the element can appear and is assigned a numerical value, for example, "0". The "maxOccurs" attribute defines the maximum number of times the element can appear and is also assigned a numerical value. Both "minOccurs" and "maxOccurs" can also be assigned the value "*," or other suitable designation, which indicates that the element may appear an infinite number of times. This value is typically only assigned to the "maxOccurs" attribute.
b) Attributes
Attributes are defined using the <AttributeType> tag. There are several attributes on the <AttributeType> tag that define the attribute, for example: "name," "dt:type," "required," "dt:values," and "default". The "name" attribute defines the name of the attribute, i.e., "restartable". The "name" attribute can be assigned any alphanumerical string value. The "dt:type" attribute defines the type of the attribute. For example, the "dt:type" attribute may be assigned values including, for example: string, for an alphanumeric string; enumeration, for an attribute that is assigned a value out of a list of possible values (i.e., attribute = true or false); integer; float, for an attribute that has a floating decimal point value; and ID. An ID is a string that is unique within the XML document. For example, if the attribute 'name' was a dt:type 'ID', then only one name, e.g., "Joe", could be contained in the XML document. ID types are not used in XXL. The "required" attribute indicates whether or not the attribute must exist on the element tag on which it is being placed. The "required" attribute is assigned the values, for example, "yes" or "no".
The "dt:values" attribute defines the possible values of the enumeration type if "dt:type" is assigned the value "enumeration". For example, "dt:values" may be assigned the value "true false". The "default" attribute indicates the value the attribute is assigned if no other value is specified when the attribute is placed. In Figure 29, the <AttributeType> tag is used to define the attributes "name," "title," "template," "area," "weight," "scored," and "skipAllowed".
To use an attribute within the definition of an element, the <attribute> tag is used. The attribute on the <attribute> tag that defines the use of the attribute is "type". The "type" attribute identifies which attribute is being placed. The "type" attribute may be assigned any alphanumeric string value. Figure 29 illustrates how the attributes defined within the "item" element definition are subsequently placed for use with that element using the <attribute> tag.
c) Groups
The <group> tag is used to allow for complex grouping of different elements. In Figure 28, the <group> tag is used to group, for example, the elements "sectionRef" and "section" within the definition for the "form" element. There are several attributes on the <group> tag that define the group, for example: "order," "minOccurs," and "maxOccurs". The "order" attribute defines the order in which the child elements being grouped must appear. For example, the "order" attribute may be assigned values including, for example: "many," which indicates that any ordering of child elements is permissible, and "one," which indicates that only one ordering of child elements is permissible. The "minOccurs" attribute defines the minimum number of times the element can appear and is assigned a numerical value, for example, "0". The "maxOccurs" attribute defines the maximum number of times the element can appear and is also assigned a numerical value. Both "minOccurs" and "maxOccurs" can also be assigned the value "*," which indicates that the element may appear an infinite number of times. This value is typically only assigned to the "maxOccurs" attribute.
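For example, the grouping used inside the "form" definition of Appendix A reads, in abbreviated form, as follows; either "section" or "sectionRef" children may appear, in any order, one or more times:
    <group order="many" minOccurs="1" maxOccurs="*">
        <element type="sectionRef" minOccurs="0" maxOccurs="*" />
        <element type="section" minOccurs="0" maxOccurs="*" />
    </group>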
C. Defining Plugin with the XXL Language
Plugin files 2080 (Figure 6) are, for example, the first .xml files that are compiled into exam resource file 120. Plugins 150 and plugin files 2080 allow the test publisher to customize the behavior of test driver 110. Plugin files 2080 describe what type of data can be accepted for a particular plugin 150. Figure 31 illustrates how a plugin 150, in this case a report plugin 168, is used with data stored in a plugin file 2080 to produce the test's data content and/or presentation format 3000. Having all plugin 150 information in a separate XML file 134 is a convenient way of re-using XML file 134 for many tests and for organizing test data; however, this need not be the case. All XXL information may be in a single XML file 134 or in many files, as long as the data is compiled, preferably, in the order specified in Figure 6. The <data> tag is used within the XXL definition of a plugin 150 to place all information intended for the plugin 150. Data tells a plugin 150 how the plugin 150 should behave. For example, data can tell the plugin 150 what to display (i.e., visual plugins), what information to print (i.e., report plugin 168), and in what order items should be delivered (i.e., selection plugins 160). Below is an example of an XXL data schema:
<ElementType name="data" content= "mixed" model="open"
<AttributeType name="name" type="ID" required="no" />
<AttributeType name="URI" type="string" required="no" />
<AttributeType name="keepExternal" type="string" required="no" />
<attribute type="name" /> <attribute type="URI" /> <attribute type="keepExternal" />
</ElementType>
Data can be defined either within the data tag in an element definition or in an external file. The "URI" attribute is a path to an external file. If the data should not be added to exam resource file 120, the "keepExternal" attribute should be assigned the value "true".
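For example, a <data> tag may reference an external file rather than carrying its content inline; the name and file name shown here are purely illustrative:
    <data name="tutorialMovie" URI="tutorial.avi" keepExternal="true" />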
Data can be placed as a child element within an element definition. In the XXL specification, the <data> tag is used to identify the data being placed. Below is an example of an XXL definition of data placed using the <data> tag:
<data>
    <nextPrevious bgcolor="gray">
        <button img="next.bmp" action="next" />
        <button img="prev.bmp" action="previous" />
        <button img="review.bmp" action="review" />
    </nextPrevious>
</data>
Figure 32 illustrates the rendering of the "nextPrevious" helm plugin 154 defined in part with the <data> tag above. The appearance of Next button 3002, Previous button 3004, and Review button 3006 is defined by the values assigned to the "img" attributes.
Figure 33 illustrates another <data> example. Timer plugin 158, named "mainTimer," is assigned to the attribute "timer" within the <form> element "foobar001" definition. The <form> element is a child element of the <exam> element "foobar". The <data> tag assigns the information for "mainTimer" timer plugin 158. The information assigned to "mainTimer" is a twenty-minute standard timer for form "foobar001" along with a sixty-second "One Minute Left" warning.
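Although Figure 33 is not reproduced here, the timer information within the "foobar001" form definition may resemble the following sketch. The <timer> wrapper follows the schema of Appendix A, while the element and attribute names inside the <data> tag are hypothetical, since that vocabulary is defined by timer plugin 158 rather than by the XXL schema:
    <timer plugin="mainTimer">
        <data>
            <standardTimer minutes="20">
                <warning secondsRemaining="60" message="One Minute Left" />
            </standardTimer>
        </data>
    </timer>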
Figure 34 illustrates the concept of XXL referencing. XXL allows for an element to be defined once and then referenced multiple times later in the test specification. The references all re-use the original definition of the element and do not perform copy operations. The following are XXL reference elements, for example: categoryRef, formRef, groupRef, and sectionRef. CategoryRef references an existing category element. FormRef references an existing form element. GroupRef references an existing group element. SectionRef references an existing section element. For example, the test definition illustrated in Figure 34 defines the section element "Jeannie." Later in the test specification, the form element "JoeFamily1" is defined, which references section "Jeannie" using the sectionRef element. Another form, "JoeFamily2," references the same section "Jeannie" also using the sectionRef element.
In alternative embodiments, languages similar to XML may optionally be used, such as web programming languages, object oriented programming languages, JAVA, etc., that provide the functionality described herein. In addition, in alternative embodiments, similar functionality as described herein may be utilized without departing from the present invention.
The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention which fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.
APPENDIX A-
XXL SCHEMA
<?xml version="1.0" ?>
<Schema name="xxl" xmlns="urn:schemas-microsoft-com:xml-data" xmlns:dt="urn:schemas- icrosoft-com:datatypes">
<!-- ================================================================= -->
<!--  XDR Schema for Prometric's extensible eXam Language (XXL)        -->
<!--  Copyright 2000, 2001 Prometric - Thomson Learning                -->
<!-- ================================================================= -->
<!-- [ Version 1.30 ]  Released on 2001.12.28 by Ed Long -->
<!--   - Added "allowMore" to "area". -->
<!-- [ Version 1.29 ]  Released on 2001.07.31 by Todd "Tronster" Hartley -->
<!--   - Added "condition" to "report" and "results". -->
<!-- [ Version 1.28 ]  Released on 2001.04.04 by Todd "Tronster" Hartley -->
<!--   - Added "title" to "msgBox". -->
<!-- [ Version 1.27 ]  Released on 2001.03.06 by Todd "Tronster" Hartley -->
<!--   - Added min and max occurs to elements without it. -->
<!--   - Modified empty elements to be of type empty. -->
<!--   - Modified ID to be id on "data". -->
<!-- [ Version 1.26 ]  Released on 2001.01.30 by Todd "Tronster" Hartley -->
<!--   - Added "skipAllowed" to "section". -->
<!--   - Modified "categories" back to "sections" from "groups". -->
<!--   - Modified "contents" on "categories", supporting "sections". -->
<!-- [ Version 1.25 ]  Released on 2001.01.26 by Todd "Tronster" Hartley -->
<!--   - Modified "keepExternal" on "data" to be an enumeration. -->
<!--   - Modified "skipAllowed" on "item" to have no default. -->
<!--   - Modified "categories" from "sections" to "groups". -->
<!-- [ Version 1.24 ]  Released on 2000.12.11 by Todd "Tronster" Hartley -->
<!--   - Added "group" and "groupRef" from "section". -->
<!--   - Modified name on "template" to be required. -->
<!--   - Modified name and title attributes to be global in schema. -->
<!-- [ Version 1.23 ]  Released on 2000.11.21 by Todd "Tronster" Hartley -->
<!--   - Language update for QC release #2.13. -->
<!-- [ Version 1.22 ]  Released on 2000.10.31 by Todd "Tronster" Hartley -->
<!--   - Added "message" tag. -->
<!--   - Added to "results" and "report" the "message" tag. -->
<!-- [ Version 1.21 ]  Released on 2000.10.19 by Todd "Tronster" Hartley -->
<!--   - Modified "height" from "heigth" on msgBox children attributes. -->
<!-- [ Version 1.20 ]  Released on 2000.10.09 by Todd "Tronster" Hartley -->
<!--   - Added "messageBox" to xxl. -->
<!-- [ Version 1.19 ]  Released on 2000.09.25 by Todd "Tronster" Hartley -->
<!--   - Added "name" to presentation (it should have been there). -->
<!-- [ Version 1.18 ]  Released on 2000.08.29 by Todd "Tronster" Hartley -->
<!--   - Modified category's "empty" to be "contents" and reorganized its possible enumerations. -->
<!-- [ Version 1.17 ]  Released on 2000.08.17 by Todd "Tronster" Hartley -->
<!--   - Modified header so its comment style is consistent. -->
<!--   - Modified many comments so the style is consistent. -->
<!--   - Modified default value of category's "duplicates". -->
<!-- [ Version 1.16 ]  Released on 2000.07.27 by Todd "Tronster" Hartley -->
<!--   - Modified "form" to require "scoring" and "timer". -->
<!--   - Added (missing) "data" to navigation and selection. -->
<!-- [ Version 1.15 ]  Released on 2000.07.21 by Todd "Tronster" Hartley -->
<!--   - Added comments about types allowed within attributes. -->
<!--   - Added (missing) "data" to navigation and selection. -->
<!-- [ Version 1.14 ]  Released on 2000.07.10 by Todd "Tronster" Hartley -->
<!--   - Removed all IDREFs as it prevented cross file compiles. -->
<!-- [ Version 1.13 ]  Released on 2000.07.07 by Todd "Tronster" Hartley -->
<!--   - Modified "name" requirement on "presentation". -->
<!-- [ Version 1.12 ]  Released on 2000.06.26 by Todd "Tronster" Hartley -->
<!--   - Added "type" construct to script. -->
<!--   - Added "skipAllowed", "beginSection", "endSection" to form. -->
<!--   - Removed "commented" from section. -->
<!-- [ Version 1.11 ]  Released on 2000.06.20 by Todd "Tronster" Hartley -->
<!--   - Changed data to use "URI" instead of "src". -->
<!-- [ Version 1.10 ]  Released on 2000.05.31 by Todd "Tronster" Hartley -->
<!--   - Changed global attribute from ID to string due to clients which use IDs which start with numbers. -->
<!-- [ Version 1.9 ]  Released on 2000.05.22 by Todd "Tronster" Hartley -->
<!--   - Skipped to 1.9 to synchronize with POLESS. -->
<!--   - Added "template" to item (how did that get past me?). -->
<!--   - Modified "presentation". -->
<!--   - Removed "commentTime" from "form". -->
<!-- [ Version 1.7 ]  Released on 2000.05.10 by Todd "Tronster" Hartley -->
<!--   - Changed "result" to "results". -->
<!-- [ Version 1.6 ]  Released on 2000.05.02 by Todd "Tronster" Hartley -->
<!--   - Modified "name" with IDREF on reference types, as I realized that they will not be able to resolve (i.e., a formRef will error on the schema load if the form definition is in another file). -->
<!-- [ Version 1.5 ]  Released on 2000.05.01 by Todd "Tronster" Hartley -->
<!--   - Modified "attributes" to be "attribute". -->
<!-- [ Version 1.4 ]  Released on 2000.04.17 by Todd "Tronster" Hartley -->
<!--   - Skipped to 1.4 as two POLESS updates were made involving "template". -->
<!--   - Modified "template" construct, and removed "frameset". -->
<!-- [ Version 1.2 ]  Released on 2000.04.11 by Todd "Tronster" Hartley -->
<!--   - Added "keepExternal" to "data" tag, and fixed tag's comment. -->
<!--   - Added "skipAllowed" to "item". -->
<!--   - Modified attributes referencing plug-ins to use the keyword "plugin" instead of "name". -->
<!--   - Modified all identifiers two words or bigger, removing the underscore, and added first letter capitalization to compound words. -->
<!--   - Modified "presentation" to now contain "review" attribute; this was yanked from "section". -->
<!--   - Removed "score" attribute on "form", as it is already a sub-element. -->
<!-- [ Version 1.1 ]  Released on 2000.04.06 by Todd "Tronster" Hartley -->
<!--   - Modified copyright notice. -->
<!--   - Removed "language" from "exam". -->
<!--   - Removed "name" from "presentation" as this will be a generated value at compile time, based on a 30-bit XOR checksum. -->
<!-- [ Version 1.0 ]  Released on 2000.03.30 by Todd "Tronster" Hartley -->
<!--   - Skipped to 1.0 as production code is now being generated. -->
<!--   - Modified "section", so "navigation" and "selection" are now elements. -->
<!-- [ Version 0.15 ]  Released on 2000.03.27 by Todd "Tronster" Hartley -->
<!--   - Skipped 0.14 to stay synced with POLESS document. -->
<!--   - Removed "review" tag on sections. -->
<!-- [ Version 0.13 ]  Released on 2000.03.08 by Todd "Tronster" Hartley -->
<!--   - Added "plugin" to top level (ack, it should have been there). -->
<!--   - Added "timer" plugin to "form". -->
<!--   - Modified comment format. -->
<!--   - Modified commenting (added "commented" to "section"). -->
<!--   - Modified (heavily) "attributes" element. -->
<!--   - Modified score, review, and timer to be elements instead of attributes. -->
<!--   - Modified data to use "src" instead of "uri". -->
<!-- [ Version 0.12 ]  Released on 2000.03.03 by Todd "Tronster" Hartley -->
<!--   - An intermediate release. It was incomplete, and since a significant amount of time and change has occurred since this first got out, the release has been incremented to avoid confusion between 0.12 printouts and 0.13 printouts. -->
<!-- [ Version 0.11 ]  Released on 2000.02.21 by Todd "Tronster" Hartley -->
<!--   - Added elements "output_results" and "print_score_report" to "form". -->
<!--   - Added "counted" to "presentation". -->
<!--   - Added "condition" to "item". -->
<!--   - Added "attributes" to exam, form, section, item, etc. -->
<!--   - Modified review concept to be listed as section attribute. -->
<!--   - Modified "script" related elements. -->
<!--   - Modified presentation so it can be labeled via "name". -->
<!--   - Renamed "color_bits" to "bit_depth". -->
<!--   - Removed "type" from "data". -->
<!--   - Removed "data_ref" element. -->
<!-- [ Version 0.10 ]  Released on 2000.02.15 by Todd "Tronster" Hartley -->
<!--   - Added implied restriction of 32 byte identifiers as the maximum. -->
<!--   - Added back "color_bits" to "minimum_resolution". -->
<!--   - Added "uri" for external data references via a URI. -->
<!--   - Added "review" to "presentation" to mark what is a review screen. -->
<!--   - Modified (heavily) "template" to use HTML's "frameset". -->
<!--   - Modified (heavily) "data" element. -->
<!--   - Modified previous version's comments (fixed errors in v0.9). -->
<!--   - Modified "language" in "exam" to use ISO standard. -->
<!--   - Removed "presentation" from "exam" and "section". -->
<!--   - Removed "condition" from "item". -->
<!--   - Removed "plugin_ref" element. -->
<!-- [ Version 0.9 ]  Released on 2000.02.11 by Todd "Tronster" Hartley -->
<!--   - Modified (heavily) "plugin" element (removed "type", etc.). -->
<!--   - Moved data to top level construct and added reference type. -->
<!--   - Removed "z_plane" from "template" (formerly "presentation_template"). -->
<!--   - Removed "helpfile" (help implementation needs to be resolved first). -->
<!--   - Removed "item_ref" as items are now referenced via presentations. -->
<!--   - Removed "pool" concept. -->
<!--   - Removed "color_bits" from "minimum_resolution". -->
<!-- [ Version 0.8 ]  Released on 2000.02.07 by Todd "Tronster" Hartley -->
<!--   - Added "pool" concept based on TD's "group" and "set" concepts. -->
<!--   - Modified schema header comment block to be more descriptive. For version control check-in information (i.e., VSS), the comments in this block (for the corresponding version) should be cut and paste. -->
<!--   - Converted elements to attribs, including: "window". -->
<!--   - Removed "forms" and made "form" an exam level element. -->
<!-- [ Version 0.7 ]  Released on 2000.01.23 by Todd "Tronster" Hartley -->
<!--   - Switched back to 2 digit versioning. -->
<!-- [ Version 0.6 ]  Released on 2000.01.24 by Todd "Tronster" Hartley -->
<!--   - Initial version of XXL schema, based on DTD version 0.5. -->
<!--   - Tabs within the document are set at 3 spaces. -->
<!-- ================================================================= -->
<!--  GLOBAL DEFINES                                                   -->
<!--                                                                   -->
<!--  NOTES:                                                           -->
<!--  - This area contains variable type definitions which can be used -->
<!--    anywhere within the extensible eXam Language. -->
<!--  - Some entities contain an attribute called "name". This name must be -->
<!--    unique according to what scope it is defined for. The schema only -->
<!--    validates across the "test" scope. Scopes are as follows: -->
<!--      - test: For test elements such as item, section, etc... -->
<!--      - plugin: For the different plug-ins. -->
<!--  - ???TBH implied restriction: identifiers cannot be larger than 31 -->
<!--    bytes, due to limitations in OLE Structured Storage. -->
<!-- ================================================================= -->
<AttributeType name="name" dt:type="string" required="yes" />
<AttributeType name="title" dt:type="string" required="no" />
<!-- ================================================================= -->
<!--  <xxl>                                                            -->
<!--                                                                   -->
<!--  ATTRIBUTE   REQ?  DESCRIPTION                                    -->
<!--    version   yes   Version of the XXL spec.                       -->
<!--                                                                   -->
<!--  SUB-ELEMENTS                                                     -->
<!--    <exam> <form> <formGroup> <section> <group> <item>             -->
<!--    <category> <template> <plugin> <data> <script> <msgBox>        -->
<!--                                                                   -->
<!--  NOTES                                                            -->
<!--  - These elements are considered "top level". This allows for these -->
<!--    elements to be defined in different files before compiling. -->
<!--  - If "msgBox" is not in any of the XML files by link time, it is the -->
<!--    job of the linker to add a default one. -->
<!-- ================================================================= -->
<ElementType name="xxl" order="many" content="eltOnly" model="closed">
    <AttributeType name="version" dt:type="string" required="yes" />
    <attribute type="version" />
    <element type="msgBox" minOccurs="0" maxOccurs="1" />
    <element type="script" minOccurs="0" maxOccurs="1" />
    <element type="data" minOccurs="0" maxOccurs="*" />
    <element type="plugin" minOccurs="0" maxOccurs="*" />
    <element type="template" minOccurs="0" maxOccurs="*" />
    <element type="category" minOccurs="0" maxOccurs="*" />
    <element type="item" minOccurs="0" maxOccurs="*" />
    <element type="group" minOccurs="0" maxOccurs="*" />
    <element type="section" minOccurs="0" maxOccurs="*" />
    <element type="form" minOccurs="0" maxOccurs="*" />
    <element type="formGroup" minOccurs="0" maxOccurs="*" />
    <element type="exam" minOccurs="0" maxOccurs="*" />
</ElementType>
< !--
< !--
< !— ATTRIBUTE REQ7 DESCRIPTION yes Name of the exam.
<!-- version Which version of the exam.
-->
<!- plain text description of the exam. —>
SUB-ELEMENTS
<mιnResolution> <attribute> <form>
<formRef; <formGroup>
NOTES
Every exam must contain at least one "form" or "formRef"
—> whether it be by itself or within a "formGroup" .
—>
<ElementType name="exam" content="mixed" model="closed"> <AttributeType name="versioπ" dt:type="string" required="yes" /> <attribute type="name" /> ottribute type="version" /> ottribute type="title" />
<element type="minResolution" minOccurs="0" maxOccurs="l" /> <element type="attribute" minOccurs="0" maxOccurs="*" /> - <group order="many" mιnOccurs="l" maxOccurs="*"> <element type="form" minOccurs="0" maxOccurs="*" /> < element type= "formRef" minOccurs="0" maxOccurs="*" /> <element type="formGroup" mιnOccurs="0" maxOccurs="*" /> </group> </ElementType>
— >
<!-
;form>
<!— ATTRIBUTES REQ? DESCRIPTION
<!— name yes Name of the form.
<!-- version no Version of the form.
<!-- title no A brief text description of the form. —> <!-- restartable no [true] (false) If form can be restarted. <!-- skipAllowed [true] (false) If items can be skipped. <!— beginSection no The section which officially begins the exam. <!-- endSection The section which officially ends the exam.
<!-- SUB-ELEMENTS
<!-- cscoring> <timer> <mmResol'
<report> — >
<!-- <results> eattributei <section> <sectionRef > —>
<!- NOTES
<!-- A orm is a ixed order of testing events . This means the order —>
<!-- of section, report, etc... tags occur will effect the delivery, —>
<!— - A form must contain at least one "section" (or
"sectionRef") . —>
<!-- - If beginSection or endSection are omitted, the first and last —>
<!-- sections listed m the form tag will be the begin and end
-->
< !- sections respectively . < !-
•b i ¬
<ElementType name="form" content="eltOnly" order="many" model="closed"> <AttributeType name="versϊon" dt:type="string" required="no" /> <AttributeType name="restartable" dt:type="enumeration" dt:values="true false" default="true" required="no" /> <AttributeType name="skipAllowed" dt:type="enumeration" dt:values="true false" default="true" required="no" /> <AttributeType name="beginSection" dt:type="string" required="no" /> <AttributeType name="endSection" dt:type="string" required="no" /> ottribute type="πame" /> ottribute type="version" /> ttribute type="title" /> Ottribute type="restartable" /> ottribute type="skipAllowed" /> ottribute type="beginSection" /> ottribute type= "endSection" />
<element type="scoriπg" minOccurs="l" maxOccurs="l" /> <element type="timer" minOccurs="l" maxOccurs="i" /> <element type="minResolution" minOccurs="0" maxOccurs="l" /> <element type="report" minOccurs="0" maxOccurs="*" /> <element type =" results" minOccurs="0" maxOccurs="*" /> <elemerιt type-"attribute" minOccurs="0" maxOccurs="*" /> - <group order="many" minOccurs="l" maxOccurs="*">
<eiement type="sectionRef" minOccurs="0" maxOccurs--"*" /> <element type="section" minOccurs="0" maxOccurs="*" /> </group>
</EIementType>
<!--
<!-- <formRe >
<!--
<!-- ATTRIBUTES REQ? DESCRIPTION
<!-- name yes The name of an already defined form. —>
<!--
<!-- NOTES
<!— - Allows for forms to be referenced in many places within an exam. —>
<!--
<!-
—>
<ElementType name= "formRef" content="empty"> ottribute type="name" /> </ElementType>
—>
< !--
< !-- < formGroup >
< !-
< !- ATTRIBUTES REQ? DESCRIPTION
< !— name yes The name of the form group. -->
<!--
<!— SUB-ELEMENTS
<!— <selection> <form> ;formRef5
<!-
<ElementType name="formGroup" content="eltOnIy" order="many" model="closed"> ttribute type="name" /> <element type="selection" minOccurs="l" maxOccurs="l" /> - <group order="many" minOccurs="i" maxOccurs="*">
<element type= "formRef" minOccurs="0" maxOccurs="*" /> <element type="form" miπOccurs="0" maxOccurs="*" /> </group>
</ElementType>
-->
< !-- <section>
<!— ATTRIBUTES REQ? DESCRIPTION
<!- yes The name of the section-
—>
<!-- title A brief text description of the section. —>
<!-- skipAllowed (true) (false) If items can be skipped. —>
<!--
<!- SUB-ELEMENTS <navigation> <timer> <grou > <groupRef> --;
<start> <finish>
<condition <attr ibute=>
<!-- <section> <sectionRef>
<categoryRef> <!-- <!-- NOTES
<!-- This construct dictates the navigation and Selection of items —> as well as displays.
A section with a condition script is run if evaluated to
" rue" . <!-- One and only one group or groupRef must exist for each section.
<!--
— >
<ElementType name="section" content="eItOnly" order="many" model="closed"> <AttributeType name="skipAIIowed" dt:type="enumeration" dt:values="true false" requιred="no" /> ottribute type="name" /> ottribute type="title" /> ottribute type="skjpAllowed" />
<element type="navigatϊon" minOccurs="l" maxOccurs="l" /> <element type="timer" minOccurs="0" maxOccurs ="i" /> olement type="start" minOccurs="0" maxOccurs="l" (> <element type="finish" minOccurs="0" maxOccurs="l" /> <element type="condition" minOccurs="0" maxOccurs="l" /> <element type="attribute" minOccurs="0" maxOccurs="*" /> <element type = "categoryRef" minOccurs="0" maxOccurs="*" />
- <group order="one" mιnOccurs="l" maxOccurs="l">
<element type="group" minOccurs="0" maxOccurs="l" /> <element type="groupRef" minOccurs="0" maxOccurs="l" /> </group>
</ElementType>
<!--
— >
<!-
<!-- <sectιonRef>
<!—
<!-- ATTRIBUTES REQ? DESCRIPTION
<!~ name yes The name of an already defined section. -->
<!--
< ! — NOTES
<!-- Allows for a section to be referenced in many places . — > <!-- <!--
<ElementType name="sectionRef" content="empty"> ottribute type="name" /> </ElementType> <!--
< ! — <group>
< ! — ATTRIBUTES REQ? DESCRIPTION
< ! — name yes The name of the group.
<!--   title       A brief text description of the group. -->
<!--   template    A template to use for presentations. -->
<!— SUB-ELEMENTS <!— <attribute> <grou > <groupRef> <presentation> —>
<!-- <scoring> <section>
<sectionRef> <selection —>
<!--
< ! — NOTES
<!-- - If a section or sectionRef exists in a group, then no sub groups —>
< !— or presentations may exist .
<!--
<!--
<ElementType name="group" content="eltOnly" order="many" model="closed"> ottribute type="name" /> ottribute type="title" /> ottribute type="template" />
<elemeπt type="selection" minOccurs="l" maxOccurs="l" />
<element type="scoring" minOccurs="0" maxOccurs="l" />
<element type="attribute" minOccurs="0" maxOccurs="*" /> - <group order="many" minOccurs="l" maxOccurs="*">
<element type="group" minOccurs="0" maxOccurs="*" /> <element type="groupRef" minOccurs="0" maxOccurs="*" /> < element type=" presentation" minOccurs="0" maxOccurs="*" /> <element type="section" minOccurs="0" maxOccurs="*" /> <element type="sectionRef" minOccurs="0" maxOccurs="*" />
</group> </ElementType>
< ! --
< !-
<!-- <groupRef>
<!— ATTRIBUTES REQ? DESCRIPTION
<!-- name yes The name of an already defined group. -->
<!-
<!— NOTES
<!— - Allows for a group to be referenced in many places.
—>
<!--
<ElementType name= "groupRef" content= "empty" > ottribute type="name" /> </ElementType> ->
<!
<! <navigation>
<!
<! .-- ATTRIBUTES REQ? DESCRIPTION
<! plugin yes The name of the navigation plug-in to use . —>
<! —
<!— SUB-ELEMENTS <!-- <data>
<!— NOTES
<!- Within this tag can exist any initializacion data for the
—>
<!- navigation plug-in.
If the name of the plug-in is "transparent", the section should -->
<!-- be evaluated by a drive before it's parent, and then use whatevei —>
<!-- navigation the parent section uses.
—>
<!--
—>
<ElementType name="navigation" content=" ixed" model="open"> <AttributeType name="plugin" dt:type="string" required="yes" /> ottribute type="plugin" /> <element type="data" minOccurs="0" maxOccurs="l" />
</ElementType>
<!--
<!--
<!- <selection>
<!--
<!- ATTRIBUTES REQ? DESCRIPTION
<!-- pliigin yes The name of the navigation plug-m to use.
<!--
NOTES < !-- Within this tag can exist any initialization data for the
—> <!-- selection plug-in. < !-- This is used fo sections, and for formGroups.
—> < !--
— >
<E!ementType name="selection" content=" mixed" model="open"> <AttrιbuteType name="plugin" dt:type="string" required="yes" /> ottribute type="plugin" /> <element type="data" minOccurs="0" maxOccurs="_L" />
</ElementType>
< !--
<!--
< !— <ιtem>
<!- ATTRIBUTES REQ? DESCRIPTION <!— name yes The name of the item.
— >
<!-- template The template in which the item will appear, -->
<!-- area The template ' s area an which the item will appear,
<!--   title       A text description of the item. -->
<!— weight no [1. 0] A value used for judging it scoring. —>
<l~ scored no [true] (false) If the item is scored. —>
<!— skipAllowed no (true) (false) If an item can be skipped. —>
< !-- SUB -ELEMENTS
<!-- <data> <σategoryRef > <start> <f ιnιsh>
< !-- <condition> <attribute>
< !--
< !-- NOTES
<!— - An item is the smallest piece of an exam It contains data for a -->
< !-- plug- in to render in a particular "area" of the screen, as well — >
<!— as information on how it should be scored according to the way m <!-- which the candidate interacts with it
-->
< i-
<!-
<ElementType name="item" content="eltOnIy" order="many" model="open"> <AttrιbuteType name="area" dt:type="strϊng" requιred="no" /> <AttπbuteType name=' weight" dt:type=' string" requιred="no" /> <AttπbuteType name="scored" dt:type="enumeratϊon" dt:values="true false" default="true" requιred="no" /> <AttπbuteType name="skipAllowed" dt:type="enumeration" dt:values="true false" requιred="no" /> ottribute type="name" /> ttribute type="title" /> ottribute type="template" /> ottribute type="area" /> ottribute type = "weight" /> ottribute type="scored" /> ottribute type="skipAllowed" />
<element type="data" minOccurs="i" maxOccurs="l" /> <element type = "categoryRef" mιnOccurs="0" maxOccurs="*" /> <element type="start" mιnOccurs="0" maxOccurs="l" /> <element type="finish" mιnOccurs="0" maxOccurs="l" /> <eiement type="condition" mιnOccurs="0" maxOccurs="l" /> <element type="attrϊbute" mιπOccurs="0" maxOccurs="*" />
</ElementType>
<!-
< ι-
<!— <category>
<i-
<l— ATTRIBUTES REQ" DESCRIPTION
<l— name yes The name of the category
—>
<l— duplicates no [true] (false) hether an item can appear more —>
<ι~ than once within it or it's sub-categories -->
<•— complete [false] (true) Whether or not every item must --> appear within it or it's sub-categories .
<l~ cortents no [anything] (items, sections, categories)
<l Determines what can exist witnin a category. If —>
<!-- "anything", there are no restrictions If "items"
<!-- then no sections can exist. If "sections" then no items can exist If "categories", only —> sub-categories can exist directly off of this --> <!-- category.
<!--
< !-- SUB -ELEMENTS
<!— <descπptιon> <category> <categoryRef>
<scormg <!--
<!- NOTES
<!-- - The primary purpose of a category is to provide a grouping -->
<!-- mechanism which is independent of delivery, allowing for exotic —>
<!— reporting and scoring (if necessary)
-->
<!-
<!--
<ElementType name="category" content="eltOnly" order="many" model="closed"> <AttπbuteType name=' duplicates" dt:type="enumeration" dt:values="true false" default="true" requιred="no" /> <AttrιbuteType name="complete" dt:type="enuιtιeration" dt:values="true false" default="false" requιred="no" /> <AttrιbuteType name="contents" dt:type="enumeration" dt,values="anything items sections categories" default="anything" required="no" /> ottribute type="name" /> ottribute type="duplicates" /> ottribute type="complete" /> ottribute type="contents" />
<elemerιt type="category" mιnOccurs="0" maxOccurs="*" /> <elemeπt type= "categoryRef" mιnOccurs="0" maxOccurs="*" /> olement type="description" mιnOccurε="0" maxOccurs="l" /> <element type-"scoring" mιnOccurs="0" maxOccurs="l" /> </ElementType>
<!--
<!-
<!-- categoryRef> <!-- <!-- ATTRIBUTES REQ? DESCRIPTION yes The name of a defined category. <!- NOTES <!-- Allows for a category to be referenced in many places.
— >
<!--
<ElementType name="categoryRef" conterιt="empty"> ottribute type="name" /> </ElementType>
—>
< ! —
<!— <description
<!--
<!— NOTES
<!-- Used within a category to contain a description of it ' s contents . —> <!-- May contain plain text , or HTML .
<!--
-->
<ElementType name="description" content= "mixed" model="open" />
^templates
< !--
< !-- ATTRIBUTES RED? DESCRIPTION
< ! — name yes The name of the template .
<!— split yes (rows, cols) Defines how the current area is —>
<!— split, either by rows or columns . —>
<!— size no A value describing the size (relative in cols or — >
<!-- rows) of a (sub- template ' s) area in pixels ,
<!-- percentages, or
HTML's "*" frameset syntax. —> size="none" if the area is to not be bounded to the template and
"floats" . <!--
SUB-ELEMENTS
NOTES
"SiThe "top most" template is given a name by the test- designer and --> <!-- is referenced throughout, the exam.
—>
<!-- If a complicated template is to be created, and required the —>
<!-- nesting of templates within other templates, only the top most —>
<!-- most template is given a name. The compiler will supply names —>
<!-- for the other "sub-templates".
—>
<!- A "*" in size is a wildcard which represents whatever free space —>
<!-- has not already been specifically specified in percentages or —>
<!-- oixels.
<!-- Specifying a number before "*" is the ratio of free space : —> <!-- "1*" = one part of what is left (same as just
—>
<!-- "2*" - twice as much free space as a "l*" area
—> <!-- "3*" etc . <!-- <!--
— >
<ElementType name="template" content="eltOnly" order="many" model="closed">
<AttributeType name="split" dt:type="enumeration" required="yes" dt:values="rows cols" />
<AttributeType name="size" dt:type="string" required="no" /> ottribute type="name" /> ottribute type="split" /> ottribute type="size" />
<element type="area" minOccurs="0" maxOccurs="*" />
<element type="template" minOccurs="0" maxOccurs="*" /> </ElementType>
< !-
<!-- <!— <area>
<!— ATTRIBUTES REQ' DESCRIPTION
< !— name yes A label for the area in a given template. —>
<!— plugin yes A pluqin to be used with that area —>
<!— size yes A value describing the size
(relative in cols or -->
<!-- rows) of an area m pixels, percentages, or HTML's —>
<!-- "*" frameset syntax. size- "none" if the area is —>
<!-- to not be bounded to the template and "floats". —>
<!— allowMore no [false] (true) A boolean for the
"More" button. -->
<!-- If True the More button will be enabled to the —>
<l~ candidate for displaymg the presentation area to
<!- the candidate.
<!--
<!-- SUB-ELEMENTS
<!— <data>
<!--
<!-- NOTES
<!— - A "*" in size is a wildcard which represents whatever free space -->
<!— has not already been specifically specified in percentages or —>
<!-- pixels.
<ι~ Specifying a number before "*" is the ratio of free space . —> <!-- "1*" = one part of what is left (same as just
—>
<!-- "2*" = twice as much free space as a «ι*« area
—> <!-- "3*" = etc.. <!--
- <ElementType name="area" corιtent="eltOnly" order="many' model="closed"> <AttributeType name="plugin" dt:type="string" required="yes" /> <AttributeType name="sϊze" dt:type="string" required="yes" /> <AttributeType name="allowMore" dt:type="enumeration" dt:values="true false" default= "false" required="no" /> ottribute type="name" /> ottribute type="plugin" /> ottribute type="size" /> ottribute type="allowMore" />
<element type="data" minOccurs="0" maxOccurs="l" /> </ElementType>
<!-- <!-- <presentation>
<!-- ATTRIBUTES REQ? DESCRIPTION
<!-- name A name of the presentation.
<!— title A text description of the presentation.
<!— review no [false] (true) If the presentation sbxmld be used
<!-- a review screen.
<!-- template no The name of a template to use for the presentation -->
<!-- rather than using the section's current template. —>
<!-- area no Which area the presentation is targeting. —>
<!— item no A reference to the name of the item which will be —>
<!-- used for the presentation.
<!-- display no A reference to the name of the data which will be -->
<!-- used for the presentation .
<!— counted no [true] (false) If this is numbered in the test -->
<!-- driver's list being at "X of Y" presentations. —>
<!-
<!— SUB-ELEMENTS
<!-- <presentation> <data <start>
<finish> <!-- <conditlon> <attribute>
<!- < l ~ NOTES
A presentation is what tells the test driver what data
(such as —> < !- an item) should be presentated on the screen at a given time —> < !-- The name is optional as most presentations won't have them . The —> < ι~ compiler will generate a name base on the contents of the
—>
<!-- presentation The "name" attribute allows for test designers to —>
<!-- to give a specific name to facilitate special behaviors for
<!- navi ation and/or scoring.
<ι~ If multiple areas need to be updated at the same t me,
—>
<!-- presentation tags can be nested. The outer most presentation —> <ι~ tags contains no attributes, while the inner set of presentation —> <ι~ tags makes references to the areas to be updated.
—>
<!- A "data" tag can exist 'inline", to create a display or define —> <l- a helm Another options is to
Figure imgf000061_0001
the "data" attribute to point —>
<l- at existing data, already defined.
—>
<ι~ <ι~
<ElementType name="presentation" content="mixed"> <AttributeType name="name" dt:type="string" required="no" /> <AttrιbuteType name="revϊew" dt:type="enumeratlon" dt:values="true false" default="false" required="no" /> <AttrιbuteType name="area" dt:type="string" requιred="no" /> <AttributeType name="item" dt:type="string" required="no" /> <AttrιbuteType name="data" dt:type-"string" requιred="no" /> <AttributeType name="counted" dt:type="enumeration" dt:values="true false" default="true" requιred="no" /> ottribute type="name" /> ottribute type="tϊtle ' /> ottribute type="revϊe " /> ottribute type="template" /> ottribute type="area" /> ottribute type="item" /> ottribute type="data" /> ottribute type="counted" />
<element type="presentation" minOccurs="0" maxOccurs="*" /> <element type="data" mιnOccurs="0" maxOccurs="l" /> <element type= "start" mmOccurs="0" maxOccurs="l" /> <element type="finish" mιnOccurs="0" maxOccurs="l" /> <eiemeπt type="condition" minOccurs="0" maxOccurs="l" /> olement type="attribute" minOccurs="0" maxOccurs="*" /> </ElementType> <!--
— >
<!--
<!-- <script>
<!--
<!-- ATTRIBUTES REQ? DESCRIPTION
<!-- type no [vbscript] (3 script) The type of scripting —>
<!-- language used within the script block. -->
<!--
<!-- NOTES
<!-- A tag which is used to define scripts which exist within the —>
<!-- global level of the resource file.
—>
<!--
<!-
<EiementType name="script" content="mixed" mode!="open"> <AttributeType name="type" dt:type="enumeration" dt:values= "vbscript jscript" default="vbscript" required="no" /> ottribute type="type" />
</ElementType>
<!-
— >
<!--
<!- <start >
<!-
<!- NOTES
<!- - Script execution at the beginning of a delivery entity. — >
<!--
<!-
<ElementType rιame="start" content="mixed" model="open" /> <!--
< !--
< !— <finish> NOTES
Script execution at the end of a delivery entity.
—>
<ElementType name="fϊnish" content="mixed" model="open" /> < !--
< !--
<!-- <condition>
<!--
<!— NOTES
<!-- - "Condition" ensures execution of the entity it modifies i£ the —>
<!-- script evaluates to a true value .
—>
<ElementType name="condition" content=" mixed" model="open" />
<!--
—>
<!--
<!— <plugin>
<!--
<!-- ATTRIBUTES
<!- name yes A label for the plugin.
<!- progid yes Program ID of the plug-in.
—>
<!-
<!-- SUB-ELEMENTS
<!- <data>
<!-
<!- NOTES
<!- T Teesstt drivers will support "plug-ins" to handle scoring, item -. -> <!-- delivery, navigation, and many other functions which were
—> <!-- traditionally hardcoded into the test driver itself .
—>
<!-- To us a plug in, it must be given a name to use within the XXL —> <!- files, and point to a "progid" which is the Microsoft format of —> <!-- labeling COM objects. Then throughout the XXL, this plugin is —> <!-- referenced by the name it was given here.
—>
<!- Initialization data can exist within the optional "data" tag. —> <!--
<!-
— >
<ElementType name="plugin" content="mixed" order="many" model="closed">
<AttributeType name="progid" dt:type="string" required ="yes" /> ottribute type="name" f> ottribute type="progid" /> olement type="data" minOccurs="0" maxOccurs="l" /> </ElementType> <!—
< !— <data> < !-
<!— ATTRIBUTES REQ? DESCRIPTION
<!— name no Identifier for the data . —>
<!— URI no URI for an external piece of data. —>
<!— keepExtemal no [false] (true) Whether or not to keep the file -->
<!-- external from the resource file. —>
<!-
<!- SUB-ELEMENTS <!- n/a <!- <!- NOTES
Tag must have name if it ' s defined at the top most level in file. —> <!-- <!-
— >
<ElementType name="data" content="mixed" model="open"> <AttrιbuteType name="name" dt:type="id" required="no" /> <AttrιbuteType name="URI" dt:type="strϊng" requιred="no" /> <AttπbuteType name="keepExternal" dt:type="enumeration" dt:values="true false" default="false" required="no" /> ottribute type="name" /> ottribute type="URI" /> ottribute type="keepExternal" />
</ElementType>
<!--
< i-
<!-- <msgBox>
<!— ATTRIBUTES REQ' DESCRIPTION
<!— oκ no [OK] The text appearing on the standard dialog —> <!-- button, usually marked as "OK" . —>
<!-- cancel no [Cancel] The text appearing on the standard —>
<!-- dialog button, usually marked as "Cancel". — >
<!-- abort no [Abort] The text appearing on the standard —>
<!-- dialog button, usually marked as "Abort". — >
<!— retry no [Retry] The text appearing on the standard — >
<!- dialog buttor, usually marked as "Retry". —>
<!— ignore no [Ignore] The text appearing on the standard — >
<!-- dialog button, usually marked as "Ignore". —>
<!— yes no [Yes] The text appearing on the standard —>
<l~ dialog button, u&ually marked as "Yes". —>
<!— no [No] The text appearing on the standard
<!-- dialog button, usually marked as "No" . —>
<!— title no [] The default text which appears m the title —>
<!- bar of message boxes .
<!— SUB-ELEMENTS <!— <boxSize> buttonSize>
<!— NOTES
There are various places where a message box may appear.
This —> <!-- tag (and it's children) describe the properties of such a
<!-- message box, and it's buttons.
<!- If no message box is declared by link time, it's up to linker —> <!-- to add these properties to the resource file with their defaults . —>
<!--
—>
<ElementType name="msgBox" content="eltOnIy" model="closed"> <AttributeType name="OK" dt:type="string" default="OK" required="no" /> <AttributeType name="cancel" dt:type="strϊng" default="Cancel" required="no" /> <AttributeType name="abort" dt:type="string" default="Abort" required="πo" /> <AttributeType name="retry" dt:type="string" default="Retry" required="no" /> <AttributeType name="ignore" dt:type="strϊng" default="Ignore" required="no" /> <AttributeType name="yes" dt:type="string" default="Yes" required="no" /> <AttribtιteType name="no" dt:type="string" defauit="No" required="πo" /> <AttributeType πame="title" dt:type="string" default="" requlred="no" /> ottribute type="OK" /> ottribute type="cancel" /> ottribute type="abort" /> ottribute type="retry" /> ottribute type="ignore" /> ottribute type="yes" /> ottribute type="no" /> ottribute type="title" /> olement type="boxSize" minOccurs="l" maxOccurs="l" /> olement type="buttonSize" minOccurs="l" maxOccurs="l" /> </ElementType> <!-
<!--
<!— <boxSize>
<!-
<!- ATTRIBUTES REQ? DESCRIPTION <!-- width yes The width of the dialog box.
<!— height yes The height of the dialog box.
<!--
<!--
—>
<ElementType name="boxSize" content="empty"> <AttributeType name="width" dt:type="string" /> <AttributeType name="heig t" dt:type="string" /> ottribute type="width" /> ottribute type="height" />
</ElementType>
—>
<!-
<!— <buttonSize>
<!— ATTRIBUTES REQ? DESCRIPTION
<!— width yes The width of the dialog box's buttons. —> <!— height yes The height of the dialog bo ' s buttons . —> <!--
<!-
<EIementType rιame="buttonSize" content=" empty" > <AttπbuteType name= "width" dt:type="string" /> <AttπbuteType name="height" dt:type="string" /> ottribute type="width" /> ottribute type="height" />
</ElementType>
<!-
<!--
<!- <attribute>
<!-
<!-- ATTRIBUTES REQ? DESCRIPTION
<!- name yes The name of the attribute .
—>
<!-- type yes The (variable) type of the attribute
<!- value yes Default value of the attribute. —>
NOTES < ι~ The purpose of attributes is to allow for data to be specif ed —> <!-- for testing constructs which exist at another scope.
—>
<!-- The type attribute can be one of the following:
—>
<"-- Type Description
<!--
<!-- CHAR character (-127 to 128)
<!-- UCHAR unsigned character (0 to 255)
—>
<!-- SHORT two byte number ( -32,767 to
32 , 768 ) —> USHORT unsigned two byte number (0 to 65535;
<!-- LONG four byte number (-2,147,483,647 to 2 , 147 , 483 , 548 ) —>
<!- ULONG unsigned four byte number (0 to
4 , 294 , 9S7 , 29S )
< ! -- INT integer (platform specific, on m98 = four bytes)
< l ~ FLOAT a four byte floating point number
<!-- DOUBLE an eight byte floating point number
<!-- BSTR a "B" string BOOL a boolean (actually it's a
VT_BOOL) —>
<!-- SCODE a VT_ERROR <!-- LPSTR long pointer to a atπng
<!-- LPWSTR long pointer to a wide string (16-pit characters) <ι~
<E!ementType name="attribute" content="empty" model="closed"> <AttrιbuteType name="type" dt:type="string" required="yes" /> <AttrιbuteType name="value" dt:type="string" required="yes" /> ottribute type="name" /> ottribute type="type" /> ottribute type="value" />
</ElementType>
<!-
<ι~
<!— <minResolution> <!-
<!- ATTRIBUTES REQ? DESCRIPTION
<!— width yes The minimum width, in pixels, of the screen.
<!— height yes The minimum height, in pixels, of the sπreen. —>
<!-- bitDepth no [24] (8, 16, 32) The minimum # of bits per pixel,
<!-- sometimes referred to as the coloi depth. —>
<!-- SUB-ELEMENTS
<!-- n/a
<!--
<!-- NOTES
<!-- M Miinnii]mum resolution a monitor & video card should be in
<!-- C Coolloo:rs supported with different bit depths:
<!- BITS COLORS NAME
<!~ a 256 palette or gray scale
<!-- 16 65 ,535 high color
<!-- 24 16,777,216 true color
<!-- 32 16,777,216 true color
<!-
<!--
<ElementType name="min Resolution" content="empty"> <AttributeType name="width" dt:type="int" requιred="yes" /> <AttributeType name="height" dt:type="int" required="yes" /> <AttributeType name="bitDepth" dt:type="ϊnt" required="no" /> ottribute type="width" /> <attπbute type="height" /> ttribute type="bϊtDepth" />
</ElemeπtType>
< !-
< !- < !- <resultsϊ <!-
<!-- ATTRIBUTES REQ? D DESCRIPTION
<!-- plugin yyeess TT!he results plug-in to be used at this
(taq ' s) —>
<!-- point in time.
<!--
<!-- SUB-ELEMENTS
<!-- <data> <message> <condιtιon>
<!-- NOTES
<!-- Output results file
<!— If a condition script exists, the results will only be executed <!-- should it evaluate to true
—>
— >
<ElementType name="results" content="mixed" model="open"> <AttrιbuteType name="plugin" dt:type="strϊng" requιred="yes" /> ottribute type="plugin" /> olement type="data" minOccurs="0" maxOccurs="l" /> olement type="message" minOccurs="0" maxOccurs="l" /> olement type="condition" mmOccurs="0" maxOccurs="l" />
</ElemenfType>
<!-
<!-
<!- <report>
<!-
<!- ATTRIBUTES REQ? D DESCRIPTION
<!-- plugin yes T The reportmq plug-m to be used at this
(tag' s) ->
<!- point m time
<!-
<!-- SUB-ELEMENTS
<!- <data> <mes age> :condιtion>
<!-- < ! -- NOTES
Prints (current ) score report .
— >
< ! - If a condition script exists, the report will only be executed should it evaluate to true .
<!--
<ElementType name="report" content="mixed" rnodel="open"> <AttributeType name="plugin" dt:type="string" required="yes" /> ottribute type="plugin" /> olement type="data" minOccurs="0" maxOccurs="l" /> olement type= "message" minOccurs="0" maxOccurs="l" /> olement type="condition" minOccurs="0" maxOccurs="l" />
</ElementType>
<!-
<!-
<!— <scoring>
<!— ATTRIBUTES RED? DESCRIPTION
<!-- plugin yes The scoring plug-in to be used at this scope.
<!— SUB-ELEMENTS
<!-- <data>
<!--
<!-- NOTES
<!-- - Scores the exam's content.
—>
<!--
< !-
—>
<ElementType name="scoring" content="mixed" model="open"> <AttributeType name="plugin" dt:type="string" required="yes" /> ottribute type="plugϊn" /> olement type="data" minOccurs="0" maxOccurs="l" />
</ElementType>
<!-
<!- <tιmer>
< !--
ATTRIBUTES REQ? DESCRIPTION
<!— plugin yes The timer plug-in to be used at this scope . — >
< !— SUB-ELEMENTS
<!— <data>
<!--
<!- NOTES
<!- - Handles how the form/section are timed.
<!-
<!--
<ElementType name="timer" content="mixed" model="open"> <AttributeType name="plugϊn" dt:type="string" required="yes" /> ottribute type="plugin" /> olement type="data" minOccurs="0" maxOccurs="l" />
</ElementType>
<!-
<!--
<!- <message>
<!-
<!-- ATTRIBUTES
<!- n/a
<!--
<!-- SUB-ELEMENTS
<!- n/a
<!--
<!-- NOTES <!-- Contains text to display to the candidate. <!-- < !-
—>
<ElementType name="message" content="mixed" model="open" />
< !— [xmlResults-schema . xml] -->
<!--
<!--
<!-- <xmlResults>
<!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!— reportCategory no Top level category containing sub-categories —>
<!-- to report against. If there are any items —>
<!-- directly off the top level category, they are —>
<!-- ignored.
Section listed in any categoiy
<!- below the reportCategory are ignored.
<!-- filterCategory no The name of a category which contains sections. —>
<!-- Only these sections are reported against in the
<!-- results file.
<!— history no If true the plugin will write the history in
<!-- the result file.
<!-- resultFileName no The name of the XMLResults output file. —>
<!— A default will be used if left blank. —>
<!-- delete no When true, the candidate file will be deleted from the candidate —>
<!— directory once the file has moved to into the transmit queue. -->
<!-- If field is blank, it will default to false.
<!-- SUB-ELEMENTS
<!-- none
<!--
<!~ NOTES
<!— - With no options, the XML result file is ordered in the same way —>
<!— the exam was delivered. <!- When reportCategory is used, the driver will expect that sub-categories exist directly below the "report category" . —> <!— Within these sub-categories exist items to be reported against . —>
When filterCategory is used, a category filled with sections is --> expected. Only those sections in the "filter category" will be —>
<!-- included in the results file. This is useful for multi- day
<!-- exams, where one category contains day 1 sections, another —>
<!-- contains day 2 sections, etc
<!-
<!--
—>
<ElementType name="xmlResults" content= "empty" >
<AttributeType name="reportCategory" dt:type="string" required="no" />
<AttributeType name="f ilterCategory" dt:type="string" required="no" />
<AttrιbuteType name=" history" dt:type="enumeration" dt:values="true false" default="false" required="no" />
<AttributeType name="resultFileName" dt:type="string" required="no" />
<AttrιbuteType name="delete" dt:type="enumeratϊon" dt:values="true false" default="false" required="no" /> ottribute type="reportCategory" /> ottribute type="filterCategory" /> ottribute type="history" /> ottribute type="resultFileName" /> ottribute type="delete" /> </ElementType>
< !— [video_response-schema .xml] -- >
<!-- edited with XML Spy v3.5 NT (http://www.xmlspy.com) by Anjali (Prometric Inc.) -->
< !-- Setup the schema so by default we are in the Microsoft namespace . — >
< !— Then create a namespace called "dt " and associate it with the set -- >
< !— of data types Microsoft supports
--> <!--
<!— <videoResponseCheatData>
—> <!--
—>
<!— ELEMENT REQ? DESCRIPTION
—>
<!-- tooMany no [on] Element defining maximum responses before candidate is considered cheating —> <!— rapidResponse no [on] Element defining rapid response cheat detection —>
<!— pattern no [on] Element defining pattern response cheat detection <!--
<ElementType name="videoResponseCheatData" order="many" content="eltOnly" model="closed">
<element maxOccurs="l" minOccurs="0" type="too any" /> <element maxOccurs="l" minOccurs="0" type="rapidResponse" /> <element maxOccurs="l" minOccurs="0" type="pattern" />
</ElementType>
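For illustration, a videoResponseCheatData element might combine the three sub-elements defined below; the threshold values shown are examples only and are not mandated by the schema:
<videoResponseCheatData>
 <tooMany maxResponsesPerClip="50" enabled="true" />
 <rapidResponse enabled="true" speed=".2" maxViolations="2" />
 <pattern enabled="true" variance=".1" maxViolations="5" />
</videoResponseCheatData>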
<!--
<!-
<!-- <tooMany> -->
< !- — >
< !-- ELEMENT REQ? DESCRIPTION
— >
< !-- maxResponsesPerClip [SO ] Maximum number of responses allowed — >
< !— enabled [true] Turns this cheat detection on/off — >
< !--
<ElementType name="tooMany" order="one" content="empty" model="closed"> <AttributeType name="maxResponsesPerClip" dt:type="int" required="no" /> <AttributeType name="enabled" dt:type="βnumeration" requιred="no" dt:vaiues="true false" /> ottribute type="maxResponsesPerClϊp" /> ottribute type="enabled" /> </ElementType>
< !-
< !-- <pattern>
-- > < !-
-- >
< !— ATTRIBUTE REQ? DESCRIPTION
< !— enabled no [true] Turns this cheat detection on/of f — >
< !-- variance no [ . 1] The time difference between response times — >
< !— maxViolations no [5 ] The amount of violations before cheating is detected — >
< !--
<ElementType name="pattern" order="one" content="empty" model="closed"> <AttributeType name="enabled" dt:type="enumeration" required="no" dt;values="true false" /> <AttπbuteType name="maxViolations" dt:type="int" required="no" /> <AttributeType name="variance" dt:type="float" required="no" /> ottribute type="enabled" /> ottribute type="maxViolations" /> ottribute type= "variance" /> </ElementType> <!--
<!— <rapidResponse> ->
<!--
—>
<!— ATTRIBUTE REQ? DESCRIPTION
—>
<!— enabled [true] Turns this cheat detection on/off — >
<!-- speed no [.2] The amount of time a candidate must wait before responding again, default is .2 in code -->
<!— maxViolations no [2] The amount of violations before cheating is detected, default is 2 in code
< !--
<ElementType name="rapidResponse" order="one" content="empty" model="closed">
<AttributeType name="enabled" dt:type="enumeratϊon" required="no" dt:values="true false" />
<AttributeType name="maxVioIatioπs" dt:type="int" required="no" />
<AttributeType name="speed" dt:type="float" required="no" /> ottribute type="enabled" /> ottribute type="maxViolations" /> ottribute type="speed" /> </ElementType> <!--
< !— <videoResponseConf iguration>
—> <!--
—>
<!— ATTRIBUTE REQ? DESCRIPTION
—>
<!-- inappropriateMessagePosition no [end] When to display the message informing the candidate of cheating -->
<!-- cheatScore no [0] The score a candidate receives for a clip if cheating is detected -->
<!-- inappropriateResponseMessage no [You responded to this clip in an unacceptable manner. You will score -->
<!-- 0 for this clip. You will be moved on in !time seconds.] -->
<!-- The message to display if cheating is detected -->
<!-- inappropriateResponseMessageTimeout no The time in seconds to display the cheat warning dialog -->
<!-- responseIndicatorBlinkAmount no [4] The number of times to blink (off then on) the indicator icon -->
<!-- responseIndicatorBlinkDelay no [100] The time in milliseconds between indicator icon blinks -->
<!-- clipDelay no [10] The amount of time in seconds to show the clip delay dialog -->
<!-- clipDelayButtonCaption no [_Continue] The message for the clip delay dialog, blank caption to hide -->
<!-- clipDelayMessage no [Your clip will start automatically in (!time) seconds. Press the -->
<!-- !DefaultButton button to start your clip now.] -->
<!-- The caption of the clip delay dialog (keywords: !time = seconds -->
<!-- remaining, !DefaultButton = caption of the default button, and -->
<!-- !NonDefaultButton for the caption of the secondary button) -->
<!-- responseIndicatorIcon no [{flag icon}] the icon to represent a response on the indicator bar -->
<!-- streachVideo no [false] stretch the video to fit the window (may decrease performance) -->
<!-- allowableFrameRateOverrun no [1] the amount of frames/second allowed over the encoded rate before -->
<!-- allowableFrameRateUnderrun no [2] the amount of frames/second allowed under the encoded rate before -->
<!-- the test is shut down -->
<!-- allowableJitterMax no [20] the amount of jitter allowed (frame display deviation) before the test is -->
<!-- shut down -->
<!-- serviceLevelErrorMessage no [A system error has occurred and your event has been halted to preserve -->
<!-- the integrity of your test. Please contact your test center -->
<!-- administrator for assistance.] -->
<!-- The message to display if the video performance falls below the -->
<!-- allowed jitter and frame rate -->
<!--
<ElementType πame="videoResponseConfiguration" order="one" content="empty" model="closed"> <AttributeType name="inappropriateMessagePosition" dt;type="enumeration" required="no" dt:values="none end immediate" /> <AttributeType name="cheatScore" dt:type="string" required="no" /> <AttributeType name="inappropriateResponseMessage" dt:type="string" required="no" /> <AttributeType name-"inappropriateResponseMessageTimeout" dt;type="int" required="no" /> <AttributeType name="responseIndicatorBlinkAmount" dt:type="int" required="no" /> <AttributeType name="responseIndicatorBlinkDelay" dt:type="int" required="no" /> <AttributeType name="clipDelay" dt:type="int" required="no" /> <AttributeType name="clϊpDelayMessage" dt:type="string" requlred="no" /> <AttributeType name="clipDelayButtonCaption" dt;type="string" required="no" /> <AttributeType name="responseIndicatorIcon" dt:type-"string" required="no" /> <AttributeType name="streachVideo" dt:type="enumeration" required="no" dt:values="true false" /> <AttributeType name="aIlo ableFrameRateOverrun" dt:type="int" required="no" /> <AttributeType name="allowableFrameRateUnderrun" dt:type="int" requιred="no" /> <AttributeType name="allowableJitter ax" dt:type="int" required="no" /> <AttributeType name="serviceLevelErrorMessage" dt:type= "string" requιred="no" /> ottribute type="responseIndicatorBlin Amount" /> ottribute type="responseIndicatorBlinkDelay" /> ottribute type="clipDelay" /> ottribute type= clipDelayMessage" /> ottribute type="clipDelayButtonCaption" /> ottribute type="responseIndicatorIcon" /> ottribute type="inappropriateMessagePositϊon" /> ottribute type="cheatScore" /> ottribute type="inappropriateResponseMessage" /> ottribute type="inappropriateResponseMessageTimeout" /> ottribute type="allo ableFrameRateOverrun" /> ottribute type="allowableFrameRateUnderrun" /> ottribute type="allowableJitterMax" /> ottribute type="serviceLevelErrorMessage" /> ottribute type="streachVideo" /> </ElementType> <!--
<!-- <videoResponseTutorialConf ιguration>
— > <!--
— >
<!— ATTRIBUTE REQ? DESCRIPTION
-->
<!-- replayCount no [1] The number of times the candidate can replay the tutorial video -->
<!-- replayPrompt no [Press the 'Practice' button to play the clip again, or the 'Continue' -->
<!-- button to proceed with the test.] (keywords: !time = seconds -->
<!-- remaining, !DefaultButton = caption of the default button, and -->
<!-- !NonDefaultButton = the caption of the secondary button) -->
<!-- responseIndicatorIcon no [{flag icon}] the icon to represent a response on the indicator bar -->
<!-- The prompt text for replaying the video -->
<!-- practiceFrameStart no [-1] The frame in the video to show the clip delay dialog -->
<!-- This will be used after the instructions and before the sample content -->
<!-- in the tutorial video to simulate a real clip look & feel. -->
<!-- replayPromptStart no [-1] The point (frame) in the video to display the replay prompt -->
<!-- replayDelayButtonCaption no [_Practice] The caption of the replay button, a blank caption hides the button -->
<!-- replayDelay no [20] The number of seconds to display the replay dialog -->
<!-- continueButtonCaption no The caption of the continue button, a blank caption hides the button -->
<!--
<ElementType name="videoResponseTutorialConfiguration" order="one" content="empty" model="closed">
<AttributeType name="replayCount" dt:type="int" required="no" />
<AttributeType name="replayPrompt" dt:type="string" required="no" />
<AttributeType name="practiceFrameStart" dt:type="int" required="no" />
<AttributeType name="replayPromptStart" dt:type="int" required="no" />
<AttributeType name="replayDelayButtonCaption" dt:type="string" required="no" />
<AttributeType name="continueButtonCaption" dt:type="string" required="no" />
<AttributeType name="replayDelay" dt:type="int" required="no" /> ottribute type="replayCount" /> ottribute type="replayPrompt" /> ottribute type="practϊceFrameStart" /> ottribute type="replayPromptStart" /> ottribute type="replayDelayButtonCaption" /> ottribute type="continueButtonCaption" /> ottribute type="replayDelay" /> </ElementType> <!-
<!-
—>
<!-- <videoResponseWindow> -->
—>
—>
<!— ATTRIBUTE REQ? DESCRIPTION
—>
<!— startFrame yes The frame that marks the beginning of the window -->
<!— endFrame yes The frame that marks the end of the window —>
<!— pointsValue yes The points value for the window -->
<!--
—>
<ElementType name="videoResponse indow" order="one" content="empty" model="closed">
<AttributeType name="startFrame" dt:type="int" required="yes" />
<AttributeType name="endFrame" dt:type="int" required="yes" />
<AttributeType name="pointsValue" dt:type="float" required="yes" /> ottribute type="startFrame" /> ottribute type="endFrame" /> ottribute type="pointsValue" /> </ElementType> <!-- <videoResponseAlternateMedia>
< --
< ! -- ATTRIBUTE REQ? DESCRIPTION
-->
<!-- URI yes The path to the video file; if you are defining global data at a level higher -->
<!-- than the item, you can specify an empty string and override it at the item level -->
<!-- Language yes The three-character language code to specify when this media should be used -->
<!-- Specify a language of 'BSL' for British Sign Language -->
< ! --
<ElementType name="videoResponseAlternateMedia" order="many" content="empty" model="closed">
<AttributeType name="URI" dt:type="string" required="yes" />
<AttributeType name="Language" dt:type="string" required="yes" /> ottribute type="URI" /> ottribute type="Language" /> </ElementType>
<!-- <videoResponseWindowGroup> -->
<!--
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- name yes The name of the window group -->
<!--
<!-- ELEMENT REQ? DESCRIPTION
<!-- videoResponseWindow yes The windows contained in the window group -->
<ElementType name="vldeoResponseWindowGroup" order="oπe" content="eItOnly" model="closed">
<AttributeType name="name" dt:type="string" required="yes" /> ottribute type="name" /> olement type="vϊdeoResponseWindow" minOccurs="l" maxOccurs="*" /> </ElementType> <!-
<!-- <videoResponseItem> -->
<!--
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!— mode no [item] Tutorial or item mode -->
—>
<!- ELEMENT —>
<!-- videoResponseCheatData no The data for detecting cheating (must be defined at some level in the exam.) —>
<!-- videoResponseConfiguration no The configuration of the item plugin (must be defined at some level in the exam.) -->
<!— videoResponseWindowGroup no The response window groups for the item —>
<!— videoResponseAlternateMedia no Alternate media links for voice-over and sign language -->
<!--
<ElementType name="videoResponseItem" order="many" content="eltOnly" model="closed">
<AttributeType name="URI" dt:type="strϊng" required="yes" /> <AttributeType name="mode" dt:type="enumeration" required="no" dt:values="item tutorial" /> <AttributeType name="verifyMediaOπCompile" dt:type="enumeration" required="no" dt:vaiues="true false" /> ottribute type="URI" /> ottribute type="mode" /> ottribute type="verifyMediaOnCompile" /> <element type="videoResponseCheatData" minOccurs="0" maxOccurs="l" /> <element type="videoResponseConfiguration" minOccurs="0" maxOccurs="l" /> <element type="videoResponseTutorialConf iguration" minOccurs="0" maxOccurs="l" /> <element type="videoResponseWindowGroup" minOccurs="0" maxOccurs-="*" /> olement type="videoResponseAlternateMedia" minOccurs="0" maxOccurs="*" /> </ElementType> < ! — [touchScreen_common-schema .xml] -- >
< !— Includes definitions for the most common touch screen sets of data : -->
<!-- 1. Multi-choice on a touch-screen.
—> <!-- 2. Displays on a touch-screen.
—> <!-- 3. Review screens on a touch-screen.
—>
<!-
—>
<!- < !--
<!— <touchScreenItem>
<!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- srcPath no Default path to any external file that is -->
<!— not explicitly defined on a sub element . —>
<!— bgcolor no [#FFFFCE] The background color for the item. -->
<!— This is described as three hex values from 00 —>
<!— to FF; representing the red, green, and blue --> components of the color. These are preceded by —> <!— a "#".
<!-- bgcolorSelect no [#AAAAFF] The background color for an item when -->
<!— it is selected. This uses the same format as —>
<!— the bgcolor attribute.
<!-- id no Exists for backwards compatibility. No -->
<!-- information for this attribute is compiled into --> <!— the resource file. <!--
< !— SUB-ELEMENTS
<!— <touchScreenPrompt> <stem>
<touchScreenMultiChoice> —>
<!— NOTES
<!— This is the top level element for a multi-choice item on a touc -
—> <!-- screen system.
<!--
<!--
— >
<ElementType name="touchScreenItem" order="many" content="eltOnly" model="open"> <AttributeType name="srcPath" dt:type="strϊng" required="no" /> <AttributeType name="bgcolor" dt:type="string" required="no" default="#FFFFCE" />
<AttributeType name="bgcolorSelect" dt:type="string" required="no" default="#AAAAFF" />
<AttributeType name="ϊd" dt:type="string" required="no" /> ttribute type="srcPath" /> ottribute type="bgcolor" /> ottribute type="bgcolorSelect" /> ottribute type="id" />
<element type="touchScreenPrompt" /> olement type="stem" />
<element type= "touchScreenMultϊChoϊce" /> </ElementType> <!-
< ! —
< !— <touchScreenMultiChoιce>
< !--
<!— ATTRIBUTE REQ? DESCRIPTION
<!— correct nswer yes String containing correct answer. (I.E. "ABD") —>
<!-- labelType no Exists for backwards compatibility. No -->
<!-- information for this attribute is compiled into -->
<!-- the resource file.
<!— minResponses no [1] Minimum responses a candidate can choose. -->
<!— maxResponses no [1] Maximum responses a candidate can choose. -->
<!-- randomizeDistracters no [false] (true) Whether or not the order of -->
<!-- the distracters should be randomized. -->
<!—
<!— SUB-ELEMENTS
<!— <distracter>
<!--
<!-- NOTES
<!— This is the top level element for a multi-choice item on a touσh-
—> <!— screen system.
<!-
<!-- <ElementType name="touchScreenMultiChoice" order="many" content="eltOnly" model="open">
<AttrιbuteType name="correctAnswer" dt : type =" string" required="yes" /> <AttributeType name="labelType" dt:type="enumeratϊon" required="no" dt:values="aiphabetic" default="alphabetϊc" /> <AttπbuteType name="mϊnResponses" dt:type="string" default="l" required="no" /> <AttributeType name=' maxResponses" dt:type-"string" default="l" required="no" /> <AttπbuteType name="randomizeDistracters" dt:type="enumeration" dt:values="true false" default= "false" required="no" /> ottribute type="correctAnswer" /> ottribute type="labelType" /> ottribute type="minResponses" /> ottribute type="maxResponses" /> ottribute type="randomizeDistracters" /> olement type="distracter" /> </ElementType> <!-
<!-
<!-- <touchScreenPrompt>
<!— ATTRIBUTE REQ? DESCRIPTION
<!— type no [text] Exists for backwards compatiblity. No —>
<!-- information for this attribute is compiled into —> <!— the resource file . <!--
<!— SUB-ELEMENTS
<!— <text> <mediaInfo>
<!--
<!— NOTES
<!— - The prompt used for a touch-screen, multi-choice item.
-->
<!-- - ???TBH Is "graphic" supposed to be implemented? If not, remove -->
<!— from this schema.
<!--
<!-
<ElementType name="touchScreenPrompt" order="many" content="eltOnly" model="open">
< dt:values="text"
Figure imgf000085_0001
<element type="medϊaInfo" /> </ElementType> <!--
—>
<!--
<!— <stem>
<!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- type yes [text] Type of stem to use.
<!— URI If the HTML/XHTML is in an external file, then —> this points to the file. In the case of multiple — >
<!-- items in one file use an HTML label . — >
<!-- (I . E . i
"filetlabel")
<!-- lang no Exists for backwards compatibility and does not
<!-- offer additional functionality. —>
<!--
<!-- SUB-ELEMENTS
<!-- <html> <mediaInfo>
<!--
<!-- NOTES
<!- The stem portion of
—>
<!--
<!-
- <ElementType name="stem" order-"many" content="eltOnly" model="open"> <AttributeType name="type" dt:type="enumeration" dt:values="text" requιred="yes" /> <AttπbuteType name="URI" dt:type="string" required="no" /> <AttributeType name="lang" dt:type="string" requιred="no" /> ottribute type="type" f> ottribute type="lang" /> ottribute type="URI" /> olement type="mediaInfo" /> </ElementType>
< !--
<!-- <distracter> -->
<!--
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- id yes Identifier for the distracter. The identifer —> is also the value of the distractor when —> selected.
(I.E. : "A")
<!— type yes [text] (image) Type of distracter. — >
<!-- src no The source path to an image, if the distracter
<!- is of type "image" .
<!—
<!-- SUB-ELEMENTS
<!-- <text> <mediaInfo>
<!--
<!-- NOTES
<!-- One of the choices a candidate can select as (part of) the correct -->
<!-- answer.
<!-- You cannot mix distracter types. They need to all be "text" or -->
<!-- all be "image".
<!-
<!--
<ElementType name="distracter" order="many" content="eltOnly" model="open">
<AttributeType name="id" dt:type="string" required="yes" /> <AttributeType name="type" dt:type="enumeration" dt:values="text image" required="yes" /> <AttributeType name="src" dt:type="strϊng" required="no" /> ottribute type="id" /> ottribute type="type" /> ottribute type="src" /> olement type="text" /> <element type="mediaInfo" /> </ElementType> < !--
— >
<!-- <touchScreenDisplay>
< !-
<!-- ATTRIBUTE REQ? DESCRIPTION
< !— type yes (standard) (confirmation) Type of display. Standard — > has no special functionality, where as —> confirmation requires user interaction. —>
< !-- srcPath no Default path to any external file that is not — >
< !-- explicitly defined on a sub element —>
<!— image no Image (BMP) to display above the confirmation HTML . -->
<!— URI no If the HTML/XHTML is in an external file , then — >
<!-- this points to the file . In the case of multiple —>
<!-- items in one file use an HTML label . —>
<!-- (I.E. :
"file#label")
<!— bgcolor no [#FFFFCE] The background color for the display. —>
<!-- This is described as three hex values from 00 to -->
<!-- FF; representing the red, green, and blue —>
<!-- components of the color . These are preceded by a —>
<!--
SUB-ELEMENTS <mediaInfo>
<!-- <!-- NOTES <!-- The bgcolor is only necessary for "confirmation" type of display
—> <!-- screens. Standard screens are completely HTML and can change the
—>
<!-- the background color through HTML tags . <!-
<!--
— >
<ElementType name="touchScreenDisplay" order="many" content="eltOnly" model="open">
<AttributeType name="type" dt:type="enumeratϊon" dt:values="standard confirmation" required="yes" />
<AttributeType name="srcPath" dt:type="strϊng" required="no" />
<AttributeType name="image" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" />
<AttributeType name="bgcolor" dt:type="striπg" required="no" default="#FFFFCE" /> ottribute type="type" /> ottribute type="srcPath" /> ottribute type="image" /> ottribute type="URI" /> ottribute type="bgcolor" />
<element type="mediaInfo" /> </ElementType> <!-
<!-
<!-- <touchScreenReview> -->
<!--
<!- ATTRIBUTE REQ? DESCRIPTION
<!- srcPath no Source path.
<!-- SUB-ELEMENTS
<!-- <numberTotal> <numberComplete> <numberIncomplete> <numberMarked> -->
<!-- NOTES <!- Used to define a review for a touch- screen exam.
—> <!- <!-
<ElementType name="touchScreenReview" order="many" content="eltOnly" model="open">
<AttributeType name="srcPath" dt:type="string" required="no" />
<AttributeType name="bgcolor" dt:type="string" default="#FFFFCE" required="no" />
<attribute type="srcPath" />
<attribute type="bgcolor" />
<element type="numberTotal" minOccurs="1" maxOccurs="1" />
<element type="numberComplete" minOccurs="1" maxOccurs="1" />
<element type="numberIncomplete" minOccurs="1" maxOccurs="1" />
<element type="numberMarked" minOccurs="1" maxOccurs="1" />
</ElementType>
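An illustrative touchScreenReview instance, using the numberTotal, numberComplete, numberIncomplete and numberMarked sub-elements documented below; the image file names and caption text are placeholders:
<touchScreenReview srcPath="review\" bgcolor="#FFFFCE">
 <numberTotal image="total.bmp"><text font="Arial" size="14">Total</text></numberTotal>
 <numberComplete image="complete.bmp"><text>Complete</text></numberComplete>
 <numberIncomplete image="incomplete.bmp"><text>Incomplete</text></numberIncomplete>
 <numberMarked image="marked.bmp"><text>Marked</text></numberMarked>
</touchScreenReview>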
< ! —
< !—
<!-- <numberTotal> -->
<!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- image Image (bitmap) to display left of the text . —>
<!--
<!— SUB-ELEMENTS
<!-- <text> <mediaInfo>
<!--
<!— NOTES
<!-- - Allows customization of the 'Total' line for the review screen. -->
<!-
<ElementType name="numberTotal" order="many" content="eltOnly" model="open">
<AttributeType name="image" dt:type="string" required="no" /> ottribute type="image" /> olement type="text" />
<element type="mediaInfo" /> </ElementType> < !-
<!-
<!— <numberComplete>
<!— ATTRIBUTE REQ? DESCRIPTION
<!— image Image (bitmap) to display left of the text . —>
<!-
<!- SUB-ELEMENTS <!-- <text> <mediaInfo>
<!— NOTES
<!— - Allows customization of the 'Complete' line of the review screen. —>
<!-
<!-
— >
<ElementType name="numberComplete" order="many" content="eltOnly" model="open">
<AttrιbuteType name="image" dt:type="string" required="no" /> ottribute type="image" />
<element type="text" /> olement type="mediaInfo" /> </ElementType>
<!--
<!— <numberIncomplete>
<!— ATTRIBUTE REQ7 DESCRIPTION
<!— image Image (bitmap) to display left of the text .
<!— SUB-ELEMENTS
<!— <text> <mediaInfo>
<!— NOTES
<!— - Allows customization the 'Incomplete' line of the review screen. —>
<!--
<!--
->
<ElementType name="πumberIncomplete" order="many" content="eltOnly" model="open">
<AttributeType name="image" dt:type="string" required="πo" /> ttribute type="image" />
<element type="text" /> olement type="mediaInfo" /> </ElementType> <!-
<!-- <numberMarked> -->
<!— ATTRIBUTE REQ? DESCRIPTION
<!— image no Image (bitmap) to display left of the text .
<!— SUB-ELEMENTS
<i— <text> <mediaInfo>
< !- NOTES
Allows customization of the 'Marked' line of the review screen. -->
< !-
<ElementType name="numberMarked" order="many" content="eltOnly" modei="open">
<AttributeType name="image" dt:type="strϊng" required="no" /> ottribute type="image" />
<element type="text" /> olement type="mediaInfo" /> </ElementType> <!-
< !--
<!-- <mediaInfo> -->
<!-- ATTRIBUTE REQ? DESCRIPTION
<!— type yes [audio] (video) Type of media to be played. —>
<!-- lang yes Media Language
<!-- indexStart The time offset where the media should start —> <!-- playing,
<!-- indexEnd no The time offset where the media should stop
<!- playing.
<!- <!— SUB-ELEMENTS
<!-- n/a
<!--
<l~ NOTES
<!-- - Defines which media files (voice or video) are played when a —>
<!-- candidate moves the mouse over the associated text or graphic. -->
<!-- - If an indexStart & indexEnd does not exist, it is assumed the -->
<!— media file is played from start to end.
—>
<!--
->
<ElemeπtType name="mediaInfo" order="many" content="empty" model="open">
<AttributeType name="type" dt:type="enumeration" dt:values="audio video" requιred="yes" />
<AttributeType name="lang" dt:type="strϊng" required="yes" />
<AttributeType name="indexStart" dt:type="string" required="no" />
<AttributeType name="indexEnd" dt:type="strϊng" required="no" /> ottribute type="type" /> ottribute type="lang" /> ottribute type="indexStart" /> ottribute type="indexEnd" /> </ElementType> <!--
<!--
<!-- <text>
<!--
<ι-- ATTRIBUTE REQ? DESCRIPTION
<l— lang no Exists for backwards compatibility and does not —>
<!-- offer additional functionality. —>
<!-- font no Name of a True Type Font to use. —>
<!-- size no The point size of the font to use. —>
<!-- color no Color of the font .
Default of black in the code.
<!— italic no [true false] font is italic. Code default of true.
<!-- bold no [true false] font is bold. Code default of true .
<!-- SUB-ELEMENTS
<!-- n/a
<!--
<!-- NOTES
<!--
<!--
<ElementType name="text" order="many" content="mixed" model="open"> <AttributeType name="Iang" dt:type="string" required="no" /> <AttributeType name="font" dt:type="strlng" required="πo" /> <AttributeType name="size" dt:type="strϊng" required="no" /> <AttributeType name="color" dt:type="string" required="no" /> <AttributeType name="italic" dt:type="enumeration" dt:values="true false" required ="no" /> <AttributeType name="bold" dt:type="enumeration" dt:values="true false" required ="no" /> ottribute type="lang" /> ottribute type="font" /> ottribute type="sΪ2e" /> ottribute type="color" /> ottribute type="italic" /> ottribute type="bold" />
</ElementType>
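As an illustration, a text element and an associated mediaInfo element might appear together as follows; the language code, font and time offsets are placeholder values:
<text lang="ENU" font="Arial" size="14" color="#000000" bold="true">Select the best answer.</text>
<mediaInfo type="audio" lang="ENU" indexStart="0" indexEnd="12.5" />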
<!-- [tasc_result-schema.xml] -->
< !-
— >
<!--
<!— <tascResults> <!--
<!-- ATTRIBUTE REQ? DESCRIPTION
<!— reportCategory no Top level category containing sub-categories — >
<!-- to report against. If there are any items —>
<!- directly off the top level category, they are —>
<!-- ignored. Sections listed in any category -->
<!-- below the reportCategory are ignored. —>
<!— f ilterCategory no The name of a category which contains sections . — >
<!- Only these sections are reported against in the < !-- results file .
< !-- SUB -ELEMENTS
< !-- none < !--
< l— NOTES
<!— - With no options, the TASC result file is ordered in the same way —>
<!-- the exam was delivered.
<!--
<!-- - When reportCategory is used, the driver will expect that
— >
< !-- sub-categories exist directly below the " reportCategory" .
— > < !— Within these sub-categories exist items to be reported against . -->
< !--
< !— - When f ilterCategory is used, a category f illed with sections is -->
<!-- expected. Only those sections in the "filterCategory" will be -->
<!-- included in the results file. (IE: Multi-day exams can utilize —>
<!— this by putting each day's sections into a different
—> <!— filterCategor . )
<!-- - To exclude items in results, create a custom attribute called -->
<!-- "CountInOutputTotals" of type "BSTR" with a value of false. -->
< !--
< !-
<ElementType name="tascResults" content="empty">
<AttributeType name=" reportCategory" dt:type="string" requιred="no" />
<AttributeType name="f ilterCategory" dt:type="string" required="no" /> ottribute type="reportCategory" /> ottribute type="f ilterCategory" /> </ElementType> < i— [summary_score- schema . xml] — >
->
<!--
<!-- <sectionScoreRange> -->
<!--
<!— ATTRIBUTE DESCRIPTION
<!-- low yes low value of range
— >
<!-- high yes high value of range
— >
<!--
<!-- SUB-ELEMENTS
<!-- none
<!- NOTES
<!--
<!--
—>
<ElementType name="sectionScoreRange" order="many" content="empty" model="closed">
<AttributeType name="low" dt:type="string" required="yes" />
<AttributeType name="high" dt:type="strϊng" required="yes" /> ottribute type="lo " /> ottribute type="high" /> </ElementType> <!--
—>
< !-
<!— <masteryLevel>
<!-
<!— ATTRIBUTE REQ? DESCRIPTION
<!— cut yes cut score
<!— cutlnSubs cut score for sub- sections
<!— low low value of range
<!— high high value of range
<!--
<!— SUB-ELEMENTS <!-- none
<!— NOTES Specifies required cut score for passing the test
—>
<ElementType name="masteryLevel" order="many" content="empty" model="closed">
<AttributeType name="cut" dt:type="string" required="yes" /> <AttributeType name="cutInSubs" dt:type="string" required="no" /> <AttributeType name="lo " dt:type="strϊng" required="no" /> <AttributeType name="high" dt:type="string" required="no" /> ottribute type="cut" /> ottribute type="cutInSubs" /> ottribute type="Iow" /> ottribute type="high" />
</ElementType>
— >
<!--
<!-- <rawMasteryLevel> -->
<ι~
<l- ATTRIBUTE REQ? DESCRIPTION
<!- cut yes cut score
<!- cutlnSubs cut score for sub- sections — >
<!-- low low value of range -->
<!-- high high value of range -->
<!-
<!- SUB-ELEMENTS
<!- none
<!-- NOTES
<!-- Specifies required raw cut score for passing the test
—>
<!-
<ElementType name="rawMasteryLevel" order="many" content="empty" model="closed">
<AttrιbuteType name="cut" dt:type="strϊng" requιred="yes" /> <AttπbuteType name="cutInSubs" dt:type="string" required="no" /> <AttrιbuteType name="low" dt:type="string" requιred="no" /> <AttrιbuteType name="high" dt:type="strlng" required="no" /> ottribute type="cut" /> ottribute type="cutInSubs" /> ottribute type="lo " /> <attπbute type="high" />
</ElementType> <!-
< !--
<!-- <passingCriteria>
<l~
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- none
<!--
< ! — SUB -ELEMENTS
< ! -- none
< ! --
<!-- NOTES
<!-- Contains script that is evaluated to determine if pass or fail -->
<!-- If the expression evaluates to true, the result will be pass
—>
<!-
<!-
<ElementType name= "passϊngCriteria" order="many" coπtent="textOnly" model="open" /> <!-
<!--
<!— <summaryScore>
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- computeScore no standard: ((# correct / total) * range) + low -->
<!-- sumWeights: sum of the item weights -->
<!-- graded no true - result will be pass/fail -->
<!-- false - result will be taken -->
<!-- precision no rounding method -->
<!-- trunc - truncation of fractions -->
<!-- round - round up/down at .5 -->
<!-- real - no rounding -->
<!-- scoreItem no false - no items will be scored -->
<!-- (overrides item's specification) -->
<!-- true - all items that don't specify -->
<!-- otherwise will be scored -->
<!--
<!— SUB-ELEMENTS
<!— sectionScoreRange
<!— masteryLevel
<!— rawMasteryLevel
<!--
<!— NOTES
<!--
<ElementType name="summaryScore" order="many" content= "mixed" model="open">
<AttributeType name="computeScore" dt:type="enumeratϊon" dt:vaiues="standard sumWeights" required="no" default="standard" />
<AttributeType name="graded" dt:type="enumeration" dt:values="true false" required="no" default="true" />
<AttributeType name="precision" dt:type="enumeration" dt:values="trunc round real" required="no" default="trunc" />
<AttributeType name="scoreItem" dt:type="enumeration" dt:values="true false" required="no" default="true" /> ottribute type="computeScore" /> ottribute type="graded" /> ottribute type="precision" /> ottribute type= "scoreltem" />
<element type="sectionScoreRange" minOccurs="0" maxOccurs="l" />
<element type="masteryLevel" minOccurs="0" maxOccurs="i" />
<element type="ra MasteryLevel" minOccurs="0" maxOccurs =":L" />
<element type="passingCriteria" minOccurs="0" maxOccurs="l" /> </ElementType>
<!-- [standard_timer-schema.xml] -->
< !-
— >
<!-- <!— <standardTimer>
<!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- timed no [true] (false) If the construct using this -->
<!-- timer is actually timed. -->
<!-- minutes no How many minutes exist for the timer. -->
<!-- type no [session] (exam) If session, timer will -->
<!-- continuously run. If exam, timer will only run while a presentation is displayed (not when -->
<!-- rendered.) -->
<!-- contributing no [true] (false) If false, this timer essentially -->
<!-- "pauses" parent timers. (I.E.: A tutorial -->
<!-- which doesn't affect the exam time.) -->
<!-- timeout no [immediate] (after) If immediate, when the time -->
<!-- ends the candidate is done. If after, the -->
<!-- candidate is allowed to finish the current -->
<!-- presentation. -->
<!-- pauseEnabled no [false] (true) If pausing the timer is allowed. -->
<!-- defaultExtends no [false] (true) If the timer should look for -->
<!-- default time extensions from the environment -->
<!--
<!-- SUB-ELEMENTS
<!-- <warning> <extend>
<!--
<!-- NOTES
<!-- Timer plug-in data, to establish time limits on exam constructs. -->
<!-- Default time extension multipliers include x1.5, x2.0. -->
<!-- Default time additions only apply to the form and is +30 minutes. -->
<!--
<ElementType name="standardTimer" order="many" conterιt="eltOnly" model="open"> <AttributeType name="timed" dt:type="enumeration" dt:values="true false" default="true" required="no" /> <AttrlbuteType name="minutes" dt:type="string" required="no" /> <AttributeType name="type" dt:type="enumeration" dt:values="session exam" default="session" required="no" /> <AttributeType name="contributing" dt:type="enumeration" dt:values="true false" default="true" required="no" /> <AttributeType name="timeout" dt:type="enumeration" dt:values="immedlate after" default="immediate" required="no" /> <AttributeType name="pauseEnabled" dt:type="enumeration" dt:values="true false" default= "false" required="no" /> <AttributeType name="defaultExtends" dt:type="enumeration" dt:values="true false" default="true" required="no" /> ottribute type="timed" /> ottribute type="minutes" /> ottribute type="type" /> ottribute type="contributing" /> ottribute type="timeout" /> ottribute type="pauseEnabled" /> ottribute type="defaultExtends" />
<element type="warning" minOccurs="0" maxOccurs^"*" /> <element type="extend" minOccurs="0" maxOccurs="*" /> </ElementType> <!-
<!--
< !-- <warning >
<!-
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- seconds yes + or - value of seconds when warning is displayed. -->
<!-- repeat no [1] Number of times to repeat warning. Only -->
<!-- warnings with negative seconds can repeat. -->
<!-- SUB-ELEMENTS
<!-- none
<!--
<!-- NOTES
<!-- A message which appears after a certain amount of time has -->
<!-- passed. <!-
< !--
—>
<ElementType name= "warning" order="many" content="textOnly" model="open">
<AttributeType name="seconds" dt:type="string" required="yes" />
<AttributeType name="repeat" dt:type="string" default="-L" required="no" /> ottribute type="seconds" /> ottribute type= "repeat" /> </ElementType> <!-
— >
<!-
<! — <extend> <!—
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- multiplier yes A real number to multiply with allowed time. — > <!--
< !— SUB-ELEMENTS
<!-- none
<!— NOTES
<!-- Used to extend the timer in certain circumstances -->
<!-- Contains a script expression; when it evaluates to true, time will -->
<!-- be multiplied -->
<ElementType name="extend" order="many" content="mixed" model="open">
<AttributeType name="multiplier" dt:type="string" required="yes" />
<attribute type="multiplier" />
</ElementType>
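For illustration, a standardTimer instance might combine a warning and an extend sub-element as follows; the warning text and the extension expression are placeholders, since the script syntax expected inside extend is not defined in this schema fragment:
<standardTimer timed="true" minutes="90" type="session" timeout="after" pauseEnabled="false">
 <warning seconds="300" repeat="1">You have five minutes remaining.</warning>
 <extend multiplier="1.5">extendedTimeAccommodation == true</extend>
</standardTimer>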
< !— [standard_review-schema . xml] — >
< !-
< !-
<!-- <standardReview>
<!--
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- style yes (largeIcon, smallIcon, list, report) How -->
<!-- the presentations are shown to the candidate -->
<!-- in the review list. -->
<!-- color no Color of the text in the foreground. Format -->
<!-- is three hex values after a "#", -->
<!-- representing the red, green, and blue -->
<!-- components of the color. -->
<!-- bgcolor no The background color of the review screen. -->
<!-- Format is three hex values after a "#", -->
<!-- representing the red, green, and blue -->
<!-- components of the color. -->
<!-- margin no The space (in pixels) between the review -->
<!-- list box and the border of the page. -->
<!-- itemLabel no [Item] The word(s) displayed before the item -->
<!-- in the first column of the list view -->
<!-- yesLabel no [Yes] The word used to mark the columns of -->
<!-- the report view display -->
<!-- itemColumnLabel no [Name] The word at the top of the list -->
<!-- view's first column, which displays the -->
<!-- item name and number. -->
<!--
<!-- SUB-ELEMENTS
<!-- <itemIcon>
<!--
<!-- NOTES
<!-- Helm plug-in for a review screen (at the end of a section) -->
<!--
<ElementType name="standardReview" content="eltOnly"> <AttπbuteType name="style" dt:type="enumeratϊon" dt:values="largeIcon s alllcon list report" requιred="yes" /> <AttπbuteType name="color" dt:type="string" required="no" /> <AttrιbuteType name="bgcoIor" dt:type= 'string" requιred="no" /> <AttrιbuteType name="margin" dt type="int" requιred="no" /> <AttrιbuteType name="itemLabel" dt;type="string" requιred="no" default="Item" /> <AttrιbuteType name="yesLabel" dt:type="striπg" requιred="no" default="Yes" /> <AttributeType name="itemColumn abel" dt:type="string' requιred="no" default="Name" /> ottribute type="style" /> ottribute type="color" /> ottribute type="bgcolor" /> ottribute type="margin" /> ottribute type="itemLabel" /> ottribute type="yesLabeI" /> ottribute type="itemColumnLabel" /> olement type="itemIcon" /> </ElementType>
<!-
<!-- <itemIcon> -->
<!-
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- status yes (flagged, incomplete, complete, skipped) -->
<!-- The "status" which the image icon represents -->
<!-- img yes A URI pointing to a BMP image file. This image is -->
<!-- used as the icon for the value in "status" -->
<!-- title no A text string which overrides the default title -->
<!-- for the column. Defaults are determined by the -->
<!-- value of status. If the string is empty, the -->
<!-- column is not shown -->
<!-- SUB-ELEMENTS
<!-- none
<!--
<!-- NOTES
<!— - Associates an icon with a status.
<!- Determines if a column exists in the review screen, and the text at the top of the column.
<!-- - It makes no sense to override a columnTitle if set columnShow to —> <!-- be false.
<!--
—>
<ElementType name="itemIcon" content="empty"> <AttributeType name="img" dt:type="string" required="yes" /> <AttributeType name="status" dt:type="enumeration" dt:values="flagged incomplete complete skipped" required="yes" /> <AttributeType name="title" dt:type="strϊng" required="no" /> ottribute type="img" /> ottribute type="status" /> ottribute type="title" />
</ElementType>
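An illustrative standardReview instance with several itemIcon entries; the icon file names and column titles are placeholders:
<standardReview style="report" itemLabel="Item" yesLabel="Yes" itemColumnLabel="Name">
 <itemIcon status="complete" img="complete.bmp" title="Answered" />
 <itemIcon status="incomplete" img="incomplete.bmp" title="Unanswered" />
 <itemIcon status="flagged" img="flag.bmp" title="Marked" />
</standardReview>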
<!-- [slsOutput_result - schema . xml] -->
<!--
—>
<!— <slsOutput> <!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!— reportCategory no Top level category containing sub-categories —>
<!-- to report against. If there are any items
<!-- directly off the top level category, they are —>
<!-- ignored. Sections listed in any category -->
<!-- below the reportCategory are ignored. -->
<!-- filterCategory no The name of a category which contains sections. -->
<!-- Only these sections are reported against in the -->
<!-- results file. -->
<!-- resourceCategory no Top level category containing sub-categories -->
<!-- to use as "resource descriptions". Each of -->
<!-- these categories must have a description, and -->
<!-- should be referenced by an item for that item -->
<!-- to have a resource description. -->
<!-- resultType no Defines the type of the result. These values -->
<!-- are from the types permitted by SLSDeli. The -->
<!-- default is "test" -->
<!-- reportCategoryOrder no The order in which categories are sorted. -->
<!-- Values are "delivery" or "xxl". Delivery -->
<!-- will sort in delivery order, and xxl will -->
<!-- sort in the order they are listed in the xxl. -->
<!-- Default is "xxl" -->
<l— SUB-ELEMENTS
< I — none <l~
< I— NOTES
<!-- - With no options, the SLS result file is ordered in the same way -->
<!-- the exam was delivered. -->
<!-- - When reportCategory is used, the driver will expect that -->
<!-- sub-categories exist directly below the "report category". -->
<!-- Within these sub-categories exist items to be reported against. -->
<!-- - When filterCategory is used, a category filled with sections is -->
<!-- expected. Only those sections in the "filter category" will be -->
<!-- included in the results file. This is useful for multi-day -->
<!-- exams, where one category contains day 1 sections, another -->
<!-- contains day 2 sections, etc. -->
<!-- - The resultType option is used to split off result files for exams, -->
<!-- surveys, and tutorials. -->
<ElementType name="slsOutput" content= "empty" >
<AttributeType name="reportCategory" dt:type="string" requιred="no" />
<AttributeType name="f ilterCategory" dt.type="string" required="no" />
<AttributeType name="resourceCategory" dt:type="string" required="no" />
<AttributeType name="resultType" dt:type="enumeration" dt:values="exam survey questionnaire test tutorial quiz review" default="test" required="no" />
<AttributeType name=' reportCategoryOrder" dt:type="enumeration" dt:values="delivery xxl" default="xxl" required="no" /> ottribute type="reportCategory" /> ottribute type="filterCategory" /> ottribute type="resourceCategory" /> ottribute type="resultType" /> ottribute type="reportCategoryOrder" /> </ElementType>
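A hypothetical slsOutput instance; the category names are placeholders:
<slsOutput reportCategory="ScoreReporting" filterCategory="Day1Sections" resultType="test" reportCategoryOrder="xxl" />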
<!-- [sequential_exhaustive-schema.xml] -->
< !—
<!--
<!-- <seguentialExhaustive>
<!-
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-- none
<!--
<!— SUB-ELEMENTS
<!- none
<!--
<!- NOTES
<!-- No data is passed to the sequential exhaustive selection plug-in -->
<!--
< !-
============================================================== — >
<ElementType name="sequentialExhaustive" content= "empty" />
< ! -- [score_table -schema , xml] -- >
<!-- Setup the schema so by default we are in the Microsoft namespace. -->
<!-- Then create a namespace called "dt" and associate it with the set -->
<!-- of data types Microsoft supports
<!--
<!-- <row>
<!--
<!-- ATTRIBUTE REQ? DESCRIPTION
< !— raw yes The raw score, obtained from the driver. —>
<!-- report yes The (converted) score used in the report. -->
<!-- letterGrade no The (converted) letter grade used in the report. -->
<!--
<!-- SUB-ELEMENTS <!--
NOTES
<!-- A "row" entry exists for each element of the score table.
—>
<!--
<ElementType name="row" order="many" content="empty" model="closed"> <AttributeType name="raw" dt:type="string" required="yes" /> <AttributeType name="report" dt:type="string" required="yes" /> <AttributeType name="letterGrade" dt:type="string" required="no" /> ottribute type="raw" /> ottribute type="report" /> ottribute type="letterGrade" />
</ElementType>
< !-
<!-- <scoreTable>
<!-- ATTRIBUTE REQ? DESCRIPTION
<!-
< ! — graded [true] (false) If true, result marks exam as <!-- pass / fail. Otherwise exam will be marked as —>
<!— taken / incomplete .
<!-- scoreItem no [true] (false) If true, all items are scored
<!-- except those which specially state they are —>
<!-- not. If false, no items are scored, regardless —> of their item attributes. —>
<!— SUB-ELEMENTS
<!-- <row> <masteryLevel>
<!-- <rawMasteryLevel> <passingCriteria> -->
<!--
<!— NOTES
<!-- A table of "row" entries for 'manually' score scaling.
—>
<!--
<!-
<ElementType name="scoreTable" order="many" content="eltOnly" model="closed"> <AttributeType name="graded" dt:type="enumeratϊon" dt:values="true false" default="true" required="no" /> <AttributeType name="scoreItem" dt:type="enumeration" dt:values="true false" default="true" required="no" /> <AttributeType name="letterGraded" dt:type="enumeration" dt:values="true false" default="false" required="no" /> ottribute type="graded" /> ottribute type= "scoreltem" /> ottribute type="letterGraded" /> <element type="row" minOccurs="l" maxOccurs="*" /> - <group order="many" minOccurs="l" maxOccurs="*">
<element type="masteryLevel" minOccurs="0" maxOccurs="l" /> <element type="rawMasteryLevel" minOccurs="0" maxOccurs="l" /> </group> olement type="passingCriteria" minOccurs="0" maxOccurs="l" /> </ElementType>
<!-- [random-schema.xml] -->
< !-
— >
< !-
< !— <random> <!- ATTRIBUTE REQ? DESCRIPTION
<!-- quantity yes Number of presentations to select within the -->
<!-- section (or formGroup), or "all" to select -->
<!-- everything within it. -->
<!--
<!- SUB-ELEMENTS
<!- none
<!~
<!— NOTES
<!-- - Selection plug-in data for random delivery.
—> <!--
<!--
<ElementType name="random" content="empty"> <AttributeType name="quantity" dt:type="string" required ="yes" /> ottribute type="quantity" />
</ElemeπtType>
<!-- [promptDefault-schema.xml] -->
< ! — Setup the schema so by default we are in the Microsoft namespace .
—>
<!— Then create a namespace called "dt" and associate it with the set
—>
<!— of data types Microsoft supports
—>
<!--
<!-- <prompt>
<!-
<!— ATTRIBUTE REQ? DESCRIPTION
<!— minmax no [0] Contains the number that both min and max are —> <!-- equal to, if they are different, this value —> <!-- should be "0"
<!— height no [70] Height in pixels for prompt area —>
<!-- location [top] (bottom) Where the prompt is displayed <!-- URI ! If the HTML/XHTML is m an external file, then —>
<!- this points to the file . In the case of —>
< !-- multiple prompts in one file , use the anchor —>
< !- and label .
(I .E . : " f ile#label " ) —>
<!--
<!— SUB-ELEMENTS <!-- <html>
<!— NOTES
<!-- Contains HTML formatted information which acts as the "prompt" -->
<!-- for minmax
<!-- - THIS ELEMENT DUPLICATED IN MULTI_CHOICE_DEFAULT- SCHEMA.XML —>
<!— - THIS ELEMENT DUPLICATED IN MULTI__CHOICE-SCHEMA.XML
—>
<!--
<!—
<ElementType name="prompt" order="many" content="mixed" model="open"> <AttributeType name= "minmax" dt:type="string" required="no" default="0" /> <AttributeType name="heϊght" dt:type="strϊng" required="no" default="70" /> <AttributeType name="location" dt:type="enumeratϊon" dt:values="top bottom" default="top" required="no" /> <AttributeType name="URI" dt:type="string" required="no" /> ottribute type="minmax" /> ottribute tγpe="height" /> ottribute type="location" /> ottribute type="URI" /> </ElementType> <!-
< !— <promptDefault> < !-
< ! — ATTRIBUTE REQ? DESCRIPTION
<!-- tooMany no The message to display when a candidate tries -->
<!-- to select too many distractors. (Required if -->
<!-- maxResponses > 1.) -->
<!--
<!-- notEnough no The message to display when a candidate tries —>
<!-- to navigate and minResponses not met
<!--
<!-- SUB-ELEMENTS
<!-- <prompt>
<!--
<!-- NOTES
<!-- - This is used to establish "default" data for the multi_choice -->
<l~ item type.
<!--
<!--
—>
<ElementType name="promptDefault" order="many" content="mϊxed" model="open">
<AttributeType name="tooMany" dt:type= "string" required="no" />
<AttributeType name="notEnough" dt:type="string" required="no" /> ottribute type="tooMany" /> ottribute type="notEnough" default="You have not selected the minimum number of answers" /> lement type="prompt" minOccurs="l" maxOccurs="*" /> </ElementType>
< ! — [next previous -schema . xml] — >
< !--
—>
< !-
<!— <nextPrevious>
<1-
< !-- ATTRIBUTES REQ? DESCRIPTION
< !— bgcolor no [#808080] Background color of the helm . The color -- >
<!— is specified with three hex numbers preceded by a —>
<!— pound sign ("#")
The three hex numbers are —>
<!-- between 00 and
FF. They represent the red, green —>
<!-- and blue ill
components of the target color. <!--
<!— SUB-ELEMENTS
<!- <button>
<!--
<!-- NOTES
<!-- Helm plug-in for basic forward-backwards movement in an exam -->
<!-- To see a complete list of functionality, look at the "action"s on -->
<!-- the "button" tag.
<!--
<ElementType name="nextPrevious" content="eltOniy"> <AttributeType name="bgcolor" dt:type="string" default="#808080" requιred="no" /> ottribute type="bgcolor" /> <element type="button" minOccurs="l" maxOccurs="*" />
</ElementType>
< !--
< !--
< !— <button>
< !--
<!— ATTRIBUTES REQ? DESCRIPTION
—>
<!— action yes (next, previous, review, flag, mark,
—> <!— first-incomplete, first-skipped, -->
<!— first- presentation, end, start, quit, help, —>
<!-- comment, first- marked, more ) The type of "thing" —>
<!— the helm should attempt to navigate to. -->
<!-- img yes A URI pointing to a (bitmap) image.
This image — >
<!— will be placed on the button.
<!— imgPress no A URI pointing to a
(bitmap) image which is used —>
<!— when the image is pressed. <!— SUB-ELEMENTS
<!-- none
<!--
<!— NOTES:
<!— - Defines a button which will appear on the helm.
—> <!— - The "maik" and "flag" actions mean the same.
—> <!— - While a button may appear, it's up to the navigation plug- in to —> <!-- allow the actual change in presentation.
—> <!— - When a presentation is flagged, the imgPress icon is used until —> <!— the presentation is no longer flagged.
--> <!--
<!--
—>
<ElementType name="button" content= "empty" > <AttπbuteType name="action" dt:type="enumeration" requιred="yes" dt:values="next previous review flag mark first-incomplete first- skipped first-presentation end start quit help comment first-marked more" /> <AttributeType name="img" dt:type="string" required="yes" /> <AttributeType name="imgPress" dt:type="strϊng" required="no" /> ottribute type="action" /> ottribute type="img" /> ottribute type="imgPress" /> </ElementType>
<!-- [multi_choice_standard-schema.xml] -->
<!--
<!— <option> <!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- id yes Id matching
<input> element of the HTML. —>
<!— weight no Weight of this distracter, if >0 this —> distracter is considered correct . <!-- SUB-ELEMENTS
<!— n/a <!--
<!-
<ElementType name="option" order="many" content="mixed" model="open"> <AttributeType name="id" dt:type="string" required="yes" /> <AttributeType name="weϊght" dt:type="string" required="no" /> ottribute type="id" /> ottribute type="weight" />
</ElementType>
—>
< !--
<!— <multiChoiceStaπdard>
<!— ATTRIBUTE REQ? DESCRIPTION
<!— maxResponses no [1] Maximum number of responses allowed on a
<!-- multi- response item
<!-- minResponses no [1] minimum number of responses allowed on a -->
<!-- minimum-response item -->
<!-- autoPrompt no [false] (true) If true, a prompt is expected to -->
<!-- exist in the "prompt" tag, otherwise a prompt -->
<!-- (if any) should exist in the HTML itself. -->
<!-- tooMany no The message to display when a candidate tries -->
<!-- to select too many distractors. (Required if -->
<!-- maxResponses > 1.) -->
<!-- notEnough no The message to display when a candidate tries -->
<!-- to navigate and minResponses not met -->
<!-- URI no If the HTML/XHTML is in an external file, then -->
<!-- this points to the file. In the case of -->
<!-- multiple items in one file, use the anchor -->
<!-- and label. (I.E.: "file#label") -->
<!-- minScore no A limit for the minimum score. -->
<!-- maxScore no A limit for the maximum score. -->
<!--
<!-- SUB-ELEMENTS
<!-- <prompt> <html> <option>
<!--
<!-- NOTES
<!-- - The options sub-element groups the options (distracters) that -->
<!-- make up the correct answer -->
<!-- - Multi-choice item type. -->
<!-- - The HTML formatted text contains the stem, distracters, and -->
<!-- optionally prompt. -->
<!-- - Distracters are created with the HTML <input> tag. -->
—> <!--
<!--
->
<ElementType name="multiChoiceStandard" order="maπy" content="mixed" model="open">
<AttributeType name=' maxResponses" dt:type="string" required="no" default="l" />
<AttributeType name="minResponses" dt:type="string" required="no" default="l" />
<AttributeType name="autoPrompt" dt:type="enumeration" dt:values="true false" required="no" default= "false" />
<AttributeType name="tooMany" dt:type="string" required="no" />
<AttributeType name="notEnough" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" />
<AttributeType name="minScore" dt:type="strϊng" required="no" />
<AttributeType name="maxScore" dt:type="string" required ="no" /> ottribute type="maxResponses" /> ottribute type="minResponses" /> ottribute type="autoPrompt" /> ottribute type="tooMany" /> ottribute type="notEnough" /> ottribute type="URI" /> ottribute type="minScore" /> ottribute type="maxScore" />
<element type="prompt" minOccurs="0" maxOccurs="l" />
<element type="option" minOccurs="l" maxOccurs="*" /> </ElementType>
<!-- [multi_choice_default-schema.xml] -->
<!-- Setup the schema so by default we are in the Microsoft namespace .
— > <!-- Then create a namespace called "dt " and associate it with the set — > <!— of data types Microsoft supports
<!--
<!— <multiChoiceDefault> <!--
<!— ATTRIBUTE REQ? DESCRIPTION
<!— tooMany no The message to display when a candidate tries —>
<!-- to select too many distractors. (Required if -->
<!-- maxResponses > 1 . ) <!--
<!— notEnough no The message to display when a candidate tries —>
<!— to navigate and minResponses not met —>
<!--
<!— SUB-ELEMENTS
<!— <prompt>
<!--
<!— NOTES
<!-- - This is used to establish "default" data for the multi_choice -->
<!— item type.
<!--
<!-
<ElementType name="muItiChoiceDefault" order="many" content^ "mixed" model="open">
<AttributeType name= "tooMany" dt:type="strϊng" required="no" /> <AttrιbuteType name="notEnough" dt:type="string" required="no" /> ottribute type="tooMany" /> ottribute type="πotEnough" default="You have not selected the minimum number of answers" /> olement type="prompt" minOccurs="l" maxOccurs="*" /> </ElementType>
<!-- [multi_choice-schema.xml] -->
< !— Setup the schema so by default we are in the Microsoft namespace .
— > <! -- Then create a namespace called "dt " and associate it with the set — > <!-- of data types Microsoft supports
—> <!-
—>
<!- <!- :multiChoice> <!- <!- ATTRIBUTE REQ? DESCRIPTION
< ! -- correct nswer yes Contains a list of multiple correct answers — >
< ! --
I . E . : "A" , "AB " , "ADP"
<!— maxResponses no [1] Maximum number of responses allowed on a
<!-- multi- response item
<!-- minResponses no [1] minimum number of responses allowed on a — >
<!-- minimum- response item
<!— autoPrompt [false] (true) If true, a prompt is expected to — >
<!- exist in the "prompt" tag, otherwise a prompt —>
<!-- (if any) should exist in the HTML itself. —>
<! — tooMany no The message to display when a candidate tries —>
<!-- to select too many distractors. (Required if maxResponses > 1.) -->
<!-- notEnough no The message to display when a candidate tries -->
<!-- to navigate and minResponses not met -->
<!-- URI no If the HTML/XHTML is in an external file, then -->
<!-- this points to the file. In the case of -->
<!-- multiple items in one file, use the anchor -->
<!-- and label. (I.E.: "file#label") -->
<!— SUB-ELEMENTS <!-- <prompt> <html> < ! — NOTES
<!-- - Multi-choice item type. -->
<!-- - The HTML formatted text contains the stem, distractors, and -->
<!-- optionally prompt. -->
<!-- - Distractors are created with the HTML <input> tag.
—>
—>
<ElementType name="multiChoice" order="many" content="mixed" model="open">
<AttributeType name="correctAnswer" dt:type="string" required="yes" />
<AttributeType name="maxResponses" dt:type="string" required="no" default="l" />
<AttributeType name="minResponses" dt:type="string" required="no" default="l" />
<AttπbuteType name="autoPrompt" dt:type="enumeration" dt:values="true false" required="no" default="false" />
<AttrιbuteType name="tooMany" dt:type="strϊng" required="no" />
<AttributeType name="notEnough" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" /> ottribute type="correctAnswer" /> ottribute type="maxResponses" /> ottribute type="minResponses" /> ottribute type="autoPrompt" /> ottribute type="tooMany" /> ottribute type="notEnough" /> ottribute type="URI" />
<element type="prompt" minOccurs="0" maxOccurs="l" /> </ElementType>
< !-- [mediaLinked_common-schema.xml] — >
< !-
—>
< !-- This schema is used for displays, reviews and items that have
<!-- elements linked to multimedia. Some elements are contained in the
— > < !-- touch screen common schema .
< !— Includes definitions for the most common media linked sets of data : — >
<!— 1. Multi-choice - client specific features
—>
<!-- 2. Displays - client specific features
—>
<!— 3. Review screens client specific features
—>
—> <! —
<!-- mediaLinkedItem>
<!-- ATTRIBUTE REQ? DESCRIPTION
< !— srcPath no Default path to any external file that is — >
< !-- not explicitly de ined on a sub element . —>
<!-- bgcolor no [#FFFFCE] The background color for the item. -->
<!- This is described as three hex values from 00 —>
<!-- to FF; representing the red, green, and blue —> components of the color . These are preceded by -->
< !-- a " # " .
<!-- bgcolorSelect no [#00C0C0] The background color for an item when -->
< !-- it is selected.
This uses the same format as
< !-- the bgcolor attribute .
<!-- id no Exists for backwards compatibility. No -->
< !-- information for this attribute is compiled into
<!-- the resource file.
<!- SUB-ELEMENTS
<!— <mediaLinkedPrompt> <mediaLinkedStem>
<!-- <mediaLinkedMultiChoice> -->
<!--
<!-- NOTES
<!-- This is the top level element for a multi-choice item
—>
<!--
<ElementType name="mediaϋnkedltem" order="many" content="eltOnly" model="open">
<AttributeType name="srcPath" dt:type="strϊng" required="no" />
<AttributeType name="bgcolor" dt:type="string" required="no" default="#FFFFCE" />
<AttributeType name="bgcolorSelect" dt:type="string" required="no" default="#OOCOCO" />
<AttributeType name="id" dt;type="string" required="no" /> ottribute type="srcPath" /> ottribute type="bgcolor" /> ottribute type="bgcolorSelect" /> ottribute type="id" />
<element type="mediaLinkedPrompt" />
<element type="mediaLinkedStem" /> olement type="mediaLinkedMuItiChoice" /> </ElementType>
— >
< !--
< !— <mediaLinkedMultiChoice>
< !--
<!— ATTRIBUTE REQ? DESCRIPTION
<!-- correctAnswer yes String containing correct answer. (I.E. "ABD") -->
<!-- labelType no Exists for backwards compatibility. No -->
<!-- information for this attribute is compiled into -->
<!-- the resource file.
<!— minResponses no [1] Minimum responses a candidate can choose. —>
<!-- maxResponses no [1] Maximum responses a candidate can choose. -->
<!-- randomizeDistracters no [false] (true) Whether or not the order of -->
<!-- the distracters should be randomized. —>
<!— SUB-ELEMENTS <!— <distracter> <!--
<!- NOTES
<!— This is the top level element for a multi-choice item on a
—> <!-
<!-
— >
<ElementType name="mediaLinkedMultiChoice" order="many" content="eltOnly" model="open"> <AttributeType name="correctAnswer" dt:type="string" required ="yes" /> <AttributeType name="labe.Type" dt:type="enumeration" required="no" dt:values="alphabetϊc" default="alphabetic" /> <AttributeType name="minResponses" dt:type="string" default="l" required="no" /> <AttributeType name="maxResponses" dt:type="string" default="l" required="no" /> <AttributeType name="randomizeDistracters" dt:type="enumeration" dt:values="true false" default="false" required="no" /> ottribute type="correctAnswer" /> ottribute type="labelType" /> ottribute type="minResponses" /> ottribute type="maxResponses" /> ottribute type="randomizeDistracters" /> olement type="distracter" /> </ElementType> <!--
—>
<!-- ===================================================================== -->
<!-- <mediaLinkedPrompt> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   type      no   [text] Exists for backwards compatibility. No -->
<!--                  information for this attribute is compiled into -->
<!--                  the resource file. -->
<!--   bgcolor   no   [#FFFFFF] The background color for the prompt. -->
<!--                  This is described as three hex values from 00 to -->
<!--                  FF; representing the red, green, and blue -->
<!--                  components of the color. These are preceded by a -->
<!--                  "#". -->
<!--   height    no   [29] Height in pixels of the prompt. -->
<!-- SUB-ELEMENTS -->
<!--   <text> <mediaInfo> -->
<!-- NOTES -->
<!--   - The prompt used for a media linked, multi-choice item. -->
<!-- ===================================================================== -->
<ElementType name="mediaLinkedPrompt" order="many" content="eltOnly" model="open">
<AttributeType name="type" dt:type="enumeration" dt:values="text" default="text" required="no" />
<AttributeType name="bgcolor" dt:type="string" required="no" default="#FFFFFF" />
<AttributeType name="height" dt:type="string" required="no" default="29" />
<attribute type="type" />
<attribute type="bgcolor" />
<attribute type="height" />
<element type="text" />
<element type="mediaInfo" />
</ElementType>
<!-- ===================================================================== -->
<!-- <mediaLinkedStem> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   type     yes  [text] Type of stem to use. -->
<!--   URI      no   If the HTML/XHTML is in an external file, then -->
<!--                 this points to the file. In the case of multiple -->
<!--                 items in one file use an HTML label. -->
<!--                 (I.E.: "file#label") -->
<!--   lang     no   Exists for backwards compatibility and does not -->
<!--                 offer additional functionality. -->
<!--   height   no   [206] Height in pixels of the control. -->
<!-- SUB-ELEMENTS -->
<!--   <mediaInfo> -->
<!-- NOTES -->
<!--   - The stem portion of an item, which is defined using HTML. -->
<!-- ===================================================================== -->
<ElementType name="mediaLinkedStem" order="many" content="eltOnly" model="open">
<AttributeType name="type" dt:type="enumeration" dt:values="text" required="yes" />
<AttributeType name="URI" dt:type="string" required="no" />
<AttributeType name="lang" dt:type="string" required="no" />
<AttributeType name="height" dt:type="string" required="no" default="206" />
<attribute type="type" />
<attribute type="lang" />
<attribute type="URI" />
<attribute type="height" />
<element type="mediaInfo" />
</ElementType>
<!-- ===================================================================== -->
<!-- <mediaLinkedDisplay> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   type      yes  [standard] (confirmation) Type of display. Standard -->
<!--                  has no special functionality, whereas confirmation -->
<!--                  requires user interaction. -->
<!--   srcPath   no   Default path to any external file that is not -->
<!--                  explicitly defined on a sub element. -->
<!--   image     no   Image (BMP) to display above the confirmation HTML. -->
<!--   URI       no   If the HTML/XHTML is in an external file, then -->
<!--                  this points to the file. In the case of multiple -->
<!--                  items in one file use an HTML label. -->
<!--                  (I.E.: "file#label") -->
<!--   bgcolor   no   [#FFFFCE] The background color for the display. -->
<!--                  This is described as three hex values from 00 to -->
<!--                  FF; representing the red, green, and blue -->
<!--                  components of the color. These are preceded by a -->
<!--                  "#". -->
<!-- SUB-ELEMENTS -->
<!--   <mediaInfo> -->
<!-- NOTES -->
<!--   - The bgcolor is only necessary for "confirmation" type of display -->
<!--     screens. Standard screens are completely HTML and can change -->
<!--     the background color through HTML tags. -->
<!-- ===================================================================== -->
<ElementType name="mediaLinkedDisplay" order="many" content="eltOnly" model="open">
<AttributeType name="type" dt:type="enumeration" dt:values="standard confirmation" required="yes" />
<AttributeType name="srcPath" dt:type="string" required="no" />
<AttributeType name="image" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" />
<AttributeType name="bgcolor" dt:type="string" required="no" default="#FFFFCE" />
<attribute type="type" />
<attribute type="srcPath" />
<attribute type="image" />
<attribute type="URI" />
<attribute type="bgcolor" />
<element type="mediaInfo" />
</ElementType>
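For illustration only, a confirmation display using this element might be written as follows; the source path, image file, display file reference, and label are hypothetical:
<mediaLinkedDisplay type="confirmation" srcPath="media\" image="confirm.bmp" URI="displays.htm#endSection" bgcolor="#FFFFCE">
  <mediaInfo />
</mediaLinkedDisplay>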
<!-- ===================================================================== -->
<!-- <mediaLinkedReview> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   srcPath   no   Source path. -->
<!-- SUB-ELEMENTS -->
<!--   <numberTotal> <numberComplete> <numberIncomplete> <numberMarked> -->
<!-- NOTES -->
<!--   - Used to define a review for a media linked type exam. -->
<!-- ===================================================================== -->
<ElementType name="mediaLinkedReview" order="many" content="eltOnly" model="open">
<AttributeType name="srcPath" dt:type="string" required="no" />
<AttributeType name="bgcolor" dt:type="string" default="#FFFFCE" required="no" />
<attribute type="srcPath" />
<attribute type="bgcolor" />
<element type="numberTotal" minOccurs="1" maxOccurs="1" />
<element type="numberComplete" minOccurs="1" maxOccurs="1" />
<element type="numberIncomplete" minOccurs="1" maxOccurs="1" />
<element type="numberMarked" minOccurs="1" maxOccurs="1" />
</ElementType>
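A minimal sketch of a review definition for a media linked exam follows. The source path is hypothetical, and the four required sub-elements are shown empty because their own attributes are defined elsewhere in the schema set:
<mediaLinkedReview srcPath="review\" bgcolor="#FFFFCE">
  <numberTotal />
  <numberComplete />
  <numberIncomplete />
  <numberMarked />
</mediaLinkedReview>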
<!-- [linear_navigate-schema.xml] -->
<!-- ===================================================================== -->
<!-- <linearNavigate> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   initialReview                no  [true] (false) Whether a candidate may -->
<!--                                    review items from the very beginning -->
<!--                                    of a section. -->
<!--   markAllowed                  no  [true] (false) Whether a candidate may -->
<!--                                    mark items during the exam for review -->
<!--                                    purposes. -->
<!--   incompleteEndAllowed         no  [true] (false) Whether a candidate may -->
<!--                                    end a section that contains incomplete -->
<!--                                    items. -->
<!--   endSectionPrompt             no  The message to display when ending a -->
<!--                                    section. -->
<!--   endIncompleteSectionPrompt   no  The message to display when ending a -->
<!--                                    section with incomplete items. -->
<!--   quitExamPrompt               no  The message to display when quitting -->
<!--                                    an exam. -->
<!--   comment                      no  [false] (true) If the candidate can -->
<!--                                    make comments during this section. -->
<!--   readOnly                     no  [false] (true) If the items are set to -->
<!--                                    be read-only. -->
<!--   nextOrMore                   no  [true] (false) Whether to show "Next" -->
<!--                                    button with "More" button. -->
<!-- SUB-ELEMENTS -->
<!--   none -->
<!-- NOTES -->
<!--   - Non-adaptive navigation plug-in. Allows for simple "movement" -->
<!--     between items and sections. -->
<!--   - For "markAllowed" to have an effect a helm which supports marking -->
<!--     of items must be used in the exam too. -->
<!--   - The button labels will appear exactly as entered including -->
<!--     capitalization. -->
<!--   - It is a common case to set comment="true" and readOnly="true" and -->
<!--     re-deliver a section for the sole purpose of commenting. -->
<!-- ===================================================================== -->
<ElementType name="linearNavigate" order="many" content="empty" model="closed">
<AttributeType name="initialReview" dt:type="enumeration" dt:values="true false" default="true" required="no" />
<AttributeType name="markAllowed" dt:type="enumeration" dt:values="true false" default="true" required="no" />
<AttributeType name="incompleteEndAllowed" dt:type="enumeration" dt:values="true false" default="true" required="no" />
<AttributeType name="endSectionPrompt" dt:type="string" required="no" default="This will end your section. Do you wish to end?" />
<AttributeType name="endIncompleteSectionPrompt" dt:type="string" required="no" default="You have not fully answered all items. If you end, incomplete items will be marked as incorrect. Do you wish to end?" />
<AttributeType name="quitExamPrompt" dt:type="string" required="no" default="You are about to exit the exam. Do you wish to exit?" />
<AttributeType name="comment" dt:type="enumeration" dt:values="true false" default="false" required="no" />
<AttributeType name="readOnly" dt:type="enumeration" dt:values="true false" default="false" required="no" />
<AttributeType name="nextOrMore" dt:type="enumeration" dt:values="true false" default="true" required="no" />
<attribute type="initialReview" />
<attribute type="markAllowed" />
<attribute type="incompleteEndAllowed" />
<attribute type="endSectionPrompt" />
<attribute type="quitExamPrompt" />
<attribute type="endIncompleteSectionPrompt" />
<attribute type="comment" />
<attribute type="readOnly" />
<attribute type="nextOrMore" />
</ElementType>
<!-- [hot_area-schema.xml] -->
<!-- ===================================================================== -->
<!-- <hotArea> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   correctAnswer        yes  Contains a list of multiple correct -->
<!--                             answers. (I.E.: "A", "AB", "ADF") -->
<!--   maxResponses         no   [1] Maximum number of responses allowed -->
<!--                             on a multi-response item. -->
<!--   minResponses         no   [1] Minimum number of responses allowed -->
<!--                             on a minimum-response item. -->
<!--   minItemScore         no   The minimum score achievable. This can -->
<!--                             be set to 0 to prevent negative scores. -->
<!--   maxItemScore         no   The maximum achievable item score. -->
<!--   undefinedAreaScore   no   The weight for selecting an area of the -->
<!--                             image not defined as an answer. Negative -->
<!--                             answers will be common for multiple -->
<!--                             answer items to offset the value of -->
<!--                             correct answers. -->
<!--   autoPrompt           no   [false] (true) If true, a prompt is -->
<!--                             expected to exist in the "prompt" tag, -->
<!--                             otherwise a prompt (if any) should exist -->
<!--                             in the HTML itself. -->
<!--   tooMany              no   The message to display when a candidate -->
<!--                             tries to select too many distractors. -->
<!--                             (Required if maxResponses > 1) -->
<!--   notEnough            no   The message to display when a candidate -->
<!--                             tries to navigate and minResponses not -->
<!--                             met. -->
<!--   URI                  no   If the HTML/XHTML is in an external file, -->
<!--                             then this points to the file. In the -->
<!--                             case of multiple items in one file, use -->
<!--                             the anchor and label. I.E. "file#label" -->
<!--   answerImage          yes  The image to display on a selected -->
<!--                             hot-area. -->
<!--   mouseCursor          no   The mouse cursor to display over the -->
<!--                             image. Defaults to crosshair. -->
<!-- SUB-ELEMENTS -->
<!--   <prompt> <html> -->
<!-- NOTES -->
<!--   - Distractors are created by using a combination of the following -->
<!--     HTML tags: <img usemap="mapname">, <map> and <area>. -->
<!--   - The hot area image will be an HTML <img> tag named "HotArea". -->
<!-- ===================================================================== -->
<ElementType name="hotArea" order="many" content="mixed" model="open">
<AttributeType name="correctAnswer" dt:type="string" required="yes" />
<AttributeType name="answerImage" dt:type="string" required="yes" />
<AttributeType name="maxResponses" dt:type="string" required="no" default="1" />
<AttributeType name="minResponses" dt:type="string" required="no" default="1" />
<AttributeType name="minItemScore" dt:type="string" required="no" default="0" />
<AttributeType name="maxItemScore" dt:type="string" required="no" default="1.0" />
<AttributeType name="undefinedAreaScore" dt:type="string" required="no" default="0" />
<AttributeType name="autoPrompt" dt:type="enumeration" dt:values="true false" required="no" default="false" />
<AttributeType name="tooMany" dt:type="string" required="no" />
<AttributeType name="notEnough" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" />
<AttributeType name="mouseCursor" dt:type="enumeration" dt:values="auto crosshair default hand help move text wait" default="crosshair" required="no" />
<attribute type="correctAnswer" />
<attribute type="answerImage" />
<attribute type="maxResponses" />
<attribute type="minResponses" />
<attribute type="minItemScore" />
<attribute type="maxItemScore" />
<attribute type="undefinedAreaScore" />
<attribute type="autoPrompt" />
<attribute type="tooMany" />
<attribute type="notEnough" />
<attribute type="URI" />
<attribute type="mouseCursor" />
<element type="prompt" minOccurs="0" maxOccurs="1" />
</ElementType>
<!-- [free_form-schema.xml] -->
<!-- Setup the schema so by default we are in the Microsoft namespace. -->
<!-- Then create a namespace called "dt" and associate it with the set -->
<!-- of data types Microsoft supports. -->
<!-- ===================================================================== -->
<!-- <freeForm> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   judgingSpecs   no   Contains a VB expression to match the response -->
<!--                       against. -->
<!--   prompt         no   If not defined the input box will flash to -->
<!--                       prompt the user if unanswered. Normally the -->
<!--                       message attribute that will prompt the user in -->
<!--                       a message box. -->
<!--   URI            no   If the HTML/XHTML is in an external file, then -->
<!--                       this points to the file. In the case of -->
<!--                       multiple items in one file, use the anchor -->
<!--                       label. (I.E.: "file#label") -->
<!-- SUB-ELEMENTS -->
<!--   <html> -->
<!-- NOTES -->
<!--   - The response must be an HTML <input> tag named "Response". -->
<!--   - Sample judgingSpecs: -->
<!--     "ucase(trim(oItem.GetResponseDisplay)) = ucase("George")" -->
<!-- ===================================================================== -->
<ElementType name="freeForm" order="many" content="mixed" model="open">
<AttributeType name="judgingSpecs" dt:type="string" required="no" />
<AttributeType name="prompt" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" />
<attribute type="judgingSpecs" />
<attribute type="prompt" default="You have not answered the item" />
<attribute type="URI" />
</ElementType>
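A sketch of a freeForm item using this element is shown below; the question text is hypothetical and the judgingSpecs expression is adapted from the sample in the notes above:
<freeForm judgingSpecs="ucase(trim(oItem.GetResponseDisplay)) = ucase(&quot;George&quot;)"
          prompt="You have not answered the item">
  <html>
    <p>Type the first name of the first President of the United States.</p>
    <input type="text" name="Response" />
  </html>
</freeForm>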
<!-- [form_select-schema.xml] -->
<!-- ===================================================================== -->
<!-- <selectionScript> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   type   no   [vbscript] (jscript) Whether the script is -->
<!--               vb-script or a java-script. -->
<!-- SUB-ELEMENTS -->
<!--   n/a -->
<!-- NOTES -->
<!-- ===================================================================== -->
<ElementType name="selectionScript" content="mixed" model="open">
<AttributeType name="type" dt:type="enumeration" dt:values="vbscript jscript" default="vbscript" required="no" />
<attribute type="type" />
</ElementType>
<!-- ===================================================================== -->
<!-- <formSelection> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   selectionMethod   no   [both] (random) (script) The method of -->
<!--                          choosing a form. -->
<!-- SUB-ELEMENTS -->
<!--   <selectionScript> -->
<!-- NOTES -->
<!--   - The selection method works in the following manner: -->
<!--     random - Form will be chosen randomly. -->
<!--     script - A script expression (one line) will be run which will -->
<!--              return the name of the form (I.E. for eligibility). -->
<!--     both   - Any script will be run. If it returns a valid form -->
<!--              name, that form will be used. Otherwise a random form -->
<!--              will be chosen. -->
<!-- ===================================================================== -->
<ElementType name="formSelection" content="eltOnly">
<AttributeType name="selectionMethod" dt:type="enumeration" dt:values="random script both" default="both" required="no" />
<attribute type="selectionMethod" />
<element type="selectionScript" minOccurs="0" maxOccurs="1" />
</ElementType>
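As an illustration, a form group might declare its selection behaviour as follows. The form name returned by the one-line expression is a hypothetical placeholder; a real exam would typically compute it, for example from eligibility data:
<formSelection selectionMethod="both">
  <selectionScript type="vbscript">"FormA"</selectionScript>
</formSelection>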
<!-- [flash_item-schema.xml] -->
<!-- ===================================================================== -->
<!-- <flashFile> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   URI         yes  The URI of the Flash file (SWF). -->
<!--   form        yes  Name of the form that qualifies the variable -->
<!--                    names that are read and written. For example, -->
<!--                    if form="main", then the bProceed variable -->
<!--                    would be accessed as "main/bProceed". -->
<!--   saveFrame   no   [false] (true) Whether or not the frame of the -->
<!--                    movie should be persisted to the instance file -->
<!--                    and restored from it. -->
<!--   timeLine    yes  The flash script timeline that qualifies the -->
<!--                    labels call. For example if timeline="root", -->
<!--                    then the Delivery label would be accessed as -->
<!--                    target := "root" and label := "Delivery". -->
<!-- SUB-ELEMENTS -->
<!--   -none- -->
<!-- NOTES -->
<!--   - The flashFile tag contains all the information relating to the -->
<!--     use of the .SWF file. -->
<!-- ===================================================================== -->
<ElementType name="flashFile" order="many" content="empty" model="open">
<AttributeType name="URI" dt:type="string" required="yes" />
<AttributeType name="form" dt:type="string" required="yes" />
<AttributeType name="saveFrame" dt:type="enumeration" dt:values="true false" default="false" required="no" />
<AttributeType name="timeLine" dt:type="string" required="yes" />
<attribute type="URI" />
<attribute type="form" />
<attribute type="saveFrame" />
<attribute type="timeLine" />
</ElementType>
<!-- ===================================================================== -->
<!-- <flashMessage> -->
<!-- SUB-ELEMENTS -->
<!--   -none- -->
<!-- NOTES -->
<!--   - This contains the text that will be displayed when the plug-in -->
<!--     prompts the user before proceeding to the next presentation. -->
<!-- ===================================================================== -->
<ElementType name="flashMessage" content="mixed" model="open" />
<!-- ===================================================================== -->
<!-- <flashItem> -->
<!-- SUB-ELEMENTS -->
<!--   <flashFile> -->
<!--   <flashMessage> -->
<!--   <data> -->
<!-- NOTES -->
<!--   - The data sub element contains the initialization data to the -->
<!--     flash script. The data could reside between the start and end -->
<!--     data tag or in an external file referenced by the URI. -->
<!-- ===================================================================== -->
<ElementType name="flashItem" order="many" content="eltOnly" model="open">
<element type="flashFile" />
<element type="flashMessage" />
<element type="data" />
</ElementType>
<!-- [button_display-schema.xml] -->
<!-- ===================================================================== -->
<!-- <buttonDisplay> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   bgcolor       no   Background color of the helm. The color is -->
<!--                      specified with three hex numbers preceded by a -->
<!--                      pound sign ("#"). The three hex numbers are -->
<!--                      between 00 and FF. They represent the red, -->
<!--                      green and blue components of the target color. -->
<!--   allowPOLESS   no   (true) (false) Allows the user to navigate to a -->
<!--                      POLESS resource in the browser. -->
<!--   allowFile     no   (false) (true) Allows user to navigate to a file -->
<!--                      in the browser. -->
<!--   allowURI      no   (false) (true) Allows user to navigate to a URL -->
<!--                      in the browser. -->
<!--   enable        no   A comma separated list of windowButton names. -->
<!--                      If a name appears, that button will be enabled; -->
<!--                      otherwise the button will not show up. -->
<!--                      The order of the list is the order in which the -->
<!--                      window buttons appear. -->
<!-- SUB-ELEMENTS -->
<!--   <windowButton> -->
<!-- NOTES -->
<!--   - This tag houses the buttons which will launch a pop-up display -->
<!--     (help, exhibit, etc.) -->
<!-- ===================================================================== -->
<ElementType name="buttonDisplay" content="mixed" order="many" model="open">
<AttributeType name="bgcolor" dt:type="string" required="no" />
<AttributeType name="allowPoless" dt:type="enumeration" dt:values="true false" required="no" />
<AttributeType name="allowFile" dt:type="enumeration" dt:values="true false" required="no" />
<AttributeType name="allowURI" dt:type="enumeration" dt:values="true false" required="no" />
<AttributeType name="enable" dt:type="string" required="no" />
<attribute type="bgcolor" />
<attribute type="allowPoless" />
<attribute type="allowFile" />
<attribute type="allowURI" />
<attribute type="enable" />
<element type="windowButton" minOccurs="0" maxOccurs="*" />
</ElementType>
<!-- ===================================================================== -->
<!-- <windowButton> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   name              yes  The name associated with a button. -->
<!--   title             no   Button and window name. -->
<!--   img               no   A URI pointing to a (bitmap) image. This -->
<!--                          image will be placed on the button. -->
<!--   windowWidth       no   The width of the pop-up window in pixels. -->
<!--   windowHeight      no   The height of the pop-up window in pixels. -->
<!--   close             no   The text which appears on the pop-up's -->
<!--                          close button. If empty, no close button -->
<!--                          will appear. -->
<!--   allowMultiple     no   (false) (true) Whether or not multiple -->
<!--                          copies of the button window can be launched. -->
<!--   startupPosition   no   (cascade) (center) The initial window -->
<!--                          position when the window first pops up. -->
<!-- SUB-ELEMENTS -->
<!--   <tab> -->
<!-- NOTES -->
<!--   - Each button (help, exhibit, etc.) has a corresponding -->
<!--     windowButton tag. -->
<!--   - One to many windows that can be displayed as a popup. -->
<!-- ===================================================================== -->
<ElementType name="windowButton" content="mixed" order="many" model="open">
<AttributeType name="name" dt:type="string" required="yes" />
<AttributeType name="title" dt:type="string" required="no" />
<AttributeType name="img" dt:type="string" required="no" />
<AttributeType name="windowWidth" dt:type="string" required="no" />
<AttributeType name="windowHeight" dt:type="string" required="no" />
<AttributeType name="close" dt:type="string" required="no" />
<AttributeType name="allowMultiple" dt:type="enumeration" dt:values="true false" required="no" />
<AttributeType name="startupPosition" dt:type="enumeration" dt:values="cascade center" required="no" />
<attribute type="name" />
<attribute type="title" />
<attribute type="img" />
<attribute type="windowWidth" />
<attribute type="windowHeight" />
<attribute type="close" />
<attribute type="allowMultiple" />
<attribute type="startupPosition" />
<element type="tab" minOccurs="0" maxOccurs="*" />
</ElementType>
<!-- ===================================================================== -->
<!-- <tab> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   URI     no   If the HTML/XHTML is in an external file, then this -->
<!--                points to the file. In the case of multiple displays -->
<!--                in one file, use the anchor label. -->
<!--   title   no   Title that will be displayed in the tab. -->
<!-- NOTES -->
<!--   - Each "tab" of information is made up of one of these tags. If -->
<!--     a pop-up button window only needs one screen of information, then -->
<!--     only one tab is used and the plug-in will not display any tabs. -->
<!--   - If the URI doesn't exist, the plug-in expects XHTML within this -->
<!--     element to make up the display's content. -->
<!-- ===================================================================== -->
<ElementType name="tab" content="mixed" order="many" model="open">
<AttributeType name="URI" dt:type="string" required="no" />
<AttributeType name="title" dt:type="string" required="no" />
<attribute type="URI" />
<attribute type="title" />
</ElementType>
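Putting the three definitions above together, a helm's pop-up buttons might be declared as follows; the button names, file references, and tab titles are hypothetical:
<buttonDisplay bgcolor="#C0C0C0" enable="help,exhibit">
  <windowButton name="help" title="Help" windowWidth="400" windowHeight="300"
                close="Close" startupPosition="center">
    <tab URI="help.htm#navigation" title="Navigating" />
    <tab URI="help.htm#marking" title="Marking Items" />
  </windowButton>
  <windowButton name="exhibit" title="Exhibit" img="exhibit.bmp"
                windowWidth="640" windowHeight="480" close="Close" allowMultiple="true">
    <tab URI="exhibit.htm" title="Exhibit" />
  </windowButton>
</buttonDisplay>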
<!-- [browser_review-schema.xml] -->
<!-- ===================================================================== -->
<!-- <browserReview> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   URI                        yes  A URI pointing to an HTML file. -->
<!--   idComplete                 no   HTML ID for when item is completed. -->
<!--   idSkipped                  no   HTML ID for when item is skipped. -->
<!--   idScoreCandidate           no   HTML ID for the candidate's score on -->
<!--                                   the item. -->
<!--   idScoreMinimum             no   HTML ID for the minimum possible -->
<!--                                   score. -->
<!--   idScoreNominal             no   HTML ID for the nominal score of the -->
<!--                                   item. -->
<!--   idScoreMaximum             no   HTML ID for the maximum possible -->
<!--                                   score. -->
<!--   idGetCorrectAnswer         no   HTML ID for displaying an item's -->
<!--                                   correct answer. -->
<!--   idGetResponseDisplay       no   HTML ID for displaying candidate's -->
<!--                                   answer. -->
<!--   idPresented                no   HTML ID for when item is presented. -->
<!--   idName                     no   HTML ID for the item's name. -->
<!--   idTitle                    no   HTML ID for the item's title. -->
<!--   idWeight                   no   HTML ID for displaying the item's -->
<!--                                   weight. -->
<!--   idSecondsElapsed           no   HTML ID for seconds spent on the -->
<!--                                   item. -->
<!--   idScored                   no   HTML ID for when the item is scored. -->
<!--   idSkipAllowed              no   HTML ID for when skipping is allowed. -->
<!--   idScoreCandidateWeighted   no   HTML ID for the candidate's score -->
<!--                                   weighted. -->
<!--   idPluginName               no   HTML ID for the plug-in name of the -->
<!--                                   item. -->
<!--   idElapsed                  no   HTML ID for complete date/time. -->
<!--   idTimesPresented           no   HTML ID for # of times item has been -->
<!--                                   presented. -->
<!-- SUB-ELEMENTS -->
<!--   n/a -->
<!-- NOTES -->
<!--   - The bulk of the attributes on browser review allow overriding -->
<!--     the HTML ID used to associate an IItem or IcItem property. If -->
<!--     a property is not given an explicit value, then the property will -->
<!--     default to the name used when accessing it through script. -->
<!--     EXAMPLE: idSkipped is bSkipped by default. -->
<!--   - Default values are not utilized within the schema to prevent -->
<!--     explicit values from being over-written during a compile -->
<!--     situation involving amalgamation. -->
<!-- ===================================================================== -->
<ElementType name="browserReview" order="many" content="mixed" model="open">
<AttributeType name="URI" dt:type="string" required="yes" />
<AttributeType name="idComplete" dt:type="string" required="no" />
<AttributeType name="idSkipped" dt:type="string" required="no" />
<AttributeType name="idScoreCandidate" dt:type="string" required="no" />
<AttributeType name="idScoreMinimum" dt:type="string" required="no" />
<AttributeType name="idScoreNominal" dt:type="string" required="no" />
<AttributeType name="idScoreMaximum" dt:type="string" required="no" />
<AttributeType name="idGetCorrectAnswer" dt:type="string" required="no" />
<AttributeType name="idGetResponseDisplay" dt:type="string" required="no" />
<AttributeType name="idPresented" dt:type="string" required="no" />
<AttributeType name="idName" dt:type="string" required="no" />
<AttributeType name="idTitle" dt:type="string" required="no" />
<AttributeType name="idWeight" dt:type="string" required="no" />
<AttributeType name="idSecondsElapsed" dt:type="string" required="no" />
<AttributeType name="idScored" dt:type="string" required="no" />
<AttributeType name="idSkipAllowed" dt:type="string" required="no" />
<AttributeType name="idScoreCandidateWeighted" dt:type="string" required="no" />
<AttributeType name="idPluginName" dt:type="string" required="no" />
<AttributeType name="idElapsed" dt:type="string" required="no" />
<AttributeType name="idTimesPresented" dt:type="string" required="no" />
<attribute type="URI" />
<attribute type="idComplete" />
<attribute type="idSkipped" />
<attribute type="idScoreCandidate" />
<attribute type="idScoreMinimum" />
<attribute type="idScoreNominal" />
<attribute type="idScoreMaximum" />
<attribute type="idGetCorrectAnswer" />
<attribute type="idGetResponseDisplay" />
<attribute type="idPresented" />
<attribute type="idName" />
<attribute type="idTitle" />
<attribute type="idWeight" />
<attribute type="idSecondsElapsed" />
<attribute type="idScored" />
<attribute type="idSkipAllowed" />
<attribute type="idScoreCandidateWeighted" />
<attribute type="idPluginName" />
<attribute type="idElapsed" />
<attribute type="idTimesPresented" />
</ElementType>
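A minimal usage sketch follows; the review HTML file name and the overridden IDs are hypothetical, and, per the notes above, any ID not overridden defaults to the script property name (e.g. bSkipped):
<browserReview URI="review.htm" idComplete="colComplete" idName="colName" idTitle="colTitle" />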
<!-- [browser_report-schema.xml] -->
<!-- ===================================================================== -->
<!-- <browserReport> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   reprintMessage   no   If present, display the contents of this -->
<!--                         message in a dialog with "Yes" and "No" -->
<!--                         buttons. Report will continue to reprint -->
<!--                         until the candidate clicks "No". -->
<!-- SUB-ELEMENTS -->
<!--   <page> -->
<!-- NOTES -->
<!--   - An HTML (score) report which is sent to the printer. -->
<!-- ===================================================================== -->
<ElementType name="browserReport" order="many" content="mixed" model="closed">
<AttributeType name="reprintMessage" dt:type="string" required="no" />
<attribute type="reprintMessage" />
<element type="page" minOccurs="1" maxOccurs="1" />
</ElementType>
<!-- ===================================================================== -->
<!-- <page> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   width         no   Width of the paper to print on, in inches. -->
<!--   height        no   Height of the paper to print on, in inches. -->
<!--   pageSize      no   [letter] Size of the paper to print the report -->
<!--                      on. Some examples include: legal, 10x14, 11x17, -->
<!--                      A4, a4small, a5, b4, b5, custom, etc. -->
<!--                      This value is ignored if a width and height are -->
<!--                      specified. -->
<!--   orientation   no   [portrait] (landscape) Determines how the -->
<!--                      printed information is oriented on the paper. -->
<!--   header        no   Specifies a string which contains Internet -->
<!--                      Explorer print control-characters, which -->
<!--                      determine the format of the printed page's -->
<!--                      header. -->
<!--   footer        no   Specifies a string which contains Internet -->
<!--                      Explorer print control-characters, which -->
<!--                      determine the format of the printed page's -->
<!--                      footer. -->
<!--   URI           no   If the HTML/XHTML is in an external file, then -->
<!--                      this points to the file. -->
<!-- SUB-ELEMENTS -->
<!--   <html> -->
<!-- NOTES -->
<!--   - The information for a given page of the printed report. -->
<!-- ===================================================================== -->
<ElementType name="page" order="many" content="mixed" model="open">
<AttributeType name="width" dt:type="string" required="no" />
<AttributeType name="height" dt:type="string" required="no" />
<AttributeType name="pageSize" dt:type="string" required="no" default="letter" />
<AttributeType name="orientation" dt:type="enumeration" dt:values="landscape portrait" default="portrait" required="no" />
<AttributeType name="header" dt:type="string" required="no" />
<AttributeType name="footer" dt:type="string" required="no" />
<AttributeType name="URI" dt:type="string" required="no" />
<attribute type="width" />
<attribute type="height" />
<attribute type="pageSize" />
<attribute type="orientation" />
<attribute type="header" />
<attribute type="footer" />
<attribute type="URI" />
</ElementType>
<!-- [browser_display-schema.xml] -->
<!-- Setup the schema so by default we are in the Microsoft namespace. -->
<!-- Then create a namespace called "dt" and associate it with the set -->
<!-- of data types Microsoft supports. -->
<!-- ===================================================================== -->
<!-- <browserDisplay> -->
<!-- ATTRIBUTE REQ? DESCRIPTION -->
<!--   allowPOLESS   no   [true] Allows the user to navigate to a POLESS -->
<!--                      resource in the browser. -->
<!--   allowURI      no   [false] Allows user to navigate to a file in -->
<!--                      the browser. -->
<!--   allowFile     no   [false] Allows user to navigate to a URL in the -->
<!--                      browser. -->
<!--   URI           no   If the HTML/XHTML is in an external file, then -->
<!--                      this points to the file. In the case of -->
<!--                      multiple displays in one file, use the anchor -->
<!--                      label. (I.E.: "file#label") -->
<!-- SUB-ELEMENTS -->
<!-- NOTES -->
<!--   - Within element is the HTML or XHTML to display to candidate. -->
<!--   - Use for "displays", which is content presented to the candidate -->
<!--     but has no bearing on the score of the exam. -->
<!--   - ???TBH - when Microsoft adds referencing multiple schemas, this -->
<!--     tag should reference the XHTML schema. -->
<!-- ===================================================================== -->
<ElementType name="browserDisplay" order="many" content="mixed" model="open">
<AttributeType name="allowPOLESS" dt:type="enumeration" dt:values="true false" default="true" required="no" />
<AttributeType name="allowURI" dt:type="enumeration" dt:values="true false" default="false" required="no" />
<AttributeType name="allowFile" dt:type="enumeration" dt:values="true false" default="false" required="no" />
<AttributeType name="URI" required="no" />
<attribute type="allowPOLESS" />
<attribute type="allowURI" />
<attribute type="allowFile" />
<attribute type="URI" />
</ElementType>
</Schema>
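To illustrate the last element defined above, a browserDisplay may be authored either by reference to an external file or with inline XHTML content; the file reference and welcome text shown are hypothetical:
<browserDisplay URI="welcome.htm#intro" />
<!-- or, with inline XHTML content: -->
<browserDisplay>
  <p>Welcome to the examination. Select the Next button to begin.</p>
</browserDisplay>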

Claims

What is claimed is:
1. A memory storing a test definition language in extensible markup language format that characterizes at least one computer-based test delivered to an examinee using a test driver and is implemented by at least one computer, the test definition language having a plurality of segments, the at least one computer-based test having a presentation format and data content, and the test driver delivering the at least one computer-based test to an examinee using a display device, managing the at least one computer-based test, controlling progression of the at least one computer-based test, controlling scoring of the at least one computer-based test, controlling timing of the at least one computer-based test, controlling printing of the at least one computer-based test, and controlling results reporting of the at least one computer-based test based on the test definition language, the memory storing: at least one of a plurality of first data structures, the plurality of first data structures including element specific data objects indicating a classification of at least one of the plurality of segments of the test definition language, wherein the plurality of segments defines information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the at least one computer-based test; and at least one of a plurality of second data structures at least one of depending from and subordinate to the at least one of the plurality of first data structures, the plurality of second data structures including attribute specific data objects indicating at least one attribute of the at least one of the plurality of segments of the test definition language implemented by the at least one computer.
2. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises one of message boxes, scripts, data, plug-ins, templates, categories, items, groups, sections, forms, form groups, and exams.
3. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises message boxes and the at least one attribute of the at least one of the plurality of segments comprises at least one of ok, cancel, abort, retry, ignore, yes, no, and title.
4. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises scripts and the at least one attribute of the at least one of the plurality of segments comprises type.
5. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises data and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, a uniform resource identifier, and keep external.
6. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises plug-ins and the at least one attribute of the at least one of the plurality of segments comprises at least one of name and program identification.
7. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises templates and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, split, and size.
8. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises categories and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, duplicate, complete, and contents.
9. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises items and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, title, template, area, weight, scored, and skip allowed.
10. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises groups and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, title, and template.
11. The memory of claim 1 , wherein the classification of the at least one of the plurality of segments of the test definition language comprises sections and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, title, and skip allowed.
12. The memory of claim 1 , wherein the classification of the at least one of the plurality of segments of the test definition language comprises forms and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, version, title, restartable, skip allowed, begin section, and end section.
13. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises form groups and the at least one attribute of the at least one of the plurality of segments comprises at least one of name and selection.
14. The memory of claim 1, wherein the classification of the at least one of the plurality of segments of the test definition language comprises exams and the at least one attribute of the at least one of the plurality of segments comprises at least one of name, version, and title.
15. The memory of claim 1, further stores at least one of a plurality of third data structures at least one of depending from and subordinate to the at least one of the plurality of first data structures, the plurality of third data structures including data specific data objects indicating at least one sub- classification of the at least one of the plurality of segments of the test definition language implemented by the at least one computer.
16. The memory of claim 15, wherein the classification of the at least one of the plurality of segments of the test definition language comprises one of plug-ins and items.
17. The memory of claim 1, further stores at least one of a plurality of third data structures at least one of depending from and subordinate to the at least one of the plurality of first data structures, the plurality of third data structures including element specific data objects indicating a sub- classification of the at least one of the plurality of segments of the test definition language, wherein the sub-classification further indicates at least one property specific to the at least one of the plurality of segments of the test definition language implemented by the at least one computer.
18. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises one of message boxes, plug-ins, templates, categories, items, groups, sections, forms, form groups, and exams.
19. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises message boxes and the at least one sub- classification of the at least one of the plurality of segments comprises at least one of box size and button size.
20. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises plug-ins and the at least one sub-classification of the at least one of the plurality of segments comprises data.
21. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises templates and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of area and template.
22. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises categories and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of category, category reference, description, and scoring.
23. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises items and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of data, category reference, start, finish, condition, and attribute.
24. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises groups and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of selection, scoring, attribute, groups, group reference, presentation, section, and section reference.
25. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises sections and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of navigation, timer, start finish, condition, attribute, category reference, group, and group reference.
26. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises forms and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of scoring, timer, minimum resolution, report, results, attribute, section reference, and section.
27. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises form groups and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of selection, form reference, and form.
28. The memory of claim 17, wherein the classification of the at least one of the plurality of segments of the test definition language comprises exams and the at least one sub-classification of the at least one of the plurality of segments comprises at least one of minimum resolution, attribute, form, form reference, and form group.
29. The memory of claim 17, further stores at least one of a plurality of fourth data structures at least one of depending from and subordinate to the at least one of the plurality of first data structures, the plurality of fourth data structures including group specific data objects indicating an order of an appearance of the at least one of the plurality of third data structures, a minimum occurrence of the appearance of the at least one of the plurality of third data structures, and a maximum occurrence of the appearance of the at least one of the plurality of third data structures.
30. The memory of claim 29, wherein the classification of the at least one of the plurality of segments of the test definition language comprises one of groups, sections, forms, form groups, and exams.
31. The memory of claim 29, wherein the classification of the at least one of the plurality of segments of the test definition language comprises groups and the sub-classification of the at least one of the plurality of third data structures comprises at least one of group, group reference, presentation, section, and section reference.
32. The memory of claim 29, wherein the classification of the at least one of the plurality of segments of the test definition language comprises sections and the sub-classification of the at least one of the plurality of third data structures comprises at least one of group and group reference.
33. The memory of claim 29, wherein the classification of the at least one of the plurality of segments of the test definition language comprises forms and the at least one of the plurality of third data structures comprises at least one of section reference and section.
34. The memory of claim 29, wherein the classification of the at least one of the plurality of segments of the test definition language comprises form groups and the at least one of the plurality of third data structures comprises at least one of form reference and form.
35. The memory of claim 29, wherein the classification of the at least one of the plurality of segments of the test definition language comprises exams and the at least one of the plurality of third data structures comprises at least one of form, form reference, and form group.
36. A memory storing a schema for a test definition language in extensible markup language format that characterizes at least one computer-based test delivered to an examinee using a test driver and is implemented by at least one computer, the test definition language having a plurality of segments, the at least one computer-based test having a presentation format and data content, and the test driver delivering the at least one computer-based test to an examinee using a display device, managing the at least one computer-based test, controlling progression of the at least one computer-based test, controlling scoring of the at least one computer-based test, controlling timing of the at least one computer-based test, controlling printing of the at least one computer-based test, and controlling results reporting of the at least one computer-based test based on the test definition language, wherein the schema defines a permissible grammar for the test definition language, the memory storing: at least one of a plurality of first data structures, the plurality of first data structures including element definition specific data objects defining an element classification of at least one of the plurality of segments of the schema, wherein the plurality of segments defines classification identification information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the at least one computer-based test; at least one of a plurality of second data structures, the plurality of second data structures including attribute definition specific data objects defining at least one attribute classification of the at least one of the plurality of segments of the schema; at least one of a plurality of third data structures, the plurality of third data structures including element specific data objects indicating at least one element sub-classification of the at least one of the plurality of segments of the schema; and at least one of a plurality of fourth data structures, the plurality of fourth data structures including attribute specific data objects indicating at least one attribute of the at least one of the plurality of segments of the test definition language implemented by the at least one computer.
37. The memory of claim 36, wherein the at least one of the plurality of second data structures at least one of depends from and is subordinate to the at least one of the plurality of first data structures.
38. The memory of claim 36, wherein the at least one of the plurality of second data structures is independent of the at least one of the plurality of first data structures.
39. The memory of claim 36, wherein the at least one of the plurality of third data structures at least one of depends from and is subordinate to the at least one of the plurality of first data structures.
40. The memory of claim 36, wherein the at least one of the plurality of fourth data structures at least one of depends from and is subordinate to the at least one of the plurality of first data structures.
41. The memory of claim 36, wherein the element classification is defined by at least one of a plurality of definition attributes, the definition attributes comprising at least one of name, content, order, and model.
42. The memory of claim 36, wherein the at least one attribute classification is defined by at least one of a plurality of definition attributes, the definition attributes comprising at least one of name, type, required, values, and default.
43. The memory of claim 36, wherein the at least one element sub-classification is defined by at least one of a plurality of definition attributes, the definition attributes comprising at least one of type, minimum occurrences, and maximum occurrences.
44. The memory of claim 36, wherein the at least one attribute is defined by a definition attribute, the definition attribute comprising type.
45. The memory of claim 36, further stores at least one of a plurality of fifth data structures, the plurality of fifth data structures including group specific data objects indicating an order of an appearance of the at least one of the plurality of third data structures, a minimum occurrence of the appearance of the at least one of the plurality of third data structures, and a maximum occurrence of the appearance of the at least one of the plurality of third data structures.
46. The memory of claim 45, wherein the at least one of the plurality of fifth data structures at least one of depends from and is subordinate to the at least one of the plurality of first data structures.
47. A method for computer-based testing for at least one test having a presentation format and a data content, the at least one test being delivered by a test driver, the method comprising the steps of: authoring a test specification and content of the at least one test using a test definition language, wherein the test specification and content defines the presentation format and the data content of the at least one test; compiling the test specification and content of the at least one test to create a compiled test specification and content, wherein the compiling comprises validating the test specification and content; storing the compiled test specification and content to a resource file; and retrieving the compiled test specification and content from the resource file during delivery of the test.
48. The method of claim 47, wherein validating the test specification and content comprises determining whether the test specification and content are correctly formatted.
49. The method of claim 47, wherein the test definition language comprises extensible markup language format.
50. The method of claim 49, wherein validating the test specification and content comprises determining whether the test specification and content are correctly formatted and wherein a correct format for the test specification and content is defined in a schema.
51. The method of claim 47, wherein the test specification and content are compiled by a compiler and at least one validation module.
52. The method of claim 51 , wherein the at least one validation module validates at least a portion of the test specification and content.
53. The method of claim 52, wherein the at least one validation module is a plugin.
54. The method of claim 47, wherein authoring the test specification and content further comprises: defining at least one element for each aspect of the at least one test; and defining at least one of at least one attribute object and at least one data object of the at least one element.
55. The method of claim 54, wherein the at least one element comprises one of message boxes, scripts, data, plug-ins, templates, categories, items, groups, sections, forms, form groups, and exams.
56. The method of claim 47, wherein compiling the test specification and content of the at least one test further comprises: compiling a first set of data files, wherein the first set of data files are globally accessible to the test driver; and compiling a second set of data files, wherein the second set of data files comprise the test definition language.
57. The method of claim 56, wherein the first set of data files comprises multimedia data and wherein the first set of data files are compiled before the second set of data files are compiled.
58. The method of claim 56, wherein the second set of data files further comprises at least one of data comprising scripts, data comprising at least one validation module, data comprising the presentation format of the at least one test, data comprising items, and data comprising the data content of the at least one test.
59. The method of claim 58, wherein compiling the second set of data files further comprises: compiling the data comprising scripts; compiling the data comprising the at least one validation module after compiling the data comprising scripts; compiling the data comprising the presentation format of the at least one test after compiling the data comprising the at least one validation module; compiling the data comprising items after compiling the data comprising the presentation format of the at least one test; and compiling the data comprising the data content of the at least one test after compiling the data comprising items.
60. A method for defining a schema for a test definition language, the method comprising: defining a first set of elements; defining a set of attributes; and defining a second set of elements, wherein the second set of elements references the first set of elements and the set of attributes.
61. In a computer-based testing system executing at least one test controlled by a test driver, the test driver having an executable code that controls the test driver and functionality performed by the test driver that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, control timing of the at least one computer-based test, control printing of the at least one computer-based test, and control results reporting of the at least one test based on a test definition language in extensible markup language format, a method for computer-based testing for the at least one test, the at least one test having a presentation format and data content, the test definition language having a plurality of segments that defines information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the at least one test, the method comprising at least one of the sequential, non-sequential and sequence independent steps of: authoring at least one of the plurality of segments and storing the at least one of the plurality of segments to the source file; instantiating a validation expansion module during a test production cycle; loading the at least one of the plurality of segments of the test definition language into a memory from the source file; validating the at least one of the plurality of segments; unloading the at least one of the plurality of segments from the memory into at least one of a plurality of storage elements; providing to the validation expansion module the at least one of the plurality of storage elements; loading the at least one of the plurality of segments of the test definition language from the at least one of the plurality of storage elements into the validation expansion module during a test delivery cycle; implementing directly by the validation expansion module the information defined by the at least one of the plurality of segments; and accessing by the test driver the at least one of the plurality of segments of the test definition language to enable the functionality of the test driver via the direct implementation by the validation expansion module.
62. In a computer-based testing system executing at least one test controlled by a test driver, the test driver having an executable code that controls the test driver and functionality performed by the test driver that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, control timing of the at least one computer-based test, control printing of the at least one computer-based test, and control results reporting of the at least one test based on a test definition language in extensible markup language format, a method for computer-based testing for the at least one test, the at least one test having a presentation format and data content, the test definition language having a plurality of element specific data objects and a plurality of attribute specific data objects depending from and subordinate to the plurality of element specific data objects, the plurality of element specific data objects and the plurality of attribute specific data objects defining information comprising the data content, the presentation format, the progression, the scoring, the printing, the timing, and the results reporting of the at least one test, the method comprising at least one of the sequential, non-sequential and sequence independent steps of: authoring at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects and storing the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects to the source file; instantiating a validation expansion module during a test production cycle; loading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language into the validation expansion module from the source file; validating the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects; unloading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects from the memory into at least one of a plurality of storage elements; providing to the validation expansion module the at least one of the plurality of storage elements; loading the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language from the at least one of the plurality of storage elements into the validation expansion module during a test delivery cycle; implementing directly by the validation expansion module the information defined by the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects; and accessing by the test driver the at least one of the plurality of element specific data objects and the at least one of the plurality of attribute specific data objects of the test definition language to enable the functionality of the test driver via the direct implementation by the validation expansion module.
PCT/US2002/036288 2001-11-13 2002-11-13 Extensible exam language (xxl) protocol for computer based testing WO2003042786A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002361616A AU2002361616A1 (en) 2001-11-13 2002-11-13 Extensible exam language (xxl) protocol for computer based testing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US33122801P 2001-11-13 2001-11-13
US60/331,228 2001-11-13

Publications (3)

Publication Number Publication Date
WO2003042786A2 true WO2003042786A2 (en) 2003-05-22
WO2003042786A3 WO2003042786A3 (en) 2003-10-23
WO2003042786B1 WO2003042786B1 (en) 2003-11-27

Family

ID=23293104

Family Applications (5)

Application Number Title Priority Date Filing Date
PCT/US2002/036264 WO2003043157A1 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using plugins to expand functionality of a test driver
PCT/US2002/036220 WO2003043255A2 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using an amalgamated resource file
PCT/US2002/036286 WO2003042956A1 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using a non-deterministic exam extensible language (xxl) protocol
PCT/US2002/036288 WO2003042786A2 (en) 2001-11-13 2002-11-13 Extensible exam language (xxl) protocol for computer based testing
PCT/US2002/036287 WO2003042785A2 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using customizable templates

Family Applications Before (3)

Application Number Title Priority Date Filing Date
PCT/US2002/036264 WO2003043157A1 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using plugins to expand functionality of a test driver
PCT/US2002/036220 WO2003043255A2 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using an amalgamated resource file
PCT/US2002/036286 WO2003042956A1 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using a non-deterministic exam extensible language (xxl) protocol

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2002/036287 WO2003042785A2 (en) 2001-11-13 2002-11-13 Method and system for computer based testing using customizable templates

Country Status (6)

Country Link
US (12) US7828551B2 (en)
EP (1) EP1444763A4 (en)
CN (1) CN100486068C (en)
AU (3) AU2002360371A1 (en)
CA (1) CA2466683C (en)
WO (5) WO2003043157A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012168576A1 (en) * 2011-06-08 2012-12-13 Edunovo Personal electronic learning device

Families Citing this family (145)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020182579A1 (en) * 1997-03-27 2002-12-05 Driscoll Gary F. System and method for computer based creation of tests formatted to facilitate computer based testing
US7845950B2 (en) * 1997-03-27 2010-12-07 Educational Testing Service System and method for computer based creation of tests formatted to facilitate computer based testing
US7828551B2 (en) * 2001-11-13 2010-11-09 Prometric, Inc. Method and system for computer based testing using customizable templates
US20040076930A1 (en) * 2002-02-22 2004-04-22 Steinberg Linda S. Partal assessment design system for educational testing
US8769517B2 (en) * 2002-03-15 2014-07-01 International Business Machines Corporation Generating a common symbol table for symbols of independent applications
US7496845B2 (en) 2002-03-15 2009-02-24 Microsoft Corporation Interactive presentation viewing system employing multi-media components
US7127641B1 (en) * 2002-03-29 2006-10-24 Cypress Semiconductor Corp. System and method for software testing with extensible markup language and extensible stylesheet language
US20040001106A1 (en) * 2002-06-26 2004-01-01 John Deutscher System and process for creating an interactive presentation employing multi-media components
US7216340B1 (en) * 2002-08-19 2007-05-08 Sprint Communications Company L.P. Analysis data validation tool for use in enterprise architecture modeling with result based model updating
US7424702B1 (en) 2002-08-19 2008-09-09 Sprint Communications Company L.P. Data integration techniques for use in enterprise architecture modeling
AU2003302232A1 (en) 2002-11-13 2004-06-30 Educational Testing Service Systems and methods for testing over a distributed network
AU2003282439A1 (en) * 2002-11-27 2004-06-18 Samsung Electronics Co., Ltd. Apparatus and method for reproducing interactive contents by controlling font according to aspect ratio conversion
CA2414378A1 (en) * 2002-12-09 2004-06-09 Corel Corporation System and method for controlling user interface features of a web application
CA2414047A1 (en) * 2002-12-09 2004-06-09 Corel Corporation System and method of extending scalable vector graphics capabilities
KR100526181B1 (en) * 2003-05-13 2005-11-03 삼성전자주식회사 Test-Stream Generating Method And Apparatus Providing Various Standards And Testing Level
US20050282133A1 (en) * 2004-06-18 2005-12-22 Christopher Crowhurst System and method for facilitating computer-based testing using traceable test items
CN100585662C (en) 2003-06-20 2010-01-27 汤姆森普罗梅特里克公司 System and method for computer based testing using cache and cacheable objects to expand functionality of a test driver application
WO2005008440A2 (en) * 2003-07-11 2005-01-27 Computer Associates Think, Inc. System and method for common storage object model
AU2004265995A1 (en) * 2003-08-15 2005-02-24 Blackboard Inc. Content system and associated methods
US7523447B1 (en) * 2003-09-24 2009-04-21 Avaya Inc. Configurator using markup language
US20050125196A1 (en) * 2003-12-09 2005-06-09 Len Swanson Method and system for computer-assisted test construction performing specification matching during test item selection
CA2651461A1 (en) * 2003-12-17 2005-06-17 Ibm Canada Limited - Ibm Canada Limitee Relationship management for data modeling in an integrated development environment
US8136094B2 (en) * 2004-01-07 2012-03-13 International Business Machines Corporation Relationship management for data modeling in an integrated development environment
US7506332B2 (en) * 2004-03-24 2009-03-17 Sap Ag Object set optimization using dependency information
US7793262B2 (en) * 2004-07-29 2010-09-07 International Business Machines Corporation Method and apparatus for facilitating software testing and report generation with interactive graphical user interface
EP1789894A4 (en) * 2004-08-02 2007-09-19 Justsystems Corp Document processing and management approach to making changes to a document and its representation
US7688723B1 (en) 2004-09-16 2010-03-30 Avaya Inc. Procedural XML-based telephony traffic flow analysis and configuration tool
US7830813B1 (en) 2004-09-30 2010-11-09 Avaya Inc. Traffic based availability analysis
US20060099563A1 (en) * 2004-11-05 2006-05-11 Zhenyu Lawrence Liu Computerized teaching, practice, and diagnosis system
US8146057B1 (en) * 2005-01-07 2012-03-27 Interactive TKO, Inc. Instrumentation system and method for testing software
US8117591B1 (en) 2005-01-07 2012-02-14 Interactive TKO, Inc. Graphical model for test case viewing, editing, and reporting
US20060160057A1 (en) * 2005-01-11 2006-07-20 Armagost Brian J Item management system
FR2882448B1 (en) * 2005-01-21 2007-05-04 Meiosys Soc Par Actions Simpli METHOD OF MANAGING, LOGGING OR REPLAYING THE PROGRESS OF AN APPLICATION PROCESS
WO2006096133A1 (en) * 2005-03-10 2006-09-14 Knowledge Director Pte. Ltd. System and method for generation of multimedia learning files
US8834173B2 (en) * 2005-04-08 2014-09-16 Act, Inc. Method and system for scripted testing
US8326659B2 (en) * 2005-04-12 2012-12-04 Blackboard Inc. Method and system for assessment within a multi-level organization
US7493520B2 (en) * 2005-06-07 2009-02-17 Microsoft Corporation System and method for validating the graphical output of an updated software module
US7543188B2 (en) * 2005-06-29 2009-06-02 Oracle International Corp. Browser based remote control of functional testing tool
US20070038894A1 (en) * 2005-08-09 2007-02-15 Microsoft Corporation Test Data verification with different granularity levels
US20070111182A1 (en) * 2005-10-26 2007-05-17 International Business Machines Corporation Method and system for distributing answers
US8083675B2 (en) * 2005-12-08 2011-12-27 Dakim, Inc. Method and system for providing adaptive rule based cognitive stimulation to a user
CN101370925B (en) * 2006-01-23 2014-03-26 美利肯公司 Laundry care compositions with thiazolium dye
JP2008009715A (en) * 2006-06-29 2008-01-17 Hitachi Ltd Program distribution method and computer system
JP4902282B2 (en) * 2006-07-12 2012-03-21 株式会社日立製作所 Business system configuration change method, management computer, and business system configuration change program
US9390629B2 (en) 2006-09-11 2016-07-12 Houghton Mifflin Harcourt Publishing Company Systems and methods of data visualization in an online proctoring interface
US10861343B2 (en) 2006-09-11 2020-12-08 Houghton Mifflin Harcourt Publishing Company Polling for tracking online test taker status
US9142136B2 (en) 2006-09-11 2015-09-22 Houghton Mifflin Harcourt Publishing Company Systems and methods for a logging and printing function of an online proctoring interface
US9230445B2 (en) 2006-09-11 2016-01-05 Houghton Mifflin Harcourt Publishing Company Systems and methods of a test taker virtual waiting room
US9892650B2 (en) 2006-09-11 2018-02-13 Houghton Mifflin Harcourt Publishing Company Recovery of polled data after an online test platform failure
US20080124696A1 (en) * 2006-10-26 2008-05-29 Houser Ronald L Empirical development of learning content using educational measurement scales
GB2443443A (en) * 2006-10-30 2008-05-07 Hewlett Packard Development Co method of defining editable portions within the template document
GB2443446B (en) * 2006-10-30 2011-11-30 Hewlett Packard Development Co A method of identifying an extractable portion of a source machine-readable document
GB2443444A (en) * 2006-10-30 2008-05-07 Hewlett Packard Development Co Remotely editing a template document
GB2443438A (en) * 2006-10-30 2008-05-07 Hewlett Packard Development Co Method of constructing and storing a document
GB2443447A (en) * 2006-10-30 2008-05-07 Hewlett Packard Development Co A method of constructing an output document by adding data from a variable data document to a template document
US7831195B2 (en) * 2006-12-11 2010-11-09 Sharp Laboratories Of America, Inc. Integrated paper and computer-based testing administration system
US8239478B2 (en) * 2006-12-18 2012-08-07 Fourier Systems (1989) Ltd. Computer system
US20080229288A1 (en) * 2007-03-13 2008-09-18 Steve Nelson Software Plugin Modules for Device Testing
US20080293033A1 (en) * 2007-03-28 2008-11-27 Scicchitano Anthony R Identity management system, including multi-stage, multi-phase, multi-period and/or multi-episode procedure for identifying and/or authenticating test examination candidates and/or individuals
US8881105B2 (en) * 2007-04-11 2014-11-04 Patrick J. Quilter, Jr. Test case manager
US8156149B2 (en) * 2007-07-24 2012-04-10 Microsoft Corporation Composite nested streams
US8191045B2 (en) * 2007-09-04 2012-05-29 Nec Laboratories America, Inc. Mining library specifications using inductive learning
US8037163B1 (en) 2008-01-08 2011-10-11 Avaya Inc. Alternative methodology in assessing network availability
US8387015B2 (en) * 2008-01-31 2013-02-26 Microsoft Corporation Scalable automated empirical testing of media files on media players
US20100107425A1 (en) 2008-05-05 2010-05-06 Eveready Battery Company Inc. Razor Blade and Method of Manufacture
US9111019B2 (en) 2008-09-30 2015-08-18 Interactive TKO, Inc. Modeling and testing interactions between components of a software system
KR20100041447A (en) * 2008-10-14 2010-04-22 삼성전자주식회사 Apparatus and method for automatic testing of software or digital devices
US10943030B2 (en) 2008-12-15 2021-03-09 Ibailbonding.Com Securable independent electronic document
US9081881B2 (en) 2008-12-18 2015-07-14 Hartford Fire Insurance Company Computer system and computer-implemented method for use in load testing of software applications
US9280907B2 (en) * 2009-10-01 2016-03-08 Kryterion, Inc. Proctored performance analysis
US9141513B2 (en) 2009-10-01 2015-09-22 Kryterion, Inc. Maintaining a secure computing device in a test taking environment
US8915744B2 (en) * 2009-11-06 2014-12-23 Tata Consultancy Services, Ltd. System and method for automated competency assessment
US9640085B2 (en) * 2010-03-02 2017-05-02 Tata Consultancy Services, Ltd. System and method for automated content generation for enhancing learning, creativity, insights, and assessments
US10672286B2 (en) 2010-03-14 2020-06-02 Kryterion, Inc. Cloud based test environment
US20110223576A1 (en) * 2010-03-14 2011-09-15 David Foster System for the Administration of a Secure, Online, Proctored Examination
US9081888B2 (en) 2010-03-31 2015-07-14 Cloudera, Inc. Collecting and aggregating log data with fault tolerance
US9082127B2 (en) 2010-03-31 2015-07-14 Cloudera, Inc. Collecting and aggregating datasets for analysis
US8874526B2 (en) 2010-03-31 2014-10-28 Cloudera, Inc. Dynamically processing an event using an extensible data model
US9317572B2 (en) 2010-03-31 2016-04-19 Cloudera, Inc. Configuring a system to collect and aggregate datasets
US20110246511A1 (en) * 2010-04-06 2011-10-06 John Smith Method and system for defining and populating segments
US9243476B2 (en) * 2010-05-19 2016-01-26 Schlumberger Technology Corporation System and method for simulating oilfield operations
US8739150B2 (en) * 2010-05-28 2014-05-27 Smartshift Gmbh Systems and methods for dynamically replacing code objects via conditional pattern templates
US9322872B2 (en) 2010-06-01 2016-04-26 The United States Of America As Represented By The Secretary Of The Navy Correlated testing system
US9137163B2 (en) 2010-08-04 2015-09-15 Kryterion, Inc. Optimized data stream upload
US8713130B2 (en) 2010-08-04 2014-04-29 Kryterion, Inc. Peered proctoring
WO2012024352A2 (en) * 2010-08-16 2012-02-23 Extegrity Inc. Systems and methods for detecting substitution of high-value electronic documents
US9449405B2 (en) * 2010-11-30 2016-09-20 Sap Se Systems and methods to display dependencies within a graph of grouped elements
US8869112B2 (en) * 2010-11-30 2014-10-21 Sap Se System and method for modular business applications
US8429744B1 (en) * 2010-12-15 2013-04-23 Symantec Corporation Systems and methods for detecting malformed arguments in a function by hooking a generic object
US8554797B2 (en) 2010-12-17 2013-10-08 Sap Ag System and method for modular business applications
JP5978597B2 (en) * 2011-03-18 2016-08-24 株式会社リコー Information display device, question input device and system
US20120254722A1 (en) * 2011-03-31 2012-10-04 Cloudera, Inc. Interactive user interface implementation and development environment therefor
US8909127B2 (en) 2011-09-27 2014-12-09 Educational Testing Service Computer-implemented systems and methods for carrying out non-centralized assessments
US9128949B2 (en) 2012-01-18 2015-09-08 Cloudera, Inc. Memory allocation buffer for reduction of heap fragmentation
US9172608B2 (en) * 2012-02-07 2015-10-27 Cloudera, Inc. Centralized configuration and monitoring of a distributed computing cluster
WO2013126782A1 (en) 2012-02-24 2013-08-29 National Assoc. Of Boards Of Pharmacy Outlier detection tool
US9405692B2 (en) 2012-03-21 2016-08-02 Cloudera, Inc. Data processing performance enhancement in a distributed file system
US9338008B1 (en) 2012-04-02 2016-05-10 Cloudera, Inc. System and method for secure release of secret information over a network
GB2511668A (en) * 2012-04-12 2014-09-10 Supercell Oy System and method for controlling technical processes
US9842126B2 (en) 2012-04-20 2017-12-12 Cloudera, Inc. Automatic repair of corrupt HBases
US9088463B1 (en) 2012-05-11 2015-07-21 Amazon Technologies, Inc. Container contract for data dependencies
US9015234B2 (en) 2012-07-25 2015-04-21 Lg Cns Co., Ltd. Automated distributed testing administration environment
US9330164B1 (en) 2012-10-26 2016-05-03 Andrew Wills Edwards Electronic platform for user creation and organization of groups of member profiles to aid in memorization of selected information
KR20140056478A (en) * 2012-10-26 2014-05-12 삼성전자주식회사 Automatic testing apparatus for embedded software, automatic testing method thereof and test scenario composing method
US9116865B2 (en) 2012-12-05 2015-08-25 Chegg, Inc. Enhancing automated terms listings in HTML document publishing based on user searches
US9595205B2 (en) * 2012-12-18 2017-03-14 Neuron Fuel, Inc. Systems and methods for goal-based programming instruction
US20140214709A1 (en) * 2013-01-07 2014-07-31 Assessment Innovation, Inc. Occupational performance assessment apparatuses, methods and systems
US9342557B2 (en) 2013-03-13 2016-05-17 Cloudera, Inc. Low latency query engine for Apache Hadoop
US9703679B2 (en) 2013-03-14 2017-07-11 International Business Machines Corporation Probationary software tests
US8954456B1 (en) 2013-03-29 2015-02-10 Measured Progress, Inc. Translation and transcription content conversion
WO2014174414A1 (en) * 2013-04-23 2014-10-30 University Of The Witwatersrand, Johannesburg A system for instrument quality assessment
KR20160031005A (en) * 2013-07-16 2016-03-21 가부시키가이샤 베네세 코포레이션 Portable information processing device, test assistance system, and test assistance method
US9477731B2 (en) 2013-10-01 2016-10-25 Cloudera, Inc. Background format optimization for enhanced SQL-like queries in Hadoop
US9934382B2 (en) 2013-10-28 2018-04-03 Cloudera, Inc. Virtual machine image encryption
US9690671B2 (en) 2013-11-01 2017-06-27 Cloudera, Inc. Manifest-based snapshots in distributed computing environments
US10025839B2 (en) 2013-11-29 2018-07-17 Ca, Inc. Database virtualization
US10706734B2 (en) 2013-12-06 2020-07-07 Act, Inc. Methods for improving test efficiency and accuracy in a computer adaptive test (CAT)
US9940310B1 (en) * 2014-03-04 2018-04-10 Snapwiz Inc. Automatically converting an electronic publication into an online course
US9727314B2 (en) 2014-03-21 2017-08-08 Ca, Inc. Composite virtual services
US9531609B2 (en) 2014-03-23 2016-12-27 Ca, Inc. Virtual service automation
US9430216B2 (en) * 2014-05-11 2016-08-30 Lumension Security, Inc. Self-contained executable for predetermined software updating
US11266342B2 (en) * 2014-05-30 2022-03-08 The Regents Of The University Of Michigan Brain-computer interface for facilitating direct selection of multiple-choice answers and the identification of state changes
US9747333B2 (en) 2014-10-08 2017-08-29 Cloudera, Inc. Querying operating system state on multiple machines declaratively
US9684876B2 (en) 2015-03-30 2017-06-20 International Business Machines Corporation Question answering system-based generation of distractors using machine learning
US20170004723A1 (en) * 2015-06-30 2017-01-05 Act, Inc. Identifying evidence of justification and explanation skills in computer automated scoring
US10324829B2 (en) * 2015-07-30 2019-06-18 Entit Software Llc Application testing
US10084738B2 (en) 2015-10-23 2018-09-25 Paypal, Inc. Emoji commanded action
US10402049B1 (en) * 2015-12-29 2019-09-03 EMC IP Holding Company LLC User interface development
US11593342B2 (en) 2016-02-01 2023-02-28 Smartshift Technologies, Inc. Systems and methods for database orientation transformation
US10114736B2 (en) 2016-03-30 2018-10-30 Ca, Inc. Virtual service data set generation
US10585655B2 (en) 2016-05-25 2020-03-10 Smartshift Technologies, Inc. Systems and methods for automated retrofitting of customized code objects
US10089103B2 (en) 2016-08-03 2018-10-02 Smartshift Technologies, Inc. Systems and methods for transformation of reporting schema
JP2020532031A (en) 2017-08-23 2020-11-05 ニューラブル インコーポレイテッド Brain-computer interface with high-speed optotype tracking
JP6322757B1 (en) * 2017-09-29 2018-05-09 株式会社ドワンゴ Server and terminal
US10740132B2 (en) * 2018-01-30 2020-08-11 Veritas Technologies Llc Systems and methods for updating containers
US10740075B2 (en) 2018-02-06 2020-08-11 Smartshift Technologies, Inc. Systems and methods for code clustering analysis and transformation
US10698674B2 (en) 2018-02-06 2020-06-30 Smartshift Technologies, Inc. Systems and methods for entry point-based code analysis and transformation
US10528343B2 (en) 2018-02-06 2020-01-07 Smartshift Technologies, Inc. Systems and methods for code analysis heat map interfaces
US10664050B2 (en) 2018-09-21 2020-05-26 Neurable Inc. Human-computer interface using high-speed and accurate tracking of user interactions
US11561997B2 (en) * 2019-03-13 2023-01-24 Oracle International Corporation Methods, systems, and computer readable media for data translation using a representational state transfer (REST) application programming interface (API)
US11537727B2 (en) * 2020-05-08 2022-12-27 Bold Limited Systems and methods for creating enhanced documents for perfect automated parsing
CN112099928A (en) * 2020-08-28 2020-12-18 上海微亿智造科技有限公司 Recovery method, system and medium for accidental stop of Maxwell process
US11481312B2 (en) * 2020-10-15 2022-10-25 EMC IP Holding Company LLC Automation framework for monitoring and reporting on resource consumption and performance bottlenecks
US11574696B2 (en) * 2021-04-12 2023-02-07 Nanya Technology Corporation Semiconductor test system and method
CN113985252B (en) * 2021-10-28 2022-07-15 江苏博敏电子有限公司 Lamp panel jig regional testing method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5827070A (en) * 1992-10-09 1998-10-27 Educational Testing Service System and methods for computer based testing
US6149438A (en) * 1991-08-09 2000-11-21 Texas Instruments Incorporated System and method for the delivery, authoring, and management of courseware over a computer network
US6418298B1 (en) * 1997-10-21 2002-07-09 The Riverside Publishing Co. Computer network based testing system
US6431875B1 (en) * 1999-08-12 2002-08-13 Test And Evaluation Software Technologies Method for developing and administering tests over a network

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4216539A (en) * 1978-05-05 1980-08-05 Zehntel, Inc. In-circuit digital tester
US5318450A (en) 1989-11-22 1994-06-07 Gte California Incorporated Multimedia distribution system for instructional materials
US5157782A (en) 1990-01-31 1992-10-20 Hewlett-Packard Company System and method for testing computer hardware and software
US5195033A (en) * 1990-06-08 1993-03-16 Assessment Systems, Inc. Testing system including removable storage means for transfer of test related data and means for issuing a certification upon successful completion of the test
US5359546A (en) * 1992-06-30 1994-10-25 Sun Microsystems, Inc. Automatic generation of test drivers
DE69316631T2 (en) * 1992-08-31 1998-07-16 Dow Chemical Co SCRIPT-BASED SYSTEM FOR TESTING A MULTI-USER COMPUTER SYSTEM
EP0671039B1 (en) 1993-09-30 2004-03-17 Educational Testing Service A centralized system and method for administering computer based tests
US5629878A (en) 1993-10-07 1997-05-13 International Business Machines Corporation Test planning and execution models for generating non-redundant test modules for testing a computer system
US5524110A (en) * 1993-11-24 1996-06-04 Intel Corporation Conferencing over multiple transports
US5761684A (en) * 1995-05-30 1998-06-02 International Business Machines Corporation Method and reusable object for scheduling script execution in a compound document
US5743743A (en) * 1996-09-03 1998-04-28 Ho; Chi Fai Learning method and system that restricts entertainment
US6138252A (en) 1996-07-01 2000-10-24 Sun Microsystems, Inc. Graphical test progress monitor
US6029257A (en) * 1996-12-06 2000-02-22 Intergraph Corporation Apparatus and method for testing computer systems
US6632248B1 (en) * 1996-12-06 2003-10-14 Microsoft Corporation Customization of network documents by accessing customization information on a server computer using unique user identifiers
US5854930A (en) * 1996-12-30 1998-12-29 Mci Communications Corporations System, method, and computer program product for script processing
US6134674A (en) 1997-02-28 2000-10-17 Sony Corporation Computer based test operating system
US6259890B1 (en) * 1997-03-27 2001-07-10 Educational Testing Service System and method for computer based test creation
US6142682A (en) 1997-06-13 2000-11-07 Telefonaktiebolaget Lm Ericsson Simulation of computer processor
US6018617A (en) 1997-07-31 2000-01-25 Advantage Learning Systems, Inc. Test generating and formatting system
US6289472B1 (en) 1997-08-07 2001-09-11 Texas Instruments Incorporated Method and test system for testing under a plurality of test modes
US6243835B1 (en) 1998-01-30 2001-06-05 Fujitsu Limited Test specification generation system and storage medium storing a test specification generation program
US6000945A (en) 1998-02-09 1999-12-14 Educational Testing Service System and method for computer based test assembly
US6149441A (en) 1998-11-06 2000-11-21 Technology For Connecticut, Inc. Computer-based educational system
US5987443A (en) * 1998-12-22 1999-11-16 Ac Properties B. V. System, method and article of manufacture for a goal based educational system
US6725399B1 (en) 1999-07-15 2004-04-20 Compuware Corporation Requirements based software testing method
US6934934B1 (en) 1999-08-30 2005-08-23 Empirix Inc. Method and system for software object testing
US6681098B2 (en) 2000-01-11 2004-01-20 Performance Assessment Network, Inc. Test administration system using the internet
US6712615B2 (en) * 2000-05-22 2004-03-30 Rolf John Martin High-precision cognitive performance test battery suitable for internet and non-internet use
US6594466B1 (en) * 2000-05-24 2003-07-15 Bentley Systems, Incorporated Method and system for computer based training
US6505342B1 (en) * 2000-05-31 2003-01-07 Siemens Corporate Research, Inc. System and method for functional testing of distributed, component-based software
EP1178407A3 (en) * 2000-06-02 2007-12-12 Compaq Computer Corporation Architecture for parallel distributed table driven I/O mapping
US6632174B1 (en) * 2000-07-06 2003-10-14 Cognifit Ltd (Naiot) Method and apparatus for testing and training cognitive ability
US6996816B1 (en) * 2000-10-02 2006-02-07 Hewlett-Packard Development Company, L.P. Utilization of third party legacy data list
US6704741B1 (en) * 2000-11-02 2004-03-09 The Psychological Corporation Test item creation and manipulation system and method
US6694510B1 (en) * 2000-11-03 2004-02-17 Hewlett-Packard Development Company, L.P. Collection driver for collecting system data using record based requests with tag lists and pausing all but one thread of a computer system
US6898712B2 (en) 2001-02-20 2005-05-24 Networks Associates Technology, Inc. Test driver ordering
US7828551B2 (en) * 2001-11-13 2010-11-09 Prometric, Inc. Method and system for computer based testing using customizable templates

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6149438A (en) * 1991-08-09 2000-11-21 Texas Instruments Incorporated System and method for the delivery, authoring, and management of courseware over a computer network
US5827070A (en) * 1992-10-09 1998-10-27 Educational Testing Service System and methods for computer based testing
US6418298B1 (en) * 1997-10-21 2002-07-09 The Riverside Publishing Co. Computer network based testing system
US6431875B1 (en) * 1999-08-12 2002-08-13 Test And Evaluation Software Technologies Method for developing and administering tests over a network

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012168576A1 (en) * 2011-06-08 2012-12-13 Edunovo Personal electronic learning device
FR2976391A1 (en) * 2011-06-08 2012-12-14 Edunovo ELECTRONIC PERSONAL LEARNING DEVICE.

Also Published As

Publication number Publication date
AU2002360371A1 (en) 2003-05-26
US20090155758A1 (en) 2009-06-18
US8579633B2 (en) 2013-11-12
US7494340B2 (en) 2009-02-24
AU2002363735A1 (en) 2003-05-26
CN100486068C (en) 2009-05-06
WO2003043157A1 (en) 2003-05-22
WO2003042956A1 (en) 2003-05-22
US20110236873A1 (en) 2011-09-29
US7080303B2 (en) 2006-07-18
US6966048B2 (en) 2005-11-15
EP1444763A1 (en) 2004-08-11
WO2003043255A2 (en) 2003-05-22
US6948153B2 (en) 2005-09-20
US7318727B2 (en) 2008-01-15
US7784045B2 (en) 2010-08-24
CA2466683A1 (en) 2003-05-22
US20090155756A1 (en) 2009-06-18
US8413131B2 (en) 2013-04-02
WO2003042956B1 (en) 2003-10-30
US20090162829A1 (en) 2009-06-25
WO2003043255A3 (en) 2004-11-25
WO2003042785A2 (en) 2003-05-22
US8454369B2 (en) 2013-06-04
WO2003043255A9 (en) 2004-05-13
US20030138765A1 (en) 2003-07-24
US7828551B2 (en) 2010-11-09
US20030129573A1 (en) 2003-07-10
WO2003042785A3 (en) 2003-10-02
US20110145792A1 (en) 2011-06-16
WO2003042786B1 (en) 2003-11-27
US20060107254A1 (en) 2006-05-18
WO2003042786A3 (en) 2003-10-23
AU2002361616A1 (en) 2003-05-26
US9418565B2 (en) 2016-08-16
CN1608338A (en) 2005-04-20
US20030182602A1 (en) 2003-09-25
EP1444763A4 (en) 2005-11-16
US20060069970A1 (en) 2006-03-30
CA2466683C (en) 2016-01-12
US20030196170A1 (en) 2003-10-16
US20030203342A1 (en) 2003-10-30

Similar Documents

Publication Publication Date Title
WO2003042786A2 (en) Extensible exam language (xxl) protocol for computer based testing
Signore Towards a quality model for web sites
Kearsley Designing educational software for international use
Möller Ehrnlund Enriching the user experience of e-learning platforms using responsive design: a case study
Bordash et al. The Web Professional’s Handbook
CA2530064C (en) System and method for computer based testing using cache and cacheable objects to expand functionality of a test driver application
Agrawal CodEval
Wallis JavaTutor-A Remotely Collaborative, Real-Time Distributed Intelligent Tutoring System for Introductory Java Computer Programming-A Qualitative Analysis.
Mohd Rakbi Interactive Learning System for E-commerce, Technopreneurship
Ragnarsson Data models for interactive web based Textbooks: Investigating how well DocBook 5.0 supports interactive and multimedia content
Mader Submission dated 22 January 2019
Harmon A WEB-BASED QUIZ DELIVERY SYSTEM USING AJAX (ASYNCHRONOUS JAVASCRIPT AND XML)
Mattson et al. Using IMS Caliper, Question & Test Interoperability (QTI) and Learning
Emadi-Dehaghi Dynamic and XML-Based Contents within E-learning
Moncur Sams teach yourself DHTML in 24 hours

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
B Later publication of amended claims

Free format text: 20030513

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP