
Publication number: US 8156424 B2
Publication type: Grant
Application number: US 11/232,827
Publication date: Apr 10, 2012
Filing date: Sep 22, 2005
Priority date: Oct 8, 2004
Also published as: US 20060103588
Inventors: Roy K. Chrisop, Tanna Marie Richardson
Original Assignee: Sharp Laboratories of America, Inc.
Methods and systems for imaging device dynamic document creation and organization
US 8156424 B2
Abstract
Aspects of the present invention relate to systems, methods and devices for imaging device dynamic document creation and organization. Some aspects relate to imaging device dynamic document creation and organization, wherein a document format, a dynamic field structure and document static content are selected on an imaging device user interface and combined to form a dynamic document on the imaging device. Some aspects of the present invention relate to imaging device dynamic document creation and organization, wherein dynamic document menu options are sent to an imaging device from a remote computing device.
Claims (10)
What is claimed is:
1. A method for imaging device dynamic document creation and organization, the method comprising:
identifying a user at an imaging device (IDev), based on user input at an IDev user interface (UI);
sending a user identification, formatted as a markup language message, from a web service on said IDev to a remote computing device (RCD);
receiving access to data related to said user via said RCD;
associating dynamic document editing functions with said user data, wherein said user data is linked to user data fields identified in a dynamic field structure;
receiving a document format selection from an imaging device (IDev) user interface (UI) on an IDev, wherein said document format selection defines a page size, a page orientation, a single-sided or double-sided page format, a margin parameter, a header parameter and a footer parameter and at least one of a text box parameter and a graphic box parameter, and wherein said IDev comprises a printer function and said IDev UI in a single hardware device;
defining a dynamic field structure on said imaging device (IDev) UI, wherein said dynamic field structure defines a plurality of data fields comprising numerical data, graphics data, and image data, wherein said dynamic field structure defines field attributes consisting of field size, field shape, field color, field background color, field shading, field rotation and field orientation, said dynamic field structure also comprising a field relationship relating content of a first field in said plurality of data fields to content of a second field in said plurality of data fields, wherein said field relationship defines a plurality of relationships consisting of a geographical relationship, a mathematical relationship and a logical relationship wherein said content of said first field is related to said content of said second field according to said relationships;
designating document static content on said imaging device (IDev) user interface, wherein said document static content defines content remaining constant in all instances of said dynamic document; and
compiling said document format, said dynamic field structure and said document static content into a dynamic document on said IDev, wherein said dynamic document is a selectable file, which, when selected, automatically populates said data fields to create a document defined by said document format, said dynamic field structure and said document static content.
2. A method as described in claim 1 wherein said combining into a dynamic document is done on a remote computing device (RCD).
3. A method for imaging device dynamic document creation and organization, said method comprising:
receiving a user identification, formatted as a markup language message, from a web service on an imaging device (IDev) to a remote computing device (RCD);
accessing data related to said user with said RCD;
associating dynamic document editing functions with said user data, wherein said user data is linked to user data fields identified in a dynamic field structure;
receiving a dynamic document editing request as a markup language message received from said web service on said imaging device (IDev) at said remote computing device (RCD);
sending document format options menu content to said IDev web service from said RCD, wherein said document format options menu content is an XML message;
receiving a document format options menu selection from said IDev web service at said RCD, wherein said document format options menu selection defines a page size, a page orientation, a single-sided or double-sided page format, a margin location, a header dimension or a footer dimension and at least one of a text box parameter and a graphics box parameter and wherein said document format options menu selection is a markup language message;
sending a dynamic field structure options menu to said IDev, wherein said dynamic field structure options menu is in the form of a markup language message;
receiving a dynamic field structure options selection from said IDev, wherein said dynamic field structure options selection was received as user input on an IDev user interface (UI) at said IDev and wherein said dynamic field structure options selection defines a plurality of data fields comprising numerical data, graphics data and image data, wherein said dynamic field structure also defines field attributes consisting of field size, field shape, field color, field background color, field shading, field rotation and field orientation, said dynamic field structure also comprising a field relationship relating content of a first field in said plurality of data fields to content of a second field in said plurality of data fields, wherein said field relationship defines a plurality of relationships consisting of a geographical relationship, a mathematical relationship and a logical relationship wherein said content of said first field is related to said content of said second field according to said relationships;
sending a static content menu to said IDev from said RCD, wherein said static content menu is in the form of a markup language message;
receiving a static content menu selection from said IDev web service as a markup language message, wherein said static content menu selection defines content remaining constant in all documents created with said dynamic document;
compiling said document format options menu selection, said dynamic field structure options menu selection and said static content menu selection into a dynamic document on said RCD and wherein said dynamic document is a selectable file, which, when selected, automatically populates said data fields to create a document defined by said document format, said dynamic field structure and said document static content.
4. A method as described in claim 3 further comprising storing said dynamic document in a manner that provides access to any sources linked to a field in said dynamic document.
5. A method for imaging device dynamic document creation and organization, said method comprising:
identifying a user at an imaging device (IDev), based on user input at an IDev user interface (UI);
sending a user identification, formatted as a markup language message, from a web service on said IDev to a remote computing device (RCD);
accessing data related to said user with said RCD;
associating dynamic document editing functions with said user data, wherein said user data is linked to user data fields identified in a dynamic field structure;
sending a document format options menu with user specific data to said IDev from said remote computing device (RCD), wherein said document format options menu is a markup language message;
displaying said document format options menu on said IDev;
accepting a document format user selection of said document format options on said IDev, wherein said document format user selection defines at least one of a page size, a page orientation, a single-sided or double-sided page format, a margin location, a header dimension or a footer dimension and at least one of a text box parameter and a graphic box parameter;
sending said user selection to said RCD;
sending a dynamic field structure options menu with user specific data to said IDev from said RCD;
displaying said dynamic field structure options menu on said IDev;
accepting a dynamic field structure user selection of said dynamic field structure options on said IDev, wherein said dynamic field structure user selection defines a plurality of data fields comprising numerical data, graphics data and image data, wherein said dynamic field structure also defines field attributes consisting of field size, field shape, field color, field background color, field shading, field rotation and field orientation, said dynamic field structure also comprising a field relationship relating content of a first field in said plurality of data fields to content of another field in said plurality of data fields, wherein said field relationship defines a plurality of relationships consisting of a geographical relationship, a mathematical relationship and a logical relationship;
sending said user selection to said RCD;
sending a static content menu with user specific data to said IDev from said RCD;
displaying said static content menu on said IDev;
accepting a static content user selection of said static content menu on said IDev, wherein said static content user selection defines content remaining constant in all documents created with said dynamic document;
sending said user selection to said RCD; and
compiling said document format user selection, said dynamic field structure user selection and said static content user selection into a dynamic document on said RCD, wherein said dynamic document is a selectable file, which, when selected, automatically populates said data fields to create a document defined by said document format user selection, said dynamic field structure user selection and said document static content user selection if said selectable file is selected by an entity authenticated as said user.
6. A method as described in claim 5 wherein said dynamic field structure options comprise at least one option selected from the set consisting of relating a field to data stored on a remote device, relating a field to data on a web page and relating a field to user-specific data stored on a user database.
7. A method as described in claim 5 wherein said static content comprises at least one item selected from the set consisting of text, numerical data, graphical data, and images.
8. A non-transitory, computer-readable medium comprising instructions for instructing a processor to execute a method for imaging device dynamic document creation and organization, said instructions instructing said processor to:
identify a user at an imaging device (IDev), based on user input at an IDev user interface (UI);
send a user identification, formatted as a markup language message, from a web service on said IDev to a remote computing device (RCD);
receive access to data related to said user via said RCD;
associate dynamic document editing functions with said user data, wherein said user data is linked to user data fields identified in a dynamic field structure;
receive a document format selection from an imaging device (IDev) user interface (UI) on an IDev, wherein said document format selection defines a page size, a page orientation, a single-sided or double-sided page format, a margin parameter, a header parameter and a footer parameter and at least one of a text box parameter and a graphic box parameter, and wherein said IDev comprises a printer function and said IDev UI in a single hardware device;
define a dynamic field structure on said imaging device (IDev) UI, wherein said dynamic field structure defines a plurality of data fields comprising numerical data, graphics data, and image data, wherein said dynamic field structure defines field attributes consisting of field size, field shape, field color, field background color, field shading, field rotation and field orientation, said dynamic field structure also comprising a field relationship relating content of a first field in said plurality of data fields to content of a second field in said plurality of data fields, wherein said field relationship defines a plurality of relationships consisting of a geographical relationship, a mathematical relationship and a logical relationship wherein said content of said first field is related to said content of said second field according to said relationships;
designate document static content on said imaging device (IDev) user interface, wherein said document static content defines content remaining constant in all instances of said dynamic document; and
compile said document format, said dynamic field structure and said document static content into a dynamic document on said IDev, wherein said dynamic document is a selectable file, which, when selected, automatically populates said data fields to create a document defined by said document format, said dynamic field structure and said document static content.
9. A non-transitory, computer-readable medium comprising instructions for instructing a processor to execute a method for imaging device dynamic document creation and organization, said instructions instructing said processor to:
receive a user identification, formatted as a markup language message, from a web service on an imaging device (IDev) to a remote computing device (RCD);
access data related to said user with said RCD;
associate dynamic document editing functions with said user data, wherein said user data is linked to user data fields identified in a dynamic field structure;
receive a dynamic document editing request as a markup language message received from said web service on said imaging device (IDev) at said remote computing device (RCD);
send document format options menu content to said IDev web service from said RCD, wherein said document format options menu content is an XML message;
receive a document format options menu selection from said IDev web service at said RCD, wherein said document format options menu selection defines a page size, a page orientation, a single-sided or double-sided page format, a margin location, a header dimension or a footer dimension and at least one of a text box parameter and a graphics box parameter and wherein said document format options menu selection is a markup language message;
send a dynamic field structure options menu to said IDev, wherein said dynamic field structure options menu is in the form of a markup language message;
receive a dynamic field structure options selection from said IDev, wherein said dynamic field structure options selection was received as user input on an IDev user interface (UI) at said IDev and wherein said dynamic field structure options selection defines a plurality of data fields comprising numerical data, graphics data and image data, wherein said dynamic field structure also defines field attributes consisting of field size, field shape, field color, field background color, field shading, field rotation and field orientation, said dynamic field structure also comprising a field relationship relating content of a first field in said plurality of data fields to content of a second field in said plurality of data fields, wherein said field relationship defines a plurality of relationships consisting of a geographical relationship, a mathematical relationship and a logical relationship wherein said content of said first field is related to said content of said second field according to said relationships;
send a static content menu to said IDev from said RCD, wherein said static content menu is in the form of a markup language message;
receive a static content menu selection from said IDev web service as a markup language message, wherein said static content menu selection defines content remaining constant in all documents created with said dynamic document; and
compile said document format options menu selection, said dynamic field structure options menu selection and said static content menu selection into a dynamic document on said RCD and wherein said dynamic document is a selectable file, which, when selected, automatically populates said data fields to create a document defined by said document format, said dynamic field structure and said document static content.
10. A non-transitory, computer-readable medium comprising instructions for instructing a processor to execute a method for imaging device dynamic document creation and organization, said instructions instructing said processor to:
identify a user at an imaging device (IDev), based on user input at an IDev user interface (UI);
send a user identification, formatted as a markup language message, from a web service on said IDev to a remote computing device (RCD);
access data related to said user with said RCD;
associate dynamic document editing functions with said user data, wherein said user data is linked to user data fields identified in a dynamic field structure;
send a document format options menu with user specific data to said IDev from said remote computing device (RCD), wherein said document format options menu is a markup language message;
display said document format options menu on said IDev;
accept a document format user selection of said document format options on said IDev, wherein said document format user selection defines at least one of a page size, a page orientation, a single-sided or double-sided page format, a margin location, a header dimension or a footer dimension and at least one of a text box parameter and a graphic box parameter;
send said user selection to said RCD;
send a dynamic field structure options menu with user specific data to said IDev from said RCD;
display said dynamic field structure options menu on said IDev;
accept a dynamic field structure user selection of said dynamic field structure options on said IDev, wherein said dynamic field structure user selection defines a plurality of data fields comprising numerical data, graphics data and image data, wherein said dynamic field structure also defines field attributes consisting of field size, field shape, field color, field background color, field shading, field rotation and field orientation, said dynamic field structure also comprising a field relationship relating content of a first field in said plurality of data fields to content of another field in said plurality of data fields, wherein said field relationship defines a plurality of relationships consisting of a geographical relationship, a mathematical relationship and a logical relationship;
send said user selection to said RCD;
send a static content menu with user specific data to said IDev from said RCD;
display said static content menu on said IDev;
accept a static content user selection of said static content menu on said IDev, wherein said static content user selection defines content remaining constant in all documents created with said dynamic document;
send said user selection to said RCD; and
compile said document format user selection, said dynamic field structure user selection and said static content user selection into a dynamic document on said RCD, wherein said dynamic document is a selectable file, which, when selected, automatically populates said data fields to create a document defined by said document format user selection, said dynamic field structure user selection and said document static content user selection if said selectable file is selected by an entity authenticated as said user.
Description
RELATED REFERENCES

This application is a continuation-in-part of U.S. patent application Ser. No. 10/962,248, entitled “Methods and Systems for Imaging Device Remote Application Interaction,” filed on Oct. 8, 2004; this application is also a continuation-in-part of U.S. patent application Ser. No. 10/961,793, entitled “Methods and Systems for Imaging Device Remote Form Management,” filed on Oct. 8, 2004; this application is also a continuation-in-part of U.S. patent application Ser. No. 10/961,911, entitled “Methods and Systems for Imaging Device Remote Location Functions,” filed on Oct. 8, 2004; this application is also a continuation-in-part of U.S. patent application Ser. No. 10/961,594, entitled “Methods and Systems for Imaging Device Remote document Management,” filed on Oct. 8, 2004; and this application is also a continuation-in-part of U.S. patent application Ser. No. 10/962,103, entitled “Methods and Systems for Imaging Device Document Translation,” filed on Oct. 8, 2004; this application also claims the benefit of U.S. Provisional Patent Application No. 60/704,066, entitled “Methods and Systems for Imaging Device Applications,” filed Jul. 28, 2005.

FIELD OF THE INVENTION

Embodiments of the present invention comprise methods and systems for imaging device dynamic document creation and organization.

BACKGROUND OF THE INVENTION

Imaging devices such as printers, copiers, scanners and fax machines can have a wide array of functions and capabilities to fit specific uses or combinations of uses. Imaging devices often take the form of a multi-function peripheral device (MFP) that combines the functions of two or more of the traditionally separated imaging devices. An MFP may combine any number of imaging devices, but typically comprises the functions of a printer, scanner, copier and fax machine.

Some imaging devices may comprise computing resources for data storage and processing such as processors, hard disk drives, memory and other devices. As imaging devices add more features and functions, they become more costly and complex.

More complex imaging devices and MFPs may comprise network connectivity to provide communication with other computing devices, such as personal computers, other imaging devices, network servers and other apparatus. This connectivity allows the imaging device to utilize off-board resources that are available on a connected network.

Imaging devices typically have a user input panel with an array of buttons, knobs and other user input devices. Some devices also have a display panel, which can be for display only or can be a touch panel display that enables user input directly on the display.

Devices with touch panel displays or displays with buttons arranged in cooperation with the display can display menu data that may be selected by user input. This menu data is typically driven by an on-board server module within the imaging device.

BRIEF SUMMARY OF THE INVENTION

Embodiments of the present invention comprise systems and methods for creating, organizing and editing dynamic documents through the use of an imaging device user interface.

Embodiments of the present invention comprise systems, methods and devices for interacting with a remote computing device from an imaging device. These embodiments comprise remote computing devices configured to communicate with imaging devices, imaging devices configured to communicate with remote computing devices and systems comprising various combinations of remote computing devices in communication with imaging devices.

The foregoing and other objectives, features, and advantages of the invention will be more readily understood upon consideration of the following detailed description of the invention taken in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE SEVERAL DRAWINGS

FIG. 1 is a diagram of an embodiment of the present invention comprising an imaging device in connection with a remote computing device;

FIG. 2 is an image of an exemplary user interface for an imaging device;

FIG. 3 shows an exemplary imaging device;

FIG. 4 is a chart depicting steps of an imaging device method;

FIG. 5 is a chart depicting steps of an imaging device method using a markup language;

FIG. 6 shows an exemplary remote computing device embodiment;

FIG. 7 is a diagram showing components of an exemplary remote computing device;

FIG. 8 is a chart showing steps of a remote computing device method;

FIG. 9 is a chart showing steps of a remote computing device method using a markup language;

FIG. 10 is a diagram showing a system comprising multiple imaging devices in connection with a remote computing device;

FIG. 11 is a chart showing steps of a method comprising RCD processing of user input data;

FIG. 12 is a diagram showing components of some embodiments comprising linked resources;

FIG. 13 is a chart showing steps of an embodiment comprising a form building application at an imaging device;

FIG. 14 is a chart showing steps of an embodiment comprising a form building application on a remote computing device;

FIG. 15 is a chart showing steps of an embodiment comprising designating document format and content at an imaging device;

FIG. 16 is a chart showing steps of an embodiment comprising menus with user-specific data;

FIG. 17 is a chart showing steps of an embodiment comprising storing a dynamic document on a remote computing device;

FIG. 18 is a chart showing steps of an embodiment comprising sending and displaying options menus at an imaging device.

FIG. 19 is a chart showing steps of an embodiment comprising displaying user-specific options menus on an imaging device.

FIG. 20 is a chart showing steps of an embodiment comprising saving a dynamic document on a remote computing device.

FIG. 21 is a chart showing steps of an embodiment comprising compiling user selections into a dynamic document on a remote computing device.

FIG. 22 is a chart showing steps of an embodiment comprising compiling user-specific menu selections into a dynamic document on a remote computing device.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Embodiments of the present invention will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The figures listed above are expressly incorporated as part of this detailed description.

It will be readily understood that the components of the present invention, as generally described and illustrated in the figures herein, could be arranged and designed in a wide variety of different configurations. Thus, the following more detailed description of the embodiments of the methods and systems of the present invention is not intended to limit the scope of the invention, but is merely representative of the presently preferred embodiments of the invention.

Elements of embodiments of the present invention may be embodied in hardware, firmware and/or software. While exemplary embodiments revealed herein may describe only one of these forms, it is to be understood that one skilled in the art would be able to effectuate these elements in any of these forms while remaining within the scope of the present invention.

Embodiments of the present invention comprise interfaces and architecture that integrate imaging devices with remote computing device applications and environments to provide solutions that may not be possible with an imaging device alone. Some embodiments comprise an infrastructure and set of interfaces that allow applications on a network to programmatically control imaging device functions and interact with a user through an imaging device input panel. Software functions that are not practical within the imaging device can be performed on a server yet remain accessible from the imaging device.

For the purposes of this specification and claims, an imaging device (IDev) may be described as a device that performs an imaging function. Imaging functions comprise scanning, printing, copying, image transmission (sending and receiving), image conversion and other functions. Exemplary imaging devices comprise printers, copiers, facsimile machines, scanners, computing devices that transmit, convert or process images and other devices. An IDev may also perform multiple imaging functions. For example, and not by way of limitation, a multi-function peripheral device (MFP), which typically has the capability to perform a plurality of functions comprising a printer, scanner, copier and/or a facsimile machine or image transmitter/receiver, is a type of imaging device. Other MFP imaging devices may comprise other combinations of functions and still qualify as an IDev.

For the purposes of this specification and claims, a remote computing device (RCD) is a device capable of processing data and communicating with other devices through a communications link. An RCD is a remote device because it requires a communications link, such as a network connection, a telephone line, a serial cable or some other wired or wireless link to communicate with other devices such as an imaging device. Some exemplary RCDs are network servers, networked computers and other processing and storage devices that have communications links.

Some embodiments of the present invention may be described with reference to FIGS. 1 & 2. These embodiments comprise an imaging device (IDev) 4 that may be a multi-function peripheral device (MFP) or a single function device. The imaging device 4 further comprises a user interface (UI) panel 2, which may comprise input buttons 14 and a display device 13 or may comprise a touch panel system with or without buttons 14. User input and display may also be performed through a separate UI device 8, which may be connected to the imaging device 4 by a communication link 12, such as a USB connection, a network cable, a wireless connection or some other communications link. UI device 8 may comprise an input device, such as a keyboard or buttons as well as a display device, which may also be a touch screen panel. UI device 8 may also comprise an interface for transfer of instructions that are input to the device 8 from a remote input device. This form of UI device 8 may comprise memory sticks, USB memory cards and other storage devices that may be configured to store input for transfer to an imaging device.

These embodiments further comprise a remote computing device (RCD) 6 that is linked to the imaging device 4 via a communications link 10, such as a network connection. This network connection may be a typical wired connection or a wireless link.

Embodiments of the present invention may provide menu data from the RCD 6 to the imaging device UI panel 2 or remote panel 8 via the network connection 10. Once this menu data is fed to the imaging device 4, a UI panel 2, 8 on the imaging device 4 may be used to interact with applications that run on the remote computing device 6. User input received from UI panels 2, 8 may be returned directly to the remote computing device 6.

A Web Service is a software application identified by a Uniform Resource Identifier (URI), whose interfaces and bindings can be defined, described and discovered as Extensible Markup Language (XML) artifacts, and which supports direct interactions with other software applications using XML-based messages exchanged via Internet-based protocols.

An application on the remote computing device 6 may use one or more Web Services to control various features in the imaging device 4, such as enabling, disabling or setting device values or controlling device functions.
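
By way of illustration only, the following Python sketch shows one way a remote computing device application might deliver an XML-formatted control message to a web service exposed by the imaging device. The endpoint address, message schema and element names are hypothetical and are not defined by this disclosure.

    # Illustrative sketch only; the endpoint URL and XML schema are hypothetical.
    import urllib.request

    IDEV_ENDPOINT = "http://idev.example.local/webservice/control"  # assumed address

    def set_device_value(setting: str, value: str) -> bytes:
        """Send an XML-formatted control message to the imaging device web service."""
        message = (
            "<?xml version='1.0' encoding='UTF-8'?>"
            "<ControlRequest>"
            f"<Setting name='{setting}'>{value}</Setting>"
            "</ControlRequest>"
        )
        request = urllib.request.Request(
            IDEV_ENDPOINT,
            data=message.encode("utf-8"),
            headers={"Content-Type": "text/xml"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.read()  # device reply, also a markup language message

    # Example: disable the copy function for the current session.
    # set_device_value("CopyFunctionEnabled", "false")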

Embodiments of the present invention allow network applications running on remote computing devices to interact with the user of the imaging device through the imaging device I/O panel. These embodiments allow imaging device user interface (UI) control (i.e., touch panel, button/display) by applications. Some embodiments may also integrate custom display screens or menus with the native imaging device UI. Embodiments may hand off control of imaging device functions between standard operation modes performed on the imaging device in response to user input to an imaging device UI and open systems modes that utilize network resources, such as applications on RCDs, through user input at the imaging device UI.

Embodiments of the present invention comprise network-based applications that have full control over the imaging device UI to display text and graphics in any format. In these embodiments, the application can programmatically display buttons, textboxes, graphics, etc. in any layout desired.

In some embodiments, the UI layout is easy to program using a standard language, such as a markup language. These languages comprise Hypertext Markup Language (HTML), Extensible Markup Language (XML), Wireless Markup Language (WML), Extensible Hypertext Markup Language (XHTML) and other languages.
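
As a purely hypothetical illustration of programming a UI layout in a markup language, a network application might compose a panel screen as a small markup document. The element and attribute names below are invented for the example and do not reflect any particular device schema.

    # Hypothetical sketch: element names (Screen, Button, TextBox) are illustrative only.
    from xml.sax.saxutils import escape

    def build_menu_screen(title: str, buttons: list) -> str:
        """Compose a simple markup description of a panel screen with labeled buttons."""
        parts = [f"<Screen title='{escape(title)}'>"]
        for index, label in enumerate(buttons):
            parts.append(f"  <Button id='btn{index}' label='{escape(label)}'/>")
        parts.append("  <TextBox id='status' text=''/>")
        parts.append("</Screen>")
        return "\n".join(parts)

    # Example: a screen offering two custom solutions to the walk-up user.
    # print(build_menu_screen("Select Solution", ["Scan to E-mail", "Rules-Based Print"]))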

In some embodiments of the present invention, a remote computing device application or server application is able to request a keyboard UI to be displayed on the imaging device display 13, 8. In some embodiments, this functionality is available on the imaging device and does not need to be recreated by remote computing device applications. In some embodiments, the remote computing device may define the keyboard prompt and default values. These embodiments may comprise a remote computing device that is able to rename imaging device UI buttons, such as the OK and Cancel buttons, as well as define additional buttons.

In some embodiments, menu templates may be served to the imaging device UI by the imaging device 4 itself or by a remote computing device 6.

External Authorization Application

Some embodiments of the present invention may comprise a remote computing device application that is registered as the External Authorization server. The External Authorization application may control access to the imaging device and may have top-level control of the UI. UI control may be given to this application in the same manner that control is given to an internal auditor.

In these embodiments, when an imaging device system boots, it checks to see if an External Authorization application is registered. If so, the imaging device is placed in disabled mode and the application is contacted to take control of the UI. If the External Authorization server is not available, an error message may be displayed and the device may remain disabled. The imaging device may periodically try to contact the External Authorization server until it is available. Table 1 below describes what entity has control of the UI, in an exemplary embodiment, when the device is in a disabled state.

TABLE 1
UI Control in Disabled State

Button Press       UI Control                                   Indicator Lights
Device boots       External Application                         None
Document Filing    External Application                         None
Image Send         External Application                         None
Copy               External Application                         None
Job Status         Device - standard Job Status screens         Job Status
Custom Settings    Device - standard Custom Settings screens    N/A
OS Mode            Not available when device is disabled
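
The boot-time behavior described above might be approximated by the following sketch. The registry lookup, the contact call and the retry interval are placeholders supplied as arguments; none of them are APIs defined by this disclosure.

    # Illustrative sketch of the boot-time External Authorization check and retry loop.
    # The callables are placeholders for device behavior, not defined interfaces.
    import time

    def boot_sequence(get_auth_server, contact_for_ui, enable_native, disable_device,
                      show_error, retry_seconds: int = 30) -> None:
        auth_server = get_auth_server()          # returns None if no server is registered
        if auth_server is None:
            enable_native()                      # normal stand-alone operation
            return
        disable_device()                         # device starts in disabled mode
        while not contact_for_ui(auth_server):   # server takes control of the UI on success
            show_error("External Authorization server unavailable; device disabled")
            time.sleep(retry_seconds)            # periodically retry the server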

Remote Computing Device Applications

In embodiments of the present invention, access to the custom UI panels of imaging devices may vary from application to application. Some solutions, such as Document Management integration, may wish to leverage the native Image Send screens, but display some custom UI's to gather additional information about a scan job. Other solutions, like custom printing applications, may be accessed from a separate mode than the native functions.

To accommodate the diverse needs of these solutions applications, embodiments may support multiple integration points for UI control. These integration points are based on a user action (a "trigger") for which applications may register. In some embodiments, applications may be registered with target devices so that the device knows to contact "remote computing device B" for instructions when "trigger A" occurs on the front panel. In exemplary embodiments, applications may be integrated with an imaging device at any of several "trigger" points.

Remote computing devices may be registered to a specific function and contacted when that function's hardware key is pressed (e.g. Image Send) on the imaging device UI. Any UI information provided by the remote computing device may be displayed instead of the standard function screens native to the imaging device. This trigger may be used for applications that wish to replace the existing functions with completely custom UI's, such as an alternative scan solution or a specialized display, such as a “Section 508” compatible screen or other specialized-need interface that may have large buttons or other accommodations.

In some embodiments, each function on the imaging device may have a menu on the touch screen for which remote computing devices, such as servers, can register. This enables solutions applications to provide custom content and still use some of the standard functionality provided by the imaging device. When a button assigned to a custom application is selected, a menu will be displayed with the solutions registered to that function. Users may select the desired solution and the remote computing device will be contacted for instructions.

In some embodiments, a stand-alone RCD mode that provides remote computing device application access can be accessed from the job queue portion of the UI that is displayed on every screen. This trigger point may be used for applications that do not fit within one of the standard device functions, such as custom printing solutions on an imaging device. When the RCD menu is selected, a menu will be displayed with the solutions applications registered to the generic RCD mode. Users will select the desired solution and the remote computing device will be contacted for instructions.
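
One way to model the trigger registration described above is a simple mapping from trigger points to the remote computing device applications registered for them. The trigger names and URL in the sketch below are illustrative assumptions, not defined identifiers.

    # Hypothetical registration table mapping UI trigger points to RCD application URLs.
    trigger_registry = {
        "IMAGE_SEND_KEY": [],     # replaces the native Image Send screens
        "COPY_CUSTOM_MENU": [],   # custom menu entry within the Copy function
        "RCD_MODE": [],           # stand-alone RCD mode reached from the job queue area
    }

    def register_application(trigger: str, rcd_url: str) -> None:
        """Register a remote computing device application for a UI trigger point."""
        trigger_registry.setdefault(trigger, []).append(rcd_url)

    def solutions_for_trigger(trigger: str) -> list:
        """Return the registered solutions; an empty list means show the native screens."""
        return list(trigger_registry.get(trigger, []))

    # Example: a document management server registers for the Image Send hardware key.
    # register_application("IMAGE_SEND_KEY", "http://docmgmt.example.local/idev-ui")
    # solutions_for_trigger("IMAGE_SEND_KEY")   # one entry: contact it; several: show a menu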

Hardware Key Interaction

In some embodiments of the present invention, when an imaging device is enabled, additional hardware keys may be used to manage the device. Hardware key assignments for an exemplary embodiment are shown in table 2.

TABLE 2
Exemplary Hardware Key Assignments

Button Press                Standard IDev Mode                  RCD Mode
Mode keys (Copy, Doc        Clear current job settings,         Clear current job settings,
Filing, Image Send) and     move to target screen               move to target screen
Custom Settings key
Job Status key              Move to Job Status, maintain        Move to Job Status, maintain
                            current settings & UI location      current settings & UI location
Clear (C)                   Clears settings                     Sends clear event to external
                                                                application
Clear All (CA)              Clears settings, cancels job,       Cancels job and returns to
                            and returns to default IDev         default IDev screen (notification
                            screen                              sent to external application)
                                                                **When External Authorization is
                                                                controlling the UI, only
                                                                notification is sent
Start                       Initiates scan function             Initiates scan function
Number keys                 Input for copy count or fax         Not used
                            numbers
*                           Logs user out (disable device       Logs user out (disable device
                            and contact External                and contact External
                            Authorization for screens)          Authorization for screens)

In some embodiments, in addition to the * key for logout, a timeout period may be implemented. Some embodiments also comprise an auto clear setting that can be configured for a given period of time, such as 10 to 240 seconds (or disabled). In these embodiments, when there is no activity for the time configured in auto clear, the device may automatically return to disabled mode and attempt to contact a remote computing device to retake control of the UI.
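
A minimal sketch of the auto clear behavior follows, assuming a configurable inactivity window and a caller-supplied timeout action (for example, disabling the device and contacting the remote computing device); the class and its names are illustrative only.

    # Illustrative inactivity timer for the auto clear setting (10-240 seconds, or disabled).
    import time
    from typing import Callable, Optional

    class AutoClear:
        def __init__(self, timeout_seconds: Optional[int], on_timeout: Callable[[], None]):
            self.timeout = timeout_seconds            # None means auto clear is disabled
            self.on_timeout = on_timeout              # e.g. disable device, contact the RCD
            self.last_activity = time.monotonic()

        def record_activity(self) -> None:
            self.last_activity = time.monotonic()

        def poll(self) -> None:
            """Call periodically; fires the timeout action after the inactivity window."""
            if self.timeout is None:
                return
            if time.monotonic() - self.last_activity >= self.timeout:
                self.on_timeout()                     # return to disabled mode, retake UI
                self.record_activity()                # avoid firing repeatedly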

Error & Jam Notifications

Depending on a particular solution, a remote computing device application may have full or only partial control of the imaging device UI and a particular imaging job. In some embodiments, partial control may include cases where a remote computing device is monitoring clicks, but native modes are responsible for the UI interaction and controlling the job. Partial control may also include cases where the remote computing device application is integrated with a native mode (UI trigger=function custom menu). In these embodiments, the imaging device may handle all error and jam notifications with only a notification sent to the relevant remote computing device application.

For some embodiments, in cases where the remote computing device application has full control over the UI and the job, error and jam notifications may be handled differently depending on the type of error. For recoverable errors, a notification may be sent to the remote computing device application and the application may be responsible for displaying messages and resolving the error. For non-recoverable errors, the imaging device and RCD mode may interact to gracefully handle the error condition (e.g. provide user with instructions for clearing jam).
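
The notification routing described above could be expressed as a small decision function. The control levels, the recoverable/non-recoverable distinction as an input flag, and the returned dictionary are assumptions made for illustration.

    # Hypothetical routing of error and jam events based on the level of RCD control.
    def route_error(recoverable: bool, control: str) -> dict:
        """control is 'partial' or 'full'; returns who displays the UI and who is notified."""
        if control == "partial":
            # Native modes own the UI and the job; the RCD only receives a notification.
            return {"ui": "device", "notify_rcd": True, "rcd_resolves": False}
        if recoverable:
            # Full control: the RCD application displays messages and resolves the error.
            return {"ui": "rcd", "notify_rcd": True, "rcd_resolves": True}
        # Non-recoverable (e.g. paper jam): device and RCD mode cooperate on recovery.
        return {"ui": "device_and_rcd", "notify_rcd": True, "rcd_resolves": False}

    # Example: route_error(recoverable=False, control="full")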

Control Handoffs

In some embodiments, at different points throughout an imaging job, several applications may need control over an imaging device including, but not limited to, an External Authorization application, a standard RCD application, an imaging device native mode and other applications. The following section describes, for an exemplary embodiment, the various steps in an exemplary job, the entities that may have control during each step, and what type of control may be allowed.

Step 1: User provides credentials to access the device at the device UI. This step may be controlled by a remote computing device, such as an External Authorization application or by Internal Accounting (native mode) in the imaging device itself. At the end of this step, the device is enabled. The External Authorization application may also specify default parameters or disable specific job parameters (e.g. default file format is PDF, but user may change; color mode is set to B/W and user may not change).

Step 2: User sets parameters for the job using one of the native imaging device modes or a standard RCD application. At the end of this step the user makes an input to initiate the job. When the input is made, an optional notification may be sent to the standard RCD application, which can then change job parameters if desired. An e-mail application is one example of an application that may request notification when the user input is made. A user may use native Image Send screens or other input to select scan options and choose e-mail recipients. A user may then select a custom application button and choose the scan-to-e-mail option from the menu. The e-mail application may then display custom screens for the user to set permissions for the file. Once a user places the original document(s) on the scanner and initiates the process, the e-mail application may capture the destination parameters set by the user and change the target destination to the e-mail application FTP server. The e-mail application may then receive the file, apply the appropriate permissions, and send to the e-mail recipients selected by the user. A remote computing device application may also want to retake control of the UI at this point, if, as in some embodiments, the application generates thumbnails of the scanned images and displays them to the user for verification.

Step 3: Once the job is initiated, the imaging device is responsible for scanning or RIPing the job and spooling it to the HDD. If the imaging device is configured to authorize jobs with an external authorization application, it may send a click report to the application and wait for instructions. The external authorization application may enable the job for sending/printing, cancel the job, or change job parameters (and then enable). As an example, a rules-based printing application may wish to change job parameters after it receives a click report. Some rules-based printing applications support rules-based printing and scanning that can limit what each user is allowed to do based on the time of day, the destination, or many other parameters. For example, only users in the marketing group may be able to scan high-quality color images. If a user from another group selects color and 600 dpi, a rules-based application may change the parameters to color and 200 dpi. At the end of this step the job should either be authorized or canceled.

Step 4: In some embodiments, this may be an optional step, where the standard RCD application in step 2 may have specified the destination as a HDD for temporary storage. This step may also be used, in some embodiments, by a Java application running on the imaging device. For example, a government office may have a custom encryption application running on the device that takes the scanned document, encrypts it, and then requests the imaging device to send it to the target destination selected by the user in step 2. In some embodiments, it may be beneficial to send a notification to the external authorization application after this step—because the imaging device does not know how long the file will be on the HDD or what the application is going to do with it—and after the send/print step.

Step 5: In the final step, the file is actually output. In typical embodiments, the file is either sent over the network or printed locally. At the end of this step, a notification that the job was successfully completed should be sent to the external authorization application and optionally, to the standard RCD application.
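
The five steps above can be read as a simple job pipeline. The sketch below strings them together; every helper function it calls is a hypothetical placeholder for device or RCD behavior, so this is an illustration of the control handoffs rather than a normative implementation.

    # Illustrative pipeline for the control handoffs described in Steps 1-5.
    # All helper functions named below are hypothetical placeholders.
    def run_job():
        user = authenticate_user()                      # Step 1: External Authorization or
        if user is None:                                #         internal accounting
            return
        enable_device(user)

        job = collect_job_parameters(user)              # Step 2: native mode or RCD screens
        notify_standard_rcd(job, event="job_initiated") # optional notification

        spool_to_hdd(job)                               # Step 3: scan or RIP, then spool
        decision = send_click_report_and_wait(job)      # external authorization of the job
        if decision.action == "cancel":
            delete_spooled_job(job)
            return
        apply_parameter_changes(job, decision.changes)  # e.g. force 200 dpi for some users

        post_process(job)                               # Step 4 (optional): e.g. encryption

        output_job(job)                                 # Step 5: send over network or print
        notify_completion(job)                          # completion report to the RCD(s)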

Device Control and Management API's

The API's may be used to allow a remote computing device application to control access to an imaging device for vend applications and to manage the device from a remote location.

Device Control and Vend API

In some embodiments of the present invention, a Device Control and Vend API allows applications to enable and disable access to the device and track click counts. The Device Control and Vend API may provide an RCD with the following controls:

Enable/disable device or function—this may allow an RCD to enable or disable access to the device as a whole or by function to enforce individual user privileges. In some exemplary embodiments, the functions listed in Table 3 may be selectively enabled or disabled by an application.

TABLE 3
Device Functions

Enable/Disable     Description
Copy               Copy function (Copy button)
Image Send         Scan and fax function, plus send from Doc Filing (Image Send button)
Document Filing    All access to Document Filing functions (Document Filing button)
Print              Network prints, pull print from front panel, and print from Document
                   Filing (No button control)

Report clicks used—at the end of a successful job, the clicks used may be reported back to an RCD, including the job and page characteristics listed in Table 4:

TABLE 4
Job and Page Characteristics

Item                   Copy   Print   Fax Send   PC-Fax   E-mail/FTP   Broadcast   Scan to HD
JOB Characteristics
Job Mode               Yes    Yes     Yes        Yes      Yes          Yes         Yes
Broadcast Manage No.   No     No      Yes        Yes      Yes          Yes         No
User Name              Yes    Yes     Yes        Yes      Yes          Yes         Yes
Address                No     No      Yes        Yes      Yes          #           No
Start Time             Yes    Yes     Yes        Yes      Yes          Yes         Yes
End Time               Yes    Yes     Yes        Yes      Yes          Yes         Yes
Total Page             Yes    Yes     Yes        Yes      Yes          Yes         Yes
Result                 Yes    Yes     Yes        Yes      Yes          Yes         Yes
Error Cause            No     No      Yes        Yes      Yes          Yes         No
Doc Filing             Yes    Yes     Yes        Yes      Yes          Yes         Yes
Save Mode              *1     *1      *1         *1       *1           *1          *1
File Name              *1     Yes     *1         Yes      Yes          *1          Yes
File Size              Yes    Yes     *1         *1       *1           *1          Yes
Resolution             Yes    Yes     Yes        Yes      Yes          Yes         Yes
Special                Yes    Yes     Yes        No       Yes          Yes         Yes
Finishing              Yes    Yes     No         No       No           No          No
File Format            No     No      No         No       Yes          Yes         No
Compression            No     No      No         No       Yes          Yes         No
PAGE Characteristics
Copy                   Yes    Yes     Yes        Yes      Yes          #           Yes
Paper Size             Yes    Yes     Yes        Yes      Yes          Yes         Yes
Simplex/duplex         Yes    Yes     Yes        Yes      Yes          Yes         Yes
Paper Type             Yes    Yes     Yes        Yes      No           No          Yes
Page                   Yes    Yes     Yes        Yes      Yes          Yes         Yes

*1 - Yes when Document Filing is used

Debit mode—in these embodiments, when an application enables the device it may specify if the current job requires authorization. If so, the job will be spooled to memory and click information (e.g., as defined in Table 4) will be sent to an RCD. An RCD will then notify the device if the job should be deleted or output/sent. At this point, the application also has the option of changing job parameters. If the application does not require authorization, the job will continue as normal and a click report will be sent at the end of the job.
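
From the remote computing device side, the debit mode exchange might look like the following sketch. The account table, click-report field names and response values are invented for the example and only loosely mirror the characteristics in Table 4.

    # Hypothetical RCD-side handler for a debit mode click report.
    ACCOUNTS = {"jsmith": {"remaining_clicks": 500, "color_allowed": False}}

    def authorize_job(click_report: dict) -> dict:
        """Decide whether a spooled job may be output, based on the reported characteristics."""
        account = ACCOUNTS.get(click_report.get("user_name"))
        pages = int(click_report.get("total_page", 0))
        wants_color = click_report.get("color_mode") == "color"

        if account is None:
            return {"action": "delete"}                      # cancel the spooled job
        if pages > account["remaining_clicks"]:
            return {"action": "delete"}
        account["remaining_clicks"] -= pages                 # debit the account
        if wants_color and not account["color_allowed"]:
            # Change job parameters and then enable, as described above.
            return {"action": "output", "changes": {"color_mode": "BW"}}
        return {"action": "output"}                          # enable the job for sending/printing

    # Example: authorize_job({"user_name": "jsmith", "total_page": 3, "color_mode": "color"})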

Print job accounting—in these embodiments, an RCD may wish to monitor print jobs along with walk-up functions. For print job accounting, an IDev may monitor all incoming print jobs and send accounting data in the PJL header to an RCD for verification before printing the job. The RCD will evaluate the accounting data (or lack thereof) and inform the IDev to continue with or cancel the job.

Report on unidentified jobs—in these embodiments, an RCD may also wish to monitor print jobs that it cannot associate to a specific user, such as device reports and incoming fax jobs. The RCD can register to receive click counts for all unidentified jobs, so that it may bill them to a general account.

Device Management API

In some embodiments of the present invention, a Device Management API allows a network application to remotely setup and manage the imaging device. In exemplary embodiments, the Device Management API may provide an RCD with the following controls:

    • Device status—an RCD may request the current status of the device. This is the same status information as reported on the embedded web pages.
    • Device configuration—an RCD can retrieve a list of installed options supported by the device.
    • Web Page settings—an RCD application can retrieve and set any of the values that are configurable on the embedded web pages.
    • Key Operator Programs—an RCD application can retrieve and set any of the values that are configurable in Key Operator Programs, including software keys.
    • Custom Settings—an RCD application can retrieve and set any of the values that are configurable in Custom Settings.
    • Job Status—an RCD application can retrieve the current job queue and history information and reprioritize or delete jobs in the queue.
    • Click counts—an RCD application can retrieve device total counts and clicks for each function by account code.
    • Data Security settings—an RCD application may retrieve the status information on the DSK (e.g. last erase) and initiate data clear functions.
    • RED data—an RCD can retrieve all data typically sent in a RED message.
    • Remote reboot—an RCD can initiate a reboot of the imaging device.

The above groupings are provided only as an exemplary embodiment detailing which settings should be included. In some embodiments, actual API's should be grouped by functional areas since there may be overlap between Key Operator settings and web page settings.
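
For illustration, a device management exchange might be wrapped as below. The operation names, endpoint address and message format are assumptions made for the example and do not correspond to a defined API.

    # Hypothetical wrapper around a few Device Management API operations.
    import urllib.request

    MANAGEMENT_ENDPOINT = "http://idev.example.local/webservice/management"  # assumed

    def management_call(operation: str, body: str = "") -> str:
        """POST an XML management request to the imaging device and return its reply."""
        message = f"<ManagementRequest op='{operation}'>{body}</ManagementRequest>"
        request = urllib.request.Request(
            MANAGEMENT_ENDPOINT, data=message.encode("utf-8"),
            headers={"Content-Type": "text/xml"}, method="POST")
        with urllib.request.urlopen(request) as response:
            return response.read().decode("utf-8")

    # Examples of the exemplary controls listed above (operation names are illustrative):
    # management_call("GetDeviceStatus")
    # management_call("SetWebPageSetting", "<Setting name='DeviceName'>Lobby MFP</Setting>")
    # management_call("RemoteReboot")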

Internal Accounting API

In some embodiments, an Internal Accounting API may allow a remote computing device application to configure internal accounting and report click counts. In some exemplary embodiments an Internal Accounting API may include:

    • Set Auditing Options—an RCD may set auditing options including which modes auditing is enabled for, “account number security”, and “cancel jobs of invalid accounts.”
    • Manage Account Codes—an RCD can add, edit, or delete account codes
    • Account Limits—an RCD application can specify a maximum number of clicks by function for individual account codes or for all account codes
    • Account Reset—an RCD application can reset the click count for an individual account or for all accounts
    • Retrieve Clicks—an RCD can retrieve the number of clicks by function for each account code
      Font and Form Management API

Some embodiments of the present invention may comprise a Font and Form Management API, which allows an RCD application to remotely download and manage fonts and forms in mass-storage. In some exemplary embodiments, a Font and Form Management API may provide a remote computing device with the following controls:

    • Mass storage control—an RCD application can retrieve mass storage status information including storage capacity, space available, and write-protect mode plus modify write-protect status.
    • Resource list—an RCD application can retrieve a list of stored fonts and forms including font or macro ID, font number, font/form name, escape sequence, and file size.
    • Download resource—an RCD application can download PCL fonts, PCL macros, and PS fonts and forms. Any special processing that is performed when a resource is downloaded via the web pages will also be performed when the resource is downloaded via Open Systems.
    • Delete resource—an RCD application can delete any resource stored in mass storage.
    • Upload resources—an RCD application can upload an individual or all resources. On devices where effective memory management is unavailable, a server application can use this function to “defrag” mass storage.
    • Font/macro ID's—an RCD application can assign or modify the ID's assigned to PCL fonts and macros.
      Firmware Management API

In some embodiments of the present invention, a Firmware Management API may allow a remote computing device or network application to remotely download and manage the imaging device firmware. In some exemplary embodiments, a Firmware Management API may provide a remote computing device (e.g., a server) with the following controls (one possible update sequence is sketched after the list below):

    • Firmware versions—an RCD application can retrieve the current firmware version numbers.
    • Service mode—an RCD application can place the MFP in service mode to lock out other jobs that would interfere with a firmware upgrade. Upon receiving a service mode request, the IDev will stop accepting incoming jobs, complete all jobs in the queue, and then notify the server that it is in service mode.
    • Update firmware—an RCD can download an updated firmware version to the device. If a reboot is necessary, the IDev will perform it automatically when download is complete.
    • Download status—the IDev will send a status notification (success/error) to an RCD after firmware download.
    • Revert to previous version—if firmware update is not successful, the application can request the IDev to revert to the previous firmware version.
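
The following sketch chains these controls into one possible update sequence. The helper functions and status value are hypothetical placeholders for the exemplary controls above, not defined interfaces.

    # Illustrative firmware update sequence built from the exemplary controls above.
    # All helper functions named below are hypothetical placeholders.
    def update_firmware(new_image: bytes, new_version: str) -> bool:
        current = get_firmware_versions()               # retrieve current version numbers
        if current.get("system") == new_version:
            return True                                 # already up to date

        enter_service_mode()        # device stops accepting jobs, drains its queue,
                                    # then notifies the server it is in service mode
        status = download_firmware(new_image)           # device reboots automatically if needed
        if status != "success":                         # status notification after download
            revert_to_previous_version()                # fall back if the update failed
            exit_service_mode()
            return False
        exit_service_mode()
        return True
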
Device Function API's

In some embodiments of the present invention, device function API's allow a remote computing device application to use existing imaging device functionality to provide new custom solutions.

Image Send API

In some embodiments, an Image Send API may provide the remote computing device application with the following controls:

    • Image Send Parameters—a remote computing device application can get and set values for the following scan and fax parameters:
      • COLOR OR B/W
      • IMAGE MODE—TEXT, TEXT/PHOTO, PHOTO; EXPOSURE LEVEL
      • RESOLUTION
      • FILE FORMAT—FILE TYPE, COMPRESSION, AND PAGES PER FILE
      • ORIGINAL—ORIGINAL SIZE, SIMPLEX/DUPLEX, ROTATE, AND JOB BUILD
      • FILENAME
      • SUBJECT
      • MESSAGE
      • SENDER
      • SCHEDULE SEND TIME
      • PAGE DIVISION (BOOK SCANNING)
      • COVER PAGE
      • TRANSMISSION MESSAGE (CONFIDENTIAL, URGENT, ETC.)
      • THIN PAPER SCANNING
      • DESTINATION
      • DOCUMENT FILING
    • Initiate Scan—the remote computing device application can initiate the scan function (same as user pressing start button).

In some embodiments, a remote computing device can change the default values on the imaging device or the values for the current job. For the current job, the remote computing device may also specify if scan parameters may be modified by the user or not. If one remote computing device application (e.g. Access Control) specifies that a parameter cannot be changed and then a second application (e.g. Document Management) tries to set the parameter, a notification may be sent to the second application and the setting will not be changed.
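
The parameter locking rule described above might be modeled as follows; the class, the lock table and the queued notifications are assumptions made for the example.

    # Hypothetical model of per-job Image Send parameters with application-level locks.
    class JobParameters:
        def __init__(self):
            self.values = {}         # e.g. {"COLOR": "BW", "RESOLUTION": "200"}
            self.locked_by = {}      # parameter name -> application that fixed it
            self.notifications = []  # messages sent back to applications (illustrative)

        def set_parameter(self, app: str, name: str, value: str, lock: bool = False) -> bool:
            owner = self.locked_by.get(name)
            if owner is not None and owner != app:
                # A second application tried to set a locked parameter: notify, do not change.
                self.notifications.append((app, f"'{name}' is locked by {owner}"))
                return False
            self.values[name] = value
            if lock:
                self.locked_by[name] = app   # e.g. Access Control fixes COLOR to BW
            return True

    # Example: Access Control locks color mode; Document Management's later attempt fails.
    # job = JobParameters()
    # job.set_parameter("AccessControl", "COLOR", "BW", lock=True)
    # job.set_parameter("DocumentManagement", "COLOR", "COLOR")   # returns False, notification queued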

Print API

In some embodiments, print jobs may be submitted by remote computing device applications using standard printing channels. In some exemplary embodiments, a Print API may provide a remote computing device with the following additional control:

    • PJL sniffing—an RCD application can register with the IDev to be contacted for instructions when a specific PJL command is found in a print job. The RCD can then instruct the IDev to replace the command, cancel the job, or continue printing. This interface may be used in applications like accounting and other-brand compatibility.
      Copy API

In some embodiments of the present invention, a Copy API may provide a remote computing device with the following exemplary controls:

    • Copy Parameters—an RCD application can get and set values for the following copy parameters:
      • COLOR OR B/W
      • EXPOSURE—TEXT, TEXT/PHOTO, PHOTO, SUPER PHOTO; EXPOSURE LEVEL
      • PAPER SELECT (BY TRAY)
      • COPY RATIO
      • 2-SIDED COPY—1TO1, 1TO2, 2TO2, 2TO1; BINDING EDGE
      • OUTPUT—OUTPUT TRAY, SORT, STAPLE, GROUP, OFFSET
      • ORIGINAL SIZE
      • SPECIAL FUNCTIONS—MARGIN SHIFT, ERASE, PAMPHLET, ETC.
      • DOCUMENT FILING
    • Initiate Copy—an RCD application can initiate the copy function (same as user pressing start button).

In some embodiments, a remote computing device can change the default values on the imaging device or the values for the current job. For the current job, the remote computing device may also specify whether copy parameters may be modified by the user.

Document Filing API

In some embodiments of the present invention, a Document Filing API may provide a remote computing device with the following exemplary controls:

    • Backup/restore—the remote computing device application can import and export a batch file with all Document Filing data. In some embodiments, this package will be in a proprietary format since it contains documents that are password-protected and should not be accessed individually—this is typically for restore in case of failure or cloning to other devices.
    • File/folder list—the remote computing device application can retrieve, modify, and create new files and folders to be stored on the IDev (also covered in device management).
    • Download file—the remote computing device can download a new file to the Document Filing system and specify folder, filename, username, and password.
    • User list—the remote computing device application can retrieve, modify, and create new users to be stored on the IDev (also covered in device management).
    • HDD Status—the remote computing device application can retrieve the current HDD status including the % allocated to the main folder, quick folder, and custom folders and the % remaining.
    • Doc Filing Parameters—the remote computing device application can get and set values for storing a file to Doc Filing including:
      • EXPOSURE
      • RESOLUTION
      • ORIGINAL—SIZE, SIMPLEX/DUPLEX
      • FILE INFORMATION—USERNAME, FILENAME, FOLDER, CONFIDENTIAL, PASSWORD
      • SPECIAL MODES—ERASE, DUAL PAGE COPY, 2IN1, JOB BUILD, CARD SHOT
    • Initiate Print—the remote computing device application can select a stored file and initiate a print including the following parameters:
      • PAPER SIZE/SOURCE
      • OUTPUT—SORT/GROUP, OUTPUT TRAY, STAPLE, PUNCH, OFFSET
      • SIMPLEX/DUPLEX (TABLET/BOOKLET)
      • TANDEM PRINT
      • NUMBER OF COPIES
      • DELETE OR STORE AFTER PRINTING
    • Initiate Send—the remote computing device application can select a stored file and initiate a send including the following parameters:
      • RESOLUTION
      • FILE FORMAT
      • DESTINATION
      • TIMER
      • SENDER
      • FILENAME
      • SUBJECT
      • MESSAGE
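
As a hedged illustration of an Initiate Send request, the fragment below builds a markup-language message from a stored-file selection and a few of the parameters listed above; the element names are assumptions, not part of this disclosure.

    # Hypothetical sketch of an "Initiate Send" request for a stored file.
    from xml.etree.ElementTree import Element, SubElement, tostring

    def build_initiate_send(folder, filename, params):
        root = Element("InitiateSend")
        stored = SubElement(root, "StoredFile")
        SubElement(stored, "Folder").text = folder
        SubElement(stored, "Filename").text = filename
        for name, value in params.items():
            SubElement(root, name).text = value
        return tostring(root, encoding="unicode")

    print(build_initiate_send(
        "main", "invoice_0042.pdf",
        {"Resolution": "300dpi", "FileFormat": "PDF",
         "Destination": "ap@example.com", "Subject": "Invoice"}))
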
Security

Allowing external applications to control an imaging device opens up the imaging device to new security vulnerabilities. In embodiments of the present invention that provide security measures, the following exemplary concerns may be addressed by the remote computing device interface.

Access to remote computing device interfaces may be limited to valid applications. Embodiments provide extensive access and control of the imaging device, which poses a significant security risk. The interface of these embodiments may be protected from access by attackers, while maintaining ease of setup and use for valid solutions.

Confidential data (user credentials and job data) may be protected during network transfer. User credentials and job data may be secured during network transfer to ensure that this data cannot be stolen, that an intruder cannot monitor device activity, and that a man-in-the-middle attack cannot change messages. Imaging devices may support Secure Sockets Layer (SSL) and other secure connections to ensure data is safe while being communicated between the imaging device and remote computing device applications.
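
A minimal sketch of such a protected exchange, assuming the device exposes an HTTPS endpoint, is shown below; the host name, path and message body are illustrative assumptions.

    # Hypothetical sketch: carry the same markup-language messages over an
    # SSL/TLS connection so credentials and job data are encrypted in transit.
    import ssl
    import urllib.request

    context = ssl.create_default_context()   # verifies the peer certificate
    req = urllib.request.Request(
        "https://idev.example.com/webservices/imagesend",  # assumed endpoint
        data=b"<request><operation>GetDefaults</operation></request>",
        headers={"Content-Type": "text/xml"})
    with urllib.request.urlopen(req, context=context) as resp:
        print(resp.status, resp.read()[:80])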

Administrators may have the ability to lock down imaging device access. For users with strict security policies, administrators may have the ability to disable access by remote computing devices or limit access to specific applications. Administrators may have an option to register the specific applications that are permitted to access the imaging device interfaces.

Remote computing device applications may ensure the imaging device is not being “spoofed.” The remote computing device may be able to authenticate an imaging device that it is in contact with to ensure that an intruder cannot imitate the imaging device to collect network configuration and password information, monitor file/folder structures of a document management system, or spoof security settings and DSK status of the imaging device.

An imaging device may ensure that the remote computing device is not being “spoofed.” The imaging device must be able to authenticate all remote computing devices that it is in contact with to ensure that an intruder is not spoofing the remote computing device's IP address. By pretending to be the remote computing device, an intruder could steal user credentials, redirect scanned documents, change device settings or firmware, or bring down the access control system (either to provide access to unauthorized users or to initiate a denial of service attack against valid users).

Access control/vend applications may not be compromised when a remote computing device is unavailable. When the remote computing device is unavailable, it may not be acceptable to provide open access to the device. If the remote computing device is unavailable at startup or becomes unavailable at any time (e.g., someone disconnects the network cable), the imaging device may immediately be disabled and an error message displayed.
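
One possible way to realize this fail-closed behavior is sketched below, assuming the device periodically checks that the access control server is reachable; the server address and the check itself are illustrative assumptions.

    # Hypothetical sketch of fail-closed access control: if the remote access
    # control server cannot be reached, disable the device instead of granting
    # open access.
    import socket

    ACCESS_SERVER = ("accesscontrol.example.com", 443)   # assumed address

    def access_server_reachable(timeout=2.0):
        try:
            with socket.create_connection(ACCESS_SERVER, timeout=timeout):
                return True
        except OSError:
            return False

    def device_state():
        if access_server_reachable():
            return "enabled"
        # Lock the panel and show an error message to the walk-up user.
        return "disabled: access control server unavailable"

    print(device_state())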

An administrator may be able to adjust a security level based on company and application requirements. Security requirements can have a large impact on the time it takes to develop a remote computing device application and the resources required to implement the solution. Users using some embodiments may range from a small business with one imaging device, no IT staff, and a simple scan or print application to a large government office using access control and audit trails to track all device activity. The security measures used to protect imaging device interfaces may be adjustable by the administrator to match the target environment.

The imaging device and remote computing device applications may be able to hand-off user credentials. Users may be prompted to login at multiple points throughout a job. For example, an access control application or accounting application may control total device access, the imaging device may have user authentication enabled for Image Send, and a document management application may require user login before showing a folder list. In many environments, all of these applications will use a common user database. In some embodiments, it is, therefore, desirable for the applications to pass user credentials to each other, so that each one does not have to repeat the authentication process.

Some embodiments of the present invention may be described with reference to FIG. 3. These embodiments comprise an imaging device only, which is configured to interact with a remote computing device, such as a server, through a communications link. The imaging device 30 comprises a user interface 32, which comprises a user input device 34, such as a keypad, one or more buttons, knobs or switches, or a touch-screen panel, and a display 36, which may itself comprise the user input device 34 in the form of a touch-screen panel.

Imaging device 30 will typically be capable of performing one or more imaging functions including, but not limited to, scanning, printing, copying, facsimile transmission (sending and receiving) and others.

These embodiments further comprise a communications link 38, which may be a wired connection (as shown in FIG. 3) comprising a network cable, a Universal Serial Bus (USB) cable, a serial cable, a parallel cable, a powerline communication connection such as a HomePlug connection or other wired connections. Alternatively, the communications link 38 may comprise a wireless connection, such as an IEEE 802.11(b) compliant connection, a Bluetooth connection, an Infrared Data Association (IrDA) connection or some other wireless connection.

The operation of some imaging device embodiments may be explained with reference to FIG. 4. In these embodiments, menu data is received 40 from a remote computing device (not shown in FIG. 3), which is connected to the imaging device 30 via the communication link 38 through a wired or wireless connection. This menu data is then displayed 42 on the imaging device user interface display 36. This display of remote menu data is intended to prompt a user to make an input on the user interface input device 34.

Imaging devices of these embodiments are further configured to accept input from a user in response to a display of remote menu data and communicate 44 that user input to a remote computing device. In some embodiments, this user input data will be processed by a remote computing device. This may comprise running an application on the remote computing device. This processing may also comprise accessing and communicating data that is stored on the remote computing device.

The imaging devices of these embodiments are further configured to receive 46 data resulting from processing the user input data. This may comprise data generated by an application running on the remote computing device in response to the user input. The imaging device may also receive data that was stored on a remote computing device, such as a file server, in response to processing the user input.

Once the imaging device 30 has received 46 the processed data, the imaging device 30 may perform 48 a native function in response to the data or using the data. For example, and not by way of limitation, the imaging device 30 may print a document that was stored on the remote computing device and modified on the remote computing device according to the user input. As another non-limiting example, the imaging device 30 may activate or enable functions (e.g., scanning, copying, printing, fax transmission) on the imaging device in response to the receipt 46 of processed data.

Some, more specific, imaging device embodiments may be explained with reference to FIG. 5. In these embodiments, the imaging device 30 is configured to receive 50 menu data formatted in a markup language from a remote computing device. The communication link by which the menu data is communicated may be established and maintained using a Hypertext Transfer Protocol (HTTP). The markup language may comprise terms from Hypertext Markup Language (HTML), Extensible Markup Language (XML), Wireless Markup Language (WML), Extensible Hypertext Markup Language (XHTML) and/or other languages.
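
For illustration, a hedged sketch of fetching and parsing such menu data follows; the URL and element names are assumptions, and any of the markup languages listed above could carry the menu.

    # Hypothetical sketch: the device fetches menu data over HTTP and extracts
    # the items it will show on its front-panel display.
    import urllib.request
    from xml.etree.ElementTree import fromstring

    MENU_URL = "http://rcd.example.com/menus/document-format"   # assumed

    def fetch_menu_items():
        with urllib.request.urlopen(MENU_URL) as resp:
            document = fromstring(resp.read())
        # Assumed layout: <menu><item id="letter">Business letter</item>...</menu>
        return [(item.get("id"), item.text) for item in document.findall("item")]

    for item_id, label in fetch_menu_items():
        print(item_id, label)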

Once the menu data is received 50, it may be displayed 52 on the imaging device user interface display 36. As in previously described embodiments, the menu data is typically intended to prompt user input on imaging device user interface 32. Display 52 of the remotely-stored menu data may be accomplished with a browser application that is native to the imaging device 30.

In these embodiments, the imaging device 30 is further configured to route 54 user input received through its user interface 32 to a remote computing device. The remote computing device that receives the user input may then run an application or otherwise process the user input and return the results of the processing to the imaging device 30. Accordingly, the imaging device 30 is further configured to receive 58 processed data from a remote computing device. In some embodiments, the imaging device 30 may perform one or more functions in response to the receipt 58 of processed data.

Some embodiments of the present invention may be explained with reference to FIG. 6. These embodiments comprise a remote computing device (RCD) 60, which has a communications link 64. Communications link 64 may be a wired connection (as shown in FIG. 6) comprising a network cable, a Universal Serial Bus (USB) cable, a serial cable, a parallel cable, a powerline communication connection such as a HomePlug connection or other wired connections. Alternatively, the communications link 64 may comprise a wireless connection, such as an IEEE 802.11(b) compliant connection, a Bluetooth connection, an Infrared connection, such as those defined in the Infrared Data Association (IrDA) standard or some other wireless connection. In some embodiments, RCD 60 may further comprise a data storage device 62, which is typically a hard drive, but may also be an optical drive device, such as an array of compact disk drives, flash memory or some other storage device.

Embodiments of RCD 60 may be further described with reference to FIG. 7. In these embodiments, RCD 60 comprises a processor 72 for processing data and running programs such as operating systems and applications. RCD 60 may further comprise memory 74, which may be in the form of Random Access Memory (RAM) and Read Only Memory (ROM). Generally, any applications processed by processor 72 will be loaded into memory 74. RCD 60 may further comprise a network interface 78, which allows RCD 60 to communicate with other devices, such as an imaging device 30. In some embodiments, RCD 60 may also comprise a user interface 80, but this is not required in many embodiments. Storage 62 may be used to store applications and data that may be accessed by an imaging device 30 of embodiments of the present invention. Processor 72, memory 74, storage 62, network interface 78 and, optionally, user interface 80 are typically linked by a system bus 76 to enable data transfer between each component. Communications link 64 may couple the RCD 60 to other devices via network interface 78.

In some embodiments, described with reference to FIG. 8, an RCD 60 may comprise menu data stored on storage device 62 or in memory 74. This menu data may be configured for display on an imaging device user interface 32. Menu data may be stored in many formats and configurations. In some embodiments menu data may take the form of terms expressed with a markup language. The markup language may comprise terms from Hypertext Markup Language (HTML), Extensible Markup Language (XML), Wireless Markup Language (WML), Extensible Hypertext Markup Language (XHTML) and/or other languages. In these embodiments, menu data may be sent 82 through a communications link 64 to an imaging device 30. Accordingly, menu data configured for display on an imaging device is stored on RCD 60.

An RCD 60, of some embodiments, will be further configured to receive 84 user input obtained through the user interface 32 of an imaging device 30 and transferred to the RCD 60 over communications links 38 & 64. Once this input data is received at an RCD 60, the input data may be processed 86. This processing 86 may comprise conversion of the data to a new format, execution of commands contained within the data or some other process. Once the input data has been processed 86, the processed output may be sent 88 back to the imaging device 30 where the processed output may be used in an imaging device process or function.
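
The RCD side of this exchange could be pictured with the following hedged sketch, in which a small HTTP handler accepts posted input data, performs a placeholder processing step and returns the processed output; the port and message format are assumptions.

    # Hypothetical sketch of the RCD side: accept user-input data posted by the
    # imaging device, process it, and return the processed output.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class InputHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            user_input = self.rfile.read(length).decode("utf-8")
            # Placeholder for running an application, looking up stored data,
            # converting formats, and so on.
            processed = "<result>processed %d bytes</result>" % len(user_input)
            self.send_response(200)
            self.send_header("Content-Type", "text/xml")
            self.end_headers()
            self.wfile.write(processed.encode("utf-8"))

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), InputHandler).serve_forever()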

In some embodiments, as described with reference to FIG. 9, an RCD 60 may send 90 menu data configured for an imaging device display 36 using a markup language. The markup language menu data is then received at the imaging device 30 and displayed to a user. Typically, this will prompt the user to enter an input on the imaging device user interface 32. This user input will then be sent by the imaging device 30 to the RCD 60. The RCD 60 will then receive 92 the input data prompted by the display of the menu data on the imaging device 30. Once received, the input data may be processed 94 on the RCD 60. Processing may comprise the selection, recordation and/or modification of a form, document or other data stored on RCD 60, the authorization of a user identified by the user input, the translation of a document input by the user, generation of a map or other directions related to user input or some other process or function. After this processing 94, the processing result may be sent 96 to the imaging device.

Some embodiments of the present invention may be described with reference to FIGS. 10 & 11. These embodiments comprise at least one RCD 60 and a plurality of imaging devices 30a-30d. In these embodiments, at least one of the imaging devices 30a-30d comprises a user interface 32 with a display 36 and a user input panel 34 that is integral with the display (i.e., touch-screen) or a separate input unit. RCD 60 is connected to imaging devices 30a-30d by a communications link and network 100 to enable data transmission between RCD 60 and imaging devices 30a-30d.

In these embodiments, menu data is stored on RCD 60 and sent 110 to at least one of the imaging devices 30a-30d where the menu data is displayed on a user interface. Any of the imaging devices 30a-30d that receive the menu data are configured to accept 112 and transmit 114 user input to an RCD 60. Once the user input data is received at the RCD, the data may be processed 116 as discussed in previously described embodiments. The result of processing 116 may then be sent 118 back to any combination of the imaging devices 30a-30d.

In these embodiments, a single RCD 60 may be used to provide processing power, resources and functionality to a plurality of imaging devices 30a-30d without reproducing these resources in each imaging device. In some embodiments, data generated by input on one imaging device 30a may be directed to another imaging device 30d for processed data output or final processing.

Dynamic Document Creation Embodiments

Some embodiments of the present invention may comprise an imaging device (IDev) with internal processing capabilities. Other embodiments may comprise an imaging device (IDev) in communication with one or more remote computing devices (RCDs) on which processing may be performed.

Some embodiments of the present invention may be described with reference to FIG. 12. In these embodiments, an imaging device (IDev) 120 comprises a user interface 124, which comprises a user input device and a display as described in relation to other embodiments. The imaging device 120 is connected to other devices through a communication link 122, which may comprise a wired or wireless network connection or some other connection. The imaging device 120 may be connected to a remote computing device (RCD) 126 residing on a local area network (LAN) or similar local communication link. A remote computing device 126b may also reside on a wide area network (WAN) or even a global network, such as the Internet 128. The imaging device 120 may also be connected to other devices, such as a database server 125, as well as other computing devices, storage devices, output devices and/or other devices capable of communicating with the IDev 120.

In an exemplary embodiment of a dynamic document creation and editing application, a user, through the use of an imaging device (IDev) user interface (UI), may select a document format, such as a business letter, invoice, vacation request or some other document format. A document format may comprise one or more parameters for defining a page size, page orientation, single-sided, double-sided, margins, headers, footers, columns, text boxes, graphics boxes and other formatting options. In some embodiments, a document format may comprise a predefined template.

A user, through the IDev UI, may also define a dynamic field structure for a dynamic document. A dynamic field structure may comprise one or more data fields for text, numerical data, graphics, images or other field types.

A dynamic field structure may also comprise field display characteristics. Field display characteristics may comprise a field size, field shape, field color, field background color, field shading, font size, field rotation, field orientation and other characteristics that affect the way field content is displayed on a display screen or output to media. In an exemplary embodiment comprising a company advertisement flyer, dynamic field display characteristics may comprise a text color, background color, text font size, text orientation and other characteristics.

A dynamic field structure may also comprise a field relationship. A field relationship may comprise a relationship between field content and the content of another field or a relationship between field content and other data. A field relationship may comprise a geographical relationship, a mathematical relationship, a logical relationship, such as one expressed with Boolean logic, or another relationship. In an exemplary embodiment comprising a purchase order form, a field relationship may comprise a mathematical relationship for a total amount field that is a summation of the column above that field.
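
The purchase-order example can be made concrete with a small sketch; the field names and the way relationships are recorded below are assumptions chosen only to show a summation relationship being evaluated.

    # Hypothetical sketch of a dynamic field structure with a mathematical
    # field relationship: a grand total field defined as the sum of a column.
    fields = {
        "line_total_1": 19.99,
        "line_total_2": 5.00,
        "line_total_3": 12.50,
    }

    relationships = {
        # target field -> (relationship type, source fields)
        "grand_total": ("sum", ["line_total_1", "line_total_2", "line_total_3"]),
    }

    def evaluate(fields, relationships):
        resolved = dict(fields)
        for target, (kind, sources) in relationships.items():
            if kind == "sum":
                resolved[target] = sum(resolved[s] for s in sources)
        return resolved

    print(evaluate(fields, relationships)["grand_total"])   # 37.49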

In addition to a document format and a dynamic field structure, a dynamic document may comprise static content. A user may specify static content, such as text, graphics, images, or other content that remains in a static state on the document. In an exemplary embodiment comprising a company letter, static content may comprise a company logo and/or text denoting the company name. This static content will remain constant while other field content may vary, such as an addressee field or a signature block field.

Some embodiments of the present invention may be described with reference to FIG. 13A. In these embodiments, a user may select a document format such as a business letter 130A or an invoice 140A. The data relating to this document format may comprise page size, page orientation, single-sided pages, double-sided pages, margins, headers, footers, columns, and other formatting options. In some embodiments, a document format may be selected from a pre-defined template.

A user may also define dynamic field structures for a document. These dynamic field structures may comprise one or more fields that may be populated with many types of information, and each field may be linked or related to other data with a field relationship, such as a link to a remote site or a mathematical relationship to another field. A field structure typically comprises field types, field display characteristics and field relationships. In the exemplary business letter embodiment 130A shown in FIG. 13A, field types comprise an address line field 133A, a date field 134A, a RE: line field 135A, a salutation field 136A, a field for the letter body 137A and a signature field 138A.

In addition to a document format and a document field structure, a user may also designate document static content. This static content may include graphics, such as a company logo 131A or images and text such as a company letterhead 141A or 132A. Static content may also include column headings 148A, numbers and other data that is intended to be communicated by each instance of the document.

Once these three document elements have been defined, they may be compiled into a dynamic document that may automatically populate its fields when it is selected and generated by a user at an imaging device. In some embodiments, the fields may be linked or related to user data once the user is identified through a login process or another identification process.
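
The three-part compilation and later field population described above might look like the following hedged sketch; the dictionary layout and user data are illustrative assumptions, not a prescribed format.

    # Hypothetical sketch: combine document format, dynamic field structure and
    # static content into a dynamic document, then populate the fields from
    # data linked to an identified user when an instance is generated.
    document_format = {"page_size": "Letter", "orientation": "portrait",
                       "duplex": False, "margin_inches": 1.0}
    field_structure = ["addressee", "date", "signature_block"]
    static_content = {"letterhead": "ACME Corporation", "logo": "acme_logo.png"}

    dynamic_document = {"format": document_format,
                        "fields": {name: None for name in field_structure},
                        "static": static_content}

    def generate_instance(document, user_data):
        instance = dict(document)
        instance["fields"] = {name: user_data.get(name)
                              for name in document["fields"]}
        return instance

    # Data linked to the identified user (e.g. from a directory or database).
    user_data = {"addressee": "Jane Doe", "date": "2005-09-22",
                 "signature_block": "J. Smith, Engineering"}
    print(generate_instance(dynamic_document, user_data))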

In another exemplary embodiment 140A, illustrated in FIG. 13A, a document format for an invoice document may be selected. A field structure may be defined with fields including, but not limited to, a quantity field 142A, an item description field 143A, a unit price field 144A, a total price field 145A, a grand total field 146A, and a customer signature field 147A, which may also include a digital signature field. Static content in this embodiment comprises a letterhead 141A.

In some embodiments of the present invention, as illustrated in FIG. 14, a user may select a dynamic document editor application 141. In these embodiments, a user may select 142 a document format, as described above. A document format definition may include data comprising page size, page orientation, single-sided pages, double-sided pages, margins, headers, footers, columns, predefined templates and other formatting options.

A user may also define 143 a dynamic field structure. This dynamic field structure may comprise one or more fields that may be populated with data. Fields may be populated with user input at the IDev UI 124. Fields may also be populated by automated processes that access information stored on the IDev, an RCD or some other resource in communication with the IDev. Some fields may be populated at the time of dynamic document creation. Fields may also be populated when a document is generated for output, such as for printing or transmission to a recipient. Fields may comprise many types of information. Some types comprise text, numerical data, graphics, images and other types.

A user may also designate 144 document static content. This static content may comprise text, graphics, images, symbols, numbers and other data that is intended to be communicated by each instance of the document. Document static content does not change with any document variable. Once these elements are defined, a combination of the document format, dynamic field structure and static content may be formed into a dynamic document. The application that combines these elements into a dynamic document may reside on the IDev or may reside on a remote computing device (RCD). The dynamic document may then be saved 147 at any location accessible to the IDev.

Further embodiments of the present invention may be described with reference to FIG. 15. These embodiments comprise selecting 150 a dynamic document editor application at an imaging device (IDev) 120. The IDev may respond to this selection 150 by requesting 151 a document format menu from a remote computing device (RCD) 128. The RCD may then send 152 a document format options menu to the IDev. This document format menu may be formatted as a hypertext language document, such as an XML document. The IDev may then display 153 the document format options menu. Menu display may be accomplished through the use of a web browser that interprets and displays content in a hypertext language format. In response to the display of the document format menu data, a user may input a selection at the IDev, and the IDev may receive 154 this selection from its UI 124. The IDev may then send 155 the document format options menu selection to the RCD. This selection message may take the form of an XML/SOAP message or a .NET message.
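
By way of a hedged example only, such a selection message might resemble the following XML/SOAP envelope, shown here as a Python string literal. Only the SOAP 1.1 envelope namespace is standard; the body element and field names are assumptions made for this illustration.

    # Hypothetical example of a document format selection carried as an
    # XML/SOAP message; the body elements are illustrative assumptions.
    SELECTION_MESSAGE = """\
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <SelectDocumentFormat>
          <FormatId>business_letter</FormatId>
          <PageSize>Letter</PageSize>
          <Orientation>portrait</Orientation>
        </SelectDocumentFormat>
      </soap:Body>
    </soap:Envelope>
    """
    print(SELECTION_MESSAGE)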

In some embodiments of the present invention, the RCD may then send 156 a dynamic field structure menu to the IDev. This dynamic field structure menu may be formatted as a hypertext language document, such as an XML document. The IDev may then display 157 the dynamic field structure menu. This menu may be displayed through a web browser as described above in the document format menu. In response to this menu display, a user may input a selection through the IDev UI. The IDev may receive 158 this selection and then send 159 the dynamic field structure menu selection to the RCD.

In some embodiments, the RCD may respond to the receipt of the dynamic field structure menu selection data by sending 160 a static content menu to the IDev. The IDev may then display 161 the static content menu in a manner similar to that described for other embodiments. When a user responds, the IDev may accept 162 a user selection of the static content menu and send 163 the static content menu selection to the RCD.

The RCD may then compile 164 the document format, the dynamic field structure and the static content into a dynamic document. The dynamic document may then be saved 165 at the IDev, the RCD or some other location.

Some embodiments of the present invention comprise associating dynamic document editor application options with user specific data. In these embodiments, a user is identified through the IDev. Data linked to the user may then be accessed by the IDev or a device in communication with the IDev. This user-specific data may be linked or otherwise related to the document field structure. A document field may be related to user-specific data that is stored on a resource accessible to the IDev or RCD. In some embodiments, a user's personal information may be related to fields in a dynamic document. A user's name, address, social security number, bank account data and/or other information may be related to dynamic document fields so that these fields are automatically populated with personal data when a user selects a defined dynamic document. Security measures may be implemented when a user logs onto the application.

Some embodiments, illustrated in FIG. 16, comprise identifying 170 a user at an imaging device (IDev) 120 and associating 171 dynamic document editing options, such as menu selections, with the user. In these embodiments, a remote computing device (RCD) 128 may send 172 a document format options menu, which may comprise user-specific options, to the IDev, which may then display 173 the document format options menu and accept 174 a user selection in relation to the menu. The IDev may then send 175 the document format options menu selection to the RCD.

Some embodiments of the present invention may further comprise sending 176 a dynamic field structure menu, which may comprise user specific data, to the IDev from an RCD. The IDev may then display 177 the dynamic field structure menu and accept 178 a user selection of the menu. The IDev may then send 179 the dynamic field structure menu selection to the RCD.

Some embodiments of the present invention may further comprise sending 180 a static content menu, which may comprise user specific data, to the IDev from an RCD. The IDev may then display 181 the static content menu and accept 182 a user selection of the menu. The IDev may then send 183 the static content menu selection to the RCD. The RCD may then compile 184 a dynamic document comprising data from the document format menu selection, the field structure menu selection and the static content menu selection. The dynamic document may then be saved 185 at the RCD, the IDev or some other location.

Further embodiments of the present invention may be described with reference to FIG. 17. These embodiments comprise an IDev with applications and resources integral to the IDev. In these embodiments, an IDev takes steps to create a dynamic document. The IDev typically begins the process with the receipt of 190 a dynamic document editor application selection at an imaging device (IDev) 120 UI. The IDev may then display 191 a document format options menu and accept 192 a user selection of the document format options menu. The IDev may further display 193 a dynamic field structure menu and accept 194 a user selection of the dynamic field structure menu. The IDev may then display 195 a static content menu and accept 196 a user selection of the static content menu. In some embodiments, these menus and selections may be combined into fewer divisions or split into more divisions for efficiency, convenience or for other reasons. The IDev may then compile 197 a dynamic document and save 198 the dynamic document.

Further embodiments of the present invention, as illustrated in FIG. 18, comprise an IDev at which user input may be received and at which menu content may be received from an RCD on which applications and resources may reside. In these embodiments, the IDev may accept 200 a dynamic document editor application selection. The IDev may then send 201 the dynamic document editor application selection to a remote computing device (RCD) 128, on which the dynamic document editor application may be executed. In response, the IDev may receive 202 a document format options menu from the RCD. The IDev may then display 203 the document format options menu and accept 204 any user selection in response to the document format options menu.

The IDev may then receive 206 a dynamic field structure menu from the RCD and may then display 207 the dynamic field structure menu. The IDev may then accept 208 a user selection in response to the dynamic field structure menu and send 209 the user selection to the RCD. The IDev may then receive 210 a static content menu from the RCD and may then display 211 the static content menu. The IDev may then accept 212 a user selection in response to the static content menu and send 213 the user selection to the RCD.

Further embodiments of the present invention may be described with reference to FIG. 19. These embodiments comprise an IDev that may receive user input and that may also receive menu content from an RCD. In these embodiments, the IDev may accept 220 a dynamic document editor application selection. The IDev may also identify 221 a user by receipt of credentials at the IDev UI or by some other method. The IDev may then send 222 the user identification data to a remote computing device (RCD) 128 along with a request to run the dynamic document editor application. The IDev may then receive 223 a document format options menu from the RCD. This menu may comprise user-specific data that is correlated to the user. User specific data may be stored on the RCD or another device in communication with the RCD. This menu may be displayed 224 on the IDev. The IDev may then accept 225 a user selection in response to the document format options menu and send 226 the user selection to the RCD.

The IDev may then receive 227 a dynamic field structure menu from the RCD. This menu may also comprise user specific data correlated to the identified user. The IDev may then display 228 the dynamic field structure menu. The IDev may then accept 229 a user selection in response to the dynamic field structure menu and send 230 the user selection to the RCD. The IDev may then receive 231 a static content menu from the RCD. Again, this menu may comprise user-specific data correlated to the identified user. The IDev may then display 232 the static content menu. The IDev may then accept 233 a user selection in response to the static content menu and send 234 the user selection to the RCD. The dynamic document editor application may then take this selection data transmitted from the IDev and compile a dynamic document.

Further embodiments of the present invention may be described with reference to FIG. 20. These embodiments comprise an RCD with a dynamic document editing application. In these embodiments, an RCD may send 240 a document format options menu to an imaging device (IDev) 120 and receive 241 a user selection of the document format options menu from the IDev. In these embodiments the RCD may also send 242 a dynamic field structure menu to the IDev and receive 243 a user selection of the dynamic field structure menu. The RCD may also send 244 a static content menu to the IDev and receive 245 a user selection of the static content menu. The RCD may then compile 246 the input that it has received into a dynamic document and save 247 the dynamic document.

Further embodiments of the present invention may be described with reference to FIG. 21. These embodiments comprise an RCD that may receive a user identification. These embodiments comprise receiving 250 user identification data from an imaging device (IDev) 120 and associating 251 the user identification data with dynamic document editor application options. These embodiments may further comprise sending 252 a document format options menu (with user specific data) from a remote computing device (RCD) 128 to the IDev and receiving 253, from the IDev, a user selection in response to the document format options menu. The RCD may also send 254 a dynamic field structure menu to the IDev and receive 255 from the IDev a user selection in response to the dynamic field structure menu. Embodiments of the present invention may further comprise the RCD sending 256 a static content menu to the IDev and receiving 257 a user selection in response to the static content menu. The RCD may then compile 258 a dynamic document and save 259 the dynamic document.

The terms and expressions which have been employed in the foregoing specification are used therein as terms of description and not of limitation, and there is no intention in the use of such terms and expressions of excluding equivalents of the features shown and described or portions thereof, it being recognized that the scope of the invention is defined and limited only by the claims which follow.

Patent Citations
Cited PatentFiling datePublication dateApplicantTitle
US5085587Aug 7, 1990Feb 4, 1992Scantron CorporationScannable form and system
US5228100Jul 10, 1990Jul 13, 1993Hitachi, Ltd.Method and system for producing from document image a form display with blank fields and a program to input data to the blank fields
US5323393Nov 18, 1992Jun 21, 1994Canon Information Systems, Inc.Method and apparatus for obtaining and for controlling the status of a networked peripheral
US5365494Feb 7, 1994Nov 15, 1994Mike LynchRadio alarm clock with reminder capability
US5410646Feb 23, 1994Apr 25, 1995Park City Group, Inc.System and method for creating, processing, and storing forms electronically
US5504589Dec 27, 1993Apr 2, 1996Montague; Charles E.System and apparatus for transmitting food orders to a central station
US5513112Oct 6, 1993Apr 30, 1996Neopost LimitedDatabase system
US5542031Apr 30, 1993Jul 30, 1996Douglass; Clay S.Halftone computer imager
US5586260Feb 12, 1993Dec 17, 1996Digital Equipment CorporationMethod and apparatus for authenticating a client to a server in computer systems which support different security mechanisms
US5659845May 30, 1996Aug 19, 1997Xerox CorporationAccounting system for use with document processing system
US5671412Jul 28, 1995Sep 23, 1997Globetrotter Software, IncorporatedLicense management system for software applications
US5699493Jun 23, 1995Dec 16, 1997Lexmark International, Inc.Method and apparatus for providing job accounting information to a host computer from a printer
US5699494May 1, 1996Dec 16, 1997Lexmark International, Inc.Remote replication of printer operator panel
US5717439Oct 10, 1995Feb 10, 1998Xerox CorporationHierarchy of saving and retrieving control templates
US5726883Oct 10, 1995Mar 10, 1998Xerox CorporationMethod of customizing control interfaces for devices on a network
US5727082Apr 11, 1995Mar 10, 1998Canon Kabushiki KaishaImage processing apparatus
US5727135Aug 2, 1996Mar 10, 1998Lexmark International, Inc.Multiple printer status information indication
US5745883May 30, 1996Apr 28, 1998Xerox CorporationBilling system for use with document processing system
US5760775Oct 30, 1995Jun 2, 1998Xerox CorporationApparatus and method for programming a job ticket in a document processing system
US5774678May 7, 1997Jun 30, 1998Ricoh Company, Ltd.Method and apparatus for controlling and communicating with business office devices
US5791790Mar 13, 1996Aug 11, 1998Lexmark International, Inc.Method and apparatus for providing print job buffering for a printer on a fast data path
US5796934May 31, 1996Aug 18, 1998Oracle CorporationFault tolerant client server system
US5799206Apr 7, 1995Aug 25, 1998Hitachi, Ltd.Remote print system having a plurality of computers which are capable of monitoring and controlling operations of a remote printer
US5799289Sep 30, 1996Aug 25, 1998Ricoh Company, Ltd.Order management system and method considering budget limit
US5812818Nov 17, 1994Sep 22, 1998Transfax Inc.Apparatus and method for translating facsimile text transmission
US5832264Jul 19, 1995Nov 3, 1998Ricoh Company, Ltd.Object-oriented communications framework system with support for multiple remote machine types
US5848231Dec 24, 1996Dec 8, 1998Teitelbaum; NeilSystem configuration contingent upon secure input
US5877776Apr 26, 1996Mar 2, 1999Apple Computer, Inc.Method and system for supporting multiple font formats by a font scaler sub-system
US5944824Apr 30, 1997Aug 31, 1999Mci Communications CorporationSystem and method for single sign-on to a plurality of network elements
US5956487Oct 25, 1996Sep 21, 1999Hewlett-Packard CompanyEmbedding web access mechanism in an appliance for user interface functions including a web server and web browser
US5956698Jul 31, 1997Sep 21, 1999Xerox CorporationInformation broker for printing system
US5968127Aug 6, 1997Oct 19, 1999Fuji Xerox Co., Ltd.Information processing apparatus
US5993088Sep 30, 1998Nov 30, 1999International Business Machines CorporationMethod for improving print performance and quality by accumulating, storing and using resource accounting information with a print job
US5995553Jan 28, 1997Nov 30, 1999Tft, Inc.Encoder/decoder for emergency alert system
US5999708Jul 30, 1996Dec 7, 1999Canon Kabushiki KaishaMethod and apparatus for copier request and retrieval of files from multiple computers
US6042384Jun 30, 1998Mar 28, 2000Bookette Software CompanyComputerized systems for optically scanning and electronically scoring and reporting test results
US6044382Jun 20, 1997Mar 28, 2000Cyber Fone Technologies, Inc.Data transaction assembly server
US6069706Jul 26, 1996May 30, 2000Canon Kabushiki KaishaImage reading device and image processing method utilizing the same
US6075860Feb 19, 1997Jun 13, 20003Com CorporationApparatus and method for authentication and encryption of a remote terminal over a wireless link
US6115132Dec 24, 1997Sep 5, 2000Canon Kabushiki KaishaPrinting system that transmits job information independently of print data
US6118546Feb 18, 1998Sep 12, 2000Canon Kabushiki KaishaPrinter/facsimile driver with page count generation
US6128731Oct 21, 1998Oct 3, 2000Silicon Graphics, Inc.Advanced boot sequence for an +86 computer system that maintains expansion card device compatibility
US6141662Mar 24, 1998Oct 31, 2000Canon Kabushiki KaishaKeyword-formatted hierarchical file management apparatus and method
US6148346Jun 20, 1996Nov 14, 2000Peerless Systems Imaging Products, Inc.Dynamic device driver
US6161139Feb 12, 1999Dec 12, 2000Encommerce, Inc.Administrative roles that govern access to administrative functions
US6178308Oct 16, 1998Jan 23, 2001Xerox CorporationPaper based intermedium for providing interactive educational services
US6199080Aug 30, 1996Mar 6, 2001Sun Microsystems, Inc.Method and apparatus for displaying information on a computer controlled display device
US6213652Oct 17, 1995Apr 10, 2001Fuji Xerox Co., Ltd.Job scheduling system for print processing
US6216113Oct 17, 1994Apr 10, 2001Xerox CorporationAuditron access printer
US6233409Jan 10, 2000May 15, 2001Hewlett-Packard CompanyRedundant reorder prevention for replaceable printer components
US6240456Sep 18, 1997May 29, 2001Microsoft CorporationSystem and method for collecting printer administration information
US6246487Mar 24, 1998Jun 12, 2001Fujitsu LimitedMulti-function unit, server and network system having multi-function unit
US6292267May 8, 1997Sep 18, 2001Fujitsu LimitedNetwork printer apparatus and LAN network system
US6301016Dec 9, 1994Oct 9, 2001Canon Kabushiki KaishaData processing apparatus connectable to a LAN
US6307640Feb 25, 1998Oct 23, 2001Ricoh Company, Ltd.Computer-based network printing system and method
US6311040Jul 31, 1997Oct 30, 2001The Psychological CorporationSystem and method for scoring test answer sheets having open-ended questions
US6353878Aug 13, 1998Mar 5, 2002Emc CorporationRemote control of backup media in a secondary storage subsystem through access to a primary storage subsystem
US6369905Jun 15, 1993Apr 9, 2002Canon Kabushiki KaishaInformation processing apparatus and output apparatus
US6407820May 17, 2000Jun 18, 2002Heidelberg Digital L.L.C.Efficient use of print resources within a job stream
US6426798Mar 4, 1999Jul 30, 2002Canon Kabushiki KaishaData structure for printer description file
US6433883Nov 19, 1999Aug 13, 2002Canon Kabushiki KaishaImage processing apparatus
US6438589May 27, 1998Aug 20, 2002Fuji Xerox Co., Ltd.System for communicating a plurality of information processing modules using two cascaded links
US6462756May 17, 2000Oct 8, 2002Heidelberger Druckmaschinen AgSystem and method for visual representation of pages in a production printing workflow
US6476926Jun 7, 1995Nov 5, 2002Canon Kabushiki KaishaMethod and apparatus for controlling the amount of ink and the life of the printhead in an ink-jet recording apparatus
US6490601Jan 15, 1999Dec 3, 2002Infospace, Inc.Server for enabling the automatic insertion of data into electronic forms on a user computer
US6509974May 17, 2000Jan 21, 2003Heidelberger Druckmaschinen AgAutomated job creation for job preparation
US6510466Dec 14, 1998Jan 21, 2003International Business Machines CorporationMethods, systems and computer program products for centralized management of application programs on a network
US6516157May 9, 2000Feb 4, 2003Minolta Co., Ltd.Printing system that calculates printing cost using data input via a remote data input terminal and returns calculated printing cost to the remote data input terminal
US6526258Jul 27, 2001Feb 25, 2003Educational Testing ServiceMethods and systems for presentation and evaluation of constructed responses assessed by human evaluators
US6567179Sep 30, 1997May 20, 2003Canon Kabushiki KaishaSystem for controlling communication between a printer and an external host computer
US6590589Nov 29, 1999Jul 8, 2003International Business Machines CorporationAutomatic generation of fastpath applications
US6590673Feb 11, 2000Jul 8, 2003Canon Kabushiki KaishaImage reading device and image processing method utilizing the same
US6592275Oct 11, 2000Jul 15, 2003Minolta Co., Ltd.Image forming apparatus having a function of sending output completion notice
US6597469Dec 30, 1998Jul 22, 2003Canon Kabushiki KaishaImage forming system, management method of number of outputs from image forming system, and medium storing program for executing the method
US6604157Feb 19, 1999Aug 5, 2003Hewlett-Packard Development CompanySystem and method for allowing a user to select and scan from a peripheral to a target application on a host system
US6621422Oct 1, 2001Sep 16, 2003Advanced Public Safety, Inc.Apparatus for communicating with law enforcement during vehicle travel and associated methods
US6636929Apr 6, 2000Oct 21, 2003Hewlett-Packard Development Company, L.P.USB virtual devices
US6643650Sep 12, 2000Nov 4, 2003Sun Microsystems, Inc.Mechanism and apparatus for using messages to look up documents stored in spaces in a distributed computing environment
US6652169Feb 20, 2002Nov 25, 2003Hewlett-Packard Development Company, L.P.Method and system for printer suggested upgrades to correct errors
US6685637Oct 11, 2002Feb 3, 2004Koninklijke Philips Electronics N.V.Ultrasonic diagnostic imaging system with multiple language user interface
US6707466Oct 19, 2000Mar 16, 2004Workonce Wireless CorporationMethod and system for form recognition and digitized image processing
US6721286Apr 14, 1998Apr 13, 2004Hewlett-Packard Development Company, L.P.Method and apparatus for device interaction by format
US6735773Jun 25, 1999May 11, 2004Intel CorporationMethod and apparatus for issuing commands to a network processor configured to provide a plurality of APIs
US6749434Jan 31, 2003Jun 15, 2004Sylvan Learning Systems, Inc.System and method for conducting a learning session using teacher and student workbooks
US6772945Oct 9, 2001Aug 10, 2004Hewlett-Packard Development Company, L.P.Printed card to control printer
US6775729Nov 24, 1999Aug 10, 2004Canon Kabushiki KaishaPeripheral device, peripheral device control method, peripheral device control system, storage medium for storing peripheral device control programs, sending device for sending peripheral device control programs, and peripheral device control program product
US6823225Dec 4, 1997Nov 23, 2004Im Networks, Inc.Apparatus for distributing and playing audio information
US6826727 *Nov 24, 1999Nov 30, 2004Bitstream Inc.Apparatus, methods, programming for automatically laying out documents
US6836623Mar 21, 2003Dec 28, 2004Ricoh Company, Ltd.Imaging apparatus and remote management system of the same
US6836845Jun 30, 2000Dec 28, 2004Palm Source, Inc.Method and apparatus for generating queries for secure authentication and authorization of transactions
US6850252Oct 5, 2000Feb 1, 2005Steven M. HoffbergIntelligent electronic appliance system and method
US6854839Apr 11, 2003Feb 15, 2005Hewlett-Packard Development Company, L.P.Pay-per-use printing
US6862110Dec 18, 2000Mar 1, 2005Xerox CorporationMethod and apparatus for controlling page cost in an image-rendering device
US6862583Oct 4, 1999Mar 1, 2005Canon Kabushiki KaishaAuthenticated secure printing
US6873429Dec 6, 2000Mar 29, 2005Nec CorporationScanning device
US6874010Sep 29, 2000Mar 29, 2005Accenture LlpBase service architectures for netcentric computing systems
US6904412Aug 24, 2000Jun 7, 2005EverbankMethod and apparatus for a mortgage loan originator compliance engine
US6915525Feb 14, 2001Jul 5, 2005Sony CorporationMethod and apparatus for controlling set-top box hardware and software functions
US6934706Mar 22, 2002Aug 23, 2005International Business Machines CorporationCentralized mapping of security credentials for database access operations
US6934740Sep 19, 2000Aug 23, 20053Com CorporationMethod and apparatus for sharing common data objects among multiple applications in a client device
US6999987 *Oct 25, 2000Feb 14, 2006America Online, Inc.Screening and survey selection system and method of operating the same
US7007026 *Dec 14, 2001Feb 28, 2006Sun Microsystems, Inc.System for controlling access to and generation of localized application values
US7013289 *Feb 21, 2001Mar 14, 2006Michel HornGlobal electronic commerce system
US7145686 *Oct 31, 2001Dec 5, 2006Hewlett-Packard Development Company, L.P.Web-based imaging device service influenced by accessories
US7149964 *Feb 9, 2000Dec 12, 2006Microsoft CorporationCreation and delivery of customized content
US7174056 *Jul 6, 2004Feb 6, 2007Silverbrook Research Pty LtdProviding information in a document
US7249100 *May 15, 2001Jul 24, 2007Nokia CorporationService discovery access to user location
US7268896 *Jun 4, 2005Sep 11, 2007Bell Litho, Inc.Method for controlling brand integrity in a network environment
US7284199 *Dec 21, 2000Oct 16, 2007Microsoft CorporationProcess of localizing objects in markup language documents
US7305616 *Dec 28, 2000Dec 4, 2007Cisco Technology, Inc.Document builder for interactive dynamic documentation web site
US7325196 *Jun 16, 2003Jan 29, 2008Microsoft CorporationMethod and system for manipulating page control content
US7328245 *Jun 5, 2002Feb 5, 2008Ricoh Co., Ltd.Remote retrieval of documents
US7468805 *Jul 14, 2004Dec 23, 2008Canon Kabushiki KaishaSelective preview and proofing of documents or layouts containing variable data
US7500178 *Sep 11, 2003Mar 3, 2009Agis Network, Inc.Techniques for processing electronic forms
US7523401 *Feb 20, 2004Apr 21, 2009Theoris Software, LlcSystem and method for providing a browser-based user interface
US7548334 *Jul 15, 2004Jun 16, 2009Canon Kabushiki KaishaUser interface for creation and editing of variable data documents
US7567360 *Mar 26, 2004Jul 28, 2009Canon Kabushiki KaishaImage forming system, method and program of controlling image forming system, and storage medium
US7573593 *Mar 30, 2004Aug 11, 2009Ricoh Company, Ltd.Printer with hardware and software interfaces for media devices
US7904600 *Oct 30, 2002Mar 8, 2011Hewlott-Packard Development Company, L.P.Integrating user specific output options into user interface data
US8060556 *Dec 20, 2006Nov 15, 2011Sap AgService enabled tagged user interfaces
US20030084114 *Oct 31, 2001May 1, 2003Simpson Shell S.Web-based imaging device service influenced by accessories
US20030182632 *Mar 8, 2002Sep 25, 2003Murdock Joseph BertSystem and method for remote localization service
US20040054573 *Aug 28, 2002Mar 18, 2004Samir ShahSmart content information merge and presentation
US20040181747 *Mar 30, 2004Sep 16, 2004Hull Jonathan J.Multimedia print driver dialog interfaces
US20040190057 *Mar 26, 2004Sep 30, 2004Canon Kabushiki KaishaImage forming system, method and program of controlling image forming system, and storage medium
US20040205118 *Sep 13, 2001Oct 14, 2004Allen YuMethod and system for generalized localization of electronic documents
US20040215671 *Mar 1, 2001Oct 28, 2004Ricoh Company, Ltd. And Ricoh CorporationSystem, computer program product and method for managing documents
US20040261010 *Apr 1, 2004Dec 23, 2004Takaya MatsuishiWeb page creation apparatus, Web page creation method, Web page creation program and recording method
US20040268306 *Jun 30, 2003Dec 30, 2004Cheng Ken PrayoonMethods, systems and computer program products for language independent data communication and display
US20050063010 *Sep 24, 2003Mar 24, 2005Hewlett-Packard Development Company, L.P.Multiple flow rendering using dynamic content
US20050071746 *Mar 30, 2004Mar 31, 2005Hart Peter E.Networked printer with hardware and software interfaces for peripheral devices
US20050257148 *Jun 23, 2004Nov 17, 2005Microsoft CorporationIntelligent autofill
US20060041443 *Aug 23, 2004Feb 23, 2006Horvath Charles W JrVariable data business system and method therefor
US20060221941 *Nov 5, 2005Oct 5, 2006Konstantin KishinskyVoice over internet protocol implemented call center
US20070041035 *Aug 16, 2005Feb 22, 2007Xerox CorporationSystem and method for producing variable information documents using undetermined data sources
US20070291293 *Aug 23, 2007Dec 20, 2007Bell Litho, Inc.System for controlling brand integrity in a network environment
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US20110295761 * | May 27, 2011 | Dec 1, 2011 | Kabushiki Kaisha Toshiba | Business form management system, method and program
Classifications
U.S. Classification715/234, 715/760, 345/2.1, 715/273, 358/1.15
International Classification: G06F17/00
Cooperative Classification: H04N2201/0049, H04N1/00204, G03G2215/00109, G06F21/41, G06F21/84, H04N1/32561, H04N2201/0075, G03G15/502, H04N2201/0039, H04N1/00352, H04N2201/0094, H04N2201/3221
European Classification: G03G15/50F, H04N1/32K, G06F21/84, G06F21/41, H04N1/00C3, H04N1/00D2
Legal Events
Date | Code | Event | Description
Jun 1, 2012 | AS | Assignment
Owner name: SHARP KABUSHIKI KAISHA, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHARP LABORATORIES OF AMERICA INC.;REEL/FRAME:028306/0544
Effective date: 20120601
Sep 22, 2005 | AS | Assignment
Owner name: SHARP LABORATORIES OF AMERICA, INC., WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHRISOP, ROY K.;RICHARDSON, TANNA MARIE;REEL/FRAME:017035/0633
Effective date: 20050920