|Publication number||US5917493 A|
|Application number||US 08/633,614|
|Publication date||Jun 29, 1999|
|Filing date||Apr 17, 1996|
|Priority date||Apr 17, 1996|
|Inventors||Jin-Meng Tan, Beng Hong Kang|
|Original Assignee||Hewlett-Packard Company|
1. Field of the Invention
The present invention relates generally to data storage; more particularly, to data entry and compilation; and specifically to a stylus-input-device-based computer having a method for the random capture of information for later organization.
2. Description of Related Art
As shown in FIGS. 1A, 1B and 1C, "pen-input based" computer systems, sometimes referred to as "palmtops," are small, hand-held computing apparatus 100. One such apparatus is the OMNIGO™ computer by Hewlett-Packard Company. While a keypad 102 is provided, having a set of alphanumeric and function keys 103 (FIGS. 1A and 1B only) for program control and data entry and manipulation, generally a primary means for inputting data in a palmtop computer is a hand-manipulated stylus, or "pen," 104, configured as a gesture implemented data manipulation device. Handwriting recognition based on stylus input to a "touchscreen" is well known to persons skilled in the art. The pen 104 can be carried in a holster 105 as shown in FIG. 1A, and is removed from the holster 105 by pulling in the direction of the representative arrow in the drawing. The palmtop apparatus housing 106 folds via a hinge mechanism 107 from a closed position for easy transportation (preferably palmtops are pocket-sized), to a desktop type position as shown in FIGS. 1A and 1B, to a fully open position as shown in FIG. 1C.
A dual function display 108, such as an LCD touchscreen as would be known in the art, is provided to serve as both an input device and an output device. When operating as an input device, the display exhibits either keypad 102 entries as shown in FIG. 1B or the positioning of the tip of the stylus 104 on the display 108 as shown in FIG. 1C. The data entered in either manner is routed via the apparatus' internal, central processing unit ("CPU" such as a commercially available VG-230™ by VADEM™--not shown) and memory (not shown) as would also be known in the art. When operating as an output device, the display 108 presents computer generated images on its central data screen section 109 (hereinafter "screen 109"), FIG. 2, based upon the user input. With touchscreen capability, the display 108 can also be used for quick program access. Icons 1-12, provided in the side perimeters 201 of the display 108, activate ("boot up") the requested application program when touched with the stylus 104 (FIGS. 1A, 1B, and 1C). One of the icons, the Home screen icon 1, is used to bring up application icons that are not in the screen side perimeters 201 onto the central data screen section 109.
The CPU is programmed to recognize both alphanumeric characters and graphical images created on the display 108 with the keys 103 and with the stylus 104. Thus, the user is provided with a convenient, multifunctional, computing apparatus which also has ports (not shown) for transferring information to a printer or a host computer. In other words, the user may be working with the keypad 102 to run a built-in, sophisticated, spreadsheet program at a leisurely pace, or the user may wish to scribble in a quick reminder note or sketch with the stylus 104.
While versatile and easy to use, a problem with such notepad computing convenience is that data entry, particularly with the stylus 104, is randomly input. For example, the user may be at a meeting where a variety of issues are being discussed that are traditionally stored in various other programs, such as the spreadsheet 8, one of the databases 5, the appointment book, or the like. In such situations, real time data entry into each application can become too complicated to keep up with the meeting, rendering the usefulness of the apparatus less than optimal.
One method for creating pen generated notes is shown in U.S. Pat. No. 5,398,310 to Tchao et al. for a Pointing Gesture Based Computer Note Pad Paging and Scrolling Interface. Tchao shows a method for creating notes which can be segregated by drawing a line across the screen 38, creating a new header for the following entry. Entry continues in that note until another line is drawn across the screen, creating a new header for the next following entry. A limitation of the Tchao methodology is that one must scroll through the notes sequentially. Another limitation is that other than keeping the notes entered, there is no means of transferring the notes to other applications.
Thus, there is a need for a random entry and random access method and apparatus for randomly generating information for subsequent correlating to other resident applications.
A computer is provided with a system for randomly generating information for subsequent organization. Such a system comprises a series of application programs and a screen. Of the application programs, at least one contains a field that is capable of accepting data. An application program for randomly generating on the screen computer images, which provide information to a user, is also provided. The computer-generated images are arranged in a paginal order in such an application program and are presented to the user as pages of information. When one computer-generated image is selected and subsequently removed, the rest of the images are re-arranged in a new paginal order. An icon depicting the selected image is then provided on the screen, and if the icon is correlated with the data-accepting field, the selected image is transferred to such a field.
The present invention further provides for a memory device having an embedded program application in accordance with the methodology of these steps.
It is an advantage of the present invention that it provides a method for random entry of differing forms of data in data sets that can later be transferred to other relevant applications, thereby allowing a "capture first, organize later" operation philosophy valuable in hectic business situations.
It is another advantage of the present invention that it automatically re-correlates remaining sets of data when any one set of a series is transferred to another application, thereby eliminating scrolling through blanked entries.
It is yet another advantage of the present invention that it caters to a user model of entering and organizing note-like information throughout an integrated system in a way that is intuitive and natural to the user.
It is a further advantage of the present invention that it employs stylus based drag-and-drop data transfer to facilitate fast data manipulation.
Other objects, features and advantages of the present invention will become apparent upon consideration of the following explanation and the accompanying drawings, in which like reference designations represent like features throughout the drawings.
FIGS. 1A, 1B, 1C are perspective depictions of a palmtop computing apparatus in accordance with the present invention in which:
FIG. 1A shows the apparatus open with a stylus partially withdrawn from its holster;
FIG. 1B shows the apparatus of FIG. 1A as being used for data entry using its keypad; and
FIG. 1C shows the apparatus fully opened and being used for data entry using the stylus.
FIG. 2 is a two dimensional depiction of the display device of the apparatus as shown in FIGS. 1A, 1B and 1C.
FIG. 3 is a two dimensional depiction of the apparatus with a central data screen section of the display showing an exemplary mode of operation in accordance with the present invention.
FIGS. 4A-4F are flow charts for the methodology in accordance with the present invention in which:
FIG. 4A is a flow chart showing the initial turn-on routine for the JOTTER application,
FIG. 4B is a flow chart showing the steps of a new data input routine for the JOTTER application,
FIG. 4C is a flow chart showing the steps of a routine for the JOTTER application for function key operations implementing page changes,
FIG. 4D is a flow chart showing the steps of a routine for the JOTTER application function key to move from a current page to the first used page,
FIG. 4E is a flow chart showing the steps of a routine for the JOTTER application function key to discard a current used page, and
FIG. 4F is a flow chart showing the steps of a routine for the JOTTER application function key to transfer a current used page to a different application.
FIG. 5 is a flow chart showing the steps of a routine for starting another application capable of receiving a used page from the JOTTER application in conformance with the aspects of the present invention as shown in FIG. 4F.
FIG. 6 is a flow chart of a routine associated with data entry in the JOTTER application as shown in FIG. 4A, element 409.
FIG. 7 is a flow chart of a routine associated with transferring a used page from the JOTTER application to another application in conformance with the aspects of the present invention as shown in FIG. 4F.
The drawings referred to in this specification should be understood as not being drawn to scale except if specifically noted.
Reference is made now in detail to a specific embodiment of the present invention, which illustrates the best mode presently contemplated by the inventor(s) for practicing the invention. Alternative embodiments are also briefly described as applicable.
As shown in FIG. 2, a plurality of resident applications are provided in the CPU and memory of the hand held computing apparatus 100. Each application is accessed by tapping its touch sensitive icon 1-12 with the tip of the stylus 104. Some routines are merely apparatus utilities such as "Home screen"/icon 1 which boots up a main program manager; "Online help"/icon 2 which boots up help screens during applications; "Calculator"/icon 9; "Graffiti"/icon 11 which is used to preset handwriting recognition patterns; and "Rotates the screen"/icon 12 which changes the data entry screen orientation ninety degrees as shown in FIG. 1C. Note that the keypad 102 (FIGS. 1A and 1B) also includes dedicated keys for each of the built-in applications for one-touch keypad booting rather than stylus "tap and activate" entries. The non-utility program applications, icons 2-8, are interactive; that is, data can be transferred between the applications.
An application specific to the disclosure of the present invention is stored in fixed memory, such as a read only memory (ROM) integrated circuit device (memory devices such as integrated circuits, flash memory, hard disks, floppy disks, laser compact disks, and the like, are well known in the art; further explanation is not necessary to an understanding of the present invention and the term "memory" should be understood to represent any such apparatus as would be justified as a design expedient for a particular implementation of the present invention without limitation as to the claims hereinafter). The specific application of interest is referred to hereinafter as "JOTTER", represented by a touch-sensitive icon 6 in the lower left corner of the display 108. When the tip of the stylus 104 is touched to the JOTTER icon 6, the JOTTER application of the present invention is booted up.
The JOTTER application is a focal point for other applications; in other words, it is a locus for entry of data in a rapid manner for later organization. That is, the user may be receiving information which in real-time he cannot rapidly organize in an intelligent manner. The present invention functions to allow data capture now and organization later. Namely, data entered using the JOTTER application as described hereinafter can have a logical destination application, e.g., time and date data for an Appointment Book application, icon 2, to which the data can be moved at a convenient time.
The functional operation of the JOTTER application is to allow the quick capture of information by keypad 102 entries or in "ink"--that is, by use of the stylus 104 in a handwriting recognition mode on the display 108. The CPU senses the position of the tip of the stylus on the screen 109 and its positional movement across the screen 109 and generates a corresponding image on the screen, creating the illusion that the stylus 104 is a "pen" drawing the image directly on the screen 109.
The application is started by touching a dedicated function key 103 or the tip of the stylus 104 to the JOTTER icon 6 in the display side perimeter 201. The JOTTER application in accordance with the present invention is a multi-page (e.g., 20 pages) electronic notepad. Each page can contain alphanumeric text, graphical images, or both. Information entered into the JOTTER application can be saved and later transferred to any other application that can accept a note or table entry, as will be explained in detail hereinafter. The JOTTER application methodology allows multi-page, quick capture of information using the stylus 104 or the keypad 102. After capturing the information, the user transfers it to another application or discards those notes that are no longer required. Information is captured in a first-in, last-out "stack" of virtual pages, where page one (1) is at the bottom of the stack and the most recently entered data page, e.g., page eleven (11), is at the top of the stack. In the commercial embodiment of the HP OMNIGO computer, twenty (20) pages are permitted.
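The page stack described above can be sketched in a few lines. This is an illustrative model only; the class, attribute, and method names are assumptions for the sketch and do not appear in the patent.

```python
# Minimal sketch of the JOTTER page model: a fixed capacity of 20 pages
# (as in the commercial HP OmniGo embodiment), where pages[0] is page 1,
# the oldest entry, and the most recent page sits on top of the stack.
MAX_PAGES = 20

class JotterPad:
    def __init__(self):
        self.pages = []          # pages[0] is page 1, the bottom of the stack

    def new_page(self, content):
        """Capture new information on top of the stack, if room remains."""
        if len(self.pages) >= MAX_PAGES:
            raise RuntimeError("all pages used; NEW is disabled")
        self.pages.append(content)
        return len(self.pages)   # 1-based page number of the new page

pad = JotterPad()
pad.new_page("MAP TO CLIENT 619 555 1234")
pad.new_page("to-do list for client visit")
```

Keeping page 1 at index 0 makes the later "shift left" recollation a simple element deletion.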
The operation in accordance with the present invention will now be described in conjunction with continual reference to FIGS. 3 and 4A-F. To repeat, the application is started by touching the tip of the stylus 104 to the JOTTER icon 6 in the display side perimeter 201 or tapping a dedicated JOTTER key. This will bring up the application, represented by step 401 in FIG. 4A.
The housing 106 portion that contains the display 108 is labeled, F1, F2, F3, F4, and F5, beneath the screen 109 as shown in FIG. 3 only. The JOTTER application provides virtual function keys 301 (NEW), 302 (FIRST), 303 (LAST), 304 (DISCARD), and 305 (STICK) across the bottom of the screen 109 in respective proximity to the labels F1 through F5 for use with the stylus 104. A set of actual function keys 103 are provided as part of the keypad 102. The redundant virtual function keys 301-305 permit stylus access rather than requiring a switch to the keypad 102 which might be awkward, for example, if the apparatus 100 is being used as shown in FIG. 1C.
An additional virtual key 306 provides access to a pop-up application menu of editing and printing task commands, providing use of such well known screen editing actions as CUT, COPY, PASTE, DELETE, PRINT SCREEN, or the like.
Other function keypad keys 103 are provided and also duplicated with virtual keys. Across the top of the screen 109 is a title bar showing the active application name, JOTTER, its current page number (PAGE #), a pen mode select virtual key 307 which pops up a menu of stylus 104 functions (such as WRITE TEXT, DRAW INK, ERASE INK, SELECT INK, or the like--where "SELECT" means a boundary can be drawn around some entry on the screen in order to make it the object of a CUT, COPY, DELETE, or the like, operation from the screen editing utility) as would be known to a person skilled in the art and about which further detail is not necessary to the understanding of the present invention. The pen mode key 307 icon will change to an image that reflects the status of the pen mode currently selected by the user. Generally, the stylus 104 can only be in one mode at a time; for instance, if the stylus 104 is in a WRITE mode, a handwriting recognition mode is activated and text can be written and recognized but other modes, ERASE, DRAW, SELECT, are disabled.
Two horizontal page change virtual keys 308, 309 are provided to allow the user to go to the previous or the next JOTTER page; JOTTER employs "wrapping" and, thus, pressing the next page key 309 while on the last used page moves to the first used page. To the left of the title bar is a virtual key 310 which closes the JOTTER application.
The first time that the application is started, it will bring up a blank page if one is available; if not, the last page, e.g., page 20, will be displayed. Thus, in step 403, a data file is opened, that is, a section of memory is initialized and designated for displaying data present in it, if any, and for storing new data entered; the apparatus initializes and designates a handwriting recognition mode of operation; and a text cursor 312 appears at the beginning of the first text line. If some, but not all, pages have been used, the application will always bring up the page that was last used, with the stylus input mode activated, and with the text cursor position in the same position as when the last input was entered.
In the next step 405, the application sets a flag to record user modifications to a current page or when a stuck page (explained in detail hereinafter) is newly created or deleted by a destination non-utility application program, e.g., a calendar-type application. As would be well known in the art of integrated circuit data manipulation, this task can be implemented in a number of ways; most commonly, input/output registers, accumulators, or buffer circuitry associated with the display 108 act as a pipeline between a random access memory and the keypad 102 or stylus 104 actions.

A simple user request input is the use of the page change virtual keys 308, 309. Left and right facing arrow icons are provided to step through used pages; by "used" pages is meant pages in which data is presently held. A displayed page is considered "used" when any mark, "ink," albeit even a single dot or line, exists (entries of tabs or spaces alone are not printable and therefore a page containing only tab or space characters is considered "unused"). For example, assume there are three used pages. Tap the left page change virtual key 308 and the application will go from displaying the current page to the next previous used page. In other words, if the current page is page 2, tapping the left page change virtual key 308 takes the application to used page 1, which is then displayed on the screen 109. If the current page is page 1, tapping the left page change virtual key 308 will take the application to the last used page. Similarly, tapping the right page change virtual key 309 will take the application to the next sequentially used page. In other words, if the current page is page 2, tapping the right page change virtual key 309 takes the application to used page 3. Likewise, if the current page is the last page, page 20 in the exemplary embodiment, tapping the right page change virtual key 309 will take the application to used page 1.
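The "used" versus "unused" distinction above (any mark counts, but tabs and spaces alone do not) reduces to a one-line test. The function name is an illustrative assumption:

```python
# A page is "used" only when it holds printable content; a page containing
# nothing but tab or space characters is still considered "unused".
def is_used(page_text):
    return any(ch not in " \t" for ch in page_text)
```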
Note that if only pages 1, 2 and 3 are used, and the user, being on used page 3, taps the right page change virtual key 309, the application wraps to page 1 because it is the next used page. Again, as previously noted, dedicated keys 103 for page changing, e.g., labeled PREV, NEXT, or the like, are provided as part of the keypad 102 for users who prefer keypad use over stylus use at any particular time.
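The wrapping page-change behavior just described can be sketched as a pair of helpers; the function names are hypothetical, not from the patent:

```python
# Wrapping navigation over used pages numbered 1..n_used: the next-page
# key 309 on the last used page wraps to page 1, and the previous-page
# key 308 on page 1 wraps to the last used page.
def next_page(current, n_used):
    return 1 if current == n_used else current + 1

def prev_page(current, n_used):
    return n_used if current == 1 else current - 1
```

With three used pages, as in the example above, stepping right from page 3 yields page 1, and stepping left from page 1 yields page 3.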
In order to omit the need for scrolling through pages sequentially, the application will recollate used pages by "shifting left" as needed. That is, if pages numbered 1, 2, 3 and 4 are used, and page 3 is discarded (as will be explained hereinafter), the data on used page 4 is automatically moved to page 3 and page 4 becomes available for reuse. In this manner, used pages are always contiguous; that is, the present invention does not require the user to step, or scroll, through blank pages. This may also be thought of as a stack of notes where if one is pulled from the stack, those below rise toward the top, viz., toward used page number 1.
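The "shift left" recollation amounts to deleting an element from an ordered sequence, with higher-numbered pages falling into the gap automatically. A minimal sketch, with an illustrative function name:

```python
# Discarding a used page: every higher-numbered page shifts down by one
# so that used pages always remain contiguous, as in pulling one note
# from a stack and letting the ones below rise toward the top.
def discard(pages, page_number):
    """Remove the 1-based page_number; later pages shift down by one."""
    del pages[page_number - 1]
    return pages

pages = ["page 1", "page 2", "page 3", "page 4"]
discard(pages, 3)   # the data on old page 4 now occupies page 3
```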
Returning to the flow chart of FIG. 4A, assume that the apparatus has been turned on, the JOTTER application selected, step 401, that the most recently used page has appeared on the screen, step 403, and all registers/buffers are set, step 405. The application enters a wait state 407 for user input. In general, except when executing another function, the application returns to the wait state 407.
Other than searching for a particular used page as just described, the logical next use is data input, step 409. Data can be input using the keypad 102 as shown in FIG. 1B, namely by typing alphanumeric characters which will appear sequentially as the cursor 312 moves across the screen as with any typical computing keyboard entry system. Alternatively, data can be input using the stylus 104. In the example shown on screen 109 in FIG. 3, the user records a map and phone number of a client. Initialization has by default put the stylus 104 in the handwriting recognition mode, WRITE TEXT. Therefore, the user can scrawl "MAP TO CLIENT 619 555 1234" across the screen as shown. Next, wishing to draw the map, the user taps the mode virtual key 307; stylus mode selections appear on screen as a pop-up menu; and the user taps the icon or text to enable a graphics mode, e.g., a DRAW INK selection or the like. ERASE INK, a non-graphics mode, or SELECT INK, a momentary graphics mode operation to enclose "INK" and recognize it for an editing operation, are similarly used. The pop-up menu vanishes and the stylus mode virtual key 307 changes its icon to reflect the new mode. The user then draws lines to represent streets and intersections. Wishing to label the streets and intersections, the user again taps the stylus mode virtual key 307 and changes back to the handwriting recognition mode by tapping the appropriate icon or its help text on the same pop-up menu, e.g., a " |! WRITE TEXT" menu selection or the like. Again, the pop-up menu disappears and the stylus mode virtual key 307 changes to the icon |! to reflect the handwriting recognition mode. The user then prints the labels shown in FIG. 3, "16399--X--WEST BERNARDO DR--I 15--and BERNARDO CENTER EXIT" with the stylus 104.
To continue the exemplary operation in accordance with the present invention, assume that the user has finished entering data for page 1 and stops; the application returns to the wait state 407. Assume the user next wishes to start a new page with a list of things to do for the client prior to following the map to visit the client. Using either the dedicated keypad F1 key or virtual function F1 key 301, NEW, with the stylus 104, as represented in step 411, a blank page is brought up on the screen 109 in accordance with the steps of FIG. 4B. More data can be entered accordingly.
Note that the user could select another operation 413, 415, 417, 419, with the other dedicated function keys F2-F5, virtual keys 302-305, respectively. Details as to those operations follow hereinafter. Having selected function key F1/301, NEW, step 411, the application takes the step 421 of automatically saving the current text/drawing data to a page file in random access memory, where N is set as the total number of used pages; thus, if the current used page is page 1, N is set to 1, step 423. The entered text/drawing data is cleared from the screen 109 (annotated in the flow chart as "UI" for "user interface") and the title bar is changed to "JOTTER (PAGE #N+1)" for the new page in which data can be entered, step 425. For example, if the current used page is PAGE 3, the NEW page will be PAGE 4. Since the screen is blank at this point, the DISCARD and STICK function keys F4/304 and F5/305 are disabled, step 427. The application again returns to the wait state 407. Selecting the F1/301, NEW, key has no effect if the current page does not contain any data, step 429. If all pages have been used, step 431, including the current page, the F1/301, NEW, key is disabled, step 433. As is common to graphical user interface ("GUI") techniques, changing the color or shading of the virtual keys from a standard operational shading or color indicates disablement.
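The NEW-key routine just walked through (steps 421-433) can be sketched as follows. The function models the saved pages as a list and the current page as a screen buffer; the names and the return convention are assumptions for illustration, not the patent's implementation:

```python
# Hedged sketch of the NEW routine of FIG. 4B: NEW has no effect on an
# empty page (step 429), is disabled when all pages are used (steps
# 431/433), and otherwise saves the current page and advances the title
# bar to "JOTTER (PAGE N+1)" (steps 421-425).
def on_new_key(pages, current_text, max_pages=20):
    if not current_text.strip():
        return None                # step 429: empty current page, no effect
    if len(pages) >= max_pages:
        return None                # steps 431/433: all pages used, key disabled
    pages.append(current_text)     # step 421: save current text/drawing data
    return len(pages) + 1          # step 425: page number for the fresh page
```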
A NEW page becomes a used page when it contains some text or drawing. Used pages are numbered sequentially, with PAGE 1 being the oldest page. Editing or modifying data on a used page (described in detail hereinafter) does not alter its page number.
At any time during operation, the user can use the F2/302, FIRST, or F3/303, LAST, function keys. Tapping these keys takes the application to the first or last page that contains data. The F2/302, FIRST, function key routine, step 413, is shown in FIG. 4C. If the current page is PAGE 1, step 435, the F2/302, FIRST, key is disabled, step 437. Assuming the current used page is not PAGE 1, as with the F1/301, NEW, function key, the first step 421' is to save the text/drawing data to a page file in memory. In the next step 439, the current page number is set to PAGE 1, i.e., PAGE 1=N-(N-1), where "N" is the current page number. Data is then loaded from memory into the screen 109 buffers and displayed, with the title bar being updated to show "JOTTER (PAGE 1)", step 441. Function keys F4/304 and F5/305 are enabled for use, step 443. Again, the application returns to the wait state 407.
Use of the F3/303, LAST, function key, step 415 as shown in FIG. 4D, is very similar to the F2/302, FIRST, function key routine. Note that the F3/303 key is disabled under two conditions: (1) there is only one used page, step 445, or (2) the current page is the last used page, i.e., the highest in the stack of used pages, step 447. Step one, 421", of the LAST routine is to save the current used page to a file in memory. To jump to the last used page, the called up current page will be page N-1, step 449. Data will be transferred from memory for page N-1 to the screen, step 451. The F4/304, DISCARD, and F5/305, STICK, keys are enabled and, since the application is now at the last used page, F3/303, LAST, is disabled, step 453. The wait state 407 is again entered.
Whenever a used page is on the screen 109, the F4/304, DISCARD, and F5/305, STICK, keys are enabled, see steps 417 and 419. FIG. 4E shows the DISCARD routine. This routine is used to delete the entire current used page. Tapping the F4/304, DISCARD, key pops up a dialog box, step 455, which, in the typical computer mannerism, gives the user the chance to confirm that the current page is to be discarded, step 457, or to cancel the request. If the user changes his mind, he cancels the request and the application returns to the wait state 407; otherwise, the DISCARD command is reaffirmed. Once reaffirmed, the current used page is removed from memory and pages with a number higher than the current page decrement by one; viz., if N=7, and the current used page is number 3, being discarded, used page 7 becomes the new used page 6, used page 6 becomes the new used page 5, used page 5 becomes the new used page 4, and used page 4 becomes the current used page 3 which is transferred from memory to the screen 109--step 459. If the only used page is PAGE 1 and it is discarded, a NEW page="JOTTER (PAGE 1)" is presented on the screen 109. Again, the application goes to the wait state 407.
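The DISCARD routine of FIG. 4E can be sketched with a confirmation callback standing in for the dialog box of step 455; the names and the (pages, displayed page) return pair are illustrative assumptions:

```python
# Sketch of the DISCARD routine: a confirmation guards the delete
# (step 457), higher-numbered pages decrement by one (step 459), and
# discarding the only used PAGE 1 presents a fresh blank "JOTTER (PAGE 1)".
def on_discard_key(pages, current, confirm):
    if not confirm():                       # user canceled; back to wait state
        return pages, current
    del pages[current - 1]                  # step 459: remove current used page
    if not pages:                           # only PAGE 1 existed: present a
        return pages, 1                     # new blank PAGE 1
    return pages, min(current, len(pages))  # display the recollated page

pages = ["p1", "p2", "p3", "p4", "p5", "p6", "p7"]
pages, shown = on_discard_key(pages, 3, confirm=lambda: True)
# old page 4 now occupies page number 3, matching the N=7 example above
```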
The F5/305, STICK, function key lies at the heart of the apparatus functionality. It provides a means of moving JOTTER entered data to another program application as shown in FIG. 4F. Each non-utility application (e.g., see icons 2-5, 7 and 8) is provided with a "Note Field" or "Table Fields" into which data can be transferred. In other words, each non-utility application is provided with a data retention capability, e.g., a Note Field in Database 5 or table items in Spreadsheet 8 (see FIG. 2), to which information can be directly transferred to or from the JOTTER application by a simple user action, namely, a "drag-and-drop" feature as it is known in the art. These types of applications support the JOTTER application in that pages can be moved directly from a JOTTER page to another application's particular Note Field. In other words, the present invention provides the electronic equivalent of writing something on a small pad, tearing off the written page, and sticking the page somewhere else, e.g., in a physical file folder where the note becomes more useful.
In general, the user writes to a JOTTER application page, which is saved until it is discarded, as previously described, or until it is "stuck" into another application.
When the user elects to move the current used page data to another application's Note Field, the F5/305, STICK, function key is tapped, step 419. If there is no data on the current page, the F5/305 function key is disabled, step 461. The current used page data is sent to a clipboard file, step 463. The STICK function works on the entire current used page; thus it is different from other data editing command functions such as CUT or COPY. Next, all applications having a Note Field are notified that a page of data has been selected for a STICK operation, step 465. Shrink line animation--simple animation using rectangles as an outline--shows the full-screen JOTTER application shrinking down to a simple icon, such as a stick, appearing on the screen, in the preferred embodiment in the title bar, e.g., to the left of the stylus mode icon 307, step 467, shown in phantom line in FIG. 3 as element 314. The JOTTER application goes off line, step 469, deleting the page now represented by the remaining shrink line icon and adjusting the page numbers of remaining used pages in accordance with the recollating of the DISCARD routine.
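The STICK routine of FIG. 4F can be sketched as follows. The clipboard dictionary and observer list are assumptions chosen to illustrate the notify-all-applications step; none of these names come from the patent:

```python
# Sketch of the STICK routine: the entire current used page goes to a
# clipboard file (step 463), every Note Field application is notified
# (step 465), and JOTTER deletes the page and recollates the rest
# (step 469), as in the DISCARD routine.
def on_stick_key(pages, current, clipboard, observers):
    clipboard["stick_page"] = pages[current - 1]   # step 463
    for app in observers:                          # step 465: notify apps
        app["stick_available"] = True
    del pages[current - 1]                         # step 469: page deleted,
    return pages                                   # remaining pages recollate
```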
A non-utility application booting up displays STICK animation if that application has the capability of receiving a page in a Note Field or as a table item; if it does not, the shrink line icon 314 for the STICK page is disabled, that is, the STICK page so represented cannot be dragged and dropped into that application, but the STICK page is not lost. Note that the JOTTER application does not allow the STICK page to be dropped down onto either a used or NEW page; that is, in the JOTTER application itself, the shrink line icon 314 remains disabled if the JOTTER application is rebooted after STICKing a page up in the title bar as a shrink line icon 314.
The STICK page shrink line icon 314 remains in the title bar until it is dragged and dropped by the user as shown in FIG. 5 into a compatible other application. Drag-and-drop operations are well known in the computer GUI state of the art. As an example of such an operation, assume an application having a Note Field or table, such as the Appointment 2 application (FIG. 2), is booted by using the touch screen icon 2 with the stylus 104 or a dedicated key 103, step 501. The STICK page shrink line icon 314 in the title bar is enabled, step 503. After finding the date for the client meeting, the user places the tip of the stylus 104 on the shrink line icon 314 and holds it there until a signal, such as a beep, indicates that the STICK page has been captured by the stylus 104; the user then slides the tip across the screen 109 to a valid drop area--designated by visual feedback such as changing the color of a table location where the page may be dropped in the opened application (losing contact before reaching a valid drop area cancels the drag-and-drop operation); the stylus tip is then lifted from the drop area and the STICK page is "dropped" into the application accordingly. The shrink line icon 314 then disappears from the title bar, step 505. The user now has a map and telephone number in his Appointment schedule. All other applications which were notified that a STICK page was available (see step 465) are now notified that the STICK page is gone, step 507; thus, if booted next, another application with STICK page receiving capability does not activate the STICK page animation step 503 since there is no longer a STICK page available. That is, animation only occurs once, yet each non-utility application needs to disable its STICK page icon. After a successful drag-and-drop operation, the STICK page shrink line icon 314 is removed from the title bar.
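Completing the drag-and-drop of FIG. 5 can be sketched as the mirror image of the STICK routine: the page leaves the clipboard, lands in the target Note Field, and step 507 tells every previously notified application that the STICK page is gone. The structures and names below are illustrative assumptions:

```python
# Sketch of the drop step: the STICK page is appended to the end of the
# target Note Field, and all observers are told the page is no longer
# available (step 507) so their STICK icons are disabled.
def drop_stick_page(clipboard, note_field, observers):
    page = clipboard.pop("stick_page", None)
    if page is None:
        return note_field            # no STICK page available; nothing to drop
    note_field.append(page)          # append to the end of the outlined note
    for app in observers:            # step 507: disable STICK icons everywhere
        app["stick_available"] = False
    return note_field
```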
In a note-formatted screen or note field of an application, dragging the STICK page shrink line icon 314 into the data area causes an outline box to surround the entire note data area (viz., between the title bar and the virtual function keys), providing visual feedback to the user. Dropping the icon 314 by lifting the stylus anywhere in this area appends the JOTTER STICK page data to the end of the outlined note and causes the note to update its display to show the appended data.
In a table screen, dragging the STICK page shrink line icon 314 turns off any table selection highlighting of a currently selected table item and causes an outline box to appear around the item which the pen is over; this visual feedback outline box is the same width and height as the selection item and moves item by item as the icon 314 is dragged across the table with the stylus. Dragging the icon 314 into the top or bottom edge of the table scrolls the table item by item, and the outline box remains at its last position prior to scrolling. Dropping onto a table item appends the JOTTER data to the item's note area. Dragging outside the table's scroll areas removes the outline box visual feedback around table items, and dropping the icon there, viz. lifting the stylus, cancels the drag-and-drop operation. On completion of a drag-and-drop operation, the selection highlighting is restored on the dropped item, or at the last position of the outline box if the drop was canceled.
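The table feedback rules above (outline box tracking the pen, and highlight restoration after drop or cancel) can be sketched as two small helpers. These are hypothetical illustrations under assumed names, not the patent's implementation:

```python
def table_drag_outline(pen_row, num_items):
    """Return the row index the outline box should surround, or None.

    While icon 314 is dragged, the selection highlight is off and the
    outline box tracks the item under the pen; dragging outside the
    table (pen_row is None or out of range) removes the outline.
    """
    if pen_row is None or not (0 <= pen_row < num_items):
        return None
    return pen_row


def restored_highlight(drop_row, last_outline_row, canceled):
    """On completion, highlighting is restored on the dropped item, or
    at the outline box's last position if the drop was canceled."""
    return last_outline_row if canceled else drop_row


print(table_drag_outline(2, 5))          # outline around item 2
print(restored_highlight(3, 1, False))   # successful drop: highlight item 3
print(restored_highlight(3, 1, True))    # canceled: highlight back at item 1
```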
In order to support the differing function keys properly, two independent routines are provided, as shown in FIGS. 6 and 7. Referring to FIG. 6, the DISCARD function key F4/304 is disabled in the NEW page mode, step 601.
Referring to FIG. 7, in conjunction with the STICK page animation described above, once a STICK page shrink line icon operation has been instituted, all available applications which can receive the STICK page JOTTER data and which receive the notification (FIG. 5, step 507), step 701, enable the STICK function key F5/305. When such a notification is received by another application, the shrink line animation and enabling are performed. In the JOTTER application, the icon 314 is enabled or disabled depending upon whether a STICK page exists; no overwrite operation is allowed.
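The notification fan-out of FIG. 5/FIG. 7 can be sketched as a broadcast that toggles the STICK function key F5/305 in every capable application. Again, this is an assumed illustration; the class and function names are not from the patent:

```python
class App:
    """Illustrative receiver of STICK page availability notifications."""

    def __init__(self, name, can_receive):
        self.name = name
        self.can_receive = can_receive  # has a Note Field or table
        self.f5_enabled = False         # STICK function key F5/305


def broadcast_stick_available(apps, available):
    # Corresponds to steps 465 (page available) and 507 (page gone):
    # every application capable of receiving the STICK page JOTTER data
    # enables or disables its STICK function key accordingly.
    for app in apps:
        if app.can_receive:
            app.f5_enabled = available


apps = [App("Appointment", True), App("Calculator", False)]
broadcast_stick_available(apps, True)   # page stuck up: F5 enabled where capable
broadcast_stick_available(apps, False)  # page dropped: F5 disabled everywhere
```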
Thus, the concept of the present invention caters to a user model of entering and organizing note-like information throughout an integrated system in a way that is intuitive and natural to the user.
The foregoing description of the preferred embodiment of the present invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in this art. Similarly, any process steps described might be interchangeable with other steps in order to achieve the same result. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to understand the invention for various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5398310 *||Apr 13, 1992||Mar 14, 1995||Apple Computer, Incorporated||Pointing gesture based computer note pad paging and scrolling interface|
|US5404442 *||Nov 30, 1992||Apr 4, 1995||Apple Computer, Inc.||Visible clipboard for graphical computer environments|
|US5446882 *||Oct 2, 1992||Aug 29, 1995||Apple Computer, Inc.||Interface for a computerized database having card and list views|
|US5457476 *||May 27, 1993||Oct 10, 1995||Apple Computer, Inc.||Method for controlling a computerized organizer|
|US5539427 *||Jan 27, 1994||Jul 23, 1996||Compaq Computer Corporation||Graphic indexing system|
|US5544358 *||May 26, 1995||Aug 6, 1996||Apple Computer, Inc.||Interface for a computerized database having card and list views|
|US5559942 *||May 10, 1993||Sep 24, 1996||Apple Computer, Inc.||Method and apparatus for providing a note for an application program|
|US5563996 *||Sep 24, 1993||Oct 8, 1996||Apple Computer, Inc.||Computer note pad including gesture based note division tools and method|
|US5581681 *||Jun 7, 1995||Dec 3, 1996||Apple Computer, Inc.||Pointing gesture based computer note pad paging and scrolling interface|
|US5590256 *||Dec 14, 1994||Dec 31, 1996||Apple Computer, Inc.||Method for manipulating notes on a computer display|
|US5625377 *||May 26, 1995||Apr 29, 1997||Apple Computer, Inc.||Method for controlling a computerized organizer|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6144888 *||Nov 10, 1997||Nov 7, 2000||Maya Design Group||Modular system and architecture for device control|
|US6300933 *||Sep 5, 1996||Oct 9, 2001||Canon Kabushiki Kaisha||Electronic apparatus and a control method thereof|
|US6310634||Oct 13, 2000||Oct 30, 2001||Starfish Software, Inc.||User interface methodology supporting light data entry for microprocessor device having limited user input|
|US6504956 *||Oct 5, 1999||Jan 7, 2003||Ecrio Inc.||Method and apparatus for digitally capturing handwritten notes|
|US6505087 *||Jun 2, 2000||Jan 7, 2003||Maya Design Group||Modular system and architecture for device control|
|US7102615||Jul 27, 2002||Sep 5, 2006||Sony Computer Entertainment Inc.||Man-machine interface using a deformable device|
|US7158675 *||Jun 28, 2002||Jan 2, 2007||Microsoft Corporation||Interfacing with ink|
|US7167585||Dec 16, 2005||Jan 23, 2007||Microsoft Corporation||Interfacing with ink|
|US7171498 *||Jan 9, 2003||Jan 30, 2007||Sony Computer Entertainment America Inc.||Alphanumeric keyboard input system using a game controller|
|US7233321||Dec 15, 1998||Jun 19, 2007||Intel Corporation||Pointing device with integrated audio input|
|US7272796||Dec 17, 2003||Sep 18, 2007||Canon Kabushiki Kaisha||Data processing apparatus, data processing method of data processing apparatus, and computer-readable memory medium storing program therein|
|US7370281||Jun 25, 2002||May 6, 2008||Bea Systems, Inc.||System and method for smart drag-and-drop functionality|
|US7567239 *||Jun 26, 2003||Jul 28, 2009||Motorola, Inc.||Method and system for message and note composition on small screen devices|
|US7623115||Jan 16, 2004||Nov 24, 2009||Sony Computer Entertainment Inc.||Method and apparatus for light input device|
|US7627139||May 4, 2006||Dec 1, 2009||Sony Computer Entertainment Inc.||Computer image and audio processing of intensity and input devices for interfacing with a computer program|
|US7646372||Dec 12, 2005||Jan 12, 2010||Sony Computer Entertainment Inc.||Methods and systems for enabling direction detection when interfacing with a computer program|
|US7656397||May 8, 2007||Feb 2, 2010||Intel Corporation||Pointing device with integrated audio input and associated methods|
|US7663689||Jan 16, 2004||Feb 16, 2010||Sony Computer Entertainment Inc.||Method and apparatus for optimizing capture device settings through depth information|
|US7715630||Dec 16, 2005||May 11, 2010||Microsoft Corporation||Interfacing with ink|
|US7725338 *||Sep 15, 2000||May 25, 2010||Palmsource Inc.||Time based profile management on palmtop computer|
|US7751623 *||Feb 17, 2004||Jul 6, 2010||Microsoft Corporation||Writing guide for a free-form document editor|
|US7760248||May 4, 2006||Jul 20, 2010||Sony Computer Entertainment Inc.||Selective sound source listening in conjunction with computer interactive processing|
|US7804494||Jan 4, 2010||Sep 28, 2010||Intel Corporation||Pointing device with integrated audio input and associated methods|
|US7827493||Aug 15, 2007||Nov 2, 2010||Canon Kabushiki Kaisha||Data processing apparatus, data processing method of data processing apparatus, and computer-readable memory medium storing program therein|
|US7874917||Dec 12, 2005||Jan 25, 2011||Sony Computer Entertainment Inc.||Methods and systems for enabling depth and direction detection when interfacing with a computer program|
|US7883415||Sep 15, 2003||Feb 8, 2011||Sony Computer Entertainment Inc.||Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion|
|US7895537 *||Dec 29, 2003||Feb 22, 2011||International Business Machines Corporation||Method and apparatus for setting attributes and initiating actions through gestures|
|US7925987||Jun 28, 2002||Apr 12, 2011||Microsoft Corporation||Entry and editing of electronic ink|
|US8031845||Jan 15, 2009||Oct 4, 2011||International Business Machines Corporation||System for viewing information underlying lists and other contexts|
|US8035629||Dec 1, 2006||Oct 11, 2011||Sony Computer Entertainment Inc.||Hand-held computer interactive device|
|US8072470||May 29, 2003||Dec 6, 2011||Sony Computer Entertainment Inc.||System and method for providing a real-time three-dimensional interactive environment|
|US8142288||May 8, 2009||Mar 27, 2012||Sony Computer Entertainment America Llc||Base station movement detection and compensation|
|US8166388||Jun 28, 2002||Apr 24, 2012||Microsoft Corporation||Overlaying electronic ink|
|US8188968||Dec 21, 2007||May 29, 2012||Sony Computer Entertainment Inc.||Methods for interfacing with a program using a light input device|
|US8193908 *||Jul 16, 2003||Jun 5, 2012||At&T Intellectual Property I, Lp||Pager with a touch-sensitive display screen and method for transmitting a message therefrom|
|US8251820||Jun 27, 2011||Aug 28, 2012||Sony Computer Entertainment Inc.||Methods and systems for enabling depth and direction detection when interfacing with a computer program|
|US8276101 *||Sep 30, 2011||Sep 25, 2012||Google Inc.||Touch gestures for text-entry operations|
|US8287373||Apr 17, 2009||Oct 16, 2012||Sony Computer Entertainment Inc.||Control device for communicating visual information|
|US8303411||Oct 12, 2010||Nov 6, 2012||Sony Computer Entertainment Inc.||Methods and systems for enabling depth and direction detection when interfacing with a computer program|
|US8310656||Sep 28, 2006||Nov 13, 2012||Sony Computer Entertainment America Llc||Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen|
|US8313380||May 6, 2006||Nov 20, 2012||Sony Computer Entertainment America Llc||Scheme for translating movements of a hand-held controller into inputs for a system|
|US8323106||Jun 24, 2008||Dec 4, 2012||Sony Computer Entertainment America Llc||Determination of controller three-dimensional location using image analysis and ultrasonic communication|
|US8342963||Apr 10, 2009||Jan 1, 2013||Sony Computer Entertainment America Inc.||Methods and systems for enabling control of artificial intelligence game characters|
|US8368753||Mar 17, 2008||Feb 5, 2013||Sony Computer Entertainment America Llc||Controller with an integrated depth camera|
|US8393964||May 8, 2009||Mar 12, 2013||Sony Computer Entertainment America Llc||Base station for position location|
|US8527657||Mar 20, 2009||Sep 3, 2013||Sony Computer Entertainment America Llc||Methods and systems for dynamically adjusting update rates in multi-player network gaming|
|US8542907||Dec 15, 2008||Sep 24, 2013||Sony Computer Entertainment America Llc||Dynamic three-dimensional object mapping for user-defined control device|
|US8547401||Aug 19, 2004||Oct 1, 2013||Sony Computer Entertainment Inc.||Portable augmented reality device and method|
|US8570378||Oct 30, 2008||Oct 29, 2013||Sony Computer Entertainment Inc.||Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera|
|US8654082 *||Sep 21, 1998||Feb 18, 2014||Glenn Rolus Borgward||Digital book|
|US8686939||May 6, 2006||Apr 1, 2014||Sony Computer Entertainment Inc.||System, method, and apparatus for three-dimensional input control|
|US8758132||Aug 27, 2012||Jun 24, 2014||Sony Computer Entertainment Inc.|
|US8781151||Aug 16, 2007||Jul 15, 2014||Sony Computer Entertainment Inc.||Object detection using video input combined with tilt angle information|
|US8797260||May 6, 2006||Aug 5, 2014||Sony Computer Entertainment Inc.||Inertially trackable hand-held controller|
|US8840470||Feb 24, 2009||Sep 23, 2014||Sony Computer Entertainment America Llc||Methods for capturing depth data of a scene and applying computer actions|
|US8928577 *||Jan 29, 2001||Jan 6, 2015||Qualcomm Incorporated||Method and apparatus for single-step user generated notes on a personal digital assistant|
|US8961313||May 29, 2009||Feb 24, 2015||Sony Computer Entertainment America Llc||Multi-positional three-dimensional controller|
|US8976265||Oct 26, 2011||Mar 10, 2015||Sony Computer Entertainment Inc.||Apparatus for image and sound capture in a game environment|
|US8994675 *||Sep 30, 2010||Mar 31, 2015||Lg Electronics Inc.||Mobile terminal and information processing method thereof|
|US9021380 *||Oct 5, 2012||Apr 28, 2015||Google Inc.||Incremental multi-touch gesture recognition|
|US9043730||Sep 28, 2010||May 26, 2015||Canon Kabushiki Kaisha||Data processing apparatus, data processing method of data processing apparatus, and computer-readable memory medium storing program therein|
|US9081500||May 31, 2013||Jul 14, 2015||Google Inc.||Alternative hypothesis error correction for gesture typing|
|US9112988 *||Dec 1, 2008||Aug 18, 2015||Lg Electronics Inc.||Terminal and method of controlling the same|
|US9134906||Mar 4, 2014||Sep 15, 2015||Google Inc.||Incremental multi-word recognition|
|US9177387||Feb 11, 2003||Nov 3, 2015||Sony Computer Entertainment Inc.||Method and apparatus for real time motion capture|
|US9381424||Jan 11, 2011||Jul 5, 2016||Sony Interactive Entertainment America Llc||Scheme for translating movements of a hand-held controller into inputs for a system|
|US9393487||May 7, 2006||Jul 19, 2016||Sony Interactive Entertainment Inc.||Method for mapping movements of a hand-held controller to game commands|
|US9436378||Jun 10, 2015||Sep 6, 2016||Lg Electronics Inc.||Terminal and method of controlling the same|
|US9474968||May 6, 2006||Oct 25, 2016||Sony Interactive Entertainment America Llc||Method and system for applying gearing effects to visual tracking|
|US9542385||Sep 11, 2015||Jan 10, 2017||Google Inc.||Incremental multi-word recognition|
|US9552080||Jul 14, 2014||Jan 24, 2017||Google Inc.||Incremental feature-based gesture-keyboard decoding|
|US9573056||Apr 22, 2009||Feb 21, 2017||Sony Interactive Entertainment Inc.||Expandable control device via hardware attachment|
|US9678943||Sep 24, 2014||Jun 13, 2017||Google Inc.||Partial gesture text entry|
|US9682319||Jun 25, 2007||Jun 20, 2017||Sony Interactive Entertainment Inc.||Combiner method for altering game gearing|
|US9682320||Jul 31, 2014||Jun 20, 2017||Sony Interactive Entertainment Inc.||Inertially trackable hand-held controller|
|US9710453||Sep 4, 2014||Jul 18, 2017||Google Inc.||Multi-gesture text input prediction|
|US9798718||Sep 14, 2016||Oct 24, 2017||Google Inc.||Incremental multi-word recognition|
|US20020149630 *||Apr 15, 2002||Oct 17, 2002||Parascript Llc||Providing hand-written and hand-drawn electronic mail service|
|US20030160825 *||Jun 25, 2002||Aug 28, 2003||Roger Weber||System and method for smart drag-and-drop functionality|
|US20030214553 *||Jun 28, 2002||Nov 20, 2003||Microsoft Corporation||Ink regions in an overlay control|
|US20030215140 *||Jun 28, 2002||Nov 20, 2003||Microsoft Corporation||Interfacing with ink|
|US20030215142 *||Jun 28, 2002||Nov 20, 2003||Microsoft Corporation||Entry and editing of electronic ink|
|US20040125145 *||Dec 17, 2003||Jul 1, 2004||Canon Kabushiki Kaisha||Data processing apparatus, data processing method of data processing apparatus, and computer-readable memory medium storing program therein|
|US20040139254 *||Jan 9, 2003||Jul 15, 2004||Sony Computer Entertainment America Inc.||Alphanumeric keyboard input system using a game controller|
|US20040240739 *||May 30, 2003||Dec 2, 2004||Lu Chang||Pen gesture-based user interface|
|US20040263486 *||Jun 26, 2003||Dec 30, 2004||Giovanni Seni||Method and system for message and note composition on small screen devices|
|US20050073392 *||Jul 16, 2003||Apr 7, 2005||Walsh Patrick Jay||Pager with a touch-sensitive display screen and method for transmitting a message therefrom|
|US20050156948 *||Apr 3, 2003||Jul 21, 2005||Bernard Hunt||Electronic device including a display|
|US20050157204 *||Jan 16, 2004||Jul 21, 2005||Sony Computer Entertainment Inc.||Method and apparatus for optimizing capture device settings through depth information|
|US20050160372 *||Dec 29, 2003||Jul 21, 2005||Gruen Daniel M.||Method and apparatus for setting attributes and initiating actions through gestures|
|US20060093218 *||Dec 16, 2005||May 4, 2006||Microsoft Corporation||Interfacing with ink|
|US20060093219 *||Dec 16, 2005||May 4, 2006||Microsoft Corporation||Interfacing with ink|
|US20060139322 *||Feb 28, 2006||Jun 29, 2006||Sony Computer Entertainment America Inc.||Man-machine interface using a deformable device|
|US20060256097 *||May 13, 2005||Nov 16, 2006||Microsoft Corporation||Docking apparatus for a pen-based computer|
|US20060277571 *||May 4, 2006||Dec 7, 2006||Sony Computer Entertainment Inc.||Computer image and audio processing of intensity and input devices for interfacing with a computer program|
|US20070250511 *||Apr 21, 2006||Oct 25, 2007||Yahoo! Inc.||Method and system for entering search queries|
|US20070298882 *||Dec 12, 2005||Dec 27, 2007||Sony Computer Entertainment Inc.||Methods and systems for enabling direction detection when interfacing with a computer program|
|US20080022221 *||Aug 15, 2007||Jan 24, 2008||Canon Kabushiki Kaisha|
|US20080170049 *||May 8, 2007||Jul 17, 2008||Larson Jim A||Pointing device with integrated audio input and associated methods|
|US20090187855 *||Jan 15, 2009||Jul 23, 2009||International Business Machines Corporation||System for viewing information underlying lists and other contexts|
|US20100131880 *||Dec 1, 2008||May 27, 2010||Lg Electronics Inc.||Terminal and method of controlling the same|
|US20100235358 *||May 23, 2010||Sep 16, 2010||Palmsource, Inc.||Time based profile management on palmtop computer|
|US20110016404 *||Sep 28, 2010||Jan 20, 2011||Canon Kabushiki Kaisha|
|US20110060985 *||Sep 8, 2009||Mar 10, 2011||ABJK Newco, Inc.||System and Method for Collecting a Signature Using a Smart Device|
|US20120096388 *||Dec 23, 2011||Apr 19, 2012||Shinobu Usui||Electronic apparatus and method of initializing setting items thereof|
|US20120216141 *||Sep 30, 2011||Aug 23, 2012||Google Inc.||Touch gestures for text-entry operations|
|US20120326996 *||Sep 30, 2010||Dec 27, 2012||Cho Yongwon||Mobile terminal and information processing method thereof|
|US20140098023 *||Oct 5, 2012||Apr 10, 2014||Shumin Zhai||Incremental multi-touch gesture recognition|
|WO2003073197A2 *||Jan 24, 2003||Sep 4, 2003||Bea Systems, Inc.||System and method for smart drag-and-drop functionality|
|WO2003073197A3 *||Jan 24, 2003||Jun 3, 2004||Bea Systems Inc||System and method for smart drag-and-drop functionality|
|U.S. Classification||715/835, 715/863, 715/201|
|International Classification||G06F3/048, G06F1/16, G06F3/033|
|Cooperative Classification||G06F2200/1632, G06F1/1618, G06F3/04883|
|European Classification||G06F1/16P1F2, G06F3/0488G|
|Jul 25, 1996||AS||Assignment|
Owner name: HEWLETT-PACKARD COMPANY, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAN, JIN-MENG;KANG, BENG HONG;REEL/FRAME:008047/0686
Effective date: 19960517
|Jan 16, 2001||AS||Assignment|
Owner name: HEWLETT-PACKARD COMPANY, COLORADO
Free format text: MERGER;ASSIGNOR:HEWLETT-PACKARD COMPANY;REEL/FRAME:011523/0469
Effective date: 19980520
|Sep 23, 2002||FPAY||Fee payment|
Year of fee payment: 4
|Dec 29, 2006||FPAY||Fee payment|
Year of fee payment: 8
|Dec 29, 2010||FPAY||Fee payment|
Year of fee payment: 12
|Sep 22, 2011||AS||Assignment|
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD COMPANY;REEL/FRAME:026945/0699
Effective date: 20030131
|May 3, 2013||AS||Assignment|
Owner name: PALM, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:030341/0459
Effective date: 20130430
|Dec 18, 2013||AS||Assignment|
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:031837/0239
Effective date: 20131218
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:031837/0659
Effective date: 20131218
Owner name: PALM, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:031837/0544
Effective date: 20131218
|Jan 28, 2014||AS||Assignment|
Owner name: QUALCOMM INCORPORATED, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEWLETT-PACKARD COMPANY;HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;PALM, INC.;REEL/FRAME:032132/0001
Effective date: 20140123