|Publication number||US7184025 B2|
|Application number||US 10/159,851|
|Publication date||Feb 27, 2007|
|Filing date||May 31, 2002|
|Priority date||May 31, 2002|
|Also published as||US20030234797|
|Inventors||Lyndsay Williams, Andrew Blake|
|Original Assignee||Microsoft Corporation|
Altering a display on a viewing device based upon a user controlled orientation of the viewing device
US 7184025 B2
A method, apparatus, and article of manufacture for altering a displayed image presented to a user on a viewing device, using a user-controlled orientation of the viewing device to determine how the displayed image is to be presented. The viewing device includes a plurality of tilt sensors that are used to determine the orientation of the viewing device. As the user changes the orientation of the viewing device, the tilt sensors detect the change in the device orientation. These changes in orientation are used to alter the image being displayed upon the viewing device.
1. A method for altering a computer generated image of an object displayed upon a display device, the display device having a display device orientation measurement module including a tilt sensor for obtaining a measure of a spatial orientation of the display device in at least two dimensions, and a Holosim processing module for applying an oblique projection matrix to all visible points on the object to generate the computer generated image of the object, the method comprising:
obtaining a set of orientation measurements for the display device from the display device orientation measurement module;
pre-processing the orientation measurements using a microcontroller processing module before transmission to a hand-held processing module;
generating a transformation matrix using the set of orientation measurements for use in generating the computer generated image of an object; and
applying the transformation matrix to all visible points within the computer generated image of an object.
2. The method according to claim 1, wherein the display device orientation measurement module transmits the orientation measurements to the hand-held processing module over a serial communications link.
3. A computer program data product readable by a computing system and encoding a set of computer instructions implementing a method for altering a computer generated image of an object displayed upon a display device, the display device having a display device orientation measurement module for obtaining a measure of a spatial orientation of the display device, and a Holosim processing module for applying an oblique projection matrix to all visible points on the object to generate the computer generated image of the object, the method comprising:
obtaining a set of orientation measurements for the display device from the display device orientation measurement module;
pre-processing the orientation measurements using a microcontroller processing module before transmission to a hand-held processing module;
generating a transformation matrix using the set of orientation measurements for use in generating the computer generated image of an object;
applying the transformation matrix to all visible points within the computer generated image of an object.
4. The computer program data product according to claim 3, wherein the display device orientation measurement module comprises a tilt sensor for measuring the spatial orientation of the display device in at least two dimensions.
5. The computer program data product according to claim 3, wherein the display device orientation measurement module transmits the orientation measurements to the hand-held processing module over a serial communications link.
6. The computer program data product according to claim 3, wherein the computer program data product comprises a computer readable storage medium.
7. The computer program data product according to claim 3, wherein the computer program data product comprises a readable data stream encoded and superimposed upon a carrier wave for transmission between computing systems.
The invention relates generally to controlling a display of an electronic image viewing device, and more particularly to a system, method, and article of manufacture for altering a display on a viewing device based upon a user controlled orientation of the viewing device.
Computing systems are routinely used to display images of objects for a wide variety of purposes. Typically, these images are 2D images that present a static representation of an object. Many applications that use such images of objects find 2D static images less than desirable, as they do not present a complete representation of the object to the viewer. For example, a buyer of watches shopping over the Internet may wish to see the watch from different perspectives to see how the face of the watch appears when reading the displayed time, as well as to see how thick the watch is as it is worn on a wrist.
Image display systems have also been developed to allow a user to pan and scroll around an object to see the object from differing perspectives. Such systems typically provide a user with a flat, 2D image that provides a panoramic view of all sides of an object while allowing a user to see a portion of the image as if the user was rotating the object. Such systems are an improvement over the flat 2D image of an object; however, these images still do not provide a true perspective view of the object in a 3D context.
When viewing an item such as a watch, a user would like to see the object as if it were located within a specimen box. In such a system, the user may see different perspectives of the item by “changing the orientation of the box” to obtain a different view of the object within the box. This approach addresses the need to provide a 3D perspective of the item within the confines of a 2D window into the box, and thus addresses limitations existing in earlier image presentation systems.
The present invention relates to a method, apparatus, and article of manufacture for altering a display on a viewing device based upon a user-controlled orientation of the viewing device. A system in accordance with the principles of the present invention includes a system for altering a computer generated image of an object displayed upon a display device. The system includes a display device orientation measurement module for obtaining a measure of a spatial orientation of the display device and a hand-held processing module for generating the computer generated image of an object. The computer generated image of the object is generated using the measurements of the spatial orientation of the display device to determine a displayed orientation of the object.
Another aspect of the present invention is a computer implemented method, and corresponding computer program data product, for altering a computer generated image of an object displayed upon a display device where the display device has a display device orientation measurement module for obtaining a measure of a spatial orientation of the display device. The method obtains a set of orientation measurements for the display device from the display device orientation measurement module, generates a transformation matrix using the set of orientation measurements for use in generating the computer generated image of an object, and applies the transformation matrix to all visible points within the computer generated image of an object.
These and various other advantages and features of novelty which characterize the invention are pointed out with particularity in the claims annexed hereto and form a part hereof. However, for a better understanding of the invention, its advantages, and the objects obtained by its use, reference should be made to the drawings which form a further part hereof, and to accompanying descriptive matter, in which there are illustrated and described specific examples of an apparatus in accordance with the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a user of a hand-held computing device altering an image displayed upon the hand-held computing device according to one embodiment of the present invention.
FIG. 2 illustrates geometry for an object projected onto a screen of a hand-held computing device appearing to display a solid object within its physical dimensions according to one possible embodiment of the present invention.
FIG. 3 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions according to another embodiment of the present invention.
FIG. 4 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions at an altered orientation according to an example embodiment of the present invention.
FIG. 5 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions at an altered orientation and having a different user orientation relative to the computing device according to another example embodiment of the present invention.
FIG. 6 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions at an additional altered orientation and having a different user orientation relative to the computing device according to another example embodiment of the present invention.
FIG. 7 illustrates an exemplary computing system that may be used to support the various computing systems that are part of example embodiments of the present invention.
FIG. 8 illustrates a block diagram for an image manipulation and display processing system according to an embodiment of the present invention.
FIG. 9 illustrates an operational flow for an image manipulation and display processing system according to yet another example embodiment of the present invention.
The present invention relates to a system, method and article of manufacture for altering a display on a viewing device based upon a user controlled orientation of the viewing device.
FIG. 1 illustrates a user of a hand-held computing device altering an image displayed upon the hand-held computing device according to one embodiment of the present invention. A user holds a hand-held computer 100, such as a personal digital assistant (PDA) device like a Pocket PC computer or a pen-based Tablet PC computer, that displays a 3D image of an object 101. As the user changes the orientation of the hand-held computer 100, the orientation of the 3D image of the object 101 changes to provide an altered orientation of the object. The image of the object 101 appears to provide the user with the view of the object as if the hand-held computer acted as a specimen box containing a physical representation of the object 101. In such a specimen box, the view of an item changes as the orientation of the box is varied.
The orientation of the hand-held computer 100 is provided using one or more accelerometers, or tilt sensors, that are mounted onto or within the hand-held computer 100. As the hand-held computer is moved relative to the Earth's gravitational field, the change in orientation may be detected. These sensor inputs may be repetitively sampled to allow the 3D image of the object to be continuously updated. Examples of tilt sensors that may be used in such a system include the ADXL202 device from ANALOG DEVICES of Norwood, Mass. This device provides a complete 2-axis accelerometer with a measurement range of ±2 g. The ADXL202 can measure both dynamic acceleration (e.g., vibration) and static acceleration (e.g., gravity). This device may also be used to measure rotation of a screen in a manner similar to how points of the compass are measured. In one embodiment, a device to provide this compass functionality is a PNI electronic compass from PNI Corp. of Santa Rosa, Calif.
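Converting such static accelerometer readings into tilt angles can be sketched as follows. This is an illustrative calculation rather than anything taken from the patent, and the function name is hypothetical:

```python
import math

def tilt_angles(ax, ay):
    """Convert 2-axis accelerometer readings (in units of g)
    into roll/pitch tilt angles in degrees.

    ax and ay are the components of gravity along the device's
    X and Y axes; each reads 1.0 g at 90 degrees of tilt.
    """
    # Clamp to [-1, 1] so sensor noise near full tilt cannot
    # push asin() out of its domain.
    ax = max(-1.0, min(1.0, ax))
    ay = max(-1.0, min(1.0, ay))
    roll = math.degrees(math.asin(ax))   # rotation about the Y-axis
    pitch = math.degrees(math.asin(ay))  # rotation about the X-axis
    return roll, pitch

# A device lying flat reads (0, 0) and is reported as untilted.
print(tilt_angles(0.0, 0.0))   # (0.0, 0.0)
# Rotated 90 degrees about its Y-axis, the device reads ax = 1.0,
# matching the "one gravity unit" description in the text.
print(tilt_angles(1.0, 0.0))
```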
FIGS. 2a and 2b illustrate geometry for an object projected onto a screen of a hand-held computing device appearing to display a solid object within its physical dimensions according to one possible embodiment of the present invention. FIG. 2a illustrates an object 202 viewed by a user 200 as it appears on a screen 201 of a computing device 100. FIG. 2a shows the object 202 and screen 201 of the device 100 in a first orientation. FIG. 2b illustrates the same object 202 and screen 201 after they have been changed to a second orientation. The user 200 observes various points on the object 202, such as object corners A 211, B 212, C 213, and D 214, as image pixels located on the screen 201 at corresponding locations a 221, b 222, c 223, and d 224. The screen points a 221, b 222, c 223, and d 224 are found by projecting the object onto the screen 201 as viewed by the user 200.
When the orientation of the computer 100 is changed in FIG. 2b, the screen 201 and object 202, and its corner points A 211, B 212, C 213, and D 214, remain in the same orientation relative to each other. However, a new set of screen points a′ 231, b′ 232, c′ 233, and d′ 234 is created. These screen points 231–234, which represent the image of the object 202, are different for each orientation of the computer 100. The changes in the screen points may be easily seen by comparing the relative distances between screen points b→a and a→c in FIG. 2a with the corresponding distances b′→a′ and a′→c′ in FIG. 2b. These changes arise because of the effects of parallax on the observations of the object 202 as seen by the user 200. By measuring the orientation and changing the projection of a digital representation of the object 202 onto the screen 201, the desired effect of creating a specimen box is achieved.
Other visual effects, such as lighting changes due to other light sources, may be added to the representations of the points on the object 202 without deviating from the spirit and scope of the present invention as recited in the attached claims. Additionally, the object 202, which is a digital representation of a physical object, may itself be a static object or a “live” object that moves within the field of view of the screen. In this latter case, the computer 100 will move the object within the field of view of the computer, and the updated 3D image will then be projected upon the screen 201 to provide the desired effects.
FIG. 3 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions according to another embodiment of the present invention. At the heart of the present invention is a projection formula. This process is what makes the simulated object inside the pocket PC (think of it as a “specimen box” whose glass lid is the screen) look right. Accelerometers on the device provide output signals whose values determine the necessary “oblique projection” matrix.
Before one may understand the process of the present invention, a definition of various coordinates is needed. World coordinates in the frame of the pocket PC are:
FIG. 2 shows viewing geometry in the standard “home” configuration, in which the viewer looks directly down on the device. FIG. 3 shows a general configuration, in which the pocket PC is tilted to an angle.
Accelerometers attached to the device give outputs:
in units of the Earth's gravity g; i.e., ax=1 is one gravity unit, and will occur when the pocket PC is rotated, about its Y-axis, through 90 degrees from the horizontal.
Accelerometer outputs a are used to define a “control” signal u:
All projections here are done as for a viewer at infinity, i.e., an affine approximation to the full projective form. An exact projective form could be used instead, in which case the exact distance from the PC to the viewer needs to be known. In the absence of such information, the affine approximation effectively assumes that the dimensions of the “specimen box” are small compared with the viewer-to-PC distance.
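The quality of this approximation can be checked numerically. The sketch below compares an exact perspective projection with the affine, viewer-at-infinity one for a point inside a small box viewed from a typical hand-held distance; the function names and the specific dimensions are illustrative, not taken from the patent:

```python
def perspective(X, Y, Z, d):
    """Exact perspective projection for a viewer at distance d
    looking down the Z-axis (pinhole model, unit focal scale)."""
    s = d / (d + Z)
    return X * s, Y * s

def orthographic(X, Y, Z):
    """Affine 'viewer at infinity' approximation: depth is dropped."""
    return X, Y

# A point 1 cm deep inside a roughly 8 cm-wide "specimen box":
pt = (4.0, 3.0, 1.0)
# Viewer at 40 cm, a plausible hand-held viewing distance.
px, py = perspective(*pt, 40.0)
ox, oy = orthographic(*pt)
# The two projections differ by only a few percent, which is why
# the affine form is acceptable when the box is small compared
# with the viewer-to-device distance.
print(px, py)   # approx. (3.90, 2.93)
print(ox, oy)   # (4.0, 3.0)
```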
In the standard configuration of FIG. 3, the formula for the “display” projection onto the screen is
It is understood that this projection is combined with hidden surface removal, using a z-buffer display algorithm or other standard method. With an oblique projection as in FIG. 4, the general viewer configuration is driven from accelerometer outputs a as follows.
Iterate at successive time intervals, as follows:
- 1. Given raw accelerometer outputs, solve for θ ∈ [−π, π] and φ ∈ [0, π] in
- to obtain the gravity vector in polar form (θ, φ).
- 2. Now compute the control vector
- 3. Then define a 3D, affine, oblique projection matrix M
- 4. Apply M to each visible point R of the virtual object:
- 5. Apply the standard display projection of equation (1), including hidden surface removal, to each visible point R.
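The equations referenced in steps 1 through 4 appear in the patent only as drawings and are not reproduced in this text. The following Python sketch is therefore one plausible reconstruction of the per-frame loop, not the patent's exact formulas; every function name and the specific trigonometric and matrix forms are assumptions:

```python
import math

def polar_gravity(ax, ay):
    """Step 1: recover the gravity direction in polar form (theta, phi)
    from the 2-axis accelerometer outputs, in units of g.
    theta in [-pi, pi] is the azimuth of the tilt; phi in [0, pi] is
    the angle of tilt away from the device normal."""
    theta = math.atan2(ay, ax)
    r = min(1.0, math.hypot(ax, ay))   # clamp against sensor noise
    phi = math.asin(r)
    return theta, phi

def control_vector(theta, phi):
    """Step 2: a control vector u pointing along the horizontal
    component of gravity, scaled by the amount of tilt."""
    t = math.tan(phi)
    return (t * math.cos(theta), t * math.sin(theta))

def oblique_matrix(u):
    """Step 3: a 3D affine oblique projection that shears each point
    by u per unit of depth, so deeper points slide across the screen
    as the device tilts (the 'specimen box' effect)."""
    ux, uy = u
    return [[1.0, 0.0, ux],
            [0.0, 1.0, uy],
            [0.0, 0.0, 1.0]]

def apply_matrix(M, R):
    """Step 4: apply M to a visible object point R = (X, Y, Z)."""
    return tuple(sum(M[i][j] * R[j] for j in range(3)) for i in range(3))

# One frame of the loop; step 5 (the standard display projection of
# equation (1), with hidden surface removal) would then map the
# sheared point onto the screen.
theta, phi = polar_gravity(0.5, 0.0)   # tilted 30 degrees about Y
u = control_vector(theta, phi)
M = oblique_matrix(u)
x, y, z = apply_matrix(M, (1.0, 1.0, 2.0))
```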
FIG. 5 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions at an altered orientation and having a different user orientation relative to the computing device according to another example embodiment of the present invention. As above with respect to FIGS. 3 and 4, FIG. 6 illustrates a hand-held computing device appearing to display a solid object within its physical dimensions at an additional altered orientation and having a different user orientation relative to the computing device according to another example embodiment of the present invention. FIGS. 3 and 4 are a special case of the examples shown in FIGS. 5 and 6. In FIG. 3, the viewing direction is aligned with the gravity vector. This is not the most general case; nor is it the most comfortable ergonomically, as the viewer is likely to feel more comfortable viewing at a slant as in FIG. 5. When the device is tilted an additional amount, the configuration of FIG. 6 occurs.
For this more general case, a modified projection algorithm is needed. First, define the “gravity vector” g as a 3-vector. This vector is computed from the two accelerometer outputs a as follows:
Initialize as follows.
- 1. In the home configuration (which could be indicated by the user pressing a button to indicate “start viewing”), the accelerometer output a0 is recorded.
- 2. Calculate the gravity vector g0 from a0, using equation (7).
Iterate at successive time intervals, as follows:
- 1. Read the instantaneous accelerometer outputs a and calculate the gravity vector g from a, using equation (7).
- 2. Solve for θ, φ in
g=R z(θ)R y(φ)g 0 (8)
- to obtain the gravity vector in polar form (θ, φ), where
- 3. Apply steps 2 to 5 from section 3.
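Equations (7) and (8) likewise appear in the patent only as drawings. A sketch of the initialization and angle-solving steps, under the simplifying assumption that the home reading a0 was taken with the device flat (so g0 reduces to (0, 0, −1)), might look as follows; both the reconstruction of equation (7) and the closed-form solution of equation (8) are assumptions:

```python
import math

def gravity_vector(ax, ay):
    """Equation (7) reconstructed: extend the 2-axis reading a to a
    unit 3-vector g. The Z component is inferred from |g| = 1 g,
    with the screen facing up (gravity along negative Z)."""
    ax = max(-1.0, min(1.0, ax))
    ay = max(-1.0, min(1.0, ay))
    gz = -math.sqrt(max(0.0, 1.0 - ax * ax - ay * ay))
    return (ax, ay, gz)

def solve_angles(g):
    """Solve g = Rz(theta) Ry(phi) g0 (equation (8)) for the special
    case g0 = (0, 0, -1), where
      Rz(theta) Ry(phi) (0, 0, -1)
        = (-cos(theta) sin(phi), -sin(theta) sin(phi), -cos(phi)),
    so theta and phi fall out in closed form."""
    phi = math.acos(-g[2])                               # phi in [0, pi]
    theta = math.atan2(-g[1], -g[0]) if phi > 1e-12 else 0.0
    return theta, phi

# Home configuration: device flat, so a0 = (0, 0) and g0 = (0, 0, -1).
g0 = gravity_vector(0.0, 0.0)
# A later instantaneous reading, tilted 30 degrees about the Y-axis:
theta, phi = solve_angles(gravity_vector(0.5, 0.0))
```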
With reference to FIG. 7, an exemplary computing system for embodiments of the invention includes a general purpose computing device in the form of a conventional computer system 700, including a processor unit 702, a system memory 704, and a system bus 706 that couples various system components including the system memory 704 to the processor unit 702. The system bus 706 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus and a local bus using any of a variety of bus architectures. The system memory includes read only memory (ROM) 708 and random access memory (RAM) 710. A basic input/output system 712 (BIOS), which contains basic routines that help transfer information between elements within the computer system 700, is stored in ROM 708.
The computer system 700 further includes a hard disk drive 712 for reading from and writing to a hard disk, a magnetic disk drive 714 for reading from or writing to a removable magnetic disk 716, and an optical disk drive 718 for reading from or writing to a removable optical disk 719 such as a CD ROM, DVD, or other optical media. The hard disk drive 712, magnetic disk drive 714, and optical disk drive 718 are connected to the system bus 706 by a hard disk drive interface 720, a magnetic disk drive interface 722, and an optical drive interface 724, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, programs, and other data for the computer system 700.
Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 716, and a removable optical disk 719, other types of computer-readable media capable of storing data can be used in the exemplary system. Examples of these other types of computer-readable mediums that can be used in the exemplary operating environment include magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), and read only memories (ROMs).
A number of program modules may be stored on the hard disk, magnetic disk 716, optical disk 719, ROM 708 or RAM 710, including an operating system 726, one or more application programs 728, other program modules 730, and program data 732. A user may enter commands and information into the computer system 700 through input devices such as a keyboard 734 and mouse 736 or other pointing device. Examples of other input devices may include a microphone, joystick, game pad, satellite dish, and scanner. For hand-held devices and tablet PC devices, electronic pen input devices may also be used. These and other input devices are often connected to the processing unit 702 through a serial port interface 740 that is coupled to the system bus 706. Nevertheless, these input devices also may be connected by other interfaces, such as a parallel port, game port, or a universal serial bus (USB). A monitor 742 or other type of display device is also connected to the system bus 706 via an interface, such as a video adapter 744. In addition to the monitor 742, computer systems typically include other peripheral output devices (not shown), such as speakers and printers.
The computer system 700 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 746. The remote computer 746 may be a computer system, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer system 700. The network connections include a local area network (LAN) 748 and a wide area network (WAN) 750. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
When used in a LAN networking environment, the computer system 700 is connected to the local network 748 through a network interface or adapter 752. When used in a WAN networking environment, the computer system 700 typically includes a modem 754 or other means for establishing communications over the wide area network 750, such as the Internet. The modem 754, which may be internal or external, is connected to the system bus 706 via the serial port interface 740. In a networked environment, program modules depicted relative to the computer system 700, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary, and other means of establishing a communication link between the computers may be used.
FIG. 8 illustrates a block diagram for an image manipulation and display processing system according to an embodiment of the present invention. In the example embodiment shown herein, the system includes an orientation measurement module 801, a hand-held computing system 802, and a serial connection 803 between them.
The orientation measurement module 801 possesses a tilt sensor module 811, a microcontroller module 812, and a serial interface module 813. The tilt sensor module 811, which may be constructed using the ADXL202 sensor as discussed above, generates X and Y static orientation measurements that are passed to the microcontroller module 812. The microcontroller module 812 performs all control and communications functions to obtain the orientation measurements and transmit them to the hand-held computer 802 as needed. The serial interface module 813 formats and transmits the data over the serial communications link in a desired protocol.
The hand-held computing system 802 possesses a set of processing modules to implement the HoloSim processing system. This set of processing modules includes a serial input module 821, a HoloSim processing module 822, a Displayed object dynamic motion module 823, a display output module 824, and a data memory module 825. The serial input module 821 receives and decodes the data transmitted over the serial communications link in the desired protocol. The HoloSim processing module 822 performs the location transformation and projection calculations needed to update a displayed image of an object. The Displayed object dynamic motion module 823 provides the processing needed to dynamically move an object within the field of view of the computer 802 if desired. The display output module 824 performs the display generation functions to output the 3D representation of the object onto the display of the computer 802. The data memory module 825 contains the data representations for the object and its projection onto the display screen of the computer 802 that is used by the other processing modules.
The serial connection 803 between them may be constructed as any serial connection between two digital processing devices, such as an RS-232 connection, a USB connection, a FireWire connection, or any other serial communication protocol. In addition, one skilled in the art will recognize that the orientation measurement module 801 may be integrated within the hand-held computer 802, where the tilt sensor 811 is a peripheral device of a processor within the hand-held computer 802, without deviating from the spirit and scope of the present invention as recited in the attached claims.
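The patent does not specify the wire format used on this link. Purely as an illustration, a receiver-side decoder for a hypothetical ASCII framing (one "X:&lt;counts&gt; Y:&lt;counts&gt;" line per sample, with a 12-bit count mapped onto the ADXL202's ±2 g range) might look like this; both the framing and the scaling are assumptions:

```python
def decode_sample(line):
    """Decode one hypothetical ASCII sample of the form
    'X:<counts> Y:<counts>' into accelerations in units of g.
    The framing and the 12-bit scaling here are assumptions made
    for illustration; the patent does not specify the protocol."""
    fields = dict(f.split(":") for f in line.split())

    def to_g(counts):
        # Map 0..4095 counts onto the +/-2 g range, with 2048 as
        # the zero-g midpoint.
        return (int(counts) - 2048) / 1024.0

    return to_g(fields["X"]), to_g(fields["Y"])

# Midpoint counts on X (flat) and one gravity unit on Y:
ax, ay = decode_sample("X:2048 Y:3072")
print(ax, ay)   # 0.0 1.0
```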
FIG. 9 illustrates an operational flow for an image manipulation and display processing system according to yet another example embodiment of the present invention. The processing begins 901, and an initial set of accelerometer readings is obtained in module 911 in order to initialize the HoloSim system to an initial orientation to begin the display of an object. These initial measurements are processed within module 912 to calculate an initial gravity vector.
Once the initialization process is completed, the display update process begins with a set of instantaneous accelerometer readings being obtained in module 921. A current measure of the gravity vector is calculated in polar form in module 922. The control vector u is calculated in module 923, and the Oblique Projection Matrix M is calculated in module 924. These vectors are used in module 925 as matrix M is applied to each visible point R on the object. A standard display projection, along with hidden surface removal processing, is then applied to generate the output image in module 926. This output image is displayed to the user.
Test module 913 determines if an additional update for the output image is to be generated. If test module 913 determines that an additional output image is to be generated, the processing returns to module 921 where a new set of accelerometer readings are obtained and used in the generation of the next output image. If test module 913 determines that no additional output images are to be generated, the processing ends 902.
FIG. 7 illustrates an example of a suitable operating environment 700 in which the invention may be implemented. The operating environment is only one example of a suitable operating environment 700 and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Other well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The invention may also be described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
A computing system 700 typically includes at least some form of computer readable media. Computer readable media can be any available media that can be accessed by the system 700. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.
Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
While the above embodiments of the present invention describe a processing system for altering an image displayed to a user, one skilled in the art will recognize that the various computing architectures may be used to implement the present invention as recited within the attached claims. It is to be understood that other embodiments may be utilized and operational changes may be made without departing from the scope of the present invention.
The foregoing description of the exemplary embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto. Thus the present invention is presently embodied as a method, apparatus, computer storage medium, or propagated signal containing a computer program for providing a method, apparatus, and article of manufacture for altering an image displayed to a user based upon a user controlled orientation of the viewing device.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4504701||Nov 14, 1983||Mar 12, 1985||J. C. Penney Company, Inc.||Telephone having circuitry for reducing the audio output of the ringing signal|
|US5329577||Dec 29, 1992||Jul 12, 1994||Nec Corporation||Telephone having touch sensor for responding to a call|
|US5337353||Apr 1, 1992||Aug 9, 1994||At&T Bell Laboratories||Capacitive proximity sensors|
|US5481595||Mar 8, 1994||Jan 2, 1996||Uniden America Corp.||Voice tag in a telephone auto-dialer|
|US5543588 *||Dec 3, 1993||Aug 6, 1996||Synaptics, Incorporated||Touch pad driven handheld computing device|
|US5602566 *||Aug 23, 1994||Feb 11, 1997||Hitachi, Ltd.||Small-sized information processor capable of scrolling screen in accordance with tilt, and scrolling method therefor|
|US5657372||Oct 17, 1994||Aug 12, 1997||Ericsson Inc.||Systems and methods for selectively accepting telephone calls without establishing voice communications|
|US5661632||Sep 29, 1995||Aug 26, 1997||Dell Usa, L.P.||Hand held computer with dual display screen orientation capability controlled by toggle switches having first and second non-momentary positions|
|US5689665||Jun 7, 1995||Nov 18, 1997||International Business Machines Corporation||Apparatus and method for displaying windows|
|US5705997||May 30, 1995||Jan 6, 1998||Daewoo Electronics Co., Ltd.||Self illumination circuit of a hand-held remote control device and self illumination method thereof|
|US5712911||Sep 13, 1995||Jan 27, 1998||Samsung Electronics Co. Ltd.||Method and system for automatically activating and deactivating a speakerphone|
|US5714997||Jan 6, 1995||Feb 3, 1998||Anderson; David P.||Virtual reality television system|
|US5761071||Jul 27, 1996||Jun 2, 1998||Lexitech, Inc.||Self-service computer|
|US5860016||Sep 30, 1996||Jan 12, 1999||Cirrus Logic, Inc.||Arrangement, system, and method for automatic remapping of frame buffers when switching operating modes|
|US5910882||Nov 14, 1995||Jun 8, 1999||Garmin Corporation||Portable electronic device for use in combination portable and fixed mount applications|
|US5924046||May 22, 1997||Jul 13, 1999||Nokia Mobile Phones (U.K.) Limited||Portable radio telephone with sound off-hook production|
|US5963952||Jul 15, 1997||Oct 5, 1999||International Business Machines Corp.||Internet browser based data entry architecture|
|US5995852||Dec 13, 1995||Nov 30, 1999||Sony Corporation||Communication terminal equipment and call incoming control method|
|US6115025 *||Sep 30, 1997||Sep 5, 2000||Silicon Graphics, Inc.||System for maintaining orientation of a user interface as a display changes orientation|
|US6137468||Oct 15, 1996||Oct 24, 2000||International Business Machines Corporation||Method and apparatus for altering a display in response to changes in attitude relative to a plane|
|US6201554||Jan 12, 1999||Mar 13, 2001||Ericsson Inc.||Device control apparatus for hand-held data processing device|
|US6216016||Nov 24, 1997||Apr 10, 2001||U.S. Philips Corporation||Method and system for generating and transmitting a waiting message|
|US6246862||Feb 3, 1999||Jun 12, 2001||Motorola, Inc.||Sensor controlled user interface for portable communication device|
|US6259787||May 26, 1998||Jul 10, 2001||Dynachieve, Inc.||Telephone alarm and monitoring method and apparatus|
|US6288704||Nov 9, 1999||Sep 11, 2001||Vega, Vista, Inc.||Motion detection and tracking system to control navigation and display of object viewers|
|US6292674||Aug 5, 1998||Sep 18, 2001||Ericsson, Inc.||One-handed control for wireless telephone|
|US6304765||Nov 16, 1999||Oct 16, 2001||Motorola, Inc.||Foldable communication device and method|
|US6310955||Jun 16, 1998||Oct 30, 2001||Lucent Technologies Inc.||Methods and apparatus for enabling portable telephone handset to automatically go off-hook|
|US6374145||Dec 14, 1998||Apr 16, 2002||Mark Lignoul||Proximity sensor for screen saver and password delay|
|US6381540||Sep 14, 2001||Apr 30, 2002||Garmin Corporation||GPS device with compass and altimeter and method for displaying navigation information|
|US6408187||May 14, 1999||Jun 18, 2002||Sun Microsystems, Inc.||Method and apparatus for determining the behavior of a communications device based upon environmental conditions|
|US6426736||Dec 23, 1999||Jul 30, 2002||Nec Corporation||Portable telephone with liquid crystal display|
|US6449363||Nov 9, 1999||Sep 10, 2002||Denso Corporation||Safety tilt mechanism for portable telephone including a speakerphone|
|US6466198||Apr 5, 2000||Oct 15, 2002||Innoventions, Inc.||View navigation and magnification of a hand-held device with a display|
|US6491632 *||Jun 26, 2001||Dec 10, 2002||Geoffrey L. Taylor||Method and apparatus for photogrammetric orientation of ultrasound images|
|US6509907||Dec 15, 1999||Jan 21, 2003||Denso Corporation||Personal communication terminal with variable speed scroll display feature|
|US6516202||Aug 12, 1999||Feb 4, 2003||Handspring, Inc.||Mobile computer system designed for wireless communication expansion|
|US6542436||Jun 30, 2000||Apr 1, 2003||Nokia Corporation||Acoustical proximity detection for mobile terminals and other devices|
|US6560466||Sep 15, 1998||May 6, 2003||Agere Systems, Inc.||Auditory feedback control through user detection|
|US6567068||Jul 29, 1997||May 20, 2003||Sony Corporation||Information processing device and method|
|US6567101||Oct 13, 1999||May 20, 2003||Gateway, Inc.||System and method utilizing motion input for manipulating a display of data|
|US6573883||Jun 24, 1998||Jun 3, 2003||Hewlett Packard Development Company, L.P.||Method and apparatus for controlling a computing device with gestures|
|US6621800||Jan 24, 2000||Sep 16, 2003||Avaya Technology Corp.||Message monitor application concept and implementation|
|US6624824||Apr 30, 1996||Sep 23, 2003||Sun Microsystems, Inc.||Tilt-scrolling on the sunpad|
|US6631192||Sep 22, 1999||Oct 7, 2003||Nec Corporation||Cellular phone with lighting device and method of controlling lighting device|
|US6658272||Apr 28, 2000||Dec 2, 2003||Motorola, Inc.||Self configuring multiple element portable electronic device|
|US6822683||Oct 25, 1999||Nov 23, 2004||Fuji Photo Film Co., Ltd||Image sensing apparatus and method of controlling operation thereof|
|US6931592||May 22, 2000||Aug 16, 2005||Microsoft Corporation||Reviewing and merging electronic documents|
|US6970182||Oct 20, 1999||Nov 29, 2005||National Instruments Corporation||Image acquisition system and method for acquiring variable sized objects|
|US20010044318||Dec 14, 2000||Nov 22, 2001||Nokia Mobile Phones Ltd.||Controlling a terminal of a communication system|
|US20020021278||Jun 6, 2001||Feb 21, 2002||Hinckley Kenneth P.||Method and apparatus using multiple sensors in a device with a display|
|US20020167488||Jun 3, 2002||Nov 14, 2002||Hinckley Kenneth P.||Mobile phone operation based upon context sensing|
|US20030085870||Nov 14, 2002||May 8, 2003||Hinckley Kenneth P.||Method and apparatus using multiple sensors in a device with a display|
|US20030104800||Feb 21, 2002||Jun 5, 2003||Artur Zak||Telephone with alarm signalling|
|US20030133629||Jan 17, 2002||Jul 17, 2003||Sayers Craig P.||System and method for using printed documents|
|US20030176205||Mar 18, 2002||Sep 18, 2003||Kabushiki Kaisha Toshiba||Mobile communication terminal with unanswered incoming-call notifying function|
|JP2000124970A|| ||Title not available|
|JP2001094636A|| ||Title not available|
|JPH08292826A|| ||Title not available|
|WO1998014863A2||Sep 22, 1997||Apr 9, 1998||Philips Electronics Nv||Hand-held image display device|
|WO1999022338A1||Oct 8, 1998||May 6, 1999||British Telecomm||Portable computers|
|1||Bartlett, J.F., "Rock 'n' Scroll Is Here to Stay," IEEE Computer Graphics and Applications, pp. 40-45, May/Jun. 2000.|
|2||Harrison, Beverly L. et al., "Squeeze Me, Hold Me, Tilt Me! An Exploration of Manipulative User Interfaces," pp. 17-24 (Apr. 18-23, 1998), CHI '98.|
|3||Hinckley, et al., "Sensing Techniques for Mobile Interaction," CHI Letters vol. 2:2; Copyright 2000, ACM 1-55113-232-3, pp. 91-100.|
|4||Innoventions' RotoView(TM), The Intuitive Display Navigation Solution for Hand Held Devices, Background and problem definition.|
|5||One page (19) from Technology Review dated Mar. 2002.|
|6||Rekimoto, Jun, "Tilting Operations for Small Screen Interfaces (Tech Note)," pp. 167-168, UIST '96.|
|7||RotoView(TM) By Innoventions, Features and Specifications.|
|8||RotoView(TM) By Innoventions, How It Works.|
|9||RotoView(TM) By Innoventions, The Intuitive Display Navigation Solution for Hand Held Devices.|
|10||RotoView(TM), By Innoventions, The Intuitive Display Navigation solution for Hand Held Devices.|
|11||Schmidt, Albrecht et al., "Advanced Interaction in Context," 13 pages, HUC '00.|
|12||Schmidt, Albrecht et al., "There Is More to Context Than Location," Environment Sensing Technologies for Adaptive Mobile User Interfaces, 5 pages, IMC '98.|
|13||Schmidt, Albrecht, "Implicit Human Computer Interaction Through Context," pp. 1-5, 2nd Workshop on Human Computer Interaction with Mobile Devices, 1999.|
|14||Small, David et al., "Design of Spatially Aware Graspable Displays," Extended Abstracts of CHI '97, pp. 1-2 (Mar. 22-27, 1997).|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7310082 *||May 28, 2004||Dec 18, 2007||Samsung Electronics Co., Ltd.||Computer display having display direction control|
|US7382353 *||Nov 18, 2004||Jun 3, 2008||International Business Machines Corporation||Changing a function of a device based on tilt of the device for longer than a time period|
|US7567818 *||Mar 4, 2005||Jul 28, 2009||Motionip L.L.C.||Mobile device with wide-angle optics and a radiation sensor|
|US7730422 *||Jan 25, 2006||Jun 1, 2010||Microsoft Corporation||Smart icon placement across desktop size changes|
|US7966146||Apr 11, 2008||Jun 21, 2011||Keynetik, Inc.||Force sensing apparatus and method to determine the radius of rotation of a moving object|
|US8743054||Dec 3, 2009||Jun 3, 2014||Koninklijke Philips N.V.||Graphical representations|
|US8767053 *||Aug 26, 2010||Jul 1, 2014||Stmicroelectronics, Inc.||Method and apparatus for viewing stereoscopic video material simultaneously with multiple participants|
|US8803816||Sep 3, 2009||Aug 12, 2014||Qualcomm Incorporated||Multi-fold mobile device with configurable interface|
|US8836611||Sep 3, 2009||Sep 16, 2014||Qualcomm Incorporated||Multi-panel device with configurable interface|
|US8860632||Sep 3, 2009||Oct 14, 2014||Qualcomm Incorporated||Multi-panel device with configurable interface|
|US8860765||Sep 3, 2009||Oct 14, 2014||Qualcomm Incorporated||Mobile device with an inclinometer|
|US8863038||Sep 3, 2009||Oct 14, 2014||Qualcomm Incorporated||Multi-panel electronic device|
|US8890802 *||Jun 10, 2008||Nov 18, 2014||Intel Corporation||Device with display position input|
|US8907943||Jul 7, 2010||Dec 9, 2014||Apple Inc.||Sensor based display environment|
|US8922583 *||Nov 17, 2009||Dec 30, 2014||Qualcomm Incorporated||System and method of controlling three dimensional virtual objects on a portable computing device|
|US8933874||Sep 3, 2009||Jan 13, 2015||Patrik N. Lundqvist||Multi-panel electronic device|
|US8947320||Sep 3, 2009||Feb 3, 2015||Qualcomm Incorporated||Method for indicating location and direction of a graphical user interface element|
|US8947382||Feb 28, 2012||Feb 3, 2015||Motorola Mobility Llc||Wearable display device, corresponding systems, and method for presenting output on the same|
|US20090303208 *||Jun 10, 2008||Dec 10, 2009||Case Jr Charlie W||Device with display position input|
|US20100156907 *||Dec 23, 2008||Jun 24, 2010||Microsoft Corporation||Display surface tracking|
|US20110023113 *||Sep 29, 2010||Jan 27, 2011||Munyon Paul J||System and method for inhibiting access to a computer|
|US20110115784 *||Nov 17, 2009||May 19, 2011||Tartz Robert S||System and method of controlling three dimensional virtual objects on a portable computing device|
|US20120050463 *||Aug 26, 2010||Mar 1, 2012||Stmicroelectronics, Inc.||Method and apparatus for viewing stereoscopic video material simultaneously with multiple participants|
|US20120249600 *||Oct 25, 2011||Oct 4, 2012||Kabushiki Kaisha Toshiba||Information processing apparatus and method|
|US20130120459 *||Nov 16, 2011||May 16, 2013||Motorola Mobility, Inc.||Display Device, Corresponding Systems, and Methods for Orienting Output on a Display|
|US20140184489 *||Dec 27, 2012||Jul 3, 2014||Liwei Ma||Open angle detection and processing apparatus and method|
|WO2007033154A2 *||Sep 12, 2006||Mar 22, 2007||Rembrandt Ip Man Llc||Motion detection and tracking system to control navigation|
|Jul 25, 2014||FPAY||Fee payment|
Year of fee payment: 8
|Jul 28, 2010||FPAY||Fee payment|
Year of fee payment: 4
|May 31, 2002||AS||Assignment|
Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WILLIAMS, LYNDSAY;BLAKE, ANDREW;REEL/FRAME:012972/0135
Effective date: 20020530