US20110084984A1 - Self-orienting display - Google Patents

Self-orienting display

Info

Publication number
US20110084984A1
US20110084984A1 (application US 12/974,173)
Authority
US
United States
Prior art keywords
display image
display
display device
image portion
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/974,173
Inventor
Scott Manchester
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Priority to US 12/974,173
Publication of US20110084984A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC (assignment of assignors interest; see document for details). Assignors: MICROSOFT CORPORATION
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/22Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
    • G09G5/32Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory with means for controlling the display position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37Details of the operation on graphic patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0606Manual adjustment
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0492Change of orientation of the displayed image, e.g. upside-down, mirrored
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Definitions

  • the present invention generally relates to displays and more specifically relates to systems and methods that automatically orient displays.
  • Display devices are becoming smaller and more portable. Display devices such as flat liquid crystal displays (LCDs) and plasma displays are relatively thin and lightweight. These lightweight, smaller displays are more easily maneuvered than many of the bulkier cathode ray tube (CRT) displays. Due to the increased maneuverability of these displays, viewers are more likely to turn or rotate the display. This is also applicable to the plethora of available hand held display devices such as personal digital assistants (PDAs), cell phones, and games, just to name a few. As the cost of these display devices continues to decrease, and the number of smart devices incorporating these displays increases, more and more users will be using these products to accommodate a variety of needs.
  • a problem with current display devices is that the display image becomes difficult to read/see when the display device is turned or rotated. For example, as a hand held PDA is rotated 90°, the display image appears tilted and can be difficult to interpret, or a viewer watching television may decide to lie down, which also makes the display image on the television appear tilted.
  • some multipurpose devices are better suited to display specific display types in specific formats, such as text in traditional portrait orientation and video in landscape orientation.
  • a method for orienting a display image includes sensing at least one characteristic of an object and determining the orientation of the object from at least one of the sensed characteristic(s).
  • A display image is oriented relative to the determined orientation of the object.
  • a system for implementing this method includes a sensor portion and a display processor.
  • the sensor portion senses at least one characteristic of an object and provides a sensor signal indicative of the characteristic(s).
  • the display processor receives the sensor signal and determines the orientation of the object from the sensor signal.
  • the display processor also orients a display image relative to the determined orientation of the object.
  • FIG. 1 is an illustration of a self-orienting display comprising a display device, a display image, a sensor, and optional control buttons in accordance with an exemplary embodiment of the present invention
  • FIG. 2A is an illustration of a rotated display device not possessing a self-orienting capability
  • FIG. 2B is an illustration of a rotated self-orienting display device showing the oriented display image portions and control buttons in accordance with an exemplary embodiment of the present invention
  • FIG. 3 is an illustration of a rotated self-orienting display device showing the oriented display image rotated to achieve an arbitrary orientation in accordance with an exemplary embodiment of the present invention
  • FIG. 4 is an enlarged illustration of a control button comprising an array of light emitting diodes (LEDs) in accordance with an exemplary embodiment of the present invention
  • FIG. 5 is an illustration of liquid crystal display (LCD) control buttons in accordance with an exemplary embodiment of the present invention
  • FIG. 6 is an enlarged illustration of a control button that is automatically oriented by gravity in accordance with an exemplary embodiment of the present invention
  • FIG. 7 is an enlarged illustration of a control button that is automatically oriented by gravity in accordance with another exemplary embodiment of the present invention.
  • FIG. 8 is an illustration of a self-orienting display showing a viewer viewing the display image, and multiple sensors positioned on the display device, in accordance with an exemplary embodiment of the present invention
  • FIG. 9 is an illustration of a self-orienting display showing a viewer viewing the display image, and multiple sensors positioned on the viewer, in accordance with an exemplary embodiment of the present invention
  • FIG. 10 is a functional block diagram of a self-orienting display system comprising a sensor portion, a display processor, a display portion, and an authenticator, in accordance with an exemplary embodiment of the present invention.
  • FIG. 11 is a flow diagram of an exemplary process for orienting a display in accordance with an exemplary embodiment of the present invention.
  • a self-orienting display in accordance with the present invention senses the orientation of an object and automatically orients a display image in accordance with the orientation of that object.
  • self-orienting includes automatically rotating, along any number of axes, and formatting.
  • An exemplary embodiment of this self-orienting display comprises a monitor that automatically orients the display image provided by the monitor to either a landscape orientation or a portrait orientation in response to the orientation of the monitor.
  • the display image may be rotated in response to an audio command, such as “rotate”, or the display image may be rotated in response to depression of a switch on the display device.
  • objects may include the display device that provides the display image, a person viewing the display image, an object within visual and/or acoustic range of the self-orienting display, or a combination thereof.
  • the display device may be any appropriate device having the capability to provide a display image, such as a monitor, a hand held device, a personal digital assistant (PDA), a cellular telephone having a display, a game device having a display, or a portable computer, for example.
  • sensors include mechanical sensors, electrical sensors, optical sensors, acoustic sensors, gyroscopic sensors, or a combination thereof.
  • Example sensors include mercury switches, infrared detectors, motion detectors, ultrasonic detectors, cameras, and microphones.
  • sensors may be positioned on the display device, a person, or a combination thereof (e.g., mercury switches attached to the display device and gyroscopic sensor attached to a headset of a viewer of the display image).
  • Various embodiments of the display image include graphic display images, textual display images, video display images, and functional control buttons (e.g., functional displayed representations of control buttons such as play, rewind, stop, scroll), for example. A more detailed description of these various embodiments is provided below.
  • FIG. 1 is an illustration of a self-orienting display 100 comprising a display device 12 , a display image 14 , a sensor 16 , and optional control buttons 18 .
  • the display device 12 may take the form of any appropriate display device capable of providing the display image 14.
  • appropriate display devices 12 include cathode ray tube (CRT) displays, plasma displays, light emitting diode (LED) displays, flat panel displays, projection displays, wireless devices (e.g., cellular devices including telephones, personal digital assistants (PDAs), portable computers; and devices communicating via an optical link, such as an infrared link), hand held devices (e.g., hand held games or game controllers), televisions, radios, or alarm clocks, just to name a few.
  • the sensor 16 may comprise any type of sensor capable of sensing the orientation of the display device 12 and/or another object (e.g., a person viewing the display image 14 ).
  • appropriate sensors include mechanical sensors, electrical sensors, optical sensors, acoustic sensors, gyroscopic sensors, or a combination thereof.
  • Some specific types of sensors 16 include mercury switches, infrared detectors, motion detectors, ultrasonic detectors, cameras, and microphones, or a combination thereof. Note some types of sensors fall into more than one category. For example, a mercury switch may be considered a mechanical sensor and an electrical sensor, or an ultrasonic sensor may be considered an acoustic sensor and an electrical sensor.
  • the sensor 16 may include a single sensor or a plurality of sensors.
  • the sensor 16 may be positioned at various locations on the display device 12 or may be positioned at a single location. For example, sensors 16 may be placed at the corners of the display device 12 . Furthermore, sensors 16 may be positioned on the display device, a person, or a combination thereof.
  • the display image 14 may be in the form of a graphic display image, a textual display image, a video display image, a functional control button 18, or a combination thereof.
  • the display image 14 may comprise display image portions, such as display image portions 14 a and 14 b.
  • a graphic/video display type is provided by the display image portion 14 a
  • a text display type is provided by the display image portion 14 b .
  • the display image portion 14 a may depict a video
  • the display image portion 14 b may depict email headers/text. It is to be understood that this depiction is exemplary, and not intended to be limited thereto.
  • the display image 14 may not be partitioned into portions, the display image 14 may be partitioned into a plurality of portions, the display image portions may overlap, the display image portions may provide any combination of display types, or a combination thereof.
  • the control buttons are implemented as display image portions (described in more detail below). It is to be understood, therefore, that reference to display image 14 , display image portion 14 a, and/or display image portion 14 b, may also be appropriately interpreted to refer to the control buttons 18 when implemented as display portions. For example, a description of rotation and formatting techniques to be applied to the display image 14 also applies to the control buttons 18 implemented as display portions.
  • the control buttons 18 may comprise any appropriate type of control device capable of controlling functions related to the display image 14 and/or the display device 12 .
  • the control buttons 18 comprise liquid crystal display (LCD) buttons with a protective overlay (e.g., a touch switch).
  • each control button 18 comprises an array of light emitting diodes (LEDs).
  • each control button 18 comprises a thin disc or the like, formed in a desired shape (e.g., triangle) contained within a liquid.
  • the control buttons 18 are weighted such that a portion of each button is always oriented towards the greatest gravitational force.
  • the control buttons 18 control various aspects of the display image 14 and/or the display device 12 .
  • control buttons 18 may include, for example, playback, pause, stop, rewind, enable/disable back lighting, or a combination thereof.
  • the control buttons 18 may include an orientation button that, when activated, orients the display image 14 .
  • one of the control buttons 18 may switch the display image 14 between landscape orientation and portrait orientation each time the button is depressed/touched.
  • the orientation control button may rotate the display image 14 a predetermined number of degrees each time it is depressed/touched.
  • the control buttons 18 are optional.
  • various embodiments of the self-orienting display in accordance with the present invention may or may not comprise control buttons.
  • FIG. 2A is an illustration of a rotated display device not possessing a self-orienting capability.
  • FIG. 2B is an illustration of a rotated self-orienting display device 12 showing the oriented display image portions 14 a, 14 b, and control buttons 18 .
  • the display devices shown in FIG. 2A and FIG. 2B are rotated 90 degrees with respect to the display device 12 shown in FIG. 1 . Comparing FIG. 2A with FIG. 2B , the display image portion 14 a of FIG. 2B is rotated by 90° with respect to the equivalent display image portion shown in FIG. 2A .
  • the display image portion 14 b of FIG. 2B is also rotated by 90° with respect to the equivalent display image portion shown in FIG. 2A .
  • buttons 18 a and 18 b of FIG. 2B are rotated and reformatted to conform to the rotated image space of the display device 12 .
  • Rotation and formatting may be accomplished by any appropriate technique. For example, a raster scan display image may be rotated by simply transposing the horizontal and vertical deflection values. Formatting may then be accomplished to fit the rotated image within the available display image space to reduce any distortion. For a display utilizing pixels, each array of pixels may be transposed and formatted. Examples of algorithms/techniques for reformatting displays include scaling, stretching, and the ability to dynamically update resolution.
  • the display image 14 is oriented with respect to the orientation of the display device 12 .
  • the display image 14 is automatically oriented, such that the display image 14 appears to remain approximately stable regardless of the orientation of the display device 12 .
  • a viewer prefers landscape mode, she can rotate the display device 12 to achieve the orientation shown in FIG. 1 .
  • the viewer prefers portrait mode, she can rotate the display device 12 to achieve the orientation shown in FIG. 2B . Note that even though the appearance of the display image 14 , relative to a viewer, remains approximately constant, the display image 14 is actually oriented (rotated and formatted) in response to the orientation of the display device 12 .
  • the relative orientation between the display image 14 and a viewer is approximately constant.
  • the display image 14 is tilted in the same direction, such that the orientation between the viewer and the display image 14 is approximately constant (fixed).
  • the display image 14 is rotated to achieve a landscape orientation or a portrait orientation.
  • orientation of the display image 14 is not limited thereto.
  • FIG. 3 is an illustration of a rotated self-orienting display device 12 showing the oriented display image 14 rotated to achieve an arbitrary orientation.
  • the display image 14 of FIG. 3 is automatically rotated such that the relative orientation between a viewer and the display image 14 is approximately constant, regardless of the amount by which the display device 12 is rotated.
  • Orientation of the display image 14 and/or the control buttons 18 is not limited to rotation in a single dimension (e.g., plane).
  • the display image 14 may be oriented in one, two, or three dimensions, as indicated by the three dimensional set of axes 15 .
  • a three dimensional depiction on the display image 14 may be rotated horizontally, vertically, or a combination thereof as the display device 12 is rotated, such that the relative orientation between a viewer and the display image 14 remains approximately the same.
  • the three dimensional display image 14 is oriented to provide a desired perspective to the viewer. This may be accomplished by the viewer simply turning and/or shifting her head to view the desired perspective, turning the display device 12 to view the desired perspective, or a combination thereof.
  • sensors 16 can be positioned on the viewer 36 and/or on the display device 12 to sense the orientation of the viewer and/or display device 12 .
  • the three dimensional display image 14 is a cube and the display device 12 is a hand held display device.
  • the viewer is viewing a front side of the cube. If the viewer desires to view the left side of the cube, she may simply rotate the hand held display device (e.g., to the right) to view the left side of the cube. She may also turn her head (e.g., to the right and/or shift her head to the left), as if the cube were physically in front of her and she positioned herself to look at the left side.
  • FIG. 4 is an enlarged illustration of a control button 18 b comprising an array of light emitting diodes (LEDs).
  • An exemplary control button 18 b is expanded to show the array of LEDs utilized to display the shape corresponding to the control function performed by the button.
  • the control button 18 b comprises a triangle shaped image, which may signify play, for example.
  • the LEDs may be various colors.
  • the images of the control buttons 18 , including 18 b, are rotated accordingly.
  • the array of LEDs is symmetric, thus allowing the control button image to be rotated between landscape and portrait mode by transposing the array of LEDs.
  • FIG. 5 is an illustration of liquid crystal display (LCD) control buttons 18 .
  • the control buttons 18 in FIG. 5 comprise LCD portions for displaying the shape corresponding to the control function performed by the button. As shown in FIG. 5 , the LCD portions are covered with an appropriate overlay 20 to protect the LCD portions and to provide a surface which can be touched/depressed to utilize the control buttons 18 . Upon the orientation of the display device 12 being sensed, the LCD images of the control buttons 18 are rotated accordingly.
  • FIG. 6 is an enlarged illustration of a control button 18 b that is automatically oriented by gravity, wherein the control button comprises a liquid portion 22 having a shaped disc 24 contained therein.
  • the exemplary control button 18 b is automatically oriented by gravity when the display device 12 is rotated.
  • the control button 18 b of FIG. 6 comprises a liquid portion 22 contained within the control button 18 b.
  • Within the liquid portion 22 is contained disc 24 formed in a shape corresponding to the control function performed by the button.
  • the disc 24 is triangular shaped, indicating the play function, for example.
  • the disc 24 is suspended in the liquid portion 22 . As the display device 12 is rotated the disc 24 automatically rotates, thus resulting in self-orientation of the control button 18 b.
  • the disc 24 is weighted such that a specific portion 28 of the disc 24 is always pointed in the direction of the strongest gravitational pull (e.g., down).
  • the arrow 28 depicts a portion of the disc 24 that is heavier (more mass) such that the portion 28 is always facing “down” (toward the strongest gravitational attractive force).
  • the disc 24 contains an air bubble 26 (or other appropriate gas or liquid portion) such that the portion with the lesser mass is always facing “up” (away from the direction of the strongest gravitational pull).
  • the bubble 26 may be any portion comprising a gas or a liquid that is less dense than the liquid in the liquid portion 22 .
  • Other types of control buttons 18 that are automatically oriented by gravity are envisioned.
  • FIG. 7 is an enlarged illustration of a control button 18 b that is automatically oriented by gravity, wherein the control button 18 b comprises bearing 34 .
  • Self-orientation of the control button 18 b of FIG. 7 is achieved via gravity in a similar manner as described above with respect to FIG. 6 .
  • the disc 24 is contained within the control button 18 b by bearings, or the like, which allow the disc 24 to freely rotate.
  • the control button 18 b shown in FIG. 7 may comprise a weighted portion 28 , a less dense portion 26 , or a combination, similar to the control button 18 b shown in FIG. 6 .
  • the control buttons 18 may be attached to spindles or axles that allow the control buttons 18 to freely rotate.
  • FIG. 8 is an illustration of a self-orienting display showing a viewer 36 viewing the display image 14 , and multiple sensors 16 a, 16 b, and 16 c positioned on the display device 12 .
  • the sensors 16 may comprise any appropriate type and combination of sensors capable of sensing the orientation of an object. Examples include known types of devices such as mercury switches, gyroscopic sensors/devices, gravity switches/devices, optical detectors (e.g., infrared detectors), acoustic sensors/devices (e.g., ultrasonic devices, acoustic microphones), electrical sensors/devices, magnetic devices/sensors, and cameras.
  • the sensors 16 may be positioned on the display device 12 and/or on the viewer 36 . Thus, the sensors 16 may be positioned on an object, wherein the object may comprise the display device 12 , the viewer 36 , another object within sensing range of the sensors 16 , or a combination thereof.
  • the sensors 16 positioned on the display device 12 in FIG. 8 are a camera 16 a, an acoustic sensor (e.g., microphone) 16 b, and mercury switches 16 c.
  • the camera 16 a may comprise any appropriate type of camera, such as a camera utilizing a charge coupled device (CCD), or an infrared camera (e.g., night vision), for example.
  • the camera 16 a senses the orientation of the viewer's 36 head.
  • the display image 14 is automatically oriented by any of the techniques/devices described herein.
  • the relative orientation between the display image 14 and the object is initialized. This may include initialization of the relative orientation between the display image 14 and the display device 12 , the relative orientation between the display image 14 and the viewer 36 , or a combination thereof.
  • the relative orientation between the viewer 36 and the display image 14 is initialized.
  • the viewer 36 may position herself in front of the display image 14 , such that she is within sensing range of the sensors 16 (e.g., optical range of the camera 16 a and/or audio range of the microphone 16 b ). While observing her depiction on the display image 14 , the viewer may position her head to align the depiction to be centered in the display image 14 , for example.
  • Initialization may be accomplished by any appropriate means, such as activating a switch, depressing a button (e.g., a control button 18 ), giving an audible command, waiting a period of time, or a combination thereof.
  • the viewer gives an audio command, such as “align”.
  • the microphone 16 b receives this audio command, and transduces the audio command into a sense signal. This sense signal is utilized to establish the baseline relative orientation between the display image 14 and the viewer 36 .
  • the viewer 36 may rotate the display device 12 to either landscape or portrait orientation.
  • the mercury switches 16 c sense the orientation of the display device 12 , also providing a sense signal.
  • the sense signal provided by the mercury switches 16 c and the sense signal provided by the camera 16 a during initialization are utilized to establish the baseline relative orientation.
  • the sense signals are also utilized to orient the display image 14 as the display device 12 and/or the viewer 36 change orientation.
  • the display image 14 may be oriented via an audio command, such as “rotate”, in response to which the display image 14 is rotated (e.g., 90°).
  • the viewer 36 is authenticated. Authentication may be accomplished by analyzing the sensed image, which is sensed by the camera 16 a, to determine if the viewer is authorized to use the display device 12 .
  • the sensed image may include a retinal scan, a finger print scan, or the like.
  • the sensed image is analyzed to determine if authorization is appropriate. Any appropriate technique may be utilized to analyze the sensed image. For example, the sensed image may be compared to a stored representation of an authorized image, or the sensed image may be analyzed for key features which distinguish an authorized sensed image, or a combination thereof.
  • the viewer is authenticated by analyzing a sensed acoustic signal received by the acoustic sensor 16 b.
  • the acoustic signal may include a key phrase, such as the viewer's 36 name or a password.
  • the sensed acoustic signal is analyzed to determine if authorization is appropriate. Any appropriate technique may be utilized to analyze the sensed acoustic signal. For example, the sensed acoustic signal may be compared to a stored representation of an authorized acoustic signal, or the sensed acoustic signal may be analyzed for key features which distinguish an authorized sensed acoustic signal (e.g., an acoustic signature), or a combination thereof.
  • FIG. 10 is a functional block diagram of a self-orienting display system comprising a sensor portion 40 , a display processor 42 , a display portion 44 , and an optional authenticator 46 .
  • the sensor portion 40 may comprise any combination of the sensors described above.
  • the sensor portion 40 senses at least one characteristic of an object.
  • the object may be the display device (e.g., display device 12 ) and characteristics may include orientation of the display device; the object may be a user (e.g., viewer 36 ) and the characteristic may include an image of a portion of the user's body (e.g., retina, finger print); the object may be a user and the characteristic may include an acoustic signal provided by the user (e.g., voice), or a combination thereof.
  • the sensor portion 40 provides a sensor signal 48 indicative of the sensed characteristic (or characteristics) of the object.
  • the display processor 42 receives the sensor signal 48 and processes the sensor signal 48 to determine the orientation of the sensed characteristic(s).
  • the display processor 42 provides an orientation signal 50 indicative of the determined orientation of the object (a minimal code sketch of this sensor-to-display pipeline is provided at the end of this list).
  • the display portion 44 receives the orientation signal 50 and orients a display image (e.g., display image 14 ) in accordance with the determined orientation.
  • the self-orienting display system comprises the authenticator 46 for authenticating the object by analyzing the sensed characteristic(s) of the object.
  • the authenticator 46 receives the sensor signal 48 and analyzes the sensed characteristic(s) using any of the analysis techniques described above.
  • The sensor signal 48 and the orientation signal 50 may be provided by any appropriate means, such as electrically, acoustically, optically, electromagnetically, or a combination thereof.
  • FIG. 11 is a flow diagram of an exemplary process for self-orienting a display.
  • the object is sensed at step 54 .
  • the object may be a person, the display device, or a combination thereof.
  • the characteristic may include orientation of the object, an image of a portion of the object (e.g., retina or fingerprint), an acoustic signal (e.g., voice or clap), or a combination thereof.
  • the object may be sensed by any combination of the sensors described above, such as optical sensors, mechanical sensors, gravity sensors, gyroscopic sensors, electromagnetic sensors, acoustic sensors, touch sensitive sensors (e.g., control buttons 18 ), for example.
  • the object is authenticated as described above. The step of authentication is optional.
  • the relative orientation between the object and the display image is initialized at step 58 .
  • the step of initialization is also optional. Initialization may be accomplished as described above.
  • the orientation of the object is determined utilizing the sensed characteristic (or characteristics) of the object.
  • the display image is oriented with respect to the determined orientation of the object at step 62 .
  • the display image may be oriented to predetermined orientations, such as portrait, landscape, rotation in a predetermined number of degrees, or a combination thereof.
  • the display image may also be oriented such that the orientation of the display image appears approximately constant (e.g., fixed) regardless of the orientation of the object. For example, a display image will appear to rotate and/or tilt in the opposite direction of the rotation and/or tilt of the display device.
  • a method for self-orienting a display image as described herein may be embodied in the form of computer-implemented processes and system for practicing those processes.
  • a method for self-orienting a display image as described herein may also be embodied in the form of computer program code embodied in tangible media, such as floppy diskettes, read only memories (ROMs), CD-ROMs, hard drives, high density disk, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a system for practicing the invention.
  • the method for self-orienting a display image as described herein may also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over the electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a system for practicing the invention.
  • the computer program code segments configure the processor to create specific logic circuits.
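
The following is an illustrative sketch, not part of the patent disclosure, of the sensor-to-display pipeline of FIG. 10 and the process of FIG. 11: a sensor portion produces a sensor signal 48, an optional authenticator compares a sensed characteristic against a stored representation, and a display processor derives an orientation signal 50 that the display portion uses to orient the display image. The class names, the cosine-similarity authentication, and the 90-degree snapping are assumptions introduced only for illustration.

    # Illustrative sketch of the FIG. 10 / FIG. 11 pipeline; all names are hypothetical.
    import math
    from dataclasses import dataclass

    @dataclass
    class SensorSignal:          # corresponds to sensor signal 48
        roll_degrees: float      # sensed device roll (e.g., from mercury switches or a gyroscope)
        voiceprint: list         # sensed acoustic characteristic used for optional authentication

    class SensorPortion:
        def sense(self) -> SensorSignal:
            # A real device would read mercury switches, a camera, a microphone, etc.
            return SensorSignal(roll_degrees=90.0, voiceprint=[0.2, 0.9, 0.4])

    class Authenticator:
        def __init__(self, stored_voiceprint, threshold=0.9):
            self.stored = stored_voiceprint
            self.threshold = threshold
        def authenticate(self, signal: SensorSignal) -> bool:
            # Compare the sensed characteristic to a stored representation (cosine similarity).
            dot = sum(a * b for a, b in zip(signal.voiceprint, self.stored))
            norm = (math.sqrt(sum(a * a for a in signal.voiceprint))
                    * math.sqrt(sum(b * b for b in self.stored)))
            return norm > 0 and dot / norm >= self.threshold

    class DisplayProcessor:
        def determine_orientation(self, signal: SensorSignal) -> float:
            # Orientation signal 50: snap the sensed roll to the nearest 90 degree orientation.
            return round(signal.roll_degrees / 90.0) % 4 * 90.0

    class DisplayPortion:
        def orient(self, orientation_degrees: float):
            print(f"rotating display image by {orientation_degrees} degrees")

    # Steps 54-62 of FIG. 11: sense, (optionally) authenticate, determine orientation, orient.
    sensor, processor, display = SensorPortion(), DisplayProcessor(), DisplayPortion()
    auth = Authenticator(stored_voiceprint=[0.2, 0.9, 0.4])
    signal = sensor.sense()
    if auth.authenticate(signal):
        display.orient(processor.determine_orientation(signal))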

Abstract

A self-orienting display senses the characteristics of an object and automatically rotates and reformats a display image in accordance with those characteristics. In one embodiment, the object is the display device, such as a hand held device, that provides the display image. As the display device is rotated, the display image is automatically oriented to either a landscape orientation or a portrait orientation. Characteristics may be sensed by mechanical sensors, electrical sensors, optical sensors, acoustic sensors, gyroscopic sensors, or a combination thereof. Sensors may be positioned on the display device, a person, or a combination thereof. The display images may include graphic display images, textual display images, video display images, and functional control buttons (e.g., functional displayed representations of control buttons such as play, rewind, stop, scroll). The self-orienting display may also include an authenticator that authenticates a user.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This patent application is a continuation of U.S. patent application Ser. No. 10/412,042 filed Apr. 11, 2003, which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The present invention generally relates to displays and more specifically relates to systems and methods that automatically orient displays.
  • BACKGROUND
  • Display devices are becoming smaller and more portable. Display devices such as flat liquid crystal displays (LCDs) and plasma displays are relatively thin and lightweight. These lightweight, smaller displays are more easily maneuvered than many of the bulkier cathode ray tube (CRT) displays. Due to the increased maneuverability of these displays, viewers are more likely to turn or rotate the display. This is also applicable to the plethora of available hand held display devices such as personal digital assistants (PDAs), cell phones, and games, just to name a few. As the cost of these display devices continues to decrease, and the number of smart devices incorporating these displays increases, more and more users will be using these products to accommodate a variety of needs.
  • However, a problem with current display devices is that the display image becomes difficult to read/see when the display device is turned or rotated. For example, as a hand held PDA is rotated 90°, the display image appears tilted and can be difficult to interpret, or a viewer watching television may decide to lie down, which also makes the display image on the television appear tilted. Furthermore, some multipurpose devices are better suited to display specific display types in specific formats, such as text in traditional portrait orientation and video in landscape orientation.
  • A display device which overcomes these problems is desired.
  • SUMMARY
  • A method for orienting a display image includes sensing at least one characteristic of an object and determining the orientation of the object from at least one of the sensed characteristic(s). A display image is oriented relative to the determined orientation of the object. A system for implementing this method includes a sensor portion and a display processor. The sensor portion senses at least one characteristic of an object and provides a sensor signal indicative of the characteristic(s). The display processor receives the sensor signal and determines the orientation of the object from the sensor signal. The display processor also orients a display image relative to the determined orientation of the object.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other objects, aspects and advantages will be better understood from the following detailed description with reference to the drawings, in which:
  • FIG. 1 is an illustration of a self-orienting display comprising a display device, a display image, a sensor, and optional control buttons in accordance with an exemplary embodiment of the present invention;
  • FIG. 2A is an illustration of a rotated display device not possessing a self-orienting capability;
  • FIG. 2B is an illustration of a rotated self-orienting display device showing the oriented display image portions and control buttons in accordance with an exemplary embodiment of the present invention;
  • FIG. 3 is an illustration of a rotated self-orienting display device showing the oriented display image rotated to achieve an arbitrary orientation in accordance with an exemplary embodiment of the present invention;
  • FIG. 4 is an enlarged illustration of a control button comprising an array of light emitting diodes (LEDs) in accordance with an exemplary embodiment of the present invention;
  • FIG. 5 is an illustration of liquid crystal display (LCD) control buttons in accordance with an exemplary embodiment of the present invention;
  • FIG. 6 is an enlarged illustration of a control button that is automatically oriented by gravity in accordance with an exemplary embodiment of the present invention;
  • FIG. 7 is an enlarged illustration of a control button that is automatically oriented by gravity in accordance with another exemplary embodiment of the present invention;
  • FIG. 8 is an illustration of a self-orienting display showing a viewer viewing the display image, and multiple sensors positioned on the display device, in accordance with an exemplary embodiment of the present invention;
  • FIG. 9 is an illustration of a self-orienting display showing a viewer viewing the display image, and multiple sensors positioned on the viewer, in accordance with an exemplary embodiment of the present invention;
  • FIG. 10 is a functional block diagram of a self-orienting display system comprising a sensor portion, a display processor, a display portion, and an authenticator, in accordance with an exemplary embodiment of the present invention; and
  • FIG. 11 is a flow diagram of an exemplary process for orienting a display in accordance with an exemplary embodiment of the present invention.
  • DETAILED DESCRIPTION
  • A self-orienting display in accordance with the present invention senses the orientation of an object and automatically orients a display image in accordance with the orientation of that object. As described herein, self-orienting includes automatically rotating the display image, along any number of axes, and reformatting it. An exemplary embodiment of this self-orienting display comprises a monitor that automatically orients the display image provided by the monitor to either a landscape orientation or a portrait orientation in response to the orientation of the monitor. However, this is just one of many envisioned embodiments. For example, the display image may be rotated in response to an audio command, such as “rotate”, or the display image may be rotated in response to depression of a switch on the display device. Various embodiments of the self-orienting display include various embodiments of the object, the sensors, the format of the display image, and functions performed by the self-orienting display. For example, objects may include the display device that provides the display image, a person viewing the display image, an object within visual and/or acoustic range of the self-orienting display, or a combination thereof. The display device may be any appropriate device having the capability to provide a display image, such as a monitor, a hand held device, a personal digital assistant (PDA), a cellular telephone having a display, a game device having a display, or a portable computer, for example. Various embodiments of sensors include mechanical sensors, electrical sensors, optical sensors, acoustic sensors, gyroscopic sensors, or a combination thereof. Example sensors include mercury switches, infrared detectors, motion detectors, ultrasonic detectors, cameras, and microphones. Furthermore, sensors may be positioned on the display device, a person, or a combination thereof (e.g., mercury switches attached to the display device and a gyroscopic sensor attached to a headset of a viewer of the display image). Various embodiments of the display image include graphic display images, textual display images, video display images, and functional control buttons (e.g., functional displayed representations of control buttons such as play, rewind, stop, scroll), for example. A more detailed description of these various embodiments is provided below.
  • FIG. 1 is an illustration of a self-orienting display 100 comprising a display device 12, a display image 14, a sensor 16, and optional control buttons 18. The display device 12 may take the form of any appropriate display device capable of providing the display image 14. Examples of appropriate display devices 12 include cathode ray tube (CRT) displays, plasma displays, light emitting diode (LED) displays, flat panel displays, projection displays, wireless devices (e.g., cellular devices including telephones, personal digital assistants (PDAs), portable computers; and devices communicating via an optical link, such as an infrared link), hand held devices (e.g., hand held games or game controllers), televisions, radios, or alarm clocks, just to name a few.
  • The sensor 16 may comprise any type of sensor capable of sensing the orientation of the display device 12 and/or another object (e.g., a person viewing the display image 14). Examples of appropriate sensors include mechanical sensors, electrical sensors, optical sensors, acoustic sensors, gyroscopic sensors, or a combination thereof. Some specific types of sensors 16 include mercury switches, infrared detectors, motion detectors, ultrasonic detectors, cameras, and microphones, or a combination thereof. Note some types of sensors fall into more than one category. For example, a mercury switch may be considered a mechanical sensor and an electrical sensor, or an ultrasonic sensor may be considered an acoustic sensor and an electrical sensor. The sensor 16 may include a single sensor or a plurality of sensors. The sensor 16 may be positioned at various locations on the display device 12 or may be positioned at a single location. For example, sensors 16 may be placed at the corners of the display device 12. Furthermore, sensors 16 may be positioned on the display device, a person, or a combination thereof.
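
As one illustration, not drawn from the patent text, orientation sensing with simple tilt (mercury) switches could be reduced to a small lookup from switch states to one of four coarse orientations; the switch placement and names below are assumptions made for the sketch only.

    # Hypothetical sketch: four tilt (mercury) switches, one along each edge of the
    # display device; the switch on the edge currently facing down closes, which
    # yields a coarse 0/90/180/270 degree device orientation.
    EDGE_TO_ORIENTATION = {"bottom": 0, "left": 90, "top": 180, "right": 270}

    def coarse_orientation(closed_switches: set) -> int:
        for edge, degrees in EDGE_TO_ORIENTATION.items():
            if edge in closed_switches:
                return degrees
        return 0  # default when no switch is closed (e.g., device lying flat)

    # Example: the device has been rotated so that its left edge faces down.
    print(coarse_orientation({"left"}))  # 90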
  • The display image 14 may be in the form of a graphic display image, a textual display image, a video display image, a functional control button 18, or a combination thereof. The display image 14 may comprise display image portions, such as display image portions 14 a and 14 b. As depicted in FIG. 1, a graphic/video display type is provided by the display image portion 14 a and a text display type is provided by the display image portion 14 b. For example, the display image portion 14 a may depict a video and the display image portion 14 b may depict email headers/text. It is to be understood that this depiction is exemplary, and not intended to be limited thereto. For example, the display image 14 may not be partitioned into portions, the display image 14 may be partitioned into a plurality of portions, the display image portions may overlap, the display image portions may provide any combination of display types, or a combination thereof. In some embodiments of the self-orienting display, the control buttons are implemented as display image portions (described in more detail below). It is to be understood, therefore, that reference to display image 14, display image portion 14 a, and/or display image portion 14 b, may also be appropriately interpreted to refer to the control buttons 18 when implemented as display portions. For example, a description of rotation and formatting techniques to be applied to the display image 14 also applies to the control buttons 18 implemented as display portions.
  • The control buttons 18 may comprise any appropriate type of control device capable of controlling functions related to the display image 14 and/or the display device 12. In one embodiment, the control buttons 18 comprise liquid crystal display (LCD) buttons with a protective overlay (e.g., a touch switch). In another embodiment, each control button 18 comprises an array of light emitting diodes (LEDs). In yet another embodiment, each control button 18 comprises a thin disc or the like, formed in a desired shape (e.g., triangle) contained within a liquid. In still another embodiment, the control buttons 18 are weighted such that a portion of each button is always oriented towards the greatest gravitational force. The control buttons 18 control various aspects of the display image 14 and/or the display device 12. Functions controlled by the control buttons 18 may include, for example, playback, pause, stop, rewind, enable/disable back lighting, or a combination thereof. Furthermore, the control buttons 18 may include an orientation button that, when activated, orients the display image 14. For example, one of the control buttons 18 may switch the display image 14 between landscape orientation and portrait orientation each time the button is depressed/touched. In another example, the orientation control button may rotate the display image 14 a predetermined number of degrees each time it is depressed/touched. The control buttons 18 are optional. Thus, various embodiments of the self-orienting display in accordance with the present invention may or may not comprise control buttons.
  • FIG. 2A is an illustration of a rotated display device not possessing a self-orienting capability. FIG. 2B is an illustration of rotated self-orienting display device 12 showing the oriented display image portions 14 a, 14 b, and control buttons 18. The display devices shown in FIG. 2A and FIG. 2B are rotated 90 degrees with respect to the display device 12 shown in FIG. 1. Comparing FIG. 2A with FIG. 2B, the display image portion 14 a of FIG. 2B is rotated by 90° with respect to the equivalent display image portion shown in FIG. 2A. The display image portion 14 b of FIG. 2B is also rotated by 90° with respect to the equivalent display image portion shown in FIG. 2A. The control buttons 18 of FIG. 2B are also rotated by 90° with respect to the equivalent control buttons of FIG. 2A. The rotation of the control buttons 18 is most clearly illustrated by comparing buttons 18 a and 18 b of FIG. 2B with the equivalent control buttons of FIG. 2A. Portions of the display image 14, such as display image portions 14 a and 14 b, along with the control buttons 18, which may also be a display image portion, are rotated and reformatted to conform to the rotated image space of the display device 12. Rotation and formatting may be accomplished by any appropriate technique. For example, a raster scan display image may be rotated by simply transposing the horizontal and vertical deflection values. Formatting may then be accomplished to fit the rotated image within the available display image space to reduce any distortion. For a display utilizing pixels, each array of pixels may be transposed and formatted. Examples of algorithms/techniques for reformatting displays include scaling, stretching, and the ability to dynamically update resolution.
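
As a concrete illustration of the pixel-array rotation and reformatting described above (a sketch only; the nearest-neighbour resampling and the function names are not specified in the patent), a 90 degree rotation can be performed by transposing the pixel array and reversing its rows, followed by a simple scale to fit the rotated image space:

    # Sketch: rotate a pixel array 90 degrees clockwise by transposing and reversing rows,
    # then nearest-neighbour scale it to fit the available (rotated) display image space.
    def rotate_90_clockwise(pixels):
        # pixels is a list of rows; zip(*pixels) transposes rows and columns.
        return [list(row)[::-1] for row in zip(*pixels)]

    def scale_to_fit(pixels, out_height, out_width):
        in_height, in_width = len(pixels), len(pixels[0])
        return [
            [pixels[r * in_height // out_height][c * in_width // out_width]
             for c in range(out_width)]
            for r in range(out_height)
        ]

    image = [[1, 2, 3],
             [4, 5, 6]]                    # a 2 x 3 "landscape" image
    rotated = rotate_90_clockwise(image)   # now 3 x 2, "portrait"
    fitted = scale_to_fit(rotated, 6, 4)   # stretched to fill a 6 x 4 portrait image space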
  • In one embodiment of the present invention, the display image 14 is oriented with respect to the orientation of the display device 12. As the display device 12 oriented as shown in FIG. 1 is rotated, the display image 14 is automatically oriented, such that the display image 14 appears to remain approximately stable regardless of the orientation of the display device 12. Thus, if a viewer prefers landscape mode, she can rotate the display device 12 to achieve the orientation shown in FIG. 1. If the viewer prefers portrait mode, she can rotate the display device 12 to achieve the orientation shown in FIG. 2B. Note that even though the appearance of the display image 14, relative to a viewer, remains approximately constant, the display image 14 is actually oriented (rotated and formatted) in response to the orientation of the display device 12.
  • In another embodiment, the relative orientation between the display image 14 and a viewer (See FIG. 8 for depiction of a viewer 36) is approximately constant. Thus, if a viewer tilts her head, the display image 14 is tilted in the same direction, such that the orientation between the viewer and the display image 14 is approximately constant (fixed). As shown in FIG. 1 and FIG. 2B, the display image 14 is rotated to achieve a landscape orientation or a portrait orientation. However, orientation of the display image 14 is not limited thereto.
  • FIG. 3 is an illustration of a rotated self-orienting display device 12 showing the oriented display image 14 rotated to achieve an arbitrary orientation. The display image 14 of FIG. 3 is automatically rotated such that the relative orientation between a viewer and the display image 14 is approximately constant, regardless of the amount by which the display device 12 is rotated. Orientation of the display image 14 and/or the control buttons 18 is not limited to rotation in a single dimension (e.g., plane). The display image 14 may be oriented in one, two, or three dimensions, as indicated by the three dimensional set of axes 15. For example, a three dimensional depiction on the display image 14 may be rotated horizontally, vertically, or a combination thereof as the display device 12 is rotated, such that the relative orientation between a viewer and the display image 14 remains approximately the same.
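
One way to realize this behaviour (a sketch under the assumption that a gyroscopic or gravity sensor reports the device's roll angle in degrees; the function names are illustrative only) is to rotate the display image by the negative of the sensed device roll, so that the image stays level with respect to the viewer:

    import math

    # Sketch: keep the display image level by counter-rotating it against the sensed device roll.
    def compensating_rotation(device_roll_degrees: float) -> float:
        # Rotating the image by the opposite of the device roll keeps its apparent
        # orientation (relative to the viewer or to gravity) approximately constant.
        return -device_roll_degrees % 360.0

    def rotate_point(x: float, y: float, degrees: float):
        # Rotate one image coordinate about the display centre (0, 0).
        rad = math.radians(degrees)
        return (x * math.cos(rad) - y * math.sin(rad),
                x * math.sin(rad) + y * math.cos(rad))

    # Example: the device has been tilted 37 degrees clockwise.
    angle = compensating_rotation(37.0)   # 323 degrees, i.e. 37 degrees the other way
    print(rotate_point(10.0, 0.0, angle))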
  • In another embodiment, the three dimensional display image 14 is oriented to provide a desired perspective to the viewer. This may be accomplished by the viewer simply turning and/or shifting her head to view the desired perspective, turning the display device 12 to view the desired perspective, or a combination thereof. As explained in more detail below, sensors 16 can be positioned on the viewer 36 and/or on the display device 12 to sense the orientation of the viewer and/or display device 12. For example, assume the three dimensional display image 14 is a cube and the display device 12 is a hand held display device. Also assume the viewer is viewing a front side of the cube. If the viewer desires to view the left side of the cube, she may simply rotate the hand held display device (e.g., to the right) to view the left side of the cube. She may also turn her head (e.g., to the right and/or shift her head to the left), as if the cube were physically in front of her and she positioned herself to look at the left side.
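
For the cube example above, a minimal sketch (the vertex list, the source of the yaw angle, and the rotation about the vertical axis are all assumptions, not details from the patent) of turning the three dimensional depiction as the device or the viewer's head turns:

    import math

    # Sketch: turn the handheld device (or the viewer's head) by yaw_degrees and rotate a
    # three dimensional depiction (here, a unit cube) by the same amount so that a
    # different side of the cube faces the viewer.
    CUBE_VERTICES = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

    def rotate_yaw(vertex, yaw_degrees):
        x, y, z = vertex
        rad = math.radians(yaw_degrees)
        # Rotation about the vertical (y) axis: turning the device 90 degrees to the right
        # brings the cube's left side around to face the viewer.
        return (x * math.cos(rad) + z * math.sin(rad),
                y,
                -x * math.sin(rad) + z * math.cos(rad))

    rotated_cube = [rotate_yaw(v, 90.0) for v in CUBE_VERTICES]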
  • As mentioned above, various embodiments of the control buttons 18 are envisioned. FIG. 4 is an enlarged illustration of a control button 18 b comprising an array of light emitting diodes (LEDs). An exemplary control button 18 b is expanded to show the array of LEDs utilized to display the shape corresponding to the control function performed by the button. The control button 18 b comprises a triangle shaped image, which may signify play, for example. The LEDs may be various colors. Upon the orientation of the display device 12 being sensed, the images of the control buttons 18, including 18 b, are rotated accordingly. In one embodiment, the array of LEDs is symmetric, thus allowing the control button image to be rotated between landscape and portrait mode by transposing the array of LEDs.
  • FIG. 5 is an illustration of liquid crystal display (LCD) control buttons 18. The control buttons 18 in FIG. 5 comprise LCD portions for displaying the shape corresponding to the control function performed by the button. As shown in FIG. 5, the LCD portions are covered with an appropriate overlay 20 to protect the LCD portions and to provide a surface which can be touched/depressed to utilize the control buttons 18. Upon the orientation of the display device 12 being sensed, the LCD images of the control buttons 18 are rotated accordingly.
  • FIG. 6 is an enlarged illustration of a control button 18 b that is automatically oriented by gravity, wherein the control button comprises a liquid portion 22 having a shaped disc 24 contained therein. The exemplary control button 18 b is automatically oriented by gravity when the display device 12 is rotated. The control button 18 b of FIG. 6 comprises a liquid portion 22 contained within the control button 18 b. Within the liquid portion 22 is contained the disc 24, formed in a shape corresponding to the control function performed by the button. The disc 24 is triangular shaped, indicating the play function, for example. The disc 24 is suspended in the liquid portion 22. As the display device 12 is rotated the disc 24 automatically rotates, thus resulting in self-orientation of the control button 18 b. In one embodiment, the disc 24 is weighted such that a specific portion 28 of the disc 24 is always pointed in the direction of the strongest gravitational pull (e.g., down). The arrow 28 depicts a portion of the disc 24 that is heavier (more mass) such that the portion 28 is always facing “down” (toward the strongest gravitational attractive force). In another embodiment, the disc 24 contains an air bubble 26 (or other appropriate gas or liquid portion) such that the portion with the lesser mass is always facing “up” (away from the direction of the strongest gravitational pull). The bubble 26 may be any portion comprising a gas or a liquid that is less dense than the liquid in the liquid portion 22. Other types of control buttons 18 that are automatically oriented by gravity are envisioned.
  • FIG. 7 is an enlarged illustration of a control button 18 b that is automatically oriented by gravity, wherein the control button 18 b comprises bearings 34. Self-orientation of the control button 18 b of FIG. 7 is achieved via gravity in a similar manner as described above with respect to FIG. 6. However, the disc 24 is contained within the control button 18 b by bearings, or the like, which allow the disc 24 to freely rotate. Again, the control button 18 b shown in FIG. 7 may comprise a weighted portion 28, a less dense portion 26, or a combination thereof, similar to the control button 18 b shown in FIG. 6. Also, other mechanisms are envisioned for providing a self-orienting display that is automatically oriented via gravity. For example, the control buttons 18 may be attached to spindles or axles that allow the control buttons 18 to freely rotate.
  • FIG. 8 is an illustration of a self-orienting display showing a viewer 36 viewing the display image 14, and multiple sensors 16 a, 16 b, and 16 c positioned on the display device 12. As mentioned above, the sensors 16 may comprise any appropriate type and combination of sensors capable of sensing the orientation of an object. Examples include known types of devices such as mercury switches, gyroscopic sensors/devices, gravity switches/devices, optical detectors (e.g., infrared detectors), acoustic sensors/devices (e.g., ultrasonic devices, acoustic microphones), electrical sensors/devices, magnetic devices/sensors, and cameras. The sensors 16 may be positioned on the display device 12 and/or on the viewer 36. Thus, the sensors 16 may be positioned on an object, wherein the object may comprise the display device 12, the viewer 36, another object within sensing range of the sensors 16, or a combination thereof.
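For illustration only, a common interface such as the one sketched below could sit in front of any of the sensor types listed above; the interface name, the degrees-based reading, and the mercury-switch example are assumptions made for the sketch, not part of the description.

```python
from typing import Protocol

class OrientationSensor(Protocol):
    """Role any of the sensor types listed above (mercury switch, gyroscope,
    camera, microphone, ...) could play: report an orientation reading, in
    degrees, for the object it senses."""

    def read_orientation(self) -> float: ...

class MercurySwitchArray:
    """Hypothetical four-switch array that only resolves the four 90 degree
    device orientations."""

    def __init__(self, closed_switch_index: int):
        self.closed_switch_index = closed_switch_index

    def read_orientation(self) -> float:
        return (self.closed_switch_index % 4) * 90.0

sensor: OrientationSensor = MercurySwitchArray(closed_switch_index=1)
print(sensor.read_orientation())  # 90.0
```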
  • For purposes of explaining the following exemplary embodiment, the sensors 16 positioned on the display device 12 in FIG. 8 are a camera 16 a, an acoustic sensor (e.g., microphone) 16 b, and mercury switches 16 c. The camera 16 a may comprise any appropriate type of camera, such as a camera utilizing a charge coupled device (CCD), or an infrared camera (e.g., night vision), for example. The camera 16 a senses the orientation of the viewer's 36 head. In response to the sensed orientation of the viewer's 36 head, the display image 14 is automatically oriented by any of the techniques/devices described herein.
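A minimal sketch of this idea follows, assuming a face detector (not part of the description above) reports eye positions in camera pixel coordinates; the display image is then counter-rotated by the estimated head roll. All names, values, and coordinate conventions are hypothetical.

```python
import math

def head_roll_degrees(left_eye, right_eye):
    """Estimate the roll of the viewer's head from two eye positions given in
    camera pixel coordinates (origin at the top-left, y increasing downward).
    A level head gives roughly 0 degrees."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

def display_counter_rotation(head_roll: float) -> float:
    """Rotate the display image opposite to the head roll so the image keeps
    its apparent orientation for the viewer."""
    return -head_roll

# Hypothetical eye positions as a face detector might report them.
roll = head_roll_degrees(left_eye=(120, 240), right_eye=(200, 220))
print(display_counter_rotation(roll))
```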
  • To facilitate automatic self-orientation, in one embodiment, the relative orientation between the display image 14 and the object is initialized. This may include initialization of the relative orientation between the display image 14 and the display device 12, the relative orientation between the display image 14 and the viewer 36, or a combination thereof. For example, the relative orientation between the viewer 36 and the display image 14 is initialized. To generate the initial relative orientation, the viewer 36 may position herself in front of the display image 14, such that she is within sensing range of the sensors 16 (e.g., optical range of the camera 16 a and/or audio range of the microphone 16 b). While observing her depiction on the display image 14, the viewer may position her head to align the depiction to be centered in the display image 14, for example. Once the viewer is satisfied that the relative orientation is as desired, she may initialize this relative orientation. All subsequent automatic orientation will be with respect to the initial relative orientation. Initialization may be accomplished by any appropriate means, such as activating a switch, depressing a button (e.g., a control button 18), giving an audible command, waiting a period of time, or a combination thereof. In one exemplary embodiment, the viewer gives an audio command, such as "align". The microphone 16 b receives this audio command, and transduces the audio command into a sense signal. This sense signal is utilized to establish the baseline relative orientation between the display image 14 and the viewer 36. Thus, the viewer 36 may rotate the display device 12 to either landscape or portrait orientation. The mercury switches 16 c sense the orientation of the display device 12, also providing a sense signal. The sense signal provided by the mercury switches 16 c and the sense signal provided by the camera 16 a during initialization are utilized to establish the baseline relative orientation. The sense signals are also utilized to orient the display image 14 as the display device 12 and/or the viewer 36 change orientation. Also, the display image 14 may be oriented via an audio command, such as "rotate", in response to which the display image 14 is rotated (e.g., 90°). It is to be understood that various combinations of sensors 16, and placements thereof, are envisioned. For example, as shown in FIG. 9, sensors 16 may be positioned on the viewer 36.
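The baseline idea can be sketched as follows, assuming the device and head orientations are available as angles in degrees; the values captured at the "align" command are subtracted from later readings so the display image 14 holds its initialized orientation. The class and attribute names are hypothetical.

```python
class OrientationBaseline:
    """Track the relative orientation between the display image and an object
    (device and/or viewer), as established during initialization."""

    def __init__(self):
        self.device_baseline = 0.0
        self.head_baseline = 0.0

    def initialize(self, device_angle: float, head_angle: float) -> None:
        """Capture the baseline when the viewer issues the 'align' command."""
        self.device_baseline = device_angle
        self.head_baseline = head_angle

    def display_rotation(self, device_angle: float, head_angle: float) -> float:
        """Rotation to apply to the display image so it appears to keep the
        initialized orientation as the device and/or viewer move."""
        return -((device_angle - self.device_baseline) +
                 (head_angle - self.head_baseline))

baseline = OrientationBaseline()
baseline.initialize(device_angle=0.0, head_angle=2.0)                 # viewer says "align"
print(baseline.display_rotation(device_angle=90.0, head_angle=2.0))   # -90.0
```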
  • In yet another embodiment, the viewer 36 is authenticated. Authentication may be accomplished by analyzing the sensed image, which is sensed by the camera 16 a, to determine if the viewer is authorized to use the display device 12. The sensed image may include a retinal scan, a finger print scan, or the like. The sensed image is analyzed to determine if authorization is appropriate. Any appropriate technique may be utilized to analyze the sensed image. For example, the sensed image may be compared to a stored representation of an authorized image, or the sensed image may be analyzed for key features which distinguish an authorized sensed image, or a combination thereof. In another embodiment, the viewer is authenticated by analyzing a sensed acoustic signal received by the acoustic sensor 16 b. The acoustic signal may include a key phrase, such as the viewer's 36 name or a password. The sensed acoustic signal is analyzed to determine if authorization is appropriate. Any appropriate technique may be utilized to analyze the sensed acoustic signal. For example, the sensed acoustic signal may be compared to a stored representation of an authorized acoustic signal, or the sensed acoustic signal may be analyzed for key features which distinguish an authorized sensed acoustic signal (e.g., an acoustic signature), or a combination thereof.
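As a hedged illustration of the comparison step, the sketch below scores a sensed feature vector (e.g., one derived from a retinal image or a voice sample) against a stored representation of an authorized viewer using cosine similarity; an actual biometric matcher would be far more elaborate, and the feature values and threshold here are arbitrary.

```python
import numpy as np

def is_authorized(sensed_features, enrolled_features, threshold=0.9):
    """Compare a sensed feature vector to the stored representation of an
    authorized viewer; a cosine similarity above the threshold counts as a
    match."""
    sensed = np.asarray(sensed_features, dtype=float)
    enrolled = np.asarray(enrolled_features, dtype=float)
    similarity = sensed @ enrolled / (np.linalg.norm(sensed) * np.linalg.norm(enrolled))
    return similarity >= threshold

enrolled = [0.11, 0.87, 0.42, 0.05]   # stored representation of the authorized viewer
print(is_authorized([0.10, 0.90, 0.40, 0.06], enrolled))  # True (vectors nearly parallel)
print(is_authorized([0.90, 0.05, 0.10, 0.80], enrolled))  # False
```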
  • FIG. 10 is a functional block diagram of a self-orienting display system comprising a sensor portion 40, a display processor 42, a display portion 44, and an optional authenticator 46. The sensor portion 40 may comprise any combination of the sensors described above. The sensor portion 40 senses at least one characteristic of an object. For example, the object may be the display device (e.g., display device 12) and the characteristic may include the orientation of the display device; the object may be a user (e.g., viewer 36) and the characteristic may include an image of a portion of the user's body (e.g., retina, finger print); the object may be a user and the characteristic may include an acoustic signal provided by the user (e.g., voice); or a combination thereof. The sensor portion 40 provides a sensor signal 48 indicative of the sensed characteristic (or characteristics) of the object. The display processor 42 receives the sensor signal 48 and processes the sensor signal 48 to determine the orientation of the sensed characteristic(s). The display processor 42 provides an orientation signal 50 indicative of the determined orientation. The display portion 44 (e.g., the display device 12) receives the orientation signal 50 and orients a display image (e.g., display image 14) in accordance with the determined orientation. In one embodiment, the self-orienting display system comprises the authenticator 46 for authenticating the object by analyzing the sensed characteristic(s) of the object. The authenticator 46 receives the sensor signal 48 and analyzes the sensed characteristic(s) using any of the analysis techniques described above. The sensor signal 48 and the orientation signal 50 may be provided by any appropriate means, such as electrically, acoustically, optically, electromagnetically, or a combination thereof.
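A minimal sketch of the signal flow of FIG. 10 follows, assuming the sensor signal 48 carries a device angle (and optionally viewer features), that the display processor 42 snaps to the nearest 90° orientation, and that the authenticator 46 is an optional callable; the data types and the snapping rule are assumptions, not the claimed design.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class SensorSignal:
    """Sensor signal 48: the sensed characteristic(s) of the object."""
    device_angle: float
    viewer_features: Optional[list] = None

def display_processor(signal: SensorSignal) -> float:
    """Display processor 42: turn the sensor signal into an orientation signal 50
    by snapping to the nearest 90 degree (portrait/landscape) orientation."""
    return round(signal.device_angle / 90.0) * 90.0 % 360.0

def display_portion(orientation: float) -> None:
    """Display portion 44: orient the display image per the orientation signal."""
    print(f"rotating display image by {orientation} degrees")

def run(signal: SensorSignal,
        authenticator: Optional[Callable[[SensorSignal], bool]] = None) -> None:
    if authenticator is not None and not authenticator(signal):
        return  # authenticator 46 rejected the viewer; leave the image unchanged
    display_portion(display_processor(signal))

run(SensorSignal(device_angle=87.0))   # -> rotating display image by 90.0 degrees
```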
  • FIG. 11 is a flow diagram of an exemplary process for self-orienting a display. The object is sensed at step 54. As described above, the object may be a person, the display device, or a combination thereof. The sensed characteristic may include an orientation of the object, an image of a portion of the object (e.g., retina or fingerprint), an acoustic signal (e.g., voice or clap), or a combination thereof. The object may be sensed by any combination of the sensors described above, such as optical sensors, mechanical sensors, gravity sensors, gyroscopic sensors, electromagnetic sensors, acoustic sensors, and touch sensitive sensors (e.g., control buttons 18), for example. At step 56, the object is authenticated as described above. The step of authentication is optional. The relative orientation between the object and the display image is initialized at step 58. The step of initialization is also optional. Initialization may be accomplished as described above. At step 60, the orientation of the object is determined utilizing the sensed characteristic (or characteristics) of the object. The display image is oriented with respect to the determined orientation of the object at step 62. As described above, the display image may be oriented to predetermined orientations, such as portrait, landscape, rotation by a predetermined number of degrees, or a combination thereof. The display image may also be oriented such that the orientation of the display image appears approximately constant (e.g., fixed) regardless of the orientation of the object. For example, a display image will appear to rotate and/or tilt in the opposite direction of the rotation and/or tilt of the display device.
  • A method for self-orienting a display image as described herein may be embodied in the form of computer-implemented processes and systems for practicing those processes. A method for self-orienting a display image as described herein may also be embodied in the form of computer program code embodied in tangible media, such as floppy diskettes, read only memories (ROMs), CD-ROMs, hard drives, high density disks, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a system for practicing the invention. The method for self-orienting a display image as described herein may also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a system for practicing the invention. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits.
  • Although illustrated and described herein with reference to certain specific embodiments, the system and method for orienting a display as described herein are nevertheless not intended to be limited to the details shown. Rather, various modifications may be made in the details within the scope and range of equivalents of the claims and without departing from the spirit of the invention.

Claims (18)

1. A method of reorienting a display image portion of a display image rendered by a display device, the method comprising:
rendering a first viewing perspective of the display image portion;
sensing an object other than the display device;
sensing a reorientation of the object; and
responsive to sensing the reorientation of the object, reorienting the display image portion from the first viewing perspective to a second viewing perspective,
wherein the display image portion is rendered by the display device as it is reoriented from the first viewing perspective to the second viewing perspective.
2. The method of claim 1, wherein reorienting the display image portion comprises rotating the display image portion.
3. The method of claim 2, wherein the display image portion is rotated about an axis that is parallel with respect to a plane of the display image.
4. The method of claim 3, wherein the reorientation of the object comprises activation of a button on the display device.
5. The method of claim 4, wherein the button is a second display image portion of the display image comprising a display representation of a functional control.
6. The method of claim 1, wherein the object comprises a portion of a body of a viewer of the display device and reorientation of the object comprises repositioning of the object with respect to the display image.
7. A system for reorienting a display image portion of a display image rendered by a display device, the system comprising:
a sensor portion configured to sense a reorientation of an object other than the display device and provide a sensor signal indicative of the reorientation; and
a display processor configured to:
receive the sensor signal;
determine a second viewing perspective for the display image portion based upon the sensor signal;
reorient the display image portion from a first viewing perspective to the second viewing perspective; and
render the display image portion on the display device as it is reoriented.
8. The system of claim 7, wherein reorienting the display image portion comprises rotating the display image portion.
9. The system of claim 8, wherein the display image portion is rotated about an axis that is parallel with respect to a plane of the display image.
10. The system of claim 9, wherein the reorientation of the object comprises activation of a button on the display device.
11. The system of claim 10, wherein the button is a second display image portion of the display image comprising a display representation of a functional control.
12. The system of claim 7, wherein the object comprises a portion of a body of a viewer of the display device and reorientation of the object comprises repositioning of the object with respect to the display image.
13. A computer readable storage medium having program code stored thereon that when executed by a processor causes a display device to reorient a display image portion of a display image rendered by the display device, the program code comprising:
a sense object code segment that causes one or more sensors to sense a reorientation of an object other than the display device;
a determine orientation code segment that causes the processor to determine a second viewing perspective for the display image portion based upon the sensed reorientation; and
an orient code segment that causes the display device to reorient the display image portion from a first viewing perspective to the determined second viewing perspective,
wherein the display image portion is rendered by the display device as it is reoriented.
14. The computer readable storage medium of claim 13, wherein reorienting the display image portion comprises rotating the display image portion.
15. The computer readable storage medium of claim 14, wherein the display image portion is rotated about an axis that is parallel with respect to a plane of the display image.
16. The computer readable storage medium of claim 15, wherein the reorientation of the object comprises activation of a button on the display device.
17. The computer readable storage medium of claim 16, wherein the button is a second display image portion of the display image comprising a display representation of a functional control.
18. The computer readable storage medium of claim 13, wherein the object comprises a portion of a body of a viewer of the display device and reorientation of the object comprises repositioning of the object with respect to the display image.
US12/974,173 2003-04-11 2010-12-21 Self-orienting display Abandoned US20110084984A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/974,173 US20110084984A1 (en) 2003-04-11 2010-12-21 Self-orienting display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/412,042 US20040201595A1 (en) 2003-04-11 2003-04-11 Self-orienting display
US12/974,173 US20110084984A1 (en) 2003-04-11 2010-12-21 Self-orienting display

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/412,042 Continuation US20040201595A1 (en) 2003-04-11 2003-04-11 Self-orienting display

Publications (1)

Publication Number Publication Date
US20110084984A1 true US20110084984A1 (en) 2011-04-14

Family

ID=33131136

Family Applications (5)

Application Number Title Priority Date Filing Date
US10/412,042 Abandoned US20040201595A1 (en) 2003-04-11 2003-04-11 Self-orienting display
US10/987,859 Expired - Fee Related US7626598B2 (en) 2003-04-11 2004-11-12 Self-orienting display
US12/974,327 Abandoned US20110090256A1 (en) 2003-04-11 2010-12-21 Self-orienting display
US12/974,173 Abandoned US20110084984A1 (en) 2003-04-11 2010-12-21 Self-orienting display
US14/520,026 Abandoned US20150062180A1 (en) 2003-04-11 2014-10-21 Self-orienting display

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US10/412,042 Abandoned US20040201595A1 (en) 2003-04-11 2003-04-11 Self-orienting display
US10/987,859 Expired - Fee Related US7626598B2 (en) 2003-04-11 2004-11-12 Self-orienting display
US12/974,327 Abandoned US20110090256A1 (en) 2003-04-11 2010-12-21 Self-orienting display

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/520,026 Abandoned US20150062180A1 (en) 2003-04-11 2014-10-21 Self-orienting display

Country Status (1)

Country Link
US (5) US20040201595A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040201595A1 (en) * 2003-04-11 2004-10-14 Microsoft Corporation Self-orienting display
US20080317441A1 (en) * 2003-03-06 2008-12-25 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player

Families Citing this family (152)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2527829C (en) 2003-05-30 2016-09-27 Privaris, Inc. A man-machine interface for controlling access to electronic devices
US20040245334A1 (en) * 2003-06-06 2004-12-09 Sikorski Steven Maurice Inverted terminal presentation scanner and holder
CN100392569C (en) * 2003-07-28 2008-06-04 日本电气株式会社 Mobile information terminal
KR100538948B1 (en) * 2003-08-11 2005-12-27 삼성전자주식회사 Display device of portable apparatus capable of accommodately displaying image
JP2005086252A (en) * 2003-09-04 2005-03-31 Nikon Corp Portable terminal
FI117217B (en) * 2003-10-01 2006-07-31 Nokia Corp Enforcement and User Interface Checking System, Corresponding Device, and Software Equipment for Implementing the Process
KR101124826B1 (en) * 2003-10-22 2012-03-26 교세라 가부시키가이샤 Mobile telephone apparatus, display method, and computer readable recording medium having program
US20050140696A1 (en) * 2003-12-31 2005-06-30 Buxton William A.S. Split user interface
TWI234754B (en) * 2004-02-12 2005-06-21 Benq Corp Display
JP2005277452A (en) * 2004-03-22 2005-10-06 Nec Corp Portable electronic apparatus and its display switching method
US20050222801A1 (en) 2004-04-06 2005-10-06 Thomas Wulff System and method for monitoring a mobile computing product/arrangement
TWI248043B (en) * 2004-04-20 2006-01-21 Wistron Corp Electrical device capable of auto-adjusting display direction as a tilt of a display
CN1961289A (en) * 2004-06-04 2007-05-09 皇家飞利浦电子股份有限公司 A hand-held device for content navigation by a user
US20050276164A1 (en) * 2004-06-12 2005-12-15 Scott Amron Watch adapted to rotate a displayed image so as to appear in a substantially constant upright orientation
US20060020904A1 (en) * 2004-07-09 2006-01-26 Antti Aaltonen Stripe user interface
TW200622893A (en) * 2004-07-09 2006-07-01 Nokia Corp Cute user interface
KR100651938B1 (en) * 2004-08-16 2006-12-06 엘지전자 주식회사 apparatus, method and medium for controlling image orientation
KR100677569B1 (en) * 2004-12-13 2007-02-02 삼성전자주식회사 3D image display apparatus
US20060176278A1 (en) * 2005-02-10 2006-08-10 Motorola, Inc. Method and system for display orientation
JP2006313313A (en) * 2005-04-06 2006-11-16 Sony Corp Reproducing device, setting switching method, and setting switching device
KR100566184B1 (en) * 2005-04-22 2006-03-29 삼성전자주식회사 Method for displaying image in a mobile communication terminal and the mobile communication terminal
BRPI0613165A2 (en) 2005-05-17 2010-12-21 Gesturetek Inc signal output sensitive to orientation
CN101506760A (en) * 2005-05-27 2009-08-12 夏普株式会社 Display device
US7822513B2 (en) * 2005-07-27 2010-10-26 Symbol Technologies, Inc. System and method for monitoring a mobile computing product/arrangement
US7884836B2 (en) * 2005-08-30 2011-02-08 Ati Technologies Ulc Notifying a graphics subsystem of a physical change at a display device
US7431216B2 (en) 2005-11-16 2008-10-07 Sony Ericsson Mobile Communications Ab Methods for presenting parameter status information and related portable electronic devices and parameters
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US20070254729A1 (en) * 2006-05-01 2007-11-01 Freund Joseph M Handheld portable electronic device with integrated stand
US8594742B2 (en) * 2006-06-21 2013-11-26 Symbol Technologies, Inc. System and method for monitoring a mobile device
US20070297028A1 (en) * 2006-06-21 2007-12-27 Thomas Wulff System and device for monitoring a computing device
JP4811173B2 (en) * 2006-07-28 2011-11-09 日本電気株式会社 Portable terminal device, timer control method, and timer control program
US8139026B2 (en) 2006-08-02 2012-03-20 Research In Motion Limited System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device
US8493323B2 (en) * 2006-08-02 2013-07-23 Research In Motion Limited System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device
US7956849B2 (en) 2006-09-06 2011-06-07 Apple Inc. Video manager for portable multifunction device
US7864163B2 (en) 2006-09-06 2011-01-04 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8842074B2 (en) * 2006-09-06 2014-09-23 Apple Inc. Portable electronic device performing similar operations for different gestures
US8797267B2 (en) * 2006-09-12 2014-08-05 Virgin Mobile Usa, L.P. Virtual hard keys on a wireless device
US8235724B2 (en) * 2006-09-21 2012-08-07 Apple Inc. Dynamically adaptive scheduling system
US8956290B2 (en) * 2006-09-21 2015-02-17 Apple Inc. Lifestyle companion system
US8745496B2 (en) 2006-09-21 2014-06-03 Apple Inc. Variable I/O interface for portable media device
US8001472B2 (en) 2006-09-21 2011-08-16 Apple Inc. Systems and methods for providing audio and visual cues via a portable electronic device
US8429223B2 (en) 2006-09-21 2013-04-23 Apple Inc. Systems and methods for facilitating group activities
US20080077489A1 (en) * 2006-09-21 2008-03-27 Apple Inc. Rewards systems
US20080076972A1 (en) * 2006-09-21 2008-03-27 Apple Inc. Integrated sensors for tracking performance metrics
JP2008076818A (en) * 2006-09-22 2008-04-03 Fujitsu Ltd Mobile terminal device
EP2084590B1 (en) * 2006-11-10 2017-01-04 Draeger Medical Systems, Inc. A system for adaptively orienting a display image on a device
KR101122090B1 (en) * 2006-11-14 2012-03-15 엘지전자 주식회사 Mobile communication terminal having function for displaying information and information displaying method thereof
EP1929976A1 (en) * 2006-12-07 2008-06-11 Swiss Medical Technology GmbH Support system and method for viewer-dependent positioning and alignment of a display unit
US7706579B2 (en) * 2006-12-21 2010-04-27 Sony Ericsson Communications Ab Image orientation for display
US8214768B2 (en) * 2007-01-05 2012-07-03 Apple Inc. Method, system, and graphical user interface for viewing multiple application windows
US7877707B2 (en) * 2007-01-06 2011-01-25 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US8519964B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US20080165148A1 (en) * 2007-01-07 2008-07-10 Richard Williamson Portable Electronic Device, Method, and Graphical User Interface for Displaying Inline Multimedia Content
US20080181502A1 (en) * 2007-01-31 2008-07-31 Hsin-Ming Yang Pattern recognition for during orientation of a display device
JP2008281622A (en) * 2007-05-08 2008-11-20 Univ Of Electro-Communications Liquid crystal television set and display device
US7742783B2 (en) * 2007-05-10 2010-06-22 Virgin Mobile Usa, L.P. Symmetric softkeys on a mobile electronic device
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US8068121B2 (en) 2007-06-29 2011-11-29 Microsoft Corporation Manipulation of graphical objects on a display or a proxy device
WO2009027895A1 (en) * 2007-08-28 2009-03-05 Nxp B.V. Image display device
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
US20090066506A1 (en) * 2007-09-07 2009-03-12 Niizawa Derek T Electronic device with circuitry operative to change an orientation of an indicator and method for use therewith
US8159352B2 (en) * 2007-09-11 2012-04-17 Colgate-Palmolive Company Personal care implement having a display
CN101809581B (en) 2007-09-24 2014-12-10 苹果公司 Embedded authentication systems in an electronic device
DE102007050296A1 (en) * 2007-10-22 2009-04-23 Robert Bosch Gmbh Distance measuring device and use
US9202444B2 (en) * 2007-11-30 2015-12-01 Red Hat, Inc. Generating translated display image based on rotation of a display device
KR101010459B1 (en) * 2007-12-28 2011-01-21 엘지전자 주식회사 Apparatus for displaying mark of an display device and display device
US20090187102A1 (en) * 2008-01-21 2009-07-23 Gerois Di Marco Method and apparatus for wide-screen medical imaging
US20090221890A1 (en) * 2008-02-28 2009-09-03 Daniel Saffer Diabetes Management System
US20090237420A1 (en) * 2008-03-22 2009-09-24 Lawrenz Steven D Automatically conforming the orientation of a display signal to the rotational position of a display device receiving the display signal
US8605091B2 (en) * 2008-04-18 2013-12-10 Leviton Manufacturing Co., Inc. Enhanced power distribution unit with self-orienting display
JP4971241B2 (en) * 2008-05-09 2012-07-11 株式会社リコー Image display device
US9253416B2 (en) * 2008-06-19 2016-02-02 Motorola Solutions, Inc. Modulation of background substitution based on camera attitude and motion
US8454436B2 (en) * 2008-06-26 2013-06-04 Wms Gaming Inc. Gaming machine with movable display screen
US20100062833A1 (en) * 2008-09-10 2010-03-11 Igt Portable Gaming Machine Emergency Shut Down Circuitry
KR101602363B1 (en) * 2008-09-11 2016-03-10 엘지전자 주식회사 3 Controling Method of 3 Dimension User Interface Switchover and Mobile Terminal using the same
WO2010030984A1 (en) * 2008-09-12 2010-03-18 Gesturetek, Inc. Orienting a displayed element relative to a user
KR101500741B1 (en) * 2008-09-12 2015-03-09 옵티스 셀룰러 테크놀로지, 엘엘씨 Mobile terminal having a camera and method for photographing picture thereof
CN102203850A (en) * 2008-09-12 2011-09-28 格斯图尔泰克公司 Orienting displayed elements relative to a user
US20100088532A1 (en) * 2008-10-07 2010-04-08 Research In Motion Limited Method and handheld electronic device having a graphic user interface with efficient orientation sensor use
US20100153313A1 (en) * 2008-12-15 2010-06-17 Symbol Technologies, Inc. Interface adaptation system
JP2010193031A (en) * 2009-02-17 2010-09-02 Olympus Imaging Corp Photographic apparatus and method for controlling the same
US8355031B2 (en) * 2009-03-17 2013-01-15 Harris Corporation Portable electronic devices with adjustable display orientation
TWI383315B (en) * 2009-03-27 2013-01-21 Wistron Corp Computer screen displaying method, computer having a vertical displaying device, storage medium having a bios stored therein, and computer program product
KR20100129416A (en) * 2009-06-01 2010-12-09 삼성전자주식회사 Method for operating input mode of mobile terminal comprising a plurality of input means
US8271898B1 (en) 2009-06-04 2012-09-18 Mellmo Inc. Predictive scrolling
US20100309228A1 (en) * 2009-06-04 2010-12-09 Camilo Mattos Displaying Multi-Dimensional Data Using a Rotatable Object
US8817048B2 (en) * 2009-07-17 2014-08-26 Apple Inc. Selective rotation of a user interface
EP2280331B1 (en) * 2009-07-22 2018-10-31 BlackBerry Limited Display orientation change for wireless devices
US9305232B2 (en) * 2009-07-22 2016-04-05 Blackberry Limited Display orientation change for wireless devices
US20110080809A1 (en) * 2009-10-07 2011-04-07 Michele Berman Personalized Children's Multimedia Picture Alarm
US8438504B2 (en) 2010-01-06 2013-05-07 Apple Inc. Device, method, and graphical user interface for navigating through multiple viewing areas
US8736561B2 (en) 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US8913009B2 (en) 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US8317615B2 (en) 2010-02-03 2012-11-27 Nintendo Co., Ltd. Display device, game system, and game method
US8814686B2 (en) 2010-02-03 2014-08-26 Nintendo Co., Ltd. Display device, game system, and game method
US8339364B2 (en) 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US20110193857A1 (en) * 2010-02-05 2011-08-11 Vasily Filippov Methods and apparatus for rendering a collection of widgets on a mobile device display
US20110197165A1 (en) * 2010-02-05 2011-08-11 Vasily Filippov Methods and apparatus for organizing a collection of widgets on a mobile device display
CN102148011A (en) * 2010-02-08 2011-08-10 鸿富锦精密工业(深圳)有限公司 Interactive type image display method
US8792683B2 (en) * 2010-06-04 2014-07-29 Blackberry Limited Fingerprint scanning with optical navigation
US8528072B2 (en) 2010-07-23 2013-09-03 Apple Inc. Method, apparatus and system for access mode control of a device
US20120026098A1 (en) * 2010-07-30 2012-02-02 Research In Motion Limited Portable electronic device having tabletop mode
JP6243586B2 (en) 2010-08-06 2017-12-06 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
US10150033B2 (en) 2010-08-20 2018-12-11 Nintendo Co., Ltd. Position calculation system, position calculation device, storage medium storing position calculation program, and position calculation method
JP5840385B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
JP5840386B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
US8593558B2 (en) * 2010-09-08 2013-11-26 Apple Inc. Camera-based orientation fix from portrait to landscape
KR101364826B1 (en) 2010-11-01 2014-02-20 닌텐도가부시키가이샤 Operating apparatus and operating system
KR101752698B1 (en) * 2011-01-06 2017-07-04 삼성전자주식회사 Photographing device and methods thereof
JP5689014B2 (en) 2011-04-07 2015-03-25 任天堂株式会社 Input system, information processing apparatus, information processing program, and three-dimensional position calculation method
WO2013013683A1 (en) * 2011-07-25 2013-01-31 Siemens Enterprise Communications Gmbh & Co. Kg Method for controlling the representation of an object on a display of a mobile device held by a user, a computer program product implementing the method and a mobile device controllable by means of the method
US9311426B2 (en) * 2011-08-04 2016-04-12 Blackberry Limited Orientation-dependent processing of input files by an electronic device
JP5861395B2 (en) 2011-11-02 2016-02-16 リコーイメージング株式会社 Portable device
US9704220B1 (en) * 2012-02-29 2017-07-11 Google Inc. Systems, methods, and media for adjusting one or more images displayed to a viewer
JP6023879B2 (en) 2012-05-18 2016-11-09 アップル インコーポレイテッド Apparatus, method and graphical user interface for operating a user interface based on fingerprint sensor input
US9075569B1 (en) * 2012-07-25 2015-07-07 Amazon Technologies, Inc. Device and associated element for coupling to the device
CN103675690B (en) * 2012-09-25 2018-05-25 腾讯科技(深圳)有限公司 A kind of method and terminal of display terminal charged state
CA2894579A1 (en) * 2012-12-12 2014-06-19 Koninklijke Philips N.V. An automated cardiopulmonary resuscitation device with a display
CN103034416B (en) * 2012-12-25 2015-09-09 珠海金山办公软件有限公司 A kind of by shaking the method making device screen display forward gravity direction to
CN103034423B (en) * 2012-12-25 2015-09-09 珠海金山办公软件有限公司 A kind of method being realized rotating in its screen display direction by shake equipment
US9741150B2 (en) 2013-07-25 2017-08-22 Duelight Llc Systems and methods for displaying representative images
US20140267006A1 (en) * 2013-03-15 2014-09-18 Giuseppe Raffa Automatic device display orientation detection
JP6161400B2 (en) * 2013-05-17 2017-07-12 キヤノン株式会社 Video playback apparatus and control method thereof
EP3575936A1 (en) * 2013-09-17 2019-12-04 Nokia Technologies Oy Determination of an operation
JP6270408B2 (en) * 2013-10-23 2018-01-31 株式会社キーエンス Photoelectric sensor
US9483087B2 (en) 2013-11-29 2016-11-01 At&T Intellectual Property I, L.P. Multi-orientation mobile device, computer-readable storage unit therefor, and methods for using the same
KR102182162B1 (en) * 2014-02-20 2020-11-24 엘지전자 주식회사 Head mounted display and method for controlling the same
KR102244248B1 (en) * 2014-04-01 2021-04-26 삼성전자주식회사 Operating Method For content and Electronic Device supporting the same
US10304163B2 (en) * 2014-09-08 2019-05-28 Apple Inc. Landscape springboard
US10228766B2 (en) * 2014-09-12 2019-03-12 Microsoft Technology Licensing, Llc Enhanced Display Rotation
US10776739B2 (en) 2014-09-30 2020-09-15 Apple Inc. Fitness challenge E-awards
US20160133420A1 (en) * 2014-11-10 2016-05-12 General Electric Company Circuit breaker with orientation correcting user interface system
CN105991940A (en) * 2015-02-13 2016-10-05 深圳积友聚乐科技有限公司 Method and system for processing image
KR102344045B1 (en) * 2015-04-21 2021-12-28 삼성전자주식회사 Electronic apparatus for displaying screen and method for controlling thereof
CN105005386B (en) * 2015-07-23 2017-05-17 广东欧珀移动通信有限公司 Method for regulating screen display direction and terminal
US10055818B2 (en) * 2016-09-30 2018-08-21 Intel Corporation Methods, apparatus and articles of manufacture to use biometric sensors to control an orientation of a display
JP6818496B2 (en) * 2016-10-06 2021-01-20 ソニー・オリンパスメディカルソリューションズ株式会社 Image processing device for endoscopes, endoscope device, operation method of image processing device for endoscope, and image processing program
CA3049249C (en) * 2017-01-06 2023-09-19 Ryan SHELTON Self-orienting imaging device and methods of use
DE102017204980B3 (en) 2017-03-24 2018-06-21 Ifm Electronic Gmbh Display unit for a measuring device of process and automation technology and measuring device with such a display unit
CN107247571B (en) * 2017-06-26 2020-07-24 京东方科技集团股份有限公司 Display device and display method thereof
US10627854B2 (en) 2018-04-13 2020-04-21 Microsoft Technology Licensing, Llc Systems and methods of providing a multipositional display
US10890288B2 (en) 2018-04-13 2021-01-12 Microsoft Technology Licensing, Llc Systems and methods of providing a multipositional display
US11538442B2 (en) 2018-04-13 2022-12-27 Microsoft Technology Licensing, Llc Systems and methods of displaying virtual elements on a multipositional display
US10678490B1 (en) * 2018-08-09 2020-06-09 Rockwell Collins, Inc. Polyhedral display device using shaped flat-panel displays
CN112445139A (en) * 2019-08-30 2021-03-05 珠海格力电器股份有限公司 Intelligent magic cube controller
CN113163228B (en) * 2020-01-22 2022-11-08 聚好看科技股份有限公司 Media asset playing type marking method and server
US11215817B1 (en) * 2020-12-03 2022-01-04 Facebook Technologies, Llc. Systems and methods for generating spectator images of an artificial reality environment

Citations (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4527155A (en) * 1981-03-04 1985-07-02 Nissan Motor Company, Limited System for maintaining an orientation of characters displayed with a rotatable image
US4649499A (en) * 1984-03-07 1987-03-10 Hewlett-Packard Company Touchscreen two-dimensional emulation of three-dimensional objects
US4831368A (en) * 1986-06-18 1989-05-16 Hitachi, Ltd. Display apparatus with rotatable display screen
US5134390A (en) * 1988-07-21 1992-07-28 Hitachi, Ltd. Method and apparatus for rotatable display
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5485600A (en) * 1992-11-09 1996-01-16 Virtual Prototypes, Inc. Computer modelling system and method for specifying the behavior of graphical operator interfaces
US5574479A (en) * 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
US5661632A (en) * 1994-01-04 1997-08-26 Dell Usa, L.P. Hand held computer with dual display screen orientation capability controlled by toggle switches having first and second non-momentary positions
US6038467A (en) * 1997-01-24 2000-03-14 U.S. Philips Corporation Image display system and image guided surgery system
US6088018A (en) * 1998-06-11 2000-07-11 Intel Corporation Method of using video reflection in providing input data to a computer system
US6115025A (en) * 1997-09-30 2000-09-05 Silicon Graphics, Inc. System for maintaining orientation of a user interface as a display changes orientation
US6147678A (en) * 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6148149A (en) * 1998-05-26 2000-11-14 Microsoft Corporation Automatic image rotation in digital cameras
US6154214A (en) * 1998-03-20 2000-11-28 Nuvomedia, Inc. Display orientation features for hand-held content display device
US6160899A (en) * 1997-07-22 2000-12-12 Lg Electronics Inc. Method of application menu selection and activation using image cognition
EP1063843A2 (en) * 1999-05-28 2000-12-27 Sony Corporation Image pick-up apparatus having an image display screen
US6181344B1 (en) * 1998-03-20 2001-01-30 Nuvomedia, Inc. Drag-and-release method for configuring user-definable function key of hand-held computing device
US6204852B1 (en) * 1998-12-09 2001-03-20 Lucent Technologies Inc. Video hand image three-dimensional computer interface
US6256019B1 (en) * 1999-03-30 2001-07-03 Eremote, Inc. Methods of using a controller for controlling multi-user access to the functionality of consumer devices
US20010035845A1 (en) * 1995-11-28 2001-11-01 Zwern Arthur L. Portable display and method for controlling same with speech
US6326978B1 (en) * 1999-04-20 2001-12-04 Steven John Robbins Display method for selectively rotating windows on a computer display
US20020021278A1 (en) * 2000-07-17 2002-02-21 Hinckley Kenneth P. Method and apparatus using multiple sensors in a device with a display
US20020036622A1 (en) * 2000-09-26 2002-03-28 Denny Jaeger Method and apparatus for detecting actuation of a controller device on a touch screen
US20020091762A1 (en) * 2000-03-07 2002-07-11 Yahoo! Inc. Information display system and methods
US20020091763A1 (en) * 2000-11-06 2002-07-11 Shah Lacky Vasant Client-side performance optimization system for streamed applications
US20020129068A1 (en) * 1997-09-09 2002-09-12 Eiji Takasu Information processing method, apparatus, and storage medium for shifting objects in a document
US20020140675A1 (en) * 1999-01-25 2002-10-03 Ali Ammar Al System and method for altering a display mode based on a gravity-responsive sensor
US20020149613A1 (en) * 2001-03-05 2002-10-17 Philips Electronics North America Corp. Automatic positioning of display depending upon the viewer's location
US6567101B1 (en) * 1999-10-13 2003-05-20 Gateway, Inc. System and method utilizing motion input for manipulating a display of data
US6597384B1 (en) * 1999-12-22 2003-07-22 Intel Corporation Automatic reorienting of screen orientation using touch sensitive system
US20030221876A1 (en) * 2002-05-31 2003-12-04 Doczy Paul J. Instrument-activated sub-surface computer buttons and system and method incorporating same
US20040049743A1 (en) * 2000-03-31 2004-03-11 Bogward Glenn Rolus Universal digital mobile device
US6721738B2 (en) * 2000-02-01 2004-04-13 Gaveo Technology, Llc. Motion password control system
US6750848B1 (en) * 1998-11-09 2004-06-15 Timothy R. Pryor More useful man machine interfaces and applications
US6798429B2 (en) * 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
US20040204130A1 (en) * 2002-08-30 2004-10-14 Khazaka Samir Khalil Display format for handheld wireless communication devices
US20040201595A1 (en) * 2003-04-11 2004-10-14 Microsoft Corporation Self-orienting display
US20040212597A1 (en) * 2000-10-31 2004-10-28 Frank Nuovo Keypads for electrical devices
US20040257341A1 (en) * 2002-12-16 2004-12-23 Bear Eric Justin Gould Systems and methods for interfacing with computer devices
US6888532B2 (en) * 2001-11-30 2005-05-03 Palmone, Inc. Automatic orientation-based user interface for an ambiguous handheld device
US20050140696A1 (en) * 2003-12-31 2005-06-30 Buxton William A.S. Split user interface
US7084859B1 (en) * 1992-09-18 2006-08-01 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US7161618B1 (en) * 1998-03-18 2007-01-09 Minolta Co., Ltd. Camera system including camera and computer having inter-device control capability and camera thereof
US7190331B2 (en) * 2002-06-06 2007-03-13 Siemens Corporate Research, Inc. System and method for measuring the registration accuracy of an augmented reality system
US7426329B2 (en) * 2003-03-06 2008-09-16 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742264A (en) * 1995-01-24 1998-04-21 Matsushita Electric Industrial Co., Ltd. Head-mounted display
EP1008927A4 (en) * 1996-03-15 2007-04-18 Hitachi Ltd Display and its operating method
US6911916B1 (en) * 1996-06-24 2005-06-28 The Cleveland Clinic Foundation Method and apparatus for accessing medical data over a network
US6597817B1 (en) * 1997-07-15 2003-07-22 Silverbrook Research Pty Ltd Orientation detection for digital cameras
US6496104B2 (en) * 2000-03-15 2002-12-17 Current Technologies, L.L.C. System and method for communication via power lines using ultra-short pulses
JP4708581B2 (en) * 2000-04-07 2011-06-22 キヤノン株式会社 Coordinate input device, coordinate input instruction tool, and computer program
US6813344B1 (en) * 2001-08-29 2004-11-02 Palm Source, Inc. Method and system for providing information for identifying callers based on a partial number
WO2004036378A2 (en) * 2002-10-15 2004-04-29 Mcintyre David J System and method for simulating visual defects

Patent Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4527155A (en) * 1981-03-04 1985-07-02 Nissan Motor Company, Limited System for maintaining an orientation of characters displayed with a rotatable image
US4649499A (en) * 1984-03-07 1987-03-10 Hewlett-Packard Company Touchscreen two-dimensional emulation of three-dimensional objects
US4831368A (en) * 1986-06-18 1989-05-16 Hitachi, Ltd. Display apparatus with rotatable display screen
US5134390A (en) * 1988-07-21 1992-07-28 Hitachi, Ltd. Method and apparatus for rotatable display
US7084859B1 (en) * 1992-09-18 2006-08-01 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US5485600A (en) * 1992-11-09 1996-01-16 Virtual Prototypes, Inc. Computer modelling system and method for specifying the behavior of graphical operator interfaces
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5661632A (en) * 1994-01-04 1997-08-26 Dell Usa, L.P. Hand held computer with dual display screen orientation capability controlled by toggle switches having first and second non-momentary positions
US5574479A (en) * 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
US20010035845A1 (en) * 1995-11-28 2001-11-01 Zwern Arthur L. Portable display and method for controlling same with speech
US6038467A (en) * 1997-01-24 2000-03-14 U.S. Philips Corporation Image display system and image guided surgery system
US6160899A (en) * 1997-07-22 2000-12-12 Lg Electronics Inc. Method of application menu selection and activation using image cognition
US20020129068A1 (en) * 1997-09-09 2002-09-12 Eiji Takasu Information processing method, apparatus, and storage medium for shifting objects in a document
US6115025A (en) * 1997-09-30 2000-09-05 Silicon Graphics, Inc. System for maintaining orientation of a user interface as a display changes orientation
US7161618B1 (en) * 1998-03-18 2007-01-09 Minolta Co., Ltd. Camera system including camera and computer having inter-device control capability and camera thereof
US6154214A (en) * 1998-03-20 2000-11-28 Nuvomedia, Inc. Display orientation features for hand-held content display device
US6181344B1 (en) * 1998-03-20 2001-01-30 Nuvomedia, Inc. Drag-and-release method for configuring user-definable function key of hand-held computing device
US6148149A (en) * 1998-05-26 2000-11-14 Microsoft Corporation Automatic image rotation in digital cameras
US6088018A (en) * 1998-06-11 2000-07-11 Intel Corporation Method of using video reflection in providing input data to a computer system
US6750848B1 (en) * 1998-11-09 2004-06-15 Timothy R. Pryor More useful man machine interfaces and applications
US6204852B1 (en) * 1998-12-09 2001-03-20 Lucent Technologies Inc. Video hand image three-dimensional computer interface
US6147678A (en) * 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US20020140675A1 (en) * 1999-01-25 2002-10-03 Ali Ammar Al System and method for altering a display mode based on a gravity-responsive sensor
US6256019B1 (en) * 1999-03-30 2001-07-03 Eremote, Inc. Methods of using a controller for controlling multi-user access to the functionality of consumer devices
US6326978B1 (en) * 1999-04-20 2001-12-04 Steven John Robbins Display method for selectively rotating windows on a computer display
EP1063843A2 (en) * 1999-05-28 2000-12-27 Sony Corporation Image pick-up apparatus having an image display screen
US6518956B1 (en) * 1999-05-28 2003-02-11 Sony Corporation Image pick-up apparatus
US6567101B1 (en) * 1999-10-13 2003-05-20 Gateway, Inc. System and method utilizing motion input for manipulating a display of data
US6597384B1 (en) * 1999-12-22 2003-07-22 Intel Corporation Automatic reorienting of screen orientation using touch sensitive system
US6721738B2 (en) * 2000-02-01 2004-04-13 Gaveo Technology, Llc. Motion password control system
US20020091762A1 (en) * 2000-03-07 2002-07-11 Yahoo! Inc. Information display system and methods
US20040049743A1 (en) * 2000-03-31 2004-03-11 Bogward Glenn Rolus Universal digital mobile device
US20020021278A1 (en) * 2000-07-17 2002-02-21 Hinckley Kenneth P. Method and apparatus using multiple sensors in a device with a display
US20020036622A1 (en) * 2000-09-26 2002-03-28 Denny Jaeger Method and apparatus for detecting actuation of a controller device on a touch screen
US20040212597A1 (en) * 2000-10-31 2004-10-28 Frank Nuovo Keypads for electrical devices
US20020091763A1 (en) * 2000-11-06 2002-07-11 Shah Lacky Vasant Client-side performance optimization system for streamed applications
US20020149613A1 (en) * 2001-03-05 2002-10-17 Philips Electronics North America Corp. Automatic positioning of display depending upon the viewer's location
US6798429B2 (en) * 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
US6888532B2 (en) * 2001-11-30 2005-05-03 Palmone, Inc. Automatic orientation-based user interface for an ambiguous handheld device
US20030221876A1 (en) * 2002-05-31 2003-12-04 Doczy Paul J. Instrument-activated sub-surface computer buttons and system and method incorporating same
US7190331B2 (en) * 2002-06-06 2007-03-13 Siemens Corporate Research, Inc. System and method for measuring the registration accuracy of an augmented reality system
US20040204130A1 (en) * 2002-08-30 2004-10-14 Khazaka Samir Khalil Display format for handheld wireless communication devices
US20040257341A1 (en) * 2002-12-16 2004-12-23 Bear Eric Justin Gould Systems and methods for interfacing with computer devices
US7426329B2 (en) * 2003-03-06 2008-09-16 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US20040201595A1 (en) * 2003-04-11 2004-10-14 Microsoft Corporation Self-orienting display
US7626598B2 (en) * 2003-04-11 2009-12-01 Microsoft Corporation Self-orienting display
US20110090256A1 (en) * 2003-04-11 2011-04-21 Microsoft Corporation Self-orienting display
US20050140696A1 (en) * 2003-12-31 2005-06-30 Buxton William A.S. Split user interface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mark T. Bolas, Human Factors in the Design of an Immersive Display, January 1994, IEEE Computer Graphics and Applications, Volume 14, Issue 1, pages 55-59. *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080317441A1 (en) * 2003-03-06 2008-12-25 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US8503861B2 (en) 2003-03-06 2013-08-06 Microsoft Corporation Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US9479553B2 (en) 2003-03-06 2016-10-25 Microsoft Technology Licensing, Llc Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US10178141B2 (en) 2003-03-06 2019-01-08 Microsoft Technology Licensing, Llc Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player
US20040201595A1 (en) * 2003-04-11 2004-10-14 Microsoft Corporation Self-orienting display
US20110090256A1 (en) * 2003-04-11 2011-04-21 Microsoft Corporation Self-orienting display

Also Published As

Publication number Publication date
US20150062180A1 (en) 2015-03-05
US20040201595A1 (en) 2004-10-14
US20110090256A1 (en) 2011-04-21
US7626598B2 (en) 2009-12-01
US20050156882A1 (en) 2005-07-21

Similar Documents

Publication Publication Date Title
US7626598B2 (en) Self-orienting display
US20210026413A1 (en) Flexible display device and method of controlling same
US9906725B2 (en) Portable video communication system
EP3246788B1 (en) Head mounted display device and method for controlling the same
CN102682742B (en) Transparent display and operational approach thereof
US20100124363A1 (en) Display privacy system
US9176542B2 (en) Accelerometer-based touchscreen user interface
US20200409545A1 (en) Display adaptation method and apparatus for application, and storage medium
JP5944150B2 (en) Portable information terminal, program for controlling portable information terminal, and distance learning method
US20190012000A1 (en) Deformable Electronic Device with Methods and Systems for Controlling the Deformed User Interface
US20110134146A1 (en) Interactive display method
CN110383214B (en) Information processing apparatus, information processing method, and recording medium
WO2020238408A1 (en) Application icon display method and terminal
CN106454499B (en) Mobile terminal and its control method
US20090322909A1 (en) Simulated reflective display
JP2022534757A (en) Imaging method and terminal
TWI674529B (en) Mixed reality assembly and method of forming mixed reality
KR20120104480A (en) Transparent display apparatus
CN115904079A (en) Display equipment adjusting method, device, terminal and storage medium
US11907357B2 (en) Electronic devices and corresponding methods for automatically performing login operations in multi-person content presentation environments
WO2018192455A1 (en) Method and apparatus for generating subtitles
KR20120104471A (en) Transparent display apparatus
KR20120104475A (en) Transparent display apparatus
CN213028132U (en) Mobile terminal
KR102201740B1 (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0001

Effective date: 20141014