|Publication number||US20050222801 A1|
|Application number||US 10/903,225|
|Publication date||Oct 6, 2005|
|Filing date||Jul 30, 2004|
|Priority date||Apr 6, 2004|
|Also published as||CA2562145A1, EP1733242A2, EP1733242A4, EP2381334A1, EP2381335A1, US8773260, US20100100623, US20110205076, US20110205376, US20110221673, WO2005101028A2, WO2005101028A3|
|Inventors||Thomas Wulff, Alistair Hamilton, Sudhir Bhatia, David Bellows, Kevin Cordes|
|Original Assignee||Thomas Wulff, Alistair Hamilton, Sudhir Bhatia, David Bellows, Kevin Cordes|
This application claims the benefit of U.S. Provisional Application Ser. No. 60/559,735, filed on Apr. 6, 2004, which is expressly incorporated herein by reference.
Business and individuals today rely on mobile computing products/arrangements (“MCPs”, e.g., bar code readers, PDAs, laptops, two-way pagers, mobile phones, digital cameras, mobile optical readers) in a multitude of situations ranging from basic everyday tasks to highly specialized procedures. As the virtues and benefits of utilizing MCPs continue to be realized across increasingly diverse industries, the features and capabilities of these products are expanding at a correspondingly rapid pace. In many industries, MCPs have gone from fashionable accessories to essential business components used by all levels of personnel.
Accordingly, a great need has developed for MCPs to perform complicated tasks quickly, efficiently and reliably. However, as conventional MCPs are fitted with more advanced gadgetry and software features, sacrifices are often made with respect to durability, power management and user-friendliness. While many methods have been devised attempting to resolve these difficulties, MCPs currently continue to suffer from problems of inefficient power usage, complicated operational procedures and on-screen menus, and the inability to tolerate the harsh industrial conditions to which the products may be subjected.
In the ongoing search for solutions to these problems, one aspect of MCPs that has remained overlooked is a product's kinetic state. From an MCP's motions, valuable information may be extracted from which various predetermined procedures directed at accomplishing some useful end or preventing some harmful result may be executed. Therefore, it is desirable to be able to detect, interpret and utilize the movements experienced by MCPs.
Described is a system and method for monitoring a mobile computing arrangement. The arrangement may include a sensor and a processor. The sensor detects first data of an event including a directional orientation and a motion of the arrangement. The processor compares the first data to second data to determine if at least one predetermined procedure is to be executed. The second data may include a predetermined threshold range of changes in the directional orientation and the motion. If the predetermined procedure is to be executed, the processor selects the predetermined procedure which corresponds to the event as a function of the first data. Subsequently, the predetermined procedure is executed.
The present invention may be further understood with reference to the following description and the appended drawings, wherein like elements are provided with the same reference numerals. The present invention relates to an MCP which includes a sensor that monitors the MCP's directional orientation and motion. In particular, the sensor may measure the MCP's acceleration, velocity, or angular velocity in any direction, orientation with respect to the user, the forces on the MCP upon impact, the direction of impact, or any other shocks or movements to which the MCP may be subjected. These measurements may be contrasted with prerecorded movement patterns or predefined levels of acceptable and unacceptable movement. As will be described below, predetermined procedures may then be executed that may be useful in a wide range of applications, including but not limited to abuse indication, power management, gesture input, compensating for undesired motion, display orientation, and security.
The WLAN 40 may use a version of the IEEE 802.11 or a similar protocol. One benefit of using a version of the IEEE 802.11 standard is that existing infrastructures using that standard may be adapted to support the system with minimal modifications. With only a simple software upgrade, most MCPs 20, 25 supporting that standard may operate according to the present invention. In alternative exemplary embodiments, different wireless protocols or technologies (e.g., Bluetooth, WWAN, WPAN, infrared) may also be utilized.
Referring back to the mobile network 100, the AP 10 may be, for example, a router, switch or bridge that forms the connection between the WLAN 40 and the communications network 50. Coupled to the WLAN 40 are the MCPs 20, 25, and coupled to the communications network 50 are the server 60 and the client computer 70. The communications network 50 is utilized to transmit data between the various components of the mobile network 100. This communications network 50 can be any network usable to transmit data, such as between microprocessors, and may be a local area network (“LAN”), a wide area network (“WAN”) or the Internet. The range of the MCPs 20, 25 is restricted only by the extent of the communications network 50. When the communications network 50 includes the Internet, the range can be essentially unlimited, as long as the AP 10 connected to the communications network 50 is within range of each of the MCPs 20, 25. Therefore, the AP 10 does not have to be physically in the vicinity of the server 60 or the client computer 70, as it may be remotely located by extending network cabling or through the Internet.
The MCPs 20, 25 may be any type of computer or processor based mobile device (e.g., a bar code reader, a PDA, a laptop, a two-way pager, a mobile phone, a digital camera, a mobile optical reader). Since the MCPs 20, 25 are portable, they are capable of connecting to a wireless network, and are sufficiently small to be easily carried. The MCPs 20, 25 may be designed for specific purposes, such as reading barcodes, or may be handheld devices with different purposes, to which various functionalities have been added through separate software modules. In one exemplary embodiment, the MCPs 20, 25 are based on a multi-purpose personal digital assistant (“PDA”) such as those running the Microsoft Pocket PC 2003 operating system, or similar.
In the exemplary embodiment of
The sensor 120 may be any type of measurement device capable of monitoring directional orientation and motion, and may be based on, for example, a G-shock sensor, a switch, an accelerometer, a strain gage, a piezo, MEMS technologies, or combinations of the like. The directional orientation may include any angular movement value with respect to at least one three-dimensional axis of the MCPs 20, 25. The motion may include, for example, a velocity value, an acceleration value, or an angular velocity value. Although the sensor 120 may be of any size, the sensor 120 is preferably small enough so that any added weight and space occupied on the MCPs 20, 25 are negligible. Because the MCPs 20, 25 usually operate on batteries, the sensor 120 should also have a low power consumption. In addition, the sensor 120 should be durable enough to withstand the abusive environments it is intended to monitor.
The sensor 120 detects changes in the directional orientation and motion of the MCP 20, 25 and generates the first data. The first data is provided to the processor 110, which compares the first data to predetermined second data which includes threshold range values. For example, the second data may be a prerecorded rotation of the MCP 20, 25 by ninety degrees, the detection of which may indicate the occurrence of an event. The second data may also be a maximum height from which the MCP 20, 25 is dropped. Subsequently, based on the first data, a particular predetermined procedure is selected and executed.
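The first-data/second-data comparison described above can be sketched in code. This is a minimal illustration only; the structures (`FirstData`, `SecondData`), the function name, and the threshold values are hypothetical assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class SecondData:
    """Predetermined threshold ranges (hypothetical structure)."""
    max_acceleration_g: float   # e.g. an impact shock limit
    max_rotation_deg: float     # e.g. a prerecorded ninety-degree rotation

@dataclass
class FirstData:
    """Measurements reported by the sensor 120 (hypothetical structure)."""
    acceleration_g: float
    rotation_deg: float

def event_detected(first: FirstData, second: SecondData) -> bool:
    """Return True when any measurement exceeds its threshold range,
    indicating that a predefined event (e.g. a drop) has occurred."""
    return (first.acceleration_g > second.max_acceleration_g
            or abs(first.rotation_deg) >= second.max_rotation_deg)

# A gentle movement stays below threshold; a hard shock triggers an event.
limits = SecondData(max_acceleration_g=8.0, max_rotation_deg=90.0)
print(event_detected(FirstData(1.2, 10.0), limits))   # False
print(event_detected(FirstData(15.0, 5.0), limits))   # True
```

Measurements that pass this comparison would then drive the selection of a predetermined procedure.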
The first data may be retained for each instance where the measurements of the sensor 120 are above or below the second data which specifies an acceptable threshold level. The processor 110 may also append additional information to the retained first data including sequential numbering of the events, time and date for each event, acceleration data, data corresponding to a status of the MCPs 20, 25 at the date/time of the event, environmental factors, a direction of the shock, etc.
Depending on the application of the present invention, various predetermined procedures may be performed based on the first data. For example, if desired, the first data may be stored in the non-removable memory 130 and/or the removable memory 140 prior to executing any other procedures. Alternatively, the first data may not need to be stored locally at all; instead, it may be transmitted in real-time for storage and/or further processing by a central server or a remote device. Such a transmission may be accomplished via the communication arrangement of the mobile network 100 of
The foregoing embodiments of the mobile network 100 and the MCPs 20, 25 are not to be construed so as to limit the present invention in any way. As will be apparent to those skilled in the art, different types of MCPs 20, 25 may be used to communicate over the same data network, as long as they work under compatible protocols. Other configurations with different numbers of MCPs, APs, or client and server computers may also be used to implement the system and method of the present invention.
In an alternative exemplary embodiment of the mobile network 100, the MCPs 20, 25 may connect to the communications network 50 directly via wires despite being portable. For example, rather than real-time reporting, the MCPs 20, 25 may only be required to connect periodically to the mobile network 100 for updates on their movements as monitored by their respective sensors 120. Furthermore, the wireless capabilities or the communications network 50 may not be needed at all. In such a situation, the sensor 120 makes measurements to be processed internally for use locally by the users or manufacturers. For example, the measurements may be used to suggest replacing or repairing the MCP 20, 25 because it has exceeded a threshold of abuse and is in danger of malfunctioning.
In the step 320, the MCP 20, 25 is continuously monitored by the sensor 120 for changes in the directional orientation and/or motion/movements that may constitute the occurrence of a predefined event. An event may include, for example, the MCP 20, 25 being dropped, jerked, tugged, shaken a certain number of times within a certain time period, or remaining still for a specified duration. Whenever the MCP 20, 25 experiences detectable motion or an extended lack thereof, the first data is generated. The sensor 120 may make no effort to differentiate between or prioritize directional orientation or motion values, returning all results to the processor 110 for processing.
In the step 330, the processor 110 compares the measured first data with the predetermined second data. If the characteristics of the first data match those of the second data, the processor 110 determines that an event has occurred and a corresponding predetermined procedure needs to be selected. At the occurrence of an event, the processor 110 may also attach to the first data at least one of a time/date of each event, a status of the computing arrangement, a direction of the acceleration, and environmental data. In an alternative exemplary embodiment of the present invention, the above-described attachment may occur as a part of the predetermined procedure.
For example, when the sensor 120 detects that the MCP 20, 25 came to an abrupt stop after being accelerated for a short period of time, the processor 110, after comparing that information to at least a portion of the preprogrammed second data, may conclude that the MCP 20, 25 dropped to the ground 30. From the magnitude and duration of acceleration, the processor 110 may also determine whether the drop was forcibly induced (e.g., by an abusive user) and the distance h1 or h2 of its displacement. Furthermore, from the direction of impact and other data, the processor 110 may also approximate the part of the MCP 20, 25 that initially made contact with the ground 30 and whether any critical components were directly impacted. Such information may be attached to the first data and may be helpful in determining whether the fall poses a danger to the MCP 20, 25's continued operation.
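The drop inference described here — deducing a fall and its displacement from an abrupt stop after a period of acceleration — can be approximated from accelerometer readings: a drop appears as a run of near-zero (free-fall) readings ending in an impact spike, and the displacement follows from h = ½gt². The thresholds and function names below are illustrative assumptions, not values from the patent.

```python
G = 9.81  # gravitational acceleration, m/s^2

def drop_height(free_fall_seconds: float) -> float:
    """Estimate the displacement h of a drop from the duration of
    near-zero (free-fall) accelerometer readings: h = 0.5 * g * t^2."""
    return 0.5 * G * free_fall_seconds ** 2

def is_drop(samples_g, free_fall_below=0.3, impact_above=8.0):
    """Classify a sampled acceleration trace (in units of g) as a drop
    when a run of near-zero readings ends in an abrupt impact spike."""
    falling = False
    for a in samples_g:
        if a < free_fall_below:
            falling = True
        elif falling and a > impact_above:
            return True
        else:
            falling = False
    return False

# Roughly 0.45 s of free fall corresponds to a drop of about one metre.
print(round(drop_height(0.45), 2))            # 0.99
print(is_drop([1.0, 0.1, 0.05, 12.0, 1.0]))   # True
```

The direction of the impact spike across the sensor's axes could similarly be used to approximate which part of the device struck the ground.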
Due to practical considerations (e.g., memory limitations and processing power) and because not all event occurrences may be significant, the reporting and recording of all movements of the MCP 20, 25, no matter how minor, although possible, may in some instances be impractical. Movements within acceptable limits may be superfluous and have no bearing on applications of the present invention. Therefore, in the step 340, the first data is measured against threshold values contained in the second data. The first data is retained only when at least one event and/or reading satisfies the threshold values or matches the prerecorded motions of the second data; otherwise the first data is discarded and the method 300 returns to the step 320 for the monitoring of new events.
If the first data falls within the threshold of the second data, the method 300 continues to the step 350 where the processor 110 selects, as a function of the first data, at least one predetermined procedure for execution. In particular, the processor 110 analyzes the measured first data and determines the corresponding procedure of the plurality of predetermined procedures.
In the step 360, the predetermined procedure is executed. The execution of the predetermined procedure may depend upon the specific application of the present invention. For example, the first data may be stored into the non-removable memory 130 or the removable memory 140. A plurality of stored first data records form an event history of the MCP 20, 25. The event history may be readily accessible to any user of the MCP 20, 25, or may be password protected and/or encrypted so that only authorized personnel (e.g., the network administrator or the manufacturer) may gain access.
Other examples of predetermined procedures include encrypting the first data so that it may be accessible only by an authorized user, transmitting the first data to a remote computer, analyzing the event history of the MCP 20, 25 for service recommendations, reporting the cause of any damages, issuing precautionary warnings of the MCP 20, 25's condition, changing the MCP 20, 25's display, powering off, etc. After the predetermined procedure has been successfully executed, the method 300 may resume again at the step 320 to monitor for new event occurrences.
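Selecting a predetermined procedure as a function of the first data (steps 350 and 360) amounts to a dispatch from an event classification to a handler. The sketch below is hypothetical; the event names and handlers are examples drawn loosely from the procedures enumerated above, not an implementation from the patent.

```python
# Stored first data records accumulate into the device's event history.
event_history = []

def record_event(event):
    """Store the first data locally, forming the event history."""
    event_history.append(event)
    return "recorded"

def alert_user(event):
    """Issue a precautionary warning to the user."""
    return f"warning: {event['type']}"

def power_off(event):
    """Shut the device down to protect it from further harm."""
    return "powering off"

# Hypothetical mapping from event classification to predetermined procedure.
PROCEDURES = {
    "minor_shock": record_event,
    "abusive_drop": alert_user,
    "severe_impact": power_off,
}

def execute_procedure(event):
    """Select (step 350) and run (step 360) the procedure for an event."""
    return PROCEDURES[event["type"]](event)

print(execute_procedure({"type": "minor_shock"}))    # recorded
print(execute_procedure({"type": "abusive_drop"}))   # warning: abusive_drop
```

After a procedure completes, monitoring would resume at the step 320.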
The examples discussed in the foregoing discussion are for illustrative purposes only and are not representative of all possible applications of the present invention. Rather, the present invention may be applied across a diverse range of industries, practice areas, and purposes. The description that follows further outlines the features and advantages of several exemplary applications of the present invention. However, as will be apparent to one skilled in the art, the MCPs 20, 25 may benefit from and make use of an added motion sensor component according to the present invention in many other ways.
As MCPs 20, 25 are increasingly being integrated into the daily operations of businesses today, a need has developed to ensure that these MCPs 20, 25 can withstand the rugged treatment to which they are often subjected. Conventional design and construction techniques yield MCPs 20, 25 that exhibit levels of performance that are only marginal in terms of reliability and durability under the demands of industrial environments. Damaged or malfunctioning MCPs 20, 25 may have devastating effects on the numerous businesses currently relying on mobile solutions. For example, MCPs 20, 25 that are completely inoperable may result in costly delays while replacement products are sought. Also, MCPs 20, 25 with latent malfunctions may cause undetectable computational errors that corrupt systems and induce further errors down the line.
Typically, the user of the MCP 20, 25 has no reliable way of anticipating malfunctions and only discovers a problem as it manifests itself. By that time, damage has often already occurred. Therefore, there is a great need for IT and customer service personnel to be able to monitor and accurately determine when the MCP 20, 25 has surpassed an intolerable threshold of abuse. This may be accomplished by establishing measured levels of acceptable and unacceptable usage profiles according to the exemplary embodiments of the present invention. In this way, user profiles may be established and predictions may be made of when the MCP 20, 25 should be replaced prior to it actually malfunctioning. In instances where the MCP 20, 25 is being abused, the customer may intercede to minimize the abusive treatment, thereby reducing the amount of service to and/or replacement of the MCP 20, 25 required and lowering the total cost of ownership.
Referring to the exemplary method 300 of
In other exemplary embodiments, the MCPs 20, 25 may similarly be directed to only retain data and execute procedures when the first data indicates some form of abuse. For example, the MCPs 20, 25 may be programmed to execute a procedure only after a predetermined number of events occurring within a predetermined time period has been detected. Furthermore, the MCPs 20, 25 may instead only retain data and perform operations when the first data shows an impact to certain critical components, or impacts oriented in a certain predetermined direction and/or of a certain predetermined force.
As previously mentioned, the predetermined procedure may vary depending on the specific application of the present invention. For example, in abuse indication, the predetermined procedure may simply be a real-time on-screen display of the updated event history of the MCP 20, 25. If the MCP 20, 25 is being exposed to usage profiles beyond its intended use, it may also be desirable to alert the user through visible warning (e.g., on-screen precautionary displays, flashing LEDs), audible sirens (e.g., using a speaker, headset, receiver) or mechanical alerts (e.g., vibrations, pager motors).
Furthermore, usage profiles detrimental to the MCP 20, 25 may be brought to the attention of a remote party with an interest in its condition. IT and customer service personnel, for example, may monitor the MCP 20, 25's event history in real-time, on-site or off-site, through the communication links of the mobile network 100. In instances where real-time monitoring is impossible or impractical, updates may instead be made at periodic or predetermined intervals. For example, the MCP 20, 25 may have no wireless communication capabilities, may be beyond the wireless operating range of the AP 10, or it may be desirable to conserve the limited bandwidth of the mobile network 100. In such situations, the number and level of unacceptable usage instances experienced by the MCP 20, 25 may be archived for retrieval at a later time. A periodic servicing and maintenance schedule may be established, during which remote parties may obtain updates. The event history may also be downloaded at the end of a shift when the MCP 20, 25 is returned to a docking station or charging cradle.
With the MCP 20, 25's event history, remote parties (e.g., IT and customer service personnel) may perform operations beyond servicing the particular MCP 20, 25. This information may be used by manufacturers for research and development for the benefit of later MCPs 20, 25. By establishing the usage patterns of MCPs 20, 25 operating under similar conditions, future specifications may be tailored to actual conditions of use, adjusting levels of durability based on the expected conditions to which the MCPs 20, 25 may be subjected. Acceptable standards of motion data may then be refined and monitored for excessive abuse according to a new set of criteria.
Still another advantage of the present invention to manufacturers is the ability to archive and retrieve warranty information. Manufacturers' warranties typically only insure against defects arising from production or out of the normal course of usage of the MCP 20, 25, neither of which includes the MCP 20, 25 being dropped in a way that may violate its specifications or being otherwise abused by the customer. However, without any actual knowledge of the MCP 20, 25's usage, manufacturers presented by a customer with a malfunctioning MCP 20, 25 often have no method to accurately determine the cause of the malfunction. If usage information is available either within the MCP 20, 25's memory or in transmissions to the manufacturer, warranty claims may more easily be verified or discredited.
In addition to interacting with the user or remote parties, the MCPs 20, 25 of the present invention may also autonomously monitor their own condition and take actions accordingly. The probability of losing critical data increases substantially when the MCPs 20, 25 are used beyond their intended usage profiles or environmental design specifications. The exemplary embodiments of the present invention allow the MCPs 20, 25 to take preventative measures to guard against harm during an abusive event. For example, while an MCP 20, 25 is experiencing excessive motion beyond a predetermined usage threshold value (e.g., as the MCP 20, 25 is dropping to the ground 30 from height h1 or h2), the processor 110 in the step 360 may terminate programs containing critical information to prevent data corruption. Access to the non-removable memory 130 or the removable memory 140 by any other components may also be temporarily disabled, avoiding any possible loss of data. If necessary, the MCP 20, 25 may power off or switch into standby mode and not be allowed to resume operations until the abusive event has passed or subsided back within an acceptable range.
Although the exemplary applications of the present invention in the foregoing description have primarily focused on abuse indication, the present invention may also be used in a variety of other settings. As described below, these settings include, for example, power management, gesture input, compensating for undesired motion, display orientation, and security.
The power management properties of MCPs have always been a primary focus of product design engineers. Due to their limited size and weight and their mobile nature, MCPs usually have limited power supplies (e.g., rechargeable or disposable battery packs). Developing MCPs that operate for long periods of time, without sacrificing mobility, is an ongoing design challenge. Designing a robust power management system that optimizes and conserves power is a critical element in addressing this challenge.
Understanding the MCP 20, 25's directional orientation with respect to the user is possible by incorporating the previously described sensor 120. As such, it is possible to enhance current power management systems by turning various systems on and off when appropriate. For example, many MCPs 20, 25 have a display and backlight that use a large amount of the available power supply. Utilizing the orientation aspect of the sensor may enable the MCP 20, 25 to keep the display and backlight on only when the display is within the user's viewing angle and range. By employing the exemplary system and method of the present invention, when the MCP 20, 25 is rotated past the viewing angle or brought beyond the visible distance for a predetermined time period, the display and backlight may shut off to save power. When the MCP 20, 25 is rotated back within the user's viewing angle or brought within the visible range, the display and backlight may instantaneously turn back on.
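The viewing-angle rule sketched above might be reduced to a simple predicate. The 60-degree cutoff and parameter names below are illustrative assumptions, not values from the patent.

```python
def backlight_on(angle_from_user_deg: float, within_range: bool) -> bool:
    """Keep the display and backlight powered only while the display is
    within the user's viewing angle and range. The 60-degree cutoff is a
    hypothetical, adjustable threshold."""
    return abs(angle_from_user_deg) <= 60.0 and within_range

print(backlight_on(15.0, True))    # True
print(backlight_on(120.0, True))   # False (rotated past the viewing angle)
print(backlight_on(15.0, False))   # False (beyond the visible distance)
```

In practice a hold-off timer would precede shutting the backlight off, so that a momentary rotation does not blank the display.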
Another way in which the present invention may optimize the power management of the MCP 20, 25 may be by switching it into a power-conserving state when not in use. Conventional power management systems typically shut down the MCP 20, 25 or switch it into idle mode after a preset amount of time transpires with no interaction from the user. The preset time period is usually adjustable by the MCP 20, 25 software. The present invention uses the lack of motion as an additional trigger to switch the MCP 20, 25 into the idle or shut down modes, thus taking advantage of the tendency of the MCPs 20, 25 to be in motion when in use, and conserving energy when at rest. The amount of motionless time needed to trigger the power saving state may also be adjustable by the MCP 20, 25 software.
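The motionless-time trigger described here can be sketched as a small state machine; the class name, timeout value, and timestamp convention are hypothetical.

```python
class MotionPowerManager:
    """Sketch: switch to idle when no motion has been sensed for
    `idle_after_s` seconds. The timeout is adjustable, mirroring the
    software-adjustable period described above."""

    def __init__(self, idle_after_s: float = 300.0):
        self.idle_after_s = idle_after_s
        self.last_motion = 0.0
        self.state = "active"

    def on_motion(self, now: float) -> None:
        """Called whenever the sensor reports motion; wakes the device."""
        self.last_motion = now
        self.state = "active"

    def tick(self, now: float) -> str:
        """Periodic check: enter idle once the motionless time elapses."""
        if now - self.last_motion >= self.idle_after_s:
            self.state = "idle"
        return self.state

pm = MotionPowerManager(idle_after_s=300.0)
pm.on_motion(0.0)
print(pm.tick(100.0))   # active
print(pm.tick(400.0))   # idle
```

This trigger would complement, not replace, the conventional no-interaction timeout.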
Continuing with some exemplary applications of the present invention, the combined sensor and MCP 20, 25 of the present invention may also simplify the MCP 20, 25's operation through a gesture input. The advantages afforded by increasingly advanced computing products are often offset by sacrifices to usability and user-friendliness. Elaborate menus, onscreen buttons, procedures or the like frequently frustrate users and impede rather than advance productivity. The ability to sense and analyze motion through the present invention enables the MCP 20, 25 to recognize and react to various motions or user gestures. These motions or gestures may be pre-established to trigger the MCP 20, 25 to perform various functions that would otherwise need to be actuated manually.
For example, if the MCP 20, 25 equipped with a display is in document viewing mode and orientation, a quick flip of the user's wrist detected by the sensor 120 may coincide with the software application flipping to the next page of the document. In another example, when long lists of application options are being displayed to the user, a wrist roll gesture could trigger the MCP 20, 25 to start scrolling down the list. In still another example, if the MCP 20, 25 is a device with data capturing capabilities (e.g., an imager, scanner, camera), a motion detected corresponding to a certain pre-recorded gesture may trigger the MCP 20, 25 to turn on the data capture functionality.
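Gesture recognition of this kind might, in the simplest case, threshold a short angular-velocity trace from the sensor. The gesture names and thresholds below are illustrative assumptions, not values from the patent.

```python
def classify_gesture(gyro_dps):
    """Map a short angular-velocity trace (deg/s, about the wrist axis)
    to a pre-established gesture. All thresholds are hypothetical."""
    peak = max(gyro_dps, key=abs)
    if peak > 250:
        return "flip_forward"    # e.g. advance to the next document page
    if peak < -250:
        return "flip_back"
    if 80 < max(gyro_dps) <= 250:
        return "roll"            # e.g. start scrolling a long list
    return "none"

print(classify_gesture([10, 320, 40]))    # flip_forward
print(classify_gesture([10, -300, 0]))    # flip_back
print(classify_gesture([20, 120, 60]))    # roll
```

A production recognizer would also match the trace against prerecorded gesture patterns rather than a single peak value.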
Still another advantage of the present invention is the ability to compensate for an undesirable motion. Although not as detrimental to the MCPs 20, 25 as motion constituting abuse, minor motion values may still adversely affect applications that require as little motion as possible. For example, MCPs 20, 25 with data capture capabilities utilizing various camera technologies produce blurred or out-of-focus pictures when in motion. Various methods have been developed attempting to offset such undesirable effects, such as weights or stands that minimize or cancel out extraneous motion.
The present invention may be utilized to address this problem without the need for cumbersome physical attachments or mechanical devices. Undesirable motion may be recognized, processed, and de-sensitized through various software applications employed by the MCP 20, 25 under the exemplary embodiments of the present invention. The MCP 20, 25 may alert the user, through the display or other alert mechanisms, that motion has created an unacceptable operating situation, and/or automatically have the software compensate for the motion during the data capture event.
Furthermore, in MCPs 20, 25 equipped with displays, the orientation sensing capability of the present invention may also conveniently adjust the display orientation with respect to the user. MCPs 20, 25 typically format display data in landscape or portrait mode. Newer mobile software applications enable the display data format to be manually switched between the two. The present invention allows the orientation of the MCP 20, 25 to be monitored relative to the user, enabling the MCP 20, 25 to automatically switch the display data format between the landscape and portrait modes.
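A simplified version of the automatic landscape/portrait switch might map the device's tilt angle relative to gravity to a display mode. The 45-degree boundaries and function name are assumptions for illustration.

```python
def display_mode(tilt_deg: float) -> str:
    """Choose landscape or portrait from the device's roll angle
    relative to gravity (a simplified, hypothetical rule)."""
    tilt = tilt_deg % 360
    if 45 <= tilt < 135 or 225 <= tilt < 315:
        return "landscape"
    return "portrait"

print(display_mode(0))    # portrait
print(display_mode(90))   # landscape
print(display_mode(180))  # portrait (upside-down)
```

Real implementations add hysteresis around the boundaries so the display does not flicker between modes near 45 degrees.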
As a final exemplary application of the present invention, the combined sensor and MCP 20, 25 of the present invention may be used for purposes of security. Because the MCPs 20, 25 are portable, they are easily misplaced or stolen. By employing the exemplary system and method of the present invention, the MCPs 20, 25 may be able to incorporate security features that indicate their location to the user or prevent use by unauthorized personnel. For example, when the MCP 20, 25 is at rest for a preset period of time (e.g., during recharge, overnight storage), it may enter a secure mode and be programmed to trigger an alarm when motion to the MCP 20, 25 is detected. This alarm may be local to the MCP 20, 25, using audible, visual, or mechanical features. At the same time or as an alternative, the alarm may be triggered in a remote device on-site or off-site using the previously described communication systems. If the MCP 20, 25 utilizes tracking technologies (e.g., global positioning system), it may also convey its location. The security features may additionally lock terminal applications, preventing the MCP 20, 25 from being used until an authorized user password is entered.
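The secure-mode behavior described above — arming after a preset period of rest and alarming on subsequent motion — can be sketched as follows; the arming delay, class name, and return values are hypothetical.

```python
class SecureMode:
    """Sketch: after `arm_after_s` seconds of rest the device arms, and
    subsequent motion raises an alarm and locks terminal applications."""

    def __init__(self, arm_after_s: float = 600.0):
        self.arm_after_s = arm_after_s
        self.still_since = None
        self.armed = False
        self.locked = False

    def sample(self, now: float, in_motion: bool) -> str:
        """Process one motion sample at time `now` (seconds)."""
        if in_motion:
            if self.armed:
                self.locked = True   # stay locked until a password is entered
                return "alarm"
            self.still_since = None
            return "ok"
        if self.still_since is None:
            self.still_since = now
        if now - self.still_since >= self.arm_after_s:
            self.armed = True
        return "armed" if self.armed else "ok"

sm = SecureMode(arm_after_s=600.0)
print(sm.sample(0.0, False))     # ok      (just came to rest)
print(sm.sample(700.0, False))   # armed   (rest period elapsed)
print(sm.sample(800.0, True))    # alarm   (motion while armed)
```

The "alarm" result could equally drive a local siren, a remote notification over the mobile network 100, or a position report.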
The present invention has been described with reference to the above exemplary embodiments. One skilled in the art would understand that the present invention may also be successfully implemented if modified. Accordingly, various modifications and changes may be made to the embodiments without departing from the broadest spirit and scope of the present invention as set forth in the claims that follow. The specification and drawings, accordingly, should be regarded in an illustrative rather than restrictive sense.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5148153 *||Dec 20, 1990||Sep 15, 1992||Motorola Inc.||Automatic screen blanking in a mobile radio data terminal|
|US5227929 *||Nov 26, 1990||Jul 13, 1993||International Business Machines Corporation||Portable computer hard disk protective reflex system|
|US6956564 *||Oct 8, 1998||Oct 18, 2005||British Telecommunications Public Limited Company||Portable computers|
|US20030122804 *||Feb 7, 2001||Jul 3, 2003||Osamu Yamazaki||Portable terminal|
|US20030139205 *||Jan 22, 2002||Jul 24, 2003||Belcher Brian E.||Access control for vehicle mounted communications devices|
|US20050183118 *||Feb 13, 2004||Aug 18, 2005||Wee Susie J.||Media data decoding device|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7647196||Aug 8, 2007||Jan 12, 2010||Dp Technologies, Inc.||Human activity monitoring device with distance calculation|
|US7653508||Dec 22, 2006||Jan 26, 2010||Dp Technologies, Inc.||Human activity monitoring device|
|US7753861||Apr 4, 2007||Jul 13, 2010||Dp Technologies, Inc.||Chest strap having human activity monitoring device|
|US7788059||Nov 25, 2008||Aug 31, 2010||Dp Technologies, Inc.||Rotational insensitivity using gravity-based adjustments|
|US7822513||Jul 27, 2005||Oct 26, 2010||Symbol Technologies, Inc.||System and method for monitoring a mobile computing product/arrangement|
|US7881749 *||Sep 28, 2006||Feb 1, 2011||Hewlett-Packard Development Company, L.P.||Mobile communication device and method for controlling component activation based on sensed motion|
|US7881902||Jan 26, 2010||Feb 1, 2011||Dp Technologies, Inc.||Human activity monitoring device|
|US7982776 *||Jul 13, 2007||Jul 19, 2011||Ethicon Endo-Surgery, Inc.||SBI motion artifact removal apparatus and method|
|US7986917||Jul 10, 2006||Jul 26, 2011||Sony Ericsson Mobile Communications Ab||Method and system for data transfer from a hand held device|
|US7990556||Feb 28, 2006||Aug 2, 2011||Google Inc.||Association of a portable scanner with input/output and storage devices|
|US8005720||Aug 18, 2005||Aug 23, 2011||Google Inc.||Applying scanned information to identify content|
|US8019648||Apr 1, 2005||Sep 13, 2011||Google Inc.||Search engines and systems with handheld document data capture devices|
|US8061182||Jun 22, 2009||Nov 22, 2011||Research In Motion Limited||Portable electronic device and method of measuring drop impact at the portable electronic device|
|US8064700||Mar 10, 2010||Nov 22, 2011||Google Inc.||Method and system for character recognition|
|US8081849||Feb 6, 2007||Dec 20, 2011||Google Inc.||Portable scanning and memory device|
|US8146156||Sep 2, 2008||Mar 27, 2012||Google Inc.||Archive of text captures from rendered documents|
|US8170186||Apr 7, 2008||May 1, 2012||Sony Mobile Communications Ab||Electronic device with motion controlled functions|
|US8179563 *||Sep 29, 2010||May 15, 2012||Google Inc.||Portable scanning device|
|US8187182||Aug 29, 2008||May 29, 2012||Dp Technologies, Inc.||Sensor fusion for activity identification|
|US8214387||Apr 1, 2005||Jul 3, 2012||Google Inc.||Document enhancement system and method|
|US8261094||Aug 19, 2010||Sep 4, 2012||Google Inc.||Secure data gathering from rendered documents|
|US8285344||May 20, 2009||Oct 9, 2012||Dp Technologies, Inc.||Method and apparatus for adjusting audio for a user environment|
|US8314831 *||Aug 2, 2006||Nov 20, 2012||Sony Corporation||Imaging system, camera control apparatus, panorama image generation method and program therefor|
|US8320578||Apr 30, 2008||Nov 27, 2012||Dp Technologies, Inc.||Headset|
|US8346620||Sep 28, 2010||Jan 1, 2013||Google Inc.||Automatic modification of web pages|
|US8418055||Feb 18, 2010||Apr 9, 2013||Google Inc.||Identifying a document by performing spectral analysis on the contents of the document|
|US8442331||Aug 18, 2009||May 14, 2013||Google Inc.||Capturing text from rendered documents using supplemental information|
|US8447066||Mar 12, 2010||May 21, 2013||Google Inc.||Performing actions based on capturing information from rendered documents, such as documents under copyright|
|US8447111||Feb 21, 2011||May 21, 2013||Google Inc.||Triggering actions in response to optically or acoustically capturing keywords from a rendered document|
|US8447144||Aug 18, 2009||May 21, 2013||Google Inc.||Data capture from rendered documents using handheld device|
|US8489624||Jan 29, 2010||Jul 16, 2013||Google, Inc.||Processing techniques for text capture from a rendered document|
|US8505090||Feb 20, 2012||Aug 6, 2013||Google Inc.||Archive of text captures from rendered documents|
|US8515816||Apr 1, 2005||Aug 20, 2013||Google Inc.||Aggregate analysis of text captures performed by multiple users from rendered documents|
|US8549892||Sep 30, 2011||Oct 8, 2013||Blackberry Limited||Portable electronic device and method of measuring drop impact at the portable electronic device|
|US8555282||Jul 27, 2007||Oct 8, 2013||Dp Technologies, Inc.||Optimizing preemptive operating system with motion sensing|
|US8568310||May 21, 2012||Oct 29, 2013||Dp Technologies, Inc.||Sensor fusion for activity identification|
|US8587601||Jan 5, 2009||Nov 19, 2013||Dp Technologies, Inc.||Sharing of three dimensional objects|
|US8594742||Jun 21, 2006||Nov 26, 2013||Symbol Technologies, Inc.||System and method for monitoring a mobile device|
|US8600196||Jul 6, 2010||Dec 3, 2013||Google Inc.||Optical scanners, such as hand-held optical scanners|
|US8619147||Oct 6, 2010||Dec 31, 2013||Google Inc.||Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device|
|US8619287||Aug 17, 2009||Dec 31, 2013||Google Inc.||System and method for information gathering utilizing form identifiers|
|US8620083||Oct 5, 2011||Dec 31, 2013||Google Inc.||Method and system for character recognition|
|US8620353||Jan 26, 2007||Dec 31, 2013||Dp Technologies, Inc.||Automatic sharing and publication of multimedia from a mobile device|
|US8620760||Oct 11, 2010||Dec 31, 2013||Google Inc.||Methods and systems for initiating application processes by data capture from rendered documents|
|US8621349||Oct 5, 2010||Dec 31, 2013||Google Inc.||Publishing techniques for adding value to a rendered document|
|US8638363||Feb 18, 2010||Jan 28, 2014||Google Inc.||Automatically capturing information, such as capturing information using a document-aware device|
|US8678925||Jun 11, 2009||Mar 25, 2014||Dp Technologies, Inc.||Method and apparatus to provide a dice application|
|US8712723||Jan 31, 2011||Apr 29, 2014||Dp Technologies, Inc.||Human activity monitoring device|
|US8713418||Apr 12, 2005||Apr 29, 2014||Google Inc.||Adding value to a rendered document|
|US8773260||Apr 29, 2011||Jul 8, 2014||Symbol Technologies, Inc.||System and method for monitoring a mobile computing product/arrangement|
|US8781228||Sep 13, 2012||Jul 15, 2014||Google Inc.||Triggering actions in response to optically or acoustically capturing keywords from a rendered document|
|US8784309||Oct 23, 2013||Jul 22, 2014||Dp Technologies, Inc.||Sensor fusion for activity identification|
|US8793162||May 5, 2010||Jul 29, 2014||Google Inc.||Adding information or functionality to a rendered document via association with an electronic counterpart|
|US8799099||Sep 13, 2012||Aug 5, 2014||Google Inc.||Processing techniques for text capture from a rendered document|
|US8799303||Oct 13, 2010||Aug 5, 2014||Google Inc.||Establishing an interactive environment for rendered documents|
|US8831365||Mar 11, 2013||Sep 9, 2014||Google Inc.||Capturing text from rendered documents using supplement information|
|US8872646||Oct 8, 2008||Oct 28, 2014||Dp Technologies, Inc.||Method and system for waking up a device due to motion|
|US8874129||Jun 10, 2010||Oct 28, 2014||Qualcomm Incorporated||Pre-fetching information based on gesture and/or location|
|US8874504||Mar 22, 2010||Oct 28, 2014||Google Inc.||Processing techniques for visual capture data from a rendered document|
|US8876738||Jul 12, 2010||Nov 4, 2014||Dp Technologies, Inc.||Human activity monitoring device|
|US8892495||Jan 8, 2013||Nov 18, 2014||Blanding Hovenweep, Llc||Adaptive pattern recognition based controller apparatus and method and human-interface therefore|
|US8902154||Jul 11, 2007||Dec 2, 2014||Dp Technologies, Inc.||Method and apparatus for utilizing motion user interface|
|US8903759||Sep 21, 2010||Dec 2, 2014||Google Inc.||Determining actions involving captured information and electronic content associated with rendered documents|
|US8949070||Feb 8, 2008||Feb 3, 2015||Dp Technologies, Inc.||Human activity monitoring device with activity identification|
|US8953886||Aug 8, 2013||Feb 10, 2015||Google Inc.||Method and system for character recognition|
|US8982034 *||Apr 20, 2009||Mar 17, 2015||Htc Corporation||Portable electronic apparatus and backlight control method thereof|
|US8988439||Jun 6, 2008||Mar 24, 2015||Dp Technologies, Inc.||Motion-based display effects in a handheld device|
|US8990235||Mar 12, 2010||Mar 24, 2015||Google Inc.||Automatically providing content associated with captured information, such as information captured in real-time|
|US8996332||Jun 23, 2009||Mar 31, 2015||Dp Technologies, Inc.||Program setting adjustments based on activity identification|
|US9030699||Aug 13, 2013||May 12, 2015||Google Inc.||Association of a portable scanner with input/output and storage devices|
|US9068844||Jan 8, 2010||Jun 30, 2015||Dp Technologies, Inc.||Method and apparatus for an integrated personal navigation system|
|US9075779||Apr 22, 2013||Jul 7, 2015||Google Inc.||Performing actions based on capturing information from rendered documents, such as documents under copyright|
|US9081799||Dec 6, 2010||Jul 14, 2015||Google Inc.||Using gestalt information to identify locations in printed information|
|US9116890||Jun 11, 2014||Aug 25, 2015||Google Inc.||Triggering actions in response to optically or acoustically capturing keywords from a rendered document|
|US9143638||Apr 29, 2013||Sep 22, 2015||Google Inc.||Data capture from rendered documents using handheld device|
|US9144398||Jul 22, 2014||Sep 29, 2015||Dp Technologies, Inc.||Sensor fusion for activity identification|
|US20060100887 *||Nov 9, 2004||May 11, 2006||Erickson David E||Apparatus, system, and method for a motion based business decision|
|US20070027585 *||Jul 27, 2005||Feb 1, 2007||Thomas Wulff||System and method for monitoring a mobile computing product/arrangement|
|US20070030341 *||Aug 2, 2006||Feb 8, 2007||Sony Corporation||Imaging system, camera control apparatus, panorama image generation method and program therefor|
|US20110254691 *||Aug 25, 2010||Oct 20, 2011||Sony Corporation||Display device and control method|
|US20110270562 *||Nov 3, 2011||Nikon Corporation||Profile measuring apparatus|
|US20130050406 *||Oct 25, 2012||Feb 28, 2013||Sony Corporation||Imaging system, camera control apparatus, panorama image generation method and program therefor|
|EP2267579A1||Jun 22, 2009||Dec 29, 2010||Research In Motion Limited||Portable electronic device and method of measuring drop impact at the portable electronic device|
|WO2007105032A2 *||Dec 13, 2006||Sep 20, 2007||Sony Ericsson Mobile Comm Ab||Electronic equipment with data transfer function using motion and method|
|WO2007149747A2 *||Jun 13, 2007||Dec 27, 2007||Symbol Technologies Inc||System and device for monitoring a computing device|
|WO2008002770A2 *||Jun 13, 2007||Jan 3, 2008||Symbol Technologies Inc||Touch panel system and method for activation thereof|
|WO2008042335A2 *||Sep 27, 2007||Apr 10, 2008||Hewlett Packard Development Co||Method for controlling component activation based on motion information and corresponding mobile communication device|
|WO2010003706A1 *||Jan 12, 2009||Jan 14, 2010||Sony Ericsson Mobile Communications Ab||Method and arrangement relating power supply in an electrical device|
|WO2010008900A1 *||Jun 24, 2009||Jan 21, 2010||Dp Technologies, Inc.||Program setting adjustments based on activity identification|
|WO2014090829A1 *||Dec 10, 2013||Jun 19, 2014||Compagnie Industrielle Et Financiere D'ingenierie "Ingenico"||Method of protecting an electronic terminal, computer program, and electronic terminal corresponding thereto|
|International Classification||G06F3/00, G01P15/00, G06F1/32, G01P1/12, G06F1/16, G06F21/00, G01P3/50, G06F3/01, H04M1/725, H04M1/73|
|Cooperative Classification||G01P1/127, G06F1/1626, G06F1/3203, Y02B60/50, G06F21/88, G06F1/1698, H04W52/027, G06F2200/1614, G06F1/3265, H04M2250/12, G03B17/18, G06F1/3246, G06F2221/2101, G06F3/017, G03B2217/18, G06F1/1694, H04M1/72572, H04M1/72569, G01P15/00, Y02B60/1242, G06F2221/2111, G01P3/50|
|European Classification||G06F1/16P9P9, G06F1/16P9P7, G03B17/18, H04W52/02T8C2, G06F1/32P5N, G06F1/32P5P5, G06F21/88, G01P15/00, G06F1/32P, G06F3/01G, G01P3/50, G06F1/16P3, G01P1/12C|
|Oct 12, 2004||AS||Assignment|
Owner name: SYMBOL TECHNOLOGIES, INC., NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WULFF, THOMAS;HAMILTON, ALISTAIR;BHATIA, SUDHIR;AND OTHERS;REEL/FRAME:015875/0445;SIGNING DATES FROM 20040816 TO 20040817