|Publication number||US20050285941 A1|
|Application number||US 10/878,952|
|Publication date||Dec 29, 2005|
|Filing date||Jun 28, 2004|
|Priority date||Jun 28, 2004|
|Also published as||DE602005010275D1, EP1782406A2, EP1782406B1, EP1916639A2, EP1916639A3, WO2006085960A2, WO2006085960A3|
|Inventors||Karen Haigh, Liana Kiff, Vassilios Morellas|
|Original Assignee||Haigh Karen Z, Kiff Liana M, Vassilios Morellas|
The present invention relates generally to the field of monitoring devices and systems. More specifically, the present invention relates to monitoring devices having on-board image processing capabilities.
Monitoring devices are used in a wide variety of applications for monitoring activity in one or more spaces. One type of monitoring device is a simple motion detector, which detects and then reports whether motion has occurred within the field of view (FOV) of the detector. In general, such motion detectors are part of a motion detection system that simply reports whether motion has been detected, typically without providing other information. Since these motion detectors typically do not capture images, they have limited use in identifying what is actually going on in the monitored space, but they can be of particular use in applications where privacy is demanded.
An example of a more sophisticated monitoring system is a video surveillance system. Video surveillance systems typically include a number of video cameras that relay video images of the monitored space to a centralized controller/processor, which can then provide the images to a display screen and/or video recording device. Video surveillance systems can have a number of drawbacks, however. First, they can be relatively expensive. Second, privacy concerns over the transmission of images to a centralized remote location can limit the use of such systems. In some homes, office buildings, hospitals, elder care facilities, and other locations, for example, the transmission of images to a monitor, screen, or recording device can cause apprehension, discomfort, and/or other privacy concerns for the occupants, preventing the installation of such systems in those locations. In certain cases, the transmission of images to a remote location may be prohibited by law, or may pose a security risk if intercepted by an unauthorized third party.
The present invention pertains to monitoring devices having on-board image processing capabilities. Associated systems and methods for monitoring one or more objects are also described herein.
A monitoring device in accordance with an illustrative embodiment of the present invention can include an image detector for viewing one or more objects within a field of view, an on-board image processor adapted to determine one or more object parameters related to the one or more objects in the FOV, and a communication means for transmitting an imageless output signal to a remote location such as a fire station, a police station, an Emergency Medical Service (EMS) provider, a security operator, a customer service center, and/or any other desired location. In certain embodiments, the monitoring device can be programmed to run one or more routines that can be used to compute various parameters relating to one or more tracked objects, the status of the monitoring device, as well as other environmental factors. In one such embodiment, for example, the monitoring device can be configured to output a detector output parameter, an environment output parameter, a significance output parameter, a confidence output parameter, and/or an object output parameter computed by the image processor. The number and/or type of output parameters can vary depending on the particular application, as desired.
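The several output parameter groups described above can be pictured as a single imageless output record. The sketch below is illustrative only; the field names and types are assumptions for the sketch, not terms taken from this disclosure.

```python
# Illustrative grouping of the imageless output parameters described
# above; field names and types are assumptions, not from the disclosure.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ImagelessOutput:
    detector: dict                        # device ID, pan/tilt/zoom, frame rate, ...
    environment: dict                     # e.g. ambient light level
    significance: Optional[str] = None    # e.g. "fall detected"
    confidence: dict = field(default_factory=dict)   # assertion -> percent
    objects: list = field(default_factory=list)      # per-object parameters
```

A record like this could be populated by the image processor after each processing pass and handed to the communication means, with unused fields simply left empty for applications that need fewer parameters.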
An illustrative method or routine for monitoring one or more objects using a monitoring device equipped with an on-board image processor can include the steps of initiating a low-power image differencing routine within the monitoring device that can be used to detect the initial presence of motion, and then initiating a higher rate mode within the monitoring device if motion is detected. Upon the initiation of the higher rate mode or at other desired times, the monitoring device can be configured to adjust the image capture rate, allowing higher-level information to be computed by the image processor. In certain embodiments, the monitoring device can be configured to determine if one or more of the computed parameters are of significance, and, if so, output that parameter to a remote location and/or to another monitoring device. In other embodiments, the monitoring device can be programmed to detect if a particular override event has occurred, justifying the output of an image to the remote location.
The following description should be read with reference to the drawings, in which like elements in different drawings are numbered in like fashion. The drawings, which are not necessarily to scale, depict illustrative embodiments and are not intended to limit the scope of the invention. Although examples of various operational steps are illustrated in the various views, those skilled in the art will recognize that many of the examples provided have suitable alternatives that can be utilized. Moreover, while specific applications are described throughout the disclosure, it should be understood that the present invention could be employed in other applications where motion detection is desired.
In some cases, the monitoring devices 16 can be operatively coupled to each other to permit the tracking of one or more objects within each room 14, or to track movement of an object from one room 14 to another. A first monitoring device 16a located in a lower-left room 22 of the building 12, for example, can be configured to track an individual 24 moving in a direction indicated generally by arrow 26. The first monitoring device 16a can be configured to initially scan the entire area of the room 22, and then pan and/or tilt in the direction of the individual 24 once their presence is detected. In certain embodiments, the first monitoring device 16a can also be configured to zoom in on the object using, for example, a vari-focal lens, as indicated generally by dashed lines 28.
A second monitoring device 16b located in an adjoining room 30 of the building 12 can be configured to track motion 26 of the individual 24 from the first room 22 into the second room 30. The second monitoring device 16b can have an overlapping field of view with the first monitoring device 16a to permit smooth transitioning and indexing from one monitoring device to the next without encountering any discontinuity; however, this is not required. As with the first monitoring device 16a, the second monitoring device 16b can include a set of pan, tilt, and zoom controls to facilitate tracking of the individual 24 within its field of view, if desired.
In certain rooms 14 within the building 12, multiple monitoring devices 16 can be employed to facilitate the tracking of multiple objects, or to differentiate between various features of a single object. In the illustrative monitoring system 10 of
In some embodiments, the monitoring devices 16 can be adapted to communicate with each other to permit monitoring of all or selective rooms 14 within the building 12, as desired. The monitoring devices 16 can be either hard-wired to each other via an electrical cable, fiber optic cable, or other suitable conduit, or can include a wireless transponder/receiver that can be used to wirelessly transmit and receive signals to and from each monitoring device 16 within the system 10.
In certain embodiments, the monitoring devices 16 can be networked with other components of the monitoring system 10 including, for example, fire or carbon monoxide detectors, window or door detectors, proximity sensors, ambient light sensors, temperature sensors, electrical load switches, glucose sensors, sleep detectors, seismic sensors, magnetic strip sensors, etc. to further detect movement or the occurrence of other events within the building 12. The monitoring devices 16 can be coupled to the other system components via a local area network (LAN), a wide area network (WAN), a public switched telephone network (PSTN), or other suitable connection means, as desired. In certain embodiments, a computer terminal 37 or other suitable system controller equipped with a user interface can be provided to coordinate the functioning of one or more components within the monitoring system 10.
In use, the monitoring system 10 can be used to monitor the health and safety of occupants living alone and, if necessary, contact a caregiver, security guard, or other third-party operator. In certain applications, for example, the monitoring system 10 can be used to monitor individuals at risk for injury such as the elderly or disabled. The monitoring devices 16 can be coordinated in a manner to detect, for example, whether an accidental fall has occurred, to detect the lack of an expected activity (e.g. eating or cooking), or to provide for home automation by activating lights, opening doors, etc. The monitoring devices 16 can also be used to discreetly monitor bathroom activity, or to provide an assessment of whether the individual is acting differently than normal, which may be indicative of a stroke or other emergency event.
The monitoring system 10 can also be used in fire and security applications to identify motion in areas where video would normally be inappropriate. In certain security applications, for example, the monitoring system 10 could be used to detect motion in restrooms, dressing rooms, or other areas where the transmission of images is normally restricted. In some fire detection applications, the monitoring system 10 could be employed to determine if a fire has occurred by detecting the presence of a flame, heat, or other such indicator, and then contact the fire department and alert the emergency personnel responding to the fire of the presence of trapped victims in areas within the building 12 that would otherwise not be monitored effectively. The monitoring system 10 can also be used in other applications such as that described in co-pending application Ser. No. 10/341,335, entitled “A Method for Monitoring, Recognizing, Supporting, and Responding to the Behavior of an Actor,” which is incorporated herein by reference in its entirety.
As discussed previously, each monitoring device 16 can be configured to output an imageless signal 18 that can then be transmitted by the monitoring system 10 to a remote location, thus ensuring the privacy and security of the occupants. In certain cases, however, it may be desirable and/or necessary to transmit an image signal to the remote location upon the occurrence of an event. If, for example, one or more of the monitoring devices 16 within the monitoring system 10 determine that an individual has fallen down, it may be desirable for the system 10 to transmit an image signal to emergency response personnel along with an alarm indicating that a fall has occurred. In such event, the monitoring system 10 can be configured to temporarily transmit an image signal, allowing the response personnel to confirm the actual occurrence of the event, and take appropriate action.
The image detector 44 may employ one or more infrared and/or visible light cameras capable of acquiring images that can be used by the image processor 42 to determine several object-related parameters. In certain embodiments, the monitoring device 40 can be configured to employ both infrared and visible light cameras, allowing the monitoring device 40 to differentiate between animate and inanimate objects.
The monitoring device 40 can be equipped with communication means 46 that can be used to transmit signals to and from a remote location 48 such as a computer terminal, relay station, or the like. The communication means 46 may include an antenna, electrical wire, fiber optic cable, or other suitable transmission means for transmitting signals back and forth between the remote location 48 and the monitoring device 40. In certain embodiments, the communication means 46 can be configured to receive commands from the remote location 48 or some other desired device and/or source that can be used to upload monitoring device routines, as needed, or to diagnose or check images received by the image detector 44 to verify the proper functioning of the monitoring device 40.
A coordination module 50 of the monitoring device 40 can be configured to coordinate the use of other monitoring devices 40 within the monitoring system, if any. In some embodiments, for example, the coordination module 50 can be utilized to synchronize tracking of multiple monitoring devices 40 within the system in order to anticipate movement of the objects across multiple fields. The coordination module 50 can also be used to coordinate the monitoring device 40 to function with other system components (e.g. proximity sensors, temperature sensors, etc.) in the system. In certain embodiments, for example, the coordination module 50 can be configured to synchronize the frame rate of the monitoring device 40 with other monitoring devices 40 and/or components in the monitoring system.
The monitoring device 40 may further include a detector control unit 52 for controlling the operation of the image detector 44. The detector control unit 52 can, for example, be operatively coupled to a set of pan, tilt and zoom controls that can be used to control the tracking and focusing of the image detector 44. The detector control 52 can also be used to adjust various other settings (e.g. sensitivity, operation time, etc.), as desired. The detector control 52 as well as other components of the monitoring device 40 can be powered via a power source 54 such as a battery or power line.
The image processor 42 can be programmed to run a number of special modes that can be used to task the monitoring device 40 in a particular manner. In certain embodiments, for example, the image processor 42 can be pre-programmed to run a separate vacation mode routine, sick mode routine, sleep mode routine or other such routine, allowing a user to adjust the types of information acquired and/or processed by the image processor 42. In a sleep mode routine, for example, the monitoring device 40 can be configured to trigger an intruder alarm response if motion is detected at a period of time when the actor is typically asleep.
The image processor 42 can be configured to compute a number of imageless output parameters 60 that can then be transmitted via the monitoring system to a remote location for monitoring. In the illustrative embodiment depicted in
The DETECTOR output parameter 62 outputted by the image processor 42 can be used to relay status information about the monitoring device 40 and any associated components. Example status information may include the identity of the particular monitoring device 40 providing the signal, the location of the detector, the pan/tilt/zoom settings of the detector, the amount of ambient light detected by the detector, the frame rate of the detector, the aspect ratio of the detector, the sensitivity settings of the detector, the power status of the detector, the date and time of the transmitted signal, as well as other desired information regarding the status and operation of the detector. If, for example, the monitoring device 40 detects motion within its FOV, the image processor 42 can be configured to output a unique identification code identifying the monitoring device 40 that detected the motion, along with the date and time at which the motion was detected. In some embodiments, self-diagnostic information can also be provided to check the operational status of the monitoring device 40, if desired.
An ENVIRONMENT output parameter 64 outputted by the image processor 42 can be used to provide information about the environment surrounding the monitoring device 40. In certain embodiments, for example, the image processor 42 can be configured to output the amount of ambient light detected, which can then be utilized to adjust the settings of the monitoring device 40, if necessary. If, for example, the image processor 42 determines that the level of ambient light surrounding the device is relatively low, the monitoring device 40 can be configured to increase the light sensitivity of the detector.
A SIGNIFICANCE output parameter 65 outputted by the monitoring device 40 may be used to alert a caregiver, security operator, customer service representative, computer, or other such receiver of the occurrence of a particular event. If, for example, an individual tracked by the monitoring device 40 abruptly stops for a certain period of time, or is oriented in an unusual position within a particular room (e.g. a restroom), the image processor 42 can be configured to transmit a SIGNIFICANCE output parameter 65 that can be utilized by the monitoring system to alert the receiver that an event requiring immediate response may have occurred. The SIGNIFICANCE output parameter 65 may comprise a binary signal such as “on” or “off”, or may comprise an alphanumeric message such as “fall detected”.
A CONFIDENCE output parameter 66 outputted by the monitoring device 40 may be used to provide an indication of the level of confidence that an event has occurred. In certain embodiments, the CONFIDENCE output parameter 66 may also indicate the percentage likelihood (e.g. 50%, 75%, 100%, etc.) that the event triggering the response is genuine. One or more differing confidence values can be provided for each object detected by the monitoring system, as well as for each output parameter 60 outputted by the monitoring device 40. If, for example, the image processor 42 is 80% confident that a detected object is an individual and 60% confident that the object is moving at a rate of 5 m/s, the monitoring device 40 can be configured to output a CONFIDENCE output parameter 66 indicating "80% confidence <OBJECT is PERSON>" and "60% confidence <OBJECT VELOCITY=5 m/s>". Similar information can be provided for multiple objects detected by the monitoring device 40. The number and type of values provided will, of course, depend on the particular application.
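The bracketed confidence messages quoted above can be generated mechanically from (percent, assertion) pairs. The sketch below assumes a simple text wire format mirroring the examples in the text; the exact format is an illustration, not one specified by this disclosure.

```python
# Hypothetical formatting of the CONFIDENCE output parameter 66: one
# (percent, assertion) pair per property computed for a tracked object.
# The "<...>" message shape follows the examples in the text above.
def format_confidence(assertions):
    """assertions: list of (percent, text) pairs, e.g. (80, "OBJECT is PERSON")."""
    return ["%d%% confidence <%s>" % (pct, text) for pct, text in assertions]

messages = format_confidence([(80, "OBJECT is PERSON"),
                              (60, "OBJECT VELOCITY=5 m/s")])
# messages[0] is "80% confidence <OBJECT is PERSON>"
```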
An OBJECT output parameter 68 of the image processor 42 can be configured to convey various information regarding objects detected by the monitoring device 40. As with the SIGNIFICANCE and CONFIDENCE output parameters 65,66, the information outputted via the OBJECT output parameter 68 may be application specific, relaying information necessary for a caregiver, security operator or other receiver to respond when an event has occurred. As discussed in greater detail below, such parameter can be provided, for example, to inform the receiver of the velocity, direction, size, temperature, orientation, as well as other such parameters corresponding to one or more objects being tracked. An identification code (e.g. “object 1”, “object 2”, etc.) corresponding to each object tracked can also be outputted to maintain consistency between each consecutive parameter outputted.
The output from the monitoring device 40 can be configured to prompt the monitoring system to trigger an alarm when a particular event has occurred, or when one or more objects are detected. The alarm can be audible, visual, or some combination of both. In certain embodiments, for example, a visual alarm can be provided by a flashing light emitting diode (LED) on a display panel, or by displaying an annotation on a video monitor. An aural alarm such as a siren or electronic voice announcer can also be provided, if desired. The visual and/or aural alarm may be provided in conjunction with the SIGNIFICANCE output parameter 65 to inform the receiver of the significance of the event.
A DISTANCE FROM DETECTOR output parameter 74 of the monitoring device 40 can be used to provide information relating to the distance of each tracked object from the monitoring device 40, or the distance of the object from some other object or geographic feature. If, for example, the image processor 42 determines that the tracked object is located 10 feet away from the monitoring device 40, a DISTANCE FROM DETECTOR output parameter 74 of “10 feet” can be outputted from the monitoring device 40.
A LOCATION OF OBJECT output parameter 75 of the monitoring device 40 can be used to provide information relating to the location of each tracked object within the FOV. The image processor 42 can be configured to determine the location of each tracked object, and then output a LOCATION OF OBJECT output parameter 75 indicating that location of the tracked object along with an identifier parameter identifying the object being tracked. The manner in which the monitoring device 40 expresses the LOCATION OF OBJECT output parameter 75 may vary depending on the particular application. In certain embodiments, for example, the LOCATION OF OBJECT output parameter 75 can be expressed as coordinates (e.g. Cartesian coordinates), pixel range, or other suitable location identifier. In those embodiments utilizing Cartesian coordinates, for example, a CAD design showing the locations of the system cameras and/or the approximate distances of the objects from each respective camera could be employed, if desired.
A TYPE OF OBJECT output parameter 76 and SIZE OF OBJECT output parameter 78 of the monitoring device 40 may be outputted by the image processor 42 to provide information about the type and size of each tracked object. Such parameters 76,78 can be provided, for example, to inform a security guard whether the type of object detected is animate or inanimate, whether the object tracked has appreciably increased in size over a period of time (e.g. indicative of shoplifting), whether the object tracked is a human or an animal, and so forth. As with other output parameters described herein, the image processor 42 can be configured to trigger an alarm signal if a particular type and/or size of object is detected.
A TEMPERATURE OF OBJECT output parameter 80 may be determined by the image processor 42 to provide an indication of the temperature of each tracked object within the field of view. Such a parameter may be useful for triggering a fire alarm if heat is detected, or can be used to differentiate between animate and inanimate objects detected by the monitoring device 40. In such a case, the image processor 42 can be configured to trigger an alarm or other alert informing the operator that the individual may need assistance.
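As one way to picture how such a parameter might be applied, the sketch below classifies a measured object temperature; the threshold values are assumptions for illustration, as the disclosure does not fix any temperature ranges.

```python
# Illustrative use of a TEMPERATURE OF OBJECT value to flag possible fire
# or to separate animate from inanimate objects; the thresholds below are
# assumed values, not taken from the disclosure.
ANIMATE_RANGE_C = (28.0, 42.0)   # rough skin-temperature band for a person or animal
FIRE_THRESHOLD_C = 100.0         # well above any animate object

def classify_by_temperature(temp_c):
    if temp_c >= FIRE_THRESHOLD_C:
        return "fire"                     # could trigger a fire alarm
    if ANIMATE_RANGE_C[0] <= temp_c <= ANIMATE_RANGE_C[1]:
        return "animate"                  # likely a person or animal
    return "inanimate"
```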
In certain applications, it may be desirable to confirm the identity of each object tracked by the monitoring device 40. In such cases, the image processor 42 can be configured to run a routine that recognizes the identity of the tracked object, and output a RECOGNITION OF OBJECT output parameter 82 that provides the operator with the identity of the individual. Such a parameter 82, for example, could be utilized in security applications wherein it may be desirable to confirm the identity of an individual prior to entrance into a restricted room or building.
An ORIENTATION OF OBJECT output parameter 84 and RATE OF CHANGE OF OBJECT ORIENTATION output parameter 86 can be further outputted by the image processor 42. If, for example, an individual has fallen down and is in need of assistance, the image processor 42 can be configured to output an ORIENTATION OF OBJECT output parameter 84 of “horizontal”, indicating that the tracked individual may require assistance.
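One plausible way to derive an orientation label like "horizontal" is from the aspect ratio of the tracked object's bounding box; this heuristic is an assumption for the sketch, since the disclosure leaves the computation open.

```python
# Hypothetical derivation of the ORIENTATION OF OBJECT parameter 84 from a
# tracked object's bounding box; the aspect-ratio heuristic is an assumption.
def object_orientation(box_width, box_height):
    """Classify a bounding box as 'horizontal' or 'vertical'."""
    return "horizontal" if box_width > box_height else "vertical"

# A standing person is taller than wide; a fallen person is wider than tall,
# so a sustained "horizontal" reading could indicate that assistance is needed.
```

The RATE OF CHANGE OF OBJECT ORIENTATION parameter 86 could then be approximated by comparing such classifications (or the underlying aspect ratios) across successive frames.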
A NUMBER OF OBJECTS output parameter 88 may be provided to indicate the number of objects detected within the monitoring device's 40 field of view. If, for example, three individuals are detected by the monitoring device 40, the image processor 42 can be configured to output a NUMBER OF OBJECTS parameter 88 of “3”. Such output parameter 88 can be used in conjunction with other output parameters to facilitate tracking of multiple objects by the monitoring system, if desired. In certain embodiments, the output from the monitoring device 40 can cause the monitoring system to activate an alarm or other alert if the number of objects detected reaches a certain minimum or maximum threshold value.
An OBJECT IDENTIFIER output parameter 90 can be provided for each object detected to facilitate tracking of multiple objects within the monitoring device's 40 field of view, and/or to facilitate tracking of multiple objects using other devices within the monitoring system. If, for example, the image processor 42 determines that 2 objects are located within a particular room (e.g. a bedroom), the monitoring device 40 can be configured to output an OBJECT IDENTIFIER output parameter 90 (e.g. “object 1” and “object 2”) for each object detected along with a NUMBER OF OBJECTS output parameter 88 of “2”, indicating that two objects of interest are being tracked by the monitoring device 40.
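Maintaining consistent identifiers across consecutive outputs requires matching each new detection to a previously tracked object. The nearest-centroid scheme below is a hypothetical sketch of one such matching step; the disclosure does not specify a tracking method, and the distance threshold is an assumed value.

```python
# Hypothetical assignment of stable OBJECT IDENTIFIER values 90 by matching
# each detected centroid to the nearest previously tracked centroid.
import math

def assign_ids(tracked, detections, max_dist=50.0, next_id=1):
    """tracked: {id: (x, y)} from the prior frame; detections: list of (x, y).
    Returns (updated {id: (x, y)}, next unused id)."""
    updated = {}
    for det in detections:
        best, best_d = None, max_dist
        for oid, pos in tracked.items():
            if oid in updated:        # each prior track matches at most once
                continue
            d = math.dist(pos, det)
            if d < best_d:
                best, best_d = oid, d
        if best is None:              # no track nearby: a new object entered
            best = next_id
            next_id += 1
        updated[best] = det
    return updated, next_id
```

The NUMBER OF OBJECTS parameter 88 would then simply be the size of the updated mapping after each pass.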
While the embodiment of
Each of the monitoring devices 96,98,100 may be configured to communicate with each other to coordinate tracking of one or more objects. In the illustrative embodiment of
As can be further seen in
Turning now to
Beginning with block 142, a temporal image differencing routine can be configured to detect changes indicative of movement and/or the presence of an object. This can be achieved, for example, by processing pixel intensity differences in the three most recent images acquired by a camera or other image detector (block 138) and stored in memory (block 140). If a change is detected between the compared images (decision block 144), the monitoring device can be configured to “wake up” and initiate a higher rate mode, as indicated generally by reference number 146, wherein image frames are processed at a higher frame rate (block 148) to permit the image processor to compute higher-level information about the object. At this step, the monitoring device may also employ image-filtering techniques (e.g. spatial median filter, dilation, etc.) to filter out certain components of the image signal prior to image processing. Alternatively, if no object motion is detected, the monitoring device can be configured to return to the initial step (i.e. step 138) and repeat the image differencing process until such motion is detected.
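A minimal sketch of the temporal image differencing step of block 142, assuming grayscale frames stored as 2-D lists of pixel intensities; the threshold values are illustrative assumptions, not figures from the disclosure.

```python
# Low-power temporal image differencing over the three most recent frames
# (blocks 138-144 above); DIFF_THRESHOLD and MOTION_FRACTION are assumed values.
DIFF_THRESHOLD = 25      # per-pixel intensity change that counts as "changed"
MOTION_FRACTION = 0.01   # fraction of changed pixels that signals motion

def motion_detected(f0, f1, f2):
    """f0..f2: the three most recent frames, oldest first, as 2-D lists."""
    changed = total = 0
    for r0, r1, r2 in zip(f0, f1, f2):
        for p0, p1, p2 in zip(r0, r1, r2):
            total += 1
            # Require a change in both consecutive frame pairs so that
            # single-frame sensor noise is not mistaken for motion.
            if abs(p1 - p0) > DIFF_THRESHOLD and abs(p2 - p1) > DIFF_THRESHOLD:
                changed += 1
    return changed / total > MOTION_FRACTION
```

A True result would correspond to the "wake up" transition into the higher rate mode 146; a False result corresponds to looping back to the image acquisition step of block 138.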
While it is anticipated that the higher rate mode 146 will be activated upon the detection of motion within the field of view in order to conserve power, an on/off switch or other suitable input means may be provided to permit the monitoring device to operate in the higher rate mode 146 at other desired times. In some embodiments, the monitoring device can be configured to initiate the higher rate mode 146 if motion is anticipated (e.g. via a control signal sent from another monitoring device), or upon the activation of another system component (e.g. a door or window sensor).
In the illustrative embodiment, once the monitoring device has detected motion of one or more objects, an image-processing step (block 150) may be performed to compute a number of desired parameters relating to one or more objects within the field of view. As discussed herein, the parameters may relate to the detector ID of the monitoring device, the date/time/location of the event and/or object, the significance of the event, and various parameters relating to the movement, orientation, size, identity, temperature or other desired parameter of the tracked object.
As indicated generally by decision block 152, once one or more parameters are computed at step 150, the monitoring device can determine if the computed parameter(s) is/are significant, and if so, transmit an imageless output signal as indicated by block 154. If, for example, the monitoring device determines that there is more than one moving object within the monitoring device's field of view when only one object is anticipated, the monitoring device can be configured to transmit an imageless output signal indicating that more than one moving object has been detected. In certain embodiments, the imageless output signal (block 154) transmitted by the monitoring device may cause the monitoring system to activate a visual and/or aural alarm that can be used to alert an operator that an event may have occurred. The process can then be repeated again with a new set of images.
If none of the computed parameter(s) is/are determined to be significant, the monitoring device can be configured to determine whether motion is still present, as indicated generally by decision block 156. If motion is still detected, the monitoring device can be configured to repeat the image-processing step of block 150 to compute a new set of parameters, otherwise the monitoring device can be configured to revert to the initial state 136 and repeat the image differencing process until such motion is detected.
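The flow of blocks 136 through 156 can be summarized as a single wake, compute, and revert cycle. In the sketch below the helper callables are placeholders standing in for the camera, the differencing routine, the parameter computation, the significance test, and the transmitter; the structure, not the names, is what the disclosure describes.

```python
# Hypothetical sketch of one monitoring cycle (blocks 136-156 above);
# all five callables are placeholders supplied by the caller.
def run_cycle(capture, detect_motion, compute_params, is_significant, transmit):
    frames = capture("low")                    # low-power acquisition, block 138
    if not detect_motion(frames):              # decision block 144
        return "idle"                          # remain in initial state 136
    while True:                                # higher rate mode 146
        frames = capture("high")               # higher frame rate, block 148
        params = compute_params(frames)        # image processing, block 150
        significant = [p for p in params if is_significant(p)]  # block 152
        if significant:
            transmit(significant)              # imageless output, block 154
        if not detect_motion(frames):          # decision block 156
            return "reverted"                  # back to initial state 136
```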
Having thus described the several embodiments of the present invention, those of skill in the art will readily appreciate that other embodiments may be made and used which fall within the scope of the claims attached hereto. Numerous advantages of the invention covered by this document have been set forth in the foregoing description. It will be understood that this disclosure is, in many respects, only illustrative. Changes can be made with respect to various elements described herein without exceeding the scope of the invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5396284 *||Aug 20, 1993||Mar 7, 1995||Burle Technologies, Inc.||Motion detection system|
|US6049281 *||Sep 29, 1998||Apr 11, 2000||Osterweil; Josef||Method and apparatus for monitoring movements of an individual|
|US6445298 *||Dec 21, 2000||Sep 3, 2002||Isaac Shepher||System and method for remotely monitoring movement of individuals|
|US6504482 *||Jan 4, 2001||Jan 7, 2003||Sanyo Electric Co., Ltd.||Abnormality detection apparatus and method|
|US6570608 *||Aug 24, 1999||May 27, 2003||Texas Instruments Incorporated||System and method for detecting interactions of people and vehicles|
|US6611206 *||Mar 15, 2001||Aug 26, 2003||Koninklijke Philips Electronics N.V.||Automatic system for monitoring independent person requiring occasional assistance|
|US6816184 *||Apr 15, 1999||Nov 9, 2004||Texas Instruments Incorporated||Method and apparatus for mapping a location from a video image to a map|
|US7023469 *||Apr 15, 1999||Apr 4, 2006||Texas Instruments Incorporated||Automatic video monitoring system which selectively saves information|
|US20020118862 *||Feb 25, 2002||Aug 29, 2002||Kazuo Sugimoto||Moving object detector and image monitoring system|
|US20020171551 *||Mar 15, 2001||Nov 21, 2002||Eshelman Larry J.||Automatic system for monitoring independent person requiring occasional assistance|
|US20030053658 *||Dec 27, 2001||Mar 20, 2003||Honeywell International Inc.||Surveillance system and methods regarding same|
|US20030053659 *||Dec 27, 2001||Mar 20, 2003||Honeywell International Inc.||Moving object assessment system and method|
|US20030123703 *||Dec 27, 2001||Jul 3, 2003||Honeywell International Inc.||Method for monitoring a moving object and system regarding same|
|US20030174772 *||Jan 30, 2003||Sep 18, 2003||Transchip, Inc.||Systems and methods for utilizing activity detection information in relation to image processing|
|US20040008254 *||Jun 5, 2003||Jan 15, 2004||Martin Rechsteiner||Object protection device|
|US20040030531 *||Jan 10, 2003||Feb 12, 2004||Honeywell International Inc.||System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor|
|US20050146605 *||Nov 15, 2001||Jul 7, 2005||Lipton Alan J.||Video surveillance system employing video primitives|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7643056 *||Mar 14, 2005||Jan 5, 2010||Aptina Imaging Corporation||Motion detecting camera system|
|US7834235||Aug 31, 2006||Nov 16, 2010||Kimberly-Clark Worldwide, Inc.||System for interactively training a child and a caregiver to assist the child to overcome bedwetting|
|US7969973 *||Oct 6, 2006||Jun 28, 2011||Canon Kabushiki Kaisha||Information processing apparatus, method for controlling the same, and program|
|US8029411||Jul 31, 2007||Oct 4, 2011||Honeywell International Inc.||Systems and methods of monitoring exercises and ranges of motion|
|US8064722 *||Mar 6, 2007||Nov 22, 2011||The United States Of America As Represented By The Secretary Of The Navy||Method and system for analyzing signal-vector data for pattern recognition from first order sensors|
|US8587667 *||Jul 9, 2007||Nov 19, 2013||International Business Machines Corporation||Beyond field-of-view tracked object positional indicators for television event directors and camera operators|
|US8775452||Sep 14, 2007||Jul 8, 2014||Nokia Corporation||Method, apparatus and computer program product for providing standard real world to virtual world links|
|US8780214 *||May 6, 2011||Jul 15, 2014||Panasonic Corporation||Imaging apparatus using shorter and larger capturing intervals during continuous shooting function|
|US9019373 *||Mar 23, 2012||Apr 28, 2015||Kabushiki Kaisha Toshiba||Monitoring device, method thereof|
|US20050226463 *||Jul 20, 2004||Oct 13, 2005||Fujitsu Limited||Imaging data server and imaging data transmission system|
|US20060007310 *||Jun 23, 2005||Jan 12, 2006||Avermedia Technologies, Inc.||Surveillance system and surveillance method|
|US20070189728 *||Dec 22, 2006||Aug 16, 2007||Lg Electronics Inc.||Method of recording and reproducing surveillance images in DVR|
|US20090185784 *||Jul 23, 2009||Atsushi Hiroike||Video surveillance system and method using ip-based networks|
|US20110279691 *||Nov 17, 2011||Panasonic Corporation||Imaging apparatus|
|US20120113311 *||Jun 1, 2011||May 10, 2012||Hon Hai Precision Industry Co., Ltd.||Image capture device and method for adjusting focal point of lens of image capture device|
|US20130057702 *||Jul 6, 2010||Mar 7, 2013||Lg Electronics Inc.||Object recognition and tracking based apparatus and method|
|US20130063593 *||Mar 14, 2013||Kabushiki Kaisha Toshiba||Monitoring device, method thereof|
|EP2052371A1 *||Jul 4, 2007||Apr 29, 2009||Tyco Safety Products Canada Ltd.||Intruder detection using video and infrared data|
|WO2008066619A1 *||Oct 19, 2007||Jun 5, 2008||Travis Sparks||Pool light with safety alarm and sensor array|
|U.S. Classification||348/155, 348/143, 348/E07.086, 348/E05.065|
|International Classification||H04N9/47, H04N7/18, G08B21/04, G08B13/194, H04N5/14, G08B13/196|
|Cooperative Classification||G08B13/19671, G08B13/19602, G08B13/19608, G08B13/1968, H04N7/181, H04N5/144, G08B21/0492, G08B21/0423, G08B21/0476, G08B13/19645|
|European Classification||G08B13/196A, G08B13/196S3, G08B13/196U1, G08B13/196L2, G08B13/196A3, G08B21/04S7, G08B21/04S5, G08B21/04A2, H04N7/18C, H04N5/14M|
|Jun 28, 2004||AS||Assignment|
Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAIGH, KAREN Z.;KIFF, LIANA M.;MORELLAS, VASSILIOS;REEL/FRAME:015528/0980
Effective date: 20040614