US20020015094A1 - Monitoring system and imaging system - Google Patents

Monitoring system and imaging system

Info

Publication number
US20020015094A1
Authority
US
United States
Prior art keywords
imaging
monitoring
monitoring area
recording
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/084,315
Other versions
US6456320B2 (en)
Inventor
Yukinori Kuwano
Toshiyuki Okino
Takashi Ikeda
Masato Arisawa
Hideto Fujita
Haruhiko Murata
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Godo Kaisha IP Bridge 1
Original Assignee
Sanyo Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP14615797A external-priority patent/JPH10333219A/en
Priority claimed from JP14771697A external-priority patent/JPH10336630A/en
Priority claimed from JP14771797A external-priority patent/JPH10336632A/en
Priority claimed from JP14745497A external-priority patent/JPH1145379A/en
Application filed by Sanyo Electric Co Ltd filed Critical Sanyo Electric Co Ltd
Assigned to SANYO ELECTRIC CO., LTD. reassignment SANYO ELECTRIC CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ARISAWA, MASATO, FUJITA, HIDETO, IKEDA, TAKASHI, KUWANO, YUKINORI, MURATA, HARUHIKO, OKINO, TOSHIYUKI
Publication of US20020015094A1 publication Critical patent/US20020015094A1/en
Application granted granted Critical
Publication of US6456320B2 publication Critical patent/US6456320B2/en
Assigned to GODO KAISHA IP BRIDGE 1 reassignment GODO KAISHA IP BRIDGE 1 ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SANYO ELECTRIC CO., LTD.
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19641Multiple cameras having overlapping views on a single scene
    • G08B13/19643Multiple cameras having overlapping views on a single scene wherein the cameras play different roles, e.g. different resolution, different camera type, master-slave camera
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19652Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19697Arrangements wherein non-video detectors generate an alarm themselves

Definitions

  • FIG. 1 illustrates the schematic configuration of a first monitoring system capable of detecting that a person enters a monitoring area from an area outside the monitoring area.
  • the first monitoring system comprises a video camera 1 for imaging a monitoring area 100 , a monitor 2 for displaying an image picked up by the video camera 1 , a recording device 3 for recording the image picked up by the video camera 1 , and a monitoring control device 4 .
  • An output of the video camera 1 is fed to the monitor 2 , the recording device 3 , and the monitoring control device 4 .
  • the image picked up by the video camera 1 is always displayed on the monitor 2 .
  • the recording device 3 is controlled on the basis of a control signal from the monitoring control device 4 .
  • the monitoring control device 4 comprises an analog-to-digital converter (ADC) 41 , a motion vector detecting circuit 42 , a CPU 43 , an alarm 44 , a during-monitoring display lamp 45 , and an operating unit 46 .
  • the CPU 43 comprises a ROM (not shown) storing its program and the like and a RAM (not shown) storing necessary data.
  • the ADC 41 converts an analog image signal outputted from the video camera 1 into a digital image signal.
  • the digital image signal outputted from the ADC 41 is fed to the motion vector detecting circuit 42 .
  • the motion vector detecting circuit 42 detects, for each frame, motion vectors (information relating to the movement) for a plurality of detecting areas E set in an imaging area (a monitoring area) 100 of the video camera 1, as shown in FIG. 3, on the basis of a representative point matching method.
  • each of the detecting areas E is further divided into a plurality of small areas e, as shown in FIG. 4. As shown in FIG. 5, a plurality of sampling points S and one representative point R are set in each of the small areas e.
  • For each of the detecting areas E, a correlated value at each of the sampling points S is first found as the difference between the image signal level at that sampling point S in the current frame and the image signal level at the representative point R of the corresponding small area e in the preceding frame. Then the sum of correlated values at the sampling points S which are the same in deviation from the representative points R in all the small areas e in the detecting area E is found (a value obtained is hereinafter referred to as an accumulated correlated value). Consequently, accumulated correlated values whose number corresponds to the number of the sampling points S in one of the small areas e are found for each of the detecting areas E.
  • Deviation of the sampling point S having the minimum accumulated correlated value, that is, having the highest correlation in each of the detecting areas E is extracted as a motion vector (the movement of an object) in the detecting area E.
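  • The representative point matching just described can be summarized by the Python sketch below. It is only an illustration under assumed parameters: the detecting-area and small-area sizes, the ±4-pixel search window, and the choice of each small area's centre pixel as the representative point R are not values given in the patent text.

```python
import numpy as np

def detect_motion_vectors(prev_frame, cur_frame, area=64, block=16, search=4):
    """Return one motion vector (dy, dx) per detecting area E (assumed sizes)."""
    h, w = prev_frame.shape
    vectors = {}
    for ay in range(0, h - area + 1, area):              # detecting areas E
        for ax in range(0, w - area + 1, area):
            # accumulated correlated value for every candidate deviation
            acc = np.zeros((2 * search + 1, 2 * search + 1), dtype=np.int64)
            for by in range(ay, ay + area, block):       # small areas e
                for bx in range(ax, ax + area, block):
                    ry, rx = by + block // 2, bx + block // 2   # representative point R
                    r = int(prev_frame[ry, rx])
                    for dy in range(-search, search + 1):       # sampling points S
                        for dx in range(-search, search + 1):
                            s = int(cur_frame[ry + dy, rx + dx])
                            acc[dy + search, dx + search] += abs(s - r)
            # the deviation with the minimum accumulated correlated value
            # (highest correlation) is taken as the motion vector of area E
            dy, dx = np.unravel_index(int(np.argmin(acc)), acc.shape)
            vectors[(ay, ax)] = (int(dy) - search, int(dx) - search)
    return vectors
```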
  • When no person enters the monitoring area, the magnitude of the motion vector in each of the detecting areas E is less than a predetermined value, as shown in FIG. 7a.
  • When a person enters the monitoring area, the magnitude of the motion vector in each detecting area E on which the entering person Q is projected is not less than the predetermined value, as shown in FIG. 7b.
  • a motion vector for each of the detecting areas E which is detected by the motion vector detecting circuit 42 is fed to the CPU 43 .
  • the CPU 43 performs entrance monitoring processing on the basis of the motion vectors for the detecting areas E which are inputted for each frame.
  • FIG. 8 shows the procedure for entrance monitoring processing performed by the CPU 43 .
  • the entrance monitoring processing shown in FIG. 8 is processing effective in detecting an entering person such as a thief, to report the entering person to a supervisor.
  • the during-monitoring display lamp 45 is first turned on (step 1 ).
  • When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 2), it is judged whether or not an object moves in at least one of the detecting areas E (step 3).
  • When it is judged at the step 3 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 44 is driven to report to the supervisor that a person enters the monitoring area, and recording by the recording device 3 is started to record the person entering the monitoring area (step 4). Further, the during-monitoring display lamp 45 is turned off.
  • When the supervisor enters a recording stop command using the operating unit 46 (YES at step 7), the recording by the recording device 3 is stopped (step 8). The program is returned to the step 1.
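  • A minimal Python sketch of this FIG. 8 flow is shown below; the alarm, recorder, lamp and operating-unit objects, the per-frame vector source and the magnitude threshold are assumed interfaces, not elements named in the patent.

```python
# Sketch of the FIG. 8 entrance-monitoring loop (assumed interfaces throughout).
def entrance_monitoring(frames_of_vectors, alarm, recorder, lamp, operating_unit,
                        threshold=2.0):
    def magnitude(v):
        dy, dx = v
        return (dy * dy + dx * dx) ** 0.5

    while True:
        lamp.on()                                      # step 1: during-monitoring lamp on
        for vectors in frames_of_vectors:              # step 2: vectors for one frame
            if any(magnitude(v) >= threshold for v in vectors):
                alarm.drive()                          # step 4: report to the supervisor
                recorder.start()                       #         and start recording
                lamp.off()
                break                                  # (step 3 judged YES)
        while not operating_unit.stop_requested():     # step 7: wait for a stop command
            pass
        recorder.stop()                                # step 8; then back to step 1
```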
  • FIG. 9 shows the procedure for another entrance monitoring processing performed by the CPU 43 .
  • the entrance monitoring processing shown in FIG. 9 is processing effective in detecting and reporting to the supervisor in a store or the like that a customer visited the store, and causing the supervisor to check the customer.
  • the during-monitoring display lamp 45 is first turned on (step 11 ).
  • When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 12), it is judged whether or not an object moves in at least one of the detecting areas E (step 13).
  • When it is judged at the step 13 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 44 is driven to report to the supervisor that a person enters the monitoring area, and recording by the recording device 3 is started to record the person entering the monitoring area (step 14). Further, the during-monitoring display lamp 45 is turned off.
  • When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 17), it is judged whether or not the object moves in at least one of the detecting areas E (step 18). When the object moves in at least one of the detecting areas E, the program is returned to the step 17. Until it is judged at the step 18 that the object does not move in any of the detecting areas E, the processing at the steps 17 and 18 is repeated.
  • When it is judged at the step 18 that the object does not move in any of the detecting areas E, it is judged that the person entering the monitoring area exits from the monitoring area. Thereafter, the recording by the recording device 3 is stopped (step 20) after an elapse of a predetermined time period T2, for example, one minute (step 19). The program is returned to the step 11.
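  • The automatic-stop portion of this flow (steps 17 to 20) might look like the following sketch, with the same assumed interfaces as the previous example and T2 expressed in seconds.

```python
import time

# Sketch of steps 17-20 of FIG. 9: after recording has started, keep recording
# while movement continues, then stop T2 after the last movement is seen.
def record_until_exit(frames_of_vectors, recorder, t2_seconds=60, threshold=2.0):
    def magnitude(v):
        dy, dx = v
        return (dy * dy + dx * dx) ** 0.5

    for vectors in frames_of_vectors:                  # steps 17-18: watch each frame
        if not any(magnitude(v) >= threshold for v in vectors):
            break                                      # no movement in any area E
    time.sleep(t2_seconds)                             # step 19: wait T2 (e.g. one minute)
    recorder.stop()                                    # step 20: stop recording
```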
  • FIG. 10 illustrates the electrical configuration of a second monitoring system capable of detecting that a person enters a monitoring area from an area outside the monitoring area.
  • the second monitoring system comprises a video camera 201 for imaging a monitoring area 100, an analog-to-digital converter (ADC) 202 for converting an image signal outputted from the video camera 201 into a digital signal, a monitor 203 for displaying an image picked up by the video camera 201 on the basis of the digital signal obtained by the ADC 202, a digital recording device 204 for recording the digital signal obtained by the ADC 202, an entering person detecting sensor 205 arranged in a place which is expected to be the entrance of an entrance path to the monitoring area 100, a monitoring control device 206, and a power supply 210 for supplying power to each of the devices.
  • An example of the digital recording device 204 is one for recording the digital signal on an optical disk, such as an MO (Magneto-Optical disk) or a CD-R (Compact Disc-Recordable).
  • An example of the entering person detecting sensor 205 is a photoelectric detector or a magnetometric sensor.
  • An example of the power supply 210 is one comprising a solar battery 211 and a storage battery 212 storing power obtained by the solar battery 211 .
  • the monitoring control device 206 comprises a motion vector detecting circuit 221 , a CPU 222 , an alarm 223 , a during-monitoring display lamp 224 , and an operating unit 225 .
  • An output of the entering person detecting sensor 205 is inputted to the CPU 222 .
  • the CPU 222 carries out the on-off control of the power supplies of the video camera 201 , the ADC 202 and the monitor 203 , and controls a recording operation of the digital recording device 204 .
  • FIG. 11 shows the procedure for entrance monitoring processing performed by the CPU 222 .
  • the during-monitoring display lamp 224 is first turned on (step 51 ).
  • the CPU 222 waits until an entering person is detected by the entering person detecting sensor 205 (step 52 ).
  • the power supplies of the video camera 201 , the ADC 202 and the monitor 203 are turned on (step 53 ).
  • When the answer is in the affirmative at the step 62 after the processing at the steps 54, 55 and 62 is repeated, that is, when the movement of the object is not detected until the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 61). The program is returned to the step 51.
  • When it is judged at the step 55 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 223 is driven to report to a supervisor that a person enters the monitoring area, and recording by the recording device 204 is started to record the person entering the monitoring area (step 56). Further, the during-monitoring display lamp 224 is turned off.
  • When the supervisor enters a recording stop command using the operating unit 225 (YES at step 59), the recording by the recording device 204 is stopped (step 60). The power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 61). The program is returned to the step 51.
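  • The sensor-triggered power management of FIG. 11 can be sketched as follows; the sensor, device, alarm and recorder objects, the per-frame vector source and the stop-command handling are assumptions intended only to make the control flow concrete.

```python
import time

# Sketch of the FIG. 11 power management: camera, ADC and monitor stay off
# until the entering-person sensor fires, and are powered down again if no
# movement is detected within T0 (assumed interfaces throughout).
def sensor_triggered_monitoring(sensor, powered_devices, next_frame_vectors,
                                alarm, recorder, wait_for_stop,
                                t0_seconds=300, threshold=2.0):
    def magnitude(v):
        dy, dx = v
        return (dy * dy + dx * dx) ** 0.5

    while True:
        sensor.wait_for_detection()                    # step 52: entering person detected
        for d in powered_devices:
            d.power_on()                               # step 53: camera, ADC, monitor on
        deadline = time.monotonic() + t0_seconds
        movement = False
        while time.monotonic() < deadline:             # steps 54, 55 and 62
            if any(magnitude(v) >= threshold for v in next_frame_vectors()):
                movement = True
                break
        if movement:
            alarm.drive()                              # step 56: report to the supervisor
            recorder.start()                           #          and start recording
            wait_for_stop()                            # steps 59-60: until a stop command
            recorder.stop()
        for d in powered_devices:
            d.power_off()                              # step 61: everything off again
```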
  • FIG. 12 shows the procedure for another entrance monitoring processing performed by the CPU 222 .
  • the during-monitoring display lamp 224 is first turned on (step 71 ).
  • the CPU 222 waits until an entering person is detected by the entering person detecting sensor 205 (step 72 ).
  • the power supplies of the video camera 201 , the ADC 202 and the monitor 203 are turned on (step 73 ).
  • When it is judged that the object does not move in any of the detecting areas E (NO at step 75), it is judged whether or not a predetermined time period T0 (for example, five minutes) has elapsed since the power supply of the video camera 201 was turned on at the foregoing step 73 (step 84). Unless the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the program is returned to the step 74. The processing at the steps 74, 75 and 84 is repeated.
  • When the answer is in the affirmative at the step 84 after the processing at the steps 74, 75 and 84 is repeated, that is, when the movement of the object is not detected until the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 83).
  • the program is returned to the step 71 .
  • When it is judged at the step 75 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 223 is driven to report to a supervisor that a person enters the monitoring area, and recording by the recording device 204 is started to record the person entering the monitoring area (step 76). Further, the during-monitoring display lamp 224 is turned off.
  • Thereafter, when motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 79), it is judged whether or not the object moves in at least one of the detecting areas E (step 80). When the object moves in at least one of the detecting areas E, the program is returned to the step 79. Until it is judged at the step 80 that the object does not move in any of the detecting areas E, the processing at the steps 79 and 80 is repeated.
  • When it is judged at the step 80 that the object does not move in any of the detecting areas E, it is judged that the person entering the monitoring area exits from the monitoring area. Thereafter, the recording by the recording device 204 is stopped (step 82) after an elapse of a predetermined time period T2, for example, one minute (step 81). The power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 83). The program is returned to the step 71. While the power supply of the video camera 201 is on, the power supply of the entering person detecting sensor 205 may be turned off.
  • According to the second monitoring system, it is possible to monitor the entrance of a person from a gate, a wall, etc. around a house, for example, by the entering person detecting sensor 205, and to monitor the entrance of the person into the house using the video camera 201.
  • Since the power supply of the video camera 201 is not always on but is turned on only when an entering person is detected by the entering person detecting sensor 205, the power consumption can be reduced.
  • the entrance can be monitored even in a monitoring area to which no power is usually supplied.
  • the digital recording device can record, in addition to image information, information for retrieving an image represented by the image information, for example, a motion vector of the image, so that a desired image is easy to retrieve. Further, the speed for retrieval is high.
  • When a recorded image is transmitted to a monitoring chamber and is displayed or recorded in the monitoring chamber, it is possible to make digital transmission. Therefore, the recorded image is hardly degraded by the transmission, so that it is possible to more clearly display or record the image. Since the retrieval is easy, and the image is hardly degraded by the transmission and the recording, as described above, it is easy to extract only an important part of the recorded image to produce a database.
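  • As a rough illustration of recording retrieval information (such as motion vectors) together with each image, the following sketch writes every frame next to a small index entry holding its largest motion vector; the file layout and JSON-lines index format are assumptions, not part of the patent.

```python
import json
import os
import time

# Sketch: store a recorded frame together with retrieval metadata so that
# frames containing large movement can be located quickly later (assumed layout).
def record_frame_with_index(frame_bytes, motion_vectors, out_dir, frame_no):
    os.makedirs(out_dir, exist_ok=True)
    image_path = os.path.join(out_dir, f"frame_{frame_no:08d}.jpg")
    with open(image_path, "wb") as f:
        f.write(frame_bytes)                           # the picked-up image itself
    entry = {
        "file": image_path,
        "time": time.time(),
        "max_motion": max(((dy * dy + dx * dx) ** 0.5 for dy, dx in motion_vectors),
                          default=0.0),
    }
    with open(os.path.join(out_dir, "index.jsonl"), "a") as f:
        f.write(json.dumps(entry) + "\n")              # retrieval metadata
```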
  • FIG. 13 illustrates the electrical configuration of a third monitoring system capable of detecting that a person exits from a monitoring area to an area outside the monitoring area.
  • the third monitoring system comprises a video camera 101 for imaging the monitoring area, a monitor 102 for displaying an image picked up by the video camera 101 , and a monitoring control device 103 .
  • An output of the video camera 101 is fed to the monitor 102 and the monitoring control device 103 .
  • the image picked up by the video camera 101 is always displayed on the monitor 102 .
  • the monitoring control device 103 comprises an analog-to-digital converter (ADC) 141 , a motion vector detecting circuit 142 , a CPU 143 , an alarm 144 , and an operating unit 145 .
  • the CPU 143 comprises a ROM (not shown) storing its program and the like and a RAM (not shown) storing necessary data.
  • the ADC 141 converts an analog image signal outputted from the video camera 101 into a digital image signal.
  • the digital image signal outputted from the ADC 141 is fed to the motion vector detecting circuit 142 .
  • the motion vector detecting circuit 142 detects for each frame motion vectors for a plurality of detecting areas E set in an image area (a monitoring area) 100 of the video camera 101 , as shown in FIG. 3, on the basis of a representative point matching method, similarly to the motion vector detecting circuit 42 shown in FIG. 2.
  • the motion vector for each of the detecting areas E which has been detected by the motion vector detecting circuit 142 is fed to the CPU 143 .
  • the CPU 143 performs exit monitoring processing on the basis of the motion vectors for the detecting areas E which are inputted for each frame.
  • the exit monitoring processing is processing effective in detecting and reporting to a supervisor that a person to be monitored such as a child exits from the monitoring area 100 .
  • the outline of the exit monitoring processing will be described.
  • an inner area 100 a and an outer area 100 b are set in the monitoring area 100 .
  • Q denotes a person to be monitored.
  • FIG. 16 shows the procedure for exit monitoring processing performed by the CPU 143 .
  • When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 31), it is judged whether or not an object moves in the inner area 100a (step 32).
  • When it is judged at the step 32 that the object does not move in the inner area 100a, it is judged whether or not the object moves in the outer area 100b (step 33).
  • The program is returned to the step 31.
  • The CPU 143 waits until the motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 34).
  • When the motion vectors, which correspond to one frame, for the respective detecting areas E are inputted, it is judged whether or not the object moves in the inner area 100a (step 35).
  • When it is judged at the step 35 that the object does not move in the inner area 100a, it is judged whether or not the object moves in the outer area 100b (step 36). When the object moves in the outer area 100b, the program is returned to the step 34.
  • When it is judged at the step 36 that the object does not move in the outer area 100b, it is judged that the person to be monitored exits from the monitoring area 100, so that the alarm 144 is driven (step 37).
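  • The inner-area/outer-area logic of FIG. 16 can be sketched as follows; the frame format (a list of (area label, motion vector) pairs), the alarm object and the magnitude threshold are assumptions.

```python
# Sketch of the FIG. 16 exit-monitoring logic. Each frame is a list of
# (area_label, (dy, dx)) pairs, where area_label is "inner" for the inner
# area 100a or "outer" for the outer area 100b (assumed representation).
def exit_monitoring(frames, alarm, threshold=2.0):
    def moving_in(vectors, label):
        return any((dy * dy + dx * dx) ** 0.5 >= threshold
                   for area, (dy, dx) in vectors if area == label)

    frames = iter(frames)
    for vectors in frames:                          # steps 31-33
        if moving_in(vectors, "inner"):
            continue                                # person still well inside
        if not moving_in(vectors, "outer"):
            continue                                # no movement near the boundary yet
        for vectors in frames:                      # steps 34-36
            if moving_in(vectors, "inner"):
                break                               # came back inside; watch again
            if not moving_in(vectors, "outer"):
                alarm.drive()                       # step 37: the person has exited
                return
```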
  • In each of the first to third monitoring systems, it is detected that a person enters the monitoring area or exits from the monitoring area by automatically detecting the movement of an object from the picked-up image. Therefore, it is possible to use a video camera having a lower resolution, as compared with a video camera used in a conventional monitoring system. Detection precision that allows the presence or absence of the movement to be judged is sufficient. When it is not necessary to specify an entering person (when a precise image is not required), therefore, a low-cost system can be constructed. Moreover, if a lot of simple video cameras of this type are used, a system capable of monitoring a lot of points can be manufactured at low cost.
  • FIG. 17 illustrates the schematic configuration of a fourth monitoring system.
  • the fourth monitoring system comprises a monitoring video camera 301 for imaging the whole of a monitoring area, and a close-up video camera 302 for taking a close-up of the face of a person entering the monitoring area and imaging the face whose close-up has been taken.
  • the monitoring area is monitored by the monitoring video camera 301 .
  • the close-up video camera 302 is moved upward and downward and rightward and leftward by a pan tilt driving device 303 , so that the close-up video camera 302 is directed toward the face of the person entering the monitoring area.
  • the close-up video camera 302 has an automatic focusing function, so that the face of the person entering the monitoring area can be clearly imaged.
  • the control circuit 307 judges whether or not a person moves, that is, whether or not a person enters the monitoring area on the basis of the motion vector from the motion vector detecting circuit 304 .
  • the control circuit 307 switches, when it judges that the person enters the monitoring area, the image data fed to the recording unit 306 to image data from the close-up video camera 302 .
  • When the control circuit 307 judges that a person enters the monitoring area, it operates the pan tilt driving device 303, to direct the close-up video camera 302 toward the position where the person exists.
  • the position where the person exists is specified on the basis of the motion vector for each of the plurality of detecting areas E (see FIG. 3), which is obtained from the motion vector detecting circuit 304 , set in the image area (the monitoring area) 100 of the monitoring video camera 301 .
  • The close-up video camera 302 is operated, to take a close-up of the face of the person and record an image of the face whose close-up has been taken (hereinafter referred to as a close-up image of the face) on the recording unit 306.
  • When the control circuit 307 judges that no person exists in the monitoring area on the basis of the motion vector from the motion vector detecting circuit 304, the control circuit 307 switches the signal selecting circuit 305 such that the image data from the monitoring video camera 301 for entire observation is fed to the recording unit 306.
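  • The switching between the monitoring camera 301 and the close-up camera 302 might be sketched as follows; the pan-tilt, selector and recorder interfaces and the per-frame vector dictionary are assumptions, and the tag written with each frame corresponds to the identifier mentioned in the summary.

```python
# Sketch of the camera-switching control (assumed interfaces). `vectors` maps
# each detecting-area position to its motion vector for the current frame.
def switch_cameras(frames, pan_tilt, selector, recorder, threshold=2.0):
    for vectors in frames:                                # {area_position: (dy, dx)}
        magnitudes = {pos: (dy * dy + dx * dx) ** 0.5
                      for pos, (dy, dx) in vectors.items()}
        if magnitudes and max(magnitudes.values()) >= threshold:
            target = max(magnitudes, key=magnitudes.get)  # area where the person appears
            pan_tilt.point_at(target)                     # aim the close-up camera 302
            selector.select("close_up")                   # its output goes to the recorder
            recorder.tag("close_up")                      # identifier for later retrieval
        else:
            selector.select("monitoring")                 # whole-area camera 301
            recorder.tag("monitoring")
```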
  • Image data from the video camera 301 a is fed to a recording unit 306 such as a VTR, and is recorded thereon.
  • the image data from the video camera 301 a is fed to a motion vector detecting circuit 304 .
  • the motion vector detecting circuit 304 detects for each frame motion vectors for a plurality of detecting areas E set in an image area (a monitoring area) 100 of the video camera 301 a , as shown in FIG. 3, on the basis of a representative point matching method, similarly to the motion vector detecting circuit 42 shown in FIG. 2.
  • a movement detecting circuit may be provided in a recording and reproducing device so that the image is reproduced at high speed when no motion vector is outputted by the movement detecting circuit, while being reproduced at standard or low speed when a motion vector is outputted.
  • As shown in FIGS. 20a and 20b, when a monitoring area 501 where there is no light, for example, at night, is monitored by the infrared camera 401, image data having luminance corresponding to the temperature of a person is outputted from the infrared camera 401, as indicated by a picked-up image 502.
  • the image data is fed to a motion vector detecting device 402 .
  • the motion vector detecting device 402 is so constructed as to detect, as a motion vector, a change of a signal corresponding to a heat source having a temperature, such as a person.
  • Otherwise, the motion vector would be outputted even in a case where a tree, for example, sways in the wind or the like, so that a warning device 404 or the like, described later, would be operated. In order to prevent such an erroneous operation, only the motion vector for the signal corresponding to the temperature of a person is outputted.
  • the movement of the person can also be distinguished from the movement of an animal such as a dog or a cat, so that it is possible to prevent an erroneous operation of the warning device 404 or the like more reliably.
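  • A rough sketch of restricting detection to person-temperature signals is given below; the temperature bounds, the thresholds, the frame format (2-D arrays of degrees Celsius) and the use of simple frame differencing in place of the motion vector detecting device 402 are all assumptions.

```python
import numpy as np

# Sketch: only pixels whose apparent temperature lies in a human range
# contribute to the change detection, so a swaying tree or a small animal
# does not trigger the warning device (assumed bounds and thresholds).
HUMAN_T_MIN, HUMAN_T_MAX = 28.0, 40.0

def person_movement_detected(prev_temp, cur_temp,
                             change_threshold=2.0, pixel_count_threshold=50):
    human_pixels = (cur_temp >= HUMAN_T_MIN) & (cur_temp <= HUMAN_T_MAX)
    changed = np.abs(cur_temp - prev_temp) >= change_threshold
    # report movement only where a human-temperature region has changed
    return int(np.count_nonzero(human_pixels & changed)) >= pixel_count_threshold
```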
  • FIG. 21 illustrates the configuration of an imaging system.
  • A difference between the image signal level at each of the sampling points S in the small area e in the current frame (hereinafter referred to as sampling point data) and the image signal level at the representative point R in a corresponding small area e in the preceding frame (hereinafter referred to as representative point data), that is, a correlated value at each of the sampling points S, is found for each of the detecting areas E.
  • Then the sum of correlated values at the sampling points S which are the same in deviation from the representative point R in all the small areas e in the detecting area E is found (a value obtained is hereinafter referred to as an accumulated correlated value). Consequently, accumulated correlated values whose number corresponds to the number of the sampling points S in one of the small areas e are found for each of the detecting areas E.
  • the ADC 541 converts an analog image signal outputted from the video camera 501 into a digital image signal.
  • the representative point data in the obtained digital image signal is fed to the representative point memory 542 .
  • the writing of the representative point data into the representative point memory 542 is controlled by the CPU 544 .
  • the sampling point data in the digital image signal obtained by the ADC 541 is inputted to the correlated value operating circuit 543 .
  • the correlated value operating circuit 543 finds for each of the detecting areas E the difference between each of the sampling point data in the current frame and the representative point data stored in the representative point memory 542 , that is, a correlated value at each of the sampling points, and finds, for each of the detecting areas E, the sum of correlated values at the sampling points S which are the same in deviation from the representative points R in all the small areas e in the detecting area E (a value obtained is hereinafter referred to as an accumulated correlated value).
  • the accumulated correlated value found for each of the detecting areas E is fed to the CPU 544 .
  • the CPU 544 extracts deviation of the sampling point S having the minimum accumulated correlated value, that is, having the highest correlation in each of the detecting areas E as a motion vector in the detecting area E.
  • the recording device 503 is controlled on the basis of the obtained motion vector.
  • FIG. 22 shows the procedure for recording control processing performed by the CPU 544 .
  • A motion vector is calculated for each of the detecting areas E (step 104). That is, information relating to the movement of the subject from the previous recording time is calculated.
  • FIG. 23 shows another example of recording control processing performed by the CPU 544 .
  • It is judged whether or not the predetermined time period T has elapsed since the interval timer was started (step 114).
  • The CPU 544 waits until accumulated correlated values corresponding to one frame are inputted from the correlated value operating circuit 543 (step 115).
  • a motion vector is calculated for each of the detecting areas E (step 116 ). That is, information relating to the movement of the subject from the previous recording time is calculated.
  • the program is returned to the step 111 .
  • picked-up images which correspond to one or several frames, obtained by the video camera 501 are also recorded by the recording device 503 .
  • representative point data which correspond to one frame, currently fed to the representative point memory 542 are written into the representative point memory 542 . That is, the contents of the representative point memory 542 are updated. Further, the interval timer is started again. The program proceeds to the step 114 .
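  • The recording control of FIG. 23 (record when the subject has moved enough since the previous recording, or when the interval timer T expires) can be sketched as follows; the camera and recorder objects and the movement-measurement helper are assumed interfaces.

```python
import time

# Sketch of the FIG. 23 recording control: a frame is recorded when the subject
# has moved at least `min_movement` since the previous recording or when the
# interval timer T expires, whichever happens first (assumed interfaces).
def intermittent_recording(camera, recorder, movement_since_last_recording,
                           min_movement=1.0, t_seconds=3600, poll_seconds=0.1):
    last_recording = time.monotonic()                  # interval timer started
    while True:
        moved_enough = movement_since_last_recording() >= min_movement
        timer_expired = time.monotonic() - last_recording >= t_seconds   # step 114
        if moved_enough or timer_expired:
            recorder.record(camera.capture())          # record one (or several) frames
            last_recording = time.monotonic()          # restart the interval timer
        time.sleep(poll_seconds)                       # poll roughly once per frame
```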

Abstract

The present invention relates to a monitoring system capable of automatically detecting and reporting to a supervisor that a person enters a monitoring area from an area outside the monitoring area. The present invention comprises an imaging device for imaging the monitoring area, and means for detecting information relating to the movement of an object in the monitoring area.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a monitoring system capable of detecting that a person enters a monitoring area from an area outside the monitoring area, or a person exits from the monitoring area to the area outside the monitoring area. [0002]
  • The present invention relates to a monitoring device capable of imaging a characteristic part such as the face of an entering person. [0003]
  • The present invention relates to a monitoring device capable of monitoring a place which cannot be monitored by an imaging device such as a CCD (Charge Coupled Device) camera in the night, for example. [0004]
  • The present invention relates to an imaging system for intermittently recording a picked-up image of a subject. [0005]
  • 2. Description of the Prior Art [0006]
  • [1] An example of a conventional monitoring system for prevention is one for always imaging a monitoring area using a video camera, and displaying a picked-up image on a monitor as well as recording the picked-up image on a video tape. In such a monitoring system, an image projected on the monitor must be always monitored by a supervisor in order to know that a person enters the monitoring area from an area outside the monitoring area. [0007]
  • An object of the present invention is to provide a monitoring system capable of automatically detecting and reporting to a supervisor that a person enters a monitoring area from an area outside the monitoring area. [0008]
  • Another object of the present invention is to provide a monitoring system capable of automatically detecting that a person enters a monitoring area from an area outside the monitoring area and starting the recording of a picked-up image at the time point. [0009]
  • Still another object of the present invention is to provide a monitoring system capable of automatically detecting and reporting to a supervisor that a person exits from a monitoring area to an area outside the monitoring area. [0010]
  • [2] A monitoring video camera is set for prevention in a convenience store, a bank, and so forth, so that an image picked up by the video camera is recorded on a VTR (Video Tape Recorder), and is made use of for criminal investigation. [0011]
  • With the conventional VTR, however, the whole of the monitoring area is imaged and recorded. In a case where a crime has occurred, the face of the criminal recorded on the VTR cannot, in some cases, be sufficiently recognized. Moreover, even in a case where almost all persons are absent, for example, at night, recording is always made on the VTR, so that a huge amount of video tape or the like is required, and it takes a long time to make a search at a later time. [0012]
  • An object of the present invention is to provide a monitoring device capable of easily recording a face image important to specify an individual. [0013]
  • [3] In a place where the illuminance falls below a predetermined level, for example, at night, an image cannot be obtained by an imaging device such as a CCD camera. Therefore, the imaging device cannot be used as a monitoring camera for prevention. On the other hand, an infrared camera measures, on the basis of the amount of infrared rays emitted from an object, the temperature of the object, converts the temperature distribution of the object into an amount which can be recognized by a person, and outputs the amount to a monitor or the like. [0014]
  • The infrared camera can output, if there is an object, an image based on the quantity of heat of the object depending on emitted infrared rays irrespective of illuminance, so that it can be utilized as a monitoring camera in a place where the CCD camera is poor at monitoring, for example, at night. [0015]
  • In the above-mentioned infrared camera, however, all objects are outputted as images corresponding to their respective quantities of heat. In order to judge whether or not an imaged object is a person, an operator must make the judgment by observing the monitor or the like, resulting in poor operability. [0016]
  • An object of the present invention is to provide a monitoring device capable of easily performing monitoring even in a place where there is no illumination, for example, at night. [0017]
  • [4] When an object which moves very slowly is imaged, for example, a plant or a living thing in its growth process, the subject has conventionally been recorded at predetermined time intervals. [0018]
  • An object of the present invention is to provide an imaging system capable of recording a picked-up image of a subject every time the amount of movement of the subject from the previous time when the picked-up image was recorded becomes not less than a predetermined amount. [0019]
  • SUMMARY OF THE INVENTION
  • A first monitoring system according to the present invention is characterized by comprising an imaging device for imaging a monitoring area, and means for detecting information relating to the movement of an object in the monitoring area on the basis of an output of the imaging device. [0020]
  • It is preferable to provide means for judging whether or not somebody enters the monitoring area on the basis of the information relating to the movement of the object. It is preferable to provide reporting means for reporting, when it is judged that somebody enters the monitoring area, to a supervisor that somebody enters the monitoring area. [0021]
  • It is preferable to provide a recording device for recording an image picked up by the imaging device, and means for starting the recording by the recording device when it is judged that somebody enters the monitoring area. [0022]
  • It is preferable to provide a recording device for recording an image picked up by the imaging device, reporting means for reporting, when it is judged that somebody enters the monitoring area, to a supervisor that somebody enters the monitoring area, and means for starting the recording by the recording device when it is judged that somebody enters the monitoring area. [0023]
  • An entering person detecting sensor may be provided in an entrance path of a person entering the monitoring area so that the imaging device is operated when the entering person is detected by the entering person detecting sensor. It is preferable that a power supply comprising a solar battery and a storage battery storing power obtained by the solar battery supplies the power to the imaging device. [0024]
  • An example of the information relating to the movement of the object is a motion vector corresponding to a detecting area or motion vectors corresponding to a plurality of detecting areas set in an imaging area of the imaging device. [0025]
  • The resolution of the imaging device may be a sufficiently low resolution to judge the presence or absence of the movement of the object. [0026]
  • A second monitoring system according to the present invention is characterized by comprising an imaging device for imaging a monitoring area, means for detecting information relating to the movement of an object in the monitoring area on the basis of an output of the imaging device, means for judging whether or not a person to be monitored exits from the monitoring area on the basis of the information relating to the movement of the object, and reporting means for reporting, when it is judged that the person to be monitored exits from the monitoring area, to a supervisor that the person to be monitored exits from the monitoring area. [0027]
  • A third monitoring system according to the present invention is characterized by comprising first imaging means for imaging a monitoring area, detection means for detecting the movement of an object in the monitoring area on the basis of an output of the first imaging means, and second imaging means for imaging, when the movement of the object in the monitoring area is detected, a moving portion. [0028]
  • An example of the second imaging means is one for enlarging the moving portion and imaging the enlarged moving portion. [0029]
  • The first imaging means comprises a monitoring camera for imaging the whole monitoring area, and the second imaging means comprises a close-up camera for taking a close-up of a part of the monitoring area and imaging the part whose close-up has been taken. The first imaging means and the second imaging means may be constituted by one video camera having a zoom mechanism. [0030]
  • There may be provided a recording device, a switch for switching an output of the first imaging means and an output of the second imaging means and feeding the output obtained by the switching to the recording device, and control means for controlling the switch such that the output of the first imaging means is fed to the recording device when the movement of the object in the monitoring area is not detected, while the output of the second imaging means is fed to the recording device when the movement of the object in the monitoring area is detected. [0031]
  • It is preferable that an identifier for making identification as to which of the output of the first imaging device and the output of the second imaging device is recorded is recorded by the recording device. [0032]
  • It is preferable to make, in reproducing an image recorded by the recording device, the speed at which an image picked up by the second imaging means is reproduced lower than the speed at which an image picked up by the first imaging means is reproduced. [0033]
  • There may be provided a recording device, and means for recording the output of the second imaging device by the recording device only when the movement of the object in the monitoring area is detected. [0034]
  • A fourth monitoring system according to the present invention is characterized by comprising detection means for detecting the movement of an object in a monitoring area by a signal change obtained on the basis of the amount of infrared rays in the monitoring area, and output means for outputting the results of the detection by the detection means. [0035]
  • A fifth monitoring system according to the present invention is characterized by comprising an infrared camera for receiving infrared rays emitted from an object in a monitoring area, detection means for detecting the movement of the object in the monitoring area on the basis of a signal change proportional to the intensity of the infrared rays outputted from the infrared camera, and output means for outputting the results of the detection by the detection means. [0036]
  • It is preferable that the fourth monitoring system or the fifth monitoring system according to the present invention is provided with a warning device, and means for driving the warning device on the basis of the output of the detection means. [0037]
  • It is preferable that the fourth monitoring system or the fifth monitoring system is provided with a video camera for imaging the monitoring area, and means for driving the video camera on the basis of the output of the detection means. [0038]
  • An imaging system according to the present invention is an imaging system for intermittently recording a picked-up image of a subject, characterized by comprising an imaging device for imaging the subject, movement amount measurement means for measuring the amount of movement of the subject from the previous time when the picked-up image was recorded on the basis of an output of the imaging device, and means for recording the picked-up image obtained by the imaging device when the amount of movement of the subject from the previous time when the picked-up image was recorded becomes not less than a predetermined amount. [0039]
  • There may be provided means for recording, unless the amount of movement of the subject from the previous time when the picked-up image was recorded becomes not less than a predetermined amount before a predetermined time period has elapsed since the previous time when the picked-up image was recorded, the picked-up image obtained by the imaging device at the time point where the predetermined time period has elapsed since the previous time when the picked-up image was recorded. [0040]
  • The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.[0041]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing the schematic configuration of a first monitoring system; [0042]
  • FIG. 2 is a block diagram showing the electrical configuration of the first monitoring system; [0043]
  • FIG. 3 is a schematic view showing a plurality of detecting areas set in an imaging area of a video camera; [0044]
  • FIG. 4 is a schematic view showing a plurality of small areas in the detecting area shown in FIG. 3; [0045]
  • FIG. 5 is a schematic view showing a plurality of sampling points and one representative point which are set in the small area shown in FIG. 4; [0046]
  • FIGS. 6a and 6b are schematic views respectively showing a picked-up image in a case where no person enters a monitoring area and a picked-up image in a case where a person enters the monitoring area; [0047]
  • FIGS. 7a and 7b are schematic views respectively showing a motion vector in each of detecting areas in a case where no person enters a monitoring area and a motion vector in each of the detecting areas in a case where a person enters the monitoring area; [0048]
  • FIG. 8 is a flow chart showing the procedure for entrance monitoring processing; [0049]
  • FIG. 9 is a flow chart showing another example of entrance monitoring processing; [0050]
  • FIG. 10 is a block diagram showing the electrical configuration of a second monitoring system; [0051]
  • FIG. 11 is a flow chart showing the procedure for entrance monitoring processing; [0052]
  • FIG. 12 is a flow chart showing another example of entrance monitoring processing; [0053]
  • FIG. 13 is a block diagram showing the electrical configuration of a third monitoring system; [0054]
  • FIG. 14 is a schematic view showing an inner area and an outer area which are set in a monitoring area; [0055]
  • FIGS. 15a, 15b and 15c are schematic views for explaining the outline of exit monitoring processing; [0056]
  • FIG. 16 is a flow chart showing the procedure for exit monitoring processing; [0057]
  • FIG. 17 is a block diagram showing the electrical configuration of a fourth monitoring system; [0058]
  • FIG. 18 is a block diagram showing the electrical configuration of a fifth monitoring system; [0059]
  • FIG. 19 is a block diagram showing the electrical configuration of a sixth monitoring system; [0060]
  • FIGS. 20a and 20b are schematic views showing an image picked up by an infrared camera; [0061]
  • FIG. 21 is a block diagram showing the electrical configuration of an imaging system; [0062]
  • FIG. 22 is a flow chart showing the procedure for recording control processing performed by a CPU; and [0063]
  • FIG. 23 is a flow chart showing another example of recording control processing.[0064]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Embodiments of the present invention will be described while referring to the drawings. [0065]
  • [1] Description of First Monitoring System [0066]
  • FIG. 1 illustrates the schematic configuration of a first monitoring system capable of detecting that a person enters a monitoring area from an area outside the monitoring area. [0067]
  • The first monitoring system comprises a [0068] video camera 1 for imaging a monitoring area 100, a monitor 2 for displaying an image picked up by the video camera 1, a recording device 3 for recording the image picked up by the video camera 1, and a monitoring control device 4.
  • FIG. 2 illustrates the electrical configuration of the first monitoring system. [0069]
  • An output of the [0070] video camera 1 is fed to the monitor 2, the recording device 3, and the monitoring control device 4. The image picked up by the video camera 1 is always displayed on the monitor 2. The recording device 3 is controlled on the basis of a control signal from the monitoring control device 4.
  • The monitoring control device [0071] 4 comprises an analog-to-digital converter (ADC) 41, a motion vector detecting circuit 42, a CPU 43, an alarm 44, a during-monitoring display lamp 45, and an operating unit 46. The CPU 43 comprises a ROM (not shown) storing its program and the like and a RAM (not shown) storing necessary data.
  • The [0072] ADC 41 converts an analog image signal outputted from the video camera 1 into a digital image signal. The digital image signal outputted from the ADC 41 is fed to the motion vector detecting circuit 42.
  • The motion [0073] vector detecting circuit 42 detects for each frame motion vectors (information relating to the movement) for a plurality of detecting areas E set in an image area (a monitoring area) 100 of the video camera 1, as shown in FIG. 3, on the basis of a representative point matching method.
  • More specifically, each of the detecting areas E is further divided into a plurality of small areas e, as shown in FIG. 4. As shown in FIG. 5, a plurality of sampling points S and one representative point R are set in each of the small areas e. [0074]
  • A difference between the image signal level at each of the sampling points S in the small area e in the current frame and the image signal level at the representative point R in a corresponding small area e in the preceding frame, that is, a correlated value at each of the sampling points is found for each of the detecting areas E. For each of the detecting areas E, the sum of correlated values at the sampling points S which are the same in deviation from the representative points R in all the small areas e in the detecting area E is found (a value obtained is hereinafter referred to as an accumulated correlated value). Consequently, accumulated correlated values whose number corresponds to the number of the sampling points S in one of the small areas e are found for each of the detecting areas E. [0075]
  • Deviation of the sampling point S having the minimum accumulated correlated value, that is, having the highest correlation in each of the detecting areas E is extracted as a motion vector (the movement of an object) in the detecting area E. [0076]
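  • For illustration only, the representative point matching computation described above can be sketched in Python as follows; the function name, the array representation of a detecting area E and the argument layout are assumptions introduced for this sketch and are not part of the disclosure.

    import numpy as np

    def motion_vector_for_area(prev_frame, cur_frame, rep_points, samp_offsets):
        # prev_frame, cur_frame: 2-D luminance arrays covering one detecting area E.
        # rep_points: (y, x) position of the representative point R of each small area e.
        # samp_offsets: (dy, dx) deviations defining the sampling points S around R.
        accumulated = {}
        for dy, dx in samp_offsets:
            total = 0.0
            for ry, rx in rep_points:
                # correlated value: difference between the sampling point level in the
                # current frame and the representative point level in the preceding frame
                total += abs(float(cur_frame[ry + dy, rx + dx]) - float(prev_frame[ry, rx]))
            accumulated[(dy, dx)] = total  # accumulated correlated value for this deviation
        # the deviation with the minimum accumulated correlated value (highest correlation)
        # is extracted as the motion vector of the detecting area E
        return min(accumulated, key=accumulated.get)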
  • When no person enters the [0077] monitoring area 100 as shown in FIG. 6a, the magnitude of a motion vector in each of the detecting areas E is less than a predetermined value as shown in FIG. 7a. When a person enters the monitoring area 100 as shown in FIG. 6b, the magnitude of a motion vector in the detecting area E on which an entering person Q is projected is not less than the predetermined value as shown in FIG. 7b.
  • A motion vector for each of the detecting areas E which is detected by the motion [0078] vector detecting circuit 42 is fed to the CPU 43. The CPU 43 performs entrance monitoring processing on the basis of the motion vectors for the detecting areas E which are inputted for each frame.
  • FIG. 8 shows the procedure for entrance monitoring processing performed by the [0079] CPU 43. The entrance monitoring processing shown in FIG. 8 is processing effective in detecting an entering person such as a thief, to report the entering person to a supervisor.
  • The during-monitoring display lamp 45 is first turned on (step 1). When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 2), it is judged whether or not an object moves in at least one of the detecting areas E (step 3). [0080]
  • When it is judged that the object does not move in any of the detecting areas E (NO at step 3), the program is returned to the step 1. Consequently, the processing at the steps 1, 2 and 3 is always repeatedly performed. [0081]
  • When it is judged at the [0082] step 3 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 44 is driven to report to the supervisor that a person enters the monitoring area, and recording by the recording device 3 is started to record the person entering the monitoring area (step 4). Further, the during-monitoring display lamp 45 is turned off.
  • Thereafter, when the supervisor enters an alarm stop command using the operating unit [0083] 46 (YES at step 5), the driving of the alarm 44 is stopped (step 6).
  • When the supervisor enters a recording stop command using the operating unit [0084] 46 (YES at step 7), the recording by the recording device 3 is stopped (step 8). The program is returned to the step 1.
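  • For illustration only, the entrance monitoring processing of FIG. 8 described above can be sketched as follows; the objects lamp, alarm, recorder and operating_unit, the helper get_magnitudes() (returning the motion vector magnitude of each detecting area E for one frame) and the threshold are assumptions introduced for this sketch.

    def entrance_monitoring(get_magnitudes, lamp, alarm, recorder, operating_unit, threshold):
        while True:
            lamp.on()                                   # step 1: during-monitoring lamp on
            magnitudes = get_magnitudes()               # step 2: one frame of motion vectors
            if all(m < threshold for m in magnitudes):  # step 3: no movement in any area E
                continue
            lamp.off()
            alarm.drive()                               # step 4: report the entrance and
            recorder.start()                            #         start recording
            operating_unit.wait_for("alarm_stop")       # step 5
            alarm.stop()                                # step 6
            operating_unit.wait_for("recording_stop")   # step 7
            recorder.stop()                             # step 8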
  • FIG. 9 shows the procedure for another entrance monitoring processing performed by the CPU 43. The entrance monitoring processing shown in FIG. 9 is effective, in a store or the like, in detecting that a customer has visited the store, reporting the visit to the supervisor, and causing the supervisor to check the customer. [0085]
  • The during-monitoring display lamp 45 is first turned on (step 11). When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 12), it is judged whether or not an object moves in at least one of the detecting areas E (step 13). [0086]
  • When it is judged that the object does not move in any of the detecting areas E (NO at step 13), the program is returned to the step 11. Consequently, the processing at the steps 11, 12 and 13 is always repeatedly performed. [0087]
  • When it is judged at the [0088] step 13 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 44 is driven to report to the supervisor that a person enters the monitoring area, and recording by the recording device 3 is started to record the person entering the monitoring area (step 14). Further, the during-monitoring display lamp 45 is turned off.
  • Thereafter, when a predetermined time period T1, for example, 10 seconds, has elapsed (YES at step 15), the driving of the alarm 44 is stopped (step 16). [0089]
  • Thereafter, when the motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 17), it is judged whether or not the object moves in at least one of the detecting areas E (step 18). When the object moves in at least one of the detecting areas E, the program is returned to the step 17. Until it is judged at the step 18 that the object does not move in any of the detecting areas E, the processing at the steps 17 and 18 is repeated. [0090]
  • When it is judged at the step 18 that the object does not move in any of the detecting areas E, it is judged that the person entering the monitoring area exits from the monitoring area. Thereafter, the recording by the recording device 3 is stopped (step 20) after an elapse of a predetermined time period T2, for example, one minute (step 19). The program is returned to the step 11. [0091]
  • [2] Description of Second Monitoring System [0092]
  • FIG. 10 illustrates the electrical configuration of a second monitoring system capable of detecting that a person enters a monitoring area from an area outside the monitoring area. [0093]
  • The second monitoring system comprises a video camera 201 for imaging a monitoring area 100, an analog-to-digital converter (ADC) 202 for converting an image signal outputted from the video camera 201 into a digital signal, a monitor 203 for displaying an image picked up by the video camera 201 on the basis of the digital signal obtained by the ADC 202, a digital recording device 204 for recording the digital signal obtained by the ADC 202, an entering person detecting sensor 205 arranged in a place which is expected to be the entrance of an entrance path to the monitoring area 100, a monitoring control device 206, and a power supply 210 for supplying power to each of the devices. [0094]
  • An example of the [0095] digital recording device 204 is one for recording the digital signal on an optical disk device such as an MO (Magneto-Optic) or a CDR (Compact Disc-Recordable). An example of the entering person detecting sensor 205 is a photoelectric detector or a magnetometric sensor. An example of the power supply 210 is one comprising a solar battery 211 and a storage battery 212 storing power obtained by the solar battery 211.
  • The [0096] monitoring control device 206 comprises a motion vector detecting circuit 221, a CPU 222, an alarm 223, a during-monitoring display lamp 224, and an operating unit 225. An output of the entering person detecting sensor 205 is inputted to the CPU 222. The CPU 222 carries out the on-off control of the power supplies of the video camera 201, the ADC 202 and the monitor 203, and controls a recording operation of the digital recording device 204.
  • In the second monitoring system, power is always supplied to the entering person detecting sensor 205 and the monitoring control device 206 from the power supply 210, while the power supplies of the video camera 201, the ADC 202 and the monitor 203 are normally turned off. [0097]
  • FIG. 11 shows the procedure for entrance monitoring processing performed by the [0098] CPU 222.
  • The during-monitoring display lamp 224 is first turned on (step 51). The CPU 222 waits until an entering person is detected by the entering person detecting sensor 205 (step 52). When the entering person is detected by the detecting sensor 205, the power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned on (step 53). [0099]
  • Thereafter, when motion vectors, which correspond to one frame, for respective detecting areas E are inputted (step 54), it is judged whether or not an object moves in at least one of the detecting areas E (step 55). [0100]
  • When it is judged that the object does not move in any of the detecting areas E (NO at step 55), it is judged whether or not a predetermined time period T0 (for example, five minutes) has elapsed since the power supply of the video camera 201 was turned on at the foregoing step 53 (step 62). Unless the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the program is returned to the step 54. The processing at the steps 54, 55 and 62 is repeated. [0101]
  • When the answer is in the affirmative at the [0102] step 62 after the processing at the steps 54, 55 and 62 is repeated, that is, when the movement of the object is not detected until the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 61). The program is returned to the step 51.
  • When it is judged at the [0103] step 55 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 223 is driven to report to a supervisor that a person enters the monitoring area, and recording by the recording device 204 is started to record the person entering the monitoring area (step 56). Further, the during-monitoring display lamp 224 is turned off.
  • Thereafter, when the supervisor enters an alarm stop command using the operating unit [0104] 225 (YES at step 57), the driving of the alarm 223 is stopped (step 58).
  • When the supervisor enters a recording stop command using the operating unit [0105] 225 (YES at step 59), the recording by the recording device 204 is stopped (step 60). The power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 61). The program is returned to the step 51.
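  • For illustration only, the sensor-triggered processing of FIG. 11 described above can be sketched as follows; the device objects, method names and the helper get_magnitudes() are assumptions introduced for this sketch.

    import time

    def sensor_triggered_monitoring(sensor, camera_power, get_magnitudes, alarm,
                                    recorder, operating_unit, lamp, threshold, T0):
        while True:
            lamp.on()                                        # step 51
            sensor.wait_for_detection()                      # step 52: entering person detected
            camera_power.on()                                # step 53: camera, ADC and monitor on
            started = time.time()
            while time.time() - started < T0:                # step 62: give up after T0
                if max(get_magnitudes()) >= threshold:       # steps 54-55: movement detected
                    lamp.off()
                    alarm.drive()                            # step 56
                    recorder.start()
                    operating_unit.wait_for("alarm_stop")    # step 57
                    alarm.stop()                             # step 58
                    operating_unit.wait_for("recording_stop")  # step 59
                    recorder.stop()                          # step 60
                    break
            camera_power.off()                               # step 61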
  • FIG. 12 shows the procedure for another entrance monitoring processing performed by the [0106] CPU 222.
  • The during-monitoring display lamp 224 is first turned on (step 71). The CPU 222 waits until an entering person is detected by the entering person detecting sensor 205 (step 72). When the entering person is detected by the detecting sensor 205, the power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned on (step 73). [0107]
  • Thereafter, when motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 74), it is judged whether or not an object moves in at least one of the detecting areas E (step 75). [0108]
  • When it is judged that the object does not move in any of the detecting areas E (NO at step 75), it is judged whether or not a predetermined time period T0 (for example, five minutes) has elapsed since the power supply of the video camera 201 was turned on at the foregoing step 73 (step 84). Unless the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the program is returned to the step 74. The processing at the steps 74, 75 and 84 is repeated. [0109]
  • When the answer is in the affirmative at the step 84 after the processing at the steps 74, 75 and 84 is repeated, that is, when the movement of the object is not detected until the predetermined time period T0 has elapsed since the power supply of the video camera 201 was turned on, the power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 83). The program is returned to the step 71. [0110]
  • When it is judged at the [0111] step 75 that the object moves in at least one of the detecting areas E, it is judged that a person enters the monitoring area, so that the alarm 223 is driven to report to a supervisor that a person enters the monitoring area, and recording by the recording device 204 is started to record the person entering the monitoring area (step 76). Further, the during-monitoring display lamp 224 is turned off.
  • Thereafter, when a predetermined time period T1, for example, 10 seconds, has elapsed (YES at step 77), the driving of the alarm 223 is stopped (step 78). [0112]
  • Thereafter, when motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 79), it is judged whether or not the object moves in at least one of the detecting areas E (step 80). When the object moves in at least one of the detecting areas E, the program is returned to the step 79. Until it is judged at the step 80 that the object does not move in any of the detecting areas E, the processing at the steps 79 and 80 is repeated. [0113]
  • When it is judged at the step 80 that the object does not move in any of the detecting areas E, it is judged that the person entering the monitoring area exits from the monitoring area. Thereafter, the recording by the recording device 204 is stopped (step 82) after an elapse of a predetermined time period T2, for example, one minute (step 81). The power supplies of the video camera 201, the ADC 202 and the monitor 203 are turned off (step 83). The program is returned to the step 71. While the power supply of the video camera 201 is on, the power supply of the entering person detecting sensor 205 may be turned off. [0114]
  • According to the above-mentioned second monitoring system, it is possible to monitor the entrance of a person from a gate, a wall, etc. around a house, for example, by the entering [0115] person detecting sensor 205, and monitor the entrance of the person into the house using the video camera 201.
  • In the above-mentioned second monitoring system, the power supply of the [0116] video camera 201 is not always turned on, and the power supply of the video camera 201 is turned on when an entering person is detected by the entering person detecting sensor 205, so that the power consumption can be reduced.
  • Since the power of the whole system is supplied by the [0117] power supply 210 comprising the solar battery 211 and the storage battery 212, the entrance can be monitored even in a monitoring area to which no power is usually supplied.
  • When the digital recording device is used as in the above-mentioned second monitoring system, there are advantages that follow, as compared with an analog recording device such as a VTR. That is, the digital recording device can record, in addition to image information, information for retrieving an image represented by the image information, for example, a motion vector of the image, so that a desired image is easy to retrieve. Further, the speed for retrieval is high. When a recorded image is transmitted to a monitoring chamber, and is displayed or recorded in the monitoring chamber, it is possible to make digital transmission. Therefore, the recorded image is hardly degraded by the transmission, so that it is possible to more clearly display or record the image. Since the retrieval is easy, and the image is hardly degraded by the transmission and the recording, as described above, it is easy to extract only an important part of the recorded image to produce a database. [0118]
  • [3] Description of Third Monitoring System [0119]
  • FIG. 13 illustrates the electrical configuration of a third monitoring system capable of detecting that a person exits from a monitoring area to an area outside the monitoring area. [0120]
  • The third monitoring system comprises a [0121] video camera 101 for imaging the monitoring area, a monitor 102 for displaying an image picked up by the video camera 101, and a monitoring control device 103.
  • An output of the [0122] video camera 101 is fed to the monitor 102 and the monitoring control device 103. The image picked up by the video camera 101 is always displayed on the monitor 102.
  • The [0123] monitoring control device 103 comprises an analog-to-digital converter (ADC) 141, a motion vector detecting circuit 142, a CPU 143, an alarm 144, and an operating unit 145. The CPU 143 comprises a ROM (not shown) storing its program and the like and a RAM (not shown) storing necessary data.
  • The [0124] ADC 141 converts an analog image signal outputted from the video camera 101 into a digital image signal. The digital image signal outputted from the ADC 141 is fed to the motion vector detecting circuit 142.
  • The motion [0125] vector detecting circuit 142 detects for each frame motion vectors for a plurality of detecting areas E set in an image area (a monitoring area) 100 of the video camera 101, as shown in FIG. 3, on the basis of a representative point matching method, similarly to the motion vector detecting circuit 42 shown in FIG. 2.
  • The motion vector for each of the detecting areas E which has been detected by the motion vector detecting circuit 142 is fed to the CPU 143. The CPU 143 performs exit monitoring processing on the basis of the motion vectors for the detecting areas E which are inputted for each frame. [0126]
  • The exit monitoring processing is processing effective in detecting and reporting to a supervisor that a person to be monitored such as a child exits from the [0127] monitoring area 100. The outline of the exit monitoring processing will be described.
  • As shown in FIG. 14, an [0128] inner area 100 a and an outer area 100 b are set in the monitoring area 100. In FIG. 14, Q denotes a person to be monitored.
  • When the person to be monitored Q who exists in the inner area 100 a exits from the monitoring area 100, a state where the person to be monitored Q exists in the inner area 100 a (FIG. 15a), a state where the person to be monitored Q exists in the outer area 100 b (FIG. 15b), and a state where the person to be monitored Q does not exist in the monitoring area 100 (FIG. 15c) arise in this order, as shown in FIGS. 15a, 15b and 15c, respectively. [0129]
  • When the person to be monitored Q exists in the [0130] inner area 100 a as shown in FIG. 15a, the movement is detected in the detecting area E in the inner area 100 a. When the person to be monitored Q exists in the outer area 100 b as shown in FIG. 15b, the movement is not detected in the detecting area E in the inner area 100 a, while being detected in the detecting area E in the outer area 100 b. When the person to be monitored Q does not exist in the monitoring area 100 as shown in FIG. 15c, the movement is not detected in the detecting areas E in both the inner area 100 a and the outer area 100 b.
  • FIG. 16 shows the procedure for exit monitoring processing performed by the [0131] CPU 143.
  • When motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 31), it is judged whether or not an object moves in the inner area 100 a (step 32). [0132]
  • When the object moves in the [0133] inner area 100 a, the program is returned to the step 31. Consequently, the processing at the steps 31 and 32 is always repeatedly performed.
  • When it is judged at the [0134] step 32 that the object does not move in the inner area 100 a, it is judged whether or not the object moves in the outer area 100 b (step 33).
  • When the object does not move in the outer area 100 b at the step 33, the program is returned to the step 31. When it is judged at the step 33 that the object moves in the outer area 100 b, the CPU 143 waits until the motion vectors, which correspond to one frame, for the respective detecting areas E are inputted (step 34). When the motion vectors, which correspond to one frame, for the respective detecting areas E are inputted, it is judged whether or not the object moves in the inner area 100 a (step 35). [0135]
  • When it is judged at the [0136] step 35 that the object moves in the inner area 100 a, it is judged that a person to be monitored is returned to the inner area 100 a from the outer area 100 b, after which the program is returned to the step 31.
  • When it is judged at the [0137] step 35 that the object does not move in the inner area 100 a, it is judged whether or not the object moves in the outer area 100 b (step 36) . When the object moves in the outer area 100 b, the program is returned to the step 34.
  • When it is judged at the [0138] step 36 that the object does not move in the outer area 100 b, it is judged that the person to be monitored exits from the monitoring area 100, so that the alarm 144 is driven (step 37).
  • Thereafter, when the supervisor enters an alarm stop command using the operating unit [0139] 145 (YES at step 38), the driving of the alarm 144 is stopped (step 39). The current exit monitoring processing is terminated.
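  • For illustration only, the exit monitoring processing of FIG. 16 described above can be sketched as follows; get_area_movement() is assumed to return, for one frame, two booleans indicating whether movement is detected in the inner area 100 a and in the outer area 100 b, and the alarm and operating_unit objects are likewise assumptions.

    def exit_monitoring(get_area_movement, alarm, operating_unit):
        while True:
            inner, outer = get_area_movement()          # step 31
            if inner:                                   # step 32: person still in inner area
                continue
            if not outer:                               # step 33: no movement in outer area
                continue
            # movement only in the outer area: the person may be leaving
            while True:
                inner, outer = get_area_movement()      # step 34
                if inner:                               # step 35: person returned to inner area
                    break
                if outer:                               # step 36: still crossing the outer area
                    continue
                alarm.drive()                           # step 37: person exited the monitoring area
                operating_unit.wait_for("alarm_stop")   # step 38
                alarm.stop()                            # step 39
                return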
  • In each of the first to third monitoring systems, the entrance of a person into the monitoring area or the exit of a person from the monitoring area is detected by automatically detecting the movement of an object from the picked-up image. Therefore, it is possible to use a video camera having a lower resolution than a video camera used in a conventional monitoring system; detection precision sufficient to judge the presence or absence of the movement is enough. When it is not necessary to specify an entering person (when a precise image is not required), therefore, a low-cost system can be constructed. Moreover, if a lot of simple video cameras of this type are used, a system capable of monitoring a lot of points can be manufactured at low cost. [0140]
  • [4] Description of Fourth Monitoring System [0141]
  • FIG. 17 illustrates the schematic configuration of a fourth monitoring system. [0142]
  • The fourth monitoring system comprises a [0143] monitoring video camera 301 for imaging the whole of a monitoring area, and a close-up video camera 302 for taking a close-up of the face of a person entering the monitoring area and imaging the face whose close-up has been taken.
  • The monitoring area is monitored by the [0144] monitoring video camera 301. The close-up video camera 302 is moved upward and downward and rightward and leftward by a pan tilt driving device 303, so that the close-up video camera 302 is directed toward the face of the person entering the monitoring area. The close-up video camera 302 has an automatic focusing function, so that the face of the person entering the monitoring area can be clearly imaged.
  • Image data from the [0145] monitoring video camera 301 and the close-up video camera 302 are fed to a recording unit 306 such as a VTR, through a signal selecting circuit 305. Further, the image data from the monitoring video camera 301 is fed to a motion vector detecting circuit 304.
  • The motion [0146] vector detecting circuit 304 detects for each frame motion vectors for a plurality of detecting areas E set in an image area (a monitoring area) 100 of the monitoring video camera 301, as shown in FIG. 3, on the basis of a representative point matching method, similarly to the motion vector detecting circuit 42 shown in FIG. 2.
  • An output of the motion [0147] vector detecting circuit 304 is fed to a control circuit 307 which is constituted by a microcomputer and the like. The control circuit 307 judges whether or not a person moves into the monitoring area on the basis of the output of the motion vector detecting circuit 304, to control the driving of the pan tilt driving device 303, the close-up video camera 302, and the signal selecting circuit 305.
  • The [0148] control circuit 307 judges whether or not a person moves, that is, whether or not a person enters the monitoring area on the basis of the motion vector from the motion vector detecting circuit 304. The control circuit 307 switches, when it judges that the person enters the monitoring area, the image data fed to the recording unit 306 to image data from the close-up video camera 302.
  • When the [0149] control circuit 307 judges that no person enters the monitoring area, the image data from the monitoring video camera 301 is fed to the recording unit 306, so that an image of the whole monitoring area is recorded.
  • When the control circuit 307 judges that a person enters the monitoring area, the control circuit 307 operates the pan tilt driving device 303, to direct the close-up video camera 302 toward the position where the person exists. The position where the person exists is specified on the basis of the motion vector for each of the plurality of detecting areas E (see FIG. 3), which is obtained from the motion vector detecting circuit 304, set in the image area (the monitoring area) 100 of the monitoring video camera 301. The close-up video camera 302 is operated, to take a close-up of the face of the person and record an image of the face whose close-up has been taken (hereinafter referred to as a close-up image of the face) on the recording unit 306. The close-up image may be recorded for a predetermined time period. Alternatively, when the person is moving, the close-up image may be recorded while the camera 302 is moved so as to follow the person. Further, when the close-up image is recorded, an identifier or the like may be simultaneously recorded such that the image to be recorded can be identified from the entire image for convenience of a later search. [0150]
  • When the [0151] control circuit 307 judges that no person exists in the monitoring area on the basis of the motion vector from the motion vector detecting circuit 304, the control circuit 307 switches the signal selecting circuit 305 such that the image data from the monitoring video camera 301 for entire observation is fed to the recording unit 306.
  • As described in the foregoing, when the monitoring area is monitored by the [0152] monitoring video camera 301, and the person in the monitoring area moves, the face of the person imaged by the close-up video camera 302 is clearly recorded on the recording unit 306, so that the person can be easily specified.
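  • For illustration only, the switching behaviour of the control circuit 307 described above can be sketched as follows; the selector interface, the helper get_magnitudes() and locate_person() (which maps the moving detecting areas E to a pan/tilt target) are assumptions introduced for this sketch.

    def camera_switching_control(get_magnitudes, pan_tilt, close_up_camera, selector,
                                 locate_person, threshold):
        while True:
            magnitudes = get_magnitudes()                # one frame from circuit 304
            moving = [i for i, m in enumerate(magnitudes) if m >= threshold]
            if moving:
                pan_tilt.aim_at(locate_person(moving))   # direct camera 302 at the person
                close_up_camera.start()
                selector.select("close_up")              # record the close-up image
            else:
                selector.select("monitoring")            # record the whole monitoring area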
  • [5] Description of Fifth Monitoring System [0153]
  • FIG. 18 illustrates the schematic configuration of a fifth monitoring system. [0154]
  • In the fifth monitoring system, both the imaging of the whole monitoring area and the close-up imaging of the face are performed by one video camera 301 a. Therefore, the video camera 301 a has a zooming function. [0155]
  • The zoom angle of the [0156] video camera 301 a having a zooming function is widened, to monitor the monitoring area. A pan tilt driving device 303 for directing the video camera 301 a toward a person in taking a close-up is mounted on the video camera 301 a. The video camera 301 a is moved upward and downward and rightward or leftward by the pan tilt driving device 303, so that the video camera 301 a is directed toward the face of a person entering the monitoring area. Further, the video camera 301 a has an automatic focusing function, so that the face of the person entering the monitoring area can be clearly imaged.
  • Image data from the [0157] video camera 301 a is fed to a recording unit 306 such as a VTR, and is recorded thereon. The image data from the video camera 301 a is fed to a motion vector detecting circuit 304.
  • The motion [0158] vector detecting circuit 304 detects for each frame motion vectors for a plurality of detecting areas E set in an image area (a monitoring area) 100 of the video camera 301 a, as shown in FIG. 3, on the basis of a representative point matching method, similarly to the motion vector detecting circuit 42 shown in FIG. 2.
  • An output from the motion vector detecting circuit 304 is fed to a control circuit 307 which is constituted by a microcomputer and the like. The control circuit 307 judges whether or not a person enters the monitoring area on the basis of the output of the motion vector detecting circuit 304, to control the driving of the pan tilt driving device 303 and the zooming function of the video camera 301 a. [0159]
  • When the monitoring area is monitored by the [0160] video camera 301 a, and the person in the monitoring area moves, the motion vector detecting circuit 304 calculates the motion vector, and outputs the calculated motion vector. The control circuit 307 judges whether or not the person moves, that is, the person enters the monitoring area on the basis of the motion vector from the motion vector detecting circuit 304.
  • The control circuit 307 operates, when it judges that the person enters the monitoring area, the pan tilt driving device 303, directs the video camera 301 a toward the position where the person exists, takes a close-up of the face of the person by the zooming function, and records an image of the face whose close-up has been taken (hereinafter referred to as a close-up image of the face) on the recording unit 306 for a predetermined time period. Further, when the close-up image is recorded, an identifier or the like may be simultaneously recorded such that the image to be recorded can be identified from the entire image for convenience of a later search. [0161]
  • When the [0162] control circuit 307 judges that no person exists in the monitoring area on the basis of the motion vector from the motion vector detecting circuit 304, the control circuit 307 operates the zooming function of the video camera 301 a and the pan tilt driving device 303 such that an image signal for entire observation is fed to the recording unit 306 from the video camera 301 a.
  • As described in the foregoing, when the monitoring area is monitored by the one [0163] video camera 301 a, and the person in the monitoring area moves, the face of the person imaged after taking the close-up thereof by the zooming function is clearly recorded on the recording unit 306, so that the person can be easily specified.
  • Although in the fourth and fifth monitoring systems, the image of the whole monitoring area and the close-up image are switched, and the image obtained by the switching is recorded on the [0164] recording unit 306, only an image in a case where the person moves, that is, an image in a case where a motion vector is outputted from the motion vector detecting circuit 304 may be recorded for the purpose of saving a video tape.
  • When an identifier indicating a close-up image (an image in a case where a person moves) is recorded on the video tape, a search is significantly easier to make at the time of reproduction if the image is reproduced at high speed when the identifier is not detected, while being reproduced at standard or low speed when it is detected. [0165]
  • Furthermore, when no identifier or the like is recorded, a movement detecting circuit may be provided in a recording and reproducing device so that the image is reproduced at high speed when no motion vector is outputted by the movement detecting circuit, while being reproduced at standard or low speed when a motion vector is outputted. [0166]
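  • For illustration only, the search-friendly reproduction described above can be sketched as follows; the player interface, the speed values and the predicate is_close_up() (an identifier check or, when no identifier is recorded, the output of the movement detecting circuit) are assumptions introduced for this sketch.

    def reproduce_for_search(frames, is_close_up, player):
        for frame in frames:
            if is_close_up(frame):               # close-up portion (a person is moving)
                player.show(frame, speed=1.0)    # standard (or low) speed for checking
            else:
                player.show(frame, speed=8.0)    # high speed to skip the entire-area image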
  • [6] Description of Sixth Monitoring System [0167]
  • FIG. 19 illustrates the schematic configuration of a sixth monitoring system. [0168]
  • The sixth monitoring system comprises an [0169] infrared camera 401 for imaging a monitoring area. The monitoring area is monitored by the infrared camera 401. The infrared camera 401 receives infrared rays emitted from an object, measures the temperature on the basis of the amount of the infrared rays, forms an image as a signal change depending on the quantity of heat, and feeds an image based on the temperature of a person to a motion vector detecting device 402.
  • As shown in FIGS. 20a and 20b, when a monitoring area 501 where there is no light, for example at night, is monitored by the infrared camera 401, image data having luminance corresponding to the temperature of a person is outputted from the infrared camera 401, as indicated by a picked-up image 502. The image data is fed to the motion vector detecting device 402. [0170]
  • The motion vector detecting device 402 detects a motion vector on the basis of the image data fed from the infrared camera 401. That is, when a person moves from the state shown in FIG. 20a to the state shown in FIG. 20b, the image of a heat source, for example a person having body temperature, moves. The motion vector is detected on the basis of the movement of the image. Examples of a motion vector detecting method include an all points matching method and a representative point matching method. [0171]
  • In the present embodiment, the motion [0172] vector detecting device 402 is so constructed as to detect as a motion vector a change of a signal corresponding to a heat source such as a person having temperature. When changes of signals corresponding to all heat sources are detected as motion vectors, the motion vector is outputted even in a case where a tree, for example, swings by wind or the like, so that a warning device 404 or the like, described later, is operated. In order to prevent such an erroneous operation, only the motion vector for the signal corresponding to the temperature of a person is outputted.
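  • For illustration only, restricting the detection to signals in the temperature range of a person can be sketched as follows; the temperature range, the frame format (a 2-D array of temperatures from the infrared camera 401) and the pixel-count criterion are assumptions, and the sketch is a simplified stand-in for the motion vector detection performed by the device 402.

    import numpy as np

    def person_temperature_mask(ir_frame, lo_c=30.0, hi_c=40.0):
        # keep only pixels whose temperature falls in the assumed range of a person
        return (ir_frame >= lo_c) & (ir_frame <= hi_c)

    def person_heat_source_moved(prev_frame, cur_frame, min_changed_pixels=50):
        # only a change of the person-temperature region between frames counts as
        # movement, so a tree swinging in the wind does not trigger the warning device
        changed = person_temperature_mask(prev_frame) ^ person_temperature_mask(cur_frame)
        return int(changed.sum()) >= min_changed_pixels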
  • An output from the motion [0173] vector detecting device 402 is fed to a control device 403 which is constituted by a microcomputer and the like. The control device 403 judges whether or not a person enters the monitoring area on the basis of the output of the motion vector detecting device 402. The control device 403 drives, when it judges that the person enters the monitoring area, the warning device 404 such as a buzzer. Further, the control device 403 operates, when it judges that the person enters the monitoring area, a pan tilt driving device 406, to direct a CCD camera 405 toward the position where the person exists. The CCD camera 405 is operated, to record an image picked up by the CCD camera 405 on a recording device 407. The CCD camera 405 is provided with an illuminating lamp. If illuminance is insufficient to pick up an image by the CCD camera 405, the illuminating lamp is turned on.
  • When models of motion vectors caused by the movement of a person are previously registered in the [0174] control device 403, the movement of the person can be also distinguished from the movement of an animal such as a dog or a cat, so that it is possible to prevent an erroneous operation of the warning device 404 or the like more reliably.
  • Although in the above-mentioned embodiment, a person is recorded by the [0175] CCD camera 405, another recording means such as a Polaroid camera may be used.
  • [7] Description of Imaging System [0176]
  • FIG. 21 illustrates the configuration of an imaging system. [0177]
  • The imaging system comprises a [0178] video camera 501 for imaging a subject, a monitor 502 for displaying an image picked up by the video camera 501, a recording device 503 for recording the image picked up by the video camera 501, and a movement monitoring device 504 for monitoring the amount of movement of the subject.
  • An output of the [0179] video camera 501 is fed to the monitor 502, the recording device 503, and the movement monitoring device 504. The image picked up by the video camera 501 is always displayed on the monitor 502. The recording device 503 is controlled on the basis of a control signal from the movement monitoring device 504.
  • The movement monitoring device 504 detects the amount of movement of the subject by a method based on the representative point matching method, and comprises an analog-to-digital converter (ADC) 541, a representative point memory 542, a correlated value operating circuit 543, and a CPU 544. The CPU 544 comprises a ROM (not shown) storing its program and the like and a RAM (not shown) storing necessary data. [0180]
  • Description is made of a motion vector detecting method based on a normal representative point matching method. As shown in FIG. 3, a plurality of detecting areas E are set in an image area (a monitoring area) [0181] 100 of the video camera 501. Each of the detecting areas E is further divided into a plurality of small areas e, as shown in FIG. 4. As shown in FIG. 5, a plurality of sampling points S and one representative point R are set in each of the small areas e.
  • A difference between the image signal level at each of the sampling points S in the small area e in the current frame (hereinafter referred to as sampling point data) and the image signal level at the representative point R in a corresponding small area e in the preceding frame (hereinafter referred to as representative point data), that is, a correlated value at each of the sampling points S is found for each of the detecting areas E. For each of the detecting areas E, the sum of correlated values at the sampling points S which are the same in deviation from the representative point R in all the small areas e in the detecting area E is found (a value obtained is hereinafter referred to as an accumulated correlated value). Consequently, accumulated correlated values whose number corresponds to the number of the sampling points S in one of the small areas e are formed for each of the detecting areas E. [0182]
  • Deviation of the sampling point S having the minimum accumulated correlated value, that is, having the highest correlation in each of the detecting areas E is extracted as a motion vector (the movement of an object) in the detecting area E. [0183]
  • In the above-mentioned normal motion vector detecting method, motion vectors corresponding to the amount of movement of the subject from the preceding frame are calculated for each frame. In the present embodiment, by contrast, the difference between the representative point data at the previous recording time and the sampling point data obtained for each frame, that is, the correlated value at each of the sampling points, is found, so that motion vectors corresponding to the amount of movement of the subject from the previous recording time are calculated. [0184]
  • The [0185] ADC 541 converts an analog image signal outputted from the video camera 501 into a digital image signal. The representative point data in the obtained digital image signal is fed to the representative point memory 542. The writing of the representative point data into the representative point memory 542 is controlled by the CPU 544.
  • The sampling point data in the digital image signal obtained by the [0186] ADC 541 is inputted to the correlated value operating circuit 543. The correlated value operating circuit 543 finds for each of the detecting areas E the difference between each of the sampling point data in the current frame and the representative point data stored in the representative point memory 542, that is, a correlated value at each of the sampling points, and finds, for each of the detecting areas E, the sum of correlated values at the sampling points S which are the same in deviation from the representative points R in all the small areas e in the detecting area E (a value obtained is hereinafter referred to as an accumulated correlated value).
  • The accumulated correlated value found for each of the detecting areas E is fed to the [0187] CPU 544. The CPU 544 extracts deviation of the sampling point S having the minimum accumulated correlated value, that is, having the highest correlation in each of the detecting areas E as a motion vector in the detecting area E. The recording device 503 is controlled on the basis of the obtained motion vector.
  • FIG. 22 shows the procedure for recording control processing performed by the [0188] CPU 544.
  • Picked-up images, which correspond to one or several frames, obtained by the [0189] video camera 501 are first recorded by the recording device 503 (step 101). Representative point data corresponding to one frame which are currently fed to the representative point memory 542 are written into the representative point memory 542 (step 102).
  • Thereafter, when accumulated correlated values corresponding to one frame are inputted from the correlated value operating circuit 543 (step 103), a motion vector is calculated for each of the detecting areas E (step 104). That is, information relating to the movement of the subject from the previous recording time is calculated. [0190]
  • It is judged whether or not there exists a motion vector whose magnitude is not less than a predetermined value out of the motion vectors calculated for the detecting areas E (step 105). [0191]
  • When there exists no motion vector whose magnitude is not less than the predetermined value out of the motion vectors calculated for the detecting areas E, the program is returned to the [0192] step 103. Consequently, the processing at the steps 103, 104 and 105 is always repeatedly performed.
  • When it is judged at the [0193] step 105 that there exists the motion vector whose magnitude is not less than the predetermined value out of the motion vectors calculated for the detecting areas E, it is judged that the amount of movement of the subject from the previous recording time becomes not less than the predetermined value, after which the program is returned to the step 101. In this case, therefore, picked-up images, which correspond to one or several frames, obtained by the video camera 501 are recorded by the recording device 503. Further, representative point data, which correspond to one frame, currently fed to the representative point memory 542 are written into the representative point memory 542. That is, the contents of the representative point memory 542 are updated. The program proceeds to the step 103.
  • According to the recording control processing shown in FIG. 22, recording is made every time the amount of movement of the subject from the previous recording time becomes not less than the predetermined value. [0194]
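  • For illustration only, the recording control of FIG. 22 described above can be sketched as follows; grab_frame(), representative_points(), the rep_memory object and get_magnitudes() (which returns, for each detecting area E, the magnitude of the motion vector measured against the representative point data held in the representative point memory 542) are assumptions introduced for this sketch.

    def record_on_movement(grab_frame, representative_points, recorder, rep_memory,
                           get_magnitudes, threshold):
        while True:
            frame = grab_frame()
            recorder.record(frame)                            # step 101: record one or a few frames
            rep_memory.update(representative_points(frame))   # step 102: update representative points
            # steps 103-105: wait until the movement of the subject from the previous
            # recording time becomes not less than the predetermined amount
            while max(get_magnitudes()) < threshold:
                pass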
  • FIG. 23 shows another example of recording control processing performed by the [0195] CPU 544.
  • The recording control processing differs from the recording control processing shown in FIG. 22 in that recording is made, unless the amount of movement of a subject from the previous recording time becomes not less than a predetermined value until a predetermined time period has elapsed since the previous recording time, at the time point where the predetermined time period has elapsed since the previous recording time. [0196]
  • Picked-up images, which correspond to one or several frames, obtained by the video camera 501 are first recorded by the recording device 503 (step 111). Representative point data corresponding to one frame which are currently fed to the representative point memory 542 are written into the representative point memory 542 (step 112). An interval timer for measuring a predetermined time period T is started (step 113). [0197]
  • Thereafter, it is judged whether or not the predetermined time period T has elapsed since the interval timer was started (step 114). When the predetermined time period T has not elapsed since the interval timer was started, the CPU 544 waits until accumulated correlated values corresponding to one frame are inputted from the correlated value operating circuit 543 (step 115). [0198]
  • When the accumulated correlated values corresponding to one frame are inputted from the correlated value operating circuit 543 (step 115), a motion vector is calculated for each of the detecting areas E (step 116). That is, information relating to the movement of the subject from the previous recording time is calculated. [0199]
  • It is judged whether or not there exists a motion vector whose magnitude is not less than the predetermined value out of the motion vectors calculated for the detecting areas (step 117). [0200]
  • When there exists no motion vector whose magnitude is not less than the predetermined value out of the motion vectors calculated for the detecting areas E, the program is returned to the [0201] step 114. Consequently, the processing at the steps 114, 115, 116 and 117 is always repeatedly performed.
  • When it is judged at the [0202] step 117 that there exists the motion vector whose magnitude is not less than the predetermined value out of the motion vectors calculated for the detecting areas E, it is judged that the amount of movement of the subject from the previous recording time becomes not less than the predetermined value, after which the program is returned to the step 111. In this case, therefore, picked-up images, which correspond to one or several frames, obtained by the video camera 501 are recorded by the recording device 503. Further, representative point data, which correspond to one frame, currently fed to the representative point memory 542 are written into the representative point memory 542. That is, the contents of the representative point memory 542 are updated. Further, the interval timer is started again. The program proceeds to the step 114.
  • When it is judged at the step 114 that the predetermined time period T has elapsed since the interval timer was started, the program is also returned to the step 111. In this case, therefore, picked-up images, which correspond to one or several frames, obtained by the video camera 501 are also recorded by the recording device 503. Further, representative point data, which correspond to one frame, currently fed to the representative point memory 542 are written into the representative point memory 542. That is, the contents of the representative point memory 542 are updated. Further, the interval timer is started again. The program proceeds to the step 114. [0203]
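  • For illustration only, the recording control of FIG. 23 described above can be sketched as follows; the helper names are the same assumptions as in the preceding sketch, and T is the predetermined time period measured by the interval timer.

    import time

    def record_on_movement_or_timeout(grab_frame, representative_points, recorder,
                                      rep_memory, get_magnitudes, threshold, T):
        while True:
            frame = grab_frame()
            recorder.record(frame)                            # step 111
            rep_memory.update(representative_points(frame))   # step 112
            deadline = time.time() + T                        # step 113: start the interval timer
            # steps 114-117: record again when the movement reaches the predetermined
            # amount or when the predetermined time period T has elapsed
            while time.time() < deadline and max(get_magnitudes()) < threshold:
                pass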
  • An electronic still camera (a digital camera) may be used as a combination of the [0204] video camera 501 and the recording device 503. In this case, the on-off control of a shutter of the electronic still camera is carried out by the movement monitoring device 504.
  • Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims. [0205]

Claims (26)

What is claimed is:
1. A monitoring system comprising:
an imaging device for imaging a monitoring area; and
means for detecting information relating to the movement of an object in the monitoring area on the basis of an output of the imaging device.
2. The monitoring system according to claim 1, further comprising
means for judging whether or not somebody enters the monitoring area on the basis of the information relating to the movement of the object.
3. The monitoring system according to claim 2, further comprising
reporting means for reporting, when it is judged that somebody enters the monitoring area, to a supervisor that somebody enters the monitoring area.
4. The monitoring system according to claim 2, further comprising
a recording device for recording an image picked up by the imaging device, and
means for starting the recording by the recording device when it is judged that somebody enters the monitoring area.
5. The monitoring system according to claim 2, further comprising
a recording device for recording an image picked up by the imaging device,
reporting means for reporting, when it is judged that somebody enters the monitoring area, to a supervisor that somebody enters the monitoring area, and
means for starting the recording by the recording device when it is judged that somebody enters the monitoring area.
6. The monitoring system according to claim 1, wherein
an entering person detecting sensor is provided in an entrance path of a person entering the monitoring area, and
said imaging device is operated when the entering person is detected by the entering person detecting sensor.
7. The monitoring system according to claim 6, wherein
a power supply comprising a solar battery and a storage battery storing power obtained by the solar battery supplies the power to said imaging device.
8. The monitoring system according to claim 1, wherein
the information relating to the movement of the object is a motion vector corresponding to a detecting area or motion vectors corresponding to a plurality of detecting areas set in an imaging area of the imaging device.
9. The monitoring system according to claim 1, wherein
the resolution of the imaging device is a sufficiently low resolution to judge the presence or absence of the movement of the object.
10. A monitoring system comprising:
an imaging device for imaging a monitoring area;
means for detecting information relating to the movement of an object in the monitoring area on the basis of an output of the imaging device;
means for judging whether or not a person to be monitored exits from the monitoring area on the basis of the information relating to the movement of the object; and
reporting means for reporting, when it is judged that the person to be monitored exits from the monitoring area, to a supervisor that the person to be monitored exits from the monitoring area.
11. A monitoring system comprising:
first imaging means for imaging a monitoring area;
detection means for detecting the movement of an object in the monitoring area on the basis of an output of the first imaging means; and
second imaging means for imaging, when the movement of the object in the monitoring area is detected, a moving portion.
12. The monitoring system according to claim 11, wherein
the second imaging means enlarges said moving portion and images the enlarged moving portion.
13. The monitoring system according to claim 12, wherein
the first imaging means comprises a monitoring camera for imaging the whole monitoring area, and
the second imaging means comprises a close-up camera for taking a close-up of a part of the monitoring area and imaging the part whose close-up has been taken.
14. The monitoring system according to claim 12, wherein
the first imaging means and the second imaging means are constituted by one video camera having a zoom mechanism.
15. The monitoring system according to claim 11, comprising:
a recording device;
a switch for switching an output of the first imaging means and an output of the second imaging means and feeding the output obtained by the switching to the recording device; and
control means for controlling the switch such that the output of the first imaging means is fed to the recording device when the movement of the object in the monitoring area is not detected, while the output of the second imaging means is fed to the recording device when the movement of the object in the monitoring area is detected.
16. The monitoring system according to claim 14, wherein
an identifier for making identification as to which of the output of the first imaging device and the output of the second imaging device is recorded is recorded by the recording device.
17. The monitoring system according to claim 15, further comprising
means for making, in reproducing an image recorded by the recording device, the speed at which an image picked up by the second imaging means is reproduced lower than the speed at which an image picked up by the first imaging means is reproduced.
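
Claim 17 reproduces the close-up material more slowly than the wide-view material, giving the supervisor more time on the frames that matter. A toy playback loop along those lines follows; the representation of frames as (identifier, image) pairs, the show callback and the delay values are assumptions for illustration.

    import time

    def playback(frames, show, normal_delay=1 / 30, slow_delay=1 / 10):
        """frames: iterable of (source_id, image); material tagged 'closeup'
        is displayed with a longer delay, i.e. reproduced more slowly."""
        for source_id, image in frames:
            show(image)
            time.sleep(slow_delay if source_id == "closeup" else normal_delay)
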
18. The monitoring system according to claim 11, further comprising
a recording device, and
means for recording the output of the second imaging means by the recording device only when the movement of the object in the monitoring area is detected.
19. A monitoring system comprising:
detection means for detecting the movement of an object in a monitoring area by a signal change obtained on the basis of the amount of infrared rays in the monitoring area, and
output means for outputting the results of the detection by the detection means.
20. A monitoring system comprising:
an infrared camera for receiving infrared rays emitted from an object in a monitoring area;
detection means for detecting the movement of the object in the monitoring area on the basis of a signal change proportional to the intensity of the infrared rays outputted from the infrared camera; and
output means for outputting the results of the detection by the detection means.
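
Claims 19 and 20 detect movement from changes in a signal that follows the amount of infrared rays in the monitoring area, for example successive readings from an infrared camera. A minimal frame-differencing sketch is given below; NumPy, the change threshold and the on_detect callback are illustrative assumptions, with the callback standing in for the warning device or video camera of claims 21 through 24.

    import numpy as np

    def ir_movement_detected(prev_ir, curr_ir, threshold=8.0):
        """Declare movement when the mean absolute change of the infrared
        intensity signal between two readings exceeds a threshold."""
        change = np.abs(curr_ir.astype(np.float32) - prev_ir.astype(np.float32))
        return change.mean() > threshold

    def monitor_ir(readings, on_detect):
        prev = None
        for curr in readings:
            if prev is not None and ir_movement_detected(prev, curr):
                on_detect()   # e.g. drive a warning device or start a video camera
            prev = curr
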
21. The monitoring system according to claim 19, further comprising
a warning device, and
means for driving the warning device on the basis of the output of the detection means.
22. The monitoring system according to claim 20, further comprising
a warning device, and
means for driving the warning device on the basis of the output of the detection means.
23. The monitoring system according to claim 19, further comprising
a video camera for imaging the monitoring area, and
means for driving the video camera on the basis of the output of the detection means.
24. The monitoring system according to claim 20, further comprising
a video camera for imaging the monitoring area, and
means for driving the video camera on the basis of the output of the detection means.
25. An imaging system for intermittently recording a picked-up image of a subject, comprising:
an imaging device for imaging the subject;
movement amount measurement means for measuring the amount of movement of the subject from the previous time when the picked-up image was recorded on the basis of an output of the imaging device; and
means for recording the picked-up image obtained by the imaging device when the amount of movement of the subject from the previous time when the picked-up image was recorded becomes not less than a predetermined amount.
26. The imaging system according to claim 25, further comprising
means for recording, unless the amount of movement of the subject from the previous time when the picked-up image was recorded becomes not less than a predetermined amount before a predetermined time period has elapsed since the previous time when the picked-up image was recorded, the picked-up image obtained by the imaging device at the time point where the predetermined time period has elapsed since the previous time when the picked-up image was recorded.
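
Claims 25 and 26 record intermittently: a new picture is stored once the subject has moved by at least a predetermined amount since the last stored picture and, as a fallback, once a predetermined time has passed without that much movement. One way to express that control loop is sketched below; the movement_since measure, the record callback and the numeric limits are assumptions chosen only to make the example concrete.

    import time

    def intermittent_recording(frames, movement_since, record,
                               min_movement=10.0, max_interval=60.0):
        """frames: iterable of images; movement_since(ref, curr) measures how far
        the subject has moved since the last recorded image ref."""
        last_image, last_time = None, None
        for image in frames:
            now = time.monotonic()
            if last_image is None:
                record(image)                                   # first picture
                last_image, last_time = image, now
                continue
            moved_enough = movement_since(last_image, image) >= min_movement  # cf. claim 25
            timed_out = (now - last_time) >= max_interval                     # cf. claim 26
            if moved_enough or timed_out:
                record(image)
                last_image, last_time = image, now
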
US09/084,315 1997-05-27 1998-05-26 Monitoring system and imaging system Expired - Lifetime US6456320B2 (en)

Applications Claiming Priority (15)

Application Number Priority Date Filing Date Title
JP137306/1997 1997-05-27
JP13730697 1997-05-27
JP9-137306 1997-05-27
JP9-146157 1997-06-04
JP146157/1997 1997-06-04
JP14615797A JPH10333219A (en) 1997-06-04 1997-06-04 Image pickup system
JP14771697A JPH10336630A (en) 1997-06-05 1997-06-05 Monitor system
JP9-147717 1997-06-05
JP14771797A JPH10336632A (en) 1997-06-05 1997-06-05 Monitoring device
JP9-147454 1997-06-05
JP147717/1997 1997-06-05
JP14745497A JPH1145379A (en) 1997-05-27 1997-06-05 Monitoring system
JP147454/1997 1997-06-05
JP147716/1997 1997-06-05
JP9-147716 1997-06-05

Publications (2)

Publication Number Publication Date
US20020015094A1 (en) 2002-02-07
US6456320B2 (en) 2002-09-24

Family

ID=27527481

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/084,315 Expired - Lifetime US6456320B2 (en) 1997-05-27 1998-05-26 Monitoring system and imaging system

Country Status (1)

Country Link
US (1) US6456320B2 (en)

Cited By (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020090116A1 (en) * 2000-10-13 2002-07-11 Kunihiro Miichi Image comparison apparatus, image comparison method, image comparison center apparatus, and image comparison system
US20020191831A1 (en) * 2001-05-02 2002-12-19 Stmicroelectronics S.R.L. System and process for analyzing surface defects
US20030007003A1 (en) * 2001-05-24 2003-01-09 Ostrowski Dominic Jan Data processing device with server-generated graphic user interface
US20030044046A1 (en) * 2001-08-30 2003-03-06 Yoshifumi Nakamura Method and system for delivering monitored image signal of sbject to be monitored
US20030103138A1 (en) * 2001-12-03 2003-06-05 Inter-Cite Video Inc. Video security and control system
US20030155811A1 (en) * 2002-02-20 2003-08-21 Makoto Yoneya Remote input/output device
US6970576B1 (en) 1999-08-04 2005-11-29 Mbda Uk Limited Surveillance system with autonomic control
US20060174206A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Shared image device synchronization or designation
US20060171695A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Shared image device designation
US20060274165A1 (en) * 2005-06-02 2006-12-07 Levien Royce A Conditional alteration of a saved image
US20060274154A1 (en) * 2005-06-02 2006-12-07 Searete, Lcc, A Limited Liability Corporation Of The State Of Delaware Data storage usage protocol
US20060279643A1 (en) * 2005-06-02 2006-12-14 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Storage access technique for captured data
US20070008326A1 (en) * 2005-06-02 2007-01-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Dual mode image capture technique
US20070098348A1 (en) * 2005-10-31 2007-05-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Degradation/preservation management of captured data
US20070109411A1 (en) * 2005-06-02 2007-05-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Composite image selectivity
US20070120981A1 (en) * 2005-06-02 2007-05-31 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Storage access technique for captured data
US20070139529A1 (en) * 2005-06-02 2007-06-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Dual mode image capture technique
US20070200934A1 (en) * 2006-02-28 2007-08-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Imagery processing
US20070222865A1 (en) * 2006-03-15 2007-09-27 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Enhanced video/still image correlation
US20080043108A1 (en) * 2006-08-18 2008-02-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Capturing selected image objects
US20080106621A1 (en) * 2005-01-31 2008-05-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Shared image device synchronization or designation
US20080122932A1 (en) * 2006-11-28 2008-05-29 George Aaron Kibbie Remote video monitoring systems utilizing outbound limited communication protocols
US20080143831A1 (en) * 2006-12-15 2008-06-19 Daniel David Bowen Systems and methods for user notification in a multi-use environment
US20080158366A1 (en) * 2005-01-31 2008-07-03 Searete Llc Shared image device designation
US20080219589A1 (en) * 2005-06-02 2008-09-11 Searete LLC, a liability corporation of the State of Delaware Estimating shared image device operational capabilities or resources
US20090027505A1 (en) * 2005-01-31 2009-01-29 Searete Llc Peripheral shared image device sharing
US20090073268A1 (en) * 2005-01-31 2009-03-19 Searete Llc Shared image devices
US20090093688A1 (en) * 2003-05-30 2009-04-09 Michael Mathur System, Device, and Method for Remote Monitoring and Servicing
US20090144391A1 (en) * 2007-11-30 2009-06-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Audio sharing
US20090147095A1 (en) * 2007-12-06 2009-06-11 Samsung Techwin Co., Ltd. Digital photographing apparatus, method of controlling the same, and recording medium storing a program for implementing the method
US20090207247A1 (en) * 2008-02-15 2009-08-20 Jeffrey Zampieron Hybrid remote digital recording and acquisition system
US20100007736A1 (en) * 2007-02-14 2010-01-14 Panasonic Corporation Monitoring camera and monitoring camera control method
US20100034523A1 (en) * 2007-02-14 2010-02-11 Lg Electronics Inc. Digital display device for having dvr system and of the same method
US20100235466A1 (en) * 2005-01-31 2010-09-16 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Audio sharing
US20100271490A1 (en) * 2005-05-04 2010-10-28 Assignment For Published Patent Application, Searete LLC, a limited liability corporation of Regional proximity for shared image device(s)
US20100329660A1 (en) * 2009-03-09 2010-12-30 Samsung Electronics Co., Ltd. Digital moving picture photographing apparatus, method of controlling the apparatus, recording medium storing program to execute the method, and method of determining movement of subject
US20110057796A1 (en) * 2009-09-10 2011-03-10 Sony Corporation Apparatus and Method for Operation of a Display Device to Provide a Home Security Alarm
US20110110608A1 (en) * 2005-03-30 2011-05-12 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Image transformation estimator of an imaging device
US20110187880A1 (en) * 2010-02-03 2011-08-04 Honeywell International Inc. Image acquisition system using orthogonal transfer ccd sensing element
US20110228977A1 (en) * 2010-03-18 2011-09-22 Hon Hai Precision Industry Co., Ltd. Image capturing device and method for adjusting a position of a lens of the image capturing device
CN102655570A (en) * 2011-03-03 2012-09-05 霍尼韦尔国际公司 Flashless motion invariant image acquisition system
EP2541519A1 (en) * 2011-06-30 2013-01-02 Xtralis AG Method for operating systems with PIR detectors
WO2013052863A1 (en) * 2011-10-05 2013-04-11 Radio Systems Corporation Image-based animal control systems and methods
US20130088422A1 (en) * 2011-10-05 2013-04-11 Sony Corporation Input apparatus and input recognition method
US9489717B2 (en) 2005-01-31 2016-11-08 Invention Science Fund I, Llc Shared image device
US9621749B2 (en) 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
US9942511B2 (en) 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US10003762B2 (en) 2005-04-26 2018-06-19 Invention Science Fund I, Llc Shared image devices
US10045512B2 (en) 2015-06-16 2018-08-14 Radio Systems Corporation Systems and methods for monitoring a subject in a premise
US10154651B2 (en) 2011-12-05 2018-12-18 Radio Systems Corporation Integrated dog tracking and stimulus delivery system
US10228447B2 (en) 2013-03-15 2019-03-12 Radio Systems Corporation Integrated apparatus and method to combine a wireless fence collar with GPS tracking capability
US10231440B2 (en) 2015-06-16 2019-03-19 Radio Systems Corporation RF beacon proximity determination enhancement
US10268220B2 (en) 2016-07-14 2019-04-23 Radio Systems Corporation Apparatus, systems and methods for generating voltage excitation waveforms
US10514439B2 (en) 2017-12-15 2019-12-24 Radio Systems Corporation Location based wireless pet containment system using single base unit
US10645908B2 (en) 2015-06-16 2020-05-12 Radio Systems Corporation Systems and methods for providing a sound masking environment
US10674709B2 (en) 2011-12-05 2020-06-09 Radio Systems Corporation Piezoelectric detection coupling of a bark collar
US10842128B2 (en) 2017-12-12 2020-11-24 Radio Systems Corporation Method and apparatus for applying, monitoring, and adjusting a stimulus to a pet
US10986813B2 (en) 2017-12-12 2021-04-27 Radio Systems Corporation Method and apparatus for applying, monitoring, and adjusting a stimulus to a pet
US11109182B2 (en) 2017-02-27 2021-08-31 Radio Systems Corporation Threshold barrier system
CN113835457A (en) * 2021-08-10 2021-12-24 太原市高远时代科技有限公司 Water conservancy integration intelligence rack based on edge calculation
US11238889B2 (en) 2019-07-25 2022-02-01 Radio Systems Corporation Systems and methods for remote multi-directional bark deterrence
US11341456B2 (en) * 2020-08-25 2022-05-24 Datalogic Usa, Inc. Compact and low-power shelf monitoring system
US11372077B2 (en) 2017-12-15 2022-06-28 Radio Systems Corporation Location based wireless pet containment system using single base unit
US11394196B2 (en) 2017-11-10 2022-07-19 Radio Systems Corporation Interactive application to protect pet containment systems from external surge damage
US11470814B2 (en) 2011-12-05 2022-10-18 Radio Systems Corporation Piezoelectric detection coupling of a bark collar
US11490597B2 (en) 2020-07-04 2022-11-08 Radio Systems Corporation Systems, methods, and apparatus for establishing keep out zones within wireless containment regions
US11553692B2 (en) 2011-12-05 2023-01-17 Radio Systems Corporation Piezoelectric detection coupling of a bark collar

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7612796B2 (en) * 2000-01-13 2009-11-03 Countwise, Llc Video-based system and method for counting persons traversing areas being monitored
US9892606B2 (en) 2001-11-15 2018-02-13 Avigilon Fortress Corporation Video surveillance system employing video primitives
US8711217B2 (en) 2000-10-24 2014-04-29 Objectvideo, Inc. Video surveillance system employing video primitives
US8564661B2 (en) 2000-10-24 2013-10-22 Objectvideo, Inc. Video analytic rule detection system and method
US6711279B1 (en) * 2000-11-17 2004-03-23 Honeywell International Inc. Object detection
US6841780B2 (en) * 2001-01-19 2005-01-11 Honeywell International Inc. Method and apparatus for detecting objects
US7176440B2 (en) * 2001-01-19 2007-02-13 Honeywell International Inc. Method and apparatus for detecting objects using structured light patterns
US7424175B2 (en) 2001-03-23 2008-09-09 Objectvideo, Inc. Video segmentation using statistical pixel modeling
US7154531B2 (en) * 2001-10-26 2006-12-26 The Chamberlain Group, Inc. Detecting objects by digital imaging device
JP2004021495A (en) * 2002-06-14 2004-01-22 Mitsubishi Electric Corp Monitoring system and monitoring method
WO2004021337A1 (en) * 2002-09-02 2004-03-11 Samsung Electronics Co., Ltd. Optical information storage medium and method of and apparatus for recording and/or reproducing information on and/or from the optical information storage medium
US7221775B2 (en) * 2002-11-12 2007-05-22 Intellivid Corporation Method and apparatus for computerized image background analysis
DE60330898D1 (en) * 2002-11-12 2010-02-25 Intellivid Corp METHOD AND SYSTEM FOR TRACKING AND BEHAVIORAL MONITORING OF MULTIPLE OBJECTS THROUGH SEVERAL VISIBILITIES
US8525469B1 (en) 2003-07-03 2013-09-03 Battery-Free Outdoors, Llc System and method using capacitors to power a camera having a motion sensor
US7286157B2 (en) * 2003-09-11 2007-10-23 Intellivid Corporation Computerized method and apparatus for determining field-of-view relationships among multiple image sensors
US7346187B2 (en) * 2003-10-10 2008-03-18 Intellivid Corporation Method of counting objects in a monitored environment and apparatus for the same
US7280673B2 (en) * 2003-10-10 2007-10-09 Intellivid Corporation System and method for searching for changes in surveillance video
US7664292B2 (en) * 2003-12-03 2010-02-16 Safehouse International, Inc. Monitoring an output from a camera
US9318012B2 (en) 2003-12-12 2016-04-19 Steve Gail Johnson Noise correcting patient fall risk state system and method for predicting patient falls
US8675059B2 (en) 2010-07-29 2014-03-18 Careview Communications, Inc. System and method for using a video monitoring system to prevent and manage decubitus ulcers in patients
US7477285B1 (en) * 2003-12-12 2009-01-13 Careview Communication, Inc. Non-intrusive data transmission network for use in an enterprise facility and method for implementing
US9311540B2 (en) 2003-12-12 2016-04-12 Careview Communications, Inc. System and method for predicting patient falls
US20050289363A1 (en) * 2004-06-28 2005-12-29 Tsirkel Aaron M Method and apparatus for automatic realtime power management
US20060190960A1 (en) * 2005-02-14 2006-08-24 Barker Geoffrey T System and method for incorporating video analytics in a monitoring network
ATE500580T1 (en) 2005-03-25 2011-03-15 Sensormatic Electronics Llc INTELLIGENT CAMERA SELECTION AND OBJECT TRACKING
US9036028B2 (en) 2005-09-02 2015-05-19 Sensormatic Electronics, LLC Object tracking and alerts
JP5061439B2 (en) * 2005-09-07 2012-10-31 パナソニック株式会社 Video signal processing apparatus and video signal processing method
JP2009533778A (en) 2006-04-17 2009-09-17 オブジェクトビデオ インコーポレイテッド Video segmentation using statistical pixel modeling
US7825792B2 (en) * 2006-06-02 2010-11-02 Sensormatic Electronics Llc Systems and methods for distributed monitoring of remote sites
US7671728B2 (en) * 2006-06-02 2010-03-02 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
EP1909229B1 (en) * 2006-10-03 2014-02-19 Nikon Corporation Tracking device and image-capturing apparatus
US8903222B2 (en) * 2007-02-01 2014-12-02 Sony Corporation Image reproducing apparatus, image reproducing method, image capturing apparatus, and control method therefor
JP2010533319A (en) * 2007-06-09 2010-10-21 センサーマティック・エレクトロニクス・コーポレーション Systems and methods for integrating video analysis and data analysis / mining
US9959471B2 (en) 2008-05-06 2018-05-01 Careview Communications, Inc. Patient video monitoring systems and methods for thermal detection of liquids
US9794523B2 (en) 2011-12-19 2017-10-17 Careview Communications, Inc. Electronic patient sitter management system and method for implementing
US10645346B2 (en) 2013-01-18 2020-05-05 Careview Communications, Inc. Patient video monitoring systems and methods having detection algorithm recovery from changes in illumination
US9579047B2 (en) 2013-03-15 2017-02-28 Careview Communications, Inc. Systems and methods for dynamically identifying a patient support surface and patient monitoring
US9866797B2 (en) 2012-09-28 2018-01-09 Careview Communications, Inc. System and method for monitoring a fall state of a patient while minimizing false alarms
DE102008058671B4 (en) * 2008-10-03 2011-04-07 ASTRA Gesellschaft für Asset Management mbH & Co. KG Method for controlling a video surveillance device
US8471899B2 (en) 2008-12-02 2013-06-25 Careview Communications, Inc. System and method for documenting patient procedures
US9101126B2 (en) * 2010-01-11 2015-08-11 Jager Pro, Llc Remote control gate release for trap enclosure
DE112011100263A5 (en) * 2010-01-18 2013-07-25 Stefan Wieser Apparatus and method for monitoring a building opening
TW201218091A (en) * 2010-10-19 2012-05-01 Hon Hai Prec Ind Co Ltd Image capturing device and method for detecting a human object using the image capturing device
CN102457706A (en) * 2010-10-19 2012-05-16 由田新技股份有限公司 Multi-angle monitoring device and multi-angle monitoring ATM (Automated Teller Machine)
US10076109B2 (en) 2012-02-14 2018-09-18 Noble Research Institute, Llc Systems and methods for trapping animals
JP5891061B2 (en) * 2012-02-15 2016-03-22 株式会社日立製作所 Video monitoring apparatus, monitoring system, and monitoring system construction method
US9237743B2 (en) 2014-04-18 2016-01-19 The Samuel Roberts Noble Foundation, Inc. Systems and methods for trapping animals

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1116286A (en) * 1979-02-20 1982-01-12 Control Data Canada, Ltd. Perimeter surveillance system
CA1172746A (en) * 1980-10-22 1984-08-14 Trevor W. Mahoney Video movement detector
US5111288A (en) * 1988-03-02 1992-05-05 Diamond Electronics, Inc. Surveillance camera system
US5095365A (en) * 1989-10-20 1992-03-10 Hitachi, Ltd. System for monitoring operating state of devices according to their degree of importance
US5091780A (en) * 1990-05-09 1992-02-25 Carnegie-Mellon University A trainable security system emthod for the same
US5164992A (en) * 1990-11-01 1992-11-17 Massachusetts Institute Of Technology Face recognition system
US5289275A (en) * 1991-07-12 1994-02-22 Hochiki Kabushiki Kaisha Surveillance monitor system using image processing for monitoring fires and thefts
KR930009102B1 (en) * 1991-08-17 1993-09-22 삼성전자 주식회사 Record auto-exchange circuit
JPH0816958B2 (en) * 1991-12-11 1996-02-21 茨城警備保障株式会社 Security surveillance system
WO1996003839A1 (en) * 1994-07-26 1996-02-08 Maxpro Systems Pty. Ltd. A video security system
US5825413A (en) * 1995-11-01 1998-10-20 Thomson Consumer Electronics, Inc. Infrared surveillance system with controlled video recording
IL116703A (en) * 1996-01-08 2001-01-11 Israel State System and method for detecting an intruder
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method

Cited By (108)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6970576B1 (en) 1999-08-04 2005-11-29 Mbda Uk Limited Surveillance system with autonomic control
US7190814B2 (en) * 2000-10-13 2007-03-13 Omron Corporation Image comparison apparatus and method for checking an image of an object against a stored registration image
US20020090116A1 (en) * 2000-10-13 2002-07-11 Kunihiro Miichi Image comparison apparatus, image comparison method, image comparison center apparatus, and image comparison system
US20020191831A1 (en) * 2001-05-02 2002-12-19 Stmicroelectronics S.R.L. System and process for analyzing surface defects
US20030007003A1 (en) * 2001-05-24 2003-01-09 Ostrowski Dominic Jan Data processing device with server-generated graphic user interface
US20030044046A1 (en) * 2001-08-30 2003-03-06 Yoshifumi Nakamura Method and system for delivering monitored image signal of sbject to be monitored
US7194109B2 (en) * 2001-08-30 2007-03-20 Hitachi Kokusai Electric, Inc. Method and system for delivering monitored image signal of subject to be monitored
US20030103138A1 (en) * 2001-12-03 2003-06-05 Inter-Cite Video Inc. Video security and control system
US20030155811A1 (en) * 2002-02-20 2003-08-21 Makoto Yoneya Remote input/output device
US20090093688A1 (en) * 2003-05-30 2009-04-09 Michael Mathur System, Device, and Method for Remote Monitoring and Servicing
US20120212596A1 (en) * 2003-05-30 2012-08-23 Michael Mathur System, Device, and Method for Remote Monitoring and Servicing
US9124729B2 (en) 2005-01-31 2015-09-01 The Invention Science Fund I, Llc Shared image device synchronization or designation
US8902320B2 (en) 2005-01-31 2014-12-02 The Invention Science Fund I, Llc Shared image device synchronization or designation
US9489717B2 (en) 2005-01-31 2016-11-08 Invention Science Fund I, Llc Shared image device
US9910341B2 (en) 2005-01-31 2018-03-06 The Invention Science Fund I, Llc Shared image device designation
US20090073268A1 (en) * 2005-01-31 2009-03-19 Searete Llc Shared image devices
US20060171695A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Shared image device designation
US9082456B2 (en) 2005-01-31 2015-07-14 The Invention Science Fund I Llc Shared image device designation
US9019383B2 (en) 2005-01-31 2015-04-28 The Invention Science Fund I, Llc Shared image devices
US8988537B2 (en) 2005-01-31 2015-03-24 The Invention Science Fund I, Llc Shared image devices
US20090027505A1 (en) * 2005-01-31 2009-01-29 Searete Llc Peripheral shared image device sharing
US8606383B2 (en) 2005-01-31 2013-12-10 The Invention Science Fund I, Llc Audio sharing
US20060174206A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Shared image device synchronization or designation
US20090115852A1 (en) * 2005-01-31 2009-05-07 Searete Llc Shared image devices
US20080106621A1 (en) * 2005-01-31 2008-05-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Shared image device synchronization or designation
US20080158366A1 (en) * 2005-01-31 2008-07-03 Searete Llc Shared image device designation
US20100235466A1 (en) * 2005-01-31 2010-09-16 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Audio sharing
US20110110608A1 (en) * 2005-03-30 2011-05-12 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Image transformation estimator of an imaging device
US10003762B2 (en) 2005-04-26 2018-06-19 Invention Science Fund I, Llc Shared image devices
US20100271490A1 (en) * 2005-05-04 2010-10-28 Assignment For Published Patent Application, Searete LLC, a limited liability corporation of Regional proximity for shared image device(s)
US9819490B2 (en) 2005-05-04 2017-11-14 Invention Science Fund I, Llc Regional proximity for shared image device(s)
US9967424B2 (en) 2005-06-02 2018-05-08 Invention Science Fund I, Llc Data storage usage protocol
US20070139529A1 (en) * 2005-06-02 2007-06-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Dual mode image capture technique
US20070008326A1 (en) * 2005-06-02 2007-01-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Dual mode image capture technique
US9621749B2 (en) 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
US20070040928A1 (en) * 2005-06-02 2007-02-22 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Capturing selected image objects
US9451200B2 (en) 2005-06-02 2016-09-20 Invention Science Fund I, Llc Storage access technique for captured data
US20060274165A1 (en) * 2005-06-02 2006-12-07 Levien Royce A Conditional alteration of a saved image
US9191611B2 (en) 2005-06-02 2015-11-17 Invention Science Fund I, Llc Conditional alteration of a saved image
US20070052856A1 (en) * 2005-06-02 2007-03-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware. Composite image selectivity
US8681225B2 (en) 2005-06-02 2014-03-25 Royce A. Levien Storage access technique for captured data
US20080219589A1 (en) * 2005-06-02 2008-09-11 Searete LLC, a liability corporation of the State of Delaware Estimating shared image device operational capabilities or resources
US9041826B2 (en) 2005-06-02 2015-05-26 The Invention Science Fund I, Llc Capturing selected image objects
US20070109411A1 (en) * 2005-06-02 2007-05-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Composite image selectivity
US20060279643A1 (en) * 2005-06-02 2006-12-14 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Storage access technique for captured data
US9001215B2 (en) 2005-06-02 2015-04-07 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US20070120981A1 (en) * 2005-06-02 2007-05-31 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Storage access technique for captured data
US10097756B2 (en) 2005-06-02 2018-10-09 Invention Science Fund I, Llc Enhanced video/still image correlation
US20060274154A1 (en) * 2005-06-02 2006-12-07 Searete, Lcc, A Limited Liability Corporation Of The State Of Delaware Data storage usage protocol
US20070098348A1 (en) * 2005-10-31 2007-05-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Degradation/preservation management of captured data
US9942511B2 (en) 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US9076208B2 (en) 2006-02-28 2015-07-07 The Invention Science Fund I, Llc Imagery processing
US20070200934A1 (en) * 2006-02-28 2007-08-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Imagery processing
US20070222865A1 (en) * 2006-03-15 2007-09-27 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Enhanced video/still image correlation
US20080043108A1 (en) * 2006-08-18 2008-02-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Capturing selected image objects
US8964054B2 (en) 2006-08-18 2015-02-24 The Invention Science Fund I, Llc Capturing selected image objects
US20080122932A1 (en) * 2006-11-28 2008-05-29 George Aaron Kibbie Remote video monitoring systems utilizing outbound limited communication protocols
US20080143831A1 (en) * 2006-12-15 2008-06-19 Daniel David Bowen Systems and methods for user notification in a multi-use environment
US9286775B2 (en) * 2007-02-14 2016-03-15 Panasonic Intellectual Property Management Co., Ltd. Monitoring camera and monitoring camera control method
US8456525B2 (en) * 2007-02-14 2013-06-04 Lg Electronics Inc. Digital display device for a DVR system that receives a movement image and a method for using such
US20100007736A1 (en) * 2007-02-14 2010-01-14 Panasonic Corporation Monitoring camera and monitoring camera control method
US10861304B2 (en) 2007-02-14 2020-12-08 Panasonic I-Pro Sensing Solutions Co., Ltd. Monitoring camera and monitoring camera control method
US9437089B2 (en) 2007-02-14 2016-09-06 Panasonic Intellectual Property Management Co., Ltd. Monitoring camera and monitoring camera control method
US9870685B2 (en) 2007-02-14 2018-01-16 Panasonic Intellectual Property Management Co., Ltd. Monitoring camera and monitoring camera control method
US20100034523A1 (en) * 2007-02-14 2010-02-11 Lg Electronics Inc. Digital display device for having dvr system and of the same method
US10475312B2 (en) 2007-02-14 2019-11-12 Panasonic intellectual property Management co., Ltd Monitoring camera and monitoring camera control method
US20090144391A1 (en) * 2007-11-30 2009-06-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Audio sharing
US8223211B2 (en) * 2007-12-06 2012-07-17 Samsung Electronics Co., Ltd. Digital photographing apparatus, method of controlling the same, and recording medium storing a program for implementing the method
US20090147095A1 (en) * 2007-12-06 2009-06-11 Samsung Techwin Co., Ltd. Digital photographing apparatus, method of controlling the same, and recording medium storing a program for implementing the method
US20090207247A1 (en) * 2008-02-15 2009-08-20 Jeffrey Zampieron Hybrid remote digital recording and acquisition system
US8345097B2 (en) 2008-02-15 2013-01-01 Harris Corporation Hybrid remote digital recording and acquisition system
WO2009126365A3 (en) * 2008-02-15 2009-12-17 Harris Corporation Hybrid remote digital recording and acquisition system
WO2009126365A2 (en) * 2008-02-15 2009-10-15 Harris Corporation Hybrid remote digital recording and acquisition system
US20100329660A1 (en) * 2009-03-09 2010-12-30 Samsung Electronics Co., Ltd. Digital moving picture photographing apparatus, method of controlling the apparatus, recording medium storing program to execute the method, and method of determining movement of subject
US8441350B2 (en) * 2009-09-10 2013-05-14 Sony Corporation Apparatus and method for operation of a display device to provide a home security alarm
US20110057796A1 (en) * 2009-09-10 2011-03-10 Sony Corporation Apparatus and Method for Operation of a Display Device to Provide a Home Security Alarm
US20110187880A1 (en) * 2010-02-03 2011-08-04 Honeywell International Inc. Image acquisition system using orthogonal transfer ccd sensing element
US8576324B2 (en) 2010-02-03 2013-11-05 Honeywell International Inc. Image acquisition system using orthogonal transfer CCD sensing element
US8406468B2 (en) * 2010-03-18 2013-03-26 Hon Hai Precision Industry Co., Ltd. Image capturing device and method for adjusting a position of a lens of the image capturing device
US20110228977A1 (en) * 2010-03-18 2011-09-22 Hon Hai Precision Industry Co., Ltd. Image capturing device and method for adjusting a position of a lens of the image capturing device
US20120224088A1 (en) * 2011-03-03 2012-09-06 Honeywell International Inc. Flashless motion invariant image acquisition system
US8872926B2 (en) * 2011-03-03 2014-10-28 Honeywell International Inc. Flashless motion invariant image acquisition system
CN102655570A (en) * 2011-03-03 2012-09-05 霍尼韦尔国际公司 Flashless motion invariant image acquisition system
EP2541519A1 (en) * 2011-06-30 2013-01-02 Xtralis AG Method for operating systems with PIR detectors
US9268412B2 (en) * 2011-10-05 2016-02-23 Sony Corporation Input apparatus having an input recognition unit and input recognition method by using the same
US20130088422A1 (en) * 2011-10-05 2013-04-11 Sony Corporation Input apparatus and input recognition method
WO2013052863A1 (en) * 2011-10-05 2013-04-11 Radio Systems Corporation Image-based animal control systems and methods
US11553692B2 (en) 2011-12-05 2023-01-17 Radio Systems Corporation Piezoelectric detection coupling of a bark collar
US10154651B2 (en) 2011-12-05 2018-12-18 Radio Systems Corporation Integrated dog tracking and stimulus delivery system
US11470814B2 (en) 2011-12-05 2022-10-18 Radio Systems Corporation Piezoelectric detection coupling of a bark collar
US10674709B2 (en) 2011-12-05 2020-06-09 Radio Systems Corporation Piezoelectric detection coupling of a bark collar
US10228447B2 (en) 2013-03-15 2019-03-12 Radio Systems Corporation Integrated apparatus and method to combine a wireless fence collar with GPS tracking capability
US10645908B2 (en) 2015-06-16 2020-05-12 Radio Systems Corporation Systems and methods for providing a sound masking environment
US10045512B2 (en) 2015-06-16 2018-08-14 Radio Systems Corporation Systems and methods for monitoring a subject in a premise
US10231440B2 (en) 2015-06-16 2019-03-19 Radio Systems Corporation RF beacon proximity determination enhancement
US10613559B2 (en) 2016-07-14 2020-04-07 Radio Systems Corporation Apparatus, systems and methods for generating voltage excitation waveforms
US10268220B2 (en) 2016-07-14 2019-04-23 Radio Systems Corporation Apparatus, systems and methods for generating voltage excitation waveforms
US11109182B2 (en) 2017-02-27 2021-08-31 Radio Systems Corporation Threshold barrier system
US11394196B2 (en) 2017-11-10 2022-07-19 Radio Systems Corporation Interactive application to protect pet containment systems from external surge damage
US10986813B2 (en) 2017-12-12 2021-04-27 Radio Systems Corporation Method and apparatus for applying, monitoring, and adjusting a stimulus to a pet
US10842128B2 (en) 2017-12-12 2020-11-24 Radio Systems Corporation Method and apparatus for applying, monitoring, and adjusting a stimulus to a pet
US10955521B2 (en) 2017-12-15 2021-03-23 Radio Systems Corporation Location based wireless pet containment system using single base unit
US11372077B2 (en) 2017-12-15 2022-06-28 Radio Systems Corporation Location based wireless pet containment system using single base unit
US10514439B2 (en) 2017-12-15 2019-12-24 Radio Systems Corporation Location based wireless pet containment system using single base unit
US11238889B2 (en) 2019-07-25 2022-02-01 Radio Systems Corporation Systems and methods for remote multi-directional bark deterrence
US11490597B2 (en) 2020-07-04 2022-11-08 Radio Systems Corporation Systems, methods, and apparatus for establishing keep out zones within wireless containment regions
US11341456B2 (en) * 2020-08-25 2022-05-24 Datalogic Usa, Inc. Compact and low-power shelf monitoring system
CN113835457A (en) * 2021-08-10 2021-12-24 太原市高远时代科技有限公司 Water conservancy integration intelligence rack based on edge calculation

Also Published As

Publication number Publication date
US6456320B2 (en) 2002-09-24

Similar Documents

Publication Publication Date Title
US6456320B2 (en) Monitoring system and imaging system
US6396534B1 (en) Arrangement for spatial monitoring
EP1279150B1 (en) Surveillance system with camera
US5938717A (en) Speed detection and image capture system for moving vehicles
KR100896949B1 (en) Image Monitoring System for Object Identification
JP2003069987A (en) Event video recording/reproducing system, event management device, and local recorder
AU2000241410A1 (en) Surveillance system with camera
WO2002091733A1 (en) Event detection in a video recording system
CA2425855C (en) A method of searching recorded digital video for areas of activity
JP3657132B2 (en) Housing intrusion monitoring device
JP3894122B2 (en) Object detection apparatus and method
JPH0614320A (en) Monitoring video recorder
KR100890767B1 (en) Cctv system for searching detail information and control method thereof
JPH11275566A (en) Monitoring camera apparatus
JP2001346194A (en) Intruder management device
JPH0261794A (en) Picture supervisory equipment
JP4634689B2 (en) Frame monitoring system
JPH1145379A (en) Monitoring system
KR100464372B1 (en) Front Particle Image Recording / Playback System of Home Automation System
KR101092654B1 (en) Digital video recorder having a perfume diffusing part
JP4461649B2 (en) Surveillance camera and security system using surveillance camera
JPH0968741A (en) Monitoring camera system
KR100309984B1 (en) Recording method of closed circuit television
JPH10336630A (en) Monitor system
JPH0737175A (en) Monitoring system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SANYO ELECTRIC CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUWANO, YUKINORI;OKINO, TOSHIYUKI;IKEDA, TAKASHI;AND OTHERS;REEL/FRAME:009394/0855

Effective date: 19980617

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: GODO KAISHA IP BRIDGE 1, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SANYO ELECTRIC CO., LTD.;REEL/FRAME:032077/0337

Effective date: 20140116

FPAY Fee payment

Year of fee payment: 12