WO2002073086A1 - Object detection - Google Patents

Object detection

Info

Publication number: WO2002073086A1
Authority: WO — hold on, no em dash: WIPO (PCT)
Prior art keywords: border, area, interest, region, image
Application number: PCT/US2002/007493
Other languages: French (fr)
Inventors: Darren Duane Cofer, Rida M. Hamza
Original assignee: Honeywell International Inc.
Application filed by Honeywell International Inc.
Priority: JP2002572310A (published as JP2004535616A)
Priority: EP02723404A (published as EP1373785B1)
Priority: DE60224373T (published as DE60224373T2)
Publication: WO2002073086A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16: ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16P: SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
    • F16P3/00: Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
    • F16P3/12: Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
    • F16P3/14: Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine, the means being photocells or other devices sensitive without mechanical contact
    • F16P3/142: Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine, the means being photocells or other devices sensitive without mechanical contact, using image capturing devices
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00: Burglar, theft or intruder alarms
    • G08B13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength, using passive radiation detection systems
    • G08B13/194: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength, using passive radiation detection systems, using image scanning and comparing systems
    • G08B13/196: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength, using passive radiation detection systems, using image scanning and comparing systems, using television cameras
    • G08B13/19602: Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19606: Discriminating between target movement or movement in an area of interest and other non-significative movements, e.g. target movements induced by camera shake or movements of pets, falling leaves, rotating fan

Definitions

  • The present invention relates to object detection, and more specifically, to object intrusion and/or presence detection within a predefined area or region.
  • Electrosensitive protective equipment (ESPE) is well known and widely used in industrial settings to protect operators of hazardous equipment from injury.
  • ESPE devices typically have a sensing function, a control or monitoring function, and an output signal switching function.
  • The sensing function typically collects data from, for example, a defined safety zone surrounding dangerous equipment.
  • The safety zone may be a line, an area, or a volume, depending on the sensing technology used.
  • The control function monitors the sensing function. When the control function determines that the sensor data provided by the sensing function corresponds to an intrusion into the safety zone, an output signal is produced to sound an alarm, deactivate the hazardous equipment, or perform some other precautionary measure.
  • Single beam photodetectors typically use a single light source and light detector to provide some level of access monitoring. When an object moves between the light source and the light detector, the light beam extending therebetween is interrupted, which then triggers a safety violation.
  • A limitation of single beam photodetector systems is that only limited access control and typically no presence sensing is provided. Another limitation is that, to change the location, shape or size of the safety zone, the light source and/or light detector must typically be physically moved.
  • Light curtain systems are similar to single beam photodetector systems, except that a linear array of light emitter/light detector pairs is provided.
  • The light emitter/light detector pairs are mounted in a pair of spaced enclosures.
  • The array of light emitters produces a "light curtain" that extends to the corresponding light detectors.
  • The resolution typically depends on the spacing of the light beams.
  • Light curtain systems can provide some level of access control when mounted vertically, and some level of presence monitoring when mounted horizontally. However, a limitation of some light curtain systems is that they are relatively expensive and complex. Another limitation is that variations in the size and shape of the safety area may be restricted, and the spaced enclosures must typically be physically moved to change the configuration of the safety zone to be monitored.
  • Laser scanner systems typically include a rotating laser emitter/detector, which scans a plane and measures the distance to the nearest object in any direction by monitoring the reflection of the beam.
  • This type of device can provide some level of presence monitoring along a horizontal plane. It may also be mounted vertically to provide some level of access monitoring, similar to the light curtain systems discussed above.
  • A limitation of laser scanner systems is that they use complex mechanical components, such as rotating heads, which can require periodic and precise alignment. While the region to be monitored may be redefined using configuration software, its shape is often limited by the line-of-sight of the laser. Also, the response time is limited by the need to rotate the laser beam, and the sensitivity may be limited by air pollution in an industrial environment.
  • Safety mat systems have been used to provide presence monitoring by detecting physical contact with a floor mat/sensor. Their robustness is limited by the need for physical contact with the floor mat for detection, which can be problematic in the often harsh environment of the factory floor. Safety mat systems typically cannot monitor large areas unless a number of mats are connected together. Finally, and like the single beam photodetector and light curtain systems described above, the safety mats must typically be physically moved to change the configuration of the safety zone to be monitored.
  • The present invention provides a visual object detection system that uses one or more images from a video camera, digital camera, etc., to provide access and/or presence monitoring of an area of interest.
  • During steady state operation, that is, when no object is entering or within the area of interest, only those portions of the incoming images that correspond to the border of the area of interest are analyzed.
  • The present invention may quickly detect when the border has been breached by an object. After the border is breached, the present invention preferably sounds an alarm, deactivates hazardous equipment in the area of interest, or performs some other precautionary measure, but this is not required.
  • Once the border is breached, the present invention may begin analyzing the entire area or selected regions inside the border of the area of interest. This may provide some level of presence monitoring of the area of interest. In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application.
  • The present invention preferably then returns to the original steady state, and monitors only the border regions of the incoming images.
  • Figure 1 is a partial side view of a workplace area with an overhead safety camera in place;
  • Figure 2 is a perspective view of a safety camera having a field of view, with an area of interest within the field of view of the safety camera;
  • Figure 3 is a schematic diagram showing a workplace with safety system in place;
  • Figure 4A is a diagram of an image of an area of interest with a border region defined;
  • Figure 4B is a diagram of an image of an area of interest, wherein the floor of the area of interest has a pattern along the border region;
  • Figure 5 is a perspective view of a safety camera system that monitors an area of interest having irregular shaped borders;
  • Figure 6A is a perspective view of an area of interest with breaks in the border;
  • Figure 6B is a top view of the area of interest of Figure 6A;
  • Figure 7A is a perspective view of a safety camera system having an overhead camera and a side camera for monitoring the area of interest;
  • Figure 7B is a diagram showing the field of view of the overhead camera of Figure 7A;
  • Figure 7C is a diagram showing the field of view of the side camera of Figure 7A;
  • Figure 8A is a perspective view of another safety camera system having two safety cameras for monitoring a volume of interest;
  • Figure 8B is a diagram showing the field of view of the overhead camera of Figure 8A;
  • Figure 8C is a diagram showing the field of view of the side camera of Figure 8A;
  • Figure 9 is a perspective view of yet another safety camera system having two safety cameras for monitoring an area of interest;
  • Figure 10A is a perspective view of an area of interest with an object just entering the border of the area of interest;
  • Figure 10B is an overhead view of the area of interest and the object of Figure 10A;
  • Figure 11 A is a perspective view of an area of interest with an object already across the border of the area of interest;
  • Figure 11B is an overhead view of the area of interest and the object of Figure 11A;
  • Figure 12A is a perspective view of an area of interest with an object entirely within the area of interest;
  • Figure 12B is an overhead view of the area of interest and the object of Figure 12A;
  • Figure 13 is a block diagram showing an illustrative safety camera system in accordance with the present invention.
  • Figure 14 is another block diagram showing an illustrative safety camera system in accordance with the present invention.
  • Figure 15 is a flow chart showing an illustrative relationship between border and interior analysis functions in accordance with the present invention.
  • Figure 16 is a flow chart showing an illustrative method for performing interior and border analysis in accordance with the present invention.
  • Figure 17 is a state machine diagram of an illustrative embodiment of the present invention;
  • Figure 18 is a block diagram showing an illustrative data flow and analysis in accordance with the present invention;
  • Figure 19 is a timing diagram showing illustrative timing relationships for the embodiment of Figure 18;
  • Figure 20 is a state diagram showing the progression of states for an illustrative embodiment of the present invention.
  • Figure 21 is a schematic diagram of an illustrative embodiment using multiple channels of data flow in accordance with the present invention.
  • Figure 22 is a timing diagram showing illustrative timing relationships of the embodiment of Figure 21;
  • Figure 23 is a block diagram showing processing, memory and control blocks of an illustrative embodiment of the present invention.
  • Figure 24 is a functional block diagram of another illustrative embodiment of the present invention.
  • Figure 1 is a side view of a workplace area 14 with a camera 10.
  • Camera 10 is placed over equipment 12, which is surrounded by a safety area 14.
  • The equipment 12 may be a piece of hazardous equipment, for example a machine with moving parts, a chemical storage unit, a raw materials processor, an incinerator, or any other machine that could present a danger to a person.
  • The equipment 12 may be one or more pieces of equipment that are performing highly sensitive activities, where a safety system could be used to prevent an object or a person from interfering with the highly sensitive activity.
  • Equipment 12 may also be a valuable item, and a safety system could be implemented to prevent the item from being taken or damaged.
  • A worker 20 is shown standing in the safety area 14.
  • The camera 10 gathers frames along a pyramid-shaped field of view 30, preferably as a two dimensional image.
  • Figure 2 is a schematic view of a camera 10 having a field of view 19.
  • An area of interest 14 is shown in cross-hatch. It should be noted that the term "area” is not limited to a two-dimensional area, and may include three-dimensional volumes, as further described below.
  • The present invention preferably uses a digital camera 10 or the like to gather images along the field of view 19 of the camera 10. After gathering an image of the field of view 19, data processing techniques are preferably used to select those pixels that fall within the border 16. The present invention also preferably distinguishes between those pixels that fall along the border 16 and those pixels that fall within the border 16.
  • The present invention may quickly detect when the border 16 has been breached by an object. After the border is breached, a precautionary measure may be taken, such as sounding an alarm, deactivating hazardous equipment in the area of interest 14, or performing some other precautionary measure, but this is not required.
  • Once the border is breached, the illustrative embodiment may begin analyzing those pixels within the interior 15 of the area of interest 14. This may provide some level of presence monitoring of the interior 15 of the area of interest 14.
  • In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application.
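To make the pixel selection concrete, the following minimal sketch (Python with NumPy; the array size, coordinates, and variable names are hypothetical and not taken from the patent) distinguishes pixels that fall along the border 16 from pixels that fall within the interior 15 using boolean masks:

```python
import numpy as np

# Hypothetical 480x640 grayscale frame from the camera's field of view 19.
frame = np.zeros((480, 640), dtype=np.uint8)

# Build a mask for the area of interest 14; a real system would load these
# region definitions from configuration data rather than hard-coding them.
area = np.zeros(frame.shape, dtype=bool)
area[100:380, 150:490] = True            # area of interest 14

interior = np.zeros(frame.shape, dtype=bool)
interior[120:360, 170:470] = True        # interior 15 (area minus border strip)

border = area & ~interior                # border 16: the strip along the edge

# During steady state, only the border pixels need to be examined.
border_pixels = frame[border]
# After a breach, the interior pixels are examined as well.
interior_pixels = frame[interior]
```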
  • Figure 3 is a schematic diagram showing a workplace with a safety system in place.
  • A safety camera 10 is disposed above the workplace area.
  • The workplace area preferably has a predefined region of interest 14, which may correspond to a safety zone around a piece of equipment 12.
  • The interior area 15 of the area of interest 14 is defined by a border 16, which, in the illustrative embodiment, has sides 40, 42, 44, 46.
  • A processing system 70 is shown connected to the safety camera 10 via interface 72.
  • Processing system 70 preferably processes images received from safety camera 10 to determine if an object has breached a predefined border region 16 and/or if an object remains within the interior area 15, as described above.
  • The processing system 70 may also be connected to equipment 12 via interface 74.
  • Processing system 70 may send an enable or turn-off signal to the equipment 12 via interface 74.
  • This enable or turn-off signal preferably causes the equipment 12 to turn off, trigger a brake, or otherwise stop the operation of equipment 12.
  • Processing system 70 may also receive information from equipment 12, such as a warning, an error or a malfunction signal, if desired.
  • The processing system 70 is also connected to other input or output devices 28 via an interface 76.
  • The input or output devices 28 may include, for example, an audible alarm, a visible alarm, a transmitter to send a warning signal to a remote location, a memory device to keep records of warning signals, etc.
  • Figure 4A is a diagram of an image of an area of interest 14 with a border region 16 defined.
  • A piece of equipment 12 is shown within the interior area 15.
  • The image is divided into a border region 16 and an interior region 15.
  • Border 16 may be more narrowly or more widely defined, depending on the application.
  • In the illustrative embodiment, analysis of the interior region 15 only takes place after a triggering event, such as the breach of the border region 16.
  • Analysis of the interior region 15 may also be initiated by other mechanisms, such as at timed intervals, or in response to an external stimulus, such as a manual request for interior region analysis, etc.
  • A pattern may be provided on the floor, preferably corresponding to the desired border region 16.
  • A patterned tape or painted strip with contrasting dark and light areas may be placed along the desired border region 16.
  • The area defined by each of the contrasting color regions may be chosen to correspond to the minimum size of the object to be detected. It is contemplated, however, that any number of patterns may be used including, for example, a single color line, a checked pattern, crisscrossing stripes, etc.
  • The pattern may also cover a more general area, and need not be confined to the border 16.
  • Markers may also be included in the area of interest 14 to enable functioning of a system in which the relative position of the camera with respect to the area of interest cannot or will not be maintained during operation of the safety system.
  • A marker or the like may be used to provide a point of reference.
  • Vibrations, for example, may cause noticeable movement of the camera.
  • A suitable marker may be any of several known in the art, such as colored or painted dots, strips, sound or light signal generators, identifiable shapes or designs, etc.
  • Figure 4B is a diagram of an image of an area of interest, wherein the floor of the area of interest has a pattern along the border region.
  • Area of interest 90 is defined by an outside border 92 and an interior region 94.
  • Outside border 92 may include a pattern, such as a checker pattern. Such a pattern is not required in the present invention, but may be useful to improve the accuracy and/or speed of the border analysis. Such a pattern may be provided throughout the area of interest 90, if desired. In addition, and in some embodiments, a different pattern may be used in the border region 92 than in the interior region 94.
  • Equipment 96 is shown within an excluded area 98, which is defined by internal border 99.
  • Movement across outside border 92 and internal border 99 may be monitored during steady state operation. If an object crosses one of these borders, the interior region 94 (or excluded area 98) may then be monitored.
  • The excluded area 98 may be used to detect objects that are thrown or otherwise moving away from equipment 96 during steady state operation. Such objects may indicate failure or malfunction of the piece of equipment 96. It is contemplated that equipment 96 may not be within excluded area 98, may be part in excluded area 98 and part in interior region 94, or may be entirely inside excluded area 98. Selection of the border of excluded area 98 may include assessing whether the equipment 96 has moving parts that could disrupt the safety system if not ignored.
  • Figure 5 is a perspective view of a safety camera system that monitors an area of interest having irregular shaped borders.
  • Camera 110 is disposed above an area of interest 114 with an interior region 115 and a border region 116.
  • Equipment 112 is located inside area of interest 114.
  • Border 116 is irregular in shape, made up of segments 140, 142, 144, 146 and 148, with 142 being curved.
  • Camera 110 gathers an image including a larger field of view 130 defined under cone 132.
  • A processing unit may then exclude pixels that correspond to areas 134 and 136 from the analysis.
  • In this way, a user can select the dimensions and shape of the area of interest 114, as desired.
  • As also shown in Figure 5, the camera 110 need not be centered over the desired area of interest 114. Offsetting the camera 110 may be desirable for a variety of reasons, including space constraints, etc.
  • For example, if equipment 112 includes a part 112A that is of particular interest, the camera 110 may be disposed in an off-center fashion relative to the area of interest 114 to provide a better view of the part 112A.
  • Figure 6A is a perspective view of an area of interest with breaks in the border.
  • Camera 160 is disposed above the area of interest 164.
  • The area of interest 164 is divided into two interior regions 165A and 165B.
  • The interior regions 165A and 165B are separated by conveyer belt 180 and machine 162.
  • The border region 166 is defined by lines 194, 196, 198, and lines 190, 192, 199.
  • The conveyer belt 180 crosses the border 166 as shown. In a preferred embodiment, movement along the conveyer belt 180 does not trigger the safety system, while movement across border lines 190, 192, 194, 196, 198, 199 does.
  • Figure 6B is a top view of the area of interest of Figure 6A.
  • Conveyer belt 180 crosses the area of interest 164, splitting the interior area into internal areas 165A and 165B, thereby making the border region 166 non-continuous.
  • The border 166 is made up of several segments 190, 192, 194, 196, 198, 199, as described above.
  • Equipment 162 connects to the conveyer belt 180.
  • Movement in the area covered by the conveyer belt 180 is ignored, such that steady state analysis would monitor the border segments 190, 192, 194, 196, 198, 199 not covered by the conveyer belt 180.
  • Interior analysis would ignore the area covered by the conveyer belt 180 and would only analyze areas 165A and 165B.
  • The interior analysis may or may not analyze the area over the equipment 162.
  • Each interior zone 165A and 165B could contain or surround separate and/or unrelated equipment, assuming each safety zone is in the field of view of the camera. This may reduce the cost of providing the safety camera system.
  • Each zone may also be monitored differently. For example, in a "LOAD MACHINE" mode, only interior region 165B may be monitored, while interior region 165A may not be monitored. In a "RUN" mode, both interior regions 165A and 165B may be monitored.
  • Figure 7A is a perspective view of a safety camera system having an overhead camera 210 and a side camera 220 for monitoring an area of interest 202.
  • The overhead camera 210 may monitor, for example, horizontal movement in the area of interest, as described above.
  • The side camera 220 may monitor, for example, vertical movement within the area of interest 202.
  • A pattern may be applied to a wall or the like in the field of view of the side camera to help detect movement of objects within the area of interest. Further illustration of the camera operations for the illustrative embodiment of Figure 7A appears in Figures 7B and 7C.
  • Figure 7B is a diagram showing a possible field of view for the overhead camera 210, and shows an area of interest 232.
  • Figure 7C is a diagram of an illustrative field of view for the side camera 220.
  • The side camera 220 may, for example, monitor vertical movement across a predefined plane.
  • The field of view 235 of the side camera 220 has a thin selected area 237 that corresponds to the desired plane.
  • Figure 8A is a perspective view of another safety camera system having two safety cameras 250 and 260 for monitoring a volume of interest 270.
  • A first camera 250 is positioned to capture an image under a cone 252 defining a circle 254 along the horizontal plane, while a second camera 260 is disposed to capture an image under a cone 262 defining circle 264 in the vertical plane.
  • Figure 8B is a diagram showing the field of view 290 of the first camera 250 of Figure 8A, with a first selected area of interest 292.
  • Figure 8C is a diagram showing the field of view 295 of the second camera of Figure 8A, with a second selected area of interest 297.
  • Volume 270 is a six-sided volume, whose shape is defined by the selected areas 292, 297, and the shape of cones 252, 262.
  • The shape of the volume of interest 270 may be refined by using additional cameras, or by using improved cameras that may capture additional information within the corresponding field of view. Additional optical devices may also be used for shaping the volume of interest 270.
  • In Figure 8A, an object 299 is shown within the volume of interest 270.
  • The object 299 lies along a first line 256 corresponding to camera 250, and a second line 266 corresponding to camera 260.
  • Figure 8B shows that the object 299 can be found in selected area 292, and Figure 8C shows that the object can be found in selected area 297. Because the object 299 appears in both selected areas 292 and 297, the object lies within the volume of interest 270. When the object is at a boundary of either of the selected areas 292 or 297, the object is at the boundary of the volume of interest 270.
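The two-camera membership test described above reduces to a simple intersection: an object is inside the volume of interest only when it appears in both cameras' selected areas. A minimal sketch follows; the coordinates and helper names are hypothetical, and a real system would derive the selected areas 292 and 297 from configuration data:

```python
def in_selected_area(detection, area):
    """True if a detected object's image coordinates (x, y) fall inside
    one camera's selected area, given as (x0, y0, x1, y1)."""
    x, y = detection
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

# Hypothetical selected areas 292 and 297 for cameras 250 and 260.
AREA_292 = (100, 100, 400, 300)
AREA_297 = (150, 80, 420, 260)

def object_in_volume(detection_cam_250, detection_cam_260):
    # The object lies within volume 270 only when it appears in the
    # selected area of BOTH cameras (compare Figures 8B and 8C).
    return (in_selected_area(detection_cam_250, AREA_292) and
            in_selected_area(detection_cam_260, AREA_297))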
  • In accordance with the present invention, when no object is entering or within the volume of interest 270, only those portions of the incoming images that correspond to the border of the volume of interest 270 may be analyzed.
  • The present invention may quickly detect when the border has been breached by an object. After the border is breached, the present invention may, for example, sound an alarm, deactivate hazardous equipment in the volume of interest, or perform some other precautionary measure, but this is not required.
  • After the border is breached, the present invention may begin analyzing the entire volume or selected regions inside the volume of interest, as desired. This may provide some level of presence monitoring of the volume of interest.
  • In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application.
  • The present invention preferably then returns to the original steady state, and monitors only the border regions of the incoming images.
  • Figure 9 is a perspective view of yet another safety camera system having two safety cameras for monitoring an area of interest.
  • The area of interest is shown at 300, and is defined by border 302. Corners 306 and 308 are also shown, with cameras 310 and 320 disposed over the corners.
  • A first camera 310 captures images under a cone 312, defined by field of view 314.
  • A second camera 320 likewise captures images under a cone 322, defined by field of view 324.
  • A planar border comprising a polygon 330 and two triangles 332 and 334 can then be defined.
  • The shape and center height of the polygon 330 may be changed by adjusting the angles of cameras 310 and 320 with respect to the area of interest 300, by utilizing different cameras 310 and 320, by adding additional cameras, etc. If so desired, the border may be defined as only including the area within the polygon 330, which would thus monitor an area captured by both cameras 310 and 320.
  • The present invention preferably monitors the border for breach by an object. Once breached, the present invention preferably begins analyzing the entire area or selected regions of the area of interest 300, as described above.
  • As shown in Figures 1-9, there are a variety of configurations that may be used to detect a border breach and/or monitor an area of interest. Many other configurations are also possible. Data selection and exclusion, along with placement of multiple cameras, manipulation of the angles of single or multiple cameras, and other techniques, may be used to monitor borders, areas, and volumes of many shapes, sizes, configurations, and numbers, as desired.
  • Figure 10A is a perspective view of an area of interest 350 with an object 370 just entering the border 352 of the area of interest 350.
  • Equipment 356 is in the area of interest 350, and the object 370 is shown just at the border 352 of a pyramid 362 outlining a selected area within the field of view of a camera 360. No other object is observed in the interior 354 of the area of interest 350.
  • Figure 10B is an overhead view of the area of interest 350 and the object 370 of Figure 10A.
  • During steady state operation, that is, when no object is entering or within the area of interest, only those portions of the incoming images that correspond to the border 352 of the area of interest 350 are analyzed.
  • A most recent image of the border 352 may be compared to at least one reference image.
  • The reference image may be a single image taken at one point in time, or the reference image may be updated periodically. In one illustrative embodiment, the reference image is updated after a set period of time. Alternatively, or in addition, the reference image may be updated upon the occurrence of an event, such as an external signal requesting that a new updated image be taken, or in response to a change in condition of the border 352 of the area of interest 350. In some embodiments, there may be more than one reference image, where a first reference image may be a set image or a recent image and a second reference image may be the second-to-last image taken, such that a first comparison with one image may note immediate changes in the border 352 of the area of interest 350, while a second comparison may note accumulated changes.
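A minimal sketch of the dual-reference scheme described above, assuming a NumPy grayscale pipeline; the threshold value and all class and method names are illustrative only, not from the patent:

```python
import numpy as np

THRESHOLD = 25  # assumed per-pixel difference threshold; not given in the text

class ReferenceImages:
    """Keeps a long-term reference image plus the second-to-last frame, so one
    comparison can note accumulated changes and the other immediate changes."""

    def __init__(self, first_frame):
        self.fixed = first_frame.copy()     # set (or periodically updated) image
        self.previous = first_frame.copy()  # second-to-last frame

    def border_changed(self, frame, border_mask):
        live = frame.astype(int)
        accumulated = np.abs(live - self.fixed.astype(int))
        immediate = np.abs(live - self.previous.astype(int))
        self.previous = frame.copy()
        # Report whether any border pixel differs by more than the threshold.
        return (bool((accumulated[border_mask] > THRESHOLD).any()),
                bool((immediate[border_mask] > THRESHOLD).any()))

    def update_reference(self, frame):
        # Called after a set period of time, or upon an external signal/event.
        self.fixed = frame.copy()
```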
  • The present invention may quickly detect when the border 352 has been breached by object 370. After the border 352 is breached, the present invention preferably sounds an alarm, deactivates hazardous equipment 356 in the area of interest 350, and/or performs some other precautionary measure, but this is not required.
  • Figure 11 A is a perspective view of an area of interest 400 with an object 420 already across the border 402 of the area of interest 400.
  • Figure 11B is an overhead view of the area of interest 400 and the object 420 of Figure 11A.
  • Once the border 402 is breached, the present invention preferably begins analyzing the entire area 404 or selected regions inside the border 402 of the area of interest 400. This may provide some level of presence monitoring of the area of interest 400. In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application.
  • In some embodiments, the interior region is defined to include the border region as well.
  • The safety monitoring system may return to the original steady state after a second triggering event, and again monitor only the border region of the incoming images.
  • The second triggering event may be, for example, a manual reset input signal, the passage of a period of time during which the object 420 does not move with respect to the interior 404 or the border 402 of the area of interest 400, the exit of the object 420 from the area of interest 400, a determination that the object 420 is smaller than some pre-selected minimum size, etc.
  • Figure 12A is a perspective view of an area of interest 450 with an object 470 entirely within the area of interest 450.
  • Figure 12B is an overhead view of the illustrative drawing of Figure 12A.
  • The object 470 would not be noted within the analysis of the border 452 of the area of interest 450, but it would appear as a change in the interior 454 of the area of interest 450 during interior monitoring, which may start when the object 470 breaches the border 452.
  • The safety monitoring system may return to the original steady state after a second triggering event.
  • The second triggering event may be, for example, a manual reset input signal, the passage of a period of time during which the object 470 does not move with respect to the interior 454 or the border 452 of the area of interest 450, the exit of the object 470 from the area of interest 450, a determination that the object 470 is smaller than some pre-selected minimum size, etc.
  • Upon returning to steady state, the safety system may update the reference image to reflect the change in the interior 454 and/or border 452 of the area of interest 450.
  • Figure 13 is a block diagram showing an illustrative safety camera system in accordance with the present invention.
  • An image analysis block 500 receives an input signal 502 from, for example, a digital or analog video camera, CCD or the like.
  • Input signal 502 may include a synchronization signal or other signal for indicating when, during the data stream, data relating to a new frame begins.
  • The input signal 502 is used to provide an input image or frame 504, which may be stored in a memory or file.
  • When a new frame is received, a signal may be sent to the control block 520.
  • The control block 520 may control the operation of the image analysis block 500. For example, when a new reference image is desired, the control block 520 may notify the updating block 506, which transfers the current image to the reference image memory block 508.
  • The selection block 510 may allow a user to define, for example, an area of interest, borders, excluded areas, etc. Whether the selection block 510 and/or control block 520 elicits any area, interior, and border definitional data depends upon the particular system needs and capacities.
  • A mask may be used on the camera(s) to limit the field of view of the cameras and reduce the amount of data that needs to be saved and/or processed.
  • The selection block 510 also accesses the reference image memory 508. Portions of each of these images are then sent to the border analysis block 512, per instructions from the control block 520 or instructions stored within the selection block 510. Border analysis block 512 receives image data from the selection block 510 and determines whether an object has intruded into the area of the input image 504 that is defined as the border. The border analysis block 512 sends an output to the control block 520 indicating the results of the border analysis.
  • The selection block 510 may send data to the interior analysis block 514 after a triggering event has occurred, such as detection of an object by the border analysis block 512.
  • The control block 520 may direct the selection block 510 to send such data, or the control block 520 may pass a signal indicating that a violation has taken place to the selection block 510, which may contain instructions as to how to respond.
  • The interior analysis block 514 may receive data relating to the interior area of the area of interest and, upon prompting from the selection block 510 or control block 520, may perform analysis on the interior of the area of interest.
  • The interior area of the area of interest may, in some embodiments, include the border region as well.
  • The interior analysis block 514 may send a signal back to the control block 520 indicating the results of the interior analysis.
  • Some outputs may include, for example, a connection to equipment or machines controlling the speed, operation, state, or other characteristics, including whether such equipment or machine is on or off, connection to a brake for stopping equipment or machines, audible or visible alarm systems, communication links to emergency services, monitoring facilities, security, upkeep, custodial or other maintenance facilities or personnel, etc.
  • The control block 520 may turn off the interior analysis block 514 under some conditions, for example, when it is determined that no border violation has occurred.
  • The border analysis and interior analysis may be performed by the same processing unit at different times, as shown and described with reference to, for example, Figure 19 (second processor 890).
  • The captured images and/or reference images may be saved for later viewing, if desired.
  • The captured and/or reference images may be saved to a storage medium, such as a hard disk, RAM, compact disc, magnetic tape, or any other storage medium.
  • The images may then be viewed to identify the situation that occurred.
  • In some embodiments, two image analysis blocks 500 may be provided, each receiving an input signal 502 from, for example, a digital or analog video camera, CCD or the like.
  • Both image analysis blocks 500 may analyze the captured image, and provide signals to control block 520.
  • Control block 520 may then only provide external output signals when both image analysis blocks 500 agree that an external output signal is warranted.
  • Alternatively, two or more imaging devices, such as digital or analog video cameras, CCDs or the like, may be provided, each providing an image of the area of interest to a corresponding image analysis block 500.
  • All image analysis blocks 500 may analyze the corresponding images, and provide signals to control block 520.
  • Control block 520 may then only provide external output signals when all image analysis blocks 500 agree that an external output signal is warranted.
  • Figure 14 is another block diagram showing an illustrative safety camera system in accordance with the present invention.
  • Inputs 560 to a control block 550 may include equipment-generated information 562, such as equipment operation speed, fluid levels, internal temperature, cover or shell integrity signals, error signals, and other information relating to the status, upkeep, or operation of the equipment. If the equipment were, for example, an internal combustion engine, the equipment-generated information could include indications of oil pressure, fuel level, heat level, etc.
  • Other inputs 560 may include manual reset 564 and manual turn-off 566, for example.
  • Outputs 570 from the control block 550 may include an output to the equipment 572, and other outputs 574.
  • The output to the equipment 572 may include signals such as off, idle, startup, or stop, or could apply a brake, regulate the speed of operation, close emergency safety guards, etc.
  • Other outputs 574 could include sirens, bells, whistles, lights, and emergency signals to warn emergency crews, or may include a memory device that keeps track of the timing and number of unsafe-condition messages generated by the safety monitoring system.
  • Figure 15 is a flow chart showing an illustrative relationship between border and interior analysis functions in accordance with the present invention.
  • The safety output 646 is used to disable the equipment, set off an alarm, etc. The safety output is generated by a boolean function, which ANDs the border analysis output 640 with the interior analysis output 642. That is, the border analysis output 640 must indicate that no object has breached the border, AND the interior analysis output 642 must indicate that no object is currently in the interior of the area of interest, before the safety output 646 is set high. In one embodiment, the equipment in the safety zone can only be operated when the safety output 646 is high.
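The boolean function described above reduces to a simple AND; a one-function sketch (the function name is hypothetical):

```python
def safety_output(border_clear, interior_clear):
    # Output 646 goes high only when border analysis output 640 AND
    # interior analysis output 642 both report "clear".
    return border_clear and interior_clear
```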
  • The border analysis 600 may include a quick examination of a minimum number of pixels to establish whether there has been an access violation.
  • Typically, the border region will include fewer pixels than the interior region.
  • The border analysis algorithms are preferably optimized to analyze this limited number of pixels with the fastest possible response time.
  • Border analysis 600 compares the border pixels in the live image with those in the reference image. If a region is detected where the computed difference is greater than a specified threshold, a more detailed analysis can be performed on the region to determine whether the difference is due to an intruding object, a shadow, or some other cause. If the difference is determined to be a shadow, the border analysis preferably resumes, examining other regions for detected differences.
  • The border analysis may terminate when one of the following conditions is reached: an object is detected on the border; all of the differences have been determined to be shadows or other non-objects; or the time available to perform the border analysis has expired.
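A minimal sketch of this compare-threshold-examine loop, assuming NumPy arrays; the threshold, time budget, and the shadow heuristic are illustrative assumptions, since the patent specifies none of these values:

```python
import time
import numpy as np

DIFF_THRESHOLD = 25   # assumed mean-difference threshold per window
TIME_BUDGET_S = 0.02  # assumed time available for the border analysis

def looks_like_shadow(live_region, ref_region):
    # Placeholder for the "more detailed analysis" described above: a shadow
    # tends to scale a region's brightness roughly uniformly.
    ratio = (live_region.astype(float) + 1.0) / (ref_region.astype(float) + 1.0)
    return ratio.std() < 0.05

def analyze_border(live, ref, border_windows):
    """Return True if an object is detected on the border; border_windows is
    a list of (row_slice, col_slice) regions along border 16."""
    deadline = time.monotonic() + TIME_BUDGET_S
    for rows, cols in border_windows:
        if time.monotonic() > deadline:
            break                                  # time available has expired
        live_w, ref_w = live[rows, cols], ref[rows, cols]
        diff = np.abs(live_w.astype(int) - ref_w.astype(int))
        if diff.mean() > DIFF_THRESHOLD and not looks_like_shadow(live_w, ref_w):
            return True                            # object detected on the border
    return False   # all differences were shadows or other non-objects
```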
  • The resolution used for object detection is preferably set to detect hands, feet, or other objects of similar (or smaller) size that are likely to first penetrate the border. Once the object size is set, the object detection may be limited to detecting only objects of the specified size or larger.
  • In some embodiments, the minimum size of objects to be detected is automatically determined from a reference object, reference markings, or some other feature in the reference image during a configuration procedure. Additionally, the minimum size of the objects to be detected can be different for the border analysis than for the interior analysis. For example, the minimum size of objects used for the border analysis (e.g. a hand) may be smaller than the minimum size used for the interior analysis (an arm, a body, etc.).
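Minimum-size filtering of this kind is commonly done with connected-component labeling; a sketch assuming SciPy, with the pixel-area minimums chosen arbitrarily for illustration (the patent gives no numeric sizes):

```python
import numpy as np
from scipy import ndimage

MIN_BORDER_OBJECT_PX = 50     # assumed: roughly hand-sized at the border
MIN_INTERIOR_OBJECT_PX = 400  # assumed: roughly arm/body-sized in the interior

def detected_objects(change_mask, min_size_px):
    """Label connected changed regions and keep only those at least as large
    as the minimum object size for this analysis (border or interior)."""
    labels, count = ndimage.label(change_mask)
    sizes = ndimage.sum(change_mask, labels, range(1, count + 1))
    return [i + 1 for i, size in enumerate(sizes) if size >= min_size_px]
```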
  • Border analysis 600 may be performed for every image captured to achieve the minimum response time. That is, the border analysis 600 may be triggered by arrival of a new image frame from an image input device such as a camera. Other embodiments may perform border analysis 600 on less than all received images, using some of the received frames to perform image verification processes to, for example, help assure that the image capture devices (e.g. cameras) are functioning properly. For example, and as shown and described below with reference to Figure 22, one half of the frames may be used for border analysis, while the other half may be used for camera verification procedures.
  • The interior analysis block 620 may determine whether there is any new, unexpected, or undesired object present inside the area of interest.
  • The interior analysis 602 may, in some embodiments, be limited to finding objects greater than a specified minimum size. Smaller objects and debris on the floor can, in some embodiments or applications, be ignored.
  • The interior analysis 620 may not need to be performed for every image frame received, but instead may be invoked on demand when needed, for example, after a triggering event such as a breach detected by the border analysis block 600.
  • Analysis of the interior of an area of interest typically must process more pixels and may require more computation time than the border analysis 600.
  • To aid the interior analysis, a reference marking may be provided on the floor of the interior of the area of interest. However, in some applications it may not be desirable or possible to do so.
  • A longer response time may be acceptable for the interior analysis 602, since it is not used for the initial detection of a safety violation, but instead takes place after a border violation has been detected.
  • The slower algorithms and analysis may thus be acceptable because the system is already "aware" of the safety violation, and ameliorative actions may have already taken place, for example, shutting down a hazardous machine or setting off alarm systems.
  • The border analysis 600 and interior analysis 620 procedures can be used together to maintain safety for the overall system.
  • The interaction of these two analyses can be thought of as a gate that "closes" the safety zone.
  • When the border analysis 600 determines that no object is in the process of entering the safety zone, the border of the safety zone is "closed".
  • The interior analysis can then work to determine whether there is an object present in the interior of the safety zone. Once the interior is found to be empty and no further border violation occurs, the system is in the safe or original steady state mode. As long as the border remains "closed" (no access violation), the system preferably remains in the safe or original steady state mode. Sometimes, during the time that the interior analysis 620 is being performed, a border violation occurs; this situation is addressed below.
  • Figure 16 is a flow chart showing an illustrative method for performing interior and border analysis in accordance with the present invention.
  • The border analysis block 600 and interior analysis block 602 of Figure 15 are shown in Figure 16, outlined in dashed lines.
  • The system preferably determines whether the safety zone (e.g. area of interest) is empty before activating the safety output 646.
  • Border analysis block 600 determines if there is a border violation by obtaining a next live image, analyzing the border of the live image as shown at 604, and determining if there is a border violation as shown at 606. If there is a border violation, the safety output of the border analysis block 600 is set to zero, as shown at 610, a next live image is received, and the process is repeated. Once no border violation is detected, the interior analysis block 602 receives a new live image of the area of interest, as shown at 624. The interior analysis block 602 then analyzes the interior of the safety zone, as shown at 626.
  • The system may then determine whether there has been a violation of the interior, as shown at 628. If there has been a violation of the interior, the system may return to block 622 and wait for an indication that the border is no longer being violated. If there is no interior violation, the system may check whether the border has been violated during the interior analysis, as shown at 630. In some embodiments, the received frames may be saved into memory during the interior analysis. After the interior analysis is complete, border analysis 600 may be performed on the saved frames in a relatively short amount of time, so the system can determine whether any border violations occurred during the interior analysis. If the border has been violated, the system may return to block 622 and wait for an indication that the border is no longer being violated.
  • If neither the interior nor the border has been violated, the system may generate an output indicating the system is safe, as shown at 632. Once safe, the system may enter the RUN state, as shown at 634.
  • The safety output signal 646 may then go high, indicating safe operating conditions.
  • The RUN state 634 is a safe or steady state in which the illustrative system performs no further interior analysis until a border violation is detected. When a border violation is detected, the system returns to block 622 and waits for an indication that the border is no longer being violated.
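The Figure 16 flow can be summarized in a short control loop; the four callables below are hypothetical hooks standing in for the blocks described above, not the patent's actual interfaces:

```python
def monitor(get_frame, border_violated, interior_violated, set_safety_output):
    """Sketch of the Figure 16 flow; runs until externally interrupted."""
    while True:
        # Border analysis block 600: wait until no border violation is seen.
        if border_violated(get_frame()):
            set_safety_output(0)                  # block 610
            continue
        # Interior analysis block 602: obtain a new live image (block 624)
        # and analyze the interior of the safety zone (block 626).
        frame = get_frame()
        if interior_violated(frame) or border_violated(frame):
            continue                              # blocks 628/630: start over
        set_safety_output(1)                      # block 632: system safe
        # RUN state 634: border-only monitoring until the next violation.
        while not border_violated(get_frame()):
            pass
        set_safety_output(0)
```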
  • Figure 17 is a state machine diagram of an illustrative embodiment of the present invention. In the initialization state 750, the system may determine whether a current configuration is valid or a new reference image and safety zone configuration is necessary or desirable.
  • In the configuration state, the system may communicate with an operator to capture a new reference image, define a desired safety zone, and/or compute the needed configuration data for use in the safety monitoring procedures. It is contemplated that the system may automatically identify the border region and/or interior region in the reference image from a reference marker positioned in the area of interest.
  • The reference marker may be, for example, a pattern on the floor, etc.
  • Once a valid configuration is obtained and the safety zone is determined to be clear, the system switches to the running state 790.
  • The safety output may also be activated at this time, while the border analysis continues.
  • When a safety violation is detected, the system preferably deactivates the safety output and switches to either the stop state 799 (for a manual restart) or the clear state 780 (if an automatic restart is desired).
  • In some environments, the lighting conditions may vary significantly throughout the day. Under these conditions, it may be desirable to include a procedure for automatically updating the reference image and the configuration data to account for the changing lighting conditions.
  • To accommodate this, the system may periodically switch to the update state 770 to capture a new reference image and compute new configuration data corresponding to the new reference image. If a valid configuration is achieved and there is no safety violation with the new configuration, the system may return to the run state 790.
  • Alternatively, an operator may choose to manually initiate the update procedure.
  • The system may also include an output directed to a light source that may be varied as ambient light conditions change in order to maintain consistent lighting in the area of interest.
  • The system may then capture a new reference image for the current lighting conditions and attempt to return to the run state 790 if a valid configuration is obtained and the safety zone is determined to be empty.
  • Figure 18 is a block diagram showing an illustrative data flow and analysis in accordance with the present invention.
  • An image capturing device is shown, which is preferably a standard black and white CCD video camera 810 operating at thirty frames per second. Use of a color or CMOS-based camera is also contemplated. Other frame rates are also possible, and they will impact the response time of the system.
  • An analog output signal from the camera 810 is converted to an eight-bit digital sequence of luminance data (pixels) by an analog/digital converter 802.
  • A pixel processing block 804 performs initial analysis on the digitized output of the camera 810.
  • The pixel processing block 804 may use, for example, two pipelined processing elements, as shown in Figure 24.
  • A first stage processor may be selected to assure sufficient speed to perform initial processing of the pixel data received from the camera 810. Pixel data may be received at, for example, a rate of about 12.5 MHz for a standard video input signal.
  • The first stage processor may, for example, perform the initial sorting and accumulation steps of the object detection algorithm for the pixels on the border of the safety zone in each image produced by the camera 810. These interim results are preferably stored in a memory, as shown at 820.
  • The first stage processor may also perform an initial differencing and thresholding operation as the first step of the interior analysis procedure.
  • A second stage processor may be, for example, a standard microcontroller that receives the initial analysis results stored in memory 820 and performs the remaining border and interior analysis processing to determine the correct state of the safety output, as shown at 840.
  • The second stage processor may also implement the control function, as shown at 850.
  • The second stage processor may be signaled via an interrupt when the first stage processor completes its portion of the analysis.
  • The second stage processor may then perform the averaging and comparison operations of the object detection algorithm, and deactivate the safety output if a border access violation has been detected. If a border access violation has been detected, the second stage processor may send one or more signals to the first stage processor to capture a live image and perform the initial interior analysis operations. When this is complete, the second stage processor may use the results stored in memory 820 to complete the interior analysis.
  • The controller 850 may also direct the pixel processing block 804 to capture a new reference image, as desired.
  • For configuration, a software program 862 or the like may be executed on a separate computer 860, such as a PC.
  • The software preferably enables an operator to capture and view a reference image and graphically identify the border of the desired safety zone as it appears in the image.
  • The configuration data needed by the border and interior analysis procedures may then be sent back to the pixel processing block 804.
  • The computer 860 used for the configuration procedure may not be needed during normal operation of the safety system.
  • Figure 19 is a timing diagram showing illustrative timing relationships for the embodiment of Figure 18.
  • A top line 870 corresponds to the camera output, numbering individual frames i, i+1, i+2, etc. In the illustrative example, individual frames arrive every thirty-three milliseconds, as the camera takes thirty frames per second.
  • A second line 880 may correspond to a first processor, which performs pixel processing on each successive frame received. Preferably, the pixels are processed as they are received, so that just after the data for a given frame is complete, the first processor can send the ready frame to the second processor, which corresponds to line 891.
  • Typical data flow for the illustrative embodiment of Figure 19 is shown by dashed line 876.
  • An image is captured and sent via a data stream to pixel processing, which preferably does not process the entire image simultaneously, but rather performs pixel-by-pixel processing.
  • The output of pixel processing is preferably stored in a memory, which is then read by the second processor.
  • The second processor preferably performs a border analysis once all of the pixels of an image have been processed by the first processor.
  • Alternatively, the entire processing may be performed pixel by pixel, for example, by comparing each individual pixel received to a corresponding reference pixel.
  • The pixel information may also be filtered in a first stage so that, rather than processing the pixels for an entire frame before performing border analysis, only pixels corresponding to the border region are processed, while the other pixels are ignored, discarded, or further filtered to save those pixels that correspond to the interior of an area of interest.
  • The second processor may use the thirty-three milliseconds between arrivals of new frames to perform two different functions. As shown, border analysis may take up a relatively short period of time for the second processor. Once the border analysis is complete, the second processor 890 may begin processing the interior, if appropriate. The interior analysis, as shown at 892, begins in the RUN state 893, but preferably only after the safety output is opened 894 as a result of a border violation. The second processor then enters the CLEAR state 895. The CLEAR state 895 requests a new live image from the first processor, and then enters the CLEAR WAITING state 896 until the new live frame is ready.
  • Once the new live frame is ready, the second processor goes from CLEAR WAITING 896 into CLEAR CHECK INTERIOR 897, and performs an interior analysis. In the illustrative embodiment, this is repeated until the frame 874 that shows that the safety zone is clear is processed by the second processor, whereupon the safety output is closed, as shown at 898.
  • The maximum response time is equal to the time it takes for a new frame to be captured by the camera, plus the time it takes for pixel processing of the new frame to be completed, plus the time it takes for the border analysis to be completed.
  • In the illustrative embodiment, the maximum response time is about seventy-six milliseconds, less than one-tenth of a second.
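A back-of-the-envelope check of the seventy-six millisecond figure, assuming one frame period to capture, one frame period for pixel processing, and roughly nine milliseconds of border analysis (only the total is given in the text, so the split below is an assumption):

```python
frame_period_ms = 1000.0 / 30.0        # camera 810: thirty frames per second
pixel_processing_ms = frame_period_ms  # first processor keeps pace with the stream
border_analysis_ms = 9.0               # assumed; chosen to match the stated total

max_response_ms = frame_period_ms + pixel_processing_ms + border_analysis_ms
print(round(max_response_ms))          # -> 76, under a tenth of a second
```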
  • Response times can vary depending on a wide number of factors.
  • a faster camera can take more frames per second, reducing the maximum delay times significantly.
  • a higher resolution camera may be useful to detect smaller objects, but may increase the processing times.
  • One way to counteract the effect of a faster camera on processor time may be to define the border to include a smaller area, which in turn may decrease the likelihood that a small or fast moving object can completely cross the border without being detected.
  • Faster processors may be utilized, but this may increase the cost of the system.
  • The border region may be defined using different patterns that may maximize the effectiveness of the pixel processing. More sharply defined borders, however, may in turn reduce the robustness of the system. Such variables should be considered and weighed in determining the best system for a given application and/or environment.
  • Figure 20 is a state diagram showing the progression of states for an illustrative embodiment of the present invention (a minimal code sketch of this state progression follows at the end of this list).
  • An INIT block 900 may perform such functions as initializing hardware, performing a self-check or a camera check, and starting all internal processes. If there is a bad configuration or a PC is connected, as shown at 902, the illustrative embodiment moves to a CONFIG block 910. Otherwise, if the configuration is OK and no PC is connected, as shown at 904, the illustrative embodiment moves to the CLEAR block 920.
  • In the CONFIG block 910, configuration routines may be initiated, including, for example, capturing a new reference image. If a PC is connected, the new reference image may be routed to the PC, and definitions for a safety zone and windows for analysis may be returned by the PC, as described above. The user of the PC may confirm their validity, as part of the routines in the CONFIG block 910, via a user interface. For each defined analysis window, the CONFIG block 910 may determine median, light, and dark pixel sets. Also, reference contrast values may be computed and validated, and, if connected, sent to the PC. Border analysis may also begin while in the CONFIG block 910.
  • If the configuration routines complete successfully, the CONFIG block 910 may return with OK 912 and enter the CLEAR block 920. Otherwise, if some of the routines cannot be completed, the CONFIG block 910 may instead return with FAILED CONFIG 914 and enter the STOP block 950.
  • When CONFIG OK & NO PC 904 is returned by the INIT block 900, or CONFIG OK 912 is returned by the CONFIG block 910, the illustrative embodiment enters the CLEAR block 920.
  • One purpose of the CLEAR block 920 is to determine whether the interior region of the area of interest is clear for a sufficient period of time during which the border region is also clear.
  • As shown in Figure 19, for example, a border analysis may be performed more quickly than the interior analysis.
  • The CLEAR block 920 may include a first sub-block, WAIT FOR BORDER CLEAR 921, which in the illustrative embodiment is satisfied first and sends a BDRCLEAR 922 signal to COMPARE INTERIOR TO REFERENCE 923. That is, once the border is clear, COMPARE INTERIOR TO REFERENCE 923 is initiated.
  • Analysis performed by WAIT FOR BORDER CLEAR 921 may include comparison of areas of a live image or most recent image to areas of a reference image. The areas being compared may correspond to some of the windows defined in the CONFIG block 910, as further described in co-pending U.S. Patent Application Serial No. H16-26483, entitled "OBJECT DETECTION", which is incorporated herein by reference.
  • The COMPARE INTERIOR TO REFERENCE 923 block preferably determines whether it may be safe to initiate operation or a run state (RUN block 930) for the system. Such analysis may include comparison of a live image of the interior of the area of interest to a reference image, and this analysis may be divided into sub-blocks corresponding to some of the windows defined in the CONFIG block 910. If the interior is not clear, indicated by !INTCLEAR 924, the CLEAR block 920 returns to WAIT FOR BORDER CLEAR 921.
  • A safety output may be defined to be closed when it is safe for operations to resume in the area of interest, and open when unsafe conditions are present.
  • The safety output is defined as open while in the INIT 900, CONFIG 910, CLEAR 920, UPDATE 940, and STOP 950 blocks. If the CLEAR block 920 determines that the border is clear and the interior is clear, it may go to the RUN block 930, as shown by BDRCLEAR & INTCLEAR 925. Upon the determination that the border and interior are both clear, the CLEAR block 920 also moves the safety output to a closed, or safe, state.
  • Border scanning may include comparison of a live frame to a reference frame. Such comparison may be performed according to the windows defined in the CONFIG block 910. If a border violation is detected, the safety output is opened.
  • A variable, referred to as AUTORESTART, may also be defined in the illustrative embodiment.
  • AUTORESTART may have two states, for example, 1/0, high/low, on/off, etc. If AUTORESTART is in a first state, the system may autorestart in response to a border violation, entering the CLEAR block 920 (shown at !BORDERCLEAR & AUTORESTART 932) when, during operation in the RUN block 930, a border violation is detected. Alternatively, if AUTORESTART is in a second state, the system may enter the STOP block 950 (shown at !BORDERCLEAR & !AUTORESTART) upon detecting a border violation.
  • The UPDATE block 940 may be entered from the CLEAR 920 (UPDATE REF 926) or RUN 930 (UPDATE REF 934) blocks at some predefined time.
  • The predefined time may be a time related to a border or interior violation, a preset interval from the last updated reference image, or any other time. The update may also be triggered by a manual input rather than the passage of time.
  • In the illustrative embodiment, the safety output is opened while routines in the UPDATE block 940 are performed. In other embodiments, the safety output remains closed, at least in some circumstances, during performance of routines in the UPDATE block 940.
  • While in the UPDATE block 940, a camera check may be performed, and a new reference image may be captured by the camera(s) of the system. If there are multiple cameras in use, the UPDATE block 940 may take new reference images from each camera, or may only capture a new reference image from one or more of the cameras. The UPDATE block 940 may also include the steps of computing a reference contrast value for each window and validating the sufficiency of the contrast. If the UPDATE block 940 fails to obtain a valid reference image, it may return BAD CONFIG 944 and enter the STOP block 950. Otherwise, if the UPDATE block 940 completes all routines, it may return OK 942 and go into the CLEAR block 920.
  • In the STOP block 950, the illustrative example stops border scanning and remains in a state wherein the safety output is open until the system can be reset or otherwise checked, at which point the system may enter the INIT block 900 again.
  • Other states and relationships between states are also contemplated.
  • Figure 21 is a schematic diagram of an illustrative embodiment using multiple channels of data flow in accordance with the present invention.
  • A safety camera 1010 gathers images at a certain rate of frames per second, creating a composite video stream 1012.
  • The video stream is split into two channels, Channel A 1020 and Channel B 1030. Channel A 1020 generates Safety Output A 1022, and Channel B 1030 generates Safety Output B 1032.
  • A first cross-check connection 1024 connects Safety Output B 1032 to the image analysis block for Channel A 1020, and a second cross-check connection 1034 connects Safety Output A 1022 to the image analysis block for Channel B 1030.
  • With this arrangement, alternating frames in the image stream are available for validation of the camera data. For example, with a camera that takes thirty frames per second, 15 fps may be used by Channel A 1020 for safety analysis, while the remaining or intervening 15 fps may be analyzed in Channel B 1030 to detect dead pixels, row failures, insufficient lighting, etc.
  • The analysis performed in Channel B 1030 may rely on detecting, in every frame, a boundary marking for the safety zone that provides a sufficient or reliable reference signal. Because camera fault conditions, such as pixel failures, loss of sensitivity, burn out, etc., may also result in insufficient contrast compared to the reference image, such analysis may detect additional safety violations.
  • Figure 22 is a timing diagram showing illustrative timing relationships of the embodiment of Figure 21.
  • The camera takes a certain number of frames per second; the frames are represented in a first line 1100, enumerated as frame i, frame i+1, etc.
  • Hardware processing 1110 alternates between performing image analysis 1112 and signal validation 1114. In a preferred embodiment, analysis takes place as the pixels are received, rather than in a buffered grouping, so hardware processing 1110 continually analyzes data.
  • Software processing 1120 receives data streams from hardware processing 1110.
  • Software processing 1120 performs an evaluation and generates a safety output 1122. Once the evaluation and safety output 1122 are completed, software processing 1120 performs self-check and background tasks until pixel data is again ready to be sent by hardware processing 1110.
  • Figure 23 is a block diagram showing processing, memory and control blocks of an illustrative embodiment of the present invention.
  • A CCD video camera 1200 generates a composite video signal.
  • The composite video signal is received by a video decoder 1210.
  • Video decoder 1210 may include an analog/digital converter and a synchronization function.
  • The video decoder provides luminescence data, a pixel clock, and a synchronization signal to a pixel processing block 1215.
  • The pixel processing block 1215 may fetch pixels and perform a contrast comparison algorithm or, in some embodiments, a frame matching algorithm, to determine whether an object has entered a portion of an area of interest.
  • Pixel processing block 1215 also has access to memory 1220, and is preferably capable of saving data to and retrieving data from memory 1220.
  • Memory 1220 may include a live image 1222, a reference image 1224, window definitions 1226, and analysis results 1228.
  • Control, safety, and configuration software block 1230 may be accessed by pixel processing block 1215. Control, safety, and configuration software block 1230 may also access and save data to memory block 1220, and receive inputs and generate outputs 1240 as shown. The control, safety, and configuration software block 1230 preferably controls the safety system, as described above.
  • Figure 24 is a functional block diagram of another illustrative embodiment of the present invention.
  • In this embodiment, hardware 1303 includes CCD video camera 1300, video decoder 1310, pixel analysis 1320, analysis windows memory block 1330, analysis window definitions memory block 1340, reference image memory block 1350, and live image memory block 1360.
  • The analysis window definitions block 1340 and reference image block 1350 may include flash memory elements.
  • Software 1306 may include system supervision 1370, safety analysis 1375, configuration 1380, and user interface software 1390.
  • CCD video camera 1300 generates a composite video data signal that is received by video decoder 1310.
  • Video decoder 1310 may include analog/digital conversion capabilities and may generate synchronization data.
  • Video decoder 1310 is controlled by system supervision 1370 during a configuration step.
  • Video decoder 1310 may send luminescence data, clock data, horizontal synchronization data, and vertical synchronization data to pixel analysis block 1320.
  • The pixel analysis block 1320 performs a contrast comparison algorithm and image validation.
  • Pixel analysis 1320 provides data regarding pixels to analysis windows 1330, which generates status and results that are then sent on to system supervision 1370 and safety analysis 1375.
  • The pixel analysis block 1320 may be controlled by system supervision 1370 to determine the mode of pixel analysis.
  • Pixel analysis can include any number of pixel processing methods including, for example, contrast comparison algorithm mode, frame matching algorithm mode, image validation mode, etc.
  • The pixel analysis block 1320 also sends data corresponding to the live image to live image memory block 1360, where the live image may be stored.
  • Configuration 1380 may control live image memory block 1360, and may request live image memory block 1360 to send data to reference image memory block 1350, possibly updating the reference image memory block 1350.
  • Pixel analysis block 1320 receives a reference image from reference image memory block 1350 as well as data relating to the analysis windows from analysis window definitions block 1340.
  • The analysis window definitions enable the pixel analysis 1320 to group pixels together into bins to perform contrast comparison algorithms comparing a live image to a reference image.
  • Analysis window definitions block 1340 receives data from configuration block 1380.
  • System supervision 1370 receives data from reference image memory block 1350.
  • System supervision 1370 performs initialization and self-check tasks by controlling the configuration of video decoder 1310 and the mode of pixel analysis 1320, along with controlling safety analysis 1375 and configuration 1380.
  • Safety analysis 1375 evaluates analysis windows 1330 outputs and generates a safety output. Safety analysis 1375 also generates an error output, which can in fact be generated by any of system supervision 1370, safety analysis 1375, or configuration 1380. Safety analysis receives control signals from system supervision 1370.
  • Configuration 1380 may send data to analysis window definitions 1340, and controls capture of the live image into memory block 1360. Via a coupling such as an RS-232 link or any other data connection, configuration 1380 sends data to and receives data from user interface software 1390. Configuration 1380 can also generate an error signal.
  • During an initialization mode, hardware 1303 is initialized, internal processes are started, and a self-check is performed. Much of initialization can be controlled from system supervision 1370, though information received from camera 1300 and user interface 1390 can be incorporated as well. An error signal will be generated if there is a failure to correctly initialize the system.
  • During a configuration mode, a new reference image is captured with camera 1300 and sent to the PC operating the user interface software 1390.
  • Safety zone parameters are received from the PC, again using user interface software 1390.
  • Analysis window definitions 1340 are generated using the safety zone parameters. For each analysis window 1330, the median pixel level (for example, with respect to luminescence) and defined light and dark pixel sets are determined.
  • Reference contrast values are computed and validated, and ultimately sent to the PC to confirm that the reference contrast values are acceptable. The PC user can confirm the validity of reference contrast values. An error will be generated if there is a failure to define a valid safety zone.
  • Reference contrast values for each analysis window 1330 are computed. Whether there is sufficient contrast in each window is then validated. Windows lacking sufficient contrast for processing via contrast comparison algorithms are marked for frame differencing analysis.
  • Boundary scanning then begins, and the safety output is closed (set to "safe"). While the system is running, boundary scans are continued. The safety output will be opened if there is a detected violation.
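
The state progression of Figure 20 lends itself to a compact summary in code. The following is a minimal, illustrative sketch only, not the claimed implementation: the state names follow the figure, while the Boolean inputs (config_ok, border_clear, interior_clear, and so on) are assumed stand-ins for the signals and analyses described above.

    from enum import Enum, auto

    class State(Enum):
        INIT = auto()
        CONFIG = auto()
        CLEAR = auto()
        RUN = auto()
        UPDATE = auto()
        STOP = auto()

    def step(state, *, config_ok, pc_connected, border_clear,
             interior_clear, autorestart, update_requested):
        """Return (next_state, safety_output_closed) for one evaluation."""
        if state is State.INIT:
            # Bad configuration or a connected PC leads to CONFIG.
            nxt = State.CONFIG if (pc_connected or not config_ok) else State.CLEAR
        elif state is State.CONFIG:
            nxt = State.CLEAR if config_ok else State.STOP   # FAILED CONFIG
        elif state is State.CLEAR:
            if update_requested:
                nxt = State.UPDATE                           # UPDATE REF
            elif border_clear and interior_clear:
                nxt = State.RUN                              # BDRCLEAR & INTCLEAR
            else:
                nxt = State.CLEAR                            # keep checking
        elif state is State.RUN:
            if update_requested:
                nxt = State.UPDATE                           # UPDATE REF
            elif not border_clear:                           # border violation
                nxt = State.CLEAR if autorestart else State.STOP
            else:
                nxt = State.RUN
        elif state is State.UPDATE:
            # config_ok stands in for "a valid reference image was obtained".
            nxt = State.CLEAR if config_ok else State.STOP   # BAD CONFIG
        else:
            nxt = State.STOP   # STOP: wait for an external reset back to INIT
        # The safety output is open in every block except RUN with a clear border.
        return nxt, (nxt is State.RUN) and border_clear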

Abstract

A visual object detection system to provide access and/or presence monitoring of an area of interest. In steady state operation, that is when no object is entering or within the area of interest, only those portions of the incoming images that correspond to the border of the area of interest are analyzed. Once the border is breached by an object, the present invention may begin analyzing the entire area or selected regions inside the border of the area of interest. This may provide some level of presence monitoring of the area of interest. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application. Once the object leaves the area of interest, the present invention preferably returns to the original steady state, and monitors only the border regions of the incoming images.

Description

OBJECT DETECTION
This application is a Continuation-in-Part of co-pending U.S. Patent Application Serial No. 09/716,002, filed November 17, 2000, entitled "OBJECT DETECTION", which is incorporated herein by reference. This application also claims priority under 35 U.S.C. §119(e)(1) to co-pending U.S. Provisional Patent Application Serial No. 60/275,879, filed March 14, 2001, and entitled "SAFETY CAMERA".
Field of the Invention
The present invention relates to object detection, and more specifically, to object intrusion and/or presence detection within a predefined area or region.
Background of the Invention
Electrosensitive Protective Equipment (ESPE) is well-known and widely used in industrial settings to protect operators of hazardous equipment from injury. ESPE devices typically have a sensing function, a control or monitoring function, and an output signal switching function. The sensing function typically collects data from, for example, a defined safety zone surrounding dangerous equipment. The safety zone may be a line, an area, or a volume, depending on the sensing technology used.
The control function monitors the sensing function. When the control function determines that the sensor data provided by the sensing function corresponds to an intrusion into the safety zone, an output signal is produced to sound an alarm, deactivate the hazardous equipment, or perform some other precautionary measure.
A variety of ESPE devices are currently commercially available, including single beam photodetectors, light curtains, laser scanners, safety mats and others. Single beam photodetectors typically use a single light source and light detector to provide some level of access monitoring. When an object moves between the light source and the light detector, the light beam extending therebetween is interrupted, which then triggers a safety violation. A limitation of single beam photodetector systems is that only limited access control and typically no presence sensing is provided. Another limitation is that to change the location, shape or size of the safety zone, the light source and/or light detector must typically be physically moved.
Light curtain systems are similar to single beam photodetector systems, except that a linear array of light emitter/light detector pairs is provided. The light emitter/light detector pairs are mounted in a pair of spaced enclosures. The array of light emitters produce a "light curtain" that extends to the corresponding light detectors. When the light curtain is interrupted by an object, a safety violation is triggered. The resolution (size of object detected) typically depends on the spacing of the light beams. Light curtain systems can provide some level of access control when mounted vertically, and some level of presence monitoring when mounted horizontally. However, a limitation of some light curtain systems is that they are relatively expensive and complex. Another limitation is that variations in the size and shape of the safety area may be restricted, and the spaced enclosures must typically be physically moved to change the configuration of the safety zone to be monitored.
Laser scanner systems typically include a rotating laser emitter/detector, which scans a plane and measures the distance to the nearest object in any direction by monitoring the reflection of the beam. This type of device can provide some level of presence monitoring along a horizontal plane. It may also be mounted vertically to provide some level of access monitoring, similar to the light curtain systems discussed above. A limitation of laser scanner systems is that they use complex mechanical components, such as rotating heads, which can require periodic and precise alignment. While the region to be monitored may be redefined using configuration software, its shape is often limited by the line-of-sight of the laser. Also, the response time is limited by the need to rotate the laser beam, and the sensitivity may be limited by air pollution in an industrial environment.
Finally, safety mat systems have been used to provide presence monitoring by detecting physical contact with a floor mat/sensor. Their robustness is limited by the need for physical contact with the floor mat for detection, which can be problematic in the often harsh environment of the factory floor. Safety mat systems typically cannot monitor large areas unless a number of mats are connected together. Finally, and like the single beam photodetector and light curtain systems described above, the safety mats must typically be physically moved to change the configuration of the safety zone to be monitored.
Summary of the Invention
The present invention provides a visual object detection system that uses one or more images from a video camera, digital camera, etc., to provide access and/or presence monitoring of an area of interest. In steady state operation, that is when no object is entering or within the area of interest, only those portions of the incoming images that correspond to the border of the area of interest are analyzed. By only monitoring the border area, the present invention may quickly detect when the border has been breached by an object. After the border is breached, the present invention preferably sounds an alarm, deactivates hazardous equipment in the area of interest, or performs some other precautionary measure, but this is not required.
Once the border is breached by an object, the present invention may begin analyzing the entire area or selected regions inside the border of the area of interest. This may provide some level of presence monitoring of the area of interest. In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application. Once the object leaves the area of interest, the present invention preferably returns to the original steady state, and monitors only the border regions of the incoming images.
Brief Description of the Drawings
Other objects of the present invention and many of the attendant advantages of the present invention will be readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, in which like reference numerals designate like parts throughout the figures thereof and wherein:
Figure 1 is a partial side view of a workplace area with an overhead safety camera in place;
Figure 2 is a perspective view of a safety camera having a field of view, with an area of interest within the field of view of the safety camera; Figure 3 is a schematic diagram showing a workplace with safety system in place; Figure 4A is a diagram of an image of an area of interest with a border region defined;
Figure 4B is a diagram of an image of an area of interest, wherein the floor of the area of interest has a pattern along the border region; Figure 5 is a perspective view of a safety camera system that monitors an area of interest having irregular shaped borders;
Figure 6A is a perspective view of an area of interest with breaks in the border;
Figure 6B is a top view of the area of interest of Figure 6A; Figure 7A is a perspective view of a safety camera system having an overhead camera and a side camera for monitoring the area of interest;
Figure 7B is a diagram showing the field of view of the overhead camera of Figure 7A;
Figure 7C is a diagram showing the field of view of the side camera of Figure 7A;
Figure 8A is a perspective view of another safety camera system having two safety cameras for monitoring a volume of interest;
Figure 8B is a diagram showing the field of view of the overhead camera of Figure 8A; Figure 8C is a diagram showing the field of view of the side camera of Figure 8A;
Figure 9 is a perspective view of yet another safety camera system having two safety cameras for monitoring an area of interest;
Figure 10A is a perspective view of an area of interest with an object just entering the border of the area of interest;
Figure 10B is an overhead view of the area of interest and the object of Figure 10A;
Figure 11A is a perspective view of an area of interest with an object already across the border of the area of interest; Figure 11B is an overhead view of the area of interest and the object of Figure 11A; Figure 12A is a perspective view of an area of interest with an object entirely within the area of interest;
Figure 12B is an overhead view of the area of interest and the object of Figure 12A; Figure 13 is a block diagram showing an illustrative safety camera system in accordance with the present invention;
Figure 14 is another block diagram showing an illustrative safety camera system in accordance with the present invention;
Figure 15 is a flow chart showing an illustrative relationship between border and interior analysis functions in accordance with the present invention;
Figure 16 is a flow chart showing an illustrative method for performing interior and border analysis in accordance with the present invention;
Figure 17 is a state machine diagram of an illustrative embodiment of the present invention; Figure 18 is a block diagram showing an illustrative data flow and analysis in accordance with the present invention;
Figure 19 is a timing diagram showing illustrative timing relationships for the embodiment of Figure 18;
Figure 20 is a state diagram showing the progression of states for an illustrative embodiment of the present invention;
Figure 21 is a schematic diagram of an illustrative embodiment using multiple channels of data flow in accordance with the present invention;
Figure 22 is a timing diagram showing illustrative timing relationships of the embodiment of Figure 21; Figure 23 is a block diagram showing processing, memory and control blocks of an illustrative embodiment of the present invention; and
Figure 24 is a functional block diagram of another illustrative embodiment of the present invention.
Detailed Description of the Drawings
Figure 1 is a side view of a workplace area 14 with a camera 10. In the illustrative embodiment, camera 10 is placed over equipment 12, which is surrounded by a safety area 14. The equipment 12 may be a piece of hazardous equipment, for example a machine with moving parts, a chemical storage unit, a raw materials processor, an incinerator, or any other machine that could present a danger to a person. Likewise, the equipment 12 may be one or more pieces of equipment that are performing highly sensitive activities, where a safety system could be used to prevent an object or a person from interfering with the highly sensitive activity. Also, equipment 12 may be a valuable item, and a safety system could be implemented to prevent the item from being taken or damaged. A worker 20 is shown standing in the safety area 14. In the illustrative embodiment, the camera 10 gathers frames along a pyramid-shaped field of view 30, preferably as a two dimensional image.
Figure 2 is a schematic view of a camera 10 having a field of view 19. An area of interest 14 is shown in cross-hatch. It should be noted that the term "area" is not limited to a two-dimensional area, and may include three-dimensional volumes, as further described below. The present invention preferably uses a digital camera 10 or the like to gather images along the field of view 19 of the camera 10. After gathering an image along the field of view 19 of the camera 10, data processing techniques are preferably used to select those pixels that fall within the border 16. The present invention also preferably distinguishes between those pixels that fall along the border 16 and those pixels that fall within the border 16.
In one illustrative embodiment, when no object is entering or within the area of interest 14, only those pixels of the incoming images that correspond to the border 16 of the area of interest are analyzed. By only monitoring those pixels along the border area 16, the present invention may quickly detect when the border 16 has been breached by an object. After the border is breached, a precautionary measure may be taken, such as sounding an alarm, deactivating hazardous equipment in the area of interest 14, or performing some other precautionary measure, but this is not required.
Once the border is breached by an object, the illustrative embodiment may begin analyzing those pixels within the interior 15 of the area of interest 14. This may provide some level of presence monitoring of the interior 15 of the area of interest 14.
In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application. Once the object leaves the area of interest 14, the present invention may return to the original steady state, and monitor only those pixels in the border region 16 of the incoming images.
Figure 3 is a schematic diagram showing a workplace with safety system in place. In the illustrative embodiment, a safety camera 10 is disposed above the workplace area. The workplace area preferably has a predefined region of interest 14, which may correspond to a safety zone around a piece of equipment 12. The interior area 15 of the area of interest 14 is defined by a border 16, which in the illustrative embodiment, has sides 40, 42, 44, 46. A processing system 70 is shown connected to the safety camera 10 via interface 72. Processing system 70 preferably processes images received from safety camera 10 to determine if an object has breached a predefined border region 16 and/or if an object remains within the area of interest 15, as described above. The processing system 70 may also be connected to equipment 12, via interface 74. When the processing system 70 determines that the border region 16 has been breached, the processing system may send an enable or turn-off signal to the equipment 12 via interface 74. This enable or turn-off signal preferably causes the equipment 12 to turn off, trigger a brake, or otherwise stop the operation of equipment 12. Alternatively, or in addition, processing system 70 may receive information from equipment 12, such as a warning, an error or a malfunction signal, if desired.
In the illustrative embodiment, the processing system 70 is also connected to other input or output devices 28 via an interface 76. The input or output devices 28 may include, for example, an audible alarm, a visible alarm, a transmitter to send a warning signal to a remote location, a memory device to keep records of warning signals, etc.
Figure 4A is a diagram of an image of an area of interest 14 with a border region 16 defined. A piece of equipment 12 is shown within the interior area 15. In the illustrative embodiment, the image is divided into a border region 16 and an interior region 15. Border 16 may be more narrowly or more widely defined, depending on the application. In a preferred embodiment, only the border 16 is analyzed during steady state operation, and analysis of the interior region 15 only takes place after a triggering event, such as the breach of the border region 16. Alternatively, or in addition, analysis of the interior region 15 may be initiated by other mechanisms, such as at timed intervals, in response to an external stimulus, such as a manual request of interior region analysis, etc.
In some embodiments, a pattern may be provided on the floor, preferably corresponding to the desired border region 16. In one example, a patterned tape or painted strip with contrasting dark and light areas may be placed along the desired border region 16. The area defined by each of the contrasting color regions may be chosen to correspond to the minimum size of the object to be detected. It is contemplated, however, that any number of patterns may be used including, for example, a single color line, a checked pattern, crisscrossing stripes, etc. In addition, the pattern may cover a more general area, and need not be confined to the border 16.
Algorithms and designs for detecting objects using a patterned area can be found in co-pending U.S. Patent Application Serial No. H16-26483, entitled "OBJECT DETECTION", which is incorporated herein by reference. Preferably, objects are detected in the border region 16 using the algorithms described in co-pending U.S. Patent Application Serial No. H16-26483, entitled "OBJECT DETECTION". Several embodiments use an analysis in which individual or groups of pixels are compared to one or more reference images. Comparison is preferably made on the basis of some identifiable or quantifiable property, such as luminance, color, tint, hue, spectra, etc., of the pixel or group of pixels being analyzed.
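As an illustration of this group-wise comparison, the sketch below compares the mean luminance of a group of pixels (an analysis window) in a live image against the same window in a reference image. The 2-D list representation, the window layout, and the threshold value are assumptions made for the example, not details taken from the disclosure.

    def window_changed(live, ref, window, threshold=20.0):
        """live/ref: 2-D lists of luminance values; window: (row, col, h, w).

        Returns True when the mean luminance of the window differs from the
        reference by more than the (assumed) threshold.
        """
        r, c, h, w = window
        live_mean = sum(live[i][j] for i in range(r, r + h)
                        for j in range(c, c + w)) / (h * w)
        ref_mean = sum(ref[i][j] for i in range(r, r + h)
                       for j in range(c, c + w)) / (h * w)
        return abs(live_mean - ref_mean) > threshold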
Markers may also be included in the area of interest 14 to enable functioning of a system in which the relative position of the camera with respect to the area of interest cannot or will not be maintained during operation of the safety system. In environments where there are complicated operations, additional equipment or space constraints, or any other need to move the camera in relation to the area of interest 14, a marker or the like may be used to provide a point of reference. In some industrial settings, vibrations may cause noticeable movement of the camera. A suitable marker may be any of several known in the art such as colored or painted dots, strips, sound or light signal generators, identifiable shapes or designs, etc.
Figure 4B is a diagram of an image of an area of interest, wherein the floor of the area of interest has a pattern along the border region. Area of interest 90 is defined by an outside border 92 and an interior region 94. As discussed above, outside border 92 may include a pattern, such as a checker pattern. Such a pattern is not required in the present invention, but may be useful to improve the accuracy and/or speed of the border analysis. Such a pattern may be provided throughout the area of interest 90, if desired. In addition, and in some embodiments, a different pattern may be used in the border region 92 than in the interior region 94.
Equipment 96 is shown within an excluded area 98, which is defined by internal border 99. In one illustrative embodiment, movement across outside border 92 and internal border 99 may be monitored during steady state operation. If an object crosses into one of these areas, the interior region 94 (excluded area 98) may then be monitored. In one embodiment, the excluded area 98 may be used to detect objects that are thrown or otherwise moving away from equipment 96 during steady state operation. Such objects may indicate failure or malfunction of the piece of equipment 96. It is contemplated that equipment 96 may not be within excluded area 98, or may be partly in excluded area 98 and partly in interior region 94, or may be entirely inside excluded area 98. Selection of the border of excluded area 98 may include assessing whether the equipment 96 has moving parts that could disrupt the safety system if not ignored.
Figure 5 is a perspective view of a safety camera system that monitors an area of interest having irregular shaped borders. Camera 110 is disposed above an area of interest 114 with an interior region 115 and a border region 116. Equipment 112 is located inside area of interest 114. Border 116 is irregular in shape, made up of segments 140, 142, 144, 146 and 148, with 142 being curved. Camera 110 gathers an image including a larger field of view 130 defined under cone 132. A processing unit (not shown) may then exclude pixels that correspond to areas 134 and 136 from analysis. In a preferred embodiment, a user can select the dimensions and shape of the area of interest 114, as desired.
Also shown in Figure 5 is that the camera 110 need not be centered over the desired area of interest 114. Offsetting the camera 110 may be desirable for a variety of reasons including space constraints, etc. In addition, and supposing equipment 112 includes a part 112A that is of particular interest, the camera 110 may be disposed in an off-center fashion relative to the area of interest 114 to provide a better view of the part 112A.
Figure 6A is a perspective view of an area of interest with breaks in the border. Camera 160 is disposed above the area of interest 164. The area of interest 164 is divided into two interior regions 165A and 165B. The interior regions 165A and 165B are separated by conveyer belt 180 and machine 162. The border region 166 is defined by lines 194, 196, 198, and lines 190, 192, 199. The conveyer belt 180 crosses the border 166 as shown. In a preferred embodiment, movement along the conveyer belt 180 does not trigger the safety system, while movement across border lines 190, 192, 194, 196, 198, 199 does.
Figure 6B is a top view of the area of interest of Figure 6A. Conveyer belt 180 crosses the area of interest 164, splitting the interior area into internal areas 165A and 165B, thereby making the border region 166 non-continuous. The border 166 is made up of several segments 190, 192, 194, 196, 198, 199, as described above. Equipment 162 connects to the conveyer belt 180. In one embodiment, movement in the area covered by the conveyer belt 180 is ignored, such that steady state analysis would monitor the border segments 190, 192, 194, 196, 198, 199 not covered by the conveyer belt 180. In another embodiment, interior analysis would ignore the area covered by the conveyer belt 180 and would only analyze areas 165A and 165B. The interior analysis may or may not analyze the area over the equipment 162.
As shown by Figures 6A and 6B, there are two interior zones 165A and 165B that are monitored by a single camera 160. It is contemplated that a similar approach may be used to monitor two or more separate and/or unrelated safety zones using a single camera. For example, each interior zone 165A and 165B could contain or surround separate and/or unrelated equipment, assuming each safety zone is in the field of view of the camera. This may reduce the cost of providing the safety camera system. In addition, and depending on the application, each zone may be monitored differently. For example, in a "LOAD MACHINE" mode, only interior region 165B may be monitored, while interior region 165A may not be monitored. In a "RUN" mode, both interior regions 165A and 165B may be monitored. This is just one illustration.
Figure 7A is a perspective view of a safety camera system having an overhead camera 210 and a side camera 220 for monitoring an area of interest 202. The overhead camera may monitor, for example, horizontal movement in the area of interest, as described above. In contrast, the side camera 220 may monitor, for example, vertical movement within the area of interest 202. It is contemplated that a pattern may be applied to a wall or the like in the field of view of the side camera to help detect movement of objects within the area of interest. Further illustration of the camera operations for the illustrative embodiment of
Figure 7A appears in Figures 7B and 7C. Figure 7B is a diagram showing a possible field of view for the overhead camera 210, and shows an area of interest 232. Figure 7C is a diagram of an illustrative field of view for the side camera 220. The side camera 220 may, for example, monitor vertical movement across a predefined plane. In this illustrative embodiment, the field of view 235 of the side camera 220 has a thin selected area 237 that corresponds to the desired plane.
Figure 8A is a perspective view of another safety camera system having two safety cameras 250 and 260 for monitoring a volume of interest 270. In the illustrative embodiment, a first camera 250 is positioned to capture an image under a cone 252 defining a circle 254 along the horizontal plane, while a second camera 260 is disposed to capture an image under a cone 262 defining circle 264 in the vertical plane.
Figure 8B is a diagram showing the field of view 290 of the first camera 250 of Figure 8A, with a first selected area of interest 292. Figure 8C is a diagram showing the field of view 295 of the second camera of Figure 8A, with a second selected area of interest 297. In the illustrative example, volume 270 is a six-sided volume, whose shape is defined by the selected areas 292, 297, and the shape of cones 252, 262. The shape of the volume of interest 270 may be refined by using additional cameras, or by using improved cameras that may capture additional information within the corresponding field of view. Additional optical devices may also be used for the shaping of the volume of interest 270.
In Figure 8A, an object 299 is shown within volume of interest 270. The object 299 lies along a first line 256 corresponding to camera 250, and a second line 266 corresponding to camera 260. Figure 8B shows that the object 299 can be found in a selected area 292, and Figure 8C shows that the object can be found in selected area 297. Because the object 299 appears in both selected areas 292 and 297, the object lies within volume of interest 270. When the object is at a boundary of either of the selected areas 292 or 297, the object is at the boundary of the volume of interest 270.
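The two-camera arrangement of Figures 8A-8C lends itself to a simple membership test: a point is treated as inside the volume of interest only when it falls within the selected area of interest of both camera images. The sketch below assumes rectangular selected areas purely for simplicity; the actual selected areas may have any shape, as the text notes.

    def in_area(point, area):
        """point: (x, y); area: (x_min, y_min, x_max, y_max)."""
        x, y = point
        x0, y0, x1, y1 = area
        return x0 <= x <= x1 and y0 <= y <= y1

    def in_volume(point_cam_a, point_cam_b, area_a, area_b):
        # Inside the volume only if inside the selected area of BOTH views.
        return in_area(point_cam_a, area_a) and in_area(point_cam_b, area_b)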
In one illustrative embodiment, when no object is entering or within the volume of interest 270, only those portions of the incoming images that correspond to the border of the volume of interest 270 may be analyzed. By only monitoring the border area, the present invention may quickly detect when the border has been breached by an object. After the border is breached, the present invention may, for example, sound an alarm, deactivate hazardous equipment in the volume of interest, or perform some other precautionary measure, but this is not required. Once the border is breached by an object, the present invention may begin analyzing the entire volume or selected regions inside the volume of the area of interest, as desired. This may provide some level of presence monitoring of the volume of interest. In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application. Once the object leaves the volume of interest, the present invention preferably returns to the original steady state, and monitors only the border regions of the incoming images.
Figure 9 is a perspective view of yet another safety camera system having two safety cameras for monitoring an area of interest. In this illustrative embodiment, the area of interest is shown at 300, and is defined by border 302. Corners 306 and 308 are also shown, with cameras 310 and 320 disposed over the corners. A first camera 310 captures images under a cone 312, defined by field of view 314. A second camera 320 likewise captures images under a cone 322, defined by field of view 324. A planar border made up of a polygon 330 and two triangles 332 and 334 can then be defined. The shape and center height of the polygon 330 may be changed by adjusting the angles of cameras 310 and 320 with respect to the area of interest 300, by utilizing different cameras 310 and 320, by adding additional cameras, etc. If so desired, the border may be defined as only including the area within the polygon 330, which would thus monitor an area captured only by both cameras 310 and 320. Once the border is defined, the present invention preferably monitors the border for breach by an object. Once breached, the present invention preferably begins analyzing the entire area or selected regions of the area of interest 300, as described above.
As is shown in Figures 1-9, there are a variety of configurations that may be used to detect a border breach and/or monitor an area of interest. Many other configurations are also possible. Data selection and exclusion, along with placement of multiple cameras, manipulation of angles of a single or multiple cameras, and other embodiments may be used to monitor borders, areas, and volumes of many shapes, sizes, configurations, and numbers, as desired.
Figure 10A is a perspective view of an area of interest 350 with an object 370 just entering the border 352 of the area of interest 350. Equipment 356 is in the area of interest 350, and the object 370 is shown just at the border 352 of a pyramid 362 outlining a selected area within the field of view of a camera 360. No other object is observed in the interior 354 of the area of interest 350. Figure 10B is an overhead view of the area of interest 350 and the object 370 of Figure 10A. In steady state operation, that is when no object is entering or within the area of interest, only those portions of the incoming images that correspond to the border 352 of the area of interest 350 are analyzed. When performing the border analysis, a most recent image of the border 352 may be compared to at least one reference image. The reference image may be a single image taken at one point in time, or the reference image may be updated periodically. In one illustrative embodiment, the reference image is updated after a set period of time. Alternatively, or in addition, the reference image may be updated upon the occurrence of an event, such as an external signal requesting that a new updated image be taken, or a response to a change in condition of the border 352 of the area of interest 350. In some embodiments, there may be more than one reference image, where a first reference image may be a set image or a recent image and a second reference image may be the second-to-last image taken, such that a first comparison with one image may note immediate changes in the border 352 of the area of interest 350, while a second comparison may note accumulated changes. In any case, by only monitoring the border area 352, the present invention may quickly detect when the border 352 has been breached by object 370. After the border 352 is breached, the present invention preferably sounds an alarm, deactivates hazardous equipment 356 in the area of interest 350, and/or performs some other precautionary measure, but this is not required.
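The dual-reference scheme described above can be illustrated with a short sketch: comparing the newest border pixels against the second-to-last frame reveals immediate changes, while comparing against a fixed or periodically updated reference reveals accumulated changes. The flat-list pixel representation and the threshold are assumptions for the example only.

    def diff_count(frame_a, frame_b, threshold=25):
        """Count border pixels whose values differ by more than the threshold."""
        return sum(1 for a, b in zip(frame_a, frame_b) if abs(a - b) > threshold)

    def border_changes(new_frame, reference_frame, previous_frame):
        return {
            "accumulated": diff_count(new_frame, reference_frame),
            "immediate": diff_count(new_frame, previous_frame),
        }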
Figure 11A is a perspective view of an area of interest 400 with an object 420 already across the border 402 of the area of interest 400. Figure 11B is an overhead view of the area of interest 400 and the object 420 of Figure 11A. Once the border 402 is breached by object 420, the present invention preferably begins analyzing the entire area 404 or selected regions inside the border 402 of the area of interest 400. This may provide some level of presence monitoring of the area of interest 400. In some embodiments, the presence monitoring can be performed at a slower rate than the border analysis, particularly when one or more precautionary measures have already been initiated by a border breach. It is contemplated that both modes of analysis can take place simultaneously or sequentially, depending on the application. In some embodiments, the interior region is defined to include the border region as well. In some embodiments, the safety monitoring system may return to the original steady state after a second triggering event, and monitor only the border region of the incoming images. The second triggering event may be, for example, a manual reset input signal, the passage of time after which the object 420 does not move with respect to the interior 404 or the border 402 of the area of interest 400, the exit of the object 420 from the area of interest 400, a determination that the object 420 is smaller than some pre-selected minimum size, etc.
Figure 12A is a perspective view of an area of interest 450 with an object 470 entirely within the area of interest 450. Figure 12B is an overhead view of the illustrative drawing of Figure 12A. In some embodiments of the present invention, the object 470 would not be noted within the analysis of the border 452 of the area of interest 450, but it would appear as a change in the interior 454 of the area of interest 450 during interior monitoring, which may start when the object 470 breaches the border 452. As indicated above, the safety monitoring system may return to the original steady state after a second triggering event. The second triggering event may be, for example, a manual reset input signal, the passage of time after which the object 470 does not move with respect to the interior 454 or the border 452 of the area of interest 450, the exit of the object 470 from the area of interest 450, a determination that the object 470 is smaller than some pre-selected minimum size, etc. When returning to the original steady state with the object still within the area of interest, the safety system may update the reference image to reflect the change in the interior 454 and/or border 452 of the area of interest 450.
Figure 13 is a block diagram showing an illustrative safety camera system in accordance with the present invention. An image analysis block 500 receives an input signal 502 from, for example, a digital or analog video camera, CCD or the like. Input signal 502 may include a synchronization signal or other signal for indicating when, during the data stream, data relating to a new frame begins. The input signal 502 is used to provide an input image or frame 504, which may be stored in a memory or file.
Once a new input image is received by input image block 504, a signal may be sent to the control block 520. The control block 520 may control the operation of the image analysis block 500. For example, when a new reference image is desired, the control block 520 may notify the updating block 506, which transfers the current image to the reference image memory block 508.
The selection block 510 may allow a user to define, for example, an area of interest, borders, excluded areas, etc. Whether the selection block 510 and/or control block 520 elicit any area, interior, and border definitional data is dependent upon the particular system needs and capacities. In some embodiments, a mask may be used on the camera(s) to limit the field of view of the cameras to reduce the amount of data that needs to be saved and/or processed.
As the input image 504 is received at the selection block 510, the selection block 510 also accesses the reference image memory 508. Portions of each of these images are then sent to the border analysis block 512, per instructions either received from the control block 520 or stored within the selection block 510. Border analysis block 512 receives image data from the selection block 510 and determines whether an object has intruded into the area of the input image 504 that is defined as the border. The border analysis block 512 sends an output to the control block 520 indicating the results of the border analysis.
In some embodiments, the selection block 510 may send data to the interior analysis block 514 after a triggering event has occurred, such as detection of an object by the border analysis block 512. The control block 520 may direct the selection block 510 to send such data, or the control block 520 may pass on a signal indicating a violation has taken place to the selection block 510, which may contain instructions as to how to respond. The interior analysis block 514 may receive data relating to the interior area of the area of interest and, upon prompting from the selection block 510 or control block 520, may perform analysis on the interior of the area of interest. The interior area of the area of interest may, in some embodiments, include the border region as well. Upon completion of such analysis, the interior analysis 514 may send a signal back to the control block 520 indicating the results of the interior analysis. There may also be external inputs and outputs 530 connected to the control block 520. Some inputs may include, for example, a manual alarm, a reset switch, a restart switch, communication links for updating definitional data relating to the area of interest, interior, and borders, equipment status monitors, on/off switch, etc. Some outputs may include, for example, a connection to equipment or machines controlling the speed, operation, state, or other characteristics, including whether such equipment or machine is on or off, connection to a brake for stopping equipment or machines, audible or visible alarm systems, communication links to emergency services, monitoring facilities, security, upkeep, custodial or other maintenance facilities or personnel, etc. In a preferred embodiment, the control block 520 may turn off the interior analysis 514 under some conditions, for example, when it is determined that no border violation has occurred. Further, in some embodiments, the border analysis and interior analysis may be performed by the same processing unit at different times, as shown and described with reference to, for example, Figure 19 (second processor 890). It is contemplated that some or all of the captured images and/or reference images may be saved for later viewing, if desired. For example, when a breach is detected, the captured and/or reference images may be saved to a storage medium such as a hard disk, RAM, Compact Disk, magnetic tape or any other storage medium. At a subsequent time, the images may be viewed to identify the situation that occurred.
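The division of labor among these blocks can be sketched as a simple loop: the border is checked on every frame, and interior analysis is invoked only after a triggering event such as a border breach. This is an illustrative sketch only; the two callables passed in are assumed placeholders for the border analysis block 512 and interior analysis block 514.

    def monitor(frames, reference, border_breached, interior_occupied):
        """Yield 'SAFE' or 'UNSAFE' for each incoming frame."""
        interior_mode = False
        for frame in frames:
            if border_breached(frame, reference):
                interior_mode = True            # triggering event
                yield "UNSAFE"
            elif interior_mode:
                if interior_occupied(frame, reference):
                    yield "UNSAFE"              # presence still detected
                else:
                    interior_mode = False       # object gone: steady state
                    yield "SAFE"
            else:
                yield "SAFE"                    # steady state: border only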
To help increase the overall reliability of the system, it is contemplated that two or more image analysis blocks 500 may be provided, each receiving an input signal 502 from, for example, a digital or analog video camera, CCD or the like. In this embodiment, both image analysis blocks 500 may analyze the captured image, and provide signals to control block 520. Control block 520 may then only provide external output signals when both image analysis blocks 500 agree that an external output signal is warranted.
Alternatively, or in addition, two or more imaging devices such as digital or analog video cameras, CCDs or the like may be provided, each providing an image of the area of interest to a corresponding image analysis block 500. In this embodiment, all image analysis blocks 500 may analyze the corresponding images, and provide signals to control block 520. Control block 520 may then only provide external output signals when all image analysis blocks 500 agree that an external output signal is warranted.
Figure 14 is another block diagram showing an illustrative safety camera system in accordance with the present invention. In this illustrative embodiment, inputs 560 to a control block 550 may include equipment-generated information 562, such as equipment operation speed, fluid levels, internal temperature, cover or shell integrity signals, error signals, and other information relating to the status, upkeep, or operation of equipment. If the equipment were, for example, an internal combustion engine, equipment generated information could include indications of oil pressure, fuel level, heat level, etc. Other inputs 560 may include manual reset 564 and manual turn-off 566, for example.
Also in the illustrative embodiment shown, outputs 570 from the control block 550 may include an output to the equipment 572, and other outputs 574. The output to the equipment 572 may include signals such as off, idle, startup, stop, or could apply a brake, regulate speed of operation, close emergency safety guards, etc. Other outputs 574 could include sirens, bells, whistles, lights and emergency signals to warn emergency crews, or may include a memory device that could keep track of the timing and number of unsafe condition messages generated by the safety monitoring system.
Figure 15 is a flow chart showing an illustrative relationship between border and interior analysis functions in accordance with the present invention. In the illustrative embodiment, the safety output 646 is used to disable the equipment, set off an alarm, etc. The safety output is generated by a Boolean function, which ANDs the border analysis output 640 with the interior analysis output 642. That is, the border analysis output 640 must indicate that no object has breached the border, AND the interior analysis output 642 must indicate that no object is currently in the interior of the area of interest before the safety output 646 is set high. In one embodiment, the equipment in the safety zone can only be operated when the safety output 646 is high.
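In code, the Boolean function of Figure 15 reduces to a single AND of the two analysis outputs, for example:

    def safety_output(border_clear: bool, interior_clear: bool) -> bool:
        # High (safe) only when no object has breached the border AND no
        # object remains in the interior of the area of interest.
        return border_clear and interior_clear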
In some embodiments, the border analysis 600 may include a quick examination of a minimum number of pixels to establish whether there has been an access violation. In many cases, the border region will include fewer pixels than the interior region. The border analysis algorithms are preferably optimized to analyze the limited number of pixels with the fastest possible response time.
One way to achieve fast, reliable, and robust object detection along a border region is to provide a reference marking on the floor along the desired border, such as with tape or paint. An example of such an approach is shown in Figure 4B above. A preferred method for performing object detection using a reference marking is disclosed in co-pending U.S. Patent Application Serial Number H16-26483, which has been incorporated herein by reference. However, such a reference marking is not required for the present invention. In one illustrative embodiment, border analysis 600 compares the border pixels in the live image with those in the reference image. If a region is detected where the computed difference is greater than a specified threshold, a more detailed analysis can be performed on the region to determine whether the difference is due to an intruding object, a shadow or some other cause. If the difference is determined to be a shadow, the border analysis preferably resumes, examining other regions for detected differences.
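The differencing and thresholding step described above might be sketched as follows (Python with NumPy; the array conventions and the threshold value are assumptions for illustration, not the claimed method):

    import numpy as np

    def flag_border_differences(live, reference, border_mask, threshold=25):
        """Flag border pixels whose luminance differs from the reference
        image by more than a threshold. live and reference are 2-D uint8
        arrays; border_mask is a boolean array selecting the border."""
        diff = np.abs(live.astype(np.int16) - reference.astype(np.int16))
        return (diff > threshold) & border_mask

Flagged pixels are only candidates; as noted above, a region found to be a shadow would be dismissed and the scan of the remaining regions resumed.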
In some embodiments, the border analysis may terminate when one of the following conditions is reached: an object is detected on the border; all of the differences have been determined to be shadows or other non-objects; or the time available to perform the border analysis has expired. The resolution used for object detection preferably is set to detect hands, feet, or other objects of similar (or smaller) size that are likely to first penetrate the border. Once the object size is set, the object detection may be limited to detect only objects of the specified size or larger. In some embodiments, the minimum size of objects to be detected is automatically determined from a reference object, reference markings or some other feature in the reference image during a configuration procedure. Additionally, the minimum size of the objects to be detected can be different for the border analysis than for the interior analysis. For example, the minimum size of objects used for the border analysis (e.g. hand) may be smaller than the minimum size used for the interior analysis (arm, body, etc.).
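The minimum-size rule described above might be sketched as a connected-region filter (Python with SciPy; the grouping approach and size values are illustrative assumptions):

    from scipy import ndimage

    def regions_of_min_size(flags, min_size):
        """Group flagged pixels into connected regions and keep only the
        labels of regions covering at least min_size pixels; smaller
        blobs (noise, small debris) are ignored."""
        labels, count = ndimage.label(flags)
        sizes = ndimage.sum(flags, labels, index=range(1, count + 1))
        return [lbl for lbl, size in enumerate(sizes, start=1)
                if size >= min_size]

Different min_size values could then be supplied for the border analysis (e.g. hand-sized) and the interior analysis (e.g. arm- or body-sized), as described above.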
Border analysis 600 may be performed for every image captured to achieve the minimum response time. That is, the border analysis 600 may be triggered by arrival of a new image frame from an image input device such as a camera. Other embodiments may perform border analysis 600 on less than all received images, using some of the received frames to perform image verification processes to, for example, help assure that the image capture devices (e.g. cameras) are functioning properly. For example, and as shown and described below with reference to Figure 22, one half of the frames may be used for border analysis, while the other half may be used for camera verification procedures.
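A sketch of the half-and-half frame split mentioned above (Python; the two callables are hypothetical placeholders for the border analysis and camera verification procedures):

    def dispatch_frames(frames, analyze_border, verify_camera):
        # Alternate frames between border analysis and camera verification,
        # as in the one-half/one-half split described above.
        for index, frame in enumerate(frames):
            if index % 2 == 0:
                analyze_border(frame)
            else:
                verify_camera(frame)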
The interior analysis block 602 may determine whether there is any new, unexpected, or undesired object present inside the area of interest. The interior analysis 602 may, in some embodiments, be limited to finding objects greater than a specified minimum size. Smaller objects and debris on the floor can, in some embodiments or applications, be ignored. The interior analysis 602 may not need to be performed for every image frame received, but instead may be invoked on demand when needed, for example, after a triggering event such as a breach detected by the border analysis block 600.
Analysis of the interior of an area of interest typically must process more pixels and may require more computation time than the border analysis 600. To help increase the speed and robustness of the interior analysis 602, a reference marking may be provided on the floor of the interior of the area of interest. However, in some applications it may not be desirable or possible to do so. A longer response time may be acceptable for interior analysis 602 since it is not used for the initial detection of a safety violation, but instead can be used for interior analysis that takes place after a border violation is detected. The slower algorithms and analysis may thus be acceptable because the system is already "aware" of the safety violation, and ameliorative actions may have taken place, for example, shutting down a hazardous machine or setting off alarm systems.

The border analysis 600 and interior analysis 602 procedures can be used together to maintain safety for the overall system. The interaction of these two analyses can be thought of as a gate that "closes" the safety zone. When the border analysis 600 determines that no object is in the process of entering the safety zone, the border of the safety zone is "closed". The interior analysis can then work to determine whether there is an object present in the interior of the safety zone. Once the interior is found to be empty and no further border violation occurs, the system is in the safe or original steady state mode. As long as the border remains "closed" (no access violation), the system preferably remains in the safe or original steady state mode. Sometimes, during the time that the interior analysis 602 is being performed, a border violation occurs. In this case, the interior analysis 602 may be repeated with a new image frame obtained after the border is clear again, since a border violation during a first interior analysis may suggest an additional object has entered the safety zone which is not included in the frame used during the first interior analysis.

Figure 16 is a flow chart showing an illustrative method for performing interior and border analysis in accordance with the present invention. The border analysis block 600 and interior analysis block 602 of Figure 15 are shown in Figure 16, outlined in dashed lines. After the illustrative system is powered up or reset, the system preferably determines whether the safety zone (e.g. area of interest) is empty before activating the safety output 646. To do so, the illustrative control system preferably waits for an indication that the border is not being violated, as shown at block 622. Border analysis block 600 determines if there is a border violation by obtaining a next live image, analyzing the border of the live image as shown at 604, and determining if there is a border violation as shown at 606. If there is a border violation, the safety output of the border analysis block 600 is set to zero, as shown at 610. If no border violation is detected at 606, a next live image is received and the process is repeated. Once no border violation is detected, the interior analysis block 602 receives a new live image of the area of interest, as shown at 624. The interior analysis block 602 then analyzes the interior of the safety zone, as shown at 626. After completing the interior analysis 626, the system may determine whether there has been a violation of the interior, as shown at 628.
If there has been a violation of the interior, the system may return to block 622 and wait for an indication that the border is no longer being violated. If there is no interior violation, the system may check whether the border has been violated during the interior analysis, as shown at 630. In some embodiments, the received frames may be saved into memory during the interior analysis. After the interior analysis is complete, border analysis 600 may be performed on the saved frames in a relatively short amount of time, and the system may conclude that no border violations have occurred during the interior analysis. If the border has been violated, the system may return to block 622 and wait for an indication that the border is no longer being violated. If no border violation occurred during the interior analysis at block 630, and the interior analysis 628 determines no interior violation has occurred, the system may generate an output indicating the system is safe, as shown at 632. Once safe, the system may enter the RUN state as shown at 634.
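The gate-like interaction of the border and interior analyses might be sketched as the following loop (Python; the callables are hypothetical stand-ins for the blocks of Figure 16):

    def clear_safety_zone(next_frame, border_is_clear, interior_is_clear):
        """Return only when the border is 'closed' and the interior is
        empty, loosely mirroring blocks 622-634: a border violation
        observed during or after the interior pass restarts the check."""
        while True:
            frame = next_frame()
            if not border_is_clear(frame):
                continue                      # wait for the border to clear
            if not interior_is_clear(frame):  # slower, on-demand analysis
                continue                      # interior violation: start over
            if border_is_clear(next_frame()): # re-check the border afterwards
                return True                   # safe; system may enter RUN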
When both border analysis 600 and interior analysis 602 indicate safe conditions, the safety output signal 646 may go high, indicating safe operating conditions. The RUN state 634 is a safe or steady state in which the illustrative system performs no further interior analysis until a border violation is detected. When a border violation is detected, the system returns to block 622 and waits for an indication that the border is no longer being violated.

Figure 17 is a state machine diagram of an illustrative embodiment of the present invention. In the initialization state 750, the system may determine whether a current configuration is valid or a new reference image and safety zone configuration is necessary or desirable. In the configuration state 760, the system may communicate with an operator to capture a new reference image, define a desired safety zone, and/or compute the needed configuration data for use in the safety monitoring procedures. It is contemplated that the system may automatically identify the border region and/or interior region in the reference image from a reference marker positioned in the area of interest. The reference marker may be, for example, a pattern on the floor, etc. Once the configuration is complete (and accepted by the operator) the system may switch to the clearing state 780 in which the interior analysis is performed. The border analysis may also start at this time and run continually whenever a new frame is received, or, in an alternative illustrative embodiment, when every other new frame is received. When the safety zone border and interior are determined to be free of safety violations, the system switches to the running state 790. The safety output may also be activated at this time, while the border analysis continues. When a border access violation is detected, the system preferably deactivates the safety output and switches to either the stop state 799 (for a manual restart) or the clear state 780 (if an automatic restart is desired).
In some applications, the lighting conditions may vary significantly throughout the day. Under these conditions, it may be desirable to include a procedure for automatically updating the reference image and the configuration data to account for changing lighting conditions. While there is no safety violation (in the run state 790), the system may periodically switch to the update state 770 to capture a new reference image and compute new configuration data corresponding to the new reference image. If a valid configuration is achieved and there is no safety violation with the new configuration the system may return to the run state 790. Alternatively, if there has been a safety violation and the system is unable to confirm that the interior of the safety zone is empty (due to, for example, some extreme change in lighting conditions) an operator may choose to manually initiate the update procedure. The system may include an output directed to a light source that may be varied as ambient light conditions change in order to maintain consistent lighting in the area of interest. The system may then capture a new reference image for the current lighting conditions and attempt to return to the run state 790 if a valid configuration is obtained and the safety zone is determined to be empty.
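A minimal sketch of such a periodic update (Python; the interval and the capture/validate callables are assumptions for illustration):

    import time

    class ReferenceUpdater:
        """While in the run state, periodically capture a new reference
        image and adopt it only if it yields a valid configuration."""
        def __init__(self, capture, validate, period_s=600.0):
            self.capture, self.validate = capture, validate
            self.period_s = period_s
            self.last_update = time.monotonic()

        def poll(self):
            if time.monotonic() - self.last_update < self.period_s:
                return None                 # remain in the run state
            candidate = self.capture()      # switch to the update state
            if self.validate(candidate):    # valid config, no violation
                self.last_update = time.monotonic()
                return candidate            # adopt the new reference image
            return None                     # keep the existing reference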
Figure 18 is a block diagram showing an illustrative data flow and analysis in accordance with the present invention. In the illustrative embodiment, an image capturing device is shown, and is preferably a standard black and white CCD video camera 810 operating at thirty frames per second. Use of a color or CMOS-based camera is also contemplated. Other frame rates are also possible, and they will impact the response time of the system. In the illustrative embodiment, an analog output signal from the camera 810 is converted to an eight-bit digital sequence of luminance data (pixels) by an analog/digital converter 802. A pixel processing block 804 performs initial analysis on the digitized output of the camera 810. For faster response times, the pixel processing device may use, for example, two pipelined processing elements, as shown in Figure 24. A first stage processor may be selected to assure sufficient speed to perform initial processing of the pixel data received from the camera 810. Pixel data may be received at, for example, a rate of about 12.5 MHz for a standard video input signal. The first stage processor may, for example, perform initial sorting and accumulation steps of the object detection algorithm for the pixels on the border of the safety zone in each image produced by the camera 810. These interim results are preferably stored in a memory, as shown at 820. When commanded by the control block 850, the first stage processor may also perform an initial differencing and thresholding operation as the first step of the interior analysis procedure.
A second stage processor may be, for example, a standard microcontroller that receives the initial analysis results stored in memory 820 and performs the remaining border and interior analysis processing to determine the correct state of the safety output, as shown at 840. The second stage processor may also implement the control function, as shown at 850. The second stage processor may be signaled via an interrupt when the first stage processor completes its portion of the analysis. The second stage processor may then perform averaging and comparison operations of the object detection algorithm, and deactivate the safety output if a border access violation has been detected. If a border access violation has been detected, the second stage processor may send one or more signals to the first stage processor to capture a live image and perform the initial interior analysis operations. When this is complete, the second stage processor may use the results stored in memory 820 to complete the interior analysis. The controller 850 may direct the pixel processing block 804 to capture a new reference image, as desired.
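The two-stage division of labor might be sketched with a queue standing in for the shared memory 820 (Python; a simplified sketch of the data hand-off, not the actual hardware design):

    import queue

    interim_results = queue.Queue()  # stands in for the memory at 820

    def first_stage(frames, border_mask):
        # Accumulate border-pixel data for each frame and store the
        # interim result; putting an item plays the role of the interrupt.
        for frame in frames:
            interim_results.put(frame[border_mask].sum())

    def second_stage(reference_sum, threshold, deactivate_safety_output):
        # Complete the comparison and drive the safety output.
        while True:
            interim = interim_results.get()  # wakes when results arrive
            if abs(interim - reference_sum) > threshold:
                deactivate_safety_output()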
To define a desired safety zone, a software program 862 or the like may be executed on a separate computer 860, such as a PC. The software preferably enables an operator to capture and view a reference image and graphically identify the border of the desired safety zone as it appears in the image. The configuration data needed by the border and interior analysis procedures may be sent back to the pixel processing block 804. The computer 860 used for the configuration procedure may not be needed during normal operation of the safety system.
Figure 19 is a timing diagram showing illustrative timing relationships for the embodiment of Figure 18. A top line 870 corresponds to a camera output, numbering individual frames i, i+1, i+2, etc. In the illustrative example, individual frames arrive every thirty-three milliseconds, as the camera takes thirty frames per second. A second line 880 may correspond to a first processor, which performs pixel processing on each successive frame received. Preferably, the pixels are processed as they are received, so that at a time just after data for a given frame is completed, the first processor can send the ready frame to the second processor, which corresponds to line 891. Typical data flow for the illustrative embodiment of Figure 19 is shown by dashed line 876. An image is captured and sent via a data stream to pixel processing, which preferably does not process the entire image simultaneously, but rather performs pixel-by-pixel processing. The output of pixel processing is preferably stored in a memory, which is then read by the second processor. The second processor preferably performs a border analysis once all of the pixels of an image are processed by the first processor.
In other embodiments, the entire processing may be performed pixel by pixel, for example, by comparing each individual pixel received to a corresponding reference pixel. Alternatively, or in addition, the pixel information may be filtered in a first stage so that, rather than processing the pixels for an entire frame before performing border analysis, only pixels corresponding to the border region are processed, while the other pixels are ignored, discarded, or further filtered to save those pixels that correspond to the interior of an area of interest.
The second processor, as shown by line 891, may use the thirty-three milliseconds between arrivals of new frames to perform two different functions. As shown, border analysis may take up a relatively short period of time for the second processor. Once the border analysis is complete, the second processor 890 may begin processing the interior, if appropriate. The interior analysis, as shown at 892, preferably begins while in the RUN state 893, but only after the safety output is opened 894 as a result of a border violation. The second processor then enters the CLEAR state 895. The CLEAR state 895 requests a new live image from the first processor, and then enters the CLEAR WAITING state 896 until the new live frame is ready. Then, the second processor goes from CLEAR WAITING 896 into CLEAR CHECK INTERIOR 897, and performs an interior analysis. In the illustrative embodiment, this is repeated until the frame 874 that shows that the safety zone is clear is processed by the second processor, wherein the safety output is closed as shown at 898.
As shown at the bottom 899 of Figure 19, the maximum response time is equal to the time it takes for a new frame to be captured by the camera, plus the time it takes for pixel processing of the new frame to be completed, plus the time it takes for the border analysis to be completed. In the illustrative embodiment, the maximum response time is about seventy-six milliseconds, less than one-tenth of a second.
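The arithmetic behind the stated bound can be checked directly (Python; the border-analysis term is an assumed remainder consistent with the roughly seventy-six millisecond total):

    frame_period_ms = 1000.0 / 30       # ~33.3 ms for the next frame to arrive
    pixel_processing_ms = 1000.0 / 30   # pixel processing paced by the frame
    border_analysis_ms = 9.0            # assumed; completes within a frame
    max_response_ms = (frame_period_ms + pixel_processing_ms
                       + border_analysis_ms)
    print(round(max_response_ms, 1))    # -> 75.7, i.e. about 76 ms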
Response times can vary depending on a wide number of factors. A faster camera can take more frames per second, reducing the maximum delay times significantly. A higher resolution camera may be useful to detect smaller objects, but may increase the processing times. One way to counteract the effect of a faster camera on processor time may be to define the border to include a smaller area; the higher frame rate may in turn decrease the likelihood that a small or fast moving object can completely cross the border without being detected. Alternatively, faster processors may be utilized, but this may increase the cost of the system. Also, the border region may be defined using different patterns that may maximize the effectiveness of the pixel processing. More sharply defined borders, however, may in turn reduce the robustness of the system. Such variables should be considered and weighed in determining the best system for a given application and/or environment.
Figure 20 is a state diagram showing the progression of states for an illustrative embodiment of the present invention. An INIT block 900 may perform such functions as initializing hardware, performing a self-check or a camera check, and starting all internal processes. If there is a bad configuration or a PC is connected, as shown at 902, the illustrative embodiment moves to a CONFIG block 910. Otherwise, if the configuration is OK and no PC is connected, as shown at 904, the illustrative embodiment moves to the CLEAR block 920.
In the CONFIG block 910, configuration routines may be initiated, including, for example, capturing of a new reference image. If a PC is connected, a new reference image may be routed to the PC, and definitions for a safety zone and windows for analysis may be returned by the PC, as described above. If a PC is present, its user may confirm the validity of the configuration, as part of the routines in the CONFIG block 910, via a user interface. For each defined analysis window, the CONFIG block 910 may determine median, light, and dark pixel sets. Also, reference contrast values may be computed and validated, and, if connected, sent to the PC. Border analysis may also begin while in the CONFIG block 910.
If the routines performed within the CONFIG block 910 are successfully completed, the CONFIG block 910 may return with OK 912 and enter the CLEAR block 920. Otherwise, if some of the routines cannot be completed, the CONFIG block 910 may instead return with FAILED CONFIG 914 and enter the STOP block 950.
Once CONFIG OK & NO PC 904 is returned by the INIT block 900 or CONFIG OK 912 is returned by the CONFIG block 910, the illustrative embodiment enters the CLEAR block 920. One purpose of the CLEAR block 920 is to determine whether the interior region of the area of interest is clear for a sufficient period of time during which the border region is also clear. In the illustrative embodiment of Figure 20, a border analysis may be performed more quickly than the interior analysis, as, for example, shown in Figure 19. Thus, the CLEAR block 920 may include a first sub-block, WAIT FOR BORDER CLEAR 921, which, in the illustrative embodiment, can be satisfied first, and which sends a BDRCLEAR 922 signal to COMPARE INTERIOR TO REFERENCE 923. Once the border is clear, the COMPARE INTERIOR TO REFERENCE 923 is initiated. Analysis performed by WAIT FOR BORDER CLEAR 921 may include comparison of areas of a live image or most recent image to areas of a reference image. The areas being compared may correspond to some of the windows defined in the CONFIG block 910, as further described in co-pending U.S. Patent Application Serial No. H16-26483, entitled "OBJECT DETECTION", which is incorporated herein by reference. The COMPARE INTERIOR TO REFERENCE 923 preferably determines whether it may be safe to initiate operation or a run state (RUN block 930) for the system. Such analysis may include comparison of a live image of the interior of the area of interest to a reference image, and this analysis may be divided into sub-blocks corresponding to some of the windows defined in the CONFIG block 910. If the interior is not clear, indicated by !INTCLEAR 924, the CLEAR block 920 returns to the WAIT FOR BORDER CLEAR 921.
In the illustrative example, a safety output may be defined to be closed when it is safe for operations to resume in the area of interest, and open when unsafe conditions are present. For this particular illustrative embodiment, the safety output is defined as open while in the INIT 900, CONFIG 910, CLEAR 920, UPDATE 940, and STOP 950 blocks. If the CLEAR block 920 determines that the border is clear and the interior is clear, it may go to the RUN block 930, as shown by BDRCLEAR & INTCLEAR 925. Upon the determination that the border and interior are both clear, the CLEAR block 920 also moves the safety output to a closed, or safe, state.
When the illustrative embodiment of Figure 20 enters the RUN block 930, border scanning preferably continues. Border scanning may include comparison of a live frame to a reference frame. Such comparison may be performed according to the windows defined in the CONFIG block 910. If a border violation is detected, the safety output is opened. A variable may also be defined in the illustrative embodiment as AUTORESTART. AUTORESTART may have two states, for example, 1/0, high/low, on/off, etc. If AUTORESTART is in a first state, the system may autorestart in response to a border violation, entering the CLEAR block 920 (shown at !BORDERCLEAR & AUTORESTART 932) when, during operation in the RUN block 930, a border violation is detected. Alternatively, if AUTORESTART is in a second state, the system may enter the STOP state 950 (shown at !BORDERCLEAR & !AUTORESTART) upon detecting a border violation.
Also in the illustrative example is an UPDATE block 940. The UPDATE block 940 may be entered from the CLEAR 920 (UPDATE REF 926) or RUN 930 (UPDATE REF 934) blocks at some predefined time. The predefined time may be a time related to a border or interior violation, a preset interval from the last updated reference image, or any other time. Also, the predefined time may arise as a result of a manual input or other passage of time. In the illustrative example shown in Figure 23 below, the safety output is opened while routines in the UPDATE block 940 are performed. In other embodiments, the safety output remains closed, at least in some circumstances, during performance of routines in the UPDATE block 940. While in the UPDATE block 940, a camera check may be performed, and a new reference image may be captured by the camera(s) of the system. If there are multiple cameras in use, the UPDATE block 940 may take new reference images from each camera, or may only capture a new reference image from one or more of the cameras. The UPDATE block 940 may also include the steps of computing a reference contrast value for each window and validating the sufficiency of the contrast. If the UPDATE block 940 fails to obtain a valid reference image, the UPDATE block 940 may return BAD CONFIG 944 and enter the STOP block 950. Otherwise, if the UPDATE block 940 completes all routines, it may return OK 942 and go into the CLEAR block 920. In other embodiments, it may be possible to return to the RUN block 930 instead of the CLEAR block 920 after completion of the routines of the UPDATE block 940. Also, a failure in the UPDATE block 940 to return OK 942 may send the system into the CONFIG 910 or INIT 900 blocks.
When in the STOP state 950, the illustrative example stops border scanning and remains in a state wherein the safety output is open until the system can be reset or otherwise checked, at which point the system may enter the INIT 900 block again. Other states and relationships between states are also contemplated.
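The progression of states in Figure 20 might be summarized as follows (Python; the event strings are simplified stand-ins for the signals described above):

    from enum import Enum, auto

    class State(Enum):
        INIT = auto()
        CONFIG = auto()
        CLEAR = auto()
        RUN = auto()
        UPDATE = auto()
        STOP = auto()

    def next_state(state, event, autorestart=True):
        # Sketch of the transitions among blocks 900-950.
        if state is State.INIT:
            return State.CONFIG if event == "BAD_CONFIG_OR_PC" else State.CLEAR
        if state is State.CONFIG:
            return State.CLEAR if event == "CONFIG_OK" else State.STOP
        if state is State.CLEAR:
            return State.RUN if event == "BDRCLEAR_AND_INTCLEAR" else state
        if state is State.RUN:
            if event == "BORDER_VIOLATION":
                return State.CLEAR if autorestart else State.STOP
            if event == "UPDATE_REF":
                return State.UPDATE
        if state is State.UPDATE:
            return State.CLEAR if event == "UPDATE_OK" else State.STOP
        if state is State.STOP and event == "RESET":
            return State.INIT
        return state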
Figure 21 is a schematic diagram of an illustrative embodiment using multiple channels of data flow in accordance with the present invention. In this illustrative embodiment, safety camera 1010 gathers images at a certain rate of frames per second, creating a composite video stream 1012. The video stream is split into two channels, Channel A 1020 and Channel B 1030. Channel A 1020 generates Safety Output A 1022, and Channel B 1030 generates Safety Output B 1032. A first cross-check connection 1024 connects Safety Output B 1032 to the image analysis block for Channel A 1020, and a second cross-check connection 1034 connects Safety Output A 1022 to the image analysis block for Channel B 1030.
In the illustrative embodiment, alternating frames in the image stream are available for validation of the camera data. For example, if camera 1010 provides thirty frames per second (fps), 15 fps may be used by Channel A 1020 for safety analysis. The remaining or intervening 15 fps may be analyzed in Channel B 1030 to detect dead pixels, row failures, insufficient lighting, etc. The analysis performed in Channel B 1030 may rely on detecting, in every frame, a boundary marking for the safety zone that provides a sufficient or reliable reference signal. Because camera fault conditions, such as pixel failures, loss of sensitivity, burn out, etc., may also result in insufficient contrast compared to the reference image, such analysis may detect additional safety violations.
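The Channel B validation might be sketched as a contrast check on the boundary marking (Python with NumPy; the mask and threshold are illustrative assumptions):

    import numpy as np

    def frame_signal_valid(frame, marking_mask, min_contrast=30.0):
        """A flat or low-contrast boundary marking suggests dead pixels,
        row failures, lost sensitivity, or insufficient lighting."""
        marking = frame[marking_mask].astype(np.float64)
        surround = frame[~marking_mask].astype(np.float64)
        return abs(marking.mean() - surround.mean()) >= min_contrast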
Figure 22 is a timing diagram showing illustrative timing relationships of the embodiment of Figure 21. In the illustrative embodiment, the camera takes a certain number of frames per second; the frames are represented in a first line 1100, enumerated as frame i, frame i+1, etc. Hardware processing 1110 alternates between performing image analysis 1112 and signal validation 1114. In a preferred embodiment, analysis takes place as the pixels are received, rather than in a buffered grouping, so hardware processing 1110 continually analyzes data. At intervals relating to the completion of each frame, software processing 1120 receives data streams from hardware processing 1110. At the end of image analysis 1112 by hardware processing 1110, software processing 1120 performs an evaluation and generates a safety output 1122. Once evaluation and safety output 1122 are completed, software processing 1120 performs self-check and background tasks until pixel data is again ready to be sent by hardware processing 1110.
At the end of signal validation 1114, hardware processing 1110 also sends a signal to software processing 1120. As shown, the signal may indicate that the signal is properly received, or OK 1116. As shown at lower portion 1130 of Figure 22, the maximum response time of this embodiment is slower than it was for the earlier illustrative embodiment. However, in exchange for the added response time, the system may be more reliable as the input signal is constantly validated. Such a system may be particularly suitable for harsh environments or where high reliability is required.

Figure 23 is a block diagram showing processing, memory and control blocks of an illustrative embodiment of the present invention. In the illustrative embodiment, a CCD video camera 1200 generates a composite video signal. The composite video signal is received by a video decoder 1210. Video decoder 1210 may include an analog/digital converter and a synchronization function. The video decoder generates signals that may indicate luminance, a pixel clock, and a synchronization signal to a pixel processing block 1215. The pixel processing block 1215 may fetch pixels and perform a contrast comparison algorithm or, in some embodiments, a frame matching algorithm, to determine whether an object has entered a portion of an area of interest. Pixel processing block 1215 also has access to memory 1220, and is preferably capable of saving and retrieving data from memory 1220. Memory 1220 may include a live image 1222, a reference image 1224, window definitions 1226, and analysis results 1228.
Control, safety, and configuration software block 1230 may be accessed by pixel processing block 1215. Control, safety, and configuration software block 1230 may also access and save data to memory block 1220, and receive inputs and generate outputs 1240 as shown. The control, safety, and configuration software block 1230 preferably controls the safety system, as described above.
Figure 24 is a functional block diagram of another illustrative embodiment of the present invention. In the illustrative embodiment, hardware 1303 includes CCD video camera 1300, video decoder 1310, pixel analysis 1320, analysis windows memory block 1330, analysis window definitions memory block 1340, reference image memory block 1350, and live image memory block 1360. As shown, analysis window definition block 1340 and reference image block 1350 may include flash memory elements. Software 1306 may include system supervision 1370, safety analysis 1375, configuration 1380, and user interface software 1390.
CCD video camera 1300 generates a composite video data signal that is received by video decoder 1310. Video decoder 1310 may include analog/digital conversion capabilities and may generate synchronization data. Video decoder 1310 is controlled by system supervision 1370 during a configuration step. Video decoder 1310 may send luminance data, clock data, horizontal synchronization data, and vertical synchronization data to pixel analysis block 1320. In the illustrative embodiment, pixel analysis block 1320 performs a contrast comparison algorithm and image validation. Pixel analysis 1320 provides data regarding pixels to analysis windows 1330, which generates status and results that are then sent on to system supervision 1370 and safety analysis 1375. The pixel analysis block 1320 may be controlled by system supervision 1370 to determine the mode of pixel analysis. Pixel analysis can include any number of pixel processing methods including, for example, a contrast comparison algorithm mode, a frame matching algorithm mode, an image validation mode, etc.
The pixel analysis block 1320 also sends data corresponding to the live image to live image memory block 1360, where the live image may be stored. Configuration 1380 may control live image memory block 1360, and may request live image memory block 1360 to send data to reference image memory block 1350, possibly updating the reference image memory block 1350. Pixel analysis block 1320 receives a reference image from reference image memory block 1350 as well as data relating to the analysis windows from analysis window definitions block 1340. The analysis window definitions enable the pixel analysis 1320 to group pixels together into bins to perform contrast comparison algorithms comparing a live image to a reference image. Analysis window definitions block 1340 receives data from configuration block 1380.
System supervision 1370 receives data from reference image memory block 1350 and analysis window definition block 1340, as well as analysis windows block 1330. System supervision performs tasks including initialization and self-check tasks by controlling the configuration of video decoder 1310 and the mode of pixel analysis 1320, along with controlling safety analysis 1375 and configuration 1380. Safety analysis 1375 evaluates analysis windows 1330 outputs and generates a safety output. Safety analysis also generates an error output, which can be generated by any of system supervision 1370, safety analysis 1375, or configuration 1380. Safety analysis receives control signals from system supervision 1370. Configuration 1380 may send data to analysis window definitions 1340, and controls capture of the live image memory block 1360. Via a coupling such as an RS-232 or any other data connection, configuration 1380 sends and receives data to and from user interface software 1390. Configuration 1380 can also generate an error signal.
During an initialization mode, hardware 1303 is initialized, internal processes are started, and a self-check is performed. Much of initialization can be controlled from system supervision 1370, though information received from camera 1300 and user interface 1390 can be incorporated as well. An error signal will be generated if there is a failure to correctly initialize the system.
In the configuration mode, a new reference image is captured with camera 1300 and sent to the PC operating the user interface software 1390. Safety zone parameters are received from the PC, again using user interface software 1390. Analysis window definitions 1340 are generated using the safety zone parameters. For each analysis window 1330, the median pixel level (for example, with respect to luminance) and defined light and dark pixel sets are determined. Reference contrast values are computed and validated, and ultimately sent to the PC to confirm that the reference contrast values are acceptable. The PC user can confirm the validity of the reference contrast values. An error will be generated if there is a failure to define a valid safety zone.
In a calibration mode, reference contrast values for each analysis window 1330 are computed. Whether there is sufficient contrast in each window is then validated. Windows lacking sufficient contrast for processing via contrast comparison algorithms are marked for frame differencing analysis.
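The per-window calibration described above might look like the following sketch (Python with NumPy; the contrast threshold is an assumption for illustration):

    import numpy as np

    def calibrate_window(window_pixels, min_contrast=20.0):
        """Split a window's pixels into light and dark sets about the
        median, compute a reference contrast, and mark low-contrast
        windows for frame-differencing analysis instead."""
        pixels = np.asarray(window_pixels, dtype=np.float64)
        median = float(np.median(pixels))
        light = pixels[pixels > median]
        dark = pixels[pixels <= median]
        if light.size == 0:             # uniform window: no usable contrast
            return {"median": median, "contrast": 0.0,
                    "use_frame_differencing": True}
        contrast = float(light.mean() - dark.mean())
        return {"median": median, "contrast": contrast,
                "use_frame_differencing": contrast < min_contrast}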
During a clearing mode, the comparison of live images to a reference image (or reference images) may begin. Once the comparison shows agreement between the live image and the reference image, boundary scanning begins, and the safety output is closed (set to "safe"). While the system is running, boundary scans are continued. The safety output will be opened if there is a detected violation.
Having thus described the preferred embodiments of the present invention, those of skill in the art will readily appreciate that the teachings found herein may be applied to yet other embodiments within the scope of the claims hereto attached.

Claims

WHAT IS CLAIMED IS:
1. A method for monitoring an area of interest having a border and an interior region, the method comprising the steps of: monitoring at least a portion of the border region of the area of interest for breach by an object; and monitoring at least a portion of the interior region of the area of interest for the object after the object breaches the border.
2. The method of claim 1 further comprising the step of: ceasing to monitor the interior region of the area of interest after the object leaves the area of interest; and continuing to monitor at least a portion of the border region of the area of interest after the object leaves the area of interest.
3. The method of claim 1 wherein the interior region of the area of interest is not monitored until the object no longer breaches the border region of the area of interest.
4. The method of claim 1 further comprising the step of: continuing to monitor at least a portion of the border region of the area of interest while the interior region is being monitored.
5. The method of claim 1 further comprising the step of providing a safety output when the border region is breached by the object.
6. The method of claim 5 wherein the safety output disables a piece of equipment located in the area of interest.
7. The method of claim 5 wherein the safety output sounds an alarm.
8. The method of claim 1 wherein the border region comprises a continuous region.
9. The method of claim 1 wherein the border region comprises an interrupted region.
10. The method of claim 1 wherein the area of interest excludes a defined region from its interior.
11. A method for monitoring an area of interest having a border and an interior, the method comprising the steps of: capturing a capture image of the area of interest; identifying one or more border regions in the captured image that correspond to the border of the area of interest; analyzing the one or more border regions of the captured image and determining if an object has entered the one or more border regions of the area of interest; and outputting a signal indicating when an object has entered the one or more border regions of the area of interest.
12. The method of claim 11 wherein the one or more border regions include a reference marking.
13. The method of claim 11 wherein the reference marking is a predetermined pattern.
14. The method of claim 11 wherein the step of analyzing the one or more border regions of the captured image comprises the step of comparing the one or more border regions of the capture image to one or more corresponding regions of a reference image.
15. The method of claim 14 wherein the border of the area of interest includes a reference marking, and the one or more border regions in the reference image are identified by identifying the reference marking in the reference image.
16. The method of claim 15 wherein the reference marking is a predetermined pattern.
17. The method of claim 16 wherein the predetermined pattern determines a minimum size of the objects to be detected.
18. The method of claim 11 further comprising the step of storing the capture image when an object has entered the area of interest.
19. The method of claim 18 further comprising the step of viewing the stored capture images at a later time.
20. The method of claim 14 wherein the reference image is taken in response to a change in one or more conditions in the area of interest.
21. The method of claim 14 wherein the reference image is taken at a set time interval.
22. The method of claim 11 wherein the step of analyzing the one or more border regions of the captured image comprises the step of comparing the one or more border regions of the capture image to corresponding regions of two or more reference images.
23. The method of claim 22 wherein at least one comparison detects relatively immediate changes, and at least one comparison detects accumulated changes.
24. A method for monitoring an area of interest having a border and an interior region, the method comprising the steps of: capturing at least two images of the area of interest using two separate image capturing devices; identifying one or more border regions in the captured images that correspond to the border of the area of interest; analyzing the one or more border regions of the captured images to determine when an object enters the area of interest; and outputting a signal indicating whether or not an object has entered the area of interest.
25. The method of claim 24, wherein the image capturing devices are video cameras.
26. The method of claim 24, wherein the image capturing devices are digital cameras.
27. A system for monitoring an area of interest having a border and an interior region, comprising: capturing means for capturing a capture image of the area of interest; and monitoring means for monitoring at least a portion of the border region of the area of interest for breach by an object, and for monitoring at least a portion of the interior region of the area of interest for the presence of the object after the object breaches the border.
28. A system for monitoring an area of interest, comprising: image capturing means for capturing at least one image of the area of interest; first processing means for processing at least one of the capture images to determine if an object has entered the area of interest; second processing means for processing at least one of the capture images to determine if an object has entered the area of interest; and output means for outputting a signal indicating that an object has entered the area of interest when both the first processing means and second processing means indicate that an object has entered the area of interest.
29. A system according to claim 28 wherein the image capturing means includes a single image capture device.
30. A system according to claim 28 wherein the image capturing means includes two image capture devices each providing a separate image of the area of interest, wherein a first one of the image capture devices provides a first image of the area of interest to the first processing means and a second one of the image capture devices provides a second image of the area of interest to the second processing means.
31. A method for monitoring an area of interest having a border region and an interior region, the method comprising the steps of: monitoring at least a portion of the border region of the area of interest for breach by an object having a first minimum size; and monitoring at least a portion of the interior region of the area of interest for an object having a second minimum size after the object breaches the border region of the area of interest.
32. The method of claim 31 wherein the first minimum size is smaller than the second minimum size.
33. The method of claim 31 wherein the first minimum size is bigger than the second minimum size.
34. The method of claim 31 wherein the interior region is defined to include the border region.
35. The method of claim 31 wherein the interior region is defined to exclude the border region.
36. A method for monitoring an area of interest having two or more regions, each region having a border and an interior region, the method comprising the steps of: capturing a capture image of the area of interest; monitoring the border and/or interior region of a first region of the area of interest for breach by an object; and monitoring the border and/or interior region of a second region of the area of interest for breach by an object.
37. The method of claim 36 wherein the border and/or interior region of the first region are monitored independently of the border and/or interior region of the second region.
38. The method of claim 36 wherein the border and/or interior regions of the first and second region are selectively monitored.
39. The method of claim 38 wherein the border and/or interior region of the first region are monitored and the border and/or interior region of the second region are not monitored.
40. The method of claim 38 wherein the border and/or interior region of the first region are monitored and the border and/or interior region of the second region are also monitored.
41. The method of claim 38 wherein the border and/or interior region of the first region are not monitored and the border and/or interior region of the second region are not monitored.
PCT/US2002/007493 2001-03-14 2002-03-13 Object detection WO2002073086A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2002572310A JP2004535616A (en) 2001-03-14 2002-03-13 Object detection
EP02723404A EP1373785B1 (en) 2001-03-14 2002-03-13 Object detection
DE60224373T DE60224373T2 (en) 2001-03-14 2002-03-13 RECORDING OF OBJECTS

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US27587901P 2001-03-14 2001-03-14
US60/275,879 2001-03-14
US09/981,928 2001-10-16
US09/981,928 US7200246B2 (en) 2000-11-17 2001-10-16 Object detection

Publications (1)

Publication Number Publication Date
WO2002073086A1 true WO2002073086A1 (en) 2002-09-19

Family

ID=26957646

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/007493 WO2002073086A1 (en) 2001-03-14 2002-03-13 Object detection

Country Status (5)

Country Link
US (1) US7200246B2 (en)
EP (1) EP1373785B1 (en)
JP (1) JP2004535616A (en)
AT (1) ATE382823T1 (en)
WO (1) WO2002073086A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004029502A1 (en) * 2002-09-24 2004-04-08 Pilz Gmbh & Co. Kg Method and device for making a hazardous area safe
EP1418380A1 (en) * 2002-11-06 2004-05-12 Leuze lumiflex GmbH + Co. KG Method and device for monitoring an area
EP1482238A3 (en) * 2003-05-29 2005-08-17 CASAGRANDE SpA Safety device for operating machines, particularly drilling machines or suchlike, and method to recognize the presence of persons, using such safety device
WO2007085330A1 (en) * 2006-01-30 2007-08-02 Abb Ab A method and a system for supervising a work area including an industrial robot
DE102006023787A1 (en) * 2006-05-20 2007-11-22 Sick Ag Opto-electronic protection device operating method, involves partially deactivating image signals of monitoring devices, proportional to respective viewing angle of camera at zones and combining with switching signal in evaluating unit
WO2008061607A1 (en) 2006-11-24 2008-05-29 Pilz Gmbh & Co. Kg Method and apparatus for monitoring a three-dimensional spatial area
WO2009007159A1 (en) * 2007-07-11 2009-01-15 Robert Bosch Gmbh Machine tool comprising a safety unit and a calibration unit for calibrating the safety unit
US7729511B2 (en) 2002-09-24 2010-06-01 Pilz Gmbh & Co. Kg Method and device for safeguarding a hazardous area
GB2479749A (en) * 2010-04-20 2011-10-26 Cementation Skanska Ltd Drill rig optical safety system
AT516140A1 (en) * 2014-08-11 2016-02-15 Scm Group Spa Method for safety control of a system and corresponding safety system
DE102006012823B4 (en) * 2006-03-21 2016-12-22 Leuze Electronic Gmbh + Co. Kg Device for monitoring a danger zone on a work equipment
US10081107B2 (en) 2013-01-23 2018-09-25 Denso Wave Incorporated System and method for monitoring entry of object into surrounding area of robot

Families Citing this family (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8948442B2 (en) * 1982-06-18 2015-02-03 Intelligent Technologies International, Inc. Optical monitoring of vehicle interiors
US7768549B2 (en) * 2001-06-08 2010-08-03 Honeywell International Inc. Machine safety system with mutual exclusion zone
JP4066168B2 (en) * 2003-03-13 2008-03-26 オムロン株式会社 Intruder monitoring device
US20050071166A1 (en) * 2003-09-29 2005-03-31 International Business Machines Corporation Apparatus for the collection of data for performing automatic speech recognition
CA2564171C (en) * 2004-04-30 2012-07-24 Utc Fire & Security Corp. Atm security system
GB0416583D0 (en) * 2004-07-23 2004-08-25 Rwl Consultants Ltd Access monitoring apparatus
WO2007020666A1 (en) * 2005-08-18 2007-02-22 Datasensor S.P.A. Vision sensor for security systems and its operating method
DE202006008112U1 (en) * 2006-05-20 2006-08-10 Sick Ag Optoelectronic protection device e.g. for monitoring hazardous area, has lighting equipment housing shaped for minimal jutting into entrance aperture
US20080000968A1 (en) * 2006-06-30 2008-01-03 Robert Thomas Cato Rotating Light Beam/Distance Meter Based Location Determining System
US8064683B2 (en) * 2006-09-27 2011-11-22 Wipro Limited Vision-controlled system and method thereof
US8558885B2 (en) * 2006-09-29 2013-10-15 The Chamberlain Group, Inc. Barrier operator system and method with obstruction detection
DE102006050235B4 (en) * 2006-10-17 2014-02-13 Pilz Gmbh & Co. Kg Camera system for monitoring a room area
JP4318724B2 (en) 2007-02-14 2009-08-26 パナソニック株式会社 Surveillance camera and surveillance camera control method
DE102007014612A1 (en) * 2007-03-23 2008-09-25 TRüTZSCHLER GMBH & CO. KG Device for monitoring and securing hazardous areas on power-driven textile machines, in particular spinning preparation machines
DE102007020448B4 (en) * 2007-04-27 2015-10-15 Trützschler GmbH & Co Kommanditgesellschaft Device on a spinning preparation machine, e.g. Track, card, combing machine o. The like. With a drafting system
US9293017B2 (en) 2009-02-26 2016-03-22 Tko Enterprises, Inc. Image processing sensor systems
US9740921B2 (en) 2009-02-26 2017-08-22 Tko Enterprises, Inc. Image processing sensor systems
US9277878B2 (en) * 2009-02-26 2016-03-08 Tko Enterprises, Inc. Image processing sensor systems
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009015920B4 (en) 2009-03-25 2014-11-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
ITTO20090374A1 (en) * 2009-05-12 2010-11-13 Newco 2000 S R L METHOD AND SYSTEM OF CONTROL OF THE PARKING AND RESTART OF A VEHICLE DURING A COMPETITION
US8253792B2 (en) * 2009-08-28 2012-08-28 GM Global Technology Operations LLC Vision system for monitoring humans in dynamic environments
US8659662B2 (en) * 2009-10-14 2014-02-25 Harris Corporation Surveillance system with target based scrolling and related methods
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
DE102009057101A1 (en) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Device for optically scanning and measuring an environment
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9113023B2 (en) * 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
DE112011100292B4 (en) 2010-01-20 2016-11-24 Faro Technologies Inc. Display for a coordinate measuring machine
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
JP5178797B2 (en) * 2010-09-13 2013-04-10 キヤノン株式会社 Display control apparatus and display control method
US8970369B2 (en) * 2010-09-13 2015-03-03 Fasteners For Retail, Inc. “Invisi wall” anti-theft system
EP2619724A2 (en) 2010-09-23 2013-07-31 Stryker Corporation Video monitoring system
US20120081537A1 (en) * 2010-10-04 2012-04-05 General Electric Company Camera protective system and method
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
EP2641236A1 (en) 2010-11-17 2013-09-25 Omron Scientific Technologies, Inc. A method and apparatus for monitoring zones
JP5696482B2 (en) * 2011-01-11 2015-04-08 富士通株式会社 Information processing apparatus and program
US8989437B2 (en) 2011-05-16 2015-03-24 Microsoft Corporation Salient object detection by composition
DE102012100609A1 (en) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
CN102867384A (en) * 2012-03-12 2013-01-09 钛星科技股份有限公司 Guard chain monitoring system
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
CN102956084B (en) * 2012-09-16 2014-09-17 中国安全生产科学研究院 Three-dimensional space safety protection system
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
JPWO2014091667A1 (en) 2012-12-10 2017-01-05 日本電気株式会社 Analysis control system
EP2746820B1 (en) * 2012-12-20 2020-08-26 Leuze electronic GmbH + Co KG Optical sensor
DE102013200817A1 (en) * 2013-01-18 2014-07-24 Hella Kgaa Hueck & Co. Method for detecting an occupancy of a monitored zone
US9218538B2 (en) 2013-01-30 2015-12-22 Xerox Corporation Methods and systems for detecting an object borderline
US9098871B2 (en) * 2013-01-31 2015-08-04 Wal-Mart Stores, Inc. Method and system for automatically managing an electronic shopping list
FR3007176B1 (en) * 2013-06-18 2016-10-28 Airbus Operations Sas DEVICE, SYSTEM AND METHOD FOR ESCORTING AN AIRCRAFT ON THE GROUND
EP2819109B1 (en) * 2013-06-28 2015-05-27 Sick Ag Optoelectronic 3D-sensor and method for recognising objects
DE102014203749A1 (en) * 2014-02-28 2015-09-17 Robert Bosch Gmbh Method and device for monitoring at least one interior of a building and assistance system for at least one interior of a building
US9921300B2 (en) 2014-05-19 2018-03-20 Rockwell Automation Technologies, Inc. Waveform reconstruction in a time-of-flight sensor
US9696424B2 (en) 2014-05-19 2017-07-04 Rockwell Automation Technologies, Inc. Optical area monitoring with spot matrix illumination
US9256944B2 (en) * 2014-05-19 2016-02-09 Rockwell Automation Technologies, Inc. Integration of optical area monitoring with industrial machine control
US11243294B2 (en) 2014-05-19 2022-02-08 Rockwell Automation Technologies, Inc. Waveform reconstruction in a time-of-flight sensor
US9625108B2 (en) 2014-10-08 2017-04-18 Rockwell Automation Technologies, Inc. Auxiliary light source associated with an industrial application
DE202015101170U1 (en) * 2015-03-09 2015-05-06 Geotec Bohrtechnik Gmbh Vertical drill with secure working area
CN106716447B (en) * 2015-08-10 2018-05-15 皇家飞利浦有限公司 Take detection
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
PL3455541T3 (en) * 2016-05-12 2021-12-20 Kando Innovation Limited Enhanced safety attachment for cutting machine
US10416646B2 (en) * 2016-11-11 2019-09-17 Safe Tek, LLC Systems, methods, and articles of manufacture for operation of an industrial machine
EP3613012A1 (en) * 2017-04-19 2020-02-26 Schneider Electric IT Corporation Systems and methods of proximity detection for rack enclosures
US11125860B2 (en) * 2017-09-29 2021-09-21 Rockwell Automation Technologies, Inc. Triangulation applied as a safety scanner
SE2250812A1 (en) * 2022-06-29 2023-12-30 Flasheye Ab Information carrier for providing information to a lidar sensor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4249207A (en) * 1979-02-20 1981-02-03 Computing Devices Company Perimeter surveillance system
DE19619688A1 (en) * 1996-05-15 1997-11-20 Herion Werke Kg Method of video monitoring for machine operating areas, e.g. for presses
EP1037181A2 (en) * 1999-03-12 2000-09-20 Delphi Technologies, Inc. Method and system for monitoring an interior
DE10026710A1 (en) * 2000-05-30 2001-12-06 Sick Ag Optoelectronic protection device for surveillance area has latter enclosed by reflector with coded reflector segments

Family Cites Families (87)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS56160183A (en) * 1980-05-09 1981-12-09 Hajime Sangyo KK Method and device for monitoring
EP0062655B1 (en) * 1980-10-22 1986-10-15 The Commonwealth Of Australia Video movement detector
US5845000A (en) * 1992-05-05 1998-12-01 Automotive Technologies International, Inc. Optical identification and monitoring system using pattern recognition for use with vehicles
US4589140A (en) 1983-03-21 1986-05-13 Beltronics, Inc. Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like
US4923066A (en) * 1987-10-08 1990-05-08 Elor Optronics Ltd. Small arms ammunition inspection system
US4857912A (en) * 1988-07-27 1989-08-15 The United States Of America As Represented By The Secretary Of The Navy Intelligent security assessment system
FR2665317B1 (en) 1990-07-27 1996-02-09 Thomson Surveillance Video Surveillance camera with integrated support
EP0484076B1 (en) 1990-10-29 1996-12-18 Kabushiki Kaisha Toshiba Video camera having focusing and image-processing function
KR930010843B1 (en) 1990-12-15 1993-11-12 Samsung Electronics Co., Ltd. Mobile camera apparatus
CA2057961C (en) * 1991-05-06 2000-06-13 Robert Paff Graphical workstation for integrated security system
US5359363A (en) * 1991-05-13 1994-10-25 Telerobotics International, Inc. Omniview motionless camera surveillance system
US5509082A (en) * 1991-05-30 1996-04-16 Matsushita Electric Industrial Co., Ltd. Vehicle movement measuring apparatus
US5479021A (en) 1991-06-10 1995-12-26 Picker International, Inc. Transmission line source assembly for spect cameras
JP3151921B2 (en) 1991-06-17 2001-04-03 Matsushita Electric Industrial Co., Ltd. Television camera equipment
CA2062620C (en) * 1991-07-31 1998-10-06 Robert Paff Surveillance apparatus with enhanced control of camera and lens assembly
US5164827A (en) 1991-08-22 1992-11-17 Sensormatic Electronics Corporation Surveillance system with master camera control of slave cameras
CA2068022C (en) 1991-09-17 2002-07-09 Norbert M. Stiepel Surveillance device with eyeball assembly and pivotably mountable carriage assembly
USD349713S (en) * 1991-11-18 1994-08-16 Elmo Company Ltd. Surveillance camera
USD349714S (en) * 1991-11-18 1994-08-16 Elmo Company Ltd. Surveillance camera
FR2686720A1 (en) 1992-01-28 1993-07-30 Bidault Louis Monitoring device with mobile camera
KR100264196B1 (en) 1992-03-06 2000-08-16 R. Atkins Control apparatus for line marking machines
US5835613A (en) * 1992-05-05 1998-11-10 Automotive Technologies International, Inc. Optical identification and monitoring system using pattern recognition for use with vehicles
FR2692423B1 (en) * 1992-06-16 1995-12-01 Thomson-CSF Multistandard observation camera and monitoring system using such a camera
JPH0667266A (en) 1992-08-21 1994-03-11 NGK Insulators Ltd Burglar prevention camera device and warning system
USD354973S (en) * 1992-10-07 1995-01-31 Sony Corporation Surveillance video camera
USD347442S (en) * 1992-11-06 1994-05-31 Falconer Leonard S Combined imitation surveillance camera and support therefor
USD349913S (en) * 1993-01-22 1994-08-23 Morris Bryan W Surveillance video camera security box
US5418567A (en) * 1993-01-29 1995-05-23 Bayport Controls, Inc. Surveillance camera system
USD349911S (en) * 1993-04-05 1994-08-23 Koyo Electronics Industries Co., Ltd. Surveillance camera
GB9308952D0 (en) * 1993-04-30 1993-06-16 Philips Electronics Uk Ltd Tracking objects in video sequences
EP0700623A1 (en) * 1993-05-14 1996-03-13 Rct Systems, Inc. Video traffic monitor for retail establishments and the like
JP3169196B2 (en) 1993-08-10 2001-05-21 Olympus Optical Co., Ltd. Camera
JPH0795598A (en) * 1993-09-25 1995-04-07 Sony Corp Object tracking device
JPH07104362A (en) 1993-10-01 1995-04-21 Canon Inc Controller for camera
JPH07159892A (en) 1993-12-08 1995-06-23 Canon Inc Camera
JPH07175128A (en) 1993-12-20 1995-07-14 Canon Inc Camera
JPH07191390A (en) 1993-12-27 1995-07-28 Canon Inc Camera
JPH07222039A (en) 1994-01-31 1995-08-18 Mitsubishi Electric Corp Power unit for video camera
JP3293308B2 (en) * 1994-03-10 2002-06-17 Mitsubishi Electric Corporation Person state detection device
JP3462899B2 (en) 1994-03-14 2003-11-05 Matsushita Electric Industrial Co., Ltd. Three-panel TV camera device
JPH07281276A (en) 1994-04-04 1995-10-27 Konica Corp Camera
FR2719670B1 (en) * 1994-05-03 1996-07-05 Sopha Medical Gamma camera with approach and safety planes
US5627616A (en) * 1994-06-22 1997-05-06 Philips Electronics North America Corporation Surveillance camera system
GB9413413D0 (en) 1994-07-04 1994-08-24 AT&T Global Inf Solution Apparatus and method for testing bank-notes
US5477212A (en) * 1994-07-18 1995-12-19 Rumpel; David C. Surveillance camera simulator apparatus
GB2305061B (en) * 1994-07-26 1998-12-09 Maxpro Systems Pty Ltd Text insertion system
JP3531231B2 (en) 1994-09-07 2004-05-24 Nikon Corporation Power supply circuit, load device, and camera with power supply circuit
USD378095S (en) * 1994-09-26 1997-02-18 Elmo Company Limited Surveillance camera
US5541585A (en) * 1994-10-11 1996-07-30 Stanley Home Automation Security system for controlling building access
JPH08140941A (en) 1994-11-25 1996-06-04 Canon Inc Ophthalmologic camera
USD365834S (en) * 1995-01-06 1996-01-02 Dozier Charles W Housing for a surveillance camera
JP3569992B2 (en) * 1995-02-17 2004-09-29 Hitachi, Ltd. Mobile object detection/extraction device, mobile object detection/extraction method, and mobile object monitoring system
US5729471A (en) * 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
US6088468A (en) * 1995-05-17 2000-07-11 Hitachi Denshi Kabushiki Kaisha Method and apparatus for sensing object located within visual field of imaging device
US6151065A (en) 1995-06-20 2000-11-21 Steed; Van P. Concealed integrated vehicular camera safety system
WO1997005744A1 (en) 1995-07-27 1997-02-13 Sensormatic Electronics Corporation Image splitting, forming and processing device and method for use with no moving parts camera
US5691765A (en) * 1995-07-27 1997-11-25 Sensormatic Electronics Corporation Image forming and processing device and method for use with no moving parts camera
JP3350296B2 (en) * 1995-07-28 2002-11-25 Mitsubishi Electric Corporation Face image processing device
JP3220626B2 (en) 1995-09-20 2001-10-22 Sharp Corporation In-vehicle surveillance camera device
US5649255A (en) 1995-09-25 1997-07-15 Sensormatic Electronics Corporation Video surveillance camera release and removal mechanism
SG87750A1 (en) 1995-11-01 2002-04-16 Thomson Consumer Electronics Surveillance system for a video recording camera
US5745170A (en) * 1995-11-01 1998-04-28 ITT Corporation Mounting device for joining a night vision device to a surveillance camera
EP0774730B1 (en) 1995-11-01 2005-08-24 Canon Kabushiki Kaisha Object extraction method, and image sensing apparatus using the method
US6002995A (en) * 1995-12-19 1999-12-14 Canon Kabushiki Kaisha Apparatus and method for displaying control information of cameras connected to a network
US5793900A (en) * 1995-12-29 1998-08-11 Stanford University Generating categorical depth maps using passive defocus sensing
US5818519A (en) * 1996-01-17 1998-10-06 Wren; Clifford T. Surveillance camera mounting apparatus
JPH09193078A (en) 1996-01-22 1997-07-29 Hitachi Constr Mach Co Ltd Camera direction control device of remote control machine
US5752100A (en) * 1996-01-26 1998-05-12 Eastman Kodak Company Driver circuit for a camera autofocus laser diode with provision for fault protection
JP3146150B2 (en) 1996-04-01 2001-03-12 Star Micronics Co., Ltd. Surveillance camera system
US5982418A (en) * 1996-04-22 1999-11-09 Sensormatic Electronics Corporation Distributed video data storage in video surveillance system
JPH1031256A (en) 1996-07-16 1998-02-03 Fuji Photo Film Co Ltd Camera
JPH1042231A (en) 1996-07-19 1998-02-13 Canon Inc Digital camera and digital camera system
US5953055A (en) * 1996-08-08 1999-09-14 NCR Corporation System and method for detecting and analyzing a queue
DE19644278A1 (en) 1996-10-24 1998-05-07 Ines Elektronik Systementwickl Optical barrier and monitoring device constructed from it
US5731832A (en) * 1996-11-05 1998-03-24 Prescient Systems Apparatus and method for detecting motion in a video signal
USD399517S (en) * 1996-12-06 1998-10-13 Elmo Co., Ltd. Surveillance television camera
US5992094A (en) * 1997-02-11 1999-11-30 Diaz; William Access control vestibule
WO1998046116A2 (en) 1997-04-16 1998-10-22 Charles Jeffrey R Solid catadioptric omnidirectional optical system having central coverage means which is associated with a camera, projector, medical instrument, or similar article
US6097429A (en) * 1997-08-01 2000-08-01 Esco Electronics Corporation Site control unit for video security system
US5790910A (en) * 1997-08-04 1998-08-04 Peerless Industries, Inc. Camera mounting apparatus
US5852754A (en) * 1997-08-27 1998-12-22 Videolarm, Inc. Pressurized housing for surveillance camera
US6052052A (en) * 1997-08-29 2000-04-18 Navarro Group Limited, Inc. Portable alarm system
JP3567066B2 (en) * 1997-10-31 2004-09-15 Hitachi, Ltd. Moving object combination detecting apparatus and method
US6359647B1 (en) * 1998-08-07 2002-03-19 Philips Electronics North America Corporation Automated camera handoff system for figure tracking in a multiple camera system
US6285394B1 (en) * 1999-05-10 2001-09-04 James L. F. Huang Video monitoring system
US6469734B1 (en) * 2000-04-29 2002-10-22 Cognex Corporation Video safety detector with shadow elimination
US6720874B2 (en) * 2000-09-29 2004-04-13 IDS Systems, Inc. Portal intrusion detection apparatus and method

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004029502A1 (en) * 2002-09-24 2004-04-08 Pilz Gmbh & Co. Kg Method and device for making a hazardous area safe
US7729511B2 (en) 2002-09-24 2010-06-01 Pilz Gmbh & Co. Kg Method and device for safeguarding a hazardous area
US7567272B2 (en) 2002-11-06 2009-07-28 Leuze lumiflex GmbH + Co. KG Method and device for monitoring an area of coverage
EP1418380A1 (en) * 2002-11-06 2004-05-12 Leuze lumiflex GmbH + Co. KG Method and device for monitoring an area
EP1482238A3 (en) * 2003-05-29 2005-08-17 CASAGRANDE SpA Safety device for operating machines, particularly drilling machines or suchlike, and method to recognize the presence of persons, using such safety device
WO2007085330A1 (en) * 2006-01-30 2007-08-02 ABB AB A method and a system for supervising a work area including an industrial robot
DE102006012823B4 (en) * 2006-03-21 2016-12-22 Leuze electronic GmbH + Co. KG Device for monitoring a danger zone on a piece of work equipment
DE102006023787A1 (en) * 2006-05-20 2007-11-22 Sick Ag Opto-electronic protection device operating method, involves partially deactivating image signals of monitoring devices, proportional to respective viewing angle of camera at zones and combining with switching signal in evaluating unit
DE102006023787B4 (en) * 2006-05-20 2009-12-10 Sick Ag Optoelectronic protective device
DE102006057605A1 (en) * 2006-11-24 2008-06-05 Pilz Gmbh & Co. Kg Method and device for monitoring a three-dimensional space area
CN101542184B (en) * 2006-11-24 2012-03-21 皮尔茨公司 Method and apparatus for monitoring a three-dimensional spatial area
US8988527B2 (en) 2006-11-24 2015-03-24 Pilz GmbH & Co. KG Method and apparatus for monitoring a three-dimensional spatial area
WO2008061607A1 (en) 2006-11-24 2008-05-29 Pilz Gmbh & Co. Kg Method and apparatus for monitoring a three-dimensional spatial area
WO2009007159A1 (en) * 2007-07-11 2009-01-15 Robert Bosch GmbH Machine tool comprising a safety unit and a calibration unit for calibrating the safety unit
GB2479749A (en) * 2010-04-20 2011-10-26 Cementation Skanska Ltd Drill rig optical safety system
US10081107B2 (en) 2013-01-23 2018-09-25 Denso Wave Incorporated System and method for monitoring entry of object into surrounding area of robot
AT516140A1 (en) * 2014-08-11 2016-02-15 SCM Group SpA Method for safety control of a system and corresponding safety system

Also Published As

Publication number Publication date
US20020061134A1 (en) 2002-05-23
ATE382823T1 (en) 2008-01-15
EP1373785B1 (en) 2008-01-02
EP1373785A1 (en) 2004-01-02
JP2004535616A (en) 2004-11-25
US7200246B2 (en) 2007-04-03

Similar Documents

Publication number Publication date Title
EP1373785B1 (en) Object detection
US7167575B1 (en) Video safety detector with projected pattern
JP4859879B2 (en) Object detection device and gate device using it
US7768549B2 (en) Machine safety system with mutual exclusion zone
US6707486B1 (en) Directional motion estimator
EP1598792B1 (en) Infrared safety systems and methods
JP2003515811A (en) Video crisis management curtain
CA2275893C (en) Low false alarm rate video security system using object classification
KR101036947B1 (en) The automatic guard system to prevent the crime and accident using computer video image analysis technology
JP2010070299A (en) Protection device of escalator
KR20190046351A (en) Method and Apparatus for Detecting Intruder
CN112757300A (en) Robot protection system and method
JP4430027B2 (en) Image processing apparatus, surveillance camera, and image surveillance system
KR101014842B1 (en) Security image monitoring system and method using rfid reader
JP2010182021A (en) Image monitor system
JP4776324B2 (en) Monitoring terminal device
JPH09265585A (en) Monitoring and threatening device
KR101695127B1 (en) Group action analysis method by image
JPH10232985A (en) Indoor monitoring device
JP7176868B2 (en) monitoring device
JP2012103901A (en) Intrusion object detection device
KR101661759B1 (en) Apparatus and method for monitoring fall
JPH0397080A (en) Picture monitoring device
JPH1133975A (en) Robot monitoring device
JP4038878B2 (en) Fire detection device using image processing

Legal Events

Date Code Title Description
AK Designated states
Kind code of ref document: A1
Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents
Kind code of ref document: A1
Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: the EPO has been informed by WIPO that EP was designated in this application

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)

WWE WIPO information: entry into national phase
Ref document number: 2002572310
Country of ref document: JP

WWE WIPO information: entry into national phase
Ref document number: 2002723404
Country of ref document: EP

WWP WIPO information: published in national office
Ref document number: 2002723404
Country of ref document: EP

REG Reference to national code
Ref country code: DE
Ref legal event code: 8642

WWG WIPO information: grant in national office
Ref document number: 2002723404
Country of ref document: EP