Publication number: US20020008758 A1
Publication type: Application
Application number: US 09/801,441
Publication date: Jan 24, 2002
Filing date: Mar 7, 2001
Priority date: Mar 10, 2000
Also published as: US6867799, US7307652, US20010035907, US20020030741, WO2001069930A1, WO2001069931A1, WO2001069932A1
Inventors: Raymond Broemmelsiek, Mark Scott
Original Assignee: Broemmelsiek Raymond M., Scott Mark A.
Method and apparatus for video surveillance with defined zones
US 20020008758 A1
Abstract
In the field of motion video object tracking, environmental noise and other such sources of movement can lead to false tracking and tracking of unimportant objects. The present invention uses user-defined exclusion regions within a motion video camera's field of view to ignore the origination of object track data. Therefore, the user defines in advance those regions that may generate false alarms within a motion video camera's field of view. The use of user-defined zones in conjunction with object detection and tracking techniques provides a video surveillance system that has greater object tracking accuracy than existing systems.
Claims (27)
1. A method for defining a control zone in a field of view of a motion video camera, said method comprising the steps of:
displaying motion video data representative of the field of view of the motion video camera;
receiving indication of a control zone type; and
receiving indication of a control zone size within the field of view of the motion video camera.
2. The method according to claim 1 further including the step of displaying graphics representative of the control zone size in association with the field of view and the motion video data.
3. The method according to claim 1 wherein the control zone type is selected from the group consisting of tracking, black-out, exclusion, entry and privacy.
4. A video surveillance method executed according to the control zone defined in claim 1, said method comprising the steps of:
(a) detecting movement in a field of view of the motion video camera;
(b) determining if a moving object is in a tracking origination zone;
(c) defining the moving object if the moving object is in a tracking origination zone; and
(d) tracking the defined moving object.
5. The video surveillance method according to claim 4 further comprising the steps of:
(e) determining if the defined moving object has entered a new control zone type;
(f) determining if the new control zone type is a tracking continuation zone; and
(g) repeating steps (d) to (f) if the new control zone type is a tracking continuation zone.
6. The video surveillance method according to claim 4 further comprising the steps of:
(e) determining if the defined moving object has entered a new control zone type;
(f) determining if the new control zone type is a tracking continuation zone; and
(g) ceasing tracking of the defined moving object if the new control zone type is not a tracking continuation zone.
7. The video surveillance method according to claim 4 wherein the control zone type is selected from the group consisting of tracking, black-out, exclusion, entry and privacy.
8. The video surveillance method according to claim 7 wherein a tracking zone is a tracking origination zone and a tracking continuation zone and defines a region in which motion is monitored in the field of view of the motion video camera.
9. The video surveillance method according to claim 7 wherein a privacy zone only monitors movement.
10. The video surveillance method according to claim 7 wherein an exclusion zone is a tracking continuation zone.
11. The video surveillance method according to claim 7 wherein an entry zone is a tracking origination zone and a tracking continuation zone.
12. The video surveillance method according to claim 7 wherein a black-out zone is not monitored for movement.
13. A system for defining control zones of different types in a field of view of a motion video camera, said system comprising:
a database containing a description for each of a plurality of control zone types;
means for defining a control zone in a selected area of the field of view of the motion video camera, said control zone being of a type selected from one of said plurality of control zone types in said database; and
means for displaying a received motion video signal from the motion video camera including an indication of said defined control zone.
14. The system according to claim 13 wherein said means for displaying includes means for providing a graphical representation of a size of said selected area of the field of view with the received motion video signal.
15. A video surveillance system using a motion video camera having control zones in the field of view thereof as defined in claim 13, said system comprising:
means for detecting movement in the field of view of the motion video camera;
means for determining a current control zone of the moving object;
means for defining the moving object dependent on the current control zone of the moving object; and
means for performing a tracking operation on the defined moving object dependent on a control zone type of the current control zone.
16. A computer readable medium having stored thereon computer-executable instructions for defining a control zone in a field of view of a motion video camera performing the steps comprising:
displaying motion video data representative of the field of view of the motion video camera;
receiving indication of a control zone type; and
receiving indication of a control zone size within the field of view of the motion video camera.
17. The computer readable medium according to claim 16 further including the step of displaying graphics representative of the control zone size in association with the motion video data.
18. The computer readable medium according to claim 16 wherein the control zone type is selected from the group consisting of tracking, black-out, exclusion, entry and privacy.
19. A computer readable medium having stored thereon computer-executable instructions for executing motion video camera surveillance according to the control zone defined in claim 16 performing the steps comprising:
(a) detecting movement in a field of view of a motion video camera;
(b) determining if a moving object is in a tracking origination zone;
(c) defining the moving object if the moving object is in a tracking origination zone; and
(d) tracking the defined moving object.
20. The computer readable medium according to claim 19 further comprising the steps of:
(e) determining if the defined moving object has entered a new control zone type;
(f) determining if the new control zone type is a tracking continuation zone; and
(g) repeating steps (d) to (f) if the new control zone type is a tracking continuation zone.
21. The computer readable medium according to claim 19 further comprising the steps of:
(e) determining if the defined moving object has entered a new control zone type;
(f) determining if the new control zone type is a tracking continuation zone; and
(g) ceasing tracking of the defined moving object if the new control zone type is not a tracking continuation zone.
22. The computer readable medium according to claim 19 wherein the control zone type is selected from the group consisting of tracking, black-out, exclusion, entry and privacy.
23. The computer readable medium according to claim 22 wherein a tracking zone is a tracking origination zone and a tracking continuation zone and defines a region in which motion is monitored in the field of view of the motion video camera.
24. The computer readable medium according to claim 22 wherein a privacy zone only monitors movement.
25. The computer readable medium according to claim 22 wherein an exclusion zone is a tracking continuation zone.
26. The computer readable medium according to claim 22 wherein an entry zone is a tracking origination zone and a tracking continuation zone.
27. The computer readable medium according to claim 22 wherein a black-out zone is not monitored for movement.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of United States provisional patent application Ser. No. 60/188,171 filed on Mar. 10, 2000. United States Patent applications, also claiming the benefit of U.S. Provisional application Ser. No. 60/188,171, and entitled “Method and Apparatus for Object Tracking and Detection” and “Method and Apparatus for Object Surveillance with a Movable Camera” were filed concurrently herewith.

FIELD OF THE INVENTION

[0002] The present invention relates to the field of video surveillance systems using motion video cameras.

BACKGROUND OF THE INVENTION

[0003] There are several shortcomings in current video surveillance systems that must be overcome before relevant video data can be automatically detected and collected in response to scene stimulus without a human operator present. A video camera viewing a scene generates a vast amount of data, creating a data reduction problem. Automatically detecting and accurately and reliably collecting image information of a moving object using a motion video camera is a difficult task. This task is made even more difficult when trying to detect, track and maintain camera line-of-sight using a single motion video camera without requiring human intervention.

[0004] U.S. Pat. No. 5,473,369 (Abe) describes the use of a camera to detect and track a moving object without using conventional block matching. In the system described in Abe, single object tracking is performed only after an object is placed within a frame on a screen; however, there is no user input device for manual target selection. Moreover, Abe does not provide for camera movement to maintain line-of-sight.

[0005] Other prior art solutions provide for image stabilization for a camera in arbitrary motion without object tracking functionality. U.S. Pat. No. 5,629,988 (Burt) teaches electronic stabilization of a sequence of images with respect to one another but provides no tracking facility.

[0006] Still other prior art solutions control camera movement to maintain line-of-sight between camera and object but lack arbitrary motion compensation or do not provide for automatic and user selected object tracking. U.S. Pat. No. 5,434,621 (Yu) teaches a method for automatic zooming and automatic tracking of an object using a zoom lens but does not provide for reorienting the camera's line-of-sight.

SUMMARY OF THE INVENTION

[0007] It is an object of the present invention to provide a video surveillance method and system having increased accuracy.

[0008] It is an object of the present invention to provide a method and system for reducing false alarms in motion video object detection and surveillance.

[0009] It is an object of the present invention to provide a method and system for identifying user-defined zones to control motion video tracking.

[0010] In accordance with one aspect of the present invention there is provided a method for defining a control zone in a field of view of a motion video camera, said method comprising the steps of: displaying motion video data representative of the field of view of the motion video camera; receiving indication of a control zone type; and receiving indication of a control zone size within the field of view of the motion video camera.

[0011] In accordance with another aspect of the present invention there is provided a system for defining control zones of different types in a field of view of a motion video camera, said system comprising: a database containing a description for each of a plurality of control zone types; means for defining a control zone in a selected area of the field of view of the motion video camera, said control zone being of a type selected from one of said plurality of control zone types in said database; and means for displaying a received motion video signal from the motion video camera including an indication of said defined control zone.

[0012] In accordance with a further aspect of the present invention there is provided a computer readable medium having stored thereon computer-executable instructions for defining a control zone in a field of view of a motion video camera performing the steps comprising: displaying motion video data representative of the field of view of the motion video camera; receiving indication of a control zone type; and receiving indication of a control zone size within the field of view of the motion video camera.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] FIG. 1 is an exemplary configuration of a video surveillance system having user-defined zones according to an embodiment of the present invention;

[0014] FIG. 2 shows an exemplary field of view for a camera highlighting user-defined zones according to the present invention;

[0015] FIG. 3 is a system diagram of a zone defining processing system of the video surveillance system of FIG. 1 according to an embodiment of the present invention;

[0016] FIG. 4 is a flow chart illustrating a method of identifying user-defined zones according to the present invention; and

[0017] FIG. 5 is a flow chart illustrating a general method for video surveillance using the user-defined zones identified by the method in FIG. 4.

DETAILED DESCRIPTION

[0018] Object tracks, derived from motion video data, are a very useful way to automatically detect alarm conditions within a camera's field of view for purposes such as security and surveillance. However, in the field of motion video object tracking, environmental noise and other such sources of movement can lead to false tracking and tracking of unimportant objects such as leaves falling from a tree.

[0019] The present invention uses user-defined regions within a motion video camera's field of view to ignore the origination of object track data but allow tracking of objects already being tracked before they entered that region. Therefore, the user defines in advance those regions that may generate false alarms within a motion video camera's field of view. The use of user-defined zones in conjunction with object detection and tracking techniques provides a video surveillance system that has greater object tracking accuracy than existing systems.

[0020] FIG. 1 illustrates a video surveillance system 100 according to an embodiment of the present invention. A motion video camera 108 has a field of view 118. A computer 102 receives and processes a video signal 112 from the motion video camera 108 and performs object tracking and detection to determine if there was movement in the field of view 118. The computer 102 contains a zone defining processing system (FIG. 3) for defining surveillance zones in the camera's 108 field of view.

[0021] For a moveable motion video camera 108, the computer 102 generates a control signal 114 for servo controlled pan-tilt-zoom assembly 110. The control signal 114 is based on the current position of the servo controlled pan-tilt-zoom assembly 110 and information contained in the video signal 112. Such movement allows the motion video camera 108 to capture an object of interest in greater detail or improve the camera's 108 line of sight with the object of interest. The object detection and tracking techniques that are used can be, for example, those taught in Applicant's related applications entitled “Method and Apparatus for Object Tracking and Detection” and “Method and Apparatus for Remote Object Tracking and Detection”, filed concurrently herewith, both of which are incorporated herein by reference.
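
The patent does not give the control law itself; the following is a minimal sketch of how a pan-tilt control signal 114 might be derived from the position of a tracked object in the image. The frame dimensions, gain constants and function name are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch only: derive a pan/tilt adjustment that re-centers a
# tracked object in the frame. Frame size, gains and the function name are
# assumptions made for illustration; they are not taken from the patent.
FRAME_W, FRAME_H = 640, 480          # assumed image dimensions (pixels)
PAN_GAIN, TILT_GAIN = 0.05, 0.05     # assumed degrees of movement per pixel of error

def centering_command(obj_x, obj_y):
    """Return (pan_delta, tilt_delta) in degrees that move the camera so the
    object at pixel (obj_x, obj_y) drifts toward the center of the frame."""
    err_x = obj_x - FRAME_W / 2      # positive: object is right of center
    err_y = obj_y - FRAME_H / 2      # positive: object is below center
    return PAN_GAIN * err_x, -TILT_GAIN * err_y

# Example: an object detected at (500, 120) yields a pan-right, tilt-up command.
print(centering_command(500, 120))
```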

[0022] Object detection may be accomplished using any number of methods for image segmentation known in the art. For example, motion detection may be performed by frame differencing sequential pairs of video frames and applying thresholding techniques, thereby yielding pixels within the processed image that reflect motion of objects within the field of view of the camera 108. Additional image processing techniques such as centroid analysis may then be applied to remove spurious motion. Kalman filtering may be applied over time to further remove random motion and to estimate motion of objects for the purpose of anticipating camera 108 repositioning and maintaining tracking when moving objects are temporarily occluded by stationary ones. Object tracking and detection is discussed in greater detail in applicant's co-pending related applications entitled “Method and Apparatus for Object Tracking and Detection” and “Method and Apparatus for Object Surveillance with a Movable Camera” filed concurrently herewith and incorporated herein by reference.
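
As a concrete illustration of the frame-differencing and thresholding step described above, the sketch below uses OpenCV. The threshold value, dilation step and minimum blob area are assumed parameters chosen for illustration; this is not the patent's implementation.

```python
import cv2

def detect_motion(prev_frame, curr_frame, thresh=25, min_area=200):
    """Frame-difference two consecutive BGR frames and return centroids of
    regions large enough to suggest genuine object motion. Threshold and
    minimum area are illustrative values only."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)                 # pixel-wise change
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)              # join fragmented pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,  # OpenCV 4.x signature
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:                    # discard spurious motion
            continue
        m = cv2.moments(c)
        centroids.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centroids
```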

[0023] For a fixed motion video camera 108 (not shown) not having a servo controlled pan-tilt-zoom assembly 110, the computer 102 uses the video signal and the current magnification of the camera 108 to create a control signal similar to the control signal 114 of the moveable camera. However, the control signal for a fixed camera only uses a zoom function already on the camera 108 to capture the object of interest in greater detail.

[0024] The video signal 112 received from the video camera 108 is passed out from the computer 102 as video signal out 106 either directly to a display 104 or modified to include graphic information that may be used to set up response parameters of a tracking program, indicate an object that is actively being tracked, or identify (or allow identification of) user-defined zones for tracking. A pointing device 116 may be a mouse or trackball and is the user input for modifying said response parameters or defining the tracking zones. The pointing device 116 may also be used by the user to select an object that appears within the field of view 118 such that the tracking program residing on the computer 102 acknowledges the user's selection and initiates tracking of the selected object.

[0025] FIG. 2 is an illustration of an exemplary field of view 200 for a camera highlighting user-defined zones according to the present invention. In this illustration, the environment received by the motion video camera 108 within its field of view 200 is shown. For the purpose of illustration, the environment contains a house 220, a tree 230, a walkway 240, and a pond 250. There are five different types of zones 202, 204, 206, 208, and 210 that control tracking behavior and subsequent video output. These zones are defined as follows:

Zone: Tracking Zone
Definition: Sets the overall surveillance region, defining the pan and tilt limits of the camera. The tracker does not track, nor does it move the camera, beyond the tracking zone.
Application: Include only the region in which tracking is required.

Zone: Black-out Zone
Definition: Sets regions within the tracking zone in which the tracker will not track. The tracker does not track any target that moves into a black-out zone, nor does it originate a target within a black-out zone.
Application: Reflections, high traffic areas, machinery, other unwanted distractions.

Zone: Exclusion Zone
Definition: Sets regions within the tracking zone in which the tracker will not originate a new track but that tracked objects may enter and exit. The tracker maintains a track that enters an exclusion zone.
Application: Trees, machinery, water and other stationary reflective surfaces.

Zone: Entry Zone
Definition: Sets regions within the tracking zone in which the tracker will automatically originate a new track and follow. When the tracker is programmed to search only the entry zone, this is the only region in which tracks can originate.
Application: Watch doors, windows, vehicles and other assets.

Zone: Privacy Zone
Definition: Sets regions in the tracking zone that are not viewable by the operator. The tracker maintains tracking of objects passing into and out of privacy zones.
Application: Areas within the tracker's field of view that require privacy from the camera, such as office windows and homes.

[0026] The tracking zone 208 is shown on the display encompassing the house 220, the tree 230, the walkway 240, and the pond 250. Only for the purpose of illustration is the tracking zone 208 contained within a single field of view 118. For embodiments using a moveable camera 108, the tracking zone 208 may span several fields of view. An exclusion zone 204 is shown encompassing that portion of the pond 250 that is within the tracking zone 208. A black-out zone 202 is shown encompassing the leafy portion of the tree 230, which may generate tracking events when the leaves move in the wind. Since, in this field of view 200, it is not anticipated that tracked objects will enter this region, the entire region receives a black-out zone 202 rather than an exclusion zone 204. An entry zone 206 is shown encompassing the door of the house 220. A privacy zone 210 is shown encompassing the window of the house 220.
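
One plausible way to encode the zone rules summarized in the table above is as a small set of flags per zone type. The sketch below is an assumed encoding, not the zone type database of the disclosure; in particular, whether tracks may originate in a privacy zone is an assumption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ZoneType:
    """Tracking rules for one zone type (flag encoding is an assumption)."""
    name: str
    originate_tracks: bool   # may a new track start in this zone?
    continue_tracks: bool    # may an existing track be followed through this zone?
    viewable: bool           # is the region displayed to the operator?

# Flags derived from the table above; privacy-zone origination is assumed False.
ZONE_TYPES = {
    "tracking":  ZoneType("tracking",  True,  True,  True),
    "black-out": ZoneType("black-out", False, False, True),
    "exclusion": ZoneType("exclusion", False, True,  True),
    "entry":     ZoneType("entry",     True,  True,  True),
    "privacy":   ZoneType("privacy",   False, True,  False),
}
```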

[0027] FIG. 3 shows a zone defining processing system 120 for defining surveillance zones in the camera's 108 field of view according to an embodiment of the present invention. The zone defining processing system 120 has a field of view (FOV) area definer 132 that connects, through interfaces, to various devices (i.e., camera 108, input device 116, and display 104), to a zone type database 128 and to a tracking/monitoring controller 130. A camera interface 122 receives the video signal in from the motion video camera 108 and passes this signal to the FOV area definer 132. The FOV area definer 132 can pass this signal directly to a display interface 126 to be shown on the display 104, or the signal may be modified to include a graphic overlay containing information on zones within the field of view.

[0028] The FOV area definer 132 may send a zone type selection menu to the display interface 126 to prompt a user to select a zone type using the input device 116. The zone type database 128 contains definitions for the different types of zones as well as corresponding actions for each zone type (e.g., do not initiate tracking of objects but continue following previous tracks). Indication of a selected zone type is received at an input device interface 124.

[0029] After a zone type has been selected the FOV area definer 132 provides the display interface 126 with a graphic overlay for the field of view to assist the user in drawing a zone of the selected type. After an indication of a drawn zone has been received through the input device interface 124, the FOV area definer 132 provides the display interface 126 with a graphic overlay indicating the defined zone.

[0030] The tracking/monitoring controller 130 tracks and monitors defined objects within the field of view. Given a field of view, the tracking/monitoring controller 130 consults the FOV area definer 132 to determine a mapping between defined zones and the field of view. The FOV area definer 132 provides the tracking/monitoring controller 130 with the mapping between defined zones and the field of view as well as a definition of the zone types and the corresponding actions for each zone type.
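
A minimal sketch of the mapping lookup such a controller might perform is given below. Representing zones as axis-aligned rectangles in image coordinates, and giving the last-defined zone precedence, are both assumptions made for illustration.

```python
# Assumed representation: each defined zone is an axis-aligned rectangle in
# image coordinates; the most recently defined zone containing a point wins.
zones = [
    # (zone_type, x0, y0, x1, y1) -- example values only
    ("tracking",  0,   0,   640, 480),
    ("exclusion", 400, 300, 560, 420),   # e.g. the pond
    ("black-out", 60,  40,  200, 200),   # e.g. the tree canopy
]

def zone_at(x, y):
    """Return the type of the last-defined zone containing (x, y), or None
    if the point lies outside every defined zone."""
    hit = None
    for zone_type, x0, y0, x1, y1 in zones:
        if x0 <= x <= x1 and y0 <= y <= y1:
            hit = zone_type
    return hit
```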

[0031] FIG. 4 is a flow chart illustrating a method 300 of identifying user-defined zones according to the present invention. All of these zones are shown on the display 104 as graphic overlays to the video signal in 112 received from the camera 108 in step 302. An input device such as a mouse 116 is provided for selecting a zone type in step 304. The type of zone may be selected, for example, from a menu listing all zone types or the applicable zone types given other settings. These zones may be selected and resized in step 306 by a user with a mouse 116 or similar pointing device by holding down a mouse button and moving the mouse 116 to resize a rectangular region over some portion of the display's image. If the extent of a rectangular region exceeds a single field of view, the mouse 116 is moved to any of the four edges or corners of the display, which causes the tracking program residing on the computer 102 to send a control signal 114 to the pan-tilt-zoom assembly 110, altering the field of view. The mouse button is released when the desired region is encompassed by a rectangle. After the size of the zone has been drawn in step 306, the method 300 may be repeated for multiple instances of zones of a single type and, as applicable, for other types of zones. Multiple overlapping and non-overlapping shapes may be used to define more complex zones. Once these regions are defined, the tracker performs in its environment according to the rules that are defined for each type of region.
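
The rectangle-drag interaction described above could be prototyped with an ordinary mouse callback, for example as sketched below with OpenCV. The window name, callback structure and single-rectangle flow are assumptions, and the sketch omits the camera repositioning triggered at the display edges.

```python
import cv2

# Illustrative zone-drawing sketch: press to anchor one corner, release to
# set the opposite corner. Names and flow are assumptions, not the patent's UI.
drawing = {"start": None, "rect": None}

def on_mouse(event, x, y, flags, param):
    if event == cv2.EVENT_LBUTTONDOWN:                  # button pressed
        drawing["start"] = (x, y)
    elif event == cv2.EVENT_LBUTTONUP and drawing["start"]:
        x0, y0 = drawing["start"]                       # button released
        drawing["rect"] = (min(x0, x), min(y0, y), max(x0, x), max(y0, y))
        drawing["start"] = None

cv2.namedWindow("field of view")
cv2.setMouseCallback("field of view", on_mouse)
```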

[0032] FIG. 5 is a flow chart illustrating a general method 400 for video surveillance using the user-defined zones identified by the method 300 in FIG. 4. The video surveillance system continuously monitors a tracking zone (except for any black-out zones) in the field of view for movement 402. When movement is detected 404, subsequent video frames are compared 406 to determine in which zone the movement occurred 408. If the movement is not in a black-out zone or exclusion zone, the object is isolated 410 and tracking starts 412. If the movement is in a black-out zone or exclusion zone, tracking is not initiated and the video surveillance system continues to monitor the tracking zone for movement 402. While an object is being tracked, if it leaves the current field of view and enters a new field of view 414, the camera 108 may be moved 418 to center the tracked object in the camera's 108 field of view. Movement of the camera 108 to maintain a moving object in the field of view is discussed in more detail in applicant's co-pending related application entitled “Method and Apparatus for Object Surveillance with a Movable Camera” filed concurrently herewith and incorporated herein by reference. If the object enters a black-out zone or exits the tracking zone 416, then tracking stops and the system continues to monitor the tracking zone for movement 402. If the object has not entered a black-out zone or exited the tracking zone, then the object continues to be tracked.
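
Putting the pieces together, the following control-flow sketch mirrors one pass of the FIG. 5 loop. It reuses the illustrative detect_motion(), zone_at() and ZONE_TYPES helpers sketched earlier, and the track representation is an assumption, not the patent's data structure.

```python
def surveillance_step(prev_frame, curr_frame, active_tracks):
    """One pass of the FIG. 5 loop (sketch only). Relies on the illustrative
    helpers detect_motion(), zone_at() and ZONE_TYPES defined earlier."""
    # Steps 404-412: look for new movement and originate tracks where the
    # zone rules permit origination (tracking and entry zones).
    for (x, y) in detect_motion(prev_frame, curr_frame):
        rules = ZONE_TYPES.get(zone_at(x, y))
        if rules is not None and rules.originate_tracks:
            active_tracks.append({"pos": (x, y)})
    # Steps 414-416: drop tracks that have left the tracking zone or entered
    # a zone that does not permit continued tracking (e.g. a black-out zone).
    for track in list(active_tracks):
        rules = ZONE_TYPES.get(zone_at(*track["pos"]))
        if rules is None or not rules.continue_tracks:
            active_tracks.remove(track)
    return active_tracks
```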

[0033] If movement is not detected in step 404 then the camera 108 is moved to the next field of view 403 to continue searching for movement 402.

[0034] It is apparent to one skilled in the art that numerous modifications and departures from the specific embodiments described herein may be made without departing from the spirit and scope of the invention.

Classifications
U.S. Classification: 348/143, 348/E07.09, 348/155
International Classification: G01S11/12, G01S5/16, H04N7/18, G01S3/786
Cooperative Classification: G01S11/12, G01S5/16, H04N7/188, G08B13/19686, G01S3/7864
European Classification: G08B13/196U4, G01S11/12, G01S5/16, H04N7/18E, G01S3/786C
Legal Events
Date: Jun 11, 2002; Code: AS; Event: Assignment
Owner name: SENSORMATIC ELECTRONICS CORPORATION, FLORIDA
Free format text: MERGER/CHANGE OF NAME; ASSIGNOR: SENSORMATIC ELECTRONICS CORPORATION; REEL/FRAME: 012991/0641
Effective date: 20011113

Date: Jul 17, 2001; Code: AS; Event: Assignment
Owner name: SENSORMATIC ELECTRONICS CORPORATION, FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: BROEMMELSIEK, RAYMOND M.; SCOTT, MARK A.; REEL/FRAME: 012021/0957
Effective date: 20010307