|Publication number||US20020008758 A1|
|Application number||US 09/801,441|
|Publication date||Jan 24, 2002|
|Filing date||Mar 7, 2001|
|Priority date||Mar 10, 2000|
|Also published as||US6867799, US7307652, US20010035907, US20020030741, WO2001069930A1, WO2001069931A1, WO2001069932A1|
|Inventors||Raymond Broemmelsiek, Mark Scott|
|Original Assignee||Broemmelsiek Raymond M., Scott Mark A.|
 This application claims the benefit of United States provisional patent application Ser. No. 60/188,171, filed on Mar. 10, 2000. United States patent applications also claiming the benefit of U.S. Provisional application Ser. No. 60/188,171, entitled “Method and Apparatus for Object Tracking and Detection” and “Method and Apparatus for Object Surveillance with a Movable Camera”, were filed concurrently herewith.
 The present invention relates to the field of video surveillance systems using motion video cameras.
 There are several shortcomings in current video surveillance systems that must be overcome before automatic detection and collection of relevant video data in response to scene stimulus, without a human operator present, can see widespread use. Viewing a scene from a video camera generates a vast amount of data, which produces a data reduction problem. Automatically, accurately and reliably detecting and collecting image information of a moving object using a motion video camera is a difficult task. This task is made even more difficult when trying to detect, track and maintain camera line-of-sight using a single motion video camera without requiring human intervention.
 U.S. Pat. No. 5,473,369 (Abe) describes the use of a camera to detect and track a moving object without using conventional block matching. In the system described in Abe, single object tracking is performed only after an object is placed within a frame on a screen; however, there is no user input device for manual target selection. Moreover, Abe does not provide for camera movement to maintain line-of-sight.
 Other prior art solutions provide for image stabilization for a camera in arbitrary motion without object tracking functionality. U.S. Pat. No. 5,629,988 (Burt) teaches electronic stabilization of a sequence of images with respect to one another but provides no tracking facility.
 Still other prior art solutions control camera movement to maintain line-of-sight between camera and object but lack arbitrary motion compensation or do not provide for automatic and user selected object tracking. U.S. Pat. No. 5,434,621 (Yu) teaches a method for automatic zooming and automatic tracking of an object using a zoom lens but does not provide for reorienting the camera's line-of-sight.
 It is an object of the present invention to provide a video surveillance method and system having increased accuracy.
 It is an object of the present invention to provide a method and system for reducing false alarms in motion video object detection and surveillance.
 It is an object of the present invention to provide a method and system for identifying user-defined zones to control motion video tracking.
 In accordance with one aspect of the present invention there is provided a method for defining a control zone in a field of view of a motion video camera, said method comprising the steps of: displaying motion video data representative of the field of view of the motion video camera; receiving indication of a control zone type; and receiving indication of a control zone size within the field of view of the motion video camera.
 In accordance with another aspect of the present invention there is provided a system for defining control zones of different types in a field of view of a motion video camera, said system comprising: a database containing a description for each of a plurality of control zone types; means for defining a control zone in a selected area of the field of view of the motion video camera, said control zone being of a type selected from one of said plurality of control zone types in said database; and means for displaying a received motion video signal from the motion video camera including an indication of said defined control zone.
 In accordance with a further aspect of the present invention there is provided a computer readable medium having stored thereon computer-executable instructions for defining a control zone in a field of view of a motion video camera performing the steps comprising: displaying motion video data representative of the field of view of the motion video camera; receiving indication of a control zone type; and receiving indication of a control zone size within the field of view of the motion video camera.
FIG. 1 is an exemplary configuration of a video surveillance system having user-defined zones according to an embodiment of the present invention;
FIG. 2 shows an exemplary field of view for a camera highlighting user-defined zones according to the present invention;
FIG. 3 is a system diagram of a zone defining processing system of the video surveillance system of FIG. 1 according to an embodiment of the present invention;
FIG. 4 is a flow chart illustrating a method of identifying user-defined zones according to the present invention; and
FIG. 5 is a flow chart illustrating a general method for video surveillance using the user-defined zones identified by the method in FIG. 4.
 Object tracks, derived from motion video data, are a very useful way to automatically detect alarm conditions within a camera's field of view for purposes such as security and surveillance. However, in the field of motion video object tracking, environmental noise and other such sources of movement can lead to false tracking and tracking of unimportant objects such as leaves falling from a tree.
 The present invention uses user-defined regions within a motion video camera's field of view to ignore the origination of object track data but allow tracking of objects already being tracked before they entered that region. Therefore, the user defines in advance those regions that may generate false alarms within a motion video camera's field of view. The use of user-defined zones in conjunction with object detection and tracking techniques provides a video surveillance system that has greater object tracking accuracy than existing systems.
FIG. 1 illustrates a video surveillance system 100 according to an embodiment of the present invention. A motion video camera 108 has a field of view 118. A computer 102 receives and processes a video signal 112 from the motion video camera 108 and performs object tracking and detection to determine if there was movement in the field of view 118. The computer 102 contains a zone defining processing system (FIG. 3) for defining surveillance zones in the camera's 108 field of view.
 For a moveable motion video camera 108, the computer 102 generates a control signal 114 for the servo controlled pan-tilt-zoom assembly 110. The control signal 114 is based on the current position of the servo controlled pan-tilt-zoom assembly 110 and information contained in the video signal 112. Such movement allows the motion video camera 108 to capture an object of interest in greater detail or improve the camera's 108 line of sight with the object of interest. The object detection and tracking techniques that are used can be, for example, those taught in Applicant's related applications entitled “Method and Apparatus for Object Tracking and Detection” and “Method and Apparatus for Object Surveillance with a Movable Camera”, filed concurrently herewith, both of which are incorporated herein by reference.
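The repositioning described above amounts to driving the pan-tilt assembly so the object's centroid approaches the center of the field of view. A minimal sketch follows; the function name, the proportional form of the control, and the pixels-to-degrees constant are assumptions for illustration, not the patent's actual control law.

```python
def pan_tilt_command(obj_x, obj_y, frame_w, frame_h, deg_per_px=0.05):
    """Compute (pan, tilt) adjustments in degrees that move the camera
    toward centering the tracked object's centroid in the frame."""
    # Pixel offset of the object's centroid from the frame center.
    dx = obj_x - frame_w / 2
    dy = obj_y - frame_h / 2
    # Convert pixel error to servo angles; deg_per_px is an assumed
    # calibration constant mapping pixels to degrees at the current zoom.
    return dx * deg_per_px, dy * deg_per_px
```

An object already centered yields a zero command, so the camera holds position until the object moves off-center.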
 Object detection may be accomplished using any number of methods for image segmentation known in the art. For example, motion detection may be performed by frame differencing sequential pairs of video frames and applying thresholding techniques, thereby yielding pixels within the processed image that reflect motion of objects within the field of view of the camera 108. Additional image processing techniques such as centroid analysis may then be applied to remove spurious motion. Kalman filtering may be applied over time to further remove random motion and to estimate the motion of objects for the purpose of anticipating camera 108 repositioning and maintaining tracking when moving objects are temporarily occluded by stationary ones. Object tracking and detection is discussed in greater detail in applicant's co-pending related applications entitled “Method and Apparatus for Object Tracking and Detection” and “Method and Apparatus for Object Surveillance with a Movable Camera”, filed concurrently herewith and incorporated herein by reference.
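The frame-differencing and centroid steps above can be sketched as follows. This is a simplified illustration on grayscale frames represented as 2-D lists; the function names and the threshold value are assumptions, and a practical system would use an image-processing library rather than pure Python.

```python
def motion_mask(prev_frame, curr_frame, threshold=25):
    """Frame-difference two grayscale frames (2-D lists of 0-255 values)
    and threshold the result: 1 marks a pixel whose intensity changed
    by more than `threshold`, i.e. candidate object motion."""
    return [
        [1 if abs(c - p) > threshold else 0
         for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

def centroid(mask):
    """Centroid (x, y) of the motion pixels; gives the tracker a single
    object position and helps reject scattered, spurious motion."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))
```

The centroid sequence over successive frames is what a Kalman filter would smooth to predict the object's position during brief occlusions.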
 For a fixed motion video camera 108 (not shown) not having a servo controlled pan-tilt-zoom assembly 110, the computer 102 uses the video signal and the current magnification of the camera 108 to create a control signal similar to the control signal 114 of the moveable camera. However, the control signal for a fixed camera only uses a zoom function already on the camera 108 to capture the object of interest in greater detail.
 The video signal 112 received from the video camera 108 is passed out from the computer 102 as video signal out 106, either directly to a display 104 or modified to include graphic information that may be used to set up response parameters of a tracking program, indicate an object that is actively being tracked, or identify (or allow identification of) user-defined zones for tracking. A pointing device 116, such as a mouse or trackball, is the user input for modifying said response parameters or defining the tracking zones. The pointing device 116 may also be used by the user to select an object that appears within the field of view 118 such that the tracking program residing on the computer 102 acknowledges the user's selection and initiates tracking of the selected object.
FIG. 2 is an illustration of an exemplary field of view 200 for a camera highlighting user-defined zones according to the present invention. In this illustration, the environment that the motion video camera 108 observes within its field of view 200 is shown. For the purpose of illustration, the environment contains a house 220, a tree 230, a walkway 240, and a pond 250. There are five different types of zones 202, 204, 206, 208, and 210 that control tracking behavior and subsequent video output. These zones are defined as follows:
|Zone||Definition||Application|
|Tracking Zone||Sets the overall surveillance region defining the pan and tilt limits of the camera. The tracker does not track, nor does it move the camera, beyond the tracking zone.||Include only the region in which tracking is required.|
|Black-out Zone||Sets regions within the tracking zone in which the tracker will not track. The tracker does not track any target that moves into a black-out zone, nor does it originate a target within a black-out zone.||Reflections, high-traffic areas, machinery and other unwanted distractions.|
|Exclusion Zone||Sets regions within the tracking zone in which the tracker will not originate a new track but that tracked objects may enter and exit. The tracker maintains a track that enters an exclusion zone.||Trees, machinery, water and other stationary reflective surfaces.|
|Entry Zone||Sets a region within the tracking zone in which the tracker will automatically originate a new track and follow. When the tracker is programmed to search only the entry zone, this is the only region in which tracks can originate.||Watch doors, windows, vehicles and other assets.|
|Privacy Zone||Sets regions in the tracking zone not viewable by the operator. The tracker maintains tracking passing into and out of privacy zones.||Areas within the tracker field of view but requiring privacy from the camera, such as office windows and homes.|
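The zone semantics above reduce to three questions per zone type: may a track originate there, is an existing track maintained there, and is the region shown to the operator. A hypothetical encoding is sketched below; the dictionary keys and rule-field names are illustrative, not from the patent.

```python
# Illustrative encoding of the five zone types and their tracking rules.
# "originate": a new track may start here; "maintain": an existing track
# entering here is kept; "display": the region is viewable by the operator.
ZONE_RULES = {
    "tracking":  {"originate": True,  "maintain": True,  "display": True},
    "black_out": {"originate": False, "maintain": False, "display": True},
    "exclusion": {"originate": False, "maintain": True,  "display": True},
    "entry":     {"originate": True,  "maintain": True,  "display": True},
    "privacy":   {"originate": True,  "maintain": True,  "display": False},
}

def may_originate(zone_type):
    """True if the tracker may start a new track in this zone type."""
    return ZONE_RULES[zone_type]["originate"]

def may_maintain(zone_type):
    """True if the tracker keeps a track that enters this zone type."""
    return ZONE_RULES[zone_type]["maintain"]
```

The exclusion/black-out distinction is visible directly in the flags: both forbid origination, but only the exclusion zone maintains a track that wanders in.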
 The tracking zone 208 is shown on the display encompassing the house 220, the tree 230, the walkway 240, and the pond 250. Only for the purpose of illustration is the tracking zone 208 contained within a single field of view 118. For embodiments using a moveable camera 108, the tracking zone 208 may span several fields of view. An exclusion zone 204 is shown encompassing that portion of the pond 250 that is within the tracking zone 208. A black-out zone 202 is shown encompassing the leafy portion of the tree 230, which may generate tracking events when the leaves move in the wind. Since in this field of view 200 it is not anticipated that tracks will enter the tree's perimeter, the entire region receives a black-out zone 202 rather than an exclusion zone 204. An entry zone 206 is shown encompassing the door of the house 220. A privacy zone 210 is shown encompassing the window of the house 220.
FIG. 3 shows a zone defining processing system 120 for defining surveillance zones in the camera's 108 field of view according to an embodiment of the present invention. The zone defining processing system 120 has a field of view (FOV) area definer 132 that connects to interfaces for various devices (i.e. the camera 108, the input device 116, and the display 104), a zone type database 128 and a tracking/monitoring controller 130. A camera interface 122 receives the video signal in from the motion video camera 108 and passes this signal to the FOV area definer 132. The FOV area definer 132 can pass this signal directly to a display interface 126 to be shown on the display 104, or the signal may be modified to include a graphic overlay containing information on zones within the field of view.
 The FOV area definer 132 may send a zone type selection menu to the display interface 126 to prompt a user to select a zone type using the input device 116. The zone type database 128 contains definitions for the different types of zones as well as corresponding actions for each zone type (e.g. do not initiate tracking of new objects but continue following previous tracks). Indication of a selected zone type is received at an input device interface 124.
 After a zone type has been selected the FOV area definer 132 provides the display interface 126 with a graphic overlay for the field of view to assist the user in drawing a zone of the selected type. After an indication of a drawn zone has been received through the input device interface 124, the FOV area definer 132 provides the display interface 126 with a graphic overlay indicating the defined zone.
 The tracking/monitoring controller 130 tracks and monitors defined objects within the field of view. Given a field of view, the tracking/monitoring controller 130 consults the FOV area definer 132 to determine a mapping between defined zones and the field of view. The FOV area definer 132 provides the tracking/monitoring controller 130 with the mapping between defined zones and the field of view, as well as a definition of the zone types and the corresponding actions for each zone type.
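One simple form this zone-to-field-of-view mapping could take is a point-in-rectangle lookup: given a pixel coordinate, return the type of zone covering it. The sketch below is an assumption about representation (axis-aligned rectangles, later zones taking precedence), not the patent's data structure.

```python
def zone_at(zones, x, y):
    """Return the type of the zone containing point (x, y), or None.
    `zones` is a list of (zone_type, x0, y0, x1, y1) rectangles; later
    entries (e.g. a black-out zone drawn inside the tracking zone)
    take precedence over earlier, enclosing ones."""
    hit = None
    for ztype, x0, y0, x1, y1 in zones:
        if x0 <= x <= x1 and y0 <= y <= y1:
            hit = ztype  # keep overwriting so the last match wins
    return hit
```

With this convention, the tracking/monitoring controller can query the definer once per detected centroid and then apply the per-zone rules.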
FIG. 4 is a flow chart illustrating a method 300 of identifying user-defined zones according to the present invention. All of these zones are shown on the display 104 as graphic overlays to the video signal in 112 received from the camera 108 in step 302. An input device such as a mouse 116 is provided for selecting a zone type in step 304. The type of zone may be selected, for example, from a menu listing all zone types, or only those zone types applicable given other settings. A zone may be sized in step 306 by the user with the mouse 116 or similar pointing device by holding down a mouse button and moving the mouse 116 to resize a rectangular region over some portion of the display's image. If the extent of a rectangular region exceeds a single field of view, the mouse 116 is moved to any of the four edges or corners of the display, which results in the tracking program residing on the computer 102 sending the control signal 114 to the pan-tilt-zoom assembly 110, thereby altering the field of view. The mouse button is released when the desired region is encompassed by the rectangle. After the size of the zone has been drawn in step 306, the method 300 may be repeated for multiple instances of zones of a single type, as applicable, as well as for other types of zones. Multiple overlapping and non-overlapping shapes may be used to define more complex zones. Once these regions are defined, the tracker performs in its environment according to the rules that are defined for each type of region.
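The press-drag-release interaction of step 306 ultimately produces a rectangle from two corner points, which must be normalized regardless of drag direction. A minimal sketch, with assumed function and parameter names:

```python
def drag_rectangle(press, release):
    """Normalize the mouse press and release points of a drag into an
    (x0, y0, x1, y1) rectangle, independent of drag direction (the user
    may drag up-left just as well as down-right)."""
    (px, py), (rx, ry) = press, release
    return (min(px, rx), min(py, ry), max(px, rx), max(py, ry))
```

Each normalized rectangle, paired with the zone type chosen in step 304, would then be appended to the list of defined zones for the field of view.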
FIG. 5 is a flow chart illustrating a general method 400 for video surveillance using the user-defined zones identified by the method 300 in FIG. 4. The video surveillance system continuously monitors a tracking zone (except for any black-out zones) in the field of view for movement 402. When movement is detected 404, subsequent video frames are compared 406 to determine in what zone the movement occurred 408. If the movement is not in a black-out zone or exclusion zone, the object is isolated 410 and tracking starts 412. If the movement is in a black-out zone or exclusion zone, tracking is not initiated and the video surveillance system continues to monitor the tracking zone for movement 402. While an object is being tracked, if it leaves the current field of view and enters a new field of view 414, the camera 108 may be moved 418 to center the tracked object in the camera's 108 field of view. Movement of the camera 108 to maintain a moving object in the field of view is discussed in more detail in applicant's co-pending related application entitled “Method and Apparatus for Object Surveillance with a Movable Camera”, filed concurrently herewith and incorporated herein by reference. If the object enters a black-out zone or exits the tracking zone 416, then tracking stops and the system continues to monitor the tracking zone for movement 402. If the object has not entered a black-out zone or exited the tracking zone, then the object continues to be tracked.
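The decision points of the method 400 can be condensed into a single dispatch function, sketched below under assumed zone-type names; the returned action strings are illustrative labels for the flow-chart branches, with `None` standing for "outside the tracking zone".

```python
def handle_motion(zone_type, tracking_active):
    """Decide the tracker's next action given the zone containing the
    motion (or the tracked object) and whether a track is in progress."""
    if not tracking_active:
        # Origination branch (steps 404-412): do not start a track in a
        # black-out or exclusion zone; keep monitoring instead.
        if zone_type in ("black_out", "exclusion"):
            return "keep_monitoring"
        return "start_tracking"
    # Termination branch (step 416): stop when the object enters a
    # black-out zone or leaves the tracking zone entirely (None).
    if zone_type == "black_out" or zone_type is None:
        return "stop_tracking"
    return "continue_tracking"
```

Note the asymmetry that gives the exclusion zone its purpose: it blocks origination but an already-tracked object passing through it continues to be tracked.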
 If movement is not detected in step 404 then the camera 108 is moved to the next field of view 403 to continue searching for movement 402.
 It is apparent to one skilled in the art that numerous modifications and departures from the specific embodiments described herein may be made without departing from the spirit and scope of the invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5329368 *||Jan 11, 1993||Jul 12, 1994||Hughes Aircraft Company||Image tracking system and technique|
|US5387768 *||Sep 27, 1993||Feb 7, 1995||Otis Elevator Company||Elevator passenger detector and door control system which masks portions of a hall image to determine motion and count passengers|
|US5434621 *||Oct 12, 1993||Jul 18, 1995||Samsung Electronics Co., Ltd.||Object tracking method for automatic zooming and the apparatus therefor|
|US5467402 *||Mar 14, 1994||Nov 14, 1995||Hitachi, Ltd.||Distributed image recognizing system and traffic flow instrumentation system and crime/disaster preventing system using such image recognizing system|
|US5473369 *||Feb 23, 1994||Dec 5, 1995||Sony Corporation||Object tracking apparatus|
|US5552823 *||Jul 17, 1995||Sep 3, 1996||Sony Corporation||Picture processing apparatus with object tracking|
|US5629988 *||Oct 28, 1994||May 13, 1997||David Sarnoff Research Center, Inc.||System and method for electronic image stabilization|
|US5754225 *||Sep 26, 1996||May 19, 1998||Sony Corporation||Video camera system and automatic tracking method therefor|
|US5798787 *||Aug 8, 1996||Aug 25, 1998||Kabushiki Kaisha Toshiba||Method and apparatus for detecting an approaching object within a monitoring zone|
|US6055014 *||Jun 23, 1997||Apr 25, 2000||Sony Corporation||Control apparatus and control method|
|US6408027 *||Dec 12, 2000||Jun 18, 2002||Matsushita Electric Industrial Co., Ltd.||Apparatus and method for coding moving picture|
|US6509926 *||Feb 17, 2000||Jan 21, 2003||Sensormatic Electronics Corporation||Surveillance apparatus for camera surveillance system|
|US6727938 *||Apr 14, 1997||Apr 27, 2004||Robert Bosch Gmbh||Security system with maskable motion detection and camera with an adjustable field of view|
|US20020054210 *||May 10, 2001||May 9, 2002||Nestor Traffic Systems, Inc.||Method and apparatus for traffic light violation prediction and control|
|US20020104094 *||Dec 3, 2001||Aug 1, 2002||Bruce Alexander||System and method for processing video data utilizing motion detection and subdivided video fields|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6970083 *||Nov 12, 2003||Nov 29, 2005||Objectvideo, Inc.||Video tripwire|
|US7082209 *||Aug 21, 2001||Jul 25, 2006||Hitachi Kokusai Electric, Inc.||Object detecting method and object detecting apparatus and intruding object monitoring apparatus employing the object detecting method|
|US7113616 *||Dec 4, 2002||Sep 26, 2006||Hitachi Kokusai Electric Inc.||Object tracking method and apparatus using template matching|
|US7283161 *||Jan 13, 2004||Oct 16, 2007||Canon Kabushiki Kaisha||Image-taking apparatus capable of distributing taken images over network|
|US7310442||Jul 2, 2003||Dec 18, 2007||Lockheed Martin Corporation||Scene analysis surveillance system|
|US7382400||Feb 19, 2004||Jun 3, 2008||Robert Bosch Gmbh||Image stabilization system and method for a video camera|
|US7428314 *||Dec 2, 2004||Sep 23, 2008||Safehouse International Inc.||Monitoring an environment|
|US7447331||Feb 24, 2004||Nov 4, 2008||International Business Machines Corporation||System and method for generating a viewable video index for low bandwidth applications|
|US7583416 *||Dec 20, 2001||Sep 1, 2009||Eastman Kodak Company||Document scanning system with tethered platen element providing sheet-fed and platen scanning functions|
|US7596240 *||Jul 22, 2004||Sep 29, 2009||Hitachi Kokusai Electric, Inc.||Object tracking method and object tracking apparatus|
|US7643066||Dec 6, 2005||Jan 5, 2010||Robert Bosch Gmbh||Method and apparatus for producing frame accurate position data in a PTZ dome camera with open loop control|
|US7742077||Aug 9, 2005||Jun 22, 2010||Robert Bosch Gmbh||Image stabilization system and method for a video camera|
|US7868912||Apr 5, 2005||Jan 11, 2011||Objectvideo, Inc.||Video surveillance system employing video primitives|
|US7932923||Sep 29, 2009||Apr 26, 2011||Objectvideo, Inc.||Video surveillance system employing video primitives|
|US7940432 *||Oct 18, 2005||May 10, 2011||Avermedia Information, Inc.||Surveillance system having a multi-area motion detection function|
|US8107680||Aug 19, 2008||Jan 31, 2012||Lighthaus Logic Inc.||Monitoring an environment|
|US8154578||May 31, 2007||Apr 10, 2012||Eastman Kodak Company||Multi-camera residential communication system|
|US8154583||May 31, 2007||Apr 10, 2012||Eastman Kodak Company||Eye gazing imaging for video communications|
|US8159519 *||May 31, 2007||Apr 17, 2012||Eastman Kodak Company||Personal controls for personal video communications|
|US8169481||May 5, 2008||May 1, 2012||Panasonic Corporation||System architecture and process for assessing multi-perspective multi-context abnormal behavior|
|US8212872||Jun 2, 2004||Jul 3, 2012||Robert Bosch Gmbh||Transformable privacy mask for video camera images|
|US8253770||May 31, 2007||Aug 28, 2012||Eastman Kodak Company||Residential video communication system|
|US8311275 *||Jun 10, 2008||Nov 13, 2012||Mindmancer AB||Selective viewing of a scene|
|US8451331||Apr 2, 2007||May 28, 2013||Christopher L. Hughes||Automotive surveillance system|
|US8493443||Jan 4, 2005||Jul 23, 2013||Hewlett-Packard Development Company, L.P.||Methods and apparatus for location determination and asserting and maintaining privacy|
|US8558892 *||Oct 20, 2004||Oct 15, 2013||Honeywell International Inc.||Object blocking zones to reduce false alarms in video surveillance systems|
|US8790269 *||May 9, 2011||Jul 29, 2014||Xerox Corporation||Monitoring respiration with a thermal imaging system|
|US8964029 *||Apr 29, 2005||Feb 24, 2015||Chubb Protection Corporation||Method and device for consistent region of interest|
|US8965047||Jun 13, 2012||Feb 24, 2015||Mindmancer AB||Selective viewing of a scene|
|US20040100563 *||Nov 27, 2002||May 27, 2004||Sezai Sablak||Video tracking system and method|
|US20040105570 *||Nov 12, 2003||Jun 3, 2004||Diamondback Vision, Inc.||Video tripwire|
|US20040145659 *||Jan 13, 2004||Jul 29, 2004||Hiromi Someya||Image-taking apparatus and image-taking system|
|US20040196369 *||Mar 3, 2004||Oct 7, 2004||Canon Kabushiki Kaisha||Monitoring system|
|US20050018879 *||Jul 22, 2004||Jan 27, 2005||Wataru Ito||Object tracking method and object tracking apparatus|
|US20050123172 *||Dec 2, 2004||Jun 9, 2005||Safehouse International Limited||Monitoring an environment|
|US20050146605 *||Nov 15, 2001||Jul 7, 2005||Lipton Alan J.||Video surveillance system employing video primitives|
|US20050157169 *||Oct 20, 2004||Jul 21, 2005||Tomas Brodsky||Object blocking zones to reduce false alarms in video surveillance systems|
|US20050162515 *||Feb 15, 2005||Jul 28, 2005||Objectvideo, Inc.||Video surveillance system|
|US20050169367 *||Apr 5, 2005||Aug 4, 2005||Objectvideo, Inc.||Video surveillance system employing video primitives|
|US20050185058 *||Feb 19, 2004||Aug 25, 2005||Sezai Sablak||Image stabilization system and method for a video camera|
|US20050185823 *||Feb 24, 2004||Aug 25, 2005||International Business Machines Corporation||System and method for generating a viewable video index for low bandwidth applications|
|US20050270371 *||Jun 2, 2004||Dec 8, 2005||Sezai Sablak||Transformable privacy mask for video camera images|
|US20050270372 *||Jun 2, 2004||Dec 8, 2005||Henninger Paul E Iii||On-screen display and privacy masking apparatus and method|
|US20050275723 *||Aug 9, 2005||Dec 15, 2005||Sezai Sablak||Virtual mask for use in autotracking video camera images|
|US20090086022 *||Apr 29, 2005||Apr 2, 2009||Chubb International Holdings Limited||Method and device for consistent region of interest|
|US20100183227 *||Jul 22, 2010||Samsung Electronics Co., Ltd.||Person detecting apparatus and method and privacy protection system employing the same|
|US20110007158 *||Apr 13, 2009||Jan 13, 2011||Alex Holtz||Technique for automatically tracking an object|
|US20120098854 *||Oct 19, 2011||Apr 26, 2012||Canon Kabushiki Kaisha||Display control apparatus and display control method|
|US20120212594 *||Sep 30, 2010||Aug 23, 2012||National Ict Australia Limited||Object Tracking for Artificial Vision|
|US20120289850 *||May 9, 2011||Nov 15, 2012||Xerox Corporation||Monitoring respiration with a thermal imaging system|
|US20140003657 *||Jun 21, 2013||Jan 2, 2014||Canon Kabushiki Kaisha||Setting apparatus and setting method|
|EP1685717A2 *||Nov 12, 2004||Aug 2, 2006||Objectvideo, Inc.||Video tripwire|
|EP1835472A2 *||Mar 14, 2007||Sep 19, 2007||Hitachi, Ltd.||Object detection apparatus|
|WO2005050971A2 *||Nov 12, 2004||Jun 2, 2005||Paul C Brewer||Video tripwire|
|WO2007078475A2 *||Nov 29, 2006||Jul 12, 2007||Yongtong Hu||Video surveillance system employing video primitives|
|WO2009137118A1 *||Jan 20, 2009||Nov 12, 2009||Panasonic Corporation||System architecture and process for assessing multi-perspective multi-context abnormal behavior|
|WO2011038465A1 *||Sep 30, 2010||Apr 7, 2011||National Ict Australia Limited||Object tracking for artificial vision|
|U.S. Classification||348/143, 348/E07.09, 348/155|
|International Classification||G01S11/12, G01S5/16, H04N7/18, G01S3/786|
|Cooperative Classification||G01S11/12, G01S5/16, H04N7/188, G08B13/19686, G01S3/7864|
|European Classification||G08B13/196U4, G01S11/12, G01S5/16, H04N7/18E, G01S3/786C|
|Jul 17, 2001||AS||Assignment|
Owner name: SENSORMATIC ELECTRONICS CORPORATION, FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BROEMMELSIEK, RAYMOND M.;SCOTT, MARK A.;REEL/FRAME:012021/0957
Effective date: 20010307
|Jun 11, 2002||AS||Assignment|
Owner name: SENSORMATIC ELECTRONICS CORPORATION, FLORIDA
Free format text: MERGER/CHANGE OF NAME;ASSIGNOR:SENSORMATIC ELECTRONICS CORPORATION;REEL/FRAME:012991/0641
Effective date: 20011113