|Publication number||US6400401 B1|
|Application number||US 08/563,704|
|Publication date||Jun 4, 2002|
|Filing date||Nov 28, 1995|
|Priority date||Nov 29, 1994|
|Publication number||08563704, 563704, US 6400401 B1, US 6400401B1, US-B1-6400401, US6400401 B1, US6400401B1|
|Inventors||Takashi Morino, Hiroshi Okazaki, Makoto Murata|
|Original Assignee||Canon Kabushiki Kaisha|
|Patent Citations (7), Referenced by (49), Classifications (17), Legal Events (5)|
This invention is ideal for use in camera control in a case where a computer, to which a camera apparatus capable of being subjected to panning, tilting and zooming control is connected, as well as other computers on a network, remotely controls the camera apparatus. This art can be applied to desktop video conferences, surveillance cameras and the like.
A method of camera control in a video conference system or system of surveillance cameras using camera apparatus whose panning, tilting and zooming operations are capable of being controlled is as shown in FIG. 1. Specifically, a control panel window (hereinafter referred to as a “control window”) 54 for controlling the state of a camera apparatus (not shown) is displayed on a screen 50 a of a display unit 50. This window is separate from a motion-picture window 52 which displays a motion image obtained from the camera apparatus. Panning, tilting and zooming can be controlled by using a mouse (not shown) or the like to operate the control window 54.
In a case where the motion-picture display window 52 and control window 54 are presented separately on the screen 50 a of the display unit 50 in a video conference system or surveillance camera system based upon use of a computer, it is difficult to view both of the windows 52, 54 simultaneously. Accordingly, one's line of sight is directed toward one window or the other when the state of the camera apparatus is controlled. If the line of sight is directed toward the motion-picture window 52, mistakes tend to be made in operating the control window 54. If the line of sight is directed toward the control window 54, it is very difficult to perform control so as to make the camera follow the motion of a human being or the like.
Accordingly, an object of the present invention is to provide a control system for controlling a camera apparatus in which it is possible to control the operation of the camera apparatus in simple fashion merely by directing one's line of sight toward one window and not other windows.
Another object of the present invention is to improve the operability of control for moving the camera apparatus.
According to the present invention, the foregoing objects are attained by providing a camera capable of being moved in prescribed directions that differ from one another, comprising display means for displaying an image picked up by the camera; the display means having a display screen area divided into a plurality of zones, with a direction of movement of the camera being assigned to each of the zones; designating means for designating a prescribed position in the display screen area; and moving means for moving the camera in the direction of movement that has been assigned to whichever zone corresponds to the position designated by the designating means.
Further, the foregoing objects are attained by providing a method of controlling a camera capable of being moved in prescribed directions that differ from one another, comprising: a display step of displaying an image, which has been picked up by the camera, on display means having a display screen area divided into a plurality of zones, with a direction of movement of the camera being assigned to each of the zones; a designating step of designating a prescribed position in the display screen area; and a moving step of moving the camera in the direction of movement that has been assigned to whichever zone corresponds to the position designated at the designating step.
Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
FIG. 1 is a diagram illustrating an example of the prior art;
FIG. 2 is a block diagram showing the configuration of a first embodiment of the present invention;
FIG. 3A is an explanatory view illustrating a display on a screen according to the first embodiment;
FIG. 3B is a diagram showing the manner in which a screen is divided;
FIG. 4 is a flowchart for describing the operation of the first embodiment;
FIG. 5 is an explanatory view illustrating a display on a screen according to a second embodiment;
FIG. 6 is a flowchart for describing the operation of the second embodiment; and
FIG. 7 is a diagram for describing the second embodiment.
Preferred embodiments of the present invention will now be described in detail with reference to the drawings.
FIG. 2 is a block diagram illustrating the general configuration of a camera control apparatus 2 according to a first embodiment of the present invention. A camera apparatus 10 includes a camera unit 12 capable of capturing the image of a subject and of performing a zooming operation, and a panning head 14 capable of controlling the attitude (rotational movement) of the camera unit 12 in a panning direction (to the left and right in FIG. 2) and tilting direction (up and down in FIG. 2). A display unit 16 has a display screen 16 a on which the image accepted by the camera unit 12 is capable of being displayed. A work station 20 is equipped with a CPU 21, a ROM 22, a RAM 23, input/output units 24, 29 and a video board 25 for the purpose of issuing instructions to control the state of the camera apparatus 10 (its positions in the panning and tilting directions and its position in the zooming direction) and for causing the display unit 16 to display the image information accepted by the camera apparatus 10. A mouse 26 serving as a pointing device and a keyboard 27 are connected to the work station 20 in order to enter data. In this embodiment, the mouse 26 and keyboard 27 function as means for designating various types of processing. These elements are interconnected via a bus 28. The CPU 21 controls the camera control apparatus 2. The program executed by the CPU 21 to control the apparatus is stored in the ROM 22, and various data are stored in the RAM 23. The input/output unit 24 outputs a camera-status control signal, which is issued by the CPU 21, to the camera apparatus 10 and enters values indicating various states of the camera apparatus 10 from the camera apparatus 10. The input/output unit 24 in this embodiment is connected to the camera apparatus 10 via an RS-232C interface. The video board 25 sends the display unit 16 image information obtained from the camera apparatus 10. The video board 25 in this embodiment converts an NTSC signal to an RGB format and outputs the RGB signal.
A plurality of the camera control apparatuses 2 are capable of being connected via a network 30 and signals can be sent and received by the input/output units 29 via the network. Depending upon the configuration, a so-called video conference can be implemented. Though a work station is utilized as the control apparatus in this embodiment in view of the universality and processing capability of a work station, it goes without saying that a personal computer may be utilized or that use may be made of a special-purpose control apparatus.
FIGS. 3A and 3B are diagrams for describing the method of camera control according to this embodiment. FIG. 3A illustrates the display unit 16 and the mouse 26, and FIG. 3B depicts the screen of the display unit 16. In this embodiment, the motion-picture window 17 on the screen 16 a in FIG. 3A is divided into four zones by two diagonal lines L1, L2, as shown in FIG. 3B, to construct a GUI (Graphical User Interface) for controlling the camera apparatus 10. The camera apparatus 10 is controlled in conformity with the zone in which a mouse cursor 18 is situated and the operation of buttons 26 a, 26 b of mouse 26. It is possible to move the camera apparatus 10 in the direction of the arrow illustrated in each zone. Zones A and B are used to designate tilting motion and zones C and D are used to designate panning motion. More specifically, zones A, B, C and D in FIG. 3B make it possible to move the image pick-up direction of the camera apparatus 10 up, down, left and right, respectively. It should be noted that the characters indicated in each of the zones are attached to facilitate explanation and are not actually displayed on the screen. The arrows and diagonal lines L1, L2 also are not displayed. Since the divided zones are sufficiently large, pointing to an approximate position with the cursor makes it possible to designate the corresponding zone even though the boundaries of the divided zones are not displayed.
The processing operation according to this embodiment will be described with reference to FIG. 4, which is a flowchart of the processing operation. In the description below, it will be assumed that an image captured by the camera is being displayed in the motion-picture window 17.
First, at step S1, the operator manipulates the mouse 26 to move the cursor 18 on the motion-picture window 17 and presses the first button 26 a to select any one of the zones A, B, C, D, thereby designating the direction in which image pick-up is desired, namely the direction in which the camera unit 12 is desired to be moved.
The CPU 21 senses the coordinates (x,y) of the position of cursor 18 at step S2. The coordinates (x,y) are stored in the RAM 23.
The CPU 21 determines, at steps S3, S4 and S5, which zone in the motion-picture window 17 has been designated. This processing will now be described in greater detail. In FIG. 3B, let the lower left-hand corner of the motion-picture window 17 be the origin (0,0), and consider that the motion-picture window 17 is a plane having a size m along the x axis and n along the y axis. The equation of the diagonal line L1 is y=−(n/m)x+n, and the equation of diagonal line L2 is y=(n/m)x. Therefore, the zones delimited by the diagonal lines L1 and L2 are expressed by the following limits: zone A: y>(n/m)x, y>−(n/m)x+n; zone B: y≦(n/m)x, y≦−(n/m)x+n; zone C: y>(n/m)x, y≦−(n/m)x+n; zone D: y≦(n/m)x, y>−(n/m)x+n. Accordingly, it is determined at step S3 whether y>(n/m)x holds. If the decision rendered at step S3 is “YES”, the program proceeds to step S4, at which it is determined whether y>−(n/m)x+n holds. If the decision rendered at step S3 is “NO”, the program proceeds to step S5, at which it is determined whether y>−(n/m)x+n holds. Thus, it can be determined which zone has been designated by the operator.
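The zone test at steps S3 to S5 can be sketched as follows (a minimal illustration under the coordinate conventions above; the function name is ours, not from the patent):

```python
# Classify a cursor position into one of the four zones (A: tilt up,
# B: tilt down, C: pan left, D: pan right) delimited by the two diagonals
# of an m-by-n window, origin at the lower left corner, as in FIG. 3B.
def classify_zone(x, y, m, n):
    above_l2 = y > (n / m) * x           # step S3: above L2, y = (n/m)x
    above_l1 = y > -(n / m) * x + n      # steps S4/S5: above L1, y = -(n/m)x + n
    if above_l2 and above_l1:
        return "A"   # top zone
    if not above_l2 and not above_l1:
        return "B"   # bottom zone
    if above_l2:
        return "C"   # left zone
    return "D"       # right zone

# e.g. in a 640x480 window, a point near the top centre lies in zone A
print(classify_zone(320, 400, 640, 480))  # A
```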
If “YES” decisions are rendered at both steps S3 and S4, then it is judged that the zone A has been designated (step S7).
If a “YES” decision is rendered at step S3 and a “NO” decision at step S4, then it is judged that the zone C has been designated (step S6). If a “NO” decision is rendered at step S3 and a “YES” decision at step S4, then it is judged that the zone D has been designated (step S8). If “NO” decisions are rendered at both steps S3 and S4, then it is judged that the zone B has been designated (step S9).
When the direction in which the camera is to be moved has been determined by the processing described above, the CPU 21 determines at step S10 whether the source providing the image information displayed in the motion-picture window 17 is the camera of another camera control apparatus or the camera which the CPU 21 itself is controlling. In the former case, the program proceeds to step S11, at which the CPU 21 sends a camera control signal to the other camera control apparatus via the input/output unit 29 and the network 30. In the latter case, namely the case in which the camera apparatus is the camera apparatus 10 connected to the CPU's own work station, the program proceeds to step S12, at which the CPU 21 sends the control signal to the panning head 14 via the input/output unit 24.
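The routing decision at steps S10 to S12 amounts to choosing an output channel for the control command. A minimal sketch, with hypothetical names standing in for the serial link (input/output unit 24) and the network (input/output unit 29):

```python
# Route a camera control command: to the local pan head over the serial
# channel when the displayed image comes from our own camera (step S12),
# otherwise over the network to the owning control apparatus (step S11).
def route_command(command, source_is_local, serial_port, network):
    if source_is_local:
        serial_port.append(command)   # local camera apparatus 10
    else:
        network.append(command)       # another camera control apparatus

serial, net = [], []
route_command("pan-left", True, serial, net)
route_command("tilt-up", False, serial, net)
print(serial, net)  # ['pan-left'] ['tilt-up']
```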
By repeating the processing of steps S1 to S12 described above, the image of any subject situated within the range capable of being picked up by the camera apparatus under control can be captured, and the captured image can be displayed in the motion-picture window 17. It should be noted that the processing according to this embodiment is capable of accepting an interrupt; when a series of processing operations ends, the interrupt is generated and processing is terminated. Icons 41, 42 (see FIG. 3A) for zooming operations are provided separately so that zooming can be designated on the screen 16 a. In this case, zooming in toward the telephoto side (enlargement) can be performed by using the mouse 26 to designate the icon 41, and zooming out toward the wide side (reduction) can be performed by using the mouse 26 to designate the icon 42.
In this embodiment, the motion-picture window 17 is divided by the diagonal lines L1, L2 and therefore the zones that can be designated are formed vertically and horizontally. This coincides with the directions in which the camera is operated and is easy to understand visually, thereby improving operability.
In the first embodiment, the motion-picture window 17 is divided into four zones to make it possible to control the state of the camera apparatus 10. In the second embodiment illustrated below, the window 17 is divided into nine zones to perform control. This will now be described with reference to FIGS. 5 and 6.
FIG. 5 is a diagram for describing the method of camera control. According to this embodiment, the motion-picture window 17 on the screen 16 a is divided into nine zones (namely three zones horizontally by three zones vertically) to construct a GUI for controlling the camera apparatus 10. The camera apparatus 10 is capable of being controlled in conformity with the zone in which a mouse cursor 18 is situated and the operation of the buttons 26 a, 26 b of mouse 26. It is possible to move the camera apparatus 10 in the directions of the arrows illustrated in the respective zones. Zones D and F are used to designate panning motion; zones B and H to designate tilting motion; and zones A, C, G and I to designate panning motion and tilting motion simultaneously. Zone E is used to designate zooming of the camera apparatus 10. In FIG. 5, arrows are illustrated in the respective zones in the motion-picture window 17 so as to facilitate the description of this embodiment. In actuality, however, the arrows are not displayed since doing so would detract from the appearance of the moving picture displayed. The fact that the arrows are not displayed causes almost no inconvenience because the boundaries of the zones are clearly displayed. It is possible to display the arrows in a semi-transparent manner, however, by lowering the luminance of the arrow character information and mixing this information with the image information. In such case the moving picture will become somewhat more difficult to see but it will be unnecessary to display the zone boundaries since it will suffice for the operator to indicate the vicinity of an arrow using the cursor 18. By displaying arrow-display selection icons 19 a, 19 b and selecting either icon, it is possible to switch between a mode in which the arrows are displayed and a mode in which the arrows are not displayed.
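The nine-zone layout of FIG. 5 maps directly to a three-by-three grid lookup. The following sketch (our illustration; the zone letters follow the description above) returns the camera action assigned to the zone under the cursor:

```python
# Map a cursor position in an m-by-n window to one of nine equal zones
# (three columns by three rows) and the action assigned to it, as in
# FIG. 5: corners combine panning and tilting, edge zones give a single
# axis, and the central zone E is reserved for zooming.
ACTIONS = [
    ["A: pan left + tilt up",   "B: tilt up",   "C: pan right + tilt up"],
    ["D: pan left",             "E: zoom",      "F: pan right"],
    ["G: pan left + tilt down", "H: tilt down", "I: pan right + tilt down"],
]

def nine_zone_action(x, y, m, n):
    col = min(int(3 * x / m), 2)         # 0, 1 or 2 from left to right
    row = min(int(3 * (n - y) / n), 2)   # 0, 1 or 2 from top to bottom
    return ACTIONS[row][col]

print(nine_zone_action(320, 240, 640, 480))  # E: zoom
```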
The operation of this embodiment will now be described with reference to FIGS. 6 and 7. FIG. 6 is a flowchart illustrating the processing operation of this embodiment, and FIG. 7 is a diagram for describing operation. The operation described below is controlled by the CPU 21.
When the first button 26 a of mouse 26 is pressed for the purpose of controlling the state (position) of the camera apparatus 10, the positional coordinates (x0,y0) of the cursor 18 displayed in the motion-picture window 17 are read (step S101). The read values are saved in the RAM 23.
Next, it is determined in which zone of the nine divided zones the read coordinates (x0,y0) reside (step S102). This can be carried out by a method similar to that used to judge the zone in the first embodiment.
In a case where the first button 26 a of mouse 26 is pressed when the cursor 18 resides in any one of the zones A, B, C, D, F, G, H, I, as shown in FIG. 7, the camera is moved at a predetermined velocity Vconst in the direction corresponding to the direction of the arrow in this zone in the motion-picture window of FIG. 5 (step S103). For example, in a case where the zone D in FIG. 7 has been designated, the camera unit 12 is moved (panned) to the left.
Furthermore, the CPU 21 determines whether the mouse 26 has been dragged (moved) (step S104). In a case where the mouse 26 has been dragged to move the cursor 18 to the position (x1,y1), the CPU 21 calculates the distance d0 from the center coordinates (0,0) of the motion-picture window 17 to the initial position (x0,y0) of the cursor 18 as well as the distance d1 from (0,0) to the position (x1,y1) to which the cursor has been moved. The camera unit 12 is then moved at a velocity V, proportional to the ratio d1/d0, obtained using the following equation (step S105):

V = C×(d1/d0)

where C represents a constant.
According to this moving operation, the velocity V increases the farther the cursor 18 is moved from the center coordinates (0,0) of the motion-picture window 17 by dragging the mouse and decreases the closer the cursor 18 is moved to the center coordinates (0,0). Accordingly, the speed at which the camera unit 12 is moved (rotated) can be changed with ease. This movement of the camera unit 12 continues as long as the first button 26 a of the mouse 26 is being pressed. The movement of the camera unit 12 is carried out by transmitting an instruction, which has been issued by the CPU 21, to a drive circuit 40 built in the panning head 14. This instruction is transmitted via the bus 28 and input/output unit 24.
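The drag-to-speed rule can be sketched as follows (a minimal illustration with a hypothetical function name; the velocity formula V = C·d1/d0 is as reconstructed from the behavior described, with the window centre taken as the origin):

```python
import math

def drag_velocity(start, current, c=1.0):
    """Velocity V = C * d1/d0, where d0 and d1 are the distances of the
    initial and current cursor positions from the window centre (0, 0)."""
    d0 = math.hypot(*start)    # distance of drag start from centre
    d1 = math.hypot(*current)  # distance of current cursor from centre
    return c * d1 / d0

# Dragging twice as far from the centre doubles the speed.
print(drag_velocity((100, 0), (200, 0)))  # 2.0
```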
In a case where a button on the mouse 26 is pressed when the cursor 18 resides in zone E, the CPU determines whether the pressed button is the first button 26 a or the second button 26 b (step S106).
Zooming in is performed (step S107) when the first button 26 a is pressed and zooming out is performed (step S108) when the second button 26 b is pressed. This zooming operation continues as long as the button on the mouse 26 is being pressed.
Thus, in accordance with the second embodiment, operations for panning, tilting and zooming the camera unit 12 can be performed with ease by designating predetermined zones on the window. The window is divided into nine zones. The central zone is made to correspond to a designation for zooming the camera, the zones above and below the central zone are assigned to the tilting direction, and the zones to the left and right of the central zone are assigned to the panning direction. The zones at the corners of the window are assigned to a combination of tilting and panning directions. As a result, the zones can be made to coincide with the operating directions of the camera. This is easy to understand visually, thereby improving operability.
In addition, the speed at which the camera unit 12 is moved (rotated) can be changed with ease.
Operability in the first embodiment can be improved as well by providing the central portion of the motion-picture window 17 with a zone for designating the zooming operation.
Since a separate control window need not be specially provided, the screen of the display unit can be utilized effectively.
Thus, as will be evident from the foregoing description, the operation of a camera apparatus can be controlled in simple fashion merely by directing one's line of sight toward one window and not other windows.
In addition, the zones obtained by dividing the window can be made to coincide with the operating directions of the camera apparatus. This is easy to understand visually, thereby improving operability.
In the embodiments described above, the motion-picture window is divided into a plurality of zones. However, it goes without saying that the entirety of the display screen may be divided into a plurality of zones in a similar manner. Further, the display screen in the foregoing embodiments is the screen of a display unit that is separate from the camera. However, the screen may be the picture screen in the viewfinder of the camera. In such case a detecting unit would be provided to detect the line of sight of the user looking at the viewfinder. In such an arrangement, the line of sight would be detected and the point of intersection between the line of sight and the picture screen of the viewfinder would be adopted as the designated position.
The present invention can be applied to a system constituted by a plurality of devices (e.g., work station 20, display unit 16, camera unit, panning head) or to an apparatus comprising a single device.
Further, the object of the present invention can also be achieved by providing a storage medium storing program codes for performing the aforesaid processes to a system or an apparatus, reading the program codes with a computer (e.g., a CPU or an MPU) of the system or apparatus from the storage medium, and then executing the program codes.
In this case, the program codes read from the storage medium realize the functions according to the embodiments, and the storage medium storing the program codes constitutes the invention.
Further, a storage medium such as a floppy disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a magnetic tape, a non-volatile memory card, or a ROM can be used for providing the program codes.
Furthermore, besides the case where the aforesaid functions according to the above embodiments are realized by executing program codes read by a computer, the present invention includes a case where an OS or the like running on the computer performs a part or the entirety of the processing in accordance with designations of the program codes and realizes the functions of the above embodiments.
Furthermore, the present invention also includes a case where, after the program codes read from the storage medium are written into a function extension board inserted into the computer or into a memory provided in a function extension unit connected to the computer, a CPU or the like contained in the function extension board or unit performs a part or the entirety of the processing in accordance with designations of the program codes and realizes the functions of the above embodiments.
In a case where the present invention is applied to the aforesaid storage medium, the storage medium stores program codes corresponding to the flowcharts described in the embodiments.
As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the appended claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4224615 *||Sep 14, 1978||Sep 23, 1980||Texas Instruments Incorporated||Method of using a liquid crystal display device as a data input device|
|US4516156||Mar 15, 1982||May 7, 1985||Satellite Business Systems||Teleconferencing method and system|
|US5396287 *||Feb 18, 1993||Mar 7, 1995||Fuji Photo Optical Co., Ltd.||TV camera work control apparatus using tripod head|
|US5426732 *||Oct 5, 1994||Jun 20, 1995||International Business Machines Corporation||Method and apparatus for user control by deriving next states of a process from a current state and by providing a visual presentation of the derived next states|
|US5479206 *||Feb 2, 1993||Dec 26, 1995||Fuji Photo Film Co., Ltd.||Imaging system, electronic camera, computer system for controlling said electronic camera, and methods of controlling same|
|US5568183 *||Jan 17, 1995||Oct 22, 1996||Videoconferencing Systems, Inc.||Network videoconferencing system|
|US5764276 *||Oct 31, 1996||Jun 9, 1998||Interactive Pictures Corporation||Method and apparatus for providing perceived video viewing experiences using still images|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6980238 *||Sep 19, 2001||Dec 27, 2005||Canon Kabushiki Kaisha||Image communication system and method utilizing self-portrait and partner display windows|
|US7061525 *||Jan 21, 1998||Jun 13, 2006||Canon Kabushiki Kaisha||Apparatus and method for controlling a camera based on a displayed image|
|US7321453 *||Jun 10, 2004||Jan 22, 2008||Canon Kabushiki Kaisha||Image input system|
|US7334190||Jun 27, 2003||Feb 19, 2008||Mjw Corporation Inc.||Interactive video tour system editor|
|US7532238||Jun 9, 2005||May 12, 2009||Canon Kabushiki Kaisha||Apparatus and method for controlling a camera based on a displayed image|
|US7583414||Aug 30, 2006||Sep 1, 2009||Canon Kabushiki Kaisha||Image input system|
|US7589760 *||Nov 23, 2005||Sep 15, 2009||Microsoft Corporation||Distributed presentations employing inputs from multiple video cameras located at multiple sites and customizable display screen configurations|
|US7720359||Nov 12, 2008||May 18, 2010||Sony Corporation||Controller for photographing apparatus and photographing system|
|US7868919 *||Jul 26, 2002||Jan 11, 2011||Honeywell Limited||Control system for allowing an operator to proportionally control a work piece|
|US8045837||Jul 20, 2009||Oct 25, 2011||Sony Corporation||Controller for photographing apparatus and photographing system|
|US8085300||Jul 6, 2006||Dec 27, 2011||Sony Corporation||Surveillance camera system, remote-controlled monitoring device, control method, and their control program|
|US8200078 *||Oct 11, 2010||Jun 12, 2012||Dumm Mark T||Camera control system and associated pan/tilt head|
|US8508713||Jul 20, 2007||Aug 13, 2013||Nikon Corporation||Exposure apparatus, exposure method, and method for producing device|
|US8594483||Oct 4, 2011||Nov 26, 2013||Sony Corporation||Controller for photographing apparatus and photographing system|
|US8648909 *||Dec 5, 2007||Feb 11, 2014||Canon Kabushiki Kaisha||Camera monitoring apparatus and registration method thereof|
|US8743225 *||Jul 8, 2011||Jun 3, 2014||Canon Kabushiki Kaisha||Imaging control system, control apparatus and method for imaging apparatus, and storage medium|
|US8994828 *||Feb 28, 2013||Mar 31, 2015||Apple Inc.||Aligned video comparison tool|
|US9065973 *||Mar 11, 2013||Jun 23, 2015||Cisco Technology, Inc.||System and method for displaying a videoconference|
|US9086790 *||Jul 16, 2010||Jul 21, 2015||Canon Kabushiki Kaisha||Image displaying method, image displaying program, and display|
|US9210328||Nov 13, 2013||Dec 8, 2015||Sony Corporation||Controller for photographing apparatus and photographing system|
|US9329458||Oct 23, 2014||May 3, 2016||Mark T. Dumm||Pan/tilt head with tilt range extender|
|US20020054216 *||Sep 19, 2001||May 9, 2002||Masanori Kawashima||Image communication system, apparatus, and method|
|US20020104094 *||Dec 3, 2001||Aug 1, 2002||Bruce Alexander||System and method for processing video data utilizing motion detection and subdivided video fields|
|US20020171742 *||Mar 26, 2002||Nov 21, 2002||Wataru Ito||Method and apparatus for controlling a view field of an image picking-up apparatus and computer program therefor|
|US20040056883 *||Jun 27, 2003||Mar 25, 2004||Wierowski James V.||Interactive video tour system editor|
|US20040189802 *||Jul 26, 2002||Sep 30, 2004||Mark Flannery||Control system for allowing an operator to proportionally control a work piece|
|US20040223191 *||Jun 10, 2004||Nov 11, 2004||Makoto Murata||Image input system|
|US20050225638 *||Jun 9, 2005||Oct 13, 2005||Canon Kabushiki Kaisha||Apparatus and method for controlling a camera based on a displayed image|
|US20070064104 *||Jul 6, 2006||Mar 22, 2007||Sony Corporation||Surveillance camera system, remote-controlled monitoring device, control method, and their control program|
|US20070097460 *||Aug 30, 2006||May 3, 2007||Tomoaki Kawai||Image input system|
|US20070118868 *||Nov 23, 2005||May 24, 2007||Microsoft Corporation||Distributed presentations employing inputs from multiple video cameras located at multiple sites and customizable display screen configurations|
|US20080141155 *||Jan 11, 2008||Jun 12, 2008||Mjw Corporation Inc.||Interactive video tour system editor|
|US20080141312 *||Jan 11, 2008||Jun 12, 2008||Mjw Corporation Inc.||Interactive video tour system editor|
|US20080158356 *||Dec 5, 2007||Jul 3, 2008||Canon Kabushiki Kaisha||Monitoring apparatus and control method thereof|
|US20080215983 *||Jan 11, 2008||Sep 4, 2008||Mjw Corporation Inc.||Interactive video tour system editor|
|US20090083672 *||Aug 28, 2008||Mar 26, 2009||Autodesk, Inc.||Navigation system for a 3d virtual scene|
|US20090115841 *||Nov 12, 2008||May 7, 2009||Masakazu Koyanagi||Controller For Photographing Apparatus And Photographing System|
|US20090278914 *||Jul 20, 2009||Nov 12, 2009||Masakazu Koyanagi||Controller for Photographing Apparatus and Photographing System|
|US20090327949 *||Jun 26, 2008||Dec 31, 2009||Honeywell International Inc.||Interactive overlay window for a video display|
|US20100309224 *||Jul 16, 2010||Dec 9, 2010||Canon Kabushiki Kaisha||Image displaying method, image displaying program, and display|
|US20110025861 *||Oct 11, 2010||Feb 3, 2011||Dumm Mark T||Camera control system and associated pan/tilt head|
|US20120007999 *||Jul 8, 2011||Jan 12, 2012||Canon Kabushiki Kaisha||Imaging control system, control apparatus and method for imaging apparatus, and storage medium|
|US20120127319 *||Nov 19, 2010||May 24, 2012||Symbol Technologies, Inc.||Methods and apparatus for controlling a networked camera|
|US20130258041 *||Mar 11, 2013||Oct 3, 2013||Philip R. Graham||System and Method for Displaying a Videoconference|
|US20150279330 *||Mar 30, 2015||Oct 1, 2015||Apple Inc.||Aligned video comparison tool|
|EP1496702A2 *||Jun 23, 2004||Jan 12, 2005||Sungjin C&C Co.,LTD.||Virtual joystick system for controlling the operation of security cameras and controlling method thereof|
|EP1496702A3 *||Jun 23, 2004||Jan 19, 2005||Sungjin C&C Co.,LTD.||Virtual joystick system for controlling the operation of security cameras and controlling method thereof|
|WO2004003842A1 *||Jun 27, 2003||Jan 8, 2004||Mjw Corporation||Interactive video tour system editor|
|WO2005062603A1 *||Dec 23, 2004||Jul 7, 2005||Mobotix Ag||Monitoring arrangement with integrated camera system|
|U.S. Classification||348/211.1, 348/E05.042, 348/E07.079, 348/E07.086, 348/143|
|International Classification||H04N7/14, H04N5/232, H04N7/15, H04N7/18|
|Cooperative Classification||H04N5/23216, H04N5/232, H04N7/181, H04N7/142|
|European Classification||H04N5/232G, H04N7/14A2, H04N5/232, H04N7/18C|
|Nov 28, 1995||AS||Assignment|
Owner name: CANON KABUSHIKI KAISHA, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MORINO, TAKASHI;OKAZAKI, HIROSHI;MURATA, MAKOTO;REEL/FRAME:007801/0326
Effective date: 19951109
|Mar 11, 2003||CC||Certificate of correction|
|Nov 14, 2005||FPAY||Fee payment|
Year of fee payment: 4
|Nov 4, 2009||FPAY||Fee payment|
Year of fee payment: 8
|Nov 6, 2013||FPAY||Fee payment|
Year of fee payment: 12