CA2130779C - Method and apparatus for improving motion compensation in digital video coding - Google Patents

Method and apparatus for improving motion compensation in digital video coding

Info

Publication number
CA2130779C
CA2130779C CA002130779A CA2130779A
Authority
CA
Canada
Prior art keywords
block
motion
motion vector
frame
video data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CA002130779A
Other languages
French (fr)
Other versions
CA2130779A1 (en)
Inventor
Caspar Horne
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AT&T Corp
Original Assignee
American Telephone and Telegraph Co Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by American Telephone and Telegraph Co Inc filed Critical American Telephone and Telegraph Co Inc
Publication of CA2130779A1 publication Critical patent/CA2130779A1/en
Application granted granted Critical
Publication of CA2130779C publication Critical patent/CA2130779C/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/527 Global motion vector estimation
    • H04N19/56 Motion estimation with initialisation of the vector search, e.g. estimating a good candidate to initiate a search

Abstract

A novel method and apparatus for use in digital video compression provides improved block-based motion compensation using the global motion of a video frame. A video frame comprising a plurality of blocks may be compressed for transmission using block motion vectors. Motion vectors are generated by block matching a block to be coded with a block within a reference frame and determining the displacement therebetween. To effect block matching, motion compensation techniques define a search window within one or more reference frames within which a displaced block will be found. According to the present invention, the location of the search window within a reference frame is defined using the global motion of the frame.

Description

METHOD AND APPARATUS FOR IMPROVING MOTION COMPENSATION IN DIGITAL VIDEO CODING

Field of the Invention

The present invention relates to the field of digital video data compression, and in particular video data compression using block-based motion compensation.
Background of the Invention

Block-based motion compensation techniques, including either motion interpolation or motion prediction, are used in digital image sequence coding to reduce the amount of data that must be transmitted in digital video systems.
Video sequences typically exhibit a substantial amount of repetitive image information between consecutive frames.
By compensating for the motion of objects within an image frame, a high compression of the video data can be achieved.
Block-based motion compensation techniques initially divide a video frame into several blocks of picture elements (pels). These techniques employ motion vectors to indicate the movement of these blocks between successive frames. It is assumed that each pel within a block has the same translational motion. These motion vectors, along with error signals, may be transmitted to a receiver. The receiver may then regenerate the video blocks using the motion vectors to locate the corresponding blocks from a previously transmitted frame.
The error signals are used to account for pel differences within each block between successive frames.
Consider, for example, the image of a football in a football game video sequence. The football image remains substantially identical from frame to frame, but its location within the video frame changes. If the motion vector of the football is known, then the image of the football can be reconstructed in a new video frame using the football image data from the previous frame. As a consequence, instead of transmitting repetitive image data for each new frame, motion vectors may be transmitted. An error signal is also transmitted to account for other differences, for example, a change in orientation of the football.
Motion compensation methods employ a process known as motion estimation to generate the motion vectors. In motion estimation, a motion vector for a particular block within a current frame is determined by matching the image data within the block to the image data in a displaced block within a reference frame. The motion vector represents the difference in position between the block from the current frame and the displaced block from the reference frame. The reference frame may be a previous frame, as in motion compensation prediction techniques, or a previous or future frame, as in motion compensation interpolation techniques.
To effect such block matching, the entire reference frame is not ordinarily searched for the displaced block, as such a search can consume considerable computation power and time. Instead, a search window is defined within the reference frame.
In motion estimation, a search window may be defined in terms of its size in pels and its location within the reference frame. In determining search window location, it is known in the art to center the search window on the position of the block to be matched. The method assumes that an image within a video block is most likely to be found in the vicinity of its location in a previous or subsequent frame. The search window size is then optimally chosen to be as large as the sum of the block size and the maximum probable block movement in any direction. For example, for a 15 X 15 pel block capable of moving up to 5 pels in any direction between frames, the maximum search window size would be 25 X 25 pels.
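The sizing rule just described can be sketched in a few lines of Python (an illustrative sketch, not part of the patent; the function name is chosen here for clarity):

```python
def search_window_size(block_size: int, max_motion: int) -> int:
    """Search window edge length: the block itself plus the maximum
    probable displacement on each side."""
    return block_size + 2 * max_motion

# A 15 x 15 pel block that may move up to 5 pels in any direction:
print(search_window_size(15, 5))  # -> 25
```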
Referring again to the analogy of a football within a football game video sequence, the football may, for discussion purposes, represent a block of video data. To find a displaced football in a reference frame, a motion estimation method would initially define a search window within which it would expect to find the football image in the reference frame. The search window is then scanned for the image within the window most resembling the football. Once the football image is found, a motion vector may be calculated for the football describing its movement between the reference frame and the present frame.
Under such methods of locating the search window, however, the search window size used for block matching limits the range of motion that may be compensated for. In the 25 x 25 pel search window defined above, block motion of more than 5 pels in any direction cannot be successfully compensated. If the motion estimation method fails to find the proper displaced block, a large error signal will have to be transmitted, and the advantages of motion compensation will be greatly reduced.
To increase the range of motion which may be compensated for, it is well known that the search window size may be enlarged in order to account for greater image movement. The use of larger search windows, however, consumes more computation time and power because there are more candidate displaced blocks to evaluate to effect block matching. For example, if an exhaustive search is used to effect block matching, increasing a 25 x 25 search window by a single pel in each direction increases the number of candidate displaced blocks by 27² - 25² = 104 blocks.
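The candidate-count arithmetic above, using the patent's rough convention of one candidate per pel position in the window, can be checked as follows (an illustrative sketch; the function name is chosen here):

```python
def candidate_count(window_size: int) -> int:
    """Rough candidate count used in the text: one candidate
    displaced block per pel position in the square search window."""
    return window_size * window_size

# Growing a 25 x 25 window by one pel in each direction (to 27 x 27):
print(candidate_count(27) - candidate_count(25))  # -> 104
```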

Several attempts have been made to increase the range of motion which may be compensated for without drastically increasing the computational load. It is well known in the art to employ search algorithms which are faster than an exhaustive search. These algorithms include the hierarchical search, the logarithmic search, and conjugate direction search. Because motion estimation techniques using such algorithms greatly reduce the number of calculations required for block matching, they may employ larger search windows than an exhaustive search routine.
The faster search techniques, however, are not guaranteed to find the proper displaced block. In other words, these algorithms may not find the best match for a particular block to be coded, which leads to coding inefficiency. Furthermore, such search techniques require complex image data manipulation when compared with the exhaustive search. The complex image data manipulation may also computationally load the system.
Summary of the Invention

The present invention provides a method of motion estimation operable to enable compensation for a large range of motion using a novel method of locating the search window. Large amounts of motion in video are often due to a combination of object motion and global motion caused by, for example, camera motion. According to the present invention, the search window is placed in a location that takes into account the global motion of the entire video frame, thus enabling compensation for greater amounts of motion when a component of the motion is caused by global motion.
For example, again consider the image of a moving football in a football game video sequence. The motion of the football is often due to camera motion as well as object motion, which in the process of block matching may cause the football image to fall outside a reasonably sized search window. If, however, the search window is placed in a location in a reference frame that takes into account the camera motion, then the search window would more likely contain the football image, thus facilitating successful compensation.
Accordingly, the method of the present invention generates a global motion vector from time to time and transmits this data to a motion estimation device. To perform block matching, the motion estimation device uses the global motion vector to define a search window in which a displaced block may be found within one or more reference frames. The motion estimation device then identifies the displaced block within the defined search window. The method of the present invention is compatible with both motion compensation prediction and motion compensation interpolation techniques.
The present invention may be utilized with either an exhaustive search in block matching, or it may also be used with faster search techniques.
In accordance with one aspect of the present invention there is provided a method of converting a block of video data which is one of a plurality of blocks defining a frame of video data, into a compressed encoded digital bit stream for transmission over a transmission medium, said method comprising: a) generating a global motion vector; b) generating both video data and position data corresponding to the block of video data to be coded; and c) effecting block matching by using predetermined criteria to identify a displaced block corresponding to the block of video data to be coded, the displaced block having a location within a reference frame, said block matching step further comprising defining a search window within the reference frame using the global motion vector.
In accordance with another aspect of the present invention there is provided an apparatus for determining the location of a displaced block of video data within a reference frame corresponding to a block of video data, where the displaced block is identified from a plurality of candidate displaced blocks using predetermined criteria, the apparatus comprising: a) a motion estimator comprising a processing means and memory means, the processing means operable to determine the location of a displaced block by defining a search window within the reference frame using a global motion vector, identifying a plurality of candidate displaced blocks having a location within the search window, and comparing each candidate displaced block with the block of video data; and b) a global motion estimator coupled to the motion estimator, comprising processing means and storage means, said processing means operable to generate global motion vectors.
Brief Description of the Drawings

FIG. 1 shows a block diagram of a video coder circuit employing a novel block-based motion compensation method and apparatus according to one embodiment of the present invention;
FIGS. 2A and 2B show a video image frame divided into blocks of pels in accordance with standard methods of video data coding techniques;
FIG. 3A shows a functional flow diagram of the steps performed by a motion estimation device according to the present invention;
FIG. 3B illustrates how a global motion vector may be utilized in accordance with the present invention to define a search window within a reference frame to more effectively achieve block matching according to the present invention; and

FIG. 4 shows a functional block diagram of a global motion estimator operable to generate global motion vector data for use in the methods and apparatus of the present invention.
Detailed Description

The present invention provides an improvement to current motion estimation techniques used in block-based motion compensation by using the global motion, caused by, for example, camera motion, to assist in defining a search window in the process of block matching. A novel video coder circuit 100 employing motion estimation techniques according to the present invention is described below in connection with FIG. 1.
FIG. 1 shows, in block diagram form, video coder circuit 100 for encoding a sequence of image frames represented in digital form using block-based motion compensation. The block diagram is shown in simplified form for purposes of clarity. To this end, various aspects of digital circuitry are not represented in the diagram and should be considered as inherent to the system. For example, a timing and sequence controller for the various functional blocks, as well as embedded storage elements such as buffers and memories, are not shown. The details pertaining to the use of such digital circuitry in a system according to the present invention are well known or will be readily apparent to one of ordinary skill in the art.
The video coder circuit 100 receives at its input 101 video data corresponding to temporal frames of video. A frame of video comprises a plurality of picture elements or pels. The digital representation of the video frame as it appears at the input 101 of video coder circuit 100 comprises pel video data for each pel within the frame. The pel video data typically comprises chrominance and luminance values. A video frame may be further subdivided into blocks of pels, and frames are processed by the video coder circuit 100 on a block by block basis.
The video coder circuit 100 produces at its output 102 a compressed and encoded digital signal in a bit stream for subsequent storage or transmission over a transmission medium, not shown. The compressed and encoded digital signal may be suitably coded according to the MPEG1, MPEG2 or any other video data transmission standard that is designed specifically to work with data compression techniques including block-based motion compensation prediction and interpolation. See, for example, ANSI Standard ISO-IEC DIS 11172, "Coding of Moving Pictures and Associated Audio for Digital Storage Media for Up to About 1.5 Mbit/sec."
The video coder circuit 100 illustrated in FIG. 1 compresses and encodes data employing motion compensation prediction techniques. In processing the blocks of a video frame, the circuit 100 may either encode a particular video data block directly or encode the block using motion compensation prediction. Video blocks that cannot be successfully compensated are encoded directly.
The functional elements in the video coder circuit 100 are interconnected as follows. A video frame storage unit 103 is operably connected to transfer both video data and position data corresponding to a new block of video to be coded ("new block") to a motion estimator 104, and to transfer the same new block video data to both a subtraction node 105 and a coding device 107. The video frame storage unit 103 includes a digital memory or buffer which stores video data corresponding to one or more frames and is operable to generate and store block position data for the video blocks within each stored frame.
The motion estimator 104 is operably connected to receive block video data from the video frame storage unit 103 and global motion data from a global motion estimator 106. The motion estimator 104 comprises a processing means such as a microprocessor and related memory circuitry which is operable both to effect block matching using global motion data and to generate motion vectors.
The motion estimator 104 further includes motion compensation means for providing displaced block data to the subtraction node 105. According to one embodiment of the invention, the processing means within the motion estimator 104 is programmed to execute the block matching and motion vector generation consistent with the flow chart 300 described below in connection with FIG. 3A. The motion estimator 104 is further connected to transfer block video data to the subtraction node 105 and to provide motion vector data to both a coding device 107 and the global motion estimator 106.
The motion estimator 104 is further connected to a reference frame storage buffer 108 and a new frame storage buffer 109. The new frame storage buffer 109 is connected between an addition node 110 and the reference frame storage buffer 108. Both the new frame storage buffer 109 and the reference frame storage buffer 108 have sufficient storage space to hold the video data corresponding to at least one video frame.
The subtraction node 105 is operably connected to receive new block video data from the video frame storage unit 103 and to receive displaced block video data from the motion estimator 104. The subtraction node 105 is further connected to the coding device 107. The subtraction node 105 comprises a data storage means, such as a buffer, and means for generating an error signal representing the difference in pel luminance and chrominance between the two video data blocks received from the motion estimator 104 and the video frame storage unit 103.
The coding device 107 is operably connected to receive the error signal from the subtraction node 105, motion vector data from the motion estimator 104 and block video data from the video frame storage unit 103. The coding device 107 is further connected to provide an error signal to the addition node 110, and includes the output 102 for transmitting coded video data in the form of a bit stream. The coding device 107 comprises a means for encoding and decoding video data using the MPEG1, MPEG2 or other coding standard for transmitting a coded bit stream.
Devices capable of coding in these standards are known in the art. For more information on such coding methods and devices, see ANSI Standard ISO-IEC.
The global motion estimator 106 is connected to receive motion vectors from the motion estimator 104 and to provide the global motion vector to motion estimator 104. The global motion estimator comprises the processing means and related circuitry operable to store a plurality of motion vectors and to compute therefrom a global motion vector that is representative of a common motion component within an entire video frame. Although the global motion estimator 106 is illustrated as a separate element, the functions performed by the global motion estimator processing means may be performed by either a dedicated microprocessor or a microprocessor included within the system to perform other control and operational functions.
To achieve its desired functions, the global motion estimator 106 may suitably be programmed consistently with the flow chart 400 discussed below in connection with FIG. 4.

The addition node 110 is connected to receive block video data from the motion estimator 104 and an error signal from the coding device 107. The addition node 110 comprises means for combining a block of video data with an error signal to produce a compensated block of video data. The addition node provides the compensated block to the new frame memory buffer 109.
The video coder circuit 100 described above operates in the following manner to process and encode a new video frame, one block at a time, using block-based motion compensation with global motion estimation. A video frame to be coded is first divided into blocks. Each block is then encoded either directly or using motion compensation prediction until the entire frame is encoded. The decision of whether or not to encode using motion compensation is made by the control circuitry using predetermined criteria, typically dependent on the size of the error signal produced during motion compensation. The next frame is thereafter processed in the same manner.
It should be noted that the various elements discussed below exchange data using digital signals, the generation of which is known to those of ordinary skill.
For purposes of clarity, it will be understood that when any element is said to provide or receive data, for example, representing video blocks or motion vectors, to or from another element, the element is actually providing or receiving digital signals containing the data.
In order to process a video frame one block at a time, the control circuitry directs that each frame, prior to processing, be divided into blocks of video data, DBi,t, where i is the block number within a frame corresponding to a time t. Initially, video data defining a video frame Ft corresponding to a time t in a video sequence is stored in a buffer or other storage means within the video frame storage unit 103. The time t actually represents a sequential frame number such that, for example, Ft_1 is the frame immediately preceding Ft, and Ft+1 is the frame immediately subsequent to Ft. The control circuitry directs that each frame be divided into a number of blocks per row m, and a number of blocks per column n, so that the frame has total number of blocks, NB, or in other words, m x n = NB blocks. The control circuitry may be user controllable to select the number of blocks per row m and the number of blocks per column n. Once the number of blocks is determined, the video frame storage unit 103 then obtains and stores block position values BPi,t for each block within one video frame.
For the present embodiment of the invention, the block position value may suitably be defined as the pel position value of the pel in the upper left hand corner of each block. The pel position value constitutes the position within the two-dimensional array of pels comprising the video frame. The pel position may suitably be provided by the control circuitry. The video frame storage unit 103 may determine the block position for each block i; BPi,t = (x, y), using any appropriate method.
For example, FIG. 2A illustrates the frame Ft 201, which is divided into 16 rows and 16 columns of blocks.
Block 202 represents data block DB19,t in frame 201, wherein the first block 203 is DB0,t and the block i-values are counted from left to right and top to bottom. Block 202, as shown in FIG. 2B, comprises a 15 X 15 array of pels, one of which is exemplified by upper left corner pel 204.

The pel position value for pel 204 is (45,15), corresponding to its x,y coordinates. These coordinates are determined as follows. Pel 204 is the upper left hand pel of block 202. Block 202 is the 19th block in frame Ft 201, and it is in the fourth column and second row of frame Ft 201. X-pels 0-14 are in blocks in the same column as block 203, x-pels 15-29 are in blocks in the second column, x-pels 30-44 are in blocks in the third column, and x-pels 45-59 are in blocks in the fourth column, which is the column block 202 is in. Thus, the left most pels of block 202 will have an x-coordinate of 45. Likewise, y-pels 0-14 are in blocks in the same row as block 203, and y-pels 15-29 are in blocks in the next row, which is the row block 202 is in. Thus, the upper most pels of block 202 will have a y-coordinate of 15. The block position value of block 202, as discussed above, is the pel position value for its upper left hand corner pel 204. Therefore, BP19,t = (45,15).
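The coordinate arithmetic in this example can be expressed compactly (a hypothetical helper written for illustration, assuming 0-indexed blocks counted left to right, top to bottom, 16 blocks per row and 15 x 15 pel blocks as in FIG. 2A):

```python
def block_position(i: int, blocks_per_row: int = 16, block_size: int = 15):
    """Pel coordinates (x, y) of the upper-left pel of block i, with
    blocks numbered 0, 1, 2, ... left to right, top to bottom."""
    row, col = divmod(i, blocks_per_row)
    return (col * block_size, row * block_size)

# Block 19 sits in the fourth column (col 3) and second row (row 1):
print(block_position(19))  # -> (45, 15), matching BP19,t in the text
```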
Once the data blocks and their respective block position values are defined, the control circuitry directs that each block DBi,t be processed, block by block.
Returning to FIG. 1, the control circuitry instructs the video frame storage unit 103 to provide a new block of video data, DBi,t and the corresponding block position data, BPi,t to the motion estimator 104. The new block video data DBi,t is further transmitted to the subtraction node 105 and stored in a buffer therein.
The motion estimator 104 then effects block matching in order to match DBi,t with a data block, MBref, from the video frame stored in the reference frame storage buffer 108. The storage buffer 108 contains at least one reference video frame. Because the block is to be encoded using motion compensation prediction in the present embodiment, the reference frame comprises the previous video frame, Ft_1. In motion compensation prediction, the location of the displaced block represents the location of the video image defined by the new block within the previous frame.
If, however, the new block is to be encoded using motion compensation interpolation techniques, the motion estimator 104 locates MBref within a plurality of reference frames stored in the reference frame storage buffer 108.

The plurality of reference frames may correspond to previous and/or future video frames depending on the chosen motion compensation technique. In such cases, the motion estimator 104 performs the block matching step for each of the plurality of reference frames.
The motion estimator 104 is operable to effect block matching for each reference frame in the following manner.
The motion estimator 104 first defines a search window within the reference frame in which the displaced block may be expected to be found. According to the present invention, the motion estimator 104 uses a global motion vector in defining the location of the search window. The global motion vector is representative of motion common to all the blocks within the video frame Ft, and is from time to time provided by the global motion estimator 106 as discussed in detail further below. The defined search window is thereafter searched until the displaced block MBref for the new block DBi,t is found.
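One plausible way to place the search window as just described can be sketched in Python. The patent does not prescribe this exact formula; the centre-on-block-plus-global-offset rule and the clipping to the frame boundary are assumptions made here for illustration:

```python
def window_origin(block_pos, global_mv, window, block_size, frame_w, frame_h):
    """Upper-left pel of the search window: centre the window on the
    block position shifted by the global motion vector, then clip the
    window so it stays inside the reference frame."""
    margin = (window - block_size) // 2
    x = block_pos[0] + global_mv[0] - margin
    y = block_pos[1] + global_mv[1] - margin
    x = max(0, min(x, frame_w - window))
    y = max(0, min(y, frame_h - window))
    return (x, y)

# Block at (45, 15), global motion (4, 3), 25-pel window, 15-pel block,
# in a hypothetical 240 x 240 pel frame:
print(window_origin((45, 15), (4, 3), 25, 15, 240, 240))  # -> (44, 13)
```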
Once a displaced block MBref is found, the motion estimator 104 determines the motion vector MVi,dt corresponding to the new block. The value dt indicates the frame number difference between Ft and the reference frame. For motion prediction, the reference frame will typically be the previous frame, and therefore dt = 1.

The motion vector MVi,dt comprises a two-dimensional value (dx,dy) representing the horizontal and vertical displacement between the new block DBi,t and the displaced block MBref. The two-dimensional motion vector value (dx,dy) may suitably be determined by calculating the difference between the new block position value BPi,t and the pel position value of the pel in the upper left corner of the displaced block MBref. For example, referring to FIG. 2, the block 202 (DB19,t) has a block position value BP19,t = (45,15). If it is further assumed that a displaced block has been identified as corresponding to block 202 and that the pel position value of the upper left hand corner pel of the displaced block MBref is (50,21), then the motion vector MV19,dt = (dx,dy) = (+5,+6).
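The motion vector arithmetic of this example reduces to a component-wise subtraction (a minimal sketch; the function name is chosen here for clarity):

```python
def motion_vector(displaced_pos, block_pos):
    """Motion vector (dx, dy): displaced-block position in the reference
    frame minus the new-block position in the current frame."""
    return (displaced_pos[0] - block_pos[0], displaced_pos[1] - block_pos[1])

# Displaced block at (50, 21), new block BP19,t = (45, 15):
print(motion_vector((50, 21), (45, 15)))  # -> (5, 6)
```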
A functional block diagram of an exemplary block matching and motion vector generating method which may be employed by the motion estimator 104 to achieve the foregoing is discussed further below in connection with FIG. 3A.
The displaced block video data MBref in the motion estimator 104 is then transmitted to the subtraction node 105. The displaced block video data is also transmitted to the addition node 110 for purposes discussed further below. In the subtraction node 105, the displaced block video data MBref, comprising both luminance and chrominance data, is subtracted on a pel by pel basis from the new block video data DBi,t received and stored from the video frame storage unit 103. The resulting error signal ERROR represents the difference between the displaced block MBref and the new block DBi,t. The signal ERROR is then provided to the coding device 107.
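The pel-by-pel subtraction performed by the subtraction node can be sketched as follows, treating a block as a two-dimensional list of pel values (luminance or chrominance would each be handled this way; this is an illustrative sketch, not the patent's circuit):

```python
def error_signal(new_block, displaced_block):
    """Pel-by-pel difference between the new block DBi,t and the
    displaced block MBref, producing the ERROR signal."""
    return [[n - d for n, d in zip(new_row, disp_row)]
            for new_row, disp_row in zip(new_block, displaced_block)]

# A tiny 2 x 2 example of new-block and displaced-block pel values:
print(error_signal([[10, 20], [30, 40]], [[9, 18], [27, 36]]))
# -> [[1, 2], [3, 4]]
```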
The motion estimator 104 provides the motion vector MVi,dt to the coding device 107. The motion vector is also provided to the global motion estimator 106 for purposes discussed further below. The coding device 107 comprises digital circuitry, which may include a microprocessor, that is capable of producing digital signals according to the MPEG1, MPEG2 or other standard to the video coder output 102.
The coding device 107 also receives the new block video data DBi,t from the video frame storage buffer 103.
The control circuitry then determines whether the motion compensated version of the block, in other words, the motion vector and the signal ERROR, should be encoded and transmitted. If not, the control circuitry instructs the coding device 107 to encode the new block video data DBi,t.

If, however, the motion compensated block is to be transmitted, the control circuitry instructs the coding device 107 to encode for transmission the motion vector MVi,dt and the signal ERROR. The encoded data is then provided to the circuit output 102 for transmission to one or more receiving devices, not shown. In either case, the encoded data is transmitted in a bit stream along with appropriate signals to indicate whether the motion compensated version of the block has been transmitted.
After transmission, the new block DBi,t is stored in order to build the current frame in new frame storage buffer 109. If the motion compensated version of the block was transmitted, the addition node 110 will receive a decoded version of the coded error signal from the coding device 107 and will add the error signal to MBref which is received from the motion estimator 104. The resulting block, called a compensation block, closely resembles the new block DBi,t. The compensation block is placed inside the new frame buffer 109 in a position corresponding to the new block position value, BPi,t.
If, however, the motion compensated version of the block was not transmitted, the new block video data DBi,t may be placed directly into the new frame buffer 109 from the motion estimator 104. In this manner, the frame Ft is built within the new frame storage buffer 109 which will eventually be placed in the reference frame storage buffer 108.
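The reconstruction performed at the addition node 110 can be sketched in the same vein; as before, the block size and values are illustrative assumptions.

```python
# Sketch of the addition node 110: when the motion compensated version
# of a block was transmitted, the compensation block is the displaced
# block MBref plus the decoded error signal. Values are hypothetical.

def compensation_block(mb, error):
    """Rebuild a block by adding the decoded error signal to MBref."""
    return [[mb[r][c] + error[r][c] for c in range(len(mb[0]))]
            for r in range(len(mb))]

mb = [[119, 122], [117, 123]]   # displaced block from the reference frame
err = [[1, 0], [1, -2]]         # decoded error signal
rebuilt = compensation_block(mb, err)  # closely resembles the new block
```

This mirrors what a receiver does, which is why the coder adds a decoded (rather than original) error signal: the new frame buffer 109 then holds exactly the reference data the decoder will have.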
Concurrent with the operation of the coding device 107 and the addition node 110, the global motion estimator 106 receives the motion vector MVi,dt. The motion vector MVi,dt is received into a buffer, stack or memory within the global motion estimator 106.
The process performed by the video coder circuit 100, as outlined above, is repeated for each block DBi,t of each frame until an entire video frame (i = NB) has been encoded by the video coder circuit 100. Once the entire frame Ft has been processed, the control circuitry will cause the new frame buffer 109 to transfer the video data corresponding to the frame Ft to the reference frame storage buffer 108 in order to process the next frame corresponding to the time t+1.
Furthermore, before the next frame is processed, the global motion estimator 106 will generate and transmit a new global motion vector to the motion estimator 104. The global motion estimator 106 generates the global motion vector using the motion vectors stored during the processing of Ft. The global motion vector approximates the component of motion common to every block in a frame of a video sequence, which may be due to camera motion.
The motion estimator 104 receives and stores the global motion vector GMV prior to receiving the new video data block for the first block of the next frame (DB0,t+1). The global motion estimator may be programmed to execute the above functions as discussed below in connection with FIG. 4.
It will be understood that the video coder circuit 100 described in connection with FIG. 1 is exemplary. The global motion estimation method and apparatus of the present invention may be utilized in any block-based video motion compensation device that performs block matching to compute motion vectors between a block to be coded and a displaced block within one or more reference frames, by first defining a search window in the reference frames.
For example, U.S. Pat. No. 5,151,784 illustrates a video coder circuit incorporating such a motion compensation device that is operable to employ both motion compensation prediction and motion compensation interpolation for coding block based video data. The method according to the present invention may readily be adapted for use therein.

FIG. 3A illustrates a functional flow chart for a motion compensation device that employs a global motion vector to effect block matching according to the present invention. The flow chart may suitably be executed by the motion estimator 104 in FIG. 1. The functionality of the flow chart, however, is designed to be compatible with alternative video coder circuits employing both motion compensation prediction and motion compensation interpolation. The processing means and memory means (processor) within the motion estimator execute a program performing the steps in FIG. 3A to effect block matching and motion vector generation.
The processor first receives new block video data DBi,t for a block i within a frame Ft in step 302. Next, in step 310, the search window Aref within the reference frame is defined. Aref is video data representing the portion of the reference frame within which a displaced block MBref for DBi,t may be located. The location of Aref within the reference frame is defined by first centering Aref on the new block position BPi,t, and then displacing Aref by an amount defined by the stored global motion vector GM.
It should be noted that the global motion vector GM is normalized to represent the global motion between the current frame and the immediately preceding frame, Ft-1.
As a consequence, if the reference frame is not the immediately preceding frame, the global motion vector should be scaled accordingly. Scaling may suitably be linear. For example, if the reference frame corresponds to a time t-2, then the global motion vector should be doubled before determining the location of Aref. Likewise, in motion compensation interpolation, the reference frame may be a future frame, in which case the global motion vector should be inverted. By storing and subsequently scaling a normalized global motion vector, the processor may suitably apply the global motion vector to block matching operations in both interpolated and predicted modes.
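The linear scaling described above can be sketched as follows, assuming a 2-D integer vector and the dt convention used in FIG. 4 (current frame number minus reference frame number); the function name is illustrative.

```python
# Scaling a normalized global motion vector (motion per one frame
# interval) to an arbitrary reference frame. dt = 2 doubles the
# vector (reference at time t-2); a negative dt inverts it, as for
# a future reference frame in motion compensation interpolation.

def scale_global_vector(gm, dt):
    gx, gy = gm
    return (gx * dt, gy * dt)

gm = (3, -1)                          # normalized global motion vector
prev = scale_global_vector(gm, 1)     # immediately preceding frame
doubled = scale_global_vector(gm, 2)  # reference frame at time t-2
future = scale_global_vector(gm, -1)  # future reference frame (inverted)
```

Because the stored vector is always per one frame interval, a single stored value serves both predicted and interpolated block matching.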
The size of the search window Aref should preferably be large enough to ensure inclusion of the displaced block, but small enough that the block matching function does not unduly load the processing means of the motion estimator. The size of the search window Aref is a predetermined number which takes into account the maximum object motion between video frames and may vary depending upon the application, as well as other factors. In the context of a particular application, the determination of a proper size will be apparent to one skilled in the art.
FIG. 3B further illustrates the novel method of defining the search window Aref as described in connection with step 310. In FIG. 3B, block matching is effected for the given block 354 from the frame Ft to identify the displaced block 351 from the reference frame, which may suitably be the previous frame.
The image in a data block DBi,t is assumed to have moved from the position of block 351 in the reference frame to the position of DBi,t shown by the block 354. The movement of the image in DBi,t is due in substantial part to camera movement or some other source of global motion.
The reference frame 352 has the same dimensions as the video frame to be coded from which DBi,t originated.
The search window Aref is located by first centering the search window on the block position of the block 354, as illustrated by the centered window 356.
The search window is then displaced by the global motion vector 355, as illustrated by the displaced window 353. The global motion vector 355 is representative of the component of motion, which may be caused, for example, by camera movement, that is common to all blocks within a frame. The global motion vector 355 may suitably be provided from time to time by the methods described below in connection with FIG. 4.
The use of global motion in defining the search window location provides a clear improvement over the prior art. Referring to FIG. 3B, the prior art method for locating the search window consists essentially of centering the search window around the position of the new block 354, as illustrated by centered window 356. See, for example, Jain, et al, "Displacement Measurement and its Application in Interframe Image Coding," 29 IEEE
Transactions on Communications pp. 1799-1808 (1981). It can be seen that the centered search window 356 does not include the displaced block 351. If the displaced block is not found, the motion estimator cannot effectively aid video compression.
The centered search window 356 could, of course, be enlarged to encompass displaced block 351. As discussed above, however, enlarging the search window can greatly increase the number of calculations required for block matching. Furthermore, because many video sequences have little or no camera motion, it would be wasteful to enlarge the search window when a smaller search window is adequate in still sequences. By using global motion, the search window can remain a reasonable size yet still encompass the displaced block in large camera movement sequences. Consequently, because the use of global motion allows for a larger range of motion when camera or global motion is present, it provides a substantial improvement to motion compensation techniques.
Returning to FIG. 3A, once the size and location of the search window Aref are chosen, the motion estimator identifies a displaced block MBref within the search window using predetermined criteria in steps 312, 314 and 316.
The identification of a displaced block within a reference frame is well known in the art. See, for example, Jain, et al, above, at 1799-1808. The displaced block is a block of data within the search window having the least image difference when compared to DBi,t. In step 312, the block DBi,t is compared with several candidate displaced blocks in the search window in order to determine the candidate block having the least image difference. In step 314, the data for the candidate blocks is retrieved as needed from a means for storing the reference frame, for example, the reference frame storage buffer 108 of FIG. 1. The image difference may suitably constitute the mean-absolute-difference or the mean-squared-error between the total pel luminance value for the new block DBi,t and the total pel luminance value for each candidate displaced block. For more information on such block matching techniques, see, for example, Jain, et al, above, at 1800; Srinivasan, et al, "Predictive Coding Based on Efficient Motion Estimation," Proc. Int'l Conf.
Communications pp. 521-26 (Amsterdam, May 14-17, 1984).
The choice of candidate blocks within the search window for which such image comparisons are made in step 312 is dictated by a predetermined search methodology.
The search methodology may suitably be an exhaustive search, which compares every definable block within the search window with the new block. The exhaustive search is often preferable because it is guaranteed to find the block in the search area having the minimum error. The search methodology may alternatively be a logarithmic search, a hierarchical search or other search method which is known in the art. For more information on these alternative search methodologies, see, for example, Jain, et al, above; Srinivasan, et al, above; and Bierling, "Displacement Estimation by Hierarchical Blockmatching,"
Proc. Visual Communications & Image Processing '88, (SPIE, November 9-11, 1988).
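An exhaustive search with a mean-absolute-difference criterion, applied to a search window displaced by the global motion vector as in step 310, can be sketched as follows; the frame size, block size, window radius and helper names are illustrative assumptions, not the patent's implementation.

```python
# Exhaustive block matching over a search window that is centered on
# the new block position and then displaced by the global motion
# vector. Returns the motion vector to the best matching block.

def mad(a, b):
    """Mean absolute difference between two equally sized blocks."""
    n = len(a) * len(a[0])
    return sum(abs(a[r][c] - b[r][c])
               for r in range(len(a)) for c in range(len(a[0]))) / n

def get_block(frame, y, x, size):
    return [row[x:x + size] for row in frame[y:y + size]]

def exhaustive_search(new_block, ref_frame, by, bx, radius, gmv):
    size = len(new_block)
    cy, cx = by + gmv[0], bx + gmv[1]   # window center after displacement
    best, best_mv = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = cy + dy, cx + dx
            if 0 <= y <= len(ref_frame) - size and 0 <= x <= len(ref_frame[0]) - size:
                d = mad(new_block, get_block(ref_frame, y, x, size))
                if best is None or d < best:
                    best, best_mv = d, (y - by, x - bx)
    return best_mv, best

# Tiny illustration: a 2x2 block that moved by (+1, +2) between frames.
ref = [[0] * 8 for _ in range(8)]
ref[3][4], ref[3][5], ref[4][4], ref[4][5] = 9, 8, 7, 6
mv, diff = exhaustive_search([[9, 8], [7, 6]], ref, 2, 2, 1, (1, 2))
```

With gmv = (0, 0), the same radius-1 window misses the displaced block entirely, mirroring the situation of FIG. 3B in which the centered window 356 fails to include block 351.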

Once the displaced block MBref is identified, the motion estimator 104 derives the motion vector MVi,dt for the new block DBi,t in step 318. The motion vector is defined as the displacement between the new block position BPi,t and the pel position of the upper left hand corner pel, or the first pel, of the displaced block MBref. The displaced block is thereafter provided in step 320 to other elements in the coding circuit, for example, the subtraction node 105 and addition node 110 as illustrated in FIG. 1 above. In step 322, the motion vector MVi,dt is provided to the global motion estimator and other elements in the coding circuit, for example, the coding device from FIG. 1.
While motion compensation prediction employs only one reference frame, motion compensation interpolation performs block matching using more than one reference frame. As a consequence, in step 324, the processor must determine if there is another reference frame with which to perform block matching. If in step 324 the processor determines that another block matching step must be performed, the processor executes step 326. In step 326, the processor instructs the reference frame storage means to make the next reference frame available for block matching. The processor then returns to step 310 to define a search window within the new reference frame.
If, however, the answer to step 324 is no, the processor executes step 328.
In step 328, the processor determines whether there are more new data blocks within the current frame to be compensated. This information may suitably be provided by the control circuitry. If no, the processor retrieves a new global motion vector GMV from the global motion estimator in step 330. The global motion estimator may suitably operate according to the method discussed below in connection with FIG. 4. In step 332, the new global motion vector is stored as GM for use within the motion estimator 104 for processing the next video frame, which in motion compensation prediction would typically be Ft+1.
The motion estimator 104 thereafter executes step 302.
If, however, the answer in step 328 is yes, the motion estimator 104 proceeds directly to step 302 to receive further blocks from the current frame.
FIG. 4 shows an exemplary functional flow diagram of the novel global motion estimation program according to the present invention. The program is executed by processing means within a global motion estimator such as the global motion estimator 106 that is discussed above in connection with FIG. 1. The global motion estimator may suitably be a digital microprocessor circuit with associated memory and buffer configurations capable of executing the software as well as receiving and transmitting the appropriate data from and to other components. Such circuitry is well known in the art. In one embodiment, the microprocessor required to execute the program may be a dedicated microprocessor. However, because the global motion estimation computation is relatively simple and occurs only once per video frame, the program may be executed by a shared microprocessor, including, for example, one used for timing and control of the overall coder circuit.
Essentially, the global motion estimator generates a global motion vector for a video frame by estimating the component of motion common to every block in the entire frame. This motion component is typically caused by camera motion or panning. The global motion estimator according to the present invention uses the block motion vectors associated with the previous frame to calculate the present global motion component.
A global motion estimator embodiment that executes the steps illustrated in FIG. 4 operates by first defining a common motion vector by taking the average of the motion vectors stored during the coding of a previous frame. A
new global motion vector may then be provided to a motion estimator programmed to operate as discussed above in reference to FIG. 3A. Other methods for estimating the global motion may be readily employed by one of ordinary skill in the art.
In step 401, motion vector data defining a motion vector MVi,dt, for a block i, between a frame and a reference frame dt frames apart, is received from a block-based motion estimator. The motion estimator may suitably operate in accordance with FIG. 3A above to effect block matching and motion vector generation. In step 401, the received motion vector is also normalized so that it represents the motion between consecutive frames. A normalized motion vector is given by MVi,dt/dt, where dt is the current frame number minus the reference frame number. In step 402, the motion vector MVi,dt is added to a motion vector total, MVT, for the frame.
Alternatively, the motion vector MVi,dt may be stored in a memory in order to facilitate alternative global motion vector computation methods, as discussed further below.
In step 403, a counter C is advanced to track the number of motion vectors received for the current frame.
In step 404, the global motion estimator may return to receive another motion vector for the current block i.
If the coder circuit employs motion compensation interpolation, multiple motion vectors, typically two, are generated during the processing of each new block of video data. Control circuitry typically provides this information to the global motion estimator so that the appropriate decision in step 404 can be made. If there are no more motion vectors for the block i, step 405 is executed.
In step 405, the global motion estimator determines whether there is another block of the current video frame to be motion compensated, in other words, whether another motion vector is expected for the frame Ft. If another motion vector is expected for the current frame, the program returns to step 401 and waits for the next motion vector MVi,dt from the motion estimation device. If, however, another motion vector is not expected for the frame, the global motion estimator executes step 406.
In step 406 the common motion vector is calculated for the current frame. The common motion vector may be the average motion vector determined by dividing the motion vector total by the number of vectors, C, received for the frame, or in other words, MVT/C. Alternatively, the common motion vector may suitably be the median motion vector or some other value as discussed below.
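Under the averaging variant, steps 401 through 406 reduce to a running sum and count; the sketch below assumes 2-D motion vectors arriving together with their dt values, and the names are illustrative.

```python
# Mean common motion vector: each incoming vector is normalized by dt
# (step 401), accumulated into the total MVT (step 402) and counted
# (step 403); the common vector is MVT / C (step 406).

def common_motion_vector(vectors_with_dt):
    mvt_x = mvt_y = 0.0
    c = 0
    for (mx, my), dt in vectors_with_dt:
        mvt_x += mx / dt        # normalize to motion per frame interval
        mvt_y += my / dt
        c += 1
    return (mvt_x / c, mvt_y / c)

# Three prediction vectors (dt = 1) and one interpolation vector (dt = 2):
cmv = common_motion_vector([((2, 0), 1), ((2, 2), 1), ((4, 0), 2), ((2, 2), 1)])
```

Because normalization happens as vectors arrive, the resulting common vector is already in per-frame units, matching the normalized global motion vector stored by the motion estimator.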
In step 407, the global motion vector GMV is generated for use in motion estimating the next frame, which for motion compensation prediction would be the frame Ft+1. Because the motion vectors are normalized as they are received, the resulting global motion vector will also already be normalized. As discussed above, the global motion vector may then be scaled in the motion estimator to allow for block matching therein with respect to any reference frame. The global motion vector GMV may suitably be the common motion vector calculated in step 406. It should be noted, however, that the use of multiple common motion vectors to generate a global motion vector may allow more sophisticated methods of estimating the global motion.
In step 408, the global motion vector is provided over a connection to the motion estimator, which is operable to perform block-based motion compensation. The motion estimator 104 may suitably perform the method discussed above in connection with FIG. 3A.


In the functional block 409, the counter C and motion vector total MVT of the global motion estimator are re-initialized to prepare for the global motion vector calculation for the next video frame.
In an alternative embodiment of the global motion estimation method, the median motion vector from a plurality of stored motion vectors may provide the common motion vector in step 406. In such an embodiment, step 402 of FIG. 4 would include the operation of storing the motion vector MVi,at. Step 406 would then include the step of analyzing the several stored motion vectors to determine the median. The foregoing steps may be readily implemented.
In yet another alternative embodiment, the global motion vector may suitably be determined from a histogram.
Under this method, the motion vector occurring with the most repetition within the processing of a frame would constitute the global motion vector.
To implement this embodiment, step 402 would include the step of updating a histogram of a table of possible motion vector values. For example, if MVi,dt = (0,+3), the occurrence index for (0,+3) within the table of possible motion vector values is incremented by one. The population of possible motion vectors may suitably be created for each new frame, whereby the value of motion vector MVi,dt is added to the population unless it is already represented. For example, if MVi,dt = (0,+3), and no occurrence index for that value exists, then (0,+3) is added and its index is incremented to one.
Additionally, step 405 would include the step of scanning the table for the largest occurrence index. The motion vector value occurring the most is then determined to be the common motion vector. The programming necessary to achieve the foregoing may be readily accomplished.
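A minimal sketch of the histogram variant follows, using a dictionary-based counter as the table of occurrence indices; the function name is an assumption.

```python
# Histogram-based common motion vector: step 402 increments an
# occurrence index per motion vector value, and steps 405-406 select
# the value with the largest occurrence index.
from collections import Counter

def histogram_common_vector(motion_vectors):
    table = Counter(motion_vectors)     # occurrence index per vector value
    return table.most_common(1)[0][0]   # most repeated vector

mvs = [(0, 3), (0, 3), (1, -1), (0, 3), (2, 2)]
cmv = histogram_common_vector(mvs)
```

Here the static background blocks share the vector (0, +3), so it dominates the histogram even though other blocks move independently.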

The usefulness of this embodiment is clear. It may be assumed that most objects (and blocks) within a video frame do not exhibit independent movement. For example, background objects in most scenes are relatively static.
However, if camera motion or global motion is present, these otherwise static objects will all have the same motion vector. Because independently moving objects having the exact same motion vector will seldom outnumber the static objects, the motion vector having the most occurrences within a video frame is assumed to represent the global motion.
It is to be understood that the above-described arrangements of the invention are merely illustrative.
Other arrangements may be devised by those skilled in the art which will embody the principles of the invention and fall within the spirit and scope thereof.

Claims (21)

1. A method of converting a block of video data which is one of a plurality of blocks defining a frame of video data, into a compressed encoded digital bit stream for transmission over a transmission medium, said method comprising:
a) generating a global motion vector;
b) generating both video data and position data corresponding to the block of video data to be coded; and
c) effecting block matching by using predetermined criteria to identify a displaced block corresponding to the block of video data to be coded, the displaced block having a location within a reference frame, said block matching step further comprising defining a search window within the reference frame using the global motion vector.
2. The method of claim 1 further comprising the steps:
d) generating a motion vector defined by the block position data and the location of the displaced block within the reference frame;
e) storing said motion vector;
f) encoding said motion vector into a bit stream for transmission; and
g) repeating steps b) through f) for a plurality of blocks to be encoded, said plurality of blocks constituting at least a portion of one frame of video data.
3. The method of claim 2 wherein the step of generating the global motion vector further comprises the step of utilizing a plurality of the previously stored motion vectors from said step (e) to generate the global motion vector and further comprising the step of:

h) repeating steps a) through g) for a plurality of frames of video data.
4. The method of claim 3 wherein the plurality of previously stored motion vectors comprise motion vectors stored during the encoding of a frame of video data.
5. The method of claim 3 wherein the step of generating the global motion vector utilizing a plurality of the previously stored motion vectors further comprises utilizing the mean of the plurality of the previously stored motion vectors to generate the global motion vector.
6. The method of claim 3 wherein the step of generating the global motion vector utilizing a plurality of the previously stored motion vectors further comprises utilizing the median of the plurality of the previously stored motion vectors to generate the global motion vector.
7. The method of claim 3 wherein the step of generating the global motion vector utilizing a plurality of the previously stored motion vectors further comprises utilizing the motion vector occurring with the most repetition within the plurality of the previously stored motion vectors to generate the global motion vector.
8. An apparatus for determining the location of a displaced block of video data within a reference frame corresponding to a block of video data where the displaced block is identified from a plurality of candidate displaced blocks using predetermined criteria, the apparatus comprising:

a) a motion estimator comprising a processing means and memory means, the processing means operable to determine the location of a displaced block by defining a search window within the reference frame using a global motion vector, identifying a plurality of candidate displaced blocks having a location within the search window, and comparing each candidate displaced block with the block of video data; and
b) a global motion estimator coupled to the motion estimator, comprising processing means and storage means, said processing means operable to generate global motion vectors.
9. The apparatus of claim 8 wherein the motion estimator further comprises means for effecting motion compensation prediction.
10. The apparatus of claim 8 wherein the motion estimator further comprises means for effecting motion compensation interpolation.
11. A video coder circuit for converting a plurality of frames of video data comprising a plurality of pels, each pel having pel video data, a plurality of pels further defining a block, into a compressed encoded digital bit stream for transmission over a transmission medium, said video coder circuit comprising:
a) a video frame storage unit operable to generate a block of video to be coded from a frame of video data, said block having a position within the frame of video data;
b) a motion estimation means connected to the video frame storage unit for receiving the video data corresponding to a block of video to be coded, said motion estimation means operable to effect block matching between a block of video to be coded and a block within a reference frame to identify a displaced block having a location within the reference frame, and to generate a motion vector defined by the position of the block to be coded and the location of the displaced block within the reference frame, said motion estimation means further operable to effect block matching using a global motion vector;
c) a first buffer for storing video data corresponding to one or more reference frames connected to the motion estimation means for providing reference frame video data thereto;
d) a second buffer for storing video data corresponding to a current frame connected to the first buffer for providing current frame video data thereto and connected to the motion estimation means;
e) a subtraction node connected to the video frame storage unit for receiving block video data and connected to the motion estimation means for receiving displaced block video data, said subtraction node operable to generate an error signal representative of the difference in pel video data between two blocks of video data; and
f) a coding device connected to the subtraction node for receiving an error signal and connected to the motion estimation means for receiving a motion vector, said coding device operable to encode video data from a block error signal and motion vector data into a bit stream.
12. The system of claim 11 further comprising a global motion estimation means connected to the motion estimation means for providing the global motion vector thereto, said global motion estimation means operable to generate global motion vector data representative of the global motion between video frames.
13. The system of claim 12 wherein the global motion estimation means is operably connected to receive motion vectors from the motion estimation means and is further operable to generate a global motion vector from a plurality of received motion vectors.
14. The video coder circuit of claim 11 wherein the motion estimation means further comprises means for effecting motion compensation prediction.
15. The video coder circuit of claim 11 wherein the motion estimation means further comprises means for effecting motion compensation interpolation.
16. A method of defining a search window within a stored reference video frame for use in block matching in a block-based motion estimation device, the block-based motion estimation device being operable to effect block matching and generate motion vectors representing the displacement between a new video data block and a displaced block within the reference video frame, the search window definition method comprising:
a) generating a motion vector for the new data block from a frame of video data comprising a plurality of data blocks;
b) storing said motion vector in a memory;
c) repeating steps a) and b) until a plurality of motion vectors for a plurality of said data blocks within said frame of video data have been generated and stored;
d) generating a global motion vector from the plurality of the stored motion vectors;
e) identifying a video data block having a location within a frame of video data to be coded; and
f) defining a search window having a location within a stored reference video frame wherein the location of the search window is at least partially dependent upon the global motion vector.
17. The method of claim 16 wherein the location of the search window to be defined in step f) is at least partially dependent on the location of the video data block within the frame of video data to be coded.
18. The method of claim 16 wherein the step of generating the global motion vector using a predetermined method further comprises the step of calculating the mean of the plurality of stored motion vectors.
19. The method of claim 16 wherein the step of generating the global motion vector using a predetermined method further comprises the step of determining the median of the plurality of stored motion vectors.
20. The method of claim 16 wherein the step of generating the global motion vector using a predetermined method further comprises the step of determining the motion vector occurring with the most repetition within the plurality of stored motion vectors.
21. The method of claim 20 wherein the step of generating the global motion further comprises the step of creating a table of occurrence indices for a plurality of possible motion vectors and the step of storing the motion vector comprises incrementing an occurrence index corresponding to the motion vector in the table.
CA002130779A 1993-11-04 1994-08-24 Method and apparatus for improving motion compensation in digital video coding Expired - Fee Related CA2130779C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/147,802 US5473379A (en) 1993-11-04 1993-11-04 Method and apparatus for improving motion compensation in digital video coding
US147,802 1993-11-04

Publications (2)

Publication Number Publication Date
CA2130779A1 CA2130779A1 (en) 1995-05-05
CA2130779C true CA2130779C (en) 1999-06-22

Family

ID=22522957

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002130779A Expired - Fee Related CA2130779C (en) 1993-11-04 1994-08-24 Method and apparatus for improving motion compensation in digital video coding

Country Status (5)

Country Link
US (1) US5473379A (en)
EP (1) EP0652678B1 (en)
JP (1) JP3190529B2 (en)
CA (1) CA2130779C (en)
DE (1) DE69422266T2 (en)

US6639945B2 (en) 1997-03-14 2003-10-28 Microsoft Corporation Method and apparatus for implementing motion detection in video compression
US6118817A (en) * 1997-03-14 2000-09-12 Microsoft Corporation Digital video signal encoder and encoding method having adjustable quantization
US5903673A (en) * 1997-03-14 1999-05-11 Microsoft Corporation Digital video signal encoder and encoding method
AU1941797A (en) * 1997-03-17 1998-10-12 Mitsubishi Denki Kabushiki Kaisha Image encoder, image decoder, image encoding method, image decoding method and image encoding/decoding system
JPH10262258A (en) * 1997-03-19 1998-09-29 Sony Corp Image coder and its method
US6339616B1 (en) 1997-05-30 2002-01-15 Alaris, Inc. Method and apparatus for compression and decompression of still and motion video data based on adaptive pixel-by-pixel processing and adaptive variable length coding
US6067322A (en) * 1997-06-04 2000-05-23 Microsoft Corporation Half pixel motion estimation in motion video signal encoding
US6212237B1 (en) * 1997-06-17 2001-04-03 Nippon Telegraph And Telephone Corporation Motion vector search methods, motion vector search apparatus, and storage media storing a motion vector search program
DE69803639T2 (en) * 1997-08-07 2002-08-08 Matsushita Electric Ind Co Ltd Device and method for detecting a motion vector
EP0897247A3 (en) * 1997-08-14 2001-02-07 Philips Patentverwaltung GmbH Method for computing motion vectors
KR100249223B1 (en) * 1997-09-12 2000-03-15 구자홍 Method for motion vector coding of mpeg-4
KR100590436B1 (en) * 1997-11-07 2006-06-19 코닌클리케 필립스 일렉트로닉스 엔.브이. Coding a sequence of pictures
DE69803195T2 (en) * 1997-11-27 2002-08-29 British Telecomm CODE IMPLEMENTATION
KR100325253B1 (en) * 1998-05-19 2002-03-04 미야즈 준이치롯 Motion vector search method and apparatus
EP1119975B1 (en) * 1998-10-13 2003-04-23 STMicroelectronics Asia Pacific Pte Ltd. Motion vector detection with local motion estimator
JP4656680B2 (en) * 1998-11-30 2011-03-23 シャープ株式会社 Image search information recording apparatus and image search apparatus
EP1181828B1 (en) * 1999-05-13 2010-03-17 STMicroelectronics Asia Pacific Pte Ltd. Adaptive motion estimator
EP1075147A1 (en) * 1999-08-02 2001-02-07 Koninklijke Philips Electronics N.V. Motion estimation
US7155735B1 (en) 1999-10-08 2006-12-26 Vulcan Patents Llc System and method for the broadcast dissemination of time-ordered data
US6809758B1 (en) * 1999-12-29 2004-10-26 Eastman Kodak Company Automated stabilization method for digital image sequences
US6757682B1 (en) 2000-01-28 2004-06-29 Interval Research Corporation Alerting users to items of current interest
JP2001285874A (en) * 2000-03-28 2001-10-12 Nec Corp Device for searching motion vector, its method and recording medium for recording program
US7266771B1 (en) * 2000-04-21 2007-09-04 Vulcan Patents Llc Video stream representation and navigation using inherent data
KR20020020940A (en) * 2000-05-19 2002-03-16 요트.게.아. 롤페즈 Method, system and apparatus
US6968396B1 (en) * 2001-07-26 2005-11-22 Openwave Systems Inc. Reloading of hypermedia pages by sending only changes
FR2828055B1 (en) * 2001-07-27 2003-11-28 Thomson Licensing Sa METHOD AND DEVICE FOR CODING AN IMAGE MOSAIC
US7050500B2 (en) * 2001-08-23 2006-05-23 Sharp Laboratories Of America, Inc. Method and apparatus for motion vector coding with global motion parameters
JP4472986B2 (en) * 2001-09-12 2010-06-02 エヌエックスピー ビー ヴィ Motion estimation and / or compensation
US7227896B2 (en) * 2001-10-04 2007-06-05 Sharp Laboratories Of America, Inc. Method and apparatus for global motion estimation
US7602848B2 (en) * 2002-03-26 2009-10-13 General Instrument Corporation Methods and apparatus for efficient global motion compensation encoding and associated decoding
US7266151B2 (en) * 2002-09-04 2007-09-04 Intel Corporation Method and system for performing motion estimation using logarithmic search
US20040042551A1 (en) * 2002-09-04 2004-03-04 Tinku Acharya Motion estimation
US7421129B2 (en) * 2002-09-04 2008-09-02 Microsoft Corporation Image compression and synthesis for video effects
US20040057626A1 (en) * 2002-09-23 2004-03-25 Tinku Acharya Motion estimation using a context adaptive search
SG111093A1 (en) * 2002-11-18 2005-05-30 St Microelectronics Asia Motion vector selection based on a preferred point
GB0227570D0 (en) 2002-11-26 2002-12-31 British Telecomm Method and system for estimating global motion in video sequences
GB0227566D0 (en) 2002-11-26 2002-12-31 British Telecomm Method and system for estimating global motion in video sequences
US7408986B2 (en) * 2003-06-13 2008-08-05 Microsoft Corporation Increasing motion smoothness using frame interpolation with motion analysis
US7558320B2 (en) * 2003-06-13 2009-07-07 Microsoft Corporation Quality control in frame interpolation with motion analysis
US20050105621A1 (en) * 2003-11-04 2005-05-19 Ju Chi-Cheng Apparatus capable of performing both block-matching motion compensation and global motion compensation and method thereof
JP4591657B2 (en) 2003-12-22 2010-12-01 キヤノン株式会社 Moving picture encoding apparatus, control method therefor, and program
KR100994773B1 (en) * 2004-03-29 2010-11-16 삼성전자주식회사 Method and Apparatus for generating motion vector in hierarchical motion estimation
US8045614B2 (en) 2005-05-11 2011-10-25 Dolby Laboratories Licensing Corporation Quantization control for variable bit depth
US7471336B2 (en) 2005-02-18 2008-12-30 Genesis Microchip Inc. Global motion adaptive system with motion values correction with respect to luminance level
US7675573B2 (en) 2005-02-18 2010-03-09 Genesis Microchip Inc. Global motion adaptive system with motion values correction with respect to luminance level
US9225994B2 (en) * 2005-03-14 2015-12-29 British Telecommunications Public Limited Company Global motion estimation using reduced frame lines
US8705614B2 (en) * 2005-04-04 2014-04-22 Broadcom Corporation Motion estimation using camera tracking movements
EP1991957A2 (en) * 2005-07-12 2008-11-19 Nxp B.V. Method and device for removing motion blur effects
IL170320A (en) * 2005-08-17 2010-04-29 Orad Hi Tec Systems Ltd System and method for managing the visual effects insertion in a video stream
KR100714698B1 (en) * 2005-08-29 2007-05-07 삼성전자주식회사 Enhanced motion estimation method, video encoding method and apparatus using the same
CN100474929C (en) * 2005-09-07 2009-04-01 深圳市海思半导体有限公司 Loading device and method for motion compensation data
WO2007114611A1 (en) 2006-03-30 2007-10-11 Lg Electronics Inc. A method and apparatus for decoding/encoding a video signal
WO2007148906A1 (en) 2006-06-19 2007-12-27 Lg Electronics, Inc. Method and apparatus for processing a video signal
US8532178B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8509313B2 (en) * 2006-10-10 2013-08-13 Texas Instruments Incorporated Video error concealment
KR101356735B1 (en) 2007-01-03 2014-02-03 삼성전자주식회사 Mothod of estimating motion vector using global motion vector, apparatus, encoder, decoder and decoding method
TWI367026B (en) * 2007-03-28 2012-06-21 Quanta Comp Inc Method and apparatus for image stabilization
CN101771870B (en) * 2009-01-06 2013-06-19 上海中科计算技术研究所 Quick searching method for block motion matching of video coding technique
US20110013852A1 (en) * 2009-07-17 2011-01-20 Himax Technologies Limited Approach for determining motion vector in frame rate up conversion
US8976860B2 (en) * 2009-09-23 2015-03-10 Texas Instruments Incorporated Method and apparatus for determination of motion estimation search window area utilizing adaptive sliding window algorithm
US20120147961A1 (en) * 2010-12-09 2012-06-14 Qualcomm Incorporated Use of motion vectors in evaluating geometric partitioning modes
JP2013074571A (en) * 2011-09-29 2013-04-22 Sony Corp Image processing apparatus, image processing method, program, and recording medium
GB2497812B (en) * 2011-12-22 2015-03-04 Canon Kk Motion estimation with motion vector predictor list
KR102290964B1 (en) 2014-02-19 2021-08-18 삼성전자주식회사 Video encoding device using adaptive search range and method thereof
US9769494B2 (en) * 2014-08-01 2017-09-19 Ati Technologies Ulc Adaptive search window positioning for video encoding
CN108293128A (en) 2015-11-20 2018-07-17 联发科技股份有限公司 The method and device of global motion compensation in video coding and decoding system
FR3111253B1 (en) * 2020-06-04 2023-08-25 Ateme IMAGE PROCESSING METHOD AND EQUIPMENT FOR IMPLEMENTING THE METHOD

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3689796T2 (en) * 1985-01-16 1994-08-04 Mitsubishi Electric Corp Video coding device.
JPH0722394B2 (en) * 1986-03-31 1995-03-08 日本放送協会 Motion compensation method
JP3001897B2 (en) * 1989-03-03 2000-01-24 松下電器産業株式会社 Image motion vector detection method and image motion vector detection device
JPH0813146B2 (en) * 1989-06-09 1996-02-07 松下電器産業株式会社 Motion vector detector
DE4007851A1 (en) * 1989-08-24 1991-02-28 Thomson Brandt Gmbh METHOD FOR MOTION COMPENSATION IN A MOTION IMAGE CODER OR DECODER
EP0419752B1 (en) * 1989-09-25 1995-05-10 Rai Radiotelevisione Italiana System for encoding and transmitting video signals comprising motion vectors
JP2679778B2 (en) * 1990-05-16 1997-11-19 松下電器産業株式会社 Image motion detection device
US5151784A (en) * 1991-04-30 1992-09-29 At&T Bell Laboratories Multiple frame motion estimation
JP2866222B2 (en) * 1991-06-12 1999-03-08 三菱電機株式会社 Motion compensation prediction method
EP0557007A2 (en) * 1992-02-15 1993-08-25 Sony Corporation Picture processing apparatus
US5247355A (en) * 1992-06-11 1993-09-21 Northwest Starscan Limited Partnership Gridlocked method and system for video motion compensation

Also Published As

Publication number Publication date
US5473379A (en) 1995-12-05
DE69422266D1 (en) 2000-01-27
JP3190529B2 (en) 2001-07-23
EP0652678B1 (en) 1999-12-22
EP0652678A2 (en) 1995-05-10
EP0652678A3 (en) 1995-09-27
DE69422266T2 (en) 2000-08-10
JPH07193823A (en) 1995-07-28
CA2130779A1 (en) 1995-05-05

Similar Documents

Publication Publication Date Title
CA2130779C (en) Method and apparatus for improving motion compensation in digital video coding
US5926231A (en) Method and apparatus for detecting motion vectors based on hierarchical motion estimation
US5657087A (en) Motion compensation encoding method and apparatus adaptive to motion amount
US7362808B2 (en) Device for and method of estimating motion in video encoder
US6289052B1 (en) Methods and apparatus for motion estimation using causal templates
US5786860A (en) High speed block matching for bi-directional motion vector estimation
US5760846A (en) Apparatus for estimating motion vectors for feature points of a video signal
EP1274254B1 (en) Video coding device and video decoding device with a motion compensated interframe prediction
US5706059A (en) Motion estimation using a hierarchical search
US6014181A (en) Adaptive step-size motion estimation based on statistical sum of absolute differences
EP1147668B1 (en) Improved motion estimation and block matching pattern
KR100659704B1 (en) Method, apparatus and system for obtaining a motion vector between first and second frames of video image data, and computer-readable medium
KR100492127B1 (en) Apparatus and method of adaptive motion estimation
JPH09179987A (en) Method and device for detecting motion vector
US5717470A (en) Method and apparatus for detecting optimum motion vectors based on a hierarchical motion estimation approach
JPH0795594A (en) Method and apparatus for detection of motion vector of half pixel accuracy
US5699129A (en) Method and apparatus for motion vector determination range expansion
Wee Reversing motion vector fields
US5862261A (en) Current frame prediction method and apparatus for use in an image signal encoding system
US7221390B1 (en) Computer-assisted motion compensation of a digitized image
US5485214A (en) Dual bus dual bank architecture for motion compensation
KR20030065314A (en) Method of performing motion estimation
JPH08228353A (en) Motion compensation circuit for moving image encoding
JPH1042300A (en) Motion vector detection device
JPH0951536A (en) Method and device for detecting moving vector

Legal Events

Date Code Title Description
EEER Examination request
MKLA Lapsed