Publication number: US 7324160 B2
Publication type: Grant
Application number: US 10/901,383
Publication date: Jan 29, 2008
Filing date: Jul 29, 2004
Priority date: Nov 22, 2003
Fee status: Paid
Also published as: CN1620109A, CN100385914C, US20050110902
Inventor: Seung-Joon Yang
Original Assignee: Samsung Electronics Co., Ltd.
De-interlacing apparatus with a noise reduction/removal device
US 7324160 B2
Abstract
A de-interlacing apparatus with a noise reduction/removal device. The noise reduction/removal device can include a motion prediction unit that predicts motion vectors between an image one period ahead of a previous image and a current image with respect to individual images which are sequentially inputted; a motion checking unit that applies the motion vectors predicted by the motion prediction unit to the image one period ahead of the previous image and two different images ahead of the current image in time, and checks whether the motion vectors are precise motion vectors; a motion compensation unit that compensates for motions using the motion vectors checked for preciseness thereof by the motion checking unit; and a noise removal unit that removes noise on images using the images motion-compensated by the motion compensation unit and the inputted images. Accordingly, the noise reduction/removal device can reduce or remove noise through simple procedures on noise-bearing images.
Claims (20)
1. A noise reduction/removal device, comprising:
a motion prediction unit that predicts motion vectors between an image one period ahead of a previous image and a current image with respect to individual images which are sequentially inputted;
a motion checking unit that applies the motion vectors predicted by the motion prediction unit to the image one period ahead of the previous image and two different images ahead of the current image in time, and checks whether the motion vectors are precise motion vectors;
a motion compensation unit that compensates for motions using the motion vectors checked for preciseness thereof by the motion checking unit; and
a noise removal unit that removes noise on images using the motion-compensated images by the motion compensation unit and the inputted images.
2. The noise reduction device as claimed in claim 1, wherein the motion prediction unit calculates Sum-of-Absolute-Difference (SAD) values of block units with respect to all vectors in a predicted radius and/or cost functions corresponding to the SAD values, and predicts the motion vectors based on the calculated cost functions.
3. The noise reduction/removal device as claimed in claim 1, wherein the motion checking unit calculates SAD values of block units and/or cost functions corresponding to the SAD values with respect to the image one period ahead of the previous image and the two different images ahead of the current image in time, and checks based on the calculated cost functions whether the motion vectors are precise motion vectors.
4. The noise reduction/removal device as claimed in claim 1, wherein the motion checking unit compares the calculated SAD values and/or the cost functions corresponding to the SAD values with SAD values corresponding to multiple motion vectors except for the predicted motion vectors and/or cost functions corresponding to the SAD values, and checks whether the motion vectors are precise motion vectors.
5. The noise reduction/removal device as claimed in claim 4, wherein, if the calculated SAD values and/or values of the cost functions corresponding to the SAD values are larger than any of the SAD values of block units corresponding to multiple motion vectors except for the predicted motion vectors and/or values of the cost functions corresponding to the SAD values, the motion compensation unit compensates for the motions using any of the multiple motion vectors except for the predicted motion vectors.
6. The noise reduction/removal device as claimed in claim 5, wherein the noise removal unit adds weight values of the inputted images and the images compensated by using the motion vectors checked by the motion checking unit, and produces noise-removed images.
7. A de-interlacing apparatus, comprising:
a motion prediction unit that predicts motion vectors between an image one period ahead of a previous image and a current image with respect to individual images which are sequentially inputted;
a motion checking unit that applies the motion vectors predicted by the motion prediction unit to the image one period ahead of the previous image and two different images ahead of the current image in time, and checks whether the motion vectors are precise motion vectors;
a motion compensation unit that compensates for motions using the motion vectors checked for preciseness thereof by the motion checking unit; and
a de-interlacing unit that converts an interlaced scanning format into a progressive scanning format using the motion-compensated images by the motion compensation unit and the inputted images.
8. The de-interlacing apparatus as claimed in claim 7, wherein the motion prediction unit calculates Sum-of-Absolute-Difference (SAD) values of block units with respect to all vectors in a predicted radius and/or cost functions corresponding to the SAD values, and predicts the motion vectors based on the calculated cost functions.
9. The de-interlacing apparatus as claimed in claim 7, wherein the motion checking unit calculates SAD values of block units and/or cost functions corresponding to the SAD values with respect to the image one period ahead of the previous image and the two different images ahead of the current image in time, and checks based on the calculated cost functions whether the motion vectors are precise motion vectors.
10. The de-interlacing apparatus as claimed in claim 7, wherein the motion checking unit compares the calculated SAD values and/or the cost functions corresponding to the SAD values with SAD values corresponding to multiple motion vectors except for the predicted motion vectors and/or cost functions corresponding to the SAD values, and checks whether the motion vectors are precise motion vectors.
11. The de-interlacing apparatus as claimed in claim 10, wherein, if the calculated SAD values and/or values of the cost functions corresponding to the SAD values are larger than any of the SAD values of block units corresponding to multiple motion vectors except for the predicted motion vectors and/or values of the cost functions corresponding to the SAD values, the motion compensation unit compensates for the motions using any of the multiple motion vectors except for the predicted motion vectors.
12. The de-interlacing apparatus as claimed in claim 11, wherein the de-interlacing unit adds weight values of the inputted images and the images compensated by using the motion vectors checked by the motion checking unit, and produces noise-removed images.
13. A de-interlacing apparatus comprising:
a memory to store a sequence of noise-reduced field images and a noise-bearing field image temporally succeeding the sequence;
a motion prediction unit to generate motion vectors between the noise-bearing field image and a noise-reduced field image preceding the noise-bearing field image in the sequence by at least two time intervals;
a motion checking unit to evaluate the generated motion vectors against image motion between at least a noise-reduced field image preceding in the sequence the noise-bearing field image and a noise-reduced field image preceding the noise-bearing field image in the sequence by three time intervals;
a noise processing unit to produce a noise-reduced field image temporally next in the sequence from the noise-bearing field image using a set of motion vectors matching to a predetermined threshold the image motion as evaluated by the motion checking unit; and
a de-interlacing processing unit to produce a progressive scan image from the noise-reduced field image preceding the noise-bearing field image in the sequence by the at least two time intervals using the set of motion vectors.
14. The de-interlacing apparatus as claimed in claim 13, wherein the motion prediction unit generates the motion vectors using a sum-of-absolute-differences between respective pixel blocks in the noise-bearing field image and the noise-reduced field image preceding the noise-bearing field image in the sequence by at least two time intervals.
15. The de-interlacing apparatus as claimed in claim 14, wherein the predetermined threshold in the motion checking unit is the sum-of-absolute-differences in the motion predicting unit.
16. The de-interlacing apparatus as claimed in claim 13, wherein the noise processing unit includes a motion compensating unit to apply the set of motion vectors from the noise-reduced field image preceding the noise-bearing field image in the sequence by at least two time intervals to the noise-bearing field image to produce a motion compensated noise-bearing field image.
17. The de-interlacing apparatus as claimed in claim 16, wherein the noise processing unit includes a weight value calculation unit and a noise removal unit to remove noise from the motion compensated noise-bearing field image using a weight value determined from at least the noise-bearing field image to produce the noise-reduced field image temporally next in the sequence.
18. The de-interlacing apparatus as claimed in claim 13, wherein the de-interlacing processing unit includes a motion compensating unit to apply the set of motion vectors from the noise-reduced field image preceding the noise-bearing field image to the noise-reduced field image preceding the noise-bearing field image by three time intervals to the noise-bearing field image to produce a motion compensated field image.
19. The de-interlacing apparatus as claimed in claim 18, wherein the de-interlacing processing unit includes a weight value calculation unit and a de-interlacing unit to de-interlace the motion compensated field image with a weight value determined from at least the noise-reduced field image preceding the noise-bearing field image in the sequence by at least two time intervals to produce the progressive scan image.
20. The de-interlacing apparatus as claimed in claim 13, wherein the motion checking unit evaluates motion vectors other than the generated motion vectors to arrive at the set of motion vectors when the generated motion vectors are evaluated to be unacceptable with respect to the predetermined threshold.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of Korean Patent Application No. 2003-72073, filed Oct. 16, 2003, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present general inventive concept relates to a de-interlacing apparatus with an efficient noise reduction/removal device, and more particularly, to a de-interlacing apparatus with an efficient noise reduction/removal device capable of not only efficiently reducing/removing noise on images but also improving de-interlacing performance on images suffering from severe noise.

2. Description of the Related Art

Moving-picture scanning formats are mainly divided into a progressive scanning format and an interlaced scanning format. For the purpose of better understanding, FIG. 1A shows 30 Hz interlaced scanning (a 60 Hz field frequency) and FIG. 1B shows 30 Hz progressive scanning.

In progressive scanning, every line of an image frame is sampled at the same time, whereas in interlaced scanning each frame is sampled at two different times, with the sampling alternating between lines. That is, in interlaced scanning each image frame is generally made up of two image fields, and the two fields sampled at different times are referred to as a top field and a bottom field, an odd field and an even field, an upper field and a lower field, a first field and a second field, or the like.
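For illustration only, the following minimal Python/NumPy sketch (not part of the patent; the helper names are hypothetical) shows how a frame splits into the two fields described above and how two fields can be woven back together for a static scene:

import numpy as np

def split_fields(frame):
    """Split a progressive frame (H x W array) into its two interlaced fields."""
    top_field = frame[0::2, :]     # even-numbered lines (top/odd field)
    bottom_field = frame[1::2, :]  # odd-numbered lines (bottom/even field)
    return top_field, bottom_field

def weave_fields(top_field, bottom_field):
    """Re-interleave two fields into a full frame (only valid for static scenes)."""
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height, top_field.shape[1]), dtype=top_field.dtype)
    frame[0::2, :] = top_field
    frame[1::2, :] = bottom_field
    return frame

Weaving is only correct when nothing moves between the two field sampling times; moving content produces the combing artifacts that de-interlacing is meant to avoid.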

The Moving Picture Experts Group (MPEG) standards treat an image as a unit of one "picture" (a single view), and, in MPEG-2, a picture can be organized as a frame or as fields. That is, organizing a picture as a frame is referred to as the "frame structure", and organizing a picture as fields is referred to as the "field structure".

Movies use the progressive scanning format, which records scenes on film one frame at a time and projects them on the screen frame by frame. On the other hand, TVs use the interlaced scanning format, which divides one image frame into two fields and alternately scans the fields in order to display moving pictures effectively with a limited number of scan lines. In the NTSC color TV system adopted in the United States, Japan, Korea, and so on, 30 frames each having 525 scan lines are sent per second, and in the PAL and SECAM systems adopted in Europe and elsewhere, 25 frames each having 625 scan lines are sent per second, so that NTSC TVs process 60 fields per second and PAL or SECAM TVs process 50 fields per second.

With the spread of video display devices using the progressive scanning format and the increasing need to exchange data among devices using different scanning formats, the need to convert the interlaced scanning format into the progressive scanning format is greater than ever. Such scanning-format conversion is referred to as de-interlacing, and de-interlacing is used, for example, when television signals received over the air are converted into signals that a computer can use to display television programs.

FIG. 2 is a view schematically showing a conventional de-interlacing apparatus. Referring to FIG. 2, the de-interlacing apparatus has a memory unit 210, a Motion Estimation/Motion Compensation (ME/MC) unit 220, a weight value calculation unit 230, a spatial axis execution unit 240, and a weight value averaging unit 250.

The memory unit 210 stores a current field f (k) (including added noise) and a previous field f (k−1) of an inputted image. The ME/MC unit 220 calculates motion vectors v using the current field f (k) (including the added noise) and the previous field f (k−1) stored in the memory unit 210. That is, the ME/MC unit 220 predicts motion using the fields before and after an image and decides whether motion exists. Further, the ME/MC unit 220 calculates a motion-compensated field fmc (k) using the current field f (k) (including the added noise), the previous field f (k−1), and the calculated motion vectors v.

The weight value calculation unit 230 calculates a weight value w for motion adaptation, using the current field f (k) (including the added noise) and the previous field f (k−1). The weight value calculated by the weight value calculation unit 230 is sent to the weight value averaging unit 250.

The spatial axis execution unit 240 carries out spatial-axis de-interlacing using the current field f (k) (including the added noise), thereby generating a progressive scan image fsp (k). De-interlacing methods can be divided into those that use motion information and those that do not, and the spatial axis execution unit 240 carries out de-interlacing without using motion information. Spatial-axis de-interlacing methods include a line averaging method, a weighted median filtering method, an edge-based line averaging method, and so on.

The weight value averaging unit 250 performs de-interlacing along the time axis using the motion-compensated field fmc (k) calculated by the ME/MC unit 220, the weight value w calculated by the weight value calculation unit 230, and the progressive scan image fsp (k) generated by the spatial axis execution unit 240, and generates a progressive scan image g (k).
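As a rough sketch of this time-axis weighted averaging (assuming the common blending convention g(k) = w·fsp(k) + (1−w)·fmc(k) with a per-pixel weight w in [0, 1]; the exact formula used by the conventional apparatus is not spelled out here):

import numpy as np

def blend_deinterlace(f_sp, f_mc, w):
    """Motion-adaptive blend of the spatially interpolated image f_sp and the
    motion-compensated image f_mc. w is a per-pixel weight in [0, 1]: near 1
    where motion is detected (trust the spatial interpolation), near 0 in
    static regions (trust the temporal, motion-compensated prediction)."""
    w = np.clip(w, 0.0, 1.0)
    return w * f_sp + (1.0 - w) * f_mc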

In the meantime, the conventional de-interlacing apparatus has a noise reduction device to reduce noise affecting a display for which the apparatus is provided. FIG. 3 shows a conventional noise reduction device. Referring to FIG. 3, the conventional noise reduction device has a memory 310, an ME/MC unit 320, a weight value calculation unit 330, and a weight value averaging unit 340.

The memory 310 stores a progressive scan image g (k−1), including a previous field f (k−1), and the current field f (k) (including the added noise) of an inputted image. The ME/MC unit 320 calculates motion vectors v using the current field f (k) (including the added noise) and the previous field f (k−1) stored in the memory 310. That is, the ME/MC unit 320 predicts motion using the fields before and after an image and decides whether motion exists. Further, the ME/MC unit 320 calculates a motion-compensated field fmc (k) using the previous field f (k−1) and the calculated motion vectors v.

The weight value calculation unit 330 calculates a weight value w for motion adaptation using the current field f (k) (including the added noise) and the previous field f (k−1). The weight value calculated by the weight value calculation unit 330 is sent to the weight value averaging unit 340.

The weight value averaging unit 340 reduces the noise using the motion-compensated field fmc (k) calculated by the ME/MC unit 320 and the weight value w calculated by the weight value calculation unit 330, and generates a noise-reduced image f′(k).

However, the conventional de-interlacing apparatus and noise reduction device described above always use the current field f (k), which includes the added noise, for motion prediction. This leads to a high possibility of miscalculating the motion vectors v due to the influence of the noise. Accordingly, there is a problem in that de-interlacing and noise reduction performance deteriorates when the miscalculated motion vectors v are used.

SUMMARY OF THE INVENTION

The present general inventive concept has been developed in order to solve the above drawbacks and other problems associated with the conventional arrangement. An aspect of the present general inventive concept is to calculate precise motion vectors based on images having severe noise thereon so as to improve noise reduction/removal and de-interlacing performance by using the precisely calculated motion vectors.

Additional aspects and advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.

The foregoing and/or other aspects and advantages of the present general inventive concept are substantially achieved by providing a noise reduction/removal device comprising a motion prediction unit that predicts motion vectors between an image one period ahead of a previous image and a current image with respect to individual images which are sequentially inputted; a motion checking unit that applies the motion vectors predicted by the motion prediction unit to the image one period ahead of the previous image and two different images ahead of the current image in time, and checks whether the motion vectors are precise motion vectors; a motion compensation unit that compensates for motions using the motion vectors checked for preciseness thereof by the motion checking unit; and a noise removal unit that removes noise of images using the motion-compensated images from the motion compensation unit and the inputted images.

It is an aspect of the general inventive concept that the motion prediction unit can calculate Sum-of-Absolute-Difference (SAD) values of block units with respect to all vectors in a predicted radius and/or cost functions corresponding to the SAD values, and can predict the motion vectors based on the calculated cost functions.

Further, it is an aspect of the general inventive concept that the motion checking unit can calculate SAD values of block units and/or cost functions corresponding to the SAD values with respect to the image one period ahead of the previous image and the two different images ahead of the current image in time, and can check, based on the calculated cost functions, whether the motion vectors are precise motion vectors.

Further, it is an aspect of the general inventive concept that the motion checking unit can compare the calculated SAD values and/or the cost functions corresponding to the SAD values with SAD values corresponding to multiple motion vectors except for the predicted motion vectors and/or cost functions corresponding to the SAD values, and can check whether the motion vectors are precise motion vectors.

Further, it is an aspect of the general inventive concept that, if the calculated SAD values and/or values of the cost functions corresponding to the SAD values are larger than any of the SAD values of block units corresponding to multiple motion vectors, except for the predicted motion vectors and/or values of the cost functions corresponding to the SAD values, the motion compensation unit can compensate for the motions using any of the multiple motion vectors except for the predicted motion vectors.

Further, it is an aspect of the general inventive concept that the noise removal unit adds weight values of the inputted images and the images compensated by using the motion vectors checked by the motion checking unit, and produces noise-removed images.

The foregoing and/or other aspects and advantages of the present general inventive concept are also substantially achieved by providing a de-interlacing apparatus including a motion prediction unit that predicts motion vectors between an image one period ahead of a previous image and a current image with respect to individual images which are sequentially inputted; a motion checking unit that applies the motion vectors predicted by the motion prediction unit to the image one period ahead of the previous image and two different images ahead of the current image in time, and checks whether the motion vectors are precise motion vectors; a motion compensation unit that compensates for motions using the motion vectors checked for preciseness thereof by the motion checking unit; and a de-interlacing unit that converts an interlaced scanning format into a progressive scanning format using the motion-compensated images from the motion compensation unit and the inputted images.

In an aspect of the general inventive concept, the motion prediction unit calculates Sum-of-Absolute-Difference (SAD) values of block units with respect to all vectors in a predicted radius and/or cost functions corresponding to the SAD values, and predicts the motion vectors based on the calculated cost functions.

Further, it is an aspect of the general inventive concept that the motion checking unit calculates SAD values of block units and/or cost functions corresponding to the SAD values with respect to the image one period ahead of the previous image and the two different images ahead of the current image in time, and checks, based on the calculated cost functions, whether the motion vectors are precise motion vectors.

Further, it is an aspect of the general inventive concept that the motion checking unit compares the calculated SAD values and/or the cost functions corresponding to the SAD values with SAD values corresponding to multiple motion vectors except for the predicted motion vectors and/or cost functions corresponding to the SAD values, and checks whether the motion vectors are precise motion vectors.

Further, it is an aspect of the general inventive concept that, if the calculated SAD values and/or values of the cost functions corresponding to the SAD values are larger than any of the SAD values of block units corresponding to multiple motion vectors, except for the predicted motion vectors and/or values of the cost functions corresponding to the SAD values, the motion compensation unit compensates for the motions using any of the multiple motion vectors except for the predicted motion vectors.

Further, it is an aspect of the general inventive concept that the de-interlacing unit adds weight values of the inputted images and the images compensated by using the motion vectors checked by the motion checking unit, and produces noise-removed images.

Therefore, the de-interlacing apparatus with the noise reduction/removal device according to the present general inventive concept can calculate precise motion vectors based on severe noise-carrying images so as to improve noise reduction/removal and de-interlacing performance using the calculated motion vectors.

BRIEF DESCRIPTION OF THE DRAWINGS

These and/or other aspects and advantages of the present general inventive concept will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:

FIGS. 1A and 1B are views explaining the general interlaced scanning format and the progressive scanning format;

FIG. 2 is a view schematically showing a conventional de-interlacing apparatus;

FIG. 3 is a view schematically showing a conventional noise reduction device; and

FIG. 4 is a block diagram schematically showing a de-interlacing apparatus with a noise reduction device according to an embodiment of the present general inventive concept.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Certain embodiments of the general inventive concept will be described in greater detail with reference to the accompanying drawings. In the following description, same drawing reference numerals are used for the same elements even in different drawings. The matters defined in the description such as a detailed construction and elements are nothing but the ones provided to assist in a comprehensive understanding of the general inventive concept. Thus, it is apparent that the general inventive concept can be carried out without those defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the general inventive concept in unnecessary detail.

FIG. 4 is a block diagram schematically showing a de-interlacing apparatus with a noise reduction/removal device according to an embodiment of the present general inventive concept. Referring to FIG. 4, the noise reduction/removal device can have a memory unit 410, a motion prediction unit 420, a motion checking unit 430, and a noise processing unit 440. The noise processing unit 440 can include a motion compensation unit 441, a weight value calculation unit 443, and a noise removal unit 445.

The memory unit 410 stores a current field f (k), a next field f (k+1), a previous field f (k−1), and a pre-previous image y (k−2) of an inputted image. The pre-previous image y (k−2) is a noise-bearing image that has acquired noise during image transmission or processing, and the noise can be modeled as Gaussian noise by Equation 1 as follows:
y=f+n,  [Equation 1]

where y denotes the observed noise-bearing image, f denotes the original image, and n denotes Gaussian noise.
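As a small illustrative sketch of this noise model (the value of sigma and the random seed are arbitrary choices, not values from the patent):

import numpy as np

def add_gaussian_noise(f, sigma=10.0, seed=0):
    """Return y = f + n, where n is zero-mean Gaussian noise (Equation 1)."""
    rng = np.random.default_rng(seed)
    n = rng.normal(loc=0.0, scale=sigma, size=f.shape)
    return f + n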

The motion prediction unit 420 predicts, with respect to the individual images sequentially inputted, motion vectors between an image which is one period ahead of a previous image and a current image. The image one period ahead of the previous image is an image to which noise has been added during transmission or processing. Further, the motion prediction unit 420 calculates a Sum-of-Absolute-Difference (SAD) value of a unit block, or a cost function corresponding to the SAD value, with respect to all vectors within a predicted radius, and predicts motion vectors based on the calculated cost function.
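A hedged sketch of such a full-search block-matching prediction follows; the block size, predicted radius, and function names are illustrative assumptions rather than values specified by the patent:

import numpy as np

def predict_motion_vector(ref, cur, m, n, bs=8, radius=4):
    """Full-search block matching: return the motion vector (vi, vj) that
    minimizes the SAD of the (m, n)th macroblock of `cur` against `ref`.

    ref    : reference image (here, the image one period ahead of the previous image)
    cur    : current image
    bs     : macroblock size; radius : predicted (search) radius
    """
    y0, x0 = m * bs, n * bs
    block = cur[y0:y0 + bs, x0:x0 + bs].astype(np.int64)
    best_v, best_sad = (0, 0), np.inf
    for vi in range(-radius, radius + 1):
        for vj in range(-radius, radius + 1):
            yy, xx = y0 + vi, x0 + vj
            if yy < 0 or xx < 0 or yy + bs > ref.shape[0] or xx + bs > ref.shape[1]:
                continue  # candidate block falls outside the reference image
            cand = ref[yy:yy + bs, xx:xx + bs].astype(np.int64)
            sad = int(np.abs(block - cand).sum())
            if sad < best_sad:
                best_sad, best_v = sad, (vi, vj)
    return best_v

A cost function other than the raw SAD (for example, the SAD plus a smoothness penalty on the candidate vector) could be minimized in the same loop.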

The motion checking unit 430 applies the motion vectors predicted by the motion prediction unit 420 to the image one period ahead of the previous image and to two different images ahead of the current image in time, and checks whether the motion vector predictions are precise. This is done to prevent wrong motion vectors from being used, since motion vectors predicted between the current image and the one-period-ahead image may have a poor degree of precision for images suffering from severe noise. It is assumed that motion continues at a constant velocity.

Provided that motion vectors are predicted with respect to macroblocks each having a size of [bs, bs], the SAD value of the (m,n)th macroblock between f̂ (k) and y (k−2) with respect to a given motion vector v = [v_i, v_j]^t can be defined as follows. Here, f̂ denotes an original image to which the Infinite Impulse Response (IIR) filter is applied.

Φ(m, n, k; v) = Σ_{(i,j) ∈ B(m,n,k)} | f̂(i + v_i, j + v_j, k) − y(i, j, k−2) |      [Equation 2]

where B(m,n,k) denotes the set of pixel coordinates in the (m,n)th macroblock at the kth time instant.

Provided that a predicted motion vector is denoted v̂, the vector v̂ can be expressed as shown in the following Equation 3.

v̂(m, n, k) = arg min_{v ∈ S} Φ(m, n, k; v)      [Equation 3]

where S denotes the set of vectors in the search range. With such a method, the motion checking unit 430 can calculate a SAD value of a unit block, or a cost function corresponding to the SAD value, with respect to the image one period ahead of the previous image and the two different images ahead of the current image in time, and can check, based on the calculated cost function, whether precise motion vectors have been obtained. Further, the motion checking unit 430 can compare the calculated SAD value, or its corresponding cost function, with the SAD values corresponding to multiple motion vectors other than the predicted motion vectors, or with the cost functions corresponding to those SAD values, and can check whether precise motion vectors have been obtained.
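A non-authoritative sketch of this checking step, written against Equations 2 and 3 under the constant-velocity assumption; the helper block_sad and the candidate set are illustrative, not the patent's exact procedure:

import numpy as np

def block_sad(a, b, m, n, v, bs=8):
    """Equation 2 style cost: SAD of the (m, n)th macroblock, with `a` displaced
    by v = (vi, vj) and compared against the co-located block of `b`."""
    y0, x0 = m * bs, n * bs
    yy, xx = y0 + v[0], x0 + v[1]
    if yy < 0 or xx < 0 or yy + bs > a.shape[0] or xx + bs > a.shape[1]:
        return float("inf")
    return float(np.abs(a[yy:yy + bs, xx:xx + bs].astype(np.int64)
                        - b[y0:y0 + bs, x0:x0 + bs].astype(np.int64)).sum())

def check_motion_vector(f_hat_k, y_km2, v_pred, m, n, candidates, bs=8):
    """Keep the predicted vector only if no alternative candidate (e.g. the zero
    vector or vectors of neighboring blocks) gives a lower cost on the older
    image pair; otherwise fall back to the best alternative."""
    best_v = v_pred
    best_cost = block_sad(f_hat_k, y_km2, m, n, v_pred, bs)
    for v in candidates:
        cost = block_sad(f_hat_k, y_km2, m, n, v, bs)
        if cost < best_cost:
            best_v, best_cost = v, cost
    return best_v, best_v == v_pred

The returned flag indicates whether the predicted vector passed the check; when it fails, the motion compensation unit would use the substituted vector instead, as described next.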

The motion compensation unit 441 of the noise processing unit 440 can compensate for motion using the motion vectors checked for their preciseness by the motion checking unit 430. It is an aspect of the general inventive concept that, if the calculated SAD value, or the value of its corresponding cost function, is larger than at least one of the per-block SAD values corresponding to the multiple motion vectors other than the predicted motion vectors, or the values of the cost functions corresponding to those SAD values, the motion compensation unit 441 can compensate for motion using any of those other motion vectors.

The weight value calculation unit 443 can calculate a weight value w for motion adaptation by using the next field f (k+1), the current field f (k), and the previous field f (k−1). The weight value calculated by the weight value calculation unit 443 is sent to the noise removal unit 445.

The noise removal unit 445 can remove the noise of images using the motion-compensated images provided by the motion compensation unit 441 and the inputted images. At this time, the noise removal unit 445 can perform a weighted addition of the compensated images, obtained using the motion vectors checked by the motion checking unit 430, and the inputted images, and can then produce noise-reduced or noise-removed images. The image f (k−2) output from the noise removal unit 445 denotes a noise-removed image obtained from the noise-bearing image y (k−2).

Thus, the noise removal unit 445 can remove or efficiently reduce noise of noise-bearing images through simple procedures.
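As a rough end-to-end sketch of this noise processing path (the per-block compensation and the weighting rule shown here are illustrative assumptions, not the patent's exact procedure):

import numpy as np

def motion_compensate(ref, vectors, bs=8):
    """Build a motion-compensated prediction by copying each macroblock of `ref`
    from its displaced position, using one checked vector per (m, n) block."""
    out = np.zeros_like(ref)
    for (m, n), (vi, vj) in vectors.items():
        y0, x0 = m * bs, n * bs
        yy = int(np.clip(y0 + vi, 0, ref.shape[0] - bs))
        xx = int(np.clip(x0 + vj, 0, ref.shape[1] - bs))
        out[y0:y0 + bs, x0:x0 + bs] = ref[yy:yy + bs, xx:xx + bs]
    return out

def remove_noise(y_noisy, f_mc, w):
    """Weighted addition of the noisy input image and its motion-compensated
    prediction. w near 1 keeps the input (where compensation is unreliable);
    w near 0 relies on the lower-noise, motion-compensated prediction."""
    w = np.clip(w, 0.0, 1.0)
    return w * y_noisy + (1.0 - w) * f_mc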

In the meantime, the de-interlacing apparatus has the memory unit 410, the motion prediction unit 420, the motion checking unit 430, and a de-interlacing processing unit 450. The de-interlacing processing unit 450 has a motion compensation unit 451, a weight value calculation unit 453, and a de-interlacing unit 455. The memory unit 410, the motion prediction unit 420, the motion checking unit 430, the motion compensation unit 451, and the weight value calculation unit 453 are identical to the corresponding constituents of the noise reduction/removal device, so they are shown in a single drawing and their detailed descriptions are omitted.

The de-interlacing unit 455 can perform a weighted addition of a compensated image, obtained using motion vectors checked by the motion checking unit 430, and an inputted image, and can then produce a de-interlaced image. The image g (k) in FIG. 4 denotes a de-interlaced image of the progressive scanning format corresponding to an image f (k) of the interlaced scanning format.

Therefore, the de-interlacing apparatus can perform a precise de-interlacing on noise-bearing images.

According to the present general inventive concept, the noise reduction/removal device can reduce/remove noise through simple procedures performed on inputted images, and the de-interlacing apparatus can perform precise de-interlacing with respect to noise-bearing images.

Although the embodiments of the present general inventive concept have been shown and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.

Classifications
U.S. Classification: 348/542, 348/E05.077, 348/E05.066
International Classification: H04N7/01, H04N5/21, H04N5/14, H04N5/44, H04N7/32
Cooperative Classification: H04N5/21, H04N5/145, H04N7/012
European Classification: H04N7/01G3, H04N5/21, H04N5/14M2
Legal Events
Jul 29, 2004 — Assignment. Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: YANG, SEUNG-JOON; REEL/FRAME: 015644/0768. Effective date: 20040722.
Jun 20, 2011 — Fee payment. Year of fee payment: 4.