|Publication number||US7474841 B2|
|Application number||US 10/971,005|
|Publication date||Jan 6, 2009|
|Filing date||Oct 25, 2004|
|Priority date||Nov 21, 2003|
|Also published as||US20050157798|
|Original Assignee||Canon Kabushiki Kaisha|
The present invention relates to a technique for compression-encoding moving images and for processing compression-coded moving image stream data.
In recent years, advances in digital signal processing technology have made it possible to perform high-efficiency coding of high-volume digital information such as moving images, still images, or audio, to record the data onto small magnetic or optical media, and to transmit the data via communication media. Image sensing devices and the like that utilize this technology to easily obtain a high-quality video image and immediately output it to an information medium have been developed.
In particular, the MPEG coding technique is used in recent moving image coding. Since the code rate can be greatly reduced by combining intra-frame coding, which exploits intra-frame correlation, with inter-frame coding, which exploits correlation between preceding and subsequent frames, the method is widely used in video reproduction apparatuses as represented by the DVD player and in image sensing apparatuses as represented by the video camera.
In the Japanese and U.S. television standards, the frame rate is about 30 frames per second. However, the frame rate of film material used for movies is generally about 24 frames per second. Accordingly, as shown in the reference document “ARIB STD-B20”, a video image with a frame rate of about 24 frames per second is 2-3 pulled down to a frame rate of about 30 frames per second. This improves the compatibility between film material video and television video, and MPEG coding is then performed by a technique such as that disclosed in Japanese Patent Application Laid-Open No. 2000-41244.
A video signal having a frame rate of about 24 frames per second, input from the video signal input terminal 801, is supplied to the coding unit 802. In the coding unit 802, the video signal is rate-converted by the 2-3 pulldown unit 810 to a rate of about 30 frames per second (60 fields). This conversion is the so-called 2-3 pulldown widely used when converting film material for movies into television video.
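The field expansion performed by 2-3 pulldown can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the frame representation as (top, bottom) field pairs, the function name, and the per-phase field ordering are assumptions chosen to be consistent with the parameter cycle described later in the specification.

```python
# Assumed 4-picture parameter cycle: (top_field_first, repeat_first_field)
CYCLE = [(1, 0), (1, 1), (0, 0), (0, 1)]

def two_three_pulldown(frames):
    """frames: list of (top_field, bottom_field) pairs at ~24 frames/sec.

    Returns the ~60 fields/sec field sequence: frames alternate between
    2-field and 3-field pictures, and the repeated third field is a
    duplicate of the picture's first field, so it can be omitted from
    the coded data.
    """
    fields = []
    for i, (top, bottom) in enumerate(frames):
        tff, rff = CYCLE[i % 4]
        first, second = (top, bottom) if tff else (bottom, top)
        fields.extend([first, second])
        if rff:
            fields.append(first)  # third field repeats the first field
    return fields
```

Note that each group of 4 input frames yields 10 fields, which is exactly the 24 frames/sec to 60 fields/sec (about 30 frames/sec) conversion described above.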
The rate-converted video signal is supplied from the 2-3 pulldown unit 810 to the MPEG unit 811. The MPEG unit 811 compresses the video signal by high-efficiency coding processing as recommended in ISO/IEC 13818-2, generates a stream, and supplies the stream to the recording unit 803. The recording unit 803 records the generated stream on the recording medium 804.
As shown in the corresponding figure, at normal recording times, coding is performed while the parameters are changed so as not to break the periodic repetition.
Next, consider a case where a new stream is connected to a stream already stored on the storage medium (data that has already been 2-3 pulled down and coded) and is recorded.
In this case, since the parameters of the already-recorded stream and those of the newly connected stream are determined independently of each other, a phase mismatch occurs in the parameters at the junction position, as illustrated in the corresponding figure.
To solve the above problem, the already-recorded stream or the new stream could be re-encoded and the parameters re-set so as to have a continuous phase. However, re-encoding takes processing time, and furthermore, the image quality is seriously degraded by the re-encoding.
The present invention has been made in consideration of the above problems, and provides a technique for smooth joint recording with suppressed discontinuity between pictures upon connection of two streams, without re-encoding the 2-3 pulled-down video image.
For this purpose, the present invention provides an image processing apparatus for processing video coded stream data compression-encoded by converting moving image data having a predetermined number of frames per second, periodically generating 2-field picture data and 3-field picture data from the respective frames of the moving image data, and adding phase information of the fields in the respective pictures of the converted moving image data, the apparatus comprising: first detection means for detecting the phase information of picture data at a rear end of an already-recorded first video coded stream; second detection means for detecting the phase information of picture data at a head of a second video coded stream to be recorded following the first video coded stream; generation means for generating a dummy stream that maintains the periodicity of the phase information between the first and second video coded streams as detected by the first and second detection means; and recording means for recording the dummy stream generated by the generation means onto a storage medium so as to insert the dummy stream between the rear end of the first video coded stream and the head of the second video coded stream.
Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same name or similar parts throughout the figures thereof.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings.
A video signal having a frame rate of about 24 frames per second, input from the video signal input terminal 101, is supplied to the coding unit 102. In the coding unit 102, the video signal is rate-converted by the 2-3 pulldown unit 110 to a rate of about 30 frames per second (60 fields). As described above, this conversion is the well-known 2-3 pulldown method.
Note that as the video input terminal 101, any unit may be employed as long as it inputs a 24 frames/sec video image: for example, an image sensing apparatus (or image sensing means) that obtains a 24-frame (48-field)/sec video image, or a film scanner that sequentially reads 24 frames/sec film material (each frame is read as 2-field video image data; otherwise, the read data may be converted to 2-field data afterward on the apparatus side). Further, a network interface may be employed if the material data exists on a network.
The rate-converted video signal is supplied from the 2-3 pulldown unit 110 to the MPEG unit 111. The MPEG unit 111 compresses the video signal by high-efficiency coding processing as recommended in ISO/IEC 13818-2, generates a stream, and supplies the stream to the selection unit 108 and the stream information detection unit 105.
As shown in the figure, the MPEG unit 111 performs compression based on parameters supplied from the outside. As in the above-described prior art, the parameters “top_field_first” and “repeat_first_field” are used to remove the redundancy of the 2-3 pulled-down video image.
If the parameter “repeat_first_field” is “0”, it indicates a 2-field structure, while if it is “1”, it indicates a 3-field structure. As described above, since the first and third fields are the same, the image data of the third field is omitted. On the other hand, the parameter “top_field_first” indicates whether the top or bottom field is the first field: if it is “0”, the bottom field is the first field, while if it is “1”, the top field is the first field. There are 4 combinations of these binary parameters, and they are periodically repeated.
That is, assuming that the combination of the parameters is represented as “(top_field_first, repeat_first_field)”, the combinations are repeated in 4-picture cycles as follows: (1,0), (1,1), (0,0), (0,1), (1,0), (1,1), . . .
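The periodic parameter sequence described above can be sketched as follows. This is a minimal illustration; the function name and the convention that the cycle starts at (1,0) are assumptions taken from the order in which the embodiment generates the parameters, not an excerpt of the patented implementation.

```python
# Assumed 4-picture cycle of (top_field_first, repeat_first_field)
PULLDOWN_CYCLE = [(1, 0), (1, 1), (0, 0), (0, 1)]

def pulldown_parameters(n_pictures, start_phase=0):
    """Return the (top_field_first, repeat_first_field) pair for each of
    n_pictures coded pictures, starting from the given phase (0..3).

    At normal recording times the encoder simply walks this cycle, so
    the repetition is never broken."""
    return [PULLDOWN_CYCLE[(start_phase + i) % 4] for i in range(n_pictures)]
```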
At normal recording times, coding is performed while the parameters are changed so as not to break the periodic repetition. The stream generated by the coding unit 102 is supplied via the selection unit 108 to the recording unit 103, and is recorded on the recording medium 104.
In the present embodiment, the joint position information detection unit 107 reproduces and detects the parameter information at the end of the stream to be subjected to joint recording (first stream: 210 in FIG. 2) among the streams already recorded on the recording medium 104, and the stream information detection unit 105 detects the parameter information at the head of the stream to be newly joint-recorded (second stream: 220 in FIG. 2).
In the present embodiment, since it is assumed that the parameters (1,0), (1,1), (0,0), (0,1) are generated sequentially, it is determined whether or not the order of the end parameters of the first stream and the head parameters of the second stream matches the above order. This determination also establishes whether or not a connection stream including plural picture data must be generated.
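The phase-matching determination and the choice of dummy pictures can be sketched as follows. This is an illustrative sketch under the 4-phase cycle assumption; the helper name and the modular-arithmetic formulation are assumptions, not the claimed implementation.

```python
# Assumed 4-picture cycle of (top_field_first, repeat_first_field)
CYCLE = [(1, 0), (1, 1), (0, 0), (0, 1)]

def connection_pictures(last_params, head_params):
    """Return the parameter pairs of the dummy pictures to insert
    between the first stream's last picture (last_params) and the
    second stream's head picture (head_params), so that the sequence
    last_params, <dummies>, head_params follows the cycle unbroken.

    Returns an empty list when the phases are already continuous,
    i.e., no connection stream is needed."""
    end = CYCLE.index(last_params)
    head = CYCLE.index(head_params)
    count = (head - end - 1) % 4  # pictures missing between the phases
    return [CYCLE[(end + 1 + i) % 4] for i in range(count)]
```

For example, if the first stream ends at (1,0) and the second stream also starts at (1,0), the three intermediate phases (1,1), (0,0), (0,1) must be supplied by the connection stream, matching the example given for the second embodiment below.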
In the case illustrated in the corresponding figure, a connection stream is generated so as to fill the phase gap between the two streams.
If the recording medium 104 is a tape, the generated connection stream is recorded before the second stream, and then the second stream is recorded. At this time, the second stream, generated from a video signal input in real time from the image sensing system or film scanner, must be saved in a buffer memory in the recording unit 103 during generation and recording of the connection stream. If the recording medium 104 is a disk, the second stream may similarly be saved in the buffer; alternatively, the connection stream may be recorded independently after the second stream and then inserted between the first stream and the second stream by editing.
A video image represented by the connection stream 230 is simply generated by using the end picture of the existing stream (e.g., the first stream) as a P picture or B picture. As a result, the data size of the connection stream is very small. A constant data rate can therefore be maintained in the joint portion by appropriately inserting stuffing data and controlling the data amount. By keeping the data rate constant with the stuffing data inserted into the connection stream 230, the buffer can be prevented from underflowing or overflowing when the joint portion is decoded.
The connection will now be described with reference to the corresponding figure.
Note that when the stream A and the stream B are connected, the SEC (Sequence End Code) at the end of the stream A is deleted, and an SEC is added to the end of the connected stream B. Further, the connection stream 800 belongs to the GOP (Group Of Pictures) of the end picture of the stream A.
As is fully understood from the above description, the connection stream has from 0 to a maximum of 3 pictures (each picture having 2 or 3 fields). In the case of 0 pictures, i.e., when it is determined that the phases of the preceding and subsequent streams already match, generation and recording of the connection stream are not performed (corresponding to NO at step S3 in the corresponding flowchart).
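The 0-to-3-picture bound stated above follows directly from the 4-phase cycle; a brute-force check over all phase combinations confirms it. This is an illustrative sketch: the phase-count formula and names are assumptions consistent with the cycle described earlier.

```python
# Assumed 4-picture cycle of (top_field_first, repeat_first_field)
CYCLE = [(1, 0), (1, 1), (0, 0), (0, 1)]

def dummy_picture_count(end_phase, head_phase):
    """Number of dummy pictures needed so that the sequence
    end_phase, <dummies>, head_phase follows the 4-phase cycle."""
    return (head_phase - end_phase - 1) % 4

# Enumerate every (end phase, head phase) combination: the required
# picture count always lies between 0 and 3 inclusive.
counts = sorted({dummy_picture_count(e, h)
                 for e in range(4) for h in range(4)})
```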
Further, in the present embodiment, the respective pictures of the connection stream are P pictures utilizing the end picture of the stream A; however, they may be B pictures. It is also apparent that the respective pictures of the connection stream may be generated by utilizing the head picture of the stream B. In the latter case, an SHC (Sequence Header Code) is inserted at the head of the connection stream, and the SHC at the head of the stream B is deleted.
Numeral 6 denotes a 2-3 pulldown unit; and 7, an MPEG coding unit. Note that as recent PCs have a very high processing capacity, the processings corresponding to the 2-3 pulldown unit 6 and the MPEG coding unit 7 may be realized as software processing. Numeral 8 denotes an output unit that outputs the result of editing such as connection processing. A DVD recorder apparatus or the like may be assumed as the output unit; however, if the output destination exists on a network (such as a file server), the output unit is a network interface.
In this construction, when the power of the apparatus is turned on, the OS is loaded onto the RAM 2, and then the application program according to the present embodiment is loaded onto the RAM 2, whereby the processing corresponding to the above-described flowchart is realized.
Note that in the above embodiment, a connection stream has up to 3 picture data, on the premise of conversion from 24 frames to 30 frames. It is apparent that for other frame rate conversions, the number of pictures changes in correspondence with the frame rate. Further, if the number of pictures included in the connection stream increases, the pictures may be divided into 2 sections: the picture data in the first section are generated by utilizing picture data at the rear end of the first stream, while the picture data in the second section are generated by utilizing picture data at the front end of the second stream.
Further, if the recording medium 104 is a random-accessible storage medium such as a hard disk, plural coded stream data may exist on it. In such a case, the apparatus has means for designating 2 of the plural streams recorded on the recording medium, or means for designating 1 of the streams to which data input from the outside and encoded is to be connected. Further, when 3 or more streams are to be connected, since the processing for connecting 2 streams is repeated, the apparatus has means for designating the order of repetition of the processing.
Further, if the recording medium 104 is a sequentially-accessible medium such as a magnetic tape, such designation means is unnecessary (note that means for searching for the end position of the already-recorded stream is necessary).
Further, in a case where the construction in
Next, a second embodiment of the present invention will be described.
If the video image input from the input terminal has a frame rate of about 30 frames per second, the parameter values (top_field_first, repeat_first_field) are fixed at (1,0) or (0,0).
Upon connection between a stream having a 24 f/s frame rate and a stream having a 30 f/s frame rate, the connection stream generation unit 106 generates a connection stream so as to smoothly connect the repetition of the above parameters.
For example, consider a connection between a stream having the 30 f/s frame rate (with the (top_field_first, repeat_first_field) values fixed at (1,0); i.e., a stream recorded without 2-3 pulldown processing) already recorded on the recording medium, and a stream having the 24 f/s frame rate (with head parameters (top_field_first, repeat_first_field) = (0,1), (1,0), (1,1), . . . ).
In a case where the rear end picture parameters of the already-recorded stream are (top_field_first, repeat_first_field) = (1,0), and the parameters of the head picture of the subsequent 2-3 pulldown-processed stream (24 f/s) are (1,0), this condition corresponds to that of the first embodiment, so a connection stream having the parameters (1,1), (0,0), (0,1) is generated and inserted between the rear end of the already-recorded stream and the head of the subsequent stream.
Further, in a case where the already-recorded stream has been 2-3 pulldown-processed and encoded, the parameters of its rear end picture are (top_field_first, repeat_first_field) = (1,0), and a 30 f/s stream whose parameters are fixed at (1,0) is connected as the subsequent stream, a connection stream is generated and inserted as in the above case.
Accordingly, in the second embodiment, the processing procedure is substantially the same as that of the first embodiment.
Next, a third embodiment of the present invention will be described.
The difference from the first and second embodiments is that the video signal input terminal 101 and the coding unit 102 are replaced with a stream input unit 501. The other blocks are identical to those of the previous construction. That is, in the apparatus according to the third embodiment, which has no coding unit, already-coded streams are connected with each other. In this construction, a smooth connection that maintains the repeated parameter phase in the joint portion can be realized in accordance with the above-described procedure.
As described above, the means for realizing the functions in the first to third embodiments may be a general information processing apparatus such as a personal computer, as well as an image sensing apparatus. That is, the present invention includes a computer application program within its scope. Further, since a computer program is generally stored on a computer-readable storage medium such as a CD-ROM, and becomes executable by setting the medium in the computer and duplicating/installing the program into the system, such a computer-readable storage medium is also included in the scope of the invention.
As described above, according to the present invention, upon connection between 2 video coded streams, a dummy stream is generated and inserted so as to maintain the periodicity of the phase information between them. Thus, smooth joint recording that maintains stream continuity without re-encoding can be realized.
As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the appended claims.
This application claims priority from Japanese Patent Application No. 2003-392076 filed on Nov. 21, 2003, the entire contents of which is hereby incorporated by reference herein.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4982280 *||Jul 18, 1989||Jan 1, 1991||Yves C. Faroudja||Motion sequence pattern detector for video|
|US5313281 *||Jun 23, 1993||May 17, 1994||Sony United Kingdom Ltd.||Video to film conversion|
|US5337154 *||Jul 19, 1993||Aug 9, 1994||Sony United Kingdom Limited||Format conversion of digital video signals, integration of digital video signals into photographic film material and the like, associated signal processing, and motion compensated interpolation of images|
|US5671320 *||Jun 7, 1995||Sep 23, 1997||Time Warner Entertainment Co., L. P.||System and method for controlling play of multiple dialog audio tracks of a software carrier|
|US5691771 *||Dec 26, 1995||Nov 25, 1997||Sony Corporation||Processing of redundant fields in a moving picture to achieve synchronized system operation|
|US5703654 *||Jun 24, 1996||Dec 30, 1997||Nippon Steel Corporation||Video signal encoder|
|US5771357 *||Aug 22, 1996||Jun 23, 1998||Sony Corporation||Encoding/decoding fields of predetermined field polarity apparatus and method|
|US5821991 *||Feb 28, 1996||Oct 13, 1998||C-Cube Microsystems, Inc.||Method and apparatus for inverse telecine process by correlating vectors of pixel differences|
|US5953456||Sep 5, 1996||Sep 14, 1999||Canon Kabushiki Kaisha||Recording apparatus for repetitively recording image data of same frame and reproducing apparatus|
|US5963678||Dec 24, 1996||Oct 5, 1999||Canon Kabushiki Kaisha||Image signal processing apparatus and method|
|US5982444 *||Jun 10, 1998||Nov 9, 1999||Sony Corporation||Encoding method and apparatus for encoding edited picture signals, signal recording medium and picture signal decoding method and apparatus|
|US6360018||Oct 3, 1997||Mar 19, 2002||Canon Kabushiki Kaisha||Image processing apparatus and method|
|US6441813 *||May 13, 1998||Aug 27, 2002||Kabushiki Kaisha Toshiba||Computer system, and video decoder used in the system|
|US6549668 *||Oct 12, 1999||Apr 15, 2003||Stmicroelectronics S.R.L.||Detection of a 3:2 pulldown in a motion estimation phase and optimized video compression encoder|
|US6559890 *||Apr 21, 1999||May 6, 2003||Ascent Media Group, Inc.||Methods and apparatus for correction of 2-3 field patterns|
|US6587505||Aug 3, 1999||Jul 1, 2003||Canon Kabushiki Kaisha||Image processing apparatus and method|
|US6614441 *||Jan 7, 2000||Sep 2, 2003||Intel Corporation||Method and mechanism of automatic video buffer flipping and display sequence management|
|US6674480 *||Jan 29, 2001||Jan 6, 2004||Nec Electronics Corporation||Device for and method of converting a frame rate in a moving picture decoder, and a record medium and an integrated circuit device for implementing such a method|
|US6707984 *||Oct 31, 2001||Mar 16, 2004||Thomson Licensing S.A.||Changing a playback speed for video presentation recorded in a modified film format|
|US6871003 *||Mar 17, 2000||Mar 22, 2005||Avid Technology, Inc.||Edit decision list for identifying the pull down phase of a video signal|
|US6934335 *||Dec 11, 2001||Aug 23, 2005||Sony Corporation||Video encoder with embedded scene change and 3:2 pull-down detections|
|US6937773 *||Oct 6, 2000||Aug 30, 2005||Canon Kabushiki Kaisha||Image encoding method and apparatus|
|US7236207 *||Jan 22, 2003||Jun 26, 2007||Broadcom Corporation||System and method of transmission and reception of progressive content with isolated fields for conversion to interlaced display|
|US20040042550||Aug 13, 2003||Mar 4, 2004||Canon Kabushiki Kaisha||Encoding system conversion apparatus and method for same|
|US20040109678||Nov 19, 2003||Jun 10, 2004||Shingo Nozawa||Imaging apparatus, recording apparatus and recording method|
|JP2000041244A||Title not available|
|U.S. Classification||386/232, 348/97, 348/441, G9B/27.012, 386/E09.013, 348/459, 386/326|
|International Classification||G11B27/034, H04N5/781, H04N9/11, H04N5/253, H04N7/01, H04N3/36, H04N9/804, H04N11/20, H04N5/85, H04N9/47, H04N5/08|
|Cooperative Classification||H04N7/0112, G11B27/034, H04N5/781, H04N9/8042, H04N5/85|
|European Classification||G11B27/034, H04N9/804B|
|Oct 25, 2004||AS||Assignment|
Owner name: CANON KABUSHIKI KAISHA, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NOZAWA, SHINGO;REEL/FRAME:015927/0471
Effective date: 20041018
|Jun 6, 2012||FPAY||Fee payment|
Year of fee payment: 4
|Aug 19, 2016||REMI||Maintenance fee reminder mailed|