WO2002101936A2 - Low complexity channel decoders - Google Patents

Low complexity channel decoders

Info

Publication number
WO2002101936A2
WO2002101936A2 (PCT/US2002/014878)
Authority
WO
WIPO (PCT)
Prior art keywords
output
result
symbols
decoding process
soft
Prior art date
Application number
PCT/US2002/014878
Other languages
French (fr)
Other versions
WO2002101936A3 (en)
Inventor
Doron Rainish
Daniel Yellin
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corporation
Publication of WO2002101936A2
Publication of WO2002101936A3

Classifications

    • H: ELECTRICITY
    • H03: ELECTRONIC CIRCUITRY
    • H03M: CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M13/00: Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
    • H03M13/29: combining two or more codes or code structures, e.g. product codes, generalised product codes, concatenated codes, inner and outer codes
    • H03M13/2957: Turbo codes and decoding
    • H03M13/299: Turbo codes with short blocks
    • H03M13/37: Decoding methods or techniques, not specific to the particular type of coding provided for in groups H03M13/03 - H03M13/35
    • H03M13/39: Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes
    • H03M13/3905: Maximum a posteriori probability [MAP] decoding or approximations thereof based on trellis or lattice decoding, e.g. forward-backward algorithm, log-MAP decoding, max-log-MAP decoding
    • H03M13/65: Purpose and implementation aspects
    • H03M13/6502: Reduction of hardware complexity or efficient processing
    • H03M13/6505: Memory efficient implementations
    • H03M13/6577: Representation or format of variables, register sizes or word-lengths and quantization
    • H03M13/6588: Compression or short representation of variables

Abstract

A packet of encoded data is received and decoded using a look-up table that stores information approximating output of an algorithmic decoding process such as a soft-in soft-out (SISO) or a turbo decoding process, hence reducing the complexity of such decoders. Furthermore, the decoding operation is performed on jointly quantized data.

Description

LOW COMPLEXITY CHANNEL DECODERS
BACKGROUND
This invention relates to channel decoders.
In many of today's advanced communication systems, channel coding is a key ingredient. The transmitter partitions the data stream into blocks of bits (packets) that are encoded to introduce redundancy information into the transmitted block. The encoded data block is modulated and transmitted through the communication link (channel) connecting the transmitter to the receiver, and a channel-corrupted version of the transmitted data block is received at the receiver end. After down-conversion and demodulation, the channel decoder at the receiver uses the redundancy introduced by the encoder to recover the transmitted information more reliably.
In general, channel decoders are often categorized by the combination of input/output values they accept or provide. For example, a hard-input hard-output (HIHO) decoder accepts a stream of bits (binary values) as input and provides a stream of binary bits at its output that represent the estimated transmitted bit sequence. Soft- input hard-output (SIHO) decoders accept as input real- valued levels called soft symbols or soft bits that represent the reliability of the bit value. A SIHO decoder produces a stream of binary bits at its output that represents the estimated transmitted bit sequence.
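The distinction between hard and soft values at a decoder interface can be sketched in a few lines. This is an illustrative example, not part of the patent: a soft bit is modeled as a log-likelihood ratio (LLR), and a hard decision is simply its sign.

```python
# Illustrative sketch: "soft" values are real-valued reliabilities (here,
# log-likelihood ratios), while "hard" values are binary decisions.
# A SIHO decoder consumes the former and emits the latter.

def slice_hard(soft_bits):
    """Convert soft reliabilities (LLRs) to hard bit decisions.
    Convention: positive LLR -> bit 0, negative LLR -> bit 1."""
    return [0 if llr >= 0.0 else 1 for llr in soft_bits]

soft = [2.3, -0.4, 5.1, -1.7]   # large magnitude = high reliability
print(slice_hard(soft))          # [0, 1, 0, 1]
```

The sign convention (positive LLR meaning bit 0) is an assumption for the example; either convention appears in practice.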
Turbo codes are a special case of channel coding that can operate very close to the theoretical limits of channel capacity and, therefore, are close to optimal. See, for example, C. Berrou et al., "Near Optimum Error Correcting Coding and Decoding: Turbo-Codes," 44 IEEE Transactions on Communications 1261 (1996), which addresses parallel concatenated turbo codes and their associated encoders and decoders. Serial concatenated turbo codes are addressed, for example, in S. Benedetto et al., "Serial Concatenation of Interleaved Codes: Performance Analysis, Design and Iterative Decoding," IEEE Transactions on Information Theory, vol. 44, no. 3, pp. 909-926, May 1998. Turbo codes typically use another type of decoder, known as a soft-input soft-output (SISO) decoder, that not only accepts soft inputs but also provides soft outputs. Thus, in SISO decoders, the reliability of the estimated bits, as well as the estimated bits themselves, is provided. Some SISO decoders use the Bahl, Cocke, Jelinek and Raviv (BCJR) algorithm or the soft-output Viterbi algorithm (SOVA). See L.R. Bahl et al., "Optimal Decoding of Linear Codes for Minimizing Symbol Error Rate," IT-20 IEEE Trans. Inform. Theory 248 (1974); G.D. Forney, "The Viterbi Algorithm," 61 Proc. IEEE 268 (1973).
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows a simplified transmitter structure for transmitting data in a communications system.
FIG. 2 shows a simplified receiver structure for receiving data in a communications system.
FIG. 3 is a block diagram of a parallel concatenated rate 1/3 turbo encoder.
FIG. 4 is a block diagram of a parallel concatenated rate 1/3 turbo decoder.
FIG. 5 is a diagram illustrating soft-input soft-output decoding through a look-up table.
FIG. 6 is a diagram illustrating preparation of the look-up table.
FIG. 7 is a block diagram of another implementation of a parallel concatenated turbo decoder.
FIGS . 8 and 9 are diagrams illustrating preparation of a look-up table with multi-symbol quantization.
DETAILED DESCRIPTION
As shown in FIG. 1, a transmitter includes a data source 100 that produces a bit stream to be transmitted. Framing block 110 partitions the bit stream into packets of M bits that are encoded by channel encoder 120. The encoded packets are modulated by a modulator 130, up-converted by an up-conversion element 140, and transmitted through the transmitter's antenna 150.
As shown in FIG. 2, a receiver includes an antenna 240 that receives encoded data packets from the transmitter. The encoded packet is down-converted by a down-conversion element 200 and demodulated by a demodulator 210. The demodulated packet is provided to a channel decoder 220 to recover the transmitted packet. The sequence of recovered packets is converted to an estimated bit stream by a de-framing device 230.
FIG. 3 illustrates a rate 1/3 parallel concatenated turbo encoder as an example of the channel encoder 120. The packet data, a1, a2, …, aM (121), represents a sequence of M bits to be encoded. The output of the encoder includes the data bits a1, a2, …, aM (121), a set of redundancy bits c1, c2, …, cM (122) that are the output of a first convolutional encoder 125, and a second set of redundancy bits d1, d2, …, dM (123) that are the output of a second convolutional encoder 126 that operates on re-ordered input bits obtained from an interleaver 124.
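The parallel concatenation of FIG. 3 can be sketched as follows. This is a minimal illustration, not the patent's encoder: the constituent encoder here is a toy parity-over-a-sliding-window convolutional code, and the interleaver is an arbitrary fixed permutation.

```python
def conv_encode(bits, memory=2):
    # Toy stand-in for a constituent convolutional encoder: each output bit
    # is the parity of the current bit and the previous `memory` bits.
    state = [0] * memory
    out = []
    for b in bits:
        out.append((b + sum(state)) % 2)
        state = [b] + state[:-1]
    return out

def turbo_encode(bits, interleaver):
    """Rate-1/3 parallel concatenation (cf. FIG. 3): systematic bits
    a1..aM, parity c1..cM from the first encoder, and parity d1..dM from
    the second encoder operating on re-ordered (interleaved) input bits."""
    a = list(bits)
    c = conv_encode(a)                              # first encoder (125)
    d = conv_encode([a[i] for i in interleaver])    # interleaver (124) + second encoder (126)
    return a, c, d

pi = [3, 0, 2, 5, 1, 7, 4, 6]                       # example 8-bit permutation
a, c, d = turbo_encode([1, 0, 1, 1, 0, 0, 1, 0], pi)
# 3M output bits for M input bits -> rate 1/3
```

Real turbo encoders use recursive systematic convolutional constituents; the parity code above merely keeps the structure visible.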
FIG. 4 illustrates a channel decoder 220 for decoding data packets encoded by the encoder 120 of FIG. 3. Soft values from the demodulator 210 are passed through a switch 225 to a first SISO decoder 221A. The soft decoder outputs are passed into a summer 222A where the soft inputs to the first SISO decoder are subtracted. Thus, the summer 222A arithmetically combines the output from the SISO decoder 221A and the soft input values. The outputs of the summer 222A are considered extrinsic information. The extrinsic information is interleaved by interleaver 223 and fed into a second SISO decoder 221B. Extrinsic information is generated at the output of a summer 222B, is de-interleaved by the de-interleaver 224, and is passed through the switch 225 as input to the first SISO decoder 221A. The process continues in an iterative manner until a termination criterion is satisfied. The values output from the SISO 221B then are passed through a decision device, such as a slicer 226, that provides the final estimate of the transmitted bits, â1, â2, …, âM.
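The iterative exchange in FIG. 4 can be sketched schematically. This is an illustrative skeleton, not the patent's decoder: the `siso` function below is a stand-in (a real SISO would run BCJR or SOVA on the code trellis), and the fixed iteration count stands in for the termination criterion.

```python
def siso(llr_in):
    # Stand-in SISO decoder: a real one would run BCJR or SOVA on the
    # trellis; here it merely scales reliabilities so the loop is runnable.
    return [1.5 * x for x in llr_in]

def interleave(v, pi):
    return [v[i] for i in pi]

def deinterleave(v, pi):
    out = [0.0] * len(v)
    for j, i in enumerate(pi):
        out[i] = v[j]
    return out

def turbo_decode(channel_llrs, pi, iterations=4):
    extrinsic = [0.0] * len(channel_llrs)
    for _ in range(iterations):
        # first SISO (221A): channel values plus de-interleaved extrinsic info
        inp_a = [c + e for c, e in zip(channel_llrs, extrinsic)]
        out_a = siso(inp_a)
        ext_a = [o - i for o, i in zip(out_a, inp_a)]       # summer 222A
        # second SISO (221B) works on interleaved data (interleaver 223)
        inp_b = interleave([c + e for c, e in zip(channel_llrs, ext_a)], pi)
        out_b = siso(inp_b)
        ext_b = [o - i for o, i in zip(out_b, inp_b)]       # summer 222B
        extrinsic = deinterleave(ext_b, pi)                 # de-interleaver 224
    # slicer 226: hard decisions from the final soft outputs
    return [0 if x >= 0 else 1 for x in deinterleave(out_b, pi)]

pi = [2, 0, 3, 1]                                   # toy 4-bit interleaver
bits = turbo_decode([3.1, -0.2, -4.0, 1.1], pi)
```

The subtraction at each summer is what makes the fed-back quantity *extrinsic*: each SISO passes on only the information it added, not what it was given.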
As illustrated in FIG. 5, one or both of the SISO decoders 221A, 221B can be replaced by a look-up table (LUT) 19 that is pre-configured to approximate the output of an algorithmic SISO decoding process. A pre-configuration process is shown in FIG. 6. The contents of the LUT 19 are determined by generating a 2^(N*K1)-entry table, with each entry having K2 bits. The table entries are chosen so that the table's output in response to a given set of N soft symbols (each consisting of K1 bits) corresponds to an approximation of the output of a pre-specified conventional SISO decoder 221 operating on a block of M symbols, where M is greater than N−1. The approximation can be based, for example, on the mean square of the error term E(n), or on some other pre-specified criterion. If the optimization criterion is minimizing the mean square error between the LUT and actual SISO decoder outputs, then the entries of the LUT 19 should be adjusted to minimize the sum of squared errors:

Σ_{n=1}^{M} |E(n)|²
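Under the mean-square-error criterion, the optimal entry for each table index is the conditional mean of the reference decoder's output over all training windows that map to that index. The sketch below illustrates this pre-configuration; the uniform quantizer, the training data, and the toy reference decoder are all assumptions made for the example, not taken from the patent.

```python
from collections import defaultdict

N, K1 = 2, 2                      # window of N soft symbols, K1 bits each
LEVELS = 1 << K1                  # quantizer levels per symbol

def quantize(x):
    """Map a soft value in [-1, 1] to one of 2**K1 uniform levels."""
    q = int((x + 1.0) / 2.0 * (LEVELS - 1) + 0.5)
    return max(0, min(LEVELS - 1, q))

def lut_index(window):
    """Pack N quantized symbols into a single index in [0, 2**(N*K1))."""
    idx = 0
    for x in window:
        idx = idx * LEVELS + quantize(x)
    return idx

def build_lut(training_windows, reference_siso):
    """MSE-optimal pre-configuration (cf. FIG. 6): each entry is the mean
    reference output over the training windows hitting that index, which
    minimizes sum |E(n)|^2 for that table cell."""
    sums, counts = defaultdict(float), defaultdict(int)
    for w in training_windows:
        i = lut_index(w)
        sums[i] += reference_siso(w)
        counts[i] += 1
    return {i: sums[i] / counts[i] for i in sums}

ref = lambda w: sum(w) / len(w)   # toy stand-in for the reference SISO decoder
lut = build_lut([(0.9, 0.8), (0.85, 0.9), (-0.9, -0.8)], ref)
```

With N = 2 and K1 = 2 the table has 2^(N*K1) = 16 addressable entries; only indices observed in training are populated here.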
An alternative LUT pre-configuration criterion can be the overall probability of error of the decoder that utilizes the LUT. In that case, the entries of the table are chosen to minimize the overall error probability of the resulting decoder.
The parameters N, K1 and K2 are user-defined parameters that may depend on the coding scheme, signal-to-noise ratio (SNR), anticipated channel conditions, and on the value of M.
As shown by FIG. 5, when a sequence of soft symbols 41 is received, the corresponding output soft symbol is found in the LUT 19. The n-th soft output is obtained by feeding the soft symbols n−n1, n−n1+1, …, n−1, n, n+1, …, n−n1+N−1 to the look-up table and reading the corresponding K2-bit table entry. The parameter n1 also is a design parameter. To reduce the number of entries in the table, joint quantization of several symbols can be performed, thereby allowing the decoder 220 to operate on soft multi-symbols requiring fewer bits. This can significantly reduce the number of bits required at the input to the LUT 19 and, therefore, significantly reduce the number of its entries. When implemented for turbo decoders, this also can reduce the overall memory requirements of the decoder, because joint quantization of multiple symbols is more economical in terms of storage requirements than individual single-symbol scalar quantization of each symbol (bit).
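The per-output lookup can be written down directly. This sketch is illustrative: the 1-bit-per-symbol indexing function and the placeholder table contents are assumptions, chosen only to make the window arithmetic (n−n1 through n−n1+N−1) concrete.

```python
def soft_output(lut, soft, n, n1, N, index_fn):
    """Read the n-th soft output (cf. FIG. 5): address the LUT with the
    window of N soft symbols soft[n-n1] ... soft[n-n1+N-1]."""
    window = tuple(soft[n - n1 : n - n1 + N])
    return lut[index_fn(window)]

# Toy 1-bit-per-symbol indexing with N = 3 and n1 = 1 (window centred on n).
index_fn = lambda w: sum((1 if x >= 0 else 0) << k for k, x in enumerate(w))
lut = {i: float(i) for i in range(8)}      # placeholder 2**(N*K1) entries
y = soft_output(lut, [0.5, -0.3, 0.8, -0.9], n=1, n1=1, N=3, index_fn=index_fn)
```

Choosing n1 trades off how much past versus future context each output sees, which is why the text treats it as a design parameter.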
FIG. 7 illustrates a rate 1/3 parallel concatenated turbo decoder 220 that uses both joint quantization and LUT SISO decoding. As shown in FIG. 7, the decoder 220 receives a vector of soft symbols from the demodulator 210. A compressor 315 jointly quantizes P adjacent soft symbols that then are decoded by a soft-input soft-output (SISO) decoder block 317 that can include a look-up table (LUT) 319 that is pre-configured to approximate the output of a SISO decoder that operates jointly on P soft symbols. The pre-configuration process is explained below.
The soft outputs of the SISO decoder 317 are passed through a summer 321 where the soft inputs are subtracted to generate extrinsic information. The extrinsic information is decompressed into the single-symbol (bit) level by a decompressor 323 and interleaved by the interleaver 325.
Next, the soft symbols are re-compressed by a compressor 327 that functions in a manner similar to the first compressor 315. The compressed symbols are processed by a second SISO decoder block 329 that also can use a LUT 331 to decode the symbols. The SISO block 329 and its LUT 331 can be identical to or substantially the same as the first SISO block 317 and LUT 319. The resulting soft symbols are used to generate extrinsic information at the output of another summer 333. The extrinsic information is decompressed to the bit level by decompressor 335 and de-interleaved by de-interleaver 337. The decompressor 335 may be identical to or substantially the same as decompressor 323.
The process continues in an iterative manner for a predetermined number of iterations of the decoding process, or until some other termination criterion is reached. A slicer 339 then converts the soft symbols to bits to provide
the estimated transmitted information sequence â1, â2, …, âM. As explained above, to reduce the number of entries in the tables 319, 331, joint quantization of P symbols can be performed to allow the decoder to operate on soft multi-symbols that require fewer bits to represent each multi-symbol compared to the P*K1 bits that may be required in the scalar single-symbol quantizer approach. This can significantly reduce the number of bits required at the input to the look-up tables 319, 331 and can significantly reduce the number of entries in the tables. In the context of turbo decoding, it also can reduce the storage requirements, because fewer than M*K1 bits are required to represent each block of M soft symbols. For a rate 1/3 turbo code, at least three such blocks would be stored.
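The bit saving from joint quantization can be made concrete with a small vector quantizer. This is a hedged sketch under assumed parameters (P = 2, K1 = 4, and an arbitrary 8-word codebook), not the patent's quantizer: jointly quantizing P symbols against a codebook of 2^B vectors costs B bits per multi-symbol, versus P*K1 bits for P independent scalar quantizations.

```python
import math

def nearest_codeword(codebook, vec):
    """Joint (vector) quantizer: map P soft symbols to one codebook index
    by minimum squared Euclidean distance."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(codebook[i], vec)))

P, K1 = 2, 4
scalar_bits = P * K1                        # 8 bits for 2 symbols, scalar case

# Example joint codebook of 8 two-symbol vectors (3 bits per multi-symbol).
codebook = [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0),
            (0.0, 0.0), (0.5, 0.5), (-0.5, 0.5), (0.5, -0.5)]
joint_bits = math.ceil(math.log2(len(codebook)))

idx = nearest_codeword(codebook, (0.9, 0.8))   # quantize one soft multi-symbol
```

The saving here (3 bits instead of 8 per pair) is only illustrative; the point is that the codebook can exploit correlation between adjacent soft symbols that scalar quantization cannot.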
As shown in FIG. 8, the SISO decoder 341 that is used as a reference for adapting the look-up table 319 (or 331) can be fed with the soft symbols without quantizing them. The LUT 319 would receive soft multi-symbols as input. A multi-symbol quantizer 351 also can be used to implement a look-up table adapted for use in the decoder, thereby optimizing both the quantizer 351 and LUT 319.
Alternatively, the SISO decoder 341 can be fed with the quantized multi-symbols and adaptation of the LUT can be performed as described above.
The compressors 315 and 327 can be identical to or substantially the same as the joint quantizer 351, and the decompressors 323 and 335 can implement the inverse operation.
As shown in FIG. 9, the multi-symbol quantizer 351 and the inverse operation of decomposing the multi-symbol soft output from the LUT-based SISO decoder 319 into P multiple soft symbols can be pre-configured jointly with a decompressor block 353. The joint quantizer 351 and the decompressor 353 also can be implemented by a look-up table and can be pre-configured to minimize the error term E(n) or to minimize the overall probability of error.
The decompressors 323 and 335 can be identical to or substantially similar to the pre-configured multi-symbol decompressor 353.
The foregoing techniques using look-up tables to implement the SISO block(s) can be used for other channel decoders, for example, for serially concatenated turbo decoders or for other non-turbo channel decoders. The techniques can be used for SIHO, HIHO and HISO decoders.
Using a look-up table that approximates the output of the algorithmic decoding process can help reduce the cost and complexity of channel decoders.
Similarly, the joint quantization approach with compress/decompress stages can be performed without the SISO being replaced by look-up tables, for example, to reduce memory requirements in turbo codes.
Although the techniques have been described above in the context of processing turbo codes, a look-up table can be used to replace a conventional SISO decoder in other contexts as well, such as the BCJR algorithm and the soft-output Viterbi algorithm (SOVA). Soft-input hard-output (SIHO) decoders, such as those used in the Viterbi algorithm, also can be implemented with this approach, as well as hard-input hard-output (HIHO) decoders. In addition, partial implementation of any of these decoders using a LUT also can be used. For example, the forward iteration of the BCJR algorithm can be implemented in a conventional manner while the backward iteration may be implemented with a LUT.
Various features of the system can be implemented in hardware, software, or a combination of hardware and software. For example, some features of the system can be implemented in computer programs executing on programmable computers. Each program can be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. Furthermore, each such computer program can be stored on a storage medium, such as read-only memory (ROM), readable by a general or special purpose programmable computer or processor, for configuring and operating the computer when the storage medium is read by the computer to perform the functions described above. Other implementations are within the scope of the following claims.

Claims

What is claimed is:
1. A method comprising: receiving a packet of encoded data; and decoding the packet using a look-up table that stores information approximating output of an algorithmic decoding process.
2. The method of claim 1 including performing joint quantization of the data packet before decoding.
3. The method of claim 1 wherein data in the packet is encoded by turbo coding.
4. The method of claim 3 wherein decoding includes processing the data packet using a parallel concatenated turbo decoder.
5. The method of claim 1 including decoding the packet using a table that stores information approximating output of a soft-input soft-output algorithmic decoding process, a soft-input hard-output algorithmic decoding process, a hard-input soft-output algorithmic decoding process, or a hard-input hard-output algorithmic decoding process.
6. A method comprising: (a) receiving encoded symbols;
(b) compressing the symbols;
(c) decoding the compressed symbols using a first look-up table that stores information approximating output of an algorithmic decoding process;
(d) arithmetically combining the compressed symbols with the decoded symbols to obtain a first result; and
(e) decompressing the first result.
7. The method of claim 6 including:
(f) interleaving the decompressed first result;
(g) compressing the interleaved first result;
(h) decoding the compressed, interleaved first result using a second look-up table that stores information approximating output of an algorithmic decoding process;
(i) arithmetically combining the decoded first result with the compressed, interleaved first result to obtain a second result;
(j) decompressing the second result; and (k) de-interleaving the decompressed second result.
8. The method of claim 7 including: repeating (b) through (k) until predetermined criteria are satisfied; and determining information bits corresponding to the received encoded symbols.
9. An apparatus comprising: a memory storing a look-up table with information approximating output of an algorithmic decoding process; and a processor configured to use the look-up table to decode data packets encoded by convolutional coding.
10. The apparatus of claim 9 wherein the table stores information approximating a soft-input soft-output algorithmic decoding process, a soft-input hard-output algorithmic decoding process, a hard-input soft-output algorithmic decoding process, or a hard-input hard-output algorithmic decoding process.
11. The apparatus of claim 10 including a joint quantization module for converting soft symbols in the packet into soft multi-symbols prior to the processor's decoding the data packets using the look-up table.
12. The apparatus of claim 10 wherein the processor is configured to decode the packet by turbo decoding.
13. An apparatus comprising: memory storing a first look-up table with information approximating output of an algorithmic decoding process; and a processor configured to
(a) compress a packet of received encoded symbols; (b) decode the compressed symbols using the first look-up table;
(c) arithmetically combine the compressed symbols with the decoded symbols to obtain a first result; and
(d) decompress the first result.
14. The apparatus of claim 13 wherein the memory stores a second look-up table with information approximating output of an algorithmic decoding process and wherein the processor is configured to: (e) interleave the decompressed first result;
(f) compress the interleaved first result;
(g) decode the compressed, interleaved first result using the second look-up table;
(h) arithmetically combine the decoded first result with the compressed, interleaved first result to obtain a second result;
(i) decompress the second result; and
(j) de-interleave the decompressed second result.
15. The apparatus of claim 14 wherein the processor is configured to: repeat (a) through (j) until predetermined criteria are satisfied; and determine information bits corresponding to the encoded symbols.
16. An article comprising a computer-readable medium that stores computer-executable instructions for causing a computer system, in response to receiving an encoded data packet, to use a look-up table that approximates output of an algorithmic decoding process to decode the packet.
17. The article of claim 16 including instructions for causing the computer system to perform joint quantization before using the look-up table to decode the packet.
18. The article of claim 16 wherein data in the packet to be decoded was encoded by turbo coding.
19. An article comprising a computer-readable medium that stores computer-executable instructions for causing a computer system to:
(a) compress a packet of received encoded symbols; (b) decode the compressed symbols using a first look-up table approximating output of an algorithmic decoding process;
(c) arithmetically combine the compressed symbols with the decoded symbols to obtain a first result; and
(d) decompress the first result.
20. The article of claim 19 including instructions for causing the computer system to: (e) interleave the decompressed first result;
(f) compress the interleaved first result;
(g) decode the compressed, interleaved first result using a second look-up table approximating output of an algorithmic decoding process; (h) arithmetically combine the decoded first result with the compressed, interleaved first result to obtain a second result;
(i) decompress the second result; and
(j) de-interleave the decompressed second result.
21. The article of claim 20 including instructions for causing the computer system to: repeat (a) through (j) until predetermined criteria are satisfied; and determine information bits corresponding to the encoded symbols.
22. The article of claim 16 including instructions for causing the computer system to decode the compressed symbols using a first look-up table approximating output of a soft-input soft-output algorithmic decoding process, a soft-input hard-output algorithmic decoding process, a hard-input soft-output algorithmic decoding process, or a hard-input hard-output algorithmic decoding process.
23. A method comprising: receiving a packet of encoded symbols; jointly quantizing multiple symbols; decoding the jointly quantized symbols to obtain a result; and decompressing the result into individual decoded symbols.
24. The method of claim 23 including decoding the jointly quantized symbols using a look-up table that approximates output of an algorithmic decoding process.
25. The method of claim 23 including decoding the jointly quantized symbols using a look-up table that approximates output of a soft-input soft-output algorithmic decoding process, a soft-input hard-output algorithmic decoding process, a hard-input soft-output algorithmic decoding process, or a hard-input hard-output algorithmic decoding process.
26. An article comprising a computer-readable medium that stores computer-executable instructions for causing a computer system, in response to receiving a packet of encoded symbols, to: jointly quantize multiple ones of the symbols; decode the jointly quantized symbols to obtain a result; and decompress the result into individual decoded symbols.
27. The article of claim 26 including instructions for causing the computer system to decode the jointly quantized symbols using a look-up table that approximates output of an algorithmic decoding process.
28. The article of claim 26 including instructions for causing the computer system to decode the jointly quantized symbols using a look-up table that approximates output of a soft-input soft-output algorithmic decoding process, a soft-input hard-output algorithmic decoding process, a hard-input soft-output algorithmic decoding process, or a hard-input hard-output algorithmic decoding process.
PCT/US2002/014878 2001-06-12 2002-05-10 Low complexity channel decoders WO2002101936A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/880,707 US7243295B2 (en) 2001-06-12 2001-06-12 Low complexity channel decoders
US09/880,707 2001-06-12

Publications (2)

Publication Number Publication Date
WO2002101936A2 true WO2002101936A2 (en) 2002-12-19
WO2002101936A3 WO2002101936A3 (en) 2003-02-27

Family

ID=25376895

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/014878 WO2002101936A2 (en) 2001-06-12 2002-05-10 Low complexity channel decoders

Country Status (4)

Country Link
US (3) US7243295B2 (en)
CN (1) CN100389540C (en)
MY (1) MY138817A (en)
WO (1) WO2002101936A2 (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7571683B2 (en) * 2001-03-27 2009-08-11 General Electric Company Electrical energy capture system with circuitry for blocking flow of undesirable electrical currents therein
US7243295B2 (en) * 2001-06-12 2007-07-10 Intel Corporation Low complexity channel decoders
US7450631B2 (en) * 2001-10-26 2008-11-11 Intel Corporation Metric correction for multi user detection, for long codes DS-CDMA
US7298788B2 (en) * 2003-10-27 2007-11-20 Ge Medical Systems Information Technologies, Inc. Wireless communication system and method
EP1562295A1 (en) * 2004-02-09 2005-08-10 Matsushita Electric Industrial Co., Ltd. A method to reduce the memory requirement of the deinterleaver within a digital audio broadcast radio receiver using data compression
US8194760B2 (en) * 2006-06-01 2012-06-05 Ntt Docomo, Inc. Method and apparatus for distributed space-time coding in wireless radio networks
US8027407B2 (en) * 2006-11-06 2011-09-27 Ntt Docomo, Inc. Method and apparatus for asynchronous space-time coded transmission from multiple base stations over wireless radio networks
US8059732B2 (en) 2006-11-28 2011-11-15 Ntt Docomo, Inc. Method and apparatus for wideband transmission from multiple non-collocated base stations over wireless radio networks
US8861356B2 (en) * 2007-03-13 2014-10-14 Ntt Docomo, Inc. Method and apparatus for prioritized information delivery with network coding over time-varying network topologies
US8064548B2 (en) * 2007-05-18 2011-11-22 Ntt Docomo, Inc. Adaptive MaxLogMAP-type receiver structures
US20090285323A1 (en) * 2008-05-15 2009-11-19 Sundberg Carl-Erik W Adaptive soft output m-algorithm receiver structures
US20090075686A1 (en) * 2007-09-19 2009-03-19 Gomadam Krishna S Method and apparatus for wideband transmission based on multi-user mimo and two-way training
US7978793B2 (en) * 2008-02-06 2011-07-12 Freescale Semiconductor, Inc. Method for generating soft decision signal from hard decision signal in a receiver system
US8325840B2 (en) * 2008-02-25 2012-12-04 Ntt Docomo, Inc. Tree position adaptive soft output M-algorithm receiver structures
US8279954B2 (en) * 2008-03-06 2012-10-02 Ntt Docomo, Inc. Adaptive forward-backward soft output M-algorithm receiver structures
US8565329B2 (en) * 2008-06-03 2013-10-22 Ntt Docomo, Inc. Soft output M-algorithm receiver structures with generalized survivor selection criteria for MIMO systems
US8229443B2 (en) * 2008-08-13 2012-07-24 Ntt Docomo, Inc. Method of combined user and coordination pattern scheduling over varying antenna and base-station coordination patterns in a multi-cell environment
US8705484B2 (en) * 2008-08-15 2014-04-22 Ntt Docomo, Inc. Method for varying transmit power patterns in a multi-cell environment
US8451951B2 (en) * 2008-08-15 2013-05-28 Ntt Docomo, Inc. Channel classification and rate adaptation for SU-MIMO systems
US8542640B2 (en) * 2008-08-28 2013-09-24 Ntt Docomo, Inc. Inter-cell approach to operating wireless beam-forming and user selection/scheduling in multi-cell environments based on limited signaling between patterns of subsets of cells
US8855221B2 (en) * 2008-09-15 2014-10-07 Ntt Docomo, Inc. Method and apparatus for iterative receiver structures for OFDM/MIMO systems with bit interleaved coded modulation
US9048977B2 (en) * 2009-05-05 2015-06-02 Ntt Docomo, Inc. Receiver terminal driven joint encoder and decoder mode adaptation for SU-MIMO systems
US8514961B2 (en) * 2010-02-04 2013-08-20 Ntt Docomo, Inc. Method and apparatus for distributed space-time coding in wireless radio networks
CN102832953B (en) * 2011-06-16 2017-12-12 中兴通讯股份有限公司 Convolutional code decoder method and device
US8929432B2 (en) 2012-09-07 2015-01-06 Sony Corporation Combination A/53 and A/153 receiver using a HIHO viterbi decoder
CN109347484B (en) * 2018-11-05 2022-07-12 西安微电子技术研究所 64B/66B encoder based on two-stage table look-up and encoding method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5381425A (en) * 1992-03-27 1995-01-10 North Carolina State University System for encoding and decoding of convolutionally encoded data
US6009552A (en) * 1997-06-18 1999-12-28 Motorola, Inc. Soft-decision syndrome-based decoder for convolutional codes

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4051331A (en) * 1976-03-29 1977-09-27 Brigham Young University Speech coding hearing aid system utilizing formant frequency transformation
US4393276A (en) * 1981-03-19 1983-07-12 Bell Telephone Laboratories, Incorporated Fourier masking analog signal secure communication system
CA1212452A (en) 1982-06-11 1986-10-07 Tokumichi Murakami Vector quantizer
US4873701A (en) * 1987-09-16 1989-10-10 Penril Corporation Modem and method for 8 dimensional trellis code modulation
JP2707564B2 (en) * 1987-12-14 1998-01-28 株式会社日立製作所 Audio coding method
US5384891A (en) * 1988-09-28 1995-01-24 Hitachi, Ltd. Vector quantizing apparatus and speech analysis-synthesis system using the apparatus
US5191548A (en) 1990-03-14 1993-03-02 C-Cube Microsystems System for compression and decompression of video data using discrete cosine transform and coding techniques
US5297170A (en) 1990-08-21 1994-03-22 Codex Corporation Lattice and trellis-coded quantization
US5434623A (en) * 1991-12-20 1995-07-18 Ampex Corporation Method and apparatus for image data compression using combined luminance/chrominance coding
JP3144009B2 (en) * 1991-12-24 2001-03-07 日本電気株式会社 Speech codec
DK0670555T3 (en) * 1992-09-28 2000-09-18 Olympus Optical Co Registration medium with bar code and information registration system
JP3343965B2 (en) * 1992-10-31 2002-11-11 ソニー株式会社 Voice encoding method and decoding method
US5319707A (en) 1992-11-02 1994-06-07 Scientific Atlanta System and method for multiplexing a plurality of digital program services for transmission to remote locations
JPH06178274A (en) 1992-11-30 1994-06-24 Sony Corp Motion picture decoding device
US5651090A (en) * 1994-05-06 1997-07-22 Nippon Telegraph And Telephone Corporation Coding method and coder for coding input signals of plural channels using vector quantization, and decoding method and decoder therefor
JPH087332A (en) * 1994-06-16 1996-01-12 Olympus Optical Co Ltd Information recording medium and reading device
US5675590A (en) * 1994-11-23 1997-10-07 At&T Wireless Services, Inc. Cyclic trellis coded modulation
US5701390A (en) * 1995-02-22 1997-12-23 Digital Voice Systems, Inc. Synthesis of MBE-based coded speech using regenerated phase information
US6131084A (en) * 1997-03-14 2000-10-10 Digital Voice Systems, Inc. Dual subframe quantization of spectral magnitudes
US6161089A (en) * 1997-03-14 2000-12-12 Digital Voice Systems, Inc. Multi-subframe quantization of spectral parameters
US5974181A (en) 1997-03-20 1999-10-26 Motorola, Inc. Data compression system, method, and apparatus
US6161209A (en) * 1997-03-28 2000-12-12 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry Through The Communications Research Centre Joint detector for multiple coded digital signals
US5983384A (en) * 1997-04-21 1999-11-09 General Electric Company Turbo-coding with staged data transmission and processing
US6029264A (en) * 1997-04-28 2000-02-22 The Trustees Of Princeton University System and method for error correcting a received data stream in a concatenated system
US6199037B1 (en) * 1997-12-04 2001-03-06 Digital Voice Systems, Inc. Joint quantization of speech subframe voicing metrics and fundamental frequencies
US6253185B1 (en) * 1998-02-25 2001-06-26 Lucent Technologies Inc. Multiple description transform coding of audio using optimal transforms of arbitrary dimension
US6393072B1 (en) * 1998-09-24 2002-05-21 Lockheed Martin Corporation Channel decoder using vocoder joint statistics
US6202189B1 (en) * 1998-12-17 2001-03-13 Teledesic Llc Punctured serial concatenated convolutional coding system and method for low-earth-orbit satellite data communication
JP3246484B2 (en) * 1999-07-07 2002-01-15 日本電気株式会社 Turbo decoder
CN1133276C (en) 1999-11-12 2003-12-31 深圳市中兴通讯股份有限公司 Decoding method and decoder for high-speed parallel cascade codes
KR100474833B1 (en) * 1999-11-17 2005-03-08 삼성전자주식회사 Predictive and Mel-scale binary vector quantization apparatus and method for variable dimension spectral magnitude
US6377916B1 (en) * 1999-11-29 2002-04-23 Digital Voice Systems, Inc. Multiband harmonic transform coder
US6516437B1 (en) * 2000-03-07 2003-02-04 General Electric Company Turbo decoder control for use with a programmable interleaver, variable block length, and multiple code rates
US6307901B1 (en) * 2000-04-24 2001-10-23 Motorola, Inc. Turbo decoder with decision feedback equalization
US6725409B1 (en) * 2000-06-30 2004-04-20 Texas Instruments Incorporated DSP instruction for turbo decoding
US7243295B2 (en) 2001-06-12 2007-07-10 Intel Corporation Low complexity channel decoders
US7634399B2 (en) * 2003-01-30 2009-12-15 Digital Voice Systems, Inc. Voice transcoder


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HSU C-P ET AL: "A soft decision syndrome decoding algorithm for convolutional codes" IEEE MILCOM 1990, vol. 1, 30 September 1990 (1990-09-30) - 3 October 1990 (1990-10-03), pages 375-379, XP010002805 Monterey (US) *
TAJIMA M: "ON THE STRUCTURE OF AN SST VITERBI DECODER FOR GENERAL RATE (N-1)/N CONVOLUTIONAL CODES VIEWED IN THE LIGHT OF SYNDROME DECODING" IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS, COMMUNICATIONS AND COMPUTER SCIENCES, INSTITUTE OF ELECTRONICS INFORMATION AND COMM. ENG. TOKYO, JP, vol. E79-A, no. 9, 1 September 1996 (1996-09-01), pages 1447-1449, XP000679639 ISSN: 0916-8508 *

Also Published As

Publication number Publication date
CN1515079A (en) 2004-07-21
US20040199856A1 (en) 2004-10-07
US20070198899A1 (en) 2007-08-23
US20020194567A1 (en) 2002-12-19
WO2002101936A3 (en) 2003-02-27
CN100389540C (en) 2008-05-21
MY138817A (en) 2009-07-31
US7243295B2 (en) 2007-07-10
US7240274B2 (en) 2007-07-03


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 028117816

Country of ref document: CN

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP