-
Summary. Hello, in traditional coding, people focused on constructing short channel codes, that is, on finding an encoding structure with the largest possible minimum distance. The Turbo code, however, uses a parallel concatenation of two recursive systematic convolutional codes (RSC-PCCC), with an interleaver in the encoder and a deinterleaver in the decoder. This effectively realizes the idea of random coding: the effect of a long code is obtained through an effective combination of short codes, achieving performance close to the limit set by Shannon's theory.
The encoder consists of three parts: the input bits go directly to the multiplexer as the systematic output; they also pass through component encoder 1 and then, via the switch (puncturing) circuit, into the multiplexer; and they pass through the interleaver into component encoder 2, whose output likewise enters the multiplexer through the switch circuit. The codes produced by encoder 1 and encoder 2 are called the component codes of the Turbo code.
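To make this structure concrete, here is a minimal Python sketch of such an encoder. It assumes a memory-2 RSC component code with octal generators (7, 5), a common textbook choice; the names rsc_encode and turbo_encode and the unpunctured rate-1/3 multiplexing are illustrative assumptions, not a specific standard.

```python
import random

def rsc_encode(bits):
    """Rate-1/2 recursive systematic convolutional (RSC) encoder,
    memory 2, octal generators (7, 5): feedback 1+D+D^2,
    feedforward 1+D^2.  Returns only the parity stream; the
    systematic bits are the input itself."""
    s1 = s2 = 0                         # shift-register contents
    parity = []
    for b in bits:
        fb = b ^ s1 ^ s2                # feedback bit (1 + D + D^2)
        parity.append(fb ^ s2)          # feedforward taps 1 and D^2
        s1, s2 = fb, s1
    return parity

def turbo_encode(bits, perm):
    """Parallel concatenation: systematic bits, parity from encoder 1,
    and parity from encoder 2, which sees the interleaved bits."""
    p1 = rsc_encode(bits)
    p2 = rsc_encode([bits[i] for i in perm])
    out = []
    for k in range(len(bits)):          # multiplex to a rate-1/3 stream;
        out += [bits[k], p1[k], p2[k]]  # a real system may puncture here
    return out

msg = [1, 0, 1, 1, 0, 0, 1, 0]
perm = list(range(len(msg)))
random.shuffle(perm)                    # pseudo-random interleaver
print(turbo_encode(msg, perm))
```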
-
The main innovation of the Turbo code is the use of likelihood ratios to reconcile the differences between the outputs of the two decoders. Each decoder generates a set of hypotheses (likelihood ratios) for the m input bits; the two sets of hypotheses are compared, and where they differ, the decoders exchange their hypothetical results. Each decoder can use the other's hypotheses to form new hypotheses of its own; the new results are then compared again, and the process repeats until both decoders arrive at the same hypotheses.
This process is similar to solving a crossword or Sudoku puzzle. Imagine two different people (the decoders) are given the same crossword, but they solve it in different ways: one looks only along the horizontal and vertical directions, the other only along the diagonals. Naturally, neither of their independent results can be guaranteed completely correct, so as they fill in the words they also note how certain they are of each entry: some words they are sure can never be wrong, some are uncertain, and some are outright guesses...
They then compare their results together with the corresponding certainties. By referring to each other's results, both sides can draw hints from the differences, then attempt the crossword again based on those hints, repeating the process until their two results agree exactly (still without being sure the answer matches the true solution, only that they now agree). This is a probabilistic decoding algorithm, namely the maximum a posteriori probability (MAP) algorithm. Before the advent of Turbo codes, however, the probabilistic decoding algorithm used in channel coding was the maximum likelihood (ML) algorithm.
The ML algorithm is a simplification of the MAP algorithm: it is a suboptimal decoding algorithm that assumes the source symbols are equiprobable. The decoding of the Turbo code adopts the MAP algorithm with an improved decoder structure that introduces the concept of feedback, striking a compromise between performance and complexity. At the same time, Turbo decoding is iterative, which is completely different from classical algebraic decoding.
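The relationship between the two criteria can be stated in one line (a standard identity, with y the received sequence and x the transmitted message):

$$
\hat{x}_{\mathrm{MAP}} = \arg\max_{x} P(x \mid y) = \arg\max_{x} P(y \mid x)\,P(x),
\qquad
\hat{x}_{\mathrm{ML}} = \arg\max_{x} P(y \mid x)
$$

When the prior $P(x)$ is uniform, i.e. the source symbols are equiprobable, the factor $P(x)$ is constant and the two rules coincide, which is exactly the simplification described above.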
The decoding algorithm of the Turbo code was first developed as an improvement of the BCJR algorithm, called the MAP algorithm; it later gave rise to the Log-MAP algorithm, the Max-Log-MAP algorithm, and the soft-output Viterbi algorithm (SOVA). The decoding structure of the Turbo code has three characteristics:
1) Serial concatenation of the two component decoders.
2) Iterative decoding.
3) Exchange of extrinsic information during the iterative decoding process.
Probabilistic decoding principle and structure.
During decoding, the received information is first processed, and the passing of extrinsic information between the two component decoders forms a cyclic, iterative structure. Thanks to the extrinsic information, the bit error rate at a given signal-to-noise ratio decreases as the number of iterations grows. At the same time, however, the correlation between the extrinsic information and the received sequence also grows with the number of decoding passes, so the error-correction capability the extrinsic information provides weakens, and after a certain number of iterations the decoding performance no longer improves.
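The loop just described can be sketched as follows. This is a toy skeleton, not a working decoder: the siso() stand-in below merely combines its soft inputs, where a real implementation would run BCJR/MAP on the component trellis; the function names and the LLR sign convention (positive means bit 1) are assumptions made for illustration.

```python
def siso(llr_sys, llr_parity, apriori):
    # Stand-in for a real BCJR/MAP component decoder: it should return
    # a posterior LLR per bit using the trellis; here it only adds
    # its soft inputs, so the skeleton runs but corrects nothing.
    return [s + a for s, a in zip(llr_sys, apriori)]

def turbo_decode(llr_sys, llr_p1, llr_p2, perm, n_iter=8):
    """Iterative-decoding skeleton: two SISO decoders exchange
    extrinsic LLRs through the interleaver (perm) and its inverse."""
    n = len(llr_sys)
    inv = [0] * n
    for i, p in enumerate(perm):
        inv[p] = i                          # inverse permutation
    ext2 = [0.0] * n                        # extrinsic info from decoder 2
    post1 = llr_sys[:]
    for _ in range(n_iter):
        # Decoder 1 uses decoder 2's extrinsic output as its a priori.
        post1 = siso(llr_sys, llr_p1, ext2)
        ext1 = [post1[k] - llr_sys[k] - ext2[k] for k in range(n)]
        # Decoder 2 works on the interleaved sequence.
        sys2 = [llr_sys[perm[k]] for k in range(n)]
        apr2 = [ext1[perm[k]] for k in range(n)]
        post2 = siso(sys2, llr_p2, apr2)
        ext2_i = [post2[k] - sys2[k] - apr2[k] for k in range(n)]
        ext2 = [ext2_i[inv[k]] for k in range(n)]  # deinterleave
    return [1 if llr > 0 else 0 for llr in post1]  # final hard decision
```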
-
As mentioned earlier, the Turbo code requires a soft-input soft-output decoding algorithm. The output of a soft-output decoder should contain not only a hard decision, but also a measure of confidence in that decision. The decoding algorithm must address three aspects: the introduction of extrinsic information; how to make full use of all available information during iterative decoding, so as to prevent simple positive feedback from forming and to ensure the convergence of the algorithm; and how to make full use of the information in the original code.
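The standard way to express such a soft output is the log-likelihood ratio (LLR) of each information bit, a textbook definition rather than anything specific to this article:

$$
L(b_k) = \ln \frac{P(b_k = 1 \mid \mathbf{y})}{P(b_k = 0 \mid \mathbf{y})}
$$

The sign of $L(b_k)$ gives the hard decision, and its magnitude $|L(b_k)|$ is exactly the degree of confidence the paragraph above calls for.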
There are several common soft-output algorithms. The SOVA algorithm is roughly twice as computationally intensive as the standard Viterbi algorithm. The Viterbi algorithm is a maximum-likelihood sequence estimation algorithm, but by itself it cannot provide a soft output, because at each step it must discard the low-likelihood paths, leaving only one optimal surviving path per state.
For the decoder to attach a reliability to each bit it outputs, the pruning of low-likelihood paths must be modified so that the necessary information is preserved. The basic idea is to use the metric difference between the surviving optimal path and the deleted path at each merge point: the smaller the difference, the lower the reliability of the decision. This difference is then used to update the reliability of each bit along the path.
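A minimal sketch of this update rule, assuming a list-based representation for clarity (the function name is illustrative; a real SOVA applies this during traceback on the trellis):

```python
def sova_reliability_update(reliab, survivor_bits, discarded_bits, delta):
    """At a merge point where the survivor beat a competing path by
    metric difference delta, any bit on which the two paths disagree
    can be no more reliable than delta itself, so cap it there."""
    for k, (b_s, b_d) in enumerate(zip(survivor_bits, discarded_bits)):
        if b_s != b_d:
            reliab[k] = min(reliab[k], delta)
    return reliab

# Usage: start with infinite reliability, then apply each merge point.
reliab = [float("inf")] * 4
reliab = sova_reliability_update(reliab, [0, 1, 1, 0], [0, 0, 1, 1], 1.7)
# -> bits 1 and 3 are capped at 1.7; bits 0 and 2 remain unchallenged.
```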
-
Coding theory long followed the traditional concept of the cutoff rate, and although various complex coding methods kept emerging, the performance gap of several decibels to the Shannon limit remained blocked by enormous computational complexity. The emergence of the Turbo code broke through the shackles and technical barriers of these traditional concepts, cleverly opened a new path in coding theory and iterative processing techniques, and ushered in a new era of research in channel coding and related fields. The Turbo code is a concatenated code first proposed by Berrou et al. in 1993.
The basic principle is as follows: the encoder concatenates the two component encoders in parallel through the interleaver, and the two component encoders output their respective parity bits; the decoder performs iterative decoding between the two component decoders, which pass extrinsic information to each other in a feedback loop, so that the whole decoding process works much like a turbocharged engine. For this reason the scheme is figuratively called the Turbo code. The Turbo code has excellent error-correction performance, approaches the Shannon limit, and its encoding complexity is not high.
The Turbo code cleverly concatenates two simple component codes in parallel through a pseudo-random interleaver to construct a long code with pseudo-random characteristics, and realizes pseudo-random decoding by iterating many times between the two soft-in soft-out (SISO) decoders. Its performance far surpasses that of other coding methods; it has received extensive attention and development, has had a profound impact on today's coding theory and research methods, and has moved channel coding into a new stage.