
Turbo Coding and MAP Decoding - Example
Intuitive Guide to Principles of Communications
www.complextoreal.com

Turbo decoding using the MAP algorithm
Part 2 – A step-by-step example

Figure 1 – A rate 1/3 PCCC Turbo code

In this example, we will use a rate 1/3 Turbo code which has two identical convolutional encoders, EC1 and EC2. Note that since y_k^{2,s} is a deterministically reshuffled version of y_k^{1,s}, it is not transmitted. The second decoder is given y_k^{1,s}, which it reorders to get its own copy of y_k^{2,s}, needed for decoding.

The coding trellis for each encoder is given by the figure below. The blue lines show transitions in response to a 0 and the red lines in response to a 1. In the notation 1/11, the first number is the input bit and the next two are the code bits. Of these, the first is what we call the systematic bit, and as you can see, it is the same as the input bit. The second bit is the parity bit. Each encoder uses this trellis for encoding.

Copyright 2006 Charan Langton
www.complextoreal.com
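The trellis described above belongs to a four-state recursive systematic convolutional (RSC) encoder. The text does not spell out the generator polynomials, but one common choice, (7, 5) in octal, i.e. feedback 1 + D + D^2 and feedforward 1 + D^2, reproduces the parity sequence of Table 1 exactly. A minimal sketch under that assumption:

```python
def rsc_encode(bits):
    """Rate 1/2 RSC encoder, feedback 1+D+D^2 (7), feedforward 1+D^2 (5).

    Returns (systematic, parity) sequences in bipolar form (0 -> -1, 1 -> +1).
    """
    r1 = r2 = 0                        # the two shift-register stages
    systematic, parity = [], []
    for u in bits:
        a = u ^ r1 ^ r2                # feedback bit entering the register
        p = a ^ r2                     # parity bit from the feedforward polynomial
        systematic.append(2 * u - 1)   # systematic bit equals the input bit
        parity.append(2 * p - 1)
        r1, r2 = a, r1                 # shift the register
    return systematic, parity

sys_bits, par_bits = rsc_encode([1, 0, 1, 0, 1, 0, 0])
print(sys_bits)  # [1, -1, 1, -1, 1, -1, -1]
print(par_bits)  # [1, 1, -1, 1, 1, -1, -1]
```

The printed parity sequence matches the ENC1 row of Table 1, which is why the (7, 5) assumption is plausible here, though the original trellis figure is the authoritative definition.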

Figure 2 – The trellis diagram of the rate 1/2 RSC component code

For the example, we pick the information bit sequence 1, 0, 1, 0, 1, 0, 0, which in bipolar form is shown in Table 1. Encoder 1 encodes each of these bits into systematic and parity bits using the trellis in Figure 2. The bits are coded in bipolar format.

Table 1 – Coded bits through the first encoder

  k            |  1   2   3   4   5   6   7
  Systematic   |  1  -1   1  -1   1  -1  -1
  Parity, ENC1 |  1   1  -1   1   1  -1  -1

The sequence looks like this on the trellis diagram.

Figure 3 – Trellis diagram for information sequence (1, -1, 1, -1, 1, -1, -1) through the first encoder

Encoder 2 receives the same information bits, but interleaved by a given pattern. Interleaving is an important issue in turbo coding. There are several different types of interleaving patterns other than the row-column method you may already know. Here I have made up a "pseudo-random" interleaving pattern, shown in Table 2. The deinterleaving pattern puts the bits back in the original order. Simulation shows that pseudo-random interleaving works best with turbo codes in an AWGN channel.

Table 2 – Interleaving pattern from Encoder 1 to Encoder 2, and deinterleaving back to Encoder 1 from Encoder 2

  Interleaving 1-2 | Deinterleaving 2-1
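The exact permutation of Table 2 is not reproduced above, but the mechanics are simple: the interleaver reads the bits out in a permuted order, and the deinterleaver inverts that permutation. The sketch below uses a hypothetical permutation `pi` (my own choice, not necessarily the author's Table 2 pattern) that happens to map (1, -1, 1, -1, 1, -1, -1) to the sequence (1, 1, 1, -1, -1, -1, -1) that Encoder 2 codes:

```python
# pi[i] = position in the original stream that supplies output position i.
# This is a hypothetical pattern, not necessarily the one in Table 2.
pi = [2, 4, 0, 1, 3, 5, 6]

def interleave(seq):
    return [seq[j] for j in pi]

def deinterleave(seq):
    out = [None] * len(seq)
    for i, j in enumerate(pi):
        out[j] = seq[i]                      # undo the permutation
    return out

x = [1, -1, 1, -1, 1, -1, -1]
print(interleave(x))                         # [1, 1, 1, -1, -1, -1, -1]
print(deinterleave(interleave(x)) == x)      # True: round trip restores order
```

Any permutation works the same way; only the round-trip property (deinterleave of interleave is the identity) matters for the decoder exchange described later.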

After reordering the information sequence (1, -1, 1, -1, 1, -1, -1) into (1, 1, 1, -1, -1, -1, -1), Encoder 2 codes the new sequence as in Table 3.

Table 3 – Interleaved bits coded by Encoder 2

  k            |  1   2   3   4   5   6   7
  Systematic   |  1   1   1  -1  -1  -1  -1
  Parity, ENC2 |  1  -1   1  -1  -1  -1  -1

On the trellis diagram this new sequence is coded as shown in Figure 4.

Figure 4 – Trellis diagram for the interleaved sequence (1, 1, 1, -1, -1, -1, -1) through the second encoder

Note that the systematic bits from Encoder 2 are redundant and will not be transmitted. This gives us a rate 1/3 code, if you don't count the tail bits. The transmitted data is read out in columns: 1 1 1, -1 1 -1, 1 -1 1, and so on.

Table 4 – Transmitted data: one systematic bit and two parity bits per time tick

  k            |  1   2   3   4   5   6   7
  Systematic   |  1  -1   1  -1   1  -1  -1
  Parity, ENC1 |  1   1  -1   1   1  -1  -1
  Parity, ENC2 |  1  -1   1  -1  -1  -1  -1

That's all for the encoding. On the receiving end, assume that this signal goes through an AWGN channel with Es/N0 = 0.25, which is quite small. A bunch of errors pop up, and the received data, with some channel gain, is as shown in Table 5.

Table 5 – Received signal amplitudes (soft data)

  k               |  1   2   3   4   5   6   7
  Systematic bit  |  2   1   3  -2   2  -4  -5
  Parity bit ENC1 | -5   2  -1  -2   1  -2  -1
  Parity bit ENC2 |  6  -1   2  -2  -5   5  -6

These values are called soft inputs. Although the above are integers, they don't have to be; I happened to pick integers for this example.

We have a total of four errors: one in the systematic bits and three in the parity bits. Unlike Viterbi and other decoders, in the MAP algorithm the channel SNR affects decoding, and it does so by exaggerating the effect of the systematic bits. So if there is an error in a systematic bit, that error also gets exaggerated. In Table 5, I have assumed that the second systematic bit is in error, as well as the first and fourth parity bits. The term Lc is a measure of the signal SNR:

  Lc = 4 Es/N0 = 4 x 0.25 = 1.0

Multiply the received signal by Lc, which is defined above and equals 1.0 for this example. Lc indicates the condition of the channel; high is good.

Table 6 – Input signal levels multiplied by the channel value Lc

  DEC1 received data
  k            |  1   2   3   4   5   6   7
  Lc y_k^{1,s} |  2   1   3  -2   2  -4  -5
  Lc y_k^{1,p} | -5   2  -1  -2   1  -2  -1

  DEC2 received data
  k            |  1   2   3   4   5   6   7
  Lc y_k^{2,s} |  3   2   2   1  -2  -4  -5
  Lc y_k^{2,p} |  6  -1   2  -2  -5   5  -6

Decoding

For each time tick k, decoding is done by calculating the L-value of the bit. If it is positive, the decision is in favor of a 1. Calculation of the L-value, L(u_k), is quite a complex process. The main equation used to calculate it is

  L(u_k) = L^e(u_k) + Lc y_k^{1,s} + log [ Σ_{u_k=+1} α_{k-1}(s') γ_k^e(s',s) β_k(s) / Σ_{u_k=-1} α_{k-1}(s') γ_k^e(s',s) β_k(s) ]   (1)

From this computation, if L(u_k) is positive, then u_k is equal to 1.

The first term, L^e(u_k), is the a-priori value from Decoder 2. This is the L-value of the a-priori probability of the bit in question. At first, the decoder has no idea what it is; a good guess is to assume the probability is 0.5. The second term, Lc y_k^{1,s}, is computed by multiplying the systematic information by Lc, as in Table 6. This value is the channel L-value and gives an indication of the channel SNR. The third big term, with all kinds of squiggly items, is the a-posteriori part. This number is calculated for
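The Lc scaling is a one-liner. A quick sketch using the DEC1 values of Table 5 (with Lc = 1.0 here, the scaled values in Table 6 are simply the received values):

```python
Es_N0 = 0.25
Lc = 4 * Es_N0                              # channel reliability, Lc = 4 Es/N0

received_sys = [2, 1, 3, -2, 2, -4, -5]     # systematic soft inputs (Table 5)
received_par = [-5, 2, -1, -2, 1, -2, -1]   # ENC1 parity soft inputs (Table 5)

Lc_ys = [Lc * y for y in received_sys]      # Table 6, DEC1 systematic row
Lc_yp = [Lc * y for y in received_par]      # Table 6, DEC1 parity row
print(Lc)       # 1.0
print(Lc_ys[0]) # 2.0
```

With a better channel (larger Es/N0), Lc grows and the systematic observations weigh more heavily in Eq. (1), which is exactly the SNR-exaggeration effect described above.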

each trellis segment. Remember that we can only have one result for a trellis section, a 1 or a -1. The L(u_k) calculation tells us what that result is.

The big equation can be written simply as the sum of three pieces of information:

  L-apriori – our initial guess about a 1 bit; zero in the first iteration.
  L-channel – related to the channel SNR and the systematic bit; equal to Lc y_k^{1,s}.
  L^e – the extrinsic information computed in each iteration, also called the a-posteriori L-value.

Eq. (1) can now be written as the sum of these:

  L(u_k) = L_apriori + L_channel + L^e(u_k)   (2)

The L-channel value does not change from iteration to iteration, since it is given by L_channel = Lc y_k^{1,s}; neither Lc nor the systematic bit changes between iterations, so let's call it K. The only two items that change are the a-priori and a-posteriori L-values:

  L(u_k) = L_apriori + K + L_posteriori

The a-priori value goes in; it is used to compute the new a-posteriori value, which can then be used to compute L(u_k) or be passed to the next decoder. Although in this example we compute L(u_k) each time, during actual decoding this is not done. Only the a-posteriori metric is computed, and the decoders keep exchanging it, either a fixed number of times or until it converges. The L-posteriori value is also called extrinsic information.

  Iteration 1:
    L1(u_k) = 0 + K1 + L^e_{1,1}
    L2(u_k) = L^e_{1,1} + K2 + L^e_{1,2}
  Iteration 2:
    L1(u_k) = L^e_{1,2} + K1 + L^e_{2,1}
    L2(u_k) = L^e_{2,1} + K2 + L^e_{2,2}
  Iteration 3:
    L1(u_k) = L^e_{2,2} + K1 + L^e_{3,1}
    L2(u_k) = L^e_{3,1} + K2 + L^e_{3,2}

So the whole objective of the calculations is to compute the extrinsic L-values, keep an eye on them, and when they get big, compute L(u_k) and make a decision. The computation
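Eq. (2) is just an addition. As a check, the first-iteration DEC1 value at k = 1 can be rebuilt from its three pieces: a-priori 0, channel K = Lc x 2 = 2, and the extrinsic value 7.04 that DEC1 computes (these numbers appear later, in Table 18):

```python
L_apriori = 0.0     # first iteration: no hint from the other decoder yet
K = 1.0 * 2         # L-channel: Lc times the systematic value at k = 1
Le = 7.04           # extrinsic value computed by DEC1 (Table 18, iteration 1)

L = L_apriori + K + Le
print(round(L, 2))  # 9.04, the DEC1 L(u_k) entry for k = 1, iteration 1
decision = 1 if L > 0 else 0
print(decision)     # 1
```

The sign of L gives the hard decision; the magnitude is the reliability that grows over the iterations.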

of these L-values uses an algorithm called forward-backward recursion, which is of course another name for BCJR, among many others.

In Eq. (1) we have three terms in the ratio that need to be computed:

  α_{k-1}(s'), β_k(s), γ_k^e(s',s)

The first term is called the forward metric. It is computed by a forward recursion using the last term, the branch metric:

  α_k(s) = Σ_{s'} α_{k-1}(s') γ_k(s',s)

This is a recursive process: the new forward metric is the sum, over the incoming branches, of the previous forward metric times the branch metric. (The reason we call these metrics is that they are not really probabilities in a strict sense, just measures of probability.) So to compute the forward metrics, we first need to compute the branch metrics.

In real situations, if the block is 10,000 bits long, we would need to compute these metrics 10,000 times the number of states times 2. The example here has just 7 bits and 4 states, so we will do this 7 x 4 x 2 = 56 times.

A branch metric is the correlation of the received signal with its trellis values. In a nutshell, if the received values have the same sign as the coded bits, the correlation will be high. For each decoder, there are full branch metrics, which incorporate the systematic and the parity bits, and partial branch metrics, which incorporate only the parity bits.

Figure 5 – Iteration cycle of MAP turbo decoding

1. Computing full branch metrics

The full branch metric is given by the equation

  γ_k(s',s) = exp[ (1/2) c_k^1 (L^e(c_k^1) + Lc y_k^{1,s}) ] exp[ (1/2) Σ_{i=2}^{q} Lc y_k^{i,p} c_k^i ]
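For a rate 1/2 component code (q = 2) the full branch metric can be sketched directly. The two calls below use DEC1's k = 1 inputs (L^e = 0, Lc y^s = 2, Lc y^p = -5) and reproduce values that appear in Table 7:

```python
import math

def full_branch_metric(Le, Lc_ys, Lc_yp, c1, c2):
    """gamma(s',s) = exp(0.5*c1*(Le + Lc*y_s)) * exp(0.5*c2*Lc*y_p),
    where (c1, c2) are the bipolar systematic and parity bits on the branch."""
    return math.exp(0.5 * c1 * (Le + Lc_ys)) * math.exp(0.5 * c2 * Lc_yp)

print(round(full_branch_metric(0, 2, -5, -1, +1), 4))  # 0.0302
print(round(full_branch_metric(0, 2, -5, +1, -1), 4))  # 33.1155
```

Note how the metric rewards agreement: the branch whose parity bit (-1) matches the sign of the received parity (-5) gets the large value 33.1155, while the branch disagreeing on both bits gets 0.0302.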

The data required are Lc, the systematic value, the parity value and the trellis coded bit values. We compute this for each branch of the trellis, for all seven time ticks.

Example: for branch 1-1 (from state 1 to state 1) at k = 1: y_1^{1,s} = 2, y_1^{1,p} = -5, L^e = 0.

What is L^e and why is it zero? L^e is our initial L-value. At the start, we do not know whether the encountered bit was a 0 or a 1, so we assume that the probability of a 0 equals the probability of a 1. The L-value is given by

  L(u_k) = ln [ P(u_k = +1) / P(u_k = -1) ]

If 0 and 1 are equally likely, a safe bet, then the ratio is 1.0 and its log is zero. And that is what L^e is: the initial value of the extrinsic L-value. The extrinsic L-value is the key information that the decoders share. It works something like this.

Figure 6 – Feedback nature of the extrinsic value. The value produced by each decoder (e.g. L^e_{12}(u_k)) becomes the input for the next decoder.

This extrinsic L-value is what goes around from one decoder to the next. It is the a-posteriori probability at the output of one decoder and becomes the a-priori probability for the next decoder, in a feedback manner, going round and round and getting larger with each iteration. The nice thing about L-values is that a large magnitude means better reliability.

Computing the full branch metrics requires five inputs, shown in columns 1-5 below. The branch value in the last column is calculated using the equation above; at k = 1 each of the four possible code-bit pairs appears on two of the eight branches.

Table 7 – Computing full branch metrics (k = 1, DEC1)

  c_k^1 | c_k^2 | L^e | Lc y^{1,s} | Lc y^{1,p} | γ(s',s)
   -1   |  +1   |  0  |     2      |    -5      |  0.0302
   -1   |  -1   |  0  |     2      |    -5      |  4.4817
   +1   |  -1   |  0  |     2      |    -5      | 33.1155
   +1   |  +1   |  0  |     2      |    -5      |  0.2231

Table 8 – Full branch metrics for the full length of the signal, by DEC1. At k = 1 the eight branch values are 0.0302, 0.0302, 4.4817, 4.4817, 33.1155, 33.1155, 0.2231 and 0.2231.

2. Computing partial branch metrics

These values are based only on the parity bits; this is the last exponential factor in the full branch metric. We will use these to compute the total extrinsic L-values.

The equation for calculating the partial metrics is

  γ^e(s',s) = exp[ (1/2) Σ_{i=2}^{q} Lc y_k^{i,p} c_k^i ]   (3)

Note that there is nothing here that changes from iteration to iteration, so these only need to be computed once.

Table 9 – Partial branch metrics stay constant for each decoder
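Eq. (3) for a single-parity (q = 2) code is one exponential. With DEC1's k = 1 parity input Lc y^p = -5, the two possible parity bits give:

```python
import math

def partial_branch_metric(Lc_yp, c2):
    """gamma_e(s',s) = exp(0.5 * Lc * y_p * c2): only the parity bit enters."""
    return math.exp(0.5 * Lc_yp * c2)

print(round(partial_branch_metric(-5, -1), 4))  # 12.1825
print(round(partial_branch_metric(-5, +1), 4))  # 0.0821
```

Since neither Lc y^p nor the trellis bits change between iterations, these eight values per time tick really can be computed once and cached, exactly as the text says.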

Figure 7 – Partial branch metrics on the trellis

Calculation of forward metrics

Encoding always starts in state 1, so we assume that the signal can only be in state 1 at time k = 0. The initial value of the forward metric there is set to 1.0; the other three states are given the value 0.0, meaning they are not possible states. Thereafter the forward metrics are computed recursively for each state (not for branches).

The equation is

  α_k(s) = Σ_{s'} α_{k-1}(s') γ_k(s',s) / Σ_s Σ_{s'} α_{k-1}(s') γ_k(s',s)   (4)

Computation of the forward metrics is done in two steps: first we compute the numerator at time k, then we normalize by the denominator.

Figure 8 – Computing the forward metric

We will compute the forward metric at state 3, k = 1, and then at state 2, k = 2. Here is how to think about this: there are two branches converging into state 3 at time k = 1. Each of these brings its full branch metric to bear on the converging state. The computation involves multiplying each branch metric by its starting-state value and then summing over all branches converging at the state in question, which here is two branches. We do this for each state, based on the specific branches coming into it. We can think of the branch metrics as the amount of traffic these roads bring to the intersection.

We compute the forward state metric at state 3 as

  α_1(s3) = 1.0 x 0.2231 + 0.0 x 4.4817 = 0.2231

The only other non-zero forward metric at this point is at state 1:

  α_1(s1) = 1.0 x 4.4817 + 0.0 x 0.2231 = 4.4817

Before we continue on to k = 2, we need to normalize these values; this is the denominator in Eq. (4). The purpose of normalization is to control the numerical overflow which may occur because of the large number of values being multiplied together. Normalizing at each time step also gives the metrics more of the flavor of probabilities and makes them easier to interpret.

To normalize, we use the sum of the forward metrics, which is 4.4817 + 0.2231 = 4.7048. Normalizing the two metrics by this sum gives the two values in column 3 of Table 10.

Now to move forward: state 2 at k = 2 has two branches (from s3 and from s4) converging into it. Its forward metric is calculated as

  α_2(s2) = 0.0474 x 0.6065 + 0.0 x 0.0302 = 0.0288

We compute these for all four states and get the values below. The column is then summed to compute the normalizing value, which is used in Table 11 to normalize all forward metrics.

Table 10 – Computation of forward metrics

Table 11 – Normalized forward state metrics
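The k = 1 step worked above fits in a few lines; the same code produces the normalized values 0.9526 and 0.0474 used at the next time tick:

```python
gamma_11 = 4.4817    # full branch metric s1 -> s1 at k = 1 (Table 7)
gamma_13 = 0.2231    # full branch metric s1 -> s3 at k = 1

alpha0 = [1.0, 0.0, 0.0, 0.0]    # encoder starts in state s1

a_s1 = alpha0[0] * gamma_11      # 4.4817: only s1 feeds these branches
a_s3 = alpha0[0] * gamma_13      # 0.2231
total = a_s1 + a_s3              # 4.7048, the normalizing sum
alpha1 = [a_s1 / total, 0.0, a_s3 / total, 0.0]
print(round(alpha1[0], 4), round(alpha1[2], 4))  # 0.9526 0.0474
```

Dividing by the running sum at every tick is what keeps the products of many branch metrics from overflowing or underflowing over a long block.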

3. Computing backward state metrics

  β_{k-1}(s') = Σ_s β_k(s) γ_k(s',s)   (5)

We assume that the signal will end in state 1 (tail bits are added to force it to end there). The ending state is always s1, so it gets a value of 1.0, just as the initial forward state metric did. The other three states at k = 7 are given a backward value of zero. Look at how these values propagate backwards.

We will compute the backward state metrics one step back from the end. State s2 has two branches leaving it, 2-1 and 2-3, so

  β_6(s2) = 1.0 x 0.0498 + 0.0 x 1.6487 = 0.0498

  β_6(s4) = 0 (it also has two branches, but both lead to states whose backward metrics are 0, so it is zero as well)

Figure 9 – Computing backward state metrics

Compute all four states this way, starting from k = 7. Now again, we normalize, but with the same set of sums as the forward metrics, so that both are on the same footing. Divide each backward state metric after computation and normalize before proceeding to the previous time tick.
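The backward step worked above is equally small in code; the branch metrics 0.0498 and 1.6487 are the two values quoted in the text for state s2's branches:

```python
# Backward metrics at the final time: only the terminating state s1 is possible.
beta_next = {"s1": 1.0, "s2": 0.0, "s3": 0.0, "s4": 0.0}

# beta for s2 one step earlier: sum over its branches 2->1 and 2->3 of
# (backward metric of the destination state) * (branch metric).
beta_s2 = beta_next["s1"] * 0.0498 + beta_next["s3"] * 1.6487
print(beta_s2)  # 0.0498
```

It is the mirror image of the forward step: destinations replace sources, and the recursion runs from the last tick toward the first.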

Table 12 – Backward state metrics

Table 13 – Normalized backward state metrics

4. Computing extrinsic L-values

Now we compute the extrinsic L-values, which are given by the product of all three metrics:

  λ_k(s',s) = α_{k-1}(s') γ_k^e(s',s) β_k(s)

This is the multiplication of the forward metric, the backward metric and the partial branch metric for each branch. We get one value for each branch, or eight values per time tick, as shown below.

Table 14 – Computation of final branch metrics

Figure 10 – Forward and backward state metrics for the trellis

To compute the L-value, we need the ratio of these numbers for the +1 and -1 branches. For that, we add the four branch products of the +1 branches and divide by the sum of the four branch products of the -1 branches, to satisfy

  L^e(u_k) = log [ Σ_{u_k=+1} α_{k-1}(s') γ_k^e(s',s) β_k(s) / Σ_{u_k=-1} α_{k-1}(s') γ_k^e(s',s) β_k(s) ]

Take the natural log of the ratio. This is the extrinsic value output for this iteration. These values, after interleaving, go to Decoder 2 and become its a-priori probabilities.

Normally the decoder would stop at this point, because it has done its job of computing the extrinsic values. But I will compute the L-value of the bit as well, to show how the bit decisions can be made at any iteration.

The L-value of the bit is given by

  L(u_k) = L_apriori + L_channel + L^e(u_k)

At each iteration, we have all three of these, as shown in Table 15. For example, at k = 6: L^e = -3.579, so L(u_k) = 0 - 4 - 3.579 = -7.579 and the decision is 0; at k = 7: L^e = -3.682, so L(u_k) = 0 - 5 - 3.682 = -8.682, again a 0.

Table 15 – Computing L(u_k) and making a decision

Notice that we make independent decisions at each time tick, with no regard to the trellis; that makes this algorithm a bit-error-rate minimization algorithm, whereas in Viterbi decoding the decision is made at the sequence level, which makes it a sequence-error minimization algorithm.

From here the interleaved L^e values go to DEC2 as follows.

Table 16 – Transfer of a-posteriori values from DEC1 to DEC2

  k                 |   1      2      3       4       5       6       7
  Lc y_k^{2,s}      |   3      2      2       1      -2      -4      -5
  Lc y_k^{2,p}      |   6     -1      2      -2      -5       5      -6
  L^e (interleaved) | 7.035  1.032  1.009  -3.685  -2.013  -3.579  -3.682

The L^e values for this decoder are no longer 0, as they were for DEC1. They are now giving a hint as to what was transmitted. So far the relationship is not clear, but
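The log-ratio itself is one line of code. The branch products below are hypothetical numbers (not taken from the tables), chosen only to show the mechanics of summing the four +1 branch products against the four -1 branch products:

```python
import math

def extrinsic_L(plus_products, minus_products):
    """L_e = ln( sum of alpha*gamma_e*beta over the u=+1 branches
               / sum of alpha*gamma_e*beta over the u=-1 branches )."""
    return math.log(sum(plus_products) / sum(minus_products))

# Hypothetical alpha*gamma_e*beta products for one time tick:
Le = extrinsic_L([2.0, 0.5, 0.1, 0.4], [0.2, 0.05, 0.01, 0.04])
print(round(Le, 4))  # 2.3026, positive, so this tick leans toward a +1
```

A ratio above 1 (positive log) favors +1, a ratio below 1 favors -1, and the magnitude of the log is the reliability handed to the other decoder.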

as the iterations continue, these values become more stable and, hopefully, take on the same sign as the transmitted bits.

Here are all the tables giving the pertinent information for the four iterations. The code makes errors in the earlier iterations but then converges to the correct values. Only the first iteration is independent; after that, each iteration modifies the next slowly. The extrinsic values do converge, as we can see in the plot of these values for each decoder for k = 1 to k = 4.

Table 17 – Summary of decisions after 4 iterations

  DEC1 decisions
  k           |  1  2  3  4  5  6  7
  Iteration 1 |  1  0  1  0  1  0  0
  Iteration 2 |  1  0  1  0  1  0  0
  Iteration 3 |  1  0  1  0  1  0  0
  Iteration 4 |  1  0  1  0  1  0  0

  DEC2 decisions
  k           |  1  2  3  4  5  6  7
  Iteration 1 |  1  1  1  0  0  0  0
  Iteration 2 |  1  1  1  0  0  0  0
  Iteration 3 |  1  1  1  0  0  0  0
  Iteration 4 |  1  1  1  0  0  0  0

The L-values that are passed back and forth are shown in Table 18.

Table 18 – Extrinsic output and L(u_k) after four iterations

  DEC1 extrinsic output
  k | Iter 1   Iter 2   Iter 3   Iter 4
  1 |  7.04     6.25    26.63    38.36
  2 | -3.69    -7.64   -22.21   -34.01
  3 |  1.03     7.67    18.36    32.19
  4 | -2.01   -12.53   -20.62   -35.20
  5 |  1.01     5.37    19.19    36.60
  6 | -3.58   -12.61   -22.01   -37.62
  7 | -3.68   -14.94   -22.65   -34.36

  DEC1 L(u_k)
  k | Iter 1   Iter 2   Iter 3   Iter 4
  1 |  9.04    20.79    45.27    71.58
  2 | -2.69   -15.01   -39.40   -68.61
  3 |  4.03    19.28    38.37    67.81
  4 | -4.01   -20.78   -41.27   -67.56
  5 |  3.01    15.01    39.40    68.61
  6 | -7.58   -19.28   -38.37   -67.81
  7 | -8.68   -29.54   -41.28   -67.58

  DEC2 extrinsic output
  k | Iter 1   Iter 2   Iter 3   Iter 4
  1 | 12.54    16.64    31.21    43.01
  2 |  8.61    17.01    32.62    47.20
  3 |  7.64    18.21    30.01    45.62
  4 | -8.37   -18.19   -35.60   -47.31
  5 | -6.25   -18.65   -30.36   -44.19
  6 | -2.67   -12.36   -26.19   -43.60
  7 | -9.59   -13.64   -28.21   -40.01

  DEC2 L(u_k)
  k | Iter 1   Iter 2   Iter 3   Iter 4
  1 |  20.58    23.89    58.84    82.37
  2 |  11.64    26.68    52.98    81.39
  3 |  11.65    26.59    52.20    85.22
  4 | -10.06   -23.83   -55.81   -79.32
  5 | -10.26   -33.18   -52.98   -81.39
  6 | -10.25   -28.97   -52.20   -85.22
  7 | -18.28   -33.58   -55.86   -79.37

Figure 11 – Extrinsic values over 4 iterations, for DEC1 and DEC2

Charan Langton
