Information Theory and Coding


Information Theory and Coding: Examples for Entropy
Fouad Hammadi, 2019

Example 6: A source produces a stream of twenty letters (A, B, C, D, E) with probabilities P(A) = P(E), P(B) = P(D), and P(A) = 0.5 P(B) = 0.25 P(C). Find:
a. the entropy of this source;
b. the amount of information each letter conveys;
c. the amount of information the total message conveys.

Sol:
Since the five probabilities must sum to one, P(A) = P(E) = 0.1, P(B) = P(D) = 0.2, and P(C) = 0.4.

a. H = -Σ pi log2 pi
     = -(0.1 log2 0.1 + 0.2 log2 0.2 + 0.4 log2 0.4 + 0.2 log2 0.2 + 0.1 log2 0.1)
     = -(2 × 0.1 × log2 0.1 + 2 × 0.2 × log2 0.2 + 0.4 × log2 0.4)
     = -(2 × 0.1 × (-3.3222) + 2 × 0.2 × (-2.3222) + 0.4 × (-1.3222))
     = 0.66444 + 0.92888 + 0.52888 = 2.1222 bit/symbol

b. I_A = -log2 0.1 = 3.3222 bit = I_E
   I_B = -log2 0.2 = 2.3222 bit = I_D
   I_C = -log2 0.4 = 1.3222 bit

c. In twenty letters the expected counts are 2 A's, 4 B's, 8 C's, 4 D's, and 2 E's, so
   I_message = 2 × 3.3222 + 4 × 2.3222 + 8 × 1.3222 + 4 × 2.3222 + 2 × 3.3222 = 42.444 bit
   Or: I_message = (no. of letters) × Ī = 20 × 2.1222 = 42.444 bit
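The following Python sketch (mine, not part of the notes) reproduces these numbers; with unrounded logarithms the entropy is 2.1219 bit/symbol, which the notes' four-decimal logs round to 2.1222:

from math import log2

p = {'A': 0.1, 'B': 0.2, 'C': 0.4, 'D': 0.2, 'E': 0.1}

# a. Entropy H = -sum(pi * log2(pi))
H = -sum(pi * log2(pi) for pi in p.values())

# b. Information conveyed by each letter: I_i = -log2(pi)
I = {letter: -log2(pi) for letter, pi in p.items()}

# c. Total information of the 20-letter stream (expected counts 2, 4, 8, 4, 2),
#    which equals 20 * H.
I_message = 20 * H

print(f"H = {H:.4f} bit/symbol")               # H = 2.1219
print({k: round(v, 4) for k, v in I.items()})  # {'A': 3.3219, 'B': 2.3219, ...}
print(f"I_message = {I_message:.3f} bit")      # I_message = 42.439 (notes: 42.444)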

Entropy and Length of the Code

One of the key concepts in coding theory is that we want to assign fewer bits to the more likely events:

0 ≤ Entropy ≤ log2(M), and also 0 ≤ H ≤ L̄

This illustrates that the more randomness exists in the source symbols, the more bits per symbol are required to represent those symbols. On the other hand, the entropy provides the theoretical minimum for the average number of bits per symbol (the average length of the code, L̄) that could be used to encode those symbols. The closer L̄ is to the entropy, the better the coder.

Code Efficiency and Redundancy

The code efficiency is ξ = H(x) / L̄ × 100%, and the code redundancy is R = 1 - ξ. This efficiency formula is the one applied in Examples 1-3 below.
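A minimal sketch of these two definitions in Python (the function names code_efficiency and code_redundancy are mine, not from the notes):

def code_efficiency(H, L):
    # xi = H / L-bar, as a fraction; multiply by 100 for percent.
    return H / L

def code_redundancy(H, L):
    # R = 1 - xi
    return 1 - code_efficiency(H, L)

print(f"{100 * code_efficiency(2.322, 3):.1f}%")  # 77.4%, as in Example 2 below
print(f"{code_redundancy(2.322, 3):.3f}")         # 0.226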

Source Coding Techniques

1. Fixed Length Coding

In the fixed-length coding technique all symbols are assigned codewords of equal length, because the coding does not take the probabilities into account. The benefit of the fixed-length code is its ease of application (encoding and decoding are simple). A short sketch after Example 3 verifies Examples 1-3 numerically.

Example 1: Let x = {x1, x2, ..., x16}, where pi = 1/16 for all i. Find ξ, the source code efficiency.
Sol:
H(x) = log2 M = log2 16 = 4 bit/symbol (because p1 = p2 = ... = p16 = 1/M)
For a fixed length code:
L = ⌈log2 M⌉ = ⌈log2 16⌉ = ⌈4⌉ = 4 bit
ξ = H(x) / L × 100% = 4/4 × 100% = 100%

Example 2: Let x = {x1, x2, x3, x4, x5}, where pi = 1/5 for all i. Find ξ, the source code efficiency.
Sol:
H(x) = log2 M = log2 5 = 2.322 bit/symbol
For a fixed length code:
L = ⌈log2 M⌉ = ⌈log2 5⌉ = ⌈2.322⌉ = 3 bit
ξ = H(x) / L × 100% = 2.322/3 × 100% = 77.4%

Example 3: Let x = {x1, x2, ..., x12}, where pi = 1/12 for all i. Find ξ, the source code efficiency.
Sol:
H(x) = log2 M = log2 12 = 3.585 bit/symbol
For a fixed length code:
L = ⌈log2 M⌉ = ⌈log2 12⌉ = ⌈3.585⌉ = 4 bit
ξ = H(x) / L × 100% = 3.585/4 × 100% = 89.6%
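The following Python sketch (mine, not part of the notes) recomputes Examples 1-3. For an equiprobable M-symbol source, H = log2(M), and a fixed-length code needs L = ⌈log2(M)⌉ bits per symbol:

from math import ceil, log2

for M in (16, 5, 12):
    H = log2(M)         # entropy of an equiprobable M-symbol source
    L = ceil(log2(M))   # fixed codeword length in bits
    print(f"M = {M:2d}: H = {H:.3f} bit/symbol, L = {L} bit, xi = {100 * H / L:.1f}%")

# Output:
# M = 16: H = 4.000 bit/symbol, L = 4 bit, xi = 100.0%
# M =  5: H = 2.322 bit/symbol, L = 3 bit, xi = 77.4%
# M = 12: H = 3.585 bit/symbol, L = 4 bit, xi = 89.6%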

