Coding Theory Case Study

MTH 221

What is coding theory? Coding theory, sometimes called algebraic coding theory, studies error-correcting codes, which are designed to transmit information accurately across noisy channels. It works with bit strings, sequences of ones and zeros, to check the reliability of data being transmitted. These bit strings are encoded by translating them into longer strings called code words, and a set of code words is called a code. There are a number of ways to ensure that transmitted data is delivered accurately. One of them is a technique called error-detecting codes.

An error-detecting code appends a parity bit so that every code word contains an even number of ones. For example, if the message 1101 is transmitted, we encode it as 11011; a receiver then knows the original message was 1101. If instead the message were 1001, we would encode it as 10010, because the original message already contains an even number of ones. Now, if we receive the message 10101, we know there must be an error, because it has an odd number of ones, but we cannot correct the error: we do not know which bit is wrong. With error detection we can only detect, not correct. To correct errors we use error-correcting codes.
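The parity scheme above can be sketched in a few lines of Python (the function names here are my own illustration, not from the text):

```python
def encode_parity(bits):
    """Append a parity bit so the code word has an even number of ones."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """Return True if the word has an even number of ones (no error detected)."""
    return sum(word) % 2 == 0

# 1101 has three ones, so the parity bit is 1: encoded as 11011.
print(encode_parity([1, 1, 0, 1]))    # [1, 1, 0, 1, 1]
# 1001 already has an even number of ones, so the parity bit is 0.
print(encode_parity([1, 0, 0, 1]))    # [1, 0, 0, 1, 0]
# 10101 has an odd number of ones: an error is detected but cannot be located.
print(check_parity([1, 0, 1, 0, 1]))  # False
```

Note that a single parity bit only catches an odd number of flipped bits; two errors in the same word would cancel out and go undetected.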

An error-correcting code uses redundancy not only to detect an error but also to correct it. We can use repetition to detect and fix a message. If the original message is x1,x2,x3, it is encoded as x1,x2,x3,x4,x5,x6,x7,x8,x9, where the message is sent three times, so x4 and x7 are copies of x1, x5 and x8 are copies of x2, and x6 and x9 are copies of x3. To decode, we compare x1 with x4 and x7, x2 with x5 and x8, and x3 with x6 and x9, and take a majority vote: if two or three of the bits in a group are 1, we conclude that bit of the message was 1; if two or three are 0, we conclude it was 0. For example, if we receive the message 011111010, the majority vote in each group tells us the original message was 011. (Rosen, n.d.)

The Hamming distance is a measurement between two bit strings: it is the number of positions in which the two strings differ, that is, the number of individual bits that must be changed to turn one string into the other. (Rosen, n.d.) This approach was created by Richard Hamming, who lived from 1915 to 1998. He won the Turing Award from the ACM and the IEEE Hamming Medal, which was named in his honor. Hamming's codes allow computers to detect and correct errors on their own. Here is an example of how this all works:
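As a short Python sketch (my own illustration, assuming the triple-repetition scheme described above), majority-vote decoding and the Hamming distance can be written as:

```python
def encode_repetition(bits, copies=3):
    """Send the whole message three times: x1,x2,x3 -> x1,...,x9."""
    return bits * copies

def decode_repetition(word, copies=3):
    """Majority vote: compare x1 with x4 and x7, x2 with x5 and x8, and so on."""
    n = len(word) // copies
    decoded = []
    for i in range(n):
        # Count the ones among the three copies of bit i.
        ones = sum(word[i + k * n] for k in range(copies))
        decoded.append(1 if ones * 2 > copies else 0)
    return decoded

def hamming_distance(a, b):
    """Number of positions in which two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

# Received 011111010: groups (0,1,0) -> 0, (1,1,1) -> 1, (1,1,0) -> 1,
# so the original message must have been 011.
print(decode_repetition([0, 1, 1, 1, 1, 1, 0, 1, 0]))  # [0, 1, 1]
print(hamming_distance([1, 1, 0, 1], [1, 0, 0, 1]))    # 1
```

With three copies of each bit, a single flipped bit per group is always outvoted by the two correct copies, which is exactly why this code can correct (not merely detect) one error per position.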