Friday, April 10, 2009

Shannon limit progress


Shannon was the guy who came up with much of the theory of digital computers but is mostly famous for creating information theory. I found out today that he invented the world's first wearable computer to cheat at roulette in Vegas (see here). This is pretty similar to finding out Einstein invented a time machine to grift rubes in three card monte.

He also developed a motorised pogo stick and a flame-throwing trumpet. A picture of Shannon using these, or even just juggling on his unicycle, would finally prove the internet was useful.

So how have attempts to find a code that approaches the Shannon limit progressed? "The Shannon limit or Shannon capacity of a communications channel is the theoretical maximum information transfer rate of the channel, for a particular noise level."

It is really important because it controls how much information a device can transmit, which matters if you make phone calls or communicate with a satellite.
So in 1948 Shannon put a limit on how much information can be transferred over a noisy channel. Mathematicians (and engineers) devise ways of encoding information that bring communications close to this limit. As the maths progresses you would expect the codes to get closer and closer to it.
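To get a feel for what the limit says, here is a rough back-of-the-envelope calculation of the capacity formula C = B log2(1 + S/N). The bandwidth and signal-to-noise numbers are just illustrative guesses for an old telephone line, not measurements.

import math

# Shannon capacity of a noisy analogue channel: C = B * log2(1 + S/N)
bandwidth_hz = 3000              # assumed usable bandwidth of a phone line
snr_db = 30                      # assumed signal-to-noise ratio in decibels
snr_linear = 10 ** (snr_db / 10)

capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
print(f"Shannon limit: about {capacity_bps / 1000:.1f} kbit/s")  # ~29.9 kbit/s

No modem can beat that number on such a line; the coding question is how close you can get to it in practice.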

I have taken this first picture from this very good article. You can see how the codes improve over time.


In this picture you can see new codes moving closer to 0 on the y-axis, showing improvements in the coding schemes. I'm having trouble finding the world-record closeness to the Shannon limit for particular dates. When I do I'll add them.
What I have so far is:

Year   Gap to the Shannon limit   Code
1977   2.1 dB                     Reed–Solomon code (now 1–1.5 dB)
1997   0.27 dB                    Hamming code
2009   0.0045 dB                  LDPC codes
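For what it's worth, dB gaps like the ones above are measured against a Shannon limit expressed as a minimum Eb/N0 (energy per bit divided by noise density). Here is a sketch of the simplest version of that benchmark, for the unconstrained AWGN channel; the spectral efficiencies are just example values, and published records are usually quoted against a modulation-specific variant of the same idea.

import math

def ebn0_limit_db(eta):
    """Minimum Eb/N0 (in dB) for reliable communication at spectral
    efficiency eta bits/s/Hz on an AWGN channel: (2**eta - 1) / eta."""
    return 10 * math.log10((2 ** eta - 1) / eta)

for eta in (2.0, 1.0, 0.5, 0.001):
    print(f"eta = {eta:5}: Shannon limit ~ {ebn0_limit_db(eta):6.2f} dB")
# as eta -> 0 the limit tends to ln(2), the famous -1.59 dB "ultimate" limit;
# the codes in the table operate a fraction of a dB above benchmarks like these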

Sometimes codes have been invented before they were practical, but even then people tended not to know how good they were.
"LDPC codes were invented by Robert Gallager in 1962! However, LDPC codes were largely forgotten until their rediscovery by David Mackay in 1998, who not only rediscovered them but used powerful modern computers (which were not available to
Gallager) to simulate their performance and thereby demonstrate their astonishing power"
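To give a flavour of what Gallager's 1962 decoder actually does, here is a minimal sketch of his hard-decision "bit-flipping" idea. For illustration it uses the tiny Hamming(7,4) parity-check matrix as a stand-in, since a real LDPC matrix would be huge and sparse, and real decoders use soft-decision belief propagation rather than this toy version.

import numpy as np

# Each row of H is one parity check; a valid codeword c satisfies H @ c = 0 (mod 2)
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def bit_flip_decode(received, H, max_iters=10):
    """Repeatedly flip the bit involved in the most failed parity checks."""
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2            # which checks are unsatisfied?
        if not syndrome.any():
            return word                    # every check passes: done
        unsatisfied = syndrome @ H         # per bit: how many failed checks it touches
        word[np.argmax(unsatisfied)] ^= 1  # flip the most suspicious bit
    return word

codeword = np.zeros(7, dtype=int)          # all-zeros is always a codeword
received = codeword.copy()
received[2] ^= 1                           # the channel flips one bit
print(bit_flip_decode(received, H))        # recovers [0 0 0 0 0 0 0]

The point of Gallager's construction is that each check only touches a few bits, so even enormous versions of this loop stay cheap, which is why simulating the codes in 1998 and running them in today's hardware became practical.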

The importance of this progress has been summed up by suggesting that future historians will write:
"Claude Shannon formulated the notion of channel capacity in 1948 A.D. Within several decades, mathematicians and engineers had devised practical ways to communicate reliably at data rates within 1 percent of the Shannon limit...."
Coding theory is one area where we can see practical, applied progress in mathematics over the last 60 years.

1 comment:

artied said...

Greetings- love the content but where are the bees??

Anyway thought you might like this link to that greatest of men - Shannon...

http://www.youtube.com/watch?v=sBHGzRxfeJY

regards

AD