Kennie Nybo Pontoppidan

Bits, probability, and information - learning about information theory




A few weeks ago, I was taking some time to reflect on my learning plan for the past semester and had started planning for the spring semester. This is something I have done since the beginning of my career (see the links below to how I plan and execute on that), and one of the topics on my list of “Interesting topics to study” was “Shannon, entropy, and information theory.”

I have a busy work and family life, so picking up a university textbook and just digging into it is not always the most effective way for me to learn a topic. Therefore, I tend to use MOOC courses (e.g. from Coursera), especially for those “this is interesting, but not really needed in my daily working life” topics. MOOCs are great for introductory courses, but more advanced topics are typically not available there. Information theory is an example of a topic where I never found a MOOC offering.

Not until two weeks ago...

The Santa Fe Institute, an independent, nonprofit research and education center located in Santa Fe, New Mexico, has published a number of small online courses/tutorials on topics related to its research focus of complexity science, and one of them is on… information theory.

I signed up (it is free; you can donate to the program if you like) and spent my mornings over the following weeks learning about bits, Shannon entropy, channels, and how probability theory and computer science fit nicely together in this topic.
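
To give a concrete flavor of the central quantity (a small toy example of my own, not taken from the tutorial): Shannon entropy measures, in bits, how unpredictable a random outcome is. In Python, for a discrete probability distribution:

import math

def shannon_entropy(probs):
    # H(X) = -sum(p * log2(p)), measured in bits.
    # Terms with p == 0 contribute nothing, by the usual convention.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.47

The second result is what makes the topic click: a source whose outcomes are partly predictable can, on average, be described with fewer bits.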

The tutorial description says this about the course:

In this tutorial, students will follow the development of information theory from bits to modern application in computing and communication. Along the way Seth Lloyd introduces valuable topics in information theory such as mutual information, boolean logic, channel capacity, and the natural relationship between information and entropy.

Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.

I can highly recommend that the curious learner dig in. It was indeed a very lightweight introduction (on the mathematics side), but on the other hand also very interesting. I might even invest in that textbook…

Read more about (and do) the tutorial on information theory here:

Get a copy of Claude Shannon's paper (which defined this research area) here: http://math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf

Read about how I plan and execute on personal and professional learning here (four blog posts):

Read about MOOC courses on Coursera here:
