What is Reality: Part 2 Information.


So yesterday I was thinking about information, trying to define it. And I came to the most general answer I could reach: information is a lack of chaos. The chaos of organizing a party is reduced if all your guests have information on when and where to come. It makes perfect sense. So after finding the answer with impure logic, I decided to go and look at it mathematically. Let's say all information can be written as a string of 0s and 1s. Observe the block of characters below.

000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000

Does this string of 0s convey any information? Clearly not. Yet it has no chaos at all, none, or as physicists would say, an entropy of 0. By my original definition, zero chaos should have meant it was full of information; instead there is no chaos, and no information.
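If you want to see that zero in actual numbers, here is a minimal sketch in Python (my own illustration, and the string length is arbitrary) that computes Shannon entropy from symbol frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy in bits per symbol, computed from symbol frequencies."""
    n = len(s)
    return sum(-(c / n) * log2(c / n) for c in Counter(s).values())

# The all-zeros block: a single symbol, perfectly predictable, zero surprise.
all_zeros = "0" * 300   # any length of pure 0s gives the same answer
print(shannon_entropy(all_zeros))   # 0.0 (for a binary string the maximum is 1.0)
```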

Well, this is just one example; it can't prove anything. Besides, a mathematician will tell you that 0 is a misleading number to work with, something of a rebel when it comes to following rules. So let's look at a different set of numbers now.

Observe this next string of characters.

011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011011
011011011011011011011011011011011011011011011011011011011011011011011011011011011

This string has a set pattern. Patterns are often associated with information, yet the whole thing can be written as repeat(011), and that is essentially all the information it contains. If you only count symbol frequencies it still has a fairly high entropy, but its compressibility exposes a deficit of information: it shows that a large part of the string was simply redundant.
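If you want to see that redundancy for yourself, here is a rough sketch using Python's zlib as a stand-in for an ideal compressor (the exact length of the string doesn't matter):

```python
import zlib

# The repeated block from above; the length is only illustrative.
pattern = ("011" * 110).encode()

packed = zlib.compress(pattern, level=9)
print(len(pattern))   # 330 bytes of raw string
print(len(packed))    # just a handful of bytes: almost everything was redundant
```

A real-world compressor like zlib has some fixed overhead, so it won't shrink the string all the way down to repeat(011), but it gets remarkably close.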

In the final example, I'd like to look at randomness. Common conception would tell you that randomness carries no information at all. Or does it? A random string of 0s and 1s is incompressible: you can't collapse it into a smaller statement that conveys the same thing, because it shows no pattern in any conceivable respect. It is extremely high in entropy, and because of its incompressibility, it is essentially the richest in information. A random string of characters therefore conveys the most information!
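And to see incompressibility in action, here is a rough sketch along the same lines. I'm using random bytes rather than a character string of 0s and 1s, because each ASCII '0' or '1' wastes seven of its eight bits, and that waste would itself be compressible:

```python
import os
import zlib

noise = os.urandom(330)   # 330 bytes of random data
packed = zlib.compress(noise, level=9)

print(len(noise))    # 330
print(len(packed))   # about the same, or slightly more: no redundancy to remove
```

If information is what survives compression, the random string is the one that refuses to give anything up.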

So with these examples, I have proven that information is chaos.

So what do you think about the relationship between chaos and information? Leave your thoughts in the comments below. I'm Daksh Gupta, and as always, never stop asking questions.

