Entropy and conservation of information

Lecture overview

Professor Susskind introduces statistical mechanics as one of the most universal subjects in modern physics in terms of its ability to explain and predict natural phenomena.  He begins with a brief introduction to probability theory and then draws a connection between laws of motion, viewed as rules for updating the state of a system, and the probability of being in a given state.  Proper laws of physics are reversible and therefore preserve the distinctions between states - i.e. information.  In this sense, the conservation of information is more fundamental than other physical quantities such as temperature or energy.
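
The idea of a law of motion as an update rule, and of reversibility as the preservation of distinctions between states, can be illustrated with a toy discrete system.  The sketch below (not from the lecture; the state labels and update maps are made up for illustration) contrasts a reversible law, which is a permutation of the states and can be run backwards, with an irreversible law that merges two states and so destroys the information about where the system started.

```python
# Toy "laws of motion" on a three-state system, as rules for updating the state.

states = ["A", "B", "C"]

# Reversible law: A -> B -> C -> A.  This is a permutation, so distinct
# initial states always remain distinct and the rule can be inverted.
reversible = {"A": "B", "B": "C", "C": "A"}

# Irreversible law: both A and B map to C.  Two different histories become
# indistinguishable, so information about the initial state is lost.
irreversible = {"A": "C", "B": "C", "C": "C"}

def evolve(law, state, steps):
    """Apply the update rule repeatedly and return the trajectory."""
    trajectory = [state]
    for _ in range(steps):
        state = law[state]
        trajectory.append(state)
    return trajectory

for s in states:
    print(s, "->", evolve(reversible, s, 3))    # distinctions are preserved

for s in states:
    print(s, "->", evolve(irreversible, s, 3))  # distinctions (information) are lost
```

Running the reversible rule, every starting state traces out a different trajectory, so knowing the current state tells you the entire past.  Under the irreversible rule, after one step A and B can no longer be told apart, which is exactly the kind of loss of distinctions that proper laws of physics forbid.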