In reply to camalins:
> Yeap spot on John, increasing entropy doesn't necessarily imply increasing disorder. Linking entropy to disorder is one of the most common misconceptions amongst physicists.
Sorry, I missed this exchange at the time it was posted a week or so ago.
I'm someone with a background in physics who would really, really struggle to understand what entropy means if it's not ultimately linked to disorder. My recollection is that Boltzmann introduced his definition of entropy when he developed statistical mechanics. Crudely speaking, ordered states are much less likely than disordered states, so when you calculate the ratio of the number of ordered states to the total possible states (of some system) you get a small number: take the log of this and you get a negative number that is large in magnitude. The ratio of disordered states to total possible states is just a bit less than unity; take the log of that and you get a negative number, but one still close to zero.
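To make that concrete, here's a toy example (my own illustration, not anything from the thread): take 100 coins, call "all heads" the ordered macrostate and "exactly 50 heads" the disordered one, and compare the log of each macrostate's share of the 2^100 total microstates.

```python
import math

# Toy model: N coins, each heads or tails, 2**N microstates in total.
N = 100
total = 2 ** N

# "Ordered" macrostate: all heads -> exactly 1 microstate.
ordered = 1
# "Disordered" macrostate: exactly N/2 heads -> C(N, N/2) microstates.
disordered = math.comb(N, N // 2)

# Log of (microstates in macrostate / total microstates):
log_ratio_ordered = math.log(ordered / total)        # large negative
log_ratio_disordered = math.log(disordered / total)  # close to zero

print(f"ordered:    {log_ratio_ordered:.1f}")   # about -69.3
print(f"disordered: {log_ratio_disordered:.1f}")  # about -2.5
```

So the ordered macrostate's log sits at about -69, while the disordered one's sits near zero, which is exactly the asymmetry described above.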
This is the original definition (I think), but alternative definitions of entropy in terms of macro quantities (pressure, temperature, etc.) can be derived that are more useful in practice to people like chemists, engineers and astrophysicists, who don't want to do statistical mechanical calculations all the time. But these are all ultimately related to the original definition of entropy in terms of statistical states.
If I'm wrong, it would be good of you to quote an example of a change to a system where entropy increases (as it must) but disorder does not.
By the way, did anyone see this Sunday's programme where Brian Cox tried to get Jonathan Ross to do arithmetic on a blackboard? Painful.