De-obfuscation of the term Entropy and its relevance to computing
1.4 Entropy is Confused
The real problem is that while entropy applies to any system in which heat is transferred, it cannot be defined in those terms alone. This is what brings the meaning of entropy into the world of computation, for the following reason:
From classical thermodynamics the units of entropy are joules/kelvin,
which is energy/temperature = E/T. But temperature is, up to a constant, the energy per molecule, T = E/N,
where N is the number of molecules. So we have:
Classical Entropy = E/T = E/(E/N) = N (just a number; the energy term cancels!)
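The cancellation above is easy to check numerically. In the toy sketch below, the values of E and N are arbitrary (my assumptions, not from the text), and "temperature" is deliberately expressed as energy per molecule in joules so the unit cancellation is explicit:

```python
# Toy illustration of the cancellation in Classical Entropy = E/T.
# E and N are arbitrary example values; T is taken here as energy per
# molecule (E/N), so the joules cancel and only the pure number N survives.

E = 4.2e-19 * 6.0e23   # total energy in joules (arbitrary)
N = 6.0e23             # number of molecules (arbitrary)

T_as_energy = E / N    # "temperature" expressed as energy per molecule
entropy = E / T_as_energy

print(entropy)         # equals N: the energy units cancel, leaving a number
```

Whatever values are chosen for E and N, the result is always exactly N, which is the point of the dimensional argument.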
The result is just as Ludwig Boltzmann discovered: entropy fundamentally has no units and is just a number. But was not Boltzmann's formula S = k·ln(W), where k has units of joules/kelvin?
Yes, but as Richard Feynman pointed out, k is only an arbitrary constant, as is the logarithm base e, both chosen so that Boltzmann's general equation reproduces the classical Clausius heat equation.
So we conclude that Entropy = log(W) to any base we choose. And since W is the number of microstates in a macrostate chosen by an observer, we conclude with James Jeans that entropy is entirely subjective. This means there is effectively an infinite number of entropies for any system we choose. Hence the problem entropy has presented to a science that attempts to 'naturalise' all its terms, and the confusion generated by trying to enforce that philosophy on it.
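The "any base we choose" claim can be verified directly: entropies computed in different logarithm bases differ only by a constant factor, so the choice of base (nats for base e, bits for base 2) is purely a convention. A minimal sketch, with W chosen arbitrarily for illustration:

```python
import math

W = 1024  # number of microstates in some chosen macrostate (arbitrary)

S_nats = math.log(W)    # base e: entropy in "nats" (Boltzmann's convention, up to k)
S_bits = math.log2(W)   # base 2: entropy in bits (Shannon's convention)

# The two differ only by the constant factor ln(2); no physics or
# information changes with the base, which is why it is a free choice.
print(S_bits)                                  # 10.0 for W = 1024
print(math.isclose(S_nats / S_bits, math.log(2)))  # True
```

Multiplying the base-e value by Boltzmann's constant k merely rescales this dimensionless number into joules/kelvin, which is the sense in which k is "arbitrary" in the passage above.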