From the formula he discovered for calculating the entropy of a black hole, Stephen Hawking concluded that quantum gravity, in his own words, “may exhibit the property called holography”: “information about quantum states in a region of spacetime can be encoded in some way on the boundary of that region, which has two fewer dimensions.” No need to panic; what he means is that the second law of thermodynamics somehow still holds, despite the fact that the mere existence of black holes seems to prove otherwise. That is, the information apparently swallowed by black holes does not actually evaporate or disappear but remains engraved on the event horizon, much as with holographic images, where the full image is encoded in its entirety in each fragment of the plate. Certainly this is speculation, since quantum gravity is for now only a theoretical framework, although it has eminent and renowned advocates such as Professor Michio Kaku.
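For the curious, the black-hole entropy formula alluded to above is the Bekenstein–Hawking relation. Its striking feature, and the seed of the holographic intuition, is that the entropy scales with the *area* A of the event horizon, not with the volume it encloses:

```latex
S_{BH} = \frac{k_B \, c^3 \, A}{4 \hbar G}
```

Here k_B is Boltzmann's constant, c the speed of light, ħ the reduced Planck constant, and G Newton's gravitational constant.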
Entropy measures the disorder within a system, and although it is an easily defined thing, it belongs to that class of concepts so abstract they seem completely useless. It is, however, a very practical idea that is not limited to the world of physics; in fact it is useful in many aspects of life, as long as we apply it to a closed system. For example, one could say that, unfortunately, too many people still prefer low-entropy political systems without realizing it. The truth is that if you told someone he has great mental entropy, he might even feel grateful; it really is a word that sounds rather good. Some use the term “chaos” instead of “disorder”, and although they are synonyms, there is a risk of confusion with Chaos Theory, which is another matter. I am not a fan of disorder, but I admit that strict order gives me chills; I do not like rigidity.
The idea of quantifying disorder is attributed to a physicist named Clausius, although it seems he was not the first: according to cosmologist Sean Carroll, the Latin author Lucretius, around the year 70 before the current era, speculated about the nature of entropy, and I would bet the Pythagoreans had already outlined the idea. In any case the merit goes, long afterwards, to Ludwig Boltzmann, who was able to give the idea a mathematical form.
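Boltzmann's mathematical form is his celebrated entropy formula, which relates the entropy S of a macroscopic state to the number W of microscopic arrangements compatible with it:

```latex
S = k_B \ln W
```

The more microstates W that look the same from the outside, the higher the entropy; a “tidy” configuration, realizable in only a few ways, has low entropy.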
Low entropy defines a highly ordered system; or, what amounts to the same thing, a system has a high degree of entropy when there is no order, that is, when the randomness of its component parts is very high. As you can see, the meanings of “few”, “very”, “low” or “great” start to stumble, because we are in the land of “it depends” and “perhaps”, quasi-esoteric things depending on how you look at them. The truth is that since the last century, specialists, faced with the impossibility of exact measurement, have been compelled to work with mere assumptions, that is, with probabilities.
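Working with probabilities is less esoteric than it sounds. A minimal sketch (the function and the toy distributions below are our own, purely for illustration) shows how entropy, in the Gibbs/Shannon sense, scores a probability distribution: a certain outcome scores zero, and maximum randomness scores highest.

```python
import math

def entropy(probs):
    """Gibbs/Shannon entropy of a probability distribution, in bits.

    Illustrative sketch: sums -p * log2(p) over the outcomes,
    skipping zero-probability states (their contribution vanishes).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A perfectly "ordered" system: one state is certain.
ordered = [1.0, 0.0, 0.0, 0.0]

# A maximally "disordered" system: all four states equally likely.
disordered = [0.25, 0.25, 0.25, 0.25]

print(entropy(ordered))     # 0.0 bits: no uncertainty at all
print(entropy(disordered))  # 2.0 bits: maximum randomness over 4 states
```

Anything in between, say [0.7, 0.1, 0.1, 0.1], lands between those two extremes, which is exactly the graded “low versus high” scale the paragraph above gestures at.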
The fundamental point in all this is that closed systems do not really exist; there is always some degree of interaction, a general dispersal as a function of time. Take the typical example of the glass on the table that falls and crashes to the ground, breaking into pieces. The glass on the table has low entropy and the broken one has high entropy, but the same glass on a table in a room where there are children has a higher expected entropy than if there simply are no children around. This is what is usually called a forecast.