Entropy


Stephen Hawking, following the formula he discovered for calculating the entropy of a black hole, concluded that quantum gravity, in his own words, "may exhibit the property called holography" and that "information about quantum states in a region of spacetime can be encoded in some way on the boundary of that region, which has two dimensions less". No need to panic: what he means is that the second law of thermodynamics somehow still holds, despite the fact that the mere existence of black holes seems to prove otherwise. That is, the information apparently swallowed by black holes does not evaporate or disappear but remains engraved on the event horizon, much as in a holographic image, where the complete picture is encoded in every fragment that composes it. Certainly this is still speculation, since quantum gravity remains an unfinished theory, although it has eminent and renowned advocates such as Professor Michio Kaku.
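To make Hawking's formula a little more tangible, here is a minimal sketch in Python of the Bekenstein-Hawking entropy, S = k·c³·A / (4·G·ħ), where A is the area of the event horizon; the constant values and the function name are my own choices for illustration, not anything taken from Hawking's papers. The suggestive detail is that the entropy grows with the area of the horizon, not with the volume it encloses, which is precisely what hints at the holographic idea.

```python
import math

# Physical constants (SI units)
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
k_B  = 1.381e-23   # Boltzmann constant, J/K

def bekenstein_hawking_entropy(mass_kg: float) -> float:
    """Entropy of a Schwarzschild black hole: S = k_B * c^3 * A / (4 * G * hbar),
    where A is the area of the event horizon."""
    r_s = 2 * G * mass_kg / c**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2       # horizon area
    return k_B * c**3 * area / (4 * G * hbar)

solar_mass = 1.989e30  # kg
print(f"S for a solar-mass black hole: {bekenstein_hawking_entropy(solar_mass):.2e} J/K")
```

For a black hole of one solar mass this comes out at roughly 10⁵⁴ joules per kelvin, an absurdly large number for such a small horizon.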

The researcher Rajesh Rao, in order to determine whether the as-yet undeciphered Indus Valley script is really a language, a cluster of meaningless signs, or perhaps some sort of heraldic symbols, analysed the entropy of the available texts. The outcome of that effort is that it is indeed a language, a deduction that may seem obvious at first glance but, reached this way, is less intuitive and certainly more scientific.
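Rao's actual analysis compared conditional entropies of sign pairs in the Indus corpus with those of known languages and of random or rigidly ordered sequences; what follows is only a toy illustration of the kind of measurement involved, with a throwaway English sentence standing in for the corpus and with function names of my own invention.

```python
from collections import Counter
import math

def shannon_entropy(sequence) -> float:
    """Shannon entropy (bits per symbol) of a sequence of signs."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def conditional_entropy(sequence) -> float:
    """Entropy of the next sign given the previous one (bigram model)."""
    pairs = list(zip(sequence, sequence[1:]))
    pair_counts = Counter(pairs)
    first_counts = Counter(a for a, _ in pairs)
    total = len(pairs)
    h = 0.0
    for (a, b), n in pair_counts.items():
        p_ab = n / total                  # probability of the pair (a, b)
        p_b_given_a = n / first_counts[a] # probability of b given a
        h -= p_ab * math.log2(p_b_given_a)
    return h

text = "the quick brown fox jumps over the lazy dog " * 3
print(f"entropy per sign: {shannon_entropy(text):.2f} bits")
print(f"conditional entropy: {conditional_entropy(text):.2f} bits")
```

Natural languages tend to land in an intermediate band, less predictable than a rigid code and more predictable than random noise, and the Indus texts, according to Rao, fall in that band.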

Entropy is used to measure the disorder within any system, and although it is easily defined, it belongs to that class of concepts so abstract that they seem completely useless. However, it is a very practical idea that is not limited to the world of physics; in fact it is useful in many aspects of life, provided we treat what we are looking at as a closed system. For example, one can say that, unfortunately, too many people still prefer low-entropy political systems without realising it. The truth is that if you tell someone he has great mental entropy, he might even feel grateful; it really is a word that sounds good. Some use the term "chaos" instead of "disorder", and although they are synonyms, there is a risk of confusion with Chaos Theory, which is another matter. I am not a fan of disorder, but I admit that strict order gives me chills; I do not like rigidity.

The idea of quantifying disorder is attributed to a physicist called Clausius, although it seems he was not the first: according to the cosmologist Sean Carroll, the Latin author Lucretius, around the year 70 before the current era, speculated about the nature of entropy, and I would bet the Pythagoreans had already outlined the idea. In any case the merit goes, a long time after, to Ludwig Boltzmann, who was able to give the idea a mathematical form.
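For the record, the mathematical form Boltzmann arrived at is remarkably compact, S = k · log W, where W counts the number of microscopic configurations compatible with the macroscopic state we observe and k is the constant that now bears his name; the formula is engraved on his tombstone in Vienna.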

Low entropy describes a highly ordered, uniform system; or, to put it the other way round, a system has a high degree of entropy when there is no order, that is, when the randomness of its component parts is very high. As you can see, the meanings of few, very, low or great start to stumble here, because we are talking in terms of "depends" and "perhaps", quasi-esoteric things depending on how you look at them. The truth is that specialists, since the last century, faced with the impossibility of providing exact measurements, have been compelled to work with mere assumptions, that is, with probabilities.

One way to look at it is in statistical terms: a configuration that is statistically unlikely, rare, has low entropy, whereas what is usual, what is very likely, has high entropy, simply because there are vastly more ways of being disordered than of being ordered. This is interesting because, in a way, it pins down the concept of normality, something very difficult to define at times.
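A toy way of seeing this probabilistic reading, with coin flips standing in for molecules (the setup and numbers are mine, purely for illustration): the macrostate "half heads, half tails" can be produced by an astronomical number of microscopic arrangements, so it is both very likely and high in entropy, whereas "all tails" corresponds to a single arrangement, is extraordinarily rare, and has the lowest possible entropy.

```python
from math import comb, log

k_B = 1.381e-23  # Boltzmann constant, J/K

def microstates(n_coins: int, n_heads: int) -> int:
    """Number of microscopic arrangements (microstates) producing the
    macrostate 'exactly n_heads heads out of n_coins coins'."""
    return comb(n_coins, n_heads)

def boltzmann_entropy(W: int) -> float:
    """Boltzmann's formula S = k * log W."""
    return k_B * log(W)

n = 100
ordered = microstates(n, 0)    # all tails: a single arrangement
typical = microstates(n, 50)   # half and half: around 1e29 arrangements

print(f"all tails : W = {ordered}, S = {boltzmann_entropy(ordered):.2e} J/K, "
      f"probability {ordered / 2**n:.2e}")
print(f"half heads: W = {typical:.2e}, S = {boltzmann_entropy(typical):.2e} J/K, "
      f"probability {typical / 2**n:.2e}")
```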
On the other hand, there is the absurd tendency to think that low or zero entropy is a good thing, to the extent that in our minds we readily associate "order" with "virtue"; in fact, saying of someone that "he has a messy life" is a form of reproach. It is a dangerous fallacy, like betting in the political arena on dictatorship: it is a fact that it has a lower entropy than any democracy, but a nuclear holocaust has an even lower entropy and is obviously not at all desirable.

The fundamental point in all this is that closed systems do not exist; there is always some degree of interaction or contact, a general scattering as a function of time. Take the typical example of the glass on the table that falls and shatters on the floor, breaking into pieces. The glass on the table has low entropy and the broken one has high entropy, but the same glass on a table in a room where there are children has a higher entropy than if there are simply no children around. This is what is usually called a prognosis.

Entropy actually works with time, both the atmospheric kind and the kind that kills us, and it provides data for obtaining predictions with a definite degree of probability. But while this tool allows us to speculate discreetly about the future, the most surprising thing is that, by definition, the past must be immutable or must simply have disappeared, and both states have zero entropy; in this sense, however, there seems to be no reason for pessimism.
We know that light takes a considerable number of years to traverse the vast intergalactic distances. If we accept this, then a hypothetical inhabitant of a possible planet in a solar system of the Andromeda galaxy could, right now, be receiving in his alien telescope photons that our Sun emitted more than two million years ago, the same photons that illuminated our planet around that time; and if his optical technology allowed it, he would probably contemplate an Earth without dinosaurs, full of giant mammals and with no modern man yet. Somehow, the past still exists. It's not science fiction.
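The arithmetic behind that thought experiment is trivial but worth spelling out; the distance figure below is only the commonly quoted rough value of about 2.5 million light years to Andromeda, not a precise measurement.

```python
c = 2.998e8                          # speed of light, m/s
seconds_per_year = 365.25 * 24 * 3600
andromeda_distance_m = 2.4e22        # ~2.5 million light years, in metres (rough figure)

# How far back in time an observer in Andromeda sees our planet
lookback_years = andromeda_distance_m / c / seconds_per_year
print(f"An observer in Andromeda today sees the Earth of about {lookback_years:.1e} years ago")
```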