- Entropy is about the number of microstates that can form a macrostate. In terms of information: a microstate is either present or not, 1 or 0. Information theory has its own entropy measure: Shannon entropy. A basic explanation of entropy by Luis Serrano can be found on Medium.
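As a minimal sketch (my own Python illustration, not from the source), Shannon entropy measures the average number of bits per symbol implied by the symbol frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy H = -sum(p * log2(p)) over the symbol frequencies."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin carries 1 bit per toss; a certain outcome carries 0 bits.
print(shannon_entropy("HTHTHTHT"))  # 1.0
print(shannon_entropy("HHHHHHHH"))  # 0.0
```

The fewer ways there are to be surprised by the next symbol, the lower the entropy, which mirrors the microstate count: a "special" configuration has few ways of occurring.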
- What does Rovelli say about entropy? In his widely acclaimed book The Order of Time, theoretical physicist Carlo Rovelli argues that there may be no objectively different microstates at all. Perhaps all microstates are equal, and any supposed difference is only a subjective experience of the observer. In that case, one cannot speak of objectively higher or lower entropy, only of subjective entropy. He develops the idea as follows:
Take a pack of twelve cards, six red and six black. Arrange it so that the red cards are all at the front. Shuffle the pack a little and then look for the black cards that have ended up among the red ones at the front. Before shuffling, there are none; after, some. This is a basic example of the growth of entropy. At the start of the game, the number of black cards among the red in the first half of the pack is zero (the entropy is low) because it has started in a special configuration.
But now let’s play a different game. First, shuffle the pack in a random way, then look at the first six cards and commit them to memory. Shuffle a little and look to see which other cards have ended up among the first six. At the start, there were none, then their number grew, as it did in the previous example, together with the entropy. But there is a crucial difference between this example and the previous one: at the beginning of this one, the cards were in a random configuration. It was you who declared them to be particular, by taking note of which cards were in the front of the pack at the beginning of the game.
The same may be true for the entropy of the universe: perhaps it was in no particular configuration. Perhaps we are the ones who belong to a particular physical system with respect to which its state can be particular.
This explanation no longer revolves around 1 or 0 (a configuration/microstate that either exists or doesn’t). In this case, all states are considered equal. However, this version of entropy still pertains to discrete values. The cards, whether red or black, are discrete.
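Rovelli's two card games can be simulated directly (a hypothetical illustration in Python; the `light_shuffle` helper and its swap count are my own assumptions for what "shuffle a little" means):

```python
import random

def light_shuffle(deck, swaps=3):
    """A 'little' shuffle: a few random transpositions, not a full reshuffle."""
    deck = deck[:]
    for _ in range(swaps):
        i, j = random.randrange(len(deck)), random.randrange(len(deck))
        deck[i], deck[j] = deck[j], deck[i]
    return deck

# Game 1: special initial configuration -- six red cards in front, six black behind.
deck = ["R"] * 6 + ["B"] * 6
before = deck[:6].count("B")            # 0 black cards up front: low entropy
after = light_shuffle(deck)[:6].count("B")

# Game 2: random initial configuration; the observer singles out the front six.
deck = list(range(12))                  # twelve distinguishable cards
random.shuffle(deck)
marked = set(deck[:6])                  # commit the front six to memory
shuffled = light_shuffle(deck)
intruders = sum(1 for c in shuffled[:6] if c not in marked)

# Both counts start at zero and tend to grow with shuffling; only in game 1
# was the starting order special in itself. In game 2 the observer made it so.
```

In both games the "intruder" count starts at zero and tends to grow, yet in the second game nothing about the initial deck was special until the observer took note of it.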
- Superentropy is not about 1 or 0, nor about discrete values. It concerns probability distributions: shared information in the form of entanglements and superpositions. There is also a difference in complexity, since it involves not only qubits but also qudits. Superentropy does not align with the concept of phase space either. While phase space does involve combinations of properties, i.e., superpositions, it also assumes an isolated space in which everything is quantized. The latter fits an interpretation in which the collapse of a superposition creates a value/state; note that this is partly a macroscopic perspective. A pure relational interpretation does not involve isolation. It acknowledges a universe interconnected through overlap among neighbors and neighbors of neighbors. In this view, a collapse is one superposition transitioning into the next. Superentropy thus guides developments towards the most, or most fitting, options. Because superentropy applies to a system whose values/states are in superposition, even ‘degrees of freedom’ at the quantum level are mutually interchangeable and, at this level, essentially not independent parameters. Superentropy is incalculable; in that sense, it is comparable to Heisenberg’s uncertainty principle.
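Superentropy itself is said to be incalculable, so no code can compute it. As a calculable point of contrast for "shared information in the form of entanglements", the sketch below (plain NumPy; my own illustration, not part of the source) computes the standard von Neumann entanglement entropy of one qubit of a Bell pair:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # treat 0 * log 0 as 0
    return float(-np.sum(eigvals * np.log2(eigvals)))

# Bell state (|00> + |11>) / sqrt(2): maximally entangled two-qubit state.
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())                 # full two-qubit density matrix

# Partial trace over the second qubit -> reduced state of the first qubit.
rho4 = rho.reshape(2, 2, 2, 2)
rho_A = np.einsum("ijkj->ik", rho4)

print(von_neumann_entropy(rho))    # ~0.0: the pair as a whole is a pure state
print(von_neumann_entropy(rho_A))  # ~1.0: one qubit alone is maximally mixed
```

The pair as a whole carries zero entropy while each half alone carries a full bit: one conventional way in which "shared" information between entangled parts shows up quantitatively.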
The aforementioned entropy descriptions all center on the probability of transitioning to the next macrostate.