Stipe tries to use a dictionary for scientific terms:
entropy (noun):
1. a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.
"the second law of thermodynamics says that entropy always increases with time"
2. lack of order or predictability; gradual decline into disorder.
Entropy is not “disorder”
Entropy change measures the dispersal of energy at a specific temperature, i.e. ΔS = q_rev/T
* Energy dispersal; energy becoming spread out. In simple physico-chemical processes such as ideal gas expansion into a vacuum, "spread out" describes the literal movement of energetic molecules throughout a greater volume than they occupied before the process. The final result is that their initial, comparatively localized, motional energy has become more dispersed in that final greater volume. Such a spontaneous volume change is fundamentally related to macroscopic entropy change through the reversible work (= −q) required to compress the gas back to its initial volume, RT ln(V2/V1), giving ΔS = R ln(V2/V1). On a molecular thermodynamic basis, gas expansion into a vacuum is described in terms of microstates by the Boltzmann equation:
ΔS = kB ln(W2/W1)
   = kB ln[(V2/V1)^N]
   = kB N ln(V2/V1)
   = R ln(V2/V1).
Thus, the spontaneous dispersion of molecules' energy in three-dimensional space is related, through molecular thermodynamics, to a larger number of accessible microstates, and is measured by the change in entropy.
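The derivation above can be checked numerically: the macroscopic route (reversible recompression work) and the microscopic route (counting microstates) give the same ΔS because kB·NA = R. A minimal sketch for 1 mol of ideal gas, assuming an illustrative doubling of volume:

```python
import math

# Entropy change for 1 mol of ideal gas expanding freely from V1 to V2.
# Macroscopic route:  dS = R ln(V2/V1), from the reversible work of recompression.
# Microscopic route:  dS = kB * N * ln(V2/V1), since W2/W1 = (V2/V1)^N.
# The specific volumes below are illustrative assumptions, not from the source.
R  = 8.314          # J/(mol*K), gas constant
kB = 1.380649e-23   # J/K, Boltzmann constant
NA = 6.02214076e23  # 1/mol, Avogadro's number (N = NA for 1 mol)

V1, V2 = 1.0, 2.0   # arbitrary doubling of volume

dS_macro = R * math.log(V2 / V1)
dS_micro = kB * NA * math.log(V2 / V1)

print(f"macroscopic: {dS_macro:.3f} J/(mol*K)")
print(f"microscopic: {dS_micro:.3f} J/(mol*K)")
```

Both routes print about 5.76 J/(mol·K), confirming that the Boltzmann microstate count reproduces the classical result.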
IMPORTANT NOTE: The initial statement above (about the movement of molecules into a larger volume as solely due to their motional energy) is only valid as a description of entropy change for high school students or non-scientists when there will not be any further discussion of entropy. Such a statement "smuggles in entropy change," wrote Norman C. Craig. I interpret this as 'smuggling' because thermodynamic entropy change is not simply a matter of random molecular movement, but consists of two factors, not one. Entropy change is certainly enabled in chemistry by the motional energy of molecules (which can be increased by bond energy change in chemical reactions), but thermodynamic entropy is only actualized if the process itself (expansion, heating, mixing) makes accessible a larger number of microstates, a maximal Boltzmann probability at the specific temperature. [Information 'entropy' has only the latter factor of probability (as does the 'sigma entropy' of physics, σ = S/kB). This clearly distinguishes both from thermodynamic entropy.]
http://entropysite.oxy.edu/entropy_isnot_disorder.html
You're still mind-numbingly stupid.
One of us is. Have you figured out yet whether a drop of water or an ice crystal has more entropy? When you do, show us your numbers, Stipe.
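For what it's worth, the numbers asked for here come straight from standard calorimetric data: melting ice at its melting point absorbs the enthalpy of fusion reversibly, so ΔS = ΔH_fus/T_m, which is positive. A short sketch using textbook values for 1 mol of H2O:

```python
# Entropy of fusion of water at its normal melting point: dS = dH_fus / T_m.
# Standard textbook data for 1 mol of H2O at 1 atm.
dH_fus = 6010.0   # J/mol, enthalpy of fusion of ice
T_m = 273.15      # K, melting point

dS_fus = dH_fus / T_m
print(f"dS_fus = {dS_fus:.1f} J/(mol*K)")
```

The result is about +22 J/(mol·K): liquid water has more entropy than the ice crystal, even though the crystal looks more "ordered."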
Irrelevant, of course. The law is that entropy increases.
As you learned, my daffodils are demonstrating a reduction in entropy at this moment. So do hurricanes, ocean currents, reproducing bacteria, and many, many other things. All of this is a dense mystery to you, because you don't know what "entropy" is.
Therefore, if you leave either of those things alone, they will tend toward disorder.
I left my daffodils alone. And they are exhibiting an increase in order. If reality won't match your beliefs, it's time for you to make an accommodation.
We know you hate reality.
Barbarian explains:
All predictions should be testable, meaning it should be possible to design an experiment that would verify or invalidate the prediction.
Sounds like the practice of discarding ideas when they are falsified trumps a prediction.
Yep. Predictions are all that matter. If they're good ones, and are verified, then the hypothesis is a theory. If not, the hypothesis is scrapped and a new one is tried. You're starting to catch on, Stipe.
We ask you to describe how shining the sun on an organism leads to it gaining genetic information
As you learned, all new mutations increase information in a population. Would you like to see the numbers for that, Stipe?
Or are you confused how sunlight provides the energy for organisms to live and eventually to produce mutations?
and you either dissemble — "what is information?"
If you understood what "information" is, you'd have already figured out your issue.
You bring up other examples of local decreases in entropy
You just brought up mutations, which are another example. You're very confused about this issue.
but we can explain the mechanics of flowering or storms without saying "they flower" or "storms happen."
Just as we can explain the mechanics of evolution without saying "they evolve." The difference is, flowers don't scare you.
The challenge lies before you: Explain the process without using your assumptions as evidence.
As you see, mutations are no different than other natural phenomena. And as you already learned, all new mutations in a population increase information. Would you like me to show you again?