The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.
> The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.

Replication
Entropy deals with separate concepts as it is applied in different fields.
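The point that "entropy" names different quantities in different fields can be made concrete. Below is a minimal sketch in Python (function names and example values are my own, not from the thread): Shannon entropy measures uncertainty in a symbol stream and is dimensionless (bits), while Boltzmann's thermodynamic entropy counts physical microstates and carries units of joules per kelvin.

```python
import math

def shannon_entropy_bits(text):
    """Shannon (information) entropy of a string, in bits per symbol:
    H = -sum(p_i * log2(p_i)) over the symbol frequencies p_i."""
    n = len(text)
    probs = [text.count(c) / n for c in set(text)]
    return -sum(p * math.log2(p) for p in probs)

def boltzmann_entropy(omega):
    """Thermodynamic (statistical-mechanical) entropy, in J/K:
    S = k_B * ln(Omega), where Omega is the number of microstates."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
    return k_B * math.log(omega)

print(shannon_entropy_bits("AAAA"))  # 0.0 bits: a constant message carries no uncertainty
print(shannon_entropy_bits("ABAB"))  # 1.0 bit per symbol: two equally likely symbols
print(boltzmann_entropy(1e23))       # a physical entropy, in J/K
```

The two functions take entirely different inputs (a message versus a microstate count) and return quantities with different units, which is one way of seeing that the fields use distinct concepts that happen to share a name.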
> The challenge to evolution is that there is no known means by which sunlight, or any energy, can be turned into information without intelligent guidance.

If God chose to, could He make a simple modification to DNA that adds information to it?
> Outside what the Styer paper addresses.

Then you've conceded all of Johnny's points for him. You also agree with my instant response to LoL's original thread. Styer has not addressed the full and correct challenge to evolution from entropy.
> Replication

I see. And would you mind sharing how it is that biological evolution ignores the trends imposed on everything else by entropy?
> Stripe- Do you accept that there is a thing sometimes called "micro-evolution"?

I'll not use that term. Far too confusing. Populations and features follow trends that change over time. Those changes adhere to the principles of entropy in that a new feature always comes at a net cost to the population.
> If God chose to, could He make a simple modification to DNA that adds information to it?

Yes.
> I'll not use that term. Far too confusing. Populations and features follow trends that change over time. Those changes adhere to the principles of entropy in that a new feature always comes at a net cost to the population.

Ah. Can you give an example of how this works?

Sure.
> Then you've conceded all of Johnny's points for him. You also agree with my instant response to LoL's original thread. Styer has not addressed the full and correct challenge to evolution from entropy.

Just so we are not crossing paths with semantics, when you say that “Styer has not addressed the full and correct challenge to evolution from entropy”, I am going to presume you must be including information entropy, since no one I’ve seen is even pretending to counter him on thermodynamic entropy.
Sure.
Ever been to Fiji?
> Just so we are not crossing paths with semantics, when you say that “Styer has not addressed the full and correct challenge to evolution from entropy”, I am going to presume you must be including information entropy, since no one I’ve seen is even pretending to counter him on thermodynamic entropy.

The challenge is from entropy. Information and thermodynamics are two fields that utilise this observed trait that can be applied to all scientific fields.
> But since, as has been shown several times, Styer made it explicitly clear that he was addressing thermodynamic entropy, and only thermodynamic entropy, then you are correct that he has not covered the full range. He never intended to.

Do you think he would be interested in addressing the challenge as it now stands?
> As to my upending Johnny, remember Johnny is the one-on-one participant. I am just on the sidelines, and what I say is not what decides the outcome.

:chuckle: I do tend to lump you guys together a bit, don't I. Apologies. That was not my intent. Just overly strong emphasis on the point I wanted to make...
> No, I haven't. So please describe the changes in the population and what the cost was.

I think the changes wrought in Fijian natives are obvious if one assumes they originated in the Middle East...
Not sure if my biological terminology is correct and I'm sure it's a bit more complex than that, but the simple point is that entropy ensures that there will be a cost. I can be certain it exists whether or not I have an idea of what it might be.
> A bit more complex than Stripe understands---ya think???

Ah... another Jokia post. Another dose of :spam:
> I wonder, as I wander, in the maze of entropy and thermodynamics: is someone saying that matter is disappearing, ceasing to exist? And, conversely, that matter is being created out of nothing to fill the space vacated by the matter which has been destroyed? I am somewhere in left field! What? bybee

Make it support evolution and your Nobel is guaranteed. :thumb:
> The challenge is from entropy. Information and thermodynamics are two fields that utilise this observed trait that can be applied to all scientific fields.

I don’t know what “this observed trait” is. If you are speaking of entropy, then it is not an observed trait common to both fields, any more than showing fear (to quail) is the same as eating a type of bird called quail.
> Do you think he would be interested in addressing the challenge as it now stands?

I have no idea where Styer’s future interests lie. E-mail him and ask.
> I do tend to lump you guys together a bit, don't I. Apologies.

Apology accepted. Johnny reminds me of some of my colleagues, great guys who honor their faith yet are not afraid of doing honest science.
> But the paper repeats an error that Henry Morris made fifty years ago,

...which error Enyart goes on to say is that of conflating the two definitions of entropy. In his OP Enyart relies on claims by Timothy Stout, and then talks a bit more about entropy confusion. But nowhere in his OP, other than by saying it is so, did Enyart show that Styer mixed up the disparate definitions of entropy. That is the challenge of this debate: did Styer cheat by relying on two different concepts of entropy?
Radio announcer Bob Enyart takes me to task for not distinguishing between "heat entropy" and "information entropy" in my American Journal of Physics article "Entropy and Evolution". Any knowledgeable person could just look at the equations in my paper and see that I mean "thermodynamic/statistical mechanical entropy".
The word "entropy", like most words, has many meanings, and the meaning in use is determined from context. If I say "Run away from danger", you don't think "A run is a small stream, so I must follow a small stream away from danger".
Here I want to present some of the other meanings of the word "entropy", to emphasize that it would have been silly to say that I'm not talking about each of them:
information entropy
topological entropy
Kolmogorov entropy
Kolmogorov-Sinai entropy
metric entropy
Gibbs entropy
Boltzmann entropy
Tsallis entropy
von Neumann entropy
Shannon entropy
Rényi entropy
volume entropy
If I had spent so much time talking about what I'm not going to be talking about, the paper would have been quite long indeed!
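The confusion the list above is meant to dispel has a mathematical root worth noting: Gibbs (thermodynamic) entropy and Shannon (information) entropy have the same functional form and differ only by a constant factor and a choice of logarithm base, yet they are applied to different things (physical microstates versus message symbols). A short Python sketch, assuming an arbitrary example distribution (the function names and the distribution are mine, not Styer's):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p * ln(p)), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]        # some probability distribution
h = shannon_entropy(probs)       # 1.5 bits
s = gibbs_entropy(probs)         # same sum, rescaled into J/K

# Identical form, different scale: S = H * k_B * ln(2)
assert abs(s - h * K_B * math.log(2)) < 1e-30
```

The shared form is exactly why equations alone can look interchangeable across fields, and why context (units, what the probabilities range over) is needed to tell which entropy is meant.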