
Let's Discuss Scientific Objections to Evolution


one.opinion


On 8/23/2020 at 3:44 PM, The Barbarian said:

I've pointed out the logical errors you made in those assumptions.   You won't listen, so I'll stop flogging the dead horse.

Perhaps you don't know what "information" means in science.   What do you think it means?

I did work on that in graduate school.  Yes, I know what it is.   The point is that you don't seem to know what it is.   Since you declined to answer, I'll assume you don't know what it means.  

Information can be quantified as follows. If 𝕏 is the set of all messages {x₁, ..., xₙ} that X could be, and p(x) is the probability of some x ∈ 𝕏, then the entropy, H, of X is defined:

H(X) = −Σ_{x ∈ 𝕏} p(x) log p(x)

where p(x) is the frequency of that allele in the population.

Let's take a very simple example, so you can see how it works.   Suppose there's a population of organisms with two different alleles at a certain gene locus, each with a frequency of 0.5.    What is the information for that gene?   Using base-10 logarithms, it's about 0.301.    Now suppose that a mutation occurs at this locus and it eventually increases in the population until each of the three alleles has a frequency of about 0.333.

What did this mutation do to the information?

Now, it's about 0.477, an increase.   Every new mutation in a population increases information.   That's how it works.
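The arithmetic above can be checked with a short sketch (Python here purely for illustration; the thread's figures of 0.30 and 0.477 come out when you use base-10 logarithms in the entropy formula):

```python
import math

def entropy(freqs, base=10):
    """Shannon entropy H = -sum(p * log p) over a list of allele frequencies."""
    return -sum(p * math.log(p, base) for p in freqs if p > 0)

# Two alleles at frequency 0.5 each: about 0.301
two_alleles = entropy([0.5, 0.5])

# After the mutation spreads, three alleles at about 1/3 each: about 0.477
three_alleles = entropy([1/3, 1/3, 1/3])
```

With base-2 logarithms the same frequencies give 1.0 and ~1.585 bits instead; the base only scales the numbers, so the direction of change (an increase) is the same either way.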

Information is the uncertainty as to the allele in a sample, until you determine what the allele is.   If there's only one allele in the population, the uncertainty is 0 (you already know what it is), and therefore the information for that gene is 0.   If there are two, the uncertainty is now greater than 0; if there are three, the uncertainty is greater still, and the information increases accordingly.

Oh good grief!  So you define "information" as levels of statistical uncertainty, then, hey presto, mutations increase information!

Information conveys meaning (a message or significance of some kind).  It requires an intelligent coder, a code, data (and a recipient, if information is to be conveyed).  To be useful, information requires the recipient to be able to decipher the code and put the data to some kind of use.

Regarding randomness: absolute randomness is impossible, which is why random number generators can only approximate randomness.  As soon as you have written an algorithm, you have introduced an element of order, however slight it might be.

I'm not going to engage with you any further, since you clearly do not believe what the Bible says about the days of creation and do not want to.  You will find any excuse to accept secular evolutionary hypotheses and reject creation, so we do not have common ground.

 


The most succinct definition of information is: "that which reduces uncertainty".


4 hours ago, David1701 said:

Oh good grief!  So you define "information" as levels of statistical uncertainty

That's what it is.   As you now realize, that understanding is what makes possible things like the internet and communication with distant spacecraft over low-powered equipment.   As you learned, the information for a given gene is 0 if there is only one allele (you always know what it will be), and the information gets higher and higher as mutations produce more alleles.

4 hours ago, David1701 said:

then, hey presto, mutations increase information!

Yep.   Just as it does for any other system.  That was Shannon's big discovery.    Are you beginning to realize that "information" doesn't have to increase when populations evolve?   You could temporarily have a decrease in information, which would then be followed by an increase as mutations occur.

4 hours ago, David1701 said:

Information conveys meaning (a message or significance of some kind).

It can.  But as Shannon demonstrated, it doesn't have to. Hence we can make electronic communications as reliable as we like by increasing redundancy.  Would you like to learn how that works?
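The redundancy idea can be sketched with a toy repetition code (Python for illustration; the function names here are mine, not from the thread): repeating each bit and taking a majority vote lets the receiver survive an occasional flipped bit.

```python
from collections import Counter

def encode(bits, n=3):
    """Repeat each bit n times: a simple repetition code."""
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    """Majority vote over each block of n repeats."""
    return [Counter(received[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
sent = encode(message)    # 12 bits on the wire instead of 4
sent[4] ^= 1              # noise flips one bit in transit
recovered = decode(sent)  # majority vote still recovers [1, 0, 1, 1]
```

Real systems use far more efficient error-correcting codes than plain repetition, but the principle is the same: more redundancy buys more reliability, at the cost of sending more bits.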

4 hours ago, David1701 said:

It requires an intelligent coder, a code, data (and a recipient, if information is to be conveyed).

No, that's wrong.  For example, we can gain much information from a hurricane by observation, even though no one actually coded it.   You've been sold a rather faulty notion of "information."

4 hours ago, David1701 said:

To be useful, information requires the recipient to be able to decipher the code and put the data to some kind of use.

But information doesn't have to be useful.   That was Shannon's point.   It's a key understanding to making communications more reliable.

4 hours ago, David1701 said:

Regarding randomness: absolute randomness is impossible,

Quantum events.    Irreducibly random.

4 hours ago, David1701 said:

I'm not going to engage with you any further,

Probably a good thing, as you clearly do not agree with what Genesis says about the "days" of creation, and you seem to have a rather odd misunderstanding about the nature of information, particularly as it applies to genetics.   You will find any excuse to accept YE creationism and reject creation, so we do not have common ground.

 


3 hours ago, Alive said:

The most succinct definition of information is: "that which reduces uncertainty".

Technically, information is a measure of uncertainty.    If, for example, you have no doubt at all about what the next message will be in a system, the information of that message is 0.   If there are a number of possible states, the information of the message increases as the number of possible states increases.
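A minimal sketch of that idea (Python for illustration): the self-information of a single message that had probability p is −log₂ p, so a message you were already certain of carries 0 bits.

```python
import math

def self_information(p, base=2):
    """Bits gained on receiving a message that had probability p."""
    return -math.log(p, base)

certain = self_information(1.0)    # no doubt at all: 0 bits
coin = self_information(0.5)       # one of two equally likely messages: 1 bit
four_way = self_information(0.25)  # one of four equally likely messages: 2 bits
```

More possible states means smaller probabilities per state, hence more bits per message, which is the relationship described above.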

 


4 hours ago, Alive said:

The most succinct definition of information is: "that which reduces uncertainty".

Succinct?  Yes.  Useful?  Not so much.


13 hours ago, The Barbarian said:

Technically, information is a measure of uncertainty.    If, for example, you have no doubt at all about what the next message will be in a system, the information of that message is 0.   If there are a number of possible states, the information of the message increases as the number of possible states increases.

 

This is a very strange definition of information.  Just because a message adds no new information does not mean that it contains no information, unless you contrive the definition so that existing information is no longer information.  The whole thing is bizarre.

Surely, if you have no doubt about what the next message will be in a system, the additional information of that message is zero (because you already knew the information it contains); but this is different from what you said.

If I send an e-mail to someone, saying that I received his previous e-mail, then this is information.  If I then forget that I sent it and send it again (then remember and warn him that I've sent a duplicate), it still contains the same information, and he's certain about what that message will be, just no new information.

Edited by David1701
There's a duplicate coming next

5 minutes ago, David1701 said:

This is a very strange definition of information.  Just because a message adds no new information does not mean that it contains no information, unless you contrive the definition so that existing information is no longer information.  The whole thing is bizarre.

Surely, if you have no doubt about what the next message will be in a system, the additional information of that message is zero (because you already knew the information it contains); but this is different from what you said.

If I send an e-mail to someone, saying that I received his previous e-mail, then this is information.  If I then forget that I sent it and send it again (then remember and warn him that I've sent a duplicate), it still contains the same information, and he's certain about what that message will be, just no new information.

 

Edited by David1701
Duplicate containing the same information


1 hour ago, David1701 said:

This is a very strange definition of information. 

Nope.  It's the standard definition used in information theory and engineering of communications systems.

1 hour ago, David1701 said:

Just because a message adds no new information, does not mean that it contains no information, unless you contrive the definition so that existing information is no longer information.  The whole thing is bizarre.

You still don't understand how it works.   The information of a message is how much more you know after you get it, than before you get it.   So, if a gene has only one allele, you don't know any more after checking that gene from a sample of the population than you did before you checked it.   So the information for that gene is 0.    If there are two possible alleles for that gene in the population, now the gene has some information, depending on the relative frequency of the two alleles.   That's how it works.   And it does work.   It's how the internet is possible, and how you can assure accurate transmission by adding redundancy to the message.
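The allele case in that paragraph can be sketched directly (Python for illustration, base-10 logs to match the figures earlier in the thread): a fixed gene carries no information, and a rare second allele carries less than two equally common ones.

```python
import math

def gene_info(p, base=10):
    """Entropy of a two-allele locus with allele frequencies p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # only one allele present: no uncertainty, so no information
    return -(p * math.log(p, base) + (1 - p) * math.log(1 - p, base))

fixed = gene_info(1.0)     # one allele in the population: 0
rare = gene_info(0.95)     # second allele at 5%: small but nonzero
balanced = gene_info(0.5)  # two equally common alleles: about 0.301
```

The information rises from 0 at fixation to a maximum when both alleles are at frequency 0.5, which is why the relative frequencies of the alleles matter and not just their count.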

1 hour ago, David1701 said:

Surely, if you have no doubt about what the next message will be in a system, the additional information of that message is zero (because you already knew the information it contains); but this is different from what you said.

No, that's exactly what I told you.  Except that the information in the message will be 0, not the additional information.

1 hour ago, David1701 said:

If I send an e-mail to someone, saying that I received his previous e-mail, then this is information.  If I then forget that I sent it and send it again (then remember and warn him that I've sent a duplicate), it still contains the same information, and he's certain about what that message will be, just no new information.

No.  The information of the new message is 0 because he already knows exactly what it is, and so when he gets it, he knows no more than he did before.

In reality, the new message will have a different time stamp, and possibly be worded differently, and so will have some information content for him.   But suppose you merely agreed that if one thing happened you'd send a 0 and if something else happened you'd send a 1.    In this case (ignoring time stamps) the second message would literally have an information content of 0.

That's how it works.   We're familiar with creationists talking about "information" with no understanding of what it actually is.   I guess they think it's all sciencey and stuff.  But few of them actually understand the theory, or how the theory works to make electronic communications possible and more reliable.   And so far, none of them have a clue about how it works in genetics, even though Shannon first used his theory in biology.

Edited by The Barbarian

Stephen Meyer does a good job of discussing 'information' in his books 'Signature in the Cell' and 'Darwin's Doubt'.


14 minutes ago, The Barbarian said:

Technically, information is a measure of uncertainty.    If, for example, you have no doubt at all about what the next message will be in a system, the information of that message is 0.   If there are a number of possible states, the information of the message increases as the number of possible states increases.

Nope.  It's the standard definition used in information theory and engineering of communications systems. You still don't understand how it works.   The information of a message is how much more you know after you get it, than before you get it.  

Yes, I understood that perfectly; and it's a very useful concept.  I simply would not describe it merely as information.  I would describe it as additional information.

The definition you have given for information is highly specialised, and not one in common usage outside of that specialised field.

 

This topic is now closed to further replies.