Sunday, September 25, 2011

Reply to Nick Matzke on "Richard Lenski's Long-Term Evolution Experiments with E. coli and the Origin of New Biological Information"

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1


I've only had a chance to skim through some of what you've written, but here's my two cents
on things that caught my attention:

<blockquote>"1. Agree or disagree -- Michael Behe accepts that evolution can produce new
genes, and thus new information."</blockquote>

You've pretty much engaged in the very thing Luskin called out Dennis Venema for. Producing
 <i>something</i> is a far cry from producing <i>everything.</i>

<blockquote>"Please define "fundamentally new". Creationists/IDists regularly use such
vague language as an escape hatch when their claims are under pressure, but they never give
a precise, scientific definition of "fundamentally new"."</blockquote>

http://www.uncommondescent.com/comment-policy/put-a-sock-in-it/

Is their response to the question of <i>"What Do You Mean by “Constructive” Beneficial
Mutations Exactly?"</i> good enough?

And what is a "precise, scientific definition" supposed to look like to you? I agree that the
kind of information/traits we are talking about must be pinned down before a verdict can be
reached on any evolutionary outcome. But if the standards for a useful definition are not
agreed upon ahead of time, then Design critics will simply declare any new definition
someone puts forth for new traits/information to be insufficient.

<blockquote>"Please note that if "fundamentally new" means "no significant sequence
similarity to other genes, then, at the very least, most of most genomes is not
"fundamentally new" and thus could have evolved from common ancestors without violating
 Meyer's "law"."</blockquote>

I'm pretty sure there is a big enough difference between the genomes of <i>past</i>
ancestors (yes, I am assuming common descent is a fact here) and the species of
<i>today</i> to find plenty of fundamentally new sequences in the process. Even if there is
significant sequence homology, surely a <i>new feature</i> would require a <i>new set of
information</i> to generate it, correct?

<blockquote>"A new gene with a new function has got to be a fair amount of new
information, doesn't it? If it isn't, you can't go around claiming that genes have lots of
information in them, which Meyer et al. clearly do."</blockquote>

Agreed (as I said above), but there's nothing about that which Meyer or Luskin would deny.
The disagreement is over whether or not new biochemical features touted as evolutionary
success stories are beyond the UPB threshold. (see below)

<blockquote>"So, if mutation and natural selection produce 10 bits, what magical process
stops them from adding another 10 bits, and another 10 bits after that?"</blockquote>

If low probability counts as a "process," then there's your answer. The main (but not the
only) argument from the ID camp, as I understand it, is that evolutionary processes are
unlikely to generate genuinely new traits - only minor variations on what's already there to
begin with.

But more importantly, you've missed the point that Meyer was originally making and that
Casey just pointed out below, namely that Meyer was referring to information in a
<b>pre-biotic</b> context and not in a biological/Darwinian sense. Thus you're focusing on
an <i>entirely different</i> problem from what Luskin and Meyer were referring to.

In your next paragraph you basically state that <i>separate, isolated</i> cases of
evolutionary processes producing a bit or two of information can all "add up" to 500 bits of
information; thus Meyer, Dembski, or anyone else who agrees with their work is wrong.

There are two main problems with this:

1. First, when anyone speaks in reference to the Universal <b>Probability</b> Bound in any
way, it refers to a <i>probability</i> of 10 to the negative 150th power, not necessarily
<i>500 bits,</i> and this is where I think you and Luskin are talking past each other. The
difference?

Generating 500 bits by <i>chance alone</i> is equivalent to the UPB (2 to the negative 500th
is roughly 10 to the negative 150th), but it's clear that you are not referring to chance
alone being involved in the production of new features. Thus 500 <i>bits</i> may be
generated without the outcome ever approaching an <i>improbability</i> of 10 to the
negative 150th.

So if there is <i>more than blind chance</i> involved in generating information, then the
question should not be, "Did we generate 500 <b>bits?</b>" but rather, "Is the
<b>probability</b> of generating this greater or less than 10 to the negative 150th?"
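
To make the arithmetic concrete, here's a minimal sketch (Python; the uniform chance model
where each bit halves the probability is my simplifying assumption) of why the two measures
come apart. It borrows your own figure of 10 bits per selectable step:

<pre>
# Sketch: 500 bits in one blind shot vs. 500 bits in selectable steps.
# Assumes a uniform chance model where each bit halves the probability.
from math import log10

TOTAL_BITS = 500
STEP_BITS = 10   # the "10 bits" per round of mutation + selection from the quote above

# Chance alone: all 500 bits in a single trial.
one_shot = -TOTAL_BITS * log10(2)
print(f"Chance alone: 2^-500 = 10^{one_shot:.1f}")   # ~10^-150.5, the UPB

# Cumulative route: 50 independent 10-bit steps, each preserved by selection.
per_step = -STEP_BITS * log10(2)
print(f"Per step: 2^-10 = 10^{per_step:.2f}, repeated {TOTAL_BITS // STEP_BITS} times")
# ~10^-3 per step: the same 500 bits in total, but no single event anywhere near 10^-150.
</pre>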

2. Now here is the main issue I have with the idea that <i>separate, isolated</i> cases of
generating <i>smaller changes</i> that don't quite reach the UPB threshold can "add up to"
a total net increase in function that renders the UPB false. Separate cases of small
increases in information and an entire new feature that reaches the UPB threshold are two
very different things.

Here are separate isolated cases (in this case, words) of information:

TO PROBABILITY THE LESS FIFTIETH RANDOMLY TO THAN GENERATE ONE POWER BOUND
SENTENCE OF PROBABLE THE HUNDRED TEN UNIVERSAL IS THIS 

Each word, taken on its own, is far more probable to generate at random than the UPB. But
that does not mean that they all "add up" to something meaningful.

Now here is a full string (or sentence) that - if you were trying to randomly generate it -
reaches the level of Dembski's Universal Probability Bound:

THIS SENTENCE IS LESS PROBABLE TO RANDOMLY GENERATE THAN THE UNIVERSAL
PROBABILITY BOUND OF TEN TO THE ONE HUNDRED FIFTIETH POWER

As you can see, there is a huge difference between <b>separate</b> cases of producing
something more probable than the UPB and a case in which a feature <b>as a whole</b>
manages to surpass that threshold.
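
To put numbers on this illustration, here's a quick sketch (Python; the uniform 27-symbol
alphabet of A-Z plus space is my simplifying assumption):

<pre>
# Probability of randomly typing each word vs. the whole sentence,
# assuming uniform draws from a 27-symbol alphabet (A-Z plus space).
from math import log10

SENTENCE = ("THIS SENTENCE IS LESS PROBABLE TO RANDOMLY GENERATE THAN "
            "THE UNIVERSAL PROBABILITY BOUND OF TEN TO THE ONE HUNDRED "
            "FIFTIETH POWER")
ALPHABET = 27
UPB_EXPONENT = -150   # Dembski's bound: 10 to the negative 150th

def log10_prob(text):
    """log10 of the probability of typing `text` by uniform random draws."""
    return -len(text) * log10(ALPHABET)

# Each isolated word is wildly more probable than the UPB...
longest = max(SENTENCE.split(), key=len)
print(f"{longest}: 10^{log10_prob(longest):.1f}")        # PROBABILITY: ~10^-15.7

# ...but the sentence taken as a whole falls below the bound.
print(f"Whole sentence: 10^{log10_prob(SENTENCE):.1f}")  # ~10^-184.6
print("Below the UPB?", log10_prob(SENTENCE) < UPB_EXPONENT)   # True
</pre>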

<blockquote>"If you can't produce an explanation that is as detailed and well-evidenced,
why should scientists take ID seriously in this case?"</blockquote>

A few points on this one:

1. Didn't you say it makes no sense to expect that a theory must explain <i>everything</i> in
 order to be true?

2. If we find that undirected processes are incapable of generating something, and we know
that intelligence has far greater causal adequacy for producing anything analogous, then
what should we conclude?
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2.0.14 (GNU/Linux)

iQGcBAEBAgAGBQJOf9tuAAoJEAlY1kTu1KLhiI8L/2CxMHdA5EXeLMCfYhGBcPTx
vus09o6Zubf6WnKWTeSi/ioNlxGtwALlp+ddRd5SHRRywZqKf3UyQP1sR4wa9nhW
L13Hx0KkppKuCisNliDWCK4ERNRfUlCjLmUNMfoN7bUn3TKF54kxntxExUkiuDa/
RtnAYSsYPz8tO0IyRp/eGU2XaGR+gtVkdjkg80kayk/nT4pUmkwF/WALrNbRV8gt
dyyy2xdXTj8VDWK4lBnhjaGConubqtv4l2A9oQGGikqvqw6MXdblnb8zU3hzVvIz
kSI6ufDL429ebZeN+k+L3/VphxHoYJGUC5zqqZBKnh6oxnHff5iZcF4aSZSyIcWq
h/RQyxfqRNXTX/r4W64OSIViypKxXGxWNrBXlnm9O7mQ5yc8mTR7rPtQe5RLK0UM
DDNpYMCJUTyDLuBGTyi96z9upTNNCnJNNOcysTvkMW7bQeNQEj1jcP8eVC89nKiF
or+K9OP3KSf5l2R7PcjNpsEw6xsn7RCOsLixj+QBjw==
=gE09
-----END PGP SIGNATURE-----


Comments that aren't both encrypted with my key and digitally signed with the sender's own key go to the trash. I might skim them for lulz though...

This is the policy given to me by Janus. They don't agree with my views, but since they are willing to protect my real identity, this is the least I can do in return.

Crypto-IDEA WA: Don't waste your time here. We have a far more cryptographically secure outpost than this one...