Comments on Recursivity: "Yet Another Creationist Misunderstands Information Theory"

Charles Allan:

"What about the amount of NEW information required to make the first cell in a pool of muddy water. The evos dont even have a start to their crazy theory"

Congratulations - you are yet another creationist who doesn't understand information theory.

Please prove you are not a blithering idiot.

1. What is the difference between "new information" and ordinary "information"?

2. Why are random events insufficient to generate "new information"?

3. How much "new information" is in the following string of bits? 0110100110010110. Show your work.

-- Jeffrey Shallit (2013-06-28 02:11)

What about the amount of NEW information required to make the first cell in a pool of muddy water. The evos dont even have a start to their crazy theory

-- charles allan (2013-06-28 02:02)

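An aside on question 3 above: that 16-bit string has a very short description. It coincides with the first 16 bits of the Thue-Morse sequence, in which bit n is the parity of the number of 1s in the binary expansion of n, so a few characters of code regenerate it. A sketch in Python:

```python
# The 16-bit string from the question above, regenerated by a tiny program:
# bit n of the Thue-Morse sequence is the parity of the 1-bits in n.
def thue_morse_bit(n: int) -> int:
    return bin(n).count("1") % 2

s = "".join(str(thue_morse_bit(n)) for n in range(16))
print(s)  # 0110100110010110
```

In the Kolmogorov sense discussed throughout this thread, the existence of such a short generating program is exactly what it means for the string to carry little information, whatever one takes "new" to mean.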
Anonymous:

I think it unlikely that there will be a single information measure "relevant to biology". Instead, mathematicians and biologists will develop measures relevant and appropriate to the problem at hand. But when ID creationists "prove" bogus theorems about their measures and fail to admit the theorems are wrong, they do a disservice to science and mathematics.

-- Jeffrey Shallit (2012-03-30 12:46)

Interesting. I just happened to have one of Ming Li's papers on the table beside my desk this morning. He uses Kolmogorov information as an "information distance" when clustering homologous sequences for, say, a protein family. He defines the information distance as "the length of the shortest binary program that is needed to transform" two sequences into each other. I wouldn't say that Kolmogorov information is used in biology "all the time", but it is an interesting distance measure for sequence clustering. Whether Kolmogorov distance is "meaningful" to the cell (i.e., whether it makes any structural or functional difference) is another question. For example, some sequences can have a large Kolmogorov distance from each other, yet be indistinguishable to the cell as far as structure and function go. Another sequence may have a single "knockout" mutation that renders it non-functional for the cell, yet lie at a very small Kolmogorov distance. What I'm saying is that Kolmogorov information may be "meaningful" to us, but not necessarily to biology. Sometimes yes, sometimes no.
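The information distance described above is uncomputable in general, but in practice it is commonly approximated by a normalized compression distance, with an ordinary compressor standing in for Kolmogorov complexity. A rough sketch using Python's zlib (the sequences here are invented purely for illustration):

```python
import zlib

def c_len(s: str) -> int:
    # Compressed length in bytes: a crude upper-bound proxy for Kolmogorov complexity.
    return len(zlib.compress(s.encode(), 9))

def ncd(x: str, y: str) -> float:
    # Normalized compression distance: near 0 for closely related strings,
    # closer to 1 for unrelated ones.
    cx, cy, cxy = c_len(x), c_len(y), c_len(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = "MKTAYIAKQRQISFVK" * 25                 # made-up "protein" sequence
b = ("MKTAYIAKQRQISFVK" * 25)[:-3] + "LLE"  # the same sequence with a small change
r = "GWLNDPHTCAVERSYQ" * 25                 # an unrelated made-up sequence
print(ncd(a, b), ncd(a, r))  # the near-duplicate should score much closer to a
```

Distances of this flavor are what Li and coworkers use when clustering sequences and constructing phylogenies, as mentioned elsewhere in this thread.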
It's an indication that more work needs to be done to come up with an information measure that is more relevant to biology.

-- Anonymous (2012-03-30 11:31)

The fallacy referred to above is called equivocation: using a term in different ways at different points, as if the different definitions were equivalent when they aren't.

-- PNG (2012-03-07 08:03)

Yes, Kolmogorov information (or variants of it) is used all the time in biology. One application I know of is in constructing phylogenies. You can read Ming Li's work on the subject, probably easily findable with a Google search.

-- Jeffrey Shallit (2012-03-04 08:44)

In The Blind Watchmaker, Dawkins says that a tree shedding seeds is literally raining information; it wouldn't be any more true if the tree were raining floppy disks.

Do biologists use Kolmogorov information when talking about the information in biological systems? Is it part of the biologist's mathematical toolkit? Does anyone know a good review paper or book on this?

-- Luke Barnes (2012-03-04 07:02)

That was very clear, John, thanks.
It's too bad that neither statement had to do with biological systems, which is what this post was originally talking about.

-- Miranda (2012-03-04 01:57)

Miranda, you need to take those two statements in context.

"...it's clear that repeating a phrase twice has the potential to provide more information than the phrase uttered once."

This is specifically in response to Luke's example of a spoken phrase.

"...doubling a string is guaranteed to increase information infinitely often."

This is part of an explanation of the definition of information as used by mathematicians and computer scientists.

-- John (2012-03-02 12:47)

Thanks, Miranda! I will give your suggestion all the consideration it deserves.

-- Jeffrey Shallit (2012-03-02 09:44)

Fine, I'll take that class, and you take this one: http://www.duq.edu/communication/graduate/phd/curriculum.cfm

-- Miranda (2012-03-01 22:50)

Miranda: I do not think I can remedy your confusion. Perhaps a course in mathematics at your local community college might help.

-- Jeffrey Shallit (2012-03-01 16:57)

"In the Kolmogorov theory, xx need not have more information than x for every string x. But it will for infinitely many x."

But who cares if it will for infinitely many x, with respect to information in biology? Last I checked, there aren't infinitely many biological systems out there.

-- Miranda (2012-03-01 14:15)

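The first half of the statement quoted above (that xx need not contain more information than x) is easy to illustrate with an ordinary compressor, which gives a computable upper bound on Kolmogorov complexity: for a patterned string, the doubled string compresses to barely more than the original. A sketch in Python, using zlib as the stand-in compressor:

```python
import zlib

def approx_info(s: bytes) -> int:
    # Compressed length in bytes: an upper-bound stand-in for Kolmogorov complexity.
    return len(zlib.compress(s, 9))

x = b"abracadabra" * 200  # a highly patterned string
xx = x + x
print(approx_info(x), approx_info(xx))
# For strings like this, doubling adds only a handful of bytes of description,
# even though (per the theorem discussed here) for infinitely many strings
# doubling must genuinely increase the information.
```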
James:

I agree. Dembski is particularly effective at this fallacy.

-- Jeffrey Shallit (2012-03-01 07:24)

Yes, Miranda, you definitely seem confused.

In the Kolmogorov theory, xx need not have more information than x for every string x. But it will for infinitely many x.

-- Jeffrey Shallit (2012-03-01 07:23)

Most times I read nonsensical arguments in philosophy or theology, the underlying fallacy is of the same sort.

Usually there are two meanings of a word floating around: a jargon meaning (either pre-existing or cooked up specially for the purpose) and an everyday meaning. The error then consists in managing to confuse the two: introducing an example in one sense, and using it as if it were an example in the other sense.

-- James Cranch (2012-03-01 04:17)

I could be misinterpreting you, but the following two sentences appear to contradict each other, or are at least talking about different things:

1) "...it's clear that repeating a phrase twice has the *potential* to provide more information than the phrase uttered once."

2) "...doubling a string *is guaranteed* to increase information *infinitely often*."

-- Miranda (2012-03-01 01:56)

"You didn't answer yes or no, whether in the realm of speaking words, or in the realm of biology."

Miranda: I'm not a linguist or biologist. I'm a mathematician and computer scientist. In the standard Kolmogorov definition of information, as used by mathematicians and computer scientists and explained in dozens of papers and books, doubling a string is guaranteed to increase information infinitely often.

The technical understanding of "information" should not be confused with various vague folk understandings of the word, in the same way that the folk understanding of the word "field" has little to do with how it is used in algebra or vector analysis.

It is quite unwise to overload a term like "information", which is well understood in mathematics and computer science, to mean something entirely different from what professionals in those subjects expect.

-- Jeffrey Shallit (2012-03-01 01:07)

"Even in the example you give, it's clear that repeating a phrase twice has the potential to provide more information than the phrase uttered once."

Has the *potential*, yes. In biology, too, it has the potential. (IOW, Atheistoclast may be wrong.) But Luke said "There must be some sense of the word 'information' in which another copy of the same sentence does not add information." You didn't answer yes or no, whether in the realm of speaking words, or in the realm of biology.
You avoided answering by demanding a "rigorous definition for discussion and debate."

-- Miranda (2012-02-29 22:00)

I've subscribed to one of those brain-dead Christian podcasts, and for the last couple of episodes they've been going in detail over the "book" Me, The Professor, Fuzzy, and The Meaning of Life (http://www.thebigmystery.com/), which starts with basic principles and builds up to a "proof" of the Christian God.

The syllogism basically goes like this:

1. The universe had a beginning.
2. Every event has to have a cause.
3. Entropy is always increasing.
4. There are two exceptions to entropy increasing: life and intelligence.
5. Entropy is the same as disorder. Complex things are ordered, therefore complexity is always decreasing.
6. Except when there's intelligence involved.
7. Since we see complex things in the world today, and complexity is always decreasing, then back at the beginning of the universe, it had to be much more complex.
8. Since the only way to get complexity is with intelligence, there must have been a great intelligence who put all that complexity into the universe at the beginning.
9. Therefore the cause of the universe had to be very intelligent, which is another way of saying God.

(He goes on later to show how it must specifically be the Christian version of God.)

I see three main problems with this. The first one I'm good with, but Jeffrey, can you comment on the second and third?

Premise #2, that every event has to have a cause, I know to be false from my knowledge of quantum physics (I was an EE major in college back 30 years ago, so I had a fair amount of QM and some information theory).

If you relate complexity and entropy, they're not inverses of each other; complexity is an equivalent concept to entropy, isn't it? Can't you basically get Boltzmann's constant by taking the log2 of the probability of something or other?

And finally, does the 2nd Law of Thermodynamics apply to the information kind of entropy? If we say that complexity is kinda like entropy, does it follow that complexity must be increasing?

Thanks for your thoughts.

-- Curt Cameron (2012-02-29 17:20)

You could think of it in terms of a physical problem. If you had a string of alternating black and white beads, it would take a certain amount of energy to make the initial string.

Suppose that you then used that string as a template for making one copy. Even if copies were cheaper to make than the original, they would cost something in energy; it would take n times as much energy to make n copies as it would to make one copy.

Of course, that is a handwavey example and not a real math proof.

-- John Stockwell (2012-02-27 19:50)

"Even in the example you give, it's clear that repeating a phrase twice has the potential to provide more information than the phrase uttered once."

Yeah, yeah.

-- Tim Kenyon (2012-02-27 14:29)

"'As for your final question, no, the program need not necessarily "specify n", whatever that means.' I think rather concretely about this, so being a fortran programmer I had in mind:"

Well, see, that's the point. You demonstrate a fundamental misunderstanding here: producing a single program to print x^n says *nothing at all* about the Kolmogorov complexity of x^n. You are confusing upper bounds with lower bounds, a common mistake among beginning students.

The correct argument goes the other way. Given a program to produce x^n, we can deduce from it a program to produce n. Since we know that infinitely many n are incompressible, this gives the result.

-- Jeffrey Shallit (2012-02-27 07:17)

"Has Kolmogorov's definition failed to correctly formalise our intuitive concept of information?"

Yes, well, that's the thing, isn't it? Kolmogorov's theory is clearly defined and usable, whereas our "intuitive understanding" is vague, contradictory, and inconsistent between different people.

Thus, if we use Kolmogorov, we can check each other's work and see if mistakes have been made, whereas if we use "intuitive understanding" all we are left with is seeing who can shout the loudest.

Face it, buddy: intuition fails, at least when it comes to this.

-- Valhar2000 (2012-02-27 07:03)

"One if by land, two if by sea". Good example.

"As for your final question, no, the program need not necessarily 'specify n', whatever that means." I think rather concretely about this, so being a fortran programmer I had in mind:

    integer :: n, i
    n = 100000
    do i = 1, n
      write(*,*) ...  ! call (shortest) function that prints x
    end do

The second line can be made arbitrarily long by making n arbitrarily large (ignoring such practicalities as the limits of the integer kind). So I can place an arbitrarily large amount of information into n itself.

Obviously, that's not how a mathematician would argue; I really should get around to reading Li and Vitanyi's textbook. Their article (with Kirchherr) you linked to a while back ("miraculous universal distribution") was excellent, though as a scientist I am a tad worried that the complexity of a string is non-computable. It doesn't seem like there will be a simple "numerical recipe" for hypothesis testing with Kolmogorov, and yet Kolmogorov's thesis says that his definition is provably better than yours.

-- Luke Barnes (2012-02-27 07:01)
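Luke's point about "information placed into n itself" can be made concrete: the text of a program that prints x repeated n times grows only with the digits of n, roughly log10(n) characters, no matter how long the output is. A sketch in Python (the string x is an arbitrary example):

```python
# A program printing x repeated n times needs only the digits of n in its
# source text (about log10 n characters), while its output grows linearly in n.
x = "0110"
for n in (10, 10**6, 10**12):
    program = f"print({x!r} * {n})"
    print(n, len(program), len(x) * n)
```

This shows only the upper-bound direction; as Shallit notes above, a lower bound on the complexity of x^n requires the separate fact that infinitely many n are incompressible.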