Sunday, April 27, 2008


“The empiricist will never allow himself to accept any epoch of nature for the first–the absolutely primal state; he will not believe that there can be limits to his outlook into her wide domains, nor pass from the objects of nature, which he can satisfactorily explain by means of observation and mathematical thought–which he can
determine synthetically in intuition, to those which neither sense nor imagination can ever present in concreto; he will not concede the existence of a faculty in nature, operating independently of the laws of nature–a concession which would introduce uncertainty into the procedure of the understanding, which is guided by necessary laws to the observation of phenomena; nor, finally, will he permit himself to seek a cause beyond nature, inasmuch as we know nothing but it, and from it alone receive an objective basis for all our conceptions and instruction in the unvarying laws of things.

In truth, if the empirical philosopher had no other purpose in the establishment of his antithesis than to check the presumption of a reason which mistakes its true destination, which boasts of its insight and its knowledge, just where all insight and knowledge cease to exist, and regards that which is valid only in relation to a practical interest, as an advancement of the speculative interests of the mind (in order, when it is convenient for itself, to break the thread of our physical investigations, and, under pretence of extending our cognition, connect them with transcendental ideas, by
means of which we really know only that we know nothing)–if, I say, the empiricist rested satisfied with this benefit, the principle advanced by him would be a maxim recommending moderation in the pretensions of reason and modesty in its affirmations, and at the same time would direct us to the right mode of extending the province of the understanding, by the help of the only true teacher, experience.

In obedience to this advice, intellectual hypotheses and faith would not be called in aid of our practical interests; nor should we introduce them under the pompous titles of science and insight. For speculative cognition cannot find an objective basis any other where than in experience; and, when we overstep its limits our synthesis, which requires ever new cognitions independent of experience, has no substratum of intuition upon which to build.

But if–as often happens–empiricism, in relation to ideas,
becomes itself dogmatic and boldly denies that which is above the sphere of its phenomenal cognition, it falls itself into the error of intemperance–an error which is here all the more reprehensible, as thereby the practical interest of reason receives an irreparable injury.

And this constitutes the opposition between Epicureanism and Platonism.”


Broke down and started a YouTube account. Here are a couple of old videos.

Skiing, Durango CO:

Whale breaching off the coast of Alaska:

Scabs running around in the Albuquerque snow:

The Silverton railroad, Colorado:

Saturday, April 26, 2008

The eye

Another video reflecting profound ignorance:

Here are the errors:

1) Non sequitur. Titling the video "creationism disproved?" implies that the presentation of a hypothesis for the evolution of one organ, the eye, could disprove the idea that the organism itself was created. Silly, of course.

2) Oversimplification. The video starts out with "light sensitive cells," as though the development of such cells was "no big deal." But let's take a closer look at this. How much genetic information is required to transform a limited number of cells into light-sensitive cells? First, the cell has to hyperpolarize in light. Then, it has to have the capacity to produce various neurotransmitters depending on its state of polarization, and to transmit them. It has to be connected via nervous tissue that can transfer those neurotransmitters, and something has to have the capacity to receive them. And the organism has to have the capacity to use this information in a way that adds survival value. All this irreducibly complex functionality, glossed over in just the "first step." The further steps similarly oversimplify the development -- like all the genetic information required to construct the "rudimentary lens," including the differentiation of lens cells and the mechanisms required to manipulate those cells in order to make the lens useful. She skipped those steps, too. Finally, by focusing only on mollusks, she doesn't even get to the really difficult stuff -- the development of the eye BALL, which moves freely inside an eye socket.

3) Hypothesis masquerading as science. All they have presented is a hypothesis -- not a theory. There is in fact no evidence to prove that these various forms of eyes actually, historically, evolved from one another. The evolution of the eye could have been radically different from the process presented here. Or there could have been no such evolution at all. There is no evidence to falsify either of these other possibilities, and therefore the "just-so" story presented is not science.

4) Non sequitur. I think my favorite quote of hers is "If it can grow, it can evolve." How beautiful is that. Growth is the process by which an organism follows the instructions encoded as genetic information preprogrammed into the organism. Evolution is the process by which genetic information is modified through the generations. Radically different processes. But she argues that if an organism can follow its preprogrammed genetic instructions, then surely those genetic instructions could be programmed into the organism without any intelligent intervention. Nice :). It's like saying that if a computer can start up, following the instructions pre-encoded in its hardware and software, then surely those instructions can develop through random variation and non-random selection.

Mind and reality.

Seems to me there exists a spectrum of philosophical belief regarding the relationship between mind and reality. On the one extreme you have solipsists who believe the mind manufactures reality. On the other extreme you have materialistic reductionists who want to reduce every feeling and thought to a chemical configuration, and ultimately to equate the mind with the physical brain. In the middle there are those (like me) who believe that the mind and reality both exist, but in different categories -- mind being a subjective experience, and reality being objective.

Concepts that fit into the category of subjective experience:

Pain. It has no physical reality. Yet we experience it. We know what it is. We know what to do when we feel it. Its causes and effects can be observed in the objective world -- a slap in the face, or a brain-scan at the time the pain is initiated to see which portions of the brain are lit up. But you can't ever point to "pain."

Similarly, the mind is a subjective experience. It has no physical reality. Yet we experience it. We all know what it's like when our "mind" is "groggy." We all know what it's like when a new idea flashes into our mind. We speak of these things as real -- and they are -- albeit not physically real.

Philosophically, and experientially, we treat these subjective experiences as real. And rightly so. We experienced ideas long before we had any idea of how the brain actually functions. In fact, today, we still don't know what the physical manifestations or causes of an "idea" are. Yet we have them from the earliest age.

Materialists would like to reduce those experiences to mere chemical components. How many times have I read articles about how "love" has been proven by "science" to be "dopamine." But when you read the article, you discover that the basis for this conclusion is only that dopamine is associated with love. But dopamine is also associated with all good feelings -- it's a feel-good drug. It explains why one feels high, but it does not explain why one loves one person rather than another -- or how that love is reflected in the physical reality of the brain. It basically reduces love to "feeling good" -- which is, of course, nonsense to anyone who has ever experienced love.

We experience love, mind, and pain as real, not as their physical manifestations. There's no sense denying what we all experience, every day.

And the mind, while subjective, also has the power to directly impact reality. The placebo effect is real, and it is documented. People act and manipulate objective reality based on these IDEAS, FEELINGS, HOPES, CHOICES, and DREAMS, which, although we cannot see, touch, or measure them, are nevertheless very real to us in our experience.

Then on the other hand, there are the dualists who insist that the mind has some substantive reality. Like the mind isn't just an experience, but "something" that could conceivably be found "somewhere." They do this without any evidence, and in the face of a substantial argument to the contrary: the fact that you can fiddle with someone's mind by fiddling with the physical components of their brain.

The key is to keep categories separate. Treat the subjective as real, and subjective. Treat the objective as real, and objective. Don't treat the subjective as objective. Don't treat the objective as subjective. Everything in its right place.

Sunday, April 20, 2008

Placental genes

Cutting through the obligatory evolutionary ranting here, I think it's some pretty cool stuff. Basically, during the first stage in placental development, we activate primarily genes shared by other species. But in the second portion, we activate species-specific genes.

How cool is that?

The evolutionary error, of course, is to conclude from these facts that all the species are related. No go. Just because the Model T and the Highlander Hybrid both have tires doesn't mean they're related. But it's still really cool. I'm interested in what gene (or other biological structure) controls the activation of the genes within the two stages. How does the embryo "know" when different genes need to be activated?

Monday, April 14, 2008

Dawkins. Yikes.

I can't believe what I'm hearing here. Richard Dawkins said that "In order to prove that a fossil was really 3000 years old, they would have to find igneous rocks which were found in proximity to the fossils, date these by radioactive dating, several different methods of radioactive dating, all of which give independent estimates of the date of the fossil."

That's simply not true.

When you date igneous rock by radioactive dating, you determine how long ago the igneous rock cooled from molten rock. You do NOT determine how long ago the sedimentary strata in which you find the rock were laid down. A rock may have emerged from a volcano 1M years ago, and been buried in sedimentary strata 10 years ago. Radiometric dating will still find the rock to be 1M years old.

But, of course, the date of a FOSSIL laid down in a sedimentary strata is not the date of the old rock that got buried there, but the new bones!

By Mr. Dawkins' reasoning, you could have an igneous rock that was 1M years old, bury it in the dirt next to my shoe, and then upon uncovering them both, conclude that my shoe was 1M years old.

But even further, radiometric dating of rocks assumes that the rock had a particular composition when it was initially formed. For instance, K-Ar dating assumes that the rock initially contained no argon. If there was in fact argon in the rock initially, the rock itself will date millions of years older than it actually is.
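To make this concrete, here is a back-of-envelope sketch in Python of the standard K-Ar age equation. The decay constants are the standard published values; the function names and the example excess-argon figure are mine, chosen purely for illustration. A rock with a true age of zero but a trace of inherited argon dates to over a million years:

```python
import math

# Standard K-40 decay constants
LAM = 5.543e-10     # total decay constant of K-40, per year
BRANCH = 0.1048     # fraction of K-40 decays that produce Ar-40

def apparent_age(ar_per_k):
    """K-Ar age implied by a measured Ar-40/K-40 ratio, under the
    standard assumption that ALL the argon is radiogenic."""
    return math.log(1.0 + ar_per_k / BRANCH) / LAM

def argon_ratio(true_age, excess=0.0):
    """Ar-40/K-40 ratio a rock would show after true_age years,
    plus any inherited (non-radiogenic) argon present at formation."""
    return BRANCH * math.expm1(LAM * true_age) + excess

# A freshly cooled rock (true age 0) carrying a trace of inherited argon:
fresh = argon_ratio(0.0, excess=1e-4)
print(f"{apparent_age(fresh) / 1e6:.1f} Myr")  # about 1.7 Myr
```

With zero excess argon the round trip is exact, which is why the method works on rocks that really did start argon-free; the dispute is over that starting assumption, not the arithmetic.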

And in fact, there are many examples of volcanic rocks being found with argon in them.

It never ceases to amaze me how these "scientists" are either unwilling or unable to grapple with reality.

Hardware and software

I often hear people, the popular press, and even "scientists" say things like "DNA contains all the information necessary to create life." Seems to me pretty clear that the claim isn't true, for one simple reason: In order to function, DNA must be in a cell which already has all the structures in place to keep the cell alive, as well as to use the DNA to code proteins.

Seems analogous to a computer: DNA is comparable to the software, and the cell is comparable to the hardware. But in a computer, both the software AND the hardware contain information necessary to the function of the computer.

Same with a cell: DNA by itself doesn't contain all the information necessary for life. The rest of the cell probably contains just as much information (if not more) in the configuration and function of all its organelles.

How much of the total information necessary for life is in the DNA? And how did the rest of the information come to be encoded in the cell? Someday maybe we'll know.

Sunday, April 13, 2008

Label and object

Recently, I've been realizing more and more the importance of differentiating between objects and our linguistic labels for them. In fact, there are many words that we use for things that have no corresponding physical reality.

The number 2, for instance. You can't point to it. You can't touch it. You can't see it. 2 is a word we use to describe a particular grouping, real or imagined, of other objects.

Infinity, also. You can't point to it, touch it, or measure it. It's defined as something that doesn't end. But because of our inherently limited nature, we can't measure or see anything without end.

Random. Many things we think of as random are not in fact random, but determined by variables we don't know. Randomness, then, would be better understood as "Something I can't put into a pattern."

Imaginary numbers. Not really any more imaginary than real numbers. Again, just a label we use to stick in our formulas. There are no "square roots" in reality, much less any "negative numbers." These are linguistic labels we use to manipulate algebraic formulae.

An idea. We have ideas all the time, and speak of them as though they are real. But you can't touch or feel an idea.

Dimensions. You can't touch or see dimensions. They are just labels we use to describe the universe in which we live. And X, Y, and Z are not the only, nor even the best, means of describing reality. Non-Euclidean geometries can describe the universe equally effectively, if not more so. Dimensions are just language, not reality.

Seems like the most vicious philosophical arguments are usually over the definitions of these words without any corresponding physical reality.

And what's REALLY weird are the folks (like Pythagoras, Plato, Anselm, and Descartes) who treat these labels as though they are real -- or even more real than reality. Pythagoras went as far as to worship numbers. Plato thought the form of "greenness" was more real than anything green on the Earth. Anselm thought that because God was defined as the being than which no greater can be conceived, he must, therefore, exist in reality. And Descartes had his own ontological argument.

All these people made the same fundamental mistake -- treating a label as though it were real -- and in some cases, more real than reality itself.

The ambiguity between label and reality caused a lot of unnecessary struggles for me, particularly in the area of math. As I got higher in math, it seemed more and more like the teachers and books were treating things like probability and integrals as though they were something real -- and I was trying to conceptualize them that way. But they're not. Probability is just a label we put on our degrees of ignorance. If we knew all the variables that went into determining the outcome, there wouldn't be any probability. The same problem arises with calculus. You can't point to or touch an "integral." It's a mathematical game we play to bridge ourselves from physical reality to physical reality quickly and conveniently.

My math and stats teachers, I think, didn't get that. They insisted on treating math as though it were something real -- as though it had some corresponding REALITY to it -- which, of course, it does not. They were just making the same mistake Pythagoras did.

It also caused me to struggle with philosophical discussions about "essence." Philosophers spoke (and speak) of "essence" as though it has some reality to it. But "essence" is really just a definition we put on what we see.

I've come to realize that all this means I've been thoroughly nominalist and existentialist in my philosophy without knowing it. But now, by knowing it, I'm much better able to understand what they meant with all that "essence" stuff.

I wonder what it is about people that makes some of them want to treat labels like reality.

Endogenous retroviruses

Interesting -- apparently there are thousands upon thousands of sequences in our DNA with the characteristics of retroviruses, and evolutionists use this as an argument for Common Descent.

First some definitions -- a retrovirus is a virus that codes itself into the DNA of the host cell and hijacks the host cell to create copies of the virus.

An endogenous retrovirus, then, is a retrovirus that is permanently in an organism's DNA.

Estimates of the percentage of our DNA that consists of retroviruses range from 6% to 10%. A small number of these retroviruses are in almost exactly the same place as they appear in chimps. Evolutionists use this fact to argue for common descent because, they argue, if these retroviruses are in exactly the same place in both humans and chimps, the retrovirus must have been in place before chimps and humans branched -- the chances are too slim that both would be infected in exactly the same place independently.

This argument rests on a number of key unstated assumptions:

1) That the endogenous retroviruses appear (and appear in exactly the same place) in the genes of all humans. If they didn't, then we would have an interesting question: are some humans related to apes and others not? Or, given the 30,000 ERVs in the genome, aren't some bound to turn up in the same location in apes and humans, even if they're not related and were infected independently?

2) That these actually are retroviruses. Most of these estimates are just based on segments that have coding similar to viruses. Isn't it possible for DNA to have coding similar to virus coding without actually containing viruses?

3) That these viruses weren't intentionally inserted into our DNA. We use retroviruses to perform genetic engineering. Who's to say the Designer didn't use them to genetically engineer us? This is especially interesting, given the fact that some (many) ERVs are indispensable to life.

4) That ERVs can actually infect germ lines. In order for an ERV to be passed on to the next generation, it would need to infect the sperm or egg of a parent. While we've seen these ERVs in DNA, we've never actually seen one infect a germ line. We assume that what we see are old viruses. But we still haven't seen an infection.

5) It also leaves us with an interesting question: if we all share the same ERVs, how did they come to be fixed in the entire population? If a single individual is infected with an ERV, that ERV is going to be wiped out by genetic drift over time unless there is some distinct survival advantage to it. What are the chances, seriously, that viruses that bungle their way into our gene pool are so advantageous to our survival that everybody without them dies? That is, unless the viruses were designed to be good for us ...
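The back-of-envelope question in point 1 can actually be computed, under one deliberately crude assumption that I should flag: that insertion sites are uniformly random across the genome (real retroviral insertion is known to be non-uniform, which changes the numbers). Using the ~30,000 ERV figure above and a roughly 3-billion-base genome, a uniform model gives:

```python
import math

GENOME = 3.0e9   # approximate human genome length, in base pairs
N_ERV = 3.0e4    # ~30,000 ERV-like sequences (the figure used above)

def expected_shared_sites(n_a, n_b, genome=GENOME):
    """Expected number of coinciding insertion sites if two genomes were
    hit independently and each site were uniformly random."""
    return n_a * n_b / genome

def prob_at_least_one_match(n_a, n_b, genome=GENOME):
    """Poisson approximation to P(at least one coincidence)."""
    return 1.0 - math.exp(-expected_shared_sites(n_a, n_b, genome))

print(expected_shared_sites(N_ERV, N_ERV))      # 0.3 expected chance matches
print(prob_at_least_one_match(N_ERV, N_ERV))    # ~0.26
```

So under this toy model a single independent coincidence is plausible (roughly a 1-in-4 chance), while a large number of independent coincidences would not be; how non-uniform the insertion sites really are is exactly what the question turns on.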

What does all this mean? I don't know. But I think it's cool.

Culture and religion

People of all stripes seem to have a hard time figuring out how to fairly judge and compare religions and cultures. This genius, for example, is deluded enough to think that this had something to do with religion.

So what are the top errors with respect to evaluating culture and religion?

Well first off is the mistake of comparing the Ideal with the Real. What I mean is this: If you compare one man's ideals to another man's actual behavior, you're truly comparing apples and oranges.

"Atheism is great, because atheists believe in human rights. Religion is horrible -- remember the Crusades?"

That's not a fair comparison. If you're going to compare religions, you have to compare apples to apples -- compare one man's ideals to another man's ideals, and one man's behavior to another man's behavior.

Compare the atheist's belief in "human rights" to the Christian injunction that "he who is without sin should cast the first stone." They come out roughly equivalent.

Compare the religious persecution performed by Christians in the 20th century to that performed by atheists. The atheists come out much worse. 31,689,000 Christians were martyred by atheists in the 20th century, helped along a great deal by the atheists Joseph Stalin and Mao Tse Tung, who killed Christians by the millions simply for being Christian.

That's not to bash my many awesome atheist friends. It's just to show the problem with comparing ideals of many to practices of the few.

The second mistake one can make in this arena is an error in CAUSATION. Are acts committed by people BECAUSE of a belief, or in spite of it?

In the blog by Mr. Myers, he implies some link between the Islamic religion and the forced marriage and sexual abuse of the 8-year-old girl in Yemen. But, in fact, the interpretation of Islam by Muslims today is that this is immoral. Even in Yemen, while marrying a girl of that age is legal, having sex with her is not. Therefore, the perpetrator in this case violated the law. How then can you blame a religion for an act when contemporary religious authorities condemn the act?

If a person commits an act forbidden by his religion, then the problem is not that they are too religious -- the problem is that they are not religious enough.

Interestingly, the Quran itself has no minimum age for marriage -- Muhammed himself married an 8-year-old and consummated the relationship when she was 9 and he himself was 52. This is vile to our sensibilities.

However, again, it is important to determine whether a behavior comes from a religion or from the religion's cultural context. A Sharia expert explains the Islamic argument based on cultural context here. In reality, these cultures force children into sexual contact at exceedingly young ages whether Islamic or not -- it's part of their culture. In fact, prior to Islam, women were typically bought and sold as chattel. Even today, in countries like Zambia, one out of three 8-year-olds is having sex, and about one in seven would deliberately have sex with someone they knew had HIV. I ask you, is it better for an 8-year-old to be having sex with a 50-year-old who is responsible for her welfare, or with a 9-year-old with AIDS who has no obligations to the girl at all?

Seems about a tossup to me.

And as the teacher explains, the Quran requires some degree of security and protection for the girl. Thus, while the Quran permits something we in the West would never permit and which arguably shouldn't be permitted by anyone, it is certainly BETTER than the culture without Islam.

Finally, when evaluating religion and culture, one should take a look at the log in one's own eye. As the teacher points out, many girls in the West become sexually promiscuous around age 12 or below. In the US, one third of all births are to unwed women, one third of girls get pregnant at least once before they turn twenty, and four million teens contract an STD every year. We tolerate this because in our value system, it's their "right." However, our protection of these "rights" results in STDs, teen pregnancy, and broken families, and in the children that come out of those families. Agreed, marrying children off at 8 is not a good idea. But neither is a culture permitting its 10-year-olds to have sex out of bounds. Maybe we should clean up our own house first.