Saturday, November 10, 2007

BSOD

Tacky, tacky.

Doesn't the person who manages this screen system ever step outside and look? Or, better yet, set up a webcam across the street so you can check the readout!

Have it all--on a Mac

I have to concur with this sentiment. A Mac just works for everything I need to do in my role as a system administrator, and having a Unix terminal at hand makes everything better, especially paired with virtualization through Parallels.

Friday, November 9, 2007

Finding the origins of English mathematics

When I studied the history of mathematics in college, we didn't talk about the contributions of the Brits--mostly, I think, because no one ever thought there was a contribution before the Middle Ages. So this article on megalithic circles just looks really cool. The money paragraph for me is this:
In an interesting aside, Linacre describes the research of Anne Macauley (1924-1998) who studied the monuments and showed that these megalithic people understood the Fibonacci series of numbers and the Golden Mean five thousand years before Leonardo of Pisa explained them. The evidence also suggests they used square roots and Pythagorean mathematics two thousand years before Pythagoras.
Leonardo of Pisa lived around 1175-1250, so the Brits supposedly understood Fibonacci series around 3800 BC. Impressive, yes; but I think Mesopotamia may have had similar mathematics, although this timeline shows nothing for that time period. To my knowledge, we have no evidence that Mesopotamia was interested in Fibonacci numbers, but then their numerical system was sexagesimal, so I'll lay the blame there. They did work on quadratic equations, although I can't imagine doing those without the shorthand notation we use!
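
For the curious, the tie between the Fibonacci series and the Golden Mean is easy to see with a few lines of code. This is just an illustrative sketch of mine in Python (not from any of the linked articles): the ratio of consecutive Fibonacci numbers converges to the Golden Mean.

    # Ratios of consecutive Fibonacci numbers converge to the Golden Mean.
    golden_mean = (1 + 5 ** 0.5) / 2  # about 1.6180339887

    a, b = 1, 1
    for _ in range(12):
        a, b = b, a + b
        print(f"{b}/{a} = {b / a:.10f}  (error {abs(b / a - golden_mean):.2e})")

Each step shrinks the error by roughly a factor of 2.6, so the ratio homes in on the Golden Mean very quickly.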

Wednesday, November 7, 2007

The future of parking

This article on stackable cars isn't nearly as interesting after reading it again as it was at first. I was hoping the cars would be stacked vertically. That would be far more fun. Just watch that first step, I think. It's a doozy.

Wherefore art thou earmarks?

This is an interesting use of technology.
One of Sunlight's resident creative geniuses (yes, there are many of them) have taken all the Defense Appropriations Earmarks and made them available for viewing within Google Earth. (You can only view this using Google Earth which you can download from this page.)


I think we could apply a similar idea to a whole host of state budgets, company budgets (where you could analyze the amounts assigned per department), nonprofit organizations, etc. Lots of ideas here.

I really prefer visual images that describe information; Tufte's books on visual design should be required reading for everyone, in my opinion.

Sunday, November 4, 2007

The Reprehensible Christian

I read this story about Antony Flew, who may have been taken advantage of by supposed Christians. I highly recommend you read it all, but here's a snippet:
But is Flew’s conversion what it seems to be? Depending on whom you ask, Antony Flew is either a true convert whose lifelong intellectual searchings finally brought him to God or a senescent scholar possibly being exploited by his associates. The version you prefer will depend on how you interpret a story that began 20 years ago, when some evangelical Christians found an atheist who, they thought, might be persuaded to join their side. In the intellectual tug of war that ensued, Flew himself — a continent away, his memory failing, without an Internet connection — had no idea how fiercely he was being fought over or how many of his acquaintances were calling or writing him just to shore up their cases.
If this is true, then it is horrific. No one who claims to know Christ should use any person in this manner. And the editor should never have allowed this to occur!

Thursday, November 1, 2007

How to be a Vulcan in One Easy Step?

Well, "step" isn't quite right. More like one surgery. Yes, you too can have Vulcan Ears.

Now I'm a huge Star Trek fan, but I'm averse to body modification through surgery. I'm willing to let Time just do its own work on my body and see where it takes me. But this seriously disturbs me on another level.

The desire to get into the world of fiction is growing stronger in our society. Oh, I know fiction has been around since before the Greeks started writing plays. But our society seems to be taking its interest in fiction to dangerous levels. Look at our recent Halloween celebrations. How many adults had parties and dressed up? (The big complaint was about all the slutty costumes marketed to females down to age 8.)

How many people have gotten totally into the Harry Potter series? (Disclaimer: I enjoyed reading the series, yet I understand why Christian parents would think very carefully before letting their kids read it.) The number of people who read fan sites devoted to HP is huge.

Go into the gaming community: World of Warcraft, Civ, and many others I don't even know about. They are all about escaping this world and living in another. Second Life is another manifestation of this behavior. People don't want to be themselves.

How sad, as God made them that way and He loves them as they are.

Monday, October 29, 2007

The best Laptop to run Vista is...a Mac!

Yes, you read that right. PCWorld did some testing to see which laptop would be the best to run Microsoft Vista on, and the clear winner was a MacBook Pro.

I'm feeling the love. Are you feeling it? Finally, at long last, the Apple hardware line is getting some serious street cred.

The future scriptorium?


A robot calligrapher!

I tried to find where this picture originated, but I didn't have any luck. Part of the trail went through here and here.

So I never did find the original source, but this is just stunning. What does it mean for those of us who played with calligraphy? Most of us were put out of business as a result of desktop publishing and laser printers. But there was still a place for beautiful Bibles lettered by hand, I thought.

I guess not. But it's still cool.

What every geek needs in this day and age of cubicles

Shooting Cubicle Alarm System keeps your stapler, paperclips safe - Engadget

Because that last piece of cake is never safe. At least, not from me. (Hat tip: Instapundit.)

Saturday, October 27, 2007

No Spots for Me

Update Monday, October 29 at 11:30am:

As it turns out, I went ahead and installed Leopard anyway. I figured, "Why not? I've got everything backed up." So I ran the installer but did an archive and install, where Leopard is installed in a new System folder.

Everything is running fine, and a day later my disk's S.M.A.R.T. status says it's okay! So perhaps Leopard does something odd. I'll be watching this closely for the next week.

Last night, I went to my local Apple store, hungry for the chance to buy my copy of Leopard. I had pined for this day for months. And there I was, in the line, at the front of the line, laying down my money, and finally--finally!--picking up the shiny holographic box with my tiny little hands.

It's mine! MINE! BWAHHAHAHAH!

I drove home in a frenzy, and walked into my house.

And sanity hit me upside the head with a brick.

I breathed deeply, and realized: if I am going to install Leopard, I have to do this right. Which means: you back up first.

The imp inside me said, No! You can't wait! You have to start immediately! You have to be the first to install!

After beating my internal imp into submission, I dutifully started running my backup program. And I did my household chores while I waited for the backup to run. I did dishes. I fed the cat. I cleaned her water bowl. I put clothes away, washed my face, brushed my teeth--all the good stuff.

Finally, I checked the computer and the backup was done. FINALLY! said the bludgeoned imp inside me. So I stuck the Leopard DVD in and rebooted.

And again, sanity hit me upside the head, with the same brick. So I started to run the Disk Utility on the Leopard DVD. By this point, the imp inside me was groaning and clawing (but gingerly, as it was still a bit bruised) and realizing that it wasn't going to win.

And that's when the bad news hit.

DISK FAILING.

Huh? What?

There it was. Disk Utility reported that the disk in my machine, via its S.M.A.R.T. status, had determined it was failing. My imp shouted, But I've only had this machine for a year! It can't be failing!

And somewhere deep in my brain a brain cell fired up--a brain cell storing a vague memory that this particular Mac model has had problems with failing hard drives. I can't find a link now to confirm that memory, but it's moot.

So now I have to wait until Monday to take my laptop in to the Apple store (it's covered under an AppleCare agreement) and talk to the bods about it.

Forgive me while I wear black, but I am in mourning.

Saturday, October 20, 2007

How my cat wakes me up in the morning

From CuteOverload.com.

Apparently I'm not the only one with a cat who does this.

Friday, October 19, 2007

Order out of chaos

NASA - Giant Waves Over Iowa

[radar image]

Oh, sure; we know that meteorologists have found some order in the chaos of weather for ages, but this...this is wild.

Those giant waves—"undular bore waves"—were photographed Oct. 3rd flowing across the skies of Des Moines, Iowa. (Credit: KCCI-TV Des Moines and Iowa Environmental Mesonet SchoolNet8 Webcam.)

"Wow, that was a good one!" says atmospheric scientist Tim Coleman of the National Space Science and Technology Center (NSSTC) in Huntsville, Alabama. Coleman is an expert in atmospheric wave phenomena and he believes bores are more common and more important than previously thought.

But first, Iowa: "These waves were created by a cluster of thunderstorms approaching Des Moines from the west," he explains. "At the time, a layer of cold, stable air was sitting on top of Des Moines. The approaching storms disturbed this air, creating a ripple akin to what we see when we toss a stone into a pond."

Ripples in the sky?! Look in the lower right-hand corner of that radar image. Stunning. If you follow the link, there's a video that shows what it looks like in real life. It's slow to load, but worth it.

Thursday, October 18, 2007

THE geek gift for Christmas

Now THIS is what you buy a geek for Christmas.

Nikko electronics R2-D2 digital audio & video dvd projector


I actually saw this in the Sharper Image catalog, and said to myself, "Any respectable geek would find this to be the best gift for several years, at least."

It stands 20.5" high. It has an LCoS projector that does 4:3 (eh) as wide as 80 inches. It has a built-in DVD/CD player, an FM tuner, an iPod dock, an SD card slot and a USB connector. It has built-in stereo speakers with surround sound.

But to give it true geek cred, it has a wireless remote that is a replica Millennium Falcon. Not only does the remote control all the geeky toys inside, it controls R2: forward, backward, right, left.

AND: put the remote in the stand and it LIGHTS UP!

Excuse me, but I need a mop, and now. To clean up all my drool on the floor.

Saturday, October 13, 2007

Geeks in Hollywood

Alas, too long between posts... my own fault.

But I was finally pushed to write today because of something I have noticed within the new TV lineups. There are, to put it mildly, a lot of geek shows compared to previous years, when there were hardly any.

We've had "Numb3rs" for a few seasons, and "Heroes" has some geekish qualities, but this year things have expanded. We have "Big Bang Theory", and SciFi (viewable online at their website) has "A Town Called Eureka". All these shows feature, in some position, geeks or scientists or computer nerds or other specialists in their fields. "Eureka" is geek heaven, with all the brilliant minds supposedly living together in one town.

Actually, I'd like to pick on "Eureka" a bit, because they come so close to getting it right--but fall flat. First, while there are some super geniuses who feel as if regular men and women do not deserve any attention, that's not true of all of them. And those who act that way actually don't talk to regular folks at all. They avoid them like the plague. (And unfortunately, I know a few of these people--from far away.) But most super geniuses realize that their super abilities lie in only one field, and know that others can far exceed them in other specialties.

And while super geniuses can do amazing things, they generally don't mess up in the small details; the small details are so obvious to them that they can't escape their attention. (It does happen, but with far less regularity than "Eureka" plots suggest.)

They have nailed down the odd habits and hobbies that some geeks have. I mean, that was one of the things that got me interested in the show. But no one has a pet, which is odd. The animals are either lab animals being tested on or science projects that are released into the wild.

I have seen, first hand, how really intelligent people can have hugely emotional arguments about how right they are and how wrong others are, and it's amazing how well "Eureka" gets that right. Someone on the staff must have witnessed such arguments in the past. I'm not sure whether it happens because they work in close proximity for so long that they start grating on each other, but the fireworks are simply astounding.

But the thing that is starting to bother me greatly about "Eureka" is how the sheriff is always the one to solve the problem. Always. He's always the one risking his life, doing the "out of the box" thinking that the supergeniuses aren't doing. Well, hello! That qualifies him as a supergenius too. So either start changing the plots, guys, or have more characters recognize his abilities for what they are. It's becoming very predictable. I wonder if that is because regular folks are writing the script as regular folks and haven't got a real geek to pull in to help them.

There's another thing that "Eureka" actually gets right--it admits that the supersmart aren't perfect and in fact can have serious flaws. Those of us who believe in Christ and the Bible would call that "sin", because sin is essentially a character flaw in all of us.

Tuesday, September 4, 2007

Reality vs. Simulation and the Big Question

Oh, fresh meat. Just when I was afraid this was going to become a stale blog, I remembered the following story, whose follow-up has given me a reason to post.

Let the fun begin.

Back on August 14, John Tierney posted a story in the New York Times about the latest twist on the question of whether we're all just imagining this universe: perhaps, instead, we're sitting in a simulation (free registration required). Tierney writes:

I hadn’t imagined that the omniscient, omnipotent creator of the heavens and earth could be an advanced version of a guy who spends his weekends building model railroads or overseeing video-game worlds like the Sims.

But now it seems quite possible. In fact, if you accept a pretty reasonable assumption of Dr. Bostrom’s, it is almost a mathematical certainty that we are living in someone else’s computer simulation.

This simulation would be similar to the one in “The Matrix,” in which most humans don’t realize that their lives and their world are just illusions created in their brains while their bodies are suspended in vats of liquid. But in Dr. Bostrom’s notion of reality, you wouldn’t even have a body made of flesh. Your brain would exist only as a network of computer circuits.
Ah, all the posthumanists come out of the closet and start saying, "Yeah! We've known about this for years!" Yes, all this is just fine and dandy, except as I hinted at earlier (and as Tierney also says) this concept has been around for a long time. Lots of philosophers have asked whether we're really "here," and what "here" means.

But Tierney decided to have some fun with this. He first posted follow-up questions in his TierneyLab, and it generated enough discussion that he decided to host a contest: a Talk-To-The-Designer contest. I would have to say the winners had their tongues firmly planted in their cheeks as they typed up their entries. My personal favorites are:

8th Place:

My dad always said that “reality” is the place where you have to pay the rent.

If this is indeed all a simulation, would you be so kind as to so advise my landlord.

...

6th Place:

You know when you can slide the little cursor thing to the left and the video starts at the place you moved to?

Can you move me back 30 years or so?

Thanks.

...

4th Place:

To: “The Simulator, The Creator”
From: “Computer Generated Spammer”
Subject: “DIAMOND TRANSFER FOR INVESTMENT”
FROM THE DESK OF MR. EGOMAH FELIX SYNTAX ERROR, LAGOS, NIGERIA, an account officer to late Mark Jones an Immigrant, who was a Businessman and Building Contractor in my Country.
On the 21st of April 2001, a customer designed and created by your simulation was involved in a Car accident along Lagos-Shagamu express road. All occupants of the Vehicle unfortunately lost their lives and were deleted from your program.
The deleted character has an account valued at USD$15.5 million, I have been in contact with his lawyer prior to locate any of his relatives for over 2 years that seems abortive. Now, I seek your consent, to present you, the creator and simulator, granting you opportunity to be presented.
We discovered an abandoned sum of diamonds worth $12,500,000.00, and Mr. Jones will states that you have claim to 25% of this sum if you give us your user name, password and the name of your favorite pet for security purposes.
Thank you for your time, and your simulation.
SYNTAX ERROR
Awesome entries. Mucho fun. There was some serious quibbling about the winner, but that's not what I want to discuss here.

I could see having a lot of fun with this, but it doesn't address something that I find fascinating: the fact that we can have this discussion.

Think about it. If we really were in the Matrix, then Agent Smith would have already talked to us, reset our memory, and convinced us that it was all just a bad dream caused by skipping dinner for the Nth time in a row. If we were in a true simulation, then the ability to have self-referential thought within the simulation should cause a serious loop that would generate enough feedback to cause the holomatrix to start fritzing.

And yet....

We are in a simulation. We just don't call it that. Oh, most people pooh-pooh this idea of "afterlife" and think that Christianity is just for blue-haired ladies willing to throw their money away, but Christianity's basic tenets can be presented in Matrix-like, simulation terminology (although the Matrix was not what I would call a Christian film):
  • This world isn't reality. It's a mirror--a simulation--for what's going to come at the end. (1 Corinthians 13:12)
  • There is Someone--the Source, the Programmer--watching our every move and tweaking our lives; partially to test us, partially to answer prayer. (Philippians 4:6, Colossians 4:12, James 1:3, James 5:15-16, 1 Peter 3:12) (I'm just scratching the surface here; big theological discussions on the meaning of pain and suffering aren't going to be covered at this time.)
  • There is a reality beyond our simulation, our world. (Job 11:7-9, Job 28:23-24) (but I need a better source, as some would say that poetry is exempt)
  • While our souls are eternal, our bodies are not. (1 Corinthians 15:42-58)
  • There is a glitch in the simulation--sin. One of the residents of the simulation "broke," and his replicated progeny have this same glitch. We're like Agent Smith when he becomes a virus. (Romans 5:12-21)
  • Those who trust in Jesus Christ will receive so much: forgiveness for the "glitch" that exists inside us, new bodies, better than those we had before.
So while all this talk about metaphysical ideas of reality sounds silly or lame or cerebral, there is a kernel of Truth in it all.

Monday, September 3, 2007

The Apple Tree Grows

Slashdot posted an article about how Apple's sales have now surpassed Gateway's in the market. The original Computerworld article noted that "Apple's share of U.S. sales [is] at 5.6%" and notes that one in every six laptops now purchased is a Mac.

Why this increase? Spaketh Stephen Baker, analyst at NPD Group, Inc:
Baker attributed the jump in market share to refreshes that both laptop lines recently received.
Heh; they can't just come out and say that the laptops that Apple makes are superior. No. They have to couch it carefully in terms of "improvements."

Most folks that I talk to want to get Macs for the following reasons:
  • They want to be able to manage music, photos, and movies and build their own stuff. Sure, you can get iTunes on Windows, but iPhoto and iMovie aren't offered on other platforms.
  • They want to run Windows for one or two programs and like the fact that they can keep Windows isolated in a virtual environment. If Windows gets corrupted, they don't care; they can just recreate their PC in Parallels or Boot Camp and re-install their software with a minimum amount of effort (compared to doing this on Windows directly). Better still, Parallels now offers the ability to take "snapshots" of your environment, so even if things go bad, you can just revert to a snapshot.
  • They love the fact that they just don't have to worry about viruses and malware as they do on Windows.
All eminently good reasons, in my book. People who never would have looked at a Mac a few years ago are now seriously considering it, including a coworker of mine and family members.

Apple, let me just say it: Nicely done. Keep up the good work.

Good statistics, bad statistics part II

I wasn't going to use this example for my second installment on statistics, but when I read it on Sandy Szwarc's site I thought it was perfect for presenting one of the big issues with so many studies we hear about in the media. (Ms. Szwarc, I hope you do not mind my continued references to your work.)

The purpose of the base study referenced in the article? To investigate if a certain drug can alleviate weight gain caused by taking an anti-psychotic. (Trust me, it's there; just keep reading.) The money quote from Ms. Szwarc regarding that study is this:
So, they tested 3 male schizophrenic patients (average age of 22 years) hospitalized for acute psychotic episodes by giving betahistine along with their anti-psychotic medication (olanzapine) for 6 weeks.
Three guys! Three guys comprised the study! Nothing statistically significant can be gleaned from a sample that small, and such a study should never be considered a basis for any decision except that further study is required. I don't care if the researchers claim it was statistically significant; I simply don't buy it. That sample is way, way too small, even for a cell or a cluster.
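
To make the point concrete, here's a toy calculation of my own (with made-up numbers, not data from the actual study) showing how wide a 95% confidence interval gets when n = 3:

    import statistics
    from math import sqrt

    # Invented weight changes (kg) for three subjects, just to show how
    # wide a 95% confidence interval is when n = 3.
    changes = [1.2, 2.5, 0.4]
    n = len(changes)
    mean = statistics.mean(changes)
    sd = statistics.stdev(changes)
    t = 4.303  # 97.5th percentile of Student's t with n - 1 = 2 degrees of freedom
    half_width = t * sd / sqrt(n)
    print(f"mean {mean:.2f} kg, 95% CI ({mean - half_width:.2f}, {mean + half_width:.2f})")

That prints an interval running from about -1.3 to 4.0 kg--so wide it tells you nothing except that more data is needed.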

And yet, as Ms. Szwarc demonstrated aptly in her article, the pharmaceutical firms are going crazy, wanting to work with the listed drug, and media articles are going ga-ga over this super fantastic drug (to borrow a phrase from Manolo).

I really worry that soon the media will be going crazy because a study came out based on two persons or even just one person that demonstrates the next Practically Perfect Pill™. You'll never hear about the study; just how great it could be, should be, and how we should be behind it now.

Sad, sad, sad.

Saturday, August 18, 2007

Good statistics, bad statistics part I

The Lancet study of 2004, written by Roberts et al., on the mortality count before and after the US Coalition invasion of Iraq has been endlessly hammered by conservative bloggers, and Michelle Malkin posted a critique of the study last month, written by David Kane of Harvard University. I've long wanted to explain why this study is so important, but I also want to do both papers justice. Let me attempt that now.

The Lancet study

Roberts' study compared the mortality in the 14.6 months before the March 2003 invasion to the mortality in the 17.8 months after it. The tracking of deaths in Iraq is considered inaccurate because only a third of all deaths happen in hospitals. So the authors went to households in 33 clusters around Iraq in an effort to estimate mortality in two time periods: January 1, 2002 - March 18, 2003 and March 19, 2003 - September 20, 2004.

Interviews took place September 8-20, 2004. The idea is that each cluster will be representative of 1/33 of the country and can be used to extrapolate and come up with an estimate for the number of deaths before and after. (Given the lack of security in Iraq, it is clear that a full-blown census as performed in the US is impossible.) The study was designed to minimize risk to the interviewers while attempting to keep clusters random.
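
To see how that extrapolation works arithmetically, here's a toy version of my own in Python; every number below is invented, and the real study's cluster weighting is more involved than this:

    # Toy cluster-survey arithmetic (all numbers invented). Rates are
    # deaths per person-month over each period.
    PRE_MONTHS, POST_MONTHS = 14.6, 17.8

    # (people surveyed, deaths before invasion, deaths after invasion)
    clusters = [(240, 2, 5), (255, 1, 3), (230, 3, 4)]

    people = sum(c[0] for c in clusters)
    pre_rate = sum(c[1] for c in clusters) / (people * PRE_MONTHS)
    post_rate = sum(c[2] for c in clusters) / (people * POST_MONTHS)
    print(f"pre-invasion:  {pre_rate:.2e} deaths per person-month")
    print(f"post-invasion: {post_rate:.2e} deaths per person-month")
    print(f"risk ratio: {post_rate / pre_rate:.2f}")

The sampled rates are treated as representative of the whole country, which is why a single unusual cluster can move the national estimate so much.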

Each household, if it agreed to be interviewed, was asked for the age and sex of every current household member, as well as to recall births, deaths, and visitors who stayed for more than 2 months, going back to January 1, 2002. (While sitting here, reading this, ask yourself: what can you recall of your household five and a half years ago?) If a death was reported, the interviewers attempted to confirm it by way of death certificates.

Lots of statistics were gathered and generated. The study noted, "More than a third of reported post-attack deaths (n=53), and two-thirds of violent deaths (n=52) happened in the Falluja cluster. This extreme statistical outlier has created a very broad confidence estimate around the mortality measure and is cause for concern about the precision of the overall finding." So the researchers noted that Falluja may have had an impact on the numbers that were hard to address.

At the end, the paper estimates that the risk of death increased by 2.5 times and, noting the statistical variability issues, gives the 95% confidence interval for this number as 1.6 - 4.2.

About confidence intervals

Before we go into the Kane study, let me explain what the heck a confidence interval is. For any given statistic, like this estimated 2.5-fold increase in the risk of death, we cannot be 100% sure of the 2.5 number unless we talked to everyone in Iraq. As we can't do that, we have to try to decide how "good" an estimate it is.

We do this by figuring out how big an interval we need so we can feel pretty sure the real value lies inside it. What do we mean by "pretty sure"? Well, if we use the standard 95% confidence interval, what we're saying is that if we could repeat our data gathering and analysis many times, then about 95% of the intervals we calculated would contain the true value. In our case, a new round of data gathering and analysis would give us a number other than 2.5, but 95% of the intervals built this way would capture the real increase.

Now this number, 2.5, indicates how much the probability of dying had increased in Iraq. The risk of dying had gone up something like 2.5 times, according to this study. The larger this number, the more likely someone would die after the invasion; the smaller, the less likely. If the number were 1, then there would be no increase. Note that the confidence interval found in the Lancet study had as its lower end the value 1.6. So the authors were saying that, at the 95% confidence level, the increase in the probability of death was unlikely to be lower than 1.6 times.
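
If you want to see that "repeat everything multiple times" idea in action, here is a small simulation of my own (unrelated to the Iraq data): draw samples from a population whose true mean we know, build the standard interval each time, and count how often the interval captures the truth.

    import random
    import statistics
    from math import sqrt

    random.seed(1)
    TRUE_MEAN, TRUE_SD, N, TRIALS = 2.5, 1.0, 30, 2000
    t = 2.045  # 97.5th percentile of Student's t with N - 1 = 29 degrees of freedom

    covered = 0
    for _ in range(TRIALS):
        sample = [random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(N)]
        m, s = statistics.mean(sample), statistics.stdev(sample)
        half = t * s / sqrt(N)
        if m - half <= TRUE_MEAN <= m + half:
            covered += 1
    print(f"intervals containing the true mean: {covered / TRIALS:.1%}")  # close to 95%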

The Kane Analysis

Now we're ready to look at Kane's paper. Kane doesn't question the way the numbers were gathered or how well they reflect the changes in mortality in Iraq. He looks at the analysis; he compares this overall increase of 2.5 times against the estimates of the mortality rate before and after the invasion. Each of these rates has its own confidence interval. He notes that the confidence interval for the post-invasion mortality rate is eight times wider than the pre-invasion one. That seems odd, considering the sample sizes for the two are almost exactly the same. The issue is Falluja: data from Falluja was included in the post-invasion mortality rate.
The Roberts paper noted:
There was one place, the city of Falluja that had just been devastated by shelling and bombing, and it was so far out of whack with all the others that it made our confidence intervals very, very wide.
Okay, it's good that they noted this. So what did the authors do? They did calculations both with and without the Falluja data. With the Falluja data, you get the 2.5 number and the confidence interval of 1.6 - 4.2. But if Falluja was excluded, the number was instead 1.5, with a confidence interval of 1.1 to 2.3!
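
You can see the Falluja effect in miniature with invented numbers (again, a sketch of mine, not the study's data): one extreme cluster blows the interval wide open.

    import statistics
    from math import sqrt

    # Invented cluster-level violent-death counts; the last value plays "Falluja".
    counts = [1, 0, 2, 1, 3, 0, 1, 2, 1, 52]
    T95 = {8: 2.306, 9: 2.262}  # 97.5th t percentiles by degrees of freedom

    def ci95(data):
        n, m = len(data), statistics.mean(data)
        half = T95[n - 1] * statistics.stdev(data) / sqrt(n)
        return f"mean {m:.1f}, 95% CI ({m - half:.1f}, {m + half:.1f})"

    print("with outlier:   ", ci95(counts))
    print("without outlier:", ci95(counts[:-1]))

With the outlier, the toy interval stretches from below zero to nearly three times the mean; without it, the interval is tight.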

However, statisticians don't like to exclude data; my statistics professor in college was very adamant about the value of keeping outliers and what they can teach you.

Statisticians use the phrase "statistically insignificant." That phrase means the data doesn't conclusively show any change. Kane says that with Falluja included, the supposed increase in Iraqi mortality becomes statistically insignificant: including the Falluja data would have made the confidence interval so big that there would be no way to conclude that the mortality rate went up.

Kane also compares the mortality rates before and after the invasion, and basically the results aren't consistent. He recalculates the post-invasion mortality rate confidence interval with Falluja included, and because he doesn't have the actual data he makes a calculated guess using other numbers presented in Roberts' paper. Using those numbers, Kane says you might even be able to conclude that the mortality rate went down after the invasion.

The main figure touted by the Lancet paper is that 100,000 more deaths have occurred in Iraq since the coalition arrived. Kane notes that this number is calculated using the estimated death rates with the Falluja data excluded. When he attempted to recalculate this estimate using statistics based on data including Falluja, the estimate changed to 264,000, but the confidence interval for this number is -130,000 to 659,000! Please note: the confidence interval includes 0, meaning NO additional deaths is a possibility.

Kane then goes on to argue, using some statistical reasoning, that there is a definite possibility that the mortality rate after the invasion actually went down.

Other issues

It is rarely a good sign for a researcher to refuse to share his data; the key to good science is reproducibility. Yet the authors of the Lancet study (as of Kane's paper) had not shared their data. That isn't one of the seven signs of bogus science, but I think it should be.

My take on all this

I think the Roberts et al. paper tried to do something extremely difficult: figure out mortality rates in a country that, at the time, did not have a firm grasp on the value of keeping track of these kinds of data. Any numbers they reported from Iraq were going to be difficult to support. I have some serious questions about the methodology they used to get their numbers, but I have no expertise in cluster analysis and therefore do not feel qualified to press my criticisms. I think Kane's paper does a credible job of pointing out the inconsistencies in the analysis in the Lancet paper, and he is to be commended for his careful work.

It isn't clear to me why Roberts et al. won't share the data they collected. That acts as a big red warning flag for me, because it doesn't allow for independent verification of the analysis. That's a shame.

But I think the bigger issue is how science is reported via the mainstream media. The only figure I heard reported in the media was "100,000 more died! 100,000 more deaths caused by the coalition forces!" No mention was made of all the caveats and problems noted in the study, and since Kane's report was released, I'm not aware of any media outlets discussing the problems in the Lancet study.

This problem of the media misreporting science and not reporting followup studies has been documented over and over again on the Junk Food Science Blog; Sandy Szwarc has done a fantastic job of showing just how terribly studies are reported in the media.

Let this be a lesson to all of us: if the media go ga-ga about a new scientific study, be very wary. If you can't read the study yourself, find a friend who can, and ask them to read the study and tell you what it means.

A silence too long

Alas, I have not written, not because I have nothing to write about, but because I have too much. I feel a great pressure to spend significant time on each subject to give it proper attention, but I have not had the strength or the solitude that allows for it.

I hope to correct that this week.

Saturday, July 14, 2007

One Way to Waste a Saturday


I find it very easy, some days, to get totally bogged down in the minutest of details about something that, frankly, means very little.

Today's rabbit trail involved how the various early manuscripts handle Revelation 13:18. Yes, this is the nefarious "number of the beast" passage. The NIV renders the verse as, "This calls for wisdom. If anyone has insight, let him calculate the number of the beast, for it is man's number. His number is 666." The Arabic (more properly, Hindu-Arabic) numeral 666 is an interpretation of the Greek representation of that number.

But how is the Greek represented? Well, that's where the rabbit trail started. My interlinear Greek New Testament uses the United Bible Societies' Fourth, Corrected edition and has:
εξακοσιοι εξηκοντα εξ

which is literally translated "six hundred sixty six" (sorry, no accents; getting them to show up correctly in this blog is another rabbit trail for another day). Well, that's great, but what do the earliest manuscripts say? Time to travel down the rabbit trail a bit more and learn a bit about biblical textual criticism. The link here did a credible job, I'd say, given my lack of knowledge in the area. Using that page as a starting point, I started looking for digital images of the various manuscripts or codices (or is it codexes here?) that could answer my question.

I believe the Codex Vaticanus has the same word-for-word phrase as above, and it is one of the main sources used for the Bible today. But the Codex Sinaiticus Petropolitanus, on page 131 in the third column, has something different:
χξς

And for purposes of simplification, I have used an ending sigma when in fact the character is a stigma, which is a combined sigma and tau and quite archaic. Representing numbers using letters is quite common; I learned about it when I studied writing utensils for a term paper in college. So this doesn't surprise me. I believe both ancient Hebrew and Greek used this technique; perhaps more Semitic languages did as well.
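
For the curious, the letter-for-number scheme (the Milesian numerals) is simple to sketch: chi stands for 600, xi for 60, and stigma for 6, so the three letters read as a number give 666. A toy sketch of mine, covering only the letters in question:

    # Greek alphabetic (Milesian) numerals assign a value to each letter.
    # Only the three letters from Revelation 13:18 are included here.
    GREEK_VALUES = {"χ": 600, "ξ": 60, "ϛ": 6}  # ϛ is the stigma

    def greek_number(text):
        """Sum the numeric values of the letters (additive notation)."""
        return sum(GREEK_VALUES[ch] for ch in text)

    print(greek_number("χξϛ"))  # 666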

But someone brought up an idea on a blog that sent me further down the trail. This page presents an interesting idea: what if John wasn't trying to get across a number, but a set of characters he had never seen before?

Here's the thought process, hopefully put in a way that gives credit to the blog mentioned. John traveled to the future or saw a vision (yes, I believe one of these happened; for now, I'm leaning toward the former) and saw many things that were too awesome and incredible for him to comprehend. Yet Jesus instructed him to "write what you see... " (Rev. 1:11). As the languages that are spoken and written now did not exist in their present forms in the first century, John would not be able to read anything written in them; they would involve characters he did not know. So what if this mark of the beast was a set of characters? Using the word "mark" in v. 16 and v. 17 seems kind of silly if in fact it was just a number. He could have written "number." (To be honest, I hate arguing from what an inspired author would have done, but bear with me here.)

Now the author of the web page mentioned above goes on to show how the Arabic script for Allah could be mistaken for those three Greek characters, especially if your brain is wired to think in Greek. But he misunderstands something: the original manuscripts weren't written in minuscule (that is, lowercase Greek), but in uncial, or uppercase Greek. For our web author's premise to work, the original texts would have to have been written in minuscule--and that would then require that students be taught the minuscule forms. But they weren't. Uncial was the order of the day.

But the author depended on the Textus Receptus (written in minuscule), and not the papyri or uncials. So his hypothesis is, in the end, shown to be bogus.

Now while the end point of the author's post is bogus, I think he may have something in his starting point--maybe, potentially. Possibly. But here's the thing: it doesn't really matter in the end. While I don't think God is upset that I spent a day looking into this and understanding where our Scriptures came from, the bigger question is: how did I treat my family today? My neighbor? Did I share the gospel or live out the gospel today and show that my faith has meaning?


Allow me to quote from the NRSV for 1 Corinthians 13:2:

And if I have prophetic powers, and understand all mysteries and all knowledge, and if I have all faith, so as to remove mountains but do not have love, I am nothing.

Time for me to do something for the Kingdom.

Thursday, June 28, 2007

A big salute to Junk Food Science

I want to give a big shout-out to a new member of our list of sites on the right, and that is Junk Food Science. Sandy Szwarc does what I cannot do in the field of medicine: shine a light of pure, blinding logic on the world of medicine and health, specifically in tearing apart what the research does and does not say. Sandy, if you're reading, please know you have a huge fan here who reads you every day.

"Natural Language"

Microsoft, that bastion of Western capitalism, has a research project that it calls Natural Language Processing:

The goal of the Natural Language Processing (NLP) group is to design and build software that will analyze, understand, and generate languages that humans use naturally, so that eventually you will be able to address your computer as though you were addressing another person.

This is a very laudable goal. The problem is, they need to do something now: they need to make their operating system understandable by regular humans. That "natural language" should already be present in the text, dialogs, and error messages of the operating system itself.

There are many more examples of these issues on the web, but the problem boils down to this: Microsoft programmers often write their programs thinking they will be used by other Microsoft programmers and tested and debugged by Microsoft programmers, and therefore they don't end up programming for the customer. You know, that user at Small Business Shop in Smalltown, Ohio who got a business degree and is just trying to get his project done. He doesn't know what the heck a DLL is; he doesn't have any clue whether a dialog box is giving important information or just a warning. And he doesn't want to learn.



Microsoft needs to start using more natural text in its operating system. And now. Not years or decades from now, when the voice interface finally arrives.

And then, of course, Microsoft should do as they say, and not as they do.

Saturday, June 23, 2007

A Life of Caffeine

So I went to the doctor today (yes, on a Saturday) for a checkup. While I'm sitting in the waiting room, I pick up an old copy of US News and World Report. The cover story was on our society's dependence on caffeine to do all the stuff we do.

I didn't get a chance to actually read the article; for once, the doctor came a bit too quickly for my taste. But when I went to put the magazine back, I happened to see the ad on the back cover. I have no idea what it was attempting to sell, but the tag line? "Who knew two vanilla lattes could be so relaxing," or some such nonsense.

Don't magazine publishers check to make sure that the ads they place don't contradict any articles in the same edition? Doesn't it demonstrate a lack of attention to detail at the editorial level? If that's the case with something as simple as advertising, what does that mean for the articles they publish?

But since I am a geek, let's talk about caffeine.

According to cosic.com, caffeine, or 1,3,7-trimethylxanthine, as it's known chemically, is "the most widely consumed pharmacologically active substance in the world." Geeks have depended on caffeine for decades, but we picked that up from others. CoolNurse.com says that Chinese emperor Shen Nung drank strong, hot brewed tea. Coffee first appeared in Africa around 575 AD. We in the US switched to coffee in the eighteenth century, perhaps only partly because of the Boston Tea Party.

A 12-ounce can of Coca-Cola has 34 milligrams (mg) of caffeine. Cosic.com lists several other amounts, such as 60 mg for 5 ounces (150 milliliters) of regular filtered coffee. Most of the web sites I've seen suggested 300 mg/day as a safe level for most people, but some individuals are very sensitive to caffeine and should not have anywhere near that much.
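
Using those figures, it's easy to tally a day's intake against the 300 mg level. A quick sketch of the arithmetic (the beverage counts are, of course, made up):

    # Caffeine per serving, from the figures above (milligrams).
    COLA_MG = 34      # 12-ounce can of cola
    COFFEE_MG = 60    # 5 ounces (150 ml) of regular filtered coffee
    SAFE_LIMIT_MG = 300

    def daily_total(colas, coffees):
        return colas * COLA_MG + coffees * COFFEE_MG

    total = daily_total(colas=2, coffees=3)
    print(total, "mg:", "under" if total <= SAFE_LIMIT_MG else "over", "the 300 mg level")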

The concern in the US News and World Report article is the huge increase in consumption of caffeine across all age groups, but especially among growing children. According to the article, there have been no studies of the effects of caffeine on growing children. Then it proceeds to trot out individual kids who are using insane amounts of caffeinated beverages or moving up to methylphenidates to get serious long-term highs. The article goes into "upcoming medical crisis" mode.

First, you cannot make decisions about products based on individual stories. No matter how many times television news programs or written news articles trot out someone to be their example of the problem in question, such low-number experiential data points don't count. We can say something about caffeinated sodas, which have been on the market in roughly their current forms for over fifty years (closer to eighty). And in the decades these cola products have been on the market, there has not been any problem that warrants intervention. The quantities of caffeine in these products have been at a reasonable level, and parents have generally monitored their children's intake. In a quick search, the most I could find was holding Coca-Cola accountable for activities in Colombia and killing vending machines.

(And while I can appreciate why parents are concerned about their kids being on Ritalin and whether it is being incorrectly prescribed to too many children, I have a different take on it: I know two family members who took it, and in both cases the drug brought remarkable improvement. Ritalin is not evil, especially when it is properly dispensed.)

To me, the problem is not the caffeinated beverages, even the new ones like Red Bull, Cocaine, and such. As long as the product is not inherently dangerous, in our capitalist society companies should have the right to make and sell it. The problem is a lack of oversight by parents. Why aren't parents watching what beverages their children consume? My family watched over me, and I watched over my children. Why can't the US News article bring the focus around to "Parents, be aware of this"? Instead, it just seems to strike a "woe is us!" tone.

Update, 6/28/2007: Just fixed some typos in the text.

Saturday, June 16, 2007

The Future of Applications

So over in my list of links I've included Zoho. I've had it sitting there mostly as a placeholder so I could remind myself to write about the Web 2.0 sites that have been such a hot item lately. I don't have a complete list of such collaboration sites, but technologies and names involved include Zoho, Google Docs (of course), SimDesk, and several others listed here on Wikipedia.


When I first heard about the concept of using applications online (and I think it was Microsoft who originally was talking about it), I just couldn't get it. Why would folks want to use such applications over a network when the files could be intercepted, the applications were susceptible to network outages, and people didn't actually "own" the software? 

But I was introduced to Zoho at work, and I discovered the value of these online applications. I had a project that I had to do at home, and I needed to keep track of a mountain of information. In addition, I couldn't have my spouse constantly asking me where the info was.

So using Zoho Projects, I was able to set up a project that let me keep track of all the little things that needed to be done, arranged so my spouse could easily find information. And thus I discovered the two key attributes of online apps: (1) the ability to collaborate with others, and (2) the ability to find an app I needed without having to spend money on something I might use only once or twice.

I am still concerned about my information being online, but frankly most of us don't have anything that important or serious. I mean, if someone uses Zoho Projects to keep track of a house remodeling project, who really cares? I still don't think individual privacy should be compromised, but that doesn't mean we have to use such tools for things that are truly private. Obviously, if you are working with information that must be secure, using an online collaboration tool is still not a great idea. But these tools do fill a need, one that I had not considered before.

Friday, June 15, 2007

Disappointment, thy name is Jobs


As I am not paid to program on the Macintosh platform (even though I may support Macs), I was not able to go to WWDC this year. I did watch the keynote speech by Steve Jobs, and I have to say it was extremely disappointing.

All the features shown in Leopard? Mostly shown last year. iPhone? I'm sorry, but while I can see why Apple is moving in this direction, it doesn't interest me. Gaming? Here I must confess that I do not truly qualify as a geek, because gaming, role-playing included, doesn't interest me in the slightest. I know, shocking, even blasphemous, but true.

And I could swear, listening to Jobs present, that he seemed to know it, despite the classic black sweater and jeans. There was a slight levity to his voice. It must frustrate him to no end. Apple is seen (rightly or wrongly; I prefer to think rightly) as a pioneer of innovation. And they didn't have it this time. I think that's probably because of the time and manpower needed to get the iPhone to work. I really hope it pulls in the people the way the iPods did. I do.

Don't get me wrong. I want Leopard for a number of reasons: Time Machine, Spaces, and the cool background option in iChat. But I hope the iPhone was worth the delay. 

Equality of the "sexes"

In the spirit of cooperation and equality, I shall at least link to this article that talks about the Intel Xeon chips paired together in the V8 machine. The Intel Xeon chip was also in the Mac Pro I mentioned in this article.

It's slightly interesting to note the level of detail given at Hot Hardware as opposed to what appeared at Apple's site regarding their machine, although that reflects the difference in audience. We geeks love our details, even if we don't necessarily understand them all. (Of course, we would never admit this out loud.)

Tuesday, June 12, 2007

A little perfection


There's something great and wonderful and, dare I say, majestic about a well-made chocolate chip cookie. I'm not sure I can say that it touches the sublime, but perhaps this recipe could. The author, Nancy Rommelmann, shares this recipe and notes something that is true in many areas of life: knowledge is not expertise.
I know that, no matter how meticulously I explain a recipe – and people seem to think baking is a very meticulous business, despite my telling them, I rarely measure anything – it will not taste the same as mine. Why? Because I’ve been baking since I was seven, I’m good at it and, chances are, you’re not.

I am not trying to be snotty here. If you give me a violin, and tell me exactly and a thousand times how to play a Tchaikovsky concerto, I am not going to sound like Jascha Heifetz, even if I practice, for years. I might play well, but I will not play like him. The same holds true for baking... .

I like this. There is so much truth wrapped up in this statement that many people today seem not to understand. (Perhaps something in our culture is pushing us away from this knowledge? Human Resources departments sort of know this, but there is still a push to document your knowledge, it seems.) Quantifying expertise is impossible, but I think each person has a spark of something that allows them to take one or more areas and turn their knowledge into something more. Just because Joe Smith and I took the same four years of classes at Big State University and both majored in Big Computational Disciplines does not mean that our skill sets are the same. It doesn't mean that we bring the same abilities to the knowledge that we have.

And perhaps, if I may be bold, this is a part of that concept of being made in the image of God. Yes, we know that primates can create tools and use them, but has any primate taken that knowledge and developed a Home Depot in the jungle? I haven't seen one.

And yet, even though I may not have great baking expertise, I'm still thinking it's worth my time to try this cookie recipe. If nothing else, it will let me have some quality time with my family.

Saturday, June 9, 2007

I want it, NOW

While Minority Report made the interface popular, this story talks about how it exists now. It backs up what Microsoft is doing (posted here).

So is Microsoft the first vendor to sell this? More research will be needed to check on that. Right now, it appears defense applications are the only places where it will be used. Which means commercial applications will be years behind--except for Microsoft, that is. But their offering is so expensive as to be out of reach for most of us.


But the trend is clear: as computers have evolved, the interface has looked less and less like code and command lines and has become more intuitive, based on the way we interact with the world around us. There is but one downside to this: the number of people able to do the coding will continue to decrease, as few people will have the skills. The only way to avoid this is to improve the languages themselves. Even though many business applications based on COBOL still exist and still run, COBOL is as evil as can be, so I am more than happy to leave it in the dust heap of time.

Thursday, June 7, 2007

Entering the Holy of Holies


So we went to the dentist today. Nothing remarkable happened to my teeth, but something odd happened as I walked into the lobby.


The dentist is in a new building, and the lobby is decorated with a slightly Asian theme. Consequently, the receptionists prefer to play New Age music with an Asian sound, in minor keys with odd plucked notes here and there. I guess it's supposed to be soothing to those who would prefer being stuck in an elevator to being at a dentist's office.


So my immediate impression when I walked into the lobby was that I was walking into the Forbidden City. Now why would a dentist want his office to be seen as holy ground?


If they force me to start bowing and walking backwards in the dentist's presence, I'm finding a new dentist.

Thursday, May 31, 2007

The future of computing


Lots of new toys are going to be hitting the market soon; thanks to HotAir.com, here are several that have crossed my path recently:

  • Palm's Foleo: According to the video at this link, this is supposed to be a device that you use with your PDA when the small screen and lack of keyboard are an issue. If that's the case, then why on earth would I buy a PDA to begin with? Just put all the abilities of the PDA in the Foleo and be done. Now, instead of carrying my laptop, my cell phone, and my PDA, I'm carrying four things? I don't think so.


  • Microsoft Surface: This is interesting. First of all, it seems to be one of the first times that Microsoft has gotten into computer hardware beyond keyboards, mice, etc. (Don't quote me on that.) And they are using a paradigm that was developed and shown on video several months ago (alas, I cannot find that link, darn it). But there are issues with this: (a) it's incredibly bad, ergonomically speaking, to be looking down at a computer like that; (b) no keyboard is shown, so I'm guessing you use a virtual keyboard, which takes some getting used to; and I'm sure there are many more such problems.


I think the paradigm shown on the Microsoft Surface (yes, think Minority Report) is great for people who aren't computer geeks. It's a really nice, interactive way to work with concepts, but the horizontal surface makes it more limiting to me. It is reminiscent of the way people would look over photos at a coffee table, so I can see it hearkening back to an earlier time. The interaction with cell phones and credit cards is nice, but no credit card can do that now.


The Foleo is just stupid. I can't see a use for it at all. I simply can't. If it's too small to see on your PDA, then just look at it on your laptop. End of story. If taking a laptop around is too difficult, then the Foleo will still be too big.


Give me the Surface, but put it in my kitchen and let me use it for recipes, directions, etc. I start my day in my kitchen. Don't ever hand me a Foleo.

Icky poo


Yes, it's official. Cooties exist:


This may be the first study, though, that reveals an evolutionary basis to shopping preferences. Low-threshold revulsion makes sense, protecting our ancestors from eating rotten or poisonous food or touching animals that had died of infectious disease. The face of disgust--with the nose wrinkled and the eyes squinted as if against some pungent smell, and the tongue often protruding as if spitting something out--tells you a lot. "It was probably," says Fitzsimons, "a pretty good proxy for the germ theory of disease before anyone knew germs existed."


....


Strong preferences were just what the subjects exhibited. Any food that touched something perceived to be disgusting became immediately less desirable itself, though all of the products were in their original wrapping. The appeal of the food fell even if the two products were merely close together; an inch seemed to be the critical distance.

Notice that the first response regarding this research is to immediately link it to evolution, but not in a way that actually explains why this emotional response developed.


The author talks briefly about the germ theory, as noted in this quote, but the actual thing being tested here is the emotional response to objects.


If we were truly rational creatures, then the proximity of objects to each other would make no difference at all. We would acknowledge the wrapping around the foods and realize that no contamination was possible.


But we are not always rational creatures. A better study would develop this further and discern why we are not more rational in our approach to objects. But the focus is instead turned to marketing and improving the chances of redirecting the almighty dollar to one company's product instead of a competitor's.

Wednesday, May 30, 2007

To post or not to post

I didn't post for a few days (well, closer to three weeks) because (a) I was extraordinarily busy, (b) I had to do a lot of extra stuff at work, and (c) I had the audacity to take a vacation. Pray forgive me. I hope to start posting again on a regular basis starting this weekend.

Saturday, May 5, 2007

What technology can and cannot do


The New York Times (hat tip: SlashDot) noted that some schools are not going to continue to give laptops to students. Says the Times:


Many of these districts had sought to prepare their students for a technology-driven world and close the so-called digital divide between students who had computers at home and those who did not.

“After seven years, there was literally no evidence it had any impact on student achievement — none,” said Mark Lawson, the school board president here in Liverpool....



Yet school officials here and in several other places said laptops had been abused by students, did not fit into lesson plans, and showed little, if any, measurable effect on grades and test scores at a time of increased pressure to meet state standards. Districts have dropped laptop programs after resistance from teachers, logistical and technical problems, and escalating maintenance costs.

There is no way that laptops, or any other technology, will magically solve the problems in education today--or the problems in the workplace, the home, the church, or in politics.


We seem to think that technology can do all sorts of wonderful things, and it can--when someone designs the technology to do that. The downfall of the program here in Liverpool is that the lesson plans were not updated to reflect the use of technology. Just sitting a student in front of a computer without instruction or a plan leads to the disaster shown above. The same thing happens at work, at church, and at home: dropping a computer into the environment without a plan for training the people to use it, and without a plan for what software to use, is a recipe for disaster.


And just installing software to block pornography wasn't enough:


Soon, a room that used to be for the yearbook club became an on-site repair shop for the 80 to 100 machines that broke each month, with a “Laptop Help Desk” sign taped to the door. The school also repeatedly upgraded its online security to block access to sites for pornography, games and instant messaging — which some students said they had used to cheat on tests.

No plan had been put in place to support the technology. It was, in short, a disaster waiting to happen.


The sad part of this is that computers can and should be used in school, but they shouldn't be included in every class. They are tools that have more ability than a typewriter, so just teaching a keyboarding class is not enough. But it's up to each school district to decide what skills students need to have in order to succeed.


In churches and businesses, the focus should be on answering this question: What can the computer help us do to make our mission easier, better, or faster? Some businesses frankly cannot benefit from a computer, and if they do, the benefit is simply a kind of high-expense frosting.


Technology cannot replace face-to-face communication (although videoconferencing software can facilitate it). Technology cannot replace teachers (although you can share a teacher via online courses). We know this. So why do we continue to make these assumptions that we can simply drop money on a problem, throw a computer at it, and fix it?


Frankly, I think it's a desire by people to feel like they're doing something. They want to use up their budget money so they won't lose funds next year. They want to look like they are solving problems--without expending the brain power to come up with a complete solution.

Saturday, April 21, 2007

The root cause of today's Islamic fanaticism

Read it. Read it now.



A Lesson in Hate by David Von Drehle



It is a sad demonstration of the power of the pen.

Thursday, April 12, 2007

My own personal supercomputer


Ah, let us remember the days of yore, when we had to spend millions of dollars to get the power that we can get now for less than $25,000. Go back through history with me:



  1. The Cray-1, from 1976


    • Cost: $8.8 million

    • Speed: 16 megaflops (that is, 16 times one million floating-point operations per second)

    • Source: http://www.cray.com/about_cray/history.html


  2. The Cray Y-MP, from 1988


    • Cost: $40 million in 1991

    • Speed: 4.3 gigaflops (that is, 4.3 times one billion floating-point operations per second)

    • Source: http://www.iht.com/articles/2005/11/15/yourmoney/msft.php


  3. The new MacPro with eight cores


    • Cost: maxed out, no more than $25,000

    • Speed: estimated; we'll guess 41 gigaflops (that is, 41 times one billion floating-point operations per second)

    • Source: http://www.itjungle.com/breaking/bn111406-story01.html




You gotta love the advance of technology.
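
To put a hard number on that advance, here's a quick back-of-the-envelope comparison in Python, using only the figures quoted above (remember, the Mac Pro speed is just an estimate, and these prices ignore inflation):

  # Dollars per gigaflop for each machine, using the figures from the list above.
  machines = [
      ("Cray-1 (1976)", 8.8e6, 0.016),   # 16 megaflops = 0.016 gigaflops
      ("Cray Y-MP (1988)", 40e6, 4.3),   # price as reported in 1991
      ("Mac Pro (2007)", 25e3, 41.0),    # estimated speed, maxed-out price
  ]
  for name, cost_dollars, gigaflops in machines:
      print(f"{name}: ${cost_dollars / gigaflops:,.0f} per gigaflop")

That works out to roughly $550 million per gigaflop for the Cray-1 and about $610 per gigaflop for the Mac Pro--a drop of nearly six orders of magnitude in thirty years.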

Thursday, April 5, 2007

The science of Genesis 1


What, you say? You thought the Creation story was just faith and had nothing to do with pure, unadulterated, hard-core science? Alas, bloghopper, you have not stopped to think.


I don't want my blog to simply parrot other people's words, and lifting stuff wholesale is rude in the extreme. But I want to give Bryan a huge amount of credit for this piece. It just rocks.


Bryan at Hot Air linked to this page which talked about Dr. Francis Collins and how God showed up in the Human Genome Project. Then Bryan wrote:

Eight years at the Hubble Space Telescope project had a similar effect on me. I was never an atheist as Dr. Collins was, and I didn’t head up anything on the scale of the Human Genome Project, but examining the universe in detail through Hubble’s eye at first challenged, and then strengthened, my faith. For me, it was a supernova — Supernova 1987A, to be exact, and how its position 168,000 light-years from us makes it a TiVO writ large that we can use to figure out how large and old the universe is by yardsticking distances to it and other supernovas, eventually all the way out as far as we can see, and then rewinding back to the Big Bang. Genesis 1 turned out to be one of the most interesting and profound documents ever written, once you start to get the science of it all. The God of the Bible is the God of the genome is the God of the distant dying star. If you’re interested in the how and why of that, here’s an article I wrote a while back that attempts to explain some of it.


Now, you have to go read Bryan's article. I'm serious. You must read it in its entirety.


I'll wait.


[whistles to self]


Okay, if you haven't read the article(s), then the following may not make a lot of sense. But I'm going to assume that you were honorable and read the article.


One of the reasons why science works so well is that scientists who use the scientific method are critical thinkers and don't take things for granted. They probe, they test, they re-test; they gather data first, analyze it, and from that data produce hypotheses that they then test against brand new data. When it works right, knowledge builds from knowledge and has a stable footing underneath that can support new information, new insights, new theories. The facts dictate the theories. Kepler's idea was completely radical, but it worked because it was based on facts.


One of the problems with scientists (who are human, and not Vulcans, despite the great need for Vulcans in this area) is the tendency to assume that they have completely answered the questions in an area of research--that they have proven things beyond a shadow of a doubt and that the current model for this corner of the universe is the best there is. Perhaps for some it's intentional, but it can't be for all. I can't say whether it's laziness, or ego, or tenure, but it happens to all humans who like to call themselves "intellectuals": we know what we know.


This applies to the lay person who believes science trumps faith as well as to the scientist who can barely keep up with the research in his own field. (It's a side effect of the growth of knowledge in the modern era.) Part of it may be a type of modernist ego as well: only the knowledge of today is true and accurate. Only the science performed today is really meaningful. Those folks who lived before the Enlightenment couldn't possibly have a clue. Well, Socrates was okay, Plato had some points, and Pythagoras had a nifty idea, but really, they were the exception.


Why do we think this? Why are we so egocentric about the day and age we live in? In some ways, it is unfounded. We can only stand on the shoulders of the giants who came before us; if there are no giants, we cannot stand.


So we first suffer from modern scientific egoism. But the thing I love so much about Bryan's article is how it points out how non-mythic Genesis 1 is. How many times have I heard that Genesis 1 is a great metaphor, a myth, a story? Far too many to keep track of. And yet, how poorly it stands as a mythic story when compared to the creation stories of the other societies that Bryan lists! Really, he's right: Genesis 1 is boring from the standpoint of a storyteller.


But what really makes Bryan's article stand out for me is the application of Einstein's theory of relativity to Genesis 1. If Hugh Ross wanted to impress me, why didn't he do this? (Maybe he did, and I just missed it. If so, my fault. I'll update this page accordingly.) My position on the creation story has changed throughout my lifetime, but this explains several things wonderfully.


Here is the beauty of this solution, in which time expands:

Take your Genesis clock off the earth and set it for the whole universe. An hour to the universe, due to the mass and velocity difference, is an epoch to the tiny earth. A day to the universe, an era to the earth.
Is it cheating? You bet. Was Moses talking about "universe" days and not "earth days" when he wrote about the creation story? Well, if you look strictly at the Hebrew (which I have, although I don't have that analysis handy), it says "day", as in a 24-hour period. But because Moses writes from the perspective of God looking down at the earth--and God is omnipotent, omniscient, and omnipresent--it makes sense that the "day" refers to a day as seen from God's perspective, and potentially from that of the entire universe. So relativity could have played into this.
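
Just to get a feel for the scale such a "universe clock" would require, here's a minimal sketch. The ~13.7-billion-year age of the universe is my own assumption (it's the figure cosmologists currently quote), not a number from Bryan's article:

  # How much would time have to "expand" to fit ~13.7 billion years of
  # cosmic history into six creation days? (13.7e9 years is my assumption.)
  AGE_OF_UNIVERSE_YEARS = 13.7e9
  CREATION_DAYS = 6
  DAYS_PER_YEAR = 365.25

  years_per_creation_day = AGE_OF_UNIVERSE_YEARS / CREATION_DAYS
  dilation_factor = years_per_creation_day * DAYS_PER_YEAR  # earth days per "day"

  print(f"Each 'universe day' spans about {years_per_creation_day:.2e} earth years")
  print(f"That is a dilation factor of about {dilation_factor:.2e}")

Each "day" would cover a bit under 2.3 billion earth years--which is exactly the flavor of "an hour to the universe is an epoch to the tiny earth."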


Now, this doesn't address Robert Gentry's research, but his research and methodology have already been evaluated and found problematic. I have no problem with certain universal constants not being constant when the earth was formed. I have heard of some research that indicates this is true, but I cannot locate it easily at this time.


I will say this: Once Adam and Eve were created, I think (not sure, but assume at this point) that time ran at its usual speed for the earth's frame of reference. I don't see enough reason to believe that it took millions of years after Man populated the earth for us to get where we are. How long ago was creation, then? I'd like to split the difference. I think the young-earth creationists might be closer to the truth than we realize when you consider the time since the fall; but maybe there is room for an older earth, especially when we consider that the creation story starts at the formation of the earth, and not of the entire universe. (I reserve the right to revisit this topic after I've learned more Hebrew. Of course.)


Does this mean I won't associate with young-earth creationists, old-earth creationists, and purely evolutionary scientists? Not at all. How long the creation took is not as important to me as the fact that creation happened. In order for salvation to be necessary, there has to be a perfect order from which man fell--a reason for which Christ came to die for sin.

Monday, March 26, 2007

How green shall I be

So everyone I know is talking about this news article, in which a writer has decided to spend a year without toilet paper. The article says:

Welcome to Walden Pond, Fifth Avenue style. Isabella’s parents, Colin Beavan, 43, a writer of historical nonfiction, and Michelle Conlin, 39, a senior writer at Business Week, are four months into a yearlong lifestyle experiment they call No Impact. Its rules are evolving, as Mr. Beavan will tell you, but to date include eating only food (organically) grown within a 250-mile radius of Manhattan; (mostly) no shopping for anything except said food; producing no trash (except compost, see above); using no paper; and, most intriguingly, using no carbon-fueled transportation.

Mr. Beavan, who has written one book about the origins of forensic detective work and another about D-Day, said he was ready for a new subject, hoping to tread more lightly on the planet and maybe be an inspiration to others in the process.

Okay, so this is a project he can write a book about. Fair enough. But I had to do some calculations for myself to decide whether going a year without toilet paper was worth it.

(I did warn you there would be math in this blog.)

After visiting various sites and pages (no Wikipedia), the following information has been gleaned:
  • According to the manufacturers of Charmin, a roll of toilet paper lasts around five days. In my experience, it doesn't seem to last as long. So for my calculations, I'm going to say a roll will last 2 days. Given that assumption, in one year I will use 365 / 2 = 182.5 rolls.
  • On average, a roll of toilet paper will weigh 227 grams, which is just a hair over half a pound. To simplify the math, let's call it a half pound even. So in a year, I will use 182.5 * .5 = 91.25 pounds of toilet paper.
  • We can't really determine how much wood comes from a single tree because of the variability in tree sizes and wood extracted, so there is another figure used to define a "standard" amount of wood: a cord. A cord is a pile of round wood 4 feet wide, 8 feet long, and 4 feet high--128 cubic feet.
  • We can produce 1,000 pounds of toilet paper from one cord of wood. Given our half-pound estimate for a single roll, that means one cord yields 1000 / 0.5 = 2,000 rolls of toilet paper.
  • That means that from a single cord of wood I have 2000 / 182.5 = 10.959 years worth of toilet paper. From a single cord.
  • One nice little tidbit is that for each tree used for paper (and that means all types), five more are planted.
Now, I couldn't find any estimates for how old a tree is before it is harvested for paper, but that conversion gives me some comfort: I can continue to use toilet paper for the rest of my life and will only use about five more cords of wood (at roughly 11 years per cord, five cords cover more than 50 years) to clean my little hiney. I use far more paper in other areas of my life; I have no problem keeping the paper here.
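
For anyone who wants to check my figures, here's the same arithmetic as a small Python sketch; the constants are my estimates from the list above, not official data:

  # Back-of-the-envelope toilet paper math, using the estimates above.
  DAYS_PER_ROLL = 2        # my pessimistic estimate (Charmin claims ~5)
  POUNDS_PER_ROLL = 0.5    # ~227 grams, rounded to half a pound
  POUNDS_PER_CORD = 1000   # toilet paper yield from one cord of wood

  rolls_per_year = 365 / DAYS_PER_ROLL                # 182.5 rolls
  pounds_per_year = rolls_per_year * POUNDS_PER_ROLL  # 91.25 pounds
  rolls_per_cord = POUNDS_PER_CORD / POUNDS_PER_ROLL  # 2,000 rolls
  years_per_cord = rolls_per_cord / rolls_per_year    # ~10.96 years

  print(f"One cord of wood covers me for {years_per_cord:.3f} years")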

Plus, the one thing that drives me nuts about this discussion is that paper is a renewable resource, and there are people gainfully employed in making it. I really don't want to put them out of a job.

Friday, March 23, 2007

How do you think?

A friend forwarded me the following link, and somehow I found a chunk of time to sit down and watch it.




The fellow is Evan Sayet, a man who has several job titles but in this venue is best described as a political commentator. The title of his talk is "How Modern Liberals Think." And after watching it, I think he may have something. But it's just one chunk of a larger edifice.


My friends have tried to describe the different ways in which liberals and conservatives think. Rush Limbaugh makes a point of saying that he knows how liberals think; I have yet to feel competent enough to say anything close to that. I have had long talks with at least one good friend who is liberal (and who quite pointedly reminds me that this is not the same as leftist), and it would take us literally hours to find common ground from which we could move forward on any discussion. Why?


Sayet's thesis is summarized as follows (and I hope he allows me to regurgitate it for myself or I will never get it), as written on his blog:

[F]act, reason, evidence, logic, morality, decency and justice play no part in how Modern Liberals "think." These concepts are seen by the left as inherently bigoted, so fatally flawed by one's prejudices as to make rational, moral and intellectual thought nothing less than an act of evil. ...

[Victor Davis Hanson has defined modern liberalism as] "All cultures (must be seen as) equally good and equally valid" and to which I add only that, then, all behaviors stemming from these cultures must, too, be recognized as equally good and equally valid.

Since all cultures and behaviors are to be thought equally good and equally valid, the Modern Liberal believes that the outcomes of all behaviors must, too, be equally good. When in the real world different behaviors lead to different outcomes, the Modern Liberal simply must believe that some sort of injustice (likely due to bigotry and oppression) has been done.

The goal of the Modern Liberal, then, isn't to apply fact, reason, logic, evidence, morality, decency and justice in an effort to find the best of all possible explanations and policies but rather to manipulate these things in order to uphold their preordained conclusion that all things are the same.


I think he's on to something here. (He gives a lot of credit to Allan Bloom's book The Closing of the American Mind, and with good reason.) His reason as to why liberals think this way is just as important as how they think, and he covers it fairly early on in the video above. It goes back to looking at the entire history of the world with a broad brush and saying, "No one is perfect. No society is perfect. So that means the United States of America isn't perfect, and we shouldn't say we are, or even that we're any better than anyone else." The fact that we are light-years and eons away from the evil practices of the past merits no mention of progress or success; such facts serve only to demonstrate how far we are from perfect. That is an explanation of the mindset based on history. It's good, but it doesn't seem quite right to me.


(An aside: I thought it was odd when schools stopped giving medals for first, second, and third places at school competitions. "We don't want to hurt anyone's feelings," I was told. "We just want to make sure children have a healthy self-esteem." I thought it was an aberration, but I see that it's simply an offshoot of this mentality.)


So what is a better explanation for how this way of thinking developed? I have a guess, and pending further data and research it seems like a good hypothesis for now: I don't want to be told that I am bad, that I am evil, that I have--and here's the big, bad word--sinned. I don't want to be told that I'm not good enough for heaven, that there is a God out there who is measuring me and has decided I am not good enough to make it. So I will set up an elaborate system to protect me from that nasty realization. If no person, culture, or system is worse than any other, then I cannot be judged for sin.


Where did this mindset come from? I cannot say this for certain (again, at this point it is only a hypothesis), but as I have talked to people and analyzed myself, it seems that very few people make lifestyle, paradigm, or worldview decisions based on outside events. (Some do, but I cannot confirm that the number of people who do is large. I can't even say that I do, though I try to be rational.) Most people make decisions from the inside--in their gut, or their heart, or whatever you want to call it. It seems quite reasonable that Satan would take my guilt over the wrong I've done and try to recast it as a good; he would love to take the good news of Jesus and turn it on its ear by telling me--whispering into that gut, that heart, that internal decision maker--that churches are only focused on judgment, and that I am just as good as those hypocrites in church are. In fact, he whispers, that judgmental way of thinking is really the source of all the world's problems. We should stop judging people and just get along.


And so he's taken that argument and turned it into a lifestyle, a culture, a world-wide creed.


Do I blame the world's ills on Satan? No. But he seems quite willing to take advantage of any situation he can.


Praise God that he was able to save us--I certainly couldn't save myself.


Addendum: In discussing this point with the spouse, another thought came up that should be attached to this:


Many liberals have enshrined a type of fairness doctrine as a result of this desire to have everyone considered the same, regardless of situations, beliefs, culture, etc. This is usually manifested in a show of concern for the poor, the downtrodden, the oppressed. And it is an admirable show of concern for others.


But I believe what actually ends up being created is not a fairness doctrine but a perversion of fairness. Why? Because one aspect of fairness is judgment. Oh, liberals don't like to judge themselves, but they love to judge others for not creating their desired utopian society. And the judgment is never equal.


One example: the liberal fairness doctrine says that if a poor man steals bread, we shouldn't hold it against him. He's trying to keep himself alive and has no money to pay the baker. At the same time, the bakery where the man stole the bread is condemned for being profitable and thereby forcing the beggar to steal. But that judgment carries a built-in assumption about the baker's position. The beggar gets an excess of grace; the baker gets none.


Only when I put myself in the position of both the beggar and the baker can I find justice for both.


There's more here than we can discuss, but here's my goal in thinking about all this: How do I share Jesus' great gift with a liberal and get past the liberal mind to the heart?

Thursday, March 22, 2007

One of those days

I don't know how to describe my feelings after today. I had one of those moments where you just shake your head and hope that you never have to deal with this level of stupidity again--and heaven help you if it is your own stupidity.



Where I work, there is a security system installed by one of the big names in building security. The name isn't relevant. Because of my geekiness, I often get called in to help on all sorts of computer- or electronics-related issues. So I wasn't surprised when J. asked me to help out with the security system. Apparently we were getting messages on the system indicating that the backup battery was dead. She worked with a security guard to find out where the backup battery panel was stored--which room, which panel.



J. did some research and discovered what kind of battery we needed, so once the battery was purchased, she called the security guards to come in and open the room so the battery could be changed. As J. only works half-days, I ended up taking the call from a different security guard. He arrived and we went into the room where the system's backup battery was stored. He seemed to know exactly what he was doing. He went straight to a grey box mounted on the wall, opened it, and we proceeded to change the battery. It was really quite straightforward.



I was surprised that even after changing the battery we were still getting "dead battery" messages, but when we called the manufacturer of the system, they said the battery needed to charge up: it could take anywhere from 24 to 48 hours. That didn't quite sound right to me, but I thought their system might be slightly different, so I didn't say anything. According to the company, everything should be fine after two days.



Two days came and went. We were still getting "dead battery" messages. First we called the company that sold us the battery. They assured us that the battery was fully charged when we bought it, so it couldn't be a dead battery.



After several more phone calls, an engineer from the manufacturer arrived to help us. I was asked to talk to him. So we went into the room where the backup battery panel was located, and I followed him in--and watched, stunned, as he proceeded to a different panel, a white one.



"Wait a minute--that's the box with the backup battery?" I asked.



"Sure," he replied.



"So what is this box?" I said, pointing at the grey box the security guard and I had been messing with earlier.



The engineer opened it, and after looking over the wiring he said, "This appears to be a card security system. You know, the kind where you have to pass an access card through a card reader to gain access. It looks like it's been out of commission for years."



So the first security guard opened the wrong box. A second security guard went to the wrong box. And here's the worst part: the white box the engineer opened was clearly labeled with the security company's logo.