Apparently UD has an article called "My Failed Simulation," about a mathematician who wrote a program that failed to produce libraries with a simulation of all the particles on Earth... on his laptop. Now, I fully understand that the only people who read UD are evolutionists who think it's funny that Dembski is so stupid... but come on! Really?
Where to start? First off, "computers have come a long way" since 2000? Um, no. Over 8 years, a quick application of Moore's law gives 8 years / 18 months ≈ 5.3 doublings, and 2^5.3 ≈ 40. So that's the total difference: computers have gotten about 40 times faster since 2000, back when I was running my old 500 MHz computer, having gotten it for college. Well, now I run a dual-core 1.9 GHz Athlon 64 X2 3600+ (sounds pretty low-end for today, but the 3600+ Brisbane core is actually a thing of beauty; I bought it more for heat, power usage, and price-for-performance than for speed). So, what can this amazing computer do that the old computer couldn't? --- NOTHING! ABSOLUTELY NOTHING! Rather than taking "a few minutes," this amazing simulation would simply have taken a couple of hours in 2000 (time taken in 2008 × 40 = time taken in 2000).
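Here's that back-of-the-envelope Moore's law arithmetic as a quick sketch, using the classic 18-month doubling rule of thumb and the 2000-to-2008 window from above:

```python
# Rough Moore's-law speedup estimate, assuming performance doubles
# every 18 months (the classic rule of thumb).
years = 8                                  # 2000 -> 2008
doubling_period_years = 1.5                # 18 months
doublings = years / doubling_period_years  # ~5.3 doublings
speedup = 2 ** doublings                   # ~40x

print(f"{doublings:.1f} doublings -> roughly {speedup:.0f}x faster")
```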
This lying sack of crap wants me to believe that he simulated the entire planet on his laptop and didn't come up with libraries... oh la-di-da! If he had actually done this, it would overwhelm his memory; it isn't possible to do. He would need to simulate every chemical reaction in order to get life started. He would honestly need to track every last atom, and guess what? You can't do that with a laptop! The fact is it would take at the very least one atom to record information about one atom, and thus, at a minimum, you'd need your computer's memory to be the size of the planet. Also, he says he wrote it in Fortran! Because everybody simulates things in Fortran! Wait... that's perhaps the last programming language I'd use.
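A back-of-the-envelope check on the "memory the size of the planet" point. The Earth's mass is the standard figure; the average atomic mass of ~30 u is a rough assumption I'm plugging in (the planet is mostly iron, oxygen, silicon, and magnesium), and the 4 GB of laptop RAM is likewise a generous stand-in:

```python
# Order-of-magnitude estimate: atoms on Earth vs. bits in a laptop.
# The ~30 u average atomic mass and 4 GB RAM figure are assumptions.
earth_mass_kg = 5.97e24
avg_atomic_mass_kg = 30 * 1.66054e-27    # ~30 atomic mass units, in kg

atoms_on_earth = earth_mass_kg / avg_atomic_mass_kg  # ~1e50 atoms

laptop_ram_bits = 4 * 2**30 * 8          # a generous 4 GB of RAM, in bits

shortfall = atoms_on_earth / laptop_ram_bits
print(f"Atoms on Earth: ~{atoms_on_earth:.0e}")
print(f"Even one bit per atom leaves you short by a factor of ~{shortfall:.0e}")
```

Even if each atom needed only a single bit of state, the laptop is short by dozens of orders of magnitude.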
To show you what kind of processing power this takes, look no further than real computer scientists, who in March 2006 simulated a virus for 50 billionths of a second using U.S. National Center for Supercomputing Applications systems, with a program the researchers spent a DECADE creating called NAMD (NAnoscale Molecular Dynamics), written on top of a parallel processing language. Why use parallel processing? Oh yeah, BECAUSE IT TAKES A HELL OF A LOT OF PROCESSING POWER AND YOU NEED TO SPLIT THE WORK UP BETWEEN THOUSANDS OF COMPUTERS!
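To put a number on the gap, here's a crude scaling estimate from that virus run up to a whole-Earth, 4-billion-year simulation. The ~1 million atom count for the virus system is my assumption (the 50 billionths of a second is from above), and I'm assuming cost scales only linearly in atom count and simulated time, which is generous to the laptop:

```python
# Crude lower bound on how much more work a whole-Earth, 4-billion-year
# simulation is than the 2006 virus run. Linear scaling in atom count
# and simulated time is assumed (real molecular dynamics scales worse).
virus_atoms = 1e6             # rough atom count for the virus system (assumption)
virus_time_s = 50e-9          # 50 billionths of a second

earth_atoms = 1e50            # order-of-magnitude atom count for the planet
earth_time_s = 4e9 * 3.154e7  # 4 billion years, in seconds

work_ratio = (earth_atoms / virus_atoms) * (earth_time_s / virus_time_s)
print(f"Roughly {work_ratio:.0e} times more work than the virus run")
```

Something like 10^68 times the work of a simulation that already needed a decade of software development and a supercomputing center.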
Well, at least this would count as rather astounding work by an IDiot. Release the source code or the precompiled binary. I'd love to take a look at it. I mean, that's some impressive code you have there, and in Fortran! Wow. Show me that code and I'll take back EVERYTHING I've ever said about creationists or those who believe in Intelligent Design. In fact, this work should be published in a real journal. There are more than a few people who would be impressed with a world simulator that can run 4 billion years of every atom on Earth in a couple of minutes on a laptop. I know they don't give Nobel Prizes for Computer Science or Mathematics, but the secondary effects on chemical simulation alone should earn you a nod for the chemistry Nobel Prize.
This is amazing by itself, though it doesn't prove your suggestion that because running the simulation once didn't produce something, nothing could be produced. It's like rolling a die once and concluding that since you didn't roll a six, rolling a six is impossible.
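The dice analogy in numbers: the probability of seeing at least one six in n fair rolls is 1 − (5/6)^n, so a single miss tells you almost nothing:

```python
# Probability of rolling at least one six in n fair six-sided die rolls.
def p_at_least_one_six(n):
    return 1 - (5 / 6) ** n

print(f"1 roll:   {p_at_least_one_six(1):.3f}")   # ~0.167
print(f"10 rolls: {p_at_least_one_six(10):.3f}")  # ~0.838
print(f"50 rolls: {p_at_least_one_six(50):.3f}")  # near certainty
```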
Give me the code and I'll check through it, fix any problems, and run it a few more times to give you a better count of simulations. That would be far more interesting than running these evolutionary algorithms all the time, which, by the way, work exceedingly well and predictably create amazing results (not predictable results, but amazing results, predictably).