I've done a lot of thinking on the issue, and after working through a few of the theoretical problems, I have to say: it's really kind of silly. The claim that an AI could improve its own design and thereby cause a runaway explosion of intelligence is quite simply wrong. In fact, the entire idea of the singularity rests on a mistaken notion of what intelligence is and how it functions — something people should have figured out at square one, before moving on and pontificating. Then again, it wasn't as if any of the AI programs were doing much, so people might as well have spent that time daydreaming about it.
Intelligence takes real work. You don't just have it. You build your brain out of the patterns you see and recognize, which means you need to be exposed to them, experience things, and test ideas against actual reality rather than trying to do physics in your own head. Intelligence is a bit like the scientific method automatically implemented on three pounds of neurons.
There is no such thing as a free lunch.
You can't make the scientific method run twenty times faster by using a bunch of science to improve the scientific method. You cannot use the real intellectual work of understanding things, predicting them, and knowing the underlying patterns to shortcut the actual work it takes to do those things. Just as doubling a functioning AI's processor speed wouldn't make it twice as smart, making a human experience two seconds for every second wouldn't make her smarter; it would simply let her reach the same wrong answer in half the time, without being one iota more intelligent (it would make her an unstoppable devil with a pointed stick, though).