Saturday, August 9, 2014

The site I used to reference a Richard Carrier answer from seems to be gone (tabee3i).

So I loaded up the Wayback Machine and snagged the interview.

I had cited it before in my post about life changes through various media.

I wouldn't normally post it here, but I'm worried it might vanish forever.




tabee3i: a home for Metaphysical Naturalists
By: Enki, November 5th, 2009
Richard Carrier is a world-recognized atheist philosopher, teacher, and historian. He holds a Ph.D. in Greco-Roman intellectual history from Columbia University. He is best known as the author of Sense and Goodness without God: A Defense of Metaphysical Naturalism, and for his writings on the Secular Web (also known as Internet Infidels), where he served as editor-in-chief for several years (now emeritus). He is a major contributor to The Empty Tomb and was also featured in the documentary film The God Who Wasn't There. Dr. Carrier has published many articles in books, magazines, and journals and has made many appearances across the US and on national television defending sound historical methods and the ethical worldview of secular naturalism.

I have contacted Dr. Carrier and asked him about Metaphysical Naturalism, Christianity, atheism in the Middle East, his political opinions, and personal life.

1- First, let me start by thanking you again for your time. Looking at the various definitions of 'nature' or 'natural' that Keith Augustine has discussed in his thesis "A Defense of Naturalism", I would love to hear your version of the definition.

I discuss this very thoroughly, with entertaining examples, here: Defining the Supernatural. I also have a forthcoming paper in Free Inquiry on the very issue of defining naturalism (perhaps next year; it's been languishing in their queue for years already). The title is "On Defining Naturalism as a Worldview," and by last report it will appear in the April/May issue of 2010, but it's been bumped before and may be again.

2- One of atheism's strengths is being the default position, in which it's not a claim but rather a response to a claim. Do you think this strength might be weakened, given that metaphysical naturalism is not only an assertion about what exists but goes beyond that to a full worldview?

I see it as entirely the other way around: mere atheism is the weaker position.

First, you can't go through life without a complete worldview, so in actual fact you have one whether you know it or not (unless you are insane, although often even then), so if you try to go around like a mere atheist, you are de facto going around with a completely unexamined, ill-tested, un-thought-out worldview, which you might not even be aware of even though you rely on it daily. On the one hand, Christians can take advantage of this fact. If they have thought their worldview through better than you have, they can easily expose the failures of yours, which leads to a serious weakness in mere atheism (as I'll explain in a moment). On the other hand, it's just dumb. You shouldn't be going around with a completely unexamined, ill-tested, un-thought-out worldview. Even if there were no religions. Thus, I say, stop doing that and start examining, testing, and thinking out your worldview, instead of pretending you don't have one.

I think the fear is that having a worldview commitment is equated with dogmatism and certainty, which is a fallacy. You can have a tentative worldview, with various components in various stages of uncertainty, and often revise your worldview without embarrassment (scientists do it all the time), even rest from time to time on unresolved sets of options at some points, but you still must have (and do have, whether you know it or not) some idea of the hierarchy of probabilities and possibilities. Even if one element of your worldview is highly uncertain, you are epistemically obligated to make sure it's still the most probable element of all known alternatives. Likewise, if you are unsure between, say, three different ways to answer a question, and so go around assuming any one of them may be correct, you are still epistemically obligated to make sure these options are not only the most probable of all known options but that they are equally probable to each other, otherwise you should be leaning in the direction of the most probable one, to some degree at least. If you do not do this, you will succumb to the folly of assuming all possible answers to a question are equally probable, which is not only nuts, it's a fallacy Christians routinely exploit.

Second, the modern Christian apologetic amounts to this: we have better explanations of all the so-far scientifically unexplained phenomena of the world than you do, therefore it is irrational not to see our worldview as presently the most probably correct. Taking a position of mere atheism is not only of no use against that apologetic, it's actually immediately defeated by it. There is only one way to validly respond to it. You have to prove the central premise false: they do not have better explanations of all the so-far scientifically unexplained phenomena of the world than we do. You can do this by agnostically articulating several equally good explanations, but at some point that just becomes pedantic and naive, because if you really did it competently, you'd realize even those "equally good" explanations, all of them, are defeated by an explanation that is in fact better. Thus, agnosticism is defeated by naturalism. Therefore it is agnosticism (and equivalently weak atheism) that is the weaker argument, not the other way around. And just as naturalism defeats agnosticism, it also a fortiori defeats Christianity by using their own apologetic against them: no, sir, in point of fact we have better explanations of all the so-far scientifically unexplained phenomena of the world than you do, therefore it is irrational not to see our worldview as presently the most probably correct.

I think the common mistake is to assume that claiming this is equivalent to declaring dogmatic certainty in naturalism. But that's the same fallacy I pointed out above. Saying naturalism is the most probably correct worldview on present evidence (and IMO it is, by a large margin; no other competitor even comes close, a fact that isn't always obvious to those not well informed of the actual facts) merely means it is more probable than the alternatives, not that it is itself decisively or undeniably certain. "More probable" does not mean "100%," or even "80%." It just means more. If the next most probable worldview is 20% probable, naturalism need only be 55% likely to be vastly more credible. I'm just making up numbers. But you see my point. Showing that we have better explanations for each peculiar fact is enough to refute Christianity. We need not assert that those explanations are therefore true, only that of all explanations so far conceived, those are far more likely to be correct than any others. That may change tomorrow as new information comes in, showing some other explanation even more credible still. But right now, we ought to believe what the evidence makes most likely. And once you realize that naturalism has a better explanation of everything than Christianity, you'll realize it has a better explanation of everything than any other worldview. Which leads to only one rational conclusion: we all should be naturalists. At least for now. Maybe future evidence will change our minds, but we have to go on what we know now. Leave the future for later.

Monday, August 4, 2014

New Theme.

The black was a drag, so I posted a very happy rainy day theme instead.

3D Olsen Noise

Java Source Code: http://godsnotwheregodsnot.blogspot.com/2015/02/olsen-noise-2d-java-code-updated.html


So I made a newer noise algorithm that goes beyond fractal diamond-square noise. I had previously removed the limitations on size and the memory issues, allowing proper paging.

Now I have gotten rid of the diamond-square step and the artifacts it produces, and allowed the algorithm to be easily extended to multiple dimensions.

http://www.gfycat.com/SecondhandDimpledGull

Basically, rather than doubling the size, applying the diamond elements, and applying the square elements, you do the following:

You upsample, increasing the size of each pixel to 2x in each dimension. Add noise to every pixel, reducing it depending on the iteration (my current reduction is Noise / (iteration + 1)). Apply a box blur (though any blur would work). And it's all done in my infinite field scoping scheme, wherein the base case is pure randomness and each later case is recursively scoped.

Update 9/22: The Java source code and demo now reduce noise as +bit@iteration. So iteration 7 flags the 7th bit, adding either +128 or +0; iteration 6 flags the 6th bit, +64 or +0; and so on. Doing this allows it to skip normalization, since the sum of all flagged bits (128 + 64 + ... + 2 + 1 = 255) means the end result will *always* fall between 0 and 255.
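
In code, that reduction rule looks something like this minimal sketch (the class and helper names are mine, as is the assumption that the base pass contributes bit 0; the linked source below is the real implementation):

public class BitOffsetSketch {
    // At iteration i the added offset is either 2^i or 0. randomBit stands in
    // for a deterministic per-pixel hash bit; the name is not the author's.
    static int noiseOffset(int randomBit, int iteration) {
        return randomBit << iteration; // iteration 7 -> +128 or +0; iteration 6 -> +64 or +0
    }

    public static void main(String[] args) {
        // Worst case: flagging a bit at every level 0..7 sums to exactly 255,
        // which is why no final normalization pass is needed.
        int max = 0;
        for (int i = 0; i <= 7; i++) max += noiseOffset(1, i);
        System.out.println(max); // prints 255
    }
}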

No more artifacts, the algorithm would be quick to implement on GPU hardware, and it doesn't change at N dimensions.

Update: While the algorithm was indeed made with GPU hardware in mind, and would be very quick to implement there (exactly as diamond-square would not), it does change at N dimensions, in that more of the roughness flows into the additional dimensions. Rather than averaging over 3^2 = 9 somewhat random pixels at a given level, a 3D box blur averages 3^3 = 27, so each level lands much closer to the mean. You might still get the desired effects by reducing the number of iterations.

I've also confirmed that a 2D slice of 3D noise looks the same as 2D noise. Since it's fractal, this should be expected. I don't think you can do things like turbulence bump-mapping as with simplex noise, because the absolute value of Olsen Noise is pretty much just fractal noise again. Fractals are fun like that.

Update: It's this fact about Olsen Noise that initially led to my false confirmation above. If you normalize it, regardless of whether it's excessively smooth or not, it will look like identical noise. If you want to go that route, then the noise won't change from 2D to 3D, because the narrower-ranged 3D noise will be zoomed in on and give the same appearance of roughness.

And since the noise is scoping, you can map it out in N dimensions. So not only could you make it go around corners without hard edges, like this paper is so happy with itself for doing; you simply go from wanting a 1x500x500 slice at 0,0 to wanting a 500x1x500 slice at 0,500. It would by definition be seamless.
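
As a hypothetical illustration (getTerrain3D and its corner-coordinate signature are my own invention for this example, not the author's actual API):

// Every value is a deterministic function of its absolute coordinates, so any
// two requested slices are cut from the same infinite field: wherever they
// abut, the values line up, and a texture can turn a corner with no seam.
int[][][] sliceA = getTerrain3D(0, 0, 0, 1, 500, 500, 7);     // a 1x500x500 slice at 0,0
int[][][] sliceB = getTerrain3D(0, 500, 0, 500, 501, 500, 7); // a 500x1x500 slice at 0,500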



And unlike other noise algorithms, it's fast and *simple*. In fact, it's a number of simplifications of diamond-square noise all rolled up in an infinite package (which is itself a simplification).


One can reduce the iterations with distance: far enough away from you, you have 4-pixel block sections, which are the same as the close bits but with one iteration dropped.

Update: Reducing the iterations in the demo can be seen as sampling at 2x2 per value. It's basically the same cost. You don't need to generate the full size and then reduce; you can just request the area scaled down by 2x2 at one fewer iteration.
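
A minimal sketch of that trick, in terms of the getTerrain routine outlined further below (the wrapper name sampleHalfRes is mine):

// Instead of generating the full-resolution area and downscaling it, halve the
// requested coordinates and drop one iteration: the same visual field at a
// quarter of the pixels, with one fewer upsample/blur/noise pass.
static int[][] sampleHalfRes(int x0, int y0, int x1, int y1, int iterations) {
    return getTerrain(x0 / 2, y0 / 2, x1 / 2, y1 / 2, iterations - 1);
}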

Sampled at 1:5
http://www.gfycat.com/ImpressionableShinyFlounder

Sampled at 1:20
http://www.gfycat.com/KnobbyBlissfulFairyfly


Wrapping:

If it were mission critical to have the noise wrap like old diamond-square noise, this could be done if the deterministic hash of the x, y, z, ..., iteration were taken as x MOD wrap, y MOD wrap, z MOD wrap with regard to the iteration; you would likely need to scope the wrapping. So if you wanted it to wrap at 100 with iterations equal to 7 (all my given examples here use 7 iterations), your deterministic random hash function would give you random offsets modded at 100. Then, at the call for iteration 6, your deterministic random hash function would give you random offsets looped at 51. And this would be independent of your requested set of pixels; it would do the recursive scoping to make sure the random variables sync up. But you could do awesome things like wrap at a specific (and different) x, y, and z. So you could make noise that wraps horizontally at 1000 but vertically at 50. In theory. I haven't managed to get it to work, and there could be some kind of desyncing that happens when one iteration is looping at 16 and the next at 31. It might require a multiple of 2 for the wrapping, or even a 2^(max iterations) wrapping, or nothing at all.

Wrapping is left to later. I'll settle for better than everything else.

Smoothness:
Smoothness is mostly a product of the number of iterations along with the drop-off rate of the randomness.

http://www.gfycat.com/LateRevolvingBlackmamba

Update: Algorithm Outline.
It occurs to me that I should have provided some pseudocode.


getTerrain(x0, y0, x1, y1, iterations) {
    if (iterations == 0) return matrixOf(random numbers);
    map = getTerrain(floor(x0/2) - 1, floor(y0/2) - 1, ceiling(x1/2), ceiling(y1/2), iterations - 1);
    make a newmap twice as large
    upsample map into newmap
    apply blur to newmap
    add deterministic random offset to all values in newmap (decreasing each iteration)
    return requested area from within newmap (this is typically newmap from [1, n-1], [1, n-1])
}
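
To make that concrete, here is a minimal, self-contained Java sketch of the outline above, using the bit-per-iteration reduction from the 9/22 update so results always land in [0, 255]. The class name, the hash function, and the exact padding are my own stand-ins; the author's actual source is linked just below.

import java.util.Arrays;

public class OlsenNoiseSketch {

    // Deterministic random bit for a pixel/iteration pair, so overlapping or
    // adjoining requests always reproduce the same field. The mixing constants
    // are an arbitrary choice of mine.
    static int hashBit(int x, int y, int iteration) {
        long h = x * 0x9E3779B97F4A7C15L ^ y * 0xC2B2AE3D27D4EB4FL ^ iteration * 0x165667B19E3779F9L;
        h *= 0xD6E8FEB86659FD93L;
        h ^= h >>> 32;
        return (int) (h & 1L);
    }

    // Returns noise for the rectangle [x0, x1) x [y0, y1) after `iterations` passes.
    static int[][] getTerrain(int x0, int y0, int x1, int y1, int iterations) {
        int width = x1 - x0, height = y1 - y0;
        if (iterations == 0) {
            // Base case: pure deterministic randomness (bit 0).
            int[][] base = new int[height][width];
            for (int y = 0; y < height; y++)
                for (int x = 0; x < width; x++)
                    base[y][x] = hashBit(x0 + x, y0 + y, 0);
            return base;
        }
        // Recurse on the half-scale region, padded so the blur has a border.
        int cx0 = Math.floorDiv(x0, 2) - 1, cy0 = Math.floorDiv(y0, 2) - 1;
        int cx1 = -Math.floorDiv(-x1, 2) + 1, cy1 = -Math.floorDiv(-y1, 2) + 1; // ceiling + pad
        int[][] map = getTerrain(cx0, cy0, cx1, cy1, iterations - 1);
        int mw = (cx1 - cx0) * 2, mh = (cy1 - cy0) * 2;

        // Upsample: every coarse pixel becomes a 2x2 block.
        int[][] up = new int[mh][mw];
        for (int y = 0; y < mh; y++)
            for (int x = 0; x < mw; x++)
                up[y][x] = map[y / 2][x / 2];

        // 3x3 box blur (any blur would work); edges average whatever is in range.
        int[][] blurred = new int[mh][mw];
        for (int y = 0; y < mh; y++)
            for (int x = 0; x < mw; x++) {
                int sum = 0, n = 0;
                for (int dy = -1; dy <= 1; dy++)
                    for (int dx = -1; dx <= 1; dx++) {
                        int yy = y + dy, xx = x + dx;
                        if (yy >= 0 && yy < mh && xx >= 0 && xx < mw) { sum += up[yy][xx]; n++; }
                    }
                blurred[y][x] = sum / n;
            }

        // Deterministic per-pixel offset: at iteration i, either +2^i or +0,
        // per the 9/22 update, so values never leave [0, 255].
        for (int y = 0; y < mh; y++)
            for (int x = 0; x < mw; x++)
                blurred[y][x] += hashBit(cx0 * 2 + x, cy0 * 2 + y, iterations) << iterations;

        // Crop the requested area out of the padded, upsampled map.
        int offX = x0 - cx0 * 2, offY = y0 - cy0 * 2;
        int[][] out = new int[height][];
        for (int y = 0; y < height; y++)
            out[y] = Arrays.copyOfRange(blurred[offY + y], offX, offX + width);
        return out;
    }

    public static void main(String[] args) {
        // A 16x16 patch of the infinite field; any overlapping request agrees with it.
        int[][] tile = getTerrain(100, 100, 116, 116, 7);
        for (int[] row : tile) System.out.println(Arrays.toString(row));
    }
}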

Update: Actual Java Source Code.
http://godsnotwheregodsnot.blogspot.com/2014/09/olsen-noise-source-code-in-java.html


Update: Demo.
http://tatarize.nfshost.com/OlsenNoise.htm

Update: 3D Noise Source Code, With Commenting.
http://pastebin.com/WJVyDxDR