Intelligence explosion arguments don’t require Platonism

Scott Alexander in Astral Codex Ten:

Intelligence explosion arguments don’t require Platonism. They just require intelligence to exist in the normal fuzzy way that all concepts exist.

First, I’ll describe the normal way that concepts exist. I’ll have succeeded if I convince you that claims using the word “intelligence” are coherent and potentially true.

Second, I’ll argue, based on humans and animals, that these coherent-and-potentially-true things are actually true.

Third, I’ll argue that so far this has been the most fruitful way to think about AI, and people who try to think about it differently make worse AIs.

Finally, I’ll argue this is sufficient for ideas of “intelligence explosion” to be coherent.

More here.