In his *Theory of Self-Reproducing Automata*, John von Neumann, one of the fathers of the modern computer, tells us:
there is … this completely decisive property of complexity, that there exists a critical size below which the process of synthesis is degenerative, but above which the phenomenon of synthesis, if properly arranged, can become explosive, in other words, where syntheses of automata can proceed in such a manner that each automaton will produce other automata which are more complex and of higher potentialities than itself.1
This notion that complex systems can, below certain thresholds, begin to degenerate, but beyond other boundary lines suddenly shift into gear and begin to create still more complex systems with greater potential and adaptive capabilities, is now a cornerstone of certain forms of computing. It is upon this very principle of complexification that many popularizers of the singularity and of AI theory base their claims.