Singularitarianism is a moral philosophy based upon the belief that a technological singularity (the technological creation of smarter-than-human intelligence) is possible, and it advocates deliberate action to bring the singularity about and to ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of this technological development (often referred to as the Singularity), Singularitarians believe it is not only possible, but desirable if, and only if, it is guided safely. Accordingly, they may sometimes "dedicate their lives" to acting in ways they believe will contribute to its safe implementation.
The term "singularitarian" was originally defined by Extropian Mark Plus in 1991 to mean "one who believes the concept of a Singularity". This term has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about the Singularity.[1]
Ray Kurzweil, the author of the book The Singularity Is Near, defines a Singularitarian as someone "who understands the Singularity and who has reflected on its implications for his or her own life".[2]
In his 2000 essay "Singularitarian Principles", Eliezer Yudkowsky writes that four qualities define a Singularitarian.[3]
In July 2000, Eliezer Yudkowsky, Brian Atkins, and Sabine Atkins founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI. The Singularity Institute's writings argue that an AI with the ability to improve upon its own design (Seed AI) would rapidly lead to superintelligence. Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.
Many believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although the exact numbers are hard to quantify, Singularitarianism is presently a small movement. Other prominent Singularitarians include Ray Kurzweil and Nick Bostrom.
Many critics, often ridiculing the Singularity as "the Rapture for nerds", have dismissed Singularitarianism as a pseudoreligion of fringe science.[4] However, some green anarchist militants have taken Singularitarian rhetoric seriously enough to call for violent direct action to stop the Singularity.[5]