Mind Uploading - An Introduction
By Adam Kadmon

Uploading is the transfer of the brain's mind-pattern onto a different substrate (such as an advanced computer) that better serves the entity's ends. Uploading is a central concept in our vision of technological ascension; here I shall examine a few common objections, present some thought experiments, and make a few points of my own.

The most common initial reaction to uploading is absolute incomprehension, a sense of utter foreignness, followed by a deluge of mental associations with bad sci-fi and knee-jerk dismissal. It's not "natural". You won't be "you" after being put into a computer. This objection has been made many times before, and I certainly agree if we are thinking in terms of a virtual-reality life, or a computer I/O life within the limits (or minor extensions) of what is available today. However, think of the highly advanced computer into which you are uploaded as simply your brain, with every potential sensing device as your inputs (microphones for all sound frequencies, EMF detectors of every type, particle detectors of all kinds, smellers, feelers, tasters, DNA scanners, all manner of bio-detectors, plus internally and/or externally modulated pleasure generators) and every potential physical actuator as your outputs. Seen that way, I can hardly see why you would feel so limited by such a transition that you would rather be dead. In fact, I see no good reason why a truly advanced uploading setup could not provide everything a human body provides now, and more. Moreover, the current human body is proof in principle that this can be done: your computer (the brain) already drives your robot body via one signal network of nerves, while the body dynamically feeds information about itself back to the controlling brain via another.
With sufficiently advanced technology it will even be possible to feel and act as non-human bodies if we wish, and to directly control and sense all kinds of machines we have no experience of today. As some have suggested, if you want to travel the universe, you would do best to become a spaceship, with all of its necessary I/O and whatever other sensors and controllers you need or want to help you enjoy the adventure.
A long time ago, people would have been disturbed not only by the idea of having their body or brain replaced, but even by something as mild-seeming as having their photo taken or riding on a train or airplane. The intuitive aversion is not even as precise as "wanting to protect one's own identity"; it is just a foggy, irrational set of guesses made by our obsolete, evolution-rooted decision-making faculties, saying "this is probably more trouble than it's worth". Part of the human maturation process is learning which things are actually scary and bad, and which things merely seem bad at first but turn out to come in handy later. I assume you're on a computer of some sort right now (unless you're already an upload); many elderly people fear computers simply because they aren't used to them. Human beings find only very few types of change acceptable, even though the rational parts of our minds often remind us that change can be good.

The point is that it's useless to desire a cybernetic body with a (still) organic brain. Our consciousness, our identity, lives in the patterns of our neurons and isn't magically tied to their organic, evolved substrate; that substrate just happens to be the only currently available "hardware" ("wetware") capable of supporting human-level intelligence. Patterns can be picked up, modified, accelerated, decelerated, blended with other patterns, copied, and placed in a far wider variety of contexts than organic brains could ever experience. We may love life as ourselves very much, and feel protective about exactly who we are, advocating stasis over change, but eventually those who choose stasis will die off from natural causes and only the inorganic "immortals" will remain.
For anyone who lives for the sake of living, it is worth replacing the organic brain, which is prone to constant problems over time and across environments due to the inherently weak, imperfect nature of its substrate. The flesh has no long-term future, but "machines" do. The original "identity" is nothing more nor less than structure and patterns (i.e. software). If I replaced every molecule in your brain one by one, maintaining all the original intermolecular interfaces, nothing would be left of your original organic brain, yet you could not possibly tell the difference. Ray Kurzweil illustrates this problem with a scenario in which the original Ray (Ray1) is uploaded during his sleep, resulting in the creation of an exact copy of his mind (Ray2) in the computer. When Ray1 wakes up, he is informed that there now exists a Ray2 who swears that he is the original Ray, and who attempts to prove it by citing specific facts from Kurzweil's life. Now, here's a question: does Ray2 represent a successful experiment in preserving Ray1's identity/consciousness? Are the two the same person?
My answer is "no", because the two Rays would be forced to coexist, and their perceptions and experiences would differ and diverge with the passing of time. They would be almost exactly the same in every detail, but they would not form the same entity. For example, Ray1's mind cannot process Ray2's sensory data. The link has been broken; in fact, there was no link at all. Each has a separate mind and a separate copy of his consciousness. Uploading performed in this way would merely create a new being who, I might add, would deserve the same legal status as the original from whom he/she was derived. A perfect twin, but not the same person. Consequently, if we were to kill Ray1, we would kill the real Ray Kurzweil, i.e. his subjective experience would be one of death.
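The Ray1/Ray2 distinction can be sketched in a few lines of code; this is only an informal analogy (the names and "mind contents" are invented for illustration), but it captures the point that a perfect copy is a second entity whose experiences diverge from the first moment:

```python
import copy

# A "mind" as a simple data structure, purely for illustration.
ray1 = {"name": "Ray", "memories": ["childhood", "wrote a book"]}
ray2 = copy.deepcopy(ray1)   # the "upload": an exact snapshot of the mind

assert ray1 == ray2          # identical contents at the moment of copying
assert ray1 is not ray2      # ...but two distinct entities, not one

ray2["memories"].append("woke up inside the computer")
assert ray1 != ray2          # the two minds diverge with the first new experience
```

The `is not` check is the crux: equality of contents never implied identity of entities, so destroying `ray1` would not be harmless to `ray1`.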
There is also a different way of performing uploading, in which all neurons are gradually replaced by new ones. In this procedure no extra copies are made (otherwise a new copy would be forced to coexist with the original, which would amount to the creation of a new being). The idea is that our minds are just patterns, independent of the substrate on which they happen to run. As an example and informal proof of this, Kurzweil describes an ongoing, natural process in which the individual atoms that make up our brains are continuously replaced by different atoms. The process is analogous to a river, where the flow follows the same pattern while the water that makes up the flow is continuously replaced. The conclusion is that the pattern which produces consciousness is independent of the physical substrate of our minds. Kurzweil, however, goes on to equate "uploading where a new copy is made" with "gradual uploading", saying that neither approach preserves identity. I believe this is wrong. The second approach does appear to preserve identity and consciousness, since our identity is evidently not affected by the natural process in which our atoms are continuously replaced by other atoms of the same kind. Whether the practical consequences of this natural process can be extended to a procedure as seemingly alien and theoretical as uploading depends on one subtle observation: our minds replace their substrates all the time with units of the same kind (atoms replaced by other atoms simply implement this rule). The atoms do not possess intrinsic "magical" properties on which mind and identity depend; therefore the phrase "identity is preserved when a mind's substrate is replaced by atoms of the same kind" can be replaced by "identity is preserved when a mind's substrate is replaced by perfect equivalents of the mind's functional units".
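Gradual replacement can be sketched the same informal way, again as an analogy only: the container (the "mind") keeps its identity throughout while every unit of its substrate is swapped, one at a time, for a functional equivalent, like atoms in a brain or water in a river. Python's object identity (`id`) stands in for the continuity of the self here:

```python
# A "mind" as a container of substrate units, purely for illustration.
mind = ["neuron_0", "neuron_1", "neuron_2", "neuron_3"]
original_identity = id(mind)   # object identity stands in for "the same self"

for i in range(len(mind)):
    # Replace one unit at a time with an equivalent on a new substrate;
    # the container itself is never copied or recreated.
    mind[i] = mind[i].replace("neuron", "silicon")

assert id(mind) == original_identity   # the same entity, start to finish
assert mind == ["silicon_0", "silicon_1", "silicon_2", "silicon_3"]
```

Contrast this with the copy-based scenario above: there, a second container is created; here, there is only ever one.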
There are a few things here that need to be emphasized. Identity remains intact in a process where the mind's existing substrate is only gradually replaced by exact functional equivalents of its units. How big those units must be to avoid an identity crisis depends on the level of functionality we are able to reproduce: if we can create a functional equivalent of a given neuron, it shouldn't matter how it is implemented, as long as it performs exactly like the original. If the mind can be subjected to a process such as the one described above, my belief is that this kind of uploading would preserve a person's identity/consciousness.

All living organisms are no more than walking algorithms. What many individuals seem unable to grasp is that the complexity of a fully simulated "walking algorithm" is far, far beyond, and different in kind from, what we can now do with and within computers. No one was planning to upload anyone onto a Windows system. In the next decade or so, nanocomputing will become possible, and an insane amount of computing power will be dumped directly into our laps for us to play with and reshape our very souls for the better. Uploading is not a betrayal of humanity; in fact, it is doing humanity a favor. The true "core" of humanity used to be defined by white males; soon it will be defined by beings of intelligence and empathy beyond our conception. (If these entities are not empathetic, then the whole thought experiment is moot: you can't fight a being that thinks a million times faster than you. For that matter, it's not likely that an upload with a merely accelerated brain could fight a being with qualitative improvements in intelligence.) Morality is one of the manifestations of complexity. So is "free will". Humans are another, yet they can still be vastly improved.
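The notion of "functional equivalence" used above can also be made concrete: two implementations count as interchangeable when they produce identical outputs for every input we can probe, regardless of how they are built internally. The toy threshold model below is invented purely for illustration and stands in for a real neuron's far richer behaviour:

```python
def organic_neuron(x):
    """Original 'wetware' unit: fires when its input exceeds a threshold."""
    return 1 if x > 0.5 else 0

def silicon_neuron(x):
    """A different implementation with the same observable behaviour."""
    return int(x > 0.5)

# Probe both units across the input range: behaviour is identical throughout,
# which is all the replacement argument requires of a substitute unit.
assert all(organic_neuron(i / 100) == silicon_neuron(i / 100) for i in range(101))
```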
It's interesting to observe the "rebel against society" trend really picking up, but unfortunately, when you're constantly bitter about something it's hard to see it objectively, or to see that solutions might exist outside the box of your little dichotomy. What is important is to use all of reality in a manner that optimizes the individual's long-term, rational happiness. The high intelligence of humans simply allows them to do that more successfully than other animals can, in the same way that a transhuman or upload could pursue its goals far more effectively than a human being can. Enhancing one's life requires both a deep understanding of one's nature and of what is required for one's happiness (read: life enhancement), and then the application of one's best possible reasoning and evaluation (in terms of one's personal "life-tastes") to decide on one's actions in a continuously updating, dynamically self-observing, self-correcting manner.
Are they lying? Functional MRI holds the answer, scientists say (April 19, 2002). Four regions of the brain are more strongly activated when a person is lying than when he is telling the truth, according to images captured at the Health Science Center's Research Imaging Center (RIC). "This study is the first to directly see how the brain responds to lying," said Dr. Jia-Hong Gao, associate professor at the RIC and director of the U.S. studies. "Previous studies have been behavioral in scope, without the brain scanning component." ... "An extremely clever aspect of this study is that it is lie detection based on the difference in cognitive strategy between lying and telling the truth, rather than on an unreliable difference in emotional state," commented Dr. Peter T. Fox, director of the RIC and a co-author. "The polygraph (the most widely used method for lie detection) relies on heart rate, blood pressure and skin conductance (sweating) to detect increased anxiety. Polygraph fails in people who aren't anxious about lying. Lying skillfully requires mental calculations which are independent of emotional state; this is what was imaged here."

Building a Bridge to the Brain (March 02, 2003). Researchers are close to breakthroughs in neural interfaces, meaning we could soon mesh our minds with machines. Overview of the (then) state of the art in neurotech, by Immint's Bruce J. Klein.

World's first brain prosthesis revealed (New Scientist, March 12, 2003). The world's first brain prosthesis, an artificial hippocampus, is about to be tested in California. Unlike devices such as cochlear implants, which merely stimulate brain activity, this silicon chip implant will perform the same processes as the damaged part of the brain it is replacing.

Synapse chip taps into brain chemistry (New Scientist, March 24, 2003). A microchip that uses chemicals instead of pulses of electricity to stimulate neurons has been created.
It could open the way to implants that interact with our nervous system in a far more subtle way than is possible now.

UIC unveils world's most powerful MRI for decoding the human brain (September 21, 2004). The University of Illinois at Chicago unveiled today the world's most powerful magnetic resonance imaging machine for human studies, capable of imaging not just the anatomy but the metabolism within the brain. This advanced technology ushers in a new age of metabolic imaging that will help researchers understand the workings of the human brain, detect diseases before their clinical signs appear, develop targeted drug therapies for illnesses like stroke, and provide a better understanding of learning disabilities. Central to the technology is a 9.4-tesla magnet, larger than any other human-sized magnet, built by GE Healthcare, a unit of General Electric Company. The current industry standard for MRI systems is 1.5 tesla, which limits researchers to imaging water molecules; as a result, only anatomical changes can be detected and monitored. By contrast, the 9.4-tesla magnet, which is three times more powerful than current state-of-the-art clinical MRI magnets and more than 100,000 times stronger than the earth's magnetic field, will enable UIC researchers to detect signals from sodium, phosphorus, carbon, nitrogen, and oxygen, the metabolic building blocks of brain function and human thought. [This represents another significant step towards mind uploading, or at least towards post-cryogenic identity reconstruction. Theoretically, if you had your brain scanned, preferably several times a year, for several hours, and during various mental states, the resulting data (saved to DVD or some other medium) could be used at some future date for reference.
When combined with frozen, freeze-dried, or plastinated remains (the brain), DNA code, and various other information such as video recordings or writings, it should be a lot easier to either repair the original or create a high-fidelity digital copy. Ideally, detailed MRI scans should be part of the cryonics package deal. Of course, they could (and should) also be used for advanced biofeedback-based personal empowerment training (re/deprogramming) and lie detection.]

Paralysed man sends e-mail by thought (Nature, October 13, 2004). Brain chip reads mind by tapping straight into neurons. A pill-sized brain chip has allowed a quadriplegic man to check e-mail and play computer games using his thoughts. The device can tap into a hundred neurons at a time, and is the most sophisticated such implant tested in humans so far.
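The field-strength comparisons in the UIC item can be sanity-checked with a little arithmetic. The clinical and earth-field figures below are typical assumed values, not taken from the article (earth's surface field is roughly 25 to 65 microtesla depending on latitude):

```python
# Checking the quoted field-strength ratios for the UIC research magnet.
scanner = 9.4              # tesla, the UIC research magnet
clinical = 3.0             # tesla, assumed state-of-the-art clinical magnet of the era
industry_standard = 1.5    # tesla, the then-standard clinical MRI field
earth = 50e-6              # tesla, an assumed mid-latitude value for earth's field

assert scanner / clinical > 3         # consistent with "three times more powerful"
assert scanner / earth > 100_000      # consistent with "more than 100,000 times stronger"
```

Under these assumptions the ratios come out to roughly 3.1 and 188,000, matching the article's claims.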
Moving brain implant seeks out signals (New Scientist, November 10, 2004). A device that automatically moves electrodes through the brain to seek out the strongest signals is taking the idea of neural implants to a new level. Its developers at the California Institute of Technology in Pasadena say devices like this will be essential if brain implants are ever going to work. Implanted electrodes are usually unable to sense consistent neuronal signals for more than a few months, according to Igor Fineman, a neurosurgeon at the Huntington Hospital, also in Pasadena. This loss of sensitivity has a number of causes: the electrodes may shift following a slight knock or because of small changes in blood pressure; tissue building up on the electrodes may mask the signal; or the neurons emitting the signals can die. To get around these problems, Joel Burdick and Richard Andersen at Caltech have developed a device in which the electrodes sense where the strongest signal is coming from and move towards it. Their prototype, which is mounted on the skull, uses piezoelectric motors to move four electrodes independently of each other in 1-micrometre increments. It has successfully been used to decode motor signals in rats and intention signals in monkeys. The researchers say that within a year they expect to be able to fit a paralysed person with a microdrive implant that will allow them to control a computer cursor and navigate the web. Autonomous microdrives could also eventually be used in other types of implant, such as the deep brain stimulators used to treat Parkinson's disease.