Daily Archives: June 10, 2016

High Seas Forecast (Tropical Atlantic)

Posted: June 10, 2016 at 12:46 pm

000
FZNT02 KNHC 101546
HSFAT2

HIGH SEAS FORECAST
NWS NATIONAL HURRICANE CENTER MIAMI FL
1630 UTC FRI JUN 10 2016

SUPERSEDED BY NEXT ISSUANCE IN 6 HOURS

SEAS GIVEN AS SIGNIFICANT WAVE HEIGHT...WHICH IS THE AVERAGE HEIGHT OF THE HIGHEST 1/3 OF THE WAVES. INDIVIDUAL WAVES MAY BE MORE THAN TWICE THE SIGNIFICANT WAVE HEIGHT.

SECURITE

ATLANTIC FROM 07N TO 31N W OF 35W INCLUDING CARIBBEAN SEA AND GULF OF MEXICO.

SYNOPSIS VALID 1200 UTC FRI JUN 10.
24 HOUR FORECAST VALID 1200 UTC SAT JUN 11.
48 HOUR FORECAST VALID 1200 UTC SUN JUN 12.

.WARNINGS.
NONE.

.SYNOPSIS AND FORECAST.

ATLC STATIONARY FRONT 31N60W TO LOW PRES NEAR 29N73W 1016 MB TO 27N76W TO LOW PRES 28N80W. N OF 30N BETWEEN 60W AND 62W SW WINDS 20 TO 25 KT. SEAS TO 9 FT.

.12 HOUR FORECAST STATIONARY FRONT FROM 31N55W TO 27N68W TO LOW PRES NEAR 27N77W TO 27N80W. N OF 30N BETWEEN 52W TO 57W SW WINDS 20 TO 25 KT. SEAS TO 9 FT.

.24 HOUR FORECAST STATIONARY 31N55W TO 26N68W TO LOW PRES NEAR 26N76W 1015 MB TO 27N80W. WITHIN 90 NM NE OF LOW PRES WINDS 20 TO 25 KT. SEAS LESS THAN 8 FT.

.48 HOUR FORECAST STATIONARY FRONT FROM 31N53W TO 26N65W TO 27N80W. WINDS 20 KT OR LESS. SEAS LESS THAN 8 FT.

.CARIBBEAN FROM 11N TO 17N BETWEEN 68W TO 78W E WINDS 20 TO 25 KT. SEAS 8 TO 10 FT.

.24 HOUR FORECAST FROM 11N TO 15N BETWEEN 69W AND 78W E WINDS 20 TO 25 KT. SEAS 8 TO 10 FT. ELSEWHERE FROM 11N TO 17N BETWEEN 75W AND 82W WINDS 20 KT OR LESS. SEAS TO 9 FT IN E SWELL.

.48 HOUR FORECAST FROM 12N TO 13N BETWEEN 72W AND 74W NE TO E WINDS 20 TO 25 KT. SEAS LESS THAN 8 FT. FROM 11N TO 15N BETWEEN 74W AND 80W WINDS 20 KT OR LESS. SEAS 8 FT.

.CARIBBEAN 06 HOUR FORECAST S OF 17N BETWEEN 85W AND 87W...INCLUDING THE GULF OF HONDURAS...E TO SE WINDS 20 TO 25 KT. SEAS LESS THAN 8 FT.

.24 HOUR FORECAST WINDS 20 KT OR LESS. SEAS LESS THAN 8 FT.

.36 HOUR FORECAST S OF 18N W OF 85W...INCLUDING THE GULF OF HONDURAS...E TO SE WINDS 20 TO 25 KT. SEAS TO 8 FT.

.48 HOUR FORECAST WINDS 20 KT OR LESS. SEAS LESS THAN 8 FT.

.REMAINDER OF AREA WINDS 20 KT OR LESS. SEAS LESS THAN 8 FT.

$$
.FORECASTER CHRISTENSEN. NATIONAL HURRICANE CENTER.
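
The bulletin's definition of seas is easy to make concrete: significant wave height is the average of the highest one-third of the waves in a record, which is why individual waves can run well above it. A minimal Python sketch of that statistic (an illustration only, not part of the NWS product; the sample heights are invented):

    def significant_wave_height(heights):
        """Average of the highest one-third of observed wave heights (H1/3)."""
        if not heights:
            raise ValueError("need at least one wave height")
        ranked = sorted(heights, reverse=True)          # tallest waves first
        top_third = ranked[:max(1, len(ranked) // 3)]   # highest 1/3 of the record
        return sum(top_third) / len(top_third)

    # A short, made-up record of wave heights in feet.
    sample = [3.1, 4.0, 5.2, 6.8, 7.5, 8.9, 4.4, 5.0, 6.1, 9.3, 3.8, 7.0]
    print(round(significant_wave_height(sample), 1))    # 8.2, the mean of the 4 highest waves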

Read more from the original source:

High Seas Forecast (Tropical Atlantic)

Posted in High Seas | Comments Off on High Seas Forecast (Tropical Atlantic)

About Buying Islands – Private Islands Online

Posted: at 12:46 pm

Private Islands offers island buyers a host of services that make the process of finding a suitable property almost as enjoyable as owning one. From the largest inventory of islands available on the web to our weekly island blog, monthly island newsletter and bi-annual print publication, Private Islands gives buyers everything they need to discover their ideal island.

To learn more about our services, visit our corporate site http://www.privateislandsinc.com

Established in 1999, the Private Islands Online (PIO) website is an institution in the private island industry. The first service to unite the previously fragmented island business, our highly popular website is often the first and only place that owners, brokers and agents list their properties, ensuring the largest and most up-to-date selection of islands to choose from. With over 500 active listings featuring big, beautiful images as well as a host of new and innovative mapping tools, PIO is the most comprehensive information source for buyers. With approximately 4 million visitors a year, the largest inventory of islands available anywhere and the most relevant online resources, PIO is the center of the island world.

Learn more about advertising your island for sale on Private Islands Online

Available by subscription and at finer bookstores across the globe, Private Islands Magazine is the only publication of its kind dedicated to bringing buyers the latest inventory of islands available. Every issue contains over 100 pages of oversized, glossy images and articles featuring more than 50 island properties for sale or rent, making Private Islands Magazine a must-have for any serious island buyer.

http://www.privateislandsmag.com

Issued monthly, the Private Islands Newsletter introduces buyers to the newest properties to hit the market. Offering price and property changes, a window into upcoming investment regions as well as updates on company projects and general news of interest, The Private Islands Newsletter is an ideal means of keeping buyers informed with the latest events taking place in the island world.

Subscribe by clicking on "Newsletter" on the top right menu

Updated frequently, the Private Islands Blog gives buyers the chance to learn about the latest developments in the island world. From new, interesting and unusual island listings to the latest in island services and must-have accessories to updates on island policies and politics, the Private Islands Blog is the discerning buyer's island lifestyle guide.

http://www.privateislandsblog.com

The Private Islands Buyers Guide is an invaluable resource that offers you a wealth of information on the most important elements of private island ownership. From your initial search, to a few important questions to consider, to an overview of island regions, through to regulatory considerations and insights into developing your island, the Private Islands Buyers Guide will help direct you from the initial inquiries through to the final sale.

Private Island Buyer's Guide

The Virtual Island Broker helps connect serious buyers with privacy-conscious sellers. The VIB opens the doors to the hidden island market for buyers by enabling privacy-conscious sellers to market the details of their island confidentially. Our in-depth knowledge and years of experience in this complex field have enabled us to develop a technology for accurately matching serious buyers with their dream properties, whether or not the island is publicly listed. If you can't find your property on Private Islands Online, perhaps it is time to consider registering for the VIB.

Virtual Island Broker

Islands for Rent inspires potential buyers on the journey to private island ownership. With more than 200 islands - many available on an exclusive basis - Islands for Rent is your opportunity to test drive the island experience and find out exactly what kind of island is right for you.

rent.privateislandsonline.com

Original post:

About Buying Islands - Private Islands Online

Posted in Private Islands | Comments Off on About Buying Islands – Private Islands Online

Singularity University – Solving Humanity’s Grand Challenges

Posted: at 12:46 pm

What is Singularity University?

Our mission is to educate, inspire and empower leaders to apply exponential technologies to address humanity's grand challenges.

Gathering the world's leading innovators to drive industries forward and positively impact the world. Come join the movement!

Incubating companies, one experiment at a time

Eager to connect with SU enthusiasts near you? Learn more here

Thoughtful coverage on science, technology, and the singularity

Our custom program for Fortune 500 companies

Vivek Wadhwa

David Roberts

Salim Ismail

Kathryn Myronuk

Catherine Mohr

Neil Jacobstein

Ralph Merkle

Raymond McCauley

Marc Goodman

Daniel Kraft, MD

Brad Templeton

Gregg Maryniak

Robert Freitas

Andrew Hessel

Paul Saffo

Jonathan Knowles

Jeremy Howard

Eric Ries

Avi Reichental

Peter Diamandis

Ray Kurzweil

Nicholas Haan

John Hagel

Robert Hariri, MD, PhD

June 6th, 2016

A conference about the radical impact of future technology, featuring some of the world's most forward-thinking experts, is coming to Christchurch. The SingularityU summit will have its first event in Australasia in Christchurch later [...]

May 31st, 2016

Zinnov and Authentise announce a strategic partnership to help global companies through their digital manufacturing transitions across technologies such as 3D printing, IIoT and software automation.

NASA Research Park Building 20 S. Akron Rd. MS 20-1 Moffett Field CA 94035-0001 Phone: +1-650-200-3434

Singularity University, Singularity Hub, Singularity Summit, SU Labs, Singularity Labs, Exponential Medicine, Exponential Finance and all associated logos and design elements are trademarks and/or service marks of Singularity Education Group.

Singularity University is not a degree granting institution.

Continue reading here:

Singularity University - Solving Humanity's Grand Challenges

Posted in Singularity | Comments Off on Singularity University – Solving Humanity’s Grand Challenges

God Is the Machine | WIRED

Posted: at 12:46 pm


IN THE BEGINNING THERE WAS 0. AND THEN THERE WAS 1. A MIND-BENDING MEDITATION ON THE TRANSCENDENT POWER OF DIGITAL COMPUTATION.

At today's rates of compression, you could download the entire 3 billion digits of your DNA onto about four CDs. That 3-gigabyte genome sequence represents the prime coding information of a human body, your life as numbers. Biology, that pulsating mass of plant and animal flesh, is conceived by science today as an information process. As computers keep shrinking, we can imagine our complex bodies being numerically condensed to the size of two tiny cells. These micro-memory devices are called the egg and sperm. They are packed with information.
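
A rough consistency check of those figures, assuming the loose one-byte-per-digit encoding the article implies and a standard 700 MB data CD (a denser 2-bits-per-base encoding would fit on roughly one CD, but the article uses the looser figure):

    bases = 3_000_000_000         # approximate number of bases in the human genome
    bytes_per_base = 1            # the article's loose "1 digit = 1 byte" encoding
    cd_capacity = 700 * 10**6     # a standard data CD, in bytes

    genome_bytes = bases * bytes_per_base
    print(genome_bytes / 10**9)         # ~3.0, the "3-gigabyte genome"
    print(genome_bytes / cd_capacity)   # ~4.3, i.e. "about four CDs"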


That life might be information, as biologists propose, is far more intuitive than the corresponding idea that hard matter is information as well. When we bang a knee against a table leg, it sure doesn't feel like we knocked into information. But that's the idea many physicists are formulating.

The spooky nature of material things is not new. Once science examined matter below the level of fleeting quarks and muons, it knew the world was incorporeal. What could be less substantial than a realm built out of waves of quantum probabilities? And what could be weirder? Digital physics is both. It suggests that those strange and insubstantial quantum wavicles, along with everything else in the universe, are themselves made of nothing but 1s and 0s. The physical world itself is digital.

The scientist John Archibald Wheeler (coiner of the term "black hole") was onto this in the '80s. He claimed that, fundamentally, atoms are made up of bits of information. As he put it in a 1989 lecture, "Its are from bits." He elaborated: "Every it, every particle, every field of force, even the space-time continuum itself, derives its function, its meaning, its very existence entirely from binary choices, bits. What we call reality arises in the last analysis from the posing of yes/no questions."

To get a sense of the challenge of describing physics as a software program, picture three atoms: two hydrogen and one oxygen. Put on the magic glasses of digital physics and watch as the three atoms bind together to form a water molecule. As they merge, each seems to be calculating the optimal angle and distance at which to attach itself to the others. The oxygen atom uses yes/no decisions to evaluate all possible courses toward the hydrogen atom, then usually selects the optimal 104.45 degrees by moving toward the other hydrogen at that very angle. Every chemical bond is thus calculated.
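
As a toy illustration of that picture (and nothing more: the energy function below is invented, not chemistry), a chain of binary choices can home in on the familiar 104.45-degree angle by repeatedly asking which part of the remaining interval to discard:

    def toy_energy(angle_deg):
        """Hypothetical stand-in for a bond-energy curve, minimized at 104.45 degrees."""
        return (angle_deg - 104.45) ** 2

    def find_optimal_angle(lo=90.0, hi=120.0, tolerance=0.01):
        """Ternary search: each pass makes one yes/no decision about which
        part of the interval can be thrown away."""
        while hi - lo > tolerance:
            third = (hi - lo) / 3
            left, right = lo + third, hi - third
            if toy_energy(left) < toy_energy(right):   # the binary choice
                hi = right
            else:
                lo = left
        return (lo + hi) / 2

    print(round(find_optimal_angle(), 2))   # converges to ~104.45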

If this sounds like a simulation of physics, then you understand perfectly, because in a world made up of bits, physics is exactly the same as a simulation of physics. There's no difference in kind, just in degree of exactness. In the movie The Matrix, simulations are so good you can't tell if you're in one. In a universe run on bits, everything is a simulation.

An ultimate simulation needs an ultimate computer, and the new science of digitalism says that the universe itself is the ultimate computer, actually the only computer. Further, it says, all the computation of the human world, especially our puny little PCs, merely piggybacks on cycles of the great computer. Weaving together the esoteric teachings of quantum physics with the latest theories in computer science, pioneering digital thinkers are outlining a way of understanding all of physics as a form of computation.

From this perspective, computation seems almost a theological process. It takes as its fodder the primeval choice between yes or no, the fundamental state of 1 or 0. After stripping away all externalities, all material embellishments, what remains is the purest state of existence: here/not here. Am/not am. In the Old Testament, when Moses asks the Creator, "Who are you?" the being says, in effect, "Am." One bit. One almighty bit. Yes. One. Exist. It is the simplest statement possible.

All creation, from this perch, is made from this irreducible foundation. Every mountain, every star, the smallest salamander or woodland tick, each thought in our mind, each flight of a ball is but a web of elemental yes/nos woven together. If the theory of digital physics holds up, movement (f = ma), energy (E = mc²), gravity, dark matter, and antimatter can all be explained by elaborate programs of 1/0 decisions. Bits can be seen as a digital version of the "atoms" of classical Greece: the tiniest constituent of existence. But these new digital atoms are the basis not only of matter, as the Greeks thought, but of energy, motion, mind, and life.

From this perspective, computation, which juggles and manipulates these primal bits, is a silent reckoning that uses a small amount of energy to rearrange symbols. And its result is a signal that makes a difference, a difference that can be felt as a bruised knee. The input of computation is energy and information; the output is order, structure, extropy.

Our awakening to the true power of computation rests on two suspicions. The first is that computation can describe all things. To date, computer scientists have been able to encapsulate every logical argument, scientific equation, and literary work that we know about into the basic notation of computation. Now, with the advent of digital signal processing, we can capture video, music, and art in the same form. Even emotion is not immune. Researchers Cynthia Breazeal at MIT and Charles Guerin and Albert Mehrabian in Quebec have built Kismet and EMIR (Emotional Model for Intelligent Response), two systems that exhibit primitive feelings.

The second supposition is that all things can compute. We have begun to see that almost any kind of material can serve as a computer. Human brains, which are mostly water, compute fairly well. (The first "calculators" were clerical workers figuring mathematical tables by hand.) So can sticks and strings. In 1975, as an undergraduate student, engineer Danny Hillis constructed a digital computer out of skinny Tinkertoys. In 2000, Hillis designed a digital computer made of only steel and tungsten that is indirectly powered by human muscle. This slow-moving device turns a clock intended to tick for 10,000 years. He hasn't made a computer with pipes and pumps, but, he says, he could. Recently, scientists have used both quantum particles and minute strands of DNA to perform computations.

A third postulate ties the first two together into a remarkable new view: All computation is one.

In 1937, Alan Turing, Alonzo Church, and Emil Post worked out the logical underpinnings of useful computers. They called the most basic loop, which has become the foundation of all working computers, a finite-state machine. Based on their analysis of the finite-state machine, Turing and Church proved a theorem now bearing their names. Their conjecture states that any computation executed by one finite-state machine, writing on an infinite tape (known later as a Turing machine), can be done by any other finite-state machine on an infinite tape, no matter what its configuration. In other words, all computation is equivalent. They called this universal computation.
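
That equivalence is easier to appreciate with a concrete machine in front of you. Here is a minimal sketch (not from the article) of a finite-state machine stepping along a tape, with a two-rule example that flips every bit it reads and halts at the first blank:

    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
        """transitions maps (state, symbol) -> (new_state, symbol_to_write, head_move),
        with head_move in {-1, 0, +1}. Runs until the 'halt' state or max_steps."""
        cells = dict(enumerate(tape))
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            state, write, move = transitions[(state, symbol)]
            cells[head] = write
            head += move
        return "".join(cells[i] for i in sorted(cells))

    # A tiny machine: scan right, inverting 0s and 1s, and halt on the first blank cell.
    flip_bits = {
        ("start", "0"): ("start", "1", +1),
        ("start", "1"): ("start", "0", +1),
        ("start", "_"): ("halt", "_", 0),
    }
    print(run_turing_machine(flip_bits, "101100"))   # prints 010011_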

When John von Neumann and others jump-started the first electronic computers in the 1950s, they immediately began extending the laws of computation away from math proofs and into the natural world. They tentatively applied the laws of loops and cybernetics to ecology, culture, families, weather, and biological systems. Evolution and learning, they declared, were types of computation. Nature computed.

If nature computed, why not the entire universe? The first to put down on paper the outrageous idea of a universe-wide computer was science fiction writer Isaac Asimov. In his 1956 short story "The Last Question," humans create a computer smart enough to bootstrap new computers smarter than itself. These analytical engines recursively grow super smarter and super bigger until they act as a single giant computer filling the universe. At each stage of development, humans ask the mighty machine if it knows how to reverse entropy. Each time it answers: "Insufficient data for a meaningful reply." The story ends when human minds merge into the ultimate computer mind, which takes over the entire mass and energy of the universe. Then the universal computer figures out how to reverse entropy and create a universe.

Such a wacky idea was primed to be spoofed, and that's what Douglas Adams did when he wrote The Hitchhiker's Guide to the Galaxy. In Adams' story the earth is a computer, and to the world's last question it gives the answer: 42.

Few ideas are so preposterous that no one at all takes them seriously, and this idea that God, or at least the universe, might be the ultimate large-scale computer is actually less preposterous than most. The first scientist to consider it, minus the whimsy or irony, was Konrad Zuse, a little-known German who conceived of programmable digital computers 10 years before von Neumann and friends. In 1967, Zuse outlined his idea that the universe ran on a grid of cellular automata, or CA. Simultaneously, Ed Fredkin was considering the same idea. Self-educated, opinionated, and independently wealthy, Fredkin hung around early computer scientists exploring CAs. In the 1960s, he began to wonder if he could use computation as the basis for an understanding of physics.

Fredkin didn't make much headway until 1970, when mathematician John Conway unveiled the Game of Life, a particularly robust version of cellular automata. The Game of Life, as its name suggests, was a simple computational model that mimicked the growth and evolution of living things. Fredkin began to play with other CAs to see if they could mimic physics. You needed very large ones, but they seemed to scale up nicely, so he was soon fantasizing huge, really huge, CAs that would extend to include everything. Maybe the universe itself was nothing but a great CA.
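
The Game of Life itself takes only a few lines to state. A minimal sketch of one generation, using the standard rules (a dead cell with exactly three live neighbors is born; a live cell with two or three survives) and a set of live-cell coordinates:

    from collections import Counter

    def life_step(live):
        """One generation of Conway's Game of Life on an unbounded grid.
        `live` is a set of (x, y) coordinates of live cells."""
        neighbor_counts = Counter(
            (x + dx, y + dy)
            for x, y in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {
            cell
            for cell, count in neighbor_counts.items()
            if count == 3 or (count == 2 and cell in live)
        }

    # A "blinker" oscillates between a horizontal and a vertical bar of three cells.
    blinker = {(0, 1), (1, 1), (2, 1)}
    print(sorted(life_step(blinker)))   # [(1, 0), (1, 1), (1, 2)]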

The more Fredkin investigated the metaphor, the more real it looked to him. By the mid-'80s, he was saying things like, "I've come to the conclusion that the most concrete thing in the world is information."

Many of his colleagues felt that if Fredkin had left his observations at the level of metaphor ("the universe behaves as if it was a computer") he would have been more famous. As it is, Fredkin is not as well known as his colleague Marvin Minsky, who shares some of his views. Fredkin insisted, flouting moderation, that the universe is a large field of cellular automata, not merely like one, and that everything we see and feel is information.

Many others besides Fredkin recognized the beauty of CAs as a model for investigating the real world. One of the early explorers was the prodigy Stephen Wolfram. Wolfram took the lead in systematically investigating possible CA structures in the early 1980s. By programmatically tweaking the rules in tens of thousands of alterations, then running them out and visually inspecting them, he acquired a sense of what was possible. He was able to generate patterns identical to those seen in seashells, animal skins, leaves, and sea creatures. His simple rules could generate a wildly complicated beauty, just as life could. Wolfram was working from the same inspiration that Fredkin did: The universe seems to behave like a vast cellular automaton.
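
The simplest family Wolfram catalogued is the "elementary" cellular automaton: one row of two-state cells, each updated from its own value and its two neighbors, with the update table packed into a rule number from 0 to 255. A minimal sketch running Rule 30, one of the rules he found producing seashell-like, seemingly random patterns from a single cell:

    def elementary_ca(rule, cells, steps):
        """Evolve a 1-D, two-state, nearest-neighbor cellular automaton.
        The 8 bits of `rule` give the next state for each 3-cell neighborhood."""
        rows = [cells]
        for _ in range(steps):
            padded = [0] + cells + [0]   # cells beyond the edge are treated as 0
            cells = [
                (rule >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
                for i in range(1, len(padded) - 1)
            ]
            rows.append(cells)
        return rows

    # Rule 30 from a single live cell, printed as a small triangle.
    width = 31
    start = [0] * width
    start[width // 2] = 1
    for row in elementary_ca(30, start, 15):
        print("".join("#" if cell else " " for cell in row))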

Even the infinitesimally small and nutty realm of the quantum can't escape this sort of binary logic. We describe a quantum-level particle's existence as a continuous field of probabilities, which seems to blur the sharp distinction of is/isn't. Yet this uncertainty resolves as soon as information makes a difference (as in, as soon as it's measured). At that moment, all other possibilities collapse to leave only the single yes/no state. Indeed, the very term "quantum" suggests an indefinite realm constantly resolving into discrete increments, precise yes/no states.

For years, Wolfram explored the notion of universal computation in earnest (and in secret) while he built a business selling his popular software Mathematica. So convinced was he of the benefits of looking at the world as a gigantic Turing machine that he penned a 1,200-page magnum opus he modestly calls A New Kind of Science. Self-published in 2002, the book reinterprets nearly every field of science in terms of computation: "All processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computation." (See "The Man Who Cracked the Code to Everything," Wired 10.6.)

Wolfram's key advance, however, is more subtly brilliant, and depends on the old Turing-Church hypothesis: All finite-state machines are equivalent. One computer can do anything another can do. This is why your Mac can, with proper software, pretend to be a PC or, with sufficient memory, a slow supercomputer. Wolfram demonstrates that the outputs of this universal computation are also computationally equivalent. Your brain and the physics of a cup being filled with water are equivalent, he says: for your mind to compute a thought and the universe to compute water particles falling, both require the same universal process.

If, as Fredkin and Wolfram suggest, all movement, all actions, all nouns, all functions, all states, all we see, hear, measure, and feel are various elaborate cathedrals built out of this single ubiquitous process, then the foundations of our knowledge are in for a galactic-scale revisioning in the coming decades. Already, the dream of devising a computational explanation for gravity, the speed of light, muons, Higgs bosons, momentum, and molecules has become the holy grail of theoretical physics. It would be a unified explanation of physics (digital physics), relativity (digital relativity), evolution (digital evolution and life), quantum mechanics, and computation itself, and at the bottom of it all would be squirming piles of the universal elements: loops of yes/no bits. Ed Fredkin has been busy honing his idea of digital physics and is completing a book called Digital Mechanics. Others, including Oxford theoretical physicist David Deutsch, are working on the same problem. Deutsch wants to go beyond physics and weave together four golden threads (epistemology, physics, evolutionary theory, and quantum computing) to produce what is unashamedly referred to by researchers as the Theory of Everything. Based on the primitives of quantum computation, it would swallow all other theories.

Any large computer these days can emulate a computer of some other design. You have Dell computers running Amigas. The Amigas could, if anyone wanted them to, run Commodores. There is no end to how many nested worlds can be built. So imagine what a universal computer might do. If you had a universally equivalent engine, you could pop it in anywhere, including inside the inside of something else. And if you had a universe-sized computer, it could run all kinds of recursive worlds; it could, for instance, simulate an entire galaxy.

If smaller worlds have smaller worlds running within them, however, there has to be a platform that runs the first among them. If the universe is a computer, where is it running? Fredkin says that all this work happens on the "Other." The Other, he says, could be another universe, another dimension, another something. It's just not in this universe, and so he doesn't care too much about it. In other words, he punts. David Deutsch has a different theory. "The universality of computation is the most profound thing in the universe," he says. Since computation is absolutely independent of the "hardware" it runs on, studying it can tell us nothing about the nature or existence of that platform. Deutsch concludes it does not exist: "The universe is not a program running somewhere else. It is a universal computer, and there is nothing outside of it."

Strangely, nearly every mapper of this new digitalism foresees human-made computers taking over the natural universal computer. This is in part because they see nothing to stop the rapid expansion of computation, and in part because, well, why not? But if the entire universe is computing, why build our own expensive machines, especially when chip fabs cost several billion dollars to construct? Tommaso Toffoli, a quantum computer researcher, puts it best: "In a sense, nature has been continually computing the 'next state' of the universe for billions of years; all we have to do, and, actually, all we can do, is 'hitch a ride' on this huge, ongoing Great Computation."

In a June 2002 article published in the Physical Review Letters, MIT professor Seth Lloyd posed this question: If the universe was a computer, how powerful would it be? By analyzing the computing potential of quantum particles, he calculated the upper limit of how much computing power the entire universe (as we know it) has contained since the beginning of time. It's a large number: 10^120 logical operations. There are two interpretations of this number. One is that it represents the performance "specs" of the ultimate computer. The other is that it's the amount required to simulate the universe on a quantum computer. Both statements illustrate the tautological nature of a digital universe: Every computer is the computer.

Continuing in this vein, Lloyd estimated the total amount of computation that has been accomplished by all human-made computers that have ever run. He came up with 10^31 ops. (Because of the fantastic doubling of Moore's law, over half of this total was produced in the past two years!) He then tallied up the total energy-matter available in the known universe and divided that by the total energy-matter of human computers expanding at the rate of Moore's law. "We need 300 Moore's law doublings, or 600 years at one doubling every two years," he figures, "before all the available energy in the universe is taken up in computing. Of course, if one takes the perspective that the universe is already essentially performing a computation, then we don't have to wait at all. In this case, we may just have to wait for 600 years until the universe is running Windows or Linux."
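
Those numbers hang together, as a quick order-of-magnitude check on the quoted figures shows (this is only arithmetic on the article's numbers, not Lloyd's actual derivation):

    import math

    universe_ops = 10 ** 120     # Lloyd's upper bound on operations since the beginning of time
    human_ops = 10 ** 31         # his estimate of all operations by human-made computers so far
    doublings = 300              # the quoted number of Moore's-law doublings
    years_per_doubling = 2

    growth = 2 ** doublings
    print(round(math.log10(growth)))               # ~90 orders of magnitude of growth
    print(round(math.log10(human_ops * growth)))   # ~121, in the ballpark of the 10**120 bound
    print(doublings * years_per_doubling)          # 600 years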

The relative nearness of 600 years says more about exponential increases than it does about computers. Neither Lloyd nor any other scientist mentioned here realistically expects a second universal computer in 600 years. But what Lloyd's calculation proves is that over the long term, there is nothing theoretical to stop the expansion of computers. "In the end, the whole of space and its contents will be the computer. The universe will in the end consist, literally, of intelligent thought processes," David Deutsch proclaims in The Fabric of Reality. These assertions echo those of the physicist Freeman Dyson, who also sees minds amplified by computers expanding into the cosmos "infinite in all directions."

Yet while there is no theoretical hitch to an ever-expanding computer matrix that may in the end resemble Asimov's universal machine, no one wants to see themselves as someone else's program running on someone else's computer. Put that way, life seems a bit secondhand.

Yet the notion that our existence is derived, like a string of bits, is an old and familiar one. Central to the evolution of Western civilization from its early Hellenistic roots has been the notion of logic, abstraction, and disembodied information. The saintly Christian guru John writes from Greece in the first century: "In the beginning was the Word, and the Word was with God, and the Word was God." Charles Babbage, credited with constructing the first computer in 1832, saw the world as one gigantic instantiation of a calculating machine, hammered out of brass by God. He argued that in this heavenly computer universe, miracles were accomplished by divinely altering the rules of computation. Even miracles were logical bits, manipulated by God.

There's still confusion. Is God the Word itself, the Ultimate Software and Source Code, or is God the Ultimate Programmer? Or is God the necessary Other, the off-universe platform where this universe is computed?

But each of these three possibilities has at its root the mystical doctrine of universal computation. Somehow, according to digitalism, we are linked to one another, all beings alive and inert, because we share, as John Wheeler said, "at the bottom, at a very deep bottom, in most instances, an immaterial source." This commonality, spoken of by mystics of many beliefs in different terms, also has a scientific name: computation. Bits, minute logical atoms, spiritual in form, amass into quantum quarks and gravity waves, raw thoughts and rapid motions.

The computation of these bits is a precise, definable, yet invisible process that is immaterial yet produces matter.

"Computation is a process that is perhaps the process," says Danny Hillis, whose new book, The Pattern on the Stone, explains the formidable nature of computation. "It has an almost mystical character because it seems to have some deep relationship to the underlying order of the universe. Exactly what that relationship is, we cannot say. At least for now."

Probably the trippiest science book ever written is The Physics of Immortality, by Frank Tipler. If this book was labeled standard science fiction, no one would notice, but Tipler is a reputable physicist and Tulane University professor who writes papers for the International Journal of Theoretical Physics. In Immortality, he uses current understandings of cosmology and computation to declare that all living beings will be bodily resurrected after the universe dies. His argument runs roughly as follows: As the universe collapses upon itself in the last minutes of time, the final space-time singularity creates (just once) infinite energy and computing capacity. In other words, as the giant universal computer keeps shrinking in size, its power increases to the point at which it can simulate precisely the entire historical universe, past and present and possible. He calls this state the Omega Point. It is a computational space that can resurrect "from the dead" all the minds and bodies that have ever lived. The weird thing is that Tipler was an atheist when he developed this theory and discounted as mere "coincidence" the parallels between his ideas and the Christian doctrine of Heavenly Resurrection. Since then, he says, science has convinced him that the two may be identical.

While not everyone goes along with Tipler's eschatological speculations, theorists like Deutsch endorse his physics. An Omega Computer is possible and probably likely, they say.

I asked Tipler which side of the Fredkin gap he is on. Does he go along with the weak version of the ultimate computer, the metaphorical one, that says the universe only seems like a computer? Or does he embrace Fredkin's strong version, that the universe is a 12 billion-year-old computer and we are the killer app? "I regard the two statements as equivalent," he answered. "If the universe in all ways acts as if it was a computer, then what meaning could there be in saying that it is not a computer?"

Only hubris.

Continued here:

God Is the Machine | WIRED

Posted in Extropy | Comments Off on God Is the Machine | WIRED

Space exploration – Wikipedia, the free encyclopedia

Posted: at 12:45 pm

Space exploration is the ongoing discovery and exploration of celestial structures in outer space by means of continuously evolving and growing space technology. While the study of space is carried out mainly by astronomers with telescopes, the physical exploration of space is conducted both by unmanned robotic probes and human spaceflight.

While the observation of objects in space, known as astronomy, predates reliable recorded history, it was the development of large and relatively efficient rockets during the early 20th century that allowed physical space exploration to become a reality. Common rationales for exploring space include advancing scientific research, national prestige, uniting different nations, ensuring the future survival of humanity, and developing military and strategic advantages against other countries.[1]

Space exploration has often been used as a proxy competition for geopolitical rivalries such as the Cold War. The early era of space exploration was driven by a "Space Race" between the Soviet Union and the United States. The launch of the first human-made object to orbit Earth, the Soviet Union's Sputnik 1, on 4 October 1957, and the first Moon landing by the American Apollo 11 mission on 20 July 1969 are often taken as landmarks for this initial period. The Soviet space program achieved many of the first milestones, including the first living being in orbit in 1957, the first human spaceflight (Yuri Gagarin aboard Vostok 1) in 1961, the first spacewalk (by Aleksei Leonov) on 18 March 1965, the first automatic landing on another celestial body in 1966, and the launch of the first space station (Salyut 1) in 1971.

After the first 20 years of exploration, focus shifted from one-off flights to renewable hardware, such as the Space Shuttle program, and from competition to cooperation as with the International Space Station (ISS).

With the substantial completion of the ISS[2] following STS-133 in March 2011, plans for space exploration by the USA remain in flux. Constellation, a Bush Administration program for a return to the Moon by 2020[3] was judged inadequately funded and unrealistic by an expert review panel reporting in 2009.[4] The Obama Administration proposed a revision of Constellation in 2010 to focus on the development of the capability for crewed missions beyond low Earth orbit (LEO), envisioning extending the operation of the ISS beyond 2020, transferring the development of launch vehicles for human crews from NASA to the private sector, and developing technology to enable missions beyond LEO, such as Earth-Moon L1, the Moon, Earth-Sun L2, near-Earth asteroids, and Phobos or Mars orbit.[5]

In the 2000s, the People's Republic of China initiated a successful manned spaceflight program, while the European Union, Japan, and India have also planned future manned space missions. China, Russia, Japan, and India have advocated manned missions to the Moon during the 21st century, while the European Union has advocated manned missions to both the Moon and Mars during the 21st century.

From the 1990s onwards, private interests began promoting space tourism and then private space exploration of the Moon (see Google Lunar X Prize).

The highest known projectiles prior to the rockets of the 1940s were the shells of the Paris Gun, a type of German long-range siege gun, which reached at least 40 kilometers altitude during World War One.[6] Steps towards putting a human-made object into space were taken by German scientists during World War II while testing the V-2 rocket, which became the first human-made object in space on 3 October 1942 with the launching of the A-4. After the war, the U.S. used German scientists and their captured rockets in programs for both military and civilian research. The first scientific exploration from space was the cosmic radiation experiment launched by the U.S. on a V-2 rocket on 10 May 1946.[7] The first images of Earth taken from space followed the same year[8][9] while the first animal experiment saw fruit flies lifted into space in 1947, both also on modified V-2s launched by Americans. Starting in 1947, the Soviets, also with the help of German teams, launched sub-orbital V-2 rockets and their own variant, the R-1, including radiation and animal experiments on some flights. These suborbital experiments only allowed a very short time in space which limited their usefulness.

The first successful orbital launch was of the Soviet unmanned Sputnik 1 ("Satellite 1") mission on 4 October 1957. The satellite weighed about 83 kg (183 lb), and is believed to have orbited Earth at a height of about 250 km (160 mi). It had two radio transmitters (20 and 40 MHz), which emitted "beeps" that could be heard by radios around the globe. Analysis of the radio signals was used to gather information about the electron density of the ionosphere, while temperature and pressure data was encoded in the duration of radio beeps. The results indicated that the satellite was not punctured by a meteoroid. Sputnik 1 was launched by an R-7 rocket. It burned up upon re-entry on 3 January 1958.

The second satellite was Sputnik 2. Launched by the USSR in November 1957, it carried the dog Laika.

This success led to an escalation of the American space program, which unsuccessfully attempted to launch a Vanguard satellite into orbit two months later. On 31 January 1958, the U.S. successfully orbited Explorer 1 on a Juno rocket. In the meantime, the Soviet dog Laika became the first animal in orbit on 3 November 1957.

The first successful human spaceflight was Vostok 1 ("East 1"), carrying 27-year-old Russian cosmonaut Yuri Gagarin on 12 April 1961. The spacecraft completed one orbit around the globe, lasting about 1 hour and 48 minutes. Gagarin's flight resonated around the world; it was a demonstration of the advanced Soviet space program and it opened an entirely new era in space exploration: human spaceflight.

The U.S. first launched a person into space within a month of Vostok 1 with Alan Shepard's suborbital flight in Mercury-Redstone 3. Orbital flight was achieved by the United States when John Glenn's Mercury-Atlas 6 orbited Earth on 20 February 1962.

Valentina Tereshkova, the first woman in space, orbited Earth 48 times aboard Vostok 6 on 16 June 1963.

China first launched a person into space 42 years after the launch of Vostok 1, on 15 October 2003, with the flight of Yang Liwei aboard the Shenzhou 5 (Spaceboat 5) spacecraft.

The first artificial object to reach another celestial body was Luna 2 in 1959.[10] The first automatic landing on another celestial body was performed by Luna 9[11] in 1966. Luna 10 became the first artificial satellite of the Moon.[12]

The first manned landing on another celestial body was performed by Apollo 11 on 20 July 1969.

The first successful interplanetary flyby was the 1962 Mariner 2 flyby of Venus (closest approach 34,773 kilometers). The other planets were first flown by in 1965 for Mars by Mariner 4, 1973 for Jupiter by Pioneer 10, 1974 for Mercury by Mariner 10, 1979 for Saturn by Pioneer 11, 1986 for Uranus by Voyager 2, 1989 for Neptune by Voyager 2. In 2015, the dwarf planets Ceres and Pluto were orbited by Dawn and passed by New Horizons, respectively.

The first interplanetary surface mission to return at least limited surface data from another planet was the 1970 landing of Venera 7 on Venus which returned data to Earth for 23 minutes. In 1971 the Mars 3 mission achieved the first soft landing on Mars returning data for almost 20 seconds. Later much longer duration surface missions were achieved, including over 6 years of Mars surface operation by Viking 1 from 1975 to 1982 and over 2 hours of transmission from the surface of Venus by Venera 13 in 1982, the longest ever Soviet planetary surface mission.

The dream of stepping into the outer reaches of Earth's atmosphere was driven by the fiction of Jules Verne[13][14][15] and H. G. Wells,[16] and rocket technology was developed to try to realise this vision. The German V-2 was the first rocket to travel into space, overcoming the problems of thrust and material failure. During the final days of World War II this technology was obtained by both the Americans and Soviets, as were its designers. The initial driving force for further development of the technology was a weapons race for intercontinental ballistic missiles (ICBMs) to be used as long-range carriers for fast nuclear weapon delivery, but in 1961 when the Soviet Union launched the first man into space, the United States declared itself to be in a "Space Race" with the Soviets.

Konstantin Tsiolkovsky, Robert Goddard, Hermann Oberth, and Reinhold Tiling laid the groundwork of rocketry in the early years of the 20th century.

Wernher von Braun was the lead rocket engineer for Nazi Germany's World War II V-2 rocket project. In the last days of the war he led a caravan of workers in the German rocket program to the American lines, where they surrendered and were brought to the USA to work on U.S. rocket development ("Operation Paperclip"). He acquired American citizenship and led the team that developed and launched Explorer 1, the first American satellite. Von Braun later led the team at NASA's Marshall Space Flight Center which developed the Saturn V moon rocket.

Initially the race for space was often led by Sergei Korolev, whose legacy includes both the R7 and Soyuz, which remain in service to this day. Korolev was the mastermind behind the first satellite, first man (and first woman) in orbit and first spacewalk. Until his death his identity was a closely guarded state secret; not even his mother knew that he was responsible for creating the Soviet space program.

Kerim Kerimov was one of the founders of the Soviet space program and was one of the lead architects behind the first human spaceflight (Vostok 1) alongside Sergey Korolyov. After Korolyov's death in 1966, Kerimov became the lead scientist of the Soviet space program and was responsible for the launch of the first space stations from 1971 to 1991, including the Salyut and Mir series, and their precursors in 1967, the Cosmos 186 and Cosmos 188.[17][18]

Although the Sun will probably not be physically explored at all, the study of the Sun has nevertheless been a major focus of space exploration. Being above the atmosphere, and in particular outside Earth's magnetic field, gives access to the solar wind and to infrared and ultraviolet radiation that cannot reach Earth's surface. The Sun generates most space weather, which can affect power generation and transmission systems on Earth and interfere with, and even damage, satellites and space probes. Numerous spacecraft dedicated to observing the Sun have been launched and still others have had solar observation as a secondary objective. Solar Probe Plus, planned for a 2018 launch, will approach the Sun to within 1/8th the orbit of Mercury.

Mercury remains the least explored of the inner planets. As of May 2013, the Mariner 10 and MESSENGER missions have been the only missions that have made close observations of Mercury. MESSENGER entered orbit around Mercury in March 2011, to further investigate the observations made by Mariner 10 in 1975 (Munsell, 2006b).

A third mission to Mercury, scheduled to arrive in 2020, BepiColombo is to include two probes. BepiColombo is a joint mission between Japan and the European Space Agency. MESSENGER and BepiColombo are intended to gather complementary data to help scientists understand many of the mysteries discovered by Mariner 10's flybys.

Flights to other planets within the Solar System are accomplished at a cost in energy, which is described by the net change in velocity of the spacecraft, or delta-v. Due to the relatively high delta-v to reach Mercury and its proximity to the Sun, it is difficult to explore and orbits around it are rather unstable.
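
Delta-v is the budget currency of trajectory design, and the standard Tsiolkovsky rocket equation (not quoted in the article, but implicit whenever delta-v is discussed) shows why each extra kilometer per second is so expensive; the exhaust velocity below is an assumed, illustrative figure for a good chemical engine:

    import math

    def delta_v(exhaust_velocity, wet_mass, dry_mass):
        """Tsiolkovsky rocket equation: dv = v_e * ln(m_wet / m_dry)."""
        return exhaust_velocity * math.log(wet_mass / dry_mass)

    def propellant_fraction(dv, exhaust_velocity):
        """Fraction of the initial mass that must be propellant to achieve dv."""
        return 1 - math.exp(-dv / exhaust_velocity)

    v_exhaust = 4400.0   # m/s, roughly a hydrogen/oxygen upper stage
    for dv in (3000, 6000, 9000):   # m/s
        print(dv, round(propellant_fraction(dv, v_exhaust), 2))   # 0.49, 0.74, 0.87
    # Each added increment of delta-v demands an exponentially larger share of propellant.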

Venus was the first target of interplanetary flyby and lander missions and, despite one of the most hostile surface environments in the Solar System, has had more landers sent to it (nearly all from the Soviet Union) than any other planet in the Solar System. The first successful Venus flyby was the American Mariner 2 spacecraft, which flew past Venus in 1962. Mariner 2 has been followed by several other flybys by multiple space agencies often as part of missions using a Venus flyby to provide a gravitational assist en route to other celestial bodies. In 1967 Venera 4 became the first probe to enter and directly examine the atmosphere of Venus. In 1970, Venera 7 became the first successful lander to reach the surface of Venus and by 1985 it had been followed by eight additional successful Soviet Venus landers which provided images and other direct surface data. Starting in 1975 with the Soviet orbiter Venera 9 some ten successful orbiter missions have been sent to Venus, including later missions which were able to map the surface of Venus using radar to pierce the obscuring atmosphere.

Space exploration has been used as a tool to understand Earth as a celestial object in its own right. Orbital missions can provide data for Earth that can be difficult or impossible to obtain from a purely ground-based point of reference.

For example, the existence of the Van Allen radiation belts was unknown until their discovery by the United States' first artificial satellite, Explorer 1. These belts contain radiation trapped by Earth's magnetic fields, which currently renders construction of habitable space stations above 1,000 km impractical. Following this early unexpected discovery, a large number of Earth observation satellites have been deployed specifically to explore Earth from a space-based perspective. These satellites have significantly contributed to the understanding of a variety of Earth-based phenomena. For instance, the hole in the ozone layer was found by an artificial satellite that was exploring Earth's atmosphere, and satellites have allowed for the discovery of archeological sites or geological formations that were difficult or impossible to otherwise identify.

The Moon was the first celestial body to be the object of space exploration. It holds the distinctions of being the first remote celestial object to be flown by, orbited, and landed upon by spacecraft, and the only remote celestial object ever to be visited by humans.

In 1959 the Soviets obtained the first images of the far side of the Moon, never previously visible to humans. The U.S. exploration of the Moon began with the Ranger 4 impactor in 1962. Starting in 1966 the Soviets successfully deployed a number of landers to the Moon which were able to obtain data directly from the Moon's surface; just four months later, Surveyor 1 marked the debut of a successful series of U.S. landers. The Soviet unmanned missions culminated in the Lunokhod program in the early 1970s, which included the first unmanned rovers and also successfully brought lunar soil samples to Earth for study. This marked the first (and to date the only) automated return of extraterrestrial soil samples to Earth. Unmanned exploration of the Moon continues with various nations periodically deploying lunar orbiters, and in 2008 the Indian Moon Impact Probe.

Manned exploration of the Moon began in 1968 with the Apollo 8 mission that successfully orbited the Moon, the first time any extraterrestrial object was orbited by humans. In 1969, the Apollo 11 mission marked the first time humans set foot upon another world. Manned exploration of the Moon did not continue for long, however. The Apollo 17 mission in 1972 marked the most recent human visit there, and the next, Exploration Mission 2, is due to orbit the Moon in 2021. Robotic missions are still pursued vigorously.

The exploration of Mars has been an important part of the space exploration programs of the Soviet Union (later Russia), the United States, Europe, Japan and India. Dozens of robotic spacecraft, including orbiters, landers, and rovers, have been launched toward Mars since the 1960s. These missions were aimed at gathering data about current conditions and answering questions about the history of Mars. The questions raised by the scientific community are expected to not only give a better appreciation of the red planet but also yield further insight into the past, and possible future, of Earth.

The exploration of Mars has come at a considerable financial cost, with roughly two-thirds of all spacecraft destined for Mars failing before completing their missions, and some failing before they even began. Such a high failure rate can be attributed to the complexity and large number of variables involved in an interplanetary journey, and has led researchers to jokingly speak of The Great Galactic Ghoul[19] which subsists on a diet of Mars probes. This phenomenon is also informally known as the Mars Curse.[20] In contrast to the overall high failure rates in the exploration of Mars, India has become the first country to achieve success on its maiden attempt. India's Mars Orbiter Mission (MOM)[21][22][23] is one of the least expensive interplanetary missions ever undertaken, with an approximate total cost of ₹450 crore (US$73 million).[24][25]

The Russian space mission Fobos-Grunt, which launched on 9 November 2011, experienced a failure that left it stranded in low Earth orbit.[26] It was to begin exploration of Phobos and of Martian orbit, and to study whether the moons of Mars, or at least Phobos, could be a "trans-shipment point" for spaceships travelling to Mars.[27]

The exploration of Jupiter has consisted solely of a number of automated NASA spacecraft visiting the planet since 1973. A large majority of the missions have been "flybys", in which detailed observations are taken without the probe landing or entering orbit, such as in the Pioneer and Voyager programs. The Galileo spacecraft is the only one to have orbited the planet. As Jupiter is believed to have only a relatively small rocky core and no real solid surface, a landing mission is nearly impossible.

Reaching Jupiter from Earth requires a delta-v of 9.2 km/s,[28] which is comparable to the 9.7 km/s delta-v needed to reach low Earth orbit.[29] Fortunately, gravity assists through planetary flybys can be used to reduce the energy required at launch to reach Jupiter, albeit at the cost of a significantly longer flight duration.[28]
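
Those budget figures can be loosely sanity-checked with an idealized two-body, Hohmann-transfer estimate. The sketch below computes only the injection burn from an assumed 300 km low Earth orbit onto a minimum-energy transfer to Jupiter, so it comes out lower than the cited 9.2 km/s, which covers more of the journey:

    import math

    MU_SUN = 1.327e20        # m^3/s^2, Sun's gravitational parameter
    MU_EARTH = 3.986e14      # m^3/s^2, Earth's gravitational parameter
    AU = 1.496e11            # m
    R_JUPITER = 5.20 * AU    # Jupiter's mean orbital radius
    R_LEO = 6.371e6 + 300e3  # radius of a 300 km low Earth orbit

    # Heliocentric Hohmann transfer: perihelion speed versus Earth's orbital speed.
    a_transfer = (AU + R_JUPITER) / 2
    v_perihelion = math.sqrt(MU_SUN * (2 / AU - 1 / a_transfer))
    v_earth = math.sqrt(MU_SUN / AU)
    v_infinity = v_perihelion - v_earth   # hyperbolic excess speed needed leaving Earth

    # Burn from circular LEO onto the escape hyperbola with that excess speed.
    v_leo = math.sqrt(MU_EARTH / R_LEO)
    v_departure = math.sqrt(v_infinity**2 + 2 * MU_EARTH / R_LEO)
    print(round((v_departure - v_leo) / 1000, 1), "km/s")   # ~6.3 km/s for the injection burn alone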

Jupiter has 67 known moons, and relatively little is known about many of them.

Saturn has been explored only through unmanned spacecraft launched by NASA, including one mission (CassiniHuygens) planned and executed in cooperation with other space agencies. These missions consist of flybys in 1979 by Pioneer 11, in 1980 by Voyager 1, in 1982 by Voyager 2 and an orbital mission by the Cassini spacecraft, which entered orbit in 2004 and is expected to continue its mission well into 2017.

Saturn has at least 62 known moons, although the exact number is debatable since Saturn's rings are made up of vast numbers of independently orbiting objects of varying sizes. The largest of the moons is Titan. Titan holds the distinction of being the only moon in the Solar System with an atmosphere denser and thicker than that of Earth. As a result of the deployment from the Cassini spacecraft of the Huygens probe and its successful landing on Titan, Titan also holds the distinction of being the only object in the outer Solar System that has been explored with a lander.

The exploration of Uranus has been entirely through the Voyager 2 spacecraft, with no other visits currently planned. Given its axial tilt of 97.77°, with its polar regions exposed to sunlight or darkness for long periods, scientists were not sure what to expect at Uranus. The closest approach to Uranus occurred on 24 January 1986. Voyager 2 studied the planet's unique atmosphere and magnetosphere. Voyager 2 also examined its ring system and the moons of Uranus, including all five of the previously known moons, while discovering an additional ten previously unknown moons.

Images of Uranus proved to have a very uniform appearance, with no evidence of the dramatic storms or atmospheric banding evident on Jupiter and Saturn. Great effort was required to even identify a few clouds in the images of the planet. The magnetosphere of Uranus, however, proved to be completely unique and proved to be profoundly affected by the planet's unusual axial tilt. In contrast to the bland appearance of Uranus itself, striking images were obtained of the Moons of Uranus, including evidence that Miranda had been unusually geologically active.

The exploration of Neptune began with the 25 August 1989 Voyager 2 flyby, the sole visit to the system as of 2014. The possibility of a Neptune Orbiter has been discussed, but no other missions have been given serious thought.

Although the extremely uniform appearance of Uranus during Voyager 2's visit in 1986 had led to expectations that Neptune would also have few visible atmospheric phenomena, the spacecraft found that Neptune had obvious banding, visible clouds, auroras, and even a conspicuous anticyclone storm system rivaled in size only by Jupiter's Great Red Spot. Neptune also proved to have the fastest winds of any planet in the Solar System, measured as high as 2,100 km/h.[30] Voyager 2 also examined Neptune's ring and moon system. It discovered 900 complete rings and additional partial ring "arcs" around Neptune. In addition to examining Neptune's three previously known moons, Voyager 2 also discovered five previously unknown moons, one of which, Proteus, proved to be the second-largest moon in the system. Data from Voyager 2 supported the view that Neptune's largest moon, Triton, is a captured Kuiper belt object.[31]

The dwarf planet Pluto presents significant challenges for spacecraft because of its great distance from Earth (requiring high velocity for reasonable trip times) and small mass (making capture into orbit very difficult at present). Voyager 1 could have visited Pluto, but controllers opted instead for a close flyby of Saturn's moon Titan, resulting in a trajectory incompatible with a Pluto flyby. Voyager 2 never had a plausible trajectory for reaching Pluto.[32]

Pluto continues to be of great interest, despite its reclassification as the lead and nearest member of a new and growing class of distant icy bodies of intermediate size (and also the first member of the important subclass, defined by orbit and known as "plutinos"). After an intense political battle, a mission to Pluto dubbed New Horizons was granted funding from the United States government in 2003.[33] New Horizons was launched successfully on 19 January 2006. In early 2007 the craft made use of a gravity assist from Jupiter. Its closest approach to Pluto was on 14 July 2015; scientific observations of Pluto began five months prior to closest approach and will continue for at least a month after the encounter.

Until the advent of space travel, objects in the asteroid belt were merely pinpricks of light in even the largest telescopes, their shapes and terrain remaining a mystery. Several asteroids have now been visited by probes, the first of which was Galileo, which flew past two: 951 Gaspra in 1991, followed by 243 Ida in 1993. Both of these lay near enough to Galileo's planned trajectory to Jupiter that they could be visited at acceptable cost. The first landing on an asteroid was performed by the NEAR Shoemaker probe in 2000, following an orbital survey of the object. The dwarf planet Ceres and the asteroid 4 Vesta, two of the three largest asteroids, were visited by NASA's Dawn spacecraft, launched in 2007.

Although many comets have been studied from Earth, sometimes with centuries' worth of observations, only a few comets have been closely visited. In 1985, the International Cometary Explorer conducted the first comet flyby (21P/Giacobini-Zinner) before joining the Halley Armada studying the famous comet. The Deep Impact probe smashed into 9P/Tempel to learn more about its structure and composition, and the Stardust mission returned samples of another comet's tail. The Philae lander successfully landed on Comet Churyumov-Gerasimenko in 2014 as part of the broader Rosetta mission.

Hayabusa was an unmanned spacecraft developed by the Japan Aerospace Exploration Agency to return a sample of material from the small near-Earth asteroid 25143 Itokawa to Earth for further analysis. Hayabusa was launched on 9 May 2003 and rendezvoused with Itokawa in mid-September 2005. After arriving at Itokawa, Hayabusa studied the asteroid's shape, spin, topography, colour, composition, density, and history. In November 2005, it landed on the asteroid to collect samples. The spacecraft returned to Earth on 13 June 2010.

Deep space exploration is the term used for exploring regions of outer space far from Earth, whether within or beyond the Solar System. It is the branch of astronomy, astronautics and space technology that is involved with the exploration of these distant regions.[34] Physical exploration of space is conducted both by human spaceflights (deep-space astronautics) and by robotic spacecraft.

Some of the best candidates for future deep space engine technologies include anti-matter, nuclear power and beamed propulsion.[35] The latter, beamed propulsion, appears to be the best candidate for deep space exploration presently available, since it uses known physics and known technology that is being developed for other purposes.[36]

In the 2000s, several plans for space exploration were announced; both government entities and the private sector have space exploration objectives. China has announced plans to have a 60-ton multi-module space station in orbit by 2020.

The NASA Authorization Act of 2010 provided a re-prioritized list of objectives for the American space program, as well as funding for the first priorities. NASA proposes to move forward with the development of the Space Launch System (SLS), which will be designed to carry the Orion Multi-Purpose Crew Vehicle, as well as important cargo, equipment, and science experiments to Earth's orbit and destinations beyond. Additionally, the SLS will serve as a backup for commercial and international partner transportation services to the International Space Station. The SLS rocket will incorporate technological investments from the Space Shuttle program and the Constellation program in order to take advantage of proven hardware and reduce development and operations costs. The first developmental flight is targeted for the end of 2017.[37]

The idea of using highly automated systems for space missions has become a desirable goal for space agencies all around the world. Such systems are believed to yield benefits such as lower cost, less human oversight, and the ability to explore deeper into space, which is usually restricted by long communication delays with human controllers.[38]

Autonomy is defined by three requirements:[38]

Autonomous technologies would be able to perform beyond predetermined actions. They would analyze all possible states and events happening around them and come up with a safe response. In addition, such technologies can reduce launch cost and ground involvement, and performance would increase as well. Autonomy would allow a quick response upon encountering an unforeseen event, especially in deep space exploration, where communication back to Earth would take too long.[38]

NASA began its autonomous science experiment (ASE) on Earth Observing-1 (EO-1), NASA's first satellite in the New Millennium Program Earth-observing series, launched on 21 November 2000. The autonomy of ASE is capable of on-board science analysis, replanning, robust execution, and, added later, model-based diagnostics. Images obtained by EO-1 are analyzed on board and downlinked when a change or an interesting event occurs. The ASE software has successfully provided over 10,000 science images.[38]
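As a rough illustration of that downlink-on-change idea, the sketch below compares two co-registered scenes and decides whether the newer one is worth transmitting. It is not the ASE flight software; the array sizes, thresholds, and function name are invented for illustration.

```python
# Toy sketch of "downlink only when something changed" on-board logic.
# Not the ASE software: thresholds and scene representation are invented here.
import numpy as np

def worth_downlinking(previous, current, pixel_delta=0.1, changed_fraction=0.05):
    """Return True if enough pixels differ between two co-registered scenes."""
    changed = np.abs(current.astype(float) - previous.astype(float)) > pixel_delta
    return changed.mean() > changed_fraction

rng = np.random.default_rng(0)
scene_a = rng.random((128, 128))
scene_b = scene_a.copy()
scene_b[:32, :32] += 0.5            # simulate a localized surface change

print(worth_downlinking(scene_a, scene_a))   # False: nothing changed
print(worth_downlinking(scene_a, scene_b))   # True: ~6% of pixels changed
```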

The research that is conducted by national space exploration agencies, such as NASA and Roscosmos, is one of the reasons supporters cite to justify government expenses. Economic analyses of the NASA programs often showed ongoing economic benefits (such as NASA spin-offs), generating revenue many times the cost of the program.[39] It is also argued that space exploration would lead to the extraction of resources on other planets and especially asteroids, which contain billions of dollars worth of minerals and metals. Such expeditions could generate significant revenue.[40] It has also been argued that space exploration programs help inspire youth to study science and engineering.[41]

Another claim is that space exploration is a necessity for mankind and that staying on Earth will lead to extinction. Cited risks include the exhaustion of natural resources, comet impacts, nuclear war, and worldwide epidemics. Stephen Hawking, the renowned British theoretical physicist, said: "I don't think the human race will survive the next thousand years, unless we spread into space. There are too many accidents that can befall life on a single planet. But I'm an optimist. We will reach out to the stars."[42]

NASA has produced a series of public service announcement videos supporting the concept of space exploration.[43]

Overall, the public remains largely supportive of both manned and unmanned space exploration. According to an Associated Press Poll conducted in July 2003, 71% of U.S. citizens agreed with the statement that the space program is "a good investment", compared to 21% who did not.[44]

Arthur C. Clarke (1950) presented a summary of motivations for the human exploration of space in his non-fiction semi-technical monograph Interplanetary Flight.[45] He argued that humanity's choice is essentially between expansion off Earth into space, versus cultural (and eventually biological) stagnation and death.

Spaceflight is the use of space technology to achieve the flight of spacecraft into and through outer space.

Spaceflight is used in space exploration, and also in commercial activities like space tourism and satellite telecommunications. Additional non-commercial uses of spaceflight include space observatories, reconnaissance satellites and other Earth observation satellites.

A spaceflight typically begins with a rocket launch, which provides the initial thrust to overcome the force of gravity and propels the spacecraft from the surface of Earth. Once in space, the motion of a spacecraft, both when unpropelled and when under propulsion, is covered by the area of study called astrodynamics. Some spacecraft remain in space indefinitely, some disintegrate during atmospheric reentry, and others reach a planetary or lunar surface for landing or impact.
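For a sense of what astrodynamics deals with, the speed needed to hold a circular orbit is v = sqrt(mu / r). The sketch below is illustrative only: the constants are standard published values for Earth, and the function name is our own.

```python
# Minimal circular-orbit speed calculation (illustrative, not flight software).
import math

MU_EARTH = 3.986004418e14   # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371_000         # m, mean Earth radius

def circular_orbit_speed(altitude_m):
    """Speed required for a circular orbit at a given altitude above the surface."""
    return math.sqrt(MU_EARTH / (R_EARTH + altitude_m))

for alt_km in (400, 35_786):        # roughly ISS altitude and geostationary altitude
    v = circular_orbit_speed(alt_km * 1000)
    print(f"{alt_km:>6} km altitude: {v / 1000:.2f} km/s ({v * 3.6:,.0f} km/h)")
```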

Satellites are used for a large number of purposes. Common types include military (spy) and civilian Earth observation satellites, communication satellites, navigation satellites, weather satellites, and research satellites. Space stations and human spacecraft in orbit are also satellites.

Current examples of the commercial use of space include satellite navigation systems, satellite television and satellite radio. Space tourism is the recent phenomenon of space travel by individuals for the purpose of personal pleasure.

Astrobiology is the interdisciplinary study of life in the universe, combining aspects of astronomy, biology and geology.[46] It is focused primarily on the study of the origin, distribution and evolution of life. It is also known as exobiology (from Greek: ἔξω, exo, "outside").[47][48][49] The term "xenobiology" has been used as well, but this is technically incorrect because its terminology means "biology of the foreigners".[50] Astrobiologists must also consider the possibility of life that is chemically entirely distinct from any life found on Earth.[51] In the Solar System, some of the prime locations for current or past astrobiology are on Enceladus, Europa, Mars, and Titan.

Space colonization, also called space settlement and space humanization, would be the permanent autonomous (self-sufficient) human habitation of locations outside Earth, especially of natural satellites or planets such as the Moon or Mars, using significant amounts of in-situ resource utilization.

To date, the longest human occupation of space is the International Space Station, which has been in continuous use for 15 years, 221 days. Valeri Polyakov's record single spaceflight of almost 438 days aboard the Mir space station has not been surpassed. Long-term stays in space reveal issues with bone and muscle loss in low gravity, immune system suppression, and radiation exposure.
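That figure is easy to verify with simple date arithmetic, assuming continuous occupation since the Expedition 1 docking on 2 November 2000 and this archive's date of 10 June 2016:

```python
# Check the "15 years, 221 days" figure (dates assumed as noted above).
from datetime import date

occupied_since = date(2000, 11, 2)     # Expedition 1 docking
as_of = date(2016, 6, 10)              # date of this archive page

years = 15
anniversary = occupied_since.replace(year=occupied_since.year + years)
print(f"{years} years, {(as_of - anniversary).days} days")   # -> 15 years, 221 days
```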

Many past and current concepts for the continued exploration and colonization of space focus on a return to the Moon as a "stepping stone" to the other planets, especially Mars. At the end of 2006 NASA announced they were planning to build a permanent Moon base with continual presence by 2024.[53]

Beyond the technical factors that could make living in space more widespread, it has been suggested that the lack of private property, the inability or difficulty in establishing property rights in space, has been an impediment to the development of space for human habitation. Since the advent of space technology in the latter half of the twentieth century, the ownership of property in space has been murky, with strong arguments both for and against. In particular, the making of national territorial claims in outer space and on celestial bodies has been specifically proscribed by the Outer Space Treaty, which had been, as of 2012, ratified by all spacefaring nations.[54]


See the article here:

Space exploration - Wikipedia, the free encyclopedia

Posted in Space Exploration | Comments Off on Space exploration – Wikipedia, the free encyclopedia

Nanotechnology – Wikipedia, the free encyclopedia

Posted: at 12:45 pm

Nanotechnology ("nanotech") is manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form "nanotechnologies" as well as "nanoscale technologies" to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. Until 2012, through its National Nanotechnology Initiative, the USA has invested 3.7 billion dollars, the European Union has invested 1.2 billion and Japan 750 million dollars.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, microfabrication, etc.[4] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[5] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There's Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term "nano-technology" was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman's concepts, K. Eric Drexler used the term "nanotechnology" in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale "assembler" which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, the emergence of nanotechnology as a field in the 1980s occurred through the convergence of Drexler's theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era.

First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds, and it was successfully used to manipulate individual atoms in 1989. The microscope's developers Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory received a Nobel Prize in Physics in 1986.[6][7] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[8][9] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society's report on nanotechnology.[10] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[11]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[12][13]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[14][15] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which are approximately a quarter of a nm in diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size that phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[16] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[17]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[18] Or another way of putting it: a nanometer is the amount an average man's beard grows in the time it takes him to raise the razor to his face.[18]
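Those comparisons are easy to sanity-check; the short sketch below uses rough, illustrative figures for the marble and the Earth.

```python
# Rough scale comparisons (all figures approximate, for illustration only).
nanometer = 1e-9                  # meters
marble_diameter = 0.0125          # m, a ~1.25 cm marble
earth_diameter = 12_742_000       # m, Earth's mean diameter

print(nanometer / 1.0)                    # 1e-09: a nanometer relative to a meter
print(marble_diameter / earth_diameter)   # ~9.8e-10: a marble relative to Earth
print(2e-9 / 0.15e-9)                     # ~13 C-C bond lengths span a 2 nm DNA diameter
```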

Two main approaches are used in nanotechnology. In the "bottom-up" approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition. In the "top-down" approach, nano-objects are constructed from larger entities without atomic-level control.[19]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the quantum size effect, where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
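To make the surface-area-to-volume point concrete: for a sphere the ratio is 3/r, so shrinking a particle from a millimeter to a nanometer raises it a millionfold. The small sketch below uses arbitrary example radii.

```python
# Surface-area-to-volume ratio of spheres at different radii (example values only).
import math

def surface_to_volume(radius_m):
    area = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    return area / volume                      # simplifies to 3 / radius_m

for radius_nm in (1, 10, 100, 1_000_000):     # 1 nm, 10 nm, 100 nm, 1 mm
    print(f"r = {radius_nm:>9} nm  ->  SA:V = {surface_to_volume(radius_nm * 1e-9):.2e} per m")
```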

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[20]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry so that components automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.
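The idea of recognition through complementarity can be sketched in a few lines using the Watson–Crick rules mentioned above. This is only a toy analogy (it ignores strand orientation and binding energetics), and the function names are invented here.

```python
# Toy illustration of molecular recognition via complementary base pairing.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand):
    return "".join(COMPLEMENT[base] for base in strand)

def binds(a, b):
    """Two strands 'recognize' each other only if every position is complementary."""
    return len(a) == len(b) and all(COMPLEMENT[x] == y for x, y in zip(a, b))

probe = "ATGC"
print(complement(probe))        # TACG
print(binds(probe, "TACG"))     # True: the fully complementary partner is selected
print(binds(probe, "TAGG"))     # False: a single mismatch prevents recognition
```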

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term "nanotechnology" was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular scale biological analogies of traditional machine components demonstrated molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[21] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[22] The physics and engineering performance of exemplar designs were analyzed in Drexler's book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[23] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[24] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[25] and a nanoelectromechanical relaxation oscillator.[26] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[29]

Bottom-up approaches: These seek to arrange smaller components into more complex assemblies.

Top-down approaches: These seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches: These seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields: These seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified into 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological characteristics. With a decrease in dimensionality, an increase in surface-to-volume ratio is observed, which indicates that lower-dimensional nanomaterials have a higher surface area compared to 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.

There are several important modern developments. The atomic force microscope (AFM) and the Scanning Tunneling Microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[44][45] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography, such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography or nanoimprint lithography, were also developed. Lithography is a top-down fabrication technique where a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[44][45] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

New therapeutic products based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[13] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of "first generation" passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[46] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[12]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[47] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[48] Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner's office and at home.[49] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[50]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[51] Platinum is used in both the reduction and the oxidation catalysts.[52] Using platinum, though, is inefficient in that it is expensive and unsustainable. Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst's surface area that is exposed to the exhaust fumes is maximized, efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[53]
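The surface-area argument behind that project can be illustrated with a quick calculation: the same gram of platinum exposes vastly more surface when divided into nanoparticles. The particle sizes below are illustrative and not taken from the project itself.

```python
# Surface area of a fixed mass of platinum: one coarse grain vs. 5 nm-radius particles.
# Sizes are illustrative; density of platinum ~21,450 kg/m^3.
import math

def total_surface_area(total_volume_m3, particle_radius_m):
    n = total_volume_m3 / ((4 / 3) * math.pi * particle_radius_m ** 3)
    return n * 4 * math.pi * particle_radius_m ** 2      # equals 3 * V / r

one_gram_pt = 1e-3 / 21_450                      # volume of 1 g of platinum, in m^3
bulk = total_surface_area(one_gram_pt, 1e-3)     # a single ~1 mm grain
nano = total_surface_area(one_gram_pt, 5e-9)     # 5 nm radius nanoparticles

print(f"bulk: {bulk * 1e4:.2f} cm^2   nano: {nano:.1f} m^2   gain: {nano / bulk:,.0f}x")
```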

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell's microenvironment to direct its differentiation down a suitable lineage.[54] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[55]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[56]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health, are actively conducting research on potential health effects stemming from exposures to nanoparticles.[57][58]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[59] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[60]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[61]

Experts, including director of the Woodrow Wilson Center's Project on Emerging Nanotechnologies David Rejeski, have testified[62] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[63] Cambridge, Massachusetts in 2008 considered enacting a similar law,[64] but ultimately rejected it.[65] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[66] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly.

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[67] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[68] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[69][70]

A two-year study at UCLA's School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree "linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging".[71]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the nanotechnology revolution, could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully."[72] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[73] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[74][75][76][77]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[78] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by "bolting on" nanotechnology to existing regulations, but there are clear gaps in these regimes.[79] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[80]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy ("mad cow" disease), thalidomide, genetically modified food,[81] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center's Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[82] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[83][84]

The Royal Society report[10] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on application: participants in public deliberations were more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[61]

Read the original here:

Nanotechnology - Wikipedia, the free encyclopedia

Posted in Nanotech | Comments Off on Nanotechnology – Wikipedia, the free encyclopedia

What is Nanotechnology?

Posted: at 12:45 pm


What is Nanotechnology?

In its original sense, 'nanotechnology' refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

With 15,342 atoms, this parallel-shaft speed reducer gear is one of the largest nanomechanical devices ever modeled in atomic detail.

The Meaning of Nanotechnology

When K. Eric Drexler popularized the word 'nanotechnology' in the 1980s, he was talking about building machines on the scale of molecules, a few nanometers wide: motors, robot arms, and even whole computers, far smaller than a cell. Drexler spent the next ten years describing and analyzing these incredible devices, and responding to accusations of science fiction. Meanwhile, mundane technology was developing the ability to build simple structures on a molecular scale. As nanotechnology became an accepted concept, the meaning of the word shifted to encompass the simpler kinds of nanometer-scale technology. The U.S. National Nanotechnology Initiative was created to fund this kind of nanotech: their definition includes anything smaller than 100 nanometers with novel properties.

Much of the work being done today that carries the name 'nanotechnology' is not nanotechnology in the original meaning of the word. Nanotechnology, in its traditional sense, means building things from the bottom up, with atomic precision. This theoretical capability was envisioned as early as 1959 by the renowned physicist Richard Feynman.

I want to build a billion tiny factories, models of each other, which are manufacturing simultaneously... The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom. It is not an attempt to violate any laws; it is something, in principle, that can be done; but in practice, it has not been done because we are too big. – Richard Feynman, Nobel Prize winner in physics

Based on Feynman's vision of miniature factories using nanomachines to build complex products, advanced nanotechnology (sometimes referred to as molecular manufacturing) will make use of positionally-controlled mechanochemistry guided by molecular machine systems. Formulating a roadmap for development of this kind of nanotechnology is now an objective of a broadly based technology roadmap project led by Battelle (the manager of several U.S. National Laboratories) and the Foresight Nanotech Institute.

Shortly after this envisioned molecular machinery is created, it will result in a manufacturing revolution, probably causing severe disruption. It also has serious economic, social, environmental, and military implications.

Four Generations

Mihail (Mike) Roco of the U.S. National Nanotechnology Initiative has described four generations of nanotechnology development. The current era, as Roco depicts it, is that of passive nanostructures, materials designed to perform one task. The second phase, which we are just entering, introduces active nanostructures for multitasking; for example, actuators, drug delivery devices, and sensors. The third generation is expected to begin emerging around 2010 and will feature nanosystems with thousands of interacting components. A few years after that, the first integrated nanosystems, functioning (according to Roco) much like a mammalian cell with hierarchical systems within systems, are expected to be developed.

Some experts may still insist that nanotechnology can refer to measurement or visualization at the scale of 1-100 nanometers, but a consensus seems to be forming around the idea (put forward by the NNI's Mike Roco) that control and restructuring of matter at the nanoscale is a necessary element. CRN's definition is a bit more precise than that, but as work progresses through the four generations of nanotechnology leading up to molecular nanosystems, which will include molecular manufacturing, we think it will become increasingly obvious that "engineering of functional systems at the molecular scale" is what nanotech is really all about.

Conflicting Definitions

Unfortunately, conflicting definitions of nanotechnology and blurry distinctions between significantly different fields have complicated the effort to understand the differences and develop sensible, effective policy.

The risks of today's nanoscale technologies (nanoparticle toxicity, etc.) cannot be treated the same as the risks of longer-term molecular manufacturing (economic disruption, unstable arms race, etc.). It is a mistake to put them together in one basket for policy consideration: each is important to address, but they offer different problems and will require different solutions. As used today, the term nanotechnology usually refers to a broad collection of mostly disconnected fields. Essentially, anything sufficiently small and interesting can be called nanotechnology. Much of it is harmless. For the rest, much of the harm is of familiar and limited quality. But as we will see, molecular manufacturing will bring unfamiliar risks and new classes of problems.

General-Purpose Technology

Nanotechnology is sometimes referred to as a general-purpose technology. That's because in its advanced form it will have significant impact on almost all industries and all areas of society. It will offer better built, longer lasting, cleaner, safer, and smarter products for the home, for communications, for medicine, for transportation, for agriculture, and for industry in general.

Imagine a medical device that travels through the human body to seek out and destroy small clusters of cancerous cells before they can spread. Or a box no larger than a sugar cube that contains the entire contents of the Library of Congress. Or materials much lighter than steel that possess ten times as much strength. U.S. National Science Foundation

Dual-Use Technology

Like electricity or computers before it, nanotech will offer greatly improved efficiency in almost every facet of life. But as a general-purpose technology, it will be dual-use, meaning it will have many commercial uses and it also will have many military uses, making far more powerful weapons and tools of surveillance. Thus it represents not only wonderful benefits for humanity, but also grave risks.

A key understanding of nanotechnology is that it offers not just better products, but a vastly improved manufacturing process. A computer can make copies of data files, essentially as many copies as you want, at little or no cost. It may be only a matter of time until the building of products becomes as cheap as the copying of files. That's the real meaning of nanotechnology, and why it is sometimes seen as "the next industrial revolution."

My own judgment is that the nanotechnology revolution has the potential to change America on a scale equal to, if not greater than, the computer revolution. U.S. Senator Ron Wyden (D-Ore.)

The power of nanotechnology can be encapsulated in an apparently simple device called a personal nanofactory that may sit on your countertop or desktop. Packed with miniature chemical processors, computing, and robotics, it will produce a wide range of items quickly, cleanly, and inexpensively, building products directly from blueprints.

Artist's Conception of a Personal Nanofactory. Courtesy of John Burch, Lizard Fire Studios (3D Animation, Game Development).

Exponential Proliferation

Nanotechnology not only will allow making many high-quality products at very low cost, but it will allow making new nanofactories at the same low cost and at the same rapid speed. This unique (outside of biology, that is) ability to reproduce its own means of production is why nanotech is said to be an exponential technology. It represents a manufacturing system that will be able to make more manufacturing systems: factories that can build factories, rapidly, cheaply, and cleanly. The means of production will be able to reproduce exponentially, so in just a few weeks a few nanofactories conceivably could become billions. It is a revolutionary, transformative, powerful, and potentially very dangerous (or beneficial) technology.
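The arithmetic behind that claim is simple doubling. As a toy model (an assumption for illustration, not a prediction from the text), suppose each nanofactory can build one copy of itself per day:

```python
# Toy doubling model: each nanofactory builds one more per day (assumed rate).
factories, days = 10, 0
while factories < 1_000_000_000:
    factories *= 2          # every existing nanofactory builds one copy today
    days += 1
print(f"{days} days (~{days / 7:.0f} weeks) to exceed one billion, starting from 10")
# -> 27 days (~4 weeks)
```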

How soon will all this come about? Conservative estimates usually say 20 to 30 years from now, or even much later than that. However, CRN is concerned that it may occur sooner, quite possibly within the next decade. This is because of the rapid progress being made in enabling technologies, such as optics, nanolithography, mechanochemistry and 3D prototyping. If it does arrive that soon, we may not be adequately prepared, and the consequences could be severe.

We believe it's not too early to begin asking some tough questions and facing the issues:

Many of these questions were first raised over a decade ago, and have not yet been answered. If the questions are not answered with deliberation, answers will evolve independently and will take us by surprise; the surprise is likely to be unpleasant.

It is difficult to say for sure how soon this technology will mature, partly because it's possible (especially in countries that do not have open societies) that clandestine military or industrial development programs have been going on for years without our knowledge.

We cannot say with certainty that full-scale nanotechnology will not be developed within the next ten years, or even five years. It may take longer than that, but prudence, and possibly our survival, demands that we prepare now for the earliest plausible development scenario.


See the rest here:

What is Nanotechnology?

Posted in Nanotech | Comments Off on What is Nanotechnology?

Safety – How to Use Psychedelics

Posted: at 12:45 pm

The purpose of this site is to provide science-based information about how to safely use psychedelics. Following basic safety precautions dramatically reduces any risks.

Safety with psychedelics is important not just because these are powerful substances but also because creating a safe environment and approach will make treatment much more effective. Crucially, when psychedelics are used in a safe and comfortable setting, it becomes easier for people to relax and open themselves to the experience and gain the most benefit.

The most common psychedelics -- LSD, mushrooms, MDMA -- are far safer to their users and the people around them than drugs like alcohol, tobacco, and many prescription medicines. However, because of prohibition in many countries, there is far less available information and far more uncertainty about the sources and contents of psychedelic substances. Caution is very important.

We recommend psychedelics be taken only when you are in good health (unless you have decided to use them specifically to deal with an illness or to alleviate end-of-life anxiety). Those with heart problems should avoid psychedelics unless they have specifically discussed this with a doctor. Pregnant women should not take psychedelics.

During a psychedelic experience, the mind/body connection is usually magnified. Any physical pain or discomfort (including something as small as hunger pangs) can affect the nature of the experience. Women should avoid tripping when experiencing menstrual cramps. If a woman typically experiences strong pre-menstrual symptoms (backaches, bloating, moodiness, headaches, etc.), it's advisable to postpone the trip until they have subsided.

Entering a psychedelic journey with a comfortable and healthy body will help you toward a positive and meaningful experience.

For the advanced psychedelic user, a trip to deliberately investigate physical discomfort or pain may prove beneficial in understanding the mind/body connection and creating a deeper connection with your physical body.

Here are some components of a safe psychedelic experience:

Set up the room that you will be in so that it feels cozy and pleasant. Have a comfortable place to sit or lie down, and have pillows and decorations. You may want to have some music to listen to as well (music without words is recommended, as it is less likely to pull your thoughts away).

Keep in mind that going from place to place, being in a public setting, or dealing with logistical matters can be difficult while using psychedelics and can also be a source of anxiety. This is not recommended and can distract your focus.

When in doubt: be very cautious, don't rush, and start small.

Don't take a psychedelic for the first time without someone present who has experience with that substance. Make sure this is someone who you trust to help talk you through any stressful or confusing moments. Having trusted support is incredibly helpful for letting you feel safe and open to the experience. In addition, having someone around to help with practical needs (like getting you a glass of water) will let you focus on the process. Even if you have lots of experience with psychedelics, we recommend having a trusted person present (they don't need to be in the same room, just on hand).

We generally believe that psychedelics are most appropriate for people in their mid-20s and older. We don't know of research that has explored age-associated effects, but anecdotally, older individuals tend to take a more serious and intentional attitude towards psychedelic exploration.

If you have a dissociative psychological disorder, we do not recommend that you use psychedelics. Again, there is not good research on this topic and anecdotal reports are highly contradictory: some people have had their conditions healed by use of psychedelics, while others have found it has made their symptoms worse. We encourage you to err on the side of caution for these types of disorders.

Be sure to very carefully research any potential interactions with other prescription or non-prescription medicines or supplements that you are taking. In particular, anti-depressants, SSRIs, anti-anxiety, and anti-psychotic medications are not safe to take with psychedelics.

Before using a particular psychedelic for the first time, you may want to gauge your sensitivity to it. Some people are much more sensitive to certain substances than others. One mushroom cap might feel strong for one person and very mild for another.

If it is your first time taking a substance, try to take a very small tester amount (less than 1/10th of a dose) a few days before your planned trip. You may not feel any effects, but you will be able to roughly judge your sensitivity if you do, and you may feel more comfortable when the time comes to take a standard dose.

Several companies make purity testing kits for various substances, including MDMA and LSD. These can be found easily on Amazon.com or with a Google search. If you are uncertain about the purity of a substance, it's always better to test and be safe.

Psychedelics have been misunderstood and misrepresented for decades. That's changing. Please help us share safe, responsible information on using psychedelics by sending this page to friends, and posting to Facebook, Twitter, and Google:

Read the original post:

Safety - How to Use Psychedelics

Posted in Psychedelics | Comments Off on Safety – How to Use Psychedelics

Entheogens – Salvia Forum | Psychoactive Plants

Posted: at 12:45 pm

Entheology is a term I invented to describe the theology of entheogens and have since devoted much of my time exploring this topic. We're simply doing our small part to preserve and to promote the sacred knowledge that is being lost to history, both consciously and unconsciously. We continue to build a carefully-selected database of articles specifically related to entheogens, religion, ethnobotanicals, shamanic cultures and the politics that go with it. We also offer concise and unique information for any curious modern-day spiritual explorer, including an RSS feed below, so you can be automatically notified of new articles when they're posted. Please read about the new focus under the "ABOUT US > Who We Are" tab on our completely re-made site.

VIEW THE NEW ENTHEOLOGY - A LABOR OF LOVE.

The LARGEST ONLINE Entheogen Forum is HERE. The TOP RATED Entheogen Vendor Rating Site is HERE. Entheogens can give access to spiritual dimensions of consciousness that are indistinguishable from classic religious mysticism, yet only a few are presently protected under religious freedom laws. Awakening experiences instantly reveal how important it is to fight to keep the knowledge of these plants alive, as well as the cultures that have cultivated them for centuries.

Excerpt from:

Entheogens - Salvia Forum | Psychoactive Plants

Posted in Entheogens | Comments Off on Entheogens – Salvia Forum | Psychoactive Plants

Biohacker Guide | Nootropics

Posted: at 12:45 pm

Nootropics are a broad classification of cognition-enhancing compounds that produce minimal side effects and are suitable for long-term use. These compounds include those occurring in nature or already produced by the human body (such as neurotransmitters), as well as their synthetic analogs. We already regularly consume some of these chemicals in our daily diets: B vitamins, caffeine, and L-theanine.

A fundamental aspect of human evolution has been the drive to augment our capabilities. The neocortex is the neural seat of abstract and higher-order cognitive processes. As it grew, so did our ability to create. The invention of tools and weapons, writing, the steam engine, and the computer have exponentially increased our capacity to influence and understand the world around us. These advances are being driven by improved higher-order cognitive processing.[1] Fascinatingly, the practice of modulating our biology through naturally occurring flora predated all of the above discoveries. Indeed, Sumerian clay slabs as old as 5000 BC detail medicinal recipes which include over 250 plants.[2] The enhancement of human cognition through natural compounds followed, as people discovered plants containing caffeine, theanine, and other cognition-enhancing, or nootropic, agents.

There is an ancient precedent to humans using natural compounds to elevate cognitive performance. Incan warriors in the 15th century would ingest coca leaves (the basis for cocaine) before battle. Ethiopian hunters in the 10th century developed coffee bean paste to improve hunting stamina. Modern athletes ubiquitously consume protein powders and hormones to enhance their training, recovery, and performance. The most widely consumed psychoactive compound today is caffeine. Millions of people use coffee and tea to be more alert and focused.

The term nootropic is simply a descriptor, and this descriptor spans across all legal classifications of compounds. Broadly speaking, there are four main classifications:

Nootrobox stacks are strictly derived from the first category: compounds that are generally recognized as safe (GRAS) and approved for human consumption as dietary supplements.

The mechanisms by which nootropic compounds influence our cognition and neurophysiology are as diverse as those of prescription drugs. We present below our in-progress work, detailing the mechanisms, effects, and history of various nootropic compounds. Check in for updates and additions.

2016 Nootrobox, Inc. All Rights Reserved.

For informational purposes only. These statements have not been evaluated by the FDA. Products are not intended to diagnose, treat, cure, or prevent any disease.

View post:

Biohacker Guide | Nootropics

Posted in Nootropics | Comments Off on Biohacker Guide | Nootropics