July 20, 2007

Improving on the hypersonic skyhook concept

A tidally stabilized tether is called a "skyhook" because it appears to be "hooked onto the sky." Such tethers are also called "hypersonic tethers" because in typical designs the tip nearest the Earth travels at about Mach 12; longer tethers would have slower tip speeds. A grapple system attached to the tip of the tether can thus reach down below the facility and rendezvous with a payload on a slower, suborbital trajectory. The grapple captures the payload and pulls it into orbit along with the tether system. Later, it can release the payload at the top of the swing, tossing it into a higher orbit.

The net benefit of the tether launch assist is that it can significantly reduce the ΔV that a launch system such as a reusable launch vehicle (RLV) must provide to the payload: Mach 8-12 instead of Mach 25 (orbital velocity). Since kinetic energy scales with the square of velocity, the energy needed is roughly 4 to 10 times less.
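Since kinetic energy scales with the square of velocity, the savings can be sketched in a couple of lines (treating Mach numbers as simple velocity ratios):

```python
# Rough sketch: kinetic energy scales with v^2, so a rendezvous at
# Mach 8-12 instead of Mach 25 (orbital velocity) cuts the energy the
# launch system must supply. Mach numbers are treated here as plain
# velocity ratios.

def energy_saving(tip_mach, orbital_mach=25):
    """How many times less kinetic energy a Mach `tip_mach` rendezvous needs."""
    return (orbital_mach / tip_mach) ** 2

for mach in (8, 12):
    print(f"Mach {mach} rendezvous: ~{energy_saving(mach):.1f}x less energy")
```

A Mach 12 tip gives about a 4x saving and a Mach 8 tip nearly 10x, which is why longer, slower tethers are attractive.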

The Electro-Dynamic Orbital Accelerator concept places a large vehicle in orbit with a long, trailing conductive tether. Instead of trying to meet up with a point-like hook, a vehicle would meet up with the long trailing tether.

The launch vehicle must provide only enough energy to reach OX altitude, thus allowing 30% or more of lift-off weight to be payload, compared to ~3% for conventional rockets, a roughly tenfold increase in payload fraction. The result is an enormous reduction in the cost to reach orbit.


NOTE: the tether is unstable, and the design would need to be modeled in detail and a means to stabilize it found. Small sensors and small propulsion units along segments of the tether could help, or a small towed vehicle on the end could fly so as to stabilize the tether or pull it taut before rendezvous.

July 19, 2007

Controlling the quantum world

Wired has an article about the activity in ultracold physics.
Ultracold atoms could enable precise non-GPS navigation using measurements based on the Earth's rotation. Very precise measurements of the strength of magnetic or gravitational fields can also be used to find oil.

This is only part of the revolution from measuring and controlling the quantum world.
Atomic, molecular, and optical (AMO) science is making great strides, as described in the 230-page PDF "Controlling the Quantum World: The Science of Atoms, Molecules, and Photons (2007)" by the Board on Physics and Astronomy.

Revolutionary new methods to measure space and time have emerged within the last decade from a convergence of technologies in coherent control of ultrafast lasers and ultracold atoms. This new capability creates unprecedented new research opportunities.

Ultracold AMO physics was the most spectacularly successful new AMO research area of the past decade and led to the development of coherent quantum gases. This new field is poised to contribute significantly to the resolution of important fundamental problems in condensed matter science and in plasma physics, bringing with it new interdisciplinary opportunities.

High-intensity and short-wavelength sources such as new x-ray free-electron lasers promise significant advances in AMO science, condensed matter physics and materials research, chemistry, medicine, and defense-related science.

Ultrafast quantum control will unveil the internal motion of atoms within molecules, and of electrons within atoms, to a degree thought impossible only a decade ago. This capability is sparking a revolution in the imaging and coherent control of quantum processes and will be among the most fruitful new areas of AMO science in the next 10 years.

Quantum engineering on the nanoscale of tens to hundreds of atomic diameters has led to new opportunities for atom-by-atom control of quantum structures using the techniques of AMO science. Compelling opportunities in both molecular science and photon science are expected to have far-reaching societal applications.

Quantum information is a rapidly growing research area in AMO science and one that faces special challenges owing to its potential application for data security and encryption. Multiple approaches to quantum computing and communication are likely to be fruitful in the coming decade, and open international exchange of people and information is critical in order to realize the maximum benefit.

Temperature scale in powers of 10. Note: on this scale the low temperatures are farther from our own temperature than our temperature is from the surface of the sun.

Using optical lattices—formed by counterpropagating lasers producing standing electromagnetic waves—one can control to an unprecedented degree the environment in which the atoms sit. For example, one can produce lattices in one, two, and three dimensions with a wide variety of lattice spacings and structures in which neutral atoms can be trapped.
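As a concrete sketch, two counterpropagating beams of wavelength lam interfere to produce an intensity pattern proportional to cos^2(kx), with lattice sites every lam/2. The 1064 nm wavelength below is an illustrative assumption, not from the report:

```python
import math

# Two counterpropagating laser beams of wavelength lam form a standing
# wave with intensity proportional to cos^2(k*x); the intensity maxima
# (the lattice sites where neutral atoms can be trapped) repeat every
# lam/2. The wavelength is an illustrative choice.

lam = 1064e-9              # laser wavelength in meters (assumption)
k = 2 * math.pi / lam      # wavenumber
spacing = lam / 2          # lattice period

def relative_intensity(x):
    """Standing-wave intensity pattern, normalized to 1 at the maxima."""
    return math.cos(k * x) ** 2

print(f"lattice spacing: {spacing * 1e9:.0f} nm")
for frac in (0.0, 0.25, 0.5):
    x = frac * spacing
    print(f"x = {frac:.2f} * spacing -> relative intensity {relative_intensity(x):.2f}")
```

Changing the angle or wavelength of the beams changes the lattice spacing, which is one of the knobs that gives such fine control over the atoms' environment.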

Brightness comparison between current and future sources of x rays generated in laboratory x-ray lasers or at accelerators. Sources shown are the National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory; Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory; Advanced Photon Source (APS) at Argonne National Laboratory; European Synchrotron Radiation Facility (ESRF); Subpicosecond Pulse Source (SPPS) at the Stanford Linear Accelerator Center (SLAC); Linac Coherent Light Source (LCLS) at SLAC; the vacuum ultraviolet XFEL at the TESLA Test Facility at DESY, Hamburg, Germany (VUV-FEL); and the future European X-Ray Laser Project XFEL. Years on the right edge of the diagram denote approximate commissioning dates. SOURCE: R.W. Lee, Lawrence Livermore National Laboratory.

Cheap solar cells could be printed from inkjet printers

Cheap carbon nanotube solar cells that could be printed from inkjet printers are getting closer.

Researchers at New Jersey Institute of Technology (NJIT) have developed an inexpensive solar cell that can be painted or printed on flexible plastic sheets. "The process is simple," said lead researcher and author Somenath Mitra, PhD, professor and acting chair of NJIT's Department of Chemistry and Environmental Sciences. "Someday homeowners will even be able to print sheets of these solar cells with inexpensive home-based inkjet printers. Consumers can then slap the finished product on a wall, roof or billboard to create their own power stations."

Mitra and his research team took the carbon nanotubes and combined them with tiny carbon Buckyballs (known as fullerenes) to form snake-like structures. Buckyballs trap electrons, although they can't make electrons flow. Add sunlight to excite the polymers, and the buckyballs will grab the electrons. Nanotubes, behaving like copper wires, will then be able to make the electrons or current flow.

"Using this unique combination in an organic solar cell recipe can enhance the efficiency of future painted-on solar cells," said Mitra. "Someday, I hope to see this process become an inexpensive energy alternative for households around the world."

The work, "Fullerene single wall carbon nanotube complex for polymer bulk heterojunction photovoltaic cells," was published June 21, 2007 in the Journal of Materials Chemistry by the Royal Society of Chemistry.

Other progress is being made toward higher-efficiency (6.5%) organic solar cells.

Lee says he expects the process will eventually lead to solar cells having three or more layers, and that three cells in tandem could yield an efficiency of nearly 10 percent. Lee's collaborator Heeger is the cofounder of Konarka Technologies , in Lowell, Mass., a well-funded start-up developing plastic solar cells.

Tandem cells are composed of two multilayered parts that work together to gather a wider range of the solar radiation spectrum. (Credit: Alan Heeger / University of California - Santa Barbara)

Nobel laureate Alan Heeger, professor of physics at UC Santa Barbara, worked with Kwanghee Lee of Korea and a team of other scientists to create a new "tandem" organic solar cell with increased efficiency.

The cells are separated and connected by the material TiOx, a transparent titanium oxide. This is the key to the multilayer system that allows for the higher-level efficiencies. TiOx transports electrons and is a collecting layer for the first cell. In addition, it acts as a stable foundation that allows the fabrication of the second cell, thus completing the tandem cell architecture.

The new tandem cells have 6.5% conversion efficiency. The researchers also expect a cost breakthrough.

"It takes 2 U.S. dollars to generate one watt of electric power if you use silicon solar cells," explained Professor Lee. "Only ten U.S. cents [would be required] to generate 1 watt if you use this tandem polymer solar cell." The use of inexpensive plastics is key to cutting fabrication costs. They expect the cells to reach the market in 3-5 years.

Controlling neurons with light

Scientists can now turn on and off specific parts of the brain with a simple flash of light. The new molecular tool, developed by scientists at MIT and Stanford, allows unprecedented control over the brain and could lead to more-effective treatments for epilepsy, Parkinson's, and other diseases.

These capabilities will also help accelerate the growth in understanding of how brains work, which relates to the development of artificial intelligence and artificial general intelligence (the hardware for AI is discussed in the preceding article).

New Scientist discusses using light to control neurons

One possibility is that the technology, coupled with a method of getting light into the human skull, could create a Brave New World of neuro-modification in which conditions such as depression or Parkinson's disease are treated not with sledgehammer drugs or electrodes, but with delicate pinpricks of light. In the long term it is even possible that such treatments could be modified to enhance normal brain function, for example improving memory or alertness.

The technology could also lead to spectacular advances in basic neuroscience, allowing researchers to tease apart the neural circuits that control everything from reflexes to consciousness with unprecedented accuracy. "We'll be able to understand how specific cell types in the brain give rise to fuzzy concepts like hope and motivation," predicts Karl Deisseroth, a psychiatrist at Stanford University in California, who is spearheading some of the work.

These new possibilities materialised when neuroscientists finally cracked a long-standing problem in their field: how to take control of individual neurons.

The work is described in this 2007 MIT Technology Review article

Worm workout: A light-activated “off” switch can control the movement of microscopic worms. Scientists engineered the worms to express the switch in motor neurons that control the organisms' ability to swim. Without light, the worms swim normally. But when they are exposed to yellow light, as indicated by the yellow circle, their motor neurons can no longer function, paralyzing the worms.
Credit: Alexander Gottschalk

Last year, Karl Deisseroth, a bioengineer and physician at Stanford, and Ed Boyden, a bioengineer at MIT, co-opted a light-sensitive channel from green algae to create a genetic "on" switch. The channel sits on the cell membrane and opens when exposed to light, allowing positive charge to flow into the cell. Shining light on neurons that are genetically engineered to carry the channel triggers electrical activity within the cell that then spreads to the next neuron in the circuit. (Scientists use optical fibers to shine light into the brain.)

Deisseroth and Boyden have now independently created an "off" switch that works by a similar mechanism. This time the scientists used a gene that codes for a protein pump: when hit with yellow light, it pumps negative charge into the cell, blocking that neuron from firing. Both switches can be used in the same cell, effectively giving neuroscientists a light switch that can be used to turn on and off neural activity.
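A toy sketch of the two switches just described: a leaky integrate-and-fire neuron with an excitatory light-gated channel (the "on" switch, blue light) and an inhibitory pump (the "off" switch, yellow light). The simulate function and all constants are illustrative assumptions, not from the papers:

```python
# Toy integrate-and-fire sketch of the optical "on" and "off" switches:
# blue light opens a channel that injects positive charge (drives
# spiking), yellow light runs a pump that injects negative charge
# (silences the cell). All numbers are illustrative.

def simulate(light):
    """light: a per-timestep sequence of 'blue', 'yellow', or None.
    Returns the number of spikes fired."""
    v, threshold, reset = 0.0, 1.0, 0.0
    spikes = 0
    for step_light in light:
        v *= 0.95                # membrane leak
        v += 0.08                # baseline drive
        if step_light == "blue":
            v += 0.30            # channel opens: positive charge flows in
        elif step_light == "yellow":
            v -= 0.50            # pump: negative charge in, blocks firing
        if v >= threshold:
            spikes += 1
            v = reset
    return spikes

quiet = simulate([None] * 100)       # baseline firing
driven = simulate(["blue"] * 100)    # "on" switch active
silenced = simulate(["yellow"] * 100)  # "off" switch active
print(quiet, driven, silenced)
```

The silenced run fires no spikes at all, while blue light sharply raises the firing rate above baseline, which is the millisecond-scale push-and-block control Boyden describes below.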

This newfound ability to precisely control neurons could finally bring answers to major questions about the brain. It might help scientists find the specific cells or neural activity patterns that are involved in cognitive processes, such as attention, or in particular diseases, such as epilepsy.

Scientists can manipulate the specific units of the neural code--the pulses, or spikes, of electrical activity that are transmitted between cells. "We've shown we can push spikes around, block them, delay them," says Boyden. "We can really alter neural coding at a millisecond time scale." That should allow scientists to determine which aspect of the code--the spikes' timing or the spikes' rate--encodes information in the brain, a debate that has raged for decades.

Hardware for artificial intelligence

The information presented here suggests that within 5 years (by 2012) a researcher with about $20,000 should be able to buy custom hardware to simulate 64 to 250 million neurons in real time, or general-purpose hardware (the latest Nvidia/Intel GPGPUs) for a 10 million neuron simulation (scaling teraflops for the GPGPUs versus the current Blue Gene/L's 8,000,000 neurons at 1/6th real time). Scaling linearly: $400K for 1 billion neurons, $4 million for 10 billion, $40 million for 100 billion. A real-time human brain simulation could be achieved with a 2011-2012 supercomputer; a $100 to $200 million grand-challenge-type project would be highly likely to succeed. The price should fall in half each year after 2012, and it could happen faster and cost less if we are even more clever. Petaflop-level performance should be available at about $20K around 2015-2018. There are also possibilities for faster development using Ovonic quantum control devices or simulations like those provided by CCortex. More efficient programming could let neuron simulations exceed these estimates, and the quality and precision of the simulated neurons is still being improved.

Rough estimates:
2012: a full real-time human brain simulation (100 billion neurons).
2018: that simulation costs less than an average annual salary in the developed countries ($60,000/year at that time).

Wikipedia discusses the estimates on the hardware needed for a human brain simulation


Ray Kurzweil estimates a simulated human brain model would need 10 petaflops.
The newest IBM Blue Gene/P will have 3 petaflops at the top end (costing about $200 million).
There are nine current computing projects (such as Blue Gene/P) to build more general-purpose petaflop computers, all of which should be completed by 2008.

Most other attempted estimates of the brain's computational power equivalent have been rather higher, ranging from 100 million MIPS to 100 billion MIPS. Furthermore, the overhead introduced by the modelling of the biological details of neural behaviour might require a simulator to have access to computational power much greater than that of the brain itself.

Software. Software to simulate the function of a brain would be required.

Understanding. Finally, it requires sufficient understanding thereof to be able to model it mathematically. This could be done either by understanding the central nervous system, or by mapping and copying it. Neuroimaging technologies are improving rapidly, and Kurzweil predicts that a map of sufficient quality will become available on a similar timescale to the required computing power. However, the simulation would also have to capture the detailed cellular behaviour of neurons and glial cells, presently only understood in the broadest of outlines.
Once such a model is built, it will be easily altered and thus open to trial and error experimentation. This is likely to lead to huge advances in understanding, allowing the model's intelligence to be improved/motivations altered.
Recent article about controlling neurons with light that will help speed up the science of understanding the workings of the brain

The Blue Brain project has used a supercomputer, IBM's Blue Gene platform, to simulate a neocortex consisting of approximately 8,000,000 neurons and 50 billion interconnecting synapses. The eventual goal of the project is to use supercomputers to simulate an entire brain.
IBM simulated 8 million neurons on a Blue Gene/L earlier this year

The latest result is a digital mouse brain that needs about 6 seconds to simulate 1 second of real thinking time. That's still a long way from a true mouse-size simulation, and it runs on a Blue Gene/L supercomputer with 8,192 processors, four terabytes of memory, and 1 Gbps of bandwidth running to and from each chip. This is one eighth of the 65,536 processors in the full 280-teraflop version.

The human brain has roughly 100 billion neurons operating simultaneously, connected by roughly 100 trillion synapses. By comparison, a modern computer microprocessor uses only 1.7 billion transistors. Although estimates of the brain's processing power put it at around 10**14 neuron updates per second, it is expected that the first unoptimized simulations of a human brain will require a computer capable of 10**18 FLOPS.
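The arithmetic implied by these figures, i.e. roughly a thousand synapses per neuron and ten thousand floating-point operations per simulated neuron update:

```python
# Working through the estimates quoted above: ~100 billion neurons,
# ~100 trillion synapses, ~1e14 neuron updates per second, and ~1e18
# FLOPS for a first unoptimized whole-brain simulation. The implied
# overhead is ~10,000 floating-point operations per neuron update.

neurons = 100e9          # ~100 billion neurons
synapses = 100e12        # ~100 trillion synapses
updates_per_sec = 1e14   # neuron updates per second (article's figure)
sim_flops = 1e18         # FLOPS for an unoptimized simulation

print(f"synapses per neuron: {synapses / neurons:,.0f}")
print(f"FLOPs per neuron update: {sim_flops / updates_per_sec:,.0f}")
```

That four-orders-of-magnitude overhead is where better simulation software could move the timeline forward.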

The problem with brain simulation is that there are seven levels of investigation spanning 10 orders of magnitude, up to the 10 to 100 billion neurons of a whole brain.

Nvidia has released some cheap teraflop hardware, which will be improved next year with double precision. Intel will also be introducing Larrabee, another general-purpose graphics processing unit. These machines speed up neuron simulation 100 times and molecular modeling by 240 times. Intel is also introducing 80-core chips.

At the start of 2008, Nvidia will provide 12 teraflops for about $60K and 2 teraflops for $10K. Twelve teraflops is close to the power of the 22.8-teraflop Blue Gene supercomputer that simulated 10,000-60,000 neurons. Another simulation ran 8 million neurons on a Blue Gene/L earlier in 2007.

The IBM work by Dharmendra Modha simulated 8 million neurons at ten times slower than real time.

The new highly parallelized approaches seem to be doubling in performance every 12 months. Flash memory is improving faster than Moore's law as well, which will help speed up formerly disk-heavy searches and reduce power use.

Custom analog hardware seems like the cheaper route to brute forcing AGI.

The link above discusses a Stanford effort to simulate 64 million neurons, with real-time cortex-scale simulations, by about 2011 (5 years after the 2006 presentation).

Hardware from 2005

Here is an update from Feb 2007 on the Boahen work in MIT Technology Review

A mouse brain houses over 16 million neurons, with more than 128 billion synapses running between them.

There is an effort toward a very large (billions of neurons) simulation by a company called CCortex.

CCortex accurately models the billions of neurons and trillions of connections in the human brain with a layered distribution of spiking neural nets running on a high-performance supercomputer. This Linux cluster is one of the 20 fastest computers in the world with 500 nodes, 1,000 processors, 1 terabyte of RAM, 200 terabytes of storage, and a theoretical peak performance of 4,800 Gflops.

I think the CCortex simulation is not as accurate as the simulation that is being performed on the custom hardware.

Ovonic quantum control devices are possible transistor replacements from the inventor and billion-dollar company behind PRAM and the nickel-metal hydride battery. The quantum control device is more neuron-like, and the goal is to print the devices out reel-to-reel. If that works (and big companies are working with the inventor), then multi-billion and multi-trillion neuron simulations could happen very quickly.

Accelerating Future discusses the odds of Artificial General Intelligence (AGI)

There is also a suggested reading list for those interested in AGI.

The state of cognitive enhancement

There is also other AI work, like Numenta's.

Somewhat related: today there was news that checkers has been weakly solved. All positions with 10 or fewer pieces on the board were calculated, so once the game gets down from its 24 starting pieces to 10, the computer plays perfectly and either draws or wins. Checkers at Wikipedia. A brute-force solution to checkers.

IEEE Spectrum discusses the checkers solution. Checkers, with 12 pieces per side on an 8x8 board, has 5 x 10**20 possible positions.

First, he constructed databases of endgames, building backward from all the possible wins, losses, or draws that checkers could conclude with. A so-called backward-searching algorithm built the paths of positions that would have led to these endgames, all the way back to the point where there were 10 game pieces on the board. The result is a database of 39 trillion positions, compressed with a homebrew algorithm into 237 gigabytes, an average of 154 positions per byte of data.
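The compression figure can be sanity-checked in a few lines. Exactly which flavor of gigabyte is meant and the rounding of the position count shift the result by a position or two per byte, but either way each position costs well under one bit:

```python
# Sanity check on the endgame database compression: ~39 trillion
# positions in 237 gigabytes (binary gigabytes assumed here) is on the
# order of 150+ positions per byte, far below one bit per position.

positions = 39e12                 # ~39 trillion endgame positions
bytes_total = 237 * 2**30         # 237 GiB
per_byte = positions / bytes_total
bits_per_position = 8 / per_byte
print(f"~{per_byte:.0f} positions/byte, {bits_per_position:.3f} bits/position")
```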

The next step was to use a forward-search technique, such as the ones chess software typically relies on, to figure out how to get to those 10-piece situations from the beginning of the game, when all 24 pieces are on the board. Schaeffer and his colleagues used a technique called "best first" to prioritize searching various positions and lines of play. At a given position in the game there are several possible moves that can be made. Instead of exploring all of these moves to their final outcomes using deep search, Schaeffer's team used Chinook to provide a measure of what the strongest line of play would be, i.e. what would most likely result in a win in the fewest moves. This line of play was evaluated first. If it did result in a win, then there was no need to search any other parallel lines of play, because the entire line was already known to result in a strong win. (Since a win was achieved so quickly, it means the losing side made a mistake and did not play perfectly.) Entire lines of play branching from various positions were eliminated this way, vastly reducing the number of lines that had to be deeply explored. By applying this technique, Schaeffer's team was able to solve checkers with the least possible effort. Of the 5 x 10**20 possible positions, Schaeffer needed to evaluate only 10**14 to prove that checkers, played perfectly, results in a draw.
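The backward-searching idea can be illustrated with a minimal retrograde solver for a toy game (a simple subtraction game, not checkers): label the terminal positions first, then work backward, marking a position a win exactly when some move leads to a position that is a loss for the opponent.

```python
# Toy illustration of retrograde (backward) analysis, the technique
# used to build the checkers endgame databases: label every position
# win/loss working back from the terminal positions. The game here is
# a subtraction game (take 1 or 2 stones; the player who cannot move
# loses), chosen for brevity rather than checkers itself.

def solve(max_n):
    value = {0: "loss"}  # no stones left: the side to move cannot move
    for n in range(1, max_n + 1):
        moves = [n - take for take in (1, 2) if n - take >= 0]
        # a position is a win iff some move leaves the opponent a loss
        value[n] = "win" if any(value[m] == "loss" for m in moves) else "loss"
    return value

table = solve(10)
print([n for n in range(11) if table[n] == "loss"])  # -> [0, 3, 6, 9]
```

The checkers databases were built the same way, just over 39 trillion positions instead of eleven, and with draws as a third possible label.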

Chess has somewhere on the order of 10**40 to 10**50 positions.

July 17, 2007

Silicon spintronics closer for cheap continuation of Moore's law

From EEtimes, Naval Research Laboratory (NRL) scientists will next month describe a technique that would allow spintronics to be inserted into the standard silicon CMOS processes using ferromagnetic materials similar to those already used for magnetic random access memory.

"Our demonstration showed a 30 percent polarization of the injected electrons, which is not bad considering that the polarization of electrons in magnetic metals is about 45 percent," said lead scientist Berend Jonker. "Now we want to build an electronic detector, rather than use an LED, as the next step toward silicon spintronics."

The NRL scientists claim to have injected electrons through a ferromagnetic film over a high-k aluminum oxide dielectric. Specifically, they formed an iron/aluminum oxide tunnel barrier contact over a light-emitting diode built from layers of n-doped silicon over a silicon-oxide insulator over p-doped silicon.

Despite the weak electroluminescence of a silicon LED, the resulting circularly polarized light confirmed the injection of spin-polarized electrons.

OTHER READING from EETimes.com:
Intermolecular Inc. (San Jose, Calif.) says its High-Productivity Combinatorial (HPC) platform of "fab in a lab" technologies will facilitate R&D of IC materials, processes and device structures.

Intermolecular is addressing what it calls the "perfect storm" in semiconductor R&D. "R&D spending is running out of control," the company says. "Our mission is to make R&D more productive." While it remains to be seen whether Intermolecular's strategy will work, there is a crying need in the chip industry for feasible new R&D models. In 1978, total semiconductor R&D spending was $600 million, according to IC Insights. Intermolecular claims that figure had grown to $45 billion by 2006 and is expected to hit a whopping $100 billion in 2012.

The F30 fab-in-a-lab tool provides cleaning, electroless deposition, self-assembly and other functions in the same unit. The modular tool taps 28 separate screening models.

Intermolecular insists the tool will not replace a production machine but says it can be used by chip makers to develop a process "10 to 100 times faster" than conventional methods.

For example, to develop its "molecular masking layer" IP technology, Intermolecular ran 7,635 experiments with 60 base molecular types in the F30. In a short time, the tool discovered two "hits," or matches, according to the company.

In a move that will raise some eyebrows — and could upset others — International Sematech Monday (July 16) launched a new and dedicated 450-mm fab R&D program. Sematech will now have two separate and parallel 450-mm efforts. The new program is bound to upset most chip-equipment companies, which lack the R&D dollars to move to the next-generation and costly wafer size. 450-mm fabs are not expected to emerge until 2012 or later. Only a few companies can afford to build these fabs, namely Intel, Samsung, and perhaps TSMC.

Patch for regenerating heart tissue

When human hearts are injured, as during a heart attack, healthy tissue normally can’t regrow. Researchers now demonstrate in rats that a sponge-like patch, soaked in a compound called periostin and placed over the injury, can not only get heart cells to begin dividing and making copies of themselves again, but also improves heart function.

Periostin is a component of the material that surrounds cells and is derived from the skin around bone. Though the mature heart only has tiny amounts, it’s abundant during fetal heart development, and increased amounts are also made after skeletal-muscle injury, bone fracture and blood vessel injury, stimulating mature, specialized cells to begin dividing again. Led by Bernhard Kuhn, MD, in the Department of Cardiology at Children’s Hospital Boston, the researchers theorized that placing periostin near the site of a myocardial infarction could help restore this growth-friendly environment and get heart tissue to regenerate.

Using a small patch fashioned from a sponge-like material called Gelfoam, they then moved to experiments in rats with induced heart attacks. In half the rats, a patch that had been incubated with periostin was placed over the infarct site; the others received Gelfoam only.

Twelve weeks later, the treated patches were still releasing biologically-active periostin. The periostin-treated rats had improved cardiac pumping ability, as indicated by increased ejection fraction and improved ventricular remodeling on echocardiograms, and decreased left-ventricular wall stress on catheterization. They also had less scarring of heart tissue, a reduction in infarct size and a denser network of blood vessels feeding the area. In contrast, the rats receiving Gelfoam alone showed little if any improvement.

At the cellular level, the periostin-treated group had a 100-fold increase in the number of cardiomyocytes entering the cell cycle, and grew, on average, 6 million more cardiomyocytes, far exceeding the number of dying cells. (For perspective, the average rat heart has about 20 million cardiomyocytes overall.)

Kuhn, a pediatric cardiologist, envisions using a sustained-delivery periostin patch not only to treat adults with heart attack, but also to encourage cardiomyocyte proliferation in children with congenital heart disease.

910,000 Americans die each year from heart disease. More than 70 million Americans live every day with some form of heart disease, which can include high blood pressure, cardiovascular disease, stroke, angina (chest pain), heart attack and congenital heart defects.

Super vision enhancement

Michael Anissimov over at Accelerating Future discusses polls that CNN presented as part of their Future Summit.

One facet of enhancement that is not controversial but is widespread is vision enhancement. In the further reading section, I refer to my past articles on other methods of enhancement being developed, such as regeneration, life extension and cognitive enhancement. I believe the most important and likely enhancements are those that provide an economic benefit, boost individual and group productivity, or reduce costs, such as superior health.

This article at Slate discusses how LASIK eye surgery can provide a sports performance advantage.

McGwire's custom-designed lenses improved his vision to 20/10, which means he could see at a distance of 20 feet what a person with normal, healthy vision could see at 10 feet. Think what a difference that makes in hitting a fastball. Imagine how many games those lenses altered.

Tiger Woods, who had lost 16 straight tournaments before his [LASIK] surgery, ended up with 20/15 vision and won seven of his next 10 events.

In 2004, 69 percent of traditional LASIK patients in a study had 20/16 vision six months after their surgery, and new "wavefront" technology raised the percentage to 85. Odds are, if you're getting LASIK, you're getting enhanced.

This emedicine article discusses what perfect vision is

To the refractive surgery patient in year 2000, achieving an uncorrected visual acuity of 20/20 after refractive surgery was considered a success. Ongoing research in this field is focused on further improving these results. Realizing that 20/20 does not represent perfect vision is important because many young healthy adults have visual acuities of 20/15 to 20/12. If optical aberrations in the eye could be eliminated, the theoretical limit of foveal acuity would be 20/12 for a small pupil and up to 20/5 for a dilated pupil.
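For reference, Snellen fractions like those above convert to decimal acuity and to logMAR (log of the minimum angle of resolution). A minimal sketch; the snellen helper is just for illustration:

```python
import math

# Snellen fractions: 20/10 means seeing at 20 feet what a normal eye
# sees at 10 feet, i.e. a decimal acuity of 2.0. logMAR is log10 of
# the minimum angle of resolution (0.0 for 20/20, negative is better).

def snellen(denominator, numerator=20):
    """Return (decimal acuity, logMAR) for a Snellen fraction num/denom."""
    decimal = numerator / denominator
    logmar = math.log10(denominator / numerator)
    return decimal, logmar

for d in (20, 15, 12, 10, 5):
    dec, lm = snellen(d)
    print(f"20/{d}: decimal acuity {dec:.2f}, logMAR {lm:+.2f}")
```

So the theoretical 20/5 limit quoted above corresponds to a decimal acuity of 4.0, four times "normal" vision.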

Achieving super vision by using custom ablation

In 100 eyes with 3 months of follow-up, postoperative UCVA was 20/15 or better in 45.0%, 20/20 or better in 89.0%, and 20/40 or better in 99.8%. These results compared favorably with conventional ablation, which gave postoperative UCVA of 20/15 or better in 29.0%, 20/20 or better in 84.0%, and 20/40 or better in 99.5%. Some other studies have achieved 20/12.5 in 29% of eyes.

Using better lasers for LASIK

There is a reversible and correctable LASIK treatment in development, called PAI-LASIK, which uses a photoablative inlay. The plastic inlay is sculpted by the excimer laser and then left between the flap and the underlying stroma.

Gene therapy status
As of January 2007, there were 1,260 gene therapy clinical trials in progress.
Millions will get gene therapy for disease treatment. Gene therapy, RNA interference and RNA activation will be used to restore vision to the blind, help control obesity, and enhance people so that everyone who so chooses can have what today are superior physical and mental gifts.

Here are some of my past articles on transhuman enhancement

Here are my past articles on advances that are leading to human regeneration

Here are my past articles on advances towards life extension

Here are my articles on brain enhancement, brain/computer interfacing and brain science advances

Here are my past articles on cognitive enhancement

Here are my past articles on robotics

July 16, 2007

Updating Project Orion: External Pulsed Plasma Propulsion

Nuclear rockets can have 2 to 200 times the performance of chemical rockets. They are a technology that we only need the will to develop: the science is solid and straightforward. We just have to have the courage to become a truly interplanetary civilization. This article reviews the various pulsed plasma proposals (using nuclear bombs for propulsion), with a brief review of nuclear thermal rockets at the end. Modern materials will allow smaller nuclear rockets to be produced, which could be deployed in space by chemical launch systems. Also, there is uranium and thorium on the moon, so lunar materials could be mined and processed, and these nuclear rockets could be made almost entirely from lunar material.

External Pulsed Plasma Propulsion (EPPP) is a space propulsion concept that gets thrust from plasma waves generated from a series of small, supercritical fission/fusion pulses behind an object in space.


This NASA study from 2000 had the following designs:
The realistic maximum Isp obtainable with fission-based EPPP is ~100,000 seconds. [A fusion-powered version of Orion would max out at an Isp of 1,000,000 seconds (this link compares Orion versions to other space propulsion, including photonic propulsion)] However, this type of performance would only be possible with very large spacecraft. Such vehicles would be impractical until the cost of access to space dropped substantially or in-space manufacturing became available. Therefore, a more conservative approach has been taken by considering smaller vehicles with lower performance (Isp 10,000 seconds) using technology available in the near term. This concept has been informally termed “GABRIEL.” The GABRIEL series includes an evolutionary progression of vehicle concepts that build upon the nearest-term implementation of EPPP. This concept roadmap eventually culminates in larger systems that employ more sophisticated methods for pulse initiation and momentum transfer. GABRIEL is characterized by the following four levels:
1. Mark I: Solid pusher plate and conventional shock absorbers (small size)
2. Mark II: Electromagnetic coupling incorporated into the plate and shocks (medium size)
3. Mark III: Pusher plate extensions such as canopy, segments, cables (large size)
4. Mark IV: External pulse unit driver such as laser, antimatter, etc. (large size)
All of these levels, besides the GABRIEL Mark I, require technology that is not currently available, but may be attainable for a second-generation vehicle. The Mark I is also the smallest and least expensive version, but suffers from the poorest performance (nominally 5,000 seconds and 4 million newtons of thrust).

Note: this poorest performance is still over ten times better than chemical rockets and five times better than solid-core nuclear thermal. The Isp is on the level of a good ion drive, but the thrust is over 10,000 times greater.
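To see what even the "poor" 5,000-second Isp means in practice, here is a quick sketch using the Tsiolkovsky rocket equation. The 9.4 km/s delta-v to orbit and the 450-second chemical Isp are my illustrative assumptions, not figures from the NASA study:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def mass_ratio(delta_v_m_s, isp_s):
    """Tsiolkovsky rocket equation: initial/final mass for a given delta-v."""
    return math.exp(delta_v_m_s / (isp_s * G0))

DELTA_V_LEO = 9_400.0  # m/s, rough delta-v to low Earth orbit including losses

chemical = mass_ratio(DELTA_V_LEO, 450)    # good chemical engine
gabriel = mass_ratio(DELTA_V_LEO, 5_000)   # GABRIEL Mark I nominal Isp

print(f"chemical mass ratio:  {chemical:.2f}  (propellant fraction {1 - 1/chemical:.0%})")
print(f"GABRIEL Mark I ratio: {gabriel:.2f}  (propellant fraction {1 - 1/gabriel:.0%})")
```

The chemical rocket needs roughly 88% of its liftoff mass to be propellant, while the GABRIEL Mark I would need under 20%, which is where the large payload fractions come from.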

The classic version of this idea is Project Orion, which would have fired about 800 small nuclear bombs through a hole in a large metal pusher plate, detonating them behind the ship.

An artist's impression of Project Orion [Wikipedia, from this link] taking off

Sam Dinkins notes:
One of the 1958 designs achieves an Isp of 12,000 seconds with an Earth-launched payload capacity to LEO of 5500 tons. The fuel would be 800 nuclear bombs. The weight of 800 nuclear bombs has gone way down. For the eight terajoule (two kiloton) explosions required for the 4500-ton version that can carry 20 people to the Moon, Jupiter, and Saturn in the same trip, instead of the 900 tons of nuclear bombs at almost one ton each, there would need to be probably only 50 tons’ worth of bombs [because of technological improvements]
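A quick check on the figures in the quote: "eight terajoule (two kiloton)" pulses are consistent, since one kiloton of TNT is about 4.184 terajoules. The total energy of 800 such pulses is easy to tally:

```python
KT_TNT_J = 4.184e12  # joules per kiloton of TNT equivalent

pulse_kt = 2
pulse_j = pulse_kt * KT_TNT_J
print(f"one {pulse_kt} kt pulse = {pulse_j / 1e12:.1f} TJ")  # matches the 'eight terajoule' figure

total_kt = 800 * pulse_kt
print(f"800 pulses = {total_kt} kt = {total_kt * KT_TNT_J / 1e15:.1f} PJ total")
```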

To protect the crew from the huge kick of each blast, Orion envisioned huge shock absorbers designed to spread the impulse of the 1000-ton pusher plate across the entire rest of the ship. Other versions looked at further absorption just in the crew area. My recommendation would be to put the crew on a long electromagnetic track. They could potentially be the only part of the ship that is isolated from the high-g shocks. By having a very small mass isolated from the pusher shocks, the mass of the shock absorbers could be reduced from 900 tons to something more manageable.

An Apollo-sized rocket on the left and a first, smaller version of the Orion nuclear rocket

Metals have been strengthened with nanograin structures that make them four times stronger without becoming brittle

Here is an article that examined the mass fraction and payload that different versions of Project Orion could launch.

The Space Bombardment blog also examines the details of the 880-ton 1959 design. An improved pusher plate and lighter bombs would let it launch 650 tons to orbit instead of 300 tons.

Pictures of Project Orion

The project Orion page

Rapid fire z-pinch fusion is being developed

The Z-pinch is the basis for the minimag Orion concept

Mag Orion would have detonated 100-kiloton bombs 2 kilometers behind the spacecraft and had a superconducting magnetic sail interact with the blast to generate 1,000,000 newtons of thrust at an Isp of 30,000 seconds.

Mini-Mag Orion would use sub-critical explosions compressed with Z-pinch technology: about 5 tons of TNT-equivalent explosive power within a 5-meter magnetic bottle, giving a specific impulse of 21,500 seconds and thrust of 625,000 newtons.
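As a sanity check on these numbers, the equivalent jet power implied by a thrust and Isp pair is thrust times exhaust velocity divided by two, with exhaust velocity taken as Isp times standard gravity:

```python
G0 = 9.80665  # standard gravity, m/s^2

def jet_power_w(thrust_n, isp_s):
    """Jet power = thrust * exhaust velocity / 2, with v_e = Isp * g0."""
    return thrust_n * isp_s * G0 / 2

mag_orion = jet_power_w(1_000_000, 30_000)  # Mag Orion figures quoted above
mini_mag = jet_power_w(625_000, 21_500)     # Mini-Mag Orion figures
print(f"Mag Orion jet power:      {mag_orion / 1e9:.0f} GW")
print(f"Mini-Mag Orion jet power: {mini_mag / 1e9:.0f} GW")
```

Both concepts work out to on the order of 100 gigawatts of jet power, which shows why pulsed bomb energy, rather than an onboard reactor, is the only plausible source.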

Minimag Orion

A review of NERVA nuclear rockets and variants of NERVA

Nuclear thermal rocket like NERVA

Most studies of nuclear thermal rockets (NERVA variants) consider an Isp of 925 seconds very achievable. Russian NTR fuel elements would allow an Isp of 960+ seconds. A 2005 NASA presentation on nuclear thermal systems considered Isps of 1,010 seconds. There are designs of closed-cycle gas core versions with 1,500-2,000 seconds of Isp (15-20 kN·s/kg). Open-cycle gas core reactor rockets can achieve 3,000 to 5,000 seconds of Isp (30 to 50 kN·s/kg).
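The kN·s/kg figures in parentheses are just the effective exhaust velocity, which is Isp in seconds multiplied by standard gravity (numerically the same as km/s). A short check that the quoted conversions line up:

```python
G0 = 9.80665  # standard gravity, m/s^2

def isp_to_kns_per_kg(isp_s):
    """Effective exhaust velocity in kN*s/kg (numerically equal to km/s)."""
    return isp_s * G0 / 1000

for isp in (925, 1500, 2000, 3000, 5000):
    print(f"Isp {isp:>5} s  ->  {isp_to_kns_per_kg(isp):5.1f} kN*s/kg")
```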

The disadvantage of the open cycle is that the fuel can escape with the working fluid through the nozzle before it reaches significant burn-up levels. Thus, finding a way to limit the loss of fuel is required for open-cycle designs. Unless an outside force is relied upon (i.e. magnetic forces, rocket acceleration), the only way to limit fuel-propellant mixing is through flow hydrodynamics. Another problem is that the radioactive efflux from the nozzle makes the design totally unsuitable for operation within Earth's atmosphere. The advantage of the open cycle design is that it can attain much higher operating temperatures than the closed cycle design, and does not require the exotic materials needed for a suitable closed cycle design.

So if you are going for high performance you might as well go for EPPP instead of open cycle gas core nuclear thermal.

Lunar resource overview

Lunar resources could be developed so that EPPP rockets could be built there and launched to the rest of the solar system. The next step is to build upon a plan that I have proposed for taking space-based solar power profitably to the megawatt level. Then a lunar base could be developed with hundreds of megawatts to gigawatts of lightweight solar power. Some solar cells and reflective material could be built using lunar regolith. After the industrial and energy base is established, the mining, industry, nuclear power plants and nuclear rockets could be built.

Mirrors and lasers for photonic propulsion should also be constructed

Studies of the lunar regolith indicate a lot of Thorium, although the exact amount is uncertain

There is Thorium in the lunar regolith which can be put into a breeder reactor to make Uranium 233.
Uranium 233 can be used to make bombs, but it would be more difficult than with Uranium 235 because of the radiation.

Thorium can be used to power nuclear reactors that are superior to current nuclear reactors

Here is an analysis of lunar regolith

Composition of lunar regolith

KREEP ore deposits on the moon would include Chlorine, Zirconium, Fluorine, Thorium, Potassium, Uranium, Phosphorus, Rare Earth Elements, Sodium

Thorium as a tracer for KREEP deposits on the Moon. Note the high concentrations around the Procellarum KREEP Terrane.

Sometimes environmental factors are key to disease

Pregnant women who get respiratory infections in the second trimester are up to seven times more likely to have a child with schizophrenia. There is also an increased risk of autism. Blocking the proteins generated by the mother's immune response in the second trimester could prevent millions of cases of schizophrenia and autism in the USA and around the world. Since severe mental illness accounts for an estimated 20-25% of homelessness, it could also help reduce the homeless problem.

California Institute of Technology neuroscientist Prof Paul Patterson said this risk was greater than any known genetic influence.

He believes the virus triggers a mental switch that alters and inflames the fetal brain and sets the child up for mental illness in later life.

They found that it was the mother's immune response that caused the problem, rather than the virus itself.

"The proteins produced by the mother's immune system to fight the infection seemed to be linked to the problem," Prof Patterson said.

Schizophrenia, which affects 3.2 million Americans [or 2.1 million depending upon source of study], is a chronic, recurrent mental illness, characterized by hallucinations, delusions, and disordered thinking. The medications used to treat the disorder are called antipsychotics.

It has been estimated that 25 percent of homeless adults aged 18 years and older suffer from severe mental illness

About 12 million U.S. adults have experienced literal homelessness at some time in their lives. In 1996, the most recent year for which such data are available, 3.5 million Americans were homeless at least once during the year, an increase of 1.2 million over the estimate from 10 years earlier.

As of 2005, more than 1.5 million people in the world were afflicted with autism. In the next decade, an estimated 4.5 million people will be diagnosed with ASD (autism spectrum disorder).
