November 06, 2009

Technology Roundup: Booming EReaders in 2010, Ionic Batteries, Batteryless Neural Sensing Chip

1. MIT Technology Review reports on the work of Arizona-based Fluidic Energy, which is developing a metal-air battery that relies on ionic liquids, instead of an aqueous solution, as its electrolyte.

The company aims to build a Metal-Air Ionic Liquid battery that has up to 11 times the energy density of the top lithium-ion technologies for less than one-third the cost. Cody Friesen, a professor of materials science at Arizona State and founder of Fluidic Energy, says the use of ionic liquids overcomes many of the problems that have held back metal-air batteries in the past.

The research team will target energy densities of at least 900 watt-hours per kilogram and up to 1,600 watt-hours per kilogram in the DOE-funded ($5 million) project.
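As a quick plausibility check, the sketch below compares the DOE targets against a top-end lithium-ion cell. The ~150 Wh/kg lithium-ion baseline is an assumption (typical of circa-2009 commercial cells), not a figure from the article.

```python
# Compare the DOE project targets with a typical lithium-ion energy density.
# Assumption: ~150 Wh/kg for top commercial lithium-ion cells circa 2009.
LI_ION_WH_PER_KG = 150.0

for target_wh_per_kg in (900.0, 1600.0):  # project targets from the article
    multiple = target_wh_per_kg / LI_ION_WH_PER_KG
    print(f"{target_wh_per_kg:6.0f} Wh/kg is {multiple:.1f}x lithium-ion")
```

At the 1,600 Wh/kg upper target the multiple comes out near 11, matching the "up to 11 times" claim.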

The problem with ionic liquids is that they're still made in small quantities, making them expensive compared to many other solvents used to dissolve salts. "But some people are making ionic liquids now out of things that are already known and produced in high quantities, like detergents," says Wilkes.

2. A tiny radio chip implanted in a moth harvests power and senses neural activity.

Electrical engineers at the University of Washington have developed an implantable neural sensing chip that needs less power. Other wireless medical devices, such as cochlear or retinal implants, rely on inductive coupling, which means the power source must be within centimeters. The new sensor platform, called NeuralWISP, draws power from a radio source up to a meter away.

3. In 2008, 1.1 million e-readers with e-paper displays were sold; in 2010 that number will rise to about 6 million, according to market analysis firm MediaIdeas.

4. Forbes reports that Google Wave is being developed into a business collaboration system by SAP.

Electric Solar Wind Sail Could Have Five Times Higher Thrust

An electric solar wind sail is a recently introduced propellantless space propulsion method whose technical development has also started. The electric sail consists of a set of long, thin, centrifugally stretched and conducting tethers which are charged positively and kept in a high positive potential of order 20 kV by an onboard electron gun. The positively charged tethers deflect solar wind protons, thus tapping momentum from the solar wind stream and producing thrust. The amount of obtained propulsive thrust depends on how many electrons are trapped by the potential structures of the tethers, because the trapped electrons tend to shield the charged tether and reduce its effect on the solar wind.

A new research paper shows that if the trapped electrons can be removed, the thrust can increase about five times, to roughly 500 nN per meter of tether [at 1 AU for average solar wind conditions and for reasonable values of the driving voltage]. That corresponds to 1 Newton of thrust for 2000 kilometers of total tether; from the picture above, you could have 50 tethers (wires) that were each 40 kilometers long.
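The abstract's numbers are easy to cross-check: 500 nN per meter of tether over 2000 km of total tether is exactly 1 N. A minimal sketch:

```python
# Cross-check: enhanced thrust per meter of tether times total tether length.
thrust_per_m = 500e-9    # N/m, the enhanced estimate from the paper
n_tethers = 50           # e.g. 50 tethers...
tether_len_m = 40e3      # ...each 40 km long

total_len_m = n_tethers * tether_len_m
total_thrust_n = thrust_per_m * total_len_m
print(f"{total_len_m / 1e3:.0f} km of tether -> {total_thrust_n:.2f} N")
```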

There is a project to launch a prototype solar electric sail in 2012 with an Estonian satellite

The ESTCube-1 is a 1 kg nanosatellite and Estonia's first satellite, with planned launch in 2012. It will open a 10 meter tether made of very thin metal wire and charge it to 200 V with a miniature onboard electron gun. As the satellite flies in its orbital path through the ionospheric plasma, the speed difference between the satellite and the plasma induces a small force on the tether which can be measured. The measurement is used to validate and calibrate existing plasma physical theory of the electric sail effect.

Later, production-scale electric sails will use much longer tethers and will fly in the solar wind, utilising the much larger speed difference between the satellite and the fast-moving solar wind. According to estimates, electric sails can be orders of magnitude more efficient than existing methods (chemical rockets and ion engines) for many transport tasks in the solar system. Scientifically, they could revolutionize solar system science by enabling fast missions out of the heliosphere and affordable sample return missions from planetary, moon and asteroid targets. Commercially, electric sail could enable the economic utilization of asteroid resources for e.g. orbital rocket propellant production or orbital manufacturing of structural parts.

Picture caption: A more advanced siamese-twin deployment of two sets of solar electric sail tethers

Increased electric sail thrust through removal of trapped shielding electrons by orbit chaotisation due to spacecraft body

Here we present physical arguments and test particle calculations indicating that in a realistic three-dimensional electric sail spacecraft there exists a natural mechanism which tends to remove the trapped electrons by chaotising their orbits and causing them to eventually collide with the conducting tethers. We present calculations which indicate that if these mechanisms were able to remove trapped electrons nearly completely, the electric sail performance could be about five times higher than previously estimated, about 500 nN/m, corresponding to 1N thrust for a baseline construction with 2000 km total tether length.

A device with 2000 km total length of tether (for example, 50 tethers, each 40 km long) could weigh 50–100 kg in total (frame, solar panels, high-voltage power source, electron gun, motorised tether reels, various sensors and control processor), of which the tether mass is 10 kg. According to the new results, such a device could produce 1 N of thrust, for a specific acceleration of 10–20 mm/s2. If used to move a 500 kg payload, for example, the device would produce a 30 km/s velocity change over six months.
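Those figures are mutually consistent, as a rough check shows. The 30-day month and the choice of the 50 kg light end of the device-mass range for the delta-v estimate are assumptions made here:

```python
# Rough consistency check of the quoted electric sail performance figures.
THRUST_N = 1.0
SIX_MONTHS_S = 6 * 30 * 24 * 3600      # ~six months of thrusting (30-day months)

for device_kg in (50.0, 100.0):        # device mass range from the text
    spec_accel_mm = THRUST_N / device_kg * 1e3
    print(f"{device_kg:.0f} kg device -> {spec_accel_mm:.0f} mm/s^2")

payload_kg = 500.0
dv_m_s = THRUST_N / (payload_kg + 50.0) * SIX_MONTHS_S   # light-end device mass
print(f"delta-v with 500 kg payload: {dv_m_s / 1e3:.0f} km/s")
```

The result lands around 28 km/s, close to the quoted ~30 km/s.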

The theoretical results presented here call for experimental verification. The verification could come from a measurement of electrosphere size, thrust force or both in a space or laboratory experiment. Two-dimensional particle-in-cell or Vlasov plasma simulations might give a better estimate of the thrust force than the rough analytical calculations presented in this paper. The 2-D simulations would need to be equipped with some kind of trapped electron removal scheme. Because the electron temperature of 12 eV is several thousand times smaller than the depth of the potential well, extra care should be taken in the simulations to avoid spurious trapping by numerical errors.

Although the electric sail plasma physical problem is simple in the sense that only electrostatic forces are involved, the problem spans a wide range in parameter space. The range in energy goes from 12 eV electron temperature to 20 kV tether potential. The spatial scale is from 10μm radius wires to 100 m wide potential structure and to 20–100 km long tethers, which gives 7 to 10 orders of magnitude in space. Finally, the timescales start from 0.1 ps needed for an electron to move across a 10μm wire width to several minutes needed to remove the trapped electrons (15–16 orders of magnitude). It is evident from this range of scales that a brute-force simulation approach is not fruitful. Thus, while theory is essential and simulations helpful, experimental studies are crucial in designing the electric sail.
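The quoted spans of scale can be reproduced directly from the numbers in the paragraph above:

```python
import math

def orders_of_magnitude(smallest, largest):
    """Base-10 orders of magnitude spanned by a range of scales."""
    return math.log10(largest / smallest)

# Spatial: 10 um wire radius up to 100 km tethers (the long end of 20-100 km).
space = orders_of_magnitude(10e-6, 100e3)
# Temporal: 0.1 ps electron transit up to several minutes of electron removal.
time = orders_of_magnitude(0.1e-12, 5 * 60.0)
print(f"space: {space:.0f} orders, time: {time:.1f} orders")
```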

Finally, it is worth remarking that if the electric sail thrust is indeed as large as the estimates presented in this paper indicate, the potential of the electric sail for space transportation in the solar system is enormous. Exploring the potential scientific and commercial applications and implications is, however, outside the scope of this theoretical study.

Electric sail site

A full-scale electric sail consists of a number (50-100) of long (e.g., 20 km), thin (e.g., 25 microns) conducting tethers (wires). The spacecraft contains a solar-powered electron gun (typical power a few hundred watts) which is used to keep the spacecraft and the wires in a high (typically 20 kV) positive potential. The electric field of the wires extends a few tens of metres into the surrounding solar wind plasma. Therefore the solar wind ions "see" the wires as rather thick, about 100 m wide obstacles. A technical concept exists for deploying (opening) the wires in a relatively simple way and guiding or "flying" the resulting spacecraft electrically.
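The electron gun's current budget follows directly from I = P / V; the 200–400 W reading of "a few hundred watts" is an assumption here:

```python
# Electron current the gun can drive at the tether potential: I = P / V.
TETHER_POTENTIAL_V = 20e3       # 20 kV, from the description above

for power_w in (200.0, 400.0):  # assumed range for "a few hundred watts"
    current_ma = power_w / TETHER_POTENTIAL_V * 1e3
    print(f"{power_w:.0f} W -> {current_ma:.0f} mA")
```

So on the order of 10–20 mA of electron current is available to keep the tethers positively charged.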

The main limitation of the electric sail is that since it uses the solar wind, it cannot produce much thrust inside a magnetosphere where there is no solar wind. Although the direction of the thrust is basically away from the Sun, the direction can be varied within some limits by inclining the sail. Tacking towards the Sun is therefore also possible.

Negatively charged solar electric sail (9 page pdf)

Acceleration of Neutral Atoms with Lasers With Acceleration Up to 100 trillion Gs

Researchers observed previously unconsidered strong kinematic forces on neutral atoms in short-pulse laser fields. The ponderomotive force on electrons is the driving mechanism, producing ultra-strong acceleration of neutral atoms greater than Earth's gravitational acceleration by 14 orders of magnitude. A force of such strength may lead to new applications in both fundamental and applied physics. On the cover, a record of the deflection of neutral helium atoms after interaction with a focused laser beam.

Acceleration of neutral atoms in strong short-pulse laser fields

A charged particle exposed to an oscillating electric field experiences a force proportional to the cycle-averaged intensity gradient. This so-called ponderomotive force plays a major part in a variety of physical situations such as Paul traps for charged particles, electron diffraction in strong (standing) laser fields (the Kapitza–Dirac effect) and laser-based particle acceleration. Comparably weak forces on neutral atoms in inhomogeneous light fields may arise from the dynamical polarization of an atom; these are physically similar to the cycle-averaged forces. Here we observe previously unconsidered extremely strong kinematic forces on neutral atoms in short-pulse laser fields. We identify the ponderomotive force on electrons as the driving mechanism, leading to ultrastrong acceleration of neutral atoms with a magnitude as high as 10^14 times the Earth's gravitational acceleration, g. To our knowledge, this is by far the highest observed acceleration on neutral atoms in external fields and may lead to new applications in both fundamental and applied physics.
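To put 10^14 g in mechanical terms, the corresponding force on a single helium atom is tiny in absolute units but enormous for something of atomic mass:

```python
G = 9.81                 # m/s^2, standard gravity
M_HE_KG = 6.64e-27       # approximate mass of a helium atom

accel = 1e14 * G         # the reported acceleration, ~10^14 g
force_n = M_HE_KG * accel
print(f"a = {accel:.2e} m/s^2, F on one He atom = {force_n:.1e} N")
```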

The ponderomotive force at Wikipedia

In physics, a ponderomotive force is a nonlinear force that a charged particle experiences in an inhomogeneous oscillating electromagnetic field.
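In its standard form, for an electron of charge $e$ and mass $m_e$ in a field of amplitude $E_0$ oscillating at angular frequency $\omega$, the ponderomotive force is:

```latex
\mathbf{F}_p = -\frac{e^2}{4 m_e \omega^2}\,\nabla\!\left(E_0^2\right)
```

The force points down the intensity gradient regardless of the sign of the charge, which is why it can expel electrons from the laser focus and, as in the new work, drag the parent atoms along with them.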

In the new research the force was applied to uncharged atoms.

Nanocapsules for Artificial Photosynthesis and Improved Nanoparticles for Gene Therapy

1. Chemists from the University of Würzburg have made progress toward achieving artificial photosynthesis. Nanocapsules have been loaded with reactive molecules that convert UV light to visible light whose color varies with the pH of the environment. The chemical conversion of light could lead to artificial photosynthesis and, separately, be used for tiny pH sensors.

Unique material for the capsule shell

The Würzburg nanocapsules are made of a unique material. This was developed in Frank Würthner's working group on the basis of so-called amphiphilic perylene bisimides. If the base material, which can be isolated as a powder, is placed in water, its molecules automatically form so-called vesicles, though these are not stable at that point. It is only through photopolymerization with light that they become robust nanocapsules that are stable in an aqueous solution - regardless of its pH value.

The diameter of one nanocapsule is a mere 20 to 50 nanometers. Dr Xin Zhang, a visiting scientist from China, managed to fill the nanocapsules with other photoactive molecules.

Zhang smuggled bispyrene molecules into the nanocapsules. The special thing about these molecules is that they change their shape to suit their environment. Where the pH value is low, in other words in an acidic environment, they assume an elongated form. If they are then excited with UV light, they emit blue fluorescent light.

If the pH value rises, the molecules fold. In this shape they emit green fluorescent light. In this state the bispyrenes excite the capsule shell energetically, which reacts to this with red fluorescence.

Blue, green, and red. If the three primary colors overlap, this produces white - as with a color television. It is the same with the nanocapsules: with a pH value of 9, in other words slightly on the basic side of neutral, they emit white fluorescent light - "a so far unique effect in the field of chemical sensing, which might be groundbreaking for the design of fluorescence probes for life sciences," explains Professor Würthner.

The Würzburg chemists have access to an extremely sensitive nanoprobe: the pH value of an aqueous solution can be determined with nanoscale spatial resolution over the wavelength of the fluorescent light emitted by the nanocapsules.
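The readout logic described above can be sketched as a toy lookup. The threshold values below are illustrative guesses, not measured data from the Würzburg group:

```python
def nanocapsule_color(ph):
    """Toy sketch of the reported pH response of bispyrene-loaded capsules.
    Threshold values are illustrative, not measured."""
    if ph < 7.0:
        return "blue"         # acidic: elongated bispyrene fluoresces blue
    if ph < 9.0:
        return "green + red"  # folded bispyrene plus red capsule-shell emission
    return "white"            # ~pH 9: blue, green and red overlap to white

for ph in (5.0, 8.0, 9.0):
    print(f"pH {ph}: {nanocapsule_color(ph)}")
```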

This means that nanocapsules are not just an option for artificial photosynthesis, they can also be used for diagnostic applications. For example, they could be equipped with special surface structures that purposefully dock to tumor cells and then make these visible by means of fluorescence.

The value of artificial photosynthesis

Why conduct research into artificial photosynthesis? In photosynthesis, plants consume the "climate killer" that is carbon dioxide. In view of global warming, many scientists see artificial photosynthesis as a possible way of reducing the volume of the greenhouse gas carbon dioxide in the atmosphere. In addition, this process would also create valuable raw materials: sugar, starch, and the gas methane.

2. MIT has nanoparticles, made of biodegradable polymers, which offer a chance to overcome one of the biggest obstacles to realizing the promise of gene therapy:

The viruses often used to carry genes into the body can endanger patients. Furthermore, the particles created in Langer’s lab now rival viruses’ efficiency at delivering their DNA payload.

This summer the nanoparticle-delivered gene therapy successfully suppressed ovarian tumor growth in mice.

One drawback to non-viral vectors is that they are not as efficient as viruses at integrating their DNA payload into the target cell’s genome, says Leaf Huang, professor in the School of Pharmacy at the University of North Carolina. However, in the past several years, advances by Langer and others have improved that efficiency by several orders of magnitude.

“Non-viral vectors are now comparable to viral vectors, in some cases,” says Huang, whose research focuses on delivering genes surrounded by a fatty membrane. “They have come a long way compared to 10 years ago.”

Both viral and non-viral methods could eventually prove useful and safe, says gene therapy researcher Katherine High, who is part of a team that recently used viral gene therapy to restore some sight to children suffering from a congenital retinal disease.

The ovarian cancer treatment developed at MIT and the Lankenau Institute has been successful in animal studies but is not yet ready for clinical trials. Such trials could get under way in a year or two, says Anderson. Meanwhile, he and others in Langer’s lab are exploring other uses for their nanoparticles. Last month, the researchers reported using the particles to boost stem cells’ ability to regenerate vascular tissue (such as blood vessels) by equipping them with genes that produce extra growth factors.

“We’ve had success with gene delivery using these nanoparticles, so we thought they might be a safer, temporary way to modify stem cells,” says Anderson.

Second Day Results from the Space Elevator Games and the Third and Final Day

Day 3 also appears to be done with no change in the standings or prizes won. LaserMotive has won the Level 1 prize of $900,000. No other prizes were won and no other team qualified for a prize.

LaserMotive retained their lead, and inched closer to the 5 m/s benchmark – they removed some payload, and thus ran a bit faster – the official times were 3:49 and 3:48 – 13 seconds faster, in fact, for a speed of 3.9 m/s. The payload was about 200 grams lighter – 0.4 kg (unofficial), for an unofficial score of 3.9 * 0.4 / 4.8 = 0.325.
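The post's own arithmetic checks out, and the official time implies the climb height. The divisor 4.8 in the score formula is taken from the post as-is; the implied height is an inference, not an official figure:

```python
# Reproduce the post's score arithmetic and the implied climb height.
speed = 3.9                 # m/s, best day-two speed as reported
best_time_s = 3 * 60 + 49   # official time of 3:49, in seconds

implied_height_m = speed * best_time_s   # an inference, not an official figure
score = speed * 0.4 / 4.8                # 0.4 kg payload; 4.8 divisor from the post
print(f"implied climb ~{implied_height_m:.0f} m, score = {score:.3f}")
```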

Kansas City still fell short of reaching the top, though it seems that their problems are largely solved, so we can expect a credible challenge to LaserMotive from KCSP tomorrow.

USST were facing a series of problems, and were not able to run at all. What they can do Friday morning is anyone’s guess. Based on previous years, however, we should definitely not be counting them as having lost. All of their first-place climbs to date were made at the last minute of the last possible day.

Live action for day three has started.

USST (University of Saskatchewan) failed to climb more than a few meters and are now done. LaserMotive is done. Only the Kansas City Space Pirates can improve and try to get a prize-qualifying run.

LaserMotive climbing on day 3

10:39 PST: Hey #SEGames Ted Semon & Bryan Laubscher
11:23 PST: Hey #SEGames No.

USST & LaserMotive done. Hoping to get started with KCSP about 10 minutes from now [at about 1 PM PST].

November 05, 2009

Inertial MEMS Accelerometers That Are 1,000 Times More Sensitive Will Benefit Applications Such as Bridge, Infrastructure and Seismic Monitoring

HP today announced new inertial sensing technology that enables the development of digital micro-electro-mechanical systems (MEMS) accelerometers that are up to 1,000 times more sensitive than high-volume products currently available.

A MEMS accelerometer is a sensor that can be used to measure vibration, shock or change in velocity. By deploying many of these detectors as part of a complete sensor network, HP will enable real-time data collection, management, evaluation and analysis. This information empowers people to make better, faster decisions, and take subsequent action to improve safety, security and sustainability for a range of applications, such as bridge and infrastructure health monitoring, geophysical mapping, mine exploration and earthquake monitoring.

The HP sensing technology enables a new class of ultrasensitive, low-power MEMS accelerometers. Up to 1,000 times more sensitive than high-volume, commercial products, sensors based on this technology can achieve noise density performance in the sub-100 nano-g per square-root-hertz (ng/√Hz) range to enable dramatic improvements in data quality. The MEMS device can be customized with single or multiple axes per chip to meet individual system requirements.
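A noise density converts to an RMS noise floor by multiplying by the square root of the measurement bandwidth. The bandwidths below are assumed for illustration:

```python
import math

G = 9.81                    # m/s^2, standard gravity
NOISE_NG_RTHZ = 100.0       # ng/sqrt(Hz), upper bound of HP's "sub-100" figure

for bw_hz in (1.0, 10.0, 100.0):          # assumed measurement bandwidths
    rms_g = NOISE_NG_RTHZ * 1e-9 * math.sqrt(bw_hz)
    print(f"{bw_hz:5.0f} Hz -> {rms_g * G:.2e} m/s^2 RMS noise floor")
```

Even across a 100 Hz band the floor stays around a micro-g, which is what makes seismic and structural monitoring applications plausible.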

The sensing technology is a key enabler of HP’s vision for a new information ecosystem, the Central Nervous System for the Earth (CeNSE). Integrating the devices within a complete system that encompasses numerous sensor types, networks, storage, computation and software solutions enables a new level of awareness, revolutionizing communication between objects and people.

“With a trillion sensors embedded in the environment – all connected by computing systems, software and services – it will be possible to hear the heartbeat of the Earth, impacting human interaction with the globe as profoundly as the Internet has revolutionized communication,” said Peter Hartwell, senior researcher, HP Labs.

Modified HIV-Delivered Gene Therapy Could Treat Many Diseases

In a pilot study of two patients monitored for two years, an international team of researchers slowed the onset of the debilitating brain disease X-linked adrenoleukodystrophy (ALD) using a lentiviral vector to introduce a therapeutic gene into patients' blood cells. Although studies with larger cohorts of patients are needed, these results suggest that gene therapy with lentiviral vectors, which are derived from disabled versions of human immunodeficiency virus (HIV), could potentially become instrumental in treating a broad range of human disorders.

Other Gene Therapy Success and Progress
Lungs too damaged for use in transplant operations may be salvageable through a gene-based technique, doubling or tripling the supply of organs.

The flawed lungs could be removed from donors’ bodies after death and repaired using the gene IL-10, which lowers inflammation. 1800 people in the US are awaiting lung transplants.

Gene therapy helps treat a form of blindness. The condition is known as Leber’s congenital amaurosis, and there are 2,000 people in the US who have it.

A number of companies are developing gene therapies, and 320 trials are under way or cleared to begin by U.S. regulators, said Karen Riley, a U.S. Food and Drug Administration spokeswoman. Genzyme Corp. of Cambridge, Massachusetts, will begin a human trial using gene therapy next year to treat macular degeneration, the leading form of age-related vision loss, said John Lacey, a Genzyme spokesman.

Researchers at the California Institute of Technology (Caltech) have shown that a highly specific intrabody (an antibody fragment that works against a target inside a cell) is capable of stalling the development of Huntington's disease in a variety of mouse models.

Research in monkeys suggests that genetically delivering dopamine avoids some side effects and helps with Parkinson's.

In the new trial, reported today in the journal Science Translational Medicine, Bechir Jarraya and colleagues at the Molecular Imaging Research Center in Fontenay-aux-Roses, France, mimicked Parkinson's in monkeys by giving them a neurotoxin that causes movement problems characteristic of the disorder. The researchers then injected three genes involved in dopamine production into the brains of the monkeys, as well as specially designed probes to measure dopamine levels in the brain, monitoring the animals for up to three and a half years. The gene therapy restored concentrations of dopamine in the brain, corrected movement problems, and prevented dyskinesias--without any severe adverse side effects. An early stage human clinical trial using the same dopamine gene therapy approach is now underway.

The Modified HIV Gene Therapy

The healthy ALD protein was expressed in about 15 percent of blood cells, yet surprisingly this low level was sufficient to slow brain disease in ALD. "This percentage of correction will not be sufficient for all diseases," warns Aubourg. "There is a lot of work to be done to make this gene therapy vector more powerful, less complicated, and less expensive. This is only the beginning," he said.

Gene therapy is not without serious risks. Like other retrovirus vectors, the HIV-derived lentivirus vector is tasked with inserting the therapeutic gene in the chromosomes of the patients' cells. In a worst case scenario, this action could disturb the biology of the cells and patients could end up with leukemia; this outcome has occurred in past gene therapy trials. "The HIV-derived lentivirus vector basically has this same risk, although the design of the vector makes patients less prone to this side effect," said Aubourg.

Wrong Diagnosis has statistics on ALD

Prevalence rate: approximately 1 in 20,000, or 0.005%, or about 13,600 people in the USA
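Working the prevalence figures through confirms they are self-consistent; the US population value below is back-calculated from the article's own numbers (13,600 × 20,000):

```python
US_POPULATION = 272_000_000   # implied by the quoted figures (13,600 * 20,000)

rate = 1 / 20_000
cases = round(US_POPULATION * rate)
print(f"prevalence {rate:.3%}, about {cases:,} people in the USA")
```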

Wikipedia on Adrenoleukodystrophy

Adrenoleukodystrophy (ALD) (also known as "Addison-Schilder Disease," "Siemerling-Creutzfeldt Disease," and "Schilder's disease") is a rare, inherited disorder that leads to progressive brain damage, failure of the adrenal glands and eventually death. ALD is one disease in a group of inherited disorders called leukodystrophies. Adrenoleukodystrophy progressively damages the myelin, a complex fatty neural tissue that insulates many nerves of the central and peripheral nervous systems, eventually destroying it. Without myelin, nerves are unable to conduct an impulse, leading to increasing disability as myelin destruction increases and intensifies.


NY Times: For Gene Therapy, Seeing Signs of a Resurgence

Complete Genomics Sequences Human Genome for $1726 Cost of Materials and a New Project to Sequence 10 thousand Vertebrate Genomes

1. Complete Genomics has a report in the journal Science describing its proprietary DNA sequencing platform, including analysis of sequence data from three complete human genomes. The consumables cost for these three genomes sequenced on the proof-of-principle genomic DNA nanoarrays ranged from $8,005 for 87x coverage to $1,726 for 45x coverage for the samples described in this report.

Complete Genomics' sequencing process includes four distinct steps:
1) Sample preparation and library construction
2) Self-assembling DNA nanoarrays
3) Imaging, assembly and analysis
4) Combinatorial probe-anchor ligation (cPAL).

Complete Genomics’ scientists generated high-quality diploid base calls in as much as 95 percent of the genomes sequenced, identifying 3.2 million to 4.5 million sequence variants per genome processed.

Detailed validation of one genome dataset demonstrates a sequence accuracy of just one false variant per 100 kilobases, a remarkably low error rate, particularly for such an affordable technology.

Patterned genomic DNA nanoarrays and 70-base, unchained sequence reads are unique technical achievements. The company’s new patterned genomic DNA nanoarrays, which achieve a record high density of 2.85 billion spots per slide at 0.7 micron pitch, will enable Complete Genomics to sequence 10,000 human genomes in 2010.
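A 0.7 micron pitch implies roughly that spot count on a standard slide. The 25 mm × 75 mm slide dimensions below are an assumption, not a figure from the paper:

```python
# Geometric spot capacity of a slide at 0.7 um pitch.
PITCH_UM = 0.7
SLIDE_W_MM, SLIDE_H_MM = 25.0, 75.0   # assumed standard microscope slide

spots = (SLIDE_W_MM * 1e3 / PITCH_UM) * (SLIDE_H_MM * 1e3 / PITCH_UM)
print(f"geometric maximum: {spots:.2e} spots per slide")
```

The geometric maximum comes out near 3.8 billion, so the quoted 2.85 billion corresponds to patterning roughly three quarters of the slide area, under the assumed dimensions.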

Human Genome Sequencing Using Unchained Base Reads on Self-Assembling DNA Nanoarrays

Genome sequencing of large numbers of individuals promises to advance the understanding, treatment, and prevention of human diseases, among other applications. We describe a genome sequencing platform that achieves efficient imaging and low reagent consumption with combinatorial probe anchor ligation (cPAL) chemistry to independently assay each base from patterned nanoarrays of self-assembling DNA nanoballs (DNBs). We sequenced three human genomes with this platform, generating an average of 45- to 87-fold coverage per genome and identifying 3.2 to 4.5 million sequence variants per genome. Validation of one genome data set demonstrates a sequence accuracy of about 1 false variant per 100 kilobases. The high-accuracy, affordable cost of $4,400 for sequencing consumables and scalability of this platform enable complete human genome sequencing for the detection of rare variants in large-scale genetic studies.

51 page pdf with supplemental material

2. A group of genome and museum experts today launched an ambitious plan to decipher 10,000 vertebrate genomes. The Genome 10K plan, formally announced today and described online in the 5 November issue of the Journal of Heredity, is short on details: where funding will come from; what sequencing strategy to use; how to process and make use of data generated.

O’Brien, Haussler, and Ryder want to see genome sequencing cost $2,500 each, a hundred-fold or greater decrease from current costs. By waiting a few years for better sequencing technology, they expect to spend $50 million for the whole project.

Eight Objectives of the Lawrenceville Plasma Physics Focus Fusion Experiments

Lawrenceville Plasma Physics (LPP) is a small research and development company partway through a two-year experimental project to test the scientific feasibility of Focus Fusion: controlled nuclear fusion using the dense plasma focus (DPF) device and hydrogen-boron fuel. Hydrogen-boron fuel produces almost no neutrons and allows the direct conversion of energy into electricity. Success would mean thousands of times more total energy would be available and the energy would be cleaner and cheaper. LPP believes that with success they can lower the cost of energy up to 50 times.

This site has described how the Mr. Fusion scenario would change the world.

They have achieved one of the eight experimental goals so far. The eight goals and the timeline they are working toward are listed below.

By the End of 2009

* At 25kV (kilovolts): Produce 1 MA (million amperes), determine optimum gas pressure

Get the experimental machine to function at 25 kilovolts, the lowest planned experimental voltage, and to produce more than 1 Million Amps of current. They will also very shortly switch over to running with deuterium and thus achieve their first fusion reactions with FF-1. In achieving this goal, they will also determine the optimum gas pressure for this current.

* Test theory of axial magnetic field

The third goal is to test the theory that adding a small axial magnetic field, and thus a small amount of angular momentum, to the plasma will greatly increase the size of the plasmoids and thus the efficiency of energy transfer into the plasmoid.

* Move to 45kV, 2MA, with Deuterium
The fourth goal is to increase the charging potential on the machine, by 5 kV steps, up to the full capacity of 45 kV and in the process achieve a peak current of about 2 MA with deuterium.

* Confirm University of Texas Dense Plasma Fusion results, with better instruments

The fifth goal is to confirm the Texas results of high temperature and density, but with far more complete diagnostic instruments.

By end of 2010

* Heavier Gases: D + He + N, and shorter electrodes

The sixth goal is to confirm LPP’s theory that heavier gases will lead to higher compression and thereby achieve gigagauss fields. This will involve running with a combination of D (deuterium), He (helium) and perhaps N (nitrogen), and will also involve replacing the electrodes with shorter ones, which they predict will be optimized for the heavier gases. These experiments are more complex and will be more time-consuming.

* pB11

The seventh goal is to demonstrate some fusion burn with pB11 (proton-boron) fuel.

Proton-boron fusion would have very little neutron radiation as described in the wikipedia entry on aneutronic fusion
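The reaction in question fuses a proton with boron-11 into three alpha particles, releasing about 8.7 MeV carried almost entirely by the charged helium nuclei, which is why direct electric conversion is possible and neutron output is minimal:

```latex
\mathrm{p} + {}^{11}\mathrm{B} \;\longrightarrow\; 3\,{}^{4}\mathrm{He} + 8.7\ \mathrm{MeV}
```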

* Net energy
The eighth and final goal will be to demonstrate the scientific feasibility of producing net energy with pB11.

First Goal Achieving a Pinch, Has Been Done. Why it Matters

From Focus Fusion, Eric Lerner summarizes the significance of first shots and pinch as follows:

The achievement of a pinch, and on the second shot, means that we have accomplished one of the eight technical goals of the current experimental program. The machine is doing what we designed it to do, which is to transfer energy into a tiny plasmoid. It is quite unusual for a DPF to pinch right away. Normally fine-tuning of the electrodes and insulator and “conditioning” of the electrodes by several shots is required. That this was not needed is confirmation that our electrode and insulator dimensions, derived from LPP’s quantitative theory of DPF functioning, are accurate.

Tweaking of the Experimental System to Setup for Firing/Shots

The “down time” the crew has been experiencing stems from various components in the machine that prevent the “shot” from going off as it should. The whole machine, in a sense, has to be fine-tuned to eliminate leaks and losses, bring the charge to bear along the electrodes with the correct timing, and maintain the gas fill in the vacuum chamber.

Various components such as the vacuum, switches, triggers and so forth have been assembled, disassembled, tweaked, re-assembled.

Consider the vacuum chamber. It has many vulnerable points - there are “windows” for observation and connecting diagnostic instruments. Each connection point represents some vulnerability. Every time they change something, they have to test the vacuum again. There’s a big table in the room with FoFu, covered with tools. I visit the lab, and the guys are in there, switching out a Rogowski coil from the drift tube, for example. Re-connecting it. Testing the vacuum again. This is why the machine was designed as it is, with access to walk in under the machine and constantly take things off and add things on.

Forbes finds 79 Billionaires in China

According to Forbes there are 79 billionaires in China.

Earlier in the year, Forbes' March 2009 list of world billionaires found only 28 Chinese billionaires. The economic crisis is easing, particularly in China, so the net worth of Chinese billionaires is higher than it was earlier in the year.

Here is a link to the Forbes March 2009 list of world billionaires by country

Second Day of the Space Elevator Beaming Contest

No new successful runs on day 2, which is now complete. LaserMotive remains the only team with a successful prize level 1 run in the 2 meter/second to 5 meter/second range, with a day one speed of 3.72 meters/second.

Thu, Nov 05 2009
6:43:48 AM PST:
Good morning. First up today: USST. That will complete Round 1. Live at
6:45:06 AM PST:
Round 2 starts immediately after Round 1. Teams will run in this order: First LaserMotive, then KCSP then USST. Live at
USST (University of Saskatchewan Space Team) ran unsuccessfully on their first two attempts. Round 1 closed.

9:58:32 PST:
Getting ready for Round 2. First up is LaserMotive. Let's hope for a 5 m/s run! Live at
10:08:54 PST:
Let's hope for a 5 m/s run from LaserMotive!
Live at

Lasermotive twitter feed

11:15 PM PST
3 runs completed. [Lasermotive] Did not go fast enough for 5 m/s [during their second round runs]

View from Lasermotive Trailer after their second round runs. Helicopter is landing and cable is being brought down

Next up University of Saskatchewan (USST) and then Kansas Pirates for round 2.

21:46:37 PST:
USST has passed on their turn. KCSP has until 2:00pm to start their run.

USST's climber had overheating problems.

2:20:52 PM PST:
Hey #SEGames Looks like KCSP is ready to go! Live at
2:21:39 PM PST:
Looks like KCSP is ready to go for another try...

Space Elevator Games Live coverage console is linked to here

Yesterday LaserMotive qualified for the $900,000 prize by going faster than 2 meters per second (but not past 5 meters per second, which pays even more) over the 1000 meter cable.

The official speed was 3.72 meters per second for LaserMotive's best run on Nov 4, 2009.

Official results for day one

LaserMotive: Unofficial empty weight is 4.8 kg and the unofficial payload is 0.58 kg, so the unofficial score (speed times payload ratio) is 3.7 * 0.58 / 4.8 = 0.45. If other teams make it into the $900k bracket, the scores will be used to determine the order of the winnings.
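The score arithmetic can be checked with a quick calculation, using the unofficial figures reported here:

```python
# Unofficial day-one figures for LaserMotive, as reported above
speed = 3.7         # meters/second
payload = 0.58      # kg
empty_weight = 4.8  # kg

# Score = speed times payload ratio
score = speed * payload / empty_weight
print(round(score, 2))  # 0.45
```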

Kansas City Space Pirates also climbed, but a lot slower, getting to 850 m at 8:00, where we had to stop them due to a satellite lasing window closing. They were still moving when we shut them down, and their average speed was approximately 1.875 m/s.

Today’s schedule promises to be very exciting:

USST will go first, since they didn’t get a climb window yesterday.
LM will go next, and will surely be trying to get into the 5 m/s bracket, for the larger prize purse.
USST will then get their second climb window, and lastly
KCSP will get their second climb window and try to improve their performance.

Another Beyond CMOS Candidate

H. J. De Los Santos of NanoMEMS Research has proposed a new beyond-CMOS computing concept called nano-electron-fluidic logic.

Theory of Nano-Electron-Fluidic Logic (NFL): A New Digital "Electronics" Concept

A new digital "electronics" concept is introduced. The concept, called nano-electron-fluidic logic (NFL), is based on the generation, propagation and manipulation of plasmons in a two-dimensional electron gas behaving as an electron fluid. NFL gates are projected to exhibit femtojoule power dissipations and femtosecond switching speeds at finite temperatures. NFL represents a paradigm shift in digital technology, and is poised as a strong candidate for "beyond-CMOS" digital logic.

* Operates with far less heat and lower switching energies (femtojoules)
* Faster switching speeds (femtoseconds)
* Higher device density potential
* Terahertz operating speeds for chips
* Propagation velocity of the electron fluid is hundreds of times faster than electrons in current CMOS
* Device construction is compatible with current lithography

Nano-Electron Fluidic Logic (NFL) Device patent application 2009026764

A nano-electron fluidic logic (NFL) device for controlling launching and propagation of at least one surface plasma wave (SPW) is disclosed. The NFL device comprises a metallic gate patterned with a plurality of terminals at which SPWs may be launched and a plurality of drain terminals at which the SPWs may be detected. A wave guiding structure such as a 2DEG electron fluid facilitates propagation of the SPW within the structure so as to scatter/steer the SPW in a direction different from a pre-scattering direction. A bias SPW is excited by an application of a control SPW with a momentum vector at an angle to the bias SPW and a control current with a wavevector which scatters the bias SPW in the direction of at least one output SPW, towards a drain terminal. The NFL device being rendered with device speed as a function of SPW propagation velocity.

* The speed of the device is a function of SPW propagation velocity, allowing terahertz switching frequencies.

A previous paper from 2004 by Héctor J. De Los Santos: NanoMEMS SYSTEMS ON CHIP

NanoMEMS exploits the convergence between nanotechnology and microelectromechanical systems (MEMS) brought about by advances in the ability to fabricate nanometer-scale electronic and mechanical device structures. While the “nano” aspect of this field is in its infancy, and is not expected to reach maturity until well into the 21st century, its “MEMS” aspect is a topic of much current and near-term impact in, for instance, RF/Wireless communications. In this context, we discuss the fundamentals of NanoMEMS, in particular, as it relates to its most speculative and futuristic paradigms and applications, and then focus on the RF/Wireless MEMS aspect, specifically in its role as enabler of ubiquitous wireless connectivity.

Galactic Suite Orbital Hotel Taking Reservations for 2012

Galactic Suite is a Barcelona-based company that plans to have a space hotel operating in orbit by 2012.

Physorg had some information on this project funded by an anonymous billionaire.

The cost of three nights on the Galactic Suite Space Resort (plus a two-month training course on a Caribbean island beforehand) will be $4.4 million US. At least 43 people have already reserved their place, with over 200 expressing an interest.

The Galactic Suite Space Resort plans to start with one pod holding four passengers and two astronaut pilots. The pod would orbit 280 miles (450 km) above the earth and travel at 18,640 mph (30,000 kph). Passengers would take a day and a half to reach the pod by Russian-built rocket, after blasting off from a spaceport on a Caribbean island. The rocket would dock with the pod for their entire stay to give the guests a sense of security. At the end of their stay the passengers would return to the rocket for the trip back to earth.

Claramunt said the project had received a $3 billion grant from an anonymous space-enthusiast billionaire.

Galactic Suite News site

Bigelow Aerospace, the American Billionaire backed Space Hotel Company
A competing and better known company trying to build inflatable space hotels is Bigelow Aerospace.

Bigelow has launched two prototypes into orbit and has the following plans.

The third planned Bigelow launch, Sundancer, will be equipped with full life support systems, attitude control, orbital maneuvering systems, and will be capable of reboost and deorbit burns. Like the Genesis pathfinders, Sundancer will launch with its outer surface compacted around its central core, with air expanding it to its full size after entering orbit. After expansion, the module will measure 8.7 m (28.5 ft) in length and 6.3 m (20.6 ft) in diameter, with 180 cubic meters (greater than 6,000 cubic feet) of interior volume. Unlike previous Bigelow craft, it will feature three observation windows. As of 2009, SpaceX has been contracted to provide a Falcon 9 vehicle for a launch in 2011.

In August 2009, Bigelow Aerospace announced the development of the Orion Lite spacecraft, intended to be a lower cost, less capable version of the Orion spacecraft under development by NASA. The intention is for Orion Lite to provide access to low Earth orbit using either the Atlas 5 or Falcon 9 launch systems, carrying a crew of up to 7.

Bigelow Aerospace was founded by Robert Bigelow and is funded in large part by the fortune Bigelow gained through his ownership of the hotel chain Budget Suites of America. As of 2006, Bigelow had invested US$75 million in the company. Bigelow has stated that he is prepared to fund Bigelow Aerospace with about US$500 million through 2015.

On April 10, 2007, Bigelow Aerospace announced business plans to offer (by 2012) a four-week orbital stay for US$15 million, with another four weeks for an additional US$3 million. An entire orbital facility could also be leased for US$88 million a year, or half a facility for US$54 million a year.

Bigelow Space Hotel concept

Galactic Suite Video

November 04, 2009

LaserMotive has successfully qualified for the $900,000 Space Elevator Prize

Picture caption: An earlier picture of the LaserMotive climber. It is mostly solar cells, which absorb the laser light and power a motor that climbs the cable. On the right is a screenshot of LaserMotive's climber from the ustream video.

3:00:39 PM PST:
LaserMotive Climber is climbing! Live at
3:10:58 PST:
First climb was 4 minutes, 2 seconds - officially. They qualified for the $900K prize. Live at
3:11:11 PST:
Now trying a second climb Live at
3:12:48 PST:
Second climb completed - unofficially also 4:00 or so. These guys HAVE to be happy... Live at

If neither the University of Saskatchewan nor the Kansas City Space Pirates can beat that time over the next few days, then LaserMotive will win the prize.

But in 2009 there will be a winner of the laser beaming space elevator contest.

See the live updates for more runs tomorrow and Friday and to review today's competition

Picture caption: The bottom of the 1000 meter (1100 yard) cable that is suspended from a helicopter.

Youtube Video of the Actual Climb

(H/T to reader Jriskin)

Power beaming prize description and rules page.

Vertical Distance: 1 km
Speed: 2 m/s, 5 m/s
Prize Purses:
$900,000 for 2-4.99 meters/second
$1.1 Million for 5+ meters/second

LaserMotive's climber went 1000 meters in 242 seconds, which is about 4.13 meters/second.

A climb faster than 3 minutes and 20 seconds is needed to win the 5 meter per second prize for $1.1 million.
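A quick check of those numbers, 242 seconds over the 1000 meter cable, and the time needed to reach the 5 m/s bracket:

```python
cable_length = 1000.0  # meters
climb_time = 242.0     # seconds, LaserMotive's best run

average_speed = cable_length / climb_time  # about 4.13 m/s
time_for_5_ms = cable_length / 5.0         # 200 seconds = 3 minutes 20 seconds
print(round(average_speed, 2), time_for_5_ms)
```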

Lasermotive's blog

LaserMotive (Tuesday, Nov 3, 2009) beamed roughly 400 watts of laser power to a moving target at a distance of 1 kilometer, as part of the vertical laser alignment procedure. The target was a retro-reflective board a little larger than 1 meter on a side. I don’t know offhand if that is a record; I will have to check once things calm down. (It’s a record that will likely be broken tomorrow by one or more teams, of course.)

Lasermotive site

Summarizing Space Elevator Feasibility Articles
For a Carbon Nanotube tether that is 30 MYuri [A MYuri is the name we gave the SI equivalent of N/Tex, or GPa-cc/g] strong, and a characteristic time constant (CTC) of 1 year the Feasibility Condition requires that the climbers will have a power density of at least 1.0 kiloWatt/kg.

So where do the competition requirements stand in respect to this?

It is easy to show that when moving straight up, the power density of the climber is directly proportional to its speed (mgv/m), and so a 5 m/s speed in 1 g gravity corresponds to 50 Watt/kg, or about 5% of a real commercial scale Space Elevator climber.
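That power density figure follows directly from the climb rate, since P/m = mgv/m = gv:

```python
g = 9.8      # m/s^2, gravitational acceleration
speed = 5.0  # m/s, the top contest bracket

power_density = g * speed   # 49 W/kg, the ~50 Watt/kg figure above
commercial_target = 1000.0  # W/kg, from the Feasibility Condition above
print(power_density / commercial_target)  # roughly 0.05, i.e. about 5%
```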

Today's materials perform at 2.5 – 3 MYuri (GPa-cc/g) at best when built as tethers suitable for the tests.

The Space Elevator would function a lot better with a stronger material, but ~35 MYuri is roughly the bare minimum that we need. Keep in mind that successive 50% improvements in material strength are very large steps, but we already know that CNT molecules are measured at ~50 MYuri, and fabricated CNT micro-bundles have been produced by several labs at 10 MYuri, so this challenge is not impossible.
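For scale, specific strength in MYuri (GPa-cc/g) converts to absolute tensile strength by multiplying by the material density. The density below is an assumed illustrative value, not a measured CNT fiber figure:

```python
assumed_density = 1.3  # g/cc -- hypothetical fiber density, for illustration only

# Specific strengths quoted above: today's tethers, CNT micro-bundles,
# the ~35 MYuri requirement, and measured CNT molecules.
for myuri in (3, 10, 35, 50):
    strength_gpa = myuri * assumed_density
    print(myuri, "MYuri ->", round(strength_gpa, 1), "GPa")
```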

Achieving higher speeds and power density for the climber seems more likely than achieving stronger tethers.

Microfluidic Injector of Biomolecules Into Cells Automates and Reduces Cost of A Complex Drug Development Process

(a) Zebrafish embryo immobilized by suction capillary. (b) Needle inserted into yolk sac. (c) Electroosmotic pumping of methylene blue solution into the embryo by the application of 25 V for 10 s. (d) Needle retracted from the embryo. Credit: McMaster Engineering

Physorg reports that a microfluidic micro-injector achieved an almost 80 per cent success rate in injecting Zebrafish embryos.

"This device is to drug discovery what the assembly line was to the automobile or the silicon chip to information technology," explains Ravi Selvaganapathy, assistant professor of mechanical engineering at McMaster and lead author of the research. "It turns what was a complex, resource-intensive process available to a few into an automated, predictable, reliable, and low-cost system accessible to almost anyone."

Notably absent is the need for a microscope or optical magnification to conduct the process, which is required for manual injection and to monitor transfection methods. The microfluidic device also allows easy integration of post-processing operations including cell sorting and the testing of cell viability on the same chip.

"The micro-injectors can easily be run in parallel and allow for scientists to test far greater combinations of materials in a much shorter time than current processes. It also makes it more feasible to pursue drug discovery for many so-called neglected diseases."

The micro-injector also holds great promise for in-vitro fertilization, as it provides far greater accuracy and control than current manual injection procedures, which have high failure rates, require trained expertise and can be time intensive.

Lab on a Chip Journal abstract from McMaster University Researchers: Microinjection in a microfluidic format using flexible and compliant channels and electroosmotic dosage control

We present a novel PDMS-based microinjection system in a microfluidic format with precise electroosmotic dosage control. The device architecture is fully scalable and enables high-throughput microinjections with integrated pre- and post-processing operations. The injection mechanism greatly simplifies current methods as only a single degree of freedom is required for injections. The injections are performed inside a fully enclosed channel by an integrated microneedle. Actuation of the needle is achieved by the compliant deformation of the channel structure by an external actuator. Reagent transport is achieved using electroosmotic flow (EOF) which provides non-pulsating flow and precise electrical dosage control. The potentials used for injections were between 5 V–25 V. The electrical properties and flow rates for the device were characterized for Zebrafish embryos and Rhodamine B and Methylene blue in pH 10 buffer solution. We also propose a method to enable precise individual dosing of embryos using direct electrical feedback. Additionally, we show that electrical feedback can be used to verify the location of the needle inside the injection target. A preliminary viability study of our device was conducted using Zebrafish (Danio rerio) embryos. The study involved the injection of ultrapure water into the embryos in an E3 buffer, and resulted in embryos that showed normal development at 48 hours.

Space Elevator Games Beaming Competition is Today and Goes Through Friday

Today is the first day of the 2009 Space Elevator Games - Climber / Power-Beaming competition

First climb was 4 minutes, 2 seconds - officially. LaserMotive qualified for the $900K prize. Live at

If neither the University of Saskatchewan nor the Kansas City Space Pirates can beat that time over the next few days, then LaserMotive will win the prize.

In 2009 there will be a winner of the laser beaming space elevator contest.

See the live updates for more runs tomorrow and Friday and to review todays competition

First up - Kansas City Space Pirates
Next up - LaserMotive
Last up - USST

Depending on the results of the first set of runs, then the order of second and subsequent runs will be set.

There is also a Space Elevator Games Console with a message and video feed, which is now working. Video is showing the helicopter flying and lifting the one kilometer (0.6 mile) long cable track which will be climbed.

Latest Updates:
9:09 PST: We should be seeing a battery powered climb by USST shortly now. Live at
9:15 PST: Helicopter spooling up. Live at
9:22 PST: Heli doing wind check Live at
9:28 PST: Hooking up climb cable Live at
9:30 PST: Heli now lifting cable - battery powered climb coming up Live at
9:35 PST: If you have any questions, please put them on this chat - Brian & I will try to answer them Live at
9:47 PST: We're hoping for the first competition run to start around 10:15 - Pacific time Live at
11:26:05 PST:
Finally getting ready - hope for our first competition climb in 10-15 minutes... Live at
11:27:06 PST:
heli re-fueled and spooling up Live at
11:29:00 PST:
heli up Live at
11:42:43 PST:
Climber now being pulled up to starting position Live at
11:45:27 PST:
Cleared to lase Live at
11:51:18 PST:
There is some issue - they're bringing the climber down to look at it. Live at

12:54:38 PST:
KC Space Pirates done for the day - no "in the money" climb for them today unfortunately. Live at
12:55:28 PST:
Breaking for lunch - LaserMotive first up this afternoon. Stay tuned! Live at
13:39:54 PST:
Looks like the action will resume about 2:00pm, Pacific time, perhaps a bit earlier. Live at
14:14:17 PST:
KC Space Pirates done for the day - they'll try again tomorrow. Live at
14:15:13 PST:
LaserMotive next up. They have their climber on the tether and the helicopter should be lifting shortly. Live at
14:46:03 PST:
LaserMotive Climber failed to move despite repeated attempts. Climber now being brought down. Live at
14:51:05 PST:
LaserMotive people now looking at climber Live at

Nextbigfuture has been tracking the space elevator games and development work towards space elevators.

This year's competition is to see if competitors can surpass about 5% of the power density capability needed for the climber component of a space elevator.

The power density of the climber is directly proportional to its speed (mgv/m), and so a 5 m/s speed in 1 g gravity corresponds to 50 Watt/kg, or about 5% of a real Space Elevator climber.

Alan Boyle has coverage of the space elevator beaming competition

All three of the teams entered in the competition - the Kansas City Space Pirates, LaserMotive and the University of Saskatchewan's USST team - were technically capable of taking the prize. But the challenge could be complicated by other factors, ranging from dealing with the wind to keeping the copter in a stable position, to making sure the cable "racetrack" is easily navigable.

Shelef is hoping that at least one of the teams will end up with some money by the time all is said and done. In order for the full $2 million to remain in NASA's kitty, "all three teams would have to strike out," Shelef said.

Spaceward Foundation site

Interview with Parabon Computation CEO by Sander Olson

Here is an interview with Dr. Steven Armentrout, the CEO of Parabon Computation. Parabon has a unique approach to cloud computing - they do not own or operate any data centers. Rather, they contract out with Universities and other organizations to utilize unused compute cycles from large groups of idle computers. Parabon then leases most of this unused compute power to its customers. A portion of the unused compute power is allocated to Parabon Nanolabs, which designs drugs.

Parabon in Recent News
Parabon NanoLabs Founding Scientist, Dr. Christopher Dwyer receives the Presidential Early Career Award for Scientists and Engineers from the White House

Dr. Dwyer co-founded Parabon NanoLabs, whose unique combination of DNA nanotechnology fabrication and grid computing sequence optimization has culminated in the development of proprietary technology for precisely directing the self-assembly of designer macromolecules. Dr. Dwyer has a unique combination of wet-lab and bit-lab experience, and is a pioneer in the merged disciplines of DNA nanotechnology and computer science. He has conducted extensive research using DNA as scaffolding to support sensors that are programmed to target specific devices -- for use in cancer therapeutics, bioweapons defense, and rapid readouts of DNA.

New anticancer molecules engineered to self-assemble using synthetic DNA

The grant will be used to demonstrate the viability of a new class of anticancer molecules that are engineered to automatically self-assemble from interlocking strands of synthetic DNA. It was a combination of innovations -- DNA nanotechnology fabrication and grid computing sequence optimization -- that led to Parabon NanoLabs' award.

Unlike other therapeutics, Parabon's compounds are deliberately engineered to solve specific therapeutic goals using an approach that effectively replaces the current paradigm of "drug discovery" with that of "drug design." By affixing molecular subcomponents (e.g., antibodies, pharmaceuticals and enzymes) to strands of DNA that are pre-sequenced to attach to one another to form composite constructs, Parabon NanoLabs researchers produce therapeutics that are able to precisely target and destroy individual cancer cells, without damaging surrounding healthy tissue.

Question 1: Your company, Parabon Computation, has a unique approach to cloud computing. Can you describe it?

Answer: Whether you call it high-performance cloud computing (HPCC) or grid computing, as we have for the past 10 years, Parabon offers the only brokered computation service in the world. We enter contracts with Universities and businesses to acquire unused compute capacity. We then aggregate this surplus capacity and make it available as an online supercomputing service, called the Parabon Computation Grid. We offer the capability to scale a computational problem across thousands of machines, and to do so within seconds.

Question 2: How do you quantify the offering?
Answer: Our metric for measuring computer capacity is based on the cap, which is the capacity or speed of an average computer. A cap-hour is the amount of work that such a computer could perform in an hour. So if a client has a job that requires 5,000 computers to work for two hours, he would reserve 10 kilo-cap-hours. Using our system, a client can rapidly and inexpensively acquire as much supercomputing power as they need, for as long as they need it.
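The cap-hour arithmetic in that answer works out as follows, using the 10 to 30 cents per cap-hour range Armentrout quotes later in the interview:

```python
computers = 5000
hours = 2

cap_hours = computers * hours  # 10,000 cap-hours = 10 kilo-cap-hours
print(cap_hours)

# Cost at the quoted 10-30 cents per cap-hour
cost_low = cap_hours * 0.10   # $1,000
cost_high = cap_hours * 0.30  # $3,000
```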

Question 3: How much computer capacity can a client reserve? At what point does your system become overwhelmed?
Answer: We can scale almost arbitrarily. If a client needs several hundred teraflops, we can provide that. If a compute job requires computing power for weeks or months, we can provide that as well, although we might need some time to ramp up the capacity base, depending on the amount of capacity requested.

Question 4: Amazon charges 10 to 80 cents per hour for each CPU rented. How do Parabon's prices compare?
Answer: We are in the range of 10 to 30 cents per cap-hour. The advantage of our system is that we aren't just giving you a slice of some server's capacity; rather, each server/workstation/PC is entirely dedicated to a task when it is calculating on it. But I should note that we don't consider ourselves competitors to traditional cloud providers such as Amazon. The clients who are using our Frontier Grid Platform service routinely require hundreds or thousands of computers to accomplish their tasks, and it is difficult for most standard cloud providers to supply that level of capacity at a moment's notice.

Question 5: What proportion of compute power on the web is wasted? How much potential is there to utilize unused compute cycles?
Answer: 80-95% of the capacity of most computers currently goes to waste. Parabon's model therefore has an enormous amount of headroom. Demand for computing power will rapidly grow, and organizations will naturally try to find the lowest cost computer power available. For the overwhelming majority of high-performance computing tasks, we offer the most cost-effective and convenient solution.

Question 6: Parabon has recently created a new company called Parabon NanoLabs. What is the focus of this spinoff?
Answer: Parabon NanoLabs (PNL) develops and licenses proprietary macromolecules built from DNA-based nanostructures. PNL has designed molecules for cancer therapeutics, nanoscale sensors, and DNA biometrics using a CAD/CAM program called the inSēquio™ Sequence Design Studio. This application allows us to design a particular structure in software such that, by leveraging the self-assembly capabilities of DNA, we can create trillions of copies in the lab simultaneously. Designing these macromolecules is so complex that it could not be done without high-performance computing. So in this case, NanoLabs is a major consumer of the computing power provided by Parabon Computation.

Question 7: Does Parabon have any plans to utilize GPU computing in the future?
Answer: Absolutely, we are very excited about the prospects for GPU computing. GPUs can actually be used for grid computing today, but we have on our roadmap an extension to our software development kit that will allow even easier access to the GPU from a grid job. This software should become available in 2010.

Question 8: How many users are renting grid computing power on a regular basis?
Answer: There are currently hundreds of users, and we expect there will soon be thousands. When Parabon began operation in 1999 there were far fewer users, but demand has grown steadily. Initially this paradigm was controversial, but it has clearly proven itself and people now understand the concept of “computation on demand.”

Question 9: One of your services actually simulates Distributed Denial of Service (DDoS) Attacks. Is there any danger that this could be used by criminal elements?
Answer: One of the services that we provide is called Blitz. This service allows organizations to test their vulnerability to DDoS attacks and therefore improve their chances of successfully defending against them. Since we alone have the application and credentials to run these DDoS operations, there is little to no risk that an actual DDoS attack could be launched using Blitz. On the contrary, Blitz allows Government agencies and corporations to strengthen and improve their defenses to DDoS attacks.

Question 10: Parabon has a capacity market. How does this work?
Answer: We seek out organizations that have large numbers of unused computers that are routinely idle during certain periods. Many universities, for instance, have large numbers of PCs and servers that are idle at night. We contract out with the organization to use a set number of computers for set periods. Currently we are only accepting bids for 50 caps of computing power or more, but we will eventually allow bids for individual computers.

Question 11: But many institutions are more interested in effectively utilizing their own surplus computer power than renting it from someone else.
Answer: Yes, and that is why we developed software to allow corporations to utilize the surplus capacity of their own computing infrastructure. Our Frontier Enterprise software can be up and running within 15 minutes; it never interferes with normal operations, and allows corporations to effectively utilize every computer they own in a customizable, private grid. Given that most organizations waste the vast majority of their computing power, this software is very powerful, and an easy solution for companies interested in increasing their high-performance computing (HPC) capability using their own IT infrastructure.

Question 12: To what extent is cloud computing hampered by a lack of standards?
Answer: I don't think that it is. If you look at the numbers, cloud computing and grid computing are rapidly being adopted, perhaps because they represent a paradigm shift that is long overdue. Both are growing exponentially and should continue to do so.

Question 13: So this supercomputing power can be directly accessed through almost any computing device, including cellphones?
Answer: Yes, the Frontier service can be accessed through almost any computing device. For example, a cell phone that has a browser can be used to drive a web-based Frontier application, providing it direct access to hundreds of teraflops of compute power.

Question 14: How much growth do you expect in the cloud computing field in the next decade?

Answer: I believe that we are on the cusp of a major transition away from traditional data centers and towards cloud and grid models. I expect to see a 20% compounded growth rate in these fields. In 2019, individuals will be renting computer power routinely, and the entire process will be almost transparent. How many machines and what exact resources are being used will be invisible to the user. They will simply specify the computational task they want completed and when they want it done, and the application will tell them how much it will cost. Accessing spare computer power will be no harder than getting electricity from a wall-socket today.
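A 20% compounded rate over that decade implies roughly a six-fold expansion:

```python
rate = 0.20
years = 10

growth_factor = (1 + rate) ** years
print(round(growth_factor, 2))  # about 6.19x by 2019
```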

November 03, 2009

Ethanol Has up to a 2.2 to 1 Energy Return and Half the Greenhouse Gases of Gasoline

New Energy And Fuel reports on a new study which says that previous studies that tarred ethanol as an environmental villain were flawed because they looked at outdated corn and ethanol production techniques. The more modern ethanol plants – which account for about 60% of U.S. production and will account for 75% by the end of 2009 – have become a lot more efficient at growing and harvesting corn and turning it into alternative fuel.

(H/T Al fin)

(17 page pdf, for the new study) Improvements in Life Cycle Energy Efficiency and Greenhouse Gas Emissions of Corn-Ethanol

Direct-effect GHG emissions were estimated to be equivalent to a 48% to 59% reduction compared to gasoline, a twofold to threefold greater reduction than reported in previous studies. Ethanol-to-petroleum output/input ratios ranged from 10:1 to 13:1 but could be increased to 19:1 if farmers adopted high-yield progressive crop and soil management practices. An advanced closed-loop biorefinery with anaerobic digestion reduced GHG emissions by 67% and increased the net energy ratio to 2.2, from 1.5 to 1.8 for the most common systems. Such improved technologies have the potential to move corn-ethanol closer to the hypothetical performance of cellulosic biofuels. Likewise, the larger GHG reductions estimated in this study allow a greater buffer for inclusion of indirect-effect land-use change emissions while still meeting regulatory GHG reduction targets. These results suggest that corn-ethanol systems have substantially greater potential to mitigate GHG emissions and reduce dependence on imported petroleum for transportation fuels than reported previously.
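To put the 48% to 59% direct-effect reduction in concrete terms, here is a minimal sketch using an illustrative gasoline baseline of 94 gCO2e per MJ (a hypothetical figure for illustration, not a number from the study):

```python
gasoline_ghg = 94.0  # gCO2e/MJ -- illustrative baseline, not from the study

for reduction in (0.48, 0.59):
    ethanol_ghg = gasoline_ghg * (1 - reduction)
    print(f"{reduction:.0%} reduction -> {ethanol_ghg:.1f} gCO2e/MJ")
```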


Discussion Related to Skylon Spaceplane Interview

This site had an interview with Richard Varvill, the Technical Director and Chief Designer at Reaction Engines Limited. Reaction Engines Limited is a UK company that is developing a fully reusable launch system called Skylon.

There was a comment (link to the comment) by reader Goatguy, to which Richard Varvill has responded.

Below is a response to the specific points made by 'Goatguy':

1) SINGLE STAGE - Sounds great. But if they're ejecting a fuel tank, it isn't single stage. Let's remember to call technology standards for what they are.

Skylon does not eject a fuel tank. Skylon is single stage to orbit and has no expendable fuel tanks or equipment. Everything that takes off comes back apart from propellant and some cooling water.

2) 200 use - why there? Well, undoubtedly because the ceramic tiles that protect the present, beautifully ageing space-shuttle last for about 1/4 to 1/2 that many missions. They figure they can outdo the shuttle's reusability.

The 200-flight figure is not determined by the lifetime of the TPS aeroshell, nor is the TPS material the same as the Shuttle's. Skylon's aeroshell is a reinforced glass-ceramic composite manufactured in thin sheets and corrugated for stiffness. The aeroshell runs several hundred degrees cooler than the Shuttle's during re-entry due to Skylon's lower ballistic coefficient. Re-entry insulation is provided by an internal multifoil insulation blanket.

The vehicle design lifetime is determined by economic considerations, representing a trade-off between development and operational costs (i.e., a longer design lifetime would reduce operational cost but increase the required development program's cost and duration). Once introduced into service, the vehicle lifetime and reliability will be gradually improved as the operational environment reveals what the real life-limiting factors are.

3) 400 x better? That takes quite a leap of faith! They're not going to cut the kg/kg fuel/payload ratio much - especially if they don't jettison the external fuel cells and go "2 stage" or "3-stage" in effect.

400x is the improvement in reliability compared to expendable rockets – which is mainly due to Skylon's ability to abort in all flight regimes and the position of each individual vehicle on the "bathtub curve" (as supplied, each vehicle is quite a way down the "wear-in" Weibull distribution, since it is flight tested prior to delivery to the customer).
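The "wear-in" point can be sketched numerically. A Weibull hazard with shape parameter k < 1 falls as flights accumulate, which is the infant-mortality end of the bathtub curve; the parameters below are invented for illustration and are not Skylon data:

```python
# Illustrative only: a Weibull hazard with shape k < 1 decreases with
# accumulated flights -- the "wear-in" (infant mortality) part of the
# bathtub curve. Parameters are made up for the sketch, not Skylon data.
def weibull_hazard(t, k=0.5, lam=100.0):
    """Instantaneous failure rate after t flights, h(t) = (k/lam)*(t/lam)^(k-1)."""
    return (k / lam) * (t / lam) ** (k - 1.0)

# Hazard on flight 1 vs. flight 10: flight testing before delivery
# means the customer never sees the highest-risk early flights.
h1, h10 = weibull_hazard(1), weibull_hazard(10)
print(f"hazard ratio, flight 1 vs flight 10: {h1 / h10:.2f}x")
```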

4) $5M per flight? OK, maybe ... especially since the SpaceShipOne folks figured out (rightly I might add) that wings, turbojet-engines, standard JP1 and a whole lot of atmosphere (especially in a ramjet secondary) can get you to Mach 6 or 7 (about 2 km/sec) without too much trouble. The 90-minute-orbit (250 km up, r = 7,125km, C = 44,700km, V = C/5400 = 8.3km/sec) then gets its first 2 km/sec off the oxygen-breathing "stage". But the rest (18x more energy) needs to come from non-air-breathing rocketry.

Getting any single stage vehicle to Mach 6-7 from a standing start using air-breathing engines is not easy (and in fact has never been done). Although the rocket powered phase provides most of the energy the air-breathing phase is more technically difficult (from a propulsion perspective).
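As a quick check on the orbital figures in the comment above, the circular orbital velocity at 250 km altitude follows from v = sqrt(mu/r) with standard Earth constants (the comment's r = 7,125 km and 8.3 km/sec are somewhat on the high side):

```python
import math

# Back-of-envelope check of the circular orbital velocity quoted in
# the comment, using v = sqrt(mu / r) with standard Earth constants.
MU_EARTH = 3.986e5     # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6378.0       # km, equatorial radius

altitude = 250.0                      # km
r = R_EARTH + altitude                # orbital radius ~ 6,628 km
v = math.sqrt(MU_EARTH / r)           # ~7.76 km/s circular velocity
period = 2 * math.pi * r / v / 60.0   # ~89.5 minutes

print(f"v = {v:.2f} km/s, period = {period:.0f} min")
```

Either way, the air-breathing phase supplies only about 2 km/sec of the roughly 7.8 km/sec orbital velocity, which is the comment's essential point.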

5) Looking at (4) more ... the whole reusable shuttle concept is vexed with "protection-and-acceleration costs". It has to be BIG enough to store the fuel to get from 2,000 m/sec to 8,300 m/sec. But BIG = WEIGHT, which equals more fuel. Which requires a bigger shuttle. gah. And cuts down on the payload.

The above mass-growth effect is encapsulated by the well-known 'rocket equation'. Actually, on Skylon most of the fuel (hydrogen) is used in air-breathing mode getting up to Mach 5, and hydrogen tankage accounts for most of the fuselage volume. Nevertheless, the payload fraction always improves as the vehicle scale is increased, due to the diminishing effects of fixed masses, minimum material gauges, etc.
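The mass-growth effect can be illustrated with the Tsiolkovsky rocket equation, m0/m1 = exp(Δv/ve). The exhaust velocity and Δv figures below are generic LOX/LH2 ballpark assumptions, not Skylon numbers:

```python
import math

# Sketch of the mass-growth effect via the Tsiolkovsky rocket equation:
# m0/m1 = exp(dv / ve). All numbers are illustrative, not Skylon figures.
def mass_ratio(delta_v_km_s, exhaust_velocity_km_s):
    """Initial-to-final mass ratio for a given delta-v."""
    return math.exp(delta_v_km_s / exhaust_velocity_km_s)

VE = 4.4  # km/s, roughly a LOX/LH2 vacuum exhaust velocity (assumed)

# All-rocket from a standing start (~9.3 km/s incl. gravity/drag losses)
# vs. starting the rocket phase at Mach 5, as on Skylon's profile:
print(f"all-rocket from rest: {mass_ratio(9.3, VE):.1f}x")
print(f"air-breathing to Mach 5 first: {mass_ratio(7.8, VE):.1f}x")
```

The exponential means every km/sec shaved off the rocket phase cuts the required propellant mass disproportionately, which is why the technically harder air-breathing phase still pays off.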

The rest of the discussion is at rather a tangent to the SKYLON concept. At this point in time we have no reason to alter our approach which 25 years of study and research have shown to be viable.

All the best

Response from Goatguy

I apologize for not having researched more on the Skylon technology before posting my series of questions and rebuttals. More informed (now), I see that the idea is dependent on several factors: the development of a cooled-air hydrogen turbojet engine, the ability of the ceramic-glass composites to deliver the necessarily high strength (and insulating R value)-to-weight ratio, the use of composites and ultralight weight materials for the 41,000 kg empty-weight fuselage, and the jettisoning of pressurized human-occupant cabins and all the associated support systems. I imagine that there would also be a call for minimalist (and light-weight) avionics and a lightly pressurized helium gas-fill to thwart oxygen-leak fire potential.

I also see that the thought-experiment of a space-plane with tanks that weigh nothing, engines that weigh nothing and fuselage that weighs nothing ultimately gains nothing in being multistage… as only the reaction-mass and the payload are accelerated. It is because conventionally the very large tanks, the rather large attendant turbopumps, thrusters, force-frame and control systems weigh so much that it makes sense to “multi-stage” them.

I do wonder Dr. Varvill though … is it possible that the cross-over between a single-stage and a 2 stage design – for a winged, aerodynamic lift vehicle – per the work done by the Scaled Composites SpaceShipOne folks … is such that the Skylon design could easily be kept as a winged lifter with a separable LOX/H2 stratosphere-to-space 2nd stage? Doing so would markedly lower the weight of the air-breathing vehicle: it wouldn’t need any thermal-barrier protection (or at least very little), so that that dead-weight would be reserved just for the secondary stage.

Thinking on this further – and maintaining the idea of a horizontal runway take-off approach – might there also be advantage in borrowing from the military? At least it is the American armed forces' SOP to lash a couple or quad of modest-sized solid-rocket launch boosters to get the heaviest conventional aircraft up to take-off velocity when runway-length (combined with the vehicle mass and its jet thrusters) is too short for a safe takeoff. I know "technically" then that is a third stage … but it seems so obvious – attaining the first 400 m/sec over a shorter runway, which at this velocity would allow the craft to do an early abort utilizing a short return-to-spaceport loop (and rapid pumpless fuel dumping). Water cooled brakes … not needed, nor the water mass. Further, the high-pressure rocket casings wouldn't have to be made from exotic materials, them being jettisoned after the first 30 seconds of acceleration. Cheap, robust, reusable.

Finally … I have to say that I am markedly more enthusiastic about the Skylon space-plane concept than I was when I wrote the comments (to which you graciously replied).

Biodegradable circuits could enable better neural interfaces and LED tattoos

Silicon on silk: This clear silk film, about one square centimeter, has six silicon transistors on its surface. These flexible devices can be implanted in mice like the one in this image without causing any harm, and the silk degrades over time. The orange liquid on the hair is a disinfectant used during the surgery.
Credit: Rogers/Omenetto

MIT Technology Review reports researchers from several universities have demonstrated arrays of transistors made on thin films of silk. While electronics must usually be encased to protect them from the body, these electronics don't need protection, and the silk means the electronics conform to biological tissue. The silk melts away over time, and the thin silicon circuits left behind don't cause irritation because they are just nanometers thick.

(3 page pdf) Silicon electronics on silk as a path to bioresorbable, implantable devices

Many existing and envisioned classes of implantable biomedical devices require high performance electronics/sensors. An approach that avoids some of the longer term challenges in biocompatibility involves a construction in which some parts or all of the system resorbs in the body over time. This paper describes strategies for integrating single crystalline silicon electronics, where the silicon is in the form of nanomembranes, onto water soluble and biocompatible silk substrates. Electrical, bending, water dissolution, and animal toxicity studies suggest that this approach might provide many opportunities for future biomedical devices and clinical applications.

"Current medical devices are very limited by the fact that the active electronics have to be 'canned,' or isolated from the body, and are on rigid silicon," says Brian Litt, associate professor of neurology and bioengineering at the University of Pennsylvania. Litt, who is working with the silk-silicon group to develop medical applications for the new devices, says they could interact with tissues in new ways. The group is developing silk-silicon LEDs that might act as photonic tattoos that can show blood-sugar readings, as well as arrays of conformable electrodes that might interface with the nervous system.

Last year, John Rogers, professor of materials science and engineering at the Beckman Institute at the University of Illinois at Urbana-Champaign, developed flexible, stretchable silicon circuits whose performance matches that of their rigid counterparts. To make these devices biocompatible, Rogers's lab collaborated with Fiorenzo Omenetto and David Kaplan, professors of bioengineering at Tufts University in Medford, MA, who last year reported making nanopatterned optical devices from silkworm-cocoon proteins.

These devices also require electrical connections of gold and titanium, which are biocompatible but not biodegradable. Rogers is developing biodegradable electrical contacts so that all that would remain is the silicon.

The group is currently designing electrodes built on silk as interfaces for the nervous system. Electrodes built on silk could, Litt says, integrate much better with biological tissues than existing electrodes, which either pierce the tissue or sit on top of it. The electrodes might be wrapped around individual peripheral nerves to help control prostheses. Arrays of silk electrodes for applications such as deep-brain stimulation, which is used to control Parkinson's symptoms, could conform to the brain's crevices to reach otherwise inaccessible regions. "It would be nice to see the sophistication of devices start to catch up with the sophistication of our basic science, and this technology could really close that gap," says Litt.

From the same researcher: Lateral Buckling Mechanics in Silicon Nanowires on Elastomeric Substrates (6 page pdf)

Rogers research group publications

November 02, 2009

Critique of the Path to Sustainable Energy 2030

Brave New Climate reviews the work of Mark Z. Jacobson (Professor, Stanford) and Mark A. Delucchi (researcher, UC Davis) entitled “A path to sustainable energy by 2030” (p 58 – 65 Scientific American Nov 2009; they call it WWS: wind, water or sunlight).

Jacobson and Delucchi argue that, by the year 2030:
Wind, water and solar technologies can provide 100 percent of the world’s energy, eliminating all fossil fuels.

Carbon Emissions from Expected Wars Attributed to Energy Sources via Militarization of Their Technology

They also state:
Nuclear power results in up to 25 times more carbon emissions than wind energy, when reactor construction and uranium refining and transport are considered.

From Brave New Climate:
They achieve this result by positing that nuclear power means nuclear proliferation, nuclear proliferation leads to nuclear weapons, and this chain of events leads to nuclear war, so they calculate the carbon footprint of a nuclear war.

This issue was noted before on this site when commenting on Mark Jacobson's previous ranking of energy sources.

Jacobson and Delucchi do not apply their inclusion of war effects on a consistent basis. They need to look at oil based weapons.
Napalm is the generic name denoting several flammable liquids used in warfare, often jellied gasoline.

So, according to Jacobson's method, the jungles, forests and cities that were set aflame by napalm, fuel-air explosions and other oil-based weapons should be apportioned as carbon emissions for oil.

* The US air force bombed cities in Japan with napalm during World War 2, killing 80,000 civilians and making 1,000,000 homeless.

* Operation Rolling Thunder in the Vietnam War dropped over one million tons of bombs on Vietnam. In total, the United States dropped 8 million tons of bombs on Vietnam between 1965 and 1973.

So a historical analysis using the Jacobson/Delucchi method should include the far larger carbon emissions from the many fossil-fuel-based weapons actually used.

Also, the sun and wind are drivers for wildfires in forests. Wildfires burn an average of about 7 million acres each year in the USA.

Wildfires, which release about 90 Tg of CO2 annually into the atmosphere over the contiguous U.S., are an important factor in the carbon cycle of this region.

Torching oil wells and blowing up hydro dams should also have their effects included. Broken dams can flood areas where plants are growing and release the carbon in the plants.

Fundamental Cost Errors
Brave new climate points out the fundamental cost errors (aka lies) of Jacobson and Delucchi:

They make a token attempt to price in storage (e.g., compressed air for solar PV, hot salts for CSP). But tellingly, they never say HOW MUCH storage they are costing in this analysis (see table 6 of tech paper), nor how much extra peak generating capacity these energy stores will require in order to be recharged, especially on low yield days (cloudy, calm, etc). Yet, this is an absolutely critical consideration for large-scale intermittent technologies, as Peter Lang has clearly demonstrated here. Without factoring in these sort of fundamental 'details' — and in the absence of crunching any actual numbers in regards to the total amount of storage/backup/overbuild required to make WWS 24/365 — the whole economic and logistical foundation of the grand WWS scheme crumbles to dust. In sum, the WWS 100% renewables by 2030 vision is nothing more than an illusory fantasy. It is not a feasible, real-world energy plan.
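The kind of storage sizing the critique says is missing can be sketched in a few lines. Every number below is an assumption chosen for illustration, not a figure from the WWS paper or from Peter Lang's analysis:

```python
# Back-of-envelope storage sizing of the kind the critique says the
# WWS paper omits. All numbers are assumptions for illustration only.
avg_demand_gw = 1.0          # region's average load
autonomy_days = 3.0          # calm/cloudy spell to ride through
round_trip_eff = 0.75        # storage round-trip efficiency
capacity_factor = 0.30       # wind/solar fleet average
recharge_days = 7.0          # days allowed to refill the store

# Storage needed to cover the lull, grossed up for round-trip losses:
storage_gwh = avg_demand_gw * 24 * autonomy_days / round_trip_eff
# Extra generating capacity needed to refill the store in time,
# on top of the fleet already serving the ordinary load:
extra_gw = storage_gwh / (recharge_days * 24 * capacity_factor)

print(f"storage: {storage_gwh:.0f} GWh, extra capacity: {extra_gw:.1f} GW")
```

Even with these mild assumptions, a region averaging 1 GW of demand needs on the order of 100 GWh of storage and roughly 2 GW of additional nameplate capacity just to recharge it, which is the overbuild cost the critique says the WWS plan never prices in.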

If Jacobson and Delucchi are willing to forecast such optimistically low costs for future solar, then we can be quite comfortable doing the same for IFRs (Integral Fast Reactors) and LFTRs (Liquid Fluoride Thorium Reactors), the Gen IV nuclear designs.
