March 26, 2009

NASA Smart Shape Changing Helicopter Rotor

NASA is working on smart shape changing helicopter rotors.

Twenty years from now, large rotorcraft could be making short hops between cities such as New York and Washington, carrying as many as 100 passengers at a time in comfort and safety. Routine transportation by rotorcraft could help ease air traffic congestion around the nation's airports.

Within the controlled environment of the wind tunnel, the SMART Rotor reduced the amount of noise it puts out by half. The ultimate test of the SMART Rotor's noise reduction capability would come from flight tests on a real helicopter, where the effects of noise propagating through the atmosphere and around terrain could be evaluated as well.

The test data also will help future researchers use computers to simulate how differently shaped SMART Rotors would behave in flight under various conditions of altitude and speed.

Magnetic Spin MRI Could Improve Sensitivity 1000 Times

Shown here is an MRI image of a one-centimeter-wide tube containing smaller tubes covered with a molecule called pyridine.

MIT Technology Review reports that a novel method of transferring magnetic spin can amplify the sensitivity of magnetic resonance imaging (MRI) a thousandfold, according to new research from the University of York in the United Kingdom. The new technique, Signal Amplification By Reversible Exchange (SABRE), achieves these results without requiring any chemical change. The SABRE method delivers a proven 1000-fold increase in sensitivity.

Professor Simon Duckett, from the University’s Department of Chemistry and Director of the Centre for Magnetic Resonance, said: "We have been able to increase sensitivity in NMR by over 1000 times so data that once took 90 days to record can now be obtained in just five seconds. Similarly, an MRI image can now be collected in a fraction of a second rather than over 100 hours for Nuclear Magnetic Resonance [NMR]."

Conventional Nuclear Magnetic Resonance (NMR) is what takes many hours or days; replacing it with fast but sensitive spin-transfer MRI is where the speed gains come from.
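The quoted speedups follow from standard signal-averaging arithmetic: NMR signal-to-noise grows only as the square root of the number of averaged scans, so acquisition time scales with the square of the required sensitivity gain. A short sketch, using the scan-time figures quoted in the article:

```python
def time_after_gain(original_seconds: float, sensitivity_gain: float) -> float:
    """Averaging time needed once a polarization method boosts SNR.

    SNR scales as sqrt(number of scans), so time scales as 1/gain^2.
    """
    return original_seconds / sensitivity_gain ** 2

ninety_days = 90 * 24 * 3600             # 7,776,000 s
print(time_after_gain(ninety_days, 1000))    # ~7.8 s for a 1000-fold gain

forty_two_days = 42 * 24 * 3600
print(time_after_gain(forty_two_days, 1350))  # ~2.0 s for a 1350-fold gain
```

A 1000-fold gain thus turns 90 days of averaging into roughly 8 seconds, the same order as the quoted 5 seconds; a 1350-fold gain turns 42 days into about 2 seconds, consistent with the quoted 3.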

The new method, published today in the journal Science, enables the magnetization of a broad range of molecules--including drugs such as nicotine, and organic molecules such as antibodies designed to bind to tumors--so that they can be used as contrast agents.

Scientists first cool the molecule to create a form of molecular hydrogen, called parahydrogen, which has a highly ordered magnetic spin state. An iridium catalyst transfers the magnetic spin from the parahydrogen to other key elements, including oxygen, nitrogen, and carbon.
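The cooling step works because of nuclear-spin statistics: para-hydrogen (nuclear spin singlet) is tied to even rotational levels J, while ortho-hydrogen (spin triplet, threefold degenerate) is tied to odd J, so at low temperature equilibrium hydrogen collapses into the para form. A hedged sketch, assuming the textbook rotational constant B/kB ≈ 87.5 K for H2:

```python
import math

B_OVER_KB = 87.5  # K; rotational constant of H2 divided by the Boltzmann constant

def para_fraction(temp_k: float, j_max: int = 20) -> float:
    """Equilibrium fraction of para-H2 from Boltzmann-weighted rotational sums."""
    para = sum((2 * j + 1) * math.exp(-B_OVER_KB * j * (j + 1) / temp_k)
               for j in range(0, j_max, 2))       # even J, nuclear-spin weight 1
    ortho = sum(3 * (2 * j + 1) * math.exp(-B_OVER_KB * j * (j + 1) / temp_k)
                for j in range(1, j_max, 2))      # odd J, nuclear-spin weight 3
    return para / (para + ortho)

print(round(para_fraction(20), 4))   # ~0.9986: nearly pure parahydrogen when cold
print(round(para_fraction(300), 2))  # ~0.25: the familiar room-temperature 3:1 ortho:para
```

The exact cooling temperature used in the SABRE work is not given here; the point of the sketch is only that cooling drives hydrogen almost entirely into the highly ordered para state.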

These polarized drugs or marker molecules are highly visible in MRI scans. "For example, you might use the technique to polarize the molecule that you know will stick to a brain tumor to see what's happening with an MRI scan. Currently MRI is not sensitive enough to do this," says York team member Gary Green, director of the York Neuroimaging Centre. Green notes that his team has already used the technique to polarize a range of key substances, including pyridine and nicotinic acid, which are present in many drugs.

Magnetic resonance force microscopy has higher resolution but is not suitable for scanning living bodies.

Green aims to begin animal tests of the technology this year, and clinical testing within five years. Extensive clinical testing needs to be done before this approach is approved for medical use.

This method isn't the first to prime molecules for use in MRI by boosting magnetic spin. Dynamic Nuclear Polarization (DNP), under development in the U.S., uses spin taken from electrons. DNP requires temperatures of 20 Kelvin and several hours for substances to be polarized for use in MRI, a disadvantage compared to the new technique. DNP polarization is currently about 10 times higher than magnetic spin MRI achieves, but the claim is that magnetic spin MRI can be improved to that level.

SABRE Method

Signal Amplification By Reversible Exchange: the SABRE method

Schematic representation of the SABRE method. The polarisation (represented by the orange colouring) is transferred from parahydrogen to a substrate which can then be seen by NMR and MRI.

MRI image of a SABRE-enhanced sample: a single-average RAREst 1H image of a pyrazine sample on a Bruker BioSpin 70/30, employing a 1350-fold signal gain. The data were collected in 3 seconds; they would otherwise have taken 42 days.

Smart Dew: 25-Cent Sensor Can Detect Intruders up to 50 Meters Away, for Cost-Effective US-Mexico or Israel Border Control

A Tel Aviv University researcher's fingertip (bottom right) points to a "Smart Dew" droplet. Proposed smart dust designs were of comparable size. More advanced concepts exist for even smaller versions of smart dust, but this is the smallest and cheapest to date.

A new invention from Tel Aviv University — a network of tiny sensors as small as dewdrops called "Smart Dew" — will foil even the most determined intruder. Scattered outdoors on rocks, fence posts and doorways, or indoors on the floor of a bank, the dewdrops are a completely new and cost-effective system for safeguarding and securing wide swathes of property.

Each individual "dew droplet" can detect an intrusion within a perimeter of 50 meters (about 165 feet). And at a cost of 25 cents per "droplet," Prof. Shapira says that his solution is the cheapest and the smartest on the market.

Unlike conventional alarm systems, each droplet of Smart Dew can be programmed to monitor a different condition. Sounds could be picked up by a miniature microphone. The metal used in the construction of cars and tractors could be detected by a magnetic sensor. Smart Dew droplets could also be programmed to detect temperature changes, carbon monoxide emissions, vibrations or light.

Each droplet sends a radio signal to a "base station" that collects and analyzes the data. Like the signals sent out by cordless phones, RF is a safe, low-power solution, making Prof. Shapira's technology extremely cost-effective compared to other concepts.

"It doesn't require much imagination to envision the possibilities for this technology to be used," says Prof. Shapira. "They are really endless."

Science Daily also has information.

Thousands of these Smart Dew sensors — each equipped with a controller and RF transmitter/receiver — can also be wirelessly networked to detect the difference between man, animal, car and truck.

"We've created a generic system that has no scale limitations," says Prof. Shapira. This makes it especially useful for large farms or even the borders of nations where it's difficult, and sometimes impractical, to install fences or constantly patrol them. "Smart Dew is a covert monitoring system. Because the sensors in the Smart Dew wireless network are so small, you would need bionic vision to notice them. There would be so many tiny droplets over the monitored area that it would be impossible to find each and every one."
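As a rough check on the border-control economics: the 50 m detection radius and 25-cent unit cost come from the article, while the ~3,145 km length of the US-Mexico land border is an assumed figure for illustration.

```python
import math

def line_coverage_cost(border_m: float, radius_m: float, unit_cost: float):
    """Sensors and cost to cover a line with detection circles just touching."""
    spacing = 2 * radius_m                 # adjacent 50 m circles meet at 100 m spacing
    n = math.ceil(border_m / spacing)
    return n, n * unit_cost

sensors, cost = line_coverage_cost(3_145_000, 50, 0.25)
print(sensors, cost)   # 31450 droplets, $7862.50 for a single-line picket
```

Real deployments would presumably scatter many redundant droplets rather than a single line, but even a 100-fold overprovision stays under a million dollars in sensor cost, which is the point of the "cheapest on the market" claim.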

Smart Dust Concept and Projects

Smart Dust is described at Wikipedia.

Smartdust is the term used to describe a network of tiny wireless microelectromechanical systems (MEMS) sensors, robots, or devices, installed with wireless communications, that can detect (for example) light, temperature, or vibration.

The smartdust concept was introduced by Kristofer S. J. Pister (University of California) in 2001, though the same ideas existed in science fiction before then (The Invincible, 1964). A recent review discusses various techniques to take smartdust in sensor networks beyond millimeter dimensions to the micrometre level.

Some attribute the concepts behind smart dust to a project at PARC called Smart Matter. That project finished in 2001.

Smartdust devices will be based on sub-voltage and deep-sub-voltage nanoelectronics and include micro power sources with all-solid-state impulse supercapacitors (nanoionic supercapacitors).

The recent development of nanoradios may be employed in the implementation of smartdust as a usable technology.

The networked sniper locator system could eventually be adapted to this scale of technology.

Two Thirds of Atlantic Temperature Warming is From Less Volcanic Dust

Two thirds of the temperature warming of the Atlantic over roughly the last three decades is attributable to reduced airborne dust from African dust storms and volcanoes.

This also suggests that proposed schemes to add volcanic-style dust to the air would work for cooling the climate if necessary.

Researchers have calculated how much of the Atlantic Ocean's warming observed during the last 26 years can be accounted for by concurrent changes in African dust storms and tropical volcanic activity, primarily the eruptions of El Chichón in Mexico in 1982 and Mount Pinatubo in the Philippines in 1991.

In fact, it is a surprisingly large amount, Evan says. "A lot of this upward trend in the long-term pattern can be explained just by dust storms and volcanoes," he says. "About 70 percent of it is just being forced by the combination of dust and volcanoes, and about a quarter of it is just from the dust storms themselves."

The result suggests that only about 30 percent of the observed Atlantic temperature increases are due to other factors, such as a warming climate. While not discounting the importance of global warming, Evan says this adjustment brings the estimate of global warming's impact on the Atlantic more into line with the smaller degree of ocean warming seen elsewhere, such as the Pacific.

"This makes sense, because we don't really expect global warming to make the ocean [temperature] increase that fast," he says.

Algae Fuel Cost and Production Breakthroughs

Algae Ventures claims to have a method of lowering the cost of harvesting, dewatering, and drying algae by over 99.75%.

Patent documents have been filed and an operational prototype unit has been demonstrated to collaborators who have participated in research and commercialization proposals.

Prototype and laboratory testing has been successfully completed with three species: Chlorella vulgaris, Euglena gracilis, and Botryococcus braunii. The company plans to continue development on the process, equipment, and technology and is looking to establish relationships with potential customers, licensees, and distributors, as well as funding or investing sources.

The best way to describe our breakthrough technology in algae harvesting, dewatering, and drying is as a model of nature's liquid-moving strategies in organisms. No biological system has anything even remotely close to a functioning centrifuge. For that matter, we found it difficult to find flocculation or flotation occurring in a biological organism.

A centrifuge moves the entire mass of water and its contents in order to separate them into fractions. This was also true of flocculation, flotation, and other methods to a certain degree, because the focus was on moving the algae and not moving the water. A water molecule is 1/33,000 the size of a 10 micron algae. When differential pressure (even excessive gravitational pressure in the form of a water column) is used to force algal mass and water through a screen, this energy compacts the algal mass into a form that blocks water and impacts the algal mass into the screen.
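The 1/33,000 figure checks out as a ratio of linear dimensions, if one assumes a ~0.3 nm effective diameter for a water molecule (an assumed textbook value, not from the article):

```python
# Compare linear sizes of a water molecule and a 10 micron alga.
water_diameter_m = 0.3e-9    # ~0.3 nm effective diameter (assumed)
algae_diameter_m = 10e-6     # 10 microns, from the article

ratio = algae_diameter_m / water_diameter_m
print(round(ratio))          # ~33333, consistent with the article's 1/33,000
```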

So the method uses several of nature's gifts to move the water molecules: changing the surface tension, adhesion, and cohesion; taking advantage of the meniscus being formed; using a capillary action from a compression pull (think artificial transpiration) to allow absorption; and then exploiting water's high surface-area-to-mass ratio to dramatically improve evaporation (think of how quickly a thinly applied water-based paint dries).

Surface tension can be broken in hundreds of ways; however, a class of materials patented several years ago combines natural and synthetic materials into superabsorbent polymer (SAP) fabrics. It is these SAP fabric materials that we call our “cap belt,” and they allow for simulating nature in multiple ways. These materials, when put into contact with the bottom of the screen (the water meniscus), have the capability to move vast amounts of water without moving the algae, because the molecular bonds from water to water are stronger than those from water to algae, as long as the energy applied does not break water’s bonds to itself. The capillary effect and adhesion effect (once wetted and wrung) can be designed to be continuous, just like the screen can be designed to be continuous.

This continuous approach allows for a thin layer of algae to be continuously processed from in-solution to dry flake over a distance of four feet, at a scalable rate with scalable equipment. In our prototype equipment, the rate exceeds 500 liters per hour while drawing less than 40 watts during run time.
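Taking the quoted throughput at face value, the energy intensity works out to a strikingly small number:

```python
# Quoted prototype figures: 500 L/h of algae culture on under 40 W of run power.
power_w = 40
flow_l_per_h = 500

wh_per_litre = power_w / flow_l_per_h   # 0.08 Wh of electricity per litre processed
kwh_per_m3 = wh_per_litre               # 1000 L/m^3 and the Wh->kWh factor cancel

print(wh_per_litre, kwh_per_m3)         # 0.08 Wh/L, i.e. 0.08 kWh per cubic meter
```

By comparison, centrifugal dewatering is usually quoted at energy intensities orders of magnitude higher per cubic meter, which is where the claimed cost reduction would come from if the prototype numbers hold at scale.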


Bionavitas is another company making progress on algae fuel processes.

Bionavitas' Light Immersion Technology greatly enhances algae growth by evenly distributing light deep into the algae culture.

Before Bionavitas made its Light Immersion Technology available to the public, nearly every large-scale approach to algae growth was challenged by a simple fact of nature: as algae grow, they become so dense they block the light needed for continued growth.

This “self-shading” phenomenon results in a layer that limits the amount of algae per acre that can be grown and harvested. The Light Immersion Technology developed by Bionavitas fundamentally changes this equation by enabling the algae growth layer in open ponds to be up to a meter deep. This represents a 10- to 12-fold increase in yield over previous methods, which produced only 3-5 centimeters of growth.

At the core of Light Immersion Technology is an innovative approach to bringing light to the algae culture in both open ponds and closed bioreactors, through a system of light rods which extend deep into the algae culture. By distributing light below the surface “shade” layer and releasing the light in controlled locations, algae cultures can grow denser. In external canal systems, the rods distribute light from the sun into the culture. This abundant and free energy source is ideal for generating large amounts of algae for use as biofuels.

In closed bioreactors, the rods evenly distribute more readily absorbed red and blue spectrum light from high efficiency LEDs. While the LEDs increase the cost of production, algae grown in these systems are used for higher value markets such as nutraceuticals.

March 25, 2009

Sorry Collapsitarians, Doomers and Dystopians: a Full Collapse Will Not Happen

Collapsitarians are described at the Technium.

Former President Reagan defined a recession as when your friend lost his job, and a depression as when you lost your job. Collapse is when no one has a job; in fact there are no longer any such things as jobs to be had.

Various types of doomers/collapsitarians/dystopians:
* Luddites, anarchists, and anti-civilization activists who are trying to hasten collapse as soon as possible.

* Survivalists who see collapse as the penalty for modern liberalism.

* Radical environmentalists who foresee ecological and environmental collapse.

* Anti-globalists who see collapse as the penalty for globalism.

* Anti-Americans rooting for the collapse of America and the developed world.

* Financial doomers who see a never-ending Depression.

* Peak Everythingers who see all resources running out.

Die-off scenarios are described at plenty of places online: scenarios of decline, war, and collapse down to 0 to 2 billion people, starting as early as this year, usually by 2025, and no later than 2100.

One thing of note is that most people consider Hitler and Stalin to have been monsters for killing, or causing the deaths of, about 100 million people. Most of the civilization die-off scenarios involve that level of death each and every year for 70 years, roughly 1000 times the number of deaths in the Holocaust. Why is there the belief that significant mitigation efforts would not be made?

Why it Won't Happen

1. Efficiency, conservation and energy plans can be enhanced beyond current levels with minimal strain. There have been partially voluntary reductions in energy demand during the credit crisis: 10% reductions with minimal effort, and 20% reductions with more austerity.

2. Rationing of food, fuel and clothes was successfully maintained in many countries during World War 2. Any resource decline or environmental situation can have governments use rationing to buy time for a transition.

The UK had stricter rationing than the USA during and after the war.

This shows that oil and food supplies can be greatly reduced while maintaining a war-level mobilization.

90% reductions can be handled in this way and possibly more.

3. Some simple and rapid transitions are possible. Ban or confiscate large gas-guzzling vehicles and only allow lightweight all-electric or super-efficient vehicles, other than freight trucks and heavy delivery trucks. A mobilized effort could make these shifts in the weight of vehicles permitted in less than one year, with loosened safety and bureaucratic regulations to speed the changes.

4. Rapid switchover for the electricity generation infrastructure. A war-time level mobilized switchover for electricity generation could be achieved quickly. Lift regulatory restrictions on nuclear power. Weld together containment domes to get around production limitations on large forgings. Use the staff of coal plants for the new nuclear plants. The staff of early nuclear plants did come from the coal plants. Nuclear staffing levels were 200 or less originally.

5. In regards to global warming and environmental concerns:
* a rapid switchover to totally clean power would stop the air pollution of coal and most oil and would greatly reduce any additional CO2
* geoengineering can be used to reduce global temperatures if necessary
* if climate change really is driven by man-made sources, then we are already geoengineering by accident as a side effect of our industry. It would be cheaper and easier to geoengineer intentionally to cancel those accidental side effects.

6. A real space age can be started right away with technology that we already have.

7. If there were a global war over resources, there would be clear winners, and in an all-out war there would be clear losers. The US would not lose.

8. There is plenty of technology now and a lot more that will be available soon to innovate away doomer scenarios.

* biofuels and synthetic fuels are already at about 10% of total fuel levels. If there were a need to replace all oil tomorrow, a combination of World War 2-level rationing plus biofuels and synthetics would be sufficient. (Germany developed coal-to-liquid fuel technology back in World War 2.)
* There are significant levels of hydroelectric, wind, and nuclear power
* If any of the challenges can be staved off for ten years or so there will be significant transitions to new technology (electric and hybrid vehicles) and the availability of more new technology

9. Financial doom scenarios
* Mandated debt resets, debt forgiveness, re-issuing scrip, etc. can be used to reboot a country or a financial system
* People and systems for production would still exist even if there was 1000 trillion in debt

10. All out nuclear war would kill less than 50% of the population. Current nuclear arsenals are reduced by ten times from the peak.

There are valid extinction risks and scenarios with several listed and discussed at the Lifeboat Foundation.

Generally, the extinction effects have to be so rapid that there is no time to mitigate or adapt. Space-based phenomena like a massive asteroid or a nearby gamma-ray burster are the kind of situation that we currently could not handle. This is why there is a need to stop pissing around with penny-ante crap and get serious about moving civilization to full Kardashev level II. At that level there is no known threat, other than an all-out super-war, that would be a risk to such a civilization. Even things like the sun going nova could be detected and handled, as such a civilization would have its own highly efficient nuclear fusion and other power sources.

Gary Jones on collapsitarians, and this site's view on black swans.

Theory of Space Time with Quantum Scale Fractals

The Invariant Set Hypothesis: A New Geometric Framework for the Foundations of Quantum Theory and the Role Played by Gravity. Tim Palmer studied general relativity at the University of Oxford, working under the same PhD adviser as Stephen Hawking. He has worked for the last 20 years as a leading mathematical climatologist.

The Invariant Set Hypothesis proposes that states of physical reality belong to, and are governed by, a non-computable fractal subset I of state space, invariant under the action of some subordinate deterministic causal dynamics D. The Invariant Set Hypothesis is motivated by key results in nonlinear dynamical-systems theory and black-hole thermodynamics. The elements of a reformulation of quantum theory are developed using two key properties of I: sparseness and self-similarity. Sparseness is used to relate counterfactual states to points not on I, thus providing a basis for understanding the essential contextuality of quantum physics. Self-similarity is used to relate the quantum state to oscillating coarse-grain probability mixtures based on fractal partitions of I, thus providing the basis for understanding the notion of quantum coherence. Combining these, an entirely new analysis is given of the standard "mysteries" of quantum theory: superposition, nonlocality, measurement, emergence of classicality, the ontology of uncertainty and so on. It is proposed that gravity plays a key role in generating the fractal geometry of I. Since quantum theory does not itself recognise the existence of such a state-space geometry, the results here suggest that attempts to formulate unified theories of physics within a quantum theoretic framework are misguided; rather, a successful quantum theory of gravity should unify the causal non-Euclidean geometry of space time with the atemporal fractal geometry of state space.

The 29-page Invariant Set Hypothesis paper is here.

New Scientist has coverage of using fractals to make sense of the quantum world.

Principles of invariance and symmetry lie at the heart of the foundations of physics. We have introduced a new type of invariance; the Invariant Set Hypothesis subordinates the notion of the differential equation and elevates as primitive the notion of fractal state space geometry in defining the notion of physical reality. It is suggested that this has profound implications for our understanding of quantum theory, as discussed at length in the body of this paper.

The Invariant Set Hypothesis is motivated by two quite disparate ideas in physics. Firstly, certain nonlinear dynamical systems have measure-zero, nowhere-dense, self-similar non-computational invariant sets. Secondly, the behaviour of extreme gravitationally bound systems is described by the irreversible laws of thermodynamics at a fundamental rather than phenomenological level.

General relativity has already elevated geometry as a key concept for investigating the causal structure of space time. The Invariant Set Hypothesis similarly elevates geometry as a key concept for understanding the atemporal structure of quantum physics.

In the 1960s, the introduction of global space-time geometric and topological methods transformed our understanding of gravitational physics in space time (Penrose, 1965). It is proposed that the introduction of global geometric and topological methods in state space may similarly transform our understanding of quantum physics and the role of gravity in quantum physics. Combining these disparate forms of geometry may provide the missing element needed to advance the search for a unified theory of basic physics.

Localization and Delivery Key to Successful Gene Therapy

The latest gene therapy treatments that are working through clinical trials are targeting problems that are localized in the body. Eventually there will be gene therapy success on non-localized problems. There is work to improve delivery of gene therapy without using viruses to avoid triggering immune response. There is also work on making genetic engineering changes on many genes at the same time instead of just one. The limitations of current early work may not limit more advanced methods.

Genzyme has three programs (peripheral artery disease, Parkinson’s disease, macular degeneration) that share some features. They can be treated with localized therapy, which doesn’t need to circulate throughout the body; they are serious illnesses that don’t require treatments with absolutely squeaky-clean safety profiles; and they appear suited to a single-shot gene therapy approach. Eliminating the need for multiple injections is especially useful in the case of gene therapy for Parkinson’s, in which doctors drill a hole in the skull to deliver genes to a precise region of the brain, or for macular degeneration, in which doctors make an injection behind the eye.

Local delivery is the key, Wadsworth says. Doing it that way makes it much less likely that the body’s immune system will mount a reaction to the viruses used to deliver the genes, he says.

Genzyme (NASDAQ: GENZ), with headquarters in Cambridge, MA and a gene therapy manufacturing unit in San Diego, is planning to present results this month at the American College of Cardiology from a clinical trial of 289 patients who took its experimental gene therapy for peripheral artery disease. This treatment is designed to encourage re-growth of new blood vessels to circumvent clogged arteries in the legs. If successful, this trial will show whether a single shot can help patients with severely limited mobility keep walking for longer periods without pausing to rest.

Genzyme has been working together with China on gene therapy since 2007.

Others are working on using gene therapy to treat deafness.

Stem cells could help cure deafness.

Deafness affects more than 250 million people worldwide. It typically involves the loss of sensory receptors, called hair cells, for their "tufts" of hair-like protrusions, and their associated neurons. A new study led by Dr. Marcelo N. Rivolta of the University of Sheffield has successfully isolated human auditory stem cells from fetal cochleae (the auditory portion of the inner ear) and found they had the capacity to differentiate into sensory hair cells and neurons.

Joe Eck Continues to Find High Meissner Transitions - Now -40 Centigrade

Joe Eck continues to find materials with higher Meissner Transitions which indicate superconductivity. Since we are now just a "stone's throw" from room temperature, he has placed this discovery into the public domain without patent protection. Other researchers are encouraged to examine this material and its structure.

40 degrees below zero is cold by any measure. But, in the world of superconductors it's a record hot day. Superconductors.ORG herein reports an increase in high-Tc to 233K (-40C, -40F) through the substitution of thallium into the tin/indium atomic sites of the X212/2212C structure that produced a 218 Kelvin superconductor in January of 2009.

The host material producing the 233K signal has the chemical formula Tl5Ba4Ca2Cu9Oy. One of several resistance-v-temperature plots used to confirm this new record is shown above. And a composite magnetization test, showing the Meissner transition, is shown below right.

Synthesis of these materials was by the solid state reaction method. Stoichiometric amounts of the below precursors were mixed, pelletized and sintered for 34 hours at 865C. The pellet was then annealed for 10 hours at 500C in flowing O2.

Tl2O3 99.99% (Alfa Aesar) 7.136 moles (gr.)
BaCuOx 99.9% (Alfa Aesar) 5.42 moles
CaCO3 99.95% (Alfa Aesar) 1.25 moles
CuO 99.995% (Alfa Aesar) 2.98 moles

The magnetometer employed twin Honeywell SS94A1F Hall-effect sensors with a tandem sensitivity of 50 mv/gauss. The 4-point probe was bonded to the pellet with CW2400 silver epoxy and used 7 volts on the primary.

Joe Eck also claims to have a version of YBCO that superconducts (has a Meissner transition) at 175K.

92K YBCO (Y-123) has only 6 metal layers in the unit cell and very little PWD. In this new discovery - based on a 9223C theoretical structure type shown at left - there are 16 metal layers and a large amount of PWD. The closest analog to this structure type is the 9212/1212C intergrowth of the Sn-Ba-Ca-Cu-O family, with Tc ~195K.

The chemical formula of this new discovery - dubbed "Hyper YBCO" - is YBa3Cu4Ox. However, HY-134 does not form stoichiometrically. In order to synthesize a sufficient volume fraction to detect, the "layer cake" method must be used.

The layer cake used to produce the prototype pellet had 17 layers, 9 of (BaCuO) and 8 of (Y2O3 + CuO). This resulted in 16 interference regions in which the desired structure was encouraged to form. The layer cake method is depicted in the simplified graphic below.

Lower Cost Networked Sniper Location System Accurate to a Few Meters

Engineers at Vanderbilt University’s Institute for Software Integrated Systems (ISIS) have developed a system that can give soldiers just such an edge by turning their combat helmets into “smart nodes” in a wireless sensor network. Soldiers can carry personal digital assistants that can display the location of enemy shooters in three dimensions and accurately identify the caliber and type of weapons they are firing. (H/T Rocky Rawstern)

An entire node for the ISIS system weighs only slightly more than the four AA batteries that power it and costs about $1,000 to construct using currently available commercial hardware. Accuracy is typically within a few meters, even for shots fired from as far as 300 meters. The more sensors that pick up the shot, the more accurate the localization. “Because the microphones on the helmet are so close together, the precision is not very high,” Ledeczi says. “However, the nodes are continuously exchanging the times and angles of arrival for these acoustic signals, along with their own locations and orientations. When two or more nodes detect the shot, they can provide the bearing with better than one degree accuracy.”

The ISIS system combines information from a number of nodes to triangulate shooter positions and improve the accuracy of its location identification. It also uses a patented technique to filter out the echoes that can throw off other acoustic detection systems.
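The bearing-fusion step can be illustrated with a minimal two-node triangulation. This is an illustrative sketch only, not the actual ISIS algorithm, which fuses many nodes, time-of-arrival data, and echo filtering:

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays from known node positions (2D, math-convention angles)."""
    d1 = (math.cos(math.radians(bearing1_deg)), math.sin(math.radians(bearing1_deg)))
    d2 = (math.cos(math.radians(bearing2_deg)), math.sin(math.radians(bearing2_deg)))
    # Solve p1 + t1*d1 = p2 + t2*d2 as a 2x2 linear system for t1 (Cramer's rule).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two helmet nodes 200 m apart both hear a shot from (100, 100).
print(triangulate((0, 0), 45.0, (200, 0), 135.0))   # ~(100.0, 100.0)
```

With better-than-one-degree bearings, the geometry of two nodes a couple hundred meters apart is what yields the few-meter accuracy the article quotes; more nodes overdetermine the fix and shrink the error further.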

The ISIS system communicates its findings with the personal digital assistants that the soldiers carry. The PDAs are loaded with maps or overhead pictures of the area upon which the shooter locations are displayed.

In 2006, a team from the National Institute of Standards and Technology at the U.S. Army Aberdeen Test Center independently determined the accuracy of the system. Firing positions were located at distances of 50 to 300 meters from a 10-node sensor network. Six different weapons were used. The only shots that the system sometimes failed to track accurately were those that passed to one side of all of the nodes.

The field tests demonstrated that the system can pick out the location of high-powered sniper rifles even when they are firing at the same time as an automatic weapon like the AK-47. They also proved that it can identify the window that a rifle is firing through even when the rifle is completely inside the building, the technique preferred by trained snipers.

Standard GPS locations are inadequate for this purpose and satellite coverage can be spotty in urban environments. The ISIS team has recently solved this problem by adding an inexpensive radio chip that allows them to track the relative position of nodes using high-precision radio interferometry.

Netbooks, eBooks Now and in the Future

Kindle 2 has text entry and a basic web browser for text-centric sites, plus wireless connectivity via Sprint's data network. So you could use it to post blog entries and enter comments on other websites.

The cost and value of wireless connectivity is one of the key differences. If you have Wi-Fi internet access at home, at the office, and in places like libraries and coffee shops, then a Wi-Fi-enabled e-book reader or netbook would be sufficient for many purposes.

Kindle Tip #16 - Kindle Text Entry Shortcuts
You can use these shortcuts when entering text:

Alt-6 for question mark ?
Alt-7 for comma ,
Alt-8 for colon :
Alt-9 for quotation marks “
Alt-0 for single quotation marks ‘

The web access is free and the battery lasts for days. At this link are ten reasons to buy it and ten reasons not to buy it.

Here is the CNET review of the Kindle 2

Netbooks are getting battery life in the 5-11 hour range and have full-color screens. Most netbooks have Wi-Fi and Bluetooth, and WiMAX-capable versions are coming. There are ways to connect a netbook to a data-enabled mobile phone, but data access plans for mobile phones are not cheap.

There are netbooks that cost $99.

Fujitsu has a color-display e-book reader, but it costs $1,000. It has Bluetooth, Wi-Fi, and a 40-hour battery life per charge.

WiMAX is starting to be rolled out. Current subscription costs are about $20-$50/month.

Download speeds from 768 kilobits per second to 6 megabits per second: $20 to $50 a month, depending on speed, plus $5 a month for a WiMAX modem. Later this year, Clearwire plans to offer Internet home phone service for $25 a month, plus $15 for a phone adapter.
Mobile Internet access: downloads at 4 mbps for $30 to $50 a month, depending on how much data you use, plus $50 for a mobile WiMAX modem.

Home Internet access [Portland]: Comcast, Qwest, and Verizon offer home Internet access at download speeds from 1 mbps to 100 mbps, for $30 to $145 a month depending on speed. Mobile Internet access: Cricket Wireless, Sprint, Verizon, and AT&T offer wireless Internet cards for laptops. Access typically costs $40 to $60 a month, plus $30 or more for a wireless card. Downloads are typically around 1 mbps.

WiMAX is being deployed in more and more cities, but prices are currently about $30/month.

High speed 4G mobile data service is also coming but it is also at a fairly high monthly cost.

There is a rumored Apple netbook with an iPhone-like touchscreen coming in the third quarter of 2009.

March 24, 2009

American Chemical Society Conference Videos on the Low Energy Nuclear Reaction/Cold Fusion Work

An approximately 45-minute interview of the cold fusion researchers.


Organizer, Jan Marwan, first 5 minutes
Steven Krivit, New Energy Times, 6-10 minutes. Overview of the last 20 years
Antonella De Ninno, 10-13 minutes
Pamela Boss, 13-17 minutes, Navy researcher who detected neutrons
John Dash, 17-22 minutes
Mahadeva Srinivasan, 22-28 minutes
Q & A 29-45 minutes
Triple-track neutron question and answer, 31-32 minutes: DT-fusion-consistent neutrons
100-200 researchers is not enough to rapidly resolve these issues, but progress has been made
37-44 minutes: someone who did not attend the research meetings asks why the reaction products do not match those of regular hot nuclear fusion. Answer: these are not the same processes.
Old physics in a new context: not fusion in a plasma, but reactions in condensed matter.
44 minutes: Arata's work discussed. Deuterium is introduced into the apparatus but no power is added, so this is not excess heat, just heat production where there would normally be none.

“Cold fusion” rebirth? New evidence for existence of controversial energy source: Researchers are reporting compelling new scientific evidence for the existence of low-energy nuclear reactions (LENR), the process once called “cold fusion” that may promise a new source of energy. One group of scientists, for instance, describes what they term the first clear visual evidence that LENR devices can produce neutrons, subatomic particles that scientists view as tell-tale signs that nuclear reactions are occurring. Low-energy nuclear reactions could potentially provide 21st-century society a limitless and environmentally clean energy source for generating electricity, researchers say. The report injects new life into this controversial field.

Another video at this link

Q&A continues for 13 minutes.
2-5 minutes: talk about the patents that have been awarded
5-6 minutes: rumors of DOD, DOE, and EPA applications
7-8 minutes: most excess heat results are highly reproducible at 10-100% excess; some non-reproducible results reached 1,600-2,500%.
Thin layers and nanoparticles (4-10 nanometers is the best size range) have better reproducibility.

March 23, 2009

Emerging Technological Black Swans

The Black Swan theory (in Nassim Nicholas Taleb's version) refers to a large-impact, hard-to-predict, and rare event beyond the realm of normal expectations. Taleb regards many scientific discoveries as "black swans" — undirected and unpredicted. He gives the rise of the Internet, the personal computer, World War I, and the September 11, 2001 attacks as examples of Black Swan events.

The personal computer and the Internet could have been predicted, and were predicted by some: a 1968 video predicted a global communication system like the Internet, there were predictions of the information highway, and there was a 1986 prediction of a global hypertext system.

Super-terrorism was predicted.

People who were deeply involved in an area were usually well aware of the potential of what they and others in the field were working towards.

As an Internet-empowered futurist, it is possible to see the long gestation of "Black Swans". It is also quite feasible to develop a deeper understanding of the specifics of a technology or science to more fully assess how its impacts will unfold and what its limits and potential are.

Emerging Low Energy Nuclear Reactions, Blacklight Power, Jovion Power

Cold Fusion is close to generating kilowatts of consistent power for hours.

Blacklight Power is signing commercial deals and is planning to release a commercial energy generation product this year.

Jovion Corporation received a patent for an energy generation system based on zero point energy.

Newsweek reports that more level-headed participants [naysayers] have acknowledged that LENR might reveal some unusual new physics, but is unlikely to be a source of energy.

So all the ridicule and attempts at quashing were at least partly responsible for suppressing and delaying the discovery and examination of "unusual new physics". That alone seems to be a wrong attributable to the scientists involved in the ridicule and quashing, who previously wrote and claimed that there was absolutely nothing to it but fraud and error.

These power sources are all related to solid-state physics at a near-nano scale. The theories are different, but if these groups have stumbled onto related effects that work, then the impact would classify as a Black Swan event. If any one of these does develop into an energy source, then those who held back this research would be to blame for delaying a beneficial result. If cold fusion is finally triumphant, then those who were unscientific in shunning the research and the researchers turned what could have been viewed as interesting anomalous heat worthy of study into an uphill battle.

Quantum Computers

D-Wave Systems seems to be on track to release 128-qubit quantum computers commercially in the summer of 2009, and seems ready to scale to thousands of qubits by the end of 2009 or early in 2010.

Quantum computers have been predicted in general for a while by many technologists, but a successful emergence over this year and following years will be very surprising to many.

DNA and other forms of Nanotechnology

DNA Nanotechnology appears ready to burst to a whole new level of capability (with structural DNA costing dollars per kilogram) and self assembly enabling 1-2 nanometer computer chip features and artificial ribosomes.

Room temperature single atom quantum dots

Carbon nanotubes and other super-strength materials and materials for electronics and antennas.

Super-strong materials make all kinds of cars, planes, and spaceships vastly better, and things that were impossible to build become possible (like the space elevator).


Metamaterials for nanocups, spectrum control (terahertz imagers) and invisibility and optic control (possibly optical computers).

Nuclear Fusion and Factory Mass Produced Deep Burn Fission

Several nuclear fusion projects (IEC fusion, dense plasma focus fusion, General Fusion, laser fusion) could develop relatively quickly.

Stem cells
Massive life extension [the SENS life extension project, nutrigenomics, and Genescient's work]
Tissue Engineering
Gene therapy
Biomarkers and cheap and effective tests
Performance enhancement
Cognitive enhancement
Wearable computers, exoskeletons and cybernetics

Robotics, UAVs, Robotic vehicles, Automation and Sensors/super cameras Everywhere

Robots are getting cheaper and more capable. They will go from about 6 million now to 18 million in 2011.


Femtosecond and attosecond lasers reveal new science and are powerful research tools. Solid state lasers are over 100 kilowatts in power. Efficiency, power and other improvements are consistently made.

Brain science and technology, intelligence and artificial intelligence

The Synapse artificial brain project and other brain AI work.

Recent brain emulation work and a roadmap to full human brain emulation.

Convergence of Multiple Black Swans and Creative Mixing

Better materials and better nanoscale control and manufacturing help with understanding and developing the new energy sources (like nuclear fusion, Blacklight power and the others).

Better computers and sensors help with better robots and with all scientific research.

Carnival of Space 95

Arata Excess Heat Cold Fusion Experiment Replications and Neutron tracks Detected in Cold Fusion Experiment

From presentations at the American Chemical Society conference.

1. Excess heat, gamma radiation production from an unconventional LENR device —Tadahiko Mizuno, Ph.D., of Hokkaido University in Japan, has reported the production of excess heat generation and gamma ray emissions from an unconventional LENR device that uses phenanthrene, a type of hydrocarbon, as a reactant. He is the author of the book "Nuclear Transmutation: The Reality of Cold Fusion." (ENVR 049, Monday, March 23, 3:35 p.m., Hilton, Alpine Ballroom West, during the symposium, "New Energy Technology.")

Anomalous heat generation during hydrogenation of carbon hydride

Tadahiko Mizuno, Department of Engineering, Hokkaido University, Kitaku kita13 nishi8, Sapporo 060-8628, Japan, Fax: 81-11-706-7835

We observed anomalous heat generation while heating a small quantity of phenanthrene placed in a cylinder with a Pt catalyzer and filled with high-pressure hydrogen gas. It is very difficult to explain the total energy generation on the basis of a conventional chemical reaction chain, because almost all of the phenanthrene and hydrogen gas remained in the reaction chamber as it was before the experiment started. There were no reaction products such as other chemical compounds. The heat generation sometimes reached values of 0.1 kW and continued for several hours. Moreover, we confirmed gamma ray emission at the same time. In particular cases, we observed that both processes, heat generation and gamma ray emission, ran simultaneously as correlated processes. We confirmed the same result, showing good reproducibility, by taking specific care with the temperature and pressure control within the reactor.

2. New evidence supporting production and control of low energy nuclear reactions — Antonella De Ninno, Ph.D., a scientist with New Technologies Energy and Environment in Italy, will describe evidence supporting the existence of low energy nuclear reactions. She conducted lab experiments demonstrating the simultaneous production of both excess heat and helium gas, tell-tale evidence supporting the nuclear nature of LENR. She also shows that scientists can control the phenomenon. (ENVR 064, Tuesday, March 24, 10:10 a.m., Hilton, Alpine Ballroom West, during the symposium, "New Energy Technology.")

It is time to envisage a research program with the aim of moving from proof of principle toward a working prototype able to produce sustainable, cheaply available energy. Major problems still to be solved are: a) the reproducibility of the effect, not yet suitable for use by representative users; b) the structural weakness of the cathodes and their inability to withstand several loading-deloading cycles; c) the design of a "reactor" able to collect most of the energy produced and transfer it to an engine; and d) the existence of nuclear reactions other than d+d, the production of other nuclear fragments, and their potential application. Even though many questions are still open and many problems need to be solved, LENR research has made significant progress that deserves to be regarded within the framework of scientific acceptance and as a serious contribution toward creating an alternative energy source for our future.

An experimental "cold fusion" device produced this pattern of "triple tracks" (shown at right), which scientists say is caused by high-energy nuclear particles resulting from a nuclear reaction (Credit: Pam Boss, Space and Naval Warfare Systems Center (SPAWAR))

New experimental results bolster the case for Cold Fusion.

Pamela Mosier-Boss and colleagues at Space and Naval Warfare Systems Command (SPAWAR) in San Diego, California, are claiming to have made a "significant" discovery: clear evidence of the products of cold fusion. Tracks of energetic neutrons are being detected.

The results of Pd–D co-deposition experiments conducted with the cathode in close contact with CR-39, a solid-state nuclear etch detector, are reported. Among the solitary tracks due to individual energetic particles, triple tracks are observed. Microscopic examination of the bottom of the triple track pit shows that the three lobes of the track are splitting apart from a center point. The presence of three α-particle tracks outgoing from a single point is diagnostic of the 12C(n,n′)3α carbon breakup reaction and suggests that DT reactions that produce ≥9.6 MeV neutrons are occurring inside the Pd lattice. To our knowledge, this is the first report of the production of energetic (≥9.6 MeV) neutrons in the Pd–D system.

As with everything involving cold fusion there is still dispute about the latest research.

Steven Krivit, editor of the New Energy Times, has been following the cold fusion debate for many years and also spoke at the ACS conference. "Their hypothesis as to a fusion mechanism I think is on thin ice … you get into physics fantasies rather quickly and this is an unfortunate distraction from their excellent empirical work," he told New Scientist.

Krivit thinks cold fusion remains science fiction. Like many in the field, he prefers to categorise the work as evidence of "low energy nuclear reactions", and says it can be explained without relying on nuclear fusion.

Quantifying Some General Predictions

Futurismic has some of Karl Schroeder twittered short predictions of likely surprises for 2009.

eBook Readers become as common as iPods.

So which part of the iPod quarterly sales curve should eBook sales be tracked against?

There were 240,000 Kindle sales in the first half of 2008.

Estimated 2008 Kindle sales are 500,000

iPod-like sales tracking would put 2009 eBook sales at about the 2004 iPod sales number, with the big jump coming in 2010, matching the iPod's move to 5-6 million units per quarter. There are other eBook readers besides Amazon's Kindle, but the Kindle is currently the eBook leader.

Karl Schroeder: Desktop printer / rapid prototyping manufacturing is about a $1 billion-a-year industry.

Rapid manufacturing in 2007

The low end of the market is getting more introductions.

The $4,995 3D desktop printer will be available in 2009 as well. The 3D desktop printer takes up a 25 x 20 x 20-inch space and weighs about 90 pounds, while the maximum size of printed objects is 5 x 5 x 5 inches; Desktop Factory says per-cubic-inch printing costs will hover somewhere around $1. The Desktop Factory 3D printer builds robust, composite plastic parts that can be sanded and painted when desired. Their goal is to have their 3D printer below $1,000 by 2011.
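A quick back-of-the-envelope check on those figures (the arithmetic only; the specs are from the article):

```python
# Largest printable object at the quoted specs: 5 x 5 x 5 inches
# of material at roughly $1 per cubic inch.
max_volume_in3 = 5 * 5 * 5   # 125 cubic inches
cost_per_in3 = 1.0           # dollars, approximate figure
print(max_volume_in3 * cost_per_in3)  # → 125.0
```

So a maximum-size part would cost on the order of $125 in material alone.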

Desktop Factory is the maker of the $4,995 3D desktop printer.

Dimension Printing released a 3D desktop printer for $14,900 in February 2009.

Karl Schroeder: Breakeven Fusion in 2009

This site tracks nuclear fusion projects closely.

By breakeven: in nuclear fusion research, the term breakeven refers to a fusion energy gain factor equal to unity. The combination of plasma density, temperature, and confinement time needed to reach that point is described by the Lawson criterion.
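The criterion can be stated numerically. This sketch uses the commonly quoted textbook deuterium-tritium threshold (a density-confinement product of roughly 1.5×10^20 s/m³ near the optimum temperature of about 25 keV); the figures come from fusion textbooks, not from this article:

```python
# Lawson-style breakeven check for a deuterium-tritium plasma.
LAWSON_DT = 1.5e20  # minimum density * energy confinement time, s/m^3

def meets_lawson(density_m3, confinement_time_s):
    """True if the plasma's n*tau product clears the D-T threshold."""
    return density_m3 * confinement_time_s >= LAWSON_DT

# A tokamak-like plasma of 1e20 particles/m^3 held for 3 seconds passes;
# the same density held for only a millisecond falls far short.
print(meets_lawson(1e20, 3.0))   # → True
print(meets_lawson(1e20, 1e-3))  # → False
```

Each project below is chasing some combination of higher density, higher temperature, or longer confinement to clear this bar.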

In terms of breakeven nuclear fusion:

* The National Ignition Facility is planning to achieve fusion breakeven, but it is ten years or more and a lot of money and work away from commercializing this system.

* The EMC2 group could be funded by the Navy to build a 1.5-meter, 100 MW device that could achieve breakeven. Late 2009 seems to be the earliest that could be funded, and it would take about two years to build.

* General Fusion aims to raise $50 million for a net-energy-gain device, with a target date of 2013 if the second and third phases stay roughly on schedule.

* If the Lawrenceville Plasma Physics research is successful, then 2010 could see more energy produced than is fed into the plasma. Lawrenceville Plasma Physics Inc., a small research and development company based in West Orange, NJ, has announced the initiation of a two-year-long experimental project to test the scientific feasibility of Focus Fusion, controlled nuclear fusion using the dense plasma focus (DPF) device and hydrogen-boron fuel. Hydrogen-boron fuel produces almost no neutrons and allows the direct conversion of energy into electricity. The goals of the experiment are first, to confirm the achievement of the high temperatures first observed in previous experiments at Texas A&M University; second, to greatly increase the efficiency of energy transfer into the tiny plasmoid where the fusion reactions take place; third, to achieve the high magnetic fields needed for the quantum magnetic field effect, which will reduce cooling of the plasma by X-ray emission; and finally, to use hydrogen-boron fuel to demonstrate greater fusion energy production than energy fed into the plasma (positive net energy production).

Karl Schroeder: Batteries and Ultracapacitors converge in 2009 into something commercializable.

This site reported on the UltraBattery back in January 2008: a combination ultracapacitor and lead-acid battery from Australia with low cost and long endurance. There is a lot of activity, with reports of improved batteries and ultracapacitors practically every day.

Scientists Harness Exon-Skipping in Large Animal to Successfully Treat Duchenne Muscular Dystrophy

Genetic researchers at Children’s National Medical Center and the National Center of Neurology and Psychiatry in Tokyo published the results of the first successful application of “multiple exon-skipping” to curb the devastating effects of Duchenne muscular dystrophy in an animal larger than a mouse.

Multiple exon-skipping employs multiple DNA-like molecules as “DNA band-aids” to skip over the parts of the mutated gene that block the effective creation of proteins. Duchenne muscular dystrophy strikes 1 of every 3,500 boys born in the United States and worldwide each year (so almost one million people have it).

“Exon-skipping” employs synthetic DNA-like molecules called antisense as a DNA band-aid to skip over the parts of the gene that block the effective creation of dystrophin. Because the gene’s mutation could affect any of its 79 exons, and sometimes more than a single exon at a time, scientists employed a “cocktail” of antisense molecules called morpholinos to extend the range of this application. By skipping more than a single exon, this so-called DNA band-aid becomes applicable to between 80 and 90 percent of Duchenne muscular dystrophy patients.
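The reading-frame arithmetic behind exon-skipping can be shown with a toy model. The exon lengths below are made up for illustration (the real dystrophin exons vary in length); the point is simply that a transcript stays readable only if the retained exons total a multiple of 3 nucleotides:

```python
def in_frame(exon_lengths, skipped=()):
    """True if the spliced transcript reads in frame after skipping exons."""
    total = sum(n for i, n in enumerate(exon_lengths) if i not in skipped)
    return total % 3 == 0

exons = [90, 88, 95, 93, 90]             # hypothetical lengths; sum = 456
print(in_frame(exons))                    # → True: healthy, in-frame transcript
# A mutation deleting exon 2 (88 nt) shifts the frame: 456 - 88 = 368 nt.
print(in_frame(exons, skipped={1}))       # → False: frameshift, no dystrophin
# Skipping neighboring exon 3 as well removes 88 + 95 = 183 nt, a multiple
# of 3, restoring the frame and a shortened but partly functional protein.
print(in_frame(exons, skipped={1, 2}))    # → True
```

Skipping multiple exons widens the set of deletions for which some frame-restoring combination exists, which is why the cocktail approach covers more patients.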

“Systemic treatment of the majority of Duchenne dystrophy will require multiple sequences to be delivered in the blood, and this study also is the first proof-of-principle of multiple exon-skipping in any organism,” Shin’ichi Takeda, MD, another senior author, said. “In order to realize that promise in human trials, it also will be important to re-evaluate current measures of toxicity, efficacy, and marketing that ensure both safety for the patient, as well as rapid development and distribution of life-saving drugs.”
