April 14, 2012

China had a GDP surge in March and Inland Cities are now driving growth

1. Bloomberg - China’s new yuan loans were the most in a year and money-supply growth unexpectedly accelerated after Premier Wen Jiabao moved to bolster the economy by cutting banks’ required reserves and helping small companies get funding.

Local-currency-denominated loans were 1.01 trillion yuan ($160.1 billion) in March, the People’s Bank of China said yesterday, the biggest surprise above forecasts in more than a year. M2, the broadest measure of money supply, grew 13.4 percent from a year earlier. China’s foreign-exchange reserves, the world’s largest, rose to a record $3.31 trillion as of March 31 after dropping for the first time in more than a decade in the fourth quarter.

Financial Times - Nomura strategist Zhiwei Zhang, who had forecast a well-below-consensus 7.8 per cent GDP growth for China in Q1 2012, now believes he needs to revise his forecast upwards.

Yes, the 8.1 per cent growth for Q1 was lower than consensus forecasts of 8.3 or 8.4 per cent. But no, it’s not causing the bears to rejoice.

However, our read of the higher frequency monthly data is that economic growth may have started to pick up late in the quarter (i.e., March). Industrial output growth rose to 11.6% y-o-y in March from an average of 11.4% in Jan-Feb; nominal fixed-asset investment growth eased only slightly to 20.9% y-o-y in Q1 from 21.0% in Jan-Feb; nominal retail sales growth picked up to 15.2% y-o-y from an average of 14.7% in Jan-Feb.

Zhiwei believes this (the March surge) indicates the worst may be over:

Our assessment is that China’s GDP growth has bottomed and that growth will start to pick up from this quarter. GDP is a rear-view-mirror statistic. Forward looking indicators, such as the OECD’s composite leading economic index for China and new loans (which surged to RMB 1.01trn in March) support our view. Policy easing, both fiscal and monetary, is underway, and we expect more to come, especially with CPI inflation likely to ease in the coming months.

Systematic underestimation of future life expectancy by 3 years for every 20-year forecast

Conversable Economist - What if retirement programs are underestimating how much longevity is likely to rise? The April 2012 Global Financial Stability report from the IMF tackles this question in Chapter 4: "The Financial Impact of Longevity Risk."

The main source of longevity risk is therefore the discrepancy between actual and expected lifespans, which has been large and one-sided: forecasters, regardless of the techniques they use, have consistently underestimated how long people will live. These forecast errors have been systematic over time and across populations. ... In fact, underestimation is widespread across countries: 20-year forecasts of longevity made in recent decades in Australia, Canada, Japan, New Zealand, and the United States have been too low by an average of 3 years. The systematic errors appear to arise from the assumption that currently observed rates of longevity improvement would slow down in the future. In reality, they have not slowed down, partly because medical advances, such as better treatments for cancer and HIV-AIDS, have continued to raise life expectancy.

If everyone in 2050 lived just three years longer than now expected—in line with the average underestimation of longevity in the past—society would need extra resources equal to 1 to 2 percent of GDP per year. If this longevity shock occurred today and society wanted to save to pay for these extra resources for the next 40 years (that is, fully fund these additional “pension liabilities”), advanced economies would have to set aside about 50 percent of 2010 GDP, and emerging economies would need about 25 percent of 2010 GDP—a sum totaling tens of trillions of dollars. As such, longevity risk potentially adds one-half to the vast costs of aging up to the year 2050—and aging costs themselves are not fully recognized in most long-term fiscal plans.
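The order of magnitude here can be checked with a simple undiscounted sum. This is only a back-of-envelope sketch; the IMF's 50%/25%-of-GDP figures reflect discounting and country differences, which this ignores.

```python
# Back-of-envelope check of the IMF longevity-risk figures: extra resources
# of 1 to 2 percent of GDP per year, saved up over 40 years. No discounting
# is applied (an assumption for illustration only).
def undiscounted_setaside(annual_cost_share, years=40):
    """Total set-aside expressed as a multiple of one year's GDP."""
    return annual_cost_share * years

low = undiscounted_setaside(0.01)   # 1% of GDP per year -> 0.4x of GDP
high = undiscounted_setaside(0.02)  # 2% of GDP per year -> 0.8x of GDP
print(low, high)
```

The undiscounted 40-80% range brackets the IMF's discounted 25-50% figures, so the quoted magnitudes are internally consistent.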

Kevin Murphy and Robert Topel estimated in a 2006 paper in the Journal of Political Economy ("The Value of Health and Longevity," 114:5, pp. 871-904) that the 30 years of additional life expectancy gained by an average American during the 20th century was worth $1.3 million per person.

So longer life spans are an economic good; we just have to raise retirement ages or restructure retirement programs to adjust to them.

China’s “overinvestment” problem may be greatly overstated

Economist - It is an article of faith that China needs to rebalance its economy by investing less and consuming more. Otherwise, it is argued, diminishing returns on capital will cramp future growth; or, worse still, massive overcapacity will cause a slump in investment, bringing the economy crashing down. So where exactly is all this excessive investment?

Most people point to the rapid growth in China’s capital spending and its unusually high share of GDP. Fixed-asset investment (the most widely cited figure, because it is reported monthly) has grown at a breathtaking annual rate of 26% over the past seven years. Yet these numbers are misleading. They are not adjusted for inflation and they include purchases of existing assets, such as land, that are inflated by the rising value of land and property. A more reliable measure, and the one used in other countries, is real fixed-capital formation, which is measured on a value-added basis like GDP. This has increased by a less alarming annual average of 12% over the past seven years, not that much faster than the 11% growth rate in GDP in that period.

The level of fixed-capital formation does look unusually high, at an estimated 48% of GDP in 2011 (see left-hand chart). By comparison, the ratio peaked at just under 40% in Japan and South Korea. In most developed countries it is now around 20% or less. But an annual investment-to-GDP ratio does not actually reveal whether there has been too much investment. To determine that you need to look at the size of the total capital stock—the value of all past investment, adjusted for depreciation. Qu Hongbin, chief China economist at HSBC, estimates that China’s capital stock per person is less than 8% of America’s and 17% of South Korea’s (see right-hand chart). Another study, by Andrew Batson and Janet Zhang at GK Dragonomics, a Beijing-based research firm, finds that China still has less than one-quarter as much capital per person as America had achieved in 1930, when it was at roughly the same level of development as China today.

How many people could live on Earth, each using 10 kilowatts of clean energy?

Robert Freitas went through detailed calculations of how much clean energy could be generated and used on Earth without affecting the world's climate; that is, how much waste heat could be dumped into the environment without seriously affecting the climate. The analysis assumes a level of nanotechnology sufficient to keep carbon dioxide levels stable. He estimates the limit at 10^15 watts (about half a percent of insolation). That would support one hundred billion people at a Western standard of living (10 kilowatts each).

It is possible to derive a limit to the total planetary active nanorobot mass by considering the global energy balance. Total solar insolation received at the Earth's surface is ~1.75 x 10^17 watts (IEarth ~ 1370 W/m2 ± 0.4% at normal incidence). Global energy consumption by mankind reached an estimated 1.2 x 10^13 watts (~0.02 W/m2) in 1998. This latter figure may also be regarded as the total heat dissipation of all human technological civilization worldwide, as distinct from the ~10^12 watt metabolic output of the global human biomass.

Converting the limit to the amount of nanobots

The hypsithermal ecological limit in turn imposes a maximum power limit on the entire future global mass of active nanomachinery or "active nanomass." Assuming the typical power density of active nanorobots is ~10^7 W/m3, the hypsithermal limit implies a natural worldwide population limit of ~10^8 m3 of active functioning nanorobots, or ~10^11 kg at normal densities. Assuming the worldwide human population stabilizes near ~10^10 people in the 21st century and assuming a uniform distribution of nanotechnology, the above population limit would correspond to a per capita allocation of ~10 kg of active continuously-functioning nanorobotry, or ~10^16 active nanorobots per person (assuming 1 micron3 nanorobots developing ~10 pW each, and ignoring nonactive devices held in inventory). Whether a ~10-liter per capita allocation (~100 kW/person) is sufficient for all medical, manufacturing, transportation and other speculative purposes is a matter of debate.
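The arithmetic in the excerpt above can be sketched in a few lines, using only the figures Freitas gives:

```python
# Sketch of the hypsithermal-limit arithmetic from the Freitas excerpt.
insolation_w = 1.75e17      # total solar power at Earth's surface
heat_limit_w = 1e15         # allowable waste heat (~0.5% of insolation)
per_capita_w = 10e3         # Western standard of living, 10 kW per person

nanomass_m3 = heat_limit_w / 1e7          # at ~1e7 W/m3 -> 1e8 m3 of nanorobots
people_supported = heat_limit_w / per_capita_w

print(heat_limit_w / insolation_w)        # ~0.0057, about half a percent
print(people_supported)                   # 1e11, one hundred billion people
```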

Of course with advanced technology it will be trivial to travel around and colonize the solar system and use the resources of the solar system.

Quantum Vacuum Propulsion

We previously covered the work of Harold White and Paul March on Quantum Vacuum propulsion.

Nuclear and Emerging Technologies for Space (2012) - Advanced Propulsion Physics: Harnessing the Quantum Vacuum.

Can the properties of the quantum vacuum be used to propel a spacecraft? The idea of pushing off the vacuum is not new; in fact, the idea of a “quantum ramjet drive” was proposed by Arthur C. Clarke (proposer of geosynchronous communications satellites in 1945) in the book Songs of Distant Earth in 1985: “If vacuum fluctuations can be harnessed for propulsion by anyone besides science fiction writers, the purely engineering problems of interstellar flight would be solved.” When this question is viewed strictly classically, the answer is clearly no, as there is no reaction mass to be used to conserve momentum. However, Quantum Electrodynamics (QED), which has made predictions verified to 1 part in 10 billion, also predicts that the quantum vacuum (lowest state of the electrodynamic field) is not empty, but rather a sea of virtual particles and photons that pop into and out of existence stemming from the Heisenberg uncertainty principle. The Dirac vacuum, an early vacuum model, predicted in 1928 the existence of the electron’s antiparticle, the positron, which was later confirmed in the lab by Carl Anderson in 1932. Confirmation that the Quantum Vacuum (QV) would directly impact lab observations came inadvertently in 1948 while Willis Lamb was measuring the 2s and 2p energy levels in the hydrogen atom. Lamb discovered that the energy levels were slightly different, contrary to prediction, but detailed analysis performed within weeks of the discovery by Bethe at Cornell predicted the observed difference only when factoring in contributions from the QV field. The Casimir force, derived in 1948 by Casimir in response to disagreements between experiment and model for precipitation of phosphors used with fluorescent light bulbs, predicts that there will be a force between two nearby surfaces due to fluctuations of the QV. This force has been measured and found to agree with predictions numerous times in multiple laboratories since its derivation.

What is the Casimir force? The Casimir force is a QV phenomenon in which two flat plates placed in close proximity in the vacuum preclude the appearance of particles whose wavelength is larger than the separation gap; the resultant negative pressure between the two surfaces is more negative than the pressure outside the two surfaces, hence they experience an attractive force.

Robert Zubrin makes the case against Population Control

New Atlantis - Robert Zubrin makes the case against population control.

Around the world, the population control movement has resulted in billions of lost or ruined lives. We cannot stop at merely rebutting the pseudoscience and recounting the crimes of the population controllers. We must also expose and confront the underlying antihumanist ideology. If the idea is accepted that the world’s resources are fixed with only so much to go around, then each new life is unwelcome, each unregulated act or thought is a menace, every person is fundamentally the enemy of every other person, and each race or nation is the enemy of every other race or nation. The ultimate outcome of such a worldview can only be enforced stagnation, tyranny, war, and genocide. The horrific crimes advocated or perpetrated by antihumanism’s devotees over the past two centuries prove this conclusively. Only in a world of unlimited resources can all men be brothers.

That is why we must reject antihumanism and embrace instead an ethic based on faith in the human capacity for creativity and invention.

In 1991, UNFPA head Nafis Sadik went to China to congratulate the oligarchs of the People’s Republic for their excellent program, which by that time had already sterilized, implanted IUDs in, or performed abortions on some 300 million people. “China has every reason to feel proud of and pleased with its remarkable achievements made in its family planning policy and control of its population growth over the past ten years,” she said. “Now the country could offer its experiences and special experts to help other countries.... UNFPA is going to employ some of [China’s family planning experts] to work in other countries and popularize China’s experience in population growth control and family planning.”

Sadik made good on her promise. With the help of the UNFPA, the Chinese model of population control was implemented virtually in its entirety in Vietnam, and used to enhance the brutal effectiveness of the antihuman efforts in many other countries, from Bangladesh and Sri Lanka to Mexico and Peru.

There are several key aspects to the population control debate.

Did population control need to be implemented historically on a country-by-country basis? What were the other options?

By eating less meat, a larger population could have been supported.
There were famines, but they were the result of mismanagement, not of there being too many people.

What is the carrying capacity of the world? Is it far higher than the current population?

What population numbers can be supported at each point in time going forward?

New super-rice in China has a yield of 13.5 tons per hectare. Past experience indicates that when it is scaled up, about 80% of the smaller-scale yield should result, around 10.5 tons per hectare. The highest rice yield in the world is in Australia, averaging about 9.9 tons per hectare (660 kg/mu), followed by 6.7 tons per hectare (445 kg/mu) in Japan. The yields of China’s super-rice have now reached 550 and 600 kg/mu, respectively, at large scale, as the result of the first two phases of development.

The developer of the super-rice believes yield could eventually increase to 15 tons a hectare (without bioengineering, using only the hybridization methods employed so far). Rice is the staple food for more than half of the world's population. Each hectare of rice production now feeds 27 people, and it will need to feed 43 people by 2050. Yuan told reporters that many countries in the world are suitable for planting China’s hybrid rice. If the area of its cultivation is increased by 75 million hectares globally, an increased yield of 2,000 kg per hectare would provide 150 million tonnes extra, feeding 400-500 million more people and effectively guaranteeing food security. China is ready to help people bid farewell to famine.

161 million hectares are currently used for growing rice, with an average yield of 4.2 tons per hectare. So deploying the existing super-rice to raise yields to 10.5 tons per hectare would multiply the world average yield by 2.5. It would increase China's yield by about 60%.
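The yield claims above can be checked directly from the article's own numbers:

```python
# Rough check of the rice-yield claims. The 0.8 scale-up factor and the
# article's rounding to 10.5 t/ha are taken from the text itself.
trial_yield_t_ha = 13.5
world_avg_t_ha = 4.2
scaled_t_ha = trial_yield_t_ha * 0.8      # ~10.8; the article rounds to 10.5

print(10.5 / world_avg_t_ha)              # 2.5x the current world average
print(2.0 * 75)                           # 2 t/ha gain over 75 Mha = 150 Mt extra
```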

Biotech crops reached 160 million hectares, up 12 million hectares (8% growth) from 2010, as the global population reached a historical milestone of 7 billion on 31 October 2011.

* 11.5% of the total arable land is used for biotech crops.
* 3.3% of all agricultural land.

So total arable land is about 1440 million hectares.
Total agricultural land is 5280 million hectares.
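Those totals follow from dividing the biotech area by its quoted shares; the shares and the totals appear to be rounded independently in the original report, so the implied figures come out slightly lower:

```python
# Back-calculating total land areas from the biotech shares quoted above
# (160 Mha of biotech crops = 11.5% of arable land, 3.3% of agricultural land).
biotech_mha = 160
arable_mha = biotech_mha / 0.115        # ~1,391 Mha (article: ~1,440 Mha)
agricultural_mha = biotech_mha / 0.033  # ~4,848 Mha (article: ~5,280 Mha)
print(round(arable_mha), round(agricultural_mha))
```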

World Crop production statistics

Latest Talk by Chapman on Laser pulse fusion propulsion

Last year, John Chapman of NASA proposed a pulsed laser system for megawatt class fusion propulsion.

In Chapman’s aneutronic fusion reactor scheme, a commercially available benchtop laser starts the reaction. A beam with intensity on the order of 2 x 10^18 watts per square centimeter, pulse frequencies up to 75 megahertz, and wavelengths between 1 and 10 micrometers is aimed at a two-layer, 20-centimeter-diameter target.

In the present study: the fusion reaction is spawned by a burst of terawatt-laser-driven protons, accelerated via Target Normal Sheath Acceleration (TNSA) from the substrate and its hydrogen content, liberating ~180 to 575 keV protons which react with the target.

Thrust is realized via the Lorentz reaction of electromagnetic forces coupled to the spacecraft frame.

Project Icarus look at fusion and antimatter propulsion

1. Project Icarus: Specific Power For Interstellar Missions Using Inertial Confinement Fusion Propulsion.

Many studies have examined the engineering requirements for an interstellar fusion-propelled mission for the 21st Century, including a comprehensive NASA study in the early 1990s. These studies considered variations in specific power from 1-100 kW/kg for mission profiles lasting 100 to hundreds of years, and examined one- to four-stage engine designs for both flyby and rendezvous type missions, with specific impulses varying up to 1 million seconds and with varying jet power ratings.

In the 1970s, members of The British Interplanetary Society (BIS) designed an interstellar flyby spacecraft called Daedalus. The aim was a design with a specific power of 100 MW/kg, which corresponded to the optimum mass ratio but is a million times higher than the NASA studies had examined. However, due to a longer mission profile than originally planned, the Daedalus design ended up with a specific power of around 40 MW/kg.

This paper considers the range of specific powers for an interstellar mission in the application of inertial confinement fusion propulsion type systems using propellant combinations such as D/D and D/He3. These methods are being examined for the recent Project Icarus which aims to evolve the Daedalus probe to an improved design.

The Nuclear and emerging technologies for space conference 2012 had many interesting talks about space technology

Helicon Injected Inertial Plasma Electrostatic Rocket, HIIPER

This paper presents a radically new class of nuclear electric thrusters that has the advanced capabilities necessary to perform missions previously unfeasible. This light-weight propulsion system called HIIPER (Helicon Injected Inertial Plasma Electrostatic Rocket) employs one of the highest density plasma sources (Helicon source) for plasma production and one of the most erosion-resistant accelerators (Inertial Electrostatic Confinement (IEC)) for plasma acceleration.

Although the helicon source and the IEC have each been used separately for space propulsion, issues of longevity, scalability and cost have always been a barrier to more comprehensive interplanetary exploration. This is the first time that all these limitations have been overcome, by using a helicon stage to produce and inject very high density plasma into the IEC stage, which accelerates ions to high energies (multi-kV), forming an ultra-high-intensity, pencil-thin plasma jet exhaust that produces exceptional thrusting capability. The high ion energies plus the ability to use a wide variety of gases, e.g. nitrogen or argon as well as conventional xenon, provide high ISP, and with the addition of a heavier propellant gas in the nozzle, a variable power-ISP tradeoff. HIIPER is also the only electric thrust system to date that can potentially be converted to a multi-role self-powered fusion spacecraft and propulsion system (e.g. the conceptual version with p-B11 fusion power, VIPER, for ultra-fast deep space probe missions). Indeed, the IEC part is already used to fuse deuterium in commercially available low-level neutron generators for Neutron Activation Analysis (NAA).

HIIPER allows for improved variable specific impulses and high thrust to power ratio by decoupling the ionization (helicon) and acceleration (IEC) stages of the plasma thruster. While VASIMR uses decoupling with ICRH antenna heating, the IEC heating section allows unmatchable ion energies, power scaling and efficiency, with the added advantage of being simple and light-weight. The current 500-Watt HIIPER lab experiment is capable of specific impulses around 3,000 s, with a final multi-kilowatt device capable of around 276,000 s.

April 13, 2012

Japan government and agencies declare two reactors safe for restart

Two idled Japanese nuclear reactors have been declared safe and will need to be restarted to avoid a summer power crunch in western Japan, the trade minister said on Friday, a step towards the first restart in Japan since last year's Fukushima crisis.

Trade Minister Yukio Edano also said that he will visit Fukui prefecture, host to the No. 3 and No. 4 reactors at Kansai Electric Power Co's Ohi nuclear power plant, on Saturday to meet with the governor and Ohi town mayor and to convince them of the necessity for the restarts.

Edano set no deadline for the reactor restarts, but implied that he hopes to obtain public backing by July, when the hottest season starts.

Fukui prefecture, host to 13 reactors, cannot legally block restarts, but Tokyo has made clear it is reluctant to override wary public opinion.

All but one of Japan's 54 nuclear reactors are now off line, most of them for regular maintenance checks, as public concerns over nuclear safety have kept them from restarting. The last reactor will shut down on May 5.

Edano, who holds the energy portfolio, said that Kansai Electric's power supply this summer may fall up to 20 percent short of peak-hour demand and that a sudden power outage would have a wide impact.

Enabling a closed thermal cycle for power plants using core-shell nanoparticles

Argonne National Laboratory - Nanoparticles based on what is known as a “core-shell” configuration, in which a solid outer coat protects an inner layer that can melt above a certain temperature, will be mixed with the coolant water of a thermal power plant (coal, natural gas or nuclear). Once dispersed in the plant’s water supply, the nanoparticles are able to absorb heat during the thermal cycle. After partially melting, the particles travel to the cooling tower where they resolidify. The system is closed and designed to ensure against leakage of the plant’s water or steam into the environment.

If this works, they will be able to use a fixed amount of coolant water with the nanoparticles to enhance the heat exchange. No water would leak from or need to be added to the cooling system.

To operate, electrical plants run a cycle that uses partially condensed high-temperature steam to turn a large turbine. During generation, a significant quantity of this steam is lost due to evaporation. “In every cycle, there’s a significant amount of water that we can’t recapture,” said Argonne materials scientist Dileep Singh, who is working to develop the specialized nanoparticles.

At the molecular level, Singh and his colleagues are especially concerned with the surface of the nanoparticles, as the chemistry at the boundary between the metal and the water determines how much heat the particles can take up. “We’re experimenting with looking at the bonding between the particles and the water molecules,” he said.

“What we really want to know is how much heat we can pick up given a constant amount of water to cool the system,” he added. “Environmentally responsible energy growth involves worrying about how you manage your water resources.”

Argonne is working with the Electric Power Research Institute and other partners to move this basic technology quickly through the developmental pipeline. Initial plans call for the demonstration of proof of concept to commence this year and full-scale commercial deployment to begin in four years. “It’s practically unheard of for industry to seek to deploy a new technology so quickly,” Ewing said. “However, water consumption is a major issue that limits the expansion of power. If we want to solve the energy crisis, we’ll have to move boldly.”

Photo-driven Molecular Wankel Engine

Arxiv - Photo-driven Molecular Wankel Engine (4 pages)

Technology Review - Clusters of boron atoms should behave like rotary Wankel engines when bathed in circularly polarised light. Nanotechnologists have identified many molecular motors and even a few rotary versions (ATP springs to mind). What makes this one special is that the polarised light doesn't excite the molecule's electronic ground state, leaving it free to be chemically active. By contrast, other forms of molecular power such as chemical or electric current can generate heat that has a critical effect on the system. For the moment, the photon-powered molecular Wankel engine is merely an idea, the result of some detailed chemical modelling.

We report a molecular Wankel motor, the dual-ring structure B13+, driven by circularly-polarized infrared electromagnetic radiation, under which a guided uni-directional rotation of the outer ring is achieved with rotational frequency of the order of 300 MHz.

(a) Schematic of the outer ring rotation. Three identical counterclockwise elementary rotations are involved in the sequence 1-2-3-4. The orientation of the molecule is defined by the blue arrow, which is aligned with the C2v axis of the molecule. (b) Relative ground state energies (left axis, solid black line) and vibrational frequencies (right axis, red dashed line) of the elementary rotation of the B13+ as a function of the angle when a constant electric field ( V/m) is applied. (less than 180º) is defined as the angle between the field direction and the molecule. The blue (red) region indicates that a counterclockwise rotation depicted in (a) is favored (unfavored) at a given angle (see text).

Defkalion Green Technologies targets operational prototype low energy nuclear reaction reactor for July, 2012

Defkalion indicates that they will have a fully operational prototype cold fusion reactor by July, 2012

Defkalion has successfully managed to trigger and monitor Chemically Assisted Low Energy Nuclear Reactions caused by Nickel and Hydrogen nuclei. This unique LENR technology is based on proprietary methodologies and engineering designs.

Defkalion will become the first global player in LENR using its proprietary technology to commercialize its new products and diversify into new Research and Development efforts for new applications. It plans to have a fully operational prototype ready by July 2012.

Defkalion has a system that is similar to the Rossi Energy Catalyzer.

DARPA will buy prototype heads-up display for combat data in a contact lens

Defense Advanced Research Projects Agency (DARPA) just signed a contract under which an optics company called Innovega in Bellevue, Wash., will develop a version of its dual-focus contact lens and accompanying HUD (heads-up display) eyeglasses to DARPA's specs.

The contact lens allows human eyes to simultaneously focus on images that are ultra-close and those that are farther away, by piping in the image from an accessory such as a HUD and displaying it using the multiple-focus optic splitter designed into the contact lenses.

Making wearable HUD displays smaller, lighter, more energy efficient and easier to use is a huge priority for those within the military eager to put tactical information in the hands of troops who need it. Past efforts have provided useful information – according to the evaluations of the troops testing them in the field – but were too heavy, too awkward or displayed data too out of date to be useful to most ground troops.

Innovega says a contact lens with heads up display (HUD) could be released to the public in 2014

BBC News - The lenses allow the wearer to focus on both the information projected onto the glasses' lenses and the more distant view that can be seen through them. They do this by having two different filters.

The central part of each lens sends light from the HUD towards the middle of the pupil, while the outer part sends light from the surrounding environment to the pupil's rim.
The images are displayed in the glasses, but the contact lenses let the person focus on both the augmented data and the regular field of view.

IBM Watson-style system referencing complete indexed and annotated whole-lifetime video records to try to pass the Turing Test

Wired - “Two revolutionary advances in information technology may bring the Turing test out of retirement,” wrote Robert French, a cognitive scientist at the French National Center for Scientific Research, in an Apr. 12 Science essay. “The first is the ready availability of vast amounts of raw data — from video feeds to complete sound environments, and from casual conversations to technical documents on every conceivable subject. The second is the advent of sophisticated techniques for collecting, organizing, and processing this rich collection of data.”

“Is it possible to recreate something similar to the subcognitive low-level association network that we have? That’s experiencing largely what we’re experiencing? Would that be so impossible?” French said.

Science - Dusting Off the Turing Test

Hold up both hands and spread your fingers apart. Now put your palms together and fold your two middle fingers down till the knuckles on both fingers touch each other. While holding this position, one after the other, open and close each pair of opposing fingers by an inch or so. Notice anything? Of course you did. But could a computer without a body and without human experiences ever answer that question or a million others like it? And even if recent revolutionary advances in collecting, storing, retrieving, and analyzing data lead to such a computer, would this machine qualify as “intelligent”?

Better simulations of multicore chips

The computer chips of the future are likely to have hundreds or even thousands of cores. For chip designers, predicting how these massively multicore chips will behave is no easy task. Software simulations work up to a point, but more accurate simulations typically require hardware models — programmable chips that can be reconfigured to mimic the behavior of multicore chips.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) presented a new method for improving the efficiency of hardware simulations of multicore chips. Unlike competing methods, it guarantees that the simulator won’t go into “deadlock” — a state in which cores get stuck waiting for each other to relinquish system resources, such as memory. The method should also make it easier for designers to develop simulations and for outside observers to understand what those simulations are intended to do.

Hardware simulations of multicore chips typically use devices called field-programmable gate arrays, or FPGAs. An FPGA is a chip with an array of simple circuits and memory cells that can be hooked together in novel configurations after the chip has left the factory. The chips sold by some small-market manufacturers are, in fact, specially configured FPGAs.

World Nuclear Generation in 2011 was 2518 TWh

Total nuclear electricity generation in 2011 was 2518 TWh, 4.3% less than the 2630 TWh generated in 2010, according to figures from the International Atomic Energy Agency (IAEA). Generation had increased in 2010 following three consecutive years of decline. Japan's generation was down 127.7 TWh and world generation was down 112 TWh. Germany's reactor shutdowns accounted for another 30.7 TWh decrease. So the rest of the world generated 46.4 TWh more once the Japanese and German declines are excluded. Looking at 2012, generation will increase from 14 new nuclear reactors. However, if Japan's reactors are not restarted, Japan would end up with 140 TWh less nuclear generation in 2012 than in 2011.
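The generation figures reconcile as follows:

```python
# Reconciling the 2011 world nuclear generation figures above (all TWh).
world_2010, world_2011 = 2630, 2518
japan_drop, germany_drop = 127.7, 30.7

net_world_drop = world_2010 - world_2011              # 112 TWh
rest_of_world_gain = japan_drop + germany_drop - net_world_drop
print(net_world_drop, round(rest_of_world_gain, 1))   # 112 46.4
```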

Japan's nuclear electricity generation fell by 44.3% in 2011 to 152.6 TWh, compared with 280.3 TWh in 2010.

The accident resulted in Germany imposing a three-month moratorium on the operation of its oldest reactors. This was followed by a decision to permanently shut down the seven units that began operating in or before 1980, plus one reactor that had been in long-term shutdown. As a result, Biblis A and B, Neckarwestheim 1, Brunsbüttel, Isar 1, Unterweser, Philippsburg 1 and Krümmel were closed, wiping 8336 MWe from Germany's generation capacity. These shutdowns contributed to a 23.1% decrease in the country's nuclear electricity generation, from 133.0 TWh in 2010 to 102.3 TWh in 2011.

April 12, 2012

Lighter roofs and roads would offset 130 to 150 billion tons of CO2

The combined size of global urban areas is around 2 million square kilometers.
-- Over 50% of the world's population currently lives in urban areas. This is expected to increase to 70% by 2040.
-- Pavements and roofs comprise over 60% of urban surfaces (25% roof and 35% pavement).

"Typically roofs are resurfaced (or changed) about every 20-30 years; paved surfaces are resurfaced about every ten years. When roofs or paved surfaces are installed, they can be changed to materials with high solar reflectance, typically at no incremental cost," the researchers write.

Lead author Professor Hashem Akbari said: "It is all based on planning, codes and policies. If we really put the nuts and bolts in place, we can get close to 100 per cent of urban areas increasing the albedo of surfaces."

Increasing the reflectance – commonly known as albedo – of every urban area by 0.1 will give a CO2 offset between 130 and 150 billion tonnes. This is equivalent to taking every car in the world off the road for 50 years, assuming a single car gives off around 4 tonnes of carbon dioxide a year.
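For scale, the car equivalence works out as follows (a rough check using only the figures quoted above):

```python
# Rough check of the car equivalence: a 130-150 billion tonne CO2 offset,
# at 4 tonnes of CO2 per car per year, over 50 years.
offset_low, offset_high = 130e9, 150e9   # tonnes of CO2 offset
tonnes_per_car_year = 4.0
years = 50
cars_low = offset_low / (tonnes_per_car_year * years)
cars_high = offset_high / (tonnes_per_car_year * years)
# 650-750 million cars: roughly the size of the world's passenger-car fleet.
print(round(cars_low / 1e6), round(cars_high / 1e6))   # 650 750
```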

Progress for waste heat engine to make clean power from biomass integrated with biochar to remediate water and soil pollution

Cyclone’s Waste Heat Engine (WHE, pronounced “we”) recaptures heat from external sources to create steam, which powers the engine. The WHE models are designed to run a grid-tied or primary electric power generator while producing zero emissions.

The WHE is designed to run on heat sources as low as 500°F from many different external sources of “wasted” heat, such as:

* Commercial or small-scale industrial ovens or furnaces
* Landfill and industrial gas flares
* Engine exhaust – from vehicles or power generators
* Biomass combustion – dry, vegetative waste materials

A licensee in China, Great Wall Alternative Power Systems Ltd., has completed the build of the first prototype engines under its License Agreement with the Company, and has begun in-house testing of these units.

The engines built by Great Wall are based on Cyclone’s WHE-25 design, and are meant for use with biomass-to-power generator systems. Applications will include distributed combined heat and power (CHP) systems, and power sources for bio-char producing environmental remediation equipment. Initial compressed air testing of Great Wall’s engines has been successful, and steam testing will commence shortly. These units will ultimately be manufactured and sold only in China.

Great Wall’s Managing Director, Robert Devine, commented: “We see a multi-billion dollar market for distributed power in China’s rural areas. With the Cyclone Engine, we can deliver viable, low cost biomass-based power solutions integrated with a bio-char process that can help remediate water and soil pollution. Operating within China can sometimes be challenging, and that has admittedly pushed back our production schedule. We are pleased to be back on track, and fully committed to seeing this project through to completion.”

Global clean energy spending was $263 billion; the US spent more money while China installed more power

Global clean energy finance and investment grew to $263 billion in 2011, a 6.5 percent increase over the previous year, according to new research released by The Pew Charitable Trusts.

The US spent the most money on clean energy at $48 billion (which probably includes the Solyndra investment). China has 133 GW of installed renewable capacity versus 93 GW in the US.

In the United States, which attracted $48 billion last year, investors took advantage of the country’s stimulus programs before they expired at the end of 2011, as well as the production tax credit for electricity from renewable energy, which is to end this December.

Among renewable technologies, solar increased globally by 44 percent, attracting $128 billion and accounting for more than half of all clean energy investment among members of the G-20. Dramatic price declines, with the cost of solar modules dropping by half in the past 12 months, fueled the activity. Wind prices also were lower in 2011.

The combination of falling prices and growing investments accelerated installation of clean energy generating capacity by a record 83.5 gigawatts (GW) in 2011. Almost 30 GW of new solar and 43 GW of wind power were deployed. Renewable power generating capacity, at 565 GW globally, was nearly 50 percent more than installed nuclear generating capacity in 2010.

Skin transformed into liver cells to treat an inherited disease

Guardian UK - Scientists have taken skin cells from a patient with liver disease and turned them into replacement liver cells, in a biological tour de force that promises to transform how the condition is treated.

The procedure will have to undergo several years of trials before it can be used in humans, but if approved, it could launch a new era of personalised therapies for serious genetic disorders.

In Britain 30,000 people carry a genetic defect that causes alpha-1 antitrypsin deficiency, a disease that can only be cured by a liver transplant. The operation requires a suitable donor organ and costs around £500,000, with drugs to prevent rejection by the immune system adding more than £20,000 a year to medical costs.

Treating a patient with their own cells removes the need for anti-rejection drugs, reduces the burden on strained transplant services and is likely to be cheaper, the scientists behind the technique believe.

The scientists now hope to partner with a major pharmaceutical firm and work towards trials in people. Rather than injecting the cells directly into patients, the cells will probably be encapsulated in a porous bag. This will ensure that patients are not put at risk if some of the cells turn out to be faulty and develop into tumour cells.

Scientists elsewhere are now expected to develop the procedure to treat other genetic conditions, including those that require the correction of several mutations at once.

Nature - Targeted gene correction of α1-antitrypsin deficiency in induced pluripotent stem cells

More Evidence that the Mars Viking Probes 'Found Life'

Discovery News - Mathematical analysis adds to a growing body of work questioning the negative results of a life-detection experiment 36 years ago.

The new study took a different approach. Researchers distilled the Viking Labeled Release data, provided as hard copies by the original researchers, into sets of numbers and analyzed the results for complexity. Since living systems are more complicated than non-biological processes, the idea was to look at the experiment results from a purely numerical perspective.

They found close correlations between the complexity of the Viking experiment results and that of terrestrial biological data sets. They say the high degree of order is more characteristic of biological than of purely physical processes.

International Journal of Aeronautical and Space Sciences - Complexity Analysis of the Viking Labeled Release Experiments

Russia will make a big push for offshore oil

DARPA offers $2 million prize for rescue robot challenge

DARPA plans to offer a $2 million prize to whoever can help push the state of the art in robotics beyond today’s capabilities in support of the DoD’s disaster recovery mission.

DARPA’s Robotics Challenge will launch in October 2012. Teams are sought to compete in challenges involving staged disaster-response scenarios in which robots will have to successfully navigate a series of physical tasks corresponding to anticipated, real-world disaster-response requirements.

The primary goal of the DARPA Robotics Challenge program is to develop ground robotic capabilities to execute complex tasks in dangerous, degraded, human-engineered environments. The program will focus on robots that can utilize available human tools, ranging from hand tools to vehicles. The program aims to advance the key robotic technologies of supervised autonomy, mounted mobility, dismounted mobility, dexterity, strength, and platform endurance. Supervised autonomy will be developed to allow robot control by non-expert operators, to lower operator workload, and to allow effective operation despite low fidelity (low bandwidth, high latency, intermittent) communications.

42-page description of the $2 million DARPA robotics challenge

Texas Oil Commissioner talks about possible 4 million barrels of oil per day in 2016 from Texas

In 2011 oil production in Texas and North Dakota dramatically increased.

North Dakota's oil production has continued to increase in 2012 and is at 558,254 barrels of oil per day in February.

Texas produced 1,427 thousand barrels per day in 2011, 22 percent more than in 2010 and the highest amount of oil production since 1997. This increase is primarily from the Eagle Ford shale formation in south Texas.

Texas field production of crude oil is at 1.67 million barrels of oil per day in January, 2012.

There is a difference between how the EIA calculates Texas field production and the Texas Railroad Commission's tracking of crude oil production. The Railroad Commission is reporting about 33 million barrels of crude oil for February, which would be about 1.15 million barrels of oil per day. I believe the difference has to do with differing definitions of oil.
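The Railroad Commission's monthly figure converts to a daily rate like this (February 2012 had 29 days):

```python
# Convert the Railroad Commission's ~33 million barrels for February 2012
# to a daily rate.
feb_barrels = 33e6
feb_days = 29
daily = feb_barrels / feb_days
print(round(daily / 1e6, 2))   # ~1.14, i.e. the "about 1.15 million" cited above
```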

There are projections for Texas oil production to exceed 2 million barrels per day (based on the Texas Railroad Commission definitions).

“I could paint a scenario for you where we are producing 3 million more barrels per day by 2016, which would almost get us to the point where we could eliminate 60 to 70 percent of our OPEC imports,” said Texas Railroad Commissioner Barry Smitherman. “With that greater control over our own energy security, we could care less about what happens in the Strait of Hormuz” — the narrow mouth of the Persian Gulf that serves as a seaway for 22 percent of the world’s oil supply.

The Texas commissioner who oversees the state's oil industry is saying that 4 million barrels per day of oil production in Texas is feasible by 2016.

Times Record News - Oil production in Texas increased 9.5 percent in February compared to February 2011. Crude oil has once again regained its dominance as employment reaches a new high, with 33,500 jobs added in Texas in the previous 12 months. Texas producers completed 869 oil wells during February, up more than 230 percent from the 262 oil-well completions reported in February 2011. Higher well counts and higher wellhead prices for crude oil combined to push the value of Texas-produced crude up 25.7 percent in February to more than $3.6 billion.

Alaska produced 563 thousand barrels per day in 2011, 6 percent less than in 2010. Alaska was at 612,000 barrels per day in January, 2012.

California produced 538 thousand barrels per day in 2011, 3 percent less than in 2010. California was at 535,000 barrels of oil per day in January, 2012.

North Dakota Oil in February at a record 558,254 barrels per day

North Dakota oil and gas report from Lynn Helms, Director of NDIC Department of Mineral Resources.

Jan Oil 16,935,846 barrels = 546,318 barrels/day
Feb Oil 16,189,355 barrels = 558,254 barrels/day (preliminary) (NEW all-time high)

Jan Gas 17,738,391 MCF = 572,206 MCF/day
Feb Gas 17,437,469 MCF = 601,292 MCF/day (preliminary) (NEW all-time high)

Jan Producing Wells = 6,624
Feb Producing Wells = 6,726 (NEW all-time high)

Jan Permitting: 170 drilling and 0 seismic
Feb Permitting: 181 drilling and 5 seismic (all time high was 245 in Nov 2010)
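The monthly totals above convert to the reported daily rates as follows (January has 31 days; February 2012 had 29):

```python
# Reproduce the barrels/day and MCF/day figures from the monthly totals.
jan_days, feb_days = 31, 29
jan_oil = round(16_935_846 / jan_days)
feb_oil = round(16_189_355 / feb_days)
jan_gas = round(17_738_391 / jan_days)
feb_gas = round(17_437_469 / feb_days)
print(jan_oil, feb_oil)   # 546318 558254 barrels/day
print(jan_gas, feb_gas)   # 572206 601292 MCF/day
```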

North Dakota has likely passed California's oil production (about 540,000 barrels per day) and is approaching Alaska's 2011 average (563,000 barrels per day). Texas produces over 1.4 million barrels per day.

April 11, 2012

US Crude Oil Production Back over 6 million barrels per day for the first time since 2000

US crude oil production is over 6 million barrels per day, according to the weekly Energy Information Administration report.

Net imports of oil are at about 7.5 million barrels per day, which is about the level they were in 1996.

Fission Fragment Rocket Engine exhaust velocity at 1.7% of light speed

Proposal for a Concept Assessment of a Fission Fragment Rocket Engine (FFRE) Propelled Spacecraft (15-page presentation from the NASA NIAC Spring Symposium)

Can Free Spacecraft From Today’s Propulsion Limitations
• Far Less Propellant Than Chemical Or Nuclear Thermal
• Far More Efficient Than Nuclear Electric
• Far Safer: Charge Reactor In Space, Radioactivity Ejected

Has Highest Exhaust Velocity Possible Today
• 10s to 100s of lbs of continuous thrust (for years)
• Specific impulse above 500,000 sec

• Faster Travel
• More Payload
• Nearly Unlimited Electrical Power
• Greater Human Safety (Mission Travel, Maintenance)
• No Need For Vast Propellant Supply
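The headline's roughly 1.7% of light speed figure presumably follows from the quoted specific impulse, since exhaust velocity is Isp × g0 (my back-of-envelope arithmetic, not a number from the slides):

```python
# Exhaust velocity implied by a 500,000 s specific impulse.
g0 = 9.80665           # m/s^2, standard gravity
c = 299_792_458        # m/s, speed of light
isp = 500_000          # s, from the slide above
v_exhaust = isp * g0   # ~4.9 million m/s
print(round(100 * v_exhaust / c, 2))   # ~1.64 percent of c
```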

Fusion Propulsion Based on the Inductively-Driven, Metal Propellant Compression of an FRC Plasmoid

At the NASA NIAC Spring Symposium, John Slough presented Nuclear Propulsion through Direct Conversion of Fusion Energy (30 pages)

John Slough could have an experiment in 2012 with a net gain in fusion energy of 1.6. It will be an imploding liner experiment. For space propulsion he is targeting a 200-times gain in energy output over input. Mission profiles are for 30-day or 90-day missions to Mars with a specific impulse of over 5,000 seconds.

* Lowest mass fusion system is realized with FRC (Field Reversed Configuration) compressed by convergent array of magnetically driven metal foils - steps (a), (b)

* Fusion neutron and particle energy is directly transferred to the encapsulating, thick metal blanket - step (c)
−Provides spacecraft isolation from fusion process
−Eliminates need for large radiator mass

* Expansion of hot, ionized propellant in magnetic nozzle - step (d)
−Produces high thrust at optimal Isp

Solar Power Satellite via Arbitrarily Large PHased Array

Space.com - Last August, Artemis Innovation Management Solutions was selected for a NASA NIAC award to dive into the details of what Mankins labels "the first practical solar-power satellite concept."

24-page presentation at the NIAC Spring Symposium

A preliminary draft of an estimated schedule would target getting a 1.2 gigawatt system up by 2024.

The project will be an energetic one-year study of the design. SPS-ALPHA is short for Solar Power Satellite via Arbitrarily Large PHased Array.

Along with reviewing the conceptual feasibility of the SPS-ALPHA, the team will carry out select proof-of-concept technology experiments.

SPS-ALPHA is a novel "biomimetic" approach to the challenge of space solar power, Mankins told SPACE.com.

Biomimetic refers to human-made processes, substances, devices or systems that imitate nature. The booming field of biomimetics is of interest to researchers in nanotechnology, robotics, artificial intelligence, the medical industry and the military.

Megawatts of power

If successful, Mankins said, this project would make possible the construction of huge platforms from tens of thousands of small elements that could deliver tens to thousands of megawatts remotely and affordably, using wireless power transmission, to markets on Earth as well as to missions in space.

Maxing out technology proven within 5 years for 2027 to 2032

I previously looked at a technological outline for the next 30 years. Here I will focus on what could be possible in the 2027 to 2032 timeframe.

Robin Hanson had a vision of accelerated economic activity based upon whole brain emulation in mini robots. My projection will focus on ways to accelerate economic activity.

I have bolded the technologies of the mundane singularity with the most economic impact.

More efficient, factory-mass-produced skyscrapers are already being built. A 100 by 100 grid of 200-story skyscrapers would provide housing and offices for 1 billion people in a 10-mile-by-10-mile area.

Larger and more integrated cities, a new wave of robotic automation, robotic driving, terabit broadband and new energy sources could boost GDP to over three times what it would otherwise be over a twenty-year period. Global GDP growth could reach the 12-18% per year range from 2020 onwards.
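For scale, tripling GDP relative to the baseline over twenty years corresponds to an extra compound growth rate of roughly 5.6 percentage points per year on top of whatever the baseline would be (my arithmetic, not a figure from the post's sources):

```python
# Extra compound annual growth implied by "three times what it would
# otherwise be over a twenty year period".
extra_rate = 3 ** (1 / 20) - 1
print(round(100 * extra_rate, 1))   # ~5.6 extra percentage points per year
```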

1. Pro-growth Policies (variable and uncertain by region)
2. Energy Efficiency - superconductors, thermoelectrics, improved grid
3. Energy Revolution - Mass produced fission, fusion, and maybe cold fusion
4. Additive manufacturing

5. Not so mundane - neuromorphic chips, quantum computers, photonics
6. Automated transportation (leading to robotic cars and planes)
7. Urbanization MegaCities
8. Urbanization Broad Group skyscrapers, Tata flat packed buildings
9. Robotics
10. Hyperbroadband

11. Supermaterials
12. Improve medicine and public health
13. Space
14. Synthetic biology and recombineering
15. Sensors everywhere
16. Education transformed and accelerated innovation
17. Supersmartphones, exoskeletons and wearable systems

18. Memristors and other significant computing and electronic improvements.

Buildings use 50% of the raw materials used each year in the world and 40% of the energy. Radical reductions in material usage and rapid increases in building energy efficiency will have a hugely beneficial impact on energy and the environment.

Broad Group of China has already built 15-, 30- and 50-story factory-mass-produced skyscrapers. The company is targeting 30% of the global construction market by 2020. If that target were reached, then in the 2020s the urbanization of the developing world would be accelerated.

From 2020 to 2030, one hundred thousand 200-story Sky City skyscrapers could theoretically be built to provide homes and offices for 5 billion people.

Sky City would be a 200-story building that could hold about 100,000 people and would cost about $1.25 billion.

Each skyscraper could also hold 1 million to 10 million robotic servants of different shapes and sizes.

Improvements in the factory mass production methods and processes (and the lowering of costs for the Nth copy) could lower the costs of the factory mass produced skyscraper by 2 to 3 times.

$500 million for each of 100,000 sky cities would be enough space for 5 billion people at a total cost of $50 trillion. $100 trillion will be spent on construction in the world in this decade. In the next decade world construction spending will likely increase to $150 trillion.
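The cost and capacity figures above check out as follows (treating each of a tower's 100,000 occupant slots as serving either a home or an office, which is my reading of how 100,000 towers provide "homes and offices for 5 billion people"):

```python
# Sky City cost and capacity arithmetic from the figures above.
cost_per_tower = 500e6       # dollars, after assumed 2-3x mass-production savings
towers = 100_000
slots_per_tower = 100_000    # people one tower can hold (homes plus offices)
total_cost_trillions = cost_per_tower * towers / 1e12
slots_billions = towers * slots_per_tower / 1e9
print(total_cost_trillions)  # 50.0 trillion dollars
print(slots_billions)        # 10.0 billion slots: homes + offices for 5 billion
```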

If 10,000 of the Sky Cities were built in a 100 by 100 grid, they would cover an area of 15 kilometers by 15 kilometers: a supercity with 1 billion people.

Notice that these buildings are five times more energy efficient, would reduce CO2 production, use six times less material (less cement and smarter use of steel) and would cut particulates with 99% less construction dust.

During the construction of new buildings and cities lighter and more reflective material should be used to offset more carbon dioxide.

New wave of Robotic automation

DARPA is working on robotic avatars now and has a grand challenge for humanoid robots.
Foxconn is building 1 million industrial robots over the next three years.
There could be one billion iRobot Ava-like robots with computer tablets for their heads by 2020-2022.
By 2027-2032, the tablets could have tens of teraflops of performance and could have neuromorphic chip and quantum computer chip co-processors.

By 2027 to 2032, the new robots could out-number people by 2 to 10 times and enable higher productivity and more world economic growth.

April 10, 2012

Robin Hanson's updated analysis of the economic impact of Whole Brain Emulation artificial intelligence

Robin Hanson's best guess for the next revolution on the scale of the industrial, or farming, or human revolutions, is artificial intelligence in the form of whole brain emulations, or “ems.”

The PowerPoint slides for the talk are here.

Sander Olson interviewed Robin Hanson in 2010

Question: You have written about how a major AI advance could bring about a step change in economic growth. Could another breakthrough, such as fusion power or molecular manufacturing, lead to similar growth?

Answer: We actually spend less than 10% of the economy on energy, so it is difficult to see how an energy advance by itself could bring about exponential changes in economic growth. Molecular manufacturing is more plausible, but it is still a long shot. We only spend about 15% of the economy on manufacturing, though advanced nanotech could facilitate a cascade of changes. So artificial intelligence is the most plausible way towards much faster economic growth rates in a short timeframe.

An earlier version of the talk was given at the Foresight 2010 conference.
Robin Hanson: "Economics of Nanotech and AI" at Foresight 2010 Conference from Foresight Institute on Vimeo.

Zyvex Marine will enable superior small manned and unmanned navy ships

Zyvex has made 54-foot speed boats out of its carbon-nanotube-enhanced fiber, Arovex. Arovex is about twice as strong by weight as carbon fiber. It promises significant efficiency gains over boats made from fiberglass or aluminum. The 54-foot craft has demonstrated a fuel consumption of 12 U.S. gallons (45.4 liters) per hour at a cruising speed of 24 knots (44 km/h). This, Zyvex claims, constitutes a 75-percent fuel saving compared to a "traditional" boat consuming 50 U.S. gallons (189 liters) per hour, allowing ten times the range. That's a claim almost as bold as it is hazy, and in the absence of any precise figure on range, it's worth repeating the claims made about the prototype: an 8,000-pound boat capable of carrying a 15,000-pound payload a distance of 2,500 miles (4,000 km).
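A quick check of those numbers (assuming the same fuel load and cruise speed) shows both what holds up and why the range claim reads as hazy:

```python
# Fuel-saving and range arithmetic for the quoted consumption figures.
arovex_gph = 12.0        # gallons per hour, Zyvex craft at 24 knots
conventional_gph = 50.0  # gallons per hour, "traditional" boat
saving = 1 - arovex_gph / conventional_gph
range_multiple = conventional_gph / arovex_gph   # same fuel load, same speed
print(round(100 * saving))        # 76 -> the "75-percent fuel saving"
print(round(range_multiple, 1))   # 4.2 -> well short of "ten times the range"
```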

Space-based Solar Power: Possible Defense Applications and Opportunities

Space-based Solar Power: Possible Defense Applications and Opportunities for Navy Research Lab Contributions (105 pages)

A Forward Operating Base (FOB) exists to support a small number of reconnaissance and surveillance teams as well as for military power projection ahead of primary forces. As such, the FOB can be anywhere from 50 to 5000 personnel because it is task-organized and scales to meet the size of the assigned task(s).

Provision of electrical energy to the FOB must be viewed as a necessary commodity. The FOBs tend to be in remote, relatively inaccessible areas, due to both terrain and location of opposing forces (OPFOR). Resupply missions are tradeoffs between the risk of sending in an armed convoy and the risk, and substantial additional costs, of air resupply.

With basic assumptions of 1 to 3 kW/person at the FOB, generator usage can grow rapidly. An appropriate example is the mobile Command and Control center known as the Unit Operations Center (UOC) – the generator provided is 20 kW and occupies approximately half of a small trailer (the other half is occupied by an 8-ton environmental control unit and tent). The UOC is appropriately sized for use in an FOB and yet provides power only for itself. One innovative option is to have battlefield vehicles provide power for temporary operations – a concept demonstrated by the Reconnaissance, Surveillance, and Targeting Vehicle (RST-V) – which was capable of providing 30 kW of prime power to external systems. While 30 kW may be adequate for temporarily powering the UOC, as was suggested by the prime vendor, it is inadequate for any larger installation.

Roadmap to Space Solar Power using a space-based power grid with up to 50% efficiency

The primary difficulties with the dream of Space Solar Power (SSP) for earth are the extreme launch costs of solar power satellites to Geosynchronous Earth Orbit (GEO), and the absence of an evolutionary path to SSP. This makes the cost-to-first-power unacceptably high. We present a 3-stage approach to SSP, and lay out the problems and opportunities. The key idea is to use space assets initially for global transmission and distribution, rather than generation, establish the infrastructure, and then add space-based power generation to a revenue-generating power grid. In the first stage, a Space Power Grid is established with satellites 1,200 km above Earth, distributing earth-generated beamed microwaves to other satellites and ground receivers. This boosts the earth-based alternative power industry (wind and solar-thermal) by providing transmission between any points on earth, and accommodating fluctuations in demand and supply. It also satisfies strategic needs for emergency power delivery. In Stage 2, direct power conversion technology will augment the existing space power grid with space-generated solar power. In Stage 3, large ultralight GEO reflectors will beam light to the direct-conversion collectors, and multiply the power through the grid. The need to put expensive, heavy solar arrays in high orbit is avoided, along with the objections to atmospheric transmission of visible light. The system would gain significantly from the development of low-mass, high-efficiency conversion equipment for direct conversion of broadband solar energy to beamed microwaves.

End-to-end efficiency of a space based power grid could reach 50%

At present, the end-to-end efficiency of this process alone does not compare favorably with earth-based transmission of energy, in existing markets. With 70% at conversion, and 10% for each atmospheric pass, even with essentially 100% waveguides and in-space transmission, the end-to-end efficiency is limited to roughly 50%, compared to about 90% for transmission over high-voltage lines. However, this masks the value of the approach in opening up worldwide markets, smoothing power fluctuations, avoiding loss of the “excess” power of ‘green’ energy plants, and enabling power plants in remote areas and connecting them to new development in other remote areas. A more detailed examination of the economics and policy aspects of the concept must wait until a later paper, where we expect to show how the inclusion of these large-system aspects, typically neglected in engineering concept development, make all the difference here.
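The quoted loss chain multiplies out like this: 70% conversion and two 10%-loss atmospheric passes (one up, one down) give about 57%, which additional waveguide and transmission losses pull down toward the "roughly 50%" cited above:

```python
# End-to-end efficiency of the space power grid loss chain.
conversion = 0.70          # conversion efficiency
atmospheric_pass = 0.90    # 10% loss per atmospheric pass (up, then down)
end_to_end = conversion * atmospheric_pass * atmospheric_pass
print(round(100 * end_to_end, 1))   # 56.7 -> "roughly 50%" after other losses
```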

This idea seems like it would be compatible with another idea to just place light inflatable mirrors in space to provide light for ground based solar farms at night.

Nvidia Tegra 4 might appear in the first quarter of 2013

VR Zone has possible leaked information about the Nvidia Tegra 4.

If the leaked roadmap is accurate, then there will be four variants of the Tegra 4: three with a quad-core ARM Cortex A15 configuration, with clock speeds from 1.2 to 2.0GHz. The "SP3X" variant appears to add LTE capabilities. The Tegra 4 would be late based on an Nvidia roadmap from a couple of years ago.

Ways China can rebalance from Michael Pettis

Michael Pettis has two major assumptions about China's economy

1) The fundamental imbalance in China is the very low GDP share of consumption. This low GDP share of consumption, I have always argued, reflects a growth model that systematically forces up the savings rate largely by repressing consumption, which it does by effectively transferring wealth from the household sector (in the form, among others, of very low interest rates, an undervalued currency, and relatively slow wage growth) in order to subsidize and generate rapid GDP growth.

2) China must and will rebalance in the coming years – its imbalances, in other words, cannot get much greater and we will soon see a reversal. There are two reasons for saying this, neither of which has to do with the claims being made by Beijing that they are indeed determined to rebalance the economy.

Michael Pettis believes that Beijing will begin rebalancing well before we reach the debt capacity limit. I will discuss later how Beijing can engineer the rebalancing process, but the point here is just that either Beijing forces rebalancing, or rebalancing will be forced upon China in the form of a debt crisis. One way or the other, in other words, debt will force China to rebalance.

The second reason for assuming that China will rebalance is because of external constraints. Globally, savings and investment must balance. This means that for any set of countries whose savings exceed investment, like China, there must be countries whose investment exceeds savings, like the US. To put it another way, the world can function with a group of underconsuming countries only if they are balanced by a group of overconsuming countries.

Spain the focus of New European Crisis

Business Week - Things are unraveling in Europe at a startling pace. The country in the greatest danger is Spain, which could become the fourth member of the euro zone to require a bailout, following Greece, Ireland, and Portugal. Spain’s 709 billion euros of sovereign debt is roughly twice the debt of those three nations combined, according to data compiled by Bloomberg, so a rescue of Spain would be a heavy burden for the rest of Europe.

Investors got overconfident in Spain after the European Central Bank announced last December that it would funnel cheap, three-year loans to European banks, which they could (and did) use to invest in the debt of their own nations. The ECB lent more than 1 trillion euros. Spanish government 10-year yields, which were over 7 percent last November, plummeted to below 5 percent this January and February. But they have raced back upward to just below 6 percent in recent weeks.

Spain’s yields started jumping in early March after Prime Minister Mariano Rajoy announced that the government budget deficit would miss the 4.4 percent of GDP target the previous administration had agreed to with the European Union. The problem is that the easy money from the ECB didn’t do anything to fix Spain’s fundamental problems—overindebtedness and an uncompetitive economy that, because of the common currency, can’t use depreciation as an escape hatch.

NY Times- Spain is entering its second recession in just three years, and the government expects the economy to contract by about 1.7 percent in 2012. With unemployment of around 23 percent, there is fear that austerity measures — deemed critical for winning back market confidence — could have the perverse effect of further depressing growth and creating a vicious cycle in which more budget cuts are needed to balance the public books.

Being Smart about world urbanization will have a big impact on the quality of the future

By 2030 humanity’s urban footprint will occupy an additional 1.5 million square kilometres - comparable to the combined territories of France, Germany and Spain.

UN estimates show human population growing from 7 billion today to 9 billion by 2050, translating into some 1 million more people expected on average each week for the next 38 years, with most of that increase anticipated in urban centres. And ongoing migration from rural to urban living could see world cities receive yet another 1 billion additional people. Total forecast urban population in 2050: 6.3 billion (up from 3.5 billion today).

The world urban population should reach 5 billion by 2030.

During 2005-2030, the world's urban population will grow at an average annual rate of 1.8 per cent, nearly double the rate expected for the total population of the world (1 per cent per year). At that rate of growth, the world's urban population will double in 38 years.
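The doubling-time claim follows from the growth rate by standard compound-growth arithmetic (my check, not a figure from the UN report):

```python
import math

# Doubling time of the urban population at 1.8% average annual growth.
rate = 0.018
doubling_years = math.log(2) / math.log(1 + rate)
print(round(doubling_years))   # ~39 years, consistent with "double in 38 years"
```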

Inexpensive graphene separation method developed

Nano Letters - Direct Measurement of Adhesion Energy of Monolayer Graphene As-Grown on Copper and Its Application to Renewable Transfer Process

Direct measurement of the adhesion energy of monolayer graphene as-grown on metal substrates is important to better understand its bonding mechanism and control the mechanical release of the graphene from the substrates, but it has not been reported yet. We report the adhesion energy of large-area monolayer graphene synthesized on copper measured by double cantilever beam fracture mechanics testing. The adhesion energy of 0.72 ± 0.07 J m⁻² was found. Knowing the directly measured value, we further demonstrate the etching-free renewable transfer process of monolayer graphene that utilizes the repetition of the mechanical delamination followed by the regrowth of monolayer graphene on a copper substrate.
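As a back-of-envelope sketch of how a double cantilever beam (DCB) test converts a load measurement into an adhesion energy: simple beam theory gives the energy release rate G = 12 P² a² / (E b² h³). The numbers below are illustrative assumptions (chosen so G lands near the paper's 0.72 J/m²), not the paper's actual data.

```python
def dcb_energy_release_rate(P, a, E, b, h):
    """Energy release rate (J/m^2) of a symmetric double cantilever beam,
    from simple beam theory: G = 12 * P^2 * a^2 / (E * b^2 * h^3)."""
    return 12 * P**2 * a**2 / (E * b**2 * h**3)

# Illustrative (made-up) specimen parameters, not the paper's data
P = 1.4      # applied load at delamination, N
a = 0.02     # crack length, m
E = 130e9    # Young's modulus of the beam material (silicon assumed), Pa
b = 0.01     # specimen width, m
h = 0.001    # thickness of each beam arm, m

G = dcb_energy_release_rate(P, a, E, b, h)
print(f"G = {G:.2f} J/m^2")  # ~0.72 J/m^2
```

At the critical load where the graphene delaminates, G equals the adhesion energy, which is how the test yields the 0.72 J/m² figure.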

April 09, 2012

New technique for solid-state quantum information processing

Ames Lab - Scientists have overcome a major hurdle facing quantum computing: how to protect quantum information from degradation by the environment while simultaneously performing computation in a solid-state quantum system.

A group led by U.S. Department of Energy’s Ames Laboratory physicist Viatcheslav Dobrovitski and including scientists at Delft University of Technology; the University of California, Santa Barbara; and University of Southern California, made this big step forward on the path to using the motions of single electrons and nuclei for quantum information processing. The discovery opens the door to robust quantum computation with solid-state devices and using quantum technologies for magnetic measurements with single-atom precision at nanoscale.

Quantum information processing relies on the combined motion of microscopic elements, such as electrons, nuclei, photons, ions, or tiny oscillating joists. In classical information processing, information is stored and processed in bits, and the data included in each bit is limited to two values (0 or 1), which can be thought of as a light switch being either up or down. But, in a quantum bit, called a qubit, data can be represented by how these qubits orient and move in relationship with each other, introducing the possibility for data expression in many tilts and movements.
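For readers new to the qubit idea above, here is a minimal sketch of superposition using generic single-qubit math (not the paper's hybrid electron-nuclear spin system): a qubit's state is a normalized 2-component complex vector, and gates are unitary matrices acting on it.

```python
import numpy as np

# A qubit state is a normalized complex vector: a|0> + b|1>
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of 0 and 1
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are |amplitude|^2 for each outcome
probs = np.abs(psi) ** 2
print(probs)  # each outcome has probability 0.5
```

Decoherence corrupts the relative phase between the two amplitudes, which is why protecting such states while still applying gates, as the Ames-led team did, is the hard part.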

The U.S. Department of Energy’s Ames Laboratory led a team that demonstrated a protected quantum gate in a solid-state system. The physicists (c) isolated the system from forces in the environment while maintaining coherence between the nucleus and electron in the system. This was a major step forward in quantum information processing beyond simply (a) establishing coherence between parts of a quantum system (but without isolation from the environment) or (b) decoupling all parts of the system from the environment (and each other).

Nature - Decoherence-protected quantum gates for a hybrid solid-state spin register

Ford Fiesta Econetic gets 86.5 mpg UK

Facebook strengths and weaknesses

1. Facebook has a strong revenue and traffic dependence on gaming companies like Zynga.

In its S-1 filing, Facebook revealed that about 12% of its revenue comes from Zynga games purchases, about $445 million in 2011 alone. How much of that goes away if gamers are playing on Zynga.com instead of Facebook?

If a Facebook member playing on Zynga.com buys virtual goods, does Facebook still get 30% of the cut as it does now for purchases made through its social networking platform?
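The S-1 numbers above let you back out Facebook's total 2011 revenue:

```python
zynga_revenue_share = 0.12   # ~12% of Facebook revenue came from Zynga, per the S-1
zynga_contribution = 445e6   # $445 million in 2011

implied_total = zynga_contribution / zynga_revenue_share
print(f"Implied 2011 Facebook revenue: ${implied_total / 1e9:.2f} billion")  # ~$3.71 billion
```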

Techcrunch - Zynga now has 56.6 million daily active users and 246.9 million monthly active users

Appdata has a list of the top Facebook apps

Facebook delivers 93 percent of Zynga’s revenues. Zynga and Facebook are intertwined until at least 2015, when a five-year deal between the two expires. Zynga has taken steps to mitigate the risk of relying completely on Facebook by expanding into mobile and international markets.

2. Forbes - Faced with its first serious competitor, Facebook has dropped a billion dollars to purchase Instagram, a photo-sharing service. Why? To drop a roadblock in front of Pinterest, the Facebook-on-training-wheels that has recently been all the rage. A few changes to Instagram and it can compete very effectively with Pinterest, with the added ability to interface with Facebook even more directly than it already does.

Indonesia could be the 5th largest economy by 2030 on a nominal basis

Goldman Sachs' Jim O'Neill coined the term BRIC (Brazil, Russia, India, and China).

What countries does O'Neill think are primed for growth? MIST: Mexico, Indonesia, South Korea, and Turkey. All of these countries rank between 10th and 20th in world GDP and posted growth rates above 5% in 2010.

EconomyWatch is forecasting about 7% GDP growth for Indonesia through 2016, with 6.3% GDP growth expected in 2012.

Indonesia had 6.5% GDP growth in 2011.

The total size of Indonesia’s middle class may rise to 171 million in 2020, or 63 percent of the population. In 2030, it may reach 244 million, or 78 percent of people living in the country.

Indonesia's population is projected to be 312 million in 2030. Indonesia's per capita GDP is projected to be just over $30,000, up from just over $5,000 today. The US is projected to have almost $100,000 per capita GDP, and China $50,000.
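A quick check of what those per capita figures imply, assuming the projection runs from 2012 to 2030. The implied ~10.5% nominal annual growth exceeds the ~7% real growth forecast because the nominal figure also carries inflation and currency appreciation:

```python
gdp_now = 5_000    # current per capita GDP, USD (nominal)
gdp_2030 = 30_000  # projected per capita GDP, USD (nominal)
years = 2030 - 2012

# Compound annual growth rate implied by the projection
implied_rate = (gdp_2030 / gdp_now) ** (1 / years) - 1
print(f"Implied nominal growth: {implied_rate:.1%} per year")  # ~10.5%
```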

A Stem Cell–Based Approach to Cartilage Repair

A small molecule dubbed kartogenin encourages stem cells to take on the characteristics of cells that make cartilage, a new study shows. And treatment with kartogenin allowed many mice with arthritis-like cartilage damage in a knee to regain the ability to use the joint without pain.

The new approach taps into mesenchymal stem cells, which naturally reside in cartilage and give rise to cells that make connective tissue. These include chondrocytes, the only cells in the body that manufacture cartilage. Kartogenin steers the stem cells to wake up and take on cartilage-making duties. This is an essential step in the cartilage repair that falls behind in people with osteoarthritis, the most common kind of arthritis, which develops from injury or long-term joint use.

Science - A Stem Cell–Based Approach to Cartilage Repair

Osteoarthritis (OA) is a degenerative joint disease that involves destruction of articular cartilage and eventually leads to disability. Molecules that promote the selective differentiation of multipotent mesenchymal stem cells (MSCs) into chondrocytes may stimulate the repair of damaged cartilage. Using an image-based, high-throughput screen, we identified the small molecule kartogenin, which promotes chondrocyte differentiation (EC50 = 100 nM), shows chondroprotective effects in vitro, and is efficacious in two OA animal models. Kartogenin binds filamin A, disrupts its interaction with the transcription factor CBFβ, and induces chondrogenesis by regulating the CBFβ-RUNX1 transcriptional program. This work provides new insights into the control of chondrogenesis that may ultimately lead to a stem cell–based therapy for osteoarthritis.
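To unpack the EC50 figure: it means half-maximal chondrocyte differentiation occurs at 100 nM kartogenin. As a hedged sketch, assuming a standard Hill dose-response with coefficient 1 (the abstract does not state the Hill slope), the fractional response at a given concentration would look like:

```python
def hill_response(c_nM, ec50_nM=100.0, hill=1.0):
    """Fractional response for a standard Hill dose-response curve."""
    return c_nM**hill / (c_nM**hill + ec50_nM**hill)

# Response rises from ~9% at 10 nM to 50% at the EC50 to ~91% at 1000 nM
for c in (10, 100, 1000):
    print(f"{c:>5} nM -> {hill_response(c):.0%} of maximal response")
```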

Nano-sized ‘factories’ churn out proteins

MIT researchers have developed a new type of nanoparticle that can synthesize proteins on demand. Once these “protein-factory” particles reach their targets, the researchers can turn on protein synthesis by shining ultraviolet light on them.

The particles could be used to deliver small proteins that kill cancer cells, and eventually larger proteins such as antibodies that trigger the immune system to destroy tumors.

MIT researchers designed these particles that can produce proteins when ultraviolet light is shone on them. In this case, the protein is green fluorescent protein.
Image: Avi Schroeder

Nano Letters - Remotely Activated Protein-Producing Nanoparticles

Carnival of Nuclear Energy 99

NEI Nuclear Notes has the Carnival of Nuclear Energy 99

Rod Adams at ANS Nuclear Cafe looks at phase one of a National Academy of Sciences (NAS) study titled Analysis of Cancer Risks in Populations Near Nuclear Facilities.

Comment from an e-mail list inhabited by people who have studied radiation health effects for decades

If they used 20-year-old cell biology evidence instead of their LNT [linear no-threshold] ideology and epidemiology, they would realize that they are trying to measure a cancer risk (radiation-induced DNA damage rate) that is six million (6,000,000) times lower than the spontaneous risk of cancer (i.e., natural DNA damage rate).

ANS Nuclear Cafe - The provincial government of Tamil Nadu, India’s southern-most state, has dropped its opposition to hot start of twin 1,000-MW VVER reactors at Kudankulam and withdrawn support from local anti-nuclear protests.
The long-running controversy over the start of NPCIL’s Russian-built twin 1,000-MW VVER reactors at Kudankulam *may* be coming to an end.
India plans to add 64 GWe of nuclear power to its grid by 2032 to reduce the gap in rural electrification -- a nuclear reactor market said to be worth $150 billion, although the USA is currently locked out by a supplier liability law. Dan Yurman covers the Kudankulam story and the details of the politics behind it.

April 08, 2012

Michigan Tech Breakthrough Could Speed the Search for Next Generation of Hydrogen Fuel Cells

A research team led by Jeffrey Allen of Michigan Technological University is nearing development of a mathematical model that will slash the R&D time and effort needed to find better hydrogen fuel cells.

Water vapor is the only emission coming out of the tailpipe of a hydrogen fuel cell-powered vehicle, a big reason why fuel cells are so attractive. But moving that water out of the fuel cell can be a soggy problem. Just a teaspoon can kill the reaction that drives hydrogen fuel-cell powered vehicles. And, considering that it can take a stack of dozens of fuel cells to power a car, and a single flooded cell can take down the entire stack, water management becomes a looming issue.

Most of that watery action happens in the fuel cell’s porous transport layer, or PTL, which is not much thicker than a coffee filter. That’s where all the byproducts of the fuel cell’s power-generating reaction meet up with a catalyst and react to form water vapor.

It’s not easy to find out exactly what’s happening in the PTL. “Everything is compressed like crazy,” says Allen, the John F. and Joan M. Calder Associate Professor in Mechanical Engineering. “You have to get the gases—hydrogen and air—to the catalyst, and you have to get the water away. Figuring out how to do this has largely been a matter of trial and error.”

IBN’s ‘Fish and Chips’ May Help Accelerate Drug Discovery

A cheaper, faster and more efficient platform for preclinical drug discovery applications has been invented by scientists at the Institute of Bioengineering and Nanotechnology (IBN), the world’s first bioengineering and nanotechnology research institute. Called ‘Fish and Chips’, the novel multi-channel microfluidic perfusion platform can grow and monitor the development of various tissues and organs inside zebrafish embryos for drug toxicity testing.

Current drug studies on zebrafish embryos are performed on traditional microtiter plates, which do not allow perfusion or the replenishment of growth media and drugs, and cannot facilitate live imaging since the embryos are not fixed in one position due to the size of the well. The conventional way of visualizing tissues and organs in embryos is a laborious process, which includes first mounting the embryos in a viscous medium such as gel, and then manually orienting the embryos using fine needles. The embryos also need to be anesthetized to restrict their motion and a drop of saline needs to be continuously applied to prevent the embryos from drying. These additional precautions could further complicate the drug testing results.

The IBN ‘Fish and Chips’ has been designed for dynamic long-term culturing and live imaging of the zebrafish embryos. The microfluidic platform comprises three parts: 1) a row of eight fish tanks, in which the embryos are placed and covered with an oxygen permeable membrane, 2) a fluidic concentration gradient generator to dispense the growth medium and drugs, and 3) eight output channels for the removal of the waste products.

Image 2: Schematic diagram on the layout of the ‘Fish and Chips’, which consists of three parts: inlet layer drawing into the gradient generator, fish tanks and waste channels ending in outlets.

The novelty of the ‘Fish and Chips’ lies in its unique diagonal flow architecture, which allows the embryos to be continually submerged in a uniform and consistent flow of growth medium and drugs, and in its attached gradient generator, which can dispense different concentrations of drugs to eight different embryos at the same time for dose-dependent drug studies.

Image 3: Side view of the ‘Fish and Chips’, which illustrates how the flow travels diagonally across the fish tank and envelops the embryo in the drug solution.
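As an illustration of the gradient generator's role, here is how a stock drug concentration might map to eight doses for the eight fish tanks. Linear spacing is assumed here for simplicity; the actual dilution profile of the IBN device may differ.

```python
def linear_gradient(c_max, n_channels=8, c_min=0.0):
    """Linearly spaced drug concentrations, one per fish tank, as a
    tree-type microfluidic gradient generator might deliver them."""
    step = (c_max - c_min) / (n_channels - 1)
    return [c_min + i * step for i in range(n_channels)]

doses = linear_gradient(c_max=100.0)  # e.g. a 100 uM stock drug solution
print([round(d, 1) for d in doses])   # [0.0, 14.3, 28.6, 42.9, 57.1, 71.4, 85.7, 100.0]
```

Each embryo then receives one concentration continuously, which is what makes the dose-dependent studies possible in a single run.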
