May 25, 2013

Metamaterial UV lens could lead to improved lithography and nanoscale manipulation

Scientists working at the National Institute of Standards and Technology (NIST) have demonstrated a new type of lens that bends and focuses ultraviolet (UV) light in such an unusual way that it can create ghostly, 3D images of objects that float in free space. The easy-to-build lens could lead to improved photolithography, nanoscale manipulation and manufacturing, and even high-resolution three-dimensional imaging, as well as a number of as-yet-unimagined applications in a diverse range of fields.

"Conventional lenses only capture two dimensions of a three-dimensional object," says one of the paper's co-authors, NIST's Ting Xu. "Our flat lens is able to project three-dimensional images of three-dimensional objects that correspond one-to-one with the imaged object."

An article published in the journal Nature explains that the new lens is formed from a flat slab of metamaterial with special characteristics that cause light to flow backward—a counterintuitive situation in which waves and energy travel in opposite directions, creating a negative refractive index.

Image caption: Light passing through the flat metamaterial slab is projected as a three-dimensional image in free space on the other side. Here a ring-shaped opening in an opaque sheet on the left of the slab is replicated in light on the right. Bottom left: scanning electron micrograph of a ring-shaped opening in a chromium sheet located on the surface of a flat slab of metamaterial. Bottom right: optical micrograph of the image projected beyond the slab under UV illumination, demonstrating that the metamaterial slab acts as a flat lens.
Credit: Lezec/NIST

Nature - All-angle negative refraction and active flat lensing of ultraviolet light

A hydrocarbon-sorting material could replace energy-intensive oil refining steps and lower the price of gasoline

A new metal-organic framework material that sorts hydrocarbon molecules by shape could lower the cost of gasoline and also make the fuel safer by reducing the need for certain additives that have been linked to cancer, according to a paper in the next issue of the journal Science.

Refiners typically use a material that can sort molecules by size during a key step in the refining process. To achieve a desired octane rating, this step has to be supplemented with energy-intensive distillation steps, or by the use of additives. The new material, which sorts molecules by shape rather than by size, can better differentiate between different types of hydrocarbon molecules, eliminating the distillation steps and the need for octane-enhancing additives.

To make the material for sorting hydrocarbons, the researchers made a material riddled with microscopic tunnels featuring triangular cross-sections. These tunnels can sort five different types of hexane molecules—hydrocarbons with six carbon atoms—that are key to achieving the desired octane rating of gasoline. The octane rating for hexanes depends on how the carbon atoms are arranged. Line them up in a row, forming a linear molecule, and the octane number is very low—about 30. But link them together to form a branching structure and the octane level can be as high as 105.
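The octane arithmetic above can be sketched in a few lines. This is a minimal illustration, assuming the article's figures for the linear and most-branched isomers (roughly 30 and 105) and approximate literature values for the intermediate ones; the linear-by-volume blend is a first approximation, not a refinery model:

```python
# Research octane numbers (RON) for the five hexane isomers.
# Linear (~30) and most-branched (~105) follow the article; the
# intermediate values are approximate literature figures used only
# for illustration.
HEXANE_RON = {
    "n-hexane": 30,             # linear chain: lowest octane
    "2-methylpentane": 73,
    "3-methylpentane": 75,
    "2,3-dimethylbutane": 101,
    "2,2-dimethylbutane": 105,  # most branched: highest octane
}

def blend_ron(fractions):
    """Linear-by-volume RON estimate for a mix of hexane isomers.

    `fractions` maps isomer name -> volume fraction (must sum to 1).
    """
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return sum(HEXANE_RON[name] * f for name, f in fractions.items())

# Separating out the linear isomer raises the blend's octane rating:
before = blend_ron({"n-hexane": 0.5, "2,2-dimethylbutane": 0.5})  # 67.5
after = blend_ron({"2,2-dimethylbutane": 1.0})                    # 105
```

This is why a material that rejects the linear molecule but passes the branched ones can substitute for distillation in octane tuning.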

Journal Science - Separation of Hexane Isomers in a Metal-Organic Framework with Triangular Channels

Intel outlines the transition to 3D NAND to carry us beyond the 10 nanometer limit

IM Flash Technologies LLC, the joint venture between Intel and Micron Technology, is considering how and when to take its NAND flash memory ICs into the third dimension, but reckons its development of a 20-nm memory cell has bought it a generation or two of 2-D scaling.

An industry-wide transition for the nonvolatile NAND flash memory technology from memory cells in a 2-D array to strings of NAND transistors integrated monolithically in the vertical direction is now anticipated. These 3-D memories are expected to be arranged as a 2-D array of vertical semiconductor channels with many levels of gate-all-around (GAA) structures forming the multiple voltage level memory cell transistors.

Toshiba is leading the charge towards 3-D NAND processes with its p-BiCS (pipe-shaped Bit Cost Scalable) technology, which it has presented at numerous technical conferences over several years. Towards the end of last year Toshiba announced that it had 16-layer devices based on a 50-nm diameter vertical channel. Samples are due this year and volume production in 2015. Toshiba's p-BiCS technology arranges the transistor string in a U-shape.

But Intel's Esfarjani, while acknowledging there is a scaling limit for 2-D NAND flash, indicated in one of his slides that 2-D NAND flash can scale for two more nodes, at about 15 nm and 10 nm. The slide showed that the first 3-D NAND generation is likely to be brought up alongside that 15-nm 2-D node. Esfarjani added that 16-layer NAND flash ICs will not be enough to provide an economic benefit. "You need 64 or at least 32 layers," he said.

End-to-end design of interstellar radio communication that is a thousand to tens of thousands of times more power efficient

This 237-page report has addressed the end-to-end design of an interstellar communication system taking account of all known impairments from the interstellar medium and induced by the motion of source and observer. Both discovery of the signal and post-discovery communication have been addressed. It has been assumed that the transmitter and receiver designers wish to jointly minimize the cost, and in that context we have considered tradeoffs between resources devoted to the transmitter vs. the receiver. The resources considered in minimizing cost are principally energy consumption for transmitting the radio signal and energy consumption for signal processing in a discovery search.

Note - Project Icarus had worked out an even more energy-efficient communication system if the sender and receiver are at the gravitational lensing points of two stars. One tenth of a milliwatt is enough for perfect communication between the Sun and Alpha Centauri through two 12-meter FOCAL spacecraft antennas. A similar bridge could link the Sun and a Sun-like star inside M31, using the gravitational lenses of both. We're working here with a distance of 2.5 million light years, but a transmitted power of about 10^7 watts would do the trick.

This study is applicable to communication with interstellar spacecraft (sometimes called "starships") in the future, although it will probably be quite some time before these spacecraft wander far enough to invoke many of the impairments considered here. In the present and nearer term, this study is relevant to communication with other civilizations. In that case, we of course will be designing either a transmitter or a receiver, which in either case has to be interoperable with a similar design by the other civilization. In either case, it is very helpful to consider the end-to-end design and resource tradeoffs, as we have attempted here. In our view, a major shortcoming of existing SETI observation programs (which require the design of a receiver for discovery) is the lack of sufficient attention to the issues faced by the transmitter in overcoming the large distances and the impairments introduced in interstellar space.

One profound conclusion of this study is that if we assume that the transmitter seeks to minimize its energy consumption, which is related to average transmit power, then the communication design becomes relatively simple. The fundamental limit on power efficiency in interstellar communication has been determined, that limit applying not only to us but to any civilization no matter how advanced. Drawing upon the early literature in power-efficient communication design, five simple but compelling design principles have been identified. Following these principles has been shown to permit designs that approach the fundamental limit. Although the same fundamental limit does not apply to the discovery process, we have defined an alternative resource-constrained design approach that minimizes processing resources for a given power level, or power level for a given processing resource. Again, application of a subset of the five principles leads to a design that can achieve dramatic reductions in the transmitter's energy consumption relative to the type of Cyclops beacons that have been a common target of SETI observation programs, at the expense of a more modest increase in the receiver's energy consumption through an increased observation time in each location.

The narrow-bandwidth interstellar radio signals assumed in many current SETI searches carry a power-efficiency penalty of four to five orders of magnitude. A set of five power-efficient design principles can asymptotically approach the fundamental limit, and in practice increase the power efficiency by three to four orders of magnitude (a thousand to tens of thousands of times more power-efficient interstellar communication). The most fundamental principle is to trade higher bandwidth for lower average power. In addition to improving the power efficiency, average power can be reduced by lowering the information rate.
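The bandwidth-for-power tradeoff described above follows directly from the Shannon limit. A minimal sketch (the two spectral-efficiency values below are illustrative choices, not figures from the report):

```python
import math

# Shannon's limit: sending R bits/s in bandwidth B requires a received
# energy per bit of at least
#   Eb/N0 >= (2**eta - 1) / eta,   where eta = R / B  (bits/s/Hz).
# As eta -> 0 (bandwidth much larger than bit rate) this approaches
# ln 2 ~= 0.693 (-1.59 dB), the fundamental limit the report refers to.

def min_eb_n0(eta):
    """Minimum Eb/N0 (linear units) at spectral efficiency eta."""
    return (2.0 ** eta - 1.0) / eta

limit = math.log(2.0)  # unconstrained-bandwidth limit

# A very wideband signal (eta = 0.01) sits essentially at the limit,
# while a narrowband signal (eta = 17, chosen purely for illustration)
# pays roughly a four-orders-of-magnitude energy penalty.
wideband_penalty = min_eb_n0(0.01) / limit     # ~1.003x
narrowband_penalty = min_eb_n0(17.0) / limit   # ~1.1e4x
```

The ratio of the two penalties is exactly the "thousand to tens of thousands of times" efficiency gap between narrowband beacons and power-efficient wideband designs.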

SeeMeCNC - $999 Improved 3D Printer and $7499 laser cutter

The Rostock MAX's Cheapskate design uses t-slot aluminum extrusion as not only the structural member, but also the linear bearing surface. To keep the bearings from wearing the aluminum's surface, we CNC-machine acetal bearing covers. The extruder is our own proven workhorse, the "Steve's Extruder", which uses push-to-connect fittings for a Bowden extruder setup fed down to our reliable hot-end design, which comes set up for 1.75mm filament. The hot-end platform as well as the Delta arms are all made from incredibly strong, lightweight injection-molded reinforced plastic. The same material is also used in the extruder body to keep everything super strong and light. The wood kit's laser-cut framework uses a material we introduced to 3D printers: melamine laminate. It's easier to work with, and more durable, than other popular laser-cut plywoods. The laser-cut parts speed up assembly of the printer and help key the major parts together while you install the hardware. The extruder features an integrated spool holder designed for 1 lb or 2.2 lb (1 kg) spools.

Standard features below:

* Over 1300 in³ of build volume (11" diameter by over 14 3/4" height)
* Your model is stationary throughout the entire build process, making it less prone to print failures
* High quality Laser cut, Injection molded and CNC machined parts all made by us right here in the USA
* Comes set up for 1.75mm filament
* Uses high torque NEMA 17 stepper motors
* Comes standard with a .5mm nozzle for easy printing, but can use our other sizes as well
* 450w PSU delivers more than enough power for the heated bed, hot end and accessories you could add
* RAMBo by UltiMachine electronics
* Positioning accuracy of .02mm (20 microns) *
* Speed in excess of 300mm/s in all motion, not just X/Y moves ***
* Tinker-friendly electronics and hardware. Many extra places to add your own mods to both
* Filament spool holder incorporated into the machine (for 1lb or 1kg spools only)
* Easily upgradeable to dual-extrusion in the future. RAMBo supports dual hotends and extruders stock
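As a sanity check on the spec list above, modelling the delta printer's build space as a cylinder 11 inches across and 14 3/4 inches tall gives roughly 1400 cubic inches:

```python
import math

# Build volume of a cylindrical delta-printer build space, using the
# dimensions quoted in the spec list (11" diameter, 14.75" height).
radius_in = 11 / 2
height_in = 14.75
build_volume_in3 = math.pi * radius_in**2 * height_in  # ~1402 cubic inches
```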

High Resolution Desktop 3D Printer for $3299

At $3,299, the Form 1 could expand the market for 3-D printing technology. It can produce much higher-fidelity plastic objects than the consumer desktop printers available today. But it is still cheap enough to be affordable to a wide swath of professional designers, engineers, and dedicated tinkerers. The Form 1 can, for example, create detailed functioning prototypes with mechanical parts, such as precise screw threads.

Formlabs' new device can print layers as thin as 25 microns and can produce objects at half the scale of typical consumer desktop printers, which function more like automated hot-glue guns. Such machines often don't produce the level of detail necessary for the professional prototyping functions that 3-D printing is often touted to fulfill.

Formlabs sells the clear resin for $149 per liter.

May 24, 2013

25-kilogram or lighter detectors can use millisecond pulsars for a universal positioning system, accurate today to ±5 km in the solar system and beyond, and soon to meters

Millisecond pulsars can be used for a universal positioning system accurate today to ±5 km in the solar system and beyond, and soon to meters (Arxiv, 22 pages). By comparing pulse arrival times measured on board a spacecraft with predicted pulse arrivals at a reference location, the spacecraft position can be determined autonomously and with high accuracy everywhere in the solar system and beyond. The unique properties of pulsars make clear already today that such a navigation system will have its application in future astronautics. In this paper we describe the basic principle of spacecraft navigation using pulsars and report on the current development status of this novel technology.

Autonomous spacecraft navigation with pulsars is feasible using either phased-array radio antennas of at least 150 square meters antenna area, or compact lightweight X-ray telescopes and detectors, which are currently being developed for the next generation of X-ray observatories. Using the X-ray signals from millisecond pulsars we estimated that navigation would be possible with an accuracy of ±5 km in the solar system and beyond. The error is dominated by the inaccuracy of the pulse profile templates that were used for the pulse peak fittings and pulse-TOA measurements. As pulse profile templates are known with much higher accuracy in the radio band, it is possible to increase the accuracy of pulsar navigation down to the meter scale by using radio signals from pulsars for navigation.

Pulsar-based navigation systems can operate autonomously. This is one of their most important advantages, and is interesting also for current space technologies; e.g., as augmentation of existing GPS/Galileo satellites. Future applications of this autonomous navigation technique might be on planetary exploration missions and on manned missions to Mars or beyond.

Currently positioning relies on Earth-based tracking stations to work out a spacecraft’s distance using radio waves, a process that is accurate to within a meter or so. That’s fine for the radial distance, but tracking a spacecraft’s angular position is much harder because of the limited angular resolution of radio antennas. The current technology produces an uncertainty of about four kilometers per astronomical unit of distance between Earth and the spacecraft. So for a spacecraft at the distance of Pluto, that’s an uncertainty of 200 kilometers and at the distance of Voyager 1, the uncertainty is 500 kilometers.
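The pulse-timing principle described above reduces to a small linear system: the delay of each pulse relative to its predicted arrival at a reference point gives the projection of the spacecraft's position onto that pulsar's direction, so three pulsars with non-coplanar directions pin down the position. A sketch with made-up directions and position (not values from the paper):

```python
import numpy as np

C = 299_792.458  # speed of light, km/s

# A pulse from a pulsar in unit direction n arrives at the spacecraft
# offset from its predicted arrival at the reference point (e.g. the
# solar system barycentre) by dt = (n . x) / c, where x is the
# spacecraft position relative to that reference. Three such delays
# give a 3x3 linear system for x.

def solve_position(directions, delays):
    """Recover position x (km) from pulsar unit vectors and delays (s)."""
    N = np.array(directions, dtype=float)  # rows: pulsar directions
    dt = np.array(delays, dtype=float)
    return np.linalg.solve(N, C * dt)

# Simulated spacecraft roughly 1 AU from the reference (illustrative):
x_true = np.array([1.496e8, 2.0e7, -5.0e6])  # km
dirs = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]     # idealised orthogonal pulsars
dt = [np.dot(n, x_true) / C for n in dirs]
x_est = solve_position(dirs, dt)             # recovers x_true
```

In practice the timing error of each pulsar (set by the pulse-template accuracy) maps directly into position error along that pulsar's direction, which is why radio-band templates promise meter-scale accuracy.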

Nuclear Fusion Summary - Prospects for breakthrough commercial reactors 2018-2025

Nuclear fusion is one of the main topics at Nextbigfuture. I have summarized the state of nuclear fusion research before. A notable summary was made three years ago in mid-2010. I believed at the time that there could be multiple successful nuclear fusion projects vying for commercial markets by 2018. Progress appears to be going a bit more slowly than previously hoped, but there are several possible projects (General Fusion, John Slough's small space propulsion nuclear fusion system, Lawrenceville Plasma Physics - if they work out metal contamination and other issues and scale up power) that could demonstrate net energy gain in the next couple of years.

Commercialization Date targets

General Fusion 2020 (targeting 4 cents per kWh)

Helion Energy 2022 (about 5 cents per kWh and able to burn nuclear fission waste)

Lockheed Compact Fusion 2023

Tri-Alpha Energy (previously talked about 2015-2020, but now likely 2020-2025)

Lawrenceville Plasma Physics - commercial 4 years after net energy gain is proved. Say two years to prove net energy gain; then 2019-2021 for a commercial reactor (2021 if we allow for 2 years of slippage). Could lower energy costs by ten times.

EMC2 Fusion (?? No information for the last few years. US Navy is funding the work at a few million dollars per year)

Muon Fusion - Research in Japan and at Star Scientific in Australia

There will be more than one economic and technological winner. Once we figure out nuclear fusion there will be multiple nuclear fusion reactors. It will be like engines - steam engines, gasoline engines, diesel engines, jet engines. There will be multiple makers of multiple types of nuclear fusion reactors. There will be many applications: energy production, space propulsion, space launch, transmutation, weapons and more. We will be achieving greater capabilities with magnets (100+ tesla superconducting magnets), lasers (high repetition rate and high power), and materials. We will also have more knowledge of the physics. What had been a long hard slog will become easy, and there will be a lot more money for research around a massive industry.

The cleaner-burning aspect of most nuclear fusion approaches versus nuclear fission is not that interesting to me. It is good, but the nuclear fission waste cycle could be completely closed with deep-burn nuclear fission reactors that use all of the uranium and plutonium. In China these are straight-up engineering questions. So expect a transition to moderately deeper-burn pebble bed reactors from 2020-2035 (starting in 2015 but not a major part until 2020) and then a shift to breeders from 2030-2050+.

What matters are developments which could radically alter the economy of the world and the future of humanity. The leading smaller nuclear fusion projects hold out the potential of radically lowering the cost of energy and increasing the amount of energy. Nuclear fusion can enable an expansion of the energy used by civilization by over a billion times, from 20 terawatts to 20 zettawatts. Nuclear fusion also enables space propulsion at significant fractions of the speed of light (1 to 20% of lightspeed), Earth-to-orbit launch with nuclear fusion spaceplanes or reusable rockets, and trivial access to anywhere in the solar system.

10 tesla superconductors could enable Tokamak fusion to actually be affordable and workable in a reasonable time

A new generation of 10 tesla superconductors could make Tokamak-style nuclear fusion reactors work. It could make them affordable, smaller and maintainable, and remove the plasma problems. The development time could be cut from 50 years to 10 to 20 years. A new design would also switch to FLiBe molten salt for lower costs.

The overnight cost of a fission power plant is ~$4/W (~$2/W in China).
• First-of-a-kind fusion plants will cost at least $10-20/W
• This implies that developing fusion reactors at ~GWe scale requires $10-20 billion "per try", e.g. ITER
• The chance of successful fusion development is significantly improved if net thermal/electrical power can be produced at ~5-10x smaller scale, i.e. ~500 MW thermal
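The "per try" arithmetic behind those bullets is just dollars-per-watt times plant scale:

```python
# Cost-per-try estimate from the slide's $/W figures and plant scales.
def cost_per_try(dollars_per_watt, electric_watts):
    return dollars_per_watt * electric_watts

GWE = 1e9  # one gigawatt electric, in watts

low = cost_per_try(10, 1 * GWE)     # $10 billion per GWe-scale try
high = cost_per_try(20, 1 * GWE)    # $20 billion per GWe-scale try

# A plant ~10x smaller shrinks the cost of each try proportionally:
small = cost_per_try(10, GWE / 10)  # $1 billion
```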

It would be good to have a more reasonable Tokamak fusion option.

I still like John Slough's fusion reactor designs more. He could have net gain this year or next year. It looks even cheaper and faster to develop.
The Lawrenceville Plasma Physics dense plasma focus fusion also seems to have a chance to lower energy costs by ten times, while Tokamaks are trying not to be two to ten times more expensive.
General Fusion also seems more promising.

Also, any nuclear fusion option needs to be better than improved nuclear fission.
Molten salt nuclear fission looks set to greatly reduce the waste and improve costs.

Canadian David LeBlanc is developing the Integral Molten Salt Reactor, or IMSR, through his company Terrestrial Energy. The goal is to commercialize the reactor by 2021. It should have initial costs of $3.5/W and could reach costs of $1/W.

May 23, 2013

DARPA's lightweight soft exoskeleton the Warrior Web is revealed

DARPA has released a video of a soldier carrying a 61-pound load while walking in a prototype DARPA Warrior Web system during an independent evaluation by the U.S. Army.

It is less like the hard exoskeleton of Iron Man and more like the supersuits of Pixar's The Incredibles.

Warrior Web seeks to create a soft, lightweight under-suit that would help reduce injuries and fatigue common for Soldiers, who often carry 100-pound loads for extended periods over rough terrain. DARPA envisions Warrior Web augmenting the work of Soldiers’ own muscles to significantly boost endurance, carrying capacity and overall warfighter effectiveness–all while using no more than 100 Watts of power.

The U.S. Army Research Laboratory Human Research and Engineering Directorate (ARL HRED) is nearing completion of a five-month series of tests to evaluate multiple Warrior Web prototype devices. The testing evaluates how each prototype incorporates different technologies and approaches to reduce forces on the body, decrease fatigue, stabilize joints and help Soldiers to maintain a natural gait under a heavy load. The testing uses a multi-camera motion-capture system to determine any changes in gait or balance, a cardio-pulmonary exercise testing device to measure oxygen consumption and a variety of sensors to collect force, acceleration and muscle activity data.

The suit will actively assist muscle movement using tiny actuators in certain joints.

Saving lives and weaponization with 3D printing and other technology

3D bioprinters were used to make a life-saving splint to help a baby breathe.

The image-based design and 3D biomaterial printing process can be adapted to build and reconstruct a number of tissue structures. Green and Hollister have already utilized the process to build and test patient specific ear and nose structures in pre-clinical models. In addition, the method has been used by Hollister with collaborators to rebuild bone structures (spine, craniofacial and long bone) in pre-clinical models.

3D printing is predicted to enable a new industrial revolution. The disruption in areas where 3D printing already works well – including furniture, cutlery, machine tools, car components, toys, garden equipment and so on – will be intense. Some retailers will be disintermediated and go bust, just as music stores have been destroyed by Apple’s iTunes.

Airbus has long term projects to print whole wings and other large components.

3D printers have been used to print working guns and DARPA weapons and vehicles using additive manufacturing.

Doctors use 3D bioprinter to create a splint for baby's blocked throat

The Youngstown, Ohio, baby turned blue again and again as his little airways collapsed and kept air from reaching his lungs. But doctors used a 3-D bioprinter to custom-make a splint that is holding his airway open and helping him breathe.

Now 19-month-old Kaiba Gionfriddo is “into everything”, says his mother, April Gionfriddo.

"Quite a few doctors said he had a good chance of not leaving the hospital alive," she adds.

Kaiba was born with a rare condition called tracheobronchomalacia. This deformity affects about one in 2,200 babies and causes the airways to be weak and prone to collapse. In tiny babies, it can look like asthma and it can take a while to diagnose.

Doctors at the University of Michigan bioprinted this splint, custom designed for Kaiba Gionfriddo's trachea. It fits around the outside and supports the windpipe.

Green and Hollister were able to make the custom-designed, custom-fabricated device using high-resolution imaging and computer-aided design. The device was created directly from a CT scan of Kaiba's trachea/bronchus, integrating an image-based computer model with laser-based 3D printing to produce the splint.

General Fusion on track for Demonstration of Net Gain Equivalent Plasma Compression this year

A 19-page presentation from 2012 updates the progress of General Fusion.

General Fusion is trying to make affordable fusion power a reality.
• Founded in 2002, based in Vancouver, Canada
• Plan to demonstrate proof of physics DD equivalent “net gain” in 2013
• Plan to demonstrate the first fusion system capable of “net gain” 3 years after proof
• Validated by leading experts in fusion and industrial engineering
• Industrial and institutional partners
• $42.5M in venture capital, $6.3M in government support

General Fusion intends to build a three-meter-diameter steel sphere filled with spinning molten lead and lithium. Super-heated plasma would be injected into the vortex, and then the outside of the sphere would be hit with 200 computer-synchronized pistons travelling 100 meters per second (about 220 mph). The resulting shock waves would compress the plasma and spark a fusion reaction for a few microseconds.

Carnival of Space 302

May 22, 2013

Tesla Repays $451 million Government loan ten years early and will get about $188 million this year selling Zero Emission credits

Tesla Motors announced that it has paid off the entire loan awarded to the company by the Department of Energy in 2010. In addition to payments made in 2012 and Q1 2013, today’s wire of almost half a billion dollars ($451.8M) repays the full loan facility with interest. Following this payment, Tesla will be the only American car company to have fully repaid the government for a Dept of Energy loan.

The loan payment was made today using a portion of the approximately $1 billion in funds raised in last week’s concurrent offerings of common stock and convertible senior notes. Elon Musk, Tesla’s Chief Executive Officer and cofounder, purchased $100 million of common equity, the least secure portion of the offering.

In the first quarter of 2013, Tesla sold nearly $68 million of zero-emission credits to other automakers, representing 12% of its overall revenue. The automakers buying the credits are concerned they won't be able to meet tough new environmental regulations requiring that more than 15% of sales in 11 states be zero-emission vehicles by 2025. Adam Jonas, an auto analyst with Morgan Stanley, estimates that Tesla's credit sales will come to $188 million this year.
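A quick back-of-envelope check on those figures: if nearly $68 million of credit sales was 12% of overall revenue, implied Q1 revenue was roughly $560-570 million:

```python
# Implied Q1 2013 revenue from the zero-emission credit figures above.
credit_sales = 68e6   # "nearly $68 million" of ZEV credits sold
credit_share = 0.12   # 12% of overall revenue
implied_revenue = credit_sales / credit_share  # ~ $567 million
```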

Seager equation based on detected exoplanets alternative to Drake Equation for alien life

The Kepler space telescope has found more than 130 worlds and detected 3000 or so more possibles.

Sara Seager at the Massachusetts Institute of Technology reckons the Drake equation is ripe for a revamp. Her version narrows a few of the original terms to account for our new best bets of finding life, based in part on what Kepler has revealed. If the original Drake equation was a hatchet, the new Seager equation is a scalpel.

Red dwarfs are the most common stars in our galaxy: projections based on Kepler data suggest that the nearest habitable Earth-sized world could orbit a red dwarf as close as 6.5 light years away.

Even better, it will be easier to probe these planets for gases associated with life, because tighter orbits mean that more of the star's light will filter through a planet's atmosphere on the way to us, picking up telltale clues to its composition. Seager's goal is to find the fraction of habitable Earth-sized worlds in our galactic neighbourhood with detectable atmospheric biosignatures – in other words, inhabited worlds. She has already put the number of inhabited planets that the James Webb space telescope might see at less than 10.

"Just like with the Drake equation, some of the terms are always speculative," Seager says.

Sara Seager has a 48-page ebook, "Is There Life Out There? The Search for Habitable Exoplanets."

New Technique May Open Up an Era of Atomic-Scale Semiconductor Devices with wafer-scale, one-atom-thick layers

Researchers at North Carolina State University have developed a new technique for creating high-quality semiconductor thin films at the atomic scale – meaning the films are only one atom thick. The technique can be used to create these thin films on a large scale, sufficient to coat wafers that are two inches wide, or larger.

“This could be used to scale current semiconductor technologies down to the atomic scale – lasers, light-emitting diodes (LEDs), computer chips, anything,” says Dr. Linyou Cao, an assistant professor of materials science and engineering at NC State and senior author of a paper on the work. “People have been talking about this concept for a long time, but it wasn’t possible. With this discovery, I think it’s possible.”

The researchers worked with molybdenum sulfide (MoS2), an inexpensive semiconductor material with electronic and optical properties similar to materials already used in the semiconductor industry. However, MoS2 is different from other semiconductor materials because it can be “grown” in layers only one atom thick without compromising its properties.

In the new technique, researchers place sulfur and molybdenum chloride powders in a furnace and gradually raise the temperature to 850 degrees Celsius, which vaporizes the powder. The two substances react at high temperatures to form MoS2. While still under high temperatures, the vapor is then deposited in a thin layer onto the substrate.

Nature Scientific Reports- Controlled Scalable Synthesis of Uniform, High-Quality Monolayer and Few-layer MoS2 Films

Molecular Trigger for Alzheimer's Disease Identified

Researchers have pinpointed a catalytic trigger for the onset of Alzheimer’s disease – when the fundamental structure of a protein molecule changes to cause a chain reaction that leads to the death of neurons in the brain.

For the first time, scientists at Cambridge’s Department of Chemistry have been able to map in detail the pathway that generates “aberrant” forms of proteins which are at the root of neurodegenerative conditions such as Alzheimer’s.

They believe the breakthrough is a vital step closer to increased capabilities for earlier diagnosis of neurological disorders such as Alzheimer’s and Parkinson’s, and opens up possibilities for a new generation of targeted drugs, as scientists say they have uncovered the earliest stages of the development of Alzheimer’s that drugs could possibly target.

Single cell genomics breakthrough - RNA in single cells sequenced and up to 1000-fold variability in expression levels found

A team of scientists at the Klarman Cell Observatory at the Broad Institute recently completed an effort to read, or sequence, all the RNA — the “transcriptome” — in individual immune cells. Whereas DNA in a cell’s genome represents its blueprint for making the building blocks of cells, RNA is more like the cell’s contractor, turning that blueprint into proteins. By sequencing RNA in single cells, scientists can obtain a picture of what proteins each cell is actively making and in what amounts.

The Broad researchers sought to adapt a recently developed technique for single-cell RNA sequencing, known as SMART-Seq, and apply it to a model of immune cell response well-studied by Regev, Broad senior associate member Nir Hacohen, and their fellow researchers. In this model, immune cells known as bone-marrow derived dendritic cells (BMDCs) are exposed to a bacterial cell component that causes the cells to mount an immune response.

Working with scientists in the Broad’s Genomics Platform, notably research scientists Joshua Levin and Xian Adiconis, the team established the SMART-Seq method for use in their model system, using it to gather RNA sequence data from 18 BMDCs in this pilot phase.

The team first analyzed the data for differences in expression, or activity, of various genes among the cells, seen as alterations in RNA abundance. Although they were working with a single cell type — BMDCs — they did expect to see some variation in gene expression as cells activated various pathways during their immune response. But the team discovered that some genes varied greatly, with 1000-fold differences in the expression levels between cells. “We went after a narrowly defined cell type that has a specific function that we think of as being very uniform,” said Shalek. “What we saw was striking — a tremendous variability that wasn’t expected.”
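The 1000-fold figure is simply the ratio of the highest to lowest per-cell expression of a gene. A sketch on synthetic counts (not data from the paper), shaped like the bimodal pattern the team describes:

```python
import numpy as np

# Fold-change in one gene's expression across single cells: the ratio
# of the highest to the lowest nonzero per-cell RNA abundance.

def fold_change(expression):
    expr = np.asarray(expression, dtype=float)
    nonzero = expr[expr > 0]
    return nonzero.max() / nonzero.min()

# 18 cells, as in the pilot: most express the gene at a low level,
# a few express it massively (a bimodal pattern). Synthetic values.
counts = [2, 3, 2, 4, 3, 2, 5, 3, 2, 4, 3, 2, 3, 4, 2, 2000, 1800, 2100]
fc = fold_change(counts)  # ~1000-fold spread
```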

Nature - Single-cell transcriptomics reveals bimodality in expression and splicing in immune cells

China Urbanization from now to 2050 needs more skyscrapers to leave land for agriculture

Goldman Sachs believes the ‘new’ China urbanization will aim for a shift in focus, in which incremental benefits will be more likely to accrue to social safety nets / consumption areas, while FAI ( fixed asset investment) will become more selective and targeted. In this report, we address several misconceptions about the future for urbanization.

A more concrete urbanization blueprint will be unveiled in the coming months. Premier Li Keqiang has identified top policy issues as geographic strategy, land reform, hukou reform, natural resource support, and environmental issues. Funding source is also a big challenge, in our [Goldman] view.

Healthcare and Insurance will be the big winning sectors with the new China urbanization.

Limitations in land and environment favor Broad Group Skyscrapers to continue old model urbanization

The Goldman study cites land scarcity and environmental and road limitations. Goldman expects more growth in townships to get around those problems. Nextbigfuture believes that Broad Group skyscrapers would enable more growth in first tier, second tier and third tier cities.

Land scarcity: Land is a scarce resource in China and the government has already been struggling to maintain its minimum commitment of 1.8bn mu of arable agricultural land. Simply building many more cities may not be feasible.

Environmental and other limitations: Some of China’s largest and most popular cities (such as Beijing) are facing clear bottlenecks not resolvable by simply more investment. For example, more roads and subways could possibly alleviate traffic congestion. However, other issues such as nature degradation and pollution are difficult to overcome in a short period of time, limiting the capacity of such cities to take in a faster pace of population inflow.

Broad Group has gotten permission to proceed with the construction of the 220 story Sky City skyscraper starting in June, 2013 in Changsha.

The Broad Group vision is to use the skyscrapers to make a cleaner, better city for China.

They have very good air purification systems for reducing indoor air pollution by over 99%.
The factory-built construction system would also reduce dust air pollution from construction.

They envision clean offices, homes (up to 3000 square feet), hospitals, schools, exercise facilities, pools, restaurants and grocery stores.

Clonetegration is a fast One-step genetic engineering technology

A new, streamlined approach to genetic engineering drastically reduces the time and effort needed to insert new genes into bacteria, the workhorses of biotechnology. The method paves the way for more rapid development of designer microbes for drug development, environmental cleanup and other activities.

Current genetic engineering methods are time-consuming and involve many steps. The approaches have other limitations as well. To address those drawbacks, the researchers sought to develop a new, one-step genetic engineering technology, which they named “clonetegration,” a reference to clones or copies of genes or DNA fragments.

They describe development and successful laboratory tests of clonetegration in E. coli and Salmonella typhimurium bacteria, which are used in biotechnology. The method is quick, efficient and easy to do and can integrate multiple genes at the same time. They predict that clonetegration “will become a valuable technique facilitating genetic engineering with difficult-to-clone sequences and rapid construction of synthetic biological systems.”

ACS Synthetic Biology - One-Step Cloning and Chromosomal Integration of DNA

EU fears being left behind with global shale oil and gas revolution but also fears hydrofracking and nuclear energy

EU leaders, desperate to give economic growth a boost, are talking about targeting energy policy. They are concerned a US-led revolution in shale oil and gas development will reshape the global economy and leave Europe far behind.

* energy costs remain high in the EU
* Europe paid one billion euros a day for its energy imports in 2012
* Europe risks becoming the only continent to depend on imported energy
* By 2035, Europe could still depend on imports for more than 80 percent of its energy needs
* one trillion euros in energy investment is needed by 2020
* Britain, Hungary, Poland, Romania and Spain favor developing shale energy but others, and France in particular, are opposed, citing the environmental issues involved.
* The public in many countries fear hydrofracking
* public fears of nuclear energy are also causing Germany and other countries to make costly shifts away from nuclear energy

Which European fears will win?
Fears of being left behind economically?
Fears of high unemployment?
Fears of hydrofracking?
Fears of nuclear energy?
Environmental fears?

Cheap natural gas is cheap energy and cheap chemical feedstock for industrial processes

Motivated by a rapid-fire increase in natural gas production in the United States, business leaders and some politicians in Germany say they need to act quickly to prevent the country’s industrial core from departing for places where energy costs are just a fraction of the price. They worry that the country’s ambitious environmental goals are far less meaningful if the economy withers in achieving them.

New asthma drug, dupilumab, cuts attacks by 87% and treats underlying cause of asthma

A new type of asthma drug (dupilumab) meant to attack the underlying causes of the respiratory disease slashed episodes by 87 percent in a mid-stage trial, making it a potential game changer for patients with moderate to severe disease, researchers said on Tuesday.

"Overall, these are the most exciting data we've seen in asthma in 20 years," said Dr. Sally Wenzel, lead investigator for the 104-patient study of dupilumab, an injectable treatment being developed by Regeneron Pharmaceuticals Inc and French drugmaker Sanofi.

The drug also met all its secondary goals, such as improving symptoms and lung function and reducing the need for standard drugs called beta agonists.
"We have been treating asthma with sort of Band-Aid therapies that didn't get at the underlying causes," Wenzel said in an interview, adding that dupilumab could be an important step in going to the root of the problem.

"By end of the trial, after 12 weeks, 44 percent of those in the placebo group had exacerbations, compared with 5 percent of those on dupilumab," Wenzel said. That represented an 87 percent reduction in exacerbations, which was highly statistically significant.
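The headline rates above can be sanity-checked with simple arithmetic. A quick sketch (note the trial's reported 87% comes from its adjusted statistical analysis, so the raw rates give a slightly different number):

```python
# Back-of-envelope check of the exacerbation reduction from the quoted
# trial rates: 44% of placebo patients vs 5% of dupilumab patients.
placebo_rate = 0.44
dupilumab_rate = 0.05

relative_reduction = 1 - dupilumab_rate / placebo_rate
print(f"Raw relative reduction: {relative_reduction:.1%}")  # 88.6%
```

That is close to the reported 87 percent figure, which reflects the study's adjusted model rather than the raw proportions.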

Single atom electron spin qubit building block for scalable quantum computer compatible with silicon computer chips

An Australian team unveils the fundamental building block of a scalable quantum computer that could be embedded in today’s silicon chips.

Kane Quantum Computer Proposal - Phosphorus atoms embedded in silicon would be the ideal way to store and manipulate quantum information. A phosphorus atom could store a single qubit for long periods of time in the way it spins. A magnetic field could easily address this qubit using well-known techniques from nuclear magnetic resonance spectroscopy. That would allow single-qubit manipulations but not two-qubit operations, because nuclear spins do not interact significantly with each other.

For that, he suggested transferring the spin to an electron orbiting the phosphorus atom, which would interact much more easily with an electron orbiting a nearby phosphorus atom. Two-qubit operations would then be possible by manipulating the two electrons with electric fields.

Building a Kane quantum computer has become almost an obsession in Australia, where some 100 researchers have been working on the problem for over a decade.

Breakthroughs Achieved
* Able to implant phosphorus atoms at precise locations in silicon using a scanning tunnelling microscope.
* Able to address the nuclear spins of these phosphorus atoms using powerful magnetic fields.
* Now able to address the spin of an individual electron orbiting a phosphorus atom and to read out its value.

The end result is a device that can store and manipulate a qubit and has the potential to perform two-qubit logic operations with atoms nearby; in other words the fundamental building block of a scalable quantum computer.

Arxiv - A Single-Atom Electron Spin Qubit in Silicon

Good governance is the toughest part of ending poverty

In April the World Bank governors endorsed two historic goals: to end extreme poverty by 2030 and to ensure that prosperity is shared.

It will take a lot to end poverty: strong growth, more infrastructure investments, increased agricultural productivity, better business environments, jobs, good education, and quality health care. We have to do more of this in tough places, particularly those that are fragile and conflict-affected.

But it also takes overcoming institutional weaknesses and zero tolerance for corruption. Without improving governance it will not be possible to lift the 1.2 billion people who still live on $1.25 a day or less out of poverty and to ensure that economic growth will benefit all citizens.

Good governance and the role it plays in fighting poverty is complex. A finance minister from a resource-rich but otherwise poor country told me recently that the fuel subsidies in that country, designed to protect the most vulnerable from high prices, are ultimately “anti-poor” because the rich benefit most and the subsidies are wasteful and ineffective. And another official from a middle income country described achieving shared prosperity as tough because a growing middle class has high expectations and becomes disillusioned by corruption and lack of services, making them less willing to support the state.

Some Countries appear to want inflated poverty statistics to get more international aid

World Bank poverty statistics are based upon flawed purchasing power parity numbers. There is real poverty in the world but the poverty statistics likely have been substantially overstating the poverty levels for ten years.

Taking the bank’s 2012 figures at face value also implies that we have to believe the following:

* North Korea has roughly the same poverty rate as China.
* Individual consumption in India has grown at a paltry 1.5 percent per year since the country’s economic takeoff in the early 1990s, and the much vaunted Indian middle class only numbers 9 million people—in a country with over 900 million cell phone subscribers and 40 million cars.
* In 1981, China was poorer than any country in the world is today, with a level of individual consumption below the current level in Liberia.
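The India figure in the list above compounds to surprisingly little growth. A rough check, assuming roughly 20 years from the early-1990s takeoff to 2012 (the exact period is an assumption here):

```python
# Compounding the claimed 1.5%/yr individual consumption growth over
# the ~20 years from India's early-1990s takeoff to 2012.
annual_growth = 0.015
years = 20

total_growth = (1 + annual_growth) ** years - 1
print(f"Total consumption growth: {total_growth:.0%}")  # 35%
```

An implausibly small cumulative gain for two decades of rapid economic takeoff, which is the point the critique is making.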

World Bank Aid by country

China received over one billion dollars in aid until 2009. In 2011, China started giving out more development aid. China would not have a reason to suppress its purchasing power parity numbers. Some parts of China could collect the free money from 2000 to 2010 while other parts were adding trillions to the overall economy.

Path to getting salamander like regeneration under hospital conditions

Macrophages are a major immune cell type that roam the tissues engulfing invaders like bacteria and fungi. But they're not just involved in gobbling up debris. They actively direct repair; for example, they are important in human muscle repair.

When macrophages were removed from salamanders, it had a "devastating effect" on their ability to regrow limbs. The animals ended up with fibrosis (scarring) and a stump.

Godwin believes that chemicals released by the animals' macrophages are essential for the regeneration process, and is conducting experiments now to investigate this.

"This really gives us somewhere to look for what might be secreted into the wound environment that allows for regeneration," he says.

"The long-term plan is that we'll know exactly what cocktail to add to a wound site to allow salamander-like regeneration under hospital conditions."

The work has implications not just for entire limb regrowth, but for "smaller, less ambitious" goals such as scar-less healing. Although scars perform a useful function in stopping blood loss and preventing infection getting into a wound, they inhibit communication between cells and this prevents regeneration, says Simon. Down the track, using the salamander's approach could maybe help with healing of burns, for instance, he suggests.

PNAS - Macrophages are required for adult salamander limb regeneration

Ending $1.25 per day PPP poverty by 2030

The Brookings Institute has a study on getting poverty below 3% of population by 2030. The range of poverty outcomes for 2030 is large, implying that the future trajectory of global poverty is highly uncertain. Getting to the “zero zone”, defined here as a poverty rate of under 3 percent, by 2030 is unlikely to occur through stronger than expected consumption growth or an improving distribution alone. Both factors are needed simultaneously.

There is no magic ingredient for eliminating poverty. Rather it hinges on a complex recipe: better than expected consumption growth and distributional trends in favor of the poor; country-by-country progress in transitioning fragile and conflict-affected states onto a stable path; strengthening the resilience of vulnerable households and economies to other kinds of shocks; the incorporation of isolated or excluded sub-national populations into the orbit of their economies; more deliberate and efficient targeting of the poor, including the poorest of the poor, at a country and sub-national level.

While the future trajectory of global poverty is impossible to predict, our understanding of what it will take to eliminate poverty is growing. The challenge for the global community is to seize this knowledge so that the dream of achieving a poverty-free world becomes a reality.

The 40-year period from 1990 to 2030 resembles a relay race in which responsibility for leading the charge on global poverty reduction passes between these three giants.

China’s relay leg is the most striking. It undergoes a dramatic transformation in which its population, which is initially predominantly poor, disperses over a range of consumption levels beyond the poverty line, driven by rapid, though often inequitable, consumption growth.

Age of Technological Disruption

"All of the structures that we use to run the world today— our civics, our politics, our legal systems, healthcare, education— are all structured for a world 100 or 200 years ago, not for the world of today. So we think we're in for a lot of disruption," says Salim Ismail, founding director of Singularity University.

ReasonTV's Tracy Oppenheimer caught up with Salim at the 2013 Milken Institute Global Conference in Beverly Hills, CA to discuss crowd funding, the next steps in technological expansion, and how we've entered the age of an information-based environment.

"It's happening across industries. The first few were newspapers, music, and electronic publishing. Those were the first three domains to be fully information-enabled. Now we're moving to cars being information-enabled," says Salim. "We're turning everything into a computational basis."

Singularity University's academic programs strive to "educate, inspire and empower leaders to apply exponential technologies to address humanity's grand challenges."

A major driver of the future economy will be the lowered barrier to forming a company: it now takes $100,000 to start a company instead of $20 million, and the $100,000 can be crowdsourced if you have a good idea.

May 21, 2013

South Korea should overtake Japan by 2017 on per capita income (GDP PPP)

Looking at GDP PPP numbers across the four Asian tigers shows that they are each slowly reaching, if not overtaking, Japan.

1993 - Singapore overtook Japan in GDP PPP
1997 - Hong Kong overtook Japan
2010 - Taiwan overtook Japan

South Korea should overtake Japan as the richest of the larger countries in Asia in terms of GDP PPP by 2017.

It was only in 1980 that South Korea had a GDP PPP that was less than one quarter of that seen in Japan at the time. South Korea has clearly come a long way since 1980, especially given that back then, companies such as Hyundai, LG, and Samsung were practically unheard of outside of Seoul.

The IMF still expects Japan to be ahead of South Korea in 2017 (barely). So it might be 2018 for South Korea to overtake Japan in per capita GDP PPP.

Bloom Energy has raised $130 million to bring total raised to $1.1 billion and are on track to be profitable in 2013

Fuel cell maker Bloom Energy has raised $130 million in new venture capital funding in May 2013. Bloom now has raised more than $1.1 billion in venture capital funding, including past investments from Kleiner Perkins Caufield & Byers, New Enterprise Associates, Advanced Equities, DAG Ventures and Goldman Sachs.

The company said it had become gross margin positive (on a pro forma basis), was "operating with a fully funded business plan" and was "on track with our goal to be profitable in 2013."

Bloom Energy is most likely not profitable even after 11 years. Bloom Energy CFO Bill Kurtz had said that the company was “half way to break even” in the Summer of 2012. Primack previously reported that Bloom’s retained earnings through Q3 2012 stood at negative $873 million, with $113 million left in the bank, and with positive gross margins on a pro forma basis.

A field of Bloom box fuel cells

Rossi LENR convinces some but many are not convinced and say it is a fraud

Jed Rothwell gives a positive review of the Hot Cat third party test paper

These people think and write like engineers rather than scientists. That is a compliment coming from me [Jed].

In every instance, their assumptions are conservative. Where there is any chance of mismeasuring something, they assume the lowest possible value for output, and the highest value for input. They assume emissivity is 1 even though it is obviously lower (and therefore output is higher). They add in every possible source of input, whereas any factor that might increase output but which cannot be measured exactly is ignored. For example, they know that emission from the sides of the cylinder close to 90 degrees away from the camera is undermeasured (because it is at an angle), but rather than try to take that into account, they do the calculation as if all surfaces are at 0 degrees, flat in front of the camera. In the first set of tests they know that the support frame partly blocks the IR camera, casting a shadow and reducing measured output, but they do not try to take that into account.

Edmund Storms on Nickel 62.

Simple arguments can show that the amount of energy claimed by Rossi cannot result from the Ni+p=Cu reaction regardless of the isotope. Ironically, people will accept Rossi's claim that transmutation is the source of energy while questioning whether he makes any energy at all.

Japan may restart more nuclear reactors in the fall of 2013

Industry minister Toshimitsu Motegi said that some of the country’s idled nuclear reactors might resume operations this autumn at the earliest after undergoing a new safety assessment process.

To restart idled reactors, it is necessary to obtain the understanding of the host local communities. In this regard, Motegi said the government will make efforts to win local consent.

Of the nation’s 50 commercial reactors, only two are currently online amid safety concerns over nuclear power in the wake of the 2011 Fukushima No. 1 plant disaster.

Tepco shares have surged on hopes that there will be reactor restarts later this year. Tokyo Electric Power, operator of the stricken Fukushima Dai-Ichi plant, surged for a fourth consecutive day, bringing gains to 59 percent in the period amid speculation it will apply soon to restart idled reactors.

Areva expects Japan over the next few years to restart two-thirds of its atomic plants that were idled after the 2011 Fukushima accident. Half a dozen reactors may restart by the end of this year in addition to the two that resumed operations in 2012.

New Xbox One has eight times the graphics performance and New Kinect can measure your pulse by scanning your face

Wired has the details on the new Xbox One. This was previously what was expected to be called the Xbox 720.

Microsoft touts the Xbox One as delivering 8 times the graphics performance of the 360. If you were to go by raw transistor count, that performance jump would be closer to tenfold: the Xbox One boasts 5 billion transistors to the 360’s 500 million. It has 8GB of RAM and an 8-core CPU.

A new 500-GB hard drive was designed in-house, likewise a custom-built Blu-ray–capable optical drive. A single 40-nanometer chip contains both the CPU and GPU rather than the two dedicated 90-nm chips needed in the 360. In fact, a custom SOC (system on a chip) module made by AMD contains the CPU/GPU chip, the memory, the controller logic, the DRAM, and the audio processors, and connects directly to the heat sink via a phase-change interface material.
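The quoted transistor counts make the "closer to tenfold" claim easy to verify:

```python
# Rough comparison of the transistor counts quoted by Wired.
xbox_one_transistors = 5_000_000_000   # 5 billion
xbox_360_transistors = 500_000_000     # 500 million

ratio = xbox_one_transistors / xbox_360_transistors
print(f"Raw transistor-count jump: {ratio:.0f}x")  # 10x, vs the 8x performance claim
```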

Xbox One gives game developers the ability to access Microsoft’s Azure cloud computing platform. That leads to a few obvious and immediate applications: All your downloaded and installed games and achievements are synced to the cloud and can be accessed and played without interruption on any Xbox One you sign in to; stable, dedicated servers for every multiplayer game rather than the notoriously fragile practice of hosting matches on one participant’s console; even multiplayer matches that can grow to 64, even 128 participants, rather than the usual limit of 16 or 32.

Science fair winner has 20.1 Wh/kg energy density core-shell nanorod supercapacitor

Eesha Khare, 18, of Saratoga, Calif., received the Intel Foundation Young Scientist Award of $50,000 for the invention of a tiny energy-storage device that will enable cellphones to charge in 20 seconds.

Eesha's invention also has potential applications for car batteries.

The EEstory covers Eesha Khare and her parents.

Manoj Khare is her father. He was one of the co-founders of Vihana Inc., a company bought by Cisco for its semiconductor technology. It wasn't a big company and the sale price was $30 million. Manoj Khare has the materials science background to account for the high schooler's nearly miraculous invention.

Reena Khare (mother) is herself some sort of science genius and associated with a few patents in the biology realm ("genomic technologies"). It looks like she was part of a research team at Incyte Inc in a Palo Alto R&D facility before it closed.

Her teacher is Amanda Alonzo, who earned a master's degree from Stanford. She won a California science teacher of the year award.

The Details of the Supercapacitor Work

Design and Synthesis of Hydrogenated TiO2-Polyaniline Nanorods for Flexible High-Performance Supercapacitors

The goal of this work was to design and synthesize a supercapacitor with increased energy density while maintaining power density and long cycle life.

Carbon fibers coated with carbon nanotubes have been made twice as strong

MIT researchers have produced carbon fibers coated in carbon nanotubes without degrading the underlying fiber's strength. The engineered fibers may be woven into composites to make stronger, lighter airplane parts.

The researchers coated carbon fibers with nanotubes without causing fiber degradation, making the fibers twice as strong as previous nanotube-coated fibers — paving the way for carbon-fiber composites that are not only stronger, but also more electrically conductive. The researchers say the techniques can easily be integrated into current fiber-manufacturing processes.

Applied Materials and Interfaces - Circumventing the Mechanochemical Origins of Strength Loss in the Synthesis of Hierarchical Carbon Fibers

Hierarchical carbon fibers (CFs) sheathed with radial arrays of carbon nanotubes (CNTs) are promising candidates for improving the intra- and interlaminar properties of advanced fiber-reinforced composites (e.g., graphite/epoxy) and for high-surface-area electrodes for battery and supercapacitor architectures. While CVD growth of CNTs on CFs has been previously shown to improve the apparent shear strength between fibers and polymer matrices (up to 60%), this has to date been achieved only at the expense of significant reductions in tensile strength (30–50%) and stiffness (10–20%) of the underlying fiber. Here we demonstrate two approaches for growing aligned and unaligned CNTs on CFs that enable preservation of fiber strength and stiffness. We observe that CVD-induced reduction of fiber strength and stiffness is primarily attributable to mechanochemical reorganization of the underlying fiber when heated untensioned above 550 °C in both hydrocarbon-containing and inert atmospheres. We show that tensioning fibers to ≥12% of tensile strength during CVD enables aligned CNT growth while simultaneously preserving fiber strength and stiffness even at growth temperatures over 700 °C. We also show that CNT growth employing CO2/acetylene at 480 °C without tensioning—below the identified critical strength-loss temperature—preserves fiber strength. These results highlight previously unidentified mechanisms underlying synthesis of hierarchical CFs and demonstrate scalable, facile methods for doing so.

40 gigabit per second wireless link and the possibility of even faster communication

Researchers of the Fraunhofer Institute for Applied Solid State Physics and the Karlsruhe Institute for Technology have achieved the wireless transmission of 40 Gbit/s at 240 GHz over a distance of one kilometer. Their most recent demonstration sets a new world record and ties in seamlessly with the capacity of optical fiber transmission. In the future, such radio links will be able to close gaps in providing broadband internet by supplementing the network in rural areas and places which are difficult to access.
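To put 40 Gbit/s in perspective, a quick calculation; the 25 GB file size is an illustrative assumption (roughly a single-layer Blu-ray image), and protocol overhead is ignored:

```python
# Idealized transfer time over the Fraunhofer/KIT 240 GHz link.
link_rate_bps = 40e9        # 40 Gbit/s
file_size_bytes = 25e9      # assumed 25 GB file (single-layer Blu-ray)

seconds = file_size_bytes * 8 / link_rate_bps
print(f"Transfer time: {seconds:.0f} s")  # 5 s
```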

A distance of over one kilometer has already been covered by using a long range demonstrator between two skyscrapers in Karlsruhe. (Photo: Ulrich Lewark / KIT)

May 20, 2013

China Provincial GDP for 2012

Storing tens of gigabits in cubic millimeters of opal for unbreakable and convenient one time pad encryption

One-time pads are the holy grail of cryptography: they are impossible to crack, even in principle.

They work by adding a set of random digits to a message, thereby creating a ciphertext that looks random to any eavesdropper. The receiver decodes the message by subtracting the same set of random digits to reveal the original message.

The security of this process depends on two factors. The first is the randomness of the digits that make up the one time pad. If this key is truly random, it offers nothing the eavesdropper can use to break the code. Although there are some potential pitfalls, random digits are reasonably straightforward to generate these days.

The second factor is the ability to keep this key secret so that only the transmitter and receiver have access to it. That’s much more difficult to ensure.
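The add-and-subtract scheme described above is usually implemented in binary as a bitwise XOR, which is its own inverse. A minimal sketch:

```python
import secrets

# One-time pad via XOR: the key must be truly random, at least as long
# as the message, and never reused.
def otp_xor(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "the pad must be at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))   # the shared secret pad

ciphertext = otp_xor(message, pad)        # looks random to an eavesdropper
recovered = otp_xor(ciphertext, pad)      # applying the same pad decrypts
assert recovered == message
```

Because every possible plaintext of the same length is equally consistent with the ciphertext, an attacker without the pad learns nothing, which is exactly why keeping the pad secret is the hard part.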

Arxiv - Physical key-protected one-time pad

We describe an encrypted communication principle that can form a perfectly secure link between two parties without electronically saving either of their keys. Instead, cryptographic key bits are kept safe within the unique mesoscopic randomness of two volumetric scattering materials. We demonstrate how a shared set of patterned optical probes can generate 10 gigabits of statistically verified randomness between a pair of unique 2 cubic millimeter scattering objects. This shared randomness is used to facilitate information-theoretically secure communication following a modified one-time pad protocol. Benefits of volumetric physical storage over electronic memory include the inability to probe, duplicate or selectively reset any random bits without fundamentally altering the entire key space. Beyond the demonstrated communication scheme, our ability to securely couple the randomness contained within two unique physical objects may help strengthen the hardware for a large class of cryptographic protocols, which is currently a critically weak link in the security pipeline of our increasingly mobile communication culture.

The construction and operation of a CPUF (communication physical unclonable function). (a) Sequentially over time, n random phase patterns pi are displayed on an SLM. (b) A microscope objective (MO) focuses each random wavefront from the SLM onto a volumetric scatterer. The scrambled light emerging from the material passes through a designed aperture before being detected by a CMOS sensor. (c) Each detected speckle image r is digitally transformed into an ideally random key k with a constant digital whitening projection operator W. (d) Optical scattering is mathematically represented by a complex random Gaussian matrix T and (e) digital whitening is described by a sparse binary random matrix W. The combination of one unique T and general W per CPUF device leads to an ideally random multi-gigabit key space that is very difficult to characterize or clone. (f) The experimental CPUF setup, including all components in (b).

Rossi, NASA and Low energy nuclear reactions and Nickel 62 and Nickel 63 Speculation

Rossi had a third party investigation of his hot ecat. They should have done flow calorimetry, but the tests they did show interesting results.

There are rumors on the internet that Ampenergo (Rossi related company) has raised a lot of money. This fact fits with both the fraud conspiracy and the real deal theories.

Looking at what might be at work if it is the real deal

In 2011, Rossi indicated that hydrogen and nickel were the mechanism.

Rossi said that about 30% of nickel was turned into copper, after 6 months of uninterrupted operation.

If it is nickel and neutrons

If nickel isotopes are the key
* start with highly enriched nickel 62
* get a cheap source of neutrons at 10^14 neutrons per second
* get a lot of nickel 63 and then have an environment that accelerates decay to copper

This would generate a lot of energy with radiation that would not escape a metal container and would have stable copper as an end product.

Note - a less controversial way to get to this would be super-efficient enrichment, say laser enrichment, combined with some other cheap neutron source.

GE - Global Laser Enrichment received a construction and operation licence in September, 2012 for a full-scale laser enrichment facility in Wilmington, North Carolina.

Nickel 63 is unstable and will beta decay into copper. There is radiation, but it is easily stopped, even by a piece of paper.

Beta radiation is easily shielded. The level of shielding depends on the energy of the beta radiation. The Ni-63 found in ECDs is easily shielded by even a sheet of paper because of its very low energy of emission. Thus this isotope has virtually no external radiation risk, though it certainly can be a dangerous source of internal exposure if the source is leaking.

Intense magnetic fields can induce beta decay based on a 1983 paper in Physical Review Letters.

Details around Nickel 63 energy

Natural nickel consists of five stable isotopes:
* nickel-58 (68.27 percent)
* nickel-60 (26.10 percent)
* nickel-61 (1.13 percent)
* nickel-62 (3.59 percent)
* nickel-64 (0.91 percent)

The Russians had tried to make industrial quantities of Nickel 63 for power generation.

Nickel-63 (a pure beta-emitter with a half-life of 100 years) is one of the most promising radionuclides that can be used in miniature autonomous electric power sources with a service life of above 30 years (nuclear batteries) working on the betavoltaic effect.

The maximum energy of beta-particles in the 63Ni emission spectrum is 65 keV, which is much lower than the threshold of radiation damage in the semiconductors intended for use - silicon and gallium arsenide.

If Rossi were using, say, 80% enriched nickel-62, and the device were producing a neutron flux, then nickel-63 would be generated, give off energy, and decay to copper.

Steve Jurvetson indicates that Machine learning innervates everything Google does - Google glass, robotic cars, Dwave quantum computer research

Steve Jurvetson is a venture capitalist who invested in DWave Systems. Dwave makes adiabatic quantum computers. Dwave sold a 512 qubit system to Google for machine learning and artificial intelligence

Google's Hartmut Neven said "We actually think quantum machine learning may provide the most creative problem-solving process under the known laws of physics."

This is an interesting development in a larger trend I (Steve Jurvetson) call Deus Ex Machina — machine learning innervates everything.

Under the covers, just about every new initiative at Google, from Glass to robo-cars, is driven by machine learning — whereby the machine learns patterns in the data without explicit models or traditional solution design. It’s what makes “Big Data” BIG this time around. The approach requires a humble relaxation of the presumption of control, and so it starts with companies like Google and eventually revolutionizes all businesses, even those with a delusion of control, like Investment bankers. =)

Tumblr Could have had a $3 billion buyout

Danielle Morrill describes how Tumblr could have sold for $3 billion instead of the $1.1 billion deal with Yahoo.

Tumblr claims 120 million+ daily impressions on Tumblr Radar, which equals 3.6 Billion+ monthly impressions. Assuming $10 – 20 RPM (revenue per thousand impressions), which is within the normal range for premium brand advertising, the total revenue opportunity for Q1 was $108 – 216 Million. By contrast, Tumblr's actual annual run rate was reportedly only about $15 Million ($3.75 Million in quarterly revenue).
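The revenue-opportunity figures above follow directly from the impression count and the assumed RPM range:

```python
daily_impressions = 120e6          # Tumblr Radar daily impressions (claimed)
monthly = daily_impressions * 30   # ~3.6 billion per month
rpm_low, rpm_high = 10, 20         # revenue per thousand impressions, $

quarterly_thousands = monthly * 3 / 1000   # thousands of impressions per quarter
low = quarterly_thousands * rpm_low
high = quarterly_thousands * rpm_high
print(f"Q1 revenue opportunity: ${low/1e6:.0f}M - ${high/1e6:.0f}M")
```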

Tumblr is running low on cash on hand, and there is a lack of trust in leadership to hit revenue milestones. Both likely weaken Tumblr’s negotiating position, contributing to what some consider a “lowball” offer.

Self-driving car technology that would only add $4000 to price of a car wins Intel science fair

Ionut Budisteanu, 19, of Romania was awarded first place for using artificial intelligence to create a viable model for a low-cost, self-driving car at this year's Intel International Science and Engineering Fair. His whole system should work for no more than $4,000.

Ionut created a feasible design for an autonomously controlled car that could detect traffic lanes and curbs, along with the real-time position of the car.

"The most expensive thing from the Google self-driving car is the high resolution 3-D radar, so I was thinking how I could remove it," he told NBC News.

His solution relies on processing webcam imagery with artificial intelligence technology to pick out the curbs, lane markers, and even soccer balls that roll onto the road. This is coupled with data from a low-resolution 3-D radar that recognizes "big" objects such as other cars, houses, and trees.

All of this information is collected and processed real time by a suite of computers that, in turn, feed into a "supervisor" computer program that calculates the car's path and drives it down the road.

Budisteanu ran 50 simulations with his system and in 47 of them it performed flawlessly. In three, however, it failed to recognize some people who were 65 to 100 feet (20 to 30 meters) away. He said slightly higher-resolution 3-D radar should do the trick and still keep costs at a fraction of Google's.

Third Party ECat report on Arxiv

Arxiv - Indication of anomalous heat energy production in a reactor device containing hydrogen loaded nickel powder

Giuseppe Levi, Bologna University, Bologna, Italy

Evelyn Foschi, Bologna, Italy

Torbjörn Hartman, Bo Höistad, Roland Pettersson and Lars Tegnér Uppsala University, Uppsala, Sweden

Hanno Essén, Royal Institute of Technology, Stockholm, Sweden

An experimental investigation of possible anomalous heat production in a special type of reactor tube named E-Cat HT is carried out. The reactor tube is charged with a small amount of hydrogen loaded nickel powder plus some additives. The reaction is primarily initiated by heat from resistor coils inside the reactor tube. Measurement of the produced heat was performed with high-resolution thermal imaging cameras, recording data every second from the hot reactor tube. The measurements of electrical power input were performed with a large bandwidth three-phase power analyzer. Data were collected in two experimental runs lasting 96 and 116 hours, respectively. An anomalous heat production was indicated in both experiments. The 116-hour experiment also included a calibration of the experimental set-up without the active charge present in the E-Cat HT. In this case, no extra heat was generated beyond the expected heat from the electric input.

Computed volumetric and gravimetric energy densities were found to be far above those of any known chemical source. Even by the most conservative assumptions as to the errors in the measurements, the result is still one order of magnitude greater than conventional energy sources.

May 19, 2013

Carnival of Nuclear Energy 157

The Carnival of Nuclear Energy 157 is up at Hiroshima Syndrome.

Yes Vermont Yankee looks at what the land use requirements would be to achieve 90% renewables by 2050 in Vermont.

* To generate 18,000 GWh of electricity, Vermont would need to build 140 wind farms with the approximate output of Lowell Mountain’s 21-turbine facility. According to the National Renewable Energy Laboratory web site and other comparisons, 21 turbines of this size would usually cover 5 miles of ridgeline. These 140 wind farms would use 2,940 industrial turbines over 700 miles of ridgeline. Lowell claims to use only 3 miles of ridgeline: in that case, ”only” 420 miles of ridgeline would be required for the turbines. However, not all ridges have wind as good as Lowell’s, so more turbines would probably be needed. Keep in mind, the entire state of Vermont is 158 miles long and 90 miles across at its widest.
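The land-use arithmetic above can be reproduced directly (the 140-farm figure and the per-farm ridgeline lengths are taken from the post):

```python
farms = 140                # wind farms matching Lowell Mountain's output for 18,000 GWh
turbines_per_farm = 21     # Lowell Mountain's turbine count
miles_per_farm_nrel = 5    # NREL-style spacing estimate
miles_per_farm_lowell = 3  # Lowell's claimed footprint

turbines = farms * turbines_per_farm
print(f"Turbines needed: {turbines}")
print(f"Ridgeline at 5 mi/farm: {farms * miles_per_farm_nrel} miles")
print(f"Ridgeline at 3 mi/farm: {farms * miles_per_farm_lowell} miles")
```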

Inductrack III for super-efficient levitation and movement of shipping containers

Inductrack III configurations are suited to transporting heavy freight loads. Inductrack III addresses a problem with the cantilevered track of the Inductrack II configuration: a cantilevered track presents mechanical design challenges in building a track strong enough to support very heavy loads. In Inductrack III, the levitating portion of the track can be supported uniformly from below, because the levitating Halbach array on the moving vehicle is single-sided and thus does not require the cantilevered track employed in Inductrack II.

Inductrack systems were studied for moving containers at the port of Los Angeles. This study was done with Inductrack II, which predates the Inductrack III optimizations for cargo transport.

The world’s first cargo maglev system, the Electric Cargo Conveyor (ECCO), is undergoing testing at the GA test track in San Diego, California. The system architecture shuttles cargo vehicles back and forth through high-speed sections connected with dual loading/unloading spurs. This arrangement, coupled with a 20-second headway between vehicles in transit and a 2-minute dwell time for loading and unloading, meets the requirement of 5,000 container trips per day. The system is driverless, using automatic train control. It is also energy-efficient, and uses regenerative braking during deceleration.

Inductrack is a completely passive, fail-safe magnetic levitation system, using only unpowered loops of wire in the track and permanent magnets (arranged into Halbach arrays) on the vehicle to achieve magnetic levitation. The track can be in one of two configurations, a "ladder track" and a "laminated track". The ladder track is made of unpowered Litz wire cables, and the laminated track is made out of stacked copper or aluminum sheets.

* Inductrack I was optimized for high-speed operation
* Inductrack II is more efficient at lower speeds
* Inductrack III is designed for moving heavy freight
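Richard Post's published scaling for Inductrack gives a lift-to-drag ratio that grows linearly with speed, L/D = (2πv/λ)(L/R), which is why the passive system levitates efficiently once moving. A minimal sketch, with the wavelength and L/R time constant as purely illustrative assumed values rather than parameters of any actual Inductrack design:

```python
import math

# Illustrative (assumed) parameters, not from any specific Inductrack design
WAVELENGTH_M = 0.5   # Halbach array wavelength, m
L_OVER_R_S = 0.01    # track circuit inductance/resistance time constant, s

def lift_to_drag(v_m_per_s):
    """Post's Inductrack relation: L/D = 2*pi*v*(L/R)/wavelength."""
    return 2 * math.pi * v_m_per_s * L_OVER_R_S / WAVELENGTH_M

for v in (1, 10, 50):
    print(f"v = {v:3d} m/s -> lift/drag ~ {lift_to_drag(v):.2f}")
```

The linear growth with velocity means drag losses become a shrinking fraction of lift at operating speed, while at rest the vehicle simply settles onto auxiliary wheels.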

FIG. 4A shows a plot of drag power versus velocity for an Inductrack I configuration for a levitated load of 35,000 kg.

FIG. 4B shows a plot of drag power versus velocity for an Inductrack III configuration (such as that shown in FIG. 3) for a levitated load of 35,000 kg.

FIG. 5A shows a plot of drag power versus velocity for an Inductrack I configuration for a levitated "return trip" load of 8000 kg.

Several maglev railroad proposals are based upon Inductrack technology. The U.S. National Aeronautics and Space Administration (NASA) is also considering Inductrack technology for launching space planes.

General Atomics is developing Inductrack technology in cooperation with multiple research partners.

Metamaterials remove heat and control its flow and direction

Fractal Antenna Systems has filed a patent on better methods for removing heat and controlling its flow and direction. The technology uses tiny resonators made of self-scaled structures called fractals to form a virtual bridge that leads heat, or other forms of electromagnetic radiation, from one area to another. The resulting ‘heat transfer’ happens at the speed of light and can be superior to other methods such as convection, conduction, or ambient radiation loss.

CEO and inventor Nathan Cohen says “Our initiative to exploit fundamental technology on fractal metamaterials is bearing bountiful fruit, such as the world’s first and best invisibility cloak (US patent 8,253,639) , and now ‘metatransfer’ cooling.” Cohen is an astrophysicist with decades of experience in radio astronomy, optics, radar/ultrasound, and infrared. He is considered one of the world’s experts in applications of fractals. Cohen added that the company holds the fundamental patent on fractal metamaterials.

Metamaterials are used in the new technology as an array of close-spaced resonators that radiate the heat to each adjacent fractal resonator ‘cell’. The close spacing forces a special type of radiation called ‘evanescent waves’. There is no physical contact. The fractal shapes used to make the metamaterials assure that the heat is efficiently transferred, from hot areas to desired cool areas, at wide bandwidths. No outside power is needed. The metatransfer material can be paper thin and easily bendable, with simple and inexpensive connects. Because the effect is a special type of electromagnetic radiation, it occurs at the speed of light.
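As a rough illustration of why close spacing matters for evanescent coupling, here is a toy model assuming the field falls off exponentially across the gap; the decay length δ is an arbitrary assumed scale, not a figure from the patent:

```python
import math

DECAY_LENGTH = 1.0  # assumed evanescent decay length (arbitrary units)

def coupling(gap):
    """Relative field strength across a gap, assuming exp(-gap/delta) decay."""
    return math.exp(-gap / DECAY_LENGTH)

for gap in (0.1, 0.5, 1.0, 2.0):
    print(f"gap = {gap:.1f} delta -> relative coupling {coupling(gap):.2f}")
```

Because the evanescent field dies off exponentially, resonators must sit within roughly a decay length of each other for efficient transfer, which is what "close-spaced" means in practice.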

Cohen further added that although the technology has been kept close to the vest, recent scholarly publications have reported similar metatransfer ability, with less efficiency and bandwidth. “We are deeply appreciative these independent research results are now public, as it makes our job easier to cite outside scientific validation that post-dates our pioneering efforts.” Cohen stresses that the pending patent, under both first-to-invent and first-to-file rules, pre-dates any potential competing claims.

Cohen expects the technology to see wide use in the next two decades. “We intend to start out small, with disruptive applications, and use 3D prototype printing, for example, to make the metatransfer heat cooling an integral part of devices,” he concluded.

China's 20 year plan to pay $8 trillion to urbanize 500 million people by 2030

After extensive consultation, co-ordinated by the National Development and Reform Commission, the long-term plan for China's urbanisation is being finalised. Behind all the complex issues is one fundamental question: how will it be paid for?

Here, ballpark costs of $400 billion per year are suggested, to be funded by increased taxes, temporarily raising the budget deficit from 2% of GDP to 5%, and redirecting funds from rural land acquisition.

The costs of urbanization could be reduced by leveraging the factory mass-produced skyscraper technology of Broad Group. China's Broad Group is building Sky City One using factory mass production. It is likely to be completed after 90 days of assembly late in 2013, and the projected cost for the building is RMB 4 billion (US$628 million). Sky City will boast 220 floors, 1 million square meters (11 million square feet) of floor space and 104 elevators, according to the preliminary plans. It will cost $63 per square foot and house 30,000 people in 4,500 apartments. Five hundred Sky Cities would cost $314 billion (and costs could fall as later buildings are built for less). They could house the 15 million people urbanized each year, and would also include all of the schools, offices, hospitals and other facilities needed.

Based on estimates by the State Council's Development Research Centre and other sources, 100,000 yuan (HK$125,000) to convert one rural resident to an urban dweller may be a reasonable starting point. So, converting 500 million people (which would see 70 per cent of China urbanised) by 2030 would cost about 50 trillion yuan, or US$8 trillion, the equivalent of China's gross domestic product last year. A comparison may be German reunification, which cost some US$2 trillion for a much smaller base of 16 million people. At four times the cost, China's urbanisation would be 30 times the size of German unification on a human scale.
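The ballpark figures above can be checked with simple arithmetic; a sketch (the ~6.2 yuan/USD rate is an assumed 2013 exchange rate, not a figure from the article):

```python
# Per-capita conversion cost (State Council Development Research Centre estimate)
cost_per_person_yuan = 100_000
people = 500e6
total_yuan = cost_per_person_yuan * people  # 50 trillion yuan
total_usd = total_yuan / 6.2                # assumed ~6.2 yuan/USD (2013 rate)
print(f"Total: {total_yuan/1e12:.0f} trillion yuan (~${total_usd/1e12:.1f} trillion)")

# Comparison with German reunification (~$2 trillion, 16 million people)
german_usd, german_people = 2e12, 16e6
print(f"Cost ratio:   {total_usd/german_usd:.1f}x German reunification")
print(f"People ratio: {people/german_people:.0f}x German reunification")

# Broad Group Sky City alternative: 500 towers at $628M, 30,000 residents each
print(f"500 Sky Cities: ${500*628e6/1e9:.0f} billion, housing {500*30_000/1e6:.0f} million")
```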

In this context, suddenly abolishing household registration is unfeasible. Even if hospitals and schools could be built overnight, there is no way to pay for them. Instead, a gradual approach is more realistic. While financially challenging, it should be possible to implement such social integration over two decades. Anything faster is beyond China's financial capacity but anything slower may compound social discontent.
