January 16, 2016

NASA has formalized its Planetary Defense Coordination Office to detect and mitigate asteroid threats

NASA has formalized its ongoing program for detecting and tracking near-Earth objects (NEOs) as the Planetary Defense Coordination Office (PDCO). The office remains within NASA's Planetary Science Division, in the agency's Science Mission Directorate in Washington. The office will be responsible for supervision of all NASA-funded projects to find and characterize asteroids and comets that pass near Earth's orbit around the sun. It will also take a leading role in coordinating interagency and intergovernmental efforts in response to any potential impact threats.

More than 13,500 near-Earth objects of all sizes have been discovered to date -- more than 95 percent of them since NASA-funded surveys began in 1998. About 1,500 NEOs are now detected each year.

The Panoramic Survey Telescope & Rapid Response System (Pan-STARRS) 1 telescope on Maui's Mount Haleakala, Hawaii, produced the most near-Earth object discoveries of any NASA-funded NEO survey in 2015. Image credit: University of Hawaii Institute for Astronomy / Rob Ratkowski

China plans 60 MWe modular nuclear reactor by 2020 and a floating reactor by 2025

China General Nuclear (CGN) expects to complete construction of a demonstration small modular offshore multi-purpose reactor by 2020, the company announced yesterday.

CGN said development of its ACPR50S reactor design had recently been approved by China's National Development and Reform Commission as part of the 13th Five-Year Plan for innovative energy technologies.

The company said it is currently carrying out preliminary design work for a demonstration ACPR50S project. Construction of the first floating reactor is expected to start next year, it said, with electricity generation to begin in 2020.

The 200 MWt (60 MWe) reactor has been developed for the supply of electricity, heat and desalination and could be used on islands or in coastal areas, or for offshore oil and gas exploration, according to CGN.

CGN promotes the advantages of a small modular marine reactor

ACPR SMR: Safe, Flexible, Efficient Advanced Small Pressurized Water Reactor
  • Multi-purpose small pressurized water reactor independently developed by CGN
  • Adopts advanced safety design concepts, satisfying the safety requirements for eliminating off-site emergency planning
  • Suited to small-scale grids and the combined supply of heat, electricity, water and steam, as well as marine energy
  • ACPR50S: marine small modular reactor
  • Mature technology: compact reactor design combined with mature marine engineering technologies
  • High safety level: a combination of active and passive safety systems, taking advantage of seawater for cooling and shielding
  • Economical and practical: a long-cycle refueling scheme makes it more competitive than conventional marine energy sources; it could serve as a comprehensive energy supply station for marine exploitation, meeting demand for electricity, heat, water and steam at sea

The Chinese company said it is also working on the ACPR100 small reactor for use on land. This reactor will have an output of some 450 MWt (140 MWe) and would be suitable for providing power to large-scale industrial parks or to remote mountainous areas.

ACPR100: Small Modular Reactor

Multi-purpose: customizable functions for distributed energy applications such as small and medium-sized grids, industrial heat and electricity supply, and urban heating.

Twitter followers and Twitter activity of the main 2016 presidential candidates

Donald Trump has 5.7 million Twitter followers.

Donald Trump has messages he wants to communicate, and he sends them out via Twitter in a real-time, conversational way.

Hillary Clinton has 5.17 million Twitter followers.

Clinton's feed includes some tweets that appear to be personal messages.

Ted Cruz has about 721,000 Twitter followers.

Ted Cruz's Twitter feed carries a lot of retweets of information from Fox News and other sources.

Bernie Sanders has 1.2 million Twitter followers.

Sen. Bernie Sanders, D-Vt., has remained firmly in the No. 2 seat behind Hillary Clinton for the Democratic nomination.

Sanders' Twitter feed is full of his messages about income inequality and social unfairness.

MIT researchers can recycle the light of incandescent bulbs, potentially making them nearly three times as efficient as LED lights

Researchers at MIT have shown that by surrounding the filament with a special crystal structure they can bounce back the energy that is usually lost as heat, while still allowing the light through. They refer to the technique as ‘recycling light’ because the energy that would usually escape into the air is redirected back to the filament, where it can create new light.

"It recycles the energy that would otherwise be wasted," said Professor Marin Soljacic.

Traditional incandescent bulbs are usually only about 5 percent efficient, with the other 95 percent of the energy lost as heat. In comparison, LED or fluorescent bulbs manage around 14 percent efficiency. But the scientists believe that the new bulb could reach efficiency levels of 40 percent.
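The efficiency figures above translate directly into power draw. A minimal sketch, using the article's rough numbers (~5 percent incandescent, ~14 percent LED, ~40 percent for the proposed design), of what each technology would need to match a 100 W incandescent bulb's light output:

```python
# Watts needed to match the visible light output of a 100 W incandescent bulb,
# using the rough efficiency figures quoted in the article.
def equivalent_power(reference_watts, reference_eff, target_eff):
    """Power a bulb of efficiency target_eff needs to emit the same light."""
    light_output = reference_watts * reference_eff
    return light_output / target_eff

print(equivalent_power(100, 0.05, 0.14))  # LED: ~35.7 W
print(equivalent_power(100, 0.05, 0.40))  # proposed recycled-light bulb: 12.5 W
```

Note the implied ratio: a 40-percent-efficient bulb would draw roughly a third of the power of a 14-percent-efficient LED for the same light output.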

And it shows colors far more naturally than modern energy-efficient bulbs. Traditional incandescent bulbs have a ‘color rendering index’ rating of 100, because they match the hue of objects seen in natural daylight. However, even ‘warm’ finish LED or fluorescent bulbs can only manage an index rating of 80, and most are far less.

Nature Nanotechnology - Tailoring high-temperature radiation and the resurrection of the incandescent source

Insilico is applying machine learning and GPU supercomputing to advance cures for cancer, Alzheimer's and aging

Discovering cures for cancer, for Alzheimer’s, for multiple sclerosis, for Parkinson’s, for the halting and reversing of aging itself, may not require the development of new drugs. It may mean discovering properties and therapies in drugs already developed and used for other diseases.

That’s the principle driving bioinformatics start-up Insilico Medicine, a Baltimore-based company utilizing GPU-accelerated NVIDIA advanced-scale computing to power deep learning neural nets on massive datasets for drug repurposing research that targets aging and age-related diseases.

Drug repurposing is not new. One of the best-known cases is rapamycin, a drug originally thought to be an antifungal agent before it became widely used in organ transplantation and then as a cancer fighter. Other companies have pursued drug repurposing as a development strategy, but Dr. Alex Zhavoronkov, Insilico's CEO, said his company is using big data analytics to scale the strategy to a level never previously attempted.

Taiwan has its first female President

Taiwan appears to have its first female President, in a landmark election that could unsettle relations with Beijing.

Eric Chu, the Nationalist Party candidate in Taiwan's presidential election, conceded defeat late Saturday and congratulated rival Tsai Ing-wen on her victory as the new president, state-run Central News Agency reported.

Tsai Ing-wen belongs to Taiwan's pro-independence Democratic Progressive Party (DPP).

Her supporters filled streets, waving party banners and cheering to victory announcements made from a stage.

The DPP has traditionally leaned in favor of independence for the island, which could anger Beijing: China views Taiwan as an integral part of its territory, to be retaken by force if necessary, and has missiles pointed at the island.

"I voted for DPP, because it's very critical time for the Taiwan people. We have our own democracy systems, we will not be influenced by China," said Tsai Cheng-an, a 55-year-old Taipei professor.

The KMT forged closer ties with China under President Ma Ying-jeou, a policy that recently drew street protests. The new president will take over from Ma, who will step down on May 20 after serving two four-year terms.

January 15, 2016

Much like white light, spacetime is also composed of a certain rainbow

When white light is passed through a prism, the rainbow on the other side reveals a rich palette of colors. Theorists from the Faculty of Physics, University of Warsaw have shown that in models of the Universe using any of the quantum theories of gravity there must also be a 'rainbow' of sorts, composed of different versions of spacetime. The mechanism predicts that instead of a single, common spacetime, particles of different energies essentially sense slightly modified versions thereof.

We have probably all seen the experiment: when white light passes through a prism it splits to form a rainbow. This is because white light is in fact a mixture of photons of different energies, and the greater the energy of the photon, the more it is deflected by the prism. Thus, we might say that the rainbow arises because photons of different energies sense the same prism as having slightly different properties. For years now it has been suspected that particles of different energies in quantum universe models essentially sense spacetimes with slightly different structures.
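In the rainbow-gravity literature, this energy dependence is usually written as a metric whose components carry functions of the particle's energy. As an illustration only, here is the standard Magueijo–Smolin parametrization, which is not necessarily the exact form derived by the Warsaw group:

```latex
ds^2 = -\frac{c^2\,dt^2}{f^2(E/E_{\mathrm{P}})} + \frac{dx^2}{g^2(E/E_{\mathrm{P}})},
\qquad f,\, g \to 1 \ \text{as} \ E/E_{\mathrm{P}} \to 0
```

where E_P is the Planck energy: low-energy particles have f, g ≈ 1 and so recover the ordinary, shared spacetime, while high-energy particles sense a slightly modified geometry.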

Arxiv - Rainbow metric from quantum gravity

US Navy looks at "three ways to kill everything", including hypervelocity projectiles and railguns

The US Navy is adopting a philosophy of increased lethality and "three ways to kill everything."

Hypervelocity Projectiles (HVP)

Among projects in the works for the US Navy is the development of new gun rounds, including the possibility of a smaller version of the electromagnetic projectile launching technology used by the rail gun weapon now in development. The rail gun, which can hurl a projectile at well over 5,000 miles per hour, is being evaluated for possible mounting on a Zumwalt-class destroyer by the mid-2020s.

"When we take that projectile with the rail gun, why not make it small enough to put in a five-inch round ... with a couple of hundred five-inch rounds that now can shoot something as far, almost as accurately as a rail gun?" Rear Admiral Peter Fanta suggested.

As the Navy was developing EMRG (electromagnetic railgun), it realized that the guided projectile being developed for EMRG could also be fired from 5-inch and 155mm powder guns. Navy cruisers each have two 5-inch guns, and most Navy destroyers each have one 5-inch gun. The Navy’s three new Zumwalt class (DDG-1000) destroyers, which are under construction, each have two 155mm guns.

BAE Systems states that HVP is 24 inches long and weighs 28 pounds, including a 15-pound payload. The total length and weight of an HVP launch package, BAE Systems states, is 26 inches and 40 pounds. BAE states that the maximum rate of fire for HVP is 20 rounds per minute from a Mk 45 5-inch gun, 10 rounds per minute from the 155mm gun on DDG-1000 class destroyers (called the Advanced Gun System, or AGS), and 6 rounds per minute from EMRG. HVP’s firing range, BAE Systems states, is more than 40 nautical miles (when fired from a Mk 45 Mod 2 5-inch gun), more than 50 nautical miles (Mk 45 Mod 4 5-inch gun), more than 70 nautical miles (155mm gun on DDG-1000 class destroyers), and more than 100 nautical miles (EMRG).
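For scale, the quoted ranges and the railgun's roughly 5,000 mph launch speed imply very short flight times. A rough sketch, assuming (optimistically) that the round holds its launch speed for the whole flight; in reality drag slows it, so these are best-case lower bounds:

```python
# Best-case HVP time-to-target at a constant ~5,000 mph, ignoring drag.
MPH_PER_NMI_PER_H = 1.15078                # statute miles per nautical mile
speed_nmi_per_s = (5000 / MPH_PER_NMI_PER_H) / 3600

for range_nmi in (40, 50, 70, 100):        # BAE's quoted ranges
    print(f"{range_nmi} nmi: ~{range_nmi / speed_nmi_per_s:.0f} s")
```

Even at the 100 nautical mile railgun range, the best-case flight time is under a minute and a half, which is part of why the Navy sees these rounds as useful against fast-moving air and surface threats.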

The Navy describes the HVP as “a next generation, common, low drag, guided projectile capable of completing multiple missions for gun systems such as the Navy 5-Inch, 155-mm, and future railguns.... HVP’s low drag aerodynamic design enables high velocity, maneuverability, and decreased time-to-target. These attributes coupled with accurate guidance electronics provide low cost mission effectiveness against current threats and the ability to adapt to air and surface threats of the future.”

Railgun tests and engineering to get to operational railgun by 2021

If the Navy does take the railgun out to sea on a fast transport, it will be in 2017 at the earliest. In lieu of testing the prototype rail gun in an at-sea environment, the Navy might instead proceed directly to developing an operational weapon system.

Zumwalt destroyer

General Atomics Electromagnetic Systems (GA-EMS) railgun projectiles were fired from the company’s Blitzer prototype railgun weapon during recent tests at the U.S. Army’s Dugway Proving Ground.

DARPA robotic sub hunter

DARPA has awarded the contracts for its robotic sub-hunter project, the ACTUV. ACTUV will be an unmanned surface warship tasked with locating and trailing hostile submarines and keeping tabs on their movements.

One day, ACTUV might be armed and assigned kill missions -- but for now the mission is strictly "Look, don't touch."

The USA will use robotic sub hunters to counter diesel-electric submarines with air-independent propulsion (AIP), which cost $100 million to $1 billion each. AIP submarines can stay underwater for months and can be quieter than even nuclear submarines.

DARPA awarded the ACTUV contract to prime contractor Leidos. Leidos is one half of the defense contractor formerly known as SAIC. (Note: The other half of that company inherited the company name, and remains SAIC today.) In awarding the contract, DARPA listed several requirements for ACTUV. Among them, ACTUV must be:

  • Cheap. It should be only "a fraction" of the size of a diesel sub, and a fraction of a sub's cost as well.
  • Long-legged. ACTUV will need to range "thousands of kilometers" across the seas, for "months" at a time.
  • Independent. Manned operators will have only "sparse" ability to keep tabs on ACTUV, so the vessel must be able to conduct its mission autonomously, robotically following all "maritime laws and conventions for safe navigation" even as it maneuvers to keep track of "an intelligent adversary."

ACTUV must be able to fulfill its mission, and maintain "robust continuous track of the quietest submarine targets over their entire operating envelope."

Leidos hired Raytheon to develop a Modular Scalable Sonar System (MS3) for ACTUV -- the "eyes" (or rather, ears) that it will use to identify and track enemy submarines -- for mounting aboard its trimaran prototype (construction of which was in turn subcontracted to Oregon Iron Works). Raytheon says it delivered a completed MS3 system to Leidos in November.

China gets first overseas military base but has a long way to go to catch up to the hundreds of US bases

China has sealed a deal to build its first military base in Djibouti, a former French colony strategically located across from Yemen on the Red Sea, squeezed between Eritrea and Somalia.

Confirming years of under-the-radar suspicions, AFRICOM commander Gen. David Rodriguez told The Hill that the "logistics hub" and airfield will let China "extend their reach" into Africa over the course of an initial 10-year contract. Currently, The Hill observed, China can't do much more than stage some naval patrols out of Djibouti ports.

* China gets its first overseas military base but has a long way to go to catch up to the hundreds of US bases
* China is the second largest economic power in nominal terms and the first in purchasing power parity
* Despite China's slowdown, it should still grow faster than the USA over the next few decades
* China will use trillions in investment and trade to gain more overseas bases and access
* China is building and will build many ports, airports and other global infrastructure
* China's military base presence will expand

As December's Forum on China-Africa Cooperation revealed, the Middle Kingdom wants to ensure privileged access to that kind of future. Although it's hard to unravel the details, Beijing used the Forum to pledge $60 billion in loans and export credits.

AFRICOM's (USA military command in Africa) top three priorities reach from one end of Northern Africa to the other:
  1. "neutralizing" the jihadist al-Shabab group in Somalia to the east,
  2. while "containing" enemies like ISIS in Libya and
  3. containing Boko Haram, to the west, in Nigeria and the greater Lake Chad region
Although al-Shabab's influence has been significantly reduced, nearby Ethiopia just booted the U.S. out of a drone base Washington had hoped to expand in the southerly town of Arba Minch. In other words, as China sets up shop in Djibouti, the U.S. finds itself restricted to that country for its eastern African operations.

Djibouti hosts the largest American permanent military base in Africa, Camp Lemonnier, which is home to more than 4,000 personnel - mostly part of the Combined Joint Task Force - Horn of Africa.

Even though France and Japan also launch operations from the Djibouti-Ambouli International Airport, it is China's military ambitions that are piquing interest.

Proposed Proto space colony

At Centauri Dreams, Gregory Matloff has some thoughts on the kind of space facility we can build with our current technologies.

The Bigelow B330 expandable station has been proposed for quite a while. The new analysis concerns the exact orbital location for a base and the use of two stations spun for simulated gravity.

We need to answer some fundamental questions regarding human life beyond the confines of our home planet. Will humans thrive under lunar or martian gravity? Can children be conceived in extraterrestrial environments? What is the safe threshold for human exposure to high-Z galactic cosmic rays (GCRs)?

To address these issues we might require a dedicated facility in Earth orbit. Such a facility should be in a higher orbit than the International Space Station (ISS) so that frequent reboosting to compensate for atmospheric drag is not required. It should be within the ionosphere so that electrodynamic tethers (ETs) can be used for occasional reboosting without the use of propellant. An orbit should be chosen to optimize partial GCR-shielding by Earth’s physical bulk. Ideally, the orbit selected should provide near-continuous sunlight so that the station’s solar panels are nearly always illuminated and experiments with closed-environment agriculture can be conducted without the inconvenience of the 90 minute day/night cycle of equatorial Low Earth Orbit (LEO).

Initial crews of this venture should be trained astronauts. But before humans begin the colonization of the solar system, provision should be made for ordinary mortals to live aboard the station, at least for visits of a few months’ duration.

Another advantage of such a “proto-colony” is proximity to the Earth. Resupply is comparatively easy and not overly expensive in the developing era of booster reuse. In case of medical emergency, return to Earth is possible in a few hours. That’s a lot less than a 3-day return from the Moon or L5 or a ~1-year return from Mars.

A Possible Orbital Location

An interesting orbit for this application has been analyzed in a 2004 Carleton University study conducted in conjunction with planning for the Canadian Aegis satellite project. This is a Sun-synchronous orbit mission with an inclination of 98.19 degrees and a (circular) optimum orbital height of 699 km. At this altitude, atmospheric drag would have a minimal effect during the planned 3-year satellite life. In fact, the orbital lifetime was calculated as 110 years. The mission could still be performed for an orbital height as low as 600 km. The satellite would follow the Earth’s terminator in a “dawn-to-dusk” orbit. In such an orbit, the solar panels of a spacecraft would almost always be illuminated.
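As a sanity check on that orbit, the standard two-body formula T = 2π√(a³/μ) gives a period of roughly 99 minutes at 699 km altitude, consistent with the dawn-to-dusk Sun-synchronous geometry described above:

```python
import math

# Circular orbital period at the study's 699 km altitude (two-body approximation).
MU_EARTH = 3.986004418e14   # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371_000         # m, mean Earth radius

a = R_EARTH + 699_000       # semi-major axis of the circular orbit, m
period_s = 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH)
print(f"~{period_s / 60:.0f} minutes per orbit")
```

The station would still circle the Earth about every 99 minutes; the advantage of the terminator-following orbit is that the spacecraft itself stays in sunlight, not that the orbit is slower.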

Two Bigelow B330 modules could be launched on one Falcon Heavy

The study of the adjustment of humans and other terrestrial life forms to intermediate gravity levels might be one scientific goal of the proposed 600-km habitat. The habitat should consist of two Bigelow B330 modules arranged in a dumbbell configuration, connected by a variable-length spar with a hollow, pressurized interior. The rotation rate of the modules around the center could be adjusted to provide various levels of artificial gravity. Visiting spacecraft could dock at the center of the structure.
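The required spin rate follows from the centripetal relation a = ω²r. A minimal sketch (the 50 m rotation radius is an illustrative assumption, not a figure from the proposal):

```python
import math

G0 = 9.81  # m/s^2, one Earth gravity

def rpm_for_gravity(radius_m, g_fraction):
    """Rotation rate (rpm) giving g_fraction * G0 at radius_m from the hub."""
    omega = math.sqrt(g_fraction * G0 / radius_m)  # rad/s, from a = omega^2 * r
    return omega * 60 / (2 * math.pi)

# Hypothetical 50 m spar half-length:
print(f"lunar gravity (0.165 g): {rpm_for_gravity(50, 0.165):.2f} rpm")
print(f"martian gravity (0.38 g): {rpm_for_gravity(50, 0.38):.2f} rpm")
```

Both figures come out to a few rpm, which is within the range usually cited as tolerable for spin gravity, so lunar- and martian-gravity studies at this scale look plausible.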

Image: The pressurized volume of a 20 ton B330 is 330m3, compared to the 106m3 of the 15 ton ISS Destiny module; offering 210% more habitable space with an increase of only 33% in mass. Credit: Bigelow Aerospace.

DARPA program aims to precisely spot single photons and explore the Fundamental Limits of Photon Detection

The process of detecting light—whether with our eyes, cameras or other devices—is at the heart of a wide range of civilian and military applications, including light or laser detection and ranging (LIDAR or LADAR), photography, astronomy, quantum information processing, medical imaging, microscopy and communications. But even the most advanced detectors of photons—the massless, ghostlike packets of energy that are the fundamental units of light—are imperfect, limiting their effectiveness. Scientists suspect that the performance of light-based applications could improve by orders of magnitude if they could get beyond conventional photon detector designs—perhaps even to the point of being able to identify each and every photon relevant to a given application. But is it even possible, within the laws of quantum physics, to definitively detect and identify every relevant photon—and to be confident that each detection signal is true and accurate?

DARPA’s Fundamental Limits of Photon Detection—or Detect—program aims to establish the first-principles limits of photon detector performance by developing new fully quantum models of photon detection in a variety of technology platforms, and by testing those models in proof-of-concept experiments.

January 14, 2016

Lightbridge making progress on improved nuclear fuel that will boost energy production by 17% in existing nuclear reactors

Lightbridge Corporation has received final regulatory approval for irradiation testing of its metallic fuel at Norway's Halden research reactor. The company has also entered an agreement with US fabricator BWXT Nuclear Energy to evaluate the possible fabrication of fuel samples at BWXT's US facilities.

Reston, Virginia-based Lightbridge announced on 12 January that the operator of the Halden reactor, the Institute for Energy Technology (IFE), had received approval from the Norwegian Nuclear Radiation Protection Authority (NRPA) for all planned irradiation of Lightbridge fuel, which is expected to begin in 2017.

Lightbridge’s all-metal fuel (AMF) assembly is composed entirely of metallic fuel rods and is capable of providing up to a 17% increase in power output in existing PWRs and up to a 30% power uprate in new-build PWRs operating on 18-month fuel cycles. Due to constraints on the size of equipment that can fit in the containment structures of existing PWRs, there are limits to the maximum power uprate existing PWRs can accommodate without changing their containment structure. However, a new-build unit can be constructed with a larger containment to allow for higher-capacity equipment at a relatively small increase in capital cost.

Lightbridge is developing three primary nuclear fuel product offerings for power uprates and longer fuel cycles:

  • LTB17-1024™ all-metal fuel for up to 10% power uprates and 24-month operating cycles in existing PWRs;
  • LTB17-1718™ all-metal fuel for up to 17% power uprates and 18-month operating cycles in existing PWRs; and
  • LTB17-3018™ all-metal fuel for up to 30% power uprates and 18-month operating cycles in new-build PWRs.
  • In addition, Lightbridge is developing LTB17-Th18™, a thorium-based seed-and-blanket fuel, which offers significant back-end advantages and enhanced proliferation resistance of used fuel.

Presently, the size of Lightbridge’s initial target market worldwide is approximately 127 GWe, and it is projected to grow to 261 GWe by 2030. The following chart shows a breakdown of Lightbridge’s estimated target market by market segment.

First genetically modified human embryos could be created in Britain within weeks, following 2015 GM embryos in China

The first genetically-modified human embryos could be created in Britain within weeks according to the scientists who are about to learn whether their research proposal has been approved by the fertility watchdog.

It was believed that scientists in China had already created genetically modified human embryos in early 2015.

Although it will be illegal to allow the embryos to live beyond 14 days or to be implanted into the womb, the researchers accepted that the work could one day lead to the birth of the first GM babies should the existing ban be lifted for medical reasons.

A licence application to edit the genes of “spare” IVF embryos for research purposes only is to be discussed on 14 January by the Human Fertilisation and Embryology Authority (HFEA), with final approval likely to be given this month.

Scientists at the Francis Crick Institute in London said that if they are given the go-ahead they could begin work straight away, leading to the first transgenic human embryos created in Britain within the coming weeks or months.

The researchers emphasised that the research concerns the fundamental causes of infertility and involves editing of the genes of day-old IVF embryos that will not be allowed to develop beyond the seven-day “blastocyst” stage – it will be illegal to implant the modified embryos into the womb to create GM babies.

Nvidia 8 teraflop deep learning supercomputer for self-driving cars

NVIDIA is applying its deep learning prowess to enable autonomous vehicles. The GPU vendor launched NVIDIA DRIVE PX 2, an autonomous vehicle development platform powered by the 16nm FinFET-based Pascal GPU, the named successor to Maxwell. Like last year’s DRIVE PX, the next-gen development platform targets NVIDIA’s automotive partners, a growing list that includes Audi, BMW, Daimler, Ford and dozens more.

Equipped with two Tegra SOCs with ARM cores plus two discrete Pascal GPUs, the new platform is capable of delivering up to 24 trillion deep learning operations per second — 10 times what the previous-generation product offered. In terms of general computing capability, the PX 2 offers an aggregate of 8 teraflops of single-precision performance, a four-fold increase over the PX 1. In addition to pertinent interfaces and middleware, the development platform includes the Caffe deep learning framework to run DNN models designed and trained on DIGITS, NVIDIA’s interactive deep learning training system.

Reprising a conversation he had with Elon Musk on stage at GTC15, NVIDIA CEO Jen-Hsun Huang noted that humans are the least reliable part of the car, responsible for most of the one million automotive-related fatalities each year. Thus, said Huang, replacing the human altogether would be a great contribution to society. Perception is the main issue, and deep learning is able to achieve super-human perception capability. DRIVE PX 2 can process input from 12 video cameras, plus lidar, radar and ultrasonic sensors. This 360-degree assessment makes it possible to detect objects, identify them and their position relative to the car, and then calculate a safe and comfortable trajectory.

Department of Energy ESnet will carry 100 petabytes of data per month in 2016

The Energy Sciences Network (ESnet) is the mission network of the U.S. Department of Energy. This high-performance, unclassified network, managed by Lawrence Berkeley National Laboratory, is moving into the newly constructed Wang Hall on the Berkeley Lab campus.

ESnet links 40 DOE sites across the country, as well as scientists at universities and other research institutions, via a 100 gigabit-per-second backbone network. One of these sites, the National Energy Research Scientific Computing Center (NERSC), has made the move to the Berkeley campus from its previous 15-year home in Oakland, California. ESnet has built a 400 gigabit-per-second (Gbps) super-channel between the Berkeley and Oakland sites to support this transition over the next year. This is the first-ever 400G production link deployed by a national research and education network, and it will also be part of a research testbed for assessing new tools and technologies needed to support massive data growth as supercomputers approach the exascale era.

ESnet carries around 20 petabytes of data monthly. The level of traffic over the ESnet network has increased an average of 10 times every 4 years, propelled by the rising tide of data produced by more powerful supercomputers, global collaborations that can involve thousands of researchers, and specialized facilities like the Large Hadron Collider and digital sky surveys. It’s expected that ESnet will need to carry over 100 petabytes of data per month by 2016.
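The quoted trend — about 10x growth every 4 years from today's roughly 20 PB/month — can be written as a simple exponential. A minimal sketch using only the article's figures:

```python
# ESnet traffic projection from the article's figures: ~20 PB/month today,
# growing roughly 10x every 4 years.
def projected_pb_per_month(years_from_now, base_pb=20.0, tenfold_every=4.0):
    return base_pb * 10 ** (years_from_now / tenfold_every)

print(projected_pb_per_month(2.8))  # crosses ~100 PB/month in under 3 years
print(projected_pb_per_month(4))    # ~200 PB/month after 4 years
```

Under that trend the 100 PB/month mark is reached in under three years, which is consistent with the network planning for it now.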

ESnet purchased almost 13,000 miles of dark fiber from a commercial carrier for DOE use. By creating a research testbed and lighting the dark fiber with optical gear, ESnet will enable network researchers to safely experiment with disruptive technologies that will make up the next-generation Internet, in a production-like environment at 100 Gbps speeds.

Rambus will work with Microsoft to create memory system for next generation quantum computers

Rambus Inc., the chip technology intellectual property vendor, is working with Microsoft Research on future memory requirements for quantum computing.

Rambus (NASDAQ: RMBS), Sunnyvale, Calif., confirmed just before the holidays it is collaborating with the software giant (NASDAQ: MSFT) on memory systems for next-generation quantum computing. The promising but largely untested technology is slowly gaining traction as current computing architectures run out of steam.

Rambus noted that memory requirements are being driven by soaring system demand as consumption of real-time data expands. That requirement “is driving the need to explore new high-performance, energy-efficient computer systems,” Gary Bronner, vice president of Rambus Labs, noted in a company blog post. “By working with Microsoft on this project, we can leverage our vast expertise in memory systems to identify new architectural models.”

The research partners said they would pool their resources to explore future computing architectures capable of enhancing memory capabilities for a range of future use cases. They also will explore how memory technologies can be used to boost overall system performance as data volumes skyrocket and the list of data sources expands.

Google dives into virtual reality with new division

Alphabet Inc.'s Google is focusing on virtual reality and moving the head of its product management team to run the new effort.

Clay Bavor, who joined Google in 2005 and rose to vice president of product, has taken on the title of vice president of virtual reality, according to Bavor's Twitter profile.

Bavor is no stranger to taking on big jobs at Google. As a top player in product management, he helped lead some of the company's most well-known apps, including Gmail, Google Docs and Google Drive.

For Google, these moves are aimed at making sure the company doesn't fall behind -- or further behind -- competitors like Facebook in the virtual reality arena. Facebook, for instance, has already begun taking pre-orders for its Oculus Rift virtual reality headset, with the device expected to begin rolling out in the first quarter of this year.

"Virtual reality is eventually going to be one of the big data interfaces and given Google is about data access, not having a focus on this could be a going-out-of-business strategy," said Rob Enderle, an analyst with the Enderle Group. "The market hasn't emerged yet, so there is time and Google has a great deal of reach. This may be the first step -- develop an expertise, then buy [related companies] to catch up."

January 13, 2016

New DARPA Chips Ease Operations In Electromagnetic Environs

Enhanced situational awareness could come from new chips that can sample and digitize battlefield radiofrequency signals at blazingly fast rates.

Competition for scarce electromagnetic (EM) spectrum is increasing, driven by a growing military and civilian demand for connected devices. As the spectrum becomes more congested, the Department of Defense (DoD) will need better tools for managing the EM environment and for avoiding interference from competing signals. One recent DARPA-funded advance, an exceptionally high-speed analog-to-digital converter (ADC), represents a major step forward. The ADC could help ensure the uninterrupted operation of spectrum-dependent military capabilities, including communications and radar, in contested EM environments. The advance was enabled by 32 nm silicon-on-insulator (SOI) semiconductor technologies available through DARPA’s ongoing partnership with GlobalFoundries, a manufacturer of highly-advanced semiconductor chips.

The EM spectrum, which spans wavelengths from trillionths of a meter (gamma rays) to multiple kilometers (radio waves), is an inherently physical phenomenon. ADCs convert physical (that is, analog) data on the spectrum into numbers that a digital computer can analyze and manipulate, an important capability for understanding and adapting to dynamic EM environments.

With the help of innovative new chips that can convert analog radar and other electromagnetic signals into processible digital data at unprecedented speeds, warfighters can look forward to enhanced situational awareness in the midst of battle.

Russia's Reserve Fund could be depleted in 2016 without massive budget cuts

Russia’s Reserve Fund, which it uses to plug gaps in the budget, has slumped 30 percent since the start of last year. Finance Minister Anton Siluanov warned Wednesday that the buffer may be depleted entirely in 2016 if the government doesn’t enact bold spending cuts.

The fund, which was built from windfall oil revenue, stood at $59.35 billion at the end of November. That compares with a five-year high of $91.72 billion in August 2014. Russia’s budget is based on an average oil price of $50 per barrel, while Brent is trading near the lowest level in 12 years, slightly above $31. December fund figures are due to be released on Wednesday.

Twitter reports that Gravitational Waves have been found and LIGO Observatory researchers are writing a paper

Lawrence Krauss, a cosmologist at Arizona State University, tweeted that he had received independent confirmation of a rumour that has been in circulation for months, adding: “Gravitational waves may have been discovered!!”

The excitement centers on a longstanding experiment known as the Advanced Laser Interferometer Gravitational-Wave Observatory (Ligo) which uses detectors in Hanford, Washington, and Livingston, Louisiana to look for ripples in the fabric of spacetime.

According to the rumors, scientists on the team are in the process of writing up a paper that describes a gravitational wave signal. If such a signal exists and is verified, it would confirm one of the most dramatic predictions of Albert Einstein’s century-old theory of general relativity.

Krauss said he was 60% confident that the rumor was true, but said he would have to see the scientists’ data before drawing any conclusions about whether the signal was real or not. Researchers on a large collaboration like Ligo will have any such paper internally vetted before sending it for publication and calling a press conference.

So this is pre-buzz for a research paper that may not be any good or conclusive.

Einstein predicted that the waves would be produced in extremely violent events, such as collisions between two black holes. As gravitational waves spread out, they compress and stretch spacetime. The ripples could potentially be picked up by laser beams that measure minute changes in the lengths of two 4km-long pipes at the Ligo facilities.

Gabriela Gonzalez, professor of physics and astronomy at Louisiana State University, and the spokesperson for the LIGO collaboration, told the Guardian: “The LIGO instruments are still taking data today, and it takes us time to analyse, interpret and review results, so we don’t have any results to share yet.”

Physicists working with LIGO looked for the waves from 2002 to 2010 with the initial incarnation of the observatory, which consists of two gargantuan L-shaped optical instruments in Hanford, Washington, and Livingston, Louisiana. To detect the stretching of space itself, researchers compare the lengths of an interferometer's two 4-kilometer-long arms to within a billionth of the width of an atom.

From 2010 to 2015, LIGO researchers completely rebuilt their instruments, aiming to make them up to 10 times more sensitive. They resumed their hunt for a fleeting source of gravitational waves on 18 September 2015. Then the rumor mill revved up.

By mid-September 2015, "the world's largest gravitational-wave facility" had completed a 5-year US$200-million overhaul, bringing its total cost to $620 million. LIGO is the largest and most ambitious project ever funded by the NSF.

Its sensitivity will be further enhanced until it reaches design sensitivity around 2021.

January 12, 2016

3D printing and advanced computer modeling was used to develop China Y-20 military transport aircraft

The Xian Y-20 is China's new large military transport aircraft. The Y-20 is the first cargo aircraft to use 3-D printing technology to speed up its development and lower its manufacturing cost. Model-based definition (MBD) is also used; the Y-20 is the third aircraft in the world to utilize MBD technology, after the Boeing 787 (2005) and Airbus A380 (2007). The implementation of MBD greatly shortened the time required: without MBD, installation of the wings takes a month or two, but with MBD adopted, the time drops to just a few hours. In general, design work was reduced by 40%, preparation for production by 75%, and the manufacturing cycle by 30%. In addition to 3-D printing, the Y-20 is also the first aircraft in China to adopt associative design technology (ADT) in its development, and the second in the world to do so, after the Boeing 787. The adoption of ADT shortened development time by at least eight months, and a modification of the wing design that previously took a week now takes half a day.

* the shortest take-off distance of Y-20 is 600 to 700 meters
* The first Y-20 prototype is powered by four 12-ton-thrust Soloviev D-30KP-2 engines; early production units are likely to be similarly powered. The Chinese intend to replace the D-30 with the 14-ton-thrust WS-20, which is required for the Y-20 to achieve its maximum cargo capacity of 66 tons

The cost is about $160 million for each one.

US 2016 ship and submarine procurement

Summary of US 2016 Aircraft Weapons Procurement

AI is not needed to address Population, Climate Change, Human Development and Education

Google’s chairman Eric Schmidt thinks artificial intelligence will let scientists solve some of the world’s "hard problems," like population growth, climate change, human development, and education.

Rapid development in the field of AI means the technology can help scientists understand the links between cause and effect by sifting through vast quantities of information, said Eric Schmidt, executive chairman of Alphabet Inc., the holding company that owns Google.
“AI will play this role to navigate through this and help us.”

It can also aid companies in designing new, personalized systems. In the future, Schmidt would like to see “Eric and Not-Eric,” he said at a conference in New York, where “Eric” is the flesh-and-blood Schmidt and “Not-Eric is this digital thing that helps me.”

Nextbigfuture's position on population growth, climate change, human development, and education is that AI is not needed.

Larger families in Africa are the main remaining driver of higher population growth. Population going to 11-12 billion by 2100 is because Africa will go from 1 billion to 5 billion people. Asia's population has flattened other than in India and some South Asian countries. The developed world and China have flat to declining populations.

A lot of the larger-family issues in Africa will be greatly reduced when people have healthcare and public health infrastructure like clean water, sanitation and vaccination. If people have over 99% confidence that their children will survive, they will not have 5-6 children to ensure that they end up with 2 that grow into adults.

Climate change and CO2 emissions are an unintended byproduct of industrialization. We use coal and fossil fuels for energy and industry, and the carbon combines with oxygen when it is burned to make CO2. We use about 8 billion tons of coal and about another 4 billion tons of oil each year. Since CO2 is one carbon plus two oxygens, the weight of the CO2 produced is about 40 billion tons. Burning literal mountains of carbon each year also means billions of tons of soot and particulates: incomplete burning produces air pollution that blackens white icecaps and other surfaces, and that darkening increases the heat that is absorbed rather than reflected. The soot problem produces almost as much heating effect as the CO2, and it also kills more people because of the damage to hearts and lungs. Soot and particulates kill over 4 million people each year and make billions of people less healthy, which increases medical costs. The soot and particulate problem is ten to twenty times cheaper to address and can provide results in a few years rather than decades.
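The tonnage arithmetic above can be sanity-checked in a few lines. As a simplifying assumption, the ~12 billion tons of coal plus oil is treated here as if it were pure carbon, which slightly overstates the result.

```python
# Rough check of the CO2 tonnage above. Treating the ~8 Gt of coal
# and ~4 Gt of oil burned per year as pure carbon overstates things
# a little, since neither fuel is pure carbon.
C_MASS = 12.011                               # atomic mass of carbon
O_MASS = 15.999                               # atomic mass of oxygen
CO2_PER_C = (C_MASS + 2 * O_MASS) / C_MASS    # mass ratio, ~3.66

carbon_gt = 8 + 4                             # gigatons burned per year
co2_gt = carbon_gt * CO2_PER_C
print(f"~{co2_gt:.0f} Gt of CO2 per year")    # ~44 Gt, in line with the ~40 Gt above
```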

Again, no AI is required to diagnose the problem and prescribe solutions. When people want to spend many trillions addressing CO2 to get some effect in several decades on a supposedly urgent issue, while far less money spent targeting soot and particulates would act faster, it shows that people are not really trying to solve the big problems.

A 2013 study indicated that the role of soot in climate change is twice as large as previous estimates: soot has 66% of the impact of carbon dioxide. Mitigating soot would cost about $6 per ton of CO2 equivalent, while CO2 mitigation costs about $100 per ton. Nextbigfuture has frequently written that soot is the most cost-effective emission target for managing climate. It is also the one with the fastest results: carbon dioxide mitigation does not impact temperatures for 50-80 years. Fully mitigating soot can also save 1-2 million lives by avoiding the disease from soot pollution.

Is AI supposed to make up for human corruption and unwillingness to take obviously better approaches and do basic analysis?

Project Pacer And the Unbuilt Pacer Economy Part 2: Ralph Moir's 2 Kiloton Variant and the Future of Pacer

This is the second of two articles. The first, "Project Pacer And the Unbuilt Pacer Economy Part 1", gave some history of the possible near-term need for D-D fusion neutrons to maximize fission fuel supplies. This second part covers more modern redesigns of the concept, notably by Ralph Moir in the 1990s, and the fact that the vast industries Pacer enables were never built. We also discuss some ideas about the future of Pacer.

In the first post of this series, http://nextbigfuture.com/2016/01/project-pacer-and-unbuilt-pacer-economy.html#more, we discussed the general question of how long earthly fissionables might last and what an aid D-D fusion could be in stretching those supplies into geological time.

 The 1975 Los Alamos Progress Report on Project Pacer was covered in the first part of this article and can be found here:  http://permalink.lanl.gov/object/tr?what=info:lanl-repo/lareport/LA-05764-MS

There have been various incarnations of Project Pacer, including before it was formally born. From 1957 on, a lot of people (especially those working in the nuclear weapons complex in the USA) began thinking about how to get power out of nuclear explosions. By that time, only 15 years after the construction of the first artificial uranium reactor in 1942, we knew how to make uncontrolled fusion reactions, and in fact had been doing so for 5 years already: the hydrogen bomb.

 In a Project Pacer containment facility underground, you would not see mushroom clouds but this gives you a feel for the yields under consideration (as well as a reminder why you want them underground):


Remember that the first considerations were for 1 megaton devices. Project Pacer in 1975 was thinking about 800 individual 50 kiloton devices yearly (40 megatons explosive equivalent yearly, or around 2 gigawatts electrical at 30% efficiency), and Ralph Moir's 2 kiloton device based variant is considered below.
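The yearly-yield-to-gigawatts conversion above can be checked with a few lines, using 1 megaton = 4.184 million gigajoules and the 30% thermal-to-electric efficiency the text assumes:

```python
# Converting 800 x 50 kt per year into average electrical power.
GJ_PER_MEGATON = 4.184e6
SECONDS_PER_YEAR = 365.25 * 24 * 3600

yearly_yield_mt = 800 * 0.050                  # 40 megatons per year
thermal_gw = yearly_yield_mt * GJ_PER_MEGATON / SECONDS_PER_YEAR
electric_gw = 0.30 * thermal_gw                # 30% thermal-to-electric
print(f"{thermal_gw:.1f} GW thermal -> {electric_gw:.1f} GW electric")
```

This comes out at about 1.6 GWe, which the text rounds to "around 2 gigawatts".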

Were larger explosions ever considered? I'm sure they must have been; I consider them in this article.

* 1 megaton device at $10 million: 4,184,000 gigajoules, or $2.39 per gigajoule, about the same price as coal (29.3 GJ/ton coal at $80/ton gives $2.73 per gigajoule)
* 10 megaton device at $10 million: 41,840,000 gigajoules, or $0.239 per gigajoule, 1/10th the cost of coal
* 100 megaton device at $10 million: 418,400,000 gigajoules, or roughly 1/100th the cost of coal

The takeaway is that anything much below a megaton device is very expensive relative to the larger sizes.

If the devices were free, the deuterium being the only cost, the price would be 1/1891 that of coal at $80 ton (given Deuterium at $500/kilogram)

This basically comes to the conclusion that, assuming 'weapons complex like' costs per device, nothing much under a megaton makes economic sense.
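The per-gigajoule numbers above can be recomputed directly; the $10 million device cost and the coal figures (29.3 GJ/ton at $80/ton) are the assumptions from the text.

```python
# Recomputing the per-gigajoule device costs versus coal.
GJ_PER_MEGATON = 4.184e6
DEVICE_COST = 10e6                             # $10 million per device
coal_per_gj = 80 / 29.3                        # ~$2.73 per GJ

for yield_mt in (1, 10, 100):
    per_gj = DEVICE_COST / (yield_mt * GJ_PER_MEGATON)
    print(f"{yield_mt:>3} Mt: ${per_gj:.4f}/GJ, {per_gj / coal_per_gj:.2f}x coal")
```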

I should explain that phrase: a lot of government operations ramp up costs continually and every few years allocate the expanded overhead to each of their subprograms. It is a huge problem in government and Western civilization in general, and it comes down to a kind of organizational hardening of the arteries, where more and more financial and time input is required for less and less result.

Thus today we spend more time and money studying and building the latest government large launch vehicle than a much newer and leaner NASA, working with far less equipment in worse conditions, spent actually building the real Saturn V; and the new rocket has not yet flown and may never.
http://www.space.com/24628-will-spacex-kill-nasa-sls.html We killed the Saturn V to save what Wikipedia characterizes as a cost per launch of $494 million in 1964-73 dollars ($3.2 billion present day).
And today we pay for a development budget as big as the one that launched it, without actually having anything to launch.
The same goes for fusion reactors and for the nuclear weapons complex. The US Air Force in 1960 could not hit targets as accurately as we can today with better equipment, but it could generate more sorties on less notice for less real money, and so on. In 1959 the US government could build 20 nuclear weapons a day at probably a tenth the real cost each that merely rebuilding a single weapon would take today. Similarly, there is a fusion complex of overhead to pay for even though there are no working fusion reactors.

All this is a roundabout way of saying that if you can build, explode, and recycle the isotopes of nuclear devices for $1000 each, you can make money running a 2 KT Pacer. If it takes $25,000 to build, explode, and recycle the isotopes, you can make money with a 50 KT Pacer (assuming you can keep the cavity engineering cost down, which tends to be easier with the smaller sizes than the larger ones). If it takes $25 million per device to build, explode, and recycle the isotopes, you will lose money unless you are blowing off 50 megaton Tsar Bombas twice a day.
On the raw power of a 50 megaton bomb-- http://nextbigfuture.com/2013/01/friedlander-on-wang-bullet-and-on.html
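Dividing device cost by yield shows that the three break-even scenarios above are really one number: roughly $500 per kiloton, or about $0.12 per gigajoule of heat (1 kt = 4184 GJ). On my reading, the profit-or-loss difference between them comes from the cavity engineering at each scale, not from the explosive's cost per unit of energy.

```python
# Device cost per gigajoule of heat for the three scenarios above.
GJ_PER_KILOTON = 4184
scenarios = [
    ("2 kt Pacer", 1_000, 2),                  # $1000 device, 2 kt
    ("50 kt Pacer", 25_000, 50),               # $25,000 device, 50 kt
    ("50 Mt Tsar Bomba", 25_000_000, 50_000),  # $25 million device, 50 Mt
]
for name, cost, yield_kt in scenarios:
    print(f"{name}: ${cost / (yield_kt * GJ_PER_KILOTON):.3f}/GJ")
```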

(There is a concept: 800 Tsar Bombas per year, output 2 terawatts electrical, not gigawatts. Wow. If the devices were 97% fusion or better, like the original, that is enough to produce 9300 kg of U-233 per shot. Wow.) I am not sure how you would leakproof the piping in a Tsar Bomba-like Pacer, but what's a little local earthquake between friends? Unless you have AB-Matter available: http://nextbigfuture.com/2011/11/starbase-jupiter-and-other-femtotech.html

If you had AB-Matter to build a super-Pacer, then yes, you could make huge money on the 50 megaton units. But if you had AB-Matter you wouldn't NEED to do that, because there are enormously more profitable things you could do with AB-Matter. Sigh. Back to the article.

It's not just that the 50 megaton versions shake the ground for 10 times the radius; the real reason they would be very hard to build is that the wider the unsupported arch of the roof, the easier it is to collapse. THAT (tightness of the chamber against 800 thermonuclear blasts a year for decades) is why salt domes (highly plastic) were considered for the 50 kiloton version of Pacer in 1975, and why the Ralph Moir version is only 2 kilotons (smaller, tighter, easier to engineer).

EVERY Pacer unit involves local earthquakes. (A number of kilometers away they are not noticeable to anyone without a seismograph.)

In the last article, the 1975 report stated: "Seismic effects, which are peculiar to PACER, are kept small: the baseline 2000 MWe station produces 'thumps' so small as to go unnoticed a few miles from the site."
I believe I was saying that if you can keep your costs down, you can make money building smaller versions of Pacer.

Remember it is the TOTAL complex cost, including the bombs, that adds up. You got overhead? It goes on the bill. Bombs go on the bill; piping problems go on the bill. That visitor center? On the bill. To keep it economical, keep expenses down.

This is of course the exact opposite of the approach to fusion taken by ITER, which is basically free organizational money forever, with no binding deliverables and no thought of actual financial payback ever, except as a theoretical far-future exercise whenever a complaining government asks about it. http://news.sciencemag.org/europe/2015/11/breaking-iter-fusion-project-take-least-6-years-longer-planned (This is not to diss their work or to say they are not trying serious approaches to breakeven, just that I don't believe future practical working fusion reactors will claim direct descent from ITER.)

Iter is Latin for "the way". ITER claims it is the way to new energy. But one premise of this pair of articles is that perhaps PACER was 'the way' that was never seriously considered, because being anti-nuclear and anti-Project Plowshare was cool in the 70s, even before Three Mile Island. President Carter was trained as a nuclear engineer but cut fusion research, and part of that was ending the Plowshare Project under which PACER began. (Google "Carter cut fusion research" and see what you get.) However, those who hate Republicans can read this: http://www.huffingtonpost.com/2015/01/20/fusion-energy-reactor_n_6438772.html The key thing is that I am confident some version of PACER would work, and Carter's budget cuts ended it; by the time the Reagan administration cut magnetic fusion research, PACER was long gone.

Another key thing about Pacer is that you have to conserve the surface of the blast chamber, so you keep the temperature lower than you'd like and the cavern bigger than optimum from a heat point of view, because you want to minimize radiation and wall contact with the fireball, not maximize it. (AB-Matter, of course, would be immune to both.)

The chamber will move but needs to recover its preshot position; the surrounding matrix (rock, salt, whatever) will take shocks but needs to stabilize and be sealed. You need to crank out many, many bombs a year, and they frankly cannot have much at all in common with regular nuclear bombs: no expensive detonators inside, and ideally primaries fired externally, say through gas guns, electromagnetic implosion of a metal collar, or other exotic means, with all expensive parts kept out of the blast chamber.

No bomb can be independently triggerable; the complex itself is the trigger, so there is no stolen-bomb risk, etc. (Note that the fissionables are by definition usable in weapons, and they can NEVER leave the complex except down-blended, as U-233 bred from thorium-232 and mixed with U-238 down to reactor grade.)

If we produce Pu-239 it is supergrade, the best bomb material there is, and the same goes for the U-233 (https://en.wikipedia.org/wiki/Thorium_fuel_cycle#Uranium-232_contamination), because the clean D-D generation of neutrons avoids the long reactor dwell time in the messy fission environment that irradiates daughter isotopes over time. Here mother isotopes are irradiated directly, quickly, and finally.

The neutrons, the blasts, everything will eventually weaken the chamber. The blast chamber needs to be either regenerable or end-of-life disposable, all underground. You can't have leaks that get through all layers of defenses, whatever they be: not in the pipes, not in the blast chamber.

 Yet you can't spend too much on mega-engineering your way out of impossible situations or it becomes too expensive.

 I am pretty sure that Ralph Moir thought about it a bit and realized that solvable small problems  held more promise than huge profits divided by huge expenses.

But there are still big problems, because of chemistry, physics, radiochemistry, and all stops in between. All pieces of the former bombs will go into the chamber. Various working fluids are possible, some of which are optimized for tritium extraction (water, on the other hand, entrains tritium as tritiated water). All kinds of isotopes will be produced, although if the bomb parts are well insulated from the chamber these can be a greatly reduced subset of what is otherwise possible. The interaction of whatever working fluids are used with the bomb pieces and the radiation will generate complexity.

The huge and singular advantage of Pacer, of course, is that OFF means OFF. Not that all radiation is gone but that no runaway nuclear reaction is possible once you refuse to load another bomb.

There is no huge flammable inventory of graphite or hydrogen gas as in other reactor designs, or of liquid sodium (unless you design it that way; I have never read of it in any of the PACER literature).

From the point of the last detonation, all radioactivity begins to decay, and however horrible a possible emergent situation, it is (if you have sited correctly) already buried underground and under the water table. Rare indeed are the situations involving a Pacer unit that you could not simply turn your back on (physically, not necessarily financially).

FLiBe  https://en.wikipedia.org/wiki/FLiBe
 Here is Wiki data and pictures

Molten FLiBe flowing; this sample's green tint is from dissolved uranium tetrafluoride.

The 2:1 mixture forms a stoichiometric compound, Li2BeF4, which has a melting point of 459 °C, a boiling point of 1430 °C, and a density of 1.94 g/cm3. Its volumetric heat capacity is 4540 kJ/(m3·K), which is similar to that of water, more than four times that of sodium, and more than 200 times that of helium at typical reactor conditions.
The low atomic weight of lithium and beryllium, and to a lesser extent fluorine, makes FLiBe an effective neutron moderator. As natural lithium contains ~7.5% lithium-6, which tends to absorb neutrons, producing alpha particles and tritium, nearly pure lithium-7 is used to give the FLiBe a small cross section; e.g., the MSRE secondary coolant was 99.993% lithium-7 FLiBe.
Beryllium will occasionally disintegrate into two alpha particles and two neutrons when hit by a fast neutron.
In the liquid fluoride thorium reactor (LFTR) it serves as solvent for the fissile and fertile material fluoride salts, as well as moderator and coolant.
Some other designs (sometimes called molten-salt cooled reactors) use it as coolant but have conventional solid nuclear fuel instead of dissolving it in the molten salt.
The liquid FLiBe salt was also proposed as a liquid blanket for tritium production and cooling in the compact tokamak reactor design by MIT.
Ampoules of FLiBe with uranium-233 tetrafluoride: solidified chunks contrasted with the molten liquid.

Purified FLiBe. Originally ran in the secondary loop of the MSRE.
There’s an opening for a new lithium-7 extraction process. However, any company attempting such a development will have to work under the watchful eye of DOE’s Y-12 group.

http://www.fusion.ucla.edu/apex/meeting4/5sze0798.pdf WHAT DO WE KNOW ABOUT FLIBE?
https://www.iaea.org/INPRO/CPs/COOL/2nd_Meeting/Literature_Summary-IAEA.pdf molten salt for reactor use

Ralph Moir Pacer Links -- 


Peaceful nuclear explosives to make electrical energy:
PACER Revisited

A study and project called Pacer suggested exploding a 20 kton nuclear explosive in a steam filled, earth walled cavity once every three hours to produce 1000 MWe of power. In a series of papers this idea was revisited replacing the steam filled, earth walled cavity with a steel lined underground cavity using molten salt droplets to cushion the effects of the explosive and absorb its energy. The yield chosen was typically 2 ktons once every 20 minutes to produce the same 1000 MWe of power. If such explosives could be initiated with 20 tons [84 GJ] of fission yield for a total of 2 ktons then the resulting power system would be 1% fission and 99% fusion.
• R.W. Moir, “PACER Revisited,” Fusion Technology 15 (March 1989), 5 pages
• Call, Charles and R.W. Moir, “A Novel Fusion Power Concept Based on Molten-Salt Technology,” Nuclear Science & Engineering 104 (1990), 10 pages
• Szoke, Abraham and R.W. Moir, “A Practical Route to Fusion Power,” Technology Review (July 1991), 8 pages
• Szoke, Abraham and R.W. Moir, “A Realistic Gradual and Economical Approach to Fusion Power,” Fusion Technology 20 (December 1991), 10 pages
• Sahin, Sumer, R.W. Moir and S. Unalan, “Neutronic Investigation of a Power Plant Using Peaceful Nuclear Explosives,” Fusion Technology 26
Ralph Moir data on PACER from Neutronic analysis of a PACER reactor

The original PACER concept called for 20 kiloton charges in a 200 m diameter cavity, under 200 atmospheres pressure of 500 C steam, generating 1000 MWe with one bomb every 7 hours.
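The raw thermal power implied by these yield-and-interval combinations can be checked directly, using 1 kt = 4.184 TJ. (Both the 3-hour interval quoted in the abstract above and the 7-hour figure here appear in the sources, so both are shown.) These are gross fireball numbers, before any conversion losses.

```python
# Gross thermal power from yield and shot interval (1 kt = 4.184 TJ).
TJ_PER_KILOTON = 4.184

def thermal_gw(yield_kt, interval_s):
    # TJ per shot divided by seconds per shot gives TW; x1000 -> GW
    return yield_kt * TJ_PER_KILOTON * 1000 / interval_s

print(f"20 kt every 7 hours  : {thermal_gw(20, 7 * 3600):.1f} GW thermal")
print(f"20 kt every 3 hours  : {thermal_gw(20, 3 * 3600):.1f} GW thermal")
print(f"2 kt every 20 minutes: {thermal_gw(2, 20 * 60):.1f} GW thermal")
```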

The 50 kiloton concept of 1975 was detailed in the previous article http://nextbigfuture.com/2016/01/project-pacer-and-unbuilt-pacer-economy.html#more

Modified by Ralph Moir, the concept became 2 kiloton charges every 40 minutes in a 20 meter radius cylindrical cavity, engineered with a 1 cm thick stainless steel liner rock-bolted or otherwise joined to the natural earth interface.

FLiBe is 2 lithium, 1 beryllium, 4 fluorine. In the modified PACER architecture of Ralph Moir, jets of it (effective density 0.495 g/cm3 in the jet zone) shield the wall of the blast chamber.

At the moment of blast, 25% of the volume in the 'liquid zone' is molten FLiBe jets and 75% is void. They shield the 1 cm thick stainless steel blast chamber, which is 30 meters in radius (60 meters in diameter) and rock-bolted to the excavated wall.

To achieve a tritium breeding ratio of 1.15 per shot requires a FLiBe thickness of 2.0 meters, resulting in an energy density of 19,085 joules/gram.

With FLiBe over 2.5 meters thick, after 30 years of 800 blasts a year the steel wall (and the rock behind it) will have low enough activation that USNRC rules for shallow burial apply.

This change was instrumental in reducing cavity volume by a factor of 50 and peak pressure by a factor of 9.

The steam working fluid was replaced by molten salt jets absorbing energy and shocks, reducing the pressure right after the explosion to 3 megapascals or so (the heat evaporates the liquid).

Because tritium is nearly insoluble in the salt, the tritium inventory is reduced by a factor of 10^5, from 10 million curies to 100 curies.

The FLiBe working temperature would be 500 C (773 K).

Fusion reaction products: a 3.5 MeV alpha and a 14.1 MeV neutron.

The 40 cm thickness of rock behind the 1 cm of metal experiences the heaviest irradiation.

Viewed from the side, the pouring FLiBe zone begins 5 meters from the bomb and ends 30 meters away: 25 meters of 2 mm diameter shower sprays amid 75% void.

Each 2 kiloton shot is a neutron source with a strength of 2.95 x 10^24 neutrons per shot.
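That neutron count is consistent with roughly one neutron per 17.6 MeV of yield, the energy released per D-T fusion reaction:

```python
# One neutron per 17.6 MeV of yield reproduces the quoted source
# strength for a 2 kt shot (1 kt = 4.184e12 J).
MEV_TO_J = 1.602e-13
J_PER_KILOTON = 4.184e12

neutrons = (2 * J_PER_KILOTON) / (17.6 * MEV_TO_J)
print(f"{neutrons:.2e} neutrons per shot")     # ~2.97e24, vs 2.95e24 quoted
```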

Lithium-6 burnup is 37 grams per shot; the natural-lithium FLiBe inventory of ~170 tons contains 12 tons of lithium-6.
315,000 shots in a plant lifetime is 630,000 kilotons, equivalent to the yield of all nuclear tests ever.
Amazingly close; I wrote an article on that. The actual number of test detonations is under 3000, so the first PACER team rapidly becomes world champion in detonation count in about 4 years.
The yield total is apparently around 629 megatons: 452 megatons Soviet, 140 USA, 7 megatons UK, 10 megatons France and 20 megatons China, plus under a megaton for India, Pakistan, North Korea and everyone else.
Beryllium burnup from neutrons is 10 grams of Be per shot, 3 tons total over 30 years; the USA beryllium resource is ~150 kt. (If you take total crustal abundance it is about the same as uranium, so say 40 trillion tons in the world.)
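The lifetime totals follow from the per-shot figures quoted above; note the Li-6 total lands close to the quoted 12-ton Li-6 inventory.

```python
# Lifetime totals from the per-shot burnup numbers above.
SHOTS = 315_000                     # shots over the plant lifetime
li6_tons = 0.037 * SHOTS / 1000     # 37 g of Li-6 burned per shot
be_tons = 0.010 * SHOTS / 1000      # 10 g of Be burned per shot
yield_mt = 2 * SHOTS / 1000         # 2 kt per shot, in megatons
print(f"Li-6: {li6_tons:.1f} t, Be: {be_tons:.2f} t, total yield: {yield_mt:.0f} Mt")
```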
One cool thing about this paper is that the Pacer PNE reactor vessel at end of lifetime can be a disposal site for other nuclear waste, so theoretically you have an end-of-cycle profit center.

Ralph Moir data on PACER from


A PNE (Peaceful Nuclear Explosion, another name for PACER) plant would produce a tenth as much waste as a conventional fission plant even if the explosions were all fission. Over time, 90% fusion should be possible.

D-T is implied in the beginning, at least. Lithium is only in supply for some centuries in a D-T powered world, so deuterium is needed. You really need to conserve lithium for the molten salt and not waste it on breeding tritium that you can get from the harder-to-fuse deuterium. On the other hand, everything has a learning curve.

Engineers might wrap the fissile material in a cylindrical jacket through which they would pass a large electric current, squeezing the cylinder; rail guns and gas guns might also be used to keep bomb cost down and the charges more secure.

In the early 1960s physicist Albert Latter, then of the Rand Corp, devised a scheme called PACER (but did not plan to reprocess the unburned fuel) and assumed Hiroshima sized explosions.

PNE power using 1  kiloton  explosives would be economically competitive at $1000 per explosive including both the nuclear explosive and processing the nuclear materials.

An Oak Ridge National Laboratory estimate found that processing nuclear leftovers with a molten salt system could cost as little as $10 per kg of recovered uranium, which could translate to $100-500 per explosion.

Previous Project Plowshare estimates relied on custom rather than mass produced technology and were orders of magnitude more expensive than needed here.

To be economical the bombs have to be under $1000 each; that means radically different detonation and arming procedures, since you can't have expensive electronics aboard the bomb.
On the other hand, 25,000 identical units a year would amortize costs and bring mass-production economies, and the explosives would not have to be packaged or guarded outside the site. Plowshare also assumed no recycling of fuel; none of that is true here.

Test facilities could be made to withstand explosions of 30 to 300 tons while we engineer the actual unit.

If a leak did occur it would be well underground, small (the entire facility is small), detectable, and contained; the worst conceivable accident is no more than 1 percent of that from today's fission plants, since cleaning is done once a week versus once in 3 years.

Not lowering the next explosive would shut the plant down; no runaway dynamic exists even in potential. (The article mentions that Chernobyl's stock of hot graphite burned for days.)

Some waste would accumulate on the walls of the cavity. Decommissioning might mean filling it.

Over the experience curve the amount of fission in each explosive should decline.

Any existing nuclear weapons state could operate a PACER without gaining any knowledge of new bomb tech. Non-weapons states might have them operated under contract by weapons states.

Inspection is needed because of the potential of plutonium production even though Thorium U-233 is the logical cycle to choose because of the safety and down-blending potentials.

No ideal energy source for base-load electric power on Earth (as opposed to space) is in sight. Conservation is not an energy source, though it can moderate oncoming demand. Space-based approaches are not buildable on demand without some new tech. PACER is one answer.

The explosives would not be self-contained and transportable, and thus not immediately usable offsite as anything other than raw bomb material.

Many of the technical ideas in this article come from the High Energy Density Facility, to study matter at high density, pressure and temperature.

20 minute intervals between explosions would produce 3000 megawatts thermal and 1000 megawatts electrical.

Ralph Moir data on PACER from

The cost of reprocessing has been estimated at $600/kg in aqueous solutions but $10/kg in molten salt.
 Ralph Moir mentions here that one proposed way to store nuclear energy is to use the pulse to pump water uphill.
J. Pettibone, "A Novel Scheme For Making Cheap Electricity With Nuclear Energy," UCID-18153, 1979.
Modified search name: UCRL-JC--107068
http://www.osti.gov/scitech/servlets/purl/5784646 (PDF; the idea is from 1979, the PDF from 1991, UCRL-JC--107068)

In 1979 Joseph Pettibone conceived of a large water-piston external engine driven by a nuclear-weapon-like release of nuclear energy. [29, 30] Although this engine had obvious proliferation problems, it highlights the scientific feasibility and the economic advantages of eliminating the nuclear steam supply system. (www.gera-e.com/pdfs/GERA-15-01.pdf)

More on PACER by Ralph Moir:
If a fraction p is fissioned in the explosion, the amount of uranium used to run a 4 GW thermal reactor with a 1-day processing cycle is about 5/p kg. This has to be compared with the 4,800 kg plutonium inventory of the Super Phenix breeder reactor. (Translation: with a breeder reactor you reprocess infrequently; with PACER you reprocess literally between shots, so you carry only a few kilograms in inventory instead of tons -- an amazing improvement in nuclear site safety.)
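A sketch of that inventory arithmetic, with illustrative values of p (the fraction fissioned per explosion is not specified here, so the values below are my assumptions):

```python
def pacer_uranium_kg(p):
    """Moir's estimate: about 5/p kg of uranium in circulation for a
    4 GW-thermal PACER with a 1-day processing cycle, where p is the
    fraction of the charge fissioned per explosion."""
    return 5.0 / p

SUPER_PHENIX_PU_KG = 4800  # breeder reactor plutonium inventory, for contrast

for p in (0.05, 0.2, 0.5):  # illustrative values only
    kg = pacer_uranium_kg(p)
    print(f"p = {p}: {kg:.0f} kg on site "
          f"({SUPER_PHENIX_PU_KG / kg:.0f}x less than Super Phenix)")
```

Even a pessimistic p of 5 percent leaves the on-site fissile inventory almost fifty times smaller than the breeder's.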

Ralph Moir data on PACER from
Steam entrains tritium and becomes radioactive.
Tritium is hard to separate from steam.

Vented tritium would pose political and health hazards.

Key differences in Ralph Moir's version of PACER:
The Moir reboot of PACER changes the chamber from an unlined salt cavity, spherical, of 100 m radius, to a steel-lined cavity, cylindrical, of 20 to 50 m radius and 60 to 150 m height.
The charge yield went down from 20 kilotons to 1 to 10 kilotons.
Ambient pressure: from 20 MPa (200 atmospheres) to 0.1 MPa (1 atmosphere).
Equilibrium pressure after the explosion: from 26 MPa to 3 MPa.
Tritium inventory in the cavity: from over 10 million curies to 100 curies.
Fluid inventory: from 330,000 tons of steam to 1,000 to 10,000 tons of FLiBe salt.

Lining the cavity with steel gives a predictable seal and predictable properties under hundreds of thousands of explosions.
Using molten salt rather than steam as the working fluid means about 70 percent of the energy is then absorbed, and the contained pressure is reduced by a factor of 3 or more.
The tritium produced is insoluble in molten salt, so it can be pumped away and purified.
The tritium inventory is one hundred-thousandth of what it was, so the risk from accidental venting is much reduced.
Smaller yield means reduced cost for the cavity.
The Sterling test, detonated in the cavity left by the Salmon test (the only nuclear tests in Mississippi), proved that decoupling worked and demonstrated the practicality of salt cavities.

This picture is from Wikipedia.

These were the only nuclear weapons test detonations known to have been performed in the eastern United States.
Two underground detonations, a joint effort of the US Atomic Energy Commission and the US Department of Defense, took place under the designation of Project Dribble, part of a larger program known as Vela Uniform (aimed at assessing remote detonation detection capabilities). The first test, known as the Salmon Event, took place on October 22, 1964. It involved detonation of a 5.3 kiloton device at a depth of 2,700 feet (820 m). The second test, known as the Sterling Event, took place on December 3, 1966 and involved detonation of a 380 ton device suspended in the cavity left by the previous test. Further non-nuclear explosive tests were later conducted in the remaining cavity as part of the related Project Miracle Play.
There is no experience base of hundreds of thousands of explosions in salt caverns (but there is fatigue experience with steel).

The steel liner stops impurities from the earth from contaminating the molten salt.
Eutectic FLiBe is 67.1 percent BeF2 and 32.9 percent LiF by weight, with a melting point of 363 degrees Centigrade; a small amount of ThF4 is added to this.
The fuel charge is surrounded by beryllium and thorium fluoride. The FLiBe vaporizes, absorbing all the energy released, while U-233 and tritium are bred from the thorium.
The cavity is modeled as a cylinder of arbitrary height, with a geometric factor of 4.67 times the radius.
Vertical jets stream molten FLiBe droplets of about 2 mm diameter around the charge.
In a cylindrical cavity of 20 m radius with a 4 TJ fuel charge exploded: with no evaporation, the pressure is 10.4 MPa; with 90 percent of the energy absorbed by vaporization, 2.9 MPa; with 100 percent absorbed by vaporization, 2.2 MPa. Although D-D is assumed in the 50 kT PACER, tritium may be used up to a 50-50 D-T mix in this model.
The D-T charge is surrounded by solid beryllium up to 20 cm thick, which absorbs and multiplies the neutrons from D-T.
The breeding configuration is based on the fission-suppressed concept, to maximize the number of neutrons per unit energy.

If D-D is used, beryllium is not necessary: D-D reactions create even more excess neutrons per unit energy than D-T reactions in beryllium.

Beryllium metal is not soluble in FLiBe.
Keeping the fluorine ratio high gives the beryllium a chance to convert to BeF2 so that it can dissolve in FLiBe.
Tritium has low solubility in FLiBe, so it can be pumped out with a cryopump -- a few cubic meters after each explosion.
At $37/kg of FLiBe, the cost is $19 million.
The amount of FLiBe needed to absorb 4 TJ without evaporation, for a temperature rise from 400 to 1300 C at 1 atmosphere of vapor pressure, is 2,000 tons of FLiBe salt.
The direct cost of the whole PACER unit must not exceed $2 billion.
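A back-of-envelope check of that salt mass, using the FLiBe heat capacity quoted later in the article (2350 J/(kg·K)) and treating the 4 TJ as pure sensible heat, lands close to the 2,000-ton figure:

```python
C_FLIBE = 2350.0  # J/(kg K), FLiBe heat capacity as quoted in the article
E_SHOT = 4.0e12   # J, a 4 TJ charge

def flibe_tons(t_start_c, t_end_c):
    """Salt mass (metric tons) to absorb the shot energy as sensible heat."""
    return E_SHOT / (C_FLIBE * (t_end_c - t_start_c)) / 1000.0

# temperature rise of 400 to 1300 C, as in the text
print(f"~{flibe_tons(400, 1300):.0f} tons of FLiBe")
```

This gives roughly 1,900 tons, agreeing with the quoted 2,000 tons to within rounding.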

Ralph Moir data on PACER from "PACER Revisited" -- 2-kiloton charges in a small engineered chamber.
UCRL-98468 Rev 1
An older version of this is at

In the PACER concept, a 20-kt peaceful nuclear explosion is contained in a cavity about 200 m in diameter, filled with 200 atm of 500 C steam.

Energy from the explosion is used to produce power, and the neutrons are used to produce materials such as U-233, Pu, Co-60, and T.

The present idea is to modify the PACER concept in 3 ways to improve the practicality, predictability, and safety of power production from this technology; the improvements are (1) line the cavity with steel; (2) replace the steam with molten salt; and (3) reduce the explosive yield to about 2 kT.

... the only fusion power concept where the underlying technology is proven and in hand today.

... Lining the cavity with steel makes it engineerable and predictable, and prevents contamination of the working fluid...

The steam working fluid is replaced with molten salt (FLiBe) in the form of droplets, to absorb energy and suppress shocks.

This change results in an ambient pressure below 1 atm soon after the explosion and allows much of the energy to go into evaporation, thus reducing the pressure in the cavity right after the explosion to about 3 MPa. Also, because tritium is insoluble in the molten salt, it can be removed almost completely, thus reducing the tritium inventory by a factor of 100,000, to 100 Ci, using FLiBe. Then when the explosive yield is reduced to 2 kT, the cavity volume is reduced by a factor of 50, which reduces the peak pressure in the cavity by a factor of 9.

In the modified concept, the cavity is tall and cylindrical, rather than spherical, with a smaller radius of curvature and a hemispherical roof (Fig. 1). As such, the cavity should be much more durable.

The steel skin also must be corrosion resistant. For example, alloys high in nickel would be good: Hastelloy-N would be excellent, and type 316 stainless steel may be adequate.

Pipes used to carry the molten salt to the droplet spray system  and to spray the walls will be made of the same material as the skin.

The system of pipes to carry the molten salt from the cavity to the pumps and primary heat exchanger is conventional, except that the pipes must withstand pressures up to about 3 MPa at a pulse rate of about 1/hr.

The cavity, its liner, and the piping system must withstand about 200,000 shots over a 30-yr period.
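Continuous firing at the quoted rate of one shot per hour would actually give somewhat more than 200,000 shots in 30 years; a short check (my arithmetic, suggesting the paper's round figure assumes less than full availability):

```python
HOURS_PER_YEAR = 8766  # 365.25 days per year on average

def lifetime_shots(rate_per_hour, years):
    """Total shots fired at a constant rate over the plant lifetime."""
    return rate_per_hour * HOURS_PER_YEAR * years

continuous = lifetime_shots(1.0, 30)  # one shot/hour, nonstop, for 30 years
print(f"{continuous:,.0f} shots at 100% availability")
print(f"'about 200,000' implies ~{200_000 / continuous:.0%} availability")
```

About 263,000 shots at full availability; 200,000 corresponds to roughly three-quarters uptime, a plausible plant capacity factor.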

The walls of the lower part of the cavity will be cooled actively so that a reasonable temperature (~650 C) is maintained during the intershot time of ~1 hour, while the heat is removed by circulating the molten salt.

To keep the temperature rise of the salt pool reasonable, a bed of balls in the salt pool is assumed to store much of the heat. The balls could be made of nickel-coated iron.

One limiting process is heat conduction from the surface of a droplet to its interior. This time is characterized by the thermal diffusivity k/(ρc), which for FLiBe is 1.7 x 10^-7 m^2/s, where k, the thermal conductivity, is 0.8 W/(m·K); ρ, the density, is 2050 kg/m^3; and c, the heat capacity, is 2350 J/(kg·K). For a droplet 1 mm in diameter, this time is 140 milliseconds.
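The diffusivity follows directly from the quoted properties. Reproducing the 140 ms figure requires a geometric factor; using the first-mode conduction time constant of a sphere, r²/(π²·α), is my assumption about how that number was obtained, and it lands in the right range:

```python
import math

# FLiBe properties as quoted in the text
K_THERMAL = 0.8   # thermal conductivity, W/(m K)
RHO = 2050.0      # density, kg/m^3
C_HEAT = 2350.0   # heat capacity, J/(kg K)

alpha = K_THERMAL / (RHO * C_HEAT)  # thermal diffusivity, m^2/s

def droplet_conduction_time(diameter_m):
    """First-mode conduction time constant of a sphere: r^2 / (pi^2 * alpha)."""
    r = diameter_m / 2.0
    return r ** 2 / (math.pi ** 2 * alpha)

print(f"alpha = {alpha:.2e} m^2/s")  # ~1.7e-7, matching the quoted value
print(f"1 mm droplet: {droplet_conduction_time(1e-3):.3f} s")
```

This gives about 150 ms for a 1 mm droplet, in the neighborhood of the quoted 140 ms.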

The vapor rushing past the droplets from the expanding fireball will distort and break up the droplets and cause internal circulation or vortex motion, which can enhance heat transfer by a factor of 2.7, with oscillations contributing a similar factor.

The time to extinguish the fireball appears to be limited by conduction into the droplet, rather than by heat transfer within the gas, heat transfer from the gas to the droplets, or condensation onto the droplets.

After the heat is distributed over 2 kt of molten salt for each kt of nuclear energy yield, the pressure in the cavity will drop below 1 atm, which corresponds to a temperature below 1200 C; an additional 5 kt of molten salt will bring the temperature down to 700 C.
Before the next shot, the tritium, helium, and other noncondensible gases must be pumped out, and molten salt pumped through the heat exchanger to lower its temperature and recharge the upper reservoir (for the shower spray system) shown in Figure 1.
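A quick check of those temperatures (my arithmetic; I read "kt of molten salt" as 1,000 metric tons, and a starting pool temperature of a few hundred degrees C is an assumption):

```python
KT_J = 4.184e12     # J per kiloton of nuclear yield
C_SALT = 2350.0     # J/(kg K), FLiBe heat capacity
KT_SALT_KG = 1.0e6  # reading "1 kt of molten salt" as 1,000 metric tons

def salt_temp_rise_c(yield_kt, salt_kt):
    """Temperature rise when the yield is spread uniformly through the salt."""
    return yield_kt * KT_J / (salt_kt * KT_SALT_KG * C_SALT)

print(f"2 kt salt per kt yield: dT ~ {salt_temp_rise_c(1, 2):.0f} C")
print(f"with 5 kt more salt:    dT ~ {salt_temp_rise_c(1, 7):.0f} C")
```

The rises come out near 890 C and 254 C; starting from a pool at a few hundred degrees, these are roughly consistent with the quoted ~1200 C and 700 C figures.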

The next charge is then lowered on a tether or dropped, and the droplet spray system is turned on to fill the cavity with the appropriate distribution of molten salt droplets.

The salt can be kept in a reduced state by continuously reacting it with metallic beryllium; then the tritium will exist as T2 gas and can be removed by pumping. The uranium can also be removed by fluorination, or the salt can be fluorinated directly in a separate tank. The small amount of fissile material left would be very dilute in the huge amount of salt, so criticality is prevented.

Since the excavated cavity will not conform to the desired skin shape, grout must be chosen so that it has low vapor pressure above 500 C.
Ideally it should also be low in oxygen, nitrogen, silicon, and other materials that would contaminate the molten salt if an inward leak occurred (contamination would cause difficulties in reprocessing the molten salt).
Holes drilled for the rock bolts will form a collection system for pumping gaseous material in the region behind the metal skin, to maintain a gas-phase pressure under 1 atmosphere there.

Maintaining a low gas pressure in the cavity makes pumping the tritium gas easy. The thermal design of the cavity skin is important. If salt is sprayed on the walls before each shot, the salt carrying debris from the nuclear device and its surrounding material will not freeze directly onto the wall; instead it will flow to the pool at the bottom of the cavity or freeze onto the existing frozen salt layer. Flowing fresh molten salt can clear or remelt this layer.

Therefore the steel skin will remain at an ambient temperature below the melting point of the molten salt (363 C), except for a short time (under an hour) between shots, which come typically once per hour. The frozen salt layer can then reduce the thermal stress on the wall.
Contrast that with the 1975 design of PACER


A key question: how realistic would it be to expect $1,000-per-unit small thermonuclear charges? Pure D-D fusion would be nice; deuterium-tritium boosting of fissionables would work; and there might be a bootstrapping path through many architectures, because you are debugging the charges as well as the first PACER. But in the end the expectation is cheap deuterium-deuterium devices, as close to pure fusion as you can get. This might in practice mean 200 tons of fission and 1,800 tons of D-D fusion in a 2-kiloton charge. Moir himself has expressed the hope that "If such explosives could be initiated with 20 tons [84 GJ] of fission yield for a total of 2 ktons then the resulting power system would be 1% fission and 99% fusion."

Some references on small fission/fusion charge engineering--

A discussion of 4th-generation nuclear weapons (FGNW)... ...nuclear shaped charges... 15 kg of tritium in an arsenal equivalent to one million 1-ton FGNWs...

"The physical principles of thermonuclear explosives, inertial confinement fusion, and the quest for fourth generation nuclear weapons," by Andre Gsponer and Jean-Pierre Hurni

Note from Friedlander -- This is as good as something Winterberg would write. Enjoy.

"any country with access to tritium and high-power x-ray imaging technology could easily develop and weaponize simple boosted fission explosives without nuclear testing....with boosting — the problem of the preinitiation of the chain reaction, which creates difficulties in making a non-boosted fission bomb [66, 69], is no longer a serious problem....Boosting can also be used to make efficient and reliable fission weapons in which reactor grade plutonium is used instead of weapons grade plutonium.....It is therefore clear that ICF experiments will contribute very significantly to progress in weapons physics...A modern, sophisticated proliferator with access to ICF computer codes and today’s computer workstations would have far more tools for designing a secondary than the U.S., U.K. or USSR had in the 1950s or France and China in the 1960s..., in subcritical burn, the quality of the fissile material is of little importance: reactor-grade plutonium is just as good as weapons-grade plutonium....many technologically sophisticated countries (and, in particular, Germany, India, Israel, Japan, and Pakistan, which have highly developed nuclear infrastructures) are today in a good position to make not only atomic bombs but also hydrogen bombs...currently preferred technique is to use magnetic compression to increase the density of the fissile material (which may consist of low-quality, reactor-grade plutonium) and a very small amount of antimatter to initiate the subcritical burn.....Fourth generation nuclear weapons based on such processes, and with yields of 1 to 10 tons equivalents of TNT, may weigh less than a few kilograms."

Friedlander here to finish up: So we may end up with an underground, antimatter-triggered, boosted-fission, D-D-fusion-burning molten salt reactor. That crosses many boundaries of what counts as a traditional reactor.
Now, a general discussion on the future of PACER.
Moir's variant is far more engineerable and deployable -- not in every place, but, politics allowing (a big if), within say 5 to 10 kilometers of almost any place. The 1975 scheme in the previous article was limited to a hundred or more sites near the US Gulf coast (or other locations in the world where salt domes are plentiful, such as Iran).

But the salt domes were only chosen because they would allegedly be cheap. If engineering a big cavity got far cheaper, PACER-like units could be located anywhere. They would also be ideal waste disposal units at the end of their lifetimes. And if we had the kind of free neutron factory that PACER promised to provide, huge numbers of portable fission reactors could operate anywhere it was safe to run one. It would enable a whole new energy economy, including, say, high-temperature reactors for chemical processing and vast associated industries. Trading 252 kilos of deuterium at $500 a kg, worth $126,000, for a million dollars plus in power would pay if the processing cost only a couple of hundred thousand dollars more. But with a self-booting new nuclear chemical economy, the deuterium itself might get cheaper, just from economies of scale and lower power costs. Assured supplies of cheap power for a lifetime can settle people's minds and get them producing wealth. http://nextbigfuture.com/2015/12/will-simple-act-of-building-cheap.html
Contrast that with people fearful of peak everything, and with constantly ramping-up taxes because of a shrinking real economy. (By which I mean the portion of the economy not created by finance games and government contracts of any kind, but rather by industry and engineering selling honest goods at market prices.) Had the late 1970s seen a prototype PACER unit, we might well now be living in a different world.
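The deuterium trade sketched above, as plain arithmetic (the power value and processing cost are the article's round numbers, not engineering estimates):

```python
def net_margin(d2_kg, d2_price_per_kg, power_value, processing_cost):
    """Value of power produced minus deuterium and processing costs."""
    return power_value - d2_kg * d2_price_per_kg - processing_cost

# the article's figures: 252 kg of deuterium at $500/kg, "a million plus"
# in power, and processing of "a couple of hundred thousand dollars"
print(f"fuel cost: ${252 * 500:,}")
print(f"net margin: ${net_margin(252, 500, 1_000_000, 200_000):,}")
```

Even at these conservative round numbers, the fuel is a minor fraction of the value of the power produced.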

Forty years ago, Project PACER was -- and it remains -- the most practical hope for near-term D-D fusion reactors that could be commercially deployed within a decade or even less.
Because of the decisions not made in 1975, the economy of 2015 is far smaller than it might have been. Notably, China negotiated for an exemption for peaceful nuclear explosions in the test ban treaty. Perhaps there was a reason for that: someday PACER may return.

