February 13, 2016

Not just automation but other economic forces have been reducing labor participation for decades

Rice University computer scientist Moshe Vardi expects that within 30 years, machines will be capable of doing almost any job that a human can. In anticipation, he is asking his colleagues to consider the societal implications. Can the global economy adapt to greater than 50 percent unemployment? Will those out of work be content to live a life of leisure?

"We are approaching a time when machines will be able to outperform humans at almost any task," Vardi said. "I believe that society needs to confront this question before it is upon us: If machines are capable of doing almost any work humans can do, what will humans do?"

Vardi will address the issue in an 8 a.m. Sunday presentation, "Smart Robots and Their Impact on Society," at one of the world's largest and most prestigious scientific meetings—the annual meeting of the American Association for the Advancement of Science in Washington, D.C.

Charles Murray and others have pointed out the decline in labor participation of white men in America.

During the past half-century of economic growth, virtually none of the rewards have gone to the working class. The economists can supply caveats and refinements to that statement, but the bottom line is stark: The real family income of people in the bottom half of the income distribution hasn’t increased since the late 1960s.

During the same half-century, American corporations exported millions of manufacturing jobs, which were among the best-paying working-class jobs. They were and are predominantly men’s jobs. In both 1968 and 2015, 70 percent of manufacturing jobs were held by males.



More than Moore strategy for the computer industry

Next month, the worldwide semiconductor industry will formally acknowledge what has become increasingly obvious to everyone involved: Moore's law [semiconductor scaling], the principle that has powered the information-technology revolution since the 1960s, is nearing its end.

Moore's law states that the number of transistors on a microprocessor chip will double every two years or so — which has generally meant that the chip's performance will, too.
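As a quick illustration of what a fixed two-year doubling period implies, here is a minimal sketch; the 1-billion-transistor 2016 baseline is a hypothetical round number chosen for illustration, not a figure from the article:

```python
# Illustrative only: project transistor counts under Moore's law-style doubling.
def transistors(year, base_year=2016, base_count=1_000_000_000, doubling_years=2):
    """Projected transistor count, assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (2016, 2018, 2020, 2022):
    print(y, f"{transistors(y):.2e}")  # doubles every two years: 1e9, 2e9, 4e9, 8e9
```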

The semiconductor industry has released a research road map every two years to coordinate what its hundreds of manufacturers and suppliers are doing to stay in step with the law — a strategy sometimes called More Moore. It has been largely thanks to this road map that computers have followed the law's exponential demands.

Top-of-the-line microprocessors currently have circuit features that are around 14 nanometres across, smaller than most viruses. But by the early 2020s, says Paolo Gargini, chair of the road-mapping organization, “even with super-aggressive efforts, we'll get to the 2–3-nanometre limit, where features are just 10 atoms across. Is that a device at all?” Probably not — if only because at that scale, electron behaviour will be governed by quantum uncertainties that will make transistors hopelessly unreliable. And despite vigorous research efforts, there is no obvious successor to today's silicon technology.

The industry road map released next month will for the first time lay out a research and development plan that is not centred on Moore's law. Instead, it will follow what might be called the More than Moore strategy: rather than making the chips better and letting the applications follow, it will start with applications — from smartphones and supercomputers to data centres in the cloud — and work downwards to see what chips are needed to support them. Among those chips will be new generations of sensors, power-management circuits and other silicon devices required by a world in which computing is increasingly mobile.


Uranium and nuclear energy

Kazakhstan’s Kazatomprom reported that its uranium production increased 4.3 percent in 2015, to 23,800 tonnes of uranium (52.5 million pounds). This was an increase of over 970 tonnes (2.1 million pounds) from the 22,829 tonnes that Kazakhstan produced in 2014.

Idling Japan’s reactors for a few years caused Japanese utilities to accumulate about 120 million pounds of uranium, since they still had to honor their existing supply contracts. This is enough to fuel the restarting fleet for the next decade.

The price of uranium has little effect on the price of nuclear power since the fuel is such a small part of the total cost and the cost of fuel itself is dominated by the fabrication costs, not the cost of uranium. Decisions to build nuclear power plants do not hinge on uranium supplies. And there are sufficient uranium deposits in the world to provide nuclear energy at any level for many thousands of years.

Eighty-nine percent of the fuel requirements of the current fleet of nuclear reactors worldwide, totaling some 377 million pounds U3O8 (yellowcake), will be met in 2016 by Canada, Australia, and Kazakhstan.

  • Japan will restart about 40 of its nuclear reactors over the next few years.
  • China will build more than 100 nuclear reactors over the next ten years.
  • India and Russia will also build out nuclear reactors.




February 12, 2016

SpaceX planning to launch every 2 to 3 weeks and achieve a 70% landing success rate in 2016

Elon Musk is confident about SpaceX's ability to land rockets in 2016; he predicted a 70 percent landing success rate for the year.



If all goes as planned, SpaceX will achieve a launch rate of once every two to three weeks, according to a recent comment from SpaceX president Gwynne Shotwell.

SpaceX is transforming its rocket factory, going from building six to eight rocket cores a year to about 18 cores a year. By the end of 2016, SpaceX should be at over 30 cores per year.

Beyond ramping up production of its Falcon 9 at the company's factory, SpaceX is increasingly focused on preparing its Falcon Heavy. The more powerful rocket is slated for its first launch during 2016.


DARPA launching robotic sub-hunting ship in April 2016

Israel has five modified Dolphin submarines

Israel has five Dolphin class submarines. The diesel-electric submarines were developed and constructed by Howaldtswerke-Deutsche Werft AG (HDW), Germany for the Israeli Navy. The first boats of the class were based on the export-only German 209-class submarines, but were modified and enlarged. The Dolphin 1 sub-class is slightly larger than the German Navy Type 212 in length and displacement. The three newer air-independent propulsion (AIP) equipped boats are similar to the Type 212 vessels in underwater endurance, are 12 metres (39 ft) longer, nearly 500 tonnes heavier in submerged displacement and have a larger crew than either the Type 212 or the Type 214.

In 2011, Israel ordered a sixth Dolphin-class submarine.

Dolphin class submarine
INS Dolphin (1999)
INS Livyathan (Whale, 1999)
INS Tekumah (Revival, 2000)

AIP Dolphin 2 class:

INS Tannin (Crocodile, delivered in 2012)
INS Rahav (Demon, delivered in 2014)
INS Dakar (Grouper, ordered 21 March 2012, expected operational date 2019)

The Dolphin 2-class are the largest submarines to have been built in Germany since World War II. The Dolphin class boats are the most expensive single vehicles in the Israel Defense Forces. The Dolphin-class replaced the aging Gal-class submarines, which had served in the Israeli navy since the late 1970s. Each Dolphin-class submarine is capable of carrying a combined total of up to 16 torpedoes and SLCMs.

The cruise missiles have a range of at least 1,500 km (930 mi) and are widely believed to be equipped with a 200-kilogram (440 lb) nuclear warhead containing up to 6 kilograms (13 lb) of plutonium.

Each submarine is fitted with 6 × 533 mm (21.0 in) torpedo tubes, and 4 × 650 mm (26 in) torpedo tubes. The very large 650 mm tubes can be used for laying mines, larger submarine-launched cruise missiles, or swimmer delivery vehicles, and with liners the tubes could be used for standard torpedoes and submarine-launched missiles. According to the German Defense Ministry the 650 mm tubes are to have a liner installed for firing 533 mm UGM-84 Harpoon missiles although the Dolphin class already has six tubes of the 533 mm size.




3D NAND flash chips will mean 3.5 TB SSD the size of a pack of gum and over 10 TB for 2.5 inch SSDs

Intel’s solid-state drives could be poised for a big jump in capacity and speed with new 3D flash chips coming from Micron.

Micron, which makes the flash in Intel’s SSDs, has started volume shipments of its 3D NAND flash chips. The chips could lead to SSDs the size of a pack of gum with more than 3.5TB of storage and standard 2.5-inch SSDs with capacities greater than 10TB.

SSDs have been advancing in capacity and durability. Fixstars last month shipped a 13TB SSD, which is priced at about $1 per gigabyte, or $13,000. This year, SanDisk plans to ship 6TB and 8TB SSDs, while Samsung is aiming to release a 4TB SSD.

Intel sells consumer SSDs with a maximum capacity of 4TB, and Micron’s 3D NAND chips could up to triple that capacity. Some of Intel’s enterprise SSD products are old and due for an upgrade.

Micron is behind rivals Samsung and Toshiba, which moved to the 3D NAND flash structure many years ago to improve storage capacity and reduce production costs. Micron’s implementation is different and relies on floating-gate cells to improve the reliability and capacity of drives. Micron’s rivals use charge-trap technology, which analysts say might provide longer battery life.

Micron claims its 3D NAND chips have three times the density of competitors’ products. This should mean three times the capacity in the same size SSD, so the storage will take up less space in an expensive data center or a small, slim device.

It isn’t yet clear whether Micron will use these 3D NAND chips for Xpoint, a significantly faster storage and memory technology that it’s developing with Intel. They claim Xpoint will be 10 times more dense than DRAM and produce SSDs that are 1,000 times faster and more durable than flash storage. Intel plans to release Xpoint-based memory DIMMs and SSDs under the Optane brand.



Brian Kesinger mashup of Calvin and Hobbes and Star Wars: The Force Awakens

Brian Kesinger is a story artist at Walt Disney Animation Studios and an artist for Marvel Comics. He has a series of cartoons that mash up Calvin and Hobbes and Star Wars: The Force Awakens.

He has other mashups of Disney's Big Hero 6 and Star Wars as well.




Star Wars: The Force Awakens has a global box office of over $2.01 billion and about $908 million domestic. It may come up short of Titanic's box office of $2.186 billion (with re-releases).

World data transmission speed record of 1.125 Terabits per second

A new record for the fastest ever data rate for digital information has been set by UCL researchers in the Optical Networks Group. They achieved a rate of 1.125 Tb/s as part of research on the capacity limits of optical transmission systems, designed to address the growing demand for fast data rates.

Lead researcher, Dr Robert Maher, UCL Electronic & Electrical Engineering, said: “While current state-of-the-art commercial optical transmission systems are capable of receiving single channel data rates of up to 100 gigabits per second (Gb/s), we are working with sophisticated equipment in our lab to design the next generation core networking and communications systems that can handle data signals at rates in excess of 1 terabit per second (Tb/s).

“For comparison this is almost 50,000 times greater than the average speed of a UK broadband connection of 24 megabits per second (Mb/s), which is the current speed defining “superfast” broadband. To give an example, the data rate we have achieved would allow the entire HD Games of Thrones series to be downloaded within one second.”
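The arithmetic behind that comparison is easy to check, using the two rates quoted in the text:

```python
# Compare the achieved rate with typical UK "superfast" broadband (figures from the text).
achieved = 1.125e12   # 1.125 Tb/s, in bits per second
broadband = 24e6      # 24 Mb/s, in bits per second

ratio = achieved / broadband
print(f"{ratio:,.0f}x")  # 46,875x, i.e. "almost 50,000 times" as quoted
```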




Nature Scientific Reports - Increasing the information rates of optical communications via coded modulation: a study of transceiver performance

Rabbit brain defrosted from cryopreservation without damage

A mammal brain has been defrosted from cryogenic storage in an almost perfect state for the first time. This breakthrough, accomplished using a rabbit brain, brings us one – albeit tiny – step closer to the prospect of reanimating a human brain that has been cryogenically preserved.

After death, organs begin to decay, but we can delay this by cooling these tissues, just like freezing food. But in the same way that a frozen strawberry becomes soggy when defrosted, it is difficult to perfectly preserve mammals at cold temperatures. We, and strawberries, contain large amounts of water, which freezes into ice crystals that damage cells.

Cryoprotectants can prevent this ice damage, working like medical-grade antifreezes and preventing organs from freezing. This works in small worms and rabbit kidneys, but it needs to be administered quickly, which usually causes brains to dehydrate and shrink.

The company 21st Century Medicine in Fontana, California, has developed a technique that appears to prevent dehydration and preserves the brain in a near-perfect state. By draining the blood immediately and replacing it with a chemical fixative called glutaraldehyde, researchers Greg Fahy and Robert McIntyre can instantly stop decay, allowing them to add cryoprotectants more slowly to prevent dehydration.

The brain is then cooled to -135 °C, which turns it into a glass-like state that can be stored for centuries without decay. When they tried this technique on rabbit brains, thawing them up to a week later, Fahy and McIntyre say the preservation appeared “uniformly excellent” when examined using electron microscopy. They have been awarded a US$26,735 prize by the Brain Preservation Foundation for the technique.

Mastering cryopreservation of the brain and other organs can also improve organ transplantation.

Cryopreservation of the brain could be used for cryogenic suspension of those who are critically ill for possible revival in the future after medical advances.

Although cryopreservation techniques have not yet been perfected, more than 100 people worldwide have already been cryogenically frozen after death by companies like Alcor.

Freezing normally damages cells, but this defrosted rabbit brain was in a near-perfect state. Kenneth Hayworth, Brain Preservation Foundation

Cryobiology - Aldehyde-stabilized cryopreservation

DARPA converts stent into "stentrode" for recording neural activity

A DARPA-funded research team has created a novel neural-recording device that can be implanted into the brain through blood vessels, reducing the need for invasive surgery and the risks associated with breaching the blood-brain barrier. The technology was developed under DARPA’s Reliable Neural-Interface Technology (RE-NET) program, and offers new potential for safely expanding the use of brain-machine interfaces (BMIs) to treat physical disabilities and neurological disorders.

In an article published in Nature Biotechnology, researchers in the Vascular Bionics Laboratory at the University of Melbourne led by neurologist Thomas Oxley, M.D., describe proof-of-concept results from a study conducted in sheep that demonstrate high-fidelity measurements taken from the motor cortex—the region of the brain responsible for controlling voluntary movement—using a novel device the size of a small paperclip.

This new device, which Oxley’s team dubbed the “stentrode,” was adapted from off-the-shelf stent technology—a familiar therapeutic tool for clearing and repairing blood vessels—to include an array of electrodes. The researchers also addressed the dual challenge of making the device flexible enough to safely pass through curving blood vessels, yet stiff enough that the array can emerge from the delivery tube at its destination.




Nature Biotechnology - Minimally invasive endovascular stent-electrode array for high-fidelity, chronic recordings of cortical neural activity

France has a €100 billion nuclear reactor upgrade bill

French utility EDF will need to spend some €100 billion ($113 billion) on upgrading its fleet of 58 nuclear power reactors by 2030, the country's state audit office has said. The upgrades are needed to meet new safety requirements and to extend the lives of the units beyond 40 years.

EDF announced its Grand Carénage life extension program for the existing fleet in France in 2011. Under this investment program, the company planned to spend around €55 billion on upgrading its plants to improve their performance and enable their continued operation. The program also includes safety upgrades in response to the Fukushima Daiichi accident in Japan.

In its 2016 annual public report, released yesterday, the Cour des Comptes (Court of Audit) said it estimates that almost double this amount would have to be spent by 2030.

France's energy transition law, which calls for the country's reliance on nuclear energy to be reduced to 50% of power generation by 2025, "is likely to challenge the planned investments and force the company to close a third of its reactors", the court said. This, it said, will have "important consequences in terms of jobs" and could result in "compensation supported by the state".




GPS data combined with inertial sensor data and vastly improved computational algorithms for real-time centimeter accuracy

Researchers at the University of California, Riverside have developed a new, more computationally efficient way to process data from the Global Positioning System (GPS) to enhance location accuracy from the meter level down to a few centimeters.

The optimization will be used in the development of autonomous vehicles, improved aviation and naval navigation systems, and precision technologies. It will also enable users to access centimeter-level accuracy location data through their mobile phones and wearable technologies, without increasing the demand for processing power.

The approach involves reformulating a series of equations that are used to determine a GPS receiver’s position, resulting in reduced computational effort being required to attain centimeter accuracy.

First conceptualized in the early 1960s, GPS is a space-based navigation system that allows a receiver to compute its location and velocity by measuring the time it takes to receive radio signals from four or more overhead satellites. Due to various error sources, standard GPS yields position measurements accurate to approximately 10 meters.
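The core positioning problem can be sketched as an iterative least-squares fit of the receiver position to the measured satellite ranges. This is a simplified illustration only, not the UCR algorithm: receiver clock bias is ignored, and the satellite coordinates and ranges below are made-up numbers.

```python
# Minimal sketch of position recovery from satellite ranges (clock bias omitted).
import numpy as np

def solve_position(sats, ranges, guess=np.zeros(3), iters=10):
    """Gauss-Newton refinement of a receiver position from range measurements."""
    x = guess.astype(float)
    for _ in range(iters):
        diffs = x - sats                       # vectors from each satellite to the guess
        dists = np.linalg.norm(diffs, axis=1)  # ranges predicted by the current guess
        H = diffs / dists[:, None]             # Jacobian: unit line-of-sight vectors
        x += np.linalg.lstsq(H, ranges - dists, rcond=None)[0]
    return x

# Made-up geometry: four satellites ~20,000 km out, receiver near the origin.
sats = np.array([[20e6, 0, 0], [0, 20e6, 0], [0, 0, 20e6], [15e6, 15e6, 15e6]])
truth = np.array([1e6, 2e6, 3e6])
ranges = np.linalg.norm(sats - truth, axis=1)
print(solve_position(sats, ranges))  # converges toward (1e6, 2e6, 3e6)
```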

Differential GPS (DGPS), which enhances the system through a network of fixed, ground-based reference stations, has improved accuracy to about one meter. But meter-level accuracy isn’t sufficient to support emerging technologies like autonomous vehicles, precision farming, and related applications.

“To fulfill both the automation and safety needs of driverless cars, some applications need to know not only which lane a car is in, but also where it is in that lane—and need to know it continuously at high rates and high bandwidth for the duration of the trip,” said Jay Farrell, professor of electrical and computer engineering at UC Riverside, whose research focuses on developing advanced navigation and control methods for autonomous vehicles.


IEEE Transactions on Control Systems Technology - Computationally Efficient Carrier Integer Ambiguity Resolution in Multiepoch GPS/INS: A Common-Position-Shift Approach

KC-46 on track for delivery of 18 operational planes by August 2017

On a Jan. 24 flight, the Boeing and US Air Force test team completed a series of test points with the KC-46 tanker aircraft before successfully transferring 1,600 pounds of fuel to an F-16 fighter jet, according to a company statement. The KC-46 is a militarized version of the company’s 767 commercial jet.



Air Force fixed-wing assets use the boom system for aerial refueling, with a planned 1,200 gallons-per-minute transfer rate from the KC-46. Air Force helicopters and most Navy and Marine Corps aircraft, on the other hand, use the “probe-and-drogue” method of refueling. During this event, fuel passes from the tanker’s “drogue” refueling basket, which trails from the plane via a flexible hose, through a “probe,” a rigid, retractable arm placed on the receiver aircraft’s nose or fuselage.

The tests show the KC-46 program is back on track after several setbacks in 2015. The test plane successfully completed its first flight in September, after the flight had initially been planned for 2014.

The Air Force is planning to buy 179 KC-46s in total to recapitalize its aging tanker fleet. According to the contract terms, Boeing must deliver 18 ready-to-go tankers by August 2017.



Artificial intelligence researcher claims sentient creativity machines will be made within 5 years

The AI researcher Dr. Stephen Thaler has given an interview recently in which he claims that his AI research will lead to sentient, cognizant "creativity machines" within 5 years.

The research continues to accelerate.

Consciousness appears to be more of an intensive than an extensive property/behavior of the brain. It’s sort of like the gas law equation, PV = nRT, with P and T being intensive and n and V being extensive. So, consciousness is intensive, but we as humans deny simpler forms of consciousness, while fearing the scaled-up version attainable via machine intelligence.

Imagination Engines, Inc. was founded upon key scientific and engineering breakthroughs in the area of artificial neural networks by Dr. Stephen Thaler.



An Imagination Engine is a trained artificial neural network that is stimulated to generate new ideas and plans of action through a remarkable effect that grew out of scientific experiments conducted in 1975 by our founder, Dr. Stephen Thaler. In these initial experiments, neural networks were trained upon a collection of patterns representing some conceptual space (i.e., examples of music, literature, or known chemical compounds), and then the networks were internally 'tickled' by randomly varying the connection weights joining neurons. Astonishingly, Thaler found that if the connection weights were varied at just the right level, the network's output units would predominantly activate into patterns representing new potential concepts generalized from the original training exemplars (i.e., new music, new literature, or new chemical compounds, respectively, that the network had never been exposed to through learning).

There is an audio interview with Dr Thaler at this link

Funding battle between F-35 stealth fighter and long range strike bomber

A battle is brewing between the multibillion-dollar aircraft programs for the F-35 stealth fighter and the new long range strike bomber — and the defense companies, lobbyists, and Pentagon offices that back them.

Pentagon money will fund two of the most sophisticated and expensive planes ever built, the F-35 Joint Strike Fighter and the new Long Range Strike-Bomber, or LRS-B. The bomber needs cash to get off the ground and the skittish F-35 camp already is worried the new kids will steal from the huge but finite pot.

The F-35’s price tag looms at $400 billion for thousands of jets to be bought over the next two decades. The 100 planned bombers are expected to cost between $80 billion and $111 billion. The last time the Air Force had such an ambitious plane-building plan, Ronald Reagan was president. But unlike then, defense spending is capped through 2021.

Pentagon leaders have been floating the idea of signing a contract with Lockheed Martin, the world’s largest defense contractor, for more than 450 new F-35s over a three year-period beginning in 2018. Most of those planes would be for the Air Force. Between 2016 and 2020, the Air Force plans to spend more than $25 billion on at least 200 F-35s, according to Pentagon budget documents.



For the bomber, Air Force officials will not disclose the actual yearly budget of the plane, saying that would harm national security. But they have released an estimate that it will cost at least $23.5 billion to develop and at least $56 billion to buy 100 planes.



February 11, 2016

China is nearing completion of its high temperature pebble bed reactor and will test it before it starts generating power around November 2017

China’s Nuclear Engineering Construction Corporation plans to start up a high-temperature, gas-cooled pebble-bed nuclear plant in 2017 in Shandong province, south of Beijing. The twin 105-megawatt reactors—so-called Generation IV reactors that would be immune to meltdown—would be the first of their type built at commercial scale in the world.

Construction of the plant is nearly complete, and the next 18 months will be spent installing the reactor components, running tests, and loading the fuel before the reactors go critical in November 2017.

If it’s successful, the Shandong plant will generate a total of 210 megawatts and will be followed by a 600-megawatt facility in Jiangxi province. Beyond that, China plans to sell these reactors internationally; in January, Chinese president Xi Jinping signed an agreement with King Salman bin Abdulaziz to construct a high-temperature gas-cooled reactor in Saudi Arabia.

“This technology is going to be on the world market within the next five years,” predicts Zhang Zuoyi, director of the Institute of Nuclear and New Energy Technology at Tsinghua University. “We are developing these reactors to belong to the world.”

Pebble-bed reactors that use helium gas as the heat transfer medium and run at very high temperatures—up to 950 °C—have been in development for decades. The Chinese reactor is based on a design originally developed in Germany, and the German company SGL Group is supplying the billiard-ball-size graphite spheres that encase thousands of tiny “pebbles” of uranium fuel. Seven high-temperature gas-cooled reactors have been built, but only two units remain in operation, both relatively small: an experimental 10-megawatt pebble-bed reactor at the Tsinghua Institute campus, which reached full power in 2003, and a similar reactor in Japan.

One of the main hurdles to building these reactors is the cost of the fuel and of the reactor components. But China’s sheer size could help overcome that barrier. “There have been studies that indicate that if reactors are mass-produced, they can drive down costs,” says Charles Forsberg, executive director of the MIT Nuclear Fuel Cycle Project. “The Chinese market is large enough to make that potentially possible.”

China is also working on
  • a molten-salt reactor fueled by thorium rather than uranium (a collaboration with Oak Ridge National Laboratory)
  • a traveling-wave reactor (in collaboration with TerraPower, the startup funded by Bill Gates)
  • a sodium-cooled fast reactor being built by the Chinese Institute for Atomic Energy
  • a supercritical water cooled reactor

Nextbigfuture has been covering all of China's nuclear reactor projects for many years.


20 firm orders for the Aerion supersonic business jet, for delivery in about 2023, so the wealthy 0.1% will fly twice as fast as others

Aerion Corporation is making an 8-12 passenger AS2 supersonic business jet, with key engineering support provided by the Airbus Group. Aerion has set a target to achieve FAA certification in 2021 and enter service in 2023.

Aerion recently announced that the flight-service provider Flexjet has placed a firm order for 20 examples of the Aerion AS2 supersonic jet—for which the company began taking orders last year—making Flexjet the first fleet operator for the plane, the first publicly available supersonic jet since commercial supersonic travel ended in 2003.

Each jet is priced at $120 million, for a total order potential of $2.4 billion. Aerion says the AS2 will carry passengers in a 30-foot-long cabin at speeds as fast as Mach 1.5 (990 mph). With a range of up to 5,466 miles, the AS2 cuts transatlantic travel time by 3 hours and shortens trans-Pacific routes by 6 hours or more.

Flexjet chairman Kenn Ricci called the airplane a “potential game changer for business travel.” Flexjet and Aerion will work together to design a custom, premium cabin interior for the Flexjet fleet. Aerion says for busy international travelers, all that extra speed quickly adds up to more productive days and even weeks over the course of a year. For example, a typical long-range business jet that would log about 250,000 miles in a year would fly for about 500 hours, while the same miles could be covered in the AS2 in about 300 hours. The difference for travelers is the equivalent of 25 8-hour workdays.
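A back-of-the-envelope check of the article's own figures:

```python
# All numbers below are from the text; the speed line is derived from them.
miles_per_year = 250_000
subsonic_hours = 500   # stated annual flight time for a typical long-range jet
as2_hours = 300        # stated annual flight time for the same miles in the AS2

print(miles_per_year / subsonic_hours)        # implied subsonic block speed: 500 mph
saved_workdays = (subsonic_hours - as2_hours) / 8
print(saved_workdays)                          # 25 eight-hour workdays, as claimed
```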








Risks of seasonal outbreaks of Zika virus mosquitoes in the southern US

In most of the Northern United States and Canada, it is simply too cold for the mosquitoes that spread the Zika virus to survive. Aedes aegypti, the only mosquito confirmed to spread Zika, has been in the Southern United States since the 1960s, and pretty much all over South America since at least the 1990s. The other mosquito involved, Aedes albopictus, has never been shown to carry Zika, but scientists suspect that it probably could. As of 2010, however, there had been no sign of either mosquito species migrating northward.

The mosquitoes could migrate north eventually. In fact, Aedes albopictus has been known to migrate along highways, laying its eggs in standing water loaded onto trucks. But even then, there is little risk of these mosquitoes surviving long-term north of the Mason-Dixon line. And, while a large portion of the U.S. gets warm enough during the summer for these mosquitoes to survive, only Florida, Hawaii and the southern tip of Texas have climates that could sustain these mosquitoes year-round.

The Lancet indicates there are seasonal outbreak risks in the southern US states. A seasonal outbreak would still mean millions of women in the US would face the risk of deformed babies if they were or became pregnant during the outbreak and for months after.







Physicists detect gravitational waves from colliding black holes

Scientists announced Thursday that, after decades of effort, they have succeeded in detecting gravitational waves from the violent merging of two black holes in deep space. The detection was hailed as a triumph for a controversial, exquisitely crafted, billion-dollar physics experiment and as confirmation of a key prediction of Albert Einstein's General Theory of Relativity.

It will also inaugurate a new era of astronomy in which gravitational waves are tools for studying the most mysterious and exotic objects in the universe, scientists declared at a euphoric news briefing at the National Press Club in Washington.

Scientists have observed ripples in the fabric of spacetime called gravitational waves, arriving at the earth from a cataclysmic event in the distant universe. This confirms a major prediction of Albert Einstein’s 1915 general theory of relativity and opens an unprecedented new window onto the cosmos.

Gravitational waves carry information about their dramatic origins and about the nature of gravity that cannot otherwise be obtained. Physicists have concluded that the detected gravitational waves were produced during the final fraction of a second of the merger of two black holes to produce a single, more massive spinning black hole. This collision of two black holes had been predicted but never observed.



The gravitational waves were detected on September 14, 2015 at 5:51 a.m. Eastern Daylight Time (09:51 UTC) by both of the twin Laser Interferometer Gravitational-wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA. The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery, accepted for publication in the journal Physical Review Letters, was made by the LIGO Scientific Collaboration (which includes the GEO Collaboration and the Australian Consortium for Interferometric Gravitational Astronomy) and the Virgo Collaboration using data from the two LIGO detectors.

Based on the observed signals, LIGO scientists estimate that the black holes for this event were about 29 and 36 times the mass of the sun, and the event took place 1.3 billion years ago. About 3 times the mass of the sun was converted into gravitational waves in a fraction of a second—with a peak power output about 50 times that of the whole visible universe. By looking at the time of arrival of the signals—the detector in Livingston recorded the event 7 milliseconds before the detector in Hanford—scientists can say that the source was located in the Southern Hemisphere.
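The scale of the energy release can be sanity-checked with E = mc², using standard physical constants and the three-solar-mass figure from the article:

```python
# Three solar masses converted to gravitational waves, via E = m * c^2.
c = 2.998e8        # speed of light, m/s
m_sun = 1.989e30   # one solar mass, kg

energy = 3 * m_sun * c**2
print(f"{energy:.2e} J")  # roughly 5.4e47 joules, radiated in a fraction of a second
```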






February 10, 2016

Russia and Iran appear close to handing a military victory to Assad in Syria

Days of intense bombing could soon put the critical city of Aleppo back into the hands of Syrian President Assad’s forces. The United States has said there can be no military solution in Syria; the Russians may be proving it wrong. There may be a military solution, one senior American official conceded Wednesday, “just not our solution,” but that of President Vladimir V. Putin of Russia.

The Russian military action has changed the shape of a conflict that had effectively been stalemated for years. Suddenly, Mr. Assad and his allies have momentum, and the United States-backed rebels are on the run. If a cease-fire is negotiated here, it will probably come at a moment when Mr. Assad holds more territory, and more sway, than since the outbreak of the uprisings in 2011.

Mr. Kerry enters the negotiations with very little leverage: The Russians have cut off many of the pathways the C.I.A. has been using for a not-very-secret effort to arm rebel groups, according to several current and former officials. Mr. Kerry’s supporters inside the administration say he has been increasingly frustrated by the low level of American military activity, which he views as essential to bolstering his negotiation effort.

At the core of the American strategic dilemma is that the Russian military adventure, which Mr. Obama dismissed last year as ill-thought-out muscle flexing, has been surprisingly effective in helping Mr. Assad reclaim the central cities he needs to hold power, at least in a rump-state version of Syria.

Battle maps from the Institute for the Study of War show that it is working: the Russians, with Iranian help on the ground, appear to be handing Mr. Assad enough key cities that his government can hang on.



Thailand is considering a polywell fusion project

According to the Thai Ministry of Energy's Integrated Blueprint, up to 5 per cent of the country's energy requirements will be met by nuclear power by 2036. Nuclear power has advantages over fossil fuels in terms of greenhouse gas emissions, yet the Fukushima disaster reminded the world of the dangers associated with fission plants - radiation during their operation and radioactive waste to subsequently dispose of. There is, however, an alternative: nuclear fusion. Given global climate change, the increased health costs and proven loss of life from air pollution emitted by the Mae Moh and Map Ta Phut coal plants in Lampang and Rayong, and ongoing protests against the proposed coal-fired plants in Krabi and Songkhla, the "other" nuclear option bears revisiting.

ITER is due to be completed in 2019 but will never produce commercial energy. It is a research facility designed to produce 500MW of fusion power from 50MW of energy input for up to 1,000 seconds. In 2033 it is due to be replaced by a new-generation reactor that could produce 2,000MW of fusion energy. However, this will still only be a prototype and will require yet another generation of reactors before fusion becomes commercially viable, around 2050.

Faster nuclear fusion development pathways may be realized sooner. While ITER and its descendants will be tokamak reactors - giant "doughnuts" using magnetic confinement, encased in liquid lithium and water layers to produce steam as with conventional reactors - there are alternatives. The main one employs inertial electrostatic confinement, or "polywell" reactors, which look like cubes.

Polywell reactors have been championed by EMC2 Inc, funded by the US Navy for over 20 years. This and similar projects, such as Lockheed Martin's proposed reactor, use a variety of rapidly evolving technologies. Alternatives to polywells, such as magnetised target systems, also exist.

About a dozen companies are currently exploring low-end versions of fusion, with EMC2 and others expecting proof of the concept in 2018-2020.



Athletes may skip the Olympics in Brazil over concerns that the Zika virus causes birth defects

The spreading Zika virus that has been linked to microcephaly (abnormally small brains and heads) in newborn babies in Brazil and other countries has raised concerns about this summer’s Olympics in Brazil, and that includes concerns from high-profile athletes.

“If I had to make the choice today, I wouldn’t go [to the Olympics],” U.S. goalkeeper Hope Solo told SI.com on Monday from Texas, where the U.S. women’s national team opens its Olympic qualifying tournament on Wednesday against Costa Rica.

Unlike other Olympic events, which will take place in the Rio de Janeiro area, Olympic soccer will be held in cities outside Rio—Manaus, Salvador, Brasília, Belo Horizonte and São Paulo—some of which have higher rates than Rio of mosquito-borne viruses like Zika, dengue, chikungunya and malaria.

Based on the current knowledge of Zika (and other congenital infections), as long as a woman is not pregnant or trying to get pregnant when she has Zika, she can acquire the virus and still have a healthy baby later on, says Dr. Celine Gounder, an infectious disease and public health specialist. But Dr. Gounder suggests waiting at least one month after recovering from Zika (and preferably three months) before trying to get pregnant.



The threat that fear of Zika could lead tourists, or even athletes, to stay away from the 2016 Olympics has been added to a list of problems for organizers to resolve before the Rio games in August.

US sports officials have told Olympic athletes that those concerned about the Zika virus should consider not going to the Rio 2016 Olympic Games in August.


US National Intelligence classifies genome editing as a weapon of mass destruction threat

Genome editing is a weapon of mass destruction.

That’s according to James Clapper, U.S. director of national intelligence, who on Tuesday, in the annual worldwide threat assessment report of the U.S. intelligence community, added gene editing to a list of threats posed by “weapons of mass destruction and proliferation.”

Worldwide Threat Assessment of the US Intelligence Community (33 pages)

Gene editing refers to several novel ways to alter the DNA inside living cells. The most popular method, CRISPR, has been revolutionizing scientific research, leading to novel animals and crops, and is likely to power a new generation of gene treatments for serious disease.

The choice by the U.S. spy chief to call out gene editing as a potential weapon of mass destruction, or WMD, surprised some experts. It was the only biotechnology appearing in a tally of six more conventional threats, like North Korea’s suspected nuclear detonation on January 6, Syria’s undeclared chemical weapons, and new Russian cruise missiles that might violate an international treaty.



National Intelligence Genome Editing Assessment

Research in genome editing conducted by countries with different regulatory or ethical standards than those of Western countries probably increases the risk of the creation of potentially harmful biological agents or products. Given the broad distribution, low cost, and accelerated pace of development of this dual use technology, its deliberate or unintentional misuse might lead to far reaching economic and national security implications. Advances in genome editing in 2015 have compelled groups of high-profile US and European biologists to question unregulated editing of the human germline (cells that are relevant for reproduction), which might create inheritable genetic changes. Nevertheless, researchers will probably continue to encounter challenges to achieve the desired outcome of their genome modifications, in part because of the technical limitations that are inherent in available genome editing systems.

Chinese experimental nuclear fusion reactor contained a 50 million degree plasma for 102 seconds

Researchers at the Experimental Advanced Superconducting Tokamak (EAST) said they were able to heat the gas to nearly three times the temperature at the core of the Sun, and keep it there for 102 seconds.

The goal of the EAST program is to reach 100 million Kelvin for over 1,000 seconds (nearly 17 minutes). It would still take years to build a commercially viable plant that could operate in a stable manner for several decades.

The reactor was able to heat hydrogen gas - a hot ionised gas called a plasma - to about 50 million Kelvin (49.999 million degrees Celsius). The interior of our Sun is calculated to be around 15 million Kelvin.

Most of the tokamak devices built over the last 60 years have not been able to sustain a plasma for more than 20 seconds.

The team claimed to have solved a number of scientific and engineering problems, such as precisely controlling the alignment of the magnet, and managing to capture the high-energy particles and heat escaping from the “doughnut”.




February 09, 2016

US Air Force self-protect high-energy laser demonstrator still a high priority for 2021-2022 demonstrations on fifth-generation fighters

The combat laser era might dawn in 2021, when the US Air Force hopes to begin demonstrations of a podded electric laser system for fifth- and sixth-generation fighter jets that can destroy incoming missiles, not just steer them off course.

The US Air Force Research Laboratory has started gathering market information under an advanced technology demonstration program known as SHiELD, or self-protect high-energy laser demonstrator.

According to the request for information notice, the project seeks to integrate a “moderate power” electric laser into a protective pod for supersonic combat jets, including fifth-generation jets like the Lockheed Martin F-35 and F-22 as well as future fighters and bombers.

“SHiELD seeks to expand moderate power (tens of kilowatts) laser weapon operation into the supersonic regime by demonstrating system performance under transonic flight, and acquiring aero-effects data under a supersonic environment relevant to current and future tactical aircraft,” the notice states.

“Advanced laser options under investigation are those with size and weight appropriate for integration as part of a complete laser weapon system into an aerodynamic integrated pod-like structure carried by a tactical aircraft.”

Military scientists hope to validate the laser pod in a laboratory environment (technology readiness level four) by 2017 and be ready for prototype demonstration by 2021.


In 2015, the US Air Force lab was talking about a 2020 demonstration of a podded laser system

  • A defensive system with “tens of kilowatts” of power called SHiELD, the Self-protect HIgh-Energy Laser Demonstrator. It will be demonstrated circa 2020.
  • A longer-range defensive system with 100 kilowatts of power, to be demonstrated in 2022.
  • A 300-kilowatt offensive system capable of destroying enemy aircraft and ground targets at long range.

All these systems will be weapons pods or other external add-ons to existing aircraft, not “fully integrated” inside the airframe like a gun or radar, Masiello cautioned. That means radar-evading aircraft like the F-35 or F-22 couldn’t use them without sacrificing stealth. “We’re talking decades to have some sort of a 300-kw laser possibly integrated into a fighter,” he said.

First burn out enemy sensors and communications and vulnerable systems

In the near term, the aim is to develop and field the next generation of laser defenses that will burn out, not just blind, sensors on SAMs [surface-to-air missiles] and air-to-air missiles.

The SHiELD demo will also look at engaging “soft” ground targets on behalf of Lt. Gen. Heithold and Air Force Special Operations Command. “Soft” wasn’t clearly defined, but it probably means sensors, communications equipment, and other delicate but high-value systems.

Are Nuclear Weapons 100 times Less Effective Than Supposed?

  A guest article by Joseph Friedlander

Nigel B. Cook's Glasstone.Blogspot Blog has beautiful coverage of many nuclear topics here. http://glasstone.blogspot.co.uk/
Cook is a master researcher who digs up incredible piles of research on all topics nuclear. The following is a digest of various writings of his, gathered for easy access and centered on the remarkable thesis that the effects of nuclear weapons, while literally awesome, have been exaggerated or misunderstood to a very great extent, with perhaps very considerable military consequences.

I remember reading a Korean War history book that explained why the US didn't use atomic bombs in 1950: there were only a few available, they needed to be saved for a European war, and the US didn't want to ruin the perception of one bomb per city, two bombs to win a war.
Indeed, a 1950 British study established that to equal World War 2 you would need 300 atomic bombs plus half a million tons of conventional bombs. Even the USA did not have such an arsenal completely ready for use then.

In 1950, the Top Secret British Home Office Scientific Advisory Branch report SA/16 (HO225/16 in the UK National Archives), 'The number of atomic bombs equivalent to the last war air attacks on Great Britain and Germany', concluded (see http://glasstone.blogspot.co.il/2006/08/nuclear-weapons-1st-edition-1956-by.html):

‘... consider the numbers of atomic bombs that would have to be dropped on this country and on Germany to have caused the same total amount of damage as was actually caused by attacks with high explosive and incendiary bombs.’

‘During the last war a total of 1,300,000 tons [i.e. 1.3 MEGATONS of bombs] were dropped on Germany by the Strategic Air Forces [of Britain and America]. If there were no increase in aiming accuracy, then to achieve the same amount of material damage (to houses, industrial and transportational targets, etc.) would have required the use of over 300 atomic bombs together with some 500,000 tons of high explosive and incendiary bombs for targets too small to warrant the use of an atomic bomb… the total of 300,000 civilian air raid deaths in Germany could have been caused by about 80 atomic bombs delivered with the accuracy of last war area attacks, or by about 20 atomic bombs accurately placed at the centres of large German cities...’

This report, SA/16, was kept Top Secret for 8 years, and then Restricted for another 22 years. It was never published, and civil defence was gradually undermined by the exaggeration of nuclear weapons effects by political groups such as CND, the full facts remaining secret.




In the popular picture of science fiction from around say 1946, 1951, 1956 and 1961-- updated of course with the then current version of futuristic weapons-- the mere employment of nuclear weapons either ends civilization or ends the career of the nations that use them as great powers.  Numerous novels portrayed pathetic burned out pockets of survivors isolated across great nuclear deserts.

That is so far from the actual military reality as calculated by Nigel B. Cook that we let these excerpts speak for themselves.
Does this mean nuclear war is a casual lark? No. No. No.

For a view of recovery issues, see 'The Social and Economic Effects of Nuclear War' by Arthur M. Katz and Sima R. Osdoby: http://www.cato.org/pubs/pas/pa009.html


But a science fiction scenario I cannot recall reading is this: what if there was a massive exchange of ICBMs and, while some failed, most did not -- but they simply did not have the imagined effect, and the great warring parties realized to their horror that they were now in a nuclear AND conventional World War 3, with no obvious stopping point and sufficient remaining reserve capacity for mobilization for a decade-long war full of shortages and sacrifices? I can't ever recall reading that in any story!

(Some studies around 1951 of 'broken back war' touched on it, but assumed A-bombs were as effective as the advertising, and that the post-attack USA had problems keeping 20 divisions in the field in Europe. But what if the USA could draft as many riflemen as the Russians had in World War 2 and send them to Europe or Asia in the midst of a tactical nuclear war that also was not as effective as advertised?) What a nightmare struggle.

Note that Nigel never states this himself, but at least some scenarios derived from his data are compatible with that vision.

I have bolded particularly interesting details for the reader to notice below.

An explanation of key civil defense practices can be found here https://en.wikipedia.org/wiki/Duck_and_cover

I should emphasize that below the line it is nearly all Nigel's work, except for some connecting and classifying phrasework this writer added for clarity, or where Nigel is quoting others' work that is otherwise lost to the casual reader deep in the source pages at http://glasstone.blogspot.co.uk/



Below this line it's mostly Nigel:
------------------------------------------

Nigel B. Cook has documented that even well-educated nuclear scientists have learned the wrong values for cratering (which has vast military consequences, since bunkers and silos are only killed by being in the crater).
http://glasstone.blogspot.co.il/2009/08/nuclear-cratering-exaggeration-admitted.html
Read the whole thing there (big page, may take some searching); the significance is this: if the crater is smaller, there is no more one shot, one kill. It becomes an ongoing war on a smaller scale, not a huge final nuclear exchange.

http://glasstone.blogspot.co.il/2010/03/lifeboat-analogy-to-civil-defence.html


Herman Kahn points out in Appendix III of his 1960 book On Thermonuclear War that if the severe damage radius of a nuclear explosion is R and the missile Circular Error Probable (CEP) distance (the radius from the intended target within which 50% of warheads fall) is C, then the probability of a target surviving n warheads is simply S = (1/2)^x, where x = n(R/C)^2. (Note that Kahn's formula assumes a 0% survival chance for a direct hit, which is obviously incorrect for very hard buried targets like the bunker under the Kremlin, reportedly deeper than the crater rupture zone depth for the revised crater dimensions law at high yields; but such deep targets can still be destroyed either by earth-penetrator warheads or by a repeated sequence of ground bursts in the craters formed by prior detonations.) Since R generally scales as only the cube root of the bomb yield, it follows that for constant survival probability the payoff from a given increase in missile accuracy is larger than the effect of varying the weapon yield. Hence many individual bombs, each with smaller yield but improved accuracy, are preferred to fewer, heavier, higher-yield warheads, since they are more destructive to hardened counterforce targets (missiles in silos, underground enemy command posts, tanks, submarines, etc.) while producing less collateral damage to civilians, because the amount of fallout radioactivity (unlike blast and cratering areas) scales directly with the yield.
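Kahn's rule of thumb is easy to put into code. A small sketch with illustrative numbers -- the 300 m damage radius and the CEP values are invented for the example, not taken from Kahn:

```python
# Kahn's survival formula as quoted in the text: S = (1/2)**x,
# with x = n * (R / C)**2, where R is the severe-damage radius,
# C is the CEP, and n the number of warheads fired at the target.
def survival_probability(n, R, cep):
    x = n * (R / cep) ** 2
    return 0.5 ** x

# One warhead whose damage radius equals the CEP: 50% survival.
print(survival_probability(1, R=300, cep=300))   # 0.5

# Halving the CEP (doubling accuracy) beats doubling the warhead count,
# since x grows with (R/C)**2 but only linearly with n:
print(survival_probability(1, R=300, cep=150))   # 0.0625
print(survival_probability(2, R=300, cep=300))   # 0.25
```

This is the quantitative core of the accuracy-beats-yield argument in the paragraph above.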

Nigel's points--
Visions of vast firestorms and melted twisted girders may be greatly exaggerated except very near to ground zeroes.
http://glasstone.blogspot.co.il/2014/05/debunking-hardened-dogma-of-exaggerated.html

The Twin Towers collapsed, Nigel reminds us, from the heat of thousands of gallons of burning aviation fuel running into the steel supports and turning them into putty. Nuclear weapons at best deliver a brief match-like ignition to unshielded dry fire kindling; they don't deliver the minutes of heating needed to dry out and ignite anything thicker than paper! Beside the Thames, London air of around 80% humidity gives wood a moisture content of around 16%, entirely different from the dry Nevada desert, where flashover did once occur in a wooden hut with just 19% air humidity at the 1953 Encore test. Even then, a large window had to be exposed to an unobstructed radial view of the whole fireball.

Nigel's points--
Even in Hiroshima reinforced concrete modern construction resisted blast and shielded from radiation far better than the wooden shacks making up much of the city.
http://glasstone.blogspot.co.il/2014/05/debunking-hardened-dogma-of-exaggerated.html
Hennessy ignored the 1954 Leader-Williams report on civil defence against the H-bomb, instead listing effects from the May 1953 Home Office predictions of casualties and house damage from a massive attack of 132 atomic bombs of 20 kt yield dropped on 39 British towns (CAB 134/942). This is closer to the SLBM MIRV warhead yield today than the 10-20 MT H-bombs considered in 1954-5, which wouldn't even fit by themselves into a typical modern missile, no matter how clever the weaponeer. Hennessy fails to make the most important point in his table on page 130: the relatively few casualties per bomb compared to Hiroshima and Nagasaki, which were wooden cities. The 1953 study found that 35 Hiroshima-Nagasaki 20 kt atomic bombs would be needed to kill 422,000 people in London given WWII-type civil defence (evacuation of women, children and the disabled, and sheltering applied to all likely targets). A similar total number of fatalities occurred with just 2 bombs on the wooden Japanese cities in August 1945, suggesting that raw data from Hiroshima and Nagasaki exaggerates crude fatality rates by a factor of about 35/2 = 17.5; but Hennessy fails to mention this evidence for civil defence. The total of 1,378,000 dead for the 132 nuclear weapons on 39 British towns and cities implies an average of 10,439 killed per 20 kt nuclear weapon, a very small fraction of the death toll per atom bomb at Hiroshima and Nagasaki.

In addition, the 1953 study of 132 atomic bombs shows that 12,326,000 houses are damaged to the point of being at least temporarily uninhabitable (17.5% of these irreparably) - in other words, about 9 houses are made uninhabitable per fatality. This signifies the relatively good survival of people compared to houses, and also the need for civil defence to set up emergency feeding, emergency shelter for the homeless refugees, etc. Hennessy again makes no comment concerning this obvious inference.
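The ratios quoted from the 1953 study are simple division; a quick check of the arithmetic:

```python
# Reproducing the simple ratios quoted from the 1953 Home Office study.
total_dead = 1_378_000
bombs = 132
deaths_per_bomb = round(total_dead / bombs)
print(deaths_per_bomb)                       # ~10,439 deaths per 20 kt bomb

houses_uninhabitable = 12_326_000
houses_per_fatality = round(houses_uninhabitable / total_dead)
print(houses_per_fatality)                   # ~9 houses uninhabitable per death

# Defended-city vs undefended-city comparison claimed in the text:
print(35 / 2)                                # 17.5x exaggeration factor
```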

Nigel's points--
Simply adding megatonnage, without correcting for wasted vertical blast energy, means that equivalent nuclear arsenals are not the equivalent of thousands of World War 2s, but only a few times the damage inflicted on Germany, Japan, Britain and France. (Possibly even less than the damage inflicted on Russia, though Nigel does not state this. With warning, sheltering and civil defense, casualties might be 3% of those imagined.)
http://glasstone.blogspot.co.il/2014/05/debunking-hardened-dogma-of-exaggerated.html
Blast damage area equivalent megatons for destruction area and casualty comparisons ("equivalent megatonnage," EMT) are proportional to the product of the number of bombs and the 2/3 power of the yield of each bomb. In WWII, 2.2 megatons distributed in 22 million conventional 10^-7 megaton bombs were therefore equivalent to 22,000,000 × (10^-7)^(2/3) = 474 one-megaton blast bombs, or 948 nuclear bombs each of 1 megaton yield (blast being 50% of the total energy of a nuclear explosion). In other words, even a thousand megatons in a nuclear war would not be on a different scale from the 2.2 megatons of highly effective, dispersed small bombs of WWII.
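The EMT scaling described above reduces to a one-line function, which can be checked against the WWII figure just quoted and the Vietnam figure given further down (a sketch; yields are in megatons):

```python
# "Equivalent megatonnage": EMT = N * Y**(2/3), with Y in megatons.
# This follows from blast damage area scaling as the 2/3 power of yield.
def emt(n_bombs, yield_mt):
    return n_bombs * yield_mt ** (2 / 3)

# WWII strategic bombing: 22 million 100-kg (1e-7 MT) bombs:
print(round(emt(22_000_000, 1e-7)))   # ~474 equivalent megatons

# Vietnam: 7,662,000 one-ton (1e-6 MT) bombs:
print(round(emt(7_662_000, 1e-6)))    # ~766 equivalent megatons
```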


Key equivalencies 


In the whole of WWI, the British Army fired 170 million shells, with equivalent damage to:
170,000,000 × (3.7 × 10^-9)^(2/3) ≈ 408 separate 1 megaton nuclear weapons.

Now consider WWII, where London alone received about 18.8 kilotons in roughly 188 thousand separate 100 kg explosives in the 1940 Blitz:

188,000 × (10^-7)^(2/3) ≈ 4 thermonuclear weapons, each 1 megaton.
74.2 kilotons of conventional bombs were dropped on the UK in WWII, causing 60,000 casualties - equivalent to 16 separate 1 megaton nuclear weapons, confirming the British Home Office analysis that, given cheap-type civil defence, you get about 3,750 casualties per one-megaton nuclear weapon.

Naturally, without civil defence - as in early air-bombing surprise attacks, or the first use of nuclear weapons against Hiroshima and Nagasaki - casualty rates can be over 100 times higher than this. (For example, Glasstone and Dolan, in The Effects of Nuclear Weapons, 1977, point out that in Hiroshima the 50% lethal radius was only 0.12 mile for people under cover in concrete buildings, compared to 1.3 miles for those caught totally unprotected outdoors. The difference in areas is over a factor of 100, indicating that casualties in Hiroshima could have been reduced enormously had people taken cover in concrete buildings, or in simple earth-covered WWII shelters, which offered similar protection.)
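The "over a factor of 100" in areas follows from squaring the ratio of the two lethal radii quoted from Glasstone and Dolan; a one-line check:

```python
# Hiroshima 50% lethal radii from Glasstone & Dolan (1977), as quoted:
r_sheltered = 0.12   # miles, people under cover in concrete buildings
r_exposed = 1.3      # miles, people totally unprotected outdoors

# Lethal AREA scales as radius squared:
area_ratio = (r_exposed / r_sheltered) ** 2
print(round(area_ratio))   # ~117, i.e. "over a factor of 100" in area
```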



10 kt of small 1-ton TNT bombs = the same area of damage as 1 megaton in a single bomb. The American B-52 bomber has a payload of 32 tons, so it takes 313 sorties to drop 10 kt of TNT, which (if the bombs are 1 ton each) is equivalent in damage area to a 1 megaton nuclear weapon.

 if WWII had been a nuclear war, the same destruction ...would have necessitated dropping 431 nuclear weapons each of 1 megaton yield ...

The 1.3 megatons of conventional bombs dropped on Germany in WWII was likewise equivalent to:

13,000,000 × (10^-7)^(2/3) ≈ 280 separate thermonuclear weapons, each 1 megaton.

In Vietnam, 7,662,000 tons of conventional bombs were dropped (according to Micheal Clodfelter's Vietnam in Military Statistics, 1995, page 225), which by this reckoning (10 kt of conventional bombs = 1 megaton of nuclear) is equivalent in terms of damage to a nuclear war of 766 separate 1 megaton explosions.

So you see, when the proper scaling laws are applied, nuclear weapons are not as destructive as supposed. Put another way, vague arm-waving propaganda exaggerates nuclear war to a very serious degree.


There is an immense blast collateral damage inefficiency in the nuclear bomb as compared to conventional weapons, due to the fact that blast damage areas from peak overpressure are proportional to the two-thirds power of yield. E.g., a 1 kg TNT bomb is a thousand million times smaller in blast energy than a 1 megaton blast, but it produces equal peak overpressures over an area equal to (10^-9)^(2/3) = 10^-6 of that of a 1 megaton blast. Therefore one million separate 1 kg TNT bombs, i.e. 1 kiloton of TNT, is exactly equivalent to a single explosion of 1 megaton of TNT. This explains why the blast effects of a megaton bomb are approximately equal to a 1 kiloton World War II conventional bomber attack, with a hundred or more aircraft each scattering a few tons of TNT in small bombs over a large area target (so that there is little probability of severe blast area overlap, i.e. the wasteful "overkill" effect). But all nuclear weapons media propaganda ignores such facts, presenting a megaton explosion over a city as an unparalleled disaster, a thousand times worse than a large World War II attack. The very first edition of Glasstone's nuclear effects handbook, The Effects of Atomic Weapons, 1950, on page 57 has a section written by John von Neumann and Frederick Reines of Los Alamos (it is attributed to them in a footnote) stating factually:

"... the structures ... have the additional complicating property of not being rigid. This means that they do not merely deflect the shock wave, but they also absorb energy from it at each reflection.

"The removal of energy from the blast in this manner decreases the shock pressure at any given distance from the point of detonation to a value somewhat below that which it would have been in the absence of dissipative objects, such as buildings."


This was removed from future editions. This isn't speculative guesswork: it comes down to the conservation of energy. I emailed Dr Harold L. Brode and other experts about why it isn't included in American nuclear weapons effects manuals. Dr Brode kindly replied with some relevant and interesting facts about non-radial energy flows in Mach waves and the transfer of energy from the blast wave to flying debris (which, alas, travels slower than the supersonic shock front, because the blast wind is always slower than the shock front velocity). It is true that the energy loss from the blast wave near ground level is partially offset by downward diffraction of energy from the diverging blast wave at higher altitudes. However, this downward diffraction process is not a 100% efficient compensator for the energy loss, particularly for the kinetic energy of the air (the dynamic pressure or wind drag effect). The dynamic pressure (which in unobstructed desert or ocean nuclear tests makes the blast more hazardous for higher yield weapons) is an air particle effect, not a wave effect, so it does not diffract like a wave, and it is cut down severely when transferring its energy to building debris. Even if every house absorbs just 1% of the blast energy incident on it, then the destruction of a line of 100 houses cuts the blast energy down to 0.99^100 = 0.366 of what it would be over a desert surface. Basically, this chops down the collateral blast damage from large-yield weapons detonated in cities and affects the usual scaling laws, making nuclear weapons even less dangerous than predicted by the textbook equations and curves.
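The 1%-per-house attenuation argument is a simple compound-decay calculation; a quick check of the 0.366 figure:

```python
# Toy attenuation model from the text: each of 100 houses in a row
# absorbs 1% of the incident blast energy, so the fraction transmitted
# past the last house is 0.99 raised to the 100th power.
remaining = 0.99 ** 100
print(f"{remaining:.3f}")   # ~0.366 of the open-desert blast energy
```

(Coincidentally close to 1/e ≈ 0.368, as expected for n small losses of 1/n each.)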


Nigel's points-- Proven measures and practical designs employed in England in the midst of shortages in 1941 could save over 90% of prospective victims.


The discovery of this "duck and cover" effectiveness in air raids led to a revolutionary shelter design: the indoor Morrison table shelter of 1941. (For publication dates of these booklets, see T. H. O’Brien, Civil Defence, H.M. Stationery Office, 1955, pages 371 and 529.) It is the forerunner of the “inner core refuge” adopted for protection against thermal flash, blasted flying debris and fallout radiation in a nuclear war in the 1980 booklet Protect and Survive.
http://ww2today.com/27th-march-1941-the-morrison-shelter-is-introduced
Above: the facts about the life-saving ability of the Morrison table shelter during aerial bombing in World War II Britain: it protects against the collapse of buildings regardless of whether that collapse is caused by TNT, a hurricane, an earthquake, or a nuclear bomb. A U.K. Government press release from November 1941, Morrison Shelters in Recent Air Raids, states:
“A report of Ministry of Home Security experts on 39 cases of bombing incidents in different parts of Britain covering all those for which full particulars are available in which Morrison shelters were involved shows how well they have stood up to severe tests of heavy bombing.

“All the incidents were serious. Many of the incidents involved direct hits on the houses concerned, a risk against which it was never claimed these shelters would afford protection. In all of them the houses in which shelters were placed were within the radius of damage by bombs; in 24 there was complete demolition of the house on the shelter.

“A hundred and nineteen people were sheltering in these ‘Morrisons’ and only four were killed. So that 115 out of 119 people were saved. Of these only 7 were seriously injured and 14 slightly injured while 94 escaped uninjured. The majority were able to leave their shelters unaided.”
The top set of instructions for building the Morrison shelter and using it as a table between air raids are taken from the instruction manual How to put up your Morrison “Table” Shelter, issued by the Ministry of Home Security, H.M. Stationery Office, March 1941 (National Archives document reference HO 186/580), which states:

“The walls of most houses give good shelter from blast and splinters from a bomb falling nearby. The bomb, however, may also bring down part of the house, and additional protection from the fall of walls, floors and ceilings is therefore very essential. This is what the indoor shelter has been designed to give. Where to put it up, which floor? Ground floor if you have no basement. Basement, if you have one. ... Protect windows of the shelter room with fabric netting or cellulose film stuck to the glass (as recommended in Your Home as an Air Raid Shelter). The sides of your table shelter will not keep out small glass splinters.”


“The public outcry about conditions in the largest public shelters, often without sanitation or even lighting, and the appalling inadequacy of the over-loaded and ill-equipped rest centres for the bombed-out led to immediate improvements, but cost Sir John Anderson his job. ... His successor as Home Secretary, Herbert Morrison ...

“The growing reluctance of many people to go out of doors led the new Home Secretary to look again at the need for an indoor shelter… The result was the Morrison shelter, which resembled a large steel table … During the day it could be used as a table and at night it could, with a slight squeeze, accommodate two adults and two small children, lying down. The first were delivered in March 1941 and by the end of the war about 1,100,000 were in use, including a few two-tier models for larger families. Morrisons were supplied free to people earning up to £350 a year and were on sale at about £7 to people earning more. … the Morrison proved the most successful shelter of the war, particularly during the ‘hit and run’ and flying-bomb raids when a family had only a few seconds to get under cover. It was also a good deal easier to erect than an Anderson, and while most people remember their nights in the Anderson with horror, memories of the Morrison shelter are usually good-humoured.

“... A government leaflet, Shelter at Home, pointed out that ‘people have often been rescued from demolished houses because they had taken shelter under an ordinary table... strong enough to bear the weight of the falling bedroom floor’. I frequently worked beneath the solid oak tables in the school library during ‘imminent danger periods’ and, particularly before the arrival of the Morrison, families became accomplished at squeezing beneath the dining table during interrupted meals. ... Although the casualties were mercifully far fewer than expected, the damage to property was far greater. From September 1940 to May 1941 in London alone 1,150,000 houses were damaged ...”


- Norman Longmate, How we Lived Then - A history of everyday life during the Second World War, Pimlico, 1971.


Above: the British Mission to Japan in 1945 evaluated the nuclear explosion damage at Hiroshima and Nagasaki, producing a report called The Effects of the Atomic Bombs at Hiroshima and Nagasaki (linked here, 42.5 MB pdf file). The Mission consisted of ten British Home Office bomb damage scientists, whose purpose was to compare directly the British bomb damage assessment criteria, derived from German conventional air raids on British cities, with the effects of nuclear weapons. Page 6 states:

"Photographs in this report and elsewhere show great areas of destruction in which, rising here and there like islands, there remain reinforced concrete buildings showing few signs of external damage. There were in fact many reinforced concrete buildings in Hiroshima and a number in Nagasaki. ... These observations make it plain that reinforced concrete framed buildings can resist a bomb of the same power detonated at these heights, without employing fantastic thicknesses of concrete."



On page 8, the report finds that Japanese wood-frame houses collapsed out to a ground range of 2.0 km in Hiroshima and 2.4 km in Nagasaki (at 2.0 km in Hiroshima, 50% of the wood-frame houses were subsequently burned out by the fire storm, due to blast-wave displacement of breakfast cooking charcoal braziers amid the flammable traditional bamboo/paper screen furnishings of the wooden houses; at 2.6 km only 10% were burned out, and at 1.0 km about 90%), while typical British-type brick houses would only have collapsed out to an average distance of 910 metres (at 1.6 km they would have been left standing but irreparably cracked, at 2.4 km they would have needed repair before habitation, and there would have been minor damage from 3.2-4.0 km). Page 9 states:

"The provision of air raid shelters throughout Japan was much below European standards. ... These observations show that the standard British shelters would have performed well against a bomb of the same power exploded at such a height. Anderson shelters [1.5 million of which were assembled in Britain by September 1939, each sleeping 6 people], properly erected and covered, would have given protection. Brick or concrete surface shelters with adequate reinforcement would have remained safe from collapse. The [indoor] Morrison shelter is [a steel table type shelter] designed only to protect its occupants from the debris load of a [collapsing] house, and this it would have done. Deep shelters such as the refuge provided by the London Underground would have given complete protection."

Cresson Kearny's Oak Ridge National Laboratory report Nuclear War Survival Skills, ADA328301 (1979), contains all the evidence for the civil defence measures T. K. Jones was discussing. Kearny's blast and fallout shielding evidence was completely ignored by Scheer, who was engaged in a political diatribe.

http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA328301
I think that part of the reason why civil defense was taken less seriously at that time was that the excellent civil defense chapter in the 1964 edition of Glasstone's The Effects of Nuclear Weapons was completely removed from the 1977 Glasstone and Dolan edition, published during Carter's administration, which also tried to appease the Soviet Union by delaying the deployment of the neutron bomb.


http://glasstone.blogspot.co.il/2010/03/lifeboat-analogy-to-civil-defence.html
John Newman examined the effects of fallout blown into buildings through blast-broken windows in Health Physics, vol. 13 (1967), p. 991: ‘In a particular example of a seven-storey building, the internal contamination on each floor is estimated to be 2.5% of that on the roof. This contamination, if spread uniformly over the floor, reduces the protection factor on the fifth floor from 28 to 18 and in the unexposed, uncontaminated basement from 420 to 200.’ But measured volcanic ash ingress, expressed as the ratio of mass per unit area indoors to that on the roof, was under 0.6% even with the windows open and an 11-22 km/hour wind speed, as reported in U.S. Naval Radiological Defense Laboratory report USNRDL-TR-953, 1965. The main gamma hazard is from a very large surrounding area, not from trivial fallout nearby! Hence, the gamma radiation that needs to be shielded is not that from fallout under your feet. Even if the roof is blown off a building, since 90% of the fallout gamma radiation dose is from direct gamma rays (not Compton-effect air-scattered gammas), any walls or indeed a pile of rubble will shield the long-range direct gamma rays, which arrive almost horizontally. The following quotations are from 'The Challenge - Why Home Defence?', part of the Home Office 1977 Training Manual for Scientific Advisers:

'Since 1945 we have had nine wars - in Korea, Malaysia and Vietnam, between China and India, China and Russia, India and Pakistan and between the Arabs and Israelis on three occasions. We have had confrontations between East and West over Berlin, Formosa and Cuba. There have been civil wars or rebellions in no less than eleven countries and invasions or threatened invasions of another five. Whilst it is not suggested that all these incidents could have resulted in major wars, they do indicate the aptitude of mankind to resort to a forceful solution of its problems, sometimes with success. ...

'Let us consider what a nuclear attack on the United Kingdom might mean. It will be assumed that such an attack will only occur within the context of a general nuclear war which means that the UK is only one of a number of targets and probably by no means the most important. It follows that only part of the enemy's stock of weapons is destined for us. If the Warsaw Pact Nations constitute the enemy - and this is only one possible assumption - and if the enemy directs the bulk of his medium range and intermediate range weapons against targets in Western Europe behind the battle front, then Western Europe would receive about 1,000 megatons. Perhaps the UK could expect about one fifth of this, say 200 Mt. Let us assume rather arbitrarily that this would consist of 5 x 5 Mt, 40 x 2 Mt, 50 x 1 Mt and 100 x 1/2 Mt.

'An attack of this weight would cause heavy damage over about 10,000 square kilometres, moderate to heavy damage over about 50,000 square kilometres, and light damage over an additional 100,000 square kilometres. (Light damage means no more than minor damage to roofs and windows with practically no incidence of fire.) We can compare the heavy damage to that suffered by the centre of Coventry in 1940. This will amount to approximately 5% of the land area of the UK. Another 15% will suffer extensive but by no means total damage by blast and fire; another 40% will suffer superficial damage. The remaining 40% will be undamaged. In other words, four-fifths of the land area will suffer no more than minor physical damage. Of course, many of the undamaged areas would be affected by radioactive fallout but this inconvenience would diminish with the passage of time.

'Policy to meet the Threat

'The example just given of the likely severity of the attack - which is, of course, only one theoretical possibility - would still leave the greater part of the land area undamaged and more people are likely to survive than to perish. Government Home Defence policy must therefore be aimed to increase the prospects of the survivors in their stricken environment.'

'When a bomb is burst in the air the pressure wave is reflected from the ground, and since the reflected wave travels through air which has been compressed and heated by the direct wave, it tends to travel faster than, and to catch up with, the direct wave. When the reflected wave catches up with the direct wave the two join together to form what is called a Mach wave, and this accounts for a pronounced increase in the range of damage.

'The duration of any particular feature of a blast wave varies approximately with the cube root of the power [power in common sense of energy release, not power in the physics definition of the rate of energy release] of the explosion. ... The familiar 500 lb. [230 kg] H.E. [high explosive] bomb of the last war contained about 1/15th of a ton of T.N.T. A nominal [20 kt] atomic bomb contains the equivalent explosive energy of 20,000 tons of T.N.T. The ratio of equivalent weights is therefore 300,000 to 1, and the ratio of the cube roots of these weights is about 70 to 1. The duration of the blast pressure from a 500 lb. bomb is about 1/100th second, so with a nominal atomic bomb it should be 0.7 seconds (actually the duration of the wave increases also with its distance from the source and at distances of 2 miles is about 1 second). Applying the same scaling law, the blast pressure from a [10 megaton] 500 x nominal bomb will last 5 seconds or more.

'These large differences in duration of the positive pressure phase for different sizes of explosion result in the mechanism of damage from an atomic or hydrogen bomb being quite different from that for an H.E. bomb. ... The ability of a suddenly applied blow to cause damage is determined both by the pressure and by its duration. In fact, it is the product of these two (known as the "impulse") which measures the damaging ability of the blast from an H.E. bomb.


'With the nominal [20 kt] bomb the pulse of thermal radiation from the fireball lasts for only about 1.5 seconds though most of the energy is radiated in about half a second; because it is so transient, this pulse has been called the "heat flash". With a 10 megaton bomb the thermal radiation lasts much longer and can hardly be described as a "flash"; it may persist for 20 seconds or more though most of its energy will be radiated in the first 10 seconds.


'People directly exposed to the heat flash from an air burst nominal [20 kt] bomb within 2.5 miles of ground zero would receive burns on exposed skin; even at a distance of 5 miles it would feel as though an oven door had suddenly been opened nearby. The nearer to ground zero the greater is the danger to life, and those directly exposed within 0.5 mile of ground zero [unshielded by white paper or anything opaque] would undoubtedly be killed because of serious burns, if not from other causes. Severe third degree burns (charring) would result up to about a mile, second degree burns (blistering) up to about 1.5 to 2 miles, and first degree burns (reddening) up to about 2.5 miles.

'It is relatively easy to gain protection, since [because atmospheric scattering of thermal radiation has been found to be trivial compared to absorption] one has only to be out of the direct path of the rays from the fireball. Complete protection from heat-burn could be achieved if everyone took cover [just get out of the fireball line-of-sight from windows and skylights]...

Effects of an air burst bomb on public utility services

'The effects of an air burst bomb, whether nominal or larger than nominal, on public utility services would be largely confined to damage above ground. Underground gas and water mains would be undamaged, except possibly where they were carried on bridges, or where they were fairly close to the surface and liable to damage by a collapse of neighbouring heavy masonry. Sewers too should be undamaged. Overground installations and services, such as gas holders, water pumping stations, electricity generating stations and sub-stations, overhead electricity, telephone and telegraph cables, buses and motor cars would be damaged more or less severely up to 1 mile or so from ground zero for a nominal [20 kt] bomb, and up to 8 miles for a 10 megaton bomb. Railway and tramway [street car] tracks would probably remain intact but might be affected by debris, overturned rolling-stock, adjacent fires, etc.

'It is not so easy to assess the chance of a continuing fire. A window of two square metres would let in about 10^5 calories at the 5 cal/(cm)^2 range. The heat liberated by one magnesium incendiary bomb is 30 times this and even with the incendiary bomb the chance of a continuing fire developing in a small room is only 1 in 5; in a large room it is very much less.

'Thus even if thermal radiation does fall on easily inflammable material which ignites, the chance of a continuing fire developing is still quite small. In the Birmingham and Liverpool studies, where the most generous values of fire-starting chances were used, the fraction of buildings set on fire was rarely higher than 1 in 20.

'And this is the basis of the assertion [in Nuclear Weapons] that we do not think that fire storms are likely to be started in British cities by nuclear explosions, because in each of the five raids in which fire storms occurred (four on Germany - Hamburg, Darmstadt, Kassel, Wuppertal and a "possible" in Dresden, plus Hiroshima in Japan - it may be significant that all these towns had a period of hot dry weather before the raid) the initial fire density was much nearer 1 in 2. Take Hamburg for example:

'On the night of 27/28th July 1943, by some extraordinary chance, 190 tons of bombs were dropped into one square mile of Hamburg. This square mile contained 6,000 buildings, many of which were [multistorey wooden] medieval.

'A density of greater than 70 tons/sq. mile had not been achieved before even in some of the major fire raids, and was only exceeded on a few occasions subsequently. The effect of these bombs is best shown in the following diagram, each step of which is based on sound trials and operational experience of the weapons concerned.

'102 tons of high explosive bombs dropped -> 100 fires

'88 tons of incendiary bombs dropped, of which:

'48 tons of 4 pound magnesium bombs = 27,000 bombs -> 8,000 hit buildings -> 1,600 fires

'40 tons of 30 pound gel bombs = 3,000 bombs -> 900 hit buildings -> 800 fires

'Total = 2,500 fires

'Thus almost every other building [1 in 2 buildings] was set on fire during the raid itself, and when this happens it seems that nothing can prevent the fires from joining together, engulfing the whole area and producing a fire storm (over Hamburg the column of smoke, observed from aircraft, was 1.5 miles in diameter at its base and 13,000 feet high; eyewitnesses on the ground reported that trees were uprooted by the inrushing air).

'When the density was 70 tons/square mile or less the proportion of buildings fired during the raid was about 1 in 8 or less and under these circumstances, although extensive areas were burned out, the situation was controlled, escape routes were kept open and there was no fire storm.'
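As a numerical cross-check (an editorial sketch, using only the figures given in the quotation above), the Hamburg fire-raid arithmetic can be reproduced:

```python
# Cross-check of the bomb and fire totals in the quoted Hamburg raid
# diagram (27/28 July 1943). All figures are taken from the quotation.

tons_he, fires_he = 102, 100        # high-explosive bombs dropped -> fires
tons_mg, bombs_mg = 48, 27_000      # 4 lb magnesium incendiaries
tons_gel, bombs_gel = 40, 3_000     # 30 lb gel incendiaries
fires_mg, fires_gel = 1_600, 800

# Per-bomb weights implied by the quoted tonnages and counts
# (long tons of 2,240 lb):
print(f"{tons_mg * 2240 / bombs_mg:.1f} lb per magnesium bomb")   # ~4.0
print(f"{tons_gel * 2240 / bombs_gel:.1f} lb per gel bomb")       # ~29.9

total_fires = fires_he + fires_mg + fires_gel
print(f"{total_fires} fires among 6,000 buildings")               # 2500

# Roughly 1 fire per 2.4 buildings -- the "almost every other
# building" fire density described in the quotation:
print(f"1 in {6_000 / total_fires:.1f} buildings set alight")
```

The implied bomb weights (4 lb and 30 lb) match the quoted descriptions, confirming the internal consistency of the diagram.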

'Hardening of personal transistor radios is theoretically possible and implies good design practice (e.g. shielding, bonding, earthing, filtering etc.) incorporated at the time of manufacture. Such receivers are not currently available on the popular market.'

http://glasstone.blogspot.co.il/2006/08/nuclear-weapons-1st-edition-1956-by.html

Above: Adam Curtis's 1992 BBC broadcast attacking Herman Kahn's civil defense facts, Pandora's Box: To The Brink Of Eternity. The clip of Kahn saying:

"Even if you irrationally decide to go to war, that doesn't mean that you have to fight it in a wildly irrational fashion,"


is taken out of context: Kahn is referring to the lesson of Britain's decision to go to war with Hitler in September 1939. Because of the pathetic year-on-year rate of Britain's rearmament compared to Germany's after 1935, Britain was in the feeblest possible state to go to war at that time. Kahn's argument is that Britain should rationally have decided to go to war in, say, 1934 and stopped Germany's illegal rearmament with minimal combat (Britain was still the more powerful in 1934), or else have surrendered completely to the Nazi threat. By leaving war until 1939, Britain was either (1) deliberately allowing the Nazis to prepare better for war than Britain (remember that in 1939 America was neutral and there was no sight of lend-lease on the horizon), or (2) behaving irrationally: declaring war irrationally, having been duped by weapons-effects-exaggerating appeasers into not declaring war while it still had a chance of winning without American help. This is the first thing to understand about Kahn's statement. The second is that even though Britain declared war at an irrational time (it should have done so earlier, when the ratio of British to German strength was higher), it did not fight the war in an irrational fashion, and nor did Germany. Neither side immediately despatched 100% of its bombers filled with weapons of mass destruction like gas or germs to kill the other side, despite the appeasers' pre-war certainty that war would escalate instantly into mass destruction, with a million casualties a month predicted in Britain.

This is Kahn's second point: even Hitler didn't immediately try to annihilate the world's population with his stockpile of mustard liquid contaminant or the Nazi-discovered tabun nerve gas, of which thousands of tons were manufactured but never used by the Nazis, because they didn't have enough gas masks to deal with mustard gas retaliation, owing to a rubber shortage. Kahn follows A. J. P. Taylor's The Origins of the Second World War standpoint on Hitler here: Hitler was a bigoted, egotistical dictator, but so are most politicians at heart; most politicians are simply so inept that they fail to obtain enough power to corrupt them absolutely (a point long ago observed by the 19th century historian Lord Acton: “And remember, where you have a concentration of power in a few hands, all too frequently men with the mentality of gangsters get control. History has proven that. All power corrupts; absolute power corrupts absolutely”). In other words, Hitler was not a unique thug; thugs are common. Taylor's point is that Hitler's 1930s propaganda was true to the extent that he was doing his best for the "Aryan" German. Hitler believed Britain wouldn't fight under any circumstances, because that was what the leading British politicians and newspaper editors were saying when they exaggerated the effects of weapons and claimed that Britain would be wiped out in war, so that war was unthinkable. Kahn draws the lesson that war must never again be unthinkable to the public, if war is to be averted by unequivocal deterrence.

Curtis additionally gets the facts about Kahn wrong by claiming that Kahn's "flexible response" strategy was debunked by the Cuban missile crisis. Kahn makes it clear that "flexible response" was needed once the Soviet Union had a balance with America, not before that time. The Soviet Union was still behind America in 1962, so Kennedy could afford to promise massive retaliation in response to any missile being fired from Cuba in order to encourage Soviet caution with the missiles in Cuba. Massive retaliation would have been an empty promise if the Soviet Union had an immense stockpile of nuclear weapons in 1962. It didn't. Kahn had proposed flexible response for the late 1960s onwards.

Kahn ignored simple "duck and cover" countermeasures

Kahn never really made the civil defense case effectively by getting to grips with the details of survival in Hiroshima and Nagasaki in different kinds of buildings, or with the effectiveness of "duck and cover"; the detailed scientific studies of nuclear weapons effects from tests remained Secret – Restricted Data until recent years. Politicians, policy makers, and even many nuclear weapons effects computation scientists are unaware of the vital data from Hiroshima, Nagasaki and nuclear tests: http://glasstone.blogspot.co.il/2010/03/lifeboat-analogy-to-civil-defence.html

    
http://glasstone.blogspot.co.uk/2015/10/russian-anti-terrorism-policing-world.html
The larger the explosion, the fewer the casualties per unit of energy released, for similar conditions, because (1) the area of destruction scales up far less than in direct proportion to the energy released in the explosion, and (2) the larger area of destruction of a bigger explosion means a longer average blast arrival time over the area of damaged buildings, allowing on average more time to "duck and cover" between the flash and the arrival of the blast and sound. Most blast casualties are caused by flying glass and other debris striking standing people who take no evasive action.

At the 1 psi peak overpressure range (flying glass from window breakage), for 1 ton of TNT yield (a V1 or V2 in WWII) you have only 0.4 seconds to respond to the flash before the blast arrives. But for a tactical nuclear weapon such as a neutron bomb of 1 kiloton yield you have 4 seconds to "duck and cover", which reduces the possibilities for collateral damage, and for a strategic nuclear weapon of 1 megaton on a silo or military command post, the public has 40 seconds to "duck and cover" before the 1 psi blast arrives. These laws of physics mean that smaller explosions cause many more casualties per unit of energy than larger explosions do. As the data table and graph above prove, there is experimental evidence to substantiate them. It is also easier to spend a few seconds lying flat to avoid blast winds and horizontally blasted glass fragments in a nuclear explosion than to spend months doing so during the repeated conventional air raids required to deliver the same amount of energy! The longer blast duration of larger explosions also blows debris further downrange, reducing the weight of material falling on simple improvised protection, such as table shelters.
Although a longer duration blast causes more damage to wind-sensitive targets, by bending tree stems and lightweight metal panels for a longer period of time, it has little effect on modern reinforced concrete buildings, which require a minimum force for damage irrespective of the duration of that force. Likewise, a chair or rigid wall does not suddenly collapse after you apply a force to it for a certain period of time: there is no failure impulse criterion for such targets! You must apply a force above a certain threshold for destruction to occur; if the force (pressure multiplied by area) is below that threshold, no destruction will occur, regardless of the duration of the force. Propagandists of fashionable groupthink always dismiss this evidence and instead make a direct comparison of conventional and nuclear energy yields, as if that were valid: it is a massive exaggeration. In addition, anti-civil defense propagandists also deny that duck and cover becomes more credible as yield increases. (Source: H. M. Government, Health and Safety Executive (Commission), Advisory Committee on Major Hazards, Second Report, 1979, Figure 3.)
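The "duck and cover" warning times stated above follow from cube-root scaling of blast arrival time with yield. A minimal sketch, taking the 0.4 second figure for a 1 ton explosion as the baseline (a simplification: real arrival times also depend on burst height and the overpressure range considered):

```python
# Blast arrival time at the 1 psi (window-breakage) range scales roughly
# with the cube root of yield, like other blast-wave dimensions.
# Baseline from the text: 1 ton of TNT -> about 0.4 s of warning.

def warning_time_seconds(yield_tons, baseline_tons=1.0, baseline_s=0.4):
    """Approximate flash-to-blast interval at the 1 psi range."""
    return baseline_s * (yield_tons / baseline_tons) ** (1.0 / 3.0)

for label, tons in [("1 ton (V1/V2)", 1), ("1 kt neutron bomb", 1e3),
                    ("1 Mt strategic weapon", 1e6)]:
    print(f"{label}: ~{warning_time_seconds(tons):.1f} s to duck and cover")
```

Each factor of 1,000 in yield multiplies the warning time tenfold, reproducing the 0.4 s, 4 s and 40 s figures in the text.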

In WWI, Britain fired 170 million shells at German trenches, of which 1.5 million were fired in the brief barrage before the Battle of the Somme.  In 1917 alone, Britain produced 50 million shells containing 185 kilotons of explosive.  In the Battle of Amiens in August 1918, the firing of 4,000,000 Allied shells broke down the German positions.  In the final push, devastation at a rate similar to a nuclear-war bombardment occurred when 943,947 shells were fired in a 24-hour period by the British Army on 28-29 September 1918, leading to the Armistice that ended the war (source: Malcolm Pearce and Geoffrey Stewart, British Political History, 1867-2001, page 296).  Altogether, from 1914-17 Britain fired 290 kilotons of high explosive in shells at German trenches:



The "equivalent megatonnage" (the number of equivalent 1 megaton nuclear weapons) isn't just 0.29, but is immense, because the area of destruction (and thus the casualties) scales as only about the 2/3 power of energy, not directly with yield, and each average shell contained only 3.7 kg of explosive. Thus, the equivalent megatonnage of Britain's shelling in 1917 alone is:

50,000,000 × (3.7 × 10^-9)^(2/3) = 120 separate 1 megaton nuclear weapons.  In the whole of WWI, the British Army fired 170 million shells, with equivalent damage to:

170,000,000 × (3.7 × 10^-9)^(2/3) = 408 separate 1 megaton nuclear weapons.

(We can neglect the 50% blast partition of total yield in nuclear weapons, because that's also true for conventional explosive shells that are 50% explosive, 50% steel case by mass.)
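The shell-to-megaton conversion can be written out explicitly. Assuming, as the text does, that damage area scales as the 2/3 power of yield and that the average shell contained 3.7 kg of explosive:

```python
# Equivalent megatonnage: damage area (and hence casualties) scales as
# roughly the 2/3 power of yield, so N bombs of y megatons each match
# N * y**(2/3) separate 1-megaton weapons.

def equivalent_megatons(n_bombs, yield_mt_each):
    return n_bombs * yield_mt_each ** (2.0 / 3.0)

shell = 3.7e-9  # 3.7 kg of explosive per average shell, in megatons of TNT

print(f"1917 alone: {equivalent_megatons(50_000_000, shell):.0f}")   # ~120
print(f"whole WWI:  {equivalent_megatons(170_000_000, shell):.0f}")  # ~407
```

The 1917 figure reproduces the 120 in the text exactly; the whole-war figure comes out at about 407, within rounding of the text's 408.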


Dr Ralph E. Lapp's 1965 book The New Priesthood (Harper, New York) on pages 113-114 gives an honest "equivalent megatonnage" comparison between conventional weapons and the old high-yield megaton single-warhead nuclear missiles (which have now been replaced with lower yield MIRV warheads), instead of following CND by claiming falsely that a single explosion of 1,000,000 tons of TNT equivalent kills the same number of people as a million separate one-ton TNT conventional explosions:

"A warhead for a Minuteman or Polaris missile costs about $1 million each. ... To produce damage comparable to that from a one-megaton bomb, some 8,000 'old-fashioned' bombs each containing one ton of TNT would have to be dropped uniformly over the same target area."

In other words, according to Lapp: 8 kt of conventional weapons = 1 megaton.  Using the two-thirds power of yield scaling, the equivalence is: 10 kt of small 1 ton TNT bombs = same area of damage as 1 megaton in a single bomb.  The American B-52 bomber has a payload of 32 tons, so it takes 313 sorties to drop 10 kt of TNT which (if the bombs are 1 ton each) is equivalent in damage area to a 1 megaton nuclear weapon.  For solid direct evidence for the validity of this scaling law, whereby bigger bombs cause fewer fatalities per TNT ton of energy equivalent than smaller bombs, see the graphs linked in the earlier post here and the ease of protection against the increasingly delayed heat, fallout and blast arrival time over larger areas for bigger explosions, as proved here.  At the 1 psi peak overpressure range for shattered windows in a conventional 1 ton TNT air burst explosion, there is only 0.4 second available between the flash and the blast arrival, little longer than the blink reaction time for human beings.  Hence, for small bombs, you can do little.  But, contrary to BBC TV fiddled sound tracks on films of nuclear explosions, for a 1 kt bomb you have a full 4 seconds before 1 psi arrives, while for 1 megaton you have 40 seconds.  This effect reduces casualties.

In Vietnam, 7,662,000 tons of conventional bombs were dropped (according to Michael Clodfelter's Vietnam in Military Statistics, 1995, page 225), which by this reckoning (10 kt of conventional bombs = 1 megaton of nuclear) is equivalent in terms of damage to a nuclear war of 766 separate 1 megaton explosions.
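Lapp's equivalence, the sortie count and the Vietnam figure can all be reproduced with the same 2/3-power rule (a sketch using only the figures stated above):

```python
# Two-thirds-power equivalence: how many one-ton bombs match the damage
# area of a single 1-megaton weapon?
one_ton_mt = 1e-6  # one ton of TNT, expressed in megatons
bombs_per_megaton = 1.0 / one_ton_mt ** (2.0 / 3.0)
print(f"{bombs_per_megaton:.0f} one-ton bombs ~ one 1 Mt weapon")  # 10,000, i.e. 10 kt

# B-52 sorties needed to deliver those 10 kt at 32 tons payload per sortie:
print(f"{bombs_per_megaton / 32:.1f} sorties")  # 312.5, i.e. ~313

# Vietnam: 7,662,000 tons of conventional bombs, at 10 kt per
# megaton-equivalent:
print(f"{7_662_000 / bombs_per_megaton:.0f} one-megaton equivalents")  # ~766
```

The 10,000-to-1 ratio is just (10^-6)^(2/3) inverted, slightly above Lapp's rounder 8,000-to-1 estimate quoted earlier.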

If you're worried that we haven't included fallout, don't worry: we didn't include the 113,000 tons of gas used in WWI in that calculation.  But seeing that gas wasn't used in WWII despite dire scare-mongering prior to the war - largely responsible for the appeasement policy that led to the war, according to Herman Kahn's analysis - there's no particular reason why nuclear weapons will be used to maximise fallout by high yield ground bursts near cities, rather than air bursts.  Likewise for the time-scale of the attack: in 1939 pundits were claiming that there would be an immediate all-out "knockout blow" lasting days, not six years of protracted war.  As Kahn argued, even a dictator like Hitler didn't fight WWII in the wildly irrational way that the consensus of expert opinion in 1939 predicted. 


 There's even less reason for a country to try to disarm itself by detonating every warhead it has within five minutes of a nuclear war starting.

Now consider WWII, where London alone received about 18.8 kilotons in roughly 188 thousand separate 100 kg explosives in the 1940 Blitz:

188,000 × (10^-7)^(2/3) = 4 thermonuclear weapons, each 1 megaton.

The 1.3 megatons of conventional bombs dropped on Germany in WWII was likewise equivalent to:

13,000,000 × (10^-7)^(2/3) = 280 separate thermonuclear weapons, each 1 megaton.

In total, 74.2 kilotons of conventional bombs were dropped on the UK in WWII causing 60,000 casualties, equivalent to 16 separate 1 megaton nuclear weapons, confirming the British Home Office analysis that - given cheap-type civil defence - you get about 3,750 casualties for a one megaton nuclear weapon.  Naturally, without civil defence, as in early air bombing surprise attacks or the first use of nuclear weapons against Hiroshima and Nagasaki, casualty rates can be over 100 times higher than this.  (For example, Glasstone and Dolan, in The Effects of Nuclear Weapons, 1977 point out that in Hiroshima the 50% lethal radius was only 0.12 mile for people under cover in concrete buildings, compared to 1.3 miles for those caught totally unprotected outdoors.  The difference in areas is over a factor of 100, indicating that the casualties in Hiroshima could have been reduced enormously if the people had taken cover in concrete buildings, or simple earth covered WWII shelters which offered similar protection to concrete buildings.)
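The same two-thirds-power calculation reproduces the WWII figures above (a sketch; the 100 kg average bomb size is the text's assumption):

```python
# The 2/3-power equivalence applied to the WWII bombing figures.

def equivalent_megatons(n_bombs, yield_mt_each):
    # damage area scales as yield**(2/3)
    return n_bombs * yield_mt_each ** (2.0 / 3.0)

bomb_100kg = 1e-7  # a 100 kg bomb, expressed in megatons of TNT

# London Blitz 1940: 18.8 kt as ~188,000 bombs of 100 kg each
print(f"London Blitz: {equivalent_megatons(188_000, bomb_100kg):.0f} Mt-equivalents")

# Bombing of Germany: 1.3 Mt as 13,000,000 bombs of 100 kg each
print(f"Germany: {equivalent_megatons(13_000_000, bomb_100kg):.0f} Mt-equivalents")

# UK total: 74.2 kt = 742,000 bombs of 100 kg -> ~16 Mt-equivalents,
# and 60,000 casualties spread over those gives the Home Office figure:
uk_equiv = equivalent_megatons(742_000, bomb_100kg)
print(f"UK total: {uk_equiv:.0f} Mt-equivalents")
print(f"{60_000 / uk_equiv:.0f} casualties per megaton")  # ~3,750
```

The three results (4, 280 and 16 megaton-equivalents) match the figures in the text, and the casualty rate per megaton-equivalent comes out at roughly 3,750.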

About ten percent of the conventional bombs failed to detonate, creating a massive bomb disposal problem that slowed down civil defence in WWII; the protracted air raids over many months progressively reduced shelter utilization in London, increasing the casualty rate.  In neither Britain nor Germany did the bombing of civilians lead to a clear defeat: the U.S. Strategic Bombing Survey found that generally the outrage at being bombed offset the depression of morale from the devastation.  Strategic bombing of military manufacturing targets like ball bearing factories failed because the steel machine tools could easily withstand the blast and shrapnel.  Only the bombing of fuel and munition supplies (both of which destroy themselves easily, once ignited) crucially helped to end the war: German production of aviation fuel fell from 156,000 tons in May 1944 to just 11,000 tons in January 1945, leading to defeat.  The point is:

Conventional weapons failed to deter two world wars, which were each the size of a substantial nuclear war (in terms of devastation and overall casualties).  Disarmament after WWI led to WWII.

If you're worried that we haven't included fallout, don't worry: we didn't include the 113,000 tons of gas used in WWI in that calculation either.  But seeing that gas wasn't used in WWII despite dire pre-war scare-mongering (scare-mongering largely responsible, according to Herman Kahn's analysis, for the appeasement policy that led to the war), there's no particular reason why nuclear weapons would be used to maximise fallout by high yield ground bursts near cities, rather than air bursts.

Likewise for the time-scale of the attack: in 1939 pundits were claiming that there would be an immediate all-out "knockout blow" lasting days, not six years of protracted war.  As Kahn argued, even a dictator like Hitler didn't fight WWII in the wildly irrational way that the consensus of expert opinion in 1939 predicted.  There's even less reason for a country to try to disarm itself by detonating every warhead it has within five minutes of a nuclear war starting.

Now consider WWII, where London alone received about 18.8 kilotons of bombs in roughly 188,000 separate 100 kg explosives during the 1940 Blitz:

188,000 × (10^-7)^(2/3) = 4 thermonuclear weapons, each 1 megaton.

The 1.3 megatons of conventional bombs dropped on Germany in WWII was likewise equivalent to:

13,000,000 × (10^-7)^(2/3) = 280 separate thermonuclear weapons, each 1 megaton.
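Both conversions can be checked with a one-line function implementing the area-damage law, relative area damaged = number of bombs × (yield in megatons)^(2/3); a minimal sketch, with an illustrative function name:

```python
def equivalent_megaton_weapons(number_of_bombs: int, bomb_yield_mt: float) -> float:
    """Number of separate 1 Mt explosions doing the same area damage,
    by the 2/3-power area-damage scaling law."""
    return number_of_bombs * bomb_yield_mt ** (2 / 3)

# London Blitz: 188,000 bombs of 100 kg (1e-7 Mt) each
print(round(equivalent_megaton_weapons(188_000, 1e-7)))     # ~4

# Germany: 1.3 Mt dropped as 13,000,000 bombs of 100 kg each
print(round(equivalent_megaton_weapons(13_000_000, 1e-7)))  # ~280
```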

That's what you get when you don't even have a nuclear deterrent.  However, I don't see why we have to have the extremely expensive (£100 billion for a set of four) strategic nuclear Trident SLBM system.  Why not simply put some tactical (enhanced neutron) nuclear warheads on cruise missiles aboard our Astute class submarines (which now cost us only £747 million each) to deter Putin from sending massed tank invasions into Europe?  Then, if Mr Corbyn has to press the button, he can rest assured that 1 kiloton yield nuclear weapons burst at 500 m altitude over Mr Putin's tank column, as it heads over a border, will not cause any harm to civilians.  Sure, some cruise missiles might be shot down; but since Moscow has ABM, some Trident warheads would likewise be shot down.  Just as Trident uses penetration aids like cheap decoy warheads to help the real ones get through saturated ABM systems, the neutron-warhead-armed cruise missiles could be disguised within a salvo of non-nuclear cruise missiles (whose conventional warheads could carry electronic countermeasures to blind enemy radars with false signals or noise, and could target the enemy missile launchers that shoot down cruise missiles).  Some warheads will get through to do the job.

Note also that the widely believed propaganda that the Spitfire and Hurricane fighter aircraft then being built were a wonderful contribution thanks to Chamberlain is actually a lie.  Both aircraft were already obsolescent compared to the German Me-109 when used in the Battle of Britain in 1940.  Thus the growing stockpile of Spitfires in 1938 was not only outnumbered by Germany's aircraft, but was also soon obsolete in quality.  Battle of Britain veteran Tom Neil, author of Scramble, aged 95, shot down 14 German aircraft and won two DFCs in the Battle of Britain.  He has now debunked the populist myths.  He joined the RAF in 1938 and was taught to fly using a 20 year old obsolete Tiger Moth, so that when finally given charge of 249 Squadron in 1940 he failed in practice to hit any target flags with his first 30,000 rounds of ammunition; he then found that German Me-109s had a larger engine than his Spitfires and could climb faster as well as higher, and also had better guns and more ammunition than the Spitfires and Hurricanes.  Britain won the Battle of Britain not because it had superior aircraft, as hyped-up wartime propaganda for the Spitfire claimed, but because it survived the German onslaught despite having poorer aircraft: "We didn't win.  But we didn't lose."

Not only were Britain's Hurricane and Spitfire actually inferior to the German Me-109, but they were also outnumbered: Germany had over 700 superior Me-109s and 227 Me-110s, compared to Britain's 650 inferior Hurricanes and Spitfires.  This disproves Chamberlain's claim.  It was civil defence evacuation and shelters that won the Battle of Britain when, on 7 September 1940, German bombers stopped bombing RAF airfields and instead bombed cities.  By reducing casualty rates and panic, civil defence gave the RAF the time for fighter attrition to cut the Luftwaffe down to size.

http://glasstone.blogspot.co.il/2006/08/nuclear-weapons-1st-edition-1956-by.html
It turns out that 1 megaton as a single explosion is only the equivalent of 4.64 kilotons of 100 kg bombs, because damage efficiency is greater for smaller bombs.

(This is the reason that America stopped designing very high yield thermonuclear weapons after the 1954 nuclear tests of Operation Castle, and the mean yield of the 4,552 nuclear warheads and bombs in the deployed 1.172 Gt or 1,172 Mt U.S. nuclear stockpile is only 0.257 Mt or 257 kt. 257 kt is just 12 times the yield of the Nagasaki bomb, so by the cube-root scaling law the blast destruction radii for the mean yield of 257 kt is just 2.27 times the blast destruction radii in Nagasaki. Because there are no flimsy wood-frame inflammable cities in the West, the actual effects of typical stockpiled nuclear weapons today would be less severe than they were in Nagasaki.)
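The parenthetical arithmetic can be verified directly (a sketch; the Nagasaki yield is taken here as roughly 22 kt, an assumption consistent with the "12 times" figure above):

```python
stockpile_mt = 1172   # deployed U.S. stockpile, megatons (figure quoted above)
warheads = 4552

mean_yield_kt = 1000 * stockpile_mt / warheads
print(round(mean_yield_kt))        # ~257 kt mean yield

# Blast destruction radii scale as the cube root of yield:
nagasaki_kt = 22                   # assumed approximate Nagasaki yield
radius_ratio = (mean_yield_kt / nagasaki_kt) ** (1 / 3)
print(round(radius_ratio, 2))      # ~2.27 times the Nagasaki destruction radii
```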

Because the average bomb size of conventional (chemical) high explosive bombs was under 100 kg in WWII, they were far more efficient than a megaton nuclear bomb: relative area damaged = (number of bombs) × (bomb yield)^(2/3)

Hence to get the same area damaged by 100 kg TNT bombs as by a 1 Mt nuclear bomb, you would need only 1/(10^-7)^(2/3) = 46,400 conventional 100 kg bombs, a total of just 46,400 × 0.0001 = 4.64 kilotons of bombs doing the same area destruction as a single 1 megaton bomb.  To emphasise this non-linear addition law:

1 megaton of TNT as a single explosion = 4.64 kt of 100 kg bombs in an air raid

The relative efficiency of the single 1 Mt nuclear bomb in this example is only 0.464% compared to conventional small TNT explosive bombs.
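The 46,400-bomb, 4.64 kt and 0.464% figures all follow from the same law; a minimal Python check:

```python
# Bombs of 100 kg = 1e-7 Mt matching the damage area of one 1 Mt bomb:
bombs_needed = 1 / (1e-7) ** (2 / 3)
print(round(bombs_needed))               # ~46,416 (rounded to 46,400 in the text)

# Their total tonnage in kilotons (each 100 kg bomb is 0.0001 kt):
total_kt = bombs_needed * 0.0001
print(round(total_kt, 2))                # ~4.64 kt

# Relative efficiency of the single 1 Mt bomb, as a percentage:
print(round(100 * total_kt / 1000, 3))   # ~0.464 %
```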

Hence, heavy conventional high explosive bombing raids with hundreds of aircraft in WWII produced the same destruction as a relatively large thermonuclear weapon.  The fact that easily mitigated effects (such as delayed fallout, and thermal radiation, which is easily avoided by ducking and covering exposed skin) were absent in the high explosive attacks, where the energy wasn't wasted but went mainly into blast wave damage, made conventional warfare far more dangerous.

It is estimated that Mongol invaders exterminated 35 million Chinese between 1311-40, without modern weapons. Communist Chinese killed 26.3 million dissenters between 1949 and May 1965, according to detailed data compiled by the Russians on 7 April 1969. The Soviet communist dictatorship killed 40 million dissenters, mainly owners of small farms, between 1917-59. Conventional (non-nuclear) air raids on Japan killed 600,000 during World War II. The single incendiary air raid on Tokyo on 10 March 1945 killed 140,000 people (more than the total for nuclear bombs on Hiroshima and Nagasaki combined) at much less than the $2 billion expense of the Hiroshima and Nagasaki nuclear bombs! Non-nuclear air raids on Germany during World War II killed 593,000 civilians.
http://glasstone.blogspot.co.il/2015/10/russian-anti-terrorism-policing-world.html
At the 1 psi peak overpressure range for shattered windows in a conventional 1 ton TNT air burst explosion, there is only 0.4 second available between the flash and the blast arrival, little longer than the blink reaction time for human beings.  Hence, for small bombs, you can do little.  But, contrary to the fiddled sound tracks on BBC TV films of nuclear explosions, for a 1 kt bomb you have a full 4 seconds before the 1 psi blast arrives, while for 1 megaton you have 40 seconds.  This effect reduces casualties.
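Since the distance to a fixed peak overpressure scales as the cube root of yield, so does the flash-to-blast warning time; scaling from the 0.4 second baseline for 1 ton of TNT (a sketch, with an illustrative function name):

```python
def warning_time_s(yield_tons: float, baseline_s: float = 0.4) -> float:
    """Seconds between flash and 1 psi blast arrival, scaled by the
    cube-root law from ~0.4 s for a 1 ton TNT air burst."""
    return baseline_s * yield_tons ** (1 / 3)

print(round(warning_time_s(1), 1))          # 0.4 s for 1 ton
print(round(warning_time_s(1_000), 1))      # 4.0 s for 1 kt
print(round(warning_time_s(1_000_000), 1))  # 40.0 s for 1 Mt
```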

In Vietnam, 7,662,000 tons of conventional bombs were dropped (according to Michael Clodfelter's Vietnam in Military Statistics, 1995, page 225), which by this reckoning (10 kt of conventional bombs = 1 megaton of nuclear) is equivalent in terms of damage to a nuclear war of 766 separate 1 megaton explosions.