January 23, 2016

Ultraresponsive touch devices up to 100 inches in diagonal size enable new applications

Cima NanoTech, a developer and manufacturer of transparent conductive film solutions, has announced large, ultraresponsive touch screens.

The SANTE® ProTouch™ module is the industry’s first high-performance projected capacitive (pro-cap) solution for large-format touch screens that can be mass-produced in high volumes. With ultra-fast response and excellent touch performance, this module enables an exceptionally intuitive multi-touch, multi-user experience for a wide array of applications such as interactive digital signage, kiosks, vending machines, tabletops and whiteboards. The company’s patented SANTE technology allows manufacturers to produce highly customizable touch displays with a significantly lower cost structure.



Key Features and Benefits of SANTE ProTouch Module:

  • Ultra-fast response time with a scan rate of up to 120Hz (6ms) for a more intuitive and interactive experience (a quick sanity check on these numbers follows the list)
  • True multi-touch capability of up to 32 points for greater interactivity and better collaboration
  • Edge-to-edge cover lens for enhanced aesthetics and ease of maintenance
  • Narrow 20mm bezel to enhance visual appeal
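
A quick sanity check on the headline numbers above (a sketch; the release does not say whether the 6ms figure is the scan period or a separate touch-latency spec, and at 120Hz the raw scan period works out to about 8.3ms):

    # Sketch: relate a touch controller's scan rate to its scan period.
    # Assumption: "response time" here means the period of one full sensor
    # scan; the quoted 6ms may instead be a separate touch-latency figure.

    def scan_period_ms(scan_rate_hz: float) -> float:
        """Time for one full scan of the touch sensor, in milliseconds."""
        return 1000.0 / scan_rate_hz

    for rate in (60, 100, 120):
        print(f"{rate:>3} Hz scan rate -> {scan_period_ms(rate):.1f} ms per scan")
    # 60 Hz -> 16.7 ms, 100 Hz -> 10.0 ms, 120 Hz -> ~8.3 ms; a 6 ms scan
    # period would imply roughly 167 Hz.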


Available configurations:
65-inch Interactive Whiteboard
55-inch Interactive Digital Signage
40-inch Interactive Tabletop
40-inch Interactive Digital Signage

Email interview with Nextbigfuture

So how is this superior to Microsoft’s Surface table and other touch devices?

Our touch modules deliver better speed and accuracy than you would get from your phone or tablet, but at sizes up to 100”. So we enable products that encourage creativity, collaboration and interactivity on a whole new level.

What is the cost relative to the competing touch devices?

Thanks to our joint venture with Foxconn, we are able to mass produce touch modules very cost competitively and so for our customers, this translates into the ability to deliver unique high performance products that are affordable. We have the industry’s first vertically-integrated supply chain for large format touch modules. This means we have full control over materials, costs, processes, and quality.

How does the increased speed and accuracy translate into applications that are not possible with lower resolution or slower response?

Having faster response and better performance in large touch devices, compared to the less responsive IR technology on the market today, means a totally different experience for the users. This heightened experience will open up increased demand for large touch devices in general, and we will see them become much more commonly used throughout the retail, hospitality, and food and beverage industries.

Customers’ preference for touch interfaces will lead to the integration of touch functionality in common home and work devices and appliances as well. Consumer applications such as interactive tabletops, mirrors, smart home, and IoT (Internet of Things) devices are not only possible, but also starting to gain traction in product development circles. Imagine a coffee table that displays a digital collage of family photos that can be resized and reorganized with the swipe of a finger.

Are there new video games?

The size and true multi-touch nature of our technology means that several players can interact and play simultaneously, which brings a new dimension of interactivity to playing all the old favourites. In terms of new multi-touch applications specifically developed for gaming there is a lot of opportunity for developers to create more dynamic software, and in the coming year we expect to see more advancements.

Walmart has 50" TVs for $339. How much would a touch enabled version be?

The cost of modules sold to our customers would depend on a number of factors – including volume, level of customization, etc. and the price of a finished device would depend on the pricing strategy of that particular customer. We are incredibly cost competitive which enables our customers to deliver products which are high performance yet very affordable. With this affordability comes the opportunity for large touch devices to open up new markets and product categories so we are excited about what this means for the consumer market and expect to see new, novel touch devices emerging for the home.

South Korea starts KF-X project to develop indigenous next-generation stealth fighter

South Korea officially kicked off its ambitious KF-X project to develop indigenous next-generation fighter jets to defend its airspace within the next decade.

Despite setbacks last year in acquiring some key technologies for the fighters from the United States, the Defense Acquisition Program Administration (DAPA) unveiled a detailed timeline for the project in cooperation with Korea Aerospace Industries (KAI), the local aircraft manufacturer. KF-X stands for Korean Fighter Experimental.

“The KF-X project will take a leading role in the development of our aviation industry,” Chang Myoung-jin, head of the DAPA, said Thursday. “As requested by our Air Force, we will make all assurances that a fighter jet with superb capabilities can be deployed in a timely manner and develop a locally-made fighter jet that all Koreans can be proud of.”

In November, Indonesia’s government signed on to the project, agreeing to pay 20 percent of the total cost of development. It will participate in some designing and receive access to some technologies and a prototype.



The Korean government allocated some 8.5 trillion won ($6.67 billion) to develop indigenous mid-level fighter jets to replace the Air Force’s antiquated F-4 and F-5 aircraft. Another 9.6 trillion won is earmarked for the production of the 4.5th generation fighters, which are expected to outperform the KF-16-class fighters, bringing the total budget for the project to 18.1 trillion won.

KAI is expected to begin work on the KF-X in 2018, finish the design by September 2019 and produce six prototype fighters by 2021, according to the DAPA. It will spend the next four years on flight tests to complete development by 2026.

Once development is finalized, 120 fighter jets are expected to be built by 2032.

In September 2014, the Korean government signed a 7.34 trillion won deal with Lockheed Martin to buy 40 F-35A jets and receive technical support for Korea’s project to build its own next-generation fighter jet.

Korea initially asked for 25 technologies from the U.S. defense contractor. The DAPA belatedly admitted in September that Washington had rejected export licenses for four core technologies pertaining to its F-35 stealth fighter jets. There still are worries over whether the United States will transfer the remaining 21 technologies in a timely manner and some doubts whether the development and building of the fighter jets will be feasible within the tight timeframe of a little over 10 years.



Nanofactory Solution to Global Climate Change: Atmospheric Carbon Capture initially at least 3 times cheaper than alternatives and later a lot cheaper

Robert Freitas wrote the multi-volume text Nanomedicine, the first book-length technical discussion of the potential medical applications of hypothetical molecular nanotechnology and medical nanorobotics. He was granted the first patent ever filed on diamond mechanosynthesis.

Robert Freitas has a new 86-page paper that describes how nanofactory molecular nanotechnology could be applied to removing carbon from the atmosphere.

Robert A. Freitas Jr., “The Nanofactory Solution to Global Climate Change: Atmospheric Carbon Capture,” IMM Report No. 45, December 2015; http://www.imm.org/Reports/rep045.pdf

The proposed carbon capture technology would be built by molecular manufacturing using first-generation nanofactories at a manufacturing cost of ~$1000/kg, and would enable the atmospheric capture of CO2 at a total lifetime cost of about $21/tonne CO2, far less expensive than the $70-$200/tonne CO2 and higher estimated for conventional atmospheric carbon capture technologies.

For an installation cost of $2.74 trillion/yr over 10 years, followed by a maintenance cost of $0.91 trillion per year, a network of direct atmospheric CO2 capture plants could be emplaced that would be powerful enough to reduce global CO2 levels by ~50 ppm per decade, easily overwhelming current anthropogenic emission rates. This is sufficient to return Earth’s atmosphere to pre-industrial carbon dioxide levels near 300 ppm within 40 years of the program launch, and thereafter to maintain the atmosphere in this condition indefinitely, eliminating one of the primary drivers of global climate change.

Lower-cost later-generation nanofactories may allow the deployment of a global system of similar capacity for an annual installation and maintenance cost of $4.04 billion per year, capturing and permanently sequestering atmospheric CO2 using marine “carbon capture islands” at a total lifetime cost of about $0.08/tonne CO2. The cost is driven so extraordinarily low because the mature nanofactory, manufacturing atomically precise product for ~$1/kg, can also manufacture a cheap source of solar energy to power the CO2 capture and sequestration process.
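
A rough back-of-the-envelope check on those figures (a sketch; the conversion of roughly 7.8 gigatonnes of CO2 per ppm of atmospheric concentration is a standard figure we are supplying, not one stated in the report):

    # Sketch: sanity-check the report's headline numbers.
    # Assumption (ours, not the report's): ~7.8 Gt CO2 corresponds to 1 ppm
    # of atmospheric CO2 (2.13 GtC per ppm x 3.67 tonnes CO2 per tonne C).
    GT_CO2_PER_PPM = 7.8

    drawdown_ppm_per_decade = 50
    capture_gt_per_year = drawdown_ppm_per_decade * GT_CO2_PER_PPM / 10
    print(f"Capture rate: ~{capture_gt_per_year:.0f} Gt CO2/yr")
    # ~39 Gt CO2/yr, comparable to total anthropogenic emissions (~36 Gt in 2015)

    install_cost = 2.74e12 * 10        # $2.74 trillion/yr for 10 years
    maintenance_years = 30             # assumed operating life; not from the report
    total_cost = install_cost + 0.91e12 * maintenance_years
    tonnes_captured = capture_gt_per_year * 1e9 * (10 + maintenance_years)
    print(f"Implied cost: ~${total_cost / tonnes_captured:.0f}/tonne CO2")
    # Lands in the tens of dollars per tonne, the same ballpark as the report's
    # $21/tonne lifetime figure and far below the $70-600/tonne conventional estimates.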

As usual, Robert Freitas is exhaustive in his analysis of any issue that he studies. He lists all of the startups looking to perform carbon capture and the studies of all known methods.

A 2011 technology assessment panel under the auspices of the American Physical Society pessimistically concluded that the cost of atmospheric CO2 capture using all currently-known chemical-based approaches would likely approach $600/tonne in any actual fielded large-scale system.

The potential value of using motorized micromachines to assist in CO2 capture has already been illustrated experimentally. Here, we propose to achieve direct atmospheric carbon capture using a new technique based on atomically precise nanomachines called “molecular filters” that maximize the efficiency of molecular capture and transport across barrier membranes. Molecular filters can be fabricated in commercially useful quantities using molecular manufacturing methods such as nanofactories.

Such molecular pumps, each focused on a single binding pocket, generally operate in a four-phase sequence: (1) recognition (and binding) by the transporter of the target molecule from among the molecules presented to the pump in the source fluid; (2) translocation of the target molecule through a wall, into the interior of the transporter mechanism; (3) release of the molecule by the transporter mechanism; and (4) return of the transporter to its original condition, outside the wall, so that it is ready to accept another target molecule. Molecular transporters that rely on protein conformational changes are ubiquitous in biological systems.
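
For readers who think in code, the four-phase cycle maps onto a trivial state machine. This is an illustrative sketch only; the names and the selectivity test are invented here, not taken from Freitas's design:

    # Sketch: the four-phase molecular-transporter cycle as a state machine.
    from enum import Enum, auto

    class Phase(Enum):
        RECOGNITION = auto()    # (1) bind the target molecule from the source fluid
        TRANSLOCATION = auto()  # (2) carry it through the barrier wall
        RELEASE = auto()        # (3) release it on the far side
        RESET = auto()          # (4) return to accept another molecule

    def pump_cycle(molecule: str, target: str = "CO2") -> bool:
        """Run one cycle; return True if the molecule was transported."""
        if molecule != target:  # phase 1: selective recognition rejects non-targets
            return False
        for phase in (Phase.TRANSLOCATION, Phase.RELEASE, Phase.RESET):
            pass                # phases 2-4 always succeed in this idealized sketch
        return True

    print(sum(pump_cycle(m) for m in ["N2", "CO2", "O2", "CO2"]))  # -> 2 transported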


Commander in charge of US nuclear forces calls for modernization and buildup of US nuclear forces to counter strengthening China and Russia

Rick Fisher, a China military analyst at the International Assessment and Strategy Center, said China appears to be seeking to “sprint to parity” with the United States in warhead numbers along with growing space warfare capabilities. Fisher feels this poses “a much greater danger to U.S. strategic forces and should prompt a build up of U.S. nuclear forces.”

Admiral Cecil D. Haney, the commander in charge of US nuclear forces, says a concern of US Strategic Command is China’s re-engineering of its long-range missile to carry multiple nuclear warheads.

U.S. intelligence agencies detected the test of a new DF-41 intercontinental ballistic missile on Dec. 4 with two independently targetable reentry vehicles, or MIRVs.

Russia is continuing to modernize both its conventional and strategic forces and is stressing new strategic approaches and destabilizing activities in Syria and Ukraine, while developing space weapons and conducting cyber attacks, Haney said.

North Korea continues to threaten the Korean Peninsula and the Northeast Asia region with strategic advancements, including claims of “miniaturized” nuclear warheads and recent claims of a successful hydrogen bomb test, the four-star admiral said.

Pyongyang also is developing road-mobile and submarine-launched ballistic missile technologies, he added.

To meet the challenges, Haney said U.S. nuclear forces need to be modernized with new missiles, submarines, and bombers.

“Without timely investment, we risk degrading the deterring and the stabilizing effect of a strong and credible nuclear deterrent force,” he said.

Haney also warned about the growing threat of space warfare capabilities.



USA estimates China will deploy a hypersonic glide weapon by 2020 and a powered hypersonic vehicle by 2025

China conducted six successful tests of a new high-speed hypersonic glide vehicle, the most recent in November, and also recently tested an anti-satellite missile, the commander of the U.S. Strategic Command said Friday.

Adm. Cecil D. Haney, the commander in charge of nuclear forces, said the tests are part of a worrying military buildup by China, which also includes China’s aggressive activities in the South China Sea.

“China continues to make significant military investments in their nuclear and conventional capabilities, with their stated goal being that of defending Chinese sovereignty,” Haney said during a speech to the Center for Strategic and International Studies.

The congressional U.S.-China Economic and Security Review Commission stated in its latest annual report that the hypersonic glide vehicle program is “progressing rapidly” and the weapon could be deployed by 2020.

China also is building a powered version of the high-speed vehicle that could be fielded by 2025.

“The very high speeds of these weapons, combined with their maneuverability and ability to travel at lower, radar-evading altitudes, would make them far less vulnerable than existing missiles to current missile defenses,” the commission stated.



LPP Fusion explains why tungsten and beryllium electrodes work for their dense plasma focus fusion reactor

The LPP Fusion research team is still working with the tungsten electrodes, but they know the beryllium electrodes will be needed soon. Tungsten is being used now because of its extreme resistance to the heat generated by runaway electrons during the early stages of FF-1’s pulse. They are combining that thermal resistance with a technique called “pre-ionization” to prevent vaporization of the electrodes and the resulting impurities in the plasma. This, they expect, will greatly increase the density of the tiny plasmoid the device produces and thus the fusion energy yield.

If LPP Fusion is successful, they could reduce the cost of energy by a factor of 10 to 20.

Two cylinders of nearly pure beryllium metal were delivered to LPPFusion’s Middlesex, NJ lab on January 14. The cylinders, together weighing 35 kg, are to be machined over the next five months into two anodes and a cathode for experiments in the second half of 2016. They were fabricated from 97.8% pure beryllium at the Ulba Metallurgical Plant in Kazakhstan. The two anodes will be machined in California and the cathode in Massachusetts, after acceptance testing for the purity and strength guaranteed by Ulba.

The Beryllium cylinders


As the plasma density increases, so will the intensity of the x-ray pulse emitted by the plasmoid. In tungsten, the x-rays are absorbed in the outermost micron of the metal. When they become strong enough, the x-rays will start to vaporize even tungsten. Before that point is reached, LPP Fusion wants to switch to beryllium. Beryllium, a far lighter metal with only four electrons per atom, is almost transparent to x-rays. What x-rays beryllium does absorb will be spread out harmlessly throughout the bulk of the electrodes.


Tungsten electrode pictures



LPP Fusion did not want to use beryllium first because they need to test and perfect the pre-ionization technique on the tougher tungsten. Beryllium is much less resistant to the runaway electrons than tungsten. Once they get the pre-ionization to work well, they will test it further using a silver-coated electrode to simulate the less thermally resistant beryllium. Then they can switch to beryllium.

They have to be sure that the beryllium will not significantly erode, because vaporized beryllium could recondense as beryllium dust. While bulk beryllium is harmless, beryllium dust is dangerous. If inhaled in air at above 0.1 parts per billion, it can set off an immune reaction that leads to serious or fatal lung disease. By comparison, the decaborane fuel LPP Fusion will be using later this year is harmful only at concentrations in air of 50 ppb, 500 times the threshold for beryllium dust. As a result, the beryllium is being machined at specialized facilities with high levels of safety protections. Because of this hazard, LPPFusion will have to use special precautions, including a sealed glove box, if they do anything to the electrodes that could create dust. However, with tests to ensure no dust is produced, careful monitoring and careful safety procedures, they will be able to work safely around the beryllium.

Since only 400 tons of beryllium is currently produced world-wide, some of the LPP Fusion newsletter readers have asked if supplies will be adequate for production of millions of focus fusion generators. In fact, beryllium is as abundant in the Earth’s crust as lead, whose global production is 4 million tons per year. Beryllium production at the moment is limited by low demand, and strict regulations relating to its use in fission reactors and nuclear weapons. As focus fusion production gears up, it will be technically easy to ramp beryllium production up to the roughly 40,000 tons per year needed. Changes to regulations should also be possible, as focus fusion generators would make fission power obsolete and could lead to the cessation of uranium production, firmly closing the door to more nuclear weapons and obviating the need for controlling beryllium.
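
The supply arithmetic is easy to check under one assumption of ours (not LPPFusion's): that a production generator would use roughly the same ~35 kg of beryllium as the three research electrodes just delivered:

    # Sketch: does 40,000 tonnes/yr of beryllium support millions of generators?
    # Assumption (ours): ~35 kg of beryllium per generator, matching the
    # combined weight of the delivered cylinders.
    be_per_generator_kg = 35
    annual_be_tonnes = 40_000
    generators_per_year = annual_be_tonnes * 1000 / be_per_generator_kg
    print(f"~{generators_per_year:,.0f} generators per year")  # ~1.1 million/yr,
    # versus ~400 tonnes/yr produced today (limited by demand, not ore scarcity)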


Donations can be made at this link to support the effort to develop commercial nuclear fusion via LPP Fusion.

The key to LPP Fusion’s progress is taking shots with their machine, Focus Fusion-1 (FF-1 for short), which gives them the experimental data to test their theories and demonstrate progress towards net energy. They estimate that to demonstrate net energy they have to do 1,500 more shots. So far they have carried out 1,900 shots. Each shot costs about $900.

For $75 you can fund the charging of one of their 12 capacitors for one shot; for $150, two capacitors; and so on up to $900 for a full shot. Everyone who funds a given shot will be recognized in a list kept permanently on the website.
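
The funding arithmetic as described works out as follows (a sketch of the newsletter's own numbers):

    # Sketch: LPP Fusion's per-shot and remaining-campaign costs.
    capacitors = 12
    cost_per_capacitor = 75                 # $75 charges one capacitor for one shot
    cost_per_shot = capacitors * cost_per_capacitor
    shots_remaining = 1_500
    print(f"${cost_per_shot} per shot")     # $900, matching the quoted figure
    print(f"${cost_per_shot * shots_remaining:,} to reach net energy")  # ~$1.35 million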


January 22, 2016

Google Alphabet has four times DARPA's research budget and larger moonshot ambitions than DARPA

Google's revenue is continuing to grow at about 20 percent a year and is on pace to generate approximately $60 billion this year.

Google will have $20-24 billion in operating income in 2016, and $12-14 billion will be spent on research and development. These amounts are still growing at about 20% per year.
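
For scale (a sketch; the ~$3 billion figure for DARPA's annual budget is approximate and comes from outside this article), compounding the R&D line at the stated 20% per year:

    # Sketch: compare Google R&D spending to DARPA's budget and project forward.
    # Assumption (not from the article): DARPA's FY2016 budget was roughly $3B.
    darpa_budget_b = 3.0
    google_rd_b = 13.0                      # midpoint of the $12-14B range above
    print(f"Ratio today: ~{google_rd_b / darpa_budget_b:.1f}x")   # ~4.3x
    for year in range(2016, 2021):
        print(f"{year}: ~${google_rd_b:.0f}B")
        google_rd_b *= 1.20                 # stated ~20% annual growth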

The NY Times provides some insight into Google by focusing on founder and ex-CEO Larry Page.

Larry Page’s managerial modus operandi is to take new technologies or product ideas and generalize them to as many areas as possible. Why can’t Google Now, Google’s predictive search tool, be used to predict everything about a person’s life? Why create a portal to shop for insurance when you can create a portal to shop for every product in the world?

Inside Google, Mr. Page is known for asking a lot of questions about how people do their jobs and challenging their assumptions about why things are as they are. In an interview at the Fortune Global Forum last year, Mr. Page said he enjoyed talking to people who ran the company’s data centers.

“I ask them, like, ‘How does the transformer work?’ ‘How does the power come in?’ ‘What do we pay for that?’” he said. “And I’m thinking about it kind of both as an entrepreneur and as a business person. And I’m thinking ‘What are those opportunities?’”

Another question he likes to ask: “Why can’t this be bigger?”

Larry Page is dedicated to “moonshots” like interplanetary travel, or offering employees time and money to pursue new projects of their own. By breaking Google into Alphabet, Mr. Page is hoping to make it a more welcoming home for employees to build new businesses, as well as for potential acquisition targets.



What Is The True Tax Rate? And What Is The Best Single Tax or Pair of Taxes to Boost Growth Rates?


A guest article by Joseph Friedlander


Article summary:

  • The true all-inclusive tax rate on GDP may be harder to calculate than it appears, and higher relative to the 'real' economy outside the world of government-funded NGOs and contractors.
  • Readers are invited to give their take on the question: should government expenditures be included in GDP or not? (If not, the true tax rate is much higher, because GDP is by definition smaller.)
  • Readers are also invited to pick a single tax or pair of taxes to substitute for the current nearly untrackable complex of taxes in an ideal future.


Although Next Big Future focuses on future tech, part of the story on the rate of deployment of those future technologies depends on how much money governments have to spend (tax revenues, defense budgets) and how much private parties (industry, entrepreneurs, you) have to spend (after-tax disposable income).

So on occasion there is an economic discussion as to what kind of future might result from what kind of governmental change. In this case, tax structures.

The net amount of tax paid of course varies with the individual, because circumstances differ. But the all-inclusive tax burden as a share of GDP per country should be a number that is easy to calculate, and it is not.


There are many issues involved in such a calculation, including the fundamental issue of 'what is government vs. the private sector?' I personally have over the years arrived at the conclusion that if a given allegedly private firm or foundation has above, say, 10 or 20 percent of its cumulative business with any branch of government (or with any other allegedly private firm or foundation that effectively is a branch of the government), then it too is effectively a branch of the government, to the percentage of its involvement.


What this means, if my hypothesis is correct, is that as government influence spreads throughout an economy, it saps the vitality of that economy by introducing what you might call sloppy habits or a warped perspective on what are acceptable products.

(For example, in the aerospace industry they keep the government and civilian departments separate, because the cost-plus mentality leads to massive losses when you are selling on the open market. Other examples would be education systems in which the children comply with all government and union directives but learn no marketable skill; or what the late Neil Craig called 'fakecharities', i.e. organizations heavily or entirely funded by government, which then proceed to sit on each other's boards, grant each other grants with government money they lobbied for using other government funds, and for window dressing spend 10-20% of their efforts on their alleged reason for being.)

http://www.thinkscotland.org/todays-thinking/articles.html?read_full=11546&article=Www.thinkscotland.org

Neil Craig cites this report as evidence:
http://www.iea.org.uk/sites/default/files/publications/files/DP_Sock%20Puppets_redesigned.pdf

Sock Puppets: How the government lobbies itself and why


... when government funds the lobbying of itself, it is subverting democracy and debasing the concept of charity. It is also an unnecessary and wasteful use of taxpayers’ money. By skewing the public debate and political process in this way, genuine civil society is being cold-shouldered.
Sock Puppets: How the government lobbies itself and why, by Christopher Snowdon, shows that:
In the last 15 years, state funding of charities in Britain has increased significantly. 27,000 charities are now dependent on the government for more than 75 per cent of their income and the ‘voluntary sector’ receives more money from the state than it receives in voluntary donations.
 
State funding weakens the independence of charities, making them less inclined to criticise government policy. This can create a ‘sock puppet’ version of civil society giving the illusion of grassroots support for new legislation. These state-funded activists engage in direct lobbying (of politicians) and indirect lobbying (of the public) using taxpayers’ money, thereby blurring the distinction between public and private action.
 
State-funded charities and NGOs usually campaign for causes which do not enjoy widespread support amongst the general public (e.g. foreign aid, temperance, identity politics). They typically lobby for bigger government, higher taxes, greater regulation and the creation of new agencies to oversee and enforce new laws. In many cases, they call for increased funding for themselves and their associated departments.
 
Urgent action should be taken, including banning government departments from using taxpayers’ money to engage in advertising campaigns, the abolition of unrestricted grants to charities, and the creation of a new category of non-profit organisation for organisations which receive substantial funds from statutory sources.

  
Friedlander here again. Why should having one huge customer distort the entire economy? It sounds confusing, but think: in a hypothetically totally private market, if you had a lot of small customers and one huge customer, you would be very focused on keeping that huge customer, because his business could easily mean the difference between profit and loss in any given year. How much more so when you have to comply with that customer's regulations (mil-spec or otherwise) and he can in extremis throw you into jail for violating any of his rules. A sufficiently big mass on a playing field will cause a sinkhole in the playing field. A yet bigger mass will cause a runaway singularity
https://en.wikipedia.org/wiki/Hyperinflation_in_Zimbabwe
in that playing field, and it's collapsar time. https://en.wikipedia.org/wiki/Collapsar
The real economy vanishes and people desperately tread water to survive.


Why does this issue of what is private and what is public matter in relation to the true tax rate relative to GDP? Simply because of a recursion problem involving the question,

What is the true size of government?

and the other question,

 Should government expenditures be included in the true tax rate?


Because, obviously, if you exclude government + contractors + NGOs + everyone on the public dime from GDP calculations (at least to the percentage of funding they get from the government), the GDP is a lot smaller.

And if you recalculate the tax rate to reflect that change, you get a far higher number on the truly productive part of society. (Note that my intent is not to insult anybody working for the government, merely to point out that if you don't have an actual productive part of the economy that manufactures office supplies and building materials and other things, and that can also pay NET taxes to the government, you will in the end not be able to build government offices or even run them. The lies run out when the real money does; see the above link about Zimbabwe.)
Should government expenditures be included in GDP or not? (If not, the true tax rate is much higher, because GDP is by definition smaller.)
https://mises.org/library/government-bloat-not-growth-real-gross-domestic-private-product-2000%E2%80%932011
  In the 1930s and 1940s, when the modern system of national income and product accounts (NIPA) was being developed, the scope of national product was a hotly debated issue. No issue stirred more debate than the question, Should government product be included in gross product? Simon Kuznets (Nobel laureate in economic sciences, 1971), the most important American contributor to the development of the accounts, had major reservations about including all government purchases in national product. Over the years, others have elaborated on these reasons and adduced others.

Why should government product be excluded? First, the government’s activities may be viewed as giving rise to intermediate, rather than final, products, even if the government provides such valuable services as enforcement of private property rights and settlement of disputes. Second, because most government services are not sold in markets, they have no market-determined prices to be used in calculating their total value to those who benefit from them. Third, because many government services arise from political, rather than economic, motives and institutions, some of them may have little or no value. Indeed, some commentators—including the present writer (Robert Higgs)—ultimately went so far as to assert that some government services have negative value: given a choice, the people victimized by these “services” would be willing to pay to be rid of them.

When the government attained massive proportions during World War II, this debate was set aside for the duration of the war, and the accounts were put into a form that best accommodated the government’s attempt to plan and control the economy for the primary purpose of winning the war. This situation of course dictated that the government’s spending, which grew to constitute almost half of the official GDP during the peak years of the war, be included in GDP, and the War Production Board, the Commerce Department, and other government agencies involved in calculating the NIPA recruited a large corps of clerks, accountants, economists, and others to carry out the work.

After the war, the Commerce Department, which carried forward the national accounting to which it had contributed during the war (since 1972 within its Bureau of Economic Analysis [BEA]), naturally preferred to continue the use of its favored system, which treats all government spending for final goods and services as part of GDP. Economists such as Kuznets, who did not favor this treatment, attempted for a while to continue their work along their own, different lines, but none of them could compete with the enormous, well-funded statistical organization the government possessed, and eventually almost all of them gave up and accepted the official NIPA. ... The issues that had been disputed at length in the 1930s and 1940s did not disappear. They were simply disregarded as if they had been resolved, even though they had not been resolved intellectually, but simply swept under the ... rug. In particular, the inclusion of government spending in GDP remained extremely problematic. ... To resolve this question, I have computed what I call gross domestic private product (GDPP), which is simply the standard real GDP minus the government purchases part of it.
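
Higgs's adjustment is mechanically simple; what it does to the measured tax burden is the point. A sketch with round illustrative numbers (not official statistics):

    # Sketch: excluding government purchases from GDP raises the measured burden.
    # All numbers are round illustrations, not official statistics.
    gdp = 100.0             # conventional GDP
    gov_purchases = 35.0    # government purchases counted inside that GDP
    taxes = 35.0            # total taxes collected

    gdpp = gdp - gov_purchases   # Higgs's gross domestic private product
    print(f"Tax burden on GDP:  {taxes / gdp:.0%}")    # 35%
    print(f"Tax burden on GDPP: {taxes / gdpp:.0%}")   # ~54% of the private economy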


If government expenditures were not purely consumptive but net productive, they could in theory fill a vital role in the economy and help boost economic growth, but that is a different post, one involving production of capital goods rather than consumption of vast resources providing mostly political net value.

One anecdotal example on the local-government level I can offer is that often the local businessman of the year will be honored by the same government that is his biggest customer. His economic success is real but not indigenous, in the sense that who knows whether the private market would have boosted his business the way that big government contracts did. Examples would be an office-supplies contractor who sells to many government agencies, a car dealer who sells to the same, and so forth. The award banquet is held at a restaurant, but none of those people would have gone to that restaurant on their own dime. I argue that if that restaurant gets a large part of its business from the government, it too not only effectively represents government expenditure to that extent, but is in effect a branch of the government, subordinate to that percentage. In relation to this, consider Tax Freedom Day in the USA:

https://en.wikipedia.org/wiki/Tax_Freedom_Day
Every dollar that is officially considered income by the government is counted, and every payment to the government that is officially considered a tax is counted. Taxes at all levels of government—local, state and federal—are included.


Note, however, that if the true tax rate is higher because a lot of unofficial taxes are 'off the books', and the true payments are higher because of fees and regulations, AND the taxpaying sector is smaller, that would be a huge discrepancy from the official statistics.

If the 75-80% true-tax-rate people are right, Tax Freedom Day would not be in the official April-May range, which implies a 33-40% rate, but in the financial chill of September or October.
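
Tax Freedom Day is just the tax rate mapped onto the calendar, so the claim is easy to check (a sketch):

    # Sketch: map an all-inclusive tax rate onto a Tax Freedom Day.
    from datetime import date, timedelta

    def tax_freedom_day(rate: float, year: int = 2016) -> date:
        """Day of the year by which the given fraction of income has gone to tax."""
        return date(year, 1, 1) + timedelta(days=round(rate * 365))

    for rate in (0.33, 0.40, 0.75, 0.80):
        print(f"{rate:.0%} -> {tax_freedom_day(rate):%B %d}")
    # 33% -> end of April, 40% -> late May; 75-80% -> early-to-mid October,
    # matching the "financial chill of September or October" claim above.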

Look up the official statistics here for where you live: https://en.wikipedia.org/wiki/Tax_Freedom_Day#United_States

https://en.wikipedia.org/wiki/Tax_Freedom_Day#Tax_Freedom_Day_around_the_world

Recall also that, for example, a 75% tax rate as paid by a corporation or person (if you count all tax inputs at any level, including customs on the raw materials, sales taxes, income taxes, and so on) does not show up as 75% of GDP, because income is just a subset of GDP.

There is an additional subtle point. Unless government expenditures are TRULY productive, they are worse than equivalent malinvestments from the private sector (yet far easier to make). Why? Because a fool and his money are soon parted: he goes into bankruptcy and is removed from making further bad financial decisions. A government and its money spent foolishly just leads to the sunk-cost fallacy, doubling down, and a runaway cycle (not necessarily rapid; it can take generations) of tax increases and private-sector shrinkage. Also, as detailed here,
https://mises.org/library/how-reducing-gdp-increases-economic-growth
 assuming government wastes at least 50 percent of the resources expended, the net benefit to consumers of government production would be zero.


Now we list some anecdotes from various bloggers.

This blogger, quoted below, is of the opinion that the future has already been spent; although he does not explicitly say so, he implies that stronger taxes and regulations sapped the potential growth (to say nothing of fostering malinvestment):

http://captaincapitalism.blogspot.com/2012/04/what-could-have-been.html


"What would our GDP or "income per capita" be if we had continued to grow at 4%?"
...
Had we continued our traditional, old school, EVIL and OPPRESSIVE 1950's economic growth, our GDP would NOT be the paltry $14 trillion it is today (in 2005 numbers), it would be closer to $26 trillion.


We take the roughly 310 million Americans in the country today and that translates into a real GDP per capita of about $84,500.  However, that figure is in 2005 dollars.  I was surprised to find out based on the CPI how much inflation has occurred since then (despite what the government tells us) and apparently the US dollar has inflated by about 18%.  You adjust for that and what do you get?

$99,832.

Did I say $100,000 as just a guess?

...

All these economic problems we have with debt and social security and economic growth and student loans, etc. etc. - All these problems would be washed away if we had maintained our previous economy growth rate.
... A country that is no longer growing or prospering, but is stagnant and on the derivative value of 0, entering into decay... 
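
The arithmetic in the quote roughly checks out (a sketch reproducing it; the 18% CPI adjustment and the $26 trillion counterfactual are the blogger's figures, not ours):

    # Sketch: reproduce the quoted back-of-envelope numbers.
    import math

    gdp_counterfactual = 26e12    # $26 trillion in 2005 dollars, per the quote
    population = 310e6
    per_capita_2005 = gdp_counterfactual / population
    print(f"Per capita (2005$): ~${per_capita_2005:,.0f}")          # ~$83,900
    print(f"Inflation-adjusted: ~${per_capita_2005 * 1.18:,.0f}")   # ~$99,000,
    # close to the quoted $99,832 (the gap is rounding in the quoted inputs)

    # How long does a 4% vs 2% growth gap take to turn $14T into $26T?
    years = math.log(26 / 14) / math.log(1.04 / 1.02)
    print(f"~{years:.0f} years of a two-point growth gap")          # ~32 years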


Another writer on the internet, FH Cofer of Georgia, wrote that a multi-million-dollar government study identified that 23% of whatever you buy is tax (which disagrees with Scientific American, which computed it to be 34% of the GDNP for the year 2000).

(Cofer, by the way, is the author of the CLAMPIT proposal for a single sales tax in the USA. More on this below.)

It may also be that the studies mentioned by Cofer indicate the tax portion of the PRICE you pay; but to get the after-tax money to pay that price in the first place, you had to pay tax. This gets very recursive, for by their very design taxes are considered much better when too hidden for the voters to get worked up about. For example, before 1943 and tax withholding, the practice was that you wrote a check to the tax authorities. Certainly that was the case in the 1920s. That was the biggest check most people wrote in a year, and there was usually a high tide of public resentment about then. By contrast, the ideal system keeps money flowing in and the people calm.

Other estimates of the true tax rate are rather higher. One European researcher added to it the costs of interest, which are themselves dictated by government fiscal policy.

According to Margrit Kennedy, a German researcher who has studied this issue extensively, interest now composes 40% of the cost of everything we buy. http://www.webofdebt.com/articles/rights.php

“Few realize that this (inflation - JF) is just another form of taxation through which governments manage to overcome the worst problems of an increasing interest burden.” --Margrit Kennedy
http://www.converge.org.nz/evcnz/resources/money.pdf

That sounds quite high to me, but I don't have to accept her whole work to at least be alerted that there might be something of interest in it, such as inflation itself constituting a hidden tax.
But where does inflation come from, if not from governmental powers directly exercised (https://en.wikipedia.org/wiki/Deficit_spending) or delegated (https://en.wikipedia.org/wiki/Monetary_inflation)?

The highest estimate I have ever seen of the true tax rate comes from L. Neil Smith's 7/8ths estimate (from a now-defunct web page, http://down-with-power.com/secede.html):

“Taxes consume half of what we earn. Taxes double the price of everything we buy. And regulations double the price all over again. We all live on one eighth of what we earn.”

(87.5%! Of which, elsewhere, he explained that 75% was actual taxes piling up, disguised and undisguised, and the rest hidden costs of compliance, which most studies don't count as a tax. But it is certainly an expense imposed by the unfunded mandate of the tax simply existing.)


Mike Darwin, on his Chronopause blog, estimates true tax rates are up to 70%:


I was alerted to the work of the economic analyst (and economist) Michael Mandel in the form of his seminal article, “Why the Jobs Crisis is Actually an Innovation Crisis”.


 I had to see, interface with, and become familiar with the manufacturing processes of businesses of many kinds. I toured Dow Chemical’s Zionsville research and production facilities, and spent countless hours learning to do tissue culture there when I was 12-13 years old, in 1966-7.  So, Mandel and Cowen are clearly wrong about the increase in productivity being slowed to the extent they assert. They are, however, absolutely right that the average man on the receiving end of this technological bonanza has been seeing increasingly flat gains in personal wealth, and now is seeing net losses. That is real. But what they are failing to see, and the question they are failing to ask is, “why did this happen and where did all the wealth from that increased productivity go?” Somebody undoubtedly got richer!

The answer is in my 2008 CCM-L post: this wealth was stolen by hidden taxation and largely hidden (and unappreciated) inflation, coupled with irrational and unsustainable expenditures in areas such as health care and defense. My rough guess is that, conservatively, 60 to 70% (more now) of each individual’s productivity is taken from him before he ever gets his paycheck. The decaying, and now failing, infrastructure in the US is proof positive that this wealth isn’t going to fund the basic and ‘good’ things government can do – such as building and maintaining roads, dams, utilities, land reserves – and maintaining basic public health. INSTEAD, IT IS BEING STOLEN AND WASTED.

...

historically nation-states (and empires) collapse when the taxation burden on their populace exceeds ~30% of the GDP, or its equivalent. So, it would seem simple enough to look at the taxation rate and come up with a number as to how close to that historical margin we are at any given time, assuming, of course, that this number still applies, because in the past, peoples’ incomes were just barely enough, or a little more, than was required to keep them alive, or in a modest (very modest by today’s standards) zone of comfort...So, 30% taxation on total earned income almost certainly does not equal the breaking point for parasitic load today, because that breaking point probably represents the fraction of earned (and available) income you have to take from a population before they start to be acutely uncomfortable, begin to be unable to buy necessities AND become fearful about their prospects for long term stability, and even for their personal survival. The huge absolute growth in wealth has thus destroyed the utility of this at least 2,000-year-old indicator for predicting how much theft is intolerable to the continued functioning of the socioeconomic system.

From a site called Jim's Blog, a commentator named Alrenous offers evidence that Mike Darwin is right.

Alrenous says:
2012 May 1 at 6:25 pm
Oh hell, I’ve been an idiot.

Official tax rates are ~35% of GDP. But what is GDP? It includes the government paying the money out again. That’s 2/3rds right there.


So even official statistics confirm that prices are dominated by taxes, often upwards of 70% of any outlay ends up in government hands.


I was initially skeptical of my 4:1 estimate because it was so high, but every analysis I’ve seen confirms it. This is the first of two I’ve seen confirming it just this week.


http://blog.jim.com/economics/the-laffer-curve.html

Jim, the host of the blog, gives a specific example of a 75% tax rate (not to be confused with a whole-GDP rate).

Unfortunately the U.S. corporate tax rate is 39.2%. Now you have $60,800,000 in net income. If you are in the state of California, they want 8.84 percent corporate tax. So now you have $51,960,000. OK, this gets split between the various investors, the founding employees, and you, all of whom are probably paying at the maximum marginal tax rate. So after it is paid out, the feds want 35%, and California wants 10%, so now that is down to $28,000,000 or so. And then you spend it, and there is sales tax, so now that is down to $26,000,000 or so.

Thus, three dollars for the taxman, one dollar for you.


Of course, should your plan fail, and everyone lose their money, that is your problem, not the government’s.
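
Jim's cascade is easy to reproduce (a sketch; the $100 million starting figure for pre-tax corporate profit is implied by his numbers rather than stated):

    # Sketch: reproduce the quoted tax cascade on $100M of corporate profit.
    profit = 100e6
    profit -= 100e6 * 0.392      # corporate income tax          -> $60.8M
    profit -= 100e6 * 0.0884     # California corporate tax      -> ~$51.96M
    profit *= (1 - 0.35 - 0.10)  # personal federal + state tax  -> ~$28.6M
    profit *= (1 - 0.08)         # ~8% sales tax when spent      -> ~$26.3M
    print(f"Kept: ~${profit / 1e6:.0f}M of $100M")
    # ~$26M: "three dollars for the taxman, one dollar for you"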



The late Neil Craig (http://neilsindex.blogspot.co.uk/) also wrote:


...the current state parasitism where tax takes about 50% of the economy and regulatory controls (nuclear regulations, housing restrictions, environmentalism etc) destroy at least 50% of the possible economy reducing what people really get to 25% of the optimum. http://a-place-to-stand.blogspot.co.uk/2012/01/couple-of-interesting-articles-at-next.html

Next question: is the present system of taxes ideal for stimulating productive economic activity? I would argue not.

Here is a list of 65 classes of taxes (by no means comprehensive) compiled by Michael Snyder of http://theeconomiccollapseblog.com/:
http://theeconomiccollapseblog.com/archives/65-ways-that-everything-that-you-think-that-you-own-is-being-systematically-taken-away-from-you

The tone may be a bit distraught, but the general point should be obvious. Some people in the government may feel that ordinary people are not overwhelmed, and do not feel preyed upon, when there are too many TYPES of taxes to keep track of.

But many people in fact do feel that way, and it definitely can affect economic productivity, if by no other mechanism than demoralization and hopelessness.
Hopelessness is the opposite of optimism, so it is no surprise that government economists looking for ways to stimulate optimism through government expenditures will accumulate data points in favor of their cause.

But will these same government economists even consider the possibility that the economic world they helped make by their previous recommendations has reduced millions of formerly productive people to a state of economic lethargy?
History argues not.

All this assumes that the tax burden is not high enough to actually take a toll on personal savings, capital accumulation and the source of startup funds.
But apparently that is happening as well, according to Charles Hugh Smith (http://www.oftwominds.com/blogdec15/entrepreneurism12-15.html), and as the same author demonstrates, mandated expenses have all the effect of a tax in depleting startup capital (http://charleshughsmith.blogspot.co.il/2015/08/our-government-destroyer-of-jobs.html).



I have been monitoring the implosion of the real economy and was searching for evidence to corroborate my suspicions when I ran across an episode of Modern Marvels (or a similar show, the link to which I cannot find) in which an office building is built at vast expense to be occupied by some LA transportation bureaucracy full of officers monitoring each other's compliance with mandates, and, I would bet, having nothing to do with road building. A microcosm of diverted wealth.

The striking thing during the episode was that a subcontractor with over a century in business went out of business while the construction was being filmed. But the government cannot go out of business. And if new business formation is made harder by the above trends, you would tend to get a runaway dynamic, a singularity of more government spending to counter the effects of the previous government spending, and a shrinking private sector more heavily taxed, which then shrinks further...

And in relation to government buildings: I cannot count the times I have noticed that a firm that must impress its clients (law firms, PR people, publicly held corporations, all of which, as we have seen above, may be at least partially part of government) will have a fancy office, while most offices of the small private sector are older and beat up. They pay the taxes for the government offices, which nearly invariably look better than theirs.

Now we come to the last question of the title.
The rules of this game are: you take the entire mess of taxes that exists at your national and local level and you pick a single (or double) tax to replace it. What is the rate you would impose?


You do not get to say, in addition to the present tax structure, 'I think we need a carbon tax, or a tax on (fill in the blank).' You get to pick one or two taxes and say why and how much.

What would be the best single tax or pair of taxes to promote vigorous economic growth, and why? Or, if you are not sure, what are the most destructive taxes you can think of?
Why this simple system, which will appear too simple to many people?

Because if you cannot know by heart how much tax you are paying it is difficult to make informed economic decisions in real time and that will certainly affect productivity.

My nominees: a single sales tax invented by Frank H Cofer of Georgia, called CLAMPIT. I would pair it with a tariff to stop a surge in imports (because foreign governments often refund VAT to exporters, and that would otherwise be a huge subsidy to foreign imports coming into the USA).

Some people are for a property tax/sales tax combo:

Robert Heinlein, as reported by Neil Craig on his blog, came up with the following tax system: a federal head tax paid by the states and a self-assessed property tax:
http://a-place-to-stand.blogspot.co.uk/2012/06/land-value-tax.html
http://a-place-to-stand.blogspot.co.uk/2010/06/best-of-all-possible-worlds.html
Taxation is low, simple - and contains a surprise. The Federal government is supported by a head tax paid by the States, and is mostly for military and foreign affairs.
This state derives most of its revenue from real estate taxes. It is a uniform rate set annually, with no property exempted, not even churches, hospitals, or schools - or roads; the best roads are toll roads. The surprise lies in this: the owner appraises his own property.
There is a sting in the tail: anyone can buy property against the owner's wishes at the appraisal the owner placed on it. The owner can hang on only by raising his appraisal at once to a figure so high that no buyer wants it, and pay three years' back taxes at his new appraisal.
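
The incentive structure here (what economists today would call a self-assessed or Harberger tax) can be sketched in a few lines; the numbers below are invented for illustration:

    # Sketch: the owner's dilemma under a self-assessed property tax.
    # Appraise low and you invite a forced sale below true value; appraise
    # high and you pay more tax. Numbers are invented for illustration.
    true_value = 500_000
    tax_rate = 0.02

    def annual_cost(appraisal: float) -> float:
        """Tax paid plus expected loss if a buyer forces a sale at the appraisal."""
        forced_sale_loss = max(true_value - appraisal, 0)  # buyer pockets the gap
        buyout_chance = 1.0 if appraisal < true_value else 0.0  # simplistic market
        return appraisal * tax_rate + buyout_chance * forced_sale_loss

    for appraisal in (300_000, 500_000, 700_000):
        print(f"appraise ${appraisal:,}: expected cost ${annual_cost(appraisal):,.0f}")
    # Under-appraising risks losing the gap to a forced sale; over-appraising
    # just raises the tax bill. Honest appraisal minimizes expected cost.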


Readers--what pair of taxes would you choose in place of the present system? And should government expenditures be included in GDP or not?  Make your comment below. 



China will make more aircraft carriers and carrier groups until they have a global blue-water force

China's first indigenous aircraft carrier is nearing completion. It is the second Chinese aircraft carrier for the People’s Liberation Army Navy (PLAN), after the refurbished Liaoning.

The new carrier is still part of a conservative engineering journey for China, which is learning and developing experience in how to design and build aircraft carriers.

China is also adopting a concerted strategy in developing a carrier battle group (CBG). They are paying close attention to how established carrier navies operate such forces. As such, while developing the carrier, efforts have long been afoot to develop a slew of other capabilities that can help constitute a full-fledged CBG. Notably, the Chinese are churning out new major surface combatants, such as the Type-052C/D Luyang II/III guided missile destroyers and Type-054A Jiangkai II frigates, which are optimized for fleet air defense and anti-submarine warfare (ASW) respectively. Even more ominous, but often overlooked, is China’s ambitious program to build more capable ocean-going fleet replenishment vessels. In recent years, new units of the Type-903 (plus the improved 903A variant) replenishment vessels have entered service. An even more capable successor, touted as the Type-901, which is said to displace some 40-45,000 tons (just slightly smaller than the new carrier itself), is at an advanced stage of construction.




DARPA making fully implantable devices able to connect with up to one million neurons for breakthrough computer-brain interfacing

A new DARPA program aims to develop an implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the human brain and the digital world. The interface would serve as a translator, converting between the electrochemical language used by neurons in the brain and the ones and zeros that constitute the language of information technology. The goal is to achieve this communications link in a biocompatible device no larger than one cubic centimeter in size, roughly the volume of two nickels stacked back to back.

The program, Neural Engineering System Design (NESD), stands to dramatically enhance research capabilities in neurotechnology and provide a foundation for new therapies.

“Today’s best brain-computer interface systems are like two supercomputers trying to talk to each other using an old 300-baud modem,” said Phillip Alvelda, the NESD program manager. “Imagine what will become possible when we upgrade our tools to really open the channel between the human brain and modern electronics.”

Among the program’s potential applications are devices that could compensate for deficits in sight or hearing by feeding digital auditory or visual information into the brain at a resolution and experiential quality far higher than is possible with current technology.

Neural interfaces currently approved for human use squeeze a tremendous amount of information through just 100 channels, with each channel aggregating signals from tens of thousands of neurons at a time. The result is noisy and imprecise. In contrast, the NESD program aims to develop systems that can communicate clearly and individually with any of up to one million neurons in a given region of the brain.

The Neural Engineering System Design (NESD) program seeks innovative research proposals to design, build, demonstrate, and validate in animal and human subjects a neural interface system capable of recording from more than one million neurons, stimulating more than one hundred thousand neurons, and performing continuous, simultaneous full-duplex (read and write) interaction with at least one thousand neurons in regions of the human sensory cortex. In addition to achieving substantial advances in scale of interface (independent channel count), proposed systems must also demonstrate simultaneous high-precision in neural activity detection, transduction, and encoding, with single-neuron spike-train precision for each independent channel.
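
To get a feel for the scale of that goal, here is a rough bandwidth estimate (the per-channel sampling rate and bit depth are our assumptions; DARPA's announcement specifies neither):

    # Sketch: raw data rate implied by the NESD recording goal.
    channels = 1_000_000        # record from up to one million neurons
    sample_rate_hz = 1_000      # assumed effective rate for spike-train precision
    bits_per_sample = 10        # assumed ADC resolution
    bits_per_second = channels * sample_rate_hz * bits_per_sample
    print(f"~{bits_per_second / 1e9:.0f} Gbit/s raw")
    # ~10 Gbit/s before any on-implant spike detection or compression,
    # versus today's ~100-channel interfaces -- all within one cubic centimeter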


January 21, 2016

DOE funds Southern Company molten chloride nuclear reactor and X-energy fuel pellet reactor

The U.S. Department of Energy (DOE) announced the selection of two companies, X-energy and Southern Company, to further develop advanced nuclear reactor designs. These awards, with a multi-year cost share of up to $80 million for both companies, will support work to address key technical challenges in the design, construction, and operation of next-generation nuclear reactors.

At $40 million each in matching funds over the next five years, the grants will go to X-energy, a little-known Maryland-based startup that is developing a new version of a pebble-bed reactor, and to Southern Company, the Atlanta-based utility that is working with TerraPower on molten-salt reactors.

It’s a promising development for the advanced nuclear industry, which has struggled to find funding and regulatory approval. Much of the money raised so far has gone to just two companies, Tri Alpha Energy, which is working on fusion reactors, and TerraPower.

X-energy was founded in 2009 by Kam Ghaffarian, who previously founded Stinger Ghaffarian Technologies, a major contractor for NASA. Based in Greenbelt, Maryland, the company is working on a high-temperature reactor that is cooled by gas, rather than water, and uses small fuel pellets inside a graphite cylinder, rather than solid fuel rods. The design, says Pete Passano, the company’s vice president of fuel production, makes the reactor immune to meltdowns. Encased in layers of carbon and ceramics, the individual fuel pebbles, each the size of a poppy seed, maintain their integrity at temperatures of 1,800 °C, far beyond the temperatures that might be reached inside the core in the event of an accident, according to tests at Oak Ridge and Idaho national laboratories.



Southern plans to build a prototype of the molten chloride reactor by the mid-2020s. Coupled with recent private funding announcements for companies including Terrestrial Energy and Transatomic Power, the DOE’s support could help jump-start a sector that promises to make real the long-awaited renaissance of nuclear power.

India will take longer than ten years to have a China-sized impact on many commodities

The Australian Minister for Resources, Energy and Northern Australia has said the urbanisation of India and its swelling middle class will step in to fill the breach as China’s economic growth slows. India's economy is one-fifth to one-sixth the size of China's, but India could start to pick up some of the demand for certain commodities over a ten-year outlook.

India might make up some of the gap in grain imports. However, India will take longer than ten years to make a China-sized impact on metals and energy.

Though other resource-dependent economies such as Brazil and Canada were technically in recession last year, Australia continued to grow strongly, creating 300,000 jobs.

Chinese steel production is expected to be down year on year in 2015, according to Colin Hamilton, Head of Commodities Research at Macquarie, resulting in weaker demand for bulk commodity imports. At the same time, Indian demand is enjoying double-digit growth rates, with the expectation that the ten-year commodity outlook will begin to be driven by India rather than China.


There is a World Bank comparison and analysis of China and India's impact on commodities.



India should freeze the design for a larger fourth aircraft carrier by the end of this year

India has two active aircraft carriers now, with a third under construction, and is finalizing the design for a 65,000-ton fourth aircraft carrier.

India's aircraft carriers

INS Viraat: 28,700 tons, Centaur-class carrier (ex-HMS Hermes), in service since 1987.

INS Vikramaditya: 45,400 tons, modified Kiev-class carrier (ex-Admiral Gorshkov), in service with India since 2013.

INS Vikrant: 40,000 tons, Vikrant-class carrier, being built at Cochin Shipyard and expected to enter service in 2018.

INS Vishal: 65,000 tons, Vikrant-class carrier. Construction has yet to start; it is planned to enter service in 2025 and may be nuclear powered.

INS Vishal will be capable of carrying over 50 aircraft.



January 20, 2016

Graphene elastomer could create soft, tactile robots to help care for elderly people

A new sponge-like material discovered by Monash researchers could have diverse and valuable real-life applications. The new elastomer could be used to create soft, tactile robots to help care for elderly people, perform remote surgical procedures, or build highly sensitive prosthetic hands.

Graphene-based cellular elastomer, or G-elastomer, is highly sensitive to pressure and vibrations. Unlike other viscoelastic substances such as polyurethane foam or rubber, G-elastomer bounces back extremely quickly under pressure, despite its exceptionally soft nature. This unique, dynamic response has never been found in existing soft materials, and has excited and intrigued researchers Professor Dan Li and Dr Ling Qiu from the Monash Centre for Atomically Thin Materials (MCATM).

According to Dr Qiu, “This graphene elastomer is a flexible, ultra-light material which can detect pressures and vibrations across a broad bandwidth of frequencies. It far exceeds the response range of our skin, and it also has a very fast response time, much faster than conventional polymer elastomers.”



Advanced Materials - Ultrafast Dynamic Piezoresistive Response of Graphene-Based Cellular Elastomers

A head transplant was performed on a monkey with no neurological damage, keeping the planned human head transplant attempt on track for the end of 2017

Sergio Canavero plans to transplant a human’s head onto a donor body. He says that the procedure will be ready before the end of 2017 and could eventually become a way of treating complete paralysis.

The team behind the work has published videos and images showing a monkey with a transplanted head, as well as mice that are able to move their legs after having their spinal cords severed and then stuck back together.

A monkey head has been transplanted without damage, and mice have had spinal cord repairs

Sergio Canavero said researchers led by Xiaoping Ren at Harbin Medical University, China, have carried out a head transplant on a monkey. They connected up the blood supply between the head and the new body, but did not attempt to connect the spinal cord. Canavero says the experiment, which repeats the work of Robert White in the US in 1970, demonstrates that if the head is cooled to -15 °C, a monkey can survive the procedure without suffering brain injury.

“The monkey fully survived the procedure without any neurological injury of whatever kind,” says Canavero, adding that it was kept alive for only 20 hours after the procedure for ethical reasons. New Scientist was, however, unable to obtain further details on this experiment.

“We’ve done a pilot study testing some ideas about how to prevent injury,” says Ren, whose work is sponsored by the Chinese government. He and his team have also performed experiments on human cadavers in preparation for carrying out the surgery, he says.



This can open up a new science of spinal cord trauma reconstruction

Canavero is seeking funds to offer a head transplant to a 31-year-old Russian patient, Valery Spiridonov, who has a genetic muscle-wasting disease. Canavero says he intends to make a plea to Mark Zuckerberg to finance the surgery. Last week, Trinh Hong Son, director of the Vietnam-Germany Hospital in Hanoi, Vietnam, offered to host the procedure.

“If the so-called head transplant works, this is going to open up a whole new science of spinal cord trauma reconstruction,” says Michael Sarr, editor of the journal Surgery and a surgeon at the Mayo Clinic in Rochester, Minnesota. “We are most interested in spinal cord reconstruction using head transplantation as a proof of principle. Our journal does not necessarily support head transplantation because of multiple ethical issues and multiple considerations of informed consent and the possibility of negative consequences of a head transplant.”

Evidence for a previously unknown distant giant planet in the solar system that is 5000 times more massive than Pluto

Astronomers say they have compelling evidence of something bigger and farther away than Pluto and this something would definitely satisfy the current definition of a planet.

Evidence indicates there is an undiscovered planet 5000 times more massive than Pluto

Caltech researchers have found evidence of a giant planet tracing a bizarre, highly elongated orbit in the outer solar system. The object, which the researchers have nicknamed Planet Nine, has a mass about 10 times that of Earth and orbits about 20 times farther from the sun on average than does Neptune (which orbits the sun at an average distance of 2.8 billion miles). In fact, it would take this new planet between 10,000 and 20,000 years to make just one full orbit around the sun.
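Those two numbers are consistent with each other: Kepler's third law fixes the orbital period once the average distance is known. A minimal Python sketch (taking 20 times Neptune's roughly 30 AU distance as an illustrative semi-major axis):

    # Kepler's third law for solar orbits: P^2 = a^3, with P in years and a in AU
    neptune_au = 30.1       # Neptune's average distance from the sun, AU
    a = 20 * neptune_au     # "20 times farther than Neptune" (illustrative)
    period = a ** 1.5       # P = a^(3/2)
    print(f"a ~ {a:.0f} AU -> period ~ {period:,.0f} years")

That prints roughly 14,800 years, squarely inside the quoted 10,000 to 20,000 year range.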

The researchers, Konstantin Batygin and Mike Brown, discovered the planet's existence through mathematical modeling and computer simulations but have not yet observed the object directly.

There was separate work at the Atacama radio telescope facility that found evidence of a possible large planet or brown dwarf object, though the nature of the Atacama work could not isolate its exact location.

Brown notes that the putative ninth planet—at 5,000 times the mass of Pluto—is sufficiently large that there should be no debate about whether it is a true planet. Unlike the class of smaller objects now known as dwarf planets, Planet Nine gravitationally dominates its neighborhood of the solar system. In fact, it dominates a region larger than any of the other known planets—a fact that Brown says makes it "the most planet-y of the planets in the whole solar system."

Dr. Morbidelli said that the ninth planet could easily be the core of a gas giant that started forming in the early years of the solar system; a close pass to Jupiter could have flung it out. In those days, the sun was packed in a dense cluster of stars, and the gravity of those neighbors could have slowed the planet and prevented it from escaping the solar system.

Dr. Brown said he began searching for the planet a year ago, and he thought he would be able to find it within five years — perhaps sooner, with luck. Now it is likely that other astronomers will scan that swath of the sky.

If the planet exists, it would easily meet the International Astronomical Union’s requirements, Dr. Brown said.

“There are some truly dominant bodies in the solar system and they are pushing around everything else,” Dr. Brown said. “This is what we mean when we say planet.”

Our solar system is likely like thousands of others, with most planets in the 1 to 10 Earth-mass range

In terms of understanding more about the solar system's context in the rest of the universe, Batygin says that in a couple of ways, this ninth planet that seems like such an oddball to us would actually make our solar system more similar to the other planetary systems that astronomers are finding around other stars. First, most of the planets around other sunlike stars have no single orbital range—that is, some orbit extremely close to their host stars while others follow exceptionally distant orbits. Second, the most common planets around other stars range between 1 and 10 Earth-masses.

"One of the most startling discoveries about other planetary systems has been that the most common type of planet out there has a mass between that of Earth and that of Neptune," says Batygin. "Until now, we've thought that the solar system was lacking in this most common type of planet. Maybe we're more normal after all."

Brown, well known for the significant role he played in the demotion of Pluto from a planet to a dwarf planet adds, "All those people who are mad that Pluto is no longer a planet can be thrilled to know that there is a real planet out there still to be found," he says. "Now we can go and find this planet and make the solar system have nine planets once again."


The six most distant known objects in the solar system with orbits exclusively beyond Neptune (magenta) all mysteriously line up in a single direction. Also, when viewed in three dimensions, they tilt nearly identically away from the plane of the solar system. Batygin and Brown show that a planet with 10 times the mass of the Earth in a distant eccentric orbit anti-aligned with the other six objects (orange) is required to maintain this configuration. Credit: Caltech/R. Hurt (IPAC); diagram created using WorldWide Telescope.


This artistic rendering shows the distant view from Planet Nine back towards the sun. The planet is thought to be gaseous, similar to Uranus and Neptune. Hypothetical lightning lights up the night side. Credit: Caltech/R. Hurt (IPAC)

Recent analyses have shown that distant orbits within the scattered disk population of the Kuiper Belt exhibit an unexpected clustering in their respective arguments of perihelion. While several hypotheses have been put forward to explain this alignment, to date, a theoretical model that can successfully account for the observations remains elusive. In this work we show that the orbits of distant Kuiper Belt objects (KBOs) cluster not only in argument of perihelion, but also in physical space. We demonstrate that the perihelion positions and orbital planes of the objects are tightly confined and that such a clustering has only a probability of 0.007% to be due to chance, thus requiring a dynamical origin. We find that the observed orbital alignment can be maintained by a distant eccentric planet with mass ≳10 M⊕ whose orbit lies in approximately the same plane as those of the distant KBOs, but whose perihelion is 180° away from the perihelia of the minor bodies. In addition to accounting for the observed orbital alignment, the existence of such a planet naturally explains the presence of high-perihelion Sedna-like objects, as well as the known collection of high semimajor axis objects with inclinations between 60° and 150° whose origin was previously unclear. Continued analysis of both distant and highly inclined outer solar system objects provides the opportunity for testing our hypothesis as well as further constraining the orbital elements and mass of the distant planet.

Russia opts for next generation diesel-electric submarines

Russia has decided not to build more Lada-class diesel-electric submarines (Project 677) and will spend the funds on fifth-generation Kalina-class diesel-electric submarines, a senior Russian Navy official said Tuesday.

The new submarines will be equipped with anaerobic (air-independent) power units. Their construction is expected to be launched after 2020.

Air-independent, closed cycle submarines, which usually use hydrogen-oxygen fuel cells, are quieter than conventional diesel-electric boats and do not have to surface or use snorkel tubes to breathe air, thereby exposing themselves to detection by radar and other sensors.

Several years ago Rear Admiral Shlemov, in charge of naval shipbuilding, expanded on this, highlighting that this new type of submarine would have a displacement of 5,000–6,000 tons. This new, smaller submarine's main mission would be the protection of the DOLGORUKIY-class SSBNs, allowing the multi-mission SEVERODVINSK to perform other navy missions.



January 19, 2016

IMF sees weaker growth in China and the World in 2016 and 2017

China's economy grew at its weakest pace in a quarter of a century last year, raising hopes Beijing would cushion the slowdown with more stimulus policies, which in turn prompted a rally on the country's rollercoaster share markets.

Growth for 2015 as a whole hit 6.9 percent after the fourth quarter slowed to 6.8 percent, capping a tumultuous year that witnessed a huge outflow of capital, a slide in the currency and a summer stocks crash.

China's slowdown, along with the slump in commodity prices, prompted the International Monetary Fund to cut its global growth forecasts again on Tuesday, and it said it expected the world's second-largest economy to see growth of only 6.3 percent in 2016.

The IMF's latest global growth forecast was revised down to 3.4 percent for 2016 and 3.6 percent for 2017.
The IMF sees emerging market and developing economies facing increased challenges. The key risks relate to China's slowdown, a stronger dollar, geopolitical tensions, and renewed global risk aversion.



What If Humans Could Survive A 4000 G Explosive Launch?

A guest article by Joseph Friedlander

This is a longer article giving an overview of what happens if Gershom Gale's theory, that people with fluid-filled interiors can survive 1000 G and better, is workable. If so, what new opportunities for manned space operations are opened?

There are so many launch system ideas that have great promise for reaching space at lower cost than present, but a distressing number of them subject cargoes to extreme acceleration.

Let's focus on explosive launch for a moment. This can mean gas guns or direct explosive launch.


Reader Paul 451 made the following comment on one of the Wang Bullet articles.

http://nextbigfuture.com/2012/01/nuclear-katyusha-launching.html#comment-423815569
Joseph, I had a random thought. A nuke is a largely unshaped blast wave. (The drill-hole will cause some innate shaping of the wave, but I suspect that would be behind the leading edge, so irrelevant to the launch itself.) Is it possible to simulate the effects on the reaction mass (the water beneath the payload) by using conventional explosives. I was thinking something like ringing the entire length of the drill-hole with shape-charges, timing the explosions to simulate the speed and force of the nuclear blast wave. It'll take a painful amount of explosives, but it won't take 150,000 tons of HE to simulate a 150kt nuke. And it's much politically safer than a nuke, so easier to actually be allowed to do it.

And indeed that is one scenario we consider in the article.

Brian Wang's coverage of explosive launch
http://nextbigfuture.com/2009/06/blast-wave-accelerator-space-launch.html

The Blast-Wave Accelerator:
* is of Russian origin
* is a concept that has been verified by NASA studies
* is state-of-the-art technology
* has an estimated launch cost of $200 - 2,000/kg of payload, depending on construction and refurbishment options
* a 15 m barrel generates 300,000 g acceleration
* a 40 m barrel generates 100,000 g acceleration
* a longer barrel generates lower launch acceleration
* Russian experiments indicate that Mach 27 projectile/payload velocity is achievable
* has a payload mass fraction of 70 - 95%
* has artillery-like operations, complexity and cost
* can be based anywhere
* possesses excellent stealth (i.e., it has no exhaust plume)
* has affordability, ferocity, and quick reaction time

Projectiles are accelerated by a series of hollow explosive rings that are detonated in rapid sequence, causing a near-constant pressure to form at the base of the projectile and thereby generating a near-constant and large acceleration.
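Holding that pressure near constant means each ring must fire just as the projectile arrives. Under constant acceleration a, the projectile reaches position s at time t = sqrt(2s/a), which fixes the detonation schedule. A minimal Python sketch using the 15 m, 300,000 g case from the list above (the 1 m ring spacing is an assumption for illustration, not a figure from the source):

    import math

    g = 9.81
    a = 300_000 * g    # acceleration for the 15 m barrel case above, m/s^2
    barrel = 15.0      # barrel length, m
    spacing = 1.0      # assumed ring spacing, m (illustrative)

    # Each ring at position s must detonate at t = sqrt(2*s/a) to keep
    # the pressure at the projectile base roughly constant.
    for i in range(1, int(barrel / spacing) + 1):
        s = i * spacing
        t_ms = math.sqrt(2 * s / a) * 1e3
        print(f"ring at {s:4.1f} m fires at t = {t_ms:5.3f} ms")

    v = math.sqrt(2 * a * barrel)  # muzzle velocity from v^2 = 2*a*s
    print(f"muzzle velocity ~{v:,.0f} m/s (~Mach {v/343:.0f} at sea level)")

The muzzle velocity comes out near 9,400 m/s, consistent with the Mach 27 figure quoted above, and the whole firing sequence lasts about 3 milliseconds.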


The amount of explosives used can be very large, stacked in rings, each conceivably as high-yielding as the picture below, for a conventional Wang Bullet kind of explosive launch of the sort Paul 451 suggested. I am guessing ear protection might be a good idea, particularly when the thing goes supersonic. Note that ANFO may not be the quick-triggering explosive you need for the above technology; the picture is illustrative only.

Minor Scale fireball immediately after detonation. The F-4 Phantom aircraft in the foreground is 63 feet (19 m) long.
4.8 kilotons of ANFO explosive (ammonium nitrate and fuel oil), equivalent to 4 kilotons of TNT, were used to roughly simulate the effect of an eight kiloton air-burst nuclear device. With a total energy release of about 1.7 × 10^13 joules (or 4.2 kilotons of TNT equivalent), Minor Scale was reported as "the largest planned conventional explosion in the history of the free world".

Some future explosive launch systems might be analogous to this conventional explosive shock wave guide:

Sandia National Laboratory's "Thundertube". The "Thundertube" was a conventional explosive shock wave guide which consisted of a steel pipe about 5.8 m (19 ft) in diameter and about 120 m (400 ft) long. Small scale HML design concept models were placed on a soil sample (about 5 m x 5 m x 1.5 m deep) intended to represent Western US desert soils. Soil sample preparation was quality-assurance verified using a 1 cm diameter ultra-miniature Cone Penetration Test penetrometer (tip and friction sleeve) developed at the Earth Technology Corporation (Long Beach, CA) in 1984. The CPT soil test system and sample preparation (soil surface planner) equipment was designed by Andrew Strutynsky PE, CPT Group Leader at Earth Technology 1982-1985.



But explosive launch systems aren't the only way up at high G: pulsed quenchguns are a transient release of electrical energy as kinetic energy:


http://www.askmar.com/Massdrivers/Electromagnetic%20Launch.pdf


We are talking two seconds to escape velocity.

 Here is an excerpt from the 1980 L-5 News announcement of Dr. Kolm's work. Read the whole thing here:

http://www.nss.org/settlement/L5news/1980-massdriver.htm
---------

MASS DRIVER UP-DATE


By Henry Kolm
From L5 News, September 1980
Mass Drivers were proposed by Professor Gerard O'Neill in 1974 as the logical means for transporting lunar raw material to L-5. As all but perhaps a few of the newest L-5 members know, mass drivers are electromagnetic launchers which accelerate payloads in recirculating buckets with superconducting magnet coils at a repetition rate of about ten per second. These buckets are levitated, guided and driven by a synthetically synchronized linear motor derived from the Massachusetts Institute of Technology (MIT) Magneplane. The magneplane is a cylindrical high-speed train which floats twelve inches above an aluminum trough. The mass driver ultimately evolved into a line of pulse coils surrounding a barrel of aluminum guide rails within which a stream of cylindrical buckets is accelerated and decelerated without physical contact.
Cutaway of a mass driver model: the current in the drive coils makes a magnetic field which pushes on currents in the bucket coils, producing acceleration.
The mass driver on display at the Princeton Conference in 1979. Photo by Charles Divine.
Mass driver development has been pursued by a dedicated group, first at two NASA Ames summer studies in 1976 and 1977, and during the intervening academic year, while O'Neill was a visiting professor at MIT. Out of this collaboration came Mass Driver One, built by a group of MIT students on a shoestring budget in four months, in time to be demonstrated at the May 1977 Princeton-American Institute of Aeronautics and Astronautics (AIAA) Symposium on Space Manufacturing. It was also featured in the NOVA documentary "The Final Frontier," and was flown to California to be exhibited and nationally televised at the festivities surrounding the first piggyback flight of the Space Shuttle orbiter Enterprise in 1977.
It is now believed that a lunar mass driver several kilometers long, designed conservatively with present technology, should be able to deliver 600,000 tons a year to L-5, or more easily to L-2, at a cost of about $1 per pound, assuming only ten years of operation. Smaller caliber mass drivers could also be useful as reaction engines to propel large structures or asteroids by ejecting waste matter as reaction mass. Such devices are not as straightforward as lunar launchers since certain stability problems of long, flexible structures in space need to be solved.
I am often asked what, if anything, has happened recently. An update is about due, particularly since exciting new possibilities have emerged.
Mass Driver Two
In the fall of 1978, O'Neill and I shared a university-level NASA grant for the development of Mass Driver Two. It is to operate in an evacuated, four-inch caliber tube at an acceleration of 500 gee, with a superconducting bucket and an oscillating, push-pull coil system. It is close to an actual lunar driver, but more complicated due to the need for a vacuum tube between drive coils and bucket. ...it may be possible to build mass driver reaction engines which are only several meters, rather than several kilometers, long and eject reaction mass in the form of small rings or washers (easily made of lunar aluminum, for example) without the use of superconducting buckets. Normal metals will carry even higher current densities than superconductors for very short periods of time. On the other hand, conventional mass drivers with recirculating superconducting buckets can be improved drastically by using superconducting instead of normal-conducting drive coils, and storing the launch energy inductively in the drive coils. This would eliminate the need for capacitors and feeder lines, thereby reducing the system mass, cost and complexity. The most exciting thing we learned is that mass drivers can be used to launch space cargo from Earth!
The Era Of Earth-Based Mass Drivers
Electromagnetically launched space vehicles are an old dream. Arthur C. Clarke and Robert Heinlein have used them for decades, and a Princeton professor named Northrup proposed them in the Twenties. The Germans attempted electromagnetic launching unsuccessfully during World War Two, before they embarked on the development of rockets. Actually the most successful catapult launch was achieved by chemical means in the Sixties when a passive missile was almost accelerated to orbital velocity from the Barbados Islands by welding together two large naval guns. It would be nice to be able to launch pure payload, unaccompanied by over 100 times its mass in expensive rocket engines and fuel. Nevertheless, space technologists never took direct Earth-launching seriously. After all, consider the ablation problems we face when entering the atmosphere from the top, where it is very dilute. Imagine the energy and ablation loss when a vehicle enters at full speed from the Earth's surface, where the atmosphere is very dense. Even if a vehicle could be launched at escape velocity of 11 km/s, or even at a lower orbital velocity, it would certainly burn up before traversing the atmosphere, right?
Wrong! At least one dreamer refused to accept this extrapolation: Fred Williams has talked about Earth-launching ever since the days of the Magneplane Project. The question is: just how large would an Earth-launched vehicle have to be to survive its passage through the atmosphere? The first time this question was considered seriously in a quantitative way, to the best of my knowledge, was at the 1977 NASA Ames summer study. The theory of ablation in a dense atmosphere had received recent attention in connection with the outer planet probe program, and two members of the Ames team applied the resulting software to the problem of the Earth launcher: Chul Park and Stuart Bowen. They found, much to everybody's surprise, that an Earth-launched vehicle would not have to be prohibitively large to survive: a vehicle the size and shape of a telephone pole could be launched out of the Solar System with a loss of only about 3% of its mass, and 20% of its energy to the atmosphere. There are two reasons for this result. First, the atmospheric transit is short and vertical rather than long and tangential (as required for astronauts to survive the deceleration); and second, the high atmospheric density leads to highly opaque ablation products which reduce radiation heating from the hot air to the projectile's surface.
A reference design telephone pole launcher would have the specifications shown below.
Vehicle: Telephone Pole Shaped, Mass of 1,000 kg
Launch Velocity: 12.3 km/s
Velocity at Top of Atmosphere: 11 km/s (escape velocity)
Kinetic Energy at Launch: 76 x 10^9 joule
Ablation Loss, Carbon Shield: 3% of mass
Energy Loss: 20%
Acceleration: 1,000 gee
Launcher Length: 7.8 km
Launch Duration: 1.26 second
Average Force: 9.8 x 10^6 newton = 2.2 x 10^6 pound
Average Power: 60 x 10^6 kilowatts
Charging Time From 1,000 MW Power Plant: 1.5 minute
This launcher is about as long as the deepest well hole ever drilled, and therefore represents the longest launcher which can be installed vertically by present technology. If it were made longer to decrease the power requirement or increase the payload size it would have to be installed up a mountainside at an inclination of perhaps 30 to 45 degrees. This would increase mass and energy losses due to the lengthened path through the atmosphere.
The cost of the launcher itself in terms of installed copper, steel and concrete would be only 24 million dollars. But a device to store 76 gigajoules by conventional technology (generators and capacitors) would cost 11 billion dollars. This estimate may not be very meaningful, because it is based on cost estimates for quantities of capacitors which have never been manufactured before, but even at half the price, the investment would be formidable. The energy cost of the launch would only be about 65 cents per pound, but amortization of capital would add 10 to 20 dollars per pound, even if the launcher were used continuously, day and night, every 12 minutes.
Actually, it is more useful to think in terms of power compression rather than energy storage. The reference launcher could be operated by storing energy from one single large (1,000 megawatt) power plant for 1.5 minutes, and releasing it in 1.5 seconds, a 60-fold compression. Perhaps sixty power plants could be tapped simultaneously during off-peak hours by using superconducting transmission lines. On the other hand, if the power requirement were reduced by a factor of 60, there would be no need for energy storage at all. This could be done either by making the launcher 60 times longer (468km), or by making the vehicle 60 times smaller (17kg). Neither alternative is reasonable. A compromise might be to apply a factor of the square root of 60 to each: a 60km long launcher with a 129kg vehicle. Unfortunately this launcher would be too long even for installation up a mountainside, and the payload ratio of such a small vehicle would be very poor.
There does, however, appear to be a solution to the energy storage problem. If the entire drive coil system of a mass driver is made superconducting, as well as the bucket coils, enough energy can be stored inductively by charging the system with current. It is then merely necessary to quench the current in each individual drive coil as the bucket passes. This loses some of the energy efficiency of a push-pull capacitor system, but the loss is more than offset by eliminating capacitor feeder line losses. A preliminary calculation indicates that a "quench gun" of this type of 12-inch caliber, only 1 km long, would store enough energy to launch a 20kg vehicle to 10.5km/s at an energy conversion efficiency of 80%, at an average acceleration of 5,600 gee.
There are technical problems to be solved, of course, but not any of a fundamental nature. The benefit-to-risk ratio of the enterprise certainly justifies an immediate, serious study. The possibility of launching cargo into space at a cost approaching about one dollar per pound by using off-peak electric power has mind-boggling consequences. To name only the most obvious: we could dispose of nuclear waste by launching it out of the Solar System; we could begin constructing solar power satellites; and we could establish fueling stations in low Earth orbit where Shuttle travellers would take on fuel and reaction mass for the trip beyond: to geosynchronous orbit, to the Moon, and to L-5.

Henry Kolm
-----
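Kolm's reference figures above hang together under the standard constant-acceleration and kinetic-energy formulas, and are easy to sanity-check. A minimal Python sketch (all inputs are the quoted table values; the small mismatches are rounding in the original):

    mass = 1_000.0    # kg, the telephone-pole vehicle
    v = 12_300.0      # m/s, launch velocity
    g = 9.81

    ke = 0.5 * mass * v**2        # kinetic energy at launch
    a = 1_000 * g                 # quoted acceleration, 1,000 gee
    length = v**2 / (2 * a)       # launcher length from v^2 = 2*a*s
    duration = v / a              # launch duration from v = a*t
    power = ke / duration         # average power over the launch
    charge_s = ke / 1.0e9         # seconds to charge from a 1,000 MW plant

    print(f"kinetic energy : {ke:.2e} J   (quoted: 76 x 10^9 J)")
    print(f"launcher length: {length/1e3:.1f} km    (quoted: 7.8 km)")
    print(f"duration       : {duration:.2f} s    (quoted: 1.26 s)")
    print(f"average power  : {power:.2e} W   (quoted: 60 x 10^6 kW)")
    print(f"charging time  : {charge_s/60:.2f} min  (quoted: 1.5 min)")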

Friedlander here again. For reference, we will talk about a Kolm Launcher later in the article. That is the above postulated quenchgun system scaled up by a factor of 100-1000 for a 2-20 ton launch vehicle, but with the barrel lengthened to keep the acceleration down to 4,000 G. Why the upsizing? The vehicle has to be high-G resistant, and if manned, the manned part has to be fluid-filled, thus the scale-up.
If not manned, you can make the capsules smaller, like Quicklaunch (below), but then you have cargo subdivision problems (many things cannot fit in a small capsule without total redesign and segmentation), packing problems (custom G-proofing), and capsule rendezvous problems (instead of one self-sufficient lunar surface rendezvous lander, you have a bunch of little things that need gathering, or orbital collection and assembly). It's a mission architecture problem you can play with. Many new insights are possible.


When I say no evacuated spaces in this launch article, I don't just mean in the astronauts but in the whole ship: literally no void spaces. Electronics are potted with epoxy, tanks are filled with no air at all, inflatable tanks are carried in flat bag form, and there are no hard drives or other vacuum-containing systems that can't be fluid-filled and pumped down later. Open spaces are completely filled with fluid. Combined with the stoutness of the ship, that is a substantial weight penalty. The flip side is you have a lot of scrap material and chemicals you can use on the Moon. Unopened inflatable flat-rolled tanks can handle the outgassing as you de-liquid the ship on the Moon; later the base can do that for you.


This excerpt from the link below is by Dr. David P. Stern and I recommend you see the page he wrote to get a good look at the gas gun. Note his calculation yields a 4000 G value for launch, consistent with many cannon calculations as well.

http://www.phy6.org/stargaze/SSHARP.htm
  

Let us assume that the shell inside the cannon accelerates at a constant rate of a (meters/sec^2). From the equations of motion with a constant acceleration (developed earlier for falling objects, whose acceleration a equals g ≈ 10 m/sec^2), if t (in seconds) is the time spent accelerating, the final velocity (m/sec) is
v = at
and the distance covered, in meters
s = at^2/2
From the first equation, t = v/a. Substituting this in the second equation gives, after a few steps
v^2 = 2as
    Suppose the barrel of the cannon is a mile long (≈1600 meter) and the final velocity v, the one with which the shell emerges, is the escape velocity from the surface of the Earth:
v = v_esc = 11,300 m/sec
v^2 ≈ 128,000,000 (m/sec)^2
Then a quick calculation yields:             a ≈ 40,000 m/s^2 ≈ 4000 g
    The force on the shell and on any passengers inside it would be 4000 times stronger than gravity. A suitably supported person, such as an astronaut in the space shuttle, flat on his or her back, can endure accelerations of up to about 6 g. Doubling the figure can bring loss of consciousness, and any accelerations much greater than that can rupture organs and blood vessels. 


Friedlander here again.
At 4000 g one would expect strawberry jam on the couch. In one long-ago science fiction story, I read of someone who endured 30 G accelerations on a water-filled couch, which failed, and he was extruded out a hole in the back or some equally horrible fate.

But 4000 g? If we could survive that, the cosmos would be open to us, and for cheap.

By the way, if you recall THINGS TO COME by H.G. Wells, a key plot point was a crowd of rioters rushing a space gun (smooth move, dudes) trying to abort a launch. I am guessing the overpressure didn't do them any favors. 


However, being younger then, I was disgusted by the fact that they did not explain how the passengers could survive the high Gs. (At the time, being naive, I thought it was merely 1000 g or so; as shown, it's probably closer to 100,000 g for a short barrel.) But that is fiction.


Nice page below... guns... lots of guns... it will give you a feel for the many possibilities in this field. Nuclear gun launch of course includes the Wang Bullet concept (links further below).
http://orbitalvector.com/Orbital%20Travel/Launch%20Guns/LAUNCH%20GUNS.htm


An actual company that has been covered by Brian Wang on this blog is Quicklaunch.

http://nextbigfuture.com/2014/12/railguns-are-better-military-project.html

Friedlander here. The picture is not from Quicklaunch but illustrates how easy it is to get to high velocities with simple hydrogen oxygen mixtures-- superhot hydrogen allows even greater velocities.

http://nextbigfuture.com/2010/09/john-hunter-of-quicklaunch-is.html
Sander Olsen writes there--
...
basic idea of Quicklaunch is that you launch a projectile from a cannon at 6 kilometers per second using compressed hydrogen gas. On a conventional rocket, the payload fraction is about 3%, whereas with our concept the payload is more than 20%. So we could get propellant into orbit for about a tenth the cost of using conventional rockets. 

Question: So QuickLaunch could be used to launch propellant canisters to orbiting depots? 
Answer: Yes, these depots will serve as orbiting gas stations. For most space missions, 90% of the cost is getting propellant into orbit. Each launch could lift 1,000 pounds of payload into orbit, and we are capable of about 5 launches per day, every day. So we could reasonably expect to be able to transfer 30,000 pounds of fuel to an orbiting depot within a week, if so desired.


...if we try to send a single human to mars and back using only conventional rockets, the cost is $5 billion per person just for the fuel. By using our Quicklaunch, the cost would be only $500 million per person for the fuel. ....
We did G-tests in the 1990s on ruggedized satellites using the High G Test facilities at National Test Systems in Largo Florida. It turns out that many items, such as electronics, can be hardened to withstand high gs. Most cellphones are already hardened to withstand 1000 gs. 
Hardening surface mount electronics to withstand 3,200 gs only adds 2% to the weight of the object. 

Question: How much would it cost to assemble and prepare a Quicklaunch system? 
Answer: To get a system capable of launching 1,000 pound payloads to orbit into operation might cost $500 million. But constructing a proof of concept system that could hurl 100 pound payloads into orbit would cost only $50 million. Such a system could be quickly developed. 

Question: How long would it require to get the QuickLaunch system up and running? 
Answer: The main component - the cannon - is based on well understood principles and should not be difficult to perfect. There are four proposed development stages, and each stage should take two years. Phase 3, where we are actually launching payloads in the 100 pound range, would be in about the fifth year. The system would require roughly $500,000 per payload pound to develop, amortized over thousands of shots. So for example when amortizing over 10,000 launches, the capital cost is only $500,000/10,000= $50/lb. Naturally one must fold in the time value of money as well as the vehicle costs and the O&M. 

Question: Wouldn't wear and tear on the cannon barrel be a major concern? 
Answer: Yes, really good preventive maintenance on the barrels is required. The barrel will have a liner which will need to be periodically replaced. That process would require about a week. We would also need to be very careful to get proper payload alignments. We will also do maintenance between every launch, which would probably limit the number of launches to 5 per day. 


http://nextbigfuture.com/2010/01/ocean-based-orbital-payload-delivery.html

Brian writes there--
Quicklaunch will cost $562 million to develop over 4 phases and 8 years
* One thousand pound payloads.
* 10-28% payload fraction (full scale system will have 28% payload fraction)
* the donuts around the tube are for buoyancy and for rigidity and precision alignment
* 97+% recapture of the hydrogen gas to recycle the gas
* Cellphone electronics are G hardened, just replace the transformers
* Bigger systems can be built
* Neutrally buoyant barrel made out of composite, so no gravitational sag

The Quicklaunch design shows that all of the high-g issues of my nuclear cannon design can be resolved. If larger projectiles have issues, then many smaller projectiles can be launched at the same time. The nuclear launch system can achieve the 9 km/sec speed, so no booster is needed. The nuclear cannon can have a deeper hole to allow reduced g-forces even when accelerating to 9 km/sec instead of 6 km/sec.
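That deeper-hole trade is just v^2 = 2as again: for a chosen g-limit, the required bore depth grows with the square of the muzzle velocity. A minimal Python sketch (the two g-limits are illustrative choices, not figures from the original posts):

    g = 9.81
    for v in (6_000, 9_000):              # muzzle velocity, m/s
        for g_limit in (1_000, 4_000):    # illustrative acceleration limits, gees
            depth = v**2 / (2 * g_limit * g)   # bore depth from v^2 = 2*a*s
            print(f"{v/1e3:.0f} km/s at {g_limit:>5,} g needs a ~{depth:,.0f} m deep hole")

Reaching 9 km/sec at 4,000 g takes roughly a 1 km hole; relaxing to 1,000 g pushes the depth past 4 km.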




https://en.wikipedia.org/wiki/Quicklaunch

Friedlander here. As you can see, an ocean-suspended hydrogen gas gun has the longest barrel length practical without building expensive land-based structure. It's aimable given time, and can theoretically be towed to a near-equatorial location for maximum trajectory choice.


A sabot is ejected at launch, an aeroshell is jettisoned at 100 km, and a solid rocket uses most of the boost weight, but after all that the cost to orbit is still a fraction of today's. However, if direct-to-escape blast launch were feasible, the payload fraction would be far higher.



Brian refers above to the Wang Bullet concept. This is nuclear explosive launch from below the ground or ocean with a single round, with no airbursts like ORION.
http://nextbigfuture.com/2009/03/underground-nuclear-tests-salt.html
http://nextbigfuture.com/2009/02/nuclear-orion-home-run-shot-all-fallout.html
http://nextbigfuture.com/2010/03/150-kiloton-nuclear-verne-gun.html
http://nextbigfuture.com/2012/01/nuclear-katyusha-launching.html


http://nextbigfuture.com/2010/12/sea-based-launch-option-for-nuclear.html




Supportive posts with nuclear data for Wang Bullet Studies
http://nextbigfuture.com/2012/06/what-was-total-yield-of-all-known.html
http://nextbigfuture.com/2012/02/which-is-cheaper-per-unit-of-energy.html
http://nextbigfuture.com/2013/01/friedlander-on-wang-bullet-and-on.html


------
This article was inspired by the speculation of Gershom Gale, who "introduced Dr. Tom Shaffer of Temple University (who had perfected liquid ventilation, which he had intended to employ in order to save premature children) to Dr. Henry Kolm, who had left MIT to found a company called Electromagnetic Launch Systems (which promised to put a two-ton payload into low Earth orbit in 1.9 seconds)."

Gale's about page:  http://www.angelfire.com/my/theory/gale.html
BTW Gale's interesting idea on why time flows 'forward' http://www.esek.com/jerusalem/timetrav.html
Gale says: "It was my idea (subsequently favored by these two men) that filling an astronaut's lungs with the breathable liquid developed by Dr. Shaffer and floating him in a similarly liquid-filled capsule (i.e. neutral density encapsulation) would make it possible to withstand the 1,000G forces generated by Dr. Kolm's launch mechanism."
This is Gale's idea in his own words (shortened; the whole thing is at the link below):
 http://www.angelfire.com/my/theory/g.html

Surviving 1,000G

One of the first problems that must be solved by any group planning to colonize space is getting there. Rockets are likely to be too slow, too dangerous, and far too expensive when substantial numbers of people, animals, and plants are involved.

Perhaps there is a better way. The electromagnetic launch system designed by Dr. Henry Kolm (formerly of MIT) offers the possibility of putting two-ton payloads into low-Earth orbit in less than two seconds, and with no risk of explosion. Furthermore, it may be able to do so at a rate of up to six payloads an hour at a price of about $10,000 a payload. The cost of developing such a system, says Kolm (who has formed his own company, Electromagnetic Launch Systems), would be considerably less than what has already been spent on the shuttle program.

Attaining escape velocity in two seconds, however, generates acceleration stress of close to 1,000 gravities. Up to now, there has been no way for human passengers to survive such stress. Neutral density encapsulation might make it possible.

Some years ago, Dr. Tom Shaffer of Temple University developed a liquid hydrofluorocarbon 
https://en.wikipedia.org/wiki/Fluorocarbon
https://en.wikipedia.org/wiki/Organofluorine_chemistry#Hydrofluorocarbons

that can carry enough oxygen into the lungs to support mammalian life. His original purpose was to save severely premature infants, whose lungs are not able to handle gaseous oxygen. In this, he succeeded. Extensive animal studies and preliminary experiments with human infants show that his new liquid makes it possible to bring fetuses to healthy term after as little as 12 weeks in the womb.

Liquid Breathing Interview with Thomas Shaffer
https://www.youtube.com/watch?v=kF5e2raiB7c




 But the substance has other applications, one of which is to neutralize almost all the effects of acceleration stress.

Consider: What kills human beings at acceleration much over 30 g is not the acceleration itself, but the fact that the vehicle accelerates at a rate different from that of its passengers, and the different parts of the passengers' bodies also experience different rates of acceleration. This is because of the differences in density between the astronauts' bodies and the environment within the capsule, and differences in density between the lungs and the surrounding body tissues. So an unprotected human in a capsule accelerating at 1,000 g would be killed instantly for two reasons. First, in an air-filled capsule, the more dense human body, even if placed on an acceleration couch, would slam against that couch with bone-shattering force. Secondly, the relative density of the ribs and chest muscles compared to the air pockets in the lungs would cause the ribs to crush the lungs.

Neutral density encapsulation could perhaps solve both problems. The overall density of the human body is nearly the same as that of Shaffer's liquid. By floating an astronaut in a capsule completely filled with the hydrofluorocarbon, and then accelerating the whole capsule, the first source of stress has been removed, since both the capsule and its occupant would now be accelerating at the same rate.

To better understand this point, remember the high-school science experiment with a raw egg. Placed loose inside a tin box which is then thrown against a wall, the egg shatters. If the box with the egg in it is filled with water, however, so that egg and box accelerate and decelerate at the same rate, the egg can survive the throw unbroken. The same principle was applied to living bodies during a rather cruel Italian experiment conducted in the 1960s. The researchers slammed a pregnant rat against a wall at 10,000 g. While the mother rat was killed instantly, the fetuses -- floating as they were in sacs totally filled with amniotic fluid -- survived.

The second source of stress -- the difference in density (and hence rate of acceleration) between chest and lungs -- can be neutralized by having the astronaut breathe the liquid. The gag reflex can be overcome by adjusting the substance's temperature and pH. Ethical considerations have so far prevented Shaffer from filling both lungs of a human volunteer, but one lung has been filled, and the liquid has been breathed and later coughed out without harm. Whatever was left in the lung was safely absorbed....
Neutral density encapsulation could thus permit the entire "package" -- capsule, astronaut, chest, and lungs -- to be accelerated or decelerated as a single-density whole. When the idea was presented to Shaffer and Kolm, they worked out the physics and concluded that, yes, floating a liquid-breathing astronaut in a completely liquid-filled chamber would offer full protection against up to 1,000 g.
...
Of course, if the ultimate goal is colonization of the galaxy, rather than merely the solar system, drive systems considerably more "potent" than Kolm's may be required. A Swedish specialist in space medicine has speculated that, if the sinus cavities as well as the lungs are filled, it might be possible to survive even higher accelerations.
----
Friedlander here again.
Wikipedia is not so optimistic:
https://en.wikipedia.org/wiki/Liquid_breathing
Acceleration protection by liquid immersion is limited by the differential density of body tissues and immersion fluid, limiting the utility of this method to about 15 to 20 G.
 Extending acceleration protection beyond 20 G requires filling the lungs with fluid of density similar to water. An astronaut totally immersed in liquid, with liquid inside all body cavities, will feel little effect from extreme G forces because the forces on a liquid are distributed equally, and in all directions simultaneously. 
However effects will be felt because of density differences between different body tissues, so an upper acceleration limit still exists.
Liquid breathing for acceleration protection may never be practical because of the difficulty of finding a suitable breathing medium of similar density to water that is compatible with lung tissue. Perfluorocarbon fluids are twice as dense as water, hence unsuitable for this application.
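The limit Wikipedia describes can be quantified: an immersed tissue is buoyantly supported in proportion to the surrounding fluid's density, so the residual load it feels scales with the density mismatch, a_felt = a * (rho_tissue - rho_fluid) / rho_tissue. A minimal Python sketch (the tissue densities are illustrative round figures, not measured values):

    # Residual g-load on immersed tissues due to density mismatch
    tissues = {"fat": 920.0, "muscle": 1060.0, "bone": 1900.0}  # kg/m^3, illustrative
    rho_fluid = 1000.0    # water-like immersion/breathing fluid (assumed)
    g_applied = 1000.0    # applied launch acceleration, in gees

    for name, rho in tissues.items():
        felt = g_applied * (rho - rho_fluid) / rho
        print(f"{name:6s}: feels ~{felt:+5.0f} g residual at {g_applied:.0f} g applied")

Even a few percent of mismatch leaves tens of gees of residual load at 1000 g applied, and the bone-versus-fluid mismatch leaves hundreds, which is why an upper limit remains even with filled lungs.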

Incidentally, if we could achieve safety against a million G's (we would need a nanotech infusion that would equalize body tissue densities), we could heliobrake in the target star's atmosphere, so we would only have to pay for the outward trip; that by itself would be huge in terms of affording massive interstellar colonization. Combine that with an AB Matter tether at Jupiter, as I have speculated on here:
http://nextbigfuture.com/2011/11/starbase-jupiter-and-other-femtotech.html
and we would be able to do it with little net energy cost other than drawing down Jupiter's rotational energy. 

But that is literally getting ahead of ourselves. Let's focus on merely launching to Earth escape velocity from the ground with some sort of high-G launcher of 1,000-4,000 Gs, with humans aboard and surviving unharmed.

A few years ago, Brian Wang wrote about the Johndale Solem Orion-like asteroid interceptor, which would have involved a 1000 G launch.

BTW, the original Orion project had a design for an asteroid interceptor that would accelerate at about 1000 Gs. An unmanned Orion asteroid interceptor was designed that would not need shock absorbers; artillery arming, fusing, and firing systems for shells are regularly built to take 1000 Gs. There was a three-page paper, "Nuclear explosive propelled Interceptor for deflecting objects on collision course with Earth." Johndale Solem of Los Alamos proposed the unmanned vehicle, with no shock absorber or shielding. The pulse units were 25 kg bombs of 2.5 kiloton yield. The idea was to get to high velocities with only a few explosives and small shock absorbers, or no shocks at all.

The scenario: launch against a 100 meter chondritic asteroid coming at 25 km/sec, 1000 megatons if it hits. Launch when it is 15 million kilometers away and try to cause a 10,000 km deflection. A minimal Orion weighing 3.3 tons with no warhead would do the job: 115 charges with a total of 288 kilotons yield, launching to intercept in 5 hours. That is ample time to launch a second if the first failed.
http://nextbigfuture.com/2009/02/unmanned-sprint-start-for-nuclear-orion.html

Sprinting out of the magnetosphere: notice that the unmanned high-acceleration configurations would reduce the number of charges needed to get through the atmosphere to about 1-3, instead of the 200 charges needed to go to orbit with constant lower acceleration. Kick it hard with 3 or fewer 100G-acceleration charges (charges would go off every half second for fast acceleration, instead of every 1.1 seconds for human-safe acceleration). It can head up at 100 Gs, i.e. 980 m/s^2, so only 1-3 charges is enough to give escape velocity, and then it coasts. It is only a matter of containing the fallout from 1-3 low-level charges, and with those 1-3 charges we have tens of thousands to millions of tons launched to start the space age. Some of the Orion configurations were for 1000 Gs of acceleration. At 100 Gs, in 10 seconds it would be almost 50 kilometers up (20 shots, assuming one every 0.5 second); in 20 seconds it would be almost 200 kilometers up. Some more charges could be used to slow the Orion for a rendezvous with human passengers and acceleration-sensitive cargo, who could then fly anywhere in the solar system at a leisurely pace without concern about fallout.

Mars Express: another aspect of the fast acceleration possible is that an unmanned Orion could go from Earth or Earth orbit to Mars (decelerating at the halfway point) in under one day at 100 Gs if Mars and Earth are at close approach. If the unmanned version were going at 1000 Gs (a design that is possible), then Earth to Mars could be done in a few hours. At about 300 Gs you would be looking at Mars Overnight package delivery. - Brian
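The altitude figures in that sprint scenario follow from constant-acceleration kinematics, treating the half-second pulse train as a smooth 100 g average (an idealization that ignores gravity losses and drag). A minimal Python sketch:

    g = 9.8
    a = 100 * g    # ~100 g average from charges every 0.5 seconds
    for t in (10, 20):
        altitude_km = 0.5 * a * t**2 / 1e3   # s = a*t^2/2
        speed_kms = a * t / 1e3              # v = a*t
        print(f"t = {t:2d} s: ~{altitude_km:.0f} km up, moving {speed_kms:.1f} km/s")

That gives ~49 km at 10 seconds and ~196 km at 20 seconds, matching the "almost 50" and "almost 200" kilometer figures above.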

A manned Solem hotrod would give new opportunities to work on yelling Yahooo! through fluid-filled lungs (could you even gurgle?). But a manned Wang Bullet would give the same performance, if it were possible to launch at 4000 Gs, by filling humans with fluid breathing liquid and with NO atmospheric nuclear explosions (just one underground or underwater).

You will note that above, Gale speculated on 1,000 Gs. Many systems need a bit higher than that, so I am setting the bar at a nominal 4000 Gs, at which encapsulated electronics can survive.

What would be the design rules for stuff going up?

Compact design. If fluid has to fill every void space, there can't be a lot of void space unless you have a huge system. With the Wang Bullet this is not a problem, but with the reference Kolm Launcher it is.

No evacuated spaces. This is the biggest rule. Not only the lungs but the sinuses would be filled. Astronauts would literally wear wetsuits, fluid-containing suits in which they had been sealed (I am not sure I am joking when I envision a CAT scan directly after sealing into the suit, just to be sure). The early 60's astronauts joked about spam (the meat, not the email) in a can; we are talking about spam in a beverage bag. They get loaded in their ship, and the ship itself is filled with (possibly a different) fluid. Ideally the time from sealing in to direct-to-escape launch is so small that literally an hour later they have been in space for half an hour.

(Note that sanitary arrangements are assumed and can be discussed, but not here. The main concern is that you not only need to be able to take care of bathroom needs while waiting for a launch, but also before a high-G landing. In other words, you may need to stay liquid-packed during the whole Earth-Moon run if you want to try for a high-G landing.)

Yes, even though direct impacts at over 300 m/s tend to leave powder in their wake, not whole ships (see the F-4 Phantom's dispute with a wall in the movie below), there might be ways to totally cut the mass ratio of payload up to payload down if you could take, say, 4000 g deceleration as well as acceleration (actually even 50 to 500 Gs would be huge). The penalty is that you need to stay in your G-suit for however much time it takes to get to your destination.

For example, the Zond capsules of the USSR would have killed a cosmonaut if they had encountered the 20-30 G forces of certain possible return trajectories.
https://en.wikipedia.org/wiki/Soyuz_7K-L1
All L1/Zond spacecraft made only unmanned flights from 1967-70 (from Zond 4 to Zond 8), and four of these five Zond flights suffered malfunctions.
Test flights conducted around the Moon showed problems using their star sensors for navigation; the failed guidance caused ballistic reentries. One direct descent re-entry was performed on a steep ballistic trajectory with deceleration of up to 20 Gs and splashed down in the Indian Ocean. Three others performed a maneuver known as "skip reentry" to shed velocity. One of those also performed an unsafe (for humans) descent of up to 20 Gs of deceleration, another suffered main parachute failure, and only one flight - Zond 7 - would have been safe for cosmonauts.

https://en.wikipedia.org/wiki/Skip_reentry


For example again, the Pioneer Venus Multiprobe: its probes encountered 458 Gs in a very swift deceleration.
http://www.mrc.uidaho.edu/entryws/presentations/Papers/bienstock_pioneer%20venus%20and%20galileo%20probe%20history-final.pdf
The ability to take high Gs on demand would enable many exotic mission and return trajectory profiles, and would probably someday save lives.

I have in mind a rotary tether on the Moon, or something akin to but longer than an aircraft carrier arrestor system
https://en.wikipedia.org/wiki/Arresting_gear
that could pull on the surface-skimming capsule (you need the right angle and incoming trajectory) as it comes into the moonbase. (The first missions, to build the braking base, would be blasted up and retro-rocketed down. But by saving the half of your weight in fuel you'd otherwise need to retro each capsule down, the capture system of the moonbase would rapidly pay for itself.)
This would only work for many smaller loads, such as from the reference Kolm Launcher mentioned above. Imagine 12 tons launched, losing a few hundred kilos of heatshield on the way up, then rocketing down to the Moon: maybe 6 tons lands dry, maybe less, and most of that is salvageable structure (say 2/3; high-G capsules are built sturdily) and void-filling fluids. Besides the breathing fluid, these could include water, propane, liquid ammonia (on unmanned flights) and other candidates, but they aren't really payload unless someone on the base asked for them and will pay for them. Net real payload might be under a ton and a half on a 12 ton launch.

My model of this is that two people might launch simultaneously, though it might be only one. So with 6 shots an hour, 8 hours a day (the Earth rotates, and not every trajectory up is ideal for a prompt Moon landing; only the best windows go to manned missions, and the other payloads can take their time down), you have 48 shots a day, a few of which might be manned, and maybe 60 tons of net payload down daily. The Saturn V system might have been rigged for a direct landing of the 3rd stage, or with 3 bottom-stage-only lunar modules, to give say 15-18 tons down. So this is the equivalent of over 3 Saturn Vs a day, or a thousand a year. That can basically build a couple of Skylab-sized base modules a day, or a large moonbase, 2001 or Space 1999 style, in a year.
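Making that throughput arithmetic explicit as a minimal Python sketch (the net payload per shot and Saturn V tonnage are the rough figures from the paragraph above, not firm numbers):

    shots_per_day = 6 * 8    # 6 shots an hour, 8 usable hours a day
    net_payload_t = 1.25     # rough net payload per 12 ton launch, tons
    daily_tons = shots_per_day * net_payload_t
    saturn_v_down = 16.5     # assumed ~15-18 tons landed per Saturn V equivalent

    print(f"{shots_per_day} shots/day -> {daily_tons:.0f} t/day landed on the Moon")
    print(f"= {daily_tons / saturn_v_down:.1f} Saturn V equivalents/day, "
          f"or ~{daily_tons / saturn_v_down * 365:,.0f} per year")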

But if you could snag the incoming 12 tons and not waste 6 tons of it in retro fuel, you could probably quadruple the net payload. So first you rocket down and build the landing base. Then you operate a lunar capture port and ship up more people, livestock, plants and industrial equipment, and start building and launching spacecraft on the Moon for tether launch to space (including an L-5 shipyard for large structures).


How might we capture large payloads at high G on the Moon? Direct crashing won't work unless we can build a retro tunnel in which to hit the incoming craft with gas streams of some kind, against which it can retro. This gas need not be a permanent gas; it could be sodium vapor, for example, which would not ruin the lunar vacuum.
Another approach would be a hovering capture hook.
The main idea would be a mass driver on the surface speeding up (on a hovertrack with magnetic suspension) something akin to a rocket sled that lifts a hook up to the incoming spacecraft at something under 300 m/sec relative velocity. Only the capture eyelet on the spacecraft need be built superstrong. Once hooked, the spacecraft can decelerate at at least hundreds of Gs, if not thousands, over quite a reasonable encounter length.

If that idea doesn't work, Kraft Ehricke came up with an elaborate treatment of using lunar dust runways to slow down with 1/10th the fuel (hovering on skids as you brake).

If that idea doesn't work, consider the rocket sled pictured below, which ran at better than lunar escape velocity (8,568 km/h) and was slowed down by drag through liquid braking.


So we can imagine a liquid runway of molten sodium metal, as detailed here http://nextbigfuture.com/2016/01/the-future-of-canal-transport-take_3.html, with a rocket sled brake accelerated to docking speed on a mass driver alongside, and the capsule hooking on and decelerating at high G, sloshing liquid sodium in a great plume.

There are many other lower-G ways to decelerate if you want to get out of the wetsuit right after reaching escape velocity. For example, hooking on to a lunar rotovator and enduring 8 Gs at 100 km radius as you are decelerated. But the whole focus of this article is cool things you can do at high G, so we are weighting it toward those scenarios.

I am assuming the astronauts would want to be awake for launch. But if you want to keep oxygen consumption to a minimum, you could chill them and put them to sleep while liquid-filling them. I personally would want to be awake every minute of my first space mission, but I am guessing there are a lot of people who would just want to go to sleep on Earth and wake up on the Moon with no worries in between.


Which would you prefer: 19 hours to 3 days in a wetsuit with waste filtering, awake or asleep?


There is a sweet spot in trip duration and delta-v requirements even if ANY launch speed is possible. Go at bare escape velocity and you take about 5 days to reach the Moon, impacting at the lowest possible speed. Up the launch delta-v a bit, to not much over 12 km/s net, and you get to the Moon not in 5 days, or even 3, but 19 hours.
http://www.universetoday.com/13562/how-long-does-it-take-to-get-to-the-moon/
New Horizons left at about 16.3 km/s (the first thing launched directly to solar escape, IIRC) and could have reached (and impacted) the Moon 8 hours 35 minutes after launch; it passed the Moon's orbit in under 9 hours.
But if you have to retrofire as you brake onto the Moon's surface, you want a low incoming speed. So I am guessing the sweet spot is a bit over 12 km/s for an explosive or high-G launch when speed matters, and as slow an approach as possible, needing under 3 km/s of landing delta-v, when propellant matters; a rough energy check follows.
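Here is a crude patched-conic energy check of that trade; Earth's and the Moon's gravity are handled separately, and trajectory geometry is crudely lumped by adding speeds in quadrature, so treat the outputs as rough magnitudes only.

```python
import math
# Crude energy check: landing retro delta-v as a function of launch speed.
mu_e = 398600.0                  # km^3/s^2, Earth's gravitational parameter
r_e, r_moon = 6371.0, 384400.0   # km, Earth radius and lunar distance
v_moon_orbit = 1.02              # km/s, Moon's orbital speed
v_esc_lunar = 2.38               # km/s, lunar surface escape speed

def impact_speed(v_launch):
    # Speed remaining at lunar distance after climbing Earth's well:
    v_far_sq = v_launch**2 - 2 * mu_e * (1 / r_e - 1 / r_moon)
    # Fold in the Moon's orbital motion, then fall down the Moon's well:
    return math.sqrt(v_far_sq + v_moon_orbit**2 + v_esc_lunar**2)

for v in (11.2, 12.0, 13.0):
    print(f"launch {v:.1f} km/s -> retro ~{impact_speed(v):.1f} km/s")
# ~3.0 km/s of retro at bare escape, ~5.3 at 12 km/s, ~7.3 at 13 km/s:
# trip time and landing propellant pull opposite ways, hence the sweet spot.
```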

There is a claim, which I am not sure I believe, that the very first thing to impact the Moon, the USSR's Luna 2 on September 13, 1959, retrofired by explosive charge to give its commemorative medallions a chance of surviving impact.


It also carried metal pendants which it scattered on the surface on impact, with the hammer and sickle of the USSR on one side and the launch date on the other. - See more at: http://www.historytoday.com/richard-cavendish/soviet-union-first-moon

Or at least a ghost of a chance. I'm not sure the story is real, and no photographs from the surface have yet proved it. But by dynamics, if not retrofired, those medallions hit the surface like a rifle bullet striking a target, only with 8 times the velocity and 64 times the energy (kinetic energy scales as v², and 8² = 64).
Explosive deceleration would be tricky, but it might pay if, for example, aluminum/liquid-oxygen charges could be made locally on the Moon. But would the detonation velocity be in the right range? It might be very hard to match, compared to, say, a spray of liquid or gas in the path of the oncoming capsule (a quick comparison follows). For that reason I think explosive slowdown is less likely than explosive launch. But a high-G deceleration capability for manned missions would enable astounding things.
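For a sense of the mismatch, here are typical published detonation velocities next to the capsule's approach speed; the aluminum/LOX figure is not one I know, so it is left out, and the approach speed is my assumption.

```python
# Detonation velocities of common explosives vs. an assumed approach speed.
approach_m_s = 2500            # m/s, assumed incoming speed near lunar escape
detonation_velocity_m_s = {    # typical published values
    "ANFO": 4300,
    "TNT": 6900,
    "PETN": 8400,
    "RDX": 8750,
}
for name, v in detonation_velocity_m_s.items():
    print(f"{name}: {v / approach_m_s:.1f}x the approach speed")
# Common explosives detonate 1.7-3.5x faster than the capsule arrives,
# one reason timing and shaping the retro pulse would be so awkward.
```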

One can imagine 1000-G-shielded astronauts surviving an asteroid aerocapture event as the triumphant return of a one-way manned asteroid mission. Go to an asteroid that is ALMOST on an Earth-encounter trajectory,

https://en.wikipedia.org/wiki/Asteroid_impact_avoidance
nudge it a bit years in advance, stay with the asteroid, working on it to shape it into an aerocapture-friendly form, then ride it into a highly eccentric Earth orbit (the low point is a capture necessity; the high point matters because you don't want it circularizing into a low circular orbit, which would herald atmospheric reentry!). With one mission Earth would gain a literal new moon, and a huge new space station possibly of kilometer scale; a rough energy sketch follows the links below.
IF the world community would ever permit it.
https://en.wikipedia.org/wiki/Aerocapture
https://en.wikipedia.org/wiki/Asteroid_capture
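Here is a minimal patched-conic sketch of why a single atmospheric pass can capture such a body; the perigee altitude, approach speed, and target apogee are all illustrative assumptions.

```python
import math
# Rough sketch of the speed an aerocapture pass must shed at perigee.
mu = 398600.0           # km^3/s^2, Earth's gravitational parameter
r_p = 6371.0 + 120.0    # km, assumed perigee radius skimming the atmosphere
v_inf = 4.0             # km/s, assumed hyperbolic approach speed

# Perigee speed arriving on the hyperbola:
v_hyp = math.sqrt(v_inf**2 + 2 * mu / r_p)

# Perigee speed of the target capture ellipse (apogee near lunar distance):
r_a = 384400.0
a = (r_p + r_a) / 2
v_ell = math.sqrt(mu * (2 / r_p - 1 / a))

print(f"arrive {v_hyp:.2f} km/s, need {v_ell:.2f} km/s "
      f"-> shed {v_hyp - v_ell:.2f} km/s")
# One pass only has to shed ~0.8 km/s of an ~11.8 km/s perigee speed
# to capture into the highly eccentric orbit described above.
```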


Some posts of mine that discuss problems of lunar landing and colonization, lunar and space industrial buildup, and what to do with it:

http://nextbigfuture.com/2010/12/sea-based-launch-option-for-nuclear.html
http://nextbigfuture.com/2010/12/setting-up-industrial-village-on-moon.html
http://nextbigfuture.com/2010/12/after-lunar-industrial-village.html
http://nextbigfuture.com/2011/01/two-world-industrial-bootup-enabling.html
http://nextbigfuture.com/2011/02/new-space-age-materials-for-new-space.html
http://nextbigfuture.com/2011/02/hyperwealth-and-alternative-futures-by.html

http://nextbigfuture.com/2011/12/friedlander-cold-crown-cold-trap-for.html
http://nextbigfuture.com/2013/01/friedlander-cold-crown-2-conversation.html

http://nextbigfuture.com/2010/03/in-praise-of-large-payloads-for-space.html
http://nextbigfuture.com/2013/08/in-praise-of-large-payloads-for-space.html

And other Wang Bullet collaborations:

http://nextbigfuture.com/2013/01/friedlander-on-wang-bullet-and-on.html
http://nextbigfuture.com/2009/03/underground-nuclear-tests-salt.html
http://nextbigfuture.com/2009/02/nuclear-orion-home-run-shot-all-fallout.html
http://nextbigfuture.com/2010/03/150-kiloton-nuclear-verne-gun.html
http://nextbigfuture.com/2012/01/nuclear-katyusha-launching.html


