January 20, 2017

Technology for making Death Star-sized structures could be technically feasible within the next 20 years

The first Death Star had a diameter of between 140 and 160 kilometers. The second Death Star's diameter ranged from 160 to 900 kilometers.


There are two near term technologies which could be applied to making Death Star sized structures:
1. Space bubbles
2. Robotic spiderfab construction

Giant Space Bubbles

Bubbles on earth have been made that cover a rectangular 11 meter-by-7.5 meter area.


In 2007 Devon Crowe of PSI Corporation produced a study for the NASA Institute for Advanced Concepts on making large space structures from bubbles that are made rigid using metals or UV curing.


A single bubble could be 10 meters across in Earth gravity, 100 kilometers in low Earth orbit, or 1,000 kilometers in deep space. Foams made of many bubbles could be far larger.

A 140 kilometer diameter sphere would have about 61,600 square kilometers of surface area. If a structure could be made at one gram per square meter, then that bubble sphere would weigh about 61,600 tons.

Reusable SpaceX Falcon Heavy launchers could bring up the material to expand a large bubble in space. It would take about 2,000 launches of a SpaceX Falcon Heavy to place in orbit the material for a 140 kilometer sphere, if the mass budget of one gram per square meter could be maintained. Every additional gram per square meter would mean another 2,000 launches.
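The arithmetic above can be checked with a quick sketch. The ~30 tonne reusable Falcon Heavy payload figure below is an assumption chosen to reproduce the ~2,000-launch estimate, not a number from the article:

```python
# Sketch: mass and launch budget for a 140 km bubble sphere at an
# areal density of 1 g/m^2. Payload per launch is an assumption.
import math

diameter_km = 140
radius_m = diameter_km / 2 * 1000
area_m2 = 4 * math.pi * radius_m**2                   # sphere surface area
print(f"surface area: {area_m2 / 1e6:,.0f} km^2")     # ~61,600 km^2

areal_density_g_per_m2 = 1
mass_tonnes = area_m2 * areal_density_g_per_m2 / 1e6  # grams -> tonnes
print(f"mass at 1 g/m^2: {mass_tonnes:,.0f} tonnes")  # ~61,600 t

payload_per_launch_tonnes = 30  # assumed reusable Falcon Heavy payload
launches = mass_tonnes / payload_per_launch_tonnes
print(f"launches needed: {launches:,.0f}")            # ~2,000
```

Each extra gram per square meter doubles, triples, and so on, the launch count, which is why the one gram per square meter budget is the critical assumption.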

A 1,000 kilometer bubble is nearly the size of Charon, the moon of Pluto: Charon is 1,200 kilometers in diameter, Saturn's moon Tethys is 1,050-1,080 kilometers in diameter, and Ceres, the largest object in the asteroid belt, is 970 kilometers in diameter. A single tessellation foam (like in the picture) of 1,000 kilometer bubbles would be about the size of Earth's moon. A Penrose tessellation like the one in the picture, made of 1,000 kilometer bubbles, would be between the size of Neptune and Saturn. A tessellation foam of 100 kilometer bubbles in Earth orbit could form an object the size of our existing moon or larger.

A tessellation of many bubbles would make a large Death Star-sized structure that would be more resilient than the movie Death Star, which was destroyed with one well-placed shot. Thousands of bubbles would need thousands of shots to destroy. Very low or no pressure inside the bubbles would mean that they would not pop from a hole.

Metal can be evaporated to coat the inside of the bubble for reflective sails and telescopes. A reflective giant space bubble could be used to focus sunlight into a beam.

Russian theorists had proposed solar reflector systems that could constantly illuminate areas the size of several cities every night. The most ambitious proposal foresees a constellation of 100 reflectors, each 1,300 feet in diameter with a surface area of 30 acres.

The Nazis considered making giant mirrors to heat cities to several hundred degrees.


Successful healthcare systems in Israel, Singapore and elsewhere could show the US how to cut health costs by a factor of two to four while getting better results

The United States should copy and adapt an improved healthcare system based on analysis of international systems. This article reviews the last attempt at a universal healthcare policy (Medicare for All) and then some of the more successful healthcare systems in the world. Here is a 180-page document that reviews health systems around the world as of 2015.

Medicare for All

A number of proposals have been made for a universal single-payer healthcare system in the United States, most recently the United States National Health Care Act (popularly known as H.R. 676 or "Medicare for All"), but none has achieved more political support than 20% congressional co-sponsorship. Advocates argue that preventative health care expenditures can save several hundred billion dollars per year: publicly funded universal health care would benefit employers and consumers, employers would gain a bigger pool of potential customers, would likely pay less, and would be spared the administrative costs of health care benefits. It is also argued that inequities between employers would be reduced. For example, cancer patients are more likely to be diagnosed at Stage I, where curative treatment is typically a few outpatient visits, instead of at Stage III or later in an emergency room, where treatment can involve years of hospitalization and is often terminal. Others have estimated long-term savings amounting to 40% of all national health expenditures due to preventative health care, although estimates from the Congressional Budget Office and The New England Journal of Medicine have found that preventative care is more expensive.

Any national system would be paid for in part through taxes replacing insurance premiums, but advocates also believe savings would be realized through preventative care and the elimination of insurance company overhead and hospital billing costs. An analysis of a single-payer bill by Physicians for a National Health Program estimated the immediate savings at $350 billion per year. The Commonwealth Fund believes that, if the United States adopted a universal health care system, the mortality rate would improve and the country would save approximately $570 billion a year.

In 2013, Gerald Friedman, a professor of economics at the University of Massachusetts, Amherst, produced a fiscal study of Medicare for All. There would even be money left over to help pay down the national debt, he said. Friedman says his analysis shows that a nonprofit single-payer system based on the principles of the Expanded and Improved Medicare for All Act, H.R. 676, introduced by Rep. John Conyers Jr., D-Mich., and co-sponsored by 44 other lawmakers, would save an estimated $592 billion in 2014. That would be more than enough to cover all 44 million people the government estimated would be uninsured that year and to upgrade benefits for everyone else.

Under the single-payer system created by HR 676, the U.S. could save an estimated $592 billion annually by slashing the administrative waste associated with the private insurance industry ($476 billion) and reducing pharmaceutical prices to European levels ($116 billion). In 2014, the savings would be enough to cover all 44 million uninsured and upgrade benefits for everyone else. No other plan can achieve this magnitude of savings on health care.

Specifically, the savings from a single-payer plan would be more than enough to fund $343 billion in improvements to the health system such as expanded coverage, improved benefits, enhanced reimbursement of providers serving indigent patients, and the elimination of co-payments and deductibles in 2014. The savings would also fund $51 billion in transition costs such as retraining displaced workers and phasing out investor-owned, for-profit delivery systems.

World Health Systems Compared

58 countries have universal health coverage

Israel has a system with government payment but four competing service providers. Israel's costs are half those of the USA.
Singapore has some of the best health results and among the lowest healthcare costs as a percent of GDP. Singapore's healthcare spending as a share of GDP is about a quarter of the USA's.


Hong Kong has early health education, professional health services, and a well-developed health care and medication system. Life expectancy is 84 for females and 78 for males, the second highest in the world, and the infant mortality rate is 2.94 per 1,000 live births, the fourth lowest in the world.

There are two medical schools in Hong Kong, and several schools offering courses in traditional Chinese medicine. The Hospital Authority is a statutory body that operates and manages all public hospitals. Hong Kong has high standards of medical practice. It has contributed to the development of liver transplantation, being the first in the world to carry out an adult-to-adult live donor liver transplant in 1993.

Israel has a system of universal healthcare as set out by the 1995 National Health Insurance Law. The state is responsible for providing health services to all residents of the country, who can register with one of the four national health service funds. To be eligible, a citizen must pay a health insurance tax. Coverage includes medical diagnosis and treatment, preventive medicine, hospitalization (general, maternity, psychiatric and chronic), surgery and transplants, preventive dental care for children, first aid and transportation to a hospital or clinic, medical services at the workplace, treatment for drug abuse and alcoholism, medical equipment and appliances, obstetrics and fertility treatment, medication, treatment of chronic diseases and paramedical services such as physiotherapy and occupational therapy.

Prior to the law's passage over 90% of the population was already covered by voluntarily belonging to one of four nationwide, not-for-profit sickness funds which operated some of their own medical facilities and were funded in part by employers and the government and in part by the insured by levies which varied according to income. However, there were three problems associated with this arrangement. First, membership in the largest fund, Clalit, required one to belong to the Histadrut labor organization, even if a person did not wish to (or could not) have such an affiliation while other funds restricted entry to new members based on age, pre-existing conditions or other factors. Second, different funds provided different levels of benefit coverage or services to their members and lastly was the issue mentioned above whereby a certain percentage of the population, albeit a small one, did not have health insurance coverage at all.

Before the law went into effect, all the funds collected premiums directly from members. However, upon passage of the law, a new progressive national health insurance tax was levied through Bituah Leumi (Israel's social security agency) which then re-distributes the proceeds to the sickness funds based on their membership and its demographic makeup. This ensured that all citizens would now have health coverage. While membership in one of the funds now became compulsory for all, free choice was introduced into movement of members between funds (a change is allowed once every six months), effectively making the various sickness funds compete equally for members among the populace.


Singapore spends just 4.7 percent of its GDP on health care (World Bank Health Data, 2014). Cost is controlled in a number of ways, perhaps foremost by the manner in which the government both fosters and controls competition—intervening when the market fails to keep costs down. Public and private hospitals exist side by side, with the public sector having the advantage of patient incentives and subsidies. Because it regulates prices for public hospital services and regulates the number of public hospitals and beds, the government is able to shape the marketplace. Within this environment, the private sector must be careful not to price itself out of the market.

At the same time, the government sets subsidy and cost-recovery targets for each hospital ward class, thereby indirectly keeping public sector hospitals from producing excess profits. Hospitals are also given annual budgets for patient subsidies, so they know in advance the levels of reimbursement they will receive for patient care. Within their budgets, hospitals are required to break even.

Singapore currently has the second lowest infant mortality rate in the world and among the highest life expectancies from birth, according to the World Health Organization. Singapore has "one of the most successful healthcare systems in the world, in terms of both efficiency in financing and the results achieved in community health outcomes," according to an analysis by global consulting firm Watson Wyatt. Singapore's system uses a combination of compulsory savings from payroll deductions (funded by both employers and workers), a nationalized health insurance plan, and government subsidies, as well as "actively regulating the supply and prices of healthcare services in the country" to keep costs in check; the specific features have been described as potentially a "very difficult system to replicate in many other countries." Many Singaporeans also have supplemental private health insurance (often provided by employers) for services not covered by the government's programs.




Virtually all of Europe has either publicly sponsored and regulated universal health care or publicly provided universal healthcare. The public plans in some countries provide basic or "sick" coverage only, with their citizens being able to purchase supplemental insurance for additional coverage. Countries with universal health care include Austria, Belarus, Croatia, Czech Republic, Denmark, Finland, France, Germany, Greece, Iceland, Ireland, Italy, Luxembourg, Malta, Moldova, the Netherlands, Norway, Portugal, Romania, Russia, Serbia, Spain, Sweden, Switzerland, Ukraine, and the United Kingdom.




Philippines will attend China's One Belt One Road conference in May where there will be the announcement of many new deals

Philippine President Duterte will visit China in May to attend the One Belt One Road conference. When Duterte made his first visit to China in October, the communist government in Beijing agreed to pump in $24 billion worth of funds.

The One Belt, One Road Initiative aims to get 60 countries to invest in infrastructure projects to develop the old Silk Road that once connected China with Central Asia, Europe and beyond.
President Duterte's second trip shows that the Philippines is eager to build closer relations with China.


Duterte reiterated last month that he wanted to avoid confrontation with China and saw no need to press Beijing to abide by a July ruling on China's claims in the disputed South China Sea that went in favor of the Philippines.

The Philippine leader intends to settle the dispute over the South China Sea through diplomatic talks and is seeking assistance from Japan.

It seems likely that several new One Belt One Road infrastructure deals will be announced at the conference in May.



Texas oil land grab and US crude oil production projected to pass 10 million bpd by 2021

US domestic oil production remains in a deep two-year slump because of low oil prices. However, several multibillion-dollar deals are showing sparks of recovery in the shale fields of the Permian Basin straddling Texas and New Mexico.

The U.S. has added 482,000 barrels of oil per day since mid-October, an increase of more than 5 percent, driven in large part by burgeoning output from that west Texas shale formation. The OPEC-Russia deal to cut production has raised oil prices by $10 per barrel and given the US room to increase supply.

Exxon Mobil announced on Tuesday that it was acquiring 275,000 acres in New Mexico from the Bass family of Fort Worth for up to $6.6 billion in stock and cash. The deal came one day after another oil producer, Noble Energy, agreed to pay $2.7 billion to buy Clayton Williams Energy, giving it 120,000 oil-rich acres nearby in West Texas.

The deals are among the largest of more than $25 billion of mergers and acquisitions in the Permian since June, representing roughly one-quarter of the total spent by the oil and gas industry on such transactions worldwide over the last year. Companies like Anadarko Petroleum, SM Energy and EOG Resources are selling assets in other domestic fields to snap up parts of several fields that make up the basin, which is roughly the size of South Dakota.

“The Permian Basin has now become the crown jewel of the world’s oil and gas industry,” said Scott Sheffield, the executive chairman of Pioneer Natural Resources, a large producer in the area.





The Permian received new life about a decade ago when drillers began experimenting with hydraulic fracturing to blast through shale fields that course through the region. Exploration by Pioneer Natural Resources and a few other companies found multiple layers of shale — six to eight oil-rich zones, one on top of the other, like a layer cake — that offer companies the opportunity to drill through multiple reservoirs on the same real estate.

The geological virtues of the Permian, along with an existing robust array of pipelines, have made the basin the cheapest shale oil field in the country to develop. The break-even price for the best acreage in the basin is as low as $40 a barrel, while in most other shale fields the break-even price is $10 to $20 higher. With acreage prices for oil properties multiplying by 10 times or more since 2012, oil executives are starting to talk of “Permania.”
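A rough per-barrel margin comparison shows why the lower break-even matters. The $55/barrel oil price here is an illustrative assumption, not a figure from the article:

```python
# Rough per-barrel margins implied by the quoted break-even prices,
# at an assumed $55/barrel oil price (the price is an assumption).
oil_price = 55.0
permian_breakeven = 40.0
other_breakeven_low = permian_breakeven + 10.0   # $50, per the article
other_breakeven_high = permian_breakeven + 20.0  # $60, per the article

print(oil_price - permian_breakeven)    # 15.0 $/bbl margin in the Permian
print(oil_price - other_breakeven_low)  # 5.0 $/bbl in the best other fields
print(oil_price - other_breakeven_high) # -5.0, i.e. underwater elsewhere
```

At that assumed price the best Permian acreage earns three times the margin of the next-cheapest fields, which is consistent with the land-grab behavior described above.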

DARPA makes progress on paper and polymer drones that degrade in days after delivering cargo

DARPA is taking a page from Mission Impossible, which had messages on tape or disc that would self-destruct after being played. DARPA will make drones of polymers or paper that disappear (deconstruct) after delivering a 3-pound cargo.

DARPA's ICARUS program is developing autonomous air-delivery vehicles capable of delivering a 3-pound payload intact to within 10 meters of a GPS-programmed location. Within hours of payload delivery, the vehicle, which should be no more than 3 meters in its longest dimension, must physically vanish. The 26-month program will culminate in a final government field test of fully vanishing, precision air-delivery prototypes.

The primary technical categories of the ICARUS program are aerodynamics and materials. It will take creative aerodynamic design and materials engineering to minimize the capability tradeoffs that the interplay of these two arenas is likely to require. For example, engineering materials that are stable enough to meet flight specifications, yet unstable enough to meet the vanishing requirement, is a tall order.

Within a larger context, the ICARUS program addresses the fundamental question of whether large, functional structures can be deliberately designed to disappear soon after their mission is completed. If this capability can be developed, it could have impacts in many core areas where a leave-behind would have environmental and/or unintended logistical consequences.


DARPA has now provided the San Francisco-based research team at Otherlab with funding to build what would surely be the most tech-savvy paper plane ever to take to the skies.

The Aerial Platform Supporting Autonomous Resupply Actions (APSARA) systems are heavy-duty cardboard gliders that can be deployed by the hundreds from an aircraft like a C-17 cargo plane. Star Simpson, hardware developer on the project, says they can then glide up to around 55 mi (88 km) from the drop point before circling in and making a precise landing with the cargo in tow.

"We have done tests releasing our aircraft from 1,000 ft (304 mt) and proved their ability to turn at waypoints and to land within close range of a specific location," Simpson tells New Atlas.

Once the goods have arrived, the drones biodegrade in a matter of days. And because each is a glider without motors or rotors, all of the onboard electronics, courtesy of DARPA's VAPR program, vanish with it.

The current models carry 1 kg (2.2 lb) but should scale up easily to carrying 10 kg (22 lb).








January 19, 2017

Creating atomic scale graphene nanoribbons

Silicon crystals are the semiconductors most commonly used to make transistors, which are critical electronic components used to carry out logic operations in computing. However, as faster and more powerful processors are created, silicon has reached a performance limit: the faster it conducts electricity, the hotter it gets, leading to overheating.

Graphene, made of a single-atom-thick sheet of carbon, stays much cooler and can conduct much faster, but it must be cut into smaller pieces, called nanoribbons, in order to act as a semiconductor. Despite much progress in the fabrication and characterization of nanoribbons, cleanly transferring them onto surfaces used for chip manufacturing has been a significant challenge.

A recent study conducted by researchers at the Beckman Institute for Advanced Science and Technology at the University of Illinois and the Department of Chemistry at the University of Nebraska-Lincoln has demonstrated the first important step toward integrating atomically precise graphene nanoribbons (APGNRs) onto nonmetallic substrates. The paper, "Solution-Synthesized Chevron Graphene Nanoribbons Exfoliated onto H:Si(100)," was published in Nano Letters.



Researchers have made the first important step toward integrating atomically precise graphene nanoribbons (APGNRs) onto nonmetallic substrates.
Credit: Adrian Radocea, Beckman Institute for Advanced Science and Technology


Nano Letters - Solution-Synthesized Chevron Graphene Nanoribbons Exfoliated onto H:Si(100)

Chip-sized, high-speed terahertz modulator raises possibility of faster data transmission

Tufts University engineers have invented a chip-sized, high-speed modulator that operates at terahertz (THz) frequencies and at room temperature at low voltages without consuming DC power. The discovery could help fill the “THz gap” that is limiting development of new and more powerful wireless devices that could transmit data at significantly higher speeds than currently possible.

Measurements show the modulation cutoff frequency of the new device exceeded 14 gigahertz and has the potential to work above 1 THz, according to a paper published online today in Scientific Reports. By contrast, cellular networks occupy bands that are much lower on the spectrum where the amount of data that can be transmitted is limited.

The device works through the interaction of confined THz waves in a novel slot waveguide with tunable, two-dimensional electron gas. The prototype device operated within the frequency band of 0.22-0.325 THz, which was chosen because it corresponded to available experimental facilities. The researchers say the device would work within other bands as well.

Confined terahertz waves interact with tunable, two-dimensional electron gas in a novel slot waveguide. Credit: Nano Lab, Tufts University School of Engineering.

Nature Scientific Reports - High Speed Terahertz Modulator on the Chip Based on Tunable Terahertz Slot Waveguide

Climate scientists know that emissions cannot be reduced fast enough to prevent more than 2 degrees of global change, so will geoengineering get serious attention

In 2008 some climate scientists said that the world had 100 months to enact drastic anti-global-warming policies to avoid the environment warming by two degrees Celsius compared to pre-industrial times. The 100 months have passed, only modest policies have been enacted, and it seems likely that some of those policies will be reversed with more use of fossil fuels.

Global temperatures have already risen by 1 degree celsius compared to 1880.

What scale of change are we looking at to stay below 2C? Being optimistic about what might be achieved in terms of saving forests from being cut down and cleaning up industry, especially the production of steel and cement, Anderson estimates that the world can afford to emit around 650 billion tonnes of carbon dioxide in total from energy systems. Currently, the world pumps out about 36 billion tonnes every year. Starting from today, and assuming that poorer and industrialising nations see a peak in their emissions from energy use by 2025 and go zero carbon by 2050, Anderson calculates that this leaves a rich country such as the UK with the challenge of cutting its emissions by around 13% per year.


Emissions have been going up fairly constantly. There has not even been a flattening of emissions. It is likely that the 650 billion ton budget will be used in 15-20 years.
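The budget arithmetic behind that 15-20 year estimate is simple to reproduce. Applying the 13% cut rate globally in the second calculation is a simplifying assumption of this sketch; the article cites that rate for rich countries like the UK:

```python
# Carbon budget arithmetic from the figures above.
budget_gt = 650            # remaining CO2 budget for ~2C (energy systems)
emissions_gt_per_year = 36 # current annual emissions

years_at_current_rate = budget_gt / emissions_gt_per_year
print(round(years_at_current_rate, 1))  # 18.1 years at a flat rate

# If emissions instead fell 13% every year (a simplifying assumption
# applied globally), cumulative emissions form a geometric series
# summing to first_year / cut_rate:
cumulative_gt = emissions_gt_per_year / 0.13
print(round(cumulative_gt))             # 277 Gt, well under the budget
```

So a flat emissions path exhausts the budget in under two decades, while a sustained 13% annual decline would stay within it, which is why such steep cut rates are being demanded.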

The climate scientists were merely asking for about $100 trillion to be spent on climate mitigation between now and 2100. It is amazing that they are shocked and disappointed that the world has not enacted that program.

Effective geoengineering could be performed at about 100 to 1000 times lower cost.

Lightweight Car production with disruptive 3D print process

Engineers at The University of Nottingham are developing lightweight automotive components using new additive manufacturing processes to boost vehicle fuel efficiency, while cutting noise and CO2 emissions.

The components will be constructed using selective laser melting (SLM). SLM uses a 3-Dimensional Computer Aided Design (CAD) model to digitally reproduce the object in a number of layers.

Each layer is sequentially recreated by melting sections of a bed of aluminium alloy powder using a laser beam. Layer by layer, the melted particles fuse and solidify to form novel structures that can be made up from complex lattices to provide a light-weight component.
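The layer count involved can be illustrated with a toy calculation; the part height and the 40-micron layer thickness are illustrative values, not figures from the article:

```python
# Toy sketch of the layer-by-layer idea behind SLM: slice a part's
# height into fixed-thickness layers, each of which the laser melts
# from the aluminium alloy powder bed. Numbers are illustrative.
part_height_mm = 50.0
layer_thickness_um = 40.0  # an assumed typical SLM layer thickness

layers = int(part_height_mm * 1000 / layer_thickness_um)
print(layers)              # 1250 laser passes to build a 50 mm part
```

The thousands of thin layers are what let SLM form the internal lattices mentioned above, geometries that cannot be cast or machined.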



Russia developing hypersonic weapons and expects breakthroughs in combat laser and electromagnetic weapons

Russia is developing hypersonic weapons by using new materials, Russian Deputy Defense Minister Yuri Borisov said on Thursday.

"Coming next are hypersonic weapons, which require the use of principally new materials and control systems that operate in a completely different medium, in plasma," the deputy defense minister said.

Today the Army is at the stage of a new scientific and technical revolution, and fundamentally new armament systems, based on physical principles never before used in this field, are coming to replace existing systems, the deputy defense minister said.

Russia expects a serious breakthrough in the field of laser and electromagnetic weapons.

The United States is currently leading the race to develop combat lasers and electromagnetic railguns.



The USA had a lead in developing hypersonic missiles, but Russia and China may be closer to deploying weapons.

Graphene superconductivity activated which could enable new electronic devices and faster computers

Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor - meaning that it can be made to carry an electrical current with zero resistance.

The finding, reported in Nature Communications, further enhances the potential of graphene, which is already widely seen as a material that could revolutionise industries such as healthcare and electronics. Graphene is a two-dimensional sheet of carbon atoms and combines several remarkable properties; for example, it is very strong, but also light and flexible, and highly conductive.

Since its discovery in 2004, scientists have speculated that graphene may also have the capacity to be a superconductor. Until now, superconductivity in graphene has only been achieved by doping it with, or by placing it on, a superconducting material - a process which can compromise some of its other properties.

But in the new study, researchers at the University of Cambridge managed to activate the dormant potential for graphene to superconduct in its own right. This was achieved by coupling it with a material called praseodymium cerium copper oxide (PCCO).

Superconductors are already used in numerous applications. Because they generate large magnetic fields they are an essential component in MRI scanners and levitating trains. They could also be used to make energy-efficient power lines and devices capable of storing energy for millions of years.

Superconducting graphene opens up yet more possibilities. The researchers suggest, for example, that graphene could now be used to create new types of superconducting quantum devices for high-speed computing. Intriguingly, it might also be used to prove the existence of a mysterious form of superconductivity known as "p-wave" superconductivity, which academics have been struggling to verify for more than 20 years.



Nature Communications - p-wave triggered superconductivity in single-layer graphene on an electron-doped oxide superconductor

Tri Alpha Energy spending $500 million to develop commercial fusion by 2027

Michl Binderbauer is chief technology officer for a startup called Tri Alpha Energy that is making a $500 million bet on fusion. Tri Alpha is the largest of about a dozen startups trying to make it work.

Tri Alpha is taking a different approach from the international ITER tokamak project. Tri Alpha will not build a huge structure. The idea is to fire two football-shaped plasma clouds at each other at supersonic speeds.

At the center of the chamber, they collide violently, fusing into a larger football. Additional particles are fired at right angles, making the plasma ball spin like a well-thrown pass.

They are testing constantly, sometimes 50 times a day. Each shot requires about 20 megawatts of electricity, enough to power all the lights and appliances in 5,000 homes, but only for a few thousandths of a second.
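The energy per shot implied by those figures is modest; the 5 ms pulse length below is an assumption, since the article says only "a few thousandths of a second":

```python
# Energy per shot implied by the figures above: 20 MW drawn for a
# few milliseconds. The exact pulse length is an assumption.
power_w = 20e6
pulse_s = 5e-3                  # assume a 5 ms pulse
energy_j = power_w * pulse_s
print(f"{energy_j / 1e3:.0f} kJ per shot")  # 100 kJ per shot

# The 5,000-homes comparison implies a per-home draw of:
print(power_w / 5000)           # 4000.0 W, a plausible household peak
```

A hundred kilojoules per shot is roughly the energy in a few grams of gasoline, so even 50 shots a day is a small power bill; the 20 MW figure is about peak draw, not consumption.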

Gleaning data from a hot ball of nothing that lasts for much, much less than the blink of an eye requires a lot of clever testing tools.

Binderbauer believes a commercial fusion system will be available in a decade.

The funding was already known when Nextbigfuture covered Tri Alpha nine months ago. A 2027 commercialization target is a slip from prior plans.

Tri Alpha Energy, a nuclear fusion startup, has raised $500 million. Tri Alpha’s setup borrows some of the principles of high-energy particle accelerators, such as the Large Hadron Collider, to fire beams of plasma into a central vessel where the fusion reaction takes place. Last August the company said it had succeeded in keeping a high-energy plasma stable in the vessel for five milliseconds—an infinitesimal instant of time, but enough to show that it could be done indefinitely. Since then that time has been upped to 11.5 milliseconds.

The next challenge is to make the plasma hot enough for the fusion reaction to generate more energy than is needed to run it. How hot? Something like 3 billion °C, or 200 times the temperature of the sun’s core. No metal on Earth could withstand such a temperature. But because the roiling ball of gas is confined by a powerful electromagnetic field, it doesn’t touch the interior of the machine.

The photos seen here were taken a few days before Tri Alpha began dismantling the machine to build a much larger and more powerful version that will fully demonstrate the concept. That could lead to a prototype reactor sometime in the 2020s.

Tri Alpha Energy was in stealth mode for many years but now has its own website.

Compact Toroidal injector test stand



The C2U is the world's largest compact toroid device: 20 meters in length and 1.4 meters in diameter. Magnetic fields of 3.5 tesla deliver 1 megajoule in microseconds, forming and accelerating compact toroids to 600,000 kilometers per hour.






Tri Alpha’s machine produces a doughnut of plasma, but in it the flow of particles in the plasma produces all of the magnetic field holding the plasma together. This approach, known as a field-reversed configuration (FRC), has been known since the 1960s. But despite decades of work, researchers could get the blobs of plasma to last only about 0.3 milliseconds before they broke up or melted away. In 1997, the Canadian-born physicist Norman Rostoker of the University of California, Irvine, and colleagues proposed a new approach. The following year, they set up Tri Alpha, now based in an unremarkable—and unlabeled—industrial unit here. Building up from tabletop devices, by last year the company was employing 150 people and was working with C-2, a 23-meter-long tube ringed by magnets and bristling with control devices, diagnostic instruments, and particle beam generators. The machine forms two smoke rings of plasma, one near each end, by a proprietary process and fires them toward the middle at nearly a million kilometers per hour. At the center they merge into a bigger FRC, transforming their kinetic energy into heat.

Previous attempts to create long-lasting FRCs were plagued by the twin demons that torment all fusion reactor designers. The first is turbulence in the plasma that allows hot particles to reach the edge and so lets heat escape. Second is instability: the fact that hot plasma doesn’t like being confined and so wriggles and bulges in attempts to get free, eventually breaking up altogether. Rostoker, a theorist who had worked in many branches of physics including particle physics, believed the solution lay in firing high-speed particles tangentially into the edge of the plasma. The fast-moving incomers would follow much wider orbits in the plasma’s magnetic field than native particles do; those wide orbits would act as a protective shell, stiffening the plasma against both heat-leaking turbulence and instability.





SOURCES - PBS, Technology Review, Physics of Plasmas, Trialpha Energy, Youtube, Science




We only stop or reduce the use of materials when we swap in something better, or when something toxic is banned

While some scientists believe that the world can achieve significant dematerialization through improvements in technology, a new MIT-led study finds that technological advances alone will not bring about dematerialization and, ultimately, a sustainable world.

Researchers found that no matter how much more efficient and compact a product is made, consumers will only demand more of that product and in the long run increase the total amount of materials used in making that product.

Take, for instance, one of the world’s fastest-improving technologies: silicon-based semiconductors.

Over the last few decades, technological improvements in the efficiency of semiconductors have greatly reduced the amount of material needed to make a single transistor. As a result, today’s smartphones, tablets, and computers are far more powerful and compact than computers built in the 1970s.

Nonetheless, the researchers find that consumers’ demand for silicon has outpaced the rate of its technological change, and that the world’s consumption of silicon has grown by 345 percent over the last four decades. As others have found, by 2005, there were more transistors used than printed text characters.
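A 345 percent increase means consumption ended at 4.45 times its starting level. Assuming "four decades" means exactly 40 years (an assumption; the article gives no precise span), the implied compound annual growth rate is:

```python
# The 345% figure above corresponds to a 4.45x increase in silicon
# consumption. "Four decades" is taken as exactly 40 years, which is
# an assumption for illustration.

growth_factor = 1 + 3.45   # +345% means 4.45x the starting level
years = 40                 # assumed span for "four decades"

cagr = growth_factor ** (1 / years) - 1
print(f"implied growth ~ {cagr:.1%} per year")  # ~3.8% per year
```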

“Despite how fast technology is racing, there’s actually more silicon used today, because we now just put more stuff on, like movies, and photos, and things we couldn’t even think of 20 years ago,” says Christopher Magee, a professor of the practice of engineering systems in MIT’s Institute for Data, Systems, and Society. “So we’re still using a little more material all the time.”



The researchers found similar trends in 56 other materials, goods, and services, from basic resources such as aluminum and formaldehyde to hardware and energy technologies such as hard disk drives, transistors, wind energy, and photovoltaics. In all cases, they found no evidence of dematerialization, or an overall reduction in their use, despite technological improvements to their performance.

“There is a techno-optimist’s position that says technological change will fix the environment,” Magee observes. “This says, probably not.”

In their research, Magee and Devezas examined whether the world’s use of materials has been swayed by an effect known as Jevons’ Paradox. In 1865, the English economist William Stanley Jevons observed that as improvements to coal-fired steam engines reduced the price of coal, England’s consumption of coal actually increased.

While experts believed technological improvements would reduce coal consumption, Jevons countered the opposite was true: Improving coal-fired power’s efficiency would only increase consumer demand for electricity and further deplete coal reserves.

The researchers’ model indicates that dematerialization is more likely when demand elasticity for a product is relatively low and the rate of its technological improvement is high. But when they applied the equation to common goods and services used today, they found that demand elasticity and technological change worked against each other — the better a product was made to perform, the more consumers wanted it.
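The tradeoff the researchers describe can be illustrated with a toy exponential model. This is NOT the authors' actual equation, just a minimal sketch: per-unit material content falls at the technological improvement rate, price tracks material content, and demand responds to price with a constant elasticity.

```python
import math

# Toy version of the elasticity-vs-improvement tradeoff described
# above -- NOT the MIT study's actual model. Per-unit material falls
# at rate g (technological improvement), price tracks material
# content, and demand rises with elasticity eps as price falls.

def total_material(t, g=0.05, eps=1.5, m0=1.0, d0=1.0):
    per_unit = m0 * math.exp(-g * t)     # efficiency improvement
    demand = d0 * math.exp(eps * g * t)  # cheaper -> more demanded
    return per_unit * demand             # net use: exp((eps - 1) * g * t)

# With elasticity above 1, total use rises despite efficiency gains
# (Jevons' Paradox); below 1, dematerialization wins:
print(total_material(40, eps=1.5))  # > 1: total material use grew
print(total_material(40, eps=0.5))  # < 1: total material use shrank
```

In this sketch the sign of (eps − 1) decides everything, which matches the study's qualitative finding: low demand elasticity favors dematerialization, while high elasticity lets demand outrun efficiency.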

“It seems we haven’t seen a saturation in demand,” Magee says. “People haven’t said, ‘That’s enough,’ at least in anything that we can get data to test for.”

In follow-up work, the researchers were eventually able to identify six cases in which an absolute decline in materials usage has occurred. However, these cases mostly include toxic chemicals such as asbestos and thallium, whose dematerialization was due not to technological advances, but to government intervention.

There was one other case in which researchers observed dematerialization: wool. The material’s usage has significantly fallen, due to innovations in synthetic alternatives, such as nylon and polyester fabrics. In this case, Magee argues that substitution, and not dematerialization, has occurred. In other words, wool has simply been replaced by another material to fill the same function.
