December 06, 2008

Computers, Robotic and Communication Developments to Watch in 2009 and a Little Beyond

The technological and other developments to watch are expanding to four parts:
1. Computers, robots, electronics and communication
2. Energy and transportation
3. DNA/biotech/synthetic biology, nanotechnology
4. Medicine, life extension, space, manufacturing and anything else that was not covered

Dwave has a 128 qubit quantum computer chip at the end of 2008. In early 2009, this chip will continue testing and will either prove to be the design that Dwave chooses to scale, or Dwave will go through more design revisions. So sometime between Q1 and Q3 2009, Dwave will press on to scaling a successful 128 qubit design. Dwave should end 2009 with somewhere between a 512 and 8000 qubit chip.

Can they impact important niches and drive science?
Can they impact important business niches and generate a big business, so they can afford more R&D and thus gain the momentum to fund an impact on science?
Their research should also answer how much of a speedup they expect from this and more highly connected designs as they scale up qubits and make other design changes.
2009 could provide key clarification of the timing, level and breadth of the impact of quantum computers. Note: there are many other approaches to making quantum computers, such as trapped ion QC, spin-based QC and many others.

Trapped ion quantum computers could achieve a near term 300 qubit design.

Experimental methods for laser-control of trapped ions have reached sufficient maturity that it is possible to set out in detail a design for a large quantum computer based on such methods, without any major omissions or uncertainties. The main features of such a design are given, with a view to identifying areas for study. The machine is based on 13,000 ions moved via 20 micron vacuum channels around a chip containing 160,000 electrodes and associated classical control circuits; 1,000 laser beam pairs are used to manipulate the hyperfine states of the ions and drive fluorescence for readout. The computer could run a quantum algorithm requiring 10^9 logical operations on 300 logical qubits, with a physical gate rate of 1 MHz and a logical gate rate of 8 kHz, using methods for quantum gates that have already been experimentally implemented.
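As a sanity check on the quoted figures, the implied run time and the physical-to-logical gate overhead can be computed directly (a simple inference from the numbers above, not from the paper itself):

```python
# Sanity check of the trapped-ion design figures quoted above.
logical_ops = 1e9          # logical operations in the target algorithm
logical_gate_rate = 8e3    # logical gate rate, 8 kHz
physical_gate_rate = 1e6   # physical gate rate, 1 MHz

runtime_s = logical_ops / logical_gate_rate
print(f"run time: {runtime_s:.0f} s = {runtime_s / 86400:.1f} days")  # 125000 s = 1.4 days

# Ratio of physical to logical gate rate: error-correction overhead per logical gate
overhead = physical_gate_rate / logical_gate_rate
print(f"physical cycles per logical gate: {overhead:.0f}")            # 125
```

So the headline algorithm would take on the order of a day and a half of continuous operation.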

2. Mobile Broadband at 42-80 Mbps in 2009 and 100-250 Mbps in 2010

Mobile broadband could hit 42 megabits per second in 2009 via evolved HSPA and improved encoding.

We go to the modulation that is 64 QAM, that's 64 combinations of information in the same slot as one piece of information. MIMO, multiple in multiple out, is multiple radios on a device, this is like Wi-Fi uses with the N standard. With MIMO we can go from 14Mb/sec to 28Mb/sec. We can then combine them to get 42Mb/sec. We can probably squeeze that to 80Mb/sec, and that's before we even get to Long Term Evolution (LTE).
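The quoted speed steps follow from simple multiplication, assuming the baseline is 16-QAM (4 bits/symbol) moving to 64-QAM (6 bits/symbol), with 2x2 MIMO doubling the stream count; this mapping is an assumption, not stated in the quote:

```python
# Multiplicative view of the HSPA+ speed steps described above
# (assumed mapping: 16-QAM -> 64-QAM raises bits/symbol from 4 to 6,
#  and 2x2 MIMO doubles the number of spatial streams).
base_mbps = 14.0        # HSPA baseline downlink, Mb/s
qam_gain = 6 / 4        # 64-QAM vs 16-QAM bits per symbol
mimo_streams = 2        # 2x2 MIMO

print(base_mbps * mimo_streams)              # MIMO alone: 28.0 Mb/s
print(base_mbps * qam_gain * mimo_streams)   # combined: 42.0 Mb/s
```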

Japan's NTT is launching LTE (called Super 3G) in 2010 and will have 100+ Mbps. 250 Mbps on the downlink has been reached in LTE tests.

There is more 4G speed after that, reaching above 1 Gbps.

3. Can Wimax, white space modems, 60 GHz wireless or free space optics make a big impact in 2009?

The USA is using slower mobile phone technologies than in Asia and Europe so that could still provide a window for Wimax, white space modems and other technologies.

4. Mobile phones of 2012 and 2009

2012: 12-20 MP phone cameras, HD video recording, 100-250 Mbps connectivity and 1 GHz processors

2009: 8 MP cellphone cameras; 12.5 MP camera sensors were already demonstrated in 2008

Nokia's 2009 cellphone roadmap has 8 MP cameras and much more.

Nokia Corolla
* 3" display with a VGA resolution
* An 8 megapixel camera (a first for Nokia)
* A “half QWERTY keyboard”
* aGPS & Wi-Fi connectivity
* HSPA support
* 8GB of internal memory

Nokia Ivalo
* 3.5" display
* Wi-Fi connectivity
* 5 megapixel camera
* Bluetooth connectivity
* TV out
* FM transmitter and receiver
* 32GB of internal memory

Handheld projectors could be made to interoperate with phones, eventually be reduced further in size and power requirements, and get integrated into the phone.

The local (wifi range) connection speed could leap to 5 Gbps with the integration of the new and cheap 60 GHz chip from Taiwan.

5. Will a new, superior form of computer memory make a breakthrough?

Most likely: low market share as production starts and scales up, with niche markets first, since early versions have limitations in performance and capacity.

Leading candidates:
Programmable metallization cell, also called conductive-bridging RAM (CBRAM); NEC has a variant called “Nanobridge” and Sony calls its version “electrolytic memory”.

Memristor-based RAM is being targeted for 2009 [unknown capacity, but it should have energy efficiency and other advantages for at least niche market success].

Nanochip is targeting 100 GB to 1 terabyte devices for 2010 at lower cost than Flash; it leverages some Flash technology and has Intel backing. In 2009, see if they announce that they are on track for a high volume splash in 2010.

MRAM will be at 16 MB in 2009, and some companies are claiming it will be competitive with Flash by 2015.

6. Will memristors make FPGAs a major force and have other impacts?

From MIT Technology Review: memristors could vastly improve one type of processing circuit, called a field-programmable gate array, or FPGA. By replacing several specific transistors with a crossbar of memristors, we showed that the circuit could be shrunk by nearly a factor of 10 in area and improved in terms of its speed relative to power-consumption performance. Right now, we are testing a prototype of this circuit in our lab.

And memristors are by no means hard to fabricate. The titanium dioxide structure can be made in any semiconductor fab currently in existence. (In fact, our hybrid circuit was built in an HP fab used for making inkjet cartridges.) [NOTE: this also means that printable electronics might target memristor based devices] Emulating the behavior of a single memristor requires a circuit with at least 15 transistors and other passive elements. The implications are extraordinary: just imagine how many kinds of circuits could be supercharged by replacing a handful of transistors with one single memristor.

- The wires and switches can be made very small: we should eventually get down to a width of around 4 nm, and then multiple crossbars could be stacked on top of each other to create a ridiculously high density of stored bits.
- Memristor behavior is similar to that of synapses. Right now, Greg is designing new circuits that mimic aspects of the brain. The neurons are implemented with transistors, the axons are the nanowires in the crossbar, and the synapses are the memristors at the cross points. A circuit like this could perform real-time data analysis for multiple sensors.
[so big sensor and AI impacts]
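To give the "ridiculously high density" a rough number: at the 4 nm wire width mentioned above, a crossbar density estimate works out as follows (the 8 nm pitch and 10-layer stack are illustrative assumptions, not figures from the article):

```python
# Ballpark stored-bit density for a 4 nm memristor crossbar.
# Assumptions (not from the article): wire width equals spacing,
# giving an 8 nm pitch; one bit per crosspoint; 10 stacked layers.
pitch_cm = 2 * 4 * 1e-7             # 8 nm pitch, expressed in cm
bits_per_cm2 = 1 / pitch_cm ** 2    # one bit per pitch-by-pitch cell, per layer
layers = 10

print(f"{bits_per_cm2:.2e} bits/cm^2 per layer")      # ~1.56e12
print(f"{bits_per_cm2 * layers / 8e12:.1f} TB/cm^2")  # ~2 TB with 10 layers
```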

There are already FPGA desktop supercomputers that do not leverage the memristor benefits.

7. Many core processors and advanced GPGPU will fight for the top end personal systems

Nvidia Tesla GPGPU line, AMD Firestream GPGPU and Intel's many core Larrabee chips should be watched for the leading mainstream edge.

Tensilica is a company to watch for configurable processors and potential supercomputer breakthroughs

8. HP is targeting 2009 for its first optical interconnect technology in products, harnessing its expertise in nanoimprint lithography to fashion low-cost, high-speed silicon photonic devices.

HP Labs also showed how its optical bus could harness nanoimprint lithography to fashion cheap plastic waveguides, micro-lenses and beamsplitters. Its first demonstration was of a 10-bit-wide optical data bus that used just 1 mW of laser power to interconnect eight different modules at 10 Gbps/channel for an aggregate bandwidth of over 250 Gbps.

"What we are working towards now are novel optical connections, such as board-to-board connections using a photonic bus that enables us to replace an 80W chip that performs the electronic switching function today with a moulded piece of plastic," said Morris.

Most photonic interconnects use vertical cavity surface-emitting lasers, but HP Labs also showed inexpensive methods of eliminating the need for expensive gallium arsenide chips, using plasmonic LEDs that could cut costs, and a silicon ring resonator that it hopes to fashion with imprint lithography.

"HP Labs has already demonstrated one of the world's smallest and lowest power silicon ring resonators. Now we want to show how to do it with nanoimprint lithography because a dense pattern that takes 60 hours to create with e-beam lithography could take only 30 minutes for nanoimprint lithography," Morris claimed.

HP contends that its photonic interconnects are poised for commercialisation.

9. When will optical computers break out?

The work of Nader Engheta, a professor at the University of Pennsylvania, provides "a vision, consisting of building blocks, along with instructions on how to arrange them together to enable transplanting well-known passive inductor-capacitor-resistor [LCR] electrical networks to the optical domain." This includes the direct optical realization of filters, antennas, power-distribution networks, microwave transmission-line metamaterials and many more. He has theories about creating equivalent optical versions of all of the components of electrical computers. Some of the components would require new types of not-yet-created metamaterials to be developed.

Transformation optics is a field of optical and material engineering and science embracing nanophotonics, plasmonics, and optical metamaterials.

Transformation optics may enable invisibility, ultra-powerful microscopes and optical computers by harnessing nanotechnology and "metamaterials."

Terabit internet, photonic chips and optical computers are the goals of the Centre for Ultrahigh bandwidth Devices for Optical Systems (CUDOS).

MIT has several approaches for all optical computers which they hope to realize by 2012.

All-optical computers could approach the theoretical speed of a photonic switch, which is estimated to be on the order of a petahertz (10^15 Hz). They should definitely achieve multi-terahertz speeds. So, 1000 to 1 million times faster than current computers at 4 gigahertz (4 × 10^9 Hz).
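Taking the quoted switching speeds at face value, the speed-up ratios work out as follows (note the "million times" upper end would require a switch a few petahertz fast, rather than exactly 1 PHz; the 4 THz figure is an assumed stand-in for "multi-terahertz"):

```python
# The speed-up range quoted above, made explicit.
current_hz = 4e9      # ~4 GHz conventional processor
multi_thz = 4e12      # "multi-terahertz" optical switching (assumed 4 THz)
petahertz = 1e15      # theoretical photonic switch limit, order of magnitude

print(multi_thz / current_hz)   # 1000.0
print(petahertz / current_hz)   # 250000.0
```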

10. Robotics
Heartland Robotics could revolutionize low-cost industrialization.

Prosthetics and robotics possibilities.

Other robotics companies

Small pill-sized to bacteria-sized robots

11. Category disrupting technology

Red digital video and still cameras

Memjet printers

12. Other research to monitor

A promising molecular computer approach

Avogadro scale computing and programmable matter work.

Brain Emulation

Intel's all-digital radios and claytronics catoms approaching micron size (currently millimeter size)

Graphene computer memory and electronics and graphene based technology in general

December 04, 2008

Loss of Light Prevented in Metamaterials

Researchers have solved one of the significant remaining challenges with photonic “metamaterials,” discovering a way to prevent the loss of light as it passes through these materials, and opening the door to many important new optical, electronic and communication technologies.

Photonic metamaterials are engineered composite materials with unique electromagnetic properties, and have attracted significant research interest in recent years due to their potential to create “negative index” materials that bend light the opposite way of anything found in the natural world. But their performance has been significantly limited by the absorption of light by metals that are part of their composition – metal might absorb much more than 50 percent of the light shined on it, and drastically reduce the performance of devices based on these materials.

The solution to this problem, researchers discovered, is to offset this lost light by adding an optical “gain” to a dielectric adjacent to the metal. The new publication outlines how to successfully do that, and demonstrates the ability to completely compensate for lost light. It had been theorized that this might be possible, the researchers said, but it had never before been done, and the theories themselves were the subject of much scientific debate.

As such, this may have removed a final roadblock, making possible “a number of dreamed-about applications.”

Stimulated Emission of Surface Plasmon Polaritons

We have observed laserlike emission of surface plasmon polaritons (SPPs) decoupled to the glass prism in an attenuated total reflection setup. SPPs were excited by optically pumped molecules in a polymeric film deposited on the top of a silver film. Stimulated emission was characterized by a distinct threshold in the input-output dependence and narrowing of the emission spectrum. The observed stimulated emission and corresponding compensation of the metallic absorption loss by gain enables many applications of metamaterials and nanoplasmonic devices.

Eric Drexler's New Blog Metamodern and Belated H+ E-Magazine

Eric Drexler, the visionary of molecular nanotechnology and mechanosynthesis, has started a new blog Metamodern: The Trajectory of Technology.

Metamodern isn’t intended to be “a blog about nanotechnology”; its scope includes broader issues involving technologies with world-changing potential. For example, looking well downstream in technology development, I will sketch the requirements for large-scale systems able to restore the atmosphere to its pre-industrial composition. Closer to hand, social software and the computational infrastructure of our society are high on the list.

It should be a must-read for futurists and people interested in making molecular nanotechnology happen, as Eric Drexler takes a systems view and will "suggest research objectives that seem practical, valuable, and ready for serious pursuit". Many of these opportunities could lead toward the goal of advanced nanotechnology.

Sample articles:
1. Modular molecular composite nanosystems: The concept of “modular molecular composite nanosystems” (MMCS) describes an approach to building complex, self-assembled structures that can organize functional components.

The idea is to exploit the properties of structural DNA nanotechnology (in particular, DNA origami [pdf]) to provide an easy-to-design framework, together with complementary properties of folded polymeric molecular objects (“foldamers”, and in particular, products of protein engineering) to bind and organize functional components in precise 2- and 3-D configurations.

2. A DNA Imaging Bottleneck: the scarcity of cryo-electron tomography capability is a problem.

3. Peptoids at the Molecular Foundry. Peptoids have potential for complementing biomolecules in building composite nanosystems.

- Peptoid synthesis is uncommonly flexible and straightforward.
- The field has progressed beyond making only small, floppy molecules. There are now prototypes of protein-like peptoids built up from helical secondary structures loosely analogous to the alpha helices of the peptide world.
- The limit today is design, not fabrication, and pushing back this limit will require a partnership that links scientific exploration to software development.

4. Combining molecular signals

5. Nanoplasmonics

There is also the new H+ magazine, which covers Humanity Plus/Transhumanist Technology and Society.

Political Coup Attempt Canada-style

Canada has an interesting attempt at a political coup.

Three disparate opposition parties—the centrist Liberals, the socialist New Democrats (NDP) and the separatist Bloc Québécois—have ganged up in order to oust the minority Conservative government and replace them with a centre-left coalition. On Thursday December 4th Prime Minister Harper asked Michaëlle Jean, who as governor-general acts as Canada’s head of state, to suspend Parliament until January. In January, the Conservatives will return with a new proposed budget.

1. The Governor General will not take responsibility for this unprecedented change in government and will punt by calling elections. [This assumes the next item: the Tory government is brought down when it tables the budget, which would be a confidence vote, as all money bills are. The other options are for the coalition to chicken out and work with the Tories, since they all look bad fighting for power during an economic crisis, or for the coalition to fall apart and the minority government to continue.]
2. Parliament has been suspended into January. There will be a new budget then and both sides will campaign until the showdown. The no-confidence motion will pass and then there will be new elections. 5:1 odds that there will be a new election.
3. The Liberals will accelerate a change to replace the unpopular Dion. 60:40 odds. It is the smart move, but that does not mean the Liberals will pull it off.
4. If the Lib/NDP/PQ have a combined majority then they will have their new government.

UPDATE: The odds of the coalition falling apart or chickening out are increasing. There are reports of Liberals abandoning the coalition and calling for the replacement of Liberal leader Dion. The coalition has to hold together for two months, vote down a budget, bring and pass a non-confidence vote, and then either win an election or convince the Governor General (for decades a traditional, ceremonial rubber-stamp role) to give them power.

5. The election will be close, but the Conservatives can get their act together enough to win a small majority. This is actually more a bet that the Liberals, NDP and PQ will be more incompetent. 60:40 odds of Conservative getting their majority.

According to an Angus Reid poll for CTV, 64 percent of Canadians do not support Stephane Dion becoming prime minister in a coalition government, but 53 percent are against the Conservatives' current economic policy.

Poll results, by statement (Agree / Disagree / Not sure):

* I would be comfortable with Stéphane Dion becoming Canada’s prime minister: 25% / 64% / 10%
* I am worried about the Bloc Québécois becoming involved in the federal government: 57% / 30% / 13%
* The Conservative government has done a good job in dealing with the economic crisis: 36% / 53% / 10%
* The federal government should implement a stimulus package to boost the economy as soon as possible: 75% / 17% / 8%

Conservative strategy would be
1. Get Governor General to call the new election
2. Propose a stimulus package in the budget
3. Play on fears and dislike of Bloc Québécois and Dion (if Dion is still around)

There is discussion at Wikipedia about how the official Wikipedia article should characterize these events.

The wikipedia article about the Canadian parliamentary dispute/crisis

Only in Canada, eh, Pity or not Pity ?

December 03, 2008

Micro-Fusion For Space Propulsion and Weapons with Gigavolt Super Marx Generator, Proton Beams or Argon Ion Lasers

Micro-fusion work by Friedwardt Winterberg from the 1950s-1970s was recently declassified. Winterberg had several ideas for using micro-fusion without fission bomb triggers to generate nuclear energy or power spacecraft. Winterberg was proposing pure deuterium micro-explosions. H/T Crowlspace and J Friedlander.

Wikipedia entry for Friedwardt Winterberg.

Friedwardt Winterberg (born June 12, 1929) is a German-American theoretical physicist and research professor at the University of Nevada, Reno. With more than 260 publications and three books, he is known for his research in areas spanning general relativity, Planck scale physics, nuclear fusion, and plasmas. "His work in nuclear rocket propulsion earned him the 1979 Hermann Oberth Gold Medal of the Wernher von Braun International Space Flight Foundation [highest award in astronautical research] and in 1981 a citation by the Nevada Legislature." Winterberg is well-respected for his work in the fields of nuclear fusion and plasma physics, and Edward Teller has been quoted as saying that he had "perhaps not received the attention he deserves" for his work on fusion. He is known for his ideas which led to the development of GPS (the Global Positioning System), his fusion activism, and his first proposal to experimentally test Elsasser's theory of the geodynamo.

Standard fusion bombs use the Teller–Ulam design, which uses a fission bomb to trigger the fusion bomb.

The basic principle of the Teller–Ulam configuration is the idea that different parts of a thermonuclear weapon can be chained together in "stages", with the detonation of each stage providing the energy to ignite the next stage. At a bare minimum, this implies a primary section which consists of a fission bomb (a "trigger"), and a secondary section which consists of fusion fuel. Because of the staged design, it is thought that a tertiary section, again of fusion fuel, could be added as well, based on the same principle of the secondary. The energy released by the primary compresses the secondary through the concept of "radiation implosion", at which point it is heated and undergoes nuclear fusion.

Super Marx Generator Ignition for Nuclear Fusion Power
Winterberg proposed the use of a super Marx generator to ignite a pure deuterium thermonuclear micro-explosion. In a super Marx generator, N ordinary Marx generators charge up N fast capacitors (FC) to the voltage V; switched into series, these add up to the voltage NV. The proposed super Marx generator can reach what nature does in lightning: the high voltage in natural lightning is released over a distance of about 1 km, and the same is true for the super Marx generator.

A “super Marx generator” would ultimately reach gigavolt potentials with an energy output in excess of 100 megajoules. An intense 10-million-ampere GeV proton beam drawn from the super Marx generator can ignite a deuterium thermonuclear detonation wave in a compressed deuterium cylinder, where the strong magnetic field of the proton beam entraps the charged fusion reaction products inside the cylinder.
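The series-stacking arithmetic is simple; as an illustration with assumed per-stage numbers (the paper does not specify the stage count or per-stage voltage used here):

```python
# Series stacking in a super Marx generator: N stages charged in
# parallel to V deliver N * V when switched into series.
# Illustrative numbers only (assumed, not from the paper).
stage_v = 10e6             # assume 10 MV per Marx stage
target_v = 1e9             # gigavolt goal
stages = target_v / stage_v
print(f"stages needed: {stages:.0f}")      # 100

# A 100 MJ output at 1 GV corresponds to a modest total charge.
charge_c = 100e6 / target_v
print(f"charge delivered: {charge_c} C")   # 0.1 C
```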

While ignition without fission of a deuterium-tritium (DT) thermonuclear micro-explosion has not yet been achieved, ignition by a powerful laser beam seems possible in principle. But it is unlikely to lead to a practical inertial confinement nuclear fusion reactor, because the intense photon burst of a high-gain micro-explosion (required for an inertial confinement fusion reactor) destroys the laser. Ignition is also likely possible with intense GeV heavy ion beams, but there the stopping of the beam in the target is a problem. In either case, 80% of the energy released goes into the 14 MeV neutrons of the DT reaction.

DT inertial confinement fusion likely leads to a hybrid fusion-fission reactor, with fusion providing the neutrons and fission the heat. This favors inertial fusion designs that encapsulate the DT “pellet” in a U238 or Th232 shell.

Deuterium Micro-explosive Space Propulsion
For a propulsion system to transport large payloads with short transit times between different planetary orbits: a deuterium fusion bomb propulsion system is proposed where a thermonuclear detonation wave is ignited in a small cylindrical assembly of deuterium with a gigavolt, multi-megaampere proton beam, drawn from the magnetically insulated spacecraft acting in the ultrahigh vacuum of space as a gigavolt capacitor. Note: this paper was presented in part at the NASA-JPL-AFRL 2008 Advanced Space Propulsion Workshop.

With no deuterium-tritium (DT) micro-explosions yet ignited, the non-fission ignition of pure deuterium (D) fusion explosions seems a tall order. An indirect way to reach this goal is by staging a smaller DT explosion with a larger D explosion. There the driver energy, but not the driver itself, may be rather small.

Winterberg claims that the generation of GeV potential wells, made possible with magnetic insulation of conductors levitated in ultrahigh vacuum, has the potential to lead to order of magnitude larger driver energies. It is the ultrahigh vacuum of space by which this can be achieved. And if the spacecraft acting as a capacitor is charged up to GeV potentials, there is no need for its levitation.

The spacecraft is positively charged against an electron cloud surrounding the craft, and with a magnetic field of the order of 10,000 gauss, easily reached by superconducting currents flowing in an azimuthal direction, it is insulated against the electron cloud up to GeV potentials. The spacecraft and its surrounding electron cloud form a virtual diode with a GeV potential difference. To generate a proton beam, it is proposed to attach a miniature hydrogen-filled rocket chamber R to the deuterium bomb target, at the position where the proton beam hits the fusion explosive. A pulsed laser beam from the spacecraft is shot into the rocket chamber, vaporizing the hydrogen, which is emitted through the Laval nozzle as a supersonic plasma jet. If the nozzle is directed towards the spacecraft, a conducting bridge rich in protons is established between the spacecraft and the fusion explosive. Protons in this bridge are then accelerated to GeV energies, hitting the deuterium explosive. Because of the large dimensions of the spacecraft, the jet does not have to be aimed at the spacecraft very accurately.

Deuterium Micro-explosive Space Launch Systems
For a cost effective lifting of large payloads into earth orbit: the ignition is done by argon ion lasers driven by high explosives, with the lasers destroyed in the fusion explosion and becoming part of the exhaust.

If launched from the surface of the earth, one has to take into account the mass of the air entrained in the fireball. The situation resembles a hot gas driven gun, albeit one of rather poor efficiency. Assuming an efficiency of 10%, about 100 kiloton explosions would be needed to launch 1000 tons into orbit. It would be a cleaner and more public relations friendly version of the Orion Nuclear Pulsed Propulsion system.
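A back-of-envelope check shows the quoted launch figures are internally consistent, assuming the standard kiloton-to-joule conversion and a rough specific energy to reach low earth orbit (both assumptions, not figures from the source):

```python
# Back-of-envelope check of the launch figures above.
# Assumptions: 1 kt TNT = 4.184e12 J; ~33 MJ/kg to reach low earth orbit.
kt_to_j = 4.184e12
yield_j = 100 * kt_to_j        # ~100 kt of micro-explosions
efficiency = 0.10              # the assumed 10% coupling efficiency
payload_kg = 1000 * 1000       # 1000 tons of payload

usable_j = yield_j * efficiency
required_j = payload_kg * 33e6
print(f"usable: {usable_j:.2e} J, required: {required_j:.2e} J")
# ~4.18e13 J available vs ~3.3e13 J required, so the numbers roughly balance
```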

Winterberg suggests an ultraviolet argon ion laser be used as a trigger. However, since argon ion lasers driven by an electric discharge have low efficiency, he suggested a quite different way of pumping them, where the efficiency can be expected to be quite high.

He proposed to use a cylinder of solid argon, surrounded by a thick cylindrical shell of high explosive. If simultaneously detonated from the outside, a convergent cylindrical shockwave is launched into the argon. For the high explosive one may choose hexogen, with a detonation velocity of 8 km/s. In a convergent cylindrical shockwave the temperature rises as r^-0.4, where r is the distance from the axis of the cylindrical argon rod. If the shock is launched from a distance of ~1 m onto an argon rod with a radius of 10 cm, the temperature reaches 90,000 K, just right to excite the upper laser level of argon. Following its heating to 90,000 K the argon cylinder radially expands and cools, with the upper laser level frozen into the argon. This is similar to a gas dynamic laser, where the upper laser level is frozen into the gas during its isentropic expansion in a Laval nozzle. To reduce depopulation of the upper laser level during the expansion by super-radiance, one may dope the argon with a saturable absorber, acting as an “antiknock” additive. In this way megajoule laser pulses can be released within 10 nanoseconds. A laser pulse from a small Q-switched argon ion laser placed in the spacecraft can then launch a photon avalanche in the argon rod, igniting a DT micro-explosion.
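A quick check of the r^-0.4 scaling over the stated geometry (the implied starting temperature is an inference from the scaling, not a figure given in the source):

```python
# The convergent-shock scaling quoted above: T ~ r**(-0.4).
# Converging from r = 1 m down to the 10 cm rod radius multiplies T by:
factor = (1.0 / 0.1) ** 0.4
print(f"amplification: {factor:.2f}x")       # ~2.51x

# Implied shock temperature at r = 1 m needed to reach 90,000 K at the rod
# (an inference from the scaling, not stated in the source):
print(f"{90000 / factor:.0f} K at r = 1 m")  # ~36,000 K
```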

Science Fiction and technology futurists have always been interested in ways to enable high performance nuclear spacecraft. A classic in science fiction is Keith Laumer's bolo Supertank which had the Hellbore cannon, which has similarities to the proposed fusion system.

Hellbore ammunition consists of slivers of highly-pressurized frozen deuterium which, when fired, are ignited (by a laser) in a fusion reaction. The resulting bolt is contained and directed using strong magnetic fields in the breech and barrel. The resulting plasma travels at a considerable fraction of light speed. Since the Hellbore was designed as naval armament for Concordiat vessels, modifications had to be made to use them in an atmosphere to avoid losing a significant portion of the shot's energy to dispersal. To this end, a fraction of a second before the Hellbore fires, a powerful laser is fired to create a momentary vacuum along the path of the bolt. Later Bolo marks are capable of internally manufacturing Hellbore rounds, using water as a raw material, whereby the deuterium isotope of hydrogen is separated and cooled cryogenically into splinters of frozen hydrogen. The Mark XXXIV carries a variant of the Hellbore known as the Hellrail, an anti-starship railgun weapon, possessing an output of 90 megatons per shot. Hellrails are designed for planetary defense and cannot normally be depressed to strike ground targets.

Winterberg page at University of Nevada

Dr. Winterberg obtained his Ph.D. under Dr. Werner Heisenberg, and is listed as one of the four notable students of Werner Heisenberg on Heisenberg's Wikipedia entry, along with Felix Bloch, Edward Teller and Rudolf E. Peierls.

His thermonuclear microexplosion ignition concept was adopted by the British Interplanetary Society for their Daedalus starship study.

Among his many seminal papers:
Theory of NERVA-type nuclear fission rocket reactors. 2nd United Nations Conference on the Peaceful Uses of Atomic Energy, A/Conf.15/P/1055, Geneva 1958; also translated into Russian in "Selected Reports of Non-Soviet Scientists", Moscow, Atomizdat 3, 453 (1958).

First proposal for Impact Fusion Z. Naturforsch. 19a, 231 (1964); Magnetic macroparticle acceleration for impact fusion, J. Nuclear Energy 8, 541 (1966).

December 02, 2008

Reviewing Technologies, Developments and Projects to Watch for 2008

On Christmas eve 2007, this site published 10 technologies and projects to watch and one added development.

Here is a review of those items:
1. Dwave systems' adiabatic quantum computers (AQC) will come up short of the projected 512 to 1000 qubits and likely end the year at 128 qubits. 128 qubits is still the most for a quantum computer by a large margin. Successful development of quantum computers would accelerate the development of molecular nanotechnology with superior molecular simulation and modeling.

The Dwave System AQC is making progress to quantify what performance will be possible on its systems and in determining the quantumness of their systems.

So Dwave is still pushing ahead to record numbers of qubits and working to refine the efficiency of their systems and to expand the areas where this system will have superior performance. Dwave will explore where to use this system and will continue to refine their designs and increase the qubits and increase performance.

2. Inertial Electrostatic (Bussard Fusion) WB-7 prototype reactor was built and operated. The results are still not published, but are suspected to be mostly confirming the final WB6 result success. There are likely also some issues to be addressed. The ultimate potential impact may not be known until follow up funding and work is done and the details of what has been learned in the WB-7 tests will not be known until the results are published.

3. Breakthrough Memjet printers were delayed into 2009. The 2008 versions of the Memjet printers were expected to print 60 pages per minute.

4. Nvidia Tesla and AMD Firestream have delivered multi-teraflop GPGPU desktops and servers for under $20,000 and even under $8,000.

5. About 200 Tesla electric cars will have been delivered by the end of 2008, and the Aptera electric car could have some units built by the end of 2008.

Hybrid car sales will rise to over 5% of total new cars by 2012

Hybrid car sales were solid in 2008 at about 2.6-2.8% in the USA.

The Aptera could get up to 300 mpg and was to have been available in October 2008.

6. Flash solid state drives were $5-7/Gigabyte at the end of 2007.

Solid state flash drives have reached $2.33/Gigabyte.

This pricing level was not expected until the end of 2009 or 2010.

Hard drives are $0.10-1.20/Gigabyte.
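As a rough illustration of what these per-gigabyte prices mean for a whole drive, here is a quick sketch. The 250 GB capacity and the exact price points chosen are illustrative assumptions, not figures from the article:

```python
def drive_cost(capacity_gb, price_per_gb):
    """Total cost of a drive at a flat per-gigabyte price."""
    return capacity_gb * price_per_gb

capacity = 250  # GB; a typical 2008 laptop drive size (illustrative)

ssd_2007 = drive_cost(capacity, 6.00)  # midpoint of the $5-7/GB 2007 SSD range
ssd_2008 = drive_cost(capacity, 2.33)  # the late-2008 SSD price quoted above
hdd = drive_cost(capacity, 0.30)       # within the $0.10-1.20/GB hard drive range

print(f"250 GB SSD at 2007 prices: ${ssd_2007:.2f}")
print(f"250 GB SSD at 2008 prices: ${ssd_2008:.2f}")
print(f"250 GB hard drive:         ${hdd:.2f}")
```

At these numbers the SSD premium over a hard drive fell from roughly 20x to under 8x in a year.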

7. 2008 was a year of significant improvements in communication speed.

Wimax progress has been slower than was expected at the end of 2007, but the FCC has approved the Sprint/Clearwire deal. The FCC also approved white space airwaves for public use.

Comcast is offering wideband 50 mbps download speed and 10 mbps upload in 10 markets.

Verizon FIOS service has 50 mbps download and 20 mbps upload as its top tier.

8. Some of the hypersonic aircraft programs have been cancelled, but the US air force has an environmental assessment to allow 48 hypersonic flight tests from 2011-2015.

Darpa is still planning another hypersonic test flight in the summer of 2010.

9. The Reprap project did announce a self-replicating version in 2008.

Reprap plans to expand to a wide variety of materials and continue improvements.

10. 2008 was a breakthrough year for low cost DNA sequencing and synthesis.

Applied Biosystems announced their genome sequencing for $10,000 and Complete Genomics announced $5,000 genome sequencing.

The non-technological thing to watch: the prediction that Ma would win the Taiwan Presidency and would move to improve relations with China proved correct.

Holographic Precise Mass Control of Nanoscale Particles (50 nanometers to 3 microns) with Re-arrangement in Seconds

Researchers at Purdue University have developed a technique that uses a laser and holograms to precisely position numerous tiny particles within seconds, representing a potential new tool to analyze biological samples or create devices using nanoassembly.

The technique, called rapid electrokinetic patterning, is a potential alternative to existing technologies because the patterns can be more quickly and easily changed, said mechanical engineering doctoral student Stuart J. Williams.

The researchers demonstrated how the method could be used to cause particles to stick permanently to a surface in a single crystalline layer, a structure that could be used in manufacturing. They used their technique to move fluorescent-dyed beads of polystyrene, latex and glass in sizes ranging from 50 nanometers to 3 micrometers.

Future work may involve using a less expensive light source, such as a common laser pointer, which could not be used to create intricate patterns but might be practical for manufacturing.

The technique overcomes limitations inherent in two existing methods for manipulating particles measured on the scale of nanometers, or billionths of a meter. One of those techniques, called optical trapping, uses a highly focused beam of light to capture and precisely position particles. That technique, however, is able to move only a small number of particles at a time.

The other technique, known as dielectrophoresis, uses electric fields generated from metallic circuits to move many particles at a time. Those circuit patterns, however, cannot be changed once they are created.

The new method is able to simultaneously position numerous particles and be changed at a moment's notice simply by changing the shape of the hologram or the position of the light.

"If you want to pattern individual particles on a massive scale using electrokinetic methods as precisely as we are doing it, it could take hours to days, where we are doing it in seconds," Williams said.

The method offers promise for future "lab-on-a-chip" technology, or using electronic chips to analyze biological samples for medical and environmental applications. Researchers are trying to develop such chips that have a "high throughput," or the ability to quickly detect numerous particles or molecules, such as proteins, using the smallest sample possible.

"For example, a single drop of blood contains millions of red blood cells and countless molecules," Williams said. "You always want to have the smallest sample possible so you don't generate waste and you don't have to use as many chemicals for processing the sample. You want to have a very efficient high throughput type of device."

So-called "optical tweezers" use light to position objects such as cells or molecules.

"You can't use mechanical tweezers to move things like molecules because they are too delicate and will be damaged by conventional tweezers," Kumar said. "That is why techniques like optical tweezing and dielectrophoresis are very popular."

The students also have designed an experiment containing one indium tin oxide plate and one gold plate, an important development because gold is often used in biomedical applications.

"It's a technique that you would likely use in sensors, but we also see definite potential ways in which you could use it to manufacture devices with nanoassembly," Wereley said. "But it's really too soon to talk about scaling this up in a manufacturing setting. We're just beginning to develop this technique."

Thorium Google Talk and Jim Hansen Now a Thorium Proponent

Joe Bonometti gave a Google Tech Talk on liquid fluoride thorium reactors (11 MB of PowerPoint slides).


Besides the low amount of waste and almost complete burning of all Uranium and Plutonium, another big advantage of liquid fluoride reactors is fast and safe shutoff and restart capability. This fast stop and restart allows for load following electricity generation. This means a different electric utility niche can be addressed other than just baseload power for nuclear power. Currently natural gas is the primary load following power source. Wind and solar are intermittent in that they generate power at unreliable times. LFTR would be reliable on demand power.

Small, mobile and factory mass-producible liquid fluoride thorium reactor (LFTR) designs are being proposed. One proposal is about 500 tons for an LFTR power plant generating 100 MWe with 30 years between refuelings.

James Hansen, the NASA scientist famous for sounding an early warning on global warming, has written an eight-page open letter to Barack Obama arguing for Integral Fast Breeders and Liquid Fluoride Thorium Reactors.

James Hansen appeared on Charlie Rose and advocated nuclear power.

Sovietologist, a history grad student with an interest in Soviet atomic power, answers Joe Romm's critiques of Hansen.

Sovietologist fact check of Joe Romm on New nuclear power

Nuclear Green suggests factory assembled LFTR's and swapping out the coal burners with LFTR's.

There is a bill in the US Senate to fund thorium nuclear research with $250 million from 2009 to 2013. It is currently in committee and may or may not pass.

December 01, 2008

Superconductor at 212K

Joe Eck continues his research with superconductors and reports signs of superconductivity above 212K by doping a recently discovered 200K tin-copper-oxide superconductor with a small amount of indium. The improved formula becomes (Sn5In)Ba4Ca2Cu10Oy. Although the 212K phase forms as a very small volume fraction within the bulk, sharp resistive and diamagnetic transitions are clearly visible when multiple tests are digitally summed together.

Hopefully some better funded research institution will follow up and replicate Joe Eck's work.

Synthesis of these materials was by the solid state reaction method. Stoichiometric amounts of the below precursors were mixed, pelletized and sintered for 36-60 hours at 830C. The pellet was then annealed for 10 hours at 500C in flowing O2.

SnO 99.9% (Alfa Aesar) 4.64 moles
In2O3 99.9% (Alfa Aesar) 0.96 moles
CaCO3 99.95% (Alfa Aesar) 1.38 moles
BaCuOx 99.9% (Alfa Aesar) 5.98 moles
CuO 99.995% (Alfa Aesar) 3.29 moles

The magnetometer employed twin Honeywell SS94A1F Hall-effect sensors with a tandem sensitivity of 50 mV/gauss. The 4-point probe was bonded to the pellet with CW2400 silver epoxy and used 7 volts on the primary.

RESEARCH NOTE: The copper-oxides are strongly hygroscopic. All tests should be performed immediately after annealing.

Superconducting News from Joe Eck

Double to Triple the Energy Harvesting from Nanoscale Piezoelectrics

Researchers predict a dramatic enhancement in energy harvesting for a narrow range of dimensions in piezoelectric nanostructures, around the critical size of 20-23 nanometer thick beams.

With lead zirconate titanate (PZT) material employed in the form of cantilever beams, the results indicate that the total harvested power peak value can increase by 100% around 21 nm beam thickness (under short circuit conditions), and nearly a 200% increase may be achieved for specifically tailored cross-section shapes. The key (hitherto undiscovered) insight is that the striking enhancement in energy harvesting is predicted to rapidly diminish (compared to bulk) both below and above a certain nanoscale structural length, thus providing a rather stringent condition for experimentalists.

Physorg has more coverage

Dwave Quantum Computer Performance Estimates and Calculations

From the Dwave presentation at SC08 (the Supercomputing conference), they indicated that the current 128 qubit adiabatic quantum computer has sub-PC performance, but they project reaching PC-level performance by next year with however many qubits are operational then. Assuming the time scale between the 2008 and 2009 points is roughly correct, it will take 4-5 years to reach supercomputer levels of performance and 6-7 years to exceed classical computer performance.

By November, 2009:
• Dwave Quantum computer Systems are targeted to be in top research centers, performance ~ PC
• Extremely compelling scientifically

From Geordie Rose, CTO of Dwave:
It’s not only about the number of qubits. There are a lot of other issues. The level of connectivity in the underlying hardware graph (the number of couplers), algorithms for mapping “real life” problems into hardware, documentation and debugging, increasing mean time between failure, decreasing costs, decreasing 1/f noise from materials science issues in fab, increasing fab yields, etc. etc. etc.

Dwave is currently running calculations and experiments to determine the precise performance of their adiabatic quantum computers.

Recently techniques have been introduced for calculating the run time of the quantum adiabatic algorithms for problems up to about 128 variables; see here and here. While this is still too small to ultimately answer questions about the asymptotic scaling of these approaches, it is sufficient to predict the expected performance of adiabatic quantum computers of up to 128 qubits, which perhaps not coincidentally is the number of qubits in the Rainier design.
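As a toy illustration of what such run time calculations involve, the sketch below builds a 3-qubit adiabatic interpolation between a transverse-field driver Hamiltonian and a small Ising problem Hamiltonian, then finds the minimum spectral gap along the sweep; a common heuristic is that the adiabatic run time scales roughly as the inverse square of that gap. The 3-qubit chain and the field values are illustrative assumptions, vastly smaller than the 128-variable instances discussed above:

```python
import numpy as np
from functools import reduce

# Pauli matrices and the 2x2 identity
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def op(single, site, n):
    """Embed a single-qubit operator at position `site` in an n-qubit system."""
    return reduce(np.kron, [single if k == site else I2 for k in range(n)])

n = 3
# Driver Hamiltonian: uniform transverse field, -sum_i X_i
H_driver = -sum(op(X, i, n) for i in range(n))
# Toy problem Hamiltonian: antiferromagnetic Ising chain plus small local
# fields (arbitrary values, chosen only to make the ground state unique)
h = [0.5, -0.25, 0.75]
H_problem = sum(op(Z, i, n) @ op(Z, i + 1, n) for i in range(n - 1))
H_problem = H_problem + sum(h[i] * op(Z, i, n) for i in range(n))

# Sweep H(s) = (1-s) H_driver + s H_problem and record the gap between
# the two lowest eigenvalues at each interpolation point.
gaps = []
for s in np.linspace(0.0, 1.0, 101):
    evals = np.linalg.eigvalsh((1 - s) * H_driver + s * H_problem)
    gaps.append(evals[1] - evals[0])

g_min = min(gaps)
print(f"minimum spectral gap: {g_min:.4f}")
print(f"adiabatic run time heuristic (~1/g_min^2): {1 / g_min**2:.2f}")
```

Exact diagonalization like this is only feasible for a handful of qubits; the QMC techniques referenced above exist precisely because the matrix dimension grows as 2^n.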

D-Wave and Dr. Peter Young (a co-author on one of the QMC papers referenced previously) have built a distributed QMC platform to calculate the run time of a quantum adiabatic algorithm relevant to the experimental hardware at D-Wave. The project is called AQUA (Adiabatic QUantum Algorithms), with the distributed version called AQUA@home. The distributed computing technology is based on BOINC, the same technology that enables SETI@home and a host of other large-scale distributed scientific computations.

AQUA is currently running internally at D-Wave. We have set up an external server that will shortly go live. This server is set up to accept volunteer cycles from individuals who wish to contribute computer time to the AQUA project and make a direct contribution to the advancement of scientific understanding of the quantum adiabatic algorithms. The AQUA@home program runs at low priority in the background of any internet-connected computer. All of the data acquired from AQUA will be published, and everyone who contributes cycles to the project will receive copies of all the publications arising from this work.

The specific project we have been working on internally to test the AQUA system calculates the runtimes for a particular type of problem ideally suited to our hardware. These problems are spin glass problems and are known to be NP-hard. This type of problem will be the first thing we run on AQUA@home to ensure that everything is working properly and will be the basis of the first publication.

After this we plan to run AQUA@home to compute the expected run time of our 128-qubit superconducting adiabatic quantum computing system, running the quantum adiabatic algorithm it enables, on problems generated by the binary classification machine learning application we co-developed with Vasil Denchev and Hartmut Neven at Google.

Geordie Rose, Dwave CTO, believes that the outcome of this particular project has significant implications for the field of quantum computation, as there are currently no widely accepted meaningful commercial applications of quantum computers. If quantum adiabatic algorithms can solve machine learning problems better than the best known classical approaches, that would be a game changer for quantum computation. The field currently relies on insufficient amounts of government funding, primarily for its potential role in breaking certain asymmetric cryptosystems, an application of limited interest to partners and investors in the commercial world.
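The spin glass problems mentioned above can be made concrete with a tiny classical sketch: a random ±J Ising instance solved by brute force. The 10-spin size and the complete-graph couplings are illustrative assumptions; instances of real interest are far too large for exhaustive search, which is the point of the NP-hardness claim:

```python
import itertools
import random

# A small random +-J Ising spin glass: couplings drawn uniformly from
# {-1, +1} on a complete graph of 10 spins.
random.seed(0)
n = 10
J = {(i, j): random.choice([-1, 1]) for i in range(n) for j in range(i + 1, n)}

def energy(spins):
    """Ising energy E = sum_{i<j} J_ij * s_i * s_j for spins in {-1, +1}."""
    return sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())

# Exhaustive search over all 2^10 = 1024 spin configurations. This brute
# force cost grows exponentially with n, which is what makes larger
# instances hard and an interesting target for adiabatic hardware.
best = min(itertools.product([-1, 1], repeat=n), key=energy)
print("ground state:", best)
print("ground state energy:", energy(best))
```

Frustration (no assignment satisfies every coupling) is what distinguishes a spin glass from an easy ferromagnet and what makes the energy landscape rugged.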

November 30, 2008

Terry Grossman Recommended Health Checks and Disease Prevention

Terry Grossman is a doctor who runs a longevity health clinic and co-wrote "Fantastic Voyage: Live Long Enough to Live Forever" with Ray Kurzweil.

Terry spoke at the Convergence08 conference.

Heart and Cardio Health
Heart and coronary disease is the number one killer in North America. Two tests are very good for early detection of heart and coronary problems.

1. Coronary artery calcium score tests should be performed (about US$200).

2. Carotid Intima-media thickness (IMT) test using ultrasound

Carotid ultrasound

Women should get breast exams at earlier ages. Thermograms are better for detection without the radiation dose of a mammogram, and are good for women age 25-40. After 40, alternate mammograms and thermograms.

Terry recommends PSA tests to check for prostate problems. He believes stronger preventive measures should be taken with scores over 1. Scores over 1 do not necessarily mean cancer but can mean inflammation. Lifestyle modifications such as eating more fish and less red meat are recommended.

Genetic Screening

APOE2, APOE3, and APOE4 tests will help determine Alzheimer's risk. (About $90)
23andme offers a $400 test of about 80 genes based on a swab of your mouth.

Basic Preventative

Take a multi-vitamin 3-4 times per day
Fish oil - 2 grams
Vitamin D 40-56 [80% of people have too little Vitamin D]

Red Yeast Rice is good for reducing cholesterol
Plant Sterols also good for reducing cholesterol

Reducing calories consumed by 10-20% is beneficial. (Calorie reduction is a 10-20% cut, while calorie restriction is a 30-40% cut.)

A Short guide to a long life

Chapter Four: Food and Water

Avoid soft drinks and other acidic drinks, particularly colas (which have an extremely acidic pH of 2.5).

Replace coffee, which is also quite acidic, with less acidic beverages such as tea (particularly green tea).

Drink one-half fluid ounce per pound of body weight of alkaline water (pH between 9.5 and 10) each day. A 140-pound person should drink about nine 8-ounce cups per day.

In general, unfiltered tap water should not be drunk. Filtered tap water or ideally filtered, alkalinized water should be drunk instead.

Purified alkaline water can be produced from tap water by using an alkalinizing water machine (see recommended products listing).

Chapter Five: Carbohydrates and the Glycemic Load

One of the principal recommendations is to cut down sharply on high-glycemic-load carbohydrates. Beyond this, the proportion of carbohydrates in the diet depends on your health condition.

Our “low-carbohydrate group” consists of five subgroups of people that should cut down their carbohydrate consumption to no more than one sixth of their calories and virtually eliminate high-glycemic-load carbohydrates. As an example, the maintenance calorie level for someone weighing 150 pounds who is moderately active is 2,250 calories. This would translate into a carbohydrate limit of 94 grams per day. The five subgroups of people [Note from Nextbigfuture author: the dietary recommendations are probably a prudent step for most people and the five subgroups cover a lot of people] are:

People trying to lose weight.

People with The Metabolic Syndrome (also known as “Syndrome X” -- see definition below).

People with Type II diabetes.

People with elevated risk factors for heart disease.

People who have cancer, have had cancer, or have an elevated risk of cancer.
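The 94 gram figure in the example above follows from the standard conversion of roughly 4 calories per gram of carbohydrate, which can be checked with a short calculation:

```python
def daily_carb_limit_grams(maintenance_calories):
    """Carbohydrate limit in grams when carbs supply at most one sixth of
    calories, using the standard 4 calories per gram of carbohydrate."""
    carb_calories = maintenance_calories / 6
    return carb_calories / 4

# The book's example: 2,250 maintenance calories
print(round(daily_carb_limit_grams(2250)))  # 94
```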

For this low-carbohydrate group, we recommend:

Limit total carbohydrate consumption to less than one sixth of calories (see table below).

Generally avoid grains and fruit juices.

Eat very small quantities of low-glycemic index fruits, such as berries.

Acceptable carbohydrates in limited quantities include legumes (beans, lentils) and nuts.

Acceptable carbohydrates in larger quantities include low starch vegetables, particularly fresh and lightly cooked ones.

Good low starch vegetables:

Kale, Swiss chard, collards, spinach

Dandelion greens, green and red cabbage, broccoli

Red and green leaf lettuce, romaine lettuce, endive

Chinese cabbage, bok choy, fennel, celery, cucumbers

Cauliflower, zucchini, Brussels sprouts

Green vegetables in general

Use a starch blocker.

Chapter Eight: Change Your Weight for Life in One Day

Basic Procedure for our “Change Your Weight for Life in One Day” program:

Step One: Determine your body frame size from Table 1 in chapter 8

Step Two: Determine your optimal weight range from Table 2 in chapter 8. Set your optimal weight to the low end of the range. If your weight falls below this level, increase your calorie consumption to maintain this optimal weight.

Step Three: Determine and adopt the maintenance calorie level for your optimal weight and exercise level (which should be at least moderately active). This will result in gradual weight loss, which will automatically taper off as you approach your optimal weight. You only need to make this one change.

As you approach your optimal weight, assess the important issue of body fat, which should be in the range of 12 to 20% for men and 18 to 26% for women, although we recommend you stay on the lean side of these ranges. Use the tables on body fat (see section on this web site) to determine your body fat percentage. Alternatively, you can use a scale that shows body fat percentage.
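A minimal sketch of checking a measured body fat percentage against the ranges quoted above; the function name and the midpoint cutoff for "lean side" are my own illustrative choices, not from the book:

```python
def body_fat_assessment(percent, sex):
    """Compare a body fat percentage against the ranges quoted in the text:
    12-20% for men, 18-26% for women, aiming for the lean side."""
    low, high = (12, 20) if sex == "male" else (18, 26)
    if percent < low:
        return "below range"
    if percent > high:
        return "above range"
    midpoint = (low + high) / 2  # illustrative cutoff for "lean side"
    return "lean side of range" if percent <= midpoint else "high side of range"

print(body_fat_assessment(15, "male"))    # lean side of range
print(body_fat_assessment(25, "female"))  # high side of range
```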

Do not make weight loss your primary goal. Rather, adopt a healthy pattern of eating with a sustainable level of calories and approach your optimal weight gradually.

Exercise is an important component of losing weight and a healthy life style. We recommend at least 300 calories of exercise per day.

Product recommendations

Fantastic Voyage online

Terry Grossman wrote about genetic tests back in 2002

Fantastic Voyage glossary

First 3 chapters of the Fantastic Voyage book

Health and Longevity news
