January 19, 2007

Big Breakthrough: Honeycomb Nanotubes transfer carbon nanotube strength to the macroscale

Physorg reports that the stretchiness of these 20-nm-long carbon nanotubes enables them to do what straight nanotubes find difficult: transfer tensile forces while retaining high ductility, or malleability. Scientists Min Wang, Xinming Qiu, and Xiong Zhang of Tsinghua University in Beijing recently investigated the mechanical properties of super honeycomb structures, which are made of periodically repeating carbon nanotube Y junctions that form hexagonal patterns. While straight nanotubes, such as those compiled in bundles or ropes, are renowned for strength and elasticity, the honeycomb structure can also transfer these forces to different parts of itself.

When one tube in the honeycomb structure is broken, the surrounding arms can easily carry the load due to the structure’s ability to transfer forces.

The super honeycomb's ability to transfer forces means these structures could give scientists a resource for improving nanoelectronic devices for computers, as well as fiber-reinforced composites.

“Many nanoelectronic devices based on Y-junction carbon nanotubes have been proposed recently,” said Zhang. “Scientists [Coluci] have discussed the electronic properties of the super structures, and indicated that they have great applications as actuators and as hosts for large biomolecules. Regarding fiber-reinforced composites, just as its name implies, the mechanical properties of materials such as resin and concrete can be improved by adding some fiber components.”

Citation: Wang, Min, Qiu, Xinming, and Zhang, Xiong. “Mechanical properties of super honeycomb structures based on carbon nanotubes.” Nanotechnology. 18 (2007) 075711 (6pp)

Nervous system/brain connection to computers

A "data cable" made from stretched nerve cells could someday help connect computers to the human nervous system. The modified cells should form better connections with human tissue than the metal electrodes currently used for purposes such as remotely controlling prosthetics.

Tests have already shown that electrical signals can be transmitted in both directions along the cord. "Tests in animal models are next," says Smith. Connecting the cord to electrodes outside of the brain means the reaction of neurons to non-organic material can be controlled. In the future, the cord could connect an amputee's nerves to a sophisticated prosthetic, he says, and might even offer a way to connect artificial eyes or ears to the brain.

In Europe, most researchers in this field are using non-invasive EEG. There are also brain implants with pill-sized chips: Wikipedia describes the BrainGate interface, and nanowires can connect to individual neurons.

Nanodots could make higher density storage

Nanodots are one of two major approaches being pursued around the world as possible means of boosting the density of magnetic data storage. The other involves using a laser to heat and switch individual bits (Seagate's HAMR technology, targeting 50 terabits per square inch). The ultimate solution may be a combination of the two approaches, because heat reduces the strength of the magnetic field needed to switch nanodots, according to Justin Shaw, lead author of the new paper. Considerable work still needs to be done to make this type of patterned media commercially viable: dot dimensions need to be reduced to below 10 nm; techniques to affordably fabricate quadrillions of dots per disk need to be developed; and new methods to track, read, and write these nanoscale bits need to be devised. The NIST authors collaborated with scientists at the University of Arizona, where some of the nanodot samples were made.
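
A quick back-of-the-envelope sketch (my arithmetic, not from the article, and assuming the 50 terabit figure is per square inch) shows why sub-10 nm dots are required at that density:

```python
import math

# Estimate the bit pitch implied by an areal density target of
# 50 terabits per square inch of magnetic storage.
density_bits_per_in2 = 50e12        # bits per square inch
in2_to_m2 = 0.0254 ** 2             # one square inch in square meters

area_per_bit_m2 = in2_to_m2 / density_bits_per_in2
pitch_nm = math.sqrt(area_per_bit_m2) * 1e9  # side of a square cell per bit

print(f"area per bit: {area_per_bit_m2:.2e} m^2")
print(f"implied bit pitch: {pitch_nm:.1f} nm")
```

The implied pitch comes out to roughly 3.6 nm per bit cell, which is consistent with the article's statement that dot dimensions must shrink below 10 nm.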

Spider silk like substance from polymeric nanocomposites

MIT creates Spider-Man-like fabrics that are strong, stretchy and light. The Massachusetts Institute of Technology (MIT) claims it has devised a way to produce a material that mimics the stretchy, strong properties of spider silk for use in fuel cells, medical devices, military products and even Spider-Man-like fabrics.

The materials, known as polymeric nanocomposites, are weaving a web of interest for use in packaging materials, tear-resistant fabrics and biomedical devices. Another possible use is membranes for fuel cells.

Cheap, safe drug kills most cancers

The drug, dichloroacetate (DCA), has already been used for years to treat rare metabolic disorders and so is known to be relatively safe.

It also has no patent, meaning it could be manufactured for a fraction of the cost of newly developed drugs. Evangelos Michelakis of the University of Alberta in Edmonton, Canada, and his colleagues tested DCA on human cells cultured outside the body and found that it killed lung, breast and brain cancer cells, but not healthy cells. Tumours in rats deliberately infected with human cancer also shrank drastically when they were fed DCA-laced water for several weeks.

DCA attacks a unique feature of cancer cells: the fact that they make their energy throughout the main body of the cell, rather than in distinct organelles called mitochondria. This process, called glycolysis, is inefficient and uses up vast amounts of sugar. Crucially, though, mitochondria do another job in cells: they activate apoptosis, the process by which abnormal cells self-destruct. When cells switch mitochondria off, they become "immortal", outliving other cells in the tumour and so becoming dominant. Once reawakened by DCA, mitochondria reactivate apoptosis and order the abnormal cells to die.

DCA can cause pain, numbness and gait disturbances in some patients, but this may be a price worth paying if it turns out to be effective against all cancers. The next step is to run clinical trials of DCA in people with cancer. These may have to be funded by charities [Yo, Gates Foundation], universities and governments: pharmaceutical companies are unlikely to pay because they can’t make money on unpatented medicines. The pay-off is that if DCA does work, it will be easy to manufacture and dirt cheap.

Big News: Quantum computer demo dates announced

D-Wave Systems has set the dates for the demo of their Orion quantum computing system. They are going to hold two events: one at the Computer History Museum in Mountain View, California on February 13th, and the second at the Telus World of Science in Vancouver, Canada on February 15th.

This 16-qubit system is the beginning of commercially usable and useful quantum computers.

See the full list of over 47 articles on quantum computers and quantum algorithms.

More advanced quantum computers that will follow on the heels of this will have their biggest impact in enabling the rapid advancement of our control and understanding of the quantum world and the ability to solve new problems that we were not able to before. Large scale quantum simulations could rapidly drive advancement towards molecular nanotechnology.

The new age of large-scale commercial quantum computers will be a high-impact enabling technology.

January 18, 2007

The New York Times discusses better uses for 1.2 trillion dollars

Better uses for the conservatively estimated $1.2 trillion cost of the Iraq War

Currently the war is not buying much. A shift should be made to a lower-cost approach: engage just enough to prevent safe havens for terrorists. Redeploy to the borders. Make occasional sweeps through different cities. Reduce the objectives. Shore up allies: Kuwait, Israel, Turkey, the Kurds, Afghanistan, etc. Let Sunnis and Shia fight it out. Come back at a time of the US's choosing, with a later, unpredictable surge. Run experimental tech through Iraq: new UAVs, robotic fighting systems; make things more unpredictable.

Put some of the money into research in game changing military sensors and technology. Re-engage when technology projects change the way things can be done.

Fareed Zakaria points out the problem that the Arab states are mostly poorly run. The successful non-Arab Muslim states of Turkey and Malaysia are examples of what should be encouraged.

Another Fareed Zakaria piece. Looking at our circumstances in Iraq should give us some appreciation for the difficulty of his [Kissinger's] task. With a losing hand and deteriorating conditions on the ground, Kissinger maneuvered to extricate the United States from a situation in which it could not achieve its objectives, while at the same time limiting the damage, shoring up regional allies and maintaining some measure of American credibility. A version of such a strategy is the only one that has any chance of success in Iraq today.

Levels of abstraction for a matter compiler

Chris Phoenix has several very good comments about the UK Ideas Factory projects. Here is one on the matter compiler.

Level 1: Reaction trajectories, potential energy surfaces.
2: Covalent structures: ball-and-stick diagrams.
3: Surface and volume structures. (Overlaps with 2 and 4.)
4: Lowest functional parts: gears, levers, wires…
5: Gearboxes, logic gates
6: Machines, circuits
7: Machine systems (e.g. assembly line, simple CPU)
8: Nanoblocks ~100 nm-1 micron
9: Nanoblock surface/volume structures; moving interfaces
10: Virtual materials (10-100 micron scale)
11: Human-interacting material properties (texture, appearance)
12: Detailed product form and function
13: Large-scale product form and function
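
One way to make the hierarchy above concrete is to encode it as a data structure, so that a design tool could check that each component is specified in terms of the level beneath it. This is purely an illustrative toy; the level names are paraphrased from Chris Phoenix's list:

```python
# Chris Phoenix's 13 abstraction levels for a matter compiler,
# ordered from lowest (quantum chemistry) to highest (whole products).
LEVELS = [
    "reaction trajectories / potential energy surfaces",   # 1
    "covalent structures (ball-and-stick)",                # 2
    "surface and volume structures",                       # 3
    "lowest functional parts (gears, levers, wires)",      # 4
    "gearboxes, logic gates",                              # 5
    "machines, circuits",                                  # 6
    "machine systems (assembly line, simple CPU)",         # 7
    "nanoblocks (~100 nm - 1 micron)",                     # 8
    "nanoblock surface/volume structures",                 # 9
    "virtual materials (10-100 micron scale)",             # 10
    "human-interacting material properties",               # 11
    "detailed product form and function",                  # 12
    "large-scale product form and function",               # 13
]

def built_from(level: int) -> str:
    """Return the abstraction level a given level is composed of."""
    if level <= 1:
        return "(base level: quantum chemistry)"
    return LEVELS[level - 2]

# e.g. level-5 gearboxes are built from level-4 gears, levers and wires
print(built_from(5))
```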

Chris's article about the projects that came from the Ideas Factory

Improved walking molecule

Walking molecule improved to carry two molecules of cargo. A research team led by UC Riverside's Ludwig Bartels was the first to design a molecule that can move in a straight line on a flat surface. Now the team has found a way to attach cargo: two CO2 molecules, making the nano-walker a molecule carrier.

The molecule carrier is anthraquinone, which consists of three fused benzene rings with one oxygen atom on each side. An organic compound, anthraquinone is widely used in the pulp industry for turning cellulose from wood into paper. It is also the parent substance of a large class of dyes and pigments. Its chemical formula is C14H8O2.

The UCR study used a scanning tunneling microscope in Bartels's laboratory that gives a precise picture of individual molecules. Experiments took place on a highly polished copper surface, cleaned so that only the desired molecules were present on it. An individual anthraquinone molecule appears in Bartels' microscope as an almost rectangular feature with slightly rounded edges.

ANSOM Microscope Achieves Sub 10nm Resolution

ANSOM, the apertureless near-field scanning optical microscope, is able to resolve features smaller than 10 nm; prior versions could only reach about 20 nm. Stephen Quake's group at the California Institute of Technology developed a new phase-filtering method. The fluorescence near-field microscope can distinguish single molecules.

The microscope's phase-filtering method can also be applied to such things as nanoantennas and supersharp carbon nanotube probes. The resolution of both of these instruments could be improved with the group's process. Additionally, the microscope could be altered to work at a level that approaches the resolution of an electron microscope.

This new microscope, if properly adapted, could image live cells, look at things in motion, and observe proteins on cell surface membranes. It offers a powerful new tool for imaging single molecules and nanostructures.

Intel makes prototype 80 core chip and more future chip technology

Response to UK Ideas Factory projects: start of MNT race?

To quote Gandalf from The Two Towers:
And I come back to you now - at the turn of the tide

If other countries respond to the very interesting UK Ideas Factory projects with similar or more aggressive software-control-of-matter projects, this would be the start of a growing race to molecular nanotechnology.

I think the UK projects are interesting, aggressive and achievable. Significant success for the three proposed projects, and the others likely to follow in the regular funding process, would force other countries to take programmable matter with site-specific chemistry (molecular nanotechnology) seriously. I think even before such success is achieved (which might take until 2009-2011), countries and researchers elsewhere will react this year.

January 17, 2007

Ten times smaller computer circuits that are ten times cheaper

A major obstacle has been removed for the mass production of circuits ten times smaller than today's. By eliminating the tiny air bubbles that form when liquid droplets are molded into intricate circuits, a Princeton-led team has dissolved a sizable obstacle to the mass production of smaller, cheaper microchips.

Led by Stephen Chou, the Joseph C. Elgin Professor of Engineering at Princeton, the team worked to troubleshoot one form of nanoimprint lithography, a revolutionary method invented by Chou in the 1990s. Nanoimprint uses a nanometer-scale mold to pattern computer chips and other nanostructures, and is in marked contrast to conventional methods that use beams of light, electrons or ions to carve designs onto devices.

This technique allows for the creation of circuits and devices with features that are not much longer than a billionth of a meter, or nanometer -- more than 10 times smaller than is possible in today's mass-produced chips, yet more than 10 times cheaper. Because of its unique capabilities and reasonable cost, nanoimprinting is a key solution to the future manufacturing of computer chips and a broad range of nanodevices for use in optics, magnetic data storage and biotechnology, among other disciplines.

In dispensing-based nanoimprinting, liquid droplets on the surface of a silicon wafer are pressed into a pattern, which quickly hardens to form the desired circuitry. This technique is more attractive to manufacturers than some other forms of nanoimprinting because it does not need to be done in an expensive vacuum chamber. However, the widespread use of the technique has been hindered by the formation of gas bubbles that distort the intended pattern.

New armor from bear protection suit maker

An independent inventor, known for his bear-protection suit, has created new body armor.

The suit has stood up to bullets from high-powered weapons, including an elephant gun. The suit was empty during the ballistics tests, but he's more than ready to put it on and face live fire. The whole suit is made from high-impact plastic lined with ceramic bullet protection over ballistic foam. Its many features include compartments for emergency morphine and salt, a knife and emergency light. Built into the forearms are a small recording device, a pepper-spray gun and a detachable transponder that can be swallowed in case of trouble. In the helmet, there's a solar-powered fresh-air system and a drinking tube attached to a canteen in the small of the back. A laser pointer mounted in the middle of the forehead is ready to point to snipers, while LED lights frame the face.

He has spent two years and $15,000 in the lab out back of his house in North Bay, designing and building a practical, lightweight and affordable shell to stave off bullets, explosives, knives and clubs. He calls it the Trojan and describes it as the "first ballistic, full exoskeleton body suit of armour." The whole suit comes in at 18 kilograms. It covers everything but the fingertips and the major joints, and could be mass-produced for about $2,000, Hurtubise says. [not an exoskeleton but full body armor]

See the follow-up article with videos of people running in the bear suit.



Room-temperature organic-containing magnets

Railguns and lasers

An 8-megajoule railgun is working. This could lead to higher-velocity and longer-range weapons. Key to practicality is power generation and storage, such as superconducting engines or nuclear power; a powerful pulse generator is used for this railgun. The prototype fired at Dahlgren is only an 8-megajoule electromagnetic device, but the one to be used on Navy ships will generate a massive 64 megajoules. Current Navy guns generate about 9 megajoules of muzzle energy. A 32-megajoule lab gun will be delivered to Dahlgren in June. The projectile fired yesterday weighed only 3.2 kilograms and had no warhead. Future railgun ordnance won't be large and heavy either, but will deliver the punch of a Tomahawk cruise missile because of the projectile's immense speed at impact.
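
The quoted energy figures imply enormous muzzle velocities. A rough sanity check (my arithmetic, not from the article; the 15 kg mass for the ship round is an invented assumption for illustration):

```python
import math

def muzzle_velocity(energy_j: float, mass_kg: float) -> float:
    """Velocity implied by kinetic energy E = 1/2 * m * v^2."""
    return math.sqrt(2 * energy_j / mass_kg)

# The 8 MJ prototype firing the 3.2 kg test projectile mentioned above
v_prototype = muzzle_velocity(8e6, 3.2)
print(f"8 MJ, 3.2 kg  -> {v_prototype:.0f} m/s (~Mach {v_prototype / 343:.1f})")

# The planned 64 MJ ship gun, assuming a hypothetical ~15 kg round
v_ship = muzzle_velocity(64e6, 15)
print(f"64 MJ, 15 kg -> {v_ship:.0f} m/s")
```

The 3.2 kg test shot works out to roughly 2.2 km/s, around Mach 6.5 at sea level, which is why an inert projectile can deliver cruise-missile-scale impact energy.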

The railgun's 200 to 250 nautical-mile range will allow Navy ships to strike deep in enemy territory while staying out of reach of hostile forces.

Mortars could soon become a lot less effective.

Progress for laser mortar defences.

They could provide squad-level or individual defence. They would also be potentially effective against small UAVs, such as those that are MEMS/NEMS based.

This accompanies existing base/encampment mortar defences.

Those without good tech will be at more and more of a disadvantage.

Third proposed UK Ideas Factory project

There is a need for a high-level instruction language and a computer compiler that translates commands in this language into instructions for the 'nano-assembler': translating build instructions into specific pick-and-place actions for an SPM or other accurate pick-and-place tool.

An ambition to assemble molecules and materials under atomically precise control demands a big leap forward in control engineering and computer science. Is it possible to anticipate the properties and needs of a ‘nano-assembler’? If so, there is a need for a high level instruction language and a computer compiler that translates commands in this language into instructions for the ‘nano-assembler’. This development will require a breakthrough in understanding of chemical synthesis that must embrace the radically new ‘pick and place’ assembly method which is now possible in scanning probe microscopy (SPM). The Matter Compiler project is thus both an exercise in foresight, to anticipate developments in this area, and a prototype implementation for the engineering control and computer science aspects of directed molecular assembly. It has as inputs data from SPM experiments of collaborators, energy landscapes for ‘pick and place’ reactions and the vast knowledge base of classical synthetic chemistry, including methodologies such as retrosynthesis. This will be supplemented by reaction schemes for ‘pick and place’ reactions deduced from first principles quantum chemistry calculations and the technology of object oriented databases and inference engines.

Was the last part of my comment here any inspiration for this specific project choice? (You can create a molecular manufacturing process design language: remove hydrogen, passivate surface, etc.) Regardless, this is a necessary and useful project to perform.
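
To make the compilation idea concrete, here is a purely illustrative toy (every opcode, step name and coordinate is invented, not from the proposal): a pass that lowers high-level build commands of the "remove hydrogen, passivate surface" kind into low-level pick-and-place steps for an SPM tip.

```python
# Toy lowering table: each high-level matter-compiler operation expands
# into an ordered sequence of primitive SPM tip actions.
HIGH_LEVEL_OPS = {
    "remove_hydrogen": ["move_tip", "apply_voltage_pulse", "retract_tip"],
    "place_dimer":     ["load_tip", "move_tip", "bond_dimer", "retract_tip"],
    "passivate_site":  ["load_tip", "move_tip", "deposit_hydrogen", "retract_tip"],
}

def compile_program(program):
    """Lower a list of (operation, lattice_site) commands into tip steps."""
    instructions = []
    for op, site in program:
        if op not in HIGH_LEVEL_OPS:
            raise ValueError(f"unknown operation: {op}")
        for step in HIGH_LEVEL_OPS[op]:
            instructions.append((step, site))
    return instructions

build = [("remove_hydrogen", (0, 0)),
         ("place_dimer", (0, 0)),
         ("passivate_site", (0, 1))]
for step, site in compile_program(build):
    print(step, site)
```

The real Matter Compiler project would of course need reaction energetics, error handling and retrosynthetic planning behind each expansion; the point here is only the two-level structure of a high-level language lowered to deterministic tool actions.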

WHO biggest environmental risk factors

Skewed risk perceptions: How you say it matters

A WHO discussion of skewed risk perceptions

Positive or negative framing? Striking changes in preference can result from framing the risk in either positive or negative terms, such as lives saved or lives lost, rates of survival or mortality, improving good health or reducing risks of disease.

Relative or absolute risks? Although relative risks are usually better understood, it can be very important to present absolute changes as well.

Percentages or whole numbers? Probabilities are better understood as percentage changes than by comparison of whole numbers.

Whole numbers or an analogy? Whole numbers may be less well understood than an example or analogy for the size of an adverse event.

Small or large numbers? A small number of deaths is more easily understood than a large number, which is often incomprehensible.

Short or long periods? A few deaths at one time or over a short period, as in a tragic accident, often have more impact than a larger number of deaths occurring discretely over a longer period of time.
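
A small numerical illustration of the relative-versus-absolute framing point above (all figures invented for illustration): the same intervention can sound dramatic or modest depending on which number is reported.

```python
# A hypothetical intervention that halves a small baseline risk.
baseline = 0.02   # 2% chance of the disease without the intervention
treated  = 0.01   # 1% chance with the intervention

relative_reduction = (baseline - treated) / baseline   # sounds large: 50%
absolute_reduction = baseline - treated                # sounds small: 1 point
number_needed_to_treat = 1 / absolute_reduction        # 100 people per case avoided

print(f"relative risk reduction: {relative_reduction:.0%}")
print(f"absolute risk reduction: {absolute_reduction:.1%}")
print(f"people treated to prevent one case: {number_needed_to_treat:.0f}")
```

"Cuts the risk by 50%" and "helps 1 person in 100" describe exactly the same data, which is why the WHO recommends presenting absolute changes alongside relative ones.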

World Health Organization estimates 3 million killed by outdoor air pollution

The World Health Organization (WHO) says 3 million people are killed worldwide by outdoor air pollution annually from vehicles and industrial emissions, and 1.6 million indoors through using solid fuel. Most are in poor countries. One study says 7-20% of cancers are attributable to poor air and pollution in homes and workplaces.

Wikipedia discusses air pollution

Research published in 2005 suggests that 310,000 Europeans die from air pollution annually. Direct causes of air pollution related deaths include aggravated asthma, bronchitis, emphysema, lung and heart diseases, and respiratory allergies.

Combustion-fired power plants are a major source; coal-fired power plants contribute 40% of the world's sulphur and mercury emissions.

A PDF by the World Health Organization on air pollution

A color PDF report on environmental causes of different diseases. 1.5 million deaths per year from respiratory infections attributable to the environment.

Natural Resources Defense Council on air pollution

As many as 64,000 premature deaths occur each year from cardiopulmonary causes attributable to particulate air pollution, according to NRDC estimates. Most particulate emissions result from burning fossil fuels -- coal, oil, diesel, gasoline -- or wood. Old coal-fired power plants, industrial boilers, diesel and gas-powered vehicles and wood stoves are some of the worst culprits.

Used in motor fuels, solvents, detergents, pesticides and many other substances, benzene is a carcinogen that causes leukemia as well as a number of other illnesses. Virtually the entire U.S. population is exposed to benzene in at least small amounts -- at gas stations (it's in the gasoline), in diesel exhaust or from cigarette smoke, including second-hand smoke. Benzene is also a problem in a number of workplaces, including oil refineries, coal-coking operations at steel mills, chemical processing plants, rubber manufacturing plants and laboratories, where it is often used as a solvent for other chemicals.

Environmental causes of avoidable forms of cancer

The National Cancer Institute has listed the percentage of avoidable cancers with different environmental causes.

Proportion of cancer deaths caused by different avoidable cancers

Cause                                       Percent 1981 (US)*   Percent 1998 (UK)**

Tobacco                                     25-40                29-31
Diet                                        10-70                20-50
Medicines                                   0.3-1.5              <1
Infection (parasites, bacteria, viruses)    10 (best estimate)   10-20
Ionizing and UV light                       2-4                  5-7
Occupation                                  2-8                  2-4
Pollution (air, water, food)                <1-5                 1-5
Physical inactivity                         --                   1-2

*Doll R and Peto R. The causes of cancer: quantitative estimates of avoidable risks of cancer in the United States today. Journal of the National Cancer Institute 1981;66:1191-1308.

**Doll R. Epidemiological evidence of the effects of behavior and the environment on the risk of human cancer. Recent Results in Cancer Research 1998;154:3-21.

Aaron Blair, Ph.D., the chief of the Occupational Epidemiology Branch in NCI's Division of Cancer Epidemiology and Genetics, was interviewed and he said:

This includes radiation from many sources - cosmic rays, radon, X-rays, atomic bombs, and above ground nuclear bomb tests. So, the estimates for tobacco and ionizing radiation are very solid. However, the total contribution from all the other causes of cancer, such as diet, occupational exposures, or air and water pollution, may be correct, but we are less certain. For these categories, we never know if we have really identified all the potential factors that contribute to the cancer risk in the population.

There is very solid evidence that environmental factors are the major cause of cancer, although the specific environmental factors involved differ by tumor. Tobacco smoke is the major cause of lung cancer. But there is a long list of other chemicals that cause lung cancer - arsenic, asbestos, PAHs (polyaromatic hydrocarbons), and chromium, to name a few. For breast cancer, hormone use is one of the major factors affecting risk. Prostate cancer has nothing that reaches the level of evidence of lung or breast cancer, although there are a number of strong leads. Physical inactivity is strongly linked to colorectal cancer, as well as a number of dietary factors -- low fiber is probably implicated.

My hunch is that general environmental exposures (pollutants in air and water) will be understood to be more important in future decades. These won't account for as large a percentage of cancers as tobacco, although they could rise above the 2-5 percent range because of the large numbers of people exposed. They were on Doll and Peto's list, but there was very little information to back up their estimates (see table above). Doll and Peto assumed that several environmental exposures in the industrial arena were the same as in the general population. Researchers are beginning to focus on potentially hazardous substances in the water and air. This is a difficult research area and is every bit as hard to study as diet. My suspicion is that we will have much more solid information in the next couple of decades about how these things may contribute to cancer.

The Cancer Prevention Coalition is more firm:

A 2000 publication on a large-scale study of identical twins in Sweden, Denmark, and Finland showed that cancer risk in adopted children parallels that of their adoptive, rather than biological, parents. "The overwhelming contribution to the causation of cancer in the population of (90,000) twins that we studied was the environment" (20). The critical significance of these findings has been recently stressed: "Thus the conclusion from twin studies is consistent with the conclusion from migrant studies: the majority, probably the large majority, of important cancers in western populations are due to environmental rather than genetic factors. Overly enthusiastic expectations regarding genetic research for disease prevention have the potential to distort research priorities for spending and health."

The cancer establishment has ignored the June 2002 admission by Doll that most non-smoking cancers "are caused by exposure to chemicals, often environmental ones."

Another discussion of coal health risks

Coal plants are the primary human activity responsible for the release of cancer-causing radioactive substances known as radionuclides. The International Agency for Research on Cancer (IARC) has classified radium-226, radium-228, thorium-232, and their decay products, as promoting cancer in human beings. All of these radionuclides are present in coal. They become more concentrated when they are burned and emitted as gases and particles.

The 19 coal plants would emit 4,415 pounds of toxic mercury, which contaminates fish and leads to permanent brain damage in exposed children.

Studies show people living near coal plants have a higher risk of cancer and asthma, and the children are at risk of brain disorders that result in learning and behavioral disorders. Particle pollution that can get into the lungs and bloodstream, as well as acid gases and mercury, are just a few of the concerns of residents living near coal plants.

A recent study shows that 240 more deaths a year will result from the 19 proposed plants in Texas: 12,000 deaths over their 50-year life.

Coal usage and production in the United States: over 1,000 plants.

A PBS debate on coal versus nuclear

Physicists discover structures of gold nanoclusters

Figure caption: Theoretical electron scattering intensities from negatively charged gold nanoclusters containing 24 atoms, calculated for the structures shown at the top. Each of the curves is denoted by a symbol that corresponds to a particular structure. Shown at the bottom is the experimental scattering data (solid red line) together with the calculated one that fits it best (dotted line) -- that is, the tubular, capsule-like structure marked G. The theoretically predicted best-fit structure is also shown on the right, with atoms in individual layers of the nanotube colored differently. The green strip accompanying the experimental data shows the experimental uncertainty.

Credit: Georgia Tech/Uzi Landman

Knowing the naturally formed structures of gold provides more building blocks for use in nanotechnology and nanoscale technology.

Quantum biology

Researchers at Rensselaer Polytechnic Institute describe a mechanism to explain how an intein -- a type of protein found in single-celled organisms and bacteria -- cuts itself out of the host protein and reconnects the two remaining strands. The intein breaks a protein sequence at two points: first the N-terminal, and then the C-terminal. This aspect of the project, which is led by Saroj Nayak, associate professor of physics, applied physics, and astronomy at Rensselaer, focuses on the C-terminal reaction.

Another Rensselaer team previously found that the reaction at the C-terminal speeds up in acidic environments.

"You can use this protein that cuts itself and joins the pieces together in a predictable way," he said. "It already has a function that would be nice to harness for nanotechnology purposes." And because the reaction may be sensitive to light and other environmental stimuli, the process could become more than just a two-way switch between "on" and "off."

Detect Bombers from 50 or more yards away

The CounterBomber system beams low-power radar at a person to detect concealed bombs or weapons beneath clothing. The technology, which could detect suicide bombers from 50 or more yards away, may reach market later this year. Future versions may be augmented with gait-recognition software that detects when people are carrying heavy objects--or leaving objects on the ground--by analyzing anomalies in a person's gait.

Metagenomics advance: sequencing all microbes in a termite for better biofuels

Sequencing the genomes of microbial ecosystems could lead to better biological machines. Scientists are sequencing the genomes of entire microbial communities in the hope of uncovering new genes and organisms that can create fuel, mine metals, or clean up Superfund sites. Known as metagenomics, the field relies on studying bits of DNA from a variety of organisms that live in the same place.

The standard way to identify and study the microorganisms living in a particular community is to grow them in a lab, but this is only possible with about 1 percent of microbes. However, in the past two years, faster and cheaper gene-sequencing methods have offered microbiologists a new tool with which to study the other 99 percent. Scientists can extract the DNA from, say, a drop of seawater or a sample of sludge from a sewage-treatment plant and then sequence that DNA, deriving genomic clues to all the organisms living in that environment.

Assembling the random fragments of DNA generated during sequencing can be a challenge--even impossible in some cases. Hugenholtz likens the process to trying to put together one thousand jigsaw puzzles from a single box that holds only a few pieces from each puzzle. So rather than fully assembling these genomic puzzles, scientists try to understand the individual pieces, or genes. Identifying the genes that allow the microbes in the termite gut to digest wood, for example, could lead to better biofuels. Converting cellulose in trees and grasses into the simple sugars that can be fermented into ethanol is a very energy-intensive process. "If we had better enzymatic machinery to do that, we might be better able to make sugars into ethanol," Bristow says. "Termites are the world's best bioconverters."

Metagenomic researchers have already identified a number of novel cellulases--the enzymes that break down cellulose into sugar--and are now looking at the guts of other insects that digest wood, such as an anaerobic population that eats poplar chips. The end result will be a giant parts list that synthetic biologists can put together to make an ideal energy-producing organism.

Microphotonic devices for higher speed communications

Researchers at MIT have developed a new device that will improve communications. Optical fibers randomly alter the polarization of light passing through them, which can weaken or garble signals. The new structure converts all incoming light to a single polarization before the data is processed.
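The basic trick of such polarization-diversity schemes can be sketched with Jones vectors: split the arbitrary input into its TE and TM components, rotate the TM branch by 90 degrees so that it too becomes TE, and then process both branches with identical on-chip components. This is a minimal idealized model, not MIT's actual device; the function name and structure are invented for illustration.

```python
import numpy as np

def split_and_rotate(field):
    """Idealized polarization-diversity stage: split the input Jones
    vector into TE and TM branches, then rotate the TM branch by 90
    degrees so both branches carry power only in the TE component."""
    te, tm = field  # Jones vector components (TE, TM)
    rotate_90 = np.array([[0.0, -1.0],
                          [1.0,  0.0]])  # ideal polarization rotator
    te_branch = np.array([te, 0.0])
    tm_branch = rotate_90 @ np.array([0.0, tm])
    return te_branch, tm_branch

# Light arriving from a fiber in an arbitrary (random) polarization:
incoming = np.array([0.6, 0.8])
a, b = split_and_rotate(incoming)
# Both branches now have zero TM component, and together they
# preserve the total input power (0.36 + 0.64 = 1.0).
```

Because both branches end up in the same polarization, a single filter design can be fabricated twice and applied to each branch, sidestepping the polarization sensitivity of high-index-contrast waveguides.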

The current advance pertains only to those photonic applications that involve light with multiple polarizations--mainly communications applications that involve fiber optics. There hasn't been much economic pressure in the past couple of years to develop technology for these applications because of a glut in bandwidth, but now communications demands are increasing again, says Erich Ippen, professor of electrical engineering and physics at MIT and one of the researchers on the project.

"When you integrate things like this, the complexity and the performance of the kinds of filtering we can do are a little more advanced than the methods that are used today," Ippen says. And that, he says, will make it possible to meet the demands of next-generation telecommunications.

January 16, 2007

UK Proposal for sub-angstrom precision computer controlled actuators

Another proposed project from the UK Ideas Factory aims to create reconfigurable, computer-controlled actuators with sub-nanometer to sub-angstrom precision.

We propose a scheme to revolutionise the synthesis of nanodevices, nanomachines, and, ultimately, functional materials via the positional assembly of molecules and nanoscale building blocks. Computer-directed actuators will be used to drive (with sub-nanometre to sub-Angstrom precision) the elements of a nanosystem along pre-defined and entirely deterministic trajectories, thereby achieving structures not accessible by mimicking natural assembly strategies alone. Linkages and bonding between the building blocks will also be initiated, modulated, and - in some cases - terminated by direct computer control. Our proposal rests on the parallel development of novel surface-bound, reconfigurable nanoscale building blocks (molecules, functionalised clusters, nanoparticles) and a prototype computer-controlled matter manipulator best described as a nanoscale conveyor belt. We focus on the generation of two major and immensely challenging functionalities for positionally-assembled nanomachines: switchable energy transduction and conformationally-driven motion. Our archetypal system comprises the following units: an energy harvester, a switchable/gateable link, and an optical or mechanical output. By arranging, configuring, and triggering these fundamental units our long-term goal is no less than the fabrication of an autonomous, abiotic [non-living] nanomachine.

Software control of DNA oligomer project proposed

Software-controlled assembly of DNA oligomers is one of the likely Ideas Factory projects. (In biochemistry, the term oligomer is used for short, single-stranded DNA fragments, generally used in hybridization experiments. It can also refer to a protein complex made of two or more subunits. A complex made of several different protein subunits is called a hetero-oligomer; when only one type of protein subunit is used in the complex, it is called a homo-oligomer.)

We propose to create a molecular machine that will build new materials under software control. The output of the machine will be chains of building blocks linked by covalent bonds. The machine is modular and is designed to accept many different building blocks, from small molecules to nanoparticles, with a wide range of physical and chemical properties. In order to drive its development we will concentrate on using it to create two target products: a molecular wire, capable of transporting energy and electrical charge, and a catalyst. Software control starts with specification by the end-user of a sequence of building blocks. The target sequence is encoded in an instruction tape which can be read by the machine: the tape is itself a molecule, a synthetic DNA oligomer. The target sequence of building blocks is automatically converted into a control sequence of DNA bases, and the tape is produced by commercial solid-phase synthesis. The job of the machine is to read the instruction tape and to form the bonds between building blocks in the specified sequence. Every component of this molecular factory is itself a molecule: our ambition is to develop the system to the point where it could be distributed to end users as chemicals in plastic vials.
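The "instruction tape" idea above amounts to a codec: a user-specified sequence of building blocks is translated into a DNA control sequence, and the machine reads that sequence back to bond blocks in order. The sketch below shows the encode/decode step in the abstract; the codebook, block names, and codon length are invented for illustration and are not from the proposal.

```python
# Hypothetical codebook: each building-block type is assigned a
# short DNA "codon" on the instruction tape (mapping invented here).
CODEBOOK = {
    "wire-monomer": "ACG",
    "catalyst-site": "TGC",
    "nanoparticle": "GAT",
}
DECODE = {dna: block for block, dna in CODEBOOK.items()}

def encode_tape(blocks):
    """Convert an end-user's block sequence into the DNA control
    sequence that would be ordered from solid-phase synthesis."""
    return "".join(CODEBOOK[b] for b in blocks)

def read_tape(tape, codon_len=3):
    """What the molecular machine conceptually does: step along the
    tape and emit the building block encoded at each position."""
    return [DECODE[tape[i:i + codon_len]]
            for i in range(0, len(tape), codon_len)]

tape = encode_tape(["wire-monomer", "nanoparticle", "wire-monomer"])
print(tape)             # the synthetic DNA oligomer to order
print(read_tape(tape))  # the block sequence the machine assembles
```

The interesting engineering is of course in the chemistry, not the encoding, but the software layer is what makes the machine programmable: changing the product means ordering a different oligomer, not redesigning the machine.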

This looks like it should be an interesting attempt to create a programmable, modular, and step-wise improvable DNA-based construction system. It would prove out a set of molecular components for a construction system that extends the complexity of what has been done before, and it would leverage ongoing improvements in DNA synthesis while bringing in other building blocks (nanoparticles and small molecules). DNA synthesis is a rapidly improving area and can currently assemble strings of thousands of DNA bases.

The molecular wire for transporting charge and the catalyst would provide a more general way to guide site-specific reactions.

The capabilities will improve as the enabling technology improves.

One of the next steps after that would be to find ways to create highly parallel or exponential manufacturing.

HP researchers make new computer chip design, 8 times the transistors

Hewlett-Packard researchers have designed a faster, more energy-efficient chip by packing in more transistors--without shrinking them. Instead of using aluminum wiring that takes up a lot of space beside the transistors, they used a crossbar nanowire mesh that sits on top of the transistors to connect them. HP's design has the potential to be easily integrated into a chip-making facility. By 2010, the technology should be ready for manufacturing.

The first application of the technology will most likely be in a type of chip called field-programmable gate arrays (FPGAs), which have the flexibility to be programmed to complete a variety of tasks. FPGAs are typically used in the design stages of electronics and communication systems. However, once the bugs are worked out of the design, manufacturers replace FPGAs with faster, cheaper chips called application-specific integrated circuits (ASICs). Reducing the size and cost of FPGAs and increasing their speed has the potential to shift the balance between FPGAs and ASICs.
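Conceptually, a crossbar interconnect is a grid of horizontal and vertical nanowires with a configurable switch at every junction: closing a junction connects a row wire to a column wire, so configuration rather than lithography defines the routing, which is what makes the scheme FPGA-like. The toy model below illustrates only that routing abstraction; it is not HP's actual architecture, and the class and method names are invented.

```python
class NanowireCrossbar:
    """Toy model of a crossbar interconnect: row wires (e.g. transistor
    outputs) and column wires (e.g. transistor inputs) with a
    programmable switch at each (row, col) junction."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.closed = set()  # junctions that have been switched on

    def program(self, row, col):
        """Close the switch at one junction, connecting row to col."""
        assert 0 <= row < self.rows and 0 <= col < self.cols
        self.closed.add((row, col))

    def route(self, row):
        """Return the columns a signal on `row` currently reaches."""
        return sorted(c for r, c in self.closed if r == row)

xb = NanowireCrossbar(rows=4, cols=4)
xb.program(0, 2)
xb.program(0, 3)
xb.program(1, 0)
print(xb.route(0))  # signal on row 0 fans out to columns 2 and 3
```

Because the mesh sits above the transistor layer rather than beside it, the silicon area formerly spent on wiring can hold additional transistors, which is where the density gain in the HP design comes from.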
