March 30, 2007

New material NanoBuds: buckyballs fused to nanotubes

(left) Two transmission electron microscope images of a single-walled carbon nanotube with fullerenes attached to its surface (right) A fullerene/SWNT hybrid structure – a NanoBud. Credit: Esko Kauppinen, et al.

Researchers have created a hybrid carbon nanomaterial that merges single-walled carbon nanotubes and spherical carbon-atom cages called fullerenes. The new structures, dubbed NanoBuds because they resemble buds sprouting on branches, may possess properties that are superior to fullerenes and nanotubes alone.

NanoBuds may find use as cold electron field emitters – materials that emit electrons at room temperature under a high applied electric field – due to the fullerenes' many curved surfaces, which make for better emitters than flat surfaces. Cold electron field emission is key to many technologies, including flat-panel displays and electron microscopes.

“We believe that NanoBuds may have other applications, such as molecular anchors to prevent SWNTs from slipping within composite materials,” says Kauppinen. “Additionally, since the optical and electrical properties of the fullerenes and nanotubes can be individually tuned, NanoBuds provide SWNTs with distinct regions of different electrical properties. This could be useful for many applications, including memory devices and quantum dots.”

Cold Fusion continues to gain Grudging Respect

Cold fusion continues to claw its way to respectability. The American Chemical Society is holding a symposium at its national meeting in Chicago, Illinois, on 'low-energy nuclear reactions', the official name for cold fusion.

Vocal cold-fusion critic Robert Park of the University of Maryland remains unconvinced: "If anything is going on, it's not fusion."

That cold-fusion critics such as Park even acknowledge there might be any effect at all is a major change in attitude, says Frank Gordon from the US Navy's Space and Naval Warfare Systems Center in San Diego, California.

More on low-energy nuclear reactions

Penn scientists engineer small molecules to probe proteins deep inside cell membrane

Proteins, which form much of the molecular machinery required for life, are the targets of most drug molecules. One third of all proteins are membrane proteins – embedded within the cell’s fatty outer layer. While scientists can easily study the other two-thirds using such tools as antibodies, they have not had such methods to investigate the membrane-embedded portions of proteins.

To probe the secrets of these seemingly inaccessible proteins, researchers at the University of Pennsylvania School of Medicine have used computer algorithms and information from existing protein sequence and structure databases to design peptides that bind to specific regions of transmembrane proteins.

The University of Florida is launching a broadbased study of regeneration.

Medical progress summary

Japanese scientists have developed an oral vaccine for Alzheimer’s disease that has proven effective and safe in mice. “We hope the Phase I trials go well,” Tabira said. “Animals are able to recover their functions after developing symptoms, but humans are less able to do so. It may be that this only works in the early stages of the disease, when symptoms are light.” The vaccine is made by inserting amyloid-producing genes into a non-harmful virus. When taken orally, the virus stimulates the immune system to attack and break down the amyloid proteins in the brain. About 4.5 million Americans have Alzheimer’s, a toll expected to reach a staggering 14 million by 2050 with the graying of the population.

Many drug treatments for Alzheimer's have failed, but drug companies soldier onward

Various forms of dementia are being targeted with many different approaches

Cancer vaccines are nearing FDA approval. Those who received Provenge lived four and a half months longer than those who did not.

370,000 Americans lack functioning kidneys. Three or four times every week, they visit a special clinic and sit for four hours as their blood is removed, cleaned and returned to their body. A new ceramic filter has the potential to make kidney dialysis much more efficient and to reduce by 30 minutes to one hour the time required for a dialysis treatment.

In clinical trials of E1-INT, a diabetes regenerative product, more than half of the participants decreased their average daily insulin usage by more than 20 percent or reduced their HbA1c levels (a long-term measure of blood sugar control) by 1.2 to 2 percent in the months post-treatment.

Human trials are starting on an artificial pancreas product

Clinical trials that are available for each type of disease are listed here

March 29, 2007

Progress in Nanotube Transistors

High-current transistors made from perfectly aligned carbon nanotubes show promise for use in flexible and high-speed nanoelectronics.

Tube transistors: Researchers at the University of Illinois at Urbana Champaign have developed a technique to grow thousands of carbon nanotubes (shown in blue and white in this colorized scanning electron micrograph). The researchers deposit electrodes (shown in gold) on two sides of the nanotube arrays to create transistors that have hundreds of nanotubes bridging the electrodes.
Credit: John Rogers, UIUC

In a Nature Nanotechnology paper, the researchers, led by John Rogers, a professor of materials science and engineering at UIUC, have demonstrated transistors made with about 2,000 nanotubes, which can carry currents of one ampere--thousands of times more than the current possible with single nanotubes. The researchers have also developed a technique for transferring the nanotube arrays onto any substrate, including silicon, plastic, and glass.

The nanotube transistors could be used in flexible displays and electronic paper. Because carbon nanotubes can carry current at much higher speeds than silicon, the devices could also be used in high-speed radio frequency (RF) communication systems and identification tags. In fact, the research team is working with Northrop Grumman to use the technology in RF communication devices, says Rogers.

Until now, making transistors with multiple carbon nanotubes meant depositing electrodes on mesh-like layers of unaligned carbon nanotubes, Rogers says. But because the randomly arranged nanotubes cross one another, flowing charges face resistance at each crossing, which reduces the device current. The perfectly aligned array solves this problem because there are "absolutely no tube-tube overlap junctions," Rogers says.

Making a well-ordered array in which parallel nanotubes are connected between the source and drain electrodes is a big achievement, says Richard Martel, a chemistry professor at the University of Montreal. The new work allows a true comparison between nanotube transistors and silicon transistors because an array of nanotubes gives a planar structure similar to silicon devices, he says. "They did exactly what needed to be done, and it's a significant step."

For now, the new transistors will be useful for larger electronic circuits such as those in flexible displays and RF chips, but to be used in high-performance electronics like computer chips, the devices need a much better structure and geometry, says Ali Javey, a nanoelectronics researcher at the University of California, Berkeley. For instance, the devices would need to be much smaller than they are now: the transistors are currently tens of micrometers long and wide.

To make smaller devices, the UIUC team is working on making the arrays denser. Right now, the distance between adjacent tubes is 100 nanometers, but theoretically, this separation could go down to only one nanometer without affecting electrical properties, Martel says.

Future work: find an effective way to make devices with only semiconducting nanotubes. Typically, a third of the nanotubes in any grown batch are metallic, which causes a small current to flow through a transistor even when it is turned off. The researchers use a common trick to get rid of metallic tubes: turn a transistor off and apply a high voltage that blows out the metallic tubes. But to make good-quality transistors on a larger scale, they would need to find a better way to get rid of the metallic tubes or selectively grow semiconducting tubes. That, according to Javey, is the "last big key" for making nanotube electronics.

Dwave puts out a background summary of their Orion quantum computer

A Dwave system pdf that has questions and answers about their quantum computer and their quantum computer plans

Some more useful answers from the comments and discussion:

“For the algorithms you demonstrated, how many times do you have to run the quantum computer to get the right answer?”

Generally the success rates are around 90% for 4-vertex MIS problems and around 85% for 6-vertex MIS problems.
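Success rates in that range are workable because independent repetitions drive the failure probability down geometrically. A quick sketch of the arithmetic, using the 90% and 85% per-run figures quoted above:

```python
import math

def runs_needed(p_success, confidence):
    # Independent runs needed so the chance of seeing at least one
    # correct answer reaches the desired confidence level.
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_success))

# 6-vertex MIS problems at ~85% per-run success:
print(runs_needed(0.85, 0.999))  # 4 runs for 99.9% confidence

# After n runs the residual failure probability is (1 - p)**n;
# three runs at 90% already leave only about a 0.1**3 = 0.001 chance
# of never seeing the right answer.
print((1 - 0.90) ** 3)
```

So even the lower 85% rate means only a handful of repetitions in practice.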

“Do you understand why it is scaling in the manner it is scaling?”

The main issue now is probably calibration of the machine language numbers (the coupler and qubit bias numbers), although we’re going to have to build significantly bigger systems to get enough information to chop into this issue.

“How did you determine that quantum effects are responsible for the operation of your processor?”

I wouldn’t characterize them as “responsible”. They are definitely involved, in the sense that their presence is changing the behaviour of the machine vs. the case where these weren’t there. One of the characterization experiments is the macroscopic resonant tunneling (MRT) experiments discussed at the APS meeting earlier this month. There are a few others.

more answers were provided to room408

Can you outline the main technical challenges that must be overcome before a system such as Orion is scalable and universal?

There are several aspects of the current design that aren’t scalable to the levels we want to be able to achieve (thousands to millions of qubits). The next processor generation, which we’ll release Q4/2007, has planned fixes to all of these. Whether or not the redesigned processor elements do the job of course remains to be seen.

Re. universality, the Hamiltonian of the current system isn’t universal, in the sense that arbitrary states can’t be encoded in its ground state. The Hamiltonian is of the form X+Z+ZZ. It’s known that adding another type of coupling device to give something like X+Z+ZZ+XZ (for example) allows for universal state encoding in the ground state.

Of course this isn’t enough. Making the system we’ve got now universal is of course very hard. The question is whether or not it’s worth trying to do this. Ultimately this question resolves down to the potential value of applications requiring resources the current type of chip can’t provide. It could be that for certain quantum chemistry applications the value of modifying the chip design to get closer to universal QC might be worth the effort, although I see a big opportunity just sticking to discrete optimization problems.

Given the unabated advancement of classical computing technology, and given that Orion doesn’t appear to enable the efficient solution of problems that can’t already be solved efficiently by a classical computer, for what problem size do you expect that a system such as Orion will be able to outperform the fastest classical computer? When do you expect to achieve this?

This is a real hard question to answer. Our best guess is that the ability to solve 256-variable integer programming problems in hardware will be close to break even for certain instance classes.

There are two tough problems to solve in developing a predictive model to answer performance questions. The first is that since this type of system is a “hardware heuristic” there will definitely be instance-dependence and we don’t know what the types of instances that will be best suited to the system look like yet. The second is that there’s no way to predict the scaling advantage from having the system be quantum mechanical. You can look at general arguments to ascertain bounds on performance, but how the machine will function in practice is an entirely different problem. Our attitude is that it’s at least as tough to try to develop realistic models and solve them as it is to actually build real hardware, so we focus on building real hardware and having very fast redesign cycles.

I think also that because the approach we’ve taken can be taken far past the projected cross-over point (up to say a million qubits, which should be able to encode 10s of thousands of variables), even if we’re wrong on where the cross-over point is we can continue building bigger and bigger systems.

March 28, 2007

Light-activated nanoscale scissors from Japan

Researchers in Japan have developed a pair of molecular-scale scissors that open and close in response to light. The tiny scissors are the first example of a molecular machine capable of mechanically manipulating molecules by using light, the scientists say.

The scissors measure just three nanometers in length, small enough to deliver drugs into cells or manipulate genes and other biological molecules, says principal investigator Takuzo Aida, Ph.D., professor of chemistry and biotechnology at the University of Tokyo.

Scientists have long been looking for ways to develop molecular-scale tools that operate in response to specific stimuli, such as sound or light. Biologists, in particular, are enthusiastic about development of such techniques because it would provide them with a simple way to manipulate genes and other molecules.

“It is known, for example, that near-infrared light can reach deep parts of the body,” says Kazushi Kinbara, Ph.D., associate professor of chemistry and biotechnology at the University of Tokyo and co-investigator of the study. “Thus, by using a multi-photon excitation technique, the scissors can be manipulated in the body for medicinal applications such as gene delivery.”

In a recent study, the scientists demonstrated how the light-driven scissors could be used to grasp and twist molecules. The group is now working to develop a larger scissors system that can be manipulated remotely. Practical applications still remain five to 10 years away, the scientists say.

Wealthy consumption, airplanes and the environment

FuturePundit raised a concern that high consumption by a large number of rich people in the future could cause environmental problems

I examine more of the statistics around the current situation. Previously I looked at what the projected wealth situation might look like.

Work is being done to reduce the environmental impact of planes. Molecular nanotechnology will remove limits on the production of planes and large houses, but it will also enable the environmental impact to be greatly reduced.

I think limitations on construction before molecular manufacturing will keep the environmental impact of the very wealthy a small fraction of overall consumption.

The wealthy are starting to buy commercial-airline-size jets. This is still a tiny fraction: 50-100 planes out of the roughly 5,000 Boeing 737s, 1,050 Boeing 757s, 950 Boeing 767s, 1,380 Boeing 747s and the 4,000 or so Airbus planes.

The fairly wealthy who are not high consumption are described in The Millionaire Next Door. The consumption pattern of many high-net-worth people, with an average net worth of $3.5 million, is not as conspicuous as that of some who are very rich.

Private jet owners have an average annual income of $9.2 million and a net worth of $89.3 million. They are 57 years old. And 70% of them are men.
Article profiling of private jet owners
Corporate jet article from USA today

There are 11,000 business jets in the worldwide fleet. This is a little less than the total number of 100+ passenger jets (Boeing and Airbus) used by commercial airlines.

US Fuel consumption figures and projections from the FAA for private jets and planes

By piston-engine planes
2005: 75.1 million gallons.
2017: forecast: 86.5 million gallons.
Average annual growth rate: 1.2 percent.

By jet-engine planes
2005: 793.3 million gallons.
2017: forecast: 2,427.6 million gallons.
Average annual growth rate: 9.8 percent

Total fuel consumed
2005: 1,298.8 million gallons.
2017: estimate: 3,065.3 million gallons.
Average annual growth rate: 7.4 percent.

Global usage of jet fuel is about 71 billion gallons/year, with 19.5 billion gallons used by US airlines. US private planes use about 6.6% of that US total.
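As a sanity check, the FAA growth rates above can be compounded forward from the 2005 figures. The small helper below simply applies the stated average annual rates for 12 years; the 6.6% share is taken to be US private consumption against the US airline total, which is an assumption about how that figure was derived:

```python
def project(base_mgal, annual_rate, years=12):
    # Compound the 2005 base forward at the stated average growth rate.
    return base_mgal * (1 + annual_rate) ** years

piston = project(75.1, 0.012)    # FAA forecast: 86.5 Mgal
jet = project(793.3, 0.098)      # FAA forecast: 2,427.6 Mgal
total = project(1298.8, 0.074)   # FAA forecast: 3,065.3 Mgal
print(round(piston, 1), round(jet, 1), round(total, 1))

# Private-plane share: 1,298.8 million gallons against the roughly
# 19,500 million gallons burned by US airlines per year.
print(round(100 * 1298.8 / 19500, 1))  # ~6.7 percent
```

The projections land within rounding error of the published forecasts, so the table is internally consistent.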

A Forbes article about the booming private jet market. Private jets increased 35% in first half of 2006.

Small jets as an alternative to flying first class

The air taxi model: utilizing 5,000 small airports in the USA (out of 14,000 total airports), not just the 30 big ones and 550 commercial ones. There will likely be a similar increase in the usage of small airports worldwide.

Further reading:
NASA's vision of air taxis
air taxi vision
NASA's vision of small jet systems linking small airports
A study of the small air transportation program
The summary of the section examining impacts of the small air transportation system, including environmental impacts.

Some research on rich people

Spending patterns for some of the rich

Very light jet market 2007-2016

This study predicts the delivery of 4,124 VLJs during 2007-2016, which, added to the 30 or so delivered in 2006, will put the global VLJ fleet at 4,154 aircraft. Some industry observers critical of the manufacturer-supplied growth rates contend that only two, or at best three, manufacturers will make it to market, but the authors of this report are slightly more optimistic. We believe that, in addition to the five key programmes featured in this report (A700, Cessna Mustang, Embraer Phenom 100, Diamond D-Jet and Eclipse 500), there will be other new entrants who succeed in producing aircraft for the personal jet market. Estimates for these are listed in the HondaJet/new entrant line, with an average aircraft sales price of $2.8 million throughout the timeframe of the study – equivalent to the HondaJet reported retail price in 2010.

Here is a study of more environmentally benign aircraft

MIT looks at aviation and the environment.

A pdf with a government view of a vision for Air transportation in 2025

Far infrared can be used for wireless thousands of times faster

New research shows that high-frequency terahertz signals can be switched on and off to carry data in the digital code of ones and zeroes, and that it someday may be possible to build superfast switches to carry data at terahertz speeds. That is 1,000 times faster than the gigahertz rates of fiber optic lines that carry data as near-infrared and visible light, and 10,000 times faster than the microwaves that carry cordless and cell phone conversations.

No one has built terahertz switches, but Nahata says the new study shows it is possible to use terahertz radiation to carry data and thus may be possible to create terahertz-speed switches for superfast wireless communication over short distances, such as between a cellular phone and headsets, a wireless mouse and a computer, and a PDA (personal digital assistant) and a computer.

University of Utah researchers have shown it is possible to harness far-infrared light -- also known as terahertz electromagnetic radiation -- for use in superfast wireless communications and to detect concealed explosives and chemical or biological weapons. The researchers shined far-infrared light on metal foils punctured with holes arranged in what are known as quasicrystal and quasicrystal-approximate patterns. Even though the holes make up only a portion of each foil's surface, almost all the radiation passed through the metal foils with these patterns. This photo shows a quasicrystal pattern. Credit: Tatsunosuke Matsui, University of Utah
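For orientation, a simple frequency-to-wavelength conversion shows where terahertz radiation sits on the spectrum and why it is also called far infrared (standard physics, nothing specific to this study):

```python
C = 299_792_458.0  # speed of light in m/s

def wavelength_mm(freq_hz):
    # wavelength = c / f, converted to millimeters
    return C / freq_hz * 1e3

print(wavelength_mm(2.4e9))   # ~125 mm: cordless/cell-phone microwaves
print(wavelength_mm(1e12))    # ~0.3 mm: terahertz / far infrared
print(wavelength_mm(3e14))    # ~0.001 mm (1 micron): near infrared
```

The 0.3 mm terahertz wavelength is also why sub-millimeter hole patterns in metal foils, like the quasicrystal patterns above, interact so strongly with the radiation.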

Terahertz lasers can work at over 25 meters

Projecting future wealth

2006        2005       2004      Wealth Amount
5           3          2         US$30B+
67          49         32        US$10B+
167         124        102       US$5B+
946         793        691       US$1B+
10,000      8,200      7,500     US$160M+ (my own estimate)
100K (e)    85,400     77,500    US$30M+ (UHNW, ultra high net worth class)
1M (e)      820,000    745,000   US$5-30M
7.8M        7.4M       --        US$1-5M (global; US is about 33%, or 2.6 million)
8.7M        8.2M       --        US$1M+ (global; US is about 33%, or 2.6 million)
~24M        ~22M       --        US$500K-1M (excludes primary residence; estimate)

Note: the Forbes-derived billionaire counts mainly catch owners of public assets and can underestimate some, like the CTO of Cisco, who may be a billionaire from Cisco stock plus large startup positions.

The affluent class with over $1M (not including primary residence) is growing 8-11% each year.

This does not factor in global inflation of between 3-4% per year.
Plus the weakening dollar has increased the number of wealthy people who primarily have assets in other currencies.
This projection into the future of recent trends assumes that many things stay consistent. This will become less and less likely as time passes. However, it is a view of what happens if things stay roughly consistent. Over longer time frames the world growth rate has increased relative to past centuries. Radical improvements in technology and access to space resources would improve the overall wealth situation in most cases.

In 7-9 years most of the people in the $500K group will have moved up to the $1M+ level.
In 11-16 years the people in the $500K group would mostly advance to the $1M+ level on an inflation-adjusted basis.

In 20 years, expect 70 million millionaires worldwide. About 20 million in the US.

There will be 6000 to 26000 billionaires in 20 years (10% or 17% growth rate). (30-40 years on an inflation adjusted basis). A mid-range estimate is 15,000 billionaires in 2027.

Projecting farther into the future, every 20-40 years a given wealth category will have about as many people as the next lower category has now: 40 years on an inflation-adjusted basis, 30 years with more conservative growth and 20 years with optimistic growth.
So in 60 years, the $30M+ people or their descendants probably move up two spots to the billionaire class; 80 years for the inflation-adjusted billionaire class.
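The doubling times and billionaire counts above follow from straightforward compound growth. A quick check, using the 946 billionaires on the 2007 Forbes list as the base:

```python
import math

# Years for a fortune to double at the 8-11% annual growth quoted
# for the $1M+ group:
for rate in (0.08, 0.11):
    print(round(math.log(2) / math.log(1 + rate), 1))  # 9.0 and 6.6 years

# Billionaire count 20 years out at 10% and 17% annual growth:
for rate in (0.10, 0.17):
    print(round(946 * (1 + rate) ** 20))
# -> roughly 6,400 and 22,000, close to the 6,000-26,000 range quoted
#    and bracketing the mid-range estimate of 15,000
```

The 6.6-9.0 year doubling window is what drives the "7-9 years from $500K to $1M+" figure.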

Aggregate wealth of the top 200 wealthiest in the world has gone up 7 times over 20 years (Forbes).
There is a mass affluent group forming.

Household income distribution in the US is described in detail at Wikipedia. There are 113 million households.

Those with the most wealth are not necessarily those with the most income. Someone with a large net worth could choose not to churn assets or draw large amounts of taxable income.

The US once had a large middle class and a small wealthy class. Something similar is happening now in China, where a growing lower middle class will be followed by the upper middle class becoming dominant demographically.

Scientists Create First Non-Carbon Material with Near-Diamond Hardness

The material is a boron nitride “nanocomposite.” This means that, rather than consisting of one large continuous crystal, it is made of crystalline boron-nitride grains that are each a few to several nanometers in size. Although research groups have previously reported boron carbonitride materials, claimed to be the second and third hardest materials after diamond, the particular versions, or “phases,” of those materials were unstable at high temperatures.

Single-crystal diamond, the hardest type, has a hardness of about 100 GPa. The boron nitride nanocomposite displayed a maximum hardness of 85 GPa at a grain size of about 14 nanometers, and is thermally stable up to 1600 kelvin (about 2400 degrees Fahrenheit). Prior to this research, the next hardest known material after single-crystal diamond was cubic boron nitride, a single-crystal phase of the material, which has a Vickers hardness of 50 GPa.

March 27, 2007

State of Virtual Reality

Center for Responsible Nanotechnology has a recent article that recalls a passage from Unbounding the Future (1991) which discusses virtual reality with sensory feedback

What do we have now for the state of the art in virtual reality?

We have a 100-million-pixel virtual reality room.
The 10-foot by 10-foot cube room has new equipment -- a Hewlett-Packard computer cluster featuring 96 graphics processing units, 24 Sony digital projectors, an eight-channel audio system and ultrasonic motion tracking technology.

Virtusphere is another interface system

Haptic gloves are getting quite advanced. A realistic touch interface must ideally be able to change 500 times per second or more. (The human visual system will be fooled by images that change 20 times per second.)

Haptic body suits have been under development for quite a while

Gesture recognition is being incorporated into many devices and is coming to game systems like the Nintendo Wii, Microsoft Xbox, and the Sony PlayStation 2 and PlayStation 3.

Samsung continues to double flash density each year

Samsung releasing 64 GB flash drive in Q2 2007

The read and write performance of the drive has been increased by 20 and 60 percent respectively (over last year's 32 GB unit): the 64 GB unit can read at 64 MB/s, write at 45 MB/s, and consumes just half a watt when operating (one tenth of a watt when idle). In comparison, an 80 GB 1.8-inch hard drive reads at 15 MB/s, writes at 7 MB/s, and eats 1.5 watts whether operating or idle.

Last year's 32 GB drive announcement

Samsung has terabit flash memory plans

Top down nanotechnology trends and current status

Current state of the art for arrays of scanning tunneling microscopes (STM) or atomic force microscopes (AFM) or nanostructured patterning:

The IBM millipede (described at Wikipedia) uses an array of atomic force probes. IBM is hoping for a possible 2007 commercial launch with 64x64 cantilever chips and a 7 mm x 7 mm data sled that uses 10 nanometer pits.

Nanoink has 55,000 AFMs in a working dip-pen array of atomic force microscopes (more info is here)

They can perform high-throughput nanopatterning. The 2D nano PrintArray™ can cover a square centimeter with nanoscale features, patterning on the order of 10^7 square micrometers per hour. Using established templating techniques, these advances enable screening for biological interactions at the level of a few molecules, or even single molecules.
55,000 images were produced with the Nanoink array in only 30 minutes. Each identical nickel image is 12 micrometers wide -- about twice the diameter of a red blood cell -- and is made up of 8,773 dots, each 80 nanometers in diameter. That is about 17,500 dots per element in the array per hour, or roughly 5 actions per second for each AFM in the array.
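The per-tip writing rate implied by that demonstration works out as follows:

```python
# Each of the 55,000 tips wrote one 8,773-dot image in 30 minutes.
dots_per_image = 8773
minutes = 30

per_hour = dots_per_image * (60 / minutes)
per_second = dots_per_image / (minutes * 60)
print(per_hour)                   # 17,546 dots per tip per hour
print(round(per_second, 1))       # ~4.9 dots per tip per second
print(round(55000 * per_second))  # array total: ~268,000 dots per second
```

The aggregate rate is what makes a massively parallel array interesting even though each individual tip is slow.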

1 million probe array has been built

Electron beams and electron beam induced deposition

In 2005, electron beams had 2 nanometer focus

Electron beam induced deposition

1 nanometer focus has been achieved with electron beams. An electron beam induced deposition system with 100 beams and the ability to deposit one-nanometer-sized dots is being created

Electron beams are used in the e-line fabrication system

The Memjet inkjet has higher drop volume, for larger dots (commercial release in 2008).
Each chip measures 20 millimeters across and contains 6,400 nozzles with five color channels, the company said. A separate driver chip calculates 900 million picoliter-sized drops per second. For a standard A4 letter printer, the result is a total of 70,400 nozzles. The dots are about one micron by one micron in size.
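The Memjet numbers are internally consistent, as a little arithmetic shows (drop volume is taken as one picoliter here, an assumption based on the "picoliter-sized" description):

```python
nozzles_per_chip = 6400
total_nozzles = 70400
print(total_nozzles // nozzles_per_chip)  # 11 print chips tile an A4 page width

drops_per_second = 900e6  # calculated by one driver chip
drop_volume_pl = 1.0      # assumed: "picoliter-sized" drops
ink_ul_per_s = drops_per_second * drop_volume_pl / 1e6  # picoliters -> microliters
print(ink_ul_per_s)  # 900 microliters of ink per second at full rate
```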

2007: 45 nanometer lithography; 65 nanometer is common
2009: 32 nanometer lithography
2011: 22 nanometer lithography; 32 nanometer would be common (4X smaller in feature area than 2007's common systems)
2013: 16 nanometer lithography
2015: 16 nanometer common, about 16 times smaller in feature area than the 65 nanometer systems common in 2007.
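The "4X" and "16 times smaller" figures in the roadmap above refer to feature area; taking 65 nm as the common 2007 node:

```python
def area_shrink(old_nm, new_nm):
    # Linear shrink squared gives the reduction in feature area.
    return (old_nm / new_nm) ** 2

print(round(area_shrink(65, 32), 1))  # ~4.1x smaller features by 2011
print(round(area_shrink(65, 16), 1))  # ~16.5x smaller features by 2015
```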

A projected advance from about 5 actions per second in 2007 to 2,000 actions per second per AFM/STM in 2015. Several chips with 20-million-AFM/STM arrays, up to one billion microscope elements.

4 million nozzles. One-hundredth-of-a-picoliter drops.

Could advanced metamaterial-based lithography bring some form of lithography-like fabrication down to 1 nanometer features, even for relatively lower volume production? Near-nano graphene etching?

Does this transition into some form of top-down molecular nanotechnology bootstrap?

Another summary of Robert Bussard's inertial electrostatic fusion system

A description of how Robert Bussard's inertial electrostatic fusion system would work

His fuel of choice is one of the earth’s most common and least exotic elements: boron. It can be scooped from the Mojave Desert in California, possibly even extracted from sea water. Boron is used in the production of hundreds of products as diverse as flame retardants, electronic flat panel displays and eye drops. It’s so common that no country, company or individual could corner the market on the fuel supply.

The process Bussard hopes to perfect would use boron-11, the most common form of the element. Bussard says his experiments — which achieved fusion with deuterium, not boron — in November 2005 proved that the boron process will work.

The boron reactor would be similar to, but more powerful than, the reactor that blew up in 2005.

Bussard’s reactor design is built upon six shiny metal rings joined to form a cube — one ring per side. Each ring, about a yard in diameter, contains copper wires wound into an electromagnet.

The reactor operates inside a vacuum chamber.

When energized, the cube of electromagnets creates a magnetic sphere into which electrons are injected. The magnetic field squeezes the electrons into a dense ball at the reactor’s core, creating a highly negatively charged area.

To begin the reaction, boron-11 nuclei and protons are injected into the cube.
Because of their positive charge, they accelerate to the center of the electron ball. Most of them sail through the center of the core and on toward the opposite side of the reactor. But the negative charge of the electron ball pulls them back to the center. The process repeats, perhaps thousands of times, until the boron nucleus and a proton collide with enough force to fuse.

That fusion turns boron-11 into highly energetic carbon-12, which promptly splits into a helium nucleus and a beryllium nucleus. The beryllium then splits into two more helium nuclei.

The result is three helium nuclei, each having almost three million electron volts of energy.

The force of splitting flings the helium nuclei out from the center of the reactor toward an electrical grid, where their energy would force electrons to flow — electricity.

This direct conversion process is extraordinarily efficient: about 95 percent of the fusion energy is turned into electricity.
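The energy figures quoted above can be checked with a standard mass-deficit calculation for p + B-11 -> 3 He-4 (atomic masses in unified atomic mass units; these are textbook values, not from the article):

```python
M_H1 = 1.00782503    # hydrogen-1
M_B11 = 11.00930536  # boron-11
M_HE4 = 4.00260325   # helium-4
U_TO_MEV = 931.494   # MeV energy equivalent of 1 atomic mass unit

q_mev = (M_H1 + M_B11 - 3 * M_HE4) * U_TO_MEV
print(round(q_mev, 2))      # ~8.68 MeV released per fusion event
print(round(q_mev / 3, 2))  # ~2.89 MeV per helium nucleus, matching the
                            # "almost three million electron volts" above
```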

March 26, 2007

Three great PDFs to look at

Robert Bussard's inertial electrostatic fusion work. The gear from the last round of testing is with Jim Benson of SpaceDev in San Diego.
More info on Robert Bussard's inertial electrostatic confinement fusion

It would take $250 million and 5 years to create a full-scale demonstration plant, which should be able to produce more power than it uses.

Robert Bussard's wikipedia info

Two Dwave quantum computer PDFs:
A recent short technical presentation on their adiabatic quantum computer

The slides from the Feb 13, 2007 quantum computer demo

IBM optical chip will allow moving 20 Gbytes/sec

IBM showed a prototype optical transceiver chip set Monday that it said will allow people to download movies or share online data at 160 Gbit/sec.

IBM says it can meet that need, building its new chip set by making an optical transceiver with standard CMOS technology and combining that with optical components crafted from exotic materials such as indium phosphide and gallium arsenide. The resulting package is just 3.25 mm by 5.25 mm in size, small enough to be integrated onto a printed circuit board.

Although all those technologies exist today, it will probably be at least three years until suppliers can produce enough parts for IBM to bring optical transceivers into its product stream, the company said.

When it does arrive, the part could have an immediate impact on applications from computing to communications and entertainment. A PC using that board would be able to reduce the download time of a typical high-definition feature-length movie from 30 minutes to one second, the company said.
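The "30 minutes to one second" claim implies a movie of roughly 20 GB; that size is an inference from the quoted numbers rather than anything IBM specified:

```python
link_gbit_s = 160
movie_gbytes = 20  # assumed size consistent with the 1-second claim

seconds_new = movie_gbytes * 8 / link_gbit_s
print(seconds_new)  # 1.0 second at 160 Gbit/s

# The same file taking 30 minutes implies the older link ran at about:
old_mbit_s = movie_gbytes * 8 * 1000 / (30 * 60)
print(round(old_mbit_s))  # ~89 Mbit/s
```

In other words, the comparison baseline is a fast-for-2007 link of under 100 Mbit/s, not a dial-up connection.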

Water used for nuclear power is not destroyed

One of the main uses of water in the power industry is to cool the power-producing equipment. Water used for this purpose does cool the equipment, but at the same time, the hot equipment heats up the cooling water. Overly hot water cannot be released back into the environment -- fish downstream from a power plant releasing the hot water would get very upset. So, the used water must first be cooled. One way to do this is to build very large cooling towers and to spray the water inside the towers. Evaporation occurs and water is cooled. That is why large power-production facilities are often located near rivers, lakes, and the ocean. The evaporated water returns to the water cycle and comes back as rain.

Alternative nuclear reactors can be built to use molten salt in place of water for cooling and heat transfer.

Production of electrical power is one of the largest uses of water in the United States and worldwide. Water for thermoelectric power is used in generating electricity with steam-driven turbine generators. In 2000, about 195,000 million gallons per day (Mgal/d) were used to produce electricity (excluding hydroelectric power). Surface water was the source for more than 99 percent of total thermoelectric-power withdrawals. In coastal areas, the use of saline water instead of freshwater expands the overall available water supply. Saline withdrawals from surface-water sources accounted for 96 percent of the national total saline withdrawals. Thermoelectric-power withdrawals accounted for 48 percent of total water use, 39 percent of total freshwater withdrawals for all categories, and 52 percent of fresh surface-water withdrawals.
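The percentages above imply a total national withdrawal figure, which can be recovered with a one-line consistency check (using only the numbers quoted in the paragraph):

```python
# Total US water withdrawals implied by the figures above.
THERMO_MGAL_D = 195_000  # thermoelectric withdrawals, Mgal/d (2000)
THERMO_SHARE = 0.48      # thermoelectric share of total water use

total_mgal_d = THERMO_MGAL_D / THERMO_SHARE
print(f"Implied total withdrawals: {total_mgal_d:,.0f} Mgal/d")
```

That works out to roughly 406,000 Mgal/d across all uses.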

New Life extension possibility and stem cell treatment for hearts

A method of using isotopes to make bonds in the body more resistant to free radical damage could lengthen human lives by 10% or about 10 years. It does not reverse free radical damage, so it would be less helpful to older people.

A team led by Mikhail Shchepinov, formerly of Oxford University, fed nematode worms nutrients reinforced with natural isotopes (naturally occurring atomic variants of elements). In initial experiments, the worms' life spans were extended by 10 percent, which, for humans already expected to live close to a century, could add a further 10 years to human life.

Food enhanced with isotopes is thought to produce bodily constituents and DNA more resistant to detrimental processes, like free radical attack. The isotopes replace atoms in susceptible bonds making these bonds stronger. 'Because these bonds are so much more stable, it should be possible to slow down the process of oxidation and ageing,' Shchepinov says.

The isotopes could be used in animal feed so that humans could get the "age-defying" isotopes indirectly in steaks or chicken fillets, for example, rather than eating chemically enhanced products themselves. Shchepinov says an occasional top-up would be sufficient to have a beneficial effect.

Ageing experts are impressed with the isotopic approach. Aubrey de Grey, the Cambridge-based gerontologist, says it could be very relevant to the rates of several chemical and enzymatic processes involved in ageing. 'It is a highly novel idea,' he says. 'It remains to be seen whether it can be the source of practicable therapies, but it is a prospect that certainly cannot be ruled out.'

Doctors have rejuvenated post-heart-attack patients by injecting them with stem cells. Hare and his team intravenously injected stem cells into 53 patients within 10 days of a heart attack. They randomly assigned patients one of three doses (0.5 million, 1.6 million, or 5.0 million cells per kilogram) and compared the dosages with a placebo.
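Since the trial dosed per kilogram of body weight, the absolute cell counts scale with patient size. A quick sketch (the patient weight is a hypothetical figure for illustration, not from the study):

```python
# Per-patient cell counts implied by the three trial arms.
DOSES_PER_KG = [0.5e6, 1.6e6, 5.0e6]  # cells per kg, the three dose arms
PATIENT_KG = 80                        # assumed patient weight, for illustration

totals_millions = [d * PATIENT_KG / 1e6 for d in DOSES_PER_KG]
for dose, total in zip(DOSES_PER_KG, totals_millions):
    print(f"{dose / 1e6} M cells/kg -> {total:.0f} million cells total")
```

For an 80 kg patient, the arms span roughly 40 to 400 million cells, a tenfold dose range.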

Over six months, the patients receiving the stem-cell treatment had better heart and lung function with fewer arrhythmias.

Echocardiography also showed better heart function, especially in patients with greater heart damage.
