May 23, 2008

Hyperion uranium hydride nuclear battery update

The Hyperion reactor module is 1.5 meters (about 5 feet) across.

The Hyperion Power Generation "nuclear battery" is a self-contained, automated, liquid metal nuclear reactor. The company has venture capital funding.

Each HPM provides 70 MW of thermal energy, or 25 MW of electric energy via steam turbine, for seven to ten years. That is enough electricity for 20,000 average American-style homes or the industrial or infrastructure equivalent. Each module will cost $25 to $30 million, which works out to $1,000-1,200/kW, though the company has quoted $1,400/kW. A couple of delivery dates starting in 2013 are available; 2012 has been targeted as the time when the first units will be deployed.
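The cost-per-kilowatt figure follows directly from the module price and the electric output; a quick sketch of the arithmetic (function and variable names are my own, not from the article):

```python
# Capital cost per kW of electric capacity, from the article's figures:
# $25-30 million per module, 25 MW electric output.
def cost_per_kw(module_cost_usd, electric_mw):
    """Dollars per kilowatt of electric capacity."""
    return module_cost_usd / (electric_mw * 1000)

cost_low = cost_per_kw(25_000_000, 25)   # -> 1000.0 $/kW
cost_high = cost_per_kw(30_000_000, 25)  # -> 1200.0 $/kW
```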

Nextbigfuture has previously examined the patent for this reactor and its primary initial application: providing cheaper and more effective heat for oilsands and oil shale oil extraction. Over 2 trillion barrels of oil are available in Canada and the United States in the form of oil shale and oilsands.

Hyperion offers a 70% reduction in operating costs (based on costs for field generation of steam in oil-shale recovery operations), from $11 per million BTU for natural gas to $3 per million BTU for Hyperion. Mass production and standardization of design allow for significant savings. They expect an initial market of 4,000 units, which would provide 100 GW of power, equal to the current nuclear power generated in the USA. There would be 10-40 times less nuclear waste because the units achieve 50% burnup of the nuclear fuel.

Currently about 1100 cubic feet of natural gas is needed to extract one barrel of oil from the Alberta oilsands. This could decline to 900 cubic feet with more efficient processes. One cubic foot of natural gas is about 1000 BTU, so 900,000 to 1.1 million BTU are needed to extract one barrel of oil from the oilsands.
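Combining these figures with the steam costs quoted above ($11/MMBtu for natural gas versus $3/MMBtu claimed for Hyperion) gives a rough per-barrel energy cost; a sketch, with names of my own choosing:

```python
# Back-of-envelope steam-energy cost per barrel of oilsands oil,
# from the article's numbers: ~1000 BTU per cubic foot of gas,
# 900-1100 cf per barrel, $11/MMBtu (gas) vs $3/MMBtu (Hyperion).
BTU_PER_CF = 1000

def mmbtu_per_barrel(cf_per_barrel):
    return cf_per_barrel * BTU_PER_CF / 1_000_000

def energy_cost_per_barrel(cf_per_barrel, usd_per_mmbtu):
    return mmbtu_per_barrel(cf_per_barrel) * usd_per_mmbtu

gas_cost = energy_cost_per_barrel(1100, 11)      # ~ $12.10 per barrel
nuclear_cost = energy_cost_per_barrel(1100, 3)   # ~ $3.30 per barrel
```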

Other benefits:
- Water is not used as coolant; the reactor cannot go “supercritical” or get too hot.
- No mechanical parts in the core to malfunction.
- Sealed module, never opened on site.

The uranium hydride reactor could also have a large impact on space travel.

Canadian researchers have considered using conventional nuclear reactors to provide steam for oilsands oil recovery.

A 728 MWe (gross) nominal electric output ACR-700 design generates 1983 MW (thermal). The CANDU reactor can be adapted to provide steam of 2-6 MPa.
An ACR-700 would, in one configuration, provide 140 MWe (net) and 420,000 barrels/day of steam at a supply pressure of 2.2 MPa. The production rate of bitumen from this steam depends on the steam/oil ratios required in the SAGD wells. For steam/oil ratios of 2.0 to 2.5, the bitumen production rates would be 168,000-210,000 bbl/day. The project would achieve a 10% advantage in steam cost even if natural gas were at US$3.25/MMBtu. The twin 2.2 GWe reactor proposal would generate 507,000 to 634,000 bbl/day in a similar configuration with similar assumptions.
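The bitumen figures are consistent with SAGD steam/oil ratios of roughly 2.0 to 2.5 (derived here from the quoted rates, since the original figures are garbled); a sketch of the arithmetic, with names of my own choosing:

```python
# Steam/oil-ratio arithmetic behind the ACR-700 figures:
# 420,000 bbl/day of steam divided by the SAGD steam/oil ratio
# gives the bitumen production rate.
def bitumen_rate(steam_bbl_per_day, steam_oil_ratio):
    return steam_bbl_per_day / steam_oil_ratio

bitumen_low = bitumen_rate(420_000, 2.5)   # -> 168000.0 bbl/day
bitumen_high = bitumen_rate(420_000, 2.0)  # -> 210000.0 bbl/day
```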

The uranium hydride batteries could provide heat at many smaller oil properties, so less piping would be necessary. The uranium hydride reactors would also be more portable and could be moved from project to project.

Breakthrough in Waferscale self-assembly of nanostructures

Researchers at Northeastern University have developed a technique to scale up the directed assembly of single-walled carbon nanotube (SWNT) networks, from microns to inches, creating a viable circuit template that can be transferred from one substrate to another for optimum productivity. The revolutionary assembly process has the potential to change the way electronics and other applications are developed for consumers.

The CHN (Center for High-rate Nanomanufacturing) at Northeastern University has developed a novel way to assemble nanoelements (nanotubes, nanoparticles, etc.) into nanostructures and devices. The approach enables the mass production of atomic-scale structures and should lead to devices such as biosensors, batteries, memory devices and flexible electronics being produced quickly, efficiently and with minimal errors.

The revolutionary assembly process, developed by Busnaina and his team, scales up the nanoscale structures to the wafer level on a variety of hard and soft substrates such as silicon and polymers. In addition, the assembled structures can be transferred to other substrates in continuous or batch processes.

Concurrently, researchers at the CHN are investigating the environmental and biological implications to ensure that these devices and techniques are safe for people and for the environment.

Research at CHN

1 & 2: Nanotemplate-enabled High Rate Manufacturing
Once developed, CHN’s nanotemplates will be integrated as tooling for an economically realistic production process. The nano-building blocks will be guided to self-assemble over large areas in high-rate, scalable, commercially viable processes such as injection molding and extrusion. CHN researchers have successfully assembled both carbon nanotubes and 50-nm polystyrene latex particles on gold microwires and nanowires.

3: Proof of concept : Memory Device and Biosensor
Carbon nanotubes are used for electromechanical switches (working with Nantero). The second testbed is biosensors for 8-10 minute detection of antibodies, working with Triton Systems.

May 22, 2008

Canada's natural gas and CO2 stimulated oil

Encana is the largest natural gas company in Canada and is active in the Horn River and Montney natural gas fields in British Columbia. Encana has proven reserves of 13.3 trillion cubic feet of natural gas.

Horn River wells are currently producing in the order of 3 to 5 million cubic feet a day per well. Encana plans to drill 50 to 100 wells per year into this formation.
EnCana claims the initial discovery at Horn River may have 6 trillion cubic feet of gas.

Encana has also sequestered 10 million tons of CO2 at Weyburn and could go to at least 30 million tons with full development. They use the CO2 to enhance oil recovery to 50%, and they get 14,000 barrels a day from the Weyburn field.

The Encana portion of the Montney has 500 million to 1 billion cubic feet per day of long-term potential. [35.3 cubic feet per cubic metre, so that amount is 14.16 million to 28.3 million cubic metres/day, 3-6% of Canada's total natural gas production.] They are currently getting 120 million cf/day. Well rates in the Montney are very consistent, between 5 and 10 million cubic feet a day. Encana's production from their portion of the Montney could reach 6-12% of Canada's total current natural gas production.

New natural gas finds in Canada:
- Ootla, about 60 miles from Fort Nelson in northeastern British Columbia, may hold 9 trillion to 16 trillion cubic feet of gas. Horizontal wells test flowed at rates of 8.8 million cubic feet, 6.1 million cubic feet and 5.3 million cubic feet of gas a day.
- Montney find in BC (50-80 trillion cf)
- the Horn River basin (12+ trillion cf.)
- Quebec Utica Shale: based on Canadian research on the play to date, the resource is estimated at between 24 and 30 trillion cubic feet of natural gas.
- Smaller but significant find of 1.6 tcf in Southern Ontario

The total new reserves are 97 tcf to 140+ tcf. If they were developed with production rates proportional to Encana's efforts, they would provide 8 bcf/day to 22 bcf/day. The current projection is for Canada to produce 15 bcf/day in 2009 [5.5 tcf per year]. So these new finds appear likely to reverse the decline in Canada's natural gas production.
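A quick tally of the finds listed above reproduces the 97 to 140+ tcf range (where the article gives only a single "+" figure, the sketch below uses it for both bounds):

```python
# Tallying the new Canadian natural gas finds (trillion cubic feet),
# as (low, high) estimate pairs from the list above.
finds_tcf = {
    "Ootla": (9, 16),
    "Montney": (50, 80),
    "Horn River": (12, 12),          # "12+" - single figure used for both
    "Quebec Utica": (24, 30),
    "Southern Ontario": (1.6, 1.6),
}

low_total = sum(lo for lo, hi in finds_tcf.values())   # ~96.6 tcf ("97 tcf")
high_total = sum(hi for lo, hi in finds_tcf.values())  # ~139.6 tcf ("140+ tcf")
```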

EOG Resources could have 6 trillion cubic feet of natural gas reserves in their portion of the Horn River Basin. They have not released new test well data.

Nextbigfuture's initial article about Canada's natural gas

Carnival of Space Week 55

May 21, 2008

Biomarkers for bloodtests and protein biomarkers for imaging for effective early stage cancer screening

Early detection saves lives because treatment is more effective, and it can be 100 times cheaper to treat early-stage versus late-stage cancer. The Canary Foundation's goal is to deliver early detection tests for solid tumor cancers by 2015. Cancer treatment cost $89 billion in the U.S. in 2007, over 1.4 million new cancer cases are expected in 2008 in the US alone, yet less than 15% of research funding goes to early detection. Early detection has proven value: since 1950, there has been a 70 percent decline in cervical-cancer incidence and deaths in developed countries thanks to a simple $8 screening test, the Pap test. Effective early cancer tests could save over $50 billion per year in medical costs, 400,000 lives each year in the USA, and 5 million lives around the world; 7 million people die from cancer each year worldwide.

Cancer researchers met at Stanford University to work toward a goal of developing a simple blood test to detect cancer.

The symposium by the Canary Foundation allowed doctors to share their research in developing a simple two-stage test for cancer. They're hoping to deliver an early detection test for solid tumor cancers by 2015, said Dr. Don Listwin, founder of the Canary Foundation.

The blood test, which would look for proteins given off by cancer cells, could detect the disease at its earliest stages, when treatment would be most effective.

Slide images and information are from the Canary Foundation presentation

A major investment in imaging is a key difference with Canary. As opposed to anatomical imaging, they strive to create probes that home in on the cancer and light it up for the surgeon. Canary's goal is to deliver two-stage tests for all solid tumors.

For the blood test, they need to combine multiple biomarkers to find ovarian cancer; so far, there are no adequate single markers. Canary believes that combinations of three to five markers, or “panels”, will identify early cancer. They have a blood test that is quite promising at 0.960 in early stage cancer. So what is good enough? If the next step is to cut someone open, then 0.96 isn't nearly good enough: being 4% wrong when you screen millions of women is a disaster. But with a next-stage imaging test to confirm, deny or monitor, it is good enough. Without imaging, this test would need to be 0.999, which will be very challenging if not impossible. Their focus is on new molecular imaging as opposed to anatomical imaging like X-ray or mammography.
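The "4% wrong" concern can be made concrete with a little arithmetic. This sketch assumes the 0.960 figure is specificity (the article does not say precisely which metric it is) and, for simplicity, treats nearly the whole screened population as healthy:

```python
# Why a 0.96 test is "not near good enough" for population screening
# without a confirmatory second stage: the false positives swamp
# the true cases.
def false_positives(population, specificity, prevalence=0.0):
    """Healthy people incorrectly flagged by the screening test."""
    healthy = population * (1 - prevalence)
    return healthy * (1 - specificity)

fp_096 = false_positives(1_000_000, 0.960)   # ~40,000 flagged per million
fp_999 = false_positives(1_000_000, 0.999)   # ~1,000 flagged per million
```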

They look for proteins specific to cancer that exist on the cell surface, and create probes specific to those cells that light them up. Biomarkers for imaging are proteins that stick to the cell surface; blood biomarkers are proteins that shed and circulate in the blood.

Nextbigfuture has been in favor of the aggressive research into biomarkers and the development of inexpensive tests for early disease detection.

Cheap titanium, virtual telescope, property rights in space

Titanium could become a lot cheaper and more commonly used. A non-melt consolidation process being developed by Oak Ridge National Laboratory and industry partners could reduce the energy required and the cost to make titanium parts from powders by up to 50 percent, making it feasible to use titanium alloys for brake rotors, artificial joint replacements, space vehicles and military vehicles.

Peter noted that the non-melt approach, which includes roll compaction for directly fabricating sheets from powder, press and sinter techniques to produce net shape components and extrusion, offers many advantages over traditional melt processing.

“Instead of using conventional melt processing to produce products from titanium powder, with the new method the powders remain in their solid form during the entire procedure,” Peter said. “This saves a tremendous amount of energy required for processing, greatly reduces the amount of scrap and allows for new alloys and engineered composites.”

While powder metallurgy has been used to produce components for many years, titanium products have not widely been fabricated using these methods because of the high cost of conventional titanium powders. Now, however, new low-cost titanium powders are enabling ORNL, International Titanium Powders, Ametek and BAE Systems to develop these technologies for titanium.

Microsoft Research has launched the WorldWide Telescope (WWT). It is a Web 2.0 visualization software environment that enables your computer to function as a virtual telescope—bringing together imagery from the best ground and space-based telescopes in the world for a seamless exploration of the universe. WWT is a single rich application portal that blends terabytes of images, information, and stories from multiple sources over the Internet into a seamless, immersive, rich media experience. You have to download and install a component to use it.

Glenn Reynolds, the instapundit and a lawyer, talks about lunar property rights

Property rights attract private capital and, with government space programs stagnating, a lunar land rush may be just what we need to get things going again.

A longer piece in the Boston Globe about how allowing property rights in space will boost the motivation for development of space.

For thinkers like Wasser, celestial private property is important not simply because of helium-3 mining or moon-based solar arrays, but because it would allow for large-scale colonization. In such a future, Wasser believes, it simply doesn't make sense not to have private property in space, any more than it would make sense for people in the United States not to be able to buy and sell and inherit their homes.

The colonization of space, in this model, would unfold as a sort of interplanetary suburbanization, with the moon and other celestial bodies being settled thanks to reliable transportation and the ready availability of private plots of land. For all the technological marvels required to make this happen, it's a story Americans are pretty familiar with.

Nextbigfuture has indicated that there should be something like the 1862 Homestead Act for space.

May 20, 2008

Carbon nanotubes might increase risk of cancer, mesothelioma

Carbon nanotubes appear to have a cancer-causing effect similar to that of asbestos in mice.

Within days of being injected into mice, the nanotubes -- which are increasingly used in electronic components, sporting goods and dozens of other products -- triggered a kind of cellular reaction that over a period of years typically leads to mesothelioma, a fatal form of cancer, researchers said.

Some researchers said the preliminary evidence of cancer risk is strong enough to justify urgent follow-up tests and government guidance for nano factory workers. Others called for labels to guide consumers or recyclers who might encounter the material when incinerating or otherwise destroying discarded nano products.

Carbon nanotubes also occur in nature in volcanic ash. It would be interesting to see what the cancer effect of volcanic ash is.

There is a lot of media coverage of this study

Gene therapy advance for safer, cheaper and more efficient procedure and Extinct Tiger gene revived

Replacing one amino acid on the surface of a virus that shepherds corrective genes into cells could be the breakthrough scientists have needed to make gene therapy 30 times more efficient. Gene therapy will be a more viable option for treating genetic diseases such as hemophilia. The discovery could be the solution to a problem that has plagued researchers and doctors using AAV as a gene therapy vector — how to administer enough of the gene-toting virus to yield a therapeutic benefit without triggering an attack from the body’s immune system.

A two-week-old mouse fetus expresses the DNA of the extinct Tasmanian Tiger by developing cartilage, shown in blue. It is, in effect, a hybrid of a mouse and an extinct animal.

This is the realization of many movies and TV shows: Jurassic Park, Manimal, Aliens IV, South Park, Island of Dr Moreau and many more

In separate news, DNA from an extinct Tasmanian Tiger has been resurrected in a live animal (mouse) for the first time. The genetic material, extracted from the extinct Tasmanian tiger, proved functional in mice.

In addition to being more efficient, the new version of AAV could also prove to be more economical, Srivastava said. Current gene therapy trials are expensive because scientists must administer so much of the vector containing the therapeutic gene to see results. Using the new vector, scientists could potentially scale back to using as little as 100 billion particles instead of 10 trillion, Srivastava said.

Iraq oil status and possible increases in 2009 and 2010

The US State Department reports Iraq oil production for the second week of May 2008 at 2.52 million bpd (slide 22 of the 35-slide report).

Iraq has been exporting 1.88-2.04 million bpd since September 2007. The May 2008 projection is 2.04 million bpd.

Vitol, Anadarko, and Dome oil companies have formed a consortium currently negotiating with the GOI (Government of Iraq) for a technical service contract. The GOI has been working with other companies on five short-term technical service contracts, each with an approximate value of $500 million. This sixth contract would involve the Luhais oilfield located in southern Iraq. The deal would increase production at Luhais from 50,000 barrels per day (bpd) to an estimated 150,000 bpd. The GOI hopes that the completion of all deals will result in an increase in output of 600,000 bpd, translating into a more than 25% increase in the country’s daily production over a two-year span.

According to Dow Jones, Iraq’s crude oil exports are significantly higher in 2008 than they were at this point in 2007; oil exports have risen 22%. An average of 1.92 million bpd has been exported from Iraq this year, with the bulk (approximately 1.5 million bpd) coming from southern Iraq.

Global supply statistics [increasing to new highs] and recent T Boone Pickens oil predictions

The latest oilfield, the Bakken, is the largest in the lower 48 states; part of it extends into Canada

Global oil megaprojects

Prior to the Iraq war with Iran in 1980, Iraq had a production capacity of 3.6 million bpd. That was reduced to 3.2 million before the first Gulf War in 1990 and to 2.7 million barrels per day before the start of the most recent conflict.

With a stable political and civil environment, Iraq has the potential to produce four million bpd in the near term, if the necessary investments are made in repairing and modernizing facilities, and up to six million bpd beyond that. Added to that are the prospects of five undeveloped fields in southern Iraq -- Bin Umar, Majnoon, Nasiriyah, West Qurna and Ratawi -- that have the potential to pump three million bpd.

On April 9, U.S.-based IHS Inc. unveiled full details of Iraq Atlas -- the first and only detailed analysis of oil reserves, production and upstream opportunities in the Middle Eastern state. The study -- which came in the wake of a year-long fact-finding mission by geologists and petroleum engineers covering 435 undrilled prospects and non-commercial discoveries and 81 producing fields and commercial discoveries -- concluded Iraq has (conventional) reserves of up to 116 billion barrels -- third in the world after Saudi Arabia and Iran. That equation could easily change. According to the Atlas, if discoveries in Iraq's Western province are an indication, the pecking order may well be reversed -- Baghdad, with potential oil reserves of 215 billion barrels, could race ahead of Canada at 193 billion.

"We estimate that there could potentially be another 100 billion barrels in the Western Desert areas," said Mohamed Zine, IHS regional manager for the Middle East. "It (the desert) is widely regarded as being substantially underexplored, with only one commercial discovery largely because Iraq has had a surplus of oil to date and little incentive for exploration."

Baghdad hopes to pump an average of 2.6 million to 2.7 million bpd over 2008.

Some of T Boone Pickens oil predictions are wrong

T Boone Pickens has made various predictions about oil on CNBC

"Eighty-five million barrels of oil a day is all the world can produce, and the demand is 87 million," he said. "It's just that simple. It doesn't have anything to do with the value of the dollar."

Pickens founded BP Capital and has a 46% interest in the company, which runs two hedge funds, Capital Commodity and Capital Equity, both of which invest primarily in oil and natural gas.

Pickens is also investing in wind energy

Pickens says that world oil production will not exceed 85 million bpd. The EIA says that in February 2008, world production was 85.921 million bpd. March and April saw a 300,000 bpd increase from Saudi Arabia and 42,000 bpd increases in Brazil.

IEA statistics put total world oil production at 87.47 million b/d in February, up from 87.29 million b/d in January, thanks to higher volumes from the Americas and the former Soviet Union.

It would appear that Thunder Horse (a Gulf of Mexico deepwater platform) will come online by the end of 2008 and produce 250,000 bpd sometime in 2009. More additions are coming in Saudi Arabia in July, and Brazil should add 500,000-600,000 bpd more by the end of 2008.

I predict that Pickens is wrong on the production peak.
I predict that the May, 2008 EIA numbers will be 86+ million bpd. (Not April because of the UK strike and other issues in April)
I predict that the Sept, 2008 EIA numbers will be 87+ million bpd.

On the Oilsands
Pickens claimed that the oilsand development would be hindered by a shortage of welders and personnel.

When asked about Canadian oilsands, he said he had $500 million invested in this segment and has been there ten years. I pointed out that he has probably made five times his investment, and he agreed. He owns Canadian Natural Resources (CNQ) and Suncor (SU). I then asked him if he was worried about the Canadian government raising taxes. He said that governments always tax profitable businesses. Chesapeake Energy (CHK) and SandRidge (SD) were two explorers that he mentioned.

Encana is projecting growth in their holdings in the Alberta oilsands from 35,000 barrels a day net to EnCana today to 100,000 next year (2009), to 200,000 by 2012 and to 400,000 barrels a day by 2016.

A third-quarter [2008] startup of the massive Horizon oilsands project will deliver 110,000 barrels per day (bpd) in the first phase. Construction to increase capacity to 250,000 bpd is already underway.

Pickens forecasts $150 a barrel price for oil in 2008
"The only way I see that oil doesn't continue to rise [is] if we had a global recession," he said. "That will happen at some point, but I don't see the Chinese stumbling until after the Olympics."

Pickens says natural gas is the only American resource that can reduce oil imports. He claims the effective use of natural gas could reduce oil imports by 40 percent. [Wind and solar can free up natural gas to replace 40% of oil use in the US within 10 years.] He dismissed ethanol as an alternative.

Prices could rise that high because of a weak dollar, a supply/demand imbalance, and any hiccup in production (like the Nigerian unrest that is blocking 500,000 to 1 million bpd).

Using nuclear power, wind and solar to free up natural gas would be part of a reasonable energy plan.

May 19, 2008

New work on nanorobotics design, simulation and control for nanomedicine

A follow-up on a group working on nanorobots for nanomedicine

The researchers have a new paper: "Nanorobot Hardware Architecture for Medical Defense"

This work presents a new approach, with details of the integrated platform and hardware architecture for applying nanorobots to epidemic control, which should enable real-time in vivo prognosis of biohazard infection. Recent developments in nanoelectronics, with transducers progressively shrinking through nanotechnology and carbon nanotubes, are expected to result in innovative biomedical instrumentation, with new therapies and efficient diagnosis methodologies. The use of integrated systems, smart biosensors, and programmable nanodevices is advancing nanoelectronics and enabling the progressive research and development of molecular machines. It should provide high-precision pervasive biomedical monitoring with real-time data transmission. The use of nanobioelectronics as embedded systems is the natural pathway towards a manufacturing methodology that gets nanorobot applications out of laboratories as soon as possible. To demonstrate the practical application of medical nanorobotics, a 3D simulation based on clinical data addresses how to integrate communication with nanorobots using RFID, mobile phones, and satellites, applied to long-distance ubiquitous surveillance and health monitoring for troops in conflict zones. The same model can therefore also be used to protect a population against a targeted epidemic disease.

They have a lot of papers and work at their site on nanorobotics design, control and 3d simulation

Current developments in nanoelectronics [Appenzeller, J.; Martel, R.; Derycke, V.; Radosavljevic, M.; Wind, S.; Neumayer, D.; Avouris, P. Carbon nanotubes as potential building blocks for future nanoelectronics. Microelectron. Eng. 2002, 64 (1), 391–397] and nanobiotechnology [Liu, J.-Q.; Shimohara, K. Molecular computation and evolutionary wetware: a cutting-edge technology for artificial life and nanobiotechnologies. IEEE Trans. Syst. Man Cybern. Part C: Appl. Rev. 2007, 37 (3), 325–336] are providing feasible development pathways to enable molecular machine manufacturing, including embedded and integrated devices, which can comprise the main sensing, actuation, data transmission, remote control uploading, and coupling power supply subsystems, addressing the basics for operation of medical nanorobots.

A recent actuator with biologically-based components has been proposed [Xiong, P.; Molnar, S.V.; Moerland, T.S.; Hong, S.; Chase, P.B. Biomolecular-based actuator. US Patent 7014823, 2006]. This actuator has a mobile member that moves substantially linearly as a result of a biomolecular interaction between biologically-based components within the actuator. Such actuators can be utilized in nanoscale mechanical devices to pump fluids, open and close valves, or provide translational movement. To help control nanorobot position, a system for tracking an object in space can comprise a transponder device connectable to the object. The transponder device has one or several transponder antennas through which a transponder circuit receives an RF (radio frequency) signal. The transponder device adds a known delay to the RF signal, thereby producing an RF response for transmission through the transponder antenna [Laroche, J.-L. RF system for tracking objects. US Patent Application 20060250300, 2006]. A series of transmitters and antennas allows a position calculator, associated with the transmitters and receivers, to calculate the position of the object as a function of the known delay and the time period between the emission of the RF signal and the reception of the RF response from the first, second and third antennas.

Nanotechnology is moving fast towards nanoelectronics fabrication. Chemically assembled electronic nanotechnology provides an alternative to complementary metal oxide semiconductor (CMOS) for constructing circuits with feature sizes in the tens of nanometers [Goldstein, S.C.; Rosewater, D.L. Methods of chemically assembled electronic nanotechnology circuit fabrication. US Patent 7064000, 2006]. A CMOS component can be configured in a semiconductor substrate as part of the circuit assembly [Ramcke, T.; Rosner, W.; Risch, L. Circuit configuration having at least one nanoelectronic component and a method for fabricating the component. US Patent 6442042, 2002]. An insulating layer is configured on the semiconductor substrate, covering the CMOS component, and a nanoelectronic component can be configured above the insulating layer. If several nanoelectronic components are provided, they are preferably grouped in a nanocircuit block.
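The transponder-based ranging described above reduces to subtracting the known transponder delay from the measured round-trip time; a minimal sketch (the example numbers are illustrative, not from the patent):

```python
# Ranging step of the RF transponder scheme: the transponder echoes
# the signal after a known delay, so the distance is recovered from
# the measured round-trip time minus that delay.
C = 299_792_458.0  # speed of light, m/s

def range_from_roundtrip(t_roundtrip_s, t_transponder_delay_s):
    """Distance (m) to a transponder that adds a known delay before replying."""
    time_of_flight = t_roundtrip_s - t_transponder_delay_s
    return C * time_of_flight / 2  # the signal travels out and back

# e.g. a 10-microsecond round trip with a 3-microsecond transponder delay:
d = range_from_roundtrip(10e-6, 3e-6)  # ~1049 m
```

With three or more antennas, the same per-antenna ranges feed a position calculator, as in the patent's multilateration setup.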

This work used a 3D approach to show how nanorobots can effectively improve health care and medical defense. Nanorobots should enable innovative real-time protection against pandemic outbreaks. The use of nanomechatronics techniques and computational nanotechnology can help in investigating transducers and in defining strategies to integrate nanorobot capabilities. A better comprehension of the requirements a nanorobot must address in order to be successfully used for in vivo instrumentation is a key issue for the fast development of medical nanorobotics. Details of current advances in nanobioelectronics were used to highlight pathways to achieve nanorobots as integrated molecular machines for nanomedicine. Moreover, based on achievements and trends in nanotechnology, new materials, photonics, and proteomics, a new investigation methodology, using clinical data, numerical analysis and 3D simulation, has provided a nanorobot hardware architecture with a real-time integrated platform for practical long-distance medical monitoring. This model can enable nanorobots as an innovative biohazard defense technology.

In the 3D simulation, the nanorobots were able to efficiently detect alpha-NAGA signals in the bloodstream, with the integrated system retrieving information about a person infected with influenza. The model provided details on design for manufacturability, major control interface requirements, and inside-body biomolecular sensing for practical development and application of nanorobots in medical defense.

The use of nanorobots for in vivo monitoring of chemical parameters should significantly speed strategic decisions. Thus, nanorobots for medical defense are an effective way to keep an aggressive disease from spreading into a pandemic outbreak. As a direct impact, they should also help public health sectors save lives and decrease high medical costs by enabling real-time quarantine action. An important and interesting aspect of the current development is that a similar architecture, in terms of hardware and platform integration, can also be used to detect most types of biohazard contaminants.

Nanorobot hardware articles

Aging 2008 at UCLA June 27-29, 2008

The Methuselah Foundation is having a major aging conference at UCLA in Los Angeles, June 27 through June 29, 2008, at Royce Hall.

Aging: the Disease, the Cure, the Implications

The press release on the free June 27 event

What: Aging: The Disease, The Cure, The Implications, hosted by Methuselah Foundation
When: Friday, June 27, 2008, Drinks 4pm, Presentations 5pm, Dinner 8pm
Where: Royce Hall, 405 Hilgard Ave, Los Angeles, CA 90024
* Dr. Bruce Ames, Professor of Biochemistry and Molecular Biology at UC Berkeley
* G. Steven Burrill, Chairman of Pharmasset and Chairman of Campaign for Medical Research
* Dr. Aubrey de Grey, Chairman and CSO of Methuselah Foundation and author of Ending Aging
* Dr. William Haseltine, Chairman of Haseltine Global Health
* Daniel Perry, Executive Director of Alliance for Aging Research
* Bernard Siegel, Executive Director of Genetics Policy Institute
* Dr. Gregory Stock, Director of Program on Medicine, Technology & Society at UCLA School of Medicine
* Dr. Michael West, CEO of BioTime and Adjunct Professor of Bioengineering at UC Berkeley

The June 28-29 Aging conference costs $150-$995, depending on whether you are a student, how early you register, and whether you need a hotel room. So register early (before June 5, 2008) to save the most money.

Here are banners and videos for bloggers and others to use to promote the event

Limits to AGI, brain computer interfaces, nanomedicine and nanorobots

Some people scoff at mind uploading, human-level or greater artificial intelligence, or nanorobots able to repair all damage to the human body.

Here is a review of the current state of brain-computer interfaces, brain simulation, and nanomedicine-related medicine and science (cellular repair and rejuvenation). I do not see anything stopping zettaflop-level computers (one million times more powerful than the current best supercomputers), and zettaflop computers will enable human-level brain simulations or greater. Brain-computer interfaces already exist in many forms and are able to restore and create memories. There is nanoscale technology for precise delivery of drugs, genes, and imaging agents into the cells of the human body, and for interaction with cells. Adult cells have been rejuvenated (made younger) by manipulating cellular mechanisms. It will take a lot of hard work, which will need to be funded, to achieve these increased levels of capability, but there appears to be a path to those levels, and it is not unreasonable to expect them to be achieved.

From the April IEEE Spectrum, an article on recent Brain-computer interface work which indicated large funding from DARPA for significant brain-computer interface progress by 2009.

2007 work for integrating an artificial 12000 neuron memory device with the human brain

Clearly, those who predict that artificial brains will never reach human levels believe that progress on hardware or software brain simulation will hit limits, or that it will not be integrated with more advanced brain-computer interfaces.

Millions of neurons are being simulated now, with billions projected by around 2015 for several funded and active projects.

Will exaflop computers not be achieved?

Will the zettaflop design work not pan out?

10,000 neurons and 30 million synapses are being simulated now using a 22.8-teraflop supercomputer. (A petaflop machine has about 45 times more power; an exaflop, 45,000 times; a zettaflop, 45 million times.)

The researchers believe that the hardware for full brain simulation will be available in 2017.

A whole human brain has 100 billion neurons and 100 trillion synapses.
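
The scaling claims above can be sanity-checked with some back-of-envelope arithmetic. This is a sketch that assumes compute cost scales linearly with neuron count, which ignores synapse density and communication overhead:

```python
# Back-of-envelope scaling of the 10,000-neuron simulation to a whole brain.
# Linear scaling with neuron count is a rough simplifying assumption.

SIM_NEURONS = 10_000      # neurons in the current simulation
SIM_FLOPS = 22.8e12       # 22.8 teraflop/s used for that simulation
BRAIN_NEURONS = 100e9     # ~100 billion neurons in a human brain

scale = BRAIN_NEURONS / SIM_NEURONS    # ten-million-fold scale-up
flops_needed = SIM_FLOPS * scale       # ~2.3e20 flop/s

print(f"scale-up factor: {scale:.0e}")
print(f"flops needed: {flops_needed:.2e} (~{flops_needed / 1e21:.2f} zettaflop/s)")
```

The result, roughly a quarter of a zettaflop, is consistent with the claim that zettaflop computers would enable human-level brain simulation.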

So what are the hardware, bandwidth, or interface issues?

I can also put together the current state of nanomedicine, with the ability of nanoscale devices to be targeted in the body at tumors, or to deliver gene therapy, sensors, imaging agents, or drugs.

There is scientific/medical journal discussion of in vivo rejuvenation of cells.

There has been analysis of the process for turning existing cells into rejuvenated cells (a sequence of gene manipulations).

So it appears that nanomedicine does not need to manipulate every molecule in the body. It merely has to extend and enhance existing alterations of processes in the body.

This goes to the strategy of surveying what is working best now and enhancing and extending those methods, rather than proposing far more difficult plans and then showing how those would run into physical limits.

Someone can say: look how little we understand about gravity and how hard or impossible anti-gravity is. Or someone can look at more efficient flight and space launch systems, or at enhancing magnetism for a ground-based launch against the Earth's magnetic field. There are many different options and strategies for any goal. Determine whether the goals are worthwhile, and find the best approaches for reaching the worthwhile ones.

Flying cars are technically achievable, but regular ground-based cars already cause 1.2 million deaths per year worldwide. Are flying cars a worthwhile goal?
A technologically and economically practical flying car exists, but what about safety and insurance?

New implantable device can extract stem cells from blood

Bloodstream robots and other technology from Engines of Creation

Enabling regeneration in humans

Progress to nerve and paralysis repair

Carbon nanopipettes for cellular surgery of organelles within the cell

Magnetically assembled nanotube tipped probes

Scale, cost and impact of a public transportation overhaul

Many environmentalists love to point to public transportation as the blessed technology and policy that would reduce oil consumption and dependence. I agree that increasing public transportation can be helpful, but how helpful, and how long would it take to have what level of impact?

In the big energy picture, the USA uses 40% of its oil for cars. So more mass transit will not help with oil used for factories, home heating, trains, freight trucks and other uses.

I will show that going from 3-5% public transportation share up to 20-30% would still leave 70-80% of commuters driving cars, and would cost 1% of GDP for 50 years. The 40% of oil used for cars would drop to about 30%: US consumption of 20.8 million barrels of oil per day would fall to 18.5-20 million barrels per day, and only after decades of a massive public transportation overhaul that would be politically difficult [perhaps impossible] to enact in many cities.
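
The arithmetic behind that claim can be sketched as follows. The specific trip shares chosen and the one-for-one displacement of car trips by transit trips are simplifying assumptions:

```python
# Rough check of the oil-savings claim. Assumes each new transit trip
# displaces one car trip, and that car oil use scales with car trip share.

US_OIL_BPD = 20.8e6    # total US consumption, barrels per day
CAR_SHARE = 0.40       # fraction of US oil used by cars

car_oil = US_OIL_BPD * CAR_SHARE         # ~8.3 million bpd for cars

# Transit share rises from ~4% to ~25% of trips,
# so car trips fall from ~96% to ~75% of the total:
reduction = 1 - 0.75 / 0.96              # ~22% fewer car trips
savings = car_oil * reduction            # ~1.8 million bpd saved
remaining = US_OIL_BPD - savings         # ~19.0 million bpd

print(f"savings: {savings / 1e6:.1f} million bpd")
print(f"remaining consumption: {remaining / 1e6:.1f} million bpd")
```

The result lands inside the 18.5-20 million barrels per day range quoted above.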

Which is why the nextbigfuture energy plan has a different focus.

A book on Australia and New Zealand public transportation. It is from the early-to-mid 1990s, but there has been no significant change in the situation since.

Public transportation trips in Auckland over time

Comparison of some other cities over time
Perth: per capita ridership down 7.1% from 1981 to 2005.
Portland and Brisbane are among the cities with more public transport per capita since 1981.

Perth, Adelaide, and Auckland are all among the cities with less mass transit per person. How much would it cost, and how long would it take, to get each low-transit city up to the higher levels, adding on the order of 60 km of rail?

Comparing transit and commuter trips for the USA, Australia, Canada, Europe, and Asia.

Transportation in Australian and New Zealand cities

Stronger versus weaker rail cities in Australia and New Zealand

2005 view of public transportation in many cities

Public transportation for cities in 2001-2005.

So copy Vienna: 50 years of 1% of city GDP (or more over a shorter period; this would be trillions of dollars on a global scale) on public rail, plus all the decisions needed to transform city planning and layouts for higher public transportation usage, to get closer to the higher commuter usage rates. Note: the highest usage rates in Europe are 20-30%. Some highly dense Asian cities may be doing better, such as the Philippines with its many jeepneys (locally made jeep/buses where people with less money ride up to 20 to a vehicle, some hanging on the outside; two jeepneys crashing together can result in 30-50 deaths).
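
As a rough illustration of the "trillions of dollars on a global scale" claim. The global GDP figure used here (roughly $55 trillion in the late 2000s) is an assumption for illustration, and economic growth over the 50 years is ignored:

```python
# Cumulative cost of spending 1% of GDP for 50 years, applied globally.
# GLOBAL_GDP is an assumed late-2000s figure; growth is ignored.

GLOBAL_GDP = 55e12   # ~$55 trillion, assumed
SHARE = 0.01         # 1% of GDP per year
YEARS = 50

total = GLOBAL_GDP * SHARE * YEARS   # ~$27.5 trillion

print(f"cumulative spend: ${total / 1e12:.1f} trillion")
```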

So the transit share goes from 3-5% up to 20-30%, which still leaves 70-80% driving cars. The 40% of oil used for cars drops to about 30%: US consumption of 20.8 million barrels of oil per day falls to 18.5-20 million barrels per day, after decades of a massive public transportation system overhaul.

May 18, 2008

New Cell processors, latest FPGAs, optimized Opterons

IBM has shifted cell processors to 65 nanometers and improved double precision performance by up to five times (PowerXCell 8i processors). Double precision performance is very important for scientific and supercomputing applications.

New PowerXCell 8i processors in QS22 blades have:
● 460 single precision (SP) GFLOPS/217 double precision (DP) GFLOPS per blade
● 6.4/3.0 TFLOPS (SP/DP peak) in a single BladeCenter chassis (14 blades)
● 25.8/12.18 TFLOPS (SP/DP peak) in a standard 42U rack with 56 blades installed
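
The chassis and rack figures follow directly from the per-blade numbers; a quick arithmetic check:

```python
# Verifying the per-chassis and per-rack flops from the per-blade figures.

SP_PER_BLADE = 460e9     # single-precision flop/s per QS22 blade
DP_PER_BLADE = 217e9     # double-precision flop/s per blade
BLADES_PER_CHASSIS = 14  # one BladeCenter chassis
BLADES_PER_RACK = 56     # standard 42U rack

chassis_sp = SP_PER_BLADE * BLADES_PER_CHASSIS   # ~6.4 TF
chassis_dp = DP_PER_BLADE * BLADES_PER_CHASSIS   # ~3.0 TF
rack_sp = SP_PER_BLADE * BLADES_PER_RACK         # ~25.8 TF
rack_dp = DP_PER_BLADE * BLADES_PER_RACK         # ~12.2 TF

print(f"chassis: {chassis_sp / 1e12:.1f}/{chassis_dp / 1e12:.1f} TF (SP/DP)")
print(f"rack: {rack_sp / 1e12:.1f}/{rack_dp / 1e12:.2f} TF (SP/DP)")
```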

QS22 information at IBM's site

Here is a comparison of the latest FPGAs versus the latest quad-core Opteron: 2.5 GHz quad-core Opterons and Virtex-5 LX330, SX95T, and recently announced SX240T FPGAs.

For the quad-core Opteron, this equates to a theoretical peak of (4 ops/clock * 4 cores * 2.5 GHz) 40 Gflop/s in 64-bit mode and 80 Gflop/s in 32-bit mode. For actual predicted performance, microprocessors use DGEMM (64-bit matrix multiply), which typically achieves 80 to 90 percent of the peak.
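
A quick check of that peak-flops arithmetic, using the op counts and clock rate from the text above:

```python
# Theoretical peak for the 2.5 GHz quad-core Opteron as described above.

OPS_PER_CLOCK_64 = 4   # double-precision ops per clock per core
CORES = 4
CLOCK_HZ = 2.5e9

peak_64 = OPS_PER_CLOCK_64 * CORES * CLOCK_HZ   # 40 Gflop/s in 64-bit mode
peak_32 = peak_64 * 2                           # 80 Gflop/s in 32-bit mode

# DGEMM typically reaches 80-90% of peak on a microprocessor:
dgemm_low, dgemm_high = 0.80 * peak_64, 0.90 * peak_64   # ~32-36 Gflop/s

print(f"peak: {peak_64 / 1e9:.0f} Gflop/s (64-bit), {peak_32 / 1e9:.0f} Gflop/s (32-bit)")
print(f"expected DGEMM: {dgemm_low / 1e9:.0f}-{dgemm_high / 1e9:.0f} Gflop/s")
```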

The SX240T can achieve 1.5 to 4.78 times the DGEMM speeds. DGEMM performance on a microprocessor is the best actual performance, typically hand-coded in assembler by the microprocessor vendor. Typical user code run through a compiler normally achieves maybe 25 percent of peak, and even less as the number of cores increases.

The PowerXCell 8i achieves nearly 109 double precision (DP) gigaflops. That's on par with the 102 DP gigaflops provided by AMD's FireStream 9170 GPU.

AMD's FireStream 9170 chipset includes 660 million transistors and 320 processing units and gets 500 peak gigaflops. The FireStream 9170 is a step on the way to AMD's Fusion project, which the company says will combine a graphics processor and general processor on the same piece of silicon. AMD hopes to release Fusion in 2009.

AMD's latest processor and workstation roadmap.

A comparison of Cell processors, GPGPUs, and FPGAs
