April 04, 2008

China and Iran are following Mr Miyagi's advice with new anti-ship missile

As Mr Miyagi said in Karate Kid: "If do right, no can defense."

The U.S. Navy can't stop China's most sophisticated anti-ship missile (purchased from Russia) -- and won't even start testing a defense until 2014. I don't think it is an issue between China and the USA, because I do not believe they will be fighting. More relevant is if Iran gets the missile. Then it would be important for the USA to use spies, satellites and other means to find any missiles and destroy them before putting the navy within range. If Iran got it, that would change the tactics in any Iran/USA war. A war over Taiwan is not in the offing, as the Taiwan presidential and legislative elections have placed pro-China politicians in power. They are moving toward a common market. If Taiwan and China move to a European Union type situation, then there will not be war.

It was pointed out to me that US tactics would not be that greatly affected.
1) Pound them using the air force. Aircraft carriers are used because the US does not have enough land-based friends. B-52s do most of the damage; the navy is for precision attacks. In the Iran case, air power from Iraq, Afghanistan, Dubai, and Saudi Arabia would mash them into the stone age.
2) Drive the army in from a friendly country -- in this case, Iraq and Afghanistan. [If wanting a land assault, which the US probably does not, except for some kind of smash and dash.]
3) The navy would sit well offshore and cheer them on.

However, the US military pumping up its enemies would get the Admirals' desired project or pet weapon system funded.

The Sizzler starts at subsonic speeds. Within 10 nautical miles of its target, a rocket-propelled warhead separates and accelerates to three times the speed of sound, flying no more than 10 meters (33 feet) above sea level. On final approach, the missile 'has the potential to perform very high defensive maneuvers,' including sharp-angled dodges, the Office of Naval Intelligence said in a manual on worldwide maritime threats.

The U.S. Navy, after nearly six years of warnings from Pentagon testers, still lacks a plan for defending aircraft carriers against a supersonic Russian-built missile, according to current and former officials and Defense Department documents.

Air power australia has photos of the Sizzler supersonic missile

Hat tip to Wired defense blog

The missile, known in the West as the "Sizzler," has been deployed by China and may be purchased by Iran.

The Defense Department's weapons-testing office judges the threat so serious that its director, Charles McQueary, warned the Pentagon's chief weapons-buyer in a memo that he would move to stall production of multibillion-dollar ship and missile programs until the issue was addressed.

"This is a carrier-destroying weapon," said Orville Hanson, who evaluated weapons systems for 38 years with the Navy. "That's its purpose."

China's off-the-shelf air defense

Sino Defence Forum discusses the Sizzler missile and the Sunburn (supersonic all the time).

The Sizzler is smaller and lighter. I think the main difference is that the Sunburn travels at supersonic speeds all the time, therefore requiring a big load of fuel, which in turn makes the missile big and heavy. The Sizzler cruises at subsonic speeds and goes supersonic in the terminal phase. But there are different versions of the "Club"; some are "conventional" subsonic cruise missiles over the entire flight. The Sizzler's range is longer than the Sunburn's.

And I think the Sizzlers also have the capability of interoperability, meaning they can exchange info. One "lead" missile flies at high altitude searching for targets with its radar, while the other missiles of a barrage stay low and receive info from the lead missile.

Sino Defence has info on the missile

For political reasons, China may get small indigenous aircraft carriers around 2013 (not nuclear powered; competitive with non-USA aircraft carriers other than the expected future French aircraft carrier).

Here is a more complete list of the arsenal of China's navy.

3M-54E (SS-N-27) Anti-Ship Cruise Missile
Sunburn 3M-80E (SS-N-22) Ship-to-ship Missile

China purchased the 3M-54E1, with a 300 kilometer (180 mile) range, back in 2005.

Summary and Review of ABC: Live to 150

Barbara Walters had a show: Live to 150... Can You Do It? It was light on details and touched on a lot of the topics in the longevity area.

Alcor, the cryonics company, has their article.

ABC has other related articles.

I saw the show. It was a very quick run-through of many topics on aging. Aubrey de Grey's section was maybe 2 minutes, and half of that was him riding a bike and paddling on water. His actual speaking points were a very brief summary: extend life 30 years, and then during that time extend it again by 50 years, and so on up to 1000 years. No details or even a mention of SENS. Barbara does describe him as a respected scientist, active, and basically a "leader in the field".

They had an extended bit on the Advanced Cell Technology corporation, a company whose stock has tanked. They were featured because they could show video of an artificially created rat heart. They interviewed the CSO, Dr. Robert P. Lanza, M.D., 51, Chief Scientific Officer. He predicts hundreds of years of life extension using organ and cell replacement and rejuvenation.

Calorie restriction segment shows practitioners and how they eat.

And then a lot of the show (almost half) is talking about and showing how extending life will be good and fulfilling. It shows active older people enjoying life and discusses how people will have many careers and maybe many partners, and maybe older women will become lesbians because they will live so much longer than men. They also mentioned that 30 years was added to life expectancy in the 20th century.

They also talked about gray power and how boomers and others may retain their positions and power and thus help push the funding and effort towards life extension forward.

I think the more older reporters and journalists there are on 60 Minutes, ABC and other places, the more favorable shows of this type there will be.

A facebook discussion

Carnival of Space Week 48

I am stepping in as a last-minute host for the Carnival of Space week 48. Thanks to Fraser Cain of Universe Today for letting me host. Send your Carnival of Space entries for week 49 to carnivalofspace at gmail dot com.

We have Mars, Red Dwarfs, Astronomy meetings, 2001, Forbidden Planet, Dr Who, aliens that could hide their solar system and more.

From Centauri dreams we have Red Dwarfs: Dust, Details and Habitability

One of three pictures in the red dwarf article.

Here Paul Gilster is looking at a recent paper on dust disks around red dwarfs, but the broader speculation is really about red dwarfs themselves and the odds on habitable planets around them: 75 million in our galaxy if we assume 1 per thousand stars. The article goes on to look at the other assumptions in that statement, and contrasts red dwarfs with G and K stars.

From Ed Minchau at Space Feeds: every day, Space Feeds shows a space video of the day; from these Ed chooses a Space Video of the Week.

This week's video is the classic 1956 science fiction movie Forbidden Planet.

From Hobby Space and Space Transport News: Surrey's GIOVE-A, a full success.

The GIOVE-A satellite, built and launched by Surrey Satellite, is another example of how an entrepreneurial, innovative, "NewSpace" style approach can succeed at rapid, low cost development of space systems.

David Portee at altairvi has a series of blog posts on the 40th anniversary of the film 2001. The film premiered 40 years ago today in Washington, DC. David's blog series started on Monday and will continue through Sunday, April 6, when the trimmed-back version of the film screened for the first time in NYC. His first post gave the dates of 2001's press screenings and premieres, and the second considered the inspiration behind the good ship Discovery. Today's will look at an Earth-moon transportation proposal described in a paper with the title "2001: A Space Odyssey Revisited."

There will also be one in which he compares the monolith and the TARDIS and discusses our perceptions of superraces.

Dynamics of Cats at ScienceBlogs has NASA launches USS Enterprise. Note the date of the article.

Martian Chronicles has an update on the Opportunity rover from Monday's planning session. They are working on driving close to a cliff in Victoria Crater called Cape Verde. They posted a low-resolution "thumbnail mosaic" of Cape Verde as well as an image showing where the rover is now. See their links for more pictures.

Another posting after the second rover planning meeting.

Starts with a bang has a little game called "Mars or Arizona?" He shows you pictures and you have to guess whether it's a picture of Mars or a picture of Arizona. So far the highest score is 10/13.

From Cumbrian Sky we have what will Phoenix see [on Mars] when she opens her eyes?

Pamela Gay at Star Stryder has IYA taking shape.

Summary: IYA is taking shape. People and tools are all slowly emerging to make 2009 a year of great astronomical promise.

Ian O'Neill at Astroengine has a brief story about the recent sunspot activity and cycle overlap.

Orbiting Frog is submitting the entire National Astronomy Meeting blog. They are covering the UK National Astronomy Meeting, which is on this week; since it is a one-week event, they are covering press releases from the UK as well as talks and poster sessions.

Astroprof's Page has Too Much Radiation?

Summary: About radiation exposure to astronauts on long duration space missions.

A Babe in the Universe has Eruptions. Halema'uma'u crater on Hawaii's Big Island is erupting, and simultaneously our Cassini spacecraft has found organic molecules erupting from Saturn's moon Enceladus.

Earth and a distant moon share the phenomenon of internal heat sculpting their surfaces. Study of worlds like Enceladus offers clues to our own planet.

My own Next Big Future entry Fermi Paradox, metamaterials and alien civilizations that are not distinguishable from dark matter.

Summary: Recently developed metamaterials are being developed to make objects that they surround invisible to microwaves and some optical wavelengths. They can also make things invisible to magnetic fields, sound and other waves. The Fermi paradox is based on assumptions about what aliens with advanced technology would or would not do: that the works of very advanced aliens should be visible to humans, or that they or their robots should be greeting us like the galactic welcome wagon. Advanced aliens, if they could build massive Dyson spheres, could also make those things fade into the interstellar background. Plus, we need to be humble and realistic in our assessment of how hard we have looked at the universe for life. This year we are finding that the Milky Way is twice as thick as we thought (12,000 light years, not 6,000), and last year we found that the Andromeda Galaxy is five times bigger than we thought. These kinds of errors should tell us how clueless we are in judging whether looking at some pinpoints of light tells us whether there is or is not advanced life.

UPDATE: Centauri Dreams has an excellent follow up to my article on what aliens might do with Dyson spheres and current efforts to look for Dyson sphere building aliens.

April 03, 2008

Genetic makeup (gene UGT2B17) affects testosterone sports doping drug test results

From the Economist magazine: the genetic makeup of individuals affects the results of performance-enhancing doping tests. It shows that some dopers would not be caught with current tests and that some innocent people would be accused. I think we just need to make better, safe forms of enhancement and then not worry about it.

The production of TG (testosterone glucuronide) is controlled by an enzyme that is, in turn, encoded by a gene called UGT2B17. This gene comes in two varieties, one of which has a part missing and therefore does not work properly. A person may thus have none, one or two working copies of UGT2B17, since he inherits one copy from each parent.

A researcher gave healthy male volunteers whose genes had been examined a single 360mg shot of testosterone (the standard dose for legitimate medical use) and checked their urine to see whether the shot could be detected.

Nearly half of the men who carried no functional copies of UGT2B17 would have gone undetected in the standard doping test. By contrast, 14% of those with two functional copies of the gene were over the detection threshold before they had even received an injection. The researchers estimate this would give a false-positive testing rate of 9% in a random population of young men.

Dr Schulze also says there is substantial ethnic variation in UGT2B17 genotypes. Two-thirds of Asians have no functional copies of the gene (which means they have a naturally low ratio of TG to EG), compared with under a tenth of Caucasians -- something the anti-doping bodies may wish to take into account.

Thus about one third of Asians could take testosterone and not trigger an anti-doping detection.
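The arithmetic behind these genotype-dependent error rates can be sketched as a weighted average over a population's genotype mix. The genotype frequencies and the one-copy rates below are illustrative assumptions loosely anchored to the figures quoted above (14% false positives for two-copy men, ~50% false negatives for zero-copy men), not the study's actual data:

```python
# Hypothetical sketch: how the UGT2B17 genotype mix changes doping-test
# error rates. Frequencies and per-genotype rates are illustrative
# assumptions, not the study's published numbers.

def population_rate(freqs, rates):
    """Weighted average of per-genotype rates over genotype frequencies."""
    assert abs(sum(freqs.values()) - 1.0) < 1e-9
    return sum(freqs[g] * rates[g] for g in freqs)

# Assumed fraction of men with 0, 1, or 2 working copies of UGT2B17.
caucasian = {0: 0.09, 1: 0.41, 2: 0.50}
asian     = {0: 0.66, 1: 0.24, 2: 0.10}

# Chance a genotype is over the T/E threshold *before* any injection
# (false positive); the article cites 14% for two-copy men.
false_pos = {0: 0.00, 1: 0.04, 2: 0.14}

# Chance a genotype evades detection *after* a testosterone shot
# (false negative); the article cites ~50% for zero-copy men.
false_neg = {0: 0.50, 1: 0.10, 2: 0.02}

print(f"Caucasian false-positive rate: {population_rate(caucasian, false_pos):.1%}")
print(f"Asian false-negative rate:     {population_rate(asian, false_neg):.1%}")
```

With these assumed inputs the Caucasian false-positive rate comes out near the article's 9% estimate, and roughly a third of the Asian group evades detection, consistent with the point above.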

Bakken oil from USGS 1995 and decline rates

Here is a picture of the geological layers and formations in the Williston basin (where a potential very large oil resource called the Bakken formation is located).

I have previously discussed the great potential of the Bakken oil resource.

This information is from the 1995 US Geological Survey (USGS) of the area.

A new USGS report is expected to be released this month (April, 2008).

UNCONVENTIONAL PLAYS, Continuous Type By James W. Schmoker (pages 9-17 of the 1995 USGS report)


Available evidence indicates that the Bakken Formation of Montana and North Dakota has generated hundreds of billions of barrels of oil. The overall Bakken unconventional continuous-type oil play is bounded on the north by the Canadian border (a political rather than geologic boundary), on the east, northwest, and west by thermally controlled limits of oil generation, and on the southwest by the Bakken subcrop. Within this area, the Bakken Formation is considered to be oil saturated. However, drilling and production data indicate that this entire area cannot be characterized by a single play probability, success ratio, and estimated ultimate recovery probability distribution. Consequently, the overall Bakken play is partitioned into three smaller plays--the Bakken Fairway (along the southwest subcrop), Bakken Intermediate, and Bakken Outlying Plays (3110, 3111, and 3112, respectively).

The Bakken (Spanish Pool) wells in the Antelope field area have produced about 12 million barrels of oil. As of July, 1993, 161 vertical Bakken wells (excluding the Spanish Pool) have produced 10,320,000 barrels of oil, and 202 horizontal wells have produced 12,233,000 barrels of oil. The Bakken Play is far from exhausted. Potential additions to oil reserves are measured in the hundreds of millions of barrels, in contrast to the tens of millions of barrels produced to date. Full realization of these potential reserve additions will probably depend upon improvements in technology, economics, and geologic understanding.

In 2007, North Dakota produced about 5 million barrels of oil from the Bakken oil formation (about 13,700 barrels of oil per day), Montana about 50,000-60,000 barrels of oil per day, and Saskatchewan about 30,000 to 40,000 barrels of oil per day.

There is a lot of drilling activity, but large scaling up of production will require new pipelines to be built and new refineries, which will take about five years.

Bakkenshale blog has a couple of useful tables that analyze the decline in production for wells that are being drilled in the Bakken oil formation.

The initial production (IP) decline chart, with cumulative production through January 2008, shows that average production over a longer period is about 60% of what is produced in the first month.

There is a discussion group on Bakken oil.

You need 18 to 24 months of production to get a good feel for what a well is going to ultimately produce.

It appears that if you average the first 2 to 3 months of "flush production", the typical well might be producing 50% of this average amount in 10 months to a year. After 15 to 18 months it appears production has leveled off at a rate of about 25-30% of the first 3-month average (with little regard to the IP rate). Hopefully the decline from this point forward will hold at about 10%-15% per year.

The obvious exception to the scenario is the Petro-Hunt USA 2D in the Charlson area. Its reported IP was 700 barrels per day. Its 16-month total production is 378,536 barrels, and the most recent month's production was 1,000 barrels per day.

Other cautions on every well: did they stay "in zone" while drilling; did the zone get damaged while drilling; did the direction of the lateral section optimize natural fracturing; did the frac job get into the intended zones; and on and on. We'll all be wiser in a few years as this database grows and the learning curve goes higher.
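The rules of thumb above can be rolled into a rough decline-curve sketch. The breakpoints (50% of flush by ~12 months, 25-30% by 15-18 months, then 10-15% annual decline) are the discussion group's estimates, and the interpolation between them is my own simplification, not fitted well data:

```python
# Rough sketch of the Bakken decline profile described above.
# Breakpoints come from the discussion-group rules of thumb; the
# straight-line interpolation between them is an assumption.

def monthly_rate(month, flush_rate):
    """Approximate rate (bbl/day) in a given month, relative to the
    first-3-month 'flush production' average (flush_rate)."""
    if month <= 3:
        return flush_rate                           # flush production
    if month <= 12:
        # slide from 100% down to 50% of flush by month 12
        return flush_rate * (1.0 - 0.5 * (month - 3) / 9)
    if month <= 18:
        # leveling off toward ~27.5% of flush (midpoint of 25-30%)
        return flush_rate * (0.5 - 0.225 * (month - 12) / 6)
    # thereafter: ~12.5%/year decline from the ~27.5% plateau
    years_past = (month - 18) / 12
    return flush_rate * 0.275 * (0.875 ** years_past)

# Example: a well averaging 400 bbl/day over its first 3 months
print(f"Month 12: {monthly_rate(12, 400):.0f} bbl/day")  # ~50% of flush
print(f"Month 18: {monthly_rate(18, 400):.0f} bbl/day")  # ~27.5% of flush
```

A model this crude is only useful for eyeballing cumulative production scenarios; as the comment says, 18 to 24 months of real data is what actually tells you what a well will do.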

This is a play brought on by technology: horizontal drilling and fracking. Both of these will only get better, and we've just scratched the surface of the Middle Bakken potential. Who knows where the Three Forks will take us.

April 02, 2008

Energy plan

This post updates and synthesizes my articles on energy into an energy plan. I will keep updating it with past and new information that I gather.

Short term
Efficiency and drilling for regular and enhanced recovery, policy that discourages coal and fossil fuel and encourages nuclear and renewables. Try to reduce fuel usage 2-4% per year and try to increase oil from drilling and biofuels by 3-6% per year.

Mid Term
Big nuclear buildup, plus thermoelectric and transmission efficiency. Triple nuclear power by 2020 by using new uprate technology, advanced thermoelectrics and some new plants. (25% from nuclear instead of 8.2%, and 17% less fossil fuel. I would reduce coal first: 30,000 deaths per year from coal air pollution, 60,000 deaths from combined coal and fossil fuel air pollution in the USA. Plus, moving 1.2 billion tons of coal is 40% of freight rail traffic and 10% of diesel fuel usage.)
We can get up to six times more nuclear by 2030, displacing all coal and a lot of oil.

Mid-Long Term
Very advanced nuclear fission, nuclear fusion, and better renewables: geothermal; wind [KiteGen, superconducting wind turbines]; solar [concentrated solar in municipal or rural power configurations; my favorite is CoolEarth's solar balloons]; genetically modified organisms for biofuel.


Oil and fossil fuels are clearly critical in the near and mid-term, and any shift away or reduction in usage is a very difficult task. Of the 100 quadrillion BTUs that the US uses, 85% comes from fossil fuels. (It coincidentally means that 1 quad BTU is about equal to 1%. World usage is a little over 4 times more, with a slightly different energy mix.)
(Dept of Energy figures for 2006)
40% of that is from oil (20-22 million barrels per day about 12-13 million barrels per day imported, recent high prices have dropped oil usage by 400,000 or so barrels per day, which is more than all geothermal, wind and solar combined)
23% from coal (mainly supplying 50% of electricity)
23% from natural gas
8.2% nuclear
3.3% wood based mainly, waste and biofuel
2.9% hydro
0.35% geothermal
0.27% wind (3 year wait for a new turbine if you order today)
0.07% solar (years to make factories, roof systems do not pay back costs to buy and install)
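Since total US consumption is about 100 quadrillion BTU, each quad is roughly one percentage point, and the shares listed above can be sanity-checked directly:

```python
# Sanity check of the US energy mix shares listed above (DOE 2006 figures).
# With total US use ~100 quadrillion BTU, 1 quad ~= 1 percentage point.

shares = {
    "oil":                40.0,
    "coal":               23.0,
    "natural gas":        23.0,
    "nuclear":             8.2,
    "wood/waste/biofuel":  3.3,
    "hydro":               2.9,
    "geothermal":          0.35,
    "wind":                0.27,
    "solar":               0.07,
}

fossil = shares["oil"] + shares["coal"] + shares["natural gas"]
print(f"Fossil fuels: {fossil:.0f}% of ~100 quads")   # close to the 85% cited
print(f"All sources:  {sum(shares.values()):.1f}%")   # ~101%; source figures are rounded
```

The shares sum to slightly over 100% because the individual figures are rounded; the fossil share works out to ~86%, consistent with the 85% quoted above.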

Energy use is currently close to evenly split between residential home (electricity and heating), industrial and transportation.

Home energy and industrial plant efficiency should be improved. Policy should be adjusted so that someone can more easily capture the return on efficiency investment. (The problem is that I might not be motivated to put in more insulation, a better water heater [more cost effective and providing more energy savings than installing solar power] and better appliances if I am selling the place in a few years, or if I am renting it out and not paying for utilities anyway.)

East coast homes using heating oil should be converted to electric heating.

Only 14-16 million new cars and trucks are sold each year out of about 300 million cars and trucks in the USA (800 million in the world). We need to get the old cars and trucks that are driven on the highway a lot retrofitted with aftermarket adjustments to make them more aerodynamic. Highway mileage can be increased 25% fairly easily. Maybe 10% of fuel for cars and trucks could be saved. This would mean 5% of total US oil, or 1 million barrels per day. (5-10 years for a strongly supported program.)
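The savings chain in this estimate can be made explicit. The transportation and highway shares below are my assumed round numbers for illustration; only the 25% mileage gain and the 21 million bpd total come from the text:

```python
# The retrofit savings chain above, made explicit. The 50% shares are
# assumed round numbers; treat the output as an order-of-magnitude estimate.

us_oil_bpd      = 21_000_000   # total US oil use, barrels/day (from the text)
car_truck_share = 0.5          # fraction of oil burned by cars and trucks (assumed)
highway_share   = 0.5          # fraction of that burned in highway driving (assumed)
mpg_gain        = 0.25         # aerodynamic retrofits: +25% highway mileage

# +25% mileage means fuel use falls to 1/1.25 = 80%, i.e. 20% saved on highway fuel
highway_fuel_saved = 1 - 1 / (1 + mpg_gain)
fleet_fuel_saved   = highway_fuel_saved * highway_share   # ~10% of car/truck fuel
oil_saved_bpd      = us_oil_bpd * car_truck_share * fleet_fuel_saved

print(f"Fleet fuel saved: {fleet_fuel_saved:.0%}")
print(f"Oil saved: {oil_saved_bpd:,.0f} barrels/day")
```

Under these assumptions the chain reproduces the article's figures: ~10% of car and truck fuel, about 5% of total US oil, roughly 1 million barrels per day.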

Reduce highway speed limits back to 55 or 60 mph and other policy modifications.

Hybrids and electric cars. Using ultracapacitors and batteries or all ultracapacitors.
Mixing folding electric bikes/scooters with public transportation.
[China is making 30 million electric bikes and scooters each year. In 5-7 years most of the 500 million bike riders in China could shift to electric bikes and scooters]


There is quite a bit of oil in Alaska, but it would take 5-10 years once we started to try to drill to get up to 1 million or so barrels per day. They talk about 10-40 billion barrels of oil there. I view it as a secondary and larger strategic oil reserve. If things get desperate enough for whatever reason, it will be drilled.

Nearer term and not controversial is the Bakken oil field. It has been known for quite a while, but until recently, with high prices and new drilling tech, it was not thought to be economic. Now it is the hot and profitable new play in oil. The USGS (geological survey) released a new study that confirmed the current recoverable oil as 3 to 4.3 billion barrels in the US portion. Past estimates put 200-800 billion barrels of oil in place. It is under North Dakota, South Dakota, Montana, Saskatchewan and Manitoba. It is a thin layer of light oil (the good stuff) sandwiched between shale.
Many more agile oil firms are going after it (including what used to be called Enron).
Many of the wells are paying back in 3-12 months. It costs double to drill the horizontal wells with stacked fracturing versus a traditional well.

About 140,000 barrels per day now from Saskatchewan and the USA. Maybe 250,000 barrels per day by the end of 2008, and maybe double that the year after. Saskatchewan in Canada is a bit ahead in drilling this play. North Dakota and Montana need to build refineries and pipelines to get the oil out in order to scale this up, in say five years, to a million barrels per day plus.

New Gulf of Mexico oil find by Chevron will also have significant oil in 5 years. Mega oil projects worldwide will be the primary determiner of how much oil is available. The US has the Thunder Horse deep oil rig which should add 250,000 barrels of oil per day. The USA uses about 21 million barrels of oil per day and imports 10 to 11 million barrels of oil per day.

Enhanced oil recovery could tap more of the previously used wells; 300 billion barrels could be extracted from old wells in the USA. Enhanced recovery can also help recover more oil from Canada's oilsands and the US oil shale in Colorado (but those are longer-term projects).


In spite of almost no new reactors being built in the USA for 2-3 decades, nuclear power generation has been increasing because of higher operating efficiency and power uprates (different kinds of traditional uprates: +2%, +5% and +10-20%). Most gains have come from better operations.

There is technology (from MIT and other places; it will take about ten years to fully deploy, and could be faster but for regulatory issues) that would enable increasing the power from current reactors by 30-50% by changing the coating and configuration (shape) of the nuclear fuel. The new fuel also makes the reactors safer.

By 2015-2020 we should have built 10-20 of the 30-32 reactors that will have applied for licenses. New uprating technology could add the equivalent of 30-50 new reactors by making better fuel.
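The uprate claim is simple arithmetic over the existing fleet; the figure of 104 operating US reactors circa 2008 is my assumption, not stated in the text:

```python
# Back-of-envelope for the uprate claim above: boosting every existing
# reactor's output by 30-50% is equivalent to building dozens of new ones.
# The fleet size of 104 is an assumed figure for the 2008 US fleet.

existing_reactors = 104
uprate_low, uprate_high = 0.30, 0.50

equiv_low  = existing_reactors * uprate_low
equiv_high = existing_reactors * uprate_high
print(f"Equivalent new reactors: {equiv_low:.0f} to {equiv_high:.0f}")
```

That lands at roughly 31 to 52 reactor-equivalents, matching the "30-50 new reactors" range above.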

Idaho national labs plan for making current reactors better

McCain and Lieberman had a climate change bill that the EIA (DOE's Energy Information Administration) analyzed. It could increase nuclear power by 20% by 2020 and triple it by 2030, because any legislation that increases the cost of coal and natural gas means the next best option for utilities is nuclear. Coal plants are about as big as nuclear plants and similarly take several years to build. China builds coal plants at one per week. In 1984 there were 28 nuclear plants completed worldwide. In 1974 there were 12 nuclear plants completed in the USA.

Coal and nuclear reactors only convert about 33% of the heat energy that they generate into electricity (steam generator efficiency). Some plants are located where the low-grade steam heat can be used for biofuel power input or new desalination.

New thermoelectric technology (electronics to convert heat to electricity) could increase efficiency from 33% up to 45-60%. Again, a huge boost. Some of this work is funded as part of the FreedomCAR project (GE, Caterpillar and others are working on it). The other way to boost thermal efficiency would be to switch to new high-temperature nuclear reactor designs [the Modular Helium Reactor has 47% thermal efficiency]; higher temperatures allow for higher conversion.
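The reason a conversion-efficiency gain is "a huge boost" is that the same reactor heat yields proportionally more electricity:

```python
# For a fixed thermal output, electric output scales with conversion
# efficiency, so the fractional gain is the ratio of efficiencies minus one.

def electric_gain(eff_old, eff_new):
    """Fractional increase in electric output for fixed thermal output."""
    return eff_new / eff_old - 1

print(f"33% -> 45%: +{electric_gain(0.33, 0.45):.0%} electricity")
print(f"33% -> 60%: +{electric_gain(0.33, 0.60):.0%} electricity")
```

Going from 33% to the 45-60% range would therefore raise a plant's electric output by roughly a third to more than 80%, without generating any additional heat.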

Current nuclear reactors, as good as they are, are basically reactors designed for submarines during the 1950s. There were and are nuclear reactor designs that could use 98% of the nuclear fuel instead of 5%. Thus 93% of the "nuclear waste", which is unburned fuel, could be used for energy generation. It would mean completing new reactor designs and building out new reactors (7-10 years to new reactors, another 10-15 years to get significant build-out).

Possible breakthroughs with privately funded nuclear fusion projects have 5-20 year timeframes if they work out. I believe the Bussard, Tri-Alpha Energy and General Fusion projects should work out. Even a ratio of ten fission reactors built for every one expensive nuclear fusion reactor would be important.

All external costs and internal costs compared for different energy sources

Current central power source analysis by the DOE

Specifics of the MIT 50% uprate with new fuel

Past standard uprates and operating efficiency gains, France is uprating about half of their reactors by 7%

Lifecycle CO2 analysis

EROEI comparison for different energy sources

A new centrifuge is 20 times more energy efficient at enriching uranium for reactor fuel.

Nuclear power build not materially constrained

Idaho National Labs' strategic plan for light water reactors would work out the issues of prepping the supply chain for 10+ reactors per year in the USA.

Staffing up nuclear power (other energy sources also have staffing and supply chain issues: a 3-year wait for a wind turbine, grid buildout for a serious shift to wind, new factories and supply chains for solar). The Idaho National Labs strategic plan for light water reactors also addresses staffing.

The EIA analysis of the effect of a climate change bill passing: two to three times more nuclear power from increased nuclear plant build. It does not consider the MIT work or the thermoelectrics.

Flex fuel substitution (which needs to be combined with genetically engineered biofuels)

Direct conversion of radiation into electricity and an alternative thermoelectric advance

Promising alternative private nuclear fusion projects (several have been privately funded).

Constant threats and challenges to life: bacteria need to be emulated

A couple of interesting posts by Howard Bloom about historical threats to civilization and life.

Natural climate shifts, and threats from space (not asteroids but interstellar gases).

Humans are not the most successful life; bacteria are. Bacteria have survived for 3.85 billion years, humans for 100,000 years. All other kinds of life lasted no more than 160 million years. If your numbers are not big enough and you are not diverse enough, then something in nature eventually wipes you out.

I think the point is that the Enlightenment and life are not just about holding past gains. Fear and lack of confidence could allow a retreat in society or make human life more fragile. Emergencies can be used as excuses for theft, for rollback of gains, or to enable choices that increase the fragility of humanity as a whole.
We need to push forward with more confidence: more confident projects of high risk [but with the best planning to maximize the odds of success] and high reward.

We should use the nations of the world to allow the competent citizen to be as free and empowered as possible.
For example, there is no approved gene therapy procedure in the USA yet, while China has had a gene therapy procedure approved since 2003.
Medical tourism is something that perhaps a million people are doing.
Insurance companies are shifting to making deals with overseas hospitals to send people (expenses paid) to a foreign country for treatment, saving money for the insurance company and the patient (no deductibles, etc.).
Medical tourism is also being used to circumvent excessive restrictions.

Use jurisdictions and find ways to make them more open to create competition for more freedom and access and power for individuals.

Enable more open source, creative commons, collective research and projects.

Better and more collaboration.

Disruptions from small recessions to extinction (frequency, history and classification).

April 01, 2008

Samsung Instinct, Current EVDO Rev A wireless and future Super 3G and 4G

Sprint and Samsung are releasing the Samsung Instinct, an iPhone-like phone that uses their 3G network (EVDO Rev A, 1400 kbps down / 400 kbps up). The Samsung Instinct will be available in June, and Sprint is providing an unlimited connection plan. Sprint is a company in trouble; it is losing subscribers and could have to sell out to another company.

UPDATE: A spokesperson for Sprint said the carrier has decided to sell the device for less than $299.99 and wants to push it as close to $200 as margins will allow. That would make for a mid-range iPhone clone with nice touches like localized haptics feedback (powered by Immersion), visual voicemail, a 2MP camera and an included 2GB microSD card.

A 3G version (with about the same speed over the ATT network) of the iPhone should be available in a few months.

DoCoMo has an aggressive deployment schedule for the 250 megabit per second high-speed wireless technology Super 3G (LTE, Long Term Evolution) and has targeted 2010 for initial rollout.

Verizon (NYSE: VZ) Wireless (including its Vodafone (NYSE: VOD) Group equity partner) recently said it plans to move to LTE and the GSM Association, representing most of the world's cell phone service providers, also has endorsed LTE. DoCoMo picked Ericsson as its partner to supply LTE infrastructure. Ericsson also has been soliciting winners of the recent 700-MHz spectrum auction as candidates for its LTE infrastructure technology.

The likely rollout of WiMAX by Sprint, clearwire and others is one of the big reasons many analysts think the LTE vendors and carriers are being more aggressive with launch plans.

The traditional GSM carriers also will have another technology evolution before reaching LTE. That’s HSPA+, the next iteration in the W-CDMA/HSPA technology that expects to provide data rates of about 24Mbps to 40 Mbps. HSPA+, also called HSPA Evolved, is expected to be deployed starting in 2010.

Qualcomm joined with Vodafone, Ericsson and Huawei on a forthcoming trial of HSPA+. Vodafone started deploying HSPA last year, with data rates of up to 7.2 Mbps. Peter Carson, senior director of product management in Qualcomm’s semiconductor group, says access to all these various air interfaces will be done on the device side through multimodal chips.

Currently most US carriers have made a significant shift to EV-DO Rev. A (600-1400 kbps download and 300-800 kbps upload; this is an upgrade over EDGE networks' 473 kbps nominal, with 200 kbps in actual practice). Coverage maps will show which is available from which provider.

Most analysts think it will be at least 2010 and more likely 2011 before LTE equipment starts getting deployed. That creates a possible opportunity for another technology some call 4G: WiMAX. WiMAX doesn’t yet match LTE’s data rates (10 Mbps vs. 100 Mbps) but could still support multimedia applications like video.

Analysts have just started making forecasts for 4G uptake, including LTE, WiMAX and 3GPP2’s Ultra-Mobile Broadband (UMB) fostered by the CDMA community. ABI Research is expecting more than 90 million subscribers for LTE and WiMAX in 2013. Another research firm, Analysys, is suggesting the number of LTE subscribers will hit 400 million by 2015.

Sprint is in talks with Google, Comcast and others to fund the WiMAX rollout. WiMAX would initially have a peak speed of 10 Mbps (2-4 Mbps in actual usage).

Ericsson's director of government and industry relations, Mikael Halen, says his company will roll out its first LTE base stations in 2009, with commercial services up to 100 Mbps later that year.

Mr Halen says Ericsson will deliver a full prototype LTE network to Verizon in the US and Japan's NTT DoCoMo before the end of 2008.

WiMAX is already being deployed in Australia.

A software upgrade could double EDGE performance in the third quarter of 2008.
AT&T would need to deploy the Nokia Siemens software upgrade on its network for subscribers to reap the benefits; so far, not a peep from AT&T about Nokia Siemens' announcement.

EVDO Rev. B is available to wireless carriers, but so far none are moving to it. EVDO Rev. B would be three times faster than EVDO Rev. A (9 Mbps instead of 3.1 Mbps).
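Taking the peak rates quoted in this section at face value (real-world throughput is usually a fraction of peak), a quick sketch shows how the technologies compare for a large download. The table of rates below is assembled from the article's own figures:

```python
# Peak rates in Mbps, taken from the figures quoted in this section.
PEAK_MBPS = {
    "EDGE": 0.473,
    "EV-DO Rev. A": 1.4,               # upper end of 600-1400 kbps download
    "EV-DO Rev. B": 9.0,
    "WiMAX (initial)": 10.0,
    "HSPA+": 24.0,                     # 24-40 Mbps expected
    "LTE": 100.0,
    "Super 3G / LTE (DoCoMo target)": 250.0,
}

def download_seconds(megabytes, mbps):
    """Seconds to move `megabytes` of data at `mbps` megabits per second."""
    return megabytes * 8 / mbps

# Compare time to download a 700 MB video file on each technology.
for tech, rate in sorted(PEAK_MBPS.items(), key=lambda kv: kv[1]):
    t = download_seconds(700, rate)
    print(f"{tech:32s} {rate:7.1f} Mbps  {t / 60:8.1f} minutes")
```

At the peak rates above, the same file that takes roughly three hours on EDGE takes under a minute on LTE, which is why the carriers treat these transitions as generational.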

CTIA-The Wireless Association® announced today that as of December 2007, the industry survey recorded more than 255 million wireless users [USA]. This represents a year-over-year increase of more than 22 million subscribers.

March 31, 2008

Software and design innovation is enabling a 2-litre engine to perform like a 3-4 litre engine and save 27% on fuel

Better computer software is enabling a smaller engine to have higher performance and use 27% less fuel with low emissions

Ricardo's engine, called 2/4SIGHT, uses valves like a four-stroke engine, but in two-stroke mode, the engine keeps both the intake and exhaust valves open at the same time so that the fuel and air in the cylinder are replenished each cycle, rather than every other cycle. Ricardo's prototype, an adapted 2.1-liter V6 engine, has been tested by researchers at the University of Brighton and has been found to be able to produce the kind of performance one would normally expect from a three-to-four-liter engine. Based on the New European Driving Cycle, which is a standard performance test designed to gauge engine efficiency and emissions under typical car usage, the prototype has demonstrated fuel savings of 27 percent, and it reduces emissions by a similar amount. The next phase is to try to incorporate a prototype engine into a working vehicle, says Jackson.

"Four strokes are most efficient at full throttle; with two strokes, it's the opposite," says Robert Kee, a mechanical engineer who specializes in combustion engines at Queen's University, in Belfast, Northern Ireland.

The difference between two- and four-stroke engines is that the latter carry out the four stages of air intake, compression, combustion, and exhaust in four strokes of a piston. A two-stroke engine, in contrast, does this in just two piston strokes.

Two-stroke engines are intrinsically simpler by design and have higher power-to-weight ratios at high loads and low speeds because they get twice as many power strokes per revolution. But traditional two-stroke engines require oil to be mixed in with the fuel, and therefore produce higher emissions. Because of this, they aren't typically used in cars. Instead, they're used for lightweight applications such as chainsaws, lawnmowers, and some motorbikes.

But now, researchers at Ricardo have developed a piston head that operates in both two- and four-stroke mode, and it can switch automatically between the two modes, depending on the needs of the engine. This allows a smaller engine to handle the low-speed, high-load conditions without stalling.
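The mode-switching idea can be sketched in a few lines. This is purely an illustrative heuristic with made-up thresholds, not Ricardo's actual control strategy: the point is that a two-stroke cycle fires on every revolution while a four-stroke cycle fires on every other one, so two-stroke mode suits low-speed, high-load conditions.

```python
# Illustrative sketch of dual-mode operation (NOT Ricardo's algorithm).

def power_strokes_per_rev(mode):
    """A two-stroke cylinder fires every revolution; four-stroke, every other."""
    return 1.0 if mode == "two-stroke" else 0.5

def choose_mode(rpm, load_fraction, rpm_threshold=2500, load_threshold=0.7):
    """Hypothetical heuristic: use two-stroke mode when the engine is
    lugging (high load at low speed), four-stroke mode otherwise."""
    if rpm < rpm_threshold and load_fraction > load_threshold:
        return "two-stroke"
    return "four-stroke"

print(choose_mode(1500, 0.9))   # hill climb from low speed -> two-stroke
print(choose_mode(4000, 0.5))   # highway cruise -> four-stroke
print(power_strokes_per_rev("two-stroke") / power_strokes_per_rev("four-stroke"))
# twice as many power strokes per revolution in two-stroke mode
```

The real system presumably conditions the switch on many more variables (valve timing, boost, emissions state), but the doubling of power strokes per revolution is the core of why a small engine in two-stroke mode can avoid stalling under load.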

Stanford researchers develop tool that 'sees' internal body details 1,000 times smaller

Stanford University School of Medicine researchers have developed a new type of imaging system that can illuminate tumors in living subjects, getting pictures with a precision of nearly one nanometer (one-billionth of a meter).

This technique, called Raman spectroscopy, expands the available toolbox for the field of molecular imaging, said team leader Sanjiv Sam Gambhir, MD, PhD, professor of radiology. Signals from Raman spectroscopy are stronger and longer-lived than those of other available methods, and the type of particles used in this method can transmit information about multiple types of molecular targets simultaneously.

“Usually we can measure one or two things at a time,” he said. “With this, we can now likely see 10, 20, 30 things at once.”

Gambhir said he believes this is the first time Raman spectroscopy has been used to image deep within the body, using tiny nanoparticles injected into the body to serve as beacons.

When laser light is beamed from a source outside the body, these specialized particles emit signals that can be measured and converted into a visible indicator of their location in the body.

Technology Review also has some information on this new imaging technology.

There are several techniques that employ the Raman effect, but this study used SERS (surface enhanced Raman scattering), which relies on roughened surfaces of metal nanoparticles to greatly boost the Raman effect. To create Raman nanoparticles, scientists attach small dye molecules, which scatter light, to these molecular amplifiers. They can then affix molecules that allow them to target the particles to a location in the body, such as antibodies that bind to specific proteins in cells.

The key advantage of this technique is that it allows for what imaging researchers call multiplexing: creating images of several different molecules at once. "One of the problems with imaging is, we tend to only be able to look at one or two things at a time," says Sanjiv Sam Gambhir, lead author of the study and codirector of the Molecular Imaging Program at Stanford. Multiplexing is important in complex diseases like cancer, in which several events occur within tumor cells, each of which could give information about the tumors' status and the likelihood that it will spread. As a first demonstration of multiplexing, Gambhir's team injected mice simultaneously with four kinds of Raman nanoparticles at different concentrations and showed that it is possible to locate the different particles and calculate their concentrations based on their Raman signal.
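The multiplexing demonstration (recovering the concentrations of four particle types from one combined signal) amounts to linear spectral unmixing: if each particle type has its own sharp spectrum, a measured mixture is approximately a weighted sum of reference spectra, and the weights can be recovered by least squares. The Gaussian "spectra" and numbers below are invented for illustration, not real Raman data:

```python
import numpy as np

rng = np.random.default_rng(0)
shifts = np.linspace(400, 1800, 500)   # wavenumber axis (cm^-1)
peaks = [600, 950, 1300, 1650]         # one synthetic peak per particle type

# Reference spectrum for each of 4 particle types (columns of A).
A = np.stack([np.exp(-((shifts - p) / 20.0) ** 2) for p in peaks], axis=1)

true_conc = np.array([4.0, 1.0, 2.5, 0.5])   # relative concentrations
# Measured signal = linear mixture of the references plus noise.
measured = A @ true_conc + rng.normal(0, 0.01, size=shifts.size)

# Recover the individual concentrations by least squares.
recovered, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(np.round(recovered, 2))  # close to [4.0, 1.0, 2.5, 0.5]
```

Because the Raman peaks are narrow and well separated, the unmixing problem is well conditioned, which is the quantitative reason sharp spectra allow more simultaneous targets than broad, overlapping fluorescence bands.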

The most widely used molecular imaging technique in the lab is fluorescence. What makes Raman spectroscopy unique is that "you get a very sharp signal back, unlike [with] fluorescence, where you get a broad spectrum of energy," Gambhir says.

Claudio Vinegoni, an imaging specialist at the Center for Molecular Imaging Research at Harvard and at the Massachusetts General Hospital, who was not involved in the study, says that although scientists can use fluorescent molecules of different colors to see more than one molecule at a time, the ability to multiplex is limited because their signals quickly begin to overlap. In contrast, with Raman spectroscopy, "every molecule has its own Raman spectrum," Vinegoni says, so there is no possibility of the signals interfering. Because of their specificity, Raman nanoparticles can also be imaged at concentrations a thousand times lower than what can be detected using fluorescent quantum dots.

One of the major shortcomings of this technique, as in all optical imaging methods, is the limited ability of light to penetrate deep into tissue. Although it can be used to visualize the internal organs of a mouse, Gambhir says that in humans, the technique would be more useful for visualizing tumors close to the surface of the skin, such as melanomas or even breast cancer. The technique could also be used in conjunction with endoscopes that probe inside the body. Gambhir's team is planning a clinical trial to test the use of Raman particles in conjunction with colonoscopies for detecting early-stage cancers. In this procedure, the nanoparticles could simply be sprayed onto the surface of the colon rather than injected into the body. But a key challenge for bringing this technique into the clinic will be determining the safety of nanoparticles as probes--studies that Gambhir's group is currently undertaking.

Imaging of animals and humans can be done using a few different methods, including PET, magnetic resonance imaging, computed tomography, optical bioluminescence and fluorescence and ultrasound. However, said Gambhir, none of these methods so far can fulfill all the desired qualities of an imaging tool, which include being able to finely detect small biochemical details, being able to detect more than one target at a time and being cheap and easy to use.

Postdoctoral scholars Shay Keren, PhD, and Cristina Zavaleta, PhD, co-first authors of the study, found a way to make Raman spectroscopy a medical tool. To get there, they used two types of engineered Raman nanoparticles: gold nanoparticles and single-wall carbon nanotubes.

First, they injected mice with some of the nanoparticles. To see the nanoparticles, they used a special microscope that the group had adapted to view anesthetized mice exposed to laser light. The researchers could see that the nanoparticles migrated to the liver, where they were processed for excretion.

Using a microscope they modified to detect Raman nanoparticles, the team was able to see targets on a scale 1,000 times smaller than what is now obtainable by the most precise fluorescence imaging using quantum dots.

When adapted for human use, they said, the technique has the potential to be useful during surgery, for example, in the removal of cancerous tissue. The extreme sensitivity of the imager could enable detection of even the most minute malignant tissues.

Fermi paradox, metamaterials, dark matter, recent science and advanced aliens fading into the background

Traditional view of a Dyson sphere. Metamaterials could mask or alter its observable signature, changing how it interacts with magnetism and the electromagnetic spectrum.

The theory of dark matter is that 85-90% of the mass in the universe is dark and does not interact with the electromagnetic force.

Recently there has been a design of metamaterials for magnetic shielding/invisibility. Previously there has been work on metamaterial designs that move microwaves, visible light and other wavelengths around a shielded region. There has also been recent progress on direct conversion of radiation into electricity and on highly efficient conversion of heat into electricity.

Scientists could use the metamaterial as a building block for a magnetic invisibility cloak. Such a cloak could hide magnetism by guiding an applied magnetic field around a cloaked region.

An advanced civilization could create a Dyson shell with metamaterials on the outside for guiding light, heat and magnetism around the shell. [A Dyson shell involves converting all of the planets in a solar system into solar-collecting satellites that orbit the star at about the distance of the Earth. The structure lets you use all the solar energy of the star, which would be over a trillion times more energy than our civilization uses.] The shell would then have minimal interaction with light and magnetism.

I had previous postings on the Fermi Paradox.

My speculation about technology and aliens is that we do not know what technology a civilization able to explore the galaxy would be using. The Fermi paradox is itself based on speculations about aliens.

UPDATE: Centauri Dreams has an excellent follow up to this article on what aliens might do with Dyson spheres and current efforts to look for Dyson sphere building aliens.

Some of the Fermi paradox speculation is about Dyson spheres and Dyson shells (visible megastructures) or visitations to our world. The megastructures might not be the easiest things to spot. They might look like a large infrared source about the size of Earth's orbit, since all the light would be captured. Plus, we are now learning how to convert heat to electricity very efficiently. A highly advanced civilization could be so efficient that it fades into the background of space.

We could be way off base trying to predict what a civilization hundreds, thousands or millions of years more technologically advanced than us would be doing.

Rolling back a few hundred years, one would expect an advanced civilization to be burning all of its forests for wood fuel, or the new coal, etc.

A bit before that an advanced civilization would be breeding and domesticating bigger animals and making sailing ships with 100 to 1000 sails.

Really advanced aliens could have some kind of controlled big bangs. Some high density pocket universes for power sources or they can leave this universe/dimension and travel to or make their own more productive places. People have talked about wormholes for possibly traveling within our universe but that kind of control of space and time means they could make their own dimensional places. I am not saying that this scenario is likely but we do not know that the dyson sphere/shell scenario or the Jupiter brain scenarios are the high probability end state technology version either.

So looking on planets and around stars could be like primitives looking into the best caves and wondering where the advanced people are. Cave and tree dwelling was common 100,000 years ago. Projecting out another 100,000 years in tech development is even more futile. Plus with accelerating tech even projecting out 50-200 years is very, very difficult.

Many people have an over-estimation of how much we have seen with astronomy.

Up until a decade or so ago, humanity had not detected the wobbles in nearby stars caused by extrasolar planets. Up until that point, planets around other stars were speculation. Up until the recent discoveries, it was assumed that those planets would have circular orbits like most of the planets in our system. Now the feeling is that non-circular orbits are more common.

If a Dyson shell or sphere obscured a star, it is not as if we would look in that area and say that a star was missing. There might be some difficult-to-detect infrared smudge. There are large voids in space without visible stars or galaxies. One is a billion light years across. We do not know how that happened, or if the stars and galaxies in there are just more sparse and difficult to see. Until now, optical surveys had found no voids larger than 80 megaparsecs wide, making the new hole 40 times larger in volume than the previous record holder. So there are plenty of gaps in our observations.

Our observations of other galaxies are pathetic. It was not until Edwin Hubble in the early 1920s, using a new telescope, that it was determined that some nebulae were galaxies. He was able to resolve the outer parts of some spiral nebulae as collections of individual stars and identified some Cepheid variables, thus allowing him to estimate the distance to the nebulae: they were far too distant to be part of the Milky Way. In 1936 Hubble produced a classification system for galaxies that is used to this day, the Hubble sequence.

Beginning in the 1990s, the Hubble Space Telescope yielded improved observations. Among other things, it established that the missing dark matter in our galaxy cannot solely consist of inherently faint and small stars. The Hubble Deep Field, an extremely long exposure of a relatively empty part of the sky, provided evidence that there are about 125 billion galaxies in the universe.

But those galaxies of billions and trillions of stars are smudges. We can say practically nothing about the composition of those smudges. We can mostly only determine what is or is not a galaxy, using redshift and standard candles such as Cepheid variables to estimate distance. Knowing a galaxy's redshift tells us nothing about whether there is an advanced civilization inside. If an advanced civilization tore apart its own galaxy and remade it into something else, how would we know that we should be seeing a galaxy where there is none now? If they did this within the last two million years but are not in our galaxy, then we would not yet have any light to observe from that event. If they did it within the last 100 million years but are not in our supercluster of galaxies, then again we would have no light from those events.

Our own Milky Way galaxy was recently found to be twice as thick as we thought (12,000 light years instead of 6,000 light years).

The Andromeda galaxy in 2007 was found to be five times bigger than previously thought.

From our observations, we cannot tell what is or is not inhabited. We cannot tell what is or is not natural. We can make assumptions, but we do not know. If another civilization were to look at our solar system, and all they could get was the light from our star (plus, with good telescopes, perhaps whether or not Jupiter passed in front of it), what could they say about life on Earth, which they do not know is there? What could they say if we had molecular nanotechnology, super AI, fusion power, one hundred times the population, terraformed Mars and Venus, and spaceships flying around the solar system? It would look the same: still a star.

What if they were looking from five thousand light years away at a shot of tens of millions of stars, and we had erected a Dyson shell so that our star was obscured? Would their picture of ten million stars look different from one of ten million plus one? Would they be able to determine from gravitational tracking that there was an obscured star?

How about if they were looking at the galaxy? If they are inside the Milky Way and do not know whether there are 100 billion stars or 2 trillion stars, then what can they say about life, even advanced life? What if those aliens had only indirectly detected 200 planets, most of them 10 times the size of Jupiter or bigger? What if their radio telescopes could not pick up radio signals from an equivalent civilization because the transmissions become attenuated and fade into the background? What if that civilization is just now still finding planets larger than Pluto in its own solar system, and may not have spotted 100 objects between the size of the Moon and Pluto, or bigger?

A d.c. magnetic metamaterial: Nature Materials

Electromagnetic metamaterials are a class of materials that have been artificially structured on a subwavelength scale. They are currently the focus of a great deal of interest because they allow access to previously unrealizable properties such as a negative refractive index. Most metamaterial designs have so far been based on resonant elements, such as split rings, and research has concentrated on microwave frequencies and above. Here, we present the first experimental realization of a non-resonant metamaterial designed to operate at zero frequency. Our samples are based on a recently proposed template for an anisotropic magnetic metamaterial consisting of an array of superconducting plates. Magnetometry experiments show a strong, adjustable diamagnetic response when a field is applied perpendicular to the plates. We have calculated the corresponding effective permeability, which agrees well with theoretical predictions. Applications for this metamaterial may include non-intrusive screening of weak d.c. magnetic fields.

A collection of links on Dyson shells

Discussion about aliens and Dyson shells and spheres with some excerpts below.

An energy balance calculation of the temperature inside a Dyson sphere: assuming that energy is lost only by blackbody radiation, and that inner (absorbing) and outer (emitting) surface areas are the same, a Dyson sphere at 1 AU from the sun would equilibrate to a temperature of 396 K = 123 C. Clearly, this is unacceptable.

To achieve a stable temperature of 33 C (still a bit warm if you ask this Canadian), the sphere would have to be at about 1.7 AU, or have an effective radiating area (fins?) about 3 times the inner absorbing area.

A Dyson shell has to radiate eventually because the thermodynamic gradients that power everything need a heat-sink. If we dump heat as microwaves just a bit shorter than the CMB, at a radiator temperature of 3.25 K, then the radiator surface has to be ~ 20,768 AU in radius to handle the Sun’s output. A lower output temperature means a much larger radiator. I seriously doubt anything can be made that big except for very tenuous gas.

So what’s to be done? A civilization needs thermodynamic gradients and so at some temperature it must radiate. Perhaps the old SF idea of “cooling lasers” might be feasible? By dumping heat as a coherent beam pointing into the intergalactic void, advertising one’s presence to the galaxy might be avoided. In that case I suspect a real Dyson sphere will be needed to manage the Sun’s light, instead of the Dyson Swarm that Dyson himself imagined.
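The energy-balance figures quoted above follow from the Stefan-Boltzmann law. This sketch uses standard values for the solar luminosity; the radiator figure comes out in the same ballpark as, but not identical to, the quoted 20,768 AU, since the exact number depends on the assumed luminosity, emissivity and geometry:

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26   # solar luminosity, W
AU = 1.496e11      # astronomical unit, metres

def shell_temperature(radius_au):
    """Equilibrium blackbody temperature of a shell absorbing the full
    solar output and radiating from a sphere of the same radius:
    L = 4*pi*r^2 * sigma * T^4."""
    r = radius_au * AU
    return (L_SUN / (4 * math.pi * r**2 * SIGMA)) ** 0.25

def radiator_radius_au(temp_k):
    """Radius needed to dump the full solar output as blackbody
    radiation at temperature temp_k."""
    return math.sqrt(L_SUN / (4 * math.pi * SIGMA * temp_k**4)) / AU

print(shell_temperature(1.0))    # ~394 K, matching the ~396 K figure
print(shell_temperature(1.7))    # ~302 K (~29 C), near the ~33 C figure
print(radiator_radius_au(3.25))  # order 10^4 AU for a 3.25 K radiator
```

Note how steeply the radiator size grows: since the radiated power scales as T^4, halving the radiator temperature quadruples the required radius.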

Gene Therapy Breakthrough: Three MicroRNA inhibition injections reduce cholesterol 30%, 2013 for human use

Two forms of RNA either boost or suppress protein production, and scientists have in recent years discovered this system is central to a range of illnesses, including cancers, viral infections, cardiovascular disease and neurological disorders.

Previous attempts to manipulate this process have failed because the drug molecules used were too large to get to the target cells. Now scientists at Santaris Pharma have developed smaller compounds that can cross cell membranes and intercept the microRNA molecules that usually put a brake on protein production.

LNA-mediated microRNA silencing in non-human primates.

MicroRNAs (miRNAs) are small regulatory RNAs that are important in development and disease and therefore represent a potential new class of targets for therapeutic intervention. Despite recent progress in silencing of miRNAs in rodents, the development of effective and safe approaches for sequence-specific antagonism of miRNAs in vivo remains a significant scientific and therapeutic challenge. Moreover, there are no reports of miRNA antagonism in primates. Here we show that the simple systemic delivery of an unconjugated, PBS-formulated locked-nucleic-acid-modified oligonucleotide (LNA-antimiR) effectively antagonizes the liver-expressed miR-122 in non-human primates. Acute administration by intravenous injections of 3 or 10 mg kg-1 LNA-antimiR to African green monkeys resulted in uptake of the LNA-antimiR in the cytoplasm of primate hepatocytes and formation of stable heteroduplexes between the LNA-antimiR and miR-122. This was accompanied by depletion of mature miR-122 and dose-dependent lowering of plasma cholesterol. Efficient silencing of miR-122 was achieved in primates by three doses of 10 mg kg-1 LNA-antimiR, leading to a long-lasting and reversible decrease in total plasma cholesterol without any evidence for LNA-associated toxicities or histopathological changes in the study animals. Our findings demonstrate the utility of systemically administered LNA-antimiRs in exploring miRNA function in rodents and primates, and support the potential of these compounds as a new class of therapeutics for disease-associated miRNAs.

Mice on a high fat diet were given three injections of a drug to block miRNA-122, a compound in the liver that controls cholesterol levels. Those given the highest dose had 30 per cent lower cholesterol levels than those given placebo injections, and the effects lasted three weeks after the last injection.

Laboratory tests also showed that blocking miRNA-122 prevented the hepatitis C virus from replicating. Human trials of a drug to treat hepatitis C will begin next year, and scientists are using the method to develop a treatment to combat blood cancers.

Santaris predicts new therapies will be ready for use by patients within five years if trials go well.

Santaris Pharma is preparing to advance its first LNA-antimiR compound, targeting miR-122, into human clinical testing in the first half of 2008.

Safer and more precise methods of targeting gene therapy are being made with synthetic zinc fingers.

Researchers have figured out the real problem with a common gene therapy delivery system, adenovirus type 5. Adenovirus consists of three major proteins: fiber, penton and hexon. Previously researchers thought the fiber protein was the problem. New research indicates it is the hexon. By modifying the hexon, they can now make adenovirus-delivered gene therapy safe.

“Now that we have learned the mechanism that an adenovirus uses we could modify that process by genetically engineering the virus, to improve uptake into several cell types, including stem cells,” says Dr. Napoli.

There has also been progress in using gene therapy to treat brain cancer.

Safe and effective gene therapy or drugs that safely target genetic effects could be used to safely boost muscle mass by four times. This could make people stronger and healthier. Better weight control with more muscle that burns excess fat. 2012-2016 seems to be the likely timeframe when these procedures start making a big societal impact. It could happen sooner and more could happen later, but that seems to be the time when more people will realize that a new age is upon the world.

March 30, 2008

The big energy picture

This is how much energy the United States was using from 2002 to 2006. Notice that solar is 0.1%. The increase in nuclear power from 2002 to 2006 was equal to the total amount of all solar power (even though that increase came just from operating efficiency and some small nuclear uprates).

Oil and fossil fuel usage was increasing. Petroleum (oil) was the primary source. 21 million barrels per day or about 7.4 billion barrels per year.

Oil usage in the united states is described here

Twenty times as much solar power as there was in 2006 would be 1.2 quads. It would be nice, but only about 5% of coal usage. Increasing wind to ten times its 2006 level would be 2.6 quads. Combined, that would be about equal to the business-as-usual increase in energy consumption one would expect over the period. All of the old coal and oil would stay in place.
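The quad arithmetic above can be checked on the back of an envelope. The 2006 baseline figures here are rounded assumptions consistent with the multiples quoted in the text:

```python
# Rough 2006 US figures in quads (quadrillion BTU); values approximate.
solar_2006 = 0.06   # implied by "20x solar = 1.2 quads"
wind_2006 = 0.26    # implied by "10x wind = 2.6 quads"
coal_2006 = 22.5    # assumed round figure for US coal use

solar_x20 = 20 * solar_2006
wind_x10 = 10 * wind_2006
print(round(solar_x20, 1), round(wind_x10, 1))       # 1.2 2.6
print(f"{solar_x20 / coal_2006:.0%} of coal usage")  # ~5%
```

Against total US consumption of roughly 100 quads, even these aggressive multiples of solar and wind amount to a few percent, which is the point the paragraph is making.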

The California million solar roofs plan provides subsidies of $2.9 billion, with the hope of getting 3 GW of solar power installed by 2018. These kinds of programs are not good energy investments, because the same investment could buy more nuclear power or wind power, or pay for research into more efficient and effective solar or other energy. A Berkeley study shows that solar installations do not pay back their investment.

Biomass has a more significant share.

France was able to achieve over 30% of its energy from nuclear (80% of electricity), or 4.4 quads out of 11.4 quads.

Brazil has been able to get more of its cars running on biofuels from sugarcane.

Energy Plan
Any reasonable energy plan has to look at still obtaining and using oil for the next ten to twenty years. This means enhanced oil recovery and new oil sources (such as the Bakken Formation) and new natural gas sources.

Even drilling oil from profitable reserves takes time. US drilling activity in 2007

Making our homes and buildings more energy efficient. Heating, insulation and appliances need to be addressed more aggressively.

New technology for uprating nuclear power plants can add 50% more power to existing reactors within 10 years. Regular nuclear power uprates will be adding 4% to nuclear reactors in France and 2% to US reactors over the next 6 years. New nuclear plants are being constructed and could add 150-250 GW worldwide by 2020.

I had a prior post on short, mid and long term energy strategy.

Short term: conservation and drilling for more oil, enhancing oil recovery, uprating nuclear power, developing and deploying more efficient thermoelectric processes and technology, ecomodding the fraction of the 800 million existing cars and trucks that do a lot of highway travel (making them more aerodynamic, with a focus on those that drive on the highway the most), and increasing industrial and home efficiency. Adopt policies that shift energy away from coal and oil.

Mid-term (2012-2020): significant nuclear power and efficiency technology could be brought into play. The privately funded nuclear "battery" (the Hyperion uranium hydride reactor) could be developed and placed into fairly high production (50 per year would be equal to one large nuclear plant).
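A rough check of the "50 per year equals one large plant" claim. The per-module output used here is an assumed figure of roughly 27 MWe, in line with reported sizing for the Hyperion module:

```python
mwe_per_module = 27     # assumed output per Hyperion module, MWe
modules_per_year = 50
print(mwe_per_module * modules_per_year)  # 1350 MWe, about one large plant
```

A large conventional reactor runs roughly 1,000-1,500 MWe, so 50 small modules a year is indeed in the range of one big plant's worth of capacity annually.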

Kitegen is an interesting wind power technology with significant potential.

By 2015 the IRIS reactor and/or the Modular Helium Reactor could provide greater fuel and energy efficiency and lower costs.

Nuclear fusion could start making a significant difference to the energy picture in the 2015-2020 timeframe if the IEC fusion, colliding beam fusion or some of the other private projects pan out.

Oil prices fell March 31, 2008 to about $100/barrel.

Refineries operated at 82.2 percent of capacity in the week ended March 21, the lowest since October 2005, the department said last week. Total implied US fuel demand averaged 20.3 million barrels a day in the four weeks ended March 21, down 2.2 percent [440,000 barrels per day, or about 0.8 quads per year, more than the combined power from wind and solar; some demand destruction caused by higher prices] from a year earlier, the Energy Department said last week. Fighting between government forces and militiamen in Iraq eased after a truce offer from Moqtada al-Sadr. Iraq has the world's third biggest oil reserves, according to BP Plc.

Electric cars using a cellphone-like billing service model could be dominant in countries like Israel and Denmark within ten years. Smaller countries, islands (Hawaii, Singapore, some Japanese islands) and city regions with smaller service areas and higher gasoline prices and car taxes are most suitable for the new model.
