This would be great, but I think it will initially be a niche market. It may stay a niche market, depending on costs and competing tech. New diamond plants would only gradually replace regular silicon plants, and diamond semiconductors will have to scale up their processes (wafer size); wafer size drives costs and efficiency.
If the processes can be performed at low enough temperatures and integrated with silicon, then it could be possible to make hybrid chips, with some diamond cells performing the high-frequency work and calculations. That would allow a faster rollout.
So an 8-20 times performance boost, but over how many years is that impact spread?
If it is 10 years (optimistic), that is 5 iterations of Moore's law (32 times). Instead of 32 times faster, we end up at 256-640 times, so 3-4 more iterations of Moore's law.
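A quick sketch of that arithmetic (assuming Moore's law means a doubling every 2 years; the function name here is just for illustration):

```python
import math

def moore_factor(years, doubling_years=2.0):
    """Speedup from Moore's law alone over `years`."""
    return 2 ** (years / doubling_years)

base = moore_factor(10)             # 2^5 = 32x from lithography alone
for boost in (8, 20):               # diamond's one-time 8-20x boost
    combined = base * boost
    extra = math.log2(boost)        # how many extra doublings the boost equals
    print(f"{boost}x boost -> {combined:.0f}x total, ~{extra:.1f} extra doublings")
```

So an 8x one-time boost is worth 3 extra Moore iterations, and a 20x boost a little over 4.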
Other tech being developed could also provide substantial boosts in speed: carbon nanotube electronics, advanced spintronics, laser interconnects, etc.
Altogether that is a lot of new tricks to master. Some of them will compete; probably some will end up in different niches. It also means that the endgame, when Moore's law runs out of steam, keeps getting pushed out by decades until all the tricks get mastered at low cost.
Diamond semiconductors look good for military and space applications, where you want resistance to high temperatures and don't care as much about price.
Something that I think looks good for boosting high-end computing is taking the cheap and soon-to-be-high-volume Cell chips (Sony PS3), making a slight modification to one cell for double precision, and getting 10-40 times more speed. A Cell+ would be about 20 times faster, 2 times faster again on moving to a 65nm process, and would then keep pace with smaller process lithography. It would take 2-4 years to ramp up Cell chip volumes.
The semiconductor industry has resisted getting out of its silicon comfort zone (for valid economic reasons, and for rapid scaling). If they can use tricks with silicon to get a reasonably close speed boost (better strained silicon, substrate tricks, etc.), then they will use those instead of bringing in an entirely new process.
There are a lot of interesting ways to speed things up. I think advancement should get faster than Moore's law even without MNT [MNT is Molecular Nanotechnology].
Obviously, if we get good MNT, then a lot of the economic roadblocks to integrating and rolling out new materials, and to shortening the gap from lab to product, could be removed. The technology improvement logjam could be broken. We could get maybe 19-26 iterations of Moore's law when we were expecting 1 to 5.
4 from diamond/carbon nanotubes, 4 from laser interconnects between chips (plus there is a Sun Microsystems process for putting chips end to end for communication), 5 from advanced cell architectures, 6 from smaller process sizes, maybe 5 from going 3D early (a processing cube instead of a chip), and 2 from better cooling.
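Tallying those per-technology estimates (the counts are the speculative numbers above, not measured figures):

```python
# Rough tally of the speculative Moore-iteration estimates above.
iterations = {
    "diamond / carbon nanotubes": 4,
    "laser interconnects": 4,
    "advanced cell architectures": 5,
    "smaller process sizes": 6,
    "early 3D (processing cube)": 5,
    "better cooling": 2,
}
total = sum(iterations.values())
print(f"total iterations: {total}")
print(f"combined speedup: ~2^{total} = {2 ** total:,}x")
```

The full tally comes to 26 doublings, the top of the 19-26 range, since not every trick will pay off in full.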
I would be surprised if, even with MNT, we could roll out MNT simultaneously with a fully realized computing process using all of the optimal tricks at once. So not a few months, but a few years. Still probably a big impact quite fast: 50 years of rapid Moore's law progress in, say, 6-9 years.
Plus the boost to quantum computing, superconductors, etc...
Moore's law is also a cost thing, so MNT could throw in some extra iterations by making things cheaper, faster.
Much better supercomputers accelerate simulation and R&D, so we learn new tricks faster.
I think the increase in the rate of progress gets sustained even after the initial burst (if we don't screw it up): 6-9 month doubling cycles instead of 24 months. That is without strong AI or strong intelligence enhancement.
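To see what that cycle change means, here is a small sketch comparing cumulative doublings over a decade (the cycle lengths are the hypothetical figures above):

```python
# Compare cumulative speedup over a decade at different doubling cycles.
def doublings(years, cycle_months):
    """Number of doublings in `years` at one doubling per `cycle_months`."""
    return years * 12 / cycle_months

for cycle in (24, 9, 6):
    d = doublings(10, cycle)
    print(f"{cycle}-month cycle: {d:.1f} doublings, ~{2 ** d:,.0f}x in 10 years")
```

A 24-month cycle gives 5 doublings (32x) per decade; a 6-month cycle gives 20 doublings, about a million-fold.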
I think the new thing (post-MNT) would be to push out products that were, say, 4-X times better every year. Merely doubling speed would not be worth switching for in most cases.