IBM Made First Chip with 2 Nanometer Nanosheet Technology

IBM announced the first chip with 2 nanometer (nm) nanosheet technology.

IBM’s new 2 nm chip technology helps advance the state of the art in the semiconductor industry, addressing the growing demand for faster, more energy-efficient chips. It is projected to achieve 45 percent higher performance, or 75 percent lower energy use, than today’s most advanced 7 nm node chips.

Four years after IBM announced its milestone 5 nm design, this latest breakthrough packs up to 50 billion transistors onto a chip the size of a fingernail.

Advanced 2 nm chips will enable:

* Quadrupling cell phone battery life, so users would only need to charge their devices every four days.
* Slashing the carbon footprint of data centers, which account for one percent of global energy use.
* Drastically speeding up a laptop’s functions, from quicker application processing, to easier language translation, to faster internet access.
* Contributing to faster object detection and reaction time in autonomous vehicles like self-driving cars.

A 2 nm wafer fabricated at IBM Research’s Albany facility. The wafer contains hundreds of individual chips. Courtesy of IBM

SOURCES – IBM
Written By Brian Wang, Nextbigfuture.com

8 thoughts on “IBM Made First Chip with 2 Nanometer Nanosheet Technology”

  1. By the way, are we at the limit yet? We always get lower nm numbers advertised, but barely perceptible performance improvements.
    It would be nice to see someone come out with a processor technology that's not based on the same ~60-year-old concept.

  2. Why is your new laptop a pig? Because new programs and operating systems expand to occupy the resources they have.
    I remember back in the 386 days when someone was complaining that the new version of Windows took up 40 MB. He said, "Nothing should take up 40 MB!"

    Now the bi-weekly update to Windows 10 takes up about 1 GB.
    The newer computers are really nice. It's the newer software that bogs them down.

  3. Yeah, moving from 5 nm to 2 nm for the same CPU design is roughly a sixth of the footprint ((2/5)² ≈ 0.16, if the node names were literal feature sizes; rough numbers in the sketch after this comment), but I doubt that translates directly into a proportional cut in power consumption. Quantum electron tunneling is gonna be fun at this feature size…

    The real big deal is that this will effectively mean microcontrollers getting smaller/faster/cheaper with much more compute. We're already seeing microcontrollers with full multipurpose ARM cores as is, so going smaller will allow more cores/memory. That may lead to the end of most microcontroller lines aside from very niche applications, and the arrival of an effectively standardized high-performance microcontroller for peanuts, as makers consolidate on fabbing just a few chip types for microcontroller applications. You can see examples of this in fake chips right now: decapping them shows a generic ARM microcontroller running software to emulate the microcontroller they were disguised as, and that is still cheaper than the original microcontroller.

    So we'll be seeing smartdust IoT things. We'll be seeing heterogeneous core mixes in small/mobile CPU applications (fast cores for single-threaded stuff, and slow cores for general/housekeeping processes). More cache.

    As for why GoatGuy's new Windows PC is slow, I suspect it's a combo of maker bloatware and Windows evolutionary cruft. My early Core2Quad 3 GHz 8 GB PC with a SATA SSD, no third-party antivirus, and a clean Windows 10 install with no bloatware has served me well.
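
    A minimal sketch of the footprint arithmetic above, assuming (unrealistically) that node names map directly to linear feature size; in practice "2 nm" is a marketing label, so treat this as an upper bound on the shrink, not a prediction:

    ```python
    # Rough die-area scaling if node names were literal linear dimensions.
    # Area scales with the square of the linear dimension; real node names
    # ("5 nm", "2 nm") are marketing labels, so this is an upper bound on
    # the shrink, not a prediction.

    def area_ratio(old_nm: float, new_nm: float) -> float:
        """Area of the new design relative to the old, for the same layout."""
        return (new_nm / old_nm) ** 2

    print(f"5 nm -> 2 nm footprint ratio: {area_ratio(5, 2):.2f}")  # 0.16, about a sixth
    ```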

  4. There is an equivalent for energy efficiency in applications other than computing, called the Jevons paradox: an increase in efficiency makes a resource cheaper to use, which, together with elastic demand, increases overall energy consumption (a toy calculation follows this comment).

    Modern flat-screen TVs are more efficient than CRTs, but they still consume more energy, because even a large CRT is tiny by modern standards. I think consumption peaked with plasmas, but we can surpass that again with wall-sized modular TVs or HDR with very high peak brightness.

    First came refrigerators, and they were tiny. Then came larger refrigerators and freezers. Then came A/C to refrigerate entire houses and apartments. Then refrigerator-freezer combo units asymptotically approached the size of the door; now it's common to have two refrigerator/freezer pairs for just a small family. Energy consumption growth only stopped when doorways put a limit on their size.
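
    A toy calculation of the Jevons effect described above, a hypothetical sketch assuming constant-elasticity demand (the elasticity values are made up for illustration): if demand for a service responds to its cost with elasticity ε, an efficiency gain η cuts the cost per unit of service by η, demand grows as η^ε, and total energy scales as η^(ε−1), so any ε > 1 means efficiency gains raise total consumption.

    ```python
    # Toy Jevons-paradox model with constant-elasticity demand.
    # Assumptions (illustrative, not empirical): cost per unit of service
    # falls in proportion to an efficiency gain `eta`, and demand follows
    # demand ∝ cost**(-epsilon). Total energy = demand / eta.

    def total_energy(eta: float, epsilon: float, baseline: float = 1.0) -> float:
        """Energy use after an efficiency gain eta (>1), given demand elasticity epsilon."""
        demand = eta ** epsilon          # cheaper service -> more of it gets used
        return baseline * demand / eta   # each unit of service needs 1/eta energy

    for eps in (0.5, 1.0, 1.5):
        print(f"elasticity {eps}: 2x efficiency -> {total_energy(2.0, eps):.2f}x energy")
    # elasticity 0.5 -> 0.71x (savings), 1.0 -> 1.00x (a wash), 1.5 -> 1.41x (Jevons)
    ```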

  5. Yep, came to make the same comment.
    Of the 4 listed outcomes, only the last is likely to occur.

    Because only robo-driving is currently limited by available technology. In the other cases, the system is sized to produce the minimum acceptable performance. Improved technology will result in resizing, not improved performance.

  6. Is there some aphorism that describes this? Something like, "A growth in systemic capacity is inevitably accompanied by a bloat in systemic inefficiency."

  7. Quadrupling battery life…

    Maybe. More likely, there will be more 'going on' on future cell phones, so the 'savings' will be abbreviated. And with the ever-present trend toward thinner and lighter, less battery will be put in the phones because '1–2 days is enough'.

    • Slashing the carbon footprint of data centers

    Why not 'reducing energy per unit of computing performance'? Same as with the cell phones: with much-increased performance, the actual joule budget is not going to shrink, because computing demands increase in proportion or more so.

    • Speeding up laptops, applications, language translation, internet access

    Ah… the promise of the last 35 years? Each new generation is supposed to save energy, be faster, be smarter, offer less and less 'human impact' from dawdling.

    Yet, my wife's brand-new computer is a pig.
    8 GB of RAM, a 4 GHz 6-core Intel CPU.
    A fine RDX GPU.
    A ½ TB SSD drive.

    And it is a dog.  
    WHY?

    Its specs far exceed those of my 2013 MacBook Air (1.3 GHz, 4 GB RAM, 2 cores, ¼ TB SSD, no discrete GPU, shared video/system memory). Yet my MacBook runs circles around the new computer. WHY? WTF-WHY?

    • Contributing to faster autonomous vehicle robo-driving

    Well, sure. 2 nm GPUs will definitely put da peppa in da salsa. 
    On this we agree.

    ⋅-⋅-⋅ Just saying, ⋅-⋅-⋅
    ⋅-=≡ GoatGuy ✓ ≡=-⋅
