IBM creates 9nm carbon nanotube transistor that outperforms silicon (extremetech.com)
79 points by ukdm on Jan 26, 2012 | hide | past | favorite | 31 comments


When I think of IBM, I still think of consultants and guys in blue suits. ...but more than anyone else out there, these guys are building the future.

See also:

http://en.wikipedia.org/wiki/Watson_(computer)

http://www.engadget.com/2011/06/30/embargo-ibm-develops-inst...

http://www.engadget.com/2012/01/14/ibm-stores-bits-on-arrays...

I hate to think what their patent portfolio looks like.


Its patent portfolio is huge -- the biggest in the world. They have registered the most patents per year for, like, the last 10 years. More than 5,000 per year! (The next biggest are Samsung and MS, which have averaged around 2k-3k per year, I think.)


I worked for IBM for a year before Uni. Great place to work, but there was a joke that if you sneezed, they would patent it. They offered bounties for any granted patents, even if they were never used.


How awesome would it be if we could just hit delete on the whole shitty patent db.


Your horrible spelling aside, I wouldn't mind seeing the patent system suffer an irreversible data corruption, along with a fire destroying all hard copies.

A forced reset, as it were.


If IBM didn't think that it would be awarded patents for work like this, why would they bother to do the research in the first place?


Surely you don't think the only reason people do research and advance science is due to hope of compensation?


That might not be the only reason for individuals; for companies, however, it would be silly to spend money without hoping to get it back somehow.


You don't need patents to create and execute big ideas. As a company or anything else.

As I said, this is a dream (and a pretty silly one at that). I can still hope, though.


The alternative to patents is secrecy. I think the patent system was originally introduced because it was deemed superior to make ideas public, but limit their commercial usability.


Problem being that the patent system (much like the copyright system) is irrevocably hosed.


Yes, without patents nobody is ever going to invent anything again.


Shake that preconception! I can say with confidence that in the Hursley UK site (where us tech types are) you'd struggle to even find someone in a suit.


It's true that IBM does incredible research, but their results never seem to break into the mainstream.


They do end up in the mainstream. IBM is not a B2C company, though, so there is no hype behind the unveiling of a new shiny gadget. So the tech community at large seriously underestimates IBM's contributions to the field.



SQL came out of IBM research, and it's quite mainstream.


IBM was the first company to successfully release a commercial USB flash drive back in 2000.


I believe IBM was the first company to use GMR in hard drives.


And even hard drives; after all, they invented them, too.



Transistors that outperform silicon aren't especially rare. The hard part is finding ones that can be manufactured more easily and cheaply than silicon.


How do they get the carbon nanotube in the correct position and orientation like that? I suspect it will be difficult to do it reliably and cheaply for billions of transistors at a time.


I apologize if this is a bit long:

The answer can be found in the supplementary information to the article here: http://pubs.acs.org/doi/suppl/10.1021/nl203701g/suppl_file/n...

In short, they grew the nanotubes on a quartz crystal, transferred them with tape quite randomly onto gates already made on a Si wafer, and then etched and metalized around these gates. They then tested a large number of devices to find ones where a nanotube of the right kind (semiconducting) had placed itself in correct alignment with the gate and metalized contacts (source and drain). Once they knew which devices were working, they imaged some of them with an Atomic Force Microscope (AFM) and did a bunch of standard transistor measurements.
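In effect, this is a yield lottery: make lots of devices, keep the few where a suitable tube landed in the right place. A minimal Monte Carlo sketch of that idea (the alignment probability here is a made-up illustrative number, not a figure from the paper; the ~2/3 semiconducting fraction is the standard estimate for random-chirality nanotubes):

```python
import random

def device_yield(n_devices, p_semiconducting=2/3, p_aligned=0.05, seed=0):
    """Estimate the fraction of randomly assembled CNT devices that work.

    A device "works" only if a nanotube lands aligned with its gate and
    contacts (p_aligned, an assumed illustrative value) AND that tube
    happens to be semiconducting rather than metallic (~2/3 for tubes of
    random chirality).
    """
    rng = random.Random(seed)
    working = sum(
        1
        for _ in range(n_devices)
        if rng.random() < p_aligned and rng.random() < p_semiconducting
    )
    return working / n_devices

# With these assumptions, only about p_aligned * 2/3 of devices survive,
# which is why they had to fabricate and screen so many.
print(device_yield(100_000))
```

That tiny surviving fraction is fine for a proof-of-concept paper, but obviously not for a billion-transistor chip, which is the scaling problem discussed below.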

As was commented earlier, growing nanotubes in well-defined positions is not easily achieved. It will IMO be a long time before anyone can do a full chip where nanotubes grow exactly where one wants the transistors. It might even be that graphene is a more convenient technology for just this reason (since graphene can be grown somewhat more conveniently by annealing SiC wafers). A side note, I guess, is that nanotubes and pillars can be grown vertically in specific spots on a wafer, but that makes manufacturing of the gate a bit problematic. And I do not know whether carbon nanotubes can be grown selectively this way.


Pfft. That's an engineering problem, not a research problem. I kid a bit, but IBM does have the fabrication engineering talent to pull it off.


The mass production of transistors is largely thanks to the relative simplicity of the etching process. I wonder if we will see a similarly scalable process for alternative transistor technologies.


They probably use an atomic force microscope (google "atomic force microscope nanomanipulation").


Probably. There's a reason we are still using 32nm silicon, even though we can experimentally fabricate graphene & gallium-arsenide transistors at sizes down to... 6nm? (That's the smallest I remember.)


"At 9nm, IBM’s transistor is also smaller than the physical limit of silicon transistors, which is around 11nm."

Can anyone cite a source for that statement regarding the theoretical minimum size of silicon transistors?


According to the Wikipedia article about the 11nm node, the thickness of the gate dielectric becomes just 1 atom at that scale: http://en.wikipedia.org/wiki/11_nm


Although Intel mentioned recently that they think they can make it to 8nm.




