He that will not apply new remedies must expect new evils; for time is the greatest innovator. - Francis Bacon
In previous editorials, we examined the High Tech industry in significant detail by picking a representative sample of leading systems, software and semiconductor companies. Taken as a group, the selected firms serve the full range of technology markets - the first-tier markets of the three C's and mobile computing, and the second- and third-tier sectors of industrial, automotive, medical and mil/aero.
The examination revealed that macroeconomic factors emerging after 2007 were severely hampering growth across the industry. These factors stemmed from the ridiculously leveraged positions of large commercial banks and sovereign governments in nearly all of the world's developed and emerging economies.
The 'cures' imposed by various governments and central banks for their economic woes were derived from existing economic orthodoxy and its standard models and theorems. However, in this instance, the debt problems proved so great that the remedies not only failed but actually locked the global economy into a 'liquidity trap.' As a consequence, national economies in most of the world are stagnant, with uncertainty and financial duress stifling the extension of credit that would facilitate business expansion and growth.
The fallout from this moribund economic environment has included, among other things, an extended period of significantly higher-than-usual unemployment throughout most of the industrialized world. The resultant consumer distress has manifested itself as a weakening in disposable income, with negative effects that are still cascading through the entire High Technology industry.
Ill fortune seldom comes alone. - John Dryden
While being buffeted by these ill economic winds, High Tech has also encountered a frightful technology obstacle - the death of Moore's Law. Without 3P (performance, power, price) gains of 20%-25% for each successive process generation, additional chip features and benefits can no longer be integrated without penalty. This constrains the enhancements, optimizations and improvements of all other technologies through the High Tech value chain, including systems and software, in a kind of 'domino effect.'
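To see why losing those per-node gains matters so much, it helps to remember that they compounded. A minimal sketch, using a hypothetical 22% per-generation improvement (a figure chosen for illustration from the 20%-25% range above):

```python
# Illustrative only: the compound effect of per-node "3P"
# (performance, power, price) gains, assuming a hypothetical
# 22% improvement per process generation.
def compounded_gain(per_node_gain: float, generations: int) -> float:
    """Total improvement factor after n process generations."""
    return (1 + per_node_gain) ** generations

# Five node transitions at 22% each compound to roughly a 2.7x
# improvement - the kind of "free" gain the industry has now lost.
print(f"Improvement after 5 nodes: {compounded_gain(0.22, 5):.2f}x")
```

The exponential, not the 22% itself, is what powered four decades of growth; remove the exponent and each remaining improvement is a one-time event.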
Yet despite this one-two combination of blows to the industry, there are no 'Chicken Little' proclamations echoing across High Tech office cube farms. This is simply not the way that workers or management react to bad news in this business.
Victory belongs to the most persevering. - Napoleon Bonaparte
To work as an engineer in High Tech, you have to be an individualist by nature. You also have to be an optimist. Engineers are, by both inclination and training, disposed to critically analyze an event or item, decomposing it into its constituents in order to find ways to improve upon the whole. They proceed under the assumption that there is always a way of improving upon the original.
When confronted by severe market and technology challenges, High Tech engineers don't throw their hands in the air and yell "The Sky is Falling!" or dismiss the future prospects of their industry as a mature sector with no further room for growth. Stated differently: Wall Street orthodoxy cannot understand and doesn't apply to High Tech. When the going gets tough, Technology engineers simply get to work making new things and improving on old ones.
The end result has been a wave of new or refreshed interest in, and work on, different process, memory and packaging technologies. In other words, the Silicon Valley response to the macroeconomic and technological Chimera has been to do what it does best - Innovate.
Silicon On Insulator (SOI) is a process technology developed to extend performance and power improvements in any given conventional bulk CMOS node. To describe it in somewhat simplistic terms, SOI consists of adding a layer of glass to conventional silicon. Due to the insulating properties of the added glass layer, transistors can be built in the silicon matrix with more current drive and in closer proximity than in conventional CMOS processes. Depending on design and library choices, one could optimize an SOI-based chip design for superior performance or greater power efficiency. The technology also offers some advantages to mil/aero applications because of its added resistance to soft errors in memory.
IBM pioneered the commercial use of SOI, offering a process node at 180nm for ASIC clients at the end of the 1990's. Despite hopes at the time of launch and efforts for several years afterwards, IBM was never able to consistently offer more than 10-15% improvements in power or performance, at the cost of a 20-25% price premium.
Nevertheless, SOI was adopted and used by other chip companies - Freescale, AMD and STM, for example. A variant of SOI that uses sapphire instead of glass as the insulating layer has seen use by companies making RF chips.
Samsung seems to be the latest firm to join the fray, having just licensed STM's 28nm SOI process for use in mobile computing applications. There is a general interest across the chip industry in giving SOI another look. Despite over a decade and a half of dedicated manufacturing effort to optimize SOI nodes, though, it is still not a perfect balm for the end of Moore's Law. SOI allows optimization of designs for two P's - performance and power - at the expense of the third: price.
Another area of renewed interest in Silicon Valley has been in exploring different memory technologies and architectures. Some of these efforts were stimulated by the burst in enterprise networking activity that started in the final part of the last decade, as routers began updating to 10Gb and then 40/100Gb bandwidths to support the growing backhaul demands of mobile computing. Several memory IP startups arose from this networking revitalization, including companies such as Memoir Systems and Crossbar. Another source of inspiration has been the runaway success of Flash-based devices for applications such as SSD, with Flash ICs driving most of the 7% growth of the total semiconductor market in 2013.
There seem to be quite a few memory technologies undergoing intensive applied R&D - some with development histories going back to the early 1990's, others even farther back. Among them are the following:
1. MRAM - This technology has been in and out of the news for nearly three decades but never seems to have hit its stride. In its simplest form, a pair of ferromagnetic elements 'stores' data based on their relative polarity: one element is permanently polarized, while the other is programmable. The value of the data can be read by a transistor that senses the resistance of the cell, which changes depending on whether the magnetic elements have the same or opposite polarity. Some of you might notice that this sounds something like the magnetic 'core' memory of the 1960's.
New development efforts are focused on the scaling and power problems MRAM has had in the past. Applied research continues because MRAM does indeed offer the theoretical promise of better performance than DRAM, the same density and much lower power requirements. MRAM also provides the permanence of Flash while consuming much less power and avoiding Flash's write-endurance limits. Once perfected, it might be able to compete even with SRAM for lower-performance applications while offering much superior density. This would make MRAM a kind of 'super memory' that could replace all other architectures for most applications. An alliance of nearly two dozen American and Japanese companies was formed at the end of 2013 with this end in mind.
2. ReRAM - also known as RRAM (resistive random access memory), this technology may offer promise over not only conventional memories but also many of the novel architectures currently being researched. Based on a broad variety of organic and inorganic oxides or complex compounds, the basic idea involves what might be grossly termed a regrowable fuse as the data storage element. The research suggests that RRAM can offer superior performance, power and scaling compared to any other memory architecture currently in use or under research, but it has yet to prove this in large-scale manufacturing.
3. FRAM (ferroelectric RAM) - this memory technology resembles DRAM in application and architecture but uses very different materials. A bit is held by a thin film of exotic polarizable material that can be rewritten. It purports to offer the additional benefit of Flash-like non-volatility but with much greater reprogramming endurance, significantly less power required for the write cycle and much faster programming as well. However, reading the cell destroys any information it contains, requiring a rewrite cycle afterwards to restore the original state.
Research into semiconductor-based applications of FRAM began in earnest 25 years ago. Despite having been in production for over twenty years, FRAM has yet to make a significant impression on the semiconductor market. The technology has not achieved the same scaling into deeper submicron as DRAM or Flash, nor can it offer the bit densities of Flash.
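The MRAM read mechanism described above - sensing a resistance that depends on the relative polarity of the two magnetic elements - can be sketched as a toy model. The resistance and threshold values below are hypothetical placeholders, not figures from any real device:

```python
# Toy model of an MRAM cell: a fixed (permanently polarized) magnetic
# layer plus a programmable free layer. The cell's resistance depends
# on their relative polarity. All values are illustrative only.
R_PARALLEL = 5_000        # ohms when layers align (low resistance) -> bit 0
R_ANTIPARALLEL = 10_000   # ohms when layers oppose (high resistance) -> bit 1
READ_THRESHOLD = 7_500    # ohms: the sense amplifier's decision point

class MramCell:
    def __init__(self):
        self.antiparallel = False  # free layer starts aligned: bit = 0

    def write(self, bit: int) -> None:
        # Programming flips only the free layer; the fixed layer never changes.
        self.antiparallel = bool(bit)

    def read(self) -> int:
        # Sense the resistance; unlike FRAM, the read is non-destructive,
        # so no rewrite cycle is needed afterwards.
        resistance = R_ANTIPARALLEL if self.antiparallel else R_PARALLEL
        return 1 if resistance > READ_THRESHOLD else 0

cell = MramCell()
cell.write(1)
print(cell.read())   # -> 1
print(cell.read())   # -> 1 (state survives the read)
```

The contrast with FRAM is visible in `read()`: an FRAM model would have to clear the stored state on every read and immediately write it back.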
There are many other memory technologies being researched or manufactured in small quantities in pilot lines. Which ones will prevail? I haven't the vaguest idea, to be frank. Ask the same question of people in the development labs of any chip company, foundry or research institute and you're likely to get many different answers.
I was a very green-around-the-gills marketing guy at Raytheon Semiconductor in 1991 when Raytheon was participating in a DoD project to develop FRAM. The very first films implemented on silicon left the Raytheon researchers shaking their heads in shock and disappointment. Nevertheless, R&D work continued. The demise of Moore's Law will add significant impetus to all of these research efforts, so that some clear winners are more likely to emerge in the next several years.
Memories are only one half of the improvement required for digital integrated circuits. The other is, of course, the processing logic. With no more room to effectively drive 3P improvements through smaller physical dimensions, companies are exploring a completely different direction for adding gates - they're going vertical.
Look up to the sky
You'll never find rainbows
If you're looking down. - Charlie Chaplin
There has long been an interest in driving transistors into the IMD (inter-metal dielectric) above the silicon substrate and between metal layers to build integrated circuits. Efforts heretofore have concentrated on memories, as Matrix Semiconductor tried to do for seven years before being bought by Sandisk in 2005. I was part of a team led by an extraordinarily insightful ex-engineering manager from the programmable logic sector to develop a 3D FPGA architecture. The design approach brought truly stunning capabilities to programmable logic and would have revolutionized the entire custom chip sector, but no foundry could properly support the advanced process technology requirements needed to implement it.
While research continues in this direction, the IC industry has already taken matters into its own hands and developed packaging techniques to stack and interconnect separate die. Originally referred to 15 years ago as SiP (System in Package) and now, with a variety of additional innovations, called 2.5D or 3D-IC, these techniques allow multiple die to be integrated into a single, sophisticated package and even stacked atop one another, connected by TSVs (through-silicon vias).
There are companies that are already making this work in mass production. Samsung, for example, is producing NAND Flash chips using 3D IC integration techniques.
The potential benefits for other applications are legion. One could hypothetically integrate entire PCBs onto a single chip, with concurrent cost and power savings at the system level. Die from very different process nodes could be included in the same package, and previously incompatible functions (for instance, digital logic and RF) could find their way (in theory) into the same chip. One could envision achieving all sorts of speed, bandwidth and power benefits as well by avoiding the problems of sending signals between chips across printed circuit boards and interacting much more intimately and directly within the confines of a single package.
Naturally, the potential problems and difficulties are numerous as well. How do you test such a contraption? How do you cool it properly? How well do EDA tools and models accommodate such a design and integration effort, including proper capture of all parasitics (not to mention 3D electromagnetic effects)? What if just one die had a defect - would you have to throw the whole chip out? Finally - how much would such a monster cost?
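The defect question lends itself to a quick back-of-the-envelope calculation. Assuming, purely for illustration, that one bad die scraps the whole package and that die yields are independent:

```python
# Back-of-the-envelope yield math for a stacked 3D-IC package,
# under two hypothetical assumptions: a single defective die scraps
# the entire stack, and the die yields are independent.
def stack_yield(die_yields):
    """Probability that every die in the stack is good."""
    result = 1.0
    for y in die_yields:
        result *= y
    return result

# Four stacked dice at 95% yield each: package yield falls to ~81%,
# so nearly one package in five would be scrapped.
print(f"{stack_yield([0.95] * 4):.3f}")
```

The multiplication is the whole story: every die added to the stack compounds the scrap rate, which is why known-good-die testing before stacking matters so much to the economics.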
The above is quite a list of innovation efforts. Surely, by pursuing so many interesting and challenging avenues, the industry is bound to find one or more things that will help it turn around - right?
When the Merrimack and the Monitor slugged it out in 1862 at the battle of Hampton Roads, they ushered in the age of the Ironclad. These naval vessels quickly evolved into ship designs that were constructed entirely out of iron and eventually steel, culminating with the behemoths that ploughed the world's oceans at the outbreak of the second world war.
By the early 1940's, these monstrously expensive war machines had become the sole purview of the wealthiest and most industrialized nations. Wrapped in up to 16 inches of steel plate and deploying rifled cannons that could fire munitions up to 18 inches in diameter and weighing 1.5 tons a distance of 26 miles, these symbols of national power were considered the pinnacle of naval technology.
They were, however, already obsolete. Ships much simpler in design and significantly lower in cost could be topped with a flat deck and loaded up with 80 aircraft; such a carrier could command an ocean area 10 times greater than the largest battleship and turn any of the vaunted dreadnoughts into scrap metal before the ships ever caught sight of each other.
Though nothing can bring back the hour
Of splendour in the grass, of glory in the flower. - William Wordsworth
I have nothing but praise for the engineers who are working on the technologies mentioned above in an effort to extend the life of Moore's Law. They show in their ingenuity, drive and creativity the generous and adventurous spirit of those who dream of building a better world. It is ultimately thanks to them that there has been such an explosion of prosperity, progress and freedom across the globe over the last four decades.
Without question, their efforts will pay at least some dividends. Some new application niches might be discovered and the companies that most quickly deploy working versions of these technologies will reap extra growth and profit margins in the time before their competitors inevitably catch up.
Yet one cannot fail to notice that almost all of these technologies provide benefits to performance and power at the expense of the third P - price. For those few that may prove to benefit all three P's, the gains will be one-time only, since Moore's Law has come to an end for silicon at the 28nm node.
The phoenix hope, can wing her way through the desert skies, and still defying fortune's spite; revive from ashes and rise. - Miguel de Cervantes
Most significantly, the above innovations are targeted at existing technology markets, which at this point have for the most part exhausted their possibilities for further significant growth. What High Tech needs to do in order to achieve a palingenesia is to break open new, untapped markets. To do this - to bring about a rebirth of High Tech greater than the 40 year revolution we've already experienced - will require something at the chip level, and it will have to be more than another round of innovation.
What High Tech needs to do is something which, except for some isolated, magical instances, is something which it rarely attempts to perform - it will need to throw itself headlong into the struggle of INVENTION. There are a tiny number of companies and organizations that are pursuing that very horizon, trying to create things that today might fall under the realm of magic. This, however, is a tale for a future post. ;-)