Friday, July 18, 2014

Value Proposition and the Custom Chip Market: Part 5 - Wrap Up & Roadmap

There is a history in all men's lives,
Figuring the nature of the times deceas'd,
The which observed, a man may prophesy
With a near aim, of the main chance of things
As yet not come to life, which in their seeds
And weak beginnings lie intreasured. - Shakespeare (Henry IV, Part II)



Over the last 5-6 weeks, we've deconstructed each segment of the custom logic market (ASIC, FPGA and SoC) in detail, weighing their strengths and weaknesses against user requirements. Their internal shortcomings, as well as their advantages and disadvantages relative to each other, have also become apparent over the course of the discussion.

Yet all custom logic companies are confronting an identical set of threats and challenges as we move into High Tech's first real Black Swan - the confluence of a desultory global economy, the saturation, stagnation or decline of every system market served by semiconductors, and the imminent or perhaps de facto repeal of Moore's Law. In a nutshell, the menace with which they will all be wrestling for the next 7-10 years is the erosion of their innate value propositions and the rise of price as the determining competitive factor.

Some believe that the three custom logic segments will deal with these difficult market circumstances independently, as their technologies are not truly direct competitors of each other. It's an easy inference to make, as each sector has a very distinct market outlook, set of methods and practices, and raison d'être. This conclusion is, however, fundamentally incorrect. These segments are indeed competitors and even mortal enemies - yet not in the way that the conventional wisdom views it.

It is widely proclaimed that the decline of the ASIC market can be attributed to the rise in size and complexity of FPGA products, and that system designers have been increasingly adopting FPGAs (with their high levels of customization and faster TTM) over ASIC in order to avoid high NRE bills, large engineering teams, long development cycles and the risks inherent in fixed function logic implementations. However, revenue numbers over the last 10-15 years demonstrate that this is not true.

Before we look at the financial details, some caveats:
1. SoC market numbers can be a little difficult to quantify. Screening out pretenders is not problematic, but SoC companies offer a broad range of products, not all of which genuinely qualify as SoCs. Separating SoC vs non-SoC revenue streams for any given company is an impossible task, so the numbers provided for SoC revenue are, without a doubt, somewhat inflated. These numbers nonetheless reflect the innate technology strength of the segment's participating firms. For the record, the companies that I have chosen to count as SoC firms for the purpose of this analysis (acknowledging that this list is not necessarily all-inclusive) are: Broadcom, Qualcomm, Mediatek, Cavium, Cortina Systems, PMC-Sierra, NVidia, AMCC, Vitesse, Sigma Designs, Marvell and Ikanos.
2. ASIC numbers are impossible to determine precisely. As an old ASIC hand myself, I've struggled with this in the past. The reason that there are no exact revenue numbers for the ASIC market is that the segment participants are deliberately deceptive. There are almost no pure-play ASIC firms anymore, and market players use this to disguise their actual revenues - some count ASIC design wins as standard product revenue, while others count some of their standard product earnings as ASIC income. Companies will also deliberately under-report their ASIC revenues simply to keep competitors in the dark regarding their actual market footprint (or lack of it, as the case may be.) Nevertheless, based on the information that I have been able to gather and cross-reference, the numbers presented below are, I think, reasonably accurate representations of the size of the market.
3. Market growth/decline was obviously not a straight line during the period depicted; nevertheless, for the sake of simplicity, revenue changes in the respective markets have been represented linearly and reflect the smoothed-out trends for each segment.

[Chart: revenue trends for the ASIC, FPGA and SoC segments over the last ~14 years, shown as the smoothed linear trends described above]

A couple of observations can be made from even a cursory glance at the chart:
1. The FPGA market has grown more slowly than the semiconductor market in general - a compound annual rate of slightly more than 3% versus roughly 3.5% for the industry as a whole (a quick compound-growth check appears after this list).
2. ASIC has lost $7-8B of annual revenue over the last 14 years, yet FPGA has only grown by $2B. Clearly, almost all of the lost business in ASIC has been captured by something other than FPGAs.
3. The real driver of FPGA growth has been its cannibalization of the Gate Array market. There is essentially nothing left of gate arrays or embedded arrays other than a small amount of activity in the structured array segment (eASIC being the primary vendor.) 
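To make the growth-rate comparison concrete, here is a minimal compound-annual-growth-rate (CAGR) check in Python. The starting and ending revenue figures are illustrative placeholders consistent with the roughly $2B of FPGA growth cited above, not actual market data:

```python
# Minimal CAGR sketch - the dollar figures are illustrative assumptions,
# chosen only to be consistent with the ~$2B of FPGA growth cited above.

def cagr(start_revenue, end_revenue, years):
    """Compound annual growth rate between two revenue points."""
    return (end_revenue / start_revenue) ** (1.0 / years) - 1.0

fpga_start, fpga_end, years = 3.5e9, 5.5e9, 14   # assumed: ~$3.5B growing by ~$2B over 14 years
print(f"FPGA CAGR: {cagr(fpga_start, fpga_end, years):.1%}")   # ~3.3%, i.e. slightly more than 3%
```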

Observe your enemies, for they first find out your faults. - Antisthenes

Much of this is comprehensible upon examining the particular characteristics of each segment. There was simply no way that FPGA could make serious inroads in the ASIC market over time. The programmable arrays are generally too small and inefficient to instantiate all the required functionality. Furthermore, FPGAs are notoriously slow and power hungry versus fixed function equivalent logic - the price one pays for the ability to emulate logic functions with near infinite flexibility. Finally, FPGA unit prices cannot come close to matching those of ASIC, except for the smallest programmable devices (which in turn cannot support the circuit densities of their fixed function equivalents.) It is simply preposterous for FPGA houses to think they can seriously compete with ASIC.

It is the SoC segment which has done all of the damage to the ASIC business. By targeting at least 80% of the desired hardware functionality for a given market segment and covering the remainder thru a software distribution, SoC vendors combined some of the flexibility and personalization of FPGA with the 3P advantages of ASIC, sans NRE and lengthy development times.

As a side note, the technical differences between ASIC and SoC product offerings reflect a fundamental difference in mental framework between the two segments that ultimately boiled down to courage. Whereas ASIC firms viewed the discovery of commonality amongst customized chips for a given market segment as an insoluble problem and the development of deep software support for complex hardware as simply beyond the expertise and capabilities of a chip company, the SoC firms were undaunted, viewing both issues as challenges to be met. Stalwart and resolute in their convictions, they strove mightily to overcome such obstacles and benefited immensely for it, as the financial results of the last decade and a half clearly attest.

The strength of each segment's value proposition is evident both in total revenues as well as in unit volumes for design wins. It is a reasonable expectation that the median booked design ships in lifetime volumes of 10k-20k units for FPGA, 150k-250k for ASIC and 1M+ for SoC.
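As a rough illustration of why those volume tiers track each segment's economics, the sketch below amortizes NRE over lifetime volume. Every dollar figure in it is an assumption chosen purely for illustration, not data from this analysis:

```python
# Back-of-the-envelope sketch: how NRE amortization over lifetime volume shapes
# each segment's economics. All dollar values are illustrative assumptions.

def effective_unit_cost(nre_dollars, unit_price, lifetime_units):
    """Per-unit cost once NRE is amortized over the design's lifetime volume."""
    return unit_price + nre_dollars / lifetime_units

scenarios = {
    # segment: (assumed NRE, assumed unit price, mid-range lifetime volume from the text)
    "FPGA": (0,         40.0,    15_000),   # no NRE, but a high per-unit ASP
    "ASIC": (3_000_000, 10.0,   200_000),   # large NRE spread over modest volume
    "SoC":  (0,         15.0, 1_000_000),   # standard product: NRE borne by the vendor
}

for segment, (nre, price, volume) in scenarios.items():
    print(f"{segment}: ~${effective_unit_cost(nre, price, volume):.2f} per unit "
          f"at {volume:,} lifetime units")
```

With these assumed figures, the zero-NRE FPGA only makes economic sense at volumes too low to amortize an ASIC's NRE, while the SoC's standard-product pricing wins outright at the million-unit level.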

But as pointed out at the beginning of this editorial, all three segments are falling under the shadow of a Black Swan that is spreading its wings over the entire High Tech market. This event changes the rules of the game, as circumstances will deeply erode the value propositions of each segment and their value relative to each other. Who will survive thru this time of troubles and what they will need to do in order to emerge from it in a stronger position than others is a crucial strategic concern.

Change always involves a dark night when everything falls apart. Yet if this period of dissolution is used to create new meaning, then chaos ends and new order emerges. - Margaret Wheatley

Leading FPGA vendors today are proclaiming that their time has finally arrived, and thru pursuing ever deeper submicron nodes to instantiate their programmable arrays, their products will shortly reach the 3P requirements of typical ASIC applications. Zvi Or-Bach, however, sees it very differently, and argues decisively that such a strategy is hollow at its core:
http://www.eetimes.com/author.asp?doc_id=1322021

Or-Bach's argument concentrates on cost issues - the Price component of the three P's. His analysis is, as usual, particularly cogent, as during the emerging Black Swan era of High Tech, pricing will be the primary determinant of competitiveness for most firms. This will be an especially difficult time for the Programmable Logic segment and may see a stunning reversal of fortune for its members, as possibly only Lattice can legitimately claim to be well-positioned in this respect.

Any advantage that custom logic companies can carve out for themselves in the other two P's - performance and power - will be the key to whether they can individually resist pricing pressures on their offerings to any degree. This raises the inevitable question - what technology changes might lead individual custom chip firms to a value proposition that can help them weather this storm more successfully?

The table below empirically stacks up the technology differences that define each sector's value proposition for custom logic users:

[Table: the technology differences defining each segment's value proposition for custom logic users]

When the segments are compared in this way, the outlook for ASIC is grim. The SoC segment captures all of ASIC's strengths but has none of its weaknesses. Almost all ASIC offerings today come from larger systems or chip companies that also offer SoCs of their own. Such companies will thus be disinclined to invest to any significant extent in bolstering their ASIC capabilities, as SoC growth should offset any revenue shrinkage. Consequently, it seems more than likely that the ASIC business will continue to decline in favor of the superior cost-effectiveness and value proposition of SoC.

The situation is different, though, when contrasting FPGA and SoC. Though SoC companies do offer some personalization thru software, it is expressly a word-level customization based on embedded CPU, MCU and DSP functions. What custom chip market customers have incontestably shown, however, is that bit-level programming is of considerably greater value to their system-level needs. As proof of this, the ASPs and margins of programmable logic companies have remained extraordinarily healthy over time, despite the extreme inefficiency of logic emulation and its associated performance, density and power penalties. Stated differently: the value of bit-level customization overwhelmingly trumps the decidedly poor 3P characteristics of programmable logic.
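To make the bit-level versus word-level distinction concrete, here is a small behavioral sketch (my own illustration, not any vendor's implementation): a 4-input LUT model, the building block of bit-level programmability, set against a fixed word-wide operation of the kind an embedded CPU or DSP provides.

```python
# Illustrative contrast between bit-level and word-level programmability,
# modeled behaviorally in Python.

def make_lut4(truth_table):
    """Return a function implementing an arbitrary 4-input boolean function.
    truth_table is a 16-bit integer; bit i gives the output for input pattern i."""
    def lut(a, b, c, d):
        index = (d << 3) | (c << 2) | (b << 1) | a
        return (truth_table >> index) & 1
    return lut

# Bit-level customization: the *same* hardware resource becomes any function of 4 bits.
xor4 = make_lut4(0b0110100110010110)   # parity of the four inputs
and4 = make_lut4(0b1000000000000000)   # 4-input AND

# Word-level customization (the SoC model): you choose among fixed, word-wide
# operations (add, multiply, shift) executed by an embedded CPU/MCU/DSP.
def word_level_op(x, y):
    return (x + y) & 0xFFFFFFFF        # a fixed 32-bit add; the operation itself is not malleable

assert xor4(1, 0, 1, 1) == 1
assert and4(1, 1, 1, 1) == 1
assert word_level_op(0xFFFFFFFF, 1) == 0
```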

Intel evidently understands the remarkable value of bit level programmability and points the way forward for FPGA and SoC companies:
http://www.altera.com/devices/processor/intel/e6xx/proc-e6x5c.html

Though Intel is proceeding with a 2.5D IC approach, power and performance considerations point to an embedded FPGA capability as the preferable solution. The applications for such a capability are legion - reconfigurable memory control, programmable embedded test for networking QoS, configurable protocol interfaces, etc. There are also second order benefits to a company deploying such a capability, as it could potentially reduce the number of devices required to fill out a product family, with cascading reductions to product development and support costs.

But embedding a bit-level programmable array is not anywhere near as simple as embedding a licensed DSP, CPU or GPU from Ceva, ARM or Imagination. First of all: what architecture should be used - a high density and low power array from Lattice, a high performance and on-the-fly reconfigurable array from Tabula, or a highly flexible generic array from Xilinx or Altera? Perhaps none of those are the perfect solution, depending on individual application requirements for combinatorial, sequential or datapath logic. As stated previously in this editorial series, there is still a great deal of innovation potential for programmable architectures.

Secondly: how do you make it work? The timing characteristics of the programmable array will undoubtedly be different from those of the fixed circuitry surrounding it. This presents a particular quandary beyond that of simply delineating a separate clock domain - after all, the block is supposed to be reconfigurable. Hence, both its timing and signalling profiles are subject to change, perhaps even dynamically. Making that work while the block is surrounded by fixed functionality is a problem that has been solved before by both LSI Logic and IBM, but the methodology lessons will need to be re-learned.
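One common way to cope with a block whose latency and signalling can change with its configuration - not necessarily the LSI Logic or IBM methodology referred to above - is to wrap it in a latency-insensitive valid/accept handshake so the fixed logic never assumes a cycle count. The purely behavioral Python sketch below is my own illustration of that idea:

```python
# Behavioral sketch (illustration only) of a latency-insensitive handshake
# around a reconfigurable block whose latency depends on the loaded function.

class ReconfigurableBlock:
    """Stand-in for an embedded programmable region; its latency changes
    whenever a different function is configured into it."""
    def __init__(self, func, latency_cycles):
        self.func, self.latency = func, latency_cycles
        self.countdown, self.pending = 0, None

    def step(self, valid_in, data_in):
        """Advance one clock tick. Returns (accepted, valid_out, data_out)."""
        accepted, valid_out, data_out = False, False, None
        if self.countdown > 0:                      # still computing
            self.countdown -= 1
            if self.countdown == 0:                 # result is ready this cycle
                valid_out, data_out = True, self.func(self.pending)
        elif valid_in:                              # idle: accept a new operand
            self.pending, self.countdown, accepted = data_in, self.latency, True
        return accepted, valid_out, data_out

# The fixed-function side only obeys the handshake; reconfiguring the block
# (new function, new latency) does not change the interface contract.
blk = ReconfigurableBlock(func=lambda x: x ^ 0xFF, latency_cycles=3)
queue, outputs, cycle = [0x10, 0x20, 0x30], [], 0
while len(outputs) < 3 and cycle < 100:
    accepted, valid_out, data = blk.step(valid_in=bool(queue),
                                         data_in=queue[0] if queue else None)
    if accepted:
        queue.pop(0)
    if valid_out:
        outputs.append(data)
    cycle += 1
print([hex(v) for v in outputs])   # ['0xef', '0xdf', '0xcf'], whatever the configured latency
```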

Finally: how do you support it as an embedded capability? The design tools aspect of programmable logic is, in reality, the most critical component of the offering. It is also markedly different from the EDA tools & methodology bundle of ASIC or the CPU/DSP toolchain of SoC.

No one reaches a high position without daring. - Syrus

The struggle between FPGA and SoC vendors to establish and sustain a new value proposition during High Tech's Black Swan will be the defining battle of the custom chip market for the next decade. Yet despite the size and technical sophistication of the SoC segment weighing in its favor, the outcome is actually very much in doubt.

In addition to the difficulties mentioned above, SoC firms will need to learn how to develop FPGA tools and design programmable fabrics. These are very, very serious challenges indeed. The hardware has all sorts of development issues related to memory design, as well as the paradoxes that spring from power, size and performance tradeoffs against global and local routing resources. The development of compilers and heuristic P&R algorithms will also be a major obstacle for SoC firms, especially with FPGA vendors having a 30-year head start. There are also quite a few hardware and software support issues and failure modes particular to programmable logic offerings which SoC companies will have to learn the hard way (and believe me, those lessons HURT.)
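To give a flavor of the P&R problem - and this is only a toy of my own construction, nothing like a production FPGA tool - here is a minimal simulated-annealing placer that trades wirelength against random perturbations. Real flows must also juggle timing, congestion, carry chains, clocking and power, which is where that 30-year head start bites.

```python
# Toy simulated-annealing placer (illustration only; not any vendor's algorithm).
import math
import random

def wirelength(placement, nets):
    """Half-perimeter wirelength over all nets (a standard placement cost proxy)."""
    total = 0
    for net in nets:
        xs = [placement[c][0] for c in net]
        ys = [placement[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def anneal(cells, nets, grid, iters=20000, t0=5.0, alpha=0.9995, seed=1):
    rng = random.Random(seed)
    sites = [(x, y) for x in range(grid) for y in range(grid)]
    rng.shuffle(sites)
    placement = dict(zip(cells, sites))          # one cell per site: legal by construction
    cost, temp = wirelength(placement, nets), t0
    for _ in range(iters):
        a, b = rng.sample(cells, 2)              # propose swapping two cells
        placement[a], placement[b] = placement[b], placement[a]
        new_cost = wirelength(placement, nets)
        delta = new_cost - cost
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            cost = new_cost                      # accept the move
        else:
            placement[a], placement[b] = placement[b], placement[a]  # reject: undo the swap
        temp *= alpha                            # cool the schedule
    return placement, cost

if __name__ == "__main__":
    cells = [f"c{i}" for i in range(16)]
    nets = [[f"c{i}", f"c{(i + 1) % 16}"] for i in range(16)]   # a simple ring netlist
    _, cost = anneal(cells, nets, grid=4)
    print("final half-perimeter wirelength:", cost)
```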

Of course, the obstacles faced by FPGA companies in properly competing with the SoC segment are quite formidable. FPGA firms, though, have some inherent advantages. The top vendors already have nearly a decade of experience in using CPUs and FPGA fabrics together. While the 'holy grail' of coding at a high level of abstraction such as "C" and having the tools automatically compile the code and subsequently partition functionality as appropriate across word-oriented and bit-oriented programmable elements is still a distant dream, FPGA companies nevertheless have tool flows for both technologies under an overall development methodology umbrella. The programmable logic houses also have considerable system expertise. Though not at the level of their SoC rivals, that ground can be gained thru a sustained and focused effort. The real barrier to such an effort is, as described in an earlier article, one of religious canon - specifically, the unshakeable zealotry of programmability acolytes and their contempt for the fixed function 'heathens.'

So: who will win this struggle? That bit-programmable fabrics will become an embedded function is a given, as the pressure to find some/any source of value-add to protect against price erosion will be too great to resist. Personally, I suspect Broadcom and Samsung have the most backbone amongst SoC firms to attempt this, while MicroSemi might be the only FPGA player with the necessary technical sophistication and gumption to try, but that's just speculation on my part. If we were actually able to consult with the Oracle at Delphi, I suspect the Pythia would gladly take our offering of gold and, after an appropriately dramatic pause disguised as contemplation, would tell us something like this:

Life's battles don't always go to the stronger or faster man, but sooner or later the man who wins is the one who thinks he can. - T.J. Lucier