Friday, November 14, 2014

IoT Technology Trends, Part 5 - More Ways To Get Started

The explorers of the past were great men and we should honour them. But let us not forget that their spirit lives on. It is still not hard to find a man who will adventure for the sake of a dream or one who will search, for the pleasure of searching, not for what he may find. - Sir Edmund Hillary

The IoT provides fertile ground for experimenters, innovators and inventors to create a new and better future for us all. Before embarking on such an adventure, though, these explorers on the technology frontier need to have instruments adequate to the task.

Two weeks ago we looked at a few of the hardware platforms out there that are attracting interest from IoT product developers. The first ones we examined - Arduino and Raspberry Pi - were both designed from the ground up for ease of use and low cost. They were, in fact, originally targeted at technology neophytes for teaching purposes, easing them into microprocessors, microcontrollers and other electronic systems hardware, as well as applications software, operating systems, programming and the supporting tool chains.

The popularity of Arduino and Raspberry Pi vividly demonstrates that their originators did a great job. Their astonishing success and explosive adoption by the 3rd party community have spurred commercial aspirants in the IoT to develop their own platform offerings in this space, in the hopes that HW and SW designers will in the end choose to employ their microelectronic products in IoT designs. This week, we'll look at some of the leading providers in this area.

(Note: I am not endorsing any of these products and am not being paid to write by any of these companies. My comments are simply my take on these products/offerings/systems and their capabilities.)

Breaking With Tradition


Nothing has such power to broaden the mind as the ability to investigate systematically and truly all that comes under thy observation in life. - Marcus Aurelius

Intel unquestionably chose a provocative name for their IoT development system. Galileo Galilei, a mathematician, astronomer and inventor during the Italian Renaissance, shook the social and religious establishment of the early 17th century thru his application of mathematics to celestial observations and his conclusion that the Earth was not actually the center of the universe but instead was captured in an orbit about the Sun - a proclamation that put him in very hot water with the Vatican.

The Galileo development board also runs against the grain of tradition. Its centerpiece, the Quark SoC X1000, represents Intel's first real attempt at entering the fray against traditional MCU-based designs for the IoT. Though the X1000 CPU is still compatible with the Pentium-class (i586) x86 ISA, it is a much simpler hardware microarchitecture - a single-threaded 32b core with a 16kB L1 cache running at 400MHz. The system has all the right I/O support for the IoT - 10/100Mb Ethernet, PCIe, USB 2.0 and so forth. OS support includes Windows, Linux and OS X.

Intel also broke its "not invented here" tradition by not only acknowledging the leadership of Arduino but actively and purposefully jumping on the bandwagon. The tool chain is open source and compatible with the Arduino IDE. Galileo supports the Arduino shield ecosystem and is, in fact, the only system that is officially Arduino-certified.

Nevertheless, all is not perfect with Intel's first foray out of its PC comfort zone into the IoT. Though Galileo is capable of emulating Arduino's ATmega MCU architecture, the emulation is reportedly very slow. Furthermore, Galileo's reported 600mA current draw in low power mode exceeds what most Arduino boards consume even at full run time. Perhaps the culprit is Intel's ingrained tradition of maximizing performance. The 400MHz speed of the X1000 is overkill for the great majority of IoT applications, which typically are satisfied with less than 100MHz.

Hopefully Intel will take a lesson from its old quasi-monopolistic ally Microsoft and view these deficiencies not as irretrievable setbacks but simply as lessons learned that need to be applied to future efforts. Trial and error is part & parcel of invention and creation. Microsoft proved that even an old dog can learn new tricks - that a very large old company can find within itself the necessary entrepreneurial spirit to engage in repeated experimentation with ultimate success. It will be imperative for Intel's future to aggressively and resolutely follow suit.

Chauncey Gardiner

If you don't know where you're going, you might not get there. - Yogi Berra

In 1979, famed British comic actor Peter Sellers starred in his last film, "Being There." The story revolves around a very simple, quiet man whose only concern in his entire life has been tending to his garden. Growing up in a wealthy man's home and having no exposure to the outside world, Sellers' character is suddenly thrown out into the street upon the death of his patron. 

Thru a series of chance events, the simple gardener and good soul is quickly taken into the household of another wealthy benefactor who is also a close confidante of the President of the United States. Both his benefactor and the President find it difficult to converse with Chauncey, as his responses to questions are always rudimentary commentaries on the minutiae of horticulture. They do not realize that Chauncey is, in fact, a simpleton who is completely innocent of the ways of the world, and that his gardening comments reflect the fact that he understands absolutely nothing about what he is being asked. Instead, the businessman and politician assume Chauncey Gardiner is dispensing wisdom thru allegory, and begin basing grand decisions, policies and actions on their interpretations of his 'advice' - to their great benefit.

This is, to a significant extent, the story of MIPS Technologies. 

Please do not misunderstand me, folks - there are plenty of intelligent people at MIPS and some great engineering talent. Those engineers, in fact, have produced microprocessor designs of superlative quality and capability over the last 18 years. Its CPU designs are a classic example of modified Harvard architecture and enjoy widespread GNU toolchain support. MIPS has always been highly orthodox in its adherence to the precepts of RISC CPU design. As a consequence, its embedded microprocessors have traditionally supported ISAs of very high orthogonality, and its microarchitectures are exceedingly robust, high-performance, feature-rich, and straightforward to understand.

Because of this, the MIPS synthesizable embedded processor IP line has always been very popular and respected in the High Tech engineering community worldwide. There are a huge number of systems incorporating MIPS processors, with various members of the CPU IP portfolio found in Wi-Fi access points, base stations, DVD players, STBs, home gateways, DTVs and even the Sony Playstation 1 and 2. The esteem in which the MIPS CPU line has been held clearly gave ARM fits over the years.

Yet at the same time MIPS has been inwardly focused for the most part, with a 'build it and they will come' philosophy to defining new CPUs. Customer support has always been excellent, and the company's detailed engineering knowledge of microprocessor design is second to none. But despite its widespread appeal and usage across the communications and consumer system space, MIPS missed out on the truly enormous potential of mobile telephony. As a result, the company was never able to build sufficient mass to afford the staff needed to closely track the system-level application space of the markets it served from a hardware and software standpoint. This in turn meant that MIPS was historically not positioned to drive market changes and advances, or even anticipate them, but instead was driven by them blindly and reflexively.

Though MIPS was bought by Imagination Technologies a little over a year and a half ago, the organization is still in much the same predicament, not yet able to build and sustain a proactive and outward-looking posture. MIPS is finding that much of its positioning for the IoT market is being developed on its behalf by the still very large existing fan base of engineers. As an example, the link below shows a broad selection of development systems targeted at the Arduino community that use Microchip Technology PIC32 MCUs, all of which employ the MIPS M4K core:

Though not officially certified by Arduino, these development boards and shields are designed for Arduino compatibility, including form factor, support for many Arduino shields and a toolchain based on the original Arduino IDE modified for PIC32.

And since we're talking about Arduino, take a look at one of their newest systems - the Yun:

The board contains both an Atmel ATmega32u4 and the AR9331, a device originally developed by Atheros and now offered by Qualcomm (which bought Atheros in 2011). The AR9331 contains a 32b MIPS 24K CPU, one of the great workhorse processors of MIPS. The board itself supports 10/100Mb Ethernet, 802.11b/g/n and 64MB of DDR2 system RAM. Flash is unevenly distributed between the two CPUs - the ATmega32u4 has 32kB on-chip, 4kB of which is dedicated to the bootloader, while the AR9331 side carries 16MB holding a version of the OpenWrt OS customized for the Yun that occupies roughly 9MB of it.
To top it all off, a MIPS-based Android Wear reference design is offered by Ingenic, a MIPS licensee based in China. Ingenic has already produced a successor with even more capability than the first.

To be fair, Imagination and MIPS appear to be slowly waking up to the fact that they need to take charge of their own destiny in the IoT and are putting at least some resources into it:

Baby steps are better than no action at all, and there is at least one champion for the IoT cause within the company's ranks. However, if Imagination Technologies was to launch a concerted and properly staffed effort to focus on facilitating, promoting and supporting IoT design and development, it would likely prove highly disruptive to ARM's growth plans in the sector.

Space Age

And since we're talking about High Tech companies stumbling backwards into the IoT, we should take a look at the NVidia Jetson development system:

All new ideas and opportunities spawn from a mismatch between Reality and current conceptions. - John "40 Second" Boyd

There's nothing on the above link which suggests that NVidia has any kind of strategic focus on the IoT. Yet the Jetson development system's target audiences - machine vision, robotics, medical, security and automotive - all overlap extensively with the IoT TAM. 

NVidia styles itself as a groundbreaking leader in massively parallel computing - a self-portrait which it does indeed merit. Its application ambitions are focused on where it believes its markets will be ten years from now. A collision between NVidia and the IoT is therefore inevitable.

The Jetson platform is ridiculously overmuscled for most current IoT applications. Its GPU consists of 192 CUDA cores that provide a massive amount of computational power not just for graphics, but for a broad range of multimedia applications, as well as machine vision and robotics. The control plane is managed by another CPU that is fantastically over the top for many IoT designs - a quad-core ARM Cortex-A15. Connectivity is also supercharged with 1Gb Ethernet. What Jetson really is, for the most part, is a single board computer. Its computational capabilities are such that it is finding appeal in unanticipated areas - I've even heard from an old friend who is using Jetson for an application in satellite communications.

Yet this relative unsuitability for current IoT design directions paradoxically offers a tremendous opportunity for NVidia. The company's Tegra line is in full retreat from the smartphone market after an overwhelming Qualcomm onslaught. Nevertheless, there is deep technical expertise and market experience within the company ranks with regards to mobile computing. 

A strategic imperative for NVidia would be to steer a new version of the Jetson TK1 towards becoming a development system for an eventual convergence between smartphones and tablets. The new Jetson platform should also facilitate the use of daughtercards - specifically, Arduino shields - in conjunction with the motherboard to support development of IoT access points, sensors and other peripheral devices of more restricted functionality. The end goal would be to make NVidia technology the centerpiece of the emerging IoT universe, rather than leaving it to an uncontested iPad-iPhone merged device.

Man's Best Friend


Beagleboard, originally born of an attempt by Texas Instruments OMAP strategic marketing to find a way of engaging hobbyists and energizing the 3rd party ecosystem, has since evolved into a non-profit corporation supported in part by DigiKey, TI and CircuitCo. It is now much like Arduino and Raspberry Pi, bringing sophisticated computing hardware, software stacks and IDEs to the general public thru an open architecture model with the goal of providing a set of educational tools.

There are several different motherboards available, the latest being the Beaglebone Black, selling for $45 and clearly targeted at aspiring IoT developers. At last count, there were 81 daughtercards available - referred to as 'capes' - with quite a variety of functions, including numerous display options. A vast selection of Linux distributions is supported. A major 'partner in crime' with Beagleboard is Adafruit, who offers two dozen application tutorials.

The first Beagleboard was based on TI's OMAP3 architecture. With TI having abandoned its efforts on OMAP in the mobile computing market, the Beagleboard organization wisely decided to seek alternate SoC sources for newer boards, and is employing other TI devices that still incorporate the ARM Cortex-A8. Interestingly, all the Beagleboard SoCs also employ PowerVR graphics processors from Imagination.

Much more can be found on the Beagleboard website (see link below). It is abundantly clear that the user base is large and enthusiastic.

Scylla and Charybdis


Chance will not do the work—Chance sends the breeze;
But if the pilot slumber at the helm,
The very wind that wafts us towards the port
May dash us on the shelves.—The steersman's part is vigilance,
Blow it or rough or smooth. - Walter Scott

There are as many choices of development platform as there are variations in design and application for IoT offerings. The choice should not be made hastily, though - there are significant factors to weigh, and any choice entails a certain amount of risk. As Odysseus discovered in trying to navigate the strait between the monstrous Scylla and the whirlpool of Charybdis, there are dangers to face no matter which direction one selects in the end.

There are purely technical considerations, of course. Here's a side by side comparison of several of the platforms we've looked at in this editorial and the blog post from two weeks ago - again, courtesy of Adafruit:

Note that there are multiple MPU/MCU architectures employed by these platforms - Atmel, MIPS, ARM and Intel. What processor architecture is best to commit to, given the fortunes of High Tech? The long term strategic significance of the choice one makes for an IoT product is critical.

It's really not clear at this point which processor provider will have greater success or staying power. Ease of use and strength of ecosystem of both the processor and the IoT platform are also weighty considerations. For instance: will the target OS of your application be supported over the long run? If you make a wrong guess and need to switch to another OS to make your IoT offering a success, does the platform you've chosen and the processor it employs make that an easy or painful decision?

Intel marketing seems heavily committed to the IoT - there's an awful lot of promotional activity, at least. Nevertheless, a measure of the genuine rate of adoption and strength of the promoted solutions is... elusive. To be honest, the Intel stance on the IoT currently comes across as "all hat, no cattle." However, my impression is that the Intel IoT cowpunchers are genuinely trying to gather a large herd. Stated less metaphorically, it appears to me that Intel's commitment to the future of Quark and its presence in the IoT is a major strategic imperative for the company.


MIPS/Imagination is a dark horse. There is an opportunity for MIPS to make up for a lot of lost ground against ARM in the mobile phone market. What is still not evident is the resolution to do so.

ARM has a dominant position with all of the MCU companies. The Cortex M series is hugely popular. The Cortex M0 is a tiny yet capable device highly suited for IoT designs - a 20k gate 32b CPU with a deep sleep mode whose current draw is measured in microamps. ARM's new mbed OS is interesting, as it offers many features for power management as well as supporting mesh networks with a central controller or access point. 

There are no guarantees that ARM will emerge as the dominant CPU/MCU player in this market, though, as the IoT is still nascent and things can change very suddenly and drastically in High Tech. The fact that Ingenic, a Chinese company, has chosen MIPS over ARM is very telling and should be a cause for alarm in Cambridge.

The front runner at this point is, interestingly enough, Atmel. The adoption of ATmega-based Arduino platforms has been phenomenal. Things have reached a point where compatibility with the Arduino ecosystem is almost a de facto requirement for a development board provider to be taken seriously as a platform for IoT design. This provides Atmel and Arduino a further advantage, in that one can find a path to migrate designs upward in complexity thru Arduino-compatible platforms.

What this means is that one could design a sensor, add features and capabilities to the basic design as needed for a given application segment, develop a controller and/or access point for this family of sensors & peripherals and potentially even develop its master - the server or personal processor employed by the user. There is also the possibility that Arduino will itself develop more complex platforms, very possibly as a joint effort with Atmel, with whom they have a strong relationship.

Regardless of what choice an inventor or design team makes, there will always be the chance that they signed up to a platform or CPU/MCU that in the end won't prove to be a winner in the IoT market. There is no way to eliminate such a risk, and there will be winners and losers in the IoT in part because of such changes in fortune. But that's part of the adventure of exploring a new frontier.

So which to choose? Perhaps this true dreamer captures it best:

Imagination is a force that can actually manifest a reality. … Don’t put limitations on yourself. Other people will do that for you. Don’t do that to yourself. Don’t bet against yourself. And take risk. NASA has this phrase that they like, "Failure is not an option." But failure has to be an option. In art and exploration, failure has to be an option. Because it is a leap of faith. And no important endeavour that required innovation was done without risk. You have to be willing to take those risks. … In whatever you are doing, failure is an option. But fear is not. - James Cameron


As part of the research I did for this post, I ran across quite a few articles discussing design approaches and guidelines for IoT applications. I've posted the most interesting of the links below and hope that you find them of value. :-)