As a template for our times, the Tower of Babel story has much to offer. In the Genesis story, when united in language and in purpose, humankind was able to construct a tower that reached into the heavens; but with languages “confounded” by a suspicious deity, people were scattered to the four corners of the earth. If the digital infrastructure needed to support IoT is viewed as a proxy for the tower, both the potential and the challenge to unified action come into better focus. In his keynote at the recent IoT World Forum 2014, Cisco chief globalization officer Wim Elfrink noted our rapid adoption of digital infrastructure — which is growing five times faster than the adoption of electricity or telephony — and asked why we don’t have a “New Essential Application Centric Infrastructure,” and why IT remains a secondary thought. One answer is Babel: while construction of the IoT tower requires collaborative effort across a multitude of vendors, these vendors are currently speaking different languages.
A number of issues dog the successful transition to an IoT era. At root, IoT is not a discrete technology, but rather an amalgam of connectivity, analytics, sensors, devices and services created by many different vendors in the IT, communications and electronics industries. These products operate according to different standards for connectivity, and run on software that is not always interoperable: IoT is a solutions play in search of a total solutions provider, a system in search of producer alliances. The need for greater interoperability is a familiar challenge in IT — a successful industry has in fact been built around the integration of client and vendor systems, and vendor-to-vendor solutions. But IoT represents a horse of a different colour due to the participation of product and device manufacturers from outside IT, and increasingly, to the application of IoT in industrial settings. In the view of many trend watchers, the “Industrial Internet” or “Industrial IoT” holds the most promise for the connected future: Elfrink argued, for example, that deployment of industrial sensors will exceed that of consumer sensors by 2017. However, this shift to the process industry will entail the introduction of additional technologies, protocols and standards to enable the convergence of IT and OT (Operational Technology) — along with a relentless requirement for real-time operational data and no-compromise security.
For the most part, discussions around IoT interoperability have yet to tackle the primary IoT concern identified by IoT World Forum attendees (and elsewhere) — security protocols. While IoT presumes the integration of multiple connected devices, security risk increases directly with the number of nodes on the network. Since IoT works as a connected solution — and not in silos — some mechanism is required to enable seamless integration of devices. As Schneider CTO Pascal Brosset put it, seamless connection of IT and OT systems in industrial applications is achieved through migration from traditional operating technology architectures to IP enablement of industrial devices, which push data and events as web services. However, IP-enabled devices typically live behind the corporate firewall, a limitation that he believes needs to be overcome in order to achieve the ubiquitous IP network that serves as the IoT backbone. Listing the IoT ‘needs’, Brosset advocated for: cheap sensors that self-identify and are self-powered to overcome device constraints; a uniform platform; and a mesh network that will support scale; as well as the integration of other information sources/data that can provide the “context” that makes IoT happen.
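Brosset’s picture of IP-enabled industrial devices that “push data and events as web services” can be sketched in a few lines. The snippet below is a hypothetical illustration only: the device ID, metric name, and collector endpoint are invented for the example, not drawn from any vendor’s API.

```python
import json
import time

# Hypothetical sketch: an IP-enabled industrial sensor packages a reading
# as a JSON event, ready to be pushed to an IT-side web service. Field
# names and the endpoint shown are illustrative assumptions.

def make_event(device_id: str, metric: str, value: float) -> str:
    """Serialize a sensor reading as a web-service-friendly JSON event."""
    event = {
        "device_id": device_id,    # self-identifying device (per Brosset's "needs")
        "metric": metric,
        "value": value,
        "timestamp": time.time(),  # real-time operational data
    }
    return json.dumps(event)

payload = make_event("pump-07", "vibration_mm_s", 2.4)
# In a real deployment this payload would be POSTed to a collector, e.g.:
#   requests.post("https://collector.example.com/events", data=payload)
```

The point of the sketch is the direction of flow: the device pushes structured events outward over IP, rather than waiting to be polled over a proprietary OT bus — which is what makes the firewall question Brosset raises unavoidable.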
Is it possible to reconstruct the tower, to build the secure, collaborative platform required to support IoT’s solutions orientation? Cisco believes it is, and outlined a first step in this direction at the event with the announcement of the IoT Reference Model. Designed to create what Cisco VP of IoT systems and software Kip Compton described as a “common language” that will help improve productivity in collaboration around IoT, the new Reference Model was an initiative of the Architecture Working Group of the IoT Forum Steering Committee (128 companies), consisting of 28 members who worked to develop consensus around this new “taxonomy.” A high-level schema, the Reference Model is made up of placeholders at seven levels for: physical devices and controllers (the “things”), connectivity (communication and processing units), edge computing (data analysis and transformation), data accumulation (storage), data abstraction (aggregation and access), application (reporting, analytics, control), and collaboration and processes (people and business processes) — a broad list encompassing many IoT partner activities.
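The seven-level taxonomy above can be captured as a simple ordered enumeration. This is an illustrative paraphrase of the model’s levels, not code published by Cisco or the working group; the names and helper function are assumptions for the sketch.

```python
from enum import IntEnum

# Illustrative sketch of the seven levels of the IoT Reference Model,
# numbered bottom-up. Names paraphrase the taxonomy described above.
class IoTLayer(IntEnum):
    PHYSICAL_DEVICES = 1   # the "things": sensors, devices, controllers
    CONNECTIVITY = 2       # communication and processing units
    EDGE_COMPUTING = 3     # data analysis and transformation at the edge
    DATA_ACCUMULATION = 4  # storage
    DATA_ABSTRACTION = 5   # aggregation and access
    APPLICATION = 6        # reporting, analytics, control
    COLLABORATION = 7      # people and business processes

def describe(layer: IoTLayer) -> str:
    """Render a level as a human-readable label."""
    return f"Level {layer.value}: {layer.name.replace('_', ' ').title()}"

print(describe(IoTLayer.EDGE_COMPUTING))  # Level 3: Edge Computing
```

Ordering matters in the model: data moves up from devices toward people and processes, and each vendor can locate its products against specific levels rather than claiming the whole stack.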
According to Compton, this initiative “was not necessarily easy, but a rewarding process,” with lots of dialogue and discussion with working group members from across IT, industrial and other segments on what he views as a significant first step in establishing the foundation for common conversation around IoT. In addition to Cisco, input to the model was provided by several large companies, such as IBM and Intel, and from the industrial automation side by companies such as Rockwell Automation. But working group members came in all shapes and sizes. Synapse Wireless, for example, began seven years ago as a 10-person shop installing M2M and mesh networks and is now a 150-employee operation providing end-to-end monitoring platforms that are embedded in systems for key verticals, including solar energy, lighting, healthcare, food services and retail. Synapse’s interest in the project stemmed from challenges in its own experience deploying IoT solutions: as CTO David Ewing noted, “it really rang true to us because even across those five segments that we have, we have many instances where we need to interoperate. Intelligent lighting in a restaurant, for example, can really enable temperature monitoring in my food service business. Clearly what we want to do is to have synergy between those different verticals, and leverage the infrastructure, so that once you put in new systems you can layer applications on top of that to bring more value to the customer.”
But even within its own solutions, Synapse is still a step away from full interoperability — Ewing observed that “the optimal technology and the optimal architectural network for making street lighting work is likely not the optimal technology for making a healthcare asset tracking system work.” However, the company has been sketching out an approach to aggregation of data from various systems for interconnection at the appropriate layer for some time now, developing knowledge of patterns and typical models across five industries. It was this experience that Synapse was able to bring to the Reference Model discussions, and this requirement that Ewing believes was supported by the final model document. While the simple sketch of various layers in the Model does not resolve interoperability issues on its own, as Mark Guagenti, staff software engineer in the CTO’s office at Synapse, explained, it helps providers mark out the areas in which they can contribute to standards development. The Model also helps companies like Synapse identify their strengths in customer conversations, where they can leverage existing client architectures or solutions and seek out potential partners, though Guagenti added that over time Synapse by default had to build a platform stretching across the seven layers — through APIs connecting devices, through gateways and the ‘fog’ at the edge, to cloud and user applications. Summarizing the practical applications of the Reference Model, Compton added that as a vendor, you can show clients where your product fits in the overall space (what functions it provides); as a customer, you can use the model to start thinking about what pieces you need in an actual architecture, and about how you could build an architecture that addresses those different layers. For the service provider, the model provides a framework for identifying what categories they can participate in, and in what areas it would be better to partner.
In many ways, the language in the IoT Reference Model is very network-, or even Cisco-centric, with heavy emphasis on connectivity, collaboration, computing at the edge — a thinly veiled reference to Cisco’s new fog platform, IOx — and analytics, an area that Cisco signalled its intent to develop last year with the acquisition of data virtualization specialist Composite Software. But as a living document, Compton expects there will be further refinements as the industry evolves and as the model is presented to different standards bodies in order to contribute to their efforts. First stop on the standards tour for the model is the Industrial Internet Consortium, an industry, academic and government group formed this year to accelerate the development of intelligent industrial automation.
Compton distinguished the IoT Reference Model, which is “a logical diagram of pieces you should have in an IoT solution,” from a ‘reference architecture’ or “technical diagram that implies how pieces of systems are connected to each other.” At the same time, he agreed that “abstraction,” which is featured in the Model, is “a pretty powerful element that will play an important role in the IoT”: by “providing vocabulary around where you might want to have abstraction layers and interfaces, the Reference Model will be helpful. Starting to think about how these systems will talk to each other in ways that are open and scalable is important, and abstraction will play a big role in that.” To demonstrate the art-of-the-possible on this front, the IoTWF featured a breakout session with presentations from Cisco, Intel and IBM, outlining architectures that have been built on the Reference Model taxonomies. CTO for Cisco’s data & analytics business group, Jim Green, for example, outlined an IoT Reference Architecture that started with basic premises (the need to build an open, integrated solution) and worked through devices, the network, edge computing, data storage and databases, applications and the act and collaborate function to create “something that begins to look like a stack.” But most interestingly, Green demoed where integrations are possible to allow customers to “do best of breed” sourcing, and where it’s possible to build bridges that help IT to handle the fast and voluminous data streams entailed in OT — “IoT Middleware” — an abstraction layer that enables agility, scale and convergence of “edgeware,” or “an interface that will provide interoperability across the industry.” According to Green, defining these interfaces, prototyping them and testing them is a joint effort that Cisco is undertaking with IBM, Intel and other members of an “open IoT community.”
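The “IoT Middleware” Green describes — one abstraction layer between fast OT data streams and IT applications, behind which components can be swapped — can be sketched as a minimal publish/subscribe interface. The class and method names below are illustrative assumptions, not from any Cisco, IBM or Intel specification; a real bridge would span edge and cloud rather than a single process.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of an "IoT Middleware" abstraction: applications
# program against one interface, while the implementation behind it can
# be replaced with best-of-breed components.
class EventBus(ABC):
    @abstractmethod
    def publish(self, topic: str, event: dict) -> None: ...

    @abstractmethod
    def subscribe(self, topic: str, handler) -> None: ...

class InMemoryBus(EventBus):
    """Trivial in-process stand-in for a real edge-to-cloud bridge."""
    def __init__(self):
        self._handlers = {}

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, event):
        for handler in self._handlers.get(topic, []):
            handler(event)

bus = InMemoryBus()
seen = []
bus.subscribe("ot/telemetry", seen.append)
bus.publish("ot/telemetry", {"device": "plc-3", "temp_c": 61.0})
print(seen)  # [{'device': 'plc-3', 'temp_c': 61.0}]
```

The design choice the sketch illustrates is Green’s: because consumers depend only on the abstract interface, the in-memory bus could be replaced by a scaled message broker without touching application code — which is what makes “best of breed” sourcing possible.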
For his part, Brian McCarson, senior IoT architect for Intel, described four critical elements in an IoT architecture that will help bridge IT and OT in a way that is secure, scalable and interoperable: start with security that is hardware and software based, automate discovery and provisioning to ease deployment of endpoints, normalize data through protocol abstraction, and visualize value through a broad analytics infrastructure stretching from the edge to the cloud. Key to McCarson’s schema was security — an “immutable hardware ID for each compute device to enable secure, automated provisioning,” SecureBoot through a kernel-level device image that enables secure device provisioning at the OS/BIOS level, and White Listing, a secure image of allowable agents/applications for that specific device. McCarson also stressed the importance of addressing legacy protocols to protect industry investments in firmware and legacy systems, and the use of “IT to create the Rosetta Stone” on which to develop and build apps. According to McCarson, there are approximately 114 industrial protocols — building APIs for each of these would be an impractical task. In his presentation, Mac Devine, CTO for IBM’s cloud services, also focused on security, noting that the “concept of trust” in IoT has moved beyond talk about passwords for individuals to the idea of control in geographical locations. For IBM, this is achieved through real-time analytics that provide information at the “right layer, at the right time, and with the right actionable insights,” which the company aims to deliver through its own Reference Architecture. IBM has also created a series of “composable services” based on the Cloud Foundry model, beginning with the IoT Foundation Service and the IoT-related Bluemix services, including “Flow Data Stream” management for connection, collection (data) and command of a wide variety of devices, which “went live” at the event.
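McCarson’s “Rosetta Stone” point — normalize data through protocol abstraction rather than building an API for each of roughly 114 industrial protocols — can be illustrated with small per-protocol adapters that map raw readings into one common record. The protocol names below are real, but the raw message formats and field names are invented for the example.

```python
# Hypothetical sketch of protocol abstraction: each legacy protocol gets a
# small adapter that normalizes its readings into a single common schema,
# so applications program against one format instead of ~114 APIs.

def from_modbus(raw: dict) -> dict:
    # Modbus addresses data points by numeric register
    return {"source": "modbus", "point": str(raw["register"]), "value": raw["value"]}

def from_bacnet(raw: dict) -> dict:
    # BACnet identifies points by object name and present value
    return {"source": "bacnet", "point": raw["object_name"], "value": raw["present_value"]}

ADAPTERS = {"modbus": from_modbus, "bacnet": from_bacnet}

def normalize(protocol: str, raw: dict) -> dict:
    """Single entry point: apps see one schema regardless of protocol."""
    return ADAPTERS[protocol](raw)

print(normalize("modbus", {"register": 40001, "value": 17.5}))
# {'source': 'modbus', 'point': '40001', 'value': 17.5}
```

The economics follow from the shape of the code: adding a protocol means writing one adapter, while every application built on the normalized schema is untouched — which is why abstraction scales where per-protocol APIs would not.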
IoT end-to-end solutions involving broad partner ecosystems would be built on top of this foundation. According to IBM VP of IoT, John Thompson, IBM’s vision for IoT extends beyond connectivity. Without analytics, he argued, connectivity has no value — data must be served up and analysed in real time to provide actionable information for better asset maintenance, product management, commerce, and the like.
These three different spins on IoT highlight the challenges in building consensus around IoT frameworks. Indeed, the Reference Model featured at the IoTWF 2014 is not the only one: the first version of an Architecture Reference Model developed by the European FP7 Research Project IoT was presented in Barcelona as early as 2011. But by keeping the Reference Model discussion at a very high level, the Architecture Working Group of the IoT Steering Committee is hoping to provide the first steps towards broader consensus — the “ecosystem education” that John Chambers noted in his closing keynote will enable the cooperation needed to drive IoT deployments. Chambers also noted, however, that we live in an era of fierce competition where the pace of change is almost unmanageable — 40 percent of companies will go under in the next ten years, he predicted. To stretch another metaphor with a biblical root, the ultimate value of the IoT Reference Model will be apparent in its ability to withstand the pull of competitive positioning — as everyone knows, the devil is always in the details.