The anatomy of a standard: building tech norms in government, Pt. 1

The Object Management Group, a long-time player in IT standards creation, explains the process behind standards research. Part one of this two-part essay focuses on ecosystem development, organizational issues, and the establishment of the government use case.

Standards, huh, what are they good for?

Tech standards are widely touted as a panacea for a multitude of sins. Their application promises to solve interoperability challenges for technology consumers looking to deploy solutions, particularly in emerging areas such as IoT; adherence to standards helps users battle security demons; and in standards, inventors find the guidance and frameworks they need to secure a place for their product or service in an increasingly global supply chain. But faced with a dizzying array of sometimes competing standards, adopters confront important questions: what are these guidelines, who creates them, why and how do they emerge, and which is the right one for a particular application?

As a starting point, it’s important to recognize the diversity of standards bodies and their output. Organizations such as the ITU have a long heritage in developing standards and protocols for the telecommunications industry, while others, such as the IEEE, have focused on electronics, computer engineering, IT and related fields (e.g., WiFi). Often, standards emerge from guidance developed by national governments to inform technology implementation in their own operations: the US NIST cloud definition and implementation standards are a good example of this kind of activity. In Ontario, the provincial government has prescribed the use of a set of more than 70 standards that apply to the creation of applications for the public service, divided into technical, architecture, information, IT service management, green IT, networking, security and enterprise product categories. Government standards are often adopted by organizations in the private sector, or they may roll up to supranational bodies such as the ISO, a global federation of national standards bodies that now represents 160 countries and has developed standards across a range of ICT (and other) requirements. Key IT-focused standards bodies in Europe include CENELEC (electrical engineering), ETSI (telecommunications) and CEN (other technical areas); these have developed standards that are often adopted beyond European markets and borders. As they are created, IT products, services and environments may encompass the application of multiple standards.

How are they developed?

Typically, work on standards issues begins in response to a call for solutions to a problem from industry groups or associations, which may include both providers and consumers of technology. Standards research is carried out by discrete, expert working groups formed within traditional standards bodies such as those noted above, or may be conducted by industry consortia that engage in independent best practices development of their own in new tech fields. Prominent examples of this kind of consortium are the Cloud Security Alliance and the Industrial Internet Consortium (IIC), which recruit broad industry participation for work on the development of technology solutions that may be used standalone or that may inform existing standards. The consultancy Raising Standards lists hundreds of consortia in the ICT world, claiming that these are increasingly responsible for much of the critical standardization work.

Richard Soley, Chairman and CEO, OMG and Executive Director, IIC

The primary job of consortium staff is to identify, gather and organize members of a group that share an interest in solving an industry problem; a secondary task is to source the funding needed to support research into the issue. Richard Soley, chairman and CEO of the Object Management Group (OMG), a technology standards consortium created in 1989, and executive director of the IIC, has described the consortium mandate as follows: “It’s about ecosystem development. It’s not even about technology or security or standards. It’s about creating that ecosystem of companies that want and need to work together. Our job is to bring together the players, and we also help to identify funding for these testbeds.” The aim of the working groups and testbeds formed to address specific problems is to identify “the new requirements for new standards that we can deliver to standards organizations so that they can develop standards that will make it easier the next time a product/solution gets built. The testbed is really about inventing the future, rather than waiting for the future to be upon us and disrupt our businesses.”

Research output is only as good as its traction in the market, and a key role of these organizations is to socialize their findings. The OMG’s IIC program, for example, reports relevant testbed findings to more than 30 different standards bodies. It also holds regular meetings in which testbeds report on their progress, end users outline the use cases and the problems that need to be addressed, and vendors discuss technologies that might fit into the stack to solve them. According to Soley, a well-defined use case and established success criteria are critically important to the development and scale of testbed research. And the use case is invariably a vertical application: “you will understand better how industries are going to be disrupted when you focus on the industry and the use case. This doesn’t mean there are no horizontal use cases – approximately 15% of our testbeds are horizontal, and the rest are very vertical. I’m a strong believer in vertical focus. If you are in the mining business, it’s much more helpful to look into mining use cases to see what people have done.”

The final requirements in standards research are openness, transparency and objectivity. At the OMG, Soley explained, “We insist that our process be open, neutral and international, and follow well-defined process lines. That’s how you ensure you don’t have antitrust issues. We’ve got competitors sitting in the room together, vendors, multiple banks – you’ve got to be equitable for your members, but also for your regulators.” The IIC, for example, launched in 2014 with founding members AT&T, Cisco, GE, IBM and Intel as an “open, neutral sandbox” where industry, academia and government collaborate to innovate, now runs 30 projects aimed at capturing best practices and technical requirements in largely vertical focus areas.

Standards need: the use case

Many of these key activities were on display at a Technical Meeting hosted by the OMG this past September. Attended by 343 IT professionals, the meeting provided an opportunity for networking, a forum for individuals to influence the technology adoption process, and a venue for information sharing on the evolution of standards work – both in technical task force reporting and in presentations delivered to a broader conference audience. The meeting was held in Ottawa, Canada, a fitting locale for research designed to address vertical requirements shared by public sector organizations.

Teresa D’Andrea, Director Digital Exchange, Treasury Board of Canada Secretariat, outlines interoperability needs in the federal government at the OMG Technical Meeting in Ottawa, Sept. 2018

At the event, the government use case was outlined by Teresa D’Andrea, Director Digital Exchange, Treasury Board of Canada Secretariat, who described her organization’s work towards the creation of the CDXP – an open standards solution for software interoperability that will serve as the foundation for the federal government’s OneGC Digital Exchange Platform. According to D’Andrea, the government technology landscape is highly complex, the outcome of decades of accumulated, siloed information systems building: data moves across government in a wide variety of formats using multiple technologies, and interoperability, where it exists, is point to point. Difficulties in exchanging information and the extended time it takes for data to move between departmental systems result, she explained, in delays in service delivery to Canadians, “information blind spots” when needed information isn’t readily available to different departments and/or agencies, the use of inaccurate or incomplete information to render services, and the requirement, in some cases, for Canadians to interact with several government organizations to achieve a single outcome. Information exchange with the private sector is similarly fraught with challenges: though the government currently exchanges a great deal of data with industry, the process is cumbersome. Resolving this problem is no easy task: as D’Andrea explained, “The inconsistency and the sheer permutations of different ways to pass data makes processes very complex and expensive to change.”

In the past, the government has found attempts to resolve this complexity in a single project to be unrealistic. As a result, D’Andrea explained, it is now working more broadly to remove some of the foundational barriers that have built up over time and to lay the groundwork for more efficient data sharing going forward. This approach involves three key initiatives: first, leadership to establish a virtual Community of Practice around Digital Exchange for skills building and knowledge sharing; second, deployment of the Canadian Digital Exchange Platform (CDXP), which will provide a service bus, bulk data transfer, a gateway, messaging, and the API Store, a marketplace for reusable API services, along with other integration methods such as messaging/eventing and service orchestration, to support the creation of a sharing ecosystem that includes industry and the provinces; and third, the development of policy and “lightweight” standards for data sharing across government, including API standards, messaging standards and integration patterns.
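D’Andrea’s remarks do not describe the CDXP at the code level, but the appeal of a shared exchange layer over point-to-point links can be illustrated in miniature. The Python sketch below models the publish/subscribe pattern behind a messaging/eventing platform of the kind she describes; the Broker class, the “citizen.address_changed” topic and the departments involved are all hypothetical, chosen only for illustration.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class Broker:
    """Toy in-memory stand-in for a shared exchange layer (bus/eventing).

    Hypothetical illustration only: with a shared broker, each department
    integrates once with the platform instead of once per counterpart.
    """

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Consumers are notified without the publisher knowing who they
        # are: the decoupling that a shared platform provides.
        for handler in self._subscribers[topic]:
            handler(message)

broker = Broker()

# Two consuming "departments" register interest in one event type.
broker.subscribe("citizen.address_changed", lambda m: print("Tax agency sees:", m))
broker.subscribe("citizen.address_changed", lambda m: print("Benefits agency sees:", m))

# The publishing department emits the event once, in one agreed format,
# instead of pushing it point to point to every interested system.
broker.publish("citizen.address_changed", {"citizen_id": "A123", "new_city": "Ottawa"})
```

Because each publisher and consumer integrates once with the shared layer, adding a new consumer requires no change to any existing system; this is the property that lets a platform replace an N-by-N mesh of point-to-point connections.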

In standards development, the Treasury Board Secretariat has adopted a flexible approach. D’Andrea added, “These standards are meant to be read and consumed in 30 minutes or less by technologists. We focus on the baseline set of standards that are needed to make sure systems can share data consistently across government and leave enough room for innovation by the folks actually implementing these integrations and processes. We have also leveraged industry materials as well as the great work done already by some of the other leading digital countries (e.g., UK, New Zealand, US).” The aim in creating “lightweight” standards is to combine ease of use and speed of deployment with targeted engagement on standards value, to help manage and scale standards usage. According to D’Andrea, “Our philosophy is to shift towards defining simple and easy-to-adhere-to standards, which focus more attention on stuff that has high impact on inter-organizational interoperability (e.g., security protocols, data encoding, interface binding protocols). We’re planning an extensive and ongoing engagement effort to promote the value of having standards. The standards are about simplifying the number of decisions when building digital services to ensure what you build can be easily consumed. The value is about faster delivery and being able to consume other services with less guessing or fiddling with technology options. In short, we’re making sure we heavily leverage active engagement techniques like communications and education to drive adoption of the standards.”
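The federal API standards themselves are not reproduced in D’Andrea’s talk, so the following is only a rough sketch of what a “lightweight” standard might constrain in practice, assuming conventions such standards commonly cover: a versioned URL path, JSON as the single data encoding, and one consistent error envelope. The service is written in Python with Flask, and every endpoint name and field is illustrative rather than taken from the GC standards.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical conventions of the kind a "lightweight" API standard might
# fix in place: a versioned base path, JSON as the single data encoding,
# and one consistent error envelope across all services.

RECORDS = {"A123": {"citizen_id": "A123", "city": "Ottawa"}}

@app.route("/api/v1/records/<record_id>")
def get_record(record_id: str):
    record = RECORDS.get(record_id)
    if record is None:
        # A consistent error shape means consumers handle failures the
        # same way regardless of which department published the API.
        return jsonify({"error": {"code": "NOT_FOUND",
                                  "message": f"No record {record_id}"}}), 404
    return jsonify(record)

if __name__ == "__main__":
    app.run(port=8080)
```

The point of such constraints is not the specific choices but their consistency: a consumer that has integrated with one department’s API can parse responses and handle errors from any other department’s API with, in D’Andrea’s words, less guessing or fiddling with technology options.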
