Dashboards and panels and labs, oh my! Donning a different dress with new topics, new industry participants and new session formats, this year’s Information Builders Symposium highlighted the company’s advance into additional areas of the Canadian market, as well as the expanding scope, importance and interplay of analytics and other technologies in success strategies across a range of sectors. Titled 2017 Symposium: Analytics in the Public Ecosystem, this year’s event, held in Niagara Falls, Ontario last month, explored themes of interconnection – between various branches of the public sector to improve information sharing – and coordination of tech implementation to optimize solution efficiencies. As Tara Myshrall, public services specialist for Information Builders Canada, suggested in her welcome remarks to Symposium attendees, when deploying analytics, users should also consider current IT trends – privacy and security risks, Big Data challenges, the emergence of IoT, and the importance of operationalizing data to ensure that information assets are actionable and usable.
Mitigating privacy risk in advanced analytics
Throughout Symposium, presenters returned to these trends in sessions designed to inform the deployment of analytics in healthcare, law enforcement and public sector organizations. In her keynote presentation, Kelly Friedman, partner in legal firm DLA Piper (Canada) LLP and a noted legal expert on privacy issues associated with the use of advanced technologies, outlined the two legal frameworks – the Charter of Rights and Freedoms, and data protection law in the Privacy Act, FIPPA, PHIPA and MFIPPA – that govern privacy issues in Big Data and IoT implementations. With an eye to helping organizations reconcile privacy rights and digital reality, Friedman argued that though there is no specific legislation in Canada that covers IoT and Big Data, “fair information principles” will apply to a wide range of data issues: accountability; the need to identify the purpose of data collection; limitations on the collection and use of data; the need for individual consent to use personal information; requirements around disclosure and retention of data; accuracy (data quality); safeguards (security); openness and transparency on the use of data; the individual’s access to data collected on him or her; and the individual’s right to challenge an organization’s compliance with data protection rules.
Furthermore, Friedman argued that the common law of privacy is evolving through new torts (civil wrongs recognized through court decisions) that will only strengthen privacy requirements going forward. A banking case, for example, established the tort of “Intrusion Upon Seclusion,” in which the plaintiff did not need to prove financial damages – only that a reasonable person would consider the invasion of their privacy offensive. And in a case that examined a utility provider’s sharing of metering data on electricity consumption with police, the courts ruled that the sharing constituted an intrusion into privacy on the part of the state, and that police looking to identify marijuana grow ops would need a valid reason to seek a warrant. According to Friedman, these rulings and others indicate a clear trend towards limiting the use of personal information, a new focus on consent, and the use of data only for declared purposes. To help organizations address this movement towards greater emphasis on data protection in analytics projects, Friedman offered the following best practices advice for avoiding risk in grey areas:
- Indirect collection of data for a secondary purpose – the benefits of the project must outweigh the privacy risk. The organization should have legal authority to collect the data, and to promote openness and transparency, take specific actions such as describing the project on its website.
- Speculation of need rather than necessity – Big Data fishing for significant patterns is tempting but should not be undertaken without prior knowledge and understanding of why the analysis might be useful. Data elements should be conceptually related to the subject matter under investigation in order to demonstrate need for the data.
- Privacy of publicly available information – just because information is in the public realm doesn’t mean it’s open season and you can use it. If an individual wouldn’t reasonably expect that their information would be used in an organization’s Big Data project, it shouldn’t be used.
- Linking errors from probabilistic linkages – personal information is often drawn from a diversity of sources, and it is rare for a unique identifier to exist across them. Ensure linkages are appropriate to the project in order to maintain data accuracy.
- Government agencies have different roles – information gathered for an administrative function is often used for a policy function, but should not be used in a way that will link data to a person. Linked data sets should be de-identified to ensure adequate separation.
- Big data sets can reveal portraits of individuals – employ the principle of data minimization; don’t retain data longer than you need it.
- Garbage in; garbage out – ensure the data is accurate to the purposes of your project.
- Big Data is not objective – biases can exist within the data, so the data that feeds analysis makes a difference, and can negatively impact individuals associated with a particular race or socio-economic status.
- Equality rights – don’t use Big Data as a proxy for a group that is protected under the Charter of Rights. Making decisions based on race, for example, constitutes a breach of the Charter. Ethics boards can provide guidelines.
- Correlation does not mean causation – be aware of specious correlations, and don’t imply causation when it’s not there, even though cause may appear obvious.
- Profiling – when decisions are made based on Big Data analytics, a profile is created that is treated like personal information. Individuals have the right to rebut a conclusion that may have come from analytics.
- Recognize there will always be some data error – so verify the results of decisions based on data.
- Using privacy law in the new world – law is evolving slowly to accommodate not just bits of information, but information in conglomerate that is extracted from information systems. Analytics users must develop awareness of new risk associated with data systems.
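Friedman’s warning about probabilistic linkages is easiest to see in code. The sketch below is a minimal, hypothetical illustration – the record fields, similarity measure and threshold are assumptions for this article, not any organization’s actual matching logic – of how records without a unique identifier get linked by fuzzy similarity, and why a conservative threshold matters.

```python
# Hypothetical sketch of probabilistic record linkage without a unique
# identifier. Field names and the 0.85 threshold are illustrative only.
from difflib import SequenceMatcher


def similarity(a: dict, b: dict) -> float:
    """Average string similarity across the fields shared by two records."""
    fields = ("name", "street")
    scores = [
        SequenceMatcher(None, a[f].lower(), b[f].lower()).ratio()
        for f in fields
    ]
    return sum(scores) / len(scores)


def link(a: dict, b: dict, threshold: float = 0.85) -> bool:
    # A conservative threshold reduces false links -- and the privacy harm
    # of attributing one person's data to another -- at the cost of misses.
    return similarity(a, b) >= threshold


rec1 = {"name": "Jon Smith", "street": "12 Main St"}
rec2 = {"name": "John Smith", "street": "12 Main Street"}
rec3 = {"name": "Joan Smythe", "street": "98 Oak Ave"}
```

Here `rec1` and `rec2` link despite spelling differences, while `rec3` does not; in practice the threshold must be tuned per project, which is exactly the “linkages appropriate to the project” judgment Friedman calls for.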
Real time data reporting and visualization
Focusing on the art of the possible, other sessions described analytics projects that demonstrate the value in collecting, organizing, analyzing and reporting through Business Intelligence software aimed at increasing access to information. Stuart Betts and Greg Stanisci of York Regional Police, for example, outlined how data analytics provides value for policing. YRP’s analytics journey began in 2012, and with its data warehouse in place, the strategic services branch has begun to develop WebFOCUS dashboards with various types of drill-down designed to support process improvement and increase efficiencies. According to Inspector Betts, “technology is a tool, not a driver” for their project; rather, the YRP embarked on data-driven policing to better manage staffing resources, to improve public safety and security, to reflect the true cost of policing, and to build community confidence and trust. But technology in the form of multiple dashboards that deliver traditional reporting as well as real time information to the officer in the field has made a difference. For example, the YRP’s Economics of Policing report, presented to the Police Services Board, delved into the disjuncture between budgets and crime rates by looking at unaccounted activities (for example, mental health intervention) and producing direct labour costs for each activity – a “game changer” that, Betts argued, allows the force to engage in more realistic budget discussions. Similarly, the Sector dashboard automates and optimizes use of staff resources based on an ideal model of police activities (30 percent of time spent in administrative work, 30 percent in proactive work, 30 percent in citizen service calls, and 10 percent in community engagement) to track individual performance, but also to track engagement in life-threatening priority calls.
Other dashboards deliver information to field officers in real time, developing situational awareness that helps decision making during investigation; while others deliver information that is used in YRP’s real time operations centre – the first in Canada – that provides logistical information, such as how many officers are in the field at any one point to respond to an emergency, the type of calls that are currently in service, the units that are assigned to the call, resources by district, as well as the BI logic that supports alarming based on prescribed thresholds.
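The “alarming based on prescribed thresholds” Betts describes can be pictured as a simple rule check over live operational metrics. The sketch below is a loose illustration – the metric names, threshold values and messages are invented for this article, since the actual YRP/WebFOCUS logic is not public.

```python
# Hypothetical threshold rules for an operations-centre view; metric names
# and numbers are illustrative, not York Regional Police's real logic.
THRESHOLDS = {"available_units": 5, "priority_calls_waiting": 2}


def alarms(metrics: dict) -> list:
    """Return alarm messages for any metric that crosses its threshold."""
    out = []
    if metrics["available_units"] < THRESHOLDS["available_units"]:
        out.append("LOW COVERAGE: fewer than 5 units available")
    if metrics["priority_calls_waiting"] > THRESHOLDS["priority_calls_waiting"]:
        out.append("BACKLOG: priority calls exceed threshold")
    return out
```

A BI layer evaluating rules like these against streaming dispatch data is what turns a passive dashboard into the kind of real time decision support the operations centre provides.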
While dashboard wizardry animated the YRP presentation, HInext brought Symposium attendees behind the scenes with a glimpse into the creation of a new mental health app. Based on an evaluation of the patient journey, TREAT is a SaaS-based application that manages clinical documentation and workflow to support the clinician’s patient assessment, the establishment of a care plan, and the flagging of issues that could compromise the plan and patient health. As Peggy Lucas of HInext explained, TREAT was built on standardized responses for care plans in collaboration with a customer; but successful development of the application was also dependent on the use of Information Builders’ integration and reporting technologies, which helped to standardize access to reporting in multi-customer settings. (HInext is also looking to support the development of standards for metrics in the Ministry of Health.) According to Lucas, use of WebFOCUS ultimately enabled TREAT to address requirements for operational reports, dashboards, user query, and integration with other SaaS services, but also provided an opportunity to revitalize the user experience; to improve access to data, including transactional data that is handled in real time; and to deliver dashboards for the patient story, a client profile for the organization, and data from the provider perspective – the “wow factor.”
Avoiding garbage in; garbage out
In his presentation, John Desborough, director of Consulting and Technology Services at MNP, addressed the elephant in the room – data quality issues. Speaking from his vantage point as a member of the fifth largest accounting and consulting firm in Canada, Desborough discussed cost issues around data quality, noting that BI and analytics users need to think about potential opportunity costs, and to factor in quality monitoring in advance of an integration, building these costs into the project from the outset. While it’s difficult to measure the cost of poor quality data, it is possible to do a data quality assessment, using data profiling and models to quickly assess and remediate issues up front. Desborough recommended that this task be “baked into your project management processes!” For its part, MNP relies on IB’s Data Quality Assurance solution, and its auditors use the tool to reduce risk associated with poor quality data. According to Desborough, a good data assessment tool should deliver data profiling results, data validation analysis, duplicate data analysis and pattern analysis, supporting a data quality strategy and program that can benefit project sponsors, business leaders, and legal and compliance staff.
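The four assessment outputs Desborough lists – profiling, validation, duplicate analysis and pattern analysis – can be sketched in a few lines. The example below is a toy illustration under assumed data; the field names, the postal-code pattern and the sample records are invented for this article and have nothing to do with IB’s Data Quality Assurance product.

```python
# Toy data quality assessment: profiling, validation, duplicate and
# pattern analysis. All field names and sample values are hypothetical.
import re
from collections import Counter

records = [
    {"id": "001", "postal": "L3R 2N2", "amount": "19.99"},
    {"id": "002", "postal": "L3R2N2",  "amount": "n/a"},    # pattern + validation issues
    {"id": "001", "postal": "L3R 2N2", "amount": "19.99"},  # duplicate of 001
]

# Expected shape of a Canadian postal code (letter-digit-letter, space, ...)
POSTAL = re.compile(r"^[A-Z]\d[A-Z] \d[A-Z]\d$")


def assess(rows):
    report = {
        # profiling: how many rows have each field populated?
        "filled": {f: sum(bool(r.get(f)) for r in rows) for f in rows[0]},
        # duplicate analysis: ids that appear more than once
        "duplicate_ids": [i for i, n in Counter(r["id"] for r in rows).items() if n > 1],
        # pattern analysis: values breaking the expected postal-code shape
        "bad_postal": [r["id"] for r in rows if not POSTAL.match(r["postal"])],
        # validation: amounts that are not parseable numbers
        "bad_amount": [],
    }
    for r in rows:
        try:
            float(r["amount"])
        except ValueError:
            report["bad_amount"].append(r["id"])
    return report
```

Running `assess(records)` flags record 002 on both pattern and validation checks and surfaces the duplicated id – exactly the kind of up-front findings that, per Desborough, should be baked into project management before integration begins.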
IoT in the ‘city’
To broaden the discussion, InsightaaS’ Mary Allen led a panel discussion that spoke to opportunities in IoT and to building intelligence in communities more generally. Drawing on some of the best practices findings uncovered in IoT Coalition Canada (IoTCC) working group sessions, panelist Campbell Patterson (from the Intelligent Community Forum) offered a definition of “community ecosystem” that extends beyond urban geographical boundaries, and perhaps more importantly, works through the engagement of multiple constituencies, rather than simply top-down. The potential for innovation in this approach was confirmed by the City of Brampton’s Prasana Gunaskekera, who has led city initiatives aimed at engaging citizens and integrating data from across city operations, as well as by Inspector Betts, who looks forward to the sharing of information across public services, such as mental health, to better coordinate service delivery to citizens at risk. To enable this kind of coordination, and collaboration across public and private sector entities, IT consultant, IoTCC contributor and panelist Don Sheppard offered some technical observations: standards are important to information sharing, blockchain is an emerging technology that offers significant promise, and new technologies demand that communities devise new approaches to IT procurement. Ultimately, the panel concluded that “intelligence” – or data driven service delivery and urban innovation – will depend on holistic planning that includes all stakeholders.
User experience design
Throughout Symposium, Jeff Hendrickson of Information Builders offered “intensive, experiential and fun labs” – hands-on sessions designed to introduce participants to design thinking in analytics solution development. Citing analyst firm Gartner, Hendrickson noted that 35 percent of apps are not adopted; the goal of the labs was to reinforce the importance of including cross-functional teams – IT and business leaders as well as technology end users – in IT-enabled process innovation. “Part art; part science,” Hendrickson stressed that “UX is not the same thing as UI”; rather, it is a content and UX strategy that depends on establishing deep empathy with the user to ensure the technology is usable and provides a firm basis for action. The IB UX Design Labs simulated projects (not products!) in healthcare, policing and city administration. As a general observation, Don Sheppard noted that good user experience design can lead developers to the types and sources of data that are needed to generate intelligent insights. But it also exposes issues of data quality, security and privacy, cloud-based data gateways and data governance.
Returning to the theme of technology coordination in his keynote presentation on Day 2 of Symposium, InsightaaS principal analyst Michael O’Neil outlined a cloud play in five Acts, in which each stage contributes towards greater integration and optimization of data assets across the organization. With the introduction of IoT, AI, deep learning and other advanced analytics technologies, the sheer volume of Big Data means it is likely that most of the processing required will come from cloud service providers. For this to be effective, all technologies will need to be secured and integrated. According to O’Neil, cloud computing should be viewed as a management issue and a management tool that is critical to the support of Big Data, IoT and analytics projects, and evolves as follows:
Act 1 – point solutions, often SaaS, are introduced to automate individual tasks,
Act 2 – process integration extends automation to related tasks,
Act 3 – integration occurs across (extra) connected activities,
Act 4 – automation of the entire function, includes connections with external sources of expertise and data, and
Act 5 – orchestration of data applications across the enterprise, based on the delivery of advanced business infrastructure/capabilities via the cloud.
The final keynote speech by Lyndsay Wise, solutions director in the professional services arm of Information Builders Canada, explored the future of analytics and the importance of strategic thinking in deploying BI and analytics. The constantly changing data landscape is the focus for the integration of new, and more complex, sources of data that will be used to generate business insights – to effectively add intelligence. But while emerging sources of data such as IoT and Big Data will continue to evolve, Wise wisely stressed that effective information management and life cycle control over data continue to be essential to the use of analytics in all sectors, and especially within the public sector.