Location analytics ‘how to’

Technology has long been labelled an enabler of business success. But as more and more companies come to recognize how important the transition to the ‘digital enterprise’ can be to competitive operation, this axiom is gathering momentum, taking on life as a critical imperative rather than an organizational detail. The shift in thinking now underway – where ICT serves as a source of gain rather than cost – was evident at the most recent Strategic Insights Session, hosted by DMTI Spatial and Insurance Business Canada Magazine at Toronto’s Air Canada Centre this past April. Billed as a discussion on “How location analytics lowers risk and drives efficiencies,” the session attracted key C-suite business executives from across the Canadian finance and insurance industries looking to maximize value by driving the alignment between IT and business needed to derive insight from information – tech specialists were notably absent from the list of power attendees.

Rob Daleman, VP marketing and inside sales, DMTI Spatial

To delve into the mechanics of this business/tech alignment, session hosts organized the event as a roundtable discussion, a format that allowed attendees to identify challenges and flag industry issues as a first step toward applying location analytics to industry pain points. On this score, Rob Daleman, VP marketing and inside sales, DMTI Spatial, launched the session with a précis of the “Six Steps to Leveraging Location,” a process for using location-based technology developed by DMTI in partnership with the Institute for Catastrophic Loss Reduction, Genworth and Canada Guaranty Mortgage Insurance to address the growing risk and exposure that dramatic increases in both the incidence and severity of natural catastrophes (the 2013 floods in Southern Alberta and the GTA, for example) have produced in the insurance industry.

In each of these steps, the business benefit of location analytics is clearly drawn:

1. Underwriter and claims communities use location-based data to understand the risk associated with specific geographical areas, fine-tuning pricing on current and future policies.
2. Operations combine data and visualization to link disparate information for real-time decision making.
3. Back-office functions apply location data to enhance portfolio analysis across the whole book of business.
4. Location intelligence supports post-event analysis and the provider activity this enables.
5. Marketing uses location data to research and better serve customer segments.
6. Technology implementation relies on cloud architectures and infrastructure that can scale to quickly deliver the right information in the right format within milliseconds.

According to Daleman, BT (before technology), “all of this was in the purview of the GIS analyst. This was all being done by individuals at desktops. It wasn’t real time and it wasn’t available through cloud technology. That’s really been a big change too in how we’ve been able to deliver these applications in a more meaningful way to customers today.”
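
To make the pricing idea in step one concrete, here is a minimal sketch in Python; the zone codes, risk multipliers and base premium are assumptions invented for illustration, not DMTI figures.

```python
# Hypothetical illustration of step one: adjust a base premium according to the
# flood-risk band of the zone an insured address resolves to. Zone codes,
# multipliers and the base premium are invented for the example.
FLOOD_RISK_MULTIPLIER = {
    "ZONE_LOW": 1.00,     # negligible historical flood exposure
    "ZONE_MEDIUM": 1.15,  # moderate exposure
    "ZONE_HIGH": 1.40,    # floodplain or repeated-loss area
}

def price_policy(base_premium: float, flood_zone: str) -> float:
    """Return a risk-adjusted premium for the zone the address falls in."""
    multiplier = FLOOD_RISK_MULTIPLIER.get(flood_zone, 1.10)  # default for unknown zones
    return round(base_premium * multiplier, 2)

print(price_policy(1200.00, "ZONE_HIGH"))  # 1680.0
```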

Interestingly, technology alone is not sufficient to drive the kinds of improvements outlined in the six steps. Daleman shared the experience of one insurance company that actually saw productivity decline when location analytics were applied at the operational level – that is, until rules and processes were put in place to systematize how underwriters approached their tasks. This refinement of business process also entailed standardization around addresses through the “Unique Address Identifier,” or UAID, which can be plugged into CRM, SAP or other ERP tools commonly used alongside underwriting systems. According to Daleman, “We have one unique way of connecting all of these points together and that makes the whole operations element a lot more efficient.”
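
A rough sense of why a shared key matters can be sketched in code. The example below assumes a pandas environment and invents small policy, claims and CRM tables that each carry the same UAID column; the field names and values are hypothetical.

```python
import pandas as pd

# Invented records from three systems, each carrying the same UAID column.
policies = pd.DataFrame({
    "uaid": ["A100", "A200", "A300"],
    "policy_no": ["P-1", "P-2", "P-3"],
    "premium": [1200.0, 950.0, 2100.0],
})
claims = pd.DataFrame({
    "uaid": ["A100", "A300"],
    "claim_amount": [8000.0, 15000.0],
})
crm = pd.DataFrame({
    "uaid": ["A100", "A200", "A300"],
    "customer": ["Smith", "Lee", "Tremblay"],
})

# With every system keyed on the same address identifier, linking records is an
# exact key match rather than a fuzzy comparison of free-form address strings.
combined = policies.merge(crm, on="uaid", how="left").merge(claims, on="uaid", how="left")
print(combined)
```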

This standardization has also worked to enable communications across the industry. As Michael Mercer, VP sales and business development, DMTI Spatial, explained, an ecosystem is taking shape that gives companies a common language for location: “Intact’s [insurance company] assessment solution, people like MPAC and Landcor, BC Assessment, title insurers like FNF, and all three of the mortgage insurers in Canada, including Canada Guaranty, CMHC and Genworth all use Location Hub [DMTI platform which contains the UAID]. AVM [Automated Valuation Models] providers are using Location Hub and add in that UAID; appraisers are doing it. So in this ecosystem, it’s possible now to communicate instead of worrying about ambiguous addressing.”

Ultimately, by using location data to optimize different processes and to look at data in different contexts, it is possible, Daleman argued, to consider business impact as a whole – where is there too much policy concentration, what markets are underserved, is appropriate pricing applied to areas with greater or lesser flood risk, and how does flood risk factor into overall portfolio development? “Juggling the exposures,” as Don Horn, content specialist, Insurance Business Magazine, summed it up.
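
As an illustration of that portfolio-level view, the following sketch aggregates a hypothetical book of business by zone and flags over-concentration; the zones, insured values and 40 per cent threshold are assumptions made for the example.

```python
import pandas as pd

# Hypothetical book of business: each policy already resolved to a zone
# (for example a postal area or flood polygon) via its address identifier.
book = pd.DataFrame({
    "policy_no": ["P-1", "P-2", "P-3", "P-4", "P-5"],
    "zone": ["M5V", "M5V", "M5V", "T2P", "V6B"],
    "insured_value": [450_000, 600_000, 525_000, 380_000, 710_000],
})

exposure = book.groupby("zone")["insured_value"].agg(total="sum", policies="count")
exposure["share_of_book"] = exposure["total"] / exposure["total"].sum()

# Flag zones holding more than 40 per cent of total insured value (threshold is illustrative).
print(exposure[exposure["share_of_book"] > 0.40])
```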

But location-based intelligence also has an impact on the customer, a point that Horn dramatized with a description of his activities during the 2013 Calgary flood crisis: “when the floods hit, I was busily contacting brokers as they were working from their homes trying to deal with clients. The cloud technology [used by the insurance company to confirm policies] made it possible for them to reassure clients. That was the biggest concern of clients – when they called in, they wanted someone to get back to them in 10-15 minutes, or at most a half hour to let them know, ‘yes, we know what’s happened. We’re on the case and we’re working on it.’ That is very, very big with clients.”

Post-event analysis, which enables accurate and appropriate management of a catastrophic event and the delivery of a better, near real-time customer experience, is an emerging area in location analytics. As Mercer noted at the event, First Calgary Financial and one of the chartered banks quickly approached DMTI during the Calgary floods to access the company’s Canadian address database to see which properties had been affected. Based on a geographic boundary outlining the affected area, and application of the UAID within that area, DMTI was able to quickly identify the 68,000 properties that were affected – helping the institutions understand within 24-48 hours what their exposure was. Once the 33-day flood was over, DMTI was asked to “stay on top of this type of service,” and has since released a new post-event service in which monitoring is done 24/7 for specific types of natural and man-made events, such as hurricanes, flooding, forest fires, man-made fires, oil spills or train derailments, that affect more than 10 properties. When an event occurs, the company sends an email alert to customers. According to Mercer, the service launched in response to customer demand: “the feedback came from the folks at the chartered bank and the folks at First Calgary and some other customers who we talked to. It was all a customer-driven product – it’s about what they needed and what their executives would want to see in the product. It’s another thing that we’re trying to do to address risk in the marketplace.”
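
At its core, this kind of exposure check is a point-in-polygon test: given a boundary for the affected area, find every geocoded property that falls inside it. The sketch below uses the shapely library with invented coordinates; it illustrates the technique rather than DMTI’s actual implementation.

```python
from shapely.geometry import Point, Polygon

# Invented event boundary and a small table of geocoded properties
# (address identifier, longitude, latitude).
flood_boundary = Polygon([(-114.10, 50.95), (-113.95, 50.95),
                          (-113.95, 51.10), (-114.10, 51.10)])

properties = [
    ("A100", -114.05, 51.04),
    ("A200", -113.80, 51.20),  # falls outside the boundary
    ("A300", -114.00, 51.00),
]

# Keep every property whose coordinates fall inside the event boundary.
affected = [uaid for uaid, lon, lat in properties
            if flood_boundary.contains(Point(lon, lat))]

print(affected)       # ['A100', 'A300']
print(len(affected))  # the exposure count an institution would want within 24-48 hours
```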

Another key theme that emerged in the session was the importance of integrating different kinds of information and drawing the right lines between data points to generate new insight. As one attendee lamented, “it’s all very well having great location data be the vessel for the ID which is this geocode, but for me one of the biggest challenges is the hazards. You mentioned having the flood data – you can have addresses, but these are only as good as the granularity of the perils you’re going to attach to that address.” Others are apparently already mapping their own claims data against flood zones in order to understand the impact on the business: another attendee reported “good success” mapping company data against other perils in order to rate and price different zones, determine coverage availability and decide, ultimately, whether or not to accept the level of risk indicated. And yet another participant noted the need for more individual data, the demographic characteristics of people living within a region, as an input to clearer definition of target markets, and the benefits of combining this with location and perils information in predictive models: “We’re seeing huge benefits and huge gains being made around combining all three of these sets of data in a predictive way,” he added.
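
A minimal version of that three-data-set idea might look like the sketch below, which joins a flood-zone flag (peril), an area-level demographic attribute and past claim outcomes, then fits a logistic regression with scikit-learn; the data and features are invented, and a production model would be far richer.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Invented training data: one row per property, joining a flood-zone flag (peril),
# an area-level demographic attribute, and whether a claim occurred.
data = pd.DataFrame({
    "in_flood_zone": [1, 1, 0, 0, 1, 0, 1, 0],
    "median_age":    [34, 41, 52, 47, 29, 61, 38, 55],
    "had_claim":     [1, 1, 0, 0, 1, 0, 0, 0],
})

model = LogisticRegression()
model.fit(data[["in_flood_zone", "median_age"]], data["had_claim"])

# Score a new address that falls in a flood zone in a younger neighbourhood.
new_property = pd.DataFrame({"in_flood_zone": [1], "median_age": [33]})
print(model.predict_proba(new_property)[0][1])  # estimated claim probability
```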

Beyond these individual efforts, partnering at the company level is also occurring to address user interest in comparing different types of data sets. At the session, DMTI and environmental risk services provider ERIS described the ‘environmental perils score’ they have developed and are now rolling out, which combines location data with data on spills, empty tankers, former gas station sites and the like – any environmental event that may have occurred in Canada. The goal is to help users better understand the risks associated with proximity to such an event or property. So far, flood data, earthquake data, crime data, weather data, demographic data and firmographic data have all been tied to a single address.
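
The DMTI/ERIS scoring methodology was not detailed at the session, but in principle a composite peril score is a weighted blend of per-hazard scores tied to one address. The sketch below illustrates that idea with invented weights and layer scores.

```python
# Hypothetical composite peril score: each hazard layer contributes a score between
# 0 and 1 for an address, blended with invented weights. This illustrates the
# concept only; it is not the DMTI/ERIS methodology.
HAZARD_WEIGHTS = {
    "flood": 0.30,
    "earthquake": 0.15,
    "historical_spills": 0.25,
    "former_gas_station_nearby": 0.20,
    "crime": 0.10,
}

def environmental_peril_score(layer_scores: dict) -> float:
    """Weighted blend of per-hazard scores (0 = lowest risk, 1 = highest)."""
    return round(sum(weight * layer_scores.get(hazard, 0.0)
                     for hazard, weight in HAZARD_WEIGHTS.items()), 3)

# Scores for a single address, as they might come back from each data layer.
print(environmental_peril_score({
    "flood": 0.8, "earthquake": 0.1, "historical_spills": 0.6,
    "former_gas_station_nearby": 1.0, "crime": 0.3,
}))  # 0.635
```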

On the finance side, session attendees also identified a number of additional use cases for location intelligence. David Lawlor, director of R.E.S.L. policy & process, Scotiabank, for example, added the audit perspective, noting that in covered bond activity addresses must be “scrubbed” to ensure compliance with SCC and the regulators, and pointing to the granularity location data can provide to help drive analytics at the condo-building and other community levels. Another participant pointed to possibilities for combining location intelligence with demographic information to achieve some level of predictive analytics in the mortgage business, projecting, for example, what sales might be associated with a high proportion of young families in an area or condo building. This proposition was more than theoretical in the case of DMTI: as Daleman explained, the company provided location/demographic analysis for Global News on the Toronto District School Board school closings, projecting which zones would experience increased demand going forward. On this topic, yet another attendee noted the potential for location intelligence to calculate mortgage impairment risk – for the downtown Vancouver condo market, for example, in the case of a large earthquake in Shanghai that might impact people’s ability to make payments. On another front, Mercer and Horn pointed to the explosion of data that will be generated by driverless cars, which could be combined with location information to improve service delivery as well as insurance management.
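
Address “scrubbing” of the kind Lawlor described typically starts with normalization before records are matched against a reference set. The sketch below is a deliberately simplified illustration and does not represent Location Hub’s matching logic.

```python
import re

# Deliberately simplified address "scrub": uppercase, strip punctuation, collapse
# whitespace and normalize a few street types to abbreviated forms before matching.
# The abbreviation table is a small illustrative subset, not a full standard.
ABBREVIATIONS = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD", "BOULEVARD": "BLVD"}

def scrub_address(raw: str) -> str:
    cleaned = re.sub(r"[^\w\s]", "", raw.upper())    # drop punctuation
    cleaned = re.sub(r"\s+", " ", cleaned).strip()   # collapse whitespace
    return " ".join(ABBREVIATIONS.get(word, word) for word in cleaned.split())

print(scrub_address("123 Main Street, Unit 4"))  # 123 MAIN ST UNIT 4
print(scrub_address("123  main St."))            # 123 MAIN ST
```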

If participants in the session expressed enthusiasm for new location intelligence use cases, they also outlined ongoing and future challenges in the world of analytics. While Paul Cutbush, SVP cat management, Aon Benfield, outlined the tension in “the concept of experience versus exposure,” where per-peril pricing is a backward-looking exercise and where exposure pricing using third-party exposure tools means “trying to look behind and forward at the same time,” Pauline Harrison, head of corporate underwriting strategy, RBC Insurance, observed (as did others) that though the bank is now focused on using location analytics in fraud management, “it’s still early days for predictive analytics.” For his part, Sebastien Vachon, VP of insurance solutions & actuarial, Desjardins General Insurance, described his biggest challenge as maintaining Big Data analytics once this has been rolled out through the organization; Johannes Tekle, VP of enterprise risk management, Home Trust Company, noted that though his group may understand their potential exposure, “we don’t necessarily have the severity factor so well figured out”; Javier Brailovsky, directeur principal, National Bank Financial, noted issues with the quality and completeness of the information; and David Bradshaw, VP of client business support, Tangerine, pointed to issues with aggregating data into a digestible form that can support client discussion in a meaningful way.

To conclude, session attendees posed a number of specific questions on capabilities within the DMTI Location Hub platform, demonstrating the sophistication of their understanding of industry problems and offering insight into the business and process issues that users of location intelligence are looking to address. This roundtable engagement on challenges and capabilities was in many ways the most valuable part of the session, as the sharing of experiences can serve to educate not only peers but vendors as well on what feature sets should be in place to drive business and system alignment. In the insurance industry in particular, which is itself increasingly exposed to catastrophic loss and margin squeeze, this exercise is becoming critical. As Daleman explained: “technology is changing very rapidly, and if you don’t take advantage of it, the competition very quickly will. The independent brokers out there working in larger networks need every tool they can use to gain advantage because the big directs and some of the larger institutions are applying pressure on them and they need to up their game to compete on a level playing field. Analytics can do that quite easily.”
