New V2V research explores Value Stream Mapping – a cutting-edge approach to data management

Value Stream Mapping: Optimizing Data Processes examines value-added approaches and DataOps

The Vision to Value (V2V): The Economics of Data best practice community has released its most recent report, which delivers insight into the application of value stream mapping – a management technique taken from the Lean methodology, focused on reducing waste and optimizing efficiency in ‘value’ areas – to data management processes.

Value Stream Mapping: Optimizing Data Processes, the second in a four-report V2V cycle dedicated to “analytics and process change,” was developed by InsightaaS principals Mary Allen and Michael O’Neil with input from 16 leading professionals, including managers from public and private sector organizations looking to advance the state of analytics within their organizations, academics, and product and service providers who work with clients to drive accelerated benefits from analytics initiatives. It was launched on July 11th at a high-energy Meetup, hosted by V2V founding sponsor Information Builders, that featured contributions from many of the report’s contributors.

About the report

The Value Stream Mapping: Optimizing Data Processes report has four major sections. The first, “Value Stream Mapping: key attributes,” provides a definition of VSM as a means of improving data management and, importantly, discusses how the concept is currently applied in different contexts. The two research calls that were used to collect input for the report, and the Meetup which contributed additional insights, all indicated that while the term Value Stream Mapping is not yet common in the analytics world, the concept resonates with many advanced analytics practitioners. Most of the working group members identified VSM as having two key characteristics: a strong focus on the ultimate business outcome (the ‘value add’ referenced above) and a continuous commitment to streamlining the steps needed to use data to achieve these outcomes. In several cases, discussion of the approach highlighted emerging IT management areas – DevOps, DevSecOps and DataOps – that take a holistic view of domains to reduce overall cycle time and improve understanding of adjacencies and opportunities to derive additional benefit from ongoing activities. The Meetup in particular also homed in on VSM’s value as a communication framework that can help identify specific points of value associated with a particular ‘map’ and provide all stakeholders – executives, business unit management and staff, and IT and analytics professionals – with a common understanding of targets and timeframes.
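The two characteristics above – focus on business outcome, plus continuous streamlining of steps – can be sketched in code. The following is a minimal, illustrative model of a value stream map for a data process; all step names and figures are hypothetical, not drawn from the report:

```python
# A minimal sketch of a value stream map for a data process.
# Each step records its cycle time and whether it adds business value,
# giving all stakeholders one shared view of lead time and waste.
# Step names and day counts are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Step:
    name: str
    cycle_days: float
    value_add: bool  # does this step advance the business outcome?

stream = [
    Step("ingest source data", 2, True),
    Step("wait for manual approval", 10, False),
    Step("cleanse and conform", 3, True),
    Step("rework failed loads", 5, False),
    Step("publish to dashboard", 1, True),
]

total = sum(s.cycle_days for s in stream)
value_add = sum(s.cycle_days for s in stream if s.value_add)
print(f"lead time: {total} days, value-add ratio: {value_add / total:.0%}")
# prints: lead time: 21 days, value-add ratio: 29%
```

A map like this makes the ‘waste’ steps (waiting, rework) explicit, which is where streamlining effort would be directed.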

The second section of the report, “Business objectives driving VSM interest and investment,” contains some of the richest content in the document. While the initial hypothesis was that process improvement would be the main motivation for VSM adoption, analysis of contributor input identified two additional sources of VSM value: ‘top down’ enterprise use and activities, and ‘bottom up’ business unit use and activities. Each of these three major categories was divided into two focus areas, and examples from the research were plotted within a resulting positioning framework. Report readers can use this section of the report to tie VSM to a wide range of organizational objectives.

The “Best Practices” section of the report is a continuation of the section on business objectives, as the previous chapter ties investment scenarios to the activities used to obtain needed benefits. The best practices material identifies a number of factors – master data management (MDM), the importance of developing a business glossary that articulates data definitions in terms that are meaningful to business, IT and analytics professionals, and process imperatives such as working across silos, dedicating resources to performance measurement and embracing a holistic ‘system’ view of goals and associated process streams – that should be considered in VSM initiatives. This section closes with the observation that “VSM is a relatively new approach to data optimization and management; early adopters will need to combine concrete suggestions (such as the emphasis on MDM) with anecdotal observations as they assemble an approach that is appropriate in their business contexts.”

Value Stream Mapping: Optimizing Data Processes closes with a section entitled “Five Dimensions of DataOps.” While most InsightaaS community best practices reports close with analysis of metrics and milestones, the research found that DataOps can be seen as a key objective at the end of the VSM process, and so the working groups developed a view of how multiple contributors to VSM – ‘the business’, analytics professionals, communications/legal/compliance, IT and the platform technology itself – should come together to build DataOps processes. This section is likely to act as a springboard to additional research – please stay tuned!

Launch event

Value Stream Mapping: Optimizing Data Processes launched on July 11th at a meetup hosted by Information Builders at its Canadian headquarters in Toronto. The first part of the meetup was given over to an executive Q&A, in which Information Builders’ Canadian business lead, John Ramoutsakis, quizzed Abdur Khan, a VSM expert active with energy firm ShawCor, on how VSM is applied in an industrial context. The event then moved to a dynamic panel discussion, moderated by InsightaaS CCO Mary Allen and including report contributors Kavita Khera (Ontario Ministry of Labour), Ashraf Ghonaim (City of Toronto), John Morris (Data Decisioning LLC) and Varinder Sembhi (Xodiac Inc.).

The panel discussion followed the outline of the report, and added observations that helped the Meetup attendees to understand and relate to key VSM issues. Ashraf Ghonaim got the first section rolling by noting that VSM “helps the whole organization to understand and appreciate” all of the steps in a data process. “This is a very good communication tool – we can start facilitating discussions around the value of each [process] step. The whole concept of Lean process exists also in data processes. Data moves from one stage to the other: it needs to be articulated to the organization ‘what is the value of moving the data from one step to another, and what is the value add?’”

Kavita Khera stepped through the application of the concept to the challenges that she faces in her workplace, using the example of an unmet service standard that was improved by:

  • “Looking at what our commitment to the customer was, and then at our processes and data”
  • Looking at the stages involved in delivery, and the bottlenecks in the process
  • Building a map that connects analytics to the business challenge – which resulted in understanding that the areas where the Ministry was planning to apply additional resources would not fix the problem
  • And ultimately, reducing cycle time by more than 40%, from more than 150 days to about 90 days.
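The cycle-time claim in the final step above is easy to verify with back-of-envelope arithmetic (the figures are the approximate ones quoted, not exact Ministry data):

```python
# Checking the quoted cycle-time improvement: from roughly 150 days
# to roughly 90 days. Figures are the approximate ones cited above.
before, after = 150, 90
reduction = (before - after) / before
print(f"{reduction:.0%} reduction")  # prints: 40% reduction
# Since the starting point was "more than 150 days", the actual
# reduction exceeds 40%, consistent with the claim.
```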

The business objectives discussion built from these points. Varinder Sembhi expanded on the issues related to data moving through stages, noting that quality often degrades through transitions: “data is like a tomato – the more you touch it, the less you want to consume it.” The group then delved into some of the challenges associated with data management; after discussing issues associated with signal overload and alarm fatigue, John Morris pointed out that “it’s really easy to spray data at people… [but] data is for decisions!”

When the conversation turned to best practices, all of the panelists highlighted the importance of MDM in developing effective data management practices: Khera pointed to MDM’s ability to overcome challenges associated with the creation and maintenance of golden records that extend across business unit silos, Ghonaim spoke to the need for consistent data to support performance measurement, and Sembhi linked MDM and data glossaries: “a lot of dashboards,” he said, don’t provide anticipated value “because there is no common business glossary – people aren’t able to translate all the tables and columns” into terms that are relevant in their business context.
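Sembhi’s point about translating “tables and columns” can be illustrated with a simple glossary lookup layer. The sketch below is purely hypothetical – the table, column and term names are invented to show the shape of the idea:

```python
# A minimal sketch of a business glossary: a lookup from physical
# table.column identifiers to business-friendly terms and definitions.
# All names and definitions here are hypothetical examples.

glossary = {
    "cust_mstr.cst_id": (
        "Customer ID",
        "golden-record identifier for a customer across business units",
    ),
    "ord_fct.ord_amt_cad": (
        "Order Amount (CAD)",
        "total order value in Canadian dollars, before tax",
    ),
}

def translate(column: str) -> str:
    """Return a business-readable label for a physical column name."""
    term, definition = glossary.get(column, (column, "no glossary entry"))
    return f"{term}: {definition}"

print(translate("ord_fct.ord_amt_cad"))
# prints: Order Amount (CAD): total order value in Canadian dollars, before tax
```

In practice a glossary like this would be maintained alongside the MDM golden records, so that dashboard fields resolve to terms every stakeholder recognizes.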

With this level of concurrence on the importance of MDM, Meetup attendees might have been left wondering, ‘why isn’t MDM already in use everywhere?’ Morris articulated one key reason, noting that “MDM is infrastructure, and there’s never a business case for infrastructure…infrastructure is only funded at the board level, or at the senior executive level. You can’t make a business case for MDM, and yet MDM is the gating factor” in VSM/data management success.

The session closed with a forward-looking view of the role VSM is likely to play. Ghonaim divided the essential participants in VSM into two groups:

  • “Those who build and sustain and maintain the pipeline” – front end designers, and people involved in data science, data warehouse development, machine learning and other related activities – “facilitating the process of capturing data to consumption.”
  • “The business silos.” This includes, importantly, the people who enter data into systems – “I have seen, in many organizations, that there was a huge disconnect between how the system was designed, and how the front-end people actually use it.” Other key participants include “super-users who have the ability to manipulate the data” (how do they use their privileges to alter data?), and supervisors and middle management all the way up to senior executives, to understand how data is being used.

Morris highlighted the importance of integration, observing that VSM “is the activity that the business analyst will do, to turn great data into actionable decisions…they’re the pivot point,” while the senior executive ties data outcomes to broader corporate objectives and processes, which reinforces the need for Value Stream Mapping. Sembhi homed in on the connections between the business domain and the data domain – the criticality of connecting infrastructure and a data management platform with “the final consumer – the customer – the people who are looking for and using the data.” And Khera provided the ‘last word’ in the discussion, tying the Meetup commentary together by stating that “demonstrating the value – cost savings, return on investment, whatever key metric we may choose – automating and working through the value stream…has helped us, and I want to do more and more of it.”

Obtaining the report

Value Stream Mapping: Optimizing Data Processes is available at no charge to professionals who would like to understand how to build an evidence-based culture within their organizations. Please follow this link to reach the registration form.
