Bringing Data and Process Together

[Interview from 2014]

In PEX Network’s latest survey on technology investment plans of process professionals, Big Data and analytics has emerged as the number one investment area in the year ahead. How do data and process fit together?

Rob Speck, Vice President, Global Services, at BPM software provider K2 weighs in.

http://www.processexcellencenetwork.com/business-process-management-bpm/interviews/bringing-data-and-process-together-the-future-of-p

Process Performance, Complexity and How I Learned to Stop Hating Traffic Lights

Measurement and Action – how do we improve performance through a cycle of measurement and action?

Measuring the performance of operations is one of the most challenging of all management disciplines. Financial performance tends to draw the most attention, with investors and management having stakes that depend on the cash flow, income statement and balance sheet results each and every measurable period. The quarter-ending release of financials dominates most business news cycles, and companies commit a flurry of resources and activity to try to get numbers to align with pre-set goals, objectives and related expectations. Sales teams are pounded hourly to get POs in before the quarter's deadline. Back office operations work frenetically trying to meet reporting deadlines and get all accounts reconciled in time. And of course, executives prepare summary statements, conference interviews and advance guidance statements to investors in an effort to set a level of confidence in the direction of the company and how the broader market is impacting its results.

But what I've always found troubling with this nearly universal cycle of chaos is that the financials being measured represent just a small picture of the overall health of the business. While it's an important part, it's by no means even the majority of what investors, board members, partners and employees should be concerned with. There are so many other factors that need to be measured, trended, compared and ultimately weighed into the analysis and assessment of each organization.

A Business Services Example

Recently, I was working with a Shared Services division of a global corporation.  This business services division manages all operational financial services including payroll, accounts payable, and receivables as well as human resource related operations.  To give you an idea of how complex their existing processes are, consider these conditions:

  • Many processes are "black box" in nature, managed by two separate third-party global services firms. The company does not know or see what the third parties do, only the results.
  • Other processes are managed by the shared services division which serves many businesses, but not all.
  • The company had recently acquired another set of businesses worth several billion USD, and those organizations would have to be integrated into the shared services division as well as the outsourced third-party operations.

The challenge for this organization was to consolidate processes where possible and ensure all processes were designed in a way that the varying participants could understand. As any of you who work in process management and process improvement know, just getting a common framework for communication is a major challenge.

Documentation existed everywhere, across all parts of all entities. But every bit of the process documentation was disjointed; written sometimes in Microsoft Word or Visio or PowerPoint, embedded in SAP documentation, or even in printed notebooks that no one could find the electronic versions for. "OMG!", my teenage daughter would say. "A complete mess", my client would admit. Sound familiar? This is a common condition that I encounter at nearly all of my clients. But that wasn't the only problem. Even if we can get all process definition content in a single place and in a single language that everyone can agree on, how do we manage it on an ongoing basis? Further, how do we know it's right? Is it really what people are doing? Or just what they say they should be doing? And finally, how can we start measuring processes, ultimately holding people accountable at the process level for measurable results? These are big challenges for even small organizations, let alone an organization that operates dozens of businesses across dozens of countries. To solve complex issues, it's often easiest to compartmentalize them and solve them one at a time. The key challenges with this scenario include:

  1. Complexity of Process
  2. Understanding Accountability
  3. Sustainability of Content
  4. Compliance with Process Standards
  5. Sustainability of Performance
  6. Defining Measures that Align with Process Design

As we break down these topics, one thing that becomes apparent is that the last category, "defining measures…", depends on most of the items above. While the organization did have performance measures in place, there was very little accountability, mostly because it was very difficult to know which roles and individuals really contributed to the factors behind the final figure. To understand what I mean by this statement, consider the scenario of Time and Expenses.

Time and Expenses is a common process in most organizations, where employees record their time and submit expenses for reimbursement. A key measure that was tracked was the percentage of T&E submissions paid on time. These statistics were tracked and summarized monthly and reported through a sophisticated business intelligence system. This was one of dozens of metrics. Who is responsible for ensuring T&E submissions are paid on time? When the percentage of on-time payments in July dropped below 90%, this was an unacceptable "red-flag" alert. Why did this happen during this month, and who can ensure it is corrected? These may seem like pretty straightforward challenges, with the solution being that the organization just needs to be structured such that T&E has a single process owner and all participants in the process are managed under that owner. Right? HA! No way. It's far more complex with this global organization. To start with, we have five separate high-level process steps:

  1. Set Policy
  2. Arrange T&E Information
  3. Submit T&E Form
  4. Process T&E
  5. Pay Submitter

Each of these steps is managed separately: the policy (#1) is owned by a governance board with input from Audit; steps #2 and #3 are owned by the individual submitter; #4 depends on the region and the part of the organization; and #5 is handled by a third-party outsourced organization that, again, varies depending on the submitter's organization and region.

So, how the heck do you know where the process is breaking down and why some submittals are paid beyond the required deadline? I won't go into the full analytics and forensics involved in identifying the "choke" points, but suffice it to say that a small minority of data points were throwing the average way out of range. It wasn't that the entire process was broken; rather, under certain circumstances payments were taking two to three times the allotted timeframe.
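To make the choke-point idea concrete, here is a minimal sketch, in Python, of how a headline on-time percentage can mask a small set of problem cases until you segment the data by region and handling organization. The record fields, the 10-day target and all of the figures are illustrative assumptions on my part, not my client's actual data or reporting system.

```python
from collections import defaultdict

# Hypothetical T&E submission records: (region, handling_org, days_to_pay).
# The 10-day target and all figures below are illustrative assumptions.
ON_TIME_DAYS = 10

submissions = [
    ("EMEA", "shared_services", 7),
    ("EMEA", "shared_services", 9),
    ("AMER", "shared_services", 6),
    ("APAC", "outsourcer_a", 8),
    ("APAC", "outsourcer_b", 24),  # a small set of cases taking 2-3x the target
    ("APAC", "outsourcer_b", 27),
]

def on_time_rate(records):
    """Share of submissions paid within the target window."""
    return sum(1 for *_, days in records if days <= ON_TIME_DAYS) / len(records)

# The headline KPI alone hides where the delay actually occurs...
print(f"Overall on-time rate: {on_time_rate(submissions):.0%}")

# ...so segment by region and handling organization to surface the choke point.
by_segment = defaultdict(list)
for region, org, days in submissions:
    by_segment[(region, org)].append((region, org, days))

for (region, org), records in sorted(by_segment.items()):
    print(f"{region}/{org}: {on_time_rate(records):.0%} on time ({len(records)} submissions)")
```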

At the heart of successful process management and performance management is a platform for designing, capturing, maintaining and refining process definition.  To perform this exact analysis and ultimately define measurements that can be actively managed, organizations must fully document and manage processes in a common visual framework that clearly defines ownership, accountability and associations with each related process activity.

So, the next time you review a reporting cycle's "performance" data for an organization, take a step back and ask the following:

  1. How well does the organization understand their own processes and those of their dependent outsourced and supply chain partners?
  2. How actively are those processes managed?  Meaning, how often are they reviewed, improved, updated?
  3. How adept is the organization at responding to drastic market shifts?

Managing process information and treating it as a highly valued asset is the mindset that must exist at the heart of nimble and forward-looking enterprises. Without such rigor in business process management, organizations pose a high degree of risk to investors and to their own overall health. Immobile organizations are more susceptible to rapid market shifts and less able to innovate where necessary. As I will explore in later postings, sustainability of process content is what separates highly agile organizations from laggards.

Arms Around Complexity

So, how did my client get their arms around the complexity of process documentation they were confronted with? They took a number of steps, including having all existing process content within the business services purview converted from static Visio files into Nimbus' BPM platform, Control. Further, a BPM strategy was developed that included a process improvement methodology, process sustainability using Nimbus, and collaboration on process information across the global organization.

Globally, the organization has invested heavily in a program to drive continuous excellence methods throughout its wide scope of businesses. This is a massive undertaking given the number of businesses and the number of countries in which they operate. A core component of the continuous excellence (CE) program is cultural, with some degree of best practice standards, reporting and auditing of implementation. Another key element that falls under continuous excellence is the quality management system. This "system" is not an IT system, but rather another set of methods and standards that includes reporting and auditing to ensure implementation.

It's most impressive to see how mature and visionary the executive team has been, fully committing to an enterprise emphasis on quality and continuous process improvement. But even with the executive vision, the level of complexity makes the challenge a tough one. At the core of the objectives that include quality, continuous excellence, process improvement, performance management, and compliance management is one common denominator: PROCESS. Understanding process activities enables the core elements of accountability, sustainability, and agility.

Associating KPIs with Process Activities and Owners

At a local level, this Business Services division developed a vision for process improvement that included the same core capabilities envisioned by the global Continuous Excellence program. Their objective was to actively manage Key Performance Indicators (KPIs), not just report on them. Once their processes were established, KPIs were attached at the appropriate process level, and process ownership now meant not only ownership of the process definition but also ownership of that exact performance metric. These relationships, established on the process software platform, make it possible to understand performance in a far more meaningful and accountable way. No longer is a metric just a number made up of lots of calculations with no clear method of identifying the process failure. With KPIs associated with key process areas, every element that feeds an indicator and the owner of each activity within that process area is easily identified.
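To illustrate what "ownership of the process definition and ownership of the metric" might look like in data terms, here is a minimal sketch of a KPI attached to a process activity hierarchy with named owners. The class names, the simplified T&E hierarchy and the 90% target are hypothetical simplifications of the idea, not the Nimbus Control data model.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessActivity:
    name: str
    owner: str                      # accountable role or organization
    children: list = field(default_factory=list)

@dataclass
class KPI:
    name: str
    target: float
    attached_to: ProcessActivity    # the process level at which the metric is owned

# Hypothetical hierarchy mirroring the T&E steps discussed earlier
process_te = ProcessActivity("Process T&E", owner="Regional Operations Lead")
pay = ProcessActivity("Pay Submitter", owner="Outsourced AP Provider")
te = ProcessActivity("Travel & Expenses", owner="Business Services",
                     children=[process_te, pay])

on_time_kpi = KPI("T&E paid on time (%)", target=0.90, attached_to=te)

def accountable_owners(kpi):
    """Walk the process tree beneath a KPI to list every owner feeding the metric."""
    stack, owners = [kpi.attached_to], []
    while stack:
        node = stack.pop()
        owners.append((node.name, node.owner))
        stack.extend(node.children)
    return owners

for activity, owner in accountable_owners(on_time_kpi):
    print(f"{on_time_kpi.name} <- {activity}: {owner}")
```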

Measurement Triggers Action

Great.  We now can understand the KPI in a way that clearly identifies where the process is failing and who is responsible.  So, what do we do next?  Send an email?  Set up a meeting?  Paint the wall red?  Don’t tell me that isn’t what’s going on in most organizations; it absolutely is.  What do you do to manage dozens of KPIs and dozens of alerts on performance that are “out of range”?  How can key management have consistent visibility into the state of action that is taking place on each of these issues?  Yup, you guessed it, this is a teaser for a following post.  Later, I’ll highlight how this is being done and how the full cycle of process improvement is effectively managed through your process management platform.
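As a rough illustration of measurement triggering action rather than just another email, the sketch below turns an out-of-range KPI into a tracked action item assigned to the process owner identified through the process model. The threshold check, field names and July figure are assumptions for illustration only; how this is done end to end is the subject of that later post.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    kpi: str
    actual: float
    target: float
    owner: str        # process owner identified through the process model
    status: str = "open"

def check_kpi(name, actual, target, owner):
    """Create a tracked action item when a KPI falls below its target."""
    if actual < target:
        return Alert(kpi=name, actual=actual, target=target, owner=owner)
    return None

# Illustrative figure echoing the July T&E example above
alert = check_kpi("T&E paid on time (%)", actual=0.86, target=0.90,
                  owner="Outsourced AP Provider")
if alert:
    print(f"ACTION: {alert.kpi} at {alert.actual:.0%} (target {alert.target:.0%}) "
          f"-> assigned to {alert.owner}, status={alert.status}")
```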

Addendum

Note that there are many ways to approach process improvement and performance management; I'm not proselytizing for any specific method such as Six Sigma, Lean, Kaizen or variations on quality management programs.

Make It Work

I'm old enough to have been on the leading edge of personal computing technology. At eighteen years old, I sold my beloved drum kit to buy one of the first IBM PC computers; no hard drive, just two floppy drives, one for the program and one for data. I was obsessed with specifications and design, reading every computer magazine I could find, comparing the Commodore 64 to the Kaypro II or maybe the TRS-80. These early machines had big differences, and like the early days of the automobile, you had to be part engineer to know what you were looking at. For the next twenty-plus years, while consolidation occurred and the PC/Windows platform dominated, hardware specifications were always a big factor when choosing a computer. Clock speed, chip type, drive capacity, cache, video card, etc., all became part of the lexicon for computer shoppers. But this has changed in a big way in the past few years; and good riddance. MG Siegler's recent post on TechCrunch clearly articulates this change in the computer market. Reading his post brought me back to when I first adopted portable digital music technology, using MP3 and varying devices.

The iPod Experience

The iPod had been released, but I was still very much a PC guy and I just wanted a small player that fit in my pocket and cost no more than about $150. I bought a product from a company called Digital River which used Microsoft Media Player, or some earlier version of it, and had loads of problems. Oftentimes songs didn't sync or were lost. As a user, I found the method of ripping music, loading it onto the player and managing my library very confusing, and it simply did not work as I would expect. Ultimately, my user experience was poor, and after several other attempts at using other Microsoft-platform products, I eventually ended up with an iPod. Once I had that first iPod, it was clear to me that the platform worked well and my experience was dramatically improved. What Apple understood was how to approach the product from the end user's perspective and make it easy and pleasurable. The design work that went into the iPod was not only elegant and simple; the entire ecosystem, including iTunes, was also a vast improvement over the alternatives. Quite simply, it worked. And it worked significantly better than anything out there.

Now, as I'm working with software technology within Life Sciences companies, I'm seeing similar trends in approach. No longer is it as important to focus on and flaunt incremental improvements in specifications as it is to understand and address end-user experiences. Product design is not only about the specifications but about how end users respond to the product itself. Last week I spent time meeting with a medical device manufacturer, discussing my company's software product, but we took a break at one point. Members of one of the product teams were testing a product in our conference room, and one of our team members was asked to try it and provide feedback from the user perspective. He was able to give the product team advice on what he was experiencing and how it was working for him. Ultimately, the product's success in the market will be determined more by how the user reacts to using the product than by the specifications listed on the label.

Talking Cars

Mr. Daniel R. Matlis has again written an apropos article entitled "Is That Car a Medical Device?". Here, he interviews Robert B. McCray, President and CEO of the Wireless Life Sciences Alliance (WLSA). It's a fascinating view into what is happening on the forefront of medical device development and how wireless technology is improving patients' health management. The key point with these technologies is that it is not just the capabilities the devices can deliver, but how the patient experiences their use, that will determine their success. Mr. Matlis notes, "A great example of the use of convergence [medical technology, connectivity and consumer devices] to support this challenge is Ford's In-Car Health and Wellness Solutions. Researchers at Ford, in partnership with Medtronic and WellDoc, have developed a series of in-car health and wellness apps and services aimed at monitoring people with chronic illnesses or medical disorders so they can manage their condition while on the go." Several major advancements are at work with these developments: faster delivery of information to patients, improved monitoring of their conditions and, perhaps most importantly, a reduced need for diagnoses and treatments thanks to real-time monitoring. In turn, it's the experience of patients that will drive adoption of these technologies.

The Internet of Things

The expression "The Internet of Things" sounds kind of goofy – a bit like the title of a children's book. The phrase is commonly used to describe the connectivity of a wide variety of devices to the web. Whether through 3G, 4G, or Wi-Fi, varying appliances, medical devices, automobiles, you name it, are quickly coming online. They are sending status messages ("events", in TIBCO terms) to the web. For example, in 2011, I purchased an all-electric vehicle, the Nissan Leaf. One of the features of the car is that it sends data wirelessly about the current state of the battery. Using an iPhone app, I can get full details on the battery's charge status, and I can opt to get text messages or email notifications when certain events occur or thresholds are crossed. This capability adds tremendous value to my experience, as I feel in greater control over the state of battery life remaining, which is the biggest concern when owning this type of car. Similarly, medical devices such as glucose monitors can be enabled to send data directly to the web, alerting patients via text or email, or even having their car speak to them, if and when specific conditions exist. Now, I'm not certain of the exact capabilities that WLSA, Ford or others will bring to market, but one thing is certain: the customer experience will determine how well the product is accepted and demanded in the marketplace.
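For a sense of how such event-and-threshold notifications could work, here is a minimal sketch of device status events evaluated against user-set rules. The event fields, thresholds and notification stub are my own illustrative assumptions, not Nissan's or TIBCO's actual interfaces.

```python
# Minimal sketch of device-to-web "events" evaluated against user-set rules.
# The event fields, thresholds and notify() stub are illustrative assumptions,
# not the actual Nissan or TIBCO interfaces.

def notify(channel, message):
    print(f"[{channel}] {message}")   # stand-in for SMS or email delivery

user_rules = [
    {"field": "battery_pct", "below": 20, "channel": "sms",
     "message": "Battery below 20% - plan to charge soon."},
    {"field": "battery_pct", "below": 10, "channel": "email",
     "message": "Battery critically low."},
]

def handle_event(event):
    """Check each user-defined rule against an incoming device status event."""
    for rule in user_rules:
        value = event.get(rule["field"])
        if value is not None and value < rule["below"]:
            notify(rule["channel"], rule["message"])

# Example stream of status events sent by the car
for event in [{"battery_pct": 45}, {"battery_pct": 18}, {"battery_pct": 9}]:
    handle_event(event)
```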

Making It Work with All That Data

When I wrote this title, I couldn't help hearing the voice of Tim Gunn from the show "Project Runway" uttering his famous tag line. At the end of the day, organizations that are bringing products to market are seeing a massive opportunity coupled with a massive challenge. The convergence of medical devices, consumer devices, and ubiquitous connectivity brings enormous potential, and with it the challenge of handling a tidal wave of event data sent continuously by a massive number of devices. How should that data be managed? Again, taking the customer perspective, the end user wants this data to be monitored and reported on immediately for specific conditions. Especially when we're talking about medical devices that monitor critical patient conditions, the response must be fast. Patients cannot have the data sent to some massive data repository, stored and then queried periodically. That method of data storage and search is quickly becoming antiquated.

21st Century Architecture

Capabilities exist within TIBCO's platform to perform what is referred to as event pattern matching, or complex event processing (CEP). These capabilities allow organizations to look for patterns within sets of events that may indicate specific conditions. Within the Life Sciences sector, the movement toward creating greater value and usability with connected devices is growing, and the software platform that supports these capabilities is critical to realizing that value. In order for medical device companies to provide near real-time status to a patient, the system must utilize software that can not only capture the data but analyze it as it is received and trigger actions based on user-driven rules. For instance, using the glucose monitor example, every patient may have specific requirements that they want to personally set for when and how they are notified by the device. Providing a platform that allows the user to easily set those rules and modify them whenever they choose enables a positive user experience. For insurers, patients who are able to personally manage their condition will require less professional consultation and thus reduce the overall cost of care.
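To show the flavor of acting on readings as they arrive rather than storing and querying them later, here is a generic CEP-style sketch: a patient-configurable rule that watches a sliding window of glucose readings and alerts when several consecutive readings fall below a threshold. The thresholds, window size and alert stub are assumptions for illustration; this is not TIBCO's actual API.

```python
from collections import deque

# Generic CEP-style sketch: act on readings as they arrive instead of storing
# them and querying later. Thresholds, window size and the alert() stub are
# illustrative assumptions, not TIBCO's actual APIs.

class GlucoseRule:
    def __init__(self, low=70, consecutive=3, channel="sms"):
        # Patient-configurable: what counts as low, how many readings in a row,
        # and how the patient wants to be notified.
        self.low, self.channel = low, channel
        self.window = deque(maxlen=consecutive)

    def on_reading(self, mg_dl):
        self.window.append(mg_dl)
        if len(self.window) == self.window.maxlen and all(r < self.low for r in self.window):
            self.alert(f"{self.window.maxlen} consecutive readings below {self.low} mg/dL")

    def alert(self, message):
        print(f"[{self.channel}] {message}")   # stand-in for the notification service

rule = GlucoseRule(low=70, consecutive=3, channel="sms")
for reading in [82, 69, 66, 64, 90]:   # fires once three consecutive low readings occur
    rule.on_reading(reading)
```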

As Life Sciences and other sectors fully adopt 21st century information architecture, I see an evolving process management paradigm which in turn brings us new customer experiences.  As any good Six Sigma expert would tell you, the customer experience drives the requirements for quality.  Make it work for the customer experience and then you know you’re on to something.

Creating Your Own Reality

On my white board sits a list of topics that are near and dear to my heart; topics that I think about often and want to espouse, pontificate and illuminate.  Most often, I think I have original ideas on these subjects and while I don’t feel I have the time to get it all out at once, I keep this list with the intention of banging them out slowly – one by one.  And almost without fail, in my regular reading or research, I’ll come upon an article or book on one of these topics and then suddenly, like a bolt of revelation, someone’s beaten me to the punch; made the key insights that I thought were my domain.

The Surety of Fools

One such happening occurred as I perused the New York Times Magazine: a gentleman by the name of Daniel Kahneman had written an article entitled "The Surety of Fools", an adaptation from his book "Thinking, Fast and Slow". He hit on a key observation that is at the core of what I've been writing about over these last few months: misperceptions of risk. I won't rehash the whole article, but in essence, Mr. Kahneman points out how we often hypothesize based on logic, but when empirical evidence belies our theories, we simply don't believe the facts. He calls this phenomenon the "illusion of validity." I love this premise, as I see it so often with investment managers, news reporters, mortgage brokers, sports team coaches, politicians, voters and prognosticators in general – they all create their own reality.

Creating Your Own Reality

We ALL do it to some degree. We watch the news channel that validates our set biases. We befriend people who support and validate our opinions and views. On the topic of investment risk, operational risk and risk in general, how does that phenomenon play out? Do we see the facts, and are we able to evaluate data without bias? Mr. Kahneman illustrates the reality of investment bias with examples from studying investment managers and how their performance is measured. The vast majority of investment managers he studied did not perform better than a purely random pick of stocks. Yet the illusion of validity causes the management of the largest investment firms to pay bonuses and commissions to those managers as if they were keenly skilled, as if the fund managers had brought tremendous value to their clients' interests. They create their own reality, instead of accepting that the unbiased data shows no value in their management of investment assets.

Life Sciences' High Stakes

There are even greater risk examples. Life Sciences companies such as pharmaceutical, biotechnology and medical device firms have huge investments and pressures to produce new products. Each development stage requires rigorous testing and massive volumes of data. While the FDA enforces regulations and these companies are regularly audited both internally and externally, the pressure to produce is high. Time is of the essence when it comes to bringing a new drug to market, both for the sake of patients and for profits. How well is the data reviewed and scrutinized before passing each validation stage? Is there a bias that errs on the side of validation ahead of rejection? Absolutely. Kahneman's illusion of validity is at play, and the consequences are immense.

The Supply Chain Fog

For Life Sciences companies the risks involve patient health as well as immense risks to the company itself, including product recalls, regulatory findings, lawsuits, and ultimately, reputation damage. The organizations I've worked with over these last few years are extremely diligent in their processes and methods for R&D, trials, manufacturing and distribution. But other operational risks do exist. In a post last year entitled "Life Science Executives Concerned about Outsourcing and Globalization Unintended Consequences", Daniel R. Matlis notes, "In the drive to lower costs, manufacturing and sourcing of ingredients and components in countries such as China and India are playing a more prominent role. Yet, according to the research, outsourcing to manufacturers in developing economies carries significant operational risks. Industry Executives surveyed for the research said that Raw Materials sourced outside the US represented the greatest risk to the Value Chain, with 94% of those who responded seeing it as a significant or moderate risk. When comparing the risk profile of US vs. foreign raw material Suppliers, United States Suppliers were classified as low risk nearly 10 times as often as foreign Suppliers." The need for any Life Science company to define, monitor and track each and every one of its third-party providers adds a level of complexity and difficulty. This difficulty stems from what consultants at Nimbus have labeled the "fog of process accountability, control and oversight."

To be certain, this fog exists to some degree everywhere, and obviously even more so with supply chain partners, but how well an organization creates clarity of process definition and clarity of quality, both from within and beyond the enterprise, is critical when managing operational risk. Perhaps the biggest concern I have with the phenomenon of "creating your own reality" is that the "fog of accountability" provides a condition for pushing forward: an excuse for not accepting what the data is revealing, and a scenario wherein doubt can always be cast on outliers.

Focus on the Facts

I spent part of last week with a biotechnology firm's scientific directors, their CIO and colleagues from TIBCO, briefing them on my company's software technologies and how they apply to the wide variety of process areas they represent. The volume of data, and the complexity of that data as it applies to their product trials, is tremendous. Next week I'm with a medical device company that's in the process of a major transformation and will need to address nearly every operational area as part of a corporate spinoff. These are just a couple of quick snapshots, but they epitomize the speed with which organizations change, adapt, and grow. Speed and volume are only increasing, further escalating the demands for validation of each initiative.

I can only hope that Mr. Kahneman's "illusion of validity" is tempered when organizations manage operational risk and the key decisions that drive product development. The stakes are indeed high when it comes to Life Sciences, but every industry is predisposed to this condition. In short, we can never be too sure. Let's not fall too in love with our own marketing slogans. Let's understand the complexity that we're faced with, make our best, valid judgments and do the best we can with the facts we have. While there is never purity in our judgments, we can at least try to be aware of the propensity to fulfill objectives by remaining blind to the facts.