Healthcare processes are complex, requiring a tool that can improve the patient experience, avoid compliance risks, and increase efficiency. Process mining, a next-generation solution, is that tool.
Hospitals are continually struggling to meet growing patient demand, driven by population growth and the rise of chronic diseases associated with modern life. The result is longer waiting times for new patient admissions, greater difficulty maintaining planned admissions, poorer patient experiences, and a reduced quality of care.
At every level, healthcare institutions face increasing pressure to manage income, optimise the patient journey, and reduce costs across their care continuum. Essentially, they are being asked to do more with less, amid other challenges including the COVID-19 pandemic.
While hospital management boards are looking for ways to solve these issues, they often lack the means to do so. Implementing changes without factual evidence leaves a gap in understanding of the size of the problem, its impact, and the effect of their decisions. That, coupled with the lack of immediate feedback on implemented changes, leaves problem areas in the processes unaddressed, compounding the struggle hospitals face.
How does process mining actually work?
Process mining applies to a wide range of systems. The only requirement is that the system produces ‘event logs’ recording at least part of the real execution of the process. These event logs hold data on a defined process (e.g. a lab test), with each recorded event related to a specific case (e.g. a patient).
Additional information, such as the executor of the event (e.g. the doctor requesting the test) or data recorded along with the event (e.g. the time the sample was taken), may also be stored. Based on these event logs, the goal of process mining is to extract information on the process in order to discover, monitor, and improve real processes.
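To illustrate, a minimal event log for a lab-testing process might look like the sketch below (Python with pandas). The column names and records are invented for illustration only and will differ between real systems.

```python
import pandas as pd

# Hypothetical event log for a lab-testing process.
# Each row is one event: something that happened to a specific case,
# at a specific time, performed by a specific resource.
event_log = pd.DataFrame(
    [
        ("patient-001", "Test requested",  "2021-03-01 08:15", "Dr Smith"),
        ("patient-001", "Sample taken",    "2021-03-01 09:02", "Nurse Lee"),
        ("patient-001", "Sample analysed", "2021-03-01 13:40", "Lab tech A"),
        ("patient-001", "Result reported", "2021-03-01 15:10", "Lab system"),
        ("patient-002", "Test requested",  "2021-03-01 08:20", "Dr Patel"),
        ("patient-002", "Sample taken",    "2021-03-01 10:45", "Nurse Lee"),
    ],
    columns=["case_id", "activity", "timestamp", "resource"],
)
event_log["timestamp"] = pd.to_datetime(event_log["timestamp"])
print(event_log)
```

However the source system stores it, as long as each event can be tied to a case, an activity, and a time, process mining can work with it.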
Three types of process mining can be distinguished (a minimal sketch of all three follows this list):
Discovery: the actual discovery of how any process in your enterprise is executed. Using newer technologies such as computer vision, machine intelligence, and deep learning, process discovery creates a ‘digital twin’ of that process. It infers process models and reproduces the observed behaviour.
Conformance: conformance checking compares event logs, or the process discovered from them, against an existing reference model (the target model) of the same process, to determine whether the actual process corresponds to the target. It is the process mining method used to check compliance.
Extension: used when there is an a priori model. The model is extended with additional performance information such as processing times, cycle times, waiting times, and costs. The goal is not to check conformance, but to improve the performance of the existing model with respect to certain process performance measures.
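To make these three types concrete, here is a minimal sketch in Python with pandas. It is not Arkturus's implementation and is far simpler than real process mining tooling: it discovers a directly-follows graph from a hypothetical lab-testing log, checks it against a hand-written reference model, and annotates the log with waiting times. All names and records are invented for illustration.

```python
from collections import Counter

import pandas as pd

# Hypothetical event log (same shape as the earlier sketch).
event_log = pd.DataFrame(
    [
        ("patient-001", "Test requested",  "2021-03-01 08:15"),
        ("patient-001", "Sample taken",    "2021-03-01 09:02"),
        ("patient-001", "Sample analysed", "2021-03-01 13:40"),
        ("patient-001", "Result reported", "2021-03-01 15:10"),
        ("patient-002", "Test requested",  "2021-03-01 08:20"),
        ("patient-002", "Sample taken",    "2021-03-01 10:45"),
        ("patient-002", "Test requested",  "2021-03-02 07:30"),  # duplicate request
    ],
    columns=["case_id", "activity", "timestamp"],
)
event_log["timestamp"] = pd.to_datetime(event_log["timestamp"])


def discover_dfg(log: pd.DataFrame) -> Counter:
    """Discovery, reduced to its simplest form: count how often each activity
    directly follows another within the same case (a directly-follows graph)."""
    dfg = Counter()
    for _, trace in log.sort_values("timestamp").groupby("case_id"):
        activities = trace["activity"].tolist()
        dfg.update(zip(activities, activities[1:]))
    return dfg


# Conformance, reduced to its simplest form: compare discovered transitions
# against a hand-written reference model of how the process *should* flow.
expected = {
    ("Test requested", "Sample taken"),
    ("Sample taken", "Sample analysed"),
    ("Sample analysed", "Result reported"),
}
for (a, b), count in discover_dfg(event_log).items():
    status = "ok" if (a, b) in expected else "DEVIATION"
    print(f"{a} -> {b}: {count}x ({status})")

# Extension, reduced to its simplest form: annotate the log with the waiting
# time between consecutive events in the same case.
event_log["waiting_time"] = (
    event_log.sort_values("timestamp").groupby("case_id")["timestamp"].diff()
)
print(event_log)
```

Production tools build far richer process models and handle noise, concurrency, and much larger logs, but the principle is the same: everything is derived from the event log.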
Why Process Mining and not traditional business intelligence tools?
Process mining extracts process-related information from event logs and uses it to produce data visualisations that guide operational strategy enhancement. This provides insight into how processes are really being executed, making it easier to identify exactly the kind of change to implement.
Alternative approaches typically rely on people having one or two ideal scenarios in mind; in reality, that fails to encompass the many more scenarios that actually occur. Other existing platforms, like traditional Business Intelligence tools, may allow you to monitor operations and performance (with extensive customised reporting), but they fall short at identifying the causes of underperforming processes and are unable to provide predictive analysis in the way process mining does.
This demonstrates the necessity of adopting modern tools that can provide a complete picture of end-to-end processes, especially in high-demand situations where you don’t have time for trial and error or speculative decision making. The pandemic is a great example of this.
The healthcare system, particularly lab testing facilities, has experienced a huge influx of patients, resulting in a total of 1,772,480 tests taken between 22 January 2020 and 6 March 2021 in NZ alone. This has highlighted the crucial need for boards to make the right decisions promptly, with little room for error, to avoid congestion and delay.
A case study by Arkturus
Using existing data from multiple systems, we worked with the Auckland District Health Board (ADHB) to create a highly accurate digital twin of how the Test Laboratory operates. The resulting process visualisation dashboards showed the journey from originating a test request through each of the process stages to reporting the results.
Our process improvement analysis identified the hidden cost of duplicate lab testing, previously unknown to the ADHB. Using that data to refine the process, we identified potential annual cost savings of $1m – $2m from eliminating unnecessary repeat tests.
Arkturus also highlighted a high incidence of tests being performed but never billed for, representing additional potential cost recovery for the ADHB. Without the information we were able to extract, these realities would have remained unknown to the ADHB.
Arkturus’s Process Intelligence platform leverages advanced Process Mining technology for process transformation, including process mapping, visualisation and understanding, task mining, and AI process forecasting capabilities. Click here to learn more.