Poor data quality and completeness can have significant impacts on OPEX, safety and asset uptime. In relation to the financial impacts, IBM estimates that “bad data” costs the US economy a staggering $3.1 trillion per year.
In almost every incident on an operating asset that results in equipment downtime, production loss, personnel injuries or fatalities, or environmental impacts, the cause can be traced back to an issue within the available data.
In particular, for production assets, poor Computerized Maintenance Management System (CMMS) data can incur:
- Loss of production
- Increased OPEX
- Injury and fatalities
- Environmental damage
- Non-compliance with regulations
In this blog, our Senior Consultant based in our Houston office, Ed Galyen, shares real-life experiences of the impact that bad data can have, and provides insights on how to focus your efforts to efficiently improve your data with limited budget and resources.
A Real-Life Impact of Poor Base Data:
The process of enhancing base data can often be seen as a laborious and unnecessary task by many businesses, mainly because the focus is on production and firefighting, and because the magnitude of the gaps in their data, and the subsequent consequences, are not fully understood.
In addition, it can be difficult to build a business case to allocate budget to fixing base data issues that have existed since day one. It is therefore important to communicate the risks and benefits, in terms of additional OPEX, lost production and safety incidents, in order to be “heard”.
An example of where we have seen poor data incur extreme costs and lost production comes from an offshore asset we worked on in the US:
- A multistage pump’s operating suction pressure was not met, causing cavitation that damaged the pump’s internals and led to equipment downtime
- The asset register was incomplete, meaning the basket strainer was not captured and therefore had no maintenance assigned
Advice on how to efficiently improve base data
Step 1: Identifying the magnitude of the problem:
Conduct a gap assessment of the existing CMMS data.
There are a few ways to do this. In a recent project, using our own CMMS data analysis software, we were able to quickly identify:
- 33% of Safety Critical Equipment (SCEs) had no maintenance assigned
- 39 emergency shutdown valves with missing performance standard assurance activities
- 38% of “production critical” equipment wrongly assigned as “non-critical”
- 41% of tags were not in the asset register when compared with the engineering drawings, including pressure relief valves (PSVs) and diesel engines
- 11% of the PSVs were categorized as “pump”, meaning the wrong maintenance was assigned
- 7,112 missing Bills of Materials (BoMs)
This exposed the problem areas, enabling the client to understand where they should focus their efforts for the most return on investment.
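The core of such a gap assessment can be scripted. The sketch below is a minimal, hypothetical illustration, not the software mentioned above: the field names and sample records are invented for the example. It cross-checks a CMMS asset register against a tag list taken from engineering drawings, and flags safety-critical equipment with no maintenance assigned.

```python
# Minimal sketch of a CMMS gap assessment. All field names and
# sample records are hypothetical illustrations, not a real export.

register = [
    {"tag": "P-101A",  "type": "pump",  "safety_critical": True,  "pm_tasks": 4},
    {"tag": "PSV-210", "type": "pump",  "safety_critical": True,  "pm_tasks": 0},  # miscategorized PSV
    {"tag": "V-305",   "type": "valve", "safety_critical": False, "pm_tasks": 2},
]

# Tags taken off the engineering drawings (e.g. P&IDs).
drawing_tags = {"P-101A", "PSV-210", "V-305", "STR-101", "DE-400"}

register_tags = {item["tag"] for item in register}

# 1. Tags on the drawings but missing from the asset register.
missing_from_register = drawing_tags - register_tags

# 2. Safety-critical equipment with no maintenance assigned.
sce_no_maintenance = [
    item["tag"] for item in register
    if item["safety_critical"] and item["pm_tasks"] == 0
]

print(f"{len(missing_from_register) / len(drawing_tags):.0%} of drawing tags "
      f"are missing from the register: {sorted(missing_from_register)}")
print(f"SCEs with no maintenance assigned: {sce_no_maintenance}")
```

The same two set-difference and filter checks scale from this toy list to a full CMMS export, which is why percentages like those above can be produced quickly once the data is in one place.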
Step 2: Communicating the impact of improvements
When speaking with leadership to obtain budget to fix base data issues, it is important to focus on:
- How much money it will save
- The efficiencies it will unlock
- The risk posed to the asset’s safety and production
To get an idea of these savings, below is a summary of an improvement project delivered by our team which focused on criticality and maintenance improvements, and the types of efficiencies it unlocked:
- Increased the criticality allocation of production critical equipment by 64% and updated associated maintenance, providing assurance that all production critical equipment was being managed and maintained appropriately to mitigate risk
- One equipment type had a reduction in criticality ranking applied to 33% of its population, saving 456 annual maintenance man hours
- Saved $1.7M by removing unnecessary maintenance tasks from the CMMS
- Removed 82,901 maintenance man hours through smarter utilization of resources and logistics
Step 3: Fix the base data problems
The findings from a gap analysis will allow your team to focus efforts and resources on the most problematic areas with the most return on investment. Here are some proven initiatives that can save you money and reduce risk:
How to get started:
If the gap analysis has identified missing equipment and key information in your asset register, it is critical to consider:
1. A Desktop Asset Verification (DAV)
Remotely verifying the existing asset register against engineering drawings will:
- Provide a list of equipment that is not included in the asset register
- Identify where engineering drawings need to be updated
- Enable a functional hierarchy to be built by understanding equipment relationships derived from the drawings
- Help assign criticality
- Enable object types to be allocated (we at Add Energy recommend using ISO 14224 as the standard for object type codes and classification)
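Object type allocation from a tag list can be sketched as a simple lookup. The tag-prefix convention and the two-letter class codes below are illustrative placeholders, not quoted from the standard; a real implementation would map to the equipment classes defined in the ISO 14224 taxonomy tables.

```python
# Minimal sketch of allocating object types from tag prefixes.
# The prefixes and two-letter codes are illustrative placeholders;
# a real implementation would use the ISO 14224 taxonomy tables.

OBJECT_TYPES = {
    "P":   "PU",  # pumps
    "PSV": "VA",  # pressure relief valves
    "V":   "VE",  # vessels
    "DE":  "CE",  # combustion (diesel) engines
}

def allocate_object_type(tag: str) -> str:
    """Return the object-type code for a tag's prefix (the text before '-')."""
    prefix = tag.split("-")[0]
    return OBJECT_TYPES.get(prefix, "UNCLASSIFIED")

for tag in ["P-101A", "PSV-210", "STR-101"]:
    print(tag, "->", allocate_object_type(tag))
```

Tags that fall out as unclassified are exactly the ones that need a human decision, which keeps the manual workload focused on genuine exceptions.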
2. A Physical Asset Verification (PAV)
Physically verifying the equipment on the asset against the engineering drawings and asset register provides all the benefits of a DAV, and will also:
- Provide resolution to any DAV discrepancies
- Enable nameplate data to be collected, which is required for BoM development and spare part sourcing
- Allow the opportunity for a visual inspection of the equipment to be conducted
How we can help:
There is no doubt a PAV exercise can be tedious, time-consuming and inefficient. This is usually because:
- Maintenance and operation teams are often expected to complete this on top of their day jobs
- Handwritten data collected in the field is difficult to decipher once back in the office
- Inaccuracies and inconsistencies in data can make processing and data interpretation difficult
- Processing the data captured in the field can take months depending on the quality and quantity
- Data points are usually missed because engineers are trying to balance the distractions of the site, the verification of the equipment, and the manual entry of information into spreadsheets
To streamline this process, Add Energy has created a digital data collection app, ePAV™, enabling data to be collected at source in half the time and with 4x greater accuracy.
For advice and techniques to efficiently develop an accurate digital twin: