Get the most out of Maximo through data optimization
Recently we released a series on the Top 10 best practices for Maximo implementations. But the challenges of asset-intensive companies don't end with deploying the best EAM solution, IBM Maximo. Observing anomalies in industrial assets and maintaining the heaps of data recorded in these EAM applications is equally important. It is vital to check the reliability and authenticity of the data captured in the system and to eliminate redundancy and duplication. In fact, data is the lifeline of any successful application rollout. Data discrepancies surface as inefficiencies in organizational reports, affecting repair-and-replace decisions, maintenance planning, financial roll-ups, and asset performance evaluation, and ultimately eroding a company's profits.
Today, the majority of companies use EAM/ERP systems. A major risk of these applications, which help improve efficiency and give a real-time view of all business processes, is that bad data can cause them to fail. Our analysis suggests that these data management challenges prevail in most, if not all, companies. Since Maximo holds a huge amount of transactional and master data and is integrated with other applications, the impact of bad data can be enormous and can percolate through the entire system.
So, what causes data redundancy and duplication? Every business process requires users either to enter new data or to update or act on existing data. Users also upload data in bulk or receive it from external applications through integrations. Because data flows through every business process, data quality problems multiply the chances of mistakes, and with them, risk. The problem is that by the time decision makers understand the impact, it is too late. So what are the repercussions of inaccurate, wrong, or missing data? We will delve into those repercussions and their solutions in this blog. Here are some common situations that lead to data corruption, and their impact on maintenance cost:
Bad data results in Inaccurate Financial / Cost Roll-Ups:
In most businesses, a single asset is shared among different departments of an organization, and each department has its own standards for controlling and monitoring the maintenance of that asset. In some cases, each department may even create its own unique ID, a separate record, for the same asset within Maximo. This distorts the financial / cost roll-ups in organizational asset maintenance reports. To obtain a functionally accurate estimate of the total cost of ownership of such assets, a consolidated report should be produced, showing the total maintenance spend by each department on these assets.
Organization XYZ has multiple compressors installed in different departments across the company. For every two compressors, which happen to sit in two different departments (department A and department B), one centrifugal fan is allotted. Each department creates a separate asset ID in Maximo for that single centrifugal fan. Say Department A reports a fan maintenance cost of $100; it does so against the asset ID created for Department A. Department B, meanwhile, raises another work order for maintenance of the same centrifugal fan at $150 and records it under the asset ID created for Department B. The reports extracted from Maximo now show two asset records for a single physical asset, along with an inaccurate figure for the overall maintenance cost. Such redundancies lead to incorrect financial / cost roll-ups.
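The consolidation step described above can be sketched in a few lines: group duplicate asset records by a physical identifier and total their costs. This is a minimal illustration, not Maximo's actual schema; the field names (`serial_no`, `maint_cost`) and the idea of matching on a manufacturer serial number are assumptions for the example.

```python
from collections import defaultdict

# Two Maximo asset records that point at the same physical centrifugal fan,
# distinguished here by a shared (hypothetical) manufacturer serial number.
records = [
    {"asset_id": "FAN-A-001", "serial_no": "CF-9921", "dept": "A", "maint_cost": 100.0},
    {"asset_id": "FAN-B-017", "serial_no": "CF-9921", "dept": "B", "maint_cost": 150.0},
]

def rollup_by_physical_asset(records):
    """Group records by the physical identifier and total their maintenance cost."""
    totals = defaultdict(float)
    for r in records:
        totals[r["serial_no"]] += r["maint_cost"]
    return dict(totals)

print(rollup_by_physical_asset(records))  # {'CF-9921': 250.0}
```

Rolled up by asset ID, the two records would report $100 and $150 separately; rolled up by the physical asset, the true total of $250 emerges.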
Incorrect Initial Setup Leads to Inefficient Planning & Scheduling:
Planning and scheduling asset maintenance during the initial setup is a crucial step. The person responsible for planning and scheduling needs a broad range of technical skills, including maintenance and operating competencies as well as deep engineering insight. Over- or under-maintaining an asset can lead to asset failure or increase the company's maintenance cost. Careless initial setup of maintenance tasks, or creating separate job plans and PMs instead of using Maximo's built-in nested job plan feature, can result in an inefficient planning process, inconsistent repair and maintenance schedules, duplicated effort, and general inefficiency.
The manufacturer's guidelines for an electric pump recommend preventive maintenance every six months. However, the initial operator schedules maintenance activities for the pump on a monthly basis in Maximo. This over-maintains the asset: it does not extend the asset's service life, but it does inflate the maintenance cost. Such scenarios also jeopardize data integrity, because the unnecessary maintenance cost obscures visibility into the actual business problems.
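A simple audit pass can catch this class of setup error by comparing each PM's configured interval against the manufacturer's recommendation. This is a hedged sketch, not Maximo's PM schema; the field names and the 50% tolerance are illustrative assumptions.

```python
def flag_over_maintenance(pms, tolerance=0.5):
    """Return the IDs of PMs scheduled much more often than recommended.

    A PM is flagged when its configured interval (in months) is less than
    `tolerance` times the manufacturer-recommended interval.
    """
    flagged = []
    for pm in pms:
        if pm["configured_months"] < tolerance * pm["recommended_months"]:
            flagged.append(pm["pm_id"])
    return flagged

# The electric pump from the example: monthly PM vs. a six-month recommendation.
pms = [
    {"pm_id": "PM-PUMP-01", "configured_months": 1, "recommended_months": 6},
    {"pm_id": "PM-FAN-02",  "configured_months": 6, "recommended_months": 6},
]

print(flag_over_maintenance(pms))  # ['PM-PUMP-01']
```

Running such a check periodically, rather than only at initial setup, also catches schedules that drift after bulk data loads.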
Data Collected through Sensors Can Overwhelm Systems & Processes:
Integrating IoT and machine-learning technologies with Maximo, along with automating data capture through sensors, has increased the probability of our systems acquiring irrelevant, redundant, and duplicate data. These technologies are exponentially increasing the amount of data fed into Maximo, growing its databases and forcing companies to spend more on storage to absorb the flood of information.
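One common way to keep a sensor feed from flooding an EAM database is a deadband filter: store a reading only when it differs meaningfully from the last stored value. The sketch below is an assumption about one possible pre-ingest step, not a Maximo feature; the threshold and values are illustrative.

```python
def deadband_filter(readings, threshold=0.5):
    """Keep a reading only when it moves more than `threshold`
    away from the last kept value; discard near-duplicates."""
    kept = []
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            kept.append(value)
            last = value
    return kept

# Six raw sensor readings collapse to three meaningful ones.
print(deadband_filter([10.0, 10.1, 10.2, 11.0, 11.1, 12.5]))  # [10.0, 11.0, 12.5]
```

Filtering at the edge, before the integration layer writes to Maximo, keeps the database lean while preserving every change large enough to matter for condition monitoring.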
A Maximo implementation is only as good as its data. Disposing of unwanted information from your Maximo environment yields a more accurate picture of functional asset cost and efficiency, helping management make the right call on whether to keep repairing an asset or replace it with a new one.
ValuD has served clients across varied industries by eliminating redundant data from their Maximo environments through our custom-built data deduplication process, and by designing OOTB Maximo solution accelerators to prevent recurrence. Our team of functional consultants includes data analysts, data experts, and a specialized team dedicated to data cleaning and migration. Our data analysts and experts scrutinize data at a granular level, identifying repeat patterns (duplicate assets, locations, job plans, manufacturer information, and so on) that can be eliminated.
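The kind of pattern matching described above typically starts with key normalization: free-text fields are lower-cased, stripped of punctuation, and whitespace-collapsed so near-identical records group together. This is a minimal sketch under those assumptions; the records, field names, and matching rule are illustrative, not ValuD's actual process.

```python
import re

def normalize(text):
    """Canonicalize a free-text field for duplicate matching."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9 ]", " ", text)  # replace punctuation with spaces
    return " ".join(text.split())            # collapse runs of whitespace

# Hypothetical asset records: the first two describe the same fan.
assets = [
    {"asset_id": "A-100", "description": "Centrifugal Fan, 5HP"},
    {"asset_id": "B-205", "description": "centrifugal  fan 5hp"},
    {"asset_id": "C-310", "description": "Electric Pump 2HP"},
]

groups = {}
for a in assets:
    groups.setdefault(normalize(a["description"]), []).append(a["asset_id"])

# Any normalized key with more than one asset ID is a duplicate candidate.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(duplicates)  # {'centrifugal fan 5hp': ['A-100', 'B-205']}
```

Exact-key grouping like this catches formatting variants; fuzzier techniques (token sorting, edit distance) extend the same idea to records that differ by more than punctuation and case.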
To learn more, contact us at: email@example.com.