
Monday, February 25, 2019

Mega Projects Performance & Predictive Analytics


Predictive Analytics has found fertile ground in various sectors over the past two decades, mainly in financial and telecom services, for marketing applications such as customer churn prevention, fraud detection, cross- and up-selling campaigns, and credit scoring.
More recently, Machine Learning and Artificial Intelligence have become the new buzzwords and capabilities in the analytics world.
In manufacturing operations and maintenance, Predictive Analytics has been applied sporadically for preventive maintenance and, on rarer occasions, for failure root cause analysis and online fault modeling for operator guidance and decision support.
Nowadays, the EPC industry is set to pay attention to Predictive Analytics as part of its #digitaltransformation and #EPC4.0 initiatives. In late 2018, a leading EPC, Fluor, announced an initiative to apply Predictive Analytics to project performance, and this is great news, especially for those who pay the bill, that is, the owner/operators of projects and #megaprojects.
Yet, you may wonder how this can be achieved. First, a history of project data is needed: the more project history available, the better, as long as the data is, or can be, normalized.
Second, such history should also contain extensive asset-specific work step history for all phases; if possible, from the FEL 1, 2 and 3 stages down to the detailed engineering, procurement, construction and pre-commissioning phases. Once such data is collected, the SEMMA (Sample, Explore, Modify, Model, Assess) methodology can be applied to identify predictors of dependent variables such as cost, schedule and quality variations.
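As a minimal sketch of the Model and Assess steps, assuming a hypothetical, normalized table of historical work-step records (the file and column names below are invented for illustration, not taken from any real project database), a first predictive model for cost variation could look like this:

```python
# Minimal sketch of the Model / Assess steps of SEMMA on normalized project history.
# The file name and column names below are hypothetical, for illustration only.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Sample: one row per asset-specific work step, across FEL, engineering, procurement, etc.
history = pd.read_csv("project_work_steps.csv")

# Explore / Modify would normally happen here (cleaning, normalizing, encoding phases).
features = ["phase_code", "discipline_code", "planned_hours", "supplier_score"]
target = "cost_variation_pct"

X = pd.get_dummies(history[features], columns=["phase_code", "discipline_code"])
y = history[target]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Model: relate work-step predictors to cost variation.
model = GradientBoostingRegressor().fit(X_train, y_train)

# Assess: how well does past project history predict cost variation?
print("MAE on held-out work steps:", mean_absolute_error(y_test, model.predict(X_test)))
```

The same pattern applies to schedule or quality variations by swapping in a different target column.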
With the emergence of the above-mentioned initiatives (#digitaltransformation and #EPC4) in the EPC domain, it will be interesting to watch if and when owner/operators and EPCs take advantage of these possibilities. Megaproject performance over the years confirms that insight into past performance can be the mother of improvement.

Mega Projects, EPC 4 & Digital Transformation

Interesting white paper (https://bit.ly/2GXB6hN) on the subject of #EPC4 by a German think tank. Very applicable and true for #megaprojects.
Despite the impressive results of #megaprojects, stakeholders (before, during and after) seem to agree that collaboration across all phases and all disciplines needs improvement, and so do project data quality, integrity and interchange. Given the hundreds of thousands of work steps involved in such projects, the think tank team has a valid point. Obviously, #digitaltransformation and #EPC4 are related. In their article, https://bit.ly/2GX6g8J, Forbes also make a great point which, although related to their own industry, rings a bell when it refers to silos of departments and data; data that has to do with both assets and project work steps. Both links are worth reading.

Monday, September 24, 2018

BIM + SIM = Turning MegaProjects to MegaSuccesses


MegaProjects are MegaChallenges

Are you a stakeholder in an on-going or a new megaproject? It could be a new airport, a hospital, a stadium, a highway or a manufacturing facility with tens of thousands of assets. Your project teams may be relying on a first-class BIM solution to manage the project, which is great; yet you want to do better when it comes to the mega challenges of project cost, schedule and quality. Why? Simply put, there is always room for improvement.
How to do better?
1.  Go beyond BIM with BIM+SIM
Why? Because SIM (System Information Model) can combine BIM asset data with critical EIT (Electrical, Instrumentation and Telecommunication) asset information.
2. Improve the way your project teams collaborate over the entire project lifecycle
Why again? Because project teams develop an enormous amount of data with the various software applications used by different disciplines. This includes BIM data that describes thousands of assets, asset design data and project activities, supplier and cost data, schedule and work step data, construction, commissioning and handover completions, etc. Along the way, the quality, consistency and proper use of this "tsunami" of data suffer. Lack of data consistency and lack of adequate collaboration lead to frustration. And as time passes, finger pointing and confrontations begin: Procurement blames Engineering, Construction blames both, and, at the end of the project lifecycle, Pre-commissioning and Handover blame everyone else.
Digital Asset Delivery over the entire project lifecycle can help, using Deming's PDCA principles. Until now, there has not been a single framework that can bring together all tag-based asset design data and project activity information in near real time, thus empowering effective collaboration for project teams.
Digital Asset Delivery, powered by DAD software and Deming’s Plan-Do-Check-Act (PDCA) framework, can help megaproject teams and stakeholders do better. All project data comes together in a comprehensive, cloud-based environment, so that everyone can see, from anywhere in the world, and from any device (laptops, tablets and smart phones), all information about who did what, when and how, under a sustainable project governance framework for project data management.
Digital Asset Delivery combines BIM + SIM asset data from all disciplines and project activities over all phases of a megaproject lifecycle. Even with multiple contractors, projects and locations involved, this "game changer" approach can provide consistent information and collaboration globally, including multi-contractor interface management.
Benefits from Cost, Schedule and Quality savings in megaprojects can be in the range of tens of millions of USD.


Saturday, April 14, 2018

Industry 4.0 and Digital Asset Delivery

Industry 4.0 Challenges
In a 2017 article, Chemical Processing states that, while Industry 4.0 looks at the modes of production (such as moving toward modular production units for appropriate processes), the initial focus for IIoT tends to be more on connectivity, Big Data, and analytics to improve production asset availability and performance. Enter Digital Asset Delivery. This is about each asset's original (engineering) data and its lifecycle activity data (engineering, procurement, construction, commissioning and maintenance). It is also about "as-built" data.
All of this is Big Data.
And what about connectivity? Designs of process, piping, electrical, instrumentation and telecommunication systems originate in different engineering applications, and their interconnectivity is lost. Yet there are more connections: business processes and work steps, data flows, applications and functional blocks, organizational structures reflected in business process modeling swimlanes. And there is more: procurement and construction activities, cost data, schedules, etc. All these systems are interconnected and can be linked back to assets; make them tags. How can all this data come together? How can Industry 4.0 be realized with so many silos of data and people? How can people collaborate under such circumstances? Can all this data be integrated into a single environment? How can this be done when, in most cases, the data is not clean and inconsistencies run all through it? Such situations cannot even support data analysis. All this leads to analytical paralysis.
Digital Asset Delivery for Industry 4.0
Until now, there has not been a single framework that can bring together all tag-based asset data and project activity information in near real time to serve Industry 4.0 initiatives.
Digital Asset Delivery provides the SIMPLIFY-INTEGRATE-COLLABORATE-ACHIEVE foundation for Industry 4.0 initiatives: it brings the required asset and activity data together in simplified forms, integrates it and serves users for collaboration over the lifecycle of any asset and in any project, greenfield or brownfield, small or mega. And this is done without engineers having to abandon their engineering tools. Digital Asset Delivery targets collaboration tailored to Industry 4.0 requirements. This is about sharing all information from the planning and scheduling of tag-based activities for engineering, procurement, construction and commissioning of electrical, mechanical, instrumentation, cable, piping and other systems. And for maintenance too. This is SIM - System Information Modeling. Each tag in the SIM has activities, and each activity complies with Deming's Plan-Do-Check-Act (PDCA) methodology, so that everyone can see, from anywhere in the world, all information about who did what, when and how, under a sustainable governance framework.
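As an illustration only, and not the actual DAD data model, a tag-centric SIM structure with PDCA-tracked activities might be sketched like this:

```python
# Hypothetical sketch of a tag-centric SIM structure with PDCA-tracked activities.
# Class and field names are illustrative only, not the actual DAD software's data model.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class PDCA(Enum):
    PLAN = "plan"
    DO = "do"
    CHECK = "check"
    ACT = "act"

@dataclass
class Activity:
    name: str               # e.g. "cable pull", "loop check"
    phase: str              # engineering, procurement, construction, commissioning
    owner: str              # who did what
    status: PDCA = PDCA.PLAN

@dataclass
class Tag:
    tag_id: str             # e.g. an instrument or cable tag
    discipline: str         # electrical, mechanical, instrumentation, piping, ...
    activities: List[Activity] = field(default_factory=list)

# Every activity on a tag moves through Plan-Do-Check-Act, visible to all teams.
pump_motor = Tag("P-101-M", "electrical")
pump_motor.activities.append(Activity("cable termination", "construction", "Contractor A"))
```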
Digital Asset Delivery combines tag data from all disciplines and tag activities to facilitate Industry 4.0 initiatives. It takes into account all of the systems involved for all disciplines, their connections, the people and departments involved, their workflows, associated documents and more. This is the Industry 4.0 System Information Model. And this is what delivers cost, schedule and quality savings.
Where and how to start? Learn about Digital Asset Delivery, Industry 4.0 SIM, PDCA and SICA (SIMPLIFY-INTEGRATE-COLLABORATE-ACHIEVE) methodologies as part of an in-house workshop toward achieving excellence in asset and project lifecycle management and, last but not least, in Industry 4.0 initiatives.





Friday, April 22, 2011

Tracking the Total Cost of Ownership (TCO)

TCO is usually calculated at purchasing time and when comparing offers from potential suppliers, and sometimes a supplier will give their own version of TCO.
Yet the useful side of calculating TCO is after acquisition and during the lifecycle, since operation and maintenance of a system is typically the biggest component of TCO. Starting from an expected TCO, the actual TCO changes during the lifecycle as operation and maintenance costs add up. An unexpected failure and repair will make the TCO Index go up, while efficient, trouble-free use of a system will improve it. Also, if the lifecycle of a system is extended, the TCO Index will get better (decrease).
Monitoring and calculating TCO over the lifecycle of a system requires proper financial accounting practices and cost allocations, and this data is typically readily available. Generating TCO reports for each family of systems on a monthly basis provides excellent KPIs. Management can appreciate such information, and it builds a good history for making educated decisions during the next purchase and capital expenditure for a system.
Developing TCO models such as the one shown in the figure above, simple as it may seem, requires commitment to project execution excellence, since the TCO modeling approach behind it can get complex. Yet the benefits can be substantial, since proper management of TCO KPIs can lead to very attractive savings.
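As a minimal sketch, assuming the TCO Index is simply the projected actual lifecycle cost relative to the expected TCO, scaled to a base of 100 (the cost categories and figures below are illustrative, not a standard model):

```python
# Illustrative lifecycle TCO Index: actual cumulative cost vs. expected, base = 100.
# The cost figures and the index definition are assumptions for this sketch.
expected_tco = 1_000_000          # expected lifecycle cost at purchase time (USD)

actual_costs = {
    "acquisition": 300_000,
    "operation_to_date": 350_000,
    "maintenance_to_date": 180_000,
    "unexpected_repairs": 90_000,  # failures push the index up
}

remaining_expected = 150_000       # expected spend for the rest of the lifecycle

projected_actual_tco = sum(actual_costs.values()) + remaining_expected
tco_index = 100 * projected_actual_tco / expected_tco

print(f"Projected TCO Index: {tco_index:.1f} (above 100 means worse than expected)")
```

Run monthly per system family, a calculation like this yields exactly the kind of KPI trend described above.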

Sunday, March 27, 2011

Monday, February 7, 2011

Asset documentation and CAD-based drawings need not be a pain anymore.


Documentation and creation of drawings is simply ... a big pain.

To build and maintain control, communication and various other systems connected in a facility, documentation and related information must be generated, maintained and readily available throughout the lifecycle of the related assets.
The classic approach is that such information is created and stored in CAD drawings and documents. Even "smart CAD drawings", though, may not be as smart compared with newer approaches that can result in savings of millions of dollars, especially in the case of large capital projects.
In addition, as information is repeated across several drawings and documents in order to relate various devices to each other, consistency becomes even more challenging; with the classic CAD-based approach, the preparation of documentation is slow and inconsistencies appear between documents.
One example scenario: if engineering represents as much as 20% of a large capital project, with about 50% of that related to drawings and documentation, then there is considerable room for cost savings. Take a capital project of, say, 10 million USD: engineering would account for about 2 million USD, of which roughly 1 million USD goes to drawings and documentation. With these practices, capital investment and asset lifecycle management costs can be huge as systems grow in size and complexity over time.
A data model-based design can eliminate the pains that a CAD-based approach creates, since this approach can deliver:

1. cost savings of between 50% and 90% compared with the classic approach over the entire cost of ownership timeline
2. zero documentation errors
3. instant access to all relevant information
4. an auditable history for the complete model
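Applying the 10 million USD example above to the 50-90% savings range in point 1, a back-of-the-envelope calculation (all shares assumed for illustration) looks like this:

```python
# Back-of-the-envelope savings estimate for the 10 million USD example above.
project_value = 10_000_000
engineering_share = 0.20           # engineering ~20% of the project
documentation_share = 0.50         # ~50% of engineering is drawings/documentation

documentation_cost = project_value * engineering_share * documentation_share  # 1,000,000 USD

low_savings = documentation_cost * 0.50   # 500,000 USD
high_savings = documentation_cost * 0.90  # 900,000 USD
print(f"Potential documentation savings: {low_savings:,.0f} - {high_savings:,.0f} USD")
```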

The data model-based approach is applicable to all industries and sectors, covering documentation of electrical and wiring systems, interconnected assets in telecommunications and networks, data centers, etc.
Comments and questions are welcome.

Sunday, October 17, 2010

Data Mining in Manufacturing & Process Industries

The use and application of Neural Networks (NN) has found a "home" in the domain of industrial process control. At the same time, NN is practically a core function in most popular data mining solutions. NN algorithms have been embedded in process control solutions, yet they are sometimes seen, or even presented, as a bit of a "black box" or "magic box", mainly because of the complexity most process control engineers face in rationalizing the output of an NN algorithm.
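To make the "black box" a little less magical, here is a minimal sketch of training a small neural network on hypothetical process sensor readings; the data, features and failure label are invented for illustration, and scikit-learn's MLPClassifier stands in for whatever NN engine a given data mining suite embeds:

```python
# Minimal sketch: a small neural network classifying equipment state from sensor data.
# Data and feature meanings are hypothetical; scikit-learn's MLPClassifier is used for brevity.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical readings: temperature, pressure, vibration; label 1 = failure precursor.
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=500) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
nn.fit(X_train, y_train)

print("Held-out accuracy:", nn.score(X_test, y_test))
```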


Root Cause Analysis (RCA) has traditionally been conducted with core statistical applications in order to identify the cause of failure of plant equipment. RCA is classified, based on its use or objectives, as:
  1. Safety-based RCA, which descends from the fields of accident analysis and occupational safety and health.
  2. Production-based RCA, which has its origins in the field of quality control for industrial manufacturing.
  3. Process-based RCA, which is an "add-on" to production-based RCA, but with a scope that has been expanded to include business processes.
  4. Failure-based RCA, which is rooted in the practice of failure analysis as employed in engineering and maintenance.
  5. Systems-based RCA, which emerged as an amalgamation of the preceding approaches, along with ideas taken from fields such as change management, risk management, and systems analysis.

Thursday, September 23, 2010

More on Total Cost of Ownership for IT Systems

How to calculate TCO during a bid evaluation?
A TCO Index calculation method has been developed using the adjacent figure.
The model takes into account all costs relevant to a bid evaluation. It also considers asset lifecycle duration and management aspects.
The TCO Index is a normalized figure in order to ensure confidentiality of financial figures presented by bidders. 
There are two versions of the TCO Index calculation: 1) a detailed model, and 2) a simplified version.
A TCO Index example of 77.5 versus a base case of 100 for a lifecycle of 20 years is shown, in which lifecycle costs are allocated to CAPEX and OPEX types (e.g. acquisition, deployment, operation and support, retirement and replacement). 
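A simplified sketch of such a normalized TCO Index per bid is shown below; the cost categories mirror those above, and the figures are chosen purely for illustration so that the result reproduces the 77.5 example:

```python
# Simplified TCO Index per bid, normalized so the base case = 100.
# Cost figures are illustrative only; real bids would use 20-year CAPEX/OPEX estimates.
def tco(bid):
    return sum(bid.values())

base_case = {"acquisition": 400_000, "deployment": 100_000,
             "operation_support": 1_300_000, "retirement_replacement": 200_000}

bid_a = {"acquisition": 450_000, "deployment": 80_000,
         "operation_support": 900_000, "retirement_replacement": 120_000}

tco_index = 100 * tco(bid_a) / tco(base_case)
print(f"Bid A TCO Index: {tco_index:.1f} vs. base case 100")  # 77.5
```

Normalizing each bid against the same base case keeps bidders' absolute financial figures confidential while still allowing a like-for-like comparison.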


The Total Cost of Ownership for IT Systems

The Total Cost of Ownership (TCO) for capital investments, such as an IT system, is allocated to various cost components, e.g. purchase/acquisition costs, operational costs, replacement costs, etc.
Gartner and Forrester Research present some typical TCO figures, as per the adjacent image. Both firms indicate the purchase cost to be about 32% or more of the TCO. The longer the asset lifespan, the higher the TCO in absolute figures, but the lower the purchase cost as a percentage of the TCO. That is why lifecycle management of IT assets is significant.
A key point, of course, is that purchase/acquisition costs alone do not determine the most advantageous supplier selection from a financial point of view; the TCO Index does.
Calculating the TCO Index for each competitive bid in IT is quite analytical but worth the effort during evaluation of supplier proposals.