Consolidate Your Field Data to Save Millions!
The Foundation of Continuous Improvement
QT OT Web Site = www.qt-ot.com / Contact = shane.yamkowy@qtltd.ca
You have probably heard many times, “you can’t manage what you can’t measure” (Peter Drucker). And I would add, “if your OT data is not in ONE CONSOLIDATED place, organized and clean, you can’t make quick, accurate, daily decisions on how to direct your work.”
The focus of OT (Operational Technology) is to provide real-time or near-real-time access to your operational and engineering data, so you can manage, make decisions, analyze, and maintain a state of continuous improvement around your daily operations. And for conventional oil and gas, it is also to automate as much daily volumetric balancing / “doing books” as you can, so you can focus on operations. This could be a simple definition of operational efficiency.
Think of your OT Platform, built around your consolidated, single-source-of-truth data, as real-time “operational air traffic control”: a web application and mobile app solution, securely accessible anywhere you have internet, providing the ability to “Direct Work”.
The ability to direct work is key to ensuring the right people are doing the right things, at the right time, while integrating alarm/exception management, remote monitoring, remote control, and well optimization and servicing.
And all “the work” is logged in operations and engineering log books, saved, available for analysis, and shared in real time with other systems. Meanwhile, your maintenance program and operations-by-exception plans are being executed, largely automated, via planned, focused site visits and work orders.
As discussed in my previous article (see below), the details of your operations, work flow requirements, culture, tolerance to change, field automation in place, and third-party vendor solutions are why using a base set of OT tools, and then customizing them for your specific optimization needs, is a solid approach to operational excellence.
Our last client achieved approximately $60,000,000 in annual recurring OPEX savings following our Trusted OT Advisement and the principles laid out in our articles.
OT in Conventional Oil and Gas – Work Flows, Data, Customization, Configuration, and Integration
Operational improvement and efficiency needs a focus within each organization. For conventional oil and gas, for example, we typically see the focus falling into categories like these:
- Reduce downtime
- Optimize the Wellbore / Well Optimization
- Human Productivity (automate “low value work”)
- Regulatory, Compliance, and Safety Reinforcement
- Supporting the Sustainability Program/Goals
After you define your Typical Operations Day, by reviewing, defining, and integrating your work flows (i.e., defining your level of operational efficiency), the next critical success factor is consolidating your OT Data.
First, we need to be aware of the three general types of OT data that need to be consolidated in one spot (a small sketch follows the list):
- time series / sensor / historian data = streaming data
- transactional data = truck tickets, well tests, things that do not stream and are not polled
- meta data = characteristics of your data that help you organize, quickly find, and analyze your data
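To make the three types concrete, here is a minimal sketch (all tag names, fields, and values are illustrative assumptions) of how one row of each type might look once landed in a single warehouse:

```python
# Illustrative examples only; names and values are assumptions.
time_series_row = {    # streaming / historian data
    "tag_id": "W-102.TUBING_PRESSURE", "ts": "2025-05-01T06:00:00Z",
    "value": 512.4}
transactional_row = {  # does not stream and is not polled
    "type": "truck_ticket", "well": "W-102", "date": "2025-05-01",
    "volume_m3": 18.2}
meta_data_row = {      # helps you organize, find, and analyze the other two
    "tag_id": "W-102.TUBING_PRESSURE", "unit": "kPa", "well": "W-102",
    "visit_frequency_days": 7}
```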
Most companies do not start by having an OT vision; it comes later. By that time, significant organic data and system growth has occurred, and likely without an overall solution architecture to maximize your operational goals.
This is where QT OT comes in: we can help you review your operational goals, work flows, and data/system environment, and define a vision that creates the core of your OT Platform. An OT Platform creates a foundation for you to continually work on operational efficiency by creating a “single source of truth” for your data, work flows, and reporting.
Depending on your current environment, the biggest challenge everyone faces is handling Time Series streaming data, the type of data typically stored in proprietary vendor historians with no easy way to get it out. And the nature of time series data is that it can grow rapidly, creating billions of rows of data in a year, especially if frequent polling rates are used.
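For a rough sense of scale (round numbers assumed purely for illustration): 10,000 tags polled once a minute is 10,000 × 1,440 = 14.4 million rows a day, or roughly 5.3 billion rows a year, before a single new site is added.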
In today’s OT world, most companies face four key challenges in consolidating OT data:
- how to FREE YOUR DATA OUT OF YOUR HISTORIAN and MAKE IT AVAILABLE TO ALL OPERATIONAL and ENGINEERING SYSTEMS! and potentially, how to CREATE an ENTERPRISE HISTORIAN across different SCADA systems
- how to get a COPY / ACCESS to your data in Third Party Vendor (3PP) solutions
- how to design an ASSET FRAMEWORK to quickly find, organize, and analyze data specific to your operational requirements, including the appropriate META DATA
- how to manage daily DATA QUALITY
FREEING YOUR HISTORIAN DATA FOR USE WITH OTHER SYSTEMS
The problem with leaving your SCADA / time series data in a historian is that it is not easily available, in a real-time manner, to other operational, maintenance, engineering, and corporate systems. These systems do not understand process control or SCADA protocols and concepts “out of the box”. They expect to see this data in “tables”, whether SQL, NoSQL, or columnar based.
Some process control and SCADA systems let you “request on demand” the data you need, but that is typically expensive, like six figures or more expensive, and it puts an unacceptable load on the process control or SCADA system.
An even bigger challenge: what if you have many different SCADA systems? If you do, and you need to consolidate your OT data, you will need some form of an Enterprise Historian.
But even then, you would have all your time series data in one spot, and still not your transactional and meta data. Ideally, you want it all in one source; one spot.
We have solved this problem for other customers and can solve it for you! We like using PostgreSQL, an open source solution with free licensing, capable of handling billions of rows of data. We can use different database engine back ends, but we like free licensing and low-cost annual support for ongoing maintenance and tuning: five-figure, not six-figure, support costs.
Once you get all your time series, transactional, and meta data into one OT DATA WAREHOUSE, you then just need to build materialized and non-materialized (virtual) views to “show” all the other systems what they like seeing: those SQL and NoSQL/columnar data tables.
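As a minimal sketch of what that can look like in PostgreSQL (the ts_raw table, its columns, and the view name are illustrative assumptions, not a fixed schema), a materialized view can pre-aggregate raw historian rows into exactly the kind of table other systems expect:

```python
# A minimal sketch, assuming a hypothetical ts_raw(tag_id, ts, value) table
# in the OT Data Warehouse; adjust names to your own schema.
import psycopg2

HOURLY_ROLLUP_DDL = """
CREATE MATERIALIZED VIEW IF NOT EXISTS hourly_rollup AS
SELECT tag_id,
       date_trunc('hour', ts) AS hour,
       avg(value) AS avg_value,
       min(value) AS min_value,
       max(value) AS max_value
FROM ts_raw
GROUP BY tag_id, date_trunc('hour', ts);
"""

def refresh_hourly_rollup(conn_str: str) -> None:
    """Create the view once, then refresh it on whatever schedule suits you."""
    with psycopg2.connect(conn_str) as conn, conn.cursor() as cur:
        cur.execute(HOURLY_ROLLUP_DDL)
        cur.execute("REFRESH MATERIALIZED VIEW hourly_rollup;")

if __name__ == "__main__":
    refresh_hourly_rollup("dbname=ot_warehouse user=ot_admin")
```

The trade-off is simple: a materialized view stores its results and needs a scheduled refresh, while a non-materialized (virtual) view is always current but recomputes on every query.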
CONSOLIDATING / REPATRIATING and INTEGRATING YOUR THIRD-PARTY VENDOR DATA
With field automation solutions continuing to drop in price and increase in capability, and with the non-stop growth of cloud computing, many operators are using third-party packaged vendor solutions.
Typically, these include some vendor-specific automation on site, with a built-in cell modem, sending telemetry to the vendor cloud.
Many companies forget to write into their contracts a “reminder” that the data being generated is theirs, or at a minimum co-owned with the vendor, and that they should always have pre-defined real-time access to “their data”.
We do not want to interfere with the warranty and support provided by the third-party vendors; we want to hold them responsible for what you paid for, and we do not want to replace their communications solutions. But we do want a copy of our data!
When consolidating all your data into an OT Platform, you need all of these third-party cloud solution data sets and alarms in your OT Data Warehouse.
That way you can use your single-source-of-truth OT web application to analyze that data and integrate it with all the other vendor and automation data you have, ideally with SMART ALARMS and SMART EVENTS (a future article is coming soon about using multi-variable equations, instead of traditional set point alarms, to provide high-confidence, context-rich alarms and events, and to allow you to include multiple variables from different automation, different sites, and different SCADA systems).
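As a small, hypothetical taste of the idea (the variable names and thresholds here are invented for illustration; the future article will go deeper), a multi-variable condition can combine readings that no single set point alarm could:

```python
# A hypothetical multi-variable SMART ALARM; variable names and thresholds
# are illustrative assumptions, not field-proven values.
def gas_lift_smart_alarm(casing_kpa: float, tubing_kpa: float,
                         flow_e3m3_d: float, compressor_running: bool) -> bool:
    """Alarm only when low flow coincides with a widening casing/tubing
    differential AND the compressor is actually running, so a routine
    compressor shutdown does not page anyone."""
    return (compressor_running
            and flow_e3m3_d < 2.0
            and (casing_kpa - tubing_kpa) > 400.0)
```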
With previous customers we have developed and executed the concept of VIRTUAL POLLING, where we pull inventory, telemetry, alarms, and photos from cloud-based third parties, using REST API or GraphQL interfaces, into the consolidated OT Data Warehouse. The OT Platform web app then “sees” all the data as equal and executes whatever remote monitoring, control, or alarm management is required.
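Here is a minimal sketch of the virtual polling pattern. The vendor URL, authentication scheme, JSON shape, and warehouse table are all assumptions, since every vendor API differs, but the pull-then-land shape stays the same:

```python
# A minimal sketch of VIRTUAL POLLING from a hypothetical vendor REST API
# into a hypothetical ts_raw warehouse table; all names are assumptions.
import psycopg2
import requests

VENDOR_URL = "https://api.example-vendor.com/v1/sites/{site_id}/readings"

def poll_site(site_id: str, token: str, conn) -> int:
    """Pull the latest readings for one site and land them in the warehouse."""
    resp = requests.get(
        VENDOR_URL.format(site_id=site_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    rows = [(r["tag"], r["timestamp"], r["value"])
            for r in resp.json()["readings"]]
    with conn.cursor() as cur:
        # ON CONFLICT DO NOTHING keeps re-polls idempotent, assuming a
        # unique constraint on (tag_id, ts) in the warehouse table.
        cur.executemany(
            "INSERT INTO ts_raw (tag_id, ts, value) VALUES (%s, %s, %s) "
            "ON CONFLICT DO NOTHING",
            rows,
        )
    conn.commit()
    return len(rows)
```

Run it on a schedule per site, and the vendor’s cloud data shows up in the warehouse alongside everything polled from your own SCADA.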
DESIGNING and USING YOUR ASSET FRAMEWORK TO ORGANIZE YOUR OT DATA and META DATA CONSIDERATIONS
Before you begin consolidating your OT Data, you need to define your ASSET FRAMEWORK. Each customer has a slightly different asset framework. Usually, it is a combination of responsible operating areas and flow.
Maybe something like
- REGION (Superintendent) / FOREMAN AREA / OPERATING AREA / BATTERY / SATELLITE/ HEADER / WELL
Once you design your hierarchy / tree structure, then for each level and for all key objects like a well or battery, you need to define what kind of META DATA you will need to find and organize your data.
The asset framework is primarily for filtering down to an AREA OF RESPONSIBILITY. Meta data is primarily for filtering and sorting.
Meta data are items like: well test interval, tank names, production profiles, and visit frequency.
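A minimal sketch (all names illustrative) of how the asset framework path and the meta data might hang together on a single well record:

```python
# Illustrative only: the hierarchy levels mirror the example above, and the
# meta data keys are assumptions drawn from the items just listed.
from dataclasses import dataclass, field

@dataclass
class WellAsset:
    # Asset framework levels: filter down to an area of responsibility.
    region: str
    foreman_area: str
    operating_area: str
    battery: str
    satellite: str
    header: str
    well: str
    # Meta data: filter and sort within that area.
    meta: dict = field(default_factory=dict)

example = WellAsset(
    region="North (Superintendent A)", foreman_area="Foreman 2",
    operating_area="Area 7", battery="Battery 04-12", satellite="Sat A",
    header="H-3", well="W-102",
    meta={"well_test_interval_days": 30, "visit_frequency_days": 7,
          "tank_names": ["T-1", "T-2"], "production_profile": "gas lift"},
)
```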
DATA QUALITY, EXCEPTION REPORTS, and DAILY CHANGE LISTS
Once the data is freed from the historian, your transactional data is merged in, your asset framework is designed and structured, and your meta data is in place, you then need to maintain DATA QUALITY.
The simplest approach to this is to define the EXCEPTION REPORTS you need to get each day, highlighting when data is out of its assumed operating range.
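A minimal sketch of one such exception query (the ts_raw and tag_limits tables and their columns are assumptions): flag every tag whose daily average fell outside its assumed operating range:

```python
# Illustrative only: assumes a ts_raw table of readings and a tag_limits
# table holding each tag's assumed operating range.
DAILY_EXCEPTIONS_SQL = """
SELECT t.tag_id,
       avg(t.value) AS day_avg,
       l.low_limit,
       l.high_limit
FROM ts_raw t
JOIN tag_limits l USING (tag_id)
WHERE t.ts >= current_date - interval '1 day'
GROUP BY t.tag_id, l.low_limit, l.high_limit
HAVING avg(t.value) NOT BETWEEN l.low_limit AND l.high_limit;
"""
```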
Another useful check, especially when significant automated updates are being made, is a DAILY CHANGE REPORT: compare today to yesterday and summarize the number of automated and non-automated changes being made, to ensure there are no accidental or rogue large-scale changes due to data quality or configuration issues.
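And a minimal sketch of that change report (the change_log table and its columns are assumptions): counting automated versus non-automated changes by day makes a rogue bulk update stand out immediately:

```python
# Illustrative only: assumes every change to the warehouse is written to a
# change_log table with a changed_on timestamp and an automated flag.
# Covering the last two calendar days puts today next to yesterday.
DAILY_CHANGES_SQL = """
SELECT changed_on::date AS day,
       automated,
       count(*) AS change_count
FROM change_log
WHERE changed_on >= current_date - interval '1 day'
GROUP BY changed_on::date, automated
ORDER BY day, automated;
"""
```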
QT OT is here to help you through the journey to review your work flows, consolidate your data, and set up your OT Platform, so you can save millions in annual recurring OPEX!
Contact us at
shane.yamkowy@qtltd.ca and check out our web site at www.qt-ot.com