Quality governs what crude oil is worth and where it can be delivered. What is often described as a uniform commodity actually spans a range of qualities that affect both end use and ease of transport. Despite the importance of quality and the long history of the industry, product qualities remain difficult to measure accurately and in real time. Measurement is a daunting task because product moves quickly and mixes constantly, irreversibly changing its composition. As a result, industry participants are often buying, selling, and transporting a crude oil stream without full clarity on its contents. This creates inefficiencies across the supply chain, because the right products are not reaching the most suitable end markets, and that costs all participants time and money.
Crude qualities have become increasingly variable. Recent supply additions have come primarily from shale, which has higher decline rates than conventional production. This means oil infrastructure will have different supply sources flowing through it in only a few years, which contrasts with infrastructure asset lives spanning decades. Extraction methods have also changed significantly, and new completion techniques, combined with enhanced recovery methods, have further altered stream compositions. Additionally, global shifts in refined product demand and new environmental regulations have simultaneously increased quality-dependent price variability. All of this has led to a push for more quality specifications across the supply chain to drive increased transparency and liquidity. Some of these new rules have already been implemented. For example, as of January, CME Group added five new quality parameters to the NYMEX WTI contract. The added rules and the increased focus on quality are making the subject far more important for companies.
Midstream companies are perhaps the most affected by increasing crude variability, which makes it difficult to keep track of the quality of the volumes they gather. That variability has caused refineries to push for more detailed specifications on delivery. Compounding the problem, pipeline development is constrained in parts of North America, so pipeline companies are forced to deliver a variety of crudes for their clients through the same infrastructure, leading to increased batching and the associated transition zones. Midstream companies need to track the crude quality in every truck, pipeline, tank, and terminal in order to schedule deliveries that meet specifications. To attract volumes to their existing assets in an increasingly constrained market, and to maintain delivery specifications, pipelines need a clear focus on quality demands. To support this, they need to build robust measurement programs. These programs are difficult to build because of the complexity of accurate measurement combined with the sheer volume of data points. A pipeline company can easily transport millions of barrels a day, and every time a given barrel is mixed with another barrel, the combination has entirely new properties. It is the ultimate big data problem, compounded by the fact that certain properties require complex tests to measure.
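To see why every mixing event yields "entirely new properties," consider a minimal sketch of estimating a commingled batch. It assumes density blends linearly by volume and sulfur (a mass fraction) blends linearly by mass; these are common first-order approximations, and properties such as vapor pressure blend non-linearly in practice. The function name and figures are illustrative, not from any contract or standard.

```python
# Illustrative first-order blending estimate for a commingled stream.
# Assumption: density blends linearly by volume; sulfur, a mass
# fraction, blends linearly by mass (volume x density).

def blend(batches):
    """Each batch is (volume_bbl, density_kg_m3, sulfur_wt_pct)."""
    total_vol = sum(v for v, _, _ in batches)
    # Volume-weighted density.
    density = sum(v * d for v, d, _ in batches) / total_vol
    # Mass-weighted sulfur content.
    total_mass = sum(v * d for v, d, _ in batches)
    sulfur = sum(v * d * s for v, d, s in batches) / total_mass
    return density, sulfur

# Hypothetical example: 60,000 bbl of light sweet crude blended with
# 40,000 bbl of a heavier sour stream.
density, sulfur = blend([(60_000, 820.0, 0.35), (40_000, 905.0, 2.10)])
```

Even this simplified model shows the bookkeeping burden: the blended batch sits between its parents on every axis, so neither original assay describes it, and each further mixing event requires the calculation to be redone.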
This challenge of accurately assessing crude quality, which is both a measurement and a data analysis problem, is well suited to the Internet of Things (IoT) and Artificial Intelligence (AI). The key advantage of IoT is its ability to aggregate and organize vast amounts of data in real time. AI's advantage lies in its ability to make optimal operational decisions informed by the massive data sets that IoT technology collects. Even with organized data, it is difficult for humans to identify patterns and actionable insights; at the data volumes of the oil and gas industry, it becomes almost impossible. AI can learn to recognize important trends and, if provided with a set of optimization criteria, can recommend real-time operational changes. Simply put, it makes data useful.
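As a toy illustration of what "recognizing important trends" can mean in practice, the sketch below flags readings in a quality measurement stream whose rolling z-score exceeds a threshold. The function name, window size, and threshold are assumptions for illustration, not industry values or any vendor's actual method.

```python
# Hypothetical trend-recognition sketch: flag measurements that
# deviate sharply from the preceding rolling window.
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=2.5):
    """Return indices of readings whose z-score against the previous
    `window` readings exceeds `threshold`."""
    flags = []
    for i in range(window, len(readings)):
        ref = readings[i - window:i]
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Illustrative sulfur (wt%) readings with a sudden jump at index 7.
sulfur_readings = [0.50, 0.51, 0.49, 0.50, 0.52, 0.51, 0.50, 0.85, 0.84]
flags = flag_anomalies(sulfur_readings)
```

Production systems use far more sophisticated models, but the principle is the same: statistics over the data stream surface the deviations a human scheduler would otherwise miss.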
Energy companies have already started to implement IoT systems that combine data from instruments distributed across their operations. Managers can now oversee field activities, recognize potential issues, and analyze current and historical data, all in real time from the head office. On top of that, these systems validate, timestamp, organize, and cross-correlate measurement sources (third-party labs, field labs, on-line analyzers, etc.) with effectively zero human interaction. Even processes that used to be completely manual, such as a centrifuge test, have now been fully automated to reduce data errors and connect directly with data systems. Many clients of Validere use a Centrifuge Tube Reader for this very purpose; it uses machine learning to automatically analyze tube readings and upload results to the cloud, eliminating the human error and time typically associated with the test. All of these developments lead to data that is more accurate, more timely, and more useful for operational decisions.
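The cross-correlation step can be sketched as follows: pair each on-line analyzer reading with the nearest third-party lab result within a time tolerance, so discrepancies between sources can be audited automatically. This is an illustrative sketch, not Validere's actual pipeline; the function name, tolerance, and data are assumptions.

```python
# Hypothetical cross-correlation of two measurement sources by
# timestamp: each analyzer reading is matched to the nearest lab
# result within `tolerance`, or left unmatched (None).
from datetime import datetime, timedelta

def pair_sources(analyzer, lab, tolerance=timedelta(hours=2)):
    """analyzer, lab: lists of (timestamp, value).
    Returns a list of (timestamp, analyzer_value, lab_value_or_None)."""
    paired = []
    for ts, value in analyzer:
        nearest = min(lab, key=lambda r: abs(r[0] - ts), default=None)
        if nearest is not None and abs(nearest[0] - ts) <= tolerance:
            paired.append((ts, value, nearest[1]))
        else:
            paired.append((ts, value, None))
    return paired

# Illustrative data: two analyzer readings, one lab result.
pairs = pair_sources(
    [(datetime(2024, 1, 1, 8, 0), 0.51), (datetime(2024, 1, 1, 12, 0), 0.53)],
    [(datetime(2024, 1, 1, 9, 0), 0.50)],
)
```

Once sources are aligned this way, systematic gaps between lab and analyzer values become visible, which is exactly the kind of validation these platforms perform without human interaction.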
With AI, innovative midstream companies that have invested millions of dollars in complex measurement infrastructure are gaining the ability to optimize the use of that infrastructure. Leveraging millions of data points from their measurement programs, AI can now identify where measurement resources are overallocated and underused, or insufficient, and adjust accordingly. It can pre-emptively alert management to unfavorable quality trends before they lead to off-spec shipments that would have to be off-loaded and recycled, which matters greatly for an industry that continues to be a leader in safety performance. It can also identify opportunities that would otherwise be missed without a broad view of all facility data, including compatible-quality crudes from new suppliers, optimal allocation of producer shipments, and pre-emptive predictions of commingled qualities based on physical and statistical modeling. Product quality has always been a constraint on operational decisions, but innovative companies in the industry are now dealing with it proactively rather than reactively.
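A pre-emptive quality alert can be as simple as extrapolating a fitted trend to the delivery specification. The sketch below fits a least-squares line to recent sulfur readings and estimates how many sample intervals remain before the stream breaches a limit; the function name, spec limit, and readings are illustrative assumptions, and real systems would use richer models.

```python
# Hypothetical pre-emptive alert: least-squares trend on recent
# readings, extrapolated to a spec limit.

def samples_until_breach(readings, spec_limit):
    """Fit y = intercept + slope * x to readings (one per sample
    interval) and return the estimated number of intervals until
    spec_limit is crossed, or None if the trend points away from it."""
    n = len(readings)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(readings) / n
    denom = sum((x - x_bar) ** 2 for x in xs)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, readings)) / denom
    if slope <= 0:
        return None  # flat or improving trend: no projected breach
    intercept = y_bar - slope * x_bar
    breach_x = (spec_limit - intercept) / slope
    return max(0.0, breach_x - (n - 1))

# Illustrative sulfur readings creeping toward a 0.60 wt% limit.
eta = samples_until_breach([0.50, 0.52, 0.54, 0.56], spec_limit=0.60)
```

An alert raised two sample intervals ahead of a projected breach gives schedulers time to re-route or re-blend the batch instead of off-loading and recycling it after the fact.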
Quality management is an increasingly important opportunity for midstream companies, as well as for oil and gas producers and refiners. They are seeking out industry partners that care about quality and that use IoT and AI technology to aggregate and interpret data, both saving costs and increasing revenues.
Oil and gas companies regularly face many industry-specific challenges. Operations such as exploration and drilling are complex, intricate processes, each with its own unique obstacles. Data analytics can play a major part in streamlining some of the most fundamental operations in the oil and gas industry.