Low Oil Prices Offer Case for Further Digital Oilfield Implementation
However, the solution is not just adding more sensors. Oil and gas companies need to clearly identify primary business objectives before implementing IoT technology, identifying new sources of information and clearing bottlenecks that hinder information flow. Companies also need to closely monitor IoT deployments and results to keep applications on track, at least in the early years. Both IT and C-suite executives should ask whether IoT deployments are generating the needed momentum and learning across businesses and employees, what future costs and complexities retrofitting and interoperability of applications will bring, and what security shortcomings remain in light of new developments.
New Infrastructure Needed to Gain Full Benefits of IoT Technology
Instead of continuing to rely on supervisory control and data acquisition (SCADA) systems as the central data and control systems at plants, the oil and gas industry needs to implement new infrastructure to take full advantage of the benefits of IoT technology.
The push towards “digitization” has been driven mainly by the proliferation of new software applications, new data formats, and the availability of massive amounts of real-time data. Over the past 10 years, the industry has experienced significant shifts in how production operations have been improved through the application of new, emerging technologies, according to an October 2015 report by the Industrial Internet Consortium, “Beyond Digitization: The Convergence of Big Data, Analytics and Intelligent Systems in Oil and Gas”.
“New analytic-centric technologies are now revealing operational nuances that were previously captured manually, or not at all,” according to the report.
During this time, oil and gas companies have created remote operation centers focused on optimizing operations based on these analytic insights.
However, these centers have principally focused on larger assets, where such investments are cost-effective or practical, according to the report.
“To truly achieve the Digital Oilfield or ‘Digital Operationalization’ with embedded analytics, oil and gas executives must overcome the challenges of the inherent limitations in today’s infrastructure and systems. Next-generation, real-time analytic platforms require new ways of thinking on how data processing, acquisition infrastructure, and analytics platforms are designed, deployed, maintained, and used in a distributed and robust manner.”
Previously, robust data connectivity in plants has been a real challenge to design, deploy and maintain. For upstream assets such as offshore rigs or mature fields distributed across large, remote areas, the fundamental communications infrastructure is extremely limited or simply non-existent, according to the consortium’s report. At downstream assets such as refineries, the large quantities of metals and other materials can negatively impact the robustness and reliability of many telecommunications technologies.
Historically, legacy communications approaches have tended to inhibit the advantages of real-time connectivity; according to the report, they were never built to support high-fidelity data streams and analytics. Leveraging new analytics and methods of insight has meant layering different telecommunications, data processing and data storage technologies, which in turn has created many silos of data, adding complexity to achieving meaningful data insights.
Over the years, additional capabilities have been added to SCADA systems, but many still fall short in critical areas that could help oil and gas companies gain and optimize data insights. Underlying data structures are not set up to handle the complex statistical approaches critical for recognizing and predicting insights, and the fidelity of measurement and data storage is relatively low, ranging from one data point every 10 minutes to one an hour or longer. In some cases, SCADA systems cannot reach field devices at all due to communication issues, making it impossible to capture high-fidelity data for that time period.
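The communication dropouts described above leave holes in the historical record that downstream analytics must detect before drawing conclusions. As an illustrative sketch only (the function name, the 10-minute threshold and the sample timestamps are assumptions for this example, not details from the report), a few lines of Python can flag periods where a SCADA poll cycle failed to reach its field devices:

```python
from datetime import datetime, timedelta

def find_gaps(timestamps, max_interval=timedelta(minutes=10)):
    """Flag periods where polling failed to reach field devices.

    `timestamps` is a sorted list of datetimes for one tag; any
    spacing wider than `max_interval` is reported as a gap.
    (Threshold and names are illustrative assumptions.)
    """
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > max_interval:
            gaps.append((prev, curr))
    return gaps

# One tag polled every 10 minutes, with an 80-minute comms outage
readings = [
    datetime(2015, 10, 1, 0, 0),
    datetime(2015, 10, 1, 0, 10),
    datetime(2015, 10, 1, 1, 30),  # outage between 00:10 and 01:30
    datetime(2015, 10, 1, 1, 40),
]
print(find_gaps(readings))
```

Knowing where the gaps are lets an analytics platform exclude those windows rather than silently interpolating across an outage.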
To address these challenges, oil and gas companies have turned to commercially available historian platforms because they offer better user experience and visualization capabilities. Additionally, these systems ingest cross-functional data into a single centralized location. These systems have allowed for a new level of data analysis, but these systems also have limitations.
These limitations include the fact that many historian systems are built on proprietary or traditional row-based databases, which require partitioning data across multiple servers and can in turn limit complex analysis across the entire dataset in an expedient and robust manner. Access to raw historical data for analytics may also be limited, because these systems archive and expunge data frequently. Data quality is an issue as well: when fed poor-quality, inconsistent data, most historian systems store the bad data right alongside the good.
To address these challenges as it starts leveraging new capabilities, the oil and gas industry must collaborate and share best practices to learn, adjust and adapt moving forward. New risk scenarios need to be rigorously tested through the application of use cases and testbeds, according to the Industrial Internet Consortium’s report. The consortium will use the public-private partnership model to address interoperability, use cases, testbeds and security/safety/privacy in the coming months. The information gathered will be used to provide recommendations and best practices to the oil and gas industry as companies start to understand the impact of the convergence of Big Data, analytics and intelligent systems on their businesses.
Senior Editor | Rigzone