The Big Challenges of Big Data for Oil, Gas

Sperrazza sees production field logistics as the one area that can benefit the most from the industry’s success in addressing Big Data. Companies can maximize the revenue they get from a field by ensuring that enough trucks are made available to haul production from a site before a storage facility fills up. When a facility fills to capacity, it shuts down, which means that production from that facility is deferred production, or deferred revenue. For this reason, using Big Data to solve logistical problems could create a huge near-term benefit for a company.
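
As a rough illustration of the arithmetic behind that logistics problem, the sketch below estimates how long a tank can keep filling before production must be shut in, and flags when a truck has to be dispatched. The tank size, fill rate and truck lead time are hypothetical values, not figures from the article.

```python
# Minimal sketch: estimate when a storage tank will reach capacity and
# decide whether a truck must be dispatched now to avoid shutting in
# production. All names and numbers are illustrative.

def hours_until_full(capacity_bbl: float, level_bbl: float,
                     fill_rate_bbl_per_hr: float) -> float:
    """Hours remaining before the tank hits capacity at the current fill rate."""
    if fill_rate_bbl_per_hr <= 0:
        return float("inf")  # tank is not filling, so there is no deadline
    return (capacity_bbl - level_bbl) / fill_rate_bbl_per_hr

def must_dispatch_now(capacity_bbl: float, level_bbl: float,
                      fill_rate_bbl_per_hr: float,
                      truck_lead_time_hr: float) -> bool:
    """True when waiting any longer risks deferred production."""
    return hours_until_full(capacity_bbl, level_bbl,
                            fill_rate_bbl_per_hr) <= truck_lead_time_hr

# A 400-bbl tank at 250 bbl, filling at 20 bbl/hr, with a 6-hour truck
# lead time: (400 - 250) / 20 = 7.5 hours of slack, so no dispatch yet.
print(must_dispatch_now(400.0, 250.0, 20.0, 6.0))  # False
```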

The trend of analyzing Big Data for decision-making and forecasting has prompted discussions of introducing automated controls that alert operators to problems. Oil and gas companies are now seeking systems that can warn them of an event four or six hours before it occurs.

“Once you’ve got analytics running and predictive environments, the question becomes, ‘How do you get the system to tell you, hey, you might run into this problem?’ How do you build in automated controls so the system can rectify itself?”
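
A minimal sketch of that alert-then-rectify pattern might look like the following: fit a simple trend to recent sensor readings, project it a few hours ahead, and trigger a corrective step when the projection breaches a limit. The sensor values, the limit and the "corrective action" are all assumptions for illustration, not anything described in the article.

```python
# Hedged sketch of a predict-ahead alert with an automated response.
# Fits a least-squares trend to recent hourly readings, projects a few
# hours forward, and raises an alert with a placeholder corrective step.

from statistics import mean

def projected_value(readings: list[float], hours_ahead: float,
                    interval_hr: float = 1.0) -> float:
    """Extrapolate a simple least-squares trend over evenly spaced readings."""
    xs = [i * interval_hr for i in range(len(readings))]
    x_bar, y_bar = mean(xs), mean(readings)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, readings)) / \
            sum((x - x_bar) ** 2 for x in xs)
    return readings[-1] + slope * hours_ahead

def monitor(readings: list[float], limit: float, hours_ahead: float = 6.0) -> str:
    """Alert if the projected reading breaches the limit within the horizon."""
    projected = projected_value(readings, hours_ahead)
    if projected >= limit:
        # Self-rectifying step: a real system might choke back a well or
        # reroute flow; this string is only an illustrative placeholder.
        return f"ALERT: projected {projected:.0f} exceeds limit {limit:.0f}; corrective action triggered"
    return "OK"

# Readings rising ~23.5 units/hr project to ~1021 in 6 hours, above 1000.
print(monitor([810.0, 830.0, 855.0, 880.0], limit=1000.0))  # ALERT
```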

This automation will be critical to optimizing operations in a safe, efficient manner, particularly in ultra-deepwater exploration. Given the expense of rig day rates and the engineering, safety and efficiency demands of operations, companies are doing “whatever they can” to prevent downtime, Sperrazza noted.

“If they can drill 10 feet per minute faster, that drives profitability and makes the company more efficient and competitive.”

“We’re always exploring for oil at the edge of data, and new drilling environments and reservoirs are typically outside the range of existing data,” Sperrazza noted.

Oil and gas companies also must track which well leases are profitable, which ones are set to expire, and what their contractual drilling obligations are, said Charles Karren of Oracle in an interview with Rigzone. Companies are tracking structured data, such as production data from sensors, and unstructured data, such as emails or drilling reports filed in a cabinet. They must also manage a large volume of data from sources such as vendors, and track service crews, truck traffic, equipment and hydraulic fracturing water usage.
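
To make the lease-tracking problem concrete, here is a hedged sketch of the kind of structured record such a system might keep. The fields, dates and figures are hypothetical and do not describe any Oracle product.

```python
# Illustrative lease record covering the three questions Karren raises:
# is the lease profitable, when does it expire, and what drilling
# obligations remain outstanding. All values are made up.

from dataclasses import dataclass
from datetime import date

@dataclass
class Lease:
    name: str
    expires: date
    monthly_revenue: float
    monthly_cost: float
    wells_committed: int   # drilling obligations under the lease terms
    wells_drilled: int

    def is_profitable(self) -> bool:
        return self.monthly_revenue > self.monthly_cost

    def obligations_outstanding(self) -> int:
        return max(0, self.wells_committed - self.wells_drilled)

    def days_to_expiry(self, today: date) -> int:
        return (self.expires - today).days

leases = [
    Lease("Block A-12", date(2026, 6, 30), 1_200_000.0, 800_000.0, 4, 4),
    Lease("Block B-07", date(2025, 12, 31), 300_000.0, 450_000.0, 2, 1),
]

today = date(2025, 10, 1)
for lease in leases:
    print(lease.name, lease.is_profitable(),
          lease.obligations_outstanding(), lease.days_to_expiry(today))
```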



WHAT DO YOU THINK?



Thomas Speidel  |  March 05, 2014
I enjoyed reading the article, since it closely aligns with how I have envisioned analytics helping oil and gas companies. However, it's a myth that the more data you have, the closer you get to some objective measure. It is not the size of the data that matters. Most data isn't big and, frankly, does not need to be. "Risk" or "probability of success" are not a function of the size of the data. In fact, notwithstanding regulatory and compliance obligations, too much information presents many challenges, not just to store and manage, but also to analyze (in statistics we call it the curse of large numbers).

Oil and gas is capital intensive and risk prone. Hence, organizations in this industry ought to be cautious before applying the learning, methods and strategies of large internet-based organizations. To a company like Google, serving the wrong ad to a user has an inconsequential cost. To an oil and gas company, the cost of improper decision-making can have a severe negative impact on profit, safety or productivity. What we need is just the right amount of data, and that may mean throwing away some data (do we really need temperature data by the minute? Do we really think it is going to improve our state of knowledge?).

We also need to approach problems problem-first rather than data-first, which, for some bizarre reason, appears to have been the modus operandi. We also need to rethink how we are going to use data. Simply finding ways to do more of the same, that is, overwhelming users with even more data, is hardly going to create any meaningful change. And creating even more summary metrics will not help us understand an outcome. I echo the author when she writes that we need predictive tools to look forward. We also need explanatory models to better help us understand the various contributors to a problem, and if and how we can control those. None of this is new, by the way; these methods have been used successfully for many decades in other fields. It has just taken that long for organizations to catch up.
James Baird  |  December 31, 2013
This is an exciting article, especially the part about storing big data to meet audit and regulatory needs.

