The Big Challenges of Big Data for Oil, Gas

Rigzone Looks Back: The industry’s need to better understand the subsurface has driven oil and gas companies to collect a greater volume of data, more types of data, and to gather data more frequently and at higher speeds. As a result, oil and gas companies are grappling with how to leverage Big Data to improve their business strategies.

Now that the oil and gas industry has addressed the challenge of storing Big Data – the collection of data sets so large and complex that processing them with traditional data applications is difficult – oil and gas companies are working out how to leverage Big Data to improve their business strategies.

The desire to collect more data for informed decision-making is driving the industry’s need to address Big Data. The need to better understand the subsurface has driven oil and gas companies to collect more data, more types of data, more quickly and at higher frequencies, said Dale Sperrazza, director of marketing at Landmark Software & Services, a division of Halliburton, in an interview with Rigzone.

“Geoscientists in particular are always looking for more data – consequently, the amount of data being gathered in the well planning and drilling process has skyrocketed,” said Sperrazza.

The amount of seismic data has grown significantly because it is now available in finer increments. Data gathered from actual drilling and logging activity can also be measured in smaller increments thanks to tools now available, boosting the overall amount of data, Sperrazza noted.

The amount of data gathered from production activity has also increased due to the placement of downhole sensors that relay information to the operator on a real-time basis. A variety of sensors are being deployed downhole, including pressure, temperature and vibration gauges, flowmeters, and acoustic and electromagnetic sensors.

“Given the fact that drilling costs have gone up and returns on margins have gotten thinner – particularly in unconventionals – we’re trying to get more data and better refined data so when we make a decision we can remove the variables of risk to ensure the highest probability of success possible before you spend money to drill wells,” Sperrazza noted.

Communication technology for transmitting data from the field to company bases has improved over the past 15 years, which has also contributed to the surge of data. Previously, well data was relayed over satellite links that were not as robust; today, data can be viewed from a cell phone, which also adds to the overall volume of data, Sperrazza noted.

Evolving data has been a factor in the oil and gas business for a long time. 

“In its early years, data was paper-oriented. In the late 1980s, the first systems were introduced that allowed the industry to put data in a digital format to allow us to do the science,” Sperrazza said.

Initially, data stored in these systems was fragmented, with systems not yet capable of communicating with each other. The fragmentation disappeared as systems evolved to communicate with one another, and the requirement for a better understanding of the subsurface meant oil and gas companies had to integrate this data.

As drilling got more complicated and the cost and requirements for speed went up, demand for data became critical. With unconventional oil and gas fields, companies must constantly reevaluate plays every week, two weeks or 15 days, depending on the field.

“It used to be that we would evaluate an area, then go drill a well,” Sperrazza commented. “Today, we evaluate an area, drill a well, gather data in real-time, stick it into the system to plan for the next well, and then drill another well a week later. Multiply this action 50 times as 50 wells are drilled someplace else at the same time, and the amount of data that needs to be evaluated mushrooms.”

The oil and gas industry has largely overcome the basic challenge of how to store the massive amounts of data being collected in exploration and production. But now the industry must answer the question of what to do with all this data, Sperrazza noted. 

“How do you access it? How do you give users access to the data that is relevant to their question? How should it be served to users?” Sperrazza said. “Once the relevant data is retrieved, what kind of knowledge can be gleaned from it?”

Many companies are now addressing how to give users access to data in a quick, efficient manner. By doing so, companies can allow scientists or engineers to run analytics on data to identify trends and patterns and make assumptions about certain business decisions.

Oil and gas companies also must correctly pool together data from different databases and of different types.

“Once you have data, it’s not typically one location, it’s many locations, meaning that, before analytics is run, the data has to be federated, or brought together. We may have different databases with different data; you might have one on land data, I might have one on wells,” Sperrazza noted. 

The need to gather data into a format for analysis is one of the biggest infrastructure problems the oil and gas industry faces, but one that must be overcome for successful analytics to be created. 
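As a loose illustration of that federation step, the sketch below joins two hypothetical stores (a wells table and a land-lease table) on a shared key before any analytics is run. The table names, columns and use of pandas are illustrative assumptions, not a description of any vendor's system.

```python
# A minimal sketch of federating data from separate stores before analytics.
# Table names, columns and the pandas-based approach are illustrative
# assumptions only, not a specific vendor workflow.
import pandas as pd

wells = pd.DataFrame({
    "lease_id": ["L-101", "L-102", "L-103"],
    "well_name": ["A-1", "B-1", "C-1"],
    "daily_oil_bbl": [420, 310, 0],
})

land_leases = pd.DataFrame({
    "lease_id": ["L-101", "L-102", "L-104"],
    "lease_expiry": ["2015-06-30", "2014-01-15", "2016-03-01"],
    "royalty_rate": [0.125, 0.1875, 0.15],
})

# "Federate" the two sources on the shared key so analytics sees one view
federated = wells.merge(land_leases, on="lease_id", how="outer")
print(federated)
```

An outer join keeps leases that have no well yet and wells whose lease record lives in another system, which is usually the first gap a federated view exposes.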

Oil and gas companies are beginning to look at using analytics to improve decision-making in identifying new places to shoot seismic or drill, or in figuring out how to make the next well better, Sperrazza noted. “When you think about the problems of drilling in 20,000 feet of water in deepwater environments and the main causes of stuck pipe, you then try and come up with metrics so that you can estimate the probability of an event occurring,” Sperrazza noted.
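To make that idea concrete, the minimal sketch below trains a logistic regression on synthetic drilling metrics to score the chance of a stuck-pipe event. The features, data and model choice are illustrative assumptions, not the metrics Landmark or any operator actually uses.

```python
# A hypothetical sketch: estimating stuck-pipe probability from drilling
# metrics with logistic regression. Features and data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical history: [hook load (klbf), torque (kft-lbf), ROP (ft/hr)]
X = rng.normal(loc=[250.0, 20.0, 60.0], scale=[30.0, 5.0, 15.0], size=(500, 3))
# Hypothetical labels: 1 = a stuck-pipe event occurred, 0 = no event
y = (X[:, 1] + rng.normal(0, 2, 500) > 24).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score the next planned stand (illustrative numbers)
next_stand = np.array([[270.0, 26.0, 45.0]])
print(f"Estimated stuck-pipe probability: {model.predict_proba(next_stand)[0, 1]:.2f}")
```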

All aspects of the upstream oil and gas industry are facing challenges from Big Data, from seismic acquisition to reservoir interpretation to drilling, field planning, finding first oil and maximizing production.

Big Data presents similar challenges for both ultra-deepwater and onshore shale exploration and production, Sperrazza noted. Analytics is playing a critical role in field logistics for onshore shale production to ensure that capital being spent on transportation and storage facility infrastructure is being spent correctly. The need to analyze Big Data has grown not only for decision-making and efficiency, but also for ensuring compliance with regulations in the United States and overseas.

Sperrazza sees production field logistics as the one area that can benefit the most from the industry’s success in addressing Big Data. Companies can maximize the revenue they get from a field by ensuring that enough trucks are made available to haul production from a site before a storage facility fills up. When a facility fills to capacity, it shuts down, which means that production from that facility is deferred production, or deferred revenue. For this reason, using Big Data to solve logistical problems could create a huge near-term benefit for a company.
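The arithmetic behind that logistics problem is simple enough to sketch. The numbers below are purely illustrative assumptions, but they show the two quantities a scheduler cares about: truckloads needed per day and the time left before a full tank battery shuts in production.

```python
# Back-of-the-envelope field logistics sketch. All figures are hypothetical.
import math

production_rate_bbl_per_day = 1_200   # production routed to the tank battery
tank_capacity_bbl = 2_000             # total on-site storage
current_inventory_bbl = 1_400         # oil already in the tanks
truck_capacity_bbl = 180              # one truckload

# Truckloads needed per day just to keep pace with production
trucks_per_day = math.ceil(production_rate_bbl_per_day / truck_capacity_bbl)

# Hours until shut-in if no trucks arrive at all
hours_to_shut_in = (tank_capacity_bbl - current_inventory_bbl) / production_rate_bbl_per_day * 24

print(f"Truckloads needed per day: {trucks_per_day}")
print(f"Hours until the battery fills without hauling: {hours_to_shut_in:.1f}")
```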

The trend of analyzing Big Data for decision-making and forecasting has prompted discussions of introducing automated controls to alert operators to problems. Oil and gas companies are now seeking systems that can alert them to an event that might happen four or six hours from now.

“Once you’ve got analytics running and predictive environments, the question becomes, ‘how do you get the system to tell you, hey, you might run into this problem?’ How do you build in automated controls so the system can rectify itself?”
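A very small sketch of that kind of look-ahead alert is shown below: it fits a linear trend to recent readings from a hypothetical pressure sensor and estimates how many hours remain before a threshold is crossed. Real predictive drilling systems are far more sophisticated; the sensor, threshold and simple extrapolation here are assumptions for illustration only.

```python
# A conceptual sketch of a "tell me hours ahead" alert using a linear trend.
# Sensor, threshold and horizon are hypothetical.
import numpy as np

def hours_until_breach(readings_psi, threshold_psi, sample_interval_hr=0.5):
    """Extrapolate a linear trend and return hours until the threshold is crossed."""
    t = np.arange(len(readings_psi)) * sample_interval_hr
    slope, intercept = np.polyfit(t, readings_psi, 1)
    if slope <= 0:
        return None  # flat or falling trend: no predicted breach
    current = intercept + slope * t[-1]
    return max(0.0, (threshold_psi - current) / slope)

recent_annulus_pressure = [1850, 1862, 1871, 1885, 1893, 1908, 1921, 1935]
eta = hours_until_breach(recent_annulus_pressure, threshold_psi=2000)

if eta is not None and eta < 6:
    print(f"ALERT: threshold predicted to be exceeded in {eta:.1f} hours")
```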

This automation will be critical to optimizing operations in a safe, efficient manner, particularly in ultra-deepwater exploration. Given the expense of rig day rates, as well as the engineering requirements and need to address safety and efficiency in operations, companies are doing “whatever they can” to prevent downtime, Sperrazza noted.

“If they can drill 10 feet faster per minute, that drives profitability and makes the company more efficient and competitive.”

 “We’re always exploring for oil at the edge of data, and new drilling environments and reservoirs are typically outside the range,” Sperrazza noted.

Oil and gas companies also must track which well leases are profitable, which ones can expire, and what their contractual drilling obligations are, said Charles Karren of Oracle in an interview with Rigzone. Companies are tracking structured data, such as production data from sensors, and unstructured data, which can be emails or drilling reports filed in a cabinet. Oil and gas companies must also deal with a great deal of data coming from sources such as vendors, as well as with tracking service crews, truck traffic, equipment and hydraulic fracturing water usage.

Opportunities for Oil & Gas in Big Data

The requirements of Big Data are not coming from one part of the industry, but pose a dilemma across the entire industry spectrum. However, Sperrazza sees Big Data not as a dilemma but as an opportunity for an oil and gas company to create a competitive advantage. Companies that can capitalize on Big Data will come out ahead of their competitors.

This opportunity depends on the company – an exploration company may find an advantage in finding a new area and understanding how to drill a play, while an exploitation company that focuses more on production may see opportunity in maximizing production performance while reducing risk. A deepwater operator may see it as operational performance and risk reduction, while a land operator focused on unconventional factory style drilling may focus on increasing efficiency, Sperrazza noted.

Several things need to happen for a company to successfully take advantage of Big Data, including rethinking how data is stored and the processes through which it is accessed, so employees can get the most out of the company’s data. Successfully addressing the challenges of Big Data in oil and gas is about more than having the right tools; it is also about how the data is interpreted. Once analytics are running and real-time data starts streaming in, data must be evaluated against historical data to determine whether it’s within bounds, Sperrazza noted.
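That “within bounds” check can be as simple as comparing each incoming reading against the history it should resemble. The sketch below applies a plain three-sigma rule to hypothetical temperature readings; the data and the choice of rule are assumptions, not Sperrazza’s or Landmark’s method.

```python
# A minimal "within bounds" check: compare a new reading to the historical
# mean +/- 3 standard deviations. Data and the 3-sigma rule are illustrative.
import numpy as np

historical = np.array([96.2, 97.1, 95.8, 96.9, 97.4, 96.5, 97.0, 96.1])  # e.g. flowline temp, degC
mean, std = historical.mean(), historical.std(ddof=1)

def within_bounds(reading, k=3.0):
    return abs(reading - mean) <= k * std

for reading in (96.7, 103.5):
    status = "within bounds" if within_bounds(reading) else "out of bounds - review"
    print(f"{reading:6.1f} degC: {status}")
```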

The industry is leveraging Big Data to reduce non-productive time and enhance production which, even in small increments, can make a big impact on a company’s bottom line. To take advantage of Big Data, companies need to implement the infrastructure that will allow them to slice and dice data as needed, Karren noted.

“After the infrastructure is in place, companies then need to build statistical and predictive models based on leveraging structured data and unstructured data.”

Companies then need to be able to leverage these predictive models for real-time monitoring of production, with a system that can alert companies to issues and recommendations on addressing these issues.

Karren said he has not seen any operator or service company that currently has the ability to take a holistic view of drilling and production, because they lack the ability to integrate multiple data systems to enhance decision-making. Integrating these systems can allow companies to deliver information to workers in the field in a timely manner.

“A lot of the data being tracked by sensors, such as temperatures on rigs, is thrown out because companies don’t have the infrastructure for this data. Infrastructure goes beyond data storage, and is the capacity to warehouse and model data. If the infrastructure is in place, this data could be saved and analyzed to locate anomalies, such as opportunities to enhance production or health, safety and environmental issues,” Karren said.

With current oil prices, the incremental changes that can be made by successfully managing Big Data could mean big bucks at the end of the day, Karren noted.

Karren sees the need in the future for oil and gas companies to hire what is essentially a chief data scientist, somebody who could analyze completed drilling and production operations to see if any best practices or learnings could be carried over to other projects, especially those in the same plays. Instead of having to ask somebody who has worked on a previous project what to do, workers on a project could turn to these best practices to figure out the next step. Not only could drilling and production activity be monitored, but the performance of service crews could be evaluated as well.

Big Data Trend Seen in Financial, Regulatory Transactions

The Big Data trend is behind the fairly dramatic increase in data volumes in terms of financial transactions and regulatory data flowing through transaction processing systems for both upstream and downstream companies, said Dr. Werner Hopf, CEO of Dolphin Enterprise Solutions Corp., a provider of data management solutions for SAP applications. This increase has occurred as logistics has become an issue for oil and gas companies, such as the need to track sand and oilfield service supplies.

Hopf characterizes much of the data flow as logistics transactions. But given the way SAP is built, more or less as an accounting system, almost everything that happens in the logistics modules has an impact on financials.

“If you move inventory from one plant to another, it causes a financial posting to reflect this from an accounting perspective.”

Hopf notes that a plant can be a drilling rig or a production platform, and that almost all information is important from a regulatory data retention perspective.

Oil and gas companies have to keep track of more data, such as regulations in different U.S. states and different countries, to ensure that correct royalties are being paid, as well as regulatory agency requirements.

“There are lots of dimensions to regulatory data,” Hopf noted.

Much of the recent focus on regulatory data has centered on plant maintenance and tracking repairs.

Companies are collecting much more detailed information from a plant and equipment maintenance perspective, particularly in light of the Macondo oil spill in April 2010, given the environmental impact of that spill and the potential damages or liabilities. Regulatory data also includes tax liabilities; in some cases, companies will minimize tax liability by shifting revenues to low-tax areas, and the calculation of transfer pricing is one thing in which auditors are interested. Companies also are tracking regulatory and tax requirements in multiple states and worldwide, adding to the amount of data that companies must consider.

To address these issues and extract more value from Big Data, companies are turning to in-memory computing technology. As the demand to process growing volumes of data more quickly increases, so does the need for additional storage. Corporations investing in in-memory platforms quickly see their ROI through performance metrics. With in-memory computing and complementary data management strategies, data is kept as close as possible to the processors, enabling companies not only to store larger amounts of data at a lower cost but also to analyze it more quickly and efficiently.
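As a rough illustration of the in-memory idea (not of SAP HANA or any specific platform), the sketch below loads hypothetical posting records into a SQLite database that lives entirely in RAM and runs an aggregate query against it; the same query against an on-disk file would involve far more I/O.

```python
# A conceptual sketch only: the ":memory:" database keeps the data next to
# the processor instead of on disk. Table name and figures are hypothetical.
import random
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE postings (plant TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO postings VALUES (?, ?)",
    [(f"PLANT-{i % 50}", random.uniform(1e3, 1e6)) for i in range(200_000)],
)

start = time.perf_counter()
top_plants = conn.execute(
    "SELECT plant, SUM(amount) AS total FROM postings "
    "GROUP BY plant ORDER BY total DESC LIMIT 5"
).fetchall()
print(f"Top plants by posted amount ({time.perf_counter() - start:.3f}s):")
for plant, total in top_plants:
    print(f"  {plant}: {total:,.0f}")
```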

Harnessing Big Data can allow companies to achieve operational excellence, according to a Dolphin white paper, “Managing SAP Data and Processes to Maximize Operational Efficiency in the Energy Industry”. These opportunities include finding ways to reduce costs and improve operational performance, such as negotiating ad hoc discounts with vendors, processing more without additional headcount, maximizing cash flows, and streamlining IT infrastructures.

Companies that adopt these types of practices are automatically more resilient, and can manage price downturns more easily and reap a larger benefit from price increases, Dolphin noted in the report.

“Three or four years ago, everybody was excited and talking about the potential applications of Big Data and the information you could extract for your business by implementing Big Data solutions. Now the discussion has shifted towards more emphasis on the value-cost ratio. In the beginning, if a company had Big Data in the description, that project got funding. Now we see management looking more carefully at how much the project would cost and how the cost compares to the value the company will get out of it,” Hopf said.

“In the early adopter wave, companies would jump on the train without questions asked. Now, they’re asking how much does the train ticket cost,” Hopf noted.

Big Data Potential for Project Management?

The successful analysis of Big Data could potentially help oil and gas companies overcome planning challenges as the industry tackles large capital projects in North America and worldwide. In 2014, the industry will spend an estimated $700 billion on global capital expenditures (CAPEX). The industry is facing not only rising capital costs, but higher labor costs as well.

The U.S. shale boom has ignited CAPEX momentum, with more capital projects in oil and gas valued at over $1 billion, said Alan Richard, director of Deloitte Financial Advisory Services LLP, at the Deloitte Oil & Gas Conference in Houston Nov. 19.

Gary Fischer, general manager of consulting services for Chevron Corp. subsidiary Chevron Project Resources Co., said he is intrigued by Big Data and the possible insight and information that the industry could glean from it. To improve its track record with capital megaprojects, the industry must find ways to boost efficiency in its planning. Project scheduling, engineering and materials management are all issues the industry needs to address.

“Large megaprojects are fragile,” Fischer told attendees at the conference. “They’re really fragile, and they’re totally unforgiving.”

To successfully execute projects, companies must focus on defining projects to an adequate level of scope in the beginning, lining up project stakeholders so everyone is on the same page, and actively managing project risk by having plans in place for all issues that may arise.

WHAT DO YOU THINK?

Thomas Speidel | Mar. 5, 2014
I enjoyed reading the article since it closely aligns with how I have envisioned analytics can help oil and gas companies. However, it's a myth that the more data you have, the closer you get to some objective measure. It is not the size of data that matters. Most data isn't big and, frankly, does not need to be. "Risk" or "probability of success" are not a function of the size of data. In fact, notwithstanding regulatory and compliance obligations, too much information presents many challenges, not just to store and manage, but also to analyze (in statistics we call it the curse of large numbers).

Oil and gas is capital intensive and risk prone. Hence, organizations in this industry ought to be cautious before applying the learning, methods and strategies of large internet-based organizations. To a company like Google, serving the wrong ad to a user has an inconsequential cost. To an oil and gas company, the cost of improper decision-making can have a severe negative impact on profit, safety or productivity. What we need is just the right amount of data, and that may mean throwing away some data (do we really need temperature data by the minute? Do we really think it is going to improve our state of knowledge?).

We also need to approach things problem first, rather than data first, which, for some bizarre reason, appears to have been the modus operandi. We also need to rethink how we are going to use data. Simply finding ways to do more of the same, that is, to overwhelm users with even more data, is hardly going to create any meaningful change. And creating even more summary metrics will not help us understand an outcome.

I echo the author when she writes that we need predictive tools to look forward. We also need explanatory models to better help us understand the various contributors to a problem and whether and how we can control them. All of this is nothing new, by the way. We have been using these methods successfully for many decades in other fields. It has just taken that long for organizations to catch up.

James Baird | Dec. 31, 2013
This is an exciting article, especially about storing big data in order to meet audit and regulatory needs.

