Supercomputing Needs in Oil, Gas Industry to Keep Growing

The oil and gas industry continues to ramp up its supercomputing capabilities as it explores for oil and gas in complex frontier basins and seeks to enhance production recovery.

Oil and gas majors such as Total S.A. and BP plc have ramped up, or plan to ramp up, their supercomputing capabilities to meet their growing high-performance computing (HPC) needs.

Earlier this year, Total reported it would upgrade the storage capacity of its Pangea supercomputer from the current 16 petabytes to around 26 petabytes by 2016.

The upgrade was planned as an option when Pangea Phase 1 was installed, said Philippe Malzac, executive vice president of information systems at Total, in a statement to Rigzone. The company decided to pursue the upgrade because of the growth in seismic acquisition density and the use of new, more sophisticated algorithms, as well as to meet the needs of Total’s explorers and reservoir engineers.

The move towards exploring for oil and gas not only in deepwater, but away from known provinces and into promising, but complex, frontier domains, is driving the need for greater supercomputing power in the oil and gas industry, François Alabert, vice president of Exploration Technologies at Total, told Rigzone.

Total is exploring prospects and fields in the deeper portions of petroleum basins, sometimes hidden 30,000 feet beneath geological strata, which makes them difficult to image with the acoustic seismic waves operators use to map the subsurface.

“Increasingly precise seismic algorithms are therefore needed to see and map them, to increase the chance of success and to design safe, cost-effective well operations to probe them,” Alabert said.

Oil and gas companies are also exploring complex frontier oil and gas plays, either offshore on extreme continental margins or onshore in mountainous areas, where geological risks are high and traps are more subtle, requiring substantial computing power for imaging, Alabert explained.

“Last but not least, maximizing oil and gas recovery and value from all of our fields uses increasingly precise engineering models to optimally design development schemes and production methods: petroleum engineering represents a growing demand in supercomputing resources,” Alabert noted.

The oil and gas industry has been investing continuously in high-performance computing since 2000, increasing its computing power roughly tenfold every three years, Alabert explained.
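
Taken at face value, that rate compounds quickly. The following minimal sketch is a back-of-the-envelope illustration only, assuming the year-2000 baseline is normalized to 1 and the tenfold-per-three-years figure holds throughout; it shows the cumulative factor such growth would imply:

```python
# Back-of-the-envelope illustration of the growth rate described above:
# roughly a tenfold increase in computing power every three years.
# The 2000 baseline is normalized to 1; these are not actual capacity figures.

BASELINE_YEAR = 2000
GROWTH_FACTOR = 10   # assumed ~10x gain per period
PERIOD_YEARS = 3     # assumed length of one growth period

def relative_power(year: int) -> float:
    """Computing power in a given year relative to the 2000 baseline."""
    periods = (year - BASELINE_YEAR) / PERIOD_YEARS
    return GROWTH_FACTOR ** periods

for year in (2000, 2003, 2006, 2009, 2012, 2015):
    print(f"{year}: ~{relative_power(year):,.0f}x the 2000 baseline")
# By 2015 the compounded factor is 10**5, i.e. about 100,000x.
```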

“The primary objective has been to improve seismic imaging in deeper and more complex parts of the subsurface, thanks to ever more precise algorithms.”

“Without such computational power, it is fair to say that exploration and development of many large offshore petroleum basins like the Gulf of Mexico, the Gulf of Guinea, or Brazil’s deep offshore, would not have been possible.”

In late May, Australia-based Woodside Energy said it would use IBM’s Watson supercomputer as part of the company’s next steps in data science. The Watson supercomputer – which beat well-known Jeopardy game show human competitors Ken Jennings and Brad Rutter in a 2011 match – will be trained by Woodside engineers, allowing users to surface evidence-weighted insights from large volumes of unstructured and historical data contained in project reports in seconds.

Watson is part of Woodside’s strategy of using predictive data science to leverage more than 30 years of collective knowledge and experience as a leading liquefied natural gas operator and maintain a strong competitive edge, Woodside said in a May 27 press statement. Through this strategy, the company will gain a new toolkit of evidence-based predictive data science to bring costs down and boost efficiencies across the company.

“Data science is the essential next chapter in knowledge management, enabling the company to unlock collective intelligence,” said Shaun Gregory, Woodside senior vice president strategy, science and technology, in the May 27 press release. “Data science, underpinned by an exponentially increasing volume and variety of data and the rapidly decreasing cost of computing, is likely to be a major disruptive technology in our industry over the next decade.”

