Six Tech Advancements Changing the Fossil Fuels Game
SUPERCOMPUTING & SEISMIC DIMENSIONS EINSTEIN WOULD APPRECIATE
Oil majors are second only to the US Defense Department in their use of supercomputing systems. That's because supercomputing is the key to determining where to explore next—and to finding the sweet spots based on analog geology.
What these supercomputing systems do is analyze vast amounts of seismic imaging data collected by geologists using sound waves. What's changed most recently is the dimension: When the oil and gas industry first caught on to seismic data collection for exploration efforts, the capabilities were limited to 2-dimensional imaging. Now we have 3-dimensional imaging that tells a much more accurate story.
But it doesn't stop there. There is 4-dimensional imaging as well. What is the 4th dimension, you ask? Time (and a nod to Einstein's theory of relativity). This 4th dimension unlocks a variable that allows oil and gas companies not only to determine the geological characteristics of a potential play, but also to watch how a reservoir is changing in real time. The sound waves rumbling through a reservoir reveal how its geology is changing over time.
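In practice, 4D seismic is often described as time-lapse analysis: a baseline 3D survey is compared against a later "monitor" survey of the same reservoir, and the differences flag where the geology or fluids have moved. The sketch below illustrates that idea with made-up NumPy arrays — the grid size, amplitude values, and threshold are all illustrative assumptions, not real survey data.

```python
import numpy as np

# Hypothetical illustration of 4D (time-lapse) seismic analysis:
# two 3D amplitude volumes of the same reservoir, acquired years apart.
# Shapes and values are invented purely for demonstration.
rng = np.random.default_rng(42)
baseline = rng.normal(size=(64, 64, 128))   # 3D survey at time t0
monitor = baseline.copy()                   # 3D survey at time t1
monitor[20:30, 20:30, 60:80] += 0.5         # simulated amplitude change
                                            # (e.g., fluid movement)

# The 4th dimension enters as a simple difference between surveys:
# wherever the volumes disagree, the reservoir has changed.
difference = monitor - baseline
changed = np.abs(difference) > 0.25         # threshold flags changed cells

print(f"cells flagged as changed: {int(changed.sum())}")
```

Real time-lapse workflows involve far more (survey repeatability corrections, noise suppression, inversion), but the core comparison is this difference-over-time step.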
The pioneer of geological supercomputing was MIT, whose post-World War II Whirlwind system was tasked with seismic data processing. Since then, Big Oil has caught on to the potential here, and there is no finish line to this race—it's constantly metamorphosing. What would have taken decades with the supercomputing technology of the 1990s can now be accomplished in a matter of weeks.
In this continual evolution, what matters is how many calculations a computer can make per second and how much data it can store. The fastest computer will get a company to the next drilling location before its competitors.
We are talking about MASSIVE amounts of data from constant signal loops from below the Earth's surface. For example, geologists generate sound waves using explosives or other methods; the waves penetrate deep below the Earth's surface, and the returning signals are sampled 500 times per second. Only a supercomputer could possibly process all this complex data and make sense of it.
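To get a feel for why only a supercomputer will do, here is a back-of-the-envelope estimate of the raw data rate. The 500-samples-per-second figure comes from the article; the receiver count, bytes per sample, and recording hours are illustrative assumptions for a large modern 3D survey, not sourced figures.

```python
# Rough estimate of raw seismic acquisition volume per day.
samples_per_second = 500     # sampling rate cited in the article
receivers = 100_000          # assumed channel count for a large 3D survey
bytes_per_sample = 4         # assumed 32-bit float amplitude
recording_hours = 12         # assumed acquisition time per day

daily_bytes = (samples_per_second * receivers * bytes_per_sample
               * recording_hours * 3600)
print(f"~{daily_bytes / 1e12:.1f} TB of raw samples per day")
```

Even under these modest assumptions the raw stream runs to terabytes per day, before any of the processing (stacking, migration, imaging) that actually turns it into a picture of the subsurface.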