Six Tech Advancements Changing the Fossil Fuels Game
SUPERCOMPUTING & SEISMIC DIMENSIONS EINSTEIN WOULD APPRECIATE
Oil majors are second only to the US Defense Department in their use of supercomputing systems. That's because supercomputing is the key to determining where to explore next—and to finding the sweet spots based on analog geology.
What these supercomputing systems do is analyze vast amounts of seismic imaging data collected by geologists using sound waves. What's changed most recently is the dimension: When the oil and gas industry first adopted seismic data collection for exploration, capabilities were limited to 2-dimensional imaging. Now we have 3-dimensional imaging that tells a much more accurate story.
But it doesn't stop there. There is 4-dimensional imaging as well. What is the 4th dimension, you ask? Time (a variable Einstein would appreciate). This 4th dimension not only lets oil and gas companies determine the geological characteristics of a potential play, but also shows how a reservoir is changing in real time. Repeated seismic surveys over the same reservoir reveal how its geology is changing over time.
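At its simplest, this time-lapse (4D) approach amounts to comparing 3D seismic volumes of the same field shot at different times. The sketch below is a hypothetical illustration, not any company's actual workflow: it differences a synthetic "baseline" cube against a later "monitor" cube to flag where reservoir properties appear to have changed.

```python
import numpy as np

# Hypothetical 3D amplitude cubes (inline x crossline x depth samples)
# from a baseline survey and a later repeat "monitor" survey of the same field.
rng = np.random.default_rng(0)
baseline = rng.normal(size=(50, 50, 200))
monitor = baseline.copy()
monitor[20:30, 20:30, 80:120] += 0.5  # pretend fluid movement shifted amplitudes here

# 4D analysis in its simplest form: difference the two cubes.
# Large differences flag regions where the reservoir changed between surveys.
difference = monitor - baseline
changed = np.abs(difference) > 0.25
print(f"{changed.sum()} of {difference.size} samples show significant change")
```

In practice the two surveys must first be carefully cross-equalized so that acquisition differences don't masquerade as reservoir change, but the core idea is this subtraction.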
The pioneer of geological supercomputing was MIT, whose post-World War II Whirlwind system was tasked with seismic data processing. Since then, Big Oil has caught on to the potential, and there is no finish line to this race—the technology is constantly evolving. What would have taken decades with 1990s supercomputing technology can now be accomplished in a matter of weeks.
In this continual evolution, what matters is how many calculations a computer can make per second and how much data it can store. The fastest computer gets a company to the next drill site before its competitors.
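The effect of raw calculation speed is easy to see with back-of-envelope arithmetic. The figures below are illustrative assumptions, not numbers from the article: a fixed workload run on a roughly 1-GFLOPS machine of the 1990s versus a 10-PFLOPS modern supercomputer. (Real workloads have grown enormously alongside the machines, which is why the article's decades-to-weeks comparison isn't a pure speed ratio.)

```python
# Hypothetical workload: 10^18 floating-point operations for a full-field job.
job_flops = 1e18
flops_1990s = 1e9      # ~1 GFLOPS, a fast machine of that era (assumed)
flops_modern = 1e16    # ~10 PFLOPS, a current top-tier supercomputer (assumed)

seconds_per_year = 365 * 24 * 3600
years_then = job_flops / flops_1990s / seconds_per_year
seconds_now = job_flops / flops_modern

print(f"1990s machine: ~{years_then:.0f} years; modern machine: ~{seconds_now:.0f} seconds")
```

The same job that once took a human lifetime of compute finishes in under two minutes, which is why calculations-per-second is the headline metric in this race.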
We are talking about MASSIVE amounts of data from constant signal loops from below the Earth's surface. For example, geologists generate sound waves using explosives or other sources; the waves travel deep into the Earth, and the returning signals are sampled 500 times per second. Only a supercomputer could possibly process all this complex data and make sense of it.
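To get a feel for how quickly those samples pile up, here is a rough sizing exercise. Apart from the 500-samples-per-second rate from the text, every figure is an illustrative assumption (channel count, sample size, recording window), not survey data.

```python
# Rough sizing of a seismic acquisition (illustrative assumptions throughout,
# except the 500-samples-per-second rate mentioned in the article).
sample_rate_hz = 500       # samples per second per receiver (from the article)
receivers = 100_000        # assumed channel count for a large modern survey
bytes_per_sample = 4       # assumed: one 32-bit float per amplitude reading
recording_seconds = 24 * 3600  # one day of continuous recording

samples = sample_rate_hz * receivers * recording_seconds
terabytes = samples * bytes_per_sample / 1e12
print(f"~{terabytes:.1f} TB of raw amplitudes per day of recording")
```

Tens of terabytes per day of raw field data, before any processing, is why this work lands on supercomputers rather than workstations.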