Big Data Goes Bust
Back in 2012, Steve Lohr writing for the New York Times did average folks a favor and introduced us to “Big Data.”
Yes, analysts, data scientists and the people in Silicon Valley had already heard of it, but thanks to Steve and the Times, the rest of us found out that it’s a “meme and a marketing term, for sure, but also shorthand for advancing trends in technology that open the door to a new approach to understanding the world and making decisions.”
Sounds intriguing, huh?
Despite a few years of boom for the concept of “Big Data,” the term itself is old news. Big Data came with a “measure it all” attitude that eventually gave the phrase a bad rap, but smart companies didn’t throw the baby out with the bathwater. Data plays an expanded role in our daily lives, including how we do business. Now, instead of worrying about how big the data is, the winning trend is to focus only on what’s actually relevant.
Now fast forward to today: leveraging data for higher productivity and reduced costs in the oil patch is a reality. Not Big Data, but the right data. And to capitalize on this reality, we must think much, much smaller...
Forget Big Data
Why small data? In part, that’s because Big Data is a pain in the you-know-what. It’s hard work, plagued by data quality issues, and it’s expensive to boot.
Matt Turck points out that an organization that buys into Big Data (and yes, it has to be the whole organization) needs to “capture data, store data, clean data, query data, analyze data and visualize data.” And while software can handle much of it, “some of it will be done by humans,” and all of it needs to work as a seamless whole.
If this doesn’t sound easy, it’s because it’s not.
After you analyze anything and everything to find whatever correlations are available, your takeaways are sometimes not just unexpected but off the wall. Too often, coincidences get pegged as cause and effect, sending businesses on expensive wild goose chases.
At Slate, Will Oremus points out that Big Data’s problem isn’t that the data is bad; it’s the over-enthusiastic, fetishistic application of data to everything, as often as possible. During Big Data’s short-lived boom, data wasn’t being used in a careful, critical way.
Really, all of the data being collected was hard to interpret. When you’re collecting billions of data points – clicks or cursor positions on a website, turns of the drill bit or strokes of the pump – the actual importance of any single data point is lost.
So, what may seem to be a big, important high-level trend might not be a trend at all. There could be problems in the data, an issue with the methodology or some kind of human error at the well site.
Simply put, Big Data alone doesn’t always add up.
Oil, Gas and Data
While most oil and gas producers are relatively late to the Big Data party, we might consider ourselves lucky.
Why? To start, the wider the gap between the proxy and the thing you’re actually trying to measure, the more dangerous it is to place too much weight on it.
For example, well performance indicators are a function of numerous important factors outside of an engineer or supervisor's control.
Part of the draw of Big Data was the idea that you could find meaningful correlations even in very noisy (and seemingly completely unrelated) data sets, thanks to the sheer volume of data coupled with powerful software algorithms that can theoretically control for confounding variables. The model we’re describing would draw on a very wide range of well production correlations from across many basins and production environments to generate an “expected” set of production outcomes against which actual results could be compared.
Now, imagine for a minute that such a system were applied within the context of a single company or field – with just the wells in a particular area compared with one another.
Without the “magic” of Big Data, anomalies in total production in a given timeframe would be glaring. Thank you, small data!
No intelligent oilman (or woman) examining those numbers would be under the illusion that each well’s performance corresponded neatly with that well’s historical trend, let alone with the field of wells as a whole.
Moreover, it would be relatively easy to investigate each well on a case-by-case basis and figure out what was going on.
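The case-by-case, small-data check described above can be sketched in a few lines. This is purely an illustration, not anything the article specifies: the function name, the 30-day window, and the two-standard-deviation threshold are all assumptions, and real field data would need unit handling and missing-value logic.

```python
from statistics import mean, stdev

def flag_anomalies(daily_bbl, window=30, threshold=2.0):
    """Return the indices of days whose production deviates more than
    `threshold` standard deviations from the trailing `window`-day average.

    daily_bbl: list of daily production figures (e.g., barrels) for ONE well,
    in chronological order. Hypothetical helper for illustration only.
    """
    flags = []
    for i in range(window, len(daily_bbl)):
        history = daily_bbl[i - window:i]       # the well's own recent record
        mu, sigma = mean(history), stdev(history)
        # Skip perfectly flat history (sigma == 0) to avoid division issues
        if sigma > 0 and abs(daily_bbl[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# Hypothetical usage: 40 steady days, then a sudden 20-bbl drop on day 40
daily = [100.0 + (i % 3) for i in range(40)] + [80.0]
print(flag_anomalies(daily))  # the drop on day 40 is flagged
```

Each well is compared only against its own history, so a flagged day points straight to something a pumper or supervisor can go investigate – no proprietary cross-basin model required.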
However, a system based on the idea of Big Data would be far more opaque. Because the data set is big rather than small, it’s often crunched and interpreted by a third party using a proprietary mathematical model.
This lends a veneer of objectivity, but it forecloses the possibility of closely interrogating any given output to see exactly how the model arrives at its conclusions.
For example, some wells may have underperformed not because of a technical issue, but because of a skimming vacuum truck operator or a pencil-whipping pumper – common occurrences obvious to humans but invisible in the data.
You Are Your System
In the oil and gas market, the cost at which you extract oil and the system that enables you to do it is who you are. This ‘machine’ you either create or subscribe to is the reason your company is alive.
However, as the world changes, you and your operations must become street-smart if you’re going to succeed – perhaps, even survive.
Remember, all the work we’re doing is about getting results. We’re not innovating simply to be innovative. We’re not creating only to be creative. We’re not producing oil and gas simply to produce oil and gas. We’ve got a serious goal in mind.
To discover how to get there, we’re going to want to study the laws of production and cost to most effectively produce exactly what we’re looking for from our oil and gas assets, which is maximum profit.
That said, we look for a machine that has already proven itself: one that can be successfully implemented and enables you to scale at will, in the hands of ordinary people, all at a low cost, to give you an extreme competitive advantage.
And we simply believe that a new breed of machine serving up ‘small data’ to the right people at the right time is the quickest to get you there.
Sure, we’ll continue to hear a lot of talk about Big Data. And while the idea of Big Data has some merit, the operators who take on these types of challenges do so before they’ve come to understand their most chronic dysfunction.
And that underlying dysfunction is always a strategic problem, a workflow problem, something easily solved by reevaluating the way they communicate – never a technological problem.
For software to be successful, it must focus on the human element, not just the technical. And the companies that forget this are setting themselves up for failure.
This is why we must focus on small data, starting on the front line, in the field with the pumpers and field staff who collect it. Because Big Data is a bust for smart oil.
Greg Archbald is the founder and CEO of GreaseBook.