Three mistakes the oilfield keeps making about Big Data and what to do about them
I’ve often heard “the upstream industry has handled big data forever. G&G data is huge, so we are the experts”. Even if the basic premise were correct (it’s not: plenty of other industries work with larger datasets), size isn’t everything. My response used to be that we don’t have big data; we have lots of data that lives in silos, which limits its value. That’s true, but it doesn’t mean much to a lot of people. Instead, I’ll offer up an analogy.
One of the greatest innovators of the industrial age, Henry Ford is renowned for having had the vision to give people what they didn’t know they wanted – he is famously quoted as saying that if he had asked, people would have told him they wanted a faster horse. As it turns out, he never said this, but he did something far more powerful.
His genius lay in the adaptation and integration of existing designs, tools, and processes to revolutionize automobile manufacture. He delivered on something that he actually did say: a vision of a mass-produced car for “the great multitude…constructed of the best material…so low in price that no man making a good salary will be unable to own one.”
Ford’s game-changing idea was not the improvement of the car itself, but that a good, reliable car at a fair price would sell like crazy – and the world was never the same again.
A successful big data story follows this same principle: integrate data, tools and processes to revolutionize business performance. The idea isn’t to just incrementally improve existing systems, but to explore, discover, and implement new ways to deliver value. This is where we make our first mistake.
Mistake 1: Thinking small about big data
Forget about technology-focused initiatives to give faster answers to known questions. What are the major priorities of your organization? What kind of information is needed to support decisions to achieve those goals? What range of data is needed to create that information? The system and processes that make that data available should be flexible, connected, and extensible to answer the next set of questions that power your business.
It’s what Walmart did when they started to connect store transactions with demographics, weather patterns, trucking, vendor supply chains, and more – to enable them to optimize inventory and logistics, and beat everyone on price.
Closer to home, it’s what ConocoPhillips did when they decided to connect wellhead SCADA, transport logistics, equipment configs, maintenance planning, and more – to optimize Eagle Ford production and boost their balance sheet with a more accurate EUR.
In both of those cases, success demanded a focused effort to implement analytics at scale. It didn’t happen overnight, it wasn’t a single project, but a comprehensive program for which no single part of the organization was wholly responsible.
Accelerating existing processes may get you a faster horse, and that’s probably not a bad thing, but don’t be surprised if you’re beaten by a Ford. It may well require a cultural shift, which is always difficult. Which brings us to our next issue…
Mistake 2: It’s an IT thing. Just give me an easy button
Give me what I want with no effort on my part? I’m in! Buying from Staples may well be as easy as their marketing claims, and the appeal is undeniable. But nothing is ever that simple under the covers – especially making something “easy”.
Lots of technical systems are unnecessarily complicated, not just in terms of software, but in the wider process. Studies show that engineers spend over 75% of their time finding and preparing the data they need to do their jobs because systems are so disjointed. Naturally, this creates huge frustration.
Taking a big data approach can drastically reduce data sourcing time as well as adding analytic insight.
It may seem quicker to sidestep the “official” system and do your own thing on your desktop, or in the cloud, but a bunch of point solutions won’t fix the problem, and will never be an easy button. Oh…and free open source software? Make sure that you understand that we’re talking free speech, not free beer.
To build and maintain an analytic capability at scale is a significant undertaking that can only succeed if all parties are fully committed to its success and trust in their partners. This industry should be very comfortable with this kind of setup, because we see it in other areas of our operations.
Operators stopped doing their own drilling and completions long ago, choosing instead to focus their expertise on how to best develop assets and pay service companies to perform the work. To get their gear to the wellsite, service companies don’t build their own trucks; they extend a base platform from a specialist truck builder. And those trucks are assemblies of subsystems from numerous suppliers, each with their own expert knowledge of brakes, hydraulics, navigation systems, etc.
It comes down to understanding between businesspeople and IT support people, each trusting the other’s expertise to do their part without imposing their opinions unduly on the other. Without such trust, a “solution” for one group can be a nightmare for others, and a losing proposition for the organization. Get the balance right, and analytics at scale can deliver enormous value.
Mistake 3: We’ll start as soon as we’ve fixed the data
A few years ago, I sat through a presentation given by a major E&P representative, reviewing a project called “Backbone”. This was a mammoth data management project with the noble goal of cleaning up and modeling all of their global operational data so that their engineers could analyze it. After seven years on the project, the speaker told us, they were now “looking for some quick wins”!
This kind of ocean-boiling approach never works. Don’t even try. Yes, data quality is always a problem, and yes, you have to clean it up if you want to support the best decisions, but the way to do this is to start small and focus on answering a specific problem, working together with subject matter experts and fixing things as you go.
If that sounds contrary to my earlier advice not to think small, it’s not – provided you are committed to delivering an enterprise solution. Starting small will highlight your data issues, and will probably show that most of them relate to data access and availability, not quality. Even more important, you will quickly realize value by addressing a specific problem, and be able to build on that solution to answer other questions with the same data platform.
For example, a US land operator saw excessive bit failures in their mid-continent operations. Over the course of a couple of years, numerous suggestions as to the cause had been put forward and numerous equipment variations tried out. Everybody had their pet theories, but nobody had come up with an answer.
Everything else having failed, they brought all of the real time, well survey, and well operations data together in an analytic system capable of a foot-by-foot evaluation across thousands of wells.
Within six weeks, the answer was confirmed and an operational solution defined. Most of that time was spent on data sourcing and cleanup. If they had waited until they had a perfectly curated system for managing all of their data for all of their operations before starting the analysis, they would still be suffering the same failures today.
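The shape of that analysis – line up per-foot records from many wells, then look for an operating parameter that separates the failures from the non-failures – can be sketched in a few lines. This is a minimal illustration only, not the operator’s actual system; the column names (weight-on-bit, a failure flag) and the toy data are invented here:

```python
# Hypothetical sketch of a foot-by-foot evaluation across wells.
# Each record represents one drilled foot, tagged with the operating
# parameter in effect (here, weight on bit) and whether the bit failed.
# All field names and values are invented for illustration.

def failure_rate_by_bucket(records, param, bucket_size):
    """Bucket per-foot records by `param` and return, for each bucket,
    the fraction of drilled feet on which a bit failure occurred."""
    buckets = {}
    for r in records:
        key = int(r[param] // bucket_size) * bucket_size
        drilled, failed = buckets.get(key, (0, 0))
        buckets[key] = (drilled + 1, failed + r["bit_failed"])
    return {k: failed / drilled for k, (drilled, failed) in buckets.items()}

# Toy data standing in for thousands of wells' worth of per-foot records.
records = [
    {"wob_klbs": 12, "bit_failed": 0},
    {"wob_klbs": 14, "bit_failed": 0},
    {"wob_klbs": 31, "bit_failed": 1},
    {"wob_klbs": 33, "bit_failed": 1},
    {"wob_klbs": 35, "bit_failed": 0},
]

rates = failure_rate_by_bucket(records, "wob_klbs", bucket_size=10)
print(rates)  # failure rate per 10-klb weight-on-bit bucket
```

The point of the sketch is the structure, not the numbers: once disparate sources are joined into per-foot records, a simple aggregation over thousands of wells can surface a pattern that years of well-by-well theorizing missed.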
The legacy of such a project? A rich reserve of high quality data and a process for ongoing collection that can now be used to answer a range of other operational questions. And if the next question requires, say, maintenance data from ERP, that can be added and the range of answerable questions expands further.
The biggest mistake of all?
There is a common thread that connects these mistakes. It’s the notion that if we can just clear a specific hurdle, we will have a permanent fix and a permanent advantage.
Coincidentally, it is the same thing that brought Ford’s dominance in the nascent automobile market to a screeching halt in 1927. Ford changed the game, but then he couldn’t change his winning formula to compete when others added new rules. No matter how cheap he made the Model T, the new features and choices that GM added to Chevrolets rendered Ford’s only product obsolete, and he had no plan B.
The speed of technical development will always outpace the ability of any particular technology solution to deliver a permanent advantage. If you disagree, I’ve got a lightly-used Blackberry that you might like.
Ask yourself or your leaders how your organization is meeting the data challenge. If the answer is anything like “we researched technology some time ago and chose <insert vendor name here> as our solution provider”, or conversely “we researched this some time ago and chose to develop our own custom tools”, then take note of where the exits are.
To my way of thinking, a textbook answer would be along the lines of “from top to bottom, we see data as a vital asset, and we’re always looking for new approaches to create useful information that we can use to support our decision-making”. I’ve never heard that response from an oilfield company, but if you do, perhaps you’ve found the true Digital Oilfield company – one that will thrive in the Information Economy.
This article originally appeared in Infill Thinking.