
Taming Big Data: A Strategic Approach to Unconventional Analysis in Unconventional Reservoirs

Sponsored Content

Extracting hydrocarbons from unconventional reservoirs using horizontal drilling and multi-stage fracking is more complex and generates more data than traditional E&P activities. Accordingly, shale-focused operators and investors can no longer maximize their competitive advantage using conventional data analysis techniques like bivariate analysis. Especially in the current oil commodities market, advanced statistical modeling can be the most effective way to create material competitive advantage across the shale asset lifecycle.

Before big data was a buzzword, it could be loosely defined as data requirements that exceed commercially available technology, a boundary the oil and gas industry has been pushing since the dawn of computing. Cutting through the fluff, big data today can be defined as transforming existing data from a cost into a revenue-generating, cost-saving asset.

Technical definitions of big data typically cite the volume, velocity, and variety of data. Delivering value to the business, however, is often better served by a more strategic approach that focuses on three areas: data management, advanced analysis, and operationalization.

Data Management

Asset teams are spending increasing amounts of time outside their areas of core competence attempting to wrangle increasingly unwieldy unconventional reservoir data. Taming the high volume, variety, and velocity of today’s data is a necessary, though not sufficient, condition for creating strategic competitive advantage for shale-focused operators and investors. The foundation of data-driven analysis is the availability and quality of the underlying data. The firms with automated processes that continuously update, aggregate, normalize, and “QC” data from all available sources are already enjoying increased efficiency and, perhaps more importantly, are in a unique position to maximize the value of big data-specific analysis techniques like machine learning.
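As a concrete illustration of what such automation can look like, here is a minimal Python sketch that deduplicates, unit-normalizes, and QC-flags a table of well data. The column names (api_number, fluid_units, frac_fluid_gal, lateral_length_ft) and thresholds are hypothetical assumptions for the example, not a description of any particular firm's pipeline.

import pandas as pd

def qc_well_data(df: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical QC step: deduplicate, normalize units, flag suspect rows.
    out = df.drop_duplicates(subset="api_number").copy()

    # Normalize units, e.g., fluid volumes reported in barrels -> gallons
    # (1 bbl = 42 US gal).
    bbl = out["fluid_units"] == "bbl"
    out.loc[bbl, "frac_fluid_gal"] = out.loc[bbl, "frac_fluid_gal"] * 42.0
    out.loc[bbl, "fluid_units"] = "gal"

    # Flag physically implausible lateral lengths for engineer review
    # rather than silently dropping them.
    out["qc_flag"] = ~out["lateral_length_ft"].between(1_000, 20_000)
    return out

In practice, a step like this runs automatically each time new public or proprietary data arrives, so engineers review exceptions instead of rebuilding spreadsheets.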

Advanced Analysis

Traditional bivariate analysis, i.e., scatter plots and regression, is largely ineffective for large data sets, especially when parameters are highly auto-correlated, i.e., interrelated, and contemporary upstream data is highly auto-correlated. See Figures 1 and 2 below.

Figure 1: Traditional Bivariate Analysis (Noise)


Figure 2: Chord Diagram of Parameter Relationships (High Autocorrelation)

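To make the problem concrete, the short synthetic sketch below shows how a bivariate view can manufacture an apparently strong driver when two completion parameters move together. The parameters and coefficients are invented for illustration; this is not the data behind the figures.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Two completion parameters that tend to move together (collinear):
lateral = rng.normal(7_500, 1_500, n)             # lateral length, ft
proppant = 0.9 * lateral + rng.normal(0, 500, n)  # proppant, tracks lateral

# By construction, production depends only on proppant, plus noise.
production = 0.02 * proppant + rng.normal(0, 40, n)

# Bivariate analysis nonetheless "finds" a strong lateral-length effect,
# because lateral length is a proxy for proppant.
print(np.corrcoef(lateral, production)[0, 1])   # spuriously high
print(np.corrcoef(proppant, production)[0, 1])  # the real driver

Lateral length shows nearly the same correlation with production as proppant does, even though it has no direct effect; scatter plots alone cannot tell the two apart.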

The rise of companies like Google, Facebook, and eBay dramatically accelerated big data technologies, particularly as these firms released some of their core intellectual property to the public domain. In turn, wide availability of these innovations created opportunities for other industries to share in the massive value creation from big data. Among the more notable advanced analysis technologies is machine learning, a branch of statistics designed for big data (Harvard Business Review, July 2015).

Machine learning, applied correctly, is delivering material competitive advantage to shale-focused operators and investors. Fundamentally, machine learning exists to separate signal, a valuable consistent relationship in the data, from noise, random correlations that will not recur in the future. It is a computationally intensive, highly iterative process in which computers learn from the data rather than follow explicitly programmed instructions.
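A minimal sketch of that idea, using synthetic data and scikit-learn's standard cross-validation utilities: the model is scored only on held-out data, so random in-sample correlations (noise) earn no credit, while relationships that repeat out-of-sample (signal) do.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))             # 10 candidate parameters
y = 2.0 * X[:, 0] + rng.normal(size=500)   # only the first carries signal

model = RandomForestRegressor(n_estimators=200, random_state=0)

# Cross-validated R^2 rewards only relationships that hold out-of-sample.
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"out-of-sample R^2: {scores.mean():.2f} +/- {scores.std():.2f}")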

Figure 3: Signal Extracted Using Advanced Machine Learning


Machine learning is very powerful, but it is also very complex and requires a heterogeneous set of skills at both the strategic and implementation levels. Accordingly, McKinsey suggests that “companies embarking on machine learning should pursue the strategy wholeheartedly at the C-suite level and acquire existing expertise and knowledge to guide the application of that strategy” (McKinsey, An Executive’s Guide to Machine Learning, June 2015).

Operationalization

Virtually all of the 87% of O&G firms for which big data is a top-tier priority aspire to deliver actionable insights to their decision makers (GE/Accenture, Industrial Internet Insights Report for 2015). Very few of these firms have succeeded so far, as “the major limitation in [the big data] market is the lack of skilled personnel” (P&S Market Research, 2015). Converting geoscientists and petroleum engineers into data scientists and software engineers has not been effective. Similarly, data scientists and software engineers have not been successful on a stand-alone basis.

McKinsey also defines a “translator” role in the same guide; applied to the oil and gas industry, it suggests that data-driven value is maximized at the confluence of first principles and experience with data science and software engineering (McKinsey, An Executive’s Guide to Machine Learning, June 2015).

The most effective data-driven actionable insights are currently delivered through purpose-built workflows that encapsulate the requisite dimensions of big data and have been developed in close partnership with the oil and gas industry.

Conclusion

Many firms have effectively managed risk in unconventional reservoirs using an “emulate my most successful neighbor” strategy for evolving their drilling and completion techniques. The firms that continue to neglect big data and advanced analytic strategies are falling further behind every day. Moreover, unlike with the engineering-centric aspects of unconventional reservoir development, firms cannot simply call their preferred oilfield services provider and quickly enjoy a material upgrade to their big data strategy the way they can request different drilling rigs, completion materials, or completion techniques.

Many attempts to deliver more value from existing data have failed. While the industry should perform thorough due diligence before adopting new technology, those who choose to ignore advances are likely to fall behind. Proven big data solutions exist for shale-focused operators and investors and are currently being used to rapidly value assets, perform competitive analysis, reduce costs, and optimize completions.

For more information or questions about OAG Analytics, contact:

Luther Birdzell

info@OAGAnalytics.com

844.OAG.WELL (844.624.9355)

www.OAGAnalytics.com


Prior to founding OAG Analytics in 2013, Luther Birdzell held executive and technical leadership roles in both software product and services organizations. Most recently, Luther executed top-line initiatives for iTKO that were instrumental in positioning the company for its 2011 acquisition by CA Technologies (NYSE: CA) for $330MM. In 2007, Birdzell delivered the initial strategic vision that became the company’s critical pivot to develop Service Virtualization, the driving technology behind iTKO’s acquisition. At iTKO, Birdzell made distinguished contributions to technology, sales, and strategic delivery. He also led the growth of the global delivery team while substantially increasing services revenue. Birdzell holds two electrical engineering degrees from Dartmouth College.
