I love spreadsheets, BUT…

Like any other engineer, I love spreadsheets. They are very powerful, easy to use, and can be employed in a wide variety of situations. I use them almost daily for personal tasks and, to a limited extent, as handy tools at work.


The danger comes from trying to use them as corporate tools. Many companies still use spreadsheets to collect data for daily morning reports. The problem is that once the company has accumulated hundreds of those files, from many wells, it becomes difficult to consolidate the information for benchmarking, comparisons, planning, analytics, and so on. Spreadsheets can serve as a data entry tool, but the information should be stored in a single database.
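As an illustration, the consolidation step could be a small script that loads every daily-report file into one database table. This is only a sketch: the column names (`well_name`, `report_date`, `operation`, `hours`) and the assumption that reports are available as CSV exports are hypothetical, not taken from any particular company's format.

```python
import csv
import sqlite3
from pathlib import Path

def consolidate_reports(report_dir: str, db_path: str) -> int:
    """Load every daily-report CSV in report_dir into one SQLite table.

    Column names are hypothetical; a real morning report would have
    many more fields. Returns the number of rows loaded.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS daily_reports (
               well_name   TEXT,
               report_date TEXT,
               operation   TEXT,
               hours       REAL)"""
    )
    rows = 0
    for csv_file in sorted(Path(report_dir).glob("*.csv")):
        with open(csv_file, newline="") as f:
            for rec in csv.DictReader(f):
                conn.execute(
                    "INSERT INTO daily_reports VALUES (?, ?, ?, ?)",
                    (rec["well_name"], rec["report_date"],
                     rec["operation"], float(rec["hours"])),
                )
                rows += 1
    conn.commit()
    conn.close()
    return rows
```

If the reports live in native `.xlsx` workbooks rather than CSV, a library such as openpyxl could read them directly; the consolidation logic stays the same. The point is that entry can still happen in a spreadsheet, while the record of truth sits in one database.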

Another use is to pull data from corporate databases and perform the analysis in spreadsheets. If you are doing it for yourself, that is fine. But if it is to be shared or used within the company, it poses some problems. A simple example is when an engineer asks IT for an extract from the database "to do some analysis". Once the data arrives, the engineer starts "preparing" it, which means QC, fixing wrong values, changing codes, removing outliers, and so on.

Then the data is aggregated according to the engineer's own preferences. After a substantial amount of work, it is ready to be presented. During the presentation, someone questions the offset wells used and asks for another study with a different dataset. The engineer goes back to square one, asking IT for more data, and the cycle repeats. This involves an enormous amount of rework, and not everyone is willing to re-analyze "what-if" scenarios on a weekly basis.

What may then happen is that another engineer is asked to produce similar work in another field, using their own criteria, since they do not know what the first engineer has done. When the two reports arrive on the common boss's desk, they are not comparable because, for example, they use different aggregation processes. Without going any further into this data conundrum, it becomes clear that using local, separate files and processes, based on personal preferences, leads to massive confusion and a loss of productivity. The data needs to be live from the main database, and the reporting process common to all users involved.

The solution is to replace the spreadsheet in this role with a corporate application that reads data live from the databases and provides common data combination, aggregation, and cross-plotting facilities.
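The "common reporting process" part can be as simple as one shared, versioned aggregation that every report runs against the live database, instead of each engineer rolling their own. A minimal sketch, reusing the hypothetical `daily_reports` table and column names from above (not any real schema):

```python
import sqlite3

# One shared aggregation definition. Because every engineer calls the
# same query, reports landing on the boss's desk are comparable.
AGG_SQL = """
    SELECT well_name,
           operation,
           SUM(hours) AS total_hours,
           COUNT(*)   AS n_reports
    FROM daily_reports
    GROUP BY well_name, operation
    ORDER BY well_name, operation
"""

def standard_aggregation(db_path: str):
    """Run the company-wide aggregation directly against the live database."""
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(AGG_SQL).fetchall()
    finally:
        conn.close()
```

A "what-if" scenario then becomes a parameter change to a shared query rather than a week of manual spreadsheet surgery.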



Call For Abstracts PNEC 2015

PNEC is a very important conference focused on data management and integration to enhance profitability and productivity in oil & gas.

Please submit a 150-400 word abstract by October 24, 2014 at:


Abstracts should summarize a noncommercial, technical presentation about solutions, approaches, areas of technology, technology application, theory, case studies on application, evaluation, results, management processes and application, impact on value, and/or other management and technical benefits and costs.

Continuous Process Improvement in Drilling

In order to understand the role of drilling data in field development, we need to look at a well's life cycle. One simple framing uses PDCA (Plan, Do, Check, Act), the four basic stages of the well operations process, as shown in the figure below.



PDCA was popularized by W. E. Deming and is designed to lead to continuous process improvement. In this cycle, companies:

  • Plan the next well to drill;
  • Do (execute) the well;
  • Check what went right and what went wrong; and
  • Act with proposed changes based on this analysis, so the next well will include improvements.

Drilling data plays a fundamental role in the Planning and Checking stages of the process. For planning, users can draw on historical offset data from wells whose context is similar to that of the newly planned well.

The checking phase uses internal and external benchmarking data to compare the performance of each operation in terms of time and speed. The identification of anomalies in this comparison can lead to changes in the workflow and create improvement actions for the next well.
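The anomaly identification in the Check phase can be sketched as a simple comparison of each operation's actual time against a benchmark, flagging anything that overruns by more than a tolerance. The operation names, figures, and the 25% default tolerance below are illustrative assumptions, not an industry standard:

```python
def flag_anomalies(actual_hours, benchmark_hours, tolerance=0.25):
    """Flag operations whose actual time exceeds the benchmark by more
    than `tolerance` (a fraction, e.g. 0.25 = 25% over benchmark).

    Returns a list of (operation, actual, benchmark) tuples that are
    candidates for a workflow change on the next well.
    """
    flagged = []
    for op, hours in actual_hours.items():
        bench = benchmark_hours.get(op)
        if bench is None:
            continue  # no benchmark available for this operation
        if hours > bench * (1 + tolerance):
            flagged.append((op, hours, bench))
    return flagged
```

Each flagged operation becomes a candidate improvement action feeding the Act stage, closing the PDCA loop before the next well is planned.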

Data needs to flow from one stage to the next. If it stagnates in one place, it has no value. The faster we can loop through PDCA, the more value we add to the business.


Data Security

Strictly speaking, "data security" is not part of "data management", since it operates "on" the data. It adds another dimension to data management and is not limited to drilling data, or even to E&P, but affects all data and data access in a company. It becomes a specific problem for E&P because of the increasing integration of rigs and other facilities into corporate networks. Once an attack hits a website, server, computer, or personal device, it can potentially affect production facilities, a drilling rig, and more.

Data Quality Control

Data Quality Control (QC) is like cleaning the toilet: everyone wants to use a clean toilet, but no one likes to clean it! The reason is simple. As the sign in a shared toilet says, "clean up after yourself for the next guest" — it would take minimal work if everyone did their share in the first place (sensors included). But that is not the case, as not everyone does.