“Intelligent Salesman”

I heard a joke recently about the DoF – the Digital (or Intelligent) Oil Field. It goes that the most intelligent component of the Intelligent Oil Field initiative is the salesman. As in any joke, there is some truth to it: it clearly points to the over-emphasis placed on the technology component when deploying DoF initiatives. As we all know, the People-Process-Technology triangle requires balance for proper development to occur. There are many vendors with good technology, so there is not much of a problem in this area.

The “process” component is much more of a challenge because it is harder to implement. In a nutshell, Project Management needs to be implemented better: we need strategic vision and better ways to plan, follow procedures, monitor, analyse, track safety, and so on.

The least developed point of the triangle is “people”. This involves not only training, management and performance reviews, a good working environment, and so on, but also an emphasis on collaboration between teams working towards a common goal – keeping in mind that a set of players does not necessarily form a team.

Some people are puzzled by the lack of results achieved by their DoF initiatives. They buy and install all the required technology but forget the other two components of the triangle.

How much is data worth?

Well, this is similar to the question: “How long is a piece of string?” It all depends on how you look at it. The cost of data can be measured by how much it costs to acquire, store, quality-control and communicate it. But the “value” of data resides in using it to improve the business. Cost and value are two extremes of the same axis.

Data acquisition is a necessary first step on the road to achieving value. Thereafter, we have to take three further steps before our data can yield value – data quality control, data communication and data storage. Most IT vendors operate in those areas with their commodity products. That’s the easy part – a part that usually adds costs to the data, but doesn’t produce much value.

As with any other business investment, the management of data needs to be seen from an ROI perspective. The amount of money corporations are prepared to spend on data acquisition, storage, communication and quality control is proportional to the value they believe they are getting from it. If the perceived value is not there, then anything spent will probably be considered a “high cost” expense. This makes it essential for corporations to understand that the actual value comes from how the data is retrieved and used as a Business Intelligence (BI) tool.

It is not always simple to measure the value added to the business, as there are other variables which, along with the data, improve business performance. The issue is the “perceived” value which users place on data, regardless of the underlying data management technology. For example, a manager could not care less whether the data is stored in an Oracle or MS SQL Server database, or how often the back-up is done. Users want readily available, reliable data which keeps them informed, from a data perspective, on how the business is going (KPIs). That’s where the data’s value is realised. The rest is just “cost”.

 

“I got an iPhone 6.” So what?

I’ve attended many conferences, workshops, and similar show-and-tell events about data management and data analytics. From time to time I find users who, while enquiring about a system’s capability, ask the following question: “Can I have this on my smart phone/tablet?” Usually the answer is “yes”, as most systems have some web interface that provides mobile “user-friendliness”.

This general question comes from the enquirer’s past experience of using mass-consumer functionality on their smart phone, such as banking, social media, etc. Those applications are very basic in terms of options, and few have any plotting functionality.

In the case of corporate applications, and more specifically in oil & gas, the reality is quite different. Would you imagine planning a well, analysing real-time operations, or monitoring a detailed production profile on a smart phone? Sure, it could be possible, but “would you”? The size and clunkiness of those interfaces would make your life more difficult than “cool”. In addition, there is the whole issue of security. If there is a feed of confidential data, it might need to be stored somewhere on the smart phone, which creates a potential security breach even if the communication between the server and the smart phone is encrypted.

I believe most of this functionality could and will become available as simple dashboards where minimal, consolidated information can be presented – green-yellow-red traffic lights for production or drilling, planned-vs-actual comparisons and the like. That would be enough to keep users informed and at ease, or alternatively set them worrying and digging deeper into the issue. There are a number of obvious advantages to using tablets/smart phones to follow critical operations, but they should have limited scope. A sketch of the idea follows.
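To make the traffic-light idea concrete, here is a minimal sketch in Python of how a dashboard tile might classify planned vs actual figures. The thresholds, function name and example values are illustrative assumptions, not taken from any particular product.

```python
# Hypothetical traffic-light classification for a planned-vs-actual dashboard tile.
# The 5% and 15% deviation thresholds are illustrative assumptions only.

def status(planned: float, actual: float,
           warn: float = 0.05, alarm: float = 0.15) -> str:
    """Return 'green', 'yellow' or 'red' based on relative deviation from plan."""
    deviation = abs(actual - planned) / planned
    if deviation <= warn:
        return "green"
    if deviation <= alarm:
        return "yellow"
    return "red"

# Example: daily production of 9,200 bbl against a plan of 10,000 bbl
print(status(planned=10_000, actual=9_200))  # -> "yellow" (8% below plan)
```

That is about all a phone-sized screen needs to show: the colour, not the full production profile.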

 

Rig automation, pero no mucho (but not too much)

During the last SPE ATCE conference in Amsterdam, I attended the session of DSATS, the SPE’s Drilling Systems Automation Technical Section. In case you don’t know, the purpose of DSATS is to accelerate the development and implementation of systems automation in the well drilling industry by supporting initiatives which communicate the technology, recommend best practices, standardize nomenclature and help define the value of drilling systems automation.

This was not the first time I had attended this meeting and I must say I have not seen much progress in recent years. It is, however, always interesting to hear different opinions on how to move forward. Basically, the discussion is caught between adding more technology to make the drilling rig fully automated and the ROI of such solutions. Does the cost really justify replacing humans on the drilling floor? Do the automated systems perform better than humans? To date, safety is the major beneficiary of any automation, which is a good reason to pursue automation further. Another issue is the emphasis on automating “drilling” operations, which represent only around 30% of total time on the rig. As was pointed out, the rig spends more time in “flat time” than in making hole, so automation needs to focus on those flat-time operations as well.

As a future trend, I believe we will see simpler rigs capable of working almost independently. They could even have much lower specifications than the current high-tech rigs. But they would be cheaper and safer to run. We could therefore have more rigs operating simultaneously than we can today. This is similar to Google’s driverless car, where a very basic and simple car can transport people from A to B autonomously. The business model would need to change to cater for such rigs.

Another interesting result that came out of the discussions was that one essential problem with drilling automation is data – more specifically, data quality control. It is illuminating to observe that all the sophisticated equipment used to automate a rig will rely on data from sensors, as well as on decisions made by human supervisors. So, no matter what degree of automation is implemented on a rig, there will still be a need to look into data management processes in order to achieve greater success.
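As a toy illustration of why sensor quality control matters before any automated decision is made, here is a short Python sketch that range-checks incoming readings. The sensor names and limits are invented for the example; real rigs would use their own channel definitions.

```python
# A minimal sketch of sensor-data QC before readings feed an automated
# control loop. Sensor names and limits are illustrative assumptions.

SENSOR_LIMITS = {
    "hook_load_klbf": (0.0, 750.0),
    "standpipe_pressure_psi": (0.0, 7_500.0),
    "rop_ft_per_hr": (0.0, 500.0),
}

def validate(reading: dict) -> dict:
    """Replace missing or out-of-range values with None rather than
    passing bad data downstream to the automation logic."""
    checked = {}
    for sensor, (low, high) in SENSOR_LIMITS.items():
        value = reading.get(sensor)
        checked[sensor] = value if value is not None and low <= value <= high else None
    return checked

# A spiked standpipe pressure and a missing ROP channel are both rejected
print(validate({"hook_load_klbf": 320.0, "standpipe_pressure_psi": 9_999.0}))
```

However clever the rig control system, it is only as good as the data this kind of gatekeeping lets through.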

I love spreadsheets, BUT…

Like any other engineer, I love spreadsheets. They are very powerful, easy to use and can be employed in a wide variety of situations. I use them almost daily for completing personal tasks and, to a limited extent, as handy tools in the business.

BUT…

The danger is in trying to use them in the corporate environment. Several companies are still using spreadsheets to collect data for daily morning reports. The problem is that once the company has collected hundreds of those files, from many wells, it is difficult to consolidate the information for benchmarking, comparisons, planning, analytics, etc. Spreadsheets can be used as a data entry tool, but the information should be stored in a single database, along the lines of the sketch below.
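To illustrate, here is a minimal Python sketch that sweeps a folder of per-well morning-report spreadsheets into a single database table. The folder layout, sheet name and database file are assumptions made for the example.

```python
# A sketch of consolidating per-well morning-report spreadsheets into one
# database table. Paths, sheet name and table name are assumptions.

import glob
import sqlite3

import pandas as pd

conn = sqlite3.connect("morning_reports.db")

frames = []
for path in glob.glob("reports/*.xlsx"):
    df = pd.read_excel(path, sheet_name="DailyReport")  # assumed sheet name
    df["source_file"] = path  # keep provenance for later QC
    frames.append(df)

# One consolidated table instead of hundreds of disconnected files
pd.concat(frames, ignore_index=True).to_sql(
    "daily_reports", conn, if_exists="replace", index=False
)
conn.close()
```

Once the data lives in one place, benchmarking and comparison become simple queries instead of a file-collection exercise.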

Another use is to extract data from corporate databases and perform the analysis in spreadsheets. If you’re doing it for yourself, that’s fine. But if the result is to be shared or used within the company, it poses some problems. A simple example is when an engineer asks IT for an extract from the database “to do some analysis”. Once this data is received, the engineer starts “preparing the data”, which means QC, fixing wrong values, changing codes, removing outliers, etc.

Then the data is aggregated according to the engineer’s personal preferences. After a substantial amount of work, it is ready to be presented. While the report is being presented, someone questions the offset wells used and requests another study with a different dataset. The engineer then goes back to square one, asking IT for more data, and so on. This involves an enormous amount of re-working. Not everyone is willing to repeat this process of re-analysing “what-if” scenarios on a weekly basis.

What may then happen is that another engineer is asked to produce similar work for another field, using their own criteria, as they do not know what the first engineer has done. When the two reports arrive on their common boss’s desk, they are not comparable because, for example, they use different aggregation processes. Without going any further into this data conundrum, it becomes clear that using local, separate files and processes, based on personal preferences, can lead to massive confusion and a loss of productivity. The data needs to come live from the main database, and the reporting process needs to be common to all users involved.

The solution is to replace the spreadsheet in this role with a corporate application that reads data live from the databases and provides data combination, aggregation and cross-plotting facilities on top of it.
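As a final sketch, here is what the shared, database-backed alternative might look like in Python: one aggregation definition, run against the live database by every user, so two engineers asking the same question get comparable answers. The table and column names are assumptions carried over from the earlier example.

```python
# A minimal sketch of a shared aggregation run against the live database,
# so every engineer uses the same definition. Table/column names are assumed.

import sqlite3

import pandas as pd

def wells_summary(db_path: str = "morning_reports.db") -> pd.DataFrame:
    """One common, versioned aggregation instead of per-engineer spreadsheets."""
    with sqlite3.connect(db_path) as conn:
        df = pd.read_sql_query(
            "SELECT well_id, report_date, depth_ft, npt_hours FROM daily_reports",
            conn,
        )
    return df.groupby("well_id").agg(
        report_days=("report_date", "nunique"),
        final_depth_ft=("depth_ft", "max"),
        total_npt_hours=("npt_hours", "sum"),
    )

print(wells_summary())
```

When the boss asks the “what if we use different offset wells?” question, the answer becomes a changed query parameter, not another week of manual rework.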