Everything is ‘big data’ these days; if you’re not using Hadoop on data lakes then you’re yesterday’s news. (Or tomorrow’s!)
Over the last few years we’ve spent some quality time with large datasets in the oil & gas and petrochemicals sectors, and here’s a challenge that’s not often mentioned:
Not knowing what you want to know
What do we want to know? It’s a simple question, but it does confuse some customers. Right now ‘big data’ is everywhere in the tech and mainstream media, so there’s a risk that customers have unrealistic expectations of what it can do. The same applies to customers whose interest is sparked by innovative visualisations – sometimes a simple scatter graph will do the trick, if you understand what you want to achieve.
Our preference is to focus on something niche, preferably with a small set of stakeholders and end-users. This allows us to deliver meaningful change quickly, even while we continue to elucidate more complex requirements.
For example, if furnace coking is a problem, it’s relatively easy to visualise, to build some algorithms that highlight each furnace’s coking state, and then to move towards predicting the required maintenance schedule.
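As a flavour of what that first step might look like, here’s a minimal sketch of classifying a furnace’s coking state and projecting a maintenance date. The thresholds, tube-metal temperature readings, and limits are entirely illustrative assumptions, not real plant values or our actual algorithms:

```python
# Hypothetical sketch: bucket a furnace by coking state from tube-metal
# temperature (TMT), then naively extrapolate towards a decoke limit.
# All thresholds and readings are illustrative assumptions.

def coking_state(tmt_c: float, clean_limit: float = 540.0,
                 alarm_limit: float = 590.0) -> str:
    """Classify a furnace by its current tube-metal temperature."""
    if tmt_c < clean_limit:
        return "clean"
    if tmt_c < alarm_limit:
        return "coking"
    return "decoke required"

def days_to_limit(readings: list, limit: float = 590.0):
    """Fit a simple linear trend to daily TMT readings and estimate
    days until the decoke limit is reached (None if TMT is not rising)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
             / sum((x - mean_x) ** 2 for x in xs))
    if slope <= 0:
        return None
    return (limit - readings[-1]) / slope

daily_tmt = [545.0, 547.5, 550.0, 552.5, 555.0]  # illustrative readings
print(coking_state(daily_tmt[-1]))       # "coking"
print(round(days_to_limit(daily_tmt)))   # 14 (days, at +2.5 °C/day)
```

Even something this crude gives operators a shared, visible measure of coking state to rally around, and a natural path towards proper predictive maintenance later.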
Energy Management is also popular as a first niche, particularly as it goes hand in hand with EU-ETS and Climate Change Levy reporting. There are usually significant savings to be made where utilities are concerned; with some simple calculations and simple visualisations, a lot of benefit can be delivered in short order.
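The ‘simple calculations’ really are simple. A sketch of the sort of thing we mean, with an assumed steam tariff and illustrative consumption figures rather than any real site’s numbers:

```python
# Hypothetical sketch: cost steam consumed above an agreed baseline.
# Tariff and flow figures are illustrative assumptions only.

STEAM_COST_PER_TONNE = 25.0  # assumed tariff, GBP/tonne

def excess_cost(actual_tph: float, baseline_tph: float,
                hours: float = 24.0) -> float:
    """Cost of steam used above baseline over a period (GBP)."""
    excess_tph = max(actual_tph - baseline_tph, 0.0)
    return excess_tph * hours * STEAM_COST_PER_TONNE

# Running 2 t/h above a 10 t/h baseline for a day:
print(excess_cost(12.0, 10.0))  # 1200.0 GBP/day
```

Put a daily figure like that in front of the right people and the savings conversation tends to start itself.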
With such successes achieved and good working practices in place, it’s then easy to start the more complex processes associated with introducing intelligent and predictive analytics.
Even when we do start to introduce the more complex areas, we do so with relatively simple steps, building all the time for the future. Wherever a customer is in their journey, we concentrate on achievable, useful steps towards a greater level of capability.