Halvard Ellingsen, CEO, and Gjermund Weisz, COO, Turbulent Flux
(Contributions by Johan Henriksson & Zongchang Yang)
First published by Globuc
Learn more about our Hybrid Technology Approach – On-Demand Webinar
It is happening! Data democratization, the practice of making data accessible to anyone at any time, is well on its way within the energy industry. Data availability is improving as innovation and transformation teams create open data environments that bring operational data into cloud solutions, where it can be managed through open APIs.
Cloud-native computing enables almost limitless scaling as well as secure data sharing, allowing companies to run more powerful computations and access data more easily. Companies that fail to fully adopt cloud-thinking in their operations will face a competitive disadvantage, largely because of the cost competitiveness, enhanced functionality and insights that cloud computing delivers, together with easy data sharing. In other words, cloud-thinking fosters collaboration!
Even so, the industry needs to do more to create value on top of the data itself. Turbulent Flux has developed cloud-native software applications that monitor, predict and assess the flow of energy sources, with its home ground in oil and gas producing assets. The software uses a unique hybrid flow simulation principle: the predictive capabilities of physical models are combined with machine learning models that assist them and self-adjust to data over time. This provides a level of accuracy and accessibility that is unprecedented in the market. Access to quality flow rate predictions and insights in real time is the basis for effective production optimization, thereby reducing financial and environmental costs.

Physics-based & Machine-Learning-based Flow Simulations
Physics-based modeling builds on well-understood concepts from, for example, thermodynamics, fluid dynamics, fluid modeling and optimization techniques. If the physics model is accurate, this approach is often a good solution. Yet it still requires deep domain knowledge as well as accurate fluid data, and it may incur significant computational cost.
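To make the contrast concrete, the sketch below shows what a deliberately simplified physics-based flow calculation can look like: a steady-state, single-phase pressure drop computed with the Darcy-Weisbach equation and the Swamee-Jain friction-factor correlation. The function, its parameters and the example numbers are illustrative assumptions, not Turbulent Flux's production model; real models typically also handle multiphase flow, heat transfer and transient behaviour.

```python
import math

def pressure_drop_pa(flow_m3_s, pipe_diameter_m, pipe_length_m,
                     density_kg_m3, viscosity_pa_s, roughness_m=4.5e-5):
    """Steady-state single-phase pressure drop via the Darcy-Weisbach equation.

    A simplified stand-in for a physics-based flow model: conservation laws
    plus fluid properties in, flow behaviour out.
    """
    area = math.pi * pipe_diameter_m ** 2 / 4.0
    velocity = flow_m3_s / area
    reynolds = density_kg_m3 * velocity * pipe_diameter_m / viscosity_pa_s

    if reynolds < 2300:
        # Laminar regime: analytical friction factor
        friction = 64.0 / reynolds
    else:
        # Turbulent regime: Swamee-Jain explicit correlation
        friction = 0.25 / math.log10(
            roughness_m / (3.7 * pipe_diameter_m) + 5.74 / reynolds ** 0.9
        ) ** 2

    return friction * (pipe_length_m / pipe_diameter_m) * 0.5 * density_kg_m3 * velocity ** 2


# Illustrative example: 0.05 m^3/s of a light oil through 2 km of 8-inch pipe
print(pressure_drop_pa(0.05, 0.2032, 2000.0, 850.0, 2e-3))
```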
Machine Learning (ML) systems are based on learning algorithms that find relationships between sensor data and output variables in a training dataset. The ML approach does not require deep knowledge of the physics, but rather a good understanding of the learning algorithms and statistics. Yet small datasets or changes in operational conditions limit the suitability of this approach.
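As a minimal sketch of this data-driven route, the example below fits a gradient-boosting regressor to a table of sensor readings and measured flow rates using scikit-learn. The feature choices (wellhead pressure, temperature, choke opening), the regressor and the synthetic training data are illustrative assumptions; the point is only that the model learns the input-output relationship from data rather than from physics.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Illustrative stand-in for historical sensor data: wellhead pressure,
# temperature and choke opening as inputs, measured flow rate as target.
n = 5000
X = np.column_stack([
    rng.uniform(50, 150, n),   # wellhead pressure [bar]
    rng.uniform(40, 90, n),    # temperature [degC]
    rng.uniform(10, 100, n),   # choke opening [%]
])
flow = 0.8 * X[:, 0] + 0.1 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 5, n)

X_train, X_test, y_train, y_test = train_test_split(X, flow, random_state=0)

# Learn the sensor-to-flow relationship purely from the data
model = GradientBoostingRegressor().fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```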
By applying machine learning to real-life data and to simulations from physics models, we can create ML models that mimic physics models and are faster and easier to deploy. Machine learning models can also be used to generate synthetic data for failed or missing sensors, improving the robustness and accuracy of physics models. There are multiple ways to construct a hybrid model; the weighting depends on the context and must be validated.
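One common construction, sketched below, is residual correction: a physics model supplies the baseline flow estimate and an ML model learns the systematic mismatch against field measurements, with a weight governing how much of the correction is applied. The class, the physics_model callable and the weight parameter are hypothetical illustrations of the principle, not Turbulent Flux's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

class HybridFlowModel:
    """Minimal residual-correction hybrid: a physics model provides the
    baseline prediction, and an ML model learns the systematic mismatch
    between that baseline and field measurements."""

    def __init__(self, physics_model, weight=1.0):
        self.physics_model = physics_model           # callable: feature row -> flow estimate
        self.residual_model = RandomForestRegressor(n_estimators=200, random_state=0)
        self.weight = weight                         # how strongly to trust the ML correction

    def fit(self, X, measured_flow):
        baseline = np.array([self.physics_model(x) for x in X])
        self.residual_model.fit(X, measured_flow - baseline)
        return self

    def predict(self, X):
        baseline = np.array([self.physics_model(x) for x in X])
        return baseline + self.weight * self.residual_model.predict(X)
```

A surrogate variant would instead train the ML model directly on physics-simulation outputs; either way, the weighting and the validation step mentioned above decide how much influence the data-driven part gets.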

Digital Transformation is essential to meet sustainability targets
Innovations have started to create user value that matters for the industry. There is a need for new software tools that do more with fewer resources. As energy companies today are most concerned with operating efficiently while achieving their emission targets, we will continue to see growth in the number of innovations related to production optimization, surveillance and sustainable conduct. Digital technologies, data automation and data integration are essential to meet these KPIs.