The oil and gas industry has traditionally relied on empirical and numerical models to explain reality. The advent of data-driven technologies is shaking the basis for this line of thought, as many parameters can now be analyzed simultaneously to uncover underlying physical laws. Ideally, we would like to preserve as much physics and data as possible without compromising speed and flexibility as we tackle increasingly large reservoir studies. At the same time, modeling and computing limitations force us to decrease data resolution when looking at larger cases. The figure below illustrates how the complexity of our models is forced to decrease as the size of our reservoir models grows. Ultimately, the physics we know needs to rely on data to unmask the physics we do not yet know.

![Figure: model complexity decreases as reservoir model size increases]()

Moreover, the current stringent economic environment and the growing interest in pursuing cleaner, safer, and cheaper sources of energy are driving the need for more practical yet more robust predictive and prescriptive models. The abundance of data, together with the persistent inability of known physical laws to satisfactorily explain the complexity of our assets and operations, is promoting a renewed interest in extending current model capabilities and decision-workflow practices. In a matter of a few years, we have also witnessed the consolidation of data science and machine learning as widespread disciplines that can help generate new technologies derived from data. The proliferation of high-resolution datasets and the falling cost of sensors, storage, and computing are significantly extending our ability to grasp new concepts, improve predictions, and make better field decisions.